SoS DOE Test Concept by Larry O. Harris, PEO-C4I, PMW-120



SLIDE 1

SoS DOE Test Concept

by

Larry O. Harris, PEO-C4I,PMW-120, APM-T&E Luis A. Cortes, The MITRE Corporation

5 October 2016

SLIDE 2

Test, Evaluation, and Certification of C4ISR Systems

Current State

  • Primarily based on data provided by individual Programs of Record (PoRs) and Enterprise Engineering and Certification (E2C) testing
  • The majority of these PoRs have interfaces and dependencies on other PoRs
  • Performance of these PoRs as an interrelated group (System of Systems, or SoS) is often not fully evaluated and not well understood
  • A more robust & rigorous method to evaluate overall performance of the SoS using mission-based threads is needed


SLIDE 3

Background

  • Many of our PoRs have different Acquisition strategies

– AGILE/Rapid-IT (at least five; expect more to follow)
– Incremental Software Test Strategy (ISTS) Pilot w/ COTF (at least two)
– Storefront/Widgets/Patches

  • They are developing or updating their Test Strategy in new, adjunct areas

– Design of Experiments
– Cyber Security
– Reliability Growth

  • Their Test, Evaluation, and Certification is not synchronized

– Some are in contractor testing
– Others are in the Enterprise Engineering & Certification (E2C) Lab
– Others are in in-house PoR Lab testing

These challenges heighten the need for shifting the focus to SoS Test, Evaluation, and Certification

PEO C4I/PMW 120 warfighting-enabling capabilities: Communication, C2, ISR, METOC, SIGINT


SLIDE 4


KPP Flow

ISR PoR Example

SLIDE 5

| Factor | Units | KPP | Factor Type | No. Levels | Levels | Factor Management** |
|---|---|---|---|---|---|---|
| Steady State Correlator Input Stream (ELINT correlatable observations) | Obs/hr | 4 | Continuous | 2 | 250K, 2M | ETC |
| Correlator Candidate Pool (No. of tracks in database) | tracks | 4 | Continuous | 2 | 25K, 250K | ETC |
| Peak Correlator Input Stream | Obs/sec | 4 | Continuous | 2 | 150, 1500 | ETC |
| Installation Site | | 4 | Categorical | 2 | Afloat, Shore | HTC |
| NTM Imagery Processing: NITF 2.1 Format | NTM Images (3 GB/image)/hr | 5 | Continuous | 2 | 10, 50 | ETC |
| Organic Imagery Processing: NITF 2.1 Format | Images (100 MB/image)/hr | 5 | Continuous | 2 | 250, 1500 | ETC |
| Organic FMV Processing: H264/MPEG4 Format (~2 GB/hour) | Cont. Streams | 5 | Continuous | 2 | 2, 8 | ETC |
| Virtual Machine: Cores Assigned | cores | H | Categorical | 2 | 1, 12* | VHTC |
| Virtual Machine: GPUs Assigned | GPUs | H | Categorical | 2 | 1, 4* | VHTC |
| Virtual Machine: RAM Assigned | GB | H | Continuous | 2 | 24, 192 | HTC |
| Available Disk I/O (350, 1500 IOPs) | IOPs | H | Categorical | 2 | SATA, SAS SSD | HTC |
| Candidate Pool of ISR Platforms | | 7 | Continuous | 2 | 1, 25 | ETC |
| Candidate Pool of ISR Sensors | | 7 | Continuous | 2 | 1, 75 | ETC |
| ISR Platforms available to a Tactical Naval Warfighter | | 7 | Continuous | 2 | 1, 10 | ETC |
| ISR Sensors available to a Tactical Naval Warfighter | | 7 | Continuous | 2 | 1, 25 | ETC |

*FCR0: start with 4 cores and 1 GPU and increase the numbers based on test results.
**ETC – easy-to-change; HTC – hard-to-change; VHTC – very hard to change


DOE Table

ISR PoR Example
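The ETC/HTC/VHTC factor-management codes in the table above drive run ordering: hard-to-change factors should vary as seldom as possible. A minimal Python sketch of that idea, using a small invented subset of the ISR factors (the names and levels come from the table; the code itself is illustrative, not the program's actual test design):

```python
from itertools import product

# Hypothetical subset of the ISR factor table (names/levels from the slide;
# this is an illustrative sketch, not the program's actual test matrix).
factors = [
    # (name, levels, management): ETC = easy-, HTC = hard-, VHTC = very-hard-to-change
    ("Installation Site",           ["Afloat", "Shore"], "HTC"),
    ("VM Cores Assigned",           [1, 12],             "VHTC"),
    ("Steady-State Input (Obs/hr)", ["250K", "2M"],      "ETC"),
    ("Peak Input (Obs/sec)",        [150, 1500],         "ETC"),
]

# Order columns so the hardest-to-change factors vary most slowly across the
# run sequence (itertools.product cycles the last iterable fastest).
cost = {"VHTC": 0, "HTC": 1, "ETC": 2}
ordered = sorted(factors, key=lambda f: cost[f[2]])

runs = [dict(zip((f[0] for f in ordered), combo))
        for combo in product(*(f[1] for f in ordered))]

print(len(runs))  # 2^4 = 16 runs for a two-level full factorial
```

Ordering the crossed levels so VHTC factors change least often keeps expensive reconfigurations (such as reassigning VM cores) to a minimum; a true split-plot design would go further and restrict randomization accordingly.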

SLIDE 6


Response Variables by Test Phase

| Test Phase | Response Variables |
|---|---|
| OT-B1/IT-C1 | Chat Latency; Data LAN Transfer Timeliness; Common Operating Picture (COP) Timeliness; Imagery Display Timeliness |
| IT-C2/IT-D1/IT-D2 | Chat Latency; Data LAN Transfer Timeliness; COP Timeliness; Imagery Display Timeliness |
| OT-C1/OT-D1/OT-D2 | Chat Latency; Data LAN Transfer Timeliness; COP Timeliness; Imagery Display Timeliness |

Factors and Levels

| Factor | Levels | OT-B1/IT-C1 | IT-C2/IT-D1/IT-D2 | OT-C1/OT-D1/OT-D2 |
|---|---|---|---|---|
| Network Loading | high: >74 percent user CCE devices in use; low: <51 percent user CCE devices in use | Systematically Vary | Systematically Vary | Systematically Vary |
| Enclave | UNCLAS, SECRET, SR, SCI | Systematically Vary | Systematically Vary | Systematically Vary |
| Transmission Type | Super Hi Frequency (SHF) satellite communications; Hi Frequency | Systematically Vary | Systematically Vary | Systematically Vary |
| File Size | large ≥5 MB; medium 1 to 5 MB; small <1 MB | Systematically Vary | Systematically Vary | Systematically Vary |
| Transport Method | upload; download | Systematically Vary | Systematically Vary | Systematically Vary |
| Platform Type | Unit Level; Force Level; Subsurface; MOC; Aviation | Record | Record | Record |
| Air Temperature | As occurs | Record | Record | Record |
| Relative Humidity | As occurs | Record | Record | Record |

DOE Table

COMMs PoR Example
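The table above distinguishes factors that are systematically varied from those that are only recorded. A short illustrative sketch (factor names and levels taken from the table; the encoding is an assumption, not a PMW-120 tool) showing how the two classes affect the size of the test matrix:

```python
from itertools import product

# Hypothetical encoding of part of the COMMs factor table (levels from the
# slide); an illustrative sketch of separating "systematically vary" (SV)
# factors, which define the test matrix, from "Record" factors, which are
# only observed during each run.
factors = {
    "Network Loading":   {"levels": ["high >74%", "low <51%"], "mgmt": "SV"},
    "Enclave":           {"levels": ["UNCLAS", "SECRET", "SR", "SCI"], "mgmt": "SV"},
    "Transmission Type": {"levels": ["SHF satcom", "HF"], "mgmt": "SV"},
    "File Size":         {"levels": ["large", "medium", "small"], "mgmt": "SV"},
    "Transport Method":  {"levels": ["upload", "download"], "mgmt": "SV"},
    "Platform Type":     {"levels": ["Unit Level", "Force Level", "Subsurface",
                                     "MOC", "Aviation"], "mgmt": "Record"},
}

sv = {k: v["levels"] for k, v in factors.items() if v["mgmt"] == "SV"}
recorded = [k for k, v in factors.items() if v["mgmt"] == "Record"]

# Only the crossed SV factors multiply the run count; recorded factors are
# captured per run but do not grow the design.
runs = [dict(zip(sv, combo)) for combo in product(*sv.values())]
print(len(runs), recorded)  # 2*4*2*3*2 = 96 runs; ['Platform Type']
```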

SLIDE 7

DOE Table

METOC PoR Example


Response Variables by Test Phase

| Test Phase | Response Variables |
|---|---|
| DT-B1:R1 (Lab) | Reliability; Maintainability; Availability |
| IT-B2:R1 (Lab) | Reliability; Maintainability; Availability |
| DT-C1:R1 (Ship) | Reliability; Maintainability; Availability |

Factors and Levels

| Factor | Levels | DT-B1:R1 (Lab) | IT-B2:R1 (Lab) | DT-C1:R1 (Ship) |
|---|---|---|---|---|
| Network Loading | high: >74 percent user CCE devices in use; low: <51 percent user CCE devices in use | Systematically Vary | Systematically Vary | Systematically Vary |
| ADNS WAN Availability | 50 Mbps | Systematically Vary | Systematically Vary | Systematically Vary |
| Product Data Size | large ≥5 MB; medium 1 to 5 MB; small <1 MB | Systematically Vary | Systematically Vary | Systematically Vary |
| Data Transport Method | upload; download | Systematically Vary | Systematically Vary | Systematically Vary |
| Platform Type | CVN, LHD | Record | Record | Record |
| Area of Interest product (Satellite Imagery) Access Time | Small: resolution 1-2 km, file size 50-100 KB, time <10 sec; Medium: resolution 4-8 km, file size 500-750 MB, time <15 sec; Large: resolution 8-16 km, file size 1-2 GB, time <10 sec | Record | Record | Record |

SLIDE 8

Factor Information and Responses

| No. | Factor | Main Effects | Factor Management | Type | Levels | Level Descriptors | Marks (ES, IO; System Tasking, System Loading, Network Loading) |
|---|---|---|---|---|---|---|---|
| x1 | EMI Mitigation | x | R | continuous | 1,2,3? | ?? | x |
| x2 | DF Accuracy | x | R | continuous | 1,2,3? | ?? | x |
| x3 | Energy on Target | x | SV, R | continuous | 1,2,3? | ?? | x |
| x4 | Blockage/Cutouts | x | R | discrete | 1,2,3? | ?? | x |
| x5 | Tasking Loading | x x | SV | continuous | 1,2,3? | ?? | x x |
| x6 | Stare Resources | x x | SV | discrete | 1,2,3? | ?? | x x |
| x7 | SDFs | x | SV | continuous | 1,2,3? | ?? | x |
| x8 | Reporting | x | SV | continuous | 1,2,3? | ?? | x |
| x9 | Network Status | x | SV, R | continuous | 1,2,3? | ?? | x x |
| x10 | Loading (Imply NEA/SOIs) | x | SV | continuous | 1,2,3? | ?? | x x |
| x11 | Remoting | x | SV | continuous | 1,2,3? | ?? | x x |

Factor Management: the way in which the factors are varied throughout the test (SV = systematically vary; HC = hold constant; R = record). Type: type of factor variable (continuous, discrete, etc.). Levels: how many levels (2, 3, 4, etc.). Level Descriptors: high, low, and middle settings of the factor levels, including units.

DOE Table

SIGINT PoR Example


SLIDE 9

[Diagram: per-system test design. Factors feed a Test Design block that produces Response Variables, subject to Hold Constant Factors, Recordable Factors, and Constraints.]

Practical Concerns from an SoS Perspective

  • One-system-at-a-time testing approach
  • The levels of common factors may not be equally scaled
  • Blocking factors could be different
  • Hold-constant factors may not be held at the same levels
  • Response variables may not include inputs to other systems
  • Disallowed combinations or constraints may not be equally defined

The basis for test, evaluation, and certification could be different even though all the systems support the same mission.
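Several of these concerns are mechanically checkable once each PoR's factor table is in machine-readable form. A hypothetical sketch (factor specs loosely modeled on the COMMs and METOC tables; the names, levels, and helper function are illustrative, not an existing tool) that flags common factors whose levels or management strategy disagree across two PoRs:

```python
# Hypothetical factor specs for two PoRs; values are invented for
# illustration, loosely following the COMMs and METOC DOE tables.
comms = {
    "Network Loading": {"levels": {"high >74%", "low <51%"}, "mgmt": "SV"},
    "File Size":       {"levels": {"<1 MB", "1-5 MB", ">=5 MB"}, "mgmt": "SV"},
    "Platform Type":   {"levels": {"Unit Level", "Force Level"}, "mgmt": "Record"},
}
metoc = {
    "Network Loading": {"levels": {"high >74%", "low <51%"}, "mgmt": "SV"},
    "File Size":       {"levels": {"<1 MB", "1-10 MB", ">=10 MB"}, "mgmt": "SV"},
    "Platform Type":   {"levels": {"CVN", "LHD"}, "mgmt": "Record"},
}

def factor_mismatches(a, b):
    """Report factors common to both PoRs whose level sets or
    factor-management strategy differ."""
    issues = []
    for name in sorted(a.keys() & b.keys()):
        if a[name]["levels"] != b[name]["levels"]:
            issues.append(f"{name}: levels differ")
        if a[name]["mgmt"] != b[name]["mgmt"]:
            issues.append(f"{name}: factor management differs")
    return issues

print(factor_mismatches(comms, metoc))
# File Size and Platform Type levels differ; Network Loading is aligned
```

A check like this would surface unequally scaled common factors before the per-PoR designs are locked in, rather than at SoS integration time.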


Current State – One-System-at-a-Time


SLIDE 10

SoS capability requirements are presently based on aggregating the constituent PoRs' mission-based capabilities. But from an SoS standpoint, for example, they could look like:

1. The PEO-C4I SoS shall provide the capability to aggregate all sensor capabilities, with the ability to direct and optimize sensor movement & performance …
2. The PEO-C4I SoS shall provide the capability to disseminate sensor data via internal/external networks …
3. The PEO-C4I SoS shall provide the capability to collect, ingest, process, and analyze Intel data …
4. The PEO-C4I SoS shall provide the capability to correlate & fuse all-source data in a timely manner in support of ASW, Strike, BMD, SUW, Mine, etc. mission areas …

What Would SoS-Level Capability Requirements Look Like?


[Diagram: Mission Engineering hierarchy: Mission → SoS → Systems]

SLIDE 11

Critical SoS Requirements - ASW Mission
SoS PoRs: ISR, COMMs, SIGINT, METOC, C2

| Capability | Factors | Levels | Response Variables | Factor Management | PoR marks |
|---|---|---|---|---|---|
| Automated Fusion | Correlator Input; No. of Tracks; Peak Input Stream | 250K, 2M Obs/hr; 25K, 250K; 150, 1500 Obs/sec | Pd 90%, <5 min; Anomaly Det, Pd 80%; Non-Emitting, Pd 80%; Pd 70%, <10 min | Systematically Vary | X X X X |
| Exploitation & Detection | NTM Imagery Processing; Organic Imagery Proc; Organic FMV Proc | 5, 10 (3 GB/image)/hr; 250, 1500 (100 MB/image)/hr; 2, 8 (continuous, 2 GB/hour) | | | X X X X |
| Virtual Machines | Cores Assigned | 1, 12 | | | X |
| Virtual Machine | GPUs Assigned | 1, 8 | | | X |
| Virtual Machine | RAM Assigned | 24, 192 | | | X |

What Would an SoS DOE Table Look Like?

Notional Example - ISR PoR


SLIDE 12

Test Design

Future State – Notional SoS Architecture


[Diagram: notional SoS test design. Factors A, B, C, and D map through HSI/SSI factor mapping into a single SoS Test Design that produces Mission-Oriented Response Variables 1, 2, and 3, subject to Hold Constant Factors, Recordable Factors, and Constraints.]

SLIDE 13
  • Design of Experiments (DOE) provides objective quality evidence (OQE) for the driving and limiting factors affecting mission-based SoS performance and individual PoR performance
  • Separates the critical few dependencies from the trivial many
  • Adds rigor to the Test Strategy
  • Drives efficiency into testing by providing objective data on how much testing is needed
  • Improves the reliability of information available from test
  • Provides pedigreed, objective quality evidence to inform Certification and other acquisition decisions

Test, Evaluation, and Certification of C4ISR Systems

Future State

We are no longer just concerned with individual systems requirements, but with how these requirements mesh with dependent, enabling C4I architectures in meeting mission objectives
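The "critical few versus trivial many" separation rests on standard two-level factorial analysis: each main effect is the mean response at a factor's high level minus the mean at its low level. A self-contained sketch with a synthetic response (invented for illustration, not test data):

```python
from itertools import product

# Illustrative main-effects screening on a 2^3 full factorial in coded
# units (-1 = low, +1 = high). The response function is synthetic.
factors = ["A", "B", "C"]
design = list(product([-1, 1], repeat=3))

def response(a, b, c):
    # Synthetic truth: A drives the response, B is weak, C is inert.
    return 10.0 + 4.0 * a + 0.5 * b + 0.0 * c

ys = [response(*run) for run in design]

# Main effect of each factor: mean(y at high level) - mean(y at low level).
effects = {}
for i, name in enumerate(factors):
    hi = [y for run, y in zip(design, ys) if run[i] == 1]
    lo = [y for run, y in zip(design, ys) if run[i] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

print(effects)  # {'A': 8.0, 'B': 1.0, 'C': 0.0}
```

Ranking the effect magnitudes immediately separates the driving factor (A) from the trivial ones, which is the screening role DOE plays in the test strategy above.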


SLIDE 14

Status and Future Plans

  • Trident Warrior 16
  • Trident Warrior 17


SLIDE 15

Status and Future Plans

Trident Warrior 16 (As Proposed)

  • DCGS-N Inc. 2 was a critical part of the PEO-C4I prototype effort called Naval Integrated Tactical-Cloud Reference for Operational Superiority (NITROS) and, due to priorities & schedule, was not able to concentrate on an initial look at an SoS DOE
  • Presently, the relevant input factors for DCGS-N Inc. 2's DOE are being updated


SLIDE 16

[Diagram: Trident Warrior 16 Integrated ISR. Nodes: DCGS-N Inc 2, SSEE-F, NITES-Next, ICOP, MTC2, EWBM. Data flows include tracks, tailored METOC data, RF sensor performance prediction, and weather from comms data.]

Trident Warrior 16

Integrated ISR


SLIDE 17

Trident Warrior 16

Experiment 5

[Diagram: Experiment 5 architecture, spanning Unit Level and Force Level. Recoverable labels: assessment of weather on UAV ops; direct ingest aboard U.S.S. Pinckney; ICOP on CVN/LHA; FNMOC models, data, and products in OGC formats; geolocation of tracks; enhanced combat ID and I&W; coordinated non-kinetic fires; Red, Blue, White, and Gray tracks; EWBM with Radiant Mercury CDS; sensor performance prediction, tailored satellite observations, and AVWEAX; SCI SSEE-F and SCI SSEE-E with Quellfire queries; SLQ-32, CEC, and JTT tracks; .othg Link 16 to other ships via TEWMS; COAMPS Wx data; Target & SDF Para 126, NRTI; Para 126/KL, CR 1551, USMTF C121, TACELINT SG-5302; SCI-to-GENSER CDS via Pump II or MDeX; NITES Next product formats (.dat, .kml, .mtif, .jpeg, .gif, .grib, .ogc); U.S.S. Vinson high-side fusion, FMV CIP (Red Halo), S2A correlation, alerting, missing-from-imagery workflow; SAVA SCI; DCGS-N Inc 2 FCR 0; Link 16 or Zone Exchange; MTC2 SOA; C2; weather data from radio and radar sensors (WDIC); TDAs, threat, countermeasures; RSCD contact data and tasking. Legend: NITROS Elements, EWBM Elements, Integrated C4ISR Elements, Existing Elements.]

SLIDE 18

Status and Future Plans

Trident Warrior 17

  • Focus now is on Trident Warrior 17, scheduled for Summer of FY17
  • We anticipate the following systems playing in Trident Warrior 17:

– DCGS-N Inc. 2
– ICOP
– NITES-Next
– SSEE-E/F
– MTC-2
– CANES
