31st Annual ITEA Symposium: "T&E to Achieve Better Buying Power" (PowerPoint presentation)



SLIDE 1


DISTRIBUTION STATEMENT A – Cleared for Open Publication by OSR on October 7, 2014 – SR case number 15-S-0040

31st Annual ITEA Symposium

"T&E to Achieve Better Buying Power"

Dr. C. David Brown
DASD(DT&E); Director, Defense Test Resource Management Center
Crystal Gateway Marriott, October 7, 2014

SLIDE 2



“Shift Left” Initiative

  • Achieves Better Buying Power objectives
    – Focus critical DT&E activities earlier in the acquisition cycle!
    – Three initial focus areas:
      − Earlier Mission Context
      − Earlier Interoperability Testing
      − Earlier Cybersecurity Testing

  • "Shift Left" = the right information at the right time:
    – Technical (e.g., PDR/CDR)
    – Programmatic (e.g., LLP)
    – Acquisition (e.g., MS C)

USD(AT&L) White Paper BBP 3.0

Grounded in USD(AT&L) Better Buying Power Principles to Improve Acquisition Outcomes

SLIDE 3



[Chart: acquisition life cycle overlaid on the systems engineering "V", showing where "Shift Left" moves DT&E activities earlier]

  • Acquisition phases and decision points: Materiel Solution Analysis (from MDD), Technology Maturation & Risk Reduction (MS A), Engineering & Manufacturing Development (MS B), Production and Deployment (MS C), O&S; Requirements Decision; Developmental RFP Decision; LLP; FRP
  • Systems engineering "V" (governed by the Systems Engineering Management Plan): Concept of Operations → System Requirements → High-Level Design & Subsystem Requirements → Detailed Design → Software Coding / Hardware Fabrication → Unit Testing → Subsystem Verification → System Verification & Deployment → System Validation → Operation & Maintenance
  • DT&E events: User Needs, TEMs and TEMPs (including draft TEMP), PDR, CDR, DOT&E IA Memo, DIACAP/RMF, Interoperability, Mission Context, IOT&E
  • "Shift Left" thrusts: Risk Management Framework cybersecurity DT/OT; earlier interoperability testing; introduce mission context earlier; earlier reliability growth

Current practice results in inadequate design and late discovery of problems:
  • Higher cost to resolve problems
  • Reliability optimized for IOT&E vice MS C; significant growth planned for after MS C

Key capabilities need to be designed-in early rather than tested-in later.

"Most important single decision in the life cycle…sets in motion all that follows." - Frank Kendall

SLIDE 4



Defense Industry Forum: Top-Level

  • JHU/APL (Trusted Agent) will create an independent and collaborative T&E Defense Industry Forum (DIF) for government, industry, and other stakeholders to identify policies and technical challenges that affect the efficiency and effectiveness of DoD test infrastructure.
  • Build trust and help government/industry stakeholders
  • Recommendations to identify actionable items for AT&L to implement changes to T&E policy and processes

The Defense Industry Forum was requested by OEMs in the TRMC Study.

SLIDE 5


KLP (Key Leadership Position) Qualification Board

First Q-Board Scheduled for December 9, 2014

SLIDE 6


Developmental Evaluation Framework

[Matrix chart: developmental evaluation objectives (columns) mapped against system requirements and T&E measures (rows)]

  • What: system requirements and T&E measures, with rows grouped by functional evaluation area and system capability category:
    – Performance (Capability #1, Capability #2): Technical Measures #1–#4, traced to Technical Requirements Document references 3.x.x.1–3.x.x.8
    – Interoperability (Capability #3, Capability #4) and Reliability (Cap #1, Cap #2): Technical Measures #11–#14, traced to references 4.x.x.1–4.x.x.4
    – Cybersecurity: SW/system assurance measures (per the PPP, assessed via SW development assessments), RMF control measures (continuous control assessments), and vulnerability assessment measures (Blue Team; Red Team for interoperable/exploitable vulnerabilities)
  • Why and when: developmental evaluation objectives, expressed as decision support questions (DSQ #1–#5) tied to the decisions supported (Decision #1, Decision #2, …); identify the major decision points for which test and evaluation phases, activities, and events provide supporting information
  • How: each cell identifies the data source to be used for evaluation information, for example:
    1) Test event or phase (e.g., CDT1…)
    2) M&S event or scenario (e.g., M&S#1, M-demo#1)
    3) Description of data needed to support the decision
    4) Other logical data source description
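The framework above is essentially a lookup table linking measures to the decisions they inform and the data sources that feed them. A minimal sketch in Python illustrates the idea; the row entries are generic placeholders from the template chart (DSQ numbers, measure names, and data sources are illustrative, not program-specific):

```python
# Minimal sketch of a Developmental Evaluation Framework (DEF) matrix.
# All names below are illustrative placeholders, not data from any program.

from collections import defaultdict

# Each row: (capability category, technical measure, decision support
# question it informs, data source that produces the evaluation data).
DEF_ROWS = [
    ("Performance Capability #1", "Technical Measure #1", "DSQ #1", "DT#1"),
    ("Performance Capability #1", "Technical Measure #1", "DSQ #2", "M&S#2"),
    ("Interoperability Capability #3", "Technical Measure #2", "DSQ #3", "IT#2"),
    ("Cybersecurity", "Vul Assess Measure #2", "DSQ #4", "Red Team"),
    ("Reliability Cap #1", "Technical Measure #11", "DSQ #5", "M-demo#1"),
]

def sources_for_dsq(rows, dsq):
    """Return (capability, measure, data source) tuples informing one DSQ."""
    by_dsq = defaultdict(list)
    for capability, measure, question, source in rows:
        by_dsq[question].append((capability, measure, source))
    return by_dsq[dsq]

# Which evidence supports decision support question #1?
print(sources_for_dsq(DEF_ROWS, "DSQ #1"))
# [('Performance Capability #1', 'Technical Measure #1', 'DT#1')]
```

Organizing the matrix this way makes the "what / why and when / how" traceability queryable in either direction: by decision (which evidence is needed) or by data source (which decisions a test event supports).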

SLIDE 7



TEMPs

The Good
  • Robust T&E planning
  • Contract among stakeholders

The Bad and The Ugly
  • "TEMP is only for OSD"
  • Long-term value is questionable
  • Cumbersome and bureaucratic process
  • Document is too big
SLIDE 8


Questions