31st Annual ITEA Symposium: T&E to Achieve Better Buying Power



  1. 31st Annual ITEA Symposium "T&E to Achieve Better Buying Power" — Dr. C. David Brown, DASD(DT&E), Director, Defense Test Resource Management Center. Crystal Gateway Marriott, October 7, 2014. DISTRIBUTION STATEMENT A – Cleared for Open Publication by OSR on October 7, 2014 – SR case number 15-S-0040

  2. "Shift Left" Initiative
  • Achieves Better Buying Power objectives
    – Focus critical DT&E activities earlier in the acquisition cycle
    – Three initial focus areas: earlier mission context, earlier interoperability testing, earlier cybersecurity testing
  • "Shift Left" = the right information at the right time:
    – Technical (e.g., PDR/CDR)
    – Programmatic (e.g., LLP)
    – Acquisition (e.g., MS C)
  • USD(AT&L) White Paper, BBP 3.0 — grounded in USD(AT&L) Better Buying Power principles to improve acquisition outcomes

  3. "Shift Left"
  [Chart: the DoD acquisition life cycle (MDD; Materiel Solution Analysis; Technology Maturation & Risk Reduction; Engineering & Manufacturing Development; Production & Deployment; O&S; Milestones A, B, C; FRP decision) overlaid on the systems-engineering "V" — concept of operations, high-level and subsystem requirements, detailed design, unit testing, subsystem and system verification, deployment, operation & maintenance — showing where DT&E activities shift earlier]
  "Most important single decision in the life cycle… sets in motion all that follows." – Frank Kendall, on the Development RFP Release Decision
  • Current practice results in:
    – Inadequate design and late discovery of problems
    – Higher cost to resolve problems
    – Significant reliability growth planned for after MS C
  • Shift left:
    – Introduce mission context earlier
    – Earlier interoperability testing
    – Earlier cybersecurity DT/OT (DIACAP/RMF Risk Management Framework; DOT&E IA memo)
    – Reliability growth optimized for IOT&E vice MS C
    – Key capabilities need to be designed in early rather than tested in later

  4. Defense Industry Forum (Top Level)
  • JHU/APL (trusted agent) will create an independent, collaborative T&E Defense Industry Forum (DIF) for government, industry, and other stakeholders to identify policy and technical challenges that affect the efficiency and effectiveness of DoD test infrastructure
  • Builds trust between government and industry stakeholders
  • Recommendations will identify actionable items for AT&L to implement changes to T&E policy and processes
  • The Defense Industry Forum was requested by OEMs in the TRMC study

  5. KLP Qualification Board
  First Q-Board scheduled for December 9, 2014

  6. Developmental Evaluation Framework
  The framework is a matrix linking developmental evaluation objectives to the decisions they support. Columns identify the major decision points (Decision #1, Decision #2) and the decision support questions under each (DSQ #1–#5) — the "why and when." Rows identify the "what": functional evaluation areas, system capability categories, requirements-document references, and technical measures. Cells describe the data source to be used for evaluation, for example: 1) a test event or phase (e.g., CDT1…), 2) an M&S event or scenario, 3) a description of the data needed to support the decision, or 4) another logical data source.
  Notional rows (measure → data sources):
  • Performance
    – Capability #1: 3.x.x.5 Technical Measure #1 → DT#1, M&S#2; 3.x.x.6 Technical Measure #2 → M&S#1, DT#3
    – Capability #2: 3.x.x.7 Technical Measure #3 → DT#3; 3.x.x.8 Technical Measure #4 → M&S#4
  • Interoperability
    – Capability #3: 3.x.x.1 Technical Measure #1 → DT#3; 3.x.x.2 Technical Measure #2 → IT#2, M&S#4
    – Capability #4: 3.x.x.3 Technical Measure #3 → IT#2; 3.x.x.4 Technical Measure #4
  • Cybersecurity
    – SW/System Assurance (PPP): 3.x.x SW Assurance Measure #1 → SW development assessments
    – RMF Control Measure #1 → RMF control assessments
    – Vulnerability Assessment Measure #1 → vulnerability assessment, Blue Team
    – Vulnerability Assessment Measure #2 (interoperable/exploitable vulnerabilities) → Red Team
  • Reliability
    – Capability #1: 4.x.x.1 Technical Measure #11 → M-demo#1; 4.x.x.2 Technical Measure #12 → M-demo#1
    – Capability #2: 4.x.x.3 Technical Measure #13 → M-demo#2; 4.x.x.4 Technical Measure #14 → M-demo#2
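The framework matrix above is, at heart, a table that can be queried in either direction: given a decision, which measures and data sources inform it; given a test event, which measures depend on it. A minimal sketch of that idea, using the notional entries from the slide — the `DEFRow` class and `measures_using` helper are hypothetical illustrations, not part of any DoD tool or standard:

```python
# Sketch of the Developmental Evaluation Framework (DEF) matrix as a simple
# data structure. Area, capability, and measure names follow the notional
# entries on the slide; the class and function names are illustrative only.
from dataclasses import dataclass, field


@dataclass
class DEFRow:
    area: str                  # functional evaluation area (e.g., Performance)
    capability: str            # system capability category
    reqmts_ref: str            # requirements-document reference
    measure: str               # technical measure
    sources: list = field(default_factory=list)  # data sources: test/M&S events


rows = [
    DEFRow("Performance", "Capability #1", "3.x.x.5", "Technical Measure #1", ["DT#1", "M&S#2"]),
    DEFRow("Performance", "Capability #1", "3.x.x.6", "Technical Measure #2", ["M&S#1", "DT#3"]),
    DEFRow("Interoperability", "Capability #3", "3.x.x.1", "Technical Measure #1", ["DT#3"]),
    DEFRow("Reliability", "Capability #1", "4.x.x.1", "Technical Measure #11", ["M-demo#1"]),
]


def measures_using(source: str):
    """List (area, measure) pairs whose evaluation draws on a given data source."""
    return [(r.area, r.measure) for r in rows if source in r.sources]


# A single test event can feed measures across evaluation areas:
print(measures_using("DT#3"))
# [('Performance', 'Technical Measure #2'), ('Interoperability', 'Technical Measure #1')]
```

This cross-referencing is what lets the DEF show, for any planned test or M&S event, every decision it ultimately supports — and conversely, which decisions would be starved of data if an event were cut.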

  7. TEMPs
  The Good:
  • Robust T&E planning
  • Contract among stakeholders
  The Bad:
  • Cumbersome and bureaucratic process
  • Document is too big
  The Ugly:
  • "TEMP is only for OSD"
  • Long-term value is questionable

  8. Questions
