Testing Lessons Learned and Suggested Improvements for Rapid - - PowerPoint PPT Presentation



SLIDE 1

U.S. ARMY WHITE SANDS MISSILE RANGE

Testing Lessons Learned and Suggested Improvements for Rapid Prototyping and Acquisition

  • Mr. Jerry Tyree, Deputy Commander / Tech Director

White Sands Test Center, 18 January 2017

SLIDE 2

Outline

  • 1. Testing in Rapid Acquisition – Criteria Based
  • 2. Risk Management Approach
  • 3. Test Design, Planning, Execution
  • 4. Test Execution and Analysis
  • 5. Training (DOTLmPF) in Testing
  • 6. Threat Environment / Presentation
  • 7. Data Acquisition / Instrumentation
  • 8. Test Results, Analysis, Reporting
  • 9. Prototype Testing in Rapid Acquisition
  • 10. Summary
SLIDE 3

[Chart: Cumulative Cost & Resources over Time across the life cycle – Design / Prototype, AoA / COEA, Test, Procure & Field, Operate / Maintain / Sustain – Improve, Adjust to threat & technology changes]

Testing Is a LEARNING Tool That Informs and Enables Warfighter Effectiveness, Survivability, Suitability

  • Develop & validate Warfighter capabilities
  • Develop & validate tactics, procedures, training

Life Cycle Cost is highly dependent on the T&E Phase

  • Research & development of technologies
  • Exploit threat systems, capabilities, operations

Finding & Fixing Issues Upfront Saves Lives and Cost Later

  • Reliability
  • Repairs or rework
  • Maintenance, spares

Acquisition Life Cycle

Notional Resources Over Time

Test occurs throughout life cycle

(stockpile reliability, product improvements, threat…)

Effective Testing

  • Finds / Solves Issues
  • Characterizes Performance
  • Is Relevant to the Threat Environment
  • Includes all aspects of DOTLmPF
  • Informs Risk
  • Numerous Studies over the last 2 decades
  • Aimed at Cost, Schedule, OT failure
  • Early Warfighter Involvement
  • Reliability, Readiness, Schedule
  • Combine DT/OT

Testing As a Solution to Acquisition

Common Acquisition Objective

Provide Warfighter Capability to Defeat the Threat and Defend the Nation – Quality Performance, Responsive, Cost Effective

SLIDE 4

Testing Rapid Acquisition “Historical Faster Testing Approach”

Criteria Based Acquisition Drives Criteria Based Test and Evaluation

  • Criteria Based – Schedule Driven, Funding Constrained
  • Generally Sequential Process by Design – Not Integrated
  • Test Type “Tools” – Generally Phase / Criteria Driven
  • Generally Binary – PASS / FAIL

CURRENT PROCESS IS NOT TIED TO MISSION RISK

[Flow diagram: Materiel Solution to Operational Rqmt – Design to Spec (SE, M&S > Spec Criteria, Ctr Confidence DT) – Fab. Prototype – Spec / Function Criteria Ver/Val (DT&E) – TTP / Log / Train Ver/Val (FDT&E, Log Demo) – Operational Criteria Ver/Val (OT&E, Combat M&S > COEA / BOI) – Criteria Met? – Procure, Field, Train, Sustain, Report Capabilities & Limitations (FOA) – “Faster by Less of Each”]

  • Integration / Interoperability / Compatibility
  • Leader / Operator Training
SLIDE 5

An Integrated Risk Assessment Approach

Integrated Risk Management Concept Fundamentals – Test Design

Key Challenges

  • How Much of Which Tool(s) Will Provide the Required Information?
  • M&S, DT, FDTE, OTE
  • How Much Information Is Enough?

Safety & OT Are the Only Tests Required by Law – Sample Size, Reliability

Lessons Learned

  • Criteria-based requirements drive a pass/fail process
  • Reliability drives schedule
  • Threat changes faster than our process can respond (cyber example in software blocking)
  • Capabilities evolve over time – rarely work right / meet requirements the first time

Recommendation

  • Informed Risk Management Approach
  • Focus process, Test Design, and resources on Risk-based requirements
  • 1. Identified Risk
  • Lives / Impact, Performance, Cost
  • Risk to schedule (not fielding a capability against a threat)
  • 2. Knowledge Required
  • What we know
  • What we don’t know
  • What is worth knowing (CBA)
  • What can be learned over time
  • Apply Knowledge Management and track or assess risk throughout

SLIDE 6

Rapid Acquisition T&E

Test Design, Planning, Execution

Lessons Learned:

  • Too much time repeatedly spent coordinating, socializing and mustering resources for test
  • Environmental, Safety procedures
  • Instrumentation Data Collection and Reduction
  • Threat Systems / Environment
  • Systems or Systems of Systems (acquisition, transportation, configuration)
  • Test is a learning continuum; capabilities evolve over time – changes in threat, technology, tactics, cyber

Recommendation:

  • Establish persistent SME stakeholder teams – Consolidate and Co-locate
  • Knowledge grows/evolves over time, improving quality, throughput, responsiveness
  • ALL stakeholders must be represented (materiel developer, combat developer (warfighter), tester, threat)
  • Establish persistent Test Bed – “fall in”
  • Representative systems of systems (networks)
  • Consistent instrumentation
  • Threat representation (threat systems, denied environments)
  • Climatic, dynamics, electromagnetic, logistics…

[Chart: Readiness over Time]

SLIDE 7

Rapid Acquisition T&E

Test Execution and Analysis

Lessons Learned:

  • Insufficient Instrumentation and Data Collection leads to re-test or indeterminate solutions
  • Constrains or cripples forensics required for efficient Test-Fix-Test
  • Post-test data analysis leads to re-test, delays, or inconclusive evaluation
  • Sharing, communications, and knowledge among all stakeholders are critical to rapidly fixing or assessing capabilities

Recommendation:

  • Upfront, establish rapid, reconfigurable instrumentation and data collection
  • System / Threat Level – data links, emissions, optics, effects, signatures
  • Operations Level – Optics, Radiometric, TM, RF spectrum, Radar, Meteorological, GPS
  • Require Real-Time Data Analysis and Display
  • Critical to test execution, refinement, execution decisions, and the learning process
  • Must be re-configurable, accepting multiple inputs and custom presentation
  • Establish a “Joint” Analysis Team with a lead by phase but including all stakeholders (PM, OEM, Combat Developer, Logistics, Training, Test/Evaluator)

SLIDE 8

Rapid Acquisition T&E

Training Value in Test, Test Value in Training

Lessons Learned:

  • Early Prototypes and Rapid Acquisition Systems often have ‘engineering’-level or poor user interfaces
  • Lack of training or insufficient experience will result in poor or failed tests
  • Major differences in training and experience can skew measures and results

Recommendation:

  • Include Human Factors and Training Upfront and Continually
  • Warfighter involvement in test process results in matured training products and knowledgeable operators
  • Can mitigate or simultaneously satisfy the need for separate training/logistics test and data collection
  • Evolves with system maturity and development
SLIDE 9

DT&E “Success” > OT&E “Failure”

Contributing Factor - Prototype vs. Production Rep SUT

[Flow diagram: M&S Combat Effectiveness – M&S Design (component / system / SoS) – Prototype(s) fab / form / fit – Contractor Confidence Test – DT&E Prototypes]

  • Tech. Perf. Specs
  • World-wide environments
  • Threat / Vulnerability
  • Reliability Indicators

[Flow: Test–Fix–Test Cycle – “Pass DT” – LUT – LRIP – IOT&E]

  • Schedule & Funding drive the Acq Community to regard hand-built EMD Prototypes as Production Rep
  • Result: Test design for the full spectrum of worldwide environments requires “sealed” systems
  • Fixing “sealed” systems is more expensive, time consuming, or impractical for supporting test/fix/test
  • Result: Tendency to defer fixes or tests = increased risk to OT, increased cost and schedule; tendency to mod only certain prototypes, resulting in various configurations / fixes = risk to test success
  • Tendency for “special treatment” and allowances for limited prototype resources
  • Result: Fewer “operational”-type reliability and durability issues are discovered
  • Prototypes get high use / use rates in excess of OMS/MP estimates (through multiple environments)
  • Result: Increased risk of failure in LUT / OT

COMMON FACTS & EFFECTS (there are exceptions)

Prototypes

  • limited number / spares
  • various configurations
  • delayed fixes to issues in DT
  • excessively used devices

[Timeline: Milestone B – COEA rqmts validation / spec – Design / mature design concept]

SLIDE 10

DT&E Success – IOT&E Failure

Contributing Factor - Prototype vs. Production Rep SUT

[Flow diagram: M&S Combat Effectiveness – M&S Design (component / system / SoS) – Prototype(s) fab / form / fit – Contractor Confidence Test – DT&E Prototypes]

  • Tech. Perf. Specs
  • World-wide environments
  • Threat / Vulnerability
  • Reliability Indicators

[Flow: Test–Fix–Test Cycle – “Pass DT&E” – LUT – LRIP – Fail IOT&E]

COMMON CAUSES & EFFECTS

(there are exceptions)

Prototypes

  • limited number / spares
  • various configurations
  • delayed fixes to issues in DT
  • excessively used devices
  • Fixes often invalidate or reduce the confidence in tests already conducted
  • Result: Requires analysis, re-test, or deferred tests, resulting in increased cost, schedule, or risk
  • DT of LRIP systems prior to IOT is not typically planned for; prototype DT data is used for entrance
  • Result: Untested LRIP systems significantly different than “passed” DT systems > high risk of OT failure

DT&E Success – IOT&E Failure

  • Prototype Expectations / Perceptions are likely one of the dominant factors

[Timeline: Milestone B – rqmts validation / design spec – Design / mature design concept]

SLIDE 11

Improved Acquisition Test Process

Prototype vs. Production Rep SUT

Planned Prototype DT – Mature the Design

  • Test Critical Technical Parameters – Effectiveness
  • Safety verification

Apply Risk Management Approach

  • Test Characteristics High Risk to Mission Perf
  • Exit criteria based on % risk to Mission Effectiveness, determined by Combat M&S using Test Data

Planned Prototype LUT

  • Suitability / Survivability Indicators - Fixes
  • OT Readiness - % risk

LRIP Build

  • Incorporate test fixes / lessons learned
  • Production Representative

PPQT - IOT&E

  • Check test CTPs
  • Start IOT&E
  • Test worldwide environments

[Flow diagram: M&S Combat Effectiveness – M&S Design (component / system / SoS) – Prototype(s) fab / form / fit – Contractor Confidence Test – DT&E Prototypes]

  • Critical Technical Parameters
  • High Risk Requirements

[Flow: Milestone B (rqmts validation / design spec) – Design / mature design concept – Prototype DT Test–Fix–Test Cycle – DT&E Exit (Mature Design) – Planned LUT – Planned LRIP (Production Rep Systems) – IOT&E]

  • Quick Look – Validate CTPs
  • Test WWE in parallel with IOT (PPQT&E)

PPQT&E Exit

HIGHER PROBABILITY OF SUCCESS
LOWER COST – SHORTER SCHEDULE

SLIDE 12

Operational Assessment Methodology

Graphical Overview

FIELD & SUSTAIN Effective Capabilities Efficiently in Advance of the Threat

Fielding Decision: Risk-Management-based decision – Risk / Impact to Performance, Schedule, Cost / Benefit

(also a risk to not fielding even minimal capability against certain threats)

OA Knowledge Gap & Risk based

  • Task/ Org/ Arch
  • unit size
  • threat
  • scenario
  • instrument
  • data

Test for Knowledge based on risk: Exploit or Defeat a Threat; Operational Environment – Climatic, EMI; Threat – EW, cyber; Integration, Compatibility; Log Demo; Safety; Security; Training – TTP, Doctrine

GOAL: Knowledge / Confidence

Knowledge Requirements – Technical & Operational

  • KPP/ KSA…
  • COIC
  • CDD – ONS
  • DOTLmPF
  • Architecture
  • Task Org
  • MOP / MOE

Risk (due to unknowns)

Running Estimate of Gaps drives OA design – Test to the Gaps

  • What do we need to know?
  • What do we know?
  • What don’t we know?
  • Is it worth learning?

Risk Report

  • Risk / Impact
  • Cost / Benefit
  • Ver/Val knowns in operational environment

Changes in Threat, Technology, CBA / ONS… Evolve When Needed, When Available

SLIDE 13

SUT Anomaly

(Test Success May Look Like Failures)

Program Success is Often Perceived as Void of “Failures”

  • System “Failures” Are Common, Even Expected in Early DT&E
  • Finding and Resolving Them Early Reduces Costs – and Sometimes Saves Lives
  • They Also May Just Define the Limitations

Unintended Consequences

  • Success-Oriented Testing
  • Avoiding or Limiting DT&E
  • Redundant Pre-Tests or Experiments

THE ONLY REAL FAILURE: NOT Providing the Warfighter Required Capability (in a Timely and Cost-Effective Manner)

SLIDE 14

Doing the Right Thing for the Warfighter and the Nation


Success or Failure?

SLIDE 15

Questions / Discussion?
