

SLIDE 1


A Cognitive Systems Engineering Application of the Vulnerability/Lethality Analysis Methodology to Simulation Credibility Assessment

What is VV&A?
• VV&A is a method for assessing simulation credibility
• VV&A reduces the risks associated with simulation development and use
• By verifying and validating the simulation, you gain confidence that it is the right tool to help you in your application

Hypothesis and Assumptions
• Hypothesis: Understanding the VV&A process can be enhanced by using a methodology with which the audience is already familiar
• Assumption 1: The methodology must be widely understood and used within a specific DoD community
• Assumption 2: It must be possible to adapt the methodology to the VV&A problem

The Challenge
• Seeming mystique of DoD VV&A
• Broad difficulty experienced in VV&A educational approaches
• Implementing policy is very general

VV&A Defined

SLIDE 2

More Definitions

Methodology Terminology
• Space: the set of all combinations of factors that could result in a given system state
• Level: the set of information necessary to describe the state of the system
• Mapping: the types of information that allow movement to another level

Nelson's Application
• Examine information requirements that support V/L analyses
• Identify required information first
• Determine what required information is available through existing data sources
• Determine whether the available information is also reliable
• Define data voids where required information is either unavailable or unreliable

Levels of the V/L Methodology
• Level (-1): Weapon detection
• Level 0: Initial threat conditions
• Level 1: Initial threat/target conditions
• Level 2: Damage to target after interaction
• Level 3: Remaining platform capability after interaction
• Level 4: Remaining platform utility after interaction

Required Information
1. Likely combat scenarios
2. Scenario characteristics
3. Mission success criteria
4. Assessment of specific capabilities
5. Levels of specific capabilities
6. Subsystems to provide capabilities
7. Critical components
8. Likely damage to the system
9. Expected vulnerabilities

SLIDE 3

Glasow's Application
• The V/L methodology is a method for assessing the vulnerability of a weapon system in its environment
• The VV&A process is a method for assessing the vulnerability of a simulation in its environment
• HOWEVER, A WEAPON SYSTEM IS NOT A SIMULATION

Axioms
• Axiom 1: Mappings from one level to the next are noninvertible.
• Axiom 2: A level can be described with varying degrees of granularity, but the granularity at one level cannot exceed the granularity of its preceding level.
• Axiom 3: Metrics must be measurable and objective.

Similarities in Information Requirements

V/L methodology:
1. Likely combat scenarios
2. Scenario characteristics
3. Mission success criteria
4. Assessment of specific capabilities
5. Levels of specific capabilities
6. Subsystems to provide capabilities
7. Critical components
8. Likely damage to the system
9. Expected vulnerabilities

VV&A process:
1. Likely simulation application
2. Simulation functionalities
3. Simulation acceptability criteria
4. Verification and validation (V&V) criteria
5. Measures of performance and thresholds for V&V criteria
6. Simulation components (modules) needed to measure simulation performance
7. Critical algorithms and modules
8. Likely changes to the simulation
9. Expected credibility

Vulnerabilities of Simulations
• Simulations are subject to technical and political influences that may degrade the intended capability or functionality of the simulation
• A poorly designed simulation may result in inaccurate results, negative training, or failure to successfully accomplish a mission
• An inaccurate simulation could result in program cancellation
• Simulation development may be cancelled if the simulation fails to demonstrate significant enhancements to operational systems or cost effectiveness

Findings Common to Both the V/L Methodology and the VV&A Process
• Capabilities are application-dependent
• Users, strategists, and analysts are preferred sources for data regarding performance requirements
• Tailor the effort on the most critical aspects
• Learning and adaptation is the logical next step

SLIDE 4

Conclusions
• Use of a familiar, accepted methodology is a valuable and viable approach for enhancing understanding of VV&A within a specific community
• The methodology emphasizes simulation requirements and likely applications
• Political and technical changes to simulation development are proactively anticipated
• The potential impact of change is evaluated vis-à-vis the simulation's ability to meet the intended use

SLIDE 5


A COGNITIVE SYSTEMS ENGINEERING APPLICATION OF THE VULNERABILITY/LETHALITY ANALYSIS METHODOLOGY TO SIMULATION CREDIBILITY ASSESSMENT

Priscilla A. Glasow
The MITRE Corporation
1820 Dolley Madison Boulevard (W626)
McLean, Virginia 22102
pglasow@mitre.org
703-883-6931

PURPOSE OF STUDY

This paper examines the application of a widely-used methodology to determine the information requirements to support the Department of Defense's (DoD) verification, validation, and accreditation (VV&A) process. VV&A of models and simulations is required by policy, specifically, DoD Instruction 5000.61. More importantly, VV&A simply makes sense. Rational, ethical analysts use analytical tools, such as simulation, in which they have confidence. VV&A simply assesses the credibility of the simulation tools that we use so that we can have confidence in their application and in their results.

For the purposes of this paper, the DoD definitions for VV&A are used: Verification is the process of determining that a model accurately represents the developer's conceptual description and specifications. Validation is the process of determining the degree to which a model is an accurate representation of the real world. The intended use of the model determines the degree of accuracy required for validation. Accreditation is the official certification that a model or simulation is acceptable to use for a specific application.

The current challenge in implementing the spirit and intent of DoD's policy for VV&A is in overcoming a deeply embedded mystique that has arisen about this process. Despite years of informally ensuring the credibility of models and simulations, many organizations within DoD have been unable to grasp the relatively simple philosophy of VV&A or effectively implement VV&A guidelines in practice. The lack of a common basis for understanding VV&A is one of the many reasons that have been cited to explain this problem.

RATIONALE FOR THE SELECTED METHODOLOGY

Background

This paper hypothesizes that understanding the VV&A process can be enhanced by using a methodology with which the target audience is already familiar. Two assumptions are made. First, the methodology selected must be widely understood and used within a specific DoD community. Second, it must be possible to adapt the methodology to the VV&A problem.

The methodology chosen for this study was the vulnerability/lethality analysis methodology and its associated taxonomy. Two presentations given at the 15th International Symposium on Military Operational Research served as the genesis for this paper. First, Deitz (1998) presented his findings from recent applications of the vulnerability/lethality methodology. A second

SLIDE 6


presentation given by Nelson (1998) illustrated how the methodology could be adapted to examine information requirements associated with vulnerability/lethality analyses. The following sections describe the vulnerability/lethality analysis methodology, as defined by Deitz and others, and summarize Nelson's application. This background serves as a foundation for applying the methodology to improving community understanding of VV&A.

The Selected Methodology

The DoD missile defense community uses the vulnerability/lethality analysis methodology to assess the vulnerability of military platforms to attack. Vulnerability is defined as the characteristics of a system that cause it to suffer a degradation (loss or reduction) of capability to perform the designated mission(s) as a result of having been subjected to a hostile environment on the battlefield. It is generally an assumption in vulnerability studies that the threat warhead has engaged the target (Deitz, 1996, p. 8). Lethality is defined as the ability of a system to cause the loss of, or a degradation in, the ability of a target system to complete its designated mission(s) (p. 8). Deitz and Starks (1998) noted the distinct perspectives inherent in these concepts. Lethality is concerned with the effectiveness of the threat weapon, whereas vulnerability addresses the survivability of the platform (p. 6).

The vulnerability/lethality analysis methodology was selected because of its established theory and history of use, and because it is an approach that enjoys user acceptance. This choice recognizes the importance of user understanding and confidence in its use (Allen, 1996, p. 140). This familiarity further facilitates quicker understanding of the problem (de Czege, 1997).

Cognitive Systems Engineering Implications

Cognitive systems engineering is an "emerging interdisciplinary field . . . [that] applies what we know about how human decision makers use information in problem solving to identify information requirements and to design information presentations and interactions" (Ehrhart and Lehner, 1998, p. 6). Simple tools and methodologies are recommended by the cognitive systems engineering literature during the early stages of a process where creativity is important (Hacker, 1997). The vulnerability/lethality analysis methodology is a simple, understandable approach that can be used to facilitate understanding of VV&A. The methodology focuses the decision maker on the capabilities that are needed (Clegg et al., 1997; Nelson, 1998) and provides practical procedures that are easy to implement (Hammond et al., 1998). The vulnerability/lethality analysis methodology uses existing cognitive representations and mental models to help inform users within the missile defense community about VV&A. Simply put, the cognitive approach facilitates the educational process and successfully engages the missile defense community to look at VV&A in a new and familiar way.

DESCRIPTION OF THE METHODOLOGY

Purpose and History

The vulnerability/lethality analysis methodology evaluates the degradation in utility of military platforms caused by damage in combat. Military platforms are generally vehicles (tanks, ships, and aircraft) that support other combat systems (weapons systems and communications equipment).

SLIDE 7

Klopcic, Starks, and Walbert (1992) observed that expert judgement is often used to provide intuitive estimates of damage and remaining platform utility after attack (p. 2). There have also been attempts to collect empirical data that specifically relate types of damage to the loss of platform functions. Performance-oriented measures of effectiveness are a third source for determining the degradation of system effectiveness and functionality (pp. 2-3). Klopcic et al. (1992) found that deterministic models are not sufficient to adequately portray the multiple factors that determine combat damage. Instead, stochastic models are used to capture such factors as the direction and attitude with which the missile impacts the system, the mechanics that govern the fracturing of armored vehicles, or the effects of ricochet on systems and operators (p. 3). The vulnerability/lethality analysis methodology attempts to describe the interrelationships among these stochastic factors.

Klopcic et al. (1992) identified three primary components of vulnerability/lethality analysis information (p. 5). First, initial conditions between the threat and the target must be known. Second, the damage inflicted by the threat on the target must be identified. This is accomplished by assessing the damage states of the individual components of the target. Finally, the remaining functionality of the target must be determined.

Klopcic et al. (1992) also found that there is a lack of common understanding regarding the terminology that surrounds the vulnerability/lethality analysis methodology. Despite its use as a working vocabulary in the missile defense community, they determined that a vulnerability/lethality taxonomy was needed to clarify terms and definitions. This terminology is briefly described in the following section. Axioms derived from these terms are summarized in a succeeding section.
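The three primary components of vulnerability/lethality analysis information identified above (initial conditions, component damage states, and remaining functionality) can be sketched as a simple data structure. This is only an illustrative sketch; the class name, field names, and sample values are assumptions for this example, not part of the published taxonomy.

```python
from dataclasses import dataclass

# Hypothetical container for the three primary components of a
# vulnerability/lethality analysis per Klopcic et al. (1992):
# initial threat/target conditions, per-component damage states,
# and remaining functionality of the target.
@dataclass
class VLAnalysisRecord:
    initial_conditions: dict[str, float]       # e.g. impact velocity, angle
    damage_states: dict[str, bool]             # per component: damaged or not
    remaining_functionality: dict[str, float]  # per function: fraction retained

record = VLAnalysisRecord(
    initial_conditions={"impact_velocity_mps": 850.0, "impact_angle_deg": 12.0},
    damage_states={"engine": True, "optics": False},
    remaining_functionality={"mobility": 0.4, "target_acquisition": 1.0},
)
print(record.remaining_functionality["mobility"])  # -> 0.4
```

Keeping the three components separate mirrors the methodology's level structure: the first field belongs to Level 1, the second to Level 2, and the third to Level 3.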

Terminology

This section will describe some of the major terms used in vulnerability/lethality analyses. It is a synopsis of the work conducted by Klopcic et al. (1992) and Deitz (1996). Readers are encouraged to refer to the original works for a more complete understanding of the vulnerability/lethality taxonomy.

Spaces

A space is the set of all combinations of factors that could result in a given system state.

Levels

Klopcic et al. (1992) defined a level as the state of existence for the military platform (p. 5). The level contains the set of information that is necessary to describe the state of the system. Vulnerability begins at the point of initial interaction between a threat (such as a missile) and a target (military platform). Therefore, Level 1 of the vulnerability/lethality taxonomy represents the initial conditions between the threat and the target (Deitz and Starks, 1998, p. 2). Level 2 reflects the damage to the target after interaction with the threat. Level 3 is defined as the capability of the military platform remaining after damage (p. 4). As the vulnerability/lethality taxonomy evolved, three additional levels were added. Level 4 was added to represent the utility of the military platform to battlefield commanders given its remaining capability. The utility of the platform as it is engaged in a subsequent military action is described by measures of effectiveness for the platform in the given scenario. Level 4 is only achieved in actual wartime; hence, its metrics are not observable, and simulation is often used to illustrate expected platform utility after damages have been incurred (Deitz, 1996, p. 2). Level 0 was added in recent years to illustrate the lethality of an interaction. Level 0 is used to represent the initial conditions of a threat prior to launch (Deitz, 1996, p. 2). Level (-1) was also added to the methodology to reflect weapon detection and identification conditions prior to threat launch (Deitz and Starks, 1998, p. 3). This level includes the decision to fire and the configuration of the launcher.

SLIDE 8

Mappings

Mappings are the types of information that allow movement to another level. For example, the damage that occurs from missile impact on the target may be described by empirical data or by theoretical mappings about similar missile impacts. The effects of damage on the platform's capabilities may require the use of analytical mappings (Klopcic et al., 1992, pp. 10-11).

Axioms

Klopcic et al. (1992) also defined axioms that describe the relationships among mappings, levels, and spaces.

Mappings and Levels

For any given set of initial conditions, a probability distribution can be found that reflects the likelihood that those conditions will result in a particular type of damage (Klopcic et al., 1992, p. 11). Similarly, a probability distribution can be found that reflects the likelihood that a given type of damage will cause particular system capabilities to be lost. Klopcic et al. noted, however, that mappings from one level to the next are noninvertible (p. 12). In other words, it is not possible to determine what set of damage vectors caused a given set of capabilities to remain. Similarly, it is not possible to identify a single set of initial conditions that would have resulted in a particular damage vector.
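The noninvertibility of level-to-level mappings can be illustrated with a toy stochastic mapping. This is a minimal sketch under invented assumptions: the initial conditions, damage vectors, and probabilities are made up for illustration. The point is that two distinct initial conditions can each produce the same damage vector with nonzero probability, so observing the damage vector cannot recover a unique cause.

```python
import random

# Toy Level 1 -> Level 2 mapping: each initial condition induces a
# probability distribution over damage vectors (all values invented).
MAPPING = {
    "head_on_impact": {("engine_hit",): 0.7, ("optics_hit",): 0.3},
    "oblique_impact": {("engine_hit",): 0.4, ("optics_hit",): 0.6},
}

def sample_damage(initial_condition: str, rng: random.Random) -> tuple:
    """Draw one damage vector from the distribution for an initial condition."""
    outcomes, weights = zip(*MAPPING[initial_condition].items())
    return rng.choices(outcomes, weights=weights)[0]

def preimage(damage_vector: tuple) -> list[str]:
    """All initial conditions that could have produced this damage vector."""
    return [ic for ic, dist in MAPPING.items() if dist.get(damage_vector, 0) > 0]

# Both initial conditions can yield ("engine_hit",), so the mapping
# cannot be inverted back to a single set of initial conditions.
print(preimage(("engine_hit",)))  # -> ['head_on_impact', 'oblique_impact']
```

This is exactly the situation the axiom describes: the forward direction is a well-defined probability distribution, but the backward direction yields only a set of possible causes, never a unique one.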

Granularity Among Levels and Spaces

Klopcic et al. (1992) observed that a given level can be represented by any number of spaces that illustrate different degrees of granularity of the problem (p. 7). Each possible space could reflect a finer dissection of the target. The granularity required at a given level of the process must be supported by the granularity at preceding levels (Deitz, 1996, p. 3). For example, sufficient detail of remaining capabilities at Level 3 precludes the use of simple binary damage vectors at Level 2. The choice of granularity level is determined by the intended use of the model (p. 6).

Metrics

Deitz (1996) observed that the initial conditions, damage, and remaining capabilities are all measurable and objective metrics. Measures of performance are often used to characterize these metrics, which can be used in various combinations to describe the vulnerability of a platform or the lethality of a weapon (p. 3).

APPLICATION OF THE METHODOLOGY

Nelson (1998) adapted the vulnerability/lethality analysis methodology to examine the information requirements that support vulnerability/lethality analyses. She described a process for determining data requirements and data availability, and developed a vulnerability assessment plan for filling voids where available data were insufficient or unreliable. Nelson's methodology for identifying data requirements, data availability, and data reliability is a tailoring method that can be applied to the VV&A of models and simulations. The following section briefly describes Nelson's application. Again, readers are urged to reference the original work to gain a full appreciation for her adaptation of the vulnerability/lethality analysis methodology.
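The granularity constraint described above (the detail demanded at one level must be supported by the preceding level) can be expressed as a simple consistency check. This is a hedged sketch: the numeric "granularity score" is an invented stand-in for whatever resolution measure a particular analysis actually uses.

```python
def check_granularity(levels: list[tuple[str, int]]) -> list[str]:
    """Return violations where a level demands finer granularity than its
    preceding level supplies (axiom 2). Higher score = finer resolution;
    the scores themselves are illustrative placeholders."""
    violations = []
    for (prev_name, prev_g), (name, g) in zip(levels, levels[1:]):
        if g > prev_g:
            violations.append(f"{name} ({g}) exceeds {prev_name} ({prev_g})")
    return violations

# Binary damage vectors at Level 2 (score 1) cannot support a detailed
# remaining-capability description at Level 3 (score 4).
chain = [("Level 1", 3), ("Level 2", 1), ("Level 3", 4)]
print(check_granularity(chain))  # -> ['Level 3 (4) exceeds Level 2 (1)']
```

The same check sketches the federation-fidelity point made later in the paper: a federation's overall fidelity cannot exceed the fidelity of its coarsest component.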

SLIDE 9


Application One: Vulnerability/Lethality Information Requirements

Nelson (1998) identified three categories of data required for vulnerability assessment (p. 13). First, information is needed to identify the likely scenarios in which the weapon system is likely to be exercised. Second, data are needed about the expected damage that might occur as a result of threat/target interaction. Third, the impact of damage on the platform's capabilities must be known. This information is not only required, but must also be available and reliable. Data voids exist where required information is unavailable or unreliable. The data voids define the information that must be gathered in the vulnerability assessment plan (p. 7).
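Nelson's rule that data voids exist wherever required information is unavailable or unreliable reduces to a simple set computation. A minimal sketch; the information items and their availability/reliability flags below are illustrative, not drawn from her study.

```python
# Each required information item is tagged with whether a data source
# exists (available) and whether that source can be trusted (reliable).
required = {
    "likely combat scenarios":  {"available": True,  "reliable": True},
    "scenario characteristics": {"available": True,  "reliable": False},
    "likely damage to system":  {"available": False, "reliable": False},
}

def data_voids(required: dict) -> list[str]:
    """Items to be gathered in the vulnerability assessment plan:
    required information that is unavailable or unreliable."""
    return [item for item, status in required.items()
            if not (status["available"] and status["reliable"])]

print(data_voids(required))
# -> ['scenario characteristics', 'likely damage to system']
```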

Nelson (1998) identified nine types of required information for conducting vulnerability/lethality analyses: (1) likely combat scenarios, (2) scenario characteristics, (3) mission success criteria, (4) assessment of specific capabilities, (5) levels of specific capabilities, (6) subsystems to provide capabilities, (7) critical components, (8) likely damage to the system, and (9) expected vulnerability.

She adapted the vulnerability/lethality analysis methodology using a three-phase approach. In Phase 1, the first seven information types were identified. This information described how the system would be used, the capabilities required, and the critical components that would provide those capabilities. In Phase 2, the likely sources of damage and the critical components that would be affected by that likely damage were determined (information types 8 and 7). During Phase 3, the critical components that were identified during Phase 1 were assessed according to their status as determined during Phase 2. The expected vulnerability (information type 9) was determined during Phase 3 from this assessment.

Application Two: Simulation Credibility

Nelson's (1998) application of the methodology suggested certain parallels that could be drawn between the assessment of weapon system vulnerability and the assessment of simulation

credibility. Clearly, a simulation is not a weapon system, and this article does not suggest that this is the case. However, the absence of credibility is a significant vulnerability for models and simulations, and indirect comparisons of these processes can be made.

Comparison of Definitions

As stated earlier in this paper, vulnerability is a loss or reduction of capability to perform a mission as a result of having been subjected to a hostile environment (Deitz, 1996, p. 8). A simulation under development can lose capability as a result of environmental influences that are hostile to its development. For example, poor simulation design may preclude mission accomplishment. Alternatively, inaccurate simulation of a warfighting capability could result in program cancellation. Just as vulnerability assessments are used to determine potential risks in weapon system design, VV&A is used to reduce developmental risks associated with modeling and simulation.

Applying the Vulnerability/Lethality Axioms to Simulation Credibility

Three axioms of the vulnerability/lethality analysis methodology were described earlier. These postulates can be applied to simulation credibility in a similar fashion.

The first axiom stated that mappings from one level to the next are noninvertible (Klopcic et al., 1992, p. 12). Noninvertibility refers to the inability to trace the exact origins of a particular

outcome. For example, in vulnerability/lethality terms, it is impossible to determine the exact set of damage vectors that caused a particular capability to survive. To apply this axiom to the simulation credibility problem, a distinction must be made between deterministic and stochastic simulations. Klopcic et al. noted that deterministic models are insufficient to adequately portray the multiple factors that determine combat damage. Instead, stochastic models are used to

SLIDE 10

describe these interrelationships (p. 3). Although a distribution of input parameters could be identified from stochastic outputs, the random variability of stochastic models precludes determination of the exact input conditions that generate a specific outcome. Hence, the first axiom holds true for stochastic models.

A second axiom addressed the granularity that is used to describe a level. A level can be represented by multiple spaces of different granularity. The degree of resolution selected for a particular application must provide consistency between levels, such that the granularity required at one level does not exceed the granularity of the preceding level (Klopcic et al., 1992, p. 8). In simulation credibility assessments, granularity and resolution issues are described in terms of the simulation's fidelity. Recent DoD initiatives to federate multiple models and simulations into a single simulation federation similarly require that the fidelity of the federation does not exceed the fidelity provided by any of its component models or simulations.

A third axiom discussed the importance of measurable and objective metrics (Klopcic et al., 1992, p. 7). Measures of performance and measures of effectiveness are thresholds used to assess the simulation's credibility against verification and validation (V&V) criteria and against acceptability criteria that are used to support the accreditation decision. Many simulation credibility assessments suffer from the lack of measurable, objective metrics. Without such thresholds, the decision maker does not know when enough verification and validation has been performed to adequately assess the simulation's credibility, and the accreditation decision is made without recourse to this vital information.

Applying the Methodology to Simulation Credibility

Comparison of methodology levels. Figure 1 is an attempt to illustrate the correspondence of the vulnerability/lethality levels to simulation development. The comparison of weapon system vulnerability to simulation credibility is obviously not a perfect fit. However, the similarities merit this attempt to use the established methodology to increase the missile defense community's understanding of VV&A. The following section will describe the relationships depicted in Figure 1. In this discussion, it is useful to think of the simulation as the target, and the simulation development environment as the threat.

[Figure 1]

Level (-1) is the detection phase of the methodology. For simulation credibility applications, a problem is identified that requires resolution. For weapon system vulnerability applications, this level describes the initial detection of a threat.

At Level 0, the decision to use a simulation is made. This decision includes determination of the simulation's design and performance requirements to ensure that the simulation will be able to solve the identified problem. Similarly, in the vulnerability/lethality analysis methodology, Level 0 is where the decision to use a particular weapon is made and the general conditions for its use are determined.

Level 1 is the initial interaction phase. For simulation credibility applications, this initial interaction occurs when the requirements of the problem are used to develop the simulation's conceptual model and high level design. This level corresponds to the initial interaction between the threat and the target in the vulnerability/lethality example. The simulation development process, however, is subject to the vagaries of the development environment, including technical changes and political influences. As the detailed design for the simulation is created and implemented in software, shortcomings of the conceptual model are identified and implementation of some planned capabilities may not be feasible.

SLIDE 11

Level 2 indicates where changes may be needed. In the case of simulation credibility, such changes may be needed in the design or use of the simulation. For the vulnerability/lethality example, changes caused by damage require reconsideration of the platform's mission.

At the next level, the proposed changes to the simulation's design and implementation indicate where its capability may be adversely affected. It is important to note that the changes are not made at this stage, but remain proposed until determination can be made whether they are truly needed. The VV&A process traces the design and implementation back to the original requirements. This activity helps determine those capabilities that were successfully implemented and whether they are sufficient to meet the original performance requirements for the simulation. The VV&A process assists the user in determining whether the proposed design changes are necessary or whether the remaining capability is adequate. The simulation's remaining capabilities parallel the remaining capabilities of the military platform at Level 3.

Finally, the degree to which the simulation meets the user's needs is a measure of its utility. The accreditation process either confirms the simulation's adequacy for the problem at hand or indicates that additional modeling or credibility assessment is needed. In extreme cases, the decision maker may decide that an entirely different simulation is needed. This corresponds to the Level 4 assessment of operational utility of the military platform.

Comparison of vulnerability factors. The application of the vulnerability/lethality analysis methodology to simulation credibility can also be assessed in terms of vulnerability factors. As previously noted, the weapon system operates within an environment that is characterized by political and technical considerations. Technical considerations include the mechanical and physical attributes of the weapon system that govern its use. The simulation is similarly designed and implemented within an environment that reflects political and technical influences. For simulation development, technical influences are reflected in software engineering and design considerations.

A second comparison can be made between the threats to the success of the weapon system's performance and the design changes that occur during the simulation development process. In particular, changes in simulation design affect the ability of the simulation to perform as required. Changes that occur later in the development cycle increase the costs and risks to the simulation. Such changes have been known to cause cancellation of the entire program as a result of disagreements about the extent and intent of those changes. Nelson (1998) emphasized the importance of making system vulnerability assessments early in the acquisition process to minimize the costs and risks of retrofit (p. 3). A similar mandate exists for early VV&A to minimize the costs and risks of redesign and reengineering.

A third point of comparison is between the operational mission of the weapon system and the intended application of the simulation. Just as operational tactics and doctrine govern the use of the weapon system, the styles and behaviors of simulation users govern the use of the simulation.

Finally, weapon system vulnerability is manifested in immediate and direct consequences, such as personnel casualties, the loss of expensive military platforms and weapons systems, and the possibility of complete mission failure. In contrast, simulations are vulnerable to the risk of losing credibility or of creating false credibility where adequate assessment of the simulation's validity has not been made. These risks may occur during the simulation development process itself or where simulation is used to support the acquisition of a weapon system or platform. In the latter case, simulations that are used to design weapons systems and platforms may increase the vulnerabilities of those systems, thereby resulting indirectly in personnel and hardware losses. Military personnel may be physically lost due to weapon system vulnerabilities that may

slide-12
SLIDE 12

8 zyxwvutsrqponmlkjihgfedcbaZYXWVUTSRQPONMLKJIHGFEDCBA not have been identified by unaccredited simulations. Similarly, personnel costs may be incurred for the mental burdens associated with fatigue, stress, and information overload resulting from poor simulation design. Again, the comparison is not perfect, but suffices to highlight the less recognized mental costs of military systems. Comparison of information requirements. The data elements defined by Nelson (1 998) can also be translated into information requirements to support simulation credibility assessments: (1) likely simulation application, (2) simulation functionalities, (3) simulation acceptability criteria, zyxwvutsrqponmlkjihgfedcbaZYXWVUTSRQPONMLKJIHGFEDCBA

(4) verification and validation

(V&V) criteria, (5) measures of performance and thresholds for V&V criteria, zyxwvutsrqponmlkjihgfedcbaZYXWVUTSRQPONMLKJIHGFEDCBA (6) simulation components (modules) needed to measure simulation performance, (7) critical algorithms and modules, (8) likely changes to the simulation, and (9) expected credibility. Findings from Applications One and Two The first important finding was Nelson’s zyxwvutsrqponmlkjihgfedcbaZYXWVUTSRQPONMLKJIHGFEDCBA

(

1998) recognition that “a set of aggregate capabilities may lead to mission success in one scenario, but not in a second scenario that requires a different combination of capabilities for mission completion” (p. 11). This finding echoes the Department of Defense requirement that models and simulations must be reaccredited for each new use, where it is assumed that no two applications will require the same information and functionality. Second, the change in perspective afforded by this methodology also suggests the sources of data that might be used to support VV&A. Nelson (1998) observed that the data sources for likely weapon system scenarios included system users, military strategists, and operations research analysts (p. 13). Simulation users, strategists, and analysts would similarly be the preferred sources for data regarding the requirements for new simulations, albeit with different skill sets depending on the nature of the simulation. Nelson (1 998) further identified defense databases, system developers, and analytical agencies as possible sources for data to support the determination of expected damage and the impact of that damage on the weapon system. Again, these same sources would be appropriate for anticipating changes and impacts on the success of a simulation. Additional sources might be prior applications of similar simulations and in- process tests of the simulation under development. A third finding was that information needs must be prioritized according to the likelihood of the scenario and the decision maker’s confidence in the existing data. Nelson (1998) developed a matrix of nominal priorities to illustrate the interplay of these parameters (p. 19). She concluded that information searches should be tailored to fill those data voids where data are either unavailable or unreliable. Information searches should also be focused on the most critical functionalities required by the application. 
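Nelson's interplay of scenario likelihood and data confidence can be illustrated with a small sketch. The 3×3 scale, the names, and the weighting below are assumptions made for illustration only; they are not taken from Nelson's actual matrix of nominal priorities. The tailoring rule the sketch encodes is the one just described: search hardest where likely scenarios meet unavailable or unreliable data.

```python
# Illustrative sketch of a nominal priority matrix for tailoring
# information searches (scenario likelihood x data confidence).
# The 3x3 scale and weights are assumptions, not Nelson's values.

def search_priority(likelihood: str, confidence: str) -> int:
    """Return a nominal search priority from 1 (highest) to 9 (lowest)."""
    levels = {"high": 0, "medium": 1, "low": 2}
    # Likely scenarios with unreliable data are searched first;
    # unlikely scenarios with reliable data are searched last.
    return levels[likelihood] * 3 + (2 - levels[confidence]) + 1

# A likely scenario backed by unreliable data tops the search list:
assert search_priority("high", "low") == 1
# An unlikely scenario with reliable data falls to the bottom:
assert search_priority("low", "high") == 9
```

A real assessment would substitute the program's own likelihood and confidence judgments for these nominal weights.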
Current VV&A procedures also emphasize the importance of tailoring the effort to the most critical aspects of the simulation that will be used in the given application. Unfortunately, the concept of tailoring has not been well understood by the DoD modeling and simulation community, and tailoring methods remain immature. Nelson's use of the vulnerability/lethality analysis methodology suggests a different approach for reaching selected audiences through familiar, accepted methodologies.

A fourth finding suggests a possible extension of the vulnerability/lethality analysis methodology to include a Level 5 that reflects learning and adaptation. Figure 2 illustrates the addition of this level to the methodology. [Figure 2]


For example, assume that the military platform is not lost but has incurred heavy damage that renders the utility of the platform insufficient to continue its mission. The platform returns to base for repairs. At this time, small, real-time improvements in design or engineering can be made to implement lessons learned from actual battlefield conditions. In this manner, new learning is incorporated into system design and into the doctrine that governs the system's use. Similar learning and adaptation occur at Level 5 of the simulation credibility assessment. As noted earlier, the accreditation authority may require additional modeling or credibility assessment before considering accreditation of the simulation. Changes to the conceptual model or high-level design will necessitate a return to Level 1. In extreme cases, the decision maker may decide that the simulation is not the right tool to support problem resolution. This decision will necessitate a return to Level 0 and the selection of another tool. Learning is accommodated in this example through changes to simulation policy and procedure.

CONCLUSIONS

The selection of the vulnerability/lethality analysis methodology was strongly supported by the literature. The use of a familiar, accepted methodology within a given community was found to be a valuable and viable approach for educating that community about another process, namely, the VV&A of models and simulations. This change of perspective could eliminate the tendency to verify and validate simulations against ideal standards. Instead, information would first be collected on the system's requirements and likely applications. Additionally, insights into anticipated changes to the system as a result of environmental, technical, and political factors would be proactively explored. Finally, the potential impact of such changes would be evaluated in terms of the simulation's capability to meet its intended use.
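The Level 5 return transitions described above can be sketched as a small state machine. The level names follow Figures 1 and 2; the event names and the function itself are illustrative assumptions, not part of the published methodology.

```python
# Illustrative sketch of the credibility assessment levels and the
# Level 5 fallback transitions described in the text. Event names
# are assumptions made for illustration.

LEVELS = {
    0: "Decision (tool selection)",
    1: "Before Interaction (conceptual model / high-level design)",
    4: "Utility Assessment",
    5: "Repair or Redesign (learning and adaptation)",
}

def next_level(current: int, event: str) -> int:
    """Advance the assessment, or fall back from Level 5 on rework."""
    if current == 5 and event == "design_changed":
        return 1  # conceptual model or high-level design changed
    if current == 5 and event == "wrong_tool":
        return 0  # simulation is not the right tool; select another
    return min(current + 1, 5)  # otherwise proceed toward Level 5

assert next_level(5, "design_changed") == 1
assert next_level(5, "wrong_tool") == 0
```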


REFERENCE LIST

Allen, Bryce L. Information Tasks: Toward a User-Centered Approach to Information Systems. Library and Information Science. Ed. Harold Borko. San Diego: Academic Press, 1996.

Clegg, Chris, et al. "Information technology: A study of performance and the role of human and organizational factors." Ergonomics 40 (1997): 851-871.

de Czege, Huba Wass. Battle Command of 2020 and Beyond. Leesburg, VA: U.S. Army Research Institute, 1997.

Deitz, Paul H. Recent Applications and Implications of the Ballistic V/L Taxonomy. Royal Military College of Science, Shrivenham, UK, 1998.

Deitz, Paul H. "A V/L taxonomy for analyzing ballistic live-fire events." Aberdeen Proving Ground, MD: U.S. Army Research Laboratory, Ballistic Vulnerability/Lethality Division, 1996.

Deitz, Paul H., and Michael W. Starks. "The generation, use, and misuse of 'PKs' in vulnerability/lethality analyses." Aberdeen Proving Ground, MD: U.S. Army Research Laboratory, 1998.

Ehrhart, Lee S., and Paul E. Lehner. "Cognitive systems engineering: Where users meet technology." The Technology Track, January 1998: 6-8.

Hacker, Winfried. "Improving engineering design: Contributions of cognitive ergonomics." Ergonomics 40 (1997): 1088-1096.

Hammond, John S., Ralph L. Keeney, and Howard Raiffa. Smart Choices: A Practical Guide to Making Better Decisions. Boston: Harvard Business School Press, 1998.

Klopcic, J. Terrence, Michael W. Starks, and James N. Walbert. "A taxonomy for the vulnerability/lethality analysis process." Aberdeen Proving Ground, MD: U.S. Army Laboratory Command, Ballistic Research Laboratory, 1992.

Nelson, Martha Krug. "Assessing vulnerability: Benefits, costs, and risks." Lancaster, PA: Franklin and Marshall College, 1998.

[Figure 1. Mapping of Weapon System Vulnerability Levels to Simulation Credibility. Includes Level (-1), Detection.]

[Figure 2. Extension of the Methodology to Include Learning. Levels shown: Level (-1), Detection; Level 0, Decision; Level 1, Before Interaction; Level 4, Utility Assessment; Level 5, Repair or Redesign.]