TEST AND EVALUATION MANAGEMENT GUIDE
JANUARY 2005 FIFTH EDITION
Trang 3ACKNOWLEDGMENTS
This January 2005 update to the Defense Acquisition University's Test and Evaluation Management Guide includes updates from the Military Services, Defense Agencies, and other organizations, as well as changes in the May 2003 DoD 5000 series. Inputs were collated and finalized by DAU Program Director for Test and Evaluation Dr. John Claxton. Freelance writer Christina Cavoli and Defense AT&L editor-in-chief Collie Johnson edited the final document. Freelance designer and cartoonist Jim Elmore designed the cover, and final desktop publishing and layout was accomplished by Bartlett Communications and DAU Visual Information Specialist Deborah Gonzalez.
FOREWORD
This book is one of many technical management educational guides written from a Department of Defense perspective; i.e., non-Service specific. They are intended primarily for use in courses at the Defense Acquisition University and secondarily as a desk reference for program and project management personnel. These guidebooks are written for current and potential acquisition management personnel who are familiar with basic terms and definitions employed in program offices. The guidebooks are designed to assist government and industry personnel in executing their management responsibilities relative to the acquisition and support of defense systems.
Dr. John B. Claxton
Program Director, T&E
Defense Acquisition University
MODULE I — Management of Test and Evaluation
Chapter 1 — Importance of Test and Evaluation
1.1 Introduction
1.2 Testing as a Risk Management Tool
1.3 The T&E Contribution at Major Milestones
Chapter 2 — The Test and Evaluation Process
2.1 Introduction
2.2 Defense System Acquisition Process
2.3 T&E and the Systems Engineering Process (SEP)
3.3 OSD Oversight Structure
3.4 Service T&E Management Structures
4.2 Relationship to the Program Manager
4.4 PMO/Contractor Test Management
Integrated Product Teams for Test and Evaluation
Test Program Funding/Budgeting
Technical Reviews, Design Reviews, and Audits
Contractor Testing
Specifications
Independent Test and Evaluation Agencies
PMO Responsibilities for Operational Test and Evaluation
Summary
Chapter 5 — Test-Related Documentation
Introduction
Requirements Documentation
Program Decision Documentation
Program Management Documentation
Test Program Documentation
Other Test-Related Status Reports
Operational Test and Evaluation
Multi-Service Test and Evaluation
Joint Test and Evaluation
Live Fire Testing
Nuclear, Biological, and Chemical (NBC) Weapons Testing
Nuclear Hardness and Survivability (NHS) Testing
Test Program Inception
Areas of DT&E Focus
System Design for Testing
8.2 DT&E and the Review Process
8.3 Configuration Change Control
10.6 Low Rate Initial Production (LRIP)
10.7 Production Acceptance Test and Evaluation (PAT&E)
10.8 Summary
MODULE III — Operational Test and Evaluation
Chapter 11 — Introduction to Operational Test and Evaluation
Chapter 12 — OT&E to Support Decision Reviews
12.2 Concept Refinement (CR) and Technology Development (TD)
12.3 System Development and Demonstration (SDD)
12.4 Production and Deployment (P&D)
13.3 The Evaluation Process
13.4 Issues and Criteria
13.5 Measures of Effectiveness (MOEs)
13.6 Evaluation Planning
13.7 Evaluating Development and Operational Tests
14.3 Validity of Modeling and Simulation
14.4 Support to Test Design and Planning
14.5 Support to Test Execution
14.6 Support to Analysis and Test Reporting
14.7 Simulation Integration
Chapter 15 — Test Resources
Obtaining Test Resources
Test Resource Planning
Chapter 16 — Test and Evaluation Master Plan
MODULE V — Specialized Testing
Chapter 17 — Software Systems Testing
17.1 Introduction
17.2 Definitions
17.3 Purpose of Software Test and Evaluation
17.4 Software Development Process
17.5 T&E in the Software Life Cycle
17.6 Summary
Chapter 18 — Testing for Vulnerability and Lethality
18.1 Introduction
18.2 Live Fire Testing
18.3 Testing for Nuclear Hardness and Survivability
19.3 Conducting Logistics Support System T&E
19.4 Limitations to Logistics Support System T&E
19.5 Summary
Chapter 20 — EC/C4ISR Test and Evaluation
20.1 Introduction
20.2 Testing EC Systems
20.3 Testing of C4ISR Systems
20.4 Trends in Testing C4I Systems
20.5 T&E of Surveillance and Reconnaissance Systems
20.6 Summary
22.3 NATO Comparative Test Program
22.4 T&E Management in Multinational Programs
22.5 U.S. and NATO Acquisition Programs
22.6 Summary
Chapter 23 — Commercial and Non-Developmental Items
23.1 Introduction
23.2 Market Investigation and Procurement
23.3 Commercial Item and NDI Testing
23.4 Resources and Funding
23.5 Summary
Chapter 24 — Testing the Special Cases
24.1 Introduction
24.2 Testing with Limitations
24.3 Space System Testing
24.4 Operations Security and T&E
24.5 Advanced Concept Technology Demonstrations
APPENDICES
Appendix A — Acronyms and Their Meanings
Appendix B — DoD Glossary of Test Terminology
Appendix C — Test-Related Data Item Descriptions
Appendix F — Points of Contact for Service Test and Evaluation Courses

LIST OF FIGURES
Figure 1-1 The DoD Model
Figure 1-2 Life Cycle Cost Decision Impact and Expenditures
Figure 2-1 Testing and the Acquisition Process
Figure 2-3 The Systems Engineering Process: Requirements to Design
Figure 2-4 Systems Engineering Process and Test and Evaluation
Figure 2-5 Design Reviews
Figure 2-6 DoD Test and Evaluation Process
Figure 3-1 DoD Test and Evaluation Organization
Figure 3-2 Army Test and Evaluation Organization
Figure 3-3 Navy Test and Evaluation Organization
Figure 3-4 Air Force Test and Evaluation Organization
Figure 3-5 Test and Evaluation Executive Agent Structure
Figure 4-1 Lessons Learned from OT&E for the PM
Figure 7-2 Design for Testing Procedures
Figure 8-1 Relationship of DT&E to the Acquisition Process
Figure 8-2 Technical Review Timeline
The Simulation Spectrum
Values of Selected Criteria Conducive to Modeling and Simulation
Modeling and Simulation Application in Test and Evaluation
STEP Process
DoD Major Range and Test Facility Base
The Error Avalanche
System Development Process
Spiral Model of AIS Development Process
Live Fire Test and Evaluation Planning Guide
Logistics Supportability Objectives in the T&E Program
Integrated EC Testing Approach
EC Test Process Concept
EC Test Resource Categories
The Evolutionary Acquisition Process
Simple Multi-Service OT&E Test Team Composition
The Spectrum of Technology Maturity
Sample Test Plan Contents
Differences Between DT&E and IOT&E
Technical Reviews and Audits
Combined vs. Concurrent Testing: Advantages and Limitations
PRR Guidelines Checklist
Sample Evaluation Plan
TEMP Test Resource Summary Section
Test and Evaluation Master Plan Format
Relationships Between Key Concepts
Types of Live Fire Testing
Nuclear Hardness and Survivability Assessment Activities
MODULE I

MANAGEMENT OF TEST AND EVALUATION

1

IMPORTANCE OF TEST AND EVALUATION
1.1 INTRODUCTION
The Test and Evaluation (T&E) process is an integral part of the Systems Engineering Process (SEP), which identifies levels of performance and assists the developer in correcting deficiencies. It is a significant element in the decision-making process, providing data that support trade-off analysis, risk reduction, and requirements refinement. Program decisions on system performance maturity and readiness to advance to the next phase of development take into consideration demonstrated performance. The issue of paramount importance to the servicemember user is system performance; i.e., will it fulfill the mission? The T&E process provides data that tell the user how well the system is performing during development and if it is ready for fielding. The Program Manager (PM) must balance the risks of cost, schedule, and performance to keep the program on track to production and fielding. The responsibility of decision-making authorities centers on assessing risk trade-offs. As stated in Department of Defense Directive (DoDD) 5000.1, The Defense Acquisition System, "Test and evaluation shall be integrated throughout the defense acquisition process. Test and evaluation shall be structured to provide essential information to decision makers, assess attainment of technical performance parameters, and determine whether systems are operationally effective, suitable, survivable, and safe for intended use. The conduct of test and evaluation, integrated with modeling and simulation, shall facilitate learning, assess technology maturity and interoperability, facilitate integration into fielded forces, and confirm performance against documented capability needs and adversary capabilities as described in the system threat assessment."¹
1.2 TESTING AS A RISK MANAGEMENT TOOL
Correcting defects in weapons has been estimated to add from 10 percent to 30 percent to the cost of each item.² Such costly redesign and modification efforts can be reduced if carefully planned and executed T&E programs are used to detect and fix system deficiencies early in the acquisition process (Figure 1-1). Fixes instituted during early work efforts (Systems Integration (SI)) in the System Development and Demonstration (SDD) Phase cost significantly less than those implemented after the Critical Design Review (CDR), when most design decisions have already been made.

T&E results figure prominently in the decisions reached at design and milestone reviews. However, the fact that T&E results are required at major decision points does not presuppose that T&E results must always be favorable. The final decision responsibility lies with the decision maker, who must examine the critical issues and weigh the facts. Only the decision maker can determine the weight and importance that are to be attributed to a system's capabilities and shortcomings and the degree of risk that can be accepted. The decision-making authority will be unable to make this judgment without a solid base of information provided by T&E. Figure 1-2 illustrates the Life Cycle Cost (LCC) of a system and how decisions impact program expenditures.
A 1999 Defense Science Board (DSB) Task Force focused on a broad overview of the state of T&E within the Department of Defense (DoD). This group made the following observations about the T&E process:

• The focus of T&E should be on how best to support the acquisition process;
• T&E planning with Operational Test (OT) personnel should start early in the acquisition process;
• Distrust remains between the development and operational test communities;
• Contractor testing, developmental testing, and operational testing have some overlapping functions;
• Ensuring the test data are independently evaluated is the essential element, not the taking of the data itself.
Each template describes an area of risk and then specifies technical methods for reducing that risk. PMs and test managers may wish to consult these templates for guidance in reducing the risks frequently associated with test programs. The DoD manual Transition from Development to Production contains sample risk management templates.⁴
1.3 THE T&E CONTRIBUTION AT MAJOR MILESTONES
T&E progress is monitored by the Office of the Secretary of Defense (OSD) throughout the acquisition process. This oversight extends to Major Defense Acquisition Programs (MDAPs) or designated acquisitions. T&E officials within OSD render independent assessments to the Defense Acquisition Board (DAB), the Defense Acquisition Executive (DAE), and the Secretary of Defense (SECDEF) at each system milestone review. These assessments are based on the following T&E information:

• The Test and Evaluation Master Plan (TEMP) and more detailed supporting documents developed by responsible Service activities;
• Service test agency reports and briefings;
• T&E, Modeling and Simulation (M&S), and data from other sources such as Service PMs, laboratories, industry developers, studies, and analyses.
At Milestone B, the OSD T&E assessments reflect an evaluation of system concepts and technology alternatives using early performance parameter objectives and thresholds found in an approved preliminary TEMP. At Milestone C, assessments include an evaluation of previously executed test plans and test results. At the Full Rate Production Decision Review (FRPDR), assessments include consideration of the operational effectiveness and suitability evaluations of weapon systems.
A primary contribution made by T&E is the detection and reporting of deficiencies that may adversely impact the performance capability or availability/supportability of a system. A deficiency reporting process is used throughout the acquisition process to report, evaluate, and track system deficiencies and to provide the impetus for corrective actions that improve performance to desired levels.

1.3.1 T&E Contributions Prior to Milestone B
During Concept Refinement (CR) and Technology Development (TD) activities prior to Milestone B, laboratory testing and M&S are conducted by the contractors and the development agency to demonstrate and assess the capabilities of key subsystems and components. The test and simulation designs are based on the operational needs documented in the Initial Capabilities Document (ICD). Studies, analyses, simulations, and test data are used by the development agency to explore and evaluate alternative concepts proposed to satisfy the user's needs. Also during this period, the Operational Test Agency (OTA) monitors CR and TD activities to gather information for future T&E planning and to provide effectiveness and suitability input desired by the PM. The OTA also conducts Early Operational Assessments (EOAs), as feasible, to assess the operational impact of candidate technical approaches and to assist in selecting preferred alternative system concepts.
1.3.2 T&E Contributions Prior to Milestone C

The SDD work effort culminates in an Engineering Development Model (EDM), which is tested in its intended environment prior to Milestone C.
In SI, the development agency conducts DT&E to assist with engineering design, system development, and risk identification, and to evaluate the contractor's ability to attain desired technical performance in system specifications and achieve program objectives. The DT&E includes T&E of components, subsystems, and prototype development models. T&E of functional compatibility, interoperability, and integration with fielded and developing equipment and systems is also included. During this phase of testing, adequate DT&E is accomplished to ensure engineering is reasonably complete, including survivability/vulnerability, compatibility, transportability, interoperability, reliability, maintainability, safety, human factors, and logistics supportability. Also, this phase confirms that all significant design problems have been identified and solutions to those problems are in hand.
The Service Operational Test and Evaluation (OT&E) agency should conduct an EOA for the Design Readiness Review (DRR) to estimate the system's potential to be operationally effective and suitable; identify needed modifications; and provide information on tactics, doctrine, organization, and personnel requirements. The early OT&E program is accomplished in an environment containing limited operational realism. Typical operational and support personnel are used to obtain early estimates of the user's capability to operate and maintain the system. Some of the most important products of user assessments of system maintainability and supportability are human factors and safety issues.
In Systems Demonstration, the objective is to design, fabricate, and test a preproduction system that closely approximates the final product. T&E activities of the EDM during this period yield much useful information. For example, data obtained during EDM T&E can be used to assist in evaluating the system's maintenance training requirements and the proposed training program. Test results generated during EDM T&E also support the user in refining and updating employment doctrine and tactics.
During Systems Demonstration, T&E is conducted to satisfy the following objectives:

(1) As specified in program documents, assess the critical technical issues:
(a) Determine how well the development contract specifications have been met;
(b) Identify system technical deficiencies and focus on areas for corrective action;
(c) Determine whether the system is compatible, interoperable, and can be integrated with existing and planned equipment or systems;
(d) Estimate the Reliability, Availability, and Maintainability (RAM) of the system after it is deployed;
(e) Determine whether the system is safe and ready for Low Rate Initial Production (LRIP);
(f) Evaluate effects on performance of any configuration changes caused by correcting deficiencies, modifications, or Product Improvements (PIs);
(g) Assess human factors and identify limiting factors;
(2) Assess the technical risk and evaluate the trade-offs among specifications, operational requirements, LCCs, and schedules;
(3) Assess the survivability, vulnerability, and logistics supportability of the system;
(4) Verify the accuracy and completeness of the technical documentation developed to maintain and operate the weapon system;
(5) Gather information for training programs and technical training materials needed to support the weapon system;
(6) Provide information on environmental issues for use in preparing environmental impact assessments;
(7) Determine system performance limitations and safe operating parameters.

Thus, T&E activities intensify during this phase and make significant contributions to the overall acquisition decision process.
The development agency evaluates the results of T&E for review by the Service headquarters and the Service acquisition review council prior to system acquisition review by the Milestone Decision Authority (MDA). The evaluation includes the results of testing and supporting information, conclusions, and recommendations for further engineering development. At the same time, the OT&E agency prepares an Independent Operational Assessment (IOA), which contains estimates of the system's potential operational effectiveness and suitability. The Operational Assessment (OA) provides a permanent record of OT&E events, an audit trail of OT&E data, test results, conclusions, and recommendations. This information is used to prepare for Milestone C and supports a recommendation of whether the design and performance of the system in development justifies proceeding into LRIP.
1.3.3 T&E Contributions Prior to Full Rate Production Decision Review
The development agency transitions the final design to LRIP while fixing and verifying any technical problems discovered during the final testing of the EDM in its intended environment. The maturity of the hardware and software configurations and the logistics support system available from LRIP are assessed when the development agency considers certifying the system's readiness for Initial Operational Test and Evaluation (IOT&E).
The IOT&E is conducted to:

(1) Assess the operational effectiveness and suitability of the system;
(2) Identify operational deficiencies;
(3) Evaluate changes in production configuration;
(4) Provide information for developing and refining logistics support requirements for the system and training, tactics, techniques, and doctrine;
(5) Provide information to refine Operations and Support (O&S) cost estimates and identify system characteristics or deficiencies that can significantly impact O&S costs.
IOT&E also supports certification of interoperability for the system components. In parallel with IOT&E, Live Fire Test and Evaluation (LFT&E) may be used to evaluate the vulnerability or lethality of a weapon system, as appropriate and as required by public law. The PM briefing and the Beyond Low Rate Initial Production (BLRIP) report address the risks of proceeding into Full Rate Production (FRP).
1.3.4 T&E Contributions After the Full Rate Production Decision Review
After FRPDR, when the FRP decision is normally made, T&E activities continue to provide important insights. Tests described in the TEMP but not conducted during earlier phases are completed. The residual DT&E may include extreme weather testing and testing of corrected deficiencies. System elements are integrated into the final operational configuration, and development testing is completed when all system performance requirements are met. During FRP, government representatives normally monitor or conduct the Production Acceptance Test and Evaluation (PAT&E). Each system is verified by PAT&E for compliance with the requirements and specifications of the contract.
Post-production testing requirements may result from an acquisition strategy calling for incremental changes to accommodate accumulated engineering changes or the application of Preplanned Product Improvements (P3Is). This will allow parallel development of high-risk technology and modular insertion of system upgrades into production equipment. Technology breakthroughs and significant threat changes may require system modifications. The development of the modifications will require development testing; if system performance is significantly changed, some level of operational testing may be appropriate.
OT&E activities continue after the FRP decision in the form of Follow-on Operational Test and Evaluation (FOT&E). The initial phase of FOT&E may be conducted by either the OT&E agency or user commands, depending on Service directives. This verifies the operational effectiveness and suitability of the production system, determines if deficiencies identified during the IOT&E have been corrected, and evaluates areas not tested during IOT&E due to system limitations. Additional FOT&E may be conducted over the life of the system to refine doctrine, tactics, and techniques, to satisfy critical issues and objectives established for FOT&E, and to document its assessment of deficiencies resolved after SDD. Deficiencies that are not corrected are recorded.
A final report on FOT&E may also be prepared by the using command test team, emphasizing the operational utility of the system when operated, maintained, and supported by operational personnel using the concepts specified for the system. Specific attention is devoted to the following:

(2) The degree to which the system can be placed in operational field use, with specific evaluations of availability, compatibility, transportability, interoperability, reliability, wartime usage rates, maintainability, safety, human factors, manpower supportability, logistics supportability, and training requirements;
(3) The conditions under which the system was tested, including the natural weather and climatic conditions, terrain effects, battlefield disturbances, and enemy threat conditions;
(4) The ability of the system to perform its required functions for the duration of a specified mission profile;
(5) System weaknesses, such as the vulnerability of the system to exploitation by countermeasures techniques, and the practicality and probability of an adversary exploiting the susceptibility of a system in combat.

A specific evaluation of the personnel and logistics changes needed for the effective integration of the system into the user's inventory is also made. These assessments provide essential input for the later acquisition phases of the system development cycle.
1.4 SUMMARY

"Risk management," according to Transition from Development to Production, "is the means by which the program areas of vulnerability and concern are identified and managed."⁵ T&E is the discipline that helps to illuminate those areas of vulnerability. The importance of T&E in the acquisition process is summarized well in a July 2000 General Accounting Office (GAO) Report, NSIAD-00-199, Best Practices: A More Constructive Test Approach is Key to Better Weapon System Outcomes.⁶ The summary serves to underscore the importance of the T&E process as a whole:
• Problems found late in development signal weaknesses in testing and evaluation;
• Early testing to validate product knowledge is a best practice;
• Different incentives make testing a more constructive factor in commercial programs than in weapon system programs.
To lessen the dependence on testing late in development and to foster a more constructive relationship between program managers and testers, GAO recommends that the Secretary of Defense instruct acquisition managers to structure test plans around the attainment of increasing levels of product maturity, orchestrate the right mix of tools to validate these maturity levels, and build and resource acquisition strategies around this approach. GAO also recommends that validation of lower levels of product maturity not be deferred to the third level. Finally, GAO recommends that the Secretary require that weapon systems demonstrate a specified level of product maturity before major programmatic approvals.⁷
ENDNOTES

1. DoDD 5000.1, The Defense Acquisition System, May 12, 2003, p. 8.
2. BDM Corporation, Functional Description of the Acquisition Test and Evaluation Process, July 8, 1983.
3. Defense Science Board Task Force Report, Solving the Risk Equation in Transitioning from Development to Production, May 2, 1983 (later published as DoD Manual 4245.7-M).
4. DoD 4245.7-M, Transition from Development to Production, September 1985.
5. Ibid.
6. GAO/NSIAD-00-199, Best Practices: A More Constructive Test Approach is Key to Better Weapon System Outcomes, July 2000.
7. Ibid.
2

THE TEST AND EVALUATION PROCESS

2.1 INTRODUCTION
The fundamental purpose of Test and Evaluation (T&E) in a defense system's development and acquisition program is to identify the areas of risk to be reduced or eliminated. During the early phases of development, T&E is conducted to demonstrate the feasibility of conceptual approaches, evaluate design risk, identify design alternatives, compare and analyze trade-offs, and estimate satisfaction of operational requirements. As a system undergoes design and development, the iterative process of testing moves gradually from a concentration on Development Test and Evaluation (DT&E), which is concerned chiefly with attainment of engineering design goals, to increasingly comprehensive Operational Test and Evaluation (OT&E), which focuses on questions of operational effectiveness, suitability, and survivability. Although there are usually separate Development Test (DT) and Operational Test (OT) events, DT&E and OT&E are not necessarily serial phases in the evolution of a weapon system development. Combined or concurrent DT and OT are encouraged when appropriate, i.e., conferring possible cost or time savings.¹

T&E has its origins in the testing of hardware. This tradition is heavily embedded in its vocabulary and procedures. The advent of software-intensive systems has brought new challenges to testing, and new approaches are discussed in Chapter 17 of this guide. Remaining constant throughout the T&E process, whether testing hardware or software, is the need for thorough, logical, systematic, and early test planning, including feedback of well-documented test results.
2.2 DEFENSE SYSTEM ACQUISITION PROCESS

The objective of defense acquisition is to acquire quality products that satisfy user needs with measurable improvements to mission capability and operational support, in a timely manner, and at a fair and reasonable price. As it is now structured, the defense system life cycle is to replicate the preferred acquisition strategy of an Evolutionary Acquisition (EA) process that uses either incremental or spiral development processes. The three major elements (pre-systems acquisition, systems acquisition, and sustainment) may include the following five phases:

(1) Concept Refinement (CR)
(2) Technology Development (TD)
(3) System Development and Demonstration (SDD)
(4) Production and Deployment (P&D)
(5) Operations and Support (O&S)
As Figure 2-1 shows, these phases are separated by key decision points when a Milestone Decision Authority (MDA) reviews a program and authorizes advancement to the next phase in the cycle. Thus T&E planning and test results play an important part in the MDA review process. The following description of the defense system acquisition process, summarized from Department of Defense Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, shows how T&E fits within the context of the larger acquisition process.
Shortly after the milestone decision, an integrated team begins transitioning the test planning in the TDS into the evaluation strategy for formulation of a Test and Evaluation Master Plan (TEMP). The TD effort concludes with a decision review at Milestone B when an affordable increment of militarily useful capability has been identified, the technology for that increment has been demonstrated in a relevant environment, and a system can be developed for production within a short time frame (normally less than 5 years); or the MDA decides to terminate the effort. Typical T&E-related documents at the Milestone B review are: the Acquisition Decision Memorandum (ADM) (exit criteria), ICD, Capability Development Document (CDD) (performance parameters), Acquisition Strategy, System Threat Assessment (STA), an Early Operational Assessment (EOA), and the TEMP. Additional program management documents prepared before Milestone B include: the Analysis of Alternatives (AoA), Independent Cost Estimate (ICE), and the concept baseline version of the Acquisition Program Baseline (APB), which summarizes the weapon's functional specifications, performance parameters, and cost and schedule objectives.
The program office for major programs (Office of the Secretary of Defense (OSD) oversight) must give consideration to requesting a waiver for full-up system-level Live Fire Testing (LFT) and identification of Low Rate Initial Production (LRIP) quantities for Initial Operational Test and Evaluation (IOT&E).
The Milestone B decision is program initiation for systems acquisition and establishes broad objectives for program cost, schedule, and technical performance. After the Milestone B decision for program start, the Systems Integration (SI) work effort begins, during which a selected concept, typically a brassboard or early prototype, is refined through systems engineering, analysis, and design. Systems engineering must manage all requirements as an integrated set of design constraints that are allocated down through the various levels of design (Figure 2-2). This work effort ends when the integration of the system components has been demonstrated through adequate developmental testing of prototypes. The Design Readiness Review (DRR) decision evaluates design maturity and readiness to either enter into System Demonstration or make a change to the acquisition strategy. The System Demonstration work effort is intended to demonstrate the ability of the system to operate in a useful way consistent with the approved Key Performance Parameters (KPPs). Work advances the design to an Engineering Development Model (EDM) that is evaluated for readiness to enter LRIP. "Successful development test and evaluation to assess technical progress against critical technical parameters, early operational assessments, and, where proven capabilities exist, the use of modeling and simulation to demonstrate system integration are critical during this effort. Deficiencies encountered in testing prior to Milestone C shall be resolved prior to proceeding beyond LRIP."² The EDM should have demonstrated acceptable performance in DT&E and the Operational Assessment (OA), with acceptable interoperability and operational supportability.
2.2.3 Production and Deployment

During the LRIP work effort, the purpose is to achieve an operational capability that satisfies mission needs. The selected system design and its principal items of support are fabricated as production configuration models. Test articles normally are subjected to qualification testing, full-up LFT, and IOT&E. This work effort ends with the Full Rate Production Decision Review (FRPDR), marking entry into Full Rate Production (FRP) and deployment of the system for Initial Operational Capability (IOC). Key documents for the T&E manager at the time of the FRPDR are the updated TEMP, development testing results, the Service IOT&E report, and the Live Fire Test Report. For Acquisition Category (ACAT) I and designated oversight programs, the Director of Operational Test and Evaluation (DOT&E) is required by law to document the assessment of the adequacy of IOT&E and the reported operational effectiveness and suitability of the system. This is done in the Beyond LRIP (BLRIP) Report. Also mandated by law is the requirement for the DOT&E to submit the Live Fire Test Report prior to the program proceeding beyond LRIP. These DOT&E reports may be submitted as a single document.
2.2.4 Operations and Support

Production continues at full rate, allowing continued deployment of the system to operating locations and achievement of Full Operational Capability (FOC). This phase may include major modifications to the production configuration, increment upgrades, and related Follow-on Operational Test and Evaluation (FOT&E). Approval for major modifications should identify the actions and resources needed to achieve and maintain operational readiness and support objectives. The high cost of changes may require initiation of the modification as a new program. To determine whether major upgrades/modifications are necessary or deficiencies warrant consideration of replacement, the MDA may review the impact of proposed changes on system operational effectiveness, suitability, and management decision-making (Figure 2-3).

2.3 T&E AND THE SYSTEMS ENGINEERING PROCESS (SEP)

T&E supports the continuous evaluation of system operational effectiveness and operational suitability. To meet these objectives, appropriate test activities must be fully integrated into the overall development process. From a systems engineering perspective, test planning, testing, and analysis of test results are integral parts of the basic product definition process.
A system's life cycle begins with the user's needs, which are expressed as constraints, and the required capabilities needed to satisfy mission objectives. Systems engineering is essential in the first planning period, in conceiving the system concept and defining performance requirements for system elements. As the detailed design is prepared, systems engineers ensure balanced influence of all required design specialties, including testability. They resolve interface problems, perform design reviews, perform trade-off analyses, and assist in verifying performance. The days when one or two individuals could design a complex system, especially a huge, modern-age weapon system, are in the past. Modern systems are too complex for a small number of generalists to manage; systems require in-depth knowledge of a broad range of areas and technical disciplines. Systems engineers coordinate the many specialized engineers involved in the concurrent engineering process through Integrated Product and Process Development (IPPD). Integrated Product Teams (IPTs) are responsible for the integration of the components into a system. Through interdisciplinary integration, a systems engineer manages the progress of product definition from system level to configuration item level, detailed level, deficiency correction, and modifications/Product Improvements (PIs). Test results provide feedback to analyze the design progress toward performance goals. Tools of systems engineering include design reviews, configuration management, simulation, Technical Performance Measurement (TPM), trade-off analysis, and specifications.

What happens during systems engineering? The process determines what specialists are required; what segments and Non-Developmental Items (NDIs) are used; design performance limits; trade-off criteria; how to test; when to test; how to document (specifications); and what management controls to apply (TPM and design reviews).

Development and operational testing support the technical reviews by providing feedback to the SEP. More information on the reviews is contained in Chapter 8.
2.3.1 The Systems Engineering Process

The SEP is the iterative logical sequence of analysis, design, test, and decision activities that transforms an operational need into the descriptions required for production and fielding of all operational and support system elements. This process consists of four activities: requirements analysis, functional analysis and allocation, synthesis, and verification of performance (T&E), which support decisions on trade-offs and formalize the description of system elements. The systems engineering plan is described in Chapter 16 of the Defense Acquisition University guide Systems Engineering Fundamentals, January 2001.
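The four SEP activities form a loop that repeats at each level of design. The following is a minimal, hypothetical sketch of one pass through that loop; all requirement values, function names, and data structures are invented for illustration and do not come from the guide:

```python
# Illustrative sketch of the four SEP activities as one iteration of a loop.
# All names, values, and structures are hypothetical.

def requirements_analysis(need):
    # Translate a user need into a measurable performance requirement.
    return [{"id": "R1", "text": need, "threshold": 500}]  # e.g., range in km

def functional_analysis_allocation(requirements):
    # Identify WHAT must be done (no "how") and allocate it to components.
    return [{"req": r["id"], "function": "propel vehicle", "component": "engine"}
            for r in requirements]

def synthesis(allocations):
    # Invent a design answering "how" for each allocated function.
    return [{"component": a["component"], "design": "turbofan", "predicted": 520}
            for a in allocations]

def verify(designs, requirements):
    # T&E: compare demonstrated/predicted performance against each threshold.
    threshold = requirements[0]["threshold"]
    return all(d["predicted"] >= threshold for d in designs)

reqs = requirements_analysis("vehicle shall achieve 500 km range")
allocs = functional_analysis_allocation(reqs)
designs = synthesis(allocs)
print(verify(designs, reqs))  # True: the design meets the threshold at this level
```

When verification fails, the loop returns to requirements analysis or synthesis at the same level; when it succeeds, the process descends to the next level of design detail.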
The requirements analysis activity is a process used by the program office, in concert with the user, to establish and refine operational and design requirements that result in the proper balance between performance and cost within affordability constraints. Requirements analysis shall be conducted iteratively with Functional Analysis/Allocation (FA/A) to develop and refine system-level functional and performance requirements and external interfaces, and to provide traceability among user requirements and design requirements.
The FA/A activity identifies what the system, component, or part must do. It normally works from the top downward, ensuring requirements traceability and examining alternative concepts. This is done without assuming how functions will be accomplished. The product is a series of alternative Functional Flow Block Diagrams (FFBDs). A functional analysis can be applied at every level of development. At the system level, it may be a contractor or Service effort. During the CR and TD phases, developmental testers assist the functional analysis activity to help determine what each component's role will be as part of the system being developed. Performance requirements are allocated to system components.
The synthesis activity involves invention: conceiving ways to do each FFBD task to answer the "how" question. Next, the physical interfaces implied by the "how" answers are carefully identified (topological or temporal). The answers must reflect all technology selection factors. Synthesis tools include Requirements Allocation Sheets (RASs), which translate functional statements into design requirements and permit a long and complex interactive invention process with control, visibility, and requirements traceability. Developmental testers conduct prototype testing to determine how the components will perform assigned functions to assist this synthesis activity.
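An RAS can be pictured as a small traceability table linking each FFBD functional statement to the design requirement derived from it and the component it is allocated to. A hypothetical sketch, with invented field names, tasks, and requirements:

```python
# Hypothetical Requirements Allocation Sheet (RAS) rows. Each FFBD task is
# translated into a traceable design requirement allocated to a component.
ras = [
    {"ffbd_task": "1.1 Detect target",
     "design_req": "Radar detection range >= 40 km",
     "allocated_to": "sensor subsystem"},
    {"ffbd_task": "1.2 Track target",
     "design_req": "Track update rate >= 10 Hz",
     "allocated_to": "processor subsystem"},
]

def trace(subsystem):
    # Requirements traceability: which FFBD tasks drive a given subsystem?
    return [row["ffbd_task"] for row in ras if row["allocated_to"] == subsystem]

print(trace("sensor subsystem"))  # ['1.1 Detect target']
```

The same table, read in the other direction, tells a tester which design requirement (and hence which test planning limit) each functional statement produced.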
The verification loop and decision activity allows trade-off of alternative approaches to "how." This activity is conducted in accordance with decision criteria set by higher-level technical requirements for such things as: Life Cycle Costs (LCCs); effectiveness; Reliability, Availability, and Maintainability (RAM); risk limits; and schedule. It is repeated at each level of development. The verification and decision activity is assisted by developmental testers during the later SDD phase, when competitive testing between alternative approaches is performed.
The final activity is a description of system elements. Developing as the result of previous activities and as the final system design is determined, this activity takes form when specifications are verified through testing and when reviewed in the Physical Configuration Audit (PCA) and Functional Configuration Audit (FCA). Operational testers may assist in this activity. They conduct operational testing of the test items/systems to help determine the personnel, equipment, facilities, software, and technical data requirements of the new system when used by typical military personnel. Figure 2-4, Systems Engineering Process and Test and Evaluation, depicts the activities and their relationships.
2.3.2 Technical Management Planning

Technical management planning incorporates top-level management planning for the integration of all system design activities. Its purpose is to develop the organizational mechanisms for direction and control, and to identify personnel for the attainment of cost, performance, and schedule objectives. Planning defines and describes the type and degree of systems engineering management, the SEP, and the integration of related engineering programs. The design evolution process forms the basis for comprehensive T&E planning.
The TEMP must be consistent with technical management planning. The testing program outlined in the TEMP must provide the TPM data required for all design decision points, audits, and reviews that are a part of the systems engineering process. The configuration management process controls the baseline for the test programs and incorporates design modifications to the baseline determined to be necessary by T&E.
The TEMP and technical management planning must be traceable to each other. The system description in the TEMP must be traceable to systems engineering documentation such as the FFBDs, the RASs, and the Test Requirements Sheets (TRSs). Key functions and interfaces of the system with other systems must be described and correlated with the systems engineering documentation and the system specification. Technical thresholds and objectives include specific performance requirements that become test planning limits. They must be traceable through the planned systems engineering documentation and can be correlated to the content of the TPM Program. For example, firm criteria for reliability thresholds during OT&E testing must be delineated and agreed upon by the Program Manager (PM) and the Operational Test Director (OTD), and reflected in the TEMP.
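As an illustration of how a reliability threshold becomes a test planning limit, the standard zero-failure (success-run) formula from statistics, not taken from the guide, gives the number of independent trials that must all succeed to demonstrate reliability R at confidence level C: n = ln(1 - C) / ln(R). A minimal sketch:

```python
import math

def trials_for_reliability(reliability, confidence):
    """Zero-failure demonstration: smallest n such that, if all n independent
    trials succeed, the stated reliability is demonstrated at the stated
    confidence level. Standard success-run formula: n = ln(1-C) / ln(R)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Demonstrating 0.90 reliability at 80% confidence requires 16 failure-free trials:
print(trials_for_reliability(0.90, 0.80))  # 16
```

A calculation like this is one way the PM and OTD can agree in advance on how many OT&E trials a reliability threshold implies, before the TEMP is finalized.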
2.3.3 Technical Performance Measurement

TPM identifies critical technical parameters that are at a higher level of risk during design. It tracks T&E data, makes predictions about whether the parameter can achieve final technical success within the allocated resources, and assists in managing the technical program.

The TPM Program is an integral part of the T&E program. TPM is defined as product design assessment and forms the backbone of the development testing program. It estimates, through engineering analyses and tests, the values of essential performance parameters of the current program design. It serves as a major input in the continuous overall evaluation of operational effectiveness and suitability. Design reviews are conducted to measure the systems engineering progress. For more information, see Chapter 8. Figure 2-5 depicts the technical reviews that usually take place during the SEP and the related specification documents.
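TPM tracking amounts to comparing each demonstrated value of a critical technical parameter against its planned profile and threshold at successive review points. The sketch below is hypothetical; the parameter, review events, and all values are invented:

```python
# Hypothetical TPM track for one critical technical parameter: vehicle weight
# in kg, where lower is better. Planned profile and demonstrated values are
# invented for illustration.
threshold = 1200                                          # must not exceed at maturity
planned = {"CDR": 1300, "TRR": 1250, "IOT&E": 1200}       # planned profile by review
demonstrated = {"CDR": 1340, "TRR": 1245, "IOT&E": 1190}  # values from test events

for event in planned:
    variance = demonstrated[event] - planned[event]
    status = "on track" if variance <= 0 else "at risk"
    print(f"{event}: demonstrated {demonstrated[event]}, "
          f"planned {planned[event]}, variance {variance:+d} ({status})")
```

A parameter trending "at risk" against its planned profile is exactly the kind of signal that feeds design reviews and the continuous evaluation described above.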
2.3.4 System Baselining and T&E

The SEP establishes phase baselines throughout the acquisition cycle. These baselines (functional, allocated, product) can be modified with the results of engineering and testing. The testing conducted to prove the technical baselines is rarely the same as the operational testing of requirements related to the baseline.

Related to the baseline is the process of configuration management. Configuration management benefits the T&E community in two ways. Through configuration management, the baseline to be used for testing is determined. Also, changes that occur to the baseline as a result of testing and design reviews are incorporated into the test article before the new phase of testing (to prevent retest of a bad design).
2.4 DEFINITIONS

T&E is the deliberate and rational generation of performance data, which describes the nature of the emerging system, and the transformation of those data into information useful to the technical and managerial personnel controlling its development. In the broad sense, T&E may be defined as all physical testing, modeling, simulation, experimentation, and related analyses performed during research, development, introduction, and employment of a weapon system or subsystem. The Glossary of Defense Acquisition Acronyms and Terms, produced by the Defense Acquisition University, defines "Test" and "Test and Evaluation" as follows:
"Test" is any program or procedure that is designed to obtain, verify, or provide data for the evaluation of any of the following: 1) progress in accomplishing developmental objectives; 2) the performance, operational capability, and suitability of systems, subsystems, components, and equipment items; and 3) the vulnerability and lethality of systems, subsystems, components, and equipment items.
"Test and Evaluation" is the process by which a system or components are exercised and results analyzed to provide performance-related information. The information has many uses, including risk identification and risk mitigation, and empirical data to validate models and simulations. T&E enables an assessment of the attainment of technical performance, specifications, and system maturity to determine whether systems are operationally effective, suitable, and survivable for intended use, and/or lethal.
2.5 THE DOD T&E PROCESS

The DoD Test and Evaluation Process (Figure 2-6) is an iterative five-step process that provides answers to critical T&E questions for decision makers at various times during a system acquisition. The T&E process begins during the formative stages of the program with the T&E coordination function, in which the information needs of the various decision makers are formulated in conjunction with the development of the program requirements, acquisition strategy, and APBs.
Given certain foundation documentation, Step 1 is the identification of T&E information required by the decision maker. The required information usually centers on the current system under test, which may be in the form of concepts, prototypes, EDMs, or production-representative systems, depending on the acquisition phase. The required information consists of performance evaluations of effectiveness and suitability, providing insights into how well the system meets the user's needs at a point in time.

Figure 2-6. DoD Test and Evaluation Process
Step 2 is the pre-test analysis of the evaluation objectives from Step 1 to determine the types and quantities of data needed, the results expected or anticipated from the tests, and the analytical tools needed to conduct the tests and evaluations. The use of validated models and simulation systems during pre-test analysis can aid in determining: how to design test scenarios; how to set up the test environment; how to properly instrument the test; how to staff and control test resources; and how best to sequence the test trials.
Step 3, test activity and data management, is the actual test activity planning. Tests are conducted, and data management addresses the data requirements identified in Step 2. T&E managers determine what valid data exist in historical files that can be applied and what new data must be developed through testing. The necessary tests are planned and executed to accumulate sufficient data to support analysis. Data are screened for completeness, accuracy, and validity before being used for evaluation.
Step 4, post-test synthesis and evaluation, is the comparison of the measured outcomes (test data) from Step 3 with the expected outcomes from Step 2, tempered with technical and operational judgment. This is where data are synthesized into information. When the measured outcomes differ from the expected outcomes, the test conditions and procedures must be reexamined to determine whether the performance deviations are real or were the result of test conditions, such as lack of fidelity in computer simulation, insufficient or incorrect test support assets, instrumentation error, or faulty test processes. The assumptions of tactics, operational environment, system performance parameters, and logistics support must have been carefully chosen, fully described, and documented prior to test. Modeling and Simulation (M&S) may be used during the data analysis to extend the evaluation of performance effectiveness and suitability.
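The Step 4 comparison of measured against expected outcomes can be sketched as a simple tolerance check, where any deviation beyond the agreed tolerance triggers reexamination of the test conditions. All parameter names, values, and tolerances below are invented for illustration:

```python
def post_test_synthesis(expected, measured, tolerance):
    """Compare Step 3 test data against Step 2 pre-test predictions.
    Return the deviations that exceed the agreed tolerance; these are the
    outcomes that require reexamination of test conditions and procedures."""
    flagged = {}
    for name, exp in expected.items():
        deviation = measured[name] - exp
        if abs(deviation) > tolerance[name]:
            flagged[name] = deviation
    return flagged

expected  = {"range_km": 500, "mtbf_hr": 200}   # Step 2 pre-test predictions
measured  = {"range_km": 470, "mtbf_hr": 205}   # Step 3 test data
tolerance = {"range_km": 20,  "mtbf_hr": 15}

print(post_test_synthesis(expected, measured, tolerance))  # {'range_km': -30}
```

Here the range shortfall exceeds its tolerance and is flagged for judgment: is the deviation real, or an artifact of the test conditions?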
Step 5 is when the decision maker weighs the T&E information against other programmatic information to decide a proper course of action. This process may identify additional requirements for test data and iterate the DoD T&E process again.
2.6 SUMMARY

T&E is an engineering tool used to identify technical risk throughout the defense system acquisition cycle and a process for verifying performance. This iterative cycle consists of acquisition phases separated by discrete milestones. The DoD T&E process consists of developmental and operational testing that is used to support engineering design and programmatic reviews. This T&E process forms an important part of the SEP used by system developers and aids in the decision process used by senior decision authorities in DoD.
ENDNOTES

1. DoDI 5000.2, Operation of the Defense Acquisition System, May 12, 2003.