E5. ENCLOSURE 5
INTEGRATED TEST AND EVALUATION (T&E)
E5.1. The PM, in concert with the user and
test and evaluation communities, shall coordinate developmental test and
evaluation (DT&E), operational test and evaluation (OT&E), LFT&E,
family-of-systems interoperability testing, information assurance testing, and
modeling and simulation (M&S) activities into an efficient continuum,
closely integrated with requirements definition and systems design and
development. The T&E strategy shall provide information about risk and
risk mitigation, provide empirical data to validate models and simulations,
evaluate technical performance and system maturity, and determine whether
systems are operationally effective, suitable, and survivable against the
threat detailed in the System Threat Assessment. The T&E strategy shall
also address development and assessment of the weapons support equipment
during the SDD phase, and into production, to ensure satisfactory test system
measurement performance, calibration traceability and support, required
diagnostics, and safety. Adequate time and resources shall be planned to
support pre-test predictions and post-test reconciliation of models and test
results, for all major test events. The PM, in concert with the user and test
communities, shall provide safety releases to the developmental and operational testers prior to any
test using personnel.
E5.2. The PM shall design DT&E
objectives appropriate to each phase and milestone of an acquisition program.
Testing shall be event driven and monitored by the use of success criteria
within each phase, OT&E entrance criteria, and other metrics designed to
measure progress and support the decision process. The OTA shall design
OT&E objectives appropriate to each phase and milestone of a program, and
submit them to the PM for inclusion in the Test and Evaluation Master Plan
(TEMP). Completed IOT&E and completed LFT&E shall support a beyond
LRIP decision for ACAT I and II programs for conventional weapons systems
designed for use in combat. For this purpose, OT&E shall require more than
an OA based exclusively on computer modeling, simulation, or an analysis of
system requirements, engineering proposals, design specifications, or any
other information contained in program documents (10 U.S.C. 2399 and 10 U.S.C.
2366, references (h) and (ae)).
E5.3. T&E Strategy
E5.3.1. Projects that undergo a Milestone A
decision shall have a T&E strategy that shall primarily address M&S,
including identifying and managing the associated risk, and that shall
evaluate system concepts against mission requirements. Pre-Milestone A projects
shall rely on the ICD as the basis for the evaluation strategy. For
programs on the OSD T&E Oversight List, the T&E strategy shall be submitted to USD(AT&L)
and DOT&E for approval.
E5.3.2. The T&E strategy for a program
using an evolutionary acquisition strategy shall remain consistent with the time-phased requirements
in the CDD/CPD.
E5.4. T&E Planning
E5.4.1. TEMP. The PMs for MDAPs, MAIS
Acquisition Programs, and programs on the OSD T&E Oversight List shall
submit a TEMP to the USD(AT&L) and the DOT&E for approval to support
Milestones B and C and the Full-Rate Production decision. The TEMP shall
describe planned developmental, operational, and live fire
testing, including measures to evaluate the performance of the system during these test periods;
an integrated test schedule; and the resource requirements to accomplish the
planned testing. The MDA or designee shall ensure that IOT&E entrance
criteria, to be used to determine IOT&E readiness certification in
support of each planned operational test, are developed and documented
in the TEMP.
E5.4.2. Planning shall provide for
completed DT&E, IOT&E, and LFT&E, as required,
before entering full-rate production.
E5.4.3. Test planning for commercial and
non-developmental items shall recognize commercial testing and experience,
but nonetheless determine the appropriate DT&E, OT&E, and LFT&E
needed to ensure effective performance in the intended operational
environment.
E5.4.4. Test planning and conduct shall
take full advantage of existing investment in DoD ranges, facilities, and other resources, including the
use of embedded instrumentation.
E5.4.5. Planning shall consider the
potential testing impacts on the environment (42 U.S.C. 4321-4370d and E.O. 12114,
references (x) and (az)).
E5.4.6. The concept of early and
integrated T&E shall emphasize prototype testing during system
development and demonstration and early OAs to identify technology risks and
provide operational user impacts.
E5.4.7. Appropriate use of accredited models and simulation shall support DT&E,
IOT&E, and LFT&E.
E5.4.8. The DOT&E and the Deputy
Director, DT&E/Office of Defense Systems (DS), Office of the
USD(AT&L), shall have full and timely access to all available developmental, operational, and live-fire T&E
data and reports.
E5.4.9. Interoperability Testing. All DoD
MDAPs, programs on the OSD T&E Oversight list, post-acquisition (legacy)
systems, and all programs and systems that must interoperate, are subject to
interoperability evaluations throughout their life cycles to validate their
ability to support mission accomplishment. For IT systems, including NSS,
with interoperability requirements, the Joint Interoperability Test Command
(JITC) shall provide system interoperability test certification memoranda to
the Director, Joint Staff J-6, throughout the system life-cycle and
regardless of ACAT.
E5.5. Developmental Test and Evaluation.
During DT&E, the materiel developer shall:
E5.5.1. Identify the technical
capabilities and limitations of the alternative concepts and design
options under consideration;
E5.5.2. Identify and describe design technical risks;
E5.5.3. Stress the system under test to at
least the limits of the Operational Mode Summary/Mission Profile, and, for
some systems, beyond the normal operating limits to ensure the robustness of
the design;
E5.5.4. Assess technical progress and
maturity against critical technical parameters, to include interoperability, documented
in the TEMP;
E5.5.5. Assess the safety of the
system/item to ensure safety during OT and other troop-supported testing and to support success in meeting design safety
requirements;
E5.5.6. Provide data and analytic support to the decision process to certify
the system ready for IOT&E;
E5.5.7. Conduct information assurance
testing on any system that collects, stores, transmits, or processes unclassified or
classified information;
E5.5.8. In the case of IT systems,
including NSS, support the DoD Information Technology Security Certification and Accreditation Process and Joint Interoperability
Certification;
E5.5.9. In the case of financial
management, enterprise resource planning, and mixed financial management
systems, the developer shall conduct an independent assessment of compliance factors established by the Office of
the USD(C); and,
E5.5.10. Prior to full-rate production,
demonstrate the maturity of the production process through Production Qualification
Testing of LRIP assets.
E5.6. Readiness for IOT&E. The
Services shall each establish an Operational Test Readiness Process for programs on the OSD T&E Oversight List, consistent
with the following requirements:
E5.6.1. The process shall include a review
of DT&E results; an assessment of the system's progress against critical
technical parameters documented in the TEMP; an analysis of identified
technical risks to verify that those risks have been retired during
developmental testing; and a review of the IOT&E entrance criteria
specified in the TEMP. Programs shall provide copies of the DT&E report and the progress assessment
to USD(AT&L) and DOT&E.
E5.6.2. The Service Acquisition Executive shall evaluate and determine materiel system
readiness for IOT&E.
E5.7. Operational Test and Evaluation
E5.7.1. OT&E shall determine the
operational effectiveness and suitability of a system under realistic
operational conditions, including combat; determine if thresholds in the
approved CPD and critical operational issues have been satisfied; and assess
impacts to combat operations.
E5.7.2. The lead OTA shall brief the
DOT&E on concepts for an OT&E 120 days prior to test start, shall
submit the OT&E plan 60 days prior to test start, and shall report
major revisions as they occur.
E5.7.3. Typical users shall operate and
maintain the system or item under conditions simulating combat stress
and peacetime conditions.
E5.7.4. The independent OTAs shall use
production or production-representative articles for the dedicated phase of
IOT&E that supports the full-rate production decision (or, for ACAT IA or other acquisition programs, the
deployment decision).
E5.7.5. Hardware and software alterations
that materially change system performance, including system upgrades and changes to correct deficiencies, shall
undergo operational testing.
E5.7.6. OTAs shall conduct an independent,
dedicated phase of IOT&E before full-rate production to evaluate operational effectiveness and suitability, as required
by reference (h).
E5.7.7. All weapon, Command, Control,
Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR), and information programs
that are dependent on external information sources, or that
provide information to other DoD systems, shall be tested and evaluated for information
assurance.
E5.7.8. The DOT&E shall determine the
quantity of articles procured for IOT&E for MDAPs; the cognizant OTA shall make this decision
for non-MDAPs (reference (h)).
E5.7.9. The DOT&E shall assess the
adequacy of IOT&E and LFT&E, and evaluate the operational
effectiveness, suitability, and survivability, as applicable, of systems
under DOT&E oversight. Programs under DOT&E oversight that are beyond LRIP shall require continued DOT&E test plan approval, monitoring,
and FOT&E reporting to:
E5.7.9.1. Complete IOT&E activity;
E5.7.9.2. Refine IOT&E estimates;
E5.7.9.3. Verify correction of deficiencies;
E5.7.9.4. Evaluate significant changes to system design or employment; and
E5.7.9.5. Evaluate whether or not the
system continues to meet operational needs and retains operational effectiveness in a substantially new environment.
E5.7.10. OT&E Information
E5.7.10.1. The responsible test
organization shall release valid test data and factual information in as
near real-time as possible to all DoD organizations and contractors with a
need to know. Data may be preliminary and shall
be identified as such.
E5.7.10.2. To protect the integrity of the
OTA evaluation process, release of evaluation results may be withheld until
the final report, according to the established policies of each OTA. Nothing
in this policy shall be interpreted as limiting the statutory requirement for immediate access to all OT&E results.
E5.7.10.3. The primary intent of this
policy is to give developing agencies visibility of factual data produced
during OT&E, while not allowing the developmental agency any influence over the outcome of
the evaluation.
E5.7.11. Use of Contractors in Support of OT&E
E5.7.11.1. Per reference (h), persons
employed by the contractor for the system being developed may only
participate in OT&E of major defense acquisition programs to the extent
that they are planned to be involved in the operation, maintenance, and other support of the system
when deployed in combat.
E5.7.11.2. A contractor that has
participated (or is participating) in the development, production, or
testing of a system for a DoD Component (or for another contractor of the
Department of Defense) may not be involved in any way in establishing
criteria for data collection, performance assessment, or evaluation
activities for OT&E. The DOT&E may waive such limitation if the
DOT&E determines, in writing, that sufficient steps have been taken to
ensure the impartiality of the contractor in providing the services. These
limitations do not apply to a contractor that has participated in such
development, production, or testing, solely in test or test support on behalf of
the Department of Defense.
E5.8. OSD T&E Oversight List. The
DOT&E and the Director, DS, shall jointly, and in consultation with the T&E executives of the cognizant DoD
Components, determine the programs designated for OSD T&E
oversight. The DoD memorandum entitled "Designation of Programs for OSD Test and Evaluation (T&E) Oversight"
(reference (ba)) identifies these programs.
E5.9. Live-Fire Test and Evaluation