of test data and information. Development teams would not have to "reinvent the wheel" when beginning a project. Instead, they would identify and learn from previous studies and findings. Setting multiservice operational test and evaluation standards similar to those in ISO 9000 is critical to the creation of an environment of more systematic data collection, analysis, and documentation.

The panel recommends (see Chapter 3) that DoD and the services develop a centralized test and evaluation data archive and standardize test data archival practices based on ISO 9000. This archive should include both developmental and operational test data, employ uniform terminology for data collection across the services, and carefully document development and test plans, development and test budgets, test evaluation processes, and the justification for all test-related decisions, including decisions concerning resource allocation.
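As an illustration only, a standardized archive record might capture the elements named in the recommendation in machine-readable form. The field names and values below are hypothetical sketches, not drawn from any DoD or service standard; an actual schema would be negotiated across the services and aligned with ISO 9000 documentation practices.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class TestEvent:
    """Hypothetical record layout for a centralized test and evaluation archive."""
    system_name: str               # system under test, using a service-wide identifier
    test_type: str                 # "developmental" or "operational"
    test_plan_ref: str             # pointer to the archived test plan document
    budget_ref: str                # pointer to the archived test budget
    evaluation_process_ref: str    # pointer to the documented evaluation process
    start_date: date
    end_date: date
    measures: List[str] = field(default_factory=list)   # measures of performance, in uniform terminology
    decisions: List[str] = field(default_factory=list)  # test-related decisions and their justifications

# Example of how a single event might be recorded (all values are invented):
event = TestEvent(
    system_name="EXAMPLE-SYSTEM-01",
    test_type="operational",
    test_plan_ref="archive://plans/example-01",
    budget_ref="archive://budgets/example-01",
    evaluation_process_ref="archive://processes/example-01",
    start_date=date(2024, 1, 15),
    end_date=date(2024, 3, 30),
    measures=["mean time between failures", "probability of mission success"],
    decisions=["reduced trial count from 40 to 30; justification: budget constraint documented in plan"],
)
```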

Standardization, documentation, and data archiving facilitate the use of all available information for efficient decision making. Routinely taking these steps will:

  • provide the information needed to validate models and simulations, which in turn can be used to plan for (or reduce the amount of) experimentation needed to reach specified operational test and evaluation goals;

  • allow the "borrowing" of information from past studies (if they are clearly documented and there is consistent usage of terminology and data) to inform the assessment of a system's performance based upon limited testing, by means of formal and informal statistical methods and other approaches, as sketched in the example following this list;

  • make possible the use of data from developmental testing for efficient operational test design;

  • allow learning from best current practices across the services; and

  • lead to an organized accumulation of knowledge within the Department of Defense.
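The "borrowing" of information mentioned in the second bullet can be made concrete with a simple Bayesian sketch. Assuming, purely for illustration, that archived developmental tests yielded a count of successes in comparable trials, a beta-binomial model lets a small operational test update that archived evidence rather than stand alone. The counts, the discount factor, and the choice of model below are all hypothetical; an actual analysis would require careful judgment about how comparable archived data are to operational conditions.

```python
from scipy import stats

# Illustrative counts only; not drawn from any actual test program.
prior_successes, prior_trials = 18, 20   # archived developmental results
ot_successes, ot_trials = 7, 8           # limited operational test results

# Beta-binomial conjugate update: archived results form the prior.
# A discount factor (between 0 and 1) hedges against over-weighting
# data collected under non-operational conditions.
discount = 0.5
alpha = 1 + discount * prior_successes
beta = 1 + discount * (prior_trials - prior_successes)

# Posterior for the system's success probability after the operational test.
posterior = stats.beta(alpha + ot_successes, beta + (ot_trials - ot_successes))

print("Posterior mean success probability:", round(posterior.mean(), 3))
print("90% credible interval:", [round(x, 3) for x in posterior.interval(0.90)])
```

With no borrowing (discount = 0), the assessment would rest on eight trials alone; with full borrowing (discount = 1), the archived trials would count as heavily as the operational ones. The discount makes the trade-off explicit and documentable.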

A key benefit of documenting and archiving test planning, test evaluation, and in-use system performance is the creation of feedback loops that identify system flaws for redesign and reveal where tests or models have missed important deficiencies in system performance.

Retention of records may involve additional costs, but it is clearly necessary for accountability in decision making. The trend in industry is to empower employees by giving them more responsibility for decisions; with that responsibility comes the need to hold people accountable for the decisions they make. This consideration is likely to be an important organizational aspect of the operational testing of defense systems.


