Statistical Methods for Testing and Evaluating Defense Systems: Interim Report
The Committee on National Statistics of the National Research Council (NRC) has had a long-standing goal of helping to develop and encourage the use of state-of-the-art statistical methods across the federal government. Prompted by this interest, discussions began several years ago during meetings of the Committee on National Statistics about the possibility of conducting a study for the U.S. Department of Defense (DoD). Mutual interest between the committee and the DoD Office of Program Analysis and Evaluation in greater application of statistics within DoD led to a meeting of key DoD personnel and several NRC staff. From that meeting, system testing and evaluation emerged as an area where statistical science could prove useful.
Consequently, at the request of DoD, the Committee on National Statistics, in conjunction with the NRC Committee on Applied and Theoretical Statistics, held a two-day workshop in September 1992 on experimental design, statistical modeling, simulation, sources of variability, data storage and use, and operational testing of weapon systems. The workshop was sponsored by the Office of the Director of Operational Test and Evaluation, and the Office of the Assistant Secretary of Defense for Program Analysis and Evaluation. The overarching theme of the workshop was that using more appropriate statistical approaches could improve the evaluation of weapon systems in the DoD acquisition process.
Workshop participants expressed the need for a study to address in greater depth the issues that surfaced at the workshop. Therefore, at the request of DoD, a multiyear panel study was undertaken by the Committee on National Statistics in early 1994. The Panel on Statistical Methods for Testing and Evaluating Defense Systems was established to recommend statistical methods for improving the effectiveness and efficiency of testing and evaluation of defense systems, with emphasis on operational testing. The 13-member panel comprises experts in the fields of statistics (including quality management, decision theory, sequential testing, reliability theory, and experimental design), operations research, software engineering, defense acquisition, and military systems. The study is sponsored by the DoD Office of the Director of Operational Test and Evaluation.
Early in its work, the panel formed seven working groups to study particular aspects of defense testing: (1) design of experiments; (2) uses of modeling and computer simulation; (3) system reliability,