Statistics, Testing, and Defense Acquisition

New Approaches and Methodological Improvements

Michael L. Cohen, John E. Rolph, and Duane L. Steffey, Editors

Panel on Statistical Methods for Testing and Evaluating Defense Systems

Committee on National Statistics

Commission on Behavioral and Social Sciences and Education

National Research Council

NATIONAL ACADEMY PRESS
Washington, D.C.
1998








NATIONAL ACADEMY PRESS
2101 Constitution Avenue, N.W., Washington, D.C. 20418

NOTICE: The project that is the subject of this report was approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. The members of the committee responsible for the report were chosen for their special competences and with regard for appropriate balance.

The project that is the subject of this report is supported by Contract DASW01-94-C-0119 between the National Academy of Sciences and the Director, Operational Test and Evaluation at the U.S. Department of Defense. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the view of the organizations or agencies that provided support for this project.

Additional copies of this report are available from the National Academy Press, 2101 Constitution Avenue, NW, Lock Box 285, Washington, DC 20055; (800) 624-6242 or (202) 334-3313 (in the Washington Metropolitan Area).

Library of Congress Cataloging-in-Publication Data

Statistics, testing, and defense acquisition: new approaches and methodological improvements / Michael L. Cohen, John E. Rolph, and Duane L. Steffey, editors; Panel on Statistical Methods for Testing and Evaluating Defense Systems, Committee on National Statistics, Commission on Behavioral and Social Sciences and Education, National Research Council.
p. cm.
Includes bibliographical references (p.) and index.
ISBN 0-309-06551-8 (pbk.)
1. United States—Armed Forces—Weapons systems—Testing—Statistical methods. I. Cohen, Michael L. II. Rolph, John E. III. Steffey, Duane L. IV. National Research Council (U.S.). Commission on Behavioral and Social Sciences and Education. Panel on Statistical Methods for Testing and Evaluating Defense Systems.
UF503 .S727 1998
358.4'1807'0973—ddc21
98-9065

Copyright 1998 by the National Academy of Sciences. All rights reserved.

Printed in the United States of America

PANEL ON STATISTICAL METHODS FOR TESTING AND EVALUATING DEFENSE SYSTEMS

JOHN E. ROLPH (Chair), Marshall School of Business, University of Southern California
MARION R. BRYSON, North Tree Management, Monterey, California
HERMAN CHERNOFF, Department of Statistics, Harvard University
JOHN D. CHRISTIE, Logistics Management Institute, McLean, Virginia
LOUIS GORDON, Private Consultant, Palo Alto, California
KATHRYN BLACKMOND LASKEY, Department of Systems Engineering and Center of Excellence in C3I, George Mason University
ROBERT C. MARSHALL, Department of Economics, Pennsylvania State University
VIJAYAN N. NAIR, Department of Statistics, University of Michigan
ROBERT T. O'NEILL, Division of Biometrics, Food and Drug Administration, U.S. Department of Health and Human Services
STEPHEN M. POLLOCK, Department of Industrial and Operations Engineering, University of Michigan
JESSE H. POORE, Department of Computer Science, University of Tennessee
FRANCISCO J. SAMANIEGO, Division of Statistics, University of California, Davis
DENNIS E. SMALLWOOD, Department of Social Sciences, U.S. Military Academy

MICHAEL L. COHEN, Study Director
DUANE L. STEFFEY, Study Director (to July 1995); Consultant
ANURADHA P. DAS, Research Assistant
ERIC M. GAIER, Consultant
CANDICE S. EVANS, Senior Project Assistant

COMMITTEE ON NATIONAL STATISTICS
1997-1998

NORMAN M. BRADBURN (Chair), National Opinion Research Center, University of Chicago
JULIE DAVANZO, RAND, Santa Monica, California
WILLIAM F. EDDY, Department of Statistics, Carnegie Mellon University
JOHN F. GEWEKE, Department of Economics, University of Minnesota, Minneapolis
ERIC A. HANUSHEK, W. Allen Wallis Institute of Political Economy, Department of Economics, University of Rochester
RODERICK J.A. LITTLE, Department of Biostatistics, University of Michigan
THOMAS A. LOUIS, School of Public Health, University of Minnesota
CHARLES F. MANSKI, Department of Economics, University of Wisconsin
WILLIAM NORDHAUS, Department of Economics, Yale University
JANET L. NORWOOD, The Urban Institute, Washington, D.C.
EDWARD B. PERRIN, School of Public Health and Community Medicine, University of Washington
PAUL ROSENBAUM, Department of Statistics, Wharton School, University of Pennsylvania
KEITH F. RUST, Westat, Inc., Rockville, Maryland
FRANCISCO J. SAMANIEGO, Division of Statistics, University of California, Davis

MIRON L. STRAF, Director
ANDREW WHITE, Deputy Director

Acknowledgments

The very nature of this study required the panel and staff to attend meetings (nine plenary and ten working group meetings) all over the country to collect information on test designs and evaluations for a wide variety of systems; a list of the systems we studied, including several we used as case studies, is in Appendix A. Locations we visited include the Army Test and Experimentation Command Headquarters at Fort Hunter Liggett in California; Eglin Air Force Base at Fort Walton Beach in Florida; the Air Force Operational Test and Evaluation Center in Albuquerque, New Mexico; the Navy Operational Test and Evaluation Force in Norfolk, Virginia; the Army Operational Test and Evaluation Command in Alexandria, Virginia; RAND in Santa Monica, California; and the Institute for Defense Analyses in Alexandria, Virginia.

We were extremely fortunate to meet with so many people—from the military services, the Office of the Secretary of the U.S. Department of Defense (DoD), and private organizations in the testing community—willing to share their expertise with, and extend their hospitality to, the panel: to all these individuals, we are grateful. We particularly wish to acknowledge the support of Philip Coyle, director, and Ernest Seglie, science adviser, DoD Office of the Director, Operational Test and Evaluation (the study sponsors); Henry Dubin, technical director, U.S. Army Operational Test and Evaluation Command; Steven Whitehead, technical director, U.S. Navy Operational Test and Evaluation Force; Marion Williams, technical director, U.S. Air Force Operational Test and Evaluation Center; and Robert Bell, technical director, U.S. Marine Corps Operational Test and Evaluation Activity.

In addition, many people went beyond the call of duty to assist the panel in

its work. We thank, first: Kwai Chan, Christine Fossett, Jackie Guin, Louis Rodrigues, and Robert Stolba, General Accounting Office; Ric Sylvester, Office of the Deputy Undersecretary of Defense for Acquisition Reform; Lee Frame and Austin Huangfu, Office of the Director, Operational Test and Evaluation; Margaret Myers and Ray Paul, Office of the Secretary of Defense; Patricia Sanders, Office of the Undersecretary of Defense for Acquisition and Technology; Dean Zerbe, Senator Grassley's Office; and Donald Yockey, former Under Secretary of Defense for Acquisition.

From the Air Force, we thank: Howard Leaf, Director of Test and Evaluation, U.S. Air Force; Suzanne Beers, David Blanks, Lyn Canham, Michael Carpenter, Charles Carter, Angie Crawford, David Crean, William Dyess, John Faris, Tim Gooley, Anthony "Shady" Groves, Ken Hebert, Brian Ishihara, Jeff Jacobs, Eric Keck, Roderick Leitch, Scott Long, Mike Malone, Michael McHugh, Donald Merkison, Terence Mitchell, Herbert Morgan, Ken Murphy, Sharon Nichols, Steve Ordonia, Ronald Reano, Mark Reid, James Sheedy, Brian Simes, Chuck Stansberry, Cecil Stevens, Robert Stovall, Frank Swehoskey, Scott Weisgerber, Larry Wolfe, and Dave Young, U.S. Air Force Operational Test and Evaluation Center.

From the Army, we thank: Susan Wright, Army Digitization Office; Cy Lorber, Army Materiel Command; Will Brooks, Sam Frost, Dwayne Nuzman, Jim Streilein, and Bill Yeakel, Army Materiel Systems Analysis Activity; Charles Pate, Training and Doctrine Command; Larry Leiby, Scott Lucero, John McVey, and Hank Romberg, U.S. Army Operational Test and Evaluation Command; Michael Hall, Greg Kokoskie, Ed Miller, Harold Pasini, Patrick Sul, and Tom Zeberlein, U.S. Army Operational Evaluation Command; Michael Jackson and Carl Russell, U.S. Army Test and Experimentation Center.
From the Navy, we thank: James Duff, former technical director of the Navy Operational Test and Evaluation Command; Donald Gaver, Naval Postgraduate School; and Karen Ahlquist, Mike Alesi, Jeff Gerlitz, Kevin Smith, and Cynthia Womble, Navy Operational Test and Evaluation Force.

And from other institutions and agencies, we thank: Nozer Singpurwalla, George Washington University; Robert Boling, Peter Brooks, William Buchanan, James Carpenter, Thomas Christie, Gary Comfort, Robert Daly, Robert Dighton, Richard Fejfar, Arthur Fries, David Hart, Kent Haspert, Anil Joglekar, Irwin Kaufman, Richard "Hap" Miller, Michael Shaw, David Spalding, Bradley Thayer, Alfred Victor, Charles Waespy, and Steve Warner, Institute for Defense Analyses; Dale Pace, Johns Hopkins University; and Patrick Vye, RAND.

This report has been reviewed by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the NRC's Report Review Committee. The purpose of this independent review is to provide candid and critical comments that will assist the authors and the NRC in making the published report as sound as possible and to ensure that the report meets institutional standards for objectivity, evidence, and responsiveness to the

study charge. The content of the review comments and draft manuscript remains confidential to protect the integrity of the deliberative process. We thank the following individuals for their participation in the review of this report: David S.C. Chu, RAND, Washington, D.C.; Phil E. DePoy, National Opinion Research Center, University of Chicago; Gerald P. Dinneen, consultant, Edina, Minnesota; William Eddy, Department of Statistics, Carnegie Mellon University; Alexander H. Flax, consultant, Potomac, Maryland; David R. Heebner, Heebner Associates, McLean, Virginia; Robert J. Hermann, United Technologies Corporation, Hartford, Connecticut; James Hodges, Division of Biostatistics, University of Minnesota, Twin Cities; William Howard, consultant, Scottsdale, Arizona; Joseph B. Kadane, Department of Statistics, Carnegie Mellon University; Patrick D. Larkey, Heinz School of Public Policy, Carnegie Mellon University; John L. McLucas, consultant, Alexandria, Virginia; General Glenn K. Otis (ret.), Newport News, Virginia; and Warren F. Rogers, Warren Rogers Associates, Inc., Middletown, Rhode Island. While the individuals listed above provided many constructive comments and suggestions, responsibility for the final content of this report rests solely with the authoring committee and the NRC.

The panel was fortunate to have an extremely able staff who both supported and led the panel through the past four years. I would particularly like to acknowledge the contributions of Duane Steffey, study director for the first phase of the study, and Michael Cohen, study director for the final phase. Their research and organizational skills, combined with their ability to develop contacts and foster relationships in the testing community, made them an important asset to the success of this study.
The panel is grateful to Eugenia Grohman, Associate Director for Reports of the Commission on Behavioral and Social Sciences and Education (CBASSE), for her fine technical editorial work, which contributed greatly to the readability of this report. Anu Das, research assistant for the study, provided invaluable support to the panel, particularly to the work of the software testing working group, assisting greatly in the preparation of Chapter 8. The panel's senior project assistant, Candice Evans, in addition to the difficult job of coordinating many offsite meetings—further complicated by the necessity for security clearances (and a hurricane!)—handled all aspects of report production, most notably helping prepare drafts of Chapters 1 and 2 and Appendix D.

Finally, no acknowledgment would be complete without thanking the panel members themselves: they traveled extensively to military bases and test facilities, contributed their time and expert knowledge, and drafted many of the sections of the report.

John E. Rolph, Chair
Panel on Statistical Methods for Testing and Evaluating Defense Systems

The National Academy of Sciences is a private, non-profit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Bruce Alberts is president of the National Academy of Sciences.

The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. William A. Wulf is president of the National Academy of Engineering.

The Institute of Medicine was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions in the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences by its congressional charter to be an adviser to the federal government and, upon its own initiative, to identify issues of medical care, research, and education. Dr. Kenneth I. Shine is president of the Institute of Medicine.

The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy's purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Bruce Alberts and Dr. William A. Wulf are chairman and vice chairman, respectively, of the National Research Council.

Contents

PREFACE   xi

EXECUTIVE SUMMARY   1

PART I   APPLICATIONS OF STATISTICAL PRINCIPLES TO DEFENSE ACQUISITION

1   INTRODUCTION   11
2   OPERATIONAL TESTING AND SYSTEM ACQUISITION   20
3   A NEW PARADIGM FOR TESTING AND EVALUATION IN DEFENSE ACQUISITION   34
4   UPGRADING STATISTICAL METHODS FOR TESTING AND EVALUATION   50

PART II   APPLICATIONS OF STATISTICAL METHODS TO OPERATIONAL TESTING

5   IMPROVING OPERATIONAL TEST PLANNING AND DESIGN   63
6   ANALYZING AND REPORTING TEST RESULTS   87
7   ASSESSING OPERATIONAL SUITABILITY   101
8   TESTING SOFTWARE-INTENSIVE SYSTEMS   127
9   USING MODELING AND SIMULATION IN TEST DESIGN AND EVALUATION   137
10   INCREASING ACCESS TO STATISTICAL EXPERTISE FOR OPERATIONAL TESTING   157

APPENDICES

A   CASE STUDIES AND SYSTEM DESCRIPTIONS   167
B   ABSTRACTS OF BACKGROUND PAPERS   191
C   A POTENTIAL TAXONOMY FOR OPERATIONAL TESTS   194
D   ELEMENTS OF ISO 9000   204
E   GLOSSARY AND ACRONYMS   210

REFERENCES   215

BIOGRAPHICAL SKETCHES OF PANEL MEMBERS AND STAFF   221

Preface

The Committee on National Statistics of the National Research Council (NRC) has had a long-standing goal of encouraging the development and use of state-of-the-art statistical methods across the federal government. In this context, discussions began several years ago during meetings of the Committee on National Statistics about the possibility of providing assistance to the U.S. Department of Defense (DoD). Mutual interest between the committee and the DoD Office of Program Analysis and Evaluation in the greater application of statistics within DoD led to a meeting of key DoD personnel and several NRC staff. As a result of this meeting, system testing and evaluation emerged as an area in which improvement in the application of statistics could prove useful.

Consequently, at the request of DoD, the Committee on National Statistics, in conjunction with the NRC Committee on Applied and Theoretical Statistics, held a 2-day workshop in September 1992 on experimental design, statistical modeling, simulation, sources of variability, data storage and use, and operational testing of weapon systems. The workshop was sponsored by the Office of the Director, Operational Test and Evaluation (DOT&E) and the Office of the Assistant Secretary of Defense for Program Analysis and Evaluation. Defense analysts were invited to write and present background papers and discuss substantive areas in which they sought improvements through application of statistical methods. Statisticians and other participants responded by suggesting alternative approaches to specific problems and identifying program areas that might especially benefit from the application of improved statistical methods. The overarching theme of the workshop was that using more appropriate statistical approaches could improve the evaluation of weapon systems in the DoD acquisition process. The workshop findings were published in Statistical Issues in Defense Analysis and Testing: Summary of a Workshop (Rolph and Steffey, 1994).

Workshop participants expressed the need for a study to address in greater depth the issues that surfaced at the workshop. A multiyear panel study, sponsored by DOT&E, was undertaken by the Committee on National Statistics in early 1994. The Panel on Statistical Methods for Testing and Evaluating Defense Systems was established to recommend statistical methods for improving the effectiveness and efficiency of testing and evaluation of defense systems, with emphasis on operational testing. The 13-member panel comprised experts in the fields of statistics (including quality management, decision theory, sequential testing, reliability theory, and experimental design), operations research, software engineering, defense acquisition, and military systems. The panel's interim report, Statistical Methods for Testing and Evaluating Defense Systems (National Research Council, 1995), presented some preliminary findings, but it did not offer any recommendations. Key chapters were devoted to experimental design of operational tests; operational testing of software-intensive systems; operational test and evaluation for reliability, availability, and maintainability; and use of modeling and simulation to assist in operational test design and evaluation.

This report presents the conclusions and recommendations resulting from the panel's 4-year study. The report is structured to accommodate various types of readers. Chapters 1-4 are for a non-technical audience. Chapter 1 discusses the panel's scope of work and how it was adjusted to deal with constraints on the application of statistics in the testing and acquisition of military systems. Chapters 2 and 3 assess the current use of testing in system development and identify key elements of a new paradigm for the use of testing as part of the development of defense systems.
Chapter 4 summarizes the substantial benefits that defense operational test design and evaluation would obtain from the use of statistical methods that reflect current practices. The changes recommended in Chapter 4 do not assume that the new paradigm recommended in Chapter 3 will be adopted and therefore can be implemented immediately. Chapters 5-9 explore, in more technical detail, the topics covered in Chapter 4 as well as additional issues concerning the application of state-of-the-art statistical methods to defense operational test design and evaluation: experimental design for operational testing (Chapter 5), operational test evaluation (Chapter 6), test design and evaluation for reliability, availability, and maintainability (Chapter 7), software testing (Chapter 8), and modeling and simulation for use in operational test design and evaluation (Chapter 9). Chapter 10 considers the need for the defense test and acquisition community to develop greater access to statistical expertise and how to do so.

Though the panel was not charged with the development or execution of technical work related to operational testing and evaluation, it decided that further exploration of certain technical issues would be useful for its deliberations. Thus, three technical papers were prepared: "Strategic Information Generation and Transmission: The Evolution of Institutions in DoD Operational Testing," "On the Performance of Weibull Life Tests Based on Exponential Life Testing Designs," and "Application of Statistical Science to Testing and Evaluating Software Intensive Systems." The panel has drawn from these papers, which will be published separately; abstracts of them are presented in Appendix B.

John E. Rolph, Chair
Panel on Statistical Methods for Testing and Evaluating Defense Systems
