
APPENDIX E
Glossary and Acronyms


ACAT:

Acquisition Category; a designation for each program, based on cost, that determines both the level of review required by law and the level at which milestone (see below) decision authority rests within DoD. There are four acquisition categories, ACAT I through ACAT IV; the most expensive systems are designated ACAT I. ACAT I programs have two subcategories: ACAT ID (milestone decision authority is the USD[A&T]) and ACAT IC (milestone decision authority is the DoD Component Head).

AFOTEC:

Air Force Operational Test and Evaluation Center

AMSAA:

Army Materiel Systems Analysis Activity

Analysis of Alternatives:

formerly the Cost and Operational Effectiveness Analysis (COEA); a cost-benefit analysis tool that provides justification for selecting one procurement option over an alternative.

Acquisition Program Baseline (APB):

can be viewed as a contract between the milestone decision authority and the relevant service, including the program manager and his/her supervisors.

ATACMS:

Army Tactical Missile System


BAT:

Brilliant Anti-Tank


CAIG:

Cost Analysis Improvement Group


DAB:

Defense Acquisition Board

Developmental Testing (DT):

specification-based testing that verifies the system meets all specifications and certifies that it is ready to enter operational testing; determines if and how the system works.

DIS:

Distributed Interactive Simulation

DoD:

Department of Defense

DOT&E:

Director, Operational Test and Evaluation

DT&E:

Developmental Testing and Evaluation


EADSIM:

Extended Air Defense Simulation; a force-on-force air defense model

EMD:

Engineering and Manufacturing Development

Evolutionary Procurement:

pertains to continuous changes to a system and the associated stage-wise testing and development, so that what is operationally tested is not necessarily what is deployed; particularly relevant to software and software-intensive systems.


IDA:

Institute for Defense Analyses; a federally funded research and development center established to assist the Office of the Secretary of Defense, the Joint Staff, the Unified Commands and Defense Agencies in addressing important national security issues, particularly those requiring scientific and technical expertise.

IOT&E:

Initial Operational Test and Evaluation

ISO:

International Organization for Standardization; a worldwide federation of 92 member countries established to promote the development of international standards and related activities to facilitate the exchange of goods and services.


JROC:

Joint Requirements Oversight Council; serves to support milestone review, validate the Operational Requirements Document, and validate mission need.


LOSFH:

Line of Sight—Forward Heavy; LOSFH was the generic name given early on to the conceptual ADATS system, while ADATS was the particular piece of hardware/software that was chosen to fill that role.


M&S:

Modeling and Simulation

MCOTEA:

Marine Corps Operational Test and Evaluation Activity

MDAPs:

Major Defense Acquisition Programs; ACAT I programs are MDAPs.

Measure of Effectiveness (MOE):

A measure of how well an operational or assigned task is accomplished; can be measured directly or may require the aggregation of MOPs.

Measure of Performance (MOP):

A measure of how well a system performs its function or how well a design characteristic meets an operational requirement.
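
Neither definition prescribes how MOPs are combined into an MOE. As a minimal, purely illustrative sketch (the measure names, normalized scores, and weights below are hypothetical, not taken from the report), an MOE is sometimes formed as a weighted combination of normalized MOPs:

```python
# Hypothetical illustration: aggregating normalized MOPs into a single MOE
# via a weighted average. All names, scores, and weights are invented.

mops = {                              # measures of performance, rescaled to [0, 1]
    "probability_of_hit": 0.82,
    "time_to_reload": 0.70,           # rescaled so that higher is better
    "detection_range": 0.91,
}
weights = {                           # hypothetical mission-derived weights (sum to 1)
    "probability_of_hit": 0.5,
    "time_to_reload": 0.2,
    "detection_range": 0.3,
}

moe = sum(weights[name] * score for name, score in mops.items())
print(f"Aggregate MOE: {moe:.2f}")    # 0.5*0.82 + 0.2*0.70 + 0.3*0.91 ≈ 0.82
```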

Milestone:

One of five steps in the procurement of a weapons system; Milestone 0 is the concept studies approval, Milestone I is the concept demonstration approval, Milestone II is the development approval, Milestone III is the production approval, and Milestone IV is the major modification approval.

Mission Needs Statement (MNS):

a conceptual document, prepared by the relevant military service in response to a perceived threat, that is supposed to identify a broadly stated operational need (not a specific solution to counter the perceived threat).

MTBOMF:

Mean Time Between Operational Mission Failure

MTTF:

Mean Time To Failure
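
Both quantities are typically estimated from test data as total operating time divided by the number of scored failures; a minimal sketch follows (the hours and failure count are invented for illustration):

```python
# Hypothetical illustration of an MTBOMF point estimate from operational
# test data: total operating hours divided by scored operational mission
# failures. The figures are invented.

total_operating_hours = 1250.0        # accumulated across all test articles
operational_mission_failures = 5      # failures scored as OMFs by evaluators

mtbomf_estimate = total_operating_hours / operational_mission_failures
print(f"Estimated MTBOMF: {mtbomf_estimate:.0f} hours")   # 250 hours
```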


Operational Effectiveness:

the capability of a system to perform its mission in the operational environment, and in the face of the expected threat, including countermeasures.

OMS/MP:

Operational Mode Summary and Mission Profiles; defines the environment and stress levels the system is expected to encounter in the field, including the overall length of the scenarios of use, the sequence of missions, and the maintenance opportunities.

Operational Requirements Document (ORD):

a document that describes in some detail the translation from the broadly stated mission need to the system performance parameters that the users and the program manager believe the system must have to justify its eventual procurement.

Operational Suitability:

the degree to which a system can be satisfactorily placed in field use, with consideration given to availability, compatibility, transportability, interoperability, wartime usage rates, maintainability, safety, human factors, manpower supportability, logistics supportability, natural environmental effects and impacts, documentation, and training requirements.

Operational Testing and Evaluation (OT&E):

pertains to field tests, under realistic conditions, to determine system effectiveness and suitability for use in combat by typical military users; assesses when and where the system will work.

OPTEC:

Army Operational Test and Evaluation Command

OPTEVFOR:

Navy Operational Test and Evaluation Force

OSD:

Office of the Secretary of Defense


PA&E:

Office of Program Analysis and Evaluation

PEO:

Program Executive Officer

PM:

Program Manager; the "champion" in the Department of Defense of a military system in development.


RAM:

Reliability, Availability, and Maintainability
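
As a rough, hypothetical illustration of how these three attributes relate, steady-state availability is commonly approximated as mean time between failures divided by the sum of mean time between failures and mean time to repair:

```python
# Hypothetical illustration: steady-state availability
# A = MTBF / (MTBF + MTTR). The figures are invented.

mtbf_hours = 250.0     # mean time between failures
mttr_hours = 4.0       # mean time to repair

availability = mtbf_hours / (mtbf_hours + mttr_hours)
print(f"Steady-state availability: {availability:.3f}")   # ≈ 0.984
```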

ROC:

Required Operational Capability


Test and Evaluation Master Plan (TEMP):

documents the overall structure and objectives of the test and evaluation program, provides a framework for generating detailed test and evaluation plans, and documents associated schedule and resource implications.


USD(A&T):

Under Secretary of Defense (Acquisition and Technology)


VV&A:

Verification, Validation, and Accreditation
