2 Findings and Conclusions
Pages 16-29

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 16...
... The committee spent considerable time deliberating to ensure that the findings indeed represent what it had heard. The conclusions, in turn, represent the committee's consensus generalization of those findings.
From page 17...
... The program review process depicted in Figure 1-4 shows 6 milestone (MS) or Milestone Decision Authority (MDA) reviews in a typical program, plus 6 overarching integrated product team (OIPT) reviews before those, plus 6 Air Force Review Board (AFRB)
From page 18...
... [Table 2-1 excerpt; columns: Review Name, Type, Purpose, Frequency, OPR/Customer]
CSB (Configuration Steering Board) | Periodic oversight | Review requirements changes | Annual | SAE/USD(AT&L)
DAES (Defense Acquisition Executive Summary) | Periodic oversight | Ongoing performance | Quarterly | PM/DUSD(A&T)
From page 19...
... [Table 2-1 continuation columns (Presenter, Stakeholders, Briefings, Product/Output, Unique Prereview Documentation?, Duplication) for the OSD-level review rows; recoverable entries include a report to the OIPT (stakeholders: PM, user, OSD, service, KTR), an IIPT-or-equivalent report with recommendations (stakeholders: OSD staff, Joint Staff, OIPT chair, PEO), and the DAB's Acquisition Decision Memorandum (stakeholders: USD(AT&L), SAE, OIPT members, PM)]
From page 20...
... Table 2-1  Program Review Matrix (continued) [columns: Review Name, Type, Purpose, Frequency, OPR/Customer]
AFAA (AF Audit Agency) | Ad hoc oversight | Varies | Varies | AFAA/SECAF
DOD IG (DOD Inspector General) | Ad hoc oversight | Varies | Varies | DoD IG/SecDef
GAO (Government Accountability Office) | Ad hoc oversight | Cost, schedule, performance | Varies | GAO/Congress
Other ad hoc | Varies | Varies | Varies | Varies
TRA (Technology Readiness Assessment) | Technical | Executability confidence | MS | DDR&E/MDA
ASR (Alternative System Review) | Technical | Ready for technology development | MS A | Engineering/PM
SEAM (Systems Engineering Assessment Model) | Technical | Validates SE process | Varies | AFCSE/PM
SRR (System Requirements Review) | Technical | Executability confidence | One time | Engineering/PM
SDR (System Design Review) | Technical | Replaced by the SFR | One time | Engineering/PM
SFR (System Functional Review) | Technical | Ready for preliminary design | One time | Engineering/PM
PDR (Preliminary Design Review) | Technical | Ready for detailed design | One time | Engineering/PM
IBR (Integrated Baseline Review) | Technical | Align program expectations | Varies | PM
LHA (Logistics Health Assessment) | Technical | Logistics health | Unknown | Unknown; currently AAC
MRA (Manufacturing Readiness Assessment) | Technical | Executability confidence | One time | DDR&E/MDA
CDR (Critical Design Review) | Technical | Ready for fabrication | One time | Engineering/PM
From page 21...
... Table 2-1 (continued) [columns: Presenter, Stakeholders, Briefings, Product/Output, Unique Prereview Documentation?, Duplication; rows in the same order as page 20]
AFAA: PM, staff | SECAF, staff, AFAA office | Coordination meetings | Report | No | Partially
DOD IG: PM, staff | Varies - wide-ranging | Coordination meetings | Written report w/ recommendations | No | Partially
GAO: Varies - often PM or rep | Varies - wide-ranging | Coordination meetings | Written report w/ recommendations | No | Partially
Other ad hoc: Varies | Varies | Unknown | Varies | Varies | Yes
TRA: AF ST&E | PM, S&T, DDR&E, DAE | Unknown | Tech Readiness Levels/Plan | NDA | Yes
ASR: PM, PO & KTR SMEs | PM, user, KTR | Unknown | Rationale for preferred alternative | No | No
SEAM: PM/Engineering | AFCSE, PM | None | Assessment reports | No | Partially
SRR: PM/Engineering | Engineering, user, KTR | Unknown | Satisfy exit criteria | No | No
SDR: PM/Engineering | PM, contractor | Unknown | Satisfy exit criteria | No | No
SFR: PM/Engineering | Engineering, user, KTR | Unknown | Satisfy exit criteria | No | No
PDR: PM/Engineering | Engineering, user, KTR, DAE | None | Satisfy exit criteria | No | No
IBR: PM | PM, contractor | NDA | Mutual understanding of program baseline | No | Partially
LHA: Unknown | PM, user, AF logisticians | Unknown | Log/sustainment assessment | Unknown | Partially
MRA: PM/Engineering | PM, S&T | Unknown | Mfg Readiness Levels/Plan | NDA | NDA
CDR: PM/Engineering | Engineering, user, KTR, DAE | None | Satisfy exit criteria | No | No
From page 22...
... The committee believes that the Air Force could improve the effectiveness of its program review effort and reduce the burden on PMs by thoughtfully combining and scheduling reviews. The committee looked at the policies and processes of the National Aeronautics and Space Administration (NASA)
From page 23...
... In general, reviews provide technical and programmatic support to successfully execute acquisition programs, to inform decisions, to share awareness, and to engender program advocacy. In their answers to Survey Question 2.3, PMs said that reviews facilitated program execution as well as problem discovery and resolution at all levels of the acquisition enterprise, including industry.
From page 24...
... The committee found that the many disparate concerns of higher-level staffs had an impact on the program manager. For ACAT I programs, many of the written-in responses to Survey Questions 4.1-4.4 described DOD staff as a stove-piped bureaucracy, where domain "czars" have purview over a breadth of programs (by virtue of the OIPT structure or their membership on the DAB)
From page 25...
... In every case, those interviewed or surveyed cited significant costs in terms of money and time to carry out the reviews, and most of them also noted an adverse impact on their ability to carry out other PM responsibilities. Sixty-nine percent of survey respondents said they were working 51 or more hours per week (Survey Question 1.4)
From page 26...
... Some even went so far as to say that ineffectively executed reviews had actually had a negative impact on cost, schedule, and performance (CSP) by delaying program schedules or increasing program costs (based on written-in responses to Survey Question 3.25)
From page 27...
... Many survey respondents noted that more effort should be given to ensuring that the right subject matter experts and appropriate senior officials attend program reviews and that the number of attendees be limited to those who can add value to the meeting (based on responses to Survey Questions 2.6 and 3.15 and written-in responses to Survey Questions 3.30-4.4)
From page 28...
... Survey respondents mentioned a number of ways to improve reviews. Suggestions for improvement to address the problems cited above included narrowing the review's focus or changing its charter (based on responses to Survey Questions 2.6 and 3.15 and on written-in responses to Survey Question 3.30)
From page 29...
... Program review format and design need to reflect the greater complexity and interrelationships inherent in many current Air Force programs to ensure that a system of systems works across organizational constructs.

