
6
Risk Aversion—Flying in the Face of Uncertainty

The session was moderated by committee member Molly Macauley, a senior fellow at Resources for the Future, a research organization in Washington, D.C. The panelists were General John Barry, a member of the Columbia Accident Investigation Board; Joseph Fuller, founder and president of Futron Corporation, a technology management-consulting firm headquartered in Bethesda, Maryland; Gregg Hagedorn, with the Naval Sea Systems Command (NAVSEA); Allan Mazur, a sociologist, engineer, and professor of public affairs in the Maxwell School of Syracuse University; Richard Obermann, the Democratic professional staff member on the House Committee on Science; and Michael G. Stamatelatos, NASA Director for Safety and Assurance Requirements.

The panelists for this session were asked to address the following questions:

  • What lessons might be shared about differences in risk perception by the public, the Congress, and the agency (NASA)? How are perceptions influenced by risks that are low probability but high cost?

  • The NASA model under discussion (ASTRA) omits explicit treatment of risk. Risk can be defined in many ways—it can, for example, include economic, technological, and political uncertainty—but no matter how it is defined, the model does not explicitly include it. Specifically, the model does not (1) incorporate the consequences of failure to meet milestones, (2) identify decision points at which technology development might be terminated because of cost, engineering problems, or obsolescence, (3) illustrate the cost impacts of failure or redirection of technology development, or (4) include fallback strategies. What modeling techniques might you suggest that would enable the model to incorporate probabilistic treatment yet remain tractable? (One candidate technique is sketched just after this list.)

  • Among the arguments against including probabilistic treatment in the model are that it renders the model more difficult for decision makers to comprehend, and can undermine the political ability to sell the technologies. How significant are these concerns and how can they be addressed? Lessons learned from the development of other technologies (for instance, nuclear power generation, the superconducting supercollider, synthetic fuels) might be useful if you can share them.
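
One tractable technique for the second question above is a small Monte Carlo simulation over milestone outcomes, with an explicit budget-triggered termination point and a fallback strategy. The sketch below illustrates the general approach only; the milestones, probabilities, and costs are hypothetical and are not drawn from the ASTRA model.

```python
import random

# Hypothetical technology-development plan: each milestone has an assumed
# probability of success per attempt and a cost per attempt, in millions
# of dollars. (Illustrative numbers only -- not from ASTRA.)
MILESTONES = [
    ("component demo", 0.90, 10.0),
    ("subsystem test", 0.75, 40.0),
    ("flight demo",    0.60, 120.0),
]
BUDGET_CAP = 400.0    # decision point: terminate development past this spend
FALLBACK_COST = 80.0  # cost of falling back to an existing technology

def one_program(rng):
    """Simulate one program history: retry each milestone until it passes,
    or terminate at the budget cap and pay for the fallback instead."""
    spent = 0.0
    for _name, p_success, cost in MILESTONES:
        while True:
            spent += cost
            if spent > BUDGET_CAP:                # explicit termination point
                return spent + FALLBACK_COST, False
            if rng.random() < p_success:          # milestone met; move on
                break
    return spent, True

rng = random.Random(0)
results = [one_program(rng) for _ in range(100_000)]
costs = sorted(c for c, _ in results)
p_done = sum(ok for _, ok in results) / len(results)
print(f"P(new technology completed) = {p_done:.2f}")
print(f"mean cost ${sum(costs) / len(costs):.0f}M; "
      f"95th-percentile cost ${costs[int(0.95 * len(costs))]:.0f}M")
```

The outputs are a completion probability and a cost distribution rather than a single point estimate, which speaks to items (1) through (4) while remaining simple enough for decision makers to follow.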

Molly Macauley introduced the topic by offering some perspective on how to treat risk. The intent of the session was to provide an overview of risk in its various forms, since risk is not explicitly treated in the ASTRA model. She pointed out that something uncertain is not necessarily risky, and that risk need not result from something uncertain. In other words, uncertainty—for instance scientific, engineering, or technological uncertainty, or any other thing that is intrinsically unpredictable or stochastic—is not necessarily risky.

Risk can result from careless management decisions even in informed, certain situations. Risk can be technological, political, or financial. Macauley noted that, in his presentation the previous day, Steidle had commented that sustaining a long-term program through successive presidential administrations, Congresses, and budget cycles might be the greatest risk facing human space exploration.

Macauley also noted that neither uncertainty nor risk is necessarily bad and that, to some degree, both can be managed. In some cases, uncertainty can be reduced through additional research and development, as we learn by doing. Risk can be managed in a variety of ways, including private insurance for activities that take place in the private sector. In the case of government programs, however, the government typically self-insures, so the public bears the risk.

Macauley observed that risks that are voluntarily undertaken are usually perceived differently from risks that are involuntary. Low-probability, high-consequence risk is usually perceived differently from high-probability, low-consequence risk. Loss of life is typically seen as a greater risk than loss of property, even if the property loss involves billions of dollars.

Mazur commented that deaths associated with spaceflight, if they are highly publicized, become symbolic and therefore have a much greater effect on public policy than a body count from a probabilistic risk assessment. He said that deaths of aerospace industry workers, even of astronauts, have little impact on public sentiment if they occur outside the public eye. But even one highly visible death in flight can greatly affect a program, causing long delays or even cancellation, or even an increase in funding if that seems to offer a solution. He stated that probabilistic risk assessment goes out the window when astronaut deaths make the headlines. Mazur noted that there were many more news stories covering the Challenger and Columbia accidents than the Apollo fire. As a result, NASA was able to investigate the Apollo accident in-house and quickly resume the program. Because of increased media attention and the attendant public anguish, independent commissions were set up to investigate the two shuttle accidents, and these produced far longer delays in the program and severe criticism of the agency.

He pointed out that a sociological model of accident events differs from an engineering model and reveals aspects of the accidents that influence media coverage and public perception. According to Mazur, the engineering model focuses on the proximate causes of a disaster (e.g., frozen O-rings, broken foam) and their precursors (e.g., rushed launch schedules, NASA’s culture of risk). If engineers and risk analysts consider media coverage at all, they treat it peripherally, focusing on things like fairness and accuracy of the reporting. A sociological model treats an accident as a social event, the public reaction to which is greatly affected by the quantity and tone of news coverage. Just as the physical accident has precursors, journalistic coverage is also affected by important factors that precede or accompany the accident.

That a teacher was on Challenger, and that the accident itself was captured by the camera—the photo of the explosion is now iconic—reinforced the news coverage and contributed directly to President Reagan’s decision to form an investigative committee independent of NASA. (Mazur suggested how different the accident would have been, as a social event, had it occurred 1 minute later, out of camera range.) Because the Columbia accident came after September 11, 2001, and had an Israeli astronaut on board, it was immediately interpreted as a possible terrorist event. The shuttle’s disintegration, also captured on film, was shown repeatedly in the days following the accident.

Mazur, using the example of the Cassini mission, said that the sociological model applies as well to unmanned missions, such as space probes carrying radioactive material, that are perceived to present a risk to the public. The public controversy over the Cassini mission was familiar from other public controversies over risky technologies. The arguments of opponents and proponents led to balanced media coverage, with each side given a voice. Soon the controversy itself became newsworthy, which heightened the level of news reporting. With doubts about the safety of the mission featured in the newspapers and on television, public concern was aroused and opposition to the mission increased. Attempts by Cassini proponents to correct the news reports amounted to throwing gasoline on a fire, increasing the level of news coverage and therefore of public concern.

Asked by a panel member if NASA could influence the amount of media attention to human spaceflight, Mazur replied that coverage will be extensive if an accident is linked with an event such as having a schoolteacher on board. An unmanned flight can also escalate into a big media story if the flight itself is controversial (this could happen with future missions using technology developed under NASA’s Project Prometheus).

General John Barry began his summary of key points from the Columbia Accident Investigation Board (CAIB) report by noting that the patch honoring Apollo 1, Challenger, and Columbia on the back page of the report avows that exploration will continue in the face of adversity. In this context, he said, “adversity” can almost be replaced by “risk.” The CAIB found that the shuttle is not inherently unsafe; rather, it is a developmental vehicle, with the risks inherent in that status, not an operational vehicle. An incorrect mindset both inside and outside NASA—namely, that the shuttle is operational—contributed to safety problems, according to the CAIB.

Barry also pointed out that it was the first time the nation had used aging vehicles—the space shuttles—in an R&D environment, which presents a new challenge. The CAIB looked at both technical issues (for example, problems with the shuttle’s external tank foam insulation) and management/cultural issues inside NASA and then sought to make projections about high-risk areas associated with human spaceflight. The Columbia accident represents a turning point for NASA—impelling a new debate about the nation’s commitment to human spaceflight and a new vision for that commitment.

Barry emphasized that human spaceflight is not routine and has many risks. A risk-averse organization like NASA needs constant learning, but NASA did not go to school on Challenger. The U.S. Navy used the findings of the Challenger accident investigation as an example of how, after an accident, to learn from mistakes, but NASA did not. For instance, the same people controlled schedules for costs, testing, maintenance, flight, and so forth, yet the agency needed more checks and balances. Barry also discussed the tendency for the agency to normalize deviance—that is, when a mistake occurs more than once, the tendency is to accept it (for example, the repeated loss of foam became acceptable when it should not have). In addition, he said that when an issue is first raised, the agency’s approach is to prove that there is no problem but that after launch, the approach is to prove that there is a problem, making the agency reactive rather than proactive about risk.

One of the CAIB recommendations is that NASA needs to be a better-integrated organization. The agency’s shuttle program integration office did not really serve this purpose; rather, it was much like a specialty shop for particular expertise. Process had become too important. For example, how photos of the accident were requested became more important than the photos themselves and the reason for wanting them.

On the issue of improved communication, Barry observed that the use of e-mail has changed the way we communicate. He also criticized the propensity for PowerPoint presentations without written reports to back them up. Barry said that the safety program at NASA tends to be silent, yet the agency needs more effective data collection, trend analysis, and analysis of anomalies.

During the question and answer period, Barry affirmed that the CAIB recommendations are just as relevant to President Bush’s new vision of the human exploration of the Moon, Mars, and beyond. Asked how we can be sure that NASA implements the CAIB recommendations, Barry acknowledged that culture change is difficult and hard to measure and needs consistent, sustained leadership at the top. Asked about the value of human interaction relative to engineering interaction or systems interface, Barry emphasized the need for improved communication with a system of checks and balances in place.

In prepared remarks, Joseph Fuller discussed other kinds of risk besides safety risk. He pointed out that management risk is also important, and if poorly addressed, can result in failure and degraded performance. Fuller found that the ASTRA model fails to address any kind of risk. Because ASTRA is a systematic evaluation of technical or investment options, its use in a corporate setting would naturally include risk assessment. Fuller suggested an analysis of networked technology portfolios from which planners could generate systems and missions and perform system-of-systems analysis. He noted the difficulty of performing such analysis for ASTRA because of the absence of risk assessment standards and the limited capability for risk analysis within the community.
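
Fuller did not prescribe a formalism, but the networked-portfolio idea can be pictured as a dependency graph in which each candidate mission requires a set of technologies. A minimal sketch follows; the missions and technologies are invented for illustration. The same structure bears on the system-level question Donna Shirley raises later in this session: which missions survive the loss of a single technology?

```python
# Hypothetical mission-to-technology dependency map (invented for
# illustration; not an actual ASTRA portfolio).
MISSIONS = {
    "lunar lander":       {"precision landing", "cryogenic storage"},
    "mars sample return": {"precision landing", "ascent vehicle"},
    "deep-space probe":   {"nuclear power", "autonomous navigation"},
}

def feasible_after_loss(lost_tech):
    """Missions that remain feasible if one technology never matures."""
    return sorted(m for m, needs in MISSIONS.items() if lost_tech not in needs)

for tech in sorted({t for needs in MISSIONS.values() for t in needs}):
    print(f"lose {tech:22} -> still feasible: {feasible_after_loss(tech)}")
```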

Fuller observed that the aerospace industry is underperforming owing to a lack of leadership in creating the complex risk capabilities required for the future. Fuller agreed that the public appears to accept reasonable risk and that it expressed strong interest in continuing spaceflight after the Challenger and Columbia accidents; bureaucrats, however, appeared much less willing to assume the same risk. Fuller commented that perception of risk is a function of the knowledge one has about a program or mission; those who are closest perceive the risk as highest. NASA needs to provide better information on risk to the public—it needs to explain why and how decisions are made. Not all risk assessment needs to be probabilistic—there are other methods—but risk assessment of some kind is necessary.

Gregg Hagedorn discussed the NAVSEA experience with day-to-day management of the risk associated with its fleet (such as aircraft carriers and submarines). He said that the public understands risks to humans but has less appreciation of risks to technology. At present, the Navy’s Chief of Naval Operations is telling NAVSEA to take more risks. In modeling risk, Hagedorn pointed out the challenge of separating programmatic and technical risks to avoid conflict of interest, and said that a leader needs emotional maturity to be able to stand up and say “We made a $2 billion mistake.” In response to a question, Hagedorn noted that a model like ASTRA “never really worked in the Navy.”

Richard Obermann described how Congress views risk. He pointed out different kinds of risk, including safety risk (a dominant focus in the space world after the Columbia accident), cost and schedule risk, performance risk, and political risk. He noted that Congress understands risk, although not in a rigorous way, and often has to make decisions with little information (which, he observed, is itself one type of risk management). Congress is looking at whether the President’s new plan is safe enough. However, the perception of risk and safety can vary over time; perception immediately after an accident is quite different from perception at other times. The questions Congress must address are wide ranging. One is the cost estimate: Is it right? Too low? Another is performance risk: What has to work right for the program to succeed? Is a miracle involved? What if performance falls short of the goal? Is the initiative resilient? Political risk can mean risk to an individual member of Congress, but more broadly it is the vulnerability of a mission to the external political environment, such as a new administration, a new economic climate, or a new geopolitical environment. The lack of clear information on risk will make Congress less likely to commit to a new vision.

Obermann also noted that members of Congress perceive risk based on their past experience with NASA, and previous problems with the agency’s cost estimates make them more likely to view costs as high risk. Members also want milestones or tracking points to make sure a program proceeds on target. Obermann mentioned that on a number of occasions Congress has questioned the rigor and realism of some of NASA's planning efforts.

On the subject of probabilistic risk, Obermann asked how confident one could be that a risk is low probability. He pointed out that the risk of losing a shuttle was originally set at about 1 in 100,000. After the Challenger accident, it was set at 1 in 100. He asked whether that risk is one of losing just a single space shuttle or if it is the risk of losing the entire human spaceflight program. The risk and reward calculation depends on what is at stake, and different members of Congress have different opinions. It is a matter not only of how confident we can be that a risk is low probability but also of whether we can achieve consensus on what the cost or consequence is.
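
One simple way to quantify the first part of that question (using the standard binomial "rule of three," not anything Obermann presented) is to ask what flight history alone can support. If n flights occur without a failure, the 95 percent upper confidence bound on the per-flight failure probability is roughly 3/n:

```python
# 95% upper confidence bound on per-flight failure probability after
# n consecutive failure-free flights: solve (1 - p)**n = 0.05 for p.
for n in (24, 100, 1000):
    p_upper = 1 - 0.05 ** (1 / n)
    print(f"{n:5d} clean flights -> p <= {p_upper:.4f} (about 1 in {1 / p_upper:.0f})")
```

Even a thousand failure-free flights cannot, on flight history alone, support a figure like 1 in 100,000; estimates that low must rest on modeling assumptions, which is precisely where disagreement over confidence and consequence enters.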

Obermann said that risk planning tools (such as those used for ASTRA) are useful for internal planning but will not convince Congress that risk has been eliminated. More useful is a clear delineation of the high risks and of NASA’s strategy to alleviate them.

In response to questions, Obermann said that while members of Congress understand statistics, they also know that statistics can be used in different ways and are skeptical of a number in isolation. They also tend to cite the statistics that buttress their arguments, and once burned by a statistic, they are very wary of similar ones. Asked about the extent to which his comments also pertained to support agencies like the General Accounting Office, the Congressional Research Service, and the Congressional Budget Office, Obermann pointed out that support agencies respond to questions from members, and their analysts present the requested information along with the assumptions used in their analyses. He also noted that with the demise of the Congressional Office of Technology Assessment, there is no entity to provide real risk analysis for members of Congress.

In response to a comment about the small statistical samples that often constrain the modeling of low-probability events, Obermann noted that now and then members are interested in risks from near-Earth asteroids, but the meaning of such probabilities is hard to grasp. If the risk models for such low-probability events were better, perhaps there would be more interest in the issue.


Michael Stamatelatos began his comments by noting that risk is a combination of likelihood and severity of consequence, and that types of mission risk include technical (safety, performance) and programmatic (cost, schedule). He asserted that there is no such thing as qualitative risk assessment, because any meaningful qualitative statement about risk has, either explicitly or implicitly, some quantitative basis. Terms like “high,” “medium,” and “low” risk can be interpreted differently by different people in the absence of a quantitative reference. Qualitative risk assessment has value mainly as a simple way of communicating risk results that are obtained quantitatively. He also noted that perceived risk changes with exposure to consequences, even though quantitative risk may not have changed.
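
Stamatelatos’s point about qualitative labels can be made concrete with a minimal sketch in which risk is scored as likelihood times consequence and then binned into words; the thresholds below are invented solely for illustration:

```python
# Risk scored as likelihood x consequence, then binned into words. The
# bin edges here are arbitrary; the point is that "high" and "low" carry
# no shared meaning until some such numbers are agreed on.
def risk_label(p_per_mission, consequence_usd):
    expected_loss = p_per_mission * consequence_usd
    if expected_loss >= 1e7:
        return "high"
    if expected_loss >= 1e5:
        return "medium"
    return "low"

print(risk_label(1 / 100, 3e9))      # 1-in-100 loss of a $3B asset -> "high"
print(risk_label(1 / 100_000, 3e9))  # same consequence at 1-in-100,000 -> "low"
```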

Stamatelatos described the use of probabilistic risk assessment at NASA, pointing out that it is done for shuttle upgrades, construction in space (e.g., the ISS), safety compliance issues (e.g., those associated with Prometheus or Mars sample return), and design (e.g., Prometheus). He agreed that there is a need to improve risk awareness within NASA and to develop in-house expertise to understand probabilistic risk requirements, because risk assessment is a decision support tool and, as such, it cannot be effectively used if decision makers do not understand its methods and findings. He also commented on the need for risk awareness and for management decisions to be informed by risk but not be risk-based—that is, they should not rely solely on risk assessment.
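
For readers unfamiliar with the mechanics of the probabilistic risk assessments Stamatelatos describes, a minimal fault-tree computation is sketched below. The events, probabilities, and tree structure are invented; real NASA assessments involve thousands of events plus dependency and uncertainty treatment that this sketch omits.

```python
# Minimal fault-tree arithmetic over independent basic events.
def p_or(*ps):
    """OR gate: the output event occurs if any input event occurs."""
    p_none = 1.0
    for p in ps:
        p_none *= 1.0 - p
    return 1.0 - p_none

def p_and(*ps):
    """AND gate: the output event occurs only if all input events occur."""
    p_all = 1.0
    for p in ps:
        p_all *= p
    return p_all

# Hypothetical top event: loss of mission if propulsion fails, or if the
# primary avionics and its backup both fail.
p_loss = p_or(1e-3, p_and(1e-2, 1e-2))
print(f"P(loss of mission) = {p_loss:.2e}")  # ~1.1e-03, dominated by propulsion
```

Ranking the contributions of each branch this way is what lets an assessment say where risk-reduction spending buys the most, which is the decision-support role Stamatelatos describes.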

In response to a question, Stamatelatos said that his office was involved with the President’s initiative only at a conceptual level, but that once the vision is more concrete, more rigorous risk assessment can be conducted. In response to a question about how important it is to start risk assessment early in program planning, Stamatelatos said that it should start sooner rather than later; however, if you wait until you have all the information, you will not need a risk assessment. Quantitative risk assessment can inform decisions about where to put money and can identify where the largest risks are. Operational risk can also be reduced once a system has been designed and built, since specific components and technologies can be evaluated.

Donna Shirley asked how, for a broad vision such as ASTRA, losing one technology (or failing to bring it to maturity) affects the risk for the entire system. She contended that if a specific amount of risk were decreed at the highest level, program managers would be tempted to suppress information in order to meet that decree, which is not productive. Barry commented that NASA has never, going back as far as the Apollo program, adopted a probabilistic risk assessment culture. The CAIB thought that NASA should use probabilistic risk assessment as a tool to inform decisions, but did not think that the agency should be a slave to the process. Barry went on to say that repeated untoward occurrences (e.g., foam falling off the external tank, the erosion of the O-rings) should have signaled potential problems. Although some disagree that probabilistic risk assessment would have correctly identified the foam as a risk, the point is that repeated loss of foam should have been a sign to engineers of a potential problem. Panelists and audience members agreed that the management culture and institutional barriers at NASA still need to be addressed.
