Page 71

B—
Brief Case Studies of Crises

Hurricanes

Hurricanes are second only to earthquakes as the most destructive natural force on Earth. They are capable of causing massive wind damage, catastrophic flooding, and landslides or mud slides, often covering large geographical areas. Improvements in advance warnings and consequent evacuations have led to a dramatic drop in U.S. casualties due to hurricanes. However, the economic losses continue to climb. In the United States, infrastructure damage from hurricanes, although serious, is relatively less costly than damage to private property and structures. In less developed regions, loss of both life and infrastructure remains at catastrophic levels. A recent illustration of the destructive power of hurricanes was Hurricane Mitch, which hit Central America in November 1998. The most destructive storm in the region since 1780, Mitch left up to 6 feet of rain in some places. More than 11,000 people died in this storm, which also destroyed a substantial portion of the infrastructure in Nicaragua and Honduras. Box B.1 describes efforts to assemble an integrated picture of the hurricane's aftermath.

Improved responses to hurricanes have been enabled by such information technology capabilities as advance warnings and modeling that indicates which populations require evacuation. Among the principal problems in responding to hurricanes are communicating warnings, managing evacuations, and coordinating local response and recovery operations. Warnings must be more precise and detailed to help people in particularly vulnerable locations prepare for the effects of extreme winds



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.






BOX B.1 Creating an Integrated View of the Damage Caused by Hurricane Mitch

Decision makers must know the scope of a disaster, its location, the size of the area it covers, casualty levels, estimates of damage to infrastructure and property, and the ways in which all these factors relate to centers of population concentration and to jurisdictional boundaries. To inform response actions, information is needed on possible further damage, options for immediate action, legal responsibilities, and possible long-term effects and related follow-ups. At the workshop, William Miller described work done in 1998 by the Center for Integration of Natural Disaster Information (CINDI) to prepare a synoptic view of the devastation caused in Central America by Hurricane Mitch. The work was requested by the Department of the Interior as background information in preparation for a visit to the disaster area by Mrs. Tipper Gore.1 A number of products were developed in response to this tasking. CINDI developed overview maps that indicated Mitch's storm track, including the areas struck by 150- to 180-mph winds. Satellite images were used to produce thematic maps with a 30-meter-resolution mosaic of Central America and also helped to make clear the magnitude of the damage, revealing, for example, forest cover damage extending far into Mexico. Geographical information system databases that included a variety of remote sensing data were used to determine where roads crossed flooded rivers and thus the extent of damage to the road infrastructure. Integrating information from a variety of external sources presented a number of challenges. For example, during the course of assembling its briefing materials, the team found that data that had been on the World Wide Web and elsewhere were no longer available. Data sources had been moved or changed as the situation changed, and the team concluded that establishing an archive to capture data is essential.
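The road-damage assessment described above is, at its core, an intersection of two geographic layers: the road network and the flood extent. A minimal sketch of the idea, using hypothetical grid-cell data in place of the real vector geometry and projected coordinates a production GIS would use:

```python
# Sketch: flag road segments that cross flooded cells by intersecting
# two rasterized layers. Segment names and cell data are illustrative.

def flooded_crossings(road_segments, flood_cells):
    """Return the road segments that pass through any flooded grid cell.

    road_segments: dict mapping a segment name to a set of (row, col) cells.
    flood_cells:   set of (row, col) cells covered by floodwater.
    """
    return {name for name, cells in road_segments.items()
            if cells & flood_cells}

# Hypothetical example layers.
roads = {
    "Highway 1": {(0, 0), (0, 1), (0, 2)},
    "River Rd":  {(1, 0), (2, 0), (3, 0)},
}
flood = {(2, 0), (2, 1), (2, 2)}

print(sorted(flooded_crossings(roads, flood)))  # only River Rd crosses the flood
```

A real workflow would derive the flood layer from remote sensing imagery and intersect it with vector road data, but the set-intersection step is the same idea.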
Another data source the team employed was the Earth Resources Observation Systems data center of the U.S. Geological Survey. However, during the course of the work on Hurricane Mitch, a blizzard delayed staff attempts to reach the center, located in Sioux Falls, South Dakota, to load and process images.

__________
1 These models and data were compiled after the fact and were not used to support first responders.

and torrential rains. Warnings must reach people in remote locations and be specific enough to support evacuation of people to safe havens. Optimized evacuation routes would help reduce gridlock. Supplies and shelters must be identified and targeted for specific evacuee groups or individuals. Mutual aid resources must be effectively utilized as they converge from outside the affected area. Appropriate maps and position location aids must be provided to crisis responders so they can operate effectively. Damaged infrastructure must be identified, and repairs must

be prioritized and adequately funded as rapidly as possible to lay the foundation for broad-based disaster recovery.

Flash Floods

Despite developments in hydrology and meteorology, prediction of flash floods is still difficult. Flash floods result in deaths in both rural and urban areas. In 1972, 238 people were killed by a torrential rainstorm in Rapid City, South Dakota. In 1976, a flash flood in the Big Thompson Canyon in Colorado led to the death of 140 people. More recently, during the fall of 1998, more than 50 people were killed by flash floods in Kansas City and in Texas. Most flash floods are not merely meteorological events. Vulnerability to flash floods is a function of a number of human activities. More people have moved to parts of the country that are vulnerable to flash floods, and urban development can intensify the impacts of small storms. Although the rainfall in a storm might indicate only a 10-year event, land use patterns may result in the storm causing damage that exceeds the expected magnitude of a 100-year storm. In Fort Collins, Colorado, in July 1997, the loss of life was caused by a combination of heavy, prolonged rainfall and a topography that had been altered by the construction of a railroad trestle. Aging infrastructure, a consequence of reduced public investment in the repair of bridges and dams, also presents a growing problem. Comprehensive emergency preparedness and response are essential to reducing losses and include coordination among city agencies and increased recognition of the importance of emergency planning in everyday city business. Important factors in reducing deaths from flash floods include providing adequate warnings and helping people respond appropriately to these threats. For instance, if the number of deaths from flash floods is to drop, people must be willing to abandon their cars and climb to safety.
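Automated flash flood warnings of the kind discussed here are often driven by simple accumulation thresholds on real-time gauge data. A minimal sketch, with an assumed reading format and illustrative window and threshold values rather than operational standards:

```python
from collections import deque

def alert_times(readings, window_min=60, threshold_mm=50.0):
    """Return minutes at which trailing-window rainfall meets the threshold.

    readings: iterable of (minute, rainfall_mm) pairs in time order.
    """
    window = deque()   # readings currently inside the trailing window
    total = 0.0
    alerts = []
    for minute, mm in readings:
        window.append((minute, mm))
        total += mm
        # Drop readings that have aged out of the trailing window.
        while window and window[0][0] <= minute - window_min:
            total -= window.popleft()[1]
        if total >= threshold_mm:
            alerts.append(minute)
    return alerts

# Hypothetical gauge feed: a quiet hour of drizzle, then an intense burst.
feed = [(m, 0.5) for m in range(0, 60, 5)] + [(60, 20.0), (70, 20.0), (80, 20.0)]
print(alert_times(feed))  # the burst trips the threshold at minute 80
```

Real systems combine such gauge thresholds with radar-based forecasts and basin-specific hydrological models, but the accumulation check is the common starting point.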
In anticipation of El Niño events in 1998, Riverside, California, adopted a public education program that has reduced the number of people trying to drive through high water. The number of deaths was reduced from an average of seven per year to zero. In the Big Thompson flood, seven people initially responded appropriately and evacuated to high ground. They then decided to return to their homes—for reasons such as retrieving household possessions or checking if the stove had been left on. They lost their lives because they did not know how little time they had. Another challenge to reducing the impacts of flooding, including flash floods, is the inadequacy of many floodplain maps. For example, the maps show natural features but do not show areas protected by levees to be potential floodplains. Yet these areas are where serious floods have

occurred recently. In addition, even the improved maps are more reliable for wide floodplains than for narrower or urban areas subject to severe flash floods. Efforts are now under way in California to use new global positioning system-based data collection to improve the precision of these maps. Response to flash flooding can be improved by information technology capabilities such as weather forecasting and hydrological modeling to predict flood levels for evacuations, sensor networks that provide input to the forecasting and modeling, and systems that provide affected populations with advance warnings. The recent deployment of flood sensor networks that make real-time data available has wide-reaching applications, not only for flash flood warning but also for other purposes such as water resource planning and recreation.1 Principal challenges in responding to these disasters include the lack of sufficiently accurate floodplain maps, which decreases the accuracy of impact prediction, as well as the difficulty of disseminating to affected populations warning information that clearly conveys the nature and timing of the threat along with the appropriate responses.

Nuclear Emergencies

Because of the potential for insidious, long-term effects, U.S. citizens are particularly conscious of the threat posed by nuclear accidents. In contrast to other types of disasters, the perception is that one could be the victim of a nuclear disaster and not even be aware of it. In response to the potential threat and the public concern about it, federal agencies have spent a great deal of effort to develop warning systems, evacuation plans, public awareness, and response capability. Integrated exercises that include federal, state, and local participants are held on a frequent basis to ensure a high level of readiness and maintain the confidence of the public.
Albert Guber, with the Department of Energy's Nevada Operations Office, provided an overview of the federal response to a nuclear accident, with a focus on data needs, available data sources, and the available equipment to use those data. The federal response to a nuclear accident would start with establishment of a federal radiological monitoring and

__________
1 Workshop participants noted that the economic benefits of such alternative uses might well exceed the costs of the actual detection system. For a discussion of alternative applications of flood sensor networks, see Eve Gruntfest and Philippe Waterincks. September 1998. Beyond Flood Detection: Alternative Applications of Real-time Data. Technical Report, U.S. Bureau of Reclamation, Department of the Interior, Denver, Colo. Available online at <http://web.uccs.edu/geogenvs/work/Eve/Beyond%20Flood%20Detection%20Final.html>.

assessment center, based on plans developed from the lessons of the Three Mile Island accident. Guber emphasized how important it is that all people responding to an emergency have access to the best information as quickly as possible. The centers can be set up in offices or field tents and have their own backup power. Computing is pervasive, and there are large video monitors to provide situational awareness. Drills are conducted to test equipment and, particularly, inter- and intra-agency communication. Drills strengthen links between participating agencies, data sets, and individuals and are essential if emergency management is to be effective. Major challenges in responding to nuclear emergencies include the difficulty of tracing the threat (which requires sophisticated sensors); the labor-intensive, slow nature of field assessment; and the difficulty of interpreting measured radiation values for both the general public and decision makers. Information technology capabilities—including support for advance warnings; modeling for evacuations; real-time GPS; field database entry and a tracking system to integrate field, laboratory, and analysis units; and advanced graphics for decision support—can help to improve the response to nuclear emergencies.

Fires

Large-scale fires capable of inflicting significant loss of life, property, and environmental resources are a serious disaster force worldwide. Population pressures increase the risk of catastrophic fires. People are moving into areas of known high fire hazard. In addition, fires are prevented from spreading through their normal course, creating a more serious threat in the future. A critical feature of fires is the need for total extinction of the threat ("put the fire out, dead out"). Many large fires, such as the 1991 fire in the hills of Oakland, California, that burned more than 2,000 structures worth $1.6 billion and killed 25 people, are flare-ups of small fires thought to have been put out.
This factor greatly increases the cost of eliminating the threat. Once a fire starts, all possible hot spots must be put out completely. For this reason, 2,000 to 3,000 firefighters, at a cost of up to $2 million a day, may be needed to fight a large fire. Fires have caused billions of dollars in damage to property and serious loss of life. They consume millions of acres every year, many of them in critical watershed areas throughout the world. Ecosystems in many areas of the world are dependent on fire to maintain a natural balance. In most of these areas, human intervention, through eliminating much of the burning cycle, has caused serious imbalances, resulting in an even more serious threat. When fires do occur in these areas, they often burn much hotter than they would in a natural regime, burning much deeper into the

root structures of plants. This situation leaves the hillsides much more susceptible to landslides and debris flow in subsequent winters. Additionally, these areas typically have little margin for error in their water sources. If local watersheds are destroyed, long-term economic and agricultural disruption may be the result. Human-made causes of fires include sparks from lawn mowers and other yard tools, cigarettes carelessly tossed from a car, and electrical wires blown into trees. The ease and numerous ways of starting a catastrophic fire also create a strong temptation for arsonists. Consequently, firefighting agencies have to maintain an extremely high level of vigilance for fire starts, thus reducing the threat posed by natural causes (such as lightning strikes), human-made causes, and criminal acts. Although the state of the art in physical remedies for fire suppression is mature, the command and control of these resources can be dramatically improved with better technology. Verbal command systems break down in complex environments and must be enhanced with digital systems. Information about fire perimeters and intensities, derived in part from remote sensing and delivered to field personnel, is necessary to optimize use of resources and increase safety. Wearable computers could play a significant role in fire suppression activities by assisting firefighters with such information-intensive tasks as hazardous material identification and by delivering information about building layouts or other environmental conditions to firefighters in the field.

Earthquakes

Earthquakes are the most devastating of all natural disasters. The cost can be prodigious—the 1995 Kobe, Japan, disaster caused as much as $100 billion in damage. Hundreds of thousands of people could die in a catastrophic earthquake.
Modern building techniques have greatly reduced the death toll in developed countries, but the costs of earthquake-induced damage have increased dramatically because of the enormous increase in the built environment within high-risk areas. When major earthquakes do strike, the damage can penetrate deep into physical infrastructure. Roads and bridges are heavily damaged, as are pipelines carrying natural gas, water, and petroleum; communications lines and equipment are compromised; and satellite and microwave dishes are knocked out of alignment. Earthquake damage can remain undetected for years inside walls, underground, and deep in foundations. The complexity of this kind of damage increases the amount of time required to determine the extent of damage, so that reconstruction can be done. Earthquakes can be very destructive, and they occur nearly instantaneously. In a minute or two perhaps a million or more people are faced

with constructing a new reality for themselves. There is no warning time for evacuation, staging of resources, or seeking of appropriate shelter. Emergency operation centers are not staffed up in anticipation of increased activity. Indeed, emergency staffing patterns are as much a victim of earthquakes as the rest of the community. When the quake occurs, nobody yet knows where it was centered or how strong it was, so there is no way for people to grasp the overall context of their immediate personal experiences. Was it the big one? Is it better to leave home and go to a shelter? Are the phones working? People are in a state of shock and need help to make decisions on how to respond. Information is essential to address this dimension of the earthquake problem. In the first few minutes after a quake, massive amounts of incident-generated information must be gathered from many sources. It must then be synthesized, interpreted, and distributed to everyone who needs it. Different kinds of information packages must be created for different sectors of the response effort. Some information can be mass distributed via television, radio, or the Internet, whereas other information must be targeted to specific incident responders, perhaps located at remote sites. The critical time lines vary: mass-distributed information may allow minutes to hours, whereas a first responder may require a turnaround time on the order of seconds to minutes. Analysis of the information, such as for probable sheltering sites and the medical and rescue resources required to meet the disaster, must be completed accurately and quickly and presented to critical decision makers in an easily understood format. Aftershocks are almost certain to occur but may be erratic in their timing. As a result, extra care is required during many rescue operations. Aftershocks also have implications with regard to immediate sheltering needs.
Following the Loma Prieta, Northridge, and Kobe earthquakes, for example, hundreds of thousands of people camped out on the streets for the first few days until emergency shelters were set up or until they became confident enough to go back into their homes. Early warning systems for aftershocks could provide precious seconds to get rescue workers out of harm's way, and better understanding and modeling of the distribution and severity of aftershocks could provide the necessary confidence for many people to reenter their homes.

Critical Infrastructure Failure or Attack

The Administration has identified critical U.S. infrastructures—such as water, communications, power, food, and transportation—that must continue to function during and after natural disasters or physical attacks. These infrastructures are all extensively supported by information technology systems and therefore are vulnerable when those systems are attacked or fail abruptly—as is expected from electronic clock mechanisms at midnight on December 31, 1999. To deal with these threats, Presidential Decision Directive 63, issued in May 1998, called for a national effort to assure the security of critical U.S. infrastructures. Because the government does not operate most of these infrastructures, these efforts must be conducted in collaboration with the private-sector owners and operators. The federal approach to dealing with critical infrastructure issues, especially the looming year 2000 (Y2K) problem, was described by Bruce McConnell, then chief of the information policy and technology branch of the Office of Information and Regulatory Affairs at the Office of Management and Budget. McConnell said that the federal government can use help in developing a coordinated detection system for dealing with a series of questions such as the following: How does one know that a system is actually going down? And then how does one diagnose what is happening? Is the problem a symptom of a coordinated attack or a series of coincidences? Is there a law enforcement or national security problem? McConnell spoke about the expected Y2K scenarios and some of the technical problems being addressed. The experience will be used as a laboratory for studying and improving the response to critical infrastructure incidents, particularly in cyberspace, he said. Indeed, he observed that if one had set out to create a disaster scenario to test IT vulnerabilities and response capabilities, it would be difficult to come up with one better than the Y2K problem. Officials expect that multiple problems will occur in different places at the same time.
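McConnell's detection questions (coincidence or coordinated attack?) amount to correlating failure reports across sites and time. A minimal sketch of that triage step, with hypothetical report data and purely illustrative window and threshold values:

```python
from collections import defaultdict

def suspicious_windows(events, window_s=300, min_sites=3):
    """Flag time windows in which many distinct sites report failures.

    events: iterable of (timestamp_s, site) failure reports.
    Returns the sorted window indices whose distinct-site count meets
    the threshold; isolated failures are treated as likely coincidence.
    """
    sites_by_window = defaultdict(set)
    for ts, site in events:
        sites_by_window[ts // window_s].add(site)
    return sorted(w for w, sites in sites_by_window.items()
                  if len(sites) >= min_sites)

# Hypothetical feed: three sectors fail within 90 seconds, then one
# isolated repeat failure much later.
reports = [(10, "power-A"), (40, "telecom-B"), (90, "water-C"),
           (2000, "power-A")]
print(suspicious_windows(reports))  # the clustered failures stand out
```

Real incident correlation would also weigh failure modes and known dependencies between systems; the time-clustering test is only the first filter.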
In the United States, it is anticipated that the major organizations and pieces of the infrastructure will function adequately but that problems may arise in rural areas and in small and medium-sized enterprises. Thus, local power and telephone companies, and some less technologically sophisticated systems such as water treatment plants, could experience outages; many such systems have microprocessors embedded in device controllers. More generally, three basic issues must be addressed in critical infrastructure failure. First, how will officials get information? When the workload in a crisis center increases 100-fold, will the IT system have the capacity to feed that information to the necessary recipients? The Administration has been exploring, for example, how to handle multiple events at once. As local response capabilities become saturated, the workload will spill over to the state and federal levels. At that point, the higher echelons may have a limited capability to respond, so there will be heavy reliance on local capacity, at least at the beginning.

Second, citizens will want to know early on if there is a crisis and how the nation is holding up. Specific information can reduce panic. For example, in the case of Y2K, plans are being made for a White House representative, such as the President's assistant on Y2K matters, to provide a status report early in the afternoon of Saturday, January 1, 2000. Of course, many will learn from news reports starting with the events that take place at midnight in New Zealand, where the new year arrives 18 hours before it does in Washington, D.C. Third, the government will try to make decisions in real time about where help should be sent. This approach has been taken before in major disasters such as the explosion in the federal building in Oklahoma City, so critical disaster response groups have already been organized for Y2K. Some exercises performed during 1999 will help with evaluation of the information flow and capacities. Federal agencies are expending substantial resources to address the Y2K problem. Some agencies are deferring capital improvements in their information infrastructures so they can deal with Y2K issues first, whereas others are using the opportunity to recapitalize in critical areas of their infrastructure to build next-generation, Y2K-compliant systems. At lower levels of government, emergency managers from metropolitan areas are collaborating to monitor the status of Y2K planning, and they are also working with private corporations.

Urban Search and Rescue at the Murrah Federal Building Bombing

Geographical information systems (GISs) (Box B.2) played an important role in the response to the April 19, 1995, bombing of the Alfred P. Murrah Federal Building in Oklahoma City. Following the explosion, rescuers did not know whether they were looking for 100 or 300 people in the building, and they needed a precise map to target locations to look for people.
To help build this map, they relied on a variety of sources, including interviews with the building maintenance manager. In recognition of the utility of GISs in Oklahoma City, especially the value of having people dedicated to sorting out information, the urban search and rescue organization permanently added two GIS positions to the incident support team. Several lessons about the use of information systems emerged from the experience of the GIS unit at Oklahoma City:

• Once an initial set of data is made available to responders, updates and changes must be made. As soon as responders enter a damaged building, they will undoubtedly discover discrepancies between the preexisting data and the actual situations they encounter (many a result of


BOX B.2 Geographical Information Systems

Geographical information systems (GISs)—computer systems that manage, display, and support analysis of geographically referenced data—are increasingly being used to fill many crisis management needs. All phases of crisis management deal with many location-specific details, drawn from sources including remote sensing and Global Positioning System (GPS) data on the region of an event and its effects. A GIS assists in managing such information by associating related geographic information and integrating multiple geographic information elements during a crisis. GIS offers a number of capabilities of interest for crisis management:

• Dynamic capabilities. Unlike a static map, a GIS database can be updated during the course of a disaster to reflect what is known about the environment and situation during the response efforts. For example, following some disasters such as earthquakes, volcanic eruptions, or floods, the topography itself will have changed. In others, the topography may not have changed but the built environment, including roads, buildings, and utility services, will have been affected. Crisis response is greatly assisted when changes such as damaged roads can be reflected quickly in maps used to coordinate response efforts.

• Ease of distribution. When put into a GIS, spatial data can be reproduced electronically for distribution and access (e.g., through a network, Web access, or dial-up modem), and updates can be distributed as required to reflect changes in the situation.

• Tool for analysis of data. In contrast to maps, a GIS provides an effective way to combine many types of data of value for crisis management. For example, linguistic demographic data might be imported into a GIS to determine the need for translators in the aftermath of a crisis. Data from a variety of sources, such as laser rangefinders or remote sensing, can also be imported directly into a GIS and further analyzed and modeled.
Results of spatial models can be integrated with incident data, existing map bases, and remotely sensed information.1 To give another example, following the Northridge earthquake, the results of an earthquake shake model were overlaid with zip code information to speed up processing of damage claims. A list of zip codes for areas that had shaking intensities of VIII or greater was produced. Based on this information, all claims in these zip codes could be given emergency checks immediately without waiting for case-by-case field verification.

The geographical data used in a crisis vary according to location. Some states have extensive GISs of their own with up-to-date details. These systems include information on evacuation routes, locations of emergency shelters, and estimates of populations at risk at various times of the day. Other sources used include background GIS maps, including roads and locations of industry; census data and some commercially available data sets; and aerial photographs of the area. When these preexisting data sets are available, they are used. When they are not, special data sets will be created using aircraft photography or imaging from remote sensing facilities.

Some types of crises place special requirements on the accuracy and precision of geographical data sets. For example, workshop participants noted that in many cases there are inaccuracies in the definitions of floodplains because topography is insufficiently understood. When dealing with a flood, 1 foot in elevation can make a major difference, yet topographical maps are typically accurate to plus or minus 5 or 10 feet. During the 1997 and 1998 floods in California, problems with levee breaks were difficult to handle because nobody knew where the breaks were (since then, California levee maps have been improved).

___________
1 A recent NSF-sponsored workshop explored research issues related to the integration of multiple data types and sources. See David M. Mark, ed. February 1999.
Geographic Information Science: Critical Issues in an Emerging Cross-Disciplinary Research Domain. National Center for Geographical Information and Analysis, State University of New York at Buffalo. Available online at <http://www.geog.buffalo.edu/ncgia/workshopreport.html>.

the disaster itself). This information is of great value to the entire response team. The ability to easily modify existing spatial data is one of the strengths of using digital data rather than, for example, printed maps.

• The level of preparedness and the element of surprise in a disaster such as the bombing in Oklahoma City affect what will be required in responding to an emergency. Indeed, a critical factor in the success of the Oklahoma City GIS support was that digital floor plans were available. These plans, which had been developed by a local architectural firm for a remodeling project involving the whole building, enabled the GIS team to ramp up quickly and to provide operations maps within hours. The more complex the structure, the more important it is to have preexisting information in a readily usable format available for emergency personnel.

• The reliability of the information provided to rescuers is a significant concern. The information developed by the GIS team in the Murrah building came from experienced personnel working directly with the people in the best position to have the correct information, and thus it had a high degree of accuracy. Preidentified data sources, reliable data paths, and reliable remote sensing technologies, cross-checked with other sources for validation, are what make data believable. In contrast, information gleaned from other sources, such as the results of Web searches of various public sites, is not likely to be held in high esteem. Given the rule of thumb adopted by many crisis responders—that one-third of the information is accurate, one-third is wrong, and one-third might be either right or wrong—they are likely to be reluctant to rely too much on the outside information they are provided.
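The Northridge overlay described in Box B.2 is a simple join: attach modeled shaking intensity to each claim by zip code, then filter on the intensity cutoff. A minimal sketch, with hypothetical zip codes, intensities, and claim IDs (Modified Mercalli intensity represented as an integer, VIII = 8):

```python
# Sketch of the Box B.2 zip-code overlay: claims in zones that shook at
# intensity VIII or greater qualify for immediate emergency checks.
# All data values here are invented for illustration.

def expedited_claims(claims, intensity_by_zip, cutoff=8):
    """Return claim IDs whose zip code's modeled intensity meets the cutoff."""
    return [claim_id for claim_id, zipcode in claims
            if intensity_by_zip.get(zipcode, 0) >= cutoff]

shake = {"91324": 9, "91343": 8, "90012": 6}   # zip -> modeled MMI
claims = [("C-101", "91324"), ("C-102", "90012"), ("C-103", "91343")]

print(expedited_claims(claims, shake))  # claims in intensity >= VIII zones
```

The production version joined an actual shake model to insurer records, but the per-zip threshold filter shown here is the operative step that let checks go out without field verification.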