7
The Role of State-of-the-Art Technologies and Methods for Enhancing Studies of Hazards and Disasters

Technical and methodological enhancement of hazards and disaster research is identified as a key issue in Chapter 1, and computer systems and sensors are discussed in Chapter 2 as technological components of societal change having important implications for research on societal response to hazards and disasters. As summarized in Chapters 3 and 4, pre-impact investigations of hazard vulnerability, the characteristics and potential impacts of alternative hazards, and related structural and nonstructural hazard mitigation measures have been the sine qua non of hazards research. Post-impact investigations of disaster response, recovery, and related disaster preparedness measures have been the hallmark of disaster research. Indeed, post-impact investigations have been so prominent historically that special attention was given in the committee’s statement of task to offer strategies for increasing their value. Yet as highlighted in both Figure 1.1 and Figure 1.2, the committee believes that hazards and disaster research must continue to evolve in an integrated fashion. Thus, any discussion of state-of-the-art technologies and methods must ultimately be cast in terms of how they relate to this field as a whole.

Post-impact investigations inherently have an ad hoc quality because the occurrence and locations of specific events are uncertain. That is why special institutional and often funding arrangements have been made for rapid-response field studies and the collection of perishable data. However, the ad hoc quality of post-impact investigations does not mean that their research designs must be unstructured or that the data ultimately produced
from these investigations cannot be made more standardized, machine readable, and accessible in data archives. Having learned what to look for over decades of post-disaster investigations, social scientists can now realize the potential for highly structured research designs and replicable data sets across multiple disaster types and events. As noted in Chapter 1, post-impact studies also provide a window of opportunity for documenting the influence of vulnerability analysis, hazard mitigation, and disaster preparedness on what takes place during and after specific events. However, pre-impact investigations of hazards and their associated risks are critically important on their own terms, less subject to the uncertainties of specific events, arguably more amenable to highly structured and replicable data sets, and no less in need of machine-readable data archives that are accessible to both researchers and practitioners. Thus, what has been referred to in Chapter 1 as “hazards and disasters informatics” (i.e., the management of data collection, analysis, maintenance, and dissemination) is a major challenge and opportunity for future social science research.

This chapter begins with an overview of how social science research on disasters and hazards has been conducted in the past and, consistent with Figure 1.2, makes a case for the essential relatedness in chronological and social time of post-disaster and pre-disaster investigations. This section also illustrates the influence of changes in technologies and methods on hazards and disaster studies. Survey research is highlighted specifically in this regard because of its historical prominence within hazards and disaster research as well as mainstream social science. Consistent with the committee’s statement of task, the second section discusses the challenges of post-disaster investigations and ways to increase their value. The third section discusses “hazards and disaster informatics” issues such as dealing with institutional review boards (IRBs), standardizing data across multiple hazards and events, archiving the resulting data so that they accumulate over time, and facilitating the flow of accumulating data from original researchers to those engaged in secondary data analysis. The fourth section provides examples of how state-of-the-art technologies and methods enhance hazards and disaster research and, in so doing, relate directly or indirectly to these informatics issues. Although this chapter cannot cover everything in the very broad terrain of “nuts and bolts” research matters, special attention is given to increased use of computing and communications technologies, geospatial and temporal methods, statistical modeling and simulation, and laboratory gaming experiments. Sensitivity to the roles of these technologies and methods will contribute to more focused attention on, and solutions to, hazards and disaster informatics issues. The chapter closes with specific recommendations for facilitating future hazards and disaster studies.

DOING HAZARDS AND DISASTER RESEARCH

In examining hazards and disasters through disciplinary, multidisciplinary, and interdisciplinary lenses and perspectives (see Chapters 3 to 6), social science researchers have used a variety of technologies and methods. They have employed both quantitative and qualitative data collection and data analysis strategies. They have conducted pre-, trans-, and post-disaster field studies of individuals, groups, and organizations that have relied on open-ended to more highly structured questionnaires and face-to-face interviews. They have used public access data such as census materials and other historical records from public and private sources to document both the vulnerabilities of social systems to hazards of various types and the range of adaptations of social systems to specific events. They have employed state-of-the-art spatial-temporal, statistical, and modeling techniques. They have engaged in secondary analyses of data collected during previous hazards and disaster studies when such data have been archived for this purpose or otherwise made accessible. They have run disaster simulations and gaming experiments in laboratory and field settings and assessed them as more or less realistic. As research specialists, hazards and disaster researchers have creatively applied mainstream theoretical and methodological tools, thereby contributing to their continuing development and use.

The Commonality of Hazards and Disaster Research

The technologies and methods of hazards and disaster research are indistinguishable from those used by social scientists studying a host of other phenomena (Mileti, 1987; Stallings, 2002). That is as it should be. However, the simultaneity of the core topics of hazards and disasters within chronological and social time is a source of theoretical complexity, the consideration of which calls for creative applications of the most robust technologies and methods available. As noted in Chapter 1 (see Figure 1.2 and its related discussion), chronological time allows partitioning of collective actions by time phases of disaster events (pre-, trans-, and post-impact). The primary explanatory demands of hazards research in chronological time are to document interactions among conditions of vulnerability, disaster event characteristics, and pre-impact interventions in the determination of disaster impacts (see Chapter 3). The primary explanatory demands of disaster research in chronological time are to document interactions among disaster event characteristics, post-impact responses, and pre-impact interventions in the determination of disaster impacts (see Chapter 4). However, such straightforward partitioning in chronological time is not feasible with social time because, as discussed in Chapter 1, pre-, trans-, and post-disaster time phases become interchangeable analytical features of hazards
on the one hand and disasters on the other. In social time, in effect, the respective explanatory demands of hazards and disaster researchers become one and the same. So in considering how state-of-the-art technologies and methods can enhance studies of hazards and disasters, there must always be sensitivity to the way specific applications and findings within disaster research inform applications and findings within hazards research and vice versa.

For example, post-impact field interviews and population surveys seek data on “present” behaviors during the disaster, relationships between these behaviors and “past” experiences with hazards and disasters, and links between present behaviors and past experiences with “future” expectations of vulnerability. Pre-impact field interviews and population surveys seek data on relationships between past experiences with hazards and disasters, future expectations of hazard vulnerability, and links between these experiences and expectations with decisions to locate in harm’s way, adopt hazard mitigation measures, or engage in disaster preparedness. Pre- and post-disaster uses of public access data and other historical materials, as well as searches for unobtrusive data (e.g., meeting minutes, formal action statements, communications logs, memoranda of understanding, telephone messages, e-mail exchanges), are undertaken with these same objectives in mind. Computer simulations and gaming experiments are always subject to reality checks, and with respect to hazards and disasters, these checks draw on present behaviors, past experiences, and future expectations.

Thus, taking an integrated approach to research on disasters and hazards requires that any assumed impediments to data production during post-impact investigations—such as the ad hoc selection of events, special pressures of the emergency period, lack of experimental controls, difficulties in sampling population elements, and perishable data (see Stallings, 2002)—also be considered in terms of their consequences for hazards research. In the final analysis, it is because the explanatory demands of disaster and hazards studies are essentially inseparable that these impediments, whatever they may be, are of concern within this entire research community. Also, the impediments are not simply confined to doing either post-disaster or pre-disaster field research. They encompass the way data are collected, maintained, retrieved, and used for purposes above and beyond those of the original studies. The resulting informatics demands on state-of-the-art technologies and methods are major.

Influence of Technology on How Hazards and Disaster Research Is Conducted

Mainstream social science technologies and methods used to study hazards and disasters have changed over the years, and the role of technology
has been singularly important. A useful illustration, because of its importance to hazards and disaster research, is technological change in the administration of social surveys.

As summarized in Chapters 3 and 4, survey research has provided an excellent source of data for post-impact investigations of the physical and social impacts of disasters, as well as individual and structural responses to these impacts (i.e., disaster research). No less important, survey research has provided an excellent source of data for pre-impact investigations of vulnerability expectations, as well as individual and structural responses to these expectations (i.e., hazards research). Over time, therefore, the survey has been increasingly recognized in hazards and disaster research as a valid form of quantitative data collection (Bourque et al., 2002). Yet like all other methodological tools, the use of surveys is subject to technical, methodological, and societal changes that can affect, both positively and negatively, the ability to collect high-quality data.

Surveys of human populations threatened by hazards or actually experiencing disasters may be conducted using a number of different administration forms. They can be administered in face-to-face interviews, through telephone interviewing, or through self-administration of questionnaires. Each of these forms has its own merits and drawbacks, and new technologies are influencing the way they are implemented.

Survey research has changed over the past three decades. In the 1970s, most surveys were administered using traditional face-to-face interviews or through mailed questionnaires. However, the near universal access to telephones by the 1990s made telephone interviewing a more attractive administration format. By 1998, 95 percent of U.S. households had telephones, with most of the remaining households having access to a phone. Telephone coverage is lowest in the South, with approximately 93 percent of households having a phone (Bourque et al., 2002). Moreover, the availability of computers and access to the Internet by the late 1980s and early 1990s for both the general population and, more notably, hazards and disaster management practitioners has led to increased use of self-administered e-mail and Web-based surveys.

Survey research has also become more difficult in the recent past. Response rates for all forms of administration are dropping, and the costs of conducting survey research are increasing. More people live in gated communities, have guard dogs, have answering machines or caller ID, or live in a “cell phone-only” home. All of these trends, along with increases in the elderly and in non-English-speaking immigrants in the general population of the United States (see Chapter 2), are affecting interview completion and response rates. While the rates of nonresponse of all types are increasing, this does not appear to increase bias in the studies (Tourangeau, 2004).

Certainly surveys have become more difficult to implement; however,
there have been significant changes in technology that have increased the choices of administration methods. In the 1970s, computer-assisted telephone interviewing became available. This methodology allowed researchers to load a questionnaire onto a computer from which interviewers could read and enter data directly into a database during the interview process. By the 1980s, similar systems for in-person interviewing became available. This methodology allows complex skip patterns to be programmed into the questionnaire and reduces the need for interviewers to find the correct question. It also eliminates the data entry step, creating a complete data set at the close of interviewing. However, this also means that paper interviews are not available for double entering of data; if errors are made in data entry during the interview process, there is no way to verify accuracy.

By the time of the Second Assessment in 1994 (Mileti, 1999b), computers had gained widespread use, and access to the Internet had just taken off. With the rise of the Internet, e-mail surveys quickly became available. The earliest e-mail surveys were simply questions typed into the body of an e-mail; when replying, the participant typed in his or her responses. Then in the 1990s, Web-based survey technology became available. In Web surveys, questions are programmed with response options. Although the methodology shows promise as a low-cost survey method, there are questions about its applications in academically sound research. While Internet access is increasing, coverage is not currently sufficient to sample the general population adequately without significant bias. Furthermore, unlike telephone samples, a sampling frame for all people who have access to the Internet does not currently exist, so at present a probability sample of all Internet users cannot be drawn. Web surveys may be useful for specific populations in which Internet use is high and there is a list of users in a closed system, such as a university. It is also possible to use Web surveys in a mixed-mode fashion. For example, in a survey of health care providers in California regarding their training needs for bioterrorism response, a list of all licensed providers was obtained from the licensing agency in the state. A sample was selected from the list and mailed an invitation to log into a Web site to participate in the survey. Each invitation letter included a unique password so that responses could be tracked.
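A minimal sketch, assuming a hypothetical three-question instrument, may help illustrate the skip-pattern idea behind computer-assisted interviewing described above: the branching rules live in the questionnaire itself, and each answer is written directly into the data record, eliminating the separate data-entry step (at the cost of having no paper form against which to double-enter).

    # Minimal sketch of computer-assisted interviewing with a programmed skip
    # pattern. Question IDs, wording, and branching rules are hypothetical.

    QUESTIONNAIRE = {
        "Q1": {"text": "Have you ever been ordered to evacuate for a hurricane? (y/n)",
               "next": {"y": "Q2", "n": "Q3"}},          # skip Q2 if never ordered
        "Q2": {"text": "Did you comply with the most recent evacuation order? (y/n)",
               "next": {"y": "Q3", "n": "Q3"}},
        "Q3": {"text": "Does your household have an emergency plan? (y/n)",
               "next": {"y": None, "n": None}},           # end of interview
    }

    def administer(questionnaire, start="Q1"):
        """Walk the skip pattern, recording answers directly into a data record."""
        record = {}
        qid = start
        while qid is not None:
            answer = input(questionnaire[qid]["text"] + " ").strip().lower()
            record[qid] = answer                # direct capture: no paper form, no re-keying
            # unrecognized answers simply end the interview in this sketch
            qid = questionnaire[qid]["next"].get(answer, None)
        return record

    if __name__ == "__main__":
        print(administer(QUESTIONNAIRE))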

Notwithstanding problems of administration, technically enhanced and highly structured survey research has been used increasingly to produce quantitative data about hazards and disasters. When combined with more traditional qualitative field research methods, geospatial and temporal methods, considerable use of public access data and historical records, and some simulation and experimental work, the picture that emerges over the past half century is one of an ever-expanding volume of data on hazards and disasters. The production of these data has been and will continue to be facilitated by state-of-the-art technologies and methods within mainstream social science. However, the data being produced are largely not standardized across multiple hazards and disasters, not archived for continuing access, and underutilized once the original research objectives have been met. Therein lies the “hazards and disasters informatics” problem discussed in the third section of this chapter.

In 1954, a National Research Council (NRC) committee charged with writing a volume similar to this one gave highest priority to exploratory research to define major variables and discover trends (Williams, 1954). It is safe to say that in the ensuing 50 years that goal has been achieved through a host of descriptive and often comparative case studies. With that foundation, and with National Earthquake Hazards Reduction Program (NEHRP) support during the past 25 years, the transition from descriptive work to the more integrated explanatory work demanded by Figure 1.2 is certainly well under way.

THE CHALLENGES OF POST-DISASTER INVESTIGATIONS AND INCREASING THEIR VALUE

Post-disaster investigations, especially the field work required for the collection of data on disaster impacts as well as activities related to emergency response and disaster recovery, are undertaken in widely varied contexts and often under difficult conditions. As suggested earlier, the selection of events to be studied is necessarily ad hoc. The timing and location of field observations are heavily constrained by the circumstances of the events themselves, as is the possibility of making audio and video recordings of response activities. There are special constraints and difficulties in sampling and collecting data on individuals, groups, organizations, and social networks. Unobtrusive data such as meeting minutes, formal action statements, communications logs, memoranda of understanding, telephone messages, and e-mail exchanges are sometimes impossible to obtain, and so on.

Post-disaster investigations rely heavily on case studies (the “events”). These case studies have accumulated over time, providing incomplete albeit often sufficient data upon which to base theoretical generalizations about community and societal responses to disasters. In so doing, they have often confirmed and reinforced existing knowledge about response to disasters and hazards (including the continued existence of hazard exposure and specific vulnerabilities). In documenting planned as well as improvised post-disaster responses, they have shed light on hazard mitigation and disaster preparedness practices. In addition, they have served as experience-gaining and training media for hazards and disaster researchers.

While the analysis of hazards and disasters in social time requires a
historical perspective and research, post-disaster investigations can be characterized loosely by five principal and frequently overlapping chronological stages: (1) early reconnaissance (days to about two weeks), (2) emergency response and early recovery (days to about three months), (3) short-term recovery (three months to about two years), (4) long-term recovery and reconstruction (two to about ten years), and (5) revisiting the disaster-impacted community and society to document other longer-term changes (five to at least ten years). Not all of these chronological stages necessarily require field research, and post-disaster investigations may not even take place in the stricken community. For example, studies of post-disaster national response, recovery, and public policy actions may best be completed in capital cities where decision agendas are established and resources are allocated. The level of funding, research foci, methods, availability and quality of data, and the duration of the study vary greatly across these chronological stages, but, resources permitting, the net long-term results can provide important advances in knowledge. Each chronological stage is described briefly below.

Early Reconnaissance: Although of primary interest to physical scientists and engineers because of their need to examine and collect data about the direct physical impacts of a disaster, this stage presents social scientists with opportunities to identify the physical causes of social impacts, learn from scientists and engineers about why and how such physical impacts occurred, observe and document emergency response and immediate relief operations on an almost real-time basis, and define potential responding individuals, groups, organizations, and social networks for more structured follow-on research.

Emergency Response and Early Recovery: Observing planned and improvised actions at the height of the emergency response stage provides knowledge about the analysis and management of disaster agent- and response-generated problems, the availability and allocation of local and externally provided resources, the types and effectiveness of individual and structural responses, and the transition from emergency response (e.g., search and rescue) to early recovery (e.g., temporary shelter) activities.

Short-Term Recovery: Studying the evolution from the emergency response and early recovery stages to the short-term recovery stage is particularly interesting because researchers can identify more clearly the characteristics of key responding groups and organizations, how these social units influence decisions, and how short-term decisions (e.g., location of temporary housing) influence the allocation of resources for long-term recovery and reconstruction.

Long-Term Recovery and Reconstruction: During this period the sometimes permanent consequences of earlier decisions (or non-decisions) and the application of resources to implement them become visible to researchers, as do the disaster-related workings of the marketplace. It is then possible to reconstruct how the host of earlier commitments (or noncommitments) combine to shape the previously stricken area spatially, demographically, economically, politically, and socially. This stage also provides the opportunity to document how influential leaders, groups, and organizations have affected the outcomes and why.

Revisiting the Stricken Area: After significant time has passed (probably five years to more than a decade) and disaster-related issues have receded largely from the public’s and decision makers’ agendas, post-disaster investigations of how the “new equilibrium” came to be and how and why the impacted social system is functioning the way it is can help researchers and users understand the anticipated, real, and unintended consequences of the full range of earlier decisions and their implementation. Research at this interval can include, for example, examining the effectiveness of mitigation/loss prevention measures instituted after the previous disaster and understanding who benefited and who did not from the entire process.

Sometimes operating alone or in partnership with engineers, earth scientists, and representatives from other disciplines, social scientists have been part of the continuing history of post-impact investigations. Within the context of post-earthquake studies, it was the National Academies’ comprehensive study of the March 1964 Alaska earthquake that saw a fully integrated social science component (NRC, 1970). To varying degrees, this model was repeated for subsequent events, such as the National Oceanic and Atmospheric Administration (NOAA) study of the 1971 San Fernando, California, earthquake, and it continues to serve as a model for post-disaster investigations of earthquakes as well as other natural and technological disasters. As noted in Chapter 1, post-disaster investigations have been seen historically as so important to advancing knowledge that special institutional arrangements have been made and special funding has sometimes been made available (particularly for earthquake research) to enable social scientists and other researchers to enter the field and collect perishable data or conduct more systematic research. As suggested in Box 7.1, support for post-impact investigations of willful disasters is now part of the funding mix at the National Science Foundation (NSF).

BOX 7.1
National Science Foundation Support for Post-September 11 Research

The National Science Foundation’s (NSF) Division of Civil and Mechanical Systems in the Directorate for Engineering has a long history of supporting post-disaster investigations, particularly those induced by natural and technological hazards. For example, funding from NSF has enabled social science and engineering researchers to carry out post-disaster investigations to gather information (perishable data) that might be lost once the emergency period is over. Such research is funded through NSF’s Small Grants for Exploratory Research Program and with funds made available for rapid-response research programs administered by the Earthquake Engineering Research Institute (EERI) and the Natural Hazards Research and Applications Information Center (NHRAIC). The largest of the latter efforts is EERI’s Learning from Earthquakes Program, whose funds are used to support multidisciplinary reconnaissance teams after significant earthquakes in the United States and overseas. NHRAIC’s activity, called the Quick Response Program, supports primarily social science investigations.

All three of these NSF funding mechanisms were put in play after the September 11, 2001, attacks on the World Trade Center and the Pentagon and the plane crash in Pennsylvania, resulting in important social science and engineering studies. Upon completion, the results of these studies were published in a book (NHRAIC, 2003). The book includes social science analyses of the disaster responses following the September 11 attacks, such as individual and collective actions and public policy and private sector roles, as well as engineering analyses of impacts on physical structures and infrastructure. No less important, the book documents similarities and differences between the September 11, 2001, event and past disasters, offers policy and practice recommendations for willful and other kinds of disasters, and provides guidance for future research. An appendix lists the NSF awards for the social science and other studies published in the book, as well as other awards related to homeland security made directly by NSF or through NHRAIC in fiscal year 2002.

SOURCE: NHRAIC (2003).

A possible model for enhancing the value of post-disaster investigations of natural, technological, and willful disasters is the Earthquake Engineering Research Institute’s (EERI’s) Learning from Earthquakes Program (LFE). When federal funding through NSF became available to support field investigations of (primarily) earthquakes, such studies were small in scale, of very limited duration, and composed almost exclusively of engineers and earth scientists, and the dissemination of the knowledge gained was limited, for all practical purposes, to the earthquake engineering community. The paradigm shifted in 1973, resulting in more sustained federal support for post-disaster investigations and the inclusion of social scientists.
The effectiveness of today’s LFE program can be traced directly to that paradigm shift. Within the normal constraints of NSF funding, combined with the support capabilities of other organizations (such as the Natural Hazards Research and Applications Information Center [NHRAIC], the three earthquake research centers, and independent researchers from universities and nonprofit and consulting organizations), the availability of principal investigators (who are expected to and do contribute their time) continues to advance research and knowledge about earthquakes and other types of disasters.

Drawing on its researcher and practitioner members, EERI (2005) has produced a thoughtful retrospective, The EERI Learning from Earthquakes Program, which captures succinctly LFE’s significant accomplishments during the past 30 years. Among those accomplishments are 11 subjects, identified and documented, that have benefited directly from investments in social science post-disaster investigations. These 11 subjects have nearly universal application, transcending earthquakes as well as other natural, technological, and willful hazards and disasters. The initial four subjects include (1) strengthening research methods and broadening the mix of social science disciplines involved; (2) applying lessons learned to improve the development of loss estimates and their implications for planning scenarios, emergency operations plans, and training; (3) increasing the understanding of cross-cultural disaster impacts that have demonstrated both commonalities and differences related to key societal variables and levels of development; and (4) providing lessons learned that have been or are being applied to improve emergency response capabilities, recovery and reconstruction plans, search and rescue actions, understanding of the epidemiology of casualties, measures to reduce loss of life and injuries, management of large-scale shelter and temporary housing services, and organization of mutual aid programs. The remaining subjects include (5) applying organizational response lessons learned to improve and standardize emergency response procedures; (6) developing clearer and more effective warning procedures and messages, a necessary component of improving warning system technologies; (7) applying lessons learned about fault rupture and other geologic hazards to land-use planning and zoning; (8) carefully examining the adaptive organizational and decision-making processes involved in recovery; (9) understanding the need for and measures to organize and manage large-scale temporary shelter programs; (10) improving management of the flow and on-site handling of inappropriate donations to impacted areas; and (11) adapting scientific data from instrumental networks to support real-time decision making and emergency operations.

It is notable that all of the above subjects relate directly to the social science research summarized in Chapters 3 to 6 of this report. One implication is very clear: The future development and application of social science knowledge on hazards and disasters depends heavily on implementing
[...]

such as those that concentrate on the hazard characteristics of counties (resolution) for the entire United States (spatial extent).

The chronological nature of spatial data involves two important concepts: frequency and lag time. Data often become “old” as the time increases between when they were first collected and when they are ultimately used. Real time or near real time refers to data with no discernible lag time; that is, receipt of the data is almost instantaneous with its collection. The use of sensors to provide data on traffic flows, or of Doppler radar to identify tornado winds, is a good example of a real-time or near-real-time application. The frequency of data is another characteristic that has implications for social science research. For example, surveys about hazards and disaster experiences and expectations, as both relate to mitigation activities, heretofore have been done infrequently. While post-disaster field surveys are done in greater numbers, their frequency depends on the uncertainties of event frequencies. Post-hurricane evacuation behavior surveys are normally, but not always, conducted after major hurricane landfalls.

An example of a frequency concern is the decennial census. Population and housing data are essential for modeling populations and infrastructures at risk from hazards, yet these data are collected only every 10 years (frequency). At the same time, there is often a lag between when the data were collected (e.g., 2000) and when they become available for use (e.g., 2002). Thus, data that represent the social or demographic situation in 2000 (the census year) may or may not be applicable to a community in 2005, especially in areas that have experienced rapid growth. Given this time lag, communities often resort to population projections in producing demographic profiles.

The temporal characteristics of data influence the types of research questions that can be addressed. A good example is data production with remote sensing technologies. Remotely sensed data are most often used for purposes of pre-event threat identification (e.g., identification of hurricanes in the mid-Atlantic) and post-event rescue and relief operations. While the collection of remote sensing data can be scheduled on demand, the lag time required for processing such data may negate their utility in immediate emergency response situations such as the attack on the World Trade Center (Thomas, 2001; Bruzewicz, 2003). Thus, both the frequency of data collection and the lag time between the collection of data and their availability influence what hazards and disaster researchers study and how research questions are framed.
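The frequency and lag-time concepts discussed above can be made concrete with a small, hypothetical calculation; the dates below stand in for a decennial census cycle and are illustrative only.

    # Illustrative calculation of lag time and data age for a decennial census.
    # All dates and the staleness threshold are hypothetical examples.

    from datetime import date

    def lag_years(collected: date, released: date) -> float:
        """Years between collection and public availability (lag time)."""
        return (released - collected).days / 365.25

    def age_years(collected: date, used: date) -> float:
        """Years between collection and the analysis that uses the data."""
        return (used - collected).days / 365.25

    census_collected = date(2000, 4, 1)   # census day
    census_released = date(2002, 6, 1)    # assumed release of detailed tables
    analysis_date = date(2005, 8, 1)      # when a vulnerability study uses them

    print(f"lag time: {lag_years(census_collected, census_released):.1f} years")
    print(f"data age at use: {age_years(census_collected, analysis_date):.1f} years")

    # In a fast-growing county, an analyst might switch to population projections
    # once the data age exceeds some threshold, e.g., three years.
    if age_years(census_collected, analysis_date) > 3:
        print("consider population projections instead of raw census counts")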

Modeling and Simulation

Models are abstractions of reality, and modeling is the process of creating these abstractions. Because reality is nearly infinitely complex and all empirical data are processed with reference to that complexity, model building involves the simplification of reality as data are transformed into knowledge. The models created are essentially forms of codified knowledge and are used to represent the “reality” of things not known from things that are known (Waisel et al., 1998). Modeling is the sine qua non of science. Virtually all scientific activities require modeling in some sense, and any scientific theory requires this kind of representational system (Nersessian, 1992). The structure of a model can be symbolic (i.e., equations), analog (i.e., graphs to model physical networks), or iconic (physical representations such as scale models). Models are usually thought of as quantitative and able to be represented mathematically. However, qualitative models are no less, and arguably more, common. For example, mental models play a very important role in our conceptualization of a situation (Crapo et al., 2000), and verbal and textual models are used in the process of communicating mental models. Science can be seen as a model-building enterprise because it attempts to create abstractions of reality that help scientists understand how the world works.

Technological advances in computing allow the development of complex computer-based models in a wide range of fields. These models can be used to describe and explain phenomena observed in physical systems from micro- to macrolevels, or to provide similar representations of real or hypothetical experiences of individuals and social systems. Models play an essential function in formalizing and integrating theoretical principles that pertain to whatever phenomena are being studied. For example, the computational models used for weather forecasting integrate scientific principles from a variety of natural science and engineering fields. In similar fashion, computational models used for social forecasting integrate theories from a variety of social science as well as interdisciplinary fields such as urban and regional planning, public policy and administration, and public health management.

Computational modeling provides an opportunity for social scientists conducting studies of hazards and disasters to integrate theories and empirical findings from the natural sciences, engineering, and social sciences into models that can be used for decision making. For example, one of the most widely used models in emergency management is that of loss estimation. Loss estimation modeling for disasters has grown in the last decade. Early loss estimation methods were grounded in deterministic models based on scenarios: scenario events were chosen and estimates of impacts were based on those events. During the 1970s, for example, NOAA scenarios (NOAA, 1972, 1973) estimated regional physical and social impacts for large earthquakes in the San Francisco and Los Angeles, California, areas and were intended to provide a rational foundation for planning earthquake disaster relief and recovery activities. By the 1990s, technological advances in personal computing technology, relational database management systems, and
the above GIS and remote sensing systems had rendered the development of automated loss estimation tools feasible. As noted above, HAZUS (NIBS-FEMA, 1999) was developed by FEMA and the National Institute of Building Sciences (NIBS). It is a standardized, nationally applicable earthquake loss estimation methodology, implemented through PC-based GIS software. The HAZUS methodology estimates damage expressed in terms of the probability of a building being in any of four damage states: slight, moderate, extensive, or complete. A range of damage factors (repair cost divided by replacement cost) is associated with each damage state. While the front end of the loss estimation methodology is clearly driven by the earth sciences and engineering, the outputs of the model are much more social science driven. The outputs of interest to urban and regional planners and emergency management professionals are not ground motions, but rather the impacts of ground motion at community, regional, and societal levels. Researchers from the Pacific Earthquake Engineering Research Center have developed a performance-based earthquake engineering model that describes these outputs as “decision variables” and often refers to them as “death, dollars, and downtime.”
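A toy calculation, using illustrative numbers rather than actual HAZUS parameters, shows how damage-state probabilities and damage factors of the kind just described combine into a dollar loss estimate for a single building.

    # Toy loss-estimation calculation in the spirit of the approach described
    # above. Damage factors and probabilities are illustrative, not HAZUS values.

    DAMAGE_FACTORS = {"slight": 0.02, "moderate": 0.10, "extensive": 0.50, "complete": 1.00}

    def expected_loss(state_probs: dict, replacement_cost: float) -> float:
        """Expected repair cost = sum over states of P(state) * damage factor * replacement cost."""
        return sum(p * DAMAGE_FACTORS[state] for state, p in state_probs.items()) * replacement_cost

    # Hypothetical output of the hazard/engineering front end for one building:
    probs = {"slight": 0.30, "moderate": 0.20, "extensive": 0.08, "complete": 0.02}
    print(f"expected loss: ${expected_loss(probs, 250_000):,.0f}")   # -> $21,500

Summing such building-level estimates over an inventory is what turns ground-motion inputs into the community- and regional-level impacts that interest planners and emergency managers.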

Other, far less used computational models have the potential for significant use in social science hazards and disaster research. For example, what has come to be known as agent-based modeling is a set of computational methods that allows analysts to engage in thought experiments about real or hypothetical worlds populated by “agents” (i.e., individuals, groups, organizations, communities, societies) who interact with each other to create structural forms that range from relatively simple to enormously complex (Cederman, 2005). Such modeling, which has grown out of work on distributed artificial intelligence, can be used to simulate mental processes and behaviors in exploring how structural forms operate under various conditions (Cohen, 1986; Bond and Gasser, 1988; Gasser and Huhns, 1989). A major strength of agent-based modeling is its focus on decision making as search behavior. Model applications have been used to address issues of communication, coordination, planning, or problem solving, often with the intent of using models as the “brains” of real or artificial agents in interactions with each other. These models can facilitate descriptions and explanations of many social phenomena and test the adequacy and efficiency of various definitions or representation schemes (Carley and Wallace, 2001).

The earlier example (Box 7.1) of planned and improvised post-disaster responses illustrates the kind of research topic in hazards and disaster research that can be advanced through use of agent-based modeling techniques. In that example, conventional and improvised roles are nested within different types of organizations and social networks, which connect roles and organizations. The networks themselves represent more inclusive structural (i.e., relational) aspects of agent-based modeling and inform knowledge of when, where, how, and why role behaviors and organizational adaptations occur following a disaster (Mendonca and Wallace, 2004). It is important in this regard to develop representations both of network adaptation and of how “agent” knowledge, behaviors, and actions affect and are affected by their respective positions within the network. Network models have been used successfully to examine issues such as power and performance, information diffusion, innovation, and turnover. The adequacy of these models is determined using nonparametric statistical techniques (Carley and Wallace, 2001).

From the perspective of a researcher concerned with social phenomena in disaster contexts, two issues stand out (Carley and Wallace, 2001). First, how scalable are agent-based models and representation schemes? That is, can the results from analyses of social networks ranging from two to a relatively small number of members (agents) be generalized to the larger, more complex response systems that are so characteristic of events having high magnitude and scope of impact? Second, are cognitively simple characterizations of individuals as “agents” adequate or valid representations of agents when the actions of groups, organizations, communities, and societies are at issue? Answers to these questions are not possible at this point in knowledge development. However, agent-based modeling techniques are developing rapidly (Gilbert and Abbot, 2005), their development is unambiguously interdisciplinary (Cederman, 2005), and their twin focus on human decision making and structural adaptation (Eguiluz et al., 2005) is a core feature of what has been termed the hazards and disaster management system. Decision support tools are needed in this system, and agent-based modeling techniques can facilitate their development and dissemination (Mendonca and Wallace, 2004).
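A minimal agent-based sketch of the kind of thought experiment described above might look as follows; the network, thresholds, and decision rule are entirely hypothetical and are meant only to show how agent-level rules and network position jointly produce system-level behavior.

    # Minimal agent-based simulation sketch: agents embedded in a toy social
    # network decide whether to act (e.g., evacuate) based on a direct warning
    # and on their neighbors' behavior. All parameters are hypothetical.

    import random

    random.seed(1)

    N_AGENTS = 20
    NEIGHBORS = {i: random.sample([j for j in range(N_AGENTS) if j != i], 4)
                 for i in range(N_AGENTS)}            # toy random social network
    THRESHOLD = 2                                     # acting neighbors needed to act
    warned = set(random.sample(range(N_AGENTS), 3))   # agents warned directly

    acting = set(warned)
    for step in range(10):
        new_acting = set(acting)
        for agent in range(N_AGENTS):
            if agent in acting:
                continue
            # decision rule: act once enough network neighbors are already acting
            if sum(1 for nb in NEIGHBORS[agent] if nb in acting) >= THRESHOLD:
                new_acting.add(agent)
        if new_acting == acting:                      # no further diffusion
            break
        acting = new_acting
        print(f"step {step + 1}: {len(acting)} of {N_AGENTS} agents acting")

Scaling such toy rules up to the large, adaptive response networks characteristic of high-magnitude events is precisely where the open questions about scalability and agent realism noted above arise.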

Perhaps the most familiar computational modeling tool to social scientists is simulation. Simulation models often represent an organization or various processes as a set of nonlinear equations and/or a set of interacting agents. In these models, the focus is on theorizing about a particular aspect of social action and structure. Accordingly, reality is often simplified by showing only the entities and relations essential to the theory that underlies them. Models embody theory about how an individual, household, small group or larger organization, community, or society will act. With a model structure in place, a series of simulations or virtual experiments can be run to test the effect of a change in a particular process, action, or policy. In so doing, models are used to illustrate a theory’s story about how some agent will act under specified conditions. Cumulative theory building evolves as multiple researchers assess, augment, reconstruct, and add variations to existing models (Carley and Wallace, 2001).

The dominant use of computing in the natural sciences, social sciences, and engineering continues to involve statistical models of existing data. These statistical models range from relatively simple to highly complex configurations of variables, but over time the increasing capacity of computers to process enormous volumes of data has allowed the development of the kinds of computational techniques discussed above. Computational models are more powerful to the extent that simulated data are informed by real data. Ready access for those doing computational modeling to empirical data previously collected on the same topics, and ideally archived for secondary data analysis, is therefore essential.

Laboratory and Field Gaming Experiments

It is certainly possible for any research program or center to include both field studies and laboratory simulations of responses to hazards and disasters. When the Disaster Research Center (DRC) was established during the mid-1960s, for example, its research program included both field studies and laboratory gaming experiments. The field studies have continued for decades. However, after a very creative early application (see Drabek and Haas, 1969), the simulation work was largely suspended because of its cost and complexity. No formal program in social science hazards and disaster research involving laboratory or field gaming experiments has been sustained since the early 1970s. Certainly, emergency management professionals engage routinely in realistic simulations, either at their own local or regional emergency operations centers or perhaps at FEMA’s Emergency Management Institute in Emmitsburg, Maryland. However, these simulations are designed as training exercises, not as research opportunities for assessing their effectiveness or grounding them realistically in disaster field studies. For the purposes of this chapter, the early combination at the DRC of field studies, data archiving, and simulations continues to serve as a template for future hazards and disaster research.

The use of experimentation has been both touted and criticized by researchers in the social sciences (Drabek and Haas, 1969; Hammond, 2000). Of particular concern is the need to ensure proper scientific conduct of experiments. Increasing realism in experimental situations leads potentially to problems of generalizability. However, the generalizability of “realistic” laboratory or field experiments may be compromised if participants are not experienced in the domain—the result being that the hypotheses postulated may not correspond to the phenomena actually encountered in a real decision environment. Moreover, the events or activities that are controlled in experiments may not be controllable in the real world. Gaming simulations are quasi-experimental designs that can provide both statistical power and the ability to generalize results to a variety of crisis situations. The advent of computational modeling, as discussed above, has provided another application for gaming simulations (i.e., the testing and
validation of computational models of social phenomena). In these simulations, “agents” in the computational model “play” the same roles as human participants. The actions taken by both human and artificial agents can then be compared. Also, agent-based models, when informed by empirical data, can be used to create a realistic setting for the gaming simulation and, in effect, “play against” the participants, again providing opportunities to investigate cognitive and behavioral phenomena of individuals in social entities of various types.

Both research and practical experience have shown that written plans and procedures serve the valuable purposes of training and familiarization for role incumbents such as public officials in crisis-relevant organizations (Salas and Cannon-Bowers, 2000). These plans and procedures serve as a normative model for education and training activities. Gaming simulations can provide a means for evaluating the plans and procedures in laboratory settings or in the field (e.g., emergency operations centers). An additional and equally important benefit of these simulations is that they can provide a laboratory or field venue for experimentation on multiple types of circumstances. Thus, experiments on responses to terrorist events can readily be compared with those related to natural and technological disasters.

A variety of data can be collected prior to a gaming simulation, subject only to the patience of the participants. Biographical data are certainly available—and they may be needed for designing the experiment (Grabowski and Wallace, 1993). For example, data could be collected on cognitive style prior to the exercise and the results used to design the experiment. However, it is important not to deluge participants with an extensive battery of questionnaires because they may create apprehension, alter behavior, or magnify the lack of realism of the simulation.

Unobtrusive measures for data collection can also be devised in laboratory or field experiments to record the activities engaged in by the participants. All communications can be recorded and a digital record kept of phone messages, including sender, receiver, message length, and content. These data can be collected for each sample run and categorized in a variety of ways. It must be recognized that participants may communicate with outsiders or with insiders who are not part of the experiment but are with the training group. Unobtrusive measurements can be built into the exercise, such as recording the elapsed time between when an event was initiated and when the appropriate responses were made. To measure the degree of correctness, every initiated event can have a set of appropriate decisions.
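A short sketch, using hypothetical injects and message-log entries, shows how the unobtrusive measures described above (timed events, predefined sets of appropriate decisions, and recorded communications) could be scored for response latency and correctness.

    # Sketch of scoring a gaming-exercise message log against injected events.
    # Event names, appropriate decisions, and timestamps are hypothetical.

    INJECTS = {
        "chemical_spill":  {"time": 0.0,  "appropriate": {"notify_hazmat", "order_shelter_in_place"}},
        "shelter_request": {"time": 15.0, "appropriate": {"open_shelter", "notify_red_cross"}},
    }

    # (event, decision taken, minutes into exercise) captured from the message log
    LOG = [
        ("chemical_spill", "notify_hazmat", 4.0),
        ("chemical_spill", "issue_press_release", 9.0),
        ("shelter_request", "open_shelter", 22.0),
    ]

    for event, spec in INJECTS.items():
        responses = [(d, t) for e, d, t in LOG if e == event]
        correct = [(d, t) for d, t in responses if d in spec["appropriate"]]
        if correct:
            first_decision, first_time = min(correct, key=lambda r: r[1])
            print(f"{event}: first appropriate decision '{first_decision}' "
                  f"after {first_time - spec['time']:.1f} min; "
                  f"{len(correct)}/{len(spec['appropriate'])} appropriate decisions taken")
        else:
            print(f"{event}: no appropriate decision recorded")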

In addition to maintaining a record of the activities of participants in the game, many simulations lend themselves to observation. Participants in the exercise can be observed in a very structured manner with pre-designed instruments to be completed by trained observers. Videotaping can also be used, but the recordings usually need to be transcribed for analysis—resulting in a great deal of qualitative data that require extensive effort to analyze. Various techniques, such as protocol analysis, have been found useful for research purposes, but the benefits of their use must outweigh the costs because they are so time consuming. Behavioral coding of group interactions can be done either in real time or from the videotapes; here, training of the coders is crucial (Fleiss, 1971).

After a gaming experiment is run, participants can complete self-report questionnaires. These can be done as part of, or immediately after, the activity and digitally recorded (Litynski et al., 1997). Participants can also be asked to describe and rate each other’s behavior on a variety of dimensions and to record their interactions with each other during the course of the exercise. Both preceding and following the exercise, interviews can be conducted with each of the participants. The foregoing activities will create a wealth of data. Data generated by gaming experiments can usually be analyzed with standard statistical techniques (Cohen, 1977).

The degree of realism of a game is extremely important, not only for evaluating the decision aid per se, but also for maintaining interest in and enhancing the educational benefits of the simulation. Such validity can be readily ascertained by having experienced field researchers and emergency management professionals walk through the simulation prior to the actual exercises. Perhaps the most complex issue with gaming experiments is the following: Do participants treat the simulation as realistic? This was certainly the case in the seminal work by Drabek and Haas (1969). Box 7.5 provides a case in which the realism of a gaming simulation could be compared to an actual event that followed shortly after the gaming experiment was run. In this case, it was found that there was some end-game playing: recovery activity in the simulation tapered off in comparison with the actual event and, in fact, ended abruptly at 4:00 p.m. (Belardo et al., 1983). This suspension was obviously not the case with the actual event. However, gaming simulations can be designed in the laboratory or field as learning experiences, and the participants usually understand that training is an important precursor to preparing for incidents with the potential to escalate to a disaster.

In conclusion, gaming simulations with hazards and disaster management professionals as participants have an important role in social science research on disasters. The core idea here is to build gaming simulations with an eye toward realism. Such realism can be captured through standardized data from previous field studies that are maintained in effectively managed data archives, accessible to multiple researchers, and used to every extent possible in the development of computational models such as those summarized above. The hazards and disaster research community has developed knowledge to the point at which it is feasible to integrate these core informatics activities.

BOX 7.5
Realism in Gaming Simulation

A serendipitous evaluation of a gaming simulation yielded the observation that realism in the crisis environment was replicated in the simulation environment in terms of both organizational- and individual-level responses. The evaluation entailed data collection during a training exercise held by the U.S. Nuclear Regulatory Commission and the Federal Emergency Management Agency (FEMA) at the Robert A.F. Genet Nuclear Facility in New York. Four days after the simulation, an actual incident occurred that involved the activation of all emergency response activities throughout the State of New York. This provided an opportunity to evaluate the benefit of simulations.

The realism of the crisis environment was well replicated, both organizationally and in its impact on individuals. Stress levels were found to be similar between the simulation and the actual event. Communications were similar during the beginning of the crisis, but there were some differences during the latter stages of the exercise, particularly with respect to decisions concerning recovery operations. This may have been due to participants in the gaming simulation being aware of the need to end the exercise before the end of the working day.

SOURCE: Belardo et al. (1983).

RECOMMENDATIONS

The research findings and recommendations from Chapters 3 to 6 of this report summarize what has been done in the past under NEHRP support and what the committee feels should be done in the future. The discussions of technologies, methods, and informatics issues in this chapter relate the substance of past and future hazards and disaster research to its actual implementation. Thus, regardless of the topics discussed in previous chapters, social science studies in the next several decades must be responsive to the changing environment of hazards and disaster research. By whatever technological and methodological means are available, they must capture data that are more highly structured and standardized across natural, technological, and willful hazards and disasters. They must analyze, store, and manage data with dissemination and formal rules of data sharing in mind.
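As one hedged illustration of what such structured, standardized, machine-readable data might involve, the sketch below defines a hypothetical study-record schema with a simple conformance check; the field names and controlled vocabularies are placeholders rather than a proposed standard, and defining the actual schema is precisely the kind of work envisioned for the panel recommended next.

    # Hypothetical sketch of a standardized, machine-readable study record.
    # Field names, vocabularies, and the example values are illustrative only.

    from dataclasses import dataclass, field

    HAZARD_TYPES = {"natural", "technological", "willful"}
    PHASES = {"pre-impact", "trans-impact", "post-impact"}

    @dataclass
    class StudyRecord:
        study_id: str
        hazard_type: str                  # one of HAZARD_TYPES
        event_name: str
        phases_covered: set               # subset of PHASES
        units_of_analysis: list           # e.g., ["household", "organization"]
        collection_start: str             # ISO 8601 date
        collection_end: str
        access_rules: str = "restricted"  # data-sharing terms
        variables: list = field(default_factory=list)

        def validate(self) -> list:
            """Return a list of problems; empty if the record conforms."""
            problems = []
            if self.hazard_type not in HAZARD_TYPES:
                problems.append(f"unknown hazard_type: {self.hazard_type}")
            if not self.phases_covered <= PHASES:
                problems.append(f"unknown phases: {self.phases_covered - PHASES}")
            return problems

    record = StudyRecord("HDR-2005-017", "natural", "Hurricane (hypothetical)",
                         {"post-impact"}, ["household"], "2005-09-10", "2005-12-01")
    print(record.validate() or "record conforms to the illustrative schema")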

Recommendation 7.1: The National Science Foundation and Department of Homeland Security should jointly support the establishment of a nongovernmental Panel on Hazards and Disaster Informatics. The panel should be interdisciplinary and include social scientists and engineers from hazards and disaster research as well as experts on informatics issues from cognitive science, computational science, and applied science. The panel’s mission should be (1) to assess issues of data standardization, data management and archiving, and data sharing as they relate to natural, technological, and willful hazards and disasters, and (2) to develop a formal plan for resolving these issues to every extent possible within the next decade.

As summarized in this chapter, there are continuing issues in the following areas: (1) standardizing data on hazardous conditions, disaster losses, and pre-, trans-, and post-impact responses at multiple levels of analysis; (2) improving metrics in all of these same research areas; (3) developing formal data standards for storing, aggregating, disaggregating, and distributing data sets among researchers; and (4) using computing and communications technologies to enhance quantitative and qualitative data collection and data management. Addressing these issues systematically can, and the committee believes should, lead ultimately to the establishment of both centralized (virtual) and distributed data repositories on hazards and disasters. The range and depth of research inquiries and approaches in hazards and disaster research will perforce result in major increases in data. Thus, the status quo of continuing inattention to data management issues is no longer acceptable.

Resolving what the committee has broadly termed the “hazards and disasters informatics problem” will require careful consideration and planning. This research community is not in a position simply to adopt informatics solutions from other fields of inquiry, because such solutions are only now being developed. Like other research domains, hazards and disaster research has its own unique theories, models, and findings. Yet informatics issues and their resolution are not field specific; they are generic to basic and applied science. The committee believes that the first step in becoming a more active participant in the “science of informatics” is to create the interdisciplinary panel of experts specified in Recommendation 7.1.

The research domain of this community includes natural, technological, and willful hazards and disasters. Thus, the committee believes that it is quite appropriate for the National Science Foundation and the Department of Homeland Security to provide joint support for the work of the recommended interdisciplinary panel. The conceptual framework developed in Chapter 1 (see Figure 1.2 and its related discussion), placed within the changing societal context described in Chapter 2 and combined with the research findings and recommendations summarized in Chapters 3 to 6 and the discussions of research methods, techniques, and informatics issues in this chapter, provides the foundation for the recommended panel. The work of this panel should commence as soon as possible.

Recommendation 7.2: The National Science Foundation and Department of Homeland Security should fund a collaborative Center for Modeling, Simulation, and Visualization of Hazards and Disasters. The recommended center would be the locus of advanced computing and communications technologies that are used to support a distributed set of research methods and facilities. The center’s capabilities would be accessible on a shared-use basis.

There is an immediate need in social science hazards and disaster research to expand the use of state-of-the-art modeling and simulation techniques for studies of willful as well as natural and technological hazards and disasters. The joint support of the National Science Foundation and the Department of Homeland Security is therefore encouraged for purposes of implementing Recommendation 7.2. Three areas of research would be supported by the center, each of which would be developed and maintained at distributed research sites:

Modeling and simulation: The center would act both as a repository for models constructed by social science researchers at distributed sites and as a means of ensuring the collaborative development (including experimentation using the Internet), maintenance, and refinement of those models. Ensuring compatibility, which permits “docking” of computational models, would be a major responsibility of the center’s researchers and support staff (one possible form of such a common interface is sketched after this list).

Visualization: Social science researchers are making ever-increasing use of digitized spatial and graphical information, such as global positioning system (GPS)-GIS displays. In addition, human-computer interface technologies are being investigated for possible use as decision tools for hazards management and emergency response. Research on the cognitive processes underlying visualization under the conditions of stress and information overload typical of emergency response situations is just one potential topic for this visualization component of the recommended center.

Gaming experimentation: The recommended center would have its own and distributed laboratory settings with data collection technologies for research on individual, small group/team, and “organizational” decision making using exercises, “games,” and other interactive experimental media. Researchers could gather data and control treatments from distributed locations networked to the center.
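As a purely hypothetical illustration of what “docking” compatibility might involve, the sketch below defines a minimal common stepping interface in Python that independently built models could implement so that a coordinating routine can run them in lockstep and exchange their outputs. All class, function, and variable names are invented for this example; no existing center software is implied.

```python
# Hypothetical sketch of a "docking" interface for coupling computational models.
# All names here are invented for illustration; no existing system is implied.
from typing import Any, Dict, List, Protocol


class DockableModel(Protocol):
    """Any model that advances in discrete steps and exchanges named state
    variables through plain dictionaries can be docked with others."""

    def step(self, t: float, inputs: Dict[str, Any]) -> Dict[str, Any]:
        ...


def run_coupled(models: List[DockableModel], n_steps: int, dt: float = 1.0) -> Dict[str, Any]:
    """Advance all models in lockstep, feeding each one the pooled outputs
    of the previous step (a simple loose-coupling scheme)."""
    state: Dict[str, Any] = {}
    for k in range(n_steps):
        t = k * dt
        outputs: Dict[str, Any] = {}
        for model in models:
            outputs.update(model.step(t, state))
        state = outputs
    return state


# Two toy models standing in for independently developed simulations.
class EvacuationDemand:
    def step(self, t: float, inputs: Dict[str, Any]) -> Dict[str, Any]:
        return {"evacuees": 100 * (t + 1)}          # demand grows each step


class RoadNetwork:
    def step(self, t: float, inputs: Dict[str, Any]) -> Dict[str, Any]:
        # Reacts to the demand reported in the previous step.
        return {"congestion": inputs.get("evacuees", 0) / 5000}


print(run_coupled([EvacuationDemand(), RoadNetwork()], n_steps=3))
```

The point of such an interface is organizational rather than computational: if distributed research teams agree on even a thin contract like this, their models can be combined, compared, and refined collaboratively.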

As documented in this chapter, computational modeling, visualization, and gaming experiments are important tools for building on and applying knowledge gained from field studies. Heretofore, the use of these tools has not been integrated, which has reduced their potential value. Such integrated use is best accomplished within a center established for that purpose. As noted above, for example, the core idea of gaming simulations is to build them with an eye toward realism, and such realism is enhanced through standardized data drawn from previous field studies. As data from field studies become more effectively maintained in distributed data archives, they can be used systematically by the proposed center in the development of computational models and simulations and in the design of gaming experiments specifically for hazards and disaster management professionals. The hazards and disaster research community has developed to the point at which the sustained integration of field research, modeling, and experimentation can be accomplished.

Recommendation 7.3: The hazards and disaster research community should educate university Institutional Review Boards (IRBs) about the unique benefits of, in particular, post-disaster investigations and the unique constraints under which this research community performs research on human subjects.

The committee has noted above the difficulties involved in harmonizing the actual practice of research with the demands placed on researchers during field studies by the fluid situations that inevitably follow disasters. In particular, the fine points of consent forms, detailed interview protocols, and other research infrastructure are often unachievable in the hours to weeks after a disaster. Furthermore, such requirements may violate cultural norms in the places studied. At the same time, IRB members may have real but sometimes misplaced concerns about the risks of psychological harm that they believe attach to research on hazards and disasters. To the extent that they are not already, hazards and disaster researchers must become familiar with federal regulations (in particular, 45 CFR 46.101 et seq.) and local university regulations regarding human subjects research so that they can be knowledgeable resources for their respective IRBs and effective advocates for appropriate deviations from “standard” practices, while maintaining the personal privacy and dignity of research subjects. Members of the research community should also seek to serve on IRB human subjects review panels or to assist in other policy-making roles.