Tsunami Warning and Preparedness: An Assessment of the U.S. Tsunami Program

CHAPTER FOUR

Tsunami Detection and Forecasting

SUMMARY

An incoming tsunami may be anticipated in many ways, from direct human recognition of cues such as earthquake shaking or an initial recession of the sea, to technological warnings based on environmental sensors and data processing. This chapter reviews and evaluates the technological detection and forecasting capabilities of the U.S. tsunami warning centers (TWCs), paying specific attention to the infrastructure of the earth and ocean observation networks and to the data processing and tsunami modeling that occur at the TWCs. The next chapter discusses the centers' operations, their human resources, and the infrastructure for their warning functions.

The initial decisions by the TWCs to issue a tsunami advisory, watch, or warning after an earthquake are based on analyses of data from a global seismic detection network, in conjunction with the historical record of tsunami production, if any, at the different seismic zones (see Weinstein, 2008, and Whitmore et al., 2008, for greater detail on the steps taken). Although adequate for most medium-sized earthquakes, in the case of very large earthquakes or tsunami earthquakes[1] the initial seismological assessment can underestimate the earthquake magnitude and lead to errors in assessing the tsunami potential (Appendix G). Far from the tsunami source, data from sea level networks provide the only rapid means to verify the existence of a tsunami and to calibrate numerical models that forecast the subsequent evolution of the tsunami. Near the source, a tsunami can come ashore before its existence is detected by the sparse sea level observation network. Two separate U.S. TWCs monitor seismic activity and sea levels in order to detect tsunamis and warn of their presence.
Based on their own data analysis, the TWCs independently decide whether to issue alerts to the emergency managers in their respective and complementary areas of responsibility (AORs). The TWCs must not only provide timely warnings of destructive tsunamis, but must also avert needless evacuations that can cost money and even lives. An ideal warning would provide emergency managers with the necessary information to call for an evacuation in a timely fashion at any particular location in the projected tsunami path. The ideal product would also be clearly worded so that the general public easily understands the threat and who is affected by it. This information includes predictions of the time of arrival of the ocean waves, the duration of the damaging waves, when the largest wave is expected to arrive, the extent of the inundation and run-up, and the appropriate time to cancel the warning. Whether a call for evacuation is practicable, and how soon the "all clear" can be sounded, will depend on many factors, but especially on how soon the tsunami is expected to arrive and how long the damaging waves will continue to come ashore. Therefore, the warning system needs to be prepared to respond to a range of scenarios, from a near-field tsunami that arrives minutes after an earthquake to a far-field tsunami that arrives many hours after a distant triggering earthquake yet lasts for many more hours because the waves scatter and reverberate along their long path to the shore. In the case of the near-field tsunami, major challenges remain in providing warnings on such short timescales.

The committee concludes that the global networks that monitor seismic activity and sea level variations remain essential to the tsunami warning process. The current global seismic network is adequate and sufficiently reliable for the purposes of detecting likely tsunami-producing earthquakes. However, because the majority of the seismic stations are not operated by the TWCs, the availability of this critical data stream is vulnerable to changes outside of the National Oceanic and Atmospheric Administration's (NOAA's) control. The complex seismic processing algorithms used by the TWCs, given the available seismic data, quickly yield adequate estimates of earthquake location, depth, and magnitude for the purpose of tsunami warning, but the methodologies are inexact. Recommendations to address these two concerns fall under the following categories: (1) prioritization and advocacy for seismic stations; (2) investigation and testing of additional seismic processing algorithms; and (3) adoption of new technologies.

[1] An earthquake that produces an unusually large tsunami relative to the earthquake's magnitude (Kanamori, 1972).
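The contrast between the near-field and far-field scenarios follows directly from the physics: in the shallow-water limit, a tsunami travels at roughly the square root of gravity times water depth, far slower than seismic waves. The sketch below is illustrative only; the depths and distances are hypothetical round numbers, not values from this report.

```python
import math

def tsunami_speed(depth_m, g=9.81):
    """Shallow-water phase speed c = sqrt(g * h), in m/s."""
    return math.sqrt(g * depth_m)

def arrival_minutes(distance_km, depth_m):
    """Travel time in minutes across water of roughly uniform depth."""
    return distance_km * 1000.0 / tsunami_speed(depth_m) / 60.0

# Near-field: hypothetical source 100 km offshore over ~2,000 m deep water
near = arrival_minutes(100, 2000)          # roughly 12 minutes
# Far-field: hypothetical source 8,000 km away across a ~4,000 m deep basin
far_hours = arrival_minutes(8000, 4000) / 60.0   # roughly 11 hours
```

At 4,000 m depth the tsunami moves at about 200 m/s, so seismic surface waves at 3-4 km/s outrun it by the factor-of-10-to-50 margin cited later in this chapter, which is what makes seismically triggered warnings possible at all.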
The tsunami detection and forecasting process requires near-real-time[2] observations of tsunamis from both coastal sea level gauges and open-ocean sensors (such as those of the Deep-ocean Assessment and Reporting of Tsunamis (DART) network). The committee finds that the upgrades enabled by the enactment of the Tsunami Warning and Education Act (P.L. 109-424) to both coastal sea level gauges and the DART network have significantly improved the capacity of the TWCs to issue timely and accurate tsunami advisories, watches, and warnings. Furthermore, these sensors provide researchers with the essential data to test and improve tsunami generation, propagation, and inundation models after the fact. The new and upgraded DART and coastal sea level stations have closed significant gaps in the sea level observation network that had left many U.S. coastal communities subject to uncertain tsunami warnings. Although both sea level gauge networks have already proven their value for tsunami detection, forecasting, and model development, fundamental issues remain concerning gaps in coverage, the value of individual components of the network, and the risk to the warning capability posed by coverage gaps, individual component failures, or failures of groups of components. Of special concern is the relatively poor survivability of the DART stations, which currently average a little over one year before failure, compared to a four-year design lifetime. Additional open questions include the dependence of U.S. tsunami warning activities on sea level data supplied by foreign agencies and on sea level data derived from U.S. and foreign gauges that do not meet NOAA's standards for establishment, operation, and maintenance.

Looking to the future, the committee concludes that the numbers, locations, and prioritizations of the DART stations and coastal sea level gauges should not be considered static, in light of constantly changing fiscal realities, survivability experience, maintenance cost experience, model improvements, new technology developments, and increasing or decreasing international contributions. The committee finds great value in NOAA's continual encouragement and facilitation of researchers, other federal and state agencies, and nongovernmental organizations (NGOs) that utilize these sea level observations for novel purposes. The committee believes that stations with a broad user base have enhanced sustainability. The committee is optimistic that continued enhancements to the sea level monitoring component of the U.S. Tsunami Program can measurably mitigate the tsunami hazard and protect human lives and property for far-field events. The committee's recommendations for the DART and coastal sea level gauge networks fall under the following categories: (1) assessment of network coverage; (2) station prioritization; (3) data stream risk assessment and data availability; (4) cost mitigation and cost prioritization; and (5) sea level network oversight. Similar to open-ocean tsunami detection, tsunami forecast modeling has only recently become operational at the TWCs, as described below.

[2] The report generally uses the term near-real-time rather than real-time. Near-real-time data are returned by geophysical instruments after a variety of intermediary processes, including filling a data buffer (e.g., with a length of a second or more) and transferring data through various switches and routers on the Internet. Normally the resulting latency can be as little as a second, or as much as several seconds or minutes, depending on the connection modality (e.g., satellite, fiber optics, or network switches). Real-time data can generally be achieved only with very special sampling and transmission protocols.
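The survivability figures above lend themselves to a back-of-envelope sketch. Assuming, purely for illustration, memoryless (exponentially distributed) failures and a hypothetical average repair delay, one can compare expected annual failures at the observed roughly one-year mean time between failures against the four-year design lifetime:

```python
def expected_failures_per_year(n_stations, mtbf_years):
    # With memoryless (exponential) failures, each station fails on
    # average 1/MTBF times per year.
    return n_stations / mtbf_years

def steady_state_availability(mtbf_years, mttr_years):
    # Classic availability ratio: MTBF / (MTBF + MTTR).
    return mtbf_years / (mtbf_years + mttr_years)

# 39-station DART array (the count cited later in this chapter):
observed = expected_failures_per_year(39, 1.0)   # ~39 failures/yr at ~1-yr MTBF
designed = expected_failures_per_year(39, 4.0)   # ~10 failures/yr at design life
# Hypothetical three-month average wait for a service ship:
availability = steady_state_availability(1.0, 0.25)   # 0.8, i.e., ~31 of 39 up
```

The numbers are schematic, but they show why the gap between observed and design lifetimes translates directly into ship time, maintenance cost, and coverage risk.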
The committee anticipates that further development and implementation of numerical forecast modeling methodologies at the TWCs will continue to help improve the tsunami warning enterprise. As described below, the rapid detection of a tsunami striking within minutes to an hour, either for the purpose of providing an initial warning or for confirming any natural warnings that near-field communities have already received, will likely require consideration of alternative detection technologies, such as sensors deployed along undersea cabled observatories and coastal radars that can detect a tsunami's surface currents tens of kilometers from the shore. Finally, examples of other new technologies and methodologies that have the potential to improve both estimation of earthquake parameters and tsunami detection are discussed at the end of this chapter.

DETECTION OF EARTHQUAKES

All initial tsunami warnings are based on rapid detection and characterization of seismic activity. Because of the fundamental differences between the solid earth in which an earthquake takes place and the fluid ocean where tsunami gravity waves propagate, the vast majority of earthquakes occurring on a daily basis do not trigger appreciable or even measurable tsunamis. Nevertheless, some smaller earthquakes can trigger submarine landslides that result in local tsunamis. It takes a large event (magnitude >7.0) to generate a damaging tsunami in the near-field and a great earthquake (magnitude >8.0) to generate a tsunami in the far-field. However, the generation of a tsunami is affected not only by the magnitude of an earthquake, but also by material conditions at the source, such as the focal geometry, the earthquake source depth, and the water depth above the fault-rupture area. Although estimating the size of a tsunami from the magnitude of an earthquake has severe limitations (see Appendix G), the initial warning of a seismically generated tsunami is still based on the interpretation of the parent earthquake for several reasons: most tsunamis are excited (or initiated) by earthquakes; earthquake waves are easy to detect, and seismic instrumentation is available, plentiful, and accessible in near-real time (latencies of seconds to a few minutes); most importantly, seismic waves travel faster than tsunamis by a factor of 10 to 50, thereby allowing an earthquake to provide an immediate natural warning for people who feel it while leaving time for instrumental seismology to trigger official warnings for coasts near and far from the tsunami source; and earthquakes have been studied extensively, and their sources are reasonably well understood. Although most tsunamis result from earthquakes, some are triggered by landslides or volcanic eruptions. Technological warning of a tsunami that has been generated without a detectable earthquake will likely require detection of the tsunami waves themselves by water-level gauges.

Seismic Networks Used by the Tsunami Warning Centers

Both TWCs access the same extensive seismic networks that provide near-real-time information on earthquakes from around the world. Currently, about 350 independent channels of seismic data are monitored and recorded by the TWCs (National Oceanic and Atmospheric Administration, 2008a; Figure 4.1). Seismic networks that provide these data are operated and funded by many different agencies and organizations, including the U.S.
Geological Survey (USGS), the National Science Foundation (NSF), the National Tsunami Hazard Mitigation Program (NTHMP), the UN Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), various universities in the United States, non-U.S. networks, and stations run by the Pacific Tsunami Warning Center (PTWC) and the West Coast/Alaska Tsunami Warning Center (WC/ATWC) themselves. Many of the networks used by the TWCs are part of the USGS/NSF Global Seismographic Network (GSN), which currently comprises more than 150 globally distributed, digital seismic stations and provides near-real-time, open-access data through the Data Management System (DMS) of the Incorporated Research Institutions for Seismology (IRIS). The IRIS DMS also serves as the primary archive for global seismic data. The GSN is a partnership between NSF/IRIS and the USGS. The TWCs access seismic network data through dedicated circuits, private satellites, and the Internet. The GSN is widely recognized as a high-quality network, having achieved global coverage adequate for most purposes, with near-real-time data access as well as data quality control and archiving (National Science Foundation, 2003; Park et al., 2005). GSN stations have proven to be reliable, with current (2009-2010) data return rates of 89 percent. The GSN is sufficiently robust to support warnings for events far from the recording devices and provides good global coverage (U.S. Indian Ocean Tsunami Warning System Program, 2007).

[FIGURE 4.1 Data from approximately 350 seismic stations are accessed by the TWCs. SOURCE: West Coast/Alaska Tsunami Warning Center, NOAA.]

The USGS was provided funding through the Emergency Supplemental Appropriations Act for Defense, the Global War on Terror, and Tsunami Relief, 2005 (P.L. 109-13) to expand and upgrade the GSN for tsunami warning. For redundancy, the TWCs also receive seismic data from many other providers over multiple communication paths. Given the wide array of uses of the existing seismic networks, the GSN can generally be viewed as a data network that is likely to be continued, well maintained, and improved over the long term. A future broad upgrade of seismometers in the GSN may be important for tsunami warning. Nevertheless, the TWCs' heavy reliance on data networks from partnering agencies exposes them to some degree of vulnerability to potential losses of data availability in the future. For example, much of the seismic data crucial to the operation of the TWCs comes from GSN stations whose deployment and maintenance have been and are currently funded primarily through NSF cooperative agreements with IRIS, renewable every five years. The Scripps Institution of Oceanography's (SIO's) International Deployment of Accelerometers (IDA) project operates 41 of the 150 GSN stations with NSF/IRIS funding through this mechanism. There can be no assurance that this funding will be sustained at current levels in the future. GSN stations have been operating since the mid-1980s (see Appendix G); much of their hardware is out of date and increasingly difficult to maintain. Operations and maintenance budgets have regularly decreased and, except after events like the 2004 tsunami, funds are generally not available to modernize hardware and boost data return rates. The more modern NSF EarthScope Transportable Array (with more than 400 telemetered broadband stations), for example, boasts data return rates in excess of 99 percent. Unfortunately, the TWCs could be among the most vulnerable of the IRIS clients in a constrained budget environment, because the TWCs are among the users needing some of the most remote seismic stations, which are difficult, and hence expensive, to maintain.

To meet the requirements for detection of near-field tsunami events, the TWCs have supplemented existing seismic networks with their own local stations. The WC/ATWC maintains a network of 15 sites throughout Alaska, and most stations were upgraded to satellite communications and broadband seismometers after 2005 (National Oceanic and Atmospheric Administration, 2008a). The PTWC, in collaboration with other partners, is also working to enhance an existing seismic network in Hawaii to improve tsunami and other hazard detection capabilities through a Hawaii Integrated Seismic Network (Shiro et al., 2006).
NOAA's Tsunami Program Strategic Plan (2009-2017; National Oceanic and Atmospheric Administration, 2008b) recommends that the TWCs "monitor critical observing networks, establish performance standards, and develop a reporting protocol with data providers" (e.g., the USGS and the NTHMP) and effect "complete upgrades of Alaska and Hawaii seismic … networks." The committee agrees with these recommendations; however, to be strategic with limited resources, it is essential to determine and prioritize the seismic stations that are critical to tsunami warning (e.g., oceanic stations in known tsunamigenic source regions or within 30°-50° of potential tsunami source areas, to allow more rapid determination of the tsunami potential).

Algorithms for Estimating an Earthquake's Tsunami Potential

Once data from the seismic networks have been received, they are analyzed by the TWCs to determine three key parameters for evaluating tsunamigenic potential: the location, depth, and magnitude of an earthquake. Algorithms for determining the geographical location and depth of an earthquake source from seismic arrival times are based upon the concept of triangulation (U.S. Indian Ocean Tsunami Warning System Program, 2007). With the network of stations available to the TWCs, automatic horizontal locations are routinely obtained within a few minutes of origin time with accuracy on the order of 30 km. This is more than satisfactory for determining tsunami source locations, given that earthquakes of such high magnitudes have much larger source areas. The three seismic parameters are used for issuing the initial bulletin. The focal mechanism characteristics are later obtained through moment tensor inversion of broadband seismic data if the data quality is adequate (see below). In the present configuration of worldwide networks, the large number of available stations provides robust location determination, although losing a significant number of seismic stations could affect the accuracy of earthquake location and depth.

A great earthquake on a subduction thrust tends to nucleate beneath shallow water, or even beneath land in the case of the giant 1960 Chile and 1964 Alaska earthquakes. The source of such an earthquake, and of the ensuing tsunami, extends far beyond the earthquake's point of nucleation (the hypocenter, on the fault plane; the epicenter, if projected to the earth's surface). What matters for earthquake size, and for tsunami size as well, is the fault-rupture area, which extends seaward into deep water as well as along the coast. The hypocenter is much like the match that initiates a forest fire, in which the damage depends on the total area burned. The tendency to instead equate an earthquake with its hypocenter contributed to confusion during the near-field tsunami from the February 27, 2010, Chilean earthquake of magnitude 8.8. Partly because this earthquake's hypocenter was located near the coast, the Chilean government retracted a tsunami warning before the largest waves came ashore.

Depth determination is crucial to assessing an earthquake's tsunamigenic potential because sources deeper than about 60 km generally pose no tsunami threat and are well resolved by location algorithms. Finer resolution of depth for shallower earthquakes remains a general seismological challenge, particularly in near-real time. Depth can have some influence on the generation of tsunamis in the near-field; however, for far-field tsunamis generated by megathrust earthquakes, theoretical studies (Ward, 1980; Okal, 1988) have shown that tsunami excitation depends only weakly on source depth at depths less than 60 km.
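The arrival-time location concept described above can be illustrated with a toy planar grid search; real locators use spherical geometry and layered travel-time models, and solve for depth as well. All station coordinates, the wave speed, and the grid below are hypothetical:

```python
import math

def travel_time(station, source, v_km_s=8.0):
    # Straight-line travel time (s) at a single average speed; operational
    # systems use tabulated travel times for an earth model.
    return math.hypot(station[0] - source[0], station[1] - source[1]) / v_km_s

def locate(stations, picks, grid, v_km_s=8.0):
    """Grid search for the (x, y) epicenter minimizing RMS arrival-time residual.

    For each trial epicenter, the best-fitting origin time is the mean of
    (observed pick - predicted travel time) over all stations.
    """
    best = None
    for trial in grid:
        tt = [travel_time(s, trial, v_km_s) for s in stations]
        t0 = sum(p - t for p, t in zip(picks, tt)) / len(picks)
        rms = math.sqrt(sum((p - t0 - t) ** 2 for p, t in zip(picks, tt)) / len(picks))
        if best is None or rms < best[1]:
            best = (trial, rms)
    return best

# Synthetic check: four stations (km), a true epicenter, and noise-free picks.
stations = [(0.0, 0.0), (300.0, 0.0), (0.0, 300.0), (250.0, 250.0)]
true_source, true_t0 = (120.0, 80.0), 5.0
picks = [true_t0 + travel_time(s, true_source) for s in stations]
grid = [(x, y) for x in range(0, 301, 10) for y in range(0, 301, 10)]
epicenter, misfit = locate(stations, picks, grid)   # recovers (120, 80)
```

With noise-free picks the search recovers the true epicenter exactly; with realistic pick errors and station geometry, the residual surface widens, which is one way the ~30 km accuracy quoted above arises.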
This somewhat paradoxical result reflects the fact that a shallower source may create a locally larger deformation of the ocean floor, but over a smaller area; the two effects compensate, because tsunami generation is controlled by the integral of the deformation over the whole ocean floor. Given the techniques and data available, the committee found that the location techniques used at the TWCs (Weinstein, 2008; Whitmore et al., 2008) were adequate in the context of tsunami warning.

Determining an earthquake's magnitude is a more problematic aspect of the initial earthquake parameterization. The concept of magnitude is probably the most popular, yet most confusing, parameter in seismology. In simple terms, it seeks to describe the size of an earthquake with a single number. Reliable and well-accepted determinations of earthquake size (the "moment tensor solution," proportional to the product of the fault area and the amount of slip) are possible, but these estimates are necessarily based on long-period surface waves arriving too late to be useful for tsunami warning, which strives for initial estimates within five minutes of the first measurements being received. Most seismologists agree that it is not currently possible to predict how much of a fault will ultimately break based on the seismic waves propagating away from the point of nucleation (the hypocenter), and that only when the slip ends can the true size, or moment, be inferred. For an event such as the Sumatra earthquake, the propagation of breakage along the fault surface alone takes nearly eight minutes (e.g., de Groot-Hedlin, 2005; Ishii et al., 2005; Lay et al., 2005; Tolstoy and Bohnenstiehl, 2005; Shearer and Bürgmann, 2010). Magnitudes determined at shorter times will necessarily underestimate the true size of the earthquake.
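The link between rupture extent and magnitude can be made concrete with the standard definitions: seismic moment is rigidity times fault area times average slip, and Mw = (2/3)(log10 M0 − 9.1) with M0 in newton-meters. The rupture dimensions and slip below are illustrative round numbers for a Sumatra-scale event, not measurements from this report:

```python
import math

def seismic_moment(area_km2, slip_m, rigidity_pa=3.0e10):
    # M0 = mu * A * D in N*m; ~30 GPa is a commonly assumed crustal rigidity.
    return rigidity_pa * (area_km2 * 1.0e6) * slip_m

def moment_magnitude(m0_nm):
    # Standard (Hanks-Kanamori) moment magnitude.
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# Illustrative ~1,300 km x 200 km rupture with 8 m average slip:
full_mw = moment_magnitude(seismic_moment(1300 * 200, 8.0))          # about 9.1
# A magnitude computed when only ~20% of the fault has broken
# necessarily underestimates the final size:
early_mw = moment_magnitude(seismic_moment(0.2 * 1300 * 200, 8.0))   # about 8.7
```

Because magnitude grows with the logarithm of moment, missing most of the rupture area shaves roughly half a magnitude unit here, which is exactly the early-underestimate problem the text describes.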
In this regard, the major challenge for tsunami warning is that tsunamis are controlled by the lowest-frequency part of a seismic source, with periods of 500 to 2,000 seconds, whereas routinely recorded seismic waves have energy in the treble domain, with periods ranging from 0.1 to 200 seconds, exceptionally 500 seconds. In addition, seismic waves fall into several categories. Body waves travel through the interior of the earth at average velocities of 10 km/sec, take seconds to minutes to reach recording stations, and their high-frequency components are a good source of information. By contrast, surface waves travel around the surface at considerably slower speeds (3-4 km/sec) and take as much as 90 minutes to reach the most distant stations. The surface waves carry the low-frequency signals, that is, the part of the spectrum most relevant to tsunami warning, although high-frequency body wave methods can also resolve event duration and rupture length (e.g., Ishii et al., 2005; Ni et al., 2005). These high-frequency body waves have not yet been exploited by the USGS's National Earthquake Information Center (NEIC) or the TWCs. In short, the evaluation of earthquake size for tsunami warning faces a double challenge: extrapolating the trebles in the earthquake source to infer the bass, and doing so as quickly as possible to give the warning enough lead time to be useful.

Magnitudes can be obtained from various parts of the seismic spectrum, and such different scales have, as expected, been "locked" to each other to quantify an earthquake with a single number. This is achieved through the use of "scaling laws," which assert that the spectrum of a seismic source (the partitioning of its energy between bass and treble) is understood theoretically and can be estimated as a function of earthquake size.
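The "trebles to bass" problem can be illustrated with the widely used omega-squared source model and a Brune-type corner frequency. The stress drop and shear velocity below are conventional assumed values, and the sketch is schematic, not the TWCs' algorithm:

```python
def corner_frequency_hz(m0_nm, stress_drop_bar=30.0, beta_km_s=3.9):
    # Brune (1970)-style corner frequency; M0 converted to dyne*cm for the
    # traditional formula fc = 4.9e6 * beta * (dsigma / M0)^(1/3).
    m0_dyne_cm = m0_nm * 1.0e7
    return 4.9e6 * beta_km_s * (stress_drop_bar / m0_dyne_cm) ** (1.0 / 3.0)

def spectral_amplitude(m0_nm, f_hz):
    # Omega-squared spectrum: flat at M0 below the corner, falling as f^-2 above.
    fc = corner_frequency_hz(m0_nm)
    return m0_nm / (1.0 + (f_hz / fc) ** 2)

m0_mw7 = 10 ** (1.5 * 7.0 + 9.1)   # ~4e19 N*m
m0_mw9 = 10 ** (1.5 * 9.0 + 9.1)   # ~4e22 N*m, a factor of 1,000 larger

# At a 1 s period (1 Hz, the "treble"), the two spectra differ by only
# about a factor of 10 -- this is magnitude saturation.  Near a 500 s
# period (the "bass" relevant to tsunamis), nearly the full factor of
# 1,000 reappears.
ratio_1hz = spectral_amplitude(m0_mw9, 1.0) / spectral_amplitude(m0_mw7, 1.0)
ratio_500s = spectral_amplitude(m0_mw9, 0.002) / spectral_amplitude(m0_mw7, 0.002)
```

This is why short-period magnitudes saturate for great earthquakes and why any scale measured in the treble band must rely on a scaling-law extrapolation, which tsunami earthquakes systematically violate.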
However, this universal character of scaling laws is far from proven, especially in its application to mega-earthquakes, which trigger the far-field tsunamis of major concern. In addition, scientists have identified a special class of generally smaller events, dubbed "tsunami earthquakes" by Kanamori (1972), whose source spectra systematically violate scaling laws (see Appendix G). Therefore, characterizing an earthquake source with a single number representing magnitude cannot describe all its properties, especially in the context of tsunami warning. A detailed technical review of these topics is given in Appendix G, and the special case of tsunami earthquakes is reviewed in Appendix H. The conclusions of Appendix G are summarized as follows:

- Classical magnitudes routinely determined by conventional seismological methods are inadequate for tsunami warning of great and mega-earthquakes. The authoritative measurement of earthquake size, the moment tensor solution, is based on normal modes and long-period surface waves arriving too late to be used for tsunami warning.
- The TWCs currently use an algorithm named Mwp, which integrates the long-period components of the first-arriving P-waves to infer the low-frequency behavior of the seismic source. The PTWC has recently implemented the "W-phase" algorithm in addition to the Mwp algorithm.
- Although the use of Mwp is satisfactory for the majority of the (small, non-tsunamigenic, and medium) events processed, Mwp has very serious shortcomings in its application to great earthquakes (magnitude greater than 8.0), to mega-earthquakes (magnitude greater than 8.5; Appendix G), and to the anomalous tsunami earthquakes (Whitmore et al., 2002; Appendix H). Thus, the committee is concerned that the TWCs have relied on a single technique applied without sufficient attention to its limitations.
- Other approaches are presently being studied, including the "W-phase" algorithm, which could eventually be implemented after both the theoretical and operational bases of the approach are established and the limitations of current technologies are understood (Appendix G). Improvements are urgently needed for the determination of the tsunami potential of mega-earthquakes and tsunami earthquakes.

Potential Use of Earthquake Alerts from the NEIC

While NOAA and the NTHMP lead the efforts relevant to tsunamis, the USGS and the National Earthquake Hazards Reduction Program (NEHRP) lead the efforts in research and reducing impacts from earthquakes. The USGS's Earthquake Hazards Program provides and applies earthquake science information to mitigate potential losses from earthquakes. This separation in mission runs the risk that tsunami efforts within NOAA neglect the earthquake hazard, and vice versa within the USGS. One service the USGS provides through its NEIC is to rapidly determine the location and size of earthquakes around the world. The NEIC in Golden, Colorado, derives initial solutions, not made public, within seconds after arrival of the seismic data. The NEIC monitors the GSN and other stations and produces accurate seismic analyses within minutes of an event, which it disseminates to a broad range of customers (national and international agencies, academia, and the public).
In a development that may influence the methods and roles of the TWCs, U.S. seismology is on the verge of being able to warn of earthquakes while they are still under way. The drive toward such earthquake early warning includes the NEIC. USGS sources say that the NEIC, which began operating 24/7 in January 2006, plans to support this warning function by developing a back-up center at a site other than Golden. At present, the two TWCs do not use the epicentral, hypocentral, or magnitude estimates provided by the NEIC. Instead, each TWC uses its own mix of seismic processing algorithms and, as described above, develops its own seismic solutions. The TWCs may correct their initial estimates, which are often made public faster than the NEIC's solutions, to be more consistent with the NEIC's solutions, and at times they confer with NEIC staff during an event to ensure consistency. With the availability of the new tsunami forecasting methods and sea level observations (as described below), the TWCs rely more on sea level data and numerical models than on details of earthquake parameters after the issuance of the initial warning product. Therefore, the committee discussed whether it remains necessary for the TWCs to run their own independent seismic analysis. For the forecast models, the TWCs require little more than the location, rough magnitude, and time of the event, which could come directly from the NEIC.
The TWCs' in-house analysis offers the benefit of obtaining solutions much faster than the NEIC's publicly available solutions, which might take tens of minutes longer. In addition, the TWCs' assessment of the tsunami potential of any given earthquake depends on knowing the depth of the earthquake and the earthquake's geometry, neither of which is as high a priority for the NEIC. Regardless, there are many benefits to leveraging research and development at the TWCs and the NEIC and, more broadly, to finding synergies between the tsunami and earthquake hazard reduction programs.

Conclusion: The current global seismic network is adequate and sufficiently reliable for the purposes of detecting likely tsunami-producing earthquakes. Because the majority of the seismic stations are not operated by the TWCs, availability of this critical data stream is vulnerable to changes outside of NOAA's control. Furthermore, as discussed in Appendix G, many of the STS-1 seismographs in the GSN are now more than two decades old, and because the STS-1 is no longer manufactured, spares are not available.

Recommendation: NOAA and the USGS could jointly prioritize the seismic stations needed for tsunami warnings. These needs could be communicated to partner agencies and organizations to advocate for the upgrading and maintenance of these critical stations over the long term.

Conclusion: The complex seismic processing algorithms used by the TWCs, given the available seismic data, quickly produce adequate estimates of earthquake location, depth, and magnitude for the purpose of tsunami warning. The methodologies are inexact, partly because of the physically variable nature of tsunami-generating earthquakes (one model does not fit all), and partly because of the need for rapid determination of earthquake parameters that may not be certain until the entire rupture process is complete (potentially minutes). For example, the methodologies applied by the TWCs do not properly reflect the tsunami-generating potential of mega-earthquakes or tsunami earthquakes.

Conclusion: In parallel with their own analyses, staff at the TWCs and at the Tsunami Program could avail themselves of the earthquake locations and magnitudes that are estimated within minutes of an event by the USGS's NEIC. An interagency agreement could be established to make these initial estimates available on secure lines between the USGS and NOAA.

Recommendation: Among the methodologies employed by the NEIC is the W-phase algorithm for estimating earthquake magnitude. The committee recommends that the TWCs work jointly with the NEIC to test the potential utility of the W-phase algorithm in the tsunami warning process, using both a sufficient dataset of synthetic seismograms and a set of waveforms from past great earthquakes, paying particular attention to the algorithm's performance during tsunami earthquakes and to the assessment of a lower-magnitude bound for its domain of applicability.
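A testing effort of the kind recommended here reduces, at its core, to comparing a candidate algorithm's magnitudes against reference magnitudes over a labeled event set. A minimal scoring harness might look like the following; the event records, the 0.3-unit miss threshold, and the field names are all hypothetical:

```python
def evaluate_magnitudes(events, miss_threshold=0.3):
    """Score a magnitude estimator against reference values.

    events: iterable of dicts with keys 'true_mw', 'est_mw', and 'label'
    (e.g., 'great', 'tsunami_eq').  Returns overall bias, RMS error, and
    the events underestimated by more than miss_threshold magnitude units.
    """
    residuals = [e["est_mw"] - e["true_mw"] for e in events]
    n = len(residuals)
    bias = sum(residuals) / n
    rms = (sum(r * r for r in residuals) / n) ** 0.5
    misses = [e for e in events if e["true_mw"] - e["est_mw"] > miss_threshold]
    return {"bias": bias, "rms": rms, "underestimates": misses}

# Toy numbers mimicking the saturation pattern described in this chapter:
toy = [
    {"true_mw": 7.1, "est_mw": 7.0, "label": "great"},
    {"true_mw": 8.8, "est_mw": 8.1, "label": "mega"},
    {"true_mw": 7.7, "est_mw": 7.0, "label": "tsunami_eq"},
]
report = evaluate_magnitudes(toy)   # flags the mega and tsunami-earthquake cases
```

Grouping the scores by label is what would expose a lower-magnitude bound of applicability and any systematic failure on tsunami earthquakes.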
DETECTION OF TSUNAMIS WITH SEA LEVEL SENSORS

Because the seismic signal is the first observation available to the TWCs, seismic detection provides the basis for the initial evaluation of the potential for a tsunami. The decision about the content of the first message from the TWCs is based solely on seismic parameters and the historical record, if any, of tsunamis emanating from the neighborhood of the earthquake. However, as previously noted, this indirect seismic method is limited in the accuracy of its estimates of the strength of the tsunami, usually underestimating the tsunami potential of large earthquakes and tsunami earthquakes. In acknowledgment of this bias, and because forecasters must err on the side of caution when human lives may be at stake, the TWCs use conservative criteria to trigger advisories, watches, or warnings based on this initial seismic assessment (e.g., Weinstein, 2008), as seen in the PTWC's far-field forecast of the tsunami from the Chilean earthquake of February 27, 2010 (Appendix J). However, these conservative assessments might cause unwarranted evacuations, which can cost millions of dollars and might threaten lives. A TWC must, therefore, not only provide timely warning of a destructive tsunami, but must also avoid causing unnecessary evacuations with their attendant negative impacts.

The detection and forecasting process requires real-time observations of tsunamis from both coastal sea level gauges and open-ocean sensors (such as those provided by the DART stations). The combination of open-ocean and coastal sea level stations, which provide direct observations of tsunami waves, is important for adjusting and canceling warnings as well as for post-tsunami validation of models of the tsunami propagation and inundation (U.S. Indian Ocean Tsunami Warning System Program, 2007).
These sea level networks can also detect tsunamis from sources that fail to generate seismic waves, or from tsunamis generated by an earthquake on land that triggers a sub-aerial and/or a seafloor landslide. Progress in expanding the ocean observing network and advances in oceanographic observing technologies allow the TWCs to incorporate the direct oceanographic detection of tsunamis into their decision processes.

Conclusion: An array of coastal and open-ocean sea level sensors is necessary until such time, in some distant future, when the capability exists to observe the entire tsunami wave-front in real time and with high horizontal resolution (e.g., perhaps with satellites) as it expands outward from its source and comes ashore.

The Tsunami Warning Decision Process Before and After Enactment of Public Law 109-424

A majority of the funds authorized by the Tsunami Warning and Education Act (P.L. 109-424) have been used to manufacture, deploy, and maintain an array of 39 DART stations (not counting the 9 purchased and deployed by foreign agencies; http://www.ndbc.noaa.gov/dart.shtml), establish 16 new coastal sea level gauges, and upgrade 33 existing water level stations (National Tsunami Hazard Mitigation Program, 2008; http://tidesandcurrents.noaa.gov/1mindata.shtml). All these new and upgraded sea level stations, especially the DART sites, have
tsunami within minutes after the earthquake, there will likely be only a few additional minutes before inundation, barely enough time for individuals to flee a short distance. The earthquake itself, if severe enough, may have already disrupted local communications, destroyed structures, and cut evacuation routes, as happened in Samoa during the September 29, 2009, tsunami (http://www.eqclearinghouse.org/20090929-samoa/category/emergency-management-response). Nevertheless, successful evacuations have occurred during the recent events in Samoa and Chile.

As for communities a little farther away from the tsunami source (where a tsunami might strike within an hour or so), the lack of communications could mean that tsunami forecasters will not receive data from the coastal sea level gauges that the tsunami reaches first. These communities might also be too distant from the triggering earthquake to have felt ground shaking strong enough to regard it as their warning. These communities depend on the detection system to assess the threat very rapidly and to deliver the warning product and evacuation order.

Because their likely sources lie along undersea fault zones that tend to be near continents or islands, almost all tsunamis will have a near-field region that is affected relatively soon (within minutes) after the earthquake, as well as a whole suite of regions at varying distances that are affected from minutes to many hours after the earthquake. As an example, Figure 4.10 presents a simulation of the great 1700 tsunami that was generated by a magnitude 9.0 earthquake on the Cascadia subduction zone. After 1 hour, the leading tsunami wave crest has already inundated the local coastlines of Oregon, Washington, and Vancouver Island and has reached as far south as San Francisco. After 2 hours, the leading crest is well within the Southern California Bight.
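The arrival times seen in such simulations follow from the long-wave (shallow-water) phase speed, c = sqrt(g*h): a tsunami crosses the deep ocean at jet-aircraft speed and slows dramatically over the shelf. A back-of-the-envelope sketch (the depths and the roughly 1,000 km Cascadia-to-San Francisco distance are illustrative round numbers, not values from the simulation):

```python
import math

def tsunami_speed_ms(depth_m, g=9.81):
    """Long-wave phase speed c = sqrt(g * h) for a tsunami in water of depth h."""
    return math.sqrt(g * depth_m)

def travel_time_hours(distance_km, depth_m):
    """Travel time over a path of roughly constant depth."""
    return distance_km * 1000.0 / tsunami_speed_ms(depth_m) / 3600.0

deep = tsunami_speed_ms(4000.0)              # ~198 m/s (~713 km/h) in 4,000 m of water
shelf = tsunami_speed_ms(100.0)              # ~31 m/s over a 100 m deep shelf
hours = travel_time_hours(1000.0, 4000.0)    # ~1.4 h for ~1,000 km of deep ocean
```

The strong depth dependence is why the leading crest can run a thousand kilometers offshore in the time it takes to work its way across a shallow shelf, and why near-field coasts have only minutes.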
For the benefit of the communities at intermediate and greater distances from likely tsunami source regions, and given the possibility that a near-coast earthquake will not only generate a large tsunami but will also destroy infrastructure (including sea level gauges or the telecommunication paths for their data) on the nearby coast, offshore open-ocean gauges that provide near-real-time, rapidly sampled sea level observations are needed. This need motivated the placement of five DART stations off the coasts of California, Oregon, Washington, and British Columbia (see Figure 4.6). Note that at least two of these DART stations would have observed the 1700 tsunami (Figure 4.10) well before the initial wave crest reached San Francisco.

Despite the short lead time for a near-field tsunami, there is still value in providing rapid official warning to the local populace, so long as people are not taught to wait for such a warning if they have already felt a strong earthquake. Such formal warning, delivered by every possible means (e.g., loudspeakers, TV, radio, Internet, text message, Twitter, etc.), will urge people to evacuate more quickly (people will likely be stressed by the strong ground shaking). More importantly, such warning could be the only way to notify people to evacuate in the event of a tsunami earthquake that, because of its peculiar temporal evolution, generates a tsunami of greater amplitude than would be expected from the small amount of ground shaking. The most catastrophic example is the Meiji Sanriku tsunami of 1896 in northeast Japan. The earthquake magnitude was large, Ms = 7.2, but the ground shaking was so weak that few people were overly concerned about the quake. More than 22,000 people
perished in the huge tsunami that followed, which had a maximum run-up in excess of 30 m.

FIGURE 4.10 Two snapshots from a simulation of the great 1700 tsunami that was generated by a magnitude 9.0 earthquake on the Cascadia subduction zone. The left panel shows the sea level displacements after one hour and the right panel after two hours. Warmer colors show wave crests; cooler colors are the troughs. After one hour, the leading crest has already inundated the local coastlines of Oregon, Washington, and Vancouver Island and has passed San Francisco Bay. On the west side of the disturbance, the initial crest is over 800 km from the coast after one hour. After two hours, the initial wave crest is well within the Southern California Bight on its way to Los Angeles. SOURCE: Satake et al., 2003; reproduced by permission of the American Geophysical Union; http://serc.carleton.edu/NAGTWorkshops/ocean/visualizations/tsunami.html.

Tsunami earthquakes are not rare. In addition to the Meiji Sanriku tsunami, Okal and Newman (2001) list the following tsunami earthquakes: the 1946 Aleutian Island tsunami; the 1963 and 1975 Kuril Island tsunamis; the 1992 Nicaragua tsunami; the 1994 and 2006 Java tsunamis; and the 1996 Chimbote, Peru, tsunami. To detect a tsunami earthquake, direct measurements of the water-surface variations and/or water currents are required in near-real time. Such measurements are also critical for detecting tsunamis generated by submarine landslides. One way to accomplish such measurements is to utilize the data from existing and planned cabled ocean observatories.

Several cabled seafloor observatories are currently in operation or will be constructed in the near future off North America. These observatories comprise various sensors or sensor systems that are connected to each other and to the shore by a seafloor communications cable.
This cable also provides power to the sensors and a pathway for high-speed data return from the sensors. The sensors gather a variety of oceanic and geophysical data that are transmitted in near-real time via the fiber optic cables from the seafloor to onshore data servers. Among the sensors are those useful for tsunami detection; for example, bottom pressure sensors, seismometers, current meters, hydrophones, gravimeters, and accelerometers. The cable can deliver relatively high amounts of electric power to support many sensors acquiring data at high sampling rates. Observatories are currently in operation off British Columbia (NorthEast Pacific Time-Series Underwater Networked Experiments, NEPTUNE-Canada:
http://www.neptunecanada.ca/) and in Monterey Bay, California (Monterey Accelerated Research System, MARS: http://www.mbari.org/mars/). Another large U.S. observatory has been funded by the NSF for deployment across Oregon's continental shelf, slope, and the Cascadia subduction zone, over the Juan de Fuca plate, and on to the Juan de Fuca Ridge (Ocean Observatories Initiative, OOI: http://www.interactiveoceans.washington.edu/). Both the NEPTUNE-Canada and OOI networks can be used for quantitative tsunami detection, primarily via their seismometers and seafloor pressure sensors.

Off Oregon, Washington, and British Columbia, the water pressure sensors placed on the seafloor cabled observatories can readily replace or enhance the DARTs in providing warning to communities at mid- to far-ranges from the tsunami-producing Cascadia subduction zone. In addition, because the seismic data from the observatories can be used in near-real time by automatic computer algorithms to separate seismic and tsunami signals in the pressure data, the pressure gauges can be placed very near, and even on top of, the expected tsunami source regions. This can yield very rapid determination of whether a sizable tsunami has been generated, thus providing a capability for some modicum of warning to the near-field coasts. From a pragmatic operational point of view, the utilization of the NEPTUNE-Canada and OOI sensors for tsunami detection could be expected to eliminate the need for the DART buoys off Washington and Oregon, thus freeing up those resources for other purposes.

In Japan, cabled observatories already exist that are focused on collecting measurements of earthquakes and tsunamis.
For example, the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) has installed three observatories and is constructing a fourth, called the Dense Ocean-floor Network System for Earthquakes and Tsunamis (DONET), that specifically aims at capturing the data from the next Tokai earthquake and tsunami. One exceptional event has already occurred at one of JAMSTEC's observatories, the Tokachi-oki site, which was located atop the source area of the 2003 Tokachi-Oki earthquake; for the first time ever, seafloor sensors observed the pressure variations of a tsunami at the instant of its creation. The abrupt changes in water pressure at the seafloor clearly show the seafloor displacements of the earthquake, with sustained acoustic (pressure) waves bouncing up and down between the hard bottom and the sea surface (Li et al., 2009) while the tsunami wave spreads outward from the source. These observations of the 2003 Tokachi-Oki earthquake and tsunami provided an important lesson: the sensors and cables of an observatory placed at the epicenter can survive the earthquake, allowing the near-real-time data to be used effectively for rapid warning of local tsunamis.

Another possible technology for detecting local tsunamis is high-frequency (HF) radar (Lipa et al., 2006). Coastal HF radar stations produce maps of the ocean surface currents using radar echoes from short-period surface gravity waves. A tsunami wave, which exists at longer periods (1-30 minutes) than the waves (~10 seconds) that reflect the radar's transmitted energy, will transport the shorter waves, adding to the ambient current and producing a signature detectable by the radar. The method has not been proven in the field, but theoretical and analytical studies are encouraging. The radars could provide accurate and rapid observations of tsunami waves before they make landfall and thereby aid in the formulation of better warning products.
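The current signature the radar would need to resolve can be estimated from linear long-wave theory, in which the depth-averaged tsunami current is u = eta * sqrt(g/h) for wave height eta in water depth h. A rough sketch (the height and depth values are illustrative, not from the Lipa et al. study):

```python
import math

def tsunami_current_ms(height_m, depth_m, g=9.81):
    """Depth-averaged current under a linear long wave: u = eta * sqrt(g / h)."""
    return height_m * math.sqrt(g / depth_m)

# A 0.5 m tsunami over a 50 m deep shelf drives a current of roughly 0.2 m/s,
# comfortably above the few-cm/s current precision usually cited for HF radar.
on_shelf = tsunami_current_ms(0.5, 50.0)

# The same wave in 4,000 m of water drives only ~2-3 cm/s, which is why
# detection is expected over the shelf rather than in the deep ocean.
deep_water = tsunami_current_ms(0.5, 4000.0)
```

The strong 1/sqrt(h) growth of the current as the wave shoals is what makes a shelf-watching radar a plausible near-field detector.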
Many radar stations are already installed along the coast threatened by the Cascadia subduction zone (e.g., see http://bragg.coas.oregonstate.edu/). With software enhancements, these stations, and new ones in critical locations, could be key elements of a rapid warning system for near-field events. The radar stations are typically installed on high bluffs overlooking the shore, above any possible inundation. The potential for a broad user base of HF radar data in many locations would help justify the expense of installation and operations, resulting in enhanced sustainability.

Conclusion: Tsunami detection, warning, and preparedness activities for tsunamis arriving within minutes to an hour or so could benefit from existing, alternative technologies for rapid detection, especially considering the current sensor network's limitations for detecting tsunami earthquakes and tsunamis generated by submarine landslides.

Recommendation: For the purpose of developing more rapid and accurate warnings of local tsunamis, especially along the Washington and Oregon coasts, the committee recommends that the TWCs coordinate with the NEPTUNE-Canada and OOI observatory managers to ensure that their seismic and bottom pressure data are (or will be) made available in near-real time to the appropriate telecommunications gateways. Data interpretation tool(s), jointly applied to the seismic and bottom pressure data, will need to be developed to realize the most rapid tsunami detection possible. Other NTHMP member states could seek similar opportunities to utilize existing and/or planned systems (including coastal HF radars) for the detection and warning of local tsunamis. It must be emphasized that the investment required for this adaptation would be minimal, because the observatories are being constructed and will be maintained with funds external to the U.S. Tsunami Program; thus, the benefit could be substantial.
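One simple ingredient of such a joint interpretation tool is a low-pass filter that suppresses the seismic and acoustic oscillations in a bottom pressure record while retaining the minutes-period tsunami band. A toy sketch on a synthetic record (all periods and amplitudes are invented for illustration; an operational tool would use a properly designed filter, not a boxcar):

```python
import math

def moving_average(x, window):
    """Boxcar low-pass filter: average over `window` consecutive samples."""
    half = window // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

# Synthetic 1 Hz bottom pressure record (meters of equivalent water height):
# a 2 cm tsunami with a 600 s period buried under 5 cm of 20 s seismic noise.
tsunami = [0.02 * math.sin(2 * math.pi * t / 600.0) for t in range(1200)]
seismic = [0.05 * math.sin(2 * math.pi * t / 20.0) for t in range(1200)]
record = [a + b for a, b in zip(tsunami, seismic)]

# A 61 s average spans about three seismic periods, so the 20 s oscillation
# nearly cancels while the 600 s tsunami passes almost unattenuated.
filtered = moving_average(record, 61)
residual = max(abs(f - s) for f, s in zip(filtered[100:-100], tsunami[100:-100]))
```

The raw record deviates from the tsunami by the full 5 cm noise amplitude; after filtering, the interior residual drops to the millimeter level, illustrating how pressure gauges sitting on top of a source region can still isolate the tsunami signal.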
RESEARCH OPPORTUNITIES AND NEW TECHNOLOGIES

The previous sections of this chapter have made it clear that present technologies and methodologies for evaluating the potential of earthquakes to produce dangerous tsunamis, and for detecting and forecasting those tsunamis, are far from the ideal of having an accurate and complete forecast of the expanding tsunami wave train within a few minutes of the initiating rupture. It is therefore appropriate to briefly review nascent technologies and methodologies that might improve the ability of the U.S. TWCs and their international counterparts to provide quicker and more accurate tsunami warnings. Some of these technologies and methodologies, like the undersea cabled observatories discussed in the previous section, are already available, simply awaiting the appropriate testing and software development to be integrated into the TWCs' warning processes. Others require much more development before they will become useful. Technologies such as satellite altimetry, passive microwave radiometry, ionospheric perturbation detection, and real-time kinematic-global positioning system (RTK-GPS) buoys
have been proposed for detecting tsunamis in the wake of the Indian Ocean event of 2004. Although potentially promising, none has yet been demonstrated as a viable operational alternative to the current systems, perhaps due to lack of funding. In general, most alternatives are not sufficiently sensitive to serve as a replacement for present technologies, with which small waves (<1 cm) can be observed and used for wave model inputs, fine-tuning of forecasts and warnings (including cancellation of warnings), and tsunami research. Nevertheless, continued research and development may prove fruitful. The descriptions below of some interesting technologies and methodologies are provided simply to indicate possibilities and should not be interpreted as endorsements of their utility by this committee.

Duration of High-Frequency P-Waves for Earthquake Moment Magnitude Estimation

Because of the difficulty of obtaining reliable estimates of seismic moments at the long periods relevant to tsunami generation, research is needed to explore the possibility of using other methods, possibly drawing on different technologies, to improve the accuracy of moment estimates and the ability to detect unusual events, such as tsunami earthquakes. One approach to the near-real-time investigation of large seismic sources consists of targeting their duration in addition to their amplitude. The comparison between amplitude and duration reveals violations of scaling laws (e.g., slow events such as tsunami earthquakes). Following the 2004 Sumatra earthquake, Ni et al. (2005) noted that source duration can be extracted by high-pass filtering of the P-wave train at distant stations, typically between 2 and 4 Hz. Only P-waves escape substantial inelastic attenuation, so this procedure eliminates spurious contributions from later seismic phases and delivers a "clean" record of the history of the source.
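One concrete duration proxy built on such a high-pass-filtered P-wave train is τ1/3, the time the envelope remains above one third of its peak value; it reduces to a few lines of code once an envelope is in hand. A sketch on a synthetic triangular envelope (the envelope is invented for illustration; real use would start from a 2-4 Hz filtered record):

```python
def tau_one_third(envelope, dt):
    """Duration (s) the envelope stays above one third of its maximum:
    the tau_1/3 source-duration proxy."""
    threshold = max(envelope) / 3.0
    return sum(dt for e in envelope if e > threshold)

# Synthetic triangular envelope: rises for 100 s, decays for 100 s (1 s samples).
env = [t / 100.0 if t <= 100 else (200 - t) / 100.0 for t in range(201)]
tau = tau_one_third(env, 1.0)   # crosses 1/3 of peak near t = 33 s and t = 167 s
```

An anomalously long τ1/3 relative to the event's amplitude flags a scaling-law violation, which is exactly the signature of a slow tsunami earthquake.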
This approach has been pursued recently by Lomax et al. (2007) and Okal (2007a). In particular, the latter study applied techniques initially developed in the field of seismic source discrimination (of manmade explosions as opposed to earthquakes) to characterize the duration of the source through the time τ1/3 over which the envelope of the high-frequency P-wave is sustained above one third of its maximum value. It is shown, for example, that this approach would have clearly recognized the 2004 Sumatra earthquake as a great earthquake, or the 2006 Java tsunami earthquake rupture as exceptionally slow. In addition, alternative methods for the rapid identification of the source duration of major earthquakes are presently the topic of significant research endeavors, e.g., by Lomax et al. (2007) and Newman and Convers (2008).

The high-frequency band of the Sumatra earthquake was recorded in Japan using the Hi-Net seismic array, comprising 700 borehole instruments at an approximate 20 km spacing. Ishii et al. (2005) used the data from the array to produce back-projected images of the earthquake rupture over approximately eight minutes across a 1,300 km long aftershock region, including both the slip history and the overall extent of the seismic zone. Comparison of the resulting fault image with those of previous great earthquakes supported the hypothesis that the moment magnitude of the earthquake was 9.3, the largest earthquake ever recorded with modern seismic instruments. The authors believe that such images of the aftershock
zone could be made available within 30 minutes of the initiation of a similar event. Although networks or arrays like Hi-Net are rare, a similar or even more capable array is currently being implemented across the continental United States, funded by the NSF EarthScope program. Today's high-speed, high-capacity networks, coupled with large-capacity computing facilities such as cloud computing, provide the technologies for implementing an early warning system. The compressional wave velocity is high (>8 km/s) and will provide fault images more quickly than the hydrophone approaches discussed below. The technique used for acoustics, however, is similar to seismic back-projection.

Conclusion: The P-wave duration and back-projection methods appear robust and applicable to high-frequency records. These methods have some advantages over the W-phase approach because they can provide constraints on the rupture length and duration and do not rely on having seismometers with a stable long-period response.

Recommendation: The committee recommends that NOAA and the TWCs consider the use of arrays and networks such as Hi-Net and the EarthScope Array National Facility to determine the rupture extent and moment of great earthquakes. The networking and computational requirements are significant and would need to be included in future TWC upgrades.

Hydroacoustic Monitoring of Underwater Geophysical Events

Sound wave ("hydroacoustic") signals can propagate great distances within a waveguide in the ocean, termed the sound fixing and ranging ("SOFAR") channel. This propagation was discovered during World War II, and immediately following declassification scientists began exploring the possibility of using hydroacoustic signals generated by large earthquakes (the so-called T phases) for the purpose of tsunami warning (Ewing et al., 1950).
With the development of the UN International Monitoring System of the CTBTO, several state-of-the-art hydrophone stations have been deployed in the world ocean, offering an opportunity for complementary use in the context of tsunami warning. Each station comprises three hydrophones separated by approximately 2 km to provide some directionality at low frequencies. By placing hydrophone sensors within the SOFAR channel, a scientist can “listen” to seafloor seismic, tectonic, and volcanic events occurring at a great distance. The potential of using hydroacoustic techniques to monitor underwater landslides has yet to be fully explored, but it may represent the best approach for detecting unsuspected underwater landslides, as occurred in the 1998 Papua New Guinea (PNG) tsunami (Okal, 2003). However, that detection represents to this day a unique, unrepeated occurrence. Furthermore, the PNG landslide was identified as such because its hydroacoustic signal was too weak for its duration, in violation of earthquake scaling laws. At the same time, T phases can be used to complement the identification of anomalously slow events, such as tsunami earthquakes, because hydroacoustic signals include very high frequencies (3 Hz and above) and their energy bears the imprint of the earthquake at very short periods (Okal et al., 2003).
In this respect, hydroacoustic signals can play only a complementary role in tsunami warning because they travel slowly (about 1,500 m/s). However, de Groot-Hedlin (2005) and Tolstoy and Bohnenstiehl (2005) demonstrated that it was possible to use ocean hydrophones to track the rupture of the 2004 Sumatra event from the original epicenter to its termination more than 600 km to the north. The hydrophones were 2,800 and 7,000 km from the epicenter, and acoustic propagation required 31-78 minutes while the fault itself ruptured for more than 8 minutes. The information would not be useful for alerting nearby communities but could have provided meaningful warnings for Sri Lanka and more distant countries. Other properties of T phases can shed some interesting, but again complementary, light on properties of the seismic sources, for example their duration, along lines similar to the τ1/3 method described earlier. Salzberg (2008) has also proposed to constrain hypocentral depth precisely using the decay of very high frequency (20-80 Hz) T phases from the parent earthquakes. Once such techniques reach operational status, they could contribute to tsunami warning.

An additional aspect of SOFAR hydrophone sensors is that they can record the pressure variations accompanying the passage of a tsunami, and in this sense could supplement the network of DART buoys, as their sensors (in both cases pressure detectors) essentially share the same technology, the only difference being that the latter are deployed on the ocean bottom. However, within the context of the CTBTO, the International Monitoring System (IMS) sensors have been hard-wired with drastic high-pass filters (with a corner frequency of 10 Hz), and the main spectral components of the 2004 Sumatra tsunami (around 1 mHz) were recorded only as digital noise (Okal et al., 2007).
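The 31-78 minute delays quoted above follow directly from the ~1,500 m/s SOFAR sound speed, and comparing them with tsunami travel time shows why T phases can still deliver usable lead time at distant coasts. A quick check (the 200 m/s tsunami speed is the approximate deep-ocean value for ~4,000 m depth):

```python
def delay_minutes(range_km, speed_ms):
    """Propagation delay in minutes over `range_km` at speed `speed_ms`."""
    return range_km * 1000.0 / speed_ms / 60.0

# T phases at ~1,500 m/s reproduce the delays quoted in the text...
near_hydro = delay_minutes(2800.0, 1500.0)   # ~31 minutes
far_hydro = delay_minutes(7000.0, 1500.0)    # ~78 minutes

# ...while the tsunami itself (~200 m/s in the deep ocean) needs many hours
# to cover the same 7,000 km, leaving hours of potential warning margin.
tsunami_hours = delay_minutes(7000.0, 200.0) / 60.0
```

The roughly 7:1 speed advantage of sound over the tsunami is the entire basis for the "complementary" warning role described above.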
The use of software rather than hardware filters for any future deployment of hydrophones in the SOFAR channel could be extremely valuable to the tsunami community. The cabled NSF OOI Regional Scale Nodes (RSNs) to be deployed off Washington and Oregon and the existing NEPTUNE-Canada network (see above) could support both bottom pressure gauges and hydrophones in the SOFAR channel, enhancing tsunami research and warning in the Cascadia area.

Continuous GPS Measurements of Crustal Movement

When combined with seismic data, continuous global positioning system (GPS) measurements of displacement have proven powerful in studying continental earthquakes; for example, in illuminating the processes of earthquake after-slip, creep, and viscoelastic deformation. Continuous GPS can provide a map of the three-dimensional deformation incurred at the surface in the proximity of the epicenter as a result of the earthquake rupture. It offers a resolution to the problem of the long-period component of the seismic source by simply allowing measurement over a time window long enough to be relevant to tsunami generation, even for nearby sources. GPS and broadband seismic measurements differ substantially in that GPS geodetic measurements provide distances between neighboring stations, while individual seismometers respond to applied forces, with signals proportional to acceleration. Normally, the output of a seismometer is "shaped" to be proportional to velocity above some frequency (1/360 Hz for an STS-1; Appendix G). Because earthquakes cannot apply a constant force at zero frequency, it is not possible to directly infer displacements from a seismometer. Furthermore, a seismometer is limited by its mechanics and electronics to recording signals smaller than some threshold; arbitrarily large displacements can be measured by GPS.

Bock et al. (2000) demonstrated that GPS receivers can measure ground motion in real time as often as every few seconds. They tested the accuracy of these estimates over baselines as large as 37 km and found that the horizontal components have accuracies no worse than 15 mm; they anticipated that the baselines could be extended to at least 50 km with no further loss of accuracy. The vertical measurements were less useful, with accuracies a factor of 7-8 worse. The accuracies have improved over the past decade with the advent of new receivers, new algorithms, and statistical analyses. GPS receivers sampling at 10-50 Hz and the associated methods are now practical and such measurements routine (e.g., Genrich and Bock, 2006).

The application of near-real-time, continuous GPS measurements has made great strides as well. For example, Song (2007) used coastal GPS stations (E-W and N-S horizontal measurements) to infer displacements on the seafloor offshore, using the location of the fault and inferring the vertical uplift from conservation of mass. Song tested the method against geodetic data from the 2005 Nias, 2004 Sumatra, and 1964 Alaska earthquakes. In the cases of Nias and Sumatra, both continuous GPS data and campaign GPS data were available. He tested the model against satellite altimetry measurements of the tsunami wave using Topex, Jason, and Envisat data (altimetry profiles included time epochs of 1:55-2:10, 1:48-2:03, and 3:10-3:22 [hr:min after the origin time]). The Nias and Alaska events were also tested against available coastal tide gauge data.
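A core operation in this approach is extracting the permanent (static) coseismic offset from a GPS position time series once the oscillatory shaking has decayed; averaging a late window of the series is the simplest estimator. A toy sketch (the offset, decay time, and oscillation are all synthetic stand-ins, not values from any of the studies cited):

```python
import math

def static_offset(series_m, tail_samples):
    """Estimate the permanent displacement as the mean of the final samples,
    taken after the dynamic (oscillatory) ground motion has died away."""
    tail = series_m[-tail_samples:]
    return sum(tail) / len(tail)

# Synthetic 1 Hz east-component record: a 0.5 m permanent offset plus a
# decaying 5 s oscillation standing in for the seismic shaking.
series = [0.5 + 0.3 * math.exp(-t / 60.0) * math.sin(2 * math.pi * t / 5.0)
          for t in range(600)]
offset = static_offset(series, 100)   # mean of the final 100 s, close to 0.5 m
```

Early in the record the raw positions swing far from the true offset, which is why the text notes that static offsets only "begin to be apparent" minutes after rupture initiation; once they do, a window average recovers them to well within GPS accuracy.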
The methods were used again after the February 27, 2010, Chile earthquake and were later verified by satellite altimetry from the Jason-1 and Jason-2 satellites operated by the National Aeronautics and Space Administration (NASA) and the French space agency. The successful use of GPS data for these four earthquakes makes a strong case for using continuous GPS stations to measure coastal ground displacements and thereby infer the corresponding displacements offshore. In turn, these displacements can be used to predict tsunami generation, including accurate wave heights as a function of time, range, and azimuth.

Near-field tsunamis are generated by the rupture of hundreds of kilometers of an offshore subduction fault. As in the case of the Sumatra earthquake, this rupture can last eight minutes or more. During this period, GPS data will mimic seismic data, with oscillatory behavior that obscures the smaller, permanent displacements. The most distant part of the fault can be at least eight minutes of propagation time away from a station, and the displacements generated by that distant part of the source will take as long to propagate back to the station. By that time, however, the static offsets will begin to be apparent, allowing the inference of offshore displacements and a realistic assignment of magnitude (as little as 4-5 minutes after the initiation of faulting). The tsunami associated with the earthquake will not come ashore much earlier than 30 minutes after the beginning of the rupture, so technology-based warnings could be issued in time to be useful. None of these operations lie even remotely outside the capabilities of modern networks, computational workflows, and computing facilities.

Today there are thousands of GPS geodetic receivers located around the earth. In southern California alone, there are more than 250 continuously recording GPS geodetic stations that
are available in near-real time from a variety of sources (e.g., http://sopac.ucsd.edu; Schmidt et al., 2008). Recently, NSF Geosciences elected to undertake the improvement and densification of seismic and geodetic stations in the Cascadia region, including the enhancement of near-real-time access to GPS (http://www.oceanleadership.org/2010/nsf-cascadia-initiative-workshop/). Sweeney et al. (2005) have demonstrated that centimeter-level horizontal accuracy can be achieved on the seafloor using GPS coupled to seafloor geodetic monuments by acoustic methods. These technologies might be extended to verify offshore displacements predicted from accurate coastal GPS stations. Permanent GPS stations should be incorporated into the tsunami warning program and expanded, if needed, to provide tsunami prediction capabilities. Although Cascadia is one of the most critical sites for U.S. tsunami warning in the near-field regions, Alaska and the Caribbean are also critical sites. Few new technologies promise such revolutionary approaches for improving tsunami warning, especially in the near-field region.

Conclusion: GPS geodesy, exploiting near-real-time data telemetry from permanent geodetic stations, holds great promise for extending the current seismic networks to include capabilities for measuring displacements in the coastal environment for great and mega-earthquakes. Displacements onshore can potentially be used to infer offshore displacements in times as short as five minutes in an area such as the Cascadia fault zone.

Recommendation: NOAA should explore further the operational integration of GPS data into TWC operations from existing and planned GPS geodetic stations along portions of the coast of the United States potentially susceptible to near-field tsunami generation, including Alaska, Cascadia, the Caribbean, and Hawaii.
Where GPS geodetic coverage is not adequate, NOAA should work with NSF and the states to extend coverage, including the long-term operation and maintenance of the stations.

Observation of Tsunami Wave Trains with Satellite Altimeters

Satellite altimetry, in use since 1978, measures the height of the ocean surface with a precision of a few centimeters by precisely timing the reflection of a radar beam emitted from and received at the satellite. Its capability to detect a tsunami was proposed following the 1992 Nicaragua tsunami (Okal et al., 1999), and it achieved a definitive detection following the 2004 Sumatra tsunami, with a signal of 70 cm in the Bay of Bengal (Scharroo et al., 2005; Ablain et al., 2006). (See also the preceding topic, "Continuous GPS Measurements of Crustal Movement.") Although the method has obvious potential in the field of tsunami warning, two major problems presently hamper its systematic use: (1) delayed processing of the data, which in the case of the 2004 event were made available to the scientific community only several weeks after the event, and (2) the presently sparse coverage of the earth's oceans by altimetry satellites. In lay terms, the satellite has to be over the right spot at the right time; in the case of the Sumatra tsunami, the passage of two satellites over the Bay of Bengal as the tsunami propagated across was a lucky coincidence. Thus, making satellite altimetry operational for tsunami warning requires geostationary satellites over the ocean basins of interest, or a dense array of low earth orbit (LEO) satellites, with either setup providing data in near-real time. In fact, Iridium Communications, Inc. is designing its second generation of LEO communications satellites (called Iridium NEXT), which are expected to be fully deployed by 2016 and will carry scientific payloads such as altimeters for sea height determination, including observation of tsunamis (http://www.iridium.com/About/IridiumNEXT/HostedPayloads.aspx). The planned constellation of 66 satellites suggests that a tsunami created anywhere in the world could be observed close to the moment of inception. At the present time, however, the NEXT constellation is not being promoted as a tool for operational tsunami warning.

Tsunami-Induced Sea-Surface Roughness and "Tsunami Shadows"

Godin (2004) provided a theoretical justification for so-called "tsunami shadow" observations (Walker, 1996), namely that the surface of the ocean exhibits a change of appearance during the propagation of a tsunami. In simple terms, the tsunami creates a coherent change in sea-surface slope, inducing turbulence in the wind currents at the surface, which in turn enhances the roughness of the sea-air interface. Godin et al. (2009) further showed that the phenomenon was detectable as anomalous scattering in the radar signal from the JASON satellite altimeter during its transit over the wavefront of the 2004 Sumatra tsunami in the Bay of Bengal. This remarkable scientific confirmation and physical explanation of what had amounted to anecdotal reports offers some promise as a complementary means of near-real-time tsunami detection.
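As a rough illustration of how anomalous scattering might be flagged along a satellite ground track, the following sketch compares backscatter samples against a running median. The window length and 1.5 dB threshold are illustrative assumptions, not values taken from Godin et al. (2009).

```python
import numpy as np

def flag_roughness_anomalies(sigma0, window=51, threshold_db=1.5):
    """Flag along-track altimeter backscatter samples (sigma0, in dB)
    that deviate strongly from a running median, a crude stand-in for
    the anomalous-scattering signature of tsunami-roughened water.
    The window length and threshold are illustrative assumptions."""
    sigma0 = np.asarray(sigma0, dtype=float)
    half = window // 2
    padded = np.pad(sigma0, half, mode='edge')
    running_median = np.array([np.median(padded[i:i + window])
                               for i in range(sigma0.size)])
    return np.abs(sigma0 - running_median) > threshold_db
```

Because the anomalous segment is short compared with the median window, the running median tracks the quiet background and the roughened stretch stands out as a localized deviation.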
In its reported form, the method suffers from the same limitation as satellite altimetry, namely the need to have a satellite at the right place at the right time. On the other hand, it may be feasible to develop a land-based detector of sea-surface roughness using over-the-horizon radar technology.

Direct Recording of Tsunami Waves by Island Seismometers

Another notable observation made in the wake of the 2004 Sumatra event was that the actual tsunami wave was detectable on horizontal long-period seismometers located on oceanic islands or on the shores of continental masses (e.g., Antarctica) (Yuan et al., 2005). Okal (2007b) later verified that such signals could be extracted from past events (e.g., Peru, 2001) and showed that the recordings express the response of the seismometer to the combined horizontal displacement and tilt of the ocean floor during the passage of the tsunami wave, whose wavelengths are so large (typically 300 km) that the structure of a small island can be neglected. In particular, it was verified that such records can be interpreted quantitatively on this basis, which amounts to saying that near-shore seismometers can play the role of tsunameters deployed on the high seas for tsunami detection. The present network of island seismic stations (see Figure 4.1) thus has the potential to increase the density of the tsunami (sea level) detection network at essentially no cost, since the stations already exist.
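A minimal sketch of how a tsunami-band signal might be isolated on an existing long-period horizontal channel, assuming band corners of 5 and 60 minutes (illustrative choices, not operational settings):

```python
import numpy as np

def tsunami_band_rms(trace, dt, t_short=300.0, t_long=3600.0):
    """Return the RMS amplitude of a horizontal seismometer trace after
    band-passing to typical tsunami periods (here 5-60 minutes; the
    corner periods are illustrative).  A sustained rise in this quantity
    relative to a quiet-time baseline would indicate the combined
    displacement-and-tilt signature of a passing tsunami described above.
    """
    n = trace.size
    # Zero-mean the trace, transform, and keep only tsunami-band bins.
    spectrum = np.fft.rfft(trace - trace.mean())
    freqs = np.fft.rfftfreq(n, d=dt)
    band = (freqs >= 1.0 / t_long) & (freqs <= 1.0 / t_short)
    spectrum[~band] = 0.0
    filtered = np.fft.irfft(spectrum, n)
    return np.sqrt(np.mean(filtered ** 2))
```

Applied to a record containing both a long-period (in-band) oscillation and ordinary shorter-period microseismic energy, the band RMS responds to the former and suppresses the latter, which is the discrimination an island station would need to act as a de facto tsunameter.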
"Upward Continuation" of the Tsunami Wave and Its Detection in Space

Because the atmosphere has finite density, a tsunami wave does not stop at the surface of the sea; it induces a displacement of the atmosphere in the form of a gravity wave that accompanies the tsunami during its propagation. The volumetric energy density of this upward continuation of the tsunami decreases with height, but because the atmosphere rarefies even faster, the amplitude of the resulting vibration actually increases with height. A tsunami wave of amplitude 10 cm at the surface of the ocean can reach an amplitude of roughly 1 km at the base of the ionosphere, at an altitude of 150 km. This fascinating proposition was initially suggested by Peltier and Hines (1976) and confirmed by Artru et al. (2005) during the 2001 Peruvian tsunami. The detection methodology uses dense arrays of GPS receivers, because large-scale fluctuations of the ionosphere affect the propagation of the electromagnetic waves from the GPS satellites, thus distorting the signals recorded at the receivers. Occhipinti et al. (2006) have successfully modeled such records quantitatively and have shown that other space-based techniques involving reflection at the bottom of the ionosphere (e.g., over-the-horizon radar) could be useful for remote detection of a tsunami on the high seas without the need to instrument the ocean basin itself. The speed of propagation of the atmospheric gravity wave, however, is very low and presents an even greater complication than that described above for acoustic propagation in the ocean's SOFAR channel.

Conclusion: Novel and potentially useful approaches to the estimation of earthquake magnitude and tsunami detection are emerging. Some of these approaches could become operational in the not-too-distant future with proper support for research and testing.
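The order-of-magnitude amplification quoted above (10 cm at the sea surface growing to roughly 1 km at 150 km altitude) can be checked with a simple energy argument, assuming an isothermal atmosphere with a density scale height of about 8 km (a textbook round number assumed here, not a value from the text):

```python
import math

# Energy flux conservation for an upward-propagating gravity wave in an
# isothermal atmosphere implies the displacement amplitude grows like
# 1/sqrt(density), i.e. A(z) = A0 * exp(z / (2 * H)) for an exponential
# density profile with scale height H.
A0 = 0.10      # tsunami amplitude at the sea surface, meters
H = 8.0e3      # assumed atmospheric density scale height, meters
z = 150.0e3    # base of the ionosphere, meters

amplification = math.exp(z / (2.0 * H))
A_ionosphere = A0 * amplification
```

With these assumed numbers the amplification factor is on the order of 10^4, giving an amplitude of roughly a kilometer at 150 km altitude, consistent with the figure cited in the text.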