Warnings of Impending Danger: Science and Social Factors
Those who consider prediction* a part of their research and responsibility range from weather forecasters to seismologists and volcanologists. Warnings of impending danger cause predictable social and economic effects that must be weighed alongside the primary goal: safety. If a disaster prediction is wrong, several million people might be unnecessarily affected (Olsen, 1989) and the region may suffer economic losses. If it is correct but delivered inadequately, disaster is inevitable.
The accuracy of predictions depends on what can be observed and what data can be collected. For example, hurricane predictions are very accurate because scientists have extensive weather instrumentation and well-tested forecasting techniques at their disposal. Volcanic hazard areas and those prone to tsunamis are mapped based on zones identified through historical records – scientists can find geologic evidence that the land was affected by lava, ash or debris flows.
For some forecasted events (such as volcanic eruptions and severe weather), there is time to deliver the message and prepare. Earthquakes are the worst case: there are no widely accepted precursors, and data-based forecasts are long-term probabilities – relatively unhelpful for short-term preparation. Because large seismic events can kill huge numbers of people, earthquake prediction theories have been particularly problematic and fraught with ethical dilemmas for the scientific community, public authorities and the media.
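To see why long-term probabilities offer so little for short-term preparation, consider how such forecasts are commonly expressed. As a hypothetical illustration (the numbers and the Poisson-process assumption are mine, not from any source cited here): if a fault produces a large quake on average every T years, the chance of at least one occurring in the next t years can be estimated as 1 − e^(−t/T).

```python
import math

def quake_probability(mean_recurrence_years: float, window_years: float) -> float:
    """Probability of at least one event in the window, assuming
    (hypothetically) a Poisson process with the given mean recurrence."""
    return 1.0 - math.exp(-window_years / mean_recurrence_years)

# Hypothetical fault with a 150-year average recurrence interval:
print(quake_probability(150, 30))    # ~0.18 over the next 30 years
print(quake_probability(150, 0.25))  # ~0.002 over the next 3 months
```

A ~18% chance over 30 years may justify building codes and insurance pricing, but a fraction of a percent over the next few months cannot tell anyone when to evacuate – which is exactly the gap that tempts unorthodox short-term predictors.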
Scientific predictions must be supported by background theory and data and must withstand skeptical scrutiny to be considered credible. The underlying mechanisms, explanations, calculations and assessments are expected to run the gauntlet of peer review in order to gain acceptance. If that foundation is valid, then short-term, specific predictions will be credible. Predictive successes that have followed the conventional route include volcanic evacuations (Mt. St. Helens, Mt. Pinatubo in the Philippines, and the island of Montserrat) and severe weather alerts.
Psychic and pseudoscientific predictions are not supported by theory or data and are not credible. I’ll not be addressing the ethics of those predictions – they are in a whole other realm.
Failed predictions fall on an impact scale from low (creating public inconvenience) to high (massive death tolls) with economic losses and potential career destruction in between. The following are some notable examples that highlight the major pitfalls inherent in predicting (or ignoring predictions of) natural disasters.
The Brady-Spence Debacle
In 1976, Dr. Brian Brady, a U.S. government scientist, made a specific prediction that a huge seismic event would take place in Lima, Peru, in July of 1981. While the prediction itself was remarkably detailed, the theory supporting it was completely opaque (Olsen, 1989). Brady’s theory had not been tested or published for peer review. In the years leading up to the predicted event, things got complicated. Egos, priorities, agendas and protocol hijacked opportunities for proper, coherent scientific critique. Peruvian officials and the public were confused by the lack of a reliable feed of information. The unstable political situation at the time led Peruvian citizens to think that their government was using the prediction to continue military control (Olsen, 1989, p. 131; Sol and Turan, 2004). The predicted quake did not occur. But widespread disorder, loss of tourism, decreased property values and general public unrest did, resulting in an estimated $50 million in economic damage in Lima (Mileti and Fitzpatrick, 1993, p. 55).
The failure to follow scientific protocol let the situation get out of hand. This episode is an example of a loss of objectivity by the chief scientist, the failure of the scientific community to address a serious situation in a coordinated way, and government agencies accepting rumors and pursuing misguided agendas without accurate information.
Armero (Bruce, 2001)
In 1985, Colombian scientists were aware that villages in the valleys around the Nevado del Ruiz volcano were prone to disaster from eruptions. Yet the government allotted no money to monitor the active volcano. What data could be collected was ignored or not taken seriously by officials. When the media reported that an eruption would produce deadly mudflows that would obliterate the village of Armero, civic leaders called these press reports “volcanic terrorism”. Church leaders added to the propaganda by telling the people of the village not to fear. The poor population made no preparations to evacuate. Inevitably, the volcano erupted. That night, those who attempted to evacuate did not know where to go. Civil defense tried to get people out of the town, but many refused to go – telling rescuers they were certainly mistaken. 23,000 people perished when a flood of meltwater and warm mud buried the town. Armero no longer exists.
Government inaction in this entirely preventable situation was devastating. The situation was a heartbreaking testimony to the vulnerability of the poor to manipulation by authority.
Browning’s New Madrid Event
Iben Browning was a scientist with unconventional ideas who took his claim directly to the media, which gave it wide coverage. He pronounced that an earthquake on the New Madrid fault in the US Midwest would be triggered in December 1990 by tidal forces. In light of his prediction, serious social disturbances occurred. When the quake did not occur, he was ridiculed. Sol and Turan (2004) note that one cannot invoke free speech to defend predictions such as this, since they create social disturbances with harmful consequences.
Mr. Browning rejected scientific protocol and valid criticism but used the press to create a stir. While these actions were unethical by the ideals of the scientific community, the media also share some blame for giving Browning’s opinion a credibility it did not deserve.
Hurricane Katrina

Hurricane Katrina in 2005 was the costliest and one of the deadliest hurricanes ever to hit the United States. A US House committee (2006) investigated the catastrophe and found that, though the forecasts were remarkably good, the right information did not reach the right people in time and decision-makers seriously underestimated the threat. It was well known how vulnerable New Orleans was to hurricanes, yet inadequate provisions, few acts of leadership, government ineptitude, misguided advice and media hype of violence together produced a pathetic governmental response and a heightened death toll. Katrina also exposed issues of race and class: being poor and black put one at a distinct disadvantage in a disaster situation. Previous federal cuts to disaster-preparedness funding had increased these vulnerabilities and taught a hard lesson about paying now or paying later.
Boxing Day Tsunami
The Sumatra-Indian Ocean tsunami of 2004 was an example of how a lack of coordinated monitoring, notification and evacuation procedures caused an enormous and mostly preventable loss of life. Fifteen minutes after the quake that generated the deadly tsunami, U.S. scientists at the Pacific Tsunami Warning Center in Hawaii sent out a bulletin. In spite of their attempts to contact counterparts in other countries, the calls went unanswered and the warning did not get through. Thousands died along populated coastlines, completely unaware of the surge scientists knew was coming.
Dr. Phil Cummings of Australia had pushed for an expansion of the tsunami network into the Indian Ocean. The formation of a study group met resistance from participating countries, and the network was never expanded. In hindsight, it was noted that Dr. Cummings had predicted the damage that would be done to Sumatra and India (Revkin, 2004).
The L’Aquila Earthquake

Giampaolo Giuliani forecasted the 2009 L’Aquila earthquake in Italy based on radon ground emission readings – a scientifically questionable (but not outlandish) approach. Giuliani was reported to authorities for “spreading panic” by broadcasting his warnings weeks before the predicted event. Italian scientists assured the townspeople that quakes were not predictable, and officials forced Giuliani to remove his warnings from the internet (Neild, 2009; Mackey, 2009). When the predicted quake did not occur on the expected date, March 29, the Italian Civil Protection Agency denounced Giuliani as “an imbecile” (Israely, 2009). A quake occurred on April 6 and killed over 100 people.
In this case, a desperate scientist made an attempt to do what he thought was the right thing. The government agency chose ridicule and censorship instead of providing a measured, coordinated response to a questionable scientific prediction. What might the result have been had a different tactic been taken?
The Parties Involved
Most crises do not become instantly obvious. They take time to develop, sometimes from vague or contradictory signals (Boin and ’t Hart, 2006, p. 49). Citizens expect public officials to make critical decisions, provide direction and issue emergency warnings (Barberi et al., 2008). Because they are not experts on scientific topics, officials are vulnerable to misunderstanding and mischaracterization (Olsen, 1989, pp. 38 and 139).
The world’s most vulnerable population is the poor. Keys et al. (2006) assert that expensive warning systems are a hard political sell if they exist only to save poor populations.
Predictions have a way of leaking to the press. The media can be effective and critical in delivering warnings, and they look to experts for information and confirmation. Scientists have not traditionally made themselves available to address the public. One can argue that it is their ethical obligation to be accessible in such situations and that they MUST be in order to establish and retain their place as a credible source of information. Otherwise, alternate, not-so-credible sources step in to fill the void.
Word-of-mouth takes on a whole different scale with electronic media, as warnings are passed instantaneously around the world. “Prediction” via email or social network platforms is popular. Likely unaware that a warning is scientifically baseless, and without an easy way to judge its credibility, a recipient feels she is doing a good deed by passing on a warning of impending doom. Warnings like this can cause undue concern and economic effects.
The elemental question in predictive scenarios is: when is the evidence adequate to make a prediction public? Many prognosticators feel they have potentially life-saving information and are overcome by a moral obligation to inform the public regardless of protocol. They cannot seem to adequately weigh the potential fallout if they are wrong. The public, however, considers costs of all kinds and is not always compelled to follow scientific advice. The public may also be misled by a manufactured scientific controversy (such as vaccine dangers or global warming).
Science is accused of oppressing unorthodox ideas that may form the basis of innovative prediction theory. Desperate scientists with unorthodox ideas, rejected by their peers, will take those ideas to the audience that will listen – the media and the public. The punishment for a scientific maverick can be the end of a career.
The modern public generally holds science and scientists in veneration (Posner, 2004, p. 97; Barberi et al., 2008). Yet science cannot deliver absolutes or provide guarantees. A prediction scenario must take public perception into account or the prediction will cause harm whether the event occurs or not.
People hesitate to undertake precautions that are expensive and time-consuming, but they are influenced by seeing others take a warning seriously (Mileti and Fitzpatrick, 1993, p. 87). Where people are poor, uneducated or distrustful of government (Bolin, 2006, p. 129), there can be a reluctance to accept an “official” warning to evacuate. People who feel they are in control of their lives take action to survive; those who feel their lives are controlled by an external force passively await whatever fate will come. Fatalistic attitudes, especially those rooted in religious belief, are still encountered today, most notably in poor populations (Quarantelli et al., 2006, p. 19; Bruce, 2001, p. 19). Leaders must be forthright to convince citizens to take the most reasonable course of action. Compassion for personal human concerns must be displayed for a warning to be heeded. Government must be prepared to follow through on its obligations to the population whether the event occurs or not.
Many predictions are valid attempts to do the right thing under uncertain circumstances. There are social and political reasons why a prediction is taken seriously or completely ignored. The media and public may give a baseless prediction credence where the scientific community does not. When the public, media and politicians become involved, a prediction becomes socially complex. Warnings must be delivered with social conditions in mind (Rodriguez et al., 2006b, p. 486).
Government and scientists have an obligation to learn from historical events and not repeat mistakes. Even false alarms do not diminish future response if the basis and reasons for the miss are understood and accepted by the public (Sorensen and Sorensen, 2006, pp. 196-7). Therefore, authorities should be willing to prepare their citizens without hesitation if a prediction is supported by science.
Science has an established process that a theory must follow to gain acceptance. Scientists should be discouraged from short-circuiting this process and appealing directly to the public. However, the scientific community must evolve its process to incorporate modern technology and the new media, in consideration of basic human needs and the varied responses to life-threatening events.
References

Barberi, F., M.S. Davis, R. Isaia, R. Nave, T. Riccia (2008). “Volcanic risk perception in the Vesuvius population.” Journal of Volcanology and Geothermal Research 172: 244-258.
Boin, A. and P. ‘t Hart (2006). “The Crisis Approach”. In Handbook of Disaster Research. H. Rodriguez, E. Quarantelli, R. R. Dynes. NY, Springer: 42-54.
Bolin, B. (2006). “Race, Class, Ethnicity, and Disaster Vulnerability”. In Handbook of Disaster Research. H. Rodriguez, E. Quarantelli, R. R. Dynes. NY, Springer:113-129.
Bruce, V. (2001). No Apparent Danger. NY, Harper Collins.
Hinman, L. M. (2005). “Hurricane Katrina: A ‘Natural’ Disaster?” San Diego Union-Tribune. San Diego, CA. Sept. 8, 2005.
Israely, J. (2009) “Italy’s Earthquake: Could Tragedy Have Been Avoided?” Time. Retrieved April 7, 2009 from http://www.time.com/time/world/article/0,8599,1889644,00.html.
Keys, A., H. Masterman-Smith, D. Cottle (2006). “The Political Economy of a Natural Disaster: The Boxing Day Tsunami, 2004.” Antipode 38(2): 195-204.
Mackey, R. (2009). “Earthquake Warning was Removed from Internet”. NY Times News Blog (The Lede) (06 April 2009). Retrieved April 6, 2009 from http://thelede.blogs.nytimes.com/2009/04/06/earthquake-warning-was-removed-from-internet.
Mileti, D. S. and C. Fitzpatrick (1993). The Great Earthquake Experiment. Boulder, CO, Westview Press.
Neild, B. and G. Deputato (2009). “Scientist: My quake prediction was ignored”. CNN.com (06 April 2009). Retrieved April 6, 2009 from http://www.cnn.com/2009/WORLD/europe/04/06/italy.quake.prediction.
Olsen, R. S. (1989). The Politics of Earthquake Prediction. Princeton, NJ, Princeton Univ Press.
Posner, R.A. (2004). Catastrophe: Risk and Response. Oxford Univ Press.
Quarantelli, E. L., P. Lagadec, A. Boin (2006). “A Heuristic Approach to Future Disasters and Crises: New, Old and In-Between Types”. In Handbook of Disaster Research. H. Rodriguez, E.L. Quarantelli, R. R. Dynes. NY, Springer:16-41.
Revkin, A. C. (2004). “How Scientists and Victims Watched Helplessly”. New York Times. December 31, 2004.
Rodriguez, H., E.L. Quarantelli, R. R. Dynes (2006a). Handbook of Disaster Research. NY, Springer.
Rodriguez, H., W. Diaz, J. Santos, B.E. Aguirre (2006b). “Communicating Risk and Uncertainty: Science, Technology, and Disasters at the Crossroads”. In Handbook of Disaster Research. H. Rodriguez, E. Quarantelli, R. R. Dynes. NY, Springer:476-488.
Select Bipartisan Committee to Investigate the Preparation for and Response to Hurricane Katrina (2006). “A Failure of Initiative”. Washington, D.C., US House of Representatives.
Sol, A. and H. Turan (2004). “The Ethics of Earthquake Prediction.” Science and Engineering Ethics. 10(4): 655-666.
Sorensen, J. H. and B. V. Sorensen (2006). “Community Processes: Warning and Evacuation” In Handbook of Disaster Research. H. Rodriguez, E. Quarantelli, R. R. Dynes. NY, Springer:183-199.
* I use the term prediction throughout this post since I am referring to the cases where a particular event was said to occur within a discrete time frame in a certain location. Please see the post just previous to this in which I distinguish forecasting from prediction.
Posted on March 28, 2011, in Culture, Natural Disasters, Pseudoscience, Science and Nature.