"Warnings" asks the most important question of our time: "when it comes to predicting disasters, who should we trust?" Unfortunately, Clarke and Eddy's answer is vague and untenable. One of their four key recommendations is to build a system to "sift the credible from the dubious, separating the signal from the noise." Sounds great, but how should we do that? They recommend:
Knowing when the data is rich and extensive enough to trust it and when it is too scant is difficult. If data is in short supply, don’t worry about probability... Instead, focus on possibility. Is it possible? Could it happen?
But this is rather shabby advice - an infinite number of disasters are possible. So we're left with no way to narrow down our focus and we're right back to where we started. So while the premise of the book is excellent, I'm forced to give this book a 3/5 because it offers an obviously non-viable solution.
There are many redeeming elements of the book though - particularly some of the case studies: "artificial intelligence, genetic engineering, sea level rise, pandemic disease, a new risk of nuclear winter, the Internet of Things, and asteroid impacts." The background on the Citigroup and Madoff financial shenanigans was particularly interesting. And Paul Ehrlich gets called out once again for being a false Cassandra with his "Population Bomb."
Yet I found nothing novel in the authors' philosophy of complexity/prediction. They trot out the obligatory Tetlock study and briefly discuss Isaiah Berlin's hedgehogs and foxes. Their most interesting Cassandra factor is "Scientific Reticence, a reluctance to make a judgment in the absence of perfect and complete data" - but they don't offer any reasonable alternative for making decisions in those scenarios. They pull in a quote from Michael Crichton (from his "State of Fear" phase) about the danger of "consensus" in science, but again, their framework offers nothing to guide our path in these situations. They did reference Harvard's Neustadt on the 1976 swine flu overreaction - his claim that the erroneous prediction was a result of "the expert community [having] a vested interest in strengthening the public health and vaccine system" is particularly noteworthy in our current state of climate hysteria.
I'll end this review with my favorite chapter epigraph:
One man with the truth constitutes a majority.
—SAINT MAXIMUS THE CONFESSOR
My highlights below.
CHAPTER 1 - Cassandra: From Myth to Reality
People die because we fail to distinguish the prophet from the charlatan.
Cassandra was a beautiful princess of Troy, cursed by the god Apollo. He gave her the ability to see impending doom, but the inability to persuade anyone to believe.
What the ancient Greeks called Cassandra behavior today’s social scientists sometimes refer to as sentinel intelligence or sentinel behavior, the ability to detect danger from warning signs before others see it.
Often, however, true experts in a field do their job and sound the warning in time, only to be ignored or given only an inadequate, token response. We began calling such episodes Cassandra Events.
First we must hear the forecast, then believe it, and finally act upon it.
Thus, this book will seek to answer these questions: How can we detect a real Cassandra among the myriad of pundits? What methods, if any, can be employed to better identify and listen to these prophetic warnings? Is there perhaps a way to distill the direst predictions from the surrounding noise and focus our attention on them? Or will Cassandra forever be condemned to weep as she watches her beloved city of Troy burn?
Who now among us may be accurately warning us of something we are ignoring, perhaps at our own peril? We look at contemporary individuals and their predictions, and examine the ongoing public reaction to them. Our cases here include artificial intelligence, genetic engineering, sea level rise, pandemic disease, a new risk of nuclear winter, the Internet of Things, and asteroid impacts.
In fact, prediction is something that academics have spent a lot of time studying and considering. The statistician Nate Silver has taken a highly quantitative approach to prediction, one that works for a certain class of event. The jurist Richard Posner examined the phenomenon of catastrophes in the years after 9/11. Psychologists like Dan Ariely and Tsachi Ein-Dor have probed the way our brains work (and don’t) through empirical observation and the study of warnings. Unquestionably one of the foundational works in this area, predictions within the social sciences, is Philip Tetlock’s Expert Political Judgment.
Overall, experts were terrible at forecasting the future, but Tetlock did something interesting: in addition to asking the experts what they thought about a particular scenario, he also examined how they thought. After evaluating their cognitive styles, Tetlock divided the experts into two categories, “hedgehogs” and “foxes,” after an essay written by the philosopher Isaiah Berlin.
The foxes consistently beat the hedgehogs in the accuracy of their predictions by a significant margin.
Still, maddeningly, even the foxes, considered as a group, were only ever able to approximate the accuracy of simple statistical models that extrapolated trends.
One reason that no one has focused on the people who warn is that many, probably most, prophets are wrong, giving the others a bad reputation. We have listened to forecasts of the world running out of fossil fuels, being unable to grow enough food for humanity, being crushed by a wildly proliferating population. Today, we look around and see a glut of hydrocarbons, new ways to increase crop yields, and some nations now worrying about negative population growth. Whom to believe?
CHAPTER 2 - The Spook: Invasion of Kuwait
As the National Intelligence Officer for warning, Charlie’s job was the institutional embodiment of national security lessons painfully learned. His position existed so that the nation would not be blindsided again as it had been by past crises, most notably at Pearl Harbor. There were five separate government investigations of why the United States had not seen the Japanese attack coming. A joint Senate-House probe resulted in a thirty-nine-volume report. It was not until 1962 that the definitive report on Pearl Harbor was written. And then, it was not written by a government committee, but by an academic named Roberta Wohlstetter. She concluded that the problem had not been a scarcity of information, but an overabundance of it.
The challenge the Warning Committee faced, Allen would remind them frequently, was to not be “the boy who cried wolf.”
If a nation were really going to war, certain key military units would do things that they would otherwise almost never do... In a training exercise, for example, division commanders might not move all of their ammunition from storage bunkers toward the front, but in a real war they would.
The Arab experts in the State Department agreed. “No Arab nation has ever gone to war with another Arab nation. It does not happen,” they told Dick Clarke (then the Assistant Secretary of State for Politico-Military Affairs), repeating the conventional wisdom. “Besides, it’s too hot. Temperatures in the Kuwaiti desert in late July are above 120 degrees Fahrenheit.” Most Arab leaders left the region that time of year for Switzerland or the south of France. Washington’s national security leadership was also packing up for cooler climes, like Maine. Washington always shut down in August, and August was just a few days away. No one wanted a crisis that would force them to cancel a vacation.
Then, in his quick, precise, but monotone style, he added, “And most of the units in attack position have stopped communicating by radio. They have gone into EMCON — emissions control — they’ve gone silent.” At that point, both Dick Clarke from State and Richard Haass from the White House looked up from their note taking, staring across the table at each other in shock. Both of them students of military history, they knew what it meant when an army went silent. Clarke silently mouthed his reaction to Haass, “Oh, fuck.”
The Iraqi military had completely occupied the entirety of Kuwait in a matter of hours.
This leads us to the third element in the Iraq-Kuwait case, a problem we will encounter time and again in this book: lack of precedent, what we will refer to as Initial Occurrence Syndrome.
Psychologists have a term for this phenomenon: the availability bias.
We consider Initial Occurrence Syndrome a special case of availability bias, one that is more difficult to overcome because of the complete lack of precedent that would allow our brains to estimate the likelihood of such an event occurring.
Personal experiences are often part of what drives Cassandras to make the choices they make. The late Harvard historian Samuel Huntington proposed that, in analyzing leaders, it is always good to know what world events and personal experiences shaped them while they were young and their world view was being formed.
When Charlie Allen said that the ammunition depot in some small Iraqi town was empty, he expected alarm bells to go off in the minds of his audience, but many in his audience had no idea whether such a fact was important or not. Across all fields we have studied, expert Cassandras often make appeals to the compelling nature of data they themselves collected or which they endorse as important. The data speaks for itself, the Cassandras believe. For non-experts, however, obscure data or variables that they have never heard of before do not speak to them in the same way, and they understand them no more than they would a sentence in a language they don’t know.
President George W. Bush then asked his new Director of National Intelligence, John Negroponte, what he thought of Charlie Allen’s report on the threat of insurgency in Iraq. Negroponte was blindsided and embarrassed. He was also irate at what he believed was an attempt to circumvent his authority and report unilaterally to the President. Negroponte soon took his revenge. During a bureaucratic shakeup of the intelligence community, Charlie Allen’s position was shifted from the CIA to the office led by Negroponte. Everyone assumed that Charlie would continue to lead it, but Negroponte said he wanted someone else.
In 2009, the position of National Intelligence Officer for warning was abolished. In explaining the decision, one official noted, “It’s the job of all the national intelligence officers to warn.” Perhaps so, but the explanation raises a question: who among them is actually listening for Cassandra?
CHAPTER 3 - The Rebuilder: Hurricane Katrina
Perhaps our Cassandras may learn from the old adage that you catch more flies with honey than with vinegar.
CHAPTER 4 - The Arabist: The Rise of ISIS
Under Osama bin Laden, al Qaeda never truly controlled a single sizable town. Under al Baghdadi, ISIS held numerous towns and cities, spread across what used to be the Syria-Iraq border. That demarcation was now essentially gone, made meaningless by the creation of the new nation, the Islamic State. It was equipped with U.S.-made artillery and armored vehicles stolen in Iraq, funded by oil sales from wells under its control, and staffed by thousands of fighters from throughout the Muslim world. That “state” was governed by (not just influenced by) a terrorist organization and had aggressive plans to expand its revolution elsewhere.
CHAPTER 5 - The Seismologist: Fukushima Nuclear Disaster
This reaction is what we will call Scientific Reticence, a reluctance to make a judgment in the absence of perfect and complete data.
This was the globe’s fourth strongest earthquake since modern civilization began recording measurements (around 1900), and the most powerful to hit Japan in recorded history. People reported feeling its tremors from over 1,000 miles away. The quake moved the entire island of Honshu about eight feet to the east, and 250 miles of the coastline experienced a vertical drop of two feet.
Two years after the accident, the UN Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) completed a study determining that the radiation exposure “did not cause any immediate health effects,” though other studies contended that up to thirty-three thousand people will eventually die from cancers caused by the Fukushima radiation. The World Bank found the earthquake and tsunami to be the most expensive natural disaster of all time, with an economic cost estimated at $235 billion. Of the more than nineteen thousand people who perished in the disaster, most drowned or were battered to death by the flood.
In this country that once depended on nuclear power to meet 30 percent of its energy needs, no nuclear reactors operated at all for a full two years after the disaster. Today only one of Japan’s fifty nuclear reactors continues to operate. Other countries took a cue from Japan’s disaster, most notably Germany, which has pledged to phase out its nuclear program entirely by 2022.
With few fossil fuel reserves in the ground, Japan has historically imported most of its energy. Relying on nuclear power gave Japan a way to limit the amount of money it had to spend externally.
We postulate that there were four main reasons: the willingness by TEPCO and the Japanese government to accept an unusually high level of risk; a myth of nuclear safety pushed by the Japanese government; regulatory capture, the collusion of power companies and nuclear regulators; and, as we have seen in each preceding case, Initial Occurrence Syndrome.
The safety myth helped to enable significant collusion among companies like TEPCO and nuclear regulators, the third major factor in explaining why they ignored the warnings: regulatory capture.
CHAPTER 6 - The Accountant: Madoff’s Ponzi Scheme
In 2008, Madoff was arrested and confessed to the FBI. After the bottom fell out, $65 billion vanished overnight. Thousands of investors were wiped out: they lost their retirement savings, were forced to sell their homes, flung overnight from security to desperation. Three people, if not more, died violently.
Harry listened to the strategy that Madoff was supposedly using and looked at the figures. “It doesn’t make any damn sense,” he told Casey. “This has to be a Ponzi scheme.” “It was obvious,” Harry told us when we met with him in Boston. “It literally took five minutes” to see that Madoff was faking his returns. The strategy that Madoff was supposedly using couldn’t produce those kinds of results.
“It wasn’t necessarily the returns that were the giveaway; it was the lack of risk and his consistency of returns.”
Tallying the funds they knew relied on Madoff as their money manager, Harry and his team came to estimate that Madoff had $3 billion to $6 billion under management. That was in 1999, when the biggest hedge funds in the world were managing about $2 billion. The scale raised a big, glaring red flag. A trading operation that large would leave footprints in the market, humongous footprints all over the place. If Madoff was managing as much as $6 billion, the number of puts and calls he would need to hedge his portfolio would be staggering. Yet there was no sign of such a giant player stalking the options markets, throwing around at least three times as much money as the next-largest hedge funds in the world. In fact, the number of options he would need to be buying and selling, Harry quickly figured, would far exceed the total number of options available in the world.
Harry’s European tour also gave him a glimpse of a more menacing secret: a lot of these funds that were investing millions with Madoff were offshore funds. “The best people in an offshore feeder fund are going to be tax cheats, and it’s going to go quickly downhill from there to organized crime,” Harry explained to us, amid the civilized tinkle of cutlery and crystal at the Langham. “So I realized that Bernie was stealing from the Russians and the drug cartels.” Which meant that the man who was gunning to take down Madoff’s money machine suddenly had enemies a lot more dangerous than he had ever imagined. That’s when Harry started carrying that snub-nosed Smith & Wesson .38.
Ezra Merkin, the wise rabbi of Wall Street, had earned $470 million in fees by sending $2.4 billion of his clients’ money to Madoff.
Elie Wiesel’s Foundation for Humanity lost $15 million, everything it had; Wiesel and his wife lost millions more personally. When it was all said and done, the collapse of Madoff’s Ponzi scheme brought crippling financial losses to more than 13,500 victims.
But what explains the SEC’s extraordinary willingness to give Madoff the benefit of several doubts and dismiss specific, repeated, and compelling warnings from a well-informed, persistent Cassandra? “The government agency charged with being the industry’s watchdog was deaf, blind, and mute,” Harry wrote after the dust settled.
During his years of chasing Madoff, Harry became convinced that the SEC “had been captured by the private industry it was created to regulate.”
One of the biggest problems was what we will call Complexity Mismatch. As Harry explains, the SEC was employing the wrong kind of people: its staff was dominated by lawyers, not finance professionals. They didn’t have the training to understand complex financial strategies, options, and derivatives. Instead of grizzled old industry veterans who had seen it all and learned the tricks, the investigators were mainly young lawyers whose keenest interest in the financial industry was to eventually get a job there.
The SEC staff made a fundamental error of human judgment by assessing the messenger and not the underlying message. Their distaste for Harry’s irreverent style made them ignore the validity and urgency of his warning.
A little over a month after his testimony, he met with the new head of the agency in its headquarters in Washington. “I showed her the statistics on whistleblowers versus law enforcement,” he explained to us. “It was basically, ‘Look, whistleblowers are twenty-three times more effective at detecting fraud than law enforcement.’ And she said, ‘We need a whistleblower program.’ And shortly thereafter we had a whistleblower program, thanks to her efforts and leadership.”
CHAPTER 7 - The Inspector: Mine Disaster
The fatality rate in the coal mining industry climbed steadily through the turn of the century, and then held steady at around 34 deaths for every 10,000 miners per year (today the rate stands at about 1.4).
The greatest tragedy occurred at the Monongah Mine in West Virginia, where 362 men perished in a massive explosion, likely the result of a spark or open flame lamp igniting a cloud of methane or a buildup of coal dust. It remains the worst mining disaster in U.S. history.
The Coal Act for the first time gave the federal government the ability to really enforce the law for the protection of miners. It established new health and safety standards for the nation’s coal mines with substantial penalties for violations, and it required federal inspectors to inspect underground coal mines four times per year. But perhaps most significant, it enshrined in law the mantra that for too long coal companies had failed to live by: “The first priority and concern of all in the coal mining industry must be the health and safety of its most precious resource—the miner.” In two years, the annual fatality rate in the coal industry fell below 10 deaths per 10,000 miners. It has never exceeded that threshold since.
In 2007, coal mining was statistically no longer the most dangerous industry in the nation, having been surpassed by agriculture, forestry, and fishing.
The number of U.S. coal mine disasters dropped from 143 in the first decade of the twentieth century to 5 in the first decade of the twenty-first, while the amount of coal extracted annually more than doubled to over a billion tons. Federal regulation has played a critical role in making the industry safer and drastically reducing the frequency of disastrous accidents.
With institutional refusal, Cassandra faces one of the most challenging obstacles, because no amount of evidence can result in an effective response.
CHAPTER 8 - The Market Analyst: The 2008 Recession
Though the practice was widespread and more than $1.4 trillion in mortgages had been bundled into CDOs from 2004 to 2007, only a tiny handful of people realized that these CDOs were ticking time bombs placed under the largest banks, the largest businesses in the world. Warren Buffett called CDOs and other such financial products “financial weapons of mass destruction, carrying dangers that, while now latent, are potentially lethal.”
Whitney echoes our other Cassandras in her dedication to the data, her consistent questioning of her analysis, and her core skepticism.
Meredith told us that she thinks many analysts fail to be skeptical enough, that they fall for fast talk. A common error is that we believe that “people who talk in a way you don’t understand know exactly what they are talking about, when actually, people who talk in a way you do understand know exactly what they are talking about.”
Citigroup was not only the largest bank in the United States but the largest publicly traded company and bank in the world as measured by total assets, with 357,000 employees and the world’s largest financial services network, spanning 140 countries with approximately sixteen thousand offices worldwide. The company had been formed in October 1998 by one of the largest mergers in history, combining Citicorp bank with the financial conglomerate Travelers Group.
When we asked her how she determines whether an analyst is a Cassandra or a Chicken Little, Whitney replied, “A Chicken Little makes the same call over and over again. I make the call and move on.”
CHAPTER 9 - The Cassandra Coefficient
One man with the truth constitutes a majority. —SAINT MAXIMUS THE CONFESSOR
There are many systems and techniques being used today by governments, financial advisors, investors, and futurists to look over the horizon to detect disasters. We are unaware, however, of anyone who is using a technique to seek out possible Cassandras and vet them and their warnings against the qualities and experiences of past Cassandras.
Our experience with senior decision makers in governments and corporations, however, suggests to us that such leaders prefer something they can unpack, understand, and apply themselves.
In our estimation, no obstacle to action is bigger than Initial Occurrence Syndrome, yet it is the easiest objection to logically assail.
Physician and author Michael Crichton had this to say about the danger of consensus in science: Consensus science [is] an extremely pernicious development that ought to be stopped cold in its tracks... it is a way to avoid debate by claiming that the matter is already settled.... Let’s be clear, the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world.... The greatest scientists in history are great precisely because they broke with consensus.
A prospective risk going from 10 million deaths to 100 million deaths does not multiply our determination to stop it by 10. It adds one more zero on paper for our eyes to glaze over, an effect so small that one must usually jump several orders of magnitude to detect the difference experimentally.
Richard Farson notes, “The most important discoveries, the greatest art, and the best management decisions come from taking a fresh look at what people take for granted or cannot see precisely because it is too obvious.” Farson calls this the “Invisible Obvious.”
Frequently, no one wants to own an issue that’s about to become a disaster. This reluctance creates a “bystander effect,” wherein observers of the problem feel no responsibility to act. Increasingly, complex issues are multidisciplinary, making it unclear where the responsibility lies. New complex problems or “issues on the seams” are more likely to produce ambiguity about who is in charge of dealing with them.
If the individual or collective leadership at the top is overly cautious or lacks creativity, it will likely turn a deaf ear to Cassandra.
Usually Cassandras have not given a dire warning before, or if they have, they were clearly proven to be right. The Cassandras that we have selected are not people who issue so many warnings that they just happen to get one right.
QUESTIONERS: Most Cassandras tend to disbelieve anything that has not been empirically derived and repeatedly tested. They also tend to doubt their own work initially, especially when it predicts disaster. This characteristic is more than just a belief in the scientific method. Rather, they challenge what is generally accepted until it is proven to their satisfaction. They are the philosophical descendants of Pyrrho of Elis, a philosopher in ancient Greece who accompanied Alexander the Great to India. There Pyrrho learned from Indian philosophers who challenged everything. Pyrrho’s teachings influenced another Greek philosopher who taught that all beliefs and assumptions should be challenged, that doubt, skepticism, and disbelief are healthy. This later philosopher was Sextus Empiricus, and his name is forever attached in our minds to the empirical method: doubt until proven by data, by objectively true, observable facts. Many Cassandras seem to have incorporated Albert Einstein’s belief that “unthinking respect for authority is the greatest enemy of truth.” When the authority figures to whom they report their warning reject their analysis for what the Cassandras believe are non-evidence-based reasons, our warners begin to lose respect for the decision makers. They often are unable to hide that disrespect well.
SCIENTIFIC RETICENCE: For some issues, a high scientific standard of proof cannot be met in time to act. Some events cannot be accurately created, simulated, and repeated in the laboratory. To avoid the disaster, it may be necessary to abandon the normal protocol of waiting for all the evidence to be in and act instead on incomplete data and early indications. Scientists and decision makers in denial may argue to wait for the final results, for additional studies. Sometimes, though, waiting can prove fatal.
The real problem comes when the potential Cassandra is an established technical expert who is making the case with data. There are numerous examples of people who looked like true Cassandras but were simply wrong.
THE POPULATION BOMB: Stanford biologist Paul Ehrlich came to the public’s attention in 1968 when he published and vigorously promoted his book The Population Bomb.
Ehrlich underestimated the ability of agronomists and others to increase the world’s population-carrying capacity by improving crop yields and distribution systems.
Ehrlich was not an expert in demography or agronomy. His analytical failures seem mainly attributable to not taking into account feedback loops, i.e. not considering the role that could be played by elements of the system adjusting to address the problem.
Y2K: In 1984, Jerome and Marilyn Murray published a book, Computers in Crisis, which predicted that when 1999 rolled over into the year 2000, many software programs would malfunction.
SWINE FLU: In January 1976, thirteen U.S. Army personnel died of a new strain of flu. Analysis of the flu determined that it was related to the great Spanish flu epidemic of 1918, which killed millions in North America and Europe, including five hundred thousand people in the United States. Experts from the Center for Disease Control and the National Institutes of Health determined that there was a possibility that the flu would reemerge in the next winter and kill millions.
Although the scientific and public-policy analyses involved in the decision were flawed, as amply demonstrated by Harvard historian Richard Neustadt (The Epidemic That Never Was), the issue, framed as it was, gave President Ford little choice.
Neustadt took other lessons from his case study, regarding the interaction between the technical expert community and the policy community. There was not one warner, but rather the assembled virology and public health experts used by the federal government. Neustadt suggests that there were doubts in the expert community, but they were never made explicit to, or discovered by, the policy community. The expert community had a vested interest in strengthening the public health and vaccine system.
CHAPTER 10 - The Computer Scientist: Artificial Intelligence
The real problem of humanity... we have Paleolithic emotions; medieval institutions; and godlike technology. —E. O. WILSON
Eliezer has dedicated his life to preventing artificial intelligence from destroying humankind. Tall with a thick, dark beard that, along with wire-rim glasses, forms a frame around his large, oval face, he is a thirty-seven-year-old autodidact who dropped out of school after eighth grade. Married without children, Eliezer grew up in Chicago and now lives and works in Berkeley, California, at an organization he founded, the Machine Intelligence Research Institute (MIRI).
Eliezer told us that humanity’s best hope is to perhaps create one highly funded, highly secure, multilateral effort to develop a friendly superintelligence with himself (or perhaps another futurist he approves of) at the helm. The work of this massive global Manhattan Project would be explicitly “for the benefit of humanity internationally.” It simultaneously would ban, starve, or simply outpace other, less-well-thought-out efforts to develop superintelligence. Once created, this friendly AI would be unleashed to attack and destroy any competing efforts, ensuring that the only superintelligence in existence would help, not destroy, humankind.
Ng believes that resources and time would be better spent on more pragmatic realities, such as the job displacement that he believes weak AI will cause.
Large-scale unemployment in the current era is no less disruptive and dangerous. The rise of radical Islam throughout the Middle East, the rise of narco-terror in Latin America, and spikes in inner-city gun violence in the United States all have strong correlations with the very low employment rates of young men in those areas.
Yudkowsky’s suggested solution, a global Manhattan Project to develop safe AI, would be one of the most incredibly complicated multilateral bureaucratic solutions we could imagine.
CHAPTER 11 - The Journalist: Pandemic Disease
As happens more often than those in power want to admit, government concern was partly catalyzed by Hollywood. In March 1995, the same month as the Ebola emergence in Zaire, Dustin Hoffman, Rene Russo, and Morgan Freeman starred in the blockbuster Outbreak.
The Grim Reaper’s favorite disguise is disease. Disease makes other disasters look trivial. More human lives have been ended by bacteria and viruses than by every other kind of catastrophe combined, its constant presence masking its destruction.
Beginning in 1918 and lasting less than three years, the Spanish flu epidemic killed up to 5 percent of the earth’s population. Death often came quickly; the infected often felt fine in the morning but were dead before the next day’s light. The disease was so deadly that it burned itself out: it killed victims so fast that they didn’t have time to infect many others. Around 30 percent of the world population caught it, and fifty million people died, almost seven hundred thousand in the U.S. alone. By comparison, World War I killed fewer than nine million people; World War II, fifty-five million.
What the science does say is not reassuring. It says that this deadly flu is nearly identical to H1N1, commonly referred to as swine flu.
In fact, Spanish flu infected so many in 1918 that it is the genetic Adam and Eve of nearly every modern pandemic flu strain. These variants are so genetically similar that the devastating virulence of Spanish flu could return from a change in as few as three different proteins encoded by its RNA.
Jeffery Taubenberger, with his colleague Ann Reid, was first to sequence the Spanish flu. He heads the effort at the National Institutes of Health to combat our next flu pandemic.
He believes decision makers have a hard time accepting new costly solutions after epidemiologists’ expensive warnings that H5N1 was going to strike in 2009, when, in fact, it was an H1N1 epidemic that struck and killed three hundred thousand worldwide.
CHAPTER 12 - The Climate Scientist: Sea-Level Rise
When we asked people for names of potential future Cassandras now among us, one name kept coming up: James Hansen.
Hansen is currently the director of the Climate Science, Awareness, and Solutions program at the Earth Institute of Columbia University. Before his time with the Earth Institute, he was the director of the NASA Goddard Institute for Space Studies (GISS) from 1981 to 2013.
However, the year 1981 marked the beginning of a series of accurate predictions that remain “essentially unbroken” today.
In the run-up to the hearing, one of Hansen’s colleagues at NASA Headquarters remarked that no respectable scientist would attribute the decade’s warming to the greenhouse effect. Hansen remarked, “I don’t know if he’s respectable or not, but I know someone who is just about to make that statement.”
No one had really listened while experts toiled for over a decade on the likely magnitude of the problem. Kerr added, “Then came Hansen. Now greenhouse scientists have the attention they have wanted, but for reasons they think unsound.”
The IPCC disagrees significantly with Hansen about both the rate at which and the level to which the water will rise.
Meanwhile, Hansen is still receiving the same kind of criticism that he heard in the 1980s. The original version of his recent paper, the one he released for discussion, got more than just discussion. Kevin Trenberth of the National Center for Atmospheric Research, former lead author for the IPCC, said, “The new Hansen et al. study is provocative and intriguing but rife with speculation and ‘what if’ scenarios. It has many conjectures and huge extrapolations based on quite flimsy evidence... it is not a document that can be used for setting policy for anthropogenic climate change, although it pretends to be so.... There are too many assumptions and extrapolations for anything to be taken seriously.”
About 65 percent of the world’s cities with populations greater than five million are located in low-lying coastal zones. Such zones constitute 2 percent of the world’s land area, but contain at least 10 percent of the global population.
In 2005, the total value of assets exposed to possible flooding from sea-level rise in such coastal cities was about $3 trillion, corresponding to about 5 percent of global GDP.
By 2070, assuming a rise of just 0.5 meters with respect to today’s values, asset exposure grows to at least 9 percent of global GDP.
There is no real understanding of what it would cost to move millions of people and build new cities, no determination of who would pay for it and how, no projection of the taxes required, no understanding of the effect on gross domestic product.
His idea is to create the incentive the public needs to get behind a more serious push toward renewables, which would make them more economically competitive with fossil fuels. He’s bullish that it would raise GNP and create millions of new jobs.
We think Jim Hansen and his warnings about sea-level rise have a high Cassandra Coefficient.
CHAPTER 13 - The Weatherman: Nuclear Ice Age
Once again, the American military has a saying that captures the essence of the current relationship between India and Pakistan: “ready, fire, aim.”
One thing that most experts agree on is that Pakistan is producing nuclear weapons faster than any other nation on Earth. Four hundred warheads would mean that Pakistan’s nuclear inventory would surpass not only India’s, but also France’s, the United Kingdom’s, and China’s. Only the United States and Russia would have more.
The reality is that no one really knows what will happen when one side starts using nuclear weapons against another side that is also nuclear armed.
The scientists said that it could mean the end of humanity altogether. In 1983, Sagan pulled together a group of twenty-eight scientists, including both Americans and Russians, for the Conference on the Long-Term Worldwide Biological Consequences of Nuclear War, held in Washington. The results of the meeting were published in 1984 in The Cold and the Dark: The World after Nuclear War. The book received widespread attention in the media and, as we later learned, at the highest levels of government in Washington and Moscow.
Because of the climatic effects of numerous large firestorms, there would be the gradual death of most, if not all, humans. It was a chilling prediction, and it came from scientists, from experts. It became known as the nuclear winter theory.
Some scientists disagreed, claiming that there were flaws in TTAPS’s calculations, or at least that they relied on unproven assumptions. Sagan responded that it was impossible fully to test the hypothesis, as science normally demanded, without having a nuclear war.
They summarize what the leaders of ancient Troy told Cassandra: “We can’t be sure, and if we can’t be sure, we are going to ignore it.”
Alan Robock remained concerned. In 2007, he recalculated the effects of a nuclear war, using vastly more sophisticated global climatic, atmospheric, and ocean data and simulations than had been available in the 1980s. He confirmed the nuclear winter theory. The effects of nuclear war had not been exaggerated in the slightest, but, in fact, were worse than originally predicted.
CHAPTER 14 - The Engineer: The Internet of Everything
For almost two decades, Joe Weiss had been saying that could happen, yet American electric power companies have refused to believe him. Off the record, their spokesmen call Joe an alarmist, an exaggerator, a spreader of FUD (fear, uncertainty, and doubt).
All generators connected to a power grid must spin at exactly the same speed, even if they are hundreds of miles apart. This is referred to as the electricity being “in phase.” If they are not spinning at the right rate, the result, Weiss says, can be “devastating.” This is because the grid briefly turns generators into electric motors if they are out of phase with each other. The generator undergoes tremendous mechanical stress in the few microseconds that the grid forces its rotation either faster or slower.
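To see how quickly “out of phase” happens, consider a toy calculation; the 0.05 Hz offset below is an illustrative number of ours, not a figure from Weiss:

```python
def phase_drift_degrees(freq_offset_hz: float, seconds: float) -> float:
    """Phase angle (in degrees) accumulated between two generators whose
    rotation rates differ by freq_offset_hz (one full turn = 360 degrees)."""
    return 360.0 * freq_offset_hz * seconds

# A machine running only 0.05 Hz off a 60 Hz grid accumulates about a
# quarter-turn (~90 degrees) of phase error in just five seconds.
print(phase_drift_degrees(0.05, 5.0))
```

Even a tiny frequency error, left uncorrected, puts a machine dangerously out of step within seconds, which is why grid protection acts so fast.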
Not far from San Francisco International Airport, San Bruno is a middle-class residential suburb, not an industrial town. Yet under the ground in San Bruno was a gas pipeline, controlled by SCADA software that used the Internet as its communications backbone. On September 9, 2010, a short circuit caused the operations room to see a valve as open when it had actually closed, spiking the readings coming from pipeline pressure sensors in different parts of the system. Unbeknownst to the families returning home from ballet and soccer practice, technicians were frantically trying to isolate and fix the problem. At 6:11 p.m., a corroded segment of pipe ruptured in a gas-fueled fireball. The resulting explosion ripped apart the neighborhood, most of whose residents had no idea they lived near a pipeline. Eight people died. Seventeen homes burned down. The utility, PG&E, was hit with a $1.6 billion fine.
CHAPTER 15 - The Planetary Defender: Meteor Strike
By the 1980s, with extensive test data to supplement what had been recorded over five decades earlier, scientists and engineers were able to calculate what happened that last night of June in 1908, and it terrified them. The Siberian blast was, they believe, caused by an asteroid ninety meters in diameter that struck the Earth’s thick atmosphere at high speed, superheated by the friction of its descent until it exploded about twenty thousand feet above the surface. At that height, lower than today’s transcontinental flights, the asteroid vaporized because of overheating and, in so doing, let loose a blast wave equivalent to five to fifteen megatons of TNT. Even at the lower end of that range, it would have been over three hundred times larger than the nuclear bomb dropped on Hiroshima. The downward pressure from the blast completely devastated eight hundred square miles, about twenty times the size of Washington, D.C. It was one of the largest explosions on Earth ever recorded by humans.
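The Hiroshima comparison is easy to verify with a back-of-the-envelope check, assuming the commonly cited ~15-kiloton Hiroshima yield (a figure the passage does not state):

```python
HIROSHIMA_KT = 15.0          # commonly cited yield, in kilotons of TNT
tunguska_low_kt = 5_000.0    # low end of the 5-15 megaton estimate
tunguska_high_kt = 15_000.0  # high end

print(tunguska_low_kt / HIROSHIMA_KT)   # ~333: "over three hundred times larger"
print(tunguska_high_kt / HIROSHIMA_KT)  # ~1000 at the upper end
```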
Like so many tourists who came to the site, he asked, “So where’s the meteor?” There were little bits of meteorite material around, but no big rock. Like the object that had exploded over Siberia in 1908, it had been vaporized by the intense heat created by the friction of its entry through the atmosphere. The crater was caused not by the impact of a meteor digging itself into the surface, but by the blast wave from the atmospheric explosion.
The belief that the dinosaurs were wiped out by an asteroid was not a widely accepted scientific theory until relatively recently. It was first argued persuasively in a scientific journal in 1981, but the location of the impact was not known. That same year, geologists who had been searching for oil in Mexico released data suggesting that a massive impact crater existed hidden under the surface of the Yucatán Peninsula. Sixty-six million years of weathering and continental drift had completely covered it up.
The big rock that sent hundreds of Russians to the emergency rooms had snuck up on Earth, coming out of a part of the sky where the astronomers were blinded by the Sun. When it became clear that there had been an asteroid “sneak attack,” it grabbed the attention not only of astronomers, but government leaders in Russia, the United States, the United Nations, and elsewhere. It proved that we do not know enough about the location and trajectories of all local asteroids to be confident that we can always predict an impact.
Drawing on an unpublished report of a 1981 NASA workshop at Snowmass, Colorado, Morrison suggested that the probability of a civilization-extinction-level impact (like Chicxulub) was one in three hundred thousand a year. As unlikely as that sounds, he wrote, for each individual it is ten times more likely than dying from a tornado, and we do have tornado warning systems.
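Morrison’s annual figure translates into a rough lifetime risk; the 75-year lifespan below is our illustrative assumption:

```python
annual_p = 1.0 / 300_000  # Morrison's per-year probability of a Chicxulub-scale impact
years = 75                # assumed human lifespan, for illustration

# Probability of at least one such impact within a single lifetime
lifetime_p = 1.0 - (1.0 - annual_p) ** years
print(lifetime_p)  # ~0.00025, i.e. roughly 1 in 4,000
```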
CHAPTER 16 - The Biologist: Gene Editing
By then, Professor Doudna had taken a position at Yale, where she strengthened her reputation as a focused, creative, and eminently brilliant researcher with keen attention to detail. In 2002, she became a professor at the University of California at Berkeley, continuing her work by investigating the RNA of viruses.
Then, in 2012, the revolution began. Professor Doudna had been approached by Dr. Emmanuelle Charpentier at a microbiology conference the year before. Dr. Charpentier, a French scientist working at Umeå University in Sweden, was studying the genome of flesh-eating bacteria. She and her colleagues had been investigating the process by which bacterial CRISPR sequences, coupled to a CRISPR-associated (Cas) protein, protect the cell from viral invasion. Charpentier hoped to recruit the well-known structural biologist to their effort and tease out the structure of the CRISPR/Cas complex. Dr. Doudna agreed.
The revolutionary aspect of CRISPR/Cas9 is that scientists can now potentially isolate and edit genes in a single generation that would have traditionally taken scores of generations to change.
Already, Chinese scientists have used the technique in goat embryos to delete genes that suppress hair and muscle growth, resulting in animals with longer hair and more muscle, ostensibly better at producing both meat and wool. In fact, dozens of Chinese labs have plunged headlong into CRISPR/Cas9 experimentation in fields ranging from animals to agriculture to biomedicine to human transformation.
As the group of biologists gathered in Napa, California, in late January 2015, several couldn’t help but recognize the similarities to a conference that had taken place almost exactly forty years earlier. In February 1975, about 150 leading professionals gathered at the Asilomar Conference Grounds, overlooking the Pacific Ocean on California’s Monterey Peninsula. The meeting had been called to discuss a recent breakthrough discovery that allowed scientists to artificially manipulate the genome.
Dr. Berg explained to us in his Stanford office, where he still serves as a professor emeritus, that “what Asilomar accomplished was establishing trust between the public and the science.”
CHAPTER 17 - Can You Hear Her Now?
Some readers will, by now, no doubt accuse us of being obsessed with the unlikely, purveyors of doom and gloom. Actually, we are still both very optimistic people with great faith in science and engineering, confidence in the potential of leaders to effect change, and hope that the future will be better than the past. Nonetheless, that brighter, better future will not occur by itself. Progress is not inevitable.
A desire to ensure progress leads us to the critical recommendation of this book: institutionalizing systems to deal with Cassandras, within universities, corporations, governments, and international bodies. Such systems must expressly encompass four functions.
- First, they must scan the horizon, alert for new warnings that might otherwise go undetected.
- Second, they must sift the credible from the dubious, separating the signal from the noise.
- Third, the systems should employ a consistent methodology to evaluate possible courses of action in response to the warning.
- And fourth, they must include a strategy that effectively implements the desired response.
Unlike those who favor statistical analysis or training people to become good predictors, and quite unlike those who think we simply have to accept that “stuff happens,” we think we can discriminate among warnings by first focusing on the people giving them and applying our Cassandra Coefficient. Do they have the characteristics of our Cassandras? Are they proven experts in the field? Do they have professionally developed data? Do they lack a record of past incorrect warnings? Is there an absence of personal monetary or other gain, e.g., working for an industry that would benefit?
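The four screening questions amount to a checklist. Purely as an illustration (the equal weighting and parameter names below are our invention; the book gives no numeric formula), it could be sketched as:

```python
def cassandra_screen(is_proven_expert: bool,
                     has_professional_data: bool,
                     no_past_false_warnings: bool,
                     no_personal_stake: bool) -> int:
    """Toy tally of how many of the four screening questions a warner passes."""
    return sum([is_proven_expert, has_professional_data,
                no_past_false_warnings, no_personal_stake])

# A proven expert with solid data but a financial stake passes three of four screens.
print(cassandra_screen(True, True, True, False))  # 3
```

A tally like this only ranks warners; it says nothing about the warning itself, which is where the authors turn next.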
Knowing when the data is rich and extensive enough to trust it and when it is too scant is difficult. If data is in short supply, don’t worry about probability. Probability is not likely to be a useful measure when dealing with the risk of a disaster that has never occurred before, or only happened at great intervals.
Instead, focus on possibility. Is it possible? Could it happen? What would have to occur to make it happen? What is there to stop it? How confident are we that those preventative systems would work? How well have we tested them?
Surprising people, presenting both a novel problem and the bill to fix it, is a likely recipe for rejection. Allowing (or working with) experts, the media, and legislators to expose the problem, and allowing the public to demand solutions, is frequently a better choice than letting the government be first to identify the problem and simultaneously propose the solution.