
The War on Science: Who's Waging It, Why It Matters, What We Can Do About It

Goodreads: 3 stars

"The War on Science" does a good job of contextualizing the current science/politics debate but is unevenly written and methinks it smacks a bit too much of half-baked undergraduate political ranting. Otto's book gave me a better understanding of recent anti-science political history and helped me frame the current issues, but his grandiose Malthusian rhetoric and interminable list of political recommendations at the end of the book made me doubt his reliability. There are some good ideas in here, but it takes some work to sift them out from the fluff.

Otto's thesis goes something like this:

  • Scientists have lost touch with the public since Vannevar Bush's massive program of federal science funding made science a fear-driven national security issue and decoupled it from popular support. This has led to a serious decline in science outreach.
  • As a result, the average voter has little understanding or appreciation of science. The specialization and complexity of modern science don't make this any easier.
  • The post-modern academic left set the stage for the current anti-science movement by arguing that everything is a matter of perspective and there is no objective truth. As a society, we no longer have a standard for the "common authority of evidence".
  • Modern journalists do a massive disservice to science by presenting "both sides" of an issue with equal weight because they don't believe in objective facts.
  • The evangelical right uses this as a philosophical basis for resisting scientific advances that oppose their beliefs (evolution, reproductive controls, etc).
  • Big corporations manipulate the post-modern media (à la Bernays) to build popular support among the evangelical right for anti-science, pro-business objectives.

I was surprised that Otto tackled the academic left head-on. During my time in college, anti-science was discussed as an exclusively right-wing issue, so it was refreshing to see Otto take on leftist anti-science bugbears (anti-fluoridation, anti-vaccination, and anti-GMO movements). But Otto reveals his own leftist Malthusian tendencies by favorably citing Paul Ehrlich (of "The Population Bomb" notoriety) and by indulging in evidence-free rhetoric such as:

[Science] has enabled us to increase our population and our environmental impact beyond the capacity of our one small planet to support us... Population plus individualism plus technology may be our ultimate undoing.

Otto's epistemology is unclear as well. He dismisses Kuhn ("Kuhn's error was one of overextension — to intertwine the politics of science and the discovery of truth and call them one") yet seems to hold a rather Popperian philosophy of science ("If there's no possible way to prove the hypothesis is false, then we aren't really doing science"). Otto also claims that climate science is the most important scientific issue of our time, despite its shortage of cleanly falsifiable claims. He ignores the "theory-ladenness" of observation and the fact that modern philosophers of science have thoroughly refuted Popper's ideas. Otto leaves us with no reliable way to demarcate science from non-science, which is a bit of a problem considering that his whole book rests on how important "science" is.

The relationship between science, religion, and government is the source of Otto's most interesting questions. Regarding the increasingly complicated and arcane nature of highly-specialized science, he asks:

scientific knowledge now plays a major role in most public policy challenges, and is the main arbiter and protector of individual freedom and social justice. A question arises: how best to bridge the gap between the voter and science so that democracy can be preserved?

Otto's answer seems to be "more outreach," but I am skeptical that the general public has the interest or patience for it. He repeatedly claims that science is "anti-authoritarian" in its search for truth, but this runs directly contrary to Yuval Harari's claim in Homo Deus that "Science is interested above all in power... science and religion prefer order and power over truth." I found Harari more convincing than Otto, but it still feels like my understanding of these dynamics is fundamentally incomplete. Time for some more reading...

My highlights are below.


The United States itself was founded by people, like Thomas Paine, Thomas Jefferson, and Benjamin Franklin, for whom science and enlightenment were paramount. As a nation the United States has benefited more than any other because it became a center for technological innovation and progress.

there are forces at work coming from many directions that serve to undermine the simple proposition that public policy should be based on rational reflections on sound empirical evidence.

Shawn is not a professional scientist, but he is the epitome of a responsible citizen scientist. We met and came together with several other odd bedfellows in 2007 to form an organization called ScienceDebate 2008 because we felt that the key issues that would really determine the success of the next presidential administration were being ignored in public debate, and we thought it would be worthwhile trying to create an opportunity for the presidential candidates to discuss these issues in a national public forum.

This war is not an ordinary war. It is not a conflict for markets or territories. It is a desperate struggle for the possession of the souls of men. —Harold Ickes, May 18, 1941

PART I Democracy’s Science Problem


Wherever the people are well informed they can be trusted with their own government; that whenever things get so far wrong as to attract their notice, they may be relied on to set them to rights. —Thomas Jefferson, January 8, 1789

At the same time, science and technology have come to affect every aspect of life on the planet. There is a phase change going on in the scientific revolution: a shifting from one state to another, as from a solid to a liquid. There is a sudden, quantitative expansion of the number of scientists and engineers around the globe, coupled with a sudden qualitative expansion of their ability to collaborate with each other over the Internet.

Without a better way of incorporating science into our policymaking, democracy may ultimately fail its promise. We now have a population that we cannot support without destroying our environment — and the developing world is advancing by using the same model of unsustainable development. We are 100 percent dependent on science and technology to find a solution.

This pullback is affecting leading and emerging economies alike. The name of the radical pan-national Islamist group Boko Haram roughly translates as “Western education is forbidden.” The Islamic drive for al-asala, or authenticity, leads some fundamentalist Muslims to reject Western science in favor of Quranic instructions, says Islamic scholar Bassam Tibi. But radical Islam is not alone in this rejection. The vanguard of the retreat is in the Western democracies, where Christian fundamentalists; postmodernist academics, teachers, and journalists; liberal new age purists; and industry front groups all attack science for their own reasons.

Politically, the war on science is coming from both left and right. But the antiscience of those on the right — a coalition of fundamentalist churches and corporations largely in the resource extraction, petrochemical and agrochemical industries — has far more dangerous public-policy implications because it’s about forestalling policy based on evidence to protect destructive business models. As well, the right generally has far more money with which to spread disinformation and attack science on a host of issues. Those on the political left often unwittingly abet the right’s antiscience efforts by arguing that truth is relative, harboring suspicions about hidden dangers to health and the environment that are not supported by evidence, and selectively rejecting science that doesn’t affirm their health-food and back-to-Eden value system. While they are right that there are serious environmental and health threats afoot from poorly regulated industries, they undermine their credibility when they extend these suspicions to scientifically unsupported ideas like vaccines cause autism, cell phones cause brain cancer, or genetically modified crops are unsafe to eat.

Many journalists believe there is no such thing as objectivity, rendering otherwise brilliant minds unable to discern between objective knowledge developed from years of scientific investigation, on the one hand, and a well-argued opinion made by an impassioned and charismatic advocate on the other.

Can it be that science has simply advanced too far? That the problems are too big or too complex, or that knowledge is now too inaccessible to normal citizens to make good decisions—decisions in their own best interest? In a world dominated by science that requires extensive education to fully grasp, can democracy still prosper, or will the invisible hand finally fall idle? Are the people still sufficiently well informed to be trusted with their own government?

Lawyers are trained to start with a conclusion, discover evidence to support that conclusion, and craft it into a compelling narrative to win the argument.

That is the opposite of the approach of science, which starts with observation, accumulates evidence from studying nature, and forms a conclusion based on what the preponderance of the evidence as a whole suggests.

The George W. Bush White House had become notoriously antiscience, which legitimized science denial in a way the world is still dealing with. Bush appointed ideologues to key agency posts throughout the federal government and empowered them to hold back or alter scientific reports with which they disagreed. This represented a marked change from the Republican Party of just ten years prior.

And business-friendly FDA administrators failed to remove the arthritis drug rofecoxib (Vioxx) from the market even after it became apparent that it was causing heart attacks, resulting in more than fifty thousand American deaths — nearly as many as the number of American soldiers lost in Vietnam.

By the 2008 election, antiscience views had become entrenched as mainstream political planks of the Republican Party. The focus was on three main areas: denying the science of reproductive medicine, denying the science of evolution, and denying the science of climate change.

"Doubt is our product," a tobacco executive wrote in a 1969 memo to fellow tobacco executives, "since it is the best means of competing with the 'body of fact' that exists in the minds of the general public. It is also the means of establishing a controversy."

Obama used our mission statement — to “restore science to its rightful place” — in his inauguration speech. And once in office, the candidate who had started out not particularly friendly toward science seemed to embrace it as a central part of his strategic approach. He appointed several of our early supporters to cabinet-level posts. Steven Chu became energy secretary. John Holdren became presidential science advisor. Jane Lubchenco became undersecretary of commerce and director of the National Oceanic and Atmospheric Administration (NOAA). Harold Varmus led the National Cancer Institute. Marcia McNutt became director of the US Geological Survey. John Podesta led Obama’s transition team. The administration had more scientists than any in memory. Perhaps, scientists dared to hope, the dark days of unreason had finally passed. They couldn’t have been more wrong.

Many reporters (and editors, who often direct reporters’ lines of questioning) are — like many politicians — humanities majors who were required to take few or no science classes in college. The classes were hard, and they ducked them, and now few seem to understand science’s unique importance to the democracies they report on.

Another part of the problem may be that journalists, scientists, and politicians each approach questions of fact from differing perspectives.

The modern journalistic approach does not work when applied to scientific questions, and it tends to skew public policy in counterfactual directions, as the above example shows. This is a bit ironic because journalistic techniques were originally developed as a means of fact-checking, akin to replication and peer review in scientific research. For example, reporters would get multiple sources to corroborate a story (which is an account of events in our shared, objective reality), establishing a relative confidence in its veracity, or they wouldn’t run the story. But today, journalism schools teach a mantra that scientists will say is completely false: “there is no such thing as objectivity” — a phrase frequently repeated by some of the profession’s leading figures, and contained in many newspaper reporters’ guidelines.

The first casualty of this “false balance” is journalism’s own credibility, and journalists’ ability to speak truth to or about power, which is one of the field’s main functions.

Similarly, the tendency of politicians to look for compromises on disputed questions of fact instead of basing decisions on an objective standard of knowledge is eroding the country’s ability to solve its problems, leaving it mired in policies that don’t work and political battles that go on forever. And by allowing the teaching of “alternative theories” on politically contentious topics like evolution or climate change or birth control in science classes, those same politicians damage children’s ability to learn critical thinking, to compete in a science-driven global economy, and to live in a world increasingly impacted by climate disruption. This dumbing down of the people for ideological reasons is, of course, not new. It is an age-old authoritarian tactic.

By late 2012, antiscientific rhetoric had become normalized in US politics. Public statements that once would have been considered ludicrous and career-ending were accepted by media and voters without challenge, mostly on the Republican side of the aisle, and mostly on issues surrounding climate change, contraception, and evolution. That’s not to say that Democrats didn’t have their own issues with accepting science they didn’t agree with politically — they did — but they weren’t running loudly against science the way Republicans were.

On the other side of the Pacific, the left-leaning US city of Portland, Oregon, voted in May of that year to ban fluoridation.

Across the Atlantic, the city council in Dublin, Ireland, voted to oppose it, while the entire country of Israel banned fluoride after Health Minister Yael German, a history major and former mayor, ruled it must be removed from public water supplies over the criticisms of medical associations. Previously she had raised health concerns over cell-phone towers.

In Western Europe, many more countries decided against fluoridation, including Belgium, Denmark, Finland, Greece, Iceland, Italy, Luxembourg, Netherlands, Norway, Portugal, Scotland, and Sweden.

Genetic engineering is, in Europe, still politically tied to the Nazi practice of eugenics, and therefore still causes strong political reactions. Additionally, in Northern Europe especially, the left-wing focus on alternative medicine, holistic health, and bodily purity are major concerns that, when taken to an extreme, drive widespread opposition to fluoride, vaccinations, and genetically modified foods, all antiscience problems that are common in the EU.

In a world in which advanced molecular biology will increasingly present legal challenges as we parse out what it means to have the power to analyze, edit and design life, this raises serious questions about whether our judicial system is up to the task. The High Court’s willingness to redefine medical or scientific terms to accommodate ideological concerns, and its poor grasp of the science underlying major decisions, raises doubts about its ability to deliver justice in an age of advanced science where exact definitions matter even more than they do in the law.


Let’s consider the relationship between knowledge and power. “Knowledge and power go hand in hand,” said Francis Bacon, “so that the way to increase in power is to increase in knowledge.”

At its core, science is a reliable method for creating knowledge, and thus power. To the extent that I have knowledge about the world, I can affect it, and that exercise of power is political. Because science pushes the boundaries of knowledge, it pushes us to constantly refine our ethics and morality to incorporate new knowledge, and that, too, is political. In these two realms — the socioeconomic and the moral-ethical-legal — science disrupts hierarchical power structures and vested interests (including those based on previous science) in a long drive to grant knowledge, and thus power, to the individual. That process is always and inherently political.

But the statement of an observable fact is a political act that either supports or challenges the current power structure.

Why did the church go to such lengths to deal with Galileo? For the same reasons we fight political battles over issues like climate disruption today: facts and observations are inherently powerful, and that power means they are political.

Because it takes nothing on faith, science is inherently antiauthoritarian, and a great equalizer of political power. That is why it is under attack.

The scientific revolution has proven to be more beneficial to humanity than anything previously developed. By painstakingly building objective knowledge about the way things really are in nature instead of how we would wish them to be, we have been able to double our life spans and boost the productivity of our farms by thirty-five times.

These initial recorded observations suggest a hypothesis: a possible explanation for the observations that partially or fully answers the initial question. This hypothesis must make a risky prediction, one that, if true, might confirm our conclusion or, if false, will destroy it. If there’s no possible way to prove the hypothesis is false, then we aren’t really doing science.

PART II The History of Modern Science Politics


Puritanism wasn’t just a theology, it was a whole set of ideas that included taking an antiauthoritarian, experimental, empirical approach to discovering the natural laws by which God’s creation abided. In exercising his will, God did not contradict reason. Rather, he revealed himself to humans through two books: the Book of Revelation, made accessible by faith, and the Book of Nature, made accessible by observation and reason. Science was the “handmaiden” to theology, assisting in the study of “the vast library of creation” as a vehicle to religious understanding.

This idea that God does not contradict reason and that his laws are implicit in nature also lies at the foundation of English common law, as first set forth in Christopher St. Germain’s 1518 treatise The Doctor and Student, which relates a hypothetical conversation between a doctor of divinity and a student of the laws of England and established common law’s moral basis.

Through the long centuries of the Dark Ages, it was not Christianity but Islam that had kept the flame of science alive. Turkish Ottoman muskets and superior military technology had conquered the Balkans, Ukraine, Crimea, Palestine, Lebanon, Syria, Arabia, and much of North Africa, creating a vast Ottoman empire. Scholars in this golden age of Islam laid the foundation for much of modern Western thinking in ways few people realize today, down to the language and the numerical and mathematical systems we use. The word algebra, for instance, comes from al-gabr, Arabic for “completion,” one of two ways of solving quadratic equations developed by “the father of algebra,” Muhammad ibn Musa al-Khwārizmī, whose last name (al-Khwārizmī), when translated into Latin, is Algoritmi, the root of the word algorithm.

In fact, the approach of using the empirical observation of nature to discover the objective truth of things was first used not by Francis Bacon but by an eminent Islamic scientist, Ibn al-Haytham.

Roger Bacon, the thirteenth-century scientist and Franciscan friar, described a cycle of observation, hypothesis, experimentation, and independent verification, which sounds an awful lot like the modern scientific method, and which he got from studying Optics. Al-Haytham is the first scientist we know of, as the term is used today, to describe someone guided by empirical observation of nature.

A conservative, literalist scientist-theologian named al-Ghazāli, who is influential in Muslim thinking to this day, wrote a critique of Muslim scientists, or Mu’tazilites, called The Incoherence of the Philosophers, in which he attacked their assimilation of the ideas of Aristotle and the concept of a natural causality of things.

This fundamentalist interpretation led to many of the antiscience, anti-Western beliefs that have held back progress in more fundamentalist Muslim countries to this day. “The innate religious conservatism of the school of thought that grew around [al-Ghazāli’s] work inflicted lasting damage on the spirit of rationalism and marked a turning point in Islamic philosophy,” argues Al-Khalili. But the second reason was perhaps even more powerful: the Islamic world’s failure to do what the Europeans, and particularly the followers of Martin Luther, were doing: adopt the printing press, a new technology that was making knowledge much more widely available. While devout Muslim scholars were painstakingly hand-copying holy books with artistic fealty, Lutherans were printing Bibles by the thousands and putting knowledge in the hands of the people to judge for themselves.

Each arm of the double helix of Western Christianity — Roman Catholicism and the emerging Protestantism — embodied the two distinct worldviews of the authoritarian and the antiauthoritarian: that rules, methods, and laws were either prescribed from on high or built up by individuals in consensus.

Luther’s grand movement, and the very idea that knowledge could be accessible by individuals without an intervening authority, had been made possible by the 1451 invention of the printing press.

The printing press laid the intellectual foundation for the scientific revolution that was to come. This marked an important moment in human history, when Western thought was split into twin, competing paths: the authoritarian and the antiauthoritarian. The other three major sources of human power — government, economics, and science — developed similar authoritarian, top-down and antiauthoritarian, bottom-up strains of thought over the ensuing centuries as power was demystified.

Bacon was a lawyer who worked under Edward Coke, the attorney general, a position he would eventually assume himself. Toward the end of his legal career, he turned more of his attention to science and published what would become a foundational volume, Novum Organum Scientiarum, or “New Organon” — a “new instrument” of science. It was a devastating attack on Aristotle’s book and the logic of the Greeks with its emphasis on top-down reasoning and disdain for experimentation. In it he argued instead for using the inductive method of reasoning, which underlies much of the scientific method we use today. Inductive reasoning proceeds from the bottom up by observing with the senses and then building in logical steps to reach a general conclusion about reality.

This is why one hears scientists talking about the “theory” of evolution. It is not an observed fact; rather, it is a conclusion that is supported by all the facts observed so far, but one can never be absolutely sure because one can never see the whole universe at once, and because of the provisional nature of inductive reasoning, scientists hold out the possibility, no matter how small, that it could be invalidated. Science thus demands intellectual honesty, and a scientific conclusion will always contain a provisional statement: All observed swans are white; therefore, all swans are probably white.

That is why math and statistics have become such important parts of science: they quantify the relative probability that a conclusion is true or false.

Newton provides an example of how the idea of “science” had not yet fully emerged as something separate from religion in early Enlightenment thinking. In fact, during the seventeenth century, the word “scientist” was not commonly used to describe experimenters at all; they were called “natural philosophers,” an extension of the Puritan idea of the study of the Book of Nature.

By 1663, a time when Puritans were a decided minority in England, 62 percent of the natural philosophers of the famed Royal Society of London were Puritans, including Newton, who had studied Ibn al-Haytham’s work on light and refraction, and who wrote far more on religion and alchemy than he did on science. Newton believed in the inerrancy of scripture, biblical prophecy, and that the apocalypse would come in 2060. He was “not the first of the age of reason. He was the last of the magicians,” said economist John Maynard Keynes, who purchased a collection of Newton’s papers in 1936 and was astounded to find more than one million words on alchemy and four million on theology, dwarfing his scientific work.

He considered Francis Bacon, Isaac Newton, and John Locke, whom he had studied at the College of William and Mary, to be the three most important thinkers of all time. He called them “my trinity of the three greatest men the world had ever produced.”

How do we know something to be true? What is the basis of knowledge? Locke’s An Essay Concerning Human Understanding, published in 1689, just two years after Newton’s Principia, strove to answer that question, by laying out what can be known empirically, how it is that we know it, and the inherent limits of knowledge.

This approach was critical to Jefferson because it laid the foundational argument for democracy, which was implicit in a different form in Coke’s argument for the primacy of English common law: If we can discover the truth by using reason and observation — i.e. by using science — then anyone can discover the truth, and therefore no one is naturally better able or more entitled to discover the truth than anyone else. Because of this, political leaders and others in positions of authority do not have the right to impose their beliefs on other people. By natural law, the people themselves retain this inalienable right. Based on Locke’s ideas of knowledge, and Coke’s ideas of law, the antiauthoritarian equality of all men in their ability to use reason to discern the truth for themselves is logically self-evident. It is intuitive knowledge. And that’s the heart of — and the most powerful argument for — democracy.

In the process they created something entirely new: a nation that respected and tolerated religion in every sense, but did not base its authority on religion. A nation whose authority was instead based on the underlying principles of liberty, reason, and science.


In the 1920s and early 1930s, Berlin was the world capital of science, culture, and art, and these aspects fed off one another. Persecution — particularly of Jews, homosexuals, and artists — spurred emigration that turned the United States into an intellectual mecca. The United States offered these intellectuals freedom, tolerance, egalitarianism, opportunity, and support for their work, and it had the military strength to protect those ideals. In return, the new immigrants gave the United States enormous breakthroughs in chemistry, biology, and physics, and helped shape Hollywood culture, which, together with advanced technology, became America’s chief cultural export.

Befitting the great westward expansion, in the nineteenth century it was America’s pioneer spirit and can-do attitude that produced the world’s great inventors and implementers, the great trial-and-error engineers involved in communication, lighting, and power, including Eli Whitney, Samuel Morse, Alexander Graham Bell, Thomas Edison, George Westinghouse, Nikola Tesla, and many others. But Europe was still the home of real science and the scientists — the curiosity-driven experimentalists and theorists — who made the fundamental basic-science breakthroughs, including Alessandro Volta, Michael Faraday, André Ampère, Georg Ohm, Charles Darwin, Marie Curie, James Maxwell, Gregor Mendel, Louis Pasteur, Max Planck, Alfred Nobel, and Lord Kelvin.

French political scholar Alexis de Tocqueville noted this focus on pragmatism and application when he toured America in 1831 and 1832, some fifty-five years after its birth. His report of what he learned, Democracy in America, contains a chapter titled “Why the Americans are More Addicted to Practical than to Theoretical Science.”

In the early twenty-first century, the political orientation that most stands for freedom, openness, tolerance, caution, and science is the liberals. In the United States, this ideology is represented by the Democrats, which may explain why 55 percent of US scientists polled in 2009 said they were Democrats while only 6 percent said they were Republicans, compared to 35 and 23 percent of the general public, respectively.

Early in the twentieth century this situation was almost reversed. It was the Southern Democrats, defending Jim Crow and traditional religion, who opposed science. Republican Abraham Lincoln had created the National Academy of Sciences in 1863. Republican Teddy Roosevelt, who had grown up wanting to be a scientist, became America’s great defender of wildlife and the environment. Republican William McKinley, who would later be admired by Karl Rove, won two presidential elections, in 1896 and 1900, both times over the anti-evolution Democrat William Jennings Bryan, and supported the creation of the Bureau of Standards, which would eventually become today’s National Institute of Standards and Technology. Bryan’s strident anti-evolution campaigns, culminating in the 1925 Scopes Monkey Trial, helped to drive even more scientists toward the Republican Party.

Even though Berlin was the world capital of culture, art, and science, right-wing relativity deniers were on the rise. Like modern climate-science deniers, relativity deniers mounted ad hominem attacks against Einstein, and loudly branded general relativity a “hoax,” despite — or perhaps because of — its recent, dramatic scientific confirmation.

Darwin himself had not seen it this way. He had written to John Fordyce about the issue in 1879, saying, “It seems to me absurd to doubt that a man may be an ardent Theist & an evolutionist,” though Darwin himself had by then given up his own Christianity. In 1880, he wrote to the young lawyer Francis McDermott that “I am sorry to have to inform you that I do not believe in the Bible as a divine revelation & therefore not in Jesus Christ as the son of God,” a view that only became known when the letter was sold at auction in 2015.

Milt Humason, who was a famous womanizer, told Sandage that, in 1926, during a month-long disappearance in which McPherson claimed to have been kidnapped, tortured, and held for ransom in Mexico, the attractive radio evangelist had actually been up on Mount Wilson, enjoying Humason’s special attentions in the Kapteyn Cottage.


In fact, during the 1930s, Adolf Hitler was an early adopter of the latest science and technology, which he used to great political advantage. He forbade smoking around him because German scientists had shown a link between smoking and lung cancer. He based his politics of white supremacy on ideas he appropriated from early research into genetics. He barnstormed twenty-one cities by airplane — the first politician to use an airplane to campaign on that level — in his 1932 race for president against Paul von Hindenburg, an effort the campaign called “Hitler über Deutschland.”

Presiding over the American science war effort was Edwin Hubble’s boss, Vannevar Bush, an engineer and the president of the Carnegie Institution of Washington.

After the Germans invaded Poland in September 1939, Bush became convinced of the need to establish a federal agency that would coordinate US research efforts. He scheduled a hasty meeting in June 1940 with President Franklin D. Roosevelt, who approved the agency in less than ten minutes. The National Defense Research Committee (NDRC), the forerunner to today’s National Science Foundation, was established on June 27.

In November of 1944, Roosevelt had asked Vannevar Bush to consider how the wartime science organization might be extended to benefit the country in peacetime — to improve national security, aid research, fight disease, and develop the scientific talent of the nation’s youth. After the war was won, Bush submitted his report to President Harry S. Truman. Science, the Endless Frontier, made the case that the creation of knowledge is boundless in its potential. The report is widely credited with laying the groundwork for the second golden age of Western science, during which governments, rather than wealthy philanthropists, became the principal funders of scientific research in peacetime as they had been in war.

In less than a year, a bill creating the National Science Foundation (NSF) was signed into law, and science began to undergo a subtle but profound change in its relationship to Western culture. For two centuries, it had been motivated by a sense of wonder on the part of noble idealists and adventurers, wealthy visionaries, civic-minded philanthropists, and scrappy entrepreneurs. But it was now largely driven by government investments that were, in no small part, motivated by the public’s sense of fear.

Canadian troops stationed at Zweibrücken, a NATO air base in West Germany, called the bomb "a bucket full of sunshine."

Scientists might be sons of bitches, but they were American sons of bitches.

In the span of two short decades, science had attained sacred-cow status enjoyed by few other federal priorities. Gone were the days of scientists needing to reach out to wealthy benefactors to justify and explain their work in order to get funding. The adoption of science as a national strategic priority changed the relationship between science and the public. Over the course of a single generation, government funding allowed scientists to turn inward, away from the public and toward their lab benches, at the very time that the public had developed a love-hate relationship with science.

But the need to sell the worth of one’s work to the public and donors, to converse about new discoveries and their meaning, and to inspire and excite lay-people may be the only thing that keeps the public invested and supportive in the long term — support that, in a democracy, is critical to sustained effort. Bush may have done his job too well. The shift to public funding changed the incentive structure in science. This might not have been a problem if scientists had valued public outreach, but, by and large, they didn’t. As economists are quick to point out, people often adjust their behavior to maximize the benefit to themselves in any given transaction, and the economics of the new structure rewarded research but not public outreach or engagement. As a result, most scientists ignored it. Science coasted off the taxpayers’ fear of the USSR, even as public mistrust was building.

In a famous 1959 lecture titled “The Two Cultures and the Scientific Revolution,” Snow warned that the widening communication gulf between the sciences and the humanities threatened the ability of modern peoples to solve their problems.

A great change had begun in Western universities, and humanities professors felt themselves slipping from the top spots and being supplanted by scientists, who generally seemed as if they couldn’t have cared less about the humanities. Why bother with all the reading and writing and talking when science was actually doing things? But this was equally shortsighted, and in this shift the West let go of something precious: a grasp on the classics that had informed Western culture.


He laid out a bold agenda, a desperate and visionary agenda, to regain the military and ideological lead, and, at the same time, to turn around the economy by landing a man on the moon and returning him safely to Earth. The effort would require a peacetime science mobilization on par with the Manhattan Project, requiring the building of entire cities to support it. At its peak, the Apollo program would employ some four hundred thousand people.

In 1962, marine biologist Rachel Carson’s book Silent Spring came out and made a permanent impact on the national psyche, shocking Americans already suspicious of science into an awareness of chemical pollution, reaffirming Eisenhower’s warnings about the scientific-technological elite, and launching the field of environmental science and the modern environmental movement. That year, only about 35 percent of Americans thought Apollo was worth the cost.

In 1965, Ralph Lapp, the former head of the nuclear physics branch of the Office of Naval Research, captured this growing fear when he published The New Priesthood, in which he reiterated Eisenhower’s argument that the “scientific elite” — people who understood how science and technology work — were starting to supplant the country’s elected leadership. Lapp’s argument reflected an emerging and critically important idea: that “democracy faces its most severe test in preserving its traditions in an age of scientific revolution.”

From a purely scientific viewpoint, human spaceflight was wasteful. But from a public-engagement (and thus funding) viewpoint, it was sheer genius. It gave the public protagonists starring in an epic narrative about science. It was, in some sense, an example of the third culture C. P. Snow had hoped for — a marriage of literary resonance and “doing the big things,” as Kennedy had urged in his United Nations address.

There is a very good chance that, had Kennedy lived, the United States would not have put a man on the moon by 1969, since the costs and the politics were tilting so heavily against it.

This pattern has been repeating since the 1960s with escalating stakes. The same drama of object-oriented science and the development of technological solutions has led to the denial of consequences over and over again, from cigarettes to DDT to asbestos to acid rain to Love Canal to the hole in the ozone layer to Three Mile Island to the Dalkon Shield to toxic-shock syndrome to lead paint and leaded gas to atrazine to the Vioxx scandal to emerging battles over microbeads and other nanotech — and eventually to the granddaddy of them all (so far), global climate disruption.

For a generation, public funding of science had been driven by fear, and Ronald Reagan was running for president saying that “government is the problem.” Sagan set out to change the growing public apathy and distrust and created the 1980 television series Cosmos — “the greatest media work in popular science of all time,” as Gould would call it.

Following this rejection, and Sagan’s failure to secure tenure at Harvard, scientists developed a new term: the Sagan effect. One’s popularity with the general public was considered inversely proportional to the quantity and quality of one’s scientific work, a perception that, in Sagan’s case at least, was false.

Sagan’s rejection became a poignant and symbolic example of how America’s most prominent scientists had lost their appreciation of the value of their relationship to society.

As for the science, the scientific consensus on the safety of eating GM foods is even stronger than that for the existence of human-caused global warming. A 2015 Pew Research Center/AAAS study found that 88 percent of all AAAS-member scientists said that genetically modified foods are safe to eat, compared with just 37 percent of the general public. The fifty-one-point gap makes this the largest opinion difference between scientists and the public.

In 2015, the number of mobile-phone subscribers exceeded five billion, or nearly 70 percent of the world’s population, yet the brain-cancer rate has not increased.

Microwaves are slightly stronger, followed by the infrared radiation your skin is giving off as you sit reading this. You are probably emitting somewhere around ninety watts.
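The "around ninety watts" figure checks out against the Stefan–Boltzmann law. A minimal sketch of the arithmetic, using assumed illustrative values (skin temperature, body surface area, emissivity, and room temperature are my inputs, not figures from the book):

```python
# Rough Stefan-Boltzmann estimate of the net infrared power a human body
# radiates to its surroundings, as a sanity check on the ~90 W figure.
# All inputs below are illustrative assumptions, not values from the book.

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W / (m^2 * K^4)
EMISSIVITY = 0.98     # human skin is nearly a perfect infrared emitter
AREA = 1.8            # typical adult body surface area, m^2
T_SKIN = 305.0        # ~32 C average skin temperature, in kelvin
T_ROOM = 297.0        # ~24 C room; the body also absorbs the room's IR

def net_radiated_power(t_skin=T_SKIN, t_room=T_ROOM,
                       area=AREA, emissivity=EMISSIVITY):
    """Net infrared power radiated to the surroundings, in watts."""
    return emissivity * SIGMA * area * (t_skin**4 - t_room**4)

power = net_radiated_power()
print(f"Net radiated power: {power:.0f} W")  # on the order of 90 W
```

With these assumptions the net output lands in the high eighties of watts; in a colder room the number rises, which is why the quote hedges with "probably" and "somewhere around."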

Despite all the growing cultural ambivalence toward science, it was protected until the 1990s by the same motivation that had held sway since the Russians exploded their first atomic bomb in 1949 — fear. But in 1991 the Soviet Union collapsed. With the collapse, the West won the Cold War and lost its major scientific competitor. The Western science enterprise, without realizing it, suddenly lost the rationale Vannevar Bush had used to get government funding.

Michael Halpern of the Union of Concerned Scientists says that, as a result of losing the OTA, Congress wasted billions of dollars on policies like the fence along the Mexican border that OTA scientists could have told them would not work. “It was penny-wise and pound-foolish,” Halpern says. “Without the OTA, it all became rhetoric.”


By removing the goal of objectivity, it set modern politics up for endless arguments between warring pundits.

In fact, ideas do not exist in anything akin to a marketplace, and journalism’s function in a democracy is often to tell people what they don’t want to hear but is important to know anyway. Its role is to report the news, meaning the facts of recent events, in order to provide the public with a common ground for debate and discussion. The incorrect assumption that there exists some sort of marketplace of ideas where truth is the gold standard is destroying mainstream news and, with it, the balanced, moderate political weltanschauung of the countries that practice this approach.

This trifecta—talk radio, the Internet, and cable news—combined to devalue the factual reporting that once kept society balanced, supplanting it with the opinion wars of the new media. Having trained at postmodernist universities, many emerging leaders in journalism didn’t recognize this as a problem. It wasn’t their role to discern the reality of things, they believed. Truth was subjective, a matter of one’s perspective.

During the short time since the 1970s that journalists have been arguing about objectivity, scientists and engineers have completely transformed our world using objective knowledge.

If knowledge is falsifiable but holds up regardless of who does the testing, it is said to be reliable; i.e., objective. A scientific conclusion is always provisional, because knowledge is never complete, and it is always political, because new knowledge always threatens the status quo, but it is also increasingly reliable as it is tested and survives.

Such a low standard becomes problematic in an age when major political issues have considerable scientific dimensions. Very often, there is, in fact, objective knowledge that is readily available, and the misapplication of a reporter’s well-meaning view that there is no such thing as objectivity can become a recipe for disaster.

Stephanie Curtis is the senior producer of Climate Cast, a weekly program covering climate change on Minnesota Public Radio News that has both news and call-in/online-posting components. The show is nation-leading as a major broadcast weekly news program that goes in depth on climate change, featuring the world’s leading climate scientists as regular guests.

PART III The Three-Front War on Science


The first major front in the war on science is the identity-politics or postmodernist front, waged by academics and the press. On this front, science is subordinated to the “science studies” of humanities scholars and the journalistic denial of objectivity. Such academics and reporters insist that all truth is subjective or derivative of one’s political identity group, and they confuse the process of science with the culture of scientists, thereby falsely equating knowledge with opinion.

Even eminent postmodernists like Jacques Derrida, who saw postmodernism as an outgrowth of science, mistook the authority conferred by the theatrics of the white lab coats and the way science was being used by the military-industrial complex for science itself, seeing it as an authority system when, in fact, it is just the opposite. It bears repeating: while scientists may be “authorities” in their fields and able to speak “with authority” on a given topic, what authority they have comes only from the antiauthoritarian exploration of nature. It is not grounded in the scientist, but in the evidence from nature itself. It is the authority of gravity.

After all, science refuses to take any claims on faith and instead says, “Show me the evidence from nature, and the thought process you used to establish it, and I will conclude for myself if your observations are accurate, your thought processes are sound, and, therefore, if what you say is likely to be true.” Postmodernists would say something similar, but, instead of seeking evidence from nature and the process used to establish it, they would seek to deconstruct the claims by linguistic criticism and analysis — “the dismantling of conceptual oppositions, the taking apart of hierarchical systems of thought.” Their thinking borrowed significantly from Sigmund Freud, as did their process of deconstruction.

Postmodernism was, in this sense, psychoanalysis writ large.

Following Nietzsche, a number of Austrian, German, and French philosophers — among them Martin Heidegger (who himself became a Nazi in 1933), Michel Foucault, Jacques Derrida, Jean-François Lyotard, Jacques Lacan, Julia Kristeva, and Bruno Latour — together with a few Americans—including Richard Rorty and Austrian-American Paul Feyerabend — began rejecting the idea that reality and facts existed independently of our thinking about them.

The goal of science is to create descriptions of reality that are independent of us and our identities, opinions, or beliefs. We call these descriptions knowledge. This knowledge can be expressed in language, mathematics, graphs, images (drawings, paintings, photographs, films), or some combination of these. To create this knowledge, we use the scientific method, which is a collection of techniques to measure the way things are in nature independent of our perspectives. These techniques include observation, inductive reasoning, hypothesizing, unique prediction, experimentation, recording, critical peer review, and replication. These techniques help cull objective, reliable knowledge out of our subjective perspectives.

But what the thinking got wrong was the idea that science is authoritarian rather than antiauthoritarian; the view that science is a culture instead of a process; the confusion of science with the power and politics that surround it; and, because subjectivity has a greater claim to truth in certain realms, the assertion that there is no such thing as objectivity.

Perhaps science really was nothing more than a myth to give legitimacy to the white male society from whence the Enlightenment sprang — a sort of ethnocentric rationalization. Perhaps its so-called objectivity was just a smoke screen. This view was embraced by large swaths of left-leaning academics in the humanities departments of universities (sometimes referred to as the last medieval institutions), who had found themselves—and their budgets—deposed from their thrones by science departments and their denizens in C. P. Snow’s battle of the two cultures.

Like the postmodernists, Kuhn cast science as an expression of politics and power. Science is a knowledgeable description of nature. That is inherently powerful, and that power makes it inherently political. But while others may use the results of science for power, in and of itself its practice is not an expression of power over others. Its practice is, instead, a search for truth. This is a critical distinction.

Kuhn’s error was one of overextension — to intertwine the politics of science and the discovery of truth and call them one.

Thus if someone from a disempowered political group did something morally reprehensible, he or she had to be given extra understanding, because it was probably partly due to disenfranchisement. Cultural conservatives objected to this on a rationalist basis and were crowded into the bottom-right political quadrant (authoritarian conservative) with scientists, who didn’t belong there. But suddenly rationalism and modernism seemed like old, conservative ideas — like expressions of authority. The political left lost many brilliant and otherwise liberal thinkers, such as E. O. Wilson, who could not stomach elevating a political goal over the ideas of reason and the Enlightenment. It was antithetical to the whole egalitarian view of modernity, because there was no common authority of evidence.

Seeking authority in the cultural dialogue, conservative Catholic scholars make similarly revisionist, neo-postmodern statements today, arguing that anyone who doesn’t agree with them is practicing “scientism” — that scientists think they are superior and look down on those who don’t “believe” in science, as if it were a belief system instead of a process of measuring nature with our hands and eyes and tools. It’s a brilliant tactic, and an excellent example of the intellectual handicap postmodernism has created for itself and science: as long as everyone has their own truth, no one can presume to question anyone else’s.

The entire postmodern movement was tinkering with the foundations of democracy in ways few understood. By painting objectivity as supremacist, the subjectivism and authoritarianism from which America’s founders had sought to free the country (but had partially failed to do, by excepting slaves and women from those created “equal”) was restored to the throne. Much of Western education and thought after the 1970s lost its grip on reality and became embroiled in “but faith, or opinion, but not knowledge,” in the words of John Locke.

In the sense that it is now international, with scientists from around the world collaborating on research projects over the Internet using the language of science, the global science enterprise is the most diverse and yet universal undertaking in human history.

more remote developed countries that were far away from immigrant influxes, like Finland, rising to the top in international science and math rankings. Less diversity, as it turns out, equals, on average, less classroom management and easier teaching. But in places like the United States, followed closely by England, France, Germany, and other leading developed countries, diversity became both an advantage and a cost center.

Social constructivist thinking became the mainstream paradigm in Western teacher education in the 1970s and 1980s, eventually influencing the educations of tens of millions of Western students.

Scientists argue that the purpose of education had shifted from teaching knowledge and skills to providing a learning environment in which students construct their own knowledge, which, in the case of teaching science, disregards the accumulated knowledge of more than five hundred years. Is it surprising that students in diverse postmodernist classrooms, such as those across the United States or the United Kingdom, perform poorly when compared to their peers in China, Korea, Poland, Finland, and Japan, which rank among the least diverse countries and also among the highest performers in OECD educational rankings in science and math?

The teaching that there is no objective reality, but rather many subjective realities — or, in this case, that subjective realities are on par with objective reality — degrades students’ views of the primacy of knowledge and increases the education gap rather than closing it. It is no wonder that there is so much antiscience in Western culture — we’ve been teaching it for forty years.

By acknowledging that there is an objective reality, and that we can form knowledge about that reality by using science and observation, we remove questions of fact from the authoritarian argument. This is the great insight that the United States, the world’s oldest democracy, was founded upon.

In 1994, Rutgers University mathematician Norman Levitt and University of Virginia biologist Paul Gross published a polemic attacking this appropriation of scientific terminology called Higher Superstition: The Academic Left and Its Quarrels with Science.

But there were plenty of scientists on the left, and others who were ideologically unaffiliated, who were tired of the arguments over the cultural nature of reality that, by then, was being called the “science wars.” Among the most notable was eminent Harvard University entomologist E. O. Wilson, who declared in a New Orleans speech that “multiculturalism equals relativism equals no supercollider equals communism.”

The result of this erosion, as the feminist essayist Katha Pollitt wrote in the Nation, was “a pseudo-politics, in which everything is claimed in the name of revolution and democracy and equality and anti-authoritarianism, and nothing is risked.”

But, without objective knowledge, all arguments become “but faith, or opinion,” and can go on forever. We are either paralyzed by it or we must resort to authority instead of objectivity to make decisions. This casts us all the way back to Hobbes’s predemocracy, pre-Enlightenment, pre-Locke war of every man against every man, where not evidence but raw power determines the outcome of a dispute.

Further, it is journalists, more than any other profession, whose confusion about the nature of objectivity has both enabled and — through journalists’ development of the field of public relations — directly caused much of the assault on science by industry.

The dissociation from history and the hard-won knowledge of science has thus led to a generation of leaders who are at once arrogant and ignorant, and thus likely unable to lead the world out of the morass. We embrace the forms of tradition but not the substance, focused only on winning, unable to discern between what feels good and what is true. It is a condition that threatens to leave the world permanently damaged.

This is a retreat into superstition and darkness with heartbreaking human consequences — and even more heartbreaking political ones — rendered under the auspices of openness, tolerance, and love.


Unlike scientists, churches still depended upon engaging the public for financial support, and they were alarmed by baby boomers’ deep skepticism of religion in the 1970s.

But now Graham was inviting people back into the world of miracles and belief, a world characterized, in the words of Immanuel Kant, as the “cowardice . . . of lifelong immaturity.”

With few exceptions, the fact that the religious right had become a national political force, and that the voice of science in the public dialogue was weakening, appears to have been largely ignored in the professional conversation among scientists.

The ideological front of the war on science is being waged by religious conservatives in three major battle zones, all of which deal with origins: the nature and age of Earth and the universe, the theory of evolution, and the origin and nature of life and reproduction.

Evolution is the most well-supported knowledge in science.

In the 1930s, a Russian geneticist named Dmitri Belyaev wanted to test just this question, and he carried out an amazing experiment. Belyaev knew that dogs were descended from wolves, and he knew that behavior ultimately arose from biology. But he couldn’t figure out the mechanism that could account for the wolf giving rise to so many unique breeds of dog, or how it could be that dogs seemed to be born with an affection for humans. He wondered if it would be possible to test this. He set out on a program of selectively breeding Russian silver foxes, which mature quickly enough that, he hoped, he could perhaps observe evolutionary changes within the span of his lifetime. He captured several wild foxes and put them in cages, and he began a program of selective breeding based only on a single trait: tameness.

Joshua Rosenau of the National Center for Science Education, an Oakland, California–based nonprofit that has fought to keep creationism out of the science classroom.

Opponents of evolution, climate disruption, vaccines, birth control, stem-cell research, HPV vaccination, sex education, and other science issues are all using the same methods. Motivated by the sense of identity, belonging, and purpose they receive from these well-funded causes, thousands of laypeople are delving into geology, biology, immunology, paleontology, statistics, climatology, meteorology, geophysics, and oceanography, with the support of churches and industry-funded front groups who, like the Cornwall Alliance, preach a gospel of biblical fundamentalism mixed with a heavy dose of Ayn Rand, free-market economics, science denial, and anti-tax ideology which, when combined, serve the vested interests of wealthy church and business executives. This antiscience militia is aided and abetted by trained scientists and professors like Fred Singer, Willie Soon, David Legates, and Michael Behe, who supply a steady stream of pseudoscience that can be used by foot soldiers to sway the public debate.

These ideologically motivated partisans generally make two arguments that sound plausible to average lawmakers or school-board members because they themselves use rhetorical arguments to navigate their daily lives. They also sound convincing, because they sound like science—but they’re not. The arguments are the same whether the subject is climate disruption, evolution, vaccination, tobacco, or sex education. The first argument is: Lacking certainty, we should do nothing.

The second argument is: Since the conclusion is not certain, we should get a balanced perspective from both sides.

A 2008 Pew Research Center survey found that that is exactly what is happening. Thirty-one percent of non-college-educated Republicans accepted the scientific consensus on climate change — a surprisingly low number compared to the public at large. But what was even more surprising was that, among college-educated Republicans, that number fell, to 19 percent. The proportion was the opposite for Democrats—an increase from 52 percent to 75 percent. The more educated Republicans were, the less they believed in climate change.

In 1968, for example, the national Study Group of Mao Zedong Thought was organized, which denounced many scientific theories, starting with Albert Einstein’s theory of relativity. Acupuncture had been banned as unethical in 1822 by the Great Imperial Medical Board of China, but Mao restored it during the Cultural Revolution, giving it equal weight with Western medicine and training peasant “barefoot doctors” to use it in order to disguise the fact that China had so few trained medical professionals. It is interesting to note that today such “traditional” and “alternative” health practices are largely antiscience issues for the political left.

Science should be something that everybody has access to and everybody can understand. Most scientists are perfectly willing and interested in engaging the public. Lots of them are actually pretty good at it, but there is no incentive in the university system to do that—tenure relies on grants, and then on teaching. Outreach is way down at the bottom and does not influence tenure or promotion decisions. People who do outreach are regarded as second-class citizens in the science world.

Westphal has had scientists and science advocates lead discussions about evolution and the big bang in his church. “Protestants started out by questioning,” he says. “These things don’t have to be in conflict. By meeting scientists and talking openly about these things, people can let go of their misconceptions and see why scientists think the way they do — and that they are our friends and neighbors, not threats or opponents.”

But teaching creationism in science classes means teaching a habit of mind that is toxic to the human problem solving that has led these advances. It teaches children to throw up their hands and declare that the problem is unsolvable, particularly if that problem is tough or might have consequences for a particular religious belief. To look instead to God and acquiesce to authority. It teaches them to value not diversity of ideas, but conformity. Not survival, but submittal.


To understand the modern era of industry-funded antiscience, we have to first look at where it began: with the contribution of Sigmund Freud’s nephew, Edward Bernays. Bernays was an Austrian-American journalist-turned-psychologist. He is widely credited as “the father of public relations,” and is author of the seminal book in the field, Propaganda.

But Edward Bernays had become amazed by the power of what such an immersive approach to message delivery, combined with a purported basis in science and facts, had accomplished. His 1928 book, Propaganda, laid out specific methods for controlling public opinion, arguing, “If we understand the mechanism and motives of the group mind, is it not possible to control and regiment the masses according to our will without their knowing about it? The recent practice of propaganda has proved that it is possible, at least up to a certain point and within certain limits.”

Goebbels, said Wiegand, was using my book Crystallizing Public Opinion as a basis for his destructive campaign against the Jews of Germany.

Bernays himself stayed in the background, protecting his and his client’s anonymity. This is what Bernays termed the “third-party technique.” If the public knows they are being manipulated, they become skeptical. They must take the information in as reality.

Feminists rushed to embrace the idea, not realizing they were being manipulated by a corporation, and female smoking shot up as it became associated with the expression of one’s individual identity and freedom.

Developed in 1939, DDT was first widely used during World War II, clearing South Pacific islands of malaria-carrying insects for U.S. troops, and was used in Europe as an ingredient in delousing powder. The Swiss scientist who identified its insecticide effect, Paul Hermann Müller, was awarded the 1948 Nobel Prize in Physiology or Medicine.

Monsanto published a parody of Silent Spring called The Desolate Year, in which a plague of uncontrolled insects destroyed America.

“Rachel Carson’s legacy looms huge today,” wrote Stanford conservation biologist Paul Ehrlich to commemorate the fiftieth anniversary of the publication of Silent Spring. “Many people have the impression that climate disruption is the worst environmental problem humanity faces, and indeed, its consequences may be catastrophic. But the spread of toxic chemicals from pole to pole may be the dark horse in the race.”

Within a decade, the United States passed landmark environmental legislation in the Clean Air Act, the Clean Water Act, and the establishment of the Environmental Protection Agency, the formation of which was based on Silent Spring, as the agency notes on its own website. “In fact,” one article on the site notes, “the EPA today may be said without exaggeration to be the extended shadow of Rachel Carson.”

The theme of the right’s antiscience can be generally described as: Liberal scientists with a socialist agenda who are out for more government money want to control your life and limit your freedom. On the left, the theme of antiscience can be generally described as: Impersonal doctors, greedy corporations, and mechanistic scientists hide the real dangers to our health, the environment, and the human spirit.

“Hard-core environmentalist activists like the Natural Resources Defense Council have been highly effective for years in utilizing the court system to enact policy, effect change, and generate significant exposure for their cause,” said one conservative think tank. “The same opportunities exist for those who advocate a free-market approach, and we have an impressive track record in the courts despite being significantly overmatched by those promoting more regulation, and government-based solutions.”

Coinciding with this decline in traceable funding was a rise in dark money given to denial organizations by the Donors Trust and the related Donors Capital Fund. Donors Trust is a donor-directed foundation whose funders cannot be traced, shielding them from accountability for the funding of controversial enterprises. The organizations’ funding of climate-denial outfits increased dramatically, from just 3 percent of the total in 2003 to 24 percent in 2010, when total annual revenues for these organizations climbed to nearly $1.2 billion. Donors Trust now provides about 25 percent of total foundation funding used by organizations engaged in denial of climate disruption and arguing against climate regulation, according to Drexel University sociologist Robert Brulle.

  • Searle Freedom Trust (Donald Rumsfeld served as president and CEO of the life-sciences company G.D. Searle, which later merged with Monsanto)
  • Americans for Prosperity (cofounded by Charles and David Koch, AFP is currently three times the size of the Republican National Committee)
  • the Cato Institute (originally the Charles Koch Foundation)

This was just one of many such exchanges between the Harvard-Smithsonian Center for Astrophysics and various extraction and energy companies, including proposals for “deliverables” — papers Soon would write that created “uncertainties” about mainstream climate science — and subsequent requests for payment.

Between 1999 and 2010, the energy industry spent more than $2 billion fighting climate-change legislation, more than a quarter of it — $500 million — from January 2009 to June 2010, when President Obama’s cap-and-trade bill was in play in Congress. That amounts to almost $1,900 per day in lobby expenditures for every single US senator and representative in Washington — and those numbers don’t include nonreportable expenses like public relations, publicity, earned media, paid media, nonprofit donations, rallies, and polling. They spent an estimated $73 million more on anti-clean-energy ads from January through October 2010, and Koch family foundations gave $48 million overall to groups engaged in climate-change denial between 1997 and 2008.
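Otto's per-legislator figure is easy to sanity-check. A quick back-of-the-envelope script (my own arithmetic, not from the book; the 535-member Congress and the rough 540-day window are my assumptions):

```python
# Sanity check of Otto's lobbying arithmetic (my own rough figures,
# not the book's). Assumes 535 members of Congress and an ~18-month
# window from January 2009 through June 2010.
members = 535                # 100 senators + 435 representatives
spend = 500_000_000          # dollars spent Jan 2009 - Jun 2010
days = 18 * 30               # rough day count for the 18-month window

# Fraction of the 1999-2010 total ($2B) spent in the cap-and-trade window
window_share = spend / 2_000_000_000

per_member_per_day = spend / (members * days)
print(f"window share: {window_share:.0%}")
print(f"${per_member_per_day:,.0f} per legislator per day")
```

This lands around $1,700 per legislator per day, the same ballpark as Otto's "almost $1,900"; the exact figure depends on how the days are counted. The "more than a quarter" claim checks out exactly ($500M / $2B = 25%).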

The way it went down is illustrative of the effectiveness of a full-out execution of the seven-stage antiscience PR strategy — phony science, controversy, grassroots astroturfing, proselytizing, outrage, policy response, and pleading in support — in circumventing the democratic process.

Koch Industries is America’s largest private oil refiner, and in 2014 Forbes named it the largest privately held company in the United States, with annual revenues of nearly $135 billion.

Other ads claimed the science behind global warming was “a hoax” and called it “the greatest scam in history,” quoting climate-change denier John Coleman, founder of the Weather Channel.

The hacker then posted a link to the file on the climate-skeptic blogs the Air Vent and Watts Up with That? as well as the blog RealClimate, which is run by several leading climate scientists, including Michael Mann.

Journalists who don’t understand science and believe that there is no such thing as objectivity can easily be manipulated by propaganda campaigns (often run by former journalists), because they think taking a position on evidence is advocacy.

Graham Cogley, a geologist at Trent University in Peterborough, Ontario, who apparently had no connection to the propaganda campaign, noticed that the IPCC’s fourth assessment contained an error. The assessment, which was released in 2007, the year the IPCC and Al Gore had won the Nobel Prize, said, “Glaciers in the Himalayas are receding faster than in any other part of the world,” and that all the glaciers in the central and eastern Himalayas could disappear by 2035. The IPCC classed the statement as “very likely,” meaning that it had a probability of being true of 90 to 99 percent, implying that it was the result of measurement and observation and was supported by inductive reasoning and statistics. It also said the glaciers’ “total area will likely shrink from the present 500,000 to 100,000 km² by the year 2035.” But it was wrong.

After all the publicity, here’s the most shocking thing: the errors were not in the scientific section of the IPCC report, published separately as Climate Change 2007: The Physical Science Basis, which is subject to peer review. They were in the section published as Climate Change 2007: Impacts, Adaptation and Vulnerability, which speculated about possible impacts of climate change and was not subject to peer review. They did not purport to be scientific. But journalists and the public didn’t seem to notice that distinction. [NOTE: Doesn't this seem a bit disingenuous? When people talk about "IPCC reports" are they actually distinguishing between these sections? I wasn't aware of the difference?]

Inhofe’s next act should be morally repugnant to every member of free society. In an act of McCarthyism, he publicly named the scientists he wanted investigated for possible referral to the US Justice Department for prosecution [NOTE: Doesn't our esteemed author do the same thing with Soon?]

What is at stake is the freedom to investigate, debate, and express ideas that run counter to the interests of corporations and their political allies. Attacks on this basic freedom hide behind the guise of transparency but, in reality, are a step toward tyranny.

Holtz-Eakin says the entire cap-and-trade argument is misplaced. “There needs to be some reeducation,” he says. “Conservatives invented cap-and-trade, to battle acid rain. They were leaders in overthrowing liberals on it under Reagan. Before that it was all command-and-control approaches, and we brought market forces to bear through cap-and-trade and it saved a ton of money.” Indeed, what was expected to cost between $4 billion and $6 billion annually wound up costing a quarter of that, and has saved an estimated $70 billion annually in quantifiable health-care expenses — a return on investment of more than forty to one. Holtz-Eakin says conservatives have forgotten that, and lost their roots: “They’ve taken positions that are divorced from any reality on the policy and from their own history.”

The British journal Nature, one of the two leading periodicals of science, grieved the day as a new low point for science in America: “Misinformation was presented as fact, truth was twisted and nobody showed any inclination to listen to scientists, let alone learn from them. It has been an embarrassing display, not just for the Republican Party but also for Congress and the US citizens it represents.”

The industry had directly invested $721 million, and had certainly paid hundreds of millions of dollars more through contributions to 118 anti-climate front groups, astroturf organizations, and aligned think tanks. Of these investments, the fossil-fuel industry directly contributed more than $64 million to candidates and political parties, spent more than $163 million on television ads across the country (many of them on climate change), and paid almost $500 million to Washington lobbyists.

In the end, this is where the rubber meets the road: legal and policy action based on extending existing legal theory to apply sanctions when companies promote antiscience propaganda — that is, propaganda that flies in the face of a substantial body of known evidence from science — for the purposes of defrauding investors or forestalling public regulation that affects their business models.

You can find more of the common climate-denial myths, as well as both short and long answers debunking them, along with the associated science, at the excellent website SkepticalScience.com.

PART IV - Winning the War


We’ve now exposed what is happening in the three major fronts of the war on science:

1. The postmodernist front being fought by academics, activists, and journalists on the secular left, who elevate subjectivity at the expense of objectivity, which they deny exists. Their actions provide philosophical support for:
2. The ideological front being fought by religious fundamentalists, who object to the emerging scientific mastery of reproduction and the life cycle, and seek to redefine scientific terms according to their own values and to debate science as if it were an opinion. They are often the foot soldiers for:
3. The industry and public-relations front, financed by corporations and conducted by PR experts, shills, and front groups, who take advantage of journalists’ naivety about objectivity and truth in order to manipulate the media, thereby shaping public opinion using “uncertainties,” deception, personal attacks, and outrage to move public policy toward an antiscience position that supports the funders’ business objectives.

the cooption of scientific societies by government operatives, like the American Psychological Association’s collusion with the Bush-era CIA’s torture policies

Hardin’s paper was remarkable because it offered such a sound rebuttal to the ideas of the Scottish economist Adam Smith, whose collaborator and mentor was David Hume.

The simple dilemma that drives the tragedy of the commons is writ large in the greatest political argument of our time: the clash between individualism and collectivism at the heart of every environmental issue.

Former NOAA administrator Jane Lubchenco thinks we can. Lubchenco is one of America’s leading voices on tackling climate disruption, but she challenges the idea of a limited world, pointing out that science has consistently expanded the economy beyond the zero-sum days of Thomas Hobbes. “By creating new knowledge, science changed economics away from a zero-sum game,” she says.

If ignorance is tyranny because it removes choices, we need to develop a way to quantify the cost of the unknown wealth — or choices — we may be leaving on the table — in other words, our opportunity cost. This opportunity cost, and arriving at a present value for the unknown economic bounty our current resource stock may contain, is the most fundamental value equation we need to solve.

In 1997, ecological economist Robert Costanza set out to answer those questions. He and a team of fellow researchers embarked on a quest for metrics that would put a value on the ecosystem services that nature provides to our economy. Published, appropriately, in Nature, the paper had a powerful effect because it put a price tag on the commons, showing that the annual value of the world’s ecosystem services to the economy was at least $33 trillion, or nearly twice as much as the $18 trillion that made up the world’s combined gross national product—the sum total annual value of each nation’s traded goods and services—and possibly as much as $54 trillion, depending on certain assumptions, or three times the total world economic activity. For comparison, the US gross domestic product in 1996 was about $6.9 trillion. The economics behind this valuation were hotly debated, but ultimately the paper held up to the scrutiny it received.

After Tony Blair was elected British prime minister in 1997, the Labour Party’s first conference was on climate change. Chief scientific advisor Lord Robert May had prepared a report that Blair presented to all attendees, and it helped set a national course. “The wheels of government grind slowly,” says May, but Britain’s Climate Change Act, adopted a few years later, provided for specific carbon targets and established “a Climate Change Committee to recommend the targets and monitor progress towards them.”

John Hubble, father of Edwin Hubble, was a staunch Baptist and an insurance underwriter, and he wrote in 1900 about the core human conflict of self versus the collective that was captured by insurance. “The best definition we have found for civilization,” he wrote, “is that a civilized man does what is best for all, while the savage does what is best for himself. Civilization is but a huge mutual insurance company against human selfishness.”

Winning the war on science is this generation’s calling.

Chapter 12 - BATTLE PLANS

We’ve also looked at the issues that form the conditions — the intellectual soil, if you will — of the whole debate: that science has succeeded beyond our wildest dreams, and that, in so doing, it has torn away some of the spiritual mysteries of life and disrupted our sense of our place in the cosmos. But more importantly, it has enabled us to increase our population and our environmental impact beyond the capacity of our one small planet to support us. This, above all else, is breaking apart the foundation on which modern society has been built—that individuals, acting in their own self-interest in a free marketplace, can deliver the highest and most efficient good to society, and that such economic activity can expand without limit. Population plus individualism plus technology may be our ultimate undoing.

Consider one of the most successful legacy organizations now tackling the antiscience crisis, the Union of Concerned Scientists. Originally formed by scientists concerned about nuclear proliferation, since the beginning of the twenty-first century the organization has broadened its focus to target antiscience efforts such as the climate battle, to take on the US federal government over lapses in scientific integrity, and to form a new Center for Science and Democracy that focuses on many of the questions discussed in this book.

Climate Progress, an arm of the progressive think tank the Center for American Progress, founded by ScienceDebate supporter and former Bill Clinton chief of staff John Podesta, is edited by science blogger, author, and physicist Joe Romm, who has become one of the most influential thought leaders on climate change.

The accelerating quantity and complexity of science is producing a depth and breadth of knowledge no longer possible for any one voter to attain. This has opened up an opportunity for antiscience campaigns to gain an unprecedented foothold in the democratic process, undermining the role of science and data in decision making. At the same time, scientific knowledge now plays a major role in most public policy challenges, and is the main arbiter and protector of individual freedom and social justice. A question arises: how best to bridge the gap between the voter and science so that democracy can be preserved?

The president of the United States has a science advisor, as does the US secretary of state, as do prime ministers of several Western countries. But every legislator and executive at every level of government, from international to national to state to large municipalities, needs science advisors to navigate today’s science-driven policy issues intelligently and effectively.

The magnitude of the current issues suggests that this would be a very good time for church leaders to reach out to scientists. The West is in a moral, intellectual, economic, and ecological crisis, and it matters little whether a preacher is conservative or progressive if he or she is incorporating knowledge into moral reflections. As a new Catholicism may be emerging, so too is it time for a new Protestantism, and for the spirit of Luther’s questioning to be reborn, and a new Islam, in which the ideas of equality and science that are implicit in Islam, the one-time protector of science, are embraced anew.

The debate should be conducted in the Oxford style, with the class voting before the debate on whether they agree with the proposition or not, then voting again after the debate, and comparing the two votes to determine which argument was more convincing and thus who won the debate.

Parents in China, whose children score at the top of international science rankings, tend to regard science proficiency as a matter of effort: if you don’t do well in science class, it’s because you weren’t working hard enough. There’s considerable evidence that this parental attitude is perhaps the largest factor in these students’ superior performance. Chinese parents are also more likely to help their kids with science homework and to make sure they understand it. This insight has been borne out by other studies of cross-cultural and ethnic differences between low-scoring and high-scoring countries, which consistently find that proactive parental involvement in kids’ science homework and studies is important. This is especially true in highly diverse Western countries, where racial and cultural diversity create challenges. Finland tends to perform near the top of international rankings, in part because of the homogeneity of its student bodies: a homogeneous student body requires far less management time, allowing more focus on academics. Even among diverse American classes, however, Chinese-American students often outperform their peers because of their parents’ attitude and support.

Science is always political. It is an antiauthoritarian activity. But it is never partisan, left or right.

Much could be done to reintegrate science into public life if granting agencies required that principal investigators — those in charge of the lab a grant is funding — hire science communicators to explain the importance of the lab’s work to the public, to act as liaisons, to promote the work, and to organize public interaction with the lab.

A not-insignificant reason the public distrusts science is because of scientists who engage in unethical behavior, such as misconduct for personal gain or overdramatized conclusions to get attention in the media. Considering all the problems that have been caused by these breaches, it is surprising that there is no broad scientific code of ethical conduct, no Hippocratic oath for scientists, beyond the scientific method itself.

The US National Academies recently looked at part of this when it issued a 2015 report, Diplomacy for the 21st Century: Embedding a Culture of Science and Technology Throughout the Department of State.

The problem is that the rest of society is changing fast, and federal departments and agencies need to change fast as well.

In order to reflect candidates’ commitments to basing public-policy decisions on knowledge, versus opinion or belief, we need a vehicle. The Contract from America, the Taxpayer Protection Pledge, and the No Climate Tax Pledge have all sought to restrict reasoned debate. We need a pledge that seeks to expand it. Citizens can print out the Science Pledge and challenge their elected leaders to sign it.

The fact that journalists have fallen prey to the war on science is now well established. Journalism schools teach there is no such thing as objectivity, new reporter guidelines contain the phrase, and top journalists repeat it in speeches.

Max Nova

I love books! My reading theme for 2017 is "The Integrity of Western Science." I'm also the founder of www.SilviaTerra.com.
