The Demon-Haunted World: Science as a Candle in the Dark

When I picked up Carl Sagan's "The Demon-Haunted World: Science as a Candle in the Dark", I wasn't expecting such a thorough treatment of alien abduction claims! This book is Sagan's plea for our society to adopt a more scientific perspective as we confront both personal and global challenges. In this wide-ranging and somewhat unfocused presentation of his views, Sagan takes us from classical Greece and medieval witchcraft trials to postmodern academic theorizing and nuclear winter. Yet the tone feels inconsistent between chapters, and the sheer scope of the book is a bit overwhelming. Within it, you can find Sagan holding forth on everything from demonology to the removal of old statues (he's against it). There are a ton of great lines in here, and I loved some of the epigraphs he used to begin each chapter.

Sagan devotes a large chunk of the book to purported alien abductions. Through this lens, he explores how to differentiate science from pseudoscience and the damage that the latter does. Sagan suggests that many abduction experiences are actually psychological reactions to the repression of childhood sexual abuse, and he surprised me by drawing a connection between the modern alien abduction phenomenon and the "demon" hysteria of the Middle Ages. It's a seductive conjecture and an entertaining way to explore the fallacies of pseudoscience, but his major claims about childhood abuse and medieval demons are unsupported by any evidence.

Furthermore, Sagan himself was guilty of moving beyond the strict bounds of science in his own career. He briefly touches on "nuclear winter" as "the most controversial scientific debate I've been involved in" and notes that his own projections were shown to be incorrect when measured against the actual effects of the 1991 Kuwaiti oil well fires. He doesn't really own up to how much he publicly hyped the idea of nuclear winter - from what I can tell, he was nearly as bad as Ehrlich and his "Population Bomb." In his 2003 Caltech Michelin Lecture, Michael Crichton drags Sagan pretty hard for his nuclear winter advocacy, and he seems to have deserved it. Yet the idea of nuclear winter persists to this day - a whole chapter of ["Warnings: Finding Cassandras to Stop Catastrophes"](/warnings/) is dedicated to it.

Where Sagan really shines is when he talks about the role of skepticism in society:

In those cultures lacking unfamiliar challenges, external or internal, where fundamental change is unneeded, novel ideas need not be encouraged. Indeed, heresies can be declared dangerous; thinking can be rigidified; and sanctions against impermissible ideas can be enforced - all without much harm

This paragraph in particular rang with echoes of David Deutsch's "static vs. dynamic" society argument in his ["Beginning of Infinity"](/beginning-of-infinity) - one of my favorite books.

Overall, this was an interesting read that could have benefited from tighter composition and a bit more personal honesty. But there are certainly gems in here that I'll be referencing for quite some time.

My highlights below:


My parents were not scientists. They knew almost nothing about science. But in introducing me simultaneously to skepticism and to wonder, they taught me the two uneasily cohabiting modes of thought that are central to the scientific method.


All our science, measured against reality, is primitive and childlike and yet it is the most precious thing we have. -ALBERT EINSTEIN

He wanted to know about science. It’s just that all the science had gotten filtered out before it reached him. Our cultural motifs, our educational system, our communications media had failed this man. What the society permitted to trickle through was mainly pretense and confusion. It had never taught him how to distinguish real science from the cheap imitation. He knew nothing about how science works.

Spurious accounts that snare the gullible are readily available. Skeptical treatments are much harder to find. Skepticism does not sell well.

Science arouses a soaring sense of wonder. But so does pseudoscience. Sparse and poor popularizations of science abandon ecological niches that pseudoscience promptly fills. If it were widely understood that claims to knowledge require adequate evidence before they can be accepted, there would be no room for pseudoscience. But a kind of Gresham’s Law prevails in popular culture by which bad science drives out good.

Surveys suggest that some 95 percent of Americans are “scientifically illiterate.” That’s just the same fraction as those African Americans, almost all of them slaves, who were illiterate just before the Civil War

A God of the Gaps is assigned responsibility for what we do not yet understand.

It was very like what the historian Edward Gibbon described for the entire Eastern Empire, whose capital was Constantinople: In the revolution of ten centuries, not a single discovery was made to exalt the dignity or promote the happiness of mankind. Not a single idea had been added to the speculative systems of antiquity, and a succession of patient disciples became in their turn the dogmatic teachers of the next servile generation.

If the world is to escape the direst consequences of global population growth and 10 or 12 billion people on the planet in the late twenty-first century, we must invent safe but more efficient means of growing food — with accompanying seed stocks, irrigation, fertilizers, pesticides, transportation and refrigeration systems. It will also take widely available and acceptable contraception, significant steps toward political equality of women, and improvements in the standards of living of the poorest people. How can all this be accomplished without science and technology?

Roughly half the scientists on Earth work at least part-time for the military.

Advances in medicine and agriculture have saved vastly more lives than have been lost in all the wars in history.

The sword of science is double-edged. Its awesome power forces on all of us, including politicians, but of course especially on scientists, a new responsibility — more attention to the long-term consequences of technology, a global and transgenerational perspective, an incentive to avoid easy appeals to nationalism and chauvinism. Mistakes are becoming too expensive.

In The Genealogy of Morals, Friedrich Nietzsche, as so many before and after, decries the “unbroken progress in the self-belittling of man” brought about by the scientific revolution. Nietzsche mourns the loss of “man’s belief in his dignity, his uniqueness, his irreplaceability in the scheme of existence.” For me, it is far better to grasp the Universe as it really is than to persist in delusion, however satisfying and reassuring. Which attitude is better geared for our long-term survival? Which gives us more leverage on our future? And if our naïve self-confidence is a little undermined in the process, is that altogether such a loss? Is there not cause to welcome it as a maturing and character-building experience?

Pseudoscience speaks to powerful emotional needs that science often leaves unfulfilled.

The continuum stretching from ill-practiced science, pseudoscience, and superstition (New Age or Old), all the way to respectable mystery religion, based on revelation, is indistinct.

Pseudoscience is just the opposite. Hypotheses are often framed precisely so they are invulnerable to any experiment that offers a prospect of disproof, so even in principle they cannot be invalidated. Practitioners are defensive and wary. Skeptical scrutiny is opposed. When the pseudoscientific hypothesis fails to catch fire with scientists, conspiracies to suppress it are deduced.

The method of science, as stodgy and grumpy as it may seem, is far more important than the findings of science.

Britain had such a Prime Minister in Margaret Thatcher. Her early studies in chemistry, in part under the tutelage of Nobel Laureate Dorothy Hodgkin, were key to the U.K.’s strong and successful advocacy that ozone-depleting CFCs be banned worldwide.


Science is more than a body of knowledge; it is a way of thinking. I have a foreboding of an America in my children’s or grandchildren’s time — when the United States is a service and information economy; when nearly all the key manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness.

One of the great commandments of science is, “Mistrust arguments from authority.”

Science is not only compatible with spirituality; it is a profound source of spirituality. When we recognize our place in an immensity of light-years and in the passage of ages, when we grasp the intricacy, beauty, and subtlety of life, then that soaring feeling, that sense of elation and humility combined, is surely spiritual.

Not every branch of science can foretell the future — paleontology can’t — but many can and with stunning accuracy.

Again, the reason science works so well is partly that built-in error-correcting machinery. There are no forbidden questions in science, no matters too sensitive or delicate to be probed, no sacred truths. That openness to new ideas, combined with the most rigorous, skeptical scrutiny of all ideas, sifts the wheat from the chaff. It makes no difference how smart, august, or beloved you are. You must prove your case in the face of determined, expert criticism. Diversity and debate are valued. Opinions are encouraged to contend—substantively and in depth.

This is one of the reasons that the organized religions do not inspire me with confidence. Which leaders of the major faiths acknowledge that their beliefs might be incomplete or erroneous and establish institutes to uncover possible doctrinal deficiencies?

The difference between physics and metaphysics, Wood concluded as he raised his glass high, is not that the practitioners of one are smarter than the practitioners of the other. The difference is that the metaphysicist has no laboratory.

In all uses of science, it is insufficient — indeed it is dangerous — to produce only a small, highly competent, well-rewarded priesthood of professionals. Instead, some fundamental understanding of the findings and methods of science must be available on the broadest scale.

The values of science and the values of democracy are concordant, in many cases indistinguishable. Science and democracy began — in their civilized incarnations — in the same time and place, Greece in the seventh and sixth centuries B.C.

Science is a way to call the bluff of those who only pretend to knowledge. It is a bulwark against mysticism, against superstition, against religion misapplied to where it has no business being.


Each field of science has its own complement of pseudoscience.

Economists have long-range economic forecasting. Meteorologists, so far, have long-range weather forecasting, as in the sunspot-oriented Farmer’s Almanac (although long-term climate forecasting is another matter).

They long for the scientific seal of approval, but are unwilling to put up with the rigorous standards of evidence that impart credibility to that seal.

Chapter 4 - ALIENS

I came upon a book called Extraordinary Popular Delusions and the Madness of Crowds, written by Charles Mackay in 1841,

An informative exposé by the journalist Jim Schnabel (Round in Circles; Penguin Books, 1994) — from which much of my account is taken — is in print. Schnabel joined the cerealogists early and in the end made a few successful pictograms himself. (He prefers a garden roller to a wooden plank, and found that simply stomping grain with one’s feet does an acceptable job.) But Schnabel’s work, which one reviewer called “the funniest book I’ve read in ages,” had only modest success. Demons sell; hoaxers are boring and in bad taste.


“Do you believe in UFOs?” I’m always struck by how the question is phrased, the suggestion that this is a matter of belief and not of evidence. I’m almost never asked, “How good is the evidence that UFOs are alien spaceships?”

As government deceit and conspiracies of silence have been exposed on so many other matters, it’s hard to argue that a coverup on this odd subject is impossible, that the government would never hide important information from its citizens.

The heyday of UFOs corresponds to the time when the main delivery vehicle for nuclear weapons was being switched from aircraft to missiles.

If we are convinced that the government is keeping visits of aliens from us, then we should take on the secrecy culture of the military and intelligence establishments.

With a few exceptions, secrecy is deeply incompatible with democracy and with science.

There is no difficulty in understanding the motivation of the hoaxers. A more or less typical example is the book of Deuteronomy — discovered hidden in the Temple in Jerusalem by King Josiah, who, miraculously, in the midst of a major reformation struggle, found in Deuteronomy confirmation of all his views.

Lorenzo of Valla was one of the polymaths of the Italian Renaissance. A controversialist, crusty, critical, arrogant, a pedant, he was attacked by his contemporaries for sacrilege, impudence, temerity and presumption — among other imperfections. After he concluded that the Apostles’ Creed could not on grammatical grounds have actually been written by the Twelve Apostles, the Inquisition declared him a heretic, and only the intervention of his patron, Alfonso, King of Naples, prevented his immolation. Undeterred, in 1440, he published a treatise demonstrating that the Donation of Constantine is a crude forgery. The language in which it was written was to fourth century court Latin as Cockney was to the King’s English. Because of Lorenzo of Valla, the Roman Catholic Church no longer presses its claim to rule European nations because of the Donation of Constantine. This work, whose provenance has a five-century hole in it, is generally understood to have been forged by a cleric attached to the Church’s curia around the time of Charlemagne, when the papacy (and especially Pope Adrian I) was arguing for unification of church and state.


[A]s children tremble and fear everything in the blind darkness, so we in the light sometimes fear what is no more to be feared than the things children in the dark hold in terror -LUCRETIUS, On the Nature of Things (ca. 60 B.C.)

Occasionally, I get a letter from someone who is in “contact” with extraterrestrials. I am invited to “ask them anything.” And so over the years I’ve prepared a little list of questions. The extraterrestrials are very advanced, remember. So I ask things like, “Please provide a short proof of Fermat’s Last Theorem.” Or the Goldbach Conjecture.

Such celebrated (and unhysterical) explorers as Admiral Richard Byrd, Captain Joshua Slocum, and Sir Ernest Shackleton all experienced vivid hallucinations when coping with unusual isolation and loneliness.

The Yale anthropologist Weston La Barre goes so far as to argue that “a surprisingly good case could be made that much of culture is hallucination,” and that “the whole intent and function of ritual appears to be... [a] group wish to hallucinate reality.”

Five to ten percent of us are extremely suggestible, able to move at a command into a deep hypnotic trance. Roughly ten percent of Americans report having seen one or more ghosts. This is more than the number who allegedly remember being abducted by aliens, about the same as the number who’ve reported seeing one or more UFOs, and less than the number who in the last week of Richard Nixon’s Presidency — before he resigned to avoid impeachment — thought he was doing a good-to-excellent job as President. At least 1 percent of all of us is schizophrenic. This amounts to over 50 million schizophrenics on the planet, more than the population of, say, England.

The American cartoonist Gary Larson, who draws in the horror genre, dedicates one of his books as follows: When I was a boy, our house was filled with monsters. They lived in the closets, under the beds, in the attic, in the basement, and — when it was dark — just about everywhere. This book is dedicated to my father, who kept me safe from all of them.

It makes good evolutionary sense for children to have fantasies of scary monsters. In a world stalked by lions and hyenas, such fantasies help prevent defenseless toddlers from wandering too far from their guardians. How can this safety machinery be effective for a vigorous, curious young animal unless it delivers industrial-strength terror?


Fear of things invisible is the natural seed of that which every one in himself calleth religion. -THOMAS HOBBES, Leviathan (1651)

The early Church Fathers, despite having imbibed Neo-Platonism from the culture they swam in, were anxious to separate themselves from “pagan” belief-systems. They taught that all of pagan religion consisted of the worship of demons and men, both misconstrued as gods. When St. Paul complained (Ephesians 6:12) about wickedness in high places, he was referring not to government corruption, but to demons, who lived in high places: For we wrestle not against flesh and blood, but against principalities, against powers, against the rulers of the darkness of this world, against spiritual wickedness in high places.

They can assume any form, and know many things — “demon” means “knowledge” in Greek —especially about the material world. However intelligent, they are deficient in charity. They prey on “the captive and outwitted minds of men,” wrote Tertullian. “They have their abode in the air, the stars are their neighbors, their commerce is with the clouds.”

Demons, the “powers of the air,” come down from the skies and have unlawful sexual congress with women. Augustine believed that witches were the offspring of these forbidden unions. In the Middle Ages, as in classical antiquity, nearly everyone believed such stories. The demons were also called devils, or fallen angels. The demonic seducers of women were labeled incubi; of men, succubi. There are cases in which nuns reported, in some befuddlement, a striking resemblance between the incubus and the priest-confessor, or the bishop, and awoke the next morning, as one fifteenth-century chronicler put it, to “find themselves polluted just as if they had commingled with a man.”

As they seduced, the incubi and succubi were perceived as a weight bearing down on the chest of the dreamer. Mare, despite its Latin meaning, is the Old English word for incubus, and nightmare meant originally the demon that sits on the chests of sleepers, tormenting them with dreams.

1400 years later, in his work De Daemonialitae, the Franciscan scholar Ludovico Sinistrari assures us that demons pass through walls.

Even humanists such as Desiderius Erasmus and Thomas More believed in witches. “The giving up of witchcraft,” said John Wesley, the founder of Methodism, “is in effect the giving up of the Bible.” William Blackstone, the celebrated jurist, in his Commentaries on the Laws of England (1765), asserted: To deny the possibility, nay, actual existence of witchcraft and sorcery is at once flatly to contradict the revealed word of God in various passages of both the Old and New Testament.

The Pope appointed Kramer and Sprenger to write a comprehensive analysis, using the full academic armory of the late fifteenth century. With exhaustive citations of Scripture and of ancient and modern scholars, they produced the Malleus Maleficarum, the “Hammer of Witches”—aptly described as one of the most terrifying documents in human history.

What the Malleus comes down to, pretty much, is that if you’re accused of witchcraft, you’re a witch.

The Malleus in hand, the Pope’s encouragement guaranteed, inquisitors began springing up all over Europe. It quickly became an expense account scam.

Innocent himself died in 1492, following unsuccessful attempts to keep him alive by transfusion (which resulted in the deaths of three boys) and by suckling at the breast of a nursing mother. He was mourned by his mistress and their children.

Those responsible for prosecuting, torturing, judging, burning, and justifying were selfless. Just ask them.

In the sixteenth century the scholar William Tyndale had the temerity to contemplate translating the New Testament into English. But if people could actually read the Bible in their own language instead of arcane Latin, they could form their own, independent religious views. They might conceive of their own private unintermediated line to God. This was a challenge to the job security of Roman Catholic priests. When Tyndale tried to publish his translation, he was hounded and pursued all over Europe. Eventually he was captured, garroted, and then, for good measure, burned at the stake. His copies of the New Testament (which a century later became the basis of the exquisite King James translation) were then hunted down house-to-house by armed posses — Christians piously defending Christianity by preventing other Christians from knowing the words of Christ.

We still use the word “pandemonium” (literally, all demons).

More than half of Americans tell pollsters they “believe” in the Devil’s existence, and 10 percent have communicated with him, as Martin Luther reported he did regularly.

There is no spaceship in these stories. But most of the central elements of the alien abduction account are present, including sexually obsessive non-humans who live in the sky, walk through walls, communicate telepathically, and perform breeding experiments on the human species. Unless we believe that demons really exist, how can we understand so strange a belief system, embraced by the whole Western world (including those considered the wisest among us), reinforced by personal experience in every generation, and taught by Church and State? Is there any real alternative besides a shared delusion based on common brain wiring and chemistry?

In Talmudic tradition the archetypical succubus was Lilith, whom God made from the dust along with Adam. She was expelled from Eden for insubordination — not to God, but to Adam. Ever since, she spends her nights seducing Adam’s descendants.

Put aside Gibbon’s social snobbery: The devil tormented the upper classes too, and even a king of England — James I, the first Stuart monarch — wrote a credulous and superstitious book on demons (Daemonologie, 1597). He also was the patron of the great translation of the Bible into English that still bears his name.

In earlier times we recognized them as gods, demons, fairies, or spirits; only now do we understand that it’s aliens who’ve been diddling us all these millennia.


But we have no apparitions cautioning the Church against, say, accepting the delusion of an Earth-centered Universe, or warning it of complicity with Nazi Germany — two matters of considerable moral as well as historical import, on which Pope John Paul II, to his credit, has admitted that the Church has erred.

Chapter 9 - THERAPY

In his book Abductions, Mack explicitly proposes the very dangerous doctrine that “the power or intensity with which something is felt” is a guide to whether it’s true.

The more I look into claims of alien abduction, the more similar they seem to reports of “recovered memories” of childhood sexual abuse.


[M]agic, it must be remembered, is an art which demands collaboration between the artist and his public. -E. M. BUTLER, The Myth of the Magus (1948)

Claims that cannot be tested, assertions immune to disproof are veridically worthless, whatever value they may have in inspiring us or in exciting our sense of wonder. What I’m asking you to do comes down to believing, in the absence of evidence, on my say-so.

Such “explanations” can explain anything, and therefore in fact nothing.

But, tellingly, when he tries to describe them, he reaches for physics and mathematics. He wants it both ways — the language and credibility of science, but without being bound by its method and rules. He seems not to realize that the credibility is a consequence of the method.

Keeping an open mind is a virtue — but, as the space engineer James Oberg once said, not so open that your brains fall out.


How is it, I ask myself, that channelers never give us verifiable information otherwise unavailable? Why does Alexander the Great never tell us about the exact location of his tomb, Fermat about his Last Theorem, John Wilkes Booth about the Lincoln assassination conspiracy, Hermann Göring about the Reichstag fire? Why don’t Sophocles, Democritus, and Aristarchus dictate their lost books? Don’t they wish future generations to have access to their masterpieces?

People pay attention to these puerile marvels mainly because they promise something like old-time religion, but especially life after death, even life eternal.

T. H. Huxley’s formulation was "The foundation of morality is to... give up pretending to believe that for which there is no evidence, and repeating unintelligible propositions about things beyond the possibilities of knowledge."

Among the tools:

  • Wherever possible there must be independent confirmation of the “facts.”
  • Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  • Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  • Spin more than one hypothesis.
  • Quantify.
  • If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  • Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
  • Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much.

inconsistency (e.g., Prudently plan for the worst of which a potential military adversary is capable, but thriftily ignore scientific projections on environmental dangers because they’re not “proved.” Or: Attribute the declining life expectancy in the former Soviet Union to the failures of communism many years ago, but never attribute the high infant mortality rate in the United States (now highest of the major industrial nations) to the failures of capitalism. Or: Consider it reasonable for the Universe to continue to exist forever into the future, but judge absurd the possibility that it has infinite duration into the past);

short-term vs. long-term — a subset of the excluded middle, but so important I’ve pulled it out for special attention (e.g., We can’t afford programs to feed malnourished children and educate preschool kids. We need to urgently deal with crime on the streets. Or: Why explore space or pursue fundamental science when we have so huge a budget deficit?);

Talleyrand said, “An important art of politicians is to find new names for institutions which under old names have become odious to the public”).

When the first work was published in the scientific literature in 1953 showing that the substances in cigarette smoke when painted on the backs of rodents produce malignancies, the response of the six major tobacco companies was to initiate a public relations campaign to impugn the research, sponsored by the Sloan Kettering Foundation. This is similar to what the Du Pont Corporation did when the first research was published in 1974 showing that their Freon product attacks the protective ozone layer.

And if they missed something, if independent scientists suggest a hazard, why would the companies protest? Would they rather kill people than lose profits? If, in an uncertain world, an error must be made, shouldn’t it be biased toward protecting customers and the public? And, incidentally, what do these cases say about the ability of the free enterprise system to police itself? Aren’t these instances where at least some government intrusion is in the public interest?

A more cynical formulation by the Roman historian Polybius: Since the masses of the people are inconstant, full of unruly desires, passionate, and reckless of consequences, they must be filled with fears to keep them in order. The ancients did well, therefore, to invent gods, and the belief in punishment after death.


And even the Apostle Paul, so credulous on so many matters, counsels us to “prove all things.”

Few rise to this challenge as energetically as James “The Amazing” Randi, accurately self-described as an angry man. He is angry not so much about the survival into our day of antediluvian mysticism and superstition, but about how uncritical acceptance of mysticism and superstition works to defraud, to humiliate, and sometimes even to kill.

He has received wide recognition among scientists and is a recipient of the MacArthur Foundation (so-called “genius”) Prize Fellowship. One critic castigated him for being “obsessed with reality.”

Secondly, the exposé of fraud and error in science is made almost exclusively by science. The discipline polices itself — meaning that scientists are aware of the potential for charlatanry and mistakes. But the exposure of fraud and error in faith-healing is almost never done by other faith-healers. Indeed, it is striking how reluctant the churches and synagogues are in condemning demonstrable deception in their midst.

When blindfolded patients are deceived into believing they’re being touched by a leaf such as poison ivy or poison oak, they produce an ugly red contact dermatitis.

One of the saddest lessons of history is this: If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we’ve been taken. Once you give a charlatan power over you, you almost never get it back. So the old bamboozles tend to persist as the new ones rise.

Chapter 14 - ANTISCIENCE

If the established framework of science is plausibly in error (or arbitrary, or irrelevant, or unpatriotic, or impious, or mainly serving the interests of the powerful), then perhaps we can save ourselves the trouble of understanding what so many people think of as a complex, difficult, highly mathematical, and counterintuitive body of knowledge. Then all the scientists would have their comeuppance. Science envy could be transcended. Those who have pursued other paths to knowledge, those who have secretly harbored beliefs that science has scorned, could now have their place in the Sun.

So how is shamanistic or theological or New Age doctrine different from quantum mechanics? The answer is that even if we cannot understand it, we can verify that quantum mechanics works.

Another important distinction was suggested in Reason and Nature, the 1931 book by Morris Cohen, a celebrated philosopher of science: To be sure, the vast majority of people who are untrained can accept the results of science only on authority. But there is obviously an important difference between an establishment that is open and invites every one to come, study its methods, and suggest improvement, and one that regards the questioning of its credentials as due to wickedness of heart, such as [Cardinal] Newman attributed to those who questioned the infallibility of the Bible... Rational science treats its credit notes as always redeemable on demand, while non-rational authoritarianism regards the demand for the redemption of its paper as a disloyal lack of faith.

As brilliant, widely read, and sober a historian as Edward Gibbon would not meet with Benjamin Franklin when they found themselves at the same English country inn—because of the late unpleasantness of the American Revolution. (Franklin then volunteered source material to Gibbon when he turned, as Franklin was sure he soon would, from the decline and fall of the Roman Empire to the decline and fall of the British Empire. Franklin was right about the British Empire, but his timetable was about two centuries early.)

And yet who would deny that there were actual sequences of historical events, with real causal threads, even if our ability to reconstruct them in their full weave is limited, even if the signal is awash in an ocean of self-congratulatory noise? The danger of subjectivity and prejudice has been apparent from the beginning of history. Thucydides warned against it. Cicero wrote: The first law is that the historian shall never dare to set down what is false; the second, that he shall never dare to conceal the truth; the third, that there shall be no suspicion in his work of either favoritism or prejudice.

Lucian of Samosata, in How History Should Be Written, published in the year 170, urged “The historian should be fearless and incorruptible; a man of independence, loving frankness and truth.”

Science is a collective enterprise with the error-correction machinery often running smoothly. It has an overwhelming advantage over history, because in science we can do experiments.

In those historical sciences where you cannot arrange a rerun, you can examine related cases and begin to recognize their common components. We can’t make stars explode at our convenience, nor can we repeatedly evolve through many trials a mammal from its ancestors. But we can simulate some of the physics of supernova explosions in the laboratory, and we can compare in staggering detail the genetic instructions of mammals and reptiles.

The American revolutionary Ethan Allen — leader of the Green Mountain Boys in their capture of Fort Ticonderoga — had some words on this subject: Those who invalidate reason ought seriously to consider whether they argue against reason with or without reason; if with reason, then they establish the principle that they are laboring to dethrone: but if they argue without reason (which, in order to be consistent with themselves they must do), they are out of reach of rational conviction, nor do they deserve a rational argument.

It might be useful for scientists now and again to list some of their mistakes. It might play an instructive role in illuminating and de-mythologizing the process of science and in enlightening younger scientists. Even Johannes Kepler, Isaac Newton, Charles Darwin, Gregor Mendel, and Albert Einstein made serious mistakes.

But why does it matter what biases and emotional predispositions scientists bring to their studies — so long as they are scrupulously honest and other people with different proclivities check their results? Presumably no one would argue that the conservative view on the sum of 14 and 27 differs from the liberal view, or that the mathematical function that is its own derivative is the exponential in the northern hemisphere but some other function in the southern.

Mathematics might be prized or ignored, but it is equally true everywhere — independent of ethnicity, culture, language, religion, ideology.

Darwin was about to become a minister of the Church of England when the opportunity to sail on H.M.S. Beagle presented itself.

He was himself almost thrown off the Beagle by Captain FitzRoy for his militant opposition to the Captain’s racism. Darwin was head and shoulders above most of his contemporaries in this regard. But again, even if he was not, how does it affect the truth or falsity of natural selection? Thomas Jefferson and George Washington owned slaves; Albert Einstein and Mohandas Gandhi were imperfect husbands and fathers. The list goes on indefinitely. We are all flawed and creatures of our times. Is it fair to judge us by the unknown standards of the future?

“A new era of the magical explanation of the world is rising,” said Adolf Hitler, “an explanation based on will rather than knowledge. There is no truth, in either the moral or the scientific sense.”

The idea that some state-endorsed ideology or popular prejudice would hog-tie scientific progress seems unthinkable. For 200 years Americans have prided themselves on being a practical, pragmatic, nonideological people. And yet anthropological and psychological pseudoscience has flourished in the United States — on race, for example. Under the guise of “creationism,” a serious effort continues to be made to prevent evolutionary theory — the most powerful integrating idea in all of biology, and essential for other sciences ranging from astronomy to anthropology — from being taught in the schools.

Chapter 15 - NEWTON’S SLEEP

[I]gnorance more frequently begets confidence than does knowledge: it is those who know little, and not those who know much, who so positively assert that this or that problem will never be solved by science. -CHARLES DARWIN, Introduction, The Descent of Man (1871)

There is a strangely waxing academic opinion, with roots in the 1960s, that holds all views to be equally arbitrary and “true” or “false” to be a delusion. Perhaps it is an attempt to turn the tables on scientists who have long argued that literary criticism, religion, aesthetics, and much of philosophy and ethics are mere subjective opinion, because they cannot be demonstrated like a theorem in Euclidean geometry nor put to experimental test.

Why is the prayer needed? Didn’t God know of the drought? Was he unaware that it threatened the bishop’s parishioners? What is implied here about the limitations of a supposedly omnipotent and omniscient deity?

What about longevity through prayer? The Victorian statistician Francis Galton argued that — other things being equal — British monarchs ought to be very long-lived, because millions of people all over the world daily intoned the heartfelt mantra “God Save the Queen” (or King). Yet, he showed, if anything, they don’t live as long as other members of the wealthy and pampered aristocratic class. Tens of millions of people in concert publicly wished (although they did not exactly pray) that Mao Zedong would live “for ten thousand years.” Nearly everyone in ancient Egypt exhorted the gods to let the Pharaoh live “forever.” These collective prayers failed. Their failure constitutes data.

Open and vigorous debate, even the consecration of doubt, is a Christian tradition going back to John Milton’s Areopagitica (1644).

In theological discussion with religious leaders, I often ask what their response would be if a central tenet of their faith were disproved by science. When I put this question to the current, Fourteenth, Dalai Lama, he unhesitatingly replied as no conservative or fundamentalist religious leaders do: In such a case, he said, Tibetan Buddhism would have to change. Even, I asked, if it’s a really central tenet, like (I searched for an example) reincarnation? Even then, he answered. However — he added with a twinkle — it’s going to be hard to disprove reincarnation.

I suggest that in every one of these cases, religious or secular, we are much better off if we know the best available approximation to the truth — and if we keep before us a keen apprehension of the errors our interest group or belief system has committed in the past. In every case the imagined dire consequences of the truth being generally known are exaggerated. And again, we are not wise enough to know which lies, or even which shadings of the facts, can competently serve some higher social purpose — especially in the long run.


Like assault weapons and market derivatives, the technologies that allow us to alter the global environment that sustains us should mandate caution and prudence. Yes, it’s the same old humans who have made it so far. Yes, we’re developing new technologies as we always have. But when the weaknesses we’ve always had join forces with a capacity to do harm on an unprecedented planetary scale, something more is required of us — an emerging ethic that also must be established on an unprecedented planetary scale.

From my point of view, the consequences of global nuclear war became much more dangerous with the invention of the hydrogen bomb, because airbursts of thermonuclear weapons are much more capable of burning cities, generating vast amounts of smoke, cooling and darkening the Earth, and inducing global-scale nuclear winter. This was perhaps the most controversial scientific debate I’ve been involved in (from about 1983–1990). Much of the debate was politically driven. The strategic implications of nuclear winter were disquieting to those wedded to a policy of massive retaliation to deter a nuclear attack, or to those wishing to preserve the option of a massive first strike. In either case, the environmental consequences work the self-destruction of any nation launching large numbers of thermonuclear weapons even with no retaliation from the adversary. A major segment of the strategic policy of decades, and the reason for accumulating tens of thousands of nuclear weapons, suddenly became much less credible.

Teller advocated exploding nuclear weapons from Alaska to South Africa, to dredge harbors and canals, to obliterate troublesome mountains, to do heavy earth-moving.

Also in the 1980s, Teller sold President Ronald Reagan the notion of Star Wars—called by them the “Strategic Defense Initiative,” SDI.

Ten thousand American scientists and engineers publicly pledged they would not work on Star Wars or accept money from the SDI organization. This provides an example of widespread and courageous non-cooperation by scientists (at some conceivable personal cost) with a democratic government that had, temporarily at least, lost its way.

Somehow, somewhere, he wants to believe, thermonuclear weapons, and he, will be acknowledged by the human species as its savior and not its destroyer.

The CIA Inspector General commented in 1995 that “absolute secrecy corrupts absolutely.” The most open and vigorous debate is often the only protection against the most perilous misuse of technology.

In Joshua and in the second half of Numbers is celebrated the mass murder of men, women, children, down to the domestic animals in city after city across the whole land of Canaan. Jericho is obliterated in a kherem, a “holy war.” The only justification offered for this slaughter is the mass murderers’ claim that, in exchange for circumcising their sons and adopting a particular set of rituals, their ancestors were long before promised that this land was their land. Not a hint of self-reproach, not a muttering of patriarchal or divine disquiet at these campaigns of extermination can be dug out of holy scripture. Instead, Joshua “destroyed all that breathed, as the Lord God of Israel commanded” (Joshua 10:40). And these events are not incidental, but central to the main narrative thrust of the Old Testament.

It is the particular task of scientists, I believe, to alert the public to possible dangers, especially those emanating from science or foreseeable through the use of science. Such a mission is, you might say, prophetic. Clearly the warnings need to be judicious and not more flamboyant than the dangers require; but if we must make errors, given the stakes, they should be on the side of safety.

Among the !Kung San hunter-gatherers of the Kalahari Desert, when two men, perhaps testosterone-inflamed, would begin to argue, the women would reach for their poison arrows and put the weapons out of harm’s way. Today our poison arrows can destroy the global civilization and just possibly annihilate our species. The price of moral ambiguity is now too high. For this reason — and not because of its approach to knowledge — the ethical responsibility of scientists must also be high, extraordinarily high, unprecedentedly high. I wish graduate science programs explicitly and systematically raised these questions with fledgling scientists and engineers. And sometimes I wonder whether in our society, too, the women — and the children — will eventually put the poison arrows out of harm’s way.


So the law strives for an impossible standard of accuracy, and we do the best we can.

One of my favorite cartoons shows a fortune-teller scrutinizing the mark’s palm and gravely concluding, “You are very gullible.”

In France there are more astrologers than Roman Catholic clergy.

As I’ve tried to stress, at the heart of science is an essential balance between two seemingly contradictory attitudes—an openness to new ideas, no matter how bizarre or counterintuitive, and the most ruthlessly skeptical scrutiny of all ideas, old and new.


But none of these civilizations, Cromer argues, had developed the skeptical, inquiring, experimental method of science. All of that came out of ancient Greece: The development of objective thinking by the Greeks appears to have required a number of specific cultural factors. First was the assembly, where men first learned to persuade one another by means of rational debate. Second was a maritime economy that prevented isolation and parochialism. Third was the existence of a widespread Greek-speaking world around which travelers and scholars could wander. Fourth was the existence of an independent merchant class that could hire its own teachers. Fifth was the Iliad and the Odyssey, literary masterpieces that are themselves the epitome of liberal rational thinking. Sixth was a literary religion not dominated by priests. And seventh was the persistence of these factors for 1,000 years. That all these factors came together in one great civilization is quite fortuitous; it didn’t happen twice.

The impediment to scientific thinking is not, I think, the difficulty of the subject. Complex intellectual feats have been mainstays even of oppressed cultures. Shamans, magicians, and theologians are highly skilled in their intricate and arcane arts. No, the impediment is political and hierarchical. In those cultures lacking unfamiliar challenges, external or internal, where fundamental change is unneeded, novel ideas need not be encouraged. Indeed, heresies can be declared dangerous; thinking can be rigidified; and sanctions against impermissible ideas can be enforced — all without much harm. But under varied and changing environmental or biological or political circumstances, simply copying the old ways no longer works. Then, a premium awaits those who, instead of blandly following tradition, or trying to foist their preferences onto the physical or social Universe, are open to what the Universe teaches. Each society must decide where in the continuum between openness and rigidity safety lies.

Likewise, in the history of ancient Greece, we can see nearly all significant events driven by the caprice of the gods in Homer, only a few events in Herodotus, and essentially none at all in Thucydides. In a few hundred years, history passed from god-driven to human-driven.


In a world in transition, students and teachers both need to teach themselves one essential skill — learning how to learn.

There are naïve questions, tedious questions, ill-phrased questions, questions put after inadequate self-criticism. But every question is a cry to understand the world. There is no such thing as a dumb question.

In 1993, the supreme religious authority of Saudi Arabia, Sheik Abdel-Aziz Ibn Baaz, issued an edict, or fatwa, declaring that the world is flat.

Those in America with the most favorable view of science tend to be young, well-to-do, college-educated white males. But three-quarters of new American workers in the next decade will be women, non-whites, and immigrants. Failing to rouse their enthusiasm — to say nothing of discriminating against them — isn’t only unjust, it’s also stupid and self-defeating. It deprives the economy of desperately needed skilled workers.

Suburban African-Americans with college-educated parents do just as well in college as suburban whites with college-educated parents. According to some statistics, enrolling a poor child in a Head Start program doubles his or her chances to be employed later in life; one who completes an Upward Bound program is four times as likely to get a college education. If we’re serious, we know what to do.

But nature is always more subtle, more intricate, more elegant than what we are able to imagine. Given our manifest human limitations, what is surprising is that we have been able to penetrate so far into the secrets of Nature.

Knowing and explaining, they say, are not the same thing. What’s the secret? There’s only one, I think: Don’t talk to the general audience as you would to your scientific colleagues. There are terms that convey your meaning instantly and accurately to fellow experts. You may parse these phrases every day in your professional work. But they do no more than mystify an audience of nonspecialists. Use the simplest possible language. Above all, remember how it was before you yourself grasped whatever it is you’re explaining. Remember the misunderstandings that you almost fell into, and note them explicitly. Keep firmly in mind that there was a time when you didn’t understand any of this either. Recapitulate the first steps that led you from ignorance to knowledge. Never forget that native intelligence is widely distributed in our species. Indeed, it is the secret of our success.

With the end of the Cold War, the national-defense trump card that provided support for all sorts of fundamental science became virtually unplayable. Only partly for this reason, most scientists, I think, are now comfortable with the idea of popularizing science. (Since nearly all support for science comes from the public coffers, it would be an odd flirtation with suicide for scientists to oppose competent popularization.)

As a youngster, I was inspired by the popular science books and articles of George Gamow, James Jeans, Arthur Eddington, J.B.S. Haldane, Julian Huxley, Rachel Carson, and Arthur C. Clarke — all of them trained in, and most of them leading practitioners of science.

Among the best contemporary scientist-popularizers, I think of Stephen Jay Gould, E. O. Wilson, Lewis Thomas, and Richard Dawkins in biology; Steven Weinberg, Alan Lightman, and Kip Thorne in physics; Roald Hoffmann in chemistry; and the early works of Fred Hoyle in astronomy. Isaac Asimov wrote capably on everything.

Chapter 20 - HOUSE ON FIRE

At the same time, children with special abilities and skills need to be nourished and encouraged. They are a national treasure. Challenging programs for the “gifted” are sometimes decried as “elitism.” Why aren’t intensive practice sessions for varsity football, baseball, and basketball players and interschool competition deemed elitism? After all, only the most gifted athletes participate. There is a self-defeating double standard at work here, nationwide.

What followed were images straight out of an America that many of us fear has vanished. In the tradition of pioneer barn raising, members of the community—bricklayers, doctors, carpenters, university professors, plumbers, farmers, the very young, and the very old—all rolled up their sleeves to build the Sciencenter. “The continuous seven-days-a-week schedule was maintained,” says Trautmann, “so that anyone would be able to help anytime. Everyone was given a job. Experienced volunteers built stairs, laid carpet and tile, and trimmed windows. Others painted, nailed, and carried supplies.” Some 2,200 townspeople donated more than 40,000 hours. Roughly 10 percent of the construction work was performed by people convicted of minor offenses; they preferred to do something for the community than to sit idle in jail. Ten months later, Ithaca had the only community-built science museum in the world.


We must not believe the many, who say that only free people ought to be educated, but we should rather believe the philosophers who say that only the educated are free. EPICTETUS, Roman philosopher and former slave, Discourses

“To make a contented slave,” Bailey later wrote, “it is necessary to make a thoughtless one. It is necessary to darken his moral and mental vision, and, as far as possible, to annihilate the power of reason.”

But Auld had revealed to Bailey the great secret: “I now understood... the white man’s power to enslave the black man. From that moment, I understood the pathway from slavery to freedom.”

With his knowledge of reading playing a key role in his escape, Bailey fled to New England, where slavery was illegal and black people were free. He changed his name to Frederick Douglass (after a character in Walter Scott’s The Lady of the Lake), eluded the bounty hunters who tracked down escaped slaves, and became one of the greatest orators, writers, and political leaders in American history. All his life, he understood that literacy had been the way out.

Books, purchasable at low cost, permit us to interrogate the past with high accuracy; to tap the wisdom of our species; to understand the point of view of others, and not just those in power; to contemplate — with the best teachers — the insights, painfully extracted from Nature, of the greatest minds that ever were, drawn from the entire planet and from all of our history.

Books are key to understanding the world and participating in a democratic society.

A national survey done for the U.S. Department of Education paints a picture of a country with more than 40 million barely literate adults. Other estimates are much worse. The literacy of young adults has slipped dramatically in the last decade. Only 3 to 4 percent of the population scores at the highest of five reading levels (essentially everybody in this group has gone to college). The vast majority have no idea how bad their reading is. Only 4 percent of those at the highest reading level are in poverty, but 43 percent of those at the lowest reading level are.

If no one close to you takes joy in reading, where is the evidence that it’s worth the effort? If the quality of education available to you is inadequate, if you’re taught rote memorization rather than how to think, if the content of what you’re first given to read comes from a nearly alien culture, literacy can be a rocky road.

Instead of showing an enthusiasm, a zest for learning — as most healthy youngsters do — the undernourished child becomes bored, apathetic, unresponsive. More severe malnutrition leads to lower birth weights and, in its most extreme forms, smaller brains. However, even a child who looks perfectly healthy but has not enough iron, say, suffers an immediate decline in the ability to concentrate. Iron-deficiency anemia may affect as much as a quarter of all low-income children in America; it attacks the child’s attention span and memory, and may have consequences reaching well into adulthood.

The National Center for Family Literacy, based in Louisville, Kentucky, has been implementing programs aimed at low-income families to teach both children and their parents to read. It works like this: The child, 3 to 4 years old, attends school three days a week along with a parent, or possibly a grandparent or guardian. While the grown-up spends the morning learning basic academic skills, the child is in a preschool class. Parent and child meet for lunch and then “learn how to learn together” for the rest of the afternoon.

The British Royal Governor of the Colony of Virginia wrote in 1671: I thank God there are no free schools nor printing; and I hope we shall not have [them] these [next] hundred years; for learning has brought disobedience, and heresy, and sects into the world, and printing has divulged them and libels against the best government. God keep us from both!

The sixth-grade textbooks of today are much less challenging than those of a few decades ago, while the literacy requirements at the workplace have become more demanding than ever before.

The gears of poverty, ignorance, hopelessness, and low self-esteem mesh to create a kind of perpetual failure machine that grinds down dreams from generation to generation. We all bear the cost of keeping it running. Illiteracy is its linchpin.


We also know how cruel the truth often is, and we wonder whether delusion is not more consoling. HENRI POINCARÉ (1854–1912)

It costs very little to hire a graduate student to read the script for scientific accuracy. But, so far as I can tell, this is almost never done.

There is a pressing national need for more public knowledge of science. Television cannot provide it all by itself. But if we want to make short-term improvements in the understanding of science, television is the place to start.


"Why should we subsidize intellectual curiosity?" -RONALD REAGAN, campaign speech, 1980

"There is nothing which can better deserve our patronage than the promotion of science and literature. Knowledge is in every country the surest basis of public happiness." -GEORGE WASHINGTON, address to Congress, January 8, 1790

Why subsidize geeks to pursue their absurd and incomprehensible little projects? Well, we know the answer to that: Science is supported because it provides spectacular benefits at all levels in the society, as I have argued earlier in this book. So those who find nerds distasteful, but at the same time crave the products of science, face a kind of dilemma. A tempting resolution is to direct the activities of the scientists. Don’t give them money to go off in weird directions; instead tell them what we need — this invention, or that process. Subsidize not the curiosity of the nerds, but what will benefit the society. It seems simple enough. The trouble is that ordering someone to go out and make a specific invention, even if price is no object, hardly guarantees that it gets done. There may be an underpinning of knowledge that’s unavailable, without which no one will ever build the contrivance you have in mind. And the history of science shows that often you can’t go after the underpinnings in a directed way, either. They may emerge out of the idle musings of some lonely young person off in the boondocks. They’re ignored or rejected even by other scientists, sometimes until a new generation of scientists comes along. Urging major practical inventions while discouraging curiosity-driven research would be spectacularly counterproductive.

He invented a mythical being, now called “Maxwell’s demon,” whose actions generated a paradox that took modern information theory and quantum mechanics to resolve.

If Queen Victoria had ever called an urgent meeting of her counselors, and ordered them to invent the equivalent of radio and television, it is unlikely that any of them would have imagined the path to lead through the experiments of Ampère, Biot, Oersted and Faraday, four equations of vector calculus, and the judgment to preserve the displacement current in a vacuum. They would, I think, have gotten nowhere. Meanwhile, on his own, driven only by curiosity, costing the government almost nothing, himself unaware that he was laying the ground for the Westminster Project, “Dafty” was scribbling away. It’s doubtful whether the self-effacing, unsociable Mr. Maxwell would even have been thought of to perform such a study. If he had, probably the government would have been telling him what to think about and what not, impeding rather than inducing his great discovery.

No one asked to pay for this had the foggiest idea of what a Higgs boson is. I’ve read some of the material intended to justify the SSC. At the very end, some of it wasn’t too bad, but there was nothing that really addressed what the project was about on a level accessible to bright but skeptical non-physicists. If physicists are asking for 10 or 15 billion dollars to build a machine that has no practical value, at the very least they should make an extremely serious effort, with dazzling graphics, metaphors, and capable use of the English language, to justify their proposal. More than financial mismanagement, budgetary constraints, and political incompetence, I think this is the key to the failure of the SSC.

But that aside, would free market forces be adequate to support basic research? Only about 10 percent of meritorious research proposals in medicine are funded today. More money is spent on quack medicine than on all of medical research. What would it be like if government opted out of medical research?


Ubi dubium ibi libertas: Where there is doubt, there is freedom. -LATIN PROVERB

At great personal risk, von Spee protested the witch mania. So did a few others, mainly Catholic and Protestant clergy who had witnessed these crimes at first hand—including Gianfrancesco Ponzinibio in Italy, Cornelius Loos in Germany, and Reginald Scot in Britain in the sixteenth century; as well as Johann Mayfurth [“Listen, you money-hungry judges and bloodthirsty prosecutors, the apparitions of the Devil are all lies”] in Germany and Alonzo Salazar de Frias in Spain in the seventeenth century. Along with von Spee and the Quakers generally, they are heroes of our species.

The last execution for witchcraft in Holland, cradle of the Enlightenment, was in 1610; in England, 1684; America, 1692; France, 1745; Germany, 1775; and Poland, 1793. In Italy, the Inquisition was condemning people to death until the end of the eighteenth century, and inquisitorial torture was not abolished in the Catholic Church until 1816. The last bastion of support for the reality of witchcraft and the necessity of punishment has been the Christian churches.

If we’re absolutely sure that our beliefs are right, and those of others wrong; that we are motivated by good, and others by evil; that the King of the Universe speaks to us, and not to adherents of very different faiths; that it is wicked to challenge conventional doctrines or to ask searching questions; that our main job is to believe and obey — then the witch mania will recur in its infinite variations down to the time of the last man. Note Friedrich von Spee’s very first point, and the implication that improved public understanding of superstition and skepticism might have helped to short-circuit the whole train of causality. If we fail to understand how it worked in the last round, we will not recognize it as it emerges in the next.

In George Orwell’s novel 1984, the “Big Brother” state employs an army of bureaucrats whose only job is to alter the records of the past so they conform to the interests of those currently in power. 1984 was not just an engaging political fantasy; it was based on the Stalinist Soviet Union, where the rewriting of history was institutionalized. Soon after Stalin took power, pictures of his rival Leon Trotsky — a monumental figure in the 1905 and 1917 revolutions — began to disappear. Heroic and wholly anhistoric paintings of Stalin and Lenin together directing the Bolshevik Revolution took their place, with Trotsky, the founder of the Red Army, nowhere in evidence.

It works to erase public memory of profound political mistakes, and thus to guarantee their eventual repetition.

Trends working at least marginally towards the implantation of a very narrow range of attitudes, memories, and opinions include control of major television networks and newspapers by a small number of similarly motivated powerful corporations and individuals, the disappearance of competitive daily newspapers in many cities, the replacement of substantive debate by sleaze in political campaigns, and episodic erosion of the principle of the separation of powers.

“There is no national science,” said the Russian playwright Anton Chekhov, “just as there is no national multiplication table.”

When Ann and I once asked Pauling about the roots of his dedication to social issues, he gave a memorable reply: “I did it to be worthy of the respect of my wife,” Helen Ava Pauling. He won a second Nobel Prize, this one in peace, for his work on the nuclear test ban, becoming the only person in history to win two unshared Nobel Prizes.


It is not the function of our government to keep the citizen from falling into error; it is the function of the citizen to keep the government from falling into error. - U.S. Supreme Court Justice Robert H. Jackson, 1950

One of the perquisites of power on becoming prime minister in China in the fifth century B.C. was that you got to construct a model state in your home district or province. It was Confucius’ chief life failing, he lamented, that he never got to try.

Jefferson was an early hero of mine, not because of his scientific interests (although they very much helped to mold his political philosophy), but because he, almost more than anyone else, was responsible for the spread of democracy throughout the world.

He displayed a bust of his arch-adversary Alexander Hamilton in the vestibule at Monticello.

I wish that the oath of citizenship taken by recent immigrants, and the pledge that students routinely recite, included something like “I promise to question everything my leaders tell me.” That would be really to Thomas Jefferson’s point.

They wrote their own speeches. They were realistic and practical, and at the same time motivated by high principles. They were not checking the pollsters on what to think this week. They knew what to think. They were comfortable with long-term thinking, planning even further ahead than the next election. They were self-sufficient, not requiring careers as politicians or lobbyists to make a living.

At that time, there were only about two and a half million citizens of the United States. Today there are about a hundred times more. So if there were ten people of the caliber of Thomas Jefferson then, there ought to be 10 × 100 = 1,000 Thomas Jeffersons today. Where are they?

A purported scientific article or popular book asserting the “superiority” of one race over another may not be censored by the government, no matter how pernicious it is; the cure for a fallacious argument is a better argument, not the suppression of ideas.

Jefferson made the same point even more strongly: “If a nation expects to be both ignorant and free in a state of civilization, it expects what never was and never will be.” In a letter to Madison, he continued the thought: “A society that will trade a little liberty for a little order will lose both, and deserve neither.”

When permitted to listen to alternative opinions and engage in substantive debate, people have been known to change their minds. It can happen. For example, Hugo Black, in his youth, was a member of the Ku Klux Klan; he later became a Supreme Court justice and was one of the leaders in the historic Supreme Court decisions, partly based on the 14th Amendment to the Constitution, that affirmed the civil rights of all Americans: It was said that when he was a young man, he dressed up in white robes and scared black folks; when he got older, he dressed up in black robes and scared white folks.

New ideas, invention, and creativity in general, always spearhead a kind of freedom — a breaking out from hobbling constraints. Freedom is a prerequisite for continuing the delicate experiment of science — which is one reason the Soviet Union could not remain a totalitarian state and be technologically competitive. At the same time, science — or rather its delicate mix of openness and skepticism, and its encouragement of diversity and debate — is a prerequisite for continuing the delicate experiment of freedom in an industrial and highly technological society.

In every country, we should be teaching our children the scientific method and the reasons for a Bill of Rights. With it comes a certain decency, humility and community spirit. In the demon-haunted world that we inhabit by virtue of being human, this may be all that stands between us and the enveloping darkness.