Kindly Inquisitors: The New Attacks on Free Thought

Jonathan Rauch forcefully defends freedom of speech and liberal science in his short and crisp "Kindly Inquisitors: The New Attacks on Free Thought". Originally published in 1993, the book is even more relevant today than it was back then - so much so that it was reissued in an expanded edition in 2013. With brisk and efficient clarity, Rauch exposes the authoritarian intellectual underpinnings of the nominally liberal thought police that hold sway over America's universities. He goes further, arguing that these initiatives - although carried out with the best of intentions - pose a grave danger to the very foundations of the liberal system. Ultimately, he says, "The answer to the question “Why tolerate hateful or misguided opinions?” has been the same ever since Plato unveiled his ghastly utopia: because the alternative is worse."

The book's tone is captured nicely by one of Rauch's incendiary passages:

If you are inclined to equate verbal offense with physical violence, think again about the logic of your position. If hurtful opinions are violence, then painful criticism is violence. In other words, on the humanitarian premise, science itself is a form of violence. What do you do about violence? You establish policing authorities — public or private — to stop it and to punish the perpetrators. You set up authorities empowered to weed out hurtful ideas and speech. In other words: an inquisition... It is bad enough to have to remind people that there is no right not to be offended, and that criticism is not the same as violence. It is deeply embarrassing to have to deliver this reminder to people at the center of American intellectual life.

The core of Rauch's argument is that "Epistemology — one’s view of who can have knowledge and when — is politics." That line could very well be the tagline for my 2017 reading theme on the integrity of Western science. This book gifted me a clear statement of how all of the philosophy of science reading I've been doing relates to larger political questions. Rauch has an explicitly Popperian view of the philosophy of science: "you may claim that a statement is established as knowledge only if it can be debunked, in principle, and only insofar as it withstands attempts to debunk it." For Rauch, skepticism and empiricism are the pillars of the liberal system. I'm a bit concerned about how his overall argument holds up in the face of the holes that the modern philosopher Godfrey-Smith pokes in Popper's theories, but to be fair, none of the critiques of free speech that I've ever heard have contested Popperian epistemology!

The book is full of little gems like those above - sentences that crystallize ideas that have been amorphously floating around in my brain all year. Rauch writes beautifully, tracing the genealogy of liberal thought and synthesizing millennia of Western history to make the case for why we must prevent restrictions on free speech and free thought. He recoils in horror from Plato's intellectually authoritarian regime. He rejoices in the skepticism of Montaigne. He whips us through a tour of major Enlightenment philosophers and ends with our good friend Sir Karl Popper.

Rauch then extends our intellectual journey up to the controversy over Salman Rushdie's "The Satanic Verses". Rushdie's searingly controversial novel earned him a big target on his back - in the form of a fatwa issued by Ayatollah Khomeini in Iran. For Rauch, this represented a defining moment in Western intellectual history:

It showed how readily Westerners could be backed away from a fundamental principle of intellectual liberalism, namely that there is nothing whatever wrong with offending — hurting people’s feelings — in pursuit of truth. That principle seemed to have been displaced by a belief in the right not to be offended, which was quickly gaining currency in America.

Rauch covers an astonishing amount of ground in this book, and I felt like I highlighted about half of it as I tore through it. His principled, clearly argued position on free speech was just the antidote I needed for the philosophical morass I got sucked into during undergrad. I'll close with another one of my favorite quotes - some practical advice for free-speech campaigners in the hostile territory of the postmodern academic left:

The standard answer to people who say they are offended should be: “Is there any casualty other than your feelings? Are you or others being threatened with violence or vandalism? No? Then it’s a shame your feelings are hurt, but that’s too bad. You’ll live.” If one is going to enjoy the benefits of living in a liberal society without being shamelessly hypocritical, one must try to be thick-skinned, since the way we make knowledge is by rubbing against one another.

My highlights below:


Upon this first, and in one sense this sole, rule of reason, that in order to learn you must desire to learn, and in so desiring not be satisfied with what you already incline to think, there follows one corollary, which itself deserves to be inscribed upon every wall of the city of philosophy: Do not block the way of inquiry. —Charles Sanders Peirce

FOREWORD

And, of course, the entitlement flourishes on campuses, where people are taught that taking offense is a sign of intellectual acuity and moral refinement. New rights tend to trump old rights, such as those protected, or so we once thought, by the First Amendment.

Persons who dismiss stories such as those of Keith John Sampson as merely “anecdotal” need to be reminded that the plural of “anecdote” is “data.”

The unvarnished truth is that some people derive intense pleasure from bossing around other people.

The stakes of politics were raised radically higher by a nineteenth-century intellectual invention — historicism. This theory holds that history has its own inner logic and unfolding laws of development. Progress, by definition, is that condition toward which history flows.

Hard historicism teaches four things. First, it warns that history — actually, History (it becomes a proper noun) — is going to have its way. It will because its “iron laws” are just that: unbending. Second, it demonstrates that humanity’s only rational course is to get in step with the “march of history.” That resistance is reactionary is less a moral judgment than a scientific fact, because resistance to progress must be ultimately futile. Progress is, by definition, whatever is history’s destination. Third, hard historicism holds that the laws of history’s development are not equally clear to all. History’s path is not optional, but the smoothness of the path and the pace of progress on it can be influenced by a minority who understand what is happening. To this clerisy of the discerning few falls the high and solemn task of conveying to others a proper consciousness of the reality that history is dictating. Fourth, historicism assigns to a vanguard of discerning intellectuals the task of purging society of “false consciousness.”

Everyone engaged in the never-ending arguments of political philosophy stands on the shoulders of giants. In his afterword, Jonathan Rauch pays fitting tribute to several of the shoulders on which he stands, especially those of Charles Sanders Peirce and Karl Popper.

1 - New Threats to Free Thought

A very dangerous principle is now being established as a social right: Thou shalt not hurt others with words. This principle is a menace — and not just to civil liberties. At bottom it threatens liberal inquiry — that is, science itself.

In English we have a word for the empanelment of tribunals — public or private, but in any case prestigious and powerful — to identify and penalize false and socially dangerous opinions. The word applies reasonably well to a system in which a university student is informed against, and then summoned to a hearing and punished, for making incorrect and hurtful remarks during a conversation late at night. The word has been out of general circulation for many years. It is “inquisition.”

This book tries to defend the morality, rather than the legality, of a knowledge-producing social system which often causes real suffering to real people. It tries to defend the liberal intellectual system against a rising anti-critical ideology.

We have standard labels for the liberal political and economic systems—democracy and capitalism. Oddly, however, we have no name for the liberal intellectual system, whose activities range from physics to history to journalism. So in this book I use the term “liberal science,” for reasons to be explained later.

The question which forms the central issue of this book is, What should be society’s principle for raising and settling differences of opinion? In other words, what is the right way, or at least the best way, to make decisions as to who is right (thus having knowledge) and who is wrong (thus having mere opinion)?

To the central question of how to sort true beliefs from the “lunatic” ones, here are five answers, five decision-making principles—not the only principles by any means, but the most important contenders right now:

  • The Fundamentalist Principle: Those who know the truth should decide who is right.
  • The Simple Egalitarian Principle: All sincere persons’ beliefs have equal claims to respect.
  • The Radical Egalitarian Principle: Like the simple egalitarian principle, but the beliefs of persons in historically oppressed classes or groups get special consideration.
  • The Humanitarian Principle: Any of the above, but with the condition that the first priority be to cause no hurt.
  • The Liberal Principle: Checking of each by each through public criticism is the only legitimate way to decide who is right.

The argument of this book is that the last principle is the only one which is acceptable, but that it is now losing ground to the others, and that this development is extremely dangerous. Impelled by the notions that science is oppression and criticism is violence, the central regulation of debate and inquiry is returning to respectability — this time in a humanitarian disguise.

By the 1980s the creationists were not alone. Exactly the same line of attack was now being pursued by their enemies on the political left. What about minority viewpoints? Why were they not being taught, at least as valid alternatives to the all-male, all-European tradition of “mainstream” history and social science?

Thus the rise of minority activists’ version of the creationist argument. They said that classical scholarship had lied about blacks’ role in history — for example, about the African ethnicity of the ancient Egyptians. An outline for “multicultural” curriculum reform, adopted in various school districts, said that Africa—specifically Egypt—was “the world center of culture and learning in antiquity” and that ancient Egypt was a black nation. Leave aside why it should matter what color people were; the agenda here was to use political pressure to obtain at least equal time for an “outsiders’” viewpoint — the creationists’ agenda precisely.

If we do not have an answer to the demands for fairness, if we cannot justify the imperialism of liberal science and the refusal to recognize the validity of other systems, then we are forced to admit that the scientific order is indeed nothing more than the rule of the strong. In that case we must concede that David and Ginger Twitchell were in fact political prisoners, condemned because they and their fellow churchmen lacked the strength or numbers to impose on society their idea of truth. That is the egalitarian challenge.

The core argument was that pornography hurt women by degrading them, aiding in their repression, denying them their rights. Pornography, said the influential feminist critic and scholar Catharine A. MacKinnon in 1983, “causes attitudes and behaviors of violence and discrimination that define the treatment and status of half of the population.” Real people were being hurt.

To ban books or words which cretins find exciting is to let the very lowest among us determine what we may read or hear.

“If pornography is an act of male supremacy, its harm is the harm of male supremacy made difficult to see because of its pervasiveness, potency, and success in making the world a pornographic place. . . . To the extent pornography succeeds in constructing social reality, it becomes invisible as harm.” In the world constructed by pornography, people who are not radical feminists can no more see the harm of pornography than a fish can see water.

In 1980, influenced by feminist legal theorists, the U.S. Equal Employment Opportunity Commission adopted three tests for deciding whether speech in the workplace constitutes sexual harassment punishable under civil-rights laws. Among those tests was whether the words at issue create an “intimidating, hostile, or offensive working environment.” If words make the social situation uncomfortable for somebody, the commission seemed to be saying, then they are not mere words at all; rather, they are acts of harassment (just as pornography is an act of oppression). So here was a theory which said that images and expressions and words could be, for all practical purposes, a form of hurt or violence.

As more and more people realized that they could win concessions and moral victories by being offended, more and more offended people became activists.

The humanitarians had discovered what liberals rarely realize and almost never admit: the liberal intellectual system, whatever else it may be, is not “nice.” Somehow the idea has grown up that “liberal” means “nice,” that the liberal intellectual system fosters sensitivity, toleration, self-esteem, the rejection of prejudice and bias. That impression is misguided. The truth is that liberal science demands discipline as well as license, and to those who reject or flout its rules, it can be cruel.

To advance knowledge, we must all sometimes suffer. Worse than that, we must inflict suffering on others.

Then came a defining moment, though to this day it has not, I believe, been properly recognized as such. All at once lightning illuminated a garish landscape which until then had been seen only in patches here and there. In February 1989, fundamentalist Muslims rose up against the British writer Salman Rushdie, who had written a novel which they regarded as deeply, shockingly, offensive to Islam’s holy truths and to the Muslim community.

In the end the Rushdie affair showed us graphically two things, one which we knew already and one which we did not know at all. What we knew already was that fundamentalism — not just religious fundamentalism, but any fundamentalist system for settling differences of opinion — is the enemy of free thought. More frightening was what we had not known: Western intellectuals did not have a clear answer, many had no answer at all, to the challenge that Khomeini set before them. That challenge was at least twofold. First, it was a restatement of the creationists’ challenge, the angry outsiders’ cry from the heart: Who gave you, the arrogant West, the right to make the rules? You are imperialists with your view of truth, with your insistence on the intellectual ways of secularism and of science. How dare you flout and mock our view of truth? The point was noted at the time. What was not so widely noted was the second dimension of Khomeini’s challenge: the humanitarian dimension. This is not to say that Khomeini was a humanitarian, only that the argument which his supporters commonly made was humanitarian in principle: “You have hurt us with your evil words, your impious words, disrespectfully and needlessly written in utter disregard of Muslim sensibilities. You have caused pain and offense to many people. And this you have no right to do.”

That was the sense in which the Rushdie affair was a defining moment. It showed how readily Westerners could be backed away from a fundamental principle of intellectual liberalism, namely that there is nothing whatever wrong with offending — hurting people’s feelings — in pursuit of truth. That principle seemed to have been displaced by a belief in the right not to be offended, which was quickly gaining currency in America.

“Only when insults, harassment, disrespect and obscenity are banned [in universities] can people engage in truly substantive argument,” wrote a syndicated columnist and a prominent scientist in the New York Times. Intellectual authoritarianism, so long disgraced, was returning to favor — this time not among religious reactionaries or fringe radicals or cultural primitives or McCarthyite paranoiacs, but among Western educated elites.

It is crucial to understand that the Humanitarian Principle is deadly — inherently deadly, not incidentally so — to intellectual freedom and to the productive and peaceful pursuit of knowledge. The principle takes aim not just at freedom of speech but at liberal science itself. It is equally deadly whether espoused by Islamic fundamentalists (“Rushdie owes Muslims an apology”), by Christians, or by minority activists (“Andy Rooney owes an apology to ‘any in our society who were given offense’”). It leads to the doctrine that people should be punished for holding false or dangerous beliefs. It leads, in other words, toward an inquisition.

What is the right answer to the person who demands something because he is offended? Just this: “Too bad, but you’ll live.”

First, there are not two great liberal social and political systems but three. One is democracy — political liberalism — by which we decide who is entitled to use force; another is capitalism — economic liberalism — by which we decide how to allocate resources. The third is liberal science, by which we decide who is right.

That trend must be fought, because, fourth, the alternatives to liberal science lead straight to authoritarianism. And intellectual authoritarianism, although once the province of the religious and the political right in America, is now flourishing among the secular and the political left.

fundamentalism, properly understood, is not about religion. It is about the inability to seriously entertain the possibility that one might be wrong.

Liberal science is not, finally, a way of making things. It is a way of organizing society and a way of behaving.

2 - The Rise of Liberal Science

There are many reasons to read Plato, among them the beauty and plasticity of his thought and the delightful character of Socrates, but surely one of the best reasons to read him is to be horrified. Read The Republic, putative wellspring of Western values, and you find that once you look past the glittering facade of Plato’s rhetoric you are face to face with the ethic of the totalitarian regime.

Plato’s ideal Republic, his vision of the good political regime, is built on the following principles. The founding principle is that of absolute individual devotion to, and submission to, the good of the state.

Supporting the whole regime, and giving it legitimacy, is “one noble lie” told among the ruling elite. The rulers, in their turn, will administer a regimen of propaganda lies to keep the social structure stable: “our rulers will have to make considerable use of falsehood and deception for the benefit of their subjects”.

Plato believed what so many of us instinctively believe: that the way to produce knowledge is to sit down in a quiet spot and think clearly. The best knowledge comes to him who thinks best. Liberalism holds that knowledge comes only from a public process of critical exchange, in which the wise and unwise alike participate.

Once you grant Plato his premises about knowledge, then it is clear who should rule the state and sort true opinions from false ones: the philosophers. “To them by their very nature belong the study of philosophy and political leadership, while it befits the other sort to let philosophy alone and follow their leadership”. Only to those who are capable of right knowledge should truth and power be entrusted. Few people are endowed with such a capability, though many might aspire. Philosophy “is impossible for the multitude,” and “the perfect philosopher is a rare growth among men and is found in only a few”.

And what about the “motley horde” of people who want to rule but lack the philosopher’s access to knowledge? Such persons are bound to be a problem. Unless, says Plato, they are “compulsorily excluded [from power], there can be no cessation of troubles”. There must be no Salman Rushdie in Plato’s Republic. If such a person were somehow to survive the state-controlled education with his ambitions intact, he would have to be eliminated.

Epistemology — one’s view of who can have knowledge and when — is politics, and it has the profoundest practical consequences. No better illustration exists than Plato’s ghastly state, with its central control of everything, founded on central control of truth.

Plato the epistemologist understood that truth is elusive for all of us, but Plato the realist understood that some of us can come closer to it than others.

Skeptical doubters have been around since at least the days of Socrates himself and of Pyrrho of Elis (fourth century B.C.), who is supposed to have made it his aim to withhold judgment on all matters on which there were conflicting views, including the matter of whether anything was known. Skepticism typically flourishes in response to divisive and sometimes violent differences of opinion, as a way to short-circuit dangerous conflict. Ancient skepticism thrived in the medical community of Alexandria in reaction to the stubborn dogmatism of rival camps of doctors. In periods of consensus, skepticism simmers down, as it does also in periods when debate is quashed or circumscribed by political controls. Thus the skeptical schools of thought more or less disappeared behind the walls of the Church. But the walls were eventually broken, and intellectual crisis ensued. In the early sixteenth century Martin Luther declared that all Christians, not just the ones in authority, had the power of seeing and judging what is right or wrong in matters of faith. Well, if the Church did not have the sole authority to identify truth, and if people disagreed in their conclusions (as of course they did), just how was anyone supposed to know which beliefs were the right ones? What was the rule for separating reality from illusion? Who should be believed? As Plato had understood almost two millennia earlier, the problem of knowledge could tear society to shreds, and indeed, as Catholics and Protestants bloodied each other in battles across Europe, it did so. No surprise, then, that at about that time the ancient skeptics were rediscovered. Amid the bickering and fighting they exerted a strong appeal. Skepticism cropped up in the academies and reached a new pinnacle with Michel de Montaigne.

With Montaigne’s having destroyed certainty without providing anything to replace it, the condition of knowledge seemed desperate. Here Descartes intervened. He searched until he found one proposition which was clearly beyond doubt: that he thought and thus knew he existed.

Descartes nonetheless achieved an important advance, not with his conclusion, but with his method. Systematic criticism was the key. Thus began the skeptical revolution. Skeptical reasoners marched straight down the road opened by Descartes. At last in 1739 David Hume, the brilliant twenty-eight-year-old enfant terrible of modern philosophy, came along with his bulldozer and made a ruin of the last pillars of certainty about the external world. Induction — generalizing from past to future, from known to unknown — is nothing more than an act of faith, Hume said.

In its most peculiar and extreme philosophical form, skepticism refers to the doctrine that we have no reason to believe anything, and so should believe nothing. That, however, is on its face an unsustainable argument. Believing nothing is impossible. Even the belief that you are justified in believing nothing is a belief. And even when we refuse to conclude, we do so only against the background of other conclusions. No one could possibly be a genuinely beliefless skeptic, even in principle. The “skepticism” upon which liberal science is based is something quite different. (To distinguish it from the kind which says that we should never conclude anything, philosophers often call it “fallibilism.”) This kind of skepticism says cheerfully that we have to draw conclusions, but that we may regard none of our conclusions as being beyond any further scrutiny or change. “Go ahead and conclude whatever you want; just remember that all of your conclusions, every single one of them, may need to be corrected.” This attitude does not require you to renounce knowledge. It requires you only to renounce certainty, which is not the same thing. In other words, your knowledge is always tentative and subject to correction. At the bottom of this kind of skepticism is a simple proposition: we must all take seriously the idea that any and all of us might, at any time, be wrong.

What, then, is so important about the emergence, eventually the triumph, of the skeptical ethic? The answer is this: Hidden in the pages of the skeptical philosophers’ tomes is a radical social principle. It is the principle of public criticism.

The result is this: A society which has accepted skeptical principles will accept that sincere criticism is always legitimate. In other words, if any belief may be wrong, then no one can legitimately claim to have ended any discussion — ever. In other words: No one gets the final say. Another conclusion also follows. If any person may be in error, then no one can legitimately claim to be above being checked by others — ever. Moreover, if anyone may be in error, no one can legitimately claim to have any unique or personal powers to decide who is right and who is wrong. In other words: No one has personal authority.

Even as the theorists were busy showing that certain knowledge is impossible, the scientists and scholars of the Enlightenment were showing that uncertain knowledge is possible. That process was already under way ten years after Descartes died. The physicist Freeman Dyson wrote: “The Royal Society of London in 1660 proudly took as its motto the phrase Nullius in Verba, meaning ‘No man’s word shall be final.’ The assertion of papal infallibility, even in questions of faith and morals having nothing to do with science, grates harshly upon a scientist’s ear. We scientists are by training and temperament jealous of our freedom. We do not in principle allow any statement whatever to be immune from doubt.” Liberal science is a big and complicated thing. No one could begin to describe it fully. However, with nullius in verba we have reached one of the two great foundation stones of the liberal intellectual system.

First, the skeptical rule. If people follow it, then no idea, however wise and insightful its proponent, can ever have any claim to be exempt from criticism by anyone, no matter how stupid and grubby-minded the critic. The skeptical rule is, No one gets the final say: you may claim that a statement is established as knowledge only if it can be debunked, in principle, and only insofar as it withstands attempts to debunk it. This is, more or less, what the great twentieth-century philosopher of science Karl R. Popper and his followers have called the principle of falsifiability. Science is distinctive, not because it proves true statements, but because it seeks systematically to disprove (falsify) false ones.

Second, the empirical rule. If people follow it in deciding who is right and who is wrong, then no one gets special say simply on the basis of who he happens to be. The empirical rule is, No one has personal authority: you may claim that a statement has been established as knowledge only insofar as the method used to check it gives the same result regardless of the identity of the checker, and regardless of the source of the statement.

The skeptical revolution was gradual and nonviolent; it was fomented not by a few noisy activists but through the evolving everyday practices of thousands of intellectuals, moving as best they could from one decision about the world to the next. Its radicalism is thus easy to miss. Besides, science has a genius for looking sober and conservative; and in many ways, especially in the face it presents to the public (and the way it usually sees itself), it is sober and conservative. But in a deeper sense it is quite probably the most radical endeavor ever embarked on by mankind — radical in two ways. First, it has completely abolished inerrancy. “There is nothing like absolute certainty in the whole field of our knowledge,” writes Popper. Before the revolution Montaigne could declare, “Either we judge absolutely, or we absolutely cannot.” Afterwards, his formula stood on its head: if we judge absolutely, we absolutely do not. Knowledge must be debunkable and stands only until it is debunked.

Radical, too, in another way—breathtakingly so. Today we take empiricism almost completely for granted. We forget that a philosopher like Plato, who held that only the wise philosopher could hope for knowledge of things as they really are, would have been horrified by our widespread acceptance of the empirical rule (no personal authority). For that rule has opened up the entirety of human knowledge to scrutiny by anyone and everyone.

Interchangeability of persons (we all play by the same rules) is a hallmark of liberal social philosophy. Kant declared that an action can be right for one person only if it is right for any and all, and so codified the liberal standard of justice. The empiricists declared that a statement can be true for one person only if it is true for any and all, and so codified the liberal standard for knowledge. This is a point which has been missed again and again: scientific empiricism is a social philosophy.

Outside a small circle of cognoscenti, Peirce’s lot has been a tragic and undeserved obscurity. Yet no one better understood the social implications of science’s liberal ideal of objectivity. “Unless truth be recognized as public — as that of which any person would come to be convinced if he carried his inquiry, his sincere search for immovable belief, far enough — then there will be nothing to prevent each one of us from adopting an utterly futile belief of his own which all the rest will disbelieve. Each one will set himself up as a little prophet; that is, a little ‘crank,’ a half-witted victim of his own narrowness.”

I can only say that the rules should deny respectability to anyone’s claim that some particular kind of person is favored with especially undistorted insight.

3 - The Politics of Liberal Science

But philosophers of science have moved sharply away from that view, and toward what has become known as evolutionary epistemology. Evolutionary epistemology holds that our knowledge comes to us not from revelation, as religious traditions maintain; nor from deep reflection by the wise, as in Plato; nor even from crisp experiments that unambiguously reveal nature’s secrets, as in the mechanistic view of science that prevailed until this century. Rather, our knowledge evolves — with all the haphazardness and improvisation that “evolving” implies. In biological evolution, species and their genes evolve as they compete for limited resources, with mutations providing the raw material for change. In evolutionary epistemology, hypotheses and ideas evolve as they compete under pressure from criticism, with intellectual diversity providing the raw material for change. The evolutionary view of knowledge recognizes that, in science, trial and error play as important a role as does mechanistic experimentation. It recognizes that scientific consensus doesn’t always march methodically toward a single inevitable conclusion; the consensus often meanders or drifts, and where it comes out on any given day can depend as much on circumstance and fashion, even on personalities, as on nature. (Which is not to say that the results are random; the method of trial and error may be unpredictable in the short term, but in the longer term it produces steady improvement. The path may veer this way or that, but the long-term direction is uphill.) Most important, the evolutionary view recognizes that knowledge comes from a social process. Knowledge comes from people checking with each other. Science is not a machine; it is a society, an ecology. And human knowledge, like the species themselves, is a product of the turmoil of the interreactions of living organisms.

The genius of Locke (and, later, of Adam Smith and Charles Darwin) was to see, as Plato had not, that social stability does not require social stasis; just the opposite, in fact.

Locke preached the sermon which every generation learns with such difficulty and forgets with such ease: “We should do well to commiserate our mutual ignorance, and endeavor to remove it in all the gentle and fair ways of information, and not instantly treat others ill, as obstinate and perverse, because they will not renounce their own, and receive our opinions... For where is the man that has incontestable evidence of the truth of all that he holds, or of the falsehood of all he condemns?”

The game of liberal science satisfies our craving for new beauties but not our appetite for final truth: No final say.

Diversity of biological form is the raw material of natural selection. Diversity of political inclination renews democratic governments and cracks authoritarian ones. Diversity of ability and of desire impels markets. And diversity of belief, thought, experience — the diversity of our various subjective worlds — is no less important. It is, indeed, among the richest of all natural resources; perhaps it is the richest of all.

A critical society — a community of error-seekers — stimulates curiosity by rewarding people, rather than punishing them, for finding mistakes.

The genius of liberal science lies not in doing away with dogma and prejudice; it lies in channeling dogma and prejudice — making them socially productive by pitting dogma against dogma and prejudice against prejudice. Science remains unbiased even though scientists are not. “One of the strengths of science,” the philosopher and historian of science David L. Hull has written, “is that it does not require that scientists be unbiased, only that different scientists have different biases.”

For not only is wiping out bias and hate impossible in principle, in practice eliminating prejudice through central authority means eliminating all but one prejudice — that of whoever is most politically powerful.

“Scientific investigation has had the most wonderful triumphs in the way of settling opinion,” Peirce wrote in 1877, making a point which has been too little noticed since. Liberal science has two invaluable social skills. First, it is very good at resolving conflicts. Second, it is very good at not resolving conflicts.

That is the hidden power in these four capacious words: We don’t yet know.

Derek J. de Solla Price found that the number of scientists and journals has tended to double every ten to fifteen years since Locke’s day; “using any reasonable definition of a scientist, we can say that 80 to 90 per cent of all the scientists that have ever lived are alive now.”

Newton said he stood on the shoulders of giants. Yes, but more important still is that liberal science allows each of us to stand on the shoulders of millions of ordinary inquirers, not just the few great ones. Authoritarian systems have their intellectual giants. What they lack is the capacity to organize and exploit their masses of middling thinkers.

No Final Say and No Personal Authority are not just operational procedures for professional intellectuals. Socially speaking, they are also moral commandments, ethical ideals. They are a liberal society’s epistemological constitution.

We take the absence of purges and inquisitions among those who play the science game so much for granted that we forget how extraordinary the absence really is. “Until the end of the eighteenth century,” the historian Arthur M. Schlesinger, Jr., notes, “torture was normal investigative procedure in the Catholic church as well as in most European states.” You could map a lot of human history simply by tracing the long line of creed wars within and between cultures. Creed wars are still going on today within and between orthodox groups of all kinds. But such wars have almost disappeared from critical society. Liberal science has brought peace.

Belief in liberal science is a faith, but it does deserve special standing: not only because it is the best social regime for mobilizing resources to produce knowledge, but also because it is inherently anti-authoritarian. If you care about freedom of thought, that’s important. The Inquisition died, not because people dispensed with faith, but because they learned to put their faith in liberal social institutions. They committed themselves to the rules which say that no one is immune from checking and no one is in charge. Thus they established the overarching political fact of a liberal intellectual community: the power to settle differences of opinion lies with no one in particular. And no one in particular is the very safest person to entrust with that great power.

I spoke in the last chapter of the skeptical moral commandment to take seriously the idea that you might be wrong. If you look politically at someone who lives by that commandment, you find you can describe him this way: he is one who feels that it is never a crime to be mistaken. That is the central tenet of a liberal scientific morality.

In particular, the morality of liberal science charges two kinds of institutions with an especial obligation not to punish people for what they say or believe: governments, because their monopoly on force gives them enormous repressive powers, and universities, because their moral charter is first and foremost to advance human knowledge by practicing and teaching criticism. If governments stifle criticism, then they impoverish and oppress their citizenry; if universities do so, then they have no reason to exist.

4 - The Fundamentalist Threat

Fundamentalism — the intellectual style, not the religious movement — is the strong disinclination to take seriously the notion that you might be wrong.

First, you’re not wrong just because you’re a fundamentalist. Second, to some extent we are all fundamentalists, each and every one of us. We are all true believers in something. What distinguishes the ethic of liberal science is not that liberals are undogmatic; it is that liberals believe they must check their beliefs, or submit them for checking, however sure they feel. And that ethic, I finally concluded, was what made me feel so removed from the free-marketers — or the environmentalists or animal-rights activists or feminists or whomever. What distinguishes them is not the rightness or wrongness of their beliefs, or even that they believe strongly. It is that they show no interest in checking.

It was Peirce, in his magnificent essay “The Fixation of Belief” (1877), who showed how what I call the fundamentalist intellectual style is quite separate from religion. His phrase “fixation of belief” went to the heart of what fundamentalism is about. The fundamentalist temperament tends to search for certainty rather than for errors. The fundamentalist’s tendency is to nail his beliefs in place.

You might think that the fundamentalist, so secure in his belief, would feel the least need to trouble others who were in error. But the reverse is true.

But here is a problem: if not a science game, what? One way or another, every community has to confront that signal problem — people disagree and you must have a way to decide who is right.

“It would be better,” William Jennings Bryan once said, “to destroy every other book ever written, and save just the first three verses of Genesis.”

And so the outside world is denied, and the text is assumed to answer all questions of any genuine importance. There is a problem, however: if a list of doctrinal statements is big enough to contain the answers to all important questions, it is also big enough to contain ambiguities and contradictions. Dangerous disputes will arise over the text. To settle them, the orthodox rely on a common authority, on someone whom they regard as having special powers of insight and thus special ability to sort truth from falsehood. And so at last we reach the fundamentalist social principle: Those who know the truth should decide who is right.

Interestingly, the cosmopolitan Plato of 380 B.C. was incomparably more cynical than the generally sincere Iranian fundamentalist of 1989. Khomeini the true believer really believed in the Koran, whereas Plato the sophisticate made no bones about building his regime on convenient lies and cradle-to-grave brainwashing. Khomeini wanted to defend obvious truth, Plato to maintain social order and national strength. Yet the regime of Plato’s Republic wound up looking eerily like the regime of Khomeini’s Iran: leadership by a philosopher-ruler and a guardian class of priest-administrators; propagation of a supporting ideology complete with creation story and moral code; regulation of words, art, and even music to stave off corruption and decadent influences.

Faced with the constant threat that someone will begin arguing with authority or challenging the fixed text, an orthodox society typically has only two ways to respond: by cracking up or by cracking down.

Disputes over fixed beliefs are always a threat to social cohesion, because both sides are stubborn. (Just look at the fight over abortion.) One strategy for avoiding socially dangerous disputes over fixed beliefs is to try to get rid of fixed beliefs. Generally speaking, that is liberal science’s approach. Another strategy is to try to get rid of disputes. Generally speaking, that is the orthodox society’s approach: get rid of disputes by suppressing criticism.

In an orthodox community, the threat of social disintegration is never further away than the first dissenter. So the community joins together to stigmatize dissent.

The more I get around, the more deeply I am impressed that the gardens of human belief flower more exotically than any in nature.

In 1964 Barry Goldwater’s presidential campaign rallied under the slogan: “In your heart you know he’s right.” That is the most revealing fundamentalist slogan I’ve ever heard.

Paul, who at first persecuted Jesus’s cause with as much zeal as he later championed it, was one of the most influential of all the fundamentalists. It was he who wrote this extraordinary statement of the fundamentalist creed: “For the wrath of God is revealed from heaven against all ungodliness and wickedness of men who by their wickedness suppress the truth. For what can be known about God is plain to them, because God has shown it to them. Ever since the creation of the world his invisible nature, namely, his eternal power and deity, has been clearly perceived in the things that have been made. So they are without excuse.” Upon that chilling Pauline declaration and others like it stand many centuries of killing, torture, and repression of people who perversely, “by their wickedness,” denied evident truth.

This is the morality of the Fundamentalist Principle: he who would deny evident truth should be punished. Wherever the believers in the Fundamentalist Principle get the upper hand, they strive restlessly and untiringly to suppress diversity of opinion, and they do so not simply out of cynicism or power lust, but, on the contrary, out of the purest and most principled of motives. In its full-blown modern form, totalitarianism is relatively new. But the idea of totalitarianism, that to believe incorrectly is a crime, is hardly an innovation. It is, indeed, a logical outgrowth of hard-line fundamentalism.

5 - The Humanitarian Threat

“The liberation of the human mind,” H. L. Mencken once wrote, “has been best furthered by gay fellows who heaved dead cats into sanctuaries and then went roistering down the highways of the world, proving to all men that doubt, after all, was safe — that the god in the sanctuary was a fraud. One horse-laugh is worth ten thousand syllogisms.”

Liberals need to keep an eye on the religious authoritarians. Eternal vigilance is the price of science. But where religious true believers are concerned, at any rate, we are pretty vigilant. The greater threat lies in our letting down our guard against ourselves: in high-mindedly embracing authoritarianism in the name of fairness and compassion, as the Marxists did.

As is so often the case with egalitarian activists, they support equality for everybody, except people who don’t share their political agenda. That multiculturalists don’t fight for the inclusion of fundamentalist Christian viewpoints in high school and college classrooms is, no doubt, understandable. But it must qualify as one of our day’s great hypocrisies that those self-appointed guardians of “oppressed-minority viewpoints” have nothing to say in defense of one of America’s own “traditional” minority cultures, namely the religious fundamentalists.

To believe incorrectly is never a crime, but simply to believe is never to have knowledge. In other words, liberal science does not restrict belief, but it does restrict knowledge. It absolutely protects freedom of belief and speech, but it absolutely denies freedom of knowledge: in liberal science, there is positively no right to have one’s opinions, however heartfelt, taken seriously as knowledge. Just the contrary: liberal science is nothing other than a selection process whose mission is to test beliefs and reject the ones that fail.

But if you want your belief recognized as knowledge, there are things you must do. You must run your belief through the science game for checking. And if your belief is a loser, it will not be included in the science texts. It probably won’t even be taken seriously by most respectable intellectuals. In a liberal society, knowledge — not belief — is the rolling critical consensus of a decentralized community of checkers, and it is nothing else.

And who decides what the critical consensus actually is? The critical society does, arguing about itself. That is why scholars spend so much time and energy “surveying the literature” (i.e., assessing the consensus so far). Then they argue about their assessments. The process is long and arduous, but there you are. Academic freedom would be trampled instead of advanced by, say, requiring that state-financed universities put creationists on their biology faculties or give Afrocentrists rebuttal space in their journals. When a state legislature or a curriculum committee or any other political body decrees that anything in particular is, or has equal claim to be, our knowledge, it wrests control over truth from the liberal community of checkers and places it in the hands of central political authorities. And that is illiberal.

We would find ourselves in a world where knowledge was made by voting and agitating. Then we really would find ourselves living Bertrand Russell’s nightmare, where “the lunatic who believes that he is a poached egg is to be condemned solely on the ground that he is in the minority.”

Today it is possible that a majority of climatologists believe that global warming is a fact (one can’t say for sure, since scientists don’t vote on these things), but global warming is far from well enough established to be presented as fact in textbooks.

For various minorities, the answer is to do just what many black and feminist historians are doing, namely to propose new hypotheses about the role of, say, blacks and women in American history. But only after those hypotheses have stood up to extensive checking, only after each has convinced each, is it time to rewrite the texts. The checking process often takes years. So be it. The process often rules against someone whose cause seems sympathetic. So be it. All other paths to knowledge lead to creed wars. And the attempt to intimidate would-be debunkers by calling them “racist” or “sexist” or whatever is nothing but an attempt to replace science with political muscle.

Respect is no opinion’s birthright. People, yes, are entitled to a certain degree of basic respect by dint of being human. But to grant any such claim to ideas is to raid the treasury of science and throw its capital to the winds.

Is the liberal standard for respectability fair? That, really, is the big question today. If you believe that a society is just only when it delivers more or less equal outcomes, you will think liberalism is unfair. You will insist on admitting everyone’s belief into respectability as knowledge.

Humane motives, however, could not save the Inquisition from the same problem that faces humanitarians today: although allowing mistakes is risky, suppressing them is much riskier, because then a “mistake” becomes whatever it is that the authorities don’t like to hear. Suppressing offensiveness, too, comes at a high cost, since offensiveness is not the same thing as wrongness — often just the contrary.

It is not good to offend people, but it is necessary. A no-offense society is a no-knowledge society.

Yet there can be little doubt that the costs of the Japanese aversion to criticism have been enormous — not just for Japan but for the world. Japan is one of the world’s largest, richest, best-educated, and hardest working nations. Yet she relies on outsiders to set her intellectual agenda; her universities are, by international standards, backwaters; her record on intellectual innovation is bleak. From 1901 to 1985 Japan won five Nobel Prizes in science — one twenty-eighth of America’s share, one-tenth of Germany’s. And has avoiding offense produced a better society? Not many American thinkers would want to live there, sitting expressionlessly through academic meetings. The price of the no-offense society is high.

In other words, liberal science is built on two pillars. One is the right to offend in pursuit of truth. The other is the responsibility to check and be checked.

And so when someone says he is offended, the standard reply should be: “I don’t have to like you, but I won’t shut you up or shout you down. However, I am not under the least obligation to take what you say seriously, or even to listen to you, unless you submit your claim to critical examination by me and others, and try to abide by the results.”

The standard answer to people who say they are offended should be: “Is there any casualty other than your feelings? Are you or others being threatened with violence or vandalism? No? Then it’s a shame your feelings are hurt, but that’s too bad. You’ll live.” If one is going to enjoy the benefits of living in a liberal society without being shamelessly hypocritical, one must try to be thick-skinned, since the way we make knowledge is by rubbing against one another.

In a liberal society, the initial presumption ought to be that neither kind of concern deserves any better than to be politely ignored. If that sounds callous, remember that the establishment of a right not to be offended would lead not to a more civil culture but to a lot of shouting matches over who was being offensive to whom, and who could claim to be more offended.

But the conclusion which the humanitarians draw — that the hurting must be stopped — is all wrong. Impelling them toward their wrong conclusion is a dreadful error: the notion that hurtful words are a form of violence. Offensive speech hurts, say the humanitarians; it constitutes “words that wound” (writes one law professor); it does “real harm to real people” who deserve protection and redress (writes another law professor).

My own view is that words are words and bullets are bullets, and that it is important to keep this straight.

If you are inclined to equate verbal offense with physical violence, think again about the logic of your position. If hurtful opinions are violence, then painful criticism is violence. In other words, on the humanitarian premise, science itself is a form of violence. What do you do about violence? You establish policing authorities — public or private — to stop it and to punish the perpetrators. You set up authorities empowered to weed out hurtful ideas and speech. In other words: an inquisition.

It is bad enough to have to remind people that there is no right not to be offended, and that criticism is not the same as violence. It is deeply embarrassing to have to deliver this reminder to people at the center of American intellectual life.

Why is that a “bigoted” suggestion rather than an unpopular opinion? What’s the difference? And who is to say? The anti-bigotry people never approach the question directly, because doing so would show them up. The answer is: we, the right-thinking, are the ones who will say who is and isn’t bigoted. Whenever anyone says that bigoted or offensive or victimizing or oppressing or vicious opinions should be suppressed, all he is really saying is, “Opinions which I hate should be suppressed.” In other words, he is doing the same thing Plato did when he claimed that the philosopher (i.e., himself) should rule for the good of society: he is making a power grab. He wants to be the pope, the ayatollah, the philosopher-king. The answer to the question “Why tolerate hateful or misguided opinions?” has been the same ever since Plato unveiled his ghastly utopia: because the alternative is worse.

Any guidelines elaborate enough to distinguish vicious opinions from unpopular ones will be too elaborate to work. In practice, the distinction will be between the opinions which the political authorities find congenial and those which they find inconvenient.

To make speech punishable on grounds of intent is to give authorities the power to punish criticism whenever they are suspicious of the critic. We should know better than to give any authority such power, least of all at a university. Certainly, then, there is no excuse for a university code like Stanford’s, which prohibits speech that “is intended [my italics] to insult or stigmatize.”

The trouble with the argument that real pain outweighs airy abstractions is that it leaves out one whole side of the equation: the pain is very real and very concrete for the “offensive” speaker who is sentenced by political authorities to prison, privation, or, as in Salman Rushdie’s case, death. The whole point of liberal science is that it substitutes criticism for force and violence.

Look instead at the premise of the argument: that you can only do science where people feel good about each other, where they feel secure and unharassed — in other words, where they are exempt from upsetting criticism. Of course, that is dead wrong. A lot of researchers and theorists hate each other. The history of science is full of bitter criticism and hard feelings; there is simply no way around it. If you insist on an unhostile or nonoffensive environment, then you belong in a monastery, not a university.

Look also at the Orwellian nature of the attack. It basically says that the more you stifle upsetting (e.g., “intimidating,” “demeaning”) speech and thought, the more “free” everybody becomes — so that the most “free” intellectual regime is the one with the most taboos on criticism.

People who like authoritarianism always picture themselves running the show. But no one stays on top for long.

“Is it not a common experience,” says Popper, “that those who are most convinced of having got rid of their prejudices are most prejudiced?”

“You’re not black, or gay, or Hispanic, or whatever; you wouldn’t understand.” Only outsiders, only the oppressed, can understand the hurt, so only they can really comprehend the need for restrictions on debate. White males have no standing to protest controls, because they haven’t felt the pain. That argument deserves a special place in the hall of shame. For one thing, it assumes that only members of certified minority groups know what pain is like. Much worse, though: the only-minorities-can-understand argument is anti-intellectualism at its most rancid. It is the age-old tribalist notion that, as Popper put it, “we think with our blood,” “with our national heritage,” or “with our class.” White supremacists will always say that blacks shouldn’t be in charge because they “can’t understand” (they’re too stupid), anti-Semites will say the same about Jews (too corrupt), and now, shamefully, some American minority activists are saying something similar about “in-groups” (too pampered, too blind). They are denying the very possibility of liberal science, whose premise is that knowledge is available to everyone and comes through public inquiry and criticism, not from the color of your skin or your ethnic heritage or your social class. Accept their credo, and you have a race war or a class war where liberal inquiry once was.

However, proper justifications for affirmative action do not include the often-cited notion that affirmative-action policies will “include minority perspectives.” One of liberal science’s great social advances was to reject the idea that races or tribes have perspectives.

Dinesh D’Souza records this amazing conversation — a snapshot of a possible future: I asked Erdman Palmore, [a sociologist] who teaches a course on race relations at Duke, what constitutes a black perspective. Palmore shook his head. “I have no idea,” he said. “I am white. If I knew what a black perspective was, we wouldn’t need blacks to provide it.” Why then was he, a white man, teaching a course that engaged issues of black history and black consciousness? “It would be better to have a black teach my course,” Palmore agreed. Did he think it was possible for a woman to teach Shakespeare? Palmore looked puzzled. “Oh, I see what you’re getting at. He was a man. Yes, that is a problem. I don’t know the answer, I must confess.”

One expects that sort of thing, of course, in politics and among people who in their working lives do not fashion themselves truth-seekers. But McCarthyism, in its day, never caught on among professionals in the knowledge business, among academics and journalists. Terrible it is to see that this time around the movement to condemn the mistaken along with their errors is widely respected among the very people who most depend on the freedom to err. Intellectuals are losing their nerve or their souls, or both.

6 - Et Exspecto Resurrectionem

Therefore, when offended people devote their energies to shutting someone up or turning him out or getting him fired, rather than trying to show that he is wrong or trying to be thicker skinned, we should be in the habit of telling them to grow up. That’s all. No demands will be met, no punishments meted out.

The fact that you’re oppressed doesn’t mean you know anything. We must not tell creationists, Christian Scientists, and others that they are “loony”; we must not call them names. We must say, rather, “Your way of deciding what is true is illiberal and, if accepted, will substitute political turmoil or authoritarian control for peaceful and productive science.”

Afterword: Minorities, Moral Knowledge, and the Uses of Hate Speech

This book has two patron saints, the philosophers Charles Sanders Peirce and Karl Popper.

Science is unique not because it tests propositions experimentally but because it tests them socially, through a decentralized public process that refracts and distills the experience of countless observers, reaching conclusions which embody the view of no one in particular. The magic is not in the experiment but in the repeating of it and the criticism of it.

As in biological evolution, we cannot assume that any result is final. “We should not dismiss the possibility that we may have to be content with improving our approximations forever,” wrote Popper. Yet, though the process may have no final destination, it does have a direction. Reviewing a videotape of human knowledge, you can easily tell whether it is running forward or backward. Ideas become more sophisticated, technologies and societies built upon them become more effective. That is why relativism (“truth is in the eye of the beholder”) is wrong, even though fallibilism (“no beholder is infallible”) is right. Even without indubitable absolutes as anchors, we can confidently speak not just of knowledge but of progress of knowledge.

So the joke is on the positivists. In real life, crisp empirical verification is only a small part of what people in a science game do, and in many disciplines — ethics, literary criticism, interpretive history, philosophy, much of journalism, much of economics, and so on — crisp empirical verification hardly ever happens at all.

You cannot be gay in America today and doubt that moral learning is real and that the open society fosters it. And so, twenty years on, I feel more confident than ever in answering the humanitarian and egalitarian challenges, even in their newly refined versions. The answer to bias and prejudice is pluralism, not purism. The answer, that is, is not to try to legislate bias and prejudice out of existence or to drive them underground, but to pit biases and prejudices against each other and make them fight in the open. That is how, in the crucible of rational criticism, moral error is burned away. That is how, in my lifetime, moral error was burned away.

We cannot fight hate and fraud without seeing them and debunking them. John Stuart Mill, in On Liberty (1859), was right after all. “Wrong opinions and practices gradually yield to fact and argument: but facts and arguments, to produce any effect on the mind, must be brought before it.”

Our greatest enemy is not irrational hate, which is pretty uncommon. It is rational hate, hate premised upon falsehood.

What I am urging is a general proposition: minorities are the point of the spear defending liberal science. We are the first to be targeted with vile words and ideas, but we are also the leading beneficiaries of a system which puts up with them.