The Hunt for Vulcan: And How Albert Einstein Destroyed a Planet, Discovered Relativity, and Deciphered the Universe

In "The Hunt for Vulcan", the head of MIT's writing program takes us on a fast-paced tour through some of the biggest controversies and discoveries in astronomy and physics. Throughout this short and thoroughly enjoyable book, Levenson weaves together explanations of complex science with their historical context in a way that gets you caught up in the excitement (and politics and frustration) of scientific discovery.

I especially enjoyed the way Levenson sprinkles the philosophy of science throughout the narrative. He points out that astronomers practically ignored evidence directly contradicting Newtonian gravitational theory:

Along those lines, once Vulcan refused to appear, decade after decade, what should have been done about that icon of the scientific revolution, Isaac Newton’s theory of gravity? Within the myth of the scientific method, there should have been no choice about the next move.

Having now read "Theory and Reality", I can recognize that Levenson's simple empiricist critique lacks the nuance of Kuhn's "Structure of Scientific Revolutions" and its account of the role of paradigms and politics in science. Even so, it's a great case study to be aware of.

And a final zinger: I can't get enough of Levenson calling out the New York Times as "fake news" for its misreporting: "there is an end of all discussion. Vulcan exists..."

My highlights below.


Albert Einstein’s talk that day and its sequel, presented the following week, completed the greatest individual intellectual accomplishment of the twentieth century. We now call that idea the general theory of relativity: at once a theory of gravity and the foundation for the science of cosmology, the study of the birth and evolution of the universe as a whole. Einstein’s results mark the triumph of a lone thinker, battling the odds, the doubts of his peers, and the most famous scientist in history, Sir Isaac Newton.

The story of Vulcan suggests something much deeper, an insight that gets to the heart of the way science really advances (as opposed to the way we’re taught in school).

And, contrary to the popular picture of science, a mere fact — Mercury’s misplaced motion — wasn’t nearly enough to undermine that sturdy edifice. As Vulcan’s troublesome history reveals, no one gives up on a powerful, or a beautiful, or perhaps simply a familiar and useful conception of the world without utter compulsion — and a real alternative.

But the issue of what to do with failure in science was tricky right at the start of the Scientific Revolution, and it remains so now.

Vulcan’s biography is one of the human capacity to both discover and self-deceive. It offers a glimpse of how hard it is to make sense of the natural world, and how difficult it is for any of us to unlearn the things we think are so, but aren’t.


In January, before his troubles began, Halley had produced a clever bit of celestial analysis, a calculation that suggested that whatever force held the planets on their paths around the sun grew weaker in proportion to the square of each object’s distance from the sun. But that prompted an immediate question: could that particular mathematical relationship — called an inverse square law — explain why all celestial objects moved down the paths they’d been observed to follow? The best minds in Europe knew what was at stake in that seemingly technical issue. This was the decisive climax in what we’ve come to call the Scientific Revolution, the long struggle through which mathematics supplanted Latin as the language of science.

Edmond Halley agreed. Three years after he’d innocently asked for a single proof, he delivered to the printer the last pages of what Newton again immodestly, again accurately, titled Philosophiae Naturalis Principia Mathematica — The Mathematical Principles of Natural Philosophy. Getting Newton’s enormous manuscript into book form while dealing with its ever-fractious author had left no time for Halley’s own work since 1684, but now, at the finishing line, he granted himself his own victory lap. As Principia went to press, he exercised his editor’s privilege to preface Newton’s prose with a poetic assessment of the great work and its author: “But we are now admitted to the banquets of the gods/We may deal with the laws of heaven above; and we now have/The secret key to unlock the obscure earth; and we know the immovable order of the world/...Join me in singing the praises of Newton, who reveals all this,/Who opens the chest of hidden truth.”

As the great French mathematician Joseph-Louis Lagrange famously said, “Newton was the greatest genius who ever lived, and the most fortunate; for we cannot find more than once a system of the world to establish.”


Uranus created a unique opportunity, as it was the first major finding that could pose an independent test of Newton’s mathematical version of reality. Put another way: a previously unknown object offered the astronomical community a chance to see how well their fundamental tools actually accommodated not just what was known, but what had, until that March evening, remained unsuspected.

A map of the solar system published in 1791 as part of the Tom Telescope series of science books for children. In this very British setting, Uranus is still known as “the Georgian Planet” — an attempt at interplanetary nationalism that didn’t last long.

Thus the deep power of Newtonian science as Laplace and his peers understood it: it was an engine of discovery, powered by reason expressed in the particular rigor of mathematics.

Released from matters of state, Napoleon delighted in putting awkward questions to his guests, and so he told his mathematical friend that he had read Newton, and saw that his great book had mentioned God often: “I have perused yours, but failed to find his name even once.” Why was that? In the grand tradition of this story, Laplace is reported to have replied, “I have no need of that hypothesis.”

As the historian Roger Hahn puts it, “Nowhere in his writings, either public or private, does Laplace deny God’s existence. He merely ignores it.”

And, as Laplace surely knew, Celestial Mechanics could be read as a kind of demonic text, offering a set of tools with which its readers could discover “the future just like the past” of the solar system.


Galle’s sighting was the climax of what was almost immediately understood to be the popular triumph of Newtonian science. It’s unsurprising, given the stakes, that the discovery of Neptune produced its share of controversy.

Interlude - “SO VERY OCCULT”

Such unbroken success vindicated Newton on another matter as well. He never publicly said he knew what gravity was. He refused to propose any specific notion to explain why one hunk of matter pulls on another. Such an account was, to him, unnecessary. He said so in one of his most famous one-liners, added in response to critics to the third edition of Principia: “I do not feign hypotheses” — or, in full, “I have not as yet been able to deduce from phenomena the reason for these properties of gravity and I do not feign hypotheses.”

There you have it: one of the great non-apologies in the history of science.

This is what so offended Newton’s critics, themselves no slouches as natural philosophers and mathematicians. To them, Newton had abandoned the direct “local” explanations of Cartesian physics (and Aristotelian, for that matter) — the way that such explanations brought cause directly into contact with effect right where any effects occur. Once he denied the demand to explain how nature worked, he undermined (seemingly) the very nature of physical explanation. Gottfried Leibniz, the nearest Newton had to an intellectual equal, complained publicly that absent an explanation for how it made things go, Newton’s theory verged on blasphemy: “without any mechanism... [gravity] is an unreasonable and occult quality, and so very occult that it is impossible that it should ever be done though an angel or God himself should undertake to explain it.”

But what if that necessity was an illusion? Newton’s refusal to assert what he did not know was more subtle than a simple rejection of mechanical dogma. Instead, the deeper truth hidden within Newton’s seeming intellectual modesty comes from the realization that there is a real gap between mathematics and physics.

Instead, thinkers like Euler and Lagrange and Laplace and finally Le Verrier constructed in Newton’s name a worldview in which mathematics and not mechanism became the scaffolding of the universe — math, without God — to the point where Laplace’s “I have no need of that hypothesis,” could both echo and overwhelm Newton’s “I do not feign...”

These laws are simple in the sense the physicist Richard Feynman meant when he described solving a problem in Newton’s Principia: “ ‘Elementary’ does not mean easy to understand. ‘Elementary’ means that very little is required to know ahead of time in order to understand it, except to have an infinite amount of intelligence.”


Perhaps predictably, though, Le Verrier didn’t start right away. He had applause to reap — a tour of England in 1847 was only one such distraction — and Paris between 1848 and 1850 was roiled by the political transition that ultimately produced France’s Second Empire, with the original Napoleon’s nephew seizing power as Napoleon III. Le Verrier, like Laplace before him, took part in revolutionary politics — and like his intellectual ancestor, managed to navigate treacherous shifts in power unscathed.

Observations are essential; but, as Le Verrier argued through his analysis of the minor planets, they are not in themselves sufficient. The scientist’s duty confronting some new circumstance is to find the meaning within the flood of new data. Half a century later, his compatriot the great mathematician Henri Poincaré would put it like this: “We can not know all facts and it is necessary to choose those which are worthy of being known.”

Le Verrier turned out to be a viciously effective academic politician. By the early 1850s, he had set his sights on control of the Paris Observatory, and with it control over the most significant astronomical research program in France.

With a good clock and an accurate fix on where on earth the event was being viewed, timing a planet’s entry or exit from a transit ranked among the most precise measurements available to astronomers.

A century and a half later, the one irreducibly extraordinary fact of this work remains how incredibly small an “error” Le Verrier uncovered. The unexplained residue of Mercury’s orbital dance came down to a perihelion that landed just .38 seconds of arc ahead of where it should every year. To put it into the form in which Le Verrier’s number became famous: every hundred years, during which Mercury travels a radial journey of 36,000 degrees, the perihelion of its orbit shifts about 1/10,000th beyond its appointed destination, an error of just 38 arcseconds per century.


“The man who untied Neptune with his nose — so to speak — cannot be accused of confounding accidental flies with actual planets. When he firmly asserts that he has not only discovered Vulcan, but has calculated its elements, and arranged a transit especially for its exhibition to routing astronomers...” the Times wrote, “there is an end of all discussion. Vulcan exists...”


And among the most prized trophies that brought them all to Rawlins? The outstanding solar system mystery: where, if anywhere, the elusive Vulcan might be seen. Henry Draper, a physician-turned-astronomer and a pioneering astrophotographer, led the largest expedition in town. Edison joined Draper’s party to pursue a technical goal of his own, testing a device he called a tasimeter, an infrared measuring instrument so sensitive that he wanted to see if it could detect faint IR radiation from the corona. Along with him came Norman Lockyer, probably the best-known scientist in the group. Founder of the journal Nature, he was one of the pioneers of the new technique of spectroscopy. In 1868 he had noticed a bright yellow band in the spectrum of solar light, which led him to identify the element helium — the first to be found beyond Earth, untouched by human hands. Then there was the man on a mission: James Craig Watson. Director of the Ann Arbor Observatory, Watson was the veteran of two prior eclipses and had discovered more than twenty asteroids. His reason for being in Wyoming was simple: Vulcan. The few minutes of daytime darkness during totality would be, as every astronomer knew, the perfect time to detect any intra-Mercurian bodies.

Edison returned to the station first, and he asked whether there might be anything else worth shooting nearby. Clarke told him that the surrounding plain enjoyed an abundance of jackrabbits — “what the locals call narrow-gauge mules.” Edison asked where he might find them, and Clarke “pointed west and noticing a rabbit in a clear space in the bushes, said there is one now.” Edison picked out a silhouette from the platform, but he wanted to make sure of his kill. He “advanced cautiously to within 150 feet and shot.” The animal did not move. He closed to one hundred feet. He fired again. The beast wouldn’t jump. He aimed, pulled the trigger once, and then again. His target stood its ground. Edison glanced over his shoulder and saw that the entire station staff had gathered for the show. The penny dropped. He’d been set up, played for a dude. His target looked like a desert hare, all right, all ears and legs. It was exactly where one might expect to spy such an exotic creature. And yet... Thomas Edison, genius, had just murdered... a stuffed jackrabbit. It had seemed so real.


And if the LHC hadn’t found its Higgs? That would have been a direct analogy to the problem Vulcan after 1878 seemed to pose (for all that no one addressed it): the failure to find the result theory anticipated in a context that demanded some solution would raise deep and (for theoretical physicists) very exciting questions.

Observing at the ragged edge of technology is always a tricky business.

Long gaps between prediction and observation always raise the question: what finally persuades science — scientists — to abandon a once successful idea? When do you take “no” for an answer? There’s a conventional response in science to that question: right away. Or at least as soon as you’re confident of the evidence. In a public talk delivered in 1963, Richard Feynman said that science is simply “a special method of finding things out.” But what makes it special? The way its answers get confirmed or denied: “Observation is the judge” — the only judge, as the catechism goes — “of whether something is so or not.”

Here’s a typical “Introduction to the Scientific Method” aimed at college students: “The scientific method requires that a hypothesis be ruled out or modified if its predictions are clearly and repeatedly incompatible with experimental tests…”—pretty much exactly what science fair contestants are told. The explanation goes on, though, to echo Feynman’s point: “No matter how elegant a theory is, its predictions must agree with experimental results if we are to believe that it is a valid description of nature. In physics, as in every experimental science, ‘experiment is supreme.’ ”

In other words: when a long-anticipated outcome fails to materialize, more than a single prediction lies in peril. If gravity waves don’t show up in ever more acute CMB measurements, then at some point the strand of inflation theory that requires them will be in trouble. Along those lines, once Vulcan refused to appear, decade after decade, what should have been done about that icon of the scientific revolution, Isaac Newton’s theory of gravity? Within the myth of the scientific method, there should have been no choice about the next move. “Experiment is supreme”... “Observation is the judge.” We hold this truth to be self-evident: the hard test of nature trumps even the most beloved, battle-tested, long-standing idea.

Does history behave like that? Do human beings?

No. Real life and cherished fables routinely diverge. After July 1878, almost all of the astronomical community abandoned the idea that a planet or planets of any appreciable size existed between the Sun and Mercury. But that broad consensus did not lead to any radical reassessment of Newtonian gravitation.


So it was that one day in 1907, he found himself staring out the window. Across the way, he saw a man fixing something on a roof. His imagination took over. In his mind’s eye, that suddenly luckless roofer slipped, slid, fell—and there it was, what Einstein would call “the happiest thought of my life.” It had just come to him that “if a man falls freely he will not feel his own weight.” A man crashing to his death would seem to be an odd image to evoke joy in anyone. And that treacherous roof was a very long way from the limb of the sun and the realm Vulcan had been supposed to roam. Even so: there stood an anonymous laborer, unaware of the mental play going on in the office across the way and, equally unknowing, about to take on a vital role in settling the fate of an undiscovered planet.

Once Einstein represented light as quanta in his equations, the calculation that followed reproduced Lenard’s results... and helped form the foundations of quantum mechanics, a set of ideas that is utterly intertwined with every facet of twenty-first-century life. That came in March. April brought Einstein’s proof of the existence and size of atoms and molecules, an exercise in statistical physics that remains the most frequently cited of his 1905 works, with applications that range from mixing of paint to Einstein’s own definitive explanation for why the sky is blue.

He followed that up with a related analysis that solved the long-standing mystery of Brownian motion — first observed in the random motion of dust or pollen in water. That sounds like a sidelight, a minor result, except that Einstein’s method of accounting for the outrageously large number of molecular collisions required to produce the wandering track of a pollen grain was a significant step in building perhaps the single most powerful idea in twentieth- and twenty-first-century science: the recognition that the fundamental nature of reality in many of its facets is determined by the behavior of crowds that can only be understood in statistical terms, and not by direct links in a chain of cause and effect.

From there, Einstein lays down the two pillars on which all the rest of his new idea will rest. One was the “relativity principle,” originally defined by Galileo. It holds that “the laws governing the change of state of any physical system do not depend” on whether someone observes that event from within a system or from the outside, looking in — as long as both vantage points “are in uniform motion relative to each other.” That is: it doesn’t matter whether you are standing by the track or riding a train. Newton’s laws of motion (and any other natural laws, of course) behave the same way in both circumstances, even if, say, the path of a ball thrown on the train looks different to people watching from either vantage point. Einstein’s second axiom was that the speed of light in a vacuum must be a constant, identical for all observers throughout the universe. The problem with that idea — and this had troubled scientists for decades before Einstein — is that if the speed of light truly does remain constant for all observers, that would seem to contradict Newton’s ideas about motion.

Einstein’s insight was to take seriously the implications of that evidence of a constant velocity for light. If the speed of light does not change with the motion of an observer, he argued, then to reconcile that fact with the rest of experience requires a change in the way one must think about the elements of speed—distance and time.

Newton’s God kept absolute time and absolute space throughout the universe, a divine clock striking the same hours at every point throughout all creation. That article of faith helped Newton to his genuinely revolutionary insight that the heavens above and the earth below are governed by a single set of laws, just one system of the world. As the flight of comets and the discoveries of planets seemed to prove, cosmic history seemed to possess a universal constancy, the same everywhere for all people at all times. Two centuries on, Einstein’s homely images of trains and timepieces and rulers laid waste to all that. His clocks tallied their seconds beautifully, but to a beat that varied in the eye of the beholder.

The realization that a falling man won’t feel his own weight provided the crucial hint that led Einstein to think about gravity along similar lines to those he used to analyze the relativity of time and space. Einstein formalized this insight as the “equivalence principle” — an axiom that would become as important to his thinking as the relativity principle had proved to be in 1905. In its simplest form, equivalence simply holds that a person in free fall — like the imaginary roofer — cannot distinguish between two possible descriptions of his circumstances. He can’t say whether he is falling under the influence of gravity, or just floating in a gravity-free region of space.

Privately, though, Einstein had a perfectly fine grasp of the tactics as well as the strategy of intellectual combat. He knew that he could win if and only if he could clearly demonstrate that his theory modeled reality better than Newton’s. On Christmas Eve he wrote to Conrad Habicht, an old friend, not a physicist, that he was working on a new, relativistic law of gravitation. His aim? “To explain the still unexplained secular changes in the perihelion of Mercury.”


His office overlooked the grounds of an insane asylum. He was thinking of quanta when he described the inmates he watched from his windows as the “madmen who do not study physics.”

But by no later than the middle of 1911, Einstein saw where any extension of relativity theory must lead. Gravity bends time.

But Einstein knew from E=mc² that energy and mass are equivalent, two faces of that single entity, mass-energy. The next thought was in some ways obvious after the fact—but at the time it represented a breakthrough. Any change in the amount of potential energy contained within a gravitational system would alter the total amount of mass-energy present — and hence the intensity of the gravity acting on the objects involved. That is: Einstein now realized that gravity can impose its own effect on itself, that every change in the conformation of the system alters the system’s gravitational behavior.

Einstein’s great advance came when he thought about what it had to mean that acceleration and gravity affect the flow of time: space-time would have to bend too, as one of its dimensions (time) flexes under the influence of gravity. With that realization, Einstein’s thinking took on the elegant sweep of his best work. His new catechism: gravity is a property of matter and energy together (not matter alone, as in Newton’s view)—and gravity bends time. Taken together, those two facts led to this conclusion: the total amount of mass and energy determines the strength of the gravitational field in any particular location, and hence the amount any given region of space-time will flex. That warping of space-time in its turn has to affect the paths that matter and energy may take through the cosmos. Space-time is no stage, merely the box the universe comes in; rather, Einstein now realized, it is active, dynamic, shaped by what it contains. As he later put it, he had at last grasped a crucial truth: “the foundations of geometry have physical significance.”

But the idea itself was almost there, as Einstein knew, or at least felt very deeply. It was close enough for him to see that it could be tested. The theory made one clear prediction: light as well as matter would have to follow the contours of space-time, which meant that a ray of light passing close to the edge of the solar disk would bend round that gravity well created by the sun’s mass. The effect was big enough to be detectable, Einstein realized, but only during a total eclipse of the sun. Under the new theory that deflection would be .87 seconds of arc—a number well within the reach of experienced eclipse observers.


That was one of Einstein’s few consolations in that miserable time. He never reconciled himself to the shock of the war — not merely the fact of battle, but the naked joy that everyone, it seemed, took in the fight. “That a man can take pleasure in marching in fours to the strains of a band is enough to make me despise him,” he wrote years later. “Heroism on command, senseless violence and all the loathsome nonsense that goes by the name of patriotism — how passionately I hate them.” What was worse, in Einstein’s eyes, was that the extraordinary collection of scientific minds that had lured him to Berlin turned out to be as war-drunk as any mob in the street. The most potent symbol of this almost-personal betrayal was Einstein’s closest Berlin friend, Fritz Haber, who would later win the Nobel Prize for chemistry. With the start of the war, he shifted his lab to a near-total military focus.

To Einstein, this was simply madness. “Our whole, highly praised technological progress,” he wrote in what is now one of his most famous aphorisms, “and civilization in general can be likened to an axe in the hand of a pathological criminal.” World War I broke something in Einstein, destroying forever the faith he’d affirmed until the fall of 1914 — that there was a genuinely supranational, disinterested elite of the mind, united by what he most valued, the study of “this huge world, which exists independently of us human beings and which stands before us like a great external riddle.”

Even allowing for such difficulties, though, it’s remarkable that none of his German colleagues, despite all the effort they’d expended to lure Einstein into their midst, paid any real notice to what they’d just heard: first, that Isaac Newton was wrong about gravitation, and second, that to get gravity right, physicists had to reimagine fundamental assumptions about the behavior of the universe. Einstein told them to their faces, twice, and he published the argument as a fifty-five-page article in the proceedings of the Prussian Academy. When his article appeared, Einstein did receive a few letters from foreign researchers exploring the corners of his ideas, but nothing fundamental. No one in Berlin went even that deep. Einstein was unsurprised. The year before, the dean of German physics, Max Planck, had warned him not to tackle gravity. It was too hard a problem, he said, and “even if you succeed no one will believe you.” Science may celebrate the triumph of the better idea. Scientists don’t, not always, not immediately, not when the strangeness involved takes extraordinary effort to embrace.

November 18, 1915. Masking his emotions behind the required decorousness of scientific communication, Einstein revealed almost no sign of any excitement in his presentation to the Prussian Academy. “The calculation for the planet Mercury yields,” he told his audience, “a perihelion advance of 43 arc seconds per century, while the astronomers assign 45″ ± 5″ per century as the unexplained difference between observations and the Newtonian theory.” Belaboring the obvious, he added that “this theory therefore agrees completely with the observations.” Such neutral tones could not conceal the explosion thus detonated. Decades of attempts to save the Newtonian worldview were at an end. Vulcan was gone, dead, utterly unnecessary.

It was said of Newton that he was a fortunate man, because there was only one universe to discover, and he had done it. It had been said of Le Verrier that he discovered a planet at the tip of his pen. On the 18th of November, 1915, Einstein’s pen destroyed Vulcan — and reimagined the cosmos.

Much later, Einstein tried again to describe what he felt at that first, private instant of great discovery. He couldn’t. “The years of searching in the dark for a truth that one feels but cannot express, the intense desire and the alternations of confidence and misgiving until one breaks through to clarity and understanding,” he wrote, “are known only to him who has experienced them.”


Three weeks into the era of general relativity, Vulcan was gone forever. After half a century in which it had been at once necessary and absent, it was finally revealed to be pure fiction. Its repeated “discovery” was nothing more than an object lesson in how easy it is to see what ought to be rather than what is.

What moral to draw, then, of the nonexistence of an innermost planet and the universal triumph of general relativity? At the least this: Science is unique among human ways of knowing because it is self-correcting. Every claim is provisional, which is to say each is incomplete in some small or, occasionally, truly consequential way. But in the midst of the fray, it is impossible to be sure what any gap between knowledge and nature might mean.

Give Einstein (almost) the last word. In 1918, he spoke at the German Physical Society. There, he tried to describe what goes on inside the mind of someone attempting to interrogate nature at the edge of understanding. He didn’t speak of logic, or rigor, or some exceptional mental talent. Instead, the driving force behind great work turned, he said, on “the longing to behold…preexisting harmony.” Getting there required the usual work of the researcher, of course, mathematics to learn, calculations to perform, the endless cat-and-mouse with errors of thought and execution. All that had to be done. But to do it, day after day, there was a certain way one had to be: “the state of mind which enables a man to do work of this kind” he said, “is akin to that of the religious worshiper or the lover; the daily effort does not originate from a deliberate intention or program, but straight from the heart.”


In the spring of 2014, Ta-Nehisi Coates talked me through the idea and after two or three afternoons he told me just to start writing, never mind what might happen with any accumulating pile of words.

John Durant, director of the MIT Museum, has been a friend and intellectual goad for years.

A special thanks all their own goes to my students in the MIT Graduate Program in Science Writing, and especially those of the class of 2015, present at the creation.