
CHAPTER 14
The End of Science

Maybe it is because we are at the turn of the century, facing the new millennium. Maybe it is because there has been no obvious big breakthrough for a couple of decades. Maybe global pessimism is the current fad. For whatever reason, several recent books have suggested that the "end of science" may be in sight.

Their titles betray the direction of their thinking: Dreams of a Final Theory: The Scientist's Search for the Ultimate Laws of Nature (Weinberg, 1992); The End of Physics: The Myth of a Unified Theory (Lindley, 1993); The End of Science: Facing the Limits of Knowledge and the Twilight of the Scientific Age (Horgan, 1996).

These all suggest, in the case of the last book with considerable relish, that the great moments of science have all occurred; that scientists are now on a road of diminishing returns; and that the next hundred years will offer nothing remotely comparable to the discoveries of the last few centuries.

Steven Weinberg is a physicist, and a great one. He would very much like to see a "theory of everything" in his lifetime. That does not mean that everything will then have been explained, only that the most basic underpinnings of everything, which he sees as the laws of fundamental physics, will have been established. He recognizes that there will be more questions to be answered, and perhaps many discoveries in other branches of science that will come to be regarded as absolutely radical and basic. But physics itself, in his view, is nearing its final form.

David Lindley and John Horgan are both editors, at Science magazine and Scientific American respectively. Lindley, after a careful review of the development of physics since the end of the last century, disagrees with Weinberg. He concludes that the "theory of everything will be, in precise terms, a myth. A myth is a story that makes sense within its own terms . . . but can neither be proved nor disproved." Scientists in pursuit of a final theory are then like dogs chasing an automobile. What will they do with it if they catch it?

Horgan takes a broader approach. He interviewed scores of eminent scientists who have made major contributions in their diverse fields. In the end, his conclusion is that, whether or not we are approaching a final theory, we are at any rate at the end of all discoveries of the most fundamental kind. The scientists of future generations will mainly be engaged in mopping-up operations.

This tune may sound familiar. It has been heard before, notably at the end of the nineteenth century. Here is Max Planck, recalling in 1924 the advice given to him by his teacher, Philipp von Jolly, in 1874: "He portrayed to me physics as a highly developed, almost fully matured science . . . Possibly in one or another nook there would perhaps be a dust particle or a small bubble to be examined and classified, but the system as a whole stood there fairly secured, and theoretical physics approached visibly that degree of perfection which, for example, geometry has had already for centuries."

Is it more plausible now than it was then that the end of science is in sight? And if so, what does it mean for the future of science fiction?

As Sherlock Holmes remarked, it is a capital mistake to theorize before one has data. Let us examine the evidence.

First, let us note that because Horgan's scientists are already recognized major figures, most of them are over sixty. None is under forty, many are over seventy, a few are well into their eighties, and several have died in the two years since the book was published. Although everyone interviewed seems as sharp as ever, there is an element of human nature at work which Horgan himself recognizes and in fact points out. Gregory Chaitin, in a discussion with Richard Feynman, said he thought that science was just beginning. Feynman, a legend for open-mindedness on all subjects, said that we already know the physics of practically everything, and anything that's left over is not going to be relevant.

Chaitin later learned that at the time Feynman was dying of cancer. He said, "At the end of his life, when the poor guy knows he doesn't have long to live, then I can understand why he has this view. If a guy is dying he doesn't want to miss out on all the fun. He doesn't want to feel that there's some wonderful theory, some wonderful knowledge of the physical world, that he has no idea of, and he's never going to see it."

We are all dying, and anyone over seventy is likely to be more aware of that than someone twenty-five years old. But the latter is the age at which, particularly in science, truly groundbreaking ideas enter the mind. If we accept the validity of Chaitin's comments, Horgan's interviews were foreordained to produce the result they did. Science, as perceived by elderly scientists, will always be close to an end.

As Arthur Clarke has pointed out, when elderly and distinguished scientists say that something can be done, they are almost always right; when they say that something cannot be done, they are almost always wrong. Fundamental breakthroughs, carrying us far from the scientific mainland, are, before they take place, of necessity unthought if not unthinkable. That is the philosophical argument in favor of the idea that we are not close to the end of progress. There is also a more empirical argument. Let us make a list of dates that correspond to major scientific events. Lists like this tend to be personal; the reader may choose to substitute or add milestones to the ones given here, or correct the dates to those of discovery rather than publication.


1543: Copernicus proposes the heliocentric theory displacing Earth from its position as the center of the universe.

1673: Leeuwenhoek, with his microscopes, reveals a whole new world of "little animals."

1687: Isaac Newton publishes Principia Mathematica, showing how Earth and heavens are subject to universal, calculable laws.

1781: Herschel discovers Uranus, ending the "old" idea of a complete and perfect solar system.

1831: Michael Faraday begins his groundbreaking experiments on electricity and magnetism.

1859: Darwin publishes The Origin of Species, dethroning Man from a unique and central position in creation.

1865: Mendel reports the experiments that establish the science of genetics.

1873: Maxwell publishes A Treatise on Electricity and Magnetism, giving the governing equations of electromagnetism.

1895-7: Röntgen, Becquerel, and J.J. Thomson reveal the existence of a world of subatomic particles.

1905: Einstein publishes the theory of special relativity.

1925: The modern quantum theory is developed, primarily by Heisenberg and Schrödinger.

1929: Hubble discovers the expansion of the universe.

1942: The first self-sustaining chain reaction is initiated by Fermi and fellow-workers in Chicago.

1946: ENIAC, the first general-purpose electronic digital computer, is completed by Eckert and Mauchly.

1953: Crick and Watson publish the structure of the DNA molecule.

1996: Possible evidence of early life-forms on Mars is reported.


Is there a pattern here? The most striking thing about this list of dates and events might seem to be the long gap following 1953, since the discovery of Martian life is still highly tentative. We have not seen so long a hiatus for more than a hundred years.
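For readers who want to check that claim, here is a minimal sketch in Python, assuming the milestone years as listed above (with 1896 standing in for the 1895-7 span); it simply prints the interval between successive entries:

    # Milestone years from the list above; 1896 is a stand-in for 1895-7.
    years = [1543, 1673, 1687, 1781, 1831, 1859, 1865, 1873,
             1896, 1905, 1925, 1929, 1942, 1946, 1953, 1996]

    # Print the gap between each pair of successive breakthroughs.
    for earlier, later in zip(years, years[1:]):
        print(f"{earlier} -> {later}: {later - earlier} years")

The final interval, 1953 to 1996, is forty-three years; the last gap of comparable length, the fifty years from 1781 to 1831, closed well over a century earlier.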

But is the gap real? In the 1830s, Faraday's experiments on electricity were considered fascinating, but hardly something likely to change the world. In 1865, scarcely anyone knew of Mendel's experiments—they lay neglected in the Proceedings of the Brünn Society for the Study of Natural Science for thirty-five years. And in 1905, only a small handful of people realized that the relativity theory offered a radically new world-view. (Max Born, later one of Einstein's closest friends, wrote: "Reiche and Loria told me about Einstein's paper, and suggested that I should study it. This I did, and was immediately deeply impressed. We were all aware that a genius of the first order had emerged." Born, however, was himself a genius. It takes one to know one.)

It also takes a long time to accept ideas that change our basic perception of reality. Remember that Einstein was awarded the Nobel Prize in 1921 mainly for his work on the photoelectric effect, and not for the theory of relativity. Relativity was still considered by many to be controversial.

Will posterity record the year that you read this book as an annus mirabilis, the marvelous year when the defining theory for the next centuries was created?

Am I an optimist, if I find that suggestion easier to believe than that we, in this generation, are seeing for the first time in scientific history the wall at the edge of the world? Humans are often guilty of what I call "temporal chauvinism." It takes many forms: "We are the first and last generation with both the resources and the will to go into space. If we do not do it now, the chance will be lost forever." "We are the final generation in which the Earth is able to support, in comfort, its population." "We are the last generation who can afford to squander fossil fuels." "After us, the deluge."

I believe that science, science new and basic and energetic, has a long and distinguished future, for as far as human eye can see. And I believe that science fiction, which as science draws on contemporary developments but which as literature draws on all of history, will play an important role in that future.

Certainly, we can envision and write about times as bleak and grim as you could choose; but we can also imagine better days, when our children's children may regard the world of the late twentieth century with horror and compassion, just as we look back on the fourteenth century in Europe.

Science fiction fulfills many functions: to entertain, certainly (otherwise it will not be read), but also to instruct, to stimulate, to warn, and to guide.

That is science fiction at its best, the kind that you and I want to read and write. I see no reason why any of us should settle for less.

