Forget Y2K!

This issue carries the first of several stories linked to the millennium. Among other purposes, they aim to provide relief from the pervasive and pesky "Y2K" syndrome, that myopic angst about how the world's computers will feel when they wake up this coming January 1. Will they think we are back on January 1, 1900? Or will everybody's paycheck still be OK?

That is no small matter. But the arrival of A.D. 2000 probably ought to put us in mind of somewhat broader perspectives, including some long-lens looks into the far past, or leaps of the imagination back and forth in human history, 1,000 years at a time. What was going on a thousand years ago? Or, apart from the major event that gave rise to the terms "B.C." and "A.D." in our dating system, two thousand years ago?

If the editors of Time had been around to do their Man of the Year search near the time of the birth of Christ, their choice would not have fallen on the babe in the manger, but on Caesar Augustus, all-powerful in an empire that stretched the rule of Roman law from Spain across Europe and into Asia Minor. Augustus himself was about to be declared a god. One millennium ago Time would have picked Leif Eriksson; in the year 1000, navigating in a Viking longship, he sailed west to the New World and called it Vinland.

Time flies, and keeping precise track of it, not merely in thousand-year units but in days, months and years, as well as seconds, minutes and hours, is a human preoccupation; so two of our millennium stories deal directly with time. The first article in the series, "Calendar," adapted from a remarkable new book by David Ewing Duncan, describes the long search for an accurate calendar and how, over tortuous centuries, science and religion poked and pinched into existence the one we still use. (For hundreds of years a major motive that drove calendar searchers was the pious compulsion to determine exactly when Easter ought to fall.) The second story concerns chronometers, and especially how their evolving degree of accuracy influenced — and continues to influence — trade, exploration and war.

Clocks and calendars, being human inventions, are relatively recent. Another recent — and arguably human — invention is the Devil himself. An upcoming story suggests that if the Devil didn't exist, it might have been necessary to invent him (or her), and tries to get a fix on how the Devil is making out today, in an age widely held to be shameless.

Lashed to the dailiness of life, most people reckon a generation (at four to a century) as a considerable amount of time. Even the "threescore years and ten" biblically allotted to one man's lifetime can seem interminable. The three to four thousand years of history that go back to include the flowering of ancient Greece are regarded as vast stretches of time, in part because so much history has been crammed into them. Other civilizations started earlier, but it is only in the past three millennia that most of the events and human inventions that we think of as adding up to Western Civilization appeared. During this span there arose, in the arts and government, in religion and philosophy, above all in science, technology and medicine, the works of humankind that we point to most often as proof that we are making some kind of progress: the printing press, the steam engine, barbed wire, the telephone, electricity, the "painless" air drill that dentists now use, penicillin and the cappuccino machine, not to mention such mixed blessings as the internal combustion engine, television, and atomic power.

In that time, the Western world, at least, has also seen astonishing variations in religion, starting with polytheism, its most dramatic example being a dysfunctional family of deities on Mount Olympus. The father had a penchant for pursuing pretty mortals; his wife was vindictively jealous; the kids feuded back and forth. How to propitiate them all became a complex and risky business.

In most ways monotheism was an immense improvement — a radiant vision of the whole creation unified, alive with moral purpose, with a single power keeping score in this world and the next. Whether divinely inspired or not, it is surely one of the great and enduring inventions of man. Two hundred years ago, however, a more secular vision began to take hold, not based on the existence of a divine being but on high hopes for the perfectibility of human nature. The Marquis de Condorcet, an Enlightenment philosopher, put it clearly in 1793 when he wrote, "The total mass of the human species, through alternating periods of calm and agitation, good and evil, forever marches, albeit at a slow pace, towards a greater perfection." With continued help from science and a serious misreading of evolution as progress, that idea has become a secular religion of the modern world.

The change in attitude is easily summed up by a pair of diametrically opposed quotations. One, from Psalm 8, asks a rhetorical question: "What is man, that thou art mindful of him?" The other comes from the Greek Sophist Protagoras: "Man is the measure of all things." As the end of the second millennium A.D. approaches, the two resounding assertions may be worth putting in perspective, perhaps even on New Year's Eve of 1999, as a reminder not only of the breathtakingly brief time during which human beings have inhabited the planet, but of the breathtakingly long history of life on earth before we ever appeared. There is some measure of encouragement, perhaps, in considering just what latecomers we are.

Celebrated biologist H. J. Muller once asked the readers of one of his essays to imagine the history of all life-forms on earth as a rope composed of evolving cells and genes stretching several hundred miles from New England to New York City, and ending at the center of a desk in the Wall Street office of J. P. Morgan. The line represents three or four billion years, starting as far away as Boston with the appearance of the first minute signs of life on earth, primeval protoplasm. Blind chemical forces go to work. Genes mysteriously multiply and differentiate. Mutate, or don't. Trial and error in numbers beyond computing. Beyond imagination, even. Following along the line, Muller notes, will seem boring at first, because "there will be no actual 'beasts' as we ordinarily think of them (four-footed land animals) until we are well within the limits of New York City."

It is not until the line reaches Harlem that the first mammals and birds appear, coexisting with huge dinosaurs, which do not drop out entirely until the lifeline crosses 42nd Street. Monkeys first arrive a bit south of there, somewhere around Macy's, but nothing more complex than an ape shows up until the line has reached a spot directly in front of the House of Morgan. Inside the building, 15 feet from the desk, stands the first Neanderthal. Homo sapiens, "man the wise," as Muller notes with some irony, "leaves his first remains within the private office, only seven and a half feet from the desk." The earliest known "civilization" (not more than 14,000 years ago) leaves its crockery only "a yard and a half from the desk." Muller concludes: "On the desk, one foot from the center, stands old King Tut. Five and a half inches from the center we mark the Fall of Rome and the beginning of the Dark Ages. Only one and a half inches from the present end of the cord [New England-to-New York lifeline] come the discovery of America and the promulgation of the Copernican theory — through which man opens his eyes for the first time to the vastness of the world in which he lives and his own relative insignificance." Half an inch from the end "start the first faint reverberations of the Industrial Revolution.... A quarter of an inch from the end Darwin speaks, and man awakes to the transitory character of his shape and his institutions."
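For the curious, the implied scale of Muller's rope is easy to check. The quick sketch below is not from the essay itself; it assumes a Boston-to-Manhattan run of about 215 miles and a 3.5-billion-year history of life, figures chosen from within the ranges Muller gives ("several hundred miles," "three or four billion years").

```python
# Back-of-the-envelope check of Muller's rope analogy.
# Assumed figures (the essay says only "several hundred miles" and
# "three or four billion years"): 215 miles Boston to Manhattan,
# 3.5 billion years of life on earth.

MILES = 215
YEARS = 3.5e9
INCHES_PER_MILE = 5280 * 12  # feet per mile times inches per foot

years_per_inch = YEARS / (MILES * INCHES_PER_MILE)
print(f"One inch of rope = about {years_per_inch:,.0f} years")  # ~257 years

# Convert a few of Muller's landmark distances back into years
# before the mid-twentieth century, when he wrote:
landmarks = [
    ("King Tut (1 foot)", 12.0),
    ("Fall of Rome (5.5 inches)", 5.5),
    ("Columbus and Copernicus (1.5 inches)", 1.5),
    ("Darwin (0.25 inch)", 0.25),
]
for label, inches in landmarks:
    print(f"{label}: roughly {inches * years_per_inch:,.0f} years ago")
```

At roughly 250 to 300 years per inch, each landmark comes out within the right order of magnitude, which is all the analogy asks of it.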

A few of the proportions have changed minutely since the piece was written more than a half-century ago. But anyone wanting to put the "Y2K" dilemma in perspective can safely turn to Muller.
