David Alexander Gardiner (UK)
The question of the age of the Earth and its former inhabitants is one of great interest to us all. Most are aware that the Earth is understood today to be approximately 4.6 billion years old, but what is the story of the momentous quest to unravel the mystery of time?
Speculations as to the age of the universe abounded in ancient and medieval times. We are all familiar with the literalist understanding of the Old Testament, from which Archbishop Ussher famously calculated a 4004 BC date for the beginnings of the Earth. Yet this was one of the shortest chronologies in existence: the Babylonians spoke of many hundreds of thousands of years; the Egyptians of many tens of thousands; and the Hindus of many billions in their cosmological speculations. None of these early traditions, however, was scientific in basis. Rather, they were religious or philosophical, not grounded in experimentation and observation. It would not be until after the Renaissance that people began employing scientific methodologies to unravel the mystery.
Various early scholars speculated upon the Earth’s geological history, including that universal genius Leonardo da Vinci. Leonardo noted that fossils had once been actual living creatures and that the ocean must once have covered the land. As regards the age of the world, however, few people dared to challenge the conventional wisdom based upon the Genesis narrative – one wonders what da Vinci’s own view might have been. Some scholars, though, did suggest ways of ascertaining the Earth’s potential age, among them Edmond Halley.
In 1715, he published a paper detailing his idea of estimating the age of lakes and oceans by measuring their rate of salt accumulation, and that this could have implications for the age of the world itself. Though flawed and somewhat crude, such ideas were unprecedented and paved the way for future investigators. However, it was generally not until towards the end of the eighteenth century that the question of the age of the Earth began to be approached in more detail by academics.
Comte de Buffon and Benoît de Maillet were prominent eighteenth-century scientists who proposed specific ages for the Earth itself. Buffon performed several calculations based upon the idea of a gradually cooling Earth, to ascertain how much time might have elapsed since its molten state. He eventually settled on a figure of around 75,000 years – young by today’s estimates, perhaps, but a whole order of magnitude greater than those envisioned by most of his contemporaries.
Meanwhile, Maillet had been working with the idea that the oceans had slowly evaporated, thus explaining why seashells were found so high above the present sea level – the ocean had simply dropped in height over time due to its loss of water. Though he was mistaken in this idea, his estimate of around two billion years must have been mind-blowing to his contemporaries – imagine hearing the idea that something believed to have taken place six thousand years ago had really taken place billions of years in the past.
At this stage, geology was still not a distinct science in itself. However, this would soon change. In 1795, James Hutton published his highly important book, Theory of the Earth, in which he foreshadowed Sir Charles Lyell’s axiom that “the present is the key to the past” – in other words, the natural forces we are familiar with today should be extrapolated backwards in time to help us understand how and why the Earth has changed.
Much of his inspiration had come from studying strata – the layers of sedimentary deposits that accumulated in past ages. Most of us are familiar with Siccar Point, the famous location at which Hutton observed an unconformity between layers of strata. Observations such as these caused Hutton to speculate upon an almost endless past, though he did not dare to suggest a specific time dimension for the Earth, other than one shrouded in immensity (often now referred to as ‘deep time’). His views received only moderate attention initially, and it was not until a few decades later that the majority of academics accepted his general ideas.
Of all the nineteenth-century geologists, Sir Charles Lyell is probably the best known. His ground-breaking magnum opus, Principles of Geology, was first published in the 1830s and went through 12 editions, the last appearing posthumously in 1875. He picked up Hutton’s observations on strata and their implications for the age of the Earth, and enlarged upon the subject with his own ideas and observations.
His book had a great impact upon the scientific world, notably on Charles Darwin, who read the work during his famous voyage on the Beagle. Lyell examined the changes in shell fauna throughout the Cenozoic era – everything that has occurred since the mass extinction of the dinosaurs – and attempted to quantify these changes in years. According to his calculations, 80 million years had elapsed since the beginning of the Cenozoic; calculating further backwards in time, he found that 240 million years had passed since the beginning of the Cambrian, the first period in which abundant lifeforms appear in the fossil record.
Compare these figures with our best modern calculations using the latest techniques: approximately 65–66 million years since the beginning of the Cenozoic and around 540 million years since the beginning of the Cambrian. Though not exactly correct, his calculations were a vast improvement upon earlier estimates.
Various other calculations were made throughout the nineteenth century to ascertain an approximate age, but the most influential was a calculation made by Lord Kelvin, the most widely respected physicist of that century. In his opinion, the Earth’s antiquity could be obtained by calculating the length of time it would take for the Earth to cool since its molten origins – in essence, a more advanced version of Buffon’s theory. Ultimately, Kelvin settled upon a 25-million-year age for the Earth as the most probable.
However, geologists and biologists had enormous difficulties with this figure, as they found it almost impossible to believe that the vast accumulations of strata within the Earth’s rocks, and the enormous amount of evolutionary change recorded therein, could be crammed into a mere 25 million years or so – the blink of an eye in comparison to what was needed. However, Kelvin’s computations seemed so detailed and meticulous that few questioned the underlying assumptions involved.
Many other calculations seemed superficially to fit in with Kelvin’s estimate, but these were somewhat questionable. For example, Charles Walcott, after studying ancient sediments, adopted an average sedimentation rate of one foot per 200 years, and consequently arrived at a figure of 27.5 million years having elapsed since the beginning of the Cambrian.
His estimated rate of deposition was far too high, however, and his calculations were therefore flawed. In fact, estimates based upon sedimentation rates varied wildly across the Earth and consequently so did the calculations used to estimate the time dimension involved. Another issue was trying to estimate the maximum thickness of all the sedimentary rocks on Earth, to turn the sedimentation rate into an actual figure for the time involved.
However, dozens of scientists employed the method of sediment accumulation, and estimates for the duration since the beginning of the Cambrian varied from a mere 3 million to over 1,500 million years. Interestingly, however, the average of the various estimated sedimentation rates, multiplied by the maximum thickness, does in fact yield a figure in close agreement with modern estimates. This shows that such methods can be effective, but only when various studies are taken together to produce a more balanced calculation.
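A minimal sketch of this sediment-clock arithmetic, in Python. Walcott's rate of one foot per 200 years comes from the account above; the total thickness and the list of rival estimates are purely illustrative values, chosen only to show how a single estimate and an averaged one are computed:

```python
def age_from_sediment(thickness_ft: float, years_per_ft: float) -> float:
    """Elapsed time = total thickness of section x deposition time per foot."""
    return thickness_ft * years_per_ft

# A Walcott-style single estimate: at 1 ft per 200 years, a figure of
# 27.5 million years corresponds to about 137,500 ft of section
# (the thickness here is back-calculated for illustration).
walcott_age = age_from_sediment(137_500, 200)
print(f"{walcott_age / 1e6:.1f} million years")  # 27.5

# Averaging many divergent estimates (hypothetical values spanning the
# 3 to 1,500 million year range mentioned above) smooths out the wild
# variation in any single study.
estimates_myr = [3, 20, 90, 300, 700, 1_500]
mean_myr = sum(estimates_myr) / len(estimates_myr)
print(f"mean estimate: {mean_myr:.0f} million years")
```

The point of the averaging step is exactly the one made above: no individual sedimentation-rate study was trustworthy, but taken together they bracket a more reasonable figure.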
Another method was to estimate how long it would have taken for the oceans to have absorbed their salt from the weathering of rock, assuming all the salt was obtained in this manner – similar to Halley’s suggestion in the early eighteenth century. John Joly and W. J. Sollas used this method, obtaining figures ranging from tens of millions to hundreds of millions of years. However, estimates made some decades later suggested much longer durations of thousands of millions of years, in line with modern understanding. This demonstrates the vagueness of these methods and the importance of accurate data upon which to base the calculations: no matter how careful the arithmetic, if the underlying figures are in error, the conclusions will be in error.
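The salt-clock arithmetic is a simple division: the ocean's total salt inventory over the annual delivery from river weathering. The figures below are rough modern round numbers used purely for illustration, not the values Joly or Sollas actually worked with:

```python
# Illustrative sketch of the salt-clock method: divide the ocean's
# dissolved sodium by the annual sodium input from rivers.
# Both quantities are rough, illustrative round numbers.
ocean_sodium_tonnes = 1.4e16      # approx. total sodium dissolved in the oceans
river_input_tonnes_per_year = 2e8  # approx. sodium delivered by rivers each year

age_years = ocean_sodium_tonnes / river_input_tonnes_per_year
print(f"{age_years / 1e6:.0f} million years")  # 70
```

With these inputs the clock reads about 70 million years, in the same tens-to-hundreds-of-millions range Joly obtained – and, as the text notes, the method's real weakness is the assumption that salt only ever accumulates and is never removed.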
Soon after this, however, early pioneers of the study of radioactivity began to cast doubts upon such limited durations, especially upon Kelvin’s calculations of a relatively young Earth. As radioactivity produced heat in the process of decay, any calculations of a gradual rate of heat loss were almost certainly wrong. Ernest Rutherford, a young yet important physicist at this time, delivered a lecture in 1904 at which Kelvin was present in the audience. Suddenly struck with anxiety at upsetting the old gentleman, Rutherford was relieved to find Kelvin asleep.
However, as Rutherford approached the critical section of his lecture, Kelvin awoke and, as Rutherford later recorded, “I saw the old bird sit up, open an eye and cock a baleful glance at me!” Fortunately for Rutherford, he was able to soften the blow by stating that Kelvin’s calculations were essentially correct, provided there was no extra heat source available. Radioactivity, of course, provided exactly such a heat source. There was, moreover, a second implication, not initially apparent: radioactivity would provide a way of calculating the actual age of any rock that contained the appropriate radioactive isotopes.
While scientists abandoned Kelvin’s views, they still had not found a way to reliably calculate the age of the rocks within the Earth, let alone the Earth itself. In the early 1900s, Robert John Strutt was the first to attempt an estimate of the age of certain rocks, using the new discovery of radioactive decay. Noting the gradual production of helium from the decay of certain radioactive isotopes, he obtained inaccurate, yet important early estimates, which ran into hundreds of millions of years.
However, it was not until 1911 that the idea gained prominence in the geological community. In that year, Arthur Holmes, another young physicist at the time, published a paper in which he furthered the idea that radioactivity could be used to estimate the age of the Earth. His paper has become a classic, and his early estimates are extraordinarily close to the best estimates made today.
The method of dating igneous rocks was simple in theory: since the rate at which radioactive atoms decay into stable ones could be measured in a laboratory, it was only a matter of extrapolation to determine when the process had begun. This would tell us only how long had passed since the formation of a particular rock, each rock having formed at a different time, of course. However, by dating rocks in different places and strata, it was possible to build a direct, and more or less accurate, chronology for the entirety of Earth’s history. Arthur Holmes and most physicists agreed that the helium method was somewhat flawed and difficult to use reliably, preferring a different technique – the uranium-lead method – which became the most commonly used means of estimating the ages of rocks.
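The extrapolation described above can be sketched with the standard decay law: if a mineral now holds D atoms of radiogenic daughter for every P atoms of surviving parent, its age is t = (1/λ)·ln(1 + D/P), where λ is the decay constant. The uranium-238 half-life below is the accepted modern value; the isotope ratio in the example is illustrative:

```python
import math

def radiometric_age(daughter_per_parent: float, half_life_years: float) -> float:
    """Age from the decay law: t = (1/lambda) * ln(1 + D/P)."""
    decay_const = math.log(2) / half_life_years  # lambda, per year
    return math.log(1 + daughter_per_parent) / decay_const

U238_HALF_LIFE = 4.468e9  # years, for the 238U -> 206Pb decay chain

# A rock measured with one atom of radiogenic lead per atom of remaining
# uranium has sat through exactly one half-life:
age = radiometric_age(1.0, U238_HALF_LIFE)
print(f"{age / 1e9:.3f} billion years")  # 4.468
```

Real uranium-lead work is considerably subtler (initial lead, lost daughter products, multiple decay chains), but the core extrapolation Holmes relied on is this one-line formula.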
This method was greatly refined throughout the twentieth century and many new radiometric dating methods were also developed. For example, radiocarbon dating could date recent organic materials from the past few tens of thousands of years, making it particularly useful in archaeology; and methods such as potassium-argon dating helped confirm the accuracy of the uranium-lead method for the earlier periods. Although minor inconsistencies arise from time to time, it is the overwhelming consistency that demonstrates the reliability of these methods – especially when several different techniques can be applied to date the same past event and all agree with one another. Even where there are inconsistencies, the errors are usually within a fairly narrow margin. By the second half of the twentieth century, virtually every scientist accepted that radiometric dating offered a genuine and reliable way to determine the age of past events.
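Radiocarbon dating, mentioned above, is the same decay-law idea run the other way: from the fraction of carbon-14 remaining in a sample, the age is t = −t½·log₂(fraction). The half-life below is the conventional value of about 5,730 years; the sample fraction is illustrative:

```python
import math

C14_HALF_LIFE = 5_730  # years, the conventional carbon-14 half-life

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age from the surviving fraction of carbon-14: t = -t_half * log2(f)."""
    return -C14_HALF_LIFE * math.log2(fraction_remaining)

# A sample retaining 25% of its original carbon-14 has passed exactly
# two half-lives:
print(f"{radiocarbon_age(0.25):.0f} years")  # 11460
```

Because the half-life is so short, the surviving fraction becomes unmeasurably small after a few tens of thousands of years – which is precisely why the method suits archaeology rather than deep geological time.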
What, then, of the age of the Earth? The oldest rocks directly dated on Earth are somewhat younger than its true age – the oldest complete rocks are about 4 billion years old, and the oldest detrital fragments about 4.4 billion. The Earth itself must, of course, be somewhat older still. Meteorites have therefore been dated to ascertain an age for the Earth, on the assumption that they formed at around the same time. If this is correct – and it seems reasonable enough – a figure of about 4.6 billion years is obtained.
The importance of this cannot be overstated. With an accurate understanding of time, we can now seek to unravel further mysteries that would hitherto have been impossible to decipher. Understanding the true history of our own genus is now possible, as accurate dates for fossils can be obtained. We can finally interpret not only the sequence of past events, but how long they took and what their potential causes might have been. In scientific research, it seems, when we answer one question a dozen more arise in its place. This, however, is the nature of true science, and it is the reason we keep searching for answers – each answer uncovering more questions in a never-ending quest to improve our understanding. Perhaps, in the future, we may learn more of Earth’s early history and refine our understanding further. For the time being, however, it seems the mystery of deep time has been solved.