Today, rising carbon dioxide in the atmosphere is a cause for concern, but 2.7 billion years ago, high levels of CO2 probably kept our planet warm enough for life even though the sun was about 20% fainter than it is today.
A newly published study, based on analyses of ancient micrometeorites and a fresh round of computer modeling, estimates just how high those CO2 levels were. The likeliest concentration is somewhere in excess of 70% CO2, scientists from the University of Washington report today in the open-access journal Science Advances.
That’s good news for astrobiologists, because such an environment matches up well with a picture of an early Earth where global mean temperatures were in the mid-80s Fahrenheit (roughly 30 degrees Celsius) or cooler. The high CO2 levels wouldn’t be livable for us humans, but they would have been fine for the early organisms that ruled the Earth before atmospheric oxygen levels rose.
The findings “could also inform our understanding of Earth-like exoplanets and their potential habitability,” said the study team, led by UW researcher Owen Lehmer.
Lehmer and his colleagues started out with a chemical analysis of tiny beads of metal that were found years ago in 2.7 billion-year-old limestone from northwestern Australia. Those iron-rich bits fell to Earth as micrometeorites. As they plunged through the atmosphere, the bits heated up to become droplets of molten metal, and then congealed into beads as they cooled.
While the micrometeorites were in their molten state, they reacted chemically with gases in the atmosphere. Some of the iron turned into oxidized minerals such as wüstite and magnetite. By analyzing the extent of the oxidation, and making some assumptions about atmospheric composition, scientists can estimate which gases were present at the time, and in what amounts.
An earlier team of researchers assumed that the micrometeorites reacted primarily with oxygen gas, but that led to conclusions that were out of sync with other evidence about early Earth’s environment. The UW scientists went with a different approach, assuming that carbon dioxide was the primary oxidant.
When the research team ran the numbers, they found that a wide range of CO2 concentrations could explain the oxidation levels seen in the micrometeorites — as little as 6%, or as much as 100%. But because some of the metal bits were fully oxidized, levels in excess of 70% provided the best fit for the data.
Obviously, we’re not seeing levels that high today (thank goodness). Over the course of our planet’s history, CO2 concentrations have decreased to a few hundred parts per million (0.04% by volume and rising).
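The numbers in the article span very different units, and it's easy to lose track of the scales involved. As a quick sanity check (these are just standard unit conversions, not anything from the study itself), the percentages and temperatures above can be converted like so:

```python
def percent_to_ppm(pct: float) -> float:
    """Convert a concentration in percent by volume to parts per million."""
    return pct * 10_000  # 1% = 10,000 ppm

def fahrenheit_to_celsius(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# Today's CO2 level, ~0.04% by volume, works out to ~400 ppm:
print(percent_to_ppm(0.04))       # 400.0

# The study's best-fit ancient level, >70% CO2, is >700,000 ppm —
# more than a thousand times higher than today:
print(percent_to_ppm(70))         # 700000

# "Mid-80s Fahrenheit" is roughly 30 degrees Celsius, as the article notes:
print(fahrenheit_to_celsius(86))  # 30.0
```

The thousandfold gap between the ancient estimate and the modern value is the scale of the decline described in the paragraphs above.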
The UW researchers said the levels of unoxidized iron should have risen in ancient times as CO2 levels fell, at least up to the time when oxygen levels became significant — in effect, providing a method to measure the evolution of Earth’s atmosphere. “To verify this hypothesis, additional micrometeorites should be collected and analyzed,” they wrote.
In addition to Lehmer, the authors of the paper appearing in Science Advances, “Atmospheric CO2 Levels From 2.7 Billion Years Ago Inferred From Micrometeorite Oxidation,” include David Catling, Roger Buick, Don Brownlee and Sarah Newport.