Washington: Human-induced carbon dioxide emissions over the past 100 to 200 years have already raised the acidity of the world’s oceans far beyond the range of natural variations, a new study has revealed.
By reacting with seawater, CO2 increases the water’s acidity, which may significantly reduce the calcification rate of such marine organisms as corals and mollusks.
The extent to which human activities have raised surface-ocean acidity, however, has been difficult to detect on regional scales, because acidity varies naturally from season to season, year to year, and region to region, and direct observations go back only about 30 years.
The team of climate modelers, marine conservationists, ocean chemists, biologists and ecologists, led by Tobias Friedrich and Axel Timmermann at the International Pacific Research Center, University of Hawaii at Manoa, came to their conclusions by using Earth system models that simulate climate and ocean conditions 21,000 years back in time, to the Last Glacial Maximum, and forward in time to the end of the 21st century.
In their models, they studied changes in the saturation level of aragonite (a form of calcium carbonate), a quantity typically used to measure ocean acidification.
As the acidity of seawater rises, the saturation level of aragonite drops. The models captured well the currently observed seasonal and annual variations in this quantity in several key coral reef regions.
Today’s levels of aragonite saturation in these locations have already dropped by five times the pre-industrial range of natural variability. For example, if the yearly cycle in aragonite saturation once varied between 4.7 and 4.8, it now varies between 4.2 and 4.3, which, based on another recent study, may translate into a decrease of about 15 percent in the overall calcification rates of corals and other aragonite shell-forming organisms.
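The “five times” figure follows from simple arithmetic on the example values quoted above; a minimal sketch (the saturation numbers are the article’s illustrative ones, not new data):

```python
# Illustrative check of the "five times the natural range" figure,
# using the example saturation values quoted in the article.
pre_industrial_cycle = (4.7, 4.8)   # yearly cycle of aragonite saturation, pre-industrial
present_cycle = (4.2, 4.3)          # the same cycle today

# Width of the natural (pre-industrial) variability: 0.1
natural_variability = pre_industrial_cycle[1] - pre_industrial_cycle[0]

# Drop in the mean saturation level: 0.5
mean_drop = (sum(pre_industrial_cycle) - sum(present_cycle)) / 2

# The mean has fallen by five times the natural range.
ratio = mean_drop / natural_variability
print(round(ratio, 1))  # 5.0
```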
Given the continued human use of fossil fuels, the saturation levels will drop further, potentially reducing calcification rates of some marine organisms by more than 40 percent of their pre-industrial values within the next 90 years.
“Any significant drop below the minimum level of aragonite to which the organisms have been exposed for thousands of years and have successfully adapted will very likely stress them and their associated ecosystems,” said lead author Postdoctoral Fellow Tobias Friedrich.
“In some regions, the man-made rate of change in ocean acidity since the Industrial Revolution is a hundred times greater than the natural rate of change between the Last Glacial Maximum and pre-industrial times.
“When Earth started to warm 17,000 years ago, terminating the last glacial period, atmospheric CO2 levels rose from 190 parts per million (ppm) to 280 ppm over 6,000 years. Marine ecosystems had ample time to adjust. Now, for a similar rise in CO2 concentration to the present level of 392 ppm, the adjustment time is reduced to only 100 to 200 years.”
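The pace comparison behind Friedrich’s point can be sketched with back-of-the-envelope division of the CO2 figures he cites; a rough check, not part of the study’s own analysis:

```python
# Back-of-the-envelope comparison of CO2 rise rates, using the
# figures quoted above (ppm = parts per million).
glacial_rise_ppm = 280 - 190    # deglacial rise: 90 ppm
glacial_years = 6000
glacial_rate = glacial_rise_ppm / glacial_years  # ~0.015 ppm per year

modern_rise_ppm = 392 - 280     # pre-industrial to present: 112 ppm
modern_years = (100, 200)       # adjustment window from the quote
modern_rates = [modern_rise_ppm / y for y in modern_years]

# The modern rise is dozens of times faster than the deglacial one;
# the acidity response in some regions is faster still.
speedups = [r / glacial_rate for r in modern_rates]
print([round(s) for s in speedups])  # [75, 37]
```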
On a global scale, coral reefs are currently found in places where open-ocean aragonite saturation reaches levels of 3.5 or higher. Such conditions exist today in about 50 percent of the ocean – mostly in the tropics.
By the end of the 21st century, this fraction is projected to be less than 5 percent. The Hawaiian Islands, which sit just on the northern edge of the tropics, will be among the first to feel the impact.
“Our results suggest that severe reductions are likely to occur in coral reef diversity, structural complexity and resilience by the middle of this century,” said co-author Professor Axel Timmermann.
The study has been published in the online issue of Nature Climate Change.
First Published: Monday, January 23, 2012, 13:30