Climate myths: Global warming is down to the Sun, not humans
- 17:00 16 May 2007 by Fred Pearce
Switch off the Sun and Earth would become a very chilly place. No one denies our star's central role in determining how warm our planet is. The issue today is how much solar changes have contributed to the recent warming, and what that tells us about future climate.
The total amount of solar energy reaching Earth can vary due to changes in the Sun's output, such as those associated with sunspots, or in Earth's orbit. Orbital oscillations can also result in different parts of Earth getting more or less sunlight even when the total amount reaching the planet remains constant - similar to the way the tilt in Earth's axis produces the hemispheric seasons. There may also be more subtle effects (see Climate myths: Cosmic rays are causing climate change), but these remain unproven.
On timescales that vary from millions of years through to the more familiar 11-year sunspot cycles, variations in the amount of solar energy reaching Earth have a huge influence on our atmosphere and climate. But the Sun is far from being the only player.
How do we know? According to solar physicists, the Sun emitted about a third less energy 4 billion years ago and has been steadily brightening ever since. Yet for most of this time, Earth has been even warmer than today, a phenomenon sometimes called the faint young Sun paradox. The reason: higher levels of greenhouse gases trapping more of the Sun's heat.
Nearer our own time, the coming and going of the ice ages that have gripped the planet over the past two million years was probably triggered by fractional changes in solar heating, caused by wobbles in the planet's orbit known as Milankovitch cycles.
The cooling and warming during the ice ages and interglacial periods, however, was far greater than would be expected from the tiny changes in solar energy reaching the Earth. The temperature changes must have been somehow amplified. This most probably happened through the growth of ice sheets, which reflect more solar radiation back into space than darker land or ocean, and through transfers of carbon dioxide between the atmosphere and the ocean.
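The need for amplification can be illustrated with a toy calculation. The sketch below uses a simple zero-dimensional energy-balance model (not anything from this article; the numbers are illustrative assumptions) to show that a Milankovitch-scale change in sunlight of roughly 0.1% shifts the planet's effective temperature by only a few hundredths of a degree, whereas a modest rise in reflectivity from growing ice sheets produces a change more than an order of magnitude larger.

```python
# Toy zero-dimensional energy-balance model. Purely illustrative: it
# ignores the greenhouse effect and treats the planet as a uniform
# sphere, so it gives the "effective" temperature (~255 K), not the
# real surface temperature.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # present-day solar constant, W m^-2

def equilibrium_temp(solar_constant, albedo):
    """Temperature (K) at which absorbed sunlight balances emitted heat."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0  # averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

T0 = equilibrium_temp(S0, 0.30)                    # baseline, ~255 K

# Milankovitch-scale change: roughly 0.1% less sunlight reaching Earth.
dT_orbital = equilibrium_temp(S0 * 0.999, 0.30) - T0

# Ice-albedo feedback: growing ice sheets raise albedo from 0.30 to 0.32.
dT_albedo = equilibrium_temp(S0, 0.32) - T0

print(f"baseline temperature:  {T0:.1f} K")
print(f"0.1% dimmer sun:       {dT_orbital:+.2f} K")  # a few hundredths of a degree
print(f"albedo 0.30 -> 0.32:   {dT_albedo:+.2f} K")   # more than a degree
```

The point of the comparison is the ratio, not the absolute numbers: the direct orbital forcing is tiny, so feedbacks such as ice growth and CO2 exchange with the ocean must do most of the work.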
Analysis of ice cores from Greenland and Antarctica shows a very strong correlation between CO2 levels in the atmosphere and temperatures. But what causes what? Proponents of solar influence point out that temperatures sometimes change first. This, they say, suggests that warming causes rising CO2 levels in the atmosphere, not vice versa. What is actually happening is a far more complicated interaction (see Ice cores show CO2 only rose after the start of warm periods).
So what role, if any, have solar fluctuations had in recent temperature changes? While we can work out how Earth's orbit has changed going back many millions of years, we have no first-hand record of the changes in solar output associated with sunspots before the 20th century.
It is true that sunspot records go back to the 17th century, but sunspots actually block the Sun's radiation. It is the smaller bright spots (faculae) that increase the Sun's output and these were not recorded until more recently. The correlation between sunspots and bright faculae is not perfect, so estimates of solar activity based on sunspot records may be out by as much as 30%.
The other method of working out past solar activity is to measure levels of carbon-14 and beryllium-10 in tree rings and ice cores. These isotopes are formed when cosmic rays hit the atmosphere, and higher sunspot activity is associated with increases in the solar wind that deflect more galactic cosmic rays away from Earth. Yet again, though, the correlation is not perfect. What is more, recent evidence suggests that the deposition of beryllium-10 can be affected by climate changes, making it even less reliable as a measure of past solar activity.
Despite these problems, most studies suggest that before the industrial age, there was a good correlation between natural "forcings" - solar fluctuations and other factors such as the dust ejected by volcanoes - and average global temperatures. Solar forcing may have been largely responsible for warming in the late 19th and early 20th century, levelling off during the mid-century cooling (see Global temperatures fell between 1940 and 1980).
The 2007 IPCC report halved the maximum likely influence of solar forcing on warming over the past 250 years from 40% to 20%. This was based on a reanalysis of the likely changes in solar forcing since the 17th century.
But even if solar forcing in the past was more important than this estimate suggests, as some scientists think, there is no correlation between solar activity and the strong warming during the past 40 years. Claims that this is the case have not stood up to scrutiny.
Direct measurements of solar output since 1978 show a steady rise and fall over the 11-year sunspot cycle, but no upward or downward trend.
Similarly, there is no trend in direct measurements of the Sun's ultraviolet output and in cosmic rays. So for the period for which we have direct, reliable records, the Earth has warmed dramatically even though there has been no corresponding rise in any kind of solar activity.