Thursday, March 29, 2012

Nature reviews Michael Mann's book as 'trope' about 'an unwinnable fight'

The April 2012 edition of Nature Climate Change critically reviews Michael Mann's new book, describing it as a "trope" about "an unwinnable fight". The review features Mann's backpedaling quote from the book: "the [climate science] community probably took [the results of the hockey stick] to be more definitive than Mann and colleagues intended."

[no link available]

New paper confirms 2010 Russian heat wave was result of natural variability

A paper published today in the journal Monthly Weather Review confirms (along with several other studies) "that the anomalous long-lasting Russian heat wave in summer 2010, linked to a long-persistent blocking high, appears as a result of natural atmospheric variability." Natural climate change denier Kevin "missing heat" Trenberth, however, continues to cry wolf insisting that the Russian heat wave and every other 'extreme' weather event of 2010 "would not have happened without global warming." 


Large scale flow and the long-lasting blocking high over Russia: Summer 2010

Andrea Schneidereit*
Leibniz-Institute of Atmospheric Physics at the University of Rostock, Kühlungsborn, Germany
Silke Schubert
Meteorological Institute, KlimaCampus, University of Hamburg, Hamburg, Germany
Pavel Vargin
Central Aerological Observatory, Dolgoprudny, Moscow region, Russia
Frank Lunkeit and Xiuhua Zhu
Meteorological Institute, KlimaCampus, University of Hamburg, Hamburg, Germany
Dieter H. W. Peters
Leibniz-Institute of Atmospheric Physics at the University of Rostock, Kühlungsborn, Mecklenburg, Germany
Klaus Fraedrich
Meteorological Institute, KlimaCampus, University of Hamburg, Hamburg, Germany
Abstract
Several studies show that the anomalous long-lasting Russian heat wave in summer 2010, linked to a long-persistent blocking high, appears as a result of natural atmospheric variability.

This study analyzes the large scale flow structure based on ERA-Interim data (1989 to 2010). The anomalous long-lasting blocking high over Western Russia including the heat wave occurs as an overlay of a set of anticyclonic contributions on different time scales: (i) A regime change in ENSO towards La Niña modulates the quasi-stationary wave structure in the boreal summer hemisphere, supporting the eastern European blocking. The polar Arctic dipole mode is enhanced and shows a projection on the mean blocking high. (ii) Together with the quasi-stationary wave anomaly, the transient eddies maintain the long-lasting blocking. (iii) Three different pathways of wave action are identified on the intermediate time scale (~10-60 days). One pathway commences over the eastern North Pacific and includes the polar Arctic region; another runs farther south, crossing the North Atlantic and continuing to eastern Europe; a third pathway, southeast of the blocking high, describes the downstream development over South Asia.

Monday, March 26, 2012

Physicist William Happer: Global Warming Models Are Wrong Again

William Happer: Global Warming Models Are Wrong Again  WSJ.com 3/26/12

The observed response of the climate to more CO2 is not in good agreement with predictions.


During a fundraiser in Atlanta earlier this month, President Obama is reported to have said: "It gets you a little nervous about what is happening to global temperatures. When it is 75 degrees in Chicago in the beginning of March, you start thinking. On the other hand, I really have enjoyed nice weather."
What is happening to global temperatures in reality? The answer is: almost nothing for more than 10 years. Monthly values of the global temperature anomaly of the lower atmosphere, compiled at the University of Alabama from NASA satellite data, can be found at the website http://www.drroyspencer.com/latest-global-temperatures/. The latest (February 2012) monthly global temperature anomaly for the lower atmosphere was minus 0.12 degrees Celsius, slightly less than the average since the satellite record of temperatures began in 1979.
The lack of any statistically significant warming for over a decade has made it more difficult for the United Nations Intergovernmental Panel on Climate Change (IPCC) and its supporters to demonize the atmospheric gas CO2, which is released when fossil fuels are burned. The burning of fossil fuels has been one reason for an increase of CO2 levels in the atmosphere to around 395 parts per million (ppm), up from preindustrial levels of about 280 ppm.
CO2 is not a pollutant. Life on earth flourished for hundreds of millions of years at much higher CO2 levels than we see today. Increasing CO2 levels will be a net benefit because cultivated plants grow better and are more resistant to drought at higher CO2 levels, and because warming and other supposedly harmful effects of CO2 have been greatly exaggerated. Nations with affordable energy from fossil fuels are more prosperous and healthy than those without.
The direct warming due to a doubling of CO2 levels in the atmosphere can be calculated to be about one degree Celsius. The IPCC computer models predict a much larger warming, three degrees Celsius or even more, because they assume changes in water vapor or clouds that supposedly amplify the direct warming from CO2. Many lines of observational evidence suggest that this "positive feedback" has also been greatly exaggerated.
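
As a rough cross-check of these numbers, here is a minimal Python sketch. It assumes the widely used logarithmic forcing approximation ΔF = 5.35 ln(C/C0) W/m² (Myhre et al., 1998) and a no-feedback response of roughly 0.3 K per W/m²; both are standard textbook values, not figures taken from Happer's article:

import math

def co2_forcing(c_new_ppm, c_old_ppm):
    # Myhre et al. (1998) logarithmic approximation, in W/m^2
    return 5.35 * math.log(c_new_ppm / c_old_ppm)

PLANCK_RESPONSE = 0.3  # K per W/m^2, approximate no-feedback sensitivity

# Doubling CO2: forcing ~3.7 W/m^2, direct warming ~1.1 K: the
# "about one degree Celsius" quoted above
print(co2_forcing(560, 280) * PLANCK_RESPONSE)

# Forcing from the preindustrial 280 ppm to the ~395 ppm cited above
print(co2_forcing(395, 280))  # ~1.8 W/m^2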
There has indeed been some warming, perhaps about 0.8 degrees Celsius, since the end of the so-called Little Ice Age in the early 1800s. Some of that warming has probably come from increased amounts of CO2, but the timing of the warming—much of it before CO2 levels had increased appreciably—suggests that a substantial fraction of the warming is from natural causes that have nothing to do with mankind.
Frustrated by the lack of computer-predicted warming over the past decade, some IPCC supporters have been claiming that "extreme weather" has become more common because of more CO2. But there is no hard evidence this is true. After an unusually cold winter in 2011 (December 2010-February 2011), the winter of 2012 was unusually warm in the continental United States. But the winter of 2012 was bitter in Europe, Asia and Alaska.
Weather conditions similar to 2012 occurred in the winter of 1942, when the U.S. Midwest was unusually warm, and when the Wehrmacht encountered the formidable forces of "General Frost" in a Russian winter not unlike the one Russians just had.
Large fluctuations from warm to cold winters have been the rule for the U.S., as one can see from records kept by the National Ocean and Atmospheric Administration, NOAA. For example, the winters of 1932 and 1934 were as warm as or warmer than the 2011-2012 one and the winter of 1936 was much colder.
Nightly television pictures of the tragic destruction from tornadoes over the past months might make one wonder if the frequency of tornadoes is increasing, perhaps due to the increasing levels of CO2 in the atmosphere. But as one can read at Andrew Revkin's New York Times blog, dotearth, "There is no evidence of any trend in the number of potent tornadoes (category F2 and up) over the past 50 years in the United States, even as global temperatures have risen markedly."
Like winter temperatures, the numbers, severity and geographical locations of tornadoes fluctuate from year to year in ways that are correlated with the complicated fluid flow patterns of the oceans and atmosphere, the location of the jet stream, El Niño or La Niña conditions of the tropical Pacific Ocean, etc.
As long as the laws of nature exist, we will have tornadoes. But we can save many more lives by addressing the threat of tornadoes directly—for example, with improved and more widely dispersed weather radars, and with better means for warning the people of endangered areas—than by credulous support of schemes to reduce "carbon footprints," or by funding even more computer centers to predict global warming.
It is easy to be confused about climate, because we are constantly being warned about the horrible things that will happen or are already happening as a result of mankind's use of fossil fuels. But these ominous predictions are based on computer models. It is important to distinguish between what the climate is actually doing and what computer models predict. The observed response of the climate to more CO2 is not in good agreement with model predictions.
We need high-quality climate science because of the importance of climate to mankind. But we should also remember the description of how science works by the late, great physicist, Richard Feynman:
"In general we look for a new law by the following process. First we guess it. Then we compute the consequences of the guess to see what would be implied if this law that we guessed is right. Then we compare the result of the computation to nature, with experiment or experience; compare it directly with observation, to see if it works. If it disagrees with experiment it is wrong."
The most important component of climate science is careful, long-term observations of climate-related phenomena, from space, from land, and in the oceans. If observations do not support code predictions—like more extreme weather, or rapidly rising global temperatures—Feynman has told us what conclusions to draw about the theory.
Mr. Happer is a professor of physics at Princeton.

New paper claims ozone is most important driver of recent climate

A paper published last week in the Journal of Atmospheric and Solar-Terrestrial Physics claims stratospheric ozone is the most important driver of recent climate, accounting for 75% of Earth's temperature variations during the period 1926-2011. Ozone is in turn controlled by natural variations in galactic cosmic rays & solar activity, rather than man-made chlorofluorocarbons or 'greenhouse gases.' The Svensmark hypothesis relates variations in solar activity to amplified variations of galactic cosmic rays, which in turn result in changes in cloud cover. This new paper may provide a second mechanism by which variations in solar activity are amplified, via their effect on galactic cosmic rays and ozone.

Climate sensitivity to the lower stratospheric ozone variations
  • N.A. Kilifarska
  • National Institute of Geophysics, Geodesy and Geography, BAS

Abstract

The strong sensitivity of the Earth's radiation balance to variations in the lower stratospheric ozone – reported previously – is analyzed here by the use of non-linear statistical methods. Our non-linear model of the land air temperature (T) – driven by the measured Arosa total ozone (TOZ) – explains 75% of total variability of Earth's T variations during the period 1926–2011. We have analyzed also the factors which could influence the TOZ variability and found that the strongest impact belongs to the multi-decadal variations of galactic cosmic rays. Constructing a statistical model of the ozone variability, we have been able to predict the tendency in the land air T evolution till the end of the current decade. Results show that Earth is facing a weak cooling of the surface T by 0.05–0.25 K (depending on the ozone model) until the end of the current solar cycle. A new mechanism for O3 influence on climate is proposed.

Highlights

► An increased climate sensitivity to ozone variations is analyzed.
► O3 driven model of surface T explains the greatest part of its variability.
► Impact of different factors on lower stratospheric O3 variability is estimated.
► Galactic cosmic rays have the greatest influence on O3.
► Mechanism for ozone influence on climate is described.
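
The excerpt above does not spell out the functional form of Kilifarska's "non-linear model of the land air temperature driven by the measured Arosa total ozone." Purely as an illustration of that kind of model, here is a sketch that fits a hypothetical quadratic of T on TOZ and reports the variance explained; the data are synthetic and the quadratic form is an assumption, not the paper's actual specification:

import numpy as np

# Synthetic stand-ins for 1926-2011 Arosa total ozone (TOZ, Dobson units)
# and land air temperature anomaly (T); not real data.
rng = np.random.default_rng(0)
years = np.arange(1926, 2012)
toz = 330.0 + 15.0 * np.sin(2 * np.pi * years / 60.0) + rng.normal(0, 5, years.size)
t = 0.002 * (toz - 330.0) ** 2 - 0.01 * (toz - 330.0) + rng.normal(0, 0.1, years.size)

# Non-linear (quadratic) least-squares fit of T on TOZ
coeffs = np.polyfit(toz, t, deg=2)
fitted = np.polyval(coeffs, toz)

# Fraction of T variance explained by the ozone-driven model (R^2);
# the abstract reports 75% for the authors' model
r2 = 1 - np.sum((t - fitted) ** 2) / np.sum((t - np.mean(t)) ** 2)
print(f"R^2 = {r2:.2f}")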

New paper shows clouds have a significant negative feedback on temperature

A paper published last week in the Journal of Climate utilized satellite measurements to demonstrate that European hot summers since the 1980s are associated with a reduction in cloudiness. These observations corroborate other recent papers showing that clouds exert a negative feedback on global warming. Alarmists [such as the 'Skeptical Science' site here] hypothesize that clouds exert a net positive feedback leading to a 'runaway greenhouse effect', but observational data from multiple papers* have shown that clouds instead exert a strong net negative feedback that cools the climate.

*Lindzen & Choi, Spencer & Braswell, Allen, Tang et al [below], others

European hot summers associated with a reduction of cloudiness

Qiuhong Tang
Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, China
Guoyong Leng
Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing, China, Graduate University of Chinese Academy of Sciences, Beijing, China
Pavel Ya. Groisman
National Climatic Data Center, Asheville, NC, United States
Abstract
A pronounced summer warming is observed in Europe since the 1980s that has been accompanied by an increase in the occurrence of heat waves. Water deficit that strongly reduces surface latent cooling is a widely accepted explanation for the causes of hot summers. We show that the variance of European summer temperature is partly explained by changes in summer cloudiness. Using observation-based products of climate variables, satellite-derived cloud cover and radiation products, we show that during the 1984-2007 period Europe has become less cloudy (except for northeastern Europe) and the regions east of Europe have become cloudier in summer daytime. In response, the summer temperatures increased in the areas of total cloud cover decrease, and stalled or declined in the areas of cloud cover increase. Trends in the surface shortwave radiation are generally positive (negative) in the regions with summer warming (cooling or stalled warming), while the signs of trends in top-of-atmosphere (TOA) reflected shortwave radiation are reversed. Our results suggest that total cloud cover is either the important local factor influencing the summer temperature changes in Europe or a major indicator of these changes.
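
As a sketch of the kind of trend comparison the abstract describes: compute a 1984-2007 linear trend in summer cloud cover and in temperature for each grid cell, then check whether the signs oppose each other. The arrays below are synthetic placeholders, not the satellite products used in the study:

import numpy as np

# Synthetic gridded summer means, 1984-2007; placeholders for the
# satellite-derived cloud cover and observed temperature fields.
years = np.arange(1984, 2008)
rng = np.random.default_rng(1)
n_cells = 100
cloud = rng.normal(size=(n_cells, years.size)) - 0.02 * (years - years[0])
temp = -0.5 * cloud + rng.normal(scale=0.5, size=cloud.shape)

# Least-squares slope per grid cell (np.polyfit fits all cells at once)
cloud_trend = np.polyfit(years, cloud.T, 1)[0]
temp_trend = np.polyfit(years, temp.T, 1)[0]

# Fraction of cells where cloud and temperature trends have opposite
# signs, the pattern reported in the abstract
frac = np.mean(np.sign(cloud_trend) != np.sign(temp_trend))
print(f"cells with opposing trends: {frac:.0%}")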

Saturday, March 24, 2012

NY Times: 'The state of the electric car is dismal, the victim of hyped expectations, technological flops, high costs and a hostile political climate'

An article in tomorrow's New York Times proclaims, "The state of the electric car is dismal, the victim of hyped expectations, technological flops, high costs and a hostile political climate." In typical NYT fashion, the article concludes with the implication that the failure of electric cars is the fault of the fossil-fuel industry.

The Electric Car, Unplugged

THE future would appear bright for the electric car. Gasoline prices are high. The government is spending billions on battery technology. Auto companies are preparing to roll out a dozen new electrified models. Concern is growing about the climate impacts of burning oil. And tough new fuel economy standards are looming.
Yet the state of the electric car is dismal, the victim of hyped expectations, technological flops, high costs and a hostile political climate. General Motors has temporarily suspended production of the plug-in electric Chevy Volt because of low sales. Nissan’s all-electric Leaf is struggling in the market. A number of start-up electric vehicle and battery companies have folded. And the federal government has slowed its multibillion-dollar program of support for advanced technology vehicles in the face of market setbacks and heavy political criticism.
The $41,000 Volt, in particular, has become a target of conservatives. Glenn Beck called the Volt “crappy.” Rush Limbaugh accused General Motors of “trying to kill its customers” by selling an unsafe car. Former House Speaker Newt Gingrich said while campaigning for president in Georgia last month that the Volt was too small to handle a gun rack (a claim proved wrong repeatedly on YouTube).
Daniel F. Akerson, the chairman of General Motors, defended the Volt before Congress earlier this year after revelations that the battery pack in one Volt caught fire three weeks after a federal crash test. Federal authorities eventually declared the car no more flammable than regular gasoline-fueled vehicles.
“Unfortunately, there’s one thing we did not engineer,” Mr. Akerson said. “Although we loaded the Volt with state-of-the-art safety features, we did not engineer the Volt to be a political punching bag. And that, sadly, is what it’s become.”
Is this the beginning of the end of the latest experiment in the electric car, whose checkered history goes back to the dawn of the automobile age? Can the electric car survive only with heavy government subsidies and big consumer rebates? Are the Teslas and Fiskers and ActiveEs and Volts and Leafs destined to be the playthings of only rich technophiles with a couple of spare gas-powered cars at home?
Or is this what an emergent technology looks like before it crosses the valley of death?
“Face it, this is not an easy task,” said Brett Smith, assistant research director at the Center for Automotive Research in Ann Arbor, Mich. “You still have an energy storage device that’s not ready for prime time. You still have the chicken and egg problem with the charging infrastructure. That’s not to say it’s not viable over the long run. But the hype is gone and the challenges are still there.”
The market for all-electric and plug-in electric cars in the United States is tiny, amounting to fewer than 20,000 sales last year out of total light-vehicle sales of 12.8 million. Even in optimistic forecasts, plug-in vehicles will account for less than 5 percent of the global market by 2025.
Hybrids that do not require external charging, however, like today’s Toyota Prius and many others already in showrooms, are a growing segment. Forecasters say they could represent as much as 6 percent of the market by 2015 and 25 percent by 2025, in part because they are among the few vehicles currently on track to meet the government’s proposed new fuel economy standard of roughly 50 miles per gallon by 2025.
Other propulsion technologies, like natural gas and fuel cells, are more likely to be seen first in heavy trucks and local delivery vans because of limited refueling options.
Jon Bereisa is a former G.M. systems engineer who helped design the Volt and was among the lead developers of the company’s mid-1990s experiment in electric vehicles, the ill-fated EV1. He says that the prospects for the electric car are much better today than they were then, but technical development, cost reduction and consumer acceptance are going to take far longer than most people expect.
“There is much more political support for it today, for a variety of reasons,” he said. “Global warming, energy security, petroleum prices, all these vectors are aligned to support the electrification of the automobile, whether it’s hybrid, plug-in, extended-range hybrid or full battery-electric.”
But he added that the Volt was an incredibly complicated device in the early stages of development. “When you push the start button, you’ve got 10 million lines of software running. On an F-15, it’s about eight million lines of code. You’re really driving a modern data center, and a lot can go wrong.”
He noted that the current Volt was the first generation and predicted that its third version, which will come between 2020 and 2025, will gain wide acceptance, as long as G.M. does not end the project and the government backs a nationwide infrastructure of charging stations.
PRESIDENT OBAMA, who has been a strong supporter of alternative vehicle and fuel technologies, proposed this month spending more than $4 billion to encourage purchases of electric and natural gas vehicles and to speed construction of charging and fueling stations. He is seeking to raise the current $7,500 purchase incentive for electric and plug-in electric vehicles to $10,000, and to make it a point-of-sale rebate rather than a credit to be claimed on a tax return.
David B. Sandalow, the assistant secretary of energy for policy and international affairs, said that the Obama administration was fully committed to nurturing this technology, and he is persuaded that, eventually, it will catch on.
“It is the future of transportation,” said Mr. Sandalow, who saves money on gas by commuting to work in a Prius converted to run 30 miles on battery power alone. “The only question is how fast and how soon.”
He said that China, Germany, Israel, South Africa and other nations were racing ahead with electric vehicle programs and maintained that President Obama’s goal of putting one million electric vehicles on the road by 2015 was achievable if Congress fully financed his rebate and infrastructure proposals.
Most analysts doubt the million-car goal is achievable, as the enthusiasm over electrification in the industry has begun to flicker and the price of battery technology remains stubbornly high.
At the recent auto show in Geneva, for example, Peter Schwarzenbauer, a top executive at Audi, said of electric vehicles, which generated considerable buzz at previous shows, “Reality is phasing in.”
And Dieter Zetsche, the chief executive of the German automaker Daimler, said that cost, range and consumer rates remained serious problems for the electric car. Still, he said, the company would continue work on such vehicles, as well as those powered by gasoline, diesel and hydrogen.
“There is no alternative,” Mr. Zetsche told reporters. “We believe it is our responsibility to push this technology forward and make it marketable.”
The fate of the electric car remains hazy, with technical, economic and political forces working both for and against it. Chris Paine, who made the 2006 documentary “Who Killed the Electric Car?” about the demise of G.M.’s EV1 at the hands of the car company, government regulators and the oil industry, said he was alarmed at how quickly the political climate had turned against the Chevy Volt and other electric vehicles, and offered a theory as to why.
“The attacks leave me a bit stunned,” he said in an e-mail message. He said the Volt had been more successful in the marketplace than the early Prius was and that today, unlike in the late 1990s, the government and the auto industry are fully behind electric vehicle programs.
But one possible culprit still stands to gain if the electric car is killed yet again, Mr. Paine suggested.
“Not too hard to guess,” he said. “With Americans paying $250 a month to fill up on gasoline when electricity can do the job in a Volt for $50 a month, why are we being told electric cars are failures? Who could possibly be behind this?”

A reporter on energy and the environment for The New York Times

Related: NYT realizes the US could regain energy independence

Thursday, March 22, 2012

IPCC Expert Dr. Vincent Gray: 'There is not a scrap of evidence in any IPCC reports that human emissions of CO2 have any harmful effect on the climate'

Excerpts from a posting today by IPCC Expert Reviewer and climate scientist Dr. Vincent Gray:


Nobody seems to realise that the most elaborate and comprehensive conflict of interest that has been inflicted on the public is the "Global Warming" Theory.

I have been an Expert Reviewer on every one of the Reports of the Intergovernmental Panel on Climate Change and I can tell you that there is not a scrap of evidence in any of them that human emissions of carbon dioxide have any harmful effect on the climate.

How have they got away with it?

Attempts to "simulate" their unreliable and manipulated past climate "data" have been failures, yet are claimed as successes. But even if the "data" were genuine and the simulation successful, it would not prove anything. Correlation, however convincing, is not evidence of causation. The only way you can demonstrate the success of any theory is successful prediction of future climate over the whole range it is intended to be used, to a satisfactory level of accuracy. This has already been done with Newton's Laws of motion and Darwin's theories of evolution. It has not been done with the "global warming" theory. There has been no successful attempt to predict any future climate event. They do not even pretend they can do it, as they only provide "projections" from their models, not "predictions".

How have they persuaded us that they are able to predict future climate?

They operate a system called "attribution". This is a combination of "simulation" (correlation) and "assessment" by "experts". The "experts" are all paid to provide the models that they are assessing. These assessments are therefore an elaborate and comprehensive conflict of interest.

They apply a whole series of "likelihoods" to each "assessment" and apply a fake "statistical significance" which, unlike that normally applied to genuine science, has no background of actual experimental observations.

I attach the official list of instructions on how to perpetrate this elaborate fraud on the international community, from the Fourth IPCC Report.

Cheers
Vincent Gray
Wellington 6035

"To kill an error is as good a service as, and sometimes better than, the establishing of a new truth or fact"   Charles Darwin



Attachment: AR4 Uncertainty Guidance Notes (pdf)

Settled science update: 'Greenhouse gases' don't cause sea level rise

A paper published this week in The Journal of Climate finds the "settled" belief that warming due to 'radiative forcing' from 'greenhouse gases' is causing the seas to rise is not supported by observational data. The authors "find a relationship between sea level and temperature and find that temperature causally depends on the sea level, which can be understood as a consequence of the large heat capacity of the ocean." In other words, the increase in surface temperature is caused by sea level rise, rather than temperature causing sea level rise as the "consensus" holds. Furthermore, the authors attempt to find evidence that 'radiative forcing' from 'greenhouse gases' caused sea level rise in the latter 20th century, "but unexpectedly find that the sea level does not depend on the forcing." This is of no surprise to some skeptics who understand that downwelling infrared from 'greenhouse gases' is incapable of heating the ocean due to a penetration depth of only a few microns.
Hansen's corrupted temperature anomalies in top graph, corrupted sea level in second graph, total 'radiative forcing' anomalies from greenhouse gases in third graph. Also note that the graphs stop in ~1998 during a record El Niño, and that temperature and sea levels have since declined in some datasets.

Statistical analysis of global surface temperature and sea level using cointegration methods

Torben Schmith
Danish Meteorological Institute, Copenhagen, Denmark
Søren Johansen
University of Copenhagen, Copenhagen, Denmark
Peter Thejll
Danish Meteorological Institute, Copenhagen, Denmark

Abstract
Global sea level rise is widely understood as a consequence of thermal expansion and melting of glaciers and land-based ice caps. Due to the lack of representation of ice-sheet dynamics in present-day physically-based climate models, semi-empirical models have been applied as an alternative for projecting future sea levels. There are in this, however, potential pitfalls due to the trending nature of the time series. We apply a statistical method called cointegration analysis to observed global sea level and land-ocean surface temperature, capable of handling such peculiarities. We find a relationship between sea level and temperature and find that temperature causally depends on the sea level, which can be understood as a consequence of the large heat capacity of the ocean. We further find that the warming episode in the 1940s is exceptional in the sense that sea level and warming deviates from the expected relationship. This suggests that this warming episode is mainly due to internal dynamics of the ocean rather than external radiative forcing. On the other hand, the present warming follows the expected relationship, suggesting that it is mainly due to radiative forcing. In a second step, we use the total radiative forcing as an explanatory variable, but unexpectedly find that the sea level does not depend on the forcing. We hypothesize that this is due to a long adjustment time scale of the ocean and show that the number of years of data needed to build statistical models that have the relationship expected from physics exceeds what is currently available by a factor of almost ten.


Full paper here
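
For readers unfamiliar with the method, here is a minimal sketch of an Engle-Granger cointegration test on two trending series, using the statsmodels implementation; the series are synthetic stand-ins, and the paper's actual analysis is considerably more involved:

import numpy as np
from statsmodels.tsa.stattools import coint

# Two series sharing a stochastic trend (cointegrated by construction);
# stand-ins for observed sea level and surface temperature, not real data.
rng = np.random.default_rng(42)
n = 150
trend = np.cumsum(rng.normal(size=n))  # shared random-walk component
sea_level = trend + rng.normal(scale=0.5, size=n)
temperature = 0.8 * trend + rng.normal(scale=0.5, size=n)

# Engle-Granger test; the null hypothesis is NO cointegration
t_stat, p_value, _ = coint(temperature, sea_level)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value indicates a stable long-run relationship between the series.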

New paper finds NW China was significantly warmer ~ 4000 to 5000 years ago

A paper published today in Climate of the Past finds that mean annual temperatures in the Tianshui Basin of NW China were approximately 2.2°C [3.9°F] higher than today during periods from 5200 to 4900 and 4800 to 4300 years ago. Alarmists claim that the much smaller 0.7°C increase in global temperatures since 1850 is mostly man-made and that global warming 2°C above pre-industrial levels would cause catastrophic climate change. This study and many others show the earth has naturally warmed and cooled many times and to far greater extents than experienced over the past century without resulting in a 'runaway greenhouse effect' or catastrophic climate change.

Clim. Past, 8, 625-636, 2012
www.clim-past.net/8/625/2012/
doi:10.5194/cp-8-625-2012

The quantitative reconstruction of the palaeoclimate between 5200 and 4300 cal yr BP in the Tianshui Basin, NW China

N. Sun1,2 and X. Q. Li1,2
1The Laboratory of Human Evolution, Institute of Vertebrate Palaeontology and Palaeoanthropology, Chinese Academy of Sciences, 142 Xizhimenwai street, Beijing, 100044, China
2State Lab of Loess & Quaternary Geology, Institute of Earth Environment, Chinese Academy of Sciences, Hi-Tech Zone, Xi'an, 710075, Shaanxi, China



Abstract. The quantitative reconstruction of the palaeoclimate is a prerequisite for understanding climate processes at time scales of centuries and millennia. Here, the coexistence approach (CA) was applied to reconstruct climatic factors quantitatively based on the fossil charcoal records between 5200 and 4300 cal yr BP in the Tianshui Basin, NW China. The CA analysis showed that the climate of the Tianshui Basin belonged to the northern subtropical zone between 5200 and 4300 cal yr BP. The mean annual temperature (MAT) was approximately 13.2 °C, and the mean annual precipitation (MAP) was approximately 778 mm between 5200 and 4900 cal yr BP. The MAT was approximately 13.2 °C, and the MAP was approximately 688 mm between 4800 and 4300 cal yr BP. The MAT was approximately 2.2 °C higher than today, and the MAP was approximately 280 mm higher than today from 5200 to 4900 cal yr BP. The MAT was also approximately 2.2 °C higher than today from 4800 to 4300 cal yr BP, while the MAP was approximately 196 mm higher than today. No abrupt cold event occurred between 5200 and 4300 cal yr BP; however, a drought tendency appeared after around 4800 cal yr BP.

Final Revised Paper (PDF, 4943 KB)   Discussion Paper (CPD)   

Wednesday, March 21, 2012

New study finds Medieval Warming Period and Little Ice Age were also present in Antarctica

A press release today from Syracuse University reports a new study has found evidence that the Medieval Warming Period and Little Ice Age also involved Antarctica. The research adds to overwhelming evidence collected by over 1000 scientists that the Medieval Warming Period [MWP] was a global phenomenon. Climate fraudsters such as Michael Mann et al have attempted to erase the global MWP using statistical tricks or, failing that, to dismiss the MWP as limited to Europe.

Scientists use rare mineral to correlate past climate events in Europe, Antarctica

New study published in April issue of Earth and Planetary Science Letters
The first day of spring brought record high temperatures across the northern part of the United States, while much of the Southwest was digging out from a record-breaking spring snowstorm. The weather, it seems, has gone topsy-turvy. Are the phenomena related? Are climate changes in one part of the world felt half a world away?
To understand the present, scientists look for ways to unlock information about past climate hidden in the fossil record. A team of scientists led by Syracuse University geochemist Zunli Lu has found a new key in the form of ikaite, a rare mineral that forms in cold waters. Composed of calcium carbonate and water, ikaite crystals can be found off the coasts of Antarctica and Greenland.
“Ikaite is an icy version of limestone,” says Lu, assistant professor of earth sciences in SU’s College of Arts and Sciences. “The crystals are only stable under cold conditions and actually melt at room temperature.”
It turns out the water that holds the crystal structure together (called the hydration water) traps information about temperatures present when the crystals formed. This finding by Lu’s research team establishes, for the first time, ikaite as a reliable proxy for studying past climate conditions. The research was recently published online in the journal Earth and Planetary Science Letters and will appear in print on April 1. Lu conducted most of the experimental work for the study while a post-doctoral researcher at Oxford University. Data interpretation was done after he arrived at SU.
The scientists studied ikaite crystals from sediment cores drilled off the coast of Antarctica. The sediment layers were deposited over 2,000 years. The scientists were particularly interested in crystals found in layers deposited during the “Little Ice Age,” approximately 300 to 500 years ago, and during the “Medieval Warm Period,” approximately 500 to 1,000 years ago. Both climate events have been documented in Northern Europe, but studies have been inconclusive as to whether the conditions in Northern Europe extended to Antarctica.
Ikaite crystals incorporate ocean bottom water into their structure as they form. During cooling periods, when ice sheets are expanding, ocean bottom water accumulates heavy oxygen isotopes (oxygen 18). When glaciers melt, fresh water, enriched in light oxygen isotopes (oxygen 16), mixes with the bottom water. The scientists analyzed the ratio of the oxygen isotopes in the hydration water and in the calcium carbonate. They compared the results with climate conditions established in Northern Europe across a 2,000-year time frame. They found a direct correlation between the rise and fall of oxygen 18 in the crystals and the documented warming and cooling periods.
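
For reference, oxygen isotope ratios of this kind are conventionally reported in delta notation; this is the standard definition, not something spelled out in the press release:

\delta^{18}\mathrm{O} = \left( \frac{({}^{18}\mathrm{O}/{}^{16}\mathrm{O})_{\mathrm{sample}}}{({}^{18}\mathrm{O}/{}^{16}\mathrm{O})_{\mathrm{standard}}} - 1 \right) \times 1000\ \text{‰}

More positive δ18O in the hydration water thus records the heavy-isotope-enriched bottom water of cooling periods, while more negative values record glacial meltwater input.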
“We showed that the Northern European climate events influenced climate conditions in Antarctica,” Lu says. “More importantly, we are extremely happy to figure out how to get a climate signal out of this peculiar mineral. A new proxy is always welcome when studying past climate changes.”

Monday, March 19, 2012

Another death blow to the Hockey Stick: New paper finds many tree-ring analyses to be highly biased

Paging Michael Mann: A paper published this week finds that many tree-ring proxy studies are highly biased and calls for "great caution in the interpretation of historical growth trends from tree-ring analyses." The authors find that "big tree selection bias" resulted in a fictitious "doubling in growth rates over recent decades." Consequently, tree-ring analyses claiming to link growth rates to historical temperatures would show a fictitious large 'hockey stick' increase in temperature over recent decades.

GLOBAL BIOGEOCHEMICAL CYCLES, VOL. 26, GB1025, 13 PP., 2012
doi:10.1029/2011GB004143
Key Points
  • Observed increases in tree ring widths may be caused by sampling biases
  • Standard sampling methods lead to spurious trends in historical growth rates
  • Reported increases in ring width may often not be due to CO2 fertilization
Roel J. W. Brienen
School of Geography, University of Leeds, Leeds, UK
Programa de Manejo de Bosques de la Amazonía Boliviana, Riberalta, Bolivia
Emanuel Gloor
School of Geography, University of Leeds, Leeds, UK
Pieter A. Zuidema
Programa de Manejo de Bosques de la Amazonía Boliviana, Riberalta, Bolivia
Ecology and Biodiversity, Institute of Environmental Biology, Faculty of Science, Utrecht University, Utrecht, Netherlands
Forest Ecology and Forest Management, Centre for Ecosystem Studies, Wageningen, Netherlands
Abstract
Tree ring analysis allows reconstructing historical growth rates over long periods. Several studies have reported an increasing trend in ring widths, often attributed to growth stimulation by increasing atmospheric CO2 concentration. However, these trends may also have been caused by sampling biases. Here we describe two biases and evaluate their magnitude. (1) The slow-grower survivorship bias is caused by differences in tree longevity of fast- and slow-growing trees within a population. If fast-growing trees live shorter, they are underrepresented in the ancient portion of the tree ring data set. As a result, reconstructed growth rates in the distant past are biased toward slower growth. (2) The big-tree selection bias is caused by sampling only the biggest trees in a population. As a result, slow-growing small trees are underrepresented in recent times as they did not reach the minimum sample diameter. We constructed stochastic models to simulate growth trajectories based on a hypothetical species with lifetime constant growth rates and on observed tree ring data from the tropical tree Cedrela odorata. Tree growth rates used as input in our models were kept constant over time. By mimicking a standard tree ring sampling approach and selecting only big living trees, we show that both biases lead to apparent increases in historical growth rates. Increases for the slow-grower survivorship bias were relatively small and depended strongly on assumptions about tree mortality. The big-tree selection bias resulted in strong historical increases, with a doubling in growth rates over recent decades. A literature review suggests that historical growth increases reported in many tree ring studies may have been partially due to the big-tree sampling bias. We call for great caution in the interpretation of historical growth trends from tree ring analyses and recommend that such studies include individuals of all sizes.
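
The big-tree selection bias is easy to reproduce in simulation. Below is a minimal sketch in the spirit of the authors' stochastic models, assuming lifetime-constant, tree-specific growth rates and a sampling rule that cores only trees above a minimum diameter; the parameter values are illustrative and not taken from the paper:

import numpy as np

# Every tree grows at a constant tree-specific rate, so there is NO real
# historical growth trend; any reconstructed trend is pure sampling bias.
rng = np.random.default_rng(7)
n = 20000
growth = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # ring width, mm/yr
age = rng.integers(10, 300, size=n)                  # tree age, years
diameter_cm = 2.0 * growth * age / 10.0              # from cumulative ring widths

# Standard sampling approach: core only the big living trees
sampled = diameter_cm >= 30.0

# Rings formed more than 200 years ago can only come from trees at least
# 200 years old; recent rings also include young trees, which must be
# fast growers to have passed the diameter cutoff, so the reconstructed
# mean growth rate appears to rise toward the present.
old = sampled & (age >= 200)
print(f"mean growth, rings formed >200 yr ago: {growth[old].mean():.2f} mm/yr")
print(f"mean growth, recent rings:             {growth[sampled].mean():.2f} mm/yr")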


Move Over, OPEC—Here We Come

In energy, North America is becoming the new Middle East. The only thing that can stop it is domestic politics.

The United States has become the fastest-growing oil and gas producer in the world, and it is likely to remain so for the rest of this decade and into the 2020s. Add to this output the steadily growing Canadian production and a likely reversal of Mexico's recent production decline, and theoretically total oil production from the three countries could rise by 11.2 million barrels per day by 2020, or to 26.6 million barrels per day from around 15.4 million per day at the end of 2011.
Whether the increase results in the U.S. reducing its imports or whether our net exports grow doesn't matter much to world balances. Either way, North America is becoming the new Middle East. The only thing that can stop this is politics—environmentalists getting the upper hand over supply in the U.S., for instance; or First Nations impeding pipeline expansion in Canada; or Mexican production continuing to trip over the Mexican Constitution, impeding foreign investment or technology transfers—in North America itself.
On top of this, the U.S. and Canada could see natural gas output rise by 22 billion cubic feet per day by 2020, with 14 billion of it coming from the Lower 48 states, four billion from Alaska and four billion from Canada. That's an increase of one-third, catapulting this continent into the ranks of significant exporters of liquefied natural gas.
These numbers provide a useful benchmark for measuring ways that policies could obstruct the pace of supply growth. We already have the experience of the BP/Macondo disaster in the Gulf of Mexico in April 2010, which led to a moratorium on drilling in U.S. federal waters, and a revamping of the U.S. regulatory regime governing leasing, revenue collection and safety. As a consequence, U.S. deepwater production has fallen by more than 300,000 barrels per day since then.
Bloomberg
Drilling operations at Pembina oil field near Pigeon Lake, Alberta, Canada
North America already has become the most important marginal source of oil and gas globally. U.S. imports of crude oil and petroleum products have been plunging. The U.S. reached its peak as a net petroleum importing country in 2005-06. Since then, crude oil imports have fallen by almost two million barrels per day. Because the U.S. has the largest refining sector in the world, as domestic demand has fallen it has become a net petroleum-product exporting country—exporting more than 1.2 million barrels per day by the end of 2011—the first time it reached this status since 1949. The U.S.'s growing crude output is affecting the price difference between the traditionally more expensive light sweet crudes (which yield higher-value products like gasoline) and heavy sour grades.
Excess Canadian crude oil produced from oil sands is expanding at a rate of one million barrels a day every five years. The more that's produced, the less of a market there will be for oil from Venezuela and some other OPEC member countries with similar-quality oil, requiring them to either curtail production or lower prices. Even if oil prices rise in the medium term, we expect 2020 prices to be no more than $85 per barrel, compared with today's prevailing global price of $125.
The economic consequences of this supply-and-demand revolution are potentially extraordinary. We estimate that the cumulative impact of new production and reduced consumption could increase real U.S. gross domestic product (GDP) by 2%-3.3%, or by $370 billion-$624 billion, by 2020.
Of this estimate, $74 billion comes directly from the output of new hydrocarbon production alone. The rest is generated by multiplier effects as the surge in economic activity drives higher wealth, spending, consumption and investment effects that ripple through the economy. This potential re-industrialization of the U.S. economy is both profound and timely, occurring as the U.S. struggles to shake off the lingering effects of the 2008 financial crisis.
Equally remarkable is the potential impact on the U.S. labor market. We estimate that as many as 3.6 million new jobs may be created on net by 2020. Some 600,000 jobs would be in the oil and gas extraction sector, another 1.1 million jobs in related industrial and manufacturing activity, and the remainder in ancillary job sectors. Overall, the national unemployment rate could decline by as much as 1.1 percentage points from what it otherwise would be in 2020.
Another potentially dramatic consequence of the North American hydrocarbon revolution is the impact on the U.S. current account deficit. The deficit, currently running at negative 3% of GDP, may be reduced by anywhere from 1.2% of GDP to 2.4% of GDP. This would also have implications for the U.S. dollar, potentially helping it appreciate by 2% to 5% in real exchange-rate terms, reversing its long-term decline and maintaining its status as the global reserve currency of choice.
It is now possible to meet the goal of energy independence for the U.S. One consequence is a significantly lower vulnerability of North America—and the world market—to oil price spikes. But also significant are the geopolitical consequences of a weakened OPEC and of the potentially reduced importance to the U.S. of changes in oil- and natural gas-producing countries world-wide.
Mr. Morse is head of global commodities research at Citi and a former State Department official.