Tom Friedman’s Peak Confusion
Twisted Energy Politics
By ROBERT BRYCE | February 16, 2010
When it comes to energy issues, Thomas Friedman simply doesn’t care about the facts.
That reality was made apparent, once again, in Friedman’s column in the February 10 issue of the New York Times. In an otherwise mostly sensible article, written from Yemen, where Friedman was talking about the need for proper educational opportunity in the Arabic and Islamic worlds, Friedman concluded that the US will have to maintain a strong military presence in the region in order to counter al-Qaeda. But he continues, we also must “help build schools and fund scholarships to America wherever we can. And please, please, let’s end our addiction to oil, which is what gives the Saudi religious ministry and charities the money to spread anti-modernist thinking across this region.”
Friedman has been bashing the Saudis for so long, it’s hardly worth recounting the many instances where he does so. But the fact that Friedman once again trots out the tired cliché of our “addiction to oil” and that he then immediately ties that issue to the Saudis shows that he simply doesn’t know what he’s talking about. Rather than stick to the facts, he retreats to a mindless slogan that contributes nothing to the need for a broader discussion of energy policy and the reality of the global marketplace.
The US could quit buying oil tomorrow, all oil, and it won’t put the Saudis out of business. According to the EIA, in 2008, the Saudis exported an average of 8.4 million barrels of oil per day. Of that quantity, the US accounted for about 1.5 million barrels per day.
Thus, even if the US somehow managed to segregate Saudi crude from its other oil imports, and also prevented the Saudis from selling that 1.5 million barrels per day somewhere else, Saudi Arabia would still be selling about 7 million barrels of oil on the global market. Needless to say, that 7 million barrels per day will bring the kingdom a fair bit of revenue.
Of course, Wednesday’s column isn’t the first time Friedman has shown that he cares more about polemics than facts. In August 2008, he held up Denmark as an energy model that should be copied by the US. In the wake of the 1973 Oil Embargo, Friedman claimed that Denmark “responded to that crisis in such a sustained, focused and systematic way that today it is energy independent.” Friedman went on to lament America’s situation, writing that if “only we could be as energy smart as Denmark!”
Wrong. Wrong. Wrong.
Friedman clearly loves the idea of energy independence, but the data shows that Denmark is not energy independent – it’s not even close. The Danes import all of their coal. I repeat, Denmark imports all of its coal. Furthermore, those coal imports – and coal consumption – show little sign of declining even though Denmark’s wind power production capacity has increased rapidly over the past few years. And Denmark is even more dependent on coal than the US! (1)
Nor did Friedman bother to mention that thanks to the Danish government’s exorbitant taxes, the Danes now have some of the world’s most expensive electricity and most expensive motor fuel.
In 2006, the Energy Information Administration looked at residential electricity rates in 65 countries and found that Denmark’s rates were the highest, by far, at some $0.32 per kilowatt-hour. That was about 25% higher than the electricity costs in the Netherlands, which had the next-highest rates in the survey, at $0.25 per kilowatt-hour. And that’s not a new phenomenon. From 1999 through 2006, Denmark had either the highest – or the next-highest – electricity rates of the countries surveyed by the EIA. (In 1999 and 2000, Japan’s electricity rates were slightly higher than those in Denmark.) Furthermore, Denmark’s electricity rates are the highest in Europe – and no other country comes close. (2)
In 2008, electricity rates were even higher, with Danish residential customers paying $0.38 per kilowatt-hour – or nearly four times as much as US residential customers, who were paying about $0.10 per kilowatt-hour. And the Danes were paying more than twice as much as their counterparts in nuclear-heavy France, where residential electricity costs were $0.17 per kilowatt-hour.
While Danish homeowners are getting spanked by expensive electricity, Danish motorists are getting absolutely mugged at the service station. In late 2008, Danish drivers were paying $1.54 per liter for gasoline, while drivers in the U.K. were paying $1.44 and US motorists were paying $0.56. According to GTZ, an agency of the German government, only a handful of countries have more expensive fuel than Denmark, a list that includes Italy, Norway, Turkey and Germany.
Unfortunately, Friedman’s polemics on energy are nothing new. Back in 2006, Friedman published a column in the Times saying that the U.S. should build a wall around itself. “Build a virtual wall. End our oil addiction.” Getting rid of our need for oil will, he wrote, “protect us from the worst in the Arab-Muslim world….These regimes will never reform as long as they enjoy windfall oil profits.” The solution, he declared, is for America to build “a wall of energy independence” around itself. Doing so, “will enable us to continue to engage honestly with the most progressive Arabs and Muslims on a reform agenda.”
Remember that this is the same Friedman who, in his 2005 best-selling book, The World is Flat, declared that the world was increasingly globalized and the implications of that were obvious. In this new “flat” world, money, jobs, and opportunity, Friedman said, will “go to the countries with the best infrastructure, the best education system that produces the most educated work force, the most investor-friendly laws, and the best environment.” (3)
Hmmm. So doesn’t that also mean that in our new “flat” world, energy will be exported by the countries that have the best infrastructure for providing that energy to the world market?
Friedman’s problem is that he wants it both ways: he espouses the merits and potential of the new flat world, while also insisting that the US should withdraw into energy isolationism, and thereby surrender any participation in the world’s single biggest industry, the global energy sector. The irreconcilable contradictions in Friedman’s arguments are easily seen in the penultimate paragraph in The World is Flat where he claims that the “two greatest dangers we Americans face are an excess of protectionism – excessive fears of another 9/11 that prompt us to wall ourselves in, in search of personal security – and excessive fears of competing in a world…that prompt us to wall ourselves off, in search of economic security. Both would be a disaster for us and for the world.”
So, to summarize Friedman’s world view, he wants a “wall of energy independence” around America while simultaneously warning Americans that the two greatest dangers are a) walling “ourselves in” and b) walling “ourselves off.”
Friedman sees a flat world where walls are dangerous because they will isolate the US from other countries. But when it comes to energy, walls are good because they isolate the US from other countries. Oh, and along the way, we need to bankrupt the Saudis, because, well, they might give money to people who don’t think like we do.
Is anyone else here confused?
Robert Bryce’s fourth book, Power Hungry: The Myths of “Green” Energy and the Real Fuels of the Future, will be published in April.
Notes.
(1) BP Statistical Review of World Energy. In 2007, Denmark got 26% of its primary energy from coal while the US got 24.3%.
(2) Eurostat data. Available: http://epp.eurostat.ec.europa.eu/cache/ITY_OFFPUB/KS-QA-08-045/EN/KS-QA-08-045-EN.PDF.
Also: http://www.cbs.nl/en-GB/menu/themas/industrie-energie/publicaties/artikelen/archief/2007/2007-2187-wm.htm
(3) Yale Global Online, “’Wake Up and Face the Flat Earth’ – Thomas L. Friedman,” April 18, 2005. Available: http://yaleglobal.yale.edu/display.article?id=5581
Russia finds ‘strategic oil deposit’ in East Siberia
January 29, 2010
MOSCOW, RUSSIA: Russian oil producer Rosneft uncovered a giant oil field in East Siberia with more than 1 billion barrels of oil, the Russian natural resources minister said. Russian Natural Resources Minister Yuri Trutnev said Rosneft made an “important” oil discovery in the Irkutsk Oblast near Mongolia, Russia’s state-run news agency RIA Novosti reports.
“We can report today that we have opened the Sevastyanovo oil field, with reserves of over 1.1 billion barrels,” he said. “This is a strategic deposit.” Trutnev said the amount of natural resources recovered from Russia in 2009 exceeded national expectations.
Russia extracted roughly 3.6 billion barrels of oil in 2009. Discoveries for 2009 eclipsed 4.5 billion barrels. The Sevastyanovo oil field is located in Irkutsk Oblast near the route for the East Siberia-Pacific Ocean oil pipeline that links to Asian markets.
Trutnev made the announcement during a meeting with Russian Prime Minister Vladimir Putin.
(EUNewsNet.com and OfficialWire) – Source
Venezuela, Eni Invest $18 Billion to Pump, Refine Oil
By Steven Bodzin
Jan. 26 (Bloomberg) — Eni SpA, Italy’s biggest oil company, and Petroleos de Venezuela SA, the South American country’s state-owned oil company, agreed to develop almost $18 billion worth of projects to pump and refine oil in Venezuela.
The companies’ joint venture will start producing crude in the Orinoco Belt in central Venezuela, Oil and Energy Minister Rafael Ramirez said today on state television, at a ceremony attended by Venezuelan President Hugo Chavez and Eni Chief Executive Officer Paolo Scaroni.
The venture expects to pump 240,000 barrels a day after spending $8.3 billion to develop the Junin 5 block, Ramirez said. First oil will be pumped in 2013, Eni said today on its Web site. It will reach full production in 2016, Scaroni said.
“That gigantic oil reserve — it could not be exploited by Venezuela alone,” Chavez said, referring to the roughly 235 billion barrels of reserves in the Orinoco Belt. “Foreign investment is absolutely necessary.”
Rome-based Eni is seeking oil projects abroad to maintain output. Venezuela, to make up for declining production in its aging Lake Maracaibo fields, is inviting foreign companies to become minority partners in the Orinoco.
Eni also plans to build a $9.3 billion, 350,000 barrel-a-day refinery to convert crude oil from the existing Petromonagas project in the Orinoco into higher-value products, Ramirez said.
Eni will pay a $646 million signing fee, the company said on its Web site. It will pay $300 million when the development joint venture is formed and the remainder will be paid later.
International Arbitration
Eni will hold 40 percent of the venture. PDVSA, as the state company is known, will own the rest.
Eni was granted access to the Orinoco after dropping an international arbitration case against Venezuela in 2008 over an oil field nationalization.
U.S. oil companies Exxon Mobil Corp. and ConocoPhillips continue to pursue arbitration against Venezuela for seizing operations of Orinoco Belt projects that began in the 1990s.
Venezuela expects to complete joint venture agreements with Chinese and Russian companies “soon” and to complete bidding for three projects in the Carabobo blocks, Ramirez said.
Eni also signed a memorandum of understanding to build a 1-gigawatt power plant to be powered by natural gas from the Delta Caribe Oriental offshore fields, Ramirez said, without giving a potential price tag.
To contact the reporter on this story: Steven Bodzin in Caracas at sbodzin@bloomberg.net.
Obama Administration Orders World Bank To Keep Third World In Poverty
More starvation and death guaranteed by blocking poorer countries from building coal-fired power plants

Paul Joseph Watson
Prison Planet.com
January 26, 2010
Under the provably fraudulent and completely corrupted justification of fighting global warming, the Obama administration has ordered the World Bank to keep “developing” countries underdeveloped by blocking them from building coal-fired power plants, ensuring that poorer countries remain in poverty as a result of energy demands not being met.
Even amidst the explosive revelations of the United Nations IPCC issuing reports littered with incorrect data on the Himalayan glaciers and the Amazon rainforest, the U.S. government has “stepped up pressure on the World Bank not to fund coal-fired power plants in developing countries,” reports the Times of India.
The order was made by U.S. Executive Director of the World Bank Whitney Debevoise, who represents the United States in considering all loans, investments, country assistance strategies, budgets, audits and business plans of the World Bank Group entities.
By preventing poor nations from becoming self-sufficient and blocking them from producing their own energy, the Obama administration is ensuring that millions more will die from starvation and lack of access to hospitals and medical treatment.
Not only does strangling the energy supply to poorer countries prevent adequate food distribution and lead to more starvation, but hospitals and health clinics in the third world are barely even able to operate as a result of the World Bank and other global bodies ordering them to be dependent on renewable energy supplies that are totally insufficient.
A prime example appeared in the documentary The Great Global Warming Swindle, which highlighted how a Kenyan health clinic could not operate a medical refrigerator as well as the lights at the same time because the facility was restricted to just two solar panels.
“There’s somebody keen to kill the African dream. And the African dream is to develop,” said author and economist James Shikwati. “I don’t see how a solar panel is going to power a steel industry … We are being told, ‘Don’t touch your resources. Don’t touch your oil. Don’t touch your coal.’ That is suicide.”
The program labels the idea of restricting the world’s poorest people to alternative energy sources as “the most morally repugnant aspect of the global warming campaign.”
As we have previously highlighted, the implementation of policies arising out of fraudulent fearmongering and biased studies on global warming is already devastating the third world, with a doubling in food prices causing mass starvation and death.
Poor people around the world “are being killed in large numbers by starvation as a result of (climate change) policy,” climate skeptic Lord Monckton told the Alex Jones Show last month, due to huge areas of agricultural land being turned over to the growth of biofuels.
“Take Haiti where they live on mud pie with real mud costing 3 cents each….that’s what they’re living or rather what they’re dying on,” said Monckton, relating how when he gave a speech on this subject, a lady in the front row burst into tears and told him, “I’ve just come back from Haiti – now because of the doubling in world food prices, they can’t even afford the price of a mud pie and they’re dying of starvation all over the place.”
As a National Geographic Report confirmed, “With food prices rising, Haiti’s poorest can’t afford even a daily plate of rice, and some must take desperate measures to fill their bellies,” by “eating mud,” partly as a consequence of “increasing global demand for biofuels.”
In April 2008, World Bank President Robert Zoellick admitted that biofuels were a “significant contributor” to soaring food prices that have led to riots in countries such as Haiti, Egypt, the Philippines, and even Italy.
“We estimate that a doubling of food prices over the last three years could potentially push 100 million people in low-income countries deeper into poverty,” he stated.
Even if we are to accept the fact that overpopulation will be a continuing problem in the third world, the very means by which poorer countries would naturally lower their birth rates, by being allowed to develop their infrastructure, is being blocked by global institutions who craft policies designed to keep the third world in squalor and poverty.
This goes to the very heart of what the real agenda behind the global warming movement really is – a Malthusian drive to keep the slaves oppressed and prevent the most desperate people on the planet from pulling themselves out of destitution and despair.
Venezuela and North Dakota Oil Updates
Estimates of Original Oil-in-Place
A comprehensive study by Petroleos de Venezuela S.A. (PDVSA) established the magnitude of the original oil-in-place (OOIP) at 1,180 billion barrels of oil (BBO), a commonly cited estimate for the Orinoco Oil Belt (Fiorillo, 1987); PDVSA recently revised this value to more than 1,300 BBO (Gonzalez and others, 2006). In this study the median OOIP was estimated at 1,300 BBO and the maximum at 1,400 BBO. The minimum OOIP was estimated at 900 BBO, given the uncertainty of regional sandstone distribution and oil saturation (Fiorillo, 1987).
Estimates of Recovery Factor
Recovery factor, or that percentage of the OOIP that is determined to be technically recoverable, was estimated from what is currently known of the technology for recovery of heavy oil in the Orinoco Oil Belt AU and in other areas, particularly California, west Texas, and western Canada. The minimum recovery factor was estimated to be 15 percent, the recovery expected for cold production using horizontal wells. The median recovery factor was estimated to be 45 percent, on the assumption that horizontal drilling and thermal recovery methods might be widely used. The maximum recovery factor was estimated to be 70 percent, on the assumption that other recovery processes, in addition to horizontal drilling and steam-assisted gravity drainage, might eventually be applied on a large scale in the Orinoco Oil Belt AU.
The assessment of technically recoverable heavy oil and associated gas resources is shown in table 2. The mean of the distribution of heavy oil resources is about 513 BBO, with a range from 380 to about 652 BBO. The mean estimate of associated dissolved-gas resource is 135 trillion cubic feet of gas (TCFG), with a range from 53 to 262 TCFG. No attempt was made in this study to estimate either economically recoverable
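To make the relationship between these figures concrete, here is a rough back-of-the-envelope sketch in Python that combines the quoted OOIP range with the quoted recovery-factor range. The triangular distributions and the independence of the two inputs are assumptions made purely for illustration; this is not the USGS assessment methodology, only a way to see that the quoted ranges are broadly consistent with one another.

```python
# Rough Monte Carlo illustration combining the OOIP range (900/1,300/1,400 BBO)
# with the recovery-factor range (15/45/70 percent) quoted above.
# Triangular distributions and independence are assumptions for this sketch only.
import random

random.seed(42)
N = 100_000

recoverable = []
for _ in range(N):
    ooip = random.triangular(900, 1400, 1300)       # OOIP in BBO: (min, max, mode)
    recovery = random.triangular(0.15, 0.70, 0.45)  # recovery factor: (min, max, mode)
    recoverable.append(ooip * recovery)

recoverable.sort()
mean = sum(recoverable) / N
p05 = recoverable[int(0.05 * N)]
p95 = recoverable[int(0.95 * N)]
print(f"mean ~{mean:.0f} BBO, 5th-95th percentile ~{p05:.0f} to {p95:.0f} BBO")
# The mean comes out on the order of 520 BBO, in the same ballpark as the
# assessed mean of about 513 BBO quoted in the text.
```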
2. North Dakota raised its forecast for oil output on growth in and around the Bakken Shale formation. There is another 100,000 barrels a day in North Dakota from oil that is not in the Bakken.
Output may reach 300,000 to 400,000 barrels a day by mid-2011 and stay at that level for 10 to 15 years, said Lynn Helms, director of the North Dakota Mineral Resources Department. The state’s previous estimate was 220,000 to 280,000 barrels a day.
The forecast was raised on discoveries by companies such as Continental Resources Inc., Helms said in an interview. Drilling advances are enabling producers to tap the Bakken, where rocks lack the porosity and permeability of conventional oil fields. The Bakken contributed to last year’s 7.5 percent gain in U.S. crude output, the biggest since 1955 and the first in 18 years. The Energy Department forecast a 1.8 percent increase in 2010.
The top end of North Dakota’s production projection would represent more than 7 percent of nationwide oil output.
One quarter of US grain crops fed to cars – not people, new figures show
New analysis of 2009 US Department of Agriculture figures suggests biofuel revolution is impacting on world food supplies
John Vidal | environment editor
guardian.co.uk | 22 January 2010
One-quarter of all the maize and other grain crops grown in the US now ends up as biofuel in cars rather than being used to feed people, according to new analysis which suggests that the biofuel revolution launched by former President George Bush in 2007 is impacting on world food supplies.
The 2009 figures from the US Department of Agriculture show ethanol production rising to record levels, driven by farm subsidies and laws which require vehicles to use increasing amounts of biofuels.
“The grain grown to produce fuel in the US [in 2009] was enough to feed 330 million people for one year at average world consumption levels,” said Lester Brown, the director of the Earth Policy Institute, a Washington think tank that conducted the analysis.
Last year 107m tonnes of grain, mostly corn, was grown by US farmers to be blended with petrol. This was nearly twice as much as in 2007, when Bush challenged farmers to increase production by 500% by 2017 to cut oil imports and reduce carbon emissions.
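As a quick sanity check on the figures quoted above, the implied per-person grain consumption can be worked out directly. The short calculation below only illustrates the arithmetic behind Brown’s claim; the per-capita figure it produces is derived here and is not stated in the article.

```python
# Arithmetic implied by the two figures quoted above.
grain_for_fuel_tonnes = 107e6    # grain turned into fuel in the US in 2009
people_fed_equivalent = 330e6    # Earth Policy Institute's stated equivalent

kg_per_person_per_year = grain_for_fuel_tonnes * 1000 / people_fed_equivalent
print(f"Implied consumption: ~{kg_per_person_per_year:.0f} kg of grain per person per year")
# ~324 kg per person per year, which is roughly the commonly cited
# world-average level of grain consumption.
```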

More than 80 new ethanol plants have been built since then, with more expected by 2015, by which time the US will need to produce a further 5bn gallons of ethanol if it is to meet its renewable fuel standard.
According to Brown, the growing demand for US ethanol derived from grains helped to push world grain prices to record highs between late 2006 and 2008. In 2008, the Guardian revealed a secret World Bank report that concluded that the drive for biofuels by American and European governments had pushed up food prices by 75%, in stark contrast to US claims that prices had risen only 2-3% as a result.
Since then, the number of hungry people in the world has increased to over 1 billion people, according to the UN’s World Food programme.
“Continuing to divert more food to fuel, as is now mandated by the US federal government in its renewable fuel standard, will likely only reinforce the disturbing rise in world hunger. By subsidising the production of ethanol to the tune of some $6bn each year, US taxpayers are in effect subsidising rising food bills at home and around the world,” said Brown.
“The worst economic crisis since the great depression has recently brought food prices down from their peak, but they still remain well above their long-term average levels.”
The US is by far the world’s leading grain exporter, exporting more than Argentina, Australia, Canada, and Russia combined. In 2008, the UN called for a comprehensive review of biofuel production from food crops.
“There is a direct link between biofuels and food prices. The needs of the hungry must come before the needs of cars,” said Meredith Alexander, biofuels campaigner at ActionAid in London. As well as the effect on food, campaigners also argue that many scientists question whether biofuels made from food crops actually save any greenhouse gas emissions.
But ethanol producers deny that their record production means less food. “Continued innovation in ethanol production and agricultural technology means that we don’t have to make a false choice between food and fuel. We can more than meet the demand for food and livestock feed while reducing our dependence on foreign oil through the production of homegrown renewable ethanol,” said Tom Buis, the chief executive of industry group Growth Energy.
Copenhagen Accord formalized by 9 of 193 nations
Copenhagen Climate Accord Deadline Is Flexible, De Boer Says
By Alex Morales
Jan. 20 (Bloomberg) — The Jan. 31 deadline for countries to sign onto the Copenhagen Accord climate-change agreement that was brokered last month is flexible, United Nations climate chief Yvo De Boer said.
“I think you could describe it as a soft deadline,” de Boer said today on a Webcast from Bonn. “There’s nothing deadly about it. If you fail to meet it, you can still associate with the accord afterwards.”
The Copenhagen Accord was crafted by the U.S., China and two dozen other countries on the sidelines of a two-week UN climate summit in the Danish capital that was beset by walkouts and squabbles between developed and developing nations.
The accord called for countries to indicate their support by the end of this month. As of yesterday, nine of the UN Framework Convention on Climate Change’s 193 members had done so formally, a UN spokesman said. Most of the countries who agreed to the deal in Denmark have yet to do so, according to the UN.
Countries have been asked to “associate” themselves with the accord, which is “an important tool to advance the negotiations,” de Boer said. “Countries are not being asked to sign the accord, they’re not being asked to take on a legally binding target; they will not be bound to the action which they submit to the secretariat.”
De Boer said the deadline is to enable him to meet internal requirements to produce a report on the Copenhagen meeting and that countries can indicate whether they support the agreement and their own targets later.
‘Living Document’
“I very much see the accord as a living document that tracks actions that countries want to take,” de Boer said.
Under the deal, countries will aim to keep the global rise in temperatures since industrialization in the 1800s to 2 degrees Celsius (3.6 degrees Fahrenheit). Industrialized nations can submit greenhouse-gas reduction targets for inclusion in an appendix and developing nations can spell out in a separate annex actions they intend to take to limit their own emissions.
Australia, Canada, France, Ghana, the Maldives, Papua New Guinea, Serbia, Singapore and Turkey have notified the UNFCCC that they want to be “associated” with the accord while Cuba has rejected it, the UN spokesman said yesterday.
De Boer said the document will be an “important tool” to advance the formal UN negotiations, which countries “want to reach a conclusion” at another meeting in Mexico at the end of the year.
“Copenhagen didn’t produce the final cake but it left countries with all the right ingredients to bake a new one in Mexico,” de Boer said. Even so, it isn’t clear whether the outcome in Mexico will be a legally binding treaty, he said.
To contact the reporter on this story: Alex Morales in London at amorales2@bloomberg.net
Climate science: models vs. observations
By Richard K. Moore | Aletho News | January 16, 2010
This document continues to evolve, based on continuing research. The latest version is always maintained at this URL:
http://rkmdocs.blogspot.com/2010/01/climate-science-observations-vs-models.html
If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence.
— Bertrand Russell, Roads to Freedom, 1918
Science and models
True science begins with observations. When patterns are recognized in these observations, that leads to theories and models, which then lead to predictions. The predictions can then be tested by further observations, which can validate or invalidate the theories and models, or be used to refine them.
This is the paradigm accepted by all scientists. But scientists being people, typically in an academic research community, within a political society, there can be many a slip between cup and lip in the practice of science. There are the problems of getting funding, of peer pressure and career considerations, of dominant political dogmas, etc.
In the case of models there is a special problem that frequently arises. Researchers tend to become attached to their models, both psychologically and professionally. When new observations contradict the model, there is a tendency for the researchers to distort their model to fit the new data, rather than abandoning their model and looking for a better one. Or they may even ignore the new observations, and simply declare that their model is right, and the observations must be in error. This problem is even worse with complex computer models, where it is difficult for reviewers to figure out how the model really works, and whether ’fudging’ might be going on.
A classic example of the ’attached to model’ problem can be found in models of the universe. The Ptolemaic model assumed that the Earth is the center of the universe, and that the universe revolves around that center. Intuitively, this model makes a lot of sense. On the Earth, it feels like we are stationary. And we see the Sun and stars moving across the sky. “Obviously” the universe revolves around the Earth.
However, in order for this model to work in the case of the planets, it was necessary to introduce the arbitrary mechanism of epicycles. When Copernicus and Galileo came along, a much cleaner model was presented that explained all the motions with no need for arbitrary assumptions. But no longer would the Earth be the center.
In this case it was not so much scientists that were attached to the old model, but the Church, which liked the model because it fit their interpretation of scripture. We’ve all heard the story of the Bishop who refused to look through the telescope, so he could ignore the new observations and hold on to the old model. Galileo was forced to recant. Thus can political interference hold back the progress of science, and ruin careers.
Climate models and global warming
Over the past century there has been a strong correlation between rising temperatures, and rising CO2 levels in the atmosphere, caused by the ever-increasing burning of fossil fuels. And it is well known that CO2 is a greenhouse gas. Other things being equal, higher CO2 levels must cause an increase in temperature, due to trapping more heat from the sun. Many scientists, quite reasonably, began to explore the theory that continually rising CO2 emissions would lead to continually rising temperatures.
Intuitively, it seems that the theory is “obviously” true. Temperatures have been rising along with CO2 levels; CO2 is a greenhouse gas; what is there to prove? And if the theory is true, and we keep increasing our emissions, then temperatures will eventually reach dangerous levels, melting the Antarctic ice sheet, raising sea levels, and all the other disasters presented by Al Gore in his famous documentary. “Obviously” we are facing a human-generated crisis – and something has got to be done!
But for many years, before Gore’s film, governments didn’t seem to be listening. Environmentalists, however, were listening. Public concern began to grow about CO2 emissions, and the climate scientists investigating the theory shared these concerns. They had a strong motivation to present the scientific case convincingly, in order to force governments to pay attention and take effective action — the future of humanity was at stake!
The climate scientists began building computer models, based on the observed correlation between temperature and CO2 levels. The models looked solid, not only for the past century, but extending back in time. Research with ice-core data revealed a general correlation between temperature and CO2 levels, extending back for a million years and more. What had been “obvious” to begin with, now looked even more obvious, confirmed by seemingly solid science.
These are the very conditions that typically cause scientists to become attached to their models. The early success of the model confirms what the scientists suspected all along: the theory must be true. A subtle shift happens in the mind of the scientists involved. What began as a theory starts to become an assumption. If new data seems to contradict the theory, the response is not to discard the theory, but rather to figure out what the model is lacking.
In the case of the Ptolemaic model, they figured out that epicycles must be lacking, and so epicycles were added. They were certain the universe revolved around the Earth, and so epicycles had to exist. Similarly, the climate scientists have run into problems with their models, and they’ve needed to add more and more machinery to their models in order to overcome those problems. They are certain of their theory, and so their machinery must be valid.
Perhaps they are right. Or perhaps they’ve strayed into epicycle territory, where the theory needs to be abandoned and a better model needs to be identified. This is the conclusion that quite a few scientists have reached. Experts do differ on this question, despite the fact that Gore says emphatically that the “science is settled”. Which group of scientists is right? This is the issue we will be exploring in this article.
Question 1
Compared to the historical record, are we facing a threat of dangerous global warming?
Let’s look at the historical temperature record, beginning with the long-term view. For long-term temperatures, ice-cores provide the most reliable data. Let’s look first at the very-long-term record, using ice cores from Vostok, in the Antarctic.
Data source:
ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/deutnat.txt
Vostok Temperatures: 450,000 BC — Present
Here we see a very regular pattern of long-term temperature cycles. Most of the time the Earth is in an ice age, and about every 125,000 years there is a brief period of warm temperatures, called an inter-glacial period. Our current inter-glacial period has lasted a bit longer than most, indicating that the next ice age is somewhat overdue. These long-term cycles are probably related to changes in the eccentricity of the Earth’s orbit, which follows a cycle of about 100,000 years.
We also see other cycles of more closely-spaced peaks, and these are probably related to other cycles in the Earth’s orbit. There is an obliquity cycle of about 41,000 years, and a precession cycle, of about 20,000 years, and all of these cycles interfere with one another in complex ways. Here’s a tutorial from NASA that discusses the Earth’s orbital variations:
http://www-istp.gsfc.nasa.gov/stargaze/Sprecess.htm
Next let’s zoom in on the current inter-glacial period, as seen in Vostok and Greenland, again using ice-core data. Temperatures here are relative to the value for 1900, which is shown as zero:
Vostok Temperatures: 12,000 BC — 1900
Data source:
http://www.ncdc.noaa.gov/paleo/metadata/noaa-icecore-2475.html
Greenland Temperatures: 9,500 BC — 1900
Here we see that the Southern Hemisphere emerged from the last ice age about 1,000 years earlier than did the Northern Hemisphere. As of 1900, in comparison to the whole inter-glacial period, the temperature was 3°C below the maximum in Vostok, and 3°C below the maximum in Greenland. Thus, as of 1900, temperatures were rather cool for the period in both hemispheres, and in Greenland, temperatures were close to a minimum.
During this recent inter-glacial period, temperatures in both Vostok and Greenland have oscillated through a range of about 4°C, although the patterns of oscillation are quite different in each case. In order to see just how different the patterns are, let’s look at Greenland and Vostok together, for the period 500 BC–1900. Vostok is shown with a faint, dotted line.
The patterns are very different indeed. In many cases we see an extreme high in Greenland, while at the same time Vostok is experiencing an extreme low. And in the period 1500–1900, while Greenland temperatures were relatively stable, within a range of 0.5°C, Vostok went through a radical oscillation of 3°C, from an extreme high to an extreme low. These differences between the two hemispheres might be related to the Earth’s orbit (see the NASA tutorial), or they might be related to the fact that the Southern Hemisphere is dominated by oceans, while most of the land mass is in the Northern Hemisphere. Whatever the reason, the difference is striking.
There may be some value in trying to average these different records, to obtain a ’global average’, but it is important to understand that a global average is not the same as a global temperature. For example, consider temperatures 2,000 years ago. Greenland was experiencing a very warm period, 2°C above the baseline, while Vostok was experiencing a cold spell, nearly 1°C below the baseline. While the average for that time might be near the baseline, that average does not represent the real temperature in either location.
This distinction between a global average, and real temperatures, is very important to keep in mind. Consider for example the concern that warming might lead to melting of the tundra in the Arctic, leading to the runaway release of methane. If that happens, it must happen in the Arctic. So it is the temperature in the Arctic that is relevant, not any kind of global average. In Greenland, temperatures 2,000 years ago were a full 2°C higher than 1900 temperatures, and there was no runaway release of methane.
The fact that the global average 2,000 years ago was dragged down by Antarctic cooling is completely irrelevant to the issue of melting tundra. Temperatures in the Arctic must rise by more than 2°C above 1900 levels before tundra-melting might be a problem, and this fact is obscured when we look at the global-average-derived hockey stick put out by the IPCC:
This graph gives the impression that temperatures 2,000 years ago were relatively low, and that in 1900 temperatures were higher than that. This may have some kind of abstract meaning, but it has nothing to do with what’s been going on in the Arctic, and it is very misleading as regards the likelihood of tundra-melting, or Arctic-melting in general. The graph is a gross misrepresentation of what’s been happening in the real world. It obscures the actual temperature record in both hemispheres, by presenting an artificial average that has existed nowhere.
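A minimal numerical sketch makes the objection concrete. It uses the round anomaly figures cited earlier for conditions roughly 2,000 years ago; the equal two-site weighting is assumed purely for illustration.

```python
# Minimal illustration of the averaging point, using the round anomaly
# figures quoted above (relative to the 1900 baseline). The simple 50/50
# weighting of two sites is an assumption made only for this sketch.
greenland_anomaly_c = +2.0   # Greenland, roughly 2,000 years ago
vostok_anomaly_c = -1.0      # Vostok, roughly 2,000 years ago

two_site_average = (greenland_anomaly_c + vostok_anomaly_c) / 2
print(f"Greenland {greenland_anomaly_c:+.1f} C, Vostok {vostok_anomaly_c:+.1f} C, "
      f"average {two_site_average:+.1f} C")
# The +0.5 C average describes conditions at neither location, which is
# the article's objection to reasoning from global averages alone.
```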
Let’s now look at some other records from the Northern Hemisphere, to find out how typical the Greenland record is of its hemisphere. This first record is from Spain, based on the mercury content in a peat bog, as published in Science, 1999, vol. 284, for the most recent 4,000 years. Note that this graph is backwards, with present day on the left:
This next record is from the Central Alps, based on stalagmite isotopes, as published in Earth and Planetary Science Letters, 2005, vol. 235, for the most recent 2,000 years:
And finally, let’s include our Greenland record for the most recent 4,000 years:
While the three records are clearly different, they do share certain important characteristics. In each case we see a staggered rise, followed by a staggered decline — a long-term up-and-down cycle over the period. In each case we see that during the past few thousand years, temperatures have been 3°C higher than 1900 temperatures. And in each case we see a gradual descent towards the overdue next ice age. The Antarctic, on the other hand, shares none of these characteristics.
If we want to understand warming-related issues, such as tundra-melting and glacier-melting, we must consider the two hemispheres separately. If glaciers melt, they do so either because of high northern temperatures, or high southern temperatures. Whether or not glaciers are likely to melt cannot be determined by global averages. In this article we will concern ourselves with the Northern Hemisphere.
In the Northern Hemisphere, based on the shared characteristics we have observed, temperatures would need to rise at least 3°C above 1900 levels before we would need to worry about things like the extinction of polar bears, the melting of the Greenland ice sheet, or runaway methane release. We know this because none of these things have happened in the past 4,000 years, and temperatures have been 3°C higher during that period.
However, such a 3°C rise seems very unlikely to happen, given that all three of our Northern Hemisphere samples show a gradual but definite decline toward the overdue next ice age. Let’s now zoom in on the temperature record since 1900, and see what kind of rise has actually occurred. Let’s turn to Jim Hansen’s latest article, published on realclimate.org, 2009 temperatures by Jim Hansen. The article includes the following two graphs.
Jim Hansen is of course one of the primary proponents of the CO2-dangerous-warming theory, and there is considerable reason to believe these graphs show an exaggerated picture as regards warming. Here is one article relevant to that point, and it is typical of other reports I’ve seen:
Son of Climategate! Scientist says feds manipulated data
Nonetheless, let’s accept these graphs as a valid representation of recent temperature changes, so as to be as fair as possible to the warming alarmists. We’ll be using the red line, which is from GISS, and which does not use the various extrapolations that are included in the green line. We’ll return to this topic later, but for now suffice it to say that these extrapolations make little sense from a scientific perspective.
The red line shows a temperature rise of .7°C from 1900 to the 1998 maximum, a leveling off beginning in 2001, and then a brief but sharp decline starting in 2005. Let’s enter that data into our charting program, using values for each 5-year period that represent the center of the oscillations for that period. Here’s what we get for 1900-2008:
Consider the downward trend at the right end of the graph. Hansen tells us this is very temporary, and that temperatures will soon start rising again. Perhaps he is right. However, as we shall see, his arguments for this prediction are seriously flawed. What we know for sure is that a downward trend has begun. How far that trend will continue is not yet known.
Next, let’s append that latest graph to the Greenland data, to get a reasonable characterization of Northern Hemisphere temperatures from 2000 BC to 2008:
This graph shows us that the temperature rise in the Northern Hemisphere from 1800 to 2005 was not at all unnatural. That rise follows precisely the long-term pattern, where such rises have been occurring approximately every 1,000 years, with no help from human-caused CO2. Based on the long-term pattern of diminishing peaks, we would expect the recent down-trend to continue, and not turn upward again as Hansen predicts. If the natural pattern continues, then the recent warming has reached its maximum, and we will soon experience about two centuries of rapid cooling, as we continue our descent to the overdue next ice age.
So everything depends on the next decade or so. If temperatures turn upwards again, then the IPCC may be right, and human-caused CO2 emissions may have taken control of climate. However, if temperatures continue downward, then climate has been following natural patterns all along in the Northern Hemisphere. In this case there has been no evidence of any noticeable influence on climate from human-caused CO2, and we are now facing an era of rapid cooling. Within two centuries we could expect temperatures in the Northern Hemisphere to be considerably lower than they were in the recent Little Ice Age.
We don’t know for sure which way temperatures will go, rapidly up or rapidly down. But I can make this statement:
As of this moment, based on the long-term temperature patterns in the Northern Hemisphere, there is no evidence that human-caused CO2 has had any effect on climate. The rise since 1800, as well as the downward dip starting in 2005, are entirely in line with the natural long-term pattern. If temperatures turn sharply upwards in the next decade or so, that will be the first-ever evidence for human-caused warming in the Northern Hemisphere.
As regards the recent downturn, here are two other records, both of which show an even more dramatic downturn than the one shown in the GISS data:
University of Alabama, Huntsville (UAH)
Dr. John Christy
UAH Monthly Means of Lower Troposphere LT5-2
2004 – 2008
Remote Sensing Systems of Santa Rosa, CA (RSS)
RSS MSU Monthly Anomaly – 70S to 82.5N (essentially Global)
2004 – 2008
Based on the data we have looked at, all from mainstream scientific sources, we are now in a position to answer our first question with a reasonable level of confidence:
Answer 1
Temperatures, at least in the Northern Hemisphere, have been continuing to follow natural, long-term patterns — despite the unusually high levels of CO2 caused by the burning of fossil fuels. There have indeed been two centuries of global warming, and that is exactly what we would expect based on the natural pattern. Temperatures now are more than 2°C cooler than they were only 2,000 years ago, which means we have not been experiencing dangerously high temperatures in the Northern Hemisphere.
The illusion of global warming arises from a failure to recognize that global averages are a very poor indicator of actual conditions in either hemisphere.
Within the next decade, or perhaps sooner, we are likely to learn which way the climate is going. If it turns again sharply upwards, as Hansen predicts, that will be counter to the long-term pattern, and evidence for human-caused warming. If it levels off, and continues downwards, that is consistent with long-term patterns, and we are likely to experience about two centuries of rapid cooling in the Northern Hemisphere, as we continue our descent toward the overdue next ice age.
Question 2
Why haven’t unusually high levels of CO2 significantly affected temperatures in the Northern Hemisphere?
One place to look for answers to this question is in the long-term patterns that we see in the temperature record of the past few thousand years, such as the peaks separated by about 1,000 years in the Greenland data, and other more closely-spaced patterns that are also visible. Some forces are causing those patterns, and whatever those forces are, they have nothing to do with human-caused CO2 emissions. Perhaps the forces have to do with cycles in solar radiation and solar magnetism, or perhaps they have something to do with cosmic radiation on a galactic scale, or something we haven’t yet identified. Until we understand what those forces are, how they interfere with one another, and how they affect climate, we can’t really build useful climate models, except on very short time scales.
We can also look for answers in the regulatory mechanisms that exist within the Earth’s own climate system. If an increment of warming happens on the surface, for example, then there is more evaporation from the oceans and more precipitation. While an increment of warming may melt glaciers, it may also cause increased snowfall in the arctic regions. Do these balance each other or not? Increased warming of the ocean’s surface may gradually heat the ocean, but the increased evaporation acts to cool the ocean. Do these balance each other?
Vegetation also acts as a regulatory system. Plants and trees gobble up CO2; that is where their substance comes from. Greater CO2 concentration leads to faster growth, taking more CO2 out of the atmosphere. Until we understand quantitatively how these various regulatory systems function and interact, we can’t even build useful models on a short time scale.
In fact a lot of research is going on, investigating both lines of inquiry. However, in the current public-opinion and media climate, any research not related to CO2 causation is dismissed as the activity of contrarians, deniers, and oil-company hacks. Just as the Bishop refused to look through Galileo’s telescope, so today we have a whole society that refuses to look at many of the climate studies that are available.
I’d like to draw attention to one example of a scientist who has been looking at one aspect of the Earth’s regulatory system. Roy Spencer has been conducting research using the satellite systems that are in place for climate studies. Here are his relevant qualifications:
http://en.wikipedia.org/wiki/Roy_Spencer_(scientist)
Roy W. Spencer is a principal research scientist for the University of Alabama in Huntsville and the U.S. Science Team Leader for the Advanced Microwave Scanning Radiometer (AMSR-E) on NASA’s Aqua satellite. He has served as senior scientist for climate studies at NASA’s Marshall Space Flight Center in Huntsville, Alabama.
He describes his research in a presentation available on YouTube:
http://www.youtube.com/watch?v=xos49g1sdzo&feature=channel
In the talk he gives a lot of details, which are quite interesting, but one does need to concentrate and listen carefully to keep up with the pace and depth of the presentation. He certainly sounds like someone who knows what he’s talking about. Permit me to summarize the main points of his research:
When greenhouse gases cause surface warming, a response occurs, a ‘feedback response’, in the form of changes in cloud and precipitation patterns. The CRU-related climate models all assume the feedback response is a positive one: any increment of greenhouse warming will be amplified by knock-on effects in the weather system. This assumption then leads to the predictions of ‘runaway global warming’.
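The distinction between positive and negative feedback can be illustrated with the standard simplified feedback relation, delta_T = delta_T0 / (1 - f). The sketch below is a textbook-style illustration only; the no-feedback warming value and the feedback factors are assumed for the example and are not taken from the CRU-related models or from Spencer’s work.

```python
# Textbook-style illustration of why the sign of the feedback factor matters.
# delta_T = delta_T0 / (1 - f) is the standard simplified feedback relation;
# the numbers below are illustrative assumptions only.
def equilibrium_warming(no_feedback_warming_c, feedback_factor):
    """Warming after feedbacks, for a feedback factor f < 1."""
    return no_feedback_warming_c / (1.0 - feedback_factor)

base_warming_c = 1.1  # assumed no-feedback warming for a CO2 doubling (illustrative)

for f in (+0.5, 0.0, -0.5):
    warmed = equilibrium_warming(base_warming_c, f)
    print(f"feedback factor {f:+.1f}: equilibrium warming ~{warmed:.1f} C")
# f = +0.5 amplifies the response (about 2.2 C), f = -0.5 damps it (about 0.7 C),
# which is the distinction the paragraph above is drawing.
```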
Spencer set out to see what the feedback response actually is, by observing what happens in the cloud-precipitation system when surface warming is occurring. What he found, by targeting satellite sensors appropriately, is that the feedback response is negative rather than positive. In particular, he found that the formation of storm-related cirrus clouds is inhibited when surface temperatures are high. Cirrus clouds themselves have a powerful greenhouse effect, and this reduction in cirrus cloud formation compensates for the increase in the CO2 greenhouse effect.
This is the kind of research we need to look at if we want to build useful climate models. Certainly Spencer’s results need to be confirmed by other researchers before we accept them as fact, but to simply dismiss his work out of hand is very bad for the progress of climate science. Consider what the popular website SourceWatch says about Spencer.
We don’t find there any reference to rebuttals to his research, but we are told that Spencer writes columns for a free-market website funded by Exxon. They also mention that he spoke at a conference organized by the Heartland Institute, which promotes lots of reactionary, free-market principles. They are trying to discredit Spencer’s work on irrelevant grounds, in what is known as an ad hominem argument. Sort of like, “If he beats his wife, his science must be faulty”.
And it’s true about ‘beating his wife’ — Spencer does seem to have a pro-industry philosophy that shows little concern for sustainability. That might even be part of his motivation for undertaking his recent research, hoping to give ammunition to pro-industry lobbyists. But that doesn’t prove his research is flawed or that his conclusions are invalid. His work should be challenged scientifically, by carrying out independent studies of the feedback process. If the challenges are restricted to irrelevant attacks, that becomes almost an admission that his results, which are threatening to the climate establishment, cannot be refuted. He does not hide his data, or his code, or his sentiments. The same cannot be said for the warming-alarmist camp.
Question 3
What are we to make of Jim Hansen’s prediction that rapid warming will soon resume?
Once again, I refer you to Dr. Hansen’s recent article, 2009 temperatures by Jim Hansen.
Jim begins with the following paragraph:
The past year, 2009, tied as the second warmest year in the 130 years of global instrumental temperature records, in the surface temperature analysis of the NASA Goddard Institute for Space Studies (GISS). The Southern Hemisphere set a record as the warmest year for that half of the world. Global mean temperature, as shown in Figure 1a, was 0.57°C (1.0°F) warmer than climatology (the 1951-1980 base period). Southern Hemisphere mean temperature, as shown in Figure 1b, was 0.49°C (0.88°F) warmer than in the period of climatology.
The Southern Hemisphere may be experiencing warming, but it has 2°C to go before that might become a problem there, and it has nothing to do with the Northern Hemisphere, where temperatures have been declining recently, not setting records for warming. This mathematical abstraction, the global average, is characteristic of nowhere. It creates the illusion of a warming crisis, when in fact no evidence for such a crisis exists. In the context of IPCC warnings about glaciers melting, runaway warming, etc., this global-average argument serves as deceptive and effective propaganda, but not as science.
Jim continues with this paragraph, emphasis added:
The global record warm year, in the period of near-global instrumental measurements (since the late 1800s), was 2005. Sometimes it is asserted that 1998 was the warmest year. The origin of this confusion is discussed below. There is a high degree of interannual (year-to-year) and decadal variability in both global and hemispheric temperatures. Underlying this variability, however, is a long-term warming trend that has become strong and persistent over the past three decades. The long-term trends are more apparent when temperature is averaged over several years. The 60-month (5-year) and 132-month (11-year) running mean temperatures are shown in Figure 2 for the globe and the hemispheres. The 5-year mean is sufficient to reduce the effect of the El Niño – La Niña cycles of tropical climate. The 11-year mean minimizes the effect of solar variability – the brightness of the sun varies by a measurable amount over the sunspot cycle, which is typically of 10-12 year duration.
As I’ve emphasized, Jim is assuming that there is a strong and persistent warming trend, which he of course attributes to human-caused CO2 emissions. And then that assumption becomes the justification for the 5- and 11-year running averages. Those running averages then give us phantom ’temperatures’ that don’t match actual observations. In particular, if a downward trend is beginning, the running averages will tend to ‘hide the decline’.
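The lag effect of running averages is easy to demonstrate. The sketch below uses a short synthetic series of annual anomalies, not GISS data, and a simple trailing mean, which may differ from the exact smoothing Hansen uses; it only illustrates why a multi-year mean responds slowly to a recent downturn.

```python
# Sketch of how a multi-year running mean lags a recent downturn.
# The anomaly series below is synthetic and purely illustrative; it is not
# GISS data and this is not Hansen's actual smoothing code.
def running_mean(values, window):
    """Trailing running mean; returns None until a full window is available."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(values[i + 1 - window:i + 1]) / window)
    return out

years = list(range(1995, 2009))
anomaly = [0.35, 0.37, 0.40, 0.55, 0.42, 0.43, 0.48, 0.50, 0.52, 0.54,
           0.50, 0.45, 0.40, 0.35]   # slow rise, then a sharp drop (synthetic)

smoothed5 = running_mean(anomaly, 5)
for yr, raw, sm in zip(years, anomaly, smoothed5):
    sm_txt = f"{sm:.2f}" if sm is not None else " --"
    print(f"{yr}: raw {raw:.2f}  5-yr mean {sm_txt}")
# By 2008 the raw series has fallen back to its 1995 level, while the trailing
# 5-year mean has declined only modestly from its 2005 peak: multi-year
# smoothing necessarily lags turning points, which is the point argued above.
```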
It seems we are looking at a classic case of over-attachment to model. What began as a theory has now become an assumption, and actual observations are being dismissed as “confusion” because they don’t agree with the model. The climate models have definitely strayed into the land of imaginary epicycles. The assumption of CO2 causation, plus the preoccupation with an abstract global average, creates a warming illusion that has no connection with reality in either hemisphere, as we see in these two graphs from Jim’s article:
As with the Ptolemaic model, there is a much simpler explanation for our recent era of warming, at least in the Northern Hemisphere: long-term patterns are continuing, for whatever reasons, and human-caused CO2 has so far had no noticeable effect. This simpler explanation is based on actual observations, and requires no abstract mathematical epicycles or averages, but it removes CO2 from the center of the climate debate. And just as powerful forces in Galileo’s day wanted the Earth to remain the center of the universe, powerful forces today want CO2 to remain at the center of climate debate, and global warming to be seen as a threat.
Question 4
What is the real agenda of the politically powerful factions who are promoting global-warming alarmism?
One thing we always need to keep in mind is that the people at the top of the power pyramid in our society have access to the very best scientific information. They control dozens, probably hundreds, of high-level think tanks, able to hire the best minds, and carrying out all kinds of research we don’t hear about. They have access to all the secret military and CIA research, and a great deal of influence over what research is carried out in think tanks, the military, and in universities.
Just because they might be promoting fake science for its propaganda value, that doesn’t mean they believe it themselves. They undoubtedly know that global cooling is the real problem, and the actions they are promoting are completely in line with such an understanding.
Cap-and-trade, for example, won’t reduce carbon emissions. Rather it is a mechanism that allows emissions to continue, while pretending they are declining — by means of a phony market model. You know what a phony market model looks like. It looks like Reagan and Thatcher telling us that lower taxes will lead to higher government revenues due to increased business activity. It looks like globalization, telling us that opening up free markets will “raise all boats” and make us all prosperous. It looks like Wall Street, telling us that mortgage derivatives are a good deal, and we should buy them. And it looks like Wall Street telling us the bailouts will restore the economy, and that the recession is over. In short, it’s a con. It’s a fake theory about what the consequences of a policy will be, when the real consequences are known from the beginning.
Cap-and-trade has nothing to do with climate. It is part of a scheme to micromanage the allocation of global resources, and to maximize profits from the use of those resources. Think about it. Our ‘powerful factions’ decide who gets the initial free cap-and-trade credits. They run the exchange market itself, and can manipulate the market, create derivative products, sell futures, etc. They can cause deflation or inflation of carbon credits, just as they can cause deflation or inflation of currencies. They decide which corporations get advance insider tips, so they can maximize their emissions while minimizing their offset costs. They decide who gets loans to buy offsets, and at what interest rate. They decide what fraction of petroleum will go to the global North and the global South. They have ‘their man’ in the regulation agencies that certify the validity of offset projects. And they make money every which way as they carry out this micromanagement.
In the face of global cooling, this profiteering and micromanagement of energy resources becomes particularly significant. Just when more energy is needed to heat our homes, we’ll find that the price has gone way up. Oil companies are actually strong supporters of the global-warming bandwagon, which is very ironic, given that they are funding some of the useful contrary research that is going on. Perhaps the oil barons are counting on the fact that we are suspicious of them, and assume we will discount the research they are funding, as most people are in fact doing. And the recent onset of global cooling explains all the urgency to implement the carbon-management regime: they need to get it in place before everyone realizes that warming alarmism is a scam.
And then there’s the carbon taxes. Just as with income taxes, you and I will pay our full share for our daily commute and for heating our homes, while the big corporate CO2 emitters will have all kinds of loopholes, and offshore havens, set up for them. Just as Federal Reserve theory hasn’t left us with a prosperous Main Street, despite its promises, so theories of carbon trading and taxation won’t give us a happy transition to a sustainable world.
Instead of building the energy-efficient transport systems we need, for example, they’ll sell us biofuels and electric cars, while most of society’s overall energy continues to come from fossil fuels and the economy continues to deteriorate. The North will continue to operate unsustainably, and the South will pay the price in the form of mass die-offs, which are already ticking along at the rate of six million children a year from malnutrition and disease.
While collapse, suffering, and die-offs of ‘marginal’ populations will be unpleasant for us, they will give our ‘powerful factions’ a blank canvas on which to construct their new world order, whatever that might be. And we’ll be desperate to go along with any scheme that looks like it might put food back on our tables and warm up our houses.
Author contact – rkm@quaylargo.com
Up in Smoke
Why Biomass Wood Energy is Not the Answer
By George Wuerthner | January 12, 2010
After the Smurfit-Stone Container Corp.’s linerboard plant in Missoula, Montana announced that it was closing permanently, many people, including Montana Governor Schweitzer, Missoula’s mayor, and Senator Jon Tester, have advocated turning the mill into a biomass energy plant. Northwestern Energy, a company that has expressed interest in using the plant for energy production, has already indicated that it would expect more wood from national forests to make the plant economically viable.
The Smurfit-Stone conversion to biomass is not an isolated case. A spate of proposals for new wood-burning biomass energy plants has been sprouting across the country like mushrooms after a rain. Currently there are plans or proposals for new biomass power plants in Maine, Vermont, Pennsylvania, Florida, California, Idaho, Oregon and elsewhere. In every instance, these plants are being promoted as “green” technology.
Part of the reason for this “boom” is that taxpayers are providing substantial financial incentives, including tax breaks, government grants, and loan guarantees. The rationale for these taxpayer subsidies is the presumption that biomass is “green” energy. But as with other “quick fixes,” there has been very little serious scrutiny of the real costs and environmental impacts of biomass, and whether commercial biomass is a viable alternative to traditional fossil fuels is open to question.
Before I get into this discussion, I want to state right up front that coal and the other fossil fuels that now provide much of our electrical energy need to be reduced and effectively replaced. But biomass energy is not the way to accomplish this goal.
BIOMASS BURNING IS POLLUTION
First and foremost, biomass burning isn’t green. Burning wood produces huge amounts of pollution. Especially in valleys like Missoula’s, where temperature inversions are common, pollution from a biomass burner will be the source of numerous health ailments. Because of the air pollution and human health concerns, the Oregon Chapter of the American Lung Association, the Massachusetts Medical Society, and the Florida Medical Association have all established policies opposing large-scale biomass plants.
The reason for this medical concern is that even with the best pollution control devices, biomass energy is extremely dirty. For instance, one of the biggest biomass burners now in operation, the McNeil biomass plant in Burlington, Vermont, is the number one pollution source in the state, emitting 79 classified pollutants. Biomass burning releases dioxins and as much particulate matter as coal burning, plus carbon monoxide, nitrogen oxides, and sulfur dioxide, and it contributes to ozone formation. […]
BIOMASS ENERGY IS INEFFICIENT
Wood is not nearly as concentrated a heat source as coal, gas, oil, or any other fossil fuel. Most biomass energy operations are able to capture only 20-25% of wood’s latent energy by burning it. That means one needs to gather and burn far more wood to get the same energy value as a more concentrated fuel like coal, especially once you account for the energy used to gather such a dispersed fuel and to truck it to a central energy plant. That is not to suggest that coal is a good alternative; rather, wood is a worse one. If the entire carbon footprint of wood is considered, biomass creates far more CO2 with far less energy output than other energy sources.
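To make that fuel-mass penalty concrete, here is a rough back-of-envelope comparison written as a short Python sketch. Every number in it (heating values, plant efficiencies) is an illustrative assumption drawn from commonly cited ranges, not a figure from any particular plant.

# Back-of-envelope comparison of fuel needed per MWh of electricity.
# All values below are illustrative assumptions, not measured plant data.
BTU_PER_KWH = 3412                # 1 kWh = 3,412 Btu (physical constant)

green_wood_btu_per_lb = 4500      # green wood chips at high moisture (assumed)
coal_btu_per_lb = 11000           # typical bituminous coal (assumed)

wood_plant_efficiency = 0.22      # the 20-25% energy capture cited above
coal_plant_efficiency = 0.33      # typical existing coal plant (assumed)

def tons_of_fuel_per_mwh(btu_per_lb, efficiency):
    """Tons of fuel burned to deliver one megawatt-hour of electricity."""
    fuel_btu_needed = (1000 * BTU_PER_KWH) / efficiency
    return fuel_btu_needed / btu_per_lb / 2000   # 2,000 lb per short ton

wood_tons = tons_of_fuel_per_mwh(green_wood_btu_per_lb, wood_plant_efficiency)
coal_tons = tons_of_fuel_per_mwh(coal_btu_per_lb, coal_plant_efficiency)
print(f"Green wood: ~{wood_tons:.1f} tons/MWh; coal: ~{coal_tons:.2f} tons/MWh")
# With these assumptions: roughly 1.7 tons of green wood versus about 0.5 tons
# of coal per MWh -- several times the fuel mass to cut, gather, and truck.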
The McNeil biomass plant in Burlington, Vermont seldom runs full time because, even with all the subsidies (and Vermonters have made huge and repeated subsidies to the plant, not counting “hidden subsidies” like air pollution), wood energy can’t compete with other energy sources, even in the Northeast, where energy costs are among the highest in the nation. Even though the plant was retrofitted so it could also burn natural gas to increase its competitiveness, it still does not operate competitively. It is generally used only to offset peak energy loads.
One could argue, of course, that other energy sources like coal are greatly subsidized as well, especially if all environmental costs were considered. But at the very least, all energy sources must be “standardized” so that consumers can make informed decisions about energy—and biomass energy appears to be no more green than other energy sources.
BIOMASS SANITIZES AND MINES OUR FORESTS
The dispersed nature of wood as a fuel source, combined with its low energy value, means any sizable energy plant must burn a lot of wood. For instance, the 50-megawatt McNeil biomass plant in Burlington, Vermont would require roughly 32,500 acres of forest each year if running at near full capacity and entirely on wood. Wood for the McNeil plant is trucked and even shipped on trains from as far away as Massachusetts, New Hampshire, Quebec and Maine.
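For a sense of where a figure like 32,500 acres a year comes from, here is a rough reconstruction of the arithmetic, continuing the Python sketch above. The capacity factor, wood-per-MWh, and wood-per-acre values are assumptions supplied purely for illustration; they are not figures from the article or from the McNeil plant’s own reporting.

# Rough reconstruction of the acreage arithmetic for a 50 MW wood-fired plant.
# All inputs are illustrative assumptions, not official McNeil figures.
plant_mw = 50
hours_per_year = 8760
capacity_factor = 0.90            # "near full capacity" (assumed)
tons_wood_per_mwh = 1.7           # green wood per MWh (from the sketch above)
tons_wood_per_acre = 20           # usable wood removed per acre (assumed)

mwh_per_year = plant_mw * hours_per_year * capacity_factor
tons_per_year = mwh_per_year * tons_wood_per_mwh
acres_per_year = tons_per_year / tons_wood_per_acre
print(f"~{tons_per_year:,.0f} tons of wood/yr -> ~{acres_per_year:,.0f} acres/yr")
# With these assumptions: roughly 670,000 tons of wood and ~33,500 acres per
# year -- in the same ballpark as the 32,500-acre figure cited above.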
Biomass proponents often suggest that wood [gathered] as a consequence of forest thinning to improve “forest health” (logging a forest to improve the health of a forest ecosystem is an oxymoron) will provide the fuel for plant operations. For instance, one of the assumptions of Senator Tester’s Montana Forest Jobs bill is that thinned forests will provide a ready source of biomass for energy production. But in many cases, there are limits on the economic viability of trucking wood any distance to a central energy plant; again, without huge subsidies, this simply does not make economic sense. Biomass forest harvesting is even worse for forest ecosystems than clear-cutting, because it tends to utilize the entire tree, including the bole, crown, and branches. This robs the forest of nutrients and disrupts energy cycles.
Worse yet, such biomass removal ignores the important role of dead trees in sustaining forest ecosystems. Dead trees are not a “wasted” resource. They provide homes and food for thousands of species, including 45% of all bird species in the nation. Dead trees that fall to the ground are used by insects, small mammals, amphibians, and reptiles for shelter and even food. Dead trees that fall into streams are important physical components of aquatic ecosystems and provide critical habitat for many fish and other aquatic species. Removal of dead wood is mining the forest.
Keep in mind that logging activities are not benign. Logging typically requires some kind of access, often roads, which are a major source of sedimentation in streams and disrupt natural subsurface water flow. Logging can disturb sensitive wildlife; grizzly bears and even elk are known to abandon areas with active logging. Logging can spread weeds. And finally, since large amounts of forest carbon are actually tied up in the soils, soil disturbance from logging is especially damaging, often releasing substantial additional carbon over and above what goes up a smokestack.
BIOMASS ENERGY USES LARGE AMOUNTS OF WATER
A large-scale biomass plant (50 MW) uses close to a million gallons of water a day for cooling. Most of that water is lost from the watershed, since approximately 85% escapes as steam. Water channeled back into a river or stream typically carries a pollution cost as well, including higher water temperatures that negatively impact fisheries, especially trout. And since cooling needs are greatest in warm weather, water is removed from rivers just when flows are lowest and fish are most susceptible to temperature stress.
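Using the figures just cited, the annual drain on the watershed is easy to tally; this minimal sketch simply multiplies the numbers above.

# Annual evaporative water loss implied by the figures above.
gallons_per_day = 1_000_000       # cooling water use cited for a ~50 MW plant
fraction_lost_as_steam = 0.85     # share evaporated rather than returned

annual_loss_gallons = gallons_per_day * fraction_lost_as_steam * 365
print(f"~{annual_loss_gallons / 1e6:.0f} million gallons per year leave the watershed")
# Roughly 310 million gallons a year, withdrawn mostly in warm, low-flow months.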
BIOMASS ENERGY SAPS FUNDS FROM OTHER TRULY GREEN ENERGY SOURCES LIKE SOLAR
Since biomass energy is eligible under state renewable portfolio standards (RPS), it has captured the bulk of the funding intended to move the country away from fossil fuels. For example, in Vermont, 90% of the RPS is met from “smokestack” sources, mostly biomass incineration. This pattern holds throughout many other parts of the country. Biomass energy is thus burning up funds that could and should be going into other energy programs such as energy conservation, solar power, and building insulation.
PUBLIC FORESTS WILL BE LOGGED FOR BIOMASS ENERGY
Many of the climate bills now circulating in Congress, as well as Montana Senator Jon Tester’s Montana Jobs and Wilderness bill, target public forests. Some of these proposals even include roadless lands and proposed wilderness as a source of wood biomass. One federal study suggests that 368 million tons of wood could be removed from our national forests every year; of course, this study did not include the ecological costs that removing this much wood would have on forest ecosystems.
The Biomass Crop Assistance Program, or BCAP, which was quietly put into the 2008 farm bill, has so far given away more than half a billion dollars in a matching payment program for businesses that cut and collect biomass from national forests and Bureau of Land Management lands. And according to a recent Washington Post story, the Obama administration has already sent $23 million to biomass energy companies and is poised to send another half billion.
And it is not only federal forests that are in jeopardy. Many states are eyeing their own state forests for biomass energy. For instance, Maine recently unveiled a new plan, known as the Great Maine Forest Initiative, which will pay timber companies to grow trees for biomass energy.
JOB LOSSES
Ironically, one of the main justifications for biomass energy is job creation, yet the wood biomass rush is having unintended consequences for other forest products industries. Companies that rely on surplus wood chips to produce fiberboard, cabinets, and furniture are scrambling to find wood fiber for their products. Because these industries are secondary producers, the biomass rush could end up threatening more jobs than it creates.
BOTTOM LINE
Large-scale wood biomass energy is neither green nor truly economical. It is not ecologically sustainable, and it jeopardizes our forest ecosystems. It is a distraction that funnels funds and attention away from more worthwhile energy options, in particular a massive energy conservation program and changes in our lifestyles, which would in the end provide truly green alternatives to coal and other fossil fuels.
George Wuerthner is a wildlife biologist and a former Montana hunting guide. His latest book is Plundering Appalachia.











