Venezuela and North Dakota Oil Updates
Estimates of Original Oil-in-Place
A comprehensive study by Petroleos de Venezuela S.A. (PDVSA) established the magnitude of the original oil-in-place (OOIP) at 1,180 billion barrels of oil (BBO), a commonly cited estimate for the Orinoco Oil Belt (Fiorillo, 1987); PDVSA recently revised this value to more than 1,300 BBO (Gonzalez and others, 2006). In this study the median OOIP was estimated at 1,300 BBO and the maximum at 1,400 BBO. The minimum OOIP was estimated at 900 BBO, given the uncertainty of regional sandstone distribution and oil saturation (Fiorillo, 1987).
Estimates of Recovery Factor
Recovery factor, the percentage of the OOIP that is determined to be technically recoverable, was estimated from what is currently known of the technology for recovery of heavy oil in the Orinoco Oil Belt AU and in other areas, particularly California, west Texas, and western Canada. The minimum recovery factor was estimated to be 15 percent, the recovery expected for cold production using horizontal wells. The median recovery factor was estimated to be 45 percent, on the assumption that horizontal drilling and thermal recovery methods might be widely used. The maximum recovery factor was estimated to be 70 percent, on the assumption that other recovery processes, in addition to horizontal drilling and steam-assisted gravity drainage, might eventually be applied on a large scale in the Orinoco Oil Belt AU.
The assessment of technically recoverable heavy oil and associated gas resources is shown in table 2. The mean of the distribution of heavy oil resources is about 513 BBO, with a range from 380 to about 652 BBO. The mean estimate of associated dissolved-gas resource is 135 trillion cubic feet of gas (TCFG), with a range from 53 to 262 TCFG. No attempt was made in this study to estimate either economically recoverable
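The relationship between the OOIP range, the recovery-factor range, and the reported mean of about 513 BBO can be sanity-checked with a simple Monte Carlo sketch. This is not the assessment's actual methodology; it simply assumes, for illustration, independent triangular distributions over the ranges quoted above:

```python
import random

random.seed(42)
N = 100_000

# Triangular distributions over the ranges given in the text.
# Note: random.triangular takes (low, high, mode).
samples = []
for _ in range(N):
    ooip = random.triangular(900, 1400, 1300)      # OOIP in BBO
    rf = random.triangular(0.15, 0.70, 0.45)       # recovery factor
    samples.append(ooip * rf)

mean = sum(samples) / N
print(f"mean technically recoverable oil: {mean:.0f} BBO")
```

Under these assumptions the simulated mean lands near 520 BBO, consistent with the reported mean of about 513 BBO.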
2. North Dakota raised its forecast for oil output on growth in and around the Bakken Shale formation. Another 100,000 barrels a day comes from North Dakota oil outside the Bakken.
Output may reach 300,000 to 400,000 barrels a day by mid-2011 and stay at that level for 10 to 15 years, said Lynn Helms, director of the North Dakota Mineral Resources Department. The state’s previous estimate was 220,000 to 280,000 barrels a day.
The forecast was raised on discoveries by companies such as Continental Resources Inc., Helms said in an interview. Drilling advances are enabling producers to tap the Bakken, where rocks lack the porosity and permeability of conventional oil fields. The Bakken contributed to last year’s 7.5 percent gain in U.S. crude output, the biggest since 1955 and the first in 18 years. The Energy Department forecast a 1.8 percent increase in 2010.
The top end of North Dakota’s production projection would represent more than 7 percent of nationwide oil output.
One quarter of US grain crops fed to cars – not people, new figures show
New analysis of 2009 US Department of Agriculture figures suggests biofuel revolution is impacting on world food supplies
John Vidal | environment editor
guardian.co.uk | 22 January 2010
One-quarter of all the maize and other grain crops grown in the US now ends up as biofuel in cars rather than being used to feed people, according to new analysis which suggests that the biofuel revolution launched by former President George Bush in 2007 is impacting on world food supplies.
The 2009 figures from the US Department of Agriculture show ethanol production rising to record levels, driven by farm subsidies and laws that require vehicles to use increasing amounts of biofuels.
“The grain grown to produce fuel in the US [in 2009] was enough to feed 330 million people for one year at average world consumption levels,” said Lester Brown, the director of the Earth Policy Institute, a Washington think tank that conducted the analysis.
Last year 107m tonnes of grain, mostly corn, were grown by US farmers to be blended with petrol. This was nearly twice as much as in 2007, when Bush challenged farmers to increase production by 500% by 2017 to cut oil imports and reduce carbon emissions.
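Brown's per-person figure is straightforward to verify: dividing the ethanol grain tonnage by the number of people gives roughly 324 kg of grain per person per year, which is in the neighborhood of commonly cited world-average grain consumption (a figure that includes grain fed to livestock). A quick check:

```python
grain_tonnes = 107e6   # US grain distilled into ethanol in 2009 (from the article)
people_fed = 330e6     # Lester Brown's estimate of people it could feed

# Convert tonnes to kilograms and divide per capita.
kg_per_person_year = grain_tonnes * 1_000 / people_fed
print(f"{kg_per_person_year:.0f} kg of grain per person per year")
```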

More than 80 new ethanol plants have been built since then, with more expected by 2015, by which time the US will need to produce a further 5bn gallons of ethanol if it is to meet its renewable fuel standard.
According to Brown, the growing demand for US ethanol derived from grains helped to push world grain prices to record highs between late 2006 and 2008. In 2008, the Guardian revealed a secret World Bank report that concluded that the drive for biofuels by American and European governments had pushed up food prices by 75%, in stark contrast to US claims that prices had risen only 2-3% as a result.
Since then, the number of hungry people in the world has increased to over 1 billion people, according to the UN’s World Food programme.
“Continuing to divert more food to fuel, as is now mandated by the US federal government in its renewable fuel standard, will likely only reinforce the disturbing rise in world hunger. By subsidising the production of ethanol to the tune of some $6bn each year, US taxpayers are in effect subsidising rising food bills at home and around the world,” said Brown.
“The worst economic crisis since the great depression has recently brought food prices down from their peak, but they still remain well above their long-term average levels.”
The US is by far the world’s leading grain exporter, exporting more than Argentina, Australia, Canada, and Russia combined. In 2008, the UN called for a comprehensive review of biofuel production from food crops.
“There is a direct link between biofuels and food prices. The needs of the hungry must come before the needs of cars,” said Meredith Alexander, biofuels campaigner at ActionAid in London. As well as the effect on food, campaigners also argue that many scientists question whether biofuels made from food crops actually save any greenhouse gas emissions.
But ethanol producers deny that their record production means less food. “Continued innovation in ethanol production and agricultural technology means that we don’t have to make a false choice between food and fuel. We can more than meet the demand for food and livestock feed while reducing our dependence on foreign oil through the production of homegrown renewable ethanol,” said Tom Buis, the chief executive of industry group Growth Energy.
Copenhagen Accord formalized by 9 of 193 nations
Copenhagen Climate Accord Deadline Is Flexible, De Boer Says
By Alex Morales
Jan. 20 (Bloomberg) — The Jan. 31 deadline for countries to sign onto the Copenhagen Accord climate-change agreement that was brokered last month is flexible, United Nations climate chief Yvo De Boer said.
“I think you could describe it as a soft deadline,” de Boer said today on a Webcast from Bonn. “There’s nothing deadly about it. If you fail to meet it, you can still associate with the accord afterwards.”
The Copenhagen Accord was crafted by the U.S., China and two dozen other countries on the sidelines of a two-week UN climate summit in the Danish capital that was beset by walkouts and squabbles between developed and developing nations.
The accord called for countries to indicate their support by the end of this month. As of yesterday, nine of the UN Framework Convention on Climate Change’s 193 members had done so formally, a UN spokesman said. Most of the countries who agreed to the deal in Denmark have yet to do so, according to the UN.
Countries have been asked to “associate” themselves with the accord, which is “an important tool to advance the negotiations,” de Boer said. “Countries are not being asked to sign the accord, they’re not being asked to take on a legally binding target; they will not be bound to the action which they submit to the secretariat.”
De Boer said the deadline is to enable him to meet internal requirements to produce a report on the Copenhagen meeting and that countries can indicate whether they support the agreement and their own targets later.
‘Living Document’
“I very much see the accord as a living document that tracks actions that countries want to take,” de Boer said.
Under the deal, countries will aim to keep the global rise in temperatures since industrialization in the 1800s to 2 degrees Celsius (3.6 degrees Fahrenheit). Industrialized nations can submit greenhouse-gas reduction targets for inclusion in an appendix and developing nations can spell out in a separate annex actions they intend to take to limit their own emissions.
Australia, Canada, France, Ghana, the Maldives, Papua New Guinea, Serbia, Singapore and Turkey have notified the UNFCCC that they want to be “associated” with the accord while Cuba has rejected it, the UN spokesman said yesterday.
De Boer said the document will be an “important tool” to advance the formal UN negotiations, which countries “want to reach a conclusion” at another meeting in Mexico at the end of the year.
“Copenhagen didn’t produce the final cake but it left countries with all the right ingredients to bake a new one in Mexico,” de Boer said. Even so, it isn’t clear whether the outcome in Mexico will be a legally binding treaty, he said.
To contact the reporter on this story: Alex Morales in London at amorales2@bloomberg.net
Climate science: models vs. observations
By Richard K. Moore | Aletho News | January 16, 2010
This document continues to evolve, based on continuing research. The latest version is always maintained at this URL:
http://rkmdocs.blogspot.com/2010/01/climate-science-observations-vs-models.html
You can click on any graphic in this document to see a larger image.
If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence.
— Bertrand Russell, Roads to Freedom, 1918
Science and models
True science begins with observations. When patterns are recognized in these observations, that leads to theories and models, which then lead to predictions. The predictions can then be tested by further observations, which can validate or invalidate the theories and models, or be used to refine them.
This is the paradigm accepted by all scientists. But scientists being people, typically in an academic research community, within a political society, there can be many a slip between cup and lip in the practice of science. There are the problems of getting funding, of peer pressure and career considerations, of dominant political dogmas, etc.
In the case of models there is a special problem that frequently arises. Researchers tend to become attached to their models, both psychologically and professionally. When new observations contradict the model, there is a tendency for the researchers to distort their model to fit the new data, rather than abandoning their model and looking for a better one. Or they may even ignore the new observations, and simply declare that their model is right, and the observations must be in error. This problem is even worse with complex computer models, where it is difficult for reviewers to figure out how the model really works, and whether ’fudging’ might be going on.
A classic example of the ’attached to model’ problem can be found in models of the universe. The Ptolemaic model assumed that the Earth is the center of the universe, and that the universe revolves around that center. Intuitively, this model makes a lot of sense. On the Earth, it feels like we are stationary. And we see the Sun and stars moving across the sky. “Obviously” the universe revolves around the Earth.
However, in order for this model to work in the case of the planets, it was necessary to introduce the arbitrary mechanism of epicycles. When Copernicus and Galileo came along, a much cleaner model was presented, one that explained all the motions with no need for arbitrary assumptions. But no longer would the Earth be the center.
In this case it was not so much scientists that were attached to the old model, but the Church, which liked the model because it fit their interpretation of scripture. We’ve all heard the story of the Bishop who refused to look through the telescope, so he could ignore the new observations and hold on to the old model. Galileo was forced to recant. Thus can political interference hold back the progress of science, and ruin careers.
Climate models and global warming
Over the past century there has been a strong correlation between rising temperatures, and rising CO2 levels in the atmosphere, caused by the ever-increasing burning of fossil fuels. And it is well known that CO2 is a greenhouse gas. Other things being equal, higher CO2 levels must cause an increase in temperature, due to trapping more heat from the sun. Many scientists, quite reasonably, began to explore the theory that continually rising CO2 emissions would lead to continually rising temperatures.
Intuitively, it seems that the theory is “obviously” true. Temperatures have been rising along with CO2 levels; CO2 is a greenhouse gas; what is there to prove? And if the theory is true, and we keep increasing our emissions, then temperatures will eventually reach dangerous levels, melting the Antarctic ice sheet, raising sea levels, and all the other disasters presented by Al Gore in his famous documentary. “Obviously” we are facing a human-generated crisis – and something has got to be done!
But for many years, before Gore’s film, governments didn’t seem to be listening. Environmentalists, however, were listening. Public concern began to grow about CO2 emissions, and the climate scientists investigating the theory shared these concerns. They had a strong motivation to present the scientific case convincingly, in order to force governments to pay attention and take effective action — the future of humanity was at stake!
The climate scientists began building computer models, based on the observed correlation between temperature and CO2 levels. The models looked solid, not only for the past century, but extending back in time. Research with ice-core data revealed a general correlation between temperature and CO2 levels, extending back for a million years and more. What had been “obvious” to begin with, now looked even more obvious, confirmed by seemingly solid science.
These are the very conditions that typically cause scientists to become attached to their models. The early success of the model confirms what the scientists suspected all along: the theory must be true. A subtle shift happens in the mind of the scientists involved. What began as a theory starts to become an assumption. If new data seems to contradict the theory, the response is not to discard the theory, but rather to figure out what the model is lacking.
In the case of the Ptolemaic model, they figured out that epicycles must be lacking, and so epicycles were added. They were certain the universe revolved around the Earth, and so epicycles had to exist. Similarly, the climate scientists have run into problems with their models, and they’ve needed to add more and more machinery to their models in order to overcome those problems. They are certain of their theory, and so their machinery must be valid.
Perhaps they are right. Or perhaps they’ve strayed into epicycle territory, where the theory needs to be abandoned and a better model needs to be identified. This is the conclusion that quite a few scientists have reached. Experts do differ on this question, despite the fact that Gore says emphatically that the “science is settled”. Which group of scientists is right? This is the issue we will be exploring in this article.
Question 1
Compared to the historical record, are we facing a threat of dangerous global warming?
Let’s look at the historical temperature record, beginning with the long-term view. For long-term temperatures, ice-cores provide the most reliable data. Let’s look first at the very-long-term record, using ice cores from Vostok, in the Antarctic.
Data source:
ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/deutnat.txt
Vostok Temperatures: 450,000 BC — Present
Here we see a very regular pattern of long-term temperature cycles. Most of the time the Earth is in an ice age, and about every 125,000 years there is a brief period of warm temperatures, called an inter-glacial period. Our current inter-glacial period has lasted a bit longer than most, indicating that the next ice age is somewhat overdue. These long-term cycles are probably related to changes in the eccentricity of the Earth’s orbit, which follows a cycle of about 100,000 years.
We also see other cycles of more closely-spaced peaks, and these are probably related to other cycles in the Earth’s orbit. There is an obliquity cycle of about 41,000 years, and a precession cycle, of about 20,000 years, and all of these cycles interfere with one another in complex ways. Here’s a tutorial from NASA that discusses the Earth’s orbital variations:
http://www-istp.gsfc.nasa.gov/stargaze/Sprecess.htm
Next let’s zoom in on the current inter-glacial period, as seen in Vostok and Greenland, again using ice-core data. Temperatures here are relative to the value for 1900, which is shown as zero:
Vostok Temperatures: 12,000 BC — 1900
Data source:
http://www.ncdc.noaa.gov/paleo/metadata/noaa-icecore-2475.html
Greenland Temperatures: 9,500 BC — 1900
Here we see that the Southern Hemisphere emerged from the last ice age about 1,000 years earlier than did the Northern Hemisphere. As of 1900, in comparison to the whole inter-glacial period, the temperature was 3°C below the maximum in Vostok, and 3°C below the maximum in Greenland. Thus, as of 1900, temperatures were rather cool for the period in both hemispheres, and in Greenland, temperatures were close to a minimum.
During this recent inter-glacial period, temperatures in both Vostok and Greenland have oscillated through a range of about 4°C, although the patterns of oscillation are quite different in each case. In order to see just how different the patterns are, let’s look at Greenland and Vostok together, for the period 500BC–1900. Vostok is shown with a faint line, actually a dotted line if you click to see the enlarged version.
The patterns are very different indeed. In many cases we see an extreme high in Greenland, while at the same time Vostok is experiencing an extreme low. And in the period 1500–1900, while Greenland temperatures were relatively stable, within a range of 0.5°C, Vostok went through a radical oscillation of 3°C, from an extreme high to an extreme low. These differences between the two hemispheres might be related to the Earth’s orbit (see NASA tutorial), or they might be related to the fact that the Southern Hemisphere is dominated by oceans, while most of the land mass is in the Northern Hemisphere. Whatever the reason, the difference is striking.
There may be some value in trying to average these different records, to obtain a ’global average’, but it is important to understand that a global average is not the same as a global temperature. For example, consider temperatures 2,000 years ago. Greenland was experiencing a very warm period, 2°C above the baseline, while Vostok was experiencing a cold spell, nearly 1°C below the baseline. While the average for that year might be near the baseline, that average does not represent the real temperature in either location.
This distinction between a global average, and real temperatures, is very important to keep in mind. Consider for example the concern that warming might lead to melting of the tundra in the Arctic, leading to the runaway release of methane. If that happens, it must happen in the Arctic. So it is the temperature in the Arctic that is relevant, not any kind of global average. In Greenland, temperatures 2,000 years ago were a full 2°C higher than 1900 temperatures, and there was no runaway release of methane.
The fact that the global average 2,000 years ago was dragged down by Antarctic cooling is completely irrelevant to the issue of melting tundra. Temperatures in the Arctic must rise by more than 2°C above 1900 levels before tundra-melting might be a problem, and this fact is obscured when we look at the global-average-derived hockey stick put out by the IPCC:
This graph gives the impression that temperatures 2,000 years ago were relatively low, and that in 1900 temperatures were higher than that. This may have some kind of abstract meaning, but it has nothing to do with what’s been going on in the Arctic, and it is very misleading as regards the likelihood of tundra-melting, or Arctic-melting in general. The graph is a gross misrepresentation of what’s been happening in the real world. It obscures the actual temperature record in both hemispheres, by presenting an artificial average that has existed nowhere.
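The averaging point above can be made concrete with a toy calculation. Using the anomaly values cited earlier for roughly 2,000 years ago (Greenland about +2°C, Vostok about -1°C relative to 1900), the two-site average lands near the baseline even though neither site was anywhere near it:

```python
# Anomalies in degrees C relative to the 1900 baseline, as cited in the text.
greenland_anomaly = 2.0    # warm period in Greenland
vostok_anomaly = -1.0      # cold spell at Vostok

# A naive two-site "global average" sits near the baseline,
# a value that was observed at neither location.
average = (greenland_anomaly + vostok_anomaly) / 2
print(average)
```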
Let’s now look at some other records from the Northern Hemisphere, to find out how typical the Greenland record is of its hemisphere. This first record is from Spain, based on the mercury content in a peat bog, as published in Science, 1999, vol. 284, for the most recent 4,000 years. Note that this graph is backwards, with present day on the left:
This next record is from the Central Alps, based on stalagmite isotopes, as published in Earth and Planetary Science Letters, 2005, vol. 235, for the most recent 2,000 years:
And finally, let’s include our Greenland record for the most recent 4,000 years:
While the three records are clearly different, they do share certain important characteristics. In each case we see a staggered rise, followed by a staggered decline — a long-term up-and-down cycle over the period. In each case we see that during the past few thousand years, temperatures have been 3°C higher than 1900 temperatures. And in each case we see a gradual descent towards the overdue next ice age. The Antarctic, on the other hand, shares none of these characteristics.
If we want to understand warming-related issues, such as tundra-melting and glacier-melting, we must consider the two hemispheres separately. If glaciers melt, they do so either because of high northern temperatures, or high southern temperatures. Whether or not glaciers are likely to melt cannot be determined by global averages. In this article we will concern ourselves with the Northern Hemisphere.
In the Northern Hemisphere, based on the shared characteristics we have observed, temperatures would need to rise at least 3°C above 1900 levels before we would need to worry about things like the extinction of polar bears, the melting of the Greenland ice sheet, or runaway methane release. We know this because none of these things have happened in the past 4,000 years, and temperatures have been 3°C higher during that period.
However, such a 3°C rise seems very unlikely to happen, given that all three of our Northern Hemisphere samples show a gradual but definite decline toward the overdue next ice age. Let’s now zoom in on the temperature record since 1900, and see what kind of rise has actually occurred. Let’s turn to Jim Hansen’s latest article, published on realclimate.org, 2009 temperatures by Jim Hansen. The article includes the following two graphs.
Jim Hansen is of course one of the primary proponents of the CO2-dangerous-warming theory, and there is considerable reason to believe these graphs show an exaggerated picture as regards to warming. Here is one article relevant to that point, and it is typical of other reports I’ve seen:
Son of Climategate! Scientist says feds manipulated data
Nonetheless, let’s accept these graphs as a valid representation of recent temperature changes, so as to be as fair as possible to the warming alarmists. We’ll be using the red line, which is from GISS, and which does not use the various extrapolations that are included in the green line. We’ll return to this topic later, but for now suffice it to say that these extrapolations make little sense from a scientific perspective.
The red line shows a temperature rise of 0.7°C from 1900 to the 1998 maximum, a leveling off beginning in 2001, and then a brief but sharp decline starting in 2005. Let’s enter that data into our charting program, using values for each 5-year period that represent the center of the oscillations for that period. Here’s what we get for 1900-2008:
Consider the downward trend at the right end of the graph. Hansen tells us this is very temporary, and that temperatures will soon start rising again. Perhaps he is right. However, as we shall see, his arguments for this prediction are seriously flawed. What we know for sure is that a downward trend has begun. How far that trend will continue is not yet known.
Next, let’s append that latest graph to the Greenland data, to get a reasonable characterization of Northern Hemisphere temperatures from 2000 BC to 2008:
This graph shows us that the temperature rise in the Northern Hemisphere from 1800 to 2005 was not at all unnatural. That rise follows precisely the long-term pattern, where such rises have been occurring approximately every 1,000 years, with no help from human-caused CO2. Based on the long-term pattern of diminishing peaks, we would expect the recent down-trend to continue, and not turn upward again as Hansen predicts. If the natural pattern continues, then the recent warming has reached its maximum, and we will soon experience about two centuries of rapid cooling, as we continue our descent to the overdue next ice age.
So everything depends on the next decade or so. If temperatures turn upwards again, then the IPCC may be right, and human-caused CO2 emissions may have taken control of climate. However, if temperatures continue downward, then climate has been following natural patterns all along in the Northern Hemisphere. In this case there has been no evidence of any noticeable influence on climate from human-caused CO2, and we are now facing an era of rapid cooling. Within two centuries we could expect temperatures in the Northern Hemisphere to be considerably lower than they were in the recent Little Ice Age.
We don’t know for sure which way temperatures will go, rapidly up or rapidly down. But I can make this statement:
As of this moment, based on the long-term temperature patterns in the Northern Hemisphere, there is no evidence that human-caused CO2 has had any effect on climate. The rise since 1800, as well as the downward dip starting in 2005, are entirely in line with the natural long-term pattern. If temperatures turn sharply upwards in the next decade or so, that will be the first-ever evidence for human-caused warming in the Northern Hemisphere.
As regards the recent downturn, here are two other records, both of which show an even more dramatic downturn than the one shown in the GISS data:
University of Alabama, Huntsville (UAH)
Dr. John Christy
UAH Monthly Means of Lower Troposphere LT5-2
2004 – 2008
Remote Sensing Systems of Santa Rosa, CA (RSS)
RSS MSU Monthly Anomaly – 70S to 82.5N (essentially Global)
2004 – 2008
Based on the data we have looked at, all from mainstream scientific sources, we are now in a position to answer our first question with a reasonable level of confidence:
Answer 1
Temperatures, at least in the Northern Hemisphere, have been continuing to follow natural, long-term patterns — despite the unusually high levels of CO2 caused by the burning of fossil fuels. There have indeed been two centuries of global warming, and that is exactly what we would expect based on the natural pattern. Temperatures now are more than 2°C cooler than they were only 2,000 years ago, which means we have not been experiencing dangerously high temperatures in the Northern Hemisphere.
The illusion of global warming arises from a failure to recognize that global averages are a very poor indicator of actual conditions in either hemisphere.
Within the next decade, or perhaps sooner, we are likely to learn which way the climate is going. If it turns again sharply upwards, as Hansen predicts, that will be counter to the long-term pattern, and evidence for human-caused warming. If it levels off, and continues downwards, that is consistent with long-term patterns, and we are likely to experience about two centuries of rapid cooling in the Northern Hemisphere, as we continue our descent toward the overdue next ice age.
Question 2
Why haven’t unusually high levels of CO2 significantly affected temperatures in the Northern Hemisphere?
One place to look for answers to this question is in the long-term patterns that we see in the temperature record of the past few thousand years, such as the peaks separated by about 1,000 years in the Greenland data, and other more closely-spaced patterns that are also visible. Some forces are causing those patterns, and whatever those forces are, they have nothing to do with human-caused CO2 emissions. Perhaps the forces have to do with cycles in solar radiation and solar magnetism, or perhaps they have something to do with cosmic radiation on a galactic scale, or something we haven’t yet identified. Until we understand what those forces are, how they interfere with one another, and how they affect climate, we can’t really build useful climate models, except on very short time scales.
We can also look for answers in the regulatory mechanisms that exist within the Earth’s own climate system. If an increment of warming happens on the surface, for example, then there is more evaporation from the oceans and more precipitation. While an increment of warming may melt glaciers, it may also cause increased snowfall in the arctic regions. Do these balance each other or not? Increased warming of the ocean’s surface may gradually heat the ocean, but the increased evaporation acts to cool the ocean. Do these balance each other?
Vegetation also acts as a regulatory system. Plants and trees gobble up CO2; that is where their substance comes from. Greater CO2 concentration leads to faster growth, taking more CO2 out of the atmosphere. Until we understand quantitatively how these various regulatory systems function and interact, we can’t even build useful models on a short time scale.
In fact a lot of research is going on, investigating both lines of inquiry. However, in the current public-opinion and media climate, any research not related to CO2 causation is dismissed as the activity of contrarians, deniers, and oil-company hacks. Just as the Bishop refused to look through Galileo’s telescope, so today we have a whole society that refuses to look at many of the climate studies that are available.
I’d like to draw attention to one example of a scientist who has been looking at one aspect of the Earth’s regulatory system. Roy Spencer has been conducting research using the satellite systems that are in place for climate studies. Here are his relevant qualifications:
http://en.wikipedia.org/wiki/Roy_Spencer_(scientist)
Roy W. Spencer is a principal research scientist for the University of Alabama in Huntsville and the U.S. Science Team Leader for the Advanced Microwave Scanning Radiometer (AMSR-E) on NASA’s Aqua satellite. He has served as senior scientist for climate studies at NASA’s Marshall Space Flight Center in Huntsville, Alabama.
He describes his research in a presentation available on YouTube:
http://www.youtube.com/watch?v=xos49g1sdzo&feature=channel
In the talk he gives a lot of details, which are quite interesting, but one does need to concentrate and listen carefully to keep up with the pace and depth of the presentation. He certainly sounds like someone who knows what he’s talking about. Permit me to summarize the main points of his research:
When greenhouse gases cause surface warming, a response occurs, a ‘feedback response’, in the form of changes in cloud and precipitation patterns. The CRU-related climate models all assume the feedback response is a positive one: any increment of greenhouse warming will be amplified by knock-on effects in the weather system. This assumption then leads to the predictions of ‘runaway global warming’.
Spencer set out to see what the feedback response actually is, by observing what happens in the cloud-precipitation system when surface warming is occurring. What he found, by targeting satellite sensors appropriately, is that the feedback response is negative rather than positive. In particular, he found that the formation of storm-related cirrus clouds is inhibited when surface temperatures are high. Cirrus clouds themselves have a powerful greenhouse effect, and this reduction in cirrus cloud formation compensates for the increase in the CO2 greenhouse effect.
This is the kind of research we need to look at if we want to build useful climate models. Certainly Spencer’s results need to be confirmed by other researchers before we accept them as fact, but to simply dismiss his work out of hand is very bad for the progress of climate science. Consider what the popular website SourceWatch says about Spencer.
We don’t find there any reference to rebuttals of his research, but we are told that Spencer writes columns for a free-market website funded by Exxon. They also mention that he spoke at a conference organized by the Heartland Institute, which promotes all sorts of reactionary, free-market principles. They are trying to discredit Spencer’s work on irrelevant grounds, what logicians call an ad hominem argument. Sort of like, “If he beats his wife, his science must be faulty”.
And it’s true about ‘beating his wife’ — Spencer does seem to have a pro-industry philosophy that shows little concern for sustainability. That might even be part of his motivation for undertaking his recent research, hoping to give ammunition to pro-industry lobbyists. But that doesn’t prove his research is flawed or that his conclusions are invalid. His work should be challenged scientifically, by carrying out independent studies of the feedback process. If the challenges are restricted to irrelevant attacks, that becomes almost an admission that his results, which are threatening to the climate establishment, cannot be refuted. He does not hide his data, or his code, or his sentiments. The same cannot be said for the warming-alarmist camp.
Question 3
What are we to make of Jim Hansen’s prediction that rapid warming will soon resume?
Once again, I refer you to Dr. Hansen’s recent article, 2009 temperatures by Jim Hansen.
Jim begins with the following paragraph:
The past year, 2009, tied as the second warmest year in the 130 years of global instrumental temperature records, in the surface temperature analysis of the NASA Goddard Institute for Space Studies (GISS). The Southern Hemisphere set a record as the warmest year for that half of the world. Global mean temperature, as shown in Figure 1a, was 0.57°C (1.0°F) warmer than climatology (the 1951-1980 base period). Southern Hemisphere mean temperature, as shown in Figure 1b, was 0.49°C (0.88°F) warmer than in the period of climatology.
The Southern Hemisphere may be experiencing warming, but it has 2°C to go before that might become a problem there, and it has nothing to do with the Northern Hemisphere, where temperatures have been declining recently, not setting records for warming. This mathematical abstraction, the global average, is characteristic of nowhere. It creates the illusion of a warming crisis, when in fact no evidence for such a crisis exists. In the context of IPCC warnings about glaciers melting, runaway warming, etc., this global-average argument serves as deceptive and effective propaganda, but not as science.
Jim continues with this paragraph, emphasis added:
The global record warm year, in the period of near-global instrumental measurements (since the late 1800s), was 2005. Sometimes it is asserted that 1998 was the warmest year. The origin of this confusion is discussed below. There is a high degree of interannual (year-to-year) and decadal variability in both global and hemispheric temperatures. Underlying this variability, however, is a long-term warming trend that has become strong and persistent over the past three decades. The long-term trends are more apparent when temperature is averaged over several years. The 60-month (5-year) and 132-month (11-year) running mean temperatures are shown in Figure 2 for the globe and the hemispheres. The 5-year mean is sufficient to reduce the effect of the El Niño – La Niña cycles of tropical climate. The 11-year mean minimizes the effect of solar variability – the brightness of the sun varies by a measurable amount over the sunspot cycle, which is typically of 10-12 year duration.
As I’ve emphasized in bold, Jim is assuming that there is a strong and persistent warming trend, which he of course attributes to human-caused CO2 emissions. And then that assumption becomes the justification for the 5- and 11-year running averages. Those running averages then give us phantom ‘temperatures’ that don’t match actual observations. In particular, if a decline is beginning, the running averages will tend to ‘hide the decline’.
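For readers who want to see the mechanics, here is a minimal sketch, in plain Python with entirely synthetic numbers (not GISS data), of how a trailing multi-year running mean can keep rising even while the most recent raw values are falling:

```python
# Illustrative only: synthetic annual anomalies, not real measurements.

def running_mean(series, window):
    """Trailing running mean over `window` points (no external libraries)."""
    out = []
    for i in range(window - 1, len(series)):
        out.append(sum(series[i - window + 1:i + 1]) / window)
    return out

# Hypothetical anomalies: a steady rise followed by a three-year dip.
anomalies = [0.1, 0.15, 0.2, 0.25, 0.3, 0.35, 0.4, 0.45, 0.4, 0.35, 0.3]

smoothed = running_mean(anomalies, 5)
print("last raw values:", anomalies[-3:])
print("last 5-yr means:", [round(x, 3) for x in smoothed[-3:]])
```

With this made-up series the last three raw values fall from 0.4 to 0.3, while the corresponding 5-year means stay essentially flat; whether that smoothing is a legitimate noise filter or a way to ‘hide the decline’ is exactly the point in dispute.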
It seems we are looking at a classic case of over-attachment to a model. What began as a theory has now become an assumption, and actual observations are being dismissed as “confusion” because they don’t agree with the model. The climate models have definitely strayed into the land of imaginary epicycles. The assumption of CO2 causation, plus the preoccupation with an abstract global average, creates a warming illusion that has no connection with reality in either hemisphere, as we see in these two graphs from Jim’s article:
As with the Ptolemaic model, there is a much simpler explanation for our recent era of warming, at least in the Northern Hemisphere: long-term patterns are continuing, for whatever reasons, and human-caused CO2 has so far had no noticeable effect. This simpler explanation is based on actual observations, and requires no abstract mathematical epicycles or averages, but it removes CO2 from the center of the climate debate. And just as powerful forces in Galileo’s day wanted the Earth to remain the center of the universe, powerful forces today want CO2 to remain at the center of the climate debate, and global warming to be seen as a threat.
Question 4
What is the real agenda of the politically powerful factions who are promoting global-warming alarmism?
One thing we always need to keep in mind is that the people at the top of the power pyramid in our society have access to the very best scientific information. They control dozens, probably hundreds, of high-level think tanks, able to hire the best minds, and carrying out all kinds of research we don’t hear about. They have access to all the secret military and CIA research, and a great deal of influence over what research is carried out in think tanks, the military, and in universities.
Just because they might be promoting fake science for its propaganda value, that doesn’t mean they believe it themselves. They undoubtedly know that global cooling is the real problem, and the actions they are promoting are completely in line with such an understanding.
Cap-and-trade, for example, won’t reduce carbon emissions. Rather it is a mechanism that allows emissions to continue, while pretending they are declining — by means of a phony market model. You know what a phony market model looks like. It looks like Reagan and Thatcher telling us that lower taxes will lead to higher government revenues due to increased business activity. It looks like globalization, telling us that opening up free markets will “raise all boats” and make us all prosperous. It looks like Wall Street, telling us that mortgage derivatives are a good deal, and we should buy them. And it looks like Wall Street telling us the bailouts will restore the economy, and that the recession is over. In short, it’s a con. It’s a fake theory about what the consequences of a policy will be, when the real consequences are known from the beginning.
Cap-and-trade has nothing to do with climate. It is part of a scheme to micromanage the allocation of global resources, and to maximize profits from the use of those resources. Think about it. Our ‘powerful factions’ decide who gets the initial free cap-and-trade credits. They run the exchange market itself, and can manipulate the market, create derivative products, sell futures, etc. They can cause deflation or inflation of carbon credits, just as they can cause deflation or inflation of currencies. They decide which corporations get advance insider tips, so they can maximize their emissions while minimizing their offset costs. They decide who gets loans to buy offsets, and at what interest rate. They decide what fraction of petroleum will go to the global North and the global South. They have ‘their man’ in the regulation agencies that certify the validity of offset projects. And they make money every which way as they carry out this micromanagement.
In the face of global cooling, this profiteering and micromanagement of energy resources becomes particularly significant. Just when more energy is needed to heat our homes, we’ll find that the price has gone way up. Oil companies are actually strong supporters of the global-warming bandwagon, which is very ironic, given that they are funding some of the useful contrary research that is going on. Perhaps the oil barons are counting on the fact that we are suspicious of them, and assume we will discount the research they are funding, as most people are in fact doing. And the recent onset of global cooling explains all the urgency to implement the carbon-management regime: they need to get it in place before everyone realizes that warming alarmism is a scam.
And then there’s the carbon taxes. Just as with income taxes, you and I will pay our full share for our daily commute and for heating our homes, while the big corporate CO2 emitters will have all kinds of loopholes, and offshore havens, set up for them. Just as Federal Reserve theory hasn’t left us with a prosperous Main Street, despite its promises, so theories of carbon trading and taxation won’t give us a happy transition to a sustainable world.
Instead of building the energy-efficient transport systems we need, for example, they’ll sell us biofuels and electric cars, while most of society’s overall energy will continue to come from fossil fuels, and the economy continues to deteriorate. The North will continue to operate unsustainably, and the South will pay the price in the form of mass die-offs, which are already ticking along at the rate of six million children a year from malnutrition and disease.
While collapse, suffering, and die-offs of ‘marginal’ populations will be unpleasant for us, it will give our ‘powerful factions’ a blank canvas on which to construct their new world order, whatever that might be. And we’ll be desperate to go along with any scheme that looks like it might put food back on our tables and warm up our houses.
Author contact – rkm@quaylargo.com
Up in Smoke
Why Biomass Wood Energy is Not the Answer
By George Wuerthner | January 12, 2010
After Smurfit-Stone Container Corp.’s linerboard plant in Missoula, Montana, announced that it was closing permanently, many people, including Montana Governor Brian Schweitzer, Missoula’s mayor, and Senator Jon Tester, have advocated turning the mill into a biomass energy plant. Northwestern Energy, a company that has expressed interest in using the plant for energy production, has already indicated that it would expect more wood from national forests to make the plant economically viable.
The Smurfit-Stone conversion proposal is not alone. A spate of proposals for new wood-burning biomass energy plants is sprouting across the country like mushrooms after a rain. Currently there are plans or proposals for new biomass power plants in Maine, Vermont, Pennsylvania, Florida, California, Idaho, Oregon and elsewhere. In every instance, these plants are being promoted as “green” technology.
Part of the reason for this “boom” is that taxpayers are providing substantial financial incentives, including tax breaks, government grants, and loan guarantees. The rationale for these taxpayer subsidies is the presumption that biomass is “green” energy. But like other “quick fixes,” there has been very little serious scrutiny of the real costs and environmental impacts of biomass. Whether commercial biomass is a viable alternative to traditional fossil fuels is questionable.
Before I get into this discussion, I want to state right up front that our reliance on coal and the other fossil fuels that now provide much of our electrical energy needs to be reduced and effectively replaced. But biomass energy is not the way to accomplish this goal.
BIOMASS BURNING IS POLLUTION
First and foremost, biomass burning isn’t green. Burning wood produces huge amounts of pollution. Especially in valleys like Missoula where temperature inversions are common, pollution from a biomass burner will be the source of numerous health ailments. Because of the air pollution and human health concerns, the Oregon Chapter of the American Lung Association, the Massachusetts Medical Society and the Florida Medical Association, have all established policies opposing large-scale biomass plants.
The reason for this medical concern is that even with the best pollution control devices, biomass energy is extremely dirty. For instance, one of the biggest biomass burners now in operation, the McNeil biomass plant in Burlington, Vermont, is the number one pollution source in the state, emitting 79 classified pollutants. Biomass burning releases dioxins and as much particulate matter as coal burning, plus carbon monoxide, nitrogen oxide, and sulfur dioxide, and it contributes to ozone formation. […]
BIOMASS ENERGY IS INEFFICIENT
Wood is not nearly as concentrated a heat source as coal, gas, oil, or any other fossil fuel. Most biomass energy operations are able to capture only 20-25% of the energy content of the wood they burn. That means one needs to gather and burn more wood to get the same energy value as a more concentrated fuel like coal. That is not to suggest that coal is a good alternative; rather, wood is a worse one, especially when you consider the energy used to gather such a dispersed fuel source and the energy cost of trucking it to a central energy plant. If the entire carbon footprint of wood is considered, biomass creates far more CO2 with far less energy output than other energy sources.
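As a rough illustration of why the concentration of the fuel matters, here is a back-of-envelope sketch. The heating values and the coal-plant efficiency below are round-number assumptions of mine, not figures from this article; only the 20-25% wood-plant efficiency comes from the text.

```python
# Back-of-envelope comparison: tonnes of fuel needed per GWh of electricity.
# All constants are assumed round numbers, not measured data.

WOOD_MJ_PER_KG = 16.0   # assumed: air-dried wood, lower heating value
COAL_MJ_PER_KG = 27.0   # assumed: typical bituminous coal
WOOD_EFFICIENCY = 0.22  # ~20-25% capture, as cited in the text
COAL_EFFICIENCY = 0.35  # assumed: typical steam coal plant

MJ_PER_GWH = 3.6e6  # 1 GWh = 3.6 million MJ

def tonnes_per_gwh(mj_per_kg, efficiency):
    """Fuel mass (tonnes) needed to deliver one GWh of electricity."""
    kg = MJ_PER_GWH / (mj_per_kg * efficiency)
    return kg / 1000.0

print(f"wood: {tonnes_per_gwh(WOOD_MJ_PER_KG, WOOD_EFFICIENCY):,.0f} t/GWh")
print(f"coal: {tonnes_per_gwh(COAL_MJ_PER_KG, COAL_EFFICIENCY):,.0f} t/GWh")
```

With these assumed numbers, a wood plant needs roughly two to three times the fuel mass per unit of electricity, all of which must be gathered and trucked from a dispersed source.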
The McNeil biomass plant in Burlington, Vermont, seldom runs full time because wood energy, even with all the subsidies (and Vermonters have made huge and repeated subsidies to the plant, not counting “hidden subsidies” like air pollution), can’t compete with other energy sources, even in the Northeast, where energy costs are among the highest in the nation. Even though the plant was retrofitted so it could burn natural gas to increase its competitiveness, it still does not operate competitively. It is generally used only to offset peak energy loads.
One could argue, of course, that other energy sources like coal are greatly subsidized as well, especially if all environmental costs were considered. But at the very least, all energy sources must be “standardized” so that consumers can make informed decisions about energy—and biomass energy appears to be no more green than other energy sources.
BIOMASS SANITIZES AND MINES OUR FORESTS
The dispersed nature of wood as a fuel source combined with its low energy value means any sizable energy plant must burn a lot of wood. For instance, the McNeil 50 megawatt biomass plant in Burlington, Vermont would require roughly 32,500 acres of forest each year if running at near full capacity and entirely on wood. Wood for the McNeil Plant is trucked and even shipped on trains from as far away as Massachusetts, New Hampshire, Quebec and Maine.
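The 32,500-acre figure can be sanity-checked with simple arithmetic. Every constant below is an assumption chosen for illustration (the capacity factor, the wood demand per GWh, and the harvestable yield per acre), not data from this article:

```python
# Rough consistency check for a 50 MW wood-fired plant's annual forest demand.
# All constants marked "assumed" are illustrative, not sourced figures.

PLANT_MW = 50
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.9          # assumed: "running at near full capacity"
WOOD_TONNES_PER_GWH = 1000.0   # assumed round number for a ~22%-efficient plant
YIELD_TONNES_PER_ACRE = 14.0   # assumed harvestable wood per acre

gwh_per_year = PLANT_MW * HOURS_PER_YEAR * CAPACITY_FACTOR / 1000.0
tonnes = gwh_per_year * WOOD_TONNES_PER_GWH
acres = tonnes / YIELD_TONNES_PER_ACRE
print(f"{gwh_per_year:,.0f} GWh/yr -> {tonnes:,.0f} t of wood -> {acres:,.0f} acres")
```

With these assumed numbers the result lands in the high twenty-thousands of acres per year, the same ballpark as the 32,500-acre figure quoted for the McNeil plant.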
Biomass proponents often suggest that wood [gathered] as a consequence of forest thinning to improve “forest health” (logging a forest to improve health of a forest ecosystem is an oxymoron) will provide the fuel for plant operations. For instance, one of the assumptions of Senator Tester’s Montana Forest Jobs bill is that thinned forests will provide a ready source of biomass for energy production. But in many cases, there are limits on the economic viability of trucking wood any distance to a central energy plant. Again without huge subsidies, this simply does not make economic sense. Biomass forest harvesting is even worse for forest ecosystems than clear-cutting. Biomass energy tends to utilize the entire tree, including the bole, crown, and branches. This robs a forest of nutrients, and disrupts energy cycles.
Worse yet, such biomass removal ignores the important role of dead trees in sustaining forest ecosystems. Dead trees are not a “wasted” resource. They provide homes and food for thousands of species, including 45% of all bird species in the nation. Dead trees that fall to the ground are used by insects, small mammals, amphibians and reptiles for shelter and even, potentially, food. Dead trees that fall into streams are important physical components of aquatic ecosystems and provide critical habitat for many fish and other aquatic species. Removal of dead wood is mining the forest.

Keep in mind that logging activities are not benign. Logging typically requires some kind of access, often roads, which are a major source of sedimentation in streams and disrupt natural subsurface water flow. Logging can disturb sensitive wildlife: grizzly bears and even elk are known to abandon locations with active logging. Logging can spread weeds. And finally, since large amounts of forest carbon are actually tied up in the soils, soil disturbance from logging is especially damaging, often releasing substantial additional amounts of carbon over and above what is released up a smokestack.
BIOMASS ENERGY USES LARGE AMOUNTS OF WATER
A large-scale biomass plant (50 MW) uses close to a million gallons of water a day for cooling. Most of that water is lost from the watershed, since approximately 85% leaves as steam. Water channeled back into a river or stream typically carries a pollution cost as well, including higher water temperatures that negatively impact fisheries, especially trout. Since cooling needs are greatest in warm weather, removal of water from rivers occurs just when flows are lowest and fish are most susceptible to temperature stress.
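The water figures in the paragraph above imply, in straightforward arithmetic:

```python
# Quick arithmetic on the figures quoted in the text (US gallons).

GALLONS_PER_DAY = 1_000_000   # cooling water for a ~50 MW plant
FRACTION_EVAPORATED = 0.85    # share leaving the watershed as steam

lost_per_day = GALLONS_PER_DAY * FRACTION_EVAPORATED
lost_per_year = lost_per_day * 365
print(f"evaporated: {lost_per_day:,.0f} gal/day, {lost_per_year:,.0f} gal/yr")
```

That is 850,000 gallons a day, or on the order of 310 million gallons a year, permanently removed from the local watershed.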
BIOMASS ENERGY SAPS FUNDS FROM OTHER TRULY GREEN ENERGY SOURCES LIKE SOLAR
Since biomass energy is eligible for state renewable portfolio standards (RPS), it has captured the bulk of funding intended to move the country away from fossil fuels. For example, in Vermont, 90% of the RPS is from “smokestack” sources—mostly biomass incineration. This pattern holds throughout many other parts of the country. Biomass energy is thus burning up funds that could and should be going into other energy programs like energy conservation, solar and insulation of buildings.
PUBLIC FORESTS WILL BE LOGGED FOR BIOMASS ENERGY
Many of the climate bills now circulating in Congress, as well as Montana Senator Jon Tester’s Montana Jobs and Wilderness bill, target public forests. Some of these proposals even include roadless lands and proposed wilderness as a source for wood biomass. One federal study suggests that 368 million tons of wood could be removed from our national forests every year—of course this study did not include the ecological costs that removing this much wood would have on forest ecosystems.
The Biomass Crop Assistance Program, or BCAP, which was quietly put into the 2008 farm bill has so far given away more than a half billion dollars in a matching payment program for businesses that cut and collect biomass from national forests and Bureau of Land Management lands. And according to a recent Washington Post story, the Obama administration has already sent $23 million to biomass energy companies, and is poised to send another half billion.
And it is not only federal forests that are in jeopardy. Many states are eyeing their own state forests for biomass energy. For instance, Maine recently unveiled a new plan known as the Great Maine Forest Initiative, which will pay timber companies to grow trees for biomass energy.
JOB LOSSES
Ironically, one of the main justifications for biomass energy is the creation of jobs, yet the wood biomass rush is having unintended consequences for other forest products industries. Companies that rely upon surplus wood chips to produce fiberboard, cabinetry, and furniture are scrambling to find wood fiber for their products. Since these industries are secondary producers of products, the biomass rush could threaten more jobs than it creates.
BOTTOM LINE
Large scale wood biomass energy is neither green, nor truly economical. It is also not ecologically sustainable and jeopardizes our forest ecosystems. It is a distraction that funnels funds and attention away from other more truly worthwhile energy options, in particular, the need for a massive energy conservation program, and changes in our lifestyles that will in the end provide truly green alternatives to coal and other fossil fuels.
George Wuerthner is a wildlife biologist and a former Montana hunting guide. His latest book is Plundering Appalachia.
Related articles
- Massachusetts Restricts Dirty Biopower (switchboard.nrdc.org)
- Forest Owners Tell EPA to Avoid Pitfalls in Biomass Review (prweb.com)
- Greens warn biomass plan could reduce food supplies (morningstaronline.co.uk)
- Biggest English Polluter Spends $1 Billion to Burn Wood (bloomberg.com)
- California Proposes Forest Thinning for Biomass Energy, But is it a Good Idea? (kcet.org)
Copenhagen climate summit: confusion as ‘historic deal’ descends into chaos
The “historic” climate change deal at the Copenhagen climate summit has descended into chaos after some developing nations rejected the plan for fighting global warming championed by US President Barack Obama.
By David Barrett and Louise Gray, in Copenhagen
The Telegraph | December 19, 2009

(From Left) European Commission President Barroso, German Chancellor Angela Merkel, Swedish Prime Minister Fredrik Reinfeldt, French President Nicolas Sarkozy, US President Barack Obama and British PM Gordon Brown Photo: STEFFEN KUGLER/AFP/Getty Images
An agreement to limit global warming to a 3.6F (2C) temperature rise, alongside $100 billion (£62bn) a year in aid from 2020, was condemned as inadequate by some delegates and appeared to be in danger of unravelling.
Developing nations, including Venezuela, said they could not accept a text originally agreed by the United States, China, India, Brazil and South Africa as the blueprint of a wider United Nations plan to fight climate change.
Tempers flared during an all-night plenary session, held after most of 120 visiting world leaders had left.
Lumumba Stanislaus Di-Aping, the Sudanese negotiator, said the draft text asked “Africa to sign a suicide pact”.
One Saudi delegate said it was without doubt “the worst plenary I have ever attended.”
Ed Miliband, the Environment Secretary, warned delegates that the plan would have to be endorsed to unlock funds outlined in the deal, including $30 billion in “quick-start” aid from 2010-12, rising to $100 billion a year from 2020.
Apart from the original five nations supporting the scheme, European Union states, Japan and groups representing small island states, least developed nations and African countries spoke in favour of the plan during the overnight session.
The two-week summit ended late on Friday night after a row between the US and China overshadowed negotiations, yet its conclusions were initially hailed as a significant deal.
[…]
The accord declared that “deep cuts in emissions are required”. But instead of a detailed pledge to halve carbon emissions by 2050, leaders agreed only to the vague promise to limit the rise in global temperatures to 2C, with no specifics on how to achieve that.
The leaders also put off setting emissions targets for 2020, saying they would attempt to agree them by February… Full article
British Columbia: New terminal for LNG exports to China
Picture – Horn River News
WSJ: Apache To Provide Natural Gas To Proposed Kitimat LNG Terminal For Export To Asia – Update
December 18, 2009
(RTTNews) – According to a Sunday report in The Wall Street Journal, oil and gas company Apache Corp. (APA) has agreed to provide natural gas to Canadian firm Kitimat LNG Inc. for export to Asia through Kitimat’s proposed liquefied natural gas (LNG) export terminal in Kitimat, British Columbia. Construction of the $3-billion LNG export facility is set to begin in late 2009 or early 2010, with the facility coming into operation 36 to 40 months later, by 2013 or 2014. The companies are expected to announce an agreement on Monday.
Privately owned, Calgary-based Kitimat LNG is committed to building a state-of-the-art LNG terminal in Kitimat that would receive natural gas via pipeline from the Western Canadian Sedimentary Basin. At the terminal the natural gas will be cooled to -160 degrees centigrade, condensed and liquefied in preparation for export by ship to Asian markets. In Asia, the LNG will undergo regasification and be transported through pipelines to its final destination.
Pursuant to an agreement, Kitimat LNG and Houston, Texas-based Apache would negotiate a definitive agreement under which Apache would supply specific quantities of the LNG facility’s 700 million cubic feet per day of natural gas feedstock. In mid-July, EOG Resources, Inc. (EOG) also signed a memorandum of understanding or MOU, to supply natural gas to Kitimat LNG’s proposed LNG export terminal.
In a statement while signing the EOG agreement, President of Kitimat LNG Rosemary Boulton said, “Kitimat LNG presents a compelling opportunity for producers to leverage growing natural gas reserves in Western Canada and sell into significant new international markets such as Asia.”
After EOG, Apache is the second major North American gas producer to have reportedly agreed to supply natural gas to Kitimat LNG. Kitimat LNG has also signed MOUs with leading LNG companies such as Korea Gas Corporation (KOGAS) and Gas Natural for the purchase of LNG produced at the terminal. However, there are other companies active in British Columbia, where the proposed project is situated, including EnCana Corp. (ECA, ECA.TO), Devon Energy Corp. (DVN) and industry giant Exxon Mobil Corp. (XOM).
Kitimat LNG’s export terminal proposal is supported by natural gas market fundamentals that show growth in the supply of natural gas in Western Canada and strong, growing demand for natural gas in Asia. As a politically and economically stable country that is close to Asian markets, Canada offers a reliable, plentiful natural gas supply to customers in the Pacific Rim.
The project is expected to take advantage of rising natural gas demand and higher LNG prices in Asia, with Asian prices expected to continue to climb. U.S. natural gas prices have been stuck between US$7 and $9 per million British thermal units (BTU) for most of the year, while in Asia LNG has traded with increasing frequency at record spot prices of US$20 per million BTU.
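To see why exporters find the Asian market attractive, here is an indicative margin calculation. Only the two prices come from the article; the liquefaction and shipping costs are assumptions of mine for illustration:

```python
# Indicative arbitrage margin for North American gas sold as LNG in Asia.
# US_PRICE and ASIA_SPOT are quoted in the article; the cost lines are assumed.

US_PRICE = 8.0          # $/MMBtu, midpoint of the quoted $7-9 range
ASIA_SPOT = 20.0        # $/MMBtu, quoted record spot price
LIQUEFACTION_COST = 3.0 # assumed $/MMBtu
SHIPPING_COST = 1.5     # assumed $/MMBtu

margin = ASIA_SPOT - US_PRICE - LIQUEFACTION_COST - SHIPPING_COST
print(f"indicative margin: ${margin:.2f}/MMBtu")
```

Even after these assumed processing and transport costs, the quoted price gap leaves a wide indicative margin per MMBtu, which is the fundamental driver behind the terminal proposal.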
The Kitimat project comprises a 40-hectare LNG export terminal site with two storage tanks, a marine jetty and a berthing facility. It would have an annual LNG capacity of three to five million tons and would take about 36 to 40 months to complete. It would handle three to five shipments monthly and would target key potential markets like Japan, South Korea, China, and Taiwan. – source
Copenhagen: Bolivia, Sudan, Venezuela and S.A. set to humiliate Obama
Update: Obama departs Copenhagen without a binding agreement
December 18, 2009 | Highlights from Politico.com
On Friday morning, Obama warned delegates that U.S. offers of funding for poor nations would remain on the table “if and only if” developing nations, including China, agreed to international monitoring of their greenhouse gas emissions. […]
Back home, senators critical to getting a climate bill through Congress have stressed that developing nations must submit to international monitoring — particularly if they want the U.S. to pay hundreds of billions to help combat the destructive impact of climate change.
“The only way we’ll be successful in America is for countries like China and India to make an equivalent commitment,” said Sen. Lindsey Graham (R-S.C.), who is crafting a bipartisan climate bill. “We’re not going to unilaterally disarm.”
While Obama emphasized the U.S. commitment to taking action on climate change, he did not set a deadline for specific Senate action on the climate bill. […]
Overnight reports that world leaders had agreed to a tentative final climate change deal in Copenhagen were greatly exaggerated — and the outcome of the COP-15 conference was still very much up in the air when Air Force One touched down at 9:01 a.m. local time. […]
After addressing the delegates, Obama met with Chinese Premier Wen Jiabao for close to an hour to discuss emissions goals, verification mechanisms and climate financing. The lack of agreement between China and the U.S. — the world’s two largest greenhouse gas emitters — has been a major stumbling block in the talks.
A White House official described the discussion as “constructive” and said that the two leaders asked their negotiators to get together one-on-one after the meeting. […]
One key sticking point: a demand by industrialized nations that the document produced here be legally binding, the so-called “operational” agreement Secretary of State Hillary Clinton spoke about yesterday.
… none of the several drafts circulating in Copenhagen represented even the bones of a final deal, with many key issues still in flux and time running out. Moreover, U.S. predictions that roadblocks could be thrown up by smaller countries seemed to be coming true, with last-minute objections voiced by Venezuela, Bolivia, Sudan and Saudi Arabia, according to people familiar with talks. […]
An official with a developing nation told Reuters that rich nations were offering to cut their carbon emissions by 80 percent by 2050, a proposal that had been rejected by developing nations. Developing nations have always insisted on the need for mid-term targets…
Why are the oligarchic elites trying so hard to push their climate change policies through right now?
December 9, 2009 by Notsilvia Night
Why are the political and financial elites and their obedient servants in the faith – sorry, scientific – community pushing so madly for a final decision on global carbon-tax legislation at this very moment?
Why don’t they just wait until the scandal of “climate-gate” has blown over?
Because those elites know they are wrong on the issue of human caused climate change.
They know that their lies are being revealed to the public piece by piece, faster and faster.
Most of all, they know that the planet is at the moment once again in a cooling phase as occurs every thirty or forty odd years.
Looking at the current lack of solar activity, this cooling phase might even be more severe than the one that ended 40 years ago, possibly as severe as the Maunder Minimum, a cooling phase lasting several decades during the 17th and 18th centuries.
In a couple of years their claims would no longer be tenable at all. The cooling trend would be obvious to even the most ideologically blinded environmentalist on earth.
The scheme of taxing the global population, creating new revenue streams for the world’s financial markets, establishing central control over the world economy, and preventing developing countries from rising out of poverty, would lose out.
The political leaders of all less powerful countries are being bullied at the moment into signing a treaty which gives away their countries’ national sovereignty to the leadership of the powerful ones, namely Britain and the United States – and, more to the point, to the shadow leadership behind them, the world’s financial elites of Goldman Sachs and Co.
So why are so many decent people on the left fighting tooth and nail for the carbon-trading profits of Goldman Sachs and Al Gore’s Generation Investment Management (GIM) company?
It’s a psychological problem: most people, especially on the left, want to be on the side of the good and caring people.
For over 40 years now we have been told that being environmentally minded means being a good person. It means we care about nature, wild animal life, about future generations of human beings.
Being environmentally minded means we are opposed to polluting the air and the water;
we are opposed to deforestation (especially in the rain-forest regions);
we are opposed to dumping our own poisonous waste onto the developing world;
we are opposed to rampant consumerism, in which, driven by the advertising industry, we keep on buying and buying – things we don’t actually need, things that make us neither happier nor more comfortable, just more indebted.
All those nice middle-class people who want to feel good about themselves support these ideas as part of the program of the left. And yes, there are plenty of real environmental issues we should be concerned about. But while marginalizing these real environmental issues, the financial and right-wing ideological elites have – with the help of the media they control – succeeded in infiltrating their own agenda into the “green” movement with the bogus Anthropogenic Global Warming ideology.
The propaganda has been very successful indeed. People who want with all their heart to be “good” and decent are now supporting the agenda of the most selfish and anti-humanist forces on the planet.
The propaganda has created a belief system which is hard to break. In Europe this belief system is even more entrenched, since it has been cultivated for a few more years; hence it may be harder to break among Europeans than among Americans.
But after the “Climategate” revelations the chances aren’t so bad any more. A global storm is brewing against the liars (which include most of the mainstream media) and their masters. No matter how bad it looks when we listen to the sound-bites of the top-level political hacks, down at the bottom, in the population, minds are changing en masse.
In just a little while, those who honestly strive to be the “good” guys (and girls) will realize that being good and caring about future generations means not caring for the Goldman Sachs carbon credits scheme.
The truth will indeed set us free from global tyranny:
Watch also: Lord Monckton on Climategate at the 2nd International Climate Conference, on Vimeo.
Copenhagen climate summit in disarray after ‘Danish text’ leak
- John Vidal in Copenhagen
- guardian.co.uk, Tuesday 8 December 2009 14.09 GMT

The UN Copenhagen climate talks are in disarray today after developing countries reacted furiously to leaked documents that show world leaders will next week be asked to sign an agreement that hands more power to rich countries and sidelines the UN’s role in all future climate change negotiations.
The document is also being interpreted by developing countries as setting unequal limits on per capita carbon emissions for developed and developing countries in 2050; meaning that people in rich countries would be permitted to emit nearly twice as much under the proposals.
The so-called Danish text, a secret draft agreement worked on by a group of individuals known as “the circle of commitment” – but understood to include the UK, US and Denmark – has only been shown to a handful of countries since it was finalised this week.
The agreement, leaked to the Guardian, is a departure from the Kyoto protocol’s principle that rich nations, which have emitted the bulk of the CO2, should take on firm and binding commitments to reduce greenhouse gases, while poorer nations were not compelled to act. The draft hands effective control of climate change finance to the World Bank; would abandon the Kyoto protocol – the only legally binding treaty that the world has on emissions reductions; and would make any money to help poor countries adapt to climate change dependent on them taking a range of actions.
The document was described last night by one senior diplomat as “a very dangerous document for developing countries. It is a fundamental reworking of the UN balance of obligations. It is to be superimposed without discussion on the talks”.
A confidential analysis of the text by developing countries also seen by the Guardian shows deep unease over details of the text. In particular, it is understood to:
• Force developing countries to agree to specific emission cuts and measures that were not part of the original UN agreement;
• Divide poor countries further by creating a new category of developing countries called “the most vulnerable”;
• Weaken the UN’s role in handling climate finance;
• Not allow poor countries to emit more than 1.44 tonnes of carbon per person by 2050, while allowing rich countries to emit 2.67 tonnes.
Developing countries that have seen the text are understood to be furious that it is being promoted by rich countries without their knowledge and without discussion in the negotiations.
“It is being done in secret. Clearly the intention is to get [Barack] Obama and the leaders of other rich countries to muscle it through when they arrive next week. It effectively is the end of the UN process,” said one diplomat, who asked to remain nameless.
Antonio Hill, climate policy adviser for Oxfam International, said: “This is only a draft but it highlights the risk that when the big countries come together, the small ones get hurt. On every count the emission cuts need to be scaled up. It allows too many loopholes and does not suggest anything like the 40% cuts that science is saying is needed.”
Hill continued: “It proposes a green fund to be run by a board, but the big risk is that it will be run by the World Bank and the Global Environment Facility [a partnership of 10 agencies including the World Bank and the UN Environment Programme] and not the UN. That would be a step backwards, and it tries to put constraints on developing countries when none were negotiated in earlier UN climate talks.”
The text was intended by Denmark and rich countries to be a working framework, which would be adapted by countries over the next week. It is particularly inflammatory because it sidelines the UN negotiating process and suggests that rich countries are desperate for world leaders to have a text to work from when they arrive next week.
Few numbers or figures are included in the text because these would be filled in later by world leaders. However, it seeks to hold temperature rises to 2C and mentions the sum of $10bn a year to help poor countries adapt to climate change from 2012-15.