Aletho News

ΑΛΗΘΩΣ

Venezuela and North Dakota Oil Updates

1. The U.S. Geological Survey estimated a mean volume of 513 billion barrels of technically recoverable heavy oil in the Orinoco Oil Belt Assessment Unit of the East Venezuela Basin Province; the range is 380 to 652 billion barrels. (4 page pdf)

Estimates of Original Oil-in-Place
A comprehensive study by Petroleos de Venezuela S.A. (PDVSA) established the magnitude of the original oil-in-place (OOIP) at 1,180 billion barrels of oil (BBO), a commonly cited estimate for the Orinoco Oil Belt (Fiorillo, 1987); PDVSA recently revised this value to more than 1,300 BBO (Gonzalez and others, 2006). In this study the median OOIP was estimated at 1,300 BBO and the maximum at 1,400 BBO. The minimum OOIP was estimated at 900 BBO, given the uncertainty of regional sandstone distribution and oil saturation (Fiorillo, 1987).

Estimates of Recovery Factor

Recovery factor, or that percentage of the OOIP that is determined to be technically recoverable, was estimated from what is currently known of the technology for recovery of heavy oil in the Orinoco Oil Belt AU and in other areas, particularly California, west Texas, and western Canada. The minimum recovery factor was estimated to be 15 percent, the recovery expected for cold production using horizontal wells. The median recovery factor was estimated to be 45 percent, on the assumption that horizontal drilling and thermal recovery methods might be widely used. The maximum recovery factor was estimated to be 70 percent, on the assumption that other recovery processes, in addition to horizontal drilling and steam-assisted gravity drainage, might eventually be applied on a large scale in the Orinoco Oil Belt AU.

The assessment of technically recoverable heavy oil and associated gas resources is shown in table 2. The mean of the distribution of heavy oil resources is about 513 BBO, with a range from 380 to about 652 BBO. The mean estimate of associated dissolved-gas resource is 135 trillion cubic feet of gas (TCFG), with a range from 53 to 262 TCFG. No attempt was made in this study to estimate either economically recoverable resources or reserves.
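The arithmetic behind these figures can be checked with a quick Monte Carlo sketch. The triangular distributions below are an assumption made for illustration (the USGS combines its input distributions with its own probabilistic method), but they land close to the published mean:

```python
import random

def mean_recoverable(n=100_000, seed=1):
    """Sample OOIP (BBO) and recovery factor, return the mean product."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        ooip = rng.triangular(900, 1400, 1300)  # min, max, mode (BBO)
        rf = rng.triangular(0.15, 0.70, 0.45)   # min, max, mode
        total += ooip * rf
    return total / n

print(round(mean_recoverable()))  # close to the published mean of 513 BBO
```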

2. North Dakota raised its forecast for oil output on growth in and around the Bakken Shale formation. There is another 100,000 barrels a day in North Dakota from oil that is not in the Bakken.

Output may reach 300,000 to 400,000 barrels a day by mid-2011 and stay at that level for 10 to 15 years, said Lynn Helms, director of the North Dakota Mineral Resources Department. The state’s previous estimate was 220,000 to 280,000 barrels a day.

The forecast was raised on discoveries by companies such as Continental Resources Inc., Helms said in an interview. Drilling advances are enabling producers to tap the Bakken, where rocks lack the porosity and permeability of conventional oil fields. The Bakken contributed to last year’s 7.5 percent gain in U.S. crude output, the biggest since 1955 and the first in 18 years. The Energy Department forecast a 1.8 percent increase in 2010.

The top end of North Dakota’s production projection would represent more than 7 percent of nationwide oil output.
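That share is easy to verify. The 5.36 million barrels a day used below is an assumed figure for 2009 US crude output (roughly the EIA level), not one given in the article:

```python
us_output_bpd = 5_360_000   # assumed 2009 US crude output, barrels/day
nd_top_end_bpd = 400_000    # top end of North Dakota's projection
share = nd_top_end_bpd / us_output_bpd
print(f"{share:.1%}")  # a bit over 7%
```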

Source

January 24, 2010 Posted by | Economics, Malthusian Ideology, Phony Scarcity

One quarter of US grain crops fed to cars – not people, new figures show

New analysis of 2009 US Department of Agriculture figures suggests biofuel revolution is impacting on world food supplies

John Vidal | environment editor
guardian.co.uk | 22 January 2010

One-quarter of all the maize and other grain crops grown in the US now ends up as biofuel in cars rather than being used to feed people, according to new analysis which suggests that the biofuel revolution launched by former President George Bush in 2007 is impacting on world food supplies.

The 2009 figures from the US Department of Agriculture show ethanol production rising to record levels, driven by farm subsidies and laws which require vehicles to use increasing amounts of biofuels.

“The grain grown to produce fuel in the US [in 2009] was enough to feed 330 million people for one year at average world consumption levels,” said Lester Brown, the director of the Earth Policy Institute, a Washington think tank that conducted the analysis.

Last year 107m tonnes of grain, mostly corn, were grown by US farmers to be blended with petrol. This was nearly twice as much as in 2007, when Bush challenged farmers to increase production by 500% by 2017 to cut oil imports and reduce carbon emissions.
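Brown's 330-million-people figure is consistent with average world grain consumption of roughly a third of a tonne per person per year, as a quick check shows (the per-capita figure below is derived, not quoted):

```python
grain_kg = 107e6 * 1000   # 107m tonnes of grain, in kilograms
people = 330e6            # people Brown says it could feed for a year
kg_per_person = grain_kg / people
print(round(kg_per_person))  # ~324 kg of grain per person per year
```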

Graph - US grain used to make ethanol

More than 80 new ethanol plants have been built since then, with more expected by 2015, by which time the US will need to produce a further 5bn gallons of ethanol if it is to meet its renewable fuel standard.

According to Brown, the growing demand for US ethanol derived from grains helped to push world grain prices to record highs between late 2006 and 2008. In 2008, the Guardian revealed a secret World Bank report that concluded that the drive for biofuels by American and European governments had pushed up food prices by 75%, in stark contrast to US claims that prices had risen only 2-3% as a result.

Since then, the number of hungry people in the world has increased to over 1 billion, according to the UN’s World Food Programme.

“Continuing to divert more food to fuel, as is now mandated by the US federal government in its renewable fuel standard, will likely only reinforce the disturbing rise in world hunger. By subsidising the production of ethanol to the tune of some $6bn each year, US taxpayers are in effect subsidising rising food bills at home and around the world,” said Brown.

“The worst economic crisis since the great depression has recently brought food prices down from their peak, but they still remain well above their long-term average levels.”

The US is by far the world’s leading grain exporter, exporting more than Argentina, Australia, Canada, and Russia combined. In 2008, the UN called for a comprehensive review of biofuel production from food crops.

“There is a direct link between biofuels and food prices. The needs of the hungry must come before the needs of cars,” said Meredith Alexander, biofuels campaigner at ActionAid in London. As well as the effect on food, campaigners also argue that many scientists question whether biofuels made from food crops actually save any greenhouse gas emissions.

But ethanol producers deny that their record production means less food. “Continued innovation in ethanol production and agricultural technology means that we don’t have to make a false choice between food and fuel. We can more than meet the demand for food and livestock feed while reducing our dependence on foreign oil through the production of homegrown renewable ethanol,” said Tom Buis, the chief executive of industry group Growth Energy.

January 22, 2010 Posted by | Economics, Malthusian Ideology, Phony Scarcity

Copenhagen Accord formalized by 9 of 193 nations

Copenhagen Climate Accord Deadline Is Flexible, De Boer Says

By Alex Morales

Jan. 20 (Bloomberg) — The Jan. 31 deadline for countries to sign onto the Copenhagen Accord climate-change agreement that was brokered last month is flexible, United Nations climate chief Yvo De Boer said.

“I think you could describe it as a soft deadline,” de Boer said today on a Webcast from Bonn. “There’s nothing deadly about it. If you fail to meet it, you can still associate with the accord afterwards.”

The Copenhagen Accord was crafted by the U.S., China and two dozen other countries on the sidelines of a two-week UN climate summit in the Danish capital that was beset by walkouts and squabbles between developed and developing nations.

The accord called for countries to indicate their support by the end of this month. As of yesterday, nine of the UN Framework Convention on Climate Change’s 193 members had done so formally, a UN spokesman said. Most of the countries who agreed to the deal in Denmark have yet to do so, according to the UN.

Countries have been asked to “associate” themselves with the accord, which is “an important tool to advance the negotiations,” de Boer said. “Countries are not being asked to sign the accord, they’re not being asked to take on a legally binding target; they will not be bound to the action which they submit to the secretariat.”

De Boer said the deadline is to enable him to meet internal requirements to produce a report on the Copenhagen meeting and that countries can indicate whether they support the agreement and their own targets later.

‘Living Document’

“I very much see the accord as a living document that tracks actions that countries want to take,” de Boer said.

Under the deal, countries will aim to keep the global rise in temperatures since industrialization in the 1800s to 2 degrees Celsius (3.6 degrees Fahrenheit). Industrialized nations can submit greenhouse-gas reduction targets for inclusion in an appendix and developing nations can spell out in a separate annex actions they intend to take to limit their own emissions.

Australia, Canada, France, Ghana, the Maldives, Papua New Guinea, Serbia, Singapore and Turkey have notified the UNFCCC that they want to be “associated” with the accord while Cuba has rejected it, the UN spokesman said yesterday.

De Boer said the document will be an “important tool” to advance the formal UN negotiations, which countries “want to reach a conclusion” on at another meeting in Mexico at the end of the year.

“Copenhagen didn’t produce the final cake but it left countries with all the right ingredients to bake a new one in Mexico,” de Boer said. Even so, it isn’t clear whether the outcome in Mexico will be a legally binding treaty, he said.

To contact the reporter on this story: Alex Morales in London at amorales2@bloomberg.net

January 20, 2010 Posted by | Malthusian Ideology, Phony Scarcity, Science and Pseudo-Science

Climate science: models vs. observations

By Richard K. Moore | Aletho News | January 16, 2010

This document continues to evolve, based on continuing research. The latest version is always maintained at this URL:
http://rkmdocs.blogspot.com/2010/01/climate-science-observations-vs-models.html


If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance with his instincts, he will accept it even on the slightest evidence.
— Bertrand Russell, Roads to Freedom, 1918

Science and models

True science begins with observations. When patterns are recognized in these observations, that leads to theories and models, which then lead to predictions. The predictions can then be tested by further observations, which can validate or invalidate the theories and models, or be used to refine them.

This is the paradigm accepted by all scientists. But scientists being people, typically in an academic research community, within a political society, there can be many a slip between cup and lip in the practice of science. There are the problems of getting funding, of peer pressure and career considerations, of dominant political dogmas, etc.

In the case of models there is a special problem that frequently arises. Researchers tend to become attached to their models, both psychologically and professionally. When new observations contradict the model, there is a tendency for the researchers to distort their model to fit the new data, rather than abandoning their model and looking for a better one. Or they may even ignore the new observations, and simply declare that their model is right, and the observations must be in error. This problem is even worse with complex computer models, where it is difficult for reviewers to figure out how the model really works, and whether ’fudging’ might be going on.

A classic example of the ’attached to model’ problem can be found in models of the universe. The Ptolemaic model assumed that the Earth is the center of the universe, and that the universe revolves around that center. Intuitively, this model makes a lot of sense. On the Earth, it feels like we are stationary. And we see the Sun and stars moving across the sky. “Obviously” the universe revolves around the Earth.

However, in order for this model to work in the case of the planets, it was necessary to introduce the arbitrary mechanism of epicycles. When Copernicus and Galileo came along, a much cleaner model was presented, one that explained all the motions with no need for arbitrary assumptions. But no longer would the Earth be the center.

In this case it was not so much scientists that were attached to the old model, but the Church, which liked the model because it fit their interpretation of scripture. We’ve all heard the story of the Bishop who refused to look through the telescope, so he could ignore the new observations and hold on to the old model. Galileo was forced to recant. Thus can political interference hold back the progress of science, and ruin careers.

Climate models and global warming

Over the past century there has been a strong correlation between rising temperatures, and rising CO2 levels in the atmosphere, caused by the ever-increasing burning of fossil fuels. And it is well known that CO2 is a greenhouse gas. Other things being equal, higher CO2 levels must cause an increase in temperature, due to trapping more heat from the sun. Many scientists, quite reasonably, began to explore the theory that continually rising CO2 emissions would lead to continually rising temperatures.

Intuitively, it seems that the theory is “obviously” true. Temperatures have been rising along with CO2 levels; CO2 is a greenhouse gas; what is there to prove? And if the theory is true, and we keep increasing our emissions, then temperatures will eventually reach dangerous levels, melting the Antarctic ice sheet, raising sea levels, and all the other disasters presented by Al Gore in his famous documentary. “Obviously” we are facing a human-generated crisis – and something has got to be done!

But for many years, before Gore’s film, governments didn’t seem to be listening. Environmentalists, however, were listening. Public concern began to grow about CO2 emissions, and the climate scientists investigating the theory shared these concerns. They had a strong motivation to present the scientific case convincingly, in order to force governments to pay attention and take effective action — the future of humanity was at stake!

The climate scientists began building computer models, based on the observed correlation between temperature and CO2 levels. The models looked solid, not only for the past century, but extending back in time. Research with ice-core data revealed a general correlation between temperature and CO2 levels, extending back for a million years and more. What had been “obvious” to begin with, now looked even more obvious, confirmed by seemingly solid science.

These are the very conditions that typically cause scientists to become attached to their models. The early success of the model confirms what the scientists suspected all along: the theory must be true. A subtle shift happens in the minds of the scientists involved. What began as a theory starts to become an assumption. If new data seem to contradict the theory, the response is not to discard the theory, but rather to figure out what the model is lacking.

In the case of the Ptolemaic model, they figured out that epicycles must be lacking, and so epicycles were added. They were certain the universe revolved around the Earth, and so epicycles had to exist. Similarly, the climate scientists have run into problems with their models, and they’ve needed to add more and more machinery to their models in order to overcome those problems. They are certain of their theory, and so their machinery must be valid.

Perhaps they are right. Or perhaps they’ve strayed into epicycle territory, where the theory needs to be abandoned and a better model needs to be identified. This is the conclusion that quite a few scientists have reached. Experts do differ on this question, despite the fact that Gore says emphatically that the “science is settled”. Which group of scientists is right? This is the issue we will be exploring in this article.

Question 1

Compared to the historical record, are we facing a threat of dangerous global warming?

Let’s look at the historical temperature record, beginning with the long-term view. For long-term temperatures, ice-cores provide the most reliable data. Let’s look first at the very-long-term record, using ice cores from Vostok, in the Antarctic.

Data source:
ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/deutnat.txt

Vostok Temperatures: 450,000 BC — Present

Here we see a very regular pattern of long-term temperature cycles. Most of the time the Earth is in an ice age, and about every 125,000 years there is a brief period of warm temperatures, called an inter-glacial period. Our current inter-glacial period has lasted a bit longer than most, indicating that the next ice age is somewhat overdue. These long-term cycles are probably related to changes in the eccentricity of the Earth’s orbit, which follows a cycle of about 100,000 years.

We also see other cycles of more closely-spaced peaks, and these are probably related to other cycles in the Earth’s orbit. There is an obliquity cycle of about 41,000 years, and a precession cycle, of about 20,000 years, and all of these cycles interfere with one another in complex ways. Here’s a tutorial from NASA that discusses the Earth’s orbital variations:
http://www-istp.gsfc.nasa.gov/stargaze/Sprecess.htm
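As a rough illustration of how such cycles “interfere,” one can superpose three cosines with the stated periods. The amplitudes below are arbitrary weights (real insolation forcing depends on latitude and season), so this is only a sketch of the idea:

```python
import math

def toy_orbital_signal(t_kyr):
    """Superpose the three Milankovitch periods with arbitrary weights."""
    ecc = math.cos(2 * math.pi * t_kyr / 100)  # eccentricity, ~100 kyr
    obl = math.cos(2 * math.pi * t_kyr / 41)   # obliquity, ~41 kyr
    pre = math.cos(2 * math.pi * t_kyr / 20)   # precession, ~20 kyr
    return ecc + 0.6 * obl + 0.4 * pre

# The combined signal drifts in and out of phase over hundreds of kyr:
for t in range(0, 500, 100):
    print(t, round(toy_orbital_signal(t), 2))
```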

Next let’s zoom in on the current inter-glacial period, as seen in Vostok and Greenland, again using ice-core data. Temperatures here are relative to the value for 1900, which is shown as zero:

Vostok Temperatures: 12,000 BC — 1900

Data source:
http://www.ncdc.noaa.gov/paleo/metadata/noaa-icecore-2475.html

Greenland Temperatures: 9,500 BC — 1900

Here we see that the Southern Hemisphere emerged from the last ice age about 1,000 years earlier than did the Northern Hemisphere. As of 1900, in comparison to the whole inter-glacial period, the temperature was 3°C below the maximum in Vostok, and 3°C below the maximum in Greenland. Thus, as of 1900, temperatures were rather cool for the period in both hemispheres, and in Greenland, temperatures were close to a minimum.

During this recent inter-glacial period, temperatures in both Vostok and Greenland have oscillated through a range of about 4°C, although the patterns of oscillation are quite different in each case. In order to see just how different the patterns are, let’s look at Greenland and Vostok together, for the period 500 BC–1900. Vostok is shown with a faint dotted line.

The patterns are very different indeed. In many cases we see an extreme high in Greenland, while at the same time Vostok is experiencing an extreme low. And in the period 1500—1900, while Greenland temperatures were relatively stable, within a range of 0.5°C, Vostok went through a radical oscillation of 3°C, from an extreme high to an extreme low. These differences between the two hemispheres might be related to the Earth’s orbit (See NASA tutorial), or they might be related to the fact that the Southern Hemisphere is dominated by oceans, while most of the land mass is in the Northern Hemisphere. Whatever the reason, the difference is striking.

There may be some value in trying to average these different records, to obtain a ’global average’, but it is important to understand that a global average is not the same as a global temperature. For example, consider temperatures 2,000 years ago. Greenland was experiencing a very warm period, 2°C above the baseline, while Vostok was experiencing a cold spell, nearly 1°C below the baseline. While the average for that year might be near the baseline, that average does not represent the real temperature in either location.
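The point is simple enough to put in two lines. Using the article’s own figures for roughly 2,000 years ago:

```python
greenland_anomaly = 2.0   # °C above the 1900 baseline (warm period)
vostok_anomaly = -1.0     # °C below the baseline (cold spell)
global_average = (greenland_anomaly + vostok_anomaly) / 2
print(global_average)  # 0.5 °C: near the baseline, yet true of neither place
```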

This distinction between a global average, and real temperatures, is very important to keep in mind. Consider for example the concern that warming might lead to melting of the tundra in the Arctic, leading to the runaway release of methane. If that happens, it must happen in the Arctic. So it is the temperature in the Arctic that is relevant, not any kind of global average. In Greenland, temperatures 2,000 years ago were a full 2°C higher than 1900 temperatures, and there was no runaway release of methane.

The fact that the global average 2,000 years ago was dragged down by Antarctic cooling is completely irrelevant to the issue of melting tundra. Temperatures in the Arctic must rise by more than 2°C above 1900 levels before tundra-melting might be a problem, and this fact is obscured when we look at the global-average-derived hockey stick put out by the IPCC:

This graph gives the impression that temperatures 2,000 years ago were relatively low, and that in 1900 temperatures were higher than that. This may have some kind of abstract meaning, but it has nothing to do with what’s been going on in the Arctic, and it is very misleading as regards the likelihood of tundra-melting, or Arctic-melting in general. The graph is a gross misrepresentation of what’s been happening in the real world. It obscures the actual temperature record in both hemispheres, by presenting an artificial average that has existed nowhere.

Let’s now look at some other records from the Northern Hemisphere, to find out how typical the Greenland record is of its hemisphere. This first record is from Spain, based on the mercury content in a peat bog, as published in Science, 1999, vol. 284, for the most recent 4,000 years. Note that this graph is backwards, with present day on the left:

This next record is from the Central Alps, based on stalagmite isotopes, as published in Earth and Planetary Science Letters, 2005, vol. 235, for the most recent 2,000 years:

And finally, let’s include our Greenland record for the most recent 4,000 years:

While the three records are clearly different, they do share certain important characteristics. In each case we see a staggered rise, followed by a staggered decline — a long-term up-and-down cycle over the period. In each case we see that during the past few thousand years, temperatures have been 3°C higher than 1900 temperatures. And in each case we see a gradual descent towards the overdue next ice age. The Antarctic, on the other hand, shares none of these characteristics.

If we want to understand warming-related issues, such as tundra-melting and glacier-melting, we must consider the two hemispheres separately. If glaciers melt, they do so either because of high northern temperatures, or high southern temperatures. Whether or not glaciers are likely to melt cannot be determined by global averages. In this article we will concern ourselves with the Northern Hemisphere.

In the Northern Hemisphere, based on the shared characteristics we have observed, temperatures would need to rise at least 3°C above 1900 levels before we would need to worry about things like the extinction of polar bears, the melting of the Greenland ice sheet, or runaway methane release. We know this because none of these things have happened in the past 4,000 years, and temperatures have been 3°C higher during that period.

However such a 3°C rise seems very unlikely to happen, given that all three of our Northern Hemisphere samples show a gradual but definite decline toward the overdue next ice age. Let’s now zoom in on the temperature record since 1900, and see what kind of rise has actually occurred. Let’s turn to Jim Hansen’s latest article, published on realclimate.org, “2009 temperatures by Jim Hansen”. The article includes the following two graphs.

Jim Hansen is of course one of the primary proponents of the CO2-dangerous-warming theory, and there is considerable reason to believe these graphs show an exaggerated picture as regards warming. Here is one article relevant to that point, and it is typical of other reports I’ve seen:
Son of Climategate! Scientist says feds manipulated data

Nonetheless, let’s accept these graphs as a valid representation of recent temperature changes, so as to be as fair as possible to the warming alarmists. We’ll be using the red line, which is from GISS, and which does not use the various extrapolations that are included in the green line. We’ll return to this topic later, but for now suffice it to say that these extrapolations make little sense from a scientific perspective.

The red line shows a temperature rise of 0.7°C from 1900 to the 1998 maximum, a leveling off beginning in 2001, and then a brief but sharp decline starting in 2005. Let’s enter that data into our charting program, using values for each 5-year period that represent the center of the oscillations for that period. Here’s what we get for 1900-2008:

Consider the downward trend at the right end of the graph. Hansen tells us this is very temporary, and that temperatures will soon start rising again. Perhaps he is right. However, as we shall see, his arguments for this prediction are seriously flawed. What we know for sure is that a downward trend has begun. How far that trend will continue is not yet known.

Next, let’s append that latest graph to the Greenland data, to get a reasonable characterization of Northern Hemisphere temperatures from 2000 BC to 2008:

This graph shows us that the temperature rise in the Northern Hemisphere from 1800 to 2005 was not at all unnatural. That rise follows precisely the long-term pattern, where such rises have been occurring approximately every 1,000 years, with no help from human-caused CO2. Based on the long-term pattern of diminishing peaks, we would expect the recent down-trend to continue, and not turn upward again as Hansen predicts. If the natural pattern continues, then the recent warming has reached its maximum, and we will soon experience about two centuries of rapid cooling, as we continue our descent to the overdue next ice age.

So everything depends on the next decade or so. If temperatures turn upwards again, then the IPCC may be right, and human-caused CO2 emissions may have taken control of climate. However, if temperatures continue downward, then climate has been following natural patterns all along in the Northern Hemisphere. In this case there has been no evidence of any noticeable influence on climate from human-caused CO2, and we are now facing an era of rapid cooling. Within two centuries we could expect temperatures in the Northern Hemisphere to be considerably lower than they were in the recent Little Ice Age.

We don’t know for sure which way temperatures will go, rapidly up or rapidly down. But I can make this statement:

As of this moment, based on the long-term temperature patterns in the Northern Hemisphere, there is no evidence that human-caused CO2 has had any effect on climate. The rise since 1800, as well as the downward dip starting in 2005, are entirely in line with the natural long-term pattern. If temperatures turn sharply upwards in the next decade or so, that will be the first-ever evidence for human-caused warming in the Northern Hemisphere.

As regards the recent downturn, here are two other records, both of which show an even more dramatic downturn than the one shown in the GISS data:

University of Alabama, Huntsville (UAH)
Dr. John Christy
UAH Monthly Means of Lower Troposphere LT5-2
2004 – 2008

Remote Sensing Systems of Santa Rosa, CA (RSS)
RSS MSU Monthly Anomaly – 70S to 82.5N (essentially Global)
2004 – 2008

Based on the data we have looked at, all from mainstream scientific sources, we are now in a position to answer our first question with a reasonable level of confidence:

Answer 1

Temperatures, at least in the Northern Hemisphere, have been continuing to follow natural, long-term patterns — despite the unusually high levels of CO2 caused by the burning of fossil fuels. There have indeed been two centuries of global warming, and that is exactly what we would expect based on the natural pattern. Temperatures now are more than 2°C cooler than they were only 2,000 years ago, which means we have not been experiencing dangerously high temperatures in the Northern Hemisphere.

The illusion of global warming arises from a failure to recognize that global averages are a very poor indicator of actual conditions in either hemisphere.

Within the next decade, or perhaps sooner, we are likely to learn which way the climate is going. If it turns again sharply upwards, as Hansen predicts, that will be counter to the long-term pattern, and evidence for human-caused warming. If it levels off, and continues downwards, that is consistent with long-term patterns, and we are likely to experience about two centuries of rapid cooling in the Northern Hemisphere, as we continue our descent toward the overdue next ice age.

Question 2

Why haven’t unusually high levels of CO2 significantly affected temperatures in the Northern Hemisphere?

One place to look for answers to this question is in the long-term patterns that we see in the temperature record of the past few thousand years, such as the peaks separated by about 1,000 years in the Greenland data, and other more closely-spaced patterns that are also visible. Some forces are causing those patterns, and whatever those forces are, they have nothing to do with human-caused CO2 emissions. Perhaps the forces have to do with cycles in solar radiation and solar magnetism, or perhaps they have something to do with cosmic radiation on a galactic scale, or something we haven’t yet identified. Until we understand what those forces are, how they interfere with one another, and how they affect climate, we can’t really build useful climate models, except on very short time scales.

We can also look for answers in the regulatory mechanisms that exist within the Earth’s own climate system. If an increment of warming happens on the surface, for example, then there is more evaporation from the oceans and more precipitation. While an increment of warming may melt glaciers, it may also cause increased snowfall in the arctic regions. Do these balance each other or not? Increased warming of the ocean’s surface may gradually heat the ocean, but the increased evaporation acts to cool the ocean. Do these balance each other?

Vegetation also acts as a regulatory system. Plants and trees gobble up CO2; that is where their substance comes from. Greater CO2 concentration leads to faster growth, taking more CO2 out of the atmosphere. Until we understand quantitatively how these various regulatory systems function and interact, we can’t even build useful models on a short time scale.

In fact a lot of research is going on, investigating both lines of inquiry. However, in the current public-opinion and media climate, any research not related to CO2 causation is dismissed as the activity of contrarians, deniers, and oil-company hacks. Just as the Bishop refused to look through Galileo’s telescope, so today we have a whole society that refuses to look at many of the climate studies that are available.

I’d like to draw attention to one example of a scientist who has been looking at one aspect of the Earth’s regulatory system. Roy Spencer has been conducting research using the satellite systems that are in place for climate studies. Here are his relevant qualifications:

http://en.wikipedia.org/wiki/Roy_Spencer_(scientist)

Roy W. Spencer is a principal research scientist for the University of Alabama in Huntsville and the U.S. Science Team Leader for the Advanced Microwave Scanning Radiometer (AMSR-E) on NASA’s Aqua satellite. He has served as senior scientist for climate studies at NASA’s Marshall Space Flight Center in Huntsville, Alabama.

He describes his research in a presentation available on YouTube:
http://www.youtube.com/watch?v=xos49g1sdzo&feature=channel

In the talk he gives a lot of details, which are quite interesting, but one does need to concentrate and listen carefully to keep up with the pace and depth of the presentation. He certainly sounds like someone who knows what he’s talking about. Permit me to summarize the main points of his research:

When greenhouse gases cause surface warming, a response occurs, a ‘feedback response’, in the form of changes in cloud and precipitation patterns. The CRU-related climate models all assume the feedback response is a positive one: any increment of greenhouse warming will be amplified by knock-on effects in the weather system. This assumption then leads to the predictions of ‘runaway global warming’.

Spencer set out to see what the feedback response actually is, by observing what happens in the cloud-precipitation system when surface warming is occurring. What he found, by targeting satellite sensors appropriately, is that the feedback response is negative rather than positive. In particular, he found that the formation of storm-related cirrus clouds is inhibited when surface temperatures are high. Cirrus clouds themselves exert a powerful greenhouse effect, and this reduction in cirrus cloud formation compensates for the increase in the CO2 greenhouse effect.

This is the kind of research we need to look at if we want to build useful climate models. Certainly Spencer’s results need to be confirmed by other researchers before we accept them as fact, but to simply dismiss his work out of hand is very bad for the progress of climate science. Consider what the popular website SourceWatch says about Spencer.

We don’t find there any reference to rebuttals of his research, but we are told that Spencer writes columns for a free-market website funded by Exxon. They also mention that he spoke at a conference organized by the Heartland Institute, which promotes lots of reactionary, free-market principles. They are trying to discredit Spencer’s work on irrelevant grounds: the classic ad hominem argument. Sort of like, “If he beats his wife, his science must be faulty”.

And it’s true about ‘beating his wife’ — Spencer does seem to have a pro-industry philosophy that shows little concern for sustainability. That might even be part of his motivation for undertaking his recent research, hoping to give ammunition to pro-industry lobbyists. But that doesn’t prove his research is flawed or that his conclusions are invalid. His work should be challenged scientifically, by carrying out independent studies of the feedback process. If the challenges are restricted to irrelevant attacks, that becomes almost an admission that his results, which are threatening to the climate establishment, cannot be refuted. He does not hide his data, or his code, or his sentiments. The same cannot be said for the warming-alarmist camp.

Question 3

What are we to make of Jim Hansen’s prediction that rapid warming will soon resume?

Once again, I refer you to Dr. Hansen’s recent article, 2009 temperatures by Jim Hansen.

Jim begins with the following paragraph:

The past year, 2009, tied as the second warmest year in the 130 years of global instrumental temperature records, in the surface temperature analysis of the NASA Goddard Institute for Space Studies (GISS). The Southern Hemisphere set a record as the warmest year for that half of the world. Global mean temperature, as shown in Figure 1a, was 0.57°C (1.0°F) warmer than climatology (the 1951-1980 base period). Southern Hemisphere mean temperature, as shown in Figure 1b, was 0.49°C (0.88°F) warmer than in the period of climatology.

The Southern Hemisphere may be experiencing warming, but it has 2°C to go before that might become a problem there, and it has nothing to do with the Northern Hemisphere, where temperatures have been declining recently, not setting records for warming. This mathematical abstraction, the global average, is characteristic of nowhere. It creates the illusion of a warming crisis, when in fact no evidence for such a crisis exists. In the context of IPCC warnings about glaciers melting, runaway warming, etc., this global-average argument serves as deceptive and effective propaganda, but not as science.

Jim continues with this paragraph, emphasis added:

The global record warm year, in the period of near-global instrumental measurements (since the late 1800s), was 2005. Sometimes it is asserted that 1998 was the warmest year. The origin of this confusion is discussed below. There is a high degree of interannual (year-to-year) and decadal variability in both global and hemispheric temperatures. Underlying this variability, however, is a long-term warming trend that has become strong and persistent over the past three decades. The long-term trends are more apparent when temperature is averaged over several years. The 60-month (5-year) and 132-month (11-year) running mean temperatures are shown in Figure 2 for the globe and the hemispheres. The 5-year mean is sufficient to reduce the effect of the El Niño – La Niña cycles of tropical climate. The 11-year mean minimizes the effect of solar variability – the brightness of the sun varies by a measurable amount over the sunspot cycle, which is typically of 10-12 year duration.

As I’ve emphasized in bold, Jim is assuming that there is a strong and persistent warming trend, which he of course attributes to human-caused CO2 emissions. And then that assumption becomes the justification for the 5- and 11-year running averages. Those running averages then give us phantom ‘temperatures’ that don’t match actual observations. In particular, if a downward decline is beginning, the running averages will tend to ‘hide the decline’.
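The smoothing effect described here is easy to demonstrate with a toy calculation. This is a minimal sketch using a synthetic anomaly series invented for illustration, not actual GISS data:

```python
# Toy illustration only: a synthetic "anomaly" series, not real GISS data.
def running_mean(values, window):
    """Centered running mean; None where the window doesn't fit."""
    half = window // 2
    out = []
    for i in range(len(values)):
        if i < half or i + half >= len(values):
            out.append(None)
        else:
            out.append(sum(values[i - half:i + half + 1]) / window)
    return out

# Twenty years of steady warming followed by a five-year downturn.
series = [0.1 * y for y in range(20)] + [1.9 - 0.2 * y for y in range(5)]
smooth = running_mean(series, 5)

# The raw series ends at 1.1 and falling, but the smoothed series has no
# value at all for the final two years, and its last value (about 1.5)
# still reflects the earlier warm years.
print(series[-1], smooth[-1], smooth[22])
```

A running mean is a fair summary of a steady trend, but at a turning point it necessarily lags the raw data, which is the complaint being made here.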

It seems we are looking at a classic case of over-attachment to a model. What began as a theory has now become an assumption, and actual observations are being dismissed as “confusion” because they don’t agree with the model. The climate models have definitely strayed into the land of imaginary epicycles. The assumption of CO2 causation, plus the preoccupation with an abstract global average, creates a warming illusion that has no connection with reality in either hemisphere, as we see in these two graphs from Jim’s article:

As with the Ptolemaic model, there is a much simpler explanation for our recent era of warming, at least in the Northern Hemisphere: long-term patterns are continuing, for whatever reasons, and human-caused CO2 has so far had no noticeable effect. This simpler explanation is based on actual observations, and requires no abstract mathematical epicycles or averages, but it removes CO2 from the center of the climate debate. And just as powerful forces in Galileo’s day wanted the Earth to remain the center of the universe, powerful forces today want CO2 to remain at the center of the climate debate, and global warming to be seen as a threat.

Question 4

What is the real agenda of the politically powerful factions who are promoting global-warming alarmism?

One thing we always need to keep in mind is that the people at the top of the power pyramid in our society have access to the very best scientific information. They control dozens, probably hundreds, of high-level think tanks, able to hire the best minds, and carrying out all kinds of research we don’t hear about. They have access to all the secret military and CIA research, and a great deal of influence over what research is carried out in think tanks, the military, and in universities.

Just because they might be promoting fake science for its propaganda value, that doesn’t mean they believe it themselves. They undoubtedly know that global cooling is the real problem, and the actions they are promoting are completely in line with such an understanding.

Cap-and-trade, for example, won’t reduce carbon emissions. Rather it is a mechanism that allows emissions to continue, while pretending they are declining — by means of a phony market model. You know what a phony market model looks like. It looks like Reagan and Thatcher telling us that lower taxes will lead to higher government revenues due to increased business activity. It looks like globalization, telling us that opening up free markets will “raise all boats” and make us all prosperous. It looks like Wall Street, telling us that mortgage derivatives are a good deal, and we should buy them. And it looks like Wall Street telling us the bailouts will restore the economy, and that the recession is over. In short, it’s a con. It’s a fake theory about what the consequences of a policy will be, when the real consequences are known from the beginning.

Cap-and-trade has nothing to do with climate. It is part of a scheme to micromanage the allocation of global resources, and to maximize profits from the use of those resources. Think about it. Our ‘powerful factions’ decide who gets the initial free cap-and-trade credits. They run the exchange market itself, and can manipulate the market, create derivative products, sell futures, etc. They can cause deflation or inflation of carbon credits, just as they can cause deflation or inflation of currencies. They decide which corporations get advance insider tips, so they can maximize their emissions while minimizing their offset costs. They decide who gets loans to buy offsets, and at what interest rate. They decide what fraction of petroleum will go to the global North and the global South. They have ‘their man’ in the regulation agencies that certify the validity of offset projects. And they make money every which way as they carry out this micromanagement.

In the face of global cooling, this profiteering and micromanagement of energy resources becomes particularly significant. Just when more energy is needed to heat our homes, we’ll find that the price has gone way up. Oil companies are actually strong supporters of the global-warming bandwagon, which is very ironic, given that they are funding some of the useful contrary research that is going on. Perhaps the oil barons are counting on the fact that we are suspicious of them, and assume we will discount the research they are funding, as most people are in fact doing. And the recent onset of global cooling explains all the urgency to implement the carbon-management regime: they need to get it in place before everyone realizes that warming alarmism is a scam.

And then there are the carbon taxes. Just as with income taxes, you and I will pay our full share for our daily commute and for heating our homes, while the big corporate CO2 emitters will have all kinds of loopholes, and offshore havens, set up for them. Just as Federal Reserve theory hasn’t left us with a prosperous Main Street, despite its promises, so theories of carbon trading and taxation won’t give us a happy transition to a sustainable world.

Instead of building the energy-efficient transport systems we need, for example, they’ll sell us biofuels and electric cars, while most of society’s overall energy will continue to come from fossil fuels, and the economy continues to deteriorate. The North will continue to operate unsustainably, and the South will pay the price in the form of mass die-offs, which are already ticking along at the rate of six million children a year from malnutrition and disease.

While collapse, suffering, and die-offs of ‘marginal’ populations will be unpleasant for us, it will give our ‘powerful factions’ a blank canvas on which to construct their new world order, whatever that might be. And we’ll be desperate to go along with any scheme that looks like it might put food back on our tables and warm up our houses.

Author contact – rkm@quaylargo.com

January 16, 2010 Posted by | Aletho News, Deception, Environmentalism, Full Spectrum Dominance, Malthusian Ideology, Phony Scarcity, Science and Pseudo-Science | Leave a comment

Up in Smoke

Why Biomass Wood Energy is Not the Answer

By George Wuerthner | January 12, 2010

After the Smurfit-Stone Container Corp.’s linerboard plant in Missoula, Montana, announced that it was closing permanently, many people, including Montana Governor Brian Schweitzer, the mayor of Missoula, and Senator Jon Tester, have advocated turning the mill into a biomass energy plant. NorthWestern Energy, a company that has expressed interest in using the plant for energy production, has already indicated that it would expect more wood from national forests to make the plant economically viable.

The Smurfit-Stone conversion to biomass is not an isolated case. There has been a spate of proposals for new wood-burning biomass energy plants, sprouting across the country like mushrooms after a rain. Currently there are plans and/or proposals for new biomass power plants in Maine, Vermont, Pennsylvania, Florida, California, Idaho, Oregon and elsewhere. In every instance, these plants are being promoted as “green” technology.

Part of the reason for this “boom” is that taxpayers are providing substantial financial incentives, including tax breaks, government grants, and loan guarantees. The rationale for these taxpayer subsidies is the presumption that biomass is “green” energy. But like other “quick fixes,” there has been very little serious scrutiny of the real costs and environmental impacts of biomass. Whether commercial biomass is a viable alternative to traditional fossil fuels is questionable.

Before I get into this discussion, I want to state right up front that coal and other fossil fuels, which now provide much of our electrical energy, need to be reduced and effectively replaced. But biomass energy is not the way to accomplish this goal.

BIOMASS BURNING IS POLLUTION

First and foremost, biomass burning isn’t green. Burning wood produces huge amounts of pollution. Especially in valleys like Missoula where temperature inversions are common, pollution from a biomass burner will be the source of numerous health ailments. Because of the air pollution and human health concerns, the Oregon Chapter of the American Lung Association, the Massachusetts Medical Society and the Florida Medical Association, have all established policies opposing large-scale biomass plants.

The reason for this medical concern is that even with the best pollution control devices, biomass energy is extremely dirty. For instance, one of the biggest biomass burners now in operation, the McNeil biomass plant in Burlington, Vermont, is the number one pollution source in the state, emitting 79 classified pollutants. Biomass releases dioxins and as much particulate matter as coal burning, plus carbon monoxide, nitrogen oxide and sulfur dioxide, and it contributes to ozone formation. […]

BIOMASS ENERGY IS INEFFICIENT

Wood is not nearly as concentrated a heat source as coal, gas, oil, or any other fossil fuel. Most biomass energy operations are only able to capture 20-25% of the wood’s energy content. That means one needs to gather and burn more wood to get the same energy value as a more concentrated fuel like coal. That is not to suggest that coal is a good alternative; rather, wood is a worse one, especially when you consider the energy used to gather such a dispersed fuel source and the energy costs of trucking it to a central energy plant. If the entire carbon footprint of wood is considered, biomass creates far more CO2 with far less energy output than other energy sources.
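The gap can be made concrete with a back-of-envelope sketch. The energy densities and plant efficiencies below are rough assumed round numbers (wood at roughly 16 MJ/kg air-dry burned at 22% efficiency, thermal coal at roughly 24 MJ/kg burned at 35%), not measured values:

```python
# Back-of-envelope comparison of fuel mass needed per delivered kWh.
# Energy densities and efficiencies are assumed figures for illustration.
MJ_PER_KWH = 3.6

fuels = {
    # name: (assumed energy density in MJ/kg, assumed plant efficiency)
    "wood (air-dry)": (16.0, 0.22),   # efficiency in the article's 20-25% range
    "coal (thermal)": (24.0, 0.35),
}

kg_per_kwh = {
    name: MJ_PER_KWH / (eff * density)
    for name, (density, eff) in fuels.items()
}

for name, kg in kg_per_kwh.items():
    print(f"{name}: {kg:.2f} kg of fuel per kWh delivered")
# Wood comes out around 1.0 kg/kWh versus roughly 0.4 kg/kWh for coal,
# i.e. more than twice the mass to gather, truck, and burn.
```

Under these assumptions a wood plant must handle well over twice the fuel mass per unit of electricity, before counting the energy spent collecting and trucking it.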

The McNeil biomass plant in Burlington, Vermont, seldom runs full time because, even with all the subsidies (and Vermonters have made huge and repeated subsidies to the plant, not counting “hidden subsidies” like air pollution), wood energy can’t compete with other energy sources, even in the Northeast, where energy costs are among the highest in the nation. Even though the plant was retrofitted so it could burn natural gas to increase its competitiveness, it still does not operate competitively. It is generally used only to offset peak energy loads.

One could argue, of course, that other energy sources like coal are greatly subsidized as well, especially if all environmental costs were considered. But at the very least, all energy sources must be “standardized” so that consumers can make informed decisions about energy—and biomass energy appears to be no more green than other energy sources.

BIOMASS SANITIZES AND MINES OUR FORESTS

The dispersed nature of wood as a fuel source combined with its low energy value means any sizable energy plant must burn a lot of wood. For instance, the McNeil 50 megawatt biomass plant in Burlington, Vermont would require roughly 32,500 acres of forest each year if running at near full capacity and entirely on wood. Wood for the McNeil Plant is trucked and even shipped on trains from as far away as Massachusetts, New Hampshire, Quebec and Maine.
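The acreage figure can be roughly reconstructed from first principles. Every input below is an assumption chosen for illustration (the capacity factor, wood energy density, and per-acre yield are not from the article; only the 20-25% efficiency range is):

```python
# Rough reconstruction of the ~32,500 acres/year figure. All inputs are
# assumptions for illustration; only the 20-25% efficiency range comes
# from the article.
plant_mw = 50
capacity_factor = 0.8          # assumed near-full-time operation
efficiency = 0.25              # upper end of the article's range
wood_mj_per_kg = 16.0          # assumed air-dry wood energy density
yield_kg_per_acre = 10_000     # assumed ~10 metric tons of wood per acre

elec_mj_per_year = plant_mw * capacity_factor * 8760 * 3600  # MWh -> MJ
wood_kg_per_year = elec_mj_per_year / efficiency / wood_mj_per_kg
acres_per_year = wood_kg_per_year / yield_kg_per_acre
print(f"{acres_per_year:,.0f} acres per year")  # -> 31,536, near the cited ~32,500
```

That the sketch lands near the cited figure suggests the article's number rests on inputs of roughly this magnitude; different yield or efficiency assumptions shift it proportionally.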

Biomass proponents often suggest that wood [gathered] as a consequence of forest thinning to improve “forest health” (logging a forest to improve health of a forest ecosystem is an oxymoron) will provide the fuel for plant operations. For instance, one of the assumptions of Senator Tester’s Montana Forest Jobs bill is that thinned forests will provide a ready source of biomass for energy production. But in many cases, there are limits on the economic viability of trucking wood any distance to a central energy plant. Again without huge subsidies, this simply does not make economic sense. Biomass forest harvesting is even worse for forest ecosystems than clear-cutting. Biomass energy tends to utilize the entire tree, including the bole, crown, and branches. This robs a forest of nutrients, and disrupts energy cycles.

Worse yet, such biomass removal ignores the important role of dead trees in sustaining forest ecosystems. Dead trees are not a “wasted” resource. They provide homes and food for thousands of species, including 45% of all bird species in the nation. Dead trees that fall to the ground are used by insects, small mammals, amphibians and reptiles for shelter and even food. Dead trees that fall into streams are important physical components of aquatic ecosystems and provide critical habitat for many fish and other aquatic species. Removal of dead wood is mining the forest.

Keep in mind that logging activities are not benign. Logging typically requires some kind of access, often roads, which are a major source of sedimentation in streams and disrupt natural subsurface water flow. Logging can disturb sensitive wildlife like grizzly bears, and even elk are known to abandon locations with active logging. Logging can spread weeds. And finally, since large amounts of forest carbon are actually tied up in the soils, soil disturbance from logging is especially damaging, often releasing substantial amounts of carbon over and above what goes up a smokestack.

BIOMASS ENERGY USES LARGE AMOUNTS OF WATER

A large-scale biomass plant (50 MW) uses close to a million gallons of water a day for cooling. Most of that water is lost from the watershed since approximately 85% is lost as steam. Water channeled back into a river or stream typically has a pollution cost as well, including higher water temperatures that negatively impact fisheries, especially trout. Since cooling need is greatest in warm weather, removal of water from rivers occurs just when flows are lowest, and fish are most susceptible to temperature stress.
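Annualizing the water figures above is straight arithmetic on the article's own numbers (the plant size is the same assumed 50 MW case):

```python
# Annualizing the article's water figures (straight arithmetic).
gallons_per_day = 1_000_000    # cooling water for an assumed 50 MW plant
evaporated_fraction = 0.85     # share lost from the watershed as steam

lost_per_day = gallons_per_day * evaporated_fraction
lost_per_year = lost_per_day * 365
print(f"{lost_per_day:,.0f} gal/day lost as steam, "
      f"about {lost_per_year / 1e6:,.0f} million gal/year")
```

That is on the order of 310 million gallons a year leaving the watershed as steam, concentrated in the warm months when rivers are lowest.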

BIOMASS ENERGY SAPS FUNDS FROM OTHER TRULY GREEN ENERGY SOURCES LIKE SOLAR

Since biomass energy is eligible for state renewable portfolio standards (RPS), it has captured the bulk of funding intended to move the country away from fossil fuels. For example, in Vermont, 90% of the RPS is from “smokestack” sources—mostly biomass incineration. This pattern holds throughout many other parts of the country. Biomass energy is thus burning up funds that could and should be going into other energy programs like energy conservation, solar and insulation of buildings.

PUBLIC FORESTS WILL BE LOGGED FOR BIOMASS ENERGY

Many of the climate bills now circulating in Congress, as well as Montana Senator Jon Tester’s Montana Jobs and Wilderness bill, target public forests. Some of these proposals even include roadless lands and proposed wilderness as a source of wood biomass. One federal study suggests that 368 million tons of wood could be removed from our national forests every year; of course, this study did not consider the ecological costs that removing this much wood would have on forest ecosystems.

The Biomass Crop Assistance Program, or BCAP, which was quietly put into the 2008 farm bill has so far given away more than a half billion dollars in a matching payment program for businesses that cut and collect biomass from national forests and Bureau of Land Management lands. And according to a recent Washington Post story, the Obama administration has already sent $23 million to biomass energy companies, and is poised to send another half billion.

And it is not only federal forests that are in jeopardy. Many states are eyeing their own state forests for biomass energy. For instance, Maine recently unveiled a new plan known as the Great Maine Forest Initiative, which will pay timber companies to grow trees for biomass energy.

JOB LOSSES

Ironically, one of the main justifications for biomass energy is the creation of jobs, yet the wood biomass rush is having unintended consequences for other forest products industries. Companies that rely upon surplus wood chips to produce fiberboard, cabinetry, and furniture are scrambling to find wood fiber for their products. Since these industries are secondary producers, the biomass rush could threaten more jobs than it creates.

BOTTOM LINE

Large scale wood biomass energy is neither green, nor truly economical. It is also not ecologically sustainable and jeopardizes our forest ecosystems. It is a distraction that funnels funds and attention away from other more truly worthwhile energy options, in particular, the need for a massive energy conservation program, and changes in our lifestyles that will in the end provide truly green alternatives to coal and other fossil fuels.

George Wuerthner is a wildlife biologist and a former Montana hunting guide. His latest book is Plundering Appalachia.

Source

January 12, 2010 Posted by | Environmentalism, Malthusian Ideology, Phony Scarcity | , , , , , | Leave a comment

Copenhagen climate summit: confusion as ‘historic deal’ descends into chaos

The “historic” climate change deal at the Copenhagen climate summit has descended into chaos after some developing nations rejected the plan for fighting global warming championed by US President Barack Obama.

By David Barrett and Louise Gray, in Copenhagen
The Telegraph | December 19, 2009

(From Left) European Commission President Barroso, German Chancellor Angela Merkel, Swedish Prime Minister Fredrik Reinfeldt, French President Nicolas Sarkozy, US President Barack Obama and British PM Gordon Brown Photo: STEFFEN KUGLER/AFP/Getty Images

An agreement to limit global warming to a 3.6F (2C) temperature rise, alongside $100 billion (£62bn) a year in aid from 2020, was condemned as inadequate by some delegates and appeared to be in danger of unravelling.

Developing nations, including Venezuela, said they could not accept a text originally agreed by the United States, China, India, Brazil and South Africa as the blueprint of a wider United Nations plan to fight climate change.

Tempers flared during an all-night plenary session, held after most of 120 visiting world leaders had left.

Lumumba Stanislaus Di-Aping, the Sudanese negotiator, said the draft text asked “Africa to sign a suicide pact”.

One Saudi delegate said it was without doubt “the worst plenary I have ever attended.”

Ed Miliband, the Environment Secretary, warned delegates that the plan would have to be endorsed to unlock funds outlined in the deal, including $30 billion in “quick-start” aid from 2010-12, rising to $100 billion a year from 2020.

Apart from the original five nations supporting the scheme, European Union states, Japan and groups representing small island states, least developed nations and African countries spoke in favour of the plan during the overnight session.

The two-week summit ended late on Friday night after a row between the US and China overshadowed negotiations, yet its conclusions were initially hailed as a significant deal.

[…]

The accord declared that “deep cuts in emissions are required”. But instead of a detailed pledge to halve carbon emissions by 2050, leaders agreed only to the vague promise to limit the rise in global temperatures to 2C, with no specifics on how to achieve that.

The leaders also put off setting emissions targets for 2020, saying they would attempt to agree them by February… Full article

December 19, 2009 Posted by | Malthusian Ideology, Phony Scarcity, Science and Pseudo-Science | Leave a comment

British Columbia: New terminal for LNG exports to China

WSJ: Apache To Provide Natural Gas To Proposed Kitimat LNG Terminal For Export To Asia – Update

December 18, 2009

(RTTNews) – On Sunday, The Wall Street Journal reported that oil and gas company Apache Corp. (APA) has agreed to provide natural gas to Canadian firm Kitimat LNG Inc. for export to Asia through Kitimat’s proposed liquefied natural gas (LNG) export terminal in Kitimat, British Columbia. Construction of the $3-billion LNG export facility is set to begin in late 2009 or early 2010, with the facility coming into operation 36 to 40 months later, by 2013 or 2014. The companies are expected to announce an agreement on Monday.

Privately owned, Calgary-based Kitimat LNG is committed to building a state-of-the-art LNG terminal in Kitimat that would transport natural gas via a pipeline from the Western Canadian Sedimentary Basin to the Kitimat LNG Terminal, where the natural gas will be cooled to -160 degrees centigrade, condensed and liquefied in preparation for export via ship to Asian markets. In Asia, the LNG will undergo a regasification process and be transported through pipelines to its final destination.

Pursuant to an agreement, Kitimat LNG and Houston, Texas-based Apache would negotiate a definitive agreement under which Apache would supply specific quantities of the LNG facility’s 700 million cubic feet per day of natural gas feedstock. In mid-July, EOG Resources, Inc. (EOG) also signed a memorandum of understanding (MOU) to supply natural gas to Kitimat LNG’s proposed export terminal.

In a statement while signing the EOG agreement, President of Kitimat LNG Rosemary Boulton said, “Kitimat LNG presents a compelling opportunity for producers to leverage growing natural gas reserves in Western Canada and sell into significant new international markets such as Asia.”

After EOG, Apache is the second major North American gas producer to have reportedly agreed to supply natural gas to Kitimat LNG. Kitimat LNG has also signed MOUs with leading LNG companies such as Korea Gas Corporation (KOGAS) and Gas Natural for the purchase of LNG produced at the terminal. However, there are other companies active in British Columbia, where the proposed project is situated, including EnCana Corp. (ECA, ECA.TO), Devon Energy Corp. (DVN) and industry giant Exxon Mobil Corp. (XOM).

Kitimat LNG’s export terminal proposal is supported by natural gas market fundamentals that show growth in the supply of natural gas in Western Canada and strong, growing demand for natural gas in Asia. As a politically and economically stable country that is close to Asian markets, Canada offers a reliable, plentiful natural gas supply to customers in the Pacific Rim.

The project is expected to take advantage of rising natural gas demand and higher LNG prices in Asia, with Asian prices expected to continue to climb. U.S. natural gas prices have been stuck between US$7 and $9 per million British Thermal Units (BTU) for most of the year, while in Asia, LNG has been traded with increasing frequency at record spot prices of US$20 per million BTU.

The Kitimat project comprises a 40-hectare LNG export terminal site with two storage tanks, a marine jetty and a berthing facility. It would have an annual LNG capacity of three to five million tons and would take about 36 to 40 months to complete. It would handle three to five shipments monthly and would target key potential markets like Japan, South Korea, China, and Taiwan. – source
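As a sanity check, the stated feedstock rate and the stated annual capacity are roughly consistent, using an approximate rule-of-thumb conversion between gas volume and LNG tonnage (the conversion factor below is an assumption, not from the article):

```python
# Sanity check: is 700 MMcf/day of feedstock consistent with 3-5 million
# tons/year of LNG? The gas-to-LNG conversion factor is an approximate
# rule of thumb, assumed here for illustration.
feed_cf_per_day = 700e6
cf_gas_per_tonne_lng = 48_700   # approx. cubic feet of gas per tonne of LNG

tonnes_per_year = feed_cf_per_day * 365 / cf_gas_per_tonne_lng
print(f"about {tonnes_per_year / 1e6:.1f} million tonnes per year")
```

This lands at the top of the stated three-to-five-million-ton range; in practice some feedstock is burned to power the liquefaction process itself, which would pull actual output lower.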

December 18, 2009 Posted by | Economics, Malthusian Ideology, Phony Scarcity | Leave a comment

Copenhagen: Bolivia, Sudan, Venezuela and S.A. set to humiliate Obama

Update: Obama departs Copenhagen without a binding agreement

December 18, 2009 |  Highlights from Politico.com

On Friday morning, Obama warned delegates that U.S. offers of funding for poor nations would remain on the table “if and only if” developing nations, including China, agreed to international monitoring of their greenhouse gas emissions. […]

Back home, senators critical to getting a climate bill through Congress have stressed that developing nations must submit to international monitoring — particularly if they want the U.S. to pay hundreds of billions to help combat the destructive impact of climate change.

“The only way we’ll be successful in America is for countries like China and India to make an equivalent commitment,” said Sen. Lindsey Graham (R-S.C.), who is crafting a bipartisan climate bill. “We’re not going to unilaterally disarm.”

While Obama emphasized the U.S. commitment to taking action on climate change, he did not set a deadline for specific Senate action on the climate bill. […]

Overnight reports that world leaders had agreed to a tentative final climate change deal in Copenhagen were greatly exaggerated — and the outcome of the COP-15 conference was still very much up in the air when Air Force One touched down at 9:01 a.m. local time. […]

After addressing the delegates, Obama met with Chinese Premier Wen Jiabao for close to an hour to discuss emissions goals, verification mechanisms and climate financing. The lack of agreement between China and the U.S. — the world’s two largest greenhouse gas emitters — has been a major stumbling block in the talks.

A White House official described the discussion as “constructive” and said that the two leaders asked their negotiators to get together one-on-one after the meeting. […]

One key sticking point: a demand by industrialized nations that the document produced here be legally binding, the so-called “operational” agreement Secretary of State Hillary Clinton spoke about yesterday.

…  none of the several drafts circulating in Copenhagen represented even the bones of a final deal, with many key issues still in flux and time running out. Moreover, U.S. predictions that roadblocks could be thrown up by smaller countries seemed to be coming true, with last-minute objections voiced by Venezuela, Bolivia, Sudan and Saudi Arabia, according to people familiar with talks. […]

An official with a developing nation told Reuters that rich nations were offering to cut their carbon emissions by 80 percent by 2050, a proposal that had been rejected by developing nations. Developing nations have always insisted on the need for mid-term targets…

December 18, 2009 Posted by | Malthusian Ideology, Phony Scarcity, Progressive Hypocrite | Leave a comment

UK Group Proposes Using Carbon Offsets to Stop Poor From Breeding

Carbon hysteria reaches its logical conclusion

James Corbett
The Corbett Report
9 December, 2009

The Optimum Population Trust (OPT), a UK-based “think tank” and registered charity, has launched a new initiative urging wealthy members of the developed world to participate in carbon offsets that fund programs for curbing the population of developing nations. The scheme is being promoted as a more cost-effective way to reduce CO2 emissions than investing in alternative energy sources and offers a way for elitist racists to feel ethical in their quest to exterminate the third world masses.

A BBC News article on the proposal dutifully reports the OPT’s proposal and their justifications for proposing it. They note that the program is designed to fund “contraception” programs in poor nations, a term that helpfully obscures the fact that such programs—including those run by FPA, one of the agencies listed as a supporting organization of this new program—have used bribes to get poor men and women to volunteer for sterilization. The article does, however, allow space for a detractor of the proposal to point out that even if one does accept that limiting carbon emissions is necessary (which it is not), the focus on limiting emissions of people in Least Developed Countries (LDCs) is in itself nonsensical: “Carbon emissions from people in much of sub-Saharan Africa are so low that they can barely be counted.”

What this error exposes, however, is not that the OPT has set its sights on the wrong target. In fact, they are simply introducing the idea as a politically expedient precedent which will eventually be expanded to include the developed world as well. Indeed, this is merely the latest such proposal from the group, which has previously said that the world’s population must be cut by as much as half and the UK’s population reduced to as little as 17 million in order to reach “sustainable levels.” The group’s patrons include world renowned environmental campaigners, academics and media figures like Jane Goodall, James Lovelock and Sir David Attenborough.

One patron of the Optimum Population Trust who stands out is Jonathon Porritt, a well-known baronet and a green campaigner who advises the likes of Prince Charles on environmental matters. He has long argued the link between “environmental sustainability” and enforced abortions. He once claimed to be “unapologetic about asking people to connect up their own responsibility for their total environmental footprint and how they decide to procreate and how many children they think are appropriate.” He is also on the board of BBC Wildlife magazine, perhaps explaining why BBC News tends to treat every pronouncement from the OPT as if it were a major policy announcement (see this and this and this for starters).

Another prominent OPT patron is Paul Ehrlich, the Stanford biologist best known for The Population Bomb and co-author (with his wife, Anne, and Obama’s science advisor, John P. Holdren) of Ecoscience, a 1977 textbook that outlined in painstaking detail the various measures that the governments of the world could take to confront the “problem” of population, from forced abortions and one-child policies to mass sterilization of the populace through the contamination of the water supply. One representative passage reads:

“The development of a long-term sterilizing capsule that could be implanted under the skin and removed when pregnancy is desired opens additional possibilities for coercive fertility control. The capsule could be implanted at puberty and might be removable, with official permission, for a limited number of births.”

With such patrons in its ranks, it is hardly surprising that the group would endorse a plan to sterilize the poor in the name of reducing carbon emissions. Of course, the green rhetoric of “sustainability” and “carbon reduction” is only the latest garb for a very old ideology: eugenics, a 19th-century junk science which concluded that the human race consisted of genetically “superior” and “inferior” breeds. Unsurprisingly, this long-since discredited hucksterism, invented by an inbred group of British gentlemen scientists, concluded that inbred British gentlemen scientists were the master race and everyone else was expendable.

Full article

December 15, 2009 Posted by | Deception, Ethnic Cleansing, Racism, Zionism, Malthusian Ideology, Phony Scarcity, Science and Pseudo-Science, Timeless or most popular | Leave a comment

Why are the oligarchic elites trying so hard to push their climate change policies through right now?

December 9, 2009 by Notsilvia Night

Why are the political and financial elites and their obedient servants in the faith – sorry, scientific – community pushing so madly for a final decision on global Carbon Tax legislation at this very moment?

Why don’t they just wait until the “Climategate” scandal has blown over?

Because those elites know they are wrong on the issue of human caused climate change.

They know that the data doesn’t support their fear-mongering, because they themselves have fudged it to support a political agenda.

They know that their lies are being revealed to the public piece by piece, faster and faster.

Most of all, they know that the planet is at the moment once again in a cooling phase, as occurs every thirty or forty years.

Looking at the current lack of solar activity, this cooling phase might be even more severe than the one that ended 40 years ago – possibly as severe as the Maunder Minimum, a cooling phase lasting several decades in the 17th and 18th centuries.

In a couple of years their claims would no longer be tenable at all. The cooling trend would be obvious to even the most ideologically blinded environmentalist on earth.

The scheme of taxing the global population, creating new revenue streams for the world’s financial markets, establishing central control over the world economy, and preventing the rise of developing countries out of poverty would lose out.

The political leaders of all less powerful countries are at this moment being bullied into signing a treaty which gives away their countries’ national sovereignty to the leadership of the powerful ones, namely Britain and the United States – and, more to the point, to the shadow leadership behind them, the world’s financial elites of Goldman Sachs and Co.

So why are so many decent people on the left fighting tooth and nail for the carbon-trading profits of Goldman Sachs and Al Gore’s Generation Investment Management (GIM)?

It’s a psychological problem; most people, especially on the left, want to be on the side of the good and caring people.

For over 40 years now we have been told that being environmentally minded means being a good person. It means we care about nature, wild animal life, about future generations of human beings.

Being environmentally minded means we are opposed to polluting the air and the water;
we are opposed to deforestation (especially in the rain-forest regions);
we are opposed to dumping our own poisonous waste onto the developing world;
we are opposed to rampant consumerism, in which, driven by the advertising industry, we keep buying things we don’t actually need – things that make us neither happier nor more comfortable, just more indebted.

All those nice middle-class people who want to feel good about themselves support these ideas as part of the program of the left. And yes, there are plenty of real environmental issues we should be concerned about. But while marginalizing those real issues, the financial and right-wing ideological elites have – with the help of the media they control – succeeded in infiltrating the “green” movement with their own agenda, the bogus Anthropogenic Global Warming ideology.

The propaganda has been very successful indeed. People who want with all their heart to be “good” and decent are now supporting the agenda of the most selfish and anti-humanist forces on the planet.
The propaganda has created a belief-system which is hard to break. In Europe this belief-system is even more entrenched, since it has had several more years to develop, and hence may be harder to break among Europeans than among Americans.

But after the “Climategate” revelations, chances aren’t so bad any more. A global storm is brewing against the liars (who include most of the mainstream media) and their masters. No matter how bad it looks when we listen to the sound-bites of top-level political hacks, down at the grassroots, in the population, minds are changing en masse.

In just a little while, those who honestly strive to be the “good” guys (and girls) will realize that being good and caring about future generations means not caring for the Goldman Sachs carbon credits scheme.

The truth will indeed set us free from global tyranny.

Watch also:

Lord Monckton on Climategate at the 2nd International Climate Conference (on Vimeo).

December 9, 2009 Posted by | Deception, Environmentalism, Malthusian Ideology, Phony Scarcity, Science and Pseudo-Science | Leave a comment

Copenhagen climate summit in disarray after ‘Danish text’ leak

COP15: A Haitian delegation during a second-day session at the Bella Center in Copenhagen. Photograph: Attila Kisbenedek/AFP/Getty Images

The UN Copenhagen climate talks are in disarray today after developing countries reacted furiously to leaked documents that show world leaders will next week be asked to sign an agreement that hands more power to rich countries and sidelines the UN’s role in all future climate change negotiations.

The document is also being interpreted by developing countries as setting unequal limits on per capita carbon emissions for developed and developing countries in 2050, meaning that people in rich countries would be permitted to emit nearly twice as much under the proposals.

The so-called Danish text, a secret draft agreement worked on by a group of individuals known as “the circle of commitment” – but understood to include the UK, US and Denmark – has only been shown to a handful of countries since it was finalised this week.

The agreement, leaked to the Guardian, is a departure from the Kyoto protocol‘s principle that rich nations, which have emitted the bulk of the CO2, should take on firm and binding commitments to reduce greenhouse gases, while poorer nations were not compelled to act. The draft hands effective control of climate change finance to the World Bank; would abandon the Kyoto protocol – the only legally binding treaty that the world has on emissions reductions; and would make any money to help poor countries adapt to climate change dependent on them taking a range of actions.

The document was described last night by one senior diplomat as “a very dangerous document for developing countries. It is a fundamental reworking of the UN balance of obligations. It is to be superimposed without discussion on the talks”.

A confidential analysis of the text by developing countries also seen by the Guardian shows deep unease over details of the text. In particular, it is understood to:

• Force developing countries to agree to specific emission cuts and measures that were not part of the original UN agreement;

• Divide poor countries further by creating a new category of developing countries called “the most vulnerable”;

• Weaken the UN’s role in handling climate finance;

• Not allow poor countries to emit more than 1.44 tonnes of carbon per person by 2050, while allowing rich countries to emit 2.67 tonnes.
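The “nearly twice as much” characterization earlier in the article follows directly from these two per-capita caps; a quick check of the arithmetic (using only the figures given above):

```python
# The ratio behind the "nearly twice as much" per-capita claim,
# using the two caps listed in the leaked text.
rich_cap = 2.67   # tonnes of carbon per person, rich countries, by 2050
poor_cap = 1.44   # tonnes of carbon per person, poor countries, by 2050
print(round(rich_cap / poor_cap, 2))  # 1.85 -- i.e. nearly twice
```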

Developing countries that have seen the text are understood to be furious that it is being promoted by rich countries without their knowledge and without discussion in the negotiations.

“It is being done in secret. Clearly the intention is to get [Barack] Obama and the leaders of other rich countries to muscle it through when they arrive next week. It effectively is the end of the UN process,” said one diplomat, who asked to remain nameless.

Antonio Hill, climate policy adviser for Oxfam International, said: “This is only a draft but it highlights the risk that when the big countries come together, the small ones get hurting. On every count the emission cuts need to be scaled up. It allows too many loopholes and does not suggest anything like the 40% cuts that science is saying is needed.”

Hill continued: “It proposes a green fund to be run by a board but the big risk is that it will be run by the World Bank and the Global Environment Facility [a partnership of 10 agencies including the World Bank and the UN Environment Programme] and not the UN. That would be a step backwards, and it tries to put constraints on developing countries when none were negotiated in earlier UN climate talks.”

The text was intended by Denmark and rich countries to be a working framework, which would be adapted by countries over the next week. It is particularly inflammatory because it sidelines the UN negotiating process and suggests that rich countries are desperate for world leaders to have a text to work from when they arrive next week.

Few numbers or figures are included in the text because these would be filled in later by world leaders. However, it seeks to hold temperature rises to 2C and mentions the sum of $10bn a year to help poor countries adapt to climate change from 2012-15.

December 8, 2009 Posted by | Deception, Economics, Ethnic Cleansing, Racism, Zionism, Full Spectrum Dominance, Malthusian Ideology, Phony Scarcity, Progressive Hypocrite | Leave a comment

The Recurring Myth of Peak Oil

By ISMAEL HOSSEIN-ZADEH | October 1, 2008

The Peak Oil theory maintains that world production of conventional oil will soon reach a maximum, or peak, and decline thereafter, with grave socio-economic consequences. Some proponents of the theory argue that world oil production has already peaked, and is now in a terminal decline [1]. Although, on the face of it, this sounds like a fairly reasonable proposition, it has been challenged on both theoretical and empirical grounds. While some critics have called it a myth, others have branded it as a money-making scam promoted by the business interests that are vested in the fossil fuel industry, in the business of war and militarism, and in the Wall Street financial giants that are engaged in manipulative oil speculation.

Regardless of its validity (or lack thereof), the fact is that Peak Oil has had significant policy and political implications. It has also generated considerable reactions among various interest groups and political activists.

While environmental and similar activists have used Peak Oil to promote more vigorous conservation and more energetic pursuit of alternative fuels, the oil industry and its representatives in and out of government have taken advantage of Peak Oil to argue in support of unrestrained extraction of oil and expanded drilling in offshore and wildlife regions.

Because of its simple logic and facile appeal, Peak Oil has also led many ordinary citizens, burdened by high fuel bills during periods of energy crisis, to support unrestrained or expanded drilling. According to a recent Rasmussen poll, 57 percent of Americans favor more offshore drilling. Misplaced popular perceptions, in turn, play into the hands of the oil industry and its representatives as they lobby for the lifting of the federal ban on oil production in hitherto restricted regions.

Citing voter anger over soaring energy prices, Senator John McCain of Arizona, the Republican presidential nominee, recently argued that opening vast stretches of the country’s coastline to oil exploration would help America eliminate the dependence on foreign oil. “We have untapped oil reserves of at least 21 billion barrels in the United States. But a broad federal moratorium stands in the way of energy exploration and production,” he said. “It is time for the federal government to lift these restrictions” [2].

Perhaps the financial giants of New York and London have benefited the most from the misleading implications of Peak Oil: “As much as 60% of today’s crude oil price is pure speculation driven by large trader banks and hedge funds. It has nothing to do with the convenient myths of Peak Oil. It has to do with control of oil and its price. . . . Since the advent of oil futures trading and the two major London and New York oil futures contracts, control of oil prices has left OPEC and gone to Wall Street. It is a classic case of the tail that wags the dog,” points out William Engdahl, a top expert on energy and financial markets [3].

Just as Peak Oil plays into the hands of manipulative speculators and beneficiaries of fossil fuel, so too can it be used by the champions of unilateral wars and military adventures, as it implies that war power and military strength are key to access or control of the “shrinking” or “soon-to-be-shrinking” oil. It thus provides fodder for the cannons of war-profiteering militarists who are constantly on the lookout to invent new enemies and find new pretexts for continued war and escalation of military spending—that is, for the looting of the national treasury, or public money.

By the same token that Peak Oil can serve as a pretext for war and military adventures, it can also serve as a disarming or pacifying factor for many citizens who accept the Peak Oil thesis and, therefore, internalize responsibility for U.S. foreign policy every time they fill their gas tank. In a vicarious way, they may feel that they own the war!

Thus, Peak Oil serves as a powerful trap and a clever manipulation that lets the real forces of war and militarism (the military-industrial complex and the pro-Israel lobby), and the main culprits behind the soaring energy prices (the Wall Street financial giants engaged in manipulative commodity speculation) off the hook; it is a fabulous distraction. All evils are blamed on a commodity upon which we are all utterly dependent.

Not only millions of lay citizens, but also many scholars and academics have taken the bait and fallen right into this trap by arguing that recent U.S. wars of choice are driven primarily by oil and other “scarce” resources. More broadly, they argue that most wars of the future, like the recent and/or present ones, will be driven by conflicts over natural resources, especially energy and water—hence, for example, the title of Michael T. Klare’s popular book, Resource Wars [4].

As a number of critics have pointed out, this is reminiscent of Thomas R. Malthus’s theory of “scarcity” and “overpopulation.” Malthus (1766-1834), a self-styled British economist, argued that the woes and vagaries of capitalism such as poverty, inequality and unemployment are largely to be blamed on the poor and the unemployed, since they produce too many mouths to be fed, or too many hands to be employed.

In a similar fashion, Peak Oil implies that the current crisis in energy (and other commodities) markets is to be blamed, in part, on less-developed or relatively poorer nations such as India and China for growing “too fast” and creating “too much” demand on “scarce” resources. (Similarities between the Peak Oil theory and the Malthusian theory of scarcity are further discussed below.)

Peak Oil Thesis Is Not New: Geology vs. Geopolitics

Peak Oil theory is not altogether new. M. King Hubbert, a well-known geologist, provided a dramatic discussion of the theory in 1956. A year later, Admiral Rickover discussed the end of the fossil fuel era even more emphatically—at the time, he gave oil about fifty more years to run out. Thirty years ago, the Club of Rome predicted an end of oil long before the present day.

Indeed, there is evidence that projections of oil peaking, then declining and running out, have been floated ever since oil was discovered in the second half of the 19th century. For example, the chief geologist of Pennsylvania predicted in 1874 that we would run out of oil in four years—just using it for kerosene [5].

While the Peak Oil theory has been around for a long time, it has usually been dormant during “normal” economic times, or periods of “reasonable” oil prices, and has gained heightened currency during periods of energy crisis and high oil prices. For example, Peak Oil became quite popular during (and immediately after) all three of the recent oil crises: the early 1970s crisis, the late 1970s and early 1980s crisis, and the early 1990s crisis.

The obvious reason for the rise in the Peak Oil popularity in the context of those periods of energy crisis was the perception that oil shortage must have played a major role in the respective oil price hikes. It is not surprising, then, that as recent geopolitical convulsions in the Middle East have triggered a new round of oil price hikes, Peak Oil theory has once again become fashionable.

It turns out, however, that oil price shocks of all the previous periods of energy crisis were precipitated not by oil shortages, or any real prospects of oil “peaking and running out,” but by international political convulsions, revolutions and wars: the Arab-Israeli war of 1973, the 1979 Revolution in Iran, and the 1990-91 invasion of Kuwait by Saddam Hussein’s armed forces. Each time, as the turbulent period of war or revolutionary atmosphere ended, higher oil prices of the respective crisis situation subsided accordingly [6].

The current oil price hike too is precipitated not by an oil shortage, as popularly perceived, but by manipulative speculation in energy futures markets—which are, in turn, prompted largely by the unstable atmosphere of war and geopolitical turbulence in the Middle East.

Evidence is therefore unambiguous that, so far, almost all oil price shocks can be explained not by geology, or the so-called Peak Oil, but by geopolitics.

The Paradoxical Reasonableness of Peak Oil: Return of Thomas Malthus

Peak Oil has a prima facie reasonableness that makes it readily acceptable to most people: since oil is a finite natural resource, it is subject to depletion.

But while the rationale behind Peak Oil seems reasonable, it is also seriously flawed and misleading.

One of the major defects of Peak Oil is its facile extrapolation, or transition, from the micro to the macro level, that is, an unwarranted generalization or extension of what is true of an individual oil well or oil field to the entire world’s oil production. It is true that every operating or producing oil well or field increases in production rate until it reaches a maximum or peak flow rate, after which the rate of production enters a terminal decline. It does not follow, however, that world oil production as a whole must soon reach a maximum and begin to run out afterward—some Peak Oil champions claim that this has already taken place.

Proponents of Peak Oil are quick to point to oil wells or fields that have actually peaked and declined, such as those correctly predicted by geologist M. King Hubbert. They fail, however, to point out the ever newer discoveries of oil fields and/or other sources of energy that tend to more than offset the depleted ones.

The Peak Oil debate boils down, essentially, to natural versus social limits, or naturally-determined versus socially-determined limits. A similar debate erupted more than 200 years ago over the limits of population growth, on the one hand, and the growth of food supplies, on the other. The debate was prompted largely by a 1798 essay written by the British economist Thomas R. Malthus, titled “An Essay on the Principle of Population.”

Malthus projected an alarming specter of food shortages, hardship, and even starvation “because of faster population growth than food supply.” According to his theory, poverty and distress are unavoidable because, if unchecked, population increases at a geometrical rate (i.e. 1, 2, 4, 8, 16, etc.), whereas the means of subsistence grow at an arithmetical rate (i.e. 1, 2, 3, 4, 5, etc.), thereby leading to inevitable shortages of foodstuffs.
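Malthus’s two growth laws can be sketched numerically. This is a minimal illustration, assuming both series start at 1 unit (the starting values and the doubling ratio are illustrative choices, not Malthus’s data):

```python
# Geometric growth (population) vs. arithmetic growth (food supply),
# per Malthus's stylized argument. Starting values are assumed.

def population(periods, ratio=2):
    """Geometric growth: 1, 2, 4, 8, 16, ..."""
    return [ratio ** t for t in range(periods)]

def food_supply(periods, increment=1):
    """Arithmetic growth: 1, 2, 3, 4, 5, ..."""
    return [1 + increment * t for t in range(periods)]

print(population(6))   # [1, 2, 4, 8, 16, 32]
print(food_supply(6))  # [1, 2, 3, 4, 5, 6]
# From the third period onward the geometric series pulls away from the
# arithmetic one -- the widening gap Malthus read as inevitable shortage.
```

Whatever the starting values, any geometric series eventually overtakes any arithmetic one, which is why the argument seemed so compelling; the critique in the following paragraphs is that neither rate is actually fixed by nature.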

As Malthus thus blamed misery and poverty on the poor and the miserable (for giving birth to too many mouths to be fed), he also concluded (logically) that poverty alleviation depended on selective restriction of population growth, that is, curbing the number of the poor and working people.

As checks on population growth, Malthus accepted war, famine, and disease. He also recommended “moral restraint” (marrying late or not at all, coupled with sexual abstinence prior to, and outside of, marriage) as additional checks on the growth of population. His hostility toward the poor was expressed most vividly when he openly argued in favor of dismantling social safety-net programs, the so-called “Poor Laws”: “We cannot, in the nature of things, assist the poor, in any way, without enabling them to rear up to manhood a greater number of their children.”

By blaming social ills and economic calamities on the poor and working people, Malthus’s views tended, willy-nilly, to exonerate the underlying socio-economic structure, and to prove the inevitability of privation and misery under any social system.

What Malthus failed to see is the fact that growth rates of population and food supplies are not determined purely by nature as fixed, innate, or immutable rates. Instead, they are dynamic categories that can change drastically, depending on the level of economic development, social structure of production, and the state of technology.

Although not identical, the Peak Oil theory is similar to the Malthusian theory in that it too is based on natural, innate, or fixed and immutable limits. There are, of course, limits to everything—energy, food, water, population. But those limits are not absolute or pre-determined, as implied by the Peak Oil thesis. They are perhaps more social than natural limits.

This is why, although the Peak Oil theory is not false in saying that there are limits to oil production, it does not explain much. In a real sense, it is a truism. It explains neither the current energy crisis nor any of the past ones. Nor, therefore, can its dire predictions about future global oil production be trusted.

More Oil Found than Used Up

Peak Oil misconceptions have many times led to alarmist predictions and dire warnings of an end of global oil production before the current day. Time and again, those forecasts turned out wrong because oil reserves, including proven or cost-efficient reserves, have continued to grow, and more oil wells or fields have been brought into production than have peaked and declined. The following is a partial list, as collected by Jason Schwarz, Options Strategist for Lone Peak Asset Management, Westlake Village, CA:

1. An offshore find by Brazilian state oil company Petrobras (PBR) in partnership with BG Group (BRGYY.PK) and Repsol-YPF may be the world’s biggest discovery in 30 years, the head of the National Petroleum Agency said. A deep-water exploration area could contain as much as 33 billion barrels of oil, an amount that would nearly triple Brazil’s reserves and make the offshore bloc the world’s third-largest known oil reserve. “This would lay to rest some of the peak oil pronouncements that we were out of oil, that we weren’t going to find any more and that we have to change our way of life,” said Roger Read, an energy analyst and managing director at New York-based investment bank Natixis Bleichroeder Inc.

2. A trio of oil companies led by Chevron Corp. (CVX) has tapped a petroleum pool deep beneath the Gulf of Mexico that could boost U.S. reserves by more than 50 percent. A test well indicates it could be the biggest new domestic oil discovery since Alaska’s Prudhoe Bay a generation ago. Chevron estimated the 300-square-mile region where its test well sits could hold up to 15 billion barrels of oil and natural gas.

3. Kosmos Energy says its oil field at West Cape Three Points is the largest discovery in deep water West Africa and potentially the largest single field discovery in the region.

4. A new oil discovery has been made by Statoil (STO) in the Ragnarrock prospect near the Sleipner area in the North Sea. “It is encouraging that Statoil has made an oil discovery in a little-explored exploration model that is close to our North Sea infrastructure,” says Frode Fasteland, acting exploration manager for the North Sea.

5. Shell (RDS.A) is currently analyzing and evaluating the well data of their own find in the Gulf of Mexico to determine next steps. This find is rumored to be capable of producing 100 billion barrels. Operating in ultra-deep waters of the Gulf of Mexico, the Perdido spar will float on the surface in nearly 8,000 ft of water and is capable of producing as much as 130,000 barrels of oil equivalent per day.

6. In Iraq, excavators have struck three oil fields with reserves estimated at about 2 billion barrels, Kurdish region’s Oil Minister Ashti Horami said.

7. Iran has discovered an oil field within its southwest Jofeir oilfield that is expected to boost Jofeir’s oil output to 33,000 barrels per day. Iran’s new discovery is estimated to have reserves of 750 million barrels, according to Iran’s Oil Minister, Gholamhossein Nozari.

8. The United States holds significant oil shale resources underlying a total area of 16,000 square miles. This represents the largest known concentration of oil shale in the world and holds an estimated 1.5 trillion barrels of oil with 800 billion recoverable barrels—enough to meet U.S. demand for oil at current levels for 110 years. More than 70 percent of American oil shale is on Federal land, primarily in Colorado, Utah, and Wyoming.

9. In western North Dakota there is a formation known as the Bakken Shale. The formation extends into Montana and Canada. Geologists have estimated the area holds hundreds of billions of barrels of oil. In an interview provided by USGS, scientist Brenda Pierce put the North Dakota oil in context: “Of the current USGS estimates, this is the largest oil accumulation in the lower 48. . . . It is also the largest continuous type of oil accumulation that we have ever assessed.” The USGS study says with today’s technology, about 4 billion barrels of oil can be pumped from the Bakken formation [7].
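The “110 years” figure in item 8 can be sanity-checked with a back-of-envelope calculation, assuming U.S. oil demand of roughly 20 million barrels per day (an approximate figure for the period; the list itself does not state the demand level used):

```python
# Back-of-envelope check of item 8's "110 years" claim.
recoverable_barrels = 800e9   # recoverable oil shale, barrels (from item 8)
daily_demand = 20e6           # assumed U.S. demand, barrels per day
years = recoverable_barrels / (daily_demand * 365)
print(round(years))           # 110
```

The result matches the article’s figure, which suggests a demand assumption in that neighborhood was indeed used.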

In the face of such overwhelming evidence, which seriously undermines the Peak Oil theory, proponents of the theory argue that their thesis is based on “proven,” not all, reserves. Proven reserves are reserves that, given a certain level of technology and a certain amount of investment, are proven or estimated to be economical, or cost efficient. Let us briefly examine this “proven vs. total reserves” argument of the Peak Oil champions.

Proven Reserves Are not a Measure of Future Oil Production: Short-Term Market Imperatives vs. Long-Term Public Policy/Interests

That oil companies would want to invest only in the narrow category of proven, or cost efficient, reserves is understandable; it is a simple business principle. But to base future oil supplies on the currently proven reserves, as Peak Oil theory does, is problematic. It represents a short-term, static view of future oil supplies that implicitly ignores the critical role of new investments and technological innovations that can make profitable, or cost efficient, what is currently considered unprofitable, or cost inefficient.

M.A. Adelman points out that “in 1944 a special expert mission estimated Persian Gulf reserves at 16 billion proved and 5 billion probable. By 1975, those same fields had produced 42 billion barrels and had 74 billion remaining. In 1984, geologists estimated a five percent probability of another 199 billion barrels remaining to be added in the Gulf region. In five years those reserves had already been added” [8].

Market imperatives and short-term profitability measures thus severely limit oil reserve estimates, because they effectively exclude not only huge reserves of unconventional oil but also vast reservoirs of conventional oil that are not currently profitable. This is obviously a major flaw of the Peak Oil theory, as it judges future supplies of oil by the narrowest definition of oil production: currently proven reserves.

However, just as proven reserves determine the current level of oil production, and therefore of investment, the amount of current investment also plays a crucial role in determining the amount of proven reserves in the future. Peak Oil treats this mutual relationship as a one-way street—going from the amount of currently proven reserves to the level of necessary (or cost-efficient) investment and the global production of oil.

Furthermore, reserves that may be considered unprofitable from the viewpoint of private oil companies may well be economical from the viewpoint of state- or publicly owned companies. For example, while a private oil company may find an estimated profit rate below x or y percent cost inefficient, a publicly owned oil company might invest in reserves as long as the estimated profit rate is not negative.

Indeed, as the experiences of state-owned oil companies in Russia, China, Venezuela, and many other countries show, publicly owned oil companies often take large short-term losses in pursuit of long-term returns or rewards. Free from short-term market imperatives, Russia, for example, has invested heavily in long-term oil projects, with results that have more than offset the enormous short-term costs of those projects. Here is how Joe Vialls, an expert with first-hand experience in “ultra-deep drilling,” explains it:

“In 1970, the Russians started drilling Kola SG-3, an exploration well which finally reached a staggering world record depth of 40,230 feet. Since then, Russian oil majors including Yukos have quietly drilled more than 310 successful super-deep oil wells, and put them into production. Last Year Russia overtook Saudi Arabia as the world’s biggest single oil producer, and is now set to completely dominate global oil production and sales for the next century. . . . With no shareholders holding out their grubby little hands for a wad of pocket money every month, the Russian oil industry managed to surge ahead, under-reaming thousands of its older existing onshore wells in less than ten years” [9].

The Role of Technology: A Dynamic, Not Static, Process

A major flaw of Peak Oil, as already pointed out, is that it discounts the fact that energy-saving technologies have drastically improved (and will continue to improve) the efficiency not only of oil production but also of oil consumption. Evidence shows, for example, that “over a period of five years (1994-99), U.S. GDP expanded over 20 percent while oil usage rose by only nine percent. Before the 1973 oil shock, the ratio was about one to one” [10].

Cars, airplanes, and other means of transportation have become more fuel-efficient than ever before, though not as much as they could, or should, be. Both businesses and consumers are also doing a better job of trimming their energy costs. Obviously, this means that our demand for energy does not grow as fast as our economy. For example, according to the Energy Information Administration, in 1981 the United States devoted nearly 14 percent of its overall gross domestic product to energy; by 2006 that figure had fallen to about 9 percent.
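As a rough sanity check, the figures just quoted can be turned into the two efficiency measures the argument relies on: the change in oil use per unit of GDP over 1994-99, and the relative decline in the energy share of GDP between 1981 and 2006. The following sketch simply restates the cited numbers as arithmetic; it introduces no data beyond what the text reports.

```python
# Back-of-the-envelope check of the efficiency figures cited above.
# All inputs are the numbers quoted in the text (1994-99 U.S. GDP and
# oil-use growth; EIA energy share of GDP in 1981 and 2006).

gdp_growth = 0.20      # U.S. GDP growth, 1994-99 ("over 20 percent")
oil_growth = 0.09      # U.S. oil-usage growth over the same period

# Oil intensity = oil use per unit of GDP; its relative change:
intensity_change = (1 + oil_growth) / (1 + gdp_growth) - 1
print(f"Change in oil intensity of GDP, 1994-99: {intensity_change:.1%}")

energy_share_1981 = 0.14   # share of GDP devoted to energy, 1981 (EIA)
energy_share_2006 = 0.09   # share of GDP devoted to energy, 2006 (EIA)
share_decline = (energy_share_1981 - energy_share_2006) / energy_share_1981
print(f"Relative decline in energy share of GDP, 1981-2006: {share_decline:.1%}")
```

On the quoted figures, oil use per unit of GDP fell by roughly 9 percent in just five years, and the energy share of GDP fell by more than a third over a quarter century, which is the decoupling of energy demand from economic growth that the paragraph describes.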

One result of these more efficient methods of research and development has been a far higher success rate in finding new oil fields: over the past twenty years the rate has risen from less than 70 percent to over 80 percent. Computers have helped reduce the number of dry holes, and horizontal drilling has boosted extraction. Another important development has been deep-water offshore drilling, which the new technologies now permit. Good examples are the North Sea, the Gulf of Mexico, and, more recently, the promising offshore oil fields of West Africa [11].

The following are some of the recent technological advances that (as described by Red Cavaney, a top oil expert) have dramatically increased the ability not only to find and extract new oil but, perhaps more importantly, to recover additional oil from existing reserves that were formerly considered “peaked and dried” under old technologies.

  • Directional Drilling. It used to be that wellbores were basically vertical holes. This made it necessary to drill virtually on top of a potential oil deposit. However, the advent of miniaturized computers and advanced sensors that can be attached to the drill bit now allows companies to drill directional holes with great accuracy because they can get real-time information on the subsurface location throughout the drilling process.
  • Horizontal Drilling. Horizontal drilling is similar to directional drilling, but the well is designed to cut horizontally through the middle of the oil or natural gas deposit. Early horizontal wells penetrated only 500 to 800 feet of reservoir laterally, but technology advances recently allowed a North Slope operator to penetrate 8,000 feet of reservoir horizontally. Moreover, horizontal wells can operate up to 10 times more productively than conventional wells.
  • 3-D Seismic Technology. Substantial enhancements in computing power during the past two decades have allowed the industry to gain a much clearer picture of what lies beneath the surface. The ability to process huge amounts of data to produce three-dimensional seismic images has significantly improved the drilling success rate of the industry [12].

“Primarily due to these advances,” Cavaney further points out, “the U.S. Geological Survey (USGS), in its 2000 World Petroleum Assessment, increased by 20 percent its estimate of undiscovered, technically recoverable oil. USGS noted that, since oil became a major energy source about 100 years ago, 539 billion barrels of oil have been produced outside the United States. USGS estimates there are 649 billion barrels of undiscovered, technically recoverable oil outside the United States. But, importantly, USGS also estimates that there will be an additional 612 billion barrels from reserve growth—nearly equaling the undiscovered resources. Reserve growth results from a variety of sources, including technological advancement in exploration and production, increases over initially conservative estimates of reserves, and economic changes” [13].
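The USGS figures Cavaney cites can be tallied directly: estimated future recoverable oil outside the United States (undiscovered plus reserve growth) against cumulative past production. The sketch below only adds up the numbers quoted in the passage, all in billions of barrels (BBO).

```python
# Tally of the USGS World Petroleum Assessment 2000 figures quoted above
# (all in billions of barrels, outside the United States).

produced       = 539   # cumulative production since oil became a major energy source
undiscovered   = 649   # undiscovered, technically recoverable oil
reserve_growth = 612   # estimated additions from reserve growth

future_supply = undiscovered + reserve_growth
print(f"Estimated future recoverable oil: {future_supply} BBO")  # 1,261 BBO
print(f"Ratio of future supply to past production: {future_supply / produced:.2f}")
```

On these estimates, the oil still to be recovered outside the United States is more than twice what has been produced in the industry's entire history, and reserve growth alone nearly equals the undiscovered resource, which is the point Cavaney emphasizes.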

Thanks to new technologies, additional oil can now be recovered from apparently exhausted reserves. Specifically, the peaking and declining of oil production from an existing well is not the same as the peaking and declining of the respective oil field or reservoir. While production from an existing well is bound to peak and then slow down, “offset wells” can be drilled later into the same field or reservoir to produce more oil. Here is how Vialls explains it:

“Now we come to the completely false [or deliberately misleading] claim by Peak Oil shills that production from existing oil wells is ‘slowing down,’ thereby proving that the oil fields are ‘running dry.’ This is so wrong that it is almost breathtaking. Think of this slowing down process in the same way you might think of the engine oil in your automobile. The longer you run the engine, the higher the level of contaminates that get into the oil. The higher the level of contaminates, the higher the level of friction. Sooner or later you have something closely akin to glue coating your piston rings, and the performance of your engine declines accordingly. This is an inevitable mechanical process well known to all automobile owners.

“Henry Ford and others managed to slow down the rate of contamination in engine oils by inventing the oil filter, through which the oil has to circulate each time it passes around inside the engine. A high percentage of the contaminates stick to the filter element, thereby allowing extra miles between oil changes, though heaven help the careless motorist who thinks he can get away without ever changing his clogged oil filter when recommended.

“When oil is extracted from a producing formation underground, it flows out through pores in the reservoir rock, and then into the open borehole, from where it is transported to surface by the production tubing string. So by the very nature of the beast, the bottom section of the well is ‘open hole’ which allows the oil to flow out in the first place, but because it is comprised of exposed and sometimes unstable rock, this open hole section is also continually subject to all manner of turbulence and various contaminates. For example, tiny quantities of super fine silt may exit through the pores but not continue to the surface with the oil, tumbling around in the turbulence instead, until the silt very slowly starts to block off the oil-producing pore throats. Yes, of course there are a variety of liners that can be used to slow down the contamination, but there is no such thing as a Henry Ford oil filter 10,000 feet underground.

“The inevitable result of this is that over time, the initial production rate of the well will slowly decline, a hard fact known to every exploration oilman in the business. However, this is certainly not an indication that the oil field itself is becoming depleted, proved thousands of times by ‘offset wells’ drilled later into the same reservoir. Any new well comes on stream at the original production rate of its older cousins, because it has not yet had time to build up a thin layer of contaminates across the open hole. Though as we shall see it is possible to ‘do an oil change’ on a producing well and bring it back to full production, this is extremely expensive, and rarely used in the west” [14].

Substitutes or Alternative Sources of Energy

Peak Oil is also subject to criticism because it pays insufficient attention to substitutes or alternative sources of energy, both actual and potential. These include solar, wind, non-food biofuel, and nuclear energies. They also include natural gas, which now accounts for about 25 percent of energy demand worldwide and is estimated to become the world’s main source of energy by 2050. A number of American, European, and Japanese firms have invested, and are investing, heavily in developing fuel cells for cars and other vehicles that would significantly reduce gasoline consumption [15].

Peak Oil also gives short shrift to what is sometimes called “unconventional” oil. This includes Tar Sands, Heavy Oils, and Oil Shale.

Tar Sands can be recovered via surface mining or in-situ collection techniques. Canada’s Athabasca Tar Sands are the best-known example of this kind of unconventional reserve, estimated at 1.8 trillion barrels. Although these were originally considered cost inefficient to develop, experts working in this area now claim to have brought the cost down from over $20 per barrel to $8 per barrel.

Heavy Oils can be pumped and refined much like conventional petroleum, except that they are thicker and carry more sulfur and heavy-metal contamination, necessitating more extensive refining. Venezuela’s Orinoco heavy oil belt is the best-known example of this kind of unconventional reserve, estimated at 1.2 trillion barrels.

Oil Shale requires extensive processing and consumes large amounts of water. Still, reserves far exceed supplies of conventional oil, and costs are bound to decline as newer and more efficient processing techniques become available [16].

A rarely mentioned but potentially very important substitute for conventional oil “is an even bigger hydrocarbon resource that can be developed to provide nearly endless amounts of energy: methane hydrates (methane frozen in ice crystals). The deposits of methane hydrates are so vast that when we develop the technology to bring them to market, we will have clean-burning energy for 2,000 years. It’s just one of the exciting scenarios we may see in the far-off future” [17].

Except for natural gas and nuclear energy, most of these alternative sources of energy are still highly costly, and are therefore used in only insignificant quantities. But, considering the ever-evolving newer and more efficient technologies, they are bound to rise in significance. This means it is quite realistic to expect a day when conventional oil is no longer the world’s dominant source of energy. Humans did not invent motor vehicles because they ran out of horses or horse-drawn carriages; nor did they invent electricity because they ran out of candles.

Concluding Remarks

Predictions that global oil production will peak and then run out have been around almost as long as the oil industry itself, which dates to the second half of the 19th century. Time and again, such dire predictions have turned out to be false, largely because of Peak Oil’s apparently sound but actually deceitful logic: while it is true, as Peak Oil maintains, that oil is a finite natural resource bound to run out some day, it does not follow, as Peak Oil also argues, that oil is or must be running out soon.

A major flaw of Peak Oil is that it rests on a static, technology-neutral assumption: it implicitly treats the limits to oil as natural, innate, and immutable. Yet limits to oil, like those to most other resources, are determined as much (if not more) socially as they are naturally. Research, development, and technological advances have made (and will continue to make) both oil reserves and oil production far more fluid, or elastic, than the champions of Peak Oil perceive.

Viewed in conjunction with the vast pool of substitutes, both actual and potential, oil limits loom less vitally than when they are considered in isolation from such energy alternatives. The constantly evolving newer and more efficient technologies are bound to further expand those limits far beyond the narrow, “natural” limits set by the Peak Oil theory.
_______________________________

References

[1] Robert L. Hirsch, Roger Bezdek, and Robert Wendling, “Peaking of World Oil Production: Impacts, Mitigation, and Risk Management,” Testimony on Peak Oil before the House Subcommittee on Energy and Industry (7 December 2005), http://www.netl.doe.gov/publications/others/pdf/Oil_Peaking_NETL.pdf

[2] Matthew Mosk, “Industry Gushed Money After Reversal on Drilling,” Washington Post (27 July 2008), http://www.washingtonpost.com/wp-dyn/content/article/2008/07/26/AR2008072601891.html

[3] F. William Engdahl, “Perhaps 60% of Today’s Oil Price Is Pure Speculation,” financialsense.com (2 May 2008), http://www.financialsense.com/editorials/engdahl/2008/0502.html

[4] Michael T. Klare, Resource Wars: The New Landscape of Global Conflict (Holt Paperbacks, 2002).

[5] Red Cavaney, “Global Oil Production about to Peak? A Recurring Myth,” World Watch (01 January 2006), http://goliath.ecnext.com/coms2/gi_0199-5142950/Global-oil-production-about-to.html

[6] Eliyahu Kanovsky, “Oil: Who’s Really Over a Barrel?” Middle East Quarterly (Spring 2003), http://www.meforum.org/article/527

[7] Jason Schwarz, “The Peak Oil Myth: New Oil is Plentiful,” Seeking Alpha (22 June 2008), http://seekingalpha.com/article/82236-the-peak-oil-myth-new-oil-is-plentiful

[8] M.A. Adelman, The Genie out of the Bottle: World Oil since 1970, (Cambridge: MIT Press, 1995); cited in Bill Kovarik, “The Oil Reserve Fallacy: Proven reserves are not a measure of future supply,” http://www.radford.edu/~wkovarik/oil/

[9] Joe Vialls, “Russia Proves ‘Peak Oil’ Is A Misleading Zionist Scam,” rense.com (25 August 2004), http://www.rense.com/general75/zoil.htm

[10] Eliyahu Kanovsky, “Oil: Who’s Really Over a Barrel?” Middle East Quarterly (Spring 2003), http://www.meforum.org/article/527

[11] Ibid.

[12] Red Cavaney, “Global Oil Production about to Peak? A Recurring Myth,” World Watch (01 January 2006), http://goliath.ecnext.com/coms2/gi_0199-5142950/Global-oil-production-about-to.html

[13] Ibid.

[14] Joe Vialls, “Russia Proves ‘Peak Oil’ Is A Misleading Zionist Scam,” rense.com (25 August 2004), http://www.rense.com/general75/zoil.htm

[15] The Wall Street Journal (10 March 1998); cited in Eliyahu Kanovsky, “Oil: Who’s Really Over a Barrel?” Middle East Quarterly (Spring 2003), http://www.meforum.org/article/527

[16] For an informative discussion of unconventional oil reserves, and a scathing critique of Peak Oil see Bill Kovarik, “The Oil Reserve Fallacy: Proven reserves are not a measure of future supply,” http://www.radford.edu/~wkovarik/oil/

[17] Red Cavaney, “Global Oil Production about to Peak? A Recurring Myth,” World Watch (01 January 2006), http://goliath.ecnext.com/coms2/gi_0199-5142950/Global-oil-production-about-to.html

Ismael Hossein-zadeh, author of the recently published The Political Economy of U.S. Militarism (Palgrave-Macmillan 2007), teaches economics at Drake University, Des Moines, Iowa.

December 3, 2009 Posted by | Deception, Full Spectrum Dominance, Illegal Occupation, Malthusian Ideology, Phony Scarcity, Militarism, Science and Pseudo-Science, Timeless or most popular