Jordanians sue Israel over Dimona nuclear plant
Press TV – February 16, 2010
A Jordanian human rights group is set to sue Israel for the damage its nuclear facility in Dimona has caused to the environment and the residents in south Jordan.
The al-Jisr al-Arabi center has accused Israel of causing death, cancer and other afflictions among the Jordanian population residing in the area adjacent to the nuclear plant and thus exposed to the toxic gases and radiation the facility emits.
The Amman-based human rights group has reportedly taken the necessary preliminary steps and plans to file a lawsuit against Israel with Jordanian legal bodies in the coming month.
The center met with a number of the reactor’s victims and “collected evidence and proof that attest to a rise in the number of cancer cases, especially among residents of the southern region, which is adjacent to the reactor.”
“This is an ongoing and deliberate crime on Israel’s part, which still causes cases of death and injury in large numbers among the Jordanians,” said the center’s General Manager Attorney Amjad Shammout.
He added that he was optimistic the complaint would be accepted in accordance with the constitution and the international human rights treaties that Jordan has recently adopted.
The Forest Thinning Trap
Fear, Fire and Logging
By George Wuerthner | February 9, 2010
The effects of fire suppression on fuels are likely exaggerated. Most forest types are well within their historic range of variability. Most of the acreage burned in fires annually is in forest types that historically experienced moderate to significant stand replacement blazes. Therefore, the idea that the large fires that occur are the result of fire exclusion is inaccurate.
There is new evidence that suggests that even low elevation dry forests of Ponderosa pine and Douglas fir occasionally experienced large stand replacement blazes. The old model that characterized such forests as primarily a consequence of high frequency, low intensity blazes that created open and park-like forests may not be universally applicable.
Thinning won’t significantly affect large blazes because fuels are not the major factor driving them. Climatic/weather conditions are what drive large blazes.
Large blazes are driven by drought, wind, low humidity, and high temperatures. These factors do not occur in one place very frequently. That’s why most fires go out without burning more than a few acres.
The probability that any particular thinned stand will experience a blaze during the period when the thinning may still be effective is extremely low.
The majority of acreage burned is the result of a very small percentage of blazes—less than 0.1% of all fires are responsible for the vast majority of acres charred.
Even if it were possible to limit large blazes, it would be unwise to do so since the large blazes are the only fires that do a significant amount of ecological work.
Large fires are not “unnatural”. There are many species of plants and animals that are adapted to and/or rely upon dead trees and snags. There would be no evolutionary incentive for such adaptations if large fires were “unnatural.”
Dead trees are important physical and biological components of forest ecosystems. They are not a wasted resource. Beetles and wildfires are the prime agents that create dead trees. Removal of significant amounts of biomass by thinning and/or logging likely poses a long term threat to forest ecosystems. Biomass energy is the latest threat to forest ecosystems.
Logging/thinning is not benign. Logging has many impacts on forest ecosystems, including the spread of weeds, sedimentation of streams, alteration of water drainage, removal of biomass, and so on. These impacts are almost universally ignored and externalized by thinning/logging proponents.
Alternatives to logging/thinning for reducing fuels exist that do not remove biomass and that avoid most of the negatives associated with logging practices, including prescribed burns and wildland fire.
Reducing home flammability, not landscape-scale thinning/logging projects, is the most economical and most reliable way to safeguard communities.
The rush to formulate new forest legislation that advocates thinning forests and the use of biomass for energy production, and that presumes our forests are “unhealthy” and/or that large fires and beetle outbreaks are undesirable, may soon create a new threat to our forests. There is a host of different bills before Congress, including legislation introduced by Mark Udall of Colorado, Jon Tester of Montana, and Ron Wyden of Oregon, among others, all predicated upon a number of flawed or exaggerated assumptions.
Some of this legislation is better than others, and some of it even contains language and policies that are an improvement over present policies. Nevertheless, many of the underlying assumptions are troubling.
THE FIRE SUPPRESSION CONUNDRUM
There is a circular logic at work around the issue of fuel buildup and fire suppression. Currently the major federal agencies, including the Forest Service and BLM, generally attempt to suppress fires, except in a few special locations like designated wilderness. Despite the fact that most agencies now recognize that wildfires have a very important ecological role to play, we are told by managing agencies that they must continue to suppress fires or face “catastrophic” blazes—which they consider to be “uncharacteristic”.
The problem is that thinning won’t solve the “problem” of large blazes because the problem isn’t fuels. By allowing the timber industry to define the problem and propose a solution we have a circular situation whereby the land management agencies continue to suppress fires, thereby presumably permitting fuels to build up, which they assert thus drives large blazes, creating a need for more logging and fire suppression. This cycle of fire suppression, logging, grazing, and more fire suppression has no end.
In addition, since thinning reduces competition and opens up the forest floor to more light, and thus to new plant growth, thinning can often lead to the creation of even more of the flashy fine fuels that sustain forest fires. Unless these thinned stands are repeatedly treated, they can actually exacerbate the fire hazard by increasing the overall abundance of the very fuels which are most problematic—the smaller shrubs, grasses, and small trees that sustain fire spread.
In addition, thinning can increase solar penetration leading to more rapid drying and greater penetration of wind—both factors that aid fire spread.
This is not unlike the approach taken with predator control, whereby agencies for years have shot, poisoned, and trapped coyotes in the belief that they were reducing coyote numbers. But since coyotes respond to such persecution with greater fecundity, predator control becomes a self-fulfilling activity whereby predator control begets more predator control.
While fire suppression (and logging, grazing, and so forth) may be a contributing factor in fire spread in some forest types (primarily Ponderosa pine), these factors are not ultimately what is driving most large fires. Large blazes are almost universally associated with climatic features like severe drought and wind, and ultimately with shifts in oceanic currents such as the Pacific Decadal Oscillation. Therefore fuel reductions will not substantively change the occurrence of large blazes.
Even if one wanted to buy into the fuels-are-driving-large-blazes story, it would behoove us to rethink the range of solutions. The National Park Service, the only agency that does not have a commercial logging mandate, has effectively dealt with fuel reductions through wildland fire and prescribed burning. At the very least, any fuel reduction that may be needed should be done by prescribed burning.
QUESTIONING FIRE SUPPRESSION
One of the underlying assumptions of all these pieces of legislation is the idea that our forests are unhealthy and possess unnatural fuel loads due to fire suppression or fire exclusion. There is, of course, a bit of truth to the generalization that some forest types may have had some fuel build ups as a consequence of fire exclusion, but whether these fuel build ups are outside of the historic range of variability is increasingly under scrutiny.
It’s also very important to note that the majority of all forest/plant types in the West, like Lodgepole pine, sub-alpine fir, aspen, juniper, red fir, silver fir, Engelmann spruce, western red cedar, Douglas fir in west coast ecosystems, and many others, have such naturally long fire intervals that suppression, even if it were as effective as some might suggest, has not affected the historic fire frequency. Indeed, the majority of acreage burned annually tends to be in forest types characterized by moderate to severe fire, not in the forest types where fuel buildup is presumed to be a major problem—namely the Ponderosa pine forest type. Yet most people apply the Ponderosa pine model of less intense, frequent fires to all other forest types and thus assume that fire suppression has created unnatural fuel levels.
In particular, the timber industry has adopted the convenient theme that fire suppression has created a presumed “fuel build-up” responsible for large wildfires. (Never mind that there were large wildfires long before there was any effective fire suppression—for instance, the 1910 Burn, which charred more than 3 million acres of northern Idaho and western Montana.)
Thus logging proponents have created a “problem,” namely fuel buildup, and then, by happy coincidence, have a solution that just happens to benefit them: logging the forest.
Fire suppression may have influenced some low elevation dry forests like those dominated by pure Ponderosa pine, but perhaps not nearly to the degree or over the large geographical area that timber interests and logging proponents suggest. Those who want to justify logging try to conflate low elevation forests with all forest types—many of which, such as Lodgepole pine, are very likely not affected by fire suppression due to the naturally long intervals between fires in these forests.
CLIMATIC DRIVERS OF LARGE BLAZES
The emphasis on fuel reductions has obscured the fact that nearly all large blazes are climate/weather driven events. Evidence is building that wet, cool climatic conditions may be more responsible for dense forest stands and/or the lack of fires than anything to do with fire suppression. In other words, fire suppression may not be as effective as some suggest, and any fuel buildup may be within the natural or expected range.
In addition, there is also a growing body of scientific analysis that calls into question the very methods and conclusions used to construct fire histories. These analyses suggest that estimates of historic fire intervals, even in lower elevation dry forests like Ponderosa pine, are biased. Fire intervals may be far longer than previously assumed. Because of these longer fire intervals, dense forest stands may be natural, and/or no different than what existed in the past. There is also new evidence for mixed “severity” (i.e. moderate change) fires as well as crown fires in these dry forests. The implication of these findings is that many forests, even low elevation forests, may well be within the historic range of variability.
LARGE BLAZES NECESSARY
One of the issues missed by thinning proponents is that the vast majority of all ecological work occurs in a very small number of fires—the big so-called “catastrophic” fires. Even though most agencies and environmental groups now profess to believe that wildfire is important to healthy forest ecosystems, they are not willing to let fires do the work.
For example, in the years between 1980 and 2003, there were more than 56,350 fires in the Rockies. These fires burned 3.6 million hectares (8.64 million acres). Most of these fires were small—despite all the fuels that have supposedly made conditions in forests ready to “explode”. Out of these 56,350 fires, the vast majority, 55,228 fires or 98% of all blazes, charred only 4% of the acreage.
On the other hand, a handful of fires, 1,222 or less than 2% of the total, accounted for 96% of the acreage burned. Even more astounding is that 0.1% of the fires, or about 50 fires, charred more than 50% of the acreage burned.
This suggests four things to me. First, fuels are not driving large blazes. There is plenty of fuel throughout the Rockies, but most fires never burn more than a few acres—despite all the fuel that is sitting around. Fire suppression, if it is responsible for a fuel buildup, doesn’t appear to be creating a lot of big fires.
The few very large fires that everyone is concerned about occur during very special conditions of drought, combined with low humidity, high temperatures and wind. And these conditions simply do not occur very often. When they do line up in the same place at the same time, you get a large fire—no matter what the fuel loading may be. My conclusion is that large blazes are climate driven events, not fuel driven.
Finally, the take-home message for me is that even if we were successful at stopping big blazes through thinning and/or fire suppression, we would in effect be eliminating fire from the landscape. Since almost everyone today at least professes the goal of restoring fire, we have to tolerate the few large blazes—not try to stop them. Of course, it appears that despite our best efforts with logging, thinning, and all the rest, we have not had that much influence in eliminating the large blazes.
FRAMING THE ISSUE
One of the other major problems I have with the way many organizations have chosen to work on these issues is the way they “frame” the issues. When words like “working landscapes,” “restoration,” “unhealthy forests,” “catastrophic blazes,” and “beetle outbreaks” are used in any discussion related to forests, they solidify in the public’s mind that there is a major problem with our forests, and more importantly that the “cure” is some kind of major invasive manipulation of forest ecosystems.
One must be careful about how one frames this issue. Even though most environmentalists do not support large scale commercial logging of our national forests, and have a lot of sidebars on how any logging should be done to address ecological concerns, when environmental groups say things like “we need to maintain our timber industry to restore the forests” the public just hears that our forests are a mess and the ONLY solution is more logging. I maintain that is not a message environmentalists want to be conveying. The public does not hear the sidebars, nor the cautionary words; rather they hear that we need to log our forests, and do so in a big way, or ecological Armageddon is about to befall the West.
WHAT IS PRUDENT BEHAVIOR?
There is an important lesson in science called the precautionary principle. In the absence of full understanding of a problem, it is usually best to prescribe the least invasive and least manipulative actions. Conservation groups would be wise to apply this principle to forest policy.
That doesn’t mean I don’t support some “restoration” activities. To make an analogy, let’s look at the issue of wolf restoration. Putting wolves back on the land restores predation influences, but this is a very different thing than allowing hunters to kill elk, especially because it allows the wolves, and natural conditions like drought, to determine what the “right” number of elk and deer is, not some agency with an agenda to sell licenses. Hunters influence elk differently than wolves, and logging is different than, say, fire. Just as an elk killed by a wolf leaves behind carrion that other animals can use, a forest with fire leaves behind a lot of biomass that helps to sustain many other functions in the forest. Logging short-circuits those ecosystem functions. As with hunting, whenever you have a commercial enterprise involved in natural resource policy, it distorts the conclusions, and it becomes convenient to ignore anything that suggests the activity, whether hunting or logging, is creating problems.
NEW PARADIGM
There is a growing challenge to many of the assumptions about fire and its influence on forests. These challenges to assumptions about what constitutes forest “health” and the historic role of large blazes and beetle influences are not unlike the challenges to common assumptions about predators that began with people like Adolph Murie, George Wright, and other scientists back in the 1930s and 1940s who started to question predator policy. These early ecologists were not only challenging politicians and citizens, but many other scientists who were advocates of killing predators to create “healthy” populations of deer and elk.
I need not remind many conservationists that there are still plenty of scientists around who will support killing predators like wolves, despite decades of research about the ecological need for top-down predators. So assurances that any logging on public lands will use the “best” science are not reassuring to me. When there is a commercial/economic aspect to any management, it tends to distort and often compromise the science and the scientists that are consulted. It would be naïve for anyone to believe that this is any different when dealing with fire and forest policy issues, especially when there’s an economic benefit to some industry and/or individuals from the policy.
QUESTIONING SUPPRESSION
There is a growing scientific body of work challenging the notion that fire suppression is responsible for dense forests, and/or the notion that crown fires are unnatural, even in low elevation forests consisting of Ponderosa pine and/or Douglas fir. The implications of this for forest policy are significant, for if it is correct, our current conditions are not outside the historical normal range of variability, especially when you consider past climatic conditions that were similar to the current dry, warm conditions.
One can find plenty of scientists who think our forests are out of whack and who prescribe logging to reduce fuels and so forth. However, anyone monitoring the scientific literature will find enough evidence here and there to question the current assumptions about “forest health” and the presumed need for logging.
At the very least, it would seem a prudent approach to avoid endorsing logging when there is at least some evidence to suggest that our forests are not as out of whack as previously assumed, and/or that logging cannot do what advocates suggest—like restore the ecosystem or prevent large blazes.
PROBABILITY OF FIRES
Another unchallenged assumption of those prescribing thinning to protect, say, old growth Ponderosa pine is the idea that somehow without thinning we would lose all the old growth to fires. However, that ignores the low probability that any particular acre of land will burn in a fire. For one thing, most fires are small, as mentioned earlier. They do not burn more than a few acres before going out. The few fires that do grow into large blazes occur under very special climatic/weather conditions of extreme drought, high wind, low humidity and high temperatures. These conditions do not occur that frequently, and to this you must add an ignition. So even if you have drought, wind, low humidity, and so on, you may not get a blaze.
In addition, even big blazes do not consume all the forest. Because most large fires burn in a mosaic pattern for a host of reasons, the likelihood that any particular acre of old growth will burn is extremely small.
Finally, since thinning effectiveness even under the best circumstances rapidly declines over time, thinning of a particular location must be very recent in order to protect old growth stands; otherwise new growth generated by the opening of the forest, reduced competition, and so on often negates any advantage created by forest manipulation (logging).
LOGGING IS NOT BENIGN
Even if one disagreed with these new insights and interpretations of forest ecosystems, and with questions about the presumed effectiveness of thinning projects, that doesn’t necessarily lead to logging as the “cure”. It wasn’t that long ago that we heard many groups outlining the many ways that logging created undesirable ecological outcomes—the spread of weeds, changes in the abundance of snags and down wood, the way human activity in the woods disturbs and displaces sensitive wildlife, the way disturbance of the land and use of logging roads (even temporary logging roads) adds sediment to our streams, and so forth. Most of those critiques are still valid today, but we don’t hear that kind of criticism coming from many environmental groups anymore. This silence, and the unwillingness to continuously remind the public that logging has many, many negative impacts on forest ecosystems, has compromised their effectiveness as defenders of our public forests. After all, who is going to assume that role if environmental groups do not keep reminding the public that logging has many unexamined and ignored externalities?
LESS MANIPULATIVE ALTERNATIVES EXIST
Even if one did not want to challenge the common perception that we have an “emergency” as Senators Wyden, Udall, Tester and others proclaim, logging isn’t necessarily the only or the best way to address this presumed emergency.
The National Park Service does fuel reductions and ecosystem restoration without logging. They have a long track record demonstrating that one can modify fuels and restore the ecological value of wildfire to the landscape without logging, and without jeopardizing communities. Yosemite NP, for instance, does prescribed burning in the crowded Yosemite Valley as does Muir Woods in adjacent areas, as well as many other national parks. That is not to suggest that prescribed burning will alleviate all concerns, but at the very least, it should be the approach that environmentalists advocate. Prescribed burning combined with natural wildfire can “restore” forest resilience as well as reduce fuels. Such an approach avoids many of the negatives associated with commercial logging, including the need for roads, the disturbance of water drainage by road building, soil compaction, removal of biomass, and so forth.
REDUCE HOME FLAMMABILITY AS FIRST DEFENSE AGAINST FIRE
There is an abundance of evidence to suggest that if community security is a concern, the best way to achieve it is through reducing the flammability of homes and the area immediately around the community, not wholesale logging of the forest ecosystem. Jack Cohen’s research at the Missoula Fire Sciences Laboratory has demonstrated that thinning the forest is not the best way to protect homes.
ADVOCATE FOR NATURAL PROCESSES
Even if the majority of you believe our forests are out of whack and are unwilling to accept the critiques of those who suggest that our understanding of forest ecosystems may be incorrect, that doesn’t mean one has to be a handmaiden for the timber industry. Nature does the best management—that is why we are all advocates for wilderness: we believe that allowing wild places to determine what is right for the landscape is the best way to preserve “healthy ecosystems”. If the forests are overstocked, as some may want to conclude, then let natural processes do any thinning that is necessary using insects, disease, drought, fire, wind storms, and all the other mechanisms that regulate plant communities—Nature will do a far better job of determining which trees should survive than any forester.
Our role as humans is to get out of the way as much as possible, not to intrude and advocate for invasive solutions like logging. The only role for logging on public lands that I see is outlined below.
WHEN TO SUPPORT LOGGING/THINNING
If you must support logging, make sure it is very limited, and framed not in terms of forest health, but as a way to reduce human anxiety. Logging around houses and communities to reduce public anxiety over fires may be a political necessity. A fire break of significant size around the perimeter of a community may reduce public fears about large fires; however, as has been shown in numerous cases around the West, fuel breaks alone will not ensure that homes are safe. The flammability of individual homes must be addressed.
Climate draft offers new support for nuclear power
By Jim Snyder | The Hill | 02-02-10
Climate change legislation being written by a Senate climate trio includes additional loan guarantees, tax breaks and a streamlined regulatory approval process to boost the nuclear energy industry.
A draft of the title, obtained by E2 Wire from an energy lobbyist, shows Sens. John Kerry (D-Mass.), Joseph Lieberman (I-Conn.) and Lindsey Graham (R-S.C.) are contemplating a series of incentives for nuclear power.
The language is the first to emerge from the behind-the-scenes talks the three have led in hopes of striking an accord on climate change to attract centrists.
A spokesman for Kerry said the language was not current but declined to say how it had changed. The draft title reflects themes that Kerry, Lieberman and Graham have already laid out.
President Barack Obama also called for additional support for the industry in his recent State of the Union address. His budget includes $54.5 billion in loan guarantees for the industry, tripling the $18 billion in authority Congress has already approved.
A summary of the draft title attached to the legislative text lists several nuclear incentives:
“Regulatory risk insurance to increase investor confidence and minimize the financial risks associated with prolonged regulatory delays,
Accelerated depreciation for nuclear plants;
Investment tax credits to create parity with the benefits enjoyed by wind and solar power; and
A doubling of the authorization for loan guarantees from $48.5 billion to $100 billion, of which $38 billion will be available for nuclear plants.”
Kerry confirmed that tax incentives and loan guarantees are part of the nuclear section.
“We have made huge progress on it and I think we have a terrific title,” he told reporters in the Capitol Tuesday.
Kerry said the distribution of the draft titles has been limited.
“We have not circulated any component of this widely because we are trying to tie all the pieces together before we start having any kind of dissection,” he said.
Kerry said there has also been progress on titles addressing renewable and alternative energy, natural gas and offsets.
He added that the lawmakers are still determining their exact mechanism for putting a price on carbon emissions.
Other provisions include support for worker retraining programs to respond to concerns that “an aging nuclear workforce is on the brink of retirement,” according to the summary.
See also:
Nuclear energy firms seek more than loan guarantees for revival
Ben Geman contributed to this post.
Bribery, Indentured Science, PR & Toxic Sludge
By Ronnie Cummins & Alexis Baden-Mayer | Organic Consumers Association | February 4, 2010
Greg Kester, Natalie Sierra and Liz Ostoich, along with municipal governments across the U.S. desperately in need of getting rid of the noxious stuff called sewage sludge, want Americans to believe that this toxic brew is good for you. Specifically, these operators are waging a massive PR campaign to get farmers and gardeners, including school gardens, to “fertilize” their veggies with sewage sludge. Their campaign would have us believe that the chemicals in sewage sludge—thousands of them, present in every degree of hazardous and toxic combination—are somehow magically gone once you “apply” it to your garden.
Before you reach for the science on the practice of “land application” of sewage sludge (and you will not find any science in the hands of the purveyors of this practice), consider the elementary logic: the purpose of sewage treatment being to clean up the sewage that arrives without cease at its doors, sludge will by definition contain everything the sewage treatment plant did in fact take out of the sewage. This means, besides urine and feces from flush toilets, every chemical from every industry, every pharmaceutical, disinfectant, and pathogen from every hospital hooked into the municipal sewer system; it means all the chemicals—tens of thousands of them—produced in our society and flushed or washed into sewers at the industry end or the consumer end: heavy metals, flame retardants, endocrine disruptors, carcinogens, pharmaceutical drugs and other hazardous chemicals coming from residential drains. It also means untold—and unpredictable— new chemicals created by the negative synergy in the toxic soup that sewage is and the toxic stew that sludge is. It means hosts of new pathogenic bacteria also created through horizontal gene transfer in the stress of this same toxic soup and this same toxic stew.
Keep these plain, incontrovertible facts in mind as you read on and when you hear the 1984 talk of “biosolids” (the PR word concocted by the sludge gang), of “land application” of “biosolids” (euphemism for disposal of sludge), of “class A biosolids,” or “EQ” (for “Exceptional Quality”) “biosolids.”
Greg Kester represents the California Association of Sanitation Agencies’ Biosolids Program, Natalie Sierra works for the San Francisco Public Utilities Commission, and Liz Ostoich works for the corporate giant of toxic sludge, Synagro (recently bought by the infamous Carlyle Group). Their job is to make sure that “land application” of toxic sludge on American farmland—the cheapest way to dispose of toxic sludge since ocean dumping was stopped in 1992—remains legal.
Opposition so far comes from people who have been made very sick by sludge “applied” to farmland close to their homes, from those who have had their entire dairy herds wiped out after the animals were fed hay or silage grown on sludge, and from those whose own guts warn them against allowing sewage sludge to be either processed or spread near their homes or farms. Like Monsanto’s genetically modified organisms (GMOs) polluting the gene pool, once toxic sludge contaminates our farmland, parks, schoolyards, and backyard gardens, there is long-term—or really, for all intents and purposes, permanent—damage. Growing food organically won’t mean much if the soil is contaminated with the pharmaceuticals, chemicals, and heavy metals contained in sewage sludge. This is exactly why the organic community rose up in 1998 and forced the government to prohibit the spreading of sewage sludge on organic farms and gardens.
Sludge propagandists like Kester, Sierra and Ostoich are the front-line troops for municipal governments trying to avoid their responsibility for this noxious product of wastewater treatment; their job is convincing the public that toxic sludge is good to have on your land, in your backyard. All three spoke at an industry conference several of us attended last week in San Francisco, “Biosolids: Understanding Future Regulatory Trends and Impacts on Biosolids Management in California.”
Greg “Buy the Science” Kester, California Association of Sanitation Agencies’ Biosolids Program
Greg Kester pitched sewage professionals on a national strategy of getting ahead of the toxic sludge news cycle by “filling in the data gaps” with research funded with what he described as “Congressional funny money” and conducted by organizations like WERF, the Water Environment Research Foundation, a PR think tank and lobbyist for the toxic sludge industry.
What really upsets Kester are reporters who “overlook” what he sees as the “benefits” of toxic sludge. Case in point: the Johns Hopkins study in Baltimore that examined the possibilities of using toxic sludge “to reduce the impact of lead contamination” in poor black neighborhoods by “tying it up” in the sludge. The plan was to test the blood of the children living in this project—before and after the “application” of the “biosolids”—to see if the lead levels had gone up or down.
The researchers “applied” “sterilized Baltimore sewage sludge mixed and composted with wood chips and sawdust,” along with grass seed, to backyard soil contaminated with lead. After a year, the grass cover was shown to reduce the amount of lead-contaminated soil being tracked from people’s yards to their homes, making it less likely that the lead-contaminated soil would be ingested and absorbed into the blood stream. Kester believes this is really a positive story about how toxic sludge can improve the environment by producing “lush green grass.”
Kester says the industry should have gotten great PR out of this one, but this isn’t how the story played to the media. After all, even contaminated soils can grow grass. The trouble with toxic sludge is that it, too, contains hazardous levels of lead. The “exceptional quality,” “class A biosolids” that were used in the experiment are permitted to have up to 300 mg/kg of lead. The law allows land used to dispose of toxic sludge to cumulatively reach a load of 264 pounds of lead per acre.
The AP reporter who had suggested a comparison between this “study” and the Tuskegee studies of the 1950s was removed from his post and sent to no man’s land at the United Nations.
A second story Kester thinks could have been played better by the toxic sludge industry is the “application” of toxic sludge to the White House lawn during past administrations. This story made the news again when First Lady Michelle Obama began growing what she intended to be an organic garden in a piece of that lawn. Kester said the use of a toxic sludge product called OrGrow (incidentally, the same product that was used in the Baltimore study) had left lead contamination of “only” 93 ppm, “lower than expected for urban soils and safe gardens.” Kester is technically, if deceptively, correct: our EPA says that soil with more than 56 parts per million of lead might not provide “adequate protection of terrestrial ecosystems,” but doesn’t suggest worrying about anything below 400 parts per million as a threat to human health. However, some soil scientists advise against feeding children produce grown on soil with more than 100 ppm of lead.
Of course, knowing as we do that sludge itself contains unpredictable but high levels of lead and thus certainly contributes to the lead contamination of soils, who in their right mind would continue to support the practice of disposing of it on land? And remember also, lead is only one of countless and unpredictable toxins to be found in sewage sludge.
Natalie “Sludge Giveaways” Sierra, San Francisco Public Utilities Commission
Natalie Sierra has helped the toxic sludge industry score a major victory in the green city of San Francisco, where they’ve actually been able to get city residents to take toxic sludge and dispose of it in their own yards and community gardens. As part of the SF Public Utilities Commission’s contract with Synagro to take its toxic sewage sludge, SF gets a little of it sent back to them in a form that’s very similar to what was used in the Baltimore study and at the White House: pelletized, composted, sanitized beyond recognition. This is given away to community, school and home gardeners as “organic compost.” Since May 2007, the San Francisco Public Utilities Commission has given away more than 125 tons of toxic sludge to the unsuspecting public at “free giveaway” events.
The sludge giveaways have been successful, either because the recipients think they’re getting real organic compost (how should they know otherwise, when the city also gives away OMRI-certified, genuine organic compost made from composted food scraps collected in the green recycling bins), or because they trustingly assume that the law regarding the use of toxic sewage sludge as fertilizer must be protective of human health. As Greg Kester emphasized in his talk, the toxic sludge industry is counting on the city to stand its ground against complaints from groups like the Center for Food Safety and RILES (ReSource Institute for Low Entropy Systems), which filed a legal petition with the city in 2009 to stop the disposal of toxic sludge on city lands through the “giveaway” program. Precisely because of its reputation as a green city, San Francisco is the strategic battleground in a national dispute pitting the toxic sludge industry against localities that have decided they don’t want to be toxic sludge disposal sites anymore. In Kester and Sierra’s view, sludge “giveaways” are the best opportunity to convince the public that toxic sludge can be “beneficially reused” as “non-toxic, nutrient-rich organic biosolids compost.”
For the “biosolids” conference, which organizers assumed was attended only by industry insiders (admission to the one-day event was $226 for people who aren’t members of the California Water Environment Association, a sewage industry trade association), Sierra gave a presentation on local ordinances in California that threatened to limit efforts to dispose of sludge on rural lands. When I questioned her after the conference as to why it was so important to give sludge away to San Francisco gardeners, she claimed that it was an issue of “social justice”—meaning you shouldn’t dump on someone else’s land what you don’t want on your own; that city dwellers shouldn’t be so cavalier about dumping their wastes on the farmland of rural counties. This was a shock to me, considering that, in her presentation about the “challenge” of anti-sludge rural counties, the only concern she had expressed was to “keep rate payers in mind” and “keep costs down.” But in this “social justice” comment, it appeared for a moment that, in her view, rural communities should not be forced to receive a city’s toxic sewage sludge for disposal on their farmland. She quickly disabused me of this illusion, assuring me that this was not what she meant. Perhaps she has not yet got her propaganda logic straight.
Liz “Bribery” Ostoich, Synagro
Liz Ostoich works as a project developer for the “land application”-of-toxic-sludge corporate giant, Synagro. Her presentation began with a cartoon image of a person who had gotten whacked very hard on the nose. She said she was going to teach us what she had learned in the school of hard knocks about how to gain local approval for toxic sludge processing. This is what she’s learned:
1. Poor Neighborhoods, Not Rich Neighborhoods
Ostoich advised us to pick the “right location,” not someplace that’s going to involve “taking trucks through a very exclusive neighborhood.”
2. Out of Sight, Out of Mind
A “remote location” is also key. Ostoich warned us to “be in an area where folks can’t really see you, they smell with their eyes.”
Ostoich gave two examples of projects she’d worked on: one, Synagro’s Temescal Canyon facility in Corona, CA, “where it was done wrong,” and the other, its South Kern County facility, which she told us was “unanimously supported.”
To hear Ostoich tell it, the closing of Synagro’s Temescal Canyon facility was the result of Synagro’s failure to manage the politics and public relations surrounding toxic sewage sludge, not its failure to properly manage the toxic sludge itself. She shared with us her suspicions that the complaints phoned in by neighbors (suburban sprawl had placed 7,000 homes within a 4-mile radius of the sludge plant) were “bogus and contrived to get us shut down.”
What Ostoich didn’t share with us was that 37 individual small-claims lawsuits for $5,000 each, the maximum allowable amount, were won against Synagro for creating a public nuisance. This was for 11 years of suffering. One of the plaintiffs, Diana Schramm, told a local newspaper, “We would express our frustration to these people, Synagro, that the odor was so intense that it was burning our eyes, burning our noses, burning our throats. It was so frustrating. They just didn’t seem to care about us.”
As a counterpoint, Ostoich used Kern County as an example of a place where Synagro had done things right: “remote location,” “political involvement,” “proven technology,” and “going into it with the right attitude.” As an example of the community support Synagro received for their project, Ostoich read—in full—a letter from the president of Taft College written to California Senator Florez about Synagro’s generosity: an annual contribution of $25,000 to the college. Oh, and the letter happened to mention that Taft College was of the opinion that Synagro was a superb environmental steward. If you can’t beat ’em, buy ’em?
That strategy didn’t work so well with the Taft City Council. After the council voted to reject a $25,000 check from Synagro, councilman Craig Noble said he felt that accepting the money could have created a conflict of interest.
“I hate to take money from somebody that might try to be buying their way into something later on,” Noble said.
It isn’t easy to go up against a public-private trifecta that is so well-resourced and unscrupulous, but we have to try. If we can’t stop San Francisco, home to the man Organic Style magazine calls the World’s Greenest Mayor, from tricking its citizens into taking poison and growing their food in it, there’s no telling what the toxic sludge industry will try to get away with in other towns. That’s why we’re putting out a national call to all of our members and readers to encourage organic consumers across the country to help us stop San Francisco’s toxic sewage sludge giveaways.
Will Obama guarantee a new reactor war?
By Harvey Wasserman | Online Journal | February 1, 2010
Amidst utter chaos in the atomic reactor industry, Team Obama is poised to vastly expand a bitterly contested loan guarantee program that may cost far more than expected, both financially and politically.
The long-stalled, much-hyped “renaissance” in atomic power has failed to find private financing. New construction projects are opposed for financial reasons by fiscal conservatives such as the Heritage Foundation and National Taxpayers Union, and by a national grassroots safe energy campaign that has already beaten such loan guarantees three times.
New reactor designs are being challenged by regulators in both the US and Europe. Key projects, new and old, are engulfed in political/financial uproars in Florida, Texas, Maryland, Vermont, New Jersey and elsewhere.
And 53 years after the opening of the first commercial reactor at Shippingport, Pennsylvania, Department of Energy Secretary Steven Chu is now convening a “blue ribbon” commission on managing radioactive waste, for which the industry still has no solution. Though stacked with reactor advocates, the commission may certify the death certificate for Nevada’s failed Yucca Mountain dump.
In 2005, George W. Bush’s energy bill embraced appropriations for an $18.5 billion loan guarantee program, which the Obama administration now may want to triple. But the DOE has been unable to minister to a chaotic industry in no shape to proceed with new reactor construction. As many as five government agencies are negotiating over interest rates, accountability, capital sourcing, scoring, potential default and accident liability, design flaws and other fiscal, procedural and regulatory issues, any or all of which could wind up in the courts.
In 2007, a national grassroots uprising helped kill a proposed addition of $50 billion in guarantees, then beat them twice again.
When Obama endorsed “safe, clean nuclear power plants” and “clean coal” in this year’s State of the Union, more than 10,000 MoveOn.org members slammed that as the worst moment of the speech.
The first designated recipient of the residual Bush guarantees may be at the Vogtle site in Waynesboro, Georgia, where two reactors now operate. Georgia regulators have ruled that consumers must pay for two proposed new reactors even as they are being built.
But initial estimates of $2-3 billion per unit have soared to $8 billion and more, even long before construction begins. Standardized designs have not been certified. Ongoing technical challenges remind potential investors that the first generation of reactors cost an average of more than double their original estimates.
The Westinghouse AP-1000 model, currently slated for Vogtle — and for another site in South Carolina — has become an unwanted front runner.
Owned by Japan’s Toshiba, Westinghouse has been warned by the Nuclear Regulatory Commission of serious design problems relating to hurricanes, tornadoes and earthquakes.
The issues are not abstract. Florida’s Turkey Point plant took a direct hit from Hurricane Andrew in 1992, sustaining more than $100 million in damage while dangerously losing off-site communication and power and desperately relying on what Mary Olson of NIRS terms “shaky back-up power.” Ohio’s Perry reactor was damaged by a 1986 earthquake that knocked out surrounding roads and bridges. A state commission later warned that evacuation under such conditions could be impossible.
The NRC has long been considered a loyal industry lapdog, so its willingness to send Westinghouse back to the drawing board indicates the AP-1000’s problems are serious. That they could be expensive and time-consuming to correct means the Vogtle project may prove a losing choice for the first loan guarantees.
South Texas is also high among candidates for loan money. But San Antonio, a primary partner in a two-reactor project there, has been rocked by political fallout from soaring cost estimates. As the San Antonio city council recently prepared to approve financing, it learned the price had jumped by $4 billion, to a staggering $17-18 billion. Angry debate over who-knew-what-when has led to the possibility that the city could pull out altogether.
In Florida, four reactors have been put on hold by a plummeting economy and the shifting political aims of Governor Charlie Crist. Crist originally supported two reactors proposed by Florida Power & Light to be built at Turkey Point, south of Miami, and another proposed for Levy County by Progress Energy. State regulators voted to allow the utilities to charge ratepayers before construction began, or even before a license was approved.
But Crist is now running for US Senate, and has distanced himself from the increasingly unpopular utilities. With votes from two new appointees, the Public Service Commission has nixed more than $1 billion in rate hikes. The utilities have in turn suspended preliminary reactor construction (though they say they will continue to pursue licenses).
At Calvert Cliffs, Maryland, the financially tortured Constellation Energy has committed to the French AREVA’s European Pressurized Reactor (EPR), now under serious challenge by regulators in France, Finland and Great Britain. An EPR under construction in Finland is now at least three years behind schedule, and more than $3 billion over budget.
Meanwhile, at Entergy’s 30-year-old Yankee reactor in Vermont, a series of radiation and information leaks have severely damaged prospects for re-licensing. The decision will soon be made by a deeply divided state legislature. “It would be better for the industry to let Vermont Yankee die a quiet death in the Green Mountain state,” says Deb Katz of the grassroots Citizens Awareness Network. “With radioactive leaks, lies and systemic mismanagement, Entergy is no poster child for a new generation of nukes.”
Meanwhile, New Jersey may require operators of the aging Oyster Creek reactor to install sizable towers to protect what’s left of the severely damaged Barnegat Bay, which the plant uses for cooling. Though the requirement may not be enforced for as much as seven years, the towers’ high cost could prompt a shutdown of the relatively small plant.
This unending stream of technical, financial and political downfalls could doom the “reactor renaissance” to history’s radioactive dump heap. “President Obama needs to remember what Candidate Obama promised: no more taxpayer subsidies for nuclear power,” said Michael Mariotte, executive director of the Nuclear Information and Resource Service. “Renewables and energy efficiency provide both greater carbon emissions reductions and more jobs per dollar spent than nuclear. Unlike nuclear power, they are relatively quick to install, and are actually safe and clean.”
Indeed, despite congressional and White House support for these latest proposed loan guarantees, the grassroots fight over both old and new nukes grows fiercer by the day.
In the long run, this alleged “nuclear renaissance” could prove to be little more than a rhetorical relapse.
History of Leaks Imperils Reactor License Extension in Vermont
![A portion of a cooling tower at the Vermont Yankee reactor collapsed Wednesday, August 22, 2007.](https://i0.wp.com/www.commondreams.org/files/article_images/vermontcoolingtower.jpg)
A portion of a cooling tower at the Vermont Yankee reactor collapsed Wednesday, August 22, 2007. A broken 52” pipe was photographed spewing water into the ground, in the latest embarrassment for Yankee owner Entergy Corporation, the nation’s second-largest nuclear utility. – © 2010 Boston Globe
Search continues for radioactive Vermont Yankee leak
By Terri Hallenbeck, Free Press Staff Writer • Saturday, January 23, 2010
MONTPELIER — Crews will start drilling more monitoring wells today at Vermont Yankee as the Vernon nuclear power plant’s owners continue the search for the source of a leak of the radioactive isotope tritium.
The tritium leak was revealed three weeks ago after heightened radioactive levels were found in a monitoring well outside the plant. Tests continued this week to show a range of tritium — from 14,000 to 28,100 picocuries per liter, Vermont Yankee spokesman Rob Williams said. The most recent tests were at 20,900, he said.
The Environmental Protection Agency safety standard for tritium in drinking water is 20,000 picocuries per liter.
More heightened levels — as high as 2 million picocuries — have since been found in a concrete trench on the plant’s campus. No heightened levels of radioactive substances have been found in nearby drinking wells that are monitored.
Other radioactive isotopes — cobalt-60 and zinc-65 — have also been found in the trench, which is a locked and confined building where radioactive waste is stored and treated. Williams said Friday that those were found in the Jan. 13 sampling along with tritium.
Neil Sheehan, Nuclear Regulatory Commission spokesman, said 13,000 picocuries per liter of cobalt-60 and 2,460 picocuries per liter of zinc-65 were found in the standing water in the trench. Those are well above federal drinking-water limits of 100 picocuries for cobalt-60 and 300 picocuries for zinc-65. Officials cautioned that the substances aren’t in the drinking water but are in an enclosed part of the nuclear power plant.
Williams said it’s not a surprise to find those other isotopes, nor does the discovery likely help in the search for the source of the leak.
“That’s the kind of material you’d expect to find in the basement of a building that processes nuclear waste,” Williams said. “What we’re looking for is the source of the tritium.”
The leak — and the fact that Vermont Yankee owner Entergy Corp. had told state officials that pipes that might play a role in the leak didn’t exist — have raised concerns about the plant as its owner seeks permission to continue operation for another 20 years after its license expires in 2012. Friday, Vermont’s three-member congressional delegation wrote a letter to the Nuclear Regulatory Commission calling for an immediate investigation.
“This investigation should not only determine whether there was an attempt by Entergy Vermont Yankee to mislead state officials regarding the plant’s safety and underground piping, but also provide a complete and accurate assessment of the full scope of the contamination at and near the plant as soon as possible,” the delegation said.
Meanwhile, Vermont Yankee has constructed a barrier to seal a connection between the trench and an evaporator tank that could be contributing to the accumulation of tritium-contaminated water in the trench, according to the state Health Department, which started posting online updates on the situation Thursday. The seal appears to have stopped condensation of tritium-contaminated water vapor on the corrugated roof over the concrete trench, according to the Health Department.
Engineers are studying the trench’s structural integrity to see if it is contributing to the leak, according to the Health Department.
Also Friday, the NRC extended Entergy’s deadline for spinning off Vermont Yankee and other Northeast nuclear power plants into a new company. Because Entergy has not won approval from Vermont or New York regulators for the move, the deadline has been extended from Jan. 28 to Aug. 1.
ManBearPig Attacked by Science!
By Khephra | Aletho News | January 28, 2010
Today I’d like to more thoroughly address specific planks of Anthropogenic Global Warming theory (AGW) that I think deserve further scrutiny. Over the past year AGW rhetoric has reached deafening levels, and advocates have successfully framed the hypothesis as unassailable. Propagandists have yoked AGW to “wise stewardship,” and today it’s common for skeptics of AGW to be derided as ignorant anti-environmentalists. But I don’t think that things are nearly so simple.
Unfortunately, once people become emotionally invested in a position, it can be very difficult to provoke them into changing course. Liberals and progressives hailed the election of Obama as the most wonderful thing since sliced bread. With a battlefield full of broken promises behind him and the insinuation of institutionalized corruption and illegal forced detentions stretching into the foreseeable future, many of those same liberals and progressives have fallen into an exasperated, listless complacency. They became emotionally invested in the “hope” engendered by Obama, and when the reality failed to live up to the myth, they were forced into cognitive dissonance, apathy, or synthesis. If you meet someone who still supports Obama, dig a little and you’ll find the cognitive dissonance – and, I would argue, the same could be said of supporters of AGW.
To get us started, I think we should rehash the essential assumptions of AGW:
• As atmospheric levels of CO2 increase, Earth’s median temperature increases.
• As Earth’s median temperature increases, atmospheric imbalances precipitate increases in the frequency and strength of weather events (e.g., hurricanes, tornadoes, droughts).
• Humans are directly exacerbating this process through the burning of fossil fuels and any activity that yields CO2 as a byproduct.
• Increased median temperatures are melting the polar ice caps and causing glaciers to recede or vanish.
Since AGW has the pleasant benefit of being a bona fide scientific theory, it makes falsifiable claims. If these claims can be shown to be invalid, the theory is in need of reconsideration. On the other hand, if emotional investment and cognitive dissonance are high enough, no amount of contradictory data will matter. Young Earth Creationists make a fine example of this psychopathology. In spite of overwhelming tangible evidence that their theory is invalid, they fall back on dogma or the Bible – and no amount of science will provoke them into reconsidering their position. Thankfully, AGW is far easier to invalidate than dogma from the Bible, because it makes so many suppositions that are easily testable.
Let’s begin with the most crucial component of AGW – CO2. Here’s a graph of historical global CO2 levels and temperatures. According to the accompanying analysis:
“Current climate levels of both CO2 and global temperatures are relatively low versus past periods. Throughout time, CO2 and temperatures have been radically different and have gone in different directions. As this graph reveals, there is little, if any, correlation between an increase of CO2 and a resulting increase in temperatures.”
If we realize that CO2’s correlation with global temperature is not a given, the entire edifice of AGW begins to crumble. That is why it’s difficult to get adherents of AGW to accept the implications of this data. Again and again they’ll fall back on the assumption that the correlation between CO2 and global temperatures is incontrovertible, but to do so they must avoid an ever-expanding amount of dissonant data:
MIT professor Richard Lindzen’s peer-reviewed work states that “we now know that the effect of CO2 on temperature is small, we know why it is small, and we know that it is having very little effect on the climate.”
The global surface temperature record, which we update and publish every month, has shown no statistically-significant “global warming” for almost 15 years. Statistically-significant global cooling has now persisted for very nearly eight years. Even a strong El Niño – expected in the coming months – will be unlikely to reverse the cooling trend. More significantly, the ARGO bathythermographs deployed throughout the world’s oceans since 2003 show that the top 400 fathoms of the oceans, where it is agreed between all parties that at least 80% of all heat caused by manmade “global warming” must accumulate, have been cooling over the past six years. That now prolonged ocean cooling is fatal to the “official” theory that “global warming” will happen on anything other than a minute scale. – Science & Public Policy Institute: Monthly CO2 Report: July 2009
“Just how much of the “Greenhouse Effect” is caused by human activity?
It is about 0.28% if water vapor is taken into account – about 5.53% if not.
This point is so crucial to the debate over global warming that how water vapor is or isn’t factored into an analysis of Earth’s greenhouse gases makes the difference between describing a significant human contribution to the greenhouse effect, or a negligible one.” – Geocraft
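The arithmetic behind that contrast is easy to reproduce. Below is a minimal sketch, using purely hypothetical placeholder shares of the greenhouse effect (not the Geocraft figures), showing how including or excluding water vapor changes the computed human contribution.

```python
# Toy calculation: how the treatment of water vapor changes the computed
# human share of the greenhouse effect. All shares below are hypothetical
# placeholders, used only to illustrate the mechanics of the calculation.

# Assumed shares of the total greenhouse effect (must sum to 1.0)
water_vapor_share = 0.95        # hypothetical
co2_share         = 0.036       # hypothetical
other_gases_share = 0.014       # hypothetical

# Assumed fraction of atmospheric CO2 attributed to human activity
human_fraction_of_co2 = 0.03    # hypothetical

# Case 1: water vapor counted as part of the greenhouse effect
human_share_with_wv = co2_share * human_fraction_of_co2
print(f"Human share, water vapor included: {human_share_with_wv:.2%}")

# Case 2: water vapor excluded -- CO2's share is rescaled to the
# remaining (non-water-vapor) portion of the effect
co2_share_no_wv = co2_share / (1.0 - water_vapor_share)
human_share_no_wv = co2_share_no_wv * human_fraction_of_co2
print(f"Human share, water vapor excluded: {human_share_no_wv:.2%}")
```

Whatever the exact inputs, the structure of the calculation is what matters: excluding water vapor shrinks the denominator, which inflates the apparent human share by more than an order of magnitude.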
Next, let’s consider the claimed knock-on effects of AGW – for example, rising global temperatures melting the icecaps:
• Climatologists Baffled by Global Warming Time-Out: “Global warming appears to have stalled. Climatologists are puzzled as to why average global temperatures have stopped rising over the last 10 years. Some attribute the trend to a lack of sunspots, while others explain it through ocean currents.”
• ‘AGW – I refute it thus!’: Central England Temperatures 1659 – 2009: “Summary: Unprecedented warming did not occur in central England during the first decade of the 21st century, nor during the last decade of the 20th century. As the CET dataset is considered a decent proxy for Northern Hemisphere temperatures, and since global temperature trends follow a similar pattern to Northern Hemisphere temps, then the same conclusion about recent warming can potentially be inferred globally. Based on the CET dataset, the global warming scare has been totally blown out of proportion by those who can benefit from the fear.”
• 50 Years of Cooling Predicted: “‘My findings do not agree with the climate models that conventionally thought that greenhouse gases, mainly CO2, are the major culprits for the global warming seen in the late 20th century,’ Lu said. ‘Instead, the observed data show that CFCs conspiring with cosmic rays most likely caused both the Antarctic ozone hole and global warming….’
In his research, Lu discovers that while there was global warming from 1950 to 2000, there has been global cooling since 2002. The cooling trend will continue for the next 50 years, according to his new research observations.”
A comparison of GISS data for the last 111 years shows US cities getting warmer while rural sites are not increasing in temperature at all. Urban Heat Islands may be the only areas warming.
Rise of sea levels is ‘the greatest lie ever told’:
If there is one scientist who knows more about sea levels than anyone else in the world it is the Swedish geologist and physicist Nils-Axel Mörner, formerly chairman of the INQUA International Commission on Sea Level Change. And the uncompromising verdict of Dr Mörner, who for 35 years has been using every known scientific method to study sea levels all over the globe, is that all this talk about the sea rising is nothing but a colossal scare story.
Despite fluctuations down as well as up, “the sea is not rising,” he says. “It hasn’t risen in 50 years.” If there is any rise this century it will “not be more than 10cm (four inches), with an uncertainty of plus or minus 10cm”. And quite apart from examining the hard evidence, he says, the elementary laws of physics (latent heat needed to melt ice) tell us that the apocalypse conjured up by Al Gore and Co could not possibly come about.
The reason why Dr Mörner, formerly a Stockholm professor, is so certain that these claims about sea level rise are 100 per cent wrong is that they are all based on computer model predictions, whereas his findings are based on “going into the field to observe what is actually happening in the real world”. – Telegraph.co.uk
Since the early Holocene, according to the findings of the six scientists, sea-ice cover in the eastern Chukchi Sea appears to have exhibited a general decreasing trend, in contrast to the eastern Arctic, where sea-ice cover was substantially reduced during the early to mid-Holocene and has increased over the last 3000 years. Superimposed on both of these long-term changes, however, are what they describe as “millennial-scale variations that appear to be quasi-cyclic.” And they write that “it is important to note that the amplitude of these millennial-scale changes in sea-surface conditions far exceed [our italics] those observed at the end of the 20th century.”
Since the change in sea-ice cover observed at the end of the 20th century (which climate alarmists claim to be unnatural) was far exceeded by changes observed multiple times over the past several thousand years of relatively stable atmospheric CO2 concentrations (when values never strayed much below 250 ppm or much above 275 ppm), there is no compelling reason to believe that the increase in the air’s CO2 content that has occurred since the start of the Industrial Revolution has had anything at all to do with the declining sea-ice cover of the recent past; for at a current concentration of 385 ppm, the recent rise in the air’s CO2 content should have led to a decrease in sea-ice cover that far exceeds what has occurred multiple times in the past without any significant change in CO2. – CO2Science.org
See also:
• The Global Warming Scandal Heats Up: “The IPCC has been forced to admit that the claim made was actually taken from an article published in 1999. The article was based around a telephone interview with an Indian scientist who has admitted that he was working from pure speculation and his claims were not backed by research.”
• The Dam is Cracking: “[The claims of Himalayan glacial melting] turned out to have no basis in scientific fact, even though everything the IPCC produces is meant to be rigorously peer-reviewed, but simply an error recycled by the WWF, which the IPCC swallowed whole.
The truth, as seen by India’s leading expert in glaciers, is that “Himalayan glaciers have not in any way exhibited, especially in recent years, an abnormal annual retreat.” …
Then at the weekend another howler was exposed. The IPCC 2007 report claimed that global warming was leading to an increase in extreme weather, such as hurricanes and floods. Like its claims about the glaciers, this was also based on an unpublished report which had not been subject to scientific scrutiny — indeed several experts warned the IPCC not to rely on it.”
• Arctic Sea Ice Since 2007: “According to the World Meteorological Organization, Arctic sea ice has increased by 19 percent since its minimum in 2007, though they don’t make it very easy to see this in the way that they report the data.”
Now let’s consider some of the agents and institutions that are strong advocates of AGW:
Howard C. Hayden, emeritus professor of physics from the University of Connecticut, told a Pueblo West audience that he was prompted to speak out after a visit to New York where he learned that scaremongering billboards about the long-term effects of global warming were being purchased at a cost of $700,000 a month.
“Someone is willing to spend a huge amount of money to scare us about global warming,” Hayden said. “Big money is behind the global-warming propaganda.”
Lawrence Solomon: Wikipedia’s Climate Doctor:
Connolley took control of all things climate in the most used information source the world has ever known – Wikipedia. Starting in February 2003, just when opposition to the claims of the band’s members was beginning to gel, Connolley set to work on the Wikipedia site. He rewrote Wikipedia’s articles on global warming, on the greenhouse effect, on the instrumental temperature record, on the urban heat island, on climate models, on global cooling. On Feb. 14, he began to erase the Little Ice Age; on Aug. 11, the Medieval Warm Period. In October, he turned his attention to the hockey stick graph. He rewrote articles on the politics of global warming and on the scientists who were skeptical of the band. Richard Lindzen and Fred Singer, two of the world’s most distinguished climate scientists, were among his early targets, followed by others that the band especially hated, such as Willie Soon and Sallie Baliunas of the Harvard-Smithsonian Center for Astrophysics, authorities on the Medieval Warm Period.
All told, Connolley created or rewrote 5,428 unique Wikipedia articles. His control over Wikipedia was greater still, however, through the role he obtained at Wikipedia as a website administrator, which allowed him to act with virtual impunity. When Connolley didn’t like the subject of a certain article, he removed it — more than 500 articles of various descriptions disappeared at his hand. When he disapproved of the arguments that others were making, he often had them barred — over 2,000 Wikipedia contributors who ran afoul of him found themselves blocked from making further contributions. Acolytes whose writing conformed to Connolley’s global warming views, in contrast, were rewarded with Wikipedia’s blessings. In these ways, Connolley turned Wikipedia into the missionary wing of the global warming movement.” – National Post
The ‘ClimateGate’ scandal that broke a couple of months ago warrants some elaboration, too. For previous posts on this topic, see:
• Using ClimateGate to Reason with ManBearPig
• ClimateGate Crashes ManBearPig’s Party
• ManBearPig Meets the Vikings
• ManBearPig on Life Support?
That foundation established, let’s take a closer look at who was involved with ClimateGate:
For a thorough, email-by-email elaboration of exactly what the ‘big deal’ is, see here:
Climategate publicly began on November 19, 2009, when a whistle-blower leaked thousands of emails and documents central to a Freedom of Information request placed with the Climatic Research Unit of the University of East Anglia in the United Kingdom. This institution had played a central role in the “climate change” debate: its scientists, together with their international colleagues, quite literally put the “warming” into Global Warming: they were responsible for analyzing and collating the various measurements of temperature from around the globe and going back into the depths of time, that collectively underpinned the entire scientific argument that mankind’s liberation of “greenhouse” gases—such as carbon dioxide—was leading to a relentless, unprecedented, and ultimately catastrophic warming of the entire planet.
The key phrase here, from a scientific point of view, is that it is “unprecedented” warming.
• The Proof Behind the CRU ClimateGate Debacle: Because Computers Do Lie When Humans Tell Them Too: “As you can see, (potentially) valid temperature station readings were taken and skewed to fabricate the results the “scientists” at the CRU wanted to believe, not what actually occurred.”
• Unearthed Files Include “Rules” for Mass Mind Control Campaign: “The intruded central computer was not only filled to the brim with obvious and attempted ostracizing of scientists who don’t blindly follow the leader, the files also reveal that the folks of the IPCC made use or considered making use of a disinformation campaign through a ‘communication agency’ called Futerra.
The agency describes itself as ‘the sustainability communications agency’ and serves such global players as Shell, Microsoft, BBC, the UN Environment Programme, the UK government and the list goes on. The co-founder of Futerra, Ed Gillespie explains: ‘For brands to succeed in this new world order, they will have to become eco, ethical and wellness champions.’
The document included within the climategate treasure-chest is called ‘Rules of the Game’ and shows deliberate deception on the part of this agency to ensure that the debate would indeed be perceived as being settled. When facts do not convince, they reasoned, let us appeal to emotions in order to get the job done.”
• Climategate goes SERIAL: now the Russians confirm that UK climate scientists manipulated data to exaggerate global warming: “Climategate has already affected Russia. On Tuesday, the Moscow-based Institute of Economic Analysis (IEA) issued a report claiming that the Hadley Center for Climate Change based at the headquarters of the British Meteorological Office in Exeter (Devon, England) had probably tampered with Russian-climate data.
The IEA believes that Russian meteorological-station data did not substantiate the anthropogenic global-warming theory. Analysts say Russian meteorological stations cover most of the country’s territory, and that the Hadley Center had used data submitted by only 25% of such stations in its reports. Over 40% of Russian territory was not included in global-temperature calculations for some other reasons, rather than the lack of meteorological stations and observations.”
• ClimateGate Expanding, Including Russian Data and Another Research Center: “Well now some Russian climate officials have come forward stating that the data they handed over to the Hadley Centre in England has been cherry-picked, leaving out as much as 40% of the cooler temperature readings and choosing the hottest readings to make it appear things were warmer than they actually are (regardless of whether the temperature is human-induced or natural).”
Scientists using selective temperature data, sceptics say:
Two American researchers allege that U.S. government scientists have skewed global temperature trends by ignoring readings from thousands of local weather stations around the world, particularly those in colder altitudes and more northerly latitudes, such as Canada.
In the 1970s, nearly 600 Canadian weather stations fed surface temperature readings into a global database assembled by the U.S. National Oceanic and Atmospheric Administration (NOAA). Today, NOAA only collects data from 35 stations across Canada.
Worse, only one station — at Eureka on Ellesmere Island — is now used by NOAA as a temperature gauge for all Canadian territory above the Arctic Circle.
The Canadian government, meanwhile, operates 1,400 surface weather stations across the country, and more than 100 above the Arctic Circle, according to Environment Canada. – Canada.com
The ClimateGate emails were highly damning, and have led to the resignation of Phil Jones (one of the researchers at the centre of the scandal) and an investigation into Michael Mann’s ‘scholarship’. Furthermore, the UN is also ‘investigating’ the ‘scholarship’ underlying the scandal, but if something as incontrovertible as the Goldstone Report can get whitewashed, I have little hope for a meaningful or just analysis in a scandal of this magnitude. In theory, science is self-correcting; in practice it’s “defend your thesis at all costs”.
Nevertheless, each of our original four suppositions is demonstrably ambiguous – if not outright invalid. Therefore, science – and empiricism – invalidates AGW.
Humanity has irrevocably altered – blighted? – the Earth, but CO2 levels are far less relevant than other forms of industrial pollution: mercury-seeping lightbulbs, dioxin pollution, gene drift, cell phone-induced genetic damage, and all manner of other harmful and silly endeavours pose greater unambiguous threats to humanity than CO2. Therefore, if you really want to help clean up the Earth, leave the AGW rhetoric in the dustbin and let’s get on with disempowering the hegemons.
Coral in Florida Keys suffers lethal hit from cold
By Curtis Morgan | The Miami Herald | January 27, 2010
A diver surveys dead coral at an Upper Keys reef. ‘Ecosystem-wide mortality,’ says Meaghan Johnson of The Nature Conservancy.
Bitter cold this month may have wiped out many of the shallow-water corals in the Keys. Scientists have only begun assessments, with dive teams looking for the “bleaching” that is a telltale indicator of temperature stress in sensitive corals, but initial reports are bleak. The impact could extend from Key Largo through the Dry Tortugas west of Key West, a vast expanse that covers some of the prettiest and healthiest reefs in North America.
Given the depth and duration of frigid weather, Meaghan Johnson, marine science coordinator for The Nature Conservancy, expected to see losses. But she was stunned by what she saw when diving a patch reef 2 ½ miles off Harry Harris Park in Key Largo.
Star and brain corals, large species that can take hundreds of years to grow, were as white and lifeless as bones, frozen to death. There were also dead sea turtles, eels and parrotfish littering the bottom.
“Corals didn’t even have a chance to bleach. They just went straight to dead,” said Johnson, who joined teams of divers last week surveying reefs in the Florida Keys National Marine Sanctuary. “It’s really ecosystem-wide mortality.”
The record chill that gripped South Florida for two weeks has taken a heavy toll on wildlife — particularly marine life.
On Tuesday, the Florida Fish and Wildlife Conservation Commission reported that record numbers of endangered manatees had already succumbed to the cold this year — 77, according to a preliminary review. The previous record, 56, was set last year. Massive fish kills also have been reported across the state. Carcasses of snook and tarpon are still floating up from a large fish kill across Florida Bay and the shallow waters of Everglades National Park.
Many of the Florida Keys’ signature diving destinations such as Carysfort, Molasses and Sombrero reefs — as well as deeper reefs off Miami-Dade and Broward — are believed to have escaped heavy losses, thanks to warming effects of the Gulf Stream. But shallower reefs took a serious, perhaps unprecedented hit, said Billy Causey, Southeast regional director of national marine sanctuaries for the National Oceanic and Atmospheric Administration.
PAST PROBLEMS
Coral-bleaching has struck the Keys in the past, most recently twice in the 1990s, preceding a die-off that claimed 30 percent of the reef tract. But those events, along with others that have hit reefs around the world, have usually been triggered by water hotter than what corals typically tolerate.
Healthy corals depend on a symbiotic relationship between polyps, the living tissues that slowly build the hard outer skeletons that give species distinctive shapes, and algae called zooxanthellae that give them their vibrant colors. But when ocean temperatures veer from their comfort zone too much or too long, the coral begin to shed that algae, turning dull or a bleached bone-white.
The effect usually doesn’t immediately kill coral but can weaken it, slowing growth and leaving fragile reefs — home to millions of fish, crabs and other animals — more vulnerable to diseases, pollution and damage from boaters and divers.
Cold-water bleaching is unusual, last occurring in 1977, the year it snowed in Miami. It killed hundreds of acres of staghorn and elkhorn corals across the Keys. Neither species has recovered, both becoming the first corals to be federally listed as threatened in 2006.
This big chill, said Causey, shapes up worse.
“They were exposed to temperatures much colder, that went on longer, than what they were exposed to three decades ago,” he said.
Typical winter lows inshore hover in the mid- to high 60s in the Keys.
At its coldest more than a week ago, a Key Largo reef monitor recorded 52. At Munson Reef, just about a half-mile off the Newfound Harbor Keys near Big Pine Key, it hit 56.
At Munson Reef, said Cory Walter, a biologist for Mote Marine Laboratory in Summerland Key, scientists saw losses similar to what was reported off Key Largo: dead eels, dead hogfish, dead coral — including big coral heads five to six feet wide, bleached white with only fringes of decaying tissue.
“They were as big, as tall, as me. They were pretty much dead,” said Walter, who coordinates Mote’s BleachWatch program, which monitors reefs.
The dividing line for damage seems to be Hawk Channel, which parallels the Keys on the Atlantic Ocean side.
East of the channel, at reefs such as Looe Key, one of the top tourist sites, there was only light paling on some coral, she said. In Hawk Channel itself, there were dead sponges and stressed corals but not many outright dead ones.
SURVEYING DAMAGE
West of the channel toward shore, damage was more serious. Walter estimated 75 percent coral loss at one patch reef, though with poor visibility, it was a limited survey. Some nurseries growing small staghorn and elkhorn corals for restoration programs also may have been hard hit.
Over the next few weeks, scientists and divers from the Florida Keys National Marine Sanctuary, National Park Service, Florida Fish & Wildlife Conservation Commission, Mote Marine Laboratory, the University of Miami, Nova Southeastern University and other organizations will try to get a more complete picture of damage with reef surveys as far north as Martin County and as far south as the Dry Tortugas.
While they may not be able to save cold-damaged corals, Causey said, chronicling what dies and, more importantly, what survives, will help coral researchers in the future.
“We’re going to know so much more about this event than any other event in history,” he said.
© 2010 Miami Herald Media Company. All Rights Reserved. Contact reporter at cmorgan@MiamiHerald.com
Report finds high rate of thyroid cancer in eastern Pa.; blames nuclear power plants
By REGINA MEDINA | Philadelphia Daily News | January 24, 2010
Residents of eastern Pennsylvania might not know it, but they’re living in the middle of a thyroid-cancer hot spot, according to a public-health advocate.
The eastern side of the state lays claim to six of the nation’s top 18 counties with the highest thyroid-cancer rates, according to figures from the Centers for Disease Control and Prevention.
Pennsylvania ranked as the No. 1 state in thyroid-cancer cases between 2001 and 2005, with 12.8 cases per 100,000 residents. (New Jersey comes in at No. 5 with 11.8 cases per 100,000.)
Joseph Mangano, the executive director of the Radiation and Public Health Project research group, said yesterday that he believes the spike in cancer is due to the high number of nuclear plants in the area.
At a news conference at City Hall where thyroid-cancer survivors and physicians also spoke, Mangano said that within 100 miles of eastern Pennsylvania, 16 nuclear reactors are operating at seven nuclear plants, the highest concentration in the country.
The emissions from the Limerick and Three Mile Island plants don’t come close to those from the 1945 bombing of Hiroshima or the 1986 Chernobyl accident, but “that doesn’t necessarily mean [it’s] safer,” Mangano said.
“Not only have we documented an epidemic of thyroid cancer in the area, but we have raised a red flag for more and more detailed study of the relationship between the reactor emissions and thyroid cancer,” Mangano said.
Mangano, who published his findings in the International Journal of Health Services, said that the only known cause of thyroid cancer is exposure to radiation, specifically radioactive iodine, “one of the 100 man-made chemicals” produced by nuclear energy.
One University of Pennsylvania doctor who has researched thyroid cancer called the findings “provocative” and “intriguing,” but added that the author needed to delve more into the subject.
“We do know nuclear plants give off radioactive iodine [and] radioactive iodine can be associated with thyroid cancer,” said Susan J. Mandel, a professor of medicine and radiology. “Does it mean it causes it? It requires further investigation to see if it’s causing it.”
Lehigh County had the highest thyroid-cancer rate; others in eastern Pennsylvania were: Northampton (3rd), Luzerne (6th), York (7th), Bucks (14th) and Lancaster (18th). In New Jersey, Camden was ranked No. 16 and Burlington was 17th.
Climate science: models vs. observations
By Richard K. Moore | Aletho News | January 16, 2010
This document continues to evolve, based on continuing research. The latest version is always maintained at this URL:
http://rkmdocs.blogspot.com/2010/01/climate-science-observations-vs-models.html
If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence.
— Bertrand Russell, Roads to Freedom, 1918
Science and models
True science begins with observations. When patterns are recognized in these observations, that leads to theories and models, which then lead to predictions. The predictions can then be tested by further observations, which can validate or invalidate the theories and models, or be used to refine them.
This is the paradigm accepted by all scientists. But scientists being people, typically in an academic research community, within a political society, there can be many a slip between cup and lip in the practice of science. There are the problems of getting funding, of peer pressure and career considerations, of dominant political dogmas, etc.
In the case of models there is a special problem that frequently arises. Researchers tend to become attached to their models, both psychologically and professionally. When new observations contradict the model, there is a tendency for the researchers to distort their model to fit the new data, rather than abandoning their model and looking for a better one. Or they may even ignore the new observations, and simply declare that their model is right, and the observations must be in error. This problem is even worse with complex computer models, where it is difficult for reviewers to figure out how the model really works, and whether ‘fudging’ might be going on.
A classic example of the ‘attached to model’ problem can be found in models of the universe. The Ptolemaic model assumed that the Earth is the center of the universe, and that the universe revolves around that center. Intuitively, this model makes a lot of sense. On the Earth, it feels like we are stationary. And we see the Sun and stars moving across the sky. “Obviously” the universe revolves around the Earth.
However, in order for this model to work in the case of the planets, it was necessary to introduce the arbitrary mechanism of epicycles. When Copernicus and Galileo came along, a much cleaner model was presented that explained all the motions without the need for arbitrary assumptions. But no longer would the Earth be the center.
In this case it was not so much scientists that were attached to the old model, but the Church, which liked the model because it fit their interpretation of scripture. We’ve all heard the story of the Bishop who refused to look through the telescope, so he could ignore the new observations and hold on to the old model. Galileo was forced to recant. Thus can political interference hold back the progress of science, and ruin careers.
Climate models and global warming
Over the past century there has been a strong correlation between rising temperatures and rising CO2 levels in the atmosphere, caused by the ever-increasing burning of fossil fuels. And it is well known that CO2 is a greenhouse gas. Other things being equal, higher CO2 levels must cause an increase in temperature, by trapping more of the heat arriving from the sun. Many scientists, quite reasonably, began to explore the theory that continually rising CO2 emissions would lead to continually rising temperatures.
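To put rough numbers on that greenhouse relationship, here is a minimal sketch using the widely cited simplified approximation for CO2 radiative forcing, delta_F = 5.35 * ln(C/C0) W/m². The sensitivity parameter used to convert forcing into a temperature change is an assumed, illustrative value, not a result from this article.

```python
import math

# Simplified CO2 radiative forcing (a widely cited approximation):
#   delta_F = 5.35 * ln(C / C0)   [W/m^2]
# The temperature response uses an assumed sensitivity parameter
# lambda_sens [K per W/m^2]; its value here is illustrative only.

def co2_forcing(c_new_ppm, c_ref_ppm=280.0):
    """Approximate radiative forcing from a CO2 change, in W/m^2."""
    return 5.35 * math.log(c_new_ppm / c_ref_ppm)

def warming_estimate(c_new_ppm, c_ref_ppm=280.0, lambda_sens=0.5):
    """Toy equilibrium warming estimate (K) for an assumed sensitivity."""
    return lambda_sens * co2_forcing(c_new_ppm, c_ref_ppm)

if __name__ == "__main__":
    for ppm in (280, 385, 560):
        print(ppm, "ppm ->",
              round(co2_forcing(ppm), 2), "W/m^2,",
              round(warming_estimate(ppm), 2), "K (assumed sensitivity)")
```

The logarithm is the reason doubling CO2 matters more than any particular increment: each doubling adds roughly the same forcing, and everything beyond that depends on the sensitivity one assumes.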
Intuitively, it seems that the theory is “obviously” true. Temperatures have been rising along with CO2 levels; CO2 is a greenhouse gas; what is there to prove? And if the theory is true, and we keep increasing our emissions, then temperatures will eventually reach dangerous levels, melting the Antarctic ice sheet, raising sea levels, and all the other disasters presented by Al Gore in his famous documentary. “Obviously” we are facing a human-generated crisis – and something has got to be done!
But for many years, before Gore’s film, governments didn’t seem to be listening. Environmentalists, however, were listening. Public concern began to grow about CO2 emissions, and the climate scientists investigating the theory shared these concerns. They had a strong motivation to present the scientific case convincingly, in order to force governments to pay attention and take effective action — the future of humanity was at stake!
The climate scientists began building computer models, based on the observed correlation between temperature and CO2 levels. The models looked solid, not only for the past century, but extending back in time. Research with ice-core data revealed a general correlation between temperature and CO2 levels, extending back for a million years and more. What had been “obvious” to begin with, now looked even more obvious, confirmed by seemingly solid science.
These are the very conditions that typically cause scientists to become attached to their models. The early success of the model confirms what the scientists suspected all along: the theory must be true. A subtle shift happens in the minds of the scientists involved. What began as a theory starts to become an assumption. If new data seem to contradict the theory, the response is not to discard the theory, but rather to figure out what the model is lacking.
In the case of the Ptolemaic model, they figured out that epicycles must be lacking, and so epicycles were added. They were certain the universe revolved around the Earth, and so epicycles had to exist. Similarly, the climate scientists have run into problems with their models, and they’ve needed to add more and more machinery to their models in order to overcome those problems. They are certain of their theory, and so their machinery must be valid.
Perhaps they are right. Or perhaps they’ve strayed into epicycle territory, where the theory needs to be abandoned and a better model needs to be identified. This is the conclusion that quite a few scientists have reached. Experts do differ on this question, despite the fact that Gore says emphatically that the “science is settled”. Which group of scientists is right? This is the issue we will be exploring in this article.
Question 1
Compared to the historical record, are we facing a threat of dangerous global warming?
Let’s look at the historical temperature record, beginning with the long-term view. For long-term temperatures, ice-cores provide the most reliable data. Let’s look first at the very-long-term record, using ice cores from Vostok, in the Antarctic.
Vostok Temperatures: 450,000 BC — Present
Data source:
ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/deutnat.txt
Here we see a very regular pattern of long-term temperature cycles. Most of the time the Earth is in an ice age, and about every 125,000 years there is a brief period of warm temperatures, called an inter-glacial period. Our current inter-glacial period has lasted a bit longer than most, indicating that the next ice age is somewhat overdue. These long-term cycles are probably related to changes in the eccentricity of the Earth’s orbit, which follows a cycle of about 100,000 years.
We also see other cycles of more closely-spaced peaks, and these are probably related to other cycles in the Earth’s orbit. There is an obliquity cycle of about 41,000 years, and a precession cycle, of about 20,000 years, and all of these cycles interfere with one another in complex ways. Here’s a tutorial from NASA that discusses the Earth’s orbital variations:
http://www-istp.gsfc.nasa.gov/stargaze/Sprecess.htm
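As a rough illustration of how several orbital cycles can interfere to produce an irregular-looking pattern, here is a toy sketch that superimposes three sine waves with the periods mentioned above. The amplitudes and phases are arbitrary assumptions, and this is in no sense a climate model.

```python
import numpy as np

# Toy superposition of the three orbital periods mentioned above
# (eccentricity ~100 kyr, obliquity ~41 kyr, precession ~20 kyr).
# Amplitudes and phases are arbitrary; this only illustrates how
# independent cycles can interfere to produce an irregular pattern.

t_kyr = np.arange(0, 450, 1.0)  # time axis, thousands of years

signal = (1.0 * np.sin(2 * np.pi * t_kyr / 100.0) +
          0.5 * np.sin(2 * np.pi * t_kyr / 41.0 + 1.0) +
          0.3 * np.sin(2 * np.pi * t_kyr / 20.0 + 2.0))

# Print a coarse summary of each 90-kyr window instead of plotting
for start in range(0, 450, 90):
    window = signal[start:start + 90]
    print(f"{start:3d}-{start + 90:3d} kyr: "
          f"min {window.min():+.2f}, max {window.max():+.2f}")
```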
Next let’s zoom in on the current inter-glacial period, as seen at Vostok and in Greenland, again using ice-core data. Temperatures here are relative to the value for 1900, which is shown as zero:
Vostok Temperatures: 12,000 BC — 1900
Data source:
http://www.ncdc.noaa.gov/paleo/metadata/noaa-icecore-2475.html
Greenland Temperatures: 9,500 BC — 1900
Here we see that the Southern Hemisphere emerged from the last ice age about 1,000 years earlier than did the Northern Hemisphere. As of 1900, in comparison to the whole inter-glacial period, the temperature was 3°C below the maximum in Vostok, and 3°C below the maximum in Greenland. Thus, as of 1900, temperatures were rather cool for the period in both hemispheres, and in Greenland, temperatures were close to a minimum.
During this recent inter-glacial period, temperatures in both Vostok and Greenland have oscillated through a range of about 4°C, although the patterns of oscillation are quite different in each case. In order to see just how different the patterns are, let’s look at Greenland and Vostok together for the period 500 BC–1900. Vostok is shown with a faint dotted line.
The patterns are very different indeed. In many cases we see an extreme high in Greenland while, at the same time, Vostok is experiencing an extreme low. And in the period 1500–1900, while Greenland temperatures were relatively stable, within a range of 0.5°C, Vostok went through a radical oscillation of 3°C, from an extreme high to an extreme low. These differences between the two hemispheres might be related to the Earth’s orbit (see the NASA tutorial), or they might be related to the fact that the Southern Hemisphere is dominated by oceans, while most of the land mass is in the Northern Hemisphere. Whatever the reason, the difference is striking.
There may be some value in trying to average these different records, to obtain a ‘global average’, but it is important to understand that a global average is not the same as a global temperature. For example, consider temperatures 2,000 years ago. Greenland was experiencing a very warm period, 2°C above the baseline, while Vostok was experiencing a cold spell, nearly 1°C below the baseline. While the average for that time might be near the baseline, that average does not represent the real temperature in either location.
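The masking effect of averaging is easy to demonstrate. The sketch below averages two invented hemispheric anomaly series that move in opposite directions; the numbers are made up purely to illustrate the point and are not the ice-core values.

```python
# Two invented temperature-anomaly series (degrees C relative to a baseline),
# deliberately constructed to move in opposite directions, as the Greenland
# and Vostok records sometimes do.
greenland = [ 2.0,  1.5,  0.5, -0.5, -1.0]
vostok    = [-1.0, -0.8,  0.2,  0.6,  1.2]

global_average = [(g + v) / 2 for g, v in zip(greenland, vostok)]

for g, v, avg in zip(greenland, vostok, global_average):
    print(f"Greenland {g:+.1f}  Vostok {v:+.1f}  ->  average {avg:+.1f}")

# The average hovers near zero even while each location is experiencing
# a pronounced warm or cold excursion -- the average is characteristic
# of nowhere in particular.
```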
This distinction between a global average, and real temperatures, is very important to keep in mind. Consider for example the concern that warming might lead to melting of the tundra in the Arctic, leading to the runaway release of methane. If that happens, it must happen in the Arctic. So it is the temperature in the Arctic that is relevant, not any kind of global average. In Greenland, temperatures 2,000 years ago were a full 2°C higher than 1900 temperatures, and there was no runaway release of methane.
The fact that the global average 2,000 years ago was dragged down by Antarctic cooling is completely irrelevant to the issue of melting tundra. Temperatures in the Arctic must rise by more than 2°C above 1900 levels before tundra-melting might be a problem, and this fact is obscured when we look at the global-average-derived hockey stick put out by the IPCC:
This graph gives the impression that temperatures 2,000 years ago were relatively low, and that temperatures in 1900 were higher than that. This may have some kind of abstract meaning, but it has nothing to do with what’s been going on in the Arctic, and it is very misleading as regards the likelihood of tundra-melting, or Arctic melting in general. The graph is a gross misrepresentation of what’s been happening in the real world. It obscures the actual temperature record in both hemispheres by presenting an artificial average that has existed nowhere.
Let’s now look at some other records from the Northern Hemisphere, to find out how typical the Greenland record is of its hemisphere. This first record is from Spain, based on the mercury content in a peat bog, as published in Science, 1999, vol. 284, for the most recent 4,000 years. Note that this graph is backwards, with present day on the left:
This next record is from the Central Alps, based on stalagmite isotopes, as published in Earth and Planetary Science Letters, 2005, vol. 235, for the most recent 2,000 years:
And finally, let’s include our Greenland record for the most recent 4,000 years:
While the three records are clearly different, they do share certain important characteristics. In each case we see a staggered rise, followed by a staggered decline — a long-term up-and-down cycle over the period. In each case we see that during the past few thousand years, temperatures have been 3°C higher than 1900 temperatures. And in each case we see a gradual descent towards the overdue next ice age. The Antarctic, on the other hand, shares none of these characteristics.
If we want to understand warming-related issues, such as tundra-melting and glacier-melting, we must consider the two hemispheres separately. If glaciers melt, they do so either because of high northern temperatures or high southern temperatures. Whether or not glaciers are likely to melt cannot be determined from global averages. In this article we will concern ourselves with the Northern Hemisphere.
In the Northern Hemisphere, based on the shared characteristics we have observed, temperatures would need to rise at least 3°C above 1900 levels before we would need to worry about things like the extinction of polar bears, the melting of the Greenland ice sheet, or runaway methane release. We know this because none of these things have happened in the past 4,000 years, and temperatures have been 3°C higher during that period.
However, such a 3°C rise seems very unlikely to happen, given that all three of our Northern Hemisphere samples show a gradual but definite decline toward the overdue next ice age. Let’s now zoom in on the temperature record since 1900 and see what kind of rise has actually occurred. Let’s turn to Jim Hansen’s latest article, published on realclimate.org, 2009 temperatures by Jim Hansen. The article includes the following two graphs.
Jim Hansen is of course one of the primary proponents of the CO2-dangerous-warming theory, and there is considerable reason to believe these graphs present an exaggerated picture as regards warming. Here is one article relevant to that point, and it is typical of other reports I’ve seen:
Son of Climategate! Scientist says feds manipulated data
Nonetheless, let’s accept these graphs as a valid representation of recent temperature changes, so as to be as fair as possible to the warming alarmists. We’ll be using the red line, which is from GISS, and which does not use the various extrapolations that are included in the green line. We’ll return to this topic later, but for now suffice it to say that these extrapolations make little sense from a scientific perspective.
The red line shows a temperature rise of 0.7°C from 1900 to the 1998 maximum, a leveling off beginning in 2001, and then a brief but sharp decline starting in 2005. Let’s enter that data into our charting program, using values for each 5-year period that represent the center of the oscillations for that period. Here’s what we get for 1900–2008:
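For readers who want to reproduce that kind of chart, here is a minimal sketch of the step just described. The anomaly values are placeholders standing in for the GISS figures, since the exact per-period values used are not listed here.

```python
# A minimal sketch of the charting step described above: take one
# representative value per 5-year period (the approximate centre of the
# oscillations in that period) and plot them as a series.
# The anomaly values here are hypothetical placeholders, not the GISS data.

periods = list(range(1900, 2010, 5))                     # start year of each 5-year bin
anomalies = [0.0 + 0.007 * (y - 1900) for y in periods]  # fake rise of roughly 0.7 C

try:
    import matplotlib.pyplot as plt
    plt.plot(periods, anomalies, marker="o")
    plt.xlabel("Year (start of 5-year period)")
    plt.ylabel("Temperature anomaly (degrees C, placeholder values)")
    plt.title("Sketch: one representative value per 5-year period")
    plt.show()
except ImportError:
    # Fall back to a plain listing if matplotlib is not installed
    for y, a in zip(periods, anomalies):
        print(y, round(a, 2))
```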
Consider the downward trend at the right end of the graph. Hansen tells us this is very temporary, and that temperatures will soon start rising again. Perhaps he is right. However, as we shall see, his arguments for this prediction are seriously flawed. What we know for sure is that a downward trend has begun. How far that trend will continue is not yet known.
Next, let’s append that latest graph to the Greenland data, to get a reasonable characterization of Northern Hemisphere temperatures from 2000 BC to 2008:
This graph shows us that the temperature rise in the Northern Hemisphere from 1800 to 2005 was not at all unnatural. That rise follows precisely the long-term pattern, where such rises have been occurring approximately every 1,000 years, with no help from human-caused CO2. Based on the long-term pattern of diminishing peaks, we would expect the recent downtrend to continue, and not turn upward again as Hansen predicts. If the natural pattern continues, then the recent warming has reached its maximum, and we will soon experience about two centuries of rapid cooling, as we continue our descent toward the overdue next ice age.
So everything depends on the next decade or so. If temperatures turn upwards again, then the IPCC may be right, and human-caused CO2 emissions may have taken control of climate. However, if temperatures continue downward, then climate has been following natural patterns all along in the Northern Hemisphere. In that case there has been no evidence of any noticeable influence on climate from human-caused CO2, and we are now facing an era of rapid cooling. Within two centuries we could expect temperatures in the Northern Hemisphere to be considerably lower than they were in the recent Little Ice Age.
We don’t know for sure which way temperatures will go, rapidly up or rapidly down. But I can make this statement:
As of this moment, based on the long-term temperature patterns in the Northern Hemisphere, there is no evidence that human-caused CO2 has had any effect on climate. The rise since 1800, as well as the downward dip starting in 2005, are entirely in line with the natural long-term pattern. If temperatures turn sharply upwards in the next decade or so, that will be the first-ever evidence for human-caused warming in the Northern Hemisphere.
As regards the recent downturn, here are two other records, both of which show an even more dramatic downturn than the one shown in the GISS data:
University of Alabama, Huntsville (UAH)
Dr. John Christy
UAH Monthly Means of Lower Troposphere LT5-2
2004 – 2008
Remote Sensing Systems of Santa Rosa, CA (RSS)
RSS MSU Monthly Anomaly – 70S to 82.5N (essentially Global)
2004 – 2008
Based on the data we have looked at, all from mainstream scientific sources, we are now in a position to answer our first question with a reasonable level of confidence:
Answer 1
Temperatures, at least in the Northern Hemisphere, have been continuing to follow natural, long-term patterns — despite the unusually high levels of CO2 caused by the burning of fossil fuels. There have indeed been two centuries of global warming, and that is exactly what we would expect based on the natural pattern. Temperatures now are more than 2°C cooler than they were only 2,000 years ago, which means we have not been experiencing dangerously high temperatures in the Northern Hemisphere.
The illusion of global warming arises from a failure to recognize that global averages are a very poor indicator of actual conditions in either hemisphere.
Within the next decade, or perhaps sooner, we are likely to learn which way the climate is going. If it turns again sharply upwards, as Hansen predicts, that will be counter to the long-term pattern, and evidence for human-caused warming. If it levels off, and continues downwards, that is consistent with long-term patterns, and we are likely to experience about two centuries of rapid cooling in the Northern Hemisphere, as we continue our descent toward the overdue next ice age.
Question 2
Why haven’t unusually high levels of CO2 significantly affected temperatures in the Northern Hemisphere?
One place to look for answers to this question is in the long-term patterns that we see in the temperature record of the past few thousand years, such as the peaks separated by about 1,000 years in the Greenland data, and other more closely spaced patterns that are also visible. Some forces are causing those patterns, and whatever those forces are, they have nothing to do with human-caused CO2 emissions. Perhaps the forces have to do with cycles in solar radiation and solar magnetism, or perhaps they have something to do with cosmic radiation on a galactic scale, or with something we haven’t yet identified. Until we understand what those forces are, how they interfere with one another, and how they affect climate, we can’t really build useful climate models, except on very short time scales.
We can also look for answers in the regulatory mechanisms that exist within the Earth’s own climate system. If an increment of warming happens on the surface, for example, then there is more evaporation from the oceans and more precipitation. While an increment of warming may melt glaciers, it may also cause increased snowfall in the arctic regions. Do these balance each other or not? Increased warming of the ocean’s surface may gradually heat the ocean, but the increased evaporation acts to cool the ocean. Do these balance each other?
Vegetation also acts as a regulatory system. Plants and trees gobble up CO2; that is where their substance comes from. Greater CO2 concentration leads to faster growth, taking more CO2 out of the atmosphere. Until we understand quantitatively how these various regulatory systems function and interact, we can’t even build useful models on a short time scale.
In fact a lot of research is going on, investigating both lines of inquiry. However, in the current public-opinion and media climate, any research not related to CO2 causation is dismissed as the activity of contrarians, deniers, and oil-company hacks. Just as the Bishop refused to look through Galileo’s telescope, so today we have a whole society that refuses to look at many of the climate studies that are available.
I’d like to draw attention to one example of a scientist who has been looking at one aspect of the Earth’s regulatory system. Roy Spencer has been conducting research using the satellite systems that are in place for climate studies. Here are his relevant qualifications:
http://en.wikipedia.org/wiki/Roy_Spencer_(scientist)
Roy W. Spencer is a principal research scientist for the University of Alabama in Huntsville and the U.S. Science Team Leader for the Advanced Microwave Scanning Radiometer (AMSR-E) on NASA’s Aqua satellite. He has served as senior scientist for climate studies at NASA’s Marshall Space Flight Center in Huntsville, Alabama.
He describes his research in a presentation available on YouTube:
http://www.youtube.com/watch?v=xos49g1sdzo&feature=channel
In the talk he gives a lot of details, which are quite interesting, but one does need to concentrate and listen carefully to keep up with the pace and depth of the presentation. He certainly sounds like someone who knows what he’s talking about. Permit me to summarize the main points of his research:
When greenhouse gases cause surface warming, a response occurs, a ‘feedback response’, in the form of changes in cloud and precipitation patterns. The CRU-related climate models all assume the feedback response is a positive one: any increment of greenhouse warming will be amplified by knock-on effects in the weather system. This assumption then leads to the predictions of ‘runaway global warming’.
Spencer set out to determine what the feedback response actually is, by observing what happens in the cloud-precipitation system when surface warming is occurring. What he found, by targeting satellite sensors appropriately, is that the feedback response is negative rather than positive. In particular, he found that the formation of storm-related cirrus clouds is inhibited when surface temperatures are high. Cirrus clouds themselves have a powerful greenhouse effect, and this reduction in cirrus cloud formation compensates for the increase in the CO2 greenhouse effect.
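The stakes of the sign of that feedback can be seen in the standard feedback relation delta_T = delta_T0 / (1 - f), where delta_T0 is the no-feedback warming and f is the feedback factor. The sketch below evaluates it for a few assumed values of f; the values are illustrative and are not Spencer’s results.

```python
# Standard feedback relationship: delta_T = delta_T0 / (1 - f),
# where delta_T0 is the no-feedback response and f is the feedback factor.
# f > 0 amplifies the initial warming, f < 0 damps it.
# The values of f below are illustrative assumptions only.

def total_response(delta_t0, f):
    """Warming after feedback, given no-feedback warming and feedback factor f."""
    if f >= 1.0:
        raise ValueError("f >= 1 implies a runaway response in this simple model")
    return delta_t0 / (1.0 - f)

delta_t0 = 1.0  # assumed no-feedback warming, degrees C
for f in (-0.5, 0.0, 0.3, 0.6):
    print(f"feedback factor {f:+.1f} -> total warming {total_response(delta_t0, f):.2f} C")
```

A negative f, as Spencer argues for, shrinks the response below the no-feedback case, while a strongly positive f is what drives the “runaway warming” scenarios.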
This is the kind of research we need to look at if we want to build useful climate models. Certainly Spencer’s results need to be confirmed by other researchers before we accept them as fact, but to simply dismiss his work out of hand is very bad for the progress of climate science. Consider what the popular website SourceWatch says about Spencer.
We don’t find there any reference to rebuttals of his research, but we are told that Spencer writes columns for a free-market website funded by Exxon. They also mention that he spoke at a conference organized by the Heartland Institute, which promotes lots of reactionary, free-market principles. They are trying to discredit Spencer’s work on irrelevant grounds – what the Greeks referred to as an ad hominem argument. Sort of like, “If he beats his wife, his science must be faulty”.
And it’s true about ‘beating his wife’ — Spencer does seem to have a pro-industry philosophy that shows little concern for sustainability. That might even be part of his motivation for undertaking his recent research, hoping to give ammunition to pro-industry lobbyists. But that doesn’t prove his research is flawed or that his conclusions are invalid. His work should be challenged scientifically, by carrying out independent studies of the feedback process. If the challenges are restricted to irrelevant attacks, that becomes almost an admission that his results, which are threatening to the climate establishment, cannot be refuted. He does not hide his data, or his code, or his sentiments. The same cannot be said for the warming-alarmist camp.
Question 3
What are we to make of Jim Hansen’s prediction that rapid warming will soon resume?
Once again, I refer you to Dr. Hansen’s recent article, 2009 temperatures by Jim Hansen.
Jim begins with the following paragraph:
The past year, 2009, tied as the second warmest year in the 130 years of global instrumental temperature records, in the surface temperature analysis of the NASA Goddard Institute for Space Studies (GISS). The Southern Hemisphere set a record as the warmest year for that half of the world. Global mean temperature, as shown in Figure 1a, was 0.57°C (1.0°F) warmer than climatology (the 1951-1980 base period). Southern Hemisphere mean temperature, as shown in Figure 1b, was 0.49°C (0.88°F) warmer than in the period of climatology.
The Southern Hemisphere may be experiencing warming, but it has 2°C to go before that might become a problem there, and it has nothing to do with the Northern Hemisphere, where temperatures have been declining recently, not setting records for warming. This mathematical abstraction, the global average, is characteristic of nowhere. It creates the illusion of a warming crisis, when in fact no evidence for such a crisis exists. In the context of IPCC warnings about glaciers melting, runaway warming, and so on, this global-average argument serves as deceptive and effective propaganda, but not as science.
Jim continues with this paragraph, emphasis added:
The global record warm year, in the period of near-global instrumental measurements (since the late 1800s), was 2005. Sometimes it is asserted that 1998 was the warmest year. The origin of this confusion is discussed below. There is a high degree of interannual (year-to-year) and decadal variability in both global and hemispheric temperatures. Underlying this variability, however, is a long-term warming trend that has become strong and persistent over the past three decades. The long-term trends are more apparent when temperature is averaged over several years. The 60-month (5-year) and 132-month (11-year) running mean temperatures are shown in Figure 2 for the globe and the hemispheres. The 5-year mean is sufficient to reduce the effect of the El Niño – La Niña cycles of tropical climate. The 11-year mean minimizes the effect of solar variability – the brightness of the sun varies by a measurable amount over the sunspot cycle, which is typically of 10-12 year duration.
As I’ve emphasized, Jim is assuming that there is a strong and persistent warming trend, which he of course attributes to human-caused CO2 emissions. And that assumption then becomes the justification for the 5- and 11-year running averages. Those running averages then give us phantom ‘temperatures’ that don’t match actual observations. In particular, if a downturn is beginning, the running averages will tend to ‘hide the decline’.
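To see why a long running mean responds sluggishly to a recent downturn, here is a small sketch applying a 60-month trailing mean to an invented monthly series that rises and then dips at the end. The series is synthetic and is not the GISS record.

```python
import numpy as np

# Synthetic monthly anomaly series: a slow rise followed by a dip in the
# final few years. Invented data, used only to show how a long running
# mean lags the most recent behaviour of the underlying series.

months = np.arange(0, 360)                       # 30 years of monthly values
series = 0.002 * months                          # gradual rise
series[300:] -= 0.004 * (months[300:] - 300)     # downturn over the last 5 years

def running_mean(x, window):
    """Trailing running mean over `window` samples."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

smoothed = running_mean(series, 60)              # 60-month (5-year) running mean

print("last raw value:    ", round(series[-1], 3))
print("last 60-month mean:", round(smoothed[-1], 3))
# The smoothed endpoint is still dominated by the earlier rise, so the
# recent downturn barely shows up in the running mean.
```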
It seems we are looking at a classic case of over-attachment to model. What began as a theory has now become an assumption, and actual observations are being dismissed as “confusion” because they don’t agree with the model. The climate models have definitely strayed into the land of imaginary epicycles. The assumption of CO2 causation, plus the preoccupation with an abstract global average, creates a warming illusion that has no connection with reality in either hemisphere, as we see in these two graphs from Jim’s article:
As with the Ptolemaic model, there is a much simpler explanation for our recent era of warming, at least in the Northern Hemisphere: long-term patterns are continuing, for whatever reasons, and human-caused CO2 has so far had no noticeable effect. This simpler explanation is based on actual observations, and requires no abstract mathematical epicycles or averages, but it removes CO2 from the center of the climate debate. And just as powerful forces in Galileo’s day wanted the Earth to remain the center of the universe, powerful forces today want CO2 to remain at the center of the climate debate, and global warming to be seen as a threat.
Question 4
What is the real agenda of the politically powerful factions who are promoting global-warming alarmism?
One thing we always need to keep in mind is that the people at the top of the power pyramid in our society have access to the very best scientific information. They control dozens, probably hundreds, of high-level think tanks, able to hire the best minds, and carrying out all kinds of research we don’t hear about. They have access to all the secret military and CIA research, and a great deal of influence over what research is carried out in think tanks, the military, and in universities.
Just because they might be promoting fake science for its propaganda value, that doesn’t mean they believe it themselves. They undoubtedly know that global cooling is the real problem, and the actions they are promoting are completely in line with such an understanding.
Cap-and-trade, for example, won’t reduce carbon emissions. Rather it is a mechanism that allows emissions to continue, while pretending they are declining — by means of a phony market model. You know what a phony market model looks like. It looks like Reagan and Thatcher telling us that lower taxes will lead to higher government revenues due to increased business activity. It looks like globalization, telling us that opening up free markets will “raise all boats” and make us all prosperous. It looks like Wall Street, telling us that mortgage derivatives are a good deal, and we should buy them. And it looks like Wall Street telling us the bailouts will restore the economy, and that the recession is over. In short, it’s a con. It’s a fake theory about what the consequences of a policy will be, when the real consequences are known from the beginning.
Cap-and-trade has nothing to do with climate. It is part of a scheme to micromanage the allocation of global resources, and to maximize profits from the use of those resources. Think about it. Our ‘powerful factions’ decide who gets the initial free cap-and-trade credits. They run the exchange market itself, and can manipulate the market, create derivative products, sell futures, etc. They can cause deflation or inflation of carbon credits, just as they can cause deflation or inflation of currencies. They decide which corporations get advance insider tips, so they can maximize their emissions while minimizing their offset costs. They decide who gets loans to buy offsets, and at what interest rate. They decide what fraction of petroleum will go to the global North and the global South. They have ‘their man’ in the regulation agencies that certify the validity of offset projects. And they make money every which way as they carry out this micromanagement.
In the face of global cooling, this profiteering and micromanagenent of energy resources becomes particularly significant. Just when more energy is needed to heat our homes, we’ll find that the price has gone way up. Oil companies are actually strong supporters of the global-warming bandwagon, which is very ironic, given that they are funding some of the useful contrary research that is going on. Perhaps the oil barrons are counting on the fact that we are suspicious of them, and asssume we will discount the research they are funding, as most people are in fact doing. And the recent onset of global cooling explains all the urgency to implement the carbon-management regime: they need to get it in place before everyone realizes that warming alarmism is a scam.
And then there are the carbon taxes. Just as with income taxes, you and I will pay our full share for our daily commute and for heating our homes, while the big corporate CO2 emitters will have all kinds of loopholes, and offshore havens, set up for them. Just as Federal Reserve theory hasn’t left us with a prosperous Main Street, despite its promises, so theories of carbon trading and taxation won’t give us a happy transition to a sustainable world.
Instead of building the energy-efficient transport systems we need, for example, they’ll sell us biofuels and electric cars, while most of society’s overall energy will continue to come from fossil fuels, and the economy continues to deteriorate. The North will continue to operate unsustainably, and the South will pay the price in the form of mass die-offs, which are already ticking along at the rate of six million children a year from malnutrition and disease.
While collapse, suffering, and die-offs of ‘marginal’ populations will be unpleasant for us, they will give our ‘powerful factions’ a blank canvas on which to construct their new world order, whatever that might be. And we’ll be desperate to go along with any scheme that looks like it might put food back on our tables and warm up our houses.
Author contact – rkm@quaylargo.com
Up in Smoke
Why Biomass Wood Energy is Not the Answer
By George Wuerthner | January 12, 2010
Since the Smurfit-Stone Container Corp.’s linerboard plant in Missoula, Montana announced that it was closing permanently, many people, including Montana Governor Schweitzer, Missoula’s mayor, and Senator Jon Tester, have advocated turning the mill into a biomass energy plant. Northwestern Energy, a company that has expressed interest in using the plant for energy production, has already indicated that it would expect more wood from national forests to make the plant economically viable.
The Smurfit-Stone conversion is not an isolated case. There has been a spate of proposals for new wood-burning biomass energy plants sprouting across the country like mushrooms after a rain. Currently there are plans and/or proposals for new biomass power plants in Maine, Vermont, Pennsylvania, Florida, California, Idaho, Oregon and elsewhere. In every instance, these plants are being promoted as “green” technology.
Part of the reason for this “boom” is that taxpayers are providing substantial financial incentives, including tax breaks, government grants, and loan guarantees. The rationale for these taxpayer subsidies is the presumption that biomass is “green” energy. But like other “quick fixes,” biomass has received very little serious scrutiny of its real costs and environmental impacts, and whether commercial biomass is a viable alternative to traditional fossil fuels is questionable.
Before I get into this discussion, I want to state right up front that coal and the other fossil fuels that now provide much of our electrical energy need to be reduced and effectively replaced. But biomass energy is not the way to accomplish that goal.
BIOMASS BURNING IS POLLUTION
First and foremost, biomass burning isn’t green. Burning wood produces huge amounts of pollution. Especially in valleys like Missoula where temperature inversions are common, pollution from a biomass burner will be the source of numerous health ailments. Because of the air pollution and human health concerns, the Oregon Chapter of the American Lung Association, the Massachusetts Medical Society, and the Florida Medical Association have all established policies opposing large-scale biomass plants.
The reason for this medical concern is that even with the best pollution control devices, biomass energy is extremely dirty. For instance, one of the biggest biomass burners now in operation, the McNeil biomass plant in Burlington, Vermont, is the number one pollution source in the state, emitting 79 classified pollutants. Biomass burning releases dioxins and as much particulate matter as coal burning, plus carbon monoxide, nitrogen oxide, and sulfur dioxide, and it contributes to ozone formation. […]
BIOMASS ENERGY IS INEFFICIENT
Wood is not nearly as concentrated a heat source as coal, gas, oil, or any other fossil fuel. Most biomass energy operations are able to capture only 20-25% of the energy in the wood they burn. That means one needs to gather and burn far more wood to get the same energy value as a more concentrated fuel like coal. That is not to suggest that coal is a good alternative; rather, wood is a worse one, especially when you consider the energy used to gather such a dispersed fuel source and the energy cost of trucking it to a central energy plant. If the entire carbon footprint of wood is considered, biomass creates far more CO2, with far less energy output, than other energy sources.
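To see why low energy density and low conversion efficiency compound each other, here is a minimal back-of-envelope sketch. The heating values and the coal-plant efficiency below are illustrative assumptions, not figures from the article; only the 20-25% range for wood-fired plants comes from the text.

```python
# Illustrative back-of-envelope only. All constants are assumptions except
# the wood-plant efficiency, which sits inside the article's 20-25% range.

WOOD_LHV_GJ_PER_T = 10.0   # assumed: green wood chips (~50% moisture)
COAL_LHV_GJ_PER_T = 24.0   # assumed: bituminous coal
WOOD_EFFICIENCY = 0.22     # within the article's 20-25% range
COAL_EFFICIENCY = 0.35     # assumed: conventional coal steam plant

GJ_PER_MWH = 3.6           # exact unit conversion

def fuel_tonnes_for(mwh_electric, lhv_gj_per_t, efficiency):
    """Tonnes of fuel needed to deliver a given electrical output."""
    thermal_gj = mwh_electric * GJ_PER_MWH / efficiency
    return thermal_gj / lhv_gj_per_t

if __name__ == "__main__":
    mwh = 1000  # one GWh of electricity
    wood = fuel_tonnes_for(mwh, WOOD_LHV_GJ_PER_T, WOOD_EFFICIENCY)
    coal = fuel_tonnes_for(mwh, COAL_LHV_GJ_PER_T, COAL_EFFICIENCY)
    print(f"Wood needed: {wood:,.0f} t; coal needed: {coal:,.0f} t "
          f"({wood / coal:.1f}x more fuel mass for wood)")
```

Under these assumptions, delivering the same gigawatt-hour of electricity takes roughly four times the mass of green wood as of coal, before counting the fuel burned gathering and trucking it.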
The McNeil Biomass Plant in Burlington, Vermont seldom runs full time because, even with all the subsidies (and Vermonters have made huge and repeated subsidies to the plant, not counting “hidden subsidies” like air pollution), wood energy can’t compete with other energy sources, even in the Northeast where energy costs are among the highest in the nation. Even though the plant was retrofitted so it could also burn natural gas to improve its competitiveness, it still does not operate competitively. It is generally used only to offset peak energy loads.
One could argue, of course, that other energy sources like coal are greatly subsidized as well, especially if all environmental costs were considered. But at the very least, all energy sources must be “standardized” so that consumers can make informed decisions about energy—and biomass energy appears to be no more green than other energy sources.
BIOMASS SANITIZES AND MINES OUR FORESTS
The dispersed nature of wood as a fuel source combined with its low energy value means any sizable energy plant must burn a lot of wood. For instance, the McNeil 50 megawatt biomass plant in Burlington, Vermont would require roughly 32,500 acres of forest each year if running at near full capacity and entirely on wood. Wood for the McNeil Plant is trucked and even shipped on trains from as far away as Massachusetts, New Hampshire, Quebec and Maine.
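A rough calculation shows why a 50 MW plant translates into tens of thousands of acres a year. In the sketch below, the capacity factor, wood heating value, and per-acre yield are illustrative assumptions; only the 50 MW rating and the 20-25% efficiency range come from the article.

```python
# Back-of-envelope only; every constant except plant size and efficiency
# is an illustrative assumption.

PLANT_MW = 50              # from the article (McNeil plant)
CAPACITY_FACTOR = 0.90     # assumed: "running at near full capacity"
EFFICIENCY = 0.25          # upper end of the article's 20-25% range
WOOD_LHV_GJ_PER_T = 10.0   # assumed: green wood chips
YIELD_T_PER_ACRE = 20.0    # assumed: green tonnes removed per acre harvested

HOURS_PER_YEAR = 8760
GJ_PER_MWH = 3.6

electric_mwh = PLANT_MW * HOURS_PER_YEAR * CAPACITY_FACTOR
thermal_gj = electric_mwh * GJ_PER_MWH / EFFICIENCY
wood_tonnes = thermal_gj / WOOD_LHV_GJ_PER_T
acres = wood_tonnes / YIELD_T_PER_ACRE

print(f"~{wood_tonnes:,.0f} green tonnes of wood per year, "
      f"or ~{acres:,.0f} acres harvested")
```

With these assumed values the result comes out near 28,000 acres a year, the same order of magnitude as the article’s roughly 32,500 acres; different yield or moisture assumptions shift the number, but not its scale.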
Biomass proponents often suggest that wood [gathered] as a consequence of forest thinning to improve “forest health” (logging a forest to improve the health of a forest ecosystem is an oxymoron) will provide the fuel for plant operations. For instance, one of the assumptions of Senator Tester’s Montana Forest Jobs bill is that thinned forests will provide a ready source of biomass for energy production. But in many cases there are limits on the economic viability of trucking wood any distance to a central energy plant; again, without huge subsidies, this simply does not make economic sense. Biomass harvesting is also even worse for forest ecosystems than clear-cutting. Biomass energy tends to utilize the entire tree, including the bole, crown, and branches, which robs the forest of nutrients and disrupts energy cycles.
Worse yet, such biomass removal ignores the important role of dead trees in sustaining forest ecosystems. Dead trees are not a “wasted” resource. They provide homes and food for thousands of species, including 45% of all bird species in the nation. Dead trees that fall to the ground are used by insects, small mammals, amphibians and reptiles for shelter and even food. Dead trees that fall into streams are important physical components of aquatic ecosystems and provide critical habitat for many fish and other aquatic species. Removal of dead wood is mining the forest.
Keep in mind that logging activities are not benign. Logging typically requires some kind of access, often roads, which are a major source of sedimentation in streams and disrupt natural subsurface water flow. Logging can disturb sensitive wildlife; grizzly bears and even elk are known to abandon areas with active logging. Logging can spread weeds. And finally, since large amounts of forest carbon are actually tied up in the soils, soil disturbance from logging is especially damaging, often releasing substantial additional carbon over and above what goes up a smokestack.
BIOMASS ENERGY USES LARGE AMOUNTS OF WATER
A large-scale biomass plant (50 MW) uses close to a million gallons of water a day for cooling. Most of that water is lost from the watershed, since approximately 85% escapes as steam. Water channeled back into a river or stream typically carries a pollution cost as well, including higher water temperatures that negatively impact fisheries, especially trout. And since cooling needs are greatest in warm weather, water is removed from rivers just when flows are lowest and fish are most susceptible to temperature stress.
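For scale, here is a hedged sketch of the consumptive loss implied by those two figures; the daily withdrawal and the 85% evaporation share come from the article, and the arithmetic is the only addition.

```python
# Simple arithmetic on the article's figures: ~1,000,000 gallons/day of
# cooling water for a 50 MW plant, with ~85% lost as steam rather than
# returned to the river.
GALLONS_PER_DAY = 1_000_000
EVAPORATED_FRACTION = 0.85

daily_loss = GALLONS_PER_DAY * EVAPORATED_FRACTION
annual_loss = daily_loss * 365

print(f"~{daily_loss:,.0f} gallons/day, ~{annual_loss:,.0f} gallons/year "
      "lost from the watershed")
```

That works out to roughly 850,000 gallons a day, on the order of 300 million gallons a year that never return to the stream.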
BIOMASS ENERGY SAPS FUNDS FROM OTHER TRULY GREEN ENERGY SOURCES LIKE SOLAR
Since biomass energy is eligible for state renewable portfolio standards (RPS), it has captured the bulk of funding intended to move the country away from fossil fuels. For example, in Vermont, 90% of the RPS is from “smokestack” sources—mostly biomass incineration. This pattern holds throughout many other parts of the country. Biomass energy is thus burning up funds that could and should be going into other energy programs like energy conservation, solar and insulation of buildings.
PUBLIC FORESTS WILL BE LOGGED FOR BIOMASS ENERGY
Many of the climate bills now circulating in Congress, as well as Montana Senator Jon Tester’s Montana Jobs and Wilderness bill, target public forests. Some of these proposals even include roadless lands and proposed wilderness as sources of wood biomass. One federal study suggests that 368 million tons of wood could be removed from our national forests every year; of course, this study did not consider the ecological costs that removing that much wood would have on forest ecosystems.
The Biomass Crop Assistance Program, or BCAP, which was quietly put into the 2008 farm bill, has so far given away more than a half billion dollars in matching payments to businesses that cut and collect biomass from national forests and Bureau of Land Management lands. And according to a recent Washington Post story, the Obama administration has already sent $23 million to biomass energy companies, and is poised to send another half billion.
And it is not only federal forests that are in jeopardy. Many states are eyeing their own state forests for biomass energy. For instance, Maine recently unveiled a plan, known as the Great Maine Forest Initiative, that would pay timber companies to grow trees for biomass energy.
JOB LOSSES
Ironically, one of the main justifications for biomass energy is job creation, yet the wood biomass rush is having unintended consequences for other forest-products industries. Companies that rely on surplus wood chips to produce fiberboard, cabinetry, and furniture are scrambling to find wood fiber for their products. Since these industries are secondary producers, the biomass rush could threaten more jobs than it creates.
BOTTOM LINE
Large-scale wood biomass energy is neither green nor truly economical. It is not ecologically sustainable, and it jeopardizes our forest ecosystems. It is a distraction that funnels funds and attention away from more worthwhile energy options, in particular a massive energy-conservation program and the changes in our lifestyles that will, in the end, provide truly green alternatives to coal and other fossil fuels.
George Wuerthner is a wildlife biologist and a former Montana hunting guide. His latest book is Plundering Appalachia.
DOE grants moratorium on safety inspections for nuclear weapons labs
Project on Government Oversight | January 7, 2010
If your kid accidentally blew apart a building, would you give them less supervision? This hands-off approach is exactly what the Department of Energy’s National Nuclear Security Administration (NNSA) is doing by giving the contractors who manage the nation’s eight nuclear weapons sites (Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Nevada Test Site, Sandia National Laboratory, Savannah River Site, Pantex, Y-12, and the Kansas City Plant) a six-month break from many regularly scheduled oversight reviews.
On December 18, 2009, two days after researchers at the Los Alamos National Laboratory (LANL) accidentally blew apart a building, causing an initial damage estimate of $3 million, NNSA Administrator Tom D’Agostino signed a directive “placing a six-month moratorium on NNSA-initiated functional assessments, reviews, evaluations and inspections.” The Project on Government Oversight (POGO) saw this directive coming, as DOE and NNSA have been pursuing reforms, under the banner “Reforming the Nuclear Security Enterprise,” that put contractors in charge of their own oversight. POGO is not convinced that this moratorium is truly temporary, and it is interested to know what NNSA plans to do with all of the federal full-time employees at the site offices and headquarters that it no longer needs as a result of this directive.
Many of the areas that have had recent serious problems are getting a hiatus from regular reviews: security, nuclear safety, cyber security, Material Control and Accountability (MC&A), contractor assurance systems that relate to contract oversight, property accountability, and nuclear weapons quality. For example, the weaknesses in Los Alamos’s MC&A program were so significant that it took NNSA more than a year and a half to resolve them. Additionally, it was NNSA, not the contractor, that found that Los Alamos had treated its loss of more than 67 computers merely as a property management issue, and not as a potential lapse in cyber security. Over the last few years, POGO has also documented countless security and safety incidents at Los Alamos, Livermore National Laboratory, Pantex, and Y-12 that the contractors had not provided sufficient oversight to prevent or resolve. In the past, a senior manager at Los Alamos and his sidekick went to jail when the procurement system got out of control. Now the directive exempts Los Alamos from a procurement management review.
“It seems foolish for NNSA to abdicate its management, given the last few years of debacles at the labs,” says POGO Senior Investigator Peter Stockton. “NNSA needs to recognize its role in overseeing the labs, as that was one of the major reasons it was created.”
NNSA’s new approach to federal safety and security oversight, stopping it in its entirety for six months, is irresponsible. POGO would instead like to see NNSA make a New Year’s resolution to conduct smarter, more rigorous oversight of its labs. Such a move could prevent some of the costly contractor errors that occurred in 2009, such as Los Alamos’s Plutonium Facility (PF-4) having to halt its main operations for more than a month, once again, because of the contractor’s and NNSA’s inadequate oversight of its fire suppression system.