It has been 17 years since the U.S. Department of Energy (DOE) began cleaning up the Cold War-era nuclear weapons plant, Savannah River Site, in South Carolina, and at the current pace, it may be another 30 years before the work is completed.
That fact does not sit well with state officials who are now threatening to levy an enormous fine on DOE for not keeping to its original deadline of fixing the mess by 2023.
A key aspect of the project, which started in 1996, is to turn liquid radioactive bomb waste into a solid that can be safely stored for millennia while its radiation decays.
It’s important to make this conversion sooner rather than later because the toxic waste now sits in huge underground tanks (that hold anywhere from 750,000 to 1.3 million gallons each) that have been in use since the 1950s.
If the federal government takes until the 2040s to finish the remediation, it means the tanks will need to hold up for 90 years.
“I don’t know what the tanks’ design life was intended to be, but it’s not for infinity,” Catherine B. Templeton, South Carolina’s top environmental official, told The New York Times.
“We have to get that waste out of the tanks so it’s not Fukushima, so you don’t have the groundwater interacting with the waste and running off,” she added, referring to the radioactive water flowing from the Fukushima Daiichi plant in Japan and into the ocean.
To prod the DOE into moving faster, the state is threatening to impose $154 million in fines if the department fails to finish the project within nine years.
Energy officials say the slowdown couldn’t be helped, citing budget cuts from sequestration and other decisions by Congress that reduced the amount of money flowing to the Savannah River cleanup operation.
“There’s only so much to go around,” Terrel J. Spears, DOE’s assistant manager for waste disposition at the site, told the Times. “We can’t increase the budgets. Now we have to balance the budgets.”
Many of the people who were forced to evacuate after the 2011 triple meltdown at the Fukushima nuclear power plant may never return, Japanese lawmakers have admitted, reversing the government’s initial optimistic pledges.
A call to admit the grim reality and step back from the ambitious Fukushima decontamination goals came from Prime Minister Shinzo Abe’s coalition parties. Japan has so far spent $30 billion on the clean-up program, which has proven to be more difficult to carry out than initially expected.
The new plan would be for the government to fund relocation to new homes for those who used to live in the most contaminated areas.
“There will come a time when someone has to say, ‘You won’t be able to live here anymore, but we will make up for it’,” Shigeru Ishiba, secretary-general of Abe’s Liberal Democratic Party, said in a speech earlier this month.
On Tuesday, evacuees reacted with anger at the government’s admission.
“Politicians should have specified a long time ago the areas where evacuees will not be able to return, and presented plans to help them rebuild their lives elsewhere,” Toshitaka Kakinuma, a 71-year-old evacuee, told the Asahi Shimbun newspaper.
Some 160,000 people fled the vicinity of Fukushima Daiichi when a powerful earthquake and tsunami triggered the world’s worst nuclear disaster since Chernobyl. About a third of them are still living in temporary housing, despite promises that this would last no longer than three years.
In August the death toll among the evacuees surpassed 1,599, the number of people in the prefecture killed by the disaster itself. The displaced residents suffer from health problems, alcoholism and high rates of suicide.
The Ministry of the Environment planned to decontaminate 11 townships in the affected area, bringing the average annual radiation dose down to 20 millisieverts, a level deemed acceptable by the International Commission on Radiological Protection. It further pledged to pursue a long-term goal of reducing the dose to 1 millisievert per year.
The clean-up, however, has been marred by delays and by reports that workers sometimes simply dumped contaminated waste rather than collecting it for safe storage, prompting the environment ministry to push back the deadline. Some are also calling on the government to abandon the more ambitious dose target, arguing that it is unrealistic.
Some evacuees said they wouldn’t return even after the first phase of the cleanup, saying the dose of 20 millisieverts per year still poses health risks.
“No matter how much they decontaminate I’m not going back because I have children and it is my responsibility to protect them,” Yumi Ide, a mother of two teenage boys, told Reuters.
The fear of radiation has soared in Japan in the wake of the Fukushima disaster, with rallies against the use of nuclear power scoring record attendance. The government shut down all 50 remaining Japanese reactors for safety checks, and there is strong pressure to keep them offline.
The Japanese government is reportedly seeking to borrow an extra $30 billion for the Fukushima cleanup and compensation, which would raise the total cost of the disaster response to $80 billion. The figure does not include the cost of decommissioning the reactors, to be carried out by the plant operator, Tepco. The company recently complained about the huge expense of the process, which may last at least 30 years.
Can the world’s biggest corporations act with impunity? When it comes to General Electric (GE) — the eighth-largest U.S. corporation, with $146.9 billion in sales and $13.6 billion in profits in 2012 — the answer appears to be “yes.”
Let us begin with a small-scale case in upstate New York, where in late September 2013 GE announced that it would close its electrical capacitor plant in the town of Fort Edward. Some two hundred workers will lose their jobs and, thereafter, will have little opportunity to obtain comparable wages, pensions, or even employment in this economically distressed region. Ironically, the plant has been highly profitable. Earlier in the year, the local management threw a party to celebrate a record-breaking quarter. But the high-level financial dealings of a vast multinational operation like GE are mysterious, and the company merely announced that the Fort Edward plant was “non-competitive.” The United Electrical Workers (UE), the union that has represented the workers there for the past seventy years, has already begun a vigorous campaign of resistance to the plant closing, but it is sure to be an uphill battle.
If we dig deeper into the record, a broader pattern of corporate misbehavior emerges. Indeed, the Fort Edward factory is one of two GE plants that polluted the communities at Fort Edward and nearby Hudson Falls, as well as a 197-mile stretch of the Hudson River, with 1.3 million pounds of cancer-causing PCBs for several decades. Worried about the dangers of PCBs, workers asked managers about them, and were told that these toxins were perfectly safe — in fact, that the workers should rub the PCBs on their heads to combat baldness! When the extent of this environmental disaster began to be revealed in the 1970s, GE began a lengthy campaign to deny it and, later, a multimillion-dollar public relations campaign to prevent remedial action by the Environmental Protection Agency. GE lost this battle, for the EPA insisted upon the dredging of the Hudson River and ordered GE to pay for it. Thus, the Hudson Valley became the largest Superfund cleanup site in the United States, with a project that will take decades to complete.
GE has produced other environmental disasters, as well. Three GE nuclear reactors at the Fukushima Daiichi nuclear power site in Japan melted down in March 2011. This was the world’s worst nuclear accident in three decades, and it quickly spread radioactive contamination nearly one hundred fifty miles. Indeed, the stricken reactors are still sending three hundred tons of radioactive water a day flooding into the Pacific Ocean. Dr. Helen Caldicott, who has studied nuclear power for decades, has estimated that up to 3.5 million people could eventually die from cancer thanks to the Fukushima radiation release. In the late 1960s and early 1970s, when these boiling water nuclear reactors were installed, GE’s engineers and management knew that their design was flawed. But the company kept selling them to unsuspecting utilities around the world, including many in the United States. As a result, there are still thirty-five GE boiling water reactors operating in this country, most of them located near population centers east of the Mississippi River. Currently, in fact, more than 58 million Americans live within fifty miles of a GE nuclear reactor.
Another of GE’s major products is the export of jobs. According to an extensive New York Times report on GE in March 2011: “Since 2002, the company has eliminated a fifth of its work force in the United States while increasing overseas employment.” By the end of 2010, another study found, 54 percent of GE’s 287,000 employees worked abroad. Not surprisingly, the company’s overseas operations in that year provided most of its total revenue. Responding to GE’s claim that it had created thousands of new jobs in the United States during the Obama administration, Chris Townsend, the political action director of the UE, produced a list of 40 U.S. plants the company closed during the same period.
Townsend also noted that, even when GE kept its operations going in the United States, it slashed wages, sometimes by as much as 45 percent at a time. For example, the work of the Fort Edward plant will be moved to Clearwater, Florida, a non-union site where GE pays many workers $12 an hour and hires others through a temp agency at $8 an hour — little more than the minimum wage.
Although technically a U.S. corporation, GE — with operations in 130 nations — apparently feels little loyalty to the United States. Jack Welch, a former GE CEO, once remarked: “Ideally, you’d have every plant you own on a barge to move with currencies and changes in the economy.” According to a Bloomberg analysis, to avoid paying U.S. taxes, GE keeps more of its profits overseas than any other U.S. company — $108 billion by the end of 2012. Most of these profits, GE declared, would be invested in its foreign business enterprises. Thanks to this tax dodge and others, GE reportedly paid an average annual U.S. corporate income tax rate of only 1.8 percent between 2002 and 2011. In 2010, when GE reported worldwide profits of $14.2 billion, it paid no U.S. corporate income tax at all. Instead, it claimed a tax benefit of $3.2 billion. This is a sweet deal for that giant corporation, for the official corporate tax rate is 35 percent.
Despite this appalling record, the U.S. government has been very generous to GE. During the financial crisis of 2008-2009, the federal government’s Temporary Liquidity Guarantee Program loaned approximately $85 billion to GE Capital, the company’s huge finance arm that accounts for roughly half of GE’s profits. GE needed the bailout because, among other reasons, GE Capital was marketing subprime mortgages, making GE the tenth-largest subprime lender in the United States. The Federal Reserve also bought $16.1 billion worth of short-term corporate i.o.u.’s from GE in late 2008, when the public market for this kind of debt had nearly frozen, and GE became one of the largest beneficiaries of this federal program. In yet a further indication of GE’s influence, President Obama appointed Jeffrey Immelt, GE’s CEO, as chair of his Council on Jobs and Competitiveness, which strategizes about how to revive America’s manufacturing base. One of Immelt’s favorite panaceas is to end taxes on the overseas profits of corporations.
Thus, it might seem that those two hundred embattled workers at Fort Edward have no possibility at all of effectively challenging a corporation this wealthy and influential. But stranger things have happened in the United States — especially when Americans have had their fill of corporate arrogance.
The Associated Press ran an alarming news piece on 9/6/13: “Climate Change Threatens Caribbean’s Water Supply”
It was picked up and echoed around the world, from Time Magazine’s Space and Science section in the US to CBC Canada to ABC Australia to ZeeNews India. The headline was everywhere, repeated at the Huffington Post as “Caribbean water supplies severely threatened by climate change.” The AP story reported on expert warnings delivered at an August 2013 UN conference in St. Lucia. The lead AP paragraph is quite clear:
“Experts are sounding a new alarm about the effects of climate change for parts of the Caribbean—the depletion of already strained drinking water throughout much of the region.”
Experts like Avril Alexander, Caribbean coordinator of Global Water Partnership:
“When you look at the projected impact of climate change, a lot of the impact is going to be felt through water.”
Experts like Lystra Fletcher-Paul, Caribbean land/water officer for the UN FAO:
“Inaction is not an option. The water resources will not be available.”
Yet another anthropogenic global warming alarm, and just in time for IPCC AR5, whose newly released WG1 chapters 7 and 11 say there is high confidence that dry regions will get drier, wet regions will get wetter, and storms will get stormier. “But there is only low confidence in the magnitude.” These Caribbean experts are much more certain—Caribbean water resources will not be available.
Little in this MSM AP news is what it seems. Paragraph 2 starts out saying rising sea levels could contaminate Caribbean fresh water supplies. What a curious assertion. Less dense fresh water floats on top of salt water no matter the sea level. Excessive groundwater draw-down can cause saltwater intrusion from below. That is already a problem in urbanized Broward County, Florida despite proximity to the Everglades. And on the Tuvalu atolls in the Pacific, where government-owned tourist hotels have strained the atolls’ very limited groundwater capacity. Tuvalu is another urban development problem, not AGW. It was caused by Tuvalu’s government itself, eager to develop ecotourism (diving) after their new Funafuti runway was built with World Bank financing.
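The buoyancy point can be made quantitative with the Ghyben-Herzberg relation, a standard idealization for coastal aquifers (not part of the AP story; a minimal sketch using nominal densities). The freshwater lens extends below sea level roughly 40 times as far as the water table stands above it, which is why over-pumping, not sea level, is what pulls salt water up into wells:

```python
# Ghyben-Herzberg relation (idealized coastal aquifer): fresh water floats
# on salt water, and the depth of the fresh/salt interface below sea level
# is proportional to the water-table height above sea level.
RHO_FRESH = 1000.0  # kg/m^3, nominal fresh water density
RHO_SALT = 1025.0   # kg/m^3, nominal sea water density

def lens_depth_below_sea_level(head_above_sea_m: float) -> float:
    """Depth (m) of the freshwater lens below sea level for a given head."""
    return head_above_sea_m * RHO_FRESH / (RHO_SALT - RHO_FRESH)

print(lens_depth_below_sea_level(1.0))  # 40.0 m of fresh water per 1 m of head
print(lens_depth_below_sea_level(0.5))  # 20.0 m: halving the head halves the lens
```

Lowering the water table by half a metre through pumping thus raises the salt interface by some twenty metres; the contamination mechanism runs from below, not from rising seas.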
Saltwater intrusion doesn’t apply much to Caribbean island groundwater. The islands are mountainous. Pico Duarte in the DR is 3098m. Pic la Selle in Haiti is 2680m. Jamaica’s Blue Mountain is 2256m. Cuba’s Pico Turquino is 1974m. Antigua’s ‘Boggy Peak’ is 402m. St. Croix’s ‘Mount Eagle’ is 355m. Barbados is only hilly, with a maximum elevation of ‘just’ 343m.
Rising sea levels will not contaminate Caribbean fresh water supplies.
The AP reported that Jason Johnson, head of the Caribbean Water and Wastewater Association, said the real issue is groundwater replenishment.
“Many Caribbean nations rely exclusively on underground water for their needs, a vulnerable source that would be hit hard by climate change effects. That’s the greatest concern. Those weather patterns may change, and there may not necessarily be the means for those water supplies to be replenished at the pace that they have historically been replenished.”
The AP noted some of the islands experienced an unusual dry spell in 2012. That’s weather. But Cedric Van Meerbeck, climatologist with the Caribbean Institute for Meteorology & Hydrology, made the inevitable AGW connection:
“There are a number of indications that the total amount of rainfall in much of the Caribbean would be decreasing by the end of the century.”
Since 2012 was dry, and the AR5 WG1 Chapter 7 executive summary says dry will get drier, perhaps IPCC pronouncements are the indications. But regionally down-scaled GCMs cannot make such predictions on multi-decadal time scales. [1]
Intense rains ended the unusual 2012 dry spell early in the usual 2013 Caribbean tropical storm season. AR5 WG1 7.6.2 also says wet will get wetter and storms stormier. That worries Barnard Ettinoffe, President of the Caribbean Water and Sewerage Association:
“Heavy rains mean there’s not enough time for water to soak into the ground as it quickly runs off.”
Climate change causes dry to get drier and wet to get wetter according to AR5 WG1 11.3.2.3.1. It threatens Caribbean island water supply both ways!
What is actually going on was hinted at in the lead AP paragraph above: depletion of already strained water supplies throughout much of the region.
“Much of the region” is not correct. The AP story cites a 2012 study from British investment risk firm Maplecroft [2] saying Barbados is most at risk, but that Cuba and the Dominican Republic also have high water security risk. On the large island of Hispaniola, the Dominican Republic has 2069m3 of renewable water per capita according to the World Bank.[3] Cuba has 3381m3. The UK (another island, for comparison’s sake) has 2311m3 but is not rated a water risk. The only way Cuba and the Dominican Republic could carry a high water security risk rating (when the UK doesn’t) is through some illogic unrelated to water.
Barbados (although verdant) does have the least per capita renewable water in the Caribbean, only 284m3. That is because Barbados’ water consumption has doubled over the past 50 years [4] as its population has grown from ≈232K to ≈280K while its per capita GDP tripled from ≈$4k to ≈$12k. Water has become a major problem, and Barbados doesn’t have the oil wealth to import food (virtual water) or desalinate seawater like Saudi Arabia (86m3). Barbados’ water problem is anthropogenic, but not AGW. It is about unsustainable population growth and economic development on a smallish, dryish island, just like on Tuvalu.
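The Barbados arithmetic is easy to check. Treating the island’s renewable supply as roughly fixed (the national total below is an assumption, back-calculated from the per-capita figure and population quoted above), population growth alone accounts for a sizeable drop in per-capita water:

```python
# Back-of-envelope check of the Barbados figures quoted in the text.
PER_CAPITA_NOW = 284.0   # m3/person/yr (World Bank figure cited above)
POP_NOW = 280_000        # ~current population
POP_1960 = 232_000       # population ~50 years ago

# Implied national renewable supply, assumed fixed over the period:
total_renewable_m3 = PER_CAPITA_NOW * POP_NOW  # ~79.5 million m3/yr

per_capita_1960 = total_renewable_m3 / POP_1960
print(round(per_capita_1960))  # 343 m3/person then: growth alone cut per-capita supply ~17%
```

The same fixed-supply-divided-by-growing-population logic applies to Antigua/Barbuda and Tuvalu below; nothing about it requires a changing climate.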
Another Caribbean country with current water problems is Antigua/Barbuda, at 590m3. Neither indigenous Caribbean tribes nor Spanish conquistadors settled those islands, because of insufficient fresh water. The British did later. The country’s population has almost doubled, from ≈54k in 1960 to ≈90k today. Growth like that eventually strains finite resources, and it now has in naturally dry Antigua/Barbuda.
Climate change does not threaten Caribbean water supplies. Population growth and economic development already do on some of the smaller islands. And they are using climate change to ‘extort’ financial aid (e.g. for desalination) from the usual rich ‘guilty’ AGW culprits.
The UN Framework Convention on Climate Change organized this regional conference (at St. Lucia’s luxurious Bay Gardens Hotel/Resort) for Caribbean environment ministers and politicians. The UN organizer’s locally televised purpose was to give “these less developed country ministers and politicians the information and tools to know what to ask for in the negotiations leading up to the new world agreements of 2015”. That starts at COP19 in Warsaw in November 2013.
It is no coincidence the conference was held on St. Lucia. Its minister presently heads the Alliance of Small Island States (AOSIS). AOSIS says its 44 member states comprise 30% of developing countries, 20% of UN member states, and 5% of world population. The AOSIS agenda for COP19 is clear from its PR after being disappointed at June 2013 Bonn meetings:
At the closing of the latest round of U.N. climate talks, the Alliance of Small Island States (AOSIS), a group of 44 low-lying and coastal countries that are highly vulnerable to the impacts of climate change, released the following statement:
“After losing two weeks to needless procedural wrangling, it is worth recalling the scale of the challenge we face and the precious little time remaining to meet it… Therefore an international mechanism to address the permanent injury our islands are experiencing [emphasis added] must be established this year at the Warsaw conference.”
Tuvalu is the AOSIS member most aggressively agitating for UN ‘climate change aid’, having experienced saltwater intrusion caused by government tourist hotel development. Hence the AP story’s odd second paragraph, which is unrelated to the Caribbean but right in the AOSIS (Tuvalu) lobbying sweet spot.
Hey mon, it’s Babylon politricks. (H/T to Bob Marley and Jamaica, a Caribbean island of 2.7 million people enjoying 2473m3 renewable water per capita and fantastic reggae music.)
[1] Pielke Sr., Regional Climate Downscaling: What’s the Point, EOS 93: 52-53 (2012)
[2] Maplecroft Global Risk Analytics, info@maplecroft.com
[3] Available at data.worldbank.org/indicator/ER.H2O.INTR.PC
[4] Barbados Free Press editorial on water rationing 2/28/10
The Ecuadorian National Assembly voted Thursday to permit the drilling for petroleum in two sections of the Yasuní National Park in the country’s eastern Amazon basin. The decision comes just seven weeks after President Rafael Correa announced the failure of the Yasuní-ITT initiative, a project that sought to indefinitely prohibit oil exploration in the Yasuní in exchange for international donations equal to half of the reserve’s projected income.
The approved measure, which will allow oil exploration in blocks 31 and 43 of the park, was passed with the votes of 108 of the assembly’s 133 members. The assembly cited “national interest” as justification for its decision. Ecuador’s constitution forbids “activities for the extraction of nonrenewable natural resources” except in the case of national interest as determined by the National Assembly.
President Correa says the exploration will affect only 0.01% of the park. Additionally, the legislation promises the protection of indigenous communities that live in the affected areas and excludes extractive activity from the Yasuní’s “untouchable zone”, the largest section of the park, which is to be preserved in its natural state as a wildlife sanctuary. The project will be run by the state oil company Petroamazonas.
Fifty days ago, Correa requested authorisation to begin oil exploration within the park, declared a global biosphere reserve by UNESCO in 1989. The move has intensified national debate over drilling in the Yasuní and saw the president embark on a countrywide tour to convince oppositional groups of the economic and social need to drill in the wake of the Yasuní-ITT initiative’s failure.
In support of the president’s new initiative were 30 mayors from towns in Ecuador’s Amazon basin who travelled to Quito last month to express their support for the measure. Additionally, just last Friday 180 mayors signed a statement in support of the move to drill in the Yasuní.
However, opposition from ecological and indigenous rights groups remains high. On 28th August, police were accused of firing rubber bullets against protestors who had gathered in response to Correa’s initial remarks on opening the Yasuní up to exploration.
In the past month, the opposition has called for a national referendum, a request denied by the Constitutional Court.
Humberto Cholango, head of the Confederation of Indigenous Nationalities of Ecuador, told Ecuadorian newspaper El Universo that he was confused by the court’s decision. “There was a referendum over bullfighting in 2011, so why would you not consult the people on this issue of such importance, which threatens the lives of indigenous peoples as well as the reserve’s enormous biodiversity.”
Despite the rejection, the opposition pushed until the last moments before the vote.
Three community leaders from Ecuador’s Amazon region were invited to speak before the assembly on the final day of the debate. The first two spoke in favour of the government’s proposal, citing a need for economic development and a belief that the government would do its best to protect the local environment and communities.
The third speaker, a Waorani woman named Alicia Cawiya, steered away from her prepared speech and delivered an emotional plea in an effort to change the minds of those about to vote.
“All we want is that you respect our territory, which we have preserved and cared for,” pleaded Alicia. “Leave us to live how we want. This is our only proposal.”
California’s Rim fire, expected to be fully “contained” by October after igniting in Yosemite National Park on August 17, will ultimately benefit the forests it has passed through. While media accounts sensationalize such large wildfires as “catastrophic” and “disastrous,” science demonstrates that, to the contrary, fire is a vital component of western forest ecosystems.
Journalists mischaracterize the ecological function of wildfire as “devastation” or refer to forests that have experienced fire as a “barren wasteland,” exploiting emotions to sell newspapers. Yet media is only an accomplice to one of the masterminds ultimately responsible for fanning the flames of wildfire hysteria: the biomass energy industry.
Ignoring sound science and common sense, the biomass industry insists that cutting more backcountry forests, including native forests, will somehow prevent wildfires and protect people.
In September, the U.S. Department of Agriculture (USDA) announced the siphoning of even more taxpayer dollars to log and burn forests for energy under the guise of “reduc[ing] the risks of catastrophic wildfires.” In this most recent taxpayer handout to the biomass industry, $1.1 million in grants will be diverted to encourage more biomass incineration in California, Idaho, Minnesota, New Hampshire, and Alaska.
The biomass boosters’ well-worn talking points are laid out perfectly by Julia Levin, director of the Bioenergy Association of California, in a recent op-ed in the San Francisco Chronicle. Without citing a single scientific study, Levin boldly claims that hacking apart forests to burn for energy would “prevent more Rim Fires,” asserting that keeping chainsaws out of a forest is the same thing as letting it go “up in smoke.”
George Wuerthner, ecologist and editor of Wildfire: A Century of Failed Forest Policy, explains that instead of stopping fires, logging “typically has little effect on the spread of wildfires.” Contrary to industry and media spin, large fires such as the Rim fire are a product of “high winds, high temperatures, low humidity and severe drought.” These bigger fires are “unstoppable and go out only when the weather changes — not because of a lack of fuels” in a logged forest.
Wuerthner contends that logging or “thinning” can actually “increase wildfires’ spread and severity by increasing the fine fuels on the ground (slash) and by opening the forest to greater wind and solar penetration, drying fuels faster than in unlogged forests.”
Biomass proponent Levin warns in her op-ed that wildfires have “enormous impacts on public health from the smoke, soot and other emissions.” Yet Levin sees no disconnect in building biomass incinerators that would spew deadly particulate matter into low-income communities twenty-four hours a day, seven days a week, at higher levels than most coal plants.
Wildfire can “threaten lives, homes and businesses,” Levin states truthfully, particularly as more forests in the fire plain are opened to development. Yet the industry mouthpiece doesn’t once mention the only action that can actually protect structures from wildfire: maintaining “defensible space” 100-200 feet around a building. Instead, she offers more backcountry logging as the solution.
Levin claims to fret about the impact on climate change from an occasional wildfire, while pushing hard for more biomass incinerators that would pump out more carbon dioxide per unit of energy than some of the dirtiest coal plants in the country.
Recent science demonstrates that big blazes have been typical in western forests for hundreds of years. “If you go back even to the turn of the century, you will find that tens of millions of acres burned annually,” according to Wuerthner. “One researcher in California recently estimated that prior to 1850, an average of 5 million to 6 million acres burned annually in California alone.”
Yet biomass opportunists such as Levin cling to the outdated belief that “wildfires are increasing dramatically in frequency and severity as the result of climate change and overgrown forests.”
It would be unfair to suggest that Levin completely ignores forest ecology in her op-ed. She doesn’t. She just makes up her own version of it to suit industry’s desire to get out the cut, swearing that more intensive logging won’t harm forests, but magically “increase forest ecosystem health.”
That’s just dead wrong, according to ecologist Chad Hanson, director of the John Muir Project of Earth Island Institute in California. Hanson explains that burned forests “support levels of native biodiversity and total wildlife abundance” equal to or greater than any forest type, including old growth. Burned forests are also the rarest kind of forest, and therefore among the most ecologically important.
Black-backed woodpeckers drill their burrows in standing dead snags, according to Hanson, eventually providing homes for other cavity-nesting species of birds and mammals. Native flowering shrubs thriving in the wake of wildfire attract insects, which feed species of birds and bats. Shrubs and downed logs provide habitat for small mammals, which become food for raptors like the California spotted owl and northern goshawk. Deer live off the tender new tree growth, bears gorge themselves on the resulting berries and grubs, and Pacific fisher hunt the rodents, while the decaying organic material rejuvenates soils for swiftly regenerating seedlings.
Levin and the biomass industry’s “cure” for our “sick” western forests includes a recent bill passed by the California legislature requiring the Public Utilities Commission to generate up to 50 megawatts of biomass power, which Levin says would be extracted from 300,000 acres of forests over a ten year period.
The director of the Bioenergy Association of California specifically advocates for the construction of the 2.2 megawatt Cabin Creek Biomass Energy Facility in Placer County, California. This proposed facility is currently under legal challenge from the Center for Biological Diversity, an environmental organization alleging that the Environmental Impact Report “does not comply with the California Environmental Quality Act.”
Since the Rim fire began in the central Sierra Nevada on August 17, there has been a steady stream of fearful, hyperbolic, and misinformed reporting in much of the media. The fire, which is currently 188,000 acres in size and covers portions of the Stanislaus National Forest and the northwestern corner of Yosemite National Park, has been consistently described as “catastrophic”, “destructive”, and “devastating.” One story featured a quote from a local man who said he expected “nothing to be left”. However, if we can, for a moment, set aside the fear, the panic, and the decades of misunderstanding about wildland fires in our forests, it turns out that the facts differ dramatically from the popular misconceptions. The Rim fire is a good thing for the health of the forest ecosystem. It is not devastation, or loss. It is ecological restoration.
What relatively few people in the general public understand at present is that large, intense fires have always been a natural part of fire regimes in Sierra Nevada forests. Patches of high-intensity fire, wherein most or all trees are killed, create “snag forest habitat,” which is the rarest, and one of the most ecologically important, forest habitat types in the entire Sierra Nevada. Contrary to common myths, even when forest fires burn hottest, only a tiny proportion of the aboveground biomass is actually consumed (typically less than 3 percent). Habitat is not lost. Far from it. Instead, mature forest is transformed into “snag forest”, which is abundant in standing fire-killed trees, or “snags,” patches of native fire-following shrubs, downed logs, colorful flowers, and dense pockets of natural conifer regeneration.
This forest rejuvenation begins in the first spring after the fire. Native wood-boring beetles rapidly colonize burn areas, detecting the fires from dozens of miles away through infrared receptors that these species have evolved over millennia, in a long relationship with fire. The beetles bore under the bark of standing snags and lay their eggs, and the larvae feed and develop there. Woodpecker species, such as the rare and imperiled black-backed woodpecker (currently proposed for listing under the Endangered Species Act), depend upon snag forest habitat and wood-boring beetles for survival.
One black-backed woodpecker eats about 13,500 beetle larvae every year — and that generally requires at least 100 to 200 standing dead trees per acre. Black-backed woodpeckers, which are naturally camouflaged against the charred bark of a fire-killed tree, are a keystone species, and they excavate a new nest cavity every year, even when they stay in the same territory. This creates homes for numerous secondary cavity-nesting species, like the mountain bluebird (and, occasionally, squirrels and even martens), that cannot excavate their own nest cavities. The native flowering shrubs that germinate after fire attract many species of flying insects, which provide food for flycatchers and bats; and the shrubs, new conifer growth, and downed logs provide excellent habitat for small mammals. This, in turn, attracts raptors, like the California spotted owl and northern goshawk, which nest and roost mainly in the low/moderate-intensity fire areas, or in adjacent unburned forest, but actively forage in the snag forest habitat patches created by high-intensity fire — a sort of “bedroom and kitchen” effect. Deer thrive on the new growth, black bears forage happily on the rich source of berries, grubs, and small mammals in snag forest habitat, and even rare carnivores like the Pacific fisher actively hunt for small mammals in this post-fire habitat.
In fact, every scientific study that has been conducted in large, intense fires in the Sierra Nevada has found that the big patches of snag forest habitat support levels of native biodiversity and total wildlife abundance that are equal to or (in most cases) higher than old-growth forest. This has been found in the Donner fire of 1960, the Manter and Storrie fires of 2000, the McNally fire of 2002, and the Moonlight fire of 2007, to name a few. Wildlife abundance in snag forest increases up to about 25 or 30 years after fire, and then declines as snag forest is replaced by a new stand of forest (increasing again, several decades later, after the new stand becomes old forest). The woodpeckers, like the black-backed woodpecker, thrive for 7 to 10 years after fire generally, and then must move on to find a new fire, as their beetle larvae prey begins to dwindle. Flycatchers and other birds increase after 10 years post-fire, and continue to increase for another two decades. Thus, snag forest habitat is ephemeral, and native biodiversity in the Sierra Nevada depends upon a constantly replenished supply of new fires.
It would surprise most people to learn that snag forest habitat is far rarer in the Sierra Nevada than old-growth forest. There are about 1.2 million acres of old-growth forest in the Sierra, but less than 400,000 acres of snag forest habitat, even after including the Rim fire to date. This is due to fire suppression, which has, over decades, substantially reduced the average annual amount of high-intensity fire relative to historic levels, according to multiple studies. Because of this, and the combined impact of extensive post-fire commercial logging on national forest lands and private lands, we have far less snag forest habitat now than we had in the early twentieth century, and before. This has put numerous wildlife species at risk. These are species that have evolved to depend upon the many habitat features in snag forest — habitat that cannot be created by any other means. Further, high-intensity fire is not increasing currently, according to most studies (and contrary to widespread assumptions), and our forests are getting wetter, not drier (according to every study that has empirically investigated this question), so we cannot afford to be cavalier and assume that there will be more fire in the future, despite fire suppression efforts. We will need to purposefully allow more fires to burn, especially in the more remote forests.
The black-backed woodpecker, for example, has been reduced to a mere several hundred pairs in the Sierra Nevada due to fire suppression, post-fire logging, and commercial thinning of forests, creating a significant risk of future extinction unless forest management policies change, and unless forest plans on our national forests include protections (which they currently do not). This species is a “management indicator species”, or bellwether, for the entire group of species associated with snag forest habitat. As the black-backed woodpecker goes, so too do many other species, including some that we probably don’t yet know are in trouble. The Rim fire has created valuable snag forest habitat in the area in which it was needed most in the Sierra Nevada: the western slope of the central portion of the range. Even the Forest Service’s own scientists have acknowledged that the levels of high-intensity fire in this area are unnaturally low, and need to be increased. In fact, the last moderately significant fires in this area occurred about a decade ago, and there was a substantial risk that a 200-mile gap in black-backed woodpecker populations was about to develop, which is not a good sign from a conservation biology standpoint. The Rim fire has helped this situation, but we still have far too little snag forest habitat in the Sierra Nevada, and no protections from the ecological devastation of post-fire logging.
Recent scientific studies have caused scientists to substantially revise previous assumptions about historic fire regimes and forest structure. We now know that Sierra Nevada forests, including ponderosa pine and mixed-conifer forests, were not homogenously “open and parklike” with only low-intensity fire. Instead, many lines of evidence, and many published studies, show that these areas were often very dense, and were dominated by mixed-intensity fire, with high-intensity fire proportions ranging generally from 15 percent to more than 50 percent, depending upon the fire and area. Numerous historic sources, and reconstructions, document that large high-intensity fire patches did in fact occur prior to fire suppression and logging. Often these patches were hundreds of acres in size, and occasionally they were thousands — even tens of thousands — of acres. So, there is no ecological reason to fear or lament fires like the Rim fire, especially in an era of ongoing fire deficit.
Most fires, of course, are much smaller, and less intense than the Rim fire, including the other fires occurring this year. Over the past quarter-century, fires in the Sierra Nevada have been dominated on average by low/moderate-intensity effects, including in the areas that have not burned in several decades. But, after decades of fear-inducing, taxpayer-subsidized, anti-fire propaganda from the US Forest Service, it is relatively easy for many to accept smaller, less intense fires, and more challenging to appreciate big fires like the Rim fire. However, if we are to manage forests for ecological integrity, and maintain the full range of native wildlife species on the landscape, it is a challenge that we must embrace.
Encouragingly, the previous assumption about a tension between the restoration of more fire in our forests and home protection has proven to be false. Every study that has investigated this issue has found that the only way to effectively protect homes is to reduce combustible brush in “defensible space” within 100 to 200 feet of individual homes. Current forest management policy on national forest lands, unfortunately, remains heavily focused not only on suppressing fires in remote wildlands far from homes, but also on intensive mechanical “thinning” projects — which typically involve the commercial removal of upwards of 80 percent of the trees, including mature trees and often old-growth trees — that are mostly a long distance from homes. This not only diverts scarce resources away from home protection, but also gives homeowners a false sense of security because a federal agency has implied, incorrectly, that they are now protected from fire — a context that puts homes further at risk.
The new scientific data is telling us that we need not fear fire in our forests. Fire is doing important and beneficial ecological work, and we need more of it, including the occasional large, intense fires. Nor do we need to balance home protection with the restoration of fire’s role in our forests. The two are not in conflict. We do, however, need to muster the courage to transcend our fears and outdated assumptions about fire. Our forest ecosystems will be better for it.
Chad Hanson, the director of the John Muir Project (JMP) of Earth Island Institute, has a Ph.D. in ecology from the University of California at Davis, and focuses his research on forest and fire ecology in the Sierra Nevada. He can be reached at cthanson1@gmail.com, or visit JMP’s website at www.johnmuirproject.org for more information, and for citations to specific studies pertaining to the points made in this article.
The hypothesis for a single, simple, scientific explanation underlying the entire complex social phenomenon of CAGW.
Whatever is happening in the great outdoors regarding actual climate, inside the minds of men overwhelming evidence indicates that Catastrophic Anthropogenic Global Warming is a self-sustaining narrative that is living off our mental capacity, either in symbiosis or as an outright cultural parasite; a narrative that is very distanced from physical real-world events. The social phenomenon of CAGW possesses all the characteristics of a grand memetic alliance, like numerous similar structures before it stretching back beyond the reach of historic records, and no doubt many more cultural creatures that have yet to be born.
Having painted a picture of CAGW from a memetic perspective in fiction last December [link], I realized that many people instinctively sense the memetic characteristics of CAGW, and typically express this in blogs or articles as relatively casual comments that cite memes or religion. Yet these folks appear to have no real knowledge of how truly meaningful and fundamental their observations are. Hence I have written a comprehensive essay which attempts to fill in this knowledge gap, and indeed proposes that the entire complex social phenomenon of CAGW is dominated by memetic action, i.e. CAGW is a memeplex.
Note: a ‘meme’ is a minimal cultural entity that is subject to selective pressures during replication between human minds, its main medium. A meme can be thought of as the cultural equivalent to a gene in biology; examples are a speech, a piece of writing (‘narratives’), a tune or a fashion. A memeplex is a co-adapted group of memes that replicate together and reinforce each other’s survival; cultural or political doctrines and systems, for instance a religion, are major alliances of self-replicating and co-evolving memes. Memetics 101: memeplexes do not only find shelter in the mind of a new host, but they will change the perceptions and life of their new host.
Because the memetic explanation for CAGW rests upon social and evolutionary fundamentals (e.g. the differential selection of self-replicating narratives, narrative alliances, the penetration of memes into the psyche causing secondary phenomena like motivated reasoning, noble cause corruption and confirmation bias etc.) it is not dependent upon politics or philosophies of any stripe, which tend to strongly color most ‘explanations’ and typically rob them of objectivity. Critically, a memetic explanation also does not depend on anything happening in the climate (for better or for worse). CO2 worry acted as a catalyst only; sufficient real-world uncertainties at the outset (and indeed still) provided the degree of freedom that let a particular ‘ability’ of memeplexes take hold. That ability is to manipulate perceptions (e.g. of real-world uncertainty itself), values, and even morals, which means among other things that once birthed the CAGW memeplex rapidly insulated itself from actual climate events.
Homo sapiens have likely co-evolved with memeplexes essentially forever (Blackmore), therefore they are a fundamental part of us, and indeed no characteristic of CAGW appears to be in the slightest bit new, quite the contrary. Underlining this ancient origin, one class of memeplexes folks are familiar with is: ‘all religions’. Yet these fuzzy structures are by no means limited to religion; science has triggered memetic themes before and extreme politics frequently does so, and there have even been historic memeplexes centered on climate. This does not mean CAGW is precisely like a religion, but being similarly powered by self-replicating narratives creates the comparable characteristics that many have commented upon.
Using a great deal of circumstantial evidence from the climate blogosphere and support from various knowledge domains: neuroscience, (economic) game theory, law, corporate behavior, philosophy, biological evolution and of course memetics etc., the essay maps the primary characteristics of CAGW onto the expected behavior for a major memeplex, finding conformance. Along the way, contemporary and historic memeplexes (mainly religious) are explored as comparisons. The essay is long, book-sized, because the subject matter is large. I guess an essay describing all of climate science would be very long, so one exploring the entire memetic characteristics of CAGW, plus, I hope, enough context for readers to make sense of that, is similarly so.
The context is extremely broad, ranging from why pyramid building evolved in Egypt to a passionate cry against kings, priests, and tyranny in a radical women’s journal of the early nineteenth century. From the impact of memeplexes on the modern judicial system courtesy of Duke Law, to the ancient purpose of story-telling and contemporary attempts to subvert this, along with a plot analysis of the film Avatar. From the long and curious tale of an incarnation of ‘the past is always better’ meme currently rampant on the internet, to the evolutionary selection of fuzzy populations in biology and the frankenplex multi-element cultural creature that is CAGW. From the conflict related death-rates in primitive tribes versus modern states, to analysis of corporate social responsibilities after the Enron and banking sector crises. From memetic chain letters that stretch back to the hieroglyphs (Letters from Heaven), to the analysis of social cross-coalitions via game theory within the perspective of economics. From the concept of ‘the Social Mind’ courtesy of neuro-scientist Michael Gazzaniga, to pressure upon religions by aggressive atheism as promoted by Richard Dawkins. From modification of theistic memes in the Old to the New Testament, to notions of Gaia and telegraph wires and wing-nuts. Plus memetic sex, witchcraft, cults, Cathars, concepts of salvation, Communism, hi-jacking altruism, Lysenkoism, lichen, psychologizers, National Socialism, de-darwinisation, that ugly term ‘denier’, and much more.
The reason for this huge breadth and depth is that memeplexes are deeply integrated into both our psyche and our societies; this level of vision and historical context is necessary to uncover the entities, to identify their actions with as much distancing from what remains of ‘ourselves’ as can be achieved.
In counter-weight to this very broad context, the essay is richly laced throughout with quotes from many of the main players and commenters in the climate blogosphere (plus from newspapers and other publications too), much of which will be pretty familiar to followers of the climate debate. These quotes cover luke-warmers, skeptics and Consensus folks, plus politicians, philosophers, psychologists and others as regards their views on CAGW, yet all are chosen and brought together for their focus on the memetic aspects of the phenomenon. There are also plenty of deeper topics specific to the sociological aspects of CAGW that most denizens of the climate blogosphere will recognize and can get their teeth into, some contentious. For instance, a look at Richard Dawkins’ immersion within a rampant memeplex (while this would seem to be both controversial and ironic, when one realizes that we’re all immersed to some extent in several memeplexes, irony tends to morph to introspection). A brief view of a different Stephan Lewandowsky paper (i.e. NOT either of the ‘conspiracy ideation’ ones) in which he highlights the very type of inbuilt cultural bias that has then led him blindly to produce those very challenged and troubled works! An exposé of memetically induced cultural bias in a recent paper on ‘Professionals’ Discursive Construction of Climate Change’, that in my opinion undermines the objectivity of the work and robs the conclusions of any real meaning. A very interesting take on Mike Hulme’s stance as revealed by the memetic perspective. A glimpse of the ‘shall-we shan’t-we dance’ tentative cross-coalition between the Christian and CAGW memeplexes. The constant references to grandchildren within CAGW advocacy texts. Both the laudable and the lurking memetic content in philosopher Pascal Bruckner’s essay ‘Against Environmental Panic’. Numerous views of sociological comment by atmospheric scientist Judith Curry or at her blog Climate Etc from a memetic perspective.
Plus a delve into one of pointman’s very interesting climate related essays, strong language and classic climate quotes explained via memetics, and more…
While CAGW skeptics might at first blush celebrate the possibility of a single, non-climate related, non-partisan, science-based theory that explains the whole complex range of CAGW’s social characteristics, acceptance of this theory also requires acceptance of a couple of pretty uncomfortable truths, and the ditching of at least one touchstone used by many (but by no means all) climate change skeptics. These issues are all expounded in the essay, but I summarize here:
Acceptance of the memeplex explanation requires us to rethink what ‘self’ means, and how our opinions, perceptions, and even morals are formed and maintained, with an implication that our ‘self’ is much more about the societal groups we’re immersed in than about what’s intrinsically inside our heads. The fact that we don’t really ‘own’ ourselves, is challenging.
Acceptance of the memeplex explanation requires a rejection of the ‘scam’ or ‘hoax’ theory as a root cause of the CAGW phenomenon, and as a primary motivator for the vast majority of CAGW ‘adherents’. (Note this does not rule out the fact that scams / hoaxes and other negative social phenomena may be attached to the memeplex as secondary structures – this is in fact common for major memeplexes). The essay spends quite some length saying why this is so.
Whatever downsides are observed to stem from the social phenomenon of CAGW, memeplexes in general often contribute major net advantages to their host societies, sometimes very major. The balance between positive and negative aspects of a major memeplex is not easy to determine except long in retrospect and with access to the ‘big picture’ (all attributes and all impacts across all of society). Hence we cannot yet know the balance of this equation for CAGW. The positive aspects are not typically intuitive.
As already mentioned, the memetic explanation is virtually independent of actual climate events. Hence dangerous climate scenarios are not ruled out. It simply means that no scenarios are ruled out, from the very dangerous to the utterly benign, and it is very much in the memeplex’s interests to keep the situation that way. Memeplexes wallow in uncertainty and confusion.
Many commenters in the climate blogosphere have written to the effect that: ‘it isn’t and never was about the science’. I happen to agree, very little of the CAGW phenomenon is about the science. The memetic perspective reveals why this is; not in terms of political or financial motivations but in the objective terms of the underlying social mechanisms, which are independent of (and enable) all such motivations.
Despite the essay’s length, I hope you will take the journey to acquiring a memetic perspective. Here [ memeplex summary ] is a very distilled summary of each section of the essay below this text, with a list of references, in which a few regular contributors might find their names. Please note that the work is not a ‘paper’, containing no proofs or supporting mathematics, excepting a couple of references to Game Theory and the Price Equation. And merely for convenience, I have written as though the memeplex hypothesis is true, i.e. that CAGW is a memeplex and that this characteristic dominates the social effects. It is just extremely cumbersome throughout hundreds of references to make them all conditional – so I haven’t. Yet by no means does that mean the hypothesis is true, or at least wholly true in the sense that the memetic effects are dominant. Readers must form their own opinions regarding that, opinions which will no doubt be colored by the memeplexes they’re already immersed in. I think most folks will find it an interesting and enjoyable ride though. The complete essay is here [link]: (Note: this Post text doubles as the essay Foreword, so you can skip that).
P.S. while I intend to issue further Revs of the essay with some extensions plus feedback / corrections applied, in practice this may only happen on a very long timescale, or possibly not at all as my time is extremely pressured. Please keep an eye on www.wearenarrative.wordpress.com for any up-Revs or additional information. Note: the novella Truth from the WUWT post above is now available (free) at Smashwords here: https://www.smashwords.com/books/view/273983 or within the anthology ‘Engines of Life’ also at Smashwords here: https://www.smashwords.com/books/view/334834, or at Amazon here.
PRESS CONFERENCE 9/24/13, Foreign Correspondents’ Club of Japan
Gregory Jaczko, Former Chairman, US Nuclear Regulatory Commission
Torgen Johnson, Citizens’ Representative, San Diego Forum
Tetsuro Tsutsui, Member, Nuclear Regulation Sub-committee,
Citizens’ Commission on Nuclear Energy (CCNE) /
Nuclear Power Plant Technical Experts’ Group
Canada’s Proposed Radioactive Waste Dump Next to Lake Huron
By John LaForge | September 27, 2013
Kincardine, Ontario – The thought “Dumb and Dumber” came to mind as I recorded the work of Canada’s Joint Review Panel Sept. 23 and 24, here in Ontario, on the east end of Lake Huron. The JRP is currently taking comments on a proposal to dump radioactive waste in a deep hole, one mile from the shore of this magnificent inland sea.
What has to be called just plain dumb is that the nuclear bomb industry branched out to build nuclear power reactors and, as E.F. Schumacher said, to “accumulate large amounts of highly toxic substances which nobody knows how to make safe and which remain an incalculable danger to the whole of creation for historical or even geological ages.” In the case of radioactive waste, that is exactly what has happened here in Canada, as elsewhere.
Then, the giant Canadian utility Ontario Power Generation (OPG) proposes to bury its radioactive waste in a limestone dug-out, or “deep geologic repository,” one mile from Lake Huron.
This must be considered “dumber”, but you’d be amazed at how much dumber it gets. Listening to the presentations of government regulators and utility propagandists for two long days normally puts reporters to sleep. But the staggering implausibility of some statements and the shockingly cavalier nature of others kept me wide awake.
The low- and intermediate-level radioactive waste that could be dumped in a 2,200-foot deep hole here — 200,000 cubic meters of it — contains long-lived, alpha radiation emitters like plutonium, the most toxic substance on Earth, which is dangerous for 240,000 years (10 half-lives).
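The “10 half-lives” figure above can be sanity-checked with a few lines of arithmetic. This sketch assumes Pu-239’s commonly cited half-life of about 24,100 years (a figure not stated in the article itself):

```python
# Back-of-the-envelope check of the "10 half-lives" rule of thumb,
# assuming Pu-239's commonly cited half-life of ~24,100 years.
half_life_years = 24_100

hazard_duration_years = 10 * half_life_years   # the "dangerous for" window
fraction_remaining = 0.5 ** 10                 # activity left after 10 half-lives

print(hazard_duration_years)   # 241000 (~240,000 years, as quoted)
print(fraction_remaining)      # 0.0009765625 (< 0.1% of original activity)
```

After 10 half-lives, less than a tenth of one percent of the original activity remains, which is the usual rationale for the rule of thumb.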
Yet the reactor operator, Ontario Power Generation, had the nerve to say in a 2008 public handout: “[E]ven if the entire waste volume were to be dissolved into Lake Huron, the corresponding drinking water dose would be a factor of 100 below the regulatory criteria initially, and decreasing with time.”
This flabbergasting assertion prompted me to ask the oversight panel, “Why would the government dig a 1-billion-dollar waste repository, when it is safe to throw all the radiation into the lake?” The panel members must have considered my question rhetorical because they didn’t answer.
But it gets dumber.
There is much concern among Canadians over the fact that their government’s allowable limit for radioactive tritium in drinking water is 7,000 becquerels per liter. In the U.S., the EPA’s allowable limit is 740 Bq/L — a standard almost ten times more strict. (A becquerel is one radioactive disintegration per second.) Tritium is the radioactive form of hydrogen; it cannot be filtered out of water, it is both dumped and vented by operating nuclear reactors, and it can leak from radioactive wastes in large amounts.
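The “almost ten times more strict” comparison follows directly from the two limits quoted above:

```python
# Ratio of the two drinking-water tritium limits quoted in the article.
canada_limit_bq_per_l = 7000  # Canada's allowable tritium concentration
us_limit_bq_per_l = 740       # U.S. EPA's allowable limit

ratio = canada_limit_bq_per_l / us_limit_bq_per_l
print(round(ratio, 1))  # 9.5 -- i.e. "almost ten times" stricter in the U.S.
```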
When the Canadian Nuclear Safety Commission staff scientist at the hearing, Dr. Patsy Thompson, was asked why Canada’s allowable contamination was so much higher than the U.S.’s, Thompson said, “The U.S. limit is based on using wrong dose conversion factors from the 1970s that haven’t been corrected.”
This preposterous assertion went unchallenged (because of hearings rules that required questions to be reserved in advance), but it will certainly be contested by Canadians and those in the U.S. who have learned a lot about tritium hazards since the ‘70s.
Can you believe it got dumber still? Lothar Doehler, Manager of the Radiation Protection Service in the Occupational Health and Safety Branch, Ministry of Labor, testified that “To ensure safety after a radiological accident, the labor ministry does monitoring of water, vegetables, soil and other foods.”
I rushed to reserve a question and said for the record, “When the Labor Ministry measures radiation releases in the environment during a radiological accident, those releases have already occurred and exposure to that radiation has already begun. Simply monitoring the extent of radiological contamination does not ‘ensure safety’ from that radiation in any sense. Measuring radiation merely quantifies the harm being done by exposure to what is measured. Does the ministry have the authority to order evacuations from contaminated areas, like in Fukushima? Or to order the replacement of contaminated water with safe water, like in Fukushima? Or to order the cessation of fishing or fish consumption in the event of their contamination, like at Fukushima?”
The Chair of the JRP, Dr. Stella Swanson answered that the Ontario Ministry of Community Safety was responsible for evacuation planning in the event of a disaster. For his part, Mr. Doehler added that he was responsible “… to see that radioactively contaminated food was safe to eat.”
Stupefied by Mr. Doehler’s “blunder,” I missed a direct follow-up question and had to hustle after the man in the parking lot during a break to ask, “Pardon me, Mr. Doehler: you didn’t mean to say that eating radioactive contamination in food is safe, did you?”
“Oh, no,” Mr. Doehler said, “I apologize if I left that impression” — as he handed me his card.
Now Mr. Doehler is a highly-paid, high-level professional government official and didn’t make a mistake as I’d assumed. He’s not dumb or dumber, but enjoys deliberately misstating the facts when he can get away with it and when it suits his interests — just as Dr. Patsy Thompson does.
No, the sad mistake here is that so many catastrophic government actions can move ahead toward approval because the general public is keeping too quiet, or “playing dumb.”
John LaForge works for Nukewatch, an environmental watchdog group in Wisconsin, and edits its Quarterly newsletter.
The U.S. Air Force has spent years cleaning up toxic and nuclear materials at McClellan Air Force Base outside Sacramento since it was decommissioned in 2001, unsuccessfully trying to ship radioactive waste to a California dump and successfully sending a bunch of it to Utah under suspicious circumstances.
But now, as it bears down on a 2019 deadline for finishing the job of scraping potentially dangerous materials from 326 waste areas before delivering what’s left of the base-turned-industrial-hub into nonmilitary hands, the Air Force wants to bury the last of the radioactive waste on the property, close to residential neighborhoods.
State regulators and the city are not happy.
California has stricter rules governing waste disposal than the federal government and the California Department of Public Health says the plans for entombing the radium-226, a substance known to cause cancer, do not meet its standards, according to Katherine Mieszkowski and Matt Smith of the Center for Investigative Reporting.
The department has the power to block transfer of the property. California law requires that only facilities with special permits can accept soil contaminated with radium, and the state doesn’t have any.
But Steve Mayer, the Air Force remediation project manager at McClellan, told Center reporters that he was prepared to wait out the city and state because by 2019, “There will be a different governor then, too, and (regulators) all work for the governor.”
Actions at McClellan could serve as a template for federal behavior at other bases in California facing similar transitions from military to civilian use. Instead of paying the considerable cost of shipping the material to dumps, the Air Force could simply bury it on-site and walk away. There are reportedly seven bases in California that could face similar situations.
State regulators rebuffed the Air Force in 2011 when it lobbied hard to classify its McClellan radioactive waste as “naturally occurring” so it could qualify for shipment to Clean Harbors’ Buttonwillow landfill. Instead, it sent 43,000 tons of soil to an Idaho dump.
The 3,452-acre base was named a Superfund site in 1987 owing to years of maintaining aircraft that involved the “use, storage and disposal of hazardous materials including industrial solvents, caustic cleansers, paints, metal plating wastes, low-level radioactive wastes, and a variety of fuel oils and lubricants,” according to the U.S. Environmental Protection Agency. Much of the residue is believed to be from cleanup efforts related to radioactive paint used more than 50 years ago on glow-in-the-dark dials and gauges.
The property is being transferred to private hands piecemeal. In 2007, 62 acres were moved to the McClellan Business Park and another 35 acres was sold in 2011 to U.S. Foods, a national food distribution company. In 2010, 545 acres were transferred to the business park and California Governor Jerry Brown approved the transfer of another 528 acres in January of this year.
The latest surge in radiation at Fukushima nuclear plant may suggest not only additional water leaks at the site, but could also mean fission is occurring outside the crippled reactor, explains Chris Busby from the European Committee on Radiation Risk.
The increase in radiation reading is too significant to be blamed on random water leaks, believes Busby.
RT: Just how serious is the situation now in Japan?
Chris Busby: I think this is an indication that it has actually deteriorated significantly, very suddenly, in the last week. What they are not saying, and what is the missing piece of evidence here, is that radiation cannot suddenly increase unless something happens, and that something cannot be leakage from a tank, because gamma radiation goes straight through a tank. The tank has very thin metal walls. These walls will attenuate gamma radiation by only about 5 percent, even when they are 1 cm thick.
Although they may think this is a leak from the tank, and there may well be leaks from the tank, this sudden increase to 1.8 sieverts per hour is an enormously high dose rate, one that could probably kill somebody in 2 to 4 hours.
Today there was another leak found at 1.7 sieverts per hour in more or less the same place. This huge radiation increase, to my mind, means something is going on outside the tanks: some nuclear fission is occurring, like an open-air reactor, if you like, under the ground.
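As an aside, the “2 to 4 hours” figure quoted above is consistent with simple arithmetic, if one assumes an acutely lethal whole-body dose in the commonly cited range of roughly 4 to 5 sieverts (that range is an assumption for illustration, not a figure from the interview):

```python
# Rough check: time to accumulate an acutely lethal dose at the
# measured rate of 1.8 Sv/h. The 4-5 Sv lethal-dose range is a
# commonly cited assumption, not a figure from the interview.
dose_rate_sv_per_hour = 1.8

for lethal_dose_sv in (4.0, 5.0):
    hours = lethal_dose_sv / dose_rate_sv_per_hour
    print(f"{lethal_dose_sv} Sv reached after ~{hours:.1f} h")
# -> roughly 2.2 to 2.8 hours, within the "2 to 4 hours" quoted above
```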
RT: What impact will this have on the clean-up operation and those who are involved in that operation?
CB: First of all, it is clearly out of control, and secondly, no one can go anywhere near it. Nobody can go in to measure where these leaks are or do anything about them, because anybody who approached that sort of area would be dead quite quickly, or at the very least seriously harmed.
RT: Then presumably someone who was there earlier, not knowing that the radiation levels were so high, is at risk now?
CB: I think many people are going to die as a result of this, just as the liquidators died after Chernobyl; they kept dying over the next ten years or so.
RT: Why has TEPCO failed to contain the radiation?
CB: I think no one has actually realized how bad this is, because the international nuclear industry has tried to play it down so much that people sort of came to the idea that it could somehow be controlled, whereas all along it could never be controlled.
I’ve seen a photograph taken from the air recently in which the water in the Pacific Ocean actually appears to boil. Well, it is not boiling, but you can see that it’s hot: steam is coming off the surface, and a fog is condensing over the area of the ocean close to the reactors. That means hot water is getting into the Pacific, which means something is fissioning very close to the Pacific, and it is not inside the reactors; it must be outside the reactors, in my opinion.
RT: Surely the international nuclear industry should have come to TEPCO’s help before this?
CB: Yes. They should have done that. This is not a local affair. This is an international affair. I could not say why they have not. I think they are all hoping that nothing will happen, hoping that this will all go away and keeping their fingers crossed. But from the beginning it was quite clear that it was very serious and that there was no way this was not going to go very badly.
And now it seems to have suddenly got very bad. If that photograph I’ve seen is genuine, they should start evacuating people out to a 100-kilometer zone.
RT: So not only those who live in the vicinity but also those who live within 100 km could be at risk?
CB: I accept that this might be a faked or dubbed photograph, but if it is real, and these levels of 1.8 Sieverts per hour are real, then something very serious has happened and I think people should start to get away.
RT: Since the radiation is leaking into the ocean, will it not have a major ecological impact elsewhere?
CB: Of course. What happens there is that it moves all the radioactivity up and down the coast, right down to Tokyo. I’ve seen a statement by Tokyo’s mayor saying this will not affect Tokyo’s application to be considered for the Olympic Games. I actually thought they ought to consider evacuating Tokyo. It is very, very serious.
Soybeans generate approximately $80 million annually in mandatory producer assessments alone, funding a marketing apparatus that has transformed an industrial commodity into one of America’s most trusted “health foods.” The campaign succeeded. Soy milk lines supermarket shelves beside dairy. Soy protein fortifies everything from infant formula to energy bars. Vegetarians rely on tofu and tempeh as dietary staples. Doctors recommend soy to menopausal women. School lunch programs serve soy-based meat substitutes to children. An estimated 60 percent of processed foods contain soy derivatives. The premise underlying this proliferation—that Asians have thrived on soy for millennia and that modern science validates its health benefits—has been repeated so often it functions as established fact.
Kaayla T. Daniel’s The Whole Soy Story dismantles this premise through systematic examination of the scientific literature. The book documents that traditional Asian soy consumption averaged roughly one tablespoon daily, consumed as fermented condiments after processing methods that neutralized inherent toxins—a pattern bearing no resemblance to American consumption of industrially processed soy protein isolate, soy flour, and soy oil. Daniel catalogs the antinutrients that survive modern processing (protease inhibitors, phytates, lectins, saponins), the toxic compounds created by industrial methods (nitrosamines, lysinoalanine, hexane residues), and the heavy metals concentrated in soy products (manganese, aluminum, fluoride, cadmium). She traces the mechanisms by which soy isoflavones—plant estrogens present at pharmacologically significant levels—disrupt thyroid function, impair fertility, and interact with hormone-sensitive cancers. The evidence emerges from peer-reviewed journals, FDA documents, and industry sources themselves.
The stakes extend beyond individual dietary choices. Infants fed soy formula receive isoflavone doses equivalent to several birth control pills daily, with blood concentrations 13,000 to 22,000 times higher than their natural estrogen levels. Soy protein isolate—the ingredient in formula, protein bars, and thousands of products—has never received GRAS (Generally Recognized as Safe) status; its only pre-1960s use was as an industrial paper sealant. Two senior FDA scientists formally protested their own agency’s approval of soy health claims, citing evidence of thyroid damage and reproductive harm. The Honolulu Heart Program found that men consuming tofu twice weekly showed accelerated brain aging and increased dementia. These findings have not penetrated public awareness because the institutions responsible for consumer protection have been compromised by the industry they regulate. The Whole Soy Story presents the evidence that has been systematically excluded from mainstream health messaging, enabling readers to evaluate for themselves what the soy industry prefers they never learn.
This site is provided as a research and reference tool. Although we make every reasonable effort to ensure that the information and data provided at this site are useful, accurate, and current, we cannot guarantee that the information and data provided here will be error-free. By using this site, you assume all responsibility for and risk arising from your use of and reliance upon the contents of this site.
This site and the information available through it do not, and are not intended to constitute legal advice. Should you require legal advice, you should consult your own attorney.
Nothing within this site or linked to by this site constitutes investment advice or medical advice.
Materials accessible from or added to this site by third parties, such as comments posted, are strictly the responsibility of the third party who added such materials or made them accessible and we neither endorse nor undertake to control, monitor, edit or assume responsibility for any such third-party material.
The posting of stories, commentaries, reports, documents and links (embedded or otherwise) on this site does not in any way, shape or form, implied or otherwise, necessarily express or suggest endorsement or support of any of such posted material or parts therein.
The word “alleged” is deemed to occur before the word “fraud,” since the rule of law still applies. To peasants, at least.
Fair Use
This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more info go to: http://www.law.cornell.edu/uscode/17/107.shtml. If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.
DMCA Contact
This is information for anyone who wishes to challenge our “fair use” of copyrighted material.
If you are a legal copyright holder or a designated agent for such and you believe that content residing on or accessible through our website infringes a copyright and falls outside the boundaries of “Fair Use”, please send a notice of infringement by contacting atheonews@gmail.com.
We will respond and take necessary action immediately.
If notice is given of an alleged copyright violation we will act expeditiously to remove or disable access to the material(s) in question.
All third-party material posted on this website is copyright of the respective owners/authors. Aletho News makes no claim of copyright on such material.