Aletho News

ΑΛΗΘΩΣ

Japan hopes seabed will yield data and resources

DW | January 17, 2014

With scant energy and mineral reserves of its own, and nuclear plants mothballed since the Fukushima nuclear disaster, Japan is investing heavily in exploring beneath the oceans for resources that will power its future.
Seabed off the coast of Japan

On the first day of 2014, the Japanese research ship Chikyu set a new record by drilling down to a point 3,000 meters beneath the seabed off southern Japan. It was an appropriate way to ring in the new year, and it signals an increased commitment to learning more about the secrets that lie beneath the floor of the ocean close to Japan.

The research has two distinct but connected driving forces. As Japan prepares to mark the third anniversary of the March 11 Great East Japan Earthquake, the Chikyu is undertaking the most extensive survey ever attempted of the Nankai Trough, a geological fault that extends for several hundred kilometers parallel to the southern coast of Japan and is widely seen as the source of the next major earthquake that will affect this tremor-prone nation. And with all of Japan’s nuclear reactors presently mothballed in the aftermath of the disaster, which destroyed the Fukushima Dai-Ichi nuclear plant, there is a new sense of urgency in the search for sources of energy and other natural resources close to Japan.

Limited natural resources

“When I was in elementary school, we learned that Japan does not have many natural resources of its own and that we needed to import all the oil, the gas, the metals and minerals that we needed,” Toshiyaki Mizuno, the deputy director of the Ocean and Earth Division at the ministry of science and technology, told DW.

“And that was what we thought for a long time,” he said. “Until we recently discovered that there are significant deposits of methane hydrates within Japan’s exclusive economic zone.”

Also known as natural gas hydrate or “fire ice,” it is a solid compound in which high levels of methane are trapped in a crystal structure of water. Methane hydrate was originally believed to exist only in the outer reaches of the solar system, but significant deposits are now being discovered beneath seabed sediment, and it is estimated that supplies are as much as 10 times the known reserves of natural gas.

The dream of new energy

“There are many problems that we need to overcome before we can say that Japan’s energy problems have been solved, but the dream is to exploit this new source of energy and other resources and this is the first step in achieving that,” Mizuno said.

The Japanese government has announced plans to work with private companies to develop new technologies to explore the resources that are below the seabed off Japan, including the development of advanced submersibles and remote-controlled underwater vehicles.

Companies will work with no fewer than four Japanese ministries, representing trade and industry, science and technology, land and infrastructure, and internal affairs, and there are hopes that the proposed recovery of resources could go ahead in as little as five years.

The government is putting aside a portion of the 50 billion yen (352.3 million euros) budget for strategic innovation projects to support the ambitious drive, with organizations such as the Japan Agency for Marine-Earth Science and Technology tasked with developing submarines that can operate at depths of up to 3,000 meters and large-scale excavation ships.

“This issue is becoming quite urgent for Japan because the government’s growth policy to date has largely focused on the weakening yen, which means that all imports of resources and energy are very expensive,” said Martin Schulz, senior economist at the Fujitsu Research Institute.

“Japan has to reduce those costs over the long term, and developing these undersea resources is becoming much more economical than it was before,” he said.

“It is also important in terms of Japan’s energy mix as it does not seem likely that the nuclear reactors will be restarted in a significant way in the immediate future,” he added.

“Exploring close to Japan’s coastline for these resources makes complete sense, although we also know that methane hydrates can be extremely dangerous to collect and develop,” he said.

At the same time as Japan attempts to reduce its reliance on expensive imports and distance itself from relying on volatile suppliers of rare earth minerals – such as China – it is also in a hurry to learn more about the geological structure of the surface of the Earth close to the Japanese archipelago and the threats that natural disasters pose.
A Chinese navy missile frigate passes a drilling rig at the Tianwaitian gas field in the East China Sea, photographed by a Japanese Maritime Self-Defense Force patrol plane on September 9, 2005.

Questions over sovereignty and natural resources in the East China Sea have led to disputes with China

The drilling being conducted by the Chikyu is to examine the layers beneath the seabed in the Nankai Trough. In March last year, a study by the Central Disaster Management Council, undertaken as a direct result of the earthquake that struck northeast Japan, predicted that a magnitude-9 quake in the danger zone could trigger a tsunami as much as 30 meters high that could kill 320,000 people.

The disaster would destroy road and rail links along the length of the country, the tsunami would pulverize buildings already weakened by the tremor, and infrastructure would be wiped out for hundreds of kilometers along the coast. The projected cost of the damage wrought on the country is 220 trillion yen (1.84 trillion euros).

Given the scale of the threat, scientists say there is no time to lose in trying to determine when and precisely where the disaster might strike.

January 19, 2014 | Economics, Science and Pseudo-Science

Senate EPW Hearing on the President’s Climate Action Plan

By Judith Curry | Climate Etc. | January 16, 2014

The hearing is now concluded, I’m on a plane flying back to Atlanta.

The testimony from each of the witnesses is now online [here].  The link for my testimony is [here].

The content of my verbal remarks is below:

I would like to thank the Committee for the opportunity to present testimony this morning. I am Chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology. I have devoted 30 years to conducting research on topics including climate of the Arctic, the role of clouds and aerosols in the climate system, and the climate dynamics of extreme weather events.

The premise of the President’s Climate Action Plan is that there is an overwhelming judgment of science that anthropogenic global warming is already producing devastating impacts. Anthropogenic greenhouse warming is a theory whose basic mechanism is well understood, but whose magnitude is highly uncertain. Multiple lines of evidence presented in the recent IPCC 5th assessment report suggest that the case for anthropogenic warming is now weaker than in 2007, when the 4th assessment report was published.

My written testimony documented the following evidence:

  • For the past 16 years, there has been no significant increase in surface temperature. There is a growing discrepancy between observations and climate model projections. Observations since 2011 have fallen below the 90% envelope of climate model projections.
  • The IPCC does not have a convincing or confident explanation for this hiatus in warming.
  • There is growing evidence of decreased climate sensitivity to atmospheric carbon dioxide concentrations.
  • Based on expert judgment in light of this evidence, the IPCC 5th assessment report lowered its surface temperature projection relative to the model projections for the period 2016-2036.

The growing evidence that climate models are too sensitive to CO2 has implications for the attribution of late 20th century warming and projections of 21st century climate change. Sensitivity of the climate to carbon dioxide, and the level of uncertainty in its value, is a key input into the economic models that drive cost-benefit analyses, including estimates of the social cost of carbon.

If the recent warming hiatus is caused by natural variability, then this raises the question as to what extent the warming between 1975 and 2000 can also be explained by natural climate variability. In a recent journal publication, I provided a rationale for projecting that the hiatus in warming could extend to the 2030’s. By contrast, according to climate model projections, the probability of the hiatus extending beyond 20 years is vanishingly small. If the hiatus does extend beyond 20 years, then a very substantial reconsideration will be needed of the 20th century attribution and the 21st century projections of climate change.

Attempts to modify the climate through reducing CO2 emissions may turn out to be futile. The stagnation in greenhouse warming observed over the past 15+ years demonstrates that CO2 is not a control knob that can fine tune climate variability on decadal and multi-decadal time scales. Even if CO2 mitigation strategies are successfully implemented and climate model projections are correct, an impact on the climate would not be expected for a number of decades. Further, solar variability, volcanic eruptions and natural internal climate variability will continue to be sources of unpredictable climate surprises.

As a result of the hiatus in warming, there is growing appreciation for the importance of natural climate variability on multi-decadal timescales.  Further, the IPCC AR5 and Special Report on Extreme Events published in 2012, find little evidence that supports an increase in most extreme weather events that can be attributed to humans.

The perception that humans are causing an increase in extreme weather events is a primary motivation for the President’s Climate Change Plan. However, in the U.S., most types of weather extremes were worse in the 1930’s and even in the 1950’s than in the current climate, while the weather was overall more benign in the 1970’s. The extremes of the 1930’s and 1950’s are not attributable to greenhouse warming and are associated with natural climate variability (and, in the case of the Dust Bowl drought and heat waves, also with land use practices). The sense that extreme weather events are now more frequent and intense is symptomatic of ‘weather amnesia’ about the pre-1970 period.

The frequency and intensity of extreme weather events is heavily influenced by natural climate variability. Whether or not anthropogenic climate change is exacerbating extreme weather events, vulnerability to extreme weather events will continue to increase owing to increasing population and concentration of wealth in vulnerable regions. Regions that find solutions to current problems of climate variability and extreme weather events and address challenges associated with an increasing population are likely to be well prepared to cope with any additional stresses from climate change.

Nevertheless, the premise of dangerous anthropogenic climate change is the foundation for a far-reaching plan to reduce greenhouse gas emissions and reduce vulnerability to extreme weather events. Elements of this Plan may be argued as important for associated energy policy reasons, economics, and/or public health and safety. However, claiming an overwhelming scientific justification for the Plan based upon anthropogenic global warming does a disservice both to climate science and to the policy process.

Good judgment requires recognizing that climate change is characterized by conditions of deep uncertainty. Pursuing robust policy options that can be justified on associated policy grounds, whether or not anthropogenic climate change is dangerous, avoids the hubris of pretending to know what will happen with the 21st century climate.

This concludes my testimony.

JC comments: The hearing was very long, not so much because of the questioning of the witnesses as because of the pontificating by committee members (much more of it than in the House subcommittees, it seems).

Several things struck me.  All of the members seem pretty well educated on the topic of climate change.  I cannot say the same of the administrators on the first panel.

Most of the members were there for Panel 1; only a few remained for Panel 2.

I’m fairly happy with my written testimony, but was surprised that my verbal testimony went over the time limit (I have never gone over before). The questions were fairly lightweight.

Andrew Dessler did a pretty good job particularly on the verbal testimony and answering questions.

All in all, a very interesting experience, but stressful since you need to pretty much drop everything to prepare your testimony (and I have a pile of things that need to be finished before tomorrow).

So does any of this matter? We’ll see.  I felt that my previous testimony to the House Committee did have an impact.

January 18, 2014 | Science and Pseudo-Science

Drug Companies and Doctors Boost Profits Pitching Attention Deficit Disorder

By Noel Brinkerhoff | AllGov | December 17, 2013

With the help of physicians, pharmaceutical makers have made billions of dollars peddling medicines to treat attention deficit disorder, leading some experts, and even one pharmaceutical executive, to declare that the marketing push has gone too far.

Last year, sales of stimulant medication intended to treat attention deficit hyperactivity disorder (ADHD) reached $9 billion—a fivefold increase from a decade ago.

Today, 15% of high school students have been diagnosed with ADHD, with about 3.5 million of them on some sort of drug marketed to treat the disorder.

Dr. Keith Conners, who has spent decades trying to help children with ADHD, has questioned the increasing rates of diagnosis, calling them “a national disaster of dangerous proportions.”

“The numbers make it look like an epidemic. Well, it’s not. It’s preposterous,” Conners, a psychologist and professor emeritus at Duke University, told The New York Times. “This is a concoction to justify the giving out of medication at unprecedented and unjustifiable levels.”

The drug industry has worked for two decades to publicize ADHD and promote its remedies to doctors, educators and parents. As a result, the disorder is now the second most frequent long-term diagnosis made in children, just behind asthma.

Drugs such as Ritalin, Adderall, Concerta, Focalin, Vyvanse, Intuniv and Strattera have been promoted to help children, but along the way, the Food and Drug Administration has cited every major ADHD drug for false and misleading advertising since 2000.

Doctors also have been criticized for taking money from drug companies to publish research and deliver presentations that encourage colleagues to prescribe these drugs, which possess significant side effects and are regulated in the same class as morphine and oxycodone because of their potential for abuse and addiction.

Now, companies want to market the medications to adults to further expand revenue-making opportunities.

Roger Griggs, the pharmaceutical executive who introduced Adderall in 1994, objects to marketing stimulants to the general public because of the risks involved. He called the drugs “nuclear bombs” that should rarely be prescribed and carefully monitored by a treating physician, according to the Times.

To Learn More:

The Selling of Attention Deficit Disorder (by Alan Schwarz, New York Times)

Latest Condition Invented by Drug Companies…Low Testosterone (by Matt Bewig, AllGov)

Drug Companies Increase Profits by Creating Fear of Diseases (and Even Diseases) (by David Wallechinsky, AllGov)

December 17, 2013 | Science and Pseudo-Science

Read This Before You Take That Statin

By Barbara Roberts and Martha Rosenberg | Dissident Voice | December 10, 2013

The American Heart Association (AHA) and the American College of Cardiology (ACC) recently released new cardiovascular disease prevention guidelines. They are an egregious example of much that is wrong with medicine today.

The guidelines propose a vast expansion of the use of statins in healthy people, recommending them for about 44 percent of men and 22 percent of healthy women between the ages of 40 and 75. According to calculations by John Abramson, lecturer at Harvard Medical School, 13,598,000 healthy people for whom statins were not recommended based on the 2001 guidelines now fall into the category of being advised to take moderate or high intensity statin therapy.

The American Heart Association (AHA) is a nonprofit organization with a mission to “build healthier lives free of cardiovascular disease and stroke.” Yet in its 2011-2012 financial statement, the AHA noted $521 million in donations from non-government and non-membership sources; many well-known large drug companies, including those that make and market statins, contribute amounts in the $1 million range.

Even as many in the medical community suspected the guidelines were a ploy to help the AHA’s drug partners sell statins, it was revealed that the guideline’s online calculator for determining cardiac disease risk overpredicts risk by an astonishing 75 to 150 percent. But the guideline writers are standing firmly behind their faulty calculator.

Seven of the 15 authors disclosed ties to industry. Originally, the panel chair, Neil J. Stone, MD of Northwestern University, declared that he had had no ties to industry since 2008. Jeanne Lenzer, writing recently in the British Medical Journal (BMJ), interviewed Dr. Stone, who said: “When I was asked by NHLBI [National Heart, Lung and Blood Institute] to chair the [cholesterol] panel, I immediately severed ties with all industry connections prior to assuming my role as chair.” However, prior to 2008, he accepted funding and consultancy fees from multiple pharmaceutical companies, including Abbott, AstraZeneca, Pfizer, Merck, and Schering-Plough, among others. Dr. Stone also told the BMJ that he will “definitely” not take any industry funding for two years. Are we to believe that by severing his ties in 2008 his mind became an instant tabula rasa, completely devoid of any conscious or unconscious bias towards the drug companies that had been paying him? To do so strains the bonds of credulity past the breaking point.

The financial ties between large pharmaceutical companies and the AHA are numerous and very remunerative for the AHA, including huge donations from Abbott, Bayer, Boehringer Ingelheim, Bristol-Myers Squibb (BMS), Eli Lilly, Merck and Pfizer. BMS, Merck and Pfizer are major funders of AHA’s Go Red For Women heart disease awareness campaign, whose web site tells patients, “If your doctor has placed you on statin therapy to reduce your cholesterol, you can rest easy–the benefits outweigh the risks.” The site also proclaims that “Zocor and Pravachol–have the fewest side effects” and that “statins may only slightly increase diabetes risks.” Yet the Women’s Health Initiative, a federal study of over 160,000 healthy women investigating the most common causes of death, disability and poor quality of life in postmenopausal women, showed that a healthy woman’s risk of developing diabetes on a statin was increased by 48 percent compared to women who were not on one. And contrary to what statin apologists say about statins only increasing diabetes risk in people who are at high risk of developing it anyway–for example, the obese–women on statins in the Women’s Health Initiative who were of normal weight increased their risk of diabetes by 89 percent compared to same-weight women not taking a statin.

In 2010, AHA received $21,000 from statin maker AstraZeneca to run an AHA course about “emerging strategies with statins” at the Discovery Institute of Medical Education and almost $100,000 for learning projects including “debating controversial topics in cardiovascular disease.” The AHA defended the deceptively marketed and controversial cholesterol drug Vytorin. Did that have anything to do with the $2 million a year the AHA was taking from marketer Merck/Schering-Plough Pharmaceuticals?

The AHA also rakes in millions from food companies, which are also million-dollar donors and which pay from $5,490 to $7,500 per product to gain the AHA’s “heart-check mark” imprimatur, renewable, at a price, every year. The foods so anointed have to be low in fat, saturated fat, and cholesterol, yet Boar’s Head All Natural Ham (340 milligrams of sodium in a 2-ounce serving) somehow made the cut, as did Boar’s Head EverRoast Oven Roasted Chicken Breast (440 milligrams of sodium in a 2-ounce serving). Such processed, high-sodium meats raise blood pressure, the risk of cardiovascular disease and the risk of diabetes. A review of almost 1,600 studies involving one million people in ten countries on four continents showed that a 1.8-ounce daily serving of processed meat raised the risk of diabetes by 19 percent and of heart disease by 42 percent.

The new guidelines might make sense if statins were truly as effective as their proponents claim, and if they had no adverse effects. But they have a growing list of side effects, which affect at least 18 percent of people who take them, ranging from muscle pain, weakness and muscle damage to cataracts, cognitive dysfunction, nerve damage, liver injury and kidney failure.

Even the most avid statin proponents agree that statins do not prevent 60 to 80 percent of cardiac events. This is called “residual risk.” If there were a vaccine, say Vaccine X, that did not prevent 60 to 80 percent of cases of Infection Y, very few would be inclined to take it.

As Jerome Hoffman, MD, Emeritus Professor of Medicine at UCLA wrote recently with regard to these guidelines: “How did we arrive at a place where conflicted parties get to make distorted semi-official pronouncements that have so much impact on public policy?” How indeed?

~

Barbara Roberts, MD, FACC is an Associate Clinical Professor of Medicine at the Alpert Medical School of Brown University. She is the author of The Truth about Statins and How to Keep from Breaking Your Heart: What Every Woman Needs to Know about Cardiovascular Disease. Martha Rosenberg is a health reporter and author of Born with a Junk Food Deficiency.

December 11, 2013 | Corruption, Deception, Science and Pseudo-Science

How to Debunk WTC Thermite


By Kevin Ryan | Dig Within | December 8, 2013

The evidence for the presence of thermite at the World Trade Center (WTC) on 9/11 is extensive and compelling. This evidence has accumulated to the point at which we can say that WTC thermite is no longer a hypothesis, it is a tested and proven theory. Therefore it is not easy to debunk it. But the way to do so is very straightforward and is in no way mysterious.

To debunk the thermite theory, one must first understand the evidence for it and then show how all of that evidence is either mistaken or explained by other phenomena. Here are the top ten categories of evidence for thermite at the WTC.

  1. Molten metal: There are numerous photographs and eyewitness testimonies to the presence of molten metal at the WTC, both in the buildings and in the rubble. No legitimate explanation has been provided for this evidence other than the exothermic reaction of thermite, which produces the temperatures required and molten iron as a product.
  2. The fires at Ground Zero could not be put out for several months. Despite the application of millions of gallons of water to the pile, several rainfall events at the site, and the use of a chemical fire suppressant, the fires would not subside. Thermal images produced by satellite showed that the temperatures in the pile were far above those expected in the debris from a typical structure fire. Only thermite, which contains its own oxidant and therefore cannot be extinguished by smothering it, can explain this evidence.
  3. Numerous eyewitnesses who were fleeing the area described the air mass as a hot wind filled with burning particles.[1] This evidence agrees with the presence of large quantities of thermite byproducts in the air, including hot metallic microspheres and still-reacting agglomerates of thermite.
  4. Numerous vehicles were scorched or set on fire in the area. Photographic evidence shows that cars parked within the lower-level garage areas of the WTC complex burned as if impacted by a super-hot wind like that described by the eyewitnesses. All non-metallic parts of the cars, including the plastic, rubber, and glass, were completely burned off by a hot blast.
  5. There was a distinct “white smoke” present—clearly different from smoke produced by a normal structural fire—as indicated by eyewitnesses and photographic evidence.[2] The second major product of the thermite reaction is aluminum oxide, which is emitted as a white solid shortly after reaction.
  6. Peer-reviewed, scientific research confirmed the presence of extremely high temperatures at the WTC. The high temperatures were evidenced by metallic and other microspheres, along with evaporated metals and silicates. These findings were confirmed by 9/11 investigators and by scientists at an independent company and at the United States Geological Survey.
  7. The elemental composition of the metallic microspheres from the WTC dust matches that of metallic microspheres produced by the thermite reaction.
  8. The environmental data collected at Ground Zero in the months following 9/11 indicate that violent incendiary fires, like those produced by thermite, occurred on specific dates. Peer-reviewed scientific analysis of these data shows that the components of thermite spiked to extraordinary levels on specific dates in both the air and aerosol emissions at Ground Zero.
  9. Carbon nanotubes have been found in the WTC dust and in the lungs of 9/11 first responders. Formation of carbon nanotubes requires extremely high temperatures, specific metal catalysts, and carbon compounds exactly like those found in nanothermite formulations. Researchers have discovered that nanothermite produces the same kinds of carbon nanotubes. That finding has been confirmed by independent analysis in a commercial contract laboratory.
  10. A peer-reviewed scientific publication has identified the presence of nanothermite in the WTC dust. One of the critical aspects of that paper has been confirmed by an independent scientist.
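The reaction invoked throughout these points is the standard iron-oxide thermite reaction, in which aluminum reduces iron(III) oxide to molten iron (point 1) and aluminum oxide (point 5). As a point of textbook chemistry (the enthalpy value is the commonly cited approximate figure, not taken from this article):

```latex
% Iron-oxide thermite reaction: aluminum reduces iron(III) oxide,
% yielding molten iron and aluminum oxide in a strongly exothermic step.
\[
\mathrm{Fe_2O_3 + 2\,Al \;\longrightarrow\; 2\,Fe + Al_2O_3},
\qquad \Delta H \approx -850\ \text{kJ per mole of } \mathrm{Fe_2O_3}
\]
```

Because the oxygen is supplied by the iron oxide itself rather than by the surrounding air, the reaction cannot be smothered, which is the basis of the claim in point 2.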

There is also a great deal of indirect evidence for the thermite theory. This includes the attempts by NIST to downplay the evidence for thermite. It also includes things like a weak effort by Rupert Murdoch’s National Geographic Channel to discredit the ability of thermite to cut structural steel, which was itself roundly discredited by one independent investigator. It is now unquestionable that thermite can cut structural steel as needed for a demolition.

Therefore, debunking the WTC thermite theory is not easy but is very straightforward. Doing so simply requires addressing the evidence listed above point by point, and showing in each case how an alternative hypothesis can explain that evidence better. Given the scientific grounding of the thermite theory, use of the scientific method, including experiments and peer-reviewed publications, would be essential to any such debunking effort.

That is almost certainly why we have seen no such debunking. Instead, the people working to refute the WTC thermite theory have resorted to what might be called a case study in how NOT to respond to scientific evidence.

The failed thermite theory debunkers have produced:

  • Thousands of chat room comments and other posts yet not one peer-reviewed scientific article.
  • Alternate hypotheses that have little or no evidence to support them. For example, the mini-nuke hypothesis and the “Star Wars Beam” hypothesis.
  • Government scientists declaring that the evidence simply doesn’t exist.
  • Attempts to exaggerate the meaning of the evidence, for example by saying that thermite or nanothermite could not have caused all of the effects seen at the WTC.
  • Deceptive efforts to introduce the government contractors who produced the official accounts as independent scientists.

The last of these methods has been the most popular. Trying to debunk the tenth piece of evidence for WTC thermite, NIST contractor James Millette produced an unreviewed paper that purports to replicate the finding of nanothermite in the WTC dust. This was apparently organized in the hope that doing so would discredit all of the evidence for thermite at the WTC.

Millette is well known for having helped produce the official reports on the analysis of WTC dust. He was responsible for creating the form that was used to pre-screen all materials found in the dust prior to any analysis by official investigators. Those official reports did not mention any of the evidence listed above, in particular failing to report the abundant iron microspheres scattered throughout the WTC dust. Additionally, Millette’s official report team did not find any red-gray chips, let alone nanothermite.

As he worked to debunk the WTC thermite research, Millette was still unable to find any iron microspheres. But he did claim to have finally found the red-gray chips. Curiously, he did not attempt to replicate the testing that would determine if those chips were thermitic.

Claiming to have found the chips, Millette performed an XEDS analysis for elemental composition but failed to do any of the other tests, including BSE, DSC, the flame test, the MEK test, or measurement of the chip resistivity. Having inexplicably “ashed” the chips at 400 °C in a muffle furnace, thereby proving that they were not the nanothermite chips (which ignite at 430 °C), Millette ignored the remainder of the study he had set out to replicate. Because he did not do the DSC test, he could not do XEDS of the spheres formed from the chips. Since he had still not found spheres in the dust, he could not test those, and this allowed him to ignore the testing of spheres produced by the thermite reaction.

Millette rested his case on FTIR, which I have also performed on chips from WTC dust but with a much different result. Like Millette’s paper, my FTIR work is not yet part of a peer-reviewed publication and therefore should not be taken as authoritative evidence. There has been less urgency to this supplemental work because what has been done to date has received no legitimate response from the government or from much of the scientific community. That sad fact should be the central point of discussion today.

In any case, Millette attempted only one tenth of the tests in his struggle to replicate (or refute) one tenth of the evidence for thermite at the WTC. His un-reviewed “one percent approach” was nonetheless very convincing to many people, including some of the people who produced the official reports for 9/11. But it is obvious to others that Millette’s work was not a replication in any sense of the word.

I’m looking forward to the peer-reviewed scientific article that finally does replicate the nanothermite paper or any of the other peer-reviewed scientific papers that give evidence for thermite at the WTC. Hopefully, we can approach those efforts without concerns about the sources and without recalling all the deception and manipulation that preceded them.

Until then, it is important to recognize the difference between the superficial appearance of science and the actual practice of science. Ignoring 90 percent of the evidence is not scientific. And replication of the 10 percent means actually repeating the work. If thermite debunkers and alternate hypothesis supporters can find the courage and focus to step through that challenge, maybe they can begin to add to the discussion.

[1] Here are only a few examples of the hot wind:
“Then the dust cloud hits us. Then it got real hot. It felt like it was going to light up almost.” – Thomas Spinard, FDNY Engine 7
“A wave — a hot, solid, black wave of heat threw me down the block.” – David Handschuh, New York’s Daily News
“When I was running, some hot stuff went down my back, because I didn’t have time to put my coat back on, and I had some — well, I guess between first and second degree burns on my back.” – Marcel Claes, FDNY Firefighter
“And then we’re engulfed in the smoke, which was horrendous. One thing I remember, it was hot. The smoke was hot and that scared me.” – Paramedic Manuel Delgado
“I remember making it into the tunnel and it was this incredible amount of wind, debris, heat…” – Brian Fitzpatrick, FDNY Firefighter
“A huge, huge blast of hot wind gusting and smoke and dust and all kinds of debris hit me.” – Firefighter Louis Giaconelli
“This super-hot wind blew and it just got dark as night and you couldn’t breathe.” – Firefighter Todd Heaney

[2] For example, see Joel Meyerowitz, Aftermath: World Trade Center Archive, Phaidon Press, London, p. 178. See the photograph of the event on 11/08/01 that shows a stunning and immediate change in the cloud-like emissions from the pile, from dark smoke to white cloud.

December 8, 2013 Posted by | Deception, False Flag Terrorism, Science and Pseudo-Science, Timeless or most popular

Unminced Words By Climate Scientist Hans von Storch

“Scientists Too Quick To Claim Last Word”

NoTricksZone | November 18, 2013

The Resonator, the research podcast of the German Helmholtz Research Group, conducted a long interview (1 hr 40 min!) with climate scientist Hans von Storch, director of the GKSS Research Center. In the interview, von Storch was asked about his views on a wide variety of climate-science-related issues.

Overall the interview saw a Hans von Storch who spoke frankly and openly. Some of the remarks he made raised my eyebrows. In general von Storch, best described as a non-alarmist warmist, views the climate debate as being dominated by the more extreme positions from both sides, with voices in the middle getting drowned out. He levels a fair amount of criticism at the climate science community, but does so without naming any persons in particular.

Due to the sheer length of the interview, I will only look at the points that I found interesting and relevant as a skeptic.

Scientists too quick to accept dramatic scenarios

At the 15-minute mark von Storch describes a science so politicized that both sides try to make it black and white, and a debate that has been overly shrill. Some scientists, he says, have tended to accept dramatic scenarios and consequences even when there’s little evidence behind them. He also talks of a group of scientists who fancy themselves the ultimate authority with the last word. All the exaggerations and projections of doom, gloom and disaster have led to an overall discrediting of the field.

“Science and Nature are pretty bad journals”

At the 29-minute mark von Storch says he sees himself as someone who needs a lot of time before he is convinced of anything. I was surprised to hear him call both Science and Nature “pretty bad journals” when it comes to the quality of their articles. Hans von Storch cites an article published by Science claiming that the climate was going to tip in the year 2047, calling the report “a real doozy”. He says that science journals must remain sufficiently critical and not let themselves get caught up in the zeitgeist. Von Storch admits that he has not always been popular among the community.

Overall von Storch doesn’t blame the media much for the hysteria, implying that the hysteria stems more from scientists communicating poorly. The media are only interpreting what the scientists are spewing. Projections of snowless winters, for example, were hardly helpful in lending credibility to climate science.

Scientists dramatizing for attention and prestige

At the 37-minute mark von Storch says he believes some scientists succumbed to drama in order to get attention and prestige, and that such behavior only damages the credibility of climate science.

Models too CO2-centric

At the 40-minute mark von Storch discusses possible reasons why the warming has stalled and thinks other explanations need to be examined, such as solar activity and aerosols. He finds climate models too CO2-centric in general. Here he appeals for more patience to let the science unfold.

At the 45-minute mark he fires harsh criticism at scientists who promote a society governed by an elite technocracy, calling the idea “stupidity”. He calls the proposals made by a group of scientists in favor of appointing future councils to represent the interests of future generations “peculiar”.

At the 59-minute mark, on whether storms are becoming more frequent and severe, von Storch says he doesn’t think this is the case and that the disasters are more about the over-development of coastal areas.

Hockey stick was “something dumb” – an attempt to steer politics

On the hockey stick chart, at the 63-minute mark, von Storch has some blunt words on how it was possible for it to become the icon that it did. He recalls having examined the chart himself and finding it deficient.

“I believe it was something dumb by scientists who wanted to steer politics.”

He thinks the climate science community was too quick to call it the last word. Hans von Storch sees the critique of the hockey stick as confirmed, and that’s why it no longer appears in the IPCC reports. Scientists, von Storch reminds us, should not be so quick to claim absolute truth.

Also, von Storch believes that the oceans could be warming up, but that there is very little data out there to confirm it.

November 23, 2013 Posted by | Science and Pseudo-Science

Mummy, The Ocean’s Eaten My Heat!!

By Paul Homewood | Not A Lot Of People Know That | September 24, 2013

I’ve been meaning to post on this for a while. We often hear the claim that all of the missing heat has been gobbled up by the oceans.

Now, let’s leave aside some of the obvious problems with this theory, such as:

  • How all of this heat has selectively and mysteriously managed to avoid land areas.
  • How warm water has managed to sink instead of rise.
  • How the water at, or near, the surface seems to have escaped this warming.

And get straight to the nub of the matter.

Water has a much higher heat capacity than air. According to NOAA,

The oceans store more heat in the uppermost 3 meters (10 feet) than the entire atmosphere (above it).

So let’s run some very simple calculations.

In the last decade, most models were predicting something of the order of 0.2C global warming. If, instead of warming the atmosphere, this extra heat has gone into the sea, its effects will be much diluted, with the result that increases in sea temperatures will be much, much less than 0.2C.

(Remember, it takes much more energy to warm a bucket of water by 1C than a bucket of air.)

The suggestion is that, as there has been no noticeable warming in the upper 100 meters, this “hidden heat” has sunk as far as 2000 meters down.

So, ocean temperature should have increased by:

2000 Meters Divided By 3 Meters = 666.6

0.2C Divided By 666.6 = 0.0003C
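The arithmetic above can be sketched in a few lines of Python. This is a back-of-envelope illustration only, taking NOAA’s statement to mean that the top 3 meters of ocean hold roughly as much heat as the entire atmosphere:

```python
# Back-of-envelope dilution: a decade's "missing" 0.2C of atmospheric
# warming, assumed hidden in the top 2000 m of ocean instead.

atmos_equivalent_depth_m = 3.0   # NOAA: top 3 m of ocean ~ whole atmosphere
mixing_depth_m = 2000.0          # depth the hidden heat is said to reach
missing_warming_c = 0.2          # predicted-but-unobserved warming

dilution = mixing_depth_m / atmos_equivalent_depth_m  # about 666.7
ocean_warming_c = missing_warming_c / dilution        # about 0.0003 C

print(f"dilution factor: {dilution:.1f}")
print(f"implied ocean warming: {ocean_warming_c:.4f} C")
```

The whole argument hangs on the dilution factor: the same energy spread through a layer hundreds of times more heat-capacious produces a temperature change hundreds of times smaller.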

The idea that we can:

  • measure sea temperatures throughout all the oceans of the world
  • measure it throughout the whole depth down to 2000 meters and more.
  • take into account seasonal changes
  • take into account shifting ocean cycles and currents.

and still be able to measure the overall temperature to better than three ten-thousandths of a degree is patent nonsense.

So step forward Professor Ted Shepherd, a leading atmospheric scientist and recently installed as Grantham Chair in Climate Science at Reading University.

He had this to say to the Guardian.

“The heat is still coming in, but it appears to have gone into the deep ocean and, frustratingly, we do not have the instruments to measure there.”

Or to put it another way, we have no idea whether it is or not, but in the meantime we’ll still cling to our theory.

November 21, 2013 Posted by | Deception, Science and Pseudo-Science

20 tips for interpreting scientific claims

By Judith Curry | Climate Etc. | November 20, 2013

This list will help non-scientists to interrogate advisers and to grasp the limitations of evidence – William J. Sutherland, David Spiegelhalter and Mark A. Burgman.

Nature has published a very interesting comment, titled Twenty tips for interpreting scientific claims. Excerpts:

Perhaps we could teach science to politicians? It is an attractive idea, but which busy politician has sufficient time? The research relevant to the topic of the day is interpreted for them by advisers or external advocates.

In this context, we suggest that the immediate priority is to improve policy-makers’ understanding of the imperfect nature of science. The essential skills are to be able to intelligently interrogate experts and advisers, and to understand the quality, limitations and biases of evidence.

To this end, we suggest 20 concepts that should be part of the education of civil servants, politicians, policy advisers and journalists — and anyone else who may have to interact with science or scientists. Politicians with a healthy scepticism of scientific advocates might simply prefer to arm themselves with this critical set of knowledge.

Differences and chance cause variation. The real world varies unpredictably. Science is mostly about discovering what causes the patterns we see. Why is it hotter this decade than last? There are many explanations for such trends, so the main challenge of research is teasing apart the importance of the process of interest from the innumerable other sources of variation.

No measurement is exact. Practically all measurements have some error. If the measurement process were repeated, one might record a different result. In some cases, the measurement error might be large compared with real differences. Results should be presented with a precision that is appropriate for the associated error, to avoid implying an unjustified degree of accuracy.

Bias is rife. Experimental design or measuring devices may produce atypical results in a given direction. Confirmation bias arises when scientists find evidence for a favoured theory and then become insufficiently critical of their own results, or cease searching for contrary evidence.

Bigger is usually better for sample size. The average taken from a large number of observations will usually be more informative than the average taken from a smaller number of observations. That is, as we accumulate evidence, our knowledge improves. This is especially important when studies are clouded by substantial amounts of natural variation and measurement error.
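As a quick illustration of the sample-size point (my own sketch, not from the Nature comment), simulating noisy measurements shows the scatter of the average shrinking roughly as 1/√n:

```python
import random

random.seed(42)

def scatter_of_mean(n, trials=2000):
    """Standard deviation of the mean of n noisy measurements
    (true value 0, measurement noise with standard deviation 1)."""
    means = []
    for _ in range(trials):
        sample = [random.gauss(0.0, 1.0) for _ in range(n)]
        means.append(sum(sample) / n)
    mu = sum(means) / trials
    return (sum((m - mu) ** 2 for m in means) / trials) ** 0.5

small = scatter_of_mean(10)     # roughly 1/sqrt(10), about 0.32
large = scatter_of_mean(1000)   # roughly 1/sqrt(1000), about 0.03
print(small, large)
```

A hundred-fold increase in observations buys only a ten-fold reduction in the scatter, which is why studies clouded by large natural variation need very large samples.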

Correlation does not imply causation. It is tempting to assume that one pattern causes another. However, the correlation might be coincidental, or it might be a result of both patterns being caused by a third factor — a ‘confounding’ or ‘lurking’ variable.

Regression to the mean can mislead. Extreme patterns in data are likely to be, at least in part, anomalies attributable to chance or error.

Extrapolating beyond the data is risky. Patterns found within a given range do not necessarily apply outside that range.

Scientists are human. Scientists have a vested interest in promoting their work, often for status and further research funding, although sometimes for direct financial gain. This can lead to selective reporting of results and occasionally, exaggeration. Peer review is not infallible: journal editors might favour positive findings and newsworthiness. Multiple, independent sources of evidence and replication are much more convincing.

Feelings influence risk perception. Broadly, risk can be thought of as the likelihood of an event occurring in some time frame, multiplied by the consequences should the event occur. People’s risk perception is influenced disproportionately by many things, including the rarity of the event, how much control they believe they have, the adverseness of the outcomes, and whether the risk is voluntary or not.
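The likelihood-times-consequences definition is easy to write down directly. The events and numbers below are hypothetical, chosen only to show how a vivid rare event can carry less expected risk than a mundane frequent one:

```python
def risk(probability, consequence):
    """Risk = likelihood of the event in some time frame,
    multiplied by the consequence should it occur."""
    return probability * consequence

# Hypothetical numbers for illustration only.
rare_vivid = risk(1e-6, 1_000_000)  # one-in-a-million event, large loss
common_dull = risk(0.01, 500)       # one-in-a-hundred event, small loss

print(rare_vivid, common_dull)  # the mundane risk comes out larger
```

The gap between the two numbers and how people actually rank the two dangers is exactly the perception bias the tip describes.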

Data can be dredged or cherry picked. Evidence can be arranged to support one point of view. The question to ask is: ‘What am I not being told?’

JC comments: I really like the idea behind this article:

What we offer is a simple list of ideas that could help decision-makers to parse how evidence can contribute to a decision, and potentially to avoid undue influence by those with vested interests.

I suspect this article will not be appreciated by scientists who are playing power politics with their expertise, or by advocates promoting scientism with cherry-picked evidence.

I picked 10 of the 20 tips that I thought were of greatest relevance to the climate change debate.

November 21, 2013 Posted by | Science and Pseudo-Science

Global warming, Typhoon Haiyan and the Philippines

By Michel Chossudovsky | RT | November 14, 2013

Typhoon Haiyan (Yolanda), the strongest tropical typhoon ever recorded, has resulted in devastating consequences for the Philippines. The natural disaster took the lives of more than 10,000 people.

An estimated 615,000 residents have been displaced. Up to 4.3 million people have been affected, according to government sources.

The tragedy has become a talking point at the Warsaw Climate Change Conference, held under UN auspices. The devastation caused by Typhoon Haiyan has casually been attributed, without evidence, to the impacts of global warming.

While there is no scientific evidence that the super typhoon was a consequence of global warming, opening statements at the Warsaw summit hinted in no uncertain terms at a verified causal relationship. The executive director of the UN Framework Convention on Climate Change (UNFCCC), Christiana Figueres, stated (without evidence) that the typhoon was part of the “sobering reality” of global warming.

In turn, the Philippines’ UN representative at the Climate Change talks, Yeb Sano, stated in his address at the opening session that “Typhoons such as Yolanda (Haiyan) and its impacts represent a sobering reminder to the international community that we cannot afford to procrastinate on climate action. Warsaw must deliver on enhancing ambition and should muster the political will to address climate change.”

In a bitter irony, the tragedy in the Philippines has contributed to reinforcing a consensus which indirectly feeds the pockets of corporations lobbying for a new deal on carbon trade. ‘Cap-and-trade’ is a multibillion dollar bonanza which is supported by the global warming consensus.

According to UNFCCC executive director Christiana Figueres, “We must clarify finance that enables the entire world to move towards low-carbon development… We must launch the construction of a mechanism that helps vulnerable populations to respond to the unanticipated effects of climate change.”

It is known and documented that cap-and-trade markets are manipulated. What is at stake is the trade in carbon derivatives, which is controlled by powerful financial institutions including JP Morgan Chase. In 2008, Simon Linnett, executive vice-chairman of Rothschild, acknowledged the nature of this multibillion dollar business.

“As a banker, I also welcome the fact that the cap-and-trade system is becoming the dominant methodology for CO2 control. Unlike taxation, or plain regulation, cap-and-trade offers the greatest scope for private sector involvement and innovation,” he said, as quoted by The Telegraph.

Cap-and-trade packaged into derivative products feeds on the global warming consensus. Without it, this multibillion dollar trade would fall flat.

The humanitarian crisis in the Philippines bears no relationship to global warming. The social impacts of Typhoon Haiyan are aggravated by the lack of infrastructure and social services, not to mention the absence of a coherent housing policy. Those most affected by the typhoon are living in poverty in makeshift homes.

A reduction of CO2 emissions – as suggested by Yeb Sano in his address at the Warsaw summit – will not resolve the plight of an impoverished population.

In the Philippines, the social impacts of natural disasters are invariably exacerbated by a macro-economic policy framework imposed by Manila’s external creditors.

What is at stake is the deadly thrust of neoliberal economic reforms. For more than 25 years – since the demise of the Marcos dictatorship – the International Monetary Fund’s “economic medicine” under the helm of the Washington Consensus has prevailed, largely serving the interests of financial institutions and corporations in mining and agribusiness.

The government of Philippine President Benigno Aquino has embarked upon a renewed wave of austerity measures which involves sweeping privatization and the curtailment of social programs. In turn, a large chunk of the state budget has been redirected to the military, which is collaborating with the Pentagon under Obama’s “Asia Pivot.” This program – which serves the interests of Washington at the expense of the Philippines population – also includes a $1.7 billion purchase of advanced weapons systems.


Deconstructing the hype on Super Typhoon Haiyan – Yolanda

By Paul Homewood | Watts Up With That? | November 13, 2013

Now that we have had a few days to reflect on the terrible events of last week, we can start to piece together some of the facts.

First of all, as it is the thing that really matters above all, fatalities. The good news, if it can be termed that, is that the death toll is likely to be around 2000 to 2500, according to the Philippine President. This is much less than the 10,000 originally feared to have died.

As far as the storm itself is concerned, the official statistics from the Philippine met agency, PAGASA, remain the same as those issued at the time. The table below compares these with the original satellite estimates put out by the Joint Typhoon Warning Centre (JTWC), which were subsequently used by the media around the world to claim that Yolanda was the “strongest storm ever”.

                             PAGASA    JTWC
Sustained wind speed (mph)      147     195
Gusts (mph)                     171     235

Full article

See also:

Some historical perspectives on Typhoon Haiyan-Yolanda

November 13, 2013 Posted by | Corruption, Deception, Science and Pseudo-Science, Timeless or most popular

Public Relations (Spin Doctors) Deliberately Deceived Public About Global Warming and Climate Change

By Dr. Tim Ball | Watts Up With That |

Half the work done in the world is to make things appear what they are not. E.R. Beadle.

In a 2003 speech, Michael Crichton, a graduate of Harvard Medical School and author of State of Fear, said,

I have been asked to talk about what I consider the most important challenge facing mankind, and I have a fundamental answer. The greatest challenge facing mankind is the challenge of distinguishing reality from fantasy, truth from propaganda. Perceiving the truth has always been a challenge to mankind, but in the information age (or as I think of it, the disinformation age) it takes on a special urgency and importance.

We live largely in a virtual reality, as Public Relations (PR) and its methods are applied to every aspect of our lives. The term “spin doctors” is more appropriate because that is what they are really doing. A spin doctor is defined as: a spokesperson employed to give a favorable interpretation of events to the media, esp. on behalf of a political party. It doesn’t say truthful interpretation. There are lies of commission and omission, and this definition bypasses the category of omission. It’s reasonable to argue that deliberately committing a sin of omission encompasses both. A favorable interpretation means there is deliberate, premeditated deception. The person knows the truth, but selects information to create a false interpretation.

Despite all the discussion and reports about weather and climate, the public are unaware of even the most fundamental facts. Recently, I gave a three-hour presentation with questions and answers. The audience was educated people who distrust government and were sympathetic to my information. I decided to illustrate my point and concern by asking a few basic questions. Nobody could tell me the difference between weather and climate. Nobody could name the three major so-called greenhouse gases, let alone explain the mechanics of the greenhouse theory. My goal was not to embarrass, but to illustrate how little they knew and how easily PR can deceive and misdirect.

Few people exemplify or describe the modern PR views better (worse?) than Jim Hoggan, President of a large Canadian PR company, Hoggan and Associates, writing in the Vancouver Sun on December 30, 2005.

Want good coverage? Tell a good story. When your business is under siege, you can’t hope to control the situation without first controlling the story. The most effective form of communication is a compelling narrative that ties your interest to those of your audience. This is particularly critical when you’re caught in the spotlight; it doesn’t matter if you have the facts on your side if your detractors are framing the story. So, don’t just react. Take some time now to define your company story. Then you’ll be ready to build a response into that narrative should something go wrong.

Environment and climate suffer more from spinning than most areas, and Hoggan, as Chair of the David Suzuki Foundation and owner of a large PR company, has a long connection with both. He is the proud founder and supporter of the web site DeSmogBlog, as he explains in his book about the climate cover-up. The objective was to denigrate people by creating “favorable interpretations” to the following questions: “Were these climate skeptics qualified? Were they doing any research in the climate change field? Were they accepting money, directly or indirectly, from the fossil fuel industry?” This wasn’t about answering the questions skeptics were asking about the science. Richard Littlemore, Hoggan’s co-author and senior writer for DeSmogBlog, revealed what was going on in a December 2007 email to Michael Mann.

Hi Michael [Mann],

I’m a DeSmogBlog writer [Richard LIttlemore] (sic) (I got your email from Kevin Grandia) and I am trying to fend off the latest announcement that global warming has not actually occurred in the 20th century.

It looks to me like Gerd Burger is trying to deny climate change by “smoothing,” “correcting” or otherwise rounding off the temperatures that we know for a flat fact have been recorded since the 1970s, but I am out of my depth (as I am sure you have noticed: we’re all about PR here, not much about science) so I wonder if you guys have done anything or are going to do anything with Burger’s intervention in Science. (Emphasis added)

The hypocrisy is profound because nobody ever questioned Al Gore’s qualifications or his financial, career or political rewards. No promoters of global warming, such as Bill McKibben, Ross Gelbspan, Seth Borenstein, Andrew Revkin or most members of the Intergovernmental Panel on Climate Change (IPCC), are challenged. Borenstein exposed his bias in a leaked email of July 23, 2009 to the Climatic Research Unit (CRU) gang. He wrote, “Kevin (Trenberth), Gavin (Schmidt), Mike (Mann), It’s Seth again. Attached is a paper in JGR today that Marc Morano is hyping wildly. It’s in a legit journal. Watchya think?” A journalist talking to scientists is legitimate, but, as with the other leaked emails, the tone and subjectivity are telling. “Again” means there was previous communication. At least Revkin left the New York Times, apparently because of such exposure.

The problem began the moment environmentalism and climate were exploited for political agendas and people asked questions. If you can’t answer the questions, you either admit that or initiate personal attacks. Spin doctors use two basic types of attack.

• The individual is named and a slur applied. These are usually false or at best taken out of context. This includes guilt by association and taking payment from an agency or belonging to a group the slanderer considers inappropriate. It is an ad hominem.

• Individuals are marginalized by putting them in a group, with a term created to imply they are at best outside any norm. For example, anyone who asks about President Obama’s biography, despite obvious limitations of data availability, is called a “Birther”. Anyone who is troubled by incomplete, unclear, or illogical explanations for events is called a “Conspiracy theorist”. There is no word or phrase for falsifying information about a group; a collective ad hominem is a contradiction. Guilt by association has some application, but a term like “Birther” has a different function: it is a collective label designed to discredit anyone it is assigned to. There can be no general name because the objective is to identify the group with a specific issue. This is necessary as part of the goal of marginalizing or isolating.

Early indicators of the politicizing of climate included the claim of a consensus. The word applies in politics, not science. Calling people who questioned the science “skeptics” was further evidence. “Skeptic” is negative for the public, and is defined as “a person inclined to question or doubt all accepted opinions.” Most think it is the definition of a cynic: “a person who believes that people are motivated purely by self-interest rather than acting for honorable or unselfish reasons.” The problem is most people don’t know that scientists must be skeptics.

The epithet “global warming skeptic” was applied to me years ago and was used in questions from the media. When I explained I accepted global warming the media was surprised. They didn’t understand when I explained my skepticism was about the cause – the claim it was due to human CO2. Some labeled me a contrarian, but it wasn’t effective because few know what it means.

When the basic assumption of the IPCC hypothesis, that increased CO2 causes increased temperature, stopped holding after 1998, the attackers changed the subject and the pejorative. They raised the smearing level because they were losing the battle for the public mind. Now it became “climate change”, and the questioners became “deniers”, with the deliberate association with “holocaust deniers”.

Ironically, like all those so labeled, I am anything but a denier. My 40-year career involved teaching people how much climate changes naturally over time. The IPCC were deliberately constrained by their terms of reference to human causes and don’t consider natural changes. Rather, they provide a “favorable interpretation” for their political objective of blaming human CO2. It’s an interpretation, a required spin, to counter what Huxley called ugly facts.

Every time a problem appeared, public relations people appeared and strategized a defense, usually to divert from the problem. When the emails were leaked from the Climatic Research Unit (CRU), a public relations person was engaged. After the November 2009 leak, the University of East Anglia hired Neil Wallis of the Outside Organisation to handle the fallout. University spokesperson Trevor Davies said it was a “reputation management” problem, which he said they don’t handle well. Apparently they didn’t consider telling the truth. The leaked emails triggered a shock wave that required a top political spin doctor. Wallis, a former editor at the News of the World, was later arrested in connection with the phone hacking scandals that led to the resignations of the London Metropolitan Police Commissioner and Deputy Commissioner, as well as Andy Coulson, Prime Minister Cameron’s press secretary.

Michael Mann’s 2004 email to CRU Director Phil Jones was evidence of the PR battle. Confronted by challenging questions they apparently developed a defensive mentality.

“I’ve personally stopped responding to these, they’re going to get a few of these op-ed pieces out here and there, but the important thing is to make sure they’re loosing (sic) the PR battle. That’s what the site is about. By the way, Gavin did come up w/ the name!”

The “site” is the web site Realclimate, named by Gavin (Schmidt). But science doesn’t need PR, so why do climate scientists use it? The apparent answer is they are not telling the truth and worse, know it.

I opened with a quote from Michael Crichton so it is fitting to end with his closing remarks.

Because in the end, science offers us the only way out of politics. And if we allow science to become politicized, then we are lost. We will enter the Internet version of the dark ages, an era of shifting fears and wild prejudices, transmitted to people who don’t know any better. That’s not a good future for the human race. That’s our past. So it’s time to abandon the religion of environmentalism, and return to the science of environmentalism, and base our public policy decisions firmly on that.

The problem and challenge is that the population generally divides into 80 percent who struggle with science and 20 percent who are comfortable with it. I taught a science credit for arts students for 25 years, so I know the challenges. This makes resolving Crichton’s challenge of “distinguishing reality from fantasy, truth from propaganda” even more difficult. It is almost impossible when professional spin doctors are deliberately diverting, misleading and creating confusion.

“The improver of natural knowledge absolutely refuses to acknowledge authority, as such. For him, skepticism is the highest of duties; blind faith the one unpardonable sin.” – Thomas H. Huxley

“A danger sign of the lapse from true skepticism into dogmatism is an inability to respect those who disagree.” – Dr. Leonard George

“It is error alone which needs the support of government. Truth can stand by itself.” –Thomas Jefferson

November 8, 2013 Posted by | Deception, Science and Pseudo-Science, Timeless or most popular

US makes first step toward banning trans fats

RT | November 7, 2013

The Food and Drug Administration announced on Thursday that it would require the food industry to phase out the use of artificial trans fats in its products.

The FDA said it has made a preliminary determination that the primary source of trans fat – partially hydrogenated oils – is no longer “generally recognized as safe,” and that it plans to ban their use in the market. Some trans fat is naturally generated in meat and dairy products, and the ban will only apply to trans fat added to foods.

According to FDA Commissioner Margaret Hamburg, the decision could potentially prevent 20,000 heart attacks a year and 7,000 deaths.

Over the last decade, American consumption of trans fat has declined significantly. In 2006, the average citizen was consuming 4.6 grams of trans fat a day, while the number decreased to roughly one gram a day in 2012. Still, Hamburg said they “remain an area of significant public health concern,” according to NBC News.

Many companies began eliminating the use of trans fat when the FDA required them to list the ingredient on nutritional labels in 2006, but it can still be found in common products like frozen pizza, microwave popcorn, margarine, coffee creamer, and various desserts.

“The artery is still half clogged,” Dr. Thomas Frieden, the director of the Centers for Disease Control and Prevention, said to the New York Times. “This is about preventing people from being exposed to a harmful chemical that most of the time they didn’t even know was there.”

“It’s quite important,” he added, referring to the FDA’s new proposal. “It’s going to save a huge amount in health care costs and will mean fewer heart attacks.”

Numerous studies have shown that there is virtually no health benefit to consuming trans fat. It lowers the level of “good” cholesterol and raises levels of “bad” cholesterol, clogging the arteries and increasing the risk of heart attacks.

The FDA did not lay out a timetable for the ban. It will open its proposal to public comment for 60 days while it formulates a schedule that gives food manufacturers enough time to cooperate with the new rule.

“We want to do it in a way that doesn’t unduly disrupt markets,” Michael Taylor, the FDA’s deputy commissioner for foods, said to the Associated Press. At the same time, he said the food “industry has demonstrated that it is by and large feasible to do.”

Public health groups have welcomed the FDA’s proposal, which the agency has been collecting data for since 2009.

Should the FDA move forward with its plan, the United States will join other nations, such as Denmark, Iceland, and Switzerland, in banning the ingredient.

Still, there are numerous other ingredients that have been outlawed in various countries while still being sold in the U.S. An article by BuzzFeed over the summer noted that brominated vegetable oil, which has been linked to birth defects and organ damage, continues to be used in sports drinks and the popular soda Mountain Dew. It has been banned in more than 100 countries.

Meanwhile, the synthetic hormones rBGH and rBST, linked to cancer and infertility, continue to be given to cows and show up in dairy products that aren’t labeled otherwise. They’ve been banned in Japan, Canada, New Zealand, Australia, and the European Union.

Earlier this month, the FDA banned three of the four brands of arsenic-laced animal feed that were being given to chickens, turkeys, and pigs. The decision came four years after the Center for Food Safety called on the FDA to remove the feed, but one brand remains on the market.

November 7, 2013 Posted by | Science and Pseudo-Science