Japan hopes seabed will yield data and resources
DW | January 17, 2014
With scant energy and mineral reserves of its own, and nuclear plants mothballed since the Fukushima nuclear disaster, Japan is investing heavily in exploring beneath the oceans for resources that will power its future.
Seabed off coast of Japan
On the first day of 2014, the Japanese research ship Chikyu set a new record by drilling down to a point 3,000 meters beneath the seabed off southern Japan. It was an appropriate way to ring in the new year, and it signals an increased commitment to learning more about the secrets that lie beneath the floor of the ocean close to Japan.
The research has two distinct but connected driving forces. As Japan prepares to mark the third anniversary of the March 11 Great East Japan Earthquake, the Chikyu is undertaking the most extensive survey ever attempted of the Nankai Trough, a geological fault that extends for several hundred kilometers parallel to the southern coast of Japan and is widely seen as the source of the next major earthquake that will affect this tremor-prone nation. And with all of Japan’s nuclear reactors presently mothballed in the aftermath of the disaster, which destroyed the Fukushima Dai-Ichi nuclear plant, there is a new sense of urgency in the search for sources of energy and other natural resources close to Japan.
Limited natural resources
“When I was in elementary school, we learned that Japan does not have many natural resources of its own and that we needed to import all the oil, the gas, the metals and minerals that we needed,” Toshiyaki Mizuno, the deputy director of the Ocean and Earth Division at the ministry of science and technology, told DW.
“And that was what we thought for a long time,” he said. “Until we recently discovered that there are significant deposits of methane hydrates within Japan’s exclusive economic zone.”
Also known as natural gas hydrate or “fire ice,” it is a solid compound in which high levels of methane are trapped in a crystal structure of water. Methane hydrate was originally believed to exist only in the outer reaches of the solar system, but significant deposits are now being discovered beneath seabed sediment, and it is estimated that supplies are as much as 10 times the known reserves of natural gas.
The dream of new energy
“There are many problems that we need to overcome before we can say that Japan’s energy problems have been solved, but the dream is to exploit this new source of energy and other resources and this is the first step in achieving that,” Mizuno said.
The Japanese government has announced plans to work with private companies to develop new technologies to explore the resources that are below the seabed off Japan, including the development of advanced submersibles and remote-controlled underwater vehicles.
Companies will work with no fewer than four Japanese ministries (trade and industry; science and technology; land and infrastructure; and internal affairs), and there are hopes that the proposed recovery of resources could go ahead in as little as five years.
The government is putting aside a portion of the 50 billion yen (352.3 million euros) budget for strategic innovation projects to support the ambitious drive, with organizations such as the Japan Agency for Marine-Earth Science and Technology tasked with developing submarines that can operate at depths of up to 3,000 meters and large-scale excavation ships.
“This issue is becoming quite urgent for Japan because the government’s growth policy to date has largely focused on the weakening yen, which means that all imports of resources and energy are very expensive,” said Martin Schulz, senior economist at the Fujitsu Research Institute.
“Japan has to reduce those costs over the long term and developing these undersea resources is becoming much more economic than it was before,” he said.
“It is also important in terms of Japan’s energy mix as it does not seem likely that the nuclear reactors will be restarted in a significant way in the immediate future,” he added.
“Exploring close to Japan’s coastline for these resources makes complete sense, although we also know that methane hydrates can be extremely dangerous to collect and develop,” he said.
At the same time as Japan attempts to reduce its reliance on expensive imports and distance itself from relying on volatile suppliers of rare earth minerals – such as China – it is also in a hurry to learn more about the geological structure of the surface of the Earth close to the Japanese archipelago and the threats that natural disasters pose.
A Chinese navy missile frigate passes a drilling rig at the Tianwaitian gas field in the East China Sea, in a photograph taken by a Japanese Maritime Self-Defense Force patrol plane on September 9, 2005.
Questions over sovereignty and natural resources in the East China Sea have led to disputes with China
The drilling being conducted by the Chikyu is to examine the layers beneath the seabed in the Nankai Trough. In March last year, a study by the Central Disaster Management Council, commissioned as a direct result of the earthquake that struck northeast Japan, predicted that a magnitude-9 quake in the danger zone could trigger a tsunami as much as 30 meters high that could kill 320,000 people.
Such a disaster would destroy road and rail links the length of the country; the tsunami would pulverize buildings already weakened by the tremor; infrastructure would be wiped out for hundreds of kilometers along the coast; and the projected cost of the damage wrought on the country is 220 trillion yen (1.84 trillion euros).
Given the scale of the threat, scientists say there is no time to lose in trying to determine when and precisely where the disaster might strike.

Senate EPW Hearing on the President’s Climate Action Plan
By Judith Curry | Climate Etc. | January 16, 2014
The hearing is now concluded, I’m on a plane flying back to Atlanta.
The testimony from each of the witnesses is now online [here]. The link for my testimony is [here].
The content of my verbal remarks is below:
I would like to thank the Committee for the opportunity to present testimony this morning. I am Chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology. I have devoted 30 years to conducting research on topics including climate of the Arctic, the role of clouds and aerosols in the climate system, and the climate dynamics of extreme weather events.
The premise of the President’s Climate Action Plan is that there is an overwhelming judgment of science that anthropogenic global warming is already producing devastating impacts. Anthropogenic greenhouse warming is a theory whose basic mechanism is well understood, but whose magnitude is highly uncertain. Multiple lines of evidence presented in the recent IPCC 5th assessment report suggest that the case for anthropogenic warming is now weaker than in 2007, when the 4th assessment report was published.
My written testimony documented the following evidence:
- For the past 16 years, there has been no significant increase in surface temperature. There is a growing discrepancy between observations and climate model projections; observations since 2011 have fallen below the 90% envelope of climate model projections.
- The IPCC does not have a convincing or confident explanation for this hiatus in warming.
- There is growing evidence of decreased climate sensitivity to atmospheric carbon dioxide concentrations.
- Based on expert judgment in light of this evidence, the IPCC 5th assessment report lowered its surface temperature projection relative to the model projections for the period 2016-2036.
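As an illustration of what “falling below the 90% envelope” means in the first bullet point above, here is a minimal sketch using entirely synthetic numbers; the ensemble size, trend values, and spread are assumptions for illustration, not actual CMIP output:

```python
# Synthetic illustration of an observation falling below the 90% envelope
# of an ensemble of model projections. All numbers here are made up.
import random
import statistics

random.seed(42)

# Hypothetical ensemble: 100 model runs projecting ~0.2 C/decade warming
# plus random run-to-run variability.
projections = [0.20 + random.gauss(0, 0.05) for _ in range(100)]

# The central 90% envelope runs from the 5th to the 95th percentile.
cuts = statistics.quantiles(projections, n=20)  # cut points at 5%, 10%, ..., 95%
lower, upper = cuts[0], cuts[-1]

observed_trend = 0.05  # hypothetical observed trend, C/decade

below_envelope = observed_trend < lower
print(f"90% envelope: [{lower:.3f}, {upper:.3f}] C/decade")
print(f"observation below envelope: {below_envelope}")
```

The point of the exercise is only definitional: an observation is “outside the 90% envelope” when it falls below the ensemble’s 5th percentile (or above its 95th).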
The growing evidence that climate models are too sensitive to CO2 has implications for the attribution of late 20th century warming and projections of 21st century climate change. Sensitivity of the climate to carbon dioxide, and the level of uncertainty in its value, is a key input into the economic models that drive cost-benefit analyses, including estimates of the social cost of carbon.
If the recent warming hiatus is caused by natural variability, then this raises the question as to what extent the warming between 1975 and 2000 can also be explained by natural climate variability. In a recent journal publication, I provided a rationale for projecting that the hiatus in warming could extend to the 2030’s. By contrast, according to climate model projections, the probability of the hiatus extending beyond 20 years is vanishingly small. If the hiatus does extend beyond 20 years, then a very substantial reconsideration will be needed of the 20th century attribution and the 21st century projections of climate change.
Attempts to modify the climate through reducing CO2 emissions may turn out to be futile. The stagnation in greenhouse warming observed over the past 15+ years demonstrates that CO2 is not a control knob that can fine tune climate variability on decadal and multi-decadal time scales. Even if CO2 mitigation strategies are successfully implemented and climate model projections are correct, an impact on the climate would not be expected for a number of decades. Further, solar variability, volcanic eruptions and natural internal climate variability will continue to be sources of unpredictable climate surprises.
As a result of the hiatus in warming, there is growing appreciation for the importance of natural climate variability on multi-decadal timescales. Further, the IPCC AR5 and Special Report on Extreme Events published in 2012, find little evidence that supports an increase in most extreme weather events that can be attributed to humans.
The perception that humans are causing an increase in extreme weather events is a primary motivation for the President’s Climate Action Plan. However, in the U.S., most types of weather extremes were worse in the 1930’s and even in the 1950’s than in the current climate, while the weather was overall more benign in the 1970’s. The extremes of the 1930’s and 1950’s are not attributable to greenhouse warming and are associated with natural climate variability (and in the case of the dustbowl drought and heat waves, also to land use practices). This sense that extreme weather events are now more frequent and intense is symptomatic of pre-1970 ‘weather amnesia’.
The frequency and intensity of extreme weather events is heavily influenced by natural climate variability. Whether or not anthropogenic climate change is exacerbating extreme weather events, vulnerability to extreme weather events will continue to increase owing to increasing population and concentration of wealth in vulnerable regions. Regions that find solutions to current problems of climate variability and extreme weather events and address challenges associated with an increasing population are likely to be well prepared to cope with any additional stresses from climate change.
Nevertheless, the premise of dangerous anthropogenic climate change is the foundation for a far-reaching plan to reduce greenhouse gas emissions and reduce vulnerability to extreme weather events. Elements of this Plan may be argued as important for associated energy policy reasons, economics, and/or public health and safety. However, claiming an overwhelming scientific justification for the Plan based upon anthropogenic global warming does a disservice both to climate science and to the policy process.
Good judgment requires recognizing that climate change is characterized by conditions of deep uncertainty. Robust policy options that can be justified by associated policy reasons, whether or not anthropogenic climate change is dangerous, avoid the hubris of pretending to know what will happen with the 21st century climate.
This concludes my testimony.
JC comments: The hearing was very long, not so much because of the questioning of the witnesses as because of the pontification by committee members (much more of this than on the House subcommittees, it seems).
Several things struck me. All of the members seem pretty well educated on the topic of climate change. I cannot say the same of the administrators on the first panel.
Most of the members were there for Panel 1; only a few remained for Panel 2.
I’m fairly happy with my written testimony, but was surprised that my verbal testimony went over the time limit (I have never gone over before). The questions were fairly lightweight.
Andrew Dessler did a pretty good job particularly on the verbal testimony and answering questions.
All in all, a very interesting experience, but stressful since you need to pretty much drop everything to prepare your testimony (and I have a pile of things that need to be finished before tomorrow).
So does any of this matter? We’ll see. I felt that my previous testimony to the House Committee did have an impact.

James Hansen’s Policies Are Shafting The Poor
By Willis Eschenbach | Watts Up With That? | March 15, 2013
I was reading an interview with Adrian Bejan (worth taking a look at), and I got to musing about his comments regarding the relationship between energy use and per capita income. So I pulled up GapMinder, the world’s best online visualization software. Here’s a first cut at the relationship between energy and income.
Figure 1. Energy use per person (tons of oil equivalent, TOE) versus average income, by country. Colors show geographical regions. Size of the circle indicates population. The US is the large yellow circle at the top right. Canada is the overlapping yellow circle. China is the large red circle, India the large light blue circle. Here’s a link to the live Gapminder graph so you can experiment with it yourself.
Clearly, other than a few outliers, the relationship between energy use and income is quite straightforward. You can’t have one without the other. Well, that’s not quite true, you can have energy without income. You can have (relatively) high energy use without having the corresponding income, plenty of Africa is in that boat. But the reverse is not true—you can’t have high income without high energy use. You need the energy to make the income.
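The claimed relationship can be sketched numerically. The country figures below are illustrative round numbers standing in for the pattern the Gapminder chart shows, not actual Gapminder data:

```python
# A rough numerical sketch of the energy-income relationship described
# above. The five "country" data points are hypothetical round numbers
# chosen only to encode the pattern: income rising with energy use.
import math

# (energy use in tonnes of oil equivalent per person, income in USD/year)
countries = {
    "low-income":   (0.3,  1_000),
    "lower-middle": (0.7,  4_000),
    "upper-middle": (2.0, 12_000),
    "high-income":  (4.0, 35_000),
    "very-high":    (7.0, 50_000),
}

energy = [e for e, _ in countries.values()]
log_income = [math.log10(i) for _, i in countries.values()]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(energy, log_income)
print(f"correlation of energy use with log income: r = {r:.2f}")
# With any monotonically increasing figures like these, r comes out
# strongly positive, matching the article's point that income tracks energy.
```

The log scale on income mirrors the Gapminder chart’s axis; the correlation is a stand-in for the visual “up and to the right” pattern, not a statistical claim about real data.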
Now, James Hansen is the NASA guy who is leading the charge to stop all forms of cheap energy. Coal is bad, terrible stuff in his world. He calls trains of coal “death trains”. He wants to deny cheap energy to all of those folks in the bottom half of the graph above. Well, actually, he wants to deny access to cheap energy to everyone, but where it hurts is the bottom half of the graph. For example, the World Bank and other international funding agencies, at the urging of folks like Hansen, have been turning down loans for coal plants in developing countries.
But as you can see, if you deny energy to those folks, that is the same as denying them development. Because when there’s less energy, there’s less income. The two go hand in hand. So what James Hansen is advising is that we should take money from the poor … actually he wants to deny them cheap energy, but that means denying them income and the development that accompanies it.
A look at the history of some of the countries is instructive in that regard, to see how the income and the energy use have changed over time. Figure 2 shows the history of some selected countries.
Figure 2. A history of selected countries. Colors now show crude birth rate (births per thousand)
Now, this is showing something very interesting. It may reveal why Hansen thinks he’s doing good. Notice that for countries where people make below say $20,000 of annual income, the only way up is up and to the right … which means that the only way to increase income is to increase energy use. Look at India and China and Brazil and Spain and the Netherlands as examples. (Note also that crude birth rate is tied to increasing income, and that the crude birth rate in the US has dropped by about half since 1960.)
Above that annual income level of ~$20,000, however, something different happens. The countries start to substitute increased energy efficiency for increased energy use. This is reflected in the vertical movement of, say, the US, where the 2011 per capita energy use is exactly the same as the 1968 per capita energy use. And Canada is using the same energy per person as in 1977 … so let’s take a closer look at the upper right section of the chart. Figure 3 shows an enlargement of just the top right of the chart, displaying more countries.
Figure 3. A closeup of Figure 2, showing more countries. Start date is 1968 for clarity.
Now, this is interesting. Many, perhaps most of these countries show vertical or near vertical movement during the last twenty years or so. And the recent economic crash has caused people to be more conservative about energy use, squeezing more dollars out per ton of oil equivalent.
But that only happens up at the high end of the income spectrum, where people are making above about twenty or even twenty-five thousand dollars per year. You need to have really good technology to make that one work, to produce more income without using more energy. You need to be in what is called a “developed” nation.
When people think “development”, they often think “bulldozers”. But they should think “energy efficiency”, because that is the hallmark of each technological advance—it squeezes more stuff out of less energy. But you have to be in an industrialized, modern society to take advantage of that opportunity.
So this may be the reason for Hansen’s attitude toward energy use. He may not know that most of the world is not in the situation of the US. This may be the reason that he claims we should curtail energy use by all means possible. He may not see that while the US and industrialized countries can get away with that, in part because we waste a lot of energy and have a lot of both money and technology, the poor and even the less well off of the world have little energy or money to waste.
For those poorer countries and individuals, which make up the overwhelming bulk of the world’s population, a reduction in energy use means a reduction in the standard of living. And the part Hansen and his adherents don’t seem to get is that for most of the world, the standard of living is “barely” … as in barely making ends meet.
As is usual in this world, the situation of the rich and the poor is different, and in this case the break line is high. Twenty grand of income per year is the line dividing those who can take advantage of technology to get more income with the same energy, and the rest, which is most of the world. Most of the world are still among those who must use more energy to increase their income. They don’t have the option the US and the developed nations have. They must increase energy use to increase income.
And when you start jacking up energy prices and discouraging the use of cheap energy sources around the planet, as Hansen and his adherents are doing, the poorest of the poor get shafted. James Hansen is making lots and lots of money. He’s comfortably in the top 1% of the world’s population by income, and he obviously doesn’t give much thought to the rest. We know this because if he thought about the poor he’d realize that while he is mouthing platitudes about how he’s doing his agitation and advocacy for his grandchildren’s world in fifty years, what he’s doing is shafting the poor today in the name of his grandchildren. Of course Hansen is not the first rich white guy to do that, so I suppose I really shouldn’t be surprised, but still …
Increased energy prices, often in the form of taxes and “cap-and-trade” and “renewable standards”, are THE WORLD’S MOST REGRESSIVE TAX. Hansen proposes taxing the living daylights out of the poor, but he won’t feel the pain. He can stand a doubling of the gas prices, no problem. But when electricity and gas prices double around the planet, POOR PEOPLE DIE … and Hansen just keeps rolling, he has quarter-million-dollar awards from his friends and a fat government salary and a princely retirement pension you and I paid for, he couldn’t care less about increased energy prices. He’s one of the 1%, why should he pay attention to the poor?
Forgive the shouting, but the damn hypocrisy is infuriating, and I’m sick of being nice about it. James Hansen and Michael Mann and Gavin Schmidt and Phil Jones and Peter Gleick and the rest of the un-indicted co-conspirators are a bunch of rich arrogant 1%er jerkwagons who don’t care in the slightest about the poor. Not only that, but they’ve given the finger to the rest of the climate scientists and to the scientific establishment, most of whom have said nothing in protest, and far too many of whom have approved of their malfeasance.
Their patented combination of insolent arrogance and shabby science would be bad enough if that was all they were doing … but they are hurting poor people right now. Their policies are causing harder times for the poor today, as we speak … and they mouth platitudes about how they are saving the poor from some danger they won’t see for fifty years?
If you ask the poor whether they’d rather get shafted for sure today, or possibly get shafted in fifty years, I know what they’d tell you. To me, hurting the poor today under the rubric of saving them in half a century from an unsubstantiated and fanciful danger is moral dishonesty of the first order.
So let me say to all of you folks who claim the world is using too much energy, you have the stick by the wrong end. The world needs to use MORE energy, not less, because there is no other way to get the poor out of poverty. It can’t be done without cheap energy. We need to use more energy to lift people out of bone-crushing poverty, not use less and condemn them to brutal lives. And to do that, energy needs to be cheaper, not more expensive.
Let me be crystal clear, and speak directly to Hansen and other global warming alarmists. Any one of you who pushes for more expensive energy is hurting and impoverishing and killing the poor today. Whether through taxes or cap-and-trade or renewable subsidies or blocking drilling or any other way, increasing energy costs represent a highly regressive tax of the worst kind. And there is no escape at the bottom end, quite the opposite. The poorer you are, the harder it bites.
So please, don’t give us the holier-than-thou high moral ground stance. Spare us the “we’re noble because we are saving the world” BS. When a poor single mother of three living outside Las Vegas has her gas costs double, she has little choice other than to cut out some other essential item, food or doctor visits or whatever … because her budget doesn’t have any of the non-essential items that James Hansen’s budget contains, and she needs the gas to get to work, that’s not optional.
For her, all her money goes to essentials— so if gas costs go up, her kids get less of what they need. You’re not saving the world, far from it. You’re taking food out of kids’ mouths.
You are causing pain and suffering to the poor and acting like your excrement has no odor … but at least there is some good news. People are no longer buying your story. People are realizing that if someone argues for expensive energy, they are anti-human, anti-development, and most of all, without compassion for the poor. They are willing to put the most damaging, regressive, destructive tax imaginable on the poorest people of the planet.
Now those of you advocating for higher energy prices, after reading this, you might still fool the media about what you are doing to the poor. And you might avoid mentioning to your co-workers the real results of your actions. And you could still deceive your friends about the question of the poor, or even your wife or husband.
But by god, you can no longer fool yourself about it. As of now, you know that agitating for more expensive energy for any reason hurts the poor. What you do with that information is up to you … but you can’t ignore it, it will haunt you at 3 AM, and hopefully, it will make you think about the less fortunate folk of our planet and seriously reconsider your actions. Because here’s the deal. Even if CO2 will damage the poor in 50 years, hurting the poor now only makes it worse. If you think there is a problem, then look for a no-regrets solution.
Because if you truly care about the poor, and you are afraid CO2 will increase the bad weather and harm the poor fifty years from now, you owe it to them to find a different response to your fears of CO2, a response that doesn’t hurt the poor today.

Drug Companies and Doctors Boost Profits Pitching Attention Deficit Disorder
By Noel Brinkerhoff | AllGov | December 17, 2013
With the help of physicians, pharmaceutical makers have made billions of dollars peddling medicines to treat attention deficit disorder, leading some experts, and even one pharmaceutical executive, to declare that the marketing push has gone too far.
Last year, sales of stimulant medication intended to treat attention deficit hyperactivity disorder (ADHD) reached $9 billion—a fivefold increase from a decade ago.
Today, 15% of high school students have been diagnosed with ADHD, with about 3.5 million of them on some sort of drug marketed to treat the disorder.
Dr. Keith Conners, who has spent decades trying to help children with ADHD, has questioned the increasing rates of diagnosis, calling them “a national disaster of dangerous proportions.”
“The numbers make it look like an epidemic. Well, it’s not. It’s preposterous,” Conners, a psychologist and professor emeritus at Duke University, told The New York Times. “This is a concoction to justify the giving out of medication at unprecedented and unjustifiable levels.”
The drug industry has worked for two decades to publicize ADHD and promote its remedies to doctors, educators and parents. As a result, the disorder is now the second most frequent long-term diagnosis made in children, just behind asthma.
Drugs such as Ritalin, Adderall, Concerta, Focalin, Vyvanse, Intuniv and Strattera have been promoted to help children, but along the way, the Food and Drug Administration has cited every major ADHD drug for false and misleading advertising since 2000.
Doctors also have been criticized for taking money from drug companies to publish research and deliver presentations that encourage colleagues to prescribe these drugs, which possess significant side effects and are regulated in the same class as morphine and oxycodone because of their potential for abuse and addiction.
Now, companies want to market the medications to adults to further expand revenue-making opportunities.
Roger Griggs, the pharmaceutical executive who introduced Adderall in 1994, objects to marketing stimulants to the general public because of the risks involved. He called the drugs “nuclear bombs” that should rarely be prescribed and carefully monitored by a treating physician, according to the Times.
To Learn More:
The Selling of Attention Deficit Disorder (by Alan Schwarz, New York Times)
Latest Condition Invented by Drug Companies…Low Testosterone (by Matt Bewig, AllGov)
Drug Companies Increase Profits by Creating Fear of Diseases (and Even Diseases) (by David Wallechinsky, AllGov)
Unminced Words By Climate Scientist Hans von Storch
“Scientists Too Quick To Claim Last Word”
NoTricksZone | November 18, 2013
The Resonator, the research podcast of the German Helmholtz Association, conducted a long interview (1 hr 40 min!) with climate scientist Hans von Storch, director of the GKSS Research Center. In the interview von Storch was asked about his views on a wide variety of climate science related issues.
Overall the interview saw a Hans von Storch who spoke frankly and openly. Some of the remarks he made raised my eyebrows. In general von Storch, best described as a non-alarmist warmist, views the climate debate as being dominated by the more extreme positions from both sides, with voices in the middle getting drowned out. He levels a fair amount of criticism at the climate science community, but does so without naming any persons in particular.
Due to the sheer length of the interview, I will only look at the points that I found interesting and relevant as skeptic.
Scientists too quick to accept dramatic scenarios
At the 15-minute mark von Storch describes a science that is so politicized, with both sides trying to make it black and white, and a debate that has been overly shrill. Some scientists, he says, have tended to accept dramatic scenarios and consequences even when there’s little evidence behind them. He also talks of a group of scientists who fancy themselves as the ultimate authority and who have the last word. All the exaggerations and projections of doom, gloom and disaster have led to an overall discrediting of the field.
“Science and Nature are pretty bad journals”
At the 29-minute mark von Storch says he sees himself as someone who needs a lot of time before he is convinced of anything. I was surprised to hear him call both Science and Nature “pretty bad journals” when it comes to the quality of their articles. Hans von Storch cites an article published by Science claiming that the climate was going to tip in the year 2047, calling the report “a real doozy”. He says that science journals must remain sufficiently critical and not let themselves get caught up with the zeitgeist. Von Storch admits that he has not always been popular among the community.
Overall von Storch doesn’t blame the media much for the hysteria, implying that the hysteria stems more from scientists communicating poorly. The media are only interpreting what the scientists are spewing. Projections of snowless winters, for example, were hardly helpful in lending credibility to climate science.
Scientists dramatizing for attention and prestige
At the 37-minute mark von Storch says he believes some scientists succumbed to drama in order to get attention and prestige, and that such scientists are only damaging the credibility of climate science.
Models too CO2-centric
At the 40-minute mark von Storch discusses possible reasons why the warming has stalled and thinks other explanations need to be examined, such as solar activity and aerosols. He finds climate models too CO2-centric in general. Here he appeals for more patience to let the science unfold.
At the 45-minute mark he fires harsh criticism at scientists who promote a society governed by an elite technocracy, calling the idea “stupidity”. He calls the proposals made by a group of scientists in favor of appointing councils to represent the interests of future generations “peculiar”.
At the 59-minute mark, on whether storms are becoming more frequent and severe, von Storch says he doesn’t think this is the case and that the disasters are more about the over-development of coastal areas.
Hockey stick was “something dumb” – an attempt to steer politics
On the hockey stick chart, at the 63-minute mark, von Storch has some blunt words on how it was possible for it to become the icon that it did. He recalls having examined the chart himself and finding it deficient.
“I believe it was something dumb by scientists who wanted to steer politics.”
He thinks the climate science community was too quick to call it the last word. Hans von Storch sees the critique of the hockey stick as confirmed, and that is why it no longer appears in the IPCC reports. Scientists, von Storch reminds us, should not be so quick to claim absolute truth.
Also, von Storch believes that the oceans could be warming up, but that there is very little data out there to confirm it.
Mummy, The Ocean’s Eaten My Heat!!
By Paul Homewood | Not A Lot Of People Know That | September 24, 2013
I’ve been meaning to post on this for a while. We often hear the claim that all of the missing heat has been gobbled up by the oceans.
Now, let’s leave aside some of the obvious problems with this theory, such as:
- How all of this heat has selectively and mysteriously managed to avoid land areas.
- How warm water has managed to sink instead of rise.
- How the water at, or near, the surface seems to have escaped this warming.
And get straight to the nub of the matter.
Water has a much higher heat capacity than air. According to NOAA,
The oceans store more heat in the uppermost 3 meters (10 feet) than the entire atmosphere (above it).
So let’s run some very simple calculations.
In the last decade, most models were predicting something of the order of 0.2C global warming. If, instead of warming the atmosphere, this extra heat has gone into the sea, its effects will be much diluted, with the result that increases in sea temperatures will be much, much less than 0.2C.
(Remember, it takes much more energy to warm a bucket of water by 1C than a bucket of air.)
The suggestion is that, as there has been no noticeable warming in the upper 100 meters, this “hidden heat” lies as far as 2000 meters down.
So, ocean temperature should have increased by:
2000 meters ÷ 3 meters ≈ 666.7
0.2C ÷ 666.7 ≈ 0.0003C
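The post's back-of-the-envelope dilution argument can be sketched in a few lines of Python (this is only a restatement of the arithmetic above, taking NOAA's 3-meter equivalence as the conversion factor):

```python
# NOAA equivalence: the top ~3 m of ocean holds as much heat
# as the entire atmosphere above it.
atmos_equiv_depth_m = 3.0      # ocean depth with the atmosphere's heat capacity
mixing_depth_m = 2000.0        # depth the "hidden heat" is said to reach
predicted_air_warming_c = 0.2  # decadal warming figure cited in the post

dilution = mixing_depth_m / atmos_equiv_depth_m       # ~666.7
ocean_warming_c = predicted_air_warming_c / dilution  # ~0.0003 C

print(f"dilution factor: {dilution:.1f}")
print(f"implied ocean warming: {ocean_warming_c:.4f} C")
```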
The idea that we can:
- measure sea temperatures throughout all the oceans of the world
- measure it throughout the whole depth down to 2000 meters and more
- take into account seasonal changes
- take into account shifting ocean cycles and currents
and still be able to measure the overall temperature to better than three ten thousandths of a degree is patent nonsense.
So step forward Professor Ted Shepherd, a leading atmospheric scientist and recently installed as Grantham Chair in Climate Science at Reading University.
He had this to say to the Guardian.
“The heat is still coming in, but it appears to have gone into the deep ocean and, frustratingly, we do not have the instruments to measure there.”
Or to put it another way, we have no idea whether it is or not, but in the meantime we’ll still cling to our theory.
20 tips for interpreting scientific claims
By Judith Curry | Climate Etc. | November 20, 2013
This list will help non-scientists to interrogate advisers and to grasp the limitations of evidence – William J. Sutherland, David Spiegelhalter and Mark A. Burgman.
Nature has published a very interesting comment, titled Twenty tips for interpreting scientific evidence. Excerpts:
Perhaps we could teach science to politicians? It is an attractive idea, but which busy politician has sufficient time? The research relevant to the topic of the day is interpreted for them by advisers or external advocates.
In this context, we suggest that the immediate priority is to improve policy-makers’ understanding of the imperfect nature of science. The essential skills are to be able to intelligently interrogate experts and advisers, and to understand the quality, limitations and biases of evidence.
To this end, we suggest 20 concepts that should be part of the education of civil servants, politicians, policy advisers and journalists — and anyone else who may have to interact with science or scientists. Politicians with a healthy scepticism of scientific advocates might simply prefer to arm themselves with this critical set of knowledge.
Differences and chance cause variation. The real world varies unpredictably. Science is mostly about discovering what causes the patterns we see. Why is it hotter this decade than last? There are many explanations for such trends, so the main challenge of research is teasing apart the importance of the process of interest from the innumerable other sources of variation.
No measurement is exact. Practically all measurements have some error. If the measurement process were repeated, one might record a different result. In some cases, the measurement error might be large compared with real differences. Results should be presented with a precision that is appropriate for the associated error, to avoid implying an unjustified degree of accuracy.
Bias is rife. Experimental design or measuring devices may produce atypical results in a given direction. Confirmation bias arises when scientists find evidence for a favoured theory and then become insufficiently critical of their own results, or cease searching for contrary evidence.
Bigger is usually better for sample size. The average taken from a large number of observations will usually be more informative than the average taken from a smaller number of observations. That is, as we accumulate evidence, our knowledge improves. This is especially important when studies are clouded by substantial amounts of natural variation and measurement error.
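The sample-size point can be made concrete with a quick simulation (an illustrative sketch, not from the article; the numbers are arbitrary): the spread of an estimated mean shrinks roughly as 1/√n, so averaging more noisy measurements pins down the underlying value more tightly.

```python
import random
import statistics

random.seed(42)

def mean_of_noisy_measurements(n, true_value=10.0, noise_sd=2.0):
    """Average n measurements of true_value corrupted by Gaussian noise."""
    return statistics.fmean(random.gauss(true_value, noise_sd) for _ in range(n))

# Repeat the experiment many times at each sample size and see how
# much the estimated mean scatters around the true value of 10.0.
for n in (5, 50, 500):
    estimates = [mean_of_noisy_measurements(n) for _ in range(1000)]
    spread = statistics.stdev(estimates)
    print(f"n={n:4d}: spread of estimates ~ {spread:.3f} "
          f"(theory: {2.0 / n ** 0.5:.3f})")
```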
Correlation does not imply causation. It is tempting to assume that one pattern causes another. However, the correlation might be coincidental, or it might be a result of both patterns being caused by a third factor — a ‘confounding’ or ‘lurking’ variable.
Regression to the mean can mislead. Extreme patterns in data are likely to be, at least in part, anomalies attributable to chance or error.
Extrapolating beyond the data is risky. Patterns found within a given range do not necessarily apply outside that range.
Scientists are human. Scientists have a vested interest in promoting their work, often for status and further research funding, although sometimes for direct financial gain. This can lead to selective reporting of results and occasionally, exaggeration. Peer review is not infallible: journal editors might favour positive findings and newsworthiness. Multiple, independent sources of evidence and replication are much more convincing.
Feelings influence risk perception. Broadly, risk can be thought of as the likelihood of an event occurring in some time frame, multiplied by the consequences should the event occur. People’s risk perception is influenced disproportionately by many things, including the rarity of the event, how much control they believe they have, the adverseness of the outcomes, and whether the risk is voluntary or not.
Data can be dredged or cherry picked. Evidence can be arranged to support one point of view. The question to ask is: ‘What am I not being told?’
JC comments: I really like the idea behind this article:
What we offer is a simple list of ideas that could help decision-makers to parse how evidence can contribute to a decision, and potentially to avoid undue influence by those with vested interests.
I suspect this article will not be appreciated by scientists who are playing power politics with their expertise, or by advocates promoting scientism with cherry-picked evidence.
I picked 10 of the 20 tips that I thought were of greatest relevance to the climate change debate.
Related article
- The 97% consensus myth – busted by a real survey (wattsupwiththat.com)
US makes first step toward banning trans fats
RT | November 7, 2013
The Food and Drug Administration announced on Thursday that it would require the food industry to phase out the use of artificial trans fats in its products.
The FDA said it has made a preliminary determination that the primary source of trans fat – partially hydrogenated oils – is no longer “generally recognized as safe,” and that it plans to ban their use in the market. Some trans fat is naturally generated in meat and dairy products, and the ban will only apply to trans fat added to foods.
According to FDA Commissioner Margaret Hamburg, the decision could potentially prevent 20,000 heart attacks a year and 7,000 deaths.
Over the last decade, American consumption of trans fat has declined significantly. In 2006, the average citizen was consuming 4.6 grams of trans fat a day, while the number decreased to roughly one gram a day in 2012. Still, Hamburg said they “remain an area of significant public health concern,” according to NBC News.
Many companies began eliminating the use of trans fat when the FDA required them to list the ingredient on nutritional labels in 2006, but it can still be found in common products like frozen pizza, microwave popcorn, margarine, coffee creamer, and various desserts.
“The artery is still half clogged,” Dr. Thomas Frieden, the director of the Centers for Disease Control and Prevention, said to the New York Times. “This is about preventing people from being exposed to a harmful chemical that most of the time they didn’t even know was there.”
“It’s quite important,” he added, referring to the FDA’s new proposal. “It’s going to save a huge amount in health care costs and will mean fewer heart attacks.”
Numerous studies have shown that there is virtually no health benefit to consuming trans fat. It lowers the level of “good” cholesterol and raises levels of “bad” cholesterol, clogging the arteries and increasing the risk of heart attacks.
The FDA did not lay out a timetable for the ban. It will open its proposal to public comment for 60 days while it formulates a schedule that gives food manufacturers enough time to cooperate with the new rule.
“We want to do it in a way that doesn’t unduly disrupt markets,” Michael Taylor, the FDA’s deputy commissioner for foods, said to the Associated Press. At the same time, he said the food “industry has demonstrated that it is by and large feasible to do.”
Public health groups have welcomed the FDA’s proposal, for which the agency has been collecting data since 2009.
Should the FDA move forward with its plan, the United States will join other nations such as Denmark, Iceland, and Switzerland, in banning the ingredient.
Still, there are numerous other ingredients that have been outlawed in various countries while still being sold in the U.S. An article by BuzzFeed over the summer noted that brominated vegetable oil, which has been linked to birth defects and organ damage, continues to be used in sports drinks and the popular soda Mountain Dew. It’s been banned in more than 100 countries.
Meanwhile, the synthetic hormones rBGH and rBST, linked to cancer and infertility, continue to be given to cows and show up in dairy products that aren’t labeled otherwise. They’ve been banned in Japan, Canada, New Zealand, Australia, and the European Union.
Earlier this month, the FDA banned three out of the four brands of arsenic-laced animal feed that was being given to chickens, turkeys, and pigs. The decision came four years after the Center for Food Safety called on the FDA to remove the feed, but one brand remains on the market.