A while ago, I received an email from a friend who asked:
How can many, many respected, competitive, independent science folks be so wrong about [global warming] (if your [skeptical] premise is correct). I don’t think it could be a conspiracy, or incompetence. … Has there ever been another case when so many ‘leading’ scientific minds got it so wrong?
The answer to the second part of my friend’s question—“Has there ever been another case when so many ‘leading’ scientific minds got it so wrong?”—is easy. Yes, there are many such cases, both within and outside climate science. In fact, the graveyard of science is littered with the bones of theories that were once thought “certain” (e.g., that the continents can’t “drift,” that Newton’s laws were immutable, and hundreds if not thousands of others).
Science progresses by the overturning of theories once thought “certain.”
As Carl Sagan wrote:
“Even a succession of professional scientists—including famous astronomers who had made other discoveries that are confirmed and now justly celebrated—can make serious, even profound errors in pattern recognition.”[1]
There is no reason to believe that climate scientists (alarmist or skeptic) are exempt from this possibility.
That leaves the first question, which is how so many “respected, competitive, independent science folks [could] be so wrong” about the causes and dangers of global warming, assuming they are wrong. And here, I confess that after five years of research into climate fears, this question still baffles me.
Climate certainty is baffling
It is not baffling that so many scientists believe humanity might be to blame for global warming. If carbon dioxide causes warming, additional CO2 should produce additional warming. But it’s baffling that alarmist climate scientists are so certain that additional carbon dioxide will produce a climate disaster, even though there is little empirical evidence to support this view, and much evidence against it, including a decade of non-warming. This dogmatism makes it clear, at least to those outside the alarmist climate paradigm, that something is very wrong with the state of “consensus” climate science.
There are many possible reasons for this scientific blindness, including sheer financial and career self-interest: scientists who don’t accept the alarmist paradigm will lose research grants and career doors will be closed to them. But one psychological diagnosis fits alarmist climate science like a glove: groupthink. With groupthink, we get the best explanation of “how can many, many respected, competitive, independent science folks be so wrong.”
Groupthink was extensively studied by Yale psychologist Irving L. Janis and described in his 1982 book Groupthink: Psychological Studies of Policy Decisions and Fiascoes.
Janis was curious about how teams of highly intelligent and motivated people—the “best and the brightest,” as David Halberstam called them in his 1972 book of that name—could produce policy disasters like the Vietnam War, Watergate, the failure to anticipate Pearl Harbor, and the Bay of Pigs invasion. Similarly, in 2008 and 2009, we saw the best and brightest of the world’s financial sphere crash the system through some remarkably foolish decisions, such as extending sub-prime mortgages to borrowers on the verge of bankruptcy.
In other words, Janis studied why and how groups of highly intelligent professional bureaucrats and, yes, even scientists, screw up, sometimes disastrously and almost always unnecessarily. The reason, Janis believed, was “groupthink.” He quotes Nietzsche’s observation that “madness is the exception in individuals but the rule in groups,” and notes that groupthink occurs when “subtle constraints … prevent a [group] member from fully exercising his critical powers and from openly expressing doubts when most others in the group appear to have reached a consensus.”[2]
Janis found that even if the group leader expresses an openness to new ideas, group members value consensus more than critical thinking; groups are thus led astray by excessive “concurrence-seeking behavior.”[3] Therefore, Janis wrote, groupthink is “a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.”[4]
The groupthink syndrome
The result is what Janis calls “the groupthink syndrome.” This consists of three main categories of symptoms:
1. Overestimation of the group’s power and morality, including “an unquestioned belief in the group’s inherent morality, inclining the members to ignore the ethical or moral consequences of their actions.” [emphasis added]
2. Closed-mindedness, including a refusal to consider alternative explanations and stereotyped negative views of those who aren’t part of the group’s consensus. The group takes on a “win-lose fighting stance” toward alternative views.[5]
3. Pressure toward uniformity, including “a shared illusion of unanimity concerning judgments conforming to the majority view”; “direct pressure on any member who expresses strong arguments against any of the group’s stereotypes”; and “the emergence of self-appointed mind-guards … who protect the group from adverse information that might shatter their shared complacency about the effectiveness and morality of their decisions.”[6]
It’s obvious that alarmist climate science—as explicitly and extensively revealed in the Climatic Research Unit’s “Climategate” emails—shares all of these defects of groupthink, including a huge emphasis on maintaining consensus, a sense that because they are saving the world, alarmist climate scientists are beyond the normal moral constraints of scientific honesty (“overestimation of the group’s power and morality”), and vilification of those (“deniers”) who don’t share the consensus.
For example, regarding Symptom 1, overestimation of the group’s power and morality: leading consensus climate spokespeople like Al Gore, James Hansen, and Stephen Schneider have stated outright that they consider it acceptable, even moral, to exaggerate global-warming claims to gain public support, even if this violates the broader scientific principle of adherence to truth at all costs (see http://www.paulmacrae.com/?p=51 for examples). Consensus climate science also overestimates humanity’s power to override climate change, whether human-caused or natural, just as government planners overestimated the ability of the U.S. to win the Vietnam War.
Regarding Symptom 2, closed-mindedness, there are many cases of the alarmist climate paradigm ignoring or suppressing evidence that challenges the AGW hypothesis. The Climategate emails, for example, discuss refusing publication to known skeptics and even firing an editor favorable to skeptics.
Regarding Symptom 3, pressure toward uniformity: within alarmist climate science there is a “shared illusion of unanimity” (i.e., a belief in total consensus) about the majority view when this total or near-total consensus has no basis in reality. For example, the Oregon Petition against the anthropogenic warming theory has 31,000 signatories, over 9,000 of them with PhDs.
Climate scientists who dare to deviate from the consensus are censured as “deniers”—a choice of terminology that can only be described as odious. And the Intergovernmental Panel on Climate Change explicitly aims for “consensus” in its reports—it does not publish minority reports, and yet it is impossible that its alleged more than “2,000 scientists” could completely agree on a subject as complicated as climate.
Group polarization
Janis notes one other form of dysfunctional group dynamic that arises out of groupthink and that, in turn, helps create even more groupthink:
The tendency for the collective judgments arising out of group discussions to become polarized, sometimes shifting toward extreme conservatism and sometimes toward riskier forms of action than the individual members would otherwise be prepared to take.[7]
This dynamic is commonly referred to as “group polarization.”
As a process, “when like-minded people find themselves speaking only with one another, they get into a cycle of ideological reinforcement where they end up endorsing positions far more extreme than the ones they started with.”[8] [emphasis added]
And because these positions are so extreme, they are held with extreme ferocity against all criticisms.
Examples of alarmist climate groupthink
Groupthink is common in academic disciplines. For example, philosopher Walter Kaufmann, a world-renowned editor of Nietzsche’s works, identifies groupthink in his discipline as follows:
There is a deep reluctance to stick out one’s neck: there is safety in numbers, in belonging to a group, in employing a common method, and in not developing a position of one’s own that would bring one into open conflict with more people than would be likely to be pleased.[9]
Similarly, in the 2009 Climategate emails, CRU director Phil Jones shows this “deep reluctance to stick out one’s neck” in writing (July 5, 2005):
“The scientific community would come down on me in no uncertain terms if I said the world had cooled from 1998.”
Keith Briffa laments (Sept. 22, 1999):
“I know there is pressure to present a nice tidy story as regards ‘apparent unprecedented warming in a thousand years or more in the temperature proxy data’ but in reality the situation is not quite so simple. … I believe that the recent warmth was probably matched about 1,000 years ago.”
Elsewhere, Briffa notes (April 29, 2007):
“I tried hard to balance the needs of the science and the IPCC, which were not always the same. I worried that you might think I gave the impression of not supporting you well enough while trying to report on the issues and uncertainties.”
All of the above (there are many more examples in the Climategate emails) reveal scientific groupthink, which puts the needs and desires of a peer group—the desire for “consensus”—ahead of the scientific facts. We would, undoubtedly, find other examples of alarmist groupthink if we could examine the emails of other promoters of climate alarmism, like James Hansen’s Goddard Institute.
This groupthink isn’t at all surprising. After all, alarmist climate scientists attend several conferences a year with like-minded people (the views of outright “deniers” are not welcome, as the CRU emails clearly reveal). In the absence of real debate or dissent they easily persuade themselves that human beings are the main reason the planet is warming and it’s going to be a catastrophe. Why? Because everyone else seems to think so and, in groupthink, consensus is highly valued. The same principles operate strongly, of course, in religion.
The ‘hockey stick’ and groupthink
Climate alarmists will, of course, angrily dispute that climate science groupthink is as strong as claimed here. However, groupthink is clearly identified in the 2006 Wegman report into the Michael Mann hockey stick controversy.
The Wegman report was commissioned by the U.S. House Energy and Commerce Committee after Mann refused to release all the data behind his hockey stick conclusions, conclusions that eliminated the Medieval Warm Period and the Little Ice Age in order to present today’s warming as unprecedented. In fact, as mathematician Steve McIntyre discovered after years of FOI requests, the calculations in Mann’s paper had not been checked by the paper’s peer reviewers and were, in fact, wrong.
The ad hoc committee, led by statistician Edward Wegman, identified one of the reasons why Mann’s paper was so poorly peer-reviewed:
There is a tightly knit group of individuals who passionately believe in their thesis. However, our perception is that this group has a self-reinforcing feedback mechanism and, moreover, the work has been sufficiently politicized that they can hardly reassess their public positions without losing credibility.[10] [emphasis added]
Wegman noted that the Mann paper became prominent because it “fit some policy agendas.”[11]
The Wegman Report also observed:
As statisticians, we were struck by the isolation of communities such as the paleoclimate community that rely heavily on statistical methods, yet do not seem to be interacting with the mainstream statistical community. The public policy implications of this debate are financially staggering and yet apparently no independent statistical expertise was sought or used.[12] [emphasis added]
In other words, alarmist climate scientists are part of an exclusive group that talks mainly with itself and avoids groups that don’t share the anthropogenic global warming hypothesis and alarmist political agenda. Overall, Wegman is describing with great precision a science community whose conclusions have been distorted and polarized by groupthink.
Recognizing groupthink
After the Climategate emails, some consensus climate scientists began to recognize the dangers of groupthink within their discipline. Thus, Georgia Tech climatologist Judith Curry wrote in 2009:
In my opinion, there are two broader issues raised by these emails that are impeding the public credibility of climate research: lack of transparency in climate data, and “tribalism” in some segments of the climate research community that is impeding peer review and the assessment process.[13]
Similarly, IPCC contributor Mike Hulme wrote:
It is possible that climate science has become too partisan, too centralized. The tribalism that some of the leaked emails display is something more usually associated with social organization within primitive cultures; it is not attractive when we find it at work inside science.[14] [emphasis added]
In short, it is clear that groupthink—a more clinical term for “tribalism”—is strongly at work within alarmist climate science, however much the affected scientists refuse to recognize it. As a result of tribalism (groupthink), alarmist climate science makes assertions that are often extreme (polarized), including the explicit or implicit endorsement of claims that global warming will lead to “oblivion,” “thermageddon,” mass extinctions, and the like. Indeed, one of the ironies of climate science is that extremist AGW believers like Gore, Hansen and Schneider have succeeded in persuading the media and public that those who don’t make grandiose claims, the skeptics, are the extremists.
Group polarization offers a rational explanation for extreme alarmist claims, given that the empirical scientific evidence is simply not strong enough to merit such confidence. It is likely that even intelligent, highly educated scientists have been caught in what has been called the “madness of crowds.” Indeed, writing in the Times Higher Education magazine, British philosopher Martin Cohen makes this connection explicit:
Is belief in global-warming science another example of the “madness of crowds”? That strange but powerful social phenomenon, first described by Charles Mackay in 1841, turns a widely shared prejudice into an irresistible “authority”. Could it [belief in human-caused, catastrophic global warming] indeed represent the final triumph of irrationality?[16]
There is strong psychological evidence that alarmist fears of climate change are far more the result of groupthink and the group polarization process than scientific evidence and, yes, this alarmist groupthink has indeed led to the triumph of irrationality over reason.
Paul MacRae is the author of False Alarm: Global Warming—Facts Versus Fears.
Notes
1. Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark. New York: Ballantine Books, 1996, p. 49.
2. Irving L. Janis, Groupthink: Psychological Studies of Policy Decisions and Fiascoes. Boston: Houghton Mifflin, 1982, p. 3.
3. Janis, p. vii.
4. Janis, p. 9.
5. Janis, p. 247.
6. Janis, pp. 174-175.
7. Janis, p. 5.
8. Andrew Potter, “The newspaper is dying—hooray for democracy.” Maclean’s, April 7, 2008, p. 17.
9. Walter Kaufmann, Critique of Religion and Philosophy. Princeton, NJ: Princeton Univ. Press, 1990 (1958), p. 51.
10. Edward Wegman, et al., “Ad Hoc Committee Report on the ‘Hockey Stick’ Global Climate Reconstruction.” U.S. House Committee on Energy and Commerce, 2006, p. 65.
11. Wegman, et al., p. 29.
12. Wegman, et al., p. 51.
13. Judith Curry, “On the credibility of climate research.” Climate Audit blog, Nov. 22, 2009.
14. Andrew Revkin, “A climate scientist who engages skeptics.” Dot.Earth, Nov. 27, 2009.
15. Steve Fuller, Kuhn vs. Popper: The Struggle for the Soul of Science. Cambridge: Icon Books, 2006 (2003), p. 105.
16. Martin Cohen, “Beyond debate?” Times Higher Education, Dec. 10, 2009.
June 8, 2016
Posted by aletho |
Science and Pseudo-Science, Timeless or most popular |
In US, Indian Premier Modi vows to improve ease of doing business
India and the US have signed an agreement to enhance cooperation on energy security, clean energy and climate change, and an MOU on cooperation in gas hydrates. In Washington on Tuesday, Indian Prime Minister Narendra Modi held extensive talks with US President Barack Obama on issues including climate change and nuclear energy.
A Reuters report quoted a Westinghouse Electric spokesperson as saying “negotiations continue” on building 6 nuclear reactors in India. A joint statement, after Modi-Obama talks, said India and the US Export-Import Bank were working to complete a financing package for the project.
The Indian Prime Minister also pushed for enlisting US support to India’s membership of the Nuclear Suppliers Group (NSG) and the Asia Pacific Economic Cooperation bloc, APEC.
A New York Times editorial argued that India has not yet merited an NSG berth.
India does not meet one of the major criteria for NSG membership: being a party to the Nuclear Non-Proliferation Treaty. Several countries, including Ireland, Austria and New Zealand, oppose India’s accession to the NSG.
Meanwhile, the US-India joint statement issued after the Modi-Obama talks makes no mention of the much-hyped South China Sea dispute, though the document does refer to “settlement of territorial disputes by peaceful means”.
“The leaders reiterated the importance they attach to ensuring freedom of navigation and overflight and exploitation of resources as per international law, including the United Nations Convention on the Law of the Sea (UNCLOS), and settlement of territorial disputes by peaceful means,” said the Indo-US joint statement on Tuesday.
The US has not signed the UN treaty, the UNCLOS.
A trilateral Russia-India-China (RIC) statement earlier this year echoed Beijing’s position that the disputes must be resolved between “parties directly involved”.
“Russia, India and China are committed to maintaining a legal order for the seas and oceans based on the principles of international law, as reflected notably in the UN Convention on the Law of Sea (UNCLOS). All related disputes should be addressed through negotiations and agreements between the parties concerned,” said the joint statement issued after the Russian, Chinese and Indian foreign ministers met in Moscow in April.
At the Oval Office meeting between Obama and Modi on Tuesday, the two leaders also reiterated their commitment to pursue low greenhouse gas emission development strategies in the pre-2020 period and to develop long-term low greenhouse gas emission development strategies.
New Delhi has vowed to join the Paris climate change deal this year, which would provide a “significant global momentum” towards implementation of the historic agreement, the White House said.
“We discussed how we can, as quickly as possible, bring the Paris Agreement into force,” Obama said.
Modi, who also addressed the US-India Business Council, stressed that the Indian government would “continue to make progress on improving the investment climate and ease of doing business”.
“We are encouraging foreign and domestic investors to set up high quality and efficient manufacturing facilities,” Modi told the audience.
On Tuesday, Amazon Chief Executive Jeff Bezos said his company would invest an additional $3 billion in India.
Two major American business bodies earlier this year, however, voiced disappointment with what they called “the glacial pace” of market reforms in India.
In a submission to the US commerce secretary, the US National Association of Manufacturers urged Washington to press for change during Modi’s visit.
“Despite statements made by Prime Minister Modi and other senior Indian officials over the past two years, there has been limited progress in many key areas that make it challenging to do business in India,” the group wrote.
US exporters to India have frequently complained about protectionist restrictions and high tariffs. India and the US have also dragged several trade disputes to the WTO.
The United States won a ruling against India at the WTO in February after challenging the rules on the origin of solar cells and solar modules used in India’s national solar power program. In April, Indian Minister of State for Power, Coal, New and Renewable Energy Piyush Goyal said the government intends to file 16 cases against the US for allegedly violating WTO treaties.
Modi is set to address the US Congress on Wednesday.
US to build 6 nuclear power plants in India: WH
The United States and India have agreed to move ahead with a plan to build six nuclear reactors in India, according to the White House.
The plan was finalized during a meeting between President Barack Obama and Indian Prime Minister Narendra Modi at the White House on Tuesday.
It will be the first such construction since the two countries signed a landmark nuclear accord in 2008.
The price for the project is still under discussion, but officials said more difficult issues like liability have been worked out.
India passed a law in 2010 that would make US companies constructing nuclear power plants in the country liable for accidents.
Under the new deal, India’s Nuclear Power Corporation and Westinghouse Electric Co. of the US will begin engineering work for the reactors, though the final contract is not expected to be completed until June 2017, White House officials said.
“Culminating a decade of partnership on civil nuclear issues, the leaders welcomed the start of preparatory work on-site in India for six AP 1000 reactors to be built by Westinghouse and noted the intention of India and the US Export-Import Bank to work together toward a competitive financing package for the project,” the White House said in a statement.
“Once completed, the project would be among the largest of its kind,” it added.
The deal is believed to be part of Washington’s drive to boost cooperation with India as a counterbalance to China.
Obama said at the meeting that the US and India intended to “cooperate more effectively in order to promote jobs, promote investment, promote trade and promote greater opportunities for our people.”
The meeting will be followed by a speech Wednesday by the Indian prime minister to a joint session of the US Congress, where he is expected to be greeted warmly by American lawmakers.
Modi also announced his intention to formally join the international climate-change agreement reached in Paris in December.
The inclusion of India is significant as it could guarantee that the Paris climate agreement will go into effect before the next US president takes office. India is the world’s third-largest emitter after China and the US.
Donald Trump, the presumptive Republican nominee for US president, has vowed to “cancel” the pact if elected.
It is Modi’s fourth visit to the US as New Delhi intends to forge closer ties with Washington before President Obama leaves office next year.
June 8, 2016
Posted by aletho |
Nuclear Power, Progressive Hypocrite, Science and Pseudo-Science, Timeless or most popular | India, Obama, United States |
The professional standards of science must impose a framework of discipline and at the same time encourage rebellion against it. – Michael Polanyi (1962)
A recent tweet by Andrea Saltelli reminded me of Michael Polanyi’s 1962 essay “The Republic of Science: Its Political and Economic Theory.”
Polanyi provides an interesting perspective from the mid 20th century, as the U.S. and Europe were contemplating massive public investments in science. Polanyi’s perspective was colored by his early years in Hungary, which led him to oppose central planning in the sciences.
I encourage you to read Polanyi’s entire essay; it contains many interesting reflections on the history and political philosophy of science. Below are some excerpts, with highlights, that provide the springboard for my own reflections on the state of science (particularly climate science) in the early 21st century.
Excerpts:
My title is intended to suggest that the community of scientists is organised in a way which resembles certain features of a body politic and works according to economic principles similar to those by which the production of material goods is regulated.
The first thing to make clear is that scientists, freely making their own choice of problems and pursuing them in the light of their own personal judgment are in fact cooperating as members of a closely knit organisation. [T]he principle of their coordination consists in the adjustment of the efforts of each to the hitherto achieved results of the others. We may call this a coordination by mutual adjustment of independent initiatives–of initiatives which are coordinated because each takes into account all the other initiatives operating within the same system.
Such self-coordination of independent initiatives leads to a joint result which is unpremeditated by any of those who bring it about. Their coordination is guided as by ‘an invisible hand’ towards the joint discovery of a hidden system of things. Since its end-result is unknown, this kind of cooperation can only advance stepwise, and the total performance will be the best possible if each consecutive step is decided upon by the person most competent to do so. We may imagine this condition to be fulfilled for the fitting together of a jig-saw puzzle if each helper watches out for any new opportunities arising along a particular section of the hitherto completed patch of the puzzle, and also keeps an eye on a particular lot of pieces, so as to fit them in wherever a chance presents itself. The effectiveness of a group of helpers will then exceed that of any isolated member to the extent to which some member of the group will always discover a new chance for adding a piece to the puzzle more quickly than any one isolated person could have done by himself.
What I have said here about the highest possible coordination of individual scientific efforts by a process of self-coordination may recall the self-coordination achieved by producers and consumers operating in a market. It was, indeed, with this in mind that I spoke of ‘the invisible hand’ guiding the coordination of independent initiatives to a maximum advancement of science, just as Adam Smith invoked ‘the invisible hand’ to describe the achievement of greatest joint material satisfaction when independent producers and consumers are guided by the prices of goods in a market.
In the case of science, adjustment takes place by taking note of the published results of other scientists; while in the case of the market, mutual adjustment is mediated by a system of prices broadcasting current exchange relations, which make supply meet demand.
[T]he decisions of a scientist choosing a problem and pursuing it to the exclusion of other possible avenues of inquiry may be said to have an economic character. For his decisions are designed to produce the highest possible result by the use of a limited stock of intellectual and material resources. The scientist fulfils this purpose by choosing a problem that is neither too hard nor too easy for him. The line the scientist must choose turns out, therefore, to be that of greatest ego involvement; it is the line of greatest excitement, sustaining the most intense attention and effort of thought. He should not hesitate to incur such a loss, if it leads him to deeper and more important problems.
Both the criteria of plausibility and of scientific value tend to enforce conformity, while the value attached to originality encourages dissent. This internal tension is essential in guiding and motivating scientific work. The professional standards of science must impose a framework of discipline and at the same time encourage rebellion against it. They must demand that, in order to be taken seriously, an investigation should largely conform to the currently predominant beliefs about the nature of things, while allowing that in order to be original it may to some extent go against these.
The authority of scientific standards is thus exercised for the very purpose of providing those guided by it with independent grounds for opposing it. The capacity to renew itself by evoking and assimilating opposition to itself appears to be logically inherent in the sources of the authority wielded by scientific orthodoxy.
But who is it, exactly, who exercises the authority of this orthodoxy? No single scientist has a sound understanding of more than a tiny fraction of the total domain of science. [W]hile scientists can admittedly exercise competent judgment only over a small part of science, they can usually judge an area adjoining their own special studies that is broad enough to include some fields on which other scientists have specialised. And, of course, each scientist who is a member of a group of overlapping competences will also be a member of other groups of the same kind, so that the whole of science will be covered by chains and networks of overlapping neighbourhoods.
Admittedly, scientific authority is not distributed evenly throughout the body of scientists; some distinguished members of the profession dominate over others of a more junior standing. But the authority of scientific opinion remains essentially mutual; it is established between scientists, not above them.
Let me make it clear, even without going into detail, how great and varied are the powers exercised by this authority. Appointments to positions in universities and elsewhere, which offer opportunity for independent research, are filled in accordance with the appreciation of candidates by scientific opinion. Referees reporting on papers submitted to journals are charged with keeping out contributions which current scientific opinion condemns as unsound. Representatives of scientific opinion will pounce upon newspaper articles or other popular literature which would venture to spread views contrary to scientific opinion. The teaching of science in schools is controlled likewise. And, indeed, the whole outlook of man on the universe is conditioned by an implicit recognition of the authority of scientific opinion.
Only by securing popular respect for its own authority can scientific opinion safeguard the complete independence of mature scientists and the unhindered publicity of their results, which jointly assure the spontaneous coordination of scientific efforts throughout the world.
During the last 20 to 30 years, there have been many suggestions and pressures towards guiding the progress of scientific inquiry in the direction of public welfare. I appreciate the generous sentiments which actuate the aspiration of guiding the progress of science into socially beneficent channels, but I hold its aim to be impossible and nonsensical.
I argued that the present practice of filling vacant chairs by the most eminent candidate that the university can attract was the best safeguard for rational distribution of efforts over rival lines of scientific research. For the principal criterion for offering increased opportunities to a new subject was the rise of a growing number of distinguished scientists in that subject and the falling off of creative initiative in other subjects, indicating that resources should be withdrawn from them.
[L]ittle more can, or need, be done towards the advancement of science, than to assist spontaneous movements towards new fields of distinguished discovery, at the expense of fields that have become exhausted. Though special considerations may deviate from it, this procedure must be acknowledged as the major principle for maintaining a balanced development of scientific research.
Those who think that the public is interested in science only as a source of wealth and power are gravely misjudging the situation. Universities should have the courage to appeal to the electorate, and to the public in general, on their own genuine grounds. For the only justification for the pursuit of scientific research in universities lies in the fact that the universities provide an intimate communion for the formation of scientific opinion, free from corrupting intrusions and distractions. For though scientific discoveries eventually diffuse into all people’s thinking, the general public cannot participate in the intellectual milieu in which discoveries are made. Discovery comes only to a mind immersed in its pursuit. For such work the scientist needs a secluded place among like-minded colleagues who keenly share his aims and sharply control his performances.
The more widely the republic of science extends over the globe, the more numerous become its members in each country and the greater the material resources at its command, the more clearly emerges the need for a strong and effective scientific authority to reign over this republic. When we reject today the interference of political or religious authorities with the pursuit of science, we must do this in the name of the established scientific authority which safeguards the pursuit of science.
Consider, also, the fact that these scientific evaluations are exercised by a multitude of scientists, each of whom is competent to assess only a tiny fragment of current scientific work, so that no single person is responsible at first hand for the announcements made by science at any time. And remember that each scientist originally established himself as such by joining at some point a network of mutual appreciation extending far beyond his own horizon. Each such acceptance appears then as a submission to a vast range of value-judgments exercised over all the domains of science, which the newly accepted citizen of science henceforth endorses, although he knows hardly anything about their subject-matter. Thus, the standards of scientific merit are seen to be transmitted from generation to generation by the affiliation of individuals at a great variety of widely disparate points, in the same way as artistic, moral or legal traditions are transmitted. This conclusion gains important support from the fact that the methods of scientific inquiry cannot be explicitly formulated and hence can be transmitted only in the same way as an art, by the affiliation of apprentices to a master. The authority of science is essentially traditional.
But this tradition upholds an authority which cultivates originality. Scientific opinion imposes an immense range of authoritative pronouncements on the student of science, but at the same time it grants the highest encouragement to dissent from them in some particular. Scientific tradition enforces its teachings in general, for the very purpose of cultivating their subversion in the particular.
The Republic of Science shows us an association of independent initiatives, combined towards an indeterminate achievement. It is disciplined and motivated by serving a traditional authority, but this authority is dynamic; its continued existence depends on its constant self-renewal through the originality of its followers.
The Republic of Science is a Society of Explorers. Such a society strives towards an unknown future, which it believes to be accessible and worth achieving. In the case of scientists, the explorers strive towards a hidden reality, for the sake of intellectual satisfaction. And as they satisfy themselves, they enlighten all men and are thus helping society to fulfil its obligation towards intellectual self-improvement.
Since a dynamic orthodoxy claims to be a guide in search of truth, it implicitly grants the right to opposition in the name of truth–truth being taken to comprise here, for brevity, all manner of excellence that we recognise as the ideal of self-improvement. Th[is] freedom assures them the right to speak the truth as they know it.
JC reflections
Polanyi’s essay provides some interesting insights, as well as some striking contrasts with the Republic of Science in the early 21st century.
Polanyi’s analogy of the scientific process with markets captures the pure incentives that drive scientists – the search for truth, intellectual satisfaction and individual ego. What happens when the externalities of the Republic of Science produce perverse incentives, and careerism becomes a dominant incentive that rewards publishing many papers rapidly and producing headline-worthy results (never mind whether these papers survive scrutiny beyond their press release)? (see What is the measure of scientific success?) What happens is that you get an increasing incidence of scientific fraud (see Science: in the doghouse?), cherry picking, and meaningless papers on headline-grabbing topics that don’t stand up to the test of time (see Trust and don’t bother to verify).
And what happens when the ‘hand’ guiding science isn’t ‘invisible’, i.e. science is driven by politics, such as a political imperative to move away from fossil fuels and towards renewable energy? Federal funding can bias science, particularly in terms of selecting which scientific problems receive attention (link).
And what of Polanyi’s statement: “Such self-coordination of independent initiatives leads to a joint result which is unpremeditated by any of those who bring it about.” The ‘result’ of dangerous anthropogenic climate change and the harms of dietary fat were hardly unpremeditated.
When science is politically relevant and has been politicized, how objective are the authorities that are keepers of the orthodoxy — journal editors, officers of professional societies, university administrators — and how open are they to dissenting perspectives? The experiences of Lennart Bengtsson (link), my being called a ‘climate heretic’ (see my essay Heresy and the creation of monsters), Christopher Essex’s essay (link), Roger Pielke Jr’s experiences, and MANY more examples among climate scientists speak to the fact that the keepers of the climate science orthodoxy are failing in this regard [link to Are climate scientists being forced to toe the line?]. Without the internet and the blogosphere, these dissenting voices would be rendered silent by the keepers of the orthodoxy.
Climate and environmental sciences are far from the only scientific fields suffering in this way – the problem is also rampant in medicine, nutrition, and psychology [link to Partisanship and silencing science.]
Where lies the solution to this? Well, one possibility is reflected in Polanyi’s statement: “[L]ittle more can, or need, be done towards the advancement of science, than to assist spontaneous movements towards new fields of distinguished discovery, at the expense of fields that have become exhausted.” Now that climate science is ‘settled’, i.e. at least it is perceived to be sufficiently settled to provide the basis for a very expensive international climate ‘agreement’ (not treaty), perhaps future investments should be directed towards other fields that are deemed important or where greater progress can be made. This is exactly what has been happening in Australia, as the Turnbull administration has been axing climate jobs at CSIRO (link).
Is climate science ‘exhausted’ in terms of diminishing returns on future research? I would argue that climate science is an immature field with many unknowns; however, the current paradigm of using inadequate climate models to focus on human-caused climate change has reached the point of diminishing returns. Further, the intense politicization of the subject has adversely influenced the community of scientists — both by biasing the scientists and by discouraging young scientists from entering and staying in the field. So in a sense, climate science has become ‘exhausted’ by the politicization.
Governments who fund science and universities who hire scientists need to make the hard decisions regarding which fields and subfields are most worthy of investment, in terms of new breakthrough science. While I was Chair of the School of Earth and Atmospheric Sciences, it was my privilege and opportunity to hire 27 faculty members (24 as primary appointments, 3 as joint hires) over the course of 13 years. This is a rare opportunity for a department in the geosciences. When I became Chair in 2002, the School had 4 divisions – geochemistry, geophysics, atmospheric chemistry, and atmospheric dynamics. I made it a priority to bring ‘water’ into the School, and to hire faculty members that could interact with other scientists and engineers, beyond the geosciences, to stimulate new research areas. Apart from these broad objectives, I hired the best people that we could attract, with little preference for specific research areas. This approach resulted in a reconfiguration of the school to include oceanography, planetary and space sciences, biogeochemistry, and new subfields of geophysics.
I did not hire much in the areas of atmospheric dynamics or climate science (outside of oceanography and biogeochemistry), simply because the quality of the applicants was not as strong as in the other fields. While I have inferred that my provost was not pleased that I did not hire more in ‘climate science’, the outstanding young scientists that I did hire are garnering substantial external recognition and are being heavily recruited by other universities (good luck to the new Chair in retaining these outstanding faculty members). Why didn’t I hire more in atmospheric dynamics and climate science? The atmospheric dynamics faculty candidates generally were in the areas of data assimilation and mesoscale modeling — areas that are important, but arguably engineering rather than science that is going to lead to a breakthrough in understanding. In climate science, most of the applicants were using climate models, by running scenarios and inferring dire consequences — not the climate dynamics theorists that I was hoping for, that could help understand and untangle the complex physical, chemical and even biological processes influencing the climate system.
In a broader sense, which scientific subfields and topics are deemed to be important and why? There is no easy answer to this, but it is the job of university Deans and federal funding agencies to prioritize. There is an interesting example currently in the news, that comes from Georgia Tech’s David Hu, Associate Professor in Mechanical Engineering. He has written an essay Confessions of a Wasteful Scientist. Subtitle: Three of my projects appeared last week on a senator’s list of questionable research. Allow me to explain…
I would also like to respond to Polanyi’s statement: “universities provide an intimate communion for the formation of scientific opinion, free from corrupting intrusions and distractions.” I am very sad to report that this simply isn’t true of universities in the early 21st century. Heterodoxacademy.org is responding to the lack of intellectual diversity at universities. Universities are becoming very uncomfortable places for faculty members with minority perspectives on controversial topics.
As a result, many scientists with minority perspectives are leaving universities. Further, the internet has enabled many individuals outside of academia to make important contributions to climate science (published in refereed journals, in books, and in other reports). Polanyi wrote: “[T]he general public cannot participate in the intellectual milieu in which discoveries are made. For such work the scientist needs a secluded place among like-minded colleagues who keenly share his aims and sharply control his performances.” This is a perspective on scientists that is peculiar to the 20th century [see Scientist: the evolving story of a word]. Particularly in climate science, we are seeing the emergence of a substantial and influential cohort of non-academic scientists, contributing both to the published literature and to the public scientific debate. This broadening of expertise beyond university elites is leading some to question whether our traditional notions of expertise are dead [link].
So, what should the Republic of Science look like in the 21st century? The overwhelming issue for the health of science is to reassert the importance of intellectual and political diversity in science, and to respect and even nurture scientific mavericks. The tension between pure (curiosity-driven) science, use-inspired science and applied science [see Pasteur’s quadrant] needs to be resolved in a way that supports all three, with appropriate roles for universities, government and the private sector. And finally, the reward structure for university scientists needs to change to reward meaningful science that stands the test of time, rather than counting papers and press releases that may not survive even superficial scrutiny after publication in prestigious journals more interested in impact than in rigorous methods and appropriate conclusions.
Failure to give serious thought to these issues risks losing the public trust and support for elite university science (at least in certain fields). Scientists are becoming their own worst enemy when they play into the hands of politicians and others seeking to politicize their science.
May 31, 2016
Posted by aletho |
Corruption, Science and Pseudo-Science, Timeless or most popular | Georgia Tech |

http://www.theguardian.com/environment/climate-consensus-97-per-cent/2016/may/27/meteorologists-are-seeing-global-warmings-effect-on-the-weather
The Guardian have dredged up a US meteorologist, Paul Douglas, to come up with a list of “extreme” weather events, which he then claims climate change is making worse.
Whatever happened to normal weather? Earth has always experienced epic storms, debilitating drought, and biblical floods. But lately it seems the treadmill of disruptive weather has been set to fast-forward. God’s grandiose Symphony of the Seasons, the natural ebb and flow of the atmosphere, is playing out of tune, sounding more like a talent-free second grade orchestra, with shrill horns, violins screeching off-key, cymbal crashes coming in at the wrong time. Something has changed.
Let’s start by looking at some of his claims:
A warmer atmosphere is increasing water vapor levels overhead, juicing storms, fueling an increase in flash floods in the summer, and heavier winter snows along the East Coast of the USA. “All storms are 5 to 10 percent stronger in terms of heavy rainfall” explained Dr. Kevin Trenberth, at the National Center for Atmospheric Research in Boulder, Colorado. “It means what was a very rare event is now not quite so rare.”
Yet even the IPCC tell us they can find no evidence that floods are getting bigger or more frequent on a worldwide basis:

https://notalotofpeopleknowthat.wordpress.com/2015/12/08/the-ipcc-floods/
And as far as the US is concerned, the USGS say:
Only one of four large regions of the United States showed a significant relationship between carbon dioxide (CO2) in the atmosphere and the size of floods over the last 100 years. This was in the southwestern region, where floods have become smaller as CO2 has increased.
Storms? Surely any US meteorologist worth his salt must know that tornadoes have been getting much less frequent, and, more particularly, less violent since the 1970s.

http://www1.ncdc.noaa.gov/pub/data/cmb/images/tornado/clim/EF3-EF5.png
He goes on to rehash the thoroughly discredited theory of Jennifer Francis that Arctic warming is making the jet stream more sluggish and wavy, bringing weather blocks.
If he had bothered reading HH Lamb, he might have found out that the same sort of weather was occurring when the world was cooling after the Second World War. This was what Lamb had to say in his volume, “Climate, History and The Modern World”:
ANOTHER TURNING POINT
Over the years since the 1940’s, it has become apparent that many of the tendencies in world climate which marked the previous 50 to 80 years or more have either ceased or changed…. It was only after the Second World War that the benign trend of the climate towards general warming over those previous decades really came in for much scientific discussion and began to attract public notice.
VARIABILITY INCREASES
Such worldwide surveys as have been attempted seem to confirm the increase of variability of temperature and rainfall [since 1950].
In Europe, there is a curious change in the pattern of variability: from some time between 1940 and 1960 onwards, the occurrence of extreme seasons – both as regards temperature and rainfall has notably increased.
A worldwide list of the extreme seasons reported since 1960 makes impressive reading. Among the items included:
1960-9 – Driest decade in central Chile since 1770’s and 1790’s.
1962-3 Coldest winter in England since 1740.
1962-5 Driest four-year period in the eastern United States since records began in 1738.
1963-4 Driest winter in England & Wales since 1743; coldest winter over an area from the lower Volga basin and Caspian Sea to the Persian Gulf since 1745.
1965-6 Baltic Sea completely ice covered.
1968 Arctic sea ice half surrounded Iceland for the first time since 1888.
1968-73 Severest phase thus far of the prolonged drought in the Sahel, surpassing all 20thC experience.
1971-2 Coldest winter in more than 200 yrs in parts of European Russia and Turkey: River Tigris frozen over.
1972 Greatest heatwave in the long records for north Finland and northern Russia.
1973-4 Floods beyond all previous recorded experience stretching across the central Australian desert.
1974-5 Mildest winter in England since 1834.
1975-6 Great European drought produced the most severe soil moisture deficit that can be established in the London (Kew) records since 1698.
1975-6 Greatest heatwaves in the records for Denmark, Netherlands and England.
1976-7 Severest winter in the temperature records (which began in 1738) for the eastern United States.
1978-9 Severest winter and lowest temperature recorded in 200 yrs in parts of northern Europe, and perhaps in the Moscow region. Snowfalls also extreme in parts of northern Europe.
This shortened list omits most of the notable events reported in the southern hemisphere and other parts of the world where instrument records do not extend so far back. Cases affecting the intermediate seasons, the springs and autumns, have also been omitted.
These variations, perhaps more than any underlying trend to a warmer or colder climate, create difficulties for the planning age in which we live. They may be associated with the increased meridionality of the general wind circulation, the greater frequency of blocking, of stationary high and low pressure systems, giving prolonged northerly winds in one longitude and southerly winds in another longitude sector in middle latitudes.
Over both hemispheres there has been more blocking in these years… The most remarkable feature seems to be an intensification of the cyclonic activity in high latitudes near 70-90N, all around the northern polar region. And this presumably has to do with the almost equally remarkable cooling of the Arctic since the 1950’s, which has meant an increase in the thermal gradient between high and middle latitudes.
https://notalotofpeopleknowthat.wordpress.com/2014/12/02/11646/
He then goes full Guardian!
Pick up a newspaper or turn on the TV to see signs of climate volatility sparking more weather disruption. From the mega-blaze that swept across Fort McMurray, Alberta to repeated flooding of Houston, scorching heat in India, perpetual drought from California to Australia, and a record year for global hurricanes, typhoons and cyclones in 2015, the symptoms of a warming ecosystem are becoming harder to dismiss or deny.
We already know that the so-called mega-blaze in Alberta is small from a historical perspective, has nothing to do with climate change, and would have made little news if man had not built a city in the middle of a wilderness where such things happen all the time.
And what nonsense is this about drought?
There may have been a drought in California recently, one that is certainly not in any way unprecedented, but for the US as a whole, NOAA’s own figures show that droughts have been much less common, and less severe, in recent decades than in the past.

http://www.ncdc.noaa.gov/cag/time-series/us/110/0/pdsi/12/12/1895-2016?base_prd=true&firstbaseyear=1901&lastbaseyear=2000
And Australia?

http://www.bom.gov.au/cgi-bin/climate/change/timeseries.cgi?graph=rranom&area=aus&season=0112&ave_yr=7
Not only are rainfall totals consistently higher than in the past, but the percentage of land area in decile 1, the driest category, is also sharply down. This indicates that the extra rainfall has been widespread, rather than extreme in just a few areas.

http://www.bom.gov.au/cgi-bin/climate/change/timeseries.cgi?graph=raindecile01&area=aus&season=0112&ave_yr=7
And Accumulated Cyclone Energy stats do not support the contention that global warming is making hurricanes worse.

http://models.weatherbell.com/tropical.php#!prettyPhoto
Of course, weather and climate continually change. I have little doubt that in some places and at certain times extreme weather has increased, and no doubt too that in others the reverse is true.
What is sad about these pathetic little attempts to blame everything on global warming is that they stop us having a balanced and objective debate on the subject.
The real reason, however, for this story is revealed when Douglas tells us:
In my upcoming book I interview 11 veteran television meteorologists in the United States. All of them are witnessing symptoms of climate change in their hometowns.
May 28, 2016
Posted by aletho |
Deception, Mainstream Media, Warmongering, Science and Pseudo-Science | Extreme weather, The Guardian |
Aitken et al. in Nature purports to confirm 2015 fears about the instability of the Totten Glacier in East Antarctica. This could ‘suddenly’ raise sea level by as much as 4 meters! (Or, based on the abstract, maybe only 0.9 meters in the ‘modern scale configuration’, but over 2 meters [2.9-4] in unspecified other configurations.)
There are two parts to the story of Aitken et al. 2016: the authors’ comments as reported by the MSM, and what the paper actually found.
Media reports
An example from the Weather Channel:
“An Antarctic glacier three-fourths the size of Texas continues to melt into the sea, and if it disappears completely, sea levels will rise dramatically around the world, a new study says. The Totten Glacier is melting quickly in eastern Antarctica and threatens to become yet another point of concern as global temperatures rise, according to the study published in the journal Nature. It’s getting close to a “tipping point,” the study found, and if the glacier collapses, global sea levels could rise nearly 10 feet…”I predict that before the end of the century the great global cities of our planet near the sea will have two- or three-meter (6.5 to 10 feet) high sea defenses all around them,” study author Martin Siegert told the French Press Agency.” [Bolds mine]
From Science Daily, drawn from the Imperial College London press release:
Current rates of climate change could trigger instability in a major Antarctic glacier, ultimately leading to more than 2m of sea-level rise. By studying the history of Totten’s advances and retreats, researchers have discovered that if climate change continues unabated, the glacier could cross a critical threshold within the next century, entering an irreversible period of very rapid retreat. This would cause it to withdraw up to 300 kilometres inland in the following centuries and release vast quantities of water, contributing up to 2.9 metres to global sea-level rise. [Bolds mine]
Finally, the lurid title of Chris Mooney’s article in the WaPo on May 18: ‘Fundamentally unstable’: Scientists confirm their fears about East Antarctica’s biggest glacier
Paper
Most of the paper is a complex analysis of detailed gravimetric and magnetic data captured by low-altitude aircraft passes mapping an important ridge component of Totten’s subglacial geology.
It is helpful to understand the context for seeking evidence of alarming sea level rise (SLR) (see my previous CE post Sea Level Rise Tipping Points). SLR is not accelerating, so warmunists have searched for future ice sheet ‘tipping points’ that might cause the abrupt SLR supporting urgent CO2 mitigation. Greenland was the initial focus; it is not cooperating, because of its bowl-shaped geology. See my previous post for details and references.
The West Antarctic Ice Sheet (WAIS) was the next focal point. The Ronne Ice Shelf proved pinned and stable, per the Tipping Points guest post linked above. ANDRILL showed that the Ross Ice Shelf is also stable; its grounding line hasn’t shifted for about 4 millennia (ditto the Tipping Points links above). Attention then shifted to the Amundsen Embayment, where much was made in 2014 of the fast-flowing Pine Island Glacier (PIG) – until it was pointed out that PIG sits on an active volcano that has nothing to do with global warming. (There are volcanic ash layers embedded in PIG.) WAIS is not cooperating, either. So attention has now shifted to the East Antarctic Ice Sheet (EAIS), where Totten is the biggest glacier/catchment basin, almost half of the NASA-defined geological sector in the figure below (a sector which also contains the Moscow University Ice Shelf and the Frost Glacier), just ‘east’ of the Wilkes Land sector.

Where Totten enters the Southern Ocean, it is mostly grounded in shallows <500 meters deep. This does not affect its stability (as with the Ross Ice Shelf), since the first ~500 meters of Antarctic coastal seawater is basically at the freezing point. But warmer seawater below about 500 meters is melting Totten’s base at a deep trough about 5 km wide and about 800 meters deep, discovered in 2015 [link]. This melting causes a slow retreat of the grounding line behind the trough. The annual basal melting/grounding line retreat rate is presently about 100 meters/year (but as fast as 175 meters per year in some places, according to Aitken per WaPo). It is useful to note that Aitken was an author, but not lead author, on the 2015 trough discovery paper.
This deep ocean melting process could move inland for about 150 km through the Sabrina subglacial basin (deep blue in the following figure from the 2015 paper) over about 1500 years before hitting a sub-ice rock ridge, perpendicular to the glacier, only about 200 meters below sea level, which would stop the melting (since the warm melting seawater lies below ~500 meters). Aitken et al. 2016 estimate that this would raise sea level about 0.9 meters, or ~6 cm/century. No cause for alarm.

What Aitken et al. 2016 reports is another fjord-like deep ‘fault trench’ through this blocking ridge, which would (if the water temperature stratification remained undisturbed) enable basal melting to proceed through the interior Aurora subglacial basin behind the ridge. This process would continue for about another 350 km, or about 40% of the way back into the Totten catchment basin. Aitken et al. also used ice-penetrating radar to probe both the Sabrina and Aurora basin floors, confirming that Totten did in fact melt back through both basins about 3 million years ago in the Pliocene (before the onset of the current ice ages), with CO2 at about 400 ppm. That was spun into the PR alarm—it happened before at 400 ppm!!! At the current melting rates this would take about 3 millennia and could raise sea level about 2.9 meters, an unalarming 10 cm/century. This is probably still far too fast, since all the Aurora warming water would have to enter undisturbed through the newly reported narrow trench through the ridge.
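The timescale arithmetic behind these ‘unalarming’ rates is easy to check. A minimal sketch using the round figures quoted in this post (applying the ~100 m/yr retreat rate uniformly is itself an assumption):

```python
# Back-of-envelope check of the retreat timescales and implied SLR rates
# quoted above. Inputs are this post's round figures, not values taken
# directly from the paper.

RETREAT_RATE_M_PER_YR = 100  # present grounding-line retreat rate, ~100 m/yr

def retreat_years(distance_km):
    """Years to melt back a given distance at the present retreat rate."""
    return distance_km * 1000 / RETREAT_RATE_M_PER_YR

def slr_rate_cm_per_century(total_slr_m, years):
    """Implied average sea level rise rate in cm per century."""
    return total_slr_m * 100 / (years / 100)

# Sabrina stage: ~150 km back to the blocking ridge, ~0.9 m of SLR
yrs = retreat_years(150)
print(yrs, slr_rate_cm_per_century(0.9, yrs))  # 1500 years, 6 cm/century

# Aurora stage: ~2.9 m spread over the post's ~3 millennia
print(slr_rate_cm_per_century(2.9, 3000))      # ~9.7 cm/century
```

Both results reproduce the ~6 cm/century and ~10 cm/century figures above; for comparison, 20th-century SLR was roughly 17–20 cm total.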

This is NOT fundamentally unstable collapse, implying 2-3 meters SLR by the end of this century, as the authors clearly intimated in their press releases.
How to get 3 feet (0.9 meters) of SLR by melting the Sabrina basin back to the ridge? Simply assume that all the ice in the catchment basin to the ridge disappears, even ice above sea level not subject to seawater melting. To the ridge and ‘trench’, the catchment basin is about 200-250 km wide, the glacier about 100 km wide, its mouth and protruding ice shelf 145 km wide. The assumption is dubious, but not implausible. It would imply ice flow similar to that of coastal northeast Greenland glaciers today (another overhyped SLR alarm favorite), except that Totten has no such flowing glaciers today, and Antarctica never gets above freezing in summer (while most of Greenland does, briefly).
How to get 2.9 meters SLR from the red oval? Easy. Just use the same entire catchment assumption to that deeper recessional melting point.
How to get ~4 meters (WaPo)? Just assume that if the Aurora basin behind the ridge melts via trough/trench intrusion of warmer seawater, the entire catchment will then lose all its ice because it lost its Totten ‘plug’ (up catchment ice is about 2.5 km thick).
This is the same assumption Rignot made in raising PIG alarm about losing all the ice in the Amundsen Embayment catchment, even though his own paper showed that is impossible (as per my previous post at CE).
This is the same assumption made by the Greenbaum et al. 2015 trench paper cited above (on which Aitken was a co-author), upon which Aitken et al. 2016 builds. From its SI:
8. Sea Level Potential for Totten Glacier and the Aurora Subglacial Basin
We estimate the global sea level potential of ice flowing through Totten Glacier using a modified approach applied for Thwaites and Pine Island Glaciers. We find the ice volume within the Totten Glacier Catchment20, correct for the higher density of seawater, subtract the volume of seawater required to replace the submarine ice, and divide the result by the area of the world oceans (3.6E14 m2). The result, ~3.5 meters, is conservative because it implies vertical catchment boundaries whereas, in reality, ice from neighboring catchments would contribute to the total sea level contribution if the entire catchment was drained of ice.
We follow a similar procedure to compute the total potential global sea level contribution of the Aurora Subglacial Basin (ASB) using catchment 13 defined on NASA Goddard Space Flight Center’s drainage basin website21. Using that catchment we find that at least 5.1 m of global sea level potential is grounded below sea level and is therefore more susceptible to retreat. This figure assumes that all remaining ice grounded above sea level remains as it is today with unrealistic vertical cliffs. If all of the ice in the ASB were to melt, the total sea level contribution would be closer to 6.7 meters. The sea level figures here have not been corrected for isostatic rebound associated with the removal of ice loading of the crust.
[Note: the 6.7 meters assumes all the ice in this entire sector of the first figure disappears. It is easy to build scary PR from bad assumptions. Rignot blazed a false trail now relied on [SI fn 17, 18] by others.]
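The SI’s sea-level-potential bookkeeping quoted above can be sketched as follows. The ocean area (3.6E14 m²) is the SI’s own figure; the densities are standard values; the example volumes are invented purely for illustration:

```python
# Sketch of the SI's sea-level-potential calculation: convert ice volume to
# seawater equivalent, subtract the seawater that refloods the below-sea-level
# cavity, then spread the remainder over the world's oceans.
# The example volumes below are made up for illustration only.

RHO_ICE = 917.0        # kg/m^3, typical glacier ice density
RHO_SEAWATER = 1028.0  # kg/m^3
OCEAN_AREA_M2 = 3.6e14 # world ocean area, as used in the SI

def sea_level_potential_m(ice_volume_m3, submarine_cavity_m3):
    """Global sea level rise (meters) if the ice melts and the cavity it
    leaves below sea level is refilled by seawater."""
    seawater_equivalent = ice_volume_m3 * RHO_ICE / RHO_SEAWATER
    return (seawater_equivalent - submarine_cavity_m3) / OCEAN_AREA_M2

# With illustrative volumes of this order of magnitude, the result lands in
# the same few-meter range as the SI's ~3.5 m figure for the Totten catchment:
print(sea_level_potential_m(1.5e15, 1.0e14))  # ~3.4 m
```

Note that the sensitivity to the assumed catchment boundary is exactly the point at issue: the SI calls its figure “conservative” because it assumes vertical catchment walls, while the criticism above is that treating the entire catchment as lost is the dubious step.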
Conclusion
The alarming estimates from this new Nature paper, particularly as represented by the media, are grievously wrong with respect to both the amount and the rate of sea level rise that might be associated with melting of the EAIS Totten Glacier.
There is unjustified author spin in the press releases and author’s interviews. There are underlying bad assumptions never mentioned except by reference to a previously refuted [here] bad paper by Rignot. A tangled web of deceit, to paraphrase a famous poem.
May 23, 2016
Posted by aletho |
Deception, Science and Pseudo-Science, Timeless or most popular | Totten Glacier |
Leave a comment
The EPA recently posted online reports on two disputed herbicide chemicals, only to pull them offline shortly afterwards. The reports said glyphosate was not a human carcinogen and atrazine caused reproductive harm to mammals.
On April 29, the EPA’s cancer assessment review committee (CARC) posted an 86-page report on the agency’s regulations.gov website that stated glyphosate, the main ingredient in Monsanto’s Roundup weed killer that was deemed a “probable” human carcinogen by the World Health Organization last year, “was not likely to be carcinogenic to humans,” Reuters reported.
On May 2, the EPA pulled the report offline, saying the action was taken “because our assessment is not final,” and that the “preliminary” documents were “inadvertently” published.
“EPA has not completed our cancer review,” the EPA told Reuters. “We will look at the work of other governments as well as work by (the U.S. Department of Health and Human Services’) Agricultural Health Study as we move to make a decision on glyphosate.”
However, the cover page of the documents was titled “final Cancer Assessment Document,” Reuters reported, and the word “FINAL” was printed on each page of the report, dated October 1, 2015. The EPA said the assessment — part of the first comprehensive safety review of the chemical since 1993, which will determine glyphosate use in the US over the next 15 years — will be complete by the end of 2016.
Critics of glyphosate ridiculed the EPA for its short-lived assessment, while the chemical’s supporters, including agribusiness giant Monsanto, hailed the report for endorsing glyphosate’s safety. Monsanto even posted a copy of it on its website.
“Pulling the report indicates lack of confidence in the outcome,” tweeted Nathan Donley, a scientist for the Center for Biological Diversity. “Can’t blame them, the analysis is terrible.”
The glyphosate documents indicated that the EPA was “relying heavily on unpublished, industry funded studies” in its assessment that glyphosate is not a human carcinogen, the Center for Biological Diversity said. In contrast, the World Health Organization’s view that glyphosate is a “likely” human carcinogen included studies that were publicly available and that took into account consumer products.
“All they’re doing is reviewing studies that are funded by the industry,” Jennifer Sass, a senior scientist at Natural Resources Defense Council, told Reuters.
In 1974, Monsanto began selling the chemical in Roundup, which has become a top herbicide for farming, especially farming involving genetically engineered crops, as well as for home and garden use.
“No pesticide regulator in the world considers glyphosate to be a carcinogen, and this conclusion by the U.S. EPA once again reinforces this important fact,” said Hugh Grant, Monsanto’s CEO.
The use of glyphosate in herbicides has increased by more than 250 times in the United States over the last 40 years, according to the New England Journal of Medicine. Long-term exposure to glyphosate has been linked to kidney and liver damage, as well as cellular and genetic diseases. Monsanto and defenders of glyphosate use called the World Health Organization’s carcinogen classification too “dramatic” and have pointed to assurances that the chemical is safe.
In April, the European Parliament approved the seven-year reauthorization of glyphosate, though it recommended the chemical should be used only by professionals and not in public places.
Atrazine
Around the same time it pulled the glyphosate assessment off its website, the EPA similarly published and retracted a less-flattering report on the herbicide atrazine, which was banned in Europe in 2004. Atrazine is legal in the US, where it is second only to glyphosate among most-used agricultural herbicides.
Atrazine is manufactured by agrochemical corporation Syngenta. At least 60 million pounds of the chemical is used in the US each year, mainly on corn fields, according to the Natural Resources Defense Council. US agencies and other researchers have found high levels of atrazine in groundwater and drinking water near agricultural and rural areas. Atrazine is known to be an endocrine disruptor and has been linked to hormonal defects and some types of cancer in humans.
On April 29, an EPA assessment on atrazine was posted on the agency’s website but subsequently taken down. The documents are available here. The assessment said atrazine was found to cause reproductive harm to birds and mammals, exceeding by 200 times the EPA’s “levels of concern.” Amphibians were found to be especially at-risk from atrazine exposure, echoing research by scientists at the University of California, Berkeley, who found that about three-quarters of male frogs are castrated by the chemical.
“When the amount of atrazine allowed in our drinking water is high enough to turn a male tadpole into a female frog, then our regulatory system has failed us,” said Donley, the Center for Biological Diversity scientist. “We’ve reached a point with atrazine where more scientific analysis is just unnecessary — atrazine needs to be banned now.”
Like glyphosate, atrazine is undergoing a 15-year safety review by the EPA. The previous such assessment of atrazine occurred in 2003.
Syngenta, atrazine’s maker, touts the chemical’s safety on its website, claiming it is not “physically possible to dissolve enough atrazine in water to have any impact on hormones or human health.”
“No one has, ever will, or ever could be exposed to enough atrazine in the natural environment to affect their reproductive health,” the chemical giant says.
May 6, 2016
Marc Morano has a new movie, Climate Hustle.
CLIMATE HUSTLE, hosted by award-winning investigative journalist Marc Morano, reveals the history of climate scares including global cooling; debunks outrageous claims about temperatures, extreme weather, and the so-called “consensus;” exposes the increasingly shrill calls to “act immediately before it’s too late,” and in perhaps the film’s most important section, profiles key scientists who used to believe in climate alarm but have since converted to skepticism.
The movie had a red carpet premiere last December in Paris, and was shown last week in a Congressional briefing.
The film will be aired in 500 theaters in the U.S. (and one in Canada) on May 2 in a one night theater event. Locations and showtimes can be found [here].
Video clips including trailers, interviews with Morano, and other clips are found on the Climate Hustle web page [link]. A list of scientists interviewed in film is found [here].
An interesting interview with Marc Morano about the film is found [here].
Let me start by discussing my take on Marc Morano, and why I agreed to be interviewed for his movie. I first heard of Marc Morano circa 2006, from Joe Romm. Romm’s take on Morano was basically that of the climate ‘anti-Christ.’ I then put ClimateDepot on my list of blogs to monitor, to check up on what the ‘evil’ side in the climate debate was up to. I slowly built up an understanding of what Morano was doing, and I didn’t regard all of it as negative.
At some point (probably around the time of Climategate) I found myself on the same email list as Marc Morano, and we exchanged a few emails on issues of common interest. Circa 2010 (if my memory serves) I referred to Marc Morano as a ‘demagogue’ (I can’t find this anywhere on the internet). Marc was offended, we discussed this on email, and I raised my concern about his attacks on individual climate scientists that included publishing their email addresses, etc. We declared sort of a truce on this, and we agreed to point out to each other if we spotted inappropriate behaviors.
Subsequently, I’ve met Marc several times, and I have to say I like the guy. He’s smart and he’s funny (he pokes fun at both sides), and as far as I can tell he is honest. When he asked to interview me for the movie, I agreed to do it. The interview itself was really fun. I have no complaints about how I was portrayed in the movie.
I saw an earlier version of the film in November, prior to the Paris premiere. I wasn’t quite sure what to expect, but my initial reaction was relief that there were no goofy or incredible statements about the science. I found the movie to be pretty entertaining and even interesting, especially the narratives developed around silly alarmist statements made by scientists and politicians.
I thought the selection of featured scientists was quite good. It included some new faces that were quite effective – Caleb Rossiter, Robert Giegengack, Richard Tol, Daniel Botkin were especially good.
The budget for this was shoestring; I think it was less than $500K. (Somewhere I recall seeing a $20M budget for the Merchants of Doubt movie, though this may not be correct.) Financials for the Merchants of Doubt movie: $192K at the box office, with an additional $114K from home video sales. (JC note: the Merchants of Doubt movie was discussed in this previous post.) It will be interesting to see how Climate Hustle does at the box office (and in subsequent home video sales).
I’m sure people will criticize me for participating in this, but then these are the people that have pretty much already sent me to Coventry, so . . . so what.
The key issues surrounding the movie are reflected in these quotes from Randy Olson and Bill Nye:
“I also think [Morano]’s a danger to the efforts of the climate movement”
“I think it will expose your point of view as very much in the minority and very much not in our national interest and the world’s interest.”
Chip Knappenberger tweeted, re Nye’s ‘national interest’ statement: “Sounds like Nye should work for the State Department.”
Well, I will make no attempt to arbitrate what is in the national interest, but a reminder of minority rights in a constitutional democracy seems in order:
Thomas Jefferson, third President of the United States, expressed this concept of democracy in 1801 in his First Inaugural Address. He said,
All . . . will bear in mind this sacred principle, that though the will of the majority is in all cases to prevail, that will to be rightful must be reasonable; that the minority possess their equal rights, which equal law must protect and to violate would be oppression.
In every genuine democracy today, majority rule is both endorsed and limited by the supreme law of the constitution, which protects the rights of individuals. Tyranny by minority over the majority is barred, but so is tyranny of the majority against minorities.
The perspective in Climate Hustle is arguably a minority perspective, at least in terms of world governments and a select group of scientists. Randy Olson comments on this:
There is a need for opposition voices and questioning. If anyone feels threatened by this movie it would have to mean you’re conceding that the communication skills of the environmental side are really bad — which actually they are, so maybe there should be some cause for concern.
So, I hope some of you will be able to see the movie on May 2, I look forward to your reactions.
Marc Morano posing with his ‘Climate Criminal’ wanted poster in the streets of Paris

May 2, 2016
The House Energy and Commerce Subcommittee held a hearing on a bill intended to streamline nuclear power regulatory rules, in order to allow safer and more efficient next-generation reactors to replace those being decommissioned.
The Advanced Nuclear Technology Development Act of 2016 (HR 4979), introduced by Representative Bob Latta (R-Ohio), was discussed during a Friday hearing of the House Energy and Commerce Subcommittee on reducing regulatory hurdles for building advanced reactors. “Advanced” is defined as having significant improvements over contemporary nuclear reactors, such as better “inherent safety features, lower waste yields, greater fuel utilization, superior reliability, resistance to proliferation, and increased thermal efficiency.”
Currently, the Nuclear Regulatory Commission (NRC) demands a complete and final design from potential nuclear developers. This, combined with expensive reviews that developers pay for out of pocket, can deter potential startups, which face a multimillion-dollar price tag with no assurance of ever being allowed to operate. The bipartisan panel’s tenor was that this needs to change.
“The future of the nuclear industry needs to start now, and the Nuclear Regulatory Commission needs to be able to provide the certainty the private sector needs to invest in innovative technologies,” Latta said at the hearing. “As the United States looks to the future, more energy will be needed, and nuclear power provides a reliable, clean baseload power option.
“Investment in new technology is already happening, with approximately 50 companies in this country working to develop the next generation of nuclear power. It’s time to insure that the NRC provides a framework so that innovators and investors can prepare to apply to licensing technologies.”
In order to create an environment conducive to investment in next-generation plants, HR 4979 would require the NRC to implement, by 2019, a new framework to streamline nuclear plant licensing, making it more efficient and cost-effective for investors. The commission would have to submit an implementation plan for such a framework within 180 days of the enactment of the law.
The US’s 99 operational nuclear power plants provide nearly 20 percent of the country’s power, but approximately 126,000 megawatts of nuclear power generation is set to be retired over the next 15 years. At the same time, the US Energy Information Administration forecasts a need for 287,000 megawatts of new electric capacity by 2040 – on top of the capacity needed to replace the retiring plants.
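Taken at face value, the capacity figures in that paragraph imply the following build-out; this sketch only makes the arithmetic explicit, and the figures are the article’s, not independently verified:

```python
# Capacity figures as reported in the article (megawatts).
retiring_mw = 126_000    # generation set to retire over the next ~15 years
new_demand_mw = 287_000  # new capacity EIA forecasts as needed by 2040

# Total new capacity implied: replacement of retirements plus growth.
total_build_mw = retiring_mw + new_demand_mw
print(total_build_mw)  # 413000 MW, i.e. 413 GW
```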
This reality, combined with the fact that nuclear power produces no greenhouse gasses, has led to environmentally-conscious lawmakers on the committee making common cause with their innovation-minded colleagues worried about falling behind international competitors.
“Our nation will, by necessity, diminish its dependence on fossil fuels in order to fight climate change. And as we do so, we will need to turn more and more to nuclear power,” said Representative Jerry McNerney (D-California), who co-sponsored the bill.
The hearing comes at a time of renewed anxiety over aging nuclear infrastructure. Earlier this month, a Manhattan Project-era nuclear storage facility in Washington state leaked up to 3,500 gallons of waste. However, the Washington Department of Ecology said there was no risk to the environment or nearby residents.
April 30, 2016
The world has had 30 years to assess the consequences for life on Earth of the disaster at Chernobyl.
This is about the same period during which I have studied the effects of radioactive pollution on the planet. It was the radioactive rain in the mountains of North Wales, where I lived in 1986, that brought me into this strange Alice in Wonderland area of science, where people and children die, and the global authorities, advised by physicists, deny what would be obvious to a child at school.
Chernobyl was mentioned as the star that fell to earth in the Book of Revelation. You may laugh, and it may be a coincidence, but the impact of the event has certainly been of biblical proportions. It is a story about the imposition by reductionist science on humanity of a version of the truth constructed from mathematics, not the only one, but perhaps the most important, since it involves the systematic destruction of the genetic basis of life. It is a story of lies, secrecy, power, assassination and money: the vast amounts of money that would be lost if the truth came out.
Shortly after the murder in 1992 of the German Green Party leader and anti-nuclear activist Petra Kelly, the late Prof Ernest Sternglass (the first of the radiation scientist/activists) told me that Kelly had just struck a deal with a German TV company to run a series demonstrating the true awfulness of the immediate effects of radiation. He said: if the truth came out, all the Uranium, and the billions of dollars in Uranium shares, would turn into sand. So something like a cover-up had to happen, and it did, continuing the process of chicanery and control of information that began with the nuclear weapons tests of the 50s and 60s. In 1959, as the genetic effects of the atmospheric tests became apparent, control of the understanding of radiation and health was wrested from the World Health Organization (WHO) and passed to the International Atomic Energy Agency (IAEA).
Since then, no research on the health effects of radiation has been carried out by WHO, which has led to a permanent vigil outside their headquarters in Geneva by the group Independent WHO.
The arguments about the health effects of Chernobyl have mostly centered on cancer. I won’t write much about cancer here. The study of radiation and cancer has many complications, including that the data are often suspect and that the time lag between the cancer diagnosis and the original radiation exposure can be 20 years, in which time a lot can happen, providing ammunition (and opportunity) for those denying causation. Predictions of the global cancer yield of the Chernobyl contamination have ranged from around a million (as predicted independently by the European Committee on Radiation Risk (ECRR), Rosalie Bertell, John Gofman and me), to about 600,000 (Alexey Yablokov), to fewer than a few thousand (the International Commission on Radiological Protection (ICRP), whose risk model is the current basis for all legal constraints on radioactive releases in Europe).
Cancer is caused by genetic damage but takes a while to show. More easily studied is the immediate and direct genetic damage, demonstrated in birth rates of congenital diseases, birth defects, fetal abnormalities, data which is easier to locate. The effects of a sudden increase in radioactive contamination are most easily seen in sudden increases in these indicators. You don’t have to wait 20 years. Out they come after nine months or in aborted fetuses with their heart and central nervous system defects, their lack of hands and feet, their huge hydrocephalic heads, their inside-out organs, their cleft palates, cyclops eyes and the whole range of dreadful and usually fatal conditions. There is no argument, and the affair is in the hands of doctors, not physicists. The physicists of the ICRP base their risk of genetic effects on experiments with mice.
I was in Kiev in 2000 at the WHO conference on Chernobyl. On the podium, conducting the theatricals, were the top men in the IAEA (Abel Gonzalez) and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), represented by Canadian Norman Gentner. No effects can be seen—Abel Gonzalez. Internal radiation is the same as external—Norman Gentner. Happily, you can watch this farce, as it was videotaped by a Swiss team.
So: cut to the chase, to the fatal assault on the edifice of the current ICRP radiation risk model. In January 2016 Prof Inge Schmitz Feuerhake, Dr Sebastian Pflugbeil and I published a major review paper on the genetic effects of radiation in the prestigious Korean peer-reviewed Journal of Environmental Health and Toxicology.
What the research shows is that in every corner of the ex-Soviet Union and Europe and even further afield where epidemiologists and pediatricians looked, there were large and statistically significant increases in congenital diseases at birth and in babies that were aborted.
The new article recalculates the genetic risk from radiation based upon reports from Germany, Turkey, Greece, Croatia, Egypt, Belarus, Ukraine, Russia, Hungary, Italy, the UK, Scotland, Wales, indeed everywhere where anyone looked. There was a sudden jump in birth defects immediately following the contamination from Chernobyl and in proportion; but only up to the point where the exposure was so great the babies died in the womb or miscarried early in pregnancy. Thus, the relation between exposure level and effect was not a simple one where the birth defects increased with exposure: after a critical level of exposure they leveled off, or indeed fell. Also since contamination is still there, women are still giving birth to genetically damaged children some 30 years later. These results, published by many doctors, epidemiologists and researchers in many different journals, show that the effects occurred at levels of contamination that provided ‘doses’, that yardstick of radiation exposure invented by the ICRP, that were very low, often below the natural background dose.
It is worse: from research on the nuclear test site veterans’ grandchildren (also reviewed in the study) it is clear that these effects continue down the generations and will only disappear when an offspring dies without issue and leaves the genome of the human race. And many will, or already have: what causes genetic malformation in the infant causes, at a larger dose, fetal death and infertility. No one can have failed to notice the increase in human infertility that has occurred since the radioactive contamination of the planet began in the 1950s. As ex-US Atomic Energy Commission scientist John Gofman wrote in 1981, “the nuclear industry is waging a war on humanity.”
How can it be possible that the legislative system has got it so wrong? The answer is also given in the paper. It is that the concept of ‘dose’ which may be convenient for the physicists as it is simple to compute, really does not address the situation where the substances that provide the dose are inside the body, often bound chemically to the DNA, which is the acknowledged target for all these genetic effects. It shows that the human genome (and of course that of all life) is exquisitely sensitive to radiation damage from such internal exposures, to Strontium-90, Plutonium-239, Uranium and particularly to the nano-particles containing these radioactive elements which were produced when the reactor No 4 blew apart.
The paper shows that the studies of the Hiroshima bomb survivors, upon which the current unsafe radiation laws are based, were faulty, because the true comparison group, those not in the city at the time of the bombing, was abandoned when it began to look like there was a real effect. Was this stupidity? Was it a trick? Does someone have to go to jail?
Last month, Prof. Alexey Yablokov, Dr. Alex Rosen and I wrote to the editor of The Lancet, in a recorded delivery letter posted by the Independent WHO in Geneva, requesting space in that influential journal to draw attention to these truths and overturn the false and dangerous structures created by the physicists. Let us all hope that some good will finally come of the disaster—that the real legacy of Chernobyl will be the understanding of the true danger to health of radioactive pollution.
Note: The ECRR has focused on Chernobyl as a major data source for establishing the risk posed by radiation. It has concluded that the current ICRP model is in error by upwards of about 300-fold and, for some types of internal exposures, by upwards of 1000-fold. This means that over the period of the radiation contamination, more than 60 million people have died from cancer as a result of the releases. The risk model is available on the website http://www.euradcom.org.
Christopher Busby is an expert on the health effects of ionizing radiation. He qualified in Chemical Physics at the Universities of London and Kent, and worked on the molecular physical chemistry of living cells for the Wellcome Foundation. Professor Busby is the Scientific Secretary of the European Committee on Radiation Risk based in Brussels and has edited many of its publications since its founding in 1998. He has held a number of honorary University positions, including Visiting Professor in the Faculty of Health of the University of Ulster. Busby currently lives in Riga, Latvia. See also: http://www.chrisbusbyexposed.org, http://www.greenaudit.org and http://www.llrc.org.
April 26, 2016
New York Attorney General Eric T. Schneiderman has accused ExxonMobil of lying to the public and investors about the risks of climate change according to the NY Times and has launched an investigation and issued a subpoena demanding extensive financial records, emails and other documents.
Massachusetts, the US Virgin Islands, and California are also investigating ExxonMobil. It is interesting that all but one of the attorneys general are Democrats. The remaining attorney general is Claude Walker of the US Virgin Islands who is a Green leaning Independent. So, this is a very partisan investigation, carefully coordinated with anti-fossil fuel activists. How much is there to it?
I’ve reviewed the 22 internal documents from 1977 to 1989 made available by ExxonMobil here. I’ve also reviewed what I could find on 104 publications (most peer-reviewed) with ExxonMobil personnel as authors or co-authors. For some of the peer-reviewed articles I had only an abstract, and for some I could find the reference but no abstract or text without paying a fee. Below this short essay is an annotated bibliography of all 22 internal documents and 89 of the published papers. The documents are interesting reading; they fill in the history of modern climate science very well. Many of the points in the current climate debate were being argued in the same way, and often with the same uncertainties, in 1977.
Between 1977 and the fifth IPCC report in 2013 ExxonMobil Corporate Research in New Jersey investigated the effect of increasing CO2 on climate. If they withheld or suppressed climate research from the public or shareholders, it is not apparent in these documents. Further, if they found any definitive evidence of an impending man-made climate catastrophe, I didn’t see it. The climate researchers at ExxonMobil participated in the second, third, fourth and fifth IPCC assessment reports making major contributions in mapping the carbon cycle and in climate modeling. They calculated the potential impact of man-made CO2 in several publications. They investigated methods of sequestering CO2 and adapting to climate change. They also investigated several potential biofuels.
The internal documents are generally summaries of published work by outside researchers. Some of the documents are notes from climate conferences or meetings with the DOE (Department of Energy). For many of the internal documents one has to read carefully to separate what is being said by the writer and what he is reporting from outside research. Exxon (and later ExxonMobil) did some original research, particularly making ocean and atmospheric measurements of CO2 from their tankers. But, most of what they produced was by funding research at Columbia University or the Lamont-Doherty Earth Observatory. All of their internal research and the work at Columbia was published as far as I can tell, so it is difficult to accuse them of hiding anything from the public or shareholders.
At the heart of Schneiderman’s accusation, according to the NY Times, is a list of statements made by ExxonMobil executives that he believes contradict the internal memos summarized below. The statements are reported here. In fact, the internal memos and documents listed below do not contradict the ExxonMobil executives in any way. The internal documents and publications all clearly describe the considerable uncertainties in climate science and align with the executives’ statements. Go to the link to see all of them; two of the most notable are quoted below:
Mr. Ken Cohen, ExxonMobil Vice President for Public and Government Affairs, 2015 (Blog Post):
“What we have understood from the outset – and something which over-the-top activists fail to acknowledge — is that climate change is an enormously complicated subject.
“The climate and mankind’s connection to it are among the most complex topics scientists have ever studied, with a seemingly endless number of variables to consider over an incredibly long timespan.”
Duane Levine, Exxon’s manager of Science and Strategy Development, 1989 (Internal Document #21 below):
“In spite of the rush by some participants in the greenhouse debate to declare that the science has demonstrated the existence of [man-made global warming] today, I do not believe such is the case. Enhanced greenhouse is still deeply imbedded in scientific uncertainty, and we will require substantial additional investigation to determine the degree to which its effects might be experienced in the future.”
Even if there were a contradiction between the executives and the ExxonMobil climate researchers, who is to say which of them is wrong? Free speech is a fundamental individual right in the USA and executives are allowed to disagree with their employees. As University of Tennessee Law Professor Glenn Harlan Reynolds has said in USA Today :
Federal law makes it a felony “for two or more persons to agree together to injure, threaten, or intimidate a person in any state, territory or district in the free exercise or enjoyment of any right or privilege secured to him/her by the Constitution or the laws of the United States, (or because of his/her having exercised the same).”
“I wonder if U.S. Virgin Islands Attorney General Claude Walker, or California Attorney General Kamala Harris, or New York Attorney General Eric Schneiderman have read this federal statute. Because what they’re doing looks like a concerted scheme to restrict the First Amendment free speech rights of people they don’t agree with. They should look up 18 U.S.C. Sec. 241.”
ExxonMobil has filed court papers in Texas seeking to block a subpoena issued by the attorney general of the US Virgin Islands Claude Walker. They argue that the subpoena is an unwarranted fishing expedition into ExxonMobil’s internal records.
Environmentalist groups, like the Rockefeller Family Fund and 350.org are trying to organize a legal attack against ExxonMobil patterned on the attack many organizations led against the tobacco companies. They feel that their presumed imminent man-made climate disaster is being ignored and they want to make ExxonMobil a scapegoat. As Lee Wasserman (Rockefeller Family Fund) said recently “It’s not really about Exxon.”
Mr. Schneiderman may have made the “error of assuming facts that are not in evidence.” He assumes that man-made greenhouse gases are a significant factor in climate change and that the resulting enhanced climate change is dangerous. Neither assertion has been proven. He also assumes that Exxon’s early research proved these assertions to be true, with little or no doubt. Therefore, Mr. Schneiderman believes the Exxon executives’ claim that there is significant uncertainty around the idea of dangerous man-made climate change is a lie. I do not see any proof of dangerous climate change, man-made or otherwise, in any of the documents below. In peer-reviewed document #55 below, Flannery et al. (1985) suggest that the effect of CO2 on climate, based on geological data from the Cretaceous Period, is 50% or less. Internal document #3 indicates concern that there is a “potential problem amid all the scientific uncertainties.”
Along this line of thought, the ExxonMobil court filing against Mr. Walker and the US Virgin Islands says in part:
“… [ExxonMobil] has “widely and publicly confirmed” that it recognizes “that the risk of climate change and its potential impacts on society and ecosystems may prove to be significant.”
Brian Flannery states in published document #66 below in 2001:
“Although we know the human emissions fairly well, we don’t know the natural emissions well at all. Added to this uncertainty is the fact that natural emissions can change as a result of long-term climate changes.”
The key problem is that ExxonMobil management and most, if not all, of their researchers do not think the idea of dangerous man-made climate change has been proven. Further, one of them said in internal document #3 below: “we have time to evaluate the uncertainties even in a worse-case scenario.” This is still true, especially considering the very slow pace of warming over the last twenty years.
In internal document #3 below, they discuss the potential effect of doubling CO2 in the atmosphere, and the discussion is instructive. The CO2 level prior to the industrial revolution (roughly 1840-1850) is unknown; they give two possibilities (260-270 ppm or 290-300 ppm). The temperature increase from 1850 to the end of 2015 is roughly 0.85°C in the HADCRUT4 dataset, and the IPCC’s Fifth Assessment Report gives 0.85°C from 1880 to 2012. The Exxon researchers did not think a clear anthropogenic signal was detectable in 1979, because at that time the total temperature increase from 1850 had not exceeded 0.5°C, their assumed natural variability. So they thought man-made warming might be clearly detected by the year 2000.
We are now well past the year 2000, and according to the data shown in their Table 6 (internal document #3), we are on track with their most benign scenario: a temperature increase of 1.3° to 1.7°C per doubling of CO2 (the equilibrium climate sensitivity, or ECS). This assumes an initial CO2 concentration of 265 to 295 ppm and a natural variability of ±0.5°C. The initial CO2 concentration assumption is reasonable, though the assumption of ±0.5°C for natural variability may be too low. However, if the assumptions are true, they probably eliminate the possibility of higher climate sensitivity to CO2 (ECS > 2°C). This is also supported by recent empirical estimates of ECS. There are considerable uncertainties in this approach, and they are important to recognize: we don’t know the CO2 level when we started emitting a lot of fossil fuel CO2, we don’t know the net effect on our climate, and we can’t be certain we have seen any impact of man-made CO2 on our climate to date.
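The implied sensitivity can be sketched with the standard logarithmic forcing approximation, ΔT ≈ ECS × log2(C/C0). The figures below are the ones quoted above (roughly 0.85°C of warming since 1850, an initial concentration of 265-295 ppm, and about 400 ppm in 2015); this is only an illustration of the arithmetic, not the Exxon researchers’ own calculation, and it ignores natural variability by attributing all of the warming to CO2, so it behaves as an upper bound under these assumptions:

```python
import math

def implied_ecs(delta_t, c_now, c_initial):
    """Back-of-envelope equilibrium climate sensitivity implied by
    observed warming, using the standard logarithmic CO2 forcing
    approximation: delta_T = ECS * log2(c_now / c_initial)."""
    doublings = math.log2(c_now / c_initial)
    return delta_t / doublings

# Figures quoted in the text: ~0.85 C of warming since 1850 (HADCRUT4),
# an initial CO2 concentration of 265-295 ppm, and ~400 ppm in 2015.
for c0 in (265, 280, 295):
    ecs = implied_ecs(0.85, 400.0, c0)
    print(f"C0 = {c0} ppm -> implied ECS of about {ecs:.2f} C per doubling")
```

With a mid-range initial concentration of 280 ppm this comes out near 1.65°C per doubling, squarely inside the benign 1.3-1.7°C scenario; the higher initial-concentration assumption pushes the implied value somewhat above it.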
Even Brian Flannery, one of the Exxon researchers who has been deeply involved in the IPCC process, stated in internal document #22 below: “While uncertainty exists, science supports the basic idea that man’s actions pose a serious potential threat to climate.” This is the most alarmist statement I could find anywhere, but it still says “potential” and notes that uncertainty exists.
In peer-reviewed paper #25 below, Dr. Kheshgi and Dr. White state in 2001:
“Many previous claims that anthropogenically caused climate change has been detected have utilized models in which uncertainties in the values of some parameters have been neglected (Santer et al. 1996b). In section 5 we have incorporated known parameter uncertainties for an illustrative example by using the proposed methodology for distributed parameter hypothesis testing. The results clearly show that incorporation of parameter uncertainty can greatly affect the conclusions of a statistical study. In particular, inclusion of uncertainty in aerosols forcing would likely lead to rejection of the hypothesis of anthropogenically caused climate change for our illustrative model …”
They are concerned, here and in other papers, that the GCMs (general circulation models) have used fixed values for parameters that actually carry a great deal of uncertainty. By fixing these parameters across many models, the modelers produce a narrower range of outcomes, giving a misleading appearance of consistency and accuracy that does not actually exist.
As Professor Judith Curry has often said, there is an uncertainty monster at the science-policy interface. The ExxonMobil scientists are very good: they write well, and their superiors at ExxonMobil understand what they are saying. Man-made climate change is a potential problem, but it is shrouded in uncertainty because it is an extremely complex research topic with countless variables. The internal and published documents below show that Exxon has worked hard to define the uncertainty, and they have even succeeded in reducing it in some areas, especially in the carbon cycle. But the remaining uncertainty is still huge; to this day it covers the range from zero anthropogenic effect to perhaps 4° or 5°C (see publication #7, Kheshgi and White 1993). That is not much different from when they started in 1977.
I’ll conclude this post with a quote from internal document #11, the 1982 Exxon Consensus statement. I think it speaks well for ExxonMobil and puts Schneiderman (and many in the media) to shame:
“As we discussed in the August 24 meeting, there is the potential for our research to attract the attention of the popular news media because of the connection between Exxon’s major business and the role of fossil fuel combustion in contributing to the increase of atmospheric CO2. Despite the fact that our results are in accord with most major researchers in the field and are subject to the same uncertainties, it was recognized that it is possible for these results to be distorted or blown out of proportion.
Nevertheless the consensus position was that Exxon should continue to conduct scientific research in this area because of its potential importance in affecting future energy scenarios and to provide Exxon with the credentials required to speak with authority in this area. Furthermore our ethical responsibility is to permit the publication of our research in the scientific literature; indeed to do otherwise would be a breach of Exxon’s public position and ethical credo on honesty and integrity.”
This is the only thing I found in the internal memos that was not published. In 1982 they thought the media might distort their research results or blow them out of proportion (the Uncertainty Monster). Well, that certainly happened. For science to work properly, research outcomes cannot be dictated. All interested parties must be allowed to investigate the problem and publish their results. They must have access to data, computer programs and models that are publicly funded. But, above all, they should not be punished, jailed, intimidated or sued because they are skeptical of a popular scientific thesis. They should be judged only on the quality of their scientific work and not who they work for or who funds them.
This post is excerpted from a longer post, The Exxon Climate Papers, which includes links and annotations to 89 documents, including internal documents and published papers.
Bio notes: Andy May worked for Exxon from 1980 to 1985. During part of that time he worked on the Natuna D-Alpha project discussed in some of these documents. He did not work at either the Florham Park, New Jersey Research laboratory or the Linden, New Jersey laboratory where the climate research was done. The views expressed in this essay and bibliography are his own. This was written in his spare time and he received no compensation from anyone for writing and posting it.
April 20, 2016
Posted by aletho |
Science and Pseudo-Science, Timeless or most popular | 350.org, Exxon Climate Papers, Lee Wasserman, Rockefeller Family Fund, United States |
WASHINGTON – The US government has sent Special Envoy Amos Hochstein to Kuwait, Qatar, Egypt and Israel to discuss falling oil prices after the failure of the Doha energy talks, the US Department of State announced in a media note on Monday.
“Special Envoy and Coordinator for International Energy Affairs Amos J. Hochstein will be travelling to the region to meet with key interlocutors in Jerusalem, Cairo, Kuwait City and Doha,” the note stated.
As global oil prices remain near record lows, and the United States emerges as a global exporter of liquefied natural gas, Hochstein will be seeking to strengthen US relationships with partners in the region as well as discuss strategies for addressing the market realities of the energy sector, the note explained.
Hochstein will discuss energy security issues in Israel, power generation issues in Egypt, and plans to invest in developing new oil fields and build additional oil refineries in Kuwait, the State Department pointed out.
In Qatar, Hochstein will give a speech emphasizing US support for liquefied natural gas development and its role in reducing global carbon emissions, the note said.
April 19, 2016
Posted by aletho |
Economics, Science and Pseudo-Science | Egypt, Israel, LNG, United States |
The last few years have seen an alarming increase in claims that tribal peoples have been shown to be more violent than we are. This is supposed to prove that our ancestors were also brutal savages. Such a message has profound implications for how we view human nature – whether or not we see war as innate to the human condition and so, by extension, broadly unavoidable. It also underpins how industrialized society treats those it sees as “backward.” In reality though it’s nothing more than an old colonialist belief, masquerading once again as “science.” There’s no evidence to support it.
The American anthropologist Napoleon Chagnon is invariably cited in support of this brutal savage myth. He studied the Yanomami Indians of Amazonia from the 1960s onwards (he spells the tribe “Yanomamö”), and you’d be hard pressed to find a book or article on tribal violence that doesn’t refer to his work. Popular writers such as Steven Pinker and Jared Diamond frequently make much of Chagnon’s thesis, so it’s worth giving a thumbnail sketch of why in reality it proves little about the Yanomami, and nothing about human evolution.
First, it’s important to dispatch a red herring from the murky cauldron being cooked up by the brutal savage promoters: They often point to Darkness in El Dorado, a book by Patrick Tierney, which attacked Chagnon’s work but went too far. Tierney raised the possibility that one of Chagnon’s colleagues may have deliberately introduced a deadly measles epidemic to the Indians. That simply wasn’t true: In fact, the epidemic was inadvertently started by American missionaries. That Tierney was wrong on this single point is now used to claim that all his and other writers’ criticisms of Chagnon have been discredited. They haven’t. In any case, were a single error deemed to negate a whole thesis, then pretty much all science, as well as journalism, the law and a lot else, would fall apart.
Anyway, let’s set Tierney aside. For decades, Napoleon Chagnon’s findings have been rejected by almost all of the many other anthropologists who have worked with the Yanomami, and in most countries his work simply isn’t taught. He had rather faded from anthropology in the United States too, until his recent resurgence as the darling of establishment attitudes.
According to Chagnon, brutality is a key driver of human evolution. How did he come upon such a disturbing “discovery”? Basically, he counted how many Yanomami men boasted that they were unokai, which he told us means they’ve killed people. He then crunched the numbers to show that unokai are as successful in love as they are in war, and that by fathering more children than non-killers, they ensure the next generation is as murderous as they are.
As with any sweeping conclusion in human sciences, there are numerous known unknowns. For example, did Yanomami raiding in the 1960s increase through growing pressure from settler or missionary incursions? (After all, Chagnon used the extremist New Tribes Mission to get into the Yanomami.) Did the influx of outside trade goods, including guns, play a role? Such impacts are difficult to analyze, though some believe they were clearly significant.
But the most significant fact, the extraordinary single error that, in this case, does destroy Chagnon’s thesis in one swoop, is something Chagnon doesn’t tell us – unokai does not just mean “killer.” It’s also the status claimed by everyone who’s ever shot an arrow into a dead body during an inter-village raid (most raids stop after one killing). It describes many other individuals as well, including men who’ve killed an animal thought to be a kind of shamanic embodiment of a human, as well as stay-at-homes who try to cast lethal spells. It even includes those who’ve participated in a ritual during their future wife’s puberty (she also becomes unokai). In other words, many unokai haven’t killed anyone. With this simple fact, every one of Chagnon’s conclusions about “killers” falls apart.
But supposing he was right after all, what would his figures show? What percentage of the population are we talking about? Here the brew gets fishier: Chagnon plays fast and loose with his own data. His autobiography, “Noble Savages,” says that “killers” number “approximately 45 percent of all the living adult males.” Yet even his own (shaky) figures do not show that 45 percent of men are unokai. He grossly inflated his percentage by ignoring everyone younger than 25, an age group with far fewer men claiming unokai status. Were they included, his percentage would plummet.
Chagnon has been asked about this manipulation for years. When he bothers to reply, he claims he’ll publish new supporting data. We’re still waiting.
So there you have it: That’s the poster boy of the “scientific proof” behind the myth of the brutal savage. The fact that Chagnon’s thesis has been repeatedly demolished in scholarly publications for decades is simply ignored by those who want him to be right. For them to dismiss the many Chagnon critics, to pretend that science is on their side, and to chorus sneeringly “noble savages” whenever Chagnon is criticized, is just facile propaganda.
By the way, if you want to know how many unokai (supposed “killers”) Chagnon managed to winkle out during a quarter century of fieldwork with one of Amazonia’s largest tribes – numbering several thousand – the answer is just 137 men. They could all comfortably fit into a single car on the New York subway. How many of those were actually killers? We’ll never know.
That’s the size of the sample group supposedly proving that tribal peoples live in a state of chronic warfare and, by throwing in more red herrings, that our ancestors did so too. The latter assertion is widely promulgated. It goes like this: The Yanomami are a small-scale tribal (non-state) hunting society; our ancestors were the same; so the Yanomami can teach us about our ancestors because they live in a similar way. And yet the theory fails on several points: For example, no one knows the degree to which our distant ancestors scavenged for meat rather than actively hunted it. That’s quite a different approach to life, and the Yanomami wouldn’t dream of doing it. In any case, a moment’s informed reflection tells you that no one who inhabited the ice age plains of Eurasia, for example, lived remotely like the tropical rainforest Yanomami of Chagnon’s 1960s.
The real story is more obvious, prosaic and simpler than the Chagnon-created “fierce people” and their supposed “chronic” warfare. The truth is that there are some tribal peoples who have a belligerent reputation, others known for avoiding violence as much as possible, and lots in between. That’s nothing to do with any grasping at mythic noble savages, it’s what anthropologists have actually found.
Despite the growing mythology, the archeological record reveals very little evidence of past violence either (until the growth of big settlements, starting around 10,000 years ago). Researchers Jonathan Haas and Matthew Piscitelli studied descriptions of 2,930 earlier skeletons from 900 different sites worldwide.[1] Apart from a single massacre site of two dozen people in the Sudan, they found “but a tiny number of cases of violence in skeletal remains,” and noted how just four sites in Europe “are mentioned over and over by multiple authors” striving to demonstrate the opposite of what the evidence actually reveals. The archeological record before 10,000 years ago, they conclude, in fact “shows that warfare was the rare exception.”
Much of the other “proof” for the brutal savage advanced by Steven Pinker, Jared Diamond, and other champions of Chagnon, is rife with the selection and manipulation of facts to fit a desired conclusion.
To call this “science” is both laughable and dangerous. These men are desperate to persuade us that they’ve got “proof” for their opinions, which isn’t surprising, since that is all they are: opinions based on a narrow and essentially self-serving political point of view. They have proved nothing, except to those who want to believe them.
Does it matter? Yes, very much. How we think of tribal peoples dictates how we treat them. Proponents of Chagnon seek to reestablish the myth of the brutal savage which once underpinned colonialism and its land theft. It’s an essentially racist fiction which belongs in the 19th century and, like a flat earth, should have been discarded generations ago. It’s the myth at the heart of the destruction of tribal peoples and it must be challenged.
It’s not just deadly for tribal peoples: It’s dangerous for all of us. False claims that killing is a proven key factor in our evolution are used to justify, even ennoble, the savagery inherent in today’s world. The brutal savage may be a largely invented creature among tribal peoples, but he is certainly dangerously and visibly real much closer to home.
Notes.
[1] See also, http://blogs.scientificamerican.com/cross-check/10-000-year-old-massacre-does-not-bolster-claim-that-war-is-innate/
April 9, 2016
Posted by aletho |
Deception, Militarism, Science and Pseudo-Science, Timeless or most popular | Napoleon Chagnon |