David Attenborough was my favorite wildlife cinematographer and each year I fed my students numerous clips to make biology and ecology come alive. Researching the plight of the polar bears, I began to worry that “my hero” had decided to use his spectacular wildlife videos to promote catastrophic climate change.
The first example that raised my suspicions was his portrayal of polar bears feeding on walruses, narrated to suggest the behavior was new and desperately driven by climate change. But ecologists know better: shame on you, David Attenborough. He ignored documented wildlife history and cherry-picked a dramatic scene to promote climate fear.
First, view this older BBC video pitting polar bears against walruses. Notice how many bears converge on the walrus herd, and that they come from the land. Then view Attenborough’s “new and improved” video, which puts a very misleading slant on polar bears and walruses.
If you want to read historical facts about walruses and polar bears, I suggest Francis H. Fay’s 1982 “Ecology and Biology of the Pacific Walrus, Odobenus rosmarus divergens Illiger.” In the 1950s, Fay worried that the walrus was headed for extinction due to over-hunting for ivory and blubber, so he set out to document everything there was to know about walruses.
In his tome, Fay published early-1900s observations by Russian researchers who admired the polar bears’ varied and clever tactics for hunting walrus.
“The walruses on Peschan Island are frequently bothered by bears, which creep up to them under cover of uneven terrain and of driftwood, of which there usually is an abundance, along the shore. Sometimes the bears dig pits in the sand or make a pile in front of themselves in order to hide from the walruses. We saw a bear in a pit dug in the driftwood within 50 m of the herd, where it watched for a long time. Suddenly, it leaped from its concealment and plunged along the flat terrain toward the walruses. The animals, upon seeing the running bear, rushed into the water, and when the bear reached those on shore, only a few large males remained, and these gradually pivoted into the water, threatening with roars and swinging tusks. The bear in his misfortune was unable to decide whether or not to enter the water and only brandished his paws helplessly and growled in discontent. Not infrequently, in the confusion, the adult walruses crush some young; possibly, at the time of the attack, the bears hope to profit from such accidentally crushed or abandoned young.”
Anyone familiar with the scientific literature knows polar bears have hunted walruses throughout recorded history, and most certainly before it. More recently, researchers reporting to the Polar Bear Specialist Group speculated that hunting walruses on land is likely a behavior that allowed bears to survive the scarcity of sea ice that was far more common throughout the Holocene Optimum.
For example, Wrangel Island is both home to one of the largest known polar bear denning areas in the Arctic and the location of several traditional walrus land haul-outs each summer. Because walruses often get trampled at these haul-outs, bears eagerly supplement their diet by feeding on the trampled carcasses. In addition, polar bears will wait at the haul-outs, anticipating the summer wave of walrus herds that typically come ashore, and then dine on weak or young walruses. Seasoned bears know to avoid a healthy bull.
In 2007 the second-greatest decrease in Arctic sea ice was observed in the waters surrounding Wrangel Island. That summer researchers observed the greatest number of polar bears on the island. However, contrary to the less-ice-means-starving-bears theory, there were no signs of increased nutritional stress. Quite the opposite.
Anticipating the seasonal haul-out of walruses, the bears concentrated along the beaches, where researchers could easily observe them. Fewer than 5% of the Wrangel Island bears were rated skinny or very skinny. That compared very favorably to the 7 to 15% of skinny bears observed in previous years with heavier ice. Furthermore, researchers determined that 29% of the bears looked “normal” and fully 66% were fat or very fat. Those polar bear experts wrote, “Under certain circumstances, such as were observed on Wrangel Island in 2007, (Ovsyanikov and Menyushina 2008, Ovsyanikov et al., 2008), resources available in coastal ecosystems may be so abundant that polar bears are able to feed on them more successfully than while hunting on the sea ice.”
With that scientific background, view Attenborough’s rendition and ask yourself whether he is objectively narrating the video. He ignores the bears’ and walruses’ natural history to suggest polar bears have only recently attacked walruses out of desperation. Attenborough suggests the lone bear had been desperately swimming for days trying to reach the island. However, without a radio-collar on the bear, one must wonder whether Attenborough is using creative license. And why was Attenborough “serendipitously” set up in this location to film this event? Is it a traditional walrus hunting spot, rather than the rare event his video suggests?
Researchers have documented younger bears, not yet skilled at hunting walrus, being injured in the attempt; but that is a matter of a young bear’s evolving experience. Attenborough marries an uncommon hunting failure to climate change. Playing sad music, he suggests that bears attack walruses only as an unnatural last resort; in essence, a climate-change-driven act that is suicidal and doomed to increase.
To my increasing dismay, my former wildlife hero seems to be plunging more deeply into climate propaganda. Attenborough has a new series on Discovery called Africa, but it might as well be called “Let’s Push Climate Fear”.
Take for instance his video segment, shown below, on Green Turtles. He accurately tells us that, unlike humans who determine gender via the X and Y chromosomes, Green Turtles (as well as several other reptiles) determine the next generation’s gender based on the temperature of the developing eggs. Researchers realized this while trying to save endangered sea turtles from predation: they dug up the eggs to “safely” incubate them. Fearing that eggs buried at the bottom of the pile had not benefited equally from the sun’s warmth, they laid the eggs out evenly on trays so all could incubate at the same temperature. The result was unisex baby turtles.
However, turtles have been around since the dinosaurs, and their temperature-gender system has succeeded through monumental periods of climate change, massive extinctions, and epochs far warmer than today. Attenborough should tell his audience that micro-climates are far more critical to nesting success, and that temperature drops off dramatically with depth in the sand. Instead he warns that, due to global warming, female turtles will soon have great difficulty finding a male. Shameful propaganda, Sir David!
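The temperature-gender system described above is usually modeled with a "pivotal" incubation temperature at which the hatchling sex ratio is 50:50. The sketch below is a toy logistic model, not a measured relationship: the ~29°C pivotal value is a commonly cited figure for green turtles, and the steepness parameter is purely illustrative.

```python
import math

# Toy logistic model of temperature-dependent sex determination (TSD).
# PIVOTAL_C (~29 C) is a commonly cited figure for green turtles;
# K, the transition steepness, is an illustrative assumption.

PIVOTAL_C = 29.0   # assumed pivotal incubation temperature (deg C)
K = 2.5            # assumed steepness of the male/female transition

def fraction_female(temp_c: float) -> float:
    """Expected fraction of female hatchlings at a given incubation temperature."""
    return 1.0 / (1.0 + math.exp(-K * (temp_c - PIVOTAL_C)))

for t in (26.0, 29.0, 32.0):
    print(f"{t:.0f} C -> {fraction_female(t):.0%} female")
```

Under this kind of model, a whole clutch incubated at one uniform temperature lands on a single side of the pivotal value, which is why the early tray-incubation experiments produced unisex hatchlings; in a natural nest, temperature varies with depth and micro-climate, mixing the outcome.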
* Author Jim Steele is Director emeritus of the Sierra Nevada Field Campus, San Francisco State University.
Literature cited
Fay, F. (1982) Ecology and Biology of the Pacific Walrus, Odobenus rosmarus divergens Illiger. U.S. Department of the Interior, Fish and Wildlife Service, North American Fauna, No. 74.
Ovsyanikov N.G., and Menyushina I.E. (2008) Specifics of Polar Bears Surviving an Ice Free Season on Wrangel Island in 2007. Marine Mammals of the Holarctic. Odessa, pp. 407-412.
EU documents newly obtained by the nonprofit Pesticide Action Network of Europe reveal that the health commission of the European Union (DG SANCO), which is responsible for protecting public health, is attempting to develop a procedural “escape route” to evade an upcoming EU-wide ban on endocrine disrupting pesticides. Endocrine disrupting chemicals (EDCs) are those that alter hormonal regulation at very low doses to cause effects on behavior, reproduction, and gender, as well as cancer and birth defects.
In 2009, under the European Union’s then-new pesticide regulation (EC 1107/2009), a continent-wide ban on endocrine disrupting pesticides was agreed. The European Commission (EC) was charged with taking various steps to protect public safety. These included officially defining what constitutes an endocrine disrupting effect and designating acceptable chemical detection methods. The deadline to present these criteria for ensuring protection against endocrine disrupting pesticides expired on December 14, 2013.
Instead of providing the needed safety guidance, however, the EU’s Health Commission (DG SANCO) appears to be drafting a procedural “escape route” around the endocrine disrupting ban. This legal maneuvering is being done behind closed doors and with the collaboration of some EU member states and the European Food Safety Authority (EFSA, an independent EU agency created to assess food risks for the Commission).
As initially revealed by the Pesticide Action Network of Europe (PAN Europe), only Sweden opposes this escape route, which it considers an abandonment of the original democratic mandate. According to a report by Agence France-Presse (AFP), Sweden is now going to sue the EU, citing mounting evidence that harmful impacts of endocrine disruption are already being felt. AFP quotes Swedish environment minister Lena Ek:
“In some places in Sweden we see double sexed fish. We have scientific reports on how this affects fertility of young boys and girls, and other serious effects.”
The documents obtained by PAN Europe show that the lobbying to undermine the ban is being led by EFSA. This is in direct conflict with the missions of both EFSA and DG SANCO which are to protect public health.
The crisis has come about because EDCs are the subject of a large body of independent academic research showing that certain synthetic chemicals are already causing developmental disabilities and cancer among humans and wildlife through non-traditional (i.e. hormonal) toxicological routes. This evidence is why the ban was instigated. Because of the strength of the evidence and the low doses involved (Vandenberg et al 2012), any rigorous and effective rules to protect the public are likely to result in widespread bans and restrictions on commonly used industrial, agricultural, and household chemicals. This is one reason why AFP also reported the Swedish Minister as saying that EU commissioners were under strong industry pressure.
Tony Tweedale, a Brussels-based independent consultant to NGOs, explained to Independent Science News that there is a second reason for industry pressure:
“That hormones are often disrupted at very low doses threatens to upset industry’s decades-long total control of risk assessment which is based, for example on insensitive tests.”
While missing their mandated December deadline for providing safety rules, DG SANCO and EFSA chose to perform an economic impact assessment of potential regulations instead. Now this economic impact assessment is itself 9 months late. Sweden and others have interpreted these delays as stalling a collectively agreed action.
Before the Swedish lawsuit was announced Sweden had already expressed its concerns to the European Commission in letters to DG SANCO (published on the PAN Europe website). These letters reveal that Sweden believes the failure of DG SANCO to proceed according to the rules is deliberate and that DG SANCO is instead focused on drafting the illegal escape clause. This, believes Sweden, would likely take the form of a general derogation for pesticides that may be endocrine disruptors (1). It would be a legal technicality that effectively allowed pesticides which would have been banned to be exempt from the ban (2).
Simultaneously with Sweden’s announcement that it would take the European Commission to court, PAN Europe uncovered a letter from a representative of the EFSA Scientific Committee (which is helping to draw up the new scientific criteria). In this letter, addressed to advisors of José Manuel Barroso (head of the European Commission), the EFSA official says that the permanent science advisors to EFSA oppose the ban and aim to use traditional risk assessment to undermine it. Traditional risk assessment is the approach favoured by the pesticide industry.
Also in the letter, the EFSA science advisor complains of the pesticide legislation having no “control route” or “socio-economic route” to save endocrine disrupting pesticides from a ban. The anonymous writer suggests that an existing ‘negligible exposure’ option (EC 1107/2009, Annex II, 3.6.5) can be manipulated to keep such pesticides on the market. It is use of this ‘negligible exposure’ option that is opposed by Sweden, which believes that because negligible exposure is not well defined it is in danger of becoming a generic exemption (i.e. a derogation) for the use of endocrine disrupting chemicals.
The existence of this letter confirms Sweden’s interpretation of the intentions of EFSA and DG SANCO; the ‘negligible exposure’ option is indeed being lined up as a loophole for avoiding likely science-based bans on endocrine disruptors.
In the view of PAN Europe:
“By unilaterally changing the rules, DG SANCO is sidelining the EU Parliament and choosing economic interests over their own mission to protect people and the environment.”
Science Director of The Bioscience Resource Project, Allison Wilson, concluded:
“The public will be astounded and appalled to find that the institutions tasked with protecting them are secretly working against them. EFSA has shown itself to be untrustworthy and should be disbanded. Deep rethinking appears necessary since it is not only the EU that has failed to construct institutions capable of safely regulating toxic substances. Perhaps we should question the wisdom of economies dependent on synthetic chemicals and high risk products.” (3)
Footnotes
(1) A derogation is a partial or temporary suspension of a law.
(2) The list of pesticides Sweden thinks likely to be banned can be found here.
(3) See: Robinson C., Holland N., Leloup D., Muilerman H. (2013) Conflicts of interest at the European Food Safety Authority erode public confidence. J Epidemiol Community Health 2013;67:717-720 doi:10.1136/jech-2012-202185
References
Vandenberg LN, Colborn T, Hayes TB, Heindel JJ, Jacobs DR Jr et al. (2012) Hormones and endocrine-disrupting chemicals: Low-dose effects and nonmonotonic dose responses. Endocr Rev 33: 378-455.
Sea level tipping points are a popular CAGW/media theory, easily suggested by images of calving icebergs and summer meltwater rushing down Greenland moulins. But they are alarmist precautionary mitigation fantasies rather than remotely possible future scenarios on multi-centennial time scales.
The core tenet of catastrophic anthropogenic global warming (CAGW) is that rising CO2 will raise temperatures and result in various catastrophes. The IPCC, UNFCCC, and now the US NCA have argued this requires immediate, drastic, collective mitigation. Nature has not co-operated. Temperatures stopped rising (the pause), extreme weather did not increase (IPCC SREX), Australian drought turned to flood, Tuvalu has not disappeared, and polar bears thrive. So AR5 WG2 finally said adaptation might be a better response.
About the only urgent mitigation rationale left is the precautionary principle: take expensive actions as just-in-case ‘insurance’ against some ‘tipping point’ beyond which the world is rapidly, disastrously, and irreversibly affected, a point we therefore dare not risk passing. No price is too high to pay to avoid a catastrophic tipping point, according to this principle. Bad economics piled onto bad science.
One of the most marketed tipping points is sea level rise (SLR). The problem with ‘sudden’ SLR is that it did not happen in the Eemian interglacial. But that alone does not prove it could not happen with CAGW added to the current interglacial, the Holocene.
There are only three ice bodies with enough water to cause a potentially rapid and large sea level rise. These are the Greenland, East Antarctic, and West Antarctic Ice Sheets. Since Antarctica as a whole may (inconveniently for CAGW) be accumulating ice [i], Greenland has been the ‘tipping point’ most frequently mentioned by official agencies [ii] and by the MSM. [iii]
There is no doubt that Greenland is losing ice mass, and at a recently increased rate. This has been measured in different ways (ice-melt boundary, gravity (GRACE), iceberg calving…). The ‘consensus’ is about 170-200 Gt per year recently, but about 100 Gt/yr averaged over satellite-era Arctic cycles, since the estimated loss was only about 7 Gt/yr in the 1990s.[iv] Winter snow accumulation is as important to net ice mass balance as the summer melt.
The observed mass loss should be put into perspective. According to the Byrd Polar Research Center, the Greenland Ice Sheet comprises 2.62-2.93E+6 km3 of ice. That is a total mass of about 2.67E+18 kg (with uncertainty on both volume and density: firn, moulins, entrained air). A gigaton is E+12 kg. Greenland is estimated to be losing about E+14 kg per year averaged over two decades. At that rate, it would take about (2.67E+18 kg mass / E+14 kg average annual mass loss) 27,000 years to melt/sluff. Even the recent accelerated rate, if continued, would take over 14,000 years.[v] That is longer than it took the great Laurentide Ice Sheet to disappear at the end of the last ice age. If Greenland ever did melt completely, it would raise sea level by 6.7 meters. Even at the faster melt rate this would be (670 cm / 140 centuries) 4.8 cm/century of SLR, an additional 0.5 mm/yr: more adaptation than mitigation.
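The back-of-envelope arithmetic above can be checked in a few lines. The inputs are assumptions taken from the text: an ice volume near the top of the Byrd range and a glacial ice density of about 917 kg/m3 (firn and entrained air make this figure uncertain).

```python
# Back-of-envelope check of the Greenland melt arithmetic above.
# Assumed inputs: Byrd volume estimate (upper end) and glacial ice
# density ~917 kg/m^3.

ICE_VOLUME_KM3 = 2.9e6            # Byrd estimate: 2.62-2.93 million km^3
ICE_DENSITY_KG_PER_M3 = 917.0

mass_kg = ICE_VOLUME_KM3 * 1e9 * ICE_DENSITY_KG_PER_M3  # km^3 -> m^3 -> kg
print(f"total ice mass ~ {mass_kg:.2e} kg")

GT = 1e12  # one gigaton, in kg
for label, loss_gt in (("two-decade average", 100), ("recent accelerated", 190)):
    years = mass_kg / (loss_gt * GT)
    print(f"{label} ({loss_gt} Gt/yr): {years:,.0f} years to melt entirely")

# Sea-level contribution at the faster rate: 6.7 m spread over ~140 centuries
print(f"{670 / 140:.1f} cm per century, i.e. roughly 0.5 mm/yr")
```

The result reproduces the numbers in the text: roughly 27,000 years at the average loss rate, about 14,000 years even at the accelerated rate, and under 5 cm of sea level per century.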
It is unlikely that Greenland will melt. NEEM showed northwest Greenland was 5-8°C warmer than present for about 7 millennia during the Eemian. True, more ice melted there then than has so far in the Holocene. The NEEM site cored ‘only’ 2537 meters of ice. At the end of the Eemian, the ice at the NEEM location was about 130 meters lower: ‘only’ ≈2400 meters thick.
The only way a centennial or even millennial Greenland tipping point would be possible is if much of its ice ‘slid off’. It is true that the outer ice sheet edges are glaciers creeping seaward and sluffing—calving icebergs like the one that sank the Titanic in 1912 (before AGW). But it is not true that most of the Greenland ice sheet could ever creep off, since the underlying bedrock is bowl shaped. The most graphic 3D visualization is from Bamber, University of Bristol.
The thickest ice is over the deepest part of a bedrock bowl 1000-2500 meters deep, e.g. at the NEEM site. It is not going anywhere anytime soon. That ‘bowl’ interior is where the Greenland Sheet has been accumulating even as the edges sluff/melt. Creep decline becomes increasingly self-limited by underlying geology.
Greenland losing all its ice is geophysically impossible on millennial time scales, since the ice would have to melt in place. It is not something to worry about at all on centennial time scales, even as an implausible black swan or dragon king.
With Greenland geologically debunked as a possible SLR tipping point, attention turned to Antarctica. Whether Antarctica in total is gaining or losing ice is a matter of dispute between NASA and NOAA. Per current NOAA estimates, WAIS is losing, EAIS is gaining, and the Peninsula is about even. So any tipping point has to be sought in West Antarctica (WAIS). The general WAIS slope runs from the Transantarctic Mountain divide down to the sea, although some ice is anchored by the Executive Committee and Ellsworth mountain ranges.
Potential WAIS instability has been the subject of much scientific scrutiny. The original concerns were the large below-sea-level grounded portions of the Ronne (which is not part of WAIS but is still mostly in the western half of Antarctica) and Ross ice shelves. (Floating shelf ice cannot further raise sea levels.) These have the largest volumes of ice creeping toward the sea. Like Greenland, much of the rest (and most of EAIS) is anchored by underlying bedrock topography. On an annual basis, fresh snow still replenishes most of the lost edge mass inland at higher WAIS elevations. It is the net mass balance along these seaward-sloping WAIS ice sheet edges that might constitute a sufficiently large tipping point.
Ronne (1) is gaining net ice mass according to NASA, so it isn’t a plausible tipping point. Ross (19) might or might not be losing ice, but it is what ‘holds back’ almost half of WAIS. Ross also has more ice grounded deeper on the seabed, which, if ungrounded (melted from below), would raise sea levels more. For years Ross was the main WAIS instability ‘tipping point’ concern.
The ANDRILL program was designed to examine the underlying Ross seabed (both where the ice is grounded below sea level and where it floats as shelf) to understand its behavior in previous interglacials. ANDRILL cores and creep rates suggest it has not collapsed before (at least not in the past 3 million years, spanning the entire Pleistocene Ice Age) and likely will not now. The Ross shelf’s seaward creep has decelerated.[vi],[vii] Ross has bedrock islands ‘anchoring’ its grounded ice, retarding seaward creep.[viii] So Ross is not a plausible tipping point after all.
So in 2014 attention turned to the only other possibility, the Amundsen Embayment, which is indisputably losing ice at an accelerating rate. Abetted by additional NASA PR and author interviews (Rignot of NASA JPL: “Already gone into irreversible retreat, past the point of no return”), MSM alarmist headlines were, well, alarming. Reuters reported worldwide: “West Antarctic Glaciers in irreversible thaw: rising seas.” CNN said: “Ice melt in part of Antarctica ‘appears unstoppable’, NASA says.”
The MSM did not read these new papers carefully or in context (if at all). The first paper found that Pine Island (22), Thwaites (21), and the four lesser Amundsen Embayment glaciers are discharging ice more rapidly than all of Greenland (together about 330 Gt/yr). That is surprisingly 3-4x higher than any previous estimate, for example those also from NASA in 2011 shown above. The second paper used computer models of bottom melting at Thwaites (21) to conclude it could become unstable in 200 to 900 years. If so, the computer models suggested 1 mm/yr of additional SLR thereafter. Not ‘in coming decades’ as Reuters said and NASA PR implied.
There is a deeper comprehension problem in this new NASA-sponsored version of a SLR tipping point. The NASA news release about these papers says the Embayment region contains enough ice to raise global sea level by 4 feet (1.2 meters). That is true for the entire catchment basin of about 360,000 km2.[ix] For 1.2 meters of SLR, the entire catchment would have to become entirely ice free. That is highly unlikely. The interior portions are not flowing much toward the sea according to the first paper itself, and are also still accumulating ice.[x],[xi]
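A quick sanity check shows what the 4-foot headline figure actually implies. Using assumed round numbers (global ocean area ~3.62e8 km2, seawater ~1025 kg/m3, ice ~917 kg/m3), one can compute how much ice, on average, must vanish across the whole catchment:

```python
# Sanity check: what would 4 feet (1.2 m) of global sea-level rise from
# the ~360,000 km^2 Amundsen catchment require? All inputs are assumed
# round numbers, not figures from the papers themselves.

OCEAN_AREA_KM2 = 3.62e8   # approximate global ocean surface area
CATCHMENT_KM2 = 3.6e5     # Amundsen Embayment catchment, per the text
SLR_M = 1.2               # the headline 4-foot rise

water_volume_km3 = (SLR_M / 1000.0) * OCEAN_AREA_KM2  # km^3 of added seawater
ice_volume_km3 = water_volume_km3 * 1025.0 / 917.0    # ice needed to supply it
avg_thickness_m = ice_volume_km3 / CATCHMENT_KM2 * 1000.0

print(f"ice required: ~{ice_volume_km3:.2e} km^3")
print(f"average ice thickness lost over the whole catchment: ~{avg_thickness_m:,.0f} m")
```

The answer is well over a kilometer of ice thickness, averaged across the entire basin, which underscores why the full 1.2 meters requires the catchment to become entirely ice free rather than merely to keep losing ice at the glacier fronts.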
AR5 WG2 had it right that the best response to SLR is adaptation. Major coastal cities like New Orleans (3-10mm/yr), Jakarta (6-22mm/yr), and Bangkok (10-28mm/yr) are already subsiding at much faster rates than sea levels are or will foreseeably be rising.
Climate change, once considered a problem for the distant future, has moved firmly into the present. Climate change is already affecting the American people. – U.S. NCADAC
The U.S. National Climate Assessment Report was published Tuesday [link]. I’ve read half of the chapters (at the beginning and end), skimming the ones in the middle.
My main conclusion from reading the report is this: the phrase ‘climate change’ is now officially meaningless. The report effectively implies that there is no climate change other than what is caused by humans, and that extreme weather events are equivalent to climate change. Any increase in adverse impacts from extreme weather events or sea level rise is caused by humans. Possible scenarios of future climate change depend only on emissions scenarios that are translated into warming by climate models that produce far more warming than has recently been observed.
Some of the basic underlying climate science and impacts reported is contradictory to the recent IPCC AR5 reports. Pat Michaels and Chip Knappenberger have written a 134 page critique of a draft of the NCADAC report [link].
Even in the efforts to spin extreme weather events as alarming and caused by humans, Roger Pielke Jr. has tweeted the following quotes from the Report:
“There has been no universal trend in the overall extent of drought across the continental U.S. since 1900”
“Other trends in severe storms, including the intensity & frequency of tornadoes, hail, and damaging thunderstorm winds, are uncertain”
“lack of any clear trend in landfall frequency along the U.S. eastern and Gulf coasts”
“when averaging over the entire contiguous U.S., there is no overall trend in flood magnitudes”
As I wrote in a previous post on a draft of the report, the focus should be on the final Chapter 29: Research Agenda, which outlines what we DON’T know. Chapter 28 Adaptation is also pretty good. Chapter 27 Mitigation is also not bad, and can hardly be said to make a strong case for mitigation. Chapter 26 on Decision Support is also ok, with one exception: it assumes the only scenarios of future climate are tied to CO2 emissions scenarios.
An interesting feature of the report is the Traceable Accounts: for each major conclusion, a Traceable Account describes the Key Message Process, the evidence base, new information and remaining uncertainties, and the assessment of confidence based on evidence. The entertainment value comes in reading the description of very substantial uncertainties, and then seeing ‘very high confidence’. This exercise, while in principle a good one, in practice only serves to highlight the absurdity of the ‘very high confidence’ levels in this report.
In an interesting move (see “Obama Taps TV Meteorologists to Roll Out New Climate Report”), Obama is giving interviews to some TV weathermen. It will be interesting to see how this strategy plays out, since TV weathermen tend to be pretty skeptical of AGW.
The politics on this are interesting also, see especially these two articles
While there is some useful analysis in the report, it is hidden behind a false premise that any change in the 20th century has been caused by AGW. Worse yet is the spin being put on this by the Obama administration. The Washington Post asks the following question: Does National Climate Assessment lack necessary nuance? In a word, YES.
The failure to imagine future extreme events and climate scenarios, other than those that are driven by CO2 emissions and simulated by deficient climate models, has the potential to increase our vulnerability to future climate surprises (see my recent presentation on this Generating possibility distributions of scenarios for regional climate change). As an example, the Report highlights the shrinking of winter ice in the Great Lakes: presently, in May, Lake Superior is 30% covered by ice, which is apparently unprecedented in the historical record.
The big question is whether the big push by the White House on climate change will be able to compete with this new interview with Monica Lewinsky :)
Uncorking East Antarctica yields unstoppable sea-level rise
The melting of a rather small ice volume on East Antarctica’s shore could trigger a persistent ice discharge into the ocean, resulting in unstoppable sea-level rise for thousands of years to come. This is shown in a study now published in Nature Climate Change by scientists from the Potsdam Institute for Climate Impact Research (PIK). The findings are based on computer simulations of the Antarctic ice flow using improved data of the ground profile underneath the ice sheet.
“East Antarctica’s Wilkes Basin is like a bottle on a slant,” says lead-author Matthias Mengel, “once uncorked, it empties out.” The basin is the largest region of marine ice on rocky ground in East Antarctica. Currently a rim of ice at the coast holds the ice behind in place: like a cork holding back the content of a bottle. While the air over Antarctica remains cold, warming oceans can cause ice loss on the coast. … Full article
~~~
Meanwhile… back in reality:
Antarctic Sea Ice Blows Away Records In April
By Paul Homewood
Antarctic sea ice continues to set new records, with extent in April at the highest since measurements began in 1979.
Back in April 2011, I had a post on The U.S. House of Representatives Hearing on Climate Change: Examining the Processes Used to Create Science and Policy. John Christy’s testimony is worth revisiting, in two contexts:
problems with the IPCC process, most recently highlighted in context of WG3 [link]
the Steyn versus Mann and Mann versus Steyn lawsuits [link]
John Christy has a unique perspective on how the hockey stick became the icon of the Third Assessment Report (TAR) – he served as a Lead Author (along with Michael Mann) on Chapter 2 Observed Climate Variability and Change. Relevant excerpts from Christy’s testimony:
In simplified terms, IPCC Lead Authors are nominated by their countries, and downselected by the IPCC bureaucracy with help from others (the process is still not transparent to me – who really performs this down-select?) The basic assumption is that the scientists so chosen as Lead Authors (L.A.s) represent the highest level of expertise in particular fields of climate science (or some derivative aspect such as agricultural impacts) and so may be relied on to produce the most up-to-date and accurate assessment of the science. In one sense, the authors of these reports are volunteers since they are not paid. However, they do not go without salaries. Government scientists make up a large portion of the author teams and can be assigned to do such work, and in effect are paid to work on the IPCC by their governments. University scientists aren’t so lucky but can consider their IPCC effort as being so close to their normal research activities that salary charges to the university or grants occur. Travel expenses were paid by the IPCC for trips, in my case, to Australia, Paris, Tanzania, New Zealand, Hawaii, and Victoria, Canada. Perhaps it goes without saying that such treatment might give one the impression he or she is an important authority on climate.
As these small groups of L.A.s travel the world, they tend to form close communities which often re-enforce a view of the climate system that can be very difficult to penetrate with alternative ideas (sometimes called “confirmation bias” or “myside bias”.) They become an “establishment” as I call them. With such prominent positions as IPCC L.A.s on this high profile topic, especially if they support the view that climate change is an unfolding serious disaster, they would be honored with wide exposure in the media (and other sympathetic venues) as well as rewarded with repeated appointments to the IPCC process. In my case, evidently, one stint as an L.A. was enough.
The second basic problem (the first was the murkiness of our science) with these assessments is the significant authority granted the L.A.s. This is key to understanding the IPCC process. In essence, the L.A.s have virtually total control over the material and, as demonstrated below, behave in ways that can prevent full disclosure of the information that contradicts their own pet findings and which has serious implications for policy in the sections they author. While the L.A.s must solicit input for several contributors and respond to reviewer comments, they truly have the final say.
In preparing the IPCC text, L.A.s sit in judgment of material in which they themselves are likely major players. Thus they are in the position to write the text that judges their own work as well as the work of their critics. In typical situations, this would be called a conflict of interest. L.A.s, being human, are thus tempted to cite their own work heavily and to neglect or belittle contradictory evidence (see examples below.) In the beginning, the scientists who wrote the IPCC assessment were generally aware of this new responsibility, of the considerable uncertainties of climate science, and that the consequences of their conclusions could generate burdensome policies. The first couple of reports were relatively cautious and rather equivocal.
In my opinion, as further assessments were created, a climate “establishment” came into being, dominating not only the IPCC but many other aspects of climate science, including peer review of journals. Many L.A.s became essentially permanent fixtures in the IPCC process and rose to positions of prominence in their institutions as a side benefit. As a result, in my view, they had a vested interest in preserving past IPCC claims and affirming ever more confident new claims to demonstrate that the science was progressing under their watch and that financial support was well spent. Speaking out as I do about this process assured my absence from significant contribution to recent and future reports. Political influence cannot be ignored. As time went on, nations would tend to nominate only those authors whose climate change opinions were in line with a national political agenda which sought perceived advantages (i.e. political capital, economic gain, etc.) by promoting the notion of catastrophic human-induced climate change. Scientists with well-known alternative views would not be nominated or selected. Indeed, it became more and more difficult for dissension and skepticism to penetrate the process now run by this establishment. As noted in my InterAcademy Council (IAC) testimony, I saw a process in which L.A.s were transformed from serving as Brokers of science (and policy-relevant information) to Gatekeepers of a preferred point of view.
A focus evolved in the IPCC that tended to see enhanced greenhouse gas concentrations as the cause for whatever climate changes were being observed, particularly in 2001 (the Third Assessment Report or TAR), a view further solidified in 2007 (the Fourth Assessment Report or AR4.) The IAC 2010 report on the IPCC noted this overconfidence when it stated that portions of the AR4 contained “many vague statements of ‘high confidence’ that are not supported sufficiently in the literature, not put into perspective, or are difficult to refute.” (This last claim relates to the problem of generating “unfalsifiable hypotheses” discussed in my recent House testimony.)
My experience as Lead Author in the IPCC TAR, Chapter 2 “Observed Climate Variability and Change”, allowed me to observe how a key section of this chapter, which produced the famous Hockey Stick icon, was developed. My own topic was upper air temperature changes that eventually drew little attention, even though the data clearly indicated potentially serious inconsistencies for those who would advocate considerable confidence in climate model projections.
First, note these key points about the IPCC process: the L.A. is allowed (a) to have essentially complete control over the text, (b) sit in judgment of his/her own work as well as that of his/her critics and (c) to have the option of arbitrarily dismissing reviewer comments since he/she is granted the position of “authority” (unlike peer-review.) Add to this situation the rather unusual fact that the L.A. of this particular section had been awarded a PhD only a few months before his selection by the IPCC. Such a process can lead to a biased assessment of any science. But, problems are made more likely in climate science, because, as noted, ours is a murky field of research – we still can’t explain much of what happens in weather and climate.
The Hockey Stick curve depicts a slightly meandering Northern Hemisphere cooling trend from 1000 A.D. through 1900, which then suddenly swings upward in the last 80 years to temperatures warmer, when smoothed, than at any time in the millennium. To many, this appeared to be a “smoking gun” of temperature change, proving that the 20th century warming was unprecedented and therefore likely to be the result of human emissions of greenhouse gases.
I will not debate the quality of the Hockey Stick – that has been effectively done elsewhere (and indeed there is voluminous discussion on this issue), so, whatever one might think of the Hockey Stick, one can readily understand that its promotion by the IPCC was problematic given the process outlined above. Indeed, with the evidence contained in the Climategate emails, we have a fairly clear picture of how this part of the IPCC TAR went awry. For a more detailed account of this incident with documentation, see http://climateaudit.org/2009/12/10/ipcc-and-the-trick/.
We were appointed L.A.s in 1998. The Hockey Stick was prominently featured during IPCC meetings from 1999 onward. I can assure the committee that those not familiar with issues regarding reconstructions of this type (and even many who should have been) were truly enamored by its depiction of temperature and sincerely wanted to believe it was truth. Skepticism was virtually non-existent. Indeed it was described as a “clear favourite” for the overall Policy Makers Summary (Folland, 0938031546.txt). In our Sept. 1999 meeting (Arusha, Tanzania) we were shown a plot containing more temperature curves than just the Hockey Stick including one from K. Briffa that diverged significantly from the others, showing a sharp cooling trend after 1960. It raised the obvious problem that if tree rings were not detecting the modern warming trend, they might also have missed comparable warming episodes in the past. In other words, absence of the Medieval warming in the Hockey Stick graph might simply mean tree ring proxies are unreliable, not that the climate really was relatively cooler.
The Briffa curve created disappointment for those who wanted “a nice tidy story” (Briffa 0938031546.txt). The L.A. remarked in emails that he did not want to cast “doubt on our ability to understand factors that influence these estimates” and thus “undermine faith in paleoestimates”, which would provide “fodder” to “skeptics” (Mann 0938018124.txt). One may interpret this to imply that being open and honest about uncertainties was not the purpose of this IPCC section. Between this email (22 Sep 1999) and the next draft sent out (Nov 1999, Fig. 2.25 Expert Review) two things happened: (a) the email referring to a “trick” to “hide the decline” in the preparation of a report by the World Meteorological Organization was sent (Jones 0942777075.txt; “trick” apparently refers to a splicing technique used by the L.A. in which non-paleo data were merged to massage away a cooling dip in the last decades of the original Hockey Stick) and (b) the cooling portion of Briffa’s curve had been truncated for the IPCC report (it is unclear who performed the truncation.)
In retrospect, this disagreement in temperature curves was simply an indication that such reconstructions using tree ring records contain significant uncertainties and may be unreliable in ways we do not currently understand or acknowledge. This should have been explained to the readers of the IPCC TAR, and specifically of our chapter. Highlighting that uncertainty would have been the proper scientific response to the evidence before us, but the emails show that some L.A.s worried it would have diminished a sense of urgency about climate change (i.e. “dilutes the message rather significantly”, Folland, 0938031546.txt.)
When we met in February 2000 in Auckland, NZ, the one disagreeable curve, as noted, was not the same anymore because it had been modified and truncated around 1960. Not being aware of the goings-on behind the scenes, I had apparently assumed a new published time series had appeared and the offensive one had been superseded (I can’t be certain of my actual thoughts in Feb. 2000). Now we know, however, that the offensive part of Briffa’s curve had simply been amputated after a new realization was created three months before. (It appears also that this same curve was a double amputee, having its first 145 years chopped off too; see http://climateaudit.org/2011/03/23/13321/.) So, at this point, data which contradicted the Hockey Stick, whose creator was the L.A., had been eliminated. No one seemed to be alarmed (or in my case aware) that this had been done.
Procedures to guard against such manipulation of evidence are supposed to be in place whenever biases and conflicts of interest interfere with duties to report the whole truth, especially in assessments that have such potentially drastic policy implications. That the IPCC allowed this episode to happen shows, in my view, that the procedures were structurally deficient.
Even though the new temperature chart appeared to agree with the Hockey Stick, I still expressed my skepticism in this reconstruction as being evidence of actual temperature variations. Basically, this result relied considerably on a type of western U.S. tree-ring not known for its fidelity in reproducing large-scale temperatures (NRC 2006, pg. 52).
At the L.A. meetings, I indicated that there was virtually no inter-century precision in these measurements, i.e. they were not good enough to tell us which century might be warmer than another in the pre-calibration period (1000 to 1850.)
In one Climategate email, a Convening L.A., who wanted to feature the Hockey Stick at the time (though later was less enthusiastic), mentions “The tree ring results may still suffer from lack of multicentury time scale variance” and was “probably the most important issue to resolve in Chapter 2” (Folland, 0938031546.txt). This, in all likelihood, was a reference to (a) my expressed concern (see my 2001 comments to NRC below) as well as to (b) the prominence to which the Hockey Stick was predestined.
To compound this sad and deceptive situation, I had been quite impressed with some recent results by Dahl-Jensen et al., (Science 1998), in which Greenland ice-borehole temperatures had been deconvolved into a time series covering the past 20,000 years. This measurement indeed presented inter-century variations. Their result indicated a clear 500-year period of temperatures, warmer than the present, centered about 900 A.D. – commonly referred to as the Medieval Warm Period, a feature noticeably absent in the Hockey Stick. What is important about this is that whenever any mid to high-latitude location shows centuries of a particularly large temperature anomaly, the spatial scale that such a departure represents is also large. In other words, long time periods of warmth or coolness are equivalent to large spatial domains of warmth or coolness, such as Greenland can represent for the Northern Hemisphere (the domain of the Hockey Stick.)
I discussed this with the paleo-L.A. at each meeting, asking that he include this exceptional result in the document as evidence for temperature fluctuations different from his own. To me, Dahl-Jensen et al.’s reconstruction was a more robust estimate of past temperatures than one produced from a certain set of western U.S. tree-ring proxies. But as the process stood, the L.A. was not required to acknowledge my suggestions, and I was not able to convince him otherwise. It is perhaps a failure of mine that I did not press the issue even harder or seek agreement from others who might have been likewise aware of the evidence against the Hockey Stick realization.
As it turned out, this exceptional paper by Dahl-Jensen et al. was not even mentioned in the appropriate section (TAR 2.3.2). There was a brief mention of similar evidence indicating warmer temperatures 1000 years ago from the Sargasso Sea sediments (TAR 2.3.3), but the text then quickly asserts, without citation, that this type of anomaly is not important to the hemisphere as a whole.
Thus, we see a situation where a contradictory data set from Greenland, which in terms of paleoclimate in my view was quite important, was not offered to the readers (the policymakers) for their consideration. In the end, the Hockey Stick appeared in Figure 1 of the IPCC Summary for Policymakers, without any other comparisons, a position of prominence that speaks for itself.
So, to summarize, an L.A. was given final say over a section which included as its (and the IPCC’s) featured product, his very own chart, and which allowed him to leave out not only entire studies that presented contrary evidence, but even to use another strategically edited data set that had originally displayed contrary evidence. This led to problems that have only recently been exposed. This process, in my opinion, illustrates that the IPCC did not provide policymakers with an unbiased evaluation of the science, whatever one thinks about the Hockey Stick as a temperature reconstruction.
Judith Curry comments: Christy’s assessment, when combined with the University of East Anglia emails, provides substantial insight into how this hockey stick travesty occurred. My main unanswered question is: How did Michael Mann become a Lead Author on the TAR? He received his Ph.D. in 1998, and presumably he was nominated or selected before the ink was dry on his Ph.D. It is my suspicion that the U.S. did not nominate Mann (why would they nominate someone for this chapter without a Ph.D.?) Here is the only thing I can find on the U.S. nomination process [link]. Instead, I suspect that the IPCC Bureau selected Mann; it seems that someone (John Houghton?) was enamored of the hockey stick and wanted to see it featured prominently in the TAR. The actual selection of Lead Authors by the IPCC Bureau is indeed a mysterious process.
The IPCC process is clearly broken, and I don’t see anything in their recent policies that addresses the problems that Christy raises. The policy makers clearly wrought havoc in the context of the AR5 WG3 report; however, there is a more insidious problem, particularly with the WG1 scientists in terms of conflict of interest and the IPCC Bureau in terms of stacking the deck to produce the results that they want.
Yelling “fire” in a crowded room is the iconic example used to distinguish free speech from license. Falsely claiming there was a fire in a crowded room would be punishable license because it could cause a panicked stampede resulting in injury. Episode 2 of Years of Living Dangerously treads dangerously close to license, as the producers present a completely one-sided, biased view of forest fires, purposely trying to incite climate panic. Those of us who have studied forest ecology for the past few decades have understood that bigger, more destructive fires have been the result of fire suppression and a growing population. As the USA added 100 million people since 1970, more and more people moved into forested areas, and changes in fire frequency have been skewed by the number of fires ignited by humans.
For example, the Arbor Day Foundation reports that more than 83% of forest fires in 2006 were started by human activities, accounting for the burning of nearly 4.4 million acres. In 2004, wildfires in Alaska burned more than 5 million acres, the worst year for Alaskan fires. However, 426 of those fires were started by humans and only 275 were natural fires ignited by lightning. In California’s 2003 “Fire Siege,” the first of several fires was set during military artillery practice. The biggest fire, the Cedar Fire, started when a signal fire got out of control.
While using movie stars to bludgeon us with the idea that all bad things must be due to CO2 climate change, “Years of Living Dangerously” committed huge sins of omission.
There is a wealth of scientific research regarding the effects of fire suppression and natural cycles. Instead, Arnold Schwarzenegger yuks it up with firefighters on the front line.
The episode then exploits human tragedy by highlighting the recent death of a “hot shot” crew. And Arnold (scientist?) then marries the tragedies by telling us CO2 climate change is making fires bigger, more frequent, and more dangerous.
But Arnold never tells us that forests which naturally experienced fires every 5 to 40 years had built up dangerously high fuel loads. During the 20th century era of fire suppression, the US Forest Service’s “10 AM rule” dominated, and every attempt was made to extinguish all small fires by 10 AM the next day. Normally those small fires would have burned longer, spread further, and created a mosaic of forest patches. The remaining forest patches were then buffered from any new fires that started in a neighboring patch, and large catastrophic fires are very rare in mosaic habitats. A patchy forest also prevents widespread beetle infestations.1 Until the media’s recent attempts to promote climate fear, forest ecologists had always complained that fire suppression was promoting larger beetle infestations. A list of research on the effects of logging, fire suppression and natural cycles on beetle infestations can be found here. Yet “Years of Living Dangerously” simply blames climate change.
In 1996 one of the fire experts interviewed in “Years of Living Dangerously”, Thomas Swetnam wrote:
“The paradox of fire management in conifer forests is that, if in the short term we are effective at reducing fire occurrence below a certain level, then sooner or later catastrophically destructive wildfires will occur. Even the most efficient and technologically advanced fire fighting efforts can only forestall this inevitable result. It is clear from many years of study and published works that the thinning action of pre-settlement surface fires maintained open stand conditions and thereby prevented the historically anomalous occurrence of catastrophic crown fires that we are experiencing in today’s Southwestern forests”2
[emphasis added]
However, there are other forests at higher elevation that naturally burn every 100 to 300 years. Over that time fuel loads naturally build to dangerous levels, and when those forests burn it is usually catastrophic. So Swetnam also co-authored a paper with Westerling suggesting the increase in fires since the 1970s is likely due to climate change, and that paper became the “scientific basis” for this episode of “Years of Living Dangerously”. They cited warm temperatures and dry weather associated with those large catastrophic fires, but they confused weather with climate: their paper analyzed only three decades of trends and only the largest fires (see graph below). However, the authors did admit, “Whether the changes observed in western hydro-climate and wildfire are the result of greenhouse gas–induced global warming or only an unusual natural fluctuation is beyond the scope of this work.” [emphasis added] Nonetheless, instead of providing a greater historical framework to evaluate natural cycles that last 60 to 200 years, they promoted untested speculation and simply reported that all the models predict more fires due to CO2 warming in the future.
Figure from Westerling 2006
But all catastrophic fires for the past several centuries have been associated with warm, dry conditions. Months of dry weather accelerated the biggest fires in written history. Swetnam himself had published papers showing southwest forest fires were far larger and far more frequent between 1700 and 1900, as seen in his published graph (Fig. 5 below). Other authors echoed the same findings. Based on early journalists’ accounts of fire throughout the Rocky Mountain region, modern fires are estimated to burn less than one-fourth of the land that burned historically. Fire ecologists debating how great an area needs to be burned to restore the natural fire regimes reported a “comprehensive assessment of burning in the contiguous United States and estimated that approximately 3 to 6 times more area must be burned to restore historical fire regimes.”1 The Westerling paper shows a peak in 1988, driven largely by the Yellowstone fires that burnt about 800,000 acres. In comparison, the Great Fire of 1910 was a wildfire that burned about three million acres (approximately the size of Connecticut) in northeast Washington, northern Idaho, and western Montana. From a historical point of view, the Yellowstone fire was modest.
The Peshtigo Fire of 1871 (in and around Peshtigo, Wisconsin) caused an estimated 1,500 deaths, possibly as many as 2,500. It consumed about 1.5 million acres, an area approximately twice the size of Rhode Island. The combination of wind, topography, and fire created the firestorm now known as the Peshtigo Paradigm. The elements that created the fire were studied and recreated by the American and British militaries during World War II for the fire bombings of German and Japanese cities. Nonetheless, because the Peshtigo fire happened at the same time as the Great Chicago Fire, it did not get a lot of media attention. However, the combination of those catastrophic fires prompted a fearful public to speculate that comets, meteorites or aliens were behind those firestorms, and one must wonder if blaming CO2 for recent fires is driven by the same lack of scientific understanding.
The USA embarked on an era of fire suppression as early as 1886, when the U.S. Army began to patrol the newly created National Parks. But fire suppression to preserve natural resources had unintended consequences. The consensus among fire experts is that “Fires generally become less frequent and more severe with active suppression on the landscape,” that “Modern wildfires on late seral landscapes tend to be larger, more intense, and more severe because of high biomass loadings, multilayer stand structures, and the high connectivity of the biomass at the stand and landscape level,” and that “The end result of fire exclusion in fire-prone forests is increasingly synchronous landscapes dominated by large, catastrophic disturbance regimes.”1
The Westerling paper argued that the greatest absolute increase in large wildfires occurred in Northern Rockies forests, where fire exclusion has had little impact on natural fire regimes because those forests had only burned every 100 to 300 years and the era of fire suppression was too short to play a significant role. So they suggested that earlier springs due to climate change had caused the increase in catastrophic fires in that region. However, a study by the US Forest Service concluded fire suppression has played a major role in the Rocky Mountains.1 A picture near the Yellowstone River from 1871 shows a landscape dominated by grasslands and a mosaic of forest patches (Figure 1C below). After a century of suppression, a photograph of the same area from 1981 (Figure 1D below) shows a vast expanse of interconnected forest with a high fuel load now dominating the landscape and primed for a huge fire. Until that landscape recovered from the 1800s fires, large catastrophic fires would have been impossible.
Large fires have always been a natural occurrence in these regions. What is unprecedented in recent decades are the tremendous swathes of dense forest. (Also notice that, contrary to global warming theory, trees once confined to the higher elevations had migrated to lower elevations.) Furthermore, the sudden uptick in forest fires beginning in the 1970s correlates with a change in forest management. Land managers now recognized the importance of natural fires and the mosaic that prevented devastating fires and promoted biodiversity. The “10 AM” rule was dropped and small fires were allowed to burn. The decades of fire suppression had simply delayed the inevitable as forests recovered. Catastrophic fires in Yellowstone were the result of small fires that were now allowed to burn.6
The American West also experiences decades of drought driven by natural ocean cycles. Most extreme climate events occur when a La Niña and a cold Pacific Decadal Oscillation (PDO) phase coincide, or when an El Niño and a warm PDO phase coincide. When a La Niña and a cold PDO coincide, the southwestern United States experiences its severest droughts and heightened fire danger. Each phase of the PDO persists for about 20-30 years, but cycles of El Niño and La Niña alternate every two to seven years. Thus the coincidence of both a La Niña and the cool phase of the PDO has only occurred about 29% of the time. However, since the 1700s, that 29% coincided with 70% of all major fires in Rocky Mountain National Park. Colorado’s 2012 wildfire season was no exception.5 Snowfall and the timing of spring’s arrival are also largely driven by the Pacific Decadal Oscillation.6 Informing the public about those natural cycles would help them prepare better for the weather they bring, but this episode prefers fear-mongering.
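The phase-coincidence figure above is, at bottom, a joint-frequency calculation over annual records of the two oscillations. A minimal sketch of that calculation in Python, using made-up illustrative phase series rather than the actual PDO and ENSO reconstructions (so the printed fraction applies only to the toy data, not to the historical 29%):

```python
# Estimate how often a La Niña year coincides with a cool-PDO year,
# given annual True/False indicators for each phase.
# The series below are synthetic placeholders, NOT real reconstructions.

def joint_fraction(cool_pdo, la_nina):
    """Fraction of years in which both phases are active."""
    assert len(cool_pdo) == len(la_nina)
    both = sum(1 for p, n in zip(cool_pdo, la_nina) if p and n)
    return both / len(cool_pdo)

# Toy encoding of the cycle lengths described above: a PDO that alternates
# in 25-year phases and a La Niña that recurs roughly 2 years in every 5.
cool_pdo = [year % 50 < 25 for year in range(300)]      # 25-yr cool phases
la_nina  = [year % 5 in (0, 1) for year in range(300)]  # La Niña ~2 of 5 yrs

print(f"{joint_fraction(cool_pdo, la_nina):.0%} of years have both")  # → 20% of years have both
```

With the real reconstructed phase histories substituted for the toy series, the same function would yield the historical coincidence fraction cited above.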
“Years of Living Dangerously” justifies yelling fire and promoting climate fear based on the Westerling paper, which reports that all the models show rising CO2 will cause warmer and drier weather in some places (but wetter weather elsewhere). However, those models have failed horribly at replicating natural ocean cycles. Several researchers have shown that the warmth and drought predicted by CO2-driven models may have mistakenly modeled the climate effects of ocean cycles but blamed the results on CO2. One study from climate experts at Los Alamos National Laboratory explored the climate impacts of the PDO and Atlantic Multidecadal Oscillation on the American Southwest.
They concluded,
“The late twentieth century warming was about equally influenced by increasing concentration of atmospheric greenhouse gases (GHGs) and a positive phase of the AMO [Atlantic Multidecadal Oscillation].” “A strong warming and severe drought predicted on the basis of the ensemble mean of the CMIP climate models simulations is supported by our regression analysis only in a very unlikely case of the continually increasing AMO at a rate similar to its 1970–2010 increase”7
The 17-year hiatus in global warming is likewise being attributed to those same ocean cycles, and ardent advocates of CO2 climate change like Kevin Trenberth now admit, “The IPCC has not paid enough attention to natural variability, on several time scales, especially El Niños and La Niñas, the Pacific Ocean phenomena that are not yet captured by climate models, and the longer term Pacific Decadal Oscillation (PDO) and Atlantic Multidecadal Oscillation (AMO) which have cycle lengths of about 60 years.”
Is such distorted truth liberty or license? I suppose if no one is listening the question is moot.
Literature Cited
1. Keane, et al (2002) Cascading Effects of Fire Exclusion in Rocky Mountain Ecosystems: A Literature Review. USDA Forest Service RMRS-GTR-91.
2. Swetnam, T. W., and Baisan, C. H. (1996) Historical Fire Regime Patterns in the Southwestern United States Since AD 1700.
3. Westerling et al (2006) Warming and Earlier Spring Increase Western U.S. Forest Wildfire Activity. Science, vol. 313.
4. Schoennagel, T. (2005) ENSO and PDO Variability Affect Drought-Induced Fire Occurrence in Rocky Mountain Subalpine Forests. Ecological Applications, vol. 15, pp. 2000-2014.
5. McCabe, G., et al (2011) Influences of the El Niño Southern Oscillation and the Pacific Decadal Oscillation on the Timing of the North American Spring. International Journal of Climatology, doi:10.1002/joc.3400.
6. Romme (1989) Historical Perspective on the Yellowstone Fires of 1988. BioScience, vol. 39.
7. Chylek et al (2013) Imprint of the Atlantic Multi-decadal Oscillation and Pacific Decadal Oscillation on Southwestern US Climate: Past, Present, and Future. Climate Dynamics, DOI 10.1007/s00382-013-1933-3.
The IPCC says unmitigated climate change will cost 0.2-2% GDP/year in 2070.
The IPCC says climate policies in 2070 will cost more than 3.4%, and likely much more than that.
This is why climate mitigation makes no economic sense: the cure costs more than the disease.
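The economic claim here reduces to a single comparison of two numbers, both expressed as shares of GDP. A trivial sketch, using only the figures quoted above:

```python
# Compare the quoted cost of unmitigated warming against the quoted
# cost of mitigation, both as % of GDP in 2070 (figures as cited above).
damage_range = (0.2, 2.0)   # IPCC: cost of unmitigated climate change
policy_cost = 3.4           # IPCC: cost of climate policies ("likely much more")

cure_costs_more = policy_cost > max(damage_range)
print(cure_costs_more)  # → True
```

Since 3.4 exceeds even the upper damage estimate of 2.0, the comparison prints True: the cure, on these numbers, costs more than the disease.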
But, wait, “Skeptical Science” tank driver Dana Nuccitelli has an op-ed today in the Guardian where he claims the IPCC uses only a select range of measures: the 0.2-2% is expressed in “annual global economic losses”, while the other is expressed “as a slightly slowed global consumption growth”.
He only achieves that by cutting out the actual quote from the IPCC report, as you can see in the screen cap helpfully provided by Lomborg in his Twitter feed comparing the texts. Note the ellipsis:
And that’s why we label the Dana Nuccitelli/John Cook “skeptical science” enterprise in our blogroll as a category all their own, “Unreliable”.
Nuccitelli eliminated the full text of that section of the third IPCC report so he could bolster his headline claim “preventing global warming is the cheap option”.
Imagine the screaming if any climate skeptic did something like that in an MSM venue.
Meanwhile Lomborg in his op-ed points out what is really worth worrying about, and it isn’t the beloved global warming “crisis” of the Skeptical Science Kids.
We live in a world where one in six deaths are caused by easily curable infectious diseases; one in eight deaths stem from air pollution, mostly from cooking indoors with dung and twigs; and billions of people live in abject poverty, with no electricity and little food. We ought never to have entertained the notion that the world’s greatest challenge could be to reduce temperature rises in our generation by a fraction of a degree.
Lomborg makes more humanistic sense than Nuccitelli, and he doesn’t have to resort to lies of omission to get his point across.
The New York Times publishes pablum about the IPCC.
The international edition of today’s New York Times is entertaining if you examine pages eight and nine together.
On the right (page nine), there’s an ad for the newspaper, in which it claims to be “the world’s finest journalism” and urges people to purchase a digital subscription that will “ensure” access to “trusted global news coverage and insight.” On the left (page eight) the Times runs a single editorial. Editorials are the official voice of any newspaper.
The sub-headline that accompanies today’s editorial refers to the latest findings of the Intergovernmental Panel on Climate Change (IPCC):
In an ominous report, the world’s top scientists say a global energy revolution must begin within 15 years [bold added]
Three paragraphs down, we read that:
The I.P.C.C. is composed of thousands of the world’s leading climate scientists… [bold added]
Yes, a newspaper that thinks it’s producing the world’s finest journalism still hasn’t noticed that
The IPCC provides no proof whatsoever that it is composed of the world’s top scientists. In fact, it declines to make public the CVs of its personnel.
Certain IPCC lead authors and chapter leaders have historically been graduate students a decade or more away from earning their PhD (see here and here)
60% of the people who helped produce this latest report have never worked with the IPCC before (see the bottom of p. 3 of this PDF). Was there really a 60% turnover rate in the world’s top scientists since the last IPCC report appeared in 2007?
IPCC personnel have so little power that they aren’t able to alter their chapter title by a single word. In reality, these people are mere cogs in a large, bureaucratic UN machine.
Many IPCC personnel are not “scientists” in the way that term is normally understood. They are, instead, economists, geographers, policy wonks, UN employees, and activists.
The New York Times is demonstrably not offering what it claims to be offering: trustworthy news and insight.
Whoever wrote and approved today’s editorial is years out-of-date. There’s no meaty analysis here, just mindless parroting of the IPCC party line.
“A growing body of evidence suggests that the kind of extreme cold being experienced by much of the United States as we speak is a pattern that we can expect to see with increasing frequency as global warming continues….
We also know that this week’s cold spell is of a type there’s reason to believe may become more frequent in a world that’s getting warmer, on average, because of greenhouse-gas pollution.”
But is there any evidence that extreme cold winters are becoming more common, or, for that matter, more extreme?
First, let’s check the winter temperature trends for the contiguous United States (CONUS).
Clearly, on a national basis, recent winters have not been unusually cold. In the last 10 years, only three winters have been colder than the 1901-2000 mean. Moreover, no winter in recent years has come anywhere near to being as cold as some of the winters in the 1970s, for instance, or earlier.
But this graph only tells half the story. Because it covers the whole country, it can mask regional extremes. As we know, this winter has seen particularly cold weather in the Midwest and the East, but warmer conditions out West. The result is that, to some extent, they cancel each other out.
So, is there a way we can isolate the warm from the cold, and see whether cold winters are becoming more extreme in just parts of the country?
There is actually a very simple method, and that is to use NOAA’s own Climate Extremes Index. This provides the percentage of the country which has had extreme temperatures (or precipitation, drought, etc.) during the year. As above-average and below-average temperatures are shown separately, we can look at extreme cold weather on its own.
The graphs below cover the Winter months (Dec to Feb) only, with the first using mean monthly maximum temperatures, and the second minimums. The results seem pretty similar.
It is abundantly clear that much less of the country has been affected by extreme cold this winter, and indeed other recent ones, when compared with the 20th century. There is also no trend towards cold winters becoming more common.
What is also interesting is that there does not seem to be much of a trend towards milder winters taking over. Only the winter of 2011/12 stands out in this respect, and there have been plenty of similar years previously.
There has been nothing unusual or unprecedented about this winter. And, as cold winters have become less frequent in the last couple of decades, there is absolutely no evidence to support Holdren’s claim that “this week’s cold spell is of a type there’s reason to believe may become more frequent in a world that’s getting warmer”.
Technical Stuff
NOAA give this definition for the (maximum temperature) index:
The U.S. CEI is the arithmetic average of the following five or six# indicators of the percentage of the conterminous U.S. area:
The sum of (a) percentage of the United States with maximum temperatures much below normal and (b) percentage of the United States with maximum temperatures much above normal.
And their definition for “much above normal”:
In each case, we define much above (below) normal or extreme conditions as those falling in the upper (lower) tenth percentile of the local, period of record. In any given year, each of the five indicators has an expected value of 20%, in that 10% of all observed values should fall, in the long-term average, in each tenth percentile, and there are two such sets in each indicator
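To illustrate how an index of this kind behaves, here is a minimal sketch (not NOAA’s actual code) that computes a CEI-style maximum-temperature indicator from hypothetical gridded data, using the tenth-percentile thresholds described above. All data and array shapes here are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: winter-mean maximum temperature for 500 grid cells
# over a 100-year period of record (rows = years, columns = cells)
temps = rng.normal(loc=0.0, scale=2.0, size=(100, 500))

# Per-cell thresholds over the period of record
p10 = np.percentile(temps, 10, axis=0)  # "much below normal" cutoff
p90 = np.percentile(temps, 90, axis=0)  # "much above normal" cutoff

# Percentage of the area falling in each extreme category, for each year
pct_below = 100.0 * (temps < p10).mean(axis=1)
pct_above = 100.0 * (temps > p90).mean(axis=1)

# The maximum-temperature indicator is the sum of the two percentages;
# since 10% of values fall in each tail in the long run, its expected
# value is 10% + 10% = 20%
indicator = pct_below + pct_above
print(round(indicator.mean(), 1))
```

Because the thresholds are drawn from the period of record itself, the indicator averages close to its expected 20%; an individual year far above that marks geographically widespread extremes.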
The Climate Extremes Index can be accessed at the link below. It covers temperatures, drought, rainfall and hurricanes, and can be used on a seasonal or annual basis. There is also a regional section.
The message is clear – we are all going to die from swine flu. It spreads fast, it is dangerous, and it must be feared – says the World Health Organization.
But worry not – there is a way to save yourself. Just get a flu shot – and purchase a remedy for the deadly virus. Those are the instructions from the WHO.
However, the WHO may find itself coughing up explanations, as more and more scientists and health researchers, and even journalists, are starting to question the organization’s motives behind raising the alert so quickly.
According to the Danish daily Information, the WHO and pharmaceutical companies are suffering from the profit bug. Or, to put it simply, the chief health care organization in the world has teamed up with the drug makers to create a phantom monster – and to rake in cash by selling a remedy for it.
Plastered all over front pages and headline news, swine flu made its triumphant entrance into the limelight, heralded as the next “in” virus, one that threatened to bring an end to humanity as we know it.
Let’s stop right there and talk numbers for a little bit.
So far, more than 3.5 million people have been reported to be infected with swine flu worldwide. More than 9,000 deaths have been confirmed.
In comparison: every year, up to one billion people get infected with seasonal flu, with up to 500,000 deaths. These numbers come from the World Health Organization, but they never make headline news for some reason.
On June 11 of this year, the WHO declared swine flu a pandemic. But few know that, right before doing that, the Organization changed its definition, taking out the word “deadly” from it.
Aleksander Saversky, the chair of the Patients’ Rights Protection League, was one of those who did pay attention. He says it is clear that the WHO dramatized the situation around the H1N1 virus. In an interview with RT, Saversky speculated that this was due to the WHO’s close ties with the world’s major pharmaceutical companies.
And recently, Danish journalists conducted their own research, which resulted in accusations that the WHO, and scientists who appear to be independent are, in fact, on pharmaceutical companies’ payroll.
Saversky points out that the WHO declared a pandemic when only a few thousand people were infected – something he considers highly illogical, given that seasonal flu, with hundreds of thousands more cases, never receives such attention.
The virus was reported to be extremely deadly. Parallels were drawn to the Spanish Flu, which killed roughly 50 million people worldwide in the span of six months.
As panic spread, people rushed to clinics for Tamiflu – $145 a pop and by prescription only in the US – and for vaccinations, which range anywhere from $10 to $50. And despite the fact that many have lost their jobs in the financial crisis, and were left without health insurance, vaccinations and pharmaceutical sales skyrocketed. Nobody wants to die a grisly death from the supposedly new virus.
Aleksander Saversky warns the hullaballoo over swine flu is akin to the fable of “The Boy Who Cried Wolf.” He says that, because of this hype, the next time a truly dangerous virus comes about, no one will take any precautions. Fooled once already by swine flu, people will ignore the warnings and fall prey to a more dangerous – and deadly – virus.
In fact, vaccinating people against swine flu during the seasonal flu outbreak is, in Saversky’s opinion, criminal. People end up having to battle two viruses at the same time, which puts an enormous strain on the immune system.
Saversky puts the blame on capitalism – pharmaceutical companies make billions on people’s fears, combined with asymmetrical information dispersal (meaning that most people know very little substantial information about the virus, whereas the WHO, pharmaceutical companies and researchers know a lot more).
So, what’s to be done to conquer the virus – and stop the WHO?
Saversky says there is one solution – for governments worldwide to step in and take matters into their own hands, by controlling healthcare and pharmaceutical production.
Until that happens, however, remember to check for all common flu symptoms. And should a general disinclination to work of any kind be among them, rest assured – it is most probably a run-of-the-mill case of the Monday Blues.
Governments spent heavily on Tamiflu over the last decade after public health officials warned of deadly influenzas. But the billion-dollar investment produced healthy outcomes only for the balance sheets of the drug’s manufacturer, Roche.
This conclusion was reached by British researchers who said they could not substantiate claims by Roche and by GlaxoSmithKline (maker of the rival flu drug Relenza) that their products helped people fight off flu effects.
The British government—anticipating the potential death of 750,000 of its citizens in the event of a bird flu outbreak—spent more than $700 million stockpiling 40 million doses of Tamiflu, while the U.S. government forked out $1.3 billion on a massive antiviral reserve that includes the drug. Tamiflu is also listed by the World Health Organization as an “essential medicine.” Yet the researchers found few if any benefits from the two drugs and, in fact, discovered that they produce negative side effects (“psychiatric…renal…and metabolic adverse events”) which were previously dismissed or never acknowledged.
All the money spent by governments around the world on those stockpiles “have been thrown down the drain,” Carl Heneghan, a lead investigator of the study and a professor of evidence-based medicine at Britain’s Oxford University, told Reuters. This is because accurate data about the drugs has long been withheld from government regulators, the medical community and the public.
Five years ago, Tamiflu sales reached nearly $3 billion, in large part because of the H1N1 flu pandemic scare. The Cochrane Collaboration and the British Medical Journal fought for four years to gain access to Roche’s Tamiflu data. Once they succeeded, they conducted a joint analysis.
Roche officials dismissed the researchers’ findings, saying the drug firm “fundamentally disagrees with the overall conclusions” of the study.
“We firmly stand by the quality and integrity of our data, reflected in decisions reached by 100 regulators across the world and subsequent real-world evidence demonstrating that Tamiflu is an effective medicine in the treatment and prevention of influenza,” the company said in a prepared statement.
“Remember, the idea of a drug is that the benefits should exceed the harms,” noted Heneghan. “So if you can’t find any benefits, that accentuates the harms.”
“Why did no one else demand this level of scrutiny before spending such huge sums on one drug?” Journal editor Fiona Godlee told Reuters. “The whole story gives an extraordinary picture of the entrenched flaws in the current system of drug regulation and drug evaluation.”
By Mark Curtis | MintPress News | November 16, 2022
There is a myth the UK did not support Washington’s war against Vietnam in the 1960s and 1970s. In fact, Labour and Conservative governments backed every phase of US military escalation and played secret roles in the conflict, declassified files show.
UK sent SAS team to Vietnam in 1962, flew secret RAF missions to deliver arms, and provided intelligence to US
UK governments lied to parliament they were not providing military advice to South Vietnam’s brutal regime
Labour government secretly gave arms to US for use in Vietnam, stressing need for “no publicity”
It also connived with Washington to deceive UK public over its support for US
UK governments knew of atrocities against civilians but backed US war aims
Whitehall only started to advocate a peaceful solution, on US terms, once the war became unwinnable
During its war in Vietnam in the 1960s and 1970s the US dropped more bombs than in the whole of World War Two, in a conflict that killed over two million people. The wholesale destruction of villages and killing of innocent people was a permanent feature of the US war from the beginning, along with widespread indiscriminate bombing.
Britain’s role in the war has been largely buried and must be almost completely unknown to the public. When the UK media mentions the war now, reports often simply reference the refusal by Harold Wilson’s government to agree to US requests to openly deploy British troops.
Although this was certainly a public rebuff to Washington, Britain did virtually everything else to back the US war over more than a decade, the declassified documents show.
This site is provided as a research and reference tool. Although we make every reasonable effort to ensure that the information and data provided at this site are useful, accurate, and current, we cannot guarantee that the information and data provided here will be error-free. By using this site, you assume all responsibility for and risk arising from your use of and reliance upon the contents of this site.
This site and the information available through it do not, and are not intended to constitute legal advice. Should you require legal advice, you should consult your own attorney.
Nothing within this site or linked to by this site constitutes investment advice or medical advice.
Materials accessible from or added to this site by third parties, such as comments posted, are strictly the responsibility of the third party who added such materials or made them accessible and we neither endorse nor undertake to control, monitor, edit or assume responsibility for any such third-party material.
The posting of stories, commentaries, reports, documents and links (embedded or otherwise) on this site does not in any way, shape or form, implied or otherwise, necessarily express or suggest endorsement or support of any of such posted material or parts therein.
The word “alleged” is deemed to occur before the word “fraud,” since the rule of law still applies – to peasants, at least.
Fair Use
This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more info go to: http://www.law.cornell.edu/uscode/17/107.shtml. If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.
DMCA Contact
This is information for anyone who wishes to challenge our “fair use” of copyrighted material.
If you are a legal copyright holder or a designated agent for such and you believe that content residing on or accessible through our website infringes a copyright and falls outside the boundaries of “Fair Use”, please send a notice of infringement by contacting atheonews@gmail.com.
We will respond and take necessary action immediately.
If notice is given of an alleged copyright violation we will act expeditiously to remove or disable access to the material(s) in question.
All 3rd party material posted on this website is copyright the respective owners / authors. Aletho News makes no claim of copyright on such material.