Aletho News


Warming Improves Our Health – part 1

What’s Natural

By Jim Steele | Watts Up With That? | March 22, 2020

It’s deeply disturbing to hear people uncritically regurgitate media misinformation suggesting global warming threatens our health far worse than the COVID pandemic. Scientific evidence unequivocally shows colder weather is the major killer. As Figure 1 illustrates, the percentage of all deaths attributed to weather and temperature increases during the cold months. In contrast, mortality rates fall during warmer months. Researchers examining 74 million deaths across the globe from 1985-2012 found 7.3% were caused by temperatures cooler than the optimum, compared to just 0.4% attributed to temperatures above the optimum. Extreme temperature events, both hot and cold, accounted for only 0.9% of all deaths.

Likewise, a 2014 National Health Statistics Report found, “During 2006–2010, about 2,000 U.S. residents died each year from weather-related deaths. About 31% of these deaths were attributed to exposure to excessive natural heat, heat stroke, sun stroke; 63% were attributed to exposure to excessive natural cold, hypothermia, or both.” Similarly, according to the CDC, from 1979 to 1999 a total of 8,015 deaths in the United States were heat related, while 13,970 deaths were attributed to hypothermia. So why aren’t people listening to the science?
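The imbalance in those cited figures can be checked with a line or two of arithmetic. This sketch uses only the numbers quoted above; the variable names are my own, not anything from the CDC:

```python
# Rough sanity check of the mortality figures cited above.
cold_attributed = 0.073   # share of 74 million deaths attributed to cooler-than-optimum temperatures
heat_attributed = 0.004   # share attributed to warmer-than-optimum temperatures
print(f"Cold-to-heat ratio (global study): about {cold_attributed / heat_attributed:.0f} to 1")

cdc_heat_deaths = 8_015    # US heat-related deaths, 1979-1999
cdc_cold_deaths = 13_970   # US hypothermia deaths, 1979-1999
print(f"CDC cold-to-heat ratio: {cdc_cold_deaths / cdc_heat_deaths:.2f} to 1")
```

Both data sources point the same way: by these numbers, cold kills a multiple of what heat does.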


Global warming fear is based on speculation regarding what could happen in the future if global average temperatures rose 2°F to 4°F. But scary predictions are not scientific fact until their hypotheses are tested and verified. Without time machines we cannot directly test predicted outcomes for the years 2050 or 2100. But we can observe the effects of a similar temperature change.

In the United States people have steadily migrated away from the cold Northeast to the warmer Southwest. In the Southwest they are exposed to higher average temperatures, equal to or greater than what global warming predicts they would endure if they remained in the Northeast. The good news: scientists determined that “migration from the Northeast to the Southwest accounts for 4% to 7% of the total gains in life expectancy experienced by the U.S. population over the past thirty years.” We can infer a similar benefit from global warming. A complementary study determined that people who migrated to colder climates suffered “greater cardiovascular mortality” than people who remained in their native country.

Because the two major U.S. government agencies that track heat and cold deaths – NOAA and the CDC – differ sharply on which is the bigger killer, the public is rightfully confused. In contrast to the CDC results, NOAA argues heat is killing twice as many people as cold, but NOAA’s researchers have also been heavily invested in catastrophic global warming claims. By statistically adjusting the data via “seasonal detrending”, they remove the greater number of winter deaths from their analyses and focus only on extreme temperature deaths. They justify their adjustments by arguing that factors such as increased winter deaths due to flu season are not directly due to colder temperatures. But that obscures the health effects of temperature.

Colder temperatures reduce the effectiveness of our immune systems, which promotes influenza epidemics that may kill 34,000 to 60,000 people in a year. Because influenza season ends when temperatures warm, scientists are hoping warmer weather will similarly curtail the novel COVID-19 pandemic.

NOAA’s adjusted data focuses on deaths from heat waves and cold snaps. Indeed, there is a greater spike in deaths during heat waves, but research suggests heat waves have a small long-term effect due to a “harvesting effect”. The elderly and health-compromised are most vulnerable to extreme weather and epidemics. The “harvesting effect” describes an event during which vulnerable people who would likely have died over the following months instead die “prematurely” during an extreme event. Mortality rates then drop in the following months because the most vulnerable have already passed. Because mortality rates fall during the months following a heat wave, researchers have found there is no long-term population effect. In contrast, cold snaps do have long-term effects, as researchers found no such “harvesting effect” for them.

Although alarming models and media narratives suggest global warming causes more extreme heat waves, the scientific data disagree. As the EPA’s heat wave index illustrates (below), there has been no increase in heat waves; the worst heat waves occurred during the 1930s.

Fortunately, heat waves are short-lived and foreseeable. Weather forecasters detect approaching high-pressure systems that bring cloudless skies and increased solar heating. High-pressure systems inhibit the rising air currents that normally carry heat away, and they draw warm tropical air poleward on one flank while blocking cooler air from moving south. By forecasting extreme heat waves, scientists believe we can prevent most heat wave deaths. Urban heat-island effects make cities 2°F to 10°F warmer than the surrounding countryside, so urban dwellers should be most careful. And because elderly people who lack air conditioning are the most vulnerable and the least mobile, we can make sure they are moved out of harm’s way.


Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism

March 22, 2020 | Science and Pseudo-Science, Timeless or most popular

Hotter than the hottest thing ever

Climate Discussion Nexus | January 22, 2020

So 2019 was hotter than anything ever was hot, except 2016 which was itself the hottest thing ever. We’re all going to die! Unless we don’t because it wasn’t. As Anthony Watts observes, if you measure from the depths of the natural Little Ice Age you get an upward line. But if you take a longer perspective you get ups and downs, within which our era is not remarkable. Even worse, as Watts also shows on a graph, the most credible numbers from the United States, which has the best temperature measurements in the world, show 2019 as cooler than 2005… and 2006… and 2007, 2010, 2011, 2012, 2015, 2016, 2017 and 2018. But hey, who’s counting?

Alarmists frequently assert that they rely on science whereas “deniers” rely on oil money and slippery rhetoric. But in addition to the contradictions between reasonably complete American temperature records (that, among other things, show the number of really hot days falling over the past century) and very patchy records from most of the rest of the planet, Watts raises some very basic statistical issues that the Armageddon types do not seem eager to discuss.

For instance, Watts’ Jan. 15 post objects to suspect statistical selectivity in the findings. Particularly glaring is an inconsistent baseline for comparisons: NASA’s Goddard Institute for Space Studies (GISS) clings to the coolest available period (1951-80, though without wishing to discuss why there was a cooling from around 1920 even as the atmospheric CO2 that supposedly drives temperature increased), whereas the National Oceanic and Atmospheric Administration (NOAA), equally alarmist in its views, uses 1981-2000.
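It is worth being precise about what a baseline does and does not change. A temperature anomaly is just the reading minus the mean over the chosen reference period, so switching baselines shifts every anomaly by a constant but leaves the trend itself untouched; the choice mainly affects how the numbers look. A small sketch with invented annual means (not GISS or NOAA data) makes this concrete:

```python
# Invented annual mean temperatures in C, purely for illustration.
temps = [14.0, 14.1, 14.05, 14.2, 14.3, 14.25, 14.4]

def anomalies(series, base_slice):
    """Anomalies relative to the mean of the chosen baseline period."""
    base = sum(series[base_slice]) / len(series[base_slice])
    return [t - base for t in series]

cool_base = anomalies(temps, slice(0, 3))   # early (cooler) reference period
warm_base = anomalies(temps, slice(3, 7))   # later (warmer) reference period

# Every anomaly differs by the same constant offset between the two choices...
offsets = {round(c - w, 6) for c, w in zip(cool_base, warm_base)}
print(offsets)

# ...so the first-to-last change (and hence any trend) is identical.
print(abs((cool_base[-1] - cool_base[0]) - (warm_base[-1] - warm_base[0])) < 1e-9)
```

A cooler baseline makes every anomaly look bigger, which is exactly the optics complaint being made here, even though the slope is the same either way.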

His Jan. 17 post makes another point that deserves far more attention than it usually gets. He takes aim at “a press release session that featured NOAA and NASA GISS talking about how their climate data says that the world in 2019 was the second warmest ever, and the decade of 2010-2019 was the hottest ever (by a few hundredths of a degree).” But as every competent statistician knows, results can never be more accurate than inputs. And since nobody claims to be measuring temperature in hundredths of a degree outside a laboratory, there must be a lot of people within NOAA and NASA writhing in shame at this claim.

It gets worse. As we were told in high school math, and some of us even listened, if you measure two things to one decimal place and multiply them correctly, you may very well get a number with two decimal places. Thus 0.5 times 0.5 is 0.25. And that second decimal place yields an apparent increase in precision. But the precision is worse than apparent; it is deceptive, unless you know the two factors are exactly right. If I give you exactly half of a buck and a half, that is, exactly 0.5 times 1.5 dollars, I give you exactly 75 cents. But if the two factors are just estimates, if I try to split the leftover doughnut and a half from the meeting evenly between us, giving you about 0.5 times roughly 1.5, it is fatuous to say you got exactly 0.75 of a doughnut, which beats the measly 0.73 you had last week.

The right procedure in such cases is not to keep two decimal places, or even one. It is to round to a whole number, to accommodate the uncertainty that grows as you combine estimates. “I got most of a stale doughnut again” is the best way to characterize what happened.
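The doughnut arithmetic can be made concrete with a few lines of Python. This is a minimal sketch using crude interval bounds (and assuming both factors are positive, so the interval endpoints are the extremes):

```python
def product_interval(a, b, half_width=0.05):
    """Range of a*b when each positive factor is only known to within +/- half_width,
    i.e. known to one decimal place."""
    lo = (a - half_width) * (b - half_width)
    hi = (a + half_width) * (b + half_width)
    return lo, hi

lo, hi = product_interval(0.5, 1.5)
print(f"0.5 x 1.5 could be anywhere from {lo:.4f} to {hi:.4f}")
# The spread is roughly 0.2, so a second decimal place in "0.75" is meaningless;
# "about three quarters" is all the inputs can support.
```

Quoting 0.75 versus last week's 0.73 compares two numbers whose honest uncertainty is bigger than their difference, which is the whole point of the doughnut story.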

Such spurious precision is a chronic feature of climate science, as of a great many things in the modern world. Thus David Middleton mocks a publication called The Anthropocene for asserting that deaths will increase due to climate change, including “an additional 1,603 deaths from injuries each year in the United States”; as Middleton rightly asks, “Are they sure it’s not 1,602 or 1,604?” And since the actual piece said “Global warming of 1.5 °C could result in an additional 1,603 deaths from injuries each year in the United States, an international team of researchers reported yesterday in the journal Nature Medicine”, there’s an Ossa of estimated temperature rise beneath a Pelion of “could result” medical modeling that ought to have shamed the authors into saying “about 1,500”.

When it comes to global temperature, no sane person would ever claim to have measured the temperature anywhere outside a laboratory to within a few hundredths of a degree. So there is no possible way that we know the temperature of the entire Earth, most of which has no temperature stations at all, to within even a few tenths of a degree, let alone a few hundredths.

Putting all this legerdemain together, if that press release that galloped around the world while the statistics were pulling on their boots was not a lie then, to borrow a phrase from Damon Runyon’s Guys and Dolls, it will do until a lie comes along.

January 22, 2020 | Deception, Science and Pseudo-Science

Government Scientists : No Data – But Tremendous Precision

By Tony Heller | The Deplorable Climate Science Blog | September 13, 2017

NOAA has no daily temperature data from Central or South America, or most of Canada, for the year 1900, but they claim to know the temperature in those regions very precisely. The same story holds for the rest of the world.

Despite not having any data, all government agencies agree very precisely about the global temperature.

Climate Change: Vital Signs of the Planet: Scientific Consensus

The claimed global temperature record is completely fake. There is no science behind it. It is the product of criminals, not scientists. This is the biggest scam in history.

September 14, 2017 | Deception, Fake News, Mainstream Media, Warmongering, Science and Pseudo-Science, Timeless or most popular

Exposed: How world leaders were duped into investing billions over manipulated global warming data

By David Rose | The Mail on Sunday | February 4, 2017

The Mail on Sunday today reveals astonishing evidence that the organisation that is the world’s leading source of climate data rushed to publish a landmark paper that exaggerated global warming and was timed to influence the historic Paris Agreement on climate change.

A high-level whistleblower has told this newspaper that America’s National Oceanic and Atmospheric Administration (NOAA) breached its own rules on scientific integrity when it published the sensational but flawed report, aimed at making the maximum possible impact on world leaders including Barack Obama and David Cameron at the UN climate conference in Paris in 2015.

The report claimed that the ‘pause’ or ‘slowdown’ in global warming in the period since 1998 – revealed by UN scientists in 2013 – never existed, and that world temperatures had been rising faster than scientists expected. Launched by NOAA with a public relations fanfare, it was splashed across the world’s media, and cited repeatedly by politicians and policy makers.

But the whistleblower, Dr John Bates, a top NOAA scientist with an impeccable reputation, has shown The Mail on Sunday irrefutable evidence that the paper was based on misleading, ‘unverified’ data.

It was never subjected to NOAA’s rigorous internal evaluation process – which Dr Bates devised.

His vehement objections to the publication of the faulty data were overridden by his NOAA superiors in what he describes as a ‘blatant attempt to intensify the impact’ of what became known as the Pausebuster paper.

His disclosures are likely to stiffen President Trump’s determination to enact his pledges to reverse his predecessor’s ‘green’ policies, and to withdraw from the Paris deal – so triggering an intense political row.

In an exclusive interview, Dr Bates accused the lead author of the paper, Thomas Karl, who was until last year director of the NOAA section that produces climate data – the National Centers for Environmental Information (NCEI) – of ‘insisting on decisions and scientific choices that maximised warming and minimised documentation… in an effort to discredit the notion of a global warming pause, rushed so that he could time publication to influence national and international deliberations on climate policy’.
Dr Bates was one of two Principal Scientists at NCEI, based in Asheville, North Carolina.

Official delegations from America, Britain and the EU were strongly influenced by the flawed NOAA study as they hammered out the Paris Agreement – and committed advanced nations to sweeping reductions in their use of fossil fuel and to spending £80 billion every year on new, climate-related aid projects.

The scandal has disturbing echoes of the ‘Climategate’ affair which broke shortly before the UN climate summit in 2009, when the leak of thousands of emails between climate scientists suggested they had manipulated and hidden data. Some were British experts at the influential Climatic Research Unit at the University of East Anglia.

NOAA’s 2015 ‘Pausebuster’ paper was based on two new temperature sets of data – one containing measurements of temperatures at the planet’s surface on land, the other at the surface of the seas.

Both datasets were flawed. This newspaper has learnt that NOAA has now decided that the sea dataset will have to be replaced and substantially revised just 18 months after it was issued, because it used unreliable methods which overstated the speed of warming. The revised data will show both lower temperatures and a slower rate in the recent warming trend.

The land temperature dataset used by the study was afflicted by devastating bugs in its software that rendered its findings ‘unstable’.

The paper relied on a preliminary, ‘alpha’ version of the data which was never approved or verified.

A final, approved version has still not been issued. None of the data on which the paper was based was properly ‘archived’ – a mandatory requirement meant to ensure that raw data and the software used to process it is accessible to other scientists, so they can verify NOAA results.

Dr Bates retired from NOAA at the end of last year after a 40-year career in meteorology and climate science. As recently as 2014, the Obama administration awarded him a special gold medal for his work in setting new, supposedly binding standards ‘to produce and preserve climate data records’.

Yet when it came to the paper timed to influence the Paris conference, Dr Bates said, these standards were flagrantly ignored.

The paper was published in June 2015 by the journal Science. Entitled ‘Possible artifacts of data biases in the recent global surface warming hiatus’, the document said the widely reported ‘pause’ or ‘slowdown’ was a myth.

Less than two years earlier, a blockbuster report from the UN Intergovernmental Panel on Climate Change (IPCC), which drew on the work of hundreds of scientists around the world, had found ‘a much smaller increasing trend over the past 15 years 1998-2012 than over the past 30 to 60 years’. Explaining the pause became a key issue for climate science. It was seized on by global warming sceptics, because the level of CO2 in the atmosphere had continued to rise.

Some scientists argued that the existence of the pause meant the world’s climate is less sensitive to greenhouse gases than previously thought, so that future warming would be slower. One of them, Professor Judith Curry, then head of climate science at the Georgia Institute of Technology, said it suggested that computer models used to project future warming were ‘running too hot’.

However, the Pausebuster paper said while the rate of global warming from 1950 to 1999 was 0.113C per decade, the rate from 2000 to 2014 was actually higher, at 0.116C per decade. The IPCC’s claim about the pause, it concluded, ‘was no longer valid’.
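A rate like “0.113C per decade” is, in practice, the slope of an ordinary least-squares fit through annual temperatures, multiplied by ten. The sketch below shows that computation on synthetic data constructed to have exactly the paper's later rate; it is illustrative only, not NOAA's series or code:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(2000, 2015))                        # 2000-2014, as in the paper
temps = [14.40 + 0.0116 * (y - 2000) for y in years]   # built to rise 0.116C/decade
print(f"Trend: {ols_slope(years, temps) * 10:.3f} C per decade")
```

Note how small the disputed difference is: 0.113 versus 0.116 per decade is three thousandths of a degree per year, far inside any plausible measurement uncertainty.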

The impact was huge and lasting. On publication day, the BBC said the pause in global warming was ‘an illusion caused by inaccurate data’.

One American magazine described the paper as a ‘science bomb’ dropped on sceptics.

Its impact could be seen in this newspaper last month when, writing to launch his Ladybird book about climate change, Prince Charles stated baldly: ‘There isn’t a pause… it is hard to reject the facts on the basis of the evidence.’

Data changed to make the sea appear warmer

The sea dataset used by Thomas Karl and his colleagues – known as Extended Reconstructed Sea Surface Temperatures version 4, or ERSSTv4 – tripled the warming trend over the sea during the years 2000 to 2014, from just 0.036C per decade, as stated in version 3, to 0.099C per decade. Individual measurements in some parts of the globe had increased by about 0.1C, and this resulted in the dramatic increase of the overall global trend published by the Pausebuster paper.

But Dr Bates said this increase in temperatures was achieved by dubious means. Its key error was an upwards ‘adjustment’ of readings from fixed and floating buoys, which are generally reliable, to bring them into line with readings from a much more doubtful source – water taken in by ships. This, Dr Bates explained, has long been known to be questionable: ships are themselves sources of heat, readings will vary from ship to ship, and the depth of water intake will vary according to how heavily a ship is laden – so affecting temperature readings.

Dr Bates said: ‘They had good data from buoys. And they threw it out and “corrected” it by using the bad data from ships. You never change good data to agree with bad, but that’s what they did – so as to make it look as if the sea was warmer.’

ERSSTv4 ‘adjusted’ buoy readings up by 0.12C. It also ignored data from satellites that measure the temperature of the lower atmosphere, which are also considered reliable. Dr Bates said he gave the paper’s co-authors ‘a hard time’ about this, ‘and they never really justified what they were doing.’
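One way a flat +0.12C offset can change a trend, rather than just shifting a level, is if the adjusted instruments make up a growing share of the observations over time (buoys are often noted to have done exactly that). The toy sketch below illustrates the mechanism; every number in it is invented for illustration, and it is not a reconstruction of NOAA's method:

```python
ADJUST = 0.12   # C added to buoy readings, per the article
TRUE_SST = 15.0 # suppose the real sea-surface temperature never changes

blended = []
for i in range(15):                     # fifteen years, e.g. 2000-2014
    buoy_share = 0.1 + 0.05 * i         # buoys grow from 10% to 80% of observations
    ship = TRUE_SST                     # ship reading
    buoy = TRUE_SST + ADJUST            # buoy reading after the upward adjustment
    blended.append(buoy_share * buoy + (1 - buoy_share) * ship)

print(f"Apparent warming with zero real change: {blended[-1] - blended[0]:.3f} C")
```

A constant offset applied to a growing fraction of the data shows up as a trend in the blended average, which is why the direction of the buoy adjustment matters so much.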

Now, some of those same authors have produced the pending, revised new version of the sea dataset – ERSSTv5. A draft of a document that explains the methods used to generate version 5, and which has been seen by this newspaper, indicates the new version will reverse the flaws in version 4, changing the buoy adjustments and including some satellite data and measurements from a special high-tech floating buoy network known as Argo. As a result, it is certain to show reductions in both absolute temperatures and recent global warming.

The second dataset used by the Pausebuster paper was a new version of NOAA’s land records, known as the Global Historical Climatology Network (GHCN), an analysis over time of temperature readings from about 4,000 weather stations spread across the globe.

This new version found past temperatures had been cooler than previously thought, and recent ones higher – so that the warming trend looked steeper. For the period 2000 to 2014, the paper increased the rate of warming on land from 0.15C to 0.164C per decade.

In the weeks after the Pausebuster paper was published, Dr Bates conducted a one-man investigation into this. His findings were extraordinary. Not only had Mr Karl and his colleagues failed to follow any of the formal procedures required to approve and archive their data, they had used a ‘highly experimental early run’ of a programme that tried to combine two previously separate sets of records.

This had undergone the critical process known as ‘pairwise homogeneity adjustment’, a method of spotting ‘rogue’ readings from individual weather stations by comparing them with others nearby.

However, this process requires extensive, careful checking which was only just beginning, so that the data was not ready for operational use. Now, more than two years after the Pausebuster paper was submitted to Science, the new version of GHCN is still undergoing testing.
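The core idea behind pairwise homogeneity checking – comparing a station against its neighbours to spot rogue readings – can be sketched in a few lines. This is a drastic simplification for illustration only; the real GHCN algorithm is far more elaborate, and the data here are invented:

```python
def flag_rogue(station, neighbours, threshold=1.0):
    """Return indices where a station departs from the neighbour median
    by more than `threshold` degrees."""
    flagged = []
    for i, value in enumerate(station):
        med = sorted(n[i] for n in neighbours)[len(neighbours) // 2]
        if abs(value - med) > threshold:
            flagged.append(i)
    return flagged

station    = [10.1, 10.3, 13.9, 10.2]          # third reading looks suspect
neighbours = [[10.0, 10.2, 10.1, 10.3],
              [10.2, 10.4, 10.3, 10.1],
              [9.9, 10.1, 10.0, 10.2]]
print(flag_rogue(station, neighbours))
```

Even this toy version hints at why the checking is delicate: the thresholds, the choice of neighbours, and what is done with flagged values all change the resulting record.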

Moreover, the GHCN software was afflicted by serious bugs. They caused it to become so ‘unstable’ that every time the raw temperature readings were run through the computer, it gave different results. The new, bug-free version of GHCN has still not been approved and issued. It is, Dr Bates said, ‘significantly different’ from that used by Mr Karl and his co-authors.

Dr Bates revealed that the failure to archive and make available fully documented data not only violated NOAA rules, but also those set down by Science. Before he retired last year, he continued to raise the issue internally. Then came the final bombshell. Dr Bates said: ‘I learned that the computer used to process the software had suffered a complete failure.’

The reason for the failure is unknown, but it means the Pausebuster paper can never be replicated or verified by other scientists.

The flawed conclusions of the Pausebuster paper were widely discussed by delegates at the Paris climate change conference. Mr Karl had a longstanding relationship with President Obama’s chief science adviser, John Holdren, giving him a hotline to the White House.

Mr Holdren was also a strong advocate of robust measures to curb emissions. Britain’s then Prime Minister David Cameron claimed at the conference that ‘97 per cent of scientists say climate change is urgent and man-made and must be addressed’ and called for ‘a binding legal mechanism’ to ensure the world got no more than 2C warmer than in pre-industrial times.

President Obama stressed his Clean Power Plan at the conference, which mandates American power stations to make big emissions cuts.

President Trump has since pledged he will scrap it, and to withdraw from the Paris Agreement.

Whatever takes its place, said Dr Bates, ‘there needs to be a fundamental change to the way NOAA deals with data so that people can check and validate scientific results. I’m hoping that this will be a wake-up call to the climate science community – a signal that we have to put in place processes to make sure this kind of crap doesn’t happen again.

‘I want to address the systemic problems. I don’t care whether modifications to the datasets make temperatures go up or down. But I want the observations to speak for themselves, and for that, there needs to be a new emphasis that ethical standards must be maintained.’

He said he decided to speak out after seeing reports in papers including the Washington Post and Forbes magazine claiming that scientists feared the Trump administration would fail to maintain and preserve NOAA’s climate records.

Dr Bates said: ‘How ironic it is that there is now this idea that Trump is going to trash climate data, when key decisions were earlier taken by someone whose responsibility it was to maintain its integrity – and failed.’

NOAA not only failed, but it effectively mounted a cover-up when challenged over its data. After the paper was published, the US House of Representatives Science Committee launched an inquiry into its Pausebuster claims. NOAA refused to comply with subpoenas demanding internal emails from the committee chairman, the Texas Republican Lamar Smith, and falsely claimed that no one had raised concerns about the paper internally.

Last night Mr Smith thanked Dr Bates ‘for courageously stepping forward to tell the truth about NOAA’s senior officials playing fast and loose with the data in order to meet a politically predetermined conclusion’. He added: ‘The Karl study used flawed data, was rushed to publication in an effort to support the President’s climate change agenda, and ignored NOAA’s own standards for scientific study.’

Professor Curry, now the president of the Climate Forecast Applications Network, said last night: ‘Large adjustments to the raw data, and substantial changes in successive dataset versions, imply substantial uncertainties.’

It was time, she said, that politicians and policymakers took these uncertainties on board.

Last night Mr Karl admitted the data had not been archived when the paper was published. Asked why he had not waited, he said: ‘John Bates is talking about a formal process that takes a long time.’ He denied he was rushing to get the paper out in time for Paris, saying: ‘There was no discussion about Paris.’

They played fast and loose with the figures

He also admitted that the final, approved and ‘operational’ edition of the GHCN land data would be ‘different’ from that used in the paper.

As for the ERSSTv4 sea dataset, he claimed it was other records – such as the UK Met Office’s – which were wrong, because they understated global warming and were ‘biased too low’. Jeremy Berg, Science’s editor-in-chief, said: ‘Dr Bates raises some serious concerns. After the results of any appropriate investigations… we will consider our options.’ He said that ‘could include retracting that paper’. NOAA declined to comment.

It’s not the first time we’ve exposed dodgy climate data, which is why we’ve dubbed it: Climate Gate 2

Dr John Bates’s disclosures about the manipulation of data behind the ‘Pausebuster’ paper are the biggest scientific scandal since ‘Climategate’ in 2009 when, as this paper reported, thousands of leaked emails revealed scientists were trying to block access to data, and using a ‘trick’ to conceal embarrassing flaws in their claims about global warming.

Both scandals suggest a lack of transparency and, according to Dr Bates, a failure to observe proper ethical standards.

Because of NOAA’s failure to ‘archive’ data used in the paper, its results can never be verified.

Like Climategate, this scandal is likely to reverberate around the world, and reignite some of science’s most hotly contested debates.

Has there been an unexpected pause in global warming? If so, is the world less sensitive to carbon dioxide than climate computer models suggest?

And does this mean that truly dangerous global warming is less imminent, and that politicians’ repeated calls for immediate ‘urgent action’ to curb emissions are exaggerated?

Judith Curry has also blogged on the same story.

February 5, 2017 | Corruption, Deception, Science and Pseudo-Science, Timeless or most popular

NOAA’s Tornado Fraud

By Paul Homewood | Not A Lot Of People Know That | January 15, 2017

According to NOAA, the number of tornadoes has been steadily growing since the 1950s, despite a drop in numbers in the last five years.

They show the above chart prominently in their Tornadoes – Annual 2016 Report.

However, they know full well that it is meaningless to compare current data with the past, as they explain themselves in the section Historical Records and Trends, which is hidden away on their own website:

One of the main difficulties with tornado records is that a tornado, or evidence of a tornado must have been observed. Unlike rainfall or temperature, which may be measured by a fixed instrument, tornadoes are short-lived and very unpredictable. If a tornado occurs in a place with few or no people, it is not likely to be documented. Many significant tornadoes may not make it into the historical record since Tornado Alley was very sparsely populated during the 20th century.

Much early work on tornado climatology in the United States was done by John Park Finley in his book Tornadoes, published in 1887. While some of Finley’s safety guidelines have since been refuted as dangerous practices, the book remains a seminal work in tornado research. The University of Oklahoma created a PDF copy of the book and made it accessible at John Finley’s Tornadoes (link is external).

Today, nearly all of the United States is reasonably well populated, or at least covered by NOAA’s Doppler weather radars. Even if a tornado is not actually observed, modern damage assessments by National Weather Service personnel can discern if a tornado caused the damage, and if so, how strong the tornado may have been. This disparity between tornado records of the past and current records contributes a great deal of uncertainty regarding questions about the long-term behavior or patterns of tornado occurrence. Improved tornado observation practices have led to an increase in the number of reported weaker tornadoes, and in recent years EF-0 tornadoes have become more prevalent in the total number of reported tornadoes. In addition, even today many smaller tornadoes still may go undocumented in places with low populations or inconsistent communication facilities.

With increased National Doppler radar coverage, increasing population, and greater attention to tornado reporting, there has been an increase in the number of tornado reports over the past several decades. This can create a misleading appearance of an increasing trend in tornado frequency. To better understand the variability and trend in tornado frequency in the United States, the total number of EF-1 and stronger, as well as strong to violent tornadoes (EF-3 to EF-5 category on the Enhanced Fujita scale) can be analyzed. These tornadoes would have likely been reported even during the decades before Doppler radar use became widespread and practices resulted in increasing tornado reports. The bar charts below indicate there has been little trend in the frequency of the stronger tornadoes over the past 55 years.
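The filtering NOAA describes – restricting the count to EF-1 and stronger tornadoes, the ones likely to have been reported in any era – is simple to express in code. The records below are invented placeholders, not SPC data:

```python
records = [
    (1960, 0), (1960, 2), (1961, 1), (1961, 0),
    (1962, 3), (1962, 1), (1963, 0), (1963, 4),
]  # (year, EF rating), made up for illustration

def counts_by_year(recs, min_rating=1):
    """Count tornadoes per year at or above a minimum EF rating."""
    out = {}
    for year, ef in recs:
        if ef >= min_rating:
            out[year] = out.get(year, 0) + 1
    return out

print("EF-1 and stronger:", counts_by_year(records))
print("All ratings:      ", counts_by_year(records, min_rating=0))
```

The point of the filter is that a rising count of EF-0 reports (an artifact of better detection) drops out, leaving only the tornadoes strong enough to have been noticed in the 1950s as well as today.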



Of course it is nonsensical to claim that “the bar charts below indicate there has been little trend in the frequency of the stronger tornadoes over the past 55 years” – there has clearly been a large reduction.

Note as well that they have not even bothered to update the graph for 2015. Could it be they would rather the public did not find out the truth?

Meanwhile, over at the Storm Prediction Center (SPC) you can see that, when allowance is made for changing reporting procedures, last year may well have had the lowest number of tornadoes on record.


The SPC is also part of NOAA, but is the department that actually deals with tornado events and data on a day to day basis. As such, they tend to be more interested in the facts, rather than a political agenda.

While we still await the final numbers and classification for last year, what we do know is that there was no EF-5. Indeed, the last occurrence was the Moore, OK tornado in May 2013.

It is unusual to go nearly four years without one, as there have been 59 since 1953, effectively one a year on average.

The bottom line is that the NOAA headline graph is grossly dishonest. Indeed, if a company published something like that in their Annual Accounts, they would probably end up in jail!

NOAA themselves know all of this full well.

Which raises the question – why are they perpetuating this fraud?

January 16, 2017 | Deception, Fake News, Science and Pseudo-Science

100% Of US Warming Is Due To NOAA Data Tampering

By Tony Heller | The Deplorable Climate Science Blog | December 28, 2016

Climate Central just ran this piece, which the Washington Post picked up on. They claimed the US was “overwhelmingly hot” in 2016, and that temperatures have risen 1.5°F since the 19th century.

The U.S. Has Been Overwhelmingly Hot This Year | Climate Central

The first problem with their analysis is that the US had very little hot weather in 2016. The percentage of hot days was below average, and ranked 80th since 1895. Only 4.4% of days were over 95°F, compared with the long term average of 4.9%. Climate Central is conflating mild temperatures with hot ones.

They also claim US temperatures rose 1.5°F since the 19th century, which is what NOAA shows.

Climate at a Glance | National Centers for Environmental Information (NCEI)

The problem with the NOAA graph is that it is fake data. NOAA creates the warming trend by altering the data. The NOAA raw data shows no warming over the past century.

The adjustments being made are almost exactly 1.5°F, which is the claimed warming in the article.

The adjustments correlate almost perfectly with atmospheric CO2. NOAA is adjusting the data to match global warming theory. This is known as PBEM (Policy-Based Evidence Making).

The hockey stick of adjustments since 1970 is due almost entirely to NOAA fabricating missing station data. In 2016, more than 42% of their monthly station data was missing, so they simply made it up. This is easy to identify because they mark fabricated temperatures with an “E” in their database.

When presented with my claims of fraud, NOAA typically tries to wave them away with these two complaints.

  1. They use gridded data and I am using un-gridded data.
  2. They “have to” adjust the data because of Time Of Observation Bias and station moves.

Both claims are easily debunked. The only effect that gridding has is to lower temperatures slightly. The trend of gridded data is almost identical to the trend of un-gridded data.
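For readers unfamiliar with gridding, here is a toy numpy sketch of what it does. The station locations and anomaly values are entirely synthetic stand-ins, not NOAA's actual data or code; the point is only to show how area-weighting via grid cells differs from a plain station average:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical station set: 200 stations, unevenly clustered in longitude,
# each reporting a temperature anomaly (degrees F). Purely synthetic data.
lon = np.concatenate([rng.uniform(-90, -80, 150),   # dense cluster
                      rng.uniform(-120, -90, 50)])  # sparse region
anom = np.where(lon > -90, 1.0, 0.2) + rng.normal(0, 0.1, 200)

# Un-gridded: plain average over all stations (over-weights the dense cluster).
ungridded = anom.mean()

# Gridded: average stations within each 5-degree cell, then average the cells,
# so dense and sparse regions each get equal weight per unit area.
cells = np.floor(lon / 5.0)
gridded = np.mean([anom[cells == c].mean() for c in np.unique(cells)])

print(ungridded, gridded)  # gridded is pulled toward the sparse, cooler region
```

In this toy case gridding lowers the average, consistent with the claim above that its only effect is a slight lowering of temperatures; the trend of a gridded series can still track the un-gridded one closely.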

Time of Observation Bias (TOBS) is a real problem, but a very small one. TOBS is based on the idea that if you reset a min/max thermometer too close to the afternoon maximum, you will double count warm temperatures (and vice versa if the thermometer is reset in the morning). Their claim is that during the hot 1930s most stations reset their thermometers in the afternoon.

This is easy to test by using only the stations which did not reset their thermometers in the afternoon during the 1930’s. The pattern is almost identical to that of all stations. No warming over the past century. Note that the graph below tends to show too much warming due to morning TOBS.

NOAA’s own documents show that the TOBS adjustment is small (0.3°F) and goes flat after 1990.

Gavin Schmidt at NASA explains very clearly why the US temperature record does not need to be adjusted.

You could throw out 50 percent of the station data or more, and you’d get basically the same answers.

One recent innovation is the set up of a climate reference network alongside the current stations so that they can look for potentially serious issues at the large scale – and they haven’t found any yet.

NASA – NASA Climatologist Gavin Schmidt Discusses the Surface Temperature Record

NOAA has always known that the US is not warming.

U.S. Data Since 1895 Fail To Show Warming Trend

All of the claims in the Climate Central article are bogus. The US is not warming and 2016 was not a hot year in the US. It was a very mild year.

December 29, 2016 | Deception, Mainstream Media, Warmongering, Science and Pseudo-Science, Timeless or most popular

Senator Markey’s Climate Education Act Goes The Wrong Way

By David Wojick | Climate Etc. | September 5, 2016

The “Climate Change Education Act” (S.3074) directs the National Oceanic and Atmospheric Administration (NOAA) to establish a climate change education program focused on formal and informal learning for all age levels.

When it comes to beating the climate change drum, Sen. Ed Markey is the Energizer Bunny. As a Congressman, Rep. Markey was Chairman of the now defunct House Select Committee on Energy Independence and Global Warming from 2007 to 2011. This time he is drumming on the education front. Markey has dropped the “Climate Education Act” into the Senate hopper. While the bill is unlikely to pass at this time, it is still important to object to, lest it be seen to be acceptable.

Sen. Markey’s website summarizes the proposal as follows: “The “Climate Change Education Act” (S.3074) directs the National Oceanic and Atmospheric Administration (NOAA) to establish a climate change education program focused on formal and informal learning for all age levels. The program would explore solutions to climate change, the dangers we face in a warming world, and relatively small changes in daily routines that can have a profound global impact. The legislation also establishes a grant program to support public outreach programs that improve access to clean energy jobs and research funds so local communities can address climate mitigation and adaptation issues.”

There is a lot not to like here, beginning with the false scientific claims. The first is hyping the supposed dangers we face in a warming world, which simply do not exist. Nor are there small changes in daily routines that can have a profound global impact, because humans do not control the global climate. What is here being called Education is really just scaremongering and propaganda. Ironically, the Bill itself says one goal is to remove the fear of climate change, which it actually promotes.

What is really strange is the focus on so-called clean energy jobs and technology. The term “clean energy” is a misleading euphemism for renewable technologies. Thus the thrust of the Bill is not just on climate science education; rather it is on using the education system to promote renewables. NOAA has no expertise in this regard and no mission. They do things like running the National Weather Service. Promoting renewables and green workforce development is the Energy Department’s job.

On the science side, NOAA has long been active in so-called “climate education,” which basically means spreading the Government’s biased view of climate change as human driven and dangerous. For example, the Climate Literacy and Energy Awareness Network (CLEAN) Portal was launched in 2010, co-sponsored by NOAA, NSF and the Energy Department. As of 2012, CLEAN has been syndicated to NOAA’s portal, where they offer over 600 educational materials, most of which are biased toward the scary Federal version of climate science.

In fact NOAA has led a Federal drive to redefine “climate literacy” as accepting the Government’s biased position. According to their website, the stated Guiding Principle for climate literacy is “Humans can take actions to reduce climate change and its impacts.” The reality is that humans can do little to change climate change and a little global warming is not harmful. It is probably beneficial.

What the proposed Climate Education Act would do is give statutory authority for NOAA’s existing propaganda actions, something that is presently lacking. It also allows the agency to bribe states to use its stuff, which is pretty insidious.

It would also allow NOAA to go beyond simply providing online information, to begin writing actual curriculums to be used in the classroom. That is where the bribery really comes in. This curricular push coincides with the widespread deployment of the Next Generation Science Standards. Most states that adopt them need to develop new curriculums, because these science standards are very different from the existing state standards, especially in the area of climate change.

Beyond this, the Bill would put NOAA into the strange new business of promoting the renewable energy industry and training its workers. The Energy Department already does this, while NOAA has neither the mission nor the organization to do it.

In summary this so-called Climate Education Act does nothing that is good, for the climate or the students. It is based on false science and pushes NOAA in the wrong direction. NOAA should be trying to understand climate change, not promote renewable technologies in the name of dangerous global warming.

Press coverage is bad, buying the Bill as expected. See, for example, these:

September 5, 2016 | Deception, Full Spectrum Dominance, Science and Pseudo-Science

Claim: ‘With 2015, Earth Has Back-to-Back Hottest Years Ever Recorded’

MIT Climate Scientist Mocks ‘Hottest Year’ Claim: ‘Anyone who starts crowing about those numbers shows that they’re putting spin on nothing’

Climate Depot | January 20, 2016

NASA and NOAA today proclaimed that 2015 was the ‘hottest year’ on record.

Meanwhile, satellite data shows an 18 plus year standstill in global temperatures.

MIT climate scientist Dr. Richard Lindzen balked at claims of the ‘hottest year’ based on ground based temperature data.

“Frankly, I feel it is proof of dishonesty to argue about things like small fluctuations in temperature or the sign of a trend.  Why lend credibility to this dishonesty?” Lindzen, an emeritus Alfred P. Sloan Professor of Meteorology at the Department of Earth, Atmospheric and Planetary Sciences at MIT, told Climate Depot shortly after the announcements.

“All that matters is that for almost 40 years, model projections have almost all exceeded observations. Even if all the observed warming were due to greenhouse emissions, it would still point to low sensitivity,” Lindzen continued.

“But, given the ‘pause,’ we know that natural internal variability has to be of the same order as any other process,” Lindzen wrote.

Lindzen has previously mocked ‘warmest’ or ‘hottest’ year proclamations.

“When someone says this is the warmest temperature on record. What are they talking about? It’s just nonsense. This is a very tiny change period,” Lindzen said in November 2015.

Lindzen cautioned: “The most important thing to keep in mind is – when you ask ‘is it warming, is it cooling’, etc.  — is that we are talking about something tiny (temperature changes) and that is the crucial point.”

“And the proof that the uncertainty is tenths of a degree are the adjustments that are being made. If you can adjust temperatures to 2/10ths of a degree, it means it wasn’t certain to 2/10ths of a degree,” he added.

“70% of the earth is oceans, we can’t measure those temperatures very well. They can be off a half a degree, a quarter of a degree. Even two-10ths of a degree of change would be tiny but two-100ths is ludicrous. Anyone who starts crowing about those numbers shows that they’re putting spin on nothing.”


Satellites: No global warming at all for 18 years 8 months

January 20, 2016 | Science and Pseudo-Science

The Global Warming Pause Explained

December 21, 2015

Fact: the RSS global mean temperature anomaly dataset shows a least-squares linear regression trend of 0.0°C from February 1997 to October 2015. But what does this really mean? And what is the significance of this global warming pause?
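The trend calculation itself is straightforward: fit an ordinary least-squares line through the monthly anomalies and read off the slope. Here is a minimal numpy sketch; the anomaly values are synthetic stand-ins, not the actual RSS record:

```python
import numpy as np

rng = np.random.default_rng(2)

# Feb 1997 through Oct 2015 spans 225 monthly values.
months = np.arange(225)

# Synthetic stand-in for a trendless anomaly series: flat mean plus noise.
anomaly = 0.25 + rng.normal(0, 0.1, months.size)

# Ordinary least-squares line fit; slope is in degrees C per month.
slope, intercept = np.polyfit(months, anomaly, 1)
trend_per_decade = slope * 120  # convert to degrees C per decade

print(round(trend_per_decade, 3))  # near zero for a trendless series
```

A least-squares slope indistinguishable from zero over such a period is exactly what the “pause” claim refers to.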


December 22, 2015 | Science and Pseudo-Science, Timeless or most popular, Video

Temperature station siting matters

By Judith Curry | Climate Etc. | December 17, 2015

30 year trends of temperature are shown to be lower, using well-sited high quality NOAA weather stations that do not require adjustments to the data.

Anthony Watts has presented an important analysis of U.S. surface temperatures, in a presentation co-authored by John Nielsen-Gammon and John Christy. Here is the link to the AGU press release. Watts has a more extensive post [here]. Excerpts:

SAN FRANCISCO, CA – A new study about the surface temperature record presented at the 2015 Fall Meeting of the American Geophysical Union suggests that the 30-year trend of temperatures for the Continental United States (CONUS) since 1979 is about two-thirds as strong as official NOAA temperature trends.

Using NOAA’s U.S. Historical Climatology Network, which comprises 1218 weather stations in the CONUS, the researchers were able to identify a 410 station subset of “unperturbed” stations that have not been moved, had equipment changes, or changes in time of observations, and thus require no “adjustments” to their temperature record to account for these problems. The study focuses on finding trend differences between well sited and poorly sited weather stations, based on a WMO approved metric for classification and assessment of the quality of the measurements based on proximity to artificial heat sources and heat sinks which affect temperature measurement.

Following up on a paper published by the authors in 2010, Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends which concluded:

Temperature trend estimates vary according to site classification, with poor siting leading to an overestimate of minimum temperature trends and an underestimate of maximum temperature trends, resulting in particular in a substantial difference in estimates of the diurnal temperature range trends

A 410-station subset of U.S. Historical Climatology Network (version 2.5) stations is identified that experienced no changes in time of observation or station moves during the 1979-2008 period. These stations are classified based on proximity to artificial surfaces, buildings, and other such objects with unnatural thermal mass using guidelines established by Leroy (2010). The United States temperature trends estimated from the relatively few stations in the classes with minimal artificial impact are found to be collectively about 2/3 as large as US trends estimated in the classes with greater expected artificial impact. The trend differences are largest for minimum temperatures and are statistically significant even at the regional scale and across different types of instrumentation and degrees of urbanization. The homogeneity adjustments applied by the National Centers for Environmental Information (formerly the National Climatic Data Center) greatly reduce those differences but produce trends that are more consistent with the stations with greater expected artificial impact. Trend differences are not found during the 1999-2008 sub-period of relatively stable temperatures, suggesting that the observed differences are caused by a physical mechanism that is directly or indirectly caused by changing temperatures.

Key findings:

1. Comprehensive and detailed evaluation of station metadata, on-site station photography, satellite and aerial imaging, street level Google Earth imagery, and curator interviews have yielded a well-distributed 410 station subset of the 1218 station USHCN network that is unperturbed by Time of Observation changes, station moves, or rating changes, and a complete or mostly complete 30-year dataset. It must be emphasized that the perturbed stations dropped from the USHCN set show significantly lower trends than those retained in the sample, both for well and poorly sited station sets.

2. Bias at the microsite level (the immediate environment of the sensor) in the unperturbed subset of USHCN stations has a significant effect on the mean temperature (Tmean) trend. Well sited stations show significantly less warming from 1979 – 2008. These differences are significant in Tmean, and most pronounced in the minimum temperature data (Tmin). (Figure 3 and Table 1)

3. Equipment bias (CRS v. MMTS stations) in the unperturbed subset of USHCN stations has a significant effect on the mean temperature (Tmean) trend when CRS stations are compared with MMTS stations. MMTS stations show significantly less warming than CRS stations from 1979 – 2008. (Table 1) These differences are significant in Tmean (even after upward adjustment for MMTS conversion) and most pronounced in the maximum temperature data (Tmax).

4. The 30-year Tmean temperature trend of unperturbed, well sited stations is significantly lower than the Tmean temperature trend of NOAA/NCDC official adjusted homogenized surface temperature record for all 1218 USHCN stations.

5. We believe the NOAA/NCDC homogenization adjustment causes well sited stations to be adjusted upwards to match the trends of poorly sited stations.

6. The data suggests that the divergence between well and poorly sited stations is gradual, not a result of spurious step change due to poor metadata.

Lead author Anthony Watts said of the study: “The majority of weather stations used by NOAA to detect climate change temperature signal have been compromised by encroachment of artificial surfaces like concrete, asphalt, and heat sources like air conditioner exhausts. This study demonstrates conclusively that this issue affects temperature trend and that NOAA’s methods are not correcting for this problem, resulting in an inflated temperature trend. It suggests that the trend for U.S. temperature will need to be corrected.” He added: “We also see evidence of this same sort of siting problem around the world at many other official weather stations, suggesting that the same upward bias on trend also manifests itself in the global temperature record”.

The full AGU presentation can be downloaded [here].

JC reflections

This looks like a solid study.  The participation of John Nielsen-Gammon in this study is particularly noteworthy; Watts writes:

Dr. John Nielsen-Gammon, the state climatologist of Texas, has done all the statistical significance analysis and his opinion is reflected in this statement from the introduction

Dr. Nielsen-Gammon has been our worst critic from the get-go; he’s independently reproduced the station ratings with the help of his students and created his own series of tests on the data and methods. It is worth noting that this is his statement:

The trend differences are largest for minimum temperatures and are statistically significant even at the regional scale and across different types of instrumentation and degrees of urbanization.

The p-values from Dr. Nielsen-Gammon’s statistical significance analysis are well below 0.05 (the 95% confidence level), and many comparisons are below 0.01 (the 99% confidence level). He’s on-board with the findings after satisfying himself that we indeed have found a ground truth. If anyone doubts his input to this study, you should view his publication record.
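One common way to carry out this kind of test is to fit a trend at each station and then compare the two siting classes with a two-sample t-test. The sketch below is a hypothetical illustration with synthetic data (80 well-sited vs. 330 poorly sited stations, 0.20 vs. 0.30 °C/decade), not the study's actual methodology or numbers:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

years = np.arange(30)  # a 30-year record, e.g. 1979-2008

def station_trends(n_stations, decadal_trend):
    # Per-station OLS slope (degrees per year), with year-to-year noise.
    series = (decadal_trend / 10) * years \
             + rng.normal(0, 0.3, (n_stations, years.size))
    return np.array([np.polyfit(years, s, 1)[0] for s in series])

well = station_trends(80, 0.20)   # well-sited class, synthetic
poor = station_trends(330, 0.30)  # poorly sited class, synthetic

# Welch's t-test: do the two classes have different mean trends?
t_stat, p_value = stats.ttest_ind(well, poor, equal_var=False)
print(p_value < 0.05)  # significant difference between siting classes
```

With enough stations, even a 0.1 °C/decade difference between classes yields p-values far below 0.05, which is the kind of result the significance analysis describes.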

This paper has been a long process for Anthony, but it appears to have produced a robust and important analysis.

The extension of this analysis globally is important to build confidence in the land surface temperature records.

It will certainly be interesting to see how the various groups producing global surface temperature analyses respond to the study.

December 18, 2015 | Deception, Science and Pseudo-Science, Timeless or most popular

Hottest Month Claims

By Ken Haapala | Science and Environmental Policy Project (SEPP) | August 29, 2015

Divergence: It is summertime in the US, and temperatures are warmer. Several readers have asked TWTW for comments on the recent claims that July 2015 was the hottest month ever and similar announcements by certain US government entities, including branches of the National Oceanic and Atmospheric Administration (NOAA) and the National Aeronautics and Space Administration (NASA). These entities are making strong public statements that the globe continues to warm, and the future is dire. A humorist could comment that the closer we are to the 21st session of the Conference of the Parties (COP-21) of the United Nations Framework Convention on Climate Change (UNFCCC) to be held in Paris from November 30 to December 11, the hotter the globe becomes.

However, there are three significant forms of divergence that are being demonstrated. One divergence is the increasing difference between atmospheric temperatures and surface temperatures. The second divergence is the growing difference between temperatures forecast by models and observed temperatures, particularly atmospheric temperatures. This leads to the third divergence, the difference between the activities of what can be called the Climate Establishment and what is observed in nature.

The atmospheric temperatures are reported by two independent entities: the largely NASA-financed UAH entity at the University of Alabama in Huntsville, and Remote Sensing Systems (RSS) in California. The surface temperatures are reported by NOAA, NASA, and the Hadley Centre of the UK Met Office, combined with those of the Climatic Research Unit (CRU) of the University of East Anglia. These measurements depend, in part, on the historic record maintained by NOAA’s National Climatic Data Center (NCDC). Unfortunately, for more than two decades, the historic record of the surface temperatures has been adjusted numerous times, without adequate records of the details and the effects. The net effect is an inflation of a warming trend, particularly obvious in the US where excellent historic records continue to exist. The UAH data have been adjusted, but the adjustments and effects have been publicly recorded.

The divergence between the temperatures forecast by the global climate models and the observed temperatures is becoming extremely obvious, particularly with the observed atmospheric temperatures. The adjustments to surface temperatures lessen this divergence somewhat, particularly with the latest adjustments by the NCDC, where superior measurements taken by fixed or floating buoys were inflated to correspond with earlier, inferior measurements taken by ships. The director of NCDC, Tom Karl, was a lead author in the paper announcing this change. As a result, we should see announcements that sea surface temperatures, and global surface temperatures, are increasing, although the increase may be strictly an artifact of human adjustments rather than an occurrence in nature.

The questionable adjustments in reported surface temperatures lead to the third form of increasing divergence – the differences between what is reported by the Climate Establishment and what is occurring in nature. The Climate Establishment can be defined as those who embrace the findings of the UN Intergovernmental Panel on Climate Change (IPCC), particularly the assertion of a high confidence, a high degree of certainty, that human emissions of carbon dioxide and other greenhouse gases are causing unprecedented and dangerous global warming. Simply because data is adjusted to reflect the IPCC view does not mean that the IPCC view is occurring.

The greenhouse effect takes place in the atmosphere, yet it is not being observed in the atmosphere. The satellite data, independently verified by four sets of weather balloon data, clearly shows it is not. There has been no significant warming for about 18 years. These data are the most comprehensive temperature data existing and are largely independent of other human influences that bias surface data such as urbanization, including building of structures and impervious surfaces, and other changes in land use. Those who broadcast claims of the hottest year ever, based on adjusted surface data, are actually emphasizing the divergence between science practiced by the Climate Establishment and Nature, and are not engaged in a natural science.

Unfortunately, many government entities and government-funded entities are involved in the Climate Establishment. The leaders of such government entities and funding entities demonstrate a lack of concern for institutional credibility, no respect for the scientific bases on which such institutions were built, including those who came before them and those who will replace them, and will leave their institutions in an inferior condition, rather than strengthen them.

It is important to note that not all government-funded entities are so involved. The National Space Science & Technology Center (NSSTC) at the University of Alabama in Huntsville (UAH), which is largely funded by the federal government (NASA), is a notable exception.

August 31, 2015 | Deception, Science and Pseudo-Science

@NOAA’s desperate new paper: Is there no global warming ‘hiatus’ after all?

By Patrick J. Michaels, Richard S. Lindzen, and Paul C. Knappenberger | Watts Up With That? | June 4, 2015

A new paper published today by Science, from Thomas Karl and several co-authors[1], that removes the “hiatus” in global warming, prompts many serious scientific questions.

The main claim[2] by the authors that they have uncovered a significant recent warming trend is dubious. The significance level they report on their findings (0.10) is hardly normative, and the use of it should prompt members of the scientific community to question the reasoning behind the use of such a lax standard.

In addition, the authors’ treatment of buoy sea-surface temperature (SST) data was guaranteed to create a warming trend. The data were adjusted upward by 0.12°C to make them “homogeneous” with the longer-running temperature records taken from engine intake channels in marine vessels.

As has been acknowledged by numerous scientists, the engine intake data are clearly contaminated by heat conduction from the structure, and as such were never intended for scientific use. On the other hand, environmental monitoring is the specific purpose of the buoys. Adjusting good data upward to match bad data seems questionable, and the fact that the buoy network becomes increasingly dense in the last two decades means that this adjustment must put a warming trend in the data.

The extension of high-latitude arctic land data over the Arctic Ocean is also questionable. Much of the Arctic Ocean is ice-covered even in high summer, meaning the surface temperature must remain near freezing. Extending land data out into the ocean will obviously induce substantially exaggerated temperatures.

Additionally, there exist multiple measures of bulk lower atmosphere temperature independent from surface measurements which indicate the existence of a “hiatus”[3]. If the Karl et al. result were in fact robust, it could only mean that the disparity between surface and midtropospheric temperatures is even larger than previously noted.

Getting the vertical distribution of temperature wrong invalidates virtually every forecast of sensible weather made by a climate model, as much of that weather (including rainfall) is determined in large part by the vertical structure of the atmosphere.

Instead, it would seem more logical to seriously question the Karl et al. result in light of the fact that, compared to those bulk temperatures, it is an outlier, showing a recent warming trend that is not in line with these other global records.

And finally, even presuming all the adjustments applied by the authors ultimately prove to be accurate, the temperature trend reported during the “hiatus” period (1998-2014), remains significantly below (using Karl et al.’s measure of significance) the mean trend projected by the collection of climate models used in the most recent report from the United Nation’s Intergovernmental Panel on Climate Change (IPCC).

It is important to recognize that the central issue of human-caused climate change is not a question of whether it is warming or not, but rather a question of how much. And to this relevant question, the answer has been, and remains, that the warming is taking place at a much slower rate than is being projected.

The distribution of trends of the projected global average surface temperature for the period 1998-2014 from 108 climate model runs used in the latest report of the U.N.’s Intergovernmental Panel on Climate Change (IPCC) (blue bars). The models were run with historical climate forcings through 2005 and extended to 2014 with the RCP4.5 emissions scenario. The surface temperature trend over the same period, as reported by Karl et al. (2015), is included in red. It falls at the 2.4th percentile of the model distribution and indicates a value that is (statistically) significantly below the model mean projection.

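The percentile comparison in the figure caption is simple to reproduce in principle: count how many model-run trends fall below the observed trend. The sketch below uses synthetic stand-in numbers (a notional model spread and observed trend), not the actual CMIP model output or the Karl et al. value:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for 108 model-run trends, degrees C per decade.
model_trends = rng.normal(0.20, 0.07, 108)

# Synthetic stand-in for an observed trend over the same period.
observed = 0.06

# Percentile of the observed trend within the model distribution:
# the fraction of model runs that project a smaller trend.
percentile = 100.0 * np.mean(model_trends < observed)
print(percentile)  # a low percentile: observations run cooler than most models
```

An observed trend landing in the low single-digit percentiles, as in the caption, means nearly all model runs projected more warming than was observed.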

[1] Karl, T. R., et al., Possible artifacts of data biases in the recent global surface warming hiatus. Science Express, embargoed until 1400 EDT June 4, 2015.

[2] “It is also noteworthy that the new global trends are statistically significant and positive at the 0.10 significance level for 1998-2012…”

[3] Both the UAH and RSS satellite records are now in their 21st year without a significant trend, for example

[NOTE: An earlier version of this posting accidentally omitted the last two paragraphs before the graphic, they have been restored, and the error is mine – Anthony]

June 6, 2015 | Corruption, Deception, Science and Pseudo-Science