Despite History of Israeli Espionage, Bill Would Force NASA Cooperation with Israel Space Agency
By Whitney Webb | MintPress News | September 5, 2018
A bill that was passed by the U.S. Senate in early August and is currently under consideration by the House would mandate that the National Aeronautics and Space Administration (NASA) work closely with the Israel Space Agency (ISA) despite the fact that such cooperation in the past was used by Israel to steal U.S. state secrets.
The provision is tucked within the bill titled the “United States-Israel Security Assistance Authorization Act of 2018,” which would also provide Israel with $38 billion in U.S. military aid over a ten-year period, the largest military aid package in U.S. history. MintPress News previously reported that this massive aid package translates into approximately $23,000 every year for every Israeli family. However, the provision pertaining to NASA, which was first identified by the website If Americans Knew, has largely gone unreported.
According to the current text of the bill, NASA and the Israel Space Agency are mandated to work together “to identify and cooperatively pursue peaceful space exploration and science initiatives in areas of mutual interest, taking all appropriate measures to protect sensitive information, intellectual property, trade secrets, and economic interests of the United States.” The text also references past agreements established between NASA and the ISA such as the first mutual cooperation agreement, signed in 1996, and the 2015 “Framework Agreement for Cooperation in Aeronautics and the Exploration and Use of Airspace and Outer Space for Peaceful Purposes” as the basis for this “continuing cooperation.”
Absent, however, from the bill’s text is the fact that the ISA has used this cooperation in the past to steal classified U.S. information and to conduct espionage. For instance, a lawsuit filed in November 2014 by physicist Dr. Sandra Troian detailed how an Israeli postdoctoral student at Caltech, Amir Gat, blatantly violated U.S. law by illegally transmitting to Israel classified information on NASA technology.
According to court documents, the theft of classified information took place at Caltech’s Jet Propulsion Laboratory, an important NASA research and development center. Gat now lives in Israel and works at the Technion (Israel Institute of Technology), a public research institution in Israel.
Yet, instead of attempting to stop the espionage, Caltech administrators sought to silence Troian, in violation of the school’s whistleblower policy, and retaliated against her for speaking up, including engaging in efforts to have her fired.
Troian maintains that the school was afraid of taking her concerns seriously, as it would have put the university’s $8 billion contract with NASA at risk and cast the institution in a bad light. Also of note was the fact that the Obama administration showed no interest in the case despite its repeated use of the Espionage Act to target legitimate government whistleblowers.
Thus, the Caltech incident — and the lack of accountability and the effort to silence whistleblowers that ultimately ensued — greatly weakens the bill’s claim that “all appropriate measures to protect sensitive information, intellectual property, trade secrets, and economic interests of the United States” will be followed. Despite the gravity of this incident, the inclusion of this NASA-related provision in the pending bill leaves an open door for such espionage to take place again, to the detriment of U.S. “national security.”
However, as the Trump administration has shown, the “national security” of the U.S. and of Israel have become profoundly intertwined, as President Trump’s campaign promises of “America First” quickly devolved into “Israel First” — thanks largely to the influence of Trump’s largest donor, Zionist billionaire Sheldon Adelson. Thus, concerns about Israeli espionage seem to be of little import to the current administration as well as to many members of Congress — particularly those greatly influenced by powerful organizations of the Israel lobby, such as the American Israel Public Affairs Committee (AIPAC).
A long-standing double standard
Yet, failure to prevent or punish Israeli espionage in the United States has long been a common policy in Washington that significantly predates the Trump administration. With the notable exception of former U.S. government contractor and Israeli spy Jonathan Pollard, the Israel lobby and pro-Israel billionaire donors have been largely successful in obtaining presidential pardons or lenient sentences for alleged Israeli spies.
A clear illustration of this double standard is the case of Colonel Lawrence Franklin, which shows that espionage, when conducted on Israel’s behalf, is not treated as seriously by the U.S. government as other cases of espionage. Franklin, a former employee at the U.S. Department of Defense, pled guilty to espionage in 2006 for giving classified information to the American Israel Public Affairs Committee (AIPAC), as well as directly to Israeli officials, in an attempt to pivot U.S. military forces engaged in Iraq towards Iran.
The Bush administration successfully pushed the Justice Department to pardon Franklin’s co-conspirators and then to reduce Franklin’s 13-year prison sentence to 10 months of house arrest. Subsequently, members of the U.S. Congress asked Obama to pardon Franklin in 2016, asserting that “his [Franklin’s] intentions were to save lives and protect this great country” despite the fact that Franklin had sought to involve the U.S. in a war with Iran in order to benefit Israel.
Thus, the current NASA provision in the United States-Israel Security Assistance Authorization Act of 2018 would ostensibly continue this practice of “turning a blind eye” to Israeli interference and espionage in the United States if the bill is passed in the coming weeks.
Whitney Webb is a staff writer for MintPress News and a contributor to Ben Swann’s Truth in Media. Her work has appeared on Global Research, the Ron Paul Institute and 21st Century Wire, among others. She has also made radio and TV appearances on RT and Sputnik. She currently lives with her family in southern Chile.
Claim: ‘With 2015, Earth Has Back-to-Back Hottest Years Ever Recorded’
MIT Climate Scientist Mocks ‘Hottest Year’ Claim: ‘Anyone who starts crowing about those numbers shows that they’re putting spin on nothing’
Climate Depot | January 20, 2016
NASA and NOAA today proclaimed that 2015 was the ‘hottest year’ on record.
Meanwhile, satellite data shows an 18-plus-year standstill in global temperatures.
MIT climate scientist Dr. Richard Lindzen balked at claims of the ‘hottest year’ based on ground-based temperature data.
“Frankly, I feel it is proof of dishonesty to argue about things like small fluctuations in temperature or the sign of a trend. Why lend credibility to this dishonesty?” Lindzen, an emeritus Alfred P. Sloan Professor of Meteorology at the Department of Earth, Atmospheric and Planetary Sciences at MIT, told Climate Depot shortly after the announcements.
“All that matters is that for almost 40 years, model projections have almost all exceeded observations. Even if all the observed warming were due to greenhouse emissions, it would still point to low sensitivity,” Lindzen continued.
“But, given the ‘pause,’ we know that natural internal variability has to be of the same order as any other process,” Lindzen wrote.
Lindzen has previously mocked ‘warmest’ or ‘hottest’ year proclamations.
“When someone says this is the warmest temperature on record. What are they talking about? It’s just nonsense. This is a very tiny change period,” Lindzen said in November 2015.
Lindzen cautioned: “The most important thing to keep in mind is – when you ask ‘is it warming, is it cooling’, etc. — is that we are talking about something tiny (temperature changes) and that is the crucial point.”
“And the proof that the uncertainty is tenths of a degree are the adjustments that are being made. If you can adjust temperatures to 2/10ths of a degree, it means it wasn’t certain to 2/10ths of a degree,” he added.
“70% of the earth is oceans, we can’t measure those temperatures very well. They can be off a half a degree, a quarter of a degree. Even two-10ths of a degree of change would be tiny but two-100ths is ludicrous. Anyone who starts crowing about those numbers shows that they’re putting spin on nothing.”
Related:
Hottest Month Claims
By Ken Haapala | Science and Environmental Policy Project (SEPP) | August 29, 2015
Divergence: It is summertime in the US, and temperatures are warmer. Several readers have asked TWTW for comments on the recent claims that July 2015 was the hottest month ever and similar announcements by certain US government entities, including branches of the National Oceanic and Atmospheric Administration (NOAA) and the National Aeronautics and Space Administration (NASA). These entities are making strong public statements that the globe continues to warm, and the future is dire. A humorist could comment that the closer we are to the 21st session of the Conference of the Parties (COP-21) of the United Nations Framework Convention on Climate Change (UNFCCC) to be held in Paris from November 30 to December 11, the hotter the globe becomes.
However, there are three significant forms of divergence that are being demonstrated. One divergence is the increasing difference between atmospheric temperatures and surface temperatures. The second divergence is the growing difference between temperatures forecast by models and observed temperatures, particularly atmospheric temperatures. This leads to the third divergence, the difference between the activities of what can be called the Climate Establishment and what is observed in nature.
The atmospheric temperatures are reported by two independent entities: the largely NASA-financed UAH entity at the University of Alabama in Huntsville, and Remote Sensing Systems (RSS) in California. The surface temperatures are reported by NOAA, NASA, and the Hadley Centre of the UK Met Office, combined with those of the Climatic Research Unit (CRU) of the University of East Anglia. These measurements depend, in part, on the historic record maintained by NOAA’s National Climatic Data Center (NCDC). Unfortunately, for more than two decades, the historic record of the surface temperatures has been adjusted numerous times, without adequate records of the details and the effects. The net effect is an inflation of a warming trend, particularly obvious in the US where excellent historic records continue to exist. The UAH data have been adjusted, but the adjustments and effects have been publicly recorded.
The divergence between the temperatures forecast by the global climate models and the observed temperatures is becoming extremely obvious, particularly with the observed atmospheric temperatures. The adjustments to surface temperatures lessen this divergence somewhat, particularly with the latest adjustments by the NCDC, where superior measurements taken by fixed or floating buoys were inflated to correspond with earlier, inferior measurements taken by ships. The director of NCDC, Tom Karl, was a lead author of the paper announcing this change. As a result, we should see announcements that sea surface temperatures, and global surface temperatures, are increasing, although the increase may be strictly an artifact of human adjustments rather than an occurrence in nature.
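To make the mechanics of such an adjustment concrete, here is a minimal, purely illustrative Python sketch. The numbers and the simple constant-offset approach are invented for illustration only; the actual NCDC/ERSST procedure is far more elaborate and is not reproduced here.

```python
# Illustrative sketch of a ship/buoy bias adjustment (toy data, not the real NCDC method).
# Buoys tend to read cooler than ship measurements; one simple way to merge the two
# records is to estimate the mean ship-minus-buoy offset over a period when both
# instruments report side by side, then add that offset to the buoy series.

ship_overlap = [15.42, 15.51, 15.47, 15.55, 15.49]   # hypothetical co-located ship readings (C)
buoy_overlap = [15.30, 15.38, 15.36, 15.42, 15.37]   # hypothetical co-located buoy readings (C)

offset = sum(s - b for s, b in zip(ship_overlap, buoy_overlap)) / len(ship_overlap)
print(f"estimated ship-minus-buoy offset: {offset:.3f} C")

# Recent decades are increasingly dominated by buoys, so adding the offset raises the
# recent part of the record relative to the ship-dominated early part, which is why
# this kind of adjustment steepens the computed warming trend.
buoy_recent = [15.40, 15.44, 15.43, 15.48]            # hypothetical recent buoy-only years (C)
adjusted_recent = [round(t + offset, 2) for t in buoy_recent]
print("adjusted recent buoy series:", adjusted_recent)
```

Whether one raises the buoy record to the ship baseline or lowers the ship record to the buoy baseline makes no difference to the resulting trend, since a constant shift applied to an entire series leaves its slope unchanged; the trend changes because the offset is applied only to the buoy-heavy recent portion of the record.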
The questionable adjustments in reported surface temperatures lead to the third form of increasing divergence – the differences between what is reported by the Climate Establishment and what is occurring in nature. The Climate Establishment can be defined as those who embrace the findings of the UN Intergovernmental Panel on Climate Change (IPCC), particularly the assertion of a high confidence, a high degree of certainty, that human emissions of carbon dioxide and other greenhouse gases are causing unprecedented and dangerous global warming. Simply because data are adjusted to reflect the IPCC view does not mean that the IPCC view is occurring.
The greenhouse effect takes place in the atmosphere, yet the predicted warming is not being observed there. The satellite data, independently verified by four sets of weather-balloon data, clearly show that there has been no significant warming for about 18 years. These data are the most comprehensive temperature data existing and are largely independent of other human influences that bias surface data, such as urbanization, including the building of structures and impervious surfaces, and other changes in land use. Those who broadcast claims of the hottest year ever, based on adjusted surface data, are actually emphasizing the divergence between the science practiced by the Climate Establishment and Nature, and are not engaged in a natural science.
Unfortunately, many government entities and government-funded entities are involved in the Climate Establishment. The leaders of such government entities and funding entities demonstrate a lack of concern for institutional credibility, no respect for the scientific bases on which such institutions were built, including those who came before them and those who will replace them, and will leave their institutions in an inferior condition, rather than strengthen them.
It is important to note that not all government-funded entities are so involved. The National Space Science & Technology Center (NSSTC) at the University of Alabama in Huntsville (UAH), which is largely funded by the federal government (NASA), is a notable exception.
German Media On The Prophets Of NASA: “Prophesizing Gigantic Floods” – 200 Years In The Future!
Pre-Paris hype
By P Gosselin | No Tricks Zone | August 28, 2015
The German media have been buzzing over the recent NASA publication warning of rising sea levels in the future and insisting that we need to be very worried.
Maybe I’m reading more into it than I should, but I get the feeling that the increasingly dubious NASA climate science organization is no longer being taken 100% seriously by some major German outlets, which have started to label NASA scenarios and projections as “prophecies”.
For example Germany’s normally politically correct, devout green NTV here has the article bearing the title: “NASA prophesizes gigantic floods.”
Prophecies are more the sort of things one typically expects to hear from prophets. The trouble today is that anyone who claims to be a prophet or to possess prophet-like powers almost always gets equated to being a kook, quack, or charlatan. Moreover being labeled a prophet doesn’t get you much respect either. So you have to wonder about the NTV’s choice of words for the title of its story.
Could NTV journalists really be so dim and naïve as to actually believe in climate prophets?
NTV writes of an organization that seems to fancy itself as having visionary power to see the end of the world. NTV tells us:
“An unavoidable sea level rise of at least one meter in the coming 100 to 200 years is the result of the latest research data.”
The NTV report then cites NASA prophet Tom Wagner:
“NASA scientist Tom Wagner says that when the ice sheets break down on each other, even the risk of a sea level rise of three meters over the coming 100 to 200 years is thinkable.”
Okay, these visions may still be a bit fuzzy, but the NASA scientist-prophets know almost for sure they are out there. And again the prophecy of doom gets repeated at the end of the article by prophet Steve Nerem:
“‘Things will probably get worse in the future,’ prophesizes Nerem as a result of global warming.”
Again this is the NTV using the word “prophesizes”.
Of course there are only a few teensy-weensy problems with NASA’s prophecies of doom:
1) The hundreds of coastal tide gauges show no acceleration in sea level rise and they show a rise that is much less than what has been measured by the seemingly poorly calibrated satellites,
2) polar sea ice has recovered over the past years,
3) polar temperatures have flattened, or are even declining,
4) global temperatures have flattened, and
5) there’s a growing number of scientists who are now telling us that we should be expecting global cooling over the coming decades.
Moreover, new Greenland data show growing ice (more on this tomorrow).
I’ll let the readers judge for themselves on whether NASA scientists are true prophets, or whether they are behaving more like snake-oil-peddling charlatans.
Myself, I’ve lost all respect for the space organization. It’s become a grossly distorted caricature of what scientific research is about.
200 years in the future… yeah, right!
Skeptical of skeptics: is Steve Goddard right?
By Judith Curry | Climate Etc. | June 28, 2014
Skeptics doing what skeptics do best . . . attack skeptics. – Suyts
Last week, the mainstream media was abuzz with claims by skeptical blogger Steve Goddard that NOAA and NASA have dramatically altered the US temperature record. For examples of MSM coverage, see:
- Telegraph: The Scandal of Fiddled Global Warming Data
- Washington Times: Rigged ‘science’
- RealClearPolitics: Climate Change: Who are the real deniers?
Further, this story was carried as the lead story on Drudge for a day.
First off the block to challenge Goddard came Ronald Bailey at reason.com in an article Did NASA/NOAA Dramatically Alter U.S. Temperatures After 2000? that cites communication with Anthony Watts, who is critical of Goddard’s analysis, as well as being critical of NASA/NOAA.
Politifact chimed in with an article that assessed Goddard’s claims, based on Watts’s statements and also an analysis by Zeke Hausfather. Politifact summarized with this statement: We rate the claim Pants on Fire.
I didn’t pay much attention to this, until Politifact asked me for my opinion. I said that I hadn’t looked at it myself, but referred them to Zeke and Watts. I did tweet their Pants on Fire conclusion.
Skepticism in the technical climate blogosphere
Over at the Blackboard, Zeke Hausfather has a three-part series about Goddard’s analysis – How not to calculate temperatures (Part I, Part II, Part III). Without getting into the technical details here, the critiques relate to the topics of data dropout, data infilling/gridding, time of day adjustments, and the use of physical temperatures versus anomalies. The comments thread on Part II is very good, well worth reading.
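To see concretely why the choice between absolute temperatures and anomalies matters once stations drop in and out of the record, here is a minimal toy sketch (invented numbers, not USHCN data): when a cold mountain station stops reporting, a simple average of absolute temperatures jumps even though no individual station warmed, while averaging each station’s anomaly against its own baseline barely moves.

```python
# Toy illustration of why averaging absolute temperatures is sensitive to station dropout,
# while averaging anomalies (each station's departure from its own baseline) is not.
# All numbers are invented; this is not the USHCN dataset or any official method.

stations = {
    "valley":   {"baseline": 15.0, "year1": 15.2, "year2": 15.2},
    "coast":    {"baseline": 12.0, "year1": 12.1, "year2": 12.1},
    "mountain": {"baseline":  2.0, "year1":  2.1, "year2": None},  # stops reporting in year 2
}

def mean(values):
    return sum(values) / len(values)

for year in ("year1", "year2"):
    reporting = [s for s in stations.values() if s[year] is not None]
    abs_mean = mean([s[year] for s in reporting])
    anom_mean = mean([s[year] - s["baseline"] for s in reporting])
    print(f"{year}: absolute mean = {abs_mean:5.2f} C, anomaly mean = {anom_mean:+.2f} C")

# year1: absolute mean is about 9.80 C; in year2 it jumps to 13.65 C purely because the
# cold station dropped out, while the anomaly mean stays between +0.13 and +0.15 C.
```

Gridding and infilling address a related problem: without them, regions dense with stations dominate a naive average, and a station that vanishes leaves its grid cell to be estimated rather than silently ignored.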
Anthony Watts has a two-part series On denying hockey sticks, USHCN data and all that (Part 1, Part 2). The posts document Watts’ communications with Goddard, and make mostly the same technical points as Zeke. There are some good technical comments in Part 2, and Watts makes a proposal regarding the use of US reference stations.
Nick Stokes has two technical posts that relate to Goddard’s analysis: USHCN adjustments, averages, getting it right and TOBS nailed.
While I haven’t dug into all this myself, the above analyses seem robust, and it seems that Goddard has made some analysis errors.
The data
OK, acknowledging that Goddard made some analysis errors, I am still left with some uneasiness about the actual data, and why it keeps changing. For example, Jennifer Marohasy has been writing about Corrupting Australia’s temperature record.
In the midst of preparing this blog post, I received an email from Anthony Watts, suggesting that I hold off on my post since there is some breaking news. Watts pointed me to a post by Paul Homewood entitled Massive Temperature Adjustments At Luling, Texas. Excerpt:
So, I thought it might be worth looking in more detail at a few stations, to see what is going on. In Steve’s post, mentioned above, he links to the USHCN Final dataset for monthly temperatures, making the point that approx 40% of these monthly readings are “estimated”, as there is no raw data.
From this dataset, I picked the one at the top of the list, (which appears to be totally random), Station number 415429, which is Luling, Texas.
Taking last year as an example, we can see that ten of the twelve months are tagged as “E”, i.e. estimated. It is understandable that a station might be a month, or even two, late in reporting, but it is not conceivable that readings from last year are late. (The other two months, Jan/Feb, are marked “a”, indicating missing days).
But, the mystery thickens. Each state produces a monthly and annual State Climatological Report, which among other things includes a list of monthly mean temperatures by station. If we look at the 2013 annual report for Texas, we can see these monthly temperatures for Luling.
Where an “M” appears after the temperature, this indicates some days are missing, i.e. Jan, Feb, Oct and Nov. (Detailed daily data shows just one missing day’s minimum temperature for each of these months).
Yet, according to the USHCN dataset, all ten months from March to December are “Estimated”. Why, when there is full data available?
But it gets worse. The table below compares the actual station data with what USHCN describe as “the bias-adjusted temperature”. The results are shocking.
In other words, the adjustments have added an astonishing 1.35C to the annual temperature for 2013. Note also that I have included the same figures for 1934, which show that the adjustment has reduced temperatures that year by 0.91C. So, the net effect of the adjustments between 1934 and 2013 has been to add 2.26C of warming.
Note as well, that the largest adjustments are for the estimated months of March – December. This is something that Steve Goddard has been emphasising.
It is plain that these adjustments made are not justifiable in any way. It is also clear that the number of “Estimated” measurements made are not justified either, as the real data is there, present and correct.
Watts appears in the comments, stating that he has contacted John Nielsen-Gammon (Texas State Climatologist) about this issue. Nick Stokes also appears in the comments, and one commenter finds a similar problem for another Texas station.
Homewood’s post sheds light on Goddard’s original claim regarding the data dropout (not just stations that are no longer reporting, but reporting stations that are ‘estimated’). I infer from this that there seems to be a real problem with the USHCN data set, or at least with some of the stations. Maybe it is a tempest in a teacup, but it looks like something that requires NOAA’s attention. As far as I can tell, NOAA has not responded to Goddard’s allegations. Now, with Homewood’s explanation/clarification, NOAA really needs to respond.
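As a quick side note on the arithmetic in Homewood’s excerpt, the net figure he quotes follows directly from the two adjustments he reports; a minimal sketch using only those two numbers:

```python
# Net effect of the Luling, Texas adjustments as reported in Homewood's post:
# +1.35 C added to the 2013 annual mean, and 0.91 C subtracted from the 1934 annual mean.
added_2013 = 1.35     # adjustment to 2013 (C)
removed_1934 = -0.91  # adjustment to 1934 (C)

net_warming_added = added_2013 - removed_1934
print(f"net warming added between 1934 and 2013: {net_warming_added:.2f} C")  # 2.26 C
```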
Sociology of the technical skeptical blogosphere
Apart from the astonishing scientific and political implications of what could be a major bug in the USHCN dataset, there are some interesting insights and lessons from this regarding the technical skeptical blogosphere.
Who do I include in the technical skeptical blogosphere? Tamino, Moyhu, Blackboard, Watts, Goddard, ClimateAudit, Jeff Id, Roman M. There are others, but the main discriminating factor is that they do data analysis, and audit the data analysis of others. Are all of these ‘skeptics’ in the political sense? No – Tamino and Moyhu definitely run warm, with Blackboard and a few others running lukewarm. Of these, Goddard is the most skeptical of AGW. There is most definitely no tribalism among this group.
In responding to Goddard’s post, Zeke, Nick Stokes (Moyhu) and Watts may have missed the real story. They focused on their previous criticism of Goddard and missed his main point. Further, I think there was an element of ‘boy who cried wolf’ – Goddard has been wrong before, and the comments at Goddard’s blog can be pretty crackpotty. However, the main point is that this group is rapidly self-correcting – the self-correcting function in the skeptical technical blogosphere seems to be more effective (and certainly faster) than for establishment climate science.
There’s another issue here and that is one of communication. Why was Goddard’s original post unconvincing to this group, whereas Homewood’s post seems to be convincing? Apart from the ‘crying wolf’ issue, Goddard focused on the message that the real warming was much less than portrayed by the NOAA data set (which caught the attention of the mainstream media), whereas Homewood more carefully documented the actual problem with the data set.
I’ve been in email communications with Watts through much of Friday, and he’s been pursuing the issue, along with Zeke and with help from Nielsen-Gammon, directly with NCDC, which is reportedly taking it seriously. Not only does Watts plan to issue a statement on how he missed Goddard’s original issue, he says that additional problems have been discovered and that NOAA/NCDC will be issuing some sort of statement, possibly also a correction, next week. (Watts has approved me making this statement).
This incident is another one that challenges traditional notions of expertise. From a recent speech by President Obama:
“I mean, I’m not a scientist either, but I’ve got this guy, John Holdren, he’s a scientist,” Obama added to laughter. “I’ve got a bunch of scientists at NASA and I’ve got a bunch of scientists at EPA.”
Who all rely on the data prepared by his bunch of scientists at NOAA.
How to analyze the imperfect and heterogeneous surface temperature data is not straightforward – there are numerous ways to skin this cat, and the cat still seems to have some skin left. I like the Berkeley Earth methods, but I am not convinced that their confidence interval/uncertainty estimates are adequate.
Stay tuned, I think this one bears watching.
Solar warnings, global warming and crimes against humanity
Malaysian Realist
We’ve been seeing a lot of unexpectedly cool weather across the world. While this may be explained by local phenomena such as the Northeast Monsoon in Malaysia and the Polar Vortex in the USA, a longer term trend of worldwide cooling is headed our way.
I say this because the sun – the main source of light and heat for our planet – is approaching a combined low point in output. Solar activity rises and falls in different overlapping cycles, and the low points of several cycles will coincide in the near future:
A) The 11-year Schwabe cycle, which had a minimum in 2008 and is due for the next minimum in 2019, then 2030. Even at its recent peak (2013) the sun had its lowest recorded activity in 200 years.
B) The 87-year Gleissberg cycle, which has a currently ongoing minimum period from 1997 to 2032, corresponding to the observed ‘lack of global warming’ (more on that later).
C) The 210-year Suess cycle, which has its next minimum predicted to be around 2040.
Hence, solar output will very likely drop to a substantial low around 2030 – 2040. This may sound pleasant for Malaysians used to sweltering heat, but it is really not a matter to be taken lightly. Previous lows such as the Year Without A Summer (1816) and the Little Ice Age (16th to 19th century) led to many deaths worldwide from crop failures, flooding, superstorms and freezing winters.
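As a rough back-of-the-envelope sketch of why these minima are said to cluster, one can simply project forward from the cycle lengths and dates quoted above (real solar cycles are irregular, and the Gleissberg and Suess dates are approximate, so this is only an extrapolation of the stated figures, not a prediction):

```python
# Back-of-the-envelope projection of upcoming solar-cycle minima from the figures quoted above.
# Real cycles vary in length; this just extrapolates the nominal periods.

def project_minima(last_minimum, period, until=2060):
    """Project future minimum years by repeatedly adding the nominal cycle period."""
    years = []
    year = last_minimum
    while year <= until:
        years.append(year)
        year += period
    return years

schwabe_minima = project_minima(2008, 11)   # 11-year cycle, last minimum in 2008
gleissberg_window = (1997, 2032)            # ~87-year cycle, quoted minimum period
suess_minimum = 2040                        # ~210-year cycle, next minimum quoted as ~2040

print("Schwabe minima:", schwabe_minima)    # [2008, 2019, 2030, 2041, 2052]
print("Gleissberg minimum window:", gleissberg_window)
print("Suess minimum (approx.):", suess_minimum)
# The 2030 Schwabe minimum falls inside the Gleissberg minimum window and a decade ahead of
# the Suess minimum, which is the overlap behind the 2030-2040 low point suggested above.
```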
But what about the much-ballyhooed global warming, allegedly caused by increasing CO2 levels in the atmosphere? Won’t that more than offset the coming cooling, still dooming us all to a feverish Earth?
Regarding this matter, it is now a plainly accepted fact that there has been no global temperature rise in the past 25 years. This lack of warming is openly admitted by: NASA; The UK Met Office; the University of East Anglia Climatic Research Unit, as well as its former head Dr. Phil Jones (of the Climategate data manipulation controversy); Hans von Storch (Lead Author for Working Group I of the IPCC); James Lovelock (inventor of the Gaia Theory); and media entities the BBC, Forbes, Reuters, The Australian, The Economist, The New York Times, and The Wall Street Journal.
And this is despite CO2 levels having risen more than 13%, from 349 ppm in 1987 to 396 ppm today. The central thesis of global warming theory – that rising CO2 levels will inexorably lead to rising global temperatures, followed by environmental catastrophe and massive loss of human life – is proven false.
(All the above are clearly and cleanly depicted by graphs, excerpts, citations and links in my collection at http://globalwarmingisunfactual.wordpress.com – as a public service.)
This is probably why anti-CO2 advocates now warn of ‘climate change’ instead. But pray tell, exactly what mechanism is there for CO2 to cause climate change if not by warming? The greenhouse effect has CO2 trapping solar heat and thus raising temperatures – as we have been warned ad nauseam by climate alarmists – so how does CO2 cause climate change when there is no warming?
Solar activity is a far larger driver of global temperature than CO2 levels, because after all, without the sun there would be no heat for greenhouse gases to trap in the first place. (Remember what I said about the Gleissberg cycle above?)
And why is any of this important to you and me? It matters because countless resources are being spent to meet the wrong challenges. Just think of all the time, energy, public attention and hard cash that have already been squandered on biofuel mandates, subsidies for solar panels and wind turbines, carbon caps and credits, bloated salaries of dignitaries, annual jet-setting climate conferences in posh five-star hotels… To say nothing of the lost opportunities and jobs (two jobs lost for every one ‘green’ job created in Spain, which now has 26% unemployment!). And most of the time it is the common working man, the taxpayer, you and I who foot the bill.
What if all this immense effort and expenditure had been put towards securing food and clean water for the impoverished (combined 11 million deaths/year)? Or fighting dengue and malaria (combined 1.222 million deaths/year)? Or preserving rivers, mangroves, rainforests and endangered species? Or preparing power grids for the increased demand that more severe winters will necessitate – the same power grids now crippled by shutting down reliable coal plants in favour of highly intermittent wind turbines?
In the face of such dire needs that can be met immediately and effectively, continuing to throw away precious money to ‘possibly, perhaps, maybe one day’ solve the non-problem of CO2 emissions is foolish, arrogant and arguably malevolent. To wit, the UN World Food Programme just announced that it is forced to scale back aid to some of the 870 million malnourished worldwide due to a $1 billion funding shortfall and the challenges of the ongoing Syrian crisis. To put this in context, a billion is a mere pittance next to the tens of billions already flushed away by attempted adherence to the Kyoto Protocol (€6.2 billion for just Germany in just 2005 alone!).
During the high times for global warmist doomsaying, sceptics and realists who questioned the unproven theories were baselessly slandered as ‘anti-science’, ‘deniers’, ‘shills for big oil’… Or even ‘war criminals’ deserving Nuremberg-style trials for their ‘crimes against humanity’!
Now that the tables are turned, just let it be known that it was not the sceptics who flushed massive amounts of global resources down the drain – while genuine human and environmental issues languished and withered in the empty shadow of global warming hysteria. Crimes against humanity, indeed.
The Government Can’t Even Figure Out How To Shut Down Its Websites In A Reasonable Way
By Mike Masnick | Techdirt | October 2, 2013
With the government shutdown, you may have come across a variety of oddities involving various government agency websites that were completely taken offline. This seems strange. Yes, the government is shut down, but does that really mean they need to turn off their web servers as well, even the purely informational ones? I could see them just leaving them static without updating them, but to completely block them just seems… odd. Even odder is that not all websites are down and some, such as the FTC’s website, appear to be fully up, including fully loading a page… only to then redirect you to a page that says it’s down. Julian Sanchez, over at Cato, explores the various oddities of government domains that are either up or down — or something in between.
For agencies that directly run their own Web sites on in-house servers, shutting down might make sense if the agency’s “essential” and “inessential” systems are suitably segregated. Running the site in those cases eats up electricity and bandwidth that the agency is paying for, not to mention the IT and security personnel who need to monitor the site for attacks and other problems. Fair enough in those cases. But those functions are, at least in the private sector, often outsourced and paid for up front: if you’ve contracted with an outside firm to host your site, shutting it down for a few days or weeks may not save any money at all. And that might indeed explain why some government sites remain operational, even though they don’t exactly seem “essential,” while others have been pulled down.
That doesn’t seem to account for some of the weird patterns we see, however. The main page at NASA.gov redirects to a page saying the site is unavailable, but lots of subdomains that, however cool, seem “inessential” remain up and running: the “Solar System Exploration” page at solarsystem.nasa.gov; the Climate Kids website at climatekids.nasa.gov; and the large photo archive at images.jsc.nasa.gov, to name a few. There are any number of good reasons some of those subdomains might be hosted separately, and therefore unaffected by the shutdown—but it seems odd they can keep all of these running without additional expenditures, yet aren’t able to redirect to a co-located mirror of the landing page.
He also takes on the issue of the FTC redirect, noting that because the redirect happens only after the full page has loaded, the agency isn’t saving any money at all this way, which means the move makes absolutely no sense.
Still weirder is the status of the Federal Trade Commission’s site. Browse to any of their pages and you’ll see, for a split second, the full content of the page you want—only to be redirected to a shutdown notice page also hosted at FTC.gov. But that means… their servers are still up and running and actually serving all the same content. In fact they’re serving more content: first the real page, then the shutdown notice page. If you’re using Firefox or Chrome and don’t mind browsing in HTML-cluttered text, you can even use this link to navigate to the FTC site map and navigate from page to page in source-code view without triggering the redirect. Again, it’s entirely possible I’m missing something, but if the full site is actually still running, it’s hard to see how a redirect after the real page is served could be avoiding any expenditures.
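Out of curiosity, this kind of behavior is easy to check with a few lines of code. Below is a minimal sketch, assuming the shutdown notice was delivered by a client-side (meta-refresh or JavaScript) redirect rather than an HTTP-level one; the article does not spell out the mechanism, and the URL is just an example:

```python
# Minimal sketch: fetch a page and inspect whether the server returns the full content
# before any client-side redirect fires. HTTP-level redirects would be visible in the
# status code; meta-refresh or JavaScript redirects only take effect inside a browser.
import urllib.request

url = "https://www.ftc.gov/"  # example target; substitute any page of interest

with urllib.request.urlopen(url) as resp:
    body = resp.read().decode("utf-8", errors="replace")
    print("HTTP status:", resp.status)
    print("bytes served:", len(body))

# If the body is full length and also contains a meta refresh or a script pointing at a
# shutdown-notice page, then the server is doing all the work of serving the real page
# and then some, which was Sanchez's point about not saving any money.
lower = body.lower()
if 'http-equiv="refresh"' in lower or "shutdown" in lower:
    print("page contains a client-side redirect or a shutdown-notice reference")
```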
Sanchez tries to piece together why this might be happening, and points to a White House memo which explicitly says that agencies should shut stuff down even if it’s cheaper to keep them online:
The determination of which services continue during an appropriations lapse is not affected by whether the costs of shutdown exceed the costs of maintaining services…
It’s difficult to see how this helps anyone at all. But it does a good job (yet again) of demonstrating that logic and bureaucracy don’t often go well together.





