Aletho News

ΑΛΗΘΩΣ

Erroneous climate change study reported far and wide, corrections few and far between

RT | November 15, 2018

As the world grapples with extreme weather and wildfires, the issue of climate change is at the forefront of policy decisions, scientific research and media coverage – but bias towards alarmism is proving somewhat irresistible.

A new study recently published in the journal Nature suggested that “ocean warming is at the high end of previous estimates,” based on atmospheric data taken between 1991 and 2016. The oceans have been taking up 60 percent more heat per year than the estimates offered by the United Nations’ Intergovernmental Panel on Climate Change in 2014, the authors claimed.

The research was co-authored by an expert – a Princeton geoscientist no less – so the disturbing news spread like… well, wildfire across the news media, with each headline more breathless than the last. The only problem was that the numbers used to generate the study’s conclusions were off: way off.

One climate change researcher and statistician wasn’t so convinced by the study: Nicholas Lewis took a closer look at the numbers and spotted a few glaring errors in the researchers’ calculations.

“Unfortunately their work involves many assumptions where there is scope for subjective choices by the authors, so it is difficult to validate those assumptions,” Lewis told Reason.com.

In fact, the warming of the world’s oceans was overstated by approximately 30 percent, a substantial margin of error by most standards.

Lewis also questioned the “failure of the original peer review and editorial process to pick up the fairly obvious statistical problems in the original paper.”

In response, the study’s co-author and Scripps Institution of Oceanography climate scientist Ralph Keeling has acknowledged that there may be issues with the numbers, but insists that once they are rectified, it won’t affect the overall conclusion.

The issues “do not invalidate the study’s methodology or the new insights into ocean biogeochemistry on which it is based,” Keeling said in an addendum to the original news release.

While there may be less cause for immediate alarm about ocean warming, the entire episode does raise concern over the validity of, and the scrutiny applied to, research that supports the scientific consensus compared with research that challenges it.

However, at the time of writing, only a handful of outlets have published news of the correction, compared with the multitude that shared the original, erroneous findings. Correction coverage just doesn’t generate clicks quite like alarmism, after all.

November 15, 2018 Posted by | Mainstream Media, Warmongering, Science and Pseudo-Science

The Migration of the Skeptic

qedcon | October 16, 2018

Naturalist Sir David Attenborough definitely presents this short documentary on the migration of the Skeptic. This spoof was originally used to open QED 2018.

Written by Matt White, Michael Marshall, and Mike Hall
Directed by Matt White
Edited by Deniz Kavalali
VFX by Joe Pavlo
Audio post-production by Offset Audio
Featuring Adam Diggle as the voice of Sir David Attenborough

November 12, 2018 Posted by | Science and Pseudo-Science, Timeless or most popular, Video

JORDAN PETERSON On Global Warming

Climatism | November 9, 2018

JORDAN PETERSON is a professor at the University of Toronto, a clinical psychologist and the author of the million-plus selling ’12 Rules for Life’, a Number 1 bestseller. He rose to international prominence in 2016, after criticising the Canadian government’s enactment of Bill C-16.

THE psychologist and internet celebrity has been touted as ‘the most influential public intellectual in the Western world right now’, with contentious views on gender and political correctness. He is a culture warrior who has no truck with “white privilege”, “cultural appropriation” and a range of other ideas associated with social justice movements.

November 10, 2018 Posted by | Science and Pseudo-Science, Timeless or most popular, Video

How the CDC Uses Fear to Increase Demand for Flu Vaccines

Collective Evolution | November 9, 2018

The CDC claims that its recommendation that everyone aged six months and up should get an annual flu shot is firmly grounded in science. The mainstream media reinforce this characterization by misinforming the public about what the science says.

A New York Times article from earlier this year, for example, in order to persuade readers to follow the CDC’s recommendation, cited scientific literature reviews of the prestigious Cochrane Collaboration to support its characterization of the influenza vaccine as both effective and safe. The Times claimed that the science showed that the vaccine represented “a big payoff in public health” and that harms from the vaccine were “almost nonexistent”.

What the Cochrane researchers actually concluded, however, was that their findings “seem to discourage the utilization of vaccination against influenza in healthy adults as a routine public health measure” (emphasis added). Furthermore, given the known serious harms associated with specific flu vaccines and the CDC’s recommendation that infants as young as six months get a flu shot despite an alarming lack of safety studies for children under two, they concluded that “large-scale studies assessing important outcomes, and directly comparing vaccine types are urgently required.”

The CDC also recommends the vaccine for pregnant women despite the total absence of randomized controlled trials assessing the safety of this practice for both expectant mother and unborn child. (This is all the more concerning given that multi-dose vials of the inactivated influenza vaccine contain mercury, a known neurotoxin that can cross both the placental and blood-brain barriers and accumulate in the brain.)

The Cochrane researchers also found “no evidence” to support the CDC’s assumptions that the vaccine reduces transmission of the virus or the risk of potentially deadly complications—the two primary justifications claimed by the CDC to support its recommendation.

The CDC nevertheless pushes the influenza vaccine by claiming that it prevents large numbers of hospitalizations and deaths from flu. To reinforce its message that everyone should get an annual flu shot, the CDC claims that hundreds of thousands of people are hospitalized and tens of thousands die each year from influenza. These numbers are generally relayed by the mainstream media as though representative of known cases of flu. The aforementioned New York Times article, for example, stated matter-of-factly that, of the 9 million to 36 million people whom the CDC estimates get the flu each year, “Somewhere between 140,000 and 710,000 of them require hospitalization, and 12,000 to 56,000 die each year.”

On September 27, the CDC claimed at a press conference that 80,000 people died from the flu during the 2017–2018 flu season, and the media parroted this number as though it were fact.

What is not being communicated to the public is that the CDC’s numbers do not represent known cases of influenza. They do not come directly from surveillance data, but are rather estimates based on controversial mathematical models that may greatly overestimate the numbers.

To put the matter into perspective, the average number of deaths each year for which the cause is actually attributed on death certificates to the influenza virus is little more than 1,000.

The consequence of the media parroting the CDC’s numbers as though uncontroversial is that the public is routinely misinformed about the impact of influenza on society and the ostensible benefits of the vaccine. Evidently, that’s just the way the CDC wants it, since the agency has also outlined a public relations strategy of using fear marketing to increase demand for flu shots.

The CDC’s “Problem” of “Growing Health Literacy”

Before looking at some of the problems with the CDC’s estimates, it’s useful to examine the mindset at the agency with respect to how CDC officials view their role in society. An instructive snapshot of this mindset was provided in a presentation by the CDC’s director of media relations on June 17, 2004, at a workshop for the Institute of Medicine (IOM).

In the presentation, the CDC outlined a “‘Recipe’ for Fostering Public Interest and High Vaccine Demand”. It called for encouraging medical experts and public health authorities to “state concern and alarm” about, and “predict dire outcomes” from, the flu season. To inspire the necessary fear, the CDC encouraged describing each season as “very severe”, “more severe than last or past years”, and “deadly”.

One problem for the CDC is the accurate view among healthy adults that they are not at high risk of serious complications from the flu. As the presentation noted, “achieving consensus by ‘fiat’ is difficult”—meaning that just because the CDC makes the recommendation doesn’t mean that people will actually follow it. Therefore it was necessary to cause “concern, anxiety, and worry” among young, healthy adults who regard the flu as an inconvenience rather than something to be terribly afraid of.

The larger conundrum for the CDC is the proliferation of information available to the public on the internet. As the CDC bluntly stated it, “Health literacy is a growing problem”.

In other words, the CDC considers it to be a problem that people are increasingly doing their own research and becoming more adept at educating themselves about health-related issues. And, as we have already seen, the CDC has very good reason to be concerned about people doing their own research into what the science actually tells us about vaccines.

One prominent way the CDC inspires the necessary fear, of course, is with its estimates of the numbers of people who are hospitalized or die each year from the flu.

The Problems with the CDC’s Estimates of Annual Flu Deaths

Among the relevant facts that are routinely not relayed to the public by the media when the CDC’s numbers are cited is that only about 7% to 15% of what are called “influenza-like illnesses” are actually caused by influenza viruses. In fact, there are over 200 known viruses that cause influenza-like illnesses, and to determine whether an illness was actually caused by the influenza virus requires laboratory testing—which isn’t usually done.

Furthermore, as the authors of a 2010 Cochrane review stated, “At best, vaccines may only be effective against influenza A and B, which represent about 10% of all circulating viruses” that are known to cause influenza-like symptoms. (That’s the same review, by the way, that the Times mischaracterized as having found the vaccine to be “a big payoff in public health”.)

While the CDC now uses a range of numbers to describe annual deaths attributed to influenza, it used to claim that on average “about 36,000 people per year in the United States die from influenza”. The CDC switched to using a range in response to criticism that the average was misleading because there is great variability from year to year and decade to decade. And while switching to the range did address that criticism, other serious problems remain.

One major problem with “the much publicized figure of 36,000”, as Peter Doshi observed in a 2005 BMJ article, was that it “is not an estimate of yearly flu deaths, as widely reported in both the lay and scientific press, but an estimate—generated by a model—of flu-associated death.”

Of course, as the media routinely remind us when it comes to the subject of vaccines and autism (but seem to forget when it comes to the CDC’s flu numbers), temporal association does not necessarily mean causation. Just because someone dies after an influenza infection does not mean that it was the flu that killed him. And, furthermore, many if not most people diagnosed with “the flu” may not have actually been infected with the influenza virus at all, given the large number of other viruses that cause the same symptoms and the general lack of lab confirmation.

The “36,000” number came from a 2003 CDC study published in JAMA that acknowledged the difficulty of estimating deaths attributable to influenza, given that most cases are not lab-confirmed. Yet, rather than acknowledging the likelihood that a substantial percentage of reported cases actually had nothing to do with the influenza virus, the CDC researchers treated the lack of lab confirmation as though it only meant that flu-related deaths must be significantly higher than the reported numbers.

The study authors pointed out that seasonal influenza is “associated with increased hospitalizations and mortality for many diagnoses”, including pneumonia, and they assumed that many cases attributed to other illnesses were actually caused by influenza. They therefore developed a mathematical model to estimate the number by instead using as their starting point all “respiratory and circulatory” deaths, which include all “pneumonia and influenza” deaths.

Of course, not all respiratory and circulatory deaths are caused by the influenza virus. Yet the CDC treats this number as “an upper bound”—as though it were possible that 100% of all respiratory and circulatory deaths occurring in a given flu season were caused by influenza. The CDC also treats the total number of pneumonia and influenza deaths as “a lower bound for deaths associated with influenza”. The CDC states on its website that reported pneumonia and influenza deaths “represent only a fraction of the total number of deaths from influenza”—as though all pneumonia deaths were caused by influenza!

The CDC certainly knows better. In fact, at the same time, the CDC contradictorily acknowledges that not all pneumonia and influenza deaths are flu-related; it has estimated that in an average year 2.1% of all respiratory and circulatory deaths and 8.5% of all pneumonia and influenza deaths are influenza-associated.

So how can the CDC maintain both (a) that 8.5% of pneumonia and influenza deaths are flu-related, and (b) that the combined total of all pneumonia and influenza deaths represents only a fraction of flu-caused deaths? How can both be true?

The answer is that the CDC simply assumes that influenza-associated deaths are so greatly underreported within the broader category of deaths coded under “respiratory and circulatory” that they dwarf all those coded under “pneumonia and influenza”.
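
To see how those two figures can coexist, here is a minimal arithmetic sketch. The 2.1% and 8.5% are the CDC estimates quoted above; the death totals are hypothetical round numbers chosen only for illustration.

```python
# Hypothetical round totals, for illustration only; real yearly totals vary.
rc_deaths = 1_200_000  # all "respiratory and circulatory" (R&C) deaths
pi_deaths = 60_000     # the "pneumonia and influenza" (P&I) subset of R&C

flu_total = 0.021 * rc_deaths  # CDC: 2.1% of R&C deaths flu-associated -> 25,200
flu_in_pi = 0.085 * pi_deaths  # CDC: 8.5% of P&I deaths flu-associated -> 5,100

# The remainder is attributed to flu even though the death certificates
# mention neither pneumonia nor influenza.
print(flu_total - flu_in_pi)   # 20,100, roughly four-fifths of the modeled total
```

On these (invented) totals, most of the modeled flu deaths come from certificates coded to other respiratory and circulatory causes, which is precisely the underreporting assumption described above.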

In his aforementioned BMJ article, Peter Doshi reasonably asked, “Are US flu death figures more PR than science?” As he put it, “US data on influenza deaths are a mess.” The CDC “acknowledges a difference between flu death and flu associated death yet uses the terms interchangeably. Additionally, there are significant statistical incompatibilities between official estimates and national vital statistics data. Compounding these problems is a marketing of fear—a CDC communications strategy in which medical experts ‘predict dire outcomes’ during flu seasons.”

Illustrating the problem, Doshi observed that for the year 2001, the total number of reported pneumonia and influenza deaths was 62,034. Yet, of those, less than one half of one percent were attributed to influenza. Furthermore, of the mere 257 cases blamed on the flu, only 7% were laboratory confirmed. That’s only 18 cases of lab confirmed influenza out of 62,034 pneumonia and influenza deaths—or just 0.03%, according to the CDC’s own National Center for Health Statistics (NCHS).
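
Doshi’s percentages follow directly from the NCHS figures he cites; a quick check of the arithmetic:

```python
pi_deaths_2001 = 62_034  # reported pneumonia and influenza deaths, 2001 (NCHS)
flu_coded = 257          # of those, deaths actually coded to influenza

print(flu_coded / pi_deaths_2001)        # ~0.004, less than half of one percent

lab_confirmed = round(flu_coded * 0.07)  # only 7% of flu-coded deaths were lab confirmed
print(lab_confirmed)                     # 18 cases
print(lab_confirmed / pi_deaths_2001)    # ~0.0003, i.e. about 0.03%
```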

Setting aside pneumonia and looking just at influenza-associated deaths from 1979 to 2002, the annual average according to the NCHS data was only 1,348.

The CDC’s mortality estimates would be compatible with the NCHS data, Doshi argued, “if about half of the deaths classed by the NCHS as pneumonia were actually flu initiated secondary pneumonias.” But the NCHS criteria themselves strongly indicated otherwise, stating that “Cause-of-death statistics are based solely on the underlying cause of death … defined by WHO as ‘the disease or injury which initiated the train of events leading directly to death.’”

The CDC researchers who authored the 2003 study acknowledged that underlying cause-of-death coding “represents the disease or injury that initiated the chain of morbid events that led directly to the death”—yet they fallaciously coupled pneumonia deaths with influenza deaths in their model anyway.

At the time Doshi was writing, the CDC was publicly claiming that each year “about 36,000 [Americans] die from flu”, and as seen with the example from the New York Times, the range of numbers is likewise presented as though representative of known cases of flu-caused deaths. Yet the lead author of that very CDC study, William Thompson of the CDC’s National Immunization Program, acknowledged that the number rather represented “a statistical association” that does not necessarily mean causation. In Thompson’s own words, “Based on modelling, we think it’s associated. I don’t know that we would say that it’s the underlying cause of death.” (Emphasis added.)

Of course, the CDC does say it’s the underlying cause of death in its disingenuous public relations messaging. As Doshi noted, Thompson’s acknowledgment is “incompatible” with the CDC’s “misrepresentation” of its flu deaths estimates. The CDC, Doshi further observed, was “working in manufacturers’ interest by conducting campaigns to increase flu vaccination” based on estimates that are “statistically biased”, including by “arbitrarily linking flu with pneumonia”.

More “Limitations” of the CDC’s Models

While the media present the CDC’s numbers as though uncontroversial, there is in fact “substantial controversy” surrounding flu death estimates, as a 2005 study published in the American Journal of Epidemiology noted. One problem is that the CDC’s models use virus surveillance data that “have not been made available in the public domain”, which means that its results are not reproducible. (As the journal Cell reminds us, “the reproducibility of science” is “a lynch pin of credibility”.) And there are otherwise “significant limitations” of the CDC’s models that potentially result in “spurious attribution of deaths to influenza.”

To illustrate, when Peter Doshi requested access to virus circulation data, the CDC refused to allow it unless he granted the CDC co-authorship of the study he was undertaking—which Doshi appropriately refused.

In the New York Review of Books, Helen Epstein has pointed out how the CDC’s dire warnings about the 2009 H1N1 “swine flu” never came to pass, as well as how “some experts maintain that the CDC’s studies overestimate influenza mortality, particularly among children.” While the number of confirmed H1N1-related child deaths was 371, the CDC’s claimed number was 1,271 or more. To arrive at its number, the CDC used a multiplier based on certain assumptions. One assumption is that some cases are missed either because lab confirmation wasn’t sought or because the children weren’t in a hospital when they died and so weren’t tested. Another is that a certain percentage of test results will be false negatives.

However, Epstein pointed out, “according to CDC guidelines at the time”, any child hospitalized with severe influenza symptoms should have been tested for H1N1. Furthermore, “deaths in children from infectious diseases are rare in the US, and even those who didn’t die in hospitals would almost certainly have been autopsied (and tested for H1N1)…. Also, the test is accurate and would have missed few cases. Because it’s unlikely that large numbers of actual cases of US child deaths from H1N1 were missed, the lab-confirmed count (371) is probably much closer to the modeled numbers … which are in any case impossible to verify.”
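
For scale, the multiplier implied by the two figures Epstein contrasts is easy to back out. (The CDC’s actual multiplier was constructed from the testing assumptions described above; this is just the reverse calculation.)

```python
confirmed_child_deaths = 371   # lab-confirmed H1N1 child deaths
modeled_child_deaths = 1_271   # the CDC's lowest modeled figure

# Each lab-confirmed death was counted roughly 3.4 times in the model.
print(modeled_child_deaths / confirmed_child_deaths)  # ~3.43
```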

As already indicated, another assumption the CDC makes is that excess mortality in winter is mostly attributable to influenza. A 2009 Slate article described this as among a number of “potential glitches” that make the CDC’s reported flu deaths the “‘least bad’ estimate”. Referring to earlier methods that associated flu deaths with wintertime deaths from all causes, the article observed that this risked blaming influenza for deaths from car accidents caused by icy roads. And while the updated method presented in the 2003 CDC study excluded such causes of death implausibly linked to flu, related problems remain.

As the aforementioned American Journal of Epidemiology study noted, the updated method “reduces, but does not eliminate, the potential for spurious correlation and spurious attribution of deaths to influenza.” Furthermore, “Methods based on seasonal pattern begin from the assumption that influenza is the major source of excess winter death.” The CDC’s models therefore still “are in danger of being confounded by other seasonal factors.” The authors also stated that they could not conclude from their own study “that influenza is a more important cause of winter mortality on an annual timescale than is cold weather.”

As a 2002 BMJ study stated, “Cold weather alone causes striking short term increases in mortality, mainly from thrombotic and respiratory disease. Non-thermal seasonal factors such as diet may also affect mortality.” (Emphasis added.) The study estimated that of annual excess winter deaths, only “2.4% were due to influenza either directly or indirectly.” It concluded that, “With influenza causing such a small proportion of excess winter deaths, measures to reduce cold stress offer the greatest opportunities to reduce current levels of winter mortality.”

CDC researchers themselves acknowledge that their models are “subject to some limitations.” In a 2009 study published in the American Journal of Public Health, CDC researchers admitted that “simply counting deaths for which influenza has been coded as the underlying cause on death certificates can lead to both over- and underestimates of the magnitude of influenza-associated mortality.” (Emphasis added.) Yet they offered no comment on how, then, their models account for the likelihood that many reported cases of “flu” had nothing whatsoever to do with the influenza virus. Evidently, this is because they don’t, as indicated by the CDC’s treatment of all influenza deaths plus pneumonia deaths as a “lower bound”.

For another illustration, since it takes two or three years before the data is available to be able to estimate flu hospitalizations and deaths by the usual means, the CDC has also developed a method to make preliminary estimates for a given year by “adjusting” the numbers of reported lab-confirmed cases from selected surveillance areas around the country. The “80,000” figure claimed for last season’s flu deaths is just such an estimate. The way the CDC “adjusts” the numbers is by multiplying the number of lab-confirmed cases by a certain amount, ostensibly “to correct for underreporting”. To determine the multiplier, the CDC makes a number of assumptions to estimate (a) the likelihood that a person hospitalized for any respiratory illness would be tested for influenza and (b) the likelihood that a person with influenza would test positive.

Once the CDC has its estimated hospitalization rate, it then multiplies that number by the ratio of deaths to hospitalizations to arrive at its estimated mortality rate. Thus, any overestimation of the hospitalization rate is also compounded into its estimated death rate.
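
A minimal sketch of that estimation chain as described here. All of the input numbers are invented for illustration; only the structure (confirmed counts scaled up by a multiplier, then scaled again by a deaths-to-hospitalizations ratio) is taken from the description above.

```python
confirmed_hosp = 20_000   # lab-confirmed flu hospitalizations (hypothetical)
multiplier = 6.0          # correction for untested/false-negative cases (hypothetical)
deaths_per_hosp = 0.08    # ratio of deaths to hospitalizations (hypothetical)

est_hosp = confirmed_hosp * multiplier    # 120,000 estimated hospitalizations
est_deaths = est_hosp * deaths_per_hosp   # 9,600 estimated deaths

# Bias carries straight through the chain: inflate the multiplier by 30%
# and the estimated deaths are inflated by exactly the same 30%.
biased_deaths = confirmed_hosp * (multiplier * 1.3) * deaths_per_hosp
print(biased_deaths / est_deaths)  # 1.3
```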

One obvious problem with this is the underlying assumption that the percentage of people who (a) are hospitalized for respiratory illness and have the flu is the same as (b) the percentage of those who are hospitalized for respiratory illness, are actually tested, and test positive. This implies that doctors are not more likely to seek lab confirmation for people who actually have influenza than they are for people whose respiratory symptoms are due to some other cause.

Assuming that doctors can do better than a pair of rolled dice at picking out patients with influenza, it further implies that doctors are no more likely to order a lab test for patients whom they suspect of having the flu than they are to order a lab test for patients whose respiratory symptoms they think are caused by something else.

The CDC’s assumption thus introduces a selection bias into its model that further calls into question the plausibility of its conclusions, as it is bound to result in overestimation. In a 2015 study published in PLoS One that detailed this method, CDC researchers acknowledged that, “If physicians were more likely to recognize influenza patients clinically and select those patients for testing, we may have over-estimated the magnitude of under-detection.” And that, of course, would result in an overestimation of both hospitalizations and deaths associated with influenza.
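
A small worked example of that selection bias, with every input invented: doctors test suspected flu patients more often, while the correction assumes a single uniform testing rate.

```python
patients = 10_000    # hospitalized respiratory patients (hypothetical)
flu_frac = 0.10      # 10% actually have influenza (hypothetical)
p_test_flu = 0.60    # testing rate for patients who really have flu (hypothetical)
p_test_other = 0.20  # testing rate for everyone else (hypothetical)
sensitivity = 0.90   # chance a true flu case tests positive (hypothetical)

true_flu = patients * flu_frac                   # 1,000 actual flu cases
confirmed = true_flu * p_test_flu * sensitivity  # 540 lab-confirmed cases

# The correction divides by one average testing rate for all patients:
avg_test_rate = flu_frac * p_test_flu + (1 - flu_frac) * p_test_other  # 0.24
estimate = confirmed / (avg_test_rate * sensitivity)                   # 2,500

print(estimate / true_flu)  # 2.5: targeted testing inflates the estimate 2.5-fold
```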

Caveats such as that, however, are not communicated to the general public by the CDC in its press releases or by the mainstream media so that people can make a truly informed choice about whether it’s worth the risk to get a flu shot.

Conclusion

In summary, to avoid underestimating influenza-associated hospitalizations and deaths, the CDC relies on models that instead appear to greatly overestimate the numbers due to the fallacious assumptions built into them. These numbers are then misrepresented to the public by both public health officials and the mainstream media as though uncontroversial and representative of known cases of influenza-caused illnesses and deaths from surveillance data. Consequently, the public is grossly misinformed about the societal disease burden from influenza and the ostensible benefit of the vaccine.

It is clear that the CDC does not see its mission as being to educate the public in order to be able to make an informed choice about vaccination. After all, that would be incompatible with its view that growing health literacy is a threat to its mission and an obstacle to be overcome. On the other hand, a misinformed populace aligns perfectly with the CDC’s stated goal of using fear marketing to generate more demand for the pharmaceutical industry’s influenza vaccine products.

This article is an adapted and expanded excerpt from part two of the author’s multi-part exposé on the influenza vaccine.

November 10, 2018 Posted by | Corruption, Deception, Science and Pseudo-Science

FDA ‘Ignoring The Evidence’ on Cellphones & Cancer

See Video report at Bitchute

RT America | November 2, 2018

A $25 million study on the effects of cellphone radiation on rats found clear evidence of tumors. But the Food and Drug Administration, which had requested the study in the first place, is casting doubt on its conclusions. RT America’s Dan Cohen has the details.

November 8, 2018 Posted by | Deception, Science and Pseudo-Science, Timeless or most popular, Video

News Media Gave Blanket Coverage To Flawed Climate Paper

Global Warming Policy Forum – 07/11/18

A week ago, we were told that climate change was worse than we thought. But the underlying science contains a major error.

Independent climate scientist Nicholas Lewis has uncovered a major error in a recent scientific paper that was given blanket coverage in the English-speaking media. The paper, written by a team led by Princeton oceanographer Laure Resplandy, claimed that the oceans have been warming faster than previously thought. It was announced in news outlets including the BBC, the New York Times, the Washington Post and Scientific American that this meant the Earth may warm even faster than currently estimated.

However, Lewis, who has authored several peer-reviewed papers on the question of climate sensitivity and has worked with some of the world’s leading climate scientists, has found that the warming trend in the Resplandy paper differs from that calculated from the underlying data included with the paper.
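
Lewis’s basic check is one anyone can reproduce: take the yearly values published with a paper and fit an ordinary least-squares trend to them. A generic sketch with made-up numbers (not the Resplandy series):

```python
import numpy as np

years = np.arange(1991, 2017)  # the 1991-2016 period covered by the paper
values = 1.1 * (years - 1991) + np.random.normal(0, 0.5, years.size)  # invented data

slope, intercept = np.polyfit(years, values, 1)  # least-squares linear trend
print(f"trend: {slope:.2f} units per year")
```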

“If you calculate the trend correctly, the warming rate is not worse than we thought – it’s very much in line with previous estimates,” says Lewis.

In fact, says Lewis, some of the other claims made in the paper and reported by the media are wrong too.

“Their claims about the effect of faster ocean warming on estimates of climate sensitivity (and hence future global warming) and carbon budgets are just incorrect anyway, but that’s a moot point now that we know about their calculation error”.

And now that the errors have been uncovered, Lewis points out that it is important that the record is corrected.

“The original findings of the Resplandy paper were given blanket coverage by the media, who rarely question hyped-up findings of this kind. Let’s hope some of them are willing to correct the record”.

November 8, 2018 Posted by | Mainstream Media, Warmongering, Science and Pseudo-Science

Discovery Of Massive Volcanic CO2 Emissions Puts Damper On Global Warming Theory

By James Edward Kamis | Climate Change Dispatch | November 6, 2018

Recent research shows that the volume of volcanic CO2 currently being emitted into Earth’s atmosphere is far greater than previously calculated, challenging the validity of the man-made global warming theory.

Figure 1.) Volcanic gas emissions break through an overlying fractured and partially melted glacial ice sheet. (Image credits: Christina Neal, AVO/USGS)

The cornerstone principle of the global warming theory, anthropogenic global warming (AGW), is built on the premise that significant increases in modern-era human-induced CO2 emissions have acted to unnaturally warm Earth’s atmosphere. This warmed atmosphere then directly, or in some cases indirectly, fuels anomalous environmental disasters such as ocean warming, alteration of ocean chemistry, polar ice sheet melting, global sea level rise, coral bleaching and, most importantly, dramatic changes in climate.

There are numerous major problems with the AGW principle.

Identification of Volcanic vs. Man-made CO2

Natural volcanic and man-made CO2 emissions have the exact same and very distinctive carbon isotopic fingerprint.

It is therefore scientifically impossible to distinguish the difference between volcanic CO2 and human-induced CO2 from the burning of fossil fuels (see here).

This major problem with the AGW principle has been rationalized away by consensus climate scientists who insist, based on supposedly reliable research, that volcanic emissions are minuscule in comparison to human-induced CO2 emissions (Gerlach 1991).

Terrence Gerlach’s volcanic CO2 calculation was based on just seven actively erupting land volcanoes and three actively erupting ocean floor hydrothermal vents (seafloor hot geysers).

Utilizing gas emission data from this very limited number of volcanic features, Gerlach estimated that the volume of natural volcanic CO2 emissions is 100 to 150 times less than the volume of man-made CO2 emissions from the burning of fossil fuels and therefore of no consequence.

To put this calculation process into perspective, the Earth is home to 1,500 land volcanoes and 900,000 seafloor volcanoes/hydrothermal vents.

By sampling just an extremely small percentage of these volcanic features, it is difficult to imagine that the calculation could be correct, especially knowing that volcanic activity varies greatly from area to area, volcano to volcano, and through time. Utilizing just 0.001 percent (10/901,500) of Earth’s volcanic features to calculate volcanic CO2 emissions does not inspire confidence in the resulting value.
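
The sampling fraction follows directly from the counts the author gives:

```python
sampled = 7 + 3          # 7 erupting land volcanoes + 3 hydrothermal vents (Gerlach)
total = 1_500 + 900_000  # known land volcanoes + seafloor volcanoes/vents

print(sampled / total * 100)  # ~0.0011% of Earth's volcanic features
```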

Non-Erupting Volcanoes Can Emit Massive Amounts of CO2 into Earth’s Atmosphere

Recent geological research by the University of Leeds and others proves that non-erupting volcanoes can emit massive amounts of CO2 into Earth’s atmosphere and oceans. The Gerlach calculation and all follow-up calculations utilized volcanic CO2 rates from actively erupting volcanoes.

Lost in the numerous recent media articles concerning when, or if, Iceland’s Katla Volcano will erupt is the discovery that this non-erupting subglacial volcano is currently emitting staggering amounts of CO2 into Earth’s atmosphere!

Researchers from the University of Leeds who studied the Katla Volcano said this:

“We discovered that Katla volcano in Iceland is a globally important source of atmospheric carbon dioxide (CO2) in spite of being previously assumed to be a minor gas emitter. Volcanoes are a key natural source of atmospheric CO2 but estimates of the total global amount of CO2 that volcanoes emit are based on only a small number of active volcanoes. Very few volcanoes which are covered by glacial ice have been measured for gas emissions, probably because they tend to be difficult to access and often do not have obvious degassing vents. Through high‐precision airborne measurements and atmospheric dispersion modeling, we show that Katla, a highly hazardous subglacial volcano which last erupted 100 years ago, is one of the largest volcanic sources of CO2 on Earth, releasing up to 5% of total global volcanic emissions. This is significant in the context of a growing awareness that natural CO2 sources have to be more accurately quantified in climate assessments and we recommend urgent investigations of other subglacial volcanoes worldwide.”(see here)

The Number of Volcanoes Emitting CO2 into the Atmosphere at Any One Time 

The calculation of the total yearly volume of volcanic CO2 emitted into the atmosphere is based on the presumption that very few volcanoes are erupting at any one time.

Scientists from various worldwide volcano research institutions, most notably the United States Geological Survey, have estimated this number to be 20.

This very low number has been challenged by many scientists including those at NASA.

A multinational team led by NASA has initiated a high-resolution satellite CO2 monitoring project (see here). This project is focused on determining how many geological features are emitting CO2 at any one time.

This project may eventually give scientists a better idea of how many land volcanoes are emitting CO2 at any one time.

However, it is doubtful the project will properly record ocean CO2 emissions from Earth’s 900,000 deep-ocean-floor volcanic features, which are very difficult to monitor.

In any case, this project is certainly a step forward towards achieving a better understanding of the climate influence of volcanic CO2 emissions.

The Amount of CO2 and Heat Infused into Earth’s Oceans by Seafloor Geological Features

About 71% of Earth’s surface is covered by oceans, making it a water, not land, planet. For many years now, scientists have contended that the nearly one million geological features present in these vast ocean regions have played a minimal role in heating and chemically charging ocean seawater, maintaining instead that man-made atmospheric CO2 is the root cause of changes to our oceans.

Recent research has shown that these contentions are far from proven. To the contrary, it has become clear that geological heat flow, and chemically charged heated fluid flow, into our oceans is far more influential than previously thought and possibly the root cause of changes to our oceans.

One example is that geological features are warming Earth’s oceans and causing El Niños and La Niñas (see here, here, and here). Warmed seawater is not capable of holding as much CO2 as cold water. So the geological warming of seawater indirectly leads to a large amount of CO2 being released from the oceans and emitted into the atmosphere.

Recent research shows that seafloor geological features also directly emit large amounts of CO2 into our oceans and atmosphere (see here, here, here, and Figure 2).

In summary, the volume of volcanic CO2 being emitted into the Earth’s atmosphere has not been accurately assessed.

Numerous research studies and articles conducted/written by qualified scientists concur with this contention (see here, here, and here).

In a geological time frame, Earth has gone through many periods of increased volcanism. These volcanic periods resulted in major plant and animal extinction events (see here, here, and here), the end of glacial eras (see here), and the dramatic alteration of Earth’s climate (see here).

All indications are that Earth is currently experiencing another period of strong volcanic activity which is acting to infuse CO2 into our atmosphere, thereby challenging the validity of the global warming theory.

Clearly, it’s time to put on hold all environmental action plans based on the cornerstone AGW principle of the global warming theory until additional geological CO2 emission research is conducted.


James Edward Kamis is a retired professional Geologist with 42 years of experience, a B.S. in Geology from Northern Illinois University (1973), an M.S. in geology from Idaho State University (1977), and a longtime member of AAPG who has always been fascinated by the connection between Geology and Climate. More than 14 years of research/observation have convinced him that the Earth’s Heat Flow Engine, which drives the outer crustal plates, is an important driver of the Earth’s climate as per his Plate Climatology Theory.

November 7, 2018 Posted by | Science and Pseudo-Science, Timeless or most popular

Robert Faurisson and the Study of the Past

By Gilad Atzmon | October 23, 2018

The history of ideas provides us with the names of those few men and women who challenged the boundaries of tolerance. Professor Robert Faurisson was one such man. Faurisson, who died last Sunday at age 89, was a French academic who didn’t believe in the validity of parts of the Holocaust narrative. He argued that gas chambers in Auschwitz were the “biggest lie of the 20th century,” and contended that deported Jews had died of disease and malnutrition. Faurisson also questioned the authenticity of the Diary of Anne Frank many years before the Swiss foundation that holds the copyright to the famous diary “alerted publishers that her father (Otto Frank) is not only the ‘editor’ but also legally the ‘co-author’ of the celebrated book” (NY Times).

In the France of the late 1960s and 1970s, Faurisson had reason to believe that his maverick attitude toward the past would receive a kosher pass. He was wrong. Faurisson may have failed to grasp the role of the Holocaust in contemporary Jewish politics and culture. And he did not grasp that Jewish power is literally the power to silence opposition to Jewish power.

In 1990 France made Holocaust revisionism into the crime of history denial. Faurisson was repeatedly prosecuted, beaten and fined for his writings. He was dismissed from his academic post at Lyon University in 1991.

I am bothered by the question of why Jews and others attached to their politics are desperate to restrict the story of their past. This question extends far beyond the Holocaust. Israel has enacted a law that bans discussion of the Nakba – the racially motivated ethnic cleansing of the Palestinian people that occurred a mere three years after the liberation of Auschwitz. Similarly, exploring the role of Jews in the slave trade will cost you your job or lead to your expulsion from the Labour Party. My attempt to analyse the true nature of the Yiddish-speaking International Brigade in the 1936 Spanish Civil War outraged some of my Jewish ‘progressive’ friends.

Jean-François Lyotard addressed this question. History may claim to relate what actually occurred, but what it does more often is operate to conceal our shame. The task of an authentic historian is, according to Lyotard, similar to that of the psychoanalyst. It is all about removing layers of shame, concealment and suppression to try to uncover the truth.

It was the work of Faurisson that helped me to define the historical endeavour in philosophical terms. I define history as the attempt to narrate the past as we move along. To deal with history for real, is to continually re-visit and revise the past in light of our cultural, social and ideological changes. For instance, the 1948 Nakba came to be thought of in terms of ethnic cleansing in the early 2000s when the notion of ‘ethnic cleansing’ entered our vocabulary (and our way of understanding a conflict) following the crisis in Kosovo. The real historian reevaluates the past and embraces adjustments that place our understanding of that past in line with our contemporaneous reality and terminology.

Professor Faurisson and the controversy around his work illuminate the distinction between real history and religion. While history is a vibrant, dynamic matter subject to constant ‘revision,’ the religious approach to the past is limited to the production of a rigid, unchanging chronicle of events. Authentic history invokes ethical thinking to examine the past in light of the present and vice versa; religious history often operates by denying or rejecting increasing ethical insight – it judges actions and events according to set, predefined parameters. The question at stake is not what happened in the past but the freedom to research and evaluate the past without being threatened by ‘history laws.’ In the same manner that I support ‘progress’ in cancer research, although I do not produce scholarly comments on related scientific findings, I support the past being continually re-examined, although I offer no judgment of any kind regarding the validity of those historical findings. For history to be a valid and ethical universal pursuit, history laws must be abolished.

In 2014 I met Robert Faurisson and discussed with him different questions about the meaning of history and what the past meant to him.

https://dailymotion.com/video/x2e7359

October 23, 2018 Posted by | Full Spectrum Dominance, Science and Pseudo-Science, Timeless or most popular, Video

Cut Emissions? Who, Me?

By Paul Homewood | Not A Lot Of People Know That | October 22, 2018

The IPCC says we have got to start cutting emissions radically and immediately, but the rest of the world is not listening!

1) Australia rejects UN call to phase out coal

Australia has rejected a call by scientists to phase out coal use by 2050 to prevent the world overshooting targets in the Paris Climate Change agreement with potentially disastrous consequences.

The world’s biggest coal exporter on Tuesday said it would be “irresponsible” to comply with the recommendation by the UN’s Intergovernmental Panel on Climate Change (IPCC) to stop using coal to generate electricity.

Canberra also reiterated its priority is to cut domestic electricity prices rather than curb greenhouse gas emissions, which have risen for four consecutive years.

“To say that it [coal] has to be phased out by 2050 is drawing a very long bow,” said Melissa Price, Australia’s environment minister, who previously worked in the mining industry.

“I just don’t know how you could say by 2050 that you’re not going to have the technology that’s going to enable good, clean technology when it comes to coal. That would be irresponsible of us.” … https://www.ft.com/content/326d7228-cb83-11e8-b276-b9069bde0956

2) Japan Will Defy Calls By The IPCC To Phase Out Coal By Mid Century

Japan’s ambassador to Australia has confirmed Tokyo will defy calls by the Intergovernmental Panel on Climate Change to phase out coal by mid-century as part of a scientific appeal to limit global temperature increases to 1.5C.

Sumio Kusaka told The Australian that Japan would consider “all practical ways to further advance decarbonisation” but would need to bolster coal supply in the immediate future. He said Japanese plans to reduce reliance on fossil fuels in line with its international commitments would see a greater focus on nuclear energy, a form of power prohibited in Australia since 1998.

In recent weeks, Tony Abbott and Ziggy Switkowski, former chair of the Australian Nuclear Science and Technology Organisation, have called for the prohibition on nuclear power to be lifted to provide for the arrival of small modular reactors that can power towns of 100,000 people.

“I am aware the recent IPCC report contains some firm recommendations in relation to coal,” Mr Kusaka told The Australian.

“However, Japan is a country with very limited resources of its own, and bearing in mind our energy security requirements, it would be difficult for us to eliminate coal-fired power altogether.

“With a view to 2050, we are also considering all practical ways to further advance decarbonisation. In relation to this, some of the technologies we are looking at include renewable energy, nuclear energy and carbon capture and storage.”

Mr Kusaka said Tokyo would continue to buy coal from Australia to secure its energy needs into the future. Japan was the largest importer of Australian thermal coal last year. – https://www.thegwpf.com/japan-will-defy-calls-by-the-ipcc-to-phase-out-coal-by-mid-century/

3) China To Speed Up End Of Green Energy Subsidies

SHANGHAI (Reuters) – China will speed up efforts to ensure its wind and solar power sectors can compete without subsidies and achieve “grid price parity” with traditional energy sources like coal, according to new draft guidelines issued by the energy regulator.

As it tries to ease its dependence on polluting fossil fuels, China has encouraged renewable manufacturers and developers to drive down costs through technological innovations and economies of scale.

The country aims to phase out power generation subsidies, which have become an increasing burden on the state.

The guidelines said some regions with cost and market advantages had already “basically achieved price parity” with clean coal-fired power and no longer required subsidies, and others should learn from their experiences.

They also urged local transmission grid companies to provide more support for subsidy-free projects and ensure they have the capacity to distribute all the power generated by wind and solar plants…

China’s solar sector is still reeling from a decision to cut subsidies and cap new capacity at 30 gigawatts (GW) this year, down from a record 53 GW in 2017, with the government concerned about overcapacity and a growing subsidy backlog.

According to the NEA, the government owed around 120 billion yuan ($17.46 billion) in subsidies to solar plants by the middle of this year.

4) Germany’s Merkel Promises New Law To Ward Off Diesel Driving Bans (And To Save Her Floundering Government)

BERLIN (Reuters) – German Chancellor Angela Merkel, campaigning for her Christian Democrats (CDU) to retain control of the crucial state of Hesse in next Sunday’s election, promised legislation to ward off the threat of air pollution leading to driving bans.

Speaking at a news conference on Sunday evening, Merkel said it would be disproportionate to ban dirty diesel cars from the road in places like Frankfurt, Hesse’s largest city, where nitrogen emissions limits were only marginally exceeded.

Following her allies’ disastrous showing in Bavaria’s regional elections last week, Merkel faces murmurs of dissent within her party. Defeat in the state to the resurgent Greens could prove fatal to her premiership.

October 22, 2018 Posted by | Economics, Science and Pseudo-Science

Finnish Deaf Demand State Apology, Compensation for Decades of Sterilization

Sputnik – October 22, 2018

For decades, the Finnish state has run a eugenics-like program that pressured an unknown number of deaf women to undergo sterilization before marriage and forced pregnant women to get abortions, a topic that largely remains taboo even today.

Members of the Finnish sign language society have argued that the state should take responsibility and acknowledge the abuses and encroachments on private life committed over the course of decades.

“(Forced) sterilization is a violent act and a serious human rights violation that has been tacitly accepted,” Maija Koivisto, a teacher at the Deaf Folk High School, told the daily newspaper Hufvudstadsbladet, in her call for a reconciliation process.

According to the Marriage Act of 1929, the deaf were not allowed to marry each other without special permission from the president. This law remained in force until 1969. According to Koivisto, many deaf women were slapped with an ultimatum: get sterilized or forget about marriage.

However, it’s now being acknowledged that some doctors continued to recommend sterilization to patients for several more decades. “Until now, I have assumed that the sterilizations continued until the 1950s and 1960s. But I have heard of a case in the 1990s when a doctor suggested sterilization for his deaf patient,” Koivisto told Hufvudstadsbladet.

According to Koivisto, the church may have had a role in forcible sterilizations, something that has not been talked about so much. In Deaf magazine, two women testified that church staff had exerted pressure on them to get sterilized.

At present, no exact data are available on how many deaf women were sterilized in Finland, a gap Koivisto intends to rectify. The Sterilization Act of 1935 led to devastating consequences for at least 7,530 Finnish women.

Koivisto suggested that many circumvented the marriage laws by becoming pregnant, thus forcing priests to wed them. Nevertheless, some had to agree to sterilization after that. Others chose to ‘live in sin’; cohabiting and giving birth to children out of wedlock was considered unacceptable at that time.

Koivisto ventured that the topic of sterilization has long been a taboo due to society’s attitude involving shame. Additionally, sterilized women were often seen as “whores” as they could have sex without having to worry about getting pregnant.

Koivisto noted a general tendency to disregard the needs of the deaf in the past. In Finland, sign language was forbidden in schools during the epoch of “oralism” between 1880 and 1970, when deaf children were encouraged to read lips and articulate. According to Koivisto, this matter may be gender-related, as most politicians and all priests at that time were men.

According to Koivisto, the Finnish state should give the deaf victims financial compensation for the abuse.

“I think the state should promise that we can participate in all decisions that concern us and will work to improve the status of the sign language. The state should also grant funds for investigations within the deaf community, for instance for therapeutic purposes,” she added.

Previously, Finland officially apologized for the mistreatment of children at orphanages and boarding schools. Last autumn, it was decided to form a Truth and Reconciliation Commission to gather information about the forced “Finnization” and discrimination that the Sami people have suffered. The Commission received over $1.7 million from the state budget.

Neighboring Sweden sterilized almost 63,000 people between 1935 and 1975, but later apologized and compensated the victims in 1997.

This year, Japanese victims of a state-run sterilization program that targeted tens of thousands of people to prevent the birth of “inferior descendants” demanded an apology from the state.

READ MORE:

Sweden to Make Peace With Forced Sterilization Victims Through Indemnities

October 22, 2018 Posted by | Civil Liberties, Science and Pseudo-Science, Timeless or most popular

Mysterious IPCC Expertise

The IPCC publishes the citizenship and gender of its authors – but says nothing about their scientific expertise

By Donna Laframboise | Big Picture News | October 17, 2018

The Intergovernmental Panel on Climate Change (IPCC) claims to be a scientific organization. But it’s really a political one.

An obvious tell is how it describes its personnel. In the old days, IPCC reports listed people according to their role and their country. Matters have improved since then.

Today, the IPCC gives us six data points about its personnel rather than three. A webpage associated with its latest report tells us each individual’s:

  1. name
  2. IPCC role (coordinating lead author, lead author, review editor)
  3. gender
  4. country of residence
  5. citizenship
  6. institutional affiliation

But this only looks like progress. In the real world, the additional info is irrelevant. Science doesn’t care where someone lives or what citizenship they hold. Science doesn’t care if they’re a man or a woman.

If the IPCC is a panel of experts, the critical issue is: What is each of these people an expert in? More than 30 years after its founding, the IPCC still thinks it doesn’t need to talk about this.

For the UN bureaucrats who run the show, some things are important. Some are not. The nature of an author’s scientific expertise clearly isn’t a burning issue. But lots of attention is being paid to checking diversity boxes.

October 17, 2018 Posted by | Science and Pseudo-Science

The Dark Story Behind Global Warming aka Climate Change

By F. William Engdahl – New Eastern Outlook – 16.10.2018

The recent UN global warming conference under the auspices of the deceptively-named Intergovernmental Panel on Climate Change (IPCC) concluded its meeting in South Korea discussing how to drastically limit global temperature rise. Mainstream media are predictably retailing various panic scenarios “predicting” catastrophic climate change because of man-made emissions of Greenhouse Gases, especially CO2, if drastic changes in our lifestyle are not urgently undertaken. There is only one thing wrong with all that. It’s based on fake science and corrupted climate modelers who have by now reaped [many] billions in government research grants to buttress the arguments for radical change in our standard of living. We might casually ask, “What’s the point?” The answer is not positive.

The South Korea meeting of the UN IPCC discussed measures needed, according to their computer models, to limit global temperature rise to below 1.5 degrees Centigrade above levels of the pre-industrial era. One of the panel members and authors of the latest IPCC Special Report on Global Warming, Drew Shindell of Duke University, told the press that meeting the arbitrary 1.5 degree target will require world CO2 emissions to drop by a staggering 40% in the next 12 years. The IPCC calls for a draconian “zero net emissions” of CO2 by 2050. That would mean a complete ban on gas and diesel engines for cars and trucks, no coal power plants, and the transformation of world agriculture to burning food as biofuels. As Shindell modestly put it, “These are huge, huge shifts.”

The new IPCC report, SR15, declares that global warming of 1.5°C will “probably” bring species extinction, weather extremes and risks to food supply, health and economic growth. To avoid this, the IPCC estimates that required energy investment alone will be $2.4 trillion per year. Could this explain the interest of major global banks, especially in the City of London, in pushing the Global Warming card?

This scenario assumes an even more incredible dimension as it is generated by fake science and doctored data from a tight-knit group of climate scientists internationally who have so polarized scientific discourse that they label fellow scientists who try to argue not as mere global warming skeptics, but as “Climate Change deniers.” What does that bit of neuro-linguistic programming suggest? Holocaust deniers? Talk about how to kill legitimate scientific debate, the essence of true science. Recently the head of the UN IPCC proclaimed, “The debate over the science of climate change is well and truly over.”

What the UN panel chose to ignore was the fact that the debate is anything but "over." The Global Warming Petition Project, signed by over 31,000 American scientists, states: "There is no convincing scientific evidence that human release of carbon dioxide, methane, or other greenhouse gasses is causing or will, in the foreseeable future, cause catastrophic heating of the Earth's atmosphere and disruption of the Earth's climate. Moreover, there is substantial scientific evidence that increases in atmospheric carbon dioxide produce many beneficial effects upon the natural plant and animal environments of the Earth."

‘Chicken Little’

Most interesting about the dire warnings of global catastrophe, unless dramatic changes to our living standards are urgently undertaken, is that they always rest on frightening predictions about the future. When the "tipping point" of supposed irreversibility passes with no evident catastrophe, a new future deadline is simply invented.

In 1982 Mostafa Tolba, executive director of the UN Environment Program (UNEP), warned that the "world faces an ecological disaster as final as nuclear war within a couple of decades unless governments act now." He predicted that a lack of action would bring "by the turn of the century, an environmental catastrophe which will witness devastation as complete, as irreversible as any nuclear holocaust." In 1989 Noel Brown of UNEP said entire nations could be wiped off the face of the earth by rising sea levels if the global warming trend were not reversed by the year 2000. James Hansen, a key figure in the doomsday scenarios, declared at that time that 350 ppm of CO2 was the upper limit "to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted." Rajendra Pachauri, then chief of the UN IPCC, declared that 2012 was the climate deadline by which it was imperative to act: "If there's no action before 2012, that's too late." Today the measured level is 414 ppm.

As UK scientist Philip Stott notes, “In essence, the Earth has been given a 10-year survival warning regularly for the last fifty or so years. …Our post-modern period of climate change angst can probably be traced back to the late-1960s… By 1973, and the ‘global cooling’ scare, it was in full swing, with predictions of the imminent collapse of the world within ten to twenty years…Environmentalists were warning that, by the year 2000, the population of the US would have fallen to only 22 million. In 1987, the scare abruptly changed to ‘global warming’, and the IPCC (the Intergovernmental Panel on Climate Change) was established (1988)…”

Flawed Data

A central flaw in the computer models cited by the IPCC is that they are purely theoretical constructs, not observations. The hypothesis depends entirely on computer models generating scenarios of the future, with no empirical record that can verify either the models or their predictions. As one scientific study concluded, "The computer climate models upon which 'human-caused global warming' is based have substantial uncertainties and are markedly unreliable. This is not surprising, since the climate is a coupled, non-linear dynamical system. It is very complex." Coupled refers to the phenomenon that the oceans cause changes in the atmosphere, and the atmosphere in turn affects the oceans; both are complexly related to solar cycles. No model predicting global warming or 2030 "tipping points" is able, or even tries, to integrate the most profound influence on Earth's climate and weather: the activity of the sun and its eruption cycles, which shape ocean currents, jet stream activity, El Niños and our daily weather.
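To see why "coupled, non-linear" matters, here is a toy sketch using the classic Lorenz equations of 1963. This is emphatically not a climate model; it only illustrates the study's point that such systems are exquisitely sensitive, so two runs differing by one part in a million end up in entirely different states.

```python
# Toy coupled, non-linear dynamical system: the Lorenz equations (1963).
# NOT a climate model -- just an illustration of sensitivity to initial
# conditions in coupled, non-linear systems.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Each variable's rate of change depends on the others: "coupled".
    dx = sigma * (y - x)
    dy = x * (rho - z) - y   # the x*z product makes the system non-linear
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)     # perturbed by one part in a million
for _ in range(3000):        # simple Euler stepping, 30 time units
    a = lorenz_step(*a)
    b = lorenz_step(*b)

print("run A ends at:", tuple(round(v, 2) for v in a))
print("run B ends at:", tuple(round(v, 2) for v in b))
# The two trajectories bear no resemblance to each other by the end.
```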

John McLean, an Australian IT expert and independent researcher, recently carried out a detailed audit of the HadCRUT4 dataset, the primary dataset used by the Intergovernmental Panel on Climate Change (IPCC) to make its dramatic claims about "man-made global warming" and to justify its demands for trillions of dollars to be spent on "combating climate change." McLean points to egregious errors in HadCRUT4. He notes, "It's very careless and amateur. About the standard of a first-year university student." Among the errors, he cites places where temperature "averages were calculated from next to no information. For two years, the temperatures over land in the Southern Hemisphere were estimated from just one site in Indonesia." In another place he found that for the Caribbean island of St Kitts, the temperature was recorded as 0°C for a whole month, on two occasions. The HadCRUT4 dataset is a joint production of the UK Met Office's Hadley Centre and the Climatic Research Unit at the University of East Anglia, the same group exposed several years ago in the notorious Climategate scandal of faked data and emails deleted to hide it. Mainstream media promptly buried the story, turning attention instead to "who illegally hacked East Anglia emails."
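McLean's single-site complaint is easy to grasp with made-up numbers (the values below are hypothetical, for illustration only, not HadCRUT4 data): an "average" for a whole region built from one station simply reproduces that station's local climate.

```python
# Hypothetical station temperatures (illustration only, not HadCRUT4 data).
stations = {"Jakarta": 27.1, "Darwin": 29.3, "Perth": 18.5, "Punta Arenas": 6.2}

true_mean = sum(stations.values()) / len(stations)   # all stations weighted equally
one_station_mean = stations["Jakarta"]               # "average" from a single site

print(f"All stations: {true_mean:.1f} C; Jakarta alone: {one_station_mean:.1f} C")
# -> All stations: 20.3 C; Jakarta alone: 27.1 C
```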

Astonishingly enough, a little basic research shows that the IPCC never carried out a true scientific inquiry into the possible causes of change in the Earth's climate. Man-made sources of change were arbitrarily asserted, and the game was on.

Malthusian Maurice Strong

Few are aware, however, of the political and even geopolitical origins of Global Warming theory. How did this come about? So-called Climate Change, aka Global Warming, is a neo-Malthusian deindustrialization agenda originally developed by circles around the Rockefeller family in the early 1970s to prevent the rise of independent industrial rivals, much as Trump's trade wars aim to do today. In my book Myths, Lies and Oil Wars, I detail how the highly influential Rockefeller group also backed the creation of the Club of Rome, the Aspen Institute, the Worldwatch Institute and the MIT Limits to Growth report. A key early organizer of the Rockefeller 'zero growth' agenda in the early 1970s was David Rockefeller's longtime friend, a Canadian oilman named Maurice Strong. Strong was one of the early propagators of the scientifically unfounded theory that man-made emissions from transportation vehicles, coal plants and agriculture were causing a dramatic and accelerating global temperature rise that threatens civilization: so-called Global Warming.

As chairman of the 1972 UN Stockholm Conference on the Human Environment, Strong promoted an agenda of population reduction and lowered living standards around the world to "save the environment." Years later, Strong restated his radical ecologist stance: "Isn't the only hope for the planet that the industrialized civilizations collapse? Isn't it our responsibility to bring that about?" Dr Alexander King, co-founder of the Rockefeller-tied Club of Rome, admitted the fraud in his book The First Global Revolution: "In searching for a new enemy to unite us, we came up with the idea that pollution, the threat of global warming, water shortages, famine and the like would fit the bill… All these dangers are caused by human intervention… The real enemy, then, is humanity itself."

Please reread that, and let it sink in. Humanity itself, and not the 147 global banks and multinationals that de facto determine today's environment, bears the responsibility.

Following the Earth Summit, Strong was named Assistant Secretary-General of the United Nations and chief policy advisor to Kofi Annan. He was the key architect of the 1997-2005 Kyoto Protocol, which declared that man-made Global Warming, according to "consensus," was real and that it was "extremely likely" that man-made CO2 emissions had predominantly caused it. Back in 1988, Strong had been key in the creation of the UN IPCC, and later of the UN Framework Convention on Climate Change at the Rio Earth Summit, which he chaired and which approved his globalist UN Agenda 21.

The UN IPCC and its Global Warming agenda constitute a political, not a scientific, project. Its latest report is, like the previous ones, based on fake science and outright fraud. In a recent speech, MIT Professor Richard S. Lindzen criticized politicians and activists who claim "the science is settled" and demand "unprecedented changes in all aspects of society." He noted that it is wholly implausible for a "multifactor system" as complex as the climate to be summarized by just one variable, global mean temperature change, and to be primarily controlled by just a 1-2 per cent variance in the energy budget due to CO2. Lindzen described how "an implausible conjecture backed by false evidence, repeated incessantly, has become 'knowledge,' used to promote the overturn of industrial civilization." Our world does indeed need a "staggering transformation," but one that promotes the health and stability of the human species instead.

October 16, 2018 Posted by | Corruption, Fake News, Mainstream Media, Warmongering, Science and Pseudo-Science | | Leave a comment