Aletho News

ΑΛΗΘΩΣ

The CDC’s Influenza Math Doesn’t Add Up: Exaggerating the Death Toll to Sell Flu Shots

By Robert F. Kennedy Jr. | Collective Evolution | October 10, 2018

Every year at about this time, public health officials and their media megaphones start up the drumbeat to encourage everyone (including six-month-old infants, pregnant women and the infirm elderly) to get a flu shot. Never mind that more often than not the vaccines don’t work, and sometimes even increase the risk of getting sick.

To buttress their alarmist message for 2018-2019, representatives from the Centers for Disease Control and Prevention (CDC) and other health agencies held a press conference and issued a press release on September 27, citing a particularly “record-breaking” (though unsubstantiated) 80,000 flu deaths last year. Having “medical experts and public health authorities publicly… state concern and alarm (and predict dire outcomes)” is part and parcel of the CDC’s documented playbook for “fostering public interest and high… demand” for flu shots. CDC’s media relations experts frankly admit that “framing” the current flu season as “more severe than last or past years” or more “deadly” is a highly effective strategy for garnering strong interest and attention from both the media and the public.

Peter Doshi (associate editor at The BMJ and an MIT graduate) has criticized the CDC’s “aggressive” promotion of flu shots, noting that although the annual public health campaigns deliver a “who-in-their-right-mind-could-possibly-disagree message,” the “rhetoric of science” trotted out each year by public health officials has a “shaky scientific basis.” Viewed within the context of Doshi’s remarks, the CDC’s high-flying flu numbers for 2017-2018 raise a number of questions. If accurate, 80,000 deaths would represent an enormous (and mystifying) one-year jump—tens of thousands more flu deaths compared to the already inflated numbers presented for 2016 (and every prior year). Moreover, assuming a roughly six-month season for peak flu activity, the 80,000 figure would translate to an average of over 13,300 deaths per month—something that no newspaper last year came close to reporting.
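The per-month figure is straightforward to check; a minimal sketch, assuming the roughly six-month peak season stated above:

```python
# Rough check of the per-month figure, assuming a six-month peak flu season.
total_deaths = 80_000    # CDC's claimed 2017-2018 flu death toll
season_months = 6        # assumed length of peak flu activity
per_month = total_deaths / season_months
print(round(per_month))  # prints 13333, i.e. "over 13,300 deaths per month"
```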

The CDC’s statistics are impervious to independent verification because they remain, thus far, unpublished—despite the agency’s pledge on its website to base its public health pronouncements on high-quality data derived openly and objectively. Could the CDC’s disappointment with influenza vaccination coverage—which lags far behind the agency’s target of 80%—have anything to do with the opacity of the flu data being used to peddle the unpopular and ineffective vaccines?

Fudging facts

There are a variety of reasons to question the precision with which the CDC likes to imbue its flu statistics. First, although the CDC states that it conducts influenza mortality surveillance with its partner agencies, there is no actual requirement for U.S. states to report adult flu deaths to the CDC. (In public health parlance, adult influenza deaths are not “reportable” or “nationally notifiable.”) In fact, the only “flu-associated deaths” that the CDC requires states and other jurisdictions to report are deaths in children—180 last year.

How did the CDC reach its as-yet-unpublished conclusion—widely shared with the media—that 79,820 American adults in addition to 180 children died from the flu in 2017-2018? The agency states that it relies on death certificate data. However, members of the Cochrane research community have observed that “when actual death certificates are tallied, influenza deaths on average are little more than 1,000 yearly.”

Other knowledgeable individuals have also noted that the death records system in the U.S. is subjective, incomplete and politicized, and have suggested that citizens should adopt a “healthy skepticism about even the most accepted, mainstream, nationally reported CDC or other ‘scientific’ statistics.” This skepticism may be especially warranted for the influenza stats, which are so inextricably intertwined with the CDC’s vaccination agenda that the statistical techniques and assumptions that the agency uses focus specifically on “project[ing] the burden of influenza that would have occurred in the absence of vaccination.”

Notwithstanding its incessant use of influenza statistics to justify its flu vaccine policies, the CDC tries to have it both ways, cautioning that because “influenza activity reporting… is voluntary,” influenza surveillance in the U.S. “cannot be used to ascertain how many people have become ill with influenza during the influenza season.” A larger problem is that the vital statistics that form the basis of the CDC’s surveillance data conflate deaths from pneumonia and influenza (P&I). The CDC concedes that this conflation complicates the challenge of specifically estimating flu deaths:

The system “tracks the proportion of death certificates processed that list pneumonia or influenza as the underlying or contributing cause of death. This system… does not provide an exact number of how many people died from flu” [emphasis added].

Curiously, the CDC presented its cause-of-death data slightly differently prior to 2015. Through 2014, the agency’s annual National Vital Statistics Reports included tables showing influenza deaths and pneumonia deaths as separate line items. Those reports made it abundantly clear that pneumonia deaths (at least as transmitted by death certificates) consistently and dramatically outstripped influenza deaths. The table below illustrates this pattern for 2012-2014.

Starting in 2015, the annual vital statistics reports began displaying P&I together and eliminated the distinct line items. At present, only one tool remains to examine mortality associated with influenza as distinct from pneumonia—the CDC’s interactive FluView dashboard—which provides weekly national breakdowns. The dashboard shows the same general pattern as in the annual reports—that is, lower numbers of influenza deaths and much higher numbers of pneumonia deaths. Bearing in mind all the shortcomings and potential biases of death certificate data, dashboard reports for the first week of March (week 9) for the past three years show 257 influenza deaths versus 4,250 pneumonia deaths in 2016, and 534 and 736 flu deaths (versus over 4,000 pneumonia deaths) in 2017 and 2018, respectively.

Semantic shenanigans

Semantics also play a key role in the CDC’s slippery communications about “flu.” For example, CDC’s outpatient surveillance focuses on the broad category of “influenza-like illness” (ILI)—an almost meaningless term describing general symptoms (fever, cough and/or sore throat) that any number of non-influenza viruses are equally capable of triggering. Cochrane lists several problems with the reliance on ILI to make inferences about influenza:

  • There is “no reliable system to monitor and quantify the epidemiology and impact of ILI” and no way of knowing what proportion of ILI is caused by influenza.
  • There are almost no reliable data on the number of ILI-related physician contacts or hospitalizations—and no one knows what proportion of ILI doctor visits and hospitalizations are due to influenza.

“Pneumonia,” too, is a catch-all diagnosis covering lung infections caused by a variety of different agents: viruses (non-influenza as well as influenza), bacteria, fungi, air pollutants and many others. Interestingly, hospitalization is a common route of exposure to pneumonia-causing pathogens, and mortality from hospital-acquired pneumonia exceeds 60%. In a plausible scenario, an adult hospitalized for suspected (but unconfirmed) “flu” could acquire a lethal pneumonia bug in the hospital, and their death might be chalked up to “flu” regardless of the actual facts, particularly because clinicians do not necessarily order influenza testing. When clinicians in outpatient settings do order testing, relatively few of the “flu” specimens—sometimes as low as 1%—actually test positive for influenza. Over the past couple of decades, the proportion of specimens testing positive has averaged around 15%—meaning that about 85% of suspected “flu” specimens are not, in fact, influenza.
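The closing percentages follow directly from one another; a minimal sketch, taking the 15% long-run average cited above as given:

```python
# If ~15% of tested "flu" specimens are positive for influenza,
# the share that is NOT influenza is simply the complement.
positive_rate = 0.15           # long-run average share testing positive
not_influenza = 1 - positive_rate
print(f"{not_influenza:.0%}")  # prints 85%: suspected "flu" that isn't influenza
```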

Propaganda with a purpose

It takes little insight to recognize that the principal reason for flu hyperbole is to sell more vaccines. However, more and more people—even infectious disease specialists—are realizing that flu shots are fraught with problems. Roughly four-fifths of the vaccine injury and death cases settled through the National Vaccine Injury Compensation Program are flu-vaccine-related. A University of Toronto-based expert recently stated, “We have kind of hyped this vaccine so much for so long we are starting to believe our own hype.”

Pro-flu-vaccination studies—through their skillful placement in prestigious journals—tend to drown out other influenza studies that should be ringing warning bells. Published peer-reviewed studies show that:

  • Previous influenza vaccination, particularly in those who get a flu shot every year, diminishes or “blunts” the already low effectiveness of flu shots.
  • Getting vaccinated against influenza increases susceptibility to other severe respiratory viruses and also to other strains of influenza.
  • Mothers who receive influenza vaccines during pregnancy face an increased risk of miscarriages and their offspring face elevated risks of birth defects and autism.

A systematic review of influenza vaccine trials by Cochrane in 2010 urges the utmost caution. Noting that “studies funded from public sources [have been] significantly less likely [than industry-funded studies] to report conclusions favorable to the vaccines,” and citing evidence of “widespread manipulation of conclusions,” the Cochrane reviewers’ bottom line is that “reliable evidence on influenza vaccines is thin.” We should all keep those words in mind the next time the CDC and the media try to mischaracterize flu facts and science.

CHD is planning many strategies, including legal, in an effort to defend the health of our children and obtain justice for those already injured.  Your support is essential to CHD’s successful mission. Please visit our crowdfunding page.

October 11, 2018 | Deception, Science and Pseudo-Science

IPCC Pretends the Scientific Publishing Crisis Doesn’t Exist

By Donna Laframboise | Big Picture News | October 8, 2018

The Intergovernmental Panel on Climate Change (IPCC) issued a press release today. It tells us the IPCC assesses “thousands of scientific papers published each year,” and that its latest report relies on “more than 6,000 references.”

That sounds impressive until one remembers that academic publishing is in the grip of a reproducibility crisis. A disturbing percentage of the research published in medicine, economics, computer science, psychology, and other fields simply doesn’t stand up. Whenever independent third parties attempt to reproduce/replicate this work – carrying out the same research in order to achieve the same findings – the success rate is dismal.

The influential 2005 paper, Why Most Published Research Findings Are False, is now very old news. Headlines declaring that ‘science is broken’ have become commonplace. In 2015, the editor-in-chief of The Lancet declared that “much of the scientific literature, perhaps half, may simply be untrue.”

So here’s the bottom line: We know that studies about promising drugs typically fail when strangers attempt to reproduce those studies. We know that flashy physics research published in Science and Nature has been wholly fraudulent. We know that half of economics papers can’t be replicated, even with assistance from their own authors. We know political bias distorts the peer-review process in psychology. (All of this is discussed in a report I wrote in 2016).

We therefore have no earthly reason to imagine that climate science is exempt from these kinds of problems.

If half of the scientific literature is untrue, it therefore follows that half of climate research is also untrue.

This means that 3,000 of the IPCC’s 6,000 references aren’t worth the paper they’re written on.

BACKGROUND: The IPCC is a UN bureaucracy. Governments select scientists to write climate reports – one of which has just been completed.

These scientists are further asked to summarize their work. But the scientist-crafted summary is only a draft. At the meeting that just ended in South Korea, the draft was re-written by politicians, diplomats, and bureaucrats representing the political establishments of various countries.

At that point, the summary forfeited any conceivable claim to be a scientific document and became, instead, a politically-negotiated statement.

Today’s press release announces that the politicized summary was “approved by governments” and has therefore been made public (download it here).

Please note: the report itself has not been made public. Nor has the draft summary containing the scientists’ own words. (Although the IPCC claims to be ultra-transparent, its website says the original/draft version of the Summary for Policymakers is available only to “authorised users” such as government officials.)

This is the IPCC’s standard MO. It controls the message by feeding the media a politically-negotiated Summary of its latest work. Then it stands back and lets gullible reporters mislead the public about what the science says.

LINKS:

The Delinquent Teenager Who Was Mistaken for the World’s Top Climate Expert

October 8, 2018 | Corruption, Deception, Science and Pseudo-Science

Creating a Suspect Society: The Scary Side of the Technological Police State

By John W. Whitehead | Rutherford Institute | October 2, 2018

It’s a given that Big Brother is always watching us.

Unfortunately, thanks to the government’s ongoing efforts to build massive databases using emerging surveillance, DNA and biometrics technologies, Big Brother (and his corporate partners in crime) is getting even creepier and more invasive, intrusive and stalker-like.

Indeed, every dystopian sci-fi film (and horror film, for that matter) we’ve ever seen is suddenly converging into this present moment in a dangerous trifecta between science and technology, Big Business, and a government that wants to be all-seeing, all-knowing and all-powerful—but not without help from the citizenry.

On a daily basis, Americans are relinquishing (in many cases, voluntarily) the most intimate details of who we are—our biological makeup, our genetic blueprints, and our biometrics (facial characteristics and structure, fingerprints, iris scans, etc.)—in order to navigate an increasingly technologically-enabled world.

Consider all the ways we continue to be tracked, hunted, hounded, and stalked by the government and its dubious agents:

By tapping into your phone lines and cell phone communications, the government knows what you say.

By uploading all of your emails, opening your mail, and reading your Facebook posts and text messages, the government knows what you write.

By monitoring your movements with the use of license plate readers, surveillance cameras and other tracking devices, the government knows where you go.

By churning through all of the detritus of your life—what you read, where you go, what you say—the government can predict what you will do.

By mapping the synapses in your brain, scientists—and in turn, the government—will soon know what you remember.

By mapping your biometrics—your “face-print”—and storing the information in a massive, shared government database available to bureaucratic agencies, police and the military, the government’s goal is to use facial recognition software to identify you (and every other person in the country) and track your movements, wherever you go.

And by accessing your DNA, the government will soon know everything else about you that they don’t already know: your family chart, your ancestry, what you look like, your health history, your inclination to follow orders or chart your own course, etc.

Of course, none of these technologies are foolproof.

Nor are they immune from tampering, hacking or user bias.

Nevertheless, they have become a convenient tool in the hands of government agents to render null and void the Constitution’s requirements of privacy and its prohibitions against unreasonable searches and seizures.

Consequently, no longer are we “innocent until proven guilty” in the face of DNA evidence that places us at the scene of a crime, behavior sensing technology that interprets our body temperature and facial tics as suspicious, and government surveillance devices that cross-check our biometrics, license plates and DNA against a growing database of unsolved crimes and potential criminals.

For a long time, the government was required to at least observe some basic restrictions on when, where and how it could access someone’s biometrics and DNA and use it against them.

That is no longer the case.

The information is being amassed through a variety of routine procedures, with the police leading the way as prime collectors of biometrics for something as non-threatening as a simple moving violation. The nation’s courts are also doing their part to “build” the database, requiring biometric information as a precursor to more lenient sentences. And of course Corporate America has made it so easy to use one’s biometrics to access everything from bank accounts to cell phones.

This doesn’t even touch on the many ways in which the government is using our DNA against us, the Constitution be damned.

DNA technology, what police like to refer to as a “modern fingerprint,” reveals everything about “who we are, where we come from, and who we will be.”

With such a powerful tool at their disposal, it was inevitable that the government’s collection of DNA would become a slippery slope toward government intrusion.

Now, Americans are vulnerable to the government accessing, analyzing and storing their DNA without their knowledge or permission.

Even hospitals have gotten in on the game by taking and storing newborn babies’ DNA, often without their parents’ knowledge or consent. It’s part of the government’s mandatory genetic screening of newborns. However, in many states, the DNA is stored indefinitely.

What this means for those being born today is inclusion in a government database that contains intimate information about who they are, their ancestry, and what awaits them in the future, including their inclinations to be followers, leaders or troublemakers.

For the rest of us, it’s just a matter of time before the government gets hold of our DNA, through mandatory programs carried out in connection with law enforcement or corporate America.

If you haven’t yet connected the dots, let me point the way.

Having already used surveillance technology to render the entire American populace potential suspects, DNA technology in the hands of government will complete our transition to a suspect society in which we are all merely waiting to be matched up with a crime.

No longer can we consider ourselves innocent until proven guilty.

Now we are all suspects in a DNA lineup until circumstances and science say otherwise.

It’s not just yourself you have to worry about, either.

It’s also anyone related to you who can be connected by DNA.

Unfortunately, we now find ourselves in the unenviable position of being monitored, managed, convicted and controlled by our technology, which answers not to us but to our government and corporate rulers.

This is the fact-is-stranger-than-fiction lesson that is being pounded into us on a daily basis.

While the Fourth Amendment was created to prevent government officials from searching an individual’s person or property without a warrant and probable cause—evidence that some kind of criminal activity was afoot—the founders could scarcely have imagined a world in which we needed protection against widespread government breaches of our privacy on a cellular level.

Yet that’s exactly what we are lacking.

Once again, technology has outdistanced both our understanding of it and our ability to adequately manage the consequences of unleashing it on an unsuspecting populace.

In the end, as I make clear in my book Battlefield America: The War on the American People, what all of this amounts to is a carefully crafted campaign designed to give the government access to and control over what it really wants: you.

October 2, 2018 | Civil Liberties, Science and Pseudo-Science, Timeless or most popular

Dialog on Diet as Preventative Medicine

Joe Rogan Experience #1175 – Chris Kresser & Dr. Joel Kahn – September 27, 2018

Chris Kresser, M.S., L.Ac is a globally recognized leader in the fields of ancestral health, Paleo nutrition, and functional and integrative medicine. Dr. Joel Kahn is one of the world’s top cardiologists and believes that plant-based nutrition is the most powerful source of preventative medicine on the planet.

https://chriskresser.com/rogan

https://drjoelkahn.com/joe-rogan-expe…

October 1, 2018 | Science and Pseudo-Science, Timeless or most popular, Video

What is the Meaningful 97% in the Climate Debate?

By Dr. Tim Ball | Watts Up With That? | September 29, 2018

For a brief period, the New York Times added a column to their best-seller book list. It identified the percentage of people who finished reading the book. As I recall, the outright winner for lowest percentage was Umberto Eco’s The Name of the Rose with only 6%. It is an excellent and fascinating book if you understand the Catholic Church, its theological disputes, know much about medieval mythology, understand Catholic religious orders, and are familiar with the history of Italy in the Middle Ages. As one reviewer wrote, “I won’t lie to you. It is absolutely a slog at times.” This phrase struck me because it is exactly what a lawyer told me after reading my book “The Deliberate Corruption of Climate Science.”

I told him it was a slog to research because it required reading all the Reports of the Intergovernmental Panel on Climate Change (IPCC), a task that few, certainly fewer than 6%, ever achieve, including most of the people involved with the production. This is the tragedy. There are so many people with such strong, definitive views, including among skeptics and the general science community who have never read the Reports at all. The challenge is made more difficult by the deliberate attempt to separate truth and reality from propaganda and the political agenda.

In media interviews or discussions with the public, the most frequent opening challenge is: “But don’t 97% of scientists agree?” It is usually said obliquely to imply that you know a lot, and I don’t understand, but I assume you are wrong because you are in the minority. I don’t attempt to refute the statistics. Instead, I explain the difference in definitions between science and society. Then I point out that the critical 97% figure is that at least 97% of scientists have never read the claims of the IPCC Reports. How many people reading this article have read all the IPCC Reports, or even just one of them? If you have, it is probably the deliberately deceptive Summary for Policymakers (SPM). Even fewer will have read the Report of Working Group I: The Physical Science Basis. Naively, people, especially other scientists, assume scientists would not falsify, mislead, misrepresent, or withhold information. It is worse, because the IPCC deliberately created the false claim of consensus.

I wrote earlier about the problem of communications between groups and the general public because of the different definitions of terms. Among the most damaging, especially in the public debate, is the word consensus. Exploitation of the confusion was deliberate. On 22 December 2004, RealClimate, the website created to manipulate the global warming story, provided this insight:

We’ve used the term “consensus” here a bit recently without ever really defining what we mean by it. In normal practice, there is no great need to define it – no science depends on it. But it’s useful to record the core that most scientists agree on, for public presentation. The consensus that exists is that of the IPCC reports, in particular the working group I report (there are three WG’s. By “IPCC”, people tend to mean WG I).

In other words, it is what the creators of the Reports consider a consensus. This is classic groupthink on display, one characteristic of which is

“…a culture of uniformity where individuals censor themselves and others so that the facade of group unanimity is maintained.”

The source of the 97% claim in the public arena came from John Cook et al., and was published in 2013 in Environmental Research Letters. It was titled “Quantifying the consensus on anthropogenic global warming in the scientific literature.” I acknowledge to people some of the brilliant dissections of this claim, such as Lord Monckton’s comment, “0.3% consensus, not 97.1%.” If I have time, I explain how the plan to exploit the idea of consensus was developed by the same people whose corruption of science was exposed in the emails leaked from the Climatic Research Unit (CRU) in November 2009.

Michael Crichton, Harvard-trained medical doctor and world-famous science fiction writer, provides an excellent riposte.

“I want to pause here and talk about this notion of consensus, and the rise of what has been called consensus science. I regard consensus science as an extremely pernicious development that ought to be stopped cold in its tracks. Historically, the claim of consensus has been the first refuge of scoundrels; it is a way to avoid debate by claiming that the matter is already settled. Whenever you hear the consensus of scientists agrees on something or other, reach for your wallet, because you’re being had.

Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus.”

The attempt to deceive and divert was built into the structure, format, and procedures of the IPCC. Few people know that a major part of the deception is to identify all the problems with the science but only identify them in the Report of Working Group I: The Physical Science Basis. They know most won’t read or understand it and can easily marginalize the few who do. In 2012 I created a list of several of these acknowledgments, but only one is sufficient here to destroy the certainty of their claims about future climates. Section 14.2.2 of the scientific section of the Third IPCC Assessment Report (2001), titled “Predictability in a Chaotic System,” says:

“The climate system is particularly challenging since it is known that components in the system are inherently chaotic; there are feedbacks that could potentially switch sign, and there are central processes that affect the system in a complicated, non-linear manner. These complex, chaotic, non-linear dynamics are an inherent aspect of the climate system.”

“In sum, a strategy must recognise what is possible. In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible” (My emphasis).

This is not reported in the Summary for Policymakers (SPM), which is deliberately different. David Wojick, an IPCC expert reviewer, explained,

“What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory. Instead of assessing these objections, the Summary confidently asserts just those findings that support its case. In short, this is advocacy, not assessment.”

He should add, it is deliberate advocacy, as the RealClimate quote shows.

The SPM receives scant attention from the media and the public, except for the temperature predictions and then only the most extreme figure is selected. The Science Report receives even less attention, but that is by instruction because it is released months later. All of this is why I quoted German physicist and meteorologist Klaus Eckart Puls (English translation version) on the cover of both my books.

“Ten years ago, I simply parroted what the IPCC told us. One day I started checking the facts and data – first I started with a sense of doubt but then I became outraged when I discovered that much of what the IPCC and the media were telling us was sheer nonsense and was not even supported by any scientific facts and measurements. To this day I still feel shame that as a scientist I made presentations of their science without first checking it.” “Scientifically it is sheer absurdity to think we can get a nice climate by turning a CO2 adjustment knob.”

The real challenge of the 97% consensus claim is to get more of the 97% to do what Puls did, read the Reports and find out what the IPCC did and said. They need to do it because the misuse and loss of credibility of science aren’t restricted to the climate deception. As I read and hear from all sectors of science and society, it is endemic (fake news) and potentially devastating. I think one of the most important achievements of my successful trial with Andrew Weaver was to go beyond the defamation charge, against my lawyer’s advice, and show that the misuse of science will and must elicit passionate reactions. So, next time you are confronted with the 97% oblique charge, simply ask the person if they have read any of the IPCC Reports. Just be prepared for the invective.

September 30, 2018 | Science and Pseudo-Science, Timeless or most popular

New Book – Climate Basics: Nothing to Fear

By Rod Martin, Jr | Watts Up With That? | September 29, 2018

Afraid of the future? Don’t be. When we’re armed with the basic facts of climate, we can more easily spot the lies that the corrupt, corporate news media is trying to feed us. Some of those lies are huge.

This small book gives us everything the layman needs to know about climate science.

The warming alarmists attempt to frighten us with claims like:
– Global warming will result in more violent storms.
– Global warming will give us more deserts and droughts.
– Global warming is dangerous.
– Carbon dioxide is driving our dangerous global warming.
– We urgently need to cool down the planet.
– Carbon dioxide is a dangerous pollutant.
– The future looks bleak unless we drastically reduce our “carbon footprint.”

This slender volume sets the record straight on all of the above issues.

There are real concerns for the future, but none of the above are among them. And when we have real problems to face, it does no good to be fixing things that don’t need fixing.

Download for free here for a limited time. PDF, Ebook, Kindle formats

September 29, 2018 | Book Review, Science and Pseudo-Science

Study Buried For Four Years Shows Crime Lab DNA Testing Is Severely Flawed

By Tim Cushing – TechDirt – September 27, 2018

DNA is supposed to be the gold standard of evidence: supposedly so distinct that it would be impossible to convict the wrong person. Yet DNA evidence has been given far more credit than it has earned.

Part of the problem is that it’s indecipherable to laypeople. That has allowed crime lab technicians to testify to a level of certainty that’s not backed by the data. Another, much larger problem is the testing itself. It searches for DNA matches in samples covered with unrelated DNA. Contamination is all but assured. In one stunning example of DNA testing’s flaws, European law enforcement spent years chasing a nonexistent serial killer whose DNA was scattered across several crime scenes before coming to the realization the DNA officers kept finding belonged to the person packaging the testing swabs used by investigators.

The reputation of DNA testing remains mostly untainted, rose-tinted by the mental imagery of white-coated techs working in spotless labs to deliver justice, surrounded by all sorts of science stuff and high-powered computers. In reality, testing methods vary greatly from crime lab to crime lab, as do the standards for declaring a match. People lose their freedom thanks to inexact science and careless handling of samples. And it happens far more frequently than anyone involved in crime lab testing would like you to believe.

An op-ed about the failures of crime lab DNA testing at the New York Times — written by Boise State Professor of Biology Greg Hampikian — discusses this ongoing problem using some science of his own: a recently-released NIST study. (h/t Grits for Breakfast)

Researchers from the National Institute of Standards and Technology gave the same DNA mixture to about 105 American crime laboratories and three Canadian labs and asked them to compare it with DNA from three suspects from a mock bank robbery.

The first two suspects’ DNA was part of the mixture, and most labs correctly matched their DNA to the evidence. However, 74 labs wrongly said the sample included DNA evidence from the third suspect, an “innocent person” who should have been cleared of the hypothetical felony.
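As a sanity check on the arithmetic behind these figures, here is a minimal sketch in Python (participant counts are the approximate numbers quoted in the op-ed):

```python
# Back-of-the-envelope check of the NIST inter-lab figures quoted above:
# roughly 105 American labs plus 3 Canadian labs took part, and 74 of them
# wrongly included the third ("innocent") suspect in the mixture.
total_labs = 105 + 3
false_inclusions = 74

rate = false_inclusions / total_labs
print(f"False-inclusion rate: {rate:.0%}")  # roughly 69% of participating labs
```

In other words, on the third suspect roughly two out of every three labs reached the wrong conclusion — the proportion that drives the article’s "more likely to make the wrong call" claim.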

This is already a problem. People’s lives are literally on the line, and crime lab testing is more likely to make the wrong call on evidence than the correct one. What’s truly disturbing is that this study was completed in 2014, but the report was apparently buried by the scientists it implicated. As Dr. Hampikian states in his op-ed, the study’s results might still be unpublished had it not been for forensic scientists publicly complaining about the burial.

Four years have passed since the study’s completion, and it appears no improvements have been made. The study notes that testing protocols vary widely and that very little effort is being made to improve error-prone procedures. In addition, the study [PDF] comes with a disclaimer meant to dissuade litigants from challenging DNA evidence by quoting its findings.

The results described in this article provide only a brief snapshot of DNA mixture interpretation as practiced by participating laboratories in 2005 and 2013. Any overall performance assessment is limited to participating laboratories addressing specific questions with provided data based on their knowledge at the time. Given the adversarial nature of the legal system, and the possibility that some might attempt to misuse this article in legal arguments, we wish to emphasize that variation observed in DNA mixture interpretation cannot support any broad claims about “poor performance” across all laboratories involving all DNA mixtures examined in the past.

This certainly doesn’t raise the reader’s confidence in crime lab DNA testing. Instead, it gives the impression the four-year delay between completion and public release was for wagon-circling purposes as crime lab forensic scientists looked for ways to mute the impact of the study’s findings.

But there are also problems with the study itself. The authors of the study appear far too willing to cut crime labs slack for their failures. Rather than point out the problems originating from a lack of standardized processes, the study uses them to excuse the failures, as if unintentionally nailing the wrong person for the crime were somehow worthy of gold stars for effort. Here’s Hampikian’s take:

It is uncomfortable to read the study’s authors praising labs for their careful work when they get things right, but offering sophomoric excuses for them when they get things wrong. Scientists in crime labs need clear feedback to change entrenched, error-prone methods, and they should be strongly encouraged to re-examine old cases where such methods were used.

The study confirms much of what has been exposed earlier: DNA evidence may be based on hard science, but any small variable — including the inevitable tainting of DNA samples — has the ability to throw things off. And when it’s used as evidence in criminal trials, it has the potential to destroy lives. This study shows — at least indirectly — the labs handling DNA evidence aren’t taking it nearly as seriously as they should.

September 29, 2018 Posted by | Civil Liberties, Deception, Science and Pseudo-Science, Timeless or most popular | Leave a comment

BBC Ignores Widely Publicized IPCC Problems

By Donna Laframboise | Big Picture News | September 26, 2018

The BBC recently issued a document telling its journalists how to approach climate stories. That document treats the findings of a UN entity known as the Intergovernmental Panel on Climate Change (IPCC) as gospel.

The “best science on the issue,” it says, is expressed by the IPCC, “which drew on the expertise of a huge number of the world’s top scientists.”

Cripes. Out here in the real world, it’s 2018. But the last decade may as well not have happened as far as the BBC is concerned. In the bubble in which BBC bureaucrats reside it’s still 2007, the year Al Gore and the IPCC were each awarded half of the Nobel Peace Prize – not for their scientific prowess, but for their role in raising the alarm about climate change.

The world was more innocent back then. The InterAcademy Council (IAC) – an international collection of science entities – wouldn’t strike a committee to examine the IPCC’s internal workings until two years later.

The release of the IAC’s August 2010 report should have been a game changer. After all, the report identified “significant shortcomings in each major step of IPCC’s assessment process” (see the first paragraph of Chapter 2).

The New Scientist magazine considered the report so devastating it called for the resignation of the IPCC’s chairman in an article titled Time for Rajendra Pachauri to go.

The Financial Times similarly ran an editorial that urged Mr. Pachauri “to move on.”

Geoffrey Lean, then Britain’s longest-serving environmental correspondent, said the report revealed the IPCC to be an “amateurish, ramshackle operation.”

Louise Gray, environment correspondent for The Telegraph, began her account with these words: “In a damning report out earlier this week…”

Over at the Daily Mail, writer Fiona Macrae called it a “scathing report.”

Environmental studies professor Roger Pielke Jr. thought the report “remarkably hard hitting” – and was quoted by the Associated Press saying the IPCC might be redeemed via this flavour of “tough love.”

A headline in the London Times declared: This discredited science body must be purged. Two others – in India and America – used the word “slams” when characterizing the IAC’s conclusions.

Precious few improvements have occurred since then. Being a UN bureaucracy, the IPCC is essentially a law unto itself, an entrenched culture with no meaningful oversight mechanisms.

But the BBC wouldn’t know that. Because rather than performing due diligence to determine how much progress has been made since 2010, the BBC chooses to behave as though the IAC report doesn’t exist. The IPCC’s fall from grace simply never happened.

September 28, 2018 Posted by | Deception, Science and Pseudo-Science | , | Leave a comment

BBC’s climate change ‘facts’ are fiction

By Harry Wilkinson – The Conservative Woman – September 22, 2018

In order to avoid giving ‘false balance’ to the climate alarmists at the BBC, I thought it would be a good idea to fact-check their new internal guidance on climate change. This is their totalitarian memorandum aimed at stamping out free scientific discourse, on the basis that certain facts are established beyond dispute.

The problem is that these aren’t, and the BBC is guilty of repeatedly failing to describe accurately the nuances of climate science and the degree to which certain claims are disputed.

The crucial paragraph reads:

‘Most climate scientists regard a rise of 2 degrees C as the point when global warming could become irreversible and the effects dangerous. At current rates, we are on track for a rise of more than 3-4 degrees C by the end of the century.’

There are so many things wrong with this short statement.

That global warming can be somehow ‘irreversible’ is pure propaganda; the climate has always been changing and it always will. The briefing later describes the idea of catastrophic tipping points as a ‘common misconception’, so they have comically failed their own test right at the start.

A temperature rise of more than two degrees is not inherently dangerous either. The majority of economic impact studies put the cost of climate change by the end of the century at between 1.5% and 3% of world GDP, but these studies often make the inaccurate assumption that either no or little adaptation will take place.

In contrast, even the IPCC has admitted (p.15) that the cost of reducing emissions (‘mitigation’) to meet the 2°C target may be up to 4% of world GDP in 2030, 6% in 2050 and 11% in 2100.

These numbers do not incorporate the benefits of reducing our emissions, which are primarily the avoided costs of climate change. But given that a certain amount of warming is already ‘baked in’, it looks almost certain that this ‘mitigation’ will actually be far more expensive than not doing anything. If warming actually turns out to have a positive effect, the gamble will have failed even more spectacularly.

The IPCC has openly admitted that its cost forecasts come with incredibly optimistic assumptions that immediate mitigation takes place in all countries, that there is a single global carbon price, and that there are ‘no additional limitations on technology relative to the models’ default technology assumptions’. With no carbon capture and storage (CCS), they predict the total mitigation cost rises by a staggering 138%. The bad news is that CCS is currently failing to deliver, and few now expect it to play a significant role in reducing emissions.
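The cost figures quoted above can be put side by side with a little arithmetic. Note one simplification: the IPCC states the 138% increase for total mitigation cost, so applying the same multiplier to each year’s figure, as below, is purely illustrative:

```python
# Damage estimates vs mitigation-cost estimates, as percent of world GDP,
# using the figures quoted in the text above.
damage_by_2100 = (1.5, 3.0)                      # most impact studies
mitigation = {2030: 4.0, 2050: 6.0, 2100: 11.0}  # IPCC "up to" figures

# Without carbon capture and storage, the IPCC says total mitigation cost
# rises by 138%, i.e. a multiplier of 2.38 on the baseline.
no_ccs = 1 + 1.38
for year, pct in mitigation.items():
    print(f"{year}: up to {pct}% of GDP; without CCS up to {pct * no_ccs:.2f}%")
```

Even on the baseline numbers, the end-of-century mitigation estimate (up to 11% of GDP) is several times the damage estimate it is meant to avoid (1.5-3% of GDP), which is the comparison the article is driving at.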

Given the record of economic forecasts, all these predictions should be taken with a pinch of salt, but on the available evidence it appears we are sleepwalking into spending trillions of pounds to achieve only a negligible reduction in global temperatures.

The father of the two-degree target, veteran climate alarmist Hans Joachim Schellnhuber, has admitted the number is entirely fabricated: ‘Two degrees is not a magical limit; it’s clearly a political goal’. He nonetheless celebrates its cynical effectiveness at motivating international political action.

Other prominent climate scientists, such as Hans von Storch, have been much more critical of this approach. Storch reflects on how scientists have become political sermonisers in a way which damages science as a whole: ‘Unfortunately, some of my colleagues behave like pastors . . . it’s certainly no coincidence that all the mistakes that became public always tended in the direction of exaggeration and alarmism.’

The statement that we are on track for ‘more than 3-4 degrees’ is an even more blatant distortion of the scientific evidence. Earlier this year, Peter Cox of the University of Exeter announced the results of his latest study which ruled out higher levels of warming. He concluded that ‘climate sensitivity’ would be in the narrower range of 2.2-3.4°C, thus ruling out warming of 4 or 5 degrees by 2100. His voice adds to a growing consensus that climate sensitivity will be lower than previously estimated. Does the BBC now consider him a climate denier too?

Quite surreally, the document also describes the statement that ‘climate change has happened before’ as a ‘common misconception’. How much longer before the BBC renames itself The Ministry of Truth?

Estimating the current and future impacts of climate change is a complex and contested enterprise, but the BBC would rather you didn’t know. ‘The science is settled’ they say, so move on. This climate memorandum is nothing less than propaganda presented as fact by controller Fran. There is a critical debate to be had, so inquisitive people had better look elsewhere.

September 25, 2018 Posted by | Deception, Economics, Science and Pseudo-Science | | 1 Comment

The BBC’s Naive View of the UN’s Climate Machine

Big Picture News | September 24, 2018

SPOTLIGHT: Bureaucracies put their trust in other bureaucracies.

BIG PICTURE: A few weeks back, Joanne Nova perfectly captured the position of the British Broadcasting Corporation (BBC) regarding the scandalous UN entity known as the Intergovernmental Panel on Climate Change (IPCC).

A recent internal document gives BBC journalists advice about how to report on climate matters. In Nova’s words, it declares that the “IPCC is God, can not be wrong.”

The document’s exact words:

What’s the BBC’s position?

  • Man-made climate change exists: If the science proves it we should report it. The BBC accepts that the best science on the issue is the IPCC’s position, set out above. [italics added]

Well, here’s the problem. The IPCC does not do science. The IPCC is a bureaucracy whose purpose is to write reports.

The primary function of those reports is to pave the way for UN climate treaties. A set of facts needs to be agreed upon by all parties in advance, so that negotiators can start from the same page.

IPCC reports get written by government-appointed scientists, according to predetermined guidelines. Portions of IPCC reports then get re-written by politicians, bureaucrats, and diplomats (in effect, this is an unofficial round of negotiating, in advance of the official negotiations that take place later).

International treaties are political instruments. The IPCC exists to make climate treaties possible. The ‘science’ involved has therefore been selected and massaged to serve a political purpose.

Let’s ditch the naiveté. How likely is it that experts appointed by governments that have spent billions fighting climate change would conclude that man-made climate change doesn’t exist?

TOP TAKEAWAY: Journalists are part of a system of checks and balances that help keep governments and large organizations honest. The BBC is a huge bureaucracy. The geniuses running it have declared another bureaucracy – the UN’s IPCC – a font of scientific truth. How pathetic.

LINKS:

September 25, 2018 Posted by | Deception, Mainstream Media, Warmongering, Science and Pseudo-Science | , | 1 Comment

IPCC to release “October surprise” on climate change

Watts Up With That? | September 24, 2018

With all the crazy talk about “Russian meddling” in the 2016 Presidential election, one wonders if the same sort of crazy talk might be applied to the release of a special climate report just weeks before the U.S. mid-term elections. Given the timing, you can be sure that whatever is in the report will be front page news and used by the left as a political tool. Here is a press release from the IPCC, h/t to Dr. Willie Soon


Save the Date: IPCC Special Report Global Warming of 1.5ºC

The Intergovernmental Panel on Climate Change (IPCC) will meet in Incheon, Republic of Korea, on 1-5 October 2018, to consider the Special Report Global Warming of 1.5ºC. Subject to approval, the Summary for Policymakers will be released on Monday 8 October with a live-streamed press conference.

The press conference, addressed by the IPCC Chair and Co-Chairs from the three IPCC Working Groups, will be open to registered media, and take place at 10:00 local time (KST), 03:00 CEST, 02:00 BST, 01:00 GMT and 21:00 (Sunday 7 October) EDT.

Registered media will also be able to access the Summary for Policymakers and press release under embargo, once they are available. They will also be able to attend the opening session of the meeting at 10:00-11:00 on Monday 1 October. All other sessions of the IPCC meeting are closed to the public and to media.

The opening session of the meeting will include statements by the Chair of the IPCC, senior officials of the IPCC’s two parent bodies, the World Meteorological Organization (WMO) and the United Nations Environment Programme (UN Environment), and of the United Nations Framework Convention on Climate Change (UNFCCC), and senior officials of the Republic of Korea.

The IPCC meetings and the press conference will take place at Songdo Convensia in Incheon.

Arrangements for media registration, submitting questions remotely, booking interviews, and broadcast facilities will be communicated in the coming weeks.

The report, whose full name is Global Warming of 1.5°C, an IPCC special report on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty, is being prepared under the scientific leadership of all three IPCC Working Groups.

Formally, the meeting will start with the 48th Session of the IPCC. Next a joint session of the three Working Groups chaired by their Co-Chairs will consider the Summary for Policymakers line by line for approval. Then the 48th Session of the IPCC will resume to accept the Summary for Policymakers and overall report.

The IPCC decided to prepare the report in response to an invitation from the UNFCCC Conference of the Parties at its 21st meeting in December 2015, when the Paris Agreement was signed.

Source: http://www.ipcc.ch/news_and_events/ma-p48.shtml

September 24, 2018 Posted by | Science and Pseudo-Science | , , | Leave a comment

Alabama debunks the Times’ story about our warming world

Fabius Maximus website | September 19, 2018

Summary: The NY Times runs a story with bold numbers, confidently stated. Too bad its fact-checkers did not notice that the numbers are grossly misleading. Propaganda pretending to be science. This does not help, even if well-intended. The State Climatologist of Alabama tells the real story.

The Alabama Climate Report, August 2018.

By John R. Christy, Alabama State Climatologist.
Also Professor of Atmospheric Science and Director of the Earth System Science Center
at the U of AL in Huntsville. Links added.

Meteorological summer (June, July and August) is over. It is time to check how the summer temperatures compare with other years. For a research project a few years ago we developed a statewide summer temperature index for four 100-mile diameter regions centered on the major cities of the state – Mobile, Montgomery, Birmingham and Huntsville – going back to 1883. This summer will go down in that database and in NOAA’s official records as being slightly cooler than average.

Somewhat related to this, a reader sent me a link to a New York Times interactive website that claims to provide the number of days above 90°F each year for cities across the country: “How Much Hotter Is Your Hometown Than When You Were Born?” The results are produced for the Times by Climate Impact Lab (some might call it an environmental pressure group).

Since I build numerous datasets of this type, I took a look. The website asks you for the town and year in which you were born, then provides a time series purportedly showing the number of 90°F days per year since your birth and how that has increased.

Though a native of California, I have lived in Huntsville more years than any other place, so I put in my birth year and Huntsville as my hometown. Immediately I became suspicious when their dataset started only recently in 1960 (and a few years after my birth). …

For Huntsville and Montgomery, here are their results. Quite scary. It appears the number of 90°F days has risen to its highest level ever. The chart says that in 1960 Huntsville had 45 days above 90°F, but by 2017 it was 57 days and rising.

Huntsville, Alabama.

Huntsville AL - number of 90+ degree days

Montgomery, Alabama.

Montgomery AL - number of 90+ degree days

Then, to make matters even scarier, they use climate model projections to 2090 to tell me that in 2040, when I’m 80, there will be 73 such hot days in Huntsville (as shown below). Yikes!

Huntsville’s future per RCP4.5!

Huntsville AL - projected future temperature

Editor’s note – From the NYT website.

“For each year, the count of days at or above 90 degrees reflects a 21-year rolling average. Temperature observations for your hometown are averaged over an area of approximately 625 km² (240 square miles), and may not match single weather-station records.

“The time series is based on historical data for 1960-2000. The 2001-2020 period relies on a combination of historical data and future projections. After 2020, the data uses a mixed climate model that captures a broad range of extreme temperature responses. The “likely” future range reflects outcomes with 66 percent probability of occurrence in the RCP 4.5 scenario.”
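The smoothing the NYT note describes — count the days at or above 90°F in each year, then take a 21-year rolling average of those annual counts — can be sketched as follows. The daily-temperature series here is synthetic stand-in data, not real station observations:

```python
# Sketch of the NYT methodology: per-year counts of 90°F+ days, smoothed
# with a centered 21-year rolling average. Real use would load station
# observations; random.gauss below is a hypothetical stand-in.
import random

random.seed(0)
years = range(1940, 2021)
annual_counts = {
    y: sum(1 for _ in range(365) if random.gauss(75, 12) >= 90.0)
    for y in years
}

def rolling_mean(series, year, window=21):
    """Centered rolling average over `window` years (21 per the NYT note)."""
    half = window // 2
    vals = [series[y] for y in range(year - half, year + half + 1) if y in series]
    return sum(vals) / len(vals)

print(f"Smoothed 90°F-day count for 1980: {rolling_mean(annual_counts, 1980):.1f}")
```

One consequence of this smoothing worth noticing: each plotted point blends two decades of data, so a chart that starts in 1960 cannot show anything about the hot decades before it — which is exactly the start-date complaint Christy raises next.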

The rest of the story

Before you sell your house and move to Canada, let’s take a look at the real story. Having built many climate datasets of Alabama, some starting as early as 1850, I knew the Times story was designed to create alarm and promote the claim that humans who use carbon-based energy (gasoline, natural gas, coal) to help them live better lives are making our summers ever more miserable. Be aware, reader: this webtool is not designed to provide accurate information.

First of all, climate data for Alabama began in the 19th century, not 1960. In 2016 Dr. Richard McNider (Alabama’s former State Climatologist) and I published a carefully constructed time series of summer temperatures for the state starting from 1883. This used numerous station records, including some that the federal government had not archived into its databases (which are the most common sources for outfits like the Climate Impacts Lab).

“Time Series Construction of Summer Surface Temperatures for Alabama, 1883–2014, and Comparisons with Tropospheric Temperature and Climate Model Simulations” in the Journal of Applied Meteorology and Climatology, March 2016.

I’ve updated that work to include summer temperatures through 2018. The result is below. Not only are summer daytime temperatures not rising, they have actually fallen over the last 136 years. After looking at the graph, why do you suppose the Climate Impacts Lab decided to start their charts in 1960?

We went a step further in that paper and demonstrated that climate models failed completely to replicate the downward temperature trend in Alabama over the past 120 years: 76 different models with a 100% failure rate. Would you trust these same models to tell you about the future as the Times does? Why did they not check the models for validity?

Now, what about the number of “hot” (or in Alabama we would say “typical”) 90°F days? For Alabama and the nation, I’ve calculated the average value per station each year since 1895. The results below speak for themselves (there is no increase of days hotter than 90°F) and expose the misinformation provided through the Times.

Alabama - days exceeding 90 degrees


Continental 48 US states - days exceeding 90 degrees

Providing accurate information on Alabama’s climate is what we do in our office. In fact, using real data, I can’t even come close to reproducing the images that the Climate Impacts Lab did, which show the 2010s as having the most 90°F days in Alabama. I’m guessing they are using some theoretical output rather than sticking with observations. … I’ll check and follow up as I can, but something is fishy.

This is a great state in which people can enjoy life and in which businesses can operate. Our climate resources are one of the reasons we are doing so well in recruitment. Occasionally, though, the time comes when I must address claims made by those whose intention is not to inform but to promote false alarm. This usually happens when an environmental pressure group generates a press release whose dramatic statements are published by a willing media (without any fact-checking). This is one of those times, and I’m sure it will not be the last.


Dr. John R. Christy is the Distinguished Professor of Atmospheric Science and Director of the Earth System Science Center at the University of Alabama in Huntsville. Since November 2000 he has been Alabama’s State Climatologist. See his bio at the U of AL website (from which this bio was taken).

In 1989 Dr. Roy W. Spencer (then a NASA scientist, now a Principal Research Scientist at UAH) and Christy developed a global temperature data set from microwave data observed from satellites beginning in 1979. For this achievement, the Spencer-Christy team was awarded NASA’s Medal for Exceptional Scientific Achievement in 1991. In 1996, they received a Special Award by the American Meteorological Society “for developing a global, precise record of earth’s temperature from operational polar-orbiting satellites, fundamentally advancing our ability to monitor climate.” In January 2002 Christy was inducted as a Fellow of the American Meteorological Society.

Dr. Christy has served as a Contributor (1992, 1994, 1996 and 2007) and Lead Author (2001) for the U.N. reports by the IPCC in which the satellite temperatures were included as a high-quality data set for studying global climate change. He has served on five National Research Council panels or committees, has performed research funded by NASA, NOAA, DOE, DOT and the State of Alabama, and has testified 18 times for congressional committees.

His papers have been published in many journals, including Science, Nature, Journal of Climate, and The Journal of Geophysical Research. See the list here (with links).

September 21, 2018 Posted by | Fake News, Mainstream Media, Warmongering, Science and Pseudo-Science, Timeless or most popular | | Leave a comment