Earlier this month I gave a talk at the University of Minnesota. It was my first public talk on climate since being “investigated” by Rep. Raul Grijalva (D-AZ) in 2015.
It is also the first and only invitation I’ve received to give a public talk on climate at a US university since 2015.
Before that I received about 2-3 invitations per month.
Below I document a key episode in my own experience that I have never looked back on in detail. The timeline is of use to me, shared here for anyone else who might be interested.
A Look Back at the Holdren-Pielke Debate of 2014
One of the more bizarre experiences I’ve had in the climate debate was when President Obama’s science advisor, John Holdren, posted a weird, 6-page screed about me on the White House web site.
Here is a reconstruction of and look back at those events, and an evaluation of how they look from the vantage point of 2018.
This look back is mainly just for me; when you are in the spin cycle it can be hard to see clearly what is happening at the time.
The Holdren episode ultimately led to me being investigated by a member of Congress with a major impact on my life and career.
Since I've not taken a close look back at this episode, it's time for me to document exactly what transpired. If you are not interested, this would be a good place to take the exit ramp.
In July 2013, I testified before the US Senate Environment and Public Works Committee on extreme events.
You can see my 5 minute statement below and read my full written testimony here in PDF. That testimony was widely discussed.
I followed that testimony up with similar testimony before the US House a few months later, in October 2013.
I wrote a blog post explaining that the science on these issues was solid. Even so I argued that “zombie science” (to the contrary) would always be with us.
Two weeks later Dr. Holdren was asked about these statements by Senator Jeff Sessions before the Senate Environment and Public Works Committee, the same committee that I had testified before the previous July.
After some sparring on what Dr. Holdren said or didn't say a few weeks previous, Senator Sessions said:
“Well, let me tell you what Dr. Pilkey (sic) said, who sat in that chair you are sitting in today just a few months ago, he is a climate impact expert, and he agrees that warming is partly caused by human emissions. But he testified “It is misleading and just plain incorrect to claim that disasters associated with hurricanes, tornadoes, floods or droughts have increased on climate change time scales either in the United States or globally.”
Holdren replied with a delegitimization effort, saying that I was
“not representative of the mainstream scientific opinion on this point. And again, I will be happy to submit for the record recent articles from Nature, Nature GeoScience, Nature Climate Change, Science and others showing that in drought-prone regions droughts are becoming more intense.”
Of course, Holdren was incorrect.
My views are 100% consistent with those of the IPCC, the very definition of “mainstream scientific opinion.”
Holdren promised to submit scientific evidence for the hearing record in support of his views, and Sessions said he looked forward to it.
Three days later Holdren's missive about me was posted on the White House website, titled "Drought and Global Climate Change: An Analysis of Statements by Roger Pielke Jr" (here in PDF).
Holdren singled out just 2 statements that I had made in my testimony:
“It is misleading, and just plain incorrect, to claim that disasters associated with hurricanes, tornadoes, floods or droughts have increased on climate timescales either in the United States or globally.”
Drought has "for the most part, become shorter, less frequent, and cover a smaller portion of the U.S. over the last century". Globally, "there has been little change in drought over the past 60 years."
The quotes above are from the US National Climate Assessment (the former) and a Nature paper on global drought trends (the latter).
Holdren explained his objections:
“I replied that the indicated comments by Dr. Pielke … were not representative of mainstream views on this topic in the climate-science community; and I promised to provide for the record a more complete response with relevant scientific references. “
The slide below shows the entirety of my discussion of drought in my 2013 Senate testimony, which consisted only of quotes from the IPCC, the US CCSP and an image from the CCSP report.
Holdren did not mention hurricanes, floods or tornadoes in his 6 pages of response.
Holdren’s response blew up the internet (or at least the tiny part of it involving issues related to climate).
When the White House posts 6 pages about you, it gets noticed.
For my part, I wrote a blog response, which you can read here.
In that post I noted:
“It is fine for experts to openly disagree. But when a political appointee uses his position not just to disagree on science or policy but to seek to delegitimize a colleague, he has gone too far.”
This was, as far as I am aware, the first time that a Science Advisor to the US President used his platform to seek to delegitimize an academic with whom he disagreed.
I am aware of no such comparable use of the authority and reach of the White House against a researcher.
The fact that I was singled out by the president’s science advisor was not reported on or commented on by the mainstream scientific media. Leading scientific organizations said nothing.
I found this pretty amazing, but c’est la vie.
If John Marburger, say, had gone after James Hansen, it’d have been a story.
None of this mattered; I quickly learned that a lone academic is no match for the bully pulpit that is the White House and the powerful echo chamber of the online climate debate.
A few weeks later the campaign to have me removed as a writer for 538 was underway, and 11 months later the investigation by Rep. Raul Grijalva (D-AZ), which he indicated was the result of Holdren's missive, was launched.
One of my close colleagues said to me at the time: “I’d love to come to your defense, but I don’t want them coming after me.”
Fair enough.
Let’s quickly take a look at the state of the science in 2018 on drought.
The 2017 US National Climate Assessment, prepared under the direction of John Holdren in the last months of the Obama Administration and released after Donald Trump became president, concluded the following about drought:
“drought statistics over the entire CONUS have declined … no detectable change in meteorological drought at the global scale”
“Western North America was noted as a region where determining if observed recent droughts were unusual compared to natural variability was particularly difficult.”
Television has taught us that the crack CSI experts and their state-of-the-art technology can solve any crime through the power of science. In reality, more often than not the crime-detection technology of the past has turned out to be pseudoscience at best, and outright fraud at worst. And, of course, it has been used to put innocent people in jail. Here are 4 Ways The Crime Lab Can Frame You.
European Commission to present 2021-2027 budget proposal May 2
Climate to be component of regional aid, transport spending
The European Union's executive is poised to propose spending 25 percent of the funds available in the next EU multiannual budget on activities related to climate protection, making sure new economic and political challenges don't weaken the bloc's resolve to fight pollution.
The European Commission’s blueprint for the 2021-2027 budget, to be proposed on May 2, will boost the so-called climate mainstreaming from 20 percent in the current multiannual financial plan, according to a person with knowledge of the matter. The funds for reducing emissions and adapting to climate change will be earmarked under policies such as regional aid, transport, research and external relations, said the person, who asked not to be identified because talks on the draft budget are private.
The latest edition of Radio 4’s environmental programme, “Costing the Earth”, looks at how our springs are supposedly getting earlier. (Yes, I know spring starts in March!)
The programme’s opening introduction by presenter Lindsey Chapman gives us a clue that it won’t be an objective assessment:
We’re looking for signs of how a volatile climate is shifting our seasons, and affecting both our native wildlife and migrant visitors to these shores.
Chapman, also presenter of the Springwatch TV series, then adds:
I’ve been noticing changes on my own patch, from the arrival of the first swallows to the flowering times of spring flowers over the last ten years.
At about seven minutes in though, she makes this extraordinary statement:
Spring now arrives an average of 26 days earlier each year than it did 10 years ago. We know this because of the extraordinary records kept by the public, stretching back centuries.
This statement that Spring is almost a month earlier than it was just 10 years ago is complete nonsense and fails the most elementary sanity check. It appears, yet again, that where global warming is concerned, elementary common sense and fact-checking are thrown out by the BBC, and replaced with absurd exaggeration and alarmism.
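The arithmetic behind that sanity check can be made explicit. A minimal sketch, simply taking the broadcast claim at face value:

```python
# Taking the claim literally: spring arrives 26 days earlier *each year*,
# sustained over the 10-year period Chapman cites.
DAYS_EARLIER_PER_YEAR = 26
YEARS = 10

total_shift = DAYS_EARLIER_PER_YEAR * YEARS
print(total_shift)  # 260 days: spring would now begin in the previous summer
```

A cumulative shift of 260 days is plainly absurd, which is presumably why the claim should have been caught before broadcast.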
So where did Chapman get this crazy claim from?
As she goes on to explain, it is supposedly from the Woodland Trust, who run a scheme called Nature’s Calendar.
This allows members of the public to record when they first see certain events each spring, such as birds, first flowerings, butterflies and so on. In other words, phenology. During warm springs, naturally enough, these events tend to arrive earlier.
According to Woodland Trust, these first sightings have been between around one and two weeks earlier in the last three years, though some butterfly and bird arrivals were as much as three weeks early in 2017:
You will notice that Woodland Trust use 2001 as a baseline, and nowhere do they claim that spring is now 26 days earlier than ten years ago.
But why 2001? In fact they have only been collecting this data since 2000, and decided to use 2001 as the base year because, they claim, weather conditions that year “closely reflected the 30-year average”.
However, on closer examination we see that it is not the current 30-year average they are talking about (ie 1981-2010), but 1961-90.
This is highly significant, because the 1961-90 period was considerably colder than both the decades that preceded and followed it.
The 1961-90 period was 0.7C colder than 1981-2010. We can also see that, while there have been ups and downs, there is little evidence of overall change in spring temperatures since around 1990.
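The effect of the baseline choice can be made concrete with a toy calculation. The 0.7C offset between the two periods is from the temperature records discussed above; the sample anomaly value is hypothetical:

```python
# 1981-2010 mean minus 1961-90 mean, per the comparison in the text
BASELINE_OFFSET_C = 0.7

def rebaseline(anomaly_vs_1961_90: float) -> float:
    """Re-express an anomaly measured against 1961-90 relative to 1981-2010."""
    return anomaly_vs_1961_90 - BASELINE_OFFSET_C

# A hypothetical spring reading +0.9C against the cold 1961-90 baseline
# is only +0.2C against the current 1981-2010 baseline.
print(round(rebaseline(0.9), 1))
```

In other words, measuring against an unusually cold reference period makes an ordinary recent spring look anomalously warm.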
This is definitely not the message portrayed by the BBC programme.
We should also note that the spring of 2001 was much colder than prior years, which makes it strange that it should be used as a base year at all. The Woodland Trust recognised this same point in their Spring 2005 report:
This should be little surprise, when we see that, contrary to popular myth, temperatures in January and February have changed little since a century ago.
And, as with spring temperatures, there is a noticeable dip between 1961-90:
There appears to be no evidence to back up Chapman’s claim that spring now arrives an average of 26 days earlier each year than it did 10 years ago, either in the temperature record or in the Woodland Trust surveys.
The latter are in any event misleading, and certainly not in any shape or form "scientific". Their conclusions are obtained only by using an unusually cold year, 2001, as their base point.
There is actually nothing in the temperature record to suggest that springs are beginning any earlier than they were thirty years ago.
To be fair, one of the interviewees, Matthew Oates of the National Trust, did mention that the transition to warmer/earlier springs began several decades ago.
Nevertheless, the central theme of the programme was that the UK climate is changing rapidly, something not borne out by the data.
I have no doubt that the BBC will fall back on their regular defence of “scientists say”. However, following OFCOM’s recent ruling that the BBC should have challenged Lord Lawson on comments he made, it should surely not be acceptable for them to simply accept unscientific research from bodies like the Woodland Trust without challenging that as well.
Of course, in this instance the BBC has gone one step further. Not only have they broadcast the Woodland Trust’s findings, Lindsey Chapman has actually then presented them as an indisputable fact.
In his interview with the BBC’s Today Programme on 10 August 2017, Lord Lawson pointed out that while some extreme events had increased, others had diminished. Overall, however, extreme weather events had not increased according to the IPCC:
“For example, for example he [Al Gore] said that there has been a growing, increase which is continuing, in extreme weather events. There hasn’t been. All the experts say there hasn’t been. The IPCC, the Inter-Governmental Panel on Climate Change, which is the sort of voice of the consensus, concedes that there has been no increase in extreme weather events. Extreme weather events have always happened. They come and go. And some kinds of extreme weather events, there’s a particular time increase, whereas others, like tropical storms, diminish”.
“Overall, the most robust global changes in climate extremes are seen in measures of daily temperature, including to some extent, heat waves. Precipitation extremes also appear to be increasing, but there is large spatial variability”
“There is limited evidence of changes in extremes associated with other climate variables since the mid-20th century”
“Current datasets indicate no significant observed trends in global tropical cyclone frequency over the past century … No robust trends in annual numbers of tropical storms, hurricanes and major hurricanes counts have been identified over the past 100 years in the North Atlantic basin”
“In summary, there continues to be a lack of evidence and thus low confidence regarding the sign of trend in the magnitude and/or frequency of floods on a global scale”
“In summary, there is low confidence in observed trends in small-scale severe weather phenomena such as hail and thunderstorms because of historical data inhomogeneities and inadequacies in monitoring systems”
“In summary, the current assessment concludes that there is not enough evidence at present to suggest more than low confidence in a global-scale observed trend in drought or dryness (lack of rainfall) since the middle of the 20th century due to lack of direct observations, geographical inconsistencies in the trends, and dependencies of inferred trends on the index choice. Based on updated studies, AR4 conclusions regarding global increasing trends in drought since the 1970s were probably overstated. However, it is likely that the frequency and intensity of drought has increased in the Mediterranean and West Africa and decreased in central North America and north-west Australia since 1950”
“In summary, confidence in large scale changes in the intensity of extreme extratropical cyclones since 1900 is low”
Without providing any evidence to justify disputing the IPCC’s conclusions, Ofcom claimed that Lawson’s statement about extreme weather was incorrect and not sufficiently challenged by the BBC presenter during the interview.
Ofcom, however, appears to base its ruling on information from unnamed complainants, the BBC (and possibly other unnamed sources) without publishing that information or saying where it was obtained. As a result, nobody is able to see it and judge its credibility. It did not ask Lord Lawson for any information regarding his statements.
That Ofcom should judge on scientific matters without justifying their decision sets a worrying precedent concerning the oversight of journalists.
Presenters are not experts and cannot be expected to be. Requiring them to provide a detailed examination of competing viewpoints would be a burden on them, a limitation of the freedom of broadcasters such as the BBC, and would severely inhibit live discussions as well as investigative journalism.
It certainly does appear to be extremely bad judgment by OFCOM to have accepted the word of some anonymous complainant, without attempting to ascertain the true facts, or get the GWPF’s views.
One wonders whether there is also the hand of someone at the BBC, like Harrabin, guiding the OFCOM judgment here, as an attempt to enforce more discipline on their news staff, who might otherwise be tempted to seek out dissenting views.
It is clear that OFCOM have fallen into the same groupthink we have seen lately, and automatically assumed that extreme weather must be on the increase.
I wait with bated breath for OFCOM to criticise the BBC the next time they interview Al Gore and fail to challenge the palpable nonsense he spouts. But I fear I will be waiting a long time!
Following up the Lake Chad post, this is a highly relevant contribution by science writer, Fred Pearce, in Yale 360 last October:
The Hadejia-Nguru wetland was once a large green smudge on the edge of the Sahara in northeast Nigeria. More than 1.5 million people lived by fishing its waters, grazing their cattle on its wet pastures, and irrigating their crops from its complex network of natural channels and lakes. Then, in the 1990s, the Nigerian government completed two dams that together captured 80 percent of the water that flowed into the wetland.
The aim was to provide water for Kano, the biggest city in northern Nigeria. But the two dams dried up four-fifths of the wetland, destroying its natural bounty and the way of life that went with it. Today, many of the people who lost their livelihoods have either headed for Kano, joined the Islamic terrorist group Boko Haram that is terrorizing northeast Nigeria – or paid human-smugglers to take them to Europe.
For the past three years, Europe has been convulsed by a crisis of migrants, some from Syria and the war-torn Middle East, but also hundreds of thousands coming from the arid Sahel region of Africa, including Nigeria, Mali, and Senegal. They are fleeing poverty and social breakdown caused by insurgent groups such as Boko Haram. But environmentalists and others in the region say that behind this social chaos lies serious water mismanagement in the drought-prone region.
Big dams intended to bring economic development to the Sahel are having the opposite effect. By blocking rivers, they are drying out lakes, river floodplains, and wetlands on which many of the poorest in the region depend. The end result has been to push more and more young people to risk their lives to leave the region.
The Manantali Dam is estimated to have caused the loss of 90 percent of fisheries and up to 618,000 acres previously covered by water.
Last year, I traveled with Wetlands International, a Dutch-based environmental NGO, along the valley of the River Senegal, which forms the border between Senegal and Mauritania. Farmers, herders, and fishermen told of their battles against the ecological breakdown that has followed the building of the Manantali Dam, which is located upstream in Mali and was completed in 1987. The dam holds back a large part of the river’s seasonal flood flow to generate hydroelectricity for cities and provide irrigation water for some farmers. But there have been more losers than winners.
Seydou Ibrahima Ly, a teacher in the bankside village of Donaye Taredji in Podor district, said that when he was young, “the river had a flood that watered wetlands where fish grew.” But “now there is no flood because of the dam… Compared to the past, there aren’t many fish. Our grandparents did a lot of fishing, but we don’t.” With their livelihoods gone, more than 100 people had left his village, he said. “In some villages, they are almost all gone.”
“The migrants know the boats [traveling to Europe] are dangerous, but they have a determination to go and find a better life,” said Oumar Cire Ly, deputy chief of neighbouring Donaye village, which has also seen an exodus of its young people.
Farmers once planted their crops in the wet soils as the waters receded. Pastoralists grazed their animals where forests and wildlife flourished. But the dam and its related projects are estimated to have resulted in the loss of 90 percent of the fisheries and up to 618,000 acres of fields that were previously covered by water from the rising river during the wet season, a system of natural irrigation known as flood-recession agriculture.
The Senegal River Basin Development Organization – the intergovernmental agency that is responsible for the dam project and is known by its French acronym, OMVS – conceded in 2014 that eliminating the river’s annual flood “has made flood-recession crops and fishing on the floodplain more precarious, which makes the rural production systems of the middle valley less diversified, and therefore more vulnerable.”
This is clearly at odds with the organization’s mandate to “ensure food security for all people within the river basin and region.” But Amadou Lamine Ndiaye, the OMVS’s director of environment and sustainable development, told me his agency regarded wetlands such as river floodplains primarily as a source of revenue for tourists, rather than as a lifeline for rural communities.
As many as a million Nigerians have lost livelihoods because of dams that once fed a wetland that flowed into Lake Chad.
Worse still is the crisis affecting the region around Lake Chad, which until half a century ago was Africa’s fourth largest lake, straddling the border between Nigeria, Niger, Chad, and Cameroon. The lake has lost more than 90 percent of its surface area since then. Initially, this was largely due to persistent droughts in the Sahel that often dried up the rivers supplying it with water. Since 2002, rainfall has improved markedly, but Lake Chad has not recovered.
That is because of dams on the rivers flowing into the lake from the wetter south, mainly in Cameroon and Nigeria. The Maga Dam in Cameroon has diverted 70 percent of the flow of the Logone river to rice farms. This has both dried up part of the floodplain pastures that once supported 130,000 people, and dramatically reduced inflow to Lake Chad.
In northern Nigeria, up to 1 million people have lost livelihoods because of dams on the River Yobe that once fed the Hadejia-Nguru wetland and flowed on into Lake Chad. In both cases, says Edward Barbier, an environmental economist at Colorado State University, the dams have had an overall negative effect on local economies, as losses to fishermen, pastoralists, and others exceeded gains from irrigation agriculture.
The major wetlands and water basins of the Sahel region in Africa. Wetlands International
The poverty is driving social breakdown and conflict all around the lake. Mana Boukary, an official of the Lake Chad Basin Commission, an intergovernmental body, told Deutsche Welle two years ago: "Youths in the Lake Chad Basin are joining Boko Haram because of lack of jobs and difficult economic conditions resulting from the drying up of the lake."
The UN humanitarian coordinator for the Sahel region, Toby Lanzer, told a European Union-Africa summit that it was also fueling migration: “Asylum seeking, the refugee crisis, the environmental crisis, the instability that extremists sow — all of those issues converge in the Lake Chad basin.”
A Nigerian government audit of the lake basin in 2015 agreed. It concluded that “uncoordinated upstream water impounding and withdrawal” were among factors that had “created high competition for scarce water, resulting into [sic] conflicts and forced migration.” More than 2.6 million people have left the Lake Chad region since mid-2013, according to the International Organization for Migration.
At their greatest extent, wetlands cover one-tenth of the Sahel, the arid region stretching for 3,400 miles across northern Africa, immediately south of the Sahara desert. They are wildlife havens, especially notable for their birdlife. The Inner Niger Delta in Mali, for instance, is one of the world’s most important seasonal stops for migrating birds, hosting about 4 million waterbirds from Europe each winter. In addition, these wetlands are a source of sustenance for the region’s poor and the main sources of the region’s economic productivity outside the short wet season from June to September.
Dried-up wetlands are often blamed on climate change when the real cause often is more human interference in river flows.
Yet the decline of the wetlands and the resulting social and economic consequences remains a largely untold story. That is partly because dried-up wetlands are routinely, and often incorrectly, blamed on climate change, when the real cause is often more direct human interference in river flows. It is also partly because many development agencies still mostly think of dams as infrastructure development that furthers economic activity and wealth – and partly because many environmental groups concentrate on the ecological impacts of dried wetlands, while ignoring the human consequences.
In this climate of ignorance, more wetlands are under threat. The next victim is likely to be the Inner Niger Delta, a wetland in northern Mali that covers an area the size of Belgium. The delta forms where West Africa’s largest river, the Niger, spreads out across flat desert near the ancient city of Timbuktu.
The delta is a magnet for migrating European waterbirds. It is also currently one of the most productive areas in one of the world’s poorest countries. It provides 80 percent of Mali’s fish and pasture for 60 percent of the country’s cattle, and it delivers 8 percent of Mali’s GDP and sustains 2 million people, 14 percent of the population, says Dutch hydrologist Leo Zwarts. Its fish are exported across West Africa from Mopti, a market town on the shores of the delta.
In recent years the Mali government has been diverting water from the River Niger at the Markala barrage just upstream of the delta, to irrigate desert fields of thirsty crops such as rice and cotton. These diversions have cut the area of delta flooded annually by up to 7 percent, says Zwarts, causing declines in forests, fisheries, and grazing grasses. Some people have left the delta as a result, though it is unclear whether they have been among the Malians regularly reported to be in migrant boats heading from Libya to Italy.
The Markala Barrage in Mali, which diverts water from the River Niger for irrigating crops such as rice and cotton. Fred Pearce/Yale e360
But this trickle of people from the delta could soon become a flood. In July this year, Mali’s upstream neighbour, Guinea, announced the go-ahead for Chinese firms to build a giant new hydroelectric dam, the Fomi Dam, in the river’s headwaters. Construction could begin as soon as December.
The Fomi Dam’s operation will replace the annual flood pulse that sustains the wetland’s fecundity with a more regular flow that the Mali government intends to tap for a long-planned tripling of its irrigation along the river. Wetlands International estimates that the combined impact of the dam and irrigation schemes could cut fish catches and pastures in the delta by 30 percent.
“Less water flowing into the delta means a lower flood level and a smaller flood extent”, says Karounga Keïta of Wetlands International in Mali. “This will have a direct impact on food production, including fish, livestock, and floating rice.” He fears that the inevitable outcome will be further human migrations from the wetland.
The links in the chain from water management through wetland health to social breakdown and international migration are complex. Wetland loss is certainly not the only reason for the human exodus from the Sahel. And migration is a long-standing coping strategy for people living in a region of extreme climate variability.
But the parlous state of the wetlands of the Sahel is changing the region. In the past, wetlands were refuges in times of drought or conflict. They were safe, and the water persisted even in the worst droughts. But today, with their waters diminished, these wetlands have become sources of outmigration. Now, migrations that were once temporary and local are becoming permanent and intercontinental.
I have every sympathy for countries like Nigeria and Cameroon. They are caught between the proverbial rock and a hard place.
They have growing populations, with ever-growing expectations. For this they need, among other things, food and water supplies. On the other hand, these very things impact the environment which ultimately sustains them.
There are no easy answers.
But blaming it all on climate change does nobody any favours.
You may recall the above report by the BBC, which described how bad last year’s Atlantic hurricane season was, before commenting at the end:
A warmer world is bringing us a greater number of hurricanes and a greater risk of a hurricane becoming the most powerful category 5.
As I promised, I fired off a complaint, which at first they did their best to dodge. After my refusal to accept their reply, they have now been forced to back down.
The above sentence now no longer appears, and instead they now say:
Scientists are still analysing what this data will mean, but a warmer world may bring us a greater number of more powerful category 4 and 5 hurricanes and could bring more extreme rainfall.
Correction 29 January 2018: This story has been updated to clarify that it is modelling rather than historical data that predicts stronger and wetter hurricanes.
Of course, we have the usual problem: those who read the article originally, and who would have been deeply misled, won't see the correction now.
George Monbiot, who has now been diagnosed with prostate cancer at the young age of 55, was therefore born in 1963, at the peak of the atmospheric test fallout. He is thus a peak-exposed (at-risk) member of the cohort exposed in the womb to the fallout (1959-63) and currently suffering the consequences of exposure to Strontium-90 in milk, and (as measured) in children's bones.
In his article in the Guardian, he says that he has always done all the healthy things: lots of exercise, eating vegetables, no smoking or drinking, all that stuff. He is clearly puzzled about being singled out by the three ladies. But the cause was something that he had no control over, and neither had anyone else who was born in the fallout period. George writes that he is happy. This insane response to his predicament (which I personally am not happy about, despite his intemperate attacks on me in his Guardian column and blogs) must go alongside his equally insane response to the Fukushima events, where he publicised his road-to-Damascus conversion to nuclear power.
The effect of the genetic damage of the fallout on babies can be seen in the graph below, Fig 1, taken from a recent paper I published (Busby C (2017) Radiochemical Genotoxicity Risk and Absorbed Dose. Res Rep Toxi. Vol.1 No.1:1.). The babies that did not die were those whose genetic damage was insufficient to kill, but this damage would have affected them in later life in various ways. The most measurable effect (apart from genetic defects and congenital diseases) is higher cancer risk, which presents as early cancer onset. The issue of the 1959-63 cancer cohort was discussed in my 1995 book Wings of Death, and in a letter I published in 1994 in the British Medical Journal (BMJ). The issue is one of Absorbed Dose. If internally deposited radionuclides like Strontium-90, Uranium-238 and Uranium-235 bind to DNA, which is the target for genetic damage, then Dose, which is an average quantity over kilograms of tissue, is an unsafe way of quantifying genetic damage. The issue of genetic damage from radioactive pollution was first raised in 1950 by Hermann Muller, the Nobel Prize-winning geneticist who discovered the mutagenic effects of radiation, but his warnings were ignored, though they are now found to be accurate.
The serious effects of internal radionuclide exposures on prostate cancer were revealed in a study of UK Atomic Energy Authority workers published in 1993 (Fraser P, Carpenter L, Maconochie N, Higgins C, Booth M and Beral V (1993) Cancer mortality and morbidity in employees of the United Kingdom Atomic Energy Authority 1946-86. Brit. J. Cancer 67 615-624.) This paper showed a 2-fold excess cancer risk in workers who had been monitored for internal radionuclides versus those who had not been. Prostate cancer mortality was significantly high. Although later cover-up studies by the nuclear industry, using a larger cohort, reduced this effect for prostate cancer, the internal/external exposure result for all cancers has not been satisfactorily followed up.
Fig 1. First-day neonatal mortality in the USA shows the effects of the fallout. Because of advances in medicine and better social conditions, infant mortality was falling everywhere. But as soon as the atmospheric tests began, rates went up in step with the fallout. First-day neonatal mortality is a measure of congenital damage: the baby survives in the womb on the mother's oxygenation and other support, but because its own organs are damaged it cannot survive after birth. Strontium-90 was measured in bone, where it built up to a peak in 1964. It will also have attached to chromosomes, owing to its affinity for DNA.
The fallout cohort is now entering the cancer bracket, and these people are driving up the cancer rates in the Northern hemisphere, especially for breast cancer and prostate cancer. I have been studying this group since 1995, and now my predictions are appearing in the data.
But the true picture of the fallout effects is even more scary. Not only are the babies born over the peak fallout period, like George, at higher risk of more and earlier cancer, but it is now emerging that their children, born around 1980-1990, are carrying the same genetic (or rather genomic) curse. I am in the process of putting together a scientific paper on this. There is a sudden increase in cancer rates in young people aged 25-35 which began after 2008. This is an extraordinary development. The finding was confirmed for colon cancer in the USA in a paper published recently in the Journal of the National Cancer Institute (Rebecca L. Siegel, Stacey A. Fedewa, William F. Anderson, Kimberly D. Miller, Jiemin Ma, Philip S. Rosenberg, Ahmedin Jemal. Colorectal Cancer Incidence Patterns in the United States, 1974–2013. JNCI J Natl Cancer Inst (2017) 109(8): djw322). The authors were unable to explain their finding of increasing colon cancer in young people alongside decreasing colon cancer rates in older people. They were "puzzled". The explanation is simple. These were children born to those who were themselves born during the fallout and genomically damaged at birth. The damage is passed to the children (and will in turn be passed to theirs, and so on). The effect is also clear in the England and Wales data.
So, for the logical positivists, let’s have a look at the prostate cancer data in England and Wales.
In Table 1 below I show some data from the official ONS government annual reports on prostate cancer incidence in some selected years from 1974 to 2015.
No argument there then. The amazing thing is that there are huge amounts of money received and spent on cancer research: but no-one looks at the cause. Or rather that those who do look at the cause are attacked and marginalised and their work is not reported.
For example, and relevant here, are the serious genetic effects of small-dose internal exposures in Europe after Chernobyl, reviewed by Prof Inge Schmitz-Feuerhake, Dr Sebastian Pflugbeil and myself in a peer-reviewed publication in 2016 (Schmitz-Feuerhake I, Busby C, Pflugbeil S. Genetic Radiation Risks: A Neglected Topic in the Low Dose Debate. Environmental Health and Toxicology. 2016;31: Article ID e2016001.) You would think that this evidence, reported in the peer-reviewed literature from 20 studies from countries all over Europe, might make it into one of the newspapers. But nothing.
My attempts to draw attention to these internal genetic damage issues have also been ignored or dismissed by the British establishment. This year, in September, I was to have presented this evidence to British Government Minister Richard Harrington at a meeting of the NGOs and the government at Church House, Westminster. My flight from Sweden was sabotaged, but I made it to the meeting nevertheless, only to find that the Minister had made some excuse and had not come.
At the meeting, the government radiation expert committee members (COMARE) refused to consider anything I said.
This behaviour by the British can be compared with the Swedish Environmental Court in Stockholm to which I had been presenting the same findings the previous week. In January 2018, the 8 judges of the Swedish Court told the Swedish government that they must not permit the development of the nuclear waste facility at Forsmark. This landmark decision was also omitted from any newspapers in the UK, which itself is currently busy trying to find a local council they can bribe to allow them to bury nuclear waste somewhere in England and (more probably) Wales.
When I presented the same genetic damage evidence in the nuclear test veteran case in the Royal Courts of Justice in 2016, I submitted reports by 4 eminent radiation experts, including Prof Schmitz-Feuerhake. All gave evidence under cross-examination. We filed the evidence of genetic damage in the Test Veteran children: a 10-fold excess risk of congenital malformations, and in the grandchildren 8-fold. The British judge, Sir Nicholas Blake, refused to listen to any of this evidence and dismissed our experts. Blake found for the Ministry of Defence. I am taking a new Test Veteran case this summer. We shall see what happens.
But no surprise about judge Blake. In a recent survey of judges in Europe, it was found that Britain was only exceeded by Albania in the percentage of judges (45%) who reported that their decisions had been made at the direction of the establishment. The lowest rates of interference with judges was found (1%) in—guess where—Norway, Sweden and Denmark.
It seems that we live in a corrupt society here in Britain and I am ashamed to be part of this State which has poisoned its citizens consistently since 1945 and continues to do so, and to cover it all up, aided by dishonest scientists and celebrity reporters like George Monbiot. Those who have a magical view of events might delight in thinking that George has received his just due; for myself I just hope that this may make him look into the issue more deeply and change his mind about the effects of radioactive contamination.
In a 2002 paper that gave rise to what is frequently referred to as "Munk's enigma", Walter Munk, a senior researcher at the Scripps Institution of Oceanography, bemoaned the fact that researchers could not fully account for the causes of sea level rise. He lamented, "the historic rise started too early, has too linear a trend, and is too large." Early IPCC analyses noted about 25% of estimated sea level rise was unaccounted for. Accordingly, in 2012, an international team of prominent sea level researchers published Twentieth-Century Global-Mean Sea Level Rise: Is the Whole Greater than the Sum of the Parts? (henceforth Gregory 2012). They hoped to balance struggling sea level budgets by re-analyzing and adjusting estimates of the contributions from melting glaciers and ice caps, thermal expansion, and the effects of dam building and groundwater extraction. However, a natural contribution from any imbalance in groundwater recharge vs discharge was never considered. Yet the volume of freshwater stored as groundwater is second only to Antarctica's frozen supply, and 3 to 8 times greater than Greenland's.
At the risk of oversimplifying, the effects of groundwater storage can be differentiated between shallow-aquifer effects that modulate global sea level on year to year and decade to decade timeframes, versus deep aquifer effects that modulate sea level trends over centuries and millennia.
Researchers are increasingly aware of natural shallow groundwater dynamics. As noted by Reager (2016) in A Decade of Sea Level Rise Slowed by Climate-Driven Hydrology, researchers had determined the seasonal delay in the return of precipitation to the oceans causes sea levels to oscillate by 17 ± 4 mm [~0.7 inches] per year. Reager (2016) also argued decadal increases in terrestrial water storage, driven by climate events such as La Niña, had reduced sea level rise by 0.71 mm/year. Likewise, Cazenave (2014) had reported that, according to altimetry data, sea level rise had decelerated from 3.5 mm/yr in the 1990s to 2.5 mm/yr during 2003-2011, and that the deceleration could be explained by increased terrestrial water storage and the pause in ocean warming reported by Argo data.
Improved observational data suggest that during more frequent La Niña years a greater proportion of precipitation falls on the land globally, and when routed through more slowly discharging aquifers, sea level rise decelerates. During periods of more frequent El Niños, more rain falls back onto the oceans, and sea level rise accelerates. In contrast to La Niña-induced shallow-aquifer effects, deep aquifers were filled with meltwater from the last Ice Age, and that water is slowly and steadily seeping back into the oceans today.
Munk’s “Too Linear Trend” Enigma and Deep Groundwater Discharge
Hydrologists concerned with sustainable groundwater supplies and drinking water contamination have been at the forefront of analyzing the volume and ages of the world's groundwater, providing greater insight into deep aquifer effects. Gleeson (2015) determined, "total groundwater volume in the upper 2 km of continental crust is approximately 22.6 million cubic kilometers, twice as much as earlier estimates." If all 22.6 million cubic kilometers of freshwater stored underground reached the oceans, sea level would rise 204 feet (62,430 millimeters). Via various isotope analyses and flow models, Jasechko (2017) estimated that between 42-85% of all groundwater stored in the upper 1 kilometer of the earth's crust is water that had infiltrated the ground more than 11,000 years ago, during the last Ice Age.
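As a back-of-the-envelope check on that sea-level equivalence, one can spread the quoted groundwater volume over the ocean surface. The ocean area used here (~361 million km²) is a standard reference value, not a figure from the article, which explains the small difference from the quoted 204 feet:

```python
# Sea-level equivalent of global groundwater (illustrative arithmetic only).
# Assumes an ocean surface area of ~361 million km^2, a standard reference value.
groundwater_km3 = 22.6e6        # Gleeson (2015): upper 2 km of continental crust
ocean_area_km2 = 361e6          # assumed ocean surface area

rise_km = groundwater_km3 / ocean_area_km2
rise_mm = rise_km * 1e6         # 1 km = 1,000,000 mm
rise_ft = rise_mm / 304.8       # 1 ft = 304.8 mm

print(f"{rise_mm:,.0f} mm (~{rise_ft:.0f} feet)")
```

The result, roughly 62,600 mm (~205 feet), agrees with the article's 204-foot figure to within the uncertainty of the assumed ocean area.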
Clearly the earth’s groundwater has yet to reach an equilibrium with modern sea levels. With deep aquifer discharge primarily regulated by geological pore spaces (in addition to pressure heads), the slow and steady discharge of these older waters affects sea level rise on century and millennial timeframes. And, although freshwater discharge from deep aquifers may be locally insignificant relative to river runoff, deep aquifer discharge when integrated across the globe could account for the missing contribution to the sea level rise budgets.
Unfortunately, quantifying the groundwater discharge contribution to sea level rise is extremely difficult, suffering from a low signal-to-noise problem. That difficulty is why natural groundwater contributions are often ignored or brushed aside as insignificant. Although GRACE satellite monitoring of gravity changes offers great promise for detecting changes in terrestrial groundwater storage, GRACE cannot accurately separate the relatively small discharge of deep aquifers from large annual changes in shallow groundwater. In periods of heavy rains, groundwater increases will mask deep aquifer discharge. And during a drought, any deep groundwater discharge will likely be attributed to the lack of rain.
However, estimates of groundwater recharge via isotope analyses can provide critical information regarding rates of groundwater recharge and discharge.
Using the abnormal levels of tritium released during nuclear testing in the 1950s, plus carbon-14 dating, researchers have categorized groundwater by the time since it last left the surface into 25-, 50-, 75- and 100-year age classes. As expected, the youngest water is concentrated in the shallowest aquifer layers, and the proportion of young water decreases with depth. The estimated volume of groundwater aged 25 years or younger suggests global groundwater is currently recharging at a rate that would reduce sea level by 21 mm/year (0.8 inches/year). Water cycle researchers (e.g. Dai and Trenberth) have made the dubious assumption that the amount of water transported via precipitation to the land from the ocean is balanced each year by river runoff. But if the tritium-derived estimates are valid, balancing water cycle and sea level budgets becomes more enigmatic. Clearly a significant amount of precipitation does not return for decades and centuries.
Intriguingly, comparing the smaller volume of groundwater aged 50 to 100 years old versus the volume of water 50 years old and younger suggests two possible scenarios. Either groundwater recharge has increased in recent decades, or, if recharge rates averaged over 50 years have remained steady, then as groundwater ages a significant portion seeps back to the ocean at rates approaching 1.7 mm/year, a rate very similar to 20th century IPCC estimates of sea level rise.
Groundwater discharge must balance recharge or else it directly alters global sea levels. When less than 21 mm/year seeps back to the ocean, then natural groundwater storage lowers sea level. When discharge is greater than 21 mm/year, then groundwater discharge is raising sea level. Without accounting for recharge vs discharge, the much smaller estimates of all the other factors contributing to sea level rise are simply not well constrained.
Higher rates of discharge could account for the enigmatic missing sea level contributions reported by the IPCC and other researchers (i.e. Gregory 2012). More problematic, if discharge proves to significantly exceed recharge, then estimates of contributions from other sources such as melting ice and thermal expansion may be too high. What is certain is that current estimates of contributions to sea level from melting ice and thermal expansion only range from 1.5 to 2.0 mm/year, and those factors by themselves cannot offset the tritium-estimated 21 mm/year of groundwater recharge. So, what is missing in our current water cycle budgets?
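The bookkeeping behind this argument is simple enough to sketch. Using only the figures quoted in the text (all in mm/year of sea-level equivalent), the net groundwater contribution to sea level is just discharge minus recharge:

```python
# Sketch of the article's groundwater bookkeeping (illustrative only).
# All quantities are in mm/year of sea-level equivalent, taken from the text.
RECHARGE = 21.0   # tritium-derived estimate of annual groundwater recharge

def groundwater_contribution(discharge_mm_per_yr):
    """Net sea-level contribution: positive raises sea level, negative lowers it."""
    return discharge_mm_per_yr - RECHARGE

print(round(groundwater_contribution(21.0), 1))   # balanced budget: 0.0
print(round(groundwater_contribution(22.7), 1))   # discharge excess: 1.7 mm/yr
print(round(groundwater_contribution(19.0), 1))   # recharge excess: -2.0 mm/yr
```

The middle case shows why the 1.7 mm/year figure mentioned earlier matters: a discharge only slightly above recharge would by itself match 20th century estimates of sea level rise.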
The Importance of Submarine Groundwater Discharge (SGD)
The recharge-discharge imbalance can be reconciled if water cycle budgets included the difficult-to-measure rates of prolific submarine groundwater discharge (SGD). Freshwater springs bubbling up from coastal sea floors have long been observed. To reliably replenish drinking water, Roman fishermen mapped their occurrences throughout the Mediterranean. Moosdorf (2017) has reviewed the locations and many human uses of fresh submarine groundwater discharge around the world.
Recent ecological studies have measured local submarine groundwater seepages to determine contributions of solutes and nutrients to coastal ecosystems. But those sparse SGD measurements cannot yet be reliably integrated into a global estimate. Rodell (2015) notes that most water cycle budgets have ignored SGD due to its uncertainty, so Rodell’s water cycle budget included a rate of SGD equivalent to 6.5 millimeters/year (~0.25 inch/yr) of sea level rise. However, that estimate is still insufficient to balance current recharge estimates.
However, with improving techniques, researchers recently estimated total submarine groundwater discharge (saline and fresh water combined) at a rate 3 to 4 times greater than the observed global river runoff, or a volume equivalent to 331 mm/year (13 inches) of sea level rise. Nonetheless, more than 90% of that submarine discharge is saline sea water, most of which is likely recirculated sea water and not likely to affect sea level. Only the fraction of entrained freshwater would raise sea level. To balance the 21 mm/year of groundwater recharge, between 6 and 7% of total SGD must be freshwater, and that fraction is very plausible. Local estimates of the freshwater fraction of submarine discharge range from 1 to 35%, averaging just under 10%. If fresh submarine groundwater discharge approaches just 7% of the total SGD, it would not only balance current groundwater recharge but would steadily raise sea level by an additional 2 mm/year, even if there were no ocean warming and no melting glaciers.
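The 6-7% figure follows directly from the numbers quoted above, as this quick check shows (figures from the text; illustrative only):

```python
# Freshwater fraction of submarine groundwater discharge (SGD) needed to
# balance groundwater recharge. Figures are taken from the text.
total_sgd = 331.0   # mm/yr sea-level equivalent, total (saline + fresh) SGD
recharge = 21.0     # mm/yr sea-level equivalent, tritium-derived recharge

# Fraction of total SGD that must be freshwater just to balance recharge.
balance_fraction = recharge / total_sgd
print(f"fraction needed to balance recharge: {balance_fraction:.1%}")

# If the freshwater fraction were 7%, the net contribution would be positive.
fresh_at_7pct = 0.07 * total_sgd
net = fresh_at_7pct - recharge
print(f"net sea-level contribution at 7% freshwater: {net:.1f} mm/yr")
```

A balance fraction of about 6.3%, and a net contribution of roughly 2 mm/year at 7% freshwater, match the figures asserted in the paragraph above.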
A Sea Level Rise “Base-flow” and Paleo-climate Conundrums
Hydrologists seek to quantify the aquifer contributions to river flow, otherwise known as the “base flow”. During the rainy season or the season of melting snow, any groundwater contribution is masked by heavy surface runoff and shallow aquifer effects. However, during extended periods of drought hydrologists assume the low river flow that persists must be largely attributed to supplies from deeper aquifers. Streams that dry up during a drought are usually supported by small shallow aquifers, while reduced but persistent river and stream flows must be maintained by large aquifers. Using a similar conceptual approach, we can estimate a possible “base flow” contribution to sea level.
When the continental ice sheets began to melt as the earth transitioned from its Ice Age maximum to our present warm interglacial, sea level began to rise from depths ~130 meters lower than today (see graph below). Melting continental ice sheets drove much higher rates of sea level rise than seen today, ranging from 10 to 40+ mm/year. Approximately 6,000 years ago, a consensus suggests, the last of the continental ice sheets had melted completely, the earth's montane glaciers had disappeared, and the Greenland and Antarctic ice sheets had shrunk to their minimums. The earth then entered a long-term 5,000-year cooling trend dubbed the Neoglaciation. Although sea level models forced only by growing glaciers and cooling ocean temperatures would project falling sea levels, proxy evidence enigmatically suggests global sea level continued to rise, albeit at reduced rates, by another 4 meters (Figure 1 below). Although there is some debate regarding any continued contribution from Antarctica and "ocean siphoning", according to Lambeck 2014 about 3 meters of sea level were added between 6.7 and 4.2 thousand years ago. That continued sea level rise could be explained by aquifer discharge, suggesting a minimal "base flow" of ~1.2 mm/year from groundwater discharge.
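The ~1.2 mm/year figure is just the quoted Lambeck 2014 rise averaged over the quoted interval:

```python
# Implied groundwater "base flow" from mid-Holocene sea-level rise.
# Figures are those quoted in the text (Lambeck 2014): ~3 m of rise
# between 6.7 and 4.2 thousand years ago.
rise_mm = 3000.0
years = (6.7 - 4.2) * 1000      # 2,500 years

base_flow = rise_mm / years
print(f"{base_flow:.1f} mm/yr") # ~1.2 mm/yr
```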
Similarly, during the Little Ice Age between 1300 and 1850 AD, montane glaciers as well as the Greenland and Antarctic ice sheets grew, reaching their largest extent in the last 7,000 years. Ocean temperatures cooled by about 1 degree. Yet inexplicably, most researchers estimate global sea level never dropped significantly. They report sea levels were "stable" during the Little Ice Age, fluctuating only by tenths of a millimeter. That stability contrasts greatly with the recent rising trend, a contrast that has led some to attribute the current rise to increasing CO2 concentrations. However, Little Ice Age stability defies the physics of cooling temperatures and increasing water storage in growing glaciers, which should have caused a significant sea level fall. That seeming paradox is, however, consistent with a scenario in which a "base flow" from groundwater discharge would offset any transfer of waters to growing Little Ice Age glaciers.
Once the growth of Little Ice Age glaciers stopped, and groundwater base flow was no longer offset, we would expect sea levels to rise as witnessed during the 19th and 20th centuries. Such a scenario would also explain Munk’s enigma that sea level rise had started too early, before temperatures had risen significantly from any CO2-driven warming.
Interestingly, assuming a ballpark figure of a 1.2 mm/year groundwater base flow, unbalanced groundwater discharge could also explain the much higher sea levels estimated for the previous warm interglacial, the Eemian. Researchers estimate sea levels ~115,000 years ago were about 6 to 9 meters higher than today. That interglacial has also been estimated to have spanned 15,000 years before continental glaciation resumed. Compared to our present interglacial span of 11,700 years, an extra 3,300 years of groundwater discharge before being offset by resumed glacier growth, could account for 4 meters of the Eemian’s higher sea level.
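That Eemian estimate is straightforward to reproduce from the figures in the paragraph:

```python
# Extra sea-level rise from an extra span of groundwater discharge during
# the Eemian. Figures are those quoted in the text and are illustrative:
# an interglacial lasting ~15,000 years vs our current ~11,700 years, at a
# ballpark groundwater base flow of 1.2 mm/yr.
extra_years = 15000 - 11700          # 3,300 extra years of discharge
base_flow_mm_per_yr = 1.2

extra_rise_m = extra_years * base_flow_mm_per_yr / 1000
print(f"{extra_rise_m:.2f} m")       # ~3.96 m, i.e. roughly 4 meters
```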
Is the Recent Glacier Meltwater Contribution to Sea Level Overestimated?
In addition to a groundwater base flow driving the current steady rise in sea level, meltwater from retreating Little Ice Age glaciers undoubtedly contributed as well. But by how much? Researchers have estimated there was greater glacial retreat (and thus a greater flux of meltwater) in the early 1900s compared to now. So, current glacier retreat is unlikely to cause any acceleration of recent sea level rise. Furthermore, we cannot assume glacier meltwater rapidly enters the oceans. A large proportion of meltwater likely enters the ground, so it may take several hundred years for Little Ice Age glacier meltwater to affect sea level.
How fast can groundwater reach the ocean? Groundwater measured in the Great Plains' Ogallala Aquifer can flow at a higher-than-average seepage rate of ~300 mm (~1 foot) in a day, or about the length of a football field in a year. For such "fast" moving groundwater to travel 1,000 kilometers (620 miles) to the sea would require over 9,000 years! Most groundwater travels much slower. The great weight of the continental glaciers during our last Ice Age applied such great pressure that it forced meltwater into the ground at much greater rates than currently observed recharge. And that Ice Age meltwater is still slowly moving through aquifers like the Ogallala.
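The transit-time arithmetic, using only the figures quoted above, works out to roughly nine millennia, which illustrates the order of magnitude involved:

```python
# Travel time for "fast" groundwater, using the Ogallala figures quoted
# in the text: ~300 mm/day seepage over a 1,000 km path to the sea.
# Illustrative arithmetic only; most groundwater moves far slower.
seepage_m_per_day = 0.300
distance_m = 1000 * 1000             # 1,000 km in metres

days = distance_m / seepage_m_per_day
years = days / 365                   # ~109.5 m/yr, about a football field
print(f"{years:,.0f} years")         # on the order of 9,000 years
```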
(However, its release to the ocean has been sped up by human pumping. Recent estimates suggest that globally, human groundwater extraction currently exceeds rates of water capture from dam building, so that groundwater depletion is now accelerating sea level rise.)
How much of the current meltwater can we expect to transit to the ocean via a slow groundwater route? That's a tough question to answer. However, thirteen percent of the earth's ice-free land surface is covered by endorheic basins, as illustrated by the gray areas shown in the illustration below. Endorheic basins have no direct outlets to the ocean. Water entering endorheic basins returns to the sea only via evaporation, or by the extremely slow route of groundwater discharge. Any precipitation or glacial meltwater flowing into an endorheic basin could require centuries to thousands of years to flow back to the oceans.
For example, in 2010-2011, researchers reported that a La Niña event had caused global sea level to fall by the equivalent of 7 mm/year (~0.3 inches/year). That dramatic drop happened despite concurrent extensive ice melt in Greenland and despite any base flow contribution. As described by Fasullo (2013), GRACE satellite observations detected increased groundwater storage caused by higher rates of rainwater falling on endorheic basins, primarily in Australia. Although satellite observations suggested much of the rainwater remained in the Australian basin, sea level soon resumed its unabated rise, as a groundwater base flow contribution would predict.
To balance their sea level budgets, researchers assert melting glaciers have added ~0.8 mm/year to recent sea level rise. The 20th century retreat of most glaciers is undeniable, but we cannot simply assume all 20th century glacier meltwater immediately reached the oceans. The greatest concentration of ice, outside of Greenland and Antarctica, resides in the regions north of India and Pakistan, in the Himalaya and Karakoram glaciers. Most meltwater flowing northward enters the extensive Asian endorheic basins. Likewise, some of the Sierra Nevada meltwater flows into Nevada's Great Basin, and some Andes meltwater flows into the endorheic basins of the Altiplano and Lake Titicaca as well as the Atacama Desert. It is very likely much of the current glacial meltwater will take decades to millennia to reach the ocean and has yet to impact modern sea levels. If the glacial meltwater contribution to sea level is overestimated, then the unaccounted-for contribution to sea level rise becomes much larger than initially thought.
Accurate Attribution of Groundwater Discharge and Recharge Will Constrain Sea Level Contributions
Using a combination of GRACE gravity data that measured changes in ocean mass, altimetry data that measured changes in ocean volume, and Argo data that measured heat content, Cazenave (2008) used two different methods, both of which estimated the contribution from increased ocean heat at about 0.3 to 0.37 mm/year. Jevrejeva (2008) calculated a similar heat contribution. Other researchers suggest thermal expansion contributes 1.2 to 1.5 mm/year (i.e. Chambers 2016). Such large discrepancies reveal that the contributing factors to sea level rise are not yet reliably constrained.
One of the great uncertainties in sea level research is glacial isostatic adjustment.
Researchers have subjectively adopted various glacial isostatic adjustment models, with recommended adjustments ranging from 1 to 2 mm/year. For example, although GRACE gravity estimates had not detected any added water mass to the oceans, Cazenave (2008) added a 2 mm/year adjustment, as illustrated in her Figure 1 below. Other researchers added only a 1 mm/yr adjustment.
Assume the contribution from glacier melting was greater than previously estimated.
But greater melting rates were documented for the 1930s and 1940s, and the likelihood that some glacier meltwater is still trapped as groundwater suggests the glacier meltwater contribution has been overestimated.
Assume an increased contribution from thermal expansion.
Assume Greenland positively contributed to sea level throughout the entire 20th century.
Greenland has undoubtedly contributed to episodes of accelerating and decelerating sea level changes, but the greatest rate of Greenland warming occurred during the 1920s and 30s. Previous researchers suggested Greenland glaciers oscillated during the 20th century but were stable from the 1960s to the 1990s. Although there was increased surface melt in the 21st century, culminating in 2012, that melt rate has since declined. And according to the Danish Meteorological Institute, Greenland gained about 50 billion tons of ice in 2017, which should have lowered sea level in 2017. Clearly Greenland cannot explain the enigmatic steady 20th century sea level rise.
Assume reservoir water storage balanced groundwater extraction.
But net contributions from groundwater extraction vs water impoundments and other landscape changes are still being debated. For the period 2002-2014, landscape changes have been estimated to have reduced sea level by 0.40 mm/year, versus IPCC estimates that they contributed 0.38 mm/year to sea level rise from 1993-2010.
Assume the remaining unaccounted contribution to sea level rise is small enough to be attributed to melting in Antarctica.
Debatably, Antarctic melting is too often used as the catch-all fudge factor to explain the unexplainable. Furthermore, there is no consensus within the Antarctic research community on whether there have been any human effects on Antarctica's ice balance. Regions that are losing ice are balanced by regions that are gaining ice. Claims of net ice loss have been countered by claims of net ice gain, such as NASA 2015. Additionally, unadjusted GRACE gravity data suggest no lost ice mass, and all estimates of ice gain or loss depend on which Glacial Isostatic Adjustments modelers choose to use. We cannot dismiss the possibility that unaccounted-for groundwater discharge has been mistakenly attributed to hypothetical Antarctic melting.
A better accounting of natural groundwater discharge is needed to constrain the range of contributions to sea level rise suggested by researchers such as Gregory 2012. The greater the contribution from groundwater discharge, the smaller the adjustments used to amplify contributions from meltwater and thermal expansion. Until a more complete accounting is determined, we can only appreciate Munk’s earnest concern. How can we predict future sea level rise if we don’t fully understand the present or the past?
Below are a series of rebuttals of typical climate alarmists’ claims such as those made in the recently released Fourth National Climate Assessment Report. The authors of these rebuttals are all recognized experts in the relevant scientific fields. The rebuttals demonstrate the falsity of EPA’s claims merely by citing the most credible empirical data on the topic. For each alarmist claim, a summary of the relevant rebuttal is provided along with a link to the full text of the rebuttal which includes the names and the credentials of the authors of each rebuttal.
Claim #1: Heat Waves Are Increasing at an Alarming Rate and Heat Kills
Claim #2: Global Warming Is Causing More Hurricanes and Stronger Hurricanes
Claim #3: Global Warming Is Causing More and Stronger Tornadoes
Claim #4: Global Warming Is Increasing the Magnitude and Frequency of Droughts and Floods
Claim #5: Global Warming Has Increased U.S. Wildfires
Claim #6: Global Warming Is Causing Snow to Disappear
Claim #7: Global Warming Is Resulting in Rising Sea Levels as Seen in Both Tide Gauge and Satellite Technology
Claim #8: Arctic, Antarctic and Greenland Ice Loss Is Accelerating Due to Global Warming
Claim #9: Rising Atmospheric CO2 Concentrations Are Causing Ocean Acidification, Which Is Catastrophically Harming Marine Life
Claim #10: Carbon Pollution Is a Health Hazard
Claim #1: Heat Waves are Increasing at an Alarming Rate and Heat Kills
Summary of Rebuttal
There has been no detectable long-term increase in heat waves in the United States or elsewhere in the world. Most all-time record highs here in the U.S. happened many years ago, long before mankind was using much fossil fuel. Thirty-eight states set their all-time record highs before 1960 (23 in the 1930s!). Here in the United States, the number of 100F, 95F and 90F days per year has been steadily declining since the 1930s. The Environmental Protection Agency Heat Wave Index confirms the 1930s as the hottest decade.
James Hansen, while at NASA in 1999, said about the U.S. temperature record, "In the U.S. the warmest decade was the 1930s and the warmest year was 1934". When NASA was challenged on the declining heat records in the U.S., the reply was that the U.S. is just 2% of the world. However, every continent recorded its all-time record high before 1980. Interestingly, while the media give a great deal of coverage to even minor heat waves to support the case that man-made global warming is occurring, they tend to ignore deadly cold waves. In fact, worldwide, cold kills 20 times as many people as heat. This is documented in the "Excess Winter Mortality" figures, which show that the number of deaths in the 4 coldest winter months is much higher than in the other 8 months of the year. The U.S. death rate in January and February is more than 1,000 deaths per day greater than it is in July and August.
Clearly, there is no problem with increased heat waves due to climate change.
Claim #2: Global Warming Is Causing More Hurricanes and Stronger Hurricanes
Summary of Rebuttal
There has been no detectable long-term trend in the number and intensity of hurricane activity globally. The activity does vary year to year and over multidecadal periods as ocean cycles including El Nino/La Nina, multidecadal cycles in the Pacific (PDO) and Atlantic (AMO) favor some basins over others.
The trend in landfalling storms in the United States has been flat to down since the 1850s. Before the active hurricane season in the United States in 2017, there had been a lull of 4,324 days (almost 12 years) in major hurricane landfalls, the longest lull since the 1860s. Harvey was the first hurricane to make landfall in Texas since Ike in 2008 and the first Category 4 hurricane in Texas since Hurricane Carla in 1961. There has been a downtrend in Texas of both hurricanes and major hurricanes. Texas is an area where Gulf tropical storms and hurricanes often stall for days, and 6 of the heaviest tropical rainfall events for the U.S. have occurred in Texas. Harvey's rains were comparable to many of these events. Claudette in 1979 had an unofficial rainfall total greater than Harvey's.
In Florida, where Irma hit the Keys as a Category 4, it came after a record 4,339 days (just short of 12 years) without a landfalling hurricane. The previous record lull, in the 1860s, was 8 years. There has been no trend in hurricane intensity or landfalling frequency since at least 1900.
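The day counts quoted above can be sanity-checked with simple arithmetic (a quick sketch; the day totals come from the text itself, not from an authoritative hurricane database):

```python
# Convert the quoted landfall-lull lengths from days to years.
DAYS_PER_YEAR = 365.25  # average, accounting for leap years

for label, days in [("U.S. major-hurricane landfall lull", 4324),
                    ("Florida landfalling-hurricane lull", 4339)]:
    years = days / DAYS_PER_YEAR
    print(f"{label}: {days} days = {years:.1f} years")
```

Both figures work out to 11.8–11.9 years, consistent with the text's "almost 12 years" and "just short of 12 years."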
Claim #3: Global Warming Is Causing More and Stronger Tornadoes
Summary of Rebuttal
Tornadoes are failing to follow “global warming” predictions. Big tornadoes have seen a decline in frequency since the 1950s. The years 2012, 2013, 2014, 2015 and 2016 all saw below average to near record low tornado counts in the U.S. since records began in 1954. 2017 to date has rebounded only to the long-term mean.
This lull followed a very active and deadly strong La Nina of 2010/11, which like the strong La Nina of 1973/74 produced record-setting and very deadly outbreaks of tornadoes. Population growth and expansion outside urban areas have exposed more people to the tornadoes that once roamed through open fields. Tornado detection has improved with the addition of NEXRAD, the growth of trained spotter networks, storm chasers armed with cellular data and imagery, and the proliferation of cell phone cameras and social media. This shows up most in the count of weak EF0 tornadoes, but for storms from moderate EF1 to strong EF3+ intensity, the trend has been flat to down despite improved detection.
Claim #4: Global Warming Is Increasing the Magnitude and Frequency of Droughts and Floods
Summary of Rebuttal
Our use of fossil fuels to power our civilization is not causing droughts or floods. NOAA found there is no evidence that floods and droughts are increasing because of climate change. The number, extent, or severity of these events does increase dramatically for a brief period of years at some locations from time to time, but conditions then return to normal. This is simply the long-established natural variation of weather resulting from a confluence of natural factors. In testimony before Congress, Professor Roger Pielke, Jr. said: “It is misleading, and just plain incorrect, to claim that disasters associated with hurricanes, tornadoes, floods, or droughts have increased on climate timescales either in the United States or globally. Droughts have, for the most part, become shorter, less frequent, and cover a smaller portion of the U.S. over the last century.”
“The good news is U.S. flood damage is sharply down over 70 years,” Roger Pielke Jr. said. “Remember, disasters can happen any time… But it is also good to understand long-term trends based on data, not hype.”
Claim #5: Global Warming Has Increased U.S. Wildfires
Summary of Rebuttal
Wildfires are in the news almost every late summer and fall. The National Interagency Fire Center has recorded the number of fires and acreage affected since 1985. These data show the number of fires trending down slightly, though the acreage burned had increased before leveling off over the last 20 years. The NWS tracks the number of days where conditions are conducive to wildfires when it issues red-flag warnings; that count is little changed. 2017 was an active fire year in the U.S. but by no means a record. The U.S. had 64,610 fires, the 7th most in 11 years and the most since 2012. The 9,574,533 acres burned was the 4th most in 11 years and the most since 2015. The fires burned first in the Northwest, including Montana, after a very dry summer; the action then shifted south seasonally with the start of wind events like the Diablo in northern California and the Santa Ana to the south.
Fires spread to northern California in October with an episode of the dry Diablo wind that blows from the east, and then in December strong and persistent Santa Ana winds and dry air triggered a round of large fires in Ventura County. According to the California Department of Forestry and Fire Protection, the 2017 California wildfire season was the most destructive on record, with a total of 8,987 fires that burned 1,241,158 acres. It included five of the 20 most destructive wildland-urban interface fires in the state’s history.
When considering the number of deaths and structures destroyed, the seven-fold increase in California's population from 1930 to 2017 must be noted. Not only does this increase mean more people and structures in the path of fires, it also means more fires. Lightning and campfires caused most historic fires; today most are the result of power lines igniting trees. Power lines have increased proportionately with the population, so it can be reasoned that most of the damage from wildfires in California is a result of increased population, not global warming. The danger is also greatly aggravated by poor government forest management choices.
Claim #6: Global Warming Is Causing Snow to Disappear
Summary of Rebuttal
This is one claim that has been repeated for decades, even as nature showed very much the opposite trend, with unprecedented snows even in the big coastal cities. Every time the claim was repeated, nature seemed to up the ante.
Alarmists eventually shifted to crediting warming with producing greater snowfall because of increased moisture, but the snow events in recent years have usually occurred in colder winters, with high snow-to-water-equivalent ratios in frigid arctic air.
Snowcover in the Northern Hemisphere, North America, and Eurasia has been increasing since the 1960s in the fall and winter but declining in the spring and summer. However, as NOAA advised might be the case, snowcover measurement methodology changes at the turn of this century may be responsible for part of the warm season differences.
Claim #7: Global Warming Is Resulting in Rising Sea Levels as Seen in Both Tide Gauge and Satellite Technology
Summary of Rebuttal
This claim is demonstrably false. It really hinges on this statement: “Tide gauges and satellites agree with the model projections.” The models project a rapid acceleration of sea level rise over the next 30 to 70 years. However, while the models may project acceleration, the tide gauges clearly do not.
All data from tide gauges in areas where land is not rising or sinking show instead a steady, linear, unchanging rate of sea level rise of 4 to 6 inches per century, with variations due to gravitational factors. It is true that where the land is sinking, as in the Tidewater area of Virginia and the Mississippi Delta region, sea levels will appear to rise faster, but no change in CO2 production would change that.
The implication that measured, validated, and verified tide gauge data support this conclusion is simply false. All such references rely on “semi-empirical” information, which merges actual tide gauge data with various models of the reference author’s choosing. Nowhere on this planet can a tide gauge be found that shows even half of the claimed 3.3 mm/yr sea level rise rate in “tectonically inert” coastal zones. These are areas that lie between regions of geological uplift and subsidence; they are essentially neutral with respect to vertical land motion, and tide gauges located there show between 1 mm/yr (3.9 inches/century) and 1.5 mm/yr (6 inches/century) of rise. The Swedish oceanographer Nils-Axel Mörner has commented on this extensively, and his latest papers confirm this ‘inconvenient truth.’
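The unit conversions between mm/yr and inches/century used throughout this section can be checked directly (a minimal sketch using only the conversion factor of 25.4 mm per inch):

```python
# Convert a sea level rise rate from mm/year to inches/century.
MM_PER_INCH = 25.4

def mm_per_yr_to_in_per_century(rate_mm_yr):
    """Rate over 100 years, expressed in inches."""
    return rate_mm_yr * 100.0 / MM_PER_INCH

for rate in (1.0, 1.5, 3.3):
    print(f"{rate} mm/yr = {mm_per_yr_to_in_per_century(rate):.1f} inches/century")
```

This reproduces the quoted figures: 1 mm/yr is about 3.9 inches/century, 1.5 mm/yr is about 5.9 (roughly 6) inches/century, and the claimed 3.3 mm/yr rate would be about 13 inches/century.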
Further, alarmist claims that “satellites agree with the model projection” are false. Satellite technology was introduced to provide a more objective measurement of sea level rise because properly adjusted tide gauge data did not fit Alarmists’ claims. However, the new satellite and radar altimeter data lacked the resolution to measure sea levels accurately down to the mm level. Moreover, the raw data from this technology also conflicted with Alarmists’ claims. As a result, adjustments to these data were also made – most notably a Glacial Isostatic Adjustment (GIA). GIA assumes that essentially all land is rebounding from long-ago glaciations and that oceanic basins are deepening; the assumption is that this rebounding masks the true sea level rise. Alarmists continue to proclaim that their models project a rapid acceleration of sea level rise over the next 30 to 70 years, when those same models have failed to come close to reproducing even the past 25 years.
Claim #8: Arctic, Antarctic and Greenland Ice Loss Is Accelerating Due to Global Warming
Summary of Rebuttal
Satellite and surface temperature records and sea surface temperatures show that both the East Antarctic Ice Sheet and the West Antarctic Ice Sheet are cooling, not warming, and that glacial ice is increasing, not melting. Satellite and surface temperature measurements of the southern polar area show no warming over the past 37 years. Growth of the Antarctic ice sheets means sea level rise is not being caused by melting of polar ice and, in fact, is slightly lowering the rate of rise. Satellite Antarctic temperature records show 0.02°C/decade cooling since 1979. The Southern Ocean around Antarctica has been getting sharply colder since 2006. Antarctic sea ice is increasing, reaching all-time highs. Surface temperatures at 13 stations show the Antarctic Peninsula has been sharply cooling since 2000.
The Arctic includes the Arctic Ocean, Greenland, Iceland, and parts of Siberia and northern Alaska. Because of the absence of any land mass in the Arctic Ocean, most of the area lacks glaciers, which require a land mass; most of the Arctic contains only floating sea ice. Greenland, Iceland, northern Alaska, and northern Siberia contain the only glaciers in the general Arctic region. Arctic temperature records show that the 1920s and 1930s were warmer than 2000. Records of historic fluctuations of Arctic sea ice go back only to the first satellite images in 1979, which happens to coincide with the end of the 1945–1977 global cold period and the maximum extent of Arctic sea ice. During the warm period from 1978 until recently, the extent of sea ice diminished, but it has increased in the past several years. The Greenland ice sheet has also grown recently.
Claim #9: Rising Atmospheric CO2 Concentrations Are Causing Ocean Acidification, which Is Catastrophically Harming Marine Life
Summary of Rebuttal
As the air’s CO2 content rises in response to ever-increasing anthropogenic CO2 emissions, more and more carbon dioxide is expected to dissolve into the surface waters of the world’s oceans, a dissolution projected to cause a 0.3 to 0.7 pH unit decline in the planet’s oceanic waters by the year 2300. A potential pH reduction of this magnitude has provoked concern and led to predictions that, if it occurs, marine life will be severely harmed—with some species potentially driven to extinction—as they experience negative impacts in growth, development, fertility and survival. This ocean acidification hypothesis, as it has come to be known, has gained great momentum in recent years, because it offers a second independent reason to regulate fossil fuel emissions in addition to that provided by concerns over traditional global warming. For even if the models are proven to be wrong with respect to their predictions of atmospheric warming, extreme weather, glacial melt, sea level rise, or any other attendant catastrophe, those who seek to regulate and reduce CO2 emissions have a fall-back position, claiming that no matter what happens to the climate, the nations of the Earth must reduce their greenhouse gas emissions because of projected direct negative impacts on marine organisms via ocean acidification.
The ocean chemistry aspect of the ocean acidification hypothesis is rather straightforward, but it is not as solid as it is often claimed to be. For one thing, the work of a number of respected scientists suggests that the drop in oceanic pH will not be nearly as great as the IPCC and others predict. And, as with all phenomena involving living organisms, the introduction of life into the analysis greatly complicates things. When a number of interrelated biological phenomena are considered, it becomes much more difficult, if not impossible, to draw such sweeping negative conclusions about the reaction of marine organisms to ocean acidification. Quite to the contrary, when life is considered, ocean acidification is often found to be a non-problem, or even a benefit. And in this regard, numerous scientific studies have demonstrated the robustness of multiple marine plant and animal species to ocean acidification—when the experiments are properly performed under realistic conditions.
Claim #10: Carbon Dioxide Is “Carbon Pollution”
Summary of Rebuttal
The term “carbon pollution” is a deliberately ambiguous, disingenuous term designed to mislead people into thinking carbon dioxide is pollution. It is used by environmentalists to conflate the environmental impacts of CO2 emissions with the impacts of the unwanted waste products of combustion. The burning of carbon-based fuels (fossil fuels – coal, oil, natural gas – as well as biofuels and biomass) converts the carbon in the fuels to carbon dioxide (CO2), an odorless, invisible gas that is plant food and essential to life on the planet. Because the burning of fuel is never 100% efficient, trace amounts of pollutants, including unburnt carbon, are produced in the form of fine particulates (soot), hydrocarbon gases and carbon monoxide. In addition, trace amounts of sulfur oxides, nitrogen oxides and other pollutant constituents can be produced. In the U.S., all mobile and industrial stationary combustion sources must have emission control systems that remove the particulates and gaseous pollutants so that emissions comply with EPA’s emission standards. Ambient air pollutant concentrations have been decreasing for decades and will keep decreasing for the foreseeable future because of existing non-GHG-related regulations.
Electricity Consumers File New Study in Their Call for EPA to Reopen its Endangerment Finding
Key Points:
o Just released, new research findings demonstrate that ten frequent Climate Alarmist claims have each been rebutted by true experts in each field, simply by citing the most relevant and credible empirical data.
o The new results invalidate ten of the most frequent Alarmist claims of recent years, and thereby also invalidate the so-called “lines of evidence” on which EPA claimed to base its 2009 CO2 Endangerment Finding.
o If the Endangerment Finding is not vacated, whether the current administration likes it or not, it is certain that the electric utility, automotive and many other industries will face ongoing EPA CO2 regulation. This scientifically illiterate basis for regulation will raise U.S. energy prices, thereby reducing economic growth, jobs and national security.
February 20, 2018
On February 9, 2018, the Concerned Household Electricity Consumers Council (CHECC) submitted a fifth Supplement to its Petition to provide additional new, highly relevant and credible information relating to variables other than temperature that describe the Earth’s climate system. With each of EPA’s three Lines of Evidence purporting to support its 2009 Endangerment Finding already shown in the CHECC petition and its first two Supplements to be invalid, EPA has no proof whatsoever that CO2 has had a statistically significant impact on global temperatures.
The Council’s original Petition and First Supplement to Petition demonstrated that the Endangerment Finding is nothing more than a set of assumptions that have each been disproved by the most relevant empirical evidence from the real world. The original Petition was substantially based on a major peer-reviewed 2016 scientific paper by James Wallace, John Christy and Joseph D’Aleo (Wallace 2016) that analyzed the best available temperature data sets and “failed to find that the steadily rising atmospheric CO2 concentrations have had a statistically significant impact on any of the 13 critically important tropical and global temperature time series data sets analyzed.” The full text of Wallace 2016 may be found here.
The First Supplement to Petition was substantially based on a new April 2017 peer-reviewed scientific paper from the same authors (Wallace 2017A), which can be found here. Wallace 2017A concluded that once the impacts of natural factors such as solar, volcanic and ENSO activity are accounted for, there is no “natural factor adjusted” warming remaining to be attributed to rising atmospheric CO2 levels.
The Second Supplement to the Petition relied on a third major peer-reviewed scientific paper from James Wallace, Joseph D’Aleo and Craig Idso, published in June 2017 (Wallace 2017B, which can be found here). Wallace 2017B analyzed the GAST data issued by the U.S. agencies NASA and NOAA, as well as by the British group Hadley CRU. In this research report, past changes in the previously reported historical data were quantified. It was found that each new version of GAST has nearly always exhibited a steeper warming linear trend over its entire history, and this result was nearly always accomplished by systematically removing the previously existing cyclical temperature pattern. This was true for all three entities providing GAST data: NOAA, NASA and Hadley CRU.
The Second Supplement to Petition states: Adjustments that impart an ever-steeper upward trend in the data by removing the natural cyclical temperature patterns present in the data deprive the GAST products from NOAA, NASA and Hadley CRU of the credibility required for policymaking or climate modeling, particularly when they are relied on to drive trillions of dollars in expenditures.
The invalidation of the adjusted GAST data knocked yet another essential pillar out from under the lines of evidence that are the claimed foundation of the Endangerment Finding. As the Second Supplement to Petition stated: It is therefore inescapable that if the official GAST data from NOAA, NASA and Hadley CRU are invalid, then both the “basic physical understanding” of climate and the climate models will also be invalid. The scientific invalidity of the Endangerment Finding becomes more blindingly obvious and undeniable with each day’s accumulation of reliable empirical data, and with the willingness of more scientists to come forward with such new evidence. (See here.)
Perhaps recognizing this fact, Climate Alarmists have over time gone from focusing on Global Warming, to Climate Change, to simply fear of Carbon. Thus, this research sought to determine the credibility of ten (10) very frequently cited Climate Alarmist claims. Above are rebuttals to each of these ten typical claims. The rebuttal authors are all recognized experts on their topics, and each rebuttal demonstrates the claim’s fallacy merely by citing the most credible empirical data.
The Conclusion of the Fifth Supplement
The invalidation of the three lines of evidence upon which EPA attributes global warming to human GHG emissions breaks the causal link between human GHG emissions and global warming. This in turn necessarily breaks the causal chain between human GHG emissions and the alleged knock-on effects of global warming, such as loss of Arctic ice, increased sea level, and increased heat waves, floods, droughts, hurricanes, tornadoes, etc.
Nevertheless, these alleged downstream effects are constantly cited to whip up alarm and create demands for ever tighter regulation of GHG emissions involving all fossil fuels, not just coal. EPA explicitly relied on predicted increases in such events to justify the Endangerment Finding. But there is no evidence to support such Alarmist Claims, and copious empirical evidence that refutes them. The enormous cost and essentially limitless scope of the government’s regulatory authority over GHG emissions cannot lawfully rest upon a collection of scary stories that are conclusively disproven by readily available empirical data.
The scientific invalidity of the Endangerment Finding becomes more blindingly obvious and undeniable with each day’s accumulation of reliable empirical data. It is time for an honest and rigorous scientific re-evaluation of the 2009 CO2 Endangerment Finding. The nation has been taken down a tragically foolish path of pointless GHG/CO2 regulations and wasteful mal-investments to “solve” a problem which does not actually exist. Our leaders must summon the courage to acknowledge the truth and act accordingly.
The legal criteria for reconsidering the Endangerment Finding are clearly present in this case. The scientific foundation of the Endangerment Finding has been invalidated. The parade of horrible calamities that the Endangerment Finding predicts and that a vast program of regulation seeks to prevent have been comprehensively and conclusively refuted by empirical data. The Petition for Reconsideration should be granted.
The Council brought its Petition because the Obama-era greenhouse gas regulations threaten, as President Obama himself conceded, to make the price of electricity “skyrocket.” But clearly CO2 regulation does not just raise electricity prices, it raises all fossil fuel prices. America can have, and must have, the lowest possible energy costs in order to attain and maintain its energy, economic and national security.
Media Contacts:
Harry W. MacDougald
Caldwell Propst & DeLoach LLP
Two Ravinia Drive, Suite 1600
Atlanta, Georgia 30346
(404) 843-1956
hmacdougald@cpdlawyers.com
Francis Menton
Law Office of Francis Menton
85 Broad Street, 18th floor
New York, New York 10004
(212) 627-1796
fmenton@manhattancontrarian.com
Stephen Paddock, Omar Mateen, Gavin Long, Eric Harris, Dylan Klebold, James Holmes, and now Nikolas Cruz all have one thing in common other than the mass murders they carried out. They were all reportedly taking prescription drugs that alter the state of mind and carry a host of negative side effects ranging from aggression and suicide to homicidal ideation.
Suicide, birth defects, heart problems, hostility, violence, aggression, hallucinations, self-harm, delusional thinking, homicidal ideation, and death are just a few of the side effects caused by the medication taken by the monsters named above, some of which are known as SSRIs (selective serotonin reuptake inhibitors), or antidepressants.
There have been 150 studies in 17 countries on antidepressant-induced side effects. There have been 134 drug regulatory agency warnings from 11 countries and the EU warning about the dangerous side effects of antidepressants.
Despite this deadly laundry list of potential reactions to these medications, their use has skyrocketed by 400% since 1988. Coincidentally, as antidepressant use went up, so did mass shootings.
The website SSRIstories.org has been documenting the link between selective serotonin reuptake inhibitors (SSRIs) and violence. On the website is a collection of over 6,000 stories that have appeared in local media (newspapers, TV, scientific journals) in which prescription drugs were mentioned and in which the drugs may be linked to a variety of adverse outcomes including most of the mass shootings which have taken place on US soil.
As the Citizens Commission on Human Rights notes, before the late nineteen-eighties, mass shootings and acts of senseless violence were relatively unheard of. Prozac, the most well known SSRI (selective serotonin reuptake inhibitor) antidepressant, was not yet on the market. When Prozac did arrive, it was marketed as a panacea for depression which resulted in huge profits for its manufacturer Eli Lilly. Of course other drug companies had to create their own cash cow and followed suit by marketing their own SSRI antidepressants.
Subsequently, mass shootings and other violent incidents started to be reported. More often than not, the common denominator was that the shooters were on an antidepressant, or withdrawing from one. This is not about an isolated incident or two but numerous shootings.
The issue of psychotropic medication playing a role in mass shootings is not some conspiracy theory. It is very real and the drug manufacturers list these potentially deadly side effects on the very inserts of every one of these drugs. But the mainstream media and the government continue to ignore or suppress this information. Why is that?
In a clear example of how beholden mainstream media is to the pharmaceutical industries who manufacture and market these drugs, FOX News’ Sean Hannity was recorded this week, blatantly cutting off a reporter who dared mention Nikolas Cruz’s reported association with antidepressants.
In a news segment this week, Hannity was interviewing radio talk show host, Gina Loudon who tried to bring up Cruz’s association with SSRIs.
“I think we have to take a hard look at one thing we’re not talking about yet too, Sean, and that is psychotropic drugs,” Loudon says.
“My guess is, we’ll find out, like most of these shooters…” she says, just before Hannity jumps in to silence her.
Hannity then shuts up Loudon and moves to the doctor next to her. Just like that, all talk which was implicating big pharma in their role in mass shootings was effectively silenced.
It is no secret that the pharmaceutical industry wields immense control over the government and the media. It is their control which keeps any negative press about their dangerous products from airing. However, most people likely do not know the scope of this control.
As Mike Papantonio, attorney and host of the international television show America’s Lawyer, explains, with the exception of CBS, every major media outlet in the United States shares at least one board member with at least one pharmaceutical company. To put that into perspective: These board members wake up, go to a meeting at Merck or Pfizer, then they have their driver take them over to a meeting with NBC to decide what kind of programming that network is going to air.
In the report below, Papantonio explains how the billions of dollars big pharma gives to mainstream media outlets every year is used to keep them subservient and complicit in covering up the slew of deadly side effects from their products.
How much longer will we allow these billion-dollar drug companies to control the narrative and not let this conversation take place? How many more mass shootings will take place before Americans wake up to this reality?
Bisphosphonate bone drugs are among the most harmful and misrepresented drug classes still on the market. But that has not stopped Pharma-funded medical associations like the American Society of Bone and Mineral Research, the National Osteoporosis Foundation and the National Bone Health Alliance from periodically wringing their hands over low sales. [1]
This week the New York Times repeats the industry lament. “Currently, many people at risk of a fracture — and often their doctors — are failing to properly weigh the benefits of treating fragile bones against the very rare but widely publicized hazards of bone-preserving drugs, experts say,” it writes. Hip fractures among women 65 and older on Medicare are rising, says the piece, and Medicare reimbursements for bone density tests are falling. “Doctors who did them in private offices could no longer afford to [do them] which limited patient access and diagnosis and treatment of serious bone loss,” says a doctor quoted in the article, which sounds like a Pharma plea for taxpayer funding.
But here is the back story.
The first bisphosphonate bone drug approved for osteoporosis, Merck’s Fosamax, received only a six month review before FDA approval. When its esophageal side effects were revealed, the FDA tried to unapprove it but Merck got the FDA to settle for a warning label that told patients to sit upright for an hour after taking the drug. Six months after Fosamax was approved, there were 1,213 reports of adverse effects including 32 patients hospitalized for esophageal harm. One woman who took Fosamax but remained upright for only thirty minutes was admitted to the hospital with “severe ulcerative esophagitis affecting the entire length of the esophagus” and had to be fed intravenously, according to the New England Journal of Medicine (NEJM).
Soon bisphosphonates (which include Boniva, Actonel and Zometa) were shown to weaken, not strengthen, bones by suppressing the body’s bone-remodeling action. Yes, bone loss is stopped, but since the bone is not renewed, it becomes brittle, ossified and prone to fracture. More than a decade ago, articles in the NEJM, the Annals of Internal Medicine, the Journal of Clinical Endocrinology & Metabolism, the Journal of Orthopaedic Trauma and Injury warned of the paradoxical drug results. Half of the doctors at a 2010 American Academy of Orthopaedic Surgeons annual meeting presentation said they’d personally seen patients with bisphosphonate-compromised bone. “There is actually bone death occurring,” said Phuli Cohan, MD, on CBS about a woman who’d been on Fosamax for years.
By 2003, dentists and oral surgeons found that after simple office dental work, the jawbone tissue of patients taking bisphosphonates would sometimes not heal but instead become necrotic and die. Patients had received no warnings, though Merck had known about the jawbone effects from animal studies since 1977.
“Up to this point, this rare clinical scenario was seen only at our centers in patients who had received radiation therapy and accounted for 1 or 2 cases per year,” said the authors of an article titled “Osteonecrosis of the Jaws Associated with the Use of Bisphosphonates: A Review of 63 Cases,” published in the Journal of Oral and Maxillofacial Surgery.
Despite reports of ulcerative esophagitis, bone degradation, fractures and jawbone death, Merck aggressively promoted Fosamax. It hired researcher Jeremy Allen to plant bone scan machines in medical offices across the country to drive sales and to push through the Bone Mass Measurement Act, which made bone scans Medicare-reimbursable, paid for by you and me. Hopefully that is changing.
Blaming hip fractures on not enough people taking bisphosphonates is not a new tactic for Pharma. It blamed increasing suicides on not enough people taking antidepressants (even when as much as a fourth of the population takes antidepressants). Get ready for Pharma to blame obesity on not enough people taking prescription obesity drugs. The ruse is even more dishonest because many popular drugs people are taking like GERD medications really do thin bones. First do no harm.
Notes.
[1] According to the British Medical Journal, the National Osteoporosis Foundation is funded by Bayer Healthcare, Lane Laboratories, Mission Pharmacal, Novartis, Pharmavite, Pfizer, Roche, Warner Chilcott and Eli Lilly. The American Society for Bone and Mineral Research is funded by Pfizer and Eli Lilly. The National Bone Health Alliance is a public-private partnership that is an offshoot of the National Osteoporosis Foundation.
Stark Realities with Brian McGlinchey | April 4, 2024
A principal goal of Stark Realities is to “expose fundamental myths across the political spectrum” — and few myths are as universally embraced as the notion that US participation in World War II (1941-1945) lifted the American economy out of the Great Depression.
This myth is dangerous not only because it leads citizens and politicians to see a bright side of war that doesn’t really exist, but also because it helps foster a belief that government spending is essential to countering economic downturns. That belief, in turn, has helped propel us to a point where the national debt now exceeds $34.6 trillion, with interest payments alone on pace to reach $1 trillion a year in 2026, inviting financial catastrophe.
We will respond and take necessary action immediately.
If notice is given of an alleged copyright violation we will act expeditiously to remove or disable access to the material(s) in question.
All 3rd party material posted on this website is copyright the respective owners / authors. Aletho News makes no claim of copyright on such material.