Anthony Watts has posted a story about a laughable analysis of the cost of propping up renewables through subsidies. And long-time WUWT contributor KD helpfully pointed me to the document itself. Now that I have the actual document, here’s what they say about subsidies (all emphasis mine).
First, they point out that the cost of shifting to renewables will be on the order of $800 billion per year. Overall, they say the cost will be $45,000,000,000,000 ($45 trillion) by 2050, and could be as high as $70 trillion.
In other words, a substantial “clean-energy investment gap” of some $800 billion/yr exists – notably on the same order of magnitude as present-day subsidies for fossil energy and electricity worldwide ($523 billion). Unless the gap is filled rather quickly, the 2°C target could potentially become out of reach.
Now, a trillion is an unimaginable amount of money. Here’s a way to grasp it. If I started a business in the year zero AD, and my business was so bad that I lost a million dollars a day, not a million a year but a million dollars a day, how many trillion dollars would I have lost by now?
Well, I wouldn’t have lost even one trillion by now, only about $735 billion … in other words, less than the estimated PER-YEAR cost of switching to renewables.
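As a sanity check, that illustration is simple arithmetic (a rough sketch; I just multiply 2014 years by 365.25 days):

```python
# Sanity check on the "lose a million dollars a day since year 0" illustration.
years = 2014                    # roughly year 0 AD to the time of writing
days = years * 365.25           # approximate day count, including leap days
lost = days * 1_000_000         # $1 million lost per day
print(f"${lost / 1e9:.0f} billion")  # roughly $736 billion, still short of $1 trillion
```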
Then they go on to claim that hey, $800 billion per year is no big deal, because fossil fuel subsidies are nearly that large.
While the clean-energy investment gaps (globally and by region) may indeed appear quite sizeable at first glance, a comparison to present-day energy subsidy levels helps to put them into context. According to estimates by the International Monetary Fund and International Energy Agency, global “pre-tax” (or direct) subsidies for fossil energy and fossil electricity totaled $480–523 billion/yr in 2011 (IEA 2012b; IMF 2013). This corresponds to an increase of almost 30% from 2010 and was six times more than the total amount of subsidies for renewables at that time. Oil-exporting countries were responsible for approximately two-thirds of total fossil subsidies, while greater than 95% of all direct subsidies occurred in developing countries.
Now, this is a most interesting and revealing paragraph.
First, despite what people have said on the previous thread, they have NOT included taxes in their calculation of subsidies.
Next, to my great surprise, an amazing 95% of all subsidies are being paid by developing nations. This underscores the crucial importance of energy for the poor.
In addition, they say that most of the money used to pay the fossil fuel subsidies comes from … wait for it … the sale of fossil fuels.
Next, it means that nothing the developed world does will free up much money. Only 5% of the subsidies are in developed nations; they could go to zero and it wouldn’t change the big picture.
It also means that since these subsidies are not going to drivers in Iowa and Oslo, but are propping up the poorest of the global poor, we cannot stop paying them without a huge cost in the form of impoverishment, hardship, and deaths.
Finally, unless we shift the fuel subsidy from fossil fuels to renewables, which obviously we cannot do, the comparison is meaningless—we will still need nearly a trillion dollars per year in additional subsidies to get renewables off the ground, over and above the assistance currently given to the poor … where do the authors think that money would come from?
I fear that like the pathetically bad Stern Report, this analysis is just another batch of bogus claims trying to prop up the war on carbon, which is and always has been a war on development and human progress, and whose “collateral damages” fall almost entirely on the poor.
And at the end of the day, despite their vain efforts to minimize the cost, even these proponents of renewables say it will cost up to $70 trillion to make the switch, with no guarantee that it will work.
Sigh …
[UPDATE]
I see that in the study they make much of the disparity between fossil fuel subsidies ($523 billion annually) and renewables subsidies, which they proudly state are only about a sixth of that ($88 billion annually).
However, things look very different when we compare the subsidies on the basis of the energy consumed from those sources. To do that, I use the data in the BP 2014 Statistical Review of World Energy spreadsheet in the common unit, which is “TOE”, or “Tonnes of Oil Equivalent”. This expresses everything as the tonnes of oil that are equivalent to that energy. I’ve then converted the results to “Gallons of Oil Equivalent” and “Litres of Oil Equivalent” to put them in prices we can understand. That breakdown looks like this:
Fuel – Subsidy/Gallon – Subsidy/Litre
Fossil fuels – $0.17 per gallon – $0.04 per litre
Renewables – $1.19 per gallon – $0.31 per litre
So despite the fact that renewable subsidies are only a sixth of the fossil subsidies, per unit of energy they are seven times as large as the fossil subsidies.
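The arithmetic behind that comparison can be sketched as follows. The consumption figures below are my own round-number assumptions in the spirit of the BP 2014 review (roughly 10,000 Mtoe of fossil energy and 240 Mtoe of renewables), not the author’s exact spreadsheet values:

```python
# Subsidy per unit of energy, converting tonnes of oil equivalent to gallons.
GALLONS_PER_TOE = 308           # ~7.33 barrels per tonne of oil x 42 gallons per barrel
LITRES_PER_GALLON = 3.785

def subsidy_per_unit(subsidy_usd, consumption_mtoe):
    """Return (subsidy per gallon, subsidy per litre) of oil equivalent."""
    gallons = consumption_mtoe * 1e6 * GALLONS_PER_TOE
    per_gallon = subsidy_usd / gallons
    return per_gallon, per_gallon / LITRES_PER_GALLON

fossil = subsidy_per_unit(523e9, 10_000)   # assumed ~10,000 Mtoe fossil consumption
renew = subsidy_per_unit(88e9, 240)        # assumed ~240 Mtoe renewables consumption

print(f"Fossil:     ${fossil[0]:.2f}/gallon, ${fossil[1]:.2f}/litre")
print(f"Renewables: ${renew[0]:.2f}/gallon, ${renew[1]:.2f}/litre")
print(f"Ratio: {renew[0] / fossil[0]:.0f}x")
```

Scaling the current fossil subsidy by that ratio (7 × $523 billion) gives roughly $3.5–3.7 trillion per year, consistent with the figure quoted below.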
This, of course, is extremely bad news for the promoters of the subsidies. It means that supplying the amount of energy we currently use solely from renewables, without fossil fuels, would require seven times the current fossil fuel subsidy, or $3.5 TRILLION PER YEAR.
And of course, since there’d be no fossil fuel sales at that point, there’d be little money to pay for the subsidy.
Sometimes, the idiocy of the savants is almost beyond belief.
Here is some background on the video you should be aware of:
1) The bears were swimming away from the USGS researchers and film crew who had shot them full of sedatives and attached a camera to one of their necks — they were not swimming toward sea ice 100 miles away.
2) The video was shot in the Bering Sea, in April 2014, when sea ice was about its maximum extent of the year — there was lots of ice around when this video was filmed.
3) The company doing the filming is using this video as a fundraiser.
Details below, including a sea ice map for April 2014.
UPDATE Below
Andy Revkin, at the New York Times DotEarth blog, promoted it as something spectacular (as he did a couple of weeks ago with an earlier offering from the same team, June 10) and admitted in the comments section, in response to someone who said it looked like these bears were being harassed by a boat:
“The cameras are on bears that were sedated (which counts as a kind of harassment, yes, but is part of a broader research project; see the earlier post linked from this one).”
Sea ice map for April 2014 (average extent for the month, label added), courtesy of NSIDC.
The YouTube video includes a link to the Arctic Exploration Fund of filmmaker Adam Ravetch’s Arctic Bear Productions company.
“Arctic Exploration Fund, AEF, is a non profit 501c 3 whose mission is to arm wild animals around the world, with groundbreaking new cameras, who go out and gather multiple hours of footage of their own lives, which we track using on-board satellite GPS, and together create a brand new natural history archive filmed entirely by the animals themselves.”
Nothing there about deliberately misrepresenting the facts shown in the film footage and using the video for propaganda purposes.
Finally, shame on the USGS: its work in the Bering and Beaufort Seas is being promoted as scientific polar bear research (like collecting blood samples), yet the real products being generated are propaganda videos.
UPDATE:
Swimming bear video used for propaganda was not shot with bear-mounted cameras
A reader from Oregon questioned the filming techniques used for this video.
Revkin followed up.
And it turned out the reader from Oregon was correct — the film used in this video was shot with “an assortment of traditional methods,” not with the strapped-on cameras that the USGS was using on the bears.
Revkin assumed from the background provided to him that this was leading-edge technology, bear-generated video. And even though he’d interviewed the filmmaker, the truth hadn’t come out.
Update June 29, 2014 – another damning comment made, added below.
Andrew, I have spent quite a bit of time observing polar bears during the course of 12 summer seasons working as a guide and photographer in Svalbard. In my view, the bears in this footage often appear quite stressed, swimming rapidly away from the camera, turning to look behind and even diving briefly. I think that it is quite unlikely that all, or even most of this video was shot from cameras mounted on bears. As another commenter said, it very much appears that the bears are fleeing from a boat. I would like to hear from Ravetch’s team and other wildlife film makers about this question. It certainly appears to me that the majority of the footage was shot with pole-mounted cameras.
Stressing bears by chasing them, particularly when they are swimming, is illegal in many parts of their range and certainly unethical anywhere. I have great respect for your blog and the careful attention you give to complex issues; I hope that you will look further into this. Film makers working with politically charged species like polar bears must be very careful that their methods are unassailable.
I sent your note to Ravetch and he clarified that this footage was not shot with the new strapped-on cameras, as you suspected.
He included this note: “That footage was not taken with strapped on cameras and was documented with an assortment of traditional techniques. Aerial, pole cam, and more traditional filming techniques…. I operate with the greatest respect for animals when I am around them for brief periods of time.” [SJC bold]
Thanks for offering your valuable insight. (I also fixed the photo caption to avoid confusion.)
Kudos to Revkin for not dismissing the comment and for following through.
This video was strictly propaganda from the get-go. The new-fangled camera technology Ravetch’s company and USGS polar bear biologists had been experimenting with was not used at all in the filming of this video.
I work as a polar bear guide in Churchill. These images looks like they came from Hudson Bay, likely near Southampton Island, when Ravetch was filming there in the summer of 2012, I believe. I have seen raw footage of this event and the bear is so stressed that at one point, it actually tries to climb into the boat after being followed for a considerable amount of time. Beautiful shot though.
So, this seems like it is not USGS, not in the Beaufort Sea and filmed in the summer. If I am wrong, I apologize, however, the similarities in the footage are striking.
The Arctic Exploration Fund actually seems like it is just raising money for documentary projects not actual research – wish I had thought of that.
It is very unfortunate that management decisions are being based on fictional portrayals of polar bear habitat and behaviour. The bears are the ones that will suffer in the long run.
Kels
* Dr. Susan J. Crockford is a zoologist with more than 35 years experience, including work on the Holocene history of Arctic animals. Like polar bear biologist Ian Stirling, Susan Crockford earned her undergraduate degree in zoology at the University of British Columbia. She is currently an adjunct professor at the University of Victoria, B.C. Polar bear evolution is one of Dr. Crockford’s professional interests, which she discusses in her book, Rhythms of Life: Thyroid Hormone and the Origin of Species.
Further, this story was carried as the lead story on Drudge for a day.
First off the blocks to challenge Goddard came Ronald Bailey at reason.com, in an article, Did NASA/NOAA Dramatically Alter U.S. Temperatures After 2000?, that cites communication with Anthony Watts, who is critical of Goddard’s analysis, as well as being critical of NASA/NOAA.
Politifact chimed in with an article that assessed Goddard’s claims, based on Watts’s statements and also an analysis by Zeke Hausfather. Politifact summarized with this statement: We rate the claim Pants on Fire.
I didn’t pay much attention to this, until Politifact asked me for my opinion. I said that I hadn’t looked at it myself, but referred them to Zeke and Watts. I did tweet their Pants on Fire conclusion.
Skepticism in the technical climate blogosphere
Over at the Blackboard, Zeke Hausfather has a three-part series about Goddard’s analysis – How not to calculate temperatures (Part I, Part II, Part III). Without getting into the technical details here, the critiques relate to the topics of data dropout, data infilling/gridding, time of day adjustments, and the use of physical temperatures versus anomalies. The comments thread on Part II is very good, well worth reading.
Anthony Watts has a two-part series On denying hockey sticks, USHCN data and all that (Part 1, Part 2). The posts document Watts’ communications with Goddard, and make mostly the same technical points as Zeke. There are some good technical comments in Part 2, and Watts makes a proposal regarding the use of US reference stations.
While I haven’t dug into all this myself, the above analyses seem robust, and it seems that Goddard has made some analysis errors.
The data
OK, acknowledging that Goddard made some analysis errors, I am still left with some uneasiness about the actual data, and why it keeps changing. For example, Jennifer Marohasy has been writing about Corrupting Australia’s temperature record.
In the midst of preparing this blog post, I received an email from Anthony Watts, suggesting that I hold off on my post since there is some breaking news. Watts pointed me to a post by Paul Homewood entitled Massive Temperature Adjustments At Luling, Texas. Excerpt:
So, I thought it might be worth looking in more detail at a few stations, to see what is going on. In Steve’s post, mentioned above, he links to the USHCN Final dataset for monthly temperatures, making the point that approx 40% of these monthly readings are “estimated”, as there is no raw data.
From this dataset, I picked the one at the top of the list, (which appears to be totally random), Station number 415429, which is Luling, Texas.
Taking last year as an example, we can see that ten of the twelve months are tagged as “E”, i.e estimated. It is understandable that a station might be a month, or even two, late in reporting, but it is not conceivable that readings from last year are late. (The other two months, Jan/Feb are marked “a”, indicating missing days).
But, the mystery thickens. Each state produces a monthly and annual State Climatological Report, which among other things includes a list of monthly mean temperatures by station. If we look at the 2013 annual report for Texas, we can see these monthly temperatures for Luling.
Where an “M” appears after the temperature, this indicates some days are missing, i.e Jan, Feb, Oct and Nov. (Detailed daily data shows just one missing day’s minimum temperature for each of these months).
Yet, according to the USHCN dataset, all ten months from March to December are “Estimated”. Why, when there is full data available?
But it gets worse. The table below compares the actual station data with what USHCN describe as “the bias-adjusted temperature”. The results are shocking.
In other words, the adjustments have added an astonishing 1.35C to the annual temperature for 2013. Note also that I have included the same figures for 1934, which show that the adjustment has reduced temperatures that year by 0.91C. So, the net effect of the adjustments between 1934 and 2013 has been to add 2.26C of warming.
Note as well, that the largest adjustments are for the estimated months of March – December. This is something that Steve Goddard has been emphasising.
It is plain that these adjustments made are not justifiable in any way. It is also clear that the number of “Estimated” measurements made are not justified either, as the real data is there, present and correct.
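The net effect Homewood describes is simple arithmetic over the two adjustment figures (values taken from his post as given, not recomputed from raw data):

```python
# Net warming added between 1934 and 2013 by the USHCN adjustments at Luling.
adj_2013 = +1.35   # degC added to the 2013 annual mean temperature
adj_1934 = -0.91   # degC removed from the 1934 annual mean temperature
net_added_warming = adj_2013 - adj_1934
print(f"{net_added_warming:.2f} C")  # 2.26 C of adjustment-induced trend
```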
Watts appears in the comments, stating that he has contacted John Nielsen-Gammon (Texas State Climatologist) about this issue. Nick Stokes also appears in the comments, and one commenter finds a similar problem for another Texas station.
Homewood’s post sheds light on Goddard’s original claim regarding the data dropout (not just stations that are no longer reporting, but reporting stations whose data are ‘estimated’). I infer from this that there seems to be a real problem with the USHCN data set, or at least with some of the stations. Maybe it is a tempest in a teacup, but it looks like something that requires NOAA’s attention. As far as I can tell, NOAA has not responded to Goddard’s allegations. Now, with Homewood’s explanation/clarification, NOAA really needs to respond.
Sociology of the technical skeptical blogosphere
Apart from the astonishing scientific and political implications of what could be a major bug in the USHCN dataset, there are some interesting insights and lessons from this regarding the technical skeptical blogosphere.
Who do I include in the technical skeptical blogosphere? Tamino, Moyhu, Blackboard, Watts, Goddard, ClimateAudit, Jeff Id, Roman M. There are others, but the main discriminating factor is that they do data analysis, and audit the data analysis of others. Are all of these ‘skeptics’ in the political sense? No – Tamino and Moyhu definitely run warm, with Blackboard and a few others running lukewarm. Of these, Goddard is the most skeptical of AGW. There is most definitely no tribalism among this group.
In responding to Goddard’s post, Zeke, Nick Stokes (Moyhu) and Watts may have missed the real story. They focused on their previous criticism of Goddard and missed his main point. Further, I think there was an element of ‘boy who cried wolf’ – Goddard has been wrong before, and the comments at Goddard’s blog can be pretty crackpotty. However, the main point is that this group is rapidly self-correcting – the self-correcting function in the skeptical technical blogosphere seems to be more effective (and certainly faster) than for establishment climate science.
There’s another issue here, and that is one of communication. Why was Goddard’s original post unconvincing to this group, whereas Homewood’s post seems to be convincing? Apart from the ‘crying wolf’ issue, Goddard focused on the message that the real warming was much less than portrayed by the NOAA data set (which caught the attention of the mainstream media), whereas Homewood more carefully documented the actual problem with the data set.
I’ve been in email communication with Watts through much of Friday; he’s been pursuing the issue directly with NCDC, along with Zeke and with help from Nielsen-Gammon, and NCDC is reportedly taking it seriously. Not only does Watts plan to issue a statement on how he missed Goddard’s original issue, he says that additional problems have been discovered and that NOAA/NCDC will be issuing some sort of statement, possibly also a correction, next week. (Watts has approved me making this statement.)
This incident is another one that challenges traditional notions of expertise. From a recent speech by President Obama:
“I mean, I’m not a scientist either, but I’ve got this guy, John Holdren, he’s a scientist,” Obama added to laughter. “I’ve got a bunch of scientists at NASA and I’ve got a bunch of scientists at EPA.”
Who all rely on the data prepared by his bunch of scientists at NOAA.
How to analyze the imperfect and heterogeneous surface temperature data is not straightforward – there are numerous ways to skin this cat, and the cat still seems to have some skin left. I like the Berkeley Earth methods, but I am not convinced that their confidence interval/uncertainty estimates are adequate.
We have studied the long-term toxicity of a Roundup-tolerant GM maize (NK603) and a whole Roundup pesticide formulation at environmentally relevant levels from 0.1 ppb. Our study was first published in Food and Chemical Toxicology (FCT) on 19 September, 2012. The first wave of criticisms arrived within a week, mostly from plant biologists without experience in toxicology. We answered all these criticisms. The debate then went beyond scientific arguments, and a wave of ad hominem and potentially libellous comments appeared in different journals, written by authors having serious yet undisclosed conflicts of interest. At the same time, FCT acquired as its new assistant editor for biotechnology a former employee of Monsanto, after he sent a letter to FCT to complain about our study. This is in particular why FCT asked for a post-hoc analysis of our raw data. On 19 November, 2013, the editor-in-chief requested the retraction of our study while recognizing that the data were not incorrect and that there was no misconduct and no fraud or intentional misinterpretation in our complete raw data – an unusual or even unprecedented action in scientific publishing. The editor argued that no conclusions could be drawn because we studied 10 rats per group over 2 years, because they were Sprague Dawley rats, and because the data were inconclusive on cancer. Yet all of this was known at the time of submission of our study. Our study was, however, never intended to be a carcinogenicity study. We never used the word ‘cancer’ in our paper. The present opinion is a summary of the debate resulting in this retraction, as it is a historic example of conflicts of interest in the scientific assessments of products commercialized worldwide. We also show that the decision to retract cannot be rationalized on any discernible scientific or ethical grounds. Censorship of research into health risks undermines the value and the credibility of science; thus, we republish our paper.
Background
There is an ongoing debate on the potential health risks of the consumption of genetically modified (GM) plants containing high levels of pesticide residues [1]. Currently, no regulatory authority requests mandatory chronic animal feeding studies to be performed for edible GMOs and formulated pesticides. This fact is at the origin of most of the controversies. Only studies consisting of 90-day rat feeding trials have been conducted by manufacturers for GMOs. Statistical differences in the biochemistry of treated rats versus controls may represent the initial signs of long-term pathologies [2], possibly explained at least in part by pesticide residues in the GM feed. This is why we studied the long-term toxicity of a Roundup-tolerant GM maize (NK603) and a whole Roundup pesticide formulation at environmentally relevant levels from 0.1 ppb.
We first published these results in Food and Chemical Toxicology (FCT) on 19 September, 2012 [3] after a careful and thorough peer review. However, 1 year and 2 months later, in an unusual step, the editor-in-chief requested the retraction of our study, while conceding that the data were not incorrect and that there was no misconduct and no fraud or intentional misinterpretation. According to him, some data were inconclusive, but for reasons already known at the time of submission of the paper. The present paper is a summary of the debate resulting in this retraction, which in our view is a historic example of conflicts of interests in the scientific assessments of products commercialized worldwide.
The long-term toxicity study of the NK603 maize and Roundup
An initial study on NK603 maize was submitted by Monsanto Company in support of commercial authorization of the maize. NK603 maize was fed to 4 groups of 20 Sprague Dawley rats (2 doses of 11% and 33% in the diet of both sexes) for 90 days [4]. The blood analyses were performed on 10 rats per group. The re-analysis of the raw data resulted in a debate on the biological relevance of admitted statistical differences versus controls as the first signs of hepatorenal toxicities [5]. To solve the problem, a 2-year-long study was carried out using two hundred Sprague Dawley rats to which the following treatments were administered: NK603 maize treated or not with Roundup at three different levels in their feed (11%, 22%, and 33% of the total diet) and Roundup alone, administered via drinking water at three different concentrations, from the admitted residual level in regular tap water (0.1 ppb), to the maximum level authorized in GMOs (400 ppm), up to half of the agricultural dose (0.5%). They were divided into ten groups, each containing ten males and ten females. No other long-term study has examined the effects of regular consumption of Roundup-tolerant GM maize and of a pesticide formulation, in any dilution, on blood parameters, sexual hormones, and multiple organs.
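For concreteness, the ten-group design described above can be enumerated like this (a sketch; the group labels are mine, not the paper’s):

```python
# Enumerate the 10 treatment groups of the 2-year study: one control group,
# three NK603 doses, three NK603+Roundup doses, and three Roundup-only
# concentrations in drinking water. Each group had 10 males and 10 females.
maize_doses = ["11%", "22%", "33%"]              # share of NK603 in the diet
roundup_water = ["0.1 ppb", "400 ppm", "0.5%"]   # Roundup in drinking water

groups = (["control"]
          + [f"NK603 {d}" for d in maize_doses]
          + [f"NK603+Roundup {d}" for d in maize_doses]
          + [f"Roundup {c}" for c in roundup_water])

total_rats = len(groups) * (10 + 10)
print(len(groups), total_rats)  # 10 groups, 200 rats
```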
We found that these products provoked statistically discriminant disturbances in biochemical markers of livers and kidneys in females at the 15th month, when most of the rats were still alive. At the same time, testosterone and estradiol levels were also disturbed. At the end of the experiments, these disrupted biochemical markers corresponded to pathologies evidenced in a blinded manner: notably hepatorenal deficiencies, more severe in males, and female mammary tumors, which led to premature deaths. For instance, after around 700 days, there were up to 3.25 times more mammary tumors (the highest rate was observed in females consuming 0.1 ppb of Roundup in water). This could be associated with a 2.4-fold increase in pituitary dysfunctions noticed by the end of the experiment (2 years).
These findings were immediately dismissed by persons involved in the products’ authorizations, or in collaboration with biotech industries. A number of them wrote to FCT to nourish a controversy, including Richard Goodman, a former Monsanto employee in charge of the immunotoxicity files of GMOs, and Paul Christou, a patent holder of the methods used to create transgenic plants. This was rapidly followed by a coordination of national regulatory agencies organized by the European Food Safety Authority (EFSA), released on 4 October, 2012 [6]. The EFSA had previously assessed NK603, and glyphosate, the declared active principle of Roundup, as safe on the basis of regulatory data, which they never fully published. The EFSA has since published Monsanto’s safety data on NK603 maize [7], but not on glyphosate. The NK603 data are in a pdf format preventing an easy statistical re-analysis. However, there was no long-term toxicological assessment for NK603, or for Roundup. Moreover, we demonstrated in several studies [8-10] that Roundup is far more toxic than glyphosate because of non-inert adjuvants. On 10 October, 2012, the Monsanto Company also sent its criticisms to FCT [11] but did not release its safety data, claiming commercial confidentiality.
Overall, the first wave of criticisms arrived within a week, mostly from plant biologists. We answered all criticisms [12] in FCT on 9 November, 2012. The debate then encompassed scientific arguments. A second wave of ad hominem and potentially libelous comments appeared in different journals [13-16]. Regrettably, there were no invitations to respond to these exacerbated attacks, which we discovered only by our literature survey. Some of the authors of these articles had serious yet undisclosed conflicts of interest. The scientific remarks concentrated on the supposedly inadequate choice of the Sprague Dawley rat strain, which is, however, a classic model for toxicology [17]. The Sprague Dawley strain was also used by Monsanto in its 90-day test on the same maize [4]. In addition, Monsanto measured biochemically the same number of rats per group as in our experiment. Thus, with regard to blood and urine biochemistry, Monsanto gathered data from the same number of rats that we did.
Unsubstantiated allegations of fraud or errors
Paul Christou, the lead author of Arjo et al. [13], demanded that our paper be retracted and insulted us personally. He claimed first in a letter addressed to the editor-in-chief that the publication of our study ‘does not meet minimal acceptable standards of scientific rigor’ and ‘will damage an entire scientific discipline due to flawed conclusion’ (personal communication). Then, he attacked us in an article published in the journal Transgenic Research on 20 December 2012 [13]. The quantity of insults and defamations in this paper, authorized and co-authored by the editor-in-chief in a supposedly serious journal, is excessive. They include: ‘abject failure to treat the experimental animals in a humane manner’, ‘inability to formulate a valid hypothesis’, ‘media fanfare’, ‘fraudulent or knowingly inaccurate statements’, ‘unethical behavior’, ‘transparent attempt to discredit regulatory agencies’, ‘ammunition for extremists’, ‘flawed science’, ‘disingenuous or inept’, and ‘unjustified waste of animals’ (while at the same time asking for more animals in the groups). Christou and co-authors suggest that by practising ‘flawed science’, we are working against ‘progress towards a better quality of life’ and in fact are ‘actively working to make life worse’. We were not invited to reply. This behaviour can be explained, though not justified, by the undisclosed conflicts of interests.
Christou is not only the editor-in-chief of Transgenic Research, the journal in which he published his article, but is also linked to Monsanto [18]. He is named as the inventor on several patents on GM crop technology, for most of which Monsanto owns the property rights. These include patents on the plant transformation process [19] used to make glyphosate-tolerant transgenic corn plants [20]. He worked as a researcher at Agracetus Inc. (later acquired by Monsanto) for 12 years. Then, from 1994 to 2001, Christou worked at the John Innes Centre in the UK [18], which is heavily invested in GM crop technology [21]. He thus has no mammalian toxicology background. However, in his published article, Christou only gave as his affiliation his publicly funded position at a research institute. Christou’s failure to declare his current interests – his inventor status on patents concerning the company that developed the products we tested – could be considered grounds for retraction of a paper in a scientific journal, according to ethical guidelines for scientific publishing [22].
The Arjo et al. article was co-authored by Wayne Parrott, an active member of the Biotechnology Committee at the International Life Sciences Institute (ILSI) [23]. ILSI is funded by multinational food, agribusiness, and biotechnology companies, including Monsanto and Syngenta [24]. ILSI has proved highly controversial in North America and Europe due to its influence on risk assessment methodologies for chemicals, pesticides, and GM foods [25-27]. Wayne Parrott also has an inventor status in patents on materials and methods for selecting transgenic organisms [28] and transformation vector systems [29].
In addition, Christou and his co-authors made numerous mistakes, false and unsubstantiated assertions, and misrepresentations of our data. The title of Arjo et al.’s paper includes defamation and a misrepresentation of our research, implying that it is ‘pseudoscience’ and alleging that it claimed Roundup Ready maize and Roundup herbicide caused ‘cancer’ in rats – a claim we never made. We did not even use the word ‘cancer’ in our paper although this argument was reiterated in the final letter of the editor-in-chief of FCT when explaining his decision to retract our paper [30]. Tumors do not always lead to cancer, even if they can be more deleterious in a shorter time because of their size or body position, by hurting internal functions.
Arjo et al.’s paper begins with a false assertion that is not evidenced in the paper or in the cited source: ‘It started with a press conference in which journalists agreed not to engage in fact-checking’. The authors made other false assertions about our study, for example, alleging that ‘the water consumption was not measured’. In fact, we measured both the water and food consumption, and the stability of the Roundup solution over time. This was indicated in the paper, in which we explained that all the data cannot be shown in one paper and that we concentrated on the most important data; these parameters were only part of a routine survey. They also falsified the reporting of the data, compiling the mortality data only at the end of the experiment and ignoring the originality and the major findings of the differential chronological effects between treated rats and controls, which we established by measuring tumor size twice a week over 2 years. Moreover, we respected legal requirements and ethical norms relating to animal experiments, and Arjo et al. present no evidence to the contrary, so their allegation of inhumane treatment of the rats is without substance.
Importantly, we had already answered many of the criticisms made by Arjo et al. in a paper published before theirs [12]. Their manuscript was received on 20 December 2012, whereas our paper had been published on 9 November 2012. Our published answers were simply ignored.
Christou was not alone in failing to declare conflicts of interest in his criticism of our paper. After we pointed out that 75% of the comments addressed to FCT within a week of our study’s publication came from plant biologists, it emerged that several of them had developed patents on GMOs. Some authors were employees of Monsanto Company, which owns NK603 GM maize and sells Roundup herbicide [4,11]. Other more recent papers, published by plant biologists and/or affiliates of the industry-funded group ILSI [15,16], repeated the same arguments. The author of a separate article criticizing our study expressed concern that our results could damage public opinion about GM crops [14] – a sentiment that gives precedence to economic interests over public health. An article in Forbes magazine even alleged, without presenting any evidence, that we had committed fraud [31]. Surprisingly, even the Monsanto authors [11] declared that they had ‘no conflicts of interest’ in their first draft published online on the FCT website. Investigative reports [32,33] showed that many authors of these opinions had failed to disclose their conflicts of interest, including Henry Miller, Mark Tester, Chris Leaver, Bruce Chassy, Martina Newell-McGloughlin, Andrew Cockburn, L. Val Giddings, Sivramiah Shantharam, Lucia de Souza, Erio Barale-Thomas, and Marc Fellous. The undisclosed conflicts of interest included links with biotechnology companies that develop GMOs and with industry-backed lobbying organizations.
All of this has huge implications for public health. We observed intense lobbying in parliaments, as well as evidence of conflicts of interest among persons involved in the regulatory decisions on the commercialization of these products [26]. A series of high-profile conflict-of-interest revelations (not restricted to GMOs and pesticides) led to the resignations of leading administrators involved in decisions affecting the assessment of these products, including the European Commissioner John Dalli [34] and the former chair of the European Food Safety Authority’s (EFSA) management board, Diana Banati [35]. In February 2013, a strange occurrence following the publication of our paper raised questions about industry’s connections to scientific publishing, described below.
Conflicts of interest in the editorial board
In February 2013, FCT acquired a new assistant editor for biotechnology, Richard E. Goodman. The editor-in-chief has admitted that Goodman was brought onto the editorial board after he sent a letter to FCT complaining about our study. In his letter, Goodman appears worried about economic consequences, but much less about potential public health consequences (personal communication). He wrote: ‘The implications and the impacts of this uncontrolled study is having HUGE impacts, in international trade, in consumer confidence in all aspects of food safety, and certainly in US state referendums on labelling’. Further in his letter, Goodman asked for ‘an evaluation by an independent set of toxicologists’. This appears to be why the Publishing Assistant for FCT asked for our raw data on 15 March 2013.
In fact, the independence of this re-evaluation is questionable. After his appointment at FCT, Goodman was a member of the subcommittee that requested our raw data, until we complained to the Elsevier publishing group. Goodman is far from independent: he previously worked for Monsanto for 7 years [36] and has a long-standing affiliation with ILSI [37]. He will now handle all biotechnology papers submitted to FCT. Another scientific paper on GMO risks was withdrawn from FCT without explanation, shortly after it had been accepted and published by the journal [38]. At the authors’ initiative, the paper was immediately published by another journal [39].
We received a letter from the editor-in-chief of FCT, A. Wallace Hayes, asking us to retract our paper on 19 November 2013, more than 1 year after its publication [40]. In his retraction notice, the editor-in-chief certifies that ‘no evidence of fraud or intentional misrepresentation of the data’ was found in the investigation, that the results are ‘not incorrect’, that ‘there was no misconduct’, and that the sole reason for retraction is the ‘inconclusiveness’ of the paper. He argued that no conclusions could be drawn because we studied 10 rats per group over 2 years, because they were Sprague Dawley rats, and because we could not conclude on cancer. In fact, the Sprague Dawley is a standard choice for 2-year studies performed by industry and independent scientists alike [17,41]. We measured 10 animals per sex per group in accordance with the OECD 452 guideline on chronic toxicity studies [42], because our study was a chronic toxicity study and was never intended to be a carcinogenicity study. We wish to point out that Dr Hayes’ decision violates the retraction guidelines of the Committee on Publication Ethics (COPE), of which FCT is a member. ‘Inconclusiveness’ is not a valid reason for a journal to retract a paper; lack of conclusiveness (which can be debated) and error are not synonymous. The COPE criteria for retraction are scientific misconduct/honest error, prior publication, plagiarism, or unethical research. None of these criteria applied to our study. On the contrary, numerous published scientific papers contain inconclusive findings; it is for further studies to build on the reported findings and arrive at a more conclusive position. In contrast with our study measuring toxicity, the Monsanto study reporting safety with the same number and strain of rats, but limited to 90 days [4], is not subject to the same controversy.
The data in the Monsanto study show statistically significant differences in multiple organ functions between the GM and non-GM feeding groups, which the authors dismissed as not ‘biologically meaningful’ using a set of questionable criteria [43]. Significant effects do not have to be linearly related to dose to merit consideration; otherwise, endocrine effects would be dismissed. In addition, biochemical disturbances do not have to correlate simultaneously with organ lesions, contrary to the claims of Doull et al. [44] in defence of Monsanto. These outdated concepts come from the toxicology of poisons and are not valid for endocrine disruption [43,45]. If 10 rats/sex/group are too few to demonstrate a toxic effect, then this number of rats is certainly too small to demonstrate safety. Overall, in the current system of assessment, any toxic effect is first suspected of being a false positive arising by chance, while the question of whether an apparent absence of effect is a false negative is never raised. The Monsanto data as presented are thus inconclusive and should also be retracted.
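The false-positive/false-negative asymmetry above is ultimately a statistical power argument, and a toy simulation can make it concrete. The tumor rates and group sizes below are hypothetical and taken from neither study; the sketch only illustrates the general point that a group size too small to demonstrate a toxic effect is also too small to demonstrate safety.

```python
import random

def power_sim(n, p_control=0.2, p_treated=0.5, trials=20000, seed=1):
    """Monte Carlo power of a two-sided two-proportion z-test (alpha ~ 0.05)
    for detecting a difference in incidence with n animals per group.
    All rates here are hypothetical, for illustration only."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        # Number of affected animals in each simulated group
        c = sum(rng.random() < p_control for _ in range(n))
        t = sum(rng.random() < p_treated for _ in range(n))
        pooled = (c + t) / (2 * n)
        if pooled in (0.0, 1.0):
            continue  # no variance under the null: cannot reject
        se = (2 * pooled * (1 - pooled) / n) ** 0.5
        if abs(t - c) / n / se > 1.96:
            rejections += 1
    return rejections / trials

power_10 = power_sim(10)   # small groups: a real difference is often missed
power_50 = power_sim(50)   # larger groups: the same difference is found far more often
```

Under these made-up rates, the real difference is detected only a minority of the time with 10 animals per group but far more often with 50, so a null result in a small group is weak evidence of safety.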
Following the retraction of our paper, many letters were sent to the editor-in-chief of FCT. On 10 December 2013, he published a defence of the retraction, which raised many doubts as to his understanding of our data [30]. He claimed that we had drawn conclusions about cancer, although ours was a long-term toxicity study with a detailed statistical analysis of blood and urine parameters. He also defended the study done by Monsanto [4], claiming that it used 20 rats/sex/group while we used only 10. In fact, although the Monsanto study used twice our sample size, its authors analyzed blood and urine from only half of the animals (10), the same number of sampled animals as in our study.
According to an editorial in Environmental Health Perspectives [46], ‘the decision to retract a published scientific work by an editor, against the desires of the authors, because it is ‘inconclusive’ based on a post hoc analysis represents a dangerous erosion of the underpinnings of the peer-review process, and Elsevier should carefully reconsider this decision’.
Confidentiality and censorship erode the value of science
Recent reviews of the GM food safety literature have found that research concluding that GM products were safe tended to come from industry and that research conducted by those with either financial or professional conflicts of interest was associated with outcomes favorable to the GM sector [47]. In fact, it appears in our case that consequences of conflicts of interests in science go beyond divergence in scientific interpretations and also rely on unscientific practices: confidentiality and censorship.
Transparency of, and access to, all the raw data obtained by companies and accepted by regulatory agencies (above all the blood analyses of rats) as proof of product safety is an unavoidable first step to move this debate forward. It is the only way in which the scientific community can enter the scientific discussion. This is why we republish our paper in open-access form, together with its raw data, allowing debate about our results. This is not possible for the data used as proof of safety in commercial authorizations. The Monsanto toxicological data on NK603 maize recently made public by EFSA are not in a statistically usable format, and an agreement with Monsanto is required before use. Moreover, the data examined for Roundup authorizations are clearly inadequate [48]. For instance, ANSES (French Agency for Food, Environmental and Occupational Health & Safety) confirmed to us in writing (January 2013) that there were no 2-year animal studies of Roundup in its whole formulation, adding that there are only a few studies of acute toxicity (a few days up to 3 weeks) without any blood tests. Instead, glyphosate, which is much less toxic than Roundup [10,49], is tested alone by Monsanto in its reports to regulatory authorities [50]. We strongly emphasize that data with implications for public health are not related to manufacturing patents and should not be kept confidential. Removal of confidentiality claims on biosafety data is necessary to adhere to standard scientific procedures of quality assurance, to increase transparency, to minimize the impact of conflicts of interest, and ultimately to improve public confidence in GMOs [51]. Moreover, in the regulatory assessment of GMOs, chemicals, and medicines, confidential tests are conducted by the applicant companies themselves, often in their own laboratories or in those of subcontractors.
The second step must be to commission new experiments on new or particularly important products, performed by laboratories independent of the companies. These laboratories would be recruited by public tender, with compulsory transparency of the results. This public research would be funded by the companies, at a level corresponding to their previous budgets for regulatory testing, but would be managed independently of them. The protocols and results would be submitted to open, adversarial assessment. Thus, there would be no additional financial cost or time delay compared with the current system. Such reforms would not only radically transform the understanding and knowledge of toxicology and of science in general, but would also radically reduce public health costs and promote trust in companies and in science. This would move the world towards a sustainable development of products with low, if any, impacts on health and the environment.
The reason given to retract our paper – ‘inconclusiveness’ – is unprecedented and violates the norms of scientific publishing. The decision to retract cannot be rationalized on any discernible scientific grounds. Censorship on research into the risks of a technology so critically entwined with global food safety undermines the value and the credibility of science.
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
GES designed and coordinated the commentary. RM participated in the drafting of the manuscript and final version. ND and JsDV helped in the writing, compiling the literature, revising details, and proofreading the manuscript. All authors read and approved the final manuscript.
Acknowledgements
We acknowledge the Charles Leopold Mayer (FPH) and Denis Guichard Foundations, together with CRIIGEN, for fellowships and structural supports. We are equally thankful to Malongo, Lea Nature, and the JMG Foundation for their help.
References
Seralini G-E, Mesnage R, Clair E, Gress S, de Vendomois J, Cellier D (2011) Genetically modified crops safety assessments: present limits and possible improvements. Environ Sci Eur 23:10
Spiroux de Vendômois J, Cellier D, Velot C, Clair E, Mesnage R, Seralini GE (2010) Debate on GMOs health risks after statistical findings in regulatory tests. Int J Biol Sci 6:590-598
Seralini GE, Clair E, Mesnage R, Gress S, Defarge N, Malatesta M, Hennequin D, de Vendomois JS (2012) RETRACTED: Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize. Food Chem Toxicol 50:4221-4231. Retracted in: Food Chem Toxicol 2014, 63:244
Hammond B, Dudek R, Lemen J, Nemeth M (2004) Results of a 13 week safety assurance study with rats fed grain from glyphosate tolerant corn. Food Chem Toxicol 42:1003-1014
Spiroux de Vendômois J, Roullier F, Cellier D, Seralini GE (2009) A comparison of the effects of three GM corn varieties on mammalian health. Int J Biol Sci 5:706-726
EFSA (2012) Review of the Séralini et al. (2012) publication. EFSA J 10(10):2910
Richard S, Moslemi S, Sipahutar H, Benachour N, Seralini GE (2005) Differential effects of glyphosate and Roundup on human placental cells and aromatase. Environ Health Perspect 113:716-720
Benachour N, Seralini GE (2009) Glyphosate formulations induce apoptosis and necrosis in human umbilical, embryonic, and placental cells. Chem Res Toxicol 22:97-105
Mesnage R, Bernay B, Seralini GE (2013) Ethoxylated adjuvants of glyphosate-based herbicides are active principles of human cell toxicity. Toxicology 313:122-128
Hammond B, Goldstein DA, Saltmiras D (2013) Letter to the editor. Food Chem Toxicol 53:459-464
Seralini GE, Mesnage R, Defarge N, Gress S, Hennequin D, Clair E, Malatesta M, de Vendomois JS (2013) Answers to critics: why there is a long term toxicity due to NK603 Roundup-tolerant genetically modified maize and to a Roundup herbicide. Food Chem Toxicol 53:461-468
Arjo G, Portero M, Pinol C, Vinas J, Matias-Guiu X, Capell T, Bartholomaeus A, Parrott W, Christou P (2013) Plurality of opinion, scientific discourse and pseudoscience: an in depth analysis of the Seralini et al. study claiming that Roundup Ready corn or the herbicide Roundup cause cancer in rats. Transgenic Res 22:255-267
Houllier F (2012) Biotechnology: bring more rigour to GM research. Nature 491:327
Martinelli L, Karbarz M, Siipi H (2013) Science, safety, and trust: the case of transgenic food. Croat Med J 54:91-96
Romeis J, McLean MA, Shelton AM (2013) When bad science makes good headlines: Bt maize and regulatory bans. Nat Biotechnol 31:386-387
King-Herbert A, Sills R, Bucher J (2010) Commentary: update on animal models for NTP studies. Toxicol Pathol 38:180-181
Robinson C, Holland N, Leloup D, Muilerman H (2013) Conflicts of interest at the European Food Safety Authority erode public confidence. J Epidemiol Community Health 67(9):712-720
Lougheed T (2006) WHO/ILSI affiliation sustained. Environ Health Perspect 114(9):A521
ILSI (2005) Protein Allergenicity Technical Committee.
Mezzomo BP, Miranda-Vilela AL, de Souza Freire I, Barbosa LC, Portilho FA, Lacava ZG, Grisolia CK (2012) WITHDRAWN: Effects of oral administration of Bacillus thuringiensis as spore-crystal strains Cry1Aa, Cry1Ab, Cry1Ac or Cry2Aa on hematologic and genotoxic endpoints of Swiss albino mice. Food Chem Toxicol. doi:10.1016/j.fct.2012.10.032
Mezzomo B, Miranda-Vilela A, Freire I, Barbosa L, Portilho F (2013) Hematotoxicity of Bacillus thuringiensis as spore-crystal strains Cry1Aa, Cry1Ab, Cry1Ac or Cry2Aa in Swiss albino mice. J Hematol Thromb Dis 1:104. doi:10.4172/jhtd.1000104
Hayes AW (2014) Retraction notice to “Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize” [Food Chem. Toxicol. 50 (2012) 4221–4231]. Food Chem Toxicol 63:244
Meyer H, Hilbeck A (2013) Rat feeding studies with genetically modified maize – a comparative evaluation of applied methods and risk assessment standards. Environ Sci Eur 25:33
OECD (2012) OECD Guidelines for the Testing of Chemicals, Section 4: Health Effects. Test No. 452: Chronic Toxicity Studies. OECD Publishing, Paris
Séralini GE, de Vendomois JS, Cellier D, Sultan C, Buiatti M, Gallagher L, Antoniou M, Dronamraju KR (2009) How subchronic and chronic health effects can be neglected for GMOs, pesticides or chemicals. Int J Biol Sci 5:438-443
Doull J, Gaylor D, Greim HA, Lovell DP, Lynch B, Munro IC (2007) Report of an Expert Panel on the reanalysis of a 90-day study conducted by Monsanto in support of the safety of a genetically modified corn variety (MON 863). Food Chem Toxicol 45:2073-2085
Vandenberg LN, Colborn T, Hayes TB, Heindel JJ, Jacobs DR Jr, Lee DH, Shioda T, Soto AM, Vom Saal FS, Welshons WV, Zoeller RT, Myers JP (2012) Hormones and endocrine-disrupting chemicals: low-dose effects and nonmonotonic dose responses. Endocr Rev 33:378-455
Portier C, Goldman L, Goldstein B (2014) Inconclusive findings: now you see them, now you don’t! Environ Health Perspect 122(2):A36. doi:10.1289/ehp.1408106
Diels J, Cunha M, Manaia C, Sabugosa-Madeira B, Silva M (2011) Association of financial or professional conflict of interest to research outcomes on health risks or nutritional assessment studies of genetically modified products. Food Policy 36:197-203
Mesnage R, Defarge N, Spiroux de Vendômois J, Séralini GE (2014) Major pesticides are more toxic to human cells than their declared active principles. BioMed Res Int 2014: Article ID 179691
There may be a fatal tumour in your brain. The only way we’ll know is if I cut it open – but there’s a chance that might kill you. Shall I go ahead?
We’ve just been confronted with a question a bit like this by scientists at the University of Wisconsin-Madison. They insist the only way to guard against the outbreak of a deadly flu epidemic like the Spanish flu of 1918 is to create viruses very similar to those responsible. Not to study them in the wild, mind, but to actively engineer, from bird flu genes, a strain that can pass in airborne droplets from one animal – or perhaps species – to another. Sure, it is dangerous. But what about the risk of doing nothing? Is the trade-off worth it?
Not according to Sir Robert May, one of the world’s most respected epidemiologists. Publicly he has called the work “absolutely crazy”, and given May’s reputation for directness, his private opinion is likely to be even less polite. He’s not alone. Other researchers have challenged the Wisconsin team’s claims that their work is the only way to find out how to combat a lethal flu outbreak effectively, and that the experiments were deemed necessary and safe by experts. May even suggests that the team effectively hoodwinked the US National Institutes of Health into granting approval and funding.
Research on pathogens, particularly viruses, has become increasingly disputatious over the past decade. In 2002 a team at the State University of New York ordered pieces of synthetic DNA through the mail, from which they pasted together the genome of the polio virus. They then “booted it up” to infect mice, explaining that the work had been done to highlight how easily it could be done. Others accused the team of an irresponsible publicity stunt. The Wisconsin team, led by the virologist Yoshihiro Kawaoka, courted controversy in 2012 when it created a mutant strain of H5N1 bird flu that could spread among mammals. Its results, and similar ones from a team in the Netherlands, were deemed too dangerous to publish by a US biosecurity panel that feared what bioterrorists might do with them.
In one sense we have been here before. Research often carries risks, whether of intentional misuse or accidents. The discovery of nuclear energy in the early 20th century, and of how to release it through nuclear fission in 1938, were arguably examples of “pure” research with perilous applications that still loom apocalyptically today. The common response of scientists is that such is the inevitable price of new knowledge.
But the dangers of biotechnology, genetics and synthetic biology are something new. For centuries we struggled to keep nasty microorganisms at bay. Even the discovery of antibiotics gave us no protection from viruses, and the emergence of HIV was a bitter reminder of that. But with the arrival of genetic manipulation in the 1970s, nature was no longer an inscrutable menace warded off with trial-and-error potions: we could fight back at the genetic level.
This new means of intervention brought a new way to foul up. Synthetic biology promises to take the battle to the next level: to move beyond tinkering with this or that resistance gene, say, and to enable full-scale engineering and design of life. We can take our nemeses apart and rebuild them from scratch.
Yet we arrive at this point relatively unprepared to deal with the moral dilemmas. The heated nature of the current debate signifies as much: scientists have never been averse to shouting at each other about the interpretation of their results, but it is rare to see them so passionately opposed on the question of whether a piece of research should be done in the first place. If even top experts can’t agree, what’s to be done?
Physical scientists are often faced with questions that can’t be answered experimentally; not, on the whole, because the experiments are too dangerous – but because they are too hard. Their usual response is to figure out what should happen in theory, and then see if the predictions can be tested in more accessible, simpler ways. But in biology it is much, much harder to make reliable theoretical predictions (or any predictions at all), because living things are so damned complicated.
We’re getting there, however, as witnessed by the development of computer models of human physiology and biochemistry for drug testing. It’s not too much to hope that one day drugs might be designed and safely trialled almost wholly on the computer, without the need for controversial animal tests or expensive human trials.
Other models might be adequate for understanding viruses, which are after all the simplest organisms known. One reason why some researchers argue that the remaining smallpox stocks should be destroyed is that the live virus is no longer needed for research – its genome sequence is enough. Looked at this way, making hair-raisingly lethal viruses to understand their behaviour reflects our lamentable ignorance of the theoretical principles involved.
There could be ways to make experiments safer too. Faced with fears about the quasi-artificial life forms they are starting to create, synthetic biologists say that it should be possible to build in safety measures – for example, so that the organisms can only survive on a nutrient unavailable in the wild, or will self-destruct after a few rounds of replication.
These are not fantasies, although they raise questions about whether such fail-safe strategies merely give natural selection a stronger incentive to evade them – and about whether there is a false sense of security in the whole engineering paradigm when applied to biology.
All the same, the questions raised by flu research can’t be defused with techno-fixes alone. Forget the new Longitude prize – here is a place where science really does need to be democratic.
One thing you can say for sure about the question posed at the outset is that the patient should have a say. If scientists are going to take these risks for our sake, as they claim, then we had better be asked for our approval.
It’s in our interests to ensure that our decision is informed and not kneejerk, and the appropriate democratic machinery requires careful construction. But the consent must be ours to give.
Although the legal debates about Iran are not taking place in an international court – at least not yet – the veracity of the scientific evidence espoused by all sides to support their legal arguments is nevertheless an extremely important matter, particularly in light of the debacle of the 2003 Iraq war having been based, at least in part, on bad technical and scientific analysis of intelligence information on similar questions. – Dan Joyner
This week the P5+1 and Iranian officials meet again to try to narrow differences over a comprehensive nuclear deal, which is to last for an as-yet unknown duration. Reaching an agreement will be a challenging task because Iran and the P5+1 seem to disagree – among other things – about the enrichment capacity Iran should be allowed during the term of the comprehensive deal.
According to the Institute for Science and International Security (ISIS), limits on Iran’s enrichment capacity are important because they would lengthen the time Iran would need to “break out” and quickly enrich uranium to weapons grade in any hypothetical race to a uranium-based device.
But Jeffrey Lewis of the Monterey Institute has suggested that such limits are meaningless, saying, “This is completely wrong. Breakout is precisely the wrong measure of whether a deal is successful,” because the Iranians – goes the argument – could use a covert facility to breakout if they wanted to do that.
Instead, intensive verification and intrusive inspections above and beyond what is codified in international law by the so-called “Additional Protocol” have been suggested to try to address this fear.
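To see why the two camps talk past each other, note that a “breakout” estimate is, to first order, just the separative work required for one weapon’s worth of weapons-grade uranium divided by installed enrichment capacity. The function and every number below are hypothetical illustrations, not actual figures for Iran’s program; the sketch only shows why a capacity cap stretches the declared-facility timeline while a covert facility sits outside the calculation entirely, which is precisely Lewis’s objection.

```python
def breakout_time_days(swu_needed, installed_swu_per_year):
    """First-order breakout estimate: separative work required for one
    weapon's worth of weapons-grade uranium, divided by installed capacity.
    Both inputs are illustrative placeholders, not real program figures."""
    return 365.0 * swu_needed / installed_swu_per_year

# Hypothetical round numbers: if ~1500 SWU were needed, cutting capacity
# from 9000 to 3000 SWU/yr would triple the breakout window.
t_uncapped = breakout_time_days(1500, 9000)
t_capped = breakout_time_days(1500, 3000)
```

Because undeclared capacity never enters the denominator, the capped figure says nothing about a covert route – hence the suggested emphasis on verification and inspections instead.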
In a separate report last week, Mr. Porter assesses that David Albright, the founder and executive director of ISIS and a prominent commentator on nonproliferation and Iran’s nuclear program, has embraced an alarmist line on the Iran issue, despite his knowledge that there were serious problems with the evidence on which it was based.
My intention here isn’t to evaluate the specific items of evidence presented in Mr. Porter’s reports but to weigh in with my own expert analysis – some of it done in collaboration with Dr. Ferenc Dalnoki-Veress of the Monterey Institute – of the quality of the evidence against Iran.
By way of context, Iran has never been formally accused of manufacturing nuclear weapons. The IAEA did determine that Iran was in “non-compliance” with its safeguards agreement in 2005. But this had to do with technical nuclear material accountancy matters — “non-compliance” does not mean Iran was making nuclear weapons. For example, South Korea and Egypt both violated their safeguards agreements in 2004 and 2005. But these U.S. allies were never even referred to the UN Security Council — let alone targeted for sanctions. Pierre Goldschmidt, a former deputy director of safeguards at the IAEA, has noted the “danger of setting bad precedents based on arbitrary criteria or judgments informed by political considerations” at the IAEA.
It is not always easy to obtain access to the actual evidence being used against Iran, but occasionally some of it is leaked to the press and becomes amenable to scientific scrutiny. Below, I list some of this evidence, as well as some of the historical record of the group(s) making the allegations:
[1]. An indication of the quality – or rather, the lack thereof – of the evidence against Iran comes from my analysis (done with another physicist, Ferenc Dalnoki-Veress of the Monterey Institute) of the graphs published by the Associated Press purporting to show an Iranian interest in modeling a nuclear explosion. Aside from the fact that there is nothing illegal about such theoretical modeling, our analysis showed that the graph contained a large numerical error and that the time-scale of the explosion was wrong.
[2]. In February 2013, the Washington Post published a story that “purchase orders obtained by nuclear researchers show an attempt by Iranian agents to buy 100,000 … ring-shaped magnets” and that such “highly specialized magnets used in centrifuge machines … [are] a sign that the country may be planning a major expansion of its nuclear program.” As evidence, the Post’s Joby Warrick cited a report authored by David Albright of the Institute for Science and International Security (ISIS).
The Washington Post’s ombudsman eventually got involved and his report is appended below (the cc field has been x’ed out as it mentions the emails of editors & others):
I’ve read everything that Mr. Butt referred me to, and Joby’s story.
A couple of things trouble me. Language like “place the order” doesn’t seem borne out by the nature of those notes that ISIS included copies of in the PDF. It certainly looks like that Iranian company is looking to buy magnets, but I’m not sure I would say “place the order” or “new orders” based on that evidence. And that there is no evidence that a purchase actually went through, as Joby wrote, correct? And there is no date, other than mentioned in the story “about a year ago.” That’s pretty vague, and Iran since then has made some moves, as Joby reported, such as converting some enriched uranium into metal, that suggest it might be listening to international concerns.
Is Joby persuaded that these magnets could only be used for centrifuges? Could Mr. Butt be correct that they could be used for other things and Iran would have the industrial and economic demand for them as speaker magnets or what have you? And how would these magnets, if they were intended for use in centrifuges, play in to the damage caused by stuxnet, in which many of the first generation Iranian centrifuges were damaged?
Just before nuclear talks get underway I am always suspicious of stories that suddenly surface that seem to reinforce the narrative that Iran is building nuclear weapons.
Last July, Joby had the story on the potential increasing threat of the Iranian Navy against the U.S. Navy. Nowhere in that story was there anything about the economic sanctions that many defense experts say are hurting the Iranian military deeply.
I’ve been on some 60 U.S. Navy ships, including five or six carrier battle groups underway. The planes and helicopters that circle in the air above battle groups have considerable surveillance- and fire power. So do U.S. attack submarines who patrol with the battle groups. The new littoral combat ships have plenty of ability to attack shoreline installations in minutes. That is a formidable array of offensive capability.
Of course we should always be vigilant and pay attention to information that comes to us, and report it out. But neither do we want to overstate any threat from any enemy, real or potential.
Kelley is a true authority on such matters, being a nuclear engineer and a veteran of over 35 years in the US nuclear weapons complex, most recently at Los Alamos. He managed the centrifuge and plutonium metallurgy programs at Lawrence Livermore National Laboratory, and was seconded by the US DOE to the IAEA where he served twice as a Director in the nuclear inspections in Iraq, in 1992-1993 and 2002-2003. He is currently an Associate Senior Research Fellow at the Stockholm International Peace Research Institute (SIPRI).
Most importantly, the SIPRI report says that the paving work at Parchin would not completely hide any alleged contamination because there is an area west of the building of interest that remains untouched. And, in any case, the important samples in such a test would come from within buildings not outside on the ground.
Let’s also recall that the IAEA has already visited Parchin twice in 2005 and found nothing – although they did not go to the specific area they are now interested in. However, the IAEA could have gone to that area even in 2005 – they simply chose to go to other sites on the military base. As the IAEA report at the time summarized:
“The Agency was given free access to those buildings and their surroundings and was allowed to take environmental samples, the results of which did not indicate the presence of nuclear material, nor did the Agency see any relevant dual use equipment or materials in the locations visited.”
When the IAEA last went to Parchin, Olli Heinonen was head of IAEA safeguards and led the inspections – the methodology for choosing which buildings to inspect is described in an excellent Christian Science Monitor article, which is worth reading in its entirety, but I quote the relevant bits:
“At the time, it [Parchin] was divided into four geographical sectors by the Iranians. Using satellite and other data, inspectors were allowed by the Iranians to choose any sector, and then to visit any building inside that sector. Those 2005 inspections included more than five buildings each, and soil and environmental sampling. They yielded nothing suspicious, but did not include the building now of interest to the IAEA.
“The selection [of target buildings] did not take place in advance, it took place just when we arrived, so all of Parchin was available,” recalls Heinonen, who led those past inspections. “When we drove there and arrived, we told them which building.”
“Also unusual is how open and specific the IAEA has been about what exactly it wants to see, which could yield doubts about the credibility of any eventual inspection.
“I’m puzzled that the IAEA wants to in this case specify the building in advance, because you end up with this awkward situation,” says Olli Heinonen, the IAEA’s head of safeguards until mid-2010.
“First of all, if it gets delayed it can be sanitized. And it’s not very good for Iran. Let’s assume [inspectors] finally get there and they find nothing. People will say, ‘Oh, it’s because Iran has sanitized it,’” says Mr. Heinonen, who is now at Harvard University in Cambridge, Mass. “But in reality it may have not been sanitized. Iran is also a loser in that case. I don’t know why [the IAEA] approach it this way, which was not a standard practice…”
Hans Blix, former chief of the IAEA and later of UN weapons inspectors in Iraq, has also expressed surprise at the focus on Parchin, as a military base that inspectors had been to before.
“Any country, I think, would be rather reluctant to let international inspectors to go anywhere in a military site,” Mr. Blix told Al Jazeera English… “In a way, the Iranians have been more open than most other countries would be.”
One of the reasons Mr. Blix says that is that normally the IAEA does not have the legal authority to inspect undeclared facilities unrelated to nuclear materials in a nation – like Iran – that has not ratified the Additional Protocol.
The IAEA can call for “special inspections” but they have not done so. They can also choose arbitration, as specified in the Comprehensive Safeguards Agreement, but again they have not done that.
So Iran has been more cooperative than they have needed to be in already allowing inspections of Parchin.
“Iran has engaged in large-scale bulldozing operations on about 25 hectares near the Parchin building. This includes the bulldozing of old dirt piles to level a field 500 metres north of the building of interest. However, there has been no such activity in the area west of the building, except for removing some parking pads within about 10 m of it. The fact that the building’s immediate vicinity has been largely untouched on the west side strongly suggests that the purpose of the earth-moving operations was for construction and renovation work and not for ‘sanitizing’ the site by covering up contamination.”
“Some of the experiments described by the IAEA do not and cannot use uranium. The results would be inconclusive if they did. So the basis for the IAEA’s requests continues to be opaque. The timeline for the alleged experiments is also highly suspect, with claims that massive experimental facilities had been fabricated even before they had been designed, according to the available information. The IAEA work to date, including the mischaracterization of satellite images of Parchin, is more consistent with an IAEA agenda to target Iran than of technical analysis.” [Emphasis added]
[4]. The biased analysis of Parchin is, unfortunately, part of a longstanding pattern at ISIS. David Albright co-authored a Sept. 10, 2002, article – entitled “Is the Activity at Al Qaim Related to Nuclear Efforts?” – which declared:
“High-resolution commercial satellite imagery shows an apparently operational facility at the site of Iraq’s al Qaim phosphate plant and uranium extraction facility (Unit-340), located in northwest Iraq near the Syrian border. This site was where Iraq extracted uranium for its nuclear weapons program in the 1980s. …
“This image raises questions about whether Iraq has rebuilt a uranium extraction facility at the site, possibly even underground. … Unless inspectors go to the site and investigate all activities, the international community cannot exclude the possibility that Iraq is secretly producing a stockpile of uranium in violation of its commitments under Security Council resolutions. The uranium could be used in a clandestine nuclear weapons effort.”
Of course the passage is evasive and does not make any definitive claim. But its suggestive and misleading rhetoric implying a possible nuclear weapon program in Iraq turned out to be wrong.
However, ISIS has written almost identical slippery rhetorical statements about various facilities in Iran. There is no end to such “possible facilities” in any country. The point to take home from the erroneous (suggestive) interpretation of the satellite images of facilities in Iraq is that it is very difficult to be sure of what one is seeing in satellite imagery.
[5]. The Exploding Bridgewire Detonators (EBWs) issue is among other pieces of circumstantial evidence publicized by Albright’s ISIS group as possibly implicating Iran. But there are many non-nuclear weapons uses for EBWs, especially for an oil-rich nation like Iran. One manufacturer of EBWs explains that these have “… applications in explosive welding of piping and tubing, seismic studies, oil well perforating & hard rock mining.”
The manufacturer is explicit that EBWs “… have found a wide range of applications within the mining, explosive metal welding and energy exploration field. Many of these uses could not be accomplished using conventional blasting equipment without a compromise of safety.”
Furthermore, Iran was not secretive about its work on EBWs. As the November 2011 IAEA report states: Iran “provided the Agency with a copy of a paper relating to EBW development work presented by two Iranian researchers at a conference held in Iran in 2005. A similar paper was published by the two researchers at an international conference later in 2005.”
The Agency, however, noted that “Iran’s development of such detonators and equipment is a matter of concern…” It really is not, given the technology’s other civilian (and conventional military) uses and Iran’s relative openness in pursuing it.
The expert Atomic Reporters have weighed in: “While the IAEA reported in 2011 that there are ‘limited civilian and conventional military applications’ for exploding bridge wire detonators, the open source literature shows the technology is widely used in the mining, aerospace and defense industries.”
Again, as long ago as 2011 Robert Kelley, a former IAEA inspector, stated: “The Agency is wrong. There are lots of applications for EBWs… To be wrong on this point, and then to try to misdirect opinion shows a bias towards their desired outcome… That is unprofessional.”
[6]. Other technical experts have also weighed in on Albright’s and ISIS’s track record. For instance, in a long-running argument with the Federation of American Scientists (FAS) over the capability of Iran’s centrifuges at the Fordow facility, ISIS consistently exaggerated their capability. Ivanka Barzashka and Dr. Ivan Oelrich explained how ISIS generated the wrong numbers:
“When given the choice between a higher value attributed to unnamed sources and values he calculates himself, Albright consistently chooses the higher values. This is especially misleading when dealing with weapon production scenarios, which evaluate what Iran can currently achieve.” [emphasis added]
[7]. In a separate long-running argument with Dr. Thomas Cochran, a scientist at the Natural Resources Defense Council (NRDC), over the plutonium production capability of the Khushab II reactor in Pakistan, it took David Albright years to admit that he and Paul Brannan had overestimated the capability of the reactor by a factor of 10 to 25. This is not a minor error.
Thus, the pattern that emerges of the “evidence” against Iran (and other nations) is of consistent bias, exaggeration and unprofessionalism by some independent nonproliferation security analysts, as well as by the IAEA itself.
“What about the three indications that the arms project may have been reactivated?
Two of the three are attributed only to two member states, so the sourcing is impossible to evaluate. In addition, their validity is called into question by the agency’s handling of the third piece of evidence.
That evidence, according to the IAEA, tells us Iran embarked on a four-year program, starting around 2006, to validate the design of a device to produce a burst of neutrons that could initiate a fission chain reaction. Though I cannot say for sure what source the agency is relying on, I can say for certain that this project was earlier at the center of what appeared to be a misinformation campaign.
In 2009, the IAEA received a two-page document, purporting to come from Iran, describing this same alleged work. Mohamed ElBaradei, who was then the agency’s director general, rejected the information because there was no chain of custody for the paper, no clear source, document markings, date of issue or anything else that could establish its authenticity. What’s more, the document contained style errors, suggesting the author was not a native Farsi speaker. It appeared to have been typed using an Arabic, rather than a Farsi, word-processing program. When ElBaradei put the document in the trash heap, the U.K.’s Times newspaper published it.
This episode had suspicious similarities to a previous case that proved definitively to be a hoax. In 1995, the IAEA received several documents from the Sunday Times, a sister paper to the Times, purporting to show that Iraq had resumed its nuclear-weapons program in spite of all evidence to the contrary. The IAEA quickly determined that the documents were elaborate forgeries. There were mistakes in formatting the documents’ markings, classification and dates, and many errors in language and style indicated the author’s first language was something other than Arabic or Farsi. Inspections in Iraq later in 1995 confirmed incontrovertibly that there had been no reconstitution of the Iraqi nuclear program.”
The words of well-connected and informed senior ex-IAEA officials are worth heeding: Dr. Hans Blix, former head of the IAEA, has stated: “So far, Iran has not violated the NPT,” adding, “and there is no evidence right now that suggests that Iran is producing nuclear weapons.” And Mohamed ElBaradei, the Nobel Peace Prize laureate who spent more than a decade as the director of the IAEA, said that he had not “seen a shred of evidence” that Iran was pursuing the bomb. “All I see is the hype about the threat posed by Iran,” he concluded.
The maximalist approach to non-proliferation advocated by ISIS and other groups may be seen as useful but it is inconsistent with existing international law, as codified in the safeguards agreements. In fact, IAEA records show that all substantial safeguards issues raised in 2005 had been resolved in Iran’s favor by 2008. So Iran was again in compliance with its safeguards agreement at that date. All UN Security Council sanctions ought to have been dropped at that point. Yet Iran’s nuclear file still remains tied up at the Security Council due mainly to the IAEA and Security Council’s flawed handling of the case.
Out of all the countries it inspects, the IAEA spends the second-highest amount on Iran’s nuclear inspections – only Japan, with a vastly greater nuclear infrastructure, accounts for a bigger chunk. About 12 percent of the IAEA’s $164 million inspections budget is spent just on Iran. This has now increased to about 17% during the period of the interim deal because of the even more intrusive – and thus expensive – inspections being carried out now.
On a “per nuclear facility” basis the IAEA spends – by far – the largest amount of its inspections budget on Iran. Comprehensive deal or not, the IAEA will continue to conduct in Iran one of the most thorough and intrusive inspections it carries out anywhere.
However, achieving a deal is in everyone’s favor. It will be made easier by rejecting any flawed (or exaggerated) evidence or analysis being used against Iran – especially by individuals or groups who have a track-record of bias, exaggeration or erroneous scientific analysis.
Dr. Yousaf Butt, a nuclear physicist, is director of the Emerging Technologies Program at the Cultural Intelligence Institute, a non-profit dedicated to promoting fact-based cultural awareness among individuals, institutions, and governments. The views expressed here are his own.
SSRI stands for Selective Serotonin Reuptake Inhibitor, and it is a class of drugs that is often used to treat depression and anxiety. It includes Prozac, Zoloft, Celexa, Paxil and a host of other commonly prescribed antidepressants. And the perpetrators of a raft of school shootings, mass murders and other violent incidents in recent years have been taking them.
Find out more about the correlation between SSRIs and mass murder in this week’s edition of the EyeOpener Report with James Corbett.
In May 1998, 15 year old Kip Kinkel murdered his parents and two classmates, as well as injuring 25 others, after engaging in a shooting spree that ended up in his school’s cafeteria. In the investigation it emerged that he had been taking popular antidepressant medication Prozac since the summer of the previous year.
In December 2000, Michael McDermott went on a shooting rampage at his workplace, Edgewater Technologies, killing seven of his co-workers. During his trial, the court heard testimony that in the weeks before the shooting, McDermott had tripled the dosage of his antidepressant medication, Prozac, from 70 milligrams per day to 210 milligrams.
In March 2005, 16 year old Jeff Weise shot and killed nine people, including five students at Red Lake Senior High School in Minnesota, before turning the gun on himself. It was later revealed he had been undergoing treatment for depression and had been on Prozac at the time.
In September 2008, Finnish post-secondary student Matti Saari shot and killed ten other students on campus before killing himself. The official Finnish government report on the incident revealed that he had been taking an SSRI medication at the time of the shooting.
And so it was perhaps not surprising when the culprit of this month’s mass shooting at Fort Hood, Specialist Ivan Lopez, turned out to be taking unnamed antidepressants himself.
Although it has yet to be reported (and may in fact never be revealed) precisely what type of antidepressant Lopez was taking or whether it was an SSRI, the number of confirmed SSRI shooters in recent years has raised the question of a causal link between the medication and incidents of violence.
Although the drug manufacturers are quick to downplay this connection as anecdotal or coincidental, mounting scientific evidence points to a strong correlation between the use of psychiatric drugs in general, and SSRIs in particular, and violent behavior.
In 2010, the Public Library of Science published a study titled “Prescription Drugs Associated with Reports of Violence Towards Others,” which examined how 484 drugs were associated with 1,937 documented cases of violent behaviour. Of those 484 drugs, 31 were responsible for 79% of the violence, including 11 antidepressants.
When incidents of school massacres in the US are charted against prescription of psychiatric medication, the correlation is undeniable. Further research is needed to establish if there is a causal linkage between these pharmaceuticals and the incidents of violence, but critics of the big pharmaceutical manufacturers complain such research is hampered by the low standards for reporting that these companies are held to.
One such critic, David Healy, author of over 150 peer-reviewed papers in the field of psychiatry and the author of numerous books, including Pharmageddon, joined me on The Corbett Report last week to discuss this issue.
Further complicating the issue is the fact that the general public is often, as in the case of the Fort Hood shooter, left in a state of limbo regarding the medical history of the perpetrators of these mass shooting events. Often stories are reported with vague and unconfirmed details about “antidepressants” or sometimes just medication. It can be difficult for the average person to sort through the daily reports of adverse and violent effects of these types of drugs.
One website that helps in that effort is SSRIStories.org. Begun in the 1990s, it is a repository of over 5000 news articles in which prescription drugs were linked to adverse events, including incidents of violence. Last week Julie Wood, one of the proprietors of the site, joined me to discuss the problem of sorting through the often incomplete information from these reports.
In the final equation, the question of the causal linkage between SSRIs and indeed other forms of psychiatric drugs and incidents of violence needs to be taken seriously. There are many factors at play here, from differences in individual reactions to the fact that people who are more likely to commit violent acts in the first place are often the people who are prescribed these drugs.
But the threat of violence has been taken seriously enough that the FDA in the US, the Ministry of Health in Japan and other similar bodies in countries around the world have added a warning in their guidelines for antidepressants. According to the Japanese Ministry of Health, “There are cases where we cannot rule out a causal relationship [of hostility, anxiety, and sudden acts of violence] with the medication.” And in the FDA formulation: “Antidepressant medicines may increase suicidal thoughts or actions in some children, teenagers, and young adults within the first few months of treatment.”
How can it be seen as a good thing for anyone but the drug manufacturers themselves that these drugs have been on the market for decades and the bodies in charge of regulating them can still only offer such wishy-washy, non-evidence-based statements? The issue of drug-linked violence is one that we as a society need to start discussing and acting on soon; otherwise we will continue to let the status quo be ruled not by doctors or patients or their loved ones, and certainly not by the victims of these mass murders, but by the men in the boardrooms of these pharmaceutical companies, who have been shown time and time again to care about nothing other than their own bottom line.
Dr. Caleb Rossiter was “terminated” via email as an “Associate Fellow” of the progressive group Institute for Policy Studies (IPS), following his May 4th, 2014 Wall Street Journal OpEd titled “Sacrificing Africa for Climate Change,” in which he called man-made global warming an “unproved science.” Rossiter also championed the expansion of carbon-based energy in Africa. Dr. Rossiter is an adjunct professor at American University. Rossiter, who has taught courses in climate statistics, holds a PhD in policy analysis and a master’s degree in mathematics.
In an exclusive interview with Climate Depot, Dr. Rossiter explained: “If people ever say that fears of censorship for ‘climate change’ views are overblown, have them take a look at this: Just two days after I published a piece in the Wall Street Journal calling for Africa to be allowed the ‘all of the above’ energy strategy we have in the U.S., the Institute for Policy Studies terminated my 23-year relationship with them… because my analysis and theirs ‘diverge.’”
“I have tried to get [IPS] to discuss and explain their rejection of my analysis,” Rossiter told Climate Depot. “When I countered a claim of ‘rapidly accelerating’ temperature change with the [UN] IPCC’s own data, showing the nearly 20-year temperature pause – the best response I ever got was ‘Caleb, I don’t have time for this.’”
Climate Depot has obtained a copy of a May 7, 2014 email that John Cavanagh, the director of IPS since 1998, sent to Rossiter with the subject “Ending IPS Associate Fellowship.”
“Dear Caleb, We would like to inform you that we are terminating your position as an Associate Fellow of the Institute for Policy Studies,” Cavanagh wrote in the opening sentence of the email.
“Unfortunately, we now feel that your views on key issues, including climate science, climate justice, and many aspects of U.S. policy to Africa, diverge so significantly from ours that a productive working relationship is untenable. The other project directors of IPS feel the same,” Cavanagh explained.
“We thank you for that work and wish you the best in your future endeavors,” Cavanagh and his IPS associate Emira Woods added.
On May 13, 2013, Rossiter wrote a blog post on his website further detailing his climate views. The article was titled: “The Debate is finally over on ‘Global Warming’ – Because Nobody will Debate.” He wrote: “I have assigned hundreds of climate articles as I taught and learned about the physics of climate, the construction of climate models, and the statistical evidence of extreme weather.”
“My blood simply boils too hot when I read the blather, daily, about climate catastrophe. It is so well-meaning, and so misguided,” Rossiter explained.
Rossiter also ripped President Barack Obama’s climate claims in his blog post: “Obama has long been delusional on this issue, speaking of a coming catastrophe and seeing himself as King Canute, stopping the rise in sea-level. But he really went off the chain in his state of the union address this year. ‘For the sake of our children and our future’ he issued an appeal to authority with no authority behind it.”
Rossiter’s May 4, 2014 Wall Street Journal OpEd also pulled no punches. Rossiter, who holds a master’s in mathematics, wrote: “I started to suspect that the climate-change data were dubious a decade ago while teaching statistics. Computer models used by the U.N. Intergovernmental Panel on Climate Change to determine the cause of the six-tenths of one degree Fahrenheit rise in global temperature from 1980 to 2000 could not statistically separate fossil-fueled and natural trends.”
His Wall Street Journal OpEd continued: “The left wants to stop industrialization—even if the hypothesis of catastrophic, man-made global warming is false.” He added: “Western policies seem more interested in carbon-dioxide levels than in life expectancy.”
“Each American accounts for 20 times the emissions of each African. We are not rationing our electricity. Why should Africa, which needs electricity for the sort of income-producing enterprises and infrastructure that help improve life expectancy? The average in Africa is 59 years—in America it’s 79,” he explained.
“How terrible to think that so many people in the West would rather block such success stories in the name of unproved science,” he concluded his WSJ OpEd.
But Rossiter’s credentials as a long-time progressive could not trump his growing climate skepticism or his unabashed promotion of carbon-based fuels for Africa.
Rossiter’s website describes himself as “a progressive activist who has spent four decades fighting against and writing about the U.S. foreign policy of supporting repressive governments in the formerly colonized countries.”
“I’ve spent my life on the foreign-policy left. I opposed the Vietnam War, U.S. intervention in Central America in the 1980s and our invasion of Iraq. I have headed a group trying to block U.S. arms and training for “friendly” dictators, and I have written books about how U.S. policy in the developing world is neocolonial,” Rossiter wrote in the Wall Street Journal on May 4.
Rossiter’s Wall Street Journal OpEd continued: “The left wants to stop industrialization—even if the hypothesis of catastrophic, man-made global warming is false. John Feffer, my colleague at the Institute for Policy Studies, wrote in the Dec. 8, 2009, Huffington Post that ‘even if the mercury weren’t rising’ we should bring ‘the developing world into the postindustrial age in a sustainable manner.’ He sees the ‘climate crisis [as] precisely the giant lever with which we can, following Archimedes, move the world in a greener, more equitable direction.’”
“Then, as now, the computer models simply built in the assumption that fossil fuels are the culprit when temperatures rise, even though a similar warming took place from 1900 to 1940, before fossil fuels could have caused it. The IPCC also claims that the warming, whatever its cause, has slightly increased the length of droughts, the frequency of floods, the intensity of storms, and the rising of sea levels, projecting that these impacts will accelerate disastrously. Yet even the IPCC acknowledges that the average global temperature today remains unchanged since 2000, and did not rise one degree as the models predicted.
…
“But it is as an Africanist, rather than a statistician, that I object most strongly to ‘climate justice.’ Where is the justice for Africans when universities divest from energy companies and thus weaken their ability to explore for resources in Africa? Where is the justice when the U.S. discourages World Bank funding for electricity-generation projects in Africa that involve fossil fuels, and when the European Union places a ‘global warming’ tax on cargo flights importing perishable African goods?”
The new Chief Executive of the National Health Service (NHS) in England, Simon Stevens, was recently reported arguing that the NHS must be transformed to make people’s personal genetic information the basis of their treatments (1).
His proposition is unsurprising since it is in line with the efforts of successive UK governments to build a DNA database in the NHS in England by stealth. In particular, sequencing every baby at birth and storing whole genomes in electronic medical records is a plan backed by Health Secretary Jeremy Hunt (2). The current version of this plan would involve sharing whole or partial DNA sequences (genomes or genotypes) with companies like Google, which would use genetic information and health data to calculate personal risk assessments for feedback to patients (3). Massive investment from taxpayers would be required as part of a public-private partnership that would allow commercial exploitation of the data.
Building a DNA database within the NHS would be a massive waste of public money. But it would also create a system of total surveillance which would enable the government and private companies to track every individual, not to mention their relatives. This is not speculation; as WikiLeaks revealed, the United States government is already actively collecting DNA samples and biometric data on foreign officials and populations.
Commercial companies wish to exploit genetic information to market products such as drugs and supplements to healthy people, based on genetic risk assessments. If current trends continue, this will harm, not benefit, health: it is personalised marketing not personalised medicine. The potential for misuse is very high. Doctors could be replaced by computer algorithms used to market medication, massively expanding the drug market to include large numbers of healthy people, rather than smaller numbers of (often poorer) people who are sick. There is also the danger that prescribing would be driven by vested interests, rather than medical need: with high financial costs and more harmful side effects. Genetic risk assessments could also be misused, leading to stigma or discrimination, for example by insurers.
Why does the NHS want to collect genetic information?
There is one element of truth to Simon Stevens’ remarks. Some cancer drugs have been successfully tailored to genetic mutations that arise in the cancer tumour. However, attempts to select drugs for people based on the genetic make-up they are born with (their genome or genotype) have largely been a failure. This is because genetic differences only account for a part of individual differences in metabolism. For example, a recent study found that targeting warfarin treatment based on genetic make-up did not improve health outcomes, although this application was regarded as the ‘poster child’ of this approach (4).
Why did it not work? An important misconception, apparently shared by Simon Stevens, is that genes are good predictors of most diseases and adverse drug reactions in most people. However, contrary to misleading claims made to promote the Human Genome Project, this is not true (5). Moreover, even if it were true, there is no evidence that genetic selection of individuals, for example into high-risk and low-risk groups, improves outcomes or cuts costs. All of which raises the question of the purpose of taking and storing genetic information as a default medical procedure.
The online gene testing company 23andMe, funded by Google, has been forced to withdraw its gene tests from the US market due to failure to prove they can reliably predict individual risks of many common conditions using computer algorithms. The company now wants to target the UK market, where genetic testing is not regulated (6). Patrick Chung, a 23andMe board member and partner at the venture-capital firm NEA, told Fast Company (7): “… 23andMe will make money by partnering with countries that rely on a single-payer health system. Let’s say you genotype everyone in Canada or the United Kingdom or Abu Dhabi,” he says, “and the government is able to identify those segments of the population that are most at risk for heart disease or breast cancer or Parkinson’s. You can target them with preventative messages, make sure they’re examined more frequently, and in the end live healthier lives, and the government will save massive expenses because they halted someone who’s prediabetic from getting diabetes. 23andMe has been in discussion with a bunch of such societies.” Yet there is not a scrap of evidence that this approach is good for health. This is because genomic tests have limited clinical validity or utility; so in reality there is no health benefit to targeting segments of the population in this way.
Genetic testing remains useful to diagnose rare genetic disorders, mainly in babies and young children, and whole genome sequencing has helped to identify new mutations causing these diseases. Rare familial (largely inherited) forms of many common diseases also exist, including breast cancer, but these account for only a small percentage of cases of these conditions.
Use of genetic testing in the NHS should focus on prioritising resources for the limited applications that do work, not on introducing misleading and harmful screening of the whole population and creating unnecessary, expensive databases. Certainly it should not be driven by ulterior commercial and government interests.
The Obama administration today threw a potential — and limited — lifeline to the country’s ailing nuclear industry, highlighting the ability of existing reactors to help states curb emissions.
U.S. EPA unveiled a proposal for curbing emissions from existing power plants that pointed to the United States’ fleet of about 100 reactors as playing a critical role — alongside ramping up efficiency and shifting to natural gas and other low-carbon alternatives — in cutting the utility sector’s greenhouse gas emissions by 30 percent compared with 2005 levels by 2030.
At issue is EPA’s finding in the proposal that preventing the closure of “at-risk” existing reactors could avoid up to 300 million metric tons of carbon dioxide during the initial compliance phase of 10 years.
“Policies that encourage development of renewable energy capacity and discourage premature retirement of nuclear capacity could be useful elements of CO2 reduction strategies and are consistent with current industry behavior,” the agency said. “Costs of CO2 reductions achievable through these policies have been estimated in a range from $10 to $40 per metric ton.”
The U.S. nuclear industry is facing a host of challenges, including stiff competition from cheap natural gas, low wholesale energy prices, increasing fixed operation and maintenance costs, and high upfront capital costs for building new units.
EPA noted that units have recently closed in California, Florida and Wisconsin, and additional closures have been announced in Vermont and New Jersey. EPA also noted that the U.S. Energy Information Administration in its most recent annual energy outlook projected that an additional 5.7 gigawatts of capacity — about 6 percent of the country’s current capacity — is at risk of retiring.
EPA pointed to a February 2013 Credit Suisse report that found nuclear plant operators may be experiencing a $6-per-megawatt-hour shortfall in covering operating costs with electricity sales.
“Assuming that such a revenue shortfall is representative of the incentive to retire at-risk nuclear capacity, one can estimate the value of offsetting the revenue loss at these at-risk nuclear units to be approximately $12 to $17 per metric ton of CO2,” the agency wrote. “EPA views this cost as reasonable.”
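EPA's $12-$17 per metric ton figure appears to follow from dividing the per-MWh revenue shortfall by the CO2 avoided per MWh of nuclear output. A minimal sketch of that arithmetic, assuming the displaced fossil generation emits roughly 0.35-0.50 t CO2/MWh (these emissions rates are an assumption on my part, not EPA's published inputs):

```python
# Converting a per-MWh revenue shortfall into an implied cost per
# metric ton of CO2 avoided. The $6/MWh figure is from the Credit
# Suisse report cited above; the avoided-emissions rates are assumed.
shortfall_per_mwh = 6.0  # dollars per MWh (Credit Suisse estimate)

# If a retiring nuclear unit is replaced by fossil generation emitting
# roughly 0.35-0.50 t CO2 per MWh, each MWh of nuclear output avoids
# that much CO2; dividing the shortfall by the avoided tons gives $/t.
for avoided_t_per_mwh in (0.50, 0.35):
    cost_per_ton = shortfall_per_mwh / avoided_t_per_mwh
    print(f"{avoided_t_per_mwh:.2f} t/MWh -> ${cost_per_ton:.0f}/t CO2")
```

Under those assumed rates the bounds come out to about $12 and $17 per ton, matching the range EPA quotes.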
The agency went on to propose that emission reductions from retaining 6 percent of each state’s historical nuclear capacity should be factored into each state’s goals. EPA also asked for comments on whether the cost of completing new reactors in Georgia, South Carolina and Tennessee should be considered in the states’ compliance plans.
Steve Clemmer, the Union of Concerned Scientists’ director of energy research and analysis, said it’s reasonable for EPA to include existing nuclear generation in the baseline and to credit states for new reactors, adding that the agency’s modeling of the rule doesn’t project the construction of new reactors beyond the five currently being built. But Clemmer questioned EPA’s methodology, including its finding that 6 percent of the nation’s existing fleet is economically at risk and its decision to apply that percentage equally across the states, noting that the factors driving each plant’s closure vary.
“In states where existing plants aren’t economically vulnerable, they could get a windfall profit by getting extra credit,” he said, noting that the industry already receives generous subsidies.
The EPA proposal is already emboldening the industry’s focus on state compliance plans.
Marv Fertel, the Nuclear Energy Institute’s president and CEO, said in an interview that the U.S. nuclear industry in coming months and years will be pushing states with merchant nuclear plants to value those units for avoiding emissions. States must submit compliance plans by June 30, 2016, or ask for an extension by April 1, 2016. The rule is slated to be finalized next June.
“We have a bunch of states that have renewable portfolio standards; we think you ought to be basically looking at in the state maybe a clean energy standard … and you should be including nuclear as a part of that,” Fertel said.
Fertel said state policies could bolster nuclear units — just as they currently boost wind and solar.
“It would work the same way it’s working for renewables right now. You have to meet the renewable standard, so you’re driving renewables into certain portfolios in the state; this would say that you ought to be looking not only to drive nuclear by either updates or whatever, but value the existing nuclear for the attribute of no emissions, as well as all it does for reliability,” Fertel said.
The current fleet of reactors avoids 600 million metric tons of carbon dioxide each year, equivalent to removing 113 million cars from the road, Fertel added.
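The equivalence Fertel cites can be sanity-checked by dividing the claimed avoided tonnage by the claimed number of cars; the per-car result below is derived by me, not part of his statement:

```python
# Checking the implied per-car emissions behind the NEI equivalence
# quoted above. Both input figures come from Fertel's statement.
avoided_co2_t = 600e6  # metric tons of CO2 avoided per year (claimed)
cars_removed = 113e6   # equivalent cars removed from the road (claimed)

t_per_car = avoided_co2_t / cars_removed
print(f"implied ~{t_per_car:.1f} t CO2 per car per year")
```

The implied figure of roughly 5.3 t CO2 per car per year is in the same ballpark as EPA's usual estimate for a typical passenger vehicle, so the equivalence is at least internally plausible.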
The Obama administration in recent months has highlighted the link between climate mitigation and nuclear power. Pete Lyons, the Energy Department’s assistant secretary for nuclear energy, said earlier this year that a rash of premature U.S. reactor closures could threaten the country’s climate goals.
EPA Administrator Gina McCarthy in prepared remarks for an event in Washington, D.C., today placed nuclear power on par with solar and wind, saying states have the opportunity to “shift to ‘no’ carbon sources like nuclear, wind, and solar.” McCarthy went on to say that the nation’s nuclear reactors continue to “supply zero carbon baseload power. Homegrown clean energy is posting record revenues and creating jobs that can’t be shipped overseas.”
The administration earlier this year finalized $6.5 billion worth of loan guarantees for the country’s first new reactors in decades without requiring developers to pay a “credit subsidy fee” — money that protects taxpayers should the developers default (Greenwire, April 21).
The nuclear industry has stepped up its campaign efforts in recent months, with Exelon Corp. taking the lead, partially funding a new front group called Nuclear Matters. The group’s members include former White House climate adviser and former EPA Administrator Carol Browner, former Sens. Evan Bayh (D-Ind.) and Judd Gregg (R-N.H.), former Energy Secretary Spencer Abraham, and former Commerce Secretary and Obama Chief of Staff Bill Daley.
Doug Vine, a senior fellow at the Center for Climate and Energy Solutions (C2ES), said EPA is setting baselines across the country’s total generation mix, so a state’s job becomes more difficult if a reactor retires. Vine said C2ES has seen two approaches that could benefit nuclear plants, including the clean energy standard that Fertel mentioned and carbon pricing.
Kyle Aarons, a senior fellow at C2ES, said the rule could act to incentivize states to keep current reactors running.
“It’s certainly going to change states’ thinking,” Vine said. “It’s going to put a more long-term focus on nuclear.”
Last week (May 22), I received an unsolicited email from Dr. Dag Vongraven, the current chairman of the IUCN Polar Bear Specialist Group (PBSG).
The email from Vongraven began this way:
“Dr. Crockford
Below you’ll find a footnote that will accompany a total polar bear population size range in the circumpolar polar bear action plan that we are currently drafting together with the Parties to the 1973 Agreement. This might keep you blogging for a day or two.” [my bold]
It appears the PBSG have come to the realization that public outrage (or just confusion) is brewing over their global population estimates and some damage control is perhaps called for. Their solution — bury a statement of clarification within their next official missive (which I have commented upon here).
Instead of issuing a press release to clarify matters to the public immediately, Vongraven decided he would let me take care of informing the public that this global estimate may not be what it seems.
OK, I’ll oblige (I am traveling in Russia on business and finding it very hard to do even short posts – more on that later). The footnote Vongraven sent is below, with some comments from me. You can decide for yourself if the PBSG have been straightforward about the nature of their global population estimates and transparent about their purpose in issuing them.
“As part of past status reports, the PBSG has traditionally estimated a range for the total number of polar bears in the circumpolar Arctic. Since 2005, this range has been 20-25,000. It is important to realize that this range never has been an estimate of total abundance in a scientific sense, but simply a qualified guess given to satisfy public demand. It is also important to note that even though we have scientifically valid estimates for a majority of the subpopulations, some are dated. Furthermore, there are no abundance estimates for the Arctic Basin, East Greenland, and the Russian subpopulations. Consequently, there is either no, or only rudimentary, knowledge to support guesses about the possible abundance of polar bears in approximately half the areas they occupy. Thus, the range given for total global population should be viewed with great caution as it cannot be used to assess population trend over the long term.” [my bold]
So, the global estimates were “…simply a qualified guess given to satisfy public demand” and according to this statement, were never meant to be considered scientific estimates, despite what they were called, the scientific group that issued them, and how they were used (see footnote below).
All this glosses over what I think is a critical point: none of these ‘global population estimates’ (from 2001 onward) came anywhere close to being estimates of the actual world population size of polar bears (regardless of how scientifically inaccurate they might have been) — rather, they were estimates of only the subpopulations that Arctic biologists have tried to count.
For example, the PBSG’s most recent global estimate (range 13,071-24,238) ignores five very large subpopulation regions which between them potentially contain 1/3 as many additional bears as the official estimate includes (see map below). The PBSG effectively gives them each an estimate of zero.
Based on previous PBSG estimates and other research reports, it appears there are probably at least another 6,000 or so bears living in these regions and perhaps as many as 9,000 (or more) that are not included in any PBSG “global population estimate”: Chukchi Sea ~2,000-3,000; East Greenland, ~2,000-3,000; the two Russian regions together (Laptev Sea and Kara Sea), another ~2,000-3,000 or so, plus 200 or so in the central Arctic Basin. These are guesses, to be sure, but they at least give a sense of the potential size of the populations the official estimate leaves out.
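Totaling the region-by-region guesses above shows where the "at least 6,000, perhaps as many as 9,000" range comes from. These per-region ranges are the rough guesses stated in this post, not PBSG figures:

```python
# Summing the guessed ranges for the subpopulations the PBSG
# global estimate leaves out. All ranges are guesses from the text.
uncounted = {
    "Chukchi Sea": (2000, 3000),
    "East Greenland": (2000, 3000),
    "Laptev + Kara Seas": (2000, 3000),
    "Central Arctic Basin": (200, 200),
}

low = sum(lo for lo, hi in uncounted.values())
high = sum(hi for lo, hi in uncounted.values())
print(f"uncounted bears: roughly {low:,} to {high:,}")
```

The totals, roughly 6,200 to 9,200 bears, match the "at least 6,000 ... perhaps as many as 9,000 (or more)" range quoted above.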
In other words, rather than issuing a “qualified guess” that covered both the subpopulations that have never been formally counted and those that have (which would generate a total figure that really is a “global population estimate,” however inaccurate), the PBSG have been passing off their estimate of the counted populations alone as a true global population estimate, with the caveats seldom included.
As we start hurricane season today, we note the unprecedented 3,142-day drought of major hurricane landfalls, shattering a record that goes back to 1900.
Despite clear evidence to the contrary, President Obama is now warning us that “storms like Hurricane Sandy will become more frequent as climate change intensifies.” It’s merely the latest in the administration’s seemingly endless stream of headline-grabbing scare stories, designed to justify the job-killing, economy-strangling, family-bashing rules for vehicles, power plants, cement kilns, refineries, factories, farms, shopping malls and countless other facilities that are or soon will be regulated by Environmental Protection Agency fiat. We need to keep one vitally important fact in mind.
Every one of these “looming calamities” is based on assumptions, assertions and computer models that represent the real world about as well as the special-effects T-rexes and raptors do in Jurassic Park. The data on hurricanes says otherwise.
Climate modelers and disaster proponents remind me of the four guys who were marooned on an island, after their plane went down. The engineer began drawing plans for a boat; the lumberjack cut trees to build it; the pilot plotted a course to the nearest known civilization. But the economist just sat there. The exasperated workers asked him why he wasn’t helping.
“I don’t see the problem,” he replied. “Why can’t we just assume we have a boat, get on it and leave?”
In the case of climate change, those making the assumptions demand that we act immediately to avert planetary crises based solely on their computer model predictions. It’s like demanding that governments enact laws to safeguard us from velociraptors, after Jurassic Park scientists found that dinosaur DNA could be extracted from fossilized mosquitoes … and brought the carnivores back to special-effects life.
Climate models help improve our conceptual understandings of climate systems and the forces that drive climate change. However, they are terrible at predicting Earth’s temperature and other components of its climate. They should never be used to set or justify policies, laws and regulations – such as what the Environmental Protection Agency is about to impose on CO2 emissions from coal-fired power plants.
Even our best climate scientists still have only a limited grasp of Earth’s highly complex and chaotic climate systems, and the many interrelated solar, cosmic, oceanic, atmospheric, terrestrial and other forces that control climate and weather. Even the best models are only as good as that understanding.
Worse, the models and the science behind them have been horribly politicized. The Intergovernmental Panel on Climate Change was ostensibly organized in 1988 to examine possible human influences on Earth’s climate. In reality, Swedish meteorologist Bert Bolin and environmental activist groups wanted to use global warming to drive an anti-hydrocarbon, limited-growth agenda. That meant they somehow had to find a human influence on the climate – even if the best they could come up with was “The balance of evidence suggests a discernible human influence on global climate.” [emphasis added]
“Discernible” (i.e., detectable) soon metamorphosed into “dominant,” which quickly morphed into the absurd notion that greenhouse gas (GHG) emissions have now replaced natural forces and become the only factors influencing climate change. They are certainly the only factors that climate activists and alarmists want to talk about, while they attempt to silence debate, criticism and skepticism. They use the models to generate scary “scenarios” that are presented as actual predictions of future calamities.
They predict, project or forecast that heat waves will intensify, droughts and floods will be stronger and more frequent, hurricanes will be more frequent and violent, sea levels will rise four feet by 2100 [versus eight inches since 1880], forest fires will worsen, and countless animal species will disappear. Unlikely.
Natural forces obviously caused the Medieval Warm Period, the Little Ice Age and the Pleistocene Ice Ages. (A slab of limestone that I dug up has numerous striations – scratches – left by the last mile-thick glacier that covered what is now my home town in Wisconsin.) After long denying it, the IPCC finally acknowledged that the LIA did occur, and that it was a worldwide agricultural and human disaster.
However, the models and computer algorithms the IPCC and EPA rely on still do not include the proper magnitude of solar cycles and other powerful natural forces that influence climate changes. They assume “positive feedbacks” from GHGs that trap heat, but understate the reflective and thus cooling effects of clouds. They display a global warming bias throughout – bolstered by temperature data contaminated by “urban heat island” effects, due to measuring stations being located too close to human heat sources. They assume Earth’s climate is now controlled almost entirely by rising human CO2/GHG emissions.
It’s no wonder the models, modelers and alarmists totally failed to predict the nearly-18-year absence of global warming – or that the modeled predictions diverge further from actual temperature measurements with every passing year. It’s no wonder modelers cannot tell us which aspects of global warming, global cooling, climate change and “climate disruption” are due to humans, and which are the result of natural forces. It’s hardly surprising that they cannot replicate (“hindcast”) the global temperature record from 1950 to 1995, without “fudging” their data and computer codes – or that they are wrong almost every time.
In 2000, Britain’s Met Office said cold winters would be a thing of the past, and “children just aren’t going to know what snow is.” The 2010 and 2012 winters were the coldest and snowiest in centuries. In 2013, Met Office scholars said the coming winter would be extremely dry; the forecast left towns, families and government agencies totally unprepared for the immense rains and floods that followed.
In 2007, Australia’s climate commissioner predicted Brisbane and other cities would never again have sufficient rain to fill their reservoirs. The forecast ignored previous drought and flood cycles, and was demolished by record rains in 2011, 2013 and 2014. Forecasts of Arctic and Antarctic meltdowns have ignored the long history of warmer and colder cycles, and ice buildups and breakups.
The Bonneville Power Administration said manmade warming will cause Columbia River Basin snowpack to melt faster, future precipitation to fall as rain, reservoirs to be overwhelmed – and yet water levels will be well below normal year round. President Obama insists that global temperatures will soar, wildfires will be more frequent and devastating, floods and droughts will be more frequent and disastrous, rising seas will inundate coastal cities as Arctic and Antarctic ice shelves melt and disintegrate, and 97% of scientists agree. Every claim is based on models or bald-faced assertions unsupported by evidence.
And still the IPCC says it has “very high confidence” (the highest level it assigns) to the supposed agreement between computer model forecasts and actual observations. The greater the divergence from reality, the higher its “confidence” climbs. Meanwhile, climate researchers and modelers from Nebraska, Penn State, Great Britain and other “learned institutions” continue to focus on alleged human influences on Earth’s climate. They know they will likely lose their government, foundation and other funding – and will certainly be harassed and vilified by EPA, environmentalists, politicians, and their ideological and pedagogical peers – if they examine natural forces too closely.
Thus they input erroneous data, simplistic assumptions, personal biases, and political and financial calculations, letting models spew out specious scenarios and phony forecasts: garbage in, garbage out.
The modelers owe it to humanity to get it right – so that we can predict, prepare for, mitigate and adapt to whatever future climate conditions nature (or humans) might throw at us. They cannot possibly do that without first understanding, inputting and modeling natural factors along with human influences.
Above all, these supposed modeling experts and climate scientists need to terminate their biases and their evangelism of political agendas that seek to slash fossil fuel use, “transform” our energy and economic systems, redistribute wealth [upward], reduce our standards of living, and “permit” African and other impoverished nations to enter the modern era only in a “sustainable manner,” as defined by callous elitists.
The climate catastrophe camp’s focus on CO2 is based on the fact that it is a byproduct of detested hydrocarbon use. But this trace gas (a mere 0.04% of Earth’s atmosphere) makes life on our planet possible. More carbon dioxide means crops, forests and grasslands grow faster and better. CO2’s role in climate change is speculative – and contradicted by real-world measurements, observations and history.
Computer models, scenarios and predictions of planetary Armageddon are little more than faulty, corrupt, even fraudulent pseudo-science. They have consistently forecast what has not happened on Planet Earth, and failed to forecast what did happen.
They must no longer be allowed to justify EPA’s job-killing, economy-strangling, family-bashing rules for vehicles, power plants, cement kilns, refineries, factories, farms, shopping malls and countless other facilities that are or soon will be regulated by agency fiat.
Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow (www.CFACT.org) and author of Eco-Imperialism: Green power – Black death.
By Mark Curtis | MintPress News | November 16, 2022
There is a myth the UK did not support Washington’s war against Vietnam in the 1960s and 1970s. In fact, Labour and Conservative governments backed every phase of US military escalation and played secret roles in the conflict, declassified files show.
UK sent SAS team to Vietnam in 1962, flew secret RAF missions to deliver arms, and provided intelligence to US
UK governments lied to parliament they were not providing military advice to South Vietnam’s brutal regime
Labour government secretly gave arms to US for use in Vietnam, stressing need for “no publicity”
It also connived with Washington to deceive UK public over its support for US
UK governments knew of atrocities against civilians but backed US war aims
Whitehall only started to advocate a peaceful solution, on US terms, once the war became unwinnable
During its war in Vietnam in the 1960s and 1970s the US dropped more bombs than in the whole of World War Two, in a conflict that killed over two million people. The wholesale destruction of villages and killing of innocent people was a permanent feature of the US war from the beginning, along with widespread indiscriminate bombing.
Britain’s role in the war has been largely buried and must be almost completely unknown to the public. When the UK media mentions the war now, reports often simply reference the refusal by Harold Wilson’s government to agree to US requests to openly deploy British troops.
Although this was certainly a public rebuff to Washington, Britain did virtually everything else to back the US war over more than a decade, the declassified documents show. …
This site is provided as a research and reference tool. Although we make every reasonable effort to ensure that the information and data provided at this site are useful, accurate, and current, we cannot guarantee that the information and data provided here will be error-free. By using this site, you assume all responsibility for and risk arising from your use of and reliance upon the contents of this site.
This site and the information available through it do not, and are not intended to constitute legal advice. Should you require legal advice, you should consult your own attorney.
Nothing within this site or linked to by this site constitutes investment advice or medical advice.
Materials accessible from or added to this site by third parties, such as comments posted, are strictly the responsibility of the third party who added such materials or made them accessible and we neither endorse nor undertake to control, monitor, edit or assume responsibility for any such third-party material.
The posting of stories, commentaries, reports, documents and links (embedded or otherwise) on this site does not in any way, shape or form, implied or otherwise, necessarily express or suggest endorsement or support of any of such posted material or parts therein.
The word “alleged” is deemed to occur before the word “fraud.” Since the rule of law still applies. To peasants, at least.
Fair Use
This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more info go to: http://www.law.cornell.edu/uscode/17/107.shtml. If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.
DMCA Contact
This is information for anyone who wishes to challenge our “fair use” of copyrighted material.
If you are a legal copyright holder or a designated agent for such and you believe that content residing on or accessible through our website infringes a copyright and falls outside the boundaries of “Fair Use”, please send a notice of infringement by contacting atheonews@gmail.com.
We will respond and take necessary action immediately.
If notice is given of an alleged copyright violation we will act expeditiously to remove or disable access to the material(s) in question.
All 3rd party material posted on this website is the copyright of the respective owners / authors. Aletho News makes no claim of copyright on such material.