Your Time Is Up “Professor” Wadhams
By Paul Homewood | Not A Lot Of People Know That | September 17, 2016
Time’s up, so-called Professor Wadhams.
It is now exactly four years since you forecast the demise of Arctic sea ice this summer:
One of the world’s leading ice experts has predicted the final collapse of Arctic sea ice in summer months within four years.
In what he calls a “global disaster” now unfolding in northern latitudes as the sea area that freezes and melts each year shrinks to its lowest extent ever recorded, Prof Peter Wadhams of Cambridge University calls for “urgent” consideration of new ideas to reduce global temperatures.
In an email to the Guardian he says: “Climate change is no longer something we can aim to do something about in a few decades’ time, and that we must not only urgently reduce CO2 emissions but must urgently examine other ways of slowing global warming, such as the various geoengineering ideas that have been put forward.”
https://www.theguardian.com/environment/2012/sep/17/arctic-collapse-sea-ice
So, what does the Arctic actually look like now?
http://ocean.dmi.dk/arctic/icethickness/thk.uk.php
Of course, this was not the first time you made a fool of yourself, was it? At various times over the last few years you have issued predictions of an ice-free Arctic, first by 2013 and then by 2015.
Even as recently as June this year, you were still forecasting:
“The Arctic is on track to be free of sea ice this year or next for the first time in more than 100,000 years”
Be honest. You are not actually very good at your job, are you?
September 23, 2016 Posted by aletho | Deception, Mainstream Media, Warmongering, Science and Pseudo-Science | Cambridge University, Peter Wadhams, The Guardian | Leave a comment
NYTimes & Zika: a brief case study on climate change hype
By David Wojick | Climate Etc. | September 20, 2016
The folks who make their living by hyping the supposed threat of runaway global warming use a lot of scary language in the process. Here the ever-creative New York Times has set what may be a new standard in scary climate change hype by tying it to the Zika outbreak.
In our Framework Analysis of Federal Funding-induced Biases we point to the press exaggerating unproven scientific hypotheses that support government policies. Policies that depend on scaring people are especially subject to this kind of press bias. The NYT has provided a fine example of this sort of scientific distortion, one that is worth analyzing to see just how the game is played. Not surprisingly, they do this in what they call a “Science” article.
It begins with this ever so scary headline: “In Zika Epidemic, a Warning on Climate Change.”
Zika itself is pretty scary, so that sets the stage. They then combine this with “epidemic” and “a Warning on Climate Change.” So instead of unsubstantiated possibilities we now have warnings and threats, a rhetorical flourish that we have not seen before.
Note that most people will only read this headline, which contains no science whatsoever. They will be told, falsely, that the Zika outbreak is a warning of a supposed climate change threat.
Beyond the scary headline, the article itself is a study in rhetorical structure. It begins with innuendo and ends with standard speculation, but in between it manages to provide some solid science regarding several mosquito-borne diseases. The latter is to the effect that these various disease outbreaks and increases are likely due to increased urbanization. You would never guess this from the headline or the first paragraph, which uses a question to make an accusation, a classic form of innuendo:
“The global public health emergency involving deformed babies emerged in 2015, the hottest year in the historical record, with an outbreak in Brazil of a disease transmitted by heat-loving mosquitoes. Can that be a coincidence?”
The answer turns out to be probably, but it takes a lot of reading to realize this. Even worse, the article simply assumes that there will be extensive future warming, all due to human emissions. None of this is known to be true, or even likely. In fact this is a standard rhetorical set piece. Assume great human-induced global warming and prophesy the worst.
Not surprisingly the key prophesying quotation comes from an activist-scientist at the National Science Foundation-funded National Center for Atmospheric Research. NSF is the Obama Administration’s leading proponent of the unconfirmed hypothesis that human emissions are creating dangerous global warming. NCAR has even issued a Zika forecast for 50 US cities, based as usual on an unverified computer model.
We also get a juicy quote from the federal Centers for Disease Control and Prevention, which assumes that a warmer and wetter world lies ahead. What happened to those pesky droughts?
They even throw in a picture of a sick baby and a Brazilian with dengue (not Zika). In our view tying this hyperbolic “climate change threat” rhetoric to the real misery created by Zika and related diseases is simply despicable.
September 23, 2016 Posted by aletho | Deception, Science and Pseudo-Science | National Science Foundation, New York Times, Obama, United States | Leave a comment
Why did NIST decide WTC steel could not conduct heat?
By Loop Garou | OffGuardian | September 17, 2016
The National Institute of Standards and Technology (NIST) was engaged by Congress and by FEMA, shortly after the events of 9/11, to produce a report on the destruction of the three WTC towers.
While it did pursue some initial real-world experimentation (which should be discussed in turn), NIST based its conclusions about the collapses primarily on computer models.
It follows that those conclusions can only be as good as the models.
Let me explain first how a predictive computer model works. It’s virtual reality. If you are building a model to predict anything from the stock market to building collapses, you are essentially telling a computer a set of rules that enable it to construct a simulation of your money markets or your building. The most important thing to understand is that the result you get is only as reliable as the data you input, because computers are quick but not smart.
If you input garbage, you will output garbage. If you punch in wrong values a computer won’t realise they make no sense, it will just run its program with those values and produce a result that has no connection to the real world, and can even be downright ridiculous. There’s no fail-safe or common sense override. Punch the wrong data into your computer model and you will get “proof” cars can drive on water, or birds can fly through solid rock.
Any computer model of anything is only as good as the parameters fed into it.
NIST’s models can’t be assessed independently as a whole because NIST refuses to release any data about them. Their claimed reason for this is that releasing the docs might endanger national security. However, NIST did disclose some limited information about their parameters in the body of their reports, the most perturbing and inexplicable of which is their acknowledgement that they assigned all the steel in their WTC model a thermal conductivity of zero, or close to zero.
To explain to a non-scientific readership what that means, just consider what you would expect to happen if you placed one end of a steel bar in a fire and kept hold of the other end. Would you expect:
A) the end you were holding to gradually heat up to the point you could not keep it in your hand?
B) the end you were holding to remain cool no matter how hot the end in the fire becomes?
Believe it or not, NIST chose the second option. Here it is in their own words:
“The steel was assumed in the FDS model to be thermally-thin, thus, no thermal conductivity was used.” NCSTAR 1-5F, p 20
“The interior walls [including insulated steel columns] were assumed to have the properties of gypsum board [0.5 W/m/K].” NCSTAR 1-5F, p 52
“Although the floor slab actually consisted of a metal deck topped with a concrete slab… the thermal properties of the entire floor slab were assumed to be that of concrete [1.0 W/m/K].” NCSTAR 1-5F, p 52
You don’t need to be a professional scientist to know this is bunkum and a total disregard of basic physics.
Why does this matter? It matters a LOT. Changing the assumed conductivity of steel from its actual figure to zero would allow the model to produce much higher temperatures in the steel directly exposed to fire than would be possible in reality. It’s like calculating the amount of water you could get into a sieve at any one time by assuming the sieve has no holes. The model will show the sieve can be filled to the brim, but that is just so much garbage with no real-world application.
Just so with the temperatures of the steel. NIST needed to produce a model that allowed cool office fires of around 800°C to somehow produce enough heat in localised areas to weaken and buckle steel girders and struts. If they’d allowed the steel to behave normally and wick the heat away along its length, they simply could not have achieved this aim. Only by turning the assumed thermal conductivity to zero (the equivalent of assuming the sieve has no holes) could they get their model to create enough heat to do the buckling and weakening.
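To make the effect concrete, here is a minimal sketch, in Python, of one-dimensional heat diffusion along a steel bar with one end held in a fire. This is emphatically not NIST’s FDS model: the bar length, fire temperature, run time and material constants are illustrative assumptions chosen only to show what the conductivity parameter does.

```python
# Minimal sketch, NOT NIST's FDS model: explicit 1D heat diffusion along a
# steel bar with one end held in a fire. Bar length, fire temperature, run
# time and material constants are illustrative assumptions.
import numpy as np

def bar_temperatures(k, length=0.3, n=50, t_end=3600.0):
    """Temperature profile (deg C) along the bar after t_end seconds.
    k is thermal conductivity in W/m/K: ~45 for steel, 0 as NIST assumed."""
    rho, cp = 7850.0, 490.0   # steel density (kg/m^3) and specific heat (J/kg/K)
    alpha = k / (rho * cp)    # thermal diffusivity (m^2/s)
    dx = length / (n - 1)
    dt = 0.4 * dx**2 / alpha if alpha > 0 else 1.0  # explicit-scheme stability limit
    T = np.full(n, 20.0)      # whole bar starts at room temperature
    for _ in range(int(t_end / dt)):
        T[0] = 800.0          # fire end pinned at ~800 deg C
        # explicit finite-difference update for dT/dt = alpha * d2T/dx2
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        T[-1] = T[-2]         # treat the held end as insulated
    return T

for k in (45.0, 0.0):
    T = bar_temperatures(k)
    print(f"k = {k:4.1f} W/m/K: mid-bar {T[len(T) // 2]:4.0f} C, held end {T[-1]:4.0f} C")
```

Run it and the contrast is stark: with a realistic conductivity the middle and far end of the bar climb by hundreds of degrees within the hour, while with k = 0 no heat moves along the bar at all, so the fire-exposed region keeps whatever heat it receives instead of wicking it away, which is precisely the modelling convenience described above.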
This is a huge problem. In fact it could not be a bigger problem. This bogus assumption that steel has zero thermal conductivity not only renders the NIST report as a whole deeply suspect, it entirely nullifies even the flawed basis for its “collapse by fire” hypothesis.
This is why so many scientists are calling for another investigation. They aren’t saying the gumment did it, they aren’t claiming a conspiracy, they just see huge errors in the previous investigation and want more work to be done.
Bottom line is NIST punched in false data that totally invalidated their model. The zero thermal conductivity issue alone is sufficient grounds for a new investigation.
This article is based on a comment LG posted on another 9/11 thread. We welcome replies and rebuttals, please send them to submissions@off-guardian.org, marked “9/11”
September 17, 2016 Posted by aletho | Deception, False Flag Terrorism, Science and Pseudo-Science, Timeless or most popular | 9/11, FEMA, NIST, United States | Leave a comment
NIST finally admits free fall of WTC7
OffGuardian
David Chandler, physics teacher and member of AE911Truth, describes the journey toward NIST’s public admission that their initial calculations were incorrect and that WTC7 did descend at free-fall acceleration over the first eight floors of its collapse.
This concession by NIST (see section 11) raises many additional questions about the plausibility of the fire-induced progressive failure explanation for WTC7’s collapse that NIST published in 2008.
September 15, 2016 Posted by aletho | Deception, Science and Pseudo-Science, Timeless or most popular, Video | 9/11, NIST, WTC7 | Leave a comment
Heresy and the creation of monsters
By Judith Curry | Climate Etc. | October 25, 2010
I’m having another “Alice down the rabbit hole” moment, in response to the Scientific American article, the explication of the article by its author Michael Lemonick, Scientific American’s survey on whether I am a dupe or a peacemaker, and the numerous discussions in the blogosphere.
My first such moment was in 2005 in response to the media attention associated with the hurricane wars, which was described in a Q&A with Keith Kloor at collide-a-scape. While I really want to make this blog about the science and not about personalities (and especially not about me), this article deserves a response.
The title of the article itself is rather astonishing. Wikipedia defines heresy as: “Heresy is a controversial or novel change to a system of beliefs, especially a religion, that conflicts with established dogma.” The definition of dogma is “Dogma is the established belief or doctrine held by a religion, ideology or any kind of organization: it is authoritative and not to be disputed, doubted, or diverged from.” Use of the word “heretic” by Lemonick implies general acceptance by the “insiders” of the IPCC as dogma. If the IPCC is dogma, then count me in as a heretic. The story should not be about me, but about how and why the IPCC became dogma.
And what exactly is the nature of my challenges to the dogma? Lemonick made the following statement: “What I found out is that when [Curry] does raise valid points, they’re often points the climate-science community already agrees with — and many climate scientists are scratching their heads at the implication that she’s uncovered some dark secret.” This statement implies that I am saying nothing new, nothing that climate scientists don’t already know. Well that is mostly true (an exception being my recent blog series on uncertainty); I am mostly saying things that are blindingly obvious to everyone. Sort of like in the story “The Emperor’s New Clothes.” A colleague of mine at Georgia Tech, a Chair from a different department, said something like this: “I’ve been reading the media stories on the Georgia Tech Daily News Buzz that mention your statements. Your statements seem really sensible. But what I don’t understand is why such statements are regarded as news?”
Well that is a question that deserves an answer. I lack the hubris to think that my statements should have any public importance. The fact that they seem to be of some importance says a lot more about the culture of climate science and its perception by the public, than it says about me.
The narrative
Why am I being singled out here? Richard Lindzen and Roger Pielke Sr. have been making far more critical statements about the IPCC and climate science for a longer period than I have. And both score higher than me in the academic pecking order (in terms of number of publications and citations and external peer recognition).
The answer must be in the narrative of my transition from a “high priestess of global warming” to engagement with skeptics and a critic of the IPCC. The “high priestess of global warming” narrative (I used to see this term fairly frequently in the blogosphere, can’t spot it now) arose from my association with the hurricane and global warming issue, which at the time was the most alarming issue associated with global warming.
The overall evolution of my thinking on global warming is described in the Q&A at collide-a-scape (the relevant statements are appended at the end of this post.) My thinking and evolution on this issue since 11/19/09 deserves further clarification. When I first started reading the CRU emails, my reaction was a visceral one. While my colleagues seemed focused on protecting the reputations of the scientists involved and assuring people that the “science hadn’t changed,” I immediately realized that this could bring down the IPCC. I became concerned about the integrity of our entire field: both the actual integrity and its public perception. When I saw how the IPCC was responding and began investigating the broader allegations against the IPCC, I became critical of the IPCC and tried to make suggestions for improving the IPCC. As glaring errors were uncovered (especially the Himalayan glaciers) and the IPCC failed to respond, I started to question whether it was possible to salvage the IPCC and whether it should be salvaged. In the meantime, the establishment institutions in the U.S. and elsewhere were mostly silent on the topic.
In Autumn 2005, I had decided that the responsible thing to do in making public statements on the subject of global warming was to adopt the position of the IPCC. My decision was based on two reasons: 1) the subject was very complex and I had personally investigated a relatively small subset of the topic; 2) I bought into the meme of “don’t trust what one scientist says, trust what thousands of IPCC scientists say.” A big part of my visceral reaction to events unfolding after 11/19 was concern that I had been duped into supporting the IPCC, and substituting their judgment for my own in my public statements on the subject. So that is the “dupe” part of all this, perhaps not what Lemonick had in mind.
If, how, and why I had been duped by the IPCC became an issue of overwhelming personal and professional concern. I decided that there were two things that I could do: 1) speak out publicly and try to restore integrity to climate science by increasing transparency and engaging with skeptics; and 2) dig deeply into the broader aspects of the science and the IPCC’s arguments and try to assess the uncertainty. The Royal Society Workshop on Handling Uncertainty in Science last March motivated me to take on #2 in a serious way. I spent all summer working on a paper entitled “Climate Science and the Uncertainty Monster,” which was submitted to a journal in August. I have no idea what the eventual fate of this paper will be, but it has seeded the uncertainty series on Climate Etc. and its fate seems almost irrelevant at this point.
Monster creation
There are some parallels between the “McIntyre monster” and the “Curry monster.” The monster status derives from our challenges to the IPCC science and the issue of uncertainty. While the McIntyre monster is far more prominent in the public debate, the Curry monster seems far more irksome to community insiders. The CRU emails provide ample evidence of the McIntyre monster, and in the wake of the CRU emails I saw a discussion at RealClimate about the unbridled power of Steve McIntyre. Evidence of the Curry monster is provided by this statement in Lemonick’s article: “What scientists worry is that such exposure means Curry has the power to do damage to a consensus on climate change that has been building for the past 20 years.” This sense of McIntyre and myself as having “power” seems absurd to me (and probably to Steve), but it seems real to some people.
Well, who created these “monsters?” Big oil and the right-wing ideologues? Wrong. It was the media, climate activists, and the RealClimate wing of the blogosphere (note, the relative importance of each is different for McIntyre versus myself). I wonder if the climate activists will ever learn, or if they will follow the pied piper of the merchants of doubt meme into oblivion.
A note to my critics in the climate science community
Let me preface my statement by saying that at this point, I am pretty much immune to criticisms from my peers regarding my behavior and public outreach on this topic (I respond to any and all criticisms of my arguments that are specifically addressed to me.) If you think that I am a big part of the cause of the problems you are facing, I suggest that you think about this more carefully. I am doing my best to return some sanity to this situation and restore science to a higher position than the dogma of consensus. You may not like it, and my actions may turn out to be ineffective, futile, or counterproductive in the short or long run, by whatever standards this whole episode ends up getting judged. But this is my carefully considered choice on what it means to be a scientist and to behave with personal and professional integrity.
Let me ask you this. So how are things going for you lately? A year ago, the climate establishment was on top of the world, masters of the universe. Now we have a situation where there have been major challenges to the reputations of a number of scientists, the IPCC, professional societies, and other institutions of science. The spillover has been a loss of public trust in climate science and, some have argued, even more broadly in science. The IPCC and the UNFCCC are regarded by many as impediments to sane and politically viable energy policies. The enviro advocacy groups are abandoning the climate change issue for more promising narratives. In the U.S., the prospect of the Republicans winning the House of Representatives raises the specter of hearings on the integrity of climate science and reductions in federal funding for climate research.
What happened? Did the skeptics and the oil companies and the libertarian think tanks win? No, you lost. All in the name of supporting policies that I don’t think many of you fully understand. What I want is for the climate science community to shift gears and get back to doing science, and return to an environment where debate over the science is the spice of academic life. And because of the high relevance of our field, we need to figure out how to provide the best possible scientific information and assessment of uncertainties. This means abandoning this religious adherence to consensus dogma.
Addendum: reproduced from my Q&A at collide-a-scape
“Circa 2003, I was concerned about the way climate research was treating uncertainty (see my little essay presented to the NRC Climate Research Committee).
I was considered somewhat quixotic but not really outside of the mainstream (p.s. the CRC didn’t pay any attention to my essay, they went off in a different direction that focused on communicating uncertainty and decisionmaking under uncertainty). During this period, I was comfortably ensconced in the ivory tower of academia, writing research papers, going to conferences, submitting grant proposals. I was 80% oblivious to what was going on in terms of the public debate surrounding climate change.
This all changed on September 14, 2005, when I participated in a press conference on our forthcoming paper that described a substantial increase in the global number of category 4 and 5 hurricanes. The unplanned and uncanny timing of publication of this paper was three weeks after Hurricane Katrina devastated New Orleans. While global warming was mentioned only obliquely in the paper, the press focused on the global warming angle and a media furor followed. We were targeted as global warming alarmists, capitalizing on this tragedy to increase research funding and for personal publicity, a threat to capitalism and the American way of life, etc.
At the same time, we were treated like rock stars by the environmental movement. Our 15 minutes stretched into days, weeks and months. Hurricane Katrina became a national focusing event for the global warming debate. We were particularly stung by criticisms from fellow research scientists who claimed that we were doing this “for the money” and attacked our personal and scientific integrity. We felt that one scientist in particular had crossed the line and committed a series of fouls, and this turned the scientific debate into academic guerrilla warfare between our team and the skeptics that was played out in the glare of the media. This “war” culminated in an article published on the front page of the Wall Street Journal, “Debate shatters the civility of weather science” on Feb 2, 2006 . . . This article became a catharsis for the hurricane research community, engendering extensive email discussion among scientists on both sides of the public debate. We did an email version of a “group hug” and vowed to stop the guerrilla warfare.
I had lost my bearings in all of this, and the Wall Street Journal article had the effect of a bucket of cold water being poured over my head. I learned several important lessons from this experience: just because the other guy commits the first “foul” doesn’t give you the moral high ground in protracted academic guerilla warfare. Nothing in this crazy environment is worth sacrificing your personal or professional integrity. After all, no one remembers who fired the first shot, all they see is unprofessional behavior.
I took a step back and tried to understand all this craziness and learn from it. I even wrote a journal article on this, “Mixing Politics and Science in Testing the Hypothesis that Greenhouse Warming is Causing a Global Increase in Hurricane Intensity.” This paper got quite a bit of play in the blogosphere upon its publication in Aug 2006, and at this time I made my first major foray into the blogosphere, checking in at all the blogs where the paper was being discussed. See esp realclimate and climateaudit (but I can no longer find the original thread on climateaudit).
At climateaudit, the posters had some questions about statistics and wanted to see the raw data. I was pretty impressed by the level of discussion, and wondered why I had not come across this blog before over at the realclimate blogroll. Then I realized that I was on Steve McIntyre’s blog (I had sort of heard of his tiff with Mann, but wasn’t really up on all this at the time). I was actually having much more fun over at climateaudit than at realclimate, and I thought it made much more sense to spend time at climateaudit rather than to preach to the converted at realclimate. Back in 2006 spending time at climateaudit was pretty rough sport (it wasn’t really moderated at the time). When I first started spending time over there, the warmist blogs thought it was really funny, and encouraged me to give ‘em hell.
I was continuing my overall thinking on how to better deal with skeptics and increase the credibility and integrity of science. I gave an invited talk at Fall 2006 AGU meeting, entitled “Falling out of the ivory tower: Reflections on mixing politics and climate science.” This is where I first started talking about circling the wagons, etc. I don’t think this was quite what the convenors had in mind when they invited me to give this talk, but at the time I still had pretty solid status as a survivor of vicious political attacks during the hurricane wars and was a heroine for taking down Bill Gray.
When the IPCC Fourth Assessment Report was published in 2007, I joined the consensus in supporting this document as authoritative; I was convinced by the rigors of the process, etc etc. While I didn’t personally agree with everything in the document (still nagging concerns about the treatment of uncertainty), I bought into the meme of “don’t trust what one scientist says, listen to the IPCC.” During 2008 and 2009, I became increasingly concerned by the lack of “policy neutrality” by people involved in the IPCC and policies that didn’t make sense to me. But after all, “don’t trust what one scientist says”, and I continued to substitute the IPCC assessment for my own personal judgment [in my public statements].
November 19, 2009: bucket of cold water #2. When I first saw the climategate emails, I knew these were real, they confirmed concerns and suspicions that I already had. After my first essay “On the credibility . . .” posted at climateaudit, I got some emails that asked me to be sensitive to the feelings of the scientists involved. I said I was a whole lot more worried about the IPCC, in terms of whether it could be saved and whether it should be saved. I had been willing to substitute the IPCC for my own personal judgment [in public statements], but after reading those emails, the IPCC lost the moral high ground in my opinion. Not to say that the IPCC science was wrong, but I no longer felt obligated in substituting the IPCC for my own personal judgment.
So the Judith Curry ca 2010 is the same scientist as she was in 2003, but sadder and wiser as a result of the hurricane wars, a public spokesperson on the global warming issue owing to the media attention from the hurricane wars, more broadly knowledgeable about the global warming issue, much more concerned about the integrity of climate science, listening to skeptics, and a blogger (for better or for worse). . . People really find it hard to believe that I don’t have a policy agenda about climate change/energy (believe me, Roger Pielke Jr has tried very hard to smoke me out as a “stealth advocate”). Yes, I want clean green energy, economic development and “world peace”. I have no idea how much climate change should be weighted in these kinds of policy decisions. I lack the knowledge, wisdom and hubris to think that anything I say or do should be of any consequence to climate/carbon/energy policy.”
September 14, 2016 Posted by aletho | Deception, Science and Pseudo-Science, Timeless or most popular | IPCC | Leave a comment
The Toronto Hearings on 9/11 Uncut – Jon Cole
October 3, 2012
Ten years have passed since the World Trade Center attacks of September 11, 2001, and there are still many unanswered questions surrounding that fateful day.
In 2011, experts and scientists from around the world gathered in Toronto, Canada to present new and established evidence that questions the official story of 9/11. This evidence was presented to a distinguished panel of experts over a 4-day period.
Through their analysis and scientific investigations, they hope to spark a new investigation into the attacks of September 11, 2001.
Press For Truth and The International Center for 9/11 Studies Present:
“The Toronto Hearings on 9/11: Uncovering Ten Years of Deception”
Produced by:
Steven Davies
Dan Dicks
Bryan Law
A DVD of over 5 hours, with comprehensive coverage of the 4-day Toronto Hearings from September 2011.
Featuring expert witness testimony from:
David Ray Griffin
Richard Gage
David Chandler
Kevin Ryan
Niels Harrit
Barbara Honegger
Peter Dale Scott
Graeme MacQueen
Jonathan Cole
Cynthia McKinney
…and many more!
Support the film makers and own The Toronto Hearings on 911 on DVD!
http://www.pressfortruth.ca/pft-shop/…
We rely on you the viewer to help us continue to do this work. With your help we can continue to make videos and documentary films for youtube in an effort to raise awareness all over the world. Please support independent media by joining Press For Truth TV!
As a Press For Truth TV subscriber you’ll have full access to the site’s features and content including Daily Video Blogs on current news from the PFT perspective and High Quality Downloads of all Press For Truth Films, Music and Special Reports! Subscribe to Press For Truth TV: http://pressfortruth.tv/register/
For more information visit:
http://www.facebook.com/PressForTruth
http://www.youtube.com/weavingspider
http://twitter.com/#!/DanDicksPFT
September 14, 2016 Posted by aletho | Deception, False Flag Terrorism, Science and Pseudo-Science, Timeless or most popular, Video | 9/11, United States | Leave a comment
Sugar industry blamed fat in fake studies published by New England Journal of Medicine
RT | September 13, 2016
The sugar industry paid Harvard researchers in the 1960s to bury research linking sugar intake to heart disease and to instead make fat the culprit, according to a study of archival documents.
“These internal documents show that the Sugar Research Foundation initiated coronary heart disease research in 1965 to protect market share and that its first project, a literature review, was published in the New England Journal of Medicine without disclosure of the sugar industry’s funding or role,” stated the study.
The internal sugar industry documents were found in public archives by a researcher at the University of California, San Francisco.
UCSF researchers analyzed more than 340 documents indicating the relationship between the sugar industry and Roger Adams, then a professor of organic chemistry who served on the scientific advisory boards for the sugar industry, and Mark Hegsted, one of the Harvard researchers who produced the literature review.
The documents showed the sugar industry was aware of evidence in the 1960s that linked sugar consumption to high blood cholesterol and triglyceride levels thought to be risk factors for coronary heart disease.
The sugar industry commissioned Project 226, a literature review written by researchers at the Harvard University School of Public Health Nutrition Department, which concluded there was “no doubt” that the only dietary intervention required to prevent coronary heart disease was to reduce dietary cholesterol and substitute polyunsaturated fat for saturated fat in the American diet.
The sugar industry paid the Harvard scientist the equivalent of $50,000 in 2016 dollars.
The study found the NEJM review served the sugar industry’s interests by arguing that studies “associating sucrose with coronary heart disease were limited” and that sugar should not be included in assessments of risk of heart disease.
Researchers found the sugar industry would spend $600,000 (the equivalent of $5.3 million in 2016 dollars) to teach “people who had never had a course in biochemistry… that sugar is what keeps every human being alive and with energy to face our daily problems,” according to a UCSF press release.
Among the documents was a speech from 1954 by Sugar Research Foundation (SRF) president Henry Hass, which showed that they recognized that if Americans adopted low-fat diets, then per-capita consumption of sugar would increase by more than one-third. The trade organization represented 30 international members.
“The literature review helped shape not only public opinion on what causes heart problems but also the scientific community’s view of how to evaluate dietary risk factors to heart disease,” said lead author Cristin Kearns, who discovered the industry documents.
Other documents showed the sugar industry became concerned in 1962 with evidence showing that a low-fat diet high in sugar could elevate serum cholesterol level. In 1964, the SRF vice president and director of research, John Hickson, said new research on coronary heart disease found that “sugar is a less desirable dietary source of calories than other carbohydrates,” and referred to the work since 1957 of British physiologist John Yudkin, who challenged population studies singling out saturated fat as the primary dietary cause of coronary heart disease “and suggested other factors, including sucrose, were at least equally important.”
“Hickson proposed that SRF ‘could embark on a major program’ to counter Yudkin and other ‘negative attitudes toward sugar,’” stated the study.
It found that Hickson recommended an opinion poll “to learn what public concepts we should reinforce and what ones we need to combat through our research and information and legislation programs,” a symposium to “bring detractors before a board of their peers where their fallacies could be unveiled,” and recommended the sugar industry fund coronary heart disease research to “see what the weak points there are in the experimentation, and replicate the studies with appropriate corrections. Then publish the data and refute our detractors.”
The analysis ‘Sugar Industry and Coronary Heart Disease Research: A Historical Analysis of Internal Industry Documents’ was published Monday in JAMA Internal Medicine.
September 13, 2016 Posted by aletho | Corruption, Deception, Science and Pseudo-Science, Timeless or most popular | Harvard University, New England Journal of Medicine | Leave a comment
The New York Times is looking for a climate change editor
Drone footage that shows Greenland melting away. Long narratives about the plight of climate refugees, from Louisiana to Bolivia and beyond. A series on the California drought. Color-coded maps that show how hot it could be in 2060.
The New York Times is a leader in covering climate change. Now The Times is ramping up its coverage to make the most important story in the world even more relevant, urgent and accessible to a huge audience around the globe.
We are looking for an editor to lead this dynamic new group. We want someone with an entrepreneurial streak who is obsessed with finding new ways to connect with readers and new ways to tell this vital story.
The coverage should encompass: the science of climate change; the politics of climate debates; the technological race to find solutions; the economic consequences of climate change; and profiles of fascinating characters enmeshed in the issues.
The coverage should include journalism in a variety of formats: video, photography, newsletters, features, podcasts, conferences and more. The unit should make strategic decisions about which forms are top priorities and which are not.
The climate editor will collaborate with many others throughout the newsroom, but will operate apart from the current department structure, with no print obligations.
To Apply
Applicants should submit a resume, examples of previous work, and a memo outlining their vision for coverage to Dean Baquet and Sam Dolnick by Sept. 19. This vision is the most important part of the application. It should be specific and set clear priorities. Some important questions to wrestle with:
What audiences should we be focusing on?
How will our coverage fit into their lives, and how will they experience it?
How will we distinguish our coverage from other journalism in this space?
What will be the main vehicles for the coverage? Features? News? Videos?
Should there be a signature voice attached to our climate coverage? Who?
How will you make a difficult subject interesting and accessible?
What stories are we willing not to do?
What should the team look like to get it done?
This non-Guild position is open to internal and external candidates. Applications should be sent to nytrecruit@nytimes.com.
September 12, 2016 Posted by aletho | Deception, Science and Pseudo-Science | New York Times, United States | Leave a comment
Guardian “Facebook fact-check” on 9/11 – every bit as poor as you would expect
By Catte | OffGuardian | September 10, 2016
The Guardian is no better at telling the truth about the nature of the 9/11 debate than about Syria, Ukraine or indeed anything. Its recent bid at being both social-media savvy and weirdly Orwellian, “Facebook Fact Check”, currently has a little snippet up on the subject.
The paper they are referring to is On the Physics of High Rise Building Collapses, which we have published here, and the “professor” who, according to them, “left Brigham Young University in disgrace” is of course physicist Steve Jones, who was the subject of a hostile media campaign after he and his BYU research team claimed to have discovered evidence of nanothermite in tiny “red/gray chips” found in the dust from the WTC explosions.
For the record, Jones’ research work on the red/gray chips has been challenged, but never debunked, and his experiments have been replicated successfully by independent researchers elsewhere in the world, such as Mark Basile. Jones was suspended from his teaching duties and then offered “early retirement” by BYU in 2006 in the midst of the media campaign against him.
BYU’s reasons for this action were never publicly disclosed but the Guardian’s claim that Jones was “disgraced” is little more than a sleazy bit of innuendo, so gross it doesn’t even appear in the sourced WaPo article, which does at least try to be a tad objective. “Disgrace” is just the Graun’s own little bit of tabloidese. Because tabloid is all it seems to do now.
Jones’ three co-authors are described in this piece as “a retired professor and two longterm 9/11 truthers.” I guess the Graun didn’t want to admit two of them are structural engineers as well as being “truthers”?
When the (ironically named?) “fact-check” briefly discusses the physics of 9/11, it’s simply to offer yet more deception. The investigation by serious professionals of the still not fully explained and extraordinary triple collapses on 9/11 is listed along with claims we didn’t go to the Moon and some random nonsense about Hillary Clinton, presumably in some attempt to discredit by proximity. Instead of honestly addressing the very real areas of uncertainty which the scientists of NIST have quite openly admitted, the Graun does what many other agenda-driven “debunkers” do and tries to reframe the issue as being between “settled science” (to borrow a term) on one side and crazy, discredited or otherwise unreliable kooks on the other.
This, we need to clearly understand, is a purely propagandist ploy meant to convince only the under-informed “masses” (ie us), and not those on either side versed in the real issues. If you read their report and other commentaries, the experts of the National Institute of Standards and Technology are well aware that the explanation they have produced for the 9/11 building collapses is neither complete nor beyond rational question. They are well aware there is plenty of room for science-based interrogation and counter-hypothesis.
But for some reason it seems to be very important to the manufacturers of consent that we, the public, are not made aware of these continuing and probably understandable uncertainties. So, through outlets such as the Guardian (and many others) they disseminate simplistic statements, soundbites and frank lies, designed to convince people that what is uncertain, poorly explained and capable of interpretation is simple, settled, dusted and done.
The link the Guardian provides that is alleged to “disprove” all such “conspiracy theories” is to the 2005 Popular Mechanics article that did indeed claim to do this. The Guardian doesn’t mention that this article has itself been “debunked” and makes several provably false assertions.
If the fire-induced collapse explanation for the events of 9/11 is really beyond debate, why do outlets such as the Graun (and indeed Popular Mechanics) not make this self-evident by simply allowing both sides to place their evidence before the public on their pages, so that readers can make up their own minds? Outlets don’t need to take a side and defend it. In fact they work best when they try to avoid this and do their best to offer argument from all sides.
We aren’t claiming Jones and his co-authors are ultimately correct. We’re just pointing out that scientific truth doesn’t need to be defended by spin or censorship or grotesque ad hominem.
September 10, 2016 Posted by aletho | Deception, Mainstream Media, Warmongering, Science and Pseudo-Science, Timeless or most popular | 9/11, The Guardian | Leave a comment
The distinction between true scepticism and denial
By Don Aitkin | September 4, 2016
I came across the phrase in the title, and followed a link to a recent journal article which for once was available on open access. Entitled ‘Science and the Public: Debate, Denial, and Skepticism’, it looked interesting. You can read it here. The four authors come from different fields, and propose to outline ‘the distinction between true scepticism and denial’. They also offer some guidelines to help researchers, and interested members of the public, decide how to deal with enquiries, on the one hand, and problems which people see in published science, on the other.
The reader is brought into the area of ‘climate change’ at once: “The controversy surrounding climate change is just one example of a polarized public debate that seems remote and detached from the actual state of science: Within the scientific community, there is a pervasive consensus that the Earth is warming from greenhouse gas emissions (Anderegg, Prall, Harold, & Schneider, 2010; Cook et al., 2013; Doran & Zimmerman, 2009; Oreskes, 2004; Shwed & Bearman, 2010), but outside science there is entrenched denial of this fact in some sectors of society (e.g., Dunlap, 2013; Lewandowsky, Gignac, & Oberauer, 2013).” [my emphasis]
Whoops! Substantively, ‘climate change’ is not simply whether the planet has warmed through greenhouse gas emissions. More important and related questions include, for example, by how much has it warmed, what else has been at work besides greenhouse gases, is the warming unprecedented or not, does it matter anyway (isn’t warming better than cooling?), and many others. Pedantically, there is no need for a consensus to be graced with the adjective pervasive. If it is a consensus then it is by definition pervasive, meaning ‘permeated’, ‘diffused through’, etc.
Then interested readers might wonder where to find the entrenched denial of the supposed fact that the Earth is warming from greenhouse gas emissions. The sceptical community for the most part, I think, accepts that greenhouse gas emissions have contributed to the warming that has occurred over the past century or so (which is not quite the same thing). There are a few dragon-slayers who don’t agree. But entrenched denial? I’m not aware of it. The links don’t help, since Dunlap 2013 is a study of 108 climate change denial books with most of the interest being in their supposed links to business groups. The Lewandowsky link is even less helpful, as well as being an intellectually dreadful paper. I don’t know quite what I would expect to find as an example of entrenched denial in opposition to pervasive consensus, but there’s no evidence for it here. To continue:
“Media reports occasionally even proclaim that warming has stopped (Ridley, 2014) or that we are headed for global cooling (e.g., Rose, 2013). These propositions have no scientific support …”
Well, Matt Ridley’s op-ed in the Wall Street Journal may not be top-of-the-line science, though he refers to the science, but the UK Met Office did indeed agree that there was a hiatus in warming, and that it would continue until 2017. The scientists who propose the possibility of cooling are solar physicists, for the most part, and their views may be wrong. But the ‘cooling’ view does have some scientific support (see, for example, here).
These introductory remarks are a little jarring, in the context of the pure bromide that is to come. Public debate and scepticism are essential to a functioning democracy. Indeed scepticism has been shown to enable people to differentiate more accurately between truth and falsehood. How could we disagree? So how do we tell when what we are getting is scientific fact or denial? Ah, you see, there are three factors that are always present when denialists are involved. First, they make stuff up. Second, denial commonly invokes notions of conspiracies. (I think Dunlap 2013, mentioned above, is an excellent example of the way in which conspiracies can be invoked, but I don’t think the authors had him in mind.) Third, denialists engineer personal and professional attacks on scientists both in public and behind the scenes, and issue prolific complaints to scientists’ host institutions with allegations of research misconduct. Two of the authors of this article claim to have experienced such behaviour.
What the authors claim, on the basis of what they call recent evidence, is that up to US$1 billion flows every year into U.S. foundations and think tanks that are dedicated to political lobbying for various issues. One of the principal objectives of this network is to support a climate “counter movement” that seeks to reframe public discourse surrounding climate change from one of overwhelming scientific consensus to one of doubt, debate, and uncertainty (Brulle, 2014; Plehwe, 2014). To illustrate, more than 90% of recent books that dismiss environmental problems have been linked to conservative think tanks (Jacques, Dunlap, & Freeman, 2008), and such books typically never undergo peer review (Dunlap & Jacques, 2013). This does look like conspiracy stuff to me, on first reading, but again, I doubt the authors had this in mind either.
Now comes more bromide: In a democracy, calls for genuine debate are to be welcomed and must be taken seriously. Given that scientific issues can have far-reaching political, technological, or environmental consequences, greater involvement of the public can only be welcome and may lead to better policy outcomes. Who could disagree? We are given a small example of how this has worked in practice (it is not in climate change). Notwithstanding the public’s entitlement to be involved in issues that are scientifically informed, scientific debates must still be conducted according to the rules of science. Arguments must be evidence-based and they are subject to peer review before they become provisionally accepted. Hang on there! If arguments have to be evidence-based, and the evidence doesn’t support them, what then? Do we really have to wait for good policy until the peer-review process (something that applies almost solely to academic work) has considered the matter? In the climate science arena even well-credentialled sceptical scientists have found it hard to get critical papers accepted for publication.
In the matter of disagreement, the two first-named authors acknowledge the uncertainty in climate projections, but note that contrary to popular intuition, any uncertainty provides even greater impetus for climate mitigation. I’ve come across this line of argument before, and have to go along with ‘popular intuition’ here. If there is uncertainty about whether something needs to be done, because the evidence is weak or equivocal, it would seem strange indeed to say ‘Hah! That’s even more reason to go down my chosen path!’ I am open to persuasion, but not to this kind of assertion.
What I think is happening in this strange, muddled and evidence-free paper is a kind of explicit argument that peer review is the only way to go, if only because the blogospherical world (which the authors denounce) has very little in the way of support for the supposed consensus. By now the title of the paper has been forgotten by the authors, and we get this: People who deny scientific facts that they find challenging or unacceptable, by contrast, are by and large not skeptics. On the contrary, they demonstrably shy away from scientific debate by avoiding the submission of their ideas to peer review. One has to say, again, that peer review is for academics and is not the gold standard for science. Bad data, bad argument and self-interest are usually quickly discovered, and any proposition that results from them is usually dismissed, or at least put aside. What distinguishes ‘climate change’ is that policies like the carbon tax came before the science was properly in (it still isn’t), and for political reasons the policies remained current, despite the lack of continually corroborative scientific evidence.
Oh well, another blinkered, dodgy, peer-reviewed paper. Who let this through? Oh, I forgot to mention the Guidelines. The first, ‘Proposed Guidelines for Critical Scientific Engagement by Members of the Public’ begins with this little preamble: If your goal is to contribute to a scientific conversation, then you need to follow certain rules. One of those rules is that scientific arguments are conducted in the scientific peer-reviewed literature. If you are unwilling to do so, these guidelines are of little value. Indeed so. Good luck, would-be contributor!
The second set is for scientists who might be approached by a member of the public seeking critical engagement. The Guidelines tell you to be careful — you might be approached by someone who is not in good faith, and wants to find errors in your work. Don’t help them!
And when you’ve finished both guidelines, you still don’t know what the authors think a ‘true sceptic’ is, or how he or she is to be distinguished from a ‘denialist’. Yet that is embodied in the title of the article.
Finally, the authors. The first two names will be familiar to readers of this website, and indeed to anyone interested in the ‘climate change’ issue: Stephan Lewandowsky, Michael E. Mann, Nicholas J. L. Brown and Harris Friedman. You will learn about the third and fourth by reading the article. They seem somewhat more sensible than the first two. Oh, there are 96 references, of which 22 are self-citations, 16 of them to Lewandowsky alone. I may be wrong, but I could find just three references that were critical of the authors’ standpoint. Not exactly a review paper, for all its pretension.
And I find myself saying, yet again, this awful, poorly argued, self-seeking paper has passed peer review? What have we come to in the journal world?
September 10, 2016 Posted by aletho | Deception, Science and Pseudo-Science, Timeless or most popular | Michael E. Mann, Stephan Lewandowsky | Leave a comment
Jill Stein Predicts One Foot Of Sea Level Rise Per Year
By Marc Morano | Climate Depot | September 7, 2016
The people at NOAA who actually study sea level say it is rising less than seven inches per century.
“the absolute global sea level rise is believed to be 1.7-1.8 millimeters/year.”
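The arithmetic is straightforward: 1.7-1.8 mm/year × 100 years = 170-180 mm, which at 25.4 mm to the inch is roughly 6.7-7.1 inches per century. Stein’s one foot per year (about 305 mm/year) would be more than 150 times the measured rate.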
September 10, 2016 Posted by aletho | Deception, Science and Pseudo-Science | United States | Leave a comment
On the physics of high-rise building collapses
By Steve Jones, Robert Korol, Anthony Szamboti and Ted Walter – Europhysics News
In August 2002, the U.S. National Institute of Standards and Technology (NIST) launched what would become a six-year investigation of the three building failures that occurred on September 11, 2001 (9/11):
- the well-known collapses of the World Trade Center (WTC) Twin Towers that morning and
- the lesser-known collapse late that afternoon of the 47-story World Trade Center Building 7, which was not struck by an airplane.
NIST conducted its investigation based on the stated premise that the
“WTC Towers and WTC 7 [were] the only known cases of total structural collapse in high-rise buildings where fires played a significant role.”
Indeed, neither before nor since 9/11 have fires caused the total collapse of a steel-framed high-rise—nor has any other natural event, with the exception of the 1985 Mexico City earthquake, which toppled a 21-story office building. Otherwise, the only way such buildings have ever been brought down completely is through a procedure known as controlled demolition, whereby explosives or other devices are used to bring down a structure intentionally.
Although NIST finally concluded after several years of investigation that all three collapses on 9/11 were due primarily to fires, fifteen years after the event a growing number of architects, engineers, and scientists are unconvinced by that explanation.
Preventing high-rise failures
Steel-framed high-rises have endured large fires without suffering total collapse for four main reasons:
- Fires typically are not hot enough and do not last long enough in any single area to generate enough energy to heat the large structural members to the point where they fail (the temperature at which structural steel loses enough strength to fail depends on the factor of safety used in the design. In the case of WTC 7, for example, the factor of safety was generally 3 or higher, so 67% of the strength would need to be lost for failure to ensue, which would require the steel to be heated to about 660°C; see the worked arithmetic after this list);
- Most high-rises have fire suppression systems (water sprinklers), which further prevent a fire from releasing sufficient energy to heat the steel to a critical failure state;
- Structural members are protected by fireproofing materials, which are designed to prevent them from reaching failure temperatures within specified time periods; and
- Steel-framed high-rises are designed to be highly redundant structural systems. Thus, if a localized failure occurs, it does not result in a disproportionate collapse of the entire structure.
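The factor-of-safety figure in the first point above is simple arithmetic: a factor of safety of 3 means a member’s failure load is three times its service load, so it normally works at only 1/3 of capacity and fails only after losing 1 − 1/3 ≈ 67% of its room-temperature strength. Published strength-reduction data for structural steel (roughly half the yield strength is gone by 600°C) put a loss of that size at about the 660°C the authors cite.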
FIG. 1: WTC 5 is an example of how steel-framed high-rises typically perform in large fires. It burned for over eight hours on September 11, 2001 (a), and did not suffer a total collapse (b). (Source: FEMA)
Throughout history, three steel-framed high-rises are known to have suffered partial collapses due to fires; none of those led to a total collapse. Countless other steel-framed high-rises have experienced large, long-lasting fires without suffering either partial or total collapse (see, for example, Fig. 1 a and b) [1].
In addition to resisting ever-present gravity loads and occasional fires, high-rises must be designed to resist loads generated during other extreme events — in particular, high winds and earthquakes. Designing for high-wind and seismic events mainly requires the ability of the structure to resist lateral loads, which generate both tensile and compressive stresses in the columns due to bending, the latter stresses then being combined with gravity-induced compressive stresses due to vertical loads.
FIG. 2: WTC 7 fell symmetrically and at free-fall acceleration for a period of 2.25 seconds of its collapse. (Source: NIST)
It was not until steel became widely manufactured that the ability to resist large lateral loads was achieved and the construction of high-rises became possible. Steel is both very strong and ductile, which allows it to withstand the tensile stresses generated by lateral loads, unlike brittle materials, such as concrete, that are weak in tension. Although concrete is used in some high-rises today, steel reinforcement is needed in virtually all cases.
To allow for the resistance of lateral loads, high-rises are often designed such that the percentage of their columns’ load capacity used for gravity loads is relatively low. The exterior columns of the Twin Towers, for example, used only about 20% of their capacity to withstand gravity loads, leaving a large margin for the additional lateral loads that occur during high-wind and seismic events [2].
Because the only loads present on 9/11 after the impact of the airplanes were gravity and fire (there were no high winds that day), many engineers were surprised that the Twin Towers completely collapsed. The towers, in fact, had been designed specifically to withstand the impact of a jetliner, as the head structural engineer, John Skilling, explained in an interview with the Seattle Times following the 1993 World Trade Center bombing:
“Our analysis indicated the biggest problem would be the fact that all the fuel (from the airplane) would dump into the building. There would be a horrendous fire. A lot of people would be killed,” he said. “The building structure would still be there.”
Skilling went on to say he didn’t think a single 200-pound [90-kg] car bomb would topple or do major structural damage to either of the Twin Towers.
“However,” he added, “I’m not saying that properly applied explosives—shaped explosives—of that magnitude could not do a tremendous amount of damage […] I would imagine that if you took the top expert in that type of work and gave him the assignment of bringing these buildings down with explosives, I would bet that he could do it.”
In other words, Skilling believed the only mechanism that could bring down the Twin Towers was controlled demolition.
Techniques of controlled demolition
Controlled demolition is not a new practice. For years it was predominantly done with cranes swinging heavy iron balls to simply break buildings into small pieces. Occasionally, there were structures that could not be brought down this way. In 1935, the two 191-m-tall Sky Ride towers of the 1933 World’s Fair in Chicago were demolished with 680 kg of thermite and 58 kg of dynamite. Thermite is an incendiary containing a metal powder fuel (most commonly aluminum) and a metal oxide (most commonly iron(III) oxide or “rust”).
Eventually, when there were enough large steel-framed buildings that needed to be brought down more efficiently and inexpensively, the use of shaped cutter charges became the norm. Because shaped charges have the ability to focus explosive energy, they can be placed so as to diagonally cut through steel columns quickly and reliably.
FIG. 3: The final frame of NIST’s WTC 7 computer model shows large deformations to the exterior not observed in the videos (Source: NIST)
In general, the technique used to demolish large buildings involves cutting the columns in a large enough area of the building to cause the intact portion above that area to fall and crush itself as well as crush whatever remains below it.
This technique can be done in an even more sophisticated way, by timing the charges to go off in a sequence so that the columns closest to the center are destroyed first. The failure of the interior columns creates an inward pull on the exterior and causes the majority of the building to be pulled inward and downward while materials are being crushed, thus keeping the crushed materials in a somewhat confined area — often within the building’s “footprint.” This method is often referred to as “implosion.”
The case of WTC 7
The total collapse of WTC 7 at 5:20 PM on 9/11, shown in Fig. 2, is remarkable because it exemplified all the signature features of an implosion:
- The building dropped in absolute free fall for the first 2.25 seconds of its descent, over a distance of 32 meters, or eight stories [3] (a kinematics check follows this list).
- Its transition from stasis to free fall was sudden, occurring in approximately one-half second.
- It fell symmetrically straight down.
- Its steel frame was almost entirely dismembered and deposited mostly inside the building’s footprint, while most of its concrete was pulverized into tiny particles.
- Finally, the collapse was rapid, occurring in less than seven seconds.
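A quick kinematics check on the first item, assuming g = 9.81 m/s² (the 2.25 s and 32 m figures are quoted above): free fall from rest covers about 25 m in 2.25 seconds, so the quoted eight-story drop implies a modest entry velocity of roughly 3 m/s, picked up during the brief half-second transition noted in the second item.

```python
# Kinematics check on the free-fall phase (a sketch; g is assumed,
# the 2.25 s and 32 m figures are quoted in the list above).

g = 9.81    # m/s^2, gravitational acceleration
t = 2.25    # s, duration of the measured free-fall phase
d = 32.0    # m, quoted drop (~eight stories)

d_from_rest = 0.5 * g * t**2        # drop if free fall started from rest
v0_implied = (d - d_from_rest) / t  # entry velocity implied by the full 32 m

print(f"Free fall from rest in {t} s: {d_from_rest:.1f} m")                 # ~24.8 m
print(f"Entry velocity implied by a {d:.0f} m drop: {v0_implied:.1f} m/s")  # ~3.2 m/s
```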
Given the nature of the collapse, any investigation adhering to the scientific method should have seriously considered the controlled demolition hypothesis, if not started with it. Instead, NIST (as well as the Federal Emergency Management Agency (FEMA), which conducted a preliminary study prior to the NIST investigation) began with the predetermined conclusion that the collapse was caused by fires.
FIG. 4: The above graph [10] compares David Chandler’s measurement [9] of the velocity of the roofline of WTC 1 with Bažant’s erroneous calculation [11] and with Szamboti and Johns’ calculation using corrected input values for mass, acceleration through the first story, conservation of momentum, and plastic moment (the maximum bending moment a structural section can withstand). The calculations show that, in the absence of explosives, the upper section of WTC 1 would have arrested after falling for two stories (Source: Ref. [10]).
Trying to prove this predetermined conclusion was apparently difficult. FEMA’s nine-month study concluded by saying, “The specifics of the fires in WTC 7 and how they caused the building to collapse remain unknown at this time. Although the total diesel fuel on the premises contained massive potential energy, the best hypothesis has only a low probability of occurrence.” NIST, meanwhile, had to postpone the release of its WTC 7 report from mid-2005 to November 2008. As late as March 2006, NIST’s lead investigator, Dr. Shyam Sunder, was quoted as saying,
Truthfully, I don’t really know. We’ve had trouble getting a handle on building No. 7.
All the while, NIST was steadfast in ignoring evidence that conflicted with its predetermined conclusion. The most notable example was its attempt to deny that WTC 7 underwent free fall. When pressed about that matter during a technical briefing, Dr. Sunder dismissed it by saying,
[A] free-fall time would be an object that has no structural components below it.
But in the case of WTC 7, he claimed,
there was structural resistance that was provided.
Only after being challenged by high school physics teacher David Chandler and by physics professor Steven Jones (one of the authors of this article), who had measured the fall on video, did NIST acknowledge a 2.25-second period of free fall in its final report. Yet NIST’s computer model shows no such period of free fall, nor did NIST attempt to explain how WTC 7 could have had “no structural components below it” for eight stories.
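The measurement itself is straightforward to reproduce in principle: track the roofline position frame by frame, difference it to get velocity, and fit the slope of the velocity curve to estimate acceleration. The sketch below uses synthetic data generated from ideal free fall plus noise; it illustrates the method, not Chandler’s actual data.

```python
import numpy as np

# Sketch of the video-measurement approach: track the roofline drop
# frame by frame, difference it to get velocity, and fit the slope to
# estimate acceleration. Data below are SYNTHETIC (ideal free fall
# plus pixel-scale noise), not Chandler's measurements.

g = 9.81
fps = 30.0                                  # assumed camera frame rate
t = np.arange(0, 2.25, 1.0 / fps)           # time stamp of each frame
drop = 0.5 * g * t**2                       # synthetic roofline drop (m)
drop += np.random.normal(0, 0.05, t.size)   # measurement noise

velocity = np.gradient(drop, t)             # finite-difference velocity
accel, _ = np.polyfit(t, velocity, 1)       # slope of v(t) = acceleration

print(f"Fitted acceleration: {accel:.2f} m/s^2 (g = {g} m/s^2)")
```

An acceleration fitted at or near g over a sustained interval is the operational meaning of “free fall” in this context.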
Instead, NIST’s final report provides an elaborate scenario involving an unprecedented failure mechanism: the thermal expansion of floor beams pushing an adjoining girder off its seat. The alleged walk-off of this girder then supposedly caused an eight-floor cascade of floor failures, which, combined with the failure of two other girder connections — also due to thermal expansion — left a key column unsupported over nine stories, causing it to buckle.
FIG. 5: High-velocity bursts of debris, or “squibs,” were ejected from point-like sources in WTC 1 and WTC 2, as many as 20 to 30 stories below the collapse front (Source: Noah K. Murray).
This single column failure allegedly precipitated the collapse of the entire interior structure, leaving the exterior unsupported as a hollow shell. The exterior columns then allegedly buckled over a two-second period and the entire exterior fell simultaneously as a unit [3].
NIST was able to arrive at this scenario only by omitting or misrepresenting critical structural features in its computer modelling [4]. Correcting just one of these errors renders NIST’s collapse initiation indisputably impossible. Yet even with errors that were favorable to its predetermined conclusion, NIST’s computer model (see Fig. 3) fails to replicate the observed collapse, instead showing large deformations to the exterior that are not observed in the videos and showing no period of free fall. Also, the model terminates, without explanation, less than two seconds into the seven-second collapse.
Unfortunately, NIST’s computer modelling cannot be independently verified because NIST has refused to release a large portion of its modelling data on the basis that doing so “might jeopardize public safety.”
The case of the Twin Towers
Whereas NIST did attempt to analyze and model the collapse of WTC 7, it did not do so in the case of the Twin Towers. In NIST’s own words,
“The focus of the investigation was on the sequence of events from the instant of aircraft impact to the initiation of collapse for each tower…. This sequence is referred to as the ‘probable collapse sequence,’ although it includes little analysis of the structural behaviour of the tower after the conditions for collapse initiation were reached and collapse became inevitable” [5].
Thus, the definitive report on the collapse of the Twin Towers contains no analysis of why the lower sections failed to arrest or even slow the descent of the upper sections, which NIST acknowledges “came down essentially in free fall” [5-6], nor does it explain the various other phenomena observed during the collapses.
When a group of petitioners filed a formal Request for Correction asking NIST to perform such analysis, NIST replied that it was
unable to provide a full explanation of the total collapse
because
the computer models [were] not able to converge on a solution.
However, NIST did do one thing in an attempt to substantiate its assertion that the lower floors would not be able to arrest or slow the descent of the upper sections in a gravity-driven collapse. On page 323 of NCSTAR 1-6, NIST cited a paper published in January 2002 by civil engineering professor Zdeněk Bažant and his graduate student Yong Zhou [7], which, according to NIST, “addressed the question of why a total collapse occurred” (as if that question were naturally outside the scope of its own investigation).
FIG. 6: Molten metal was seen pouring out of WTC 2 continuously for the seven minutes leading up to its collapse (Sources: WABC-TV, NIST).
In their paper, Bažant and Zhou claimed there would have been a powerful jolt when the falling upper section impacted the lower section, causing an amplified load sufficient to initiate buckling in the columns. They also claimed that the gravitational energy would have been 8.4 times the energy dissipation capacity of the columns during buckling.
In the years since, researchers have measured the descent of WTC 1’s upper section and found that it never decelerated; i.e., there was no powerful jolt [8-9]. Researchers have also criticized Bažant’s use of free-fall acceleration through the first story of the collapse, when measurements show the actual acceleration was roughly half that of gravity [2]. After falling for one story, the measurements show a velocity of 6.1 m/s instead of the 8.5 m/s that free fall would produce. Because kinetic energy is a function of the square of the velocity, this difference alone means the free-fall assumption roughly doubles the computed kinetic energy.
Researchers have further demonstrated that the 58 × 10⁶ kg mass Bažant used for the upper section was the maximum design load, not the actual 33 × 10⁶ kg service load [10]. Together, these two errors embellished the kinetic energy of the falling mass by a factor of 3.4. It has also been shown that the column energy dissipation capacity used by Bažant was at least 3 times too low [2].
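The combined factor of 3.4 follows directly from the numbers just quoted; a minimal sketch (the ~3.7 m story height used to check the free-fall velocity is an assumption, not stated in the text):

```python
import math

# Reproduces the kinetic-energy arithmetic quoted above. The story
# height of 3.7 m is an ASSUMED value, used only to check the 8.5 m/s
# free-fall figure; all other numbers are from the text.

g = 9.81
story_height = 3.7                              # m (assumed)
v_free_fall = math.sqrt(2 * g * story_height)   # ~8.5 m/s, Bazant's assumption
v_measured  = 6.1                               # m/s, measured after one story
m_design    = 58e6                              # kg, maximum design load used by Bazant
m_service   = 33e6                              # kg, actual service load

velocity_factor = (v_free_fall / v_measured) ** 2   # KE scales with v^2
mass_factor     = m_design / m_service

print(f"Velocity assumption overstates KE by x{velocity_factor:.2f}")   # ~1.9
print(f"Mass assumption overstates KE by x{mass_factor:.2f}")           # ~1.8
print(f"Combined overstatement: x{velocity_factor * mass_factor:.1f}")  # ~3.4
```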
In January 2011, Bažant and another of his graduate students, Jia-Liang Le, attempted to dismiss the lack-of-deceleration criticism by claiming there would be a velocity loss of only about 3%, too small to be resolved at the cameras’ resolution [11]. Le and Bažant also claimed the conservation-of-momentum velocity loss would be only 1.1%. However, it appears that Le and Bažant erroneously used an upper-section mass of 54.18 × 10⁶ kg and an impacted-floor mass of just 0.627 × 10⁶ kg, which contradicted the 3.87 × 10⁶ kg floor mass Bažant had used in earlier papers.
The former floor mass represents the concrete floor slab only, whereas the latter includes all the other materials on the floor. Correcting this alone increases the conservation-of-momentum velocity loss by more than 6 times, to a value of 7.1%. Additionally, the column energy dissipation has been shown to be far more significant than Bažant claimed. Researchers have since provided calculations showing that a natural collapse over one story would not only decelerate but would actually arrest after one or two stories of fall (see Fig. 4) [2, 10].
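The momentum bookkeeping behind those percentages can be sketched as follows. For a perfectly inelastic impact between the falling upper section (mass M, velocity v) and a stationary floor (mass m), conservation of momentum gives v′ = Mv/(M + m), a fractional velocity loss of m/(M + m), or approximately m/M when m is much smaller than M. Small differences from the quoted 1.1% and 7.1% come down to rounding and which denominator is used.

```python
# Conservation-of-momentum velocity loss for a perfectly inelastic
# impact between the upper section (M) and one floor (m). Masses are
# the figures quoted in the text.

M = 54.18e6  # kg, upper-section mass used by Le and Bazant

for m, label in [(0.627e6, "slab-only floor mass (Le & Bazant)"),
                 (3.87e6,  "full floor mass (Bazant's earlier papers)")]:
    loss_exact  = m / (M + m)   # exact fractional loss, m/(M + m)
    loss_approx = m / M         # first-order approximation
    print(f"{label}: {loss_exact:.1%} exact, ~{loss_approx:.1%} to first order")
```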
Other evidence unexplained
The collapse mechanics discussed above are only a fraction of the available evidence indicating that the airplane impacts and ensuing fires did not cause the collapse of the Twin Towers. Videos show that the upper section of each tower disintegrated within the first four seconds of collapse. After that point, not a single video shows the upper sections, which in the official scenario should have descended essentially intact all the way to the ground before being crushed.
Videos and photographs also show numerous high-velocity bursts of debris being ejected from point-like sources (see Fig. 5). NIST refers to these as “puffs of smoke” but fails to properly analyze them [6]. NIST also provides no explanation for the midair pulverization of most of the towers’ concrete, the near-total dismemberment of their steel frames, or the ejection of those materials up to 150 meters in all directions.
NIST sidesteps the well-documented presence of molten metal throughout the debris field and asserts that the orange molten metal seen pouring out of WTC 2 for the seven minutes before its collapse was aluminum from the aircraft combined with organic materials (see Fig. 6) [6].
Yet experiments have shown that molten aluminum, even when mixed with organic materials, has a silvery appearance — thus suggesting that the orange molten metal was instead emanating from a thermite reaction being used to weaken the structure [12]. Meanwhile, unreacted nano-thermitic material has since been discovered in multiple independent WTC dust samples [13].
As for eyewitness accounts, some 156 witnesses, including 135 first responders, have been documented as saying that they saw, heard, and/or felt explosions prior to and/or during the collapses [14]. That the Twin Towers were brought down with explosives appears to have been the initial prevailing view among most first responders. “I thought it was exploding, actually,” said John Coyle, a fire marshal. “Everyone I think at that point still thought these things were blown up” [15].
Conclusion
It bears repeating that fires have never caused the total collapse of a steel-framed high-rise before or since 9/11. Did we witness an unprecedented event three separate times on September 11, 2001? The NIST reports, which attempted to support that unlikely conclusion, fail to persuade a growing number of architects, engineers, and scientists. Instead, the evidence points overwhelmingly to the conclusion that all three buildings were destroyed by controlled demolition. Given the far-reaching implications, it is morally imperative that this hypothesis be the subject of a truly scientific and impartial investigation by responsible authorities.
ABOUT THE AUTHORS
Steven Jones is a former full professor of physics at Brigham Young University. His major research interests have been in the areas of fusion, solar energy, and archaeometry. He has authored or co-authored a number of papers documenting evidence of extremely high temperatures during the WTC destruction and evidence of unreacted nano-thermitic material in the WTC dust.
Robert Korol is a professor emeritus of civil engineering at McMaster University in Ontario, Canada, as well as a fellow of the Canadian Society for Civil Engineering and the Engineering Institute of Canada. His major research interests have been in the areas of structural mechanics and steel structures. More recently, he has undertaken experimental research into the post-buckling resistance of H-shaped steel columns and into the energy absorption associated with pulverization of concrete floors.
Anthony Szamboti is a mechanical design engineer with over 25 years of structural design experience in the aerospace and communications industries. Since 2006, he has authored or co-authored a number of technical papers on the WTC high-rise failures that are published in the Journal of 9/11 Studies and in the International Journal of Protective Structures.
Ted Walter is the director of strategy and development for Architects & Engineers for 9/11 Truth (AE911Truth), a nonprofit organization that today represents more than 2,500 architects and engineers. In 2015, he authored AE911Truth’s Beyond Misinformation: What Science Says About the Destruction of World Trade Center Buildings 1, 2, and 7. He holds a Master of Public Policy degree from the University of California, Berkeley.
References
[1] NIST: Analysis of Needs and Existing Capabilities for Full-Scale Fire Resistance Testing (October 2008).
[2] G. Szuladziński, A. Szamboti, and R. Johns, International Journal of Protective Structures 4, 117 (2013).
[3] NIST: Final Report on the Collapse of World Trade Center Building 7, Federal Building and Fire Safety Investigation of the World Trade Center Disaster (November 20, 2008).
[4] R. Brookman, A Discussion of ‘Analysis of Structural Response of WTC 7 to Fire and Sequential Failures Leading to Collapse,’ Journal of 9/11 Studies (October 2012).
[5] NIST: Final Report of the National Construction Safety Team on the Collapses of the World Trade Center Towers (December 1, 2005).
[6] NIST: Questions and Answers about the NIST WTC Towers Investigation (Updated September 19, 2011).
[7] Z. Bažant and Y. Zhou, Journal of Engineering Mechanics 128, 2 (2002).
[8] A. Szamboti and G. MacQueen, The Missing Jolt: A Simple Refutation of the NIST-Bažant Collapse Hypothesis, Journal of 9/11 Studies (April 2009).
[9] D. Chandler, The Destruction of the World Trade Center North Tower and Fundamental Physics, Journal of 9/11 Studies (February 2010).
[10] A. Szamboti and R. Johns, ASCE Journals Refuse to Correct Fraudulent Paper Published on WTC Collapses, Journal of 9/11 Studies (September 2014).
[11] J.-L. Le and Z. Bažant, Journal of Engineering Mechanics 137, 82 (2011).
[12] S. Jones, Why Indeed Did the WTC Buildings Collapse Completely? Journal of 9/11 Studies (September 2006).
[13] N. Harrit et al., Open Chemical Physics Journal (April 2009).
[14] G. MacQueen, Eyewitness Evidence of Explosions in the Twin Towers, Chapter Eight, The 9/11 Toronto Report, Editor: James Gourley (November 2012).
[15] Fire Department of New York (FDNY): World Trade Center Task Force Interviews, The New York Times (October 2001 to January 2002).
September 7, 2016 Posted by aletho | Deception, False Flag Terrorism, Science and Pseudo-Science, Timeless or most popular | 9/11, NIST, WTC-7 | Leave a comment