Scholz and Macron belong to ‘ash heap of history’ – Medvedev
German Chancellor Olaf Scholz and French President Emmanuel Macron should abandon politics after their respective parties suffered damaging setbacks in the European Parliament elections, former Russian President Dmitry Medvedev believes.
Scholz’s center-left Social Democratic Party (SPD) is projected to finish third in the key ballot, behind the center-right Christian Democrats and the right-wing Alternative for Germany (AfD).
Macron’s Besoin d’Europe coalition is expected to win less than half of the votes received by the right-wing National Rally party associated with Marine Le Pen, prompting the French president to call a snap parliamentary election after preliminary results emerged on Sunday.
In a social media post on Monday, Medvedev claimed the outcome proves that Scholz and Macron are “respected by no one.” The former Russian leader linked the poor performance at the ballot box with the “idiotic economic and migration policy” pursued by the two leaders and their support for Ukraine “at the cost of [their] own citizens.”
“Time to retire. To the ash heap of history!” said Medvedev, who currently serves as deputy chair of the Russian Security Council.
Vyacheslav Volodin, the speaker of the lower chamber of the Russian parliament, earlier called on Scholz and Macron to resign and to “stop victimizing the citizens of their states.”
Officials in Moscow have accused leaders of EU nations of betraying the interests of their populations in favor of US geopolitical goals. Responding to the Ukraine crisis in 2022, the bloc vowed to support Kiev militarily for “as long as it takes,” and imposed an array of economic sanctions against Russia. Most notably, Brussels has pushed EU countries to stop buying Russian natural gas.
Large consumers such as Germany have struggled to replace cheap Russian pipeline gas with other sources, including renewables and expensive liquefied natural gas. American LNG producers have since taken over a large share of the European market. A hike in energy prices has forced many energy-intensive businesses to either move out of the EU or shut down entirely.
The “Energy Transition” Won’t Happen
By Mark P. Mills | City Journal | May 23, 2024
Foundational innovation in cloud technology and artificial intelligence will require more energy than ever before—shattering any illusion that we will restrict supplies.
The laptop class has rediscovered a basic truth: foundational innovation, once adoption proceeds at scale, is followed by an epic increase in energy consumption. It’s an iron law of our universe.
To illustrate that law, consider three recent examples, all vectors leading to the “shocking” discovery of radical increases in expected electricity demand, now occupying headlines. First, there’s the electric car, which, if there were one in every garage, as enthusiasts hope, would roughly double residential neighborhood electricity demands. Next, there’s the idea of repatriating manufacturing, especially for semiconductors. This is arguably a “foundational innovation,” since policymakers are suddenly showing concern over the decades-long exit of such industries from the U.S. Restoring American manufacturing to, say, the global market share of just two decades ago would see industrial electricity demand soar by 50 percent.
And now the scions of software are discovering that both virtual reality and artificial intelligence, which emerge from the ineluctable mathematics of machine-learning algorithms, are anchored in the hard reality that everything uses energy. This is especially true for the blazing-fast and power-hungry chips that make AI possible. Nvidia, the leader of the AI-chip revolution and a Wall Street darling, has over the past three years alone shipped some 5 million high-power AI chips. To put this in perspective, every such AI chip uses roughly as much electricity each year as do three electric vehicles. And while the market appetite for electric vehicles is sagging and ultimately limited, the appetite for AI chips is explosive and essentially unlimited.
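As a rough back-of-envelope check on that chip-versus-EV comparison, here is a sketch built on assumed figures that are not from the article: roughly 700 W of continuous draw for a high-end AI accelerator, a datacenter cooling and power-delivery overhead factor of about 1.5, and a typical EV driving about 11,000 miles a year at roughly 0.3 kWh per mile.

```python
# Back-of-envelope comparison of one high-power AI chip with electric vehicles.
# All input figures are illustrative assumptions, not numbers from the article.
CHIP_POWER_KW = 0.7      # assumed continuous draw of a high-end AI accelerator
OVERHEAD = 1.5           # assumed datacenter cooling/power-delivery overhead factor
HOURS_PER_YEAR = 8_760

EV_MILES_PER_YEAR = 11_000   # assumed annual mileage of a typical EV
EV_KWH_PER_MILE = 0.3        # assumed EV consumption per mile

chip_kwh = CHIP_POWER_KW * HOURS_PER_YEAR * OVERHEAD   # ~9,200 kWh per chip per year
ev_kwh = EV_MILES_PER_YEAR * EV_KWH_PER_MILE           # ~3,300 kWh per EV per year

print(f"chip ~{chip_kwh:,.0f} kWh/yr, EV ~{ev_kwh:,.0f} kWh/yr, ratio ~{chip_kwh / ev_kwh:.1f}")
print(f"5 million such chips ~ {5_000_000 * chip_kwh / 1e9:.0f} TWh/yr")
```

Under those assumptions, one chip lands in the region of three EVs’ worth of electricity a year, and the installed base Mills cites adds up to tens of terawatt-hours annually.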
Consider a recent headline in the Wall Street Journal: “Big Tech’s Latest Obsession Is Finding Enough Energy”—because the “AI boom is fueling an insatiable appetite for electricity.” And, as Reuters reports, “U.S. electric utilities predict a tidal wave of new demand . . . . Nine of the top 10 U.S. electric utilities said data centers were a main source of customer growth.” Today’s forecasts see near-term growth in demand for electric power three times as great as in recent years. Rediscovery of the iron law of growth inspired an urgent Senate hearing on May 21 entitled “Opportunities, Risks, and Challenges Associated with Growth in Demand for Electric Power in the United States.” (Full disclosure: a hearing at which I testified.)
Data centers, the information “powerplants” at the center of the cloud revolution, are flagged as the primary culprit for this exploding power demand. These warehouse-scale buildings are chock-full of all manner of computer chips, including conventional processors, memory chips, and communications chips. And now datacenters are pouring AI chips into the mix as fast as manufacturing plants can build them. As one researcher notes, adding AI to Google “search” boosts the energy use per search tenfold. And that’s only the first, perhaps the least, significant of the many possible applications for AI.
As one senior operative at Friends of the Earth recently put it: “We can see AI fracturing the information ecosystem just as we need it to pull it back together.” The fracturing is not about AI and child safety, or deep fakes, or the looming threat of new regulations. It’s about aspirations for an “energy transition” in how the world is fueled. It is inconvenient, to put it mildly, to see demand for electricity—especially reliable, 24–7 supply—take off at the same time as regulators are forcing utilities to shut down conventional power plants and spend money on costlier and less reliable power from wind and solar hardware. The epiphany that transition aspirations and the power realities of AI are in conflict was epitomized in a recent New Yorker essay titled, “The Obscene Energy Demands of A.I.” The article’s subtitle asks: “How can the world reach net zero if it keeps inventing new ways to consume energy?” The question answers itself.
The challenge is not only the need for far more electricity than forecast a mere year or so ago but also the need for it to be both inexpensive and available precisely when needed—and soon. New factories and new datacenters are coming online rapidly, with many more due in a few years, not decades. There aren’t many ways to meet the velocity and scale of the coming demand for electricity without a boom in building more natural-gas-fired power plants.
This seemingly sudden change in the electricity landscape was predictable—and predicted. Almost exactly 25 years ago, my long-time colleague Peter Huber and I published articles in both Forbes and the Wall Street Journal pointing to the realities at the intersection of energy and information. (A decade ago, I also published a study on the matter, which, it turns out, accurately forecast electric demands from data, and I more recently expanded on that theme in my book The Cloud Revolution.) At the time, we were nearly alone in making such observations in the public-policy space, but we were far from alone in the technical community, which has long recognized the power realities of information. Indeed, in the engineering community, the convention for talking about the size of datacenters is in terms of megawatts, not square feet.
There’s a full-on race in the tech industry, and in tech-centric investment communities, to spend billions of dollars on new AI-infused infrastructures. The furious pace of expanding manufacturing to produce AI-capable silicon chips and simultaneously building massive, AI-infused datacenters is shattering the illusion that a digital economy enables a decoupling of economic growth from rising energy use.
As recently as two years ago, an analysis from the OECD (an organization in the vanguard of the “energy transition” vision) concluded: “Digital transformation is increasingly recognised as a means to help unlock the benefits of more inclusive and sustainable growth and enhanced social well-being. In the environmental context, digitalisation can contribute to decoupling economic activity from natural resource use and their environmental impacts.” It turns out that the physics of power and information neutered that aspiration.
Now the key question for policymakers and investors is whether the current state of affairs is a bubble or signals a more fundamental shift. Just how much more power will information consume? It is now conventional wisdom that the digital economy is vital for economic growth and that information supremacy matters both for economies and for militaries. But the core feature of an information-centric economy lies in the manufacturing and operation of digital hardware—and, unavoidably, in the energy implications of both.
To see what the future holds, we must take a deep dive into the arcana of today’s “cloud,” the loosely defined term denoting the constellation of data centers, hardware, and communications systems.
Each datacenter—and tens of thousands of them exist—has an energy appetite often greater than that of a skyscraper the size of the Empire State Building. And the nearly 1,000 so-called hyperscale datacenters each consume more energy than a steel mill (and this is before counting the impacts of piling on AI chips). The incredible level of power use derives directly from the fact that just ten square feet of a datacenter today has more computing horsepower than all the world’s computers circa 1980. And each square foot creates electric power demands 100 times greater than a square foot of a skyscraper. Even before the AI revolution, the world was adding tens of millions more square feet of datacenters each year.
All that silicon horsepower is connected to markets on an information highway, a network whose scale vastly exceeds that of any of its asphalt and concrete analogues. The universe of communications hardware transports bytes not only along “highways” comprising about 3 billion miles of glass cables but also along the equivalent of another 100 billion miles (that’s 1,000 times the distance to the sun) of invisible connections forged by 4 million cell towers.
The physics of transporting information is captured in a surprising fact: the energy used to enable an hour of video is greater than the share of fuel consumed by a single person on a ten-mile bus ride. While a net energy-use reduction does occur when someone Zooms rather than commutes by car (the “dematerialization” trope), at the same time, there’s a net increase in energy use if Zoom is used to attend meetings that would never have occurred otherwise. When it comes to AI, most of what the future holds are activities that would never have occurred otherwise.
Thus, the nature of the cloud’s energy appetite is far different from that of many other infrastructures, especially compared with transportation. For transport, consumers see where 90 percent of energy gets spent when they fill up a gas tank or recharge a battery. When it comes to information, though, over 90 percent of energy use takes place remotely, hidden away until utilities “discover” the aggregate impact.
Today’s global cloud, which has yet to absorb fully the power demands of AI, has grown from nonexistent, several decades ago, to using twice as much electricity as Japan. And that estimate is based on the state of hardware and traffic of several years ago. Some analysts claim that, as digital traffic has soared in recent years, efficiency gains were muting or even flattening growth in datacenter energy use. But such claims face countervailing factual trends. Since 2016, there’s been a dramatic acceleration in datacenter spending on hardware and buildings, along with a huge jump in the power density of that hardware—and again, all of this before the AI boom.
To guess what the future holds for the energy appetite of the cloud, one must know two things: first, the rate at which efficiency improves for digital hardware in general, especially for AI chips; second, the rate of growth in demand for data itself.
The past century of modern computing and communications shows that demand for data has grown far faster than engineers can improve efficiency. There’s no evidence to suggest this trend will change. In fact, today’s information-system energy use is the result of astounding gains in computing energy-efficiency. At the energy-efficiency of computing circa 1984, a single iPhone would use as much power as a skyscraper. If that were the case, there would be no smartphones today. Instead, we have billions of them. The same patterns hold across the entire silicon landscape, including for AI. Chip efficiencies for AI are improving at a blistering pace. Nvidia’s latest chip is 30-fold faster for the same power appetite. That won’t save energy—it will accelerate the market’s appetite for such chips at least 100-fold. Such is the nature of information systems. And the continued and dramatic improvement in AI chip efficiencies is built into the assumptions of all the industry-insider forecasts of ballooning overall energy use for AI.
But this raises the fundamental question: Just how much demand is there for data, the “fuel” that makes AI possible? We are on the precipice of an unprecedented expansion in both the variety and scale of data yet to be created, stored, and subsequently refined into useful products and services. As a practical matter, information is an infinite resource.
If it feels as though we’ve reached a kind of apotheosis in all things digital, the truth is otherwise: we are still in the early days. As an economic resource, data are unlike natural analogues—because humanity literally creates data. And the technological means for generating that resource are expanding in scale and precision. It’s one of those rare times when rhetorical hyperbole understates the reality.
The great explosion of data production will come from our expanding capacity to observe and measure the operation and activities of both our built environment and our natural environment, amplified by the increasing automation of all kinds of hardware and systems. Automation requires sensors, software, and control systems that necessarily generate massive data streams. Long before we see the autonomous car, for example, the “connected” car, with all its attendant features and safety systems, is already generating massive data flows.
Similarly, we’re seeing radical advances in our capacity to sense and measure all the features of our natural environment, including our own bodies. Scientists now collect information at astronomical scales, not only in the study of astronomy itself but also in the biological world, with new instruments that generate more data per experiment than was carried on the entire Internet a few decades ago.
All trends face eventual saturation. But humanity is a very long way away from peak information supply. Information, in effect, is the only limitless resource.
One way to guess the future magnitude of data traffic—and, derivatively, the energy implications—is to look at the names of the numbers we’ve had to create to describe quantities of data. We count food and mineral production in millions of tons; people and their devices in billions of units; airway and highway usage in trillions of air- or road-miles; electricity and natural gas in trillions of kilowatt-hours or cubic feet; and our economies in trillions of dollars. But, at a rate of a trillion per year of anything, it takes a billion years to total one “zetta”—i.e., the name of the number that describes the scale of today’s digital traffic.
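A quick sanity check on that prefix arithmetic, as a minimal sketch whose only inputs are the powers of ten behind the prefixes themselves:

```python
# Sanity-check the claim that a trillion of anything per year takes a billion
# years to reach one "zetta" (10^21).
trillion_per_year = 10**12   # "a trillion per year of anything"
one_zetta = 10**21           # zetta = 10^21

years_needed = one_zetta // trillion_per_year
print(years_needed)          # 1_000_000_000 -> a billion years, as stated above
```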
The numerical prefixes created to describe huge quantities track the progress of society’s technologies and needs. The “kilo” prefix dates back to 1795. The “mega” prefix was coined in 1873, to name 1,000 kilos. The “giga” prefix for 1 billion (1,000 million) and “tera” (a trillion, or 1,000 billion) were both adopted in 1960. In 1975, we saw the official creation of the prefixes “peta” (1,000 tera) and “exa” (1,000 peta), and then the “zetta” (1,000 exa) in 1991. Today’s cloud traffic is estimated to be roughly 50 zettabytes a year.
It’s impossible to visualize such a number without context. A zetta-stack of dollar bills would reach from the earth to the sun (93 million miles away) and back—700,000 times. All the molecules that comprise the Earth’s atmosphere weigh about five zettagrams. Even if each byte entails an infinitesimal amount of energy, the sheer volume of zettabyte-scale operations leads to consequential energy use.
Until just over a year ago, there was only one remaining official prefix name for a number bigger than a zetta: the 1,000-times-bigger “yotta.” Given the AI-accelerated pace of data expansion, we’ll soon be in the yottabyte era. So now the bureaucrats at the Paris-based International Bureau of Weights and Measures have officially given names to even bigger numbers, because before long, data traffic will blow past the yottabyte scale. One thousand yottabytes? That’s a ronnabyte. Your children will be using such numbers.
Such astonishing volumes of data being processed and moved will overwhelm the gains in energy efficiency that engineers will inevitably achieve. Already today, more capital is spent globally on expanding the energy-consuming cloud each year than all the world’s electric utilities combined spend to produce more electricity.
Credit Andreessen Horowitz’s “Techno-Optimist Manifesto” for observing that “energy is the foundational engine of our civilization. The more energy we have, the more people we can have, and the better everyone’s lives can be.” Our cloud-centric and AI-infused twenty-first-century infrastructure illustrates this fundamental point. The world will need all forms of energy production imaginable. An “energy transition” would only restrict energy supplies—and that’s not going to happen. The good news is that the U.S. does have the technical and resource capacity to supply the energy needed. The only question is whether we have the political will to allow the proverbial “all of the above” energy solutions to happen.
Mark P. Mills is a contributing editor of City Journal, executive director of the National Center on Energy Analytics, a strategic partner in the energy fund Montrose Lane, and author of The Cloud Revolution: How the Convergence of New Technologies Will Unleash the Next Economic Boom and a Roaring 2020s.
Copyright © 2024 Manhattan Institute for Policy Research, Inc. All rights reserved.
The amount of copper needed to build EVs is ‘impossible for mining companies to produce’
By Tanya Weaver | Engineering & Technology | May 16, 2024
Copper cannot be mined quickly enough to keep up with current policies requiring the transition to electric vehicles (EVs), according to a University of Michigan study.
Copper is fundamental to electricity generation, distribution and storage. According to GlobalData, there are more than 709 copper mines in operation globally, with the largest being the Escondida mine in Chile, which produced an estimated 882,100 tonnes of copper in 2023.
This may sound like a lot, but with electrification ramping up globally, it is not. The Michigan study, Copper mining and vehicle electrification, focuses on the copper required just for the production of EVs over the coming years.
Many countries across the world are putting forward policies for EVs. For instance, in the US the Inflation Reduction Act, signed into law in 2022, calls for 100% of cars manufactured by 2035 to be electric.
However, an EV requires three to five times more copper than a petrol or diesel car, not to mention the copper required for upgrades to the electricity grid.
“A normal Honda Accord needs about 40 pounds of copper. The same battery electric Honda Accord needs almost 200 pounds of copper,” said Adam Simon, professor of earth and environmental studies at the University of Michigan.
“We show in the paper that the amount of copper needed is essentially impossible for mining companies to produce.”
The researchers examined 120 years of global copper-production data dating back to 1900. They then modelled how much copper is likely to be produced for the rest of the century, and how much copper the US electricity infrastructure and vehicle fleet would need in order to upgrade to renewable energy.
The study found that renewable energy’s copper needs would outstrip what copper mines can produce at the current rate. Between 2018 and 2050, the world will need to mine 115% more copper than has been mined in all of human history up until 2018 just to meet current copper needs without considering the green energy transition.
To meet the copper needs of electrifying the global vehicle fleet, as many as six new large copper mines must be brought online annually over the next several decades. About 40% of the production from new mines will be required for EV-related grid upgrades.

The research concluded that, instead of fully electrifying the entire US fleet of vehicles, manufacturers should focus on hybrid vehicles.
“We know, for example, that a Toyota Prius actually has a slightly better impact on climate than a Tesla. Instead of producing 20 million EVs in the US and, globally, 100 million battery EVs each year, would it be more feasible to focus on building 20 million hybrid vehicles?” Simon asked.
Apart from EVs, copper is, of course, vital in other sectors: for instance, building infrastructure in the developing world such as an electricity grid for the approximately one billion people who don’t yet have access to electricity.
“What we will end up with is tension between how much copper we need to build infrastructure in less developed countries versus how much copper we need for the energy transition,” warned Simon.
“We are hoping this study gets picked up by policymakers who should consider copper as the limiting factor for the energy transition, and to think about how copper is allocated.”
© 2024 The Institution of Engineering and Technology
Labour’s energy claims are ‘divorced from reality’
Net Zero Watch | May 31, 2024
The Labour Party is saying that its energy policies – a rapid decarbonisation of the electricity system – will save consumers money. The claim is apparently based on an October 2023 report by Ember,[1] which says that a decarbonised electricity system can reduce bills by £300 per household.
However, the report also says[2] that the authors are assuming that windfarms in the future will secure ‘the same price as [Contracts for Difference] auction round 4’. The price achieved in Round 4 (£37.50/MWh) is around half the price (£73/MWh) currently on offer to offshore windfarms in Round 6.[3] And industry insiders are suggesting that even the latter figure may be inadequate.[4]
In other words, Labour’s claimed savings rely on assuming that wind power costs half of what it actually does.
A second problem with Labour’s putative savings figure is that Ember’s report compares bills in their hypothetical decarbonised electricity system against bills in the third quarter of 2023, which were still inflated by the Ukraine war.
Net Zero Watch director Andrew Montford said:
Labour’s claim of a reduction in household bills is based on figures that are entirely divorced from today’s reality.
Mr Montford continued by calling for a new, reality-based debate on Net Zero:
When it comes to energy policy, the political establishment is operating in a fact-free void. For the sake of the country, they need to start asking very hard questions about what they are being told by civil servants and environmental activists like Ember.
Notes
2. Page 20.
3. All values are in 2012 prices, as is standard practice when discussing CfDs. In current prices, AR4 is worth £47/MWh, and AR6 is offering around £102/MWh.
The Carbon Capture Con
By Viv Forbes | Master Resource | May 17, 2024
Carbon capture and underground storage (CCUS) tops the list of silly schemes “to reduce man-made global warming.” The idea is to capture exhaust gases from power stations or cement plants, separate the CO2 from the other gases, compress it, pump it to the chosen burial site and force it underground into permeable rock formations. Then hope it never escapes.
An Australian mining company that should know better is hoping to appease green critics by proposing to bury the gas of life, CO2, deep in the sedimentary rocks of Australia’s Great Artesian Basin.
They have chosen the Precipice Sandstone for their carbon cemetery. However, the chances of keeping CO2 gas confined in this porous sandstone are remote. The formation has a very large area of outcrop at the surface and gas will escape somewhere, so why bother forcing it into a jail with no roof?
Glencore shareholders should rise in anger at this wasteful and futile pagan sacrifice to the global warming gods. It will join fiascos like Snowy 2.0, pink batts and SunCable (a dream to send solar energy generated in the Northern Territory more than 5,000 km by overhead and undersea cable, across ocean deeps and volcanic belts, to Singapore).
Engineers with buckets of easy money may base a whole career on Carbon Capture and Underground Storage. But only stupid green zealots would support the sacrifice of billions of investment dollars and scads of energy to bury this harmless, invisible, life-supporting gas in the hope of appeasing the high priests of global warming.
The quantities of gases that CCUS would need to handle are enormous, and the capital and operating costs will be horrendous. It is a dreadful waste of energy and resources, consuming about twenty percent of power delivered from an otherwise efficient coal-fired power station.
For every tonne of coal burnt in a power station, about 11 tonnes of gases are exhausted – 7.5 tonnes of nitrogen from the air used to burn the coal, plus 2.5 tonnes of CO2 and one tonne of water vapour from the coal combustion process.
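A rough mass balance makes those figures plausible. The sketch below uses assumed, illustrative coal properties (roughly 68 per cent carbon, 4.5 per cent hydrogen and 8 per cent moisture by mass, none of which come from the article) and standard combustion chemistry:

```python
# Rough mass balance for burning one tonne of coal.
# The composition figures are illustrative assumptions, not from the article.
C_FRACTION = 0.68    # assumed carbon content by mass
H_FRACTION = 0.045   # assumed hydrogen content by mass
MOISTURE   = 0.08    # assumed moisture content by mass

coal_tonnes = 1.0

co2 = coal_tonnes * C_FRACTION * 44 / 12                            # C + O2 -> CO2 (44 g per 12 g of C)
h2o = coal_tonnes * H_FRACTION * 18 / 2 + coal_tonnes * MOISTURE    # 2H -> H2O, plus coal moisture
o2_needed = coal_tonnes * C_FRACTION * 32 / 12 + coal_tonnes * H_FRACTION * 16 / 2
n2 = o2_needed * 76.7 / 23.3                                        # nitrogen carried in with the combustion air

total = co2 + h2o + n2
print(f"CO2 ~{co2:.1f} t, H2O ~{h2o:.1f} t, N2 ~{n2:.1f} t, exhaust total ~{total:.0f} t")
# Prints roughly 2.5 t CO2, ~0.5 t H2O and ~7 t N2, i.e. on the order of the
# ~11 tonnes of exhaust quoted above (exact values depend on the coal assumed).
```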
Normally these beneficial atmospheric gases are released to the atmosphere after filters take out any nasties like soot and noxious fumes.
However, CCUS also requires energy to produce and fabricate steel and erect gas storages, pumps and pipelines and to drill disposal wells. This will chew up more coal resources and produce yet more carbon dioxide, for zero benefit.
But the real problems are at the burial site – how to create a secure space to hold the CO2 gas. There is no vacuum occurring naturally anywhere on earth – every bit of space on Earth is occupied by something – solids, liquids or gases. Underground disposal of CO2 requires it to be pumped AGAINST the pressure of whatever fills the pore space of the rock formation now – either natural gases or liquids. These pressures can be substantial, especially after more gas is pumped in.
The natural gases in sedimentary rock formations are commonly air, CO2, CH4 (methane) or rarely, H2S (rotten egg gas). The liquids are commonly salty water, sometimes fresh water or very rarely, liquid hydrocarbons.
Pumping out air is costly; pumping out natural CO2 to make room for man-made CO2 is pointless; and releasing rotten egg gas or salty water on the surface would create a real problem, unlike the imaginary threat from CO2.
In some cases, CCUS may require the removal of fresh water to make space for CO2. Producing fresh water on the surface would be seen as a boon by most locals. Pumping out salt water to make space to bury CO2 would create more problems than it could solve.
Naturally, some carbon dioxide buried under pressure will dissolve in groundwater and carbonate it, so that the next water driller in the area could get a real bonus – bubbling Perrier Water on tap, worth more than oil.
Then there is the dangerous risk of a surface outburst or leakage from a pressurised underground reservoir of CO2. The atmosphere contains 0.04% CO2, which is beneficial for all life. But the gas in a CCUS reservoir would be more than 90% of this heavier-than-air gas – a lethal, suffocating concentration for nearby animal life if it escaped in a gas outburst.
Pumping gases underground is only sensible if it brings real benefits, such as using waste gases to increase oil recovery from declining oil fields – frack the strata, pump in CO2, and force out oil and gas. To find a place where you could drive out natural hydrocarbons in order to make space to bury CO2 would be like winning the lottery – a profitable but unlikely event.
Normally, however, CCUS will be futile, as the oceans will largely undo whatever man tries to do with CO2 in the atmosphere. Oceans contain vastly more CO2 than the thin, puny atmosphere, and oceans maintain an equilibrium between CO2 in the atmosphere and CO2 dissolved in the oceans. If man releases CO2 into the atmosphere, the oceans will quickly absorb much of it. And if by some fluke man reduced the CO2 in the atmosphere, CO2 would bubble out of the oceans to replace much of it. Or just one decent volcanic explosion could negate the whole CCUS exercise.
Increased CO2 in the atmosphere encourages all plants to grow better and use more CO2. Unfortunately natural processes are continually sequestering huge tonnages of CO2 into extensive deposits of shale, coal, limestone, dolomite and magnesite – this process has driven atmospheric CO2 to dangerously low concentrations. Burning hydrocarbons and making cement returns a tiny bit of this plant food from the lithosphere to the biosphere.
Regulating atmospheric carbon dioxide is best left to the oceans and plants – they have been doing it successfully for millennia.
The only certain outcome from CCUS is more expensive electricity and a waste of energy resources on all the separation, compressing and pumping. Unscrupulous coal industry leaders love the idea of selling more coal to produce the same amount of electricity, and electricity generators would welcome an increased demand for power. And green zealots in the USA plan to force all coal and gas plants to bury all the CO2 plant food that they generate. Consumers and taxpayers are the suckers.
Naturally the Greens love the idea of making coal and gas-fired electricity more expensive. They conveniently ignore the fact that CCUS is anti-life – it steals plant food from the biosphere.
Global Warming has never been a threat to life on Earth – Ice is the killer. Glencore directors supporting this CCUS stupidity should be condemned for destructive ignorance.
————-
Geologist Viv Forbes is the founder of the Carbon Sense Coalition.
Andrew Bridgen and an Idiot!
Climate Realism by Paul Burgess | May 21, 2024
Is it worth even asking our idiot ministers any questions in Parliament?
Watch this factual question and truly stupid answer.
Net Zero Watch calls for UK to follow Dutch example
Net Zero Watch | May 20, 2024
Net Zero Watch is calling on UK ministers to follow the example of the Dutch government, which has announced the scrapping of cornerstone climate policies such as mandatory heat pump targets and the compulsory purchase of farmland.
The reversal is part of a populist backlash against environmentalist policies that has so far been more pronounced in parts of continental Europe than in the UK.
The desire of Britain’s politicians to ‘lead the world’ in the fight against climate change has made the country an early adopter of ‘ambitious’ climate targets, without thinking through their implications. Theresa May’s decision to introduce a legally-binding Net Zero target was debated for just 90 minutes in the House of Commons, but it was a decision that was followed by many other countries.
The Dutch experience shows that voters do not appreciate being on the receiving end of inflexible, compulsory policies that hit the poorest hardest. The delay of the 2030 ban on petrol and diesel cars to 2035, and the one-year delay of the Clean Heat Market Mechanism, show that the Government has at least woken up to the risk it faces. But it will need to go much further to protect consumers.
Harry Wilkinson, head of policy at Net Zero Watch, said:
What has happened in the Netherlands is likely to be replicated across Europe. We have heard some encouraging language from Claire Coutinho, but she needs to go further to avoid a backlash.
I’ve never been against heat pumps, but it is absurd to mandate their use when they will be inappropriate in many homes. Green technologies must stand on their own merits, or the public will be left poorer.
Brussels should remember that Europeans are sovereign, not the EU treaties or the Eurocrats
BY GRZEGORZ ADAMCZYK | REMIX NEWS | MAY 17, 2024
In an interview with Tygodnik Solidarność weekly, Prof. Ryszard Piotrowski, a constitutional lawyer from the University of Warsaw, expressed his concerns over the legal challenges posed by the implementation of the European Green Deal. He believes that these challenges threaten the legal identity and autonomy of both the European Union and Poland.
The professor emphasized the foundational role of dialogue in Polish law, noting that European laws are becoming increasingly incomprehensible and detached from the real needs of European citizens.
He argued that the perception of Europeans as subordinates to the European Parliament and the European Council, rather than as sovereigns over the treaties, poses a significant threat. “The sooner we understand that we, as Europeans, are not servants to the treaties and the European Parliament, the better it will be for Europe,” he stated.
He also questioned whether the Green Deal’s objectives align with the Polish constitution, which mandates environmental protection guided by the principle of sustainable development. According to him, the current shape of EU climate policy contravenes this principle by jeopardizing overall economic growth and thereby the security of citizens.
Piotrowski additionally highlighted that the Green Deal threatens essential social rights guaranteed by the Polish constitution, such as housing, energy, and communication security.
“We have a right to energy security, and its violation threatens democracy itself because democracy without a socio-economic dimension is devoid of meaning,” said the professor. Furthermore, he noted that the Green Deal also threatens the principle of subsidiarity, which aims to empower citizens and their communities.
Adding to the urgency of his concerns, Professor Piotrowski pointed out that the implementation of the Green Deal might weaken Poland’s defensive capabilities at a time when a military conflict looms near its eastern border. He criticized Europe’s stance on the conflict in Ukraine, arguing that despite European treaties pledging to promote peace, the current approach could lead to tragic consequences for Europe.
“Contrary to what European treaties stipulate and what they commit Europe to, it has chosen to speak of war instead of striving for peace. Such actions have always ended tragically for Europe,” the professor warned.
Green Blob Tells Government to Spend £30 Billion on Machine to Remove CO2 From the Air

BY BEN PILE | THE DAILY SCEPTIC | MAY 5, 2024
A story in the Telegraph last week featured a report by Energy Systems Catapult (ESC) which recommended the Government commit to a £30 billion project to pull CO2 from the air. According to the report, Direct Air Carbon Capture and Storage (DACCS) machines sited across the east coast could separate the greenhouse gas from air and pump it to underground storage facilities, thereby helping the U.K. to meet its ambitious 2050 Net Zero target. Not only is this extraordinarily expensive idea pointless in itself, it exposes the equally pointless and expensive constellation of publicly-funded lobbying organisations.
According to ESC, “carbon capture in its various forms is a critical component of a low-cost energy transition”, and “without it, at scale, we risk non-compliance with our Net Zero requirement”. And here is the thing that would, were such things subject to public debate, cause millions of people to scratch their heads. So what if the U.K. does not comply with its Government’s self-imposed target? What is the ‘risk’? And why should the public fork out billions of pounds merely for a daft machine that serves no function other than to help the Government achieve an ambition that nobody else really cares about?
Madder still, the ESC admits that DACCS “remains unproven at scale”. This raises two important problems.
First, if something has yet to be proven at such a gigantic scale, any estimate of its cost is for the birds and, like the estimates for other Government-backed projects such as HS2 and wind power, will in all probability be exceeded. Government vanity project HS2, for example, originally had a similar estimated cost of £37.5 billion in 2009 prices. But by 2020, estimates put the cost well north of £100 billion.
Second, it shows yet again that no government, no political party, no MP or peer, no think tank or its wonks, no academic at a lofty research outfit, no green lobbyist or campaigner, and no journalist has any idea how Net Zero will be achieved, but nonetheless nearly all of them fought for such targets to be imposed on us.
It is a problem known as putting the cart before the horse. And it is a characteristic of all climate-related policies that they are driven by ambition, not reality. Not even ESC can explain what DACCS is, how it will work or how much it will cost. All they really know is that it will be required to remove 48 million tonnes of CO2 from the air each year from 2050 – approximately a tenth of the U.K.’s current domestic annual emissions.
Vanity and intransigence drive this irrational push for solutions to non-problems. Air capture of CO2 serves no useful purpose whatsoever. It won’t make a dent in atmospheric CO2 concentration. It won’t change the weather. It won’t make anyone’s life better. And it won’t stand up to any meaningful cost-benefit analysis. £30 billion, roughly equivalent to £500 per head of the population, could do vastly more good were it to be spent in countless other ways, from healthcare through to addressing genuine environmental issues such as water quality. Of course, not spending the money on such contraptions would likely do more good still, by leaving that much money in people’s pockets to spend how they see fit.
The Telegraph spots the problem. DACCS plants “would need to be powered by wind, nuclear or solar energy so as not to generate as much CO2 as they save”. A fleet of green generators would be working to power the DACCS plants, merely to hit targets. Recent studies show that existing DACCS technology is extremely inefficient, requiring a whopping 2,500 kilowatt hours to isolate just one tonne of CO2. To extract 48 million tonnes of CO2 would therefore require power stations with a capacity of 14 gigawatts – that’s more than four times the capacity of Hinkley Point C. That nuclear power station itself, dubbed at the time “the most expensive power station in the world”, was initially estimated to cost £26 billion but more recent estimates are putting the cost closer to £46 billion. Thus the cost of a widespread DACCS project – with batteries included – is likely to be in the order of seven times greater than ESC claims. And we have not yet even considered the operating cost.
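The power figure follows directly from the numbers quoted. A quick check, as a sketch: 2,500 kWh per tonne and 48 million tonnes per year are the article’s figures, while the 3.2 GW used for Hinkley Point C is its commonly cited capacity, not a number from the article.

```python
# Check the DACCS power arithmetic quoted above.
kwh_per_tonne = 2_500          # energy to isolate one tonne of CO2 (figure from the article)
tonnes_per_year = 48_000_000   # CO2 to be removed each year from 2050 (figure from the article)
hours_per_year = 8_760

annual_twh = kwh_per_tonne * tonnes_per_year / 1e9     # ~120 TWh per year
average_gw = annual_twh * 1e9 / hours_per_year / 1e6   # ~13.7 GW of round-the-clock supply

hinkley_gw = 3.2  # commonly cited Hinkley Point C capacity (assumption, not from the article)
print(f"~{annual_twh:.0f} TWh/yr, ~{average_gw:.1f} GW continuous, ~{average_gw / hinkley_gw:.1f}x Hinkley Point C")
```

That lands at roughly 14 GW of continuous supply, a bit more than four Hinkley Point Cs, matching the figures in the paragraph above.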
All this puts me in mind of those fun little clips of devices whose only function is to press a switch to turn themselves off. On YouTube, electronics hobbyists compete to build the most impressive ‘useless machine’. Here is one such contender.
But the problem of useless machinery goes far beyond the device itself. Not unlike white elephants such as wind turbines, Energy Systems Catapult is a strange outfit summoned up out of the blobbish technocracy required by the green agenda. ESC is part of an umbrella group of government-backed private companies called the Catapult Network, which itself seems to be part of Innovate U.K., which in turn is part of UK Research and Innovation – the successor public funding body to the erstwhile research councils. ESC and its sister organisations each benefit from millions of pounds of public funding, topped up by opaque philanthropic funding (i.e., green blob organisations), which, as ESC claims, allows them to “support Central and Devolved Governments with the evidence, insights and innovations to incentivise Net Zero action”.
The problem at its core is that publicly-funded organisations, though set up as ‘independent’ bodies run at arm’s length from Government, are nonetheless wholly committed to political agendas. Seemingly intended to ‘drive prosperity’ through R&D, such a constellation of opaque agencies is tantamount to the Government picking ‘winners’, who invariably turn out to be abject losers, at vast public expense. There are no consequences for such wonks spaffing hundreds of millions of pounds of taxpayers’ money on pilots that come to nought, or glossy reports that might just as well be case studies from Narnia. Criticism of ideas such as CO2 capture is excluded from academia and business because, even if any critics were not already disinclined to apply for roles within the network, and were then not rejected for their obvious hostility to the dominant political culture of such bullshit factories, their politically inconvenient work would soon be shelved.
In other words, the green agenda has produced a useless machine whose only function is to produce designs for useless machines. The parent idea of DACCS, Carbon Capture and Storage (CCS), in which CO2 is taken from power stations, compressed and then stuffed under the sea, was an idea that attracted attention following the Climate Change Act. But despite the government offering a billion pounds in funding competitions to prove the concept, the project failed and today remains economically unproven. The even crazier idea of pulling CO2 – which is still a trace gas at just 400 parts per million – from the air and then burying it underground faces a similar future. Meanwhile, the U.K.’s climate agenda will run on, as usual, built on extremely expensive pie-in-the-sky fantasies. Nobody has any idea how to achieve Net Zero without destroying ourselves.
Subscribe to Ben Pile’s The Net Zero Scandal Substack here.
Poles taking to the streets against EU Green Deal
By Olivier Bault | Remix News | May 9, 2024
On Friday, May 10, Poles will be taking to the streets in a protest organized by the legendary Solidarity trade union. Solidarity, which was the main dissident social movement against communism in Eastern Europe in the 1980s, is now demanding a referendum on the EU Green Deal. Its current leader, Piotr Duda, has even called the EU Green Deal a new “red plague,” in reference to communism.
The protest is supported by Law and Justice (PiS), the main opposition party in Poland, as well as by the other parties of its United Right coalition and by the Confederation, an alliance of Christian nationalists and libertarians to the right of the United Right. The trade union, however, holds “the whole political class” in Poland responsible for the EU’s climate policy, noting that it warned from the outset of the threats linked to that policy. That means it holds the United Right’s leaders responsible too, as the EU Green Deal was adopted during their eight years in power.

“The solutions implemented under the Green Deal in the future will translate into, among other things, increases in electricity and heating bills, new taxes on energy and fuel, a ban on heating with fossil fuels, as well as increases in food prices and the country’s food insecurity. NSZZ Solidarity has decided to loudly express its opposition to such policies,” Solidarity’s leaders wrote in a press release published in mid-March.
They also wrote:
“The Solidarity trade union, which won Poland’s freedom in the past and later used it many times for just causes, has again decided to reach for the highest form of direct democracy, which is a nationwide referendum in which citizens will be asked about the continuation of the implementation of the Green Deal. The referendum will be preceded by an information campaign. This will allow for a broad awareness-building public debate on the real effects of the EU’s climate policy so that every citizen of Poland will be able to express his or her opinion on the subject based on reliable knowledge. After all, EU policy should not be determined by officials in Brussels, but based on the consent of the citizens of member states.”
The May 10 protest will start at noon on Plac Zamkowy (Castle Square) in central Warsaw, where farmers are expected to turn up en masse, as they did on March 6, when a large farmer protest was brutally repressed by Donald Tusk’s left-liberal government.
However, it is not only farmers who are going to be very negatively affected by the EU Green Deal. As the Ordo Iuris legal think tank stresses in an EU-wide petition against the Green Deal that it has just launched, not only is European agriculture facing a catastrophe, but car drivers and homeowners will also have to pay a high price for plans dictated not by reason, based not on consultations, but driven by ideology.
We can still “Stop the Green Deal” in its current form, we remind people in our petition, because it rests on political decisions made by the heads of state and government in the European Council, which can later be translated into new EU law through the EU Council (where ministers of the EU-27 meet) and the European Parliament.
This is why we demand not only that there should be a referendum in Poland on the Green Deal, but that an EU summit should be convened to work through the demands of farmers and other actors from across Europe.
We should all bear in mind that, under the current plans, the production of food and many intermediate and industrial goods will not stop, but will only be transferred outside the European Union, where the EU’s absurd climate regulations do not apply. This will only make matters worse for our planet, and it will push millions of Europeans toward poverty and destroy the European Union’s economic competitiveness.
We encourage all citizens of EU countries to sign the petition against the EU Green Deal here.
