Aletho News

ΑΛΗΘΩΣ

A Little Arithmetic: The Costs Of A Solar-Powered Grid Without Fossil Fuel Back-up

By Francis Menton – Manhattan Contrarian – July 29, 2021

Yesterday’s post made the point that states or countries seeking to march toward 100% “renewable” electricity don’t seem to be able to get past about the 50% mark, no matter how many wind turbines and solar panels they build. The reason is that, in practical operation, due to what is called “intermittency,” no output is available from the solar and wind sources at many times of high demand; therefore, during those times, other sources must supply the juice. This practical problem is presented most starkly in California, where the “renewable” strategy is based almost entirely on solar panels, with only a very small wind component. Daily graphs published by the California Independent System Operator (CAISO) show a clear and obvious pattern, where the solar generation drops right to zero every evening just as the peak demand period kicks in from about 6 to 9 PM.

Commenter Sean thinks he has the answer: “Given the predictable daily power generation cycle of solar in sunny places like California and the predictable daily demand which peaks in the evening perhaps solar generators should be required to have electricity storage equivalent to the daily generation of their PV system.”

I thought it might be instructive to play out Sean’s idea to see just how much solar generation capacity and storage it would take to make a system out of just those two elements that would be sufficient to fulfill California’s current electricity requirements. Note: this is an exercise in arithmetic. It is not complicated arithmetic. There is nothing here that goes beyond what you learned in elementary school. On the other hand, few seem to be willing to undertake the effort to do these calculations, or to recognize the consequences.

We start with the current usage that must be supplied. Currently, the usage ranges between a low of around 30 GW and a high of around 40 GW over the course of a day. For purposes of this exercise, let’s assume an average usage of 35 GW. Multiply by 24, and we find as a rough estimate that the system must supply 840 GWH of electricity per day.

How much capacity of solar panels will we need to provide the 840 GWH? We’ll start with the very sunniest day of the year, June 21. California currently has about 14 GW of solar capacity. Go to those CAISO charts, and we find that on June 21, 2021, which apparently was a very sunny day, those 14 GW of solar panels produced at the rate of about 12 GW maximum from about 8 AM to 6 PM, about half that rate from 7-8 AM and 6-7 PM, and basically nothing the rest of the time. Optimistically, they produced about 140 GWH for the day (10 hrs x 12 GW plus 2 hrs x 6 GW plus a little more for the dawn and dusk hours). That means that to produce your 840 GWH of electricity on a sunny June 21, you will need 6 times the capacity of solar panels that you currently have, or 84 GW. When 7 PM comes, you’ll need enough energy in storage to get you through to the next morning at around 8 AM, when generation will again exceed usage. This is about 13-14 hrs at an average of 35 GW, or around 475 GWH of storage.
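As a check on the best-day arithmetic, here is a short Python sketch using the article's round numbers (the 8 GWH added for the dawn and dusk hours is an assumption chosen to make the day total the article's ~140 GWH):

```python
# Best-case day (June 21): article's round numbers for California.
avg_demand_gw = 35                            # average usage over the day
daily_need_gwh = avg_demand_gw * 24           # 840 GWH per day

existing_solar_gw = 14                        # installed solar capacity
# ~12 GW for 10 hrs plus ~6 GW for the 2 shoulder hours,
# plus an assumed ~8 GWH "a little more" at dawn and dusk
best_day_output_gwh = 10 * 12 + 2 * 6 + 8     # 140 GWH

scale = daily_need_gwh / best_day_output_gwh  # ~6x existing capacity
required_solar_gw = scale * existing_solar_gw

overnight_hours = 13.5                        # roughly 7 PM to 8 AM
storage_gwh = overnight_hours * avg_demand_gw

print(round(scale), round(required_solar_gw), storage_gwh)
# → 6 84 472.5  (the article rounds storage to ~475 GWH)
```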

That’s June 21, your best day of the year. Now let’s look at a bad day. For the past year, a good example would be December 24, 2020, which besides being one of the shortest days of the year, must also have been rather cloudy. Production from the existing 14 GW of solar capacity averaged only about 3 GW, and only from 9 AM to 3 PM. That’s 18 GWH in that window (3 GW x 6 hrs). Then there was about another 1 GWH produced from 8 to 9 AM, and another 1 GWH from 3 to 4 PM. About 20 GWH for the whole day. You need 840 GWH. If 14 GW of solar panels only produced 20 GWH for the day, you would have needed 588 GW of panels to produce your 840 GWH (840 ÷ 20 x 14 = 588). That 588 GW of solar panels is some 42 times your existing 14 GW of solar panels. And when those 588 GW of capacity stop producing anything at all around 4 PM, you are also going to need at least 16 hours’ worth of average usage in storage to get yourself to 8 AM the next morning. That would be around 560 GWH of storage.
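The same sketch for the worst day, again with the article's round numbers, and with the storage cost worked out at the article's assumed battery price of $200/KWH:

```python
# Worst-case day (Dec 24, 2020): article's round numbers.
daily_need_gwh = 840
existing_solar_gw = 14

# ~3 GW from 9 AM to 3 PM, plus ~1 GWH in each shoulder hour
worst_day_output_gwh = 3 * 6 + 1 + 1          # 20 GWH for the whole day

scale = daily_need_gwh / worst_day_output_gwh # 42x existing capacity
required_solar_gw = scale * existing_solar_gw # 588 GW

overnight_storage_gwh = 16 * 35               # ~4 PM to 8 AM at 35 GW avg
cost_per_kwh = 200                            # assumed battery price, dollars
storage_cost_usd = overnight_storage_gwh * 1e6 * cost_per_kwh

print(round(scale), round(required_solar_gw),
      overnight_storage_gwh, storage_cost_usd / 1e9)
# → 42 588 560 112.0  (i.e. about $112 billion of storage)
```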

So you can easily see that Sean’s idea of providing storage “equivalent to the daily generation of the PV system” doesn’t really get to the heart of the problem. Your main problem is that you will need capacity of close to 15 times peak usage (nearly 600 GW capacity to supply peak usage of around 40 GW) in order to deal with your lowest-production days of the year.

Cost? If you assume (charitably) that the “levelized cost” of energy from the solar panels is the same as the “levelized cost” of energy from a natural gas plant, then this system with 15 times the capacity is going to cost 15 times as much. Plus the cost of storage. In this scenario, that is relatively modest. At current prices of around $200/KWH the 560 GWH of storage will run around $112 billion, or around half of the annual budget of the state government of California.

But you may say, no one would build the system this way, with gigantic over-capacity in place just to cover the handful of days in the year with the very lowest solar output. Instead, why not build much less solar capacity, and save up power from the summer to cover the winter? Since the average output of the solar facilities in California is about 20% of capacity averaged over the year, you ought to be able to generate enough power for the year with capacity of about 5 times peak usage, rather than the 15 times in the scenario above. You will just need to save up power all the way from the summer to the winter. Oh, and you will need a huge multiple more storage than for the one-day-at-a-time scenario. If 180 days per year have less production than usage, and the average shortfall of production on each of those days is 300 GWH, then you will need 54,000 GWH worth of batteries (180 x 300). At $200 per KWH, that will run you around $10.8 trillion. This would be about triple the annual GDP of the state of California.
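A minimal sketch of the seasonal-storage arithmetic, using the article's inputs (20% annual capacity factor, 180 deficit days averaging a 300 GWH shortfall, and $200/KWH batteries):

```python
# Seasonal-storage scenario: size capacity to a ~20% annual capacity
# factor and carry summer surplus into winter. All inputs are the
# article's round figures.
peak_demand_gw = 40
capacity_factor = 0.20
required_solar_gw = peak_demand_gw / capacity_factor    # ~5x peak

deficit_days = 180
avg_daily_shortfall_gwh = 300
seasonal_storage_gwh = deficit_days * avg_daily_shortfall_gwh

cost_per_kwh = 200                                      # dollars
battery_cost_usd = seasonal_storage_gwh * 1e6 * cost_per_kwh

print(required_solar_gw, seasonal_storage_gwh, battery_cost_usd / 1e12)
# → 200.0 54000 10.8  (i.e. about $10.8 trillion in batteries)
```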

But don’t worry: batteries that can store power for six months and more, and then release it without loss, don’t exist. Maybe someone will invent them in time for California to meet its 2030 renewable electricity targets.

Any reader can feel free to check my math.

I just can’t believe that anybody talks about this as something remotely connected to reality.

July 31, 2021 | Economics

The Triumphant March Toward 100% “Renewable” Electricity: Germany and California

By Francis Menton – Manhattan Contrarian – July 28, 2021

As a state or a country, if you want to have any status in the ranks of the climate virtuous, the key metric is your commitment to get most or all of your energy from “renewables” (mainly wind and solar) by the earliest possible date. Everybody is doing it, and you are nobody if you don’t get in on the bidding. Just a couple of weeks ago (July 14), according to Reuters, the European Commission entered a bid of 40% of final energy consumption from “renewables” by 2030. Back here in the US, the most recent bid from the Biden administration (from April 28) is a goal of 80% of electricity by 2030, which is ambitious on its own, although electricity is a minority of final energy consumption. Congress has yet to consider the Biden administration bid.

Within both the EU and the US, there are national and state champions that are far out-virtuing everybody else. In the EU, it’s Germany. Germany adopted its “Energiewende” way back in 2010 to transition its energy sector to wind and solar. Since then Germany has repeatedly ramped up its renewable energy targets. Most recently, in December 2020, Germany adopted by statute a binding goal of 65% of electricity from renewables by 2030. Here in the US, our champion is California. In California the governing law is the famous SB 100, enacted in 2018, which sets mandatory targets for the electricity sector of 60% from “renewables” by 2030 and 100% by 2045.

As readers here know, the Manhattan Contrarian from time to time has expressed a high degree of skepticism as to whether these mandatory targets are achievable in the real world. Indeed, I have often noted that at somewhere around 40 – 50% of electricity from “renewables,” it becomes impossible as a practical matter to increase the share of electricity from renewables just by adding more renewable capacity. As far as I am aware, no large jurisdiction to date has gotten its percentage of electricity generation from “renewables” up above 50% for any extended period of time. (If a reader can point me to an example, I will be very interested.)

But maybe I’m just a crank. Surely these geniuses in Germany and California must know what they are doing. So let’s check in on the latest news.

Germany

The website No Tricks Zone has a report on July 27 covering electricity output in Germany for the first half of 2021. The No Tricks Zone post is based on data compiled at a German website called Die kalte Sonne.

And the answer is that in the first half of 2020 Germany achieved the level of 50% of its electricity from “renewables.” But in 2021 that level fell back to 43%:

“The share of renewable energies in gross electric power consumption in the first half of 2021 fell from 50% to 43% compared to a year earlier,” Die kalte Sonne reports.

What happened? The wind just didn’t blow as much:

“The production of onshore and offshore wind energy decreased by 20%.” . . . The reason for the steep drop, according to the findings, was due to unfavorable weather conditions. “This year, especially in the first quarter, the wind was particularly still. . . .”

So did solar energy then pick up the slack? Unfortunately, no:

“[T]he sun output was low. . . . Solar energy output . . . rose a modest 2%.”

So how did Germany make up the difference? The answer will not surprise you:

“Coal energy saw a renaissance. Brown coal [lignite] power plants produced 45.8 terawatt-hours of the net power – that is the power mix that comes out of the outlet.  That’s a strong increase of 37.6% compared to 2020, when only 33.6 terawatt-hours were produced. The net production by black coal power plants also increased, by 38.9% to 20.4 terawatt-hours after 14.4 terawatt-hours in 2020.”

Basically, Germany has hit the limit of what can be achieved by adding capacity of wind and solar power sources. To get to the higher levels of “renewable” market share that they have committed to, they will need to add large and rapidly-increasing amounts of grid-scale storage. So far, they have barely begun that process.

California

Perhaps you remember the excited headline from the LA Times from April 29: “California just hit 95% renewable energy.” April 29 was just the very day after President Biden had announced his goal of 80% of US electricity from “renewables” by 2030. Now California was already showing the world that they were way ahead and basically all the way to home plate:

Something remarkable happened over the weekend: California hit nearly 95% renewable energy. I’ll say it again: 95% renewables. For all the time we spend talking about how to reach 100% clean power, it sometimes seems like a faraway proposition, whether the timeframe is California’s 2045 target or President Biden’s more aggressive 2035 goal. But on Saturday just before 2:30 p.m., one of the world’s largest economies came within a stone’s throw of getting there.

(Emphasis in the original.) But maybe we shouldn’t get too excited just yet. First, although the author (Sammy Roth) says this was “95% renewable energy,” it turns out as you read further that he is only talking about electricity, which is only about 30% of energy consumption. And for how long did the renewables provide the 95% of electricity consumption?

Saturday’s 94.5% figure — a record, as confirmed to me by the California Independent System Operator — was fleeting, lasting just four seconds.

So what’s the real picture over the course of multiple months or a year? For that you’ll have to ignore the cheerleading reporters at the MSM, and try to find some aggregate statistics. Here are the figures from the California Energy Commission for the full year 2020. The total contribution to electricity supply from “renewables” is claimed to be 33.09%. Oh, but that includes 2.45% from “biomass,” 4.89% from “geothermal,” and 1.39% from “small hydro.” Take those out and you’re left with a big 24.36% from wind and solar. And since electricity is only about 30% of final energy consumption, that means that wind and solar are only contributing around 8% of total energy consumption in California.
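The share arithmetic can be replicated directly from the Energy Commission figures quoted above (treating electricity as 30% of final energy consumption is the article's approximation):

```python
# California Energy Commission 2020 electricity shares, per the
# article (all values in percent).
renewables_total = 33.09
biomass, geothermal, small_hydro = 2.45, 4.89, 1.39

# Strip out the non-wind/solar "renewables"
wind_solar = renewables_total - biomass - geothermal - small_hydro

# Electricity is roughly 30% of final energy consumption
electricity_share_of_energy = 0.30

print(round(wind_solar, 2),
      round(wind_solar * electricity_share_of_energy, 1))
# → 24.36 7.3  (the article rounds the latter up to "around 8%")
```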

Over at the website of California’s Independent System Operator (“CAISO”) they provide a chart for every day’s electricity production that dramatically illustrates the problem. California’s peak electricity demand is around 40 GW, generally occurring around 6 – 8 PM. The large majority of their “renewable” production is from solar. Their current solar capacity, on a sunny mid-summer day like today, provides around 12 GW from about 9 AM to 5 PM — and nothing the rest of the time, including at the time of peak usage. In the winter, the output is more like 8 GW from 10 AM to 4 PM, and nothing the rest of the time. So far, they have almost nothing in the way of grid scale energy storage. In the evening, they ramp up the natural gas plants, and import power from Arizona and Nevada — mostly natural gas, nuclear, and coal. Close to 30% of California’s electricity comes from imports from neighboring states.

Is California going to meet its statutory mandatory goal of 60% of electricity from renewables by 2030? I know which way I’m betting.

July 31, 2021 | Science and Pseudo-Science

California Governor Gavin Newsom Has a New Coronavirus Crackdown Hypocrisy Scandal

By Adam Dick | Ron Paul Institute | July 28, 2021

California Governor Gavin Newsom, over the last year and a half, has been one of the American governors imposing the most extensive crackdowns on freedom in the name of countering coronavirus. He also famously exhibited extreme hypocrisy in November by flagrantly violating his own California coronavirus-related mandates while taking part in a dinner party at the uber-expensive French Laundry restaurant. Newsom’s attitude seems to be that his rules are for regular people, not for himself and his friends.

Now comes word of another scandal in which Newsom has flouted a mandate he imposed on the state. Eric Ting reported Tuesday at SFGate that two of Newsom’s children recently attended a basketball summer camp that had informed parents ahead of time that children would not be required to wear masks, despite a state mandate that children ages two to 11 do so. After a picture of one of Newsom’s children, along with other children at the camp, with uncovered faces appeared on the internet, Newsom’s kids were pulled out of the camp early. Whoops: the Newsom family had missed reading the camp’s email mentioning the camp’s mask policy, explained the communications director of the governor’s office.

It is great that Newsom and his friends can enjoy an “old normal” dinner party with friends, though the dinner party at issue looks like it was also a get-together of government and special interest lobbyists. And it is great that Newsom’s children, who are in an age group for which risk of serious injury or death from coronavirus is nearly zero, can participate in a summer camp without wearing uncomfortable, dehumanizing masks that are known to cause health problems but have not been shown to provide any net protection from coronavirus. It would also be great if more summer camps followed freedom-friendly policies as did the camp Newsom’s children attended. Kudos for people taking part in such forbidden activities that bring joy to life. The problem with Newsom is that he takes these actions for himself and his children while, at the same time, he decrees that ordinary people are prohibited from doing so.


Copyright © 2021 by Ron Paul Institute

July 28, 2021 | Civil Liberties, Progressive Hypocrite

California utility PG&E admits it probably started yet ANOTHER devastating wildfire

RT | July 20, 2021

Pacific Gas & Electric (PG&E) seems to have once again helped ignite a deadly wildfire in California, contributing to the carnage in the most populous US state for the fourth year in a row, according to a report on its website.

Documents posted to the utility’s website on Monday and filed with the California Public Utilities Commission indicate that a PG&E employee saw “blown fuses in a conductor on top of a pole, a tree leaning into the conductor, and a fire at the base of the tree” when he responded to a reported circuit outage around 7am local time last Tuesday. The equipment problem is believed to have helped contribute to the start of the Dixie Fire in Feather River Canyon, a devastating blaze that is still just 15% contained.

Unable to access the pole until nearly 12 hours after first taking note of the fire, due to “challenging terrain and road work resulting in a bridge closure,” the employee reported that upon returning to the site around 4:40pm local time, he encountered “a fire on the ground near the base of the tree” plus “two of three fuses blown and what appeared to him to be a healthy green tree leaning into the Bucks Creek 1101 12 kV conductor, which was still intact and suspended on the poles,” according to the report.

Only then did the worker call his supervisor – who subsequently called 911. Given PG&E’s abysmal track record of responding to wildfires (especially those linked to its equipment), it is perhaps unsurprising that the utility reportedly waited five days – rather than the required two to four hours – to report the nascent blaze to the state regulatory agency.

The Dixie Fire has already consumed more than 30,000 acres as of Monday, and continues to force evacuations in Plumas and Butte counties. PG&E’s systems reportedly showed an outage near Cresta Dam in the area of Feather River Canyon where the fire began. Mandatory evacuation orders remain in force in High Lakes, Bucks Lake, and Meadow Valley in Plumas County; Jonesville and Philbrook in Butte County are also under evacuation order. Cal Fire reported on Monday that 800 structures remain under threat.

PG&E has become notorious for its dysfunctional equipment’s apparent contributions to the increasingly devastating wildfires plaguing the region. PG&E equipment has been connected to at least one wildfire every year for the past four years, starting with the deadly 2018 Camp Fire.

The utility pleaded guilty last year to 84 counts of manslaughter, each count representing one life lost in the Camp Fire. The deadly blaze began in November 2018 in the town of Pulga, eventually engulfing 140,000 acres aided by high winds and low humidity. Some 8,700 homes were destroyed and tens of thousands of people forced to evacuate, while even those whose homes were spared the destruction were unable to go outside due to the extremely unhealthy air quality. The deadliest wildfire in California history, the Camp Fire killed 85 people and all but wiped out the town of Paradise.

The judgment pitched PG&E into a bankruptcy from which it finally emerged last year, with a promise to compensate fire victims for whatever damages had not been covered by their insurance – a sum of $13.5 billion that will be partially paid in company stock.

Last year, PG&E equipment was found to be partially responsible for the Zogg Fire in Shasta County. The company is still facing a criminal investigation over that blaze and was forced to pay out $43 million to local governments for that fire and the previous year’s Kincade Fire in Sonoma County. The utility still faces prosecution in Sonoma County over the 2019 fire.

Should PG&E continue to perform not just poorly but criminally, the utility could ultimately be taken over by the state, though California’s Public Utilities Commission requires the firm to progress through six ‘tiers’ of its so-called enhanced oversight program first. PG&E is already on the first tier, having been nailed for the shoddy job it did clearing out tree limbs and other kindling from its riskiest lines since November, and has pledged to spend $4.9 billion on “wildfire safety” this year.

Its promise to “do better” after four years of contributing to the devastating losses experienced by California residents was made as a condition for exiting bankruptcy. Meanwhile, company officials have attempted to blame drought and climate change, instead of taking responsibility.

PG&E has also outraged and alienated customers by shutting off power to hundreds of thousands of people during peak usage hours, hoping to prevent the sparking that has been known to cause wildfires when high winds topple or damage its power lines.

An investigation by the CPUC accused the utility of lacking even a rudimentary safety strategy, noting it only makes “positive changes” when forced to do so by dire accidents like fires and explosions. The CPUC report itself was issued seven years after the explosion of a gas pipeline in 2010, which killed eight people and uncovered poor if not criminal business practices such as overcharging customers, underspending on maintenance, and in general placing profit over everything – including but not limited to safety.

This year’s fire season is already predicted to be especially devastating, with expectations it will be longer, drier, and riskier than previous years’ even as PG&E struggles to fix its decaying infrastructure.

Over 158,000 acres of Northern California forest have burned so far this season, including the Tamarack Fire, which grew to 23,000 acres as of Monday morning and remains entirely uncontained. It was reportedly ignited by a lightning strike earlier this month, and local firefighters made the questionable decision not to dispatch fire crews “because of safety concerns,” leaving Alpine County Sheriff Rick Stephens to explain the bizarre response to local residents who now face losing their homes to the inferno. The Beckwourth Complex Fire has burned over 105,000 acres as of Monday and is 82% contained.

July 20, 2021 | Aletho News

A Sinister Agenda Behind California Water Crisis?

By F. William Engdahl – New Eastern Outlook – 10.06.2021

In recent months a crisis in the USA food supply has been growing and is about to assume alarming dimensions that could become catastrophic. Atop the existing corona pandemic lockdowns and unemployment, a looming agriculture crisis could tip inflation higher and trigger a financial crisis as interest rates rise. The ingredients are many, but central is a severe drought in key growing states of the Dakotas and the Southwest, including agriculture-intensive California. So far Washington has done disturbingly little to address the crisis, and California Water Board officials have been making it far worse by draining the state water reservoirs… into the ocean.

So far the worst-hit farm state is North Dakota, which grows most of the nation’s Red Spring Wheat. In the Upper Midwest, the Northern Plains states, and the Prairie provinces of Canada, winter brought far too little snow following an exceedingly dry summer in 2020. The result is drought from Manitoba, Canada, to the Northern US Plains states. This hits farmers in the region just four years after a flash drought in 2017 arrived without early warning and devastated the US Northern Great Plains region comprising Montana, North Dakota, South Dakota, and the adjacent Canadian Prairies.

As of May 27, according to Adnan Akyuz, State Climatologist, ninety-three percent of North Dakota is in at least a Severe Drought category, and 77% of the state is in an Extreme Drought category. Farm organizations predict that unless the rainfall changes dramatically in the coming weeks, the harvest of wheat widely used for pasta and flour will be a disaster. The extreme dry conditions extend north of the North Dakota border into Manitoba, Canada, another major grain and farming region, especially for wheat and corn. There, the lack of rainfall and warmer-than-normal temperatures threaten harvests, though it is still early for those crops. North Dakota and the plains region depend on snow and rainfall for their agriculture water.

Southwest States in Severe Drought

While not as severe, farm states Iowa and Illinois are suffering “abnormally dry” conditions across 64% of Iowa and 27% of Illinois. About 55% of Minnesota is abnormally dry as of the end of May. Drought is measured on a scale from D0, “abnormally dry,” through D2, “severe drought,” up to D4, “exceptional drought.”

The severe dry conditions are not limited, unfortunately, to North Dakota or other Midwest farm states. A second region of very severe drought extends from western Texas across New Mexico, Colorado, Arizona, Nevada and deep into California. In Texas 20% of the state is in “severe drought,” and 12% “extreme drought.” Nearly 6% of the state is experiencing “exceptional drought,” the worst. New Mexico is undergoing 96% “severe drought,” and of that, 47% “exceptional drought.”

California Agriculture is Vital

The situation in California is by far the most serious in its potential impact on the supply of agriculture products to the nation. There, irrigation and a sophisticated water storage system provide water for irrigation and urban use to the state for their periodic dry seasons. Here a far larger catastrophe is in the making. A cyclical drought season is combining with literally criminal state environmental politics, to devastate agriculture in the nation’s most important farm producing state. It is part of a radical Green Agenda being advocated by Gov. Gavin Newsom and fellow Democrats to dismantle traditional agriculture, as insane as it may sound.

Few outside California realize that the state most known for Silicon Valley and beautiful beaches is such a vital source of agriculture production. California’s agricultural sector is the most important in the United States, leading the nation’s production in over 77 different products including dairy and a number of fruit and vegetable “specialty” crops. The state is the only producer of crops such as almonds, artichokes, persimmons, raisins, and walnuts. California grows a third of the country’s vegetables and two thirds of the country’s fruits and nuts. It leads all other states in farm income with 77,500 farms and ranches. It also is second in production of livestock behind Texas, and its dairy industry is California’s leading commodity in cash receipts. In total, 43 million acres of the state’s 100 million acres are devoted to agriculture. In short what happens here is vital to the nation’s food supply.

California Crisis Manmade: Where has the water gone?

The water crisis in California is by far the most serious in terms of consequences for the food supply, in a period when the US faces major supply chain disruptions owing to absurd corona lockdowns combined with highly suspicious hacks of key infrastructure. On May 31, the infrastructure of the world’s largest meat processor, JBS SA, was hacked, forcing the shutdown of all its US beef plants, which supply almost a quarter of American beef.

The Green lobby is asserting, while presenting no factual evidence, that Global Warming, i.e. increased CO2 manmade emission, is causing the drought. The NOAA examined the case and found no evidence. But the media repeats the narrative to advance the Green New Deal agenda with frightening statements such as claiming the drought is, “comparable to the worst mega-droughts since 800 CE.”

After 2011, California underwent a severe seven-year drought. The drought ended in 2019 as major rains filled the California reservoir system to capacity. According to state water experts, the reservoirs held enough water to easily endure at least a five-year drought. Yet two years later, the administration of Governor Newsom is declaring a new drought and threatening emergency measures. What his administration is not saying is that the State Water Board and relevant state water authorities have been deliberately letting water flow into the Pacific Ocean. Why? They say to save two endangered fish species that are all but extinct: one a rare type of salmon, the other the Delta Smelt, a tiny minnow-sized fish about two inches long which has all but disappeared.

In June 2019 Shasta Dam, holding the state’s largest reservoir as a keystone of the huge Central Valley Project, was full to 98% of capacity. Just two years later, in May 2021, Shasta Lake reservoir held a mere 42% of capacity, almost 60% down. Similarly, in June 2019 Oroville Dam reservoir, the second largest, held water at 98% of capacity, and by May 2021 was down to just 37%. Other smaller reservoirs saw similar drops. Where has all the water gone?

Allegedly to “save” these fish varieties, during just 14 days in May, according to Kristi Diener, a California water expert and farmer, “90% of (Bay Area) Delta inflow went to sea. It’s equal to a year’s supply of water for 1 million people.” Diener has been warning repeatedly in recent years that water is unnecessarily being let out to sea as the state faces a normal dry year. She asks, “Should we be having water shortages in the start of our second dry year? No. Our reservoirs were designed to provide a steady five year supply for all users, and were filled to the top in June 2019.”

In 2008, at the demand of environmental groups such as the NRDC, a California judge ordered that the Central Valley Water project send 50% of water reservoirs to the Pacific Ocean to “save” an endangered salmon variety, even though the NGO admitted that no more than 1,000 salmon would likely be saved by the extreme measure. In the years 1998-2005 an estimated average of 49% of California managed water supply went to what is termed the “environment,” including feeding into streams and rivers, to feed estuaries and the Bay Area Delta. Only 28% went directly to maintain agriculture water supplies.

This past January Felicia Marcus, the chair of the California State Water Resources Control Board who oversaw the controversial water policies since 2018, left at the end of her term to become an attorney for the Natural Resources Defense Council (NRDC), one of the most powerful green NGOs, with a reported $400 million in resources to wage legal battles to defend “endangered species” such as the California salmon and the Delta Smelt.

Appointed by green Gov. Jerry Brown as chair of the State Water Board in 2018, Marcus is directly responsible for the draining of the reservoirs into the ocean after they filled in 2019, using the claim of protecting endangered species. In March 2021, with Marcus as attorney, the NRDC requested that the State Water Resources Control Board she headed until recently take “immediate action” to address perceived threats to listed salmon in the Sacramento River watershed from Central Valley Project (“CVP”) operations. This as the state is facing a new drought emergency?

In 2020 Gov. Gavin Newsom, a protégé of Jerry Brown, signed Senate Bill 1, the California Environmental, Public Health and Workers Defense Act, which would send billions of gallons of water out to the Pacific Ocean, ostensibly to save more fish. It was a cover for manufacturing the present water crisis and specifically attacking farming, as incredible as it may seem.

Target Agriculture

The true agenda of the Newsom and previous Brown administrations is to radically undermine the highly productive California agriculture sector. Gov. Newsom has now introduced an impressive-sounding $5.1 billion Drought Relief bill. Despite its title, nothing will go to improve the state reservoir water availability for cities and farms. Of the total, $500 million will be spent on incentives for farmers to “re-purpose” their land, that is to stop farming. Suggestions include wildlife habitat, recreation, or solar panels! Another $230 million will be used for “wildlife corridors and fish passage projects to improve the ability of wildlife to migrate safely.” “Fish passage projects” is a clever phrase for dam removal, destroying the nation’s most effective network of reservoirs.

Then the Newsom bill allocates $300 million for Sustainable Groundwater Management Act implementation, a 2014 law from Jerry Brown, passed amid the previous severe drought, that in effect prevents farmers from securing water by drilling wells. The effect will be to drive more farmers off the land. And another $200 million will go to “habitat restoration,” supporting tidal wetlands, floodplains, and multi-benefit flood-risk reduction projects. A drought package with funding for floods? This is about recreating flood plains so that when they demolish the dams, the water has someplace to go. The vast bulk of the $5.1 billion is slated to reimburse water customers for higher water bills incurred during the previous 2011-2019 drought, a move no doubt made in hopes voters will look positively on Newsom as he faces a likely voter recall in November.
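The bill's arithmetic is easy to check. Here is a toy tally using only the line items named above (the dollar figures come from the text; the remainder is simply everything not itemized, which the text says mostly goes to reimbursing water customers):

```python
# Quick check of the Drought Relief bill arithmetic described above.
# Line items are those named in the text (in $ millions); the remainder
# is everything not itemized.

TOTAL = 5_100  # the bill's headline $5.1 billion, in millions

itemized = {
    "land re-purposing incentives": 500,
    "wildlife corridors / fish passage": 230,
    "SGMA implementation": 300,
    "habitat restoration": 200,
}

itemized_total = sum(itemized.values())
remainder = TOTAL - itemized_total

print(f"Itemized:  ${itemized_total:,}M")
print(f"Remainder: ${remainder:,}M")
```

As the numbers show, the named programs account for well under a third of the total; the bulk is the unitemized remainder.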

The systematic dismantling of one of the world’s most productive agriculture regions, using the seductive mantra of “environmental protection,” fits into the larger agenda of the Davos Great Reset and its plans to radically transform world agriculture into what the UN Agenda 2030 calls “sustainable” agriculture: no more meat protein. The green argument is that cows are a major source of methane gas emissions via burps. How that affects global climate no one has seriously proven. Instead we should eat laboratory-made fake meat like the genetically-manipulated Impossible Burger of Bill Gates and Google, or even worms. Yes. In January the EU’s European Food Safety Authority (EFSA) approved mealworms, the larvae of the darkling beetle, as the first “novel food” cleared for sale across the EU.

F. William Engdahl is a strategic risk consultant and lecturer; he holds a degree in politics from Princeton University.

June 12, 2021 Posted by | Malthusian Ideology, Phony Scarcity, Science and Pseudo-Science

California Bill Would Fine Retailers That Keep Boys and Girls’ Toys and Clothing in Separate Sections

By Paul Joseph Watson | Summit News | February 25, 2021

A new bill proposed in California would fine retailers, including online stores, that continue to display boys and girls’ toys and clothing in separate sections.

Introduced by Democrats Evan Low and Cristina Garcia in the California state legislature, the bill would require retailers to display the “majority” of the items in unisex sections.

The legislation would also forbid signs that indicate whether the toys and clothing are intended for boys or girls.

Websites would also be made to show all the items on a single page entitled “kids,” “unisex” or “gender neutral,” something that would cause practical confusion even outside of the political intentions of the bill.

Retailers with over 500 employees would be hit with an initial $1,000 fine for non-compliance.

“There are clear political and social motivations behind this bill, namely to use the state to compel “inclusivity” and encourage the “self-expression” of disordered inclinations at a very young age. It’s despicable,” commented Evan James.

As we highlighted earlier, a new Gallup poll found that when it comes to Generation Z, one in six now identify as some form of LGBT.

February 26, 2021 Posted by | Civil Liberties

Assigning Blame for the Blackouts in Texas

By Planning Engineer | Climate Etc. | February 18, 2021 

The story from some media sources is that frozen wind turbines are responsible for the power shortfalls in Texas. Other media sources emphasize that fossil fuel resources should shoulder the blame because they have large cold-induced outages as well, and because some natural gas plants could not obtain fuel.

Extreme cold should be expected to cause significant outages of both renewable and fossil fuel based resources. Why would anyone expect that sufficient amounts of natural gas would be available and deliverable to supply much-needed generation? Considering the extreme cold, nothing particularly surprising is happening within any resource class in Texas. The technologies and their performance were well within the expected bounds of what could have been foreseen for such weather conditions. But while some degradation of individual resources should be expected, the system-wide consequences Texas is experiencing are a departure from what should be happening. Who or what, then, is responsible for the shocking consequences produced by Texas’s run-in with this recent bout of extreme cold?

TRADITIONAL PLANNING

Traditionally, responsibility for ensuring adequate capacity during extreme conditions has fallen upon individual utility providers. A couple of decades ago I was responsible for the load forecasting, transmission planning and generation planning efforts of an electric cooperative in the southeastern US. My group’s projections, studies and analysis supported our plans to meet customer demand under forecasted peak load conditions. We had seen considerable growth in residential and commercial heat pumps. At colder temperatures these units stop producing heat efficiently and switch to resistance heating, which causes a spike in demand. Our forecasts showed that we would need to plan for extra capacity to meet this potential demand under extreme conditions in upcoming winters.

I was raked over the coals and this forecast was strongly challenged. Providing extra generation capacity, ensuring committed (firm) deliveries of gas during the winter, and upgrading transmission facilities are all expensive endeavors. Premiums are paid to ensure gas delivery and backup power, and there is no refund if they are not used. Such actions increase the annual budget and impact rates significantly for something that is not likely to occur most years, even if the extreme weather projections are appropriate. You certainly don’t want to over-estimate peak demand, due to the increasing costs associated with meeting that demand. But back then we were obligated to provide for such “expected” loads. Our CEO, accountants and rate makers would ideally have liked a lower extreme demand projection, as that would in most cases have kept our costs down. It was challenging to hold firm, stand by the studies and force the extra costs on our Members.

Fortuitously for us, we were hit with extreme winter conditions just when the plan went in place. Demand soared and the planned capacity we had provided was needed. A neighboring entity was hit with the same conditions. Like us they had significant growth in heat pumps – but they had not forecasted their extreme weather peak to climb as we had. They had to go to the overburdened markets to find energy and make some curtailments. The cost of replacement power turned out to be significantly greater proportionately than we incurred by planning for the high demand. They suffered real consequences due to the shortcomings of their planning efforts.

However, if extreme winter had not occurred, our neighbor’s costs would have been lower than ours that year and that may have continued many years into the future as long as we didn’t see extreme winter conditions. Instead of the praise we eventually received, there would have at least been some annoyance directed at my groups for contributing to “un-needed expenditures”. That’s the way of the world. You can often do things a little cheaper, save some money and most of the time you can get away with it. But sometimes/eventually you cut it too close and the consequences can be extreme.

The Approach in Texas

Who is responsible for providing adequate capacity in Texas during extreme conditions? The short answer is no one. The Electric Reliability Council of Texas (ERCOT) looks at potential forecasted peak conditions and expected available generation and if there is sufficient margin they assume everything will be all right. But unlike utilities under traditional models, they don’t ensure that the resources can deliver power under adverse conditions, they don’t require that generators have secured firm fuel supplies, and they don’t make sure the resources will be ready and available to operate. They count on enough resources being there because they assume that is in their owner’s best interests. Unlike all other US energy markets, Texas does not even have a capacity market. By design they rely solely upon the energy market. This means that entities profit only from the actual energy they sell into the system. They do not see any profit from having stand by capacity ready to help out in emergencies. The energy only market works well under normal conditions to keep prices down. While generally markets are often great things, providing needed energy during extreme conditions evidently is not their forte. Unlike the traditional approach where specific entities have responsibilities to meet peak levels, in Texas the responsibility is diffuse and unassigned. There is no significant long term motivation for entities to ensure extra capacity just in case it may be needed during extreme conditions. Entities that might make that gamble theoretically can profit when markets skyrocket, but such approaches require tremendous patience and the ability to weather many years of potential negative returns.

This article from Greentech Media praises energy-only markets, as do many green interests. Capacity markets are characterized as wasteful. Andrew Barlow, Head of the PUC in Texas, is quoted as follows: “Legislators have shown strong support for the energy-only market that has fueled the diversification of the state’s electricity generation fleet and yielded significant benefits for customers while making Texas the national leader in installed wind generation.”

Why has Capacity been devalued?

Traditional fossil fuel generation (as do most hydro and nuclear resources) has inherent capacity value. That means such resources generally can be operated with a high degree of reliability and dependability. With incentives they can be operated so that they will likely be there when needed. Wind and solar are intermittent resources, working only under good conditions for wind and sun, and as such do not have capacity value unless they are paired with costly battery systems.

If you want to achieve a higher level of penetration from renewables, dollars will have to be funneled away from traditional resources towards renewables. For high levels of renewable penetration, you need a system where the consumers’ dollars applied to renewable generators are maximized. Rewarding resources for offering capacity advantages effectively penalizes renewables. As noted by the head of the PUC in Texas, an energy-only market can fuel diversification towards intermittent resources. It does this because it rewards only energy that is fed into the grid, not backup power. (Side note: it’s typical to give “renewable” resources preference for feeding into the grid as well. Sometimes wind is compensated for feeding into the grid even during periods of excess generation when fossil fuel resources are penalized. But that’s another article.)

Traditional planning studies might recognize that wind needs to be backed up by fossil fuel (more so under extreme conditions), such that if you already have these backup generators it’s much cheaper to run and fuel them than to add wind farms, with their accompanying significant investment in concrete, rare earth metals, vast swaths of land … . Newer rate and market approaches thus have to go out of their way to get around this “bias” of favoring capacity-providing resources over intermittent resources.

When capacity value is rewarded, this makes the economics of renewables much less competitive. Texas has stacked the deck to make wind and solar more competitive than they could be in a system that better recognizes the value of dependable resources which can supply capacity benefits. An energy only market helps accomplish the goal of making wind and solar more competitive. Except capacity value is a real value. Ignoring that, as Texas did, comes with real perils.

In Texas now we are seeing the extreme shortages and market price spikes that can result from devaluing capacity. The impacts are increased by both having more intermittent resources which do not provide capacity and also because owners and potential owners of resources which could provide capacity are not incentivized to have those units ready for backup with firm energy supplies.

Personal Observations

Wind and solar have value and can be added to power systems effectively in many instances. But seeking to attain excessive levels of wind and solar quickly becomes counterproductive. It is difficult to impossible to justify the significant amounts of wind and solar penetration desired by many policy makers today using principles of good cost allocation. Various rate schemes and market proposals have been developed to help wind and solar become more competitive. But they come with costs, often hidden. As I’ve written before, it may be because transmission providers have to assume the costs and build a more expensive system to accommodate them. It may be that rates and markets unfairly punish other alternatives to give wind and solar an advantage. It may be that they expose the system to greater risks than before. It may be that they eat away at established reliability levels and weaken system performance during adverse conditions. In a fair system with good price signals, today’s wind and solar cannot achieve high penetration levels in fair competition.

Having a strong technical knowledge of the power system along with some expertise in finance, rates and costs can help one see the folly of a variety of policies adopted to support many of today’s wind and solar projects. Very few policy makers possess anything close to the skill sets needed for such an evaluation. Furthermore, while policy makers could listen to experts, their voices are drowned out by those with vested interests in wind and solar technology who garner considerable support from those ideologically inclined to support renewables regardless of impacts.

A simpler approach to understanding the ineffectiveness of unbridled advocacy for wind and solar is to look at those areas which have heavily invested in these intermittent resources and achieved higher penetration levels of them. Typically electric users see significant overall increases in the cost of energy delivered to consumers. Emissions of CO2 do not uniformly decrease along with employment of renewables, but may instead increase due to how backup resources are operated. Additionally, reliability problems tend to emerge in these systems. Texas, a leader in wind, now joins California, Germany and the UK in showing that reliability concerns and outages increase along with greater employment of intermittent resources.

Anyone can look at Texas and observe that fossil fuel resources could have performed better in the cold. If those who owned the plants had secured guaranteed fuel, Texas would have been better off. More emergency peaking units would be a great thing to have on hand. But why would generators be inclined to do such things? Consider what would be happening if the owners of gas generation had built sufficient generation to get through this emergency with some excess power. Instead of collecting $9,000 per MWH from existing functioning units, they would be receiving less than $100 per MWH for the output of those plants and their new plants. Why would anyone build tremendous infrastructure that would sit idle in normal years and serve to slash their revenue by orders of magnitude in extreme conditions?
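The revenue logic above can be sketched with a toy calculation. Everything here is an illustrative round number, not market data: a hypothetical 1,000 MW plant running for 72 hours, with the article's $9,000/MWh scarcity price versus roughly $100/MWh under normal conditions.

```python
# Toy illustration of the incentive problem in an energy-only market.
# All figures are hypothetical round numbers, not actual market data.

SCARCITY_PRICE = 9_000  # $/MWh: price cap reached during the shortage
NORMAL_PRICE = 100      # $/MWh: roughly the top of normal price ranges

def revenue(capacity_mw: float, hours: float, price: float) -> float:
    """Revenue from running flat-out for `hours` at the given price."""
    return capacity_mw * hours * price

# Scenario A: shortage occurs; an existing 1,000 MW plant sells at the cap.
shortage = revenue(1_000, 72, SCARCITY_PRICE)

# Scenario B: enough spare capacity was built that no shortage occurs,
# so the same energy clears near normal prices.
no_shortage = revenue(1_000, 72, NORMAL_PRICE)

print(f"With shortage:    ${shortage:,.0f}")
print(f"Without shortage: ${no_shortage:,.0f}")
print(f"Ratio: {shortage / no_shortage:.0f}x")
```

Under these assumptions the scarcity event pays roughly ninety times more, which is exactly why an owner has little financial reason to build spare capacity that would eliminate the scarcity.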

The incentive for gas generation to do the right thing was taken away by Texas’s deliberate energy-only market strategy, the purpose of which was to aid the profitability of intermittent wind and solar resources and increase their penetration levels. I don’t believe anyone has ever advanced the notion that fossil fuel plants might operate based on altruism. Incentives and responsibility need to be paired. Doing a post-mortem on the Texas situation that ignores incentives and responsibility is inappropriate and incomplete.

February 19, 2021 Posted by | Economics

Yet Another California Sheriff Refuses To Enforce Lockdown Measures

Orange County Sheriff Don Barnes has become the latest law enforcement head in California to declare that his officers will not be enforcing COVID restrictions.

By Steve Watson | Summit News | December 7, 2020

As the entire state was plunged into what is effectively a complete lockdown over the weekend, Barnes tweeted that he will not send deputies to any calls about “compliance with face coverings, social gatherings or stay-at-home orders.”

“Compliance with health orders is a matter of personal responsibility and not a matter of law enforcement,” Barnes asserted in a statement.

“To put the onus on law enforcement to enforce these orders against law-abiding citizens who are already struggling through difficult circumstances, while at the same time criticizing law enforcement and taking away tools to do our jobs, is both contradictory and disingenuous,” Barnes urged in a clear shot at Governor Gavin Newsom.

Newsom issued a quarantine order which will affect approximately 85% of the state’s population of 40 million people.

The lockdown will remain in place for at least three weeks over the holidays, seeing the closure of businesses including bars, hair salons and indoor restaurants. People will be prevented from meeting with anyone outside their household.

Thus far, the Sheriffs of Riverside County, Los Angeles County and San Bernardino County have all announced that they will not be enforcing the restrictions.

On Friday, Riverside County Sheriff Chad Bianco slammed Newsom’s “dictatorial attitude toward California residents while dining in luxury, traveling, keeping his business open and sending his kids to in-person private schools,” labelling Newsom “extremely hypocritical.”

“These closures and stay-at-home orders are flat-out ridiculous,” Bianco declared, adding “The metrics used for closure are unbelievably faulty and are not representative of true numbers and are disastrous for Riverside County.”

Bianco noted that “Newsom is expecting us to arrest anyone violating these orders, cite them and take their money, close their businesses, make them stay in their homes, and take away their civil liberties or he will punish all of us.”

“While the governor’s office and the state has threatened action against violators, the Riverside County Sheriff’s Department will not be blackmailed, bullied or used as muscle against Riverside County residents in the enforcement of the governor’s orders,” Bianco continued.


December 7, 2020 Posted by | Civil Liberties, Timeless or most popular, Video

Sheriff Bianco has a message for Gavin Newsom

https://www.youtube.com/channel/UCTRfqE64o1G0lmfWyiPrFnA

December 6, 2020 Posted by | Civil Liberties, Video

Why Did Governor Gavin Newsom Veto A “Critical Race Theory” Education Bill?

By Eric Striker – National Justice – October 1, 2020

California Governor Gavin Newsom is an unlikely ally in the fight against the anti-white critical race theory, but yesterday he shocked and confused his colleagues with a surprise veto of Assembly Bill 331.

AB 331, which passed 62 to 12 in California’s State Assembly, would’ve mandated that students in the Golden State’s failing high schools take a “Critical Ethnic Studies” class about the oppression and discrimination faced by one of four minority groups (African-Americans, “Latinx,” Native Americans and Asian-Americans) at the hands of white supremacists. The law’s discussion placed special emphasis on forcing white students to take these classes.

While AB 331 was passed in January 2019, Newsom’s veto has the optical misfortune of coinciding with Donald Trump’s current campaign seeking to put an end to critical race theory in federally funded institutions. The bill’s sponsor, Assemblyman Jose Medina, lashed out at his fellow Democrat, calling the Governor’s move “a failure to push back against the racial rhetoric and bullying of Donald Trump.”

So why did he do it?

The answer lies in Newsom’s donors, who also happen to be members of prominent Jewish ethnic lobbies. For example, the American Jewish Committee responded to news of his decision with a “Bravo.”

Jewish groups protested AB 331 because, while they agreed with the anti-white message, they also resented the lack of an exemption for Jewish students. Under the law, Jews would be considered “white” and not allowed to choose Jewish Studies for their credit.

Roselyn Swig, a billionaire heiress, wrote an op-ed reflecting this sentiment four days ago. In the piece, she urges that the bill be altered.

According to Swig, critical race theory is “crucial to ensure a tradition of tolerance, understanding and respect – three of my core values – for future generations, while advancing justice for marginalized communities.” However, she furiously contested that “[An] initial draft of the educational plan had both excluded Jews and antisemitism education, and included anti-Jewish tropes in lyrics and anti-Israel boycotts.”

Swig concluded her open letter by stating that the “interests of the Jewish community are actually aligned with other ethnic studies groups. We should come together to advance our shared values, both in the classroom and beyond, for years to come.”

The proposed ethnic studies curriculum tried to adapt to these demands, but the end result only enraged Jews further. The final version on Newsom’s desk would’ve taught that Jews were beneficiaries of “white privilege,” which doomed it.

While non-white groups who supported the bill will blame Newsom’s own “white privilege” for its failure, the final decision was made outside of the Governor’s office.

Swig belongs to what California media has dubbed Newsom’s “Faithful Eight” — eight wealthy families who have given the Governor millions of dollars to transform him from mediocre dog catcher to a national figure with future presidential ambitions.

Of the Faithful Eight, the Swigs are joined by four other elite Jewish dynasties: the Guggenheims, the Marcuses, the Fishers, and the Pritzkers.

While California has earned a reputation for its radical left policies, it also suffers from Progressive Except Palestine syndrome. In 2016, Governor Jerry Brown signed one of the most draconian and controversial anti-BDS (Boycott, Divestment, Sanctions) bills in the country.

The lesson for white privilege peddlers who saw AB 331 as a political lay up is simple: real power always strikes back.

October 1, 2020 Posted by | Aletho News

Gavin Newsom’s Exceedingly Ignorant Climate Claim

By Jim Steele | Watts Up With That? | September 14, 2020

Scientific evidence reveals there has been no climate effect regarding California’s wildfires! None! The data below proves it beyond all doubt. There is no denying that warmer temperatures can cause drier fuels and promote larger fires. But that fact is being misapplied to all wildfires. About 70% of California’s 2020 burnt areas have been in grasslands, and dead grass is so dry by the end of California’s annual summer drought that it is totally insensitive to any added warmth from climate change. Dead grasses only require a few hours of warm dry conditions to become highly flammable. It’s fire weather, not climate change, that is critical. Furthermore, the century-long trends in local temperatures where California’s biggest fires have occurred reveal no connection to climate change. In most cases the local maximum temperatures have been cooler now than during the 1930s. Those cooler temperatures should reduce the fire danger. Newsom is either ignoring or distorting the scientific evidence, is totally stupid, or is a dishonest demagogue.

Maximum temperatures are typically used by fire indexes to issue red flag warnings because it is the heat of midday that has the greatest drying effect. Minimum temperatures are often low enough to drop below the dewpoint at which time fuel moisture increases. So averaging minimum and maximum temperatures is inappropriate. In addition, referencing a higher global average temperature is meaningless. Only local maximum temperatures determine the dryness of surface fuels during every fire. As in Park and Abatzoglou 2019, the months of March through October are averaged to determine maximum temperatures during California’s dry season.
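The dry-season metric described above is simple to compute. A minimal sketch follows, using invented placeholder temperatures rather than Western Regional Climate Center station data:

```python
# Sketch of the dry-season metric described above: average each year's
# monthly maximum temperatures for March through October.
# The temperature values are invented placeholders, not station data.

monthly_tmax = {  # year -> 12 monthly mean maximum temperatures (deg F)
    1934: [55, 58, 64, 70, 78, 86, 92, 91, 87, 76, 63, 56],
    2018: [54, 57, 63, 69, 77, 85, 91, 90, 86, 75, 62, 55],
}

def dry_season_tmax(months):
    """Mean of the March-October values (list indices 2 through 9)."""
    return sum(months[2:10]) / 8

for year in sorted(monthly_tmax):
    print(year, round(dry_season_tmax(monthly_tmax[year]), 1))
```

A multi-decade trend fitted through these yearly values for a given station, rather than a global average, is what determines whether local fire-season maximum temperatures have actually warmed.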

Here are some relevant facts (from the Western Regional Climate Center). Trust the scientific evidence:

1) The August 2013 Rim Fire, centered around Yosemite National Park, was California’s 5th largest fire.

2) The November 2018 Camp Fire was California’s deadliest fire, destroying the town of Paradise. It was also its 16th largest fire.

3) The 2018 Mendocino Complex Fire was California’s largest fire (since 1932, excluding 2020).

4) In the October 2017 wine country fires, the Tubbs Fire was the 4th deadliest. It only burned 37,000 acres but high winds drove embers into the dwellings of the heavily populated outskirts of Santa Rosa.

Governor Newsom ignores the data, disgustingly hijacking the tragedy of California’s fires to push his climate change agenda. But he is not alone. There are climate scientists pushing catastrophes by ignoring the local maximum temperature trends. Bad analyses promote bad policies and obscure what needs to be done regarding fuel management and creating defensible spaces in fire-prone California. Newsom must focus on fuel management and fire suppression. As fire ecologist Thomas Swetnam, echoing the experts’ growing consensus against fire suppression, wrote: “The paradox of fire management in conifer forests is that, if in the short term we are effective at reducing fire occurrence below a certain level, then sooner or later catastrophically destructive wildfires will occur. Even the most efficient and technologically advanced firefighting efforts can only forestall this inevitable result.”

Further information about California’s wildfires is available in:

Why Worse Wildfires – part 1

Why Worse Wildfires?  Part 2

Minimizing California Wildfires

Wildfires: Separating Demagoguery from the Science

How Bad Science & Horrific Journalism Misrepresent Wildfires and Climate


Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism

September 15, 2020 Posted by | Deception, Science and Pseudo-Science, Timeless or most popular