Rumble Royale: Can the U.S. Grid Work With 100% Renewables?

Four Days in 2055: Dynamic heat and power supply in the mid-century wind, water and sunlight-fueled Continental U.S. simulated by Stanford’s Mark Jacobson. Credit: ASU/PNAS

A battle royale between competing visions for the future of energy blew open today on the pages of a venerable science journal. The conflict pits 21 climate and power system experts against Stanford University civil and environmental engineer Mark Jacobson and his vision of a world fueled 100 percent by renewable solar, wind, and hydroelectric energy. The criticism of his “wind, water and sun” solution and an unapologetic rebuttal from Jacobson and three Stanford colleagues appear today in the Proceedings of the National Academy of Sciences (PNAS).

The critics enumerate what they view as invalid modeling tools, modeling errors, and “implausible and inadequately supported assumptions” in a projection of the mid-century U.S. energy supply that Jacobson and his coauthors published in PNAS in 2015. “The scenarios of [that paper] can, at best, be described as a poorly executed exploration of an interesting hypothesis,” write the experts, led by Christopher Clack, CEO of power grid modeling firm Vibrant Clean Energy.

Clack says their primary goal is accurate science, the better to equip policymakers for critical decisions: “We’re trying to be scientific about the process and honest about how difficult it could be to move forward.”

The text and statements by Clack’s coauthors question Jacobson’s evaluation of competing energy technologies, and specifically his rejection of two non-renewable energy options: fossil fuel power plants equipped to capture their own carbon dioxide pollution and nuclear reactors.

Jacobson calls Clack’s attack, “the most egregious case of scientific fraud I have encountered in the literature to date.”

In fact, while both sides claim to be objectively weighing the energy options, the arguments and backgrounds of the protagonists belie well-informed affinities for various energy sources (and informed biases against others). As sociologists of science would say, their choice of data and their reading of it reflect hunches, values, and priorities.

Consider Clack’s coauthor Ken Caldeira, a climate scientist at the Carnegie Institution for Science. Caldeira’s press release broadcasting their critique argues that removing carbon dioxide from the U.S. power supply is a massive job demanding the biggest tool box possible: “When you call a plumber to fix a leak, you want her to arrive with a full toolbox and not leave most of her tools at home,” says Caldeira.

The same document then abandons this technology-agnostic tone to call out nuclear energy and carbon capture as technologies that “solving the climate problem will depend on.” And Caldeira has appealed for deploying a new generation of nuclear reactors which he and other nuclear boosters such as former NASA scientist Jim Hansen say are needed because renewables “cannot scale up fast enough.”

They could be right. Then again, the expert sources they cite, such as the International Energy Agency, have consistently underestimated renewable energy growth. And much the same scale-up critique has been cogently argued against nuclear energy and carbon capture and storage (CCS).

Jacobson makes some powerful arguments for walking away from those technologies in his PNAS papers. Nuclear liabilities cited by Jacobson include the threat of future Fukushima-like disasters, nuclear weapons proliferation facilitated by large-scale uranium enrichment, and financial risks such as those that recently bankrupted Westinghouse. And, as he notes in his rebuttal, the Intergovernmental Panel on Climate Change has determined there is “robust evidence” and “high agreement” among experts validating these nuclear risks.

Jacobson’s rejection of CCS technology, meanwhile, may provide deeper insight on what makes him a magnet for academic attacks.

The main thing that soured Jacobson on CCS was his own pioneering work on the climate change impacts of black carbon, or soot. Fossil fuel plants that capture most of their CO2 still release soot that’s both a public health menace and an agent of climate change. In a 2001 paper in Nature on simulations of soot particles in the atmosphere, he controversially argued that soot in the air and on blackened snow and ice fields absorbs enough heat to make its climate impact second only to CO2. Sixteen years on, that view now enjoys strong support from the science community.

Jacobson subsequently turned his science on black carbon into political activism, helping to convince California regulators to deny low-carbon incentives to so-called “clean diesel” technology. A decade later that tech is in disarray—sullied by scandals over fraudulent tailpipe cleanup claims. European automakers long reliant on diesel technology are now abandoning it in favor of hybrids, as IEEE Spectrum reported in April.

His style, however, has not necessarily made him popular among climate scientists. Drew Shindell, a climate modeler at Duke University, describes Jacobson as a brilliant scientist with an unusual go-it-alone style that alienates some researchers.

Whereas most climate models are honed by large teams, composed of hundreds of scientists, Jacobson single-handedly constructed the GATOR-GCMM climate model that underpins his work—including the 2001 Nature and 2015 PNAS reports. Shindell says other climate models are also shared more freely and subject to much more independent validation than Jacobson’s. So when he claims that “his model is more complex and therefore better,” says Shindell, it “rubs people the wrong way.”

To Shindell, however, Jacobson’s outspokenness and solo style do not invalidate his work. In fact, he argues that raising important questions is Jacobson’s greatest contribution: “His work does prompt people to really look closely. That’s also a service to the community.”

Could something similar be playing out now in PNAS? Power experts can bristle at outsiders who offer novel approaches to their problems. In 2004 Spectrum profiled mathematical modelers—a power engineer and a pair of plasma physicists—who were drawing heavy fire for the unorthodox prediction that big blackouts are inevitable. That trio (Ian Dobson, Benjamin Carreras, and David Newman) endured years of derision and saw their research funding pulled. They ultimately triumphed by all but predicting the Southwest blackout of 2011, a story I documented for Discover magazine last year.

Jacobson and his PNAS coauthors similarly crafted their own unorthodox energy model—LOADMATCH—which sets a bold vision. While their critics mostly model power grids, Jacobson’s team used LOADMATCH to map out a 100-percent-renewable route to meeting all major energy needs in the Continental United States. The model replaces fossil fuels for heating and transportation with renewably-generated hydrogen and electricity, thereby tackling nearly all greenhouse emissions from fossil fuels rather than just the 35 percent from power plants.

Mark Jacobson. Photo: L.A. Cicero

The weakest point in Jacobson’s 2015 paper identified by his critics is a heavy reliance on hydropower plants, which serve as his simulated power grid’s backstop energy supply during long periods of weak sun and becalmed winds. This jumps out in the graph [above], which simulates total continental U.S. heat and power generation over four days in January 2055. Hydro turbines ramp up heavily each day after the sun sets, delivering as much as 1,300 gigawatts at their peak—a level that implies a 15-fold expansion in hydropower generating capacity.

Jacobson says the LOADMATCH code adds turbines to existing hydropower dams as required to prevent power outages. The reservoirs are untouched, and thus store the same amount of energy. But that energy is concentrated into those hours of the year when no other power source is available.
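The arithmetic behind the critics’ alarm, and behind Jacobson’s defense, can be sketched in a few lines. Assuming roughly 87 GW of existing U.S. conventional hydro capacity and roughly 280 TWh of annual hydro generation (both figures are back-of-envelope assumptions, not taken from either paper), the 1,300 GW peak implies the ~15-fold turbine expansion the critics flag, while the fixed reservoir energy means those turbines could run flat out for only a couple hundred hours a year:

```python
# Back-of-envelope check of the hydropower figures discussed above.
# Assumed inputs (NOT from the 2015 paper): ~87 GW of existing U.S.
# conventional hydro capacity and ~280 TWh of annual hydro generation.
existing_hydro_gw = 87
annual_hydro_twh = 280
peak_discharge_gw = 1300   # simulated peak output in the 2055 run

# Implied expansion of turbine capacity at existing dams.
expansion_factor = peak_discharge_gw / existing_hydro_gw
print(f"Implied turbine expansion: ~{expansion_factor:.0f}-fold")

# Energy is conserved, so full-power discharge is limited to a few
# hundred hours per year -- the "concentration" Jacobson describes.
hours_at_peak = annual_hydro_twh * 1000 / peak_discharge_gw
print(f"Hours per year at full discharge: ~{hours_at_peak:.0f}")
```

Run with these assumed inputs, the sketch yields roughly a 15-fold expansion and about 215 hours per year of full-power discharge, consistent with Jacobson’s framing of hydro as a rarely used backstop.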

However, no such expansion is documented in the 2015 paper. Critic-in-chief Christopher Clack argues that it is a modeling error because, according to his analysis, adding the required turbines at existing dams is not physically possible. And even if it were, he says, discharging the hydropower as described would impose unacceptable impacts on aquatic ecosystems and downstream water users. Invalidating that option, says Clack, means Jacobson’s scheme will cause blackouts: “The whole system breaks down.”

Jacobson admits the 2015 paper was “vague” on the hydropower upgrade but stands by its technical and economic viability. The environmental impacts, he says, reflect a cost that policymakers pursuing his roadmap would need to consider. All clean energy solutions will require tradeoffs, says Jacobson, noting that the low-carbon grid projection that made Clack’s reputation, a 2016 report in Nature Climate Change, calls for a much larger build-out of unpopular powerlines.

Jacobson also has alternative sources of backup power, such as adding turbines to more rapidly convert stored heat from solar thermal power plants into electricity. “We could increase [their] discharge rate by a factor of 3.5 to obtain the same maximum discharge rate of hydro, without increasing the [solar thermal plants’] mirror sizes or storage,” he says.
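Taken together with the 1,300 GW hydro peak quoted earlier, the 3.5x figure implies a baseline discharge capacity for the simulated solar thermal fleet, a quick inference from the two numbers in the text rather than a figure either side states directly:

```python
# Inferring the implied baseline discharge capacity of the simulated
# solar thermal fleet from the two figures quoted in the post.
hydro_peak_gw = 1300   # hydro's simulated maximum discharge
boost_factor = 3.5     # Jacobson's proposed increase in discharge rate

csp_baseline_gw = hydro_peak_gw / boost_factor
print(f"Implied solar thermal discharge capacity: ~{csp_baseline_gw:.0f} GW")
```

That works out to roughly 370 GW of solar thermal discharge capacity already present in the simulation, which is why a 3.5x boost suffices to stand in for the hydro backstop.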

If you’re wondering where battery storage figures in all of this, it is yet another option—though one that both Jacobson and Clack deemed unnecessarily costly in their respective studies. Clack’s 2016 projections relied mostly on flexible natural gas power plants rather than dams or batteries to handle residual power demand—delivering a 78 percent reduction of power sector carbon emissions from 1990 levels by 2030 [see map below].

What is certain, from the darkening findings of climate science, is that climate change calls for a bold remake of the global energy system of the sort that both Clack and Jacobson have championed. Their respective visions certainly appear to have more in common than ever as the Trump Administration seeks to turn back the clock on grid engineering.

The U.S. power sector is bracing for the release of a power grid study ordered by President Trump on whether renewable energy installations degrade grid reliability by undermining continuously operated “baseload” nuclear and coal power plants. U.S. Energy Secretary Rick Perry’s memo commissioning the study states as fact that “baseload power is necessary to a well-functioning electric grid.”

Last week, the Rocky Mountain Institute’s Mark Dyson and Amory Lovins called out that “curious claim,” which they say, “has been thoroughly disproven by a diverse community of utilities, system operators, economists, and other experts that moved on from this topic years ago.” What the grid needs, they write, is flexibility, not baseload power plants.

The power industry is already moving in that direction. For example, flexibility was California utility PG&E’s central argument last year when it announced plans to shut down the Diablo Canyon nuclear plant when its reactor licenses expire in 2024 and 2025. The baseload plant was ill-suited, PG&E said, to help it manage the increasingly dynamic power flowing on California’s grid.

Dyson and Lovins’ prescription for the power grid community, meanwhile, is unity. As they titled last week’s post: “The grid needs a symphony, not a shouting match.”

Chris Clack’s 2016 U.S. power sector simulation slashes CO2 emissions 78% using a DC supergrid and gas-fired power to balance renewable energy. Source: Nature Climate Change

This post was created for Energywise, IEEE Spectrum’s blog about the future of energy, climate, and the smart grid

Clim’ City Animates the Climate Challenge

French science center Cap Sciences takes Flash-based learning to new heights in a free online game launched this week: Clim’ City (click Le Jeu to play). At present this climate change adventure is for those of you who read French or have set learning to do so as a New Year’s goal. But here’s hoping that Bordeaux-based Cap Sciences gets an English version out quickly, because this educational game is a beautifully crafted and ingeniously programmed device for learning the contingencies and costs that lie ahead on the road to a low-carbon energy future.

The action in Clim’ City takes place on a small map of an imaginary town animated by commuters driving here and there and all manner of agricultural, industrial and even entertainment operations (including a ski hill) energetically going about their business. The goal is to reduce the “Clim’s” carbon footprint and thus avert the town’s demise by tweaking the way its actors produce and consume energy.

Playing such games turns information into knowledge. According to the international Association of Science-Technology Centers’ program International Action on Global Warming, the gamers at Cap Sciences hope players will do some informed pondering of such questions as:

Why is global climate change accelerating? What kind of climate can we expect by the year 2100? What human activities contribute most to the emission of greenhouse gases? And how is it possible to reduce these emissions?

In my first stab at Clim’ City I have converted the town’s carbon-belching coal-fired power plant to biomass. To do so I was forced to first launch a forest management program, which really brought home the fact that collecting biomass to generate a meaningful amount of energy is, in itself, a substantial and complex task.

My power plant conversion also came with an opportunity cost, drawing down my limited supply of government, corporate, and individual action points. In the words of International Action on Global Warming, this was a powerful reminder of the “sociopolitical constraints” facing decision makers today.

I’m not deep enough into Clim’ City to know whether my town is going to make it. If accounts in the French press and blogosphere are to be believed, there’s a good chance it won’t. This is a tough game and failure appears likely — at least early on — which imparts a healthy dose of realism.

But even if the Clim’s get cooked under my leadership, I can’t help learning.


This post was created for the Technology Review guest blog: Insights, opinions and analysis of the latest in emerging technologies

Nukes, Gas, Oil and Coal All Losers in EU Energy Strategy

The European Commission issued its Strategic Energy Review yesterday, proposing energy efficiency investments, a shift to alternative fuel vehicles to end oil dependence in transport, and more aggressive deployment of renewable energy and carbon capture and storage to “decarbonise” the EU electricity supply. Figuring prominently among its first six “priorities essential for the EU’s energy security” are the North Sea offshore electric power supergrid that Energywise covered in September and the Mediterranean Ring electric interconnection of Europe and North Africa that I’ve been harping on this week.

The EC energy strategy not only endorses the MedRing, but views it as a component of a future supergrid traversing Europe and stretching beyond the Mediterranean to Iraq, the Middle East and Sub-Saharan Africa.

How would this new vision (and $100/barrel oil) alter the complexion of European energy consumption? The energy review projects that total energy demand drops from 1811 million metric tons of oil equivalent (MTOE) in 2005 to 1672 MTOE in 2020. Demand met by renewables such as wind, solar and hydro more than doubles in real terms from 123 to 274 MTOE, while their share of total demand leaps from 6.8% to 16.4%. Imported renewables – with the MedRing delivering North African wind and solar power – jump 10-fold from 0.8% in 2005 to 8.8% in 2020.
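The quoted percentages follow directly from the MTOE figures, as a quick consistency check confirms:

```python
# Consistency check of the Strategic Energy Review figures quoted above.
demand_2005, demand_2020 = 1811, 1672        # total demand, MTOE
renewables_2005, renewables_2020 = 123, 274  # renewables, MTOE

share_2005 = 100 * renewables_2005 / demand_2005
share_2020 = 100 * renewables_2020 / demand_2020
print(f"Renewables share of demand: {share_2005:.1f}% -> {share_2020:.1f}%")
# Matches the 6.8% and 16.4% shares cited in the review.
```

Note that the share more than doubles twice over (6.8% to 16.4%) even though output only a bit more than doubles, because total demand is projected to shrink at the same time.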

Oil, gas, coal and nuclear, meanwhile, all see a diminished role, both in real terms and as a share of European energy demand. Interestingly the role of natural gas – the low-carbon fossil fuel – drops the most, from 25% to 21%, reflecting EU concern over dependence on gas imports from Russia. Nuclear’s share drops the least, from just slightly over to slightly under 14% of demand; this assumes that nuclear phaseout plans, particularly Germany’s, are followed through. 

How to make it all come true? Accompanying the EC review is a ‘green paper’ (the EU’s unbleached alternative terminology for what we’d call a ‘white paper’) outlining a variety of new regulatory and financial mechanisms. The EU is already a world leader in terms of incentives for lower carbon energy with strong price supports for solar and wind and a carbon cap and trade program up and running (though still lacking teeth as my Energywise colleague Bill Sweet notes). However, the energy review warns that the primarily national-level financing that drives energy projects today is inadequate to drive infrastructure that is pan-European or larger. A perfect example is the massive investment in high-voltage dc lines needed to turn the MedRing into a bulk power mover (see the second half of our feature on MedRing: “Closing the Circuit”).

Even less viable under existing financing mechanisms are those projects that entail considerable “non-commercial risks” such as threats of political instability or terrorism. Did someone say North Africa?


This post was created for Energywise, IEEE Spectrum’s blog on green power, cars and climate

CO2’s Bottom Line Just Keeps On Rising

Climate change skeptics obsess about the immense uncertainties that plague climate modeling. It’s not, however, all a matter of mysterious physics and chemistry. Human behavior — in this case our boundless capacity to ignore grave danger — poses the greatest challenge. No amount of scientific progress nailing down the links that regulate Earth’s climate will enable certain projections of climate change over the next century, because it is human behavior that controls the most powerful element: emissions of greenhouse gases such as CO2 and methane.

Today scientists with the international Global Carbon Project are releasing an updated accounting of CO2 emissions, and they far exceed the best guesses of human behavior by the Intergovernmental Panel on Climate Change (IPCC). “Emissions in 2007 were at the high end of those used for climate projections in the last [IPCC] report,” says Global Carbon Project participant Corinne Le Quéré, an environmental chemist at the University of East Anglia.

Emissions from burning fossil fuels and cement manufacturing — the largest sources of anthropogenic CO2 — continue to increase rapidly (see graph above); by 2007 they were 38% higher than in 1990. In total, emissions drove up the atmospheric concentration of CO2 by 2.2 parts per million last year, compared with 1.8 ppm in 2006. At the end of 2007 CO2 stood at 383 ppm — the highest concentration of the last 650,000 years and probably of the last 20 million years, according to the Global Carbon Project.
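To put the 2.2 ppm rise in mass terms, the standard conversion (an assumption supplied here, not stated in the post) is that 1 ppm of atmospheric CO2 corresponds to about 2.13 gigatonnes of carbon, or roughly 7.8 Gt of CO2:

```python
# Rough conversion of the 2007 atmospheric CO2 rise into mass terms.
# Assumed conversion factor: 1 ppm ~ 2.13 GtC ~ 7.8 Gt CO2.
GT_CO2_PER_PPM = 7.8
rise_2007_ppm = 2.2

added_gt_co2 = rise_2007_ppm * GT_CO2_PER_PPM
print(f"~{added_gt_co2:.0f} Gt of CO2 retained in the atmosphere in 2007")
```

Under that assumption, roughly 17 Gt of CO2 stayed in the atmosphere in 2007 alone, with the remainder of emissions taken up by the land and ocean sinks discussed below.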

At the same time the oceans, which in the past acted as a buffer absorbing excess CO2, are saturating. Le Quéré, who coauthored a report last year in Science finding that the Southern Ocean had already saturated, calls it a dangerous combination: “If this trend continues and the natural sinks weaken, we are on track towards the highest projections of climate change.”


Buckle Down: Climate solutions won’t come at the push of a button

The message of the day seems to be Get real: Reversing, stopping or even just slowing climate change is going to take a little more effort than some newcomers to the problem hoped.

Kicking it off was the report one week ago that atmospheric concentrations of CO2 grew 35% faster than expected since 2000. Climate trackers at the University of East Anglia estimate that Earth’s plants and oceans scrubbed 18% less CO2 from the air, largely due to stronger winds in Antarctica’s Southern Ocean that bring deep CO2 up and thus prevent more CO2 from being absorbed. Greater than expected CO2 releases from the boom in coal-fired power generation and a dearth of technological advances contributed the balance of the CO2 speed-up.

I’m throwing a little more warm water on our melting optimism today with a story on the newly relaunched web portal MSN Green. “Does Daylight Saving Time Really Save Money?” is really an accounting of the energy savings Congress promised when it extended DST by three weeks in March and one week in the fall (this week in fact). When Congress passed the Energy Policy Act of 2005, it predicted that springing forward earlier and falling back later would trim the use of lighting, saving the equivalent of 100,000 barrels of oil per day. Extending DST probably did just the opposite, boosting energy use by giving us all more time to consume. 

The shortfall is grave given that extended DST was one of the only energy efficiency measures in the 2005 law, which focused mostly on boosting fossil fuel production and nuclear power. Remember Vice President Dick Cheney’s famous comment dissing energy efficiency as a “lifestyle choice”? Well, Congress went along for the ride.

The take home message is not that a smarter, climate-friendly energy system is impossible. Rather, we need to get beyond hopeful quick fixes such as DST and begin implementing the real solutions that those East Anglia researchers found to be under-exploited — from true energy efficiency measures such as hybrid vehicles to renewable energy sources such as wind and solar power, and even smarter use of fossil fuels whereby CO2 is captured and stored away underground. Let’s get real.
