Micro-Satellite Tracks Carbon Polluters From Space


Simulated satellite image of a methane plume from French Guiana’s Petit Saut hydroelectric power plant. Image: GHGSat

Attention greenhouse gas emitters: There’s a new eye in the sky that will soon be photographing your carbon footprint and selling the images to any and all. It’s a micro-satellite dubbed “Claire” (clear, bright, and clean in French) by its Montreal-based developer, GHGSat.

This microwave-oven-sized pollution paparazzo rocketed to a 512-kilometer-high orbit in mid-June courtesy of the Indian Space Research Organisation, with a mission to remotely measure the plumes of carbon dioxide and methane wafting up from myriad sources on Earth’s surface. Claire’s targets include power plants, natural gas fracking fields, rice paddies, and much more—just about any emissions source that someone with a checkbook (corporations, regulators, activists) wants tracked, according to GHGSat president Stéphane Germain.

Germain says Claire’s data can improve compliance reporting to regulators and carbon markets, enable tracking of industrial efficiency, and provide competitive intelligence, among other uses. “Our vision is to be the global standard for emissions monitoring across the world. That’s ambitious, but we think it’s attainable,” he says.

Space agencies already monitor emissions from orbit. They have launched a series of satellites such as NASA’s OCO-2 to specifically track atmospheric CO2 and methane, and those missions are delivering an important reality check on national pollution inventories, which are based largely on engineering estimates.

Both satellite-based and ground-based research have identified vast undercounting of methane emissions by the U.S. EPA, prompting the agency to revise its inventories earlier this year.

However, OCO-2 and other satellites such as Europe’s Tropomi and Japan’s Ibuki were designed primarily to generate data for climate models, and thus emphasize a wide field of view and exquisite precision in measuring greenhouse gas levels at the expense of spatial resolution. Germain says Claire is designed for a totally different mission.

“We have deliberately limited the scope of the physical field of view. We know where the emissions are coming from. All we need to do is point at known sites and characterize the plume from the facilities,” says Germain.

Like its predecessors, Claire uses an infrared spectrometer to detect the telltale patterns of sunlight absorption that indicate levels of atmospheric CO2 and methane. Unlike its predecessors, which use “push-broom” spectrometers that build up a 2-D spectral map strip by strip as the satellite moves along its track, Claire has an imaging spectrometer that takes a series of 2-D spectral snapshots as the micro-satellite passes overhead.

Telescopic lensing focuses Claire’s spectrometer on a 12-kilometer-by-12-kilometer field of view. With its 0.25-megapixel detector, each pixel thus represents a plot of ground roughly 25 meters on a side—a spatial resolution two orders of magnitude finer than OCO-2’s.
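The arithmetic behind that figure is straightforward. A minimal sketch, assuming a square detector array (GHGSat’s exact detector geometry isn’t given in the article):

```python
# Back-of-the-envelope check of Claire's ground resolution, using the
# article's numbers. Assumes a square 0.25-megapixel array; the actual
# detector layout is an assumption for illustration.
field_of_view_m = 12_000                 # 12-km field of view on each side
pixels_total = 0.25e6                    # 0.25-megapixel detector
pixels_per_side = pixels_total ** 0.5    # square array: 500 x 500
ground_sample_m = field_of_view_m / pixels_per_side

print(f"{ground_sample_m:.0f} m per pixel")  # -> 24 m, consistent with the ~25 m figure
```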

GHGSat also parted ways with its predecessors on cooling. Cryogenic cooling of spectrometers minimizes background noise for missions such as OCO-2. Claire does without cooling to minimize complexity and weight, helping keep total project cost including launch below C$10 million (US $7.7 million). Signal-to-noise ratio is maintained, says Germain, by combining multiple snapshots and filtering out noise.
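Stacking frames to beat down noise is a standard trick. A minimal sketch of the principle—not GHGSat’s actual (unpublished) pipeline—shows why averaging N independent snapshots buys roughly a √N gain in signal-to-noise ratio:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative only: a constant "scene" plus uncorrelated sensor noise,
# as an uncooled detector might produce. All numbers are hypothetical.
true_scene = np.full((500, 500), 100.0)   # constant signal level
noise_sigma = 20.0
n_frames = 16

frames = true_scene + rng.normal(0.0, noise_sigma, size=(n_frames, 500, 500))

# Averaging the 16 frames shrinks the noise standard deviation by ~sqrt(16) = 4.
snr_single = true_scene.mean() / frames[0].std()
snr_stacked = true_scene.mean() / frames.mean(axis=0).std()

print(f"single frame SNR ~ {snr_single:.1f}")    # ~5
print(f"16-frame stack SNR ~ {snr_stacked:.1f}") # ~20, a ~4x improvement
```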


Claire. Photo: GHGSat Inc.

Further data processing turns Claire’s composite images of CO2 and methane concentrations above a given site into an estimate of the rate at which gases are flowing from their source. GHGSat will do this by factoring in data on wind speed and direction, using the same “inverse modeling” proven by OCO-2 and Tropomi.
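A heavily simplified version of that idea is the mass-balance estimate: integrate the excess gas crossing a transect downwind of the source, then multiply by wind speed. The sketch below uses entirely hypothetical numbers, and real retrievals must also contend with wind shear, plume geometry, and measurement noise:

```python
# Simplified mass-balance flux estimate (an illustration, not GHGSat's
# actual inversion). All enhancement and wind values are hypothetical.
pixel_width_m = 25.0              # Claire's ground sample, per the article
excess_column_kg_m2 = [2e-6, 5e-6, 8e-6, 5e-6, 2e-6]  # CH4 enhancement along a downwind transect
wind_speed_m_s = 4.0              # local wind speed at plume height

# Integrate the enhancement across the transect (kg per meter of along-wind
# distance), then multiply by wind speed to get an emission rate.
crossing_mass_kg_m = sum(excess_column_kg_m2) * pixel_width_m
flux_kg_s = crossing_mass_kg_m * wind_speed_m_s

print(f"estimated source rate: {flux_kg_s * 1e3:.1f} g/s")  # -> 2.2 g/s
```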

Some pre-launch coverage of GHGSat was skeptical of the firm’s ability to deliver. InsideClimate News wrote last month that most scientists it contacted questioned the readiness of both remote sensing technology and inverse modeling to deliver reliable emissions estimates.

Harvard University remote sensing expert Daniel Jacob, a GHGSat collaborator, strikes a more optimistic tone. He told IEEE Spectrum that if Claire’s spectrometer can deliver the +/- 1 to 5 percent precision that GHGSat is predicting, “the ability to detect point sources will be impressive.”

GHGSat expects to complete testing of Claire’s systems this month and to begin making observations for initial customers within a month or two. Up first are hydropower reservoirs in Québec and Manitoba and large tailings ponds associated with oil sands mining in Alberta. Estimating emissions from the latter is currently dangerous, costly, and fraught with uncertainty levels in excess of 50 percent. Ground measurements taken simultaneously at those sites will validate the satellite’s performance.

Plans are already afoot, meanwhile, to launch a second satellite in 12 to 18 months, both to ensure continuity of observations should Claire malfunction and to expand GHGSat’s customer capacity. (Claire will be able to monitor over 1,000 sites per year.)

More satellites will also enable more frequent tracking of a given site. While Claire orbits Earth every 90 minutes, its route takes it within range of the same site just once every two weeks. That could make data from one or a few satellites hard to interpret for dynamic operations, such as power plants that ramp up and down hourly or daily or natural gas compressor stations and fracking wells that sporadically belch large volumes of methane.

Germain says that, with sufficient demand, GHGSat may ultimately launch a fleet large enough to meet even those challenges: “It’s well within the reach of our business case to get to daily measurements, or even to 2 to 3 times per day.”

This post was created for Energywise, IEEE Spectrum’s blog about the future of energy, climate, and the smart grid.

How the Paris Climate Deal Happened and Why It Matters

One month after the terror attacks that traumatized Paris, the city has produced a climate agreement that is being hailed as a massive expression of hope. On Monday the U.K. Guardian dubbed the Paris Agreement “the world’s greatest diplomatic success.” Distant observers may be tempted to discount such effusive language as hyperbole, yet there are reasons to be optimistic that last weekend’s climate deal finally sets the world on course toward decisive mutual action against global climate change.

The birthing process clearly sets Paris apart from earlier efforts at global climate action, such as the Kyoto Protocol crafted in 1997. Only last-minute intervention by then-U.S. Vice President Al Gore clinched a deal at Kyoto.

Paris Climate Talks Facing Growing Carbon Emissions and Credibility Gaps

Credit: Peter Fairley

EN GARDE! Paris treaty pledges are still too rich, and contain some iffy ingredients

Three weeks before the start of the Paris climate talks, negotiators working to craft an international agreement to curb rising global greenhouse gas emissions are staring into a wide gulf between what countries are willing to do and what they need to do. Most countries have stepped up with pledges to meaningfully cut carbon emissions, or at least to slow the growth of their emissions between 2020 and 2030. However, national commitments for the Paris talks still fall short of what’s needed to keep the average global temperature in 2100 from rising more than 2 degrees Celsius above preindustrial levels—the international community’s consensus benchmark for avoiding dangerous climate impacts.

Worse still, the national pledges employ a hodgepodge of accounting methods with significant loopholes, ignoring important emissions such as leaking methane from U.S. oil and gas production and underreported coal emissions from China. How the promised emissions reductions will be verified post-Paris is “a big debate right now and it makes a massive difference in the numbers,” says Jennifer Morgan, global director for the climate program at the World Resources Institute (WRI), a Washington, D.C.-based non-governmental organization.

EPA Coal Cuts Light Up Washington

A meeting at the U.S. Federal Energy Regulatory Commission’s (FERC’s) Washington headquarters yesterday lived up to expectations that it would be one of the most exciting sessions in the agency’s history. Buttoned-up policy wonks, lobbyists, and power market experts showed up in droves—over 600 registered—to witness a discussion of what President Obama’s coal-cutting Clean Power Plan presaged for the U.S. power grid. The beltway crowd was joined by activists for and against fossil fuels—and extra security.

Inside the proceedings, which examined the Environmental Protection Agency (EPA) plan’s impact on power grid reliability, protesters against fracking and liquefied natural gas exports shouted “NATURAL GAS IS DIRTY” each time a speaker mentioned coal’s fossil fuel nemesis. Outside, the coal industry-backed American Coalition for Clean Coal Electricity distributed both free hand-warmers and dark warnings that dumping coal-fired power would leave Americans “cold in the dark.”

As expected, state regulators and utility executives from coal-reliant states such as Arizona and Michigan hammered home the ‘Cold in the Dark’ message in their exchanges with FERC’s commissioners. Gerry Anderson, Chairman and CEO of Detroit-based utility DTE Energy, called the Clean Power Plan “the most fundamental transformation of our bulk power system that we’ve ever undertaken.”

EPA’s critics argue that the plan’s timing is unrealistic and its compliance options are inadequate. Anderson said Michigan will need to shut down, by 2020, roughly 40 percent of the coal-fired generation that currently provides half of the state’s power. That, he said, “borders on unachievable and would certainly be ill-advised from a reliability perspective.”

EPA’s top air pollution official, Janet McCabe, defended her agency’s record and its respect for the grid. “Over EPA’s long history developing Clean Air Act standards, the agency has consistently treated electric system reliability as absolutely critical. In more than 40 years, at no time has compliance with the Clean Air Act resulted in reliability problems,” said McCabe.

McCabe assured FERC that EPA had carefully crafted its plan to provide flexibility to states and utilities regarding how they cut emissions from coal-fired power generation, and how quickly they contribute to the rule’s overall goal of lowering power sector emissions by 30 percent by 2030 from 2005 levels. (Michigan has state-verified energy conservation and renewable energy options to comply with EPA’s plans, according to the Natural Resources Defense Council.)

McCabe said EPA is considering additional flexibility before it finalizes the rule, as early as June. EPA would consider, for example, specific proposals for a “reliability safety valve” to allow a coal plant to run longer than anticipated if delays in critical replacement projects—say, a natural gas pipeline or a transmission line delivering distant wind power—threatened grid security.

As it turned out, language codifying a reliability safety valve was on offer at yesterday’s meeting from Craig Glazer, VP for federal government policy at PJM Interconnection, the independent transmission grid operator for the Mid-Atlantic region. The language represents a consensus reached by regional system operators from across the country—one that is narrowly written and therefore unlikely to give coal interests much relief. “It can’t be a free pass,” said Glazer.

A loosely constrained valve, explained Glazer, would undermine investment in alternatives to coal-fired power, especially for developers of clean energy technologies. “Nobody’s going to make those investments because they won’t know when the crunch time really comes. It makes it very hard for these new technologies to jump in,” said Glazer.

Clean energy advocates at the meeting, and officials from states that, like California, are on the leading edge of renewable energy development, discounted the idea that additional flexibility would be needed to protect the grid. They pushed back against reports of impending blackouts from some grid operators and the North American Electric Reliability Corporation (NERC). Those reports, they say, ignored or discounted evidence that alternative energy sources can deliver the essential grid services currently provided by conventional power plants.

NERC’s initial assessment, issued in November, foresees rolling blackouts and increased potential for “wide-scale, uncontrolled outages,” and NERC CEO Gerald Cauley says a more detailed study due out in April will identify reliability “hotspots” caused by EPA’s plan. At the FERC meeting, Cauley acknowledged that “the technology is out there allowing solar and wind to be contributors to grid reliability,” but he complained that regulators were not requiring them to do so. Cauley called on FERC to help make that happen.

Cleantech supporters, however, are calling on the government to ensure that NERC recognizes and incorporates renewable energy’s full capabilities when it issues projections of future grid operations. They got a boost from FERC Commissioner Norman Bay. The former chief of enforcement at FERC and Obama’s designee to become FERC’s next chairman in April, Bay pressed Cauley on the issue yesterday.

Bay asked Cauley how he was going to ensure that NERC is more transparent, and wondered whether NERC would make public the underlying assumptions and models it will use to craft future reports. Cauley responded by acknowledging that NERC relied on forecasts provided by utilities, and worked with utility experts to “get ideas on trends and conclusions” when crafting its reliability studies.

Cauley also acknowledged that NERC’s reliability studies were not “entirely open and consensus based” the way its standards-development process was. And he demurred on how much more open the process could be, telling Bay, “I’ll have to get back to you on that.”

The challenge from Bay follows criticism leveled at NERC in a report issued last week by the Brattle Group, an energy analytics firm based in Boston. Brattle found that compliance with EPA’s plan was “unlikely to materially affect reliability.”

Brattle’s report concurred with renewables advocates who have argued that NERC got it wrong by focusing too much on the loss of coal-fired generation and too little on that which would replace it: “The changes required to comply with the CPP will not occur in a vacuum—rather, they will be met with careful consideration and a measured response by market regulators, operators, and participants. We find that in its review NERC fails to adequately account for the extent to which the potential reliability issues it raises are already being addressed or can be addressed through planning and operations processes as well as through technical advancements.”

This post was created for Energywise, IEEE Spectrum’s blog on green power, cars, and climate.

Will Shuttering Coal Plants Really Threaten the Grid?

Does President Obama’s plan to squelch carbon emissions from coal-fired power plants really threaten the stability of the grid? That politically charged question is scheduled for a high-profile airing today at a meeting in Washington, to be telecast live starting at 9 am ET from the Federal Energy Regulatory Commission (FERC).

Such “technical meetings” at FERC are usually pretty dry affairs. But this one could be unusually colorful, presenting starkly conflicting views of lower-carbon living, judging from written remarks submitted by panelists.

On one side are some state officials opposed to the EPA Clean Power Plan, which aims to cut U.S. power sector emissions 30 percent by 2030 from 2005 levels. Susan Bitter Smith, Arizona’s top public utilities regulator, argues that EPA’s plan is “seriously jeopardizing grid reliability.” Complying with it would, she writes, cause “irreparable disruption” to Arizona’s (coal-dependent) power system.

Environmental advocates and renewable energy interests will be hitting back, challenging the credibility of worrisome grid studies wielded by Bitter Smith and other EPA critics. Some come from organizations that are supposed to be neutral arbiters of grid operation, such as the standards-setting North American Electric Reliability Corporation (NERC). Clean energy advocates see evidence of bias and fear-mongering in these studies, and they are asking FERC to step in to assure the transparency and neutrality of future analyses.

Understanding the IPCC’s Devotion to Carbon Capture

I’ve delivered several dispatches on carbon capture and storage (CCS) recently, including a pictorial ‘how-it-works’ feature on the world’s first commercial CCS power plant posted this week by Technology Review and typeset for their January print issue. Two aspects of CCS technology and its potential applications bear further elaboration than was possible in that short text.

Most critical is a longer-term view on how capturing carbon dioxide pollution from power plants (and other industrial CO2 sources) can serve to reduce atmospheric carbon dioxide concentrations.

Can China Turn Carbon Capture into a Water Feature?


Water recovery concept for CCS at GreenGen. Source: LLNL

In an intriguing footnote to their historic climate deal this month, Chinese President Xi Jinping and U.S. President Barack Obama called for demonstration of a hitherto obscure tweak to carbon capture and storage (CCS) technology that could simultaneously increase its carbon storage capacity and reduce its thirst for water. Such an upgrade to CCS holds obvious attraction for China, which is the world’s top carbon polluter and also faces severe water deficits, especially in the coal-rich north and west.

As the Union of Concerned Scientists puts it on its blog, The Equation, “Cracking this nut … could be a huge issue for China.”