Climate

Why the $100 per ton target for carbon removal may be 'pure fantasy'

MIT Energy Initiative’s Howard Herzog explains why the number is unrealistic.

The Climeworks plant at the Hellisheidi power plant near Reykjavik, Iceland, seen on October 11, 2021. Containers similar to those used in maritime transport are stacked in pairs, 10 meters (33 feet) high; fans in front of each collector draw in ambient air and release it, largely stripped of CO2, through ventilators at the back.

Lowering the cost per ton for carbon dioxide removal is critical to ensuring the industry is economically viable.

Photo: Halldor Kolbeins/AFP via Getty Images

$100 per ton is the carbon dioxide removal industry’s standard-bearing metric. It’s the target identified by both Frontier’s well-respected advance purchase commitment and the Department of Energy’s Carbon Negative Shot for ensuring CDR is scalable.

Experts agree that we need to remove billions — possibly many billions — of tons of carbon dioxide from the atmosphere to have a decent shot at achieving net zero by midcentury. CDR at that scale would be enormously costly, so lowering the cost per ton is critical to ensuring the industry is economically viable.

The $100-per-ton target is what economists, policymakers, investors, and the industry itself generally agree makes CDR feasible at scale. Yet according to a survey of CDR stakeholders by CarbonPlan, there isn’t consensus on what that number means. Some view it as a break-even point for sellers, and others refer to it as a post-incentive price for buyers. But for the industry at large, it’s an accepted — and achievable — target.

Most techniques that reliably pull carbon from thin air currently cost much more than that. “$100 per ton is an extremely ambitious 10-year target, and likely more of a 15- to 20-year target,” Carbon180 senior visiting scholar Shuchi Talati told Protocol, adding that it’s “important to be ambitious” and “there’s a lot of momentum around CDR and getting these technologies to scale.”

Yet ambition and momentum may not be enough to reach that milestone, according to Howard Herzog, a senior research engineer with MIT’s Energy Initiative. He’s been studying carbon capture for over 30 years (even writing a book about it in 2018) and is more skeptical given the capital costs to build CDR plants and the enormous amount of energy they need to run. He sat down with Protocol to talk about why he thinks $100 per ton is “pure fantasy.”

This conversation has been edited for brevity and clarity.

Why do you think $100 per ton is an unrealistic target?

Carbon dioxide is so diluted in the air that in order to capture it, almost irrespective of what process you use, you’re going to have to push a lot of air through these machines, and that means a lot of capital costs and a lot of energy spent.

Estimates put the energy requirement at 1,200 kilowatt-hours per ton of carbon dioxide. The cost of electricity here where I live in Massachusetts is 20 cents a kilowatt-hour. In Europe, prices are pushing 40 cents. And this energy has to be carbon-free. Very few places have carbon-free electricity, but let’s say you can do it for 10 cents a kilowatt-hour, which I think is really stretching it — that’s $120 per ton of carbon dioxide.
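[Editor’s note: Herzog’s arithmetic is easy to check. A minimal sketch using only the figures he cites; the 1,200 kilowatt-hours-per-ton number is an estimate, not a measured constant.]

```python
# Energy cost of direct air capture, using the figures Herzog cites.
KWH_PER_TON = 1_200  # estimated electricity required to capture one ton of CO2

def energy_cost_per_ton(price_per_kwh: float) -> float:
    """Electricity cost, in dollars, per ton of CO2 captured."""
    return KWH_PER_TON * price_per_kwh

for label, price in [("Massachusetts", 0.20), ("Europe", 0.40),
                     ("optimistic carbon-free", 0.10)]:
    print(f"{label} ({price:.2f} $/kWh): ${energy_cost_per_ton(price):.0f}/ton")
# Massachusetts (0.20 $/kWh): $240/ton
# Europe (0.40 $/kWh): $480/ton
# optimistic carbon-free (0.10 $/kWh): $120/ton
```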

That’s before you even start including the capital cost, which is significant. You need large machines to push all that air through at a reasonable rate, and that makes for a large capital cost. Just looking at that, $100 or even $200 per ton doesn’t pass the smell test.
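[Editor’s note: capital cost is typically folded into a per-ton figure by annualizing the up-front cost with a capital recovery factor and dividing by annual output. A sketch of that mechanic; the capex, discount rate, and lifetime below are hypothetical illustrations, not figures from the interview.]

```python
def capital_recovery_factor(rate: float, years: int) -> float:
    """Fraction of an up-front cost owed each year to amortize it."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def capex_per_ton(capex: float, tons_per_year: float,
                  rate: float = 0.08, years: int = 20) -> float:
    """Annualized capital cost per ton of CO2 captured."""
    return capex * capital_recovery_factor(rate, years) / tons_per_year

# Hypothetical plant: $20 million up front, 4,000 tons captured per year.
print(f"${capex_per_ton(20e6, 4_000):.0f}/ton from capital alone")  # -> $509/ton
```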

So what do you think is a more realistic minimum cost for carbon dioxide removal?

Basic physics and engineering say there are some minimum requirements, and when you look at the most optimistic situation, my estimate for where we might be is $600 to $1,000 per ton by 2030.

Isn’t it possible to get capital costs down with scale?

There’s some truth to that, but it’s one thing to get it down by 40% or even cut it in half; getting it down by an order of magnitude is a whole other dimension.

There are also capital costs that go up with scale. There was trouble with the Climeworks installation in Iceland last winter because of the temperature. [Editor’s note: Herzog is referring to the company’s plant, which dealt with frozen machinery last winter. Climeworks head of climate policy Christoph Beuttler said it was a “very good example of how important it is to deploy now and to get the experience.”] We saw this in Texas when everything broke down in cold temperatures; nobody spent the money to weatherize the equipment, because weatherizing adds cost. You’re putting things out there that you want to run for at least 20 years, and to do that, you have to harden them to stand up to the elements. It’s one thing to build a small demonstration, but as these things mature, some things will raise costs and other things will lower them.

On the energy front, isn’t it possible to get costs down with the expansion of renewables?

Even if it’s several cents per kilowatt-hour, [renewables are] still intermittent, and you need this to run 24/7, which has a whole bunch of other costs. Say I just buy a lot of batteries so I can have this running all the time: That's going to cost more than the original wind farm in the first place. On the grid, there’s still the backup problem and the peak problems. When you start putting more and more renewables on the grid, these system costs become more important. So if renewables are only 5% of my energy, there’s not a lot of integration costs; they’ve been absorbed pretty well. When you start getting up to 30[%] to 50% renewables, these costs start becoming much more significant.
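[Editor’s note: a toy illustration of the firming problem Herzog describes. If part of the energy delivered to a 24/7 plant must be time-shifted through storage, the effective price per kilowatt-hour rises well above the raw generation price. All numbers here are hypothetical.]

```python
def firmed_price(gen_price: float, stored_fraction: float,
                 storage_premium: float) -> float:
    """Blended $/kWh when a fraction of delivered energy passes through storage."""
    direct = (1 - stored_fraction) * gen_price
    shifted = stored_fraction * (gen_price + storage_premium)
    return direct + shifted

# Hypothetical: wind at 3 cents/kWh, but half of what a 24/7 load consumes
# must be time-shifted through batteries at a 10 cents/kWh premium.
print(f"{firmed_price(0.03, 0.5, 0.10):.3f} $/kWh")  # -> 0.080 $/kWh
```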

Why do these machines have to operate 24/7?

Running these machines costs a lot of money. If I’m running 24/7 and capturing 1,000 tons of carbon a year, OK. If I’m only running half the time, capturing 500 tons a year, the dollar per ton just doubled. The capital cost is still the same. That’s the problem with all of these capital-intensive processes: You need them to operate a significant amount of the time. Usually you shoot for 85[%] to 90% of the time.
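[Editor’s note: his arithmetic, made explicit. The 1,000-ton plant is Herzog’s example; the annual capital cost is a hypothetical placeholder.]

```python
def capital_cost_per_ton(annual_capital_cost: float, nameplate_tons: float,
                         capacity_factor: float) -> float:
    """A fixed annual capital cost spread over actual, not nameplate, output."""
    return annual_capital_cost / (nameplate_tons * capacity_factor)

full = capital_cost_per_ton(100_000, 1_000, 1.0)  # runs 24/7: 1,000 tons/year
half = capital_cost_per_ton(100_000, 1_000, 0.5)  # runs half the time: 500 tons/year
print(full, half)  # 100.0 200.0 -- halving the runtime doubles the $/ton
```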

$100 per ton is just a target to aspire to. What’s wrong with that?

I just like to deal with facts. I think it’s disingenuous. If you’re really interested in solving climate change, you’ve got to level with people.

Estimates from the U.N. and other sources say that if we want to get to where we need to be, we may need to remove 10 billion tons of carbon dioxide a year by midcentury. Do you agree with that?

It’s at minimum a few billion tons a year, because there are certain sectors that are really hard to decarbonize. Part of that depends on how expensive you think these different sectors are to decarbonize, and then how expensive you think the offsets are going to be.

Say I want to decarbonize airplanes with biofuels and that costs me $700 per ton. If I can capture carbon from the air for $500 per ton, why not just keep emitting the carbon dioxide from the airplane and capture it from the air to offset it? It’s cheaper by $200 per ton. So that’s the driving force, and I would say that even direct air capture at $500 a ton will have a benefit. Offsets like DAC are going to be more cost-effective than anything that costs more than their price to abate directly. So you have to look at the whole system.
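[Editor’s note: the decision rule Herzog is describing, sketched out. The $700-per-ton aviation figure and the $500-per-ton DAC price are his; the other sector costs are hypothetical placeholders.]

```python
DAC_PRICE = 500  # $/ton, Herzog's illustrative direct air capture price

# Marginal cost, $/ton, to decarbonize each sector directly. The aviation
# figure is Herzog's example; the others are hypothetical placeholders.
abatement_costs = {"aviation (biofuels)": 700, "cement": 300, "steel": 450}

for sector, cost in abatement_costs.items():
    if cost > DAC_PRICE:
        print(f"{sector}: offset with DAC, saving ${cost - DAC_PRICE}/ton")
    else:
        print(f"{sector}: decarbonize directly, DAC offers no saving")
```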

It gets more complicated, because there are quite a few negative emissions technologies. DAC isn’t the only one. And none of them are unlimited in their application. If you have a technology that can give you unlimited carbon removal at $100 a ton, that’s nirvana. We’re done, we’ve solved that problem.

If $600 to $1,000 per ton is the likely cost of CDR, what role do you think it will play in getting to net zero?

The question is, will there be other, cheaper offsets than that? Every offset has problems. Offsets from bioenergy with carbon capture and storage are cheaper and much more doable. The big issue is the biomass feedstock: how much there is and what it will cost. Another option, one that I really like, is called liming the ocean. But politically, it’s a nightmare. Think about throwing a chemical in the middle of the ocean. Just think of the protests. But even today, most of the carbon dioxide we put into the atmosphere ends up in the ocean.

It’s very frustrating. When people think things are too easy, they won’t address the hard decisions, even though those hard decisions may end up with a better solution. At this point, I don’t know if liming the ocean is a great idea or not, but it has a lot of potential, and we have to look at things like that if we want to get to net zero. And people say capturing carbon dioxide from the air for $100 per ton will get us to net zero. But if it’s a fantasy, it’s not going to get us there.
