## A Modest Proposal To Save the World

2018-12-07

Warning: what follows are the hare-brained ideas of a mad (or at least fairly irate) scientist. Before undertaking any dangerous, untested geo-engineering project that has the potential to destroy life as we know it, you should consult with your physician, physicist, psychic, and/or psychiatrist.

I want to specifically emphasize that there are a lot of open questions and missing details in the ideas below. This essay is not meant as a solution (and definitely not as a magic bullet that removes the need for other solutions! If stupid conservatives glom onto this as a way to avoid responsibility I will be quite put out), but rather as an exhortation to undertake the research that can answer these questions while there is still time for the answers to be useful.

## Climate change is the biggest catastrophe in history

There are so many terrible things happening all at once, it's hard to keep track of all of them. Originally I had a long section here laying out how bad climate change is, but you probably know all of that already. (In case you managed to forget for a few seconds: the world is warming at an alarming rate. We just got the latest report on the danger, but there have been reports like that for a long time. It's hard to imagine the emissions picture changing in the near term, and we only have about a decade if we want to avoid the worst consequences. We need a Green New Deal pronto, but even if we can start that tomorrow we should be thinking of what else we can do to reduce the damage.) I'll keep it short and assume you're onboard with the title of this section.

The looming disaster scenarios and our collective failure to take action have led a lot of people to think about options that don't involve reaching a global agreement on energy conservation. Most of these options involve ways of getting CO2 out of the atmosphere and sticking it somewhere safe. Unfortunately, most of them are pretty lousy. The big problem is scale. Since the industrial revolution we've added about 1600 gigatons of carbon dioxide to the atmosphere, and we continue to dump more at a rate of >30 Gt CO2 per year (source: IEA).

If sequestration is going to work, it has to work at the multi-gigaton scale. I haven't heard of any sequestration plan that comes close to capturing that amount of carbon: the typical proof of concept is minuscule, and a realistic industrial application would only be a drop in the bucket compared to global emissions. In general, any energy-intensive method of sequestration runs into a green twist on the rocket equation: much of your work goes to offsetting the carbon you emitted to do the sequestration itself, and if you aren't extremely efficient this massively increases the total effort needed. (Another big strategy is to block some of the sunlight hitting us. I consider that plan slightly more crazy and outlandish than the one I'm going to outline below, and for what it's worth, it would likely be far more expensive.)
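To make the rocket-equation analogy concrete, here's a toy calculation (my own illustration, not taken from any particular sequestration proposal): if capturing a ton of CO2 emits a fraction f of a ton as overhead, the gross capture needed per net ton removed blows up as f approaches 1.

```python
def gross_effort_multiplier(f):
    """Tons of CO2 that must be captured per ton of *net* removal,
    when capturing a ton emits f tons of CO2 as overhead."""
    assert 0 <= f < 1, "overhead must be below 100%, or nothing nets out"
    # Net removal per captured ton is (1 - f), so gross effort is 1/(1 - f).
    return 1 / (1 - f)

for f in (0.1, 0.5, 0.9):
    print(f"overhead {f:.0%}: capture {gross_effort_multiplier(f):.1f} t per net ton")
```

At 10% overhead the penalty is mild (1.1x), but at 90% overhead you have to capture ten tons for every ton of net removal, which is the "massively increase the effort" regime.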

Furthermore, most technologies don't scale as long as humans are in the loop: if sequestering one ton of carbon requires one person to work for one hour, we'll need >15 million people working full-time to offset our current emissions, with no source of funding in sight. A practical sequestration strategy needs to be "too cheap to meter": so efficient that the cost per ton approaches zero.
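The arithmetic behind that >15 million figure is simple enough to check (the 2,000 working hours per year is my own assumption for "full-time"):

```python
# Labor needed for a human-in-the-loop sequestration scheme, assuming
# one person-hour per ton of CO2 and ~2,000 working hours per year.
annual_emissions_tons = 30e9     # >30 Gt CO2/year, the IEA figure cited above
hours_per_ton = 1                # assumption from the text
hours_per_worker_year = 2000     # assumption: one full-time worker

workers_needed = annual_emissions_tons * hours_per_ton / hours_per_worker_year
print(f"{workers_needed:,.0f} full-time workers")  # 15,000,000
```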

## Cyanobacteria: processing carbon at scale

Perhaps predictably, this is where a biologist says that biology offers a solution. While biological organisms are frustratingly messy, chaotic, and difficult to understand, they're also amazingly efficient, at a scale that no human technology can touch. I'm certainly biased, but I can't think of a more plausible way to sequester carbon than to engineer cyanobacteria (sometimes known as "blue-green algae", though they're bacteria rather than proper algae) to more efficiently capture sunlight and fix carbon into sugar, where it can be introduced to the food chain and sequestered for the long term by larger organisms.

A promising candidate for this type of project is Prochlorococcus marinus, a.k.a. photosynthetic picoplankton. These are some of the most abundant organisms on Earth, with an estimated $$2.8-3 \times 10^{27}$$ of them living in the world's oceans. To put that number in perspective:

Humans (est.)                                7,000,000,000
Human cells                  2,100,000,000,000,000,000,000
Prochlorococcus      3,000,000,000,000,000,000,000,000,000 (!)

At $$3 \times 10^{-14}$$ g of carbon per cell (source: Cermak et al.), the existing population of picoplankton contains roughly 81 megatons of carbon. If they divide once per day, that adds up to about 30 Gt per year (a very rough estimate, but on the same order as real research on the role of picoplankton in the carbon cycle: see Johnson & Zinser et al.). To offset our current emissions, we'd need to roughly double that cycling population, assuming the excess population was being absorbed into the oceanic ecosystem and thereby sequestered (a huge assumption, and one of those "open questions" I mentioned at the beginning).
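For anyone who wants to check my arithmetic, here are the standing-stock and turnover estimates spelled out (the unit conversions and the once-a-day division rate are the only inputs beyond the cited figures):

```python
# Standing stock of carbon in the global picoplankton population.
cells = 2.8e27               # low end of the population estimate above
carbon_per_cell_g = 3e-14    # g of carbon per cell (Cermak et al.)

stock_g = cells * carbon_per_cell_g
print(f"standing stock: {stock_g / 1e12:.0f} Mt carbon")  # 1 Mt = 1e12 g

# If the population divides once per day, it fixes roughly its own mass
# in carbon every day; over a year that's the annual turnover.
annual_gt = stock_g * 365 / 1e15                          # 1 Gt = 1e15 g
print(f"annual fixation: {annual_gt:.0f} Gt carbon")
```

That lands at roughly 84 Mt of standing stock and ~31 Gt/year of turnover, which is the same ballpark as the figures in the text.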

We know that this could at least theoretically work, because it's the reason that our atmosphere isn't full of carbon dioxide in the first place. Roughly 2.45 billion years ago, some early carbon-fixing organisms grew at such a prodigious rate that they converted our atmosphere from methane- and CO2-rich to O2-rich. This wiped out a lot of the existing (anaerobic) life on Earth, in what's been called the Oxygen Catastrophe. Such is the power of little microbes in big numbers.

Update (2018-12-12): I made a mistake which I often make, which is that I conflated gigatons of carbon with gigatons of CO2. 81 megatons of carbon represents about 300 megatons of carbon dioxide, because most of the mass in the latter comes from oxygen. This is probably canceled out by an overestimate of the average growth rate, which is how I end up near the literature estimates for total production.
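The carbon-versus-CO2 correction is just the molar-mass ratio of the two:

```python
# A CO2 molecule is one carbon (12 g/mol) plus two oxygens (2 x 16 g/mol),
# so converting a mass of carbon into the equivalent mass of CO2 means
# multiplying by 44/12.
C_TO_CO2 = (12 + 2 * 16) / 12     # = 44/12, about 3.67

carbon_mt = 81                    # megatons of carbon in the standing stock
co2_mt = carbon_mt * C_TO_CO2
print(f"{carbon_mt} Mt C is about {co2_mt:.0f} Mt CO2")  # ~297 Mt
```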

What's even more amazing about this is that they pulled it off with such a lousy enzyme playing the key part.

The most abundant enzyme on Earth is RuBisCO: the protein responsible for converting carbon dioxide into something life can metabolize. The reason it's so abundant is not only because of its important role—it's because RuBisCO is not very good at capturing carbon dioxide, and it is usually the rate-limiting step in the photosynthetic pathway. RuBisCO's inefficiency is largely because it has a difficult time distinguishing between carbon dioxide and oxygen—if you can imagine the shape of each molecule, you might see why they'd be hard to tell apart. If the enzyme accidentally grabs an O2 molecule, it still reacts, but it creates unproductive byproducts that the organism must spend additional energy cleaning up. It's hard for the enzyme to become better at binding CO2 without also increasing this byproduct effect. Research suggests that most plants have a version of RuBisCO that is closely adapted to the CO2 and O2 concentrations of their natural habitat (see Studer et al.), and efforts to engineer the enzyme toward higher efficiency have been mostly fruitless (pun intended).

It seems strange that such an important enzyme would be so inefficient, but it makes a little more sense in light of the evolutionary history that led to its existence. At the dawn of photosynthetic life, there was essentially no molecular oxygen in the atmosphere and oceans. Thus, there wasn't any pressure to become particularly selective for CO2 over O2, and organisms with RuBisCO were just as good as any alternative. Billions of years later, conditions are quite different, but there's no evolutionary path for an organism to develop a better version of the enzyme. We'll have to engineer one, instead.

In the oceans, dissolved CO2 exists in an equilibrium with carbonic acid (H2CO3, or H+ & HCO3- [a.k.a. bicarbonate]). This is the acid in the phrase "ocean acidification", another downside to emitting gigatons of CO2. An enzyme called carbonic anhydrase catalyzes the interconversion between carbon dioxide and bicarbonate. Animals like us use this enzyme to maintain the pH balance in our blood and tissues, but plants also use it for something else: to increase the local concentration of CO2 in their chloroplasts, so that all of their RuBisCO enzymes can operate as efficiently as possible. In contrast with RuBisCO, carbonic anhydrases are incredibly good at their jobs: they are typically diffusion-limited, meaning that the enzyme works faster than the molecules involved can get out of the way. I find it a bit amazing to think about these two enzymes, interacting with such closely related molecules at vastly different levels of efficiency.
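Written out, the equilibrium looks like this, with carbonic anhydrase catalyzing the interconversion at the left:

$$\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}$$

Pushing more CO2 into the ocean shifts the whole chain to the right, which is where the extra H+ in "ocean acidification" comes from.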

## The three-legged stool of bio-geoengineering

Enough background. Here are the three steps to engineering more efficient photosynthesis into picoplankton. Each one of these steps is a major research project that should be spread over many groups, working collaboratively but not necessarily in coordination (as no one can predict what approach will work best). Given the current state of scientific knowledge I believe all three of these steps are possible, although it's hard to guess how long they'd take or how much work they might require. I'll introduce them in what I believe to be increasing order of difficulty:

### 1. Create a genetically isolated strain of Prochlorococcus marinus

Most of the fears about genetically modified organisms are nonsense—GMO crops are safe to eat and safe to grow, although they don't make the problems of industrial agriculture go away on their own. But in the case of genetically modifying bacteria, and especially in the case where we're trying to engineer a completely new and potentially very powerful metabolic pathway, it's much more reasonable to be concerned. For that reason, any work along these lines should employ two different strategies to maintain isolation.

The first step is to synthesize a strain of Prochlorococcus with a new genetic code, one in which the codons for several pairs of amino acids have been swapped with one another (e.g. swap all the codons for alanine with those for serine, and modify the corresponding tRNAs to match). The result of this recoding would be an organism that is genetically isolated from all other life on Earth: its own genome is indecipherable to any organism using the standard code, and any new DNA it incorporates will be likewise useless for its own translation machinery. Recoding several pairs of amino acids will prevent the engineered strain from ever sharing genes with other organisms in the ocean. (It's important to do this thoroughly, because the number of these organisms is so huge that every mutation occurs almost immediately. At $$3 \times 10^{27}$$ existing organisms in the ocean, I estimate that every three-nucleotide codon in the Prochlorococcus genome is mutated in at least one organism, every minute. Genetic isolation at that scale means setting up a wall of mutations that must all happen simultaneously for the progeny to remain viable.)
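As a toy illustration of what "swapping codons" means (the sequence here is hypothetical, the codon table is only a fragment, and a real recoding would also rewire the matching tRNAs), here's the same short message read under the standard code and under a swapped table:

```python
# Minimal fragment of the standard RNA codon table, enough for the demo.
STANDARD = {
    "GCU": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
    "UCU": "Ser", "UCC": "Ser", "UCA": "Ser", "UCG": "Ser",
    "AGU": "Ser", "AGC": "Ser",
    "AUG": "Met", "UGG": "Trp",
}

# Recoded table: every alanine codon now reads serine, and vice versa.
SWAP = {"Ala": "Ser", "Ser": "Ala"}
RECODED = {codon: SWAP.get(aa, aa) for codon, aa in STANDARD.items()}

def translate(rna, table):
    """Translate an RNA string three letters at a time using `table`."""
    return [table[rna[i:i + 3]] for i in range(0, len(rna), 3)]

message = "AUGGCAUCG"
print(translate(message, STANDARD))  # ['Met', 'Ala', 'Ser']
print(translate(message, RECODED))   # ['Met', 'Ser', 'Ala']
```

The same physical DNA produces a different protein under each code, which is the point: genes that leak in either direction translate into gibberish.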

The second step, for use during research and development of the organism, is to use non-standard amino acids (NSAAs) to keep this strain confined to the lab until we are confident it should be released (see step 3). Requiring an NSAA is a good way to keep something from growing outside the lab, but in this case we eventually need our strain to grow outside the lab, so it can't be the long-term solution: at the scale necessary for carbon sequestration, we wouldn't be able to keep feeding it synthetic nutrients. On the other hand, recoding the genome only keeps it genetically isolated, not physically. So both measures are necessary until the design is complete.

Those tasks might seem like complete science fiction, but I list them first because all of the pieces have already been demonstrated by existing researchers. They've synthesized entire genomes on a similar scale to that of Prochlorococcus. They've introduced synthetic metabolites to create strains that can't grow without specific additives. Researchers have even demonstrated that recoded organisms can be genetically isolated from horizontal gene transfer, although not yet at the scale needed for this project. Creating a fully codon-swapped genome for Prochlorococcus is a tall order, but it's not too far beyond the cutting edge of research being done today. It would take time and money to get it done, but it's well within the realm of possibility (which is pretty amazing, when you realize that the idea is to create organisms with an entirely new genetic code).

### 2. Build a better RuBisCO

The second step is a lot more audacious, but we're closer to having the tools than ever before. The basic goal is to change the beginning of the carbon-fixing pathway from CO2 to HCO3-, which we know can be bound with high efficiency by carbonic anhydrase. This might sound simple to you (or impossibly difficult, depending on your background), but engineering new enzymes has proven to be extremely challenging, even as our ability to engineer new proteins and simulate protein structure has gotten much better. That's partially because enzymes are dynamic machines that go through conformational changes as they catalyze a reaction—we can design a protein that's very stable in one conformation, but getting it to cycle between multiple different states is much trickier.

I'm no expert in this field, but I do know some experts (largely skewed towards UCSF, because that's where I did my graduate work). In my estimation the front-runners in de novo protein design are those working in the Rosetta family (started by David Baker, now at the University of Washington), using machine learning and clever search algorithms to explore protein space (Tanja Kortemme's group has done research on stabilizing multiple conformations simultaneously as a way to design new enzymes). There are many other promising areas of research, though, including new ideas in molecular dynamics (e.g. Michael Grabe's group has done research on efficiently sampling the structures of enzymatic processes using molecular dynamics). Just recently Google announced the results of their AlphaFold project, which uses deep learning to predict protein structure from sequence. Their initial results look incredibly promising, although it's not clear if their approach is useful for designing new proteins.

Beyond the computational work, there are also many groups developing high-throughput methods for synthesizing and testing proteins. Almost every computationally-designed protein needs to go through lots of optimization in a real organism before it reaches its full potential, and these methods are going to be at least as important for developing our carbon-fixing machine (I know even less about this topic, but I'll mention Polly Fordyce and Jennifer Cochran, both at Stanford, as two investigators doing amazing stuff in this area).

It's hard to guess how long this step might take, or how many resources it would require. It might turn out to be infeasibly complicated, as one new enzyme requires another and another, multiplying the complexity. It might even turn out to be entirely impossible, though I don't think that's the case. But we'll never know if we don't put the resources into finding out. The worst-case outcome is that we merely learn something useful about enzyme design and carbon metabolism, which isn't too bad, all things considered.

Strictly in terms of money, funding researchers to do this kind of work is fairly cheap—maybe a few billion dollars over a decade. Compared to the cost of cleaning up climate-related disasters, it's trivial (the damage from the most recent California wildfire is estimated at \$7.5-10 billion, and that itself is small compared to a bad hurricane or flood). This is a gamble that may not pay off the way we hope, but we're getting some very good odds.

### 3. Establish ecological efficacy and safety

I list this step last because it's the most important as well as the most difficult. Releasing an engineered organism into the world's largest ecosystem is not a decision to take lightly. We should only do so if we have thoroughly explored the risks involved and weighed them against the potential benefits.

The potential benefits are pretty clear, although we'll have to make a level-headed assessment of how likely they are. Doubling the flux of carbon into the ocean is not going to fix the climate instantly, and we probably have many years of warming to go even in the best case. But we could hope to stabilize ocean acidity, which could help ameliorate the bleaching of coral reefs and the threat to many crustaceans. Depending on the amount of flux, we might hope not only to offset our ongoing carbon emissions but also to pull out some of the excess CO2 emitted over the past two centuries.

Properly estimating the potential benefit is work for ecologists, climatologists, and geologists—the same people who are already working to help us prepare for the next century of climate change. There is a lot of research out there estimating how carbon is processed in the ocean, how it filters through the ecosystem, and how it is eventually either sequestered or is re-released into the atmosphere. I won't attempt to summarize said research, but all of it will be important for modeling the effects of such a large intervention in the environment.

The potential risks are more open-ended. There are ecological risks that would need to be assessed—hopefully the fact that Prochlorococcus is at the bottom of the oceanic food chain would minimize some of them, but it's not clear. It still exists in an ecosystem that may be thrown out of balance by this new arrival (for example, Prochlorococcus coexists with Synechococcus, and the relationship between the two is unclear). CO2 is not usually the limiting nutrient for plankton, so one question is whether the ocean even has the capacity for such an expansion of the picoplankton population (though it's worth noting that a vastly more efficient carbon-fixation pathway could change the calculation for limiting nutrients).

Perhaps the most obvious risk is in succeeding too well, and pulling more CO2 out of the atmosphere than we ever put in. Previously I mentioned the Great Oxygenation Event (a.k.a. the Oxygen Catastrophe), when early photosynthesizing organisms removed nearly all the methane and carbon dioxide from the atmosphere. The effect of this was a runaway ice age, the Huronian glaciation, that lasted 300 million years and caused mass extinctions. Obviously we don't want to do that (this documentary suggests it wouldn't be pleasant).

There are a variety of different solutions for this that need to be explored. We could try to engineer a limited number of divisions into our new strain of picoplankton, although with such a huge number of divisions it's very likely to mutate away from any biological switch. (Jumping to a different domain of life: in organisms like ourselves, the ends of our chromosomes (telomeres) get shorter every time our cells divide, and continual growth requires enzymes called telomerases to lengthen them. If we encoded a telomerase to require NSAAs, the organism could be grown in the lab with synthetic nutrients but would have a ticking clock as soon as it was unable to extend its telomeres.) We could strive to engineer our carbon-fixing enzyme very carefully, such that it loses efficiency as the carbon concentration returns to normal levels. Again, the organism will have a strong incentive to mutate away from any obstacles, but it's possible that we could engineer a local minimum that was difficult to escape. In either case, we can use a version that relies on non-standard amino acids to test its ability to evade our control mechanisms—NSAAs aren't economical at global scale, but they could be used to grow trillions of organisms in a controlled setting so that we can explore the strain's mutational landscape.