Google’s Newest AI Model Acts Like a Satellite to Track Climate Change

AlphaEarth Foundations is a chip off the Google DeepMind block—and it’s here to help save the world.
Images: Alpha Earth Foundation

Google’s newest AI model is going to scour the Earth and, ideally, help it out. That's the plan, anyway: to find out, in fine detail, what we are doing to our planet, and then to point to where interventions might do the most good.

AlphaEarth Foundations, a new model from Google DeepMind, uses machine learning and the gobs and gobs of data Google has absorbed about our planet over the past two decades to understand how specific areas are changing over time.

The model uses a technique called embeddings: it takes the terabytes of data collected from satellites every day, analyzes them, and compresses them into compact numerical summaries that take up far less storage. The result is a set of layers that can be overlaid on maps and color-coded to indicate material properties, vegetation types, groundwater sources, and human constructions such as buildings and farms. Google says the system will act as a sort of “virtual satellite,” letting users call up detailed information about any spot on the planet on demand.
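To make the compression idea concrete, here is a minimal, illustrative sketch — not Google's actual method. It stands in for the learned encoder with a fixed random projection, squeezing a year of hypothetical multi-band observations for a single pixel into a short fixed-length vector; the observation counts and the 64-dimension size are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical: one pixel observed 73 times a year across 12 spectral bands.
observations = rng.normal(size=(73, 12)).ravel()  # 876 raw values

# A fixed random projection stands in for the learned encoder a model
# like AlphaEarth would train; 64 dims per pixel is an assumption.
projection = rng.normal(size=(64, observations.size)) / np.sqrt(observations.size)
embedding = projection @ observations  # 64 compact values

compression_ratio = observations.size / embedding.size
print(embedding.shape, round(compression_ratio, 1))
```

The payoff is the ratio: every downstream query reads 64 numbers per pixel instead of hundreds of raw measurements, which is what makes planet-scale storage and lookup tractable.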

The goal, Google says, is for users of the service to better understand how specific ecosystems work, including how air quality, sunlight, groundwater, and even human construction projects vary and change across a landscape. Ultimately, the company wants the model to answer questions from paying governments and corporations: which areas get the most sunlight or groundwater, and so might be the best spots to grow a certain crop; where to plop down solar panels with maximum payoff; or where to build structures in more climate-resilient locations.

Google's new model has already mapped a complex surface in Antarctica—an area notoriously difficult to capture due to irregular satellite imaging—in clear detail. It has also supposedly outlined variations in Canadian agricultural land use that are invisible to the naked eye.


Google's new model assigns colors to AlphaEarth Foundations' embedding fields. In Ecuador, the model sees through persistent cloud cover to detail agricultural plots in various stages of development.

Photograph: Alpha Earth Foundation

Chris Brown, a research engineer at Google DeepMind, says that historically there have been two main obstacles to making reliable information about the planet more accessible: too much data, and too little consistency. “Before, the challenge was getting a look at it all,” Brown said in a press briefing. “Now, the challenge is to unify all the ways that we have to observe and model our planet and get a complete picture.”

Google, of course, has been at this for a while. AlphaEarth isn’t a consumer-facing application, but Google Earth has had a similar Timelapse feature since 2021 that shows how global geography has changed over the decades, largely due to climate change. Google has also gotten into the game of putting more specialized satellites into orbit, such as ones designed to spot wildfires from space.

The models aren’t perfect. Google, in its frenzied push to build robust AI models, has hit a few snags with the accuracy of its AI generations, most visibly when its AI Overviews in Search have gone off the rails. But sucking up petabytes of satellite images and finding the trends is, weirdly, a more straightforward task for AI.

Google says the model can generate accurate data about an ecosystem down to a 10-by-10-meter area, and while it may get some things wrong, the company claims it is 23.9 percent more accurate than similar AI models. (Google didn’t name which ones, but companies such as Privateer have been at this for years.)

How AlphaEarth Foundations works: by taking non-uniformly sampled frames from a video sequence to index any position in time, the model creates a continuous view of the location while outlining measurements.

Video: Alpha Earth Foundation
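The caption above describes indexing any position in time from irregular snapshots. A toy sketch of that idea, under loose assumptions (tiny 4-dimensional stand-in vectors and simple linear interpolation, neither of which is claimed to be Google's actual approach):

```python
import numpy as np

# Hypothetical: embeddings for one location captured at irregular times,
# expressed as fractions of a year. Real AlphaEarth embeddings are
# learned and much longer; these 4-dim vectors are for illustration.
times = np.array([0.05, 0.30, 0.55, 0.90])
embeddings = np.array([[0.0, 1.0, 0.2, 0.5],
                       [0.4, 0.8, 0.3, 0.5],
                       [0.8, 0.2, 0.6, 0.4],
                       [1.0, 0.1, 0.9, 0.3]])

def embedding_at(t: float) -> np.ndarray:
    """Interpolate each dimension so any moment in time can be queried."""
    return np.array([np.interp(t, times, embeddings[:, d])
                     for d in range(embeddings.shape[1])])

midyear = embedding_at(0.5)  # falls between the 0.30 and 0.55 samples
```

The point of the sketch: once non-uniform observations are embedded, the gaps between satellite passes can be filled in, giving a continuous view of the location rather than a stack of disconnected frames.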

Google has worked with partners to test the new system, such as MapBiomas in Brazil and the Global Ecosystems Atlas, which aim to better classify undermapped ecosystems, including dense rainforests, deserts, and wetlands.

“It's very difficult to compress all the information available for a piece of land in a traditional way, which is spent literally hours and hours and hours of preparing,” said Tasso Azevedo, founder of MapBiomas. After working with Google to test AlphaEarth for the past 18 months or so, Azevedo says the software has made it easier to analyze great swathes of rainforest yet keep that data from overwhelming their storage capabilities. “We were not even scratching everything that would be possible,” Azevedo says.

AlphaEarth is also being added, in a more limited capacity, to Google Earth Engine, the cloud-based mapping platform that first launched in 2010 and is used by agencies and companies such as NASA, Unilever, and the Forest Service. It’s worth noting that this is separate from the more consumer-friendly Google Earth.

Earth Engine has long processed and analyzed satellite data, which has been used to create interactive, high-resolution maps of deforestation across the world and to compile detailed views of bodies of water (rivers, lakes, oceans, and seas) and how they have changed over time. Now, annual snapshots of AlphaEarth’s embeddings will be available as a dataset for tracking long-term trends, which a company rep says users with a “light coding background” can put toward more advanced custom mapping.
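What "tracking long-term trends" with annual embedding snapshots might look like, as a hedged sketch: this is plain numpy, not the Earth Engine API, and the tiny hand-made vectors and the 0.8 similarity threshold are invented for illustration. The idea is simply that a pixel whose embedding drifts far from its earlier self has probably changed on the ground.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embedding vectors: 1.0 means identical direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical annual embeddings for one 10 m pixel (4 dims for brevity;
# the real dataset's vectors are much longer). A forest pixel cleared
# in 2023 would drift away from its earlier embeddings.
snapshots = {
    2021: np.array([0.9, 0.1, 0.3, 0.2]),
    2022: np.array([0.88, 0.12, 0.31, 0.2]),
    2023: np.array([0.2, 0.9, 0.1, 0.7]),
}

for year in (2022, 2023):
    sim = cosine_similarity(snapshots[2021], snapshots[year])
    flag = "changed" if sim < 0.8 else "stable"
    print(year, round(sim, 3), flag)
```

Run over every pixel in a region, a comparison like this flags where a landscape departed from its baseline year without anyone eyeballing the raw imagery.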

No stranger to privacy concerns, Google is eager to wave off any worries people might have about this new system scouring pictures from the sky. The company says the AlphaEarth dataset cannot capture individual objects, people, or faces.