For years, artificial intelligence has been unseating its creators, humans, from one throne after another. Now it is the turn of meteorology, one of the great human creations since the Roman augurs, and others before them, read the entrails of an animal to know whether it was the ideal time to sow or whether the next morning would be propitious for war. Current weather predictions are made with very complex models based on the laws that govern the dynamics of the atmosphere and the oceans, run on some of the most powerful supercomputers in the world. Now, Alphabet (Google’s parent company), with a single machine the size of a personal computer and DeepMind’s artificial intelligence, predicts in one minute the weather across the planet 10 days ahead. And it does so by surpassing the most modern weather forecasting systems in almost everything. This time, however, artificial intelligence seems to have come to complement human intelligence rather than replace it.
The European Center for Medium-Range Weather Forecasts (ECMWF) has one of those state-of-the-art systems. Last year it renewed its predictive muscle. At its facilities in Bologna (Italy), a supercomputer works with around a million processor cores (compared with the two or four in a personal computer) and a computing power of 30 petaflops, some 30,000 trillion calculations per second. It needs all those flops so that one of its tools, the High Resolution Forecast (HRES), can predict, as it does with great precision, the weather across the planet in the medium term, generally 10 days ahead, at a spatial resolution of nine kilometers. From there come the forecasts presented by the weathermen and weatherwomen of half the planet. That Goliath has now been measured against GraphCast, Google DeepMind’s artificial intelligence for weather prediction.
The results of the comparison, published this Tuesday in the journal Science, show that GraphCast predicts hundreds of meteorological variables as well as or better than HRES. In 90.3% of the 1,380 metrics considered, Google’s machine surpasses that of the ECMWF. If the data referring to the stratosphere, starting some six to eight kilometers up, is set aside and the analysis is limited to the troposphere, the atmospheric layer closest to the surface where the weather that affects us occurs, artificial intelligence (AI) beats human-supervised supercomputing in 99.7% of the variables analyzed. And it has achieved this with a machine of a size very similar to a personal computer, called a tensor processing unit, or TPU.
Once trained, each prediction can be made in less than a minute using a single TPU, a machine “much more efficient than a normal PC, but with a similar size”
Álvaro Sánchez González, DeepMind researcher and co-creator of GraphCast
“TPUs are specialized hardware for training and executing artificial intelligence software much more efficiently than a normal PC, but with a similar size,” explains Google DeepMind researcher Álvaro Sánchez González. “In the same way that a computer’s graphics card (also known as a GPU) is specialized in rendering images, TPUs are specialized in computing matrix products. To train GraphCast we used 32 of these TPUs for several weeks. However, once trained, each prediction can be made in less than a minute using a single TPU,” details Sánchez González, one of the creators of the system.
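As a rough illustration of that specialization, the short sketch below, which is not taken from the article or from GraphCast’s code, uses the JAX library to compile and run a large matrix product on whatever accelerator is available; on a machine with a TPU backend the same few lines would run on the TPU, and on an ordinary PC they simply fall back to the CPU. The matrix sizes and names are purely illustrative.

```python
# Illustrative only: a large matrix product of the kind TPUs are built to
# accelerate. None of this is GraphCast code; shapes and names are made up.
import jax
import jax.numpy as jnp

print(jax.devices())  # lists TpuDevice entries on a TPU host, otherwise CPU/GPU

key_a, key_b = jax.random.split(jax.random.PRNGKey(0))
a = jax.random.normal(key_a, (4096, 4096))
b = jax.random.normal(key_b, (4096, 4096))

# jax.jit compiles the product for the available accelerator.
matmul = jax.jit(jnp.dot)
c = matmul(a, b)
print(c.shape)  # (4096, 4096)
```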
One of the big differences between GraphCast and current prediction systems is that it relies on weather history. Its creators trained it with all the meteorological data stored in the ECMWF archive since 1979. That includes both the rains that have fallen on Santiago since then and the cyclones that have reached Acapulco over those four decades. Training it took a while, but once that was done, GraphCast only needs to know the state of the weather six hours ago and the state just before issuing its forecast in order to work out, in a matter of seconds, what the weather will be six hours from now. And each new prediction is fed back in as input for the next one.
Ferran Alet, also at DeepMind and a co-creator of GraphCast, details how it works: “Our neural network predicts the weather six hours into the future. If we want to predict the weather in 24 hours, we simply evaluate the model 4 times. Another option would have been to train different models, one for 6 hours, another for 24 hours. But we know that the physics in 6 hours will be the same as now. Therefore, we know that if we find the right 6-hour model and give it its own predictions as input, it should predict the weather 12 hours from now, and we can repeat the process every six hours.” This gives them “much more data for a single model, making it train more efficiently,” Alet concludes.
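The scheme Alet describes can be sketched in a few lines of code. The version below is a toy illustration, not GraphCast’s actual code: `six_hour_model` is a hypothetical stand-in for the trained neural network, and the “weather state” is reduced to a single gridded variable.

```python
# Toy sketch of the autoregressive rollout described above: one 6-hour model,
# applied repeatedly, with each prediction fed back as input for the next step.
import jax.numpy as jnp

def six_hour_model(state_6h_ago, state_now):
    # Hypothetical placeholder; the real model is a learned neural network.
    return state_now + 0.5 * (state_now - state_6h_ago)

def rollout(state_6h_ago, state_now, hours=24):
    """Chain 6-hour predictions to reach the requested lead time."""
    for _ in range(hours // 6):  # e.g. 24 hours -> 4 evaluations of one model
        state_next = six_hour_model(state_6h_ago, state_now)
        # The new prediction becomes an input for the following step.
        state_6h_ago, state_now = state_now, state_next
    return state_now

# A toy "weather state": one variable on a 1-degree latitude-longitude grid.
state_prev = jnp.zeros((181, 360))
state_curr = jnp.ones((181, 360))
forecast_24h = rollout(state_prev, state_curr, hours=24)
print(forecast_24h.shape)  # (181, 360)
```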
Until now, forecasts have been based on so-called numerical weather prediction, which uses the physical equations that science has accumulated over its history to describe the different processes that make up a system as complex as the dynamics of the atmosphere. From those equations, a series of mathematical algorithms is derived, which supercomputers must execute in order to produce, within minutes, the prediction for the coming hours, days or weeks (there are also longer-range forecasts, although their reliability drops dramatically beyond 15 days). To do all that, the supercomputer has to be very super indeed, which means enormous cost and a great deal of engineering work. What is perhaps striking is that these systems do not take advantage of the weather recorded yesterday, or a year ago, in the same place at the same time. GraphCast works almost the other way around: its deep learning leverages decades of historical weather data to learn a model of the cause-and-effect relationships that govern the evolution of Earth’s weather.
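To make the contrast concrete, the toy sketch below shows what “turning physical equations into algorithms” looks like in its simplest form: a one-dimensional advection equation, a distant relative of the equations real models solve on three-dimensional global grids, stepped forward in time by finite differences. It is only an assumption-laden illustration, not how the ECMWF model is actually built.

```python
# Toy numerical prediction: discretize a physical law (1-D advection,
# du/dt + c * du/dx = 0) and step it forward in time. Real weather models do
# this for far richer equations on 3-D global grids, hence the supercomputers.
import jax.numpy as jnp

def advect_step(u, c=10.0, dx=1000.0, dt=60.0):
    """One upwind finite-difference step: carry the field along the 'wind' c."""
    du_dx = (u - jnp.roll(u, 1)) / dx
    return u - c * dt * du_dx

x = jnp.linspace(0.0, 100_000.0, 101)                     # a 100 km toy domain
u = jnp.exp(-((x - 50_000.0) ** 2) / (2 * 5_000.0 ** 2))  # a single "weather" bump
for _ in range(60):                                       # one hour in 1-minute steps
    u = advect_step(u)
print(float(x[int(jnp.argmax(u))]))  # the bump has drifted downwind
```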
José Luis Casado, spokesperson for the Spanish Meteorological Agency (AEMET), explains why historical data is left aside: “The atmospheric model uses the available observations and the immediately preceding prediction of the model itself: if the current state of the atmosphere is well known, its future evolution can be predicted. It does not use past predictions or historical data, unlike machine learning methods.”
“The importance of DeepMind’s work is that it demonstrates that the predictive accuracy of traditional models can even be improved through artificial intelligence”
Ignacio López Gómez, climate scientist at Google Research
From the headquarters of Google Research in California (United States), researcher Ignacio López Gómez devises weather prediction systems built on massive amounts of data. At the beginning of the year he published his latest work, in which he uses artificial intelligence to predict heat waves. Although he knows several of the creators of GraphCast, he did not take part in its design or its calculations. “The importance of the work of DeepMind and other similar efforts (such as the recent Pangu-Weather system designed by Chinese scientists) is that they demonstrate that the predictive accuracy of traditional models can be matched or even improved through artificial intelligence,” he says. López acknowledges that AI models are expensive to train, but once trained they can make weather predictions much more efficiently. “Instead of requiring supercomputers, predictions based on artificial intelligence can be made even on personal computers in a reasonable time.”
The ECMWF has taken note and is already developing its own AI-based prediction system. In October it announced the first alpha version of its AIFS (Artificial Intelligence/Integrated Forecasting System). “It is based on the same method as Google’s,” says Casado, of AEMET. “Although AIFS is not a fully operational system, it is a big step forward,” he adds. As the creators of GraphCast conclude in their scientific article, AI is not a substitute for human ingenuity, and even less so for “traditional weather prediction methods developed over decades, rigorously tested in many real-world contexts.” In fact, the ECMWF actively collaborated with Google, providing access to its data and supporting the project. As Casado concludes, “the traditional models based on physical equations and the new machine learning models based on data could be complementary.”