Keynote Speakers
We are happy to introduce our keynote speakers:
Mihai has been a scientist at the ECMWF since June 2021, working on machine learning and atmospheric composition model developments. He has made significant contributions to AIFS, ECMWF’s data-driven operational forecast system, and is now focusing on learning a medium-range forecast directly from Earth System observations (AI-DOP). He has a background in applied mathematics, and has previously worked in the geosciences on acoustic wave velocity modeling in the Earth’s subsurface, and inverse modeling of atmospheric methane.
A machine learning revolution is underway in weather and climate, leveraging the trove of datasets and easy to use frameworks for building models. The EXCLAIM symposium will feature some of the state-of-the-art in this rapidly evolving domain. For experts in weather and climate this brings a new set of jargon and techniques to learn, but these can be learnt and there is a place for you in this revolution. In this tutorial we will break down the jargon and demystify machine learning for this domain. We will introduce the leading techniques in the field, discuss some of the outstanding challenges and highlight how you can engage in the revolution.
Dr. Matthew Chantry
Strategic Lead for Machine Learning
ECMWF
Personal website of Matthew Chantry
Matthew Chantry is the Strategic Lead for Machine Learning at ECMWF and Head of the Innovation Platform. Matthew works across ECMWF to advise and coordinate on the adoption of machine learning across ECMWF's mission. He champions the AIFS, which is delivering machine learning forecasting systems to operational forecasting. Work on these projects is distributed across the organisation, so Matthew coordinates developments across departments, sections and teams. He works closely with Member States on the co-development of Anemoi as a shared machine learning framework for data-driven forecasting systems, and advises the ECMWF directorate on future directions for ML-based development.
Peter is the Head of the Earth System Modelling Section at the European Centre for Medium-Range Weather Forecasts (ECMWF), developing one of the world's leading global weather forecast models, the Integrated Forecasting System (IFS). He is also an Honorary Professor at the University of Cologne. Previously, he was AI and Machine Learning Coordinator at ECMWF and a University Research Fellow of the Royal Society, performing research on the use of machine learning, high-performance computing, and reduced numerical precision in weather and climate simulations. Peter coordinates the WeatherGenerator Horizon Europe project, which aims to build a machine-learned foundation model for weather and climate applications, and previously coordinated the MAELSTROM EuroHPC Joint Undertaking project.
This talk will outline three revolutions that have taken place in Earth system modelling over the past decades. The quiet revolution has leveraged better observations and growing compute power to deliver steady improvements in prediction quality; the digital revolution has enabled km-scale simulations on modern supercomputers that further increase the quality of our models; and the machine learning revolution has now shown that machine-learned weather models often outperform conventional weather models on many forecast scores while being simpler, smaller and cheaper to run. The talk will summarize these past developments, explain current challenges and opportunities, and outline what the future of Earth system modelling will look like.
Cathy did her PhD at ETH Zurich, where she studied the predictability of convection, at a time when limited-area models were first being used for numerical weather prediction. She did her postdoc at ETH Zurich, was then a visiting scientist at the Department of Atmospheric Sciences at the University of Washington (Seattle), and subsequently moved to the Max Planck Institute for Meteorology.
Her research has always focused on deep convection, but her interests have shifted over the years from weather to climate. One question she is particularly interested in is the role that the surface, be it the ocean or the land, plays in setting basic features of the climatological precipitation distribution. Her interest in moist convection explains her strong involvement in the use and development of coupled km-scale Earth System Models, a development she has been co-leading at MPI-M. One key achievement was the production of the first coupled global climate simulation run with a grid spacing of 5 km on seasonal time scales (https://doi.org/10.5194/gmd-16-779-2023) and, later on, on decadal time scales.
Pushing physics-based models toward kilometre-scale grid spacing is hoped to bring new insights, especially on regional scales and for processes that have to be parameterized in coarser-resolution climate models. In this talk, I will first review the key new insights that km-scale grid spacing brings, focusing on precipitation in fully coupled global km-scale climate simulations. Second, I will present the next steps in pushing toward kilometre-scale Earth System Models, showing first results on integrating the full carbon cycle in km-scale simulations.
Stephan Hoyer is a Senior Staff Software Engineer at Google, where he leads the NeuralGCM team, building AI-based weather and climate models. His research spans the intersection of physics, numerical computing and machine learning. Stephan has also made significant contributions to open source libraries for scientific computing in Python, including Xarray, NumPy and JAX. He holds a Ph.D. in Physics from the University of California, Berkeley.
What would “AI-native” Earth system models look like? What qualitative improvements could they offer over traditional physics-based models? In this talk, I’ll describe our lessons from building NeuralGCM, and how we are expanding NeuralGCM into a flexible and open platform for AI-based weather and climate modeling. I’ll explain the fundamental advantages of AI-based approaches, where they will fall short, and why future models will embrace AI and physics on an equal and interchangeable footing.
Dr. Maria J. Molina is an Assistant Professor within the Department of Atmospheric and Oceanic Science at the University of Maryland, College Park, and is affiliated with the Artificial Intelligence Interdisciplinary Institute at Maryland, the University of Maryland Institute for Advanced Computer Studies, and the National Science Foundation (NSF) National Center for Atmospheric Research.
Maria serves as a member of the US Climate Variability and Predictability (CLIVAR) Predictability, Predictions, and Applications Interface panel and of the World Climate Research Programme (WCRP) Scientific Steering Group for the Earth System Modeling and Observations (ESMO) core project. Recently, Maria received a NASA Early Career Investigator Program in Earth Science award.
Artificial intelligence (AI) models operating on weather and climate timescales have proven effective at skillfully modeling Earth systems. This capability arises from patterns learned from large observational datasets and physics-based models. This talk will examine the patterns learned by AI-based models and how they can be leveraged to enhance our understanding and prediction of Earth systems. Attention will also be given to evaluating the patterns generated by AI-based models from the perspectives of the traditional Numerical Weather Prediction (NWP), computer vision, and social science communities.
Inna joined the Earth System Modelling Section at ECMWF in 2018. Inna’s background is in atmospheric dynamics, and she works on improving the representation of resolved dynamical processes in ECMWF’s numerical weather prediction model IFS. Her current research interests are in km-scale global modelling and in hybrid modelling, combining machine learning models with the physics-based NWP models online.
Prior to joining ECMWF, Inna worked on stratospheric dynamics and stratosphere-troposphere coupling at the University of Reading. She completed her PhD on modelling atmospheres of extra-solar planets at Queen Mary, University of London in 2014.
Machine learning (ML)-based global weather prediction models outperform traditional physics-based numerical weather prediction (NWP) models in large-scale forecast skill but lack fine-scale detail. To leverage this advantage, we apply a scale-selective spectral nudging approach to constrain the large scales of the physics-based ECMWF IFS model to follow predictions from the ML-based ECMWF AIFS model. Results from 9 km deterministic and ensemble-based medium-range forecasts show that this method improves large-scale forecast skill by 15% and enhances tropical cyclone track prediction over conventional IFS forecasts, while maintaining realistic tropical cyclone intensities. This hybrid approach offers several advantages: (i) small scales remain unaffected, unlike in fully deterministic ML models; (ii) it is computationally inexpensive to implement and run; (iii) it seamlessly integrates with updates to the NWP system without requiring retraining; (iv) it preserves the physical consistency of the physics-based model; and (v) it provides the same forecast variables as conventional NWP systems.
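The scale-selective constraint described above can be illustrated with a minimal sketch, here for a 1-D field with a simple Fourier decomposition. The function name, the relaxation form and all parameters are illustrative assumptions, not ECMWF's actual implementation (which operates on the spectral representation of the IFS):

```python
import numpy as np

def spectral_nudge(field_nwp, field_ml, cutoff_wavenumber, alpha):
    """Scale-selective spectral nudging (illustrative 1-D sketch).

    Relax only the large scales (wavenumbers below the cutoff) of a
    physics-based field toward an ML model's prediction, leaving the
    small scales untouched. All names and the relaxation form are
    assumptions for illustration.
    """
    # Transform both fields to spectral space.
    spec_nwp = np.fft.rfft(field_nwp)
    spec_ml = np.fft.rfft(field_ml)

    # Relax only the low wavenumbers toward the ML state;
    # alpha in [0, 1] sets the relaxation strength.
    nudged = spec_nwp.copy()
    mask = np.arange(nudged.size) < cutoff_wavenumber
    nudged[mask] += alpha * (spec_ml[mask] - spec_nwp[mask])

    # Transform back to grid-point space.
    return np.fft.irfft(nudged, n=field_nwp.size)
```

With the cutoff at zero nothing is nudged and the physics-based field is returned unchanged; with the cutoff above the highest resolved wavenumber and full relaxation strength, the large *and* small scales are replaced by the ML prediction, which is exactly what the scale-selective approach avoids.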
Mike Pritchard has been experimenting with AI for atmospheric prediction since 2017 as faculty at UC Irvine, studying how to outsource nested calculations of explicit embedded convection to simple neural network parameterizations, towards a new class of hybrid physics-AI climate simulations. In 2022 he joined NVIDIA Research where he maintains a 50% appointment as Director of Climate Simulation Research, working with AI professionals on whole atmosphere AI forecasting and generative state estimation. He collaborates closely with the LEAP Science & Technology Center at Columbia University.
AI has rapidly changed the paradigm for simulating atmospheric dynamics from planetary to storm-resolving scales. In the first part of this talk I will review progress on hybrid AI-physics modeling for climate simulation. Next, I will review emerging NVIDIA research technologies based on generative AI: (i) spatial downscaling and new channel synthesis, (ii) its extension to ambitious domain sizes via a patch-based multi-diffusion approach, (iii) the advent of AI for dynamical downscaling and mesoscale forecasting, and (iv) the promise of generative data fusion and foundational multi-modal climate emulation without autoregression. I will conclude with some remarks on the exciting potential for end-to-end AI forecasting systems, which portend a more interactive and computationally efficient paradigm for simulating atmospheric and climate physics.
Martin Vetterli received a Dipl.-Ing. degree from ETHZ in 1981, an MS from Stanford in 1982, and a Doctorate from EPFL in 1986. He held faculty positions at Columbia University and UC Berkeley before joining EPFL as a Professor in 1995. At EPFL, he was Vice President from 2004 to 2011 and served as Dean of the School of Computer and Communication Sciences in 2011-2012. From 2013 to 2016, he was President of the National Research Council of the Swiss National Science Foundation, and from 2017 to 2024 he was President of EPFL. His research spans electrical engineering, computer science and applied mathematics. He is the co-author of three textbooks and numerous papers and patents, and has received a number of awards for his research. He is a Fellow of the IEEE and the ACM and a member of the US National Academy of Engineering.
https://en.wikipedia.org/wiki/Martin_Vetterli
Machine learning, or artificial intelligence, has made amazing progress in recent years thanks to three parallel developments: (i) vast amounts of open data, (ii) a large increase in compute power (especially in the form of specialized chips such as GPUs), and (iii) algorithmic advances built on decades of research. The watershed moment came with the release of large language models (LLMs) such as ChatGPT, with user interfaces for the general public.
A couple of years on is a good moment to reflect on the impact of AI on scientific practice, education, and society. To do so, I will review the history of AI and its various ups and downs. Then, I will look at the state of play of AI for science and its large potential to make scientific discovery more efficient, provided explainability follows. Next, I will consider the impact of AI on education, another key issue. Finally, how AI can be used to benefit society as a whole is a fundamental challenge, and I will discuss the alignment problem between human values and AI.
In conclusion, as for any technological advance, AI can be used for the best or the worst, and it is for society to decide between the two. The fact that AI is a generic technology based on data from all fields of human activity makes it all the more important to keep a careful and democratic watch on its development.
Pier Luigi Vidale is Professor of Climate System Science and co-leads the joint NCAS-Met Office global High Resolution Climate Modelling programme. Since a major UK-Japan collaboration in 2004-2007, Pier Luigi has played a world-leading role in high-resolution climate modelling and in understanding the role of meso-scale processes in the global climate system. He was the Scientific Coordinator of the EU's H2020 PRIMAVERA project (defining HighResMIP for CMIP6) and currently co-leads the EU's Horizon Europe EERIE project (defining HighResMIP2 for CMIP7). Pier Luigi is actively involved in the development of km-scale global models, e.g. as an investigator in the EU's H2020 NextGEMS project and in collaborations with the Met Office. He is co-Chair of the World Climate Research Programme's Digital Earths Lighthouse Activity, which aims to define the nature and purpose of Digital Twins for climate, and is the Director of Science Collaboration in the University of Reading's £30M AFESP programme.
Authors: P. L. Vidale, M. Roberts, A. Baker, K. Hodges, T. Auerswald, A. Sommer
So-called “physics-based” weather and climate models indeed encode some of the fundamental laws of physics, but their process fidelity is limited by the practicalities of defining and solving the governing equations. Examples of practical limitations are model initialisation, forcing and model resolution, compounded by the inescapable empiricisms embedded in the physical parameterisation of sub-grid processes. In terms of resolution, the atmospheric components of weather and climate models began to credibly represent cyclones in the 1970s, in recognition of their fundamental roles in effecting transports and driving impacts, but ocean models only made the equivalent breakthrough in the last decade, when they started to explicitly represent eddies and boundary currents.
There are physically plausible reasons to expect that the next breakthrough in mid-latitude numerical weather prediction depends on process fidelity in the tropics: tropical weather systems and their associated heat sources emanate wave trains that travel to mid-latitudes and modulate the weather in those remote regions, so their realistic representation is essential. Equally, for seasonal to sub-seasonal prediction, all the way to climate applications, air-sea interactions, underpinned by oceanic heat and salinity patterns, are key to understanding how weather and climate will unfold at the regional scale. All of these advances require progressing towards km-scale simulation.
In these scientific endeavours, rigorously assessing process fidelity is crucial, as well as demonstrating how it matters to the trustworthiness of predictions. Current observations, however, are often not fit for purpose: common issues are quality assurance and consistency in space and time, as well as the inadequacy of data and algorithmic solutions that underpin process-based analyses at scale. Examples of incomplete and/or inconsistent observations are storm catalogues, starting with Tropical Cyclones, but also oceanic heat content at depth, precipitation in all its forms, and even river discharge into global oceans. This talk will demonstrate how physics-based models can often complement such observations, as well as point out what our priorities should be in terms of reconstructing past observations, and collecting new, and new types of observations.
Oliver Watt-Meyer is a Lead Research Scientist in the Climate Modeling group at the Allen Institute for Artificial Intelligence (Ai2). His research interests are in atmosphere and climate dynamics as well as in the application of machine learning to climate prediction. As technical lead of the Ai2 Climate Emulator (ACE) project, he actively works on developing fast, accurate and easy-to-use climate model emulators using AI. Previously, he was a NOAA Climate and Global Change and NSERC Postdoctoral Fellow at the University of Washington Department of Atmospheric Sciences. In 2016, he received his PhD from the University of Toronto. Oliver is an Associate Editor at AGU's Journal of Advances in Modeling Earth Systems.
Ai2 Climate Emulator (ACE) is a fast machine learning model that simulates global atmospheric variability in a changing climate over time scales ranging from hours to centuries. ACE is trained either on the output of a global atmospheric general circulation model (AGCM) or on observational reanalysis. It has a 1° horizontal grid with eight vertical layers and 6-hourly temporal resolution. The choice of predicted variables aids climate interpretability and enables the enforcement of mass and energy conservation constraints. The most recent version, ACE2, simulates about 1,500 years per day on a single NVIDIA H100 GPU.
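As a loose illustration of how a global conservation constraint of the kind mentioned above can be enforced, the sketch below applies a uniform additive correction so that the area-weighted global mean of a predicted field matches a conserved target value. This is a hypothetical scheme for illustration only, not ACE's actual constraint formulation:

```python
import numpy as np

def enforce_global_mean(pred, target_mean, area_weights):
    """Shift a predicted 2-D field by a uniform offset so that its
    area-weighted global mean equals a conserved target value
    (e.g. global dry-air mass expressed via surface pressure).
    Hypothetical scheme for illustration; ACE's real constraints differ.
    """
    w = area_weights / area_weights.sum()       # normalized grid-cell weights
    current_mean = float(np.sum(pred * w))      # area-weighted global mean
    return pred + (target_mean - current_mean)  # uniform additive correction
```

A uniform offset is the simplest choice because it leaves all spatial gradients of the predicted field unchanged while restoring the conserved global integral exactly.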
When forced by realistic insolation, atmospheric CO2 concentration and specified sea-surface temperature, ACE2 accurately emulates climate trends and ENSO-related interannual variability over the period 1940-2020. It spontaneously generates atmospheric phenomena such as the Madden Julian Oscillation, sudden stratospheric warmings and tropical cyclones. When coupled to a slab ocean model, ACE2 accurately emulates the equilibrium climate sensitivity of a similarly coupled AGCM to CO2 change, including spatial patterns of surface temperature and precipitation response. However, comparing ACE2 with out-of-sample transient climate change simulations exposes remaining challenges with radiative forcing and energy conservation.
Early results coupling ACE2 to an emulator of a comprehensive 3D ocean model show promise for the possibility of a fast AI-based atmosphere-ocean emulator. Multi-decade long simulations are stable with accurate time-mean climate and plausible El Niño-Southern Oscillation variability. We envision a future where fast comprehensive AI emulators of climate models are readily available for quick generation of large ensembles of possible futures across a wide range of forcing scenarios.