Keynote Speakers
We are happy to introduce our keynote speakers:
- Dr. Mihai Alexe
- Dr. Peter Caldwell
- Dr. Matthew Chantry
- Dr. Peter Dueben
- Dr. Cathy Hohenegger
- Dr. Stephan Hoyer
- Dr. Maria J. Molina
- Dr. Inna Polichtchouk
- Dr. Mike Pritchard
- Prof. Martin Vetterli
- Dr. Oliver Watt-Meyer
Mihai has been a scientist at the ECMWF since June 2021, working on machine learning and atmospheric composition model developments. He has made significant contributions to AIFS, ECMWF’s data-driven operational forecast system, and is now focusing on learning a medium-range forecast directly from Earth System observations (AI-DOP). He has a background in applied mathematics, and has previously worked in the geosciences on acoustic wave velocity modeling in the Earth’s subsurface, and inverse modeling of atmospheric methane.
A machine learning revolution is underway in weather and climate, leveraging the trove of available datasets and easy-to-use frameworks for building models. The EXCLAIM symposium will feature some of the state of the art in this rapidly evolving domain. For experts in weather and climate this brings a new set of jargon and techniques to learn, but these can be learnt, and there is a place for you in this revolution. In this tutorial we will break down the jargon and demystify machine learning for this domain. We will introduce the leading techniques in the field, discuss some of the outstanding challenges, and highlight how you can engage in the revolution.
Peter Caldwell is the leader of the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM), which is the US Department of Energy (DOE) global km-scale model. Exploiting technological opportunities is a central focus for DOE, as embodied in SCREAM’s use of C++/Kokkos to enable performance portability on the world’s biggest supercomputers.
Peter's original background is in math and much of his current research spans the boundaries between climate science, numerical analysis, statistics/data science, and computer science. He is also an expert in cloud physics and feedbacks. Peter received his PhD in Atmospheric Sciences in 2007 from the University of Washington and has been at Lawrence Livermore National Lab ever since.
Access to training and validation data makes weather prediction very different from longer-term projections. Empirical validation of far-future projections is impractical because waiting to see what happens removes any benefit from making predictions in the first place. Trust in longer-term predictions instead comes from knowledge that the predictive model is applying physically accurate equations and has appropriately small numerical error. AI-based models lack this foundation in physical laws, making them hard to trust for predictions that can't be validated against observations. Thus physics-based models play an essential role in assessing the success of multi-decadal AI-based predictions. AI models trained solely on observation-based data have so far failed at far-future prediction, so physics-based models also play an essential role in training AI models for decadal-scale forecasts. Since physics-based models remain essential for the training and validation of long-term prediction, it is essential that we continue developing them – an AI model will never be better than the data it was trained on.
Historically, computer speed has restricted the resolution of physics-based models, resulting in large discretization error and a corresponding lack of predictive skill. The new class of exascale supercomputers is changing this situation - the SCREAM model was recently able to run at 1.26 simulated years per day with 3.25 km grid spacing globally. Just a decade ago, such resolutions were typically reserved for short limited-area simulations used as ground truth for global models. Existing exascale computers make decadal simulations with global km-scale resolution possible, but this is just the beginning. This talk will focus on what is realistically possible for physics-based high-resolution global modeling in the next 10 years and how such simulations will play an essential role in an AI-enabled world of Earth system prediction.
Dr. Matthew Chantry

Strategic Lead for Machine Learning
ECMWF
Personal website of Matthew Chantry
Matthew Chantry is the Strategic Lead for Machine Learning at ECMWF and Head of the Innovation Platform. Matthew works across ECMWF to advise and coordinate on the adoption of machine learning across ECMWF's mission. He champions the AIFS, which is delivering machine learning forecasting systems to operational forecasting. Work at ECMWF on these projects is distributed across the organisation, meaning Matthew must coordinate developments across departments, sections and teams. Matthew works closely with Member States on the co-development of Anemoi, a shared machine learning framework for data-driven forecasting systems. He also advises the ECMWF directorate on future directions for ML-based development.
Peter is the Head of the Earth System Modelling Section at the European Centre for Medium-Range Weather Forecasts (ECMWF), developing one of the world's leading global weather forecast models — the Integrated Forecasting System (IFS). He is also an Honorary Professor at the University of Cologne. Previously, he was AI and Machine Learning Coordinator at ECMWF and held a University Research Fellowship of the Royal Society, performing research on the use of machine learning, high-performance computing, and reduced numerical precision in weather and climate simulations. Peter is coordinator of the WeatherGenerator Horizon Europe project, which aims to build a machine-learned foundation model for weather and climate applications, and was previously coordinator of the MAELSTROM EuroHPC Joint Undertaking project.
This talk will outline three revolutions that have happened in Earth system modelling in the past decades. The quiet revolution has leveraged better observations and more compute power to deliver steady improvements in prediction quality over recent decades; the digital revolution has enabled km-scale simulations on modern supercomputers that further increase the quality of our models; and the machine learning revolution has now shown that machine-learned weather models often beat conventional weather models on many forecast scores while being simpler, smaller and cheaper. This talk will summarize these past developments, explain current challenges and opportunities, and outline what the future of Earth system modelling will look like.
Cathy did her PhD at ETH Zurich, where she studied the predictability of convection. It was the time when limited-area models were starting to be used for numerical weather prediction. She did her postdoc at ETH Zurich, then was a visiting scientist at the Department of Atmospheric Sciences at the University of Washington (Seattle) before moving to the Max Planck Institute for Meteorology.
Her research has always focused on deep convection, but her interests have shifted over the years from weather to climate. One question she is particularly interested in is the role that the surface, be it the ocean or the land, plays in setting basic features of the climatological precipitation distribution. Her interest in moist convection explains her strong involvement in the use and development of coupled km-scale Earth System Models. She has been co-leading this development at MPI-M. One key achievement was the production of the first coupled global climate simulation run with a grid spacing of 5 km on seasonal time scales (https://doi.org/10.5194/gmd-16-779-2023) and, later on, on decadal time scales.
Pushing physics-based models toward kilometre-scale grid spacing is hoped to bring new insights, especially on regional scales and for processes that have to be parameterized in coarser-resolution climate models. In this talk, I will first review the key new insights that km-scale grid spacing brings, focusing on precipitation in fully coupled global km-scale climate simulations. Second, I will present the next steps in pushing toward kilometre-scale Earth system models, showing first results on integrating the full carbon cycle into km-scale simulations.
Stephan Hoyer is a Senior Staff Software Engineer at Google, where he leads the NeuralGCM team, building AI-based weather and climate models. His research spans the intersection of physics, numerical computing and machine learning. Stephan has also made significant contributions to open source libraries for scientific computing in Python, including Xarray, NumPy and JAX. He holds a PhD in Physics from the University of California, Berkeley.
What would “AI-native” Earth system models look like? What qualitative improvements could they offer over traditional physics-based models? In this talk, I’ll describe our lessons from building NeuralGCM, and how we are expanding NeuralGCM into a flexible and open platform for AI-based weather and climate modeling. I’ll explain the fundamental advantages of AI-based approaches, where they will fall short, and why future models will embrace AI and physics on an equal and interchangeable footing.
Dr. Maria J. Molina is an Assistant Professor within the Department of Atmospheric and Oceanic Science at the University of Maryland, College Park, and is affiliated with the Artificial Intelligence Interdisciplinary Institute at Maryland, the University of Maryland Institute for Advanced Computer Studies, and the National Science Foundation (NSF) National Center for Atmospheric Research.
Maria serves as a member of the US Climate Variability and Predictability (CLIVAR) Predictability, Predictions, and Applications Interface panel and of the World Climate Research Program (WCRP) Scientific Steering Group for the Earth System Modeling and Observations (ESMO) core project. Recently, Maria received a NASA Early Career Investigator Program in Earth Science award.
Artificial Intelligence (AI) weather and climate (timescale) models have shown their effectiveness in skillfully modeling Earth systems. This capability arises from patterns learned from large observational datasets and physics-based models. This talk will examine the patterns learned by AI-based models and how they can be leveraged to enhance our understanding and prediction of Earth systems. Attention will also be given to evaluating the patterns generated by AI-based models from the perspectives of traditional Numerical Weather Prediction (NWP), computer vision, and social science communities.
Inna joined the Earth System Modelling Section at ECMWF in 2018. Inna’s background is in atmospheric dynamics, and she works on improving the representation of resolved dynamical processes in ECMWF’s numerical weather prediction model IFS. Her current research interests are in km-scale global modelling and in hybrid modelling, combining machine learning models with the physics-based NWP models online.
Prior to joining ECMWF, Inna worked on stratospheric dynamics and stratosphere-troposphere coupling at the University of Reading. She completed her PhD on modelling atmospheres of extra-solar planets at Queen Mary, University of London in 2014.
Machine learning (ML)-based global weather prediction models outperform traditional physics-based numerical weather prediction (NWP) models in large-scale forecast skill but lack fine-scale detail. To leverage this advantage, we apply a scale-selective spectral nudging approach to constrain the large scales of the physics-based ECMWF IFS model to follow predictions from the ML-based ECMWF AIFS model. Results from 9 km deterministic and ensemble-based medium-range forecasts show that this method improves large-scale forecast skill by 15% and enhances tropical cyclone track prediction over conventional IFS forecasts, while maintaining realistic tropical cyclone intensities. This hybrid approach offers several advantages: (i) small scales remain unaffected, unlike in fully deterministic ML models; (ii) it is computationally inexpensive to implement and run; (iii) it seamlessly integrates with updates to the NWP system without requiring retraining; (iv) it preserves the physical consistency of the physics-based model; and (v) it provides the same forecast variables as conventional NWP systems.
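The scale-selective nudging described above can be illustrated in one dimension: transform the state to spectral space, and relax only the large-scale (low-wavenumber) coefficients toward the ML forecast, leaving small scales untouched. The sketch below is a toy illustration only; the operational IFS/AIFS implementation works in spherical-harmonic space and is far more involved, and the function name, cutoff wavenumber, and relaxation strength here are all assumptions.

```python
# Toy 1-D sketch of scale-selective spectral nudging (not the IFS scheme).
import numpy as np

def spectral_nudge(physics_state, ml_state, cutoff_wavenumber, strength):
    """Relax only wavenumbers <= cutoff of the physics-based state toward
    the ML forecast; higher wavenumbers (small scales) are left unchanged."""
    f_phys = np.fft.rfft(physics_state)
    f_ml = np.fft.rfft(ml_state)
    k = np.arange(f_phys.size)
    mask = k <= cutoff_wavenumber            # large-scale filter
    # Nudging increment applied only to the retained large scales
    f_phys[mask] += strength * (f_ml[mask] - f_phys[mask])
    return np.fft.irfft(f_phys, n=physics_state.size)

# Example: a physics field with large- and small-scale components is nudged
# toward a hypothetical ML forecast that only carries the large scale.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
phys = np.sin(3 * x) + 0.3 * np.sin(40 * x)   # large + small scales
ml = 1.2 * np.sin(3 * x)                      # ML forecast, large scale only
nudged = spectral_nudge(phys, ml, cutoff_wavenumber=10, strength=0.5)
```

Because the increment is confined to wavenumbers below the cutoff, the fine-scale detail produced by the physics-based model survives intact, which is precisely the advantage (i) claimed for the hybrid approach.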
Mike Pritchard has been experimenting with AI for atmospheric prediction since 2017 as faculty at UC Irvine, studying how to outsource nested calculations of explicit embedded convection to simple neural network parameterizations, towards a new class of hybrid physics-AI climate simulations. In 2022 he joined NVIDIA Research where he maintains a 50% appointment as Director of Climate Simulation Research, working with AI professionals on whole atmosphere AI forecasting and generative state estimation. He collaborates closely with the LEAP Science & Technology Center at Columbia University.
Martin Vetterli received a Dipl.Ing degree from ETHZ in 1981, an MS from Stanford in 1982, and a Doctorate from EPFL in 1986. He held faculty positions at Columbia University and UC Berkeley before joining EPFL as a Professor in 1995. At EPFL, he was Vice President from 2004 to 2011, and served as Dean of the School of Computer and Communication Sciences from 2011 to 2012. From 2013 to 2016, he was President of the National Research Council of the Swiss National Science Foundation. He was President of EPFL from 2017 to 2024. His research is in the areas of electrical engineering, computer science and applied mathematics. He is the co-author of three textbooks and numerous papers and patents, and has received a number of awards for his research. He is a fellow of IEEE and ACM and a member of the US National Academy of Engineering.
https://en.wikipedia.org/wiki/Martin_Vetterli
Machine learning, or artificial intelligence, has made amazing progress in recent years, due to three parallel developments, namely (i) vast amounts of open data, (ii) a large increase in compute power (especially in the form of specialized chips, or GPUs) and (iii) algorithmic advances based on decades of research. The watershed moment came with the release of large language models (LLMs) like ChatGPT, with user interfaces for the general public.
A couple of years on is a good moment to reflect on the impact of AI on science practice, education, and society. To do so, I will review the history of AI and its various ups and downs. Then, I will look at the state of play of AI for science and its large potential for making scientific discovery more efficient, if explainability follows. Next, the impact of AI on education is certainly another key issue. Finally, how AI can be used to benefit society as a whole is a fundamental challenge, and I will discuss the alignment problem between human values and AI.
In conclusion, as for any technological advance, AI can be used for the best or the worst, and it is for society to decide between the two. The fact that AI is a generic technology based on data from all fields of human activity makes it all the more important to keep a careful and democratic watch on its development.
Oliver Watt-Meyer is a Lead Research Scientist in the Climate Modeling group at the Allen Institute for Artificial Intelligence (Ai2). His research interests are in atmosphere and climate dynamics as well as in the application of machine learning to climate prediction. As technical lead of the Ai2 Climate Emulator (ACE) project, he works on developing fast, accurate and easy-to-use climate model emulators using AI. Previously, he was a NOAA Climate and Global Change and NSERC Postdoctoral Fellow at the University of Washington Department of Atmospheric Sciences. In 2016, he received his PhD from the University of Toronto. Oliver is an Associate Editor at AGU's Journal of Advances in Modeling Earth Systems.
Ai2 Climate Emulator (ACE) is a fast machine learning model that simulates global atmospheric variability in a changing climate over time scales ranging from hours to centuries. ACE is trained either on the output of a global atmospheric model (AGCM) or on observational reanalysis. It has a 1° horizontal grid with eight vertical layers and 6-hourly temporal resolution. The choice of predicted variables aids climate interpretability and enables the enforcement of mass and energy conservation constraints. The most recent version, ACE2, simulates about 1500 years per day on a single NVIDIA H100 GPU.
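A hard conservation constraint of the kind mentioned above can be sketched as a post-step global correction: after each emulator step, multiplicatively rescale a field so its area-weighted global mean matches a conserved target. This is a hedged illustration of the general idea, not ACE2's actual formulation (which should be taken from the ACE2 papers); the function name, grid, and drift value are assumptions.

```python
# Illustrative post-step conservation fixer (not ACE2's actual scheme).
import numpy as np

def enforce_global_mean(field, target_mean, area_weights):
    """Multiplicatively rescale `field` so its area-weighted global mean
    equals `target_mean` (e.g. conserving total dry-air mass, for which
    surface pressure is a proxy)."""
    current = np.average(field, weights=area_weights)
    return field * (target_mean / current)

# Example: suppose the emulator's surface pressure drifts by ~0.1% per step,
# spuriously creating atmospheric mass.
lat = np.deg2rad(np.linspace(-89, 89, 90))
w = np.cos(lat)                       # area weights on a latitude grid
ps_before = np.full(90, 101325.0)     # surface pressure [Pa] before the step
ps_pred = ps_before * 1.001           # emulator output with spurious mass gain
target = np.average(ps_before, weights=w)
ps_fixed = enforce_global_mean(ps_pred, target, w)
```

A multiplicative correction like this removes the global drift while preserving the spatial pattern of the predicted field, which is one reason such constraints help with long, stable climate-length rollouts.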
When forced by realistic insolation, atmospheric CO2 concentration and specified sea-surface temperature, ACE2 accurately emulates climate trends and ENSO-related interannual variability over the period 1940-2020. It spontaneously generates atmospheric phenomena such as the Madden Julian Oscillation, sudden stratospheric warmings and tropical cyclones. When coupled to a slab ocean model, ACE2 accurately emulates the equilibrium climate sensitivity of a similarly coupled AGCM to CO2 change, including spatial patterns of surface temperature and precipitation response. However, comparing ACE2 with out-of-sample transient climate change simulations exposes remaining challenges with radiative forcing and energy conservation.
Early results coupling ACE2 to an emulator of a comprehensive 3D ocean model show promise for the possibility of a fast AI-based atmosphere-ocean emulator. Multi-decade long simulations are stable with accurate time-mean climate and plausible El Niño-Southern Oscillation variability. We envision a future where fast comprehensive AI emulators of climate models are readily available for quick generation of large ensembles of possible futures across a wide range of forcing scenarios.