Micro2Macro: Origins of Climate Change Uncertainty Workshop Sessions

Sessions

This session focuses on the microphysical properties that climate models struggle to simulate and the consequences of those shortcomings for cloud macrophysics, global radiative fluxes and circulation, and other quantities that impact climate projections. It also examines why these microphysics parameterizations struggle.

This session summarizes the current capabilities and limitations of observational techniques for characterizing microphysical properties and processes in the surface-atmosphere system, with a view toward validating climate model representations of microphysics and evaluating their uncertainties. Relevant measurement techniques include, but are not limited to, laboratory experiments, airborne and ground-based in-situ measurements, and remote sensing from any platform.

This session focuses on surveying existing and emerging frameworks and best practices for utilizing observational data to address and reduce uncertainty in climate model representations of microphysics. This session will explore how structural deficiencies in models, such as missing or inadequately represented microphysical processes, can be identified and addressed by integrating more realistic process representations or by employing machine learning techniques to reduce computational costs and improve model skill. Contributions that both quantify parametric uncertainties and offer innovative solutions to tackle these modeling challenges are highly encouraged.

Due to the complexity of the process interactions in the atmosphere and the operational and technical challenges of field campaign deployments, laboratory experiments, and remote sensing networks, there is a need to develop improved protocols and design processes for observations that can move the needle on critical uncertain processes in climate models. Contributions discussing existing or proposed approaches to integrate atmospheric models (of any scale) into the design of experiments, field campaigns, and observational networks are especially encouraged.

Wrapping up the workshop, this session explores next steps in laying the foundation of a new framework for confronting and evaluating climate models with observations.

Topics

Many satellite missions have produced operational cloud-aerosol microphysical property products, built on long-term efforts by the remote sensing community, and these products have been used extensively in atmospheric and climate science studies. However, the retrieved cloud-aerosol properties always carry uncertainties, associated with particular assumptions in the retrieval algorithms and with measurement sensitivities, of which the atmospheric and climate science communities may not always be aware. Given the growing need for process-oriented cloud-aerosol studies, this topic addresses the capabilities and limitations of current remote sensing techniques for cloud-aerosol microphysical retrievals and channels feedback from the atmospheric and climate science communities to the remote sensing community on the retrieval accuracy and the more detailed microphysical property retrievals needed to advance understanding of atmospheric and climate systems and to validate numerical models.

In situ measurements of aerosol and cloud microphysical properties are commonly made using airborne platforms, and these measurements provide critical information for assessing the fidelity of the representation of microphysical processes in numerical models across scales. In addition, in situ measurements provide critical validation for satellite remote sensing. These measurements continue to challenge the scientific community, and this is a period of active development of novel techniques (e.g. holography) and of miniaturization of existing probes for use on smaller platforms such as drones. Finally, because airborne in situ data are geographically sparse, using them to evaluate models, particularly global models, is itself a challenge.

Laboratory studies have uncovered numerous and complex chemical reactions and physical processes that likely occur in the atmosphere and influence Earth’s radiative budget. For example, laboratory observations have shown that hundreds of reactions can form new particles in the atmosphere and that water uptake and ice nucleation depend on particle composition. However, the detailed process-level understanding gained from experimental studies is often not incorporated into aerosol microphysical models due to computational limitations or a lack of global measurements. Consequently, much uncertainty remains about how processes observed in the lab actually influence atmospheric chemistry and Earth’s radiative budget. The goal of this topic is to bridge the gap between laboratory studies and microphysical models in order to (1) identify limitations in extending laboratory observations to models, (2) develop methods to better collect and present laboratory observations for incorporation into models, and (3) use models to more efficiently inform experimentalists about the potential significance of their observations and to direct future experiments.

There has been enormous investment in measurements of microphysical processes and properties in recent decades, and these measurements have great potential to help constrain models since a wide range of observation types are needed to overcome the effects of model equifinality. However, multiple features of measurement data must be harmonized for efficient model constraint. Robust model-observation comparisons must account for observational uncertainty as well as spatial and temporal representation uncertainties (e.g. Schutgens et al. 2017). Unfortunately, these uncertainties have not been quantified for most microphysical measurements. The goal of this topic is to highlight how field data provision could be more closely aligned with model constraint activities.
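As a concrete illustration of the kind of harmonization described above, the sketch below compares modeled and observed values of a generic quantity while folding instrument/retrieval uncertainty and spatial/temporal representation uncertainty into a single error budget. The data and the uncertainty magnitudes are synthetic placeholders, not values from any particular campaign or retrieval.

```python
import numpy as np

# Minimal sketch: compare modeled and observed values of a generic quantity
# (e.g. aerosol optical depth at a set of sites) while accounting for both
# observational error and spatial/temporal representation error. All values
# below are synthetic placeholders.

rng = np.random.default_rng(0)

model = rng.normal(0.15, 0.05, size=100)        # modeled values at 100 sites
obs = model + rng.normal(0.0, 0.03, size=100)   # synthetic "observations"

sigma_obs = 0.02   # instrument/retrieval uncertainty (assumed)
sigma_rep = 0.025  # spatial/temporal representation uncertainty (assumed)
sigma_tot = np.hypot(sigma_obs, sigma_rep)      # combined error, assumed independent

# Error-normalized misfit: values near 1 are consistent with the stated
# uncertainties; values much larger than 1 indicate a genuine discrepancy.
chi2 = np.mean(((model - obs) / sigma_tot) ** 2)
print(f"reduced chi-square: {chi2:.2f}")
```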

Within the atmospheric sciences community, model-observation closure studies using field campaign measurements are most commonly designed to test local process representation by models. For example, airborne or surface measurements have been used to test aerosol-CCN, CCN-droplet, and aerosol-INP closure. In each case, the objective is to measure all model inputs and then test whether the model can reproduce a dependent quantity, both predicted and measured, within experimental uncertainties. The aerosol community has also studied point and column radiative closure. Could the closure framework be applied more broadly to reduce uncertainty in the characterization of cloud scenes and in model representation of microphysical processes and their macrophysical effects? For instance, in scenes large enough to characterize mesoscale cloud structure, the liquid water path, droplet number concentration, and optical depth fields observed by satellite and in situ platforms along marine boundary layer Lagrangian trajectories offer an opportunity to test simulated aerosol-cloud-precipitation interactions. Model calibration to multiple observational data sets can also be viewed as a form of closure enforcement. This topic invites contributions that incorporate a closure framework.
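To make the closure logic concrete, here is a minimal sketch of an aerosol-CCN-style closure test: the inputs are treated as measured, a placeholder activation model predicts the dependent quantity, and closure is judged against the combined experimental uncertainties. The power-law activation model, the uncertainty percentages, and the numerical values are illustrative assumptions, not a recommended parameterization or real campaign data.

```python
import numpy as np

# Minimal sketch of a closure test: all model inputs are measured, a model
# predicts the dependent quantity (here a hypothetical CCN concentration),
# and closure holds if prediction and measurement agree within combined
# experimental uncertainties. The activation model is a placeholder power
# law, not a recommended parameterization.

def predict_ccn(n_aerosol, supersaturation, k=0.6):
    """Placeholder activation model: fraction of aerosol activating at s (%)."""
    return n_aerosol * min(1.0, (supersaturation / 0.4) ** k)

n_aerosol = 850.0      # measured aerosol number concentration (cm^-3)
s = 0.3                # instrument supersaturation (%)
ccn_measured = 610.0   # measured CCN concentration (cm^-3)

ccn_predicted = predict_ccn(n_aerosol, s)
sigma_pred = 0.15 * ccn_predicted   # propagated input uncertainty (assumed 15%)
sigma_meas = 0.10 * ccn_measured    # CCN counter uncertainty (assumed 10%)
sigma_comb = np.hypot(sigma_pred, sigma_meas)

closure = abs(ccn_predicted - ccn_measured) < 2 * sigma_comb  # 2-sigma criterion
print(f"predicted {ccn_predicted:.0f} cm^-3, measured {ccn_measured:.0f} cm^-3, "
      f"closure within 2 sigma: {closure}")
```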

Observational cloud and aerosol microphysical datasets are critical for evaluating and improving numerical models, but numerous challenges remain in how best to use them in ways that yield important constraints on models and allow for insights about how to improve them. For remote sensing data in particular, simulators have become widely used but challenges remain in bridging the gap between model output variables and what can be observed. Process-based observations present a unique set of challenges due to frequent mismatches in scales (e.g. between a climate model and the typical scale of field measurements). This topic focuses on both best practices and new innovations in the use of existing observational datasets, and on how future measurement campaigns can be most effectively designed to address known and important model biases and to constrain specific microphysical processes and their macrophysical consequences (e.g. aerosol radiative forcing; climate sensitivity).
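As a simple illustration of the simulator idea, the sketch below maps model fields into an observable quantity so the comparison happens in observation space. It uses the standard thin-cloud relation tau = 3 LWP / (2 rho_w r_e); the chosen grid-box values are arbitrary, and none of the sub-grid variability, sensor sensitivity, or overlap handling of full simulators such as COSP is represented.

```python
# Minimal sketch of a forward operator in the spirit of a satellite simulator:
# map model fields (liquid water path and an assumed effective radius) into an
# observable quantity (cloud optical depth) so that model and retrieval are
# compared in observation space.

RHO_W = 1000.0  # density of liquid water (kg m^-3)

def cloud_optical_depth(lwp_kg_m2, r_eff_m):
    """tau = 3 * LWP / (2 * rho_w * r_e), a standard thin-cloud approximation."""
    return 3.0 * lwp_kg_m2 / (2.0 * RHO_W * r_eff_m)

# Example: a model grid box with LWP = 80 g m^-2 and r_e = 12 microns (assumed)
tau = cloud_optical_depth(0.080, 12e-6)
print(f"simulated cloud optical depth: {tau:.1f}")
```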

Uncertain model parameters in microphysical process parameterizations (parametric uncertainty) cause uncertainty in simulated macrophysical effects. Models also have missing, or poorly represented, microphysical processes (structural deficiencies) that reduce model skill at simulating macrophysical effects. The effects of parametric uncertainties and structural deficiencies are usually indistinguishable when comparing model output to observations. Tackling structural deficiencies by increasing the realism of microphysical process representations increases model complexity and computational costs. These costs may be reduced if community efforts to embed emulators of large eddy models into global models are successful. Yet there is no guarantee that increased microphysical complexity will increase model skill at simulating macrophysical effects, since aerosol and clouds are susceptible to compensating effects. We need a framework to identify the key missing, or poorly represented, microphysical processes with outstanding potential to improve model skill at simulating macrophysical effects. This framework will need to 1) robustly sample parametric uncertainties in microphysical processes (using perturbed parameter ensembles and/or machine learning methods), 2) evaluate the effect of observational constraints on these uncertainties, and 3) quantify how constraints on parametric uncertainties feed through to uncertainties in macrophysical cloud properties. This topic invites contributions that quantify microphysical process parametric uncertainty and draw insights about structural model development priorities.
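The three steps of such a framework can be sketched with a toy problem: sample parametric uncertainty, apply an observational constraint on a simulated observable, and quantify how the constraint feeds through to a macrophysical effect. The two-parameter "model", the parameter names, and all numbers below are illustrative assumptions standing in for a perturbed parameter ensemble or an emulator of a real model.

```python
import numpy as np

# Minimal sketch of the three framework steps, using a toy two-parameter
# "model" in place of a perturbed parameter ensemble of a real GCM.

rng = np.random.default_rng(1)
n = 20000

# 1) Sample parametric uncertainty (flat priors on two hypothetical parameters)
autoconv_exp = rng.uniform(1.0, 3.0, n)      # hypothetical autoconversion exponent
accretion_scale = rng.uniform(0.5, 2.0, n)   # hypothetical accretion scaling factor

# Toy relationships standing in for the full model (or an emulator of it)
observable = 50.0 + 20.0 * autoconv_exp - 10.0 * accretion_scale   # e.g. a cloud property
macro_effect = -1.0 * autoconv_exp + 0.8 * accretion_scale         # e.g. a forcing term

# 2) Apply an observational constraint on the observable (value +/- 2 sigma)
obs_value, obs_sigma = 85.0, 5.0
keep = np.abs(observable - obs_value) < 2 * obs_sigma

# 3) Quantify how the constraint feeds through to the macrophysical effect
print(f"prior spread:       {macro_effect.std():.2f}")
print(f"constrained spread: {macro_effect[keep].std():.2f}")
```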

With ever-increasing computational power, kilometer-scale global modeling has become a reality. These simulations provide an unprecedented level of detail but still fall short of the large-eddy-resolving resolution required to realistically represent shallow cloud systems with complex boundary layer and trade-cumulus mesoscale structure. Such simulations are also still limited to at most a few years of simulation; we still rely on traditional GCMs for long-term predictions of the future climate and assessments of past climate. Nonetheless, kilometer-scale global modeling, as well as higher-resolution regional simulations, limited-domain large eddy simulations, and even box/parcel simulations, are tremendously valuable and can be used to assess microphysical process uncertainty, to assess how such uncertainty impacts macrophysical properties, to develop and test new process representations, and to develop ML/AI emulators of processes for use in GCMs. This topic invites contributions that demonstrate or suggest ways that multiscale modeling can be leveraged to improve climate predictions, particularly in terms of cloud micro- and macro-physical properties.

What might be important things that models don’t parameterize? What additional physical understanding of processes do we need (where are the most important gaps)? What observations do we need for these processes to be parameterized (lab) and for the parameterizations to be validated (lab/field)? How do we draw the line in terms of complexity and understand which processes matter? Do current ESMs even have the framework to implement the necessary processes? Some examples of processes that add complexity but may be important for climate sensitivity include coupling between the atmosphere and biogenic aerosol sources; size-dependent entrainment and sedimentation; secondary ice production; and new particle formation. Contributions on other parameterization gaps not named here are also encouraged.

As machine learning (ML) approaches are increasingly leveraged to improve climate and process models, various approaches are being tested. One leading approach is wholesale replacement of traditional physics schemes, such as the moist boundary layer dynamics schemes currently widely used in climate models to predict aerosol activation. Key factors in parameterization development are the robustness of the training data set, whether uncertainty in the training set is explicitly accounted for, and out-of-sample behavior. Another approach is to use machine learning emulators to tackle multi-parameter optimization against observational data sets, thus using ML as a tool to overcome limitations in calibrating conventional physics schemes. This topic invites discussion of contributed examples and visions for extending the use of ML to advance understanding and model representation of uncertain aerosol-cloud processes. Discussions of how to directly embed physics in ML to overcome challenges with robustness and the interpretation of ML results are strongly encouraged.
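The emulator-based calibration route can be sketched as follows: fit a cheap statistical surrogate to a small set of (parameter, output) pairs from a physics scheme, then optimize the surrogate's parameters against an observed target. The expensive_scheme function, the quadratic-feature emulator, and the target value are stand-ins chosen for illustration, not any real parameterization or data set.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch of emulator-based calibration: fit a cheap surrogate to a
# handful of (parameter, output) pairs, then optimize the surrogate against
# an "observed" target value.

rng = np.random.default_rng(2)

def expensive_scheme(theta):
    """Placeholder for an expensive physics scheme with two tunable parameters."""
    a, b = theta
    return 4.0 * a**2 - 3.0 * a * b + 2.0 * b + 1.0

# Training design: a small random sample of parameter space
thetas = rng.uniform([0.0, 0.0], [2.0, 2.0], size=(40, 2))
y = np.array([expensive_scheme(t) for t in thetas])

# Emulator: quadratic polynomial features + ordinary least squares
def features(t):
    a, b = t[..., 0], t[..., 1]
    return np.stack([np.ones_like(a), a, b, a * b, a**2, b**2], axis=-1)

coef, *_ = np.linalg.lstsq(features(thetas), y, rcond=None)

def emulate(t):
    return features(np.atleast_2d(t)) @ coef

# Calibrate: find parameters whose emulated output matches an "observed" value
obs = 5.0
res = minimize(lambda t: (emulate(t)[0] - obs) ** 2, x0=[1.0, 1.0],
               bounds=[(0.0, 2.0), (0.0, 2.0)])
print("calibrated parameters:", res.x)
```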

What strategies exist for optimizing the design of observations and experiments so that the resulting measurements can help reduce uncertainty in key processes controlling climate sensitivity? Can we better incorporate modeling exercises and scenario planning in advance of field experiments so that the outcomes are more valuable? How can models be better utilized to inform technical requirements for new instrumentation, observational networks, or observational platforms? Can we use Observing System Simulation Experiments (OSSEs) or other methods to evaluate the potential impact of future observation systems (lab, field, or remote sensing) through simulated experiments, and design improved observational strategies?
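One very reduced form of the OSSE idea is to ask how much a candidate observing system would shrink the uncertainty on a target quantity under simple linear-Gaussian assumptions. The two hypothetical instrument configurations and all error numbers below are made up for illustration; a real OSSE would use a nature run and a full data assimilation or constraint system.

```python
import numpy as np

# Minimal OSSE-flavored sketch: given a prior uncertainty on a target quantity
# and the error characteristics of two hypothetical candidate instruments,
# estimate how much each would reduce the posterior uncertainty.

prior_var = 1.0**2   # prior variance of the target quantity (assumed)

candidates = {
    "candidate A (low noise, 1 obs/day)": {"obs_var": 0.4**2, "n_obs": 1},
    "candidate B (noisier, 8 obs/day)":   {"obs_var": 0.9**2, "n_obs": 8},
}

for name, c in candidates.items():
    # Posterior variance after n independent observations (scalar Kalman update)
    post_var = 1.0 / (1.0 / prior_var + c["n_obs"] / c["obs_var"])
    print(f"{name}: posterior std = {np.sqrt(post_var):.2f}")
```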

Recent modeling studies have re-emphasized the role of microphysical perturbations in influencing large-scale dynamics, but large uncertainties remain in the magnitude of these effects, particularly with respect to regional differences. Conversely, the large-scale responses to these perturbations can provide strong constraints on the microphysical perturbations themselves, constraints that are not currently fully utilized in process modeling. We welcome any work or methodological approaches that seek to disentangle or combine these bottom-up and top-down approaches.