In June 2021, the C3.ai Digital Transformation Institute selected 23 research projects to transform energy systems and advance climate security.

A total of $4.4 million and access to the C3 AI Suite and Microsoft Azure computing and storage have been awarded to support the following multidisciplinary projects.

AI for Natural Catastrophes: Tropical Cyclone Modeling and Enabling the Resilience Paradigm

An important scientific and societal concern associated with a changing climate is that natural catastrophes, such as tropical cyclones, wildfires, and floods, are expected to intensify. Such natural catastrophes have a huge impact on infrastructure, lifeline networks, and human life and well-being. To suitably assess the risk associated with natural catastrophes and develop adaptation measures, advances are needed in 1) improving the modeling of natural catastrophes under a changing climate and 2) developing resilient infrastructure and lifeline networks that gracefully degrade and quickly recover from such natural catastrophes. Our project will make advances on both of these aspects, with specific focus on improved modeling of tropical cyclones along the U.S. Eastern Seaboard and on assessing and improving the resilience of transportation networks. Advances will be based on suitably chosen AI models and formulations, ranging from multivariate time series modeling to sequential decision-making.
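
The multivariate time series angle can be sketched with a toy example. Below, a first-order vector autoregression (VAR) is fit by least squares to a synthetic two-variable series; the variables, coefficients, and noise level are invented for illustration and are not the project's models.

```python
import numpy as np

# Illustrative sketch: fit a first-order VAR, x[t+1] = A x[t] + noise,
# to a synthetic two-variable series, then make a one-step forecast.
rng = np.random.default_rng(0)
A_true = np.array([[0.8, 0.1],
                   [0.0, 0.9]])

T = 500
x = np.zeros((T, 2))
for t in range(T - 1):
    x[t + 1] = A_true @ x[t] + rng.normal(scale=0.05, size=2)

# Least-squares estimate of A from consecutive pairs (x[t], x[t+1]).
X, Y = x[:-1], x[1:]
B, *_ = np.linalg.lstsq(X, Y, rcond=None)  # solves X @ B ~ Y
A_hat = B.T                                 # so x[t+1] ~ A_hat @ x[t]

forecast = A_hat @ x[-1]                    # one-step-ahead prediction
print(np.round(A_hat, 2))
```

In practice such models would be fit at far larger scale to climate reanalysis and cyclone track data, with nonlinear and deep variants where warranted.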

Arindam Banerjee
Founder Professor of Computer Science
University of Illinois at Urbana-Champaign

Ning Lin
Associate Professor of Civil and Environmental Engineering
Princeton University

Rebecca Willett
Professor of Computer Science and Statistics
University of Chicago


Private Cyber-Secure Data-Driven Control of Distributed Energy Resources

The grid-edge is undergoing a rapid transformation. “Prosumers” with distributed energy resources (DERs) are becoming active participants in a vibrant energy economy, playing a very different role than traditional consumers. While individually these DERs have a small energy footprint, they can provide valuable grid services when aggregated. The proposed project focuses on data-driven distributed control of DERs via multi-agent reinforcement learning (MARL) and specifically aims to design algorithms that 1) do not leak private information during their iterations, and 2) can detect data integrity attacks on a system with multiple almost-identical subsystems. The first set of algorithms will guarantee privacy in MARL via non-identifiability that exploits the distributed nature of the computational framework. For identification of data integrity attacks, the second set of algorithms will utilize quickest change detection techniques in MARL contexts. The theoretical developments will be mapped to questions of voltage and frequency control via DERs using MARL. Scalability of the algorithms will be tested on the C3 AI Suite, which offers machine learning pipelines ideally suited for the task. These simulations on the C3 AI Suite will be complemented by a smaller-scale hardware-in-loop demonstration.
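
The quickest change detection idea can be illustrated with a minimal CUSUM detector. The Gaussian statistics, injected bias, and threshold below are invented for the example and are not the project's algorithm.

```python
import numpy as np

# CUSUM quickest change detection on a sensor stream whose mean shifts
# when a (simulated) data integrity attack injects a bias at t = 200.
def cusum(stream, pre_mean, post_mean, sigma, threshold):
    """Return the first index at which the CUSUM statistic crosses threshold."""
    s = 0.0
    for t, y in enumerate(stream):
        # Log-likelihood ratio of post-change vs pre-change Gaussian models.
        llr = ((y - pre_mean) ** 2 - (y - post_mean) ** 2) / (2 * sigma ** 2)
        s = max(0.0, s + llr)
        if s > threshold:
            return t
    return None

rng = np.random.default_rng(1)
clean = rng.normal(0.0, 1.0, 200)      # nominal measurements
attacked = rng.normal(1.5, 1.0, 100)   # biased measurements after the attack
alarm = cusum(np.concatenate([clean, attacked]),
              pre_mean=0.0, post_mean=1.5, sigma=1.0, threshold=10.0)
print(alarm)
```

The statistic drifts down under clean data and up after the change, so the alarm fires a few samples after the attack begins; the threshold trades false alarms against detection delay.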

Subhonmesh Bose
Assistant Professor of Electrical and Computer Engineering
University of Illinois at Urbana-Champaign

Tamer Basar
Research Professor
University of Illinois at Urbana-Champaign

Alejandro Dominguez-Garcia
Professor of Electrical and Computer Engineering
University of Illinois at Urbana-Champaign

Venugopal Varadachari Veeravalli
Henry Magnuski Professor of Electrical and Computer Engineering
University of Illinois at Urbana-Champaign

Machine Learning for Power Electronics-enabled Power Systems: A Unified ML Platform for Power Electronics, Power Systems, and Data Science

The future grid will be supported by clouds of distributed and renewable energy resources. Grid-tied power electronics will be pervasively needed to connect renewable energy resources to the grid. Such inverters exhibit sophisticated dynamic behaviors that challenge system stability and control. Princeton and KTH plan to develop a family of C3 AI-enabled methods for learning, optimization, and stability analysis of grid-tied inverters and power electronics-enabled power systems. The goal is to develop a unified machine learning platform for power electronics, power systems, and data science research. A bottom-up approach, progressing from the modeling of a single inverter to a cluster of inverters connected as a microgrid, will be used as a case study to show a holistic modeling approach supported by C3 AI, including: 1) Modeling a single inverter as a gray box using machine learning – develop gray-box surrogate models for individual inverters based on data generated from electromagnetic transient (EMT) simulations. Trained gray-box models will act as digital twins of EMT circuit models to enable rapid system simulation and analysis. 2) Modeling a cluster of inverters connected as a distribution network. This thrust will apply federated machine learning techniques to aggregate surrogate models of a cluster of inverters. The model will be trained in a federated manner without sharing detailed information of each inverter. 3) Developing deep unfolding machine learning methods for distribution grid analysis – use deep unfolding methods to perform gray-box analysis on an IEEE 9-bus model as a motivating case study.
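
The gray-box idea can be sketched in a few lines: assume a known model structure with unknown coefficients and fit the coefficients to simulation data. The first-order current-filter dynamics and all values below are invented stand-ins, not the project's inverter models.

```python
import numpy as np

# Gray-box fit: the structure i[t+1] = a*i[t] + b*u[t] is assumed known;
# the coefficients (a, b) are learned from (simulated) EMT-style data.
rng = np.random.default_rng(2)
a_true, b_true = 0.95, 0.4           # hidden "physics" of a simple filter
u = rng.uniform(-1, 1, 300)          # voltage commands from the simulation
i = np.zeros(301)
for t in range(300):
    i[t + 1] = a_true * i[t] + b_true * u[t] + rng.normal(scale=0.01)

# Regress i[t+1] on the known features (i[t], u[t]).
Phi = np.column_stack([i[:-1], u])
theta, *_ = np.linalg.lstsq(Phi, i[1:], rcond=None)
a_hat, b_hat = theta
print(round(a_hat, 3), round(b_hat, 3))
```

Once fitted, such a surrogate can be stepped far faster than the full circuit simulation, which is the digital-twin role the abstract describes.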

Minjie Chen
Assistant Professor of Electrical and Computer Engineering
Princeton University

H. Vincent Poor
Michael Henry Strater University Professor
Princeton University

Prateek Mittal
Associate Professor of Electrical Engineering
Princeton University

Lars Nordström
Deputy Head of School, Electrical Engineering and Computer Science
KTH Royal Institute of Technology

Xiongfei Wang
Visiting Professor
KTH Royal Institute of Technology

Scalable Data-Driven Voltage Control of Ultra-Large-Scale Power Networks

One of the main challenges for large-scale integration of renewable generation in electric power distribution systems is the increased risk of operation outside the acceptable voltage range. Both under- and over-voltage conditions are detrimental to any equipment connected to the system and might even cause wide-scale power blackouts. The main difficulty in regulating voltages stems from the fact that exact models of distribution systems are not available and are difficult to estimate accurately, owing to the lack of sensors and adequate communication infrastructure. In contrast to recent efforts aimed at developing model-based voltage regulation schemes for distribution systems, this team aims to develop a scalable data-driven control strategy, requiring no knowledge of a system model, capable of providing fast and optimal response to under- and over-voltage events in large-scale distribution systems. The optimal control policy will be learned efficiently from data by an online iterative approach, in which we adopt the “bandit” setting of online optimization, and dispatched by utilizing a graph neural network as an input-output map in a distributed energy resource (DER) dispatch problem. Because of its sparse structure amenable to distributed implementation, we expect the controller to be fast even in large-scale distribution systems. We plan to test our data-driven approach on realistically large networks with more than a million nodes using the C3 AI Suite and computing platform.
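
The "bandit" setting can be illustrated with a model-free tuning loop: the controller only observes cost measurements, never the grid model. The hidden linear voltage response below is a toy stand-in for a feeder, not a real system model.

```python
import numpy as np

# Zeroth-order (bandit) tuning of a DER setpoint u to hold voltage near
# 1.0 p.u., using only noisy cost measurements -- no model is ever queried.
rng = np.random.default_rng(3)

def measured_cost(u):
    v = 0.94 + 0.08 * u              # hidden voltage response to setpoint u
    return (v - 1.0) ** 2 + rng.normal(scale=1e-4)

u, delta, step = 0.0, 0.05, 5.0
for _ in range(200):
    # Two-point zeroth-order gradient estimate from cost measurements only.
    g = (measured_cost(u + delta) - measured_cost(u - delta)) / (2 * delta)
    u -= step * g

print(round(u, 2))   # optimum of the hidden toy model is u = 0.75
```

The same feedback-only structure is what lets such schemes run without an explicit distribution system model; scaling it to millions of nodes is where the graph neural network dispatch map comes in.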

Alejandro Dominguez-Garcia
Professor of Electrical and Computer Engineering
University of Illinois at Urbana-Champaign

Duncan Callaway
Energy and Resources Group Instructor
University of California, Berkeley

Multi-Scale Analysis for Improved Risk Assessment of Wildfires Facilitated by Data and Computation

We seek to develop a comprehensive wildfire protection system that guards against current and future catastrophic disasters where critical infrastructure is destroyed and lives are lost. We do this by supporting strategic planning and policy development focused on reducing the intensity and rate of spread of a wildfire. With this goal achieved, first responders can safely contain ignitions, minimize damage to infrastructure, and save lives. We propose to explore the urban-edge landscape and infrastructure (often called the wildland urban interface) to better identify and model the risk of catastrophic wildfires, so that more informed planning and policy decisions can be made to improve design, management, and mitigation efforts under current and future climate conditions. Simply put, this research will enhance energy and climate security against wildfires. We propose key innovations at different scales: first, crowdsourcing and very high-resolution remote sensing for AI-driven fuel model identification; second, models of wildfire behavior, intensity, and spread, informed by downscaled climate change predictions, historic catastrophic wildfires, and environmental monitoring; and third, egress models that combine large-scale mobile phone data with data-driven optimization models and computation.

Marta Gonzalez
Associate Professor of City and Regional Planning
University of California, Berkeley

John Radke
Associate Professor of City and Regional Planning
University of California, Berkeley

Zuo-Jun Max Shen
Professor, Industrial Engineering and Operations Research
University of California, Berkeley

Machine Learning to Reduce Uncertainty in the Effects of Fires on Climate

We propose to determine the radiative effects of smoke from fires on Earth’s energy budget, and its parametric uncertainty. We will achieve this by developing advanced, broadly applicable, machine learning and uncertainty quantification (UQ) techniques and applying them to data from a climate model. Fires have important but poorly understood effects on climate, dominated by smoke aerosols and their interactions with clouds. Climate models are needed to predict these effects, but predictions differ widely between models. In this project, CMU climate modelers will collaborate with statisticians to extend existing UQ approaches for climate models using advanced machine learning (ML). They will apply the UQ, together with atmospheric observations, to constrain a large ensemble of climate simulations created by co-investigators at the University of Leeds. The UQ will produce sets of near-optimal parameters needed to represent atmospheric processes in the model, such as smoke emissions, as well as sets that span the uncertainty in these processes. We will run short new simulations of Earth’s atmosphere with these parameter sets to calculate the effects of fires on Earth’s radiation balance and quantify the remaining uncertainty. Our results will pave the way to accurately determining the full impact of fires on Earth’s changing climate. The new statistical techniques we develop leveraging ML algorithms will also help constrain aerosol radiative forcing and can be further applied to improve UQ in other scientific disciplines.
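
The constrain-an-ensemble-with-observations workflow can be sketched as toy history matching: a cheap emulator is fit to a few expensive "simulator" runs, then used to screen many candidate parameter values against an observation. The quadratic simulator, observation, and tolerance are invented.

```python
import numpy as np

# Toy history matching: emulate an expensive model with a polynomial fit,
# then retain parameter values consistent with an observation.
rng = np.random.default_rng(4)

def simulator(theta):                 # stand-in for a costly climate model
    return -1.5 * theta + 0.5 * theta ** 2

# Step 1: a small ensemble of simulator runs to train the emulator.
train_theta = np.linspace(0.0, 4.0, 9)
emulator = np.poly1d(np.polyfit(train_theta, simulator(train_theta), deg=2))

# Step 2: screen many candidate parameters against an observation.
obs, obs_sigma = -1.0, 0.1
candidates = rng.uniform(0.0, 4.0, 5000)
plausible = candidates[np.abs(emulator(candidates) - obs) < 2 * obs_sigma]
print(len(plausible), plausible.min().round(2), plausible.max().round(2))
```

The retained set is exactly the "sets that span the uncertainty" idea: every member is consistent with the observation, and the expensive model only needs to be re-run on those members.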

Hamish Gordon
Assistant Research Professor, Engineering Research Accelerator
Carnegie Mellon University

Mikael Kuusela
Assistant Professor of Statistics and Data Science
Carnegie Mellon University

Ann Lee
Professor, Statistics and Data Science, Machine Learning
Carnegie Mellon University

Ken Carslaw
Professor, Institute for Climate and Atmospheric Science
University of Leeds

Leighton Regayre
Postdoctoral Researcher
University of Leeds

Quantifying Carbon Credit over U.S. Midwestern Cropland Using AI-Based Data-Model Fusion

Accurate and cost-effective carbon credit accounting is the foundation for evaluating climate-smart agricultural practices and enabling market-based agricultural carbon trading. However, there is a severe lack of quantification methods with demonstrated accuracy and scalability. Systematic data-model fusion has the potential to fill this gap by combining observation and process-based modeling. Several challenges exist in developing an operational data-model fusion system for cropland carbon credit accounting, including 1) the lack of a scalable and effective method to identify carbon-sequestration-related management practices over cropland, and 2) high computational burdens for traditional data-model fusion at each individual field over a broad region. We propose using AI to address these challenges. Specifically, under Objective 1, we will use multi-scale AI algorithms and multi-source sensing (ground, airborne, and satellite) data to identify management practices at field scales. Under Objective 2, we will then build an AI-based data-model fusion system to quantify historic carbon credit over Midwestern cropland and use this system to assess carbon credit potentials under different management scenarios at field scale. Advanced hierarchical physics-guided deep learning will be employed in the AI-based data-model fusion system. We will use the C3 AI Suite to integrate heterogeneous big data, build prototype AI models and solutions, and then use the C3 AI Suite, Microsoft Azure, and the Blue Waters supercomputer to scale up our solutions to the whole U.S. Midwest. This project will build the foundation for the agricultural carbon market and thus contribute to climate change mitigation.
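
At its simplest, data-model fusion combines a process-model estimate with an observation, weighting each by its uncertainty. The scalar Kalman-style update below uses invented numbers purely to show the mechanics.

```python
# Minimal data-model fusion: fuse a process-model estimate of annual
# soil-carbon change with an observation, weighted by their variances.
# All numbers are illustrative.
model_mean, model_var = 0.50, 0.20 ** 2   # t C/ha/yr from the process model
obs_mean, obs_var = 0.80, 0.10 ** 2       # t C/ha/yr from observations

gain = model_var / (model_var + obs_var)  # trust the less uncertain source
fused_mean = model_mean + gain * (obs_mean - model_mean)
fused_var = (1 - gain) * model_var

print(round(fused_mean, 2), round(fused_var ** 0.5, 3))  # -> 0.74 0.089
```

The fused estimate sits closer to the more certain source and has lower variance than either input; scaling this idea to every field in the Midwest is what drives the computational burden the abstract mentions.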

Kaiyu Guan
Associate Professor, agroecosystem sensing and modeling
University of Illinois at Urbana-Champaign

Jian Peng
Associate Professor and Willett Faculty Fellow, Department of Computer Science
University of Illinois at Urbana-Champaign

Bin Peng
Postdoctoral Research Associate
University of Illinois at Urbana-Champaign

Sheng Wang
Research Scientist
University of Illinois at Urbana-Champaign

The Role of Interconnectivity and Strategic Behavior in Electric Power System Reliability

We propose to investigate the behavior of individual generators to understand major factors impacting electric system reliability under major weather events like the cold front that affected Texas in February 2021. Using a combination of structural economic estimation, optimization, and machine learning, we will develop a new method to simulate market outcomes under counterfactual conditions in electricity markets. This will allow us to evaluate the relative impact that factors like firm strategic behavior, transmission interconnectivity, and increased gas availability, among others, can have in avoiding extended blackouts and system crises such as the one experienced by Texas. We will use detailed data on the electricity market of the neighboring MISO South region, which has similar characteristics and is publicly available, and will combine it with a rich dataset on natural gas pipelines that will allow us to study the interaction between the two systems.
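
The counterfactual-simulation idea can be illustrated with a stripped-down merit-order market clearing: stack generator offers by cost, meet load, and rerun under a "what-if" with extra gas capacity. Offers and load are invented for the example.

```python
# Toy merit-order clearing, then a counterfactual with added gas capacity.
def clearing_price(offers, load):
    """offers: list of (marginal_cost, capacity_mw); returns (price, served)."""
    served, price = 0.0, None
    for cost, cap in sorted(offers):          # cheapest units dispatch first
        take = min(cap, load - served)
        if take > 0:
            served += take
            price = cost                      # price set by the marginal unit
        if served >= load:
            break
    return price, served

baseline = [(20, 300), (35, 200), (90, 100)]  # ($/MWh, MW)
price0, served0 = clearing_price(baseline, load=550)

# Counterfactual: 150 MW of additional gas capacity offered at $35/MWh.
counterfactual = baseline + [(35, 150)]
price1, served1 = clearing_price(counterfactual, load=550)
print(price0, price1)
```

In the baseline the expensive peaker sets a $90/MWh price; with the extra gas the price falls to $35/MWh. The project's structural approach additionally estimates how firms would change their offers under the counterfactual, which this sketch ignores.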

Ali Hortacsu
Ralph and Mary Otis Isham Professor of Economics
University of Chicago

John Birge
Hobart W. Williams Distinguished Service Professor of Operations Management
University of Chicago

Ignacia Mercadal
Assistant Professor of International and Public Affairs
Columbia University

Michael Pavlin
Associate Professor, Operations and Decision Sciences
Wilfrid Laurier University

Optimization of Agricultural Management for Soil Carbon Sequestration Using Deep Reinforcement Learning and Large-Scale Simulations

Soil carbon sequestration in croplands has tremendous potential to help mitigate climate change. It is challenging, however, to develop optimal management practices for maximization of sequestered carbon as well as crop yield. This project aims to develop an intelligent agricultural management system using deep reinforcement learning (RL) and large-scale soil and crop simulations. To achieve this, we propose to build a simulator to model and simulate complex soil-water-plant-atmosphere interaction, which will run on high-performance computing platforms. Massive simulations using such platforms allow the evaluation of the effects of various management practices under different weather and soil conditions in a timely and cost-effective manner. By formulating the management decision as an RL problem, we can leverage state-of-the-art algorithms to train management policies through extensive interactions with the simulated environment. The trained policies are expected to maximize the stored organic carbon while maximizing the crop yield in the presence of uncertain weather conditions. The whole system will be tested using data of soil and crops in both the U.S. Midwest and the central region of Portugal. The proposed research has great potential for impact on climate change and food security, two of the most significant challenges currently facing humanity.
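
The RL formulation can be sketched with tabular Q-learning on an invented two-state "field" environment, where action 0 is conventional tillage and action 1 is a cover crop whose payoff appears once soil condition improves. Dynamics and rewards are made up solely to show the training loop.

```python
import numpy as np

# Tabular Q-learning on a toy 2-state, 2-action management environment.
rng = np.random.default_rng(5)
transition = [[0, 1], [0, 1]]        # transition[s][a] -> next state
reward = [[1.0, 0.5], [1.2, 2.0]]    # cover crop pays off in state 1

Q = np.zeros((2, 2))
alpha, gamma, eps = 0.2, 0.9, 0.2
s = 0
for _ in range(5000):
    # Epsilon-greedy exploration over the two actions.
    a = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(Q[s]))
    s_next = transition[s][a]
    r = reward[s][a]
    # Standard Q-learning temporal-difference update.
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
    s = s_next

policy = Q.argmax(axis=1)
print(policy)
```

The learned policy forgoes the higher immediate reward in state 0 to reach the state where the cover crop pays off, which is the kind of long-horizon tradeoff the project's simulator-trained policies must handle.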

Naira Hovakimyan
W. Grafton and Lillian B. Wilkins Professor
University of Illinois at Urbana-Champaign

Nicolas Martin
Assistant Professor, Crop Sciences
University of Illinois at Urbana-Champaign

Pan Zhao
Postdoctoral Researcher
University of Illinois at Urbana-Champaign

Guillermo Marcillo
Agronomy Data Scientist
University of Illinois at Urbana-Champaign

Zahra Kalantari
Associate Professor in Environmental and Engineering Geosciences
KTH Royal Institute of Technology

Carla Ferreira
Researcher
KTH Royal Institute of Technology

Cyberattacks and Anomalies for Power Systems: Defense Mechanism and Grid Fortification via Machine Learning Techniques

To improve the efficiency, resiliency, and sustainability of power systems and address climate issues, the operation of power systems is becoming data-centric. Data analytics plays a critical role in the economic and reliable operation of the grid because major operational problems – such as security-constrained optimal power flow, contingency analysis, and transient stability analysis – rely on knowledge extracted from sensory data. The current industry practice is based on a set of heuristic iterative algorithms that are known empirically to work properly under normal situations but become brittle under adverse conditions, such as cyberattacks. In this project, we aim to study dynamic SCADA/PMU data to address the following objectives: 1) How can machine learning algorithms with mathematical guarantees be designed to detect cyberattacks and anomalies? 2) How does the performance of each algorithm depend on the number and locations of sensors and the types of measurements? 3) What is the trade-off between the accuracy of each learning algorithm and the computational power it requires? 4) If the attacked region cannot be precisely pinpointed, what is the smallest region containing it that each algorithm can identify? 5) How can the grid be fortified to ensure that it cannot propagate misinformation in case of a cyberattack? 6) How does the severity of a cyberattack's effect depend on the amount of knowledge about the grid available to the attacker? This project is at the intersection of power systems, machine learning, and optimization.
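
A baseline for sensor-data anomaly detection is the classical residual test: estimate the state by least squares from redundant measurements and flag an attack when the residual is anomalous. The three-sensor, one-state setup below is a minimal stand-in for SCADA data, not the project's algorithms.

```python
import numpy as np

# Residual-based bad-data detection: with redundant sensors, a falsified
# measurement leaves a large least-squares residual.
rng = np.random.default_rng(6)
H = np.array([[1.0], [1.0], [1.0]])   # three sensors read the same state
sigma = 0.01                          # nominal sensor noise level
x_true = 1.0

def residual_norm(z):
    x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
    return float(np.linalg.norm(z - H @ x_hat))

clean = H[:, 0] * x_true + rng.normal(scale=sigma, size=3)
attacked = clean.copy()
attacked[2] += 0.3                    # falsified telemetry on sensor 3

threshold = 5 * sigma
print(residual_norm(clean) < threshold, residual_norm(attacked) > threshold)
```

Note the known weakness that motivates stronger guarantees: a coordinated attack crafted to lie in the column space of H shifts the estimate without raising the residual, so this classical test alone cannot detect it.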

Javad Lavaei
Associate Professor, Industrial Engineering and Operations Research
University of California, Berkeley

Somayeh Sojoudi
Assistant Professor, Electrical Engineering & Computer Sciences, Mechanical Engineering
University of California, Berkeley

Offline Reinforcement Learning for Energy-Efficient Power Grids

We propose to develop offline RL algorithms that incorporate real-world data in training an RL agent to reduce emissions associated with running an electrical grid. Offline RL allows learning policies entirely from previously collected historical data, without any simulation, while still optimizing metrics above and beyond those of the historical policy that generated the data. In our case, offline RL will enable us to learn to reduce emissions using real electrical grid data while retaining the benefits of testing in simulation. We hypothesize that this approach will yield a more efficient grid management policy that can significantly reduce both emissions and the cost of electricity generation.
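
The learn-from-a-fixed-batch idea can be sketched with fitted Q-iteration on a logged dataset from a made-up two-state, two-action dispatch toy: no environment is queried during training, yet the batch suffices to recover the better policy.

```python
import numpy as np

# Fitted Q-iteration on a fixed log of (state, action, reward, next_state)
# transitions; in the tabular case the "regression" is just a per-pair mean.
batch = [
    (0, 0, 0.0, 0), (0, 1, 0.2, 1), (1, 0, 0.1, 0), (1, 1, 1.0, 1),
] * 50                                # repeated logged experience

gamma = 0.9
Q = np.zeros((2, 2))
for _ in range(100):
    totals = np.zeros_like(Q)
    counts = np.zeros_like(Q)
    for s, a, r, s2 in batch:
        totals[s, a] += r + gamma * Q[s2].max()   # Bellman backup target
        counts[s, a] += 1
    Q = totals / np.maximum(counts, 1)

policy = Q.argmax(axis=1)
print(policy, np.round(Q, 2))
```

Real offline RL for grid control must additionally guard against overestimating actions poorly covered by the log (the distribution-shift problem), which this fully covered toy batch sidesteps.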

Sergey Levine
Assistant Professor, Electrical Engineering and Computer Sciences
University of California, Berkeley

Zico Kolter
Associate Professor, Computer Science
Carnegie Mellon University

AI-Based Prediction of Urban Climate and Its Impact on Built Environments

The study of urban climate and its impact on built environments can provide guidelines and tools for urban planners and building engineers to evaluate the environmental quality of our living space. Given the practical difficulties of performing city-scale or multi-scale experiments, accurate simulation and fast decision-support tools are urgently needed to provide pollutant mitigation strategies for researchers, urban planners, environmental engineers, and decision-makers. The development of such tools has been hampered mainly by three scientific challenges: computational speed, accuracy, and robustness. This project plans to develop AI-based CFD simulation to realize accurate and fast prediction of urban climate and the built environment. The study will focus on two aspects of the AI-based model: an AI-based turbulence model that learns the “behavior” of turbulence, and an AI-based surrogate model based on super-resolution. To train and test the artificial neural network (ANN) models, the project will collect experimental data on both indoor and outdoor airflow from on-site and lab measurements. Lab measurements may be full-scale or small-scale; for outdoor airflow, meteorological data are another good source. In terms of accuracy, predictions by the AI-based models are expected to be within 10 percent of those from conventional CFD simulations. In terms of efficiency, the AI-based models are expected to be at least 10 times faster than conventional CFD simulations.

Wei Liu
Assistant Professor, Civil and Architectural Engineering
KTH Royal Institute of Technology

Niklas Lavesson
Professor of Computer Science
Blekinge Institute of Technology

Giovanni Calzolari
Doctoral student, Civil Engineering
KTH Royal Institute of Technology

A Learning-Based Influence Model Approach to Cascading Failure Prediction

Large blackouts in power grids are often the consequence of uncontrolled failure cascades. The ability to predict the failure cascade process in an efficient and accurate manner is important for power system protection and security. This proposal targets failure cascade prediction in large-scale power systems based on machine learning methods. To this end, we propose a hybrid learning framework based on the influence model, which is a Markovian-like graphical model that can capture the failure cascade process. The proposed framework 1) learns the influence values in the model through Monte Carlo estimation and optimization methods to predict the failure cascade; 2) incorporates a deep neural network (DNN) to estimate the mapping between system load profiles and influence values to accommodate load variations; and 3) identifies critical components in failure cascades. To apply the framework to large-scale systems, we further address parallelizable implementation and computational challenges, and propose to utilize the C3 AI Suite and Azure cloud computing platform to address these requirements efficiently. The prediction performance of the learning framework will be evaluated on real systems for accuracy, efficiency, and robustness to load variations.
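
The influence-model idea can be illustrated with a deterministic threshold simplification (the actual influence model is probabilistic): each line fails at the next step if the summed influence arriving from already-failed lines crosses a threshold. The 4-line influence matrix is invented.

```python
import numpy as np

# Deterministic threshold cascade driven by an influence matrix D,
# where D[i, j] is the influence of failed line i on line j.
D = np.array([
    [0.0, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.7, 0.2],
    [0.0, 0.0, 0.0, 0.6],
    [0.0, 0.0, 0.0, 0.0],
])

def cascade(initial_failed, threshold=0.5, max_steps=10):
    failed = np.zeros(4, dtype=bool)
    failed[list(initial_failed)] = True
    for _ in range(max_steps):
        pressure = D.T @ failed               # influence arriving at each line
        new_failed = failed | (pressure >= threshold)
        if np.array_equal(new_failed, failed):
            break                             # cascade has settled
        failed = new_failed
    return np.flatnonzero(failed)

print(cascade({0}))   # line 0's failure propagates through 1 and 2 to 3
```

Learning the entries of D from Monte Carlo cascade samples, and predicting the settled failure set for unseen initial contingencies, is the cheap-inference role the abstract assigns to the learned influence model.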

Eytan Modiano
Professor, Aeronautics and Astronautics
Massachusetts Institute of Technology

Marija Ilic
Senior Research Scientist
Massachusetts Institute of Technology

H. Vincent Poor
Michael Henry Strater University Professor of Electrical Engineering
Princeton University

A Joint ML+Physics-Driven Approach for Cyber-Attack Resilience in Grid Energy Management

Energy management systems (EMS) ensure reliable and resilient operation of the bulk energy system through two key interdependent analyses: 1) Bad-data detection (BDD) analysis that sanitizes incoming data from grid telemetry, and 2) contingency analysis for “what-if” simulations to gauge network resiliency. These analyses deployed within the EMS today have critical shortcomings in light of emerging cyber-attacks, which pose a grave threat to grid operation. For instance, coordinated attacks on grid telemetry can bypass traditional BDD capabilities and there are no existing practices to evaluate the impact of cyberattacks on an online grid model. Our overarching vision is to enhance EMS subroutines to fill this critical gap with respect to cyberattacks. The challenge here is in ensuring that these analyses are both expressive and scalable. Specifically, traditional approaches using physics-based simulations can help detect complex attacks, but are impractical (e.g., assuming complete grid knowledge) and numerically slow. On the other hand, machine learning (ML) techniques can be fast but lack the required fidelity to capture anomalies and complex attacks, and can result in high false positives and false negatives. We argue for combining domain knowledge from grid-physics with ML techniques to jointly enable a performant cyber-resilient EMS. Specifically, we propose to augment recent advances in circuit-based grid modeling with ML techniques to 1) develop a novel domain-knowledge driven anomaly detection algorithm to identify stealthy attacks on telemetry, and 2) design ML-assisted warm-start techniques to accelerate online simulations for evaluating cyberattacks.

Amritanshu Pandey
Systems Scientist, Electrical and Computer Engineering
Carnegie Mellon University

Vyas Sekar
Tan Family Professor of Electrical and Computer Engineering
Carnegie Mellon University

Lujo Bauer
Professor, Electrical and Computer Engineering
Carnegie Mellon University

Craig Miller
Research Professor
Carnegie Mellon University

Lawrence Pileggi
Tanoto Professor, Electrical and Computer Engineering
Carnegie Mellon University

Sharing Mobile Energy Storage: Platforms and Learning Algorithms

Electricity storage will play a profound role in our collective sustainable energy future. It is required for efficient and clean electrified transportation systems, it is necessary for zero-emissions balancing of intermittent renewable generation, and it is essential to provide ramping services to balance supply and demand when solar PV rapidly goes offline. While the importance of electricity storage in providing diverse electricity services is undisputed, the economic provision of these services remains challenging. Electricity storage is very expensive, and high utilization rates are needed to recoup capital costs. Sharing installed storage to provide a diversity of services or to serve many geographically distributed users can ensure sufficiently high utilization rates to justify investment costs. Mobile energy storage, such as truck-mounted lithium-ion battery modules or conventional electric vehicles, will enable sharing. Mobile energy storage can simultaneously serve the role of generation (supplying energy) and distribution (reticulating energy). Our thesis is that carefully designed data platforms and associated algorithms play a central role in any successful implementation of a sharing business for mobile energy storage. Platforms are necessary to store, manage, process, and distribute enormous volumes of real-time data. Learning algorithms can process the available data to forecast geographic demand for storage and route the available mobile supply of storage to these demand points. This must be done by the platform that serves diverse and competing storage services while respecting quality-of-service constraints. This proposal aims to design, validate, and test platforms and learning algorithms for mobile storage applications.
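
The forecast-then-route step can be sketched with a greedy nearest-feasible assignment of mobile batteries to forecast demand sites. Positions, capacities, and demands are invented; a real platform would solve this with proper optimization under quality-of-service constraints.

```python
# Greedy dispatch: serve the largest forecast demand first, sending its
# nearest still-free truck. All data here are illustrative.
trucks = {"T1": (0.0, 0.0), "T2": (5.0, 5.0)}               # (x, y) positions
sites = {"S1": ((1.0, 0.0), 2.0), "S2": ((5.0, 4.0), 3.0)}  # pos, MWh need

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

assignment = {}
free = set(trucks)
for site, (pos, need) in sorted(sites.items(), key=lambda kv: -kv[1][1]):
    if not free:
        break
    best = min(free, key=lambda t: dist(trucks[t], pos))
    assignment[site] = best
    free.remove(best)

print(assignment)
```

Greedy rules like this are a useful baseline; the interesting platform problems arrive when demands are uncertain forecasts and competing services bid for the same trucks.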

Kameshwar Poolla
Cadence Design Systems Distinguished Professor of Mechanical Engineering
University of California, Berkeley

Pravin Varaiya
Professor, Graduate School
University of California, Berkeley

Junjie Qin
Assistant Professor, Electrical and Computer Engineering
Purdue University

Learning in Routing Games for Sustainable Electromobility

The transportation sector is the largest contributor to greenhouse gas emissions worldwide. The electrification of road transportation can shift emissions from roads to electric power generation. However, the ambition to achieve zero-emission mobility requires a new, sustainability-oriented approach to transportation planning at societal scale that respects infrastructural constraints and individual incentives while being resilient to infrastructure component failures. We propose to develop sustainability-aware traffic-routing algorithms and tools that leverage and fuse heterogeneous, noisy, and often incomplete data from a variety of sources, such as infrastructure condition data, traffic vehicle counts and flow data, power distribution grid data, and weather data. The key contribution is to account for operational costs, infrastructure condition deterioration, and environmental externalities due to multi-class traffic in the design of environmentally desirable traffic routing mechanisms. The research will help answer specific questions such as: How should heavy-duty vehicles be deployed and routed at a large scale to strike the right tradeoffs between operational costs, sustainability, and electric power grid constraints? How can vehicle data be integrated with traffic count and environmental measurements to estimate key externalities? How can we develop specific routing strategies to ensure proper exploration for estimating infrastructure conditions, but still limit the environmental footprint of the majority of vehicles? The project leverages the expertise of two research teams, from KTH and MIT, with extensive previous experience in traffic systems, control systems, optimization, and game theory.
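
Why selfish routing needs steering can be shown with the classic two-road (Pigou) example: one unit of traffic splits between a congestible road whose per-vehicle cost equals its own traffic fraction x, and an uncongested road of fixed cost 1.

```python
# Pigou's two-road example: equilibrium vs socially optimal routing.
def total_cost(x):
    # x on the congestible road (per-vehicle cost x), 1 - x on the fixed road.
    return x * x + (1 - x) * 1.0

# Wardrop equilibrium: drivers switch while the congestible road is cheaper;
# since its cost x never exceeds 1, everyone ends up on it.
x_eq = 1.0
# Social optimum: minimize x^2 + (1 - x); derivative 2x - 1 = 0 gives x = 1/2.
x_opt = 0.5

print(total_cost(x_eq), total_cost(x_opt))  # 1.0 at equilibrium vs 0.75
```

The 4/3 gap between equilibrium and optimum is exactly the inefficiency that incentive-aware routing mechanisms, of the kind the project studies, are designed to close.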


Henrik Sandberg
Professor, School of Electrical Engineering and Computer Science
KTH Royal Institute of Technology

Saurabh Amin
Associate Professor, Civil and Environmental Engineering
Massachusetts Institute of Technology

György Dán
Professor, Network and Systems Engineering
KTH Royal Institute of Technology

Gyözö Gidofalvi
Associate Professor, Geoinformatics
KTH Royal Institute of Technology

Reinforcement Learning for a Resilient Electric Power System

Electric grids are subject to growing cyber-threats as business and operational capabilities are increasingly targeted by malicious actors. Although less frequent than cyberattacks, physical attacks that alter the generation and transmission networks in ways leading to system failures are also growing in number. Furthermore, electric grids are being impacted by severe faults resulting from natural calamities that are increasingly frequent due to climate change. Harnessing the potential of AI techniques to make the power system resilient against such extreme cases is crucial. We propose to develop AI-based methods, and corresponding testing strategies, to achieve this goal. We will first develop reinforcement learning-based methods to improve the resilience of the electric power system against potential test-time attacks and failures. We will consider the joint cases where both physical and cyberattacks happen at the same time, leading to system failure, with severe implications for grid operation. Our methods will extend seamlessly to cases where only a physical or a cyberattack occurs. We will use Byzantine game analysis to improve system resilience against training-time attacks and failures. We will also design a human-in-the-loop auditing system to prioritize power grid segments that should be manually checked. Finally, we propose techniques based on rare-event modeling and sequential decision-making to efficiently generate realistic adversarial attack scenarios for stress-testing the resilience of the system and for adversarial training of defense strategies. The end product is an open-source application for improving network resilience on large-scale, real-world power networks.

Alberto Sangiovanni-Vincentelli
Edgar L. and Harold H. Buttner Chair of Electrical Engineering and Computer Sciences
University of California, Berkeley

Bo Li
Assistant Professor
University of Illinois at Urbana-Champaign

Ming Jin
Assistant Professor, Electrical and Computer Engineering
Virginia Tech

Plant-wide Leak Detection in Liquified Natural Gas Assets

Methane is a potent greenhouse gas with a greater impact on global warming than carbon dioxide. A 2018 research study shows that actual methane emissions from the U.S. oil and natural gas supply chain are about 6 percent higher than EPA estimates, because existing measurement methods commonly miss emissions under abnormal operating conditions. The IEA estimate of current methane emission rates is about 1.7 percent, and Shell aims to bring it down to 0.2 percent by 2025. In an industry partnership with Shell, UIUC researchers are working on an AI/ML and sensor data fusion framework for leak localization and leak estimation using process data. The leak localization algorithm should ideally detect single or multiple leaks and assign suitable probabilities to every potential leak, so that it can serve as an advisory for targeted maintenance, to be confirmed with additional technologies such as acoustic sensing, IR cameras, and LIDAR.
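One simple way to "assign suitable probabilities to every potential leak" is Bayesian inversion over candidate leak locations. The following is a minimal sketch under assumed conditions, not the project's algorithm: each hypothetical leak location has a known residual signature (the sensor deviations it would cause), and observed residuals are scored with a Gaussian noise model. All location names, signatures, and readings are invented for illustration.

```python
import math

# Hypothetical residual signatures: sensor deviations each candidate
# leak would produce (illustrative numbers, three pressure sensors).
signatures = {
    "valve_A":      [1.0, 0.2, 0.0],
    "compressor_B": [0.1, 0.9, 0.3],
    "pipe_C":       [0.0, 0.3, 1.0],
}
observed = [0.9, 0.25, 0.05]  # measured-minus-predicted residuals
SIGMA = 0.2                   # assumed sensor noise standard deviation

def likelihood(signature, obs):
    # independent Gaussian noise on each sensor
    sq_err = sum((s - o) ** 2 for s, o in zip(signature, obs))
    return math.exp(-sq_err / (2 * SIGMA ** 2))

# posterior over leak locations (uniform prior), normalized to sum to 1
post = {loc: likelihood(sig, observed) for loc, sig in signatures.items()}
total = sum(post.values())
post = {loc: p / total for loc, p in post.items()}

for loc, p in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"{loc}: {p:.3f}")
```

The ranked posterior is exactly the kind of advisory output described above: maintenance crews check the highest-probability location first with acoustic or IR instruments.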

RS Sreenivas
Professor and Associate Head for Graduate Studies
University of Illinois at Urbana-Champaign

Richard Sowers
Professor of ISE and Mathematics
University of Illinois at Urbana-Champaign

R. Arvind
Data Scientist
Shell

Mithun C. Dharman
Data Engineer, AI Engineering
Shell

Vishwanath D. Doddamani
Software Engineer
Shell

Naveen Kapoor
Senior Machine Learning Engineer
Shell

Shirish Nanda
Researcher
Shell

Rihab Abdul Razak
Data Science Researcher
Shell

Senthil Kumar Vadivelu
Principal Data Scientist
Shell

Stephen Varghese
Data Science Researcher
Shell

AI-Driven Materials Discovery Framework for Energy-Efficient and Sustainable Electrochemical Separations

Clean water is a grand challenge of the 21st century, as 700 million people worldwide lack access due to geographic constraints and anthropogenic pollution. Ionic pollutants like heavy metals and organic nutrients are critical separation challenges and are closely related to climate and energy security. Electrically driven water purification is advantageous, as it integrates well with renewable energy and eliminates secondary pollution from chemical inputs. Water purification and regional-scale water treatment can be some of the most carbon-intensive and chemical-intensive processes in a municipality. To address these challenges, we will establish a new paradigm for rationally designing redox-polymers for anion-selective separations, using first-principles calculations, machine learning, and molecular dynamics (MD) simulations to guide adsorbent development from redox group selection to polymer synthesis to applications in sustainable separations. Redox-active polymers offer a powerful avenue for electrochemical control of ion selectivity and reversibility through tailored synthesis. We will: 1) establish a machine learning framework coupled with distributed computing and quantum mechanical MD (QM/MD) for screening and informing adsorbent design, based on binding calculations between ion-receptors and target ions, and 2) integrate theory with synthesis and electrochemical tests to quantify and later predict selectivity for a range of target ions. Through close iteration between computation and experiments, our novel approach will account for both microscopic molecular binding and macroscopic energy performance. We expect this proof-of-concept project to establish a workflow for accelerating the development of new technologies for water treatment and selective contaminant remediation, with reduced carbon footprint and enhanced energy efficiency.
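The screening loop described in thrust 1 can be sketched in miniature: fit a surrogate model to a few expensive first-principles binding calculations, then rank untested candidates by predicted binding energy. Everything below is illustrative, not real chemistry: the descriptor, the training energies, and the candidate redox-group names are all invented, and a one-variable linear fit stands in for the machine-learning model.

```python
# Training data: (descriptor value, DFT binding energy in eV) pairs.
# Both columns are invented for illustration; more negative = stronger binding.
train = [(0.10, -0.31), (0.25, -0.52), (0.40, -0.74), (0.55, -0.93)]

# Ordinary least-squares fit of energy vs. descriptor (closed form).
xs = [x for x, _ in train]
ys = [y for _, y in train]
n = len(train)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in train) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

# Hypothetical untested candidates, each with a computed descriptor value.
candidates = {"ferrocene_R1": 0.30, "viologen_R2": 0.50, "TEMPO_R3": 0.15}

# Rank by predicted binding energy: most negative (strongest binder) first.
ranked = sorted(candidates, key=lambda c: slope * candidates[c] + intercept)
print(ranked)
```

The top-ranked candidates would then be passed to synthesis and electrochemical testing, and the new measurements fed back into the training set, closing the computation-experiment loop the abstract describes.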

Xiao Su
Professor of Chemical and Biomolecular Engineering
University of Illinois at Urbana-Champaign

Diwakar Shukla
Associate Professor
University of Illinois at Urbana-Champaign

Graphene Recipes for Synthesis of High-quality Materials (Gr-ResQ)

Industrial production of graphene by chemical vapor deposition (CVD) requires more than the ability to synthesize large-domain, high-quality graphene in a lab reactor. The integration of graphene into the fabrication process of electronic devices requires the cost-effective and environmentally friendly production of graphene on dielectric substrates, but current approaches can only produce graphene on metal catalysts. Sustainable manufacturing of graphene should also conserve the catalyst and reaction gases, but today the metal catalysts are typically dissolved after synthesis. Progress toward these objectives is hindered by the hundreds of coupled synthesis parameters that can strongly affect CVD of low-dimensional materials, and by the fact that the rich experimental data held in individual laboratories is poorly communicated in the published literature. The “graphene recipes for synthesis of high-quality material” (Gr-ResQ, pronounced “graphene rescue”) application implements powerful new tools for data-driven graphene synthesis. At the core of Gr-ResQ is a crowd-sourced database of CVD synthesis recipes and associated experimental results. The database captures 300 parameters ranging from synthesis conditions such as catalyst material and preparation steps, to ambient lab temperature and reactor details, as well as resulting Raman spectra and microscopy images. These parameters are carefully selected to unlock the potential of machine-learning models to advance synthesis. A suite of associated tools enables fast, automated, and standardized processing of Raman spectra and scanning electron microscopy images. To facilitate community-based efforts, Gr-ResQ provides tools for cyber-physical collaborations among research groups, allowing experiments to be designed, executed, and analyzed by different teams.
We envision a holistic approach to data-driven synthesis that can accelerate CVD recipe discovery and production control and open opportunities for advancing not only graphene but also many other 1D and 2D materials, unlocking a variety of potential energy applications including energy storage, solar cells, and improved transmission.
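To make the "automated, standardized processing of Raman spectra" concrete, here is a minimal sketch of one standard graphene-quality metric: the D/G peak intensity ratio, where a small ratio indicates few defects. The spectrum below is synthetic (a flat baseline of 100 counts with Gaussian peaks at the usual D and G positions, ~1350 and ~1580 cm⁻¹), and the simple window-maximum peak finder is an illustrative stand-in for the project's actual processing tools.

```python
import math

def peak_intensity(shifts, counts, center, halfwidth=50):
    """Maximum counts within +/- halfwidth of the expected peak position."""
    return max(c for s, c in zip(shifts, counts) if abs(s - center) <= halfwidth)

# Synthetic Raman spectrum: baseline 100, weak D peak (1350 cm^-1),
# strong G peak (1580 cm^-1); all intensities illustrative.
BASELINE = 100.0
shifts = list(range(1200, 1800, 2))
counts = [BASELINE
          + 40.0 * math.exp(-((s - 1350) / 15) ** 2)
          + 400.0 * math.exp(-((s - 1580) / 15) ** 2)
          for s in shifts]

d_peak = peak_intensity(shifts, counts, 1350) - BASELINE
g_peak = peak_intensity(shifts, counts, 1580) - BASELINE
ratio = d_peak / g_peak
print(f"I(D)/I(G) = {ratio:.2f}")  # low ratio suggests low defect density
```

Running the same deterministic analysis over every uploaded spectrum is what makes crowd-sourced entries comparable, and hence usable as machine-learning training data.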

Sameh Tawfick
Associate Professor; Ralph A. Andersen Faculty Scholar
University of Illinois Urbana-Champaign

Elif Ertekin
Associate Professor; Andersen Faculty Scholar
University of Illinois Urbana-Champaign

Darren K. Adams
Lead Research Programmer
National Center for Supercomputing Applications
University of Illinois Urbana-Champaign

Affordable Gigaton-Scale Carbon Sequestration: Navigating Autonomous Seaweed Growth Platforms by Leveraging Complex Ocean Currents and Machine Learning

Carbon sequestration is critical to stabilize our climate and mitigate the adverse effects of climate change. However, existing approaches are either not scalable or prohibitively expensive. A promising approach utilizes seaweed, which fixes dissolved CO2 into biomass. Parts of this biomass naturally reach the deep ocean, where the carbon is confined for millennia. Floating platforms that autonomously grow and deposit seaweed could scale this natural process to the open ocean, unlocking its immense potential for harvesting solar energy. But steering such platforms with powerful motors is prohibitively expensive. We propose to develop control and learning methods to optimally navigate floating platforms by hitchhiking on ocean currents, using only minimal solar-powered propulsion to nudge them into the right currents for optimal growth and sequestration. Existing methods struggle to reliably control energy- and actuation-constrained agents in non-linear, dynamic, and highly uncertain fields such as the ocean. We plan to combine probabilistic ocean modeling and control theory with machine learning for path-planning algorithms. Our three research thrusts are: 1) realistic data-driven ocean and platform modeling and simulation; 2) high-level path-planning and online adaptation of these paths using observations and physics-based current predictions with machine learning closures; and 3) receding horizon planning with a learned terminal value function and deep reinforcement learning in parts of the control architecture. The cost per ton of CO2 sequestered by platforms leveraging currents for navigation would be significantly less than that of other approaches, thus enabling affordable gigaton-scale sequestration. The C3 AI Suite will enable data integration, scalable computation, and fast deployment.
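The core "hitchhike the currents" idea can be illustrated with a toy planner. This is a sketch, not the project's method: on a small grid with a known, static current field (the real problem involves uncertain, time-varying currents), Dijkstra's algorithm finds the minimum-thrust route, where moving with the current is nearly free and moving against it is expensive. The grid size, current field, and cost model are all assumptions for illustration.

```python
import heapq

# Toy 5x5 ocean grid with a uniform eastward current (u, v) = (1, 0).
current = {(x, y): (1.0, 0.0) for x in range(5) for y in range(5)}
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def thrust_cost(cell, move):
    """Energy to move one cell: thrust needed beyond what the current gives."""
    u, v = current[cell]
    along = u * move[0] + v * move[1]   # current component along the move
    return max(0.1, 1.0 - along)        # small floor cost even when drifting

def plan(start, goal):
    """Dijkstra over grid cells; returns minimum total thrust cost."""
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            return d
        if d > dist.get(cell, float("inf")):
            continue
        for m in MOVES:
            nxt = (cell[0] + m[0], cell[1] + m[1])
            if nxt not in current:
                continue
            nd = d + thrust_cost(cell, m)
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(pq, (nd, nxt))
    return float("inf")

east = plan((0, 2), (4, 2))  # downstream: ride the current
west = plan((4, 2), (0, 2))  # upstream: fight the current
print(east, west)
```

The order-of-magnitude gap between the downstream and upstream costs is precisely why minimal actuation suffices when the planner chooses routes that exploit the flow field.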

Claire Tomlin
Charles A. Desoer Chair in the College of Engineering
University of California, Berkeley

Pierre Lermusiaux
Professor, Mechanical Engineering
Massachusetts Institute of Technology

Marius Wiggert
Doctoral Student
University of California, Berkeley

Manmeet Bhabra
Graduate Research Assistant
Massachusetts Institute of Technology

Manan Doshi
Doctoral Student
Massachusetts Institute of Technology

Data-Driven Control and Coordination of Smart Converters for Sustainable Power System Using Deep Reinforcement Learning

Moving towards decarbonization and 100 percent renewables, electric power systems are facing voltage stability challenges caused by intermittent renewable generation (e.g., PV and wind). Power electronic converters, widely integrated into the grid as the interface for renewables, energy storage, and electric vehicles, in theory have enough controllability and flexibility to address the voltage stability issue. However, current model-based control methods are not able to handle the high volatility and uncertainty of distribution grids. Moreover, existing works ignore important interconnected dynamics of converters, which can easily lead to interaction instability in real systems, and they impose a high communication burden. This project addresses these challenges by leveraging the ongoing digitalization of the grid and state-of-the-art deep reinforcement learning to achieve data-driven and communication-efficient control and coordination of smart converters. We will train optimal policies on advanced simulation environments that capture important converter dynamics and interactions. Moreover, we will implement our algorithms on real experimental microgrids already developed in our lab. The PIs are leading experts in simulation of converter dynamics and smart-grid interactions, implementation of control algorithms on smart-grid hardware, and distributed, communication-efficient AI/ML, optimization, and control in networked systems. The developed software will add significant value to industry, including advanced DRL solutions, active grid management, and grid data analytics tested on real hardware. We will make all developments easily accessible for use by practitioners and publicly available.
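The data-driven flavor of the problem can be conveyed with a deliberately tiny sketch: instead of deep RL on a full converter model, we tune a single reactive-power feedback gain on a one-bus discrete-time voltage model by directly evaluating closed-loop performance under random renewable disturbances. The model, sensitivity constant, and disturbance magnitudes are all illustrative assumptions.

```python
import random

# Toy single-bus model: v[k+1] = v[k] + B * q[k] + w[k], target 1.0 p.u.
# B is an assumed voltage sensitivity to converter reactive power q.
B = 0.05

def evaluate(gain, steps=200, seed=0):
    """Mean squared voltage deviation under a proportional feedback gain."""
    rng = random.Random(seed)
    v, cost = 1.0, 0.0
    for _ in range(steps):
        q = -gain * (v - 1.0)            # inject/absorb reactive power
        w = rng.uniform(-0.02, 0.02)     # renewable-driven disturbance
        v = v + B * q + w
        cost += (v - 1.0) ** 2
    return cost / steps

# "Learn" the controller from simulated closed-loop data: pick the gain
# with the lowest empirical cost (a stand-in for the DRL training loop).
gains = [0.0, 5.0, 10.0, 15.0, 19.0]
best = min(gains, key=evaluate)
print(best, evaluate(best))
```

Even this crude search shows the key point: performance is judged purely from simulated trajectories, with no explicit model inversion, which is what allows the approach to scale to the volatile, uncertain conditions that defeat model-based tuning.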

Qianwen Xu
Assistant Professor, Electric Power and Energy Systems
KTH Royal Institute of Technology

Sindri Magnússon
Lecturer
Stockholm University

Robert Pilawa-Podgurski
Associate Professor, Electrical Engineering and Computer Sciences
University of California, Berkeley

Interpretable Machine Learning Models to Improve Forecasting of Extreme-Weather-Causing Tropical Monster Storms

We propose to develop interpretable machine-learning (ML) models to forecast the Madden-Julian Oscillation (MJO) — the Storm King of Earth’s tropics. The MJO is an irregular, month-long, planetary-scale rainfall pattern over the tropical Indian and Pacific Oceans. Its associated winds and precipitation have far-flung global impacts, e.g., influencing hurricane formation, El Niño initiation, and North American rainfall and heatwaves. Current numerical weather prediction (NWP) models solve the governing equations of the atmosphere on coarse computing grids, not resolving individual convective storms. Such NWP models have repeatedly failed to forecast the MJO due to large uncertainties in representing convection. For example, the strongest MJO event occurred in March 2015 and subsequently triggered an El Niño event; NWP models neither predicted the MJO’s initiation nor accurately forecast its evolution even two weeks in advance. We propose to improve forecast skill for the MJO using an observationally constrained ML model. This ML model learns from observations, thus avoiding biases due to convection representations in NWP models, and would also be computationally efficient. To circumvent limited observations, we will use transfer learning to train a physics-aware convolutional neural network (CNN) — first on high-resolution, convection-resolving atmosphere model simulations, and subsequently on observations. Once a skillful CNN is trained, we will use the contextual decomposition framework to interpret key factors leading to accurate MJO forecasts. The success of this project will improve forecasting capabilities for the MJO and its associated extreme events, providing essential information for flood control and water management.
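The transfer-learning strategy — pretrain on plentiful simulation data, then fine-tune on scarce observations — can be demonstrated in miniature. Here a one-parameter regression stands in for the CNN, and both datasets are synthetic: the "simulation" is abundant but slightly biased (true slope 2.0) while the "observations" are scarce but unbiased (true slope 2.3). All numbers are illustrative.

```python
import random

def sgd(w, data, epochs, lr):
    """Plain stochastic gradient descent on squared error for y ~ w * x."""
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x
    return w

rng = random.Random(0)
# Abundant, slightly biased "simulation" data (slope 2.0 + noise).
sim = [(x, 2.0 * x + rng.gauss(0, 0.1))
       for x in [rng.uniform(-1, 1) for _ in range(500)]]
# Scarce "observations" (slope 2.3 + noise) -- only 10 samples.
obs = [(x, 2.3 * x + rng.gauss(0, 0.1))
       for x in [rng.uniform(-1, 1) for _ in range(10)]]

w_pre = sgd(0.0, sim, epochs=5, lr=0.05)    # pretrain on simulation
w_ft = sgd(w_pre, obs, epochs=20, lr=0.05)  # fine-tune on observations
print(w_pre, w_ft)
```

The pretrained parameter lands near the simulation's biased answer and fine-tuning pulls it to the observed one; the same logic, scaled up, lets a CNN exploit convection-resolving simulations without inheriting their biases.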

Da Yang
Faculty Scientist
Lawrence Berkeley National Laboratory

Bin Yu
Chancellor’s Distinguished Professor and Class of 1936 Second Chair, Departments of Statistics and Electrical Engineering and Computer Sciences
University of California, Berkeley

Pedram Hassanzadeh
Assistant Professor of Mechanical Engineering
Rice University

Wahid Bhimji
Big Data Architect, Data and Analytics Services, NERSC
Lawrence Berkeley National Laboratory