Blog

Multi-Horizon Weather Impact Forecasting with AI

May 21st 2021

by Vijay Jayachandran, CEO

Extreme weather events cost the United States tens of billions of dollars every year. With footprints that stretch from the Atlantic Seaboard to the Rockies, events like hurricanes, tornadoes, and ice storms create huge disruptions that are now part of the lifestyle calculus for almost every American.

2020 was a blowout year for weather- and climate-related insurance payouts. According to Aon’s 2021 report, 76% of global insured losses due to natural disasters occurred in the US. Additionally, weather-related disasters accounted for 99% of all insured natural disaster losses.

We are riding a trend that will likely worsen in the coming years. With such exposure, asset managers and underwriters are scrambling to become more nuanced in their interpretation of risk so that they can reduce portfolio exposure and increase capital efficiency.

Source: https://www.ncdc.noaa.gov/billions/

There are three distinct horizons when it comes to preparing for extreme weather.

  • The Long Horizon: Before the storm clouds have started forming. This is the domain of long-term forecasting, where we know that some big events are heading our way but even the best climatological models cannot predict whether any of them will be memorable outliers or how they might play out. Given this, the orientation becomes strategic and the focus shifts towards understanding systemic vulnerabilities and risks. Armed with these insights, asset managers and underwriters can efficiently allocate capital and build more resilient portfolios.

  • The Short Horizon: A few days before the event when the weather forecast indicates that something big is brewing. This is the domain of short-term forecasting, where the focus is on preparing for the imminent and minimizing downside. In this window, the orientation is tactical, and the most important thing is to understand the various disruptions that are likely to occur, and what the worst-case impacts might be. These insights can help asset managers with executing mitigative actions and placing appropriate hedges.

  • The Now Horizon: The event has struck, and its impact is being experienced in real-time. This is the domain of nowcasting, where the focus is on the “here and now” and the orientation is operational. During this time, the most important thing is to gain real-time situational intelligence and accurately interpret what is happening. These insights can help asset managers with performing rapid triage and executing decisive responses.

Forecasting the impact of extreme events is a complex undertaking. There are many influencing factors and variables, all of which combine in unique ways to create highly disruptive outcomes. An emerging technique for managing this complexity is machine learning. ML models are highly extendable and can ingest large sets of disparate data. When built well, these models can tease out nuances and sensitivities that are not readily discernible via conventional techniques.

At Whether, we have developed proprietary machine learning techniques to estimate and visualize the impact of extreme weather events. Our ML models combine remote sensing data from hazardous events, feature data from impacted assets, and a variety of environmental risk factors. Due to their dynamic nature, our models can be used for forward-looking simulations as well as optimization studies along different time horizons. They are highly scalable, improve with network effects, and can be deployed across large geographies.

In a previous blog post, we defined resilience as the ability of a system to experience the smallest possible disruption for the shortest amount of time when subjected to an external shock. ML-based impact nowcasting and forecasting tools now make it possible for asset managers to proactively manage risks from extreme weather and build resilient portfolios.

Weather Impact "Nowcasting" with AI

March 8th 2021

by Vijay Jayachandran, CEO

Extreme weather events such as hurricanes and derechos are a logistical challenge for emergency managers. Even when they prepare extensively ahead of such events by lining up manpower and stockpiling resources, their ability to execute is gated by the speed with which they can understand the situation on the ground.

The first 24 to 48 hours after an extreme event are usually confusing and information-scarce. Physical surveys take time and are slowed down by debris and flooding. Aerial surveys cover ground faster, but the post-processing and interpretation of imagery is time-consuming and can take a couple of days. These delays are further exacerbated when the impact footprint is large, especially in rural and heavily wooded areas. While all this happens, restoration crews wait.

Many organizations have hired meteorologists to address this gap in situational intelligence. They have also engaged weather analytics companies to provide custom forecasting and “nowcasting” packages. Parameters such as wind speed, precipitation, and temperature are now routinely used by frontline operations teams to estimate impacts and prioritize response efforts.

Unfortunately, the relationship between weather and its impacts is complex. Simple heuristics cannot account for the myriad interdependent variables and risk factors that influence outcomes. For example, for the same wind speed, a house sitting in an open field is less likely to be damaged than one that is surrounded by trees. During extreme events, these types of interdependencies must be extracted at scale, across large geographies, and for a range of conditions. This is an ideal use case for machine learning.

In our inaugural blog post, we described how we can use historical records of hazards (extreme weather events), targets (infrastructure assets), risk factors (asset age, soil type, proximity to trees), and impacts (damage, restoration cost) to build complex and capable machine learning models that can be represented simply as:

impact = f(hazards, targets, risk factors)
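
As a rough sketch of how such a function might be fit in practice, the Python snippet below trains a simple regression model on a table of historical events. The file, column names, and model choice are purely illustrative stand-ins, not a description of our production system.

    # Minimal sketch: fitting impact = f(hazards, targets, risk factors)
    # from a table of historical events. All names below are illustrative.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    records = pd.read_csv("historical_events.csv")   # hypothetical dataset

    features = [
        "peak_gust_mph", "total_precip_in",      # hazard
        "asset_age_yrs", "pole_height_ft",       # target
        "tree_density", "soil_drainage_index",   # risk factors
    ]
    X, y = records[features], records["damage_cost_usd"]  # observed impact

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor().fit(X_train, y_train)
    print("holdout R^2:", model.score(X_test, y_test))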

These types of impact models can enable some extremely useful outcomes. The key input variable during an extreme event is the hazard. If we can measure the weather variables in real-time, we can feed them into the above model and estimate impacts in real-time as well. This ability to “nowcast” impacts is potentially transformational. Imagine if emergency responders could get a high-confidence visualization of storm damage within an hour after the storm has passed. They could hit the ground running and deploy their resources in a highly surgical manner.
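
To make the nowcasting step concrete, here is a small continuation of the sketch above: the fitted model and feature list are reused, and observed (rather than forecast) weather is joined to the asset inventory and scored per asset. Again, every file and column name is a hypothetical placeholder.

    # Sketch of nowcasting: score each asset with observed, real-time weather.
    # Reuses `model` and `features` from the previous sketch; names are illustrative.
    import pandas as pd

    live_weather = pd.read_csv("latest_observations.csv")  # hypothetical real-time feed
    assets = pd.read_csv("asset_inventory.csv")            # hypothetical asset list

    snapshot = assets.merge(live_weather, on="grid_cell_id")  # join on a shared grid
    snapshot["estimated_impact"] = model.predict(snapshot[features])

    # Rank assets so crews can be dispatched to the worst predicted damage first.
    priority = snapshot.sort_values("estimated_impact", ascending=False)
    print(priority[["asset_id", "estimated_impact"]].head(10))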

Impact models can estimate other things besides damage. While damage is the most visible form of impact, models can also be used to estimate other impacts such as the amount of restoration effort involved, the financial cost of the restoration, the potential impacts to society, etc. An Assistive AI system that can estimate these outcomes – rapidly and accurately – would be an indispensable tool for emergency responders, government bodies, and risk managers.

Is any of this possible? In one word, yes. At Whether, we have built such tools and are actively working to bring them to market. It must be noted, however, that the level of accuracy that can be obtained from such systems will never match what you would get through site visits and physical inspections. That said, they are significantly better than traditional heuristic-based approaches and can be of tremendous help to emergency responders during the early hours after an event.

In our last blog post, we defined a resilient system as one that recovers fast after a disruption. Impact Nowcasting provides a viable and critical solution for increasing the resilience of critical infrastructure systems that power humanity.

Resilience Through Impact Modeling

February 18th 2021

by Peter Watson, CTO

[Figure: Before and after nighttime lights images of Texas during the February 2021 outage event]

If 2020 taught us anything, it was that unexpected events can be hugely disruptive. And the lessons have continued in 2021. The recent power outages across large parts of America, caused by severe winter weather, have endangered lives and disrupted the economy, illustrating the importance of infrastructure resilience.


‘Resilience’ is a popular term with many definitions, but as illustrated in the chart below [1], I am specifically referring to the ability of a system to experience the smallest possible disruption for the shortest amount of time when subjected to an external shock. In order to design this type of system, we must first be able to understand the potential for various external shocks and then execute solutions that make the system robust to those shocks.
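
To make that definition a little more concrete, one common way to score a disruption is as the area between the normal performance baseline and the degraded performance curve over time; a smaller area means a more resilient system. The numbers below are entirely made up and only illustrate the calculation.

    # Illustrative only: scoring a disruption as the area between baseline
    # performance and observed performance over time (the "resilience curve").
    import numpy as np

    hours = np.arange(0, 72)              # a 3-day window, hourly samples
    baseline = np.ones(len(hours))        # normal performance = 100%

    performance = np.ones(len(hours))     # hypothetical outage and recovery
    performance[10:20] = 0.4                         # performance drops to 40%
    performance[20:50] = np.linspace(0.4, 1.0, 30)   # linear recovery

    # Lost service in "performance-hours": smaller is more resilient.
    lost_service = np.trapz(baseline - performance, hours)
    print(f"lost service: {lost_service:.1f} performance-hours")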

Many disasters like the recent blackouts in Texas are related to a lack of imagination about the potential and consequences of extreme events. These scenarios always seem distant and far too improbable to warrant serious time and attention. However, as the effects of climate change pile up, these “grey rhino” events are only going to arrive more often and with increased ferocity. It is critical that infrastructure owners appreciate these risks and invest in tools that give them the situational intelligence to respond effectively. Armed with the right information, emergency managers can prepare effectively in advance of extreme events and react with confidence when they do occur.

A range of emerging, sophisticated dynamic modeling approaches can generate the knowledge required both to reduce the size of disruptions and to shorten them when they occur. These approaches take advantage of recent trends in digitization and machine learning to create granular analyses that represent multiple dimensions of disruptive events before, during, and after they happen. They can help us understand how systems fail today, and how they might fail in the future. With those insights, we can appreciate the risk that we face as a society and how best to create a more resilient and prosperous future.

The Resilience Curve [1]

[1] Chart taken from a presentation by Gil Bindewald and Guohui Yuan, North American Energy Resilience Model (NAERM) Status Update, US DOE Office of Electricity, May 29th 2020.


Dynamic Models for Granular Risk Management

February 4th 2021

by Peter Watson, CTO

Predictive models have become ubiquitous tools for businesses to manage financial and operational risk. Not all modeling approaches, however, are created equal.


The most common type of risk model today is rooted in statistical aggregation. These models analyze past events in aggregate to estimate the number and type of events that will happen in the future. This is how automotive insurance companies, for example, ensure that what they charge their customers is enough to cover all of their claims. Some customers might be more or less expensive to insure than others, but as long as the policies are priced to overcome the aggregated risks, the company remains profitable.

This generally effective but simple approach has some shortcomings. Firstly, because it relies on aggregation, this technique does not describe individual events very effectively. The effect is pronounced for extremely disruptive events, which often leads to their not being taken seriously as a threat. Their risks are discounted by decision makers because they appear to be so rare; unfortunately, their financial impact can be catastrophic when they do arise. Secondly, aggregation leads to inefficiency because risks end up being spread out evenly. This approach of equally penalizing everyone across a risk bucket leaves a significant amount of money on the table. A more granular slicing of risky assets can lead to more efficient capital allocation and the elimination of leakage. Finally, statistical aggregation using past data does not work well if the characteristics of the events change over time. This is especially important in the case of events like natural disasters, whose rate and magnitude have been rapidly accelerating due to climate change. The risks presented by these events will not be correctly quantified if we rely only on historical rates and trends.


We propose to move to an alternative approach called Dynamic Modeling that can overcome the above limitations and lead to more efficient capital allocation. This technique aims to simulate how a system actually works, and predicts outcomes based on the characteristics of individual events or customers. It requires more information than aggregation-based approaches, but also provides more granular insights with greater accuracy.


Dynamic Modeling is gaining traction in several industries. Revisiting the example of automotive insurance, there are programs today that involve customers installing a sensor module in their car (e.g., Progressive Snapshot), which provides the insurance company with information about their driving habits. This enables pricing customized to how each customer drives, thereby managing the company’s risk while providing extra value to customers at the same time.
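
To illustrate the contrast with a toy example (all numbers are made up): under aggregation every driver pays the same pooled expected loss, while a dynamic model prices each driver from their own observed behavior.

    # Toy comparison: pooled (aggregate) pricing vs. a per-driver dynamic model.
    # All numbers and coefficients below are made up for illustration.
    import pandas as pd

    drivers = pd.DataFrame({
        "driver_id": [1, 2, 3, 4],
        "miles_per_year": [4_000, 12_000, 15_000, 30_000],
        "hard_brakes_per_100mi": [0.2, 0.5, 2.0, 4.0],  # e.g., from a telematics device
    })

    # Aggregated pricing: every driver is charged the pooled expected loss.
    drivers["aggregate_premium"] = 450.0

    # Dynamic pricing: a (very) simplified per-driver risk model.
    drivers["dynamic_premium"] = (
        0.02 * drivers["miles_per_year"] + 80.0 * drivers["hard_brakes_per_100mi"]
    )
    print(drivers[["driver_id", "aggregate_premium", "dynamic_premium"]])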


In short, Dynamic Models help risk managers appreciate each individual case for what it is, allowing for specific decision making to prevent inefficient outcomes. Their individualized treatment of assets and events results in surgically precise insights that maximize capital efficiency. They also enable realistic planning for the most extreme cases and help improve organizational resilience.


The Promise (and Realities) of AI / ML

January 20th 2021

by Vijay Jayachandran, CEO

Artificial Intelligence has been getting a bad rap of late, with numerous opinion pieces and articles describing how it has struggled to live up to the hype. Arguments have centered around computational cost, lack of high-quality data, and the difficulty in getting past the high nineties in percent accuracy, all resulting in the continued need to have humans in the loop.

None of this is new for those of us who have been doing simulation and optimization for some time. When I started my career, I had to contend with naysayers who liked to poke holes in my models and complain about their accuracy. For me (and other believers), it was never about achieving a perfect match between the model’s prediction and the ground truth. Models were simply a means to get new insights that could take us in the general direction of goodness.

All of this brings us to a philosophical question: why do we use models? In my opinion, we use models to explore complex phenomena that are too difficult to wrap our heads around.

Let’s be clear – the human brain is a remarkable evolutionary creation capable of many things that we cannot possibly model (e.g., empathetic and ethical decision making). However, there are certain things we can do with mathematical models that the human brain cannot do. A good example is weather forecasts, which come from large and complex computational models that consider a huge number of atmospheric characteristics. While we often complain about their accuracy, we also appreciate that they are much better than what we would predict without their help.

In the same vein, AI & ML are simply tools for building complex (and sometimes non-linear) models that consider large amounts of information. They are most potent in applications where their pattern finding power significantly exceeds human capability. If we adjust our attitude and expectations, we can leverage their power to bring about all sorts of tangible outcomes for humanity.

With this type of re-calibration, our mission at Whether is to use AI to help human decision makers, rather than replace them. We are using machine learning to build weather and climate impact models that help infrastructure managers allocate their resources efficiently. While our models do not perfectly match the ground truth, they are much more accurate and precise than simple heuristics, and can help infrastructure managers save millions of dollars through faster response and more efficient capital allocation.

Reacting to the Unpredictable

January 5th 2021

by Peter Watson, CTO

There were quite a few notable weather events in the United States over the course of 2020. One of particular note was a derecho that developed on the 10th of August in Iowa. After forming and rapidly intensifying, it headed east, sweeping across much of the Midwest causing widespread damage. In total, the storm caused about $7 billion in damage, and featured wind gusts in excess of 100 mph. It was likely the most damaging thunderstorm event in US history. For more details see: https://www.weather.gov/dvn/summary_081020

Derechos and thunderstorms inflict a huge amount of damage but have traditionally been very difficult to predict and prepare for. How, when, and where the convective energy that powers these events is released depends on a lot of different factors that are difficult for weather forecasters to predict with confidence and precision. In the best case scenario, the National Weather Service’s Storm Prediction Center is able to issue the appropriate watches and warnings several hours before the storm arrives, but even that is very little time for emergency managers to get prepared.

Additionally, because they are so sudden, the confusion and uncertainty that precede big convective storms linger well after they are over. Emergency managers at municipalities and utility companies can be unsure about the exact locations and levels of damage many days after the events have passed. And because such events can occur up to 15 times a year, they can adversely impact utility reliability metrics like CAIDI, SAIDI, and SAIFI.
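
For readers unfamiliar with those indices, they follow directly from interruption records (per IEEE 1366): SAIFI is interruptions per customer served, SAIDI is outage minutes per customer served, and CAIDI is the average outage duration per interrupted customer. A minimal sketch with made-up numbers:

    # IEEE 1366 reliability indices computed from toy interruption records.
    customers_served = 100_000

    # Each entry: (customers interrupted, outage duration in minutes).
    interruptions = [(5_000, 90), (12_000, 240), (800, 45)]

    customer_interruptions = sum(n for n, _ in interruptions)
    customer_minutes = sum(n * minutes for n, minutes in interruptions)

    saifi = customer_interruptions / customers_served  # interruptions per customer
    saidi = customer_minutes / customers_served        # outage minutes per customer
    caidi = saidi / saifi                              # avg minutes per interruption
    print(f"SAIFI={saifi:.3f}  SAIDI={saidi:.1f} min  CAIDI={caidi:.1f} min")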

Given the difficulty in forecasting these events, the focus must shift to interpreting them as soon as they have occurred. There are many real-time data sources that can be used to reconstruct events and estimate their impacts. Due to the highly non-linear nature of the interaction between weather and infrastructure assets, machine learning is becoming an increasingly powerful tool for modeling such events after the fact. Insights gained from these models can help emergency managers react quickly and decisively.

Decision support tools based on real-time weather observations are now a reality. They can dramatically reduce post-storm uncertainty and give emergency responders the situational awareness they need to react to sudden storms like the derecho of August 10, 2020.

Using AI-based Impact Models to Drive Infrastructure Resilience and Adaptation

December 17th 2020

by Peter Watson, CTO

I recently saw this figure in a report (linked here) developed by the National Infrastructure Advisory Council, and thought that it captured the different phases of incident response very nicely. Preparing well before an event, reacting quickly when it happens, and restoring efficiently after the event, all contribute to infrastructure resilience. And after the dust settles, it is equally important to perform retrospectives, learn lessons, and make adaptive changes so that you will be more resilient in the future.

Implementing a robust system of preparation, reaction, restoration, and adaptation is easier said than done. In reality, these activities are hampered by uncertainty, with lack of information causing problems at each step along the way. Forecasts can be inaccurate, so preparations can be off. Situational awareness can be lacking, so reactions can be tenuous. Post-storm information can be sparse, so restorations can be slow and confused. And long after such events, infrastructural systems can seem so large and complex that it can be very difficult to know which interventions or adaptations would really make a difference during the next storm. Even if such adaptations are made, it can be hard to know whether, or by how much, the changes actually improved things.

All of these difficulties can be addressed with high-quality Impact Modeling. Impact forecasts can inform preparations and improve their accuracy. Impact models forced with real-time observations of hazards can generate situational intelligence during and immediately after events; this can inform operational reactions and speed up the recovery process. After the fact, counterfactual models can be leveraged to evaluate the effectiveness of different adaptations, which can in turn be used to prioritize infrastructure investments.
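
As a rough sketch of the counterfactual idea: replay a historical hazard through an impact model twice, once with the assets as they actually were and once with a proposed hardening applied, then compare the predicted impacts. The model, file, and feature names below are hypothetical stand-ins.

    # Sketch of a counterfactual study: same hazard, two versions of the assets.
    # `model` is any trained impact model; all names are illustrative.
    import pandas as pd

    event_features = pd.read_csv("historical_event_features.csv")  # hypothetical replay set

    baseline_impact = model.predict(event_features).sum()

    # Counterfactual: pretend vegetation had been trimmed around every asset.
    hardened = event_features.copy()
    hardened["tree_density"] = hardened["tree_density"] * 0.5

    counterfactual_impact = model.predict(hardened).sum()
    print(f"predicted impact avoided: {baseline_impact - counterfactual_impact:,.0f}")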

Impact models can be an engine for resilience and adaptation. The broad application of this technology could allow us to create a future where humanity not only survives but thrives under climate change.

Robust Impact Modeling

December 1st 2020

by Peter Watson, CTO

Hello everyone. Welcome to the Whether blog! We’ll be posting here about how we’re applying machine learning and other cutting-edge analytical techniques to understand and predict the impacts of natural hazards and other related topics.

Using models to estimate the damage or impacts of natural hazards isn’t new, but there’s now a great opportunity to use modern data science to improve upon the status quo and create robust modeling frameworks that can help us build a resilient future. Established approaches to modeling impacts are often relatively simple, focusing on several main contributors to damage, e.g., the max wind speed of a hurricane, or the magnitude of an earthquake on the Richter scale. But there’s no need for that simplicity, and it can limit the effectiveness of an empirical model if the information considered is not comprehensive. For example, a hurricane impact model that ignored precipitation would never be able to estimate the impacts of wet and slow storms like Hurricane Harvey, and an earthquake impact model that ignored soil characteristics would never be able to quantify the effects of liquefaction.

A robust impact model should contain comprehensive information about all of the pertinent factors, including:

  1. The Hazard: the thing that causes the impacts (hurricane, tornado, earthquake, etc)

  2. The Target: things that could be impacted (power lines, cell towers, homes, etc)

  3. The Risk Factors: things that could contribute to the risk of impacts (trees, soil types, service history, etc)

  4. Observed Impacts: what happened as a result of the combination of Hazard, Target, and Risk Factors

If you can assemble a database of all of these factors based on historical events in a way that also describes their intersection in space and time, that database would be the foundation for a robust model that could predict the impacts of future hazards.
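
As a minimal sketch of what that assembly step might look like, the snippet below joins hypothetical hazard, target, risk-factor, and impact tables into a single training table keyed by event and asset; a real pipeline would also perform the spatial and temporal alignment (e.g., geospatial joins) described above.

    # Sketch: joining hazard, target, risk-factor, and impact records into one
    # training table. All table and column names are hypothetical.
    import pandas as pd

    hazards = pd.read_csv("hazard_observations.csv")  # event_id, asset_id, peak_gust_mph, ...
    targets = pd.read_csv("asset_inventory.csv")      # asset_id, asset_age_yrs, ...
    risks = pd.read_csv("risk_factors.csv")           # asset_id, tree_density, soil_index, ...
    impacts = pd.read_csv("observed_impacts.csv")     # event_id, asset_id, damage_cost_usd

    training_table = (
        hazards
        .merge(targets, on="asset_id")
        .merge(risks, on="asset_id")
        .merge(impacts, on=["event_id", "asset_id"], how="left")
        .fillna({"damage_cost_usd": 0.0})  # no recorded impact implies zero damage
    )
    training_table.to_csv("impact_training_table.csv", index=False)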

It should be noted that only recently has this data-rich approach to impact modeling become technically feasible. Once you start considering all potentially relevant aspects of a natural hazard, the infrastructure, soils, surrounding vegetation, etc., the data quickly becomes very large and complicated. But with recent developments in data science and machine learning, creating robust non-linear, non-parametric models that consider a very large number of variables is now feasible.

At Whether, we’re leveraging these modern techniques to create high-dimensional, robust impact models that can make sense of the incredibly complex world of natural hazards and will be the foundation of a more resilient future.