Episode 13: Microweather

By Brinley Macnamara

Show Notes

Many of us are familiar with the so-called “macro-scale” effects of climate change. But how will climate change impact individual neighborhoods, streets, or your own backyard?

Transcript

Speaker 1 (00:07):

September 1938, the Atlantic seaboard is lashed by a tropical hurricane. For two days, the coast from New Jersey to New England felt its full fury. New York was a city of deluge as 70 mile an hour winds whipped torrential rains for hour after hour. It turned streets into rivers and slowed the pulse of the city. The hurricane center cut north across Long Island, and damage to suburban areas was staggering. Drains and sewer systems were useless, and the raging seas poured millions of tons of water far inland.

Brinley Macnamara (host) (00:40):

Hello, and welcome to MITRE’s Tech Futures Podcast. I’m your host, Brinley Macnamara. At MITRE, we offer a unique vantage point and objective insights that we share in the public interest, and in this podcast series, we showcase emerging technologies that will affect the government and our nation in the future. Today, we’re talking about weather forecasting, but not the type you see in the morning news or on your favorite weather app. Rather, we’re going to tell you the story of a MITRE team who set out to forecast the impacts of climate change at a surprisingly granular level, and how their work is helping two cities in the race to adapt to a warmer climate. But before we begin, I want to say a huge thank you to Dr. Kris Rosfjord, the Tech Futures Innovation Area Leader in MITRE’s Research and Development program. This episode would not have happened without her support. Now, without further ado, I bring you MITRE’s Tech Futures Podcast, episode number 13.

Brinley Macnamara (host) (01:40):

On September 21st, 1938, a category three hurricane wreaked devastation along the east coast, taking 682 lives and incurring billions of dollars in property damage. The 1938 hurricane is also looked upon as one of the worst weather forecasting disasters in recorded history. Weather advisories issued on September 21st grossly underestimated the strength of the storm. When 70 mile an hour winds struck New England that afternoon, citizens were completely caught off guard. Luckily, a lot has changed since 1938. The antiquated weather and climate modeling tools of the early 20th century have been radically transformed by the modern computer.

Brinley Macnamara (host) (02:34):

According to a recent article in the journal Science, if a 72-hour hurricane forecast were issued today, it would be about as accurate as a 24-hour forecast issued in 1980. The same authors went on to note that the value of weather forecasting to U.S. households in terms of avoided property damage, preparation for storms, etc., is an estimated $31 billion per year. This represents a tenfold return on investment. But despite the tremendous leaps and bounds that modern weather forecasting has made, meteorologists of today face an unprecedented challenge: forecasting the impacts of global climate change. Within the scientific community, is there any debate about whether climate change is real and caused by human activity?

Dr. Thijs Broer (03:20):

The short answer is no. Absolutely not, actually, for several decades. In the scientific community, the agreement is at a 99-plus percent level. At the macro level, and even at increasingly finer levels of detail, scientists are in violent agreement about what is going to happen.

Brinley Macnamara (host) (03:38):

That was Dr. Thijs Broer talking. He’s MITRE’s Deputy CTO and the head of its Research and Development program. Many of us are probably familiar with the so-called macro effects of climate change that Dr. Broer was referring to. Average global temperature will continue to rise, and Arctic sea ice will disappear. Some regions of the world, such as Sub-Saharan Africa, will get drier and more difficult to farm. Other regions, such as Greenland, will become wetter and more arable. But at the end of the day, we humans experience the world at much smaller spatial scales. Meteorologists call this the micro scale.

Mike Robinson (04:17):

We live in the micro scale. We exit a building, we walk down a street, meter by meter.

Brinley Macnamara (host) (04:24):

That’s Mike Robinson talking. He’s a department chief engineer at MITRE and was a principal investigator on the microweather project.

Mike Robinson (04:30):

I think there’s lots of concerns about individual neighborhoods, individual streets, being more vulnerable and more at risk to climate change. It’s really hard to parse that out when you’re looking at data, models, investigations, at kilometer or multi-kilometer scales. You’re just brushing over all of the key insight that you want to understand at that level if you can get down there. So living there and understanding it, I think we all have to be, for this problem that we’re trying to tackle, down in the meter scale or micro scale.

Brinley Macnamara (host) (05:05):

This isn’t to say that global climate modeling hasn’t come a long way. In fact, modern global climate models are almost a hundred times more granular than the pioneering global climate models of the 1970s. That said, today’s global climate models are still too coarse grained to answer many questions about the so-called micro scale effects of climate change. Questions like “will climate change put my house underwater or make my neighborhood too hot to go outside in the summer?” It was questions like these that inspired Mike to launch this investigation. For the first phase of his project, he decided to focus on cities. Here’s Mike again.

Mike Robinson (05:42):

Even today, or even over the last 20, 30, 40, 50 years, as cities have grown up in the US and around the world, they’ve created a phenomenon called an urban heat island effect, where the close proximity of all that infrastructure and buildings and concrete and tarmac helps to create its own little heat bubble right over the city. This is a well-known phenomenon.

Brinley Macnamara (host) (06:03):

Mike went on to say that climate change is expected to exacerbate the urban heat island effect. But for city dwellers, this won’t just cause a little more discomfort. On especially hot days, the urban heat island effect can be deadly. In fact, as of today, heat deaths are the number one climate-related cause of death. According to a recent study in the journal The Lancet, there were over 300,000 heat deaths in 2019 alone. Moreover, as global warming accelerates, so will urbanization. According to the UN, the share of the world population living in urban areas is expected to increase from 55% in 2018 to 68% in 2050. That’s an additional 2.5 billion people living in cities. Of course, Mike and his team couldn’t tackle every city all at once, so they faced a really challenging question at the outset of their project: which cities would they start with? Here’s Mike again.

Mike Robinson (07:00):

We don’t want to be hobby shopping with this research. We wanted to get out and help somebody, help some cities. I was lucky enough to actually meet the director for climate resiliency in the New York City mayor’s office at a conference. She was actually giving a presentation, and I went up to her and I said… We’re about to start this phase one research at the time. I said, “We’re still picking cities. If I picked your city, New York City”, which we were focusing on anyways. I mean, it’s New York City. “Would you be willing to just engage with us and share your feedback, share your guidance on what we’re doing, because we want to shape this towards something that could be useful towards a city like yourselves?” And she jumped at the chance, and we’ve established and solidified a relationship with the New York City Mayor’s Office as a result.

Mike Robinson (07:44):

Raleigh followed similarly. We wanted to get out of the Northeast, and we had some relationships in North Carolina and it led us to Raleigh. The other thing too of why we chose New York City and Raleigh is on the urbanization side. There’s no sprawl anymore in New York City. There’s nowhere else to go. You’re just going to continue to build up and out within the urban footprint. But Raleigh’s kind of a mix. The downtown is kind of dense, but you can still add more buildings in that build-out sense. But they’re also sprawling a little bit. We have the model there with that city to add more buildings in and around the city as well. The urbanization effect was also there, but the partnerships and the engagements were the big thing driving those cities.

Brinley Macnamara (host) (08:21):

This wouldn’t be a podcast about emerging technology without a lot of discussion about the technologies that have enabled urban microweather modeling. For the rest of this podcast, we’re going to do just that. Here’s Dr. Broer again.

Dr. Thijs Broer (08:38):

What has enabled modeling simulations and calculations like this? Two things, right? I mean, it’s the computational horsepower that one has right now that you didn’t have even 5, 10, 20, 30 years ago. That, of course, is this huge advance that goes back all the way 50 plus years ago to the discovery of the transistor.

Dr. Thijs Broer (09:03):

That’s one. It is computational infrastructure. And then the other is the explosion of data. Now, there’s a synergy going on between those two. But if you just have data and you can’t do anything with it, then it’s just noise. I mean, you don’t know what to do with it, but actually the processing and the analysis of data at super scale is enabled by the availability of those tools, that is, the computational tools.

Brinley Macnamara (host) (09:28):

The bottom line here is that modeling the weather is an extremely computationally intensive job. For this reason, the United States National Weather Service uses some of the world’s largest supercomputers to model the weather. I’m talking about computers that are 10,000 times faster than your laptop or smartphone. But even with the world’s most powerful supercomputers at their disposal, all weather modelers face one significant trade-off. That is, the trade-off between domain size, i.e., how much of the world their model can cover, and resolution, i.e., how many atmospheric processes their model can resolve. To understand this trade-off, it can be helpful to visualize how weather and climate modeling works. Since Earth is a sphere, you can think of the atmosphere as a giant shell that covers Earth’s entire surface area. This giant shell looks somewhat like a donut. Mathematicians call it a spherical shell.

Brinley Macnamara (host) (10:22):

But in this podcast, we’re going to be referring to it as a donut, because, who doesn’t like a good donut? Anyways, most weather and climate modeling algorithms work by dividing this gigantic donut into a three dimensional grid and using some very fancy mathematical equations to calculate what goes on within each grid cell. By what goes on, I mean things like wind speed, humidity, and heat transfer. When aggregated, these calculations produce things like weather forecasts and global climate change predictions. Now, the resolution of a weather or climate model is defined as the size of each grid cell within this three dimensional atmospheric donut. Hence, smaller grid sizes allow for more grid cells to fit within the donut, which in turn allows the model to resolve atmospheric processes that happen at smaller scales. For instance, a weather model with 50 square kilometer grid cells can model tropical cyclones, whose diameters typically exceed a hundred kilometers, but cannot model much more modestly sized thunderstorms.

Brinley Macnamara (host) (11:22):

Whereas, a weather model with 10 square kilometer cells can model most thunderstorms, but is too coarse-grained for tornadoes, whose diameters average around a hundred meters. This means the domain-versus-resolution trade-off really boils down to a trade-off between scale and accuracy. The smaller a model’s grid cells, the more atmospheric processes it can represent, and the more accurate it will be. But even today’s most sophisticated global weather and climate models running on the world’s most sophisticated supercomputers can only reach grid sizes as small as 10 square kilometers. Their domain is just too big to compute the more fine-grained atmospheric processes in so many grid cells. Luckily, Mike and his team don’t care all that much about scale. Remember, they’re trying to model the impacts of climate change on the microweather patterns of just two tiny slices of that atmospheric donut, New York City and Raleigh, North Carolina.
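For readers who want to see the arithmetic behind that trade-off, here is a back-of-the-envelope sketch. This is an editorial illustration with round numbers, not a calculation from the episode:

```python
# Back-of-the-envelope grid arithmetic: how many horizontal grid cells does
# it take to tile Earth's entire atmosphere at a given resolution?
# Round, illustrative numbers only.

EARTH_SURFACE_KM2 = 510_000_000  # Earth's surface area, roughly 510 million km^2

def horizontal_cells(cell_edge_km):
    """Horizontal grid cells needed to cover Earth's surface when each
    cell is a square with the given edge length in kilometers."""
    return EARTH_SURFACE_KM2 / (cell_edge_km ** 2)

for edge in (50, 10, 1):
    print(f"{edge:>2} km cells -> {horizontal_cells(edge):>15,.0f} cells")
```

Going from 50-kilometer cells to 1-kilometer cells multiplies the horizontal cell count by 2,500, before even accounting for vertical levels and the smaller time steps finer grids require.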

Brinley Macnamara (host) (12:17):

And because of this smaller domain size, the MITRE team was able to select a weather model that can encompass atmospheric processes in every layer of Earth’s atmosphere, including the atmospheric boundary layer, the notoriously complex layer of Earth’s lower atmosphere that is in direct contact with Earth’s surface. Many processes within the atmospheric boundary layer occur at size scales that are smaller than one kilometer, too fine-grained for today’s global climate and weather models, but within reach for the MITRE team. Models that can resolve every layer of Earth’s atmosphere are often referred to as first principles models and are said to have extremely high resolutions. They are the holy grail of weather modeling. For their effort, the MITRE team chose a particular high resolution model called JOULES, which Aeris LLC developed and currently maintains. Here’s Mike Robinson again.

Mike Robinson (13:08):

So JOULES stands for Joint Outdoor-indoor Urban Large-Eddy Simulation, JOULES. It’s a Large Eddy Simulation model, but it’s really an atmospheric physics model, which I like to say because we want to note that it is different than some of the computational fluid dynamics or CFD models that are conventionally used by urban planners to understand airflow around an individual building. The JOULES model is solving the meteorological and atmospheric science in the low level atmosphere, and it can resolve the buildings. It’s a first principles model that will capture the true airflow and behavior, physics-wise, in that environment.

Mike Robinson (13:45):

The thing that makes JOULES actually very, very useful for the types of things that we’ve been doing is that all of its code runs in a GPU-accelerated environment. It can run very, very fast compared to legacy LES models. With the GPU-accelerated LES model, for our phase one work, we ran 96 simulations of our two cities at about three to three and a half meter resolution, with all of the buildings resolved and multiple, multiple vertical levels at three meter resolution. That just was not possible years ago without those additional computing resource capabilities.
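The reason a model like this maps so well onto GPUs is data parallelism: each grid cell’s update depends only on its immediate neighbors, so millions of cells can be updated simultaneously. Here is a toy stand-in for that structure, a simple heat-diffusion stencil in plain NumPy, which vectorizes the same way. This is an editorial illustration, not JOULES code:

```python
import numpy as np

def diffuse_step(t, alpha=0.1):
    """One explicit diffusion step on a 2D temperature grid.
    Every interior cell is updated independently from its four
    neighbors -- exactly the per-cell independence that lets a GPU
    run millions of these updates in parallel."""
    out = t.copy()
    out[1:-1, 1:-1] = t[1:-1, 1:-1] + alpha * (
        t[:-2, 1:-1] + t[2:, 1:-1] + t[1:-1, :-2] + t[1:-1, 2:]
        - 4 * t[1:-1, 1:-1]
    )
    return out

grid = np.zeros((5, 5))
grid[2, 2] = 100.0        # a single hot cell in the center
grid = diffuse_step(grid)  # heat spreads to the four neighbors
```

On a GPU, the same kind of array expression runs as thousands of threads, one per cell, which is the sort of speedup Mike describes.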

Brinley Macnamara (host) (14:25):

To recap, JOULES relies on two key technologies for modeling the atmospheric boundary layer. Number one is the Large Eddy Simulation, and number two is the Graphics Processing Unit, also known as the GPU. Now, you might have previously heard the term eddy to describe a circular current of water, such as those found in rivers or oceans. Believe it or not, when scientists talk about atmospheric eddies, they’re referring to the exact same phenomenon. Only this time, it’s a circular current of air that may cause a little turbulence in your airplane as it prepares to land, as opposed to some turbulence in your kayak. Hence, a Large Eddy Simulation is a technique for modeling large eddies, i.e., large circulating currents of air, that are present in Earth’s atmospheric boundary layer. Due to their lower complexity, Large Eddy Simulations tend to perform a lot better than other models of the atmospheric boundary layer, making Large Eddy Simulation an ideal candidate for computationally intensive weather modeling.

Brinley Macnamara (host) (15:24):

Another benefit of this atmospheric model is that it runs well on a Graphics Processing Unit, a special type of computer chip that is designed to run a massive number of computations in parallel. Previously, it would’ve taken hundreds of conventional computer processors to run a Large Eddy Simulation. Today, many Large Eddy Simulations can be run on a single advanced GPU. For the MITRE team, this has meant that they’ve been able to run JOULES on a lot of different future weather scenarios. For instance, in the first phase of their project, they set up the model to run on different building and climate change scenarios in New York City and Raleigh, and focused on how these different scenarios would affect the future airflow at key landmarks within those cities. Here’s Mike again.

Mike Robinson (16:05):

The other permutations were okay, set up the model with today’s climate. We can do that, and we did, based upon what data we have representing today’s environment. And then we selected, for phase one, some representative weather information from global climate models that will be indicative of what 2050 might look like, and we used those to downscale and re-parameterize JOULES and produce its output for a 2050 climate. And then finally, the other thing that we changed was we had our current urban footprint today, and we loaded 3D shape files of that city into the JOULES model so it could do its thing and perturb the airflow and everything else with that. But then, we also worked a little bit with the city of Raleigh, a little bit with New York City, but largely with the data that they provide on zoning and expected development that’s coming in the next decades.

Mike Robinson (16:56):

We did a proof of concept expansion of both cities where we added about 200 buildings that we could logically and justifiably stand behind, so they weren’t outlandish when we worked with the cities and they looked at our data, and that was our advanced urbanization. Continued build-out in certain parts of the South Bronx, for instance, and continued development in the downtown area of Raleigh. And that also changes the 3D shape file set in the model, and then further and differently perturbs the microscale weather environment, both given today’s climate and also given the 2050 climate.
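Mike mentions using global climate model output to downscale and re-parameterize JOULES for a 2050 climate. The episode doesn’t spell out the team’s exact method, but one common, simple bias-correction technique, the “delta change” approach, gives a feel for the idea. The numbers below are hypothetical:

```python
def delta_change(observed_baseline, model_historical, model_future):
    """Simple delta-change bias correction: take the *change* the climate
    model projects (future minus historical) and add it to the observed
    baseline, so the model's absolute bias cancels out."""
    delta = model_future - model_historical
    return observed_baseline + delta

# Toy example: the model runs 2 degrees too warm in the historical period,
# but its projected +3 degree warming signal is still usable.
corrected = delta_change(observed_baseline=25.0,
                         model_historical=27.0,
                         model_future=30.0)
# corrected == 28.0 (observed 25.0 plus the model's +3.0 warming)
```

The point is that a coarse model’s raw values can be systematically off while its projected change remains informative, which is what makes bias-corrected global output usable as input to a street-scale model.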

Brinley Macnamara (host) (17:28):

Now, to get the data on how global warming would affect the future weather in New York City and Raleigh, Mike and his team had to tap into another big modeling effort. The effort to model the impact of greenhouse gas emissions on Earth’s climate. This effort is led by the Intergovernmental Panel on Climate Change, also known as the IPCC. The IPCC is the world’s top scientific authority on climate change. That said, the IPCC doesn’t collect its own observational data or run its own models. Rather, the IPCC releases an assessment report every five years or so that is based on a massive literature review. This means that the IPCC leans on a variety of different peer reviewed global climate models. Thus, these IPCC-approved global climate models were prime candidates for the MITRE team.

Brinley Macnamara (host) (18:17):

There was only one catch. Can you guess what it was? If you guessed that the global climate models were too coarse-grained for the MITRE team’s microweather modeling, then you were correct! In fact, these coarse spatial scales are such a thorn in the side of meteorologists that they even have their own name. They are said to be sources of bias within global climate models. Luckily, there is a growing research effort to develop methods for bias-correcting global climate models. And for phase two of the microweather project, the MITRE team is excited about one particular bias-corrected global climate model that they were able to dig up. There’s one final technological hurdle to microweather modeling that I wanted to discuss today. Here’s Brian Pettegrew, Mike’s Co-Principal Investigator.

Dr. Brian Pettegrew (19:02):

One of the biggest challenges in modeling the weather is validating how well your model did. We have these vast areas of no observation, so you run into this problem that anybody can compute weather phenomena on their computer, but at the end of the day, how do you know you computed it correctly? We have so many holes in our ability to observe the world. It’s getting better, but it still becomes one of the biggest challenges. How much can you observe the world to validate how well you’re modeling the world?

Brinley Macnamara (host) (19:32):

Do you think citizen science would be helpful? Are there any citizen science projects ongoing right now that are trying to encourage people to make those observations themselves?

Mike Robinson (19:44):

I’m smiling, Brinley, because we are actually going to leverage two citizen science campaigns run in New York City and Raleigh this past year, where they went out and empowered volunteers to go out and measure temperatures all over the city. They even mounted infrared thermometers on cars and drove all over the city to observe, to collect data. That data’s been made public. So we’re actually…as part of our work for 2022, because this validation point is so important…Can we stand behind what we’re modeling, gain trust with our partners, and then move forward into all the applications and aspirational stuff we want to do? This validation’s important. We’re actually going to replicate those two exact days in New York City and Raleigh, where they collected all that data. And then, we actually have an observation set in the city at three different times a day for each of those campaigns that we can compare our results against.
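Comparing model output against observation campaigns like the ones Mike describes typically starts with a simple skill score such as root-mean-square error. Here is a minimal sketch, with made-up temperatures standing in for the citizen-science data:

```python
import math

def rmse(modeled, observed):
    """Root-mean-square error between paired model and observation
    values -- a standard first-cut validation score."""
    assert len(modeled) == len(observed)
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(modeled, observed))
                     / len(modeled))

# Hypothetical afternoon temperatures (deg C) at five street-level sites
model_temps = [31.2, 33.5, 30.1, 34.0, 32.2]
obs_temps   = [30.8, 34.1, 29.9, 33.2, 32.5]
print(f"RMSE: {rmse(model_temps, obs_temps):.2f} deg C")  # prints: RMSE: 0.51 deg C
```

A low RMSE against an independent observation set is what lets a team “stand behind” its model in the sense Mike describes; real validation would also look at spatial patterns and timing, not just a single aggregate number.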

Brinley Macnamara (host) (20:38):

As I was listening to Mike and Brian, I couldn’t help but think of a point Dr. Broer brought up at the tail end of our conversation. He said that to fight climate change, you really do have to think globally and act locally. I know what you’re thinking. “She’s really going to end the podcast with a cliche?” But in this case, Mike and Brian’s work could not be a better example of thinking globally and acting locally. I mean, they are literally taking bias-corrected global climate model data and using it to tune their own hyper-local microweather climate model. For what it’s worth, my crash course on microweather has shown me that the problem of climate inaction may not be because the problem is “just too big”.

Brinley Macnamara (host) (21:18):

Perhaps it’s just because the problem hasn’t been broken down into pieces small enough that an individual, or a small group of individuals, is capable of tackling them. Mike and Brian have found their piece. What’s yours?

Brinley Macnamara (host) (21:35):

The show was written by me. It was produced and edited by myself and my co-host, Eliza Maze, Dr. Kris Rosfjord, Dr. Heath Farris, and Beverly Wood. Our guests were Dr. Thijs Broer, Mike Robinson and Dr. Brian Pettegrew. The music in this episode was brought to you by Ooyy, Sarah the Instrumentalist, Arthur Benson, and Truvio. The opening newsreel was provided by PublicDomainFootage.com. We’d like to give a special thanks to Dr. Kris Rosfjord, the Technology Futures Innovation Area Leader, for all her support. Copyright 2022. MITRE PRS # 22-1327. April 25th, 2022.

Brinley Macnamara (host) (22:15):

MITRE. Solving problems for a safer world.

Meet the Guests

Mike Robinson

Mike Robinson is the Department Chief Engineer for Operations Performance at The MITRE Corporation Center for Advanced Aviation System Development. In his current role, Mike is responsible for driving and contributing innovative research solutions that address execution and performance needs of operations, systems, and stakeholders. His primary areas of interest include improved weather accountability and integration that will benefit transportation and larger societal missions.

Prior to joining MITRE, Mike was the Chief Technology Officer with AvMet Applications, a Technical Staff Scientist with MIT Lincoln Laboratory, and a research analyst at the NASA Goddard Space Flight Center. He holds an M.S. and B.S. degree in meteorology from Texas A&M University, and SUNY Oswego, respectively.

Dr. Matthijs (Thijs) Broer

Dr. Broer helps oversee the activities within the Office of the CTO/CMO, including management of MITRE’s independent R&D program and development of the corporate technology strategy. Before joining MITRE in 2018, Dr. Broer was the Central Intelligence Agency’s Chief Technology Officer, Directorate of Science and Technology, where he implemented a new R&D structure and developed innovation practices. Prior to joining the CIA, he gained extensive industry R&D experience working for Northrop Grumman, Innovance Networks, Corning, Inc., and Bell Laboratories.

Dr. Broer earned his bachelor’s degree in physics from the Technological University, Delft, The Netherlands, and his master’s and doctoral degrees in physics from the University of Wisconsin.

Dr. Brian Pettegrew

Brian completed his MS and PhD in Meteorology from the University of Missouri in 2008.  Prior to joining MITRE in 2021, Brian worked for 13 years as a cooperative institute affiliate for the National Oceanic and Atmospheric Administration performing meteorological research and development to support aviation forecasts across all scales, global to regional, boundary layer to stratosphere, hours to days.  Since joining the MITRE team, Brian has been active in applicable research supporting a wide range of interests, from micro-scale weather investigations and numerical weather prediction to enhanced weather integration research.