Sophisticated Resilience Model in Development in Colorado – And It’ll be Free

By | February 20, 2015

Efforts are underway in Colorado to build a sophisticated computer model that will offer a minutely detailed look at just how communities may withstand – or crumble under – perils like earthquakes, hurricanes, tornadoes, tsunamis and other catastrophic risks.

Those in charge of the project say it will yield the world’s most sophisticated model for forecasting community resilience when it’s completed in five years.

And it will be free to anyone who wants to use it.

Backed by a $20 million grant announced this week from the U.S. Department of Commerce’s National Institute of Standards and Technology, the Community Resilience Center of Excellence is being established at Colorado State University.

The grant for the Fort Collins-based center is for $4 million annually over five years.

That funding will enable a team of roughly 60 NIST researchers and partners from 10 universities to develop computer tools to help local governments decide how best to invest resources to lessen the impact of extreme weather and other hazards on buildings and infrastructure, and to recover rapidly in their aftermath, according to John Van de Lindt, the center’s principal investigator and co-director.

The focus of the center’s effort is NIST-CORE. That stands for the National Institute of Standards and Technology – Community Resilience Modeling Environment.

The goal is for the computer model and associated software and databases to be capable of mechanistic modeling – a modeling method that examines a complex system through the workings of its individual parts and the manner in which they are coupled.

Unlike catastrophe models used by the insurance industry, which rely on historical data and focus on calculating the probability of losses, NIST-CORE is a physics-based program that will help researchers examine what occurs in a disaster at every scale – from how an entire community fares down to how an individual wall in a home behaves, said Van de Lindt, who is a professor of civil and environmental engineering at Colorado State.

For example, the model will enable researchers to take a wall and push it to extremes, watching it deform as if during an earthquake. Researchers will measure the wall’s behavior – how it flexes and strains – and record that behavior to determine how much force the wall pushed back with, how it reacted as ever-greater force was applied and what was left of the wall past its breaking point.
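The article does not describe NIST-CORE’s actual code, but the “push a wall to its breaking point” test it describes resembles what structural engineers call a pushover analysis. A minimal toy sketch, with all parameters and behavior invented for illustration (a single elastic-plastic spring standing in for the wall):

```python
# Toy "pushover" of a single wall, modeled as an elastic-plastic spring
# with a breaking point. All numbers here are invented for illustration;
# NIST-CORE's physics-based modeling is far more detailed.

def wall_resistance(drift, stiffness=50.0, yield_drift=0.02, break_drift=0.08):
    """Force (kN) the wall pushes back with at a given drift ratio."""
    if drift >= break_drift:
        return 0.0                      # past the breaking point: no resistance left
    if drift <= yield_drift:
        return stiffness * drift        # elastic range: force grows with displacement
    return stiffness * yield_drift      # yielded: force plateaus until failure

def pushover(max_drift=0.10, steps=50):
    """Push the wall to ever-greater drift, recording (drift, force) pairs."""
    record = []
    for i in range(steps + 1):
        drift = max_drift * i / steps
        record.append((drift, wall_resistance(drift)))
    return record

curve = pushover()
peak_force = max(force for _, force in curve)
```

The recorded curve captures exactly the behaviors the researchers describe: the linear push-back, the flattening as the wall yields, and the drop to zero once it breaks.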

“What we’re concerned with is ‘What was the behavior?'” said Van de Lindt, as he spoke on his cellphone while speedily returning to the university to stay ahead of an oncoming snowstorm that was creeping into the area on Friday evening. “We’re really getting down to the details of the underlying physics of the process.”

The model they plan to develop over the next five years will enable researchers to tie the interdependencies of that wall to its structure, and then tie that structure to the larger community, eventually enabling communities to make the best business decisions about where and how to use their resources to be more resilient, according to Van de Lindt.
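The chain Van de Lindt describes – component behavior feeding building behavior feeding a community-level outcome – can be sketched in miniature. This is a hypothetical illustration only: the per-building capacities, the damage rule and the functionality metric are all invented, not taken from NIST-CORE.

```python
# Toy sketch of tying component-level damage to a community-level metric.
# Capacities, the damage rule and the variability band are all invented.
import random

def building_damaged(hazard_intensity, capacity, rng):
    """A building is damaged if the hazard exceeds its (noisy) capacity."""
    return hazard_intensity > capacity * rng.uniform(0.8, 1.2)

def community_functionality(capacities, hazard_intensity, seed=0):
    """Fraction of buildings still functional after the event."""
    rng = random.Random(seed)
    standing = sum(
        not building_damaged(hazard_intensity, c, rng) for c in capacities
    )
    return standing / len(capacities)

# A hypothetical five-building community (capacities in arbitrary units).
town = [0.5, 0.7, 0.9, 1.1, 1.3]
functional_fraction = community_functionality(town, hazard_intensity=0.8)
```

A planner could rerun such a loop with strengthened capacities to compare investment options – which is, in vastly more sophisticated form, the kind of decision support the center is after.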

“There’s no model like NIST-CORE yet,” he said. “There’s nothing like it on Earth.”

NIST-CORE is still in its embryonic stage. It is being built on the open-source framework of existing software used by the Mid-America Earthquake Center in Illinois, called MAEviz, a highly sophisticated piece of risk-analysis software.

A team of software developers and engineers will use MAEviz as a base for building a computational modeling environment that, when completed, will be able to take a look at any community and offer important details on its resilience to every known peril, Van de Lindt said.

“We’re going to do it for wildfires, we’re going to do it for earthquakes, we’re going to do it for tornadoes, we’re going to do it for tsunamis,” he said. “If it’s a hazard, we’ll do it.”

Already working on NIST-CORE is a team of 31 researchers, and they are in the process of adding 20 to 25 students and post-doctoral candidates from the 10 universities involved in the project.

To help create their model, researchers will use a computer at the University of Illinois called Blue Waters, which is considered one of the most powerful supercomputers in the world. Blue Waters can reportedly achieve peak performance of more than 13 quadrillion calculations per second.

When built out, NIST-CORE will not only provide the scientific basis for developing more resilient communities, it will also integrate into these tools considerations for social systems that are vital to the functioning and recovery of communities: healthcare delivery, education, social services and financial institutions.

Van de Lindt, speaking like the proud father of a budding prodigy, also noted that NIST-CORE will be able to learn from one analysis to the next, a capability that does not exist in any other risk or disaster-resilience model in the world.

Initially the model will be primarily a research tool used by engineers and scientists who can handle its complexity, but eventually it will become an open-source tool with versions that can be run at various levels, down to a version for the desktop computer of a city’s emergency planner.

“It will eventually be open to anyone in the world who wants to use it,” Van de Lindt said.
