ACADEMIA
Turning Environmental Information into Action
ASSIST framework helps decision-makers monitor, understand and protect the environment
The research begins with a simple question: What do we know about the environment, and how do we know it?
“We can’t observe everywhere at all times,” says Patrick Reed, associate professor of civil and environmental engineering at Pennsylvania State University. “Given that, what should we observe? What is critical for us to understand? Where and when should we put our investments in sensors and sampling?”
For more than a decade, Reed has endeavored to answer these questions with a mix of computational and experimental approaches. His work earned him a National Science Foundation (NSF) CAREER Award, and has led to a computational framework that not only captures the critical details of environmental systems, but also evolves and adapts as new sensors, measurements and models are introduced, advancing the state of the art in long-term environmental monitoring and forecasting.
“Despite the long history of using observations of the natural world in our management frameworks, our approach to environmental monitoring is largely ad hoc,” Reed said. “We don’t have a very good understanding of how to design or adapt our long-term monitoring efforts. This is a fundamental problem.”
Over the past year, Reed has been using the Ranger supercomputer at the Texas Advanced Computing Center (TACC) to test this novel environmental sampling and computational framework on a variety of test problems, from pollutant dispersal to water resource management. The results point the way to a new means of understanding and protecting the environment.
Watching the World Change
The NSF and U.S. Geological Survey are investing billions of dollars in long-term monitoring and observation because of their value for identifying and predicting changes to our environmental systems—whether they’re oceans, aquifers, or ecosystems.
“We can’t go another 40 years thinking that everything is going to remain as it was,” Reed asserted. “We should at least enhance our measurements and understand where they may be critically important.”
The challenge of observing, predicting and managing the environment lies in improving our understanding of the numerous, highly uncertain, interacting systems that make up the natural world. Simulations of these systems are possible using complex numerical models and powerful supercomputers. However, the models themselves are always approximate, introducing errors to the resultant predictions or decisions.
Reed’s grid-enabled framework, called ASSIST (Adaptive Strategies for Sampling in Space and Time), works around the approximate nature of numerical models by performing extensive data assimilation, incorporating bias corrections, and including uncertainty measures. He believes the combination of these computational approaches will help create simulated environments that better reflect real systems.
Recently, Reed put ASSIST to the test on a physical experiment at The University of Vermont. Using the university’s artificial aquifer, where one day of experimental time equals one year of actual aquifer behavior, Reed was able to check the performance of his long-term monitoring system on a three-week tracer study.
The experiment proved to be a compelling case study for the effectiveness of Reed’s system. Despite being given the wrong initial conditions for the tracer experiment, Reed’s framework recovered the trajectory of a plume of ammonium chloride spreading through the aquifer with a high degree of accuracy.
“We inserted the wrong initial condition, which caused an explosion of error,” Reed explained. “Then we used our filtering to learn and remove that error to see if we could make the right space and time sampling choices.”
The ASSIST framework is unique in its combination of evolutionary multi-objective optimization and ensemble Kalman filtering. The filtering corrects biases in the underlying equations, while the evolutionary multi-objective optimization generates a large number of possible sampling strategies that compete against each other to determine the best tradeoff solutions for a system. In the case of the aquifer, this meant integrating sensor data and comparing models with the actual plume dispersal.
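The filtering half of that combination can be illustrated with a minimal ensemble Kalman filter update. This is an illustrative sketch, not ASSIST itself: the one-dimensional state, observation operator, and noise levels are hypothetical placeholders, chosen to mimic the wrong-initial-condition test, where repeated assimilation of observations pulls a badly initialized ensemble toward the true state.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_noise_std, rng):
    """One ensemble Kalman filter analysis step (stochastic variant).

    ensemble: (n_members, n_state) array of model states
    obs:      (n_obs,) observed values
    H:        (n_obs, n_state) linear observation operator
    """
    n_members = ensemble.shape[0]
    # The ensemble spread approximates the forecast error covariance.
    mean = ensemble.mean(axis=0)
    anomalies = ensemble - mean
    P = anomalies.T @ anomalies / (n_members - 1)   # sample covariance
    R = (obs_noise_std ** 2) * np.eye(len(obs))     # observation noise
    # Kalman gain weighs model uncertainty against observation uncertainty.
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    # Perturbing the observations keeps the analysis ensemble spread realistic.
    perturbed = obs + rng.normal(0.0, obs_noise_std, (n_members, len(obs)))
    innovations = perturbed - ensemble @ H.T
    return ensemble + innovations @ K.T

rng = np.random.default_rng(0)
truth = np.array([2.0])
# Deliberately wrong initial condition: ensemble centered on 5.0, truth is 2.0.
ensemble = rng.normal(5.0, 1.0, (50, 1))
H = np.eye(1)
for _ in range(10):
    obs = truth + rng.normal(0.0, 0.1, 1)           # noisy sensor reading
    ensemble = enkf_update(ensemble, obs, H, 0.1, rng)
print(ensemble.mean())  # the ensemble mean is drawn close to the truth, 2.0
```

In ASSIST, an update like this corrects the model's biased state estimate, while the evolutionary multi-objective optimization separately searches over where and when to place the sensors whose readings feed the filter.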
Said Reed: “It’s the first step toward making a case for our framework where we can say, ‘See, we recovered an experimental observation without giving the initial conditions to the model.’”
[Figure: Many-objective solution set for the University of Vermont aquifer experiment]
A Socio-Technical Tool
Not all systems have clear outcomes like the aquifer experiment. For this reason, the ASSIST framework is designed to enable human-driven decisions, too. ASSIST takes a traditional cost-benefit analysis and extends it into a many-objective exploration of up to seven dimensions. Because each added objective multiplies the possible trade-offs, the framework represents every candidate option as a data point in a system-wide evaluation, so decision-makers can see the full set of compromises rather than a single ranked answer.
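The core idea behind such a many-objective exploration is keeping only the nondominated (Pareto-optimal) options: those that no other option beats on every objective at once. The sketch below is a hypothetical two-objective illustration (monitoring cost versus prediction error, both minimized); the objective values are made up and are not ASSIST's.

```python
def nondominated(points):
    """Return the points not dominated by any other point.

    A point q dominates p when q is at least as good on every
    objective and strictly better on at least one (all minimized).
    """
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Each tuple: (monitoring cost, prediction error) for a candidate strategy.
strategies = [(1.0, 9.0), (2.0, 4.0), (3.0, 4.5), (5.0, 1.0), (6.0, 1.5)]
print(nondominated(strategies))
# → [(1.0, 9.0), (2.0, 4.0), (5.0, 1.0)]
```

The surviving points form the trade-off surface: moving along it buys lower error at higher cost, which is the set of "if we invest here, we get this gain" choices a decision-maker then shops among.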
Reed’s team is also creating new visualization tools that will allow researchers to "fly" into the many-objective surfaces of his framework to interactively explore what each decision entails.
“The end results are trade-offs,” Reed said. “If we invest here, we get this gain, or if we invest now, we get this benefit. Then it becomes design by shopping.”
Reed and his team have adapted the framework to create a prototype tool for water portfolio management that will help cities better invest in their region’s water supply. He has also collaborated with the Aerospace Corporation to develop a tool that determines the optimal arrangement for satellite constellations.
More than a computational method, Reed says the grid-based framework is a “socio-technical” tool, allowing researchers to move from data to decision, so they can negotiate an appropriate action without resorting to guesswork.
For now, the most complicated environmental monitoring applications being explored with Reed’s framework—some of which include up to five million filter evaluations per experiment—can only be performed on TACC's Ranger, whose parallel-processing power enables higher-level analyses.
“TACC has been key to making discoveries. It helped pull the curtain away from the trade-offs for our most challenging monitoring problems,” Reed said. “I couldn’t show these trade-offs without the computational power available on Ranger.”
The project will ultimately use more than 1.75 million computing hours at TACC, with additional storage capacity provided by the San Diego Supercomputer Center through the NSF TeraGrid.
As today’s supercomputers become tomorrow’s laptops, the methods developed by Reed will enable a wide range of environmental scientists, engineers and managers to address problems through computer-enabled negotiations.
“There are very few questions that don’t have trade-offs,” Reed said. “The stronger we make this multi-objective framework, the more computational discoveries and innovations we’ll be able to make.”