Slope stability model predicts landslides to protect communities, save lives

Melbourne researchers able to predict landslides

A mathematical model that can predict landslides that occur unexpectedly has been developed by two University of Melbourne scientists, with colleagues from GroundProbe-Orica and the University of Florence.

Professors Antoinette Tordesillas and Robin Batterham led the work over five years to develop and test the model SSSAFE (Spatiotemporal Slope Stability Analytics for Failure Estimation), which analyses slope stability over time to predict where and when a landslide or avalanche is likely to occur.

In a study, the research team was able to predict landslides of various sizes and speeds, in different environments. Such landslides often cause severe disruption, economic damage, and deaths.

"The key to the success of this model is that it works across a vast range of spatial or temporal scales and is informed by the physics of failure in soil and rock bodies," said Professor Tordesillas.

"It can be used at a mine, where millimeter precision measurements of the surface motion of a rock face are made every few minutes. And it can also be used in a rural area, where the only available data is a satellite radar image taken every few days to weeks."

The SSSAFE model was initially developed for mine monitoring, where landslides are a constant threat, but using publicly available satellite data, the team was able to retrospectively predict the 2017 Xinmo landslide, which buried a township in China.

"For Xinmo, the model highlighted significant movement at what became the rock avalanche source, 10 months before the disaster occurred," said Professor Tordesillas. "If we can use this model, along with freely available satellite data to recognize potential future landslide sites well before they happen, actions can be taken to protect communities, saving many lives."

With SSSAFE exploiting big data analytics, network science, and physics, Professor Tordesillas hopes her research will be used by industry and governments worldwide to help early warning systems (EWS) in mitigating landslide hazards in the face of climate change.
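The article does not publish SSSAFE's algorithm, but the combination of network science and motion data it describes can be illustrated with a toy sketch: monitoring points whose displacement histories are strongly correlated are linked into a network, and a large cluster of co-moving points flags a candidate failure region. Everything below (the synthetic data, the correlation threshold, the "unstable zone") is a hypothetical stand-in, not the published method.

```python
import numpy as np

# Illustrative sketch only: link monitoring points whose surface-motion
# time series are strongly correlated, then look for clusters of
# co-moving points as candidate failure regions.

rng = np.random.default_rng(0)
n_points, n_times = 20, 50
# Background points: small random-walk surface motion
displacement = rng.normal(0.0, 0.1, (n_points, n_times)).cumsum(axis=1)
# Hypothetical unstable zone: points 0-4 share an accelerating trend
displacement[:5] += np.linspace(0.0, 1.0, n_times) ** 2 * 10

# Link pairs of points with highly correlated motion histories
corr = np.corrcoef(displacement)
edges = [(i, j) for i in range(n_points) for j in range(i + 1, n_points)
         if corr[i, j] > 0.9]

# Union-find to extract connected clusters of co-moving points
parent = list(range(n_points))
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x
for i, j in edges:
    parent[find(i)] = find(j)

clusters = {}
for p in range(n_points):
    clusters.setdefault(find(p), set()).add(p)
largest = max(clusters.values(), key=len)
print(sorted(largest))  # the accelerating zone emerges as one cluster
```

In this toy version, the five points given a common accelerating trend end up in a single tightly connected cluster, which is the kind of spatiotemporal signature an early warning system would monitor.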

"Very few studies have used remote sensing data to detect precursors of slope failure. Crucially, little is known about how to interpret this data from known physics of granular failure to better understand and predict events leading to catastrophic landslides. We achieved both in SSSAFE," she said.

NSF renews funding for Two-Dimensional Crystal Consortium

Penn State facility enables the development of new ultra-thin materials for advanced electronics

The National Science Foundation (NSF) announced a renewal of funding for the Materials Innovation Platform (MIP) national user facility at Penn State's Materials Research Institute (MRI), the Two-Dimensional Crystal Consortium (2DCC). The 2DCC is one of four MIPs in the United States and was awarded $20.1 million over five years, an increase of 13% above the initial award in 2016.

MIPs are NSF facilities, each focused on a specific topic, funded to stimulate materials research innovation and foster the growth of a national community of users developing next-generation materials. These groups seek to substantially increase the rate at which new materials and new materials phenomena are discovered. The 2DCC at Penn State follows the "materials by design" concept, combining synthesis, characterization, and theory/simulation applied to targeted outcomes to accelerate materials discovery.

The 2DCC received its first five years of funding in 2016, which was used to nucleate and grow the MIP by developing state-of-the-art equipment for thin film deposition with integrated characterization tools, establishing a bulk growth facility, developing new computational tools and a facility-wide database, and initiating an external user program. In 2020, the 2DCC underwent a renewal process for the second five years of funding.

"Over the past five years, the 2DCC has established itself as a premier facility for the synthesis of wafer-scale 2D films and bulk crystals with unique quantum properties," said Joan Redwing, director of the 2DCC and synthesis lead, and professor of materials science and engineering and electrical engineering.

Two-dimensional (2D) materials are atomically thin layers that, because electron and atom motion is restricted, have physical characteristics not present in three dimensions. The 2DCC focuses on the bulk and thin-film synthesis of 2D chalcogenides, i.e., layered compounds of transition metals with chalcogen elements such as selenium and sulfur. By controlling the growth of these materials at the atomic level, new materials can be created with unique properties and exotic quantum states that hold the potential for revolutionary new device technologies, such as flexible electronics and quantum computing.

"As an inaugural Materials Innovation Platform, 2DCC MIP exemplifies the power of the Materials Genome Initiative approach with close experiment-theory interactions," said Charles Ying, program director for MIPs and National Facilities and Instrumentation with the Division of Materials Research of the National Science Foundation. "Multi-year efforts of studying and refining growth conditions have paid off, leading to reproducible synthesis of 2D materials that have already benefited more than 100 scientists nationwide. The new experimental and data tools will bring 2DCC to a new level in its second five years."

As a core component of the 2DCC's efforts in synthesizing ground-breaking 2D chalcogenide materials, the 2DCC offers a user program that advances 2D materials research across the U.S., not just at Penn State.

"Researchers outside of Penn State at other universities, companies, or national labs can come on-site to receive training in the facility and carry out their research or request samples grown by 2DCC staff," said Redwing. "In addition to the user program, we also have an in-house team of researchers who collaborate with users on their projects. We've sponsored over 125 user projects in our facility since we started in 2016. So, a big part of the MIP is indeed the user program."

The Penn State MIP has proven to be very beneficial for the development of 2D materials.

"Even before the MIP, Penn State had a number of faculty working on 2D material research," said Redwing. "But getting the MIP funded enabled us to expand and more deeply integrate that activity and initiate research collaborations with other universities and national labs through our user program. It's really helped to make Penn State one of the main centers of activity in 2D materials in the world."

During its first five years, the 2DCC has managed to meet challenges of complexity, scale, and even an unexpected obstacle that affected the entire globe.

"The discovery of high-performance materials is a complex process, and the framework of the MIP integrates research methodologies that efficiently aid the optimal synthesis of 2D materials, with a teaming of theory, synthesis science, in situ metrologies, and machine learning from the large data sets," said Clive Randall, director of MRI and distinguished professor of materials science and engineering. "In addition, the outreach has been very impressive, aiding researchers from all over the United States, and even globally. The 2DCC also maintained their mission during the COVID crisis, including holding a virtual research experience for undergraduates program in the summer of 2020."

The 2DCC is one of four user facilities in MRI, along with the Materials Characterization Lab, the Nanofabrication Lab, and the Materials Computation Center. The 2DCC research staff includes 17 faculty and 13 doctorate-level researchers. Graduate students are also involved in the in-house research program.

Along with Redwing, the 2DCC executive leadership team includes Nitin Samarth, associate director and characterization lead and professor and George A. and Margaret M. Downsbrough Department Head in Physics; Vincent Crespi, theory lead and distinguished professor of physics, materials science and engineering and chemistry; Joshua Robinson, director of user programs and professor of materials science and engineering; Eric Hudson, director of education, outreach and diversity programs and associate professor of physics; Zhiqiang Mao, bulk growth lead and professor of physics; Roman Engel-Herbert, industry lead and associate professor of materials science and engineering and physics; Adri van Duin, distinguished professor of mechanical engineering, chemistry, materials science & engineering, chemical engineering and engineering science and mechanics; Jun Zhu, professor of physics; Wes Reinhart, assistant professor of materials science and engineering and Institute for Computational and Data Sciences faculty co-hire; and Kevin Dressler, operations and user facilities director and affiliate assistant professor of civil engineering.

"The 2DCC has created this critical mass of research activity that has brought considerable attention to Penn State and MRI over the last five years," said Redwing. "The funding we received for new equipment, research support, and other activities has established Penn State and MRI as one of the leading institutes for 2D materials research."

With the renewed funding, the 2DCC will work to build on the progress made in 2D materials research through new collaborations and the existing ones created in the MIP's first five years. Plans for the next five years include the addition of a double crucible Bridgman system for synthesis of bulk crystals with improved composition control, development of an integrated etch/deposition tool for the synthesis of 2D metals, and an expansion of the facility database to enable materials discovery through data science methods.

"We are very proud of the 2DCC leadership, researchers, and staff who have partnered with the NSF to develop, refine and model the MIP program as one of the inaugural awardees back in 2016," said Randall. "We are looking forward to this next era and the scientific discoveries that will inevitable come with the multiple university partnerships which will emerge via the MIP program."

Penn State astrophysicist builds dark matter map that reveals hidden bridges between galaxies

A new map of dark matter in the local universe reveals several previously undiscovered filamentary structures connecting galaxies. The map, developed using machine learning by an international team including a Penn State astrophysicist, could enable studies about the nature of dark matter as well as about the history and future of our local universe.

Dark matter is an elusive substance that makes up 80% of the matter in the universe. It also provides the skeleton for what cosmologists call the cosmic web, the large-scale structure of the universe that, through its gravitational influence, dictates the motion of galaxies and other cosmic material. However, the distribution of local dark matter is currently unknown because it cannot be measured directly. Researchers must instead infer its distribution from its gravitational influence on other objects in the universe, such as galaxies.

Image caption: An international team of researchers has produced a map of the dark matter within the local universe, using a model to infer its location from its gravitational influence on galaxies (black dots). These density maps, each a cross section in a different dimension, reproduce known, prominent features of the universe (red) and also reveal smaller filamentary features (yellow) that act as hidden bridges between galaxies. The X denotes the Milky Way galaxy and arrows denote the motion of the local universe due to gravity.

"Ironically, it's easier to study the distribution of dark matter much further away because it reflects the very distant past, which is much less complex," said Donghui Jeong, associate professor of astronomy and astrophysics at Penn State and a corresponding author of the study. "Over time, as the large-scale structure of the universe has grown, the complexity of the universe has increased, so it is inherently harder to make measurements about dark matter locally."

Previous attempts to map the cosmic web started with a model of the early universe and then simulated the evolution of the model over billions of years. However, this method is computationally intensive and so far has not been able to produce results detailed enough to see the local universe. In the new study, the researchers took a completely different approach, using machine learning to build a model that uses information about the distribution and motion of galaxies to predict the distribution of dark matter.

The researchers built and trained their model using a large set of galaxy simulations, called Illustris-TNG, which includes galaxies, gases, and other visible matter, as well as dark matter. The team specifically selected simulated galaxies comparable to those in the Milky Way and ultimately identified which properties of galaxies are needed to predict the dark matter distribution.

"When given certain information, the model can essentially fill in the gaps based on what it has looked at before," said Jeong. "The map from our models doesn't perfectly fit the simulation data, but we can still reconstruct very detailed structures. We found that including the motion of galaxies--their radial peculiar velocities--in addition to their distribution drastically enhanced the quality of the map and allowed us to see these details."

The research team then applied their model to real data from the local universe from the Cosmicflows-3 galaxy catalog. The catalog contains comprehensive data about the distribution and movement of more than 17,000 galaxies in the vicinity of the Milky Way, within 200 megaparsecs. The resulting map of the local cosmic web is published in a paper appearing online on May 26 in The Astrophysical Journal.

The map successfully reproduced known prominent structures in the local universe, including the "local sheet" (a region of space containing the Milky Way, nearby galaxies in the "local group," and galaxies in the Virgo cluster) and the "local void" (a relatively empty region of space next to the local group). Additionally, it identified several new structures that require further investigation, including smaller filamentary structures that connect galaxies.

"Having a local map of the cosmic web opens up a new chapter of the cosmological study," said Jeong. "We can study how the distribution of dark matter relates to other emission data, which will help us understand the nature of dark matter. And we can study these filamentary structures directly, these hidden bridges between galaxies."

For example, it has been suggested that the Milky Way and Andromeda galaxies may be slowly moving toward each other, but whether they may collide in many billions of years remains unclear. Studying the dark matter filaments connecting the two galaxies could provide important insights into their future.

"Because dark matter dominates the dynamics of the universe, it basically determines our fate," said Jeong. "So we can ask a [super]computer to evolve the map for billions of years to see what will happen in the local universe. And we can evolve the model back in time to understand the history of our cosmic neighborhood."

The researchers believe they can improve the accuracy of their map by adding more galaxies. Planned astronomical surveys, for example using the James Webb Space Telescope, could allow them to add faint or small galaxies that have yet to be observed and galaxies that are farther away.

BU researchers use artificial intelligence to determine extent of damage in kidney disease

Chronic kidney disease (CKD) is most commonly caused by diabetes and hypertension. In 2017, the global prevalence of CKD was 9.1 percent, or approximately 700 million cases. Chronic kidney damage is assessed by scoring the amount of interstitial fibrosis and tubular atrophy (IFTA) in a renal biopsy sample. Although image digitization and morphometric (measuring external shapes and dimensions) techniques can better quantify the extent of histologic damage, a more widely applicable way to stratify kidney disease severity is needed.

Now, researchers from Boston University School of Medicine (BUSM) have developed a novel Artificial Intelligence (AI) tool to predict the grade of IFTA, a known structural correlate of progressive and chronic kidney disease.

"Having a computer model that can mimic an expert pathologist's workflow and assess disease grade is an exciting idea because this technology has the potential to increase efficiency in clinical practices," explained corresponding author Vijaya B. Kolachalama, Ph.D., assistant professor of medicine at BUSM.

A pathologist's typical workflow at the microscope involves manual operations such as panning and zooming in and out of specific regions of the slide to evaluate various aspects of the pathology. In the 'zoom out' assessment, pathologists review the entire slide and perform a 'global' evaluation of the kidney core. In the 'zoom in' assessment, they perform in-depth, microscopic evaluation of 'local' pathology in the regions of interest.

An international team of five practicing nephropathologists independently determined IFTA scores on the same set of digitized human kidney biopsies using web-based software (PixelView, deepPath Inc.). Their average scores were taken as a reference estimate to build the deep learning model. To emulate the nephropathologist's approach to grading the biopsy slides under a microscope, the researchers used AI to incorporate patterns and features from sub-regions (or patches) of the digitized kidney biopsy image as well as the entire (global) digitized image to quantify the extent of IFTA. Through this combination of patch-level and global-level data, a deep learning model was designed to accurately predict IFTA grade.
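The patch-plus-global design described above can be illustrated with a minimal data-flow sketch: extract features from small patches of a slide ('zoom in'), pool them, concatenate them with whole-image features ('zoom out'), and feed the fused vector to a scoring head. The BUSM model is a trained deep network; the random "image," hand-rolled mean/std features, and fixed linear head below are hypothetical stand-ins that only show the structure of the pipeline.

```python
import numpy as np

# Hypothetical sketch of fusing patch-level and global-level features,
# mimicking the pathologist's zoom-in / zoom-out workflow.

rng = np.random.default_rng(2)
image = rng.random((256, 256))  # stand-in for a digitized biopsy slide

def patch_features(img, size=64):
    """Mean and std of each non-overlapping patch ('zoom in' view)."""
    feats = []
    for r in range(0, img.shape[0], size):
        for c in range(0, img.shape[1], size):
            patch = img[r:r + size, c:c + size]
            feats.append([patch.mean(), patch.std()])
    return np.array(feats)

local_feats = patch_features(image).mean(axis=0)       # pooled patch features
global_feats = np.array([image.mean(), image.std()])   # 'zoom out' view
fused = np.concatenate([local_feats, global_feats])

# A trained classifier head would map the fused vector to an IFTA grade;
# a fixed linear head stands in for it here.
weights = np.array([1.0, 2.0, 1.0, 2.0])
grade_score = float(fused @ weights)
print(f"fused feature vector: {fused.shape}, grade score: {grade_score:.2f}")
```

In the real model, both branches are learned end to end and the output is calibrated against the reference IFTA scores from the five nephropathologists.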

Kolachalama believes that, once validated, AI models that can automatically score the extent of chronic damage in the kidney can serve as second-opinion tools in clinical practices. "Eventually, it may be possible to use this algorithm to study other organ-specific pathologies focused on evaluating fibrosis. Such methods may hold the potential to give more reproducible IFTA readings than readings by nephropathologists," he adds.

Japanese astrophysicist's supercomputer simulations of plasma jets reveal magnetic fields far, far away

Radio telescope images enable a new way to study magnetic fields in galaxy clusters millions of light-years away

For the first time, researchers have observed plasma jets interacting with magnetic fields in a massive galaxy cluster 600 million light-years away, thanks to the help of radio telescopes and supercomputer simulations. The findings can help clarify how such galaxy clusters evolve.

Galaxy clusters can contain thousands of galaxies bound together by gravity. Abell 3376 is a huge cluster forming as a result of a violent collision between two sub-clusters of galaxies. Very little is known about the magnetic fields that exist within this and similar galaxy clusters.

"It is generally difficult to directly examine the structure of intracluster magnetic fields," says Nagoya University astrophysicist Tsutomu Takeuchi, who was involved in the research. "Our results clearly demonstrate how long-wavelength radio observations can help explore this interaction." A black hole (marked by the red x) at the centre of galaxy MRC 0600-399 emits a jet of particles that bends into a "double-scythe" T-shape that follows the magnetic field lines at the galaxy subcluster's boundary.

An international team of scientists has been using the MeerKAT radio telescope in the Northern Cape of South Africa to learn more about Abell 3376's huge magnetic fields. One of the telescope's very high-resolution images revealed something unexpected: plasma jets emitted by a supermassive black hole in the cluster bend to form a unique T-shape as they extend outwards for distances as far as 326,156 light-years away. The black hole is in galaxy MRC 0600-399, which is near the centre of Abell 3376.

The team combined their MeerKAT radio telescope data with X-ray data from the European Space Agency's XMM-Newton space telescope to find that the plasma jet bend occurs at the boundary of the subcluster in which MRC 0600-399 exists.

"This told us that the plasma jets from MRC 0600-399 were interacting with something in the heated gas, called the intracluster medium, that exists between the galaxies within Abell 3376," explains Takeuchi.

To figure out what was happening, the team conducted 3D 'magnetohydrodynamic' simulations on ATERUI II, one of the world's most powerful supercomputers, located at the National Astronomical Observatory of Japan.

The simulations showed that the jet streams emitted by MRC 0600-399's black hole eventually reach and interact with magnetic fields at the border of the galaxy subcluster. The jet stream compresses the magnetic field lines and moves along them, forming the characteristic T-shape.

"This is the first discovery of an interaction between cluster galaxy plasma jets and intracluster magnetic fields," says Takeuchi.

An international team has just begun construction of what is planned to be the world's largest radio telescope, called the Square Kilometre Array (SKA).

"New facilities like the SKA are expected to reveal the roles and origins of cosmic magnetism and even to help us understand how the universe evolved," says Takeuchi. "Our study is a good example of the power of radio observation, one of the last frontiers in astronomy."