SwRI has been developing and maintaining the Elastic-Plastic Impact Computations (EPIC) Dynamic Finite Element computational tool since 2007. This tool has proven to be very cost-effective in supporting the design of more effective armor and warheads. The image above depicts a simulation of an impact using EPIC. As part of an Other Transaction Prototype Agreement with the U.S. Army Corps of Engineers, SwRI will continue to advance EPIC.

SwRI updates impact modeling software EPIC for the US Army Corps of Engineers to meet the evolving computational needs of the US Department of Defense

Southwest Research Institute (SwRI) has received $500,000 in first-year funding, extendable to $3.5 million, to develop the Elastic-Plastic Impact Computations (EPIC) dynamic finite-element code. EPIC uses particle and finite-element methods to simulate complex impact and explosion scenarios, enabling engineers to analyze how a particular design would behave under stress in real-world conditions. It can accurately simulate high-velocity impact events and explosive detonations, providing cost-effective design solutions for warheads and for armor that protects vehicles, aircraft, and soldiers against a wide range of threats.

SwRI Staff Engineer Dr. Stephen Beissel, who has been involved in the development of the EPIC project since the mid-1990s, stated that EPIC leverages finite element and particle methods to simulate complex impact and explosion scenarios. The numerical algorithms and material models enable the code to handle highly dynamic and energetic events, allowing engineers to analyze how a particular design for a ground vehicle, ship, or aircraft component would react under stress in real-world conditions.

The EPIC code was initially developed in the 1970s to cost-effectively design warheads, body armor, and armored vehicles, and to model their interactions. In 2007, the EPIC development team joined SwRI, which opened an office in Minneapolis, Minnesota, to support it and has maintained and developed the code ever since.

EPIC uses finite element analysis, an efficient computational technique, to model a full range of impact scenarios, including high-speed impacts that generate large pressures, high strain rates, and permanent deformations in solid materials. It also uses particle methods, a related approach that continuously reassesses the local regions over which neighboring points exchange information.

EPIC's distinguishing feature is an accurate transition from finite elements to particles when deformations become extensive. In the case of warheads, it simulates high-velocity impact events and explosive detonation, making it a crucial tool for designing effective warheads and for designing armor that protects vehicles, aircraft, and soldiers against a wide range of threats.
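As a rough illustration of that element-to-particle transition, the sketch below (not EPIC's actual algorithm; the strain threshold, field names, and data layout are all assumptions) converts heavily deformed elements into mass-carrying particles while leaving intact elements on the mesh:

```python
import numpy as np

# Illustrative sketch of an element-to-particle transition (NOT EPIC's
# actual algorithm): elements whose equivalent plastic strain exceeds a
# threshold are removed from the mesh and replaced by particles that
# carry their mass and centroid position. Threshold and names are assumed.

CONVERSION_STRAIN = 1.5  # assumed critical strain for conversion

def convert_distorted_elements(plastic_strain, masses, centroids):
    """Split elements into those kept on the mesh and a list of particles."""
    distorted = plastic_strain > CONVERSION_STRAIN
    particles = [{"mass": m, "pos": x}
                 for m, x in zip(masses[distorted], centroids[distorted])]
    kept = np.where(~distorted)[0]
    return kept, particles

strain = np.array([0.1, 2.0, 0.4, 3.1])
mass = np.ones(4)
pos = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
kept, parts = convert_distorted_elements(strain, mass, pos)
# Elements 1 and 3 are converted; elements 0 and 2 stay on the mesh.
```

Conserving each converted element's mass and position in its particles is what lets a hybrid code keep momentum bookkeeping consistent across the transition.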

Over the next four years, SwRI aims to improve and update EPIC by increasing its accuracy, expanding the types of problems and scenarios it can handle, and increasing its computational efficiency on supercomputers built with GPUs.

Dr. Beissel noted that as adversaries continue to develop new munitions, such as hypersonic missiles, tools like EPIC become critical to designing new armor and approaches to defeating these threats. Creating physical prototypes and testing them is expensive and time-consuming, especially when the tests destroy the prototype. Simulating these dynamic and explosive large-strain events, instead of repeatedly rebuilding a physical prototype, makes the design cycle more efficient and cost-effective.

Herbert Jaeger, Professor of Computing in Cognitive Materials at CogniGron | Photo Marleen Annema

University of Groningen prof Jaeger takes steps towards creating a formal theory for neuromorphic computing

There is currently a search for new materials for computer microchips that are more energy-efficient and brain-like. However, no existing theory can put this effort on a solid foundation. A theory for non-digital computers must take into account continuous, analog signals; physical effects at the nanoscale; and the fact that fabricated devices are often not identical. A paper by Herbert Jaeger, Beatriz Noheda, and Wilfred G. van der Wiel is the first attempt to sketch what such a theory for neuromorphic computers might look like.

According to Herbert Jaeger, professor of computing in cognitive materials at the University of Groningen in the Netherlands, a solid theory is needed behind the engineering of new microchips. Current computers rely on stable switches, usually transistors, that are either on or off, making them logical machines programmed on the basis of logical reasoning. However, the miniaturization of transistors, long the key to making computers more powerful, is reaching its physical limit. Scientists are therefore looking for new materials that can produce more versatile switches, capable of using more values than just 0 or 1.

Jaeger is a member of the Groningen Cognitive Systems and Materials Center (CogniGron) which is devoted to creating neuromorphic (brain-like) computers. CogniGron brings together scientists with differing approaches, including experimental materials scientists, mathematical theorists, and computer science and AI specialists. Working closely with materials scientists has given Jaeger insight into the challenges they face when developing new computational materials. It has also made him aware of a dangerous pitfall: there is no established theory for the use of non-digital physical effects in computing systems.

Our brain functions differently from a logical system. Although we can reason logically, this is only a small part of what our brain does. Most of the time, it is figuring out how to perform simple tasks such as lifting a cup or waving to a colleague. Jaeger explains that "a lot of the information-processing that our brain does is this non-logical stuff, which is continuous and dynamic. It is difficult to formalize this in a digital computer." The brain also keeps functioning despite fluctuations in blood pressure, external temperature, and hormone balance. So how can we create a computer that is both versatile and robust? Jaeger is optimistic: "the brain is proof of principle that it can be done."

The brain is a source of inspiration for materials scientists who aim to produce materials that mimic the behavior of neurons: materials that oscillate or show bursts of activity, resembling how neurons fire. However, the field is missing a crucial piece of information: even neuroscientists do not fully understand how the brain works. The lack of a theory for neuromorphic computers is a problem the field has yet to acknowledge. In their recent paper, Jaeger, Noheda, and van der Wiel propose that, instead of relying on stable 0/1 switches, a theory for non-digital computers should work with continuous, analog signals. It should also account for the various non-standard nanoscale physical effects that materials scientists are studying.

Neuromorphic computing devices made from new materials are difficult to construct, and if you make a hundred of them, they will not all be identical. This is similar to how our neurons are not all the same. Additionally, these devices are often brittle and sensitive to temperature. Therefore, any theory for neuromorphic computing should consider such characteristics. Importantly, a theory supporting neuromorphic computing will not be a single theory but will consist of many sub-theories, just like digital computer theory, which is a layered system of connected sub-theories. To create a theoretical description of neuromorphic computers, experimental materials scientists and formal theoretical modelers must collaborate closely. Computer scientists must be aware of the physics of all these new materials, and materials scientists should be familiar with the fundamental concepts in computing.

The University of Groningen established CogniGron to bridge the gap between materials science, neuroscience, computing science, and engineering. The aim is to bring together these different groups to work collaboratively. Jaeger, one of the researchers at CogniGron, explains that everyone has their blind spots and the biggest gap in their knowledge is the lack of a foundational theory for neuromorphic computing. To overcome this, their paper provides a first attempt at highlighting how such a theory could be formulated and how a common language can be created.

Map of the study area in Chile. Red curve is the DAS array, black dots are earthquakes, dark red triangles are permanent seismic stations. | TSR doi.org/10.1785/0320230018

Researchers use a deep-learning model to identify earthquake waves in DAS data from an offshore cable

Researchers have shown that unused telecommunications fiber optic cables can provide three seconds of additional warning time for offshore earthquake early warning systems. They used a deep-learning artificial intelligence model to identify earthquake waves in distributed acoustic sensing (DAS) data from an offshore cable. With more than 1,500 cable landing stations worldwide, and technology that allows DAS systems to be added to operational cables without disrupting telecommunications traffic, the approach presents an exciting opportunity for further research.

Seismic stations are lacking offshore of heavily populated coastlines, even though these areas include some of the world's most seismically active regions, posing a significant challenge for earthquake early warning (EEW) systems. A new study published in The Seismic Record shows how converting unused telecommunications fiber optic cable can address this issue for offshore EEW.

Jiuxun Yin, a Caltech researcher now at SLB, and colleagues utilized 50 kilometers of a submarine telecommunications cable running between the United States and Chile. They sampled seismic data at 8,960 channels along the cable for four days using distributed acoustic sensing (DAS), a technique that turns the tiny internal flaws in a long optical fiber into thousands of seismic sensors.

During the study period, Yin and colleagues used the cable data to determine earthquake locations and estimate earthquake magnitudes for one onshore (magnitude 3.7) and two offshore (magnitude 2.7 and 3.3) earthquakes.

Their results showed that this single offshore DAS array offers an approximately three-second improvement in earthquake early warning compared to onshore arrays. In a simulation, the researchers found that deploying multiple DAS arrays spaced 50 kilometers apart across the area and working together could improve EEW alert times in the subduction zone by five seconds.

Yin said they had anticipated some improvement from the offshore placement of the DAS array, but the actual speed gains exceeded their initial projections. The primary advantage is that the array's offshore location eliminates the wait for seismic waves to reach land-based stations.
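The arithmetic behind that advantage is straightforward: the warning gain is the difference in travel times for the first seismic waves to reach the nearest sensor. A back-of-the-envelope sketch, with the wave speed and all distances assumed for illustration rather than taken from the study:

```python
# Back-of-the-envelope model of the offshore advantage. The wave speed and
# distances below are illustrative assumptions, not values from the study.
VP = 7.0  # assumed P-wave speed in oceanic crust, km/s

def detection_time(epicentral_distance_km):
    """Straight-ray travel time for the first P wave to reach a sensor."""
    return epicentral_distance_km / VP

offshore = detection_time(30.0)  # assumed quake-to-DAS-cable distance, km
onshore = detection_time(51.0)   # assumed quake-to-land-station distance, km
gain = onshore - offshore        # extra warning time from the offshore sensor
# With these assumed distances, the offshore array detects the event 3 s sooner.
```

Every second shaved off detection translates directly into extra time for alerts, since the damaging S waves and surface waves travel more slowly than the P waves used for detection.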

Offshore Chile resembles the Cascadia region offshore Canada and the U.S. Pacific Northwest: both contain active subduction zones, where tectonic plates collide and one plate plunges beneath another, producing some of history's largest and most destructive earthquakes. Southern California's offshore region also contains numerous faults that have hosted earthquakes of magnitude 6 or larger. In all of these densely populated coastal areas, offshore earthquake early warning could help protect lives and property.

Yin explained that Chile's elevated seismic risk was the primary reason for selecting this cable. The region experiences frequent offshore earthquakes and has been struck by several magnitude 8+ earthquakes, including, in 1960, the largest ever recorded. Considering the high seismic risk and potentially devastating impacts of a large earthquake, there is a pressing need for a reliable offshore earthquake early warning system in Chile.

The researchers utilized a deep learning artificial intelligence model, trained and validated on previous seismic and DAS data, to identify the earthquake waves in the DAS data from this offshore cable. According to Yin, the volume of data collected by DAS is substantial, and pre-trained deep learning models offer a highly efficient and reliable option for real-time applications like EEW. However, traditional seismological methods of picking earthquake arrivals, suitably automated, can also be effective in processing DAS data.
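One such traditional picker is the short-term-average/long-term-average (STA/LTA) trigger, which flags an arrival when signal energy in a short window jumps relative to the long-window background. A minimal sketch on a synthetic trace, with window lengths and the trigger threshold chosen for illustration (the study's own processing is not described at this level of detail):

```python
import numpy as np

# Minimal STA/LTA trigger, the classical kind of picker contrasted with
# deep learning above. Window lengths and threshold are assumptions.

def sta_lta(trace, n_sta=5, n_lta=50):
    """Short-term over long-term average of squared amplitude, end-aligned."""
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta   # windows ending at each sample
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    m = min(len(sta), len(lta))
    return sta[-m:] / (lta[-m:] + 1e-12)           # align on the latest samples

# Synthetic trace: quiet background, then a step in amplitude at sample 300.
trace = np.concatenate([np.full(300, 0.1), np.full(100, 3.0)])
ratio = sta_lta(trace)
# First sample where the ratio exceeds the (assumed) threshold of 5,
# shifted by n_lta - 1 to map back to the original sample index.
trigger = int(np.argmax(ratio > 5.0)) + 50 - 1
```

The short window reacts quickly to an arrival while the long window tracks the background level, so the ratio spikes exactly when a new phase appears, which is why the method automates well across thousands of DAS channels.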

Yin also noted that researchers require more data, particularly from larger magnitude earthquakes, to develop and test EEW algorithms effectively, as well as more information on how DAS instruments respond before building a real-time EEW system that integrates with existing EEW frameworks. He stated that there are plenty of places around the world to continue this research.

As per Yin, "There are more than 1500 cable landing stations around the globe, and the progress in the technology permits the use of operational cables and adding DAS systems without affecting [telecommunications] data transportation. We believe that this opens up a host of exciting research opportunities, and we are keen to explore these in future studies. We are looking for close interactions with cable owners, environmental agencies, and policymakers to scale the DAS-EEW for the benefit of coastal communities."

This artist’s concept depicts the Surface Water and Ocean Topography (SWOT) satellite, launched in December 2022. Credit: NASA/Jet Propulsion Laboratory

Machine Learning can translate sea surface heights into climate change insights

Scientists have developed a new machine learning technique that can translate satellite data on sea surface heights into insights on climate change, heat flow, and current flow. The Surface Water and Ocean Topography (SWOT) satellite, launched in December 2022, captures snapshots of sea surface heights at an unprecedented level of detail. The new technique uses a convolutional neural network to estimate various aspects of current flow in the upper ocean, which can help scientists gain a better understanding of and predict climate change.

Oceanographers rely on satellite technology to monitor the ocean's surface elevation and map the circulation of its currents, in order to understand the role of this movement in climate change and heat transport. In late 2022, the Surface Water and Ocean Topography (SWOT) satellite was launched to capture high-resolution snapshots of sea surface heights at a scale of tens of kilometers. However, at this level of detail the satellite also picks up internal waves beneath the ocean surface, making it challenging to use simple physics-based approaches to translate sea surface heights into meaningful information about ocean currents.
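The classical physics-based approach alluded to here is geostrophic balance, in which surface currents follow contours of sea surface height. A minimal sketch of that baseline calculation (the grid spacing, Coriolis parameter, and test field are illustrative assumptions, not SWOT data):

```python
import numpy as np

# Geostrophic estimate of surface currents from a sea surface height (SSH)
# grid: u = -(g/f) * d(eta)/dy, v = (g/f) * d(eta)/dx. Grid spacing,
# Coriolis parameter, and the test field are illustrative assumptions.

G = 9.81         # gravitational acceleration, m/s^2
F = 1.0e-4       # Coriolis parameter at mid-latitudes, 1/s
DX = DY = 1.0e4  # assumed 10 km grid spacing, m

def geostrophic_currents(ssh):
    """Estimate eastward (u) and northward (v) currents via centered differences."""
    deta_dy, deta_dx = np.gradient(ssh, DY, DX)  # gradients along y (rows), x (cols)
    u = -(G / F) * deta_dy
    v = (G / F) * deta_dx
    return u, v

# A uniform north-south SSH slope (10 cm over 100 km) drives a purely
# zonal (east-west) current, as geostrophy predicts.
y = np.linspace(0.0, 1.0e5, 11)              # meters
ssh = np.tile(1.0e-6 * y[:, None], (1, 11))  # eta rises linearly northward
u, v = geostrophic_currents(ssh)
```

This balance holds only for slow, large-scale flow; at SWOT's fine resolution the height signal mixes in internal waves that violate it, which is the gap the machine learning method is meant to fill.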

To address this challenge, researchers Xiao et al. have developed a novel machine learning method that uses SWOT sea surface height data to estimate various aspects of current flow in the upper ocean. The method applies a computational approach inspired by human vision known as a convolutional neural network, which the team trained on data from realistic simulations of sea surface heights and current dynamics.

The researchers have shown that their convolutional neural network can use detailed sea surface heights to estimate certain aspects of current flow. By gaining a better understanding of how currents transport heat and carbon, scientists may be able to predict and comprehend climate change more accurately.

However, the researchers acknowledge that this is only a proof of concept, and further research is needed to refine the new method before it can be reliably used with SWOT data.

Meanwhile, SWOT will continue to capture high-resolution images not only of Earth's oceans but also of almost all surface water around the world, including lakes, rivers, and reservoirs.

Electron micrographs of the 2D-0D hybrid surface implemented in this study (top left), memory characteristics generated by light pulses (top right), and multi-level memory characteristics generated by multiple light pulses (bottom).

KIST develops tech to store, manipulate electronic states in quantum dots that are smaller than 10 nm, ushering in the era of light-powered multi-level memories

Researchers from the Korea Institute of Science and Technology (KIST) and the Daegu Gyeongbuk Institute of Science and Technology (DGIST) have successfully developed a new semiconductor material that can store and manipulate electronic states in quantum dots measuring 10 nanometers or less. This new material makes it possible to store and manipulate data using light, rather than electrical signals, between the computing and storage components of a multi-level computer, thereby significantly enhancing processing speed. The development of this new multi-level optical memory device is expected to contribute towards accelerating the industrialization of next-generation system technologies, such as artificial intelligence systems.

We are facing an overwhelming amount of data, and the data centers that store and process it consume a great deal of electricity, a major contributor to environmental pollution. To overcome this issue, researchers are exploring multi-level computing systems, which promise lower power consumption and higher computation speed. However, because these systems still operate with electrical signals, like conventional binary computing systems, they cannot keep up with the huge demand for data processing.

Recently, the Korea Institute of Science and Technology (KIST) announced that Dr. Do Kyung Hwang of the Center for Opto-Electronic Materials & Devices and Professor Jong-Soo Lee of the Department of Energy Science & Engineering at Daegu Gyeongbuk Institute of Science and Technology (DGIST) have jointly developed a new semiconductor artificial junction material that can power next-generation memory with light. Transmitting data between the computing and storage parts of a multi-level computer using light instead of electrical signals can significantly increase processing speed.

The research team created the new semiconductor artificial junction material by joining core-shell quantum dots, each a cadmium selenide (CdSe) core wrapped in a zinc sulfide (ZnS) shell, to a molybdenum disulfide (MoS2) semiconductor. The new material enables the storage and manipulation of electronic states within quantum dots measuring 10 nm or less.

When light hits the cadmium selenide core, some electrons flow out into the molybdenum disulfide semiconductor, leaving holes trapped in the core and making it conductive. Because the electronic states inside the cadmium selenide are quantized, successive light pulses trap electrons in these states one after another, changing the resistance of the molybdenum disulfide through the field effect. The resistance steps with the number of light pulses, allowing the device to maintain more than 10 states rather than the 0 and 1 of conventional memory. The zinc sulfide shell also prevents charge leakage between neighboring quantum dots, allowing each individual quantum dot to function as a memory element.
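A toy model, not the paper's device physics, of how pulse counting yields multi-level storage: if each light pulse traps one more carrier and each trapped carrier steps the channel resistance by a fixed factor, the pulse count itself becomes the stored value. The base resistance and step factor below are invented for illustration:

```python
# Toy illustration of multi-level optical memory readout (an assumption-laden
# sketch, NOT the device model from the paper): each light pulse traps one
# more carrier in a quantum dot, and each trapped carrier lowers the channel
# resistance by a fixed factor, giving one distinguishable level per pulse.

BASE_RESISTANCE = 1.0e6  # assumed resistance with no trapped charge, ohms
STEP_FACTOR = 0.8        # assumed fractional resistance drop per carrier

def read_state(n_pulses):
    """Channel resistance after n light pulses: one distinct level per pulse."""
    return BASE_RESISTANCE * STEP_FACTOR ** n_pulses

# Eleven distinguishable resistance levels (states 0 through 10), versus the
# two states (0 and 1) of conventional binary memory.
levels = [read_state(n) for n in range(11)]
```

In a real device the usable number of levels is set by how cleanly adjacent resistance states can be distinguished under noise, which is why preventing charge leakage between dots matters.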

Unlike quantum dots in conventional 2D-0D semiconductor artificial junction structures, which amplify signals from light sensors, the team's quantum dot structure mimics a floating-gate memory structure, confirming its potential as a next-generation optical memory. The researchers verified the effectiveness of the multi-level memory behavior with neural network modeling on the CIFAR-10 dataset, achieving a 91% recognition rate.

Dr. Hwang of KIST says that this new multi-level optical memory device will contribute to accelerating the industrialization of next-generation system technologies such as artificial intelligence systems. These systems have been difficult to commercialize due to technical limitations arising from the miniaturization and integration of existing silicon semiconductor devices.