The poster of the FUGIN (FOREST Unbiased Galactic plane Imaging survey with Nobeyama 45-m telescope) project (https://nro-fugin.github.io/). The upper panel shows the distribution of molecular clouds in the Milky Way Galaxy obtained by the Nobeyama 45-m radio telescope. The lower panel shows infrared observation by the Spitzer Space Telescope.

Osaka Metropolitan University uses AI to draw the most accurate map of star birthplaces in the Galaxy

Locations predicted for about 140,000 molecular gas clouds, where stars form

Stars are formed by molecular gas and dust coalescing in space. These molecular gases are so dilute and cold that they are invisible to the human eye, but they do emit faint radio waves that can be observed by radio telescopes.

Observed from Earth, these molecular clouds sit in front of and behind a great deal of other matter, and these overlapping features make it difficult to determine their distances and physical properties such as size and mass. So even though our own Milky Way is the only galaxy in the universe close enough for detailed observations of molecular clouds, it has been very difficult to investigate their physical properties in a cohesive manner from large-scale observations.

A research team led by Dr. Shinji Fujita of the Osaka Metropolitan University Graduate School of Science in Japan identified about 140,000 molecular clouds, which are areas of star formation, in the Milky Way Galaxy from large-scale carbon monoxide data observed in detail by the Nobeyama 45-m radio telescope. Using artificial intelligence, the team estimated the distance to each of these molecular clouds, determined their sizes and masses, and successfully mapped their distribution across the first quadrant of the Galactic plane in the most detailed manner to date.

“The results not only give a bird's eye view of the Galaxy but will also help in various studies of star formation,” explained Dr. Fujita. “In the future, we would like to expand the scope of observations with the Nobeyama 45-m radio telescope and incorporate radio telescope observation data of the sky in the southern hemisphere, which cannot be observed from Japan, for a complete distribution map of the entire Milky Way.”

This study was financially supported by Grants-in-Aid for Scientific Research (KAKENHI) of the Japan Society for the Promotion of Science (grant numbers 17H06740 and JP21H00049) and the "Young interdisciplinary collaboration project" of the National Institutes of Natural Sciences.

Hien Van Nguyen, University of Houston associate professor of electrical and computer engineering, is developing next-gen artificial intelligence to improve medical diagnostics.

UH engineering prof Nguyen wins NCI grant to create next-gen AI to improve diagnostics 

Despite remarkable progress in artificial intelligence (AI), several studies show that AI systems do not improve radiologists' diagnostic performance. Diagnostic errors contribute to 40,000 to 80,000 deaths annually in U.S. hospitals. This gap creates a pressing need: build next-generation computer-aided diagnosis algorithms that are more interactive, to fully realize the benefits of AI in medical diagnosis.

That’s just what Hien Van Nguyen, a University of Houston associate professor of electrical and computer engineering, is doing with a new $933,812 grant from the National Cancer Institute. He will focus on lung cancer diagnostics.

“Current AI systems focus on improving stand-alone performances while neglecting team interaction with radiologists,” said Van Nguyen. “This project aims to develop a computational framework for AI to collaborate with human radiologists on medical diagnosis tasks.” 

That framework uses a unique combination of eye-gaze tracking, intention reverse engineering, and reinforcement learning to decide when and how an AI system should interact with radiologists. 
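The source does not describe the agent's internals, but the idea of a reinforcement learning agent deciding when to interject, conditioned on a gaze signal, can be sketched with a deliberately tiny toy. Everything below is an illustrative assumption: the three gaze-dwell "states," the two actions, and the reward values are invented for this sketch, not taken from the project.

```python
import random

random.seed(0)

# Toy stand-in for a gaze-assisted decision agent. States are coarse
# gaze-dwell buckets on a suspicious region (0 = glanced past,
# 1 = lingering, 2 = long dwell); actions are 0 = stay silent,
# 1 = show the AI's finding. The rewards are invented for illustration:
# interrupting a reader who barely looked is penalized, while surfacing
# a finding after a long, uncertain dwell is rewarded.
REWARD = {(0, 0): 0.1, (0, 1): -1.0,
          (1, 0): 0.0, (1, 1): 0.2,
          (2, 0): -0.5, (2, 1): 1.0}

Q = {(s, a): 0.0 for s in range(3) for a in range(2)}
alpha, eps = 0.1, 0.2  # learning rate and exploration probability

for _ in range(5000):
    s = random.randrange(3)                       # random gaze bucket
    if random.random() < eps:                     # epsilon-greedy choice
        a = random.randrange(2)
    else:
        a = max((0, 1), key=lambda x: Q[(s, x)])
    # One-step (bandit-style) Q update toward the observed reward.
    Q[(s, a)] += alpha * (REWARD[(s, a)] - Q[(s, a)])

# Learned policy: stay silent on quick glances, speak up on long dwells.
policy = {s: max((0, 1), key=lambda a: Q[(s, a)]) for s in range(3)}
print(policy)
```

A real system would of course learn from clinical interaction data rather than a hand-written reward table; the sketch only shows the shape of the decision problem (when to present information, given gaze evidence).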

To maximize time efficiency and minimize distraction from clinical work, Van Nguyen is designing a user-friendly, minimally interfering interface for radiologist-AI interaction.

The project evaluates these approaches in two clinically important applications: lung nodule detection and pulmonary embolism detection. Lung cancer is the second most common cancer, and pulmonary embolism is the third most common cause of cardiovascular death.

“Studying how AI can help radiologists reduce these diseases' diagnostic errors will have significant clinical impacts,” said Van Nguyen. “This project will significantly advance the knowledge of the field by addressing important, but largely under-explored questions.”  

The questions include when and how AI systems should interact with radiologists and how to model radiologists' visual scanning process. 

“Our approaches are creative and original because they represent a substantive departure from the existing algorithms. Instead of continuously providing AI predictions, our system uses a gaze-assisted reinforcement learning agent to determine the optimal time and type of information to present to radiologists,” said Van Nguyen.  

“Our project will advance the strategies for designing user interfaces for doctor-AI interaction by combining gaze-sensing and novel AI methodologies.”  

Scheme of the machine learning model based on the feedforward artificial neural network

Russian scientists develop a neural network algorithm that predicts Arrhenius crossover temperature with 90 percent accuracy

A joint paper by the Department of Computational Physics and Modeling of Physical Processes and the Udmurt Federal Research Center of the Russian Academy of Sciences has been published in Materials.

The algorithm can help speed up the production of many materials, including metal alloys, and simplify quality control during such production. The neural-network-based algorithm created at KFU makes it possible to accurately calculate the Arrhenius crossover temperature from several physical parameters of a material.

(a) Diagram of the root mean square error ξ of the estimated Arrhenius crossover temperature TA, calculated for various combinations of the quantities Tm, Tg, Tg/Tm and m used as inputs of the machine learning model. Inset: T(pred)A and T(emp)A are the predicted and empirical Arrhenius crossover temperatures, respectively. (b) Correspondence between the empirical TA and the TA predicted by the machine learning model on the validation data set.

Among the parameters the team used for modeling were the melting temperature, the glass transition temperature, and the fragility index. These quantities are used to describe phase transitions and structural changes in liquids during cooling.

Co-author, Associate Professor Bulat Galimzyanov comments, “Many solid materials, such as glass, metals, plastics, initially have the form of melts – they are viscous liquids that solidify at a certain temperature, turning into a solid state. The temperature at which a change in the state of aggregation begins is called the Arrhenius temperature. When approaching it, the atoms of matter begin to move in groups and more slowly than before. This indicates the preparation of the liquid for solidification.”

The algorithm was tested on metallic, silicate, borate, and organic glasses. “We found out that for the created neural network, the melting and glass transition temperatures of the material are significant and sufficient characteristics for estimating the Arrhenius temperature,” said Galimzyanov. “From these two values, the algorithm determined the Arrhenius temperature for all analyzed liquids with an accuracy of more than 90 percent.”
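The setup described above, a feedforward network mapping two temperatures (Tm, Tg) to a predicted crossover temperature, can be sketched in a few dozen lines. The synthetic target relation, network size, and training settings below are illustrative assumptions, not the team's model or their derived equation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: melting temperature Tm and glass transition
# temperature Tg (kelvin). The target relation is purely illustrative,
# not the equation obtained in the study.
Tm = rng.uniform(300.0, 1800.0, size=(200, 1))
Tg = 0.6 * Tm + rng.normal(0.0, 20.0, size=(200, 1))
X = np.hstack([Tm, Tg]) / 1000.0            # crude feature scaling
y = (1.1 * Tg + 0.15 * Tm) / 1000.0         # hypothetical target T_A

# One hidden tanh layer, trained with plain full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(X):
    H = np.tanh(X @ W1 + b1)                # hidden activations
    return H, H @ W2 + b2                   # linear output layer

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)           # mean squared error at init

for _ in range(2000):
    H, pred = forward(X)
    err = pred - y                          # dL/dpred (up to a constant)
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H ** 2)      # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)
print(loss0, loss)
```

The point of the sketch is only the data flow: two easily measured temperatures in, one predicted crossover temperature out, with a single hidden layer doing the fitting.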

The scientists also worked out an equation linking the Arrhenius temperature with the melting and glass transition temperatures.

“Glass transition and melting temperatures are easily measured in lab conditions. Furthermore, they can be found in the literature. Thus, determining the Arrhenius temperature has now become easier. We can analyze the properties of liquids faster and estimate the characteristics of resulting solid materials more precisely,” concludes Galimzyanov.

The team further plans to adapt the created algorithm to more complex materials, such as polymers.

Global map of eddies (Graphic: Nathan Beech)

EERIE project uses supercomputers for improved Earth system simulations

The ocean has a large effect on our planet’s climate. In this regard, mesoscale – i.e., medium-sized – eddies, which constitute essentially the weather on the ocean, could be far more important than previously believed. Accordingly, a new project, led by the Alfred Wegener Institute has just been launched to assess this aspect. By doing so, “European Eddy Rich Earth System Models” (EERIE) could significantly improve today’s Earth system models and therefore projections of the climate’s future development. 

Eddies come in a range of sizes, with diameters from only a few meters to several kilometers. Their influence on the climate depends on their size. Although these eddies have existed for some time, we still have limited quantitative information on their role, bearing in mind the impacts of a warming climate (Beech et al. 2022). A new EU-financed project aims to change that: with the aid of “European Eddy Rich Earth System Models” (EERIE), eddies will be more realistically represented in climate models – i.e., by the laws of physics rather than empirical parameterizations.

Simulated and observed eddy kinetic energy patterns in the global ocean.

EERIE’s goal is to help produce a new generation of Earth system models (ESMs). To do so, it will focus on improving the simulation of mesoscale eddies, which, depending on the region, can be anywhere from five to 40 kilometers wide. The supercomputer modeling improvements will include e.g. the inclusion of open channels of water (“leads”) in sea ice, where the ocean influences the atmosphere via powerful heat fluxes. “The technological hurdles to achieving these high-resolution simulations are immense,” says Prof Thomas Jung, responsible for coordinating the project at the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI). “In order to allow quantitative statements, EERIE will have to achieve a simulation rate of up to five simulated years per day on the latest pre-exascale supercomputers available in Europe. Here, efficiency is a key factor – also to keep the simulations’ energy consumption and CO2 footprint to a minimum.” In order to implement, save and analyze these complex high-resolution simulations, the researchers will have to work hand in hand with software engineers to develop radically new software technologies.
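The "five simulated years per day" (SYPD) target quoted above translates directly into wall-clock budgets for typical climate experiments. The experiment lengths below are illustrative examples, not EERIE's actual simulation plan.

```python
# Back-of-envelope wall-clock cost at the quoted throughput target of
# five simulated years per day (SYPD). Experiment lengths are examples.
SYPD = 5.0

def wallclock_days(simulated_years, rate=SYPD):
    """Wall-clock days of compute needed at a given SYPD rate."""
    return simulated_years / rate

for years in (30, 100, 300):
    print(f"{years:>3}-year run: {wallclock_days(years):.0f} days of compute")
```

Even at this rate, a multi-century high-resolution run occupies a pre-exascale machine for months, which is why the efficiency and energy-footprint concerns mentioned above matter.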

In the course of the project, the researchers also plan to develop new simulation protocols, contributing to national and international climate change assessments in the process. In this way, EERIE is to yield valid and directly applicable climate information and to make valuable contributions in preparation for the IPCC’s next Assessment Report.

The project budget is over 10 million euros. 17 partner institutions are involved, including seven universities. The kick-off event for EERIE took place on 23 and 24 February 2023. The project, which officially began on 1 January 2023, will continue for four years.

Global cloudiness map, based on data collected by the Aqua research satellite over more than a decade (2002-2015). Clouds are not distributed uniformly but rather concentrated in hot spots. Photo: NASA

Weizmann scientists solve a 50-year-old puzzle on why Earth’s hemispheres look equally bright when viewed from space

When looking at Earth from space, its hemispheres, northern and southern, appear equally bright. This is particularly unexpected because the Southern Hemisphere is mostly covered with dark oceans, whereas the Northern Hemisphere has a vast land area that is much brighter than these oceans. For years, the brightness symmetry between the hemispheres remained a mystery. In a new study, researchers from the Weizmann Institute of Science, Israel, and their collaborators reveal a strong correlation between storm intensity, cloudiness, and the solar energy reflection rate in each hemisphere. They offer a solution to the mystery, alongside an assessment of how climate change might alter the reflection rate in the future.

(l-r) Prof. Yohai Kaspi and Or Hadas

As early as the 1970s, when scientists analyzed data from the first meteorological satellites, they were surprised to find that the two hemispheres reflect the same amount of solar radiation. The reflectivity of solar radiation is known in scientific lingo as “albedo.” To better comprehend what albedo is, think about driving at night: it is easy to spot the intermittent white lane markings, which reflect the car’s headlights well, but difficult to discern the dark asphalt. The same is true when observing Earth from space: the fraction of incoming solar energy that each region reflects is determined by various factors. One of them is the ratio of dark ocean to bright land, which differ in reflectivity just like asphalt and lane markings. The land area of the Northern Hemisphere is about twice as large as that of the Southern, and indeed, when measuring near Earth’s surface under clear skies, there is more than a 10 percent difference in albedo between the hemispheres. Still, both hemispheres appear equally bright from space.
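The land-ocean contrast described above can be made concrete with an area-weighted average. The reflectivity values and land fractions below are rough textbook-style numbers chosen for illustration, not the study's measurements.

```python
# Rough area-weighted clear-sky surface albedo per hemisphere.
# All numbers are illustrative approximations: dark ocean reflects a
# few percent of sunlight, land considerably more, and the Northern
# Hemisphere holds roughly twice the land fraction of the Southern.
ALBEDO_OCEAN, ALBEDO_LAND = 0.06, 0.25
LAND_FRACTION = {"north": 0.39, "south": 0.19}

def surface_albedo(land_frac):
    """Area-weighted mean of land and ocean reflectivity."""
    return land_frac * ALBEDO_LAND + (1.0 - land_frac) * ALBEDO_OCEAN

for hemi, frac in LAND_FRACTION.items():
    print(f"{hemi}: {surface_albedo(frac):.3f}")
```

With these assumed numbers the clear-sky surface albedos come out clearly different, which is exactly the asymmetry that, per the study, cloud cover from Southern Hemisphere storms offsets when Earth is viewed from space.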

In this study, the team of researchers, led by Prof. Yohai Kaspi and Or Hadas of Weizmann’s Earth and Planetary Sciences Department, focused on another factor influencing albedo, one located at high altitudes and reflecting solar radiation – clouds. The team analyzed data from the world’s most advanced databases, including cloud data collected via NASA satellites (CERES), as well as data from ERA5, a global weather database containing information collected from a variety of sources in the air and on the ground, dating back to 1950. The ERA5 data were used to supplement the cloud data and to cross-correlate 50 years of it with information on the intensity of cyclones and anticyclones.

Next, the scientists classified the storms of the last 50 years into three categories according to intensity. They discovered a direct link between storm intensity and the number of clouds forming around a storm. While the Northern Hemisphere and land areas in general are characterized by weaker storms, moderate and strong storms prevail above the oceans of the Southern Hemisphere. Data analysis showed that the link between storm intensity and cloudiness accounts for the difference in cloudiness between the hemispheres. “Cloud albedo arising from strong storms above the Southern Hemisphere was found to be a high-precision offsetting agent to the large land area in the Northern Hemisphere, and thus symmetry is preserved,” says Hadas, adding: “This suggests that storms are the linking factor between the brightness of Earth’s surface and that of clouds, solving the symmetry mystery.”

Could climate change make one of the hemispheres darker?

Earth has been undergoing rapid change in recent years, owing to climate change. To examine whether and how this could affect hemispheric albedo symmetry, the scientists used CMIP6, a set of models run by climate modeling centers around the world to simulate climate change. One of these models’ major shortcomings is their limited ability to predict the degree of cloudiness. Nevertheless, the relation found in this study between storm intensity and cloudiness enables scientists to assess future cloud amounts, based on storm predictions.

Models predict global warming will result in a decreased frequency of all storms above the Northern Hemisphere and of weak and moderate storms above the Southern Hemisphere. However, the strongest storms of the Southern Hemisphere will intensify. The cause of these predicted differences is “Arctic amplification,” a phenomenon in which the North Pole warms twice as fast as Earth’s mean warming rate. One might speculate that this difference should break hemispheric albedo symmetry. However, the research shows that a further increase in storm intensity might not change the degree of cloudiness in the Southern Hemisphere because cloud amounts reach saturation in very strong storms. Thus, symmetry might be preserved.

“It is not yet possible to determine with certainty whether the symmetry will break in the face of global warming,” says Kaspi. “However, the new research solves a basic scientific question and deepens our understanding of Earth’s radiation balance and its effectors. As global warming continues, geoengineered solutions will become vital for human life to carry on alongside it. I hope that a better understanding of basic climate phenomena, such as the hemispheric albedo symmetry, will help in developing these solutions.”