a) DeepH provides a practical way to develop comprehensive material models that describe the relationship between material structure and properties. This universal model takes any material structure as input and generates the corresponding DFT Hamiltonian, allowing for straightforward derivation of various material properties. b) DeepH operates by learning and predicting DFT Hamiltonian matrix blocks independently based on local structure information.

Chinese researchers build universal deep-learning model for materials discovery

The research team, led by Prof. Yong Xu and Prof. Wenhui Duan in China, has developed a large materials model based on deep learning. The new method could open significant opportunities for artificial intelligence-driven materials discovery. The team's report explains how deep-learning models, already effective in speech recognition and natural language processing, are now proving useful in materials design research.

Density functional theory (DFT) is one of the most widely used methods in computational materials science and has become highly valuable in computational materials design. Building on the DFT Hamiltonian, the team created a large database of computational data covering more than 10,000 material structures. This enabled them to develop a universal materials model, DeepH, which applies deep learning to the DFT Hamiltonian to handle diverse elemental compositions and material structures, achieving remarkable accuracy in predicting material properties.
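
As a rough illustration of the underlying idea, learning a mapping from local-environment descriptors to Hamiltonian matrix blocks, here is a minimal sketch. The descriptor dimension, block size, and ridge regressor are placeholder assumptions; DeepH itself uses graph neural networks and far richer structural representations.

```python
import numpy as np

# Toy sketch: learn a map from local-environment descriptors to flattened
# Hamiltonian matrix blocks (here 4x4). All shapes and the linear ridge
# regressor are simplifying assumptions, not the actual DeepH architecture.

rng = np.random.default_rng(0)
n_pairs, d_env, block = 500, 16, 4

# Synthetic "ground truth": blocks depend linearly on the descriptor.
W_true = rng.normal(size=(d_env, block * block))
X = rng.normal(size=(n_pairs, d_env))            # local-structure descriptors
Y = X @ W_true + 0.01 * rng.normal(size=(n_pairs, block * block))

# Ridge regression in closed form: W = (X^T X + lam I)^-1 X^T Y
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(d_env), X.T @ Y)

# Predict the Hamiltonian block for a new atom-pair environment.
x_new = rng.normal(size=(1, d_env))
H_block = (x_new @ W).reshape(block, block)
print(H_block.shape)  # (4, 4)
```

Because each block depends only on local structure information, a model trained this way can in principle be applied to structures much larger than any in its training set.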

The researchers note that large materials models, deep-learning models for materials design, have attracted great interest since the success of large language models. Building such models remains challenging, however, because of the inherent complexity of the structure-property relationship in materials.

The team's method could significantly reduce the time and expense needed to develop new materials, which is crucial for manufacturing and industry. 

While this research demonstrates the effectiveness of DeepH's universal materials model, some experts warn that the accuracy of such models is still limited by the quality and quantity of data used in their training. Despite this, the scientific community seems optimistic about the potential of AI-driven materials discovery, and this research certainly moves the field forward.

Overall, the work demonstrates the potential of AI-driven materials discovery while acknowledging the open challenges that remain.

This image compares the distribution of galaxies in a simulated universe used to train SimBIG (right) to the galaxy distribution seen in the real universe (left). Bruno Régaldo-Saint Blancard/SimBIG collaboration.

Astrophysicists use AI to calculate the universe's 'settings'

In a recent announcement, astrophysicists claimed to have used artificial intelligence (AI) to make a breakthrough in their ability to precisely calculate crucial parameters that form the backbone of the standard model of cosmology. They asserted that these estimates are significantly more accurate than those from traditional approaches, highlighting the potential of machine learning to reshape our understanding of the universe.

According to the researchers from the Flatiron Institute and their collaborators, their innovative method, known as Simulation-Based Inference of Galaxies (SimBIG), has allowed them to extract detailed information from the distribution of galaxies, leading to precise estimations of five key cosmological parameters. The claims made by the team were published in a recent study in Nature Astronomy, emphasizing the importance of their findings in shedding light on fundamental aspects of the cosmos. 

One of the key proponents of this method, ChangHoon Hahn, emphasized the potential of AI in unlocking intricate details that were previously inaccessible. By training their AI model on simulated universes representing different parameter values and utilizing realistic galaxy survey data, the researchers purportedly achieved remarkable accuracy in their parameter estimations. However, while the team lauded the benefits of their approach, skepticism remains among some experts in the field.
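
The training loop Hahn describes, simulating universes at known parameter values, learning the inverse mapping, then applying it to observed data, can be sketched in miniature. Everything below (the one-parameter simulator, the two summary statistics, the linear estimator) is a hypothetical stand-in for SimBIG's forward-modeled galaxy surveys and neural networks.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=200):
    """Hypothetical simulator: samples whose mean and spread depend on theta."""
    x = rng.normal(loc=theta, scale=1.0 + 0.5 * theta, size=n)
    return np.array([x.mean(), x.std()])   # summary statistics

# 1) Build a training set of (parameter, summary) pairs from simulations.
thetas = rng.uniform(0.5, 2.0, size=2000)
summaries = np.array([simulate(t) for t in thetas])

# 2) Fit a linear estimator theta_hat = A @ summary + b (least squares).
S = np.column_stack([summaries, np.ones(len(thetas))])
coef, *_ = np.linalg.lstsq(S, thetas, rcond=None)

# 3) "Observe" data generated at an unknown true parameter and infer it.
theta_true = 1.3
obs_summary = simulate(theta_true)
theta_hat = np.append(obs_summary, 1.0) @ coef
print(round(theta_hat, 2))  # estimate of theta_true
```

The same logic scales up: the better the simulator captures the real survey, and the richer the summaries, the tighter the inferred parameters, which is also where the critics' concerns about simulation fidelity enter.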

Critics argue that the extent of precision achieved through AI-powered calculations raises questions about the potential biases and assumptions embedded in the model. The reliance on simulated universes and the subsequent inference drawn from real galaxy observations has sparked concerns about the robustness and generalizability of the results. Some experts caution that the AI model may have inadvertently "learned" patterns within the training data that do not accurately reflect the true nature of the universe's parameters.

Moreover, the ambitious claims made by the researchers about the implications of their work, such as its potential to resolve the Hubble tension, have drawn scrutiny from the broader scientific community. The discrepancy in estimates of the Hubble constant, a fundamental metric in cosmology, has been a long-standing enigma that requires meticulous and unbiased analysis. Critics argue that while the AI-enhanced approach shows promise, it is crucial to approach such complex cosmological questions with a healthy dose of skepticism.

Using AI to predict and analyze intricate cosmological parameters undoubtedly sparks intrigue and excitement within the scientific community. However, given the profound implications of accurately determining the "settings" of the universe, it is imperative to approach such advancements with a critical lens. The potential biases, limitations, and uncertainties associated with machine learning algorithms in the realm of astrophysics warrant further scrutiny and validation.

As the debate around the use of AI in cosmological research continues to evolve, it remains essential for scientists to engage in rigorous testing, validation, and open discussions to ensure that the insights gained from these cutting-edge methodologies truly reflect the intricate workings of our universe.

The claims made by astrophysicists regarding the precision and implications of their AI-driven calculations certainly raise eyebrows among skeptics, signaling a broader dialogue on the role of machine learning in shaping our understanding of the cosmos.

Scientists in Jennifer Doudna’s lab at Gladstone Institutes and the Innovative Genomics Institute

New insights from the 3D shapes of viral proteins revealed using AlphaFold

In San Francisco, a recent study has made significant progress in virology. Researchers at Gladstone Institutes and the Innovative Genomics Institute, led by Jennifer Doudna, PhD, used computational tools to predict the three-dimensional shapes of nearly 70,000 viral proteins, gaining new insights into their functions and roles in infection.

Describing the study, Jennifer Doudna stated, “As viruses with pandemic potential emerge, it’s important to establish how they’ll interact with human cells. Our new study provides a tool to predict what those newly emerging viruses can do.” The team used AlphaFold, an open-access protein structure prediction tool, to predict the shapes of proteins from a wide range of viral species. Matching these 3D shapes to the structures of proteins whose functions are already known revealed previously unknown roles and functionalities of the viral proteins.

The study revealed that 38 percent of the newly predicted protein shapes matched previously known proteins, shedding light on the shared molecular mechanisms between viruses and cellular systems. The team also discovered a common strategy for evading host immune defenses shared across viruses that infect animals and bacteria-infecting viruses known as phages, suggesting a conserved mechanism throughout evolution.
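
Matching a predicted 3D shape against known structures rests, at bottom, on superposing two coordinate sets and measuring how well they align. A standard primitive for this is the Kabsch algorithm, sketched below; the tools used for fold matching at this scale are considerably more sophisticated, so this is illustrative only.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (n, 3) coordinate sets after optimal superposition
    (Kabsch algorithm): center both, find the best rotation via SVD."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return float(np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1))))

rng = np.random.default_rng(2)
A = rng.normal(size=(50, 3))

# A rotated and translated copy of A should superpose almost perfectly.
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
B = A @ Rz.T + np.array([5.0, -2.0, 1.0])
print(kabsch_rmsd(A, B) < 1e-8)  # True
```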

Jason Nomburg, PhD, highlighted the essential role of computational tools in the study, stating: "This would not have been possible without recent advancements in computational tools that allow us to accurately and quickly predict and compare protein structures.”

By sharing the 70,000 newly predicted viral protein structures and the data from their analyses with the scientific community, the researchers have opened up opportunities for further discoveries and collaborations that could deepen knowledge of how viruses interact with their hosts.

This study reaches beyond virology and underlines the pivotal role of computational tools in broadening our understanding of complex biological systems. The study’s insights hold the potential to impact antiviral therapies against a variety of viruses, representing a significant stride in combating viral infections effectively. The breakthrough also underscores the transformative impact that recent advancements in computational tools have had on the field of virology, providing researchers with new ways to unravel the intricacies of viral infections and evolve innovative strategies for combating them.

The co-authors include Nathan Price, Yong K. Zhu, and Jennifer Doudna of Gladstone Institutes, UC Berkeley, and the Innovative Genomics Institute; and Erin E. Doherty and Daniel Bellieny-Rabelo of UC Berkeley and the Innovative Genomics Institute. The study was supported by various institutions and organizations, highlighting the collaborative effort behind this groundbreaking research.

This study not only unfolds new dimensions in virology but also underscores the indispensable role of computational tools in driving innovation and pushing the boundaries of scientific discovery.

(a) The Information and Operation Control Center Building, (b) The Operation Control Center hall, and (c) The Data and Supercomputing Center Facility.

China develops a new space weather monitoring network with cutting-edge data system

The Chinese Meridian Project (CMP) has introduced a groundbreaking network that integrates data from approximately 300 instruments to monitor space weather from the Sun to Earth's atmosphere. The focus is on the CMP's Data and Communication System, which includes data transmission network facilities and a supercomputing center. These components handle data transmission, storage, processing, and distribution services, improving the network's effectiveness.

The Data and Communication System is crucial for transmitting, storing, and processing data from the monitoring instruments in the network. It accommodates and manages data from different layers of the solar-terrestrial space environment, allowing for faster detection and accurate forecasting of space weather events such as solar storms.

The system's data transmission facilities move information seamlessly from the monitoring instruments across the solar-terrestrial system, while a robust data storage infrastructure safeguards the large volumes of data needed for space weather monitoring.

In addition, the inclusion of a supercomputing center within the Data and Communication System advances the processing and analysis of the extensive data sets acquired by the CMP. It enables complex data processing and analysis to extract valuable insights from the information collected by the network's instruments.

Furthermore, the Data and Communication System serves as a gateway for disseminating the project's findings to the international scientific community. It shares processed data and research outcomes, contributing to a collective understanding of space weather phenomena and promoting collaboration in this critical domain.

The CMP’s Data and Communication System plays a pivotal role in the success and impact of space weather monitoring. Its integration represents a significant leap in our capabilities to monitor and understand the solar-terrestrial environment, promoting enhanced preparedness and resilience against potential adverse space weather events.

The introduction of this cutting-edge Data and Communication System within the CMP marks a new era in space weather monitoring, offering a promising trajectory for global efforts to comprehend and adapt to the influence of space weather phenomena on Earth's vital systems and infrastructure.

After a massive, spinning star dies, a disk of material forms around the central black hole. As the material cools and falls into the black hole, new research suggests that detectable gravitational waves are created. Credit: Ore Gottlieb

Flatiron Institute announces new detectable gravitational wave source from collapsing stars, as predicted by simulations

In a recent study, researchers from the Flatiron Institute present simulations indicating that detectable gravitational waves could originate from the cataclysmic collapse of massive spinning stars. If proven true, this revelation could potentially revolutionize our understanding of the cosmos and the nature of black holes. However, these unprecedented claims have left many in the scientific community skeptical and cautious about embracing such paradigm-shifting assertions.

The study's bold assertions are based on the utilization of cutting-edge computational simulations. The simulations purportedly demonstrate the emergence of detectable gravitational waves following the dramatic deaths of rapidly rotating stars, offering a tantalizing prospect of expanding the horizons of gravitational wave astronomy.

While the potential implications of these findings are undeniably profound, the underlying fragility and speculative nature of simulations render the research subject to intense scrutiny. It is imperative to acknowledge that simulations, no matter how sophisticated, are inherently simplifications of complex physical phenomena.

Lead researcher Ore Gottlieb, a research fellow at the Flatiron Institute’s Center for Computational Astrophysics, is assertive in his claims that these gravitational waves could be detectable with instruments such as LIGO, the Laser Interferometer Gravitational-Wave Observatory. The predictions even suggest that current datasets might already contain evidence of these elusive signals.
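
Searching existing strain datasets for a predicted waveform is, at its core, a matched-filtering problem: slide a template across the data and look for a correlation peak. The toy sketch below illustrates only the principle, with a made-up chirp template and white noise; a real LIGO search accounts for the detector noise spectrum, template banks, and much more.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4096
t = np.arange(n)

# Hypothetical "chirp-like" template: rising frequency, decaying envelope.
template = np.sin(2 * np.pi * (0.01 + 1e-5 * t) * t) * np.exp(-t / 2000)
template /= np.linalg.norm(template)

# Inject the template at a known offset into white noise.
offset = 1200
data = rng.normal(scale=0.5, size=n + len(template))
data[offset:offset + n] += 4.0 * template

# Slide the template across the data; the correlation peak marks the signal.
corr = np.correlate(data, template, mode="valid")
print(int(np.argmax(np.abs(corr))))  # peak near the injection offset (1200)
```

Whether a peak like this stands out in real data depends on the strength of the source and the fidelity of the template, which is exactly where the simulations' assumptions come under scrutiny.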

However, the scientific community remains skeptical about the feasibility and robustness of these simulations. The boldness of the claims – presenting the possibility of fundamentally altering our understanding of black holes and the inner workings of collapsing stars – invites cautious contemplation.

The study's acknowledgements of the limitations of simulations reinforce the need for healthy skepticism. Gottlieb himself admits the challenge of capturing the variability and complexity of massive stars' collapse through simulations, illustrating the inherent uncertainties and assumptions in the endeavor.

Moreover, the proposal for detecting these gravitational waves raises pertinent questions about the capability of existing instruments and the potential biases in interpreting observational data. The complexities and subtleties of detecting elusive signals from celestial events demand a rigorous and vigilant approach that is far from guaranteed by relying solely on the outcomes of computational simulations.

While the study’s attempts to shed light on hitherto unexplored aspects of astrophysics are commendable, the scientific community should approach these claims with skepticism and scrutiny. The steadfast reliance on simulations demands a thorough validation process that adheres to the rigors of empirical evidence and observational corroboration.

As we confront these grand claims stemming from meticulously designed simulations, it becomes paramount to exercise caution and temper our enthusiasm with the sobering reality of modeling such intricate astrophysical phenomena. True scientific progress comes only through meticulous testing and validation of hypotheses, especially those based on the speculative outcomes of computational simulations.