NIH researchers develop GeneAgent AI for gene-set analysis

Researchers at the National Institutes of Health (NIH) have created an artificial intelligence (AI) agent called GeneAgent that enhances the accuracy and informativeness of gene set analysis. This AI is powered by a large language model (LLM) and improves upon existing systems by providing more accurate and detailed descriptions of biological processes and their functions.

GeneAgent cross-checks its initial predictions, also known as claims, for accuracy against information stored in established, expert-curated databases. It then generates a verification report that details its successes and failures. This AI agent aids researchers in interpreting high-throughput molecular data and identifying relevant biological pathways or functional modules, which can deepen our understanding of how various diseases and conditions impact groups of genes both individually and collectively.

While AI-generated content is produced by LLMs trained on vast amounts of text data from the internet, these models are not designed to verify facts. As a result, AI-generated content can sometimes be false, misleading, or fabricated—a phenomenon known as AI hallucination. LLMs can also exhibit circular reasoning, whereby they fact-check their outputs against the same data they were trained on, which can reinforce confidence in incorrect information.

Addressing AI hallucinations is crucial when using LLM tools for gene set analysis, which involves generating collective functional descriptions of grouped genes and their potential interactions. Previous studies utilizing LLMs to answer genomic questions or summarize biological processes did not adequately address the issue of hallucinations in generated content.

GeneAgent tackles this challenge by independently comparing its claims against established knowledge in external expert-curated databases. The research team initially tested GeneAgent on 1,106 gene sets sourced from existing databases that had known functions and process names. For each gene set, GeneAgent first generated an initial list of functional claims. It then used its self-verification module to cross-check these claims against the curated databases and produced a verification report indicating whether each claim was supported, partially supported, or refuted.
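The verification loop described above — generate claims, check each against curated evidence, and label it supported, partially supported, or refuted — can be sketched in a few lines. This is an illustrative toy, not GeneAgent's actual implementation: the function names, claim structure, and the in-memory "database" are all hypothetical.

```python
# Hypothetical sketch of a claim self-verification loop
# (illustrative only; not GeneAgent's real API or data model).

def verify_claims(claims, lookup_evidence):
    """Label each claim supported / partially supported / refuted
    based on overlap with expert-curated database evidence."""
    report = []
    for claim in claims:
        evidence = lookup_evidence(claim["genes"], claim["function"])
        matched = claim["genes"] & evidence  # genes the database ties to the function
        if matched == claim["genes"]:
            verdict = "supported"
        elif matched:
            verdict = "partially supported"
        else:
            verdict = "refuted"
        report.append({"claim": claim["function"], "verdict": verdict})
    return report

# Toy curated "database" mapping a function to its known gene set.
curated = {"DNA repair": {"BRCA1", "BRCA2", "RAD51"}}
claims = [
    {"genes": {"BRCA1", "RAD51"}, "function": "DNA repair"},
    {"genes": {"BRCA1", "MYC"}, "function": "DNA repair"},
    {"genes": {"GAPDH"}, "function": "DNA repair"},
]
report = verify_claims(claims, lambda genes, fn: curated.get(fn, set()))
print([r["verdict"] for r in report])
# → ['supported', 'partially supported', 'refuted']
```

The key design point the sketch preserves is that the verdicts come from an external knowledge source, not from asking the model to re-judge its own output.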

To evaluate the accuracy of its self-verification process, the researchers enlisted two human experts to manually review 10 randomly selected gene sets, comprising a total of 132 claims. The experts assessed whether GeneAgent's self-verification reports were correct, partially correct, or incorrect. Their analysis revealed that 92% of the decisions made by GeneAgent were accurate, demonstrating strong self-verification performance relative to a standard GPT-4 baseline. The experts confirmed the model's effectiveness in reducing hallucinations and producing more reliable analytical narratives.

The research team also explored real-world applications of GeneAgent using animal-model gene sets. When tested on seven novel gene sets derived from mouse melanoma cell lines, GeneAgent provided valuable insights into the functions of specific genes, potentially leading to the discovery of new drug targets for diseases such as cancer.

While LLM-based tools like GeneAgent are still constrained by the information they can access and their inability to reason like humans, GeneAgent's self-driven fact-checking capability shows significant promise in addressing AI hallucinations.

AI reveals hidden patterns of Yellowstone’s supervolcano

Beneath the stunning geysers and expansive landscapes of Yellowstone lies a hidden world of seismic activity, now revealed through advanced machine learning techniques.

A groundbreaking study led by Professor Bing Li at Western University in Canada, in collaboration with Universidad Industrial de Santander in Colombia and the U.S. Geological Survey, has utilized advanced machine learning on 15 years of seismic data from the Yellowstone caldera (2008–2022). The outcome? A seismic catalog of 86,276 earthquakes—nearly ten times more than previously recorded.

Machine Learning: The Seismic Detective 🔍

Historically, detecting earthquakes involved a labor-intensive process of manual review, where researchers would spend hours sifting through waveform data to identify seismic events. In this study, by contrast, AI-powered algorithms scanned the entire dataset, automatically identifying previously overlooked small earthquakes and determining their magnitudes.
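The study's deep-learning pickers are part of the published workflow and are not reproduced here, but the core idea of automated event detection can be illustrated with a classical short-term/long-term average (STA/LTA) trigger — the baseline technique that ML pickers improve on by replacing hand-set energy ratios with learned features. Everything below (the synthetic trace, window lengths, threshold) is an illustrative assumption.

```python
import numpy as np

def sta_lta_trigger(trace, sta_len, lta_len, threshold):
    """Classic STA/LTA detector: flag samples where short-term average
    energy exceeds the long-term background energy by a threshold ratio."""
    energy = trace ** 2
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    ratio = sta / np.maximum(lta, 1e-12)
    return np.where(ratio > threshold)[0]

# Synthetic trace: low-level noise with a small impulsive "event" at sample 500.
rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(1000)
trace[500:520] += np.sin(np.linspace(0, 20 * np.pi, 20))  # impulsive arrival
picks = sta_lta_trigger(trace, sta_len=10, lta_len=200, threshold=5.0)
print(picks)  # trigger indices cluster around the event onset near sample 500
```

On real continuous data, each triggered window would then be passed to phase association and magnitude estimation — the stages the Yellowstone catalog automated at scale.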

Professor Li explains, “If we had to rely on traditional methods, where someone manually clicks through all this data, it’s not scalable.” Machine learning has not only accelerated detection but has also fundamentally transformed our understanding of the seismic patterns beneath Yellowstone.

Revealing Earthquake Swarms and Young Faults

More than half of the detected events were part of “earthquake swarms,” which are bursts of closely spaced small tremors. These swarms illustrate fractal-like fault structures—rough and immature fractures beneath the caldera. By mapping these features, scientists are gaining insights into how subsurface fluids trigger cascades of tremors.

This detailed seismic view enables researchers to apply robust statistical methods to analyze swarm dynamics and the interactions between fluids and faults in unprecedented detail.
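One of the robust statistical methods a ten-times-larger catalog enables is estimating the Gutenberg-Richter b-value, which quantifies the relative abundance of small versus large earthquakes. Below is a minimal sketch using Aki's maximum-likelihood estimator on a synthetic catalog; the numbers are illustrative, and the magnitude-binning correction used in practice is omitted.

```python
import numpy as np

def b_value_aki(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value estimate for events at or
    above the completeness magnitude mc (binning correction omitted)."""
    m = magnitudes[magnitudes >= mc]
    return np.log10(np.e) / (m.mean() - mc)

# Synthetic catalog: exponentially distributed magnitudes above mc,
# which corresponds to a Gutenberg-Richter law with true b = 1.
rng = np.random.default_rng(42)
mc = 0.5
mags = mc + rng.exponential(scale=np.log10(np.e) / 1.0, size=50_000)
print(b_value_aki(mags, mc))  # close to the true value of 1.0
```

The estimator's precision scales with catalog size, which is why expanding a catalog from roughly 9,000 to 86,276 events materially sharpens this kind of analysis.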

As Li noted, these methods are not limited to Yellowstone; they have the potential to revolutionize monitoring at volcanoes worldwide.

In summary, machine learning is transforming Yellowstone from a breathtaking surface spectacle into a finely tuned seismic symphony. With AI guiding the way, scientists are now better equipped than ever to understand the hidden rhythms of our planet’s most famous supervolcano, turning once-silent tremors into enlightening discoveries.

Woolpert acquires Dawood Engineering, enhancing infrastructure, geospatial capabilities

Woolpert has made a significant move to expand its global presence and enhance its engineering and geospatial capabilities by acquiring Dawood Engineering Inc., a respected infrastructure and technology firm based in Pennsylvania. This acquisition brings over 150 engineers, surveyors, and geospatial professionals into Woolpert, bolstering its expertise in the transportation, utilities, and energy sectors across North America, Europe, and the Middle East.

Founded in 1992 by civil engineer Bony Dawood, Dawood Engineering has earned recognition for its multidisciplinary work in infrastructure development, particularly in transportation, utilities, and advanced geospatial solutions. Headquartered in Harrisburg, the firm has served as a lead consultant on major projects for state transportation departments and municipal agencies, including PennDOT, the Pennsylvania Turnpike Commission, and the cities of Boston and Philadelphia.

“We are incredibly proud to welcome the Dawood team to Woolpert,” said Woolpert President and CEO Neil Churman. “Their innovative and entrepreneurial approach aligns perfectly with our mission. This acquisition also strengthens our presence in Pennsylvania, a state at the forefront of AI, data center growth, and critical infrastructure development.”

Dawood Engineering has extensive experience in the energy and utilities sector, with expertise in oil and natural gas, electric utilities, pipeline design, and alternative energy sources. These capabilities enhance Woolpert’s existing portfolio and reinforce its position as a leader in integrated architecture, engineering, and geospatial (AEG) solutions.

“This next chapter, as part of the world’s leading AEG firm, creates an environment where all of our professionals can thrive,” said Dawood CEO Bony Dawood. “Together, we are well-positioned to help our clients shape the future of both digital and physical infrastructure.”

The acquisition also strengthens Woolpert with cutting-edge geospatial technologies, including Dawood’s work in 3D laser scanning, GIS, building information modeling, and its pioneering Twin Track mobile application for building management. Notable projects by Dawood include the $10 million Riverlands Safety Improvements Project and the digitization of Poland’s historic Royal Łazienki Museum.

Woolpert Infrastructure Sector Leader Bryan Dickerson emphasized the strategic value of the acquisition: “Dawood brings a depth of technical expertise that complements and strengthens our team. This partnership is founded on shared values and a common vision for innovation and excellence. For our clients and staff, it’s a transformative step forward.”

With the integration of Dawood’s team and services, Woolpert continues to build on its legacy as a global leader in infrastructure and geospatial services, now with an even stronger foundation rooted in Pennsylvania.

Birmingham modeling illuminates giants of the cosmos

How Cutting-Edge Simulations Helped Decode the Universe’s Heaviest Black Holes 🌌

In a landmark scientific achievement, astrophysicists at the University of Birmingham in the UK have played a pivotal role in unraveling the most massive black hole merger ever detected. Weighing in at an astonishing 240 solar masses, the binary system observed on November 23, 2023, defied expectations and set a new standard in cosmic discovery.

However, behind every gravitational wave lies an intricate dance of colossal forces. To decode this phenomenon, Birmingham researchers utilized supercomputer-powered modeling that advanced the field of computational astrophysics.

Precision Modeling: Turning Whispering Waves into Cosmic Stories

When the gravitational wave signals arrived, raw data alone could not reveal their whole story. Enter the unsung heroes: supercomputer simulations, each taking weeks to run, that captured every detail of two black holes spinning at near-light speeds, tracing their spiraling embrace through the fabric of space-time.

These intensive simulations served a dual purpose: to generate theoretical templates of how black hole mergers ought to appear and to compare them with real signals to confirm the identity of the binary system. This intricate detective work unveiled the mass, spin, and orbital characteristics of these cosmic giants.
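At its core, comparing theoretical templates with real signals is matched filtering: correlating a modeled waveform against noisy data to find where, and how strongly, it fits. The toy version below uses a synthetic chirp-like template and made-up numbers; it bears no relation to the actual analysis pipeline, which works with physically accurate waveforms and detector noise models.

```python
import numpy as np

def matched_filter(data, template):
    """Slide a standardized template along the data; the correlation
    peak marks the best-fit alignment of template and signal."""
    t = (template - template.mean()) / template.std()
    return np.correlate(data, t, mode="valid") / len(t)

# Synthetic "detector data": a frequency-increasing chirp buried in noise.
rng = np.random.default_rng(1)
n, offset = 4096, 1500
time = np.linspace(0, 1, 256)
template = np.sin(2 * np.pi * (10 + 30 * time) * time)  # toy chirp waveform
data = rng.standard_normal(n)
data[offset:offset + len(template)] += 3 * template  # inject signal at offset 1500
snr = matched_filter(data, template)
print(int(np.argmax(snr)))  # correlation peaks near the injection point
```

In a real search, banks of thousands of such templates — each from a different combination of masses and spins — are correlated against the data, and the best-matching template is what reveals the source's properties.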

Birmingham’s Role: Expertise Meets Computational Power

A team of brilliant minds, including Dr. Amit Singh Ubhi, Dr. Debnandini Mukherjee, Dr. Panagiota Kolitsidou, and others, translated signature wave patterns into astrophysical revelations. Dr. Gregorio Carullo emphasized the importance of these models in uncovering layers of complexity, noting that it will take years for the community to unravel this intricate signal pattern fully.

By combining advanced numerical relativity, machine learning, and extensive computational time, Birmingham scientists confirmed a collision that challenges existing models of stellar physics and opens new avenues for understanding how black holes grow—and potentially collide again to form even larger entities.

Why It Matters: Simulating the Universe’s Most Violent Collisions

- Shattering Cosmic Records: The observed binary system outweighs the previous heavyweight by nearly 100 solar masses, prompting scientists to reconsider how such massive black holes form.
- Testing Einstein’s Legacy: Only through high-resolution simulations can researchers effectively probe general relativity under such extreme conditions.
- Fueling the Next Wave of Discoveries: Birmingham’s modeling framework will support future searches for intermediate-mass black holes, enigmatic objects that lie between stellar and supermassive scales.

Looking Ahead: Modeling the Future of Gravitational-Wave Astronomy

The supercomputer workflows developed around this groundbreaking observation are not just a one-time achievement; they represent a blueprint for future cosmic explorations. As gravitational-wave detectors evolve and become more sensitive, the modeling capabilities must also advance to interpret these signals. Birmingham’s team is at the forefront of this progression, combining computational strength with scientific insight.

🏅 Inspiring the Next Generation

What began as faint echoes from the depths of space has become, through skill and computational power, a vivid chapter in the history of the cosmos. The supercomputer modeling done in Birmingham does not just process numbers; it brings them to life, showcasing humanity's ability to simulate, understand, and appreciate the universe's most dramatic events.

In doing so, these scientists remind us that we are not mere observers of the cosmos; we are its narrators, equipped with technology, intellect, and unwavering determination to tell its grandest stories.

Diamonds are hijacked: AI-powered simulations reveal surprising twist in crystal formation

In a stunning revelation about one of Earth's most iconic natural transformations, researchers at UC Davis have discovered that diamonds may owe their crystalline beauty to an unexpected detour involving graphite. This intriguing finding comes from cutting-edge molecular simulations powered by machine learning.

For decades, scientists have understood the basics: carbon atoms under immense pressure and heat eventually crystallize into diamonds. However, a new AI-assisted perspective has made this transformation story much more interesting.

Using advanced molecular dynamics simulations, the UC Davis team trained machine learning algorithms to model the atomic rearrangement that carbon undergoes deep within the Earth. Their results overturned previous assumptions: instead of carbon atoms seamlessly aligning into diamond form, they first transition into a more chaotic, graphite-like state. In other words, graphite — the same soft material found in pencils — serves as an unexpected intermediary in the creation of diamonds.

The simulations, which demanded extraordinary precision and computational power, revealed that this graphite-like layer "hijacks" the usual path to diamond formation. It creates a kind of atomic jam session that may appear messy on the surface but ultimately lays the foundation for the perfect diamond lattice.

"Without machine learning, we’d never have caught this," said UC Davis physicist and study co-author Subramanian Sankaranarayanan. "The simulations require immense computational complexity — we’re tracking the quantum behavior of thousands of atoms over time."

Traditional physics-based models would have taken years to run, but the team's AI-driven approach dramatically reduced that timeline. Their neural networks were trained on quantum-level data, enabling them to predict how atoms interact, bond, and break apart — all at unprecedented speeds and scales.
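The speedup comes from swapping an expensive quantum-mechanical force calculation for a cheap learned surrogate inside an otherwise standard molecular dynamics loop. The schematic below shows a velocity-Verlet integrator calling a placeholder force function where the trained neural network would sit; the "potential" here is a trivial harmonic stand-in, and none of this reflects the actual UC Davis code.

```python
import numpy as np

def surrogate_forces(positions):
    """Placeholder for a trained ML potential. Here: a simple pairwise
    harmonic pull of each atom toward the others (illustrative only)."""
    diff = positions[:, None, :] - positions[None, :, :]
    return -diff.sum(axis=1)

def velocity_verlet(positions, velocities, dt, steps):
    """Standard velocity-Verlet integration, with forces supplied by
    the surrogate model instead of a quantum-mechanical solver."""
    forces = surrogate_forces(positions)
    for _ in range(steps):
        velocities += 0.5 * dt * forces       # first half-kick (unit masses)
        positions += dt * velocities          # drift
        forces = surrogate_forces(positions)  # query the (surrogate) model
        velocities += 0.5 * dt * forces       # second half-kick
    return positions, velocities

rng = np.random.default_rng(7)
pos = rng.standard_normal((8, 3))   # 8 toy "atoms" in 3D
vel = np.zeros((8, 3))
pos, vel = velocity_verlet(pos, vel, dt=0.01, steps=100)
print(pos.shape)  # (8, 3)
```

The structure explains the economics: the integrator itself is cheap, so making each force call fast — by training a network on quantum-level reference data — is what turns years of simulation into a tractable computation.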

This discovery isn't just a scientific curiosity; it could lead to advancements in synthetic diamond technologies, providing cleaner, faster, and potentially cheaper methods for producing gem-quality diamonds or materials for advanced electronics.

As for the diamonds themselves? They may still be everlasting, but we now know that their journey includes a detour through pencil lead. Science is often full of surprises, and with the help of machine learning, we now understand that diamonds are born not only from pressure but also from a touch of chaos.