Glasgow sets its sights on 'cognitive' cities, where urban systems learn, predict, adapt

Imagine a city capable of sensing trouble before it occurs: anticipating traffic jams, worsening air quality, infrastructure strain, or emerging health concerns, much as a living organism responds to its surroundings. According to a new research center at the University of Glasgow, this vision is closer to reality than many realize.
 
Launched this week, the Centre for Integrated Sensing and Communication Enabling Cognitive Cities (ISAC³) is set to explore how next-generation digital technologies, particularly 6G communications, artificial intelligence, and large-scale data analytics, can transform today’s “smart cities” into cognitive ones. Unlike current urban systems, which primarily monitor conditions in real time, cognitive cities aim to predict and adapt, shifting city management from reactive response to proactive decision-making.
 
At the heart of this ambition lies data, vast volumes of it. Future 6G networks are expected to collect streams of information from advanced sensors embedded throughout urban infrastructure, as well as from next-generation mobile devices carried by residents. Processing, analyzing, and acting on this data in real time will demand not just clever algorithms, but significant computational power, placing high-performance computing and AI-driven analytics at the core of the cognitive city concept.
 
Professor Qammer H. Abbasi, Founding Director of ISAC³ and Professor at the University of Glasgow’s James Watt School of Engineering, describes the initiative as a response to mounting global pressures on cities. Population growth, climate change, cybersecurity risks, and the push toward decarbonization are converging challenges that traditional urban planning struggles to address in isolation.
 
“Next-generation technologies like real-time data collection, advanced communications, cyber-physical systems, and AI-driven analytics will provide the tools required to turn urban spaces into cognitive cities,” Abbasi said. “ISAC³ brings together the expertise needed to explore how these tools can work together responsibly and at scale.”
 
The center unites researchers across engineering, computing science, cybersecurity, public health, business, social science, and urban planning, reflecting the inherently interdisciplinary nature of future cities. Cognitive systems, the researchers suggest, will require tightly coupled sensing, communication, computation, and action, an architectural challenge that mirrors the integrated workflows increasingly seen in modern supercomputing environments.
 
One intriguing aspect of ISAC³’s vision is its focus on health and well-being. Professor Frances Mair, Head of the University of Glasgow’s School of Health & Wellbeing, notes that healthcare services have often lagged behind other sectors in adopting advanced digital tools. Cognitive city technologies, she argues, could change that dynamic by identifying early warning signs of health risks and connecting people to support before conditions worsen.
 
"In the future, ISAC technology could work quietly in the background,” Mair said, “helping communities stay healthier, safer, and more supported."
 
Beyond technical innovation, ISAC³ is also emphasizing responsible and inclusive development. Professor Nuran Acur of the Adam Smith Business School highlighted the Centre’s “quadruple helix” approach, which brings together academia, industry, public services, and society from the earliest stages of research. The goal is to ensure that technologies are not only advanced but also practical, socially responsible, and ready for real-world deployment.
 
The center's first year will focus on building a deployment roadmap through workshops and webinars with international experts in integrated sensing, communication, and computing. A key question underpinning this work is how to balance innovation with robust data protection, particularly as cities collect increasingly sensitive information from sensors and personal devices.
 
Glasgow itself will serve as a living laboratory. The University’s campus will be used as a testbed for prototype systems, developed in collaboration with industry partners including BT, Virgin Media O2, Ericsson, InterDigital, and Neutral Wireless. According to Mallik Tatipamula, CTO of Ericsson Silicon Valley, the work underway in Glasgow could have global implications.
 
“Cities that succeed will be those that can sense, interpret, and respond to their environments in real time,” Tatipamula said. “ISAC³ has the potential to help redefine how future societies function.”
 
ISAC³ presents the supercomputing community with intriguing challenges: How can vast urban data streams be processed both efficiently and securely? What part will high-performance computing and AI accelerators play in facilitating real-time, citywide predictions? And how might computational models allow policymakers to test decisions virtually before they are implemented?
 
As ISAC³ embarks on its mission, it does not claim to possess all the solutions. Rather, it is establishing a research environment driven by curiosity, exploring how cities could learn, adapt, and evolve, and examining the computational backbone necessary to make these ambitions a reality. Through this work, Glasgow is setting itself apart as a hub where the future of urban life is not just envisioned, but rigorously investigated through computation.

Supercomputers illuminate deep Earth: How giant 'blobs' shape our magnetic shield

 
Researchers at the University of Liverpool and their collaborators have achieved a significant advance in understanding Earth's interior. Using sophisticated numerical models powered by high-performance computing, they have demonstrated for the first time how two massive, ultra-hot formations deep within Earth's mantle affect the creation and long-term dynamics of our planet's magnetic field.
 
The Earth’s magnetic field, the invisible shield that protects life from dangerous solar and cosmic radiation, is generated by the turbulent motion of molten iron in the outer core, a process known as the geodynamo. Understanding what governs the geodynamo’s behavior over millions of years requires not only sophisticated palaeomagnetic measurements from ancient rocks but also large-scale, three-dimensional simulations that test how variations deep within the planet affect core dynamics.
 
In their study, the research team combined palaeomagnetic datasets, which record changes in the magnetic field over geological time, with supercomputer-based dynamo simulations to reveal the importance of thermal heterogeneity at the core–mantle boundary. Two continent-sized regions of intensely hot rock, located roughly 2,900 km beneath Africa and the Pacific, sit atop Earth’s outer core and create strong, lateral temperature contrasts that profoundly influence the flow of molten iron below.
 
These “blobs,” known in geophysics as Large Low-Velocity Provinces (LLVPs) because they slow down seismic waves, were already observed by seismic imaging, but their significance for magnetic field generation had been unclear until now. By incorporating the effects of thermal heterogeneity into supercomputer models of the geodynamo, researchers found that these deep mantle structures help explain key features of Earth’s ancient and modern magnetic field, including the persistence of a dominant dipolar structure and subtle longitudinal variations that earlier homogeneous models could not reproduce.
 
Running these simulations is a remarkable computational feat. The equations that govern the interaction of heat, fluid motion, magnetic induction, and rotation in Earth’s core are exceptionally complex and demand high-performance computing (HPC) clusters with massive parallel processing capability. Even with today’s most powerful machines, exploring how the magnetic field evolves over hundreds of millions of years, and how it responds to boundary conditions set by deep mantle structures, represents an immense computational challenge.
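 
For readers wondering what makes these models so demanding, the core of many geodynamo codes is a set of coupled rotating magnetohydrodynamic equations. A simplified statement under the Boussinesq approximation is sketched below; the study's own models may differ in formulation, non-dimensionalization, and boundary conditions:

```latex
% Rotating Boussinesq MHD: a common simplified basis for geodynamo simulations
\begin{aligned}
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  + 2\,\boldsymbol{\Omega}\times\mathbf{u}
  &= -\frac{1}{\rho_0}\nabla p + \alpha T\,\mathbf{g}
     + \frac{1}{\rho_0\mu_0}(\nabla\times\mathbf{B})\times\mathbf{B}
     + \nu\,\nabla^{2}\mathbf{u}, \\
\frac{\partial \mathbf{B}}{\partial t}
  &= \nabla\times(\mathbf{u}\times\mathbf{B}) + \eta\,\nabla^{2}\mathbf{B}, \\
\frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T
  &= \kappa\,\nabla^{2}T, \qquad
  \nabla\cdot\mathbf{u} = 0, \quad \nabla\cdot\mathbf{B} = 0.
\end{aligned}
```

Here u is the fluid velocity in the liquid outer core, B the magnetic field, T temperature, Ω the planetary rotation vector, and ν, η, and κ the viscous, magnetic, and thermal diffusivities. In studies of this kind, the lateral temperature contrasts associated with the LLVPs typically enter through a spatially varying heat-flux boundary condition at the core–mantle boundary, which is what couples the deep mantle "blobs" to the dynamo.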
 
According to Professor Andy Biggin of the University of Liverpool, “strong contrasts in the spatial pattern of core–mantle heat flux … have influenced the geodynamo for at least the last few hundred million years.” This implies that Earth’s magnetic field, while often approximated as a simple bar magnet aligned with the rotation axis, has subtle asymmetries imprinted by deep Earth processes that are only now coming into focus thanks to HPC-enabled modeling.
 
The implications of this research extend beyond geomagnetism. By demonstrating how deep mantle thermal structures influence core dynamics, these models offer a new framework for understanding the long-term evolution of the planet, including the connections between internal dynamics and surface phenomena such as continental assembly and breakup, climate shifts, and the formation of mineral resources. More accurate reconstructions of Earth’s ancient magnetic field also serve as essential constraints in palaeogeographical studies and plate-tectonic history.
 
For the supercomputing community, this breakthrough exemplifies how HPC is becoming indispensable to Earth sciences. Supercomputers are not merely accelerators of computation; they are exploratory instruments that allow scientists to build and test virtual Earths, probing regimes that cannot be accessed through direct observation or laboratory experiments. By enabling models that integrate data spanning hundreds of millions of years with high-resolution physics, supercomputers are transforming our understanding of the deep interior of the planet we call home.
 
With the ongoing growth of computational power, driven by larger HPC systems and improved algorithms, researchers can continue to enhance their models, broaden the range of conditions they examine, and incorporate the latest observational data. These advances will deepen our insights into the geodynamo, Earth’s thermal history, and the intricate connections between our planet’s interior and surface life.
 
According to the study’s authors, combining palaeomagnetic records with dynamo simulations introduces a “new means to constrain the properties and time evolution of the core–mantle boundary.” This approach provides a clearer perspective on the forces that have safeguarded life on Earth for millions of years. Thanks to supercomputing, we are now seeing a far more dynamic and interconnected Earth than previously imagined.

Supercomputers unravel the mystery of missing Tatooine-like planets

The idea of planets with twin suns, like Tatooine from Star Wars, has fascinated both scientists and the public for years. Despite binary stars being common throughout our galaxy, planets orbiting two stars (circumbinary planets) remain unexpectedly scarce. Researchers from the University of California, Berkeley, in collaboration with the American University of Beirut, now offer a compelling answer to this puzzle. Their explanation, grounded in Einstein’s general theory of relativity, emerged from sophisticated computational models powered by cutting-edge supercomputing technology.
 
Of the more than 6,000 confirmed exoplanets identified to date, only a handful orbit binary stars, a statistic that stands in stark contrast to expectations, given that stars commonly form in pairs. The team’s analysis shows that as tight binary stars spiral closer over millions of years due to tidal interactions, general relativistic precession, a subtle warping of spacetime predicted by Einstein, changes the dynamics of the entire system in a way that destabilizes potential planets.
 
Planets in a circumbinary orbit experience gravitational tugs from both stars. Under Newtonian physics alone, this complex interplay is already difficult for planets to navigate. But when the binary stars themselves begin to precess, that is, the orientation of their orbit rotates due to relativistic effects, the system can enter a state of secular resonance. At this point, the precession of the stars’ orbit matches that of the planet’s orbit, steadily pumping energy into the planet’s motion. Eventually, the planet’s path becomes highly elongated and chaotic. It can either be flung outward into interstellar space or drawn inward, where it risks destruction by one of its host stars.
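 
To make the resonance condition more concrete, the sketch below (not the authors' code) compares two leading-order precession rates: the general-relativistic apsidal precession of the binary and the Newtonian precession that the binary's quadrupole field induces in a circular, coplanar circumbinary orbit. The resonance corresponds roughly to the configuration where the two rates coincide; the stellar masses, separations, and toy planet orbit used here are illustrative, and the paper's full treatment includes eccentricity terms and capture dynamics omitted here.

```python
import numpy as np

G = 6.674e-11        # gravitational constant, SI units
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit, m

def gr_precession_binary(m1, m2, a_b, e_b=0.0):
    """Leading-order GR apsidal precession rate of the binary orbit (rad/s)."""
    M = m1 + m2
    n = np.sqrt(G * M / a_b**3)                    # binary mean motion
    return 3.0 * G * M * n / (c**2 * a_b * (1.0 - e_b**2))

def quadrupole_precession_planet(m1, m2, a_b, a_p):
    """Leading-order apsidal precession of a circular circumbinary orbit
    driven by the binary's quadrupole moment (rad/s)."""
    M = m1 + m2
    n_p = np.sqrt(G * M / a_p**3)                  # planet mean motion
    return 0.75 * (m1 * m2 / M**2) * (a_b / a_p)**2 * n_p

# Illustrative system: two solar-mass stars with a planet at 1 AU.
m1 = m2 = 1.0 * M_SUN
a_p = 1.0 * AU
for a_b_au in (0.4, 0.2, 0.1, 0.05):               # binary shrinks as it inspirals
    a_b = a_b_au * AU
    print(f"a_b = {a_b_au:4.2f} AU | binary GR rate = "
          f"{gr_precession_binary(m1, m2, a_b):.3e} rad/s | "
          f"planet rate = {quadrupole_precession_planet(m1, m2, a_b, a_p):.3e} rad/s")
```

As the binary separation shrinks, the binary's relativistic precession speeds up while the planet's precession slows down, so an inspiraling binary can sweep through the configuration where the two rates match, the crossing that the study identifies as the trigger for resonant capture.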
 
This resonant disruption, described in the study “Capture into Apsidal Resonance and the Decimation of Planets around Inspiraling Binaries,” was elucidated through orbit-averaged simulations that explore the dynamical evolution of circumbinary systems under a range of initial conditions. These simulations, computationally intensive and numerically sophisticated, map out the phase space of binary–planet interactions and reveal how frequently planets are captured into destructive resonances as binaries tighten. According to the study, roughly eight out of every ten potential planets in tight binary systems encounter this resonance, and three out of four are eventually destroyed or ejected, leaving behind only a few survivors on distant, hard-to-detect orbits.
 
Such high-resolution modeling is intrinsically dependent on supercomputing capabilities. Simulating the long-term evolution of three-body systems, where two stars and a planet influence one another gravitationally, requires solving coupled differential equations with precision over billions of simulated years. Conventional computing alone is insufficient for this scale of calculation; only through HPC systems can researchers explore vast ensembles of scenarios, integrate relativistic effects accurately, and uncover the nuanced mechanisms that shape planetary destinies.
 
For the supercomputing community, this research offers both inspiration and affirmation of the critical role HPC plays in astrophysics. By enabling simulations that incorporate general relativity alongside classical dynamics, supercomputers open windows into processes that cannot be observed directly, but that govern the architecture of planetary systems throughout the galaxy. They allow scientists to test theoretical ideas against virtual models of reality, refining our understanding of how planets form, persist, or perish in the cosmos.
 
The Berkeley-led team clarifies that their findings do not mean binary stars are devoid of planets. Instead, they show that while planets often form around binary stars, most are pushed into orbits that current detection tools, including NASA’s Kepler and TESS, struggle to find. A few planetary survivors may remain, hidden in distant, long-period orbits that will require innovative search techniques to uncover.
 
Looking ahead, researchers plan to apply similar modeling techniques to other astrophysical contexts, such as the environments around pairs of supermassive black holes, to understand how relativistic dynamics influence large-scale cosmic structures. In doing so, they continue to push the boundaries of computational astrophysics, using supercomputers not just as tools for calculation but as engines of discovery in the quest to understand our universe.

ML, supercomputing unite to revolutionize high-power laser optics

Researchers at the University of Strathclyde in Scotland are leveraging advanced computational techniques to transform scientific discovery. By integrating machine learning algorithms with powerful supercomputer models, they have significantly accelerated the design process for robust optical components used in high-power laser systems. This innovative approach not only shortens design cycles but also uncovers new physical phenomena, marking a breakthrough with wide-reaching impacts across science, industry, and emerging technologies.
 
High-power lasers are vital to advancements in nuclear fusion, high-field physics, and advanced manufacturing, but their optical components must endure extreme intensities without failing. Traditional optics are often large, expensive, and challenging to scale, which restricts the development of next-generation laser facilities. To overcome these limitations, Strathclyde’s multidisciplinary team is developing plasma photonic structures: temporary, self-assembled mirrors formed in ionized gas that can fulfill the same roles at a much smaller and more cost-effective scale.
 
The central challenge lies in navigating a highly complex parameter space where interdependent variables determine performance. Traditional design methods involve resource-intensive, trial-and-error iterations that may require hundreds of thousands to millions of individual evaluations before an acceptable design can be identified. By coupling machine learning algorithms with supercomputer-driven physical models, specifically deep kernel Bayesian optimization (DKBO) paired with particle-in-cell (PIC) simulations, researchers have reduced this process to just a few dozen iterations, enabling rapid identification of high-reflectivity, robust plasma mirror designs.
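 
As a rough illustration of how such a loop works, the sketch below implements a generic Bayesian-optimization cycle: a surrogate model is fitted to the simulations run so far, an acquisition function proposes the next most informative parameter set, and an expensive simulation is launched at that point. It is not the team's DKBO code; a plain Gaussian-process surrogate stands in for the deep kernel, and a cheap analytic function stands in for the particle-in-cell simulation that would run on an HPC system.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def run_simulation(x):
    """Placeholder objective standing in for a PIC run that returns, say,
    plasma-mirror reflectivity for a given (density, intensity) setting."""
    density, intensity = x
    return np.sin(3.0 * density) * np.exp(-(intensity - 0.7) ** 2)

def expected_improvement(X_cand, gp, y_best):
    """Standard expected-improvement acquisition function (maximization)."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
lo, hi = np.zeros(2), np.ones(2)                   # normalized parameter bounds
X = rng.uniform(lo, hi, size=(5, 2))               # small initial design
y = np.array([run_simulation(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(30):                                # "a few dozen" expensive evaluations
    gp.fit(X, y)
    candidates = rng.uniform(lo, hi, size=(2000, 2))
    x_next = candidates[np.argmax(expected_improvement(candidates, gp, y.max()))]
    y_next = run_simulation(x_next)                # in practice: submit an HPC PIC job
    X, y = np.vstack([X, x_next]), np.append(y, y_next)

print("best parameters:", X[np.argmax(y)], "best objective:", y.max())
```

The design choice that matters here is the division of labor: the surrogate and acquisition function are cheap to evaluate and decide where to look next, while each call to the simulator is the costly, HPC-bound step, which is why cutting the evaluation count from many thousands to a few dozen translates directly into saved compute time.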
 
This achievement depends on computationally intensive supercomputer simulations to model the spatio-temporal evolution of transient plasma structures and evaluate performance metrics such as reflectivity and pulse compression. The simulations, executed at high resolution with millions of interacting particles, are inherently demanding and could not be conducted at scale without HPC resources. In fact, the team’s use of national supercomputing services, including the ARCHER2 UK National Supercomputing Service, exemplifies how targeted computational power can transform scientific inquiry.
 
According to lead author Dr. Slavi Ivanov of Strathclyde’s Department of Computer and Information Sciences, the integration of DKBO with particle-in-cell models enables not just faster design optimization but also unexpected discovery. In their work, the optimization framework found regimes where incident laser pulses are compressed by the plasma mirror structure, a phenomenon that emerged from the simulations rather than human intuition, underscoring the capacity of machine-assisted design to reveal new physics.
 
Professor Dino Jaroszynski, co-author and distinguished laser physicist, described the research as an engine of discovery that expands the objectives beyond mere performance targets. “By specifying innovative or unconventional design goals, we can uncover mechanisms that might otherwise remain hidden,” he noted, suggesting that this approach could redefine how optical components are conceived for extreme environments.
 
The implications of this work extend well beyond high-power lasers themselves. The general nature of the machine learning and simulation framework means it can be adapted to other optical elements, from beam splitters to focusing devices, and even to real-time experimental optimization workflows where objective functions are derived from empirical measurements. This flexibility opens new pathways for rapid, HPC-enabled design across photonics, telecommunications, and other advanced technologies.
 
Importantly for the supercomputing community, this research illustrates how machine learning and HPC models can be coupled in powerful synergy. Machine learning provides an intelligent search strategy that dramatically reduces the number of required simulation runs, while the supercomputer executes the high-fidelity physical models necessary to evaluate each candidate design. This integrated loop, where algorithms guide simulations and simulations train algorithms, is becoming a hallmark of contemporary computational science.
 
As high-performance computing infrastructure continues to advance in both capability and accessibility, hybrid approaches such as deep kernel Bayesian optimization are becoming essential tools for addressing complex, multidisciplinary challenges. From the design of next-generation optical components to the discovery of previously unknown physical phenomena, the integration of machine learning with high-fidelity simulation is accelerating innovation and narrowing the gap between theoretical research and practical application.
 
For the supercomputing community, the Strathclyde plasma mirror project illustrates how supercomputing has evolved beyond traditional numerical analysis into a collaborative force in scientific discovery, enabling researchers to navigate vast design spaces, reveal unexpected behaviors, and redefine how technologies are engineered for extreme operating conditions.

Supercomputing reveals hidden galactic architecture around the Milky Way

Leveraging the capabilities of modern high-performance computing (HPC), astronomers have unraveled a cosmic mystery: the Milky Way and its closest neighboring galaxies are embedded in a sprawling sheet of matter that shapes the movement of surrounding galaxies. This breakthrough, featured in Nature Astronomy, was achieved using advanced simulations powered by cutting-edge supercomputers to model the mass distribution and dynamics of our local universe.
 
For decades, cosmologists have grappled with an apparent contradiction in galactic motion. While most galaxies in the universe recede from one another in accord with the expansion described by the Hubble–Lemaître law, our immediate neighborhood, the Local Group comprising the Milky Way, the Andromeda Galaxy, and dozens of dwarf galaxies, exhibits surprisingly coherent motion patterns that ordinary mass distributions failed to explain. The Andromeda Galaxy itself moves toward the Milky Way at about 100 km/s, a phenomenon long understood as gravitational interaction within the Local Group. Yet the behavior of other nearby galaxies did not align with theoretical expectations.
 
Now, an international team led by doctoral researcher Ewoud Wempe and Professor Amina Helmi at the University of Groningen has shown that the key to this puzzle lies not within the confines of the Local Group alone but in an extended, planar mass structure surrounding it. Using sophisticated cosmological simulations constrained by observational data, including the positions, masses, and velocities of 31 galaxies just beyond the Local Group, the researchers demonstrated that the vast majority of dark matter and visible matter in our vicinity is organized in a flat sheet extending tens of millions of light-years. Above and below this planar structure are vast voids with minimal matter.
 
What sets this discovery apart is the critical role of supercomputing in constructing these “virtual twin” universes. The team’s simulations began with initial conditions seeded by early-universe observations and then evolved forward using numerical methods that follow the gravitational growth of structure in an expanding universe, tracking both dark matter and ordinary, baryonic matter. Such calculations involve millions of interacting elements and demand parallel computation at scale, the exclusive domain of HPC systems. By performing these simulations on powerful supercomputers, astronomers were able to trace the gravitational influence of the large-scale sheet on galaxy motions and verify that this configuration reproduces observed velocities with high fidelity.
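 
The gravitational core of such calculations can be conveyed with a deliberately tiny sketch: a direct-summation N-body integrator using a kick-drift-kick leapfrog scheme. Production constrained-simulation codes replace the O(N²) force loop with tree or particle-mesh methods, work in expanding (comoving) coordinates, and distribute the particles across many nodes with MPI, none of which is shown here; the particle counts, units, and parameters below are purely illustrative.

```python
import numpy as np

def accelerations(pos, mass, softening=0.05):
    """Direct-summation gravitational accelerations in G = 1 units (O(N^2))."""
    dx = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]     # pairwise separation vectors
    r2 = np.sum(dx ** 2, axis=-1) + softening ** 2         # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                          # no self-interaction
    return np.einsum("ijk,ij,j->ik", dx, inv_r3, mass)

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog integration of the N-body system."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc        # half kick
        pos += dt * vel              # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc        # half kick
    return pos, vel

# Toy run with a few hundred particles; real constrained simulations evolve
# millions of particles from early-universe initial conditions on HPC systems.
rng = np.random.default_rng(1)
N = 500
pos = rng.normal(size=(N, 3))
vel = np.zeros((N, 3))
mass = np.full(N, 1.0 / N)
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=100)
print("center of mass after integration:", pos.mean(axis=0))
```

Even this toy version makes the scaling pressure obvious: the force calculation grows quadratically with particle count, and constrained “virtual twin” runs must repeat it over many thousands of timesteps, which is precisely where the parallelism of modern supercomputers becomes indispensable.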
 
According to Helmi, this marks the first time that the distribution and velocity field of dark matter in the region surrounding our galactic neighborhood have been quantitatively constrained in a manner consistent with both ΛCDM cosmology and observed local dynamics. “Astronomers have been trying to solve this problem for decades,” Helmi said. “It is extraordinary that, based purely on the motions of galaxies, we can infer a mass distribution that matches the observed positions and motions of galaxies within and just outside the Local Group.”
 
For the supercomputing community, this achievement is profoundly inspirational. It highlights how modern HPC infrastructures, with their massive parallelism, high memory bandwidth, and optimized numerical libraries, are enabling scientists to probe cosmic questions that were once deemed intractable. These simulations not only illuminate the hidden architecture of our cosmic neighborhood but also exemplify how simulation-based science complements observation, allowing researchers to explore scenarios that cannot be directly imaged or measured.
 
Beyond resolving a decades-old enigma in galactic astronomy, this work reinforces the broader scientific view that large-scale structures, from filaments of the cosmic web to planar mass configurations like the one now identified around the Milky Way, are fundamental to understanding the universe’s evolution. Supercomputers are not just tools for speeding up calculations; they are essential engines of discovery that empower scientists to simulate the universe with realism and precision.
 
As supercomputing technology advances, both in terms of hardware and algorithms, scientists are poised to create increasingly detailed “virtual universes.” These sophisticated simulations will not only put our cosmological models to the test but also inform future telescope and space mission observations, leading to a richer understanding of our cosmic context.
 
According to the study’s authors, uncovering the influence of the Local Sheet on galactic motion is more than just resolving a persistent mystery; it demonstrates the remarkable discoveries possible when computational power, observational insights, and scientific curiosity are combined on a cosmic scale.