Coral bleaching. © Underwater Earth / XL Catlin Seaview Survey / Christophe Bailhache

Prepare now for a hotter ocean: UK scientists urge action to protect marine life

Communities must plan to reduce the risks of extreme weather patterns and record-high temperatures.

Since April 2023, the average global sea surface temperature has been abnormally high and increasing. By August, temperatures in the northern hemisphere ocean reached a record 25°C.

This rapidly warming trend, fuelled by the climate crisis, has manifested as a series of marine heatwaves — periods of unusually warm sea temperatures that can last weeks, months, or even years — across the northern and southern hemispheres.
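Marine heatwaves are commonly identified using the Hobday et al. (2016) definition: sea surface temperature exceeding the seasonally varying 90th percentile of a local climatology for at least five consecutive days. A minimal detection sketch, with synthetic data and a fixed threshold standing in for a real climatology:

```python
import numpy as np

def detect_heatwave_days(sst, threshold, min_duration=5):
    """Flag marine-heatwave days: runs of at least `min_duration`
    consecutive days with SST above the threshold."""
    above = sst > threshold
    flags = np.zeros_like(above, dtype=bool)
    run_start = None
    for i, hot in enumerate(above):
        if hot and run_start is None:
            run_start = i
        elif not hot and run_start is not None:
            if i - run_start >= min_duration:
                flags[run_start:i] = True
            run_start = None
    if run_start is not None and len(above) - run_start >= min_duration:
        flags[run_start:] = True
    return flags

# Synthetic example: a 14°C baseline with a 10-day, +3°C warm spike.
sst = np.full(60, 14.0)
sst[20:30] += 3.0              # warm anomaly, days 20-29
threshold = 15.0               # illustrative 90th-percentile proxy
flags = detect_heatwave_days(sst, threshold)
print(flags.sum())             # 10 heatwave days
```

In practice the threshold is computed per calendar day from a multi-decade baseline, so the same absolute temperature can be a heatwave in winter but not in summer.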

Marine heatwaves are becoming more frequent, stronger, and longer-lasting. In some areas around the UK and Ireland, surface waters in June–July were 4–5°C above what is usually recorded at this time of the year. Temperatures are also soaring off the coast of Florida and into the Gulf of Mexico, extending across the tropical Pacific, around Japan, and off the coasts of Ecuador and Peru.

Marine heatwaves disrupt, threaten, and damage ecosystems. They are particularly dangerous for temperature-sensitive marine organisms that live in cool waters, such as kelps, invertebrates, and fishes, and organisms that cannot move, such as corals. Many species may be susceptible to disease or mortality, with knock-on effects.

Such events also affect local communities, who suffer economic losses from fisheries and aquaculture impacts.

This makes the concurrent onset of a strong El Niño particularly worrying. El Niño is the warm phase of a major climate phenomenon, the El Niño–Southern Oscillation, which also has a cool phase (La Niña) and a neutral phase; the phases switch irregularly every few years. Of the three, El Niño has the most widespread impact on sea surface temperatures globally and is typically associated with a rise in global temperatures.

Earlier in 2023, conditions in the tropical Pacific began to reverse, and an El Niño now appears to be developing. It is likely to strengthen towards the end of the year, and possibly into next year, and is poised to trigger major marine heatwaves.

With El Niño conditions compounding long-term warming trends, it is vital to monitor ocean temperatures and to develop plans that reduce the risks to wildlife and economies.

In a recent article, researchers urge decision-makers in the marine and coastal biodiversity conservation, fishing, aquaculture, and tourism industries to set out a strategy to reduce risk before, during, and after such events. They also set out four main priorities:

  • Identify threatened regions

Historical data can show what areas suffered marine heatwaves during previous El Niños and suggest where future events are likely to occur.

  • Improve forecasts and warnings

Developing new predictive supercomputer models and improving the accuracy of current systems is crucial for local biodiversity efforts as well as the fishing, aquaculture, and tourism industries.

  • Plan local responses

Seasonal early warning systems should be developed to inform conservation agencies, fishing and aquaculture industries, and the public. Some industries like aquaculture and fisheries may need to change or adapt practices before and during predicted heatwaves.

  • Monitor impacts of warmer waters

To better understand ecological responses to extreme warming events, researchers should scale up monitoring efforts to understand more about the physical and biological conditions of a region before a heatwave occurs.

Scientists say that unfortunately, the climate crisis may eventually cause the ocean to reach a permanent heatwave state, and some regions may no longer support certain species and ecosystems.

Postdoctoral Research Assistant Dr Katie Smith from the Marine Biological Association (MBA) who co-authored the paper said: “Marine heatwaves are occurring with increasing regularity and it is crucial that we work towards predicting their impacts and implementing adaptive strategies to reduce the consequences of these events.

Regardless of whether marine heatwaves are exacerbated this year by an El Niño event, preparations to soften their impacts will help marine ecosystems and the industries that rely on them, offering them an opportunity to adjust or transform.”

According to scientists, we must act now to prepare for stronger marine heatwaves in the future. To achieve this, we need to reduce greenhouse gas emissions, improve ocean health, and increase our resilience to climate change. As individuals, we can play our part by reducing our carbon footprint, conserving energy, and supporting policies that protect our oceans and the planet. By taking these steps, we can help ensure that future generations inherit a healthy and thriving marine environment.

Image of non-homogeneous glass with color elements concentrated in certain regions (photo: Nilanjana Shasmal/CeRTEV)

Brazilian researchers show how to unlock the strength of specialty glass with niobium oxide

Applications of specialty glass range from astronomy to medicine, as well as data and power transmission. The study combined spectroscopy with molecular dynamics (MD) and Monte Carlo (MC) simulations to show how the structure of the material is affected by the addition of niobium oxide.

A study conducted at the Center for Research, Education, and Innovation in Vitreous Materials (CeRTEV) in São Carlos, São Paulo state, Brazil, shows for the first time that including niobium oxide (Nb2O5) in silicate glass results in silica network polymerization, which increases bond density and connectivity, enhancing the mechanical and thermal stability of specialty glass.

The study was supported by FAPESP and reported in an article published in the journal Acta Materialia.

The first author of the article, Henrik Bradtmüller, is a postdoctoral researcher at the Federal University of São Carlos’s Center for Exact Sciences and Technology (CCET-UFSCar), with a fellowship from FAPESP. His supervisor is Edgar Dutra Zanotto, director of CeRTEV.

CeRTEV is hosted by UFSCar and is one of the Research, Innovation, and Dissemination Centers (RIDCs) funded by FAPESP.

“Our study combined experimental observations using nuclear magnetic resonance spectroscopy and Raman spectroscopy with computational modeling. Besides the results mentioned, we found that higher levels of niobium led to Nb2O5 clustering and heightened electronic polarizability, with a significant impact on the optical properties of the glass,” Bradtmüller said.

Raman spectroscopy provides detailed information about the molecular structure of materials by probing their vibrational modes, while nuclear magnetic resonance (NMR) spectroscopy complements it by probing the magnetic properties of atomic nuclei and their local environments.

“Our strategy based on these two observational techniques plus computational modeling can be used to study functional elements of many other types of glass, including optical materials, bioactive glass, and glassy fast-ion conductors. This will facilitate the development of innovative glass formulations adapted for various applications,” Bradtmüller said.

Alongside the everyday applications of ordinary glass in containers, windows, and so on, high-quality glass has also become almost ubiquitous in today’s world, Bradtmüller noted. It is present in the microscopes and telescopes used by scientists, for example, in the optical fibers used to carry data and power, and in the glass-ceramic orthotic devices increasingly used in medicine. “In recognition of the role played by glass in contemporary society, the United Nations declared 2022 to be the International Year of Glass,” he said.

For advanced high-tech applications, materials scientists are using machine learning software and other computational resources to design glass with customized properties, but to do so they require reliable databases and structural parameters that take into account the physicochemical complexity of glass.

This is the relevance of the study by Bradtmüller and colleagues. “Glass intermediate oxides play a strategic role in this new technological moment. They don’t form glass under standard cooling in the laboratory, but they can make a positive contribution in the presence of other oxides by helping to build oxygen bridges and giving the glass the properties of interest. Niobium oxide is a good example,” he explained.

The glass that contains niobium (Nb) is valued for its non-linear optical properties, with potential applications in optoelectrical devices, and for mechanical properties relevant to the fabrication of bioactive materials. “Although studies had been conducted using Nb2O5 before our own, the structural role of Nb remained obscure, owing mainly to a lack of systematic spectroscopic characterization data. We set out to fill this knowledge gap in our study,” he said.

“We discovered through spectroscopy that the addition of Nb causes ‘polymerization’ of the silica-oxygen network, increasing the connectivity of the glass’s components. This clarified the role of Nb as a ‘network former’. Another highlight of the study is our demonstration that a new NMR technique we developed in 2020 using other materials applies to glass. This technique, which is called W-RESPDOR, can be used to measure the distance between two elements – in this case, lithium and Nb, which has such a challenging nucleus that it had never been measured with similar techniques.”

Computational modeling showed that lithium ions are randomly distributed in silica-based glass at the nanometric scale (5-10 nanometers), while Nb tends to form clusters at higher concentrations of Nb2O5, he explained, adding that this kind of structural arrangement had never been reported in the literature and is an original contribution of the study.

“In a broader perspective, the study points to an experimental and computational strategy to investigate the role played in glass by intermediate oxides with active nuclei for NMR spectroscopy,” Zanotto said.

The other authors of the article include Hellmut Eckert, Vice Director of CeRTEV and a specialist in NMR; and Anuraag Gaddam, a postdoctoral researcher specializing in computer simulations, with a scholarship from FAPESP and supervision by Eckert.

The study demonstrated that adding niobium oxide to silicate glass increases bond density and connectivity, resulting in better mechanical and thermal stability of specialty glass. This development is promising, as it could lead to more reliable and durable specialty glass products. Further research and development may open up a range of new possibilities for the use of silicate glass in various applications.

Data collected by the MOSAiC expedition to the central Arctic (shown), and analyzed by McKelvey School of Engineering researchers, revealed blowing snow as a previously unaccounted-for source of sea salt aerosols, impacting Arctic climate models. (Photo courtesy MOSAiC expedition)

Wang's lab discovers that Arctic sea salt aerosols are underestimated, improving modeling

Atmospheric scientists, led by Jian Wang, have discovered that wind-blown snow in the central Arctic produces abundant fine sea salt aerosols, resulting in increased seasonal surface warming.

The Arctic is a concerning outlier when it comes to global warming trends. It warms almost four times faster than the global average, and aerosols play a significant role in this warming. Scientists have long known that pollutants transported from other regions can build up in the Arctic atmosphere, where they alter atmospheric chemistry, absorb sunlight, and affect local weather patterns, driving localized warming that melts ice and snow. While sea salt particles make up most of the aerosol mass concentration in the Arctic, the mechanisms that produce them and their impact on the Arctic climate have yet to be fully understood.

Atmospheric scientists led by Jian Wang, director of the Center for Aerosol Science and Engineering and a professor of energy, environmental and chemical engineering at the McKelvey School of Engineering at Washington University in St. Louis, investigated the production and impact of sea salt aerosols on Arctic warming. Their results revealed abundant fine sea salt aerosol production from blowing snow in the central Arctic, increasing particle concentration and cloud formation.

“Over the past few decades, scientists have identified ‘Arctic haze’ as the primary source of aerosols in the Arctic during winter and spring. This haze results from the long-range transport of pollutants,” said Xianda Gong, first author on the study and a former postdoctoral researcher in Wang’s lab. “However, our study reveals that local blowing snow, which produces sea salt particles, contributes a more substantial fraction to the total aerosol population in the central Arctic.”

Wang’s team analyzed data collected by the Multidisciplinary Drifting Observatory for the Study of Arctic Climate (MOSAiC). Such observations are difficult to obtain — the MOSAiC expedition entailed international collaboration and freezing an icebreaker into the central Arctic ice pack to drift with the sea ice for an entire year — but essential to understanding the full picture of atmospheric conditions in the Arctic.

“The MOSAiC expedition let us observe how aerosols and clouds evolve over a year and led to this discovery,” Wang said. “Sea salt particles in the Arctic atmosphere aren’t surprising, since there are ocean waves breaking that will generate sea salt aerosols. But we expect those particles from the ocean to be pretty large and not very abundant.

“We found sea salt particles that were much smaller and in higher concentration than expected when there was blowing snow under strong wind conditions,” Wang said.

In the central Arctic, the coldest winter nights are the clearest, when heat from Earth can escape into space unimpeded. Under a cozy blanket of clouds, though, long-wave radiation gets trapped and contributes to warming, so any process that leads to increased cloud formation and lingering cloudiness also boosts surface temperatures. Small aerosol particles, including those fine sea salt aerosols produced by blowing snow that Wang’s team discovered, turn out to be very good for cloud formation.

“These sea salt particles can act as cloud condensation nuclei, leading to cloud formation,” Gong said. “Considering the absence of sunlight in the winter and spring Arctic, these clouds can trap surface long-wave radiation, thereby significantly warming the Arctic surface.”

Though scientists had not observed this phenomenon before, fine sea salt aerosols from blowing snow have always been part of the Arctic climate system. With this observational confirmation and systematic study, which revealed that sea salt particles produced from blowing snow account for about 30% of total aerosol particles, climate models can now be updated to include the effects of these fine particles.

“Model simulations that don’t include fine sea salt aerosols from blowing snow underestimate aerosol population in the Arctic,” Wang said. “Blowing snow happens regardless of human warming, but we need to include it in our models to better reproduce the current aerosol populations in the Arctic and to project future Arctic aerosol and climate conditions.”

The findings of this study suggest that model simulations of the Arctic atmosphere must include fine sea salt aerosols from blowing snow in order to accurately represent the aerosol population in the region. This is an encouraging development, as it means that scientists now have a better understanding of the Arctic atmosphere and can use this knowledge to develop more accurate models and predictions. With this new information, researchers can continue to work towards a more comprehensive understanding of the Arctic climate and its effects on the global environment.


Unlocking the secrets of the sea: How Japanese scientists are working to improve tsunami warning systems

The Hunga Tonga-Hunga Ha'apai volcano in Tonga erupted on January 15, 2022, causing massive amounts of energy to be released into the atmosphere and ocean, leading to tsunamis across the Pacific Ocean. The Shocks, Solitons, and Turbulence Unit of the Okinawa Institute of Science and Technology (OIST) in Japan has conducted research into the disturbances in the atmosphere and ocean during this event and has developed a supercomputer model to enhance the current tsunami early warning systems.

Stephen Winn, a research technician in the unit and first author of the research article, stated “It's important to know how the atmospheric wave changes in time to make accurate predictions that would be of use for warning systems.”    

Unlike a regular tsunami caused by a rapid movement of the seabed, the large waves caused by the Tonga explosion were also influenced by a pressure wave hundreds of kilometers wide released into the atmosphere. The atmospheric pressure wave first moved upwards and then spread outwards, traveling at 1,141 km/h on average, about 400 km/h faster than a regular tsunami can travel in deep water. It traveled around the Earth, causing waves as far away as the Mediterranean Sea. “This was the first event of its kind recorded in detail by modern instruments,” Prof. Emile Touber, leader of the Shocks, Solitons and Turbulence Unit stated.
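That speed gap can be checked against the shallow-water wave relation c = √(g·h), which governs how fast an ordinary tsunami travels. A back-of-envelope sketch, assuming a typical deep-ocean depth of about 4 km (an illustrative value, not one from the article):

```python
import math

g = 9.81          # gravitational acceleration, m/s^2
depth = 4000.0    # assumed typical deep-ocean depth, m

tsunami_speed = math.sqrt(g * depth)     # shallow-water wave speed, m/s
tsunami_kmh = tsunami_speed * 3.6        # convert to km/h

pressure_wave_kmh = 1141.0               # average speed reported for the Tonga event
gap_kmh = pressure_wave_kmh - tsunami_kmh

print(round(tsunami_kmh), round(gap_kmh))  # ≈ 713 km/h and a gap of ≈ 428 km/h
```

The ~428 km/h gap is consistent with the article's "about 400 km/h faster"; in shallower water the tsunami slows further (c shrinks with depth), widening the gap.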

As the atmospheric wave travels above the ocean, it displaces the body of water underneath, creating waves that travel faster than a regular tsunami. “Normally, a tsunami wave created in the Pacific would not reach the Mediterranean because it would have to travel around land masses to get there, but atmospheric waves are not restricted, traveling over those land masses,” Dr. Adel Sarmiento, a postdoc researcher at the unit explained. This is why the wave can reach worldwide and has a broader impact than a regular tsunami.  

The scientists used measurements from the Tonga event to validate their model and used a state-of-the-art code, dNami, co-developed by Dr. Nicolas Alferez at the Conservatoire National des Arts et Métiers in Paris, France, to rapidly simulate the earth during the event using the supercomputer at OIST. The code allows them to create simulations in satisfactory resolution, faster than real-time, so that they are useful for improving warning systems in the future.

Prof. Touber explained that they can now more accurately predict the arrival time and height of a wave at a specific location and rapidly identify areas at high risk. 

Hurricanes and typhoons can also cause disturbances in the atmosphere that interact with the sea, causing significant water level changes that will affect coastlines. “With our model, we can explore what might happen to the water flow as it approaches the coast if the sea level changes by a certain amount with certain typical storm conditions,” Prof. Touber said. “This can help decide on the kind of coastal defense systems that should be put in place for storm-related surges.” 

A group of scientists from Japan conducted a study on the interactions between the ocean and atmosphere that occurred after the Tonga volcano eruption. The results of the study were very promising as they developed a model that has the potential to predict high-risk areas with great accuracy and improve the existing tsunami warning systems. This research is a major milestone in understanding the complex interplay between the ocean and atmosphere and has the potential to save many lives and properties in the future.

In the above map from the Southern California Earthquake Data Center, some of the individual pixels represent thousands of earthquakes.

Discover the power of deep learning with UCSC seismologists' pioneering technology for forecasting earthquake aftershocks

Earthquake aftershock forecasting models have remained largely unchanged for more than 30 years. These models work well with limited data but struggle with the vast amount of seismology datasets that are now available. To overcome this limitation, researchers from the University of California, Santa Cruz, and the Technical University of Munich have developed a new model called Recurrent Earthquake foreCAST (RECAST). This model uses deep learning and is more flexible and scalable than the current earthquake forecasting models.

The scientists published a paper in Geophysical Research Letters, which shows that the new model outperforms the existing model, known as the Epidemic Type Aftershock Sequence (ETAS) model, for earthquake catalogs of about 10,000 events or more.

“The ETAS model approach was designed for the observations that we had in the 80s and 90s when we were trying to build reliable forecasts based on very few observations,” said Kelian Dascher-Cousineau, the lead author of the paper who recently completed his Ph.D. at UC Santa Cruz. “It’s a very different landscape today.” Now, with more sensitive equipment and larger data storage capabilities, earthquake catalogs are much larger and more detailed.
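For context, the ETAS model describes the aftershock rate as a conditional intensity: a constant background rate plus a magnitude-scaled Omori-law decay term from each past event. One common parameterization is λ(t) = μ + Σ K·10^(α(mᵢ−m₀))/(t−tᵢ+c)^p over events tᵢ < t. A minimal sketch with illustrative parameter values (not fitted to any catalog):

```python
def etas_intensity(t, events, mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Conditional intensity (events/day) of the Epidemic Type Aftershock
    Sequence (ETAS) model: background rate `mu` plus Omori-law decay from
    each past event, scaled by its magnitude above the reference `m0`.
    Parameter values here are illustrative only."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * 10 ** (alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# A magnitude-6 mainshock at t=0 days, followed by a magnitude-4.5 aftershock.
catalog = [(0.0, 6.0), (1.0, 4.5)]
print(etas_intensity(2.0, catalog))    # elevated rate two days after the mainshock
print(etas_intensity(10.0, catalog))   # decayed, but still above background
```

Fitting the handful of parameters (μ, K, α, c, p) by maximum likelihood is tractable for small catalogs, which is why the approach worked well in the data-poor 80s and 90s; the RECAST approach replaces this fixed functional form with a learned neural point process.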

“We’ve started to have million-earthquake catalogs, and the old model simply couldn’t handle that amount of data,” said Emily Brodsky, a professor of earth and planetary sciences at UC Santa Cruz and co-author on the paper. One of the main challenges of the study was not designing the new RECAST model itself but getting the older ETAS model to work on huge data sets to compare the two. 

“The ETAS model is kind of brittle, and it has a lot of very subtle and finicky ways in which it can fail,” said Dascher-Cousineau. “So, we spent a lot of time making sure we weren’t messing up our benchmark compared to actual model development.”

To continue applying deep learning models to aftershock forecasting, Dascher-Cousineau says the field needs a better system for benchmarking. To demonstrate the capabilities of the RECAST model, the group first used an ETAS model to simulate an earthquake catalog. After working with the synthetic data, the researchers tested the RECAST model using real data from the Southern California earthquake catalog.

They found that the RECAST model — which can, essentially, learn how to learn — performed slightly better than the ETAS model at forecasting aftershocks, particularly as the amount of data increased. The computational effort and time were also significantly better for larger catalogs.

This is not the first time scientists have tried using machine learning to forecast earthquakes, but until recently, the technology was not quite ready, said Dascher-Cousineau. New advances in machine learning make the RECAST model more accurate and easily adaptable to different earthquake catalogs.

The model’s flexibility could open up new possibilities for earthquake forecasting. With the ability to adapt to large amounts of new data, models that use deep learning could potentially incorporate information from multiple regions at once to make better forecasts about poorly studied areas.

“We might be able to train on New Zealand, Japan, California and have a model that's quite good for forecasting somewhere where the data might not be as abundant,” said Dascher-Cousineau.

Using deep-learning models will also eventually allow researchers to expand the type of data they use to forecast seismicity.

“We’re recording ground motion all the time,” said Brodsky. “So the next level is to use all of that information, not worry about whether we’re calling it an earthquake or not an earthquake but to use everything."

In the meantime, the researchers hope the model sparks discussions about the possibilities of the new technology.

“It has all of this potential associated with it,” said Dascher-Cousineau. “Because it is designed that way.”

The use of deep learning by UCSC seismologists for forecasting earthquakes is a groundbreaking development in the field of seismology. It not only provides an unprecedented level of accuracy in predicting seismic activity but also opens up new possibilities for understanding and preparing for the impacts of earthquakes. This research has the potential to save lives and property and serves as an example of the power of science and technology to improve the world we live in. With further research and development, deep learning could become an invaluable tool in the fight against the destructive forces of nature.