Supercomputer models show how crop production increases soil nitrous oxide emissions

A recent ecosystem modeling study conducted by Iowa State University scientists shows how crop production in the United States has led to an increase in the emissions of nitrous oxide, a potent greenhouse gas, throughout the last century.

Photo caption: Expansion of agricultural land and the application of nitrogen fertilizers have driven an increase in nitrous oxide emissions from U.S. soils, according to a new study from ISU researchers. Photo by Loren King.

The researchers drew on massive amounts of data on everything from weather patterns to soil conditions to land use and agricultural management practices to feed the model and quantify changes in nitrous oxide emissions from soils in the United States. The research, published in the peer-reviewed academic journal Global Change Biology, breaks soil emissions down by ecosystem type and major crop and finds that the expansion of land devoted to agriculture since 1900 and intensive fertilizer inputs have predominantly driven an overall increase in nitrous oxide emissions.

The use of such ecosystem models to assess the sources of nitrous oxide emissions could help guide policymakers as they enact conservation plans and responses to climate change, said Chaoqun Lu, associate professor of ecology, evolution, and organismal biology and corresponding author of the study.

“The model we are using is a process-based ecosystem model,” Lu said. “It’s similar to mimicking the patterns and processes of an ecosystem in our computer. We divide the land into thousands of pixels at a uniform size and run algorithms that simulate how ecological processes respond to changes in climate, air composition, and human activities.”
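To make that description concrete, here is a minimal sketch, in Python, of the pixel-by-pixel bookkeeping a process-based ecosystem model performs. The grid structure, variable names, coefficients, and the simple response function are illustrative assumptions only; the ISU team's actual model implements far more detailed ecological process algorithms.

```python
# Minimal sketch of pixel-based ecosystem-model bookkeeping.
# All names and coefficients are illustrative placeholders, not values
# from the published study.
from dataclasses import dataclass

@dataclass
class Pixel:
    land_use: str          # e.g. "corn", "soybean", "forest"
    n_fertilizer: float    # kg N per hectare per year
    soil_moisture: float   # 0..1, fraction of saturation
    temperature: float     # mean annual temperature, deg C

def annual_n2o_flux(p: Pixel) -> float:
    """Very simplified per-pixel N2O flux (kg N2O-N/ha/yr): a baseline
    'natural' flux plus a fertilizer-driven term modulated by moisture
    and temperature."""
    baseline = 0.5 if p.land_use == "forest" else 1.0
    fert_term = 0.01 * p.n_fertilizer                 # placeholder fraction of applied N
    env_modifier = p.soil_moisture * max(0.0, 1.0 + 0.02 * p.temperature)
    return baseline + fert_term * env_modifier

def national_total(pixels: list, ha_per_pixel: float) -> float:
    """Sum per-pixel fluxes into a national total (kg N2O-N/yr)."""
    return sum(annual_n2o_flux(p) for p in pixels) * ha_per_pixel

pixels = [Pixel("corn", 180.0, 0.6, 9.0), Pixel("forest", 0.0, 0.5, 9.0)]
print(national_total(pixels, ha_per_pixel=100.0))
```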

Results show emissions tripled

The study found nitrous oxide emissions from U.S. soil have more than tripled since 1900, from 133 million metric tons of carbon dioxide equivalent (MMT CO2 eq) per year at the beginning of the 20th century to 404 MMT CO2 eq per year in the 2010s. Nearly three-quarters of that rise in emissions originates from agricultural soils, with corn and soybean production driving over 90% of the ag-related emissions increase, according to the study.

“Our study suggests a large [nitrous oxide] mitigation potential in cropland and the importance of exploring crop-specific mitigation strategies and prioritizing management alternatives for targeted crop types,” the study authors wrote in their paper.

The rise in emissions corresponds to an expansion of cropland in the United States, Lu said. The computer models found land devoted to agricultural production emits more nitrous oxide than natural landscapes. That’s largely due to the widespread application of nitrogen fertilizers to agricultural land and to legume crop production, Lu said. The added nitrogen is partially used by crops, and the remainder either stays in soils or is lost to the environment. During this process, microorganisms living in soils consume nitrogen-containing compounds and give off nitrous oxide as a byproduct. Better understanding the dynamics of which crops lead to the greatest emissions can help shape climate mitigation policy, Lu said. Because more nitrogen fertilizer is applied in corn production on average than in other crops, the study found that soils where corn is grown tend to emit more nitrous oxide per unit of fertilizer used, Lu said.

The researchers designed mathematical models that mimic ecological processes. The models rely on mountains of data gathered and developed over years, Lu said. The researchers compiled government data on crops, land use, weather, and other variables. They also factored in historic and survey data from farmers and other landowners.

The research team also compared the results from their model with real-world data to validate their results. For instance, the scientists showed their model’s yield predictions tracked with national yield records dating back to 1925 for major crops such as corn, soybean, wheat, rice, and others. That shows the model simulation could track the long-term trajectory of nitrogen uptake that supports increasing crop yield over the past century. They also compared their model’s nitrous oxide emission predictions to real-world data collected from multiple natural and managed soils across the nation, as well as time-series measurements from a central Iowa corn-soybean rotation site over seven years.
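As an illustration of this kind of validation step, the sketch below compares a simulated time series against an observed record using simple agreement metrics. The numbers and the choice of metrics are assumptions for demonstration, not the study's actual data or validation statistics.

```python
# Hedged sketch of a model-validation comparison: simulated annual values
# are checked against an observed record. Data here are made up.
import numpy as np

def validation_metrics(simulated: np.ndarray, observed: np.ndarray) -> dict:
    """Return simple agreement metrics between two aligned time series."""
    residuals = simulated - observed
    return {
        "bias": float(residuals.mean()),
        "rmse": float(np.sqrt((residuals ** 2).mean())),
        "r": float(np.corrcoef(simulated, observed)[0, 1]),
    }

# Example with invented numbers (e.g. national corn yield, arbitrary units):
sim = np.array([2.1, 2.4, 2.6, 3.0, 3.3])
obs = np.array([2.0, 2.5, 2.7, 2.9, 3.4])
print(validation_metrics(sim, obs))
```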

“Our group has spent lots of time improving model performance and developing the driving force history, including natural and human disturbances, for the model simulations,” Lu said. “Behind the scenes, there are thousands of lines of algorithms to guide the computer model to make predictions. It takes decades of efforts, and more to come, to reduce modeling uncertainties and incorporate better ecological process understanding resulting from the hard work of field scientists.”

Blue Blob near Iceland could slow glacial melting

A region of cooling water in the North Atlantic Ocean near Iceland, nicknamed the "Blue Blob,” has likely slowed the melting of the island's glaciers since 2011 and may continue to stymie ice loss until about 2050, according to new research.

The origin and cause of the Blue Blob, which is located south of Iceland and Greenland, are still being investigated. The cold patch was most prominent during the winter of 2014-2015, when the sea surface temperature was about 1.4 degrees Celsius (2.52 degrees Fahrenheit) colder than normal.

The new study uses climate models and field observations to show that the cold water patch chilled the air over Iceland sufficiently to slow ice loss starting in 2011. The model predicts cooler water will persist in the North Atlantic, sparing Iceland's glaciers until about 2050. Ocean and air temperatures are predicted to increase between 2050 and 2100, leading to accelerated melting.

While cooler water in the North Atlantic offers a temporary respite for Iceland's glaciers, the authors estimate that without steps to mitigate climate change, the glaciers could lose a third of their current ice volume by 2100 and be gone by 2300. If the country's 3,400 cubic kilometers (about 816 cubic miles) of ice melt, sea level will rise by 9 millimeters (0.35 inches).

"In the end, the message is still clear," said lead author Brice Noël, a climate modeler who specializes in polar ice sheets and glaciers at Utrecht University. "The Arctic is warming fast. If we wish to see glaciers in Iceland, then we have to curb the warming."

The paper is published in the AGU journal Geophysical Research Letters, which publishes high-impact, short-format reports with immediate implications spanning all Earth and space sciences. Its findings may help scientists to better understand the indirect effects of the ocean on glaciers.

"It's crucial to have an idea of the possible feedbacks in the Arctic because it's a region that is changing so fast," Noël said. "It's important to know what we can expect in a future warmer climate."

The warming Arctic

Nowhere on Earth has warmed as quickly as the Arctic. Recent studies report the area is warming four times faster than the global average. Iceland's glaciers steadily shrank from 1995 to 2010, losing an average of 11 gigatons of ice per year. Starting in 2011, however, the speed of Iceland’s melting slowed, resulting in about half as much ice loss, or about 5 gigatons annually. This trend was not seen in nearby, larger glaciers across Greenland and Svalbard.

Noël and his colleagues investigated the cause of this slowdown by estimating the glaciers' mass balance — how much they grew or melted annually from 1958 to 2019. They used a high-resolution regional climate model that works at the small scale of Iceland's glaciers to estimate how much snow the glaciers received in winter and how much ice was lost from meltwater runoff in summer. The researchers found that cooler waters near the Blue Blob are linked to observations of lower air temperatures over Iceland's glaciers and coincide with the slowdown of glacial melting since 2011.
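A minimal sketch of the surface mass-balance bookkeeping described above, assuming the annual balance is simply winter accumulation minus summer meltwater runoff; the figures are invented for illustration and are not the study's data.

```python
# Toy surface mass-balance accounting: annual balance (Gt) =
# winter snowfall accumulation - summer meltwater runoff.
def annual_mass_balance(winter_snowfall_gt: float, summer_runoff_gt: float) -> float:
    """Net annual change in glacier mass, in gigatons (positive = growth)."""
    return winter_snowfall_gt - summer_runoff_gt

years = {
    2009: (18.0, 29.0),   # (snowfall, runoff) in Gt, made-up numbers
    2012: (19.0, 24.0),
    2015: (20.0, 25.0),
}
for year, (snow, runoff) in years.items():
    print(year, annual_mass_balance(snow, runoff), "Gt")
```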

Several researchers have proposed that the Blue Blob is part of the normal sea surface temperature variability in the Arctic. Notably, especially cold winters in 2014 and 2015 led to record cooling, which caused upwelling of cold, deep water, even as ocean temperatures around the region warmed due to climate change.

Before the Blue Blob, a long-term cooling trend in the same region, called the Atlantic Warming Hole, reduced sea surface temperatures by about 0.4 to 0.8 degrees Celsius (0.72 to 1.44 degrees Fahrenheit) during the last century and may continue to cool the region in the future. A possible explanation for the Warming Hole is that climate change has slowed the Atlantic Meridional Overturning Circulation, an ocean current that brings warm water up from the tropics to the Arctic, thus reducing the amount of heat delivered to the region.

The end of Iceland's glaciers?

Noël projected the future climate of Iceland by combining the same regional climate model with a global climate model to predict how North Atlantic ocean temperatures would affect the glaciers' fate until 2100, under a scenario of rapid warming. The models predicted that the North Atlantic near Iceland will stay cool, slowing — and perhaps even temporarily stopping — ice loss from the glaciers by the mid-2050s.

The authors verified that the models accurately reconstructed the mass of the glaciers using almost 1,200 measurements of snow depth collected between 1991 and 2019 by colleagues at the University of Iceland and satellite measurements of the elevation and extent of glaciers taken from 2002 to 2019 by co-authors at the Delft University of Technology.

"I think their analysis is very thorough," said Fiamma Straneo, a physical oceanographer at the Scripps Institution of Oceanography who was not involved in the study. "They have a really state-of-the-art regional atmospheric model for looking at the variability of glaciers." Straneo thinks this approach could be used to understand changes in other glaciers that occur over land, such as in the Himalayas and Patagonia. "There is very active research in land terminating glaciers because they are one of the largest contributors to sea level rise right now."

UCI scientists discover how galaxies can exist without dark matter

In supercomputer simulations, collisions cause smaller star groupings to lose material

In a new study, an international team led by astrophysicists from the University of California, Irvine and Pomona College report how, when tiny galaxies collide with bigger ones, the bigger galaxies can strip the smaller galaxies of their dark matter – material that can’t be seen directly, but which astrophysicists think must exist because, without its gravitational effects, they couldn’t explain things like the motions of a galaxy’s stars.

Image caption: Dark matter distribution in a simulated galaxy group, with brighter areas showing higher concentrations of dark matter. Circles show close-up images of the stellar light associated with two galaxies lacking dark matter. If these galaxies had dark matter, they would appear as bright regions in the main image. Moreno et al.

It’s a mechanism that has the potential to explain how galaxies might be able to exist without dark matter – something once thought impossible.

It started in 2018 when astrophysicists Shany Danieli and Pieter van Dokkum of Princeton University and Yale University observed two galaxies that seemed to exist without most of their dark matter.

“We were expecting large fractions of dark matter,” said Danieli, who’s a co-author on the latest study. “It was quite surprising, and a lot of luck, honestly.”

The lucky find, which van Dokkum and Danieli reported in papers in 2018 and 2020, threw the galaxies-need-dark-matter paradigm into turmoil, potentially upending what astrophysicists had come to see as a standard model for how galaxies work.

“It’s been established for the last 40 years that galaxies have dark matter,” said Jorge Moreno, an astronomy professor at Pomona College, who’s the lead author of the new paper. “In particular, low-mass galaxies tend to have significantly higher dark matter fractions, making Danieli’s finding quite surprising. For many of us, this meant that our current understanding of how dark matter helps galaxies grow needed an urgent revision.”    

The team ran computer models that simulated the evolution of a chunk of the universe – one about 60 million light-years across – starting soon after the Big Bang and running to the present.

The team found seven galaxies devoid of dark matter. After several collisions with neighboring galaxies 1,000 times more massive, they were stripped of most of their material, leaving behind nothing but stars and some residual dark matter.

“It was pure serendipity,” said Moreno. “The moment I made the first images, I shared them immediately with Danieli, and invited her to collaborate.”

Robert Feldmann, a professor at the University of Zurich who designed the new simulation, said that “this theoretical work shows that dark matter-deficient galaxies should be very common, especially in the vicinity of massive galaxies.”

UCI’s James Bullock, an astrophysicist who’s a world-renowned expert on low-mass galaxies, said he and the team didn’t build their model just so they could create galaxies without dark matter – something he said makes the model stronger, because it wasn’t designed in any way to create the collisions that they eventually found. “We don’t presuppose the interactions,” said Bullock.

Confirming that galaxies lacking dark matter can be explained in a universe where there’s lots of dark matter came as a relief to researchers like Bullock, whose career and everything he’s discovered hinge on dark matter being the thing that makes galaxies behave the way they do.

“The observation that there are dark matter-free galaxies has been a little bit worrying to me,” said Bullock. “We have a successful model, developed over decades of hard work, where most of the matter in the cosmos is dark. There is always the possibility that nature has been fooling us.”

But, Moreno said, “you don’t have to get rid of the standard dark matter paradigm.”

Now that astrophysicists know how a galaxy might lose its dark matter, Moreno and his collaborators hope the findings will inspire researchers who study the night sky to look for real-world massive galaxies that might be in the process of stripping dark matter away from smaller ones.

“It still doesn’t mean this model is right,” Bullock said. “A real test will be to see if these things exist with the frequency and general characteristics that match our predictions.”

As part of this new work, Moreno, who has indigenous roots, received permission from Cherokee leaders to name the seven dark matter-free galaxies found in their simulations in honor of the seven Cherokee clans: Bird, Blue, Deer, Long Hair, Paint, Wild Potato and Wolf.

“I feel a personal connection to these galaxies,” said Moreno, who added that, just as the more massive galaxies robbed the smaller galaxies of their dark matter, “many people of indigenous ancestry were stripped of our culture. But our core remains, and we are still thriving.”

University of Hong Kong physicists make a great stride in the quest for quantum materials through better measurement of quantum entanglement

A research team from the Department of Physics at the University of Hong Kong (HKU) has developed a new algorithm to measure entanglement entropy, advancing the exploration of more comprehensive laws in quantum mechanics and moving a step closer to the practical application of quantum materials.

Photo caption: Mr. Jiarui Zhao, a PhD student in the Department of Physics at HKU, came up with the new algorithm for computing quantum entanglement during a trip on the subway.

Quantum materials play a vital role in propelling human advancement. The search for novel quantum materials with exceptional properties has become a pressing priority for the scientific and technological community.

2D moiré materials such as twisted bilayer graphene play a far-reaching role in research on novel quantum states such as superconductivity, in which current flows with no electrical resistance. They also play a role in the development of quantum computers, which could vastly outperform the best supercomputers in existence.

But materials only enter a “quantum state” – one in which thermal effects no longer overwhelm the quantum fluctuations that trigger transitions between different quantum states, or quantum phases – at extremely low temperatures (near absolute zero, -273.15°C) or under exceptionally high pressure. Experiments testing when and how atoms and subatomic particles of different substances “communicate and interact with each other freely through entanglement” in a quantum state are therefore prohibitively costly and difficult to execute.

The study is further complicated by the failure of the classical LGW (Landau-Ginzburg-Wilson) framework to describe certain quantum phase transitions, dubbed deconfined quantum critical points (DQCP). The question then arises whether realistic lattice models hosting DQCP can be found to resolve the inconsistencies between DQCP and conventional quantum critical points (QCP). Dedicated exploration of the topic has produced copious numerical and theoretical works with conflicting results, and a solution remains elusive.

Mr. Jiarui Zhao, Dr. Zheng Yan, and Dr. Zi Yang Meng from the Department of Physics at HKU have taken a momentous step towards resolving the issue through the study of quantum entanglement, which marks the fundamental difference between quantum and classical physics.

The research team developed a new and more efficient quantum Monte Carlo algorithm to measure the Rényi entanglement entropy of quantum many-body systems. With this new tool, they measured the Rényi entanglement entropy at the DQCP and found that the scaling behavior of the entropy, i.e. how the entropy changes with system size, is in sharp contrast with that of conventional LGW-type phase transitions.

“Our findings helped confirm a revised understanding of phase transition theory by ruling out the possibility of a single theory describing DQCP. The questions raised by our work will contribute to further breakthroughs in the search for a comprehensive understanding of this uncharted territory,” said Dr. Zheng Yan.

“The finding has changed our understanding of the traditional phase transition theory and raises many intriguing questions about deconfined quantum criticality. This new tool we developed will hopefully help unlock the enigma of quantum phase transitions that has perplexed the scientific community for two decades,” said Mr. Jiarui Zhao, the first author of the journal paper and a PhD student who came up with the final fixes to the algorithm.

“This discovery will lead to a more general characterization of the critical behavior of novel quantum materials, and brings us a step closer to the practical application of quantum materials, which play a vital role in propelling human advancement,” Dr. Zi Yang Meng remarked.

The models
To test the efficiency and power of the algorithm and to demonstrate the distinct difference between the entanglement entropy at a normal QCP and at a DQCP, the research team chose two representative models – the J1-J2 model hosting a normal O(3) QCP and the J-Q3 model hosting a DQCP.

Nonequilibrium increment algorithm
Based on previous methods, the research team created a highly parallelized increment algorithm. The main idea of the algorithm is to divide the whole simulation task into many smaller tasks and use a large number of CPUs to execute the smaller tasks in parallel, greatly decreasing the simulation time. This improved method helped the team simulate the two models mentioned above with high efficiency and better data quality.
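The sketch below illustrates only the task-splitting pattern described here, assuming a generic “increment” that can be computed independently on each CPU core and then recombined; it is a schematic stand-in, not the team's actual nonequilibrium quantum Monte Carlo code.

```python
# Task-splitting pattern: divide the full simulation into many small,
# independent increments, run them on separate CPU cores, and combine.
# 'run_increment' is a placeholder; the real method performs
# nonequilibrium quantum Monte Carlo work measurements.
from multiprocessing import Pool
import random

def run_increment(task):
    """Stand-in for one small Monte Carlo increment (seeded for reproducibility)."""
    increment_id, seed = task
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(10_000)) / 10_000  # placeholder estimate

def parallel_estimate(n_increments: int = 64, n_workers: int = 8) -> float:
    tasks = [(i, 1234 + i) for i in range(n_increments)]
    with Pool(n_workers) as pool:
        pieces = pool.map(run_increment, tasks)   # each piece runs independently
    return sum(pieces) / len(pieces)              # combine the pieces (here: average)

if __name__ == "__main__":
    print(parallel_estimate())
```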

Findings
With the nonequilibrium increment method, the research team successfully obtained the second Rényi entanglement entropy S_A^(2) at the QCP and DQCP of the two models for different system sizes. After deducting the leading term (the area-law contribution from the entanglement boundary), the sign of the sub-leading term clearly distinguishes the QCP (negative in the J1-J2 model) from the DQCP (positive in the J-Q3 model). This finding rules out the possibility of a description of DQCP based on a unitary assumption and raises several intriguing questions about the theory of deconfined quantum criticality. The discovery is likely to lead to a more general characterization of the critical behavior of novel quantum materials.
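For orientation, a schematic of the scaling form commonly fitted in such entanglement studies is shown below; the exact fitting form used in the paper may differ.

```latex
% Schematic scaling form for the second Rényi entanglement entropy
% at a 2D quantum critical point (illustrative only):
S_A^{(2)}(l) = a\,l + s\,\ln l + c
% l : length of the entanglement boundary (the "area law" scale)
% a : non-universal area-law coefficient
% s : sub-leading coefficient; its sign is the diagnostic quantity that
%     distinguishes the two cases described in the text above
% c : non-universal constant
```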

UCL Prof Fisher adapts cancer supercomputer models to identify new drug combos to treat Covid-19

By adapting supercomputer models originally developed to understand the biology of cancer cells, UCL researchers have identified new drug combinations with the potential to treat severe cases of Covid-19 infection at different stages of the disease.

The findings could help lower the number of Covid-19 related deaths and reduce the strain on healthcare systems.

The study tested the potential impact of interfering with different aspects of SARS-CoV-2 infection and the body’s responses to the virus. Results have identified existing therapeutics that might be suitable for treating Covid-19 patients.

Although vaccines and treatments for Covid-19 now exist, additional effective and affordable treatments are still urgently required. Cases of SARS-CoV-2 infection are still highly likely to occur, particularly when new variants arise.

Tackling virus replication and immune response

Therapeutic development for Covid-19 is complicated by the need to consider different stages of the disease. Early symptoms are typically triggered by viral replication, while later and more severe disease is caused by the over-reaction of the body’s immune defenses.

Different stages of the disease are therefore likely to need different treatments – and getting the timing wrong could have grave consequences: suppressing immune responses to prevent damage could be harmful early on, while boosting immune responses could be highly damaging if they are already being ramped up.

The interplay between the virus, the cell it infects, and the host immune response involves a highly complex web of interactions. Interfering with these interactions using therapeutics could therefore have effects throughout this web, which might help to clear the virus but could also disrupt important cellular processes and cause harmful side effects.

Model solutions

Similar issues have been faced by cancer researchers. To address this challenge, UCL researcher Professor Jasmin Fisher has developed supercomputer models of cancer cell biology, which simulate the biochemical and metabolic pathways of cells and how they are subverted by cancer-causing mutations that drive uncontrolled growth of cells.

Using these models, the Fisher Lab can explore what might happen if particular pathways or cellular processes are inhibited, individually or in combination, so that the best targets for intervention can be identified and possibly harmful side effects anticipated.
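The sketch below gives a deliberately tiny, hypothetical example of this idea: a toy qualitative network whose nodes can be forced off to mimic drug inhibition, singly or in combination. The node names and update rules are invented for illustration and do not reflect the Fisher Lab's actual model or framework.

```python
# Toy qualitative (Boolean) network probed by "inhibiting" nodes,
# individually or in pairs. Entirely illustrative.
def step(state: dict, inhibited: set) -> dict:
    """One synchronous update; inhibited nodes are forced OFF,
    mimicking a drug blocking that process."""
    s = {k: (False if k in inhibited else v) for k, v in state.items()}
    nxt = {
        "viral_entry":       s["viral_entry"],                        # sustained exposure
        "viral_replication": s["viral_entry"] and not s["interferon"],
        "interferon":        s["viral_replication"],
        "inflammation":      s["viral_replication"] or s["inflammation"],
    }
    return {k: (False if k in inhibited else v) for k, v in nxt.items()}

def simulate(inhibited: set, steps: int = 10) -> dict:
    state = {"viral_entry": True, "viral_replication": False,
             "interferon": False, "inflammation": False}
    for _ in range(steps):
        state = step(state, inhibited)
    return state

# Compare an untreated run with a hypothetical two-target intervention:
print(simulate(set()))
print(simulate({"viral_entry", "inflammation"}))
```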

“We realized we could model SARS-CoV-2 infection using a computational framework originally developed in my lab to predict personalized treatment combinations for cancer patients and use it to predict effective repurposed drug combinations for treating Covid-19,” says Professor Fisher (UCL Cancer Institute), who led the study.

“We collated information available at the time on SARS-CoV-2 infection of airway cells and the immune response to infection, to create a dynamic model of viral infection and Covid-19 disease processes. Our study focused on two key stages – viral replication following initial infection (before severe symptoms emerge) and late-stage immune-driven disease, which is typically more severe,” says Professor Fisher.

The research team identified a range of therapeutic drugs, already licensed or in late-stage development, that target processes thought to be important at these two stages. They then used their computer model to explore what might happen in cells when these processes were inhibited, mimicking the action of therapeutics. The model provided insights into impacts on virus replication and host responses, and into the likely net effect on both disease treatment and safety.

Importantly, Professor Fisher and colleagues were able to validate their model by showing that the predicted effects of therapeutics already being used to treat Covid-19 at different stages of disease – such as antivirals and anti-inflammatory drugs – matched those seen in clinical studies.

In silico screening

Using this approach, the team examined 9,870 pairs of compounds acting on 140 potential cellular targets. They were able to identify new combinations of therapeutics that would be predicted to be beneficial at either early or late stages of the disease, as well as the ‘windows’ when they might be safely deployed. For example, the combination of two drugs, Camostat and Apilimod, was predicted to have a particularly big impact on virus replication. This strong antiviral effect was confirmed using live SARS-CoV-2 cell culture assays by Dr. Ann-Kathrin Reuschl and Professor Clare Jolly in the Division of Infection and Immunity at UCL.
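The screening loop itself can be pictured along the following lines: every unordered pair of candidate compounds is applied to the model and scored for each disease stage. Compound names, targets, and the scoring function below are placeholders, not the study's actual inputs or results.

```python
# Sketch of a pairwise in-silico screening loop. All names and scores
# are illustrative placeholders.
from itertools import combinations

compounds = {                      # hypothetical compound -> target mapping
    "drug_A": "viral_entry",
    "drug_B": "viral_replication",
    "drug_C": "inflammation",
}

def predicted_benefit(targets: set, stage: str) -> float:
    """Placeholder score for inhibiting a set of targets at a disease stage."""
    early = {"viral_entry", "viral_replication"}
    late = {"inflammation"}
    relevant = early if stage == "early" else late
    return len(targets & relevant) / max(len(relevant), 1)

results = []
for a, b in combinations(compounds, 2):            # all unordered compound pairs
    targets = {compounds[a], compounds[b]}
    for stage in ("early", "late"):
        results.append(((a, b), stage, predicted_benefit(targets, stage)))

for combo, stage, score in sorted(results, key=lambda r: -r[2]):
    print(combo, stage, round(score, 2))
```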

The study also identified cellular responses, such as the levels of certain cytokines, that correlated with mild, early-severe, and late-severe disease. These could have an important role as biomarkers to guide the use of therapeutics at the most appropriate time. “We not only need to know what drugs might work against Covid-19 but also when they might work. Our model is both a highly efficient way of prioritizing drugs for evaluation as treatments for Covid-19 and could also help ensure that Covid-19 patients get the right drug at the right time,” says Professor Fisher.