J. Keith Moore, UCI professor of Earth system science

UC Irvine professor Moore's analysis shows climate change could cause disaster in the world’s oceans

Deep overturning circulation collapses with strong warming

Climate-driven heating of seawater is causing a slowdown of deep circulation patterns in the Atlantic and Southern oceans, according to University of California, Irvine Earth system scientists, and if this process continues, the ocean’s ability to remove carbon dioxide from the atmosphere will be severely limited, further exacerbating global warming.

In a recent study, these researchers analyzed projections from three dozen climate models and found that the Atlantic Meridional Overturning Circulation and the Southern Meridional Overturning Circulation will slow by as much as 42 percent by 2100. The supercomputer simulations suggest that under worst-case warming, the SMOC could cease entirely by 2300.

“Analysis of the projections from 36 Earth system models over a range of climate scenarios shows that unchecked global warming could lead to a shutdown of the ocean deep circulation,” said co-author J. Keith Moore, UCI professor of Earth system science. “This would be a climate disaster similar in magnitude to the complete melting of the ice sheets on land.”

The importance of overturning circulation

In the Atlantic, as warm water flows northward on the surface, it cools, and evaporation leaves the remaining water saltier and denser. This heavier water sinks into the deep ocean and proceeds southward, where it eventually rises back up, carrying from the depths the nutrients that are the food foundation of marine ecosystems.

In addition, globe-spanning ocean circulation creates a powerful factory for the processing of atmospheric carbon dioxide. The basic physical and chemical interaction of seawater and air – what Moore and his colleagues call a “solubility pump” – draws CO2 into the ocean. While ocean circulation sends some carbon back to the sky, a net amount remains sequestered in the ocean’s depths.
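The temperature sensitivity behind the solubility pump can be illustrated with Henry's law: colder water dissolves more CO2, which is why warming seawater weakens this carbon sink. A minimal sketch, using approximate textbook values for the CO2 Henry's-law constant and its van 't Hoff temperature coefficient (these numbers are illustrative, not from the study):

```python
import numpy as np

def co2_solubility(T_kelvin, kH_298=3.4e-2, C=2400.0):
    """Henry's-law solubility of CO2 in water, mol/(L·atm).

    kH_298 (value at 298.15 K) and C (van 't Hoff coefficient, in K)
    are approximate textbook values; treat them as rough estimates.
    """
    return kH_298 * np.exp(C * (1.0 / T_kelvin - 1.0 / 298.15))

cold = co2_solubility(275.15)   # ~2 °C, typical of polar surface water
warm = co2_solubility(298.15)   # ~25 °C, typical of tropical surface water
```

With these constants, water near 2 °C holds roughly twice as much dissolved CO2 per unit of atmospheric partial pressure as water near 25 °C, which is why the cold, sinking high-latitude waters of the overturning circulation are such effective carbon carriers.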

Additionally, a “biological pump” occurs as phytoplankton use CO2 during photosynthesis and in forming carbonate shells. When the plankton and larger animals die, they sink, slowly decomposing and releasing the carbon and nutrients at depth. Some come back up with circulation and upwelling, but a portion remains banked beneath the waves.

“A disruption in circulation would reduce ocean uptake of carbon dioxide from the atmosphere, intensifying and extending the hot climate conditions,” Moore said. “Over time the nutrients that support marine ecosystems would increasingly become trapped in the deep ocean, leading to declining global-ocean biological productivity.”

Humans depend on the solubility pump and the biological pump to help remove some of the CO2 emitted into the air through fossil fuel burning, land use practices, and other activities, according to Moore.

“Our analysis also shows that reducing greenhouse gas emissions now can prevent this complete shutdown of the deep circulation in the future,” he said.

Joining Moore on this project, which was funded by the U.S. Department of Energy, were lead author Yi Liu, a UCI Ph.D. student in Earth system science; Francois Primeau, professor and chair of UCI’s Department of Earth System Science; and Wei-Lei Wang, professor of ocean and Earth sciences at Xiamen University in China. The study depended substantially on simulations developed under phase 6 of the Coupled Model Intercomparison Project (CMIP6), which are used to inform IPCC climate assessments.

University of Tokyo simulation leads to a better understanding of the motion of living organisms and the spontaneous organization of living systems

As anyone who drinks their coffee with milk knows, it's much easier to mix liquids together than to separate them. In fact, the second law of thermodynamics would seem to dictate that a mixture would never be able to separate again if there are no attractive forces between similar particles. However, investigators from the Institute of Industrial Science at The University of Tokyo showed the mechanism by which a mixture of actively spinning particles, such as bacteria, in a fluid can sort themselves in a process called phase separation even without attractions between particles.

In a study published recently in Communications Physics, researchers from the Institute of Industrial Science at The University of Tokyo have shown that the demixing behavior of two groups of discs rotating in opposite directions, induced only through self-generated flow, can be explained by turbulent effects.

Sometimes mixed liquids, such as oil and water, can spontaneously "unmix" in a process called phase separation. While systems without external energy input have been studied for a long time, the situation with so-called active matter, in which particles expend energy to move autonomously, like bacteria or algae, remains poorly understood.

Now, a team of researchers from The University of Tokyo created a supercomputer simulation of a mixture of discs rotating in opposite directions in a fluid to elucidate this phenomenon. It was already known that straight-line active motion of bacteria or other living organisms can cause a mixture to separate spontaneously, a phenomenon termed "motility-induced phase separation." However, active motion can include rotation as well as translation, and the organization of self-spinning particles has been studied much less.
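As a rough illustration of the setup (a toy sketch, not the authors' hydrodynamic model), one can simulate point spinners whose only interaction is the rotational flow each one induces in the fluid: every disc is advected by the swirling velocity fields of all the others, with the sign of each swirl set by the disc's spin direction. All parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 40                  # total spinners
box = 10.0              # periodic box size
# Half the spinners rotate one way, half the other (the two "species").
spin = np.array([+1.0] * (n // 2) + [-1.0] * (n // 2))
pos = rng.uniform(0, box, size=(n, 2))

def step(pos, dt=0.01):
    """Advect each spinner in the rotational (rotlet-like) flow,
    ~ spin * (ẑ × r) / r², generated by all the others,
    using the minimum-image convention for periodic boundaries."""
    d = pos[None, :, :] - pos[:, None, :]        # d[i, j] = pos[j] - pos[i]
    d -= box * np.round(d / box)                 # minimum-image displacement
    r2 = (d ** 2).sum(-1) + 1e-6                 # softened squared distance
    np.fill_diagonal(r2, np.inf)                 # no self-interaction
    # ẑ × d = (-dy, dx): tangential flow whose sense follows spin[j]
    v = np.stack([-d[..., 1], d[..., 0]], axis=-1) \
        * (spin[None, :, None] / r2[..., None])
    return (pos + dt * v.sum(axis=1)) % box      # sum over sources j

for _ in range(100):
    pos = step(pos)
```

In the actual study the fluid is resolved explicitly and the turbulent, nonlinear character of the self-generated flow is essential; a pairwise toy like this only conveys the geometry of the interaction, not the demixing dynamics.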

"Active matter serves as a bridge between biological and physical worlds when considering the laws of self-organization," says the first author of the study, Bhadra Hrishikesh. The researchers found that in the case of self-spinning particles, phase separation creates the largest structure directly from a chaotic state. This is in contrast with ordinary phase separation, in which phase-separated domains grow gradually over time, as we see in salad dressing.

"It was known that a mixture of oppositely rotating disks can undergo phase separation even without a fluid. We were interested in comparing our system--in which the only interactions between particles are carried by the fluid--with a similar driven system without these interactions," says Hajime Tanaka, senior author. The investigators found that the sudden phase separation of the discs into regions of clockwise and counterclockwise collections is due to nonlinear turbulent effects. This research may lead to a better understanding of the motion of living organisms and thereby, the spontaneous organization of living systems.

Sparsification of the U.S. mobility network. On the left is the original network with about 26 million edges. On the right, a sparsified network based on effective resistance sampling.  CREDIT Mercier et al.

Santa Fe Institute uses mobility network sparsification for high-fidelity epidemic simulations

Simulations that help determine how a large-scale pandemic will spread can take weeks or even months to run. A recent study in PLOS Computational Biology offers a new approach to epidemic modeling that could drastically speed up the process. 

The study uses sparsification, a method from graph theory and computer science, to identify which links in a network are the most important for the spread of disease.

By focusing on critical links, the authors found they could reduce the computation time for simulating the spread of diseases through highly complex social networks by 90% or more. 

“Epidemic simulations require substantial computational resources and time to run, which means your results might be outdated by the time you are ready to publish,” says lead author Alexander Mercier, a former Undergraduate Research Fellow at the Santa Fe Institute and now a Ph.D. student at the Harvard T.H. Chan School of Public Health. “Our research could ultimately enable us to use more complex models and larger data sets while still acting on a reasonable timescale when simulating the spread of pandemics such as COVID-19.”

For the study, Mercier, with SFI researchers Samuel Scarpino and Cristopher Moore, used data from the U.S. Census Bureau to develop a mobility network describing how people across the country commute. 

Then, they applied several different sparsification methods to see if they could reduce the network’s density while retaining the overall dynamics of a disease spreading across the network. 

The most successful sparsification technique they found was effective resistance. This technique comes from computer science and is based on the total resistance between two endpoints in an electrical circuit. In the new study, effective resistance works by prioritizing the edges, or links, between nodes in the mobility network that are the most likely avenues of disease transmission while ignoring links that can be easily bypassed by alternate paths.
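As a sketch of the idea (not the authors' code), effective resistance can be computed from the pseudoinverse of the graph Laplacian, and edges can then be sampled with probability proportional to weight times effective resistance, in the style of Spielman–Srivastava sparsification. The toy graph and function names below are illustrative; the dense pseudoinverse is only practical for small graphs.

```python
import numpy as np

def effective_resistances(n, edges, weights):
    """Effective resistance of each edge, via the Laplacian pseudoinverse:
    R_uv = L+[u,u] + L+[v,v] - 2 L+[u,v]."""
    L = np.zeros((n, n))
    for (u, v), w in zip(edges, weights):
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    Lp = np.linalg.pinv(L)  # dense pseudoinverse: fine for a toy graph
    return np.array([Lp[u, u] + Lp[v, v] - 2 * Lp[u, v] for u, v in edges])

def sparsify(n, edges, weights, n_samples, seed=0):
    """Sample edges with probability proportional to weight * effective
    resistance, reweighting kept edges to preserve expected edge weights."""
    rng = np.random.default_rng(seed)
    weights = np.asarray(weights, dtype=float)
    R = effective_resistances(n, edges, weights)
    p = weights * R
    p = p / p.sum()
    counts = rng.multinomial(n_samples, p)
    kept = {}
    for e, (c, w, pe) in enumerate(zip(counts, weights, p)):
        if c > 0:
            kept[edges[e]] = c * w / (n_samples * pe)
    return kept

# Toy graph: a triangle (0-1-2) plus a "bridge" edge to node 3.
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
weights = [1.0, 1.0, 1.0, 1.0]
kept = sparsify(4, edges, weights, n_samples=500)
```

The bridge edge (2, 3) has the highest effective resistance because no alternate path bypasses it, so it is almost certain to survive sparsification, mirroring how the method keeps structurally important low-weight links between distant regions.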

“It’s common in the life sciences to naively ignore low-weight links in a network, assuming that they have a small probability of spreading a disease,” says Scarpino. “But as in the catchphrase ‘the strength of weak ties,’ even a low-weight link can be structurally important in an epidemic — for instance, if it connects two distant regions or distinct communities.”

Using their effective resistance sparsification approach, the researchers created a network containing 25 million fewer edges — or about 7% of the original U.S. commuting network — while preserving overall epidemic dynamics.

“Computer scientists Daniel Spielman and Nikhil Srivastava had shown that sparsification can simplify linear problems, but discovering that it works even for nonlinear, stochastic problems like an epidemic was a real surprise,” says Moore.

While still in an early stage of development, the research not only helps reduce the computational cost of simulating large-scale pandemics but also preserves important details about disease spread, such as the probability of a specific census tract getting infected and when the epidemic is likely to arrive there.

The University of Texas at El Paso received a $5 million grant from the National Science Foundation (NSF) to provide financial support and professional development experiences to talented students in the field of computer science. The initiative will provide partial scholarships to 26 students at UTEP who are working on their bachelor’s degrees and focusing on data science or cybersecurity.

University of Texas at El Paso wins $5M grant to support computer science students

The program offers scholarships to UTEP and EPCC students

The University of Texas at El Paso received a $5 million grant from the National Science Foundation (NSF) to provide financial support and professional development experiences to talented students in the field of computer science.

As part of NSF’s Scholarships for STEM (S-STEM) program, the initiative will provide partial scholarships to 26 students at UTEP who are working on their bachelor’s degrees and focusing on data science or cybersecurity.

The UTEP Computer Science Department also will collaborate with El Paso Community College (EPCC) to fund scholarships for 15 students who start at EPCC and transfer to UTEP to complete their bachelor’s degrees.

“This S-STEM program builds on years of NSF support in the Paso del Norte region,” said Kenith Meissner, Ph.D., dean of the College of Engineering. “Moreover, the coordinated effort between UTEP and EPCC will help broaden the talent pool needed to address critical national needs in data science and cybersecurity. We are excited to be part of this collaboration that expands opportunities for highly motivated students in high-demand STEM areas.”

The grant was first awarded to UTEP in 2016. Salamah Salamah, Ph.D., chair of the computer science department and the project’s principal investigator, said it’s unusual for the S-STEM grant to be awarded twice to the same institution.

“The stature of UTEP and what we’re doing here in this department is something that can’t be ignored,” he said. “NSF understands the great things we’re doing.”

Of the 41 students who received scholarships under the first S-STEM grant, nearly all graduated with a bachelor’s degree, 40 attended conferences, 15 were involved in research, and 15 pursued a graduate degree. Additionally, more than half of the program participants were women.

“The S-STEM program has provided the ideal bridge for students from EPCC who want to pursue their computing degree at UTEP,” said Christian Servin, Ph.D., associate professor of computer science at EPCC. “This partnership prepares students mentally and financially to succeed at the four-year institution once they transfer, speeding up the process of developing marketable skills, including research and computational thinking skills.”

Influenced by the best practices pioneered by the Computing Alliance for Hispanic-Serving Institutions (CAHSI), UTEP’s computer science department provides S-STEM scholars with professional development training and opportunities that can build their confidence and give them an edge in the job market. For example, professors accompany students to the annual Great Minds in STEM conference, where students learn how to network with job recruiters, share their stories and highlight their skills.

“One of the greatest things you can see is how the students start to become leaders,” said Diego Aguirre, Ph.D., co-principal investigator of the grant and assistant professor of computer science. “Many of them come into the program with a desire to help others. As they learn skills and move into this space, they start sharing that newfound knowledge with others. The program's impact is not just in the students who get the scholarships, it’s in the impact those students have wherever they go.”

Researchers compared the output (activity on the top and decoder accuracy on the bottom) associated with real neural data (left column) and several models of working memory to the right. The ones that best resembled the real data were the "PS" models featuring short-term synaptic plasticity.

MIT neuroscientists produce insights into how holding information in mind may mean storing it among synapses

Comparing models of working memory with real-world data, MIT researchers found that information resides not in persistent neural activity, but in the pattern of connections between neurons

Between the time you read the Wi-Fi password off the café’s menu board and the time you can get back to your laptop to enter it, you have to hold it in mind. If you’ve ever wondered how your brain does that, you are asking a question about working memory that researchers have strived for decades to explain. Now MIT neuroscientists have published a key new insight to explain how it works.

In a study in PLOS Computational Biology, scientists at The Picower Institute for Learning and Memory compared measurements of brain cell activity in an animal performing a working memory task with the output of various supercomputer models representing two theories of the underlying mechanism for holding information in mind. The results strongly favored the newer notion that a network of neurons stores the information by making short-lived changes in the pattern of their connections, or synapses, and contradicted the traditional alternative that memory is maintained by neurons remaining persistently active (like an idling engine).

While both models allowed for information to be held in mind, only the versions that allowed for synapses to transiently change connections (“short-term synaptic plasticity”) produced neural activity patterns that mimicked what was observed in real brains at work. The idea that brain cells maintain memories by being always “on” may be simpler, acknowledged senior author Earl K. Miller, but it doesn’t represent what nature is doing and can’t produce the sophisticated flexibility of thought that can arise from intermittent neural activity backed up by short-term synaptic plasticity.

“You need these kinds of mechanisms to give working memory activity the freedom it needs to be flexible,” said Miller, Picower Professor of Neuroscience in MIT’s Department of Brain and Cognitive Sciences (BCS). “If working memory was just sustained activity alone, it would be as simple as a light switch. But working memory is as complex and dynamic as our thoughts.”

Co-lead author Leo Kozachkov, who earned his Ph.D. at MIT in November for theoretical modeling work including this study, said matching supercomputer models to real-world data was crucial.

“Most people think that working memory ‘happens’ in neurons—persistent neural activity gives rise to persistent thoughts. However, this view has come under recent scrutiny because it does not really agree with the data,” said Kozachkov who was co-supervised by co-senior author Jean-Jacques Slotine, a professor in BCS and mechanical engineering. “Using artificial neural networks with short-term synaptic plasticity, we show that synaptic activity (instead of neural activity) can be a substrate for working memory. The important takeaway from our paper is: these ‘plastic’ neural network models are more brain-like, in a quantitative sense, and also have additional functional benefits in terms of robustness.”

Matching models with nature

Alongside co-lead author John Tauber, an MIT graduate student, Kozachkov’s goal was not just to determine how working memory information might be held in mind, but to shed light on which way nature does it. That meant starting with “ground truth” measurements of the electrical “spiking” activity of hundreds of neurons in the prefrontal cortex of an animal as it played a working memory game. In each of the many rounds, the animal was shown an image that then disappeared. A second later it would see two images including the original and had to look at the original to earn a little reward. The key moment is that intervening second, called the “delay period,” in which the image must be kept in mind in advance of the test.

The team consistently observed what Miller’s lab has seen many times before: The neurons spike a lot when seeing the original image, spike only intermittently during the delay, and then spike again when the images must be recalled during the test (these dynamics are governed by an interplay of beta and gamma frequency brain rhythms). In other words, spiking is strong when information must be initially stored and when it must be recalled but is only sporadic when it has to be maintained. The spiking is not persistent during the delay.

Moreover, the team trained software “decoders” to read out the working memory information from the measurements of spiking activity. They were highly accurate when spiking was high, but not when it was low, as in the delay period. This suggested that spiking doesn’t represent information during the delay. But that raised a crucial question: If spiking doesn’t hold information in mind, what does?
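The logic of that decoding test can be illustrated with a toy decoder (not the authors' method): train a nearest-centroid classifier on synthetic Poisson spike counts. When two remembered images drive distinct firing-rate patterns, as during encoding and recall, decoding accuracy is high; when delay-period spiking is sparse and stimulus-independent, accuracy falls to chance. All rates and sizes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_trials(n_trials, n_neurons, rates_per_class):
    """Poisson spike counts for each of two remembered images (classes)."""
    X, y = [], []
    for label, rates in enumerate(rates_per_class):
        X.append(rng.poisson(rates, size=(n_trials, n_neurons)))
        y += [label] * n_trials
    return np.vstack(X), np.array(y)

def nearest_centroid_accuracy(X_train, y_train, X_test, y_test):
    """Fit class centroids on training trials; report test accuracy."""
    centroids = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
    d = np.linalg.norm(X_test[:, None, :] - centroids[None], axis=2)
    return (d.argmin(axis=1) == y_test).mean()

n_neurons = 50

# Encoding period: each image drives its own firing-rate pattern.
rates_a = rng.uniform(2, 10, n_neurons)
rates_b = rng.uniform(2, 10, n_neurons)
X_tr, y_tr = make_trials(200, n_neurons, [rates_a, rates_b])
X_te, y_te = make_trials(200, n_neurons, [rates_a, rates_b])
acc_encoding = nearest_centroid_accuracy(X_tr, y_tr, X_te, y_te)

# Delay period: sparse spiking that carries no stimulus signal.
low = np.full(n_neurons, 0.5)
X_tr, y_tr = make_trials(200, n_neurons, [low, low])
X_te, y_te = make_trials(200, n_neurons, [low, low])
acc_delay = nearest_centroid_accuracy(X_tr, y_tr, X_te, y_te)
```

In this toy setting `acc_encoding` is near perfect while `acc_delay` hovers around chance (0.5), mirroring the pattern the researchers saw in real spiking data during the delay period.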

Researchers including Mark Stokes at the University of Oxford have proposed that changes in the relative strength, or “weights,” of synapses could store the information instead. The MIT team put that idea to the test by computationally modeling neural networks embodying two versions of each main theory. As with the real animal, the machine learning networks were trained to perform the same working memory task and to output neural activity that could also be interpreted by a decoder.

The upshot is that the computational networks that allowed for short-term synaptic plasticity to encode information spiked when the actual brain spiked and didn’t when it didn’t. The networks featuring constant spiking as the method for maintaining memory spiked all the time including when the natural brain did not. And the decoder results revealed that accuracy dropped during the delay period in the synaptic plasticity models but remained unnaturally high in the persistent spiking models.

In another layer of analysis, the team created a decoder to read out information from the synaptic weights. They found that during the delay period, the synapses represented the working memory information that the spiking did not.

Of the two model versions that featured short-term synaptic plasticity, the more realistic one was called “PS-Hebb,” which features a negative feedback loop that keeps the neural network stable and robust, Kozachkov said.

Workings of working memory

In addition to matching nature better, the synaptic plasticity models also conferred other benefits that likely matter to real brains. One was that the plasticity models retained information in their synaptic weightings even after as many as half of the artificial neurons were “ablated.” The persistent activity models broke down after losing just 10-20 percent of their synapses. And, Miller added, just spiking occasionally requires less energy than spiking persistently.

Furthermore, Miller said, quick bursts of spiking rather than persistent spiking leaves room in time for storing more than one item in memory. Research has shown that people can hold up to four different things in working memory. Miller’s lab plans new experiments to determine whether models with intermittent spiking and synaptic weight-based information storage appropriately match real neural data when animals must hold multiple things in mind rather than just one image.

In addition to Miller, Kozachkov, Tauber, and Slotine, the paper’s other authors are Mikael Lundqvist and Scott Brincat.

The Office of Naval Research, the JPB Foundation, and ERC and VR Starting Grants funded the research.