Assistant Professor of Biomolecular Engineering Ali Shariati. (photo by Carolyn Lagattuta)

UCSC prof Shariati develops deep-learning software that detects, tracks individual cells with high performance

Cell growth and division are two of the most fundamental and essential features of life, and closely monitoring cell changes over time can give scientists key insights into the dynamics of these biological processes. Time-lapse microscopy allows scientists to detect and track cells but produces huge amounts of data that are nearly impossible to sort through manually.

Now, however, the powerful data-processing capabilities of modern deep learning models offer a way to sort through such large volumes of imaging data. Assistant Professor of Biomolecular Engineering Ali Shariati and doctoral student Abolfazl Zarageri, together with several student researchers in the Shariati lab, have developed and released a new deep learning model called “DeepSea,” one of the only tools able to segment cells, track them, and detect their divisions to follow cell lineages. DeepSea, which is detailed in a new paper in Cell Reports Methods, is one of the highest-accuracy tools of its kind.

DeepSea’s model training dataset, user-friendly software, and open-source code are available for use on the DeepSea website, and Shariati and his team of researchers have already used it to make discoveries about stem cell growth and division. 

“The model is more efficient, has fewer parameters, and both segmentation and tracking are integrated into a user-friendly software,” Shariati said. “The software allows you to train the model for any cell type of interest, paving the way for future discoveries.”

Time-lapse microscopy, which captures a series of images from a microscope over time, allows researchers to monitor single cells throughout an experiment to track phenomena such as differentiation — when stem cells become a specific type of cell — or change in shape and size over time. This can allow scientists to make new biological discoveries by measuring the dynamics of cell biological phenomena at the single-cell level.

Once the scientists have gathered images, they need to carry out two main tasks: segmentation, or identifying the borders that separate individual cells from each other and from the background; and tracking, or following a cell from one frame to the next. From there, the researchers can investigate characteristics such as size, shape, texture, and how cells move and change shape over time.
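
The two-step pipeline described above can be sketched in a few lines of Python. This is a toy illustration only: a brightness threshold with connected-component flood fill stands in for DeepSea's learned segmentation, and greedy nearest-centroid matching stands in for its learned tracker; all names and values here are invented for illustration.

```python
import numpy as np

def segment(frame, thresh=0.5):
    """Label connected foreground pixels (4-connectivity). A brightness
    threshold plus flood fill stands in for DeepSea's learned segmentation."""
    mask = frame > thresh
    labels = np.zeros(frame.shape, dtype=int)
    n = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        n += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if 0 <= r < mask.shape[0] and 0 <= c < mask.shape[1] \
                    and mask[r, c] and not labels[r, c]:
                labels[r, c] = n
                stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, n

def centroids(labels, n):
    """Mean (row, col) position of each labeled cell."""
    return {i: tuple(np.mean(np.nonzero(labels == i), axis=1))
            for i in range(1, n + 1)}

def track(prev, curr):
    """Greedy nearest-centroid linking between consecutive frames."""
    return {i: min(curr, key=lambda j: np.hypot(p[0] - curr[j][0],
                                                p[1] - curr[j][1]))
            for i, p in prev.items()}

# Two toy 8x8 frames with one bright "cell" drifting one pixel to the right.
f0 = np.zeros((8, 8)); f0[2:4, 1:3] = 1.0
f1 = np.zeros((8, 8)); f1[2:4, 2:4] = 1.0
l0, n0 = segment(f0); l1, n1 = segment(f1)
print(track(centroids(l0, n0), centroids(l1, n1)))  # {1: 1}: same cell in both frames
```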

Manually sorting through microscopy images is tedious, time-consuming, and ultimately a task better suited for supercomputing, which is where DeepSea comes in. This efficient deep-learning model can perform segmentation in less than a second, and track cells with 98 percent accuracy. 

Enabling the software to detect cell division was a particularly unique and challenging aspect of this project, as there are few if any other situations in which artificial intelligence and computer vision must track one object transforming into two. 

“This is a very unusual problem for object tracking,” Shariati said. “If you want to track a car or something, the car will be moving around and you can use machine learning and computer vision to follow them as they move. But for cells, all of a sudden one object becomes two, and that's a fundamentally new problem that we needed to solve, and we were able to do so.” 
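
A crude way to see why division breaks ordinary object tracking: if two current-frame cells both link back to the same previous-frame cell, the tracker must record a division rather than a simple identity. The heuristic below is a hypothetical sketch of that idea, not DeepSea's actual learned method.

```python
import math

def link_with_division(prev, curr, max_dist=5.0):
    """Assign each current-frame cell to its nearest previous-frame cell;
    a previous cell claimed by exactly two current cells is flagged as a
    division. A hypothetical heuristic, not DeepSea's learned tracker.
    prev, curr: dicts mapping cell id -> (x, y) centroid."""
    claims = {}
    for cid, pos in curr.items():
        pid, dist = min(((p, math.dist(pos, q)) for p, q in prev.items()),
                        key=lambda t: t[1])
        if dist <= max_dist:
            claims.setdefault(pid, []).append(cid)
    divisions = {p: kids for p, kids in claims.items() if len(kids) == 2}
    return claims, divisions

parent = {1: (10.0, 10.0)}
daughters = {1: (8.0, 10.0), 2: (12.0, 10.0)}  # two new cells near the parent
claims, divisions = link_with_division(parent, daughters)
print(divisions)  # {1: [1, 2]}: cell 1 divided into daughters 1 and 2
```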

DeepSea is a generalizable model, meaning it can be used to track a variety of cell types. It uses a modified version of a popular model, 2D-UNET, with significantly fewer parameters to achieve both fast speeds and high accuracy. 

“We compared our model with some of the best cell segmentation models, and ours is now showing the best results in terms of precision, and speed, especially for these cell types,” said Zarageri, an electrical and computer engineering Ph.D. student in Shariati’s lab who led the creation of the software.

The researchers trained DeepSea using a dataset of images of cells manually segmented from their backgrounds, a time-intensive process as the images are often low-contrast and the cell bodies hard to make out. To aid in this process, the team developed another software tool to help crop, label, and edit the microscopy images of cells, which is also available at DeepSeas.org.

The training dataset included images of lung, muscle, and stem cells, meaning DeepSea achieves high precision across different cell types. More cell types can be added to future versions of the model. 

The researchers used DeepSea to study the size regulation of embryonic stem cells, which are the foundation of multicellular life and can differentiate into every other cell type. They came away with the discovery that embryonic stem cells, which are known to divide unusually fast, regulate their size so that smaller cells spend a longer time growing before producing the next generation of cells. 

“We found that if an embryonic stem cell is born small, they kind of know that they are small, so they spend more time growing before they go on and divide again,” Shariati said. “We do not know why and how exactly this happens, but at least that phenomenon is there.” 

In the future, the researchers plan to apply their existing software to study spatial relationships between cells and how cellular features are organized in 3-D patterns to form structures.

The researchers also aim to resolve bottlenecks they have noticed in using their deep learning models, such as the lack of labeled images of cells that are used to train the models. They plan to use a class of machine learning frameworks called Generative Adversarial Networks (GANs) to create new synthetic data and images of cells that are already annotated to cut down on the time it takes to create labels. The researchers would then have large libraries of datasets of any cell type of interest with minimal human involvement.

Supercomputer simulation of the chorus event at Mars.

Chinese scientists verify trap-release-amplify model by reproducing electromagnetic waves on Mars

The study reveals whistler-mode chorus waves on Mars similar to those detected at Earth, validating the role of magnetic field inhomogeneity in the frequency sweeping phenomenon.

Chinese scientists have reproduced the whistler-mode chorus waves observed on Mars using data from the Mars Atmosphere and Volatile Evolution (MAVEN) mission and compared them with phenomena on Earth. They found that both Mars and Earth exhibit whistler-mode chorus waves triggered by nonlinear processes, with background magnetic field inhomogeneity playing the key role in frequency sweeping.

The study provides crucial support for understanding chorus waves in the Martian environment and verifies the previously proposed "Trap-Release-Amplify" (TaRA) model under more extreme conditions.

Magnetic field and chorus emissions at Mars and Earth.

Whistler-mode chorus waves are electromagnetic wave emissions widely present in planetary magnetospheres. When their electromagnetic signals are converted into sound, they resemble the harmonious chorus of birds in the early morning, hence the name "chorus waves." Chorus waves can accelerate high-energy electrons in space through resonance, leading to a rapid increase in electron flux in Earth's radiation belts during geomagnetic storms. Additionally, they scatter high-energy electrons into the atmosphere, creating diffuse and pulsating auroras.

One characteristic of chorus waves is their narrowband frequency sweeping structure. The excitation mechanism of this sweeping structure has been of great interest for decades, and scientists have proposed various theoretical models. However, there has been an ongoing debate regarding why frequency sweeping occurs in chorus waves and how to calculate the sweeping frequency. One central point of contention is whether the background magnetic field inhomogeneity plays a crucial role in frequency sweeping and how it affects the sweeping phenomenon. 

The TaRA model, previously proposed by a team from the University of Science and Technology of China of the Chinese Academy of Sciences, is based on modern plasma physics theories and suggests that the frequency sweeping of chorus waves in the magnetosphere is the result of the combined effects of nonlinear processes and background magnetic field inhomogeneity. The model provides a corresponding formula for calculating the sweeping frequency. However, the variation in magnetic field inhomogeneity in Earth's magnetosphere is limited, making it difficult to test the TaRA model in a larger parameter space. 

Mars and Earth have distinct magnetic field environments. Earth possesses a global dipole-like magnetic field, while Mars has only localized remnant (crustal) magnetization. In this remnant-field environment, the MAVEN satellite has observed similar chorus wave events. Calculations reveal a difference of five orders of magnitude in background magnetic field inhomogeneity between Mars and Earth. By comparing wave events observed on the two planets, the previously proposed TaRA model can therefore be tested under far more extreme conditions.
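
Some simple physics puts numbers on why the local field matters: chorus emissions occur below the electron gyrofrequency f_ce = eB/(2πm_e), so the band scales directly with the background field strength. The field magnitudes below are assumed, order-of-magnitude values for illustration, not measurements from the study.

```python
import math

E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def gyrofrequency_hz(b_tesla):
    """Electron cyclotron frequency f_ce = e*B / (2*pi*m_e).
    Chorus emissions occur below f_ce, so the local field sets the band."""
    return E_CHARGE * b_tesla / (2 * math.pi * M_ELECTRON)

# Assumed, order-of-magnitude field strengths (not values from the study):
# ~200 nT near Earth's chorus source region, ~20 nT in a Martian crustal field.
for planet, b in [("Earth", 200e-9), ("Mars", 20e-9)]:
    print(f"{planet}: f_ce ~ {gyrofrequency_hz(b):.0f} Hz")
```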

To validate this model, in this study, scientists from the University of Science and Technology of China and their collaborators observed the particle distribution on Mars using the MAVEN satellite. They combined it with the corresponding Martian crustal remnant magnetic field model. 

Employing a first-principles particle simulation method, scientists reproduced the observed chorus wave phenomena on Mars. Through the analysis of particle phase space distribution, they confirmed that the sweeping process of these waves is consistent with that of chorus waves on Earth, both triggered by nonlinear processes. 

Furthermore, the scientists used two different methods provided by the TaRA model to calculate the sweeping frequency of chorus waves and compared the results with the observations and simulations. The sweeping frequencies calculated from nonlinear processes and background magnetic field inhomogeneity agreed closely with the supercomputer simulation results.

These findings indicate that although Mars and Earth possess distinct magnetic and plasma environments, the chorus wave phenomena observed on Mars follow the same fundamental physical processes as those in Earth's magnetosphere. The study confirms the existence of chorus waves on Mars and validates the broad applicability of the TaRA model in describing the frequency sweeping of chorus waves under extreme conditions spanning a five-order-of-magnitude difference in magnetic field inhomogeneity.

Syracuse University researchers co-authored a study exploring the extent to which human activities are contributing to hydrogeochemical changes in U.S. rivers. The image above is Mills River in Pisgah National Forest, North Carolina.

Syracuse prof Wen uses supercomputer modeling to discover the sources of salinization, alkalinization in watersheds

From protecting biodiversity to ensuring the safety of drinking water, the biochemical makeup of rivers and streams around the United States is critical for human and environmental welfare. Studies have found that human activity and urbanization are driving the salinization (increased salt content) of freshwater sources across the country. In excess, salinity can make water undrinkable, increase the cost of treating water, and harm freshwater fish and wildlife. 

Along with the rise in salinity has also been an increase in alkalinity over time, and past research suggests that salinization may enhance alkalinization. But unlike excess salinity, alkalinization can have a positive impact on the environment due to its ability to neutralize water acidity and absorb carbon dioxide in the Earth’s atmosphere – a key component to combating climate change. Therefore, understanding the processes at play which are affecting salinity and alkalinity has important environmental and health implications.

A team of researchers from Syracuse University and Texas A&M University has applied a machine learning model to explore where and to what extent human activities are contributing to the hydrogeochemical changes, such as increases in salinity and alkalinity in U.S. rivers.

The group used data from 226 river monitoring sites across the U.S. and built two machine-learning models to predict monthly salinity and alkalinity levels at each site. These sites were selected because long-term, continuous water quality measurements have been recorded at each for at least 30 years. From urban to rural settings, the models covered a diverse range of watersheds, which are areas where all flowing surface water converges to a single point, such as a river or lake. They evaluated 32 watershed factors spanning hydrology, climate, geology, soil chemistry, land use, and land cover to pinpoint the factors contributing to rising salinity and alkalinity. The team’s models identified human activities as major contributors to the salinity of U.S. rivers, while rising alkalinity was attributed mainly to natural processes.
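
To make the idea of ranking watershed predictors concrete, here is a minimal sketch on synthetic data. An ordinary least-squares fit with standardized features serves as a stand-in for the study's machine-learning models; the feature names, data, and coefficients are all invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the study's data: rows are monthly site records,
# columns are watershed predictors (names illustrative, not the paper's 32).
features = ["population_density", "impervious_pct", "runoff", "soil_ph"]
X = rng.normal(size=(500, len(features)))
# Build a target in which the two human-activity features dominate,
# mimicking the salinity result the article reports.
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(scale=0.1, size=500)

# Standardize, fit by ordinary least squares, and rank predictors by the
# magnitude of their coefficients -- a crude proxy for the feature-importance
# ranking a trained machine-learning model would provide.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
coef, *_ = np.linalg.lstsq(Xs, y - y.mean(), rcond=None)
ranking = sorted(zip(features, np.abs(coef)), key=lambda t: -t[1])
print([name for name, _ in ranking[:2]])  # the human-activity features rank first
```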

The team, which included Syracuse University researchers Tao Wen, assistant professor in the College of Arts and Sciences’ Department of Earth and Environmental Sciences (EES), Beibei E, a graduate student in EES, Charles T. Driscoll, University Professor of Environmental Systems and Distinguished Professor in the College of Engineering and Computer Science, and Texas A&M assistant professor Shuang Zhang, recently had their findings published in the journal Science of the Total Environment.

What’s Driving Salinization and Alkalinization?

The results from the group’s sodium prediction model, which detected human activities such as the application of road salt as major contributions to the salinity of U.S. rivers, were consistent with previous studies. This model specifically revealed population density and impervious surface percentage (artificial surfaces such as roads) as the two most important contributors to the higher salt content in U.S. rivers.

According to Wen, the accuracy of the salinity model provided an important proof of concept for the research team.

“Regarding causes of salinity in rivers, the results from our machine learning model matched those of previous studies which focused on field observation, lab work, and statistical analysis,” says Wen. “This proved that our approach was working.”

With the salinity results confirming the accuracy of the team’s model, they then turned their attention to alkalinity. Their model identified natural processes as predominantly contributing to variation in river alkalinity, a contrast to previous research that identified human activities as the main contributor to alkalinization. They found that local climatic and hydrogeological conditions including runoff, sediment, soil pH, and moisture, were features most affecting river alkalinity.

Critical to the Carbon Cycle

Their findings have important environmental and climate implications as alkalinity in rivers forms a vital link in the carbon cycle. Consider the movement of carbon during a rainstorm. When it rains, carbon dioxide from the atmosphere combines with water to form carbonic acid. When the carbonic acid reaches the ground and comes into contact with certain rocks, it triggers a chemical reaction that extracts gaseous carbon dioxide from the atmosphere and transports it to the ocean via land water systems like lakes and rivers. Known as rock weathering, this natural process continuously erodes rocks and sequesters atmospheric CO2 over millions of years. It is also a key regulator of greenhouse gases that contribute to global warming.
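
The chemistry sketched above can be written out explicitly, using wollastonite (CaSiO3) as a standard model silicate; the article itself does not single out a particular mineral:

```latex
% Carbonic acid forms as rain falls through the atmosphere:
\mathrm{CO_2 + H_2O \;\rightleftharpoons\; H_2CO_3}
% Silicate weathering (wollastonite as the model mineral) then converts the
% acid into the bicarbonate alkalinity that rivers carry to the ocean:
\mathrm{CaSiO_3 + 2\,CO_2 + H_2O \;\rightarrow\; Ca^{2+} + 2\,HCO_3^{-} + SiO_2}
```

The second reaction is why weathering is a net carbon sink: two molecules of atmospheric CO2 end up as dissolved bicarbonate for each unit of silicate dissolved.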

“Rock weathering is the primary source of alkalinity in natural waters and is one of the main ways to bring down carbon dioxide in the air,” says Wen. Think of it as a feedback loop: when there is too much carbon dioxide in the atmosphere, temperatures increase, leading to enhanced rock weathering. As more rock dissolves into watersheds, alkalinity rises and, in turn, draws down carbon dioxide.

“Alkalinity is a critical component of the carbon cycle,” says Wen. “While we found that natural processes are the primary drivers of alkalinization, these natural factors can still be changed by humans. We can alter the alkalinity level in rivers by changing the natural parameters, so we need to invest more to restore the natural conditions of watersheds and tackle global warming and climate changes to deal with alkalinization in U.S. rivers.”

The results from the team’s study can help inform future research about enhanced rock weathering efforts – where rocks are ground up and spread across fields. Distributing rock dust across large areas increases the amount of contact between rain and rock, which enhances carbon removal from the atmosphere. Wen says the team’s model can help answer questions about the evolution of natural conditions in different regions – an important step needed to implement enhanced rock weathering more effectively.

The work was funded through a $460,000 National Science Foundation grant awarded to Wen.

POET Technologies confirms sample availability of POET Infinity, testing with a pair of customers

Alpha samples of the chiplet-based transmitter platform for 400G, 800G, and 1.6T data center solutions are ready for shipment

POET Technologies has announced alpha sample readiness of POET Infinity, a chiplet-based transmitter platform for 400G, 800G, and 1.6T pluggable transceivers and co-packaged optics solutions. Two lead customers have agreed to partner with POET to test the alpha version of the Infinity chiplet.

The POET Infinity chiplet complements the POET 800G 2xFR4 Receiver optical engine that the Company announced in February 2023, and completes the 800G chipset for 2xFR4 QSFP-DD or OSFP applications with two Infinity chiplets and one Receiver optical engine. Both customers intend to develop 800G 2xFR4 QSFP-DD and OSFP transceiver solutions using the POET Optical Engine chipsets.

The Infinity chiplet is the industry’s first implementation of directly modulated lasers (DMLs) for 100G/lane applications. DMLs are power-efficient and cost-effective, and become a highly scalable solution when paired with the POET Optical Interposer platform. The chiplet incorporates 100G PAM4 DMLs, DML drivers, and an integrated optical multiplexer for a complete 400GBASE-FR4 transmitter solution on a chip. The chiplet’s small size and daisy-chain architecture enable side-by-side placement of multiple instances to achieve 800G and 1.6T speeds.

The POET Infinity product line carries forward the POET differentiation of all passive alignments and monolithically integrated waveguides, multiplexers and demultiplexers, which translates to lower cost, lower power consumption, and ease-of-assembly benefits for customers.

“The availability of a transmitter solution for 400G, 800G, and 1.6T speeds that is power efficient, cost-effective, and highly scalable for the data center market is a major achievement,” said Dr. Suresh Venkatesan, Chairman & CEO of POET. “Our customers are excited to receive the samples and test them because it simplifies their transceiver design significantly and shortens the design cycle with POET optical engines that incorporate all of the required optical elements as well as the key electronic components, including laser drivers and trans-impedance amplifiers.”

Development of a production version of the POET Infinity chiplet is on track: POET expects to deliver beta samples by Q4 2023 and to start production in the first half of 2024. The Ethernet transceiver market for data rates of 400G and above is projected to exceed $6 billion by 2028.

 

Planners must take into account that the combined effects of densely-packed housing and climate change can make living in cities unendurable. Photo: Shutterstock

Norway simulates air currents for more habitable cities

Densely-packed housing makes urban areas vulnerable to overheating, pollution, and dangerous wind gusts. The effects of climate change can aggravate these problems, but we can also work to prevent them. This can be done by simulating microclimates.

Cities and urban areas are characterized by high population densities and widespread man-made environments. In city centers, this means surfaces covered with concrete and asphalt, less vegetation, more air pollution, and large, densely-packed buildings that reduce both air circulation and access to daylight. 

At the same time, and especially in coastal areas, extremely high wind velocities can develop between tall buildings due to airflow channeling effects. This can make pedestrian walkways and cycle paths dangerous and unsafe, especially for vulnerable groups.

Climate change can make cities unbearable to live in

The impact of climate change may be reinforcing the negative effects of densely-packed housing, and many places in the world are at risk of becoming uninhabitable. Europe is probably not the most severely affected, but even here the effects of climate change, such as heat waves, droughts, and flooding, represent a threat to both infrastructure and human life. The European heatwave of 2003 claimed more than 70,000 lives. The summer of 2022 was the hottest ever experienced in Europe, with more than 20,000 people dying as a direct result of the heat. 

For this reason, it is increasingly important to reduce the impact of extreme heat in urban areas, especially in southern Europe, but also in Norway. In Oslo in particular, and in the southern parts of the Østlandet region, we can expect heat waves in the future. This is why we must do what we can to create habitable conditions in our cities, and in urban buildings, when temperatures rise to levels we are not used to. We must begin during the planning phase of the construction process.

Advanced simulations deliver more accurate airflow calculations

So, how can urban planners take a warmer climate into account? Advanced simulation tools such as Computational Fluid Dynamics (CFD) can calculate air movements in a variety of environments. This can be done as early as during the planning phase of larger building projects and as part of the overall urban planning process.

In brief, CFD offers a method of calculating the movement of fluids, including liquids and gases such as air. In recent decades, there has been a boom in the use of such methods, in step with the growth in the power of modern computers. In the last 15 to 20 years, the method has also been applied to studies of microclimates. Microclimates are climatic conditions that develop at scales of less than two kilometers, and in the past we had to investigate them through measurements and observations alone.
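
To give a flavor of what a CFD code integrates, here is a minimal one-dimensional advection-diffusion step in Python: wind carries heat downstream while diffusion spreads it out. Real microclimate solvers work in 3-D with turbulence models and realistic boundary conditions; every value here is illustrative.

```python
import numpy as np

def step(T, u=1.0, kappa=0.1, dx=1.0, dt=0.1):
    """One explicit time step of 1-D advection-diffusion on a periodic grid:
    upwind advection (valid for u > 0) plus central-difference diffusion."""
    adv = -u * (T - np.roll(T, 1)) / dx
    dif = kappa * (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    return T + dt * (adv + dif)

# A warm patch (e.g. air over sun-heated asphalt) carried along by the wind.
# Grid size, wind speed, and diffusivity are illustrative, not from any study.
T = np.zeros(50)
T[20:25] = 10.0
for _ in range(100):
    T = step(T)
print(f"peak temperature after 100 steps: {T.max():.2f} (started at 10.00)")
```

Note that both terms conserve the total heat on the periodic grid; only the peak decays as the patch spreads, which is the mechanism behind the cooling effects such simulations quantify.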

Trondheim study demonstrates the effect of different measures

Studies of the microclimate at Gløshaugen in Trondheim have demonstrated the effects of different building materials, the impact of vegetation and natural environments on local climatic conditions, and how these factors influence the energy consumption of buildings and the thermal comfort of people outdoors.

The simulations also showed that the cumulative evaporation from lawns and trees can reduce the air temperature by several degrees (up to 2.4 °C at Gløshaugen), and thus reduce the need for cooling the buildings in summer. 

These results can be used to look into different scenarios for urban design or to calculate future urban climates. Having said this, it is difficult to give general recommendations because the systems are highly location-specific, and measures must take into account all the relevant factors that influence the microclimate in a given situation.

Multiple applications

CFD can also be used to make high-resolution calculations of wind conditions, such as turbulence and wind speed. In this way, it will be possible to identify sites where urban wind turbines for local renewable energy generation can be profitably located. Moreover, the dispersion of harmful emissions and pollution can be simulated in areas such as heavily trafficked roads, and industrial plants, and also in the aftermath of accidents and fires. 

In other words, CFD can deliver useful information to construction projects, urban planning, and architecture at a level of resolution, precision, and speed that was previously out of reach.