Even the smallest pollution particles change the rainfall regime in the Amazon

Researchers have found that nanoparticles resulting from human activities such as the burning of fossil fuels rapidly grow in the atmosphere and influence cloud formation

Even the finest particles of pollution influence the process of cloud formation and the rainfall regime. A study conducted in Manaus, the capital of Amazonas state in Brazil’s northern region, shows that oxidation causes small aerosols emitted by factories and vehicle exhausts, for example, to grow very rapidly, reaching up to 400 times their original size, and that this growth affects raindrop formation.

“Understanding cloud and rain formation mechanisms in the Amazon is a major challenge because of the complexity of the non-linear physical and chemical processes that occur in the atmosphere,” said Paulo Artaxo, a professor at the University of São Paulo’s Physics Institute (IF-USP) and penultimate author of an article on the study published in Science Advances.

The co-authors are all researchers affiliated with institutions in the United States, except Artaxo and Luiz Augusto Machado, also a professor at IF-USP. 

The discovery enhances the accuracy of climate change studies based on mathematical models and simulations. “These nanoparticles of pollution [smaller than 10 nanometers] used to be overlooked in atmospheric calculations and models. The focus was on particles larger than 100 nm because these act as cloud condensation nuclei [on which water vapor condenses to form droplets] and change the rainfall regime. This study shows that smaller particles oxidize as they travel through the atmosphere, expanding rapidly until they reach the size necessary to become condensation nuclei,” Machado said.

The data was collected by instruments onboard a special aircraft that flew over the Manaus pollution plume for about 100 kilometers (km) in 2014 and 2015 during the Green Ocean Amazon (GOAmazon) scientific campaign. FAPESP funded the study via its support for the campaign and a Thematic Project, in both cases under the aegis of the FAPESP Research Program on Global Climate Change (RPGCC).

“Little was known about the role played by these nanoparticles in the rainfall regime,” Machado said. “It so happens that the Manaus area is unique in the world in the sense that it’s an open-air laboratory, a mega-city surrounded by forest at a great distance from other cities where we can investigate how a metropolitan area changes an environment similar to that of the pre-industrial era.”

Aerosols are microscopic solid or liquid particles suspended in the atmosphere. Forests produce them naturally, either directly as primary aerosols or in the atmosphere as secondary aerosols formed from forest-emitted gases known as volatile organic compounds (VOCs). They can also be produced by human activities such as the burning of fossil fuels. The latter are the type investigated in this study.

According to Machado, aerosols of less than 10 nm emitted by vehicle exhausts, factories, and power plants in the Manaus area form a pollution plume that is blown in a southwesterly direction by the prevailing winds. The researchers concluded that the particles grew rapidly during this journey.

“It’s very hard to estimate the effect of particulate matter on rainfall because of the large number of atmospheric variables that influence this interaction,” Machado said. “We therefore compared the pollution plume with nearby areas that lie outside it. We found that the particles rapidly grow in size. By the time they’re 10 km out of Manaus, they’re larger, and at 30 km they can reach a large enough size to become condensation nuclei, affecting the formation of raindrops.”
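As a rough back-of-the-envelope sketch of what that growth implies, the snippet below computes the implied diameter growth rate for a particle reaching condensation-nucleus size 30 km downwind. The wind speed and the 100 nm threshold are illustrative assumptions, not values from the study.

```python
# Back-of-the-envelope estimate of the particle growth the study describes.
# The wind speed and the CCN size threshold below are illustrative
# assumptions, not values reported in the paper.

wind_speed_kmh = 15        # assumed plume transport speed (km/h)
start_diameter_nm = 10     # sub-10 nm particles emitted in Manaus
ccn_diameter_nm = 100      # rough size at which particles act as CCN

distance_km = 30           # distance at which particles reach CCN size
travel_time_h = distance_km / wind_speed_kmh
growth_rate_nm_h = (ccn_diameter_nm - start_diameter_nm) / travel_time_h

print(f"Travel time to {distance_km} km: {travel_time_h:.1f} h")
print(f"Implied diameter growth rate: {growth_rate_nm_h:.0f} nm/h")
```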

Variable impact

Cloud formation mechanisms are complex and involve many atmospheric parameters. Small aerosols interfere with raindrop condensation, but they may intensify or reduce rainfall depending on atmospheric conditions and, above all, on cloud formation at each moment. According to Machado, a large amount of particulate matter in the plume creates a sort of competition for the water vapor present in clouds, and the size of the droplets decreases as a result.

“For the rain to fall, the droplets have to be a certain size,” he said. “What we call terminal droplet velocity has to be above the velocity of the upwelling air, or the cloud will be full of tiny droplets and no rain will fall.”
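A minimal calculation makes this criterion concrete. The sketch below uses the textbook Stokes-law approximation for small droplets (the study itself does not present this calculation, and Stokes’ law breaks down for drops much larger than about 50 micrometers); the updraft speed is an assumed illustrative value.

```python
# The rain criterion in Machado's terms: a droplet leaves the cloud only if
# its terminal fall speed exceeds the updraft. Stokes-law approximation,
# valid for small droplets; the updraft value is an illustrative assumption.

G = 9.81            # gravitational acceleration (m/s^2)
RHO_WATER = 1000.0  # density of water (kg/m^3)
MU_AIR = 1.8e-5     # dynamic viscosity of air (kg/(m*s))

def stokes_terminal_velocity(radius_m: float) -> float:
    """Terminal fall speed of a small spherical droplet in still air."""
    return 2.0 * RHO_WATER * G * radius_m**2 / (9.0 * MU_AIR)

updraft = 0.1  # assumed updraft speed in a weak, warm cloud (m/s)

for radius_um in (5, 10, 50):
    v = stokes_terminal_velocity(radius_um * 1e-6)
    status = "falls as rain" if v > updraft else "stays suspended"
    print(f"r = {radius_um:2d} um: v_t = {v:.4f} m/s -> {status}")
```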

If a very strong vertical wind is blowing, however, it can drive this large mass of droplets to a higher altitude where they form ice particles and potentially fuel a fierce storm. “We found that as the particles grow and become condensation nuclei, scant rainfall results if they meet a small, warm cloud. The aerosols reduce the precipitation. However, if the cloud builds up to become a mass of cumulonimbus [dense, towering, vertical cloud], for example, the aerosols increase the precipitation,” Machado said. “In other words, even these small particles of pollution influence the rainfall regime.”

According to the researchers, the project will proceed on a broader basis and fresh data will be collected. This year the team will conduct an experiment called Chemistry of the Atmosphere: Field Experiment in Brazil (CAFE-Brazil), with the aid of a German aircraft that can fly as high as 15,000 m. Artaxo explained that similar studies using remote sensing are also being conducted at the 325 m ATTO tower in the heart of the Amazon Rainforest (more at agencia.fapesp.br/29665).

“In the study just published [on January 12], we collected data on a low flight path [at 4,000 m],” he said. “The German aircraft we’ll use for our next collections is one of the most sophisticated flying laboratories in existence, so we’ll be able to conduct an experiment designed to produce an understanding of key physical and chemical issues in the production of aerosols, clouds, and rain that is still a mystery to us.”

UNH researchers find lower emissions vital to slow warming

Winters are warming faster than summers in North America, impacting everything from ecosystems to the economy. Global climate models indicate that this trend will continue in future winters, but there is a level of uncertainty around the magnitude of warming. Researchers at the University of New Hampshire focused on the role of carbon dioxide emissions in this equation—looking at the effects of both high and low levels of carbon dioxide emissions on future climate warming scenarios—and found that a reduction in emissions could preserve almost three weeks of snow cover and below-freezing temperatures.

“The local ski hills of New England raised me to love winter and snow,” said Elizabeth Burakowski, research assistant professor in UNH’s Earth Systems Research Center. “But winters are vital to all of us, and taking serious action now to limit, or slow, the warming of winter could mean preserving many of the core benefits of cold weather, including providing more winter protection for woodland animals, preventing the spread of invasive forest pests, and increasing the ability of ski resorts to make snow—protecting the economy by maintaining the area’s multimillion-dollar recreation industry.”

In their study, recently published in the journal Northeastern Naturalist, the researchers analyzed 29 different climate models to determine the effect of reducing emissions of carbon dioxide and other heat-trapping gases into the atmosphere. At the current pace, by mid-century (2040–2069) ski areas in North America will face up to a 50% decline in days where conditions would be favorable to make snow. Limiting emissions could slow that to only a 10 to 30% decline in the number of snowmaking days.

Colder days (below freezing) and lasting snow cover are also critical for providing winter habitats and protection for animals like porcupines and martens, carnivorous members of the weasel family. At the current rate of warming, the researchers found that deep snowpacks could become increasingly short-lived, decreasing from the historical two months of subnivium, or beneath-the-snow, habitat to less than one month. The researchers say that maintaining a cold winter environment is also associated with greater soil carbon storage and helps prevent the spread of invasive and very destructive forest pests such as the southern pine beetle, which was recently detected as far north as New Hampshire and Maine by UNH researchers.

“Emissions scenarios play a critical role in the loss of winter conditions, indicating a potential doubling of the loss of cold days and snow cover under higher emissions,” said Alexandra Contosta, research assistant professor at UNH’s Earth Systems Research Center. “These changes could disrupt and forever change some very significant social and ecological systems that have historically relied on cold, snowy winters for habitat, water resources, forest health, local economies, cultural practices, and human wellbeing.”

Historically, from 1980 to 2005, the number of snow-covered days in the Northeast was 95. Under the low emissions scenario, that would be reduced to 72 days; under the high emissions scenario, there would be only 56. New Jersey, Rhode Island, and Connecticut have historically seen 20–80 days of snow cover per season, but by the end of the century, under the higher emissions scenario, they are more likely to have a snow-free winter.
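For reference, the percentage declines implied by those numbers work out as follows:

```python
# Snow-covered days in the Northeast from the study, and the implied loss
# relative to the 1980-2005 baseline of 95 days.

baseline_days = 95
scenarios = {"low emissions": 72, "high emissions": 56}

for name, days in scenarios.items():
    lost = baseline_days - days
    print(f"{name}: {days} snow-covered days "
          f"({lost} days lost, {100 * lost / baseline_days:.0f}% decline)")
```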

Co-authors include Danielle Grogan, also at UNH; Sarah Nelson, Appalachian Mountain Club; Sarah Garlick, Hubbard Brook Research Foundation; and Nora Casson, University of Winnipeg.

Increase in home delivery service usage during COVID-19 pandemic unlikely to last

Services like Instacart, Grubhub, DoorDash, and Amazon certainly existed before the COVID-19 pandemic. However, demand for groceries, food, and other products purchased online and delivered directly to your door substantially increased when the coronavirus forced many Americans to stay at home. But just how much has the demand for deliveries increased, who uses the services, what kind of products are being delivered, and perhaps most importantly, will this increase in usage last? Researchers at Rensselaer Polytechnic Institute are answering these questions to aid policymakers and transportation logistics planners. 

In the first comprehensive study investigating the initial adoption and continuance intention of delivery services during a pandemic, Cara Wang, an associate professor in the Department of Civil and Environmental Engineering at Rensselaer, found that over 90% of people who use online delivery services would likely revert to their original way of shopping.

“Likely, the increased use of e-commerce is not the result of market competition, where the most efficient competitor outperforms the others,” Dr. Wang said. “Rather, an external disruption — the pandemic — significantly altered the playing field. Once this external effect is removed, some of the gains made by the delivery services will likely fall off.”

Using a survey method and supercomputer modeling, Dr. Wang determined that not all delivery products are the same, nor do all consumers use delivery services in the same way.

Dr. Wang identified four distinct types of users: non-adopters, prior adopters, temporary new adopters, and permanent new adopters. And delivery-service users access products in four different categories: groceries, food, home goods, and other items.
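As an illustration of how such a typology might be operationalized, the sketch below sorts a survey respondent into one of the four types. The field names are hypothetical; the study’s actual survey logic may differ.

```python
# Hypothetical sketch of sorting a survey respondent into the four adopter
# types named above. Field names are invented for illustration.

def classify_adopter(used_before: bool, used_during: bool,
                     will_continue: bool) -> str:
    """Map simplified survey answers to one of the four adopter types."""
    if used_before:
        return "prior adopter"
    if not used_during:
        return "non-adopter"
    return "permanent new adopter" if will_continue else "temporary new adopter"

print(classify_adopter(used_before=False, used_during=True,
                       will_continue=False))  # -> temporary new adopter
```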

The research showed that both the initial adoption of delivery services and the intent to continue using them vary by goods type. Grocery deliveries had the highest proportion of new adopters, followed by home goods, food, and, finally, other packages. These results imply that the COVID-19 pandemic had a larger impact on the purchase opportunities for essential items than on less essential items.

The study also found that while the number of users of grocery delivery increased by 113% during COVID, almost half of these new adopters would not continue to use the service once the pandemic is over.
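A quick back-of-the-envelope calculation shows what those two figures imply together, assuming (for illustration) a baseline of 100 pre-pandemic users and that exactly half of the new adopters stay on:

```python
# Illustrative arithmetic combining the two reported figures: +113% users
# during the pandemic, with roughly half of the new adopters not staying.
# The baseline of 100 users is an arbitrary unit for illustration.

pre_pandemic_users = 100
new_adopters = 1.13 * pre_pandemic_users        # +113% during COVID
pandemic_users = pre_pandemic_users + new_adopters

retained_new = 0.5 * new_adopters               # about half stay on
post_pandemic_users = pre_pandemic_users + retained_new

print(f"During the pandemic: {pandemic_users:.0f} users per 100 baseline users")
print(f"Expected afterward:  {post_pandemic_users:.1f} "
      f"(a net gain of about {retained_new:.0f}%)")
```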

Temporary new adopters accounted for a larger portion than the permanent new adopters for essential items, while there were more permanent new adopters for less essential items.

These findings are essential for investigating the impacts of the pandemic and predicting future demand.

“Answering these questions is essential to estimate the current and future demand for deliveries,” said José Holguín-Veras, director of the Center for Infrastructure, Transportation, and the Environment at Rensselaer and a co-author of the paper. “Transportation professionals and researchers have assumed that people would still rely on delivery services even after the COVID crisis is over. However, in reality, consumers’ technology acceptance is much more dynamic and complex during a pandemic than during normal conditions. Understanding these nuanced behaviors is essential for sound transportation policymaking.” 

Entezari develops new technique that incorporates customer behavior into recommendation algorithms

Recommendation algorithms can make a customer’s online shopping experience quicker and more efficient by suggesting complementary products whenever the shopper adds a product to their basket. Did the customer buy peanut butter? The algorithm recommends several brands of jelly to add next.

These algorithms typically work by associating purchased items with items other shoppers have frequently purchased alongside them. If the shopper’s habits, tastes, or interests closely resemble those of previous customers, such recommendations might save time, jog the memory, and be a welcome addition to the shopping experience. 
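A minimal sketch of this conventional co-purchase approach, with made-up transactions, might look like this: count how often item pairs appear in the same basket, then recommend the most frequent partners of the item just added.

```python
# A minimal co-purchase recommender of the kind described above, using
# made-up transactions: count item pairs bought together, then recommend
# the most frequent partners of the item just added to the basket.

from collections import Counter
from itertools import combinations

transactions = [
    ["peanut butter", "jelly", "bread"],
    ["peanut butter", "jelly"],
    ["peanut butter", "bananas"],
]

co_counts = Counter()
for basket in transactions:
    for a, b in combinations(sorted(set(basket)), 2):
        co_counts[(a, b)] += 1

def recommend(item, k=2):
    """Items most frequently co-purchased with `item`."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(k)]

print(recommend("peanut butter"))  # -> ['jelly', 'bread']
```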

But what if the shopper is buying peanut butter to stuff a dog toy or bait a mousetrap? What if the shopper prefers honey or bananas with their peanut butter? The recommendation algorithm will offer less useful suggestions, costing the retailer a sale and potentially annoying the customer. 

New research led by Negin Entezari, who recently received a doctoral degree in computer science at UC Riverside, together with collaborators at Instacart and her doctoral advisor Vagelis Papalexakis, brings a methodology called tensor decomposition—used by scientists to find patterns in massive volumes of data—into the world of commerce to recommend complementary products more carefully tailored to customer preferences.

Tensors can be pictured as multi-dimensional cubes and are used to model and analyze data with many different components, called multi-aspect data. Data closely related to other data can be connected in a cube arrangement and related to other cubes to uncover patterns in the data. 

“Tensors can be used to represent customers’ shopping behaviors,” said Entezari. “Each mode of a 3-mode tensor can capture one aspect of a transaction. Customers form one mode of the tensor, and the second and third modes capture product-to-product interactions by considering products co-purchased in a single transaction.”

For example, three hypothetical shoppers—A, B, and C— make the following purchases:

A: Buys hot dogs, hot dog buns, Coke, and mustard in one transaction.
B: Makes three separate transactions: Basket 1: hot dogs and hot dog buns; Basket 2: Coke; Basket 3: mustard.
C: Buys hot dogs, hot dog buns, and mustard in one transaction.

To a conventional matrix-based algorithm, Customer A is identical to Customer B because they bought the same items. Using tensor decomposition, however, Customer A is more closely related to Customer C because their behavior was similar. Both had similar products co-purchased in a single transaction, even though their purchases differed slightly. 
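The toy example below reproduces this behavior with a hand-built customers × items × items tensor, where entry [c, i, j] counts how often customer c bought items i and j in the same transaction. (The published method goes on to decompose such a tensor; this sketch only shows why the representation separates A from B.)

```python
# Toy version of the customers x items x items tensor described above.
# Entry [c, i, j] counts how often customer c bought items i and j in the
# same transaction; comparing customer slices shows A resembles C, not B.

import numpy as np

items = ["hot dogs", "buns", "coke", "mustard"]
idx = {name: i for i, name in enumerate(items)}

baskets = {
    "A": [["hot dogs", "buns", "coke", "mustard"]],
    "B": [["hot dogs", "buns"], ["coke"], ["mustard"]],
    "C": [["hot dogs", "buns", "mustard"]],
}

tensor = np.zeros((len(baskets), len(items), len(items)))
for c, transactions in enumerate(baskets.values()):
    for basket in transactions:
        for i in basket:
            for j in basket:
                if i != j:
                    tensor[c, idx[i], idx[j]] += 1

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

va, vb, vc = (tensor[k].ravel() for k in range(3))
print(f"similarity(A, B) = {cosine(va, vb):.2f}")  # ~0.41
print(f"similarity(A, C) = {cosine(va, vc):.2f}")  # ~0.71 -> A behaves like C
```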

The typical recommendation algorithm makes predictions based on the item the customer just purchased while tensor decomposition can make recommendations based on what is already in the user’s whole basket. Thus, if a shopper has dog food and peanut butter in their basket but no bread, a tensor-based recommendation algorithm might suggest a fillable dog chew toy instead of jelly if other users have also made that purchase.

“Tensors are multidimensional structures that allow modeling of complex, heterogeneous data,” said Papalexakis, an associate professor of computer science and engineering. “Instead of simply noticing which products are purchased together, there’s a third dimension. These products are purchased by this kind of user and the algorithm tries to determine which kinds of users are creating this match.”

To test their method, Entezari, Papalexakis, and co-authors Haixun Wang, Sharath Rao, and Shishir Kumar Prasad, all researchers for Instacart, used an Instacart public dataset to train their algorithm. They found that their method outperformed state-of-the-art methods for predicting customer-specific complementary product recommendations. Though more work is needed, the authors conclude that big data tensor decomposition could eventually find a home in big business as well.

“Tensor methods, even though very powerful tools, are still more popular in academic research as far as recommendation systems go,” said Papalexakis. “For the industry to adopt them we must demonstrate it’s worthwhile and relatively painless to substitute for whatever they have that works already.”

While previous research has shown the benefits of tensor modeling in recommendation problems, the new publication is the first to do so in the setting of complementary item recommendation, bringing tensor methods closer to industrial adoption and technology transfer in the context of recommendation systems.

“Tensor methods have been adopted successfully by industry before, with chemometrics and food quality being great examples, and every attempt like our work demonstrates the versatility of tensor methods in being able to tackle such a broad range of challenging problems in different domains,” said Papalexakis.

Japanese scientists develop a chaos-based stream cipher that could withstand attacks from quantum supercomputers

While for most of us cryptographic systems are things that just run “under the hood,” they are an essential element in the world of digital communications. However, the upcoming rise of quantum computers could shake the field of cryptography to its core. Fast algorithms running on these machines could break some of the most widely used cryptosystems, rendering them vulnerable. Well aware of this looming threat, cryptography researchers worldwide are working on novel encryption methods that can withstand attacks from quantum supercomputers.

Chaos theory is actively being studied as a basis for post-quantum-era cryptosystems. In mathematics, chaos is a property of certain dynamic systems that makes them extremely sensitive to initial conditions. While technically deterministic (non-random), these systems evolve in such complex ways that predicting their long-term state with incomplete information is practically impossible, since even small rounding errors in the initial conditions yield diverging results. This unique characteristic of chaotic systems can be leveraged to produce highly secure cryptographic systems, as a team of researchers from Ritsumeikan University, Japan, showed in a recent study.

Led by Professor Takaya Miyano, the team developed an unprecedented stream cipher consisting of three cryptographic primitives based on independent mathematical models of chaos. The first primitive is a pseudorandom number generator based on the augmented Lorenz (AL) map. The pseudorandom numbers produced with this approach are used to create key streams for encrypting and decrypting messages; these keys come into play in the second and perhaps most remarkable primitive—an innovative method for secret-key exchange.
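To give a flavor of how a chaos-based stream cipher works, here is a toy sketch that uses the classic logistic map as a stand-in for the paper’s augmented Lorenz map (whose exact form is not reproduced here). It is illustrative only and not secure as written:

```python
# Toy chaotic stream cipher using the logistic map as a stand-in for the
# paper's augmented Lorenz map. The shared secret is the map's initial
# condition; keystream bytes are XORed with the message. Not secure as
# written -- purely illustrative.

def logistic_keystream(x0, n_bytes, r=3.99):
    x = x0
    for _ in range(100):              # discard transient iterations
        x = r * x * (1 - x)
    for _ in range(n_bytes):
        x = r * x * (1 - x)
        yield int(x * 256) % 256      # crude byte extraction

def xor_cipher(data: bytes, x0: float) -> bytes:
    return bytes(b ^ k for b, k in zip(data, logistic_keystream(x0, len(data))))

shared_secret = 0.7130532             # agreed-upon initial condition
ciphertext = xor_cipher(b"meet at the tower", shared_secret)
print(xor_cipher(ciphertext, shared_secret))  # -> b'meet at the tower'
```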

This novel strategy for exchanging secret keys specifying the AL map is based on the synchronization of two chaotic Lorenz oscillators, which can be independently and randomly initialized by the two communicating users, without either of them knowing the state of the other’s oscillator. To conceal the internal states of these oscillators, the communicating users (the sender and the receiver) mask the value of one of the variables of their oscillator by multiplying it with a locally generated random number. The masked value of the sender is then sent to the receiver and vice versa. After a short time, when these back-and-forth exchanges cause both oscillators to sync up almost perfectly to the same state despite the randomization of the variables, the users can mask and exchange secret keys and then locally unmask them with simple calculations.
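The sketch below illustrates the underlying phenomenon of chaotic synchronization in its simplest textbook (Pecora–Carroll) form, not the paper’s masked two-way protocol: a response Lorenz oscillator that receives only the drive oscillator’s x-coordinate converges to the drive’s full state.

```python
# Textbook Pecora-Carroll synchronization of two Lorenz oscillators: the
# receiver integrates only the y and z equations, substituting the sender's
# transmitted x for its own, and its state converges to the sender's.

import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
dt = 0.001

rng = np.random.default_rng(0)
x, y, z = rng.uniform(-10, 10, 3)          # sender's oscillator
ry, rz = rng.uniform(-10, 10, 2)           # receiver starts elsewhere

for _ in range(20_000):                    # 20 time units of coupling
    dx = SIGMA * (y - x)
    dy = x * (RHO - z) - y
    dz = x * y - BETA * z
    # receiver uses the transmitted x in place of its own state
    dry = x * (RHO - rz) - ry
    drz = x * ry - BETA * rz
    ry, rz = ry + dt * dry, rz + dt * drz
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz

print(f"sender   (y, z) = ({y:+.4f}, {z:+.4f})")
print(f"receiver (y, z) = ({ry:+.4f}, {rz:+.4f})  # nearly identical")
```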

Finally, the third primitive is a hash function based on the logistic map (a chaotic equation of motion), which allows the sender to send a hash value and, in turn, allows the receiver to ensure that the received secret key is correct, i.e., the chaotic oscillators were synchronized properly.
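A toy construction in the same spirit, iterating the logistic map while folding in message bytes, shows how a chaotic map can yield a message digest; this is not the paper’s exact hash design.

```python
# Toy message digest built from the logistic map: fold each message byte
# into the chaotic state, then read out digest bytes. Same general idea as
# a logistic-map hash, not the paper's exact construction.

def chaotic_hash(message: bytes, r: float = 3.99, digest_len: int = 8) -> bytes:
    x = 0.5
    for byte in message:
        x = r * x * (1 - x)
        x = (x + byte / 255.0) % 1.0 or 0.5   # avoid the fixed point at 0
    digest = bytearray()
    for _ in range(digest_len):
        x = r * x * (1 - x)
        digest.append(int(x * 256) % 256)
    return bytes(digest)

print(chaotic_hash(b"secret key").hex())
print(chaotic_hash(b"secret kez").hex())   # one byte changed -> new digest
```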

The researchers showed that a stream cipher assembled from these three primitives is extremely secure and resistant to statistical attacks and eavesdropping, since it is mathematically impossible for an eavesdropper to synchronize their own oscillator with either the sender’s or the receiver’s. This is an unprecedented achievement, as Prof. Miyano states: “Most chaos-based cryptosystems can be broken by attacks using classical computers within a practically short time. In contrast, our methods, especially the one for secret-key exchange, appear to be robust against such attacks and, more importantly, even hard to break using quantum computers.”

In addition to its security, the proposed key exchange method can be applied to existing block ciphers, such as the widely used Advanced Encryption Standard (AES). Moreover, the researchers were able to implement their chaos-based stream cipher on the Raspberry Pi 4, a small-scale computer, using Python 3.8. They even used it to securely transmit a famous painting by Johannes Vermeer between Kusatsu and Sendai, two places in Japan 600 km apart. “The implementation and running costs of our cryptosystem are remarkably low compared with those of quantum cryptography,” highlights Prof. Miyano. “Our work thus provides a cryptographic approach that guarantees the privacy of daily communications between people all over the world in the post-quantum era.”

With the power of chaos-based cryptography at hand, we may have little to worry about from the dark side of quantum computing.