Terahertz light-driven spin-lattice control could open a new path to faster data storage

An international team of researchers from the University of Cologne (Germany), Radboud University Nijmegen (the Netherlands), the Ioffe Institute, and the Prokhorov General Physics Institute (Russia) has discovered a new mechanism for controlling spin-lattice interaction using ultrashort terahertz (THz) pulses (one terahertz is 10¹² hertz). This mechanism could open up new and elegant ways to control the propagation of spin waves, an important step towards conceptually new data-processing technologies. The results have been published in Science under the title ‘Terahertz light-driven coupling of antiferromagnetic spins to lattice’.

Magnetic recording currently dominates data storage technology. It is estimated that data centers will soon consume more than 7% of the world’s energy production. Hence there is an urgent demand for new technologies that process and store data using ultrafast processes in an energy-efficient manner.

Spin-lattice interaction plays a decisive role in magnetic recording: a spin is the elementary magnetic moment of an electron, and controlling its orientation (up or down) is the basis of modern binary computing. The scientists used antiferromagnets in their study – materials in which the spins of electrons align in a regular pattern, with neighboring spins pointing in opposite directions. The collective motion of spins in these materials, so-called spin waves, is typically ten times faster than in traditional ferromagnetic materials. In contrast to electrons, such spin waves hardly interact with the crystal lattice and can therefore propagate over macroscopic distances without losses. In the future, spin waves could replace electrons as carriers of information in magnetic materials – a field known as spintronics – bringing the potential for much faster and more efficient data processing. At the same time, this weak interaction makes control over the propagation of spin waves challenging. The scientists overcame this by ‘driving’ the spin-lattice coupling with an ultrashort terahertz pulse.

Dr. Evgeny Mashkovich, Senior Researcher in the Optical Condensed Matter Science group at the University of Cologne’s Institute for Experimental Physics, said: "We showed that we can now control the interaction between lattice and spin waves and make it a strong interaction. I believe that this discovery is an important step towards conceptually new technologies for ultra-fast data processing and efficient data storage in the future."

Anomalo, Snowflake partner to help enterprises trust their data

Anomalo has announced a partnership with Snowflake to help customers trust the data they use to make decisions and build products. The combination provides customers with a way to monitor the quality of the data in any table in Snowflake’s platform without writing code, configuring rules, or setting thresholds.

Today’s modern data-powered organizations are using Snowflake’s platform to centralize all of their data and make it easily available for everything from business decision-making to predictive analytics and machine learning.

However, dashboards and data-powered products are only as good as the quality of the data that powers them. Many data-powered companies quickly encounter one unfortunate fact: much of their data is missing, stale, corrupt, or prone to unexpected and unwelcome changes. As a result, companies spend more time dealing with issues in their data rather than unlocking that data’s value.

Anomalo addresses the data quality problem by monitoring enterprise data, automatically detecting data issues, and tracing their root causes, allowing teams to resolve any hiccups with their data before making decisions, running operations, or powering models. Anomalo leverages machine learning to rapidly assess a wide range of data sets with minimal human input. If desired, enterprises can fine-tune Anomalo’s monitoring through low-code configuration of metrics and validation rules. This contrasts with legacy approaches to monitoring data quality, which require extensive work writing data validation rules or setting limits and thresholds.
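Anomalo has not published its detection internals, so the following is only a loose sketch of the general idea: an automated check that flags a table metric (here, a hypothetical daily row count) when it deviates sharply from recent history, using a robust statistic rather than a hand-set threshold.

```python
from statistics import median

def is_anomalous(history, latest, threshold=5.0):
    """Flag `latest` if it deviates from `history` by more than
    `threshold` robust z-scores (based on the median absolute deviation)."""
    med = median(history)
    mad = median(abs(x - med) for x in history)
    if mad == 0:  # perfectly flat history: any change is anomalous
        return latest != med
    robust_z = 0.6745 * (latest - med) / mad
    return abs(robust_z) > threshold

# Hypothetical daily row counts for a table; the last load was clearly short.
row_counts = [10_120, 9_980, 10_050, 10_200, 9_900, 10_110]
print(is_anomalous(row_counts, 1_200))   # → True  (short load flagged)
print(is_anomalous(row_counts, 10_070))  # → False (normal load passes)
```

The design point this illustrates is that no hand-written validation rule or fixed limit is required: the acceptable range is learned from each table’s own history, which is what lets a tool of this kind scale across many tables with minimal configuration.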

As a result, Snowflake customers can now begin monitoring the quality of their data with Anomalo in under five minutes. They simply connect Anomalo’s data quality platform to their Snowflake account and select the tables they wish to monitor. No further configuration or code is required.

Anomalo and Snowflake are used by customers globally:

  • Discover Financial Services is leveraging Anomalo to quickly gain trust in their most critical data. Discover’s Chief Data and Analytics Officer Keith Toney said: “Discover is transforming and expanding how we use data as an enterprise asset to serve our customers better through advanced data analytics. We were looking for a product that would help us maintain a scalable foundation of trusted data in a fast-paced digital environment. We selected Anomalo to fully automate the basis of our data quality monitoring because their machine learning and root cause detection technology identify late, missing, or anomalous data across our petabyte-scale cloud warehouse. Our data stewards use Anomalo’s intuitive UI to tailor monitoring to their business needs. Compared to legacy solutions, Anomalo will help us detect more quality issues with just a fraction of the time invested by our team.”
  • Faire uses Anomalo to monitor the most important tables in their Snowflake account. Daniele Perito, Chief Data Officer and co-founder at Faire, said: “We monitor hundreds of key tables in Snowflake’s platform with Anomalo. I sleep better at night knowing our data is more reliable, and my team loves how easy it is to use and how insightful the notifications are.”
  • Substack uses Anomalo to empower their small team to keep up with an ever-growing collection of data. Mike Cohen, Substack’s Data Manager, said: “With a small data team at Substack, the automated checks that Anomalo provides are like having another data engineer on the team whose primary focus is to ensure data quality and integrity. With these checks, we've caught internal data and production bugs and detected the presence of bad actors internal to our system that might have otherwise gone unnoticed for long periods.”

“Snowflake provides an ideal environment for tools like Anomalo. With its ability to centralize the full set of enterprise data and its unique ability to automatically size query workloads based on their priority and urgency, Snowflake is a perfect partner in helping enterprises trust all of their important data,” said Elliot Shmukler, co-founder and CEO of Anomalo.

“Anomalo offers an easy-to-use way to monitor every table in a customer’s Snowflake account for data quality issues," said Tarik Dwiek, Head of Technology Alliances at Snowflake. "We're excited to offer Snowflake customers the ability to leverage Anomalo to further build trust in the data they are using to develop products and make decisions.”

As part of today’s announcement, Anomalo has become a Select Partner within the Snowflake Partner Program.

UK supercomputing reveals more hostile conditions on Earth as life evolved

During long portions of the past 2.4 billion years, the Earth may have been more inhospitable to life than scientists previously thought, according to new supercomputer simulations.

[Figure: how UV radiation reaching the Earth has changed over the last 2.4 billion years. Credit: Gregory Cooke.]

Using a state-of-the-art climate model, researchers now believe the level of ultraviolet (UV) radiation reaching the Earth’s surface has been underestimated, with UV levels up to ten times higher than previously assumed.

UV radiation is emitted by the sun and can damage and destroy biologically important molecules such as proteins.  

The last 2.4 billion years represent an important chapter in the development of the biosphere. Oxygen levels rose from almost zero to significant amounts in the atmosphere, fluctuating but eventually reaching modern-day concentrations approximately 400 million years ago.

During this time, more complex multicellular organisms and animals began to colonize the land.   

Gregory Cooke, a Ph.D. researcher at the University of Leeds who led the study, said the findings raise new questions about the evolutionary impact of UV radiation, as many forms of life are known to be harmed by intense doses of it.

He said: “We know that UV radiation can have disastrous effects if life is exposed to too much. For example, it can cause skin cancer in humans. Some organisms have effective defense mechanisms, and many can repair some of the damage UV radiation causes.  

“Whilst elevated amounts of UV radiation would not prevent life’s emergence or evolution, it could have acted as a selection pressure, with organisms better able to cope with greater amounts of UV radiation receiving an advantage.”    

The amount of UV radiation reaching the Earth’s surface is limited by ozone in the atmosphere, described by the researchers as “one of the most important molecules for life” because of its role in absorbing UV radiation as it passes into the Earth’s atmosphere.

Ozone forms as a result of sunlight and chemical reactions – and its concentration is dependent on the level of oxygen in the atmosphere.  

For the last 40 years, scientists have believed that the ozone layer was able to shield life from harmful UV radiation when the level of oxygen in the atmosphere reached about one percent relative to the present atmospheric level.  

The new modeling challenges that assumption. It suggests the level of oxygen needed may have been much higher, perhaps 5% to 10% of present atmospheric levels.

[Figure: a rough outline of oxygen (O2) concentrations in Earth’s atmosphere through time. Brown blocks show the estimated range for O2 relative to its present atmospheric level (21% by volume). Grey-blue lines indicate important events in the evolution of life, including the emergence of eukaryotes and animals. Black arrows mark major changes in atmospheric oxygen concentration. The Archean, Proterozoic, and Phanerozoic are geological eons. GOE = Great Oxidation Event; NOE = Neoproterozoic Oxidation Event; CE = Cambrian Explosion; LE = Lomagundi Excursion. Credit: Gregory Cooke.]

As a result, there were periods when UV radiation levels at the Earth’s surface were much greater, and this could have been the case for most of the Earth’s history. 

Mr. Cooke said: “If our modeling is indicative of atmospheric scenarios during Earth’s oxygenated history, then for over a billion years the Earth could have been bathed in UV radiation that was much more intense than previously believed. 

“This may have had fascinating consequences for life’s evolution. It is not precisely known when animals emerged, or what conditions they encountered in the oceans or on land. However, depending on oxygen concentrations, animals and plants could have faced much harsher conditions than today’s world. We hope that the full evolutionary impact of our results can be explored in the future.”   

The results will also lead to new predictions for exoplanet atmospheres. Exoplanets are planets that orbit other stars. The presence of certain gases, including oxygen and ozone, may indicate the possibility of extra-terrestrial life, and the results of this study will aid scientific understanding of surface conditions on other worlds.

CfA astronomer Karen Collins uses ML to discover mysterious dusty object orbiting TIC 400799224

The Transiting Exoplanet Survey Satellite, TESS, was launched in 2018 to discover small planets around the Sun’s nearest neighbor stars. TESS has so far discovered 172 confirmed exoplanets and compiled a list of 4703 candidate exoplanets. Its sensitive camera takes images that span a huge field of view – more than twice the area of the constellation Orion – and the mission has also assembled the TESS Input Catalog (TIC) with over 1 billion objects. Follow-up studies of variable TIC objects have traced their behavior to stellar pulsations, shocks from supernovae, disintegrating planets, gravitationally self-lensed binary stars, eclipsing triple star systems, disk occultations, and more.

[Figure: an optical/near-infrared image of the sky around the TIC object TIC 400799224; the crosshair marks the location of the object, and the width of the field of view is given in arcminutes. Astronomers have concluded that the mysterious periodic variations in the light from this object are caused by an orbiting body that periodically emits clouds of dust that occult the star. Powell et al., 2021.]

The Center for Astrophysics | Harvard & Smithsonian (CfA) astronomer Karen Collins was a member of the large team that discovered the mysterious variable object TIC 400799224. The team searched the catalog using machine-learning-based computational tools developed from the observed behaviors of hundreds of thousands of known variable objects; the method has previously found disintegrating planets and dust-emitting bodies, for example. The unusual source TIC 400799224 was spotted serendipitously because of its rapid drop in brightness – by nearly 25% in just a few hours – followed by several sharp brightness variations that could each be interpreted as an eclipse.

The astronomers studied TIC 400799224 with a variety of facilities, including some that have been mapping the sky for longer than TESS has been operating. They found that the object is probably a binary star system and that the light from one of the stars varies with a 19.77-day period, probably because an orbiting body periodically emits clouds of dust that occult the star. But while the periodicity is strict, the dust occultations are erratic in their shapes, depths, and durations, and are detectable (at least from the ground) only about one-third of the time or less. The nature of the orbiting body is puzzling because the quantity of dust emitted is large; if it were produced by the disintegration of an object like the asteroid Ceres in our solar system, the body would survive only about eight thousand years before disappearing. Yet remarkably, over the six years that this object has been observed, the periodicity has remained strict and the body emitting the dust has remained intact. The team plans to continue monitoring the object and to incorporate historical observations of the sky to try to determine its variations over many decades.
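A strict 19.77-day signal like this one is the kind of pattern astronomers surface by phase-folding: each observation time is mapped onto a single cycle of a trial period, so that dips recurring once per period stack on top of each other. A minimal sketch with synthetic timestamps (not the actual TESS photometry) is:

```python
def phase_fold(times, period):
    """Map observation times (same units as `period`) onto phase in [0, 1)."""
    return [(t % period) / period for t in times]

period = 19.77  # days, the periodicity reported for TIC 400799224

# Synthetic observation times: the first four recur one period apart,
# so after folding they all land near the same phase.
times = [4.9, 24.7, 44.5, 64.2, 10.0, 30.0]
phases = phase_fold(times, period)
print([round(p, 2) for p in phases])  # → [0.25, 0.25, 0.25, 0.25, 0.51, 0.52]
```

Events that cluster at one phase across many cycles confirm a strict period even when, as here, individual occultations vary in shape and depth or are missed entirely.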

Yale physicists build simulations that show hurricanes will roam over more of the Earth

A new Yale-led study suggests the 21st century will see an expansion of hurricanes and typhoons into mid-latitude regions, which include major cities such as New York, Boston, Beijing, and Tokyo.

The researchers said tropical cyclones – hurricanes and typhoons – could migrate northward and southward in their respective hemispheres as the planet warms as a result of anthropogenic greenhouse gas emissions. 2020’s subtropical storm Alpha, the first tropical cyclone observed making landfall in Portugal, and this year’s Hurricane Henri, which made landfall in Connecticut, may be harbingers of such storms.

“This represents an important, underestimated risk of climate change,” said first author Joshua Studholme, a physicist in Yale’s Department of Earth and Planetary Sciences in the Faculty of Arts and Sciences, and a contributing author on the United Nations’ Intergovernmental Panel on Climate Change sixth assessment report published earlier this year.

“This research predicts that the 21st century’s tropical cyclones will likely occur over a wider range of latitudes than has been the case on Earth for the last 3 million years,” Studholme said.

Co-authors of the study are Alexey Fedorov, a professor of oceanic and atmospheric sciences at Yale, Sergey Gulev of the Shirshov Institute of Oceanology, Kerry Emanuel of the Massachusetts Institute of Technology, and Kevin Hodges of the University of Reading.

While an increase in tropical cyclones is commonly cited as a harbinger of climate change, much remains unclear about how sensitive they are to the planet’s average temperature. In the 1980s, study co-author Emanuel used concepts from classical thermodynamics to predict that global warming would result in more intense storms — a prediction that has been validated in the observational record.

Yet other aspects of the relationship between tropical cyclones and climate still lack physically based theory. For example, there is no agreement among scientists about whether the total number of storms will increase or decrease as the climate warms, or why the planet experiences roughly 90 such events each year.

“There are large uncertainties in how tropical cyclones will change in the future,” said Fedorov. “However, multiple lines of evidence indicate that we could see more tropical cyclones in mid-latitudes, even if the total frequency of tropical cyclones does not increase, which is still actively debated. Compounded by the expected increase in average tropical cyclone intensity, this finding implies higher risks due to tropical cyclones in Earth’s warming climate.”

Typically, tropical cyclones form at low latitudes with access to warm water from tropical oceans, away from the shearing impact of the jet streams – the west-to-east bands of wind that circle the planet. Earth’s rotation causes clusters of thunderstorms to aggregate and spin up into the vortices that become tropical cyclones. Other mechanisms of hurricane formation also exist.

As the climate warms, temperature differences between the Equator and the poles will decrease, the researchers say. In the summer months, this may cause weakening or even a split in the jet stream, opening a window in the mid-latitudes for tropical cyclones to form and intensify.

For the study, Studholme, Fedorov, and their colleagues analyzed numerical simulations of warm climates from Earth’s distant past, recent satellite observations, and a variety of weather and climate projections, as well as the fundamental physics governing atmospheric convection and planetary-scale winds. For example, they noted that supercomputer simulations of warmer climates during the Eocene (56 to 34 million years ago) and Pliocene (5.3 to 2.6 million years ago) epochs saw tropical cyclones form and intensify at higher latitudes.

“The core problem when making future hurricane predictions is that models used for climate projections do not have sufficient resolution to simulate realistic tropical cyclones,” said Studholme, who is a postdoctoral fellow at Yale. “Instead, several different, indirect approaches are typically used. However, those methods seem to distort the underlying physics of how tropical cyclones form and develop. A number of these methods also provide predictions that contradict each other.”

The new study derives its conclusions by examining connections between hurricane physics on scales too small to be represented in current climate models and the better-simulated dynamics of Earth’s jet streams and north-south air circulation, known as the Hadley cells.