Caption: Diagram illustrates the way two laser beams of slightly different wavelengths can affect the electric fields surrounding an atomic nucleus, pushing against this field in a way that nudges the spin of the nucleus in a particular direction, as indicated by the arrow. Credit: Courtesy of the researchers

MIT engineers discover a new way to control atomic nuclei as qubits

Using lasers, researchers can directly control a property of nuclei called spin, which can encode quantum information.

In principle, quantum-based devices such as computers and sensors could vastly outperform conventional digital technologies for carrying out many complex tasks. But developing such devices in practice has been a challenging problem despite great investments by tech companies as well as academic and government labs.

Today’s biggest quantum computers still have only a few hundred “qubits,” the quantum equivalents of digital bits.

Now, researchers at MIT have proposed a new approach to making qubits and controlling them to read and write data. The method, which is theoretical at this stage, is based on measuring and controlling the spins of atomic nuclei, using beams of light from two lasers of slightly different colors. The findings are described in a paper published Tuesday in the journal Physical Review X, written by MIT doctoral student Haowei Xu, professors Ju Li and Paola Cappellaro, and four others.

Nuclear spins have long been recognized as potential building blocks for quantum-based information processing and communications systems, and so have photons, the elementary particles that are discrete packets, or “quanta,” of electromagnetic radiation. But coaxing these two quantum objects to work together was difficult because atomic nuclei and photons barely interact, and their natural frequencies differ by six to nine orders of magnitude.

In the new approach proposed by the MIT team, the difference between the frequencies of the two laser beams is matched to the transition frequency of the nuclear spin, nudging the spin to flip in a particular direction.
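To get a sense of the scales involved, here is a minimal sketch in Python with purely illustrative numbers; the wavelength and transition frequency below are assumptions, not values from the paper:

```python
# Illustrative sketch: two lasers near 1550 nm whose *difference* frequency is
# tuned to a nuclear spin transition. Each optical frequency is ~2e14 Hz; the
# transition is only tens of MHz, many orders of magnitude smaller.
# The wavelength and transition frequency are hypothetical, not from the paper.

C = 299_792_458.0  # speed of light in vacuum, m/s

lambda_1 = 1550.0e-9   # first laser: a telecom-band wavelength (assumed)
f_target = 30e6        # assumed nuclear spin transition frequency, 30 MHz

f1 = C / lambda_1      # optical frequency of laser 1
f2 = f1 - f_target     # detune laser 2 by exactly the transition frequency
lambda_2 = C / f2      # corresponding wavelength of laser 2

print(f"laser 1: {f1:.6e} Hz at {lambda_1 * 1e9:.6f} nm")
print(f"laser 2: {f2:.6e} Hz at {lambda_2 * 1e9:.6f} nm")
print(f"difference: {(f1 - f2) / 1e6:.1f} MHz "
      f"(~{f1 / (f1 - f2):.0e} times smaller than the optical frequency)")
```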

“We have found a novel, powerful way to interface nuclear spins with optical photons from lasers,” says Cappellaro, a professor of nuclear science and engineering. “This novel coupling mechanism enables their control and measurement, which now makes using nuclear spins as qubits a much more promising endeavor.”

The process is completely tunable, the researchers say. For example, one of the lasers could be tuned to match the frequencies of existing telecom systems, thus turning the nuclear spins into quantum repeaters to enable long-distance quantum communication.

Previous attempts to use light to affect nuclear spins were indirect, coupling instead to electron spins surrounding that nucleus, which in turn would affect the nucleus through magnetic interactions. But this requires the existence of nearby unpaired electron spins and leads to additional noise on the nuclear spins. For the new approach, the researchers took advantage of the fact that many nuclei have an electric quadrupole, which leads to an electric nuclear quadrupolar interaction with the environment. This interaction can be affected by light to change the state of the nucleus itself.

“Nuclear spin is usually pretty weakly interacting,” says Li. “But by using the fact that some nuclei have an electric quadrupole, we can induce this second-order, nonlinear optical effect that directly couples to the nuclear spin, without any intermediate electron spins. This allows us to directly manipulate the nuclear spin.”

Among other things, this could allow the precise identification, and even mapping, of isotopes in a material. Raman spectroscopy, a well-established method based on analogous physics, can identify a material’s chemistry and structure, but not its isotopes. This capability could have many applications, the researchers say.

As for quantum memory, typical devices presently being used or considered for quantum computing have coherence times — meaning the amount of time that stored information can be reliably kept intact — that tend to be measured in tiny fractions of a second. But with the nuclear spin system, quantum coherence times are measured in hours.

Since optical photons are used for long-distance communications through fiber-optic networks, the ability to directly couple these photons to quantum memory or sensing devices could provide significant benefits in new communications systems, the team says.  In addition, the effect could be used to provide an efficient way of translating one set of wavelengths to another. “We are thinking of using nuclear spins for the transduction of microwave photons and optical photons,” Xu says, adding that this can provide greater fidelity for such translation than other methods.

So far, the work is theoretical, so the next step is to implement the concept in actual laboratory devices, probably first of all in a spectroscopic system. “This may be a good candidate for the proof-of-principle experiment,” Xu says. After that, they will tackle quantum devices such as memory or transduction effects, he says.

This work “offers new opportunities in quantum technologies, including quantum control and quantum memory,” says Yao Wang, an assistant professor of physics at Clemson University, who was not associated with this work. He adds that “very impressively, this work also provided very quantitative predictions of the expected observations in these application scenarios with accurate first-principles methods. I look forward to the experimental realization of this technique, which I am sure would attract a lot of researchers in the field of quantum science and nuclear technology.”

The team also included Changhao Li, Guoqing Wang, Hua Wang, Hao Tang, and Ariel Barr, all at MIT.


University of Gothenburg develops AI-based online decision support for doctors’ hard judgments on cardiac arrest

When patients receive care after cardiac arrest, doctors can now enter patient data into a web-based app to find out how thousands of similar patients have fared. Researchers at the University of Gothenburg in Sweden have developed three such decision-support systems for cardiac arrest that may, in the future, make a major difference to doctors’ work.

One of these decision support tools (SCARS-1), now published, can be downloaded free of charge from the Gothenburg Cardiac Arrest Machine Learning Studies website. However, results from the algorithm need to be interpreted by people with the right skills. AI-based decision support is expanding rapidly in many areas of health care, and extensive discussions are underway on how care services and patients alike can benefit the most from it.

The app accesses data from the Swedish Cardiopulmonary Resuscitation Register on tens of thousands of patient cases. The University of Gothenburg researchers have used an advanced form of machine learning to teach clinical prediction models to recognize various factors that have affected previous outcomes. The algorithms take into account numerous factors relating, for example, to the cardiac arrest, treatment provided, previous ill health, medication, and socioeconomic status.

New evidence-based methods

It will be a few years before official recommendations for cardiac arrest are likely to include AI-based decision support, but doctors are free to use these prediction models and other new, evidence-based methods. The research group working on decision support for cardiac arrest is headed by Araz Rawshani, a researcher at the University’s Sahlgrenska Academy and resident physician in cardiology at Sahlgrenska University Hospital.

“My colleagues and I who tested the tool see great potential for its use in our everyday clinical work. Often the answer from the decision support confirms an opinion the doctor has already formed. It helps us not to expose patients to painful care without benefit, while at the same time saving healthcare resources,” Rawshani says.

Araz Rawshani, principal investigator at the Institute of Medicine, University of Gothenburg, consultant physician at Sahlgrenska University Hospital, and registrar for the Swedish Cardiopulmonary Resuscitation Register. Photo: Johan Wingborg

He emphasizes, however, that the decision support is still at the research stage and has not yet been implemented in the hospital’s guidelines. Even so, it can be used to make a survival calculation, in the same way that, for example, the National Diabetes Registry assists doctors and nurses with a so-called risk engine.

Highly accurate

To date, the research group has published two decision support tools. One clinical prediction model, known as SCARS-1, is presented in The Lancet’s eBioMedicine journal. This model indicates whether a new patient case more closely resembles previous cases in which patients were alive 30 days after their cardiac arrest, or those in which they had died.

The model’s accuracy is unusually high. Based on the ten most significant factors alone, the model has a sensitivity of 95 percent and a specificity of 89 percent. The “AUC-ROC value” (ROC being the receiver operating characteristic curve for the model and AUC the area under the ROC curve) for this model is 0.97. The highest possible AUC-ROC value is 1.0 and the threshold for a clinically relevant model is 0.7.
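For readers less familiar with these metrics, the following minimal sketch, using scikit-learn on synthetic data (not the registry data), shows how sensitivity, specificity, and AUC-ROC are computed:

```python
# Minimal sketch of the reported metrics on synthetic data (not registry data):
# sensitivity and specificity at one decision threshold, and AUC-ROC overall.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)       # synthetic outcomes: 1 = survived
# Synthetic model scores that track the outcome, mimicking a strong predictor.
y_score = np.clip(y_true * 0.7 + rng.normal(0.15, 0.2, size=1000), 0.0, 1.0)
y_pred = (y_score >= 0.5).astype(int)        # classify at an example threshold

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # share of actual positives correctly flagged
specificity = tn / (tn + fp)   # share of actual negatives correctly flagged
auc = roc_auc_score(y_true, y_score)  # threshold-free; 1.0 perfect, 0.5 chance

print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  AUC={auc:.2f}")
```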

One piece of the puzzle

This decision support was developed by Fredrik Hessulf, a doctoral student at Sahlgrenska Academy, University of Gothenburg, and an anesthesiologist at Sahlgrenska University Hospital/Mölndal.

“This decision support is one of several pieces in a big puzzle: the doctor’s overall assessment of a patient. We have many different factors to consider in deciding whether to go ahead with cardiopulmonary resuscitation. It’s a highly demanding treatment that we should give only to patients who will benefit from it and be able, after their hospital stay, to lead a life of value to themselves,” Hessulf says.

This form of support is based on 393 factors affecting patients’ chances of surviving for 30 days after their cardiac arrest. The model’s high accuracy may be explained by the huge number of patient cases (roughly 55,000) on which the algorithm is based, and by the fact that ten of the nearly 400 factors have been found to impact heavily on survival. By far the most important factor was whether the heart regained a viable rhythm after the patient’s admission to the emergency department.
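A common way to see that a handful of variables dominate a model with hundreds of inputs is to rank feature importances from a tree-based model. The sketch below illustrates the idea on synthetic data; the data and model choice are assumptions, not the group’s actual pipeline:

```python
# Hedged sketch: rank feature importances in a model with ~400 inputs of which
# only a few are truly informative. Synthetic data; not the group's pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Stand-in for the registry: 393 features, only 10 of them actually predictive.
X, y = make_classification(n_samples=2000, n_features=393, n_informative=10,
                           random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X, y)
ranking = np.argsort(model.feature_importances_)[::-1]   # most important first
top10 = ranking[:10]

print("top-10 feature indices:", top10)
print("their share of total importance:",
      round(model.feature_importances_[top10].sum(), 2))
```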

Risk of new cardiac arrest

The second decision support tool published has been presented in the journal Resuscitation. This tool is based on data from patients who survived their out-of-hospital cardiac arrest until they were discharged from the hospital. The predictive models are based on 886 factors in 5098 patient cases from the Swedish Cardiopulmonary Resuscitation Register. This tool is partly aimed at helping doctors identify which patients are at risk of another cardiac arrest or death within a year of discharge from the hospital following their cardiac arrest. It also aims to highlight which factors are important for long-term survival after cardiac arrest — an aspect of the subject area that has not been well studied.

“The accuracy of this tool is reasonably good. It can predict with about 70 percent reliability whether the patient will die, or will have had another cardiac arrest, within a year. Like Fredrik’s tool, this one has the advantage that just a few factors can predict outcome almost as well as the model with several hundred variables,” says Gustaf Hellsén, the research doctor who developed this decision support tool.

“We hope,” he continues, “to succeed in developing this prediction model, to enhance its precision. Today, it can already serve as support for doctors in identifying factors with an important bearing on survival among cardiac arrest patients who are to be discharged from hospital.”

Three decision support tools for different aspects of cardiac arrest

Currently, the SCARS-1 tool (developed by Fredrik Hessulf, addressing survival and neurological function 30 days after cardiac arrest) is available to use as an online app. SCARS-2 (developed by Gustaf Hellsén and designed to support decisions on the risk of new cardiac arrest after discharge) will be launched shortly. During 2023, the publication of SCARS-3 (for in-hospital cardiac arrest) is also planned.
Doctors and other health professionals can read more about these decision support tools and download the applications at http://gocares.se.

Caltech engineers discover that Leonardo da Vinci's understanding of gravity was centuries ahead of his time

In an article published in the journal Leonardo, the researchers draw upon a fresh look at one of da Vinci's notebooks to show that the famed polymath had devised experiments to demonstrate that gravity is a form of acceleration—and that he further modeled the gravitational constant to around 97 percent accuracy. 

Da Vinci, who lived from 1452 to 1519, was well ahead of the curve in exploring these concepts. It wasn't until 1604 that Galileo Galilei would theorize that the distance covered by a falling object was proportional to the square of time elapsed and not until the late 17th century that Sir Isaac Newton would expand on that to develop a law of universal gravitation, describing how objects are attracted to one another. Da Vinci's primary hurdle was being limited by the tools at his disposal. For example, he lacked a means of precisely measuring time as objects fell.

Da Vinci's experiments were first spotted by Mory Gharib, the Hans W. Liepmann Professor of Aeronautics and Medical Engineering, in the Codex Arundel, a collection of papers written by da Vinci that cover science, art, and personal topics. In early 2017, Gharib was exploring da Vinci's techniques of flow visualization for a graduate course he was teaching when he noticed, in the newly released Codex Arundel (which can be viewed online courtesy of the British Library), a series of sketches showing triangles generated by sand-like particles pouring out from a jar.

"What caught my eye was when he wrote ‘Equatione di Moti' on the hypotenuse of one of his sketched triangles—the one that was an isosceles right triangle," says Gharib, lead author of the Leonardo paper. "I became interested to see what Leonardo meant by that phrase."

To analyze the notes, Gharib worked with colleagues Chris Roh, at the time a postdoctoral researcher at Caltech and now an assistant professor at Cornell University, as well as Flavio Noca of the University of Applied Sciences and Arts Western Switzerland in Geneva. Noca provided translations of da Vinci's Italian notes (written in his famous left-handed mirror writing that reads from right to left) as the trio pored over the manuscript's diagrams.

In the papers, da Vinci describes an experiment in which a water pitcher would be moved along a straight path parallel to the ground, dumping out either water or a granular material (most likely sand) along the way. His notes make it clear he was aware that the water or sand would not fall at a constant velocity but rather would accelerate; that the material stops accelerating horizontally once it is no longer influenced by the pitcher; and that its acceleration is then purely downward, due to gravity.

If the pitcher moves at a constant speed, the line created by falling material is vertical, so no triangle forms. If the pitcher accelerates at a constant rate, the line created by the collection of falling material makes a straight but slanted line, which then forms a triangle. And, as da Vinci pointed out in a key diagram, if the pitcher's motion is accelerated at the same rate that gravity accelerates the falling material, it creates an isosceles right triangle—which is what Gharib originally noticed that da Vinci had highlighted with the note "Equatione di Moti," or "equalization (equivalence) of motions."
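A short simulation under idealized assumptions (point particles, no air resistance) reproduces this geometry. Relative to the pitcher, each particle's horizontal offset works out to be proportional to its vertical drop, so a constant-speed pitcher leaves a vertical line while a pitcher accelerating at g leaves a 45-degree hypotenuse:

```python
# Idealized sketch of da Vinci's pitcher experiment (no air resistance).
# A particle released at time tau and observed at time T sits, relative to the
# pitcher, at horizontal offset dx = -(a/g) * dy, where dy = g*(T - tau)**2 / 2
# is its vertical drop and a is the pitcher's constant acceleration.
import numpy as np

g = 9.81                        # gravitational acceleration, m/s^2
T = 1.0                         # observation time, s
taus = np.linspace(0.0, T, 6)   # release times of successive particles

for a in (0.0, g):              # constant speed (a = 0) vs. acceleration a = g
    dy = g * (T - taus) ** 2 / 2     # vertical drop of each particle
    dx = -(a / g) * dy               # horizontal offset behind the pitcher
    shape = "vertical line" if a == 0.0 else "45-degree line (isosceles right triangle)"
    print(f"a = {a:4.2f} m/s^2 -> {shape}")
    print(np.round(np.column_stack([dx, dy]), 3))
```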

Da Vinci sought to describe that acceleration mathematically. It is here, according to the study's authors, that he didn't quite hit the mark. To explore da Vinci's process, the team used computer modeling to run his water vase experiment. Doing so revealed da Vinci's error.

"What we saw is that Leonardo wrestled with this, but he modeled it as the falling object's distance was proportional to 2 to the t power [with t representing time] instead proportional to t squared," Roh says. "It's wrong, but we later found out that he used this sort of wrong equation in the correct way." In his notes, da Vinci illustrated an object falling for up to four intervals of time—a period through which graphs of both types of equations line up closely.

"We don't know if da Vinci did further experiments or probed this question more deeply," Gharib says. "But the fact that he was grappling with this problem in this way—in the early 1500s—demonstrates just how far ahead his thinking was."

The paper is titled "Leonardo da Vinci's Visualization of Gravity as a Form of Acceleration."

University of North Florida wins Congressional appropriation for improved IT infrastructure

The University of North Florida's Information Technology Services recently received a congressionally directed appropriation of $750,000 to support a much-needed cloud supercomputing and cybersecurity infrastructure initiative.

The enhanced virtual cloud supercomputing infrastructure will address the need to create a scalable, secure learning and research environment for UNF students, faculty, and staff. The project will benefit both traditional and non-traditional students by providing a consistent set of tools, software, and security, and a level of interactivity with faculty, throughout their academic careers at UNF.

It will provide the UNF community with access to secure campus lab computers and systems from any location; enhance existing computer and research labs; allow university resources to be utilized anytime, anywhere, in a device-agnostic environment; and provide the tools students need to build real-world systems while providing faculty the ability to closely monitor and assist students both on and off campus.

The new virtual infrastructure will allow cyber security instruction, research, and operations in a controlled and scalable environment.  The environment will seamlessly allow remote access to software restricted to running on university-owned hardware. In addition, it will provide remote control of a dedicated machine for student assignments and projects.

Using this technology, UNF will be able to pool available computers to work together on a common problem, as in a supercomputing cluster or a Google-like indexing and search task. This environment and service will also spare students from continually upgrading their own computers to handle demanding assignments and projects: they will be able to draw on the shared software and hardware when classroom computers are not powerful enough.
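To make the pooled-computing idea concrete, here is a minimal map-reduce-style sketch in Python; the documents and task are invented for illustration, and nothing here describes UNF's actual environment:

```python
# Minimal map-reduce-style sketch: a pool of workers indexes documents in
# parallel, then partial results are merged, as in a Google-like indexing task.
# The documents and task are illustrative only, not UNF's actual workload.
from collections import Counter
from multiprocessing import Pool

DOCUMENTS = [
    "secure cloud labs for students",
    "cloud supercomputing for research",
    "students research cyber security",
]

def count_words(doc: str) -> Counter:
    """Map step: count word occurrences in a single document."""
    return Counter(doc.split())

if __name__ == "__main__":
    with Pool() as pool:                 # one worker per available CPU core
        partials = pool.map(count_words, DOCUMENTS)
    index = sum(partials, Counter())     # reduce step: merge the partial counts
    print(index.most_common(3))
```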

An aerial view of the Arctic tundra in Nunavut, Canada, from 2023. (Image credit: Getty Images)

New data gives NOAA a broader view of global climate

The update includes more Arctic data and a longer historical record

NOAA’s National Centers for Environmental Information (NCEI) is updating its current global climate dataset to provide more information about the Earth’s climate, while also extending the planet’s observed temperature record by 30 years.

The update to NCEI’s current NOAA Global Temperature dataset — one of the most visible and widely used datasets to assess global climate — will debut in the upcoming January 2023 global climate report to be released on February 14, 2023. This new global climate dataset will expand upon and replace the current one used since 2019.

A NOAA crew deploying an Argo float, which provides real-time climate data about the ocean. Credit: NOAA

"This new version of NOAA's global surface temperature dataset is part of NCEI’s commitment to providing a complete and comprehensive perspective of the Earth’s climate," said NCEI director Deke Arndt. “Regular updates to our datasets help us expand our understanding of our dynamic planet."

There are two significant additions in this update:

  • More data for the Arctic region are included, as well as new scientific methods for monitoring climate in other locations with limited climate data. 
  • Improved methods for analyzing NCEI’s archival land and ocean observations will add 30 more years to the world’s climate record, extending it back to 1850. 

NOAA’s Global Temperature dataset consists of data from weather stations across the world’s land surface, as well as ocean surface data from ships, buoys, surface drifters, profiling floats, and other uncrewed automatic systems. Until recently, however, monitoring environmental conditions around the Arctic and Antarctic has been more challenging due to fewer temperature observations in these regions. 

The updated version now includes data from more buoys from around the Arctic, along with enhanced methods of calculating temperatures in the Earth’s polar regions. 

The new version of NOAA’s Global Temperature dataset shows warming trends similar to those in the previous version, indicating that short- and long-term climate trends remain consistent across datasets.

This new information comes at a critical time in the Earth’s climate history. The Arctic is the fastest-warming region globally, warming at least three times faster than the globe as a whole. The top 10 warmest years on record for the globe have all occurred since 2010. The last nine years (2014–2022) have been the warmest on record.