Astronomers use a special technique to find stellar streams: they invert images, reversing light and dark like a photographic negative, and stretch them to bring out the faint streams. Color images of the nearby galaxies are scaled and superposed to show where the visible disks lie. These galaxies are surrounded by enormous halos of hot gas sprinkled with sporadic stars, which appear as the shadowy areas around each galaxy. NASA's upcoming Nancy Grace Roman Space Telescope is expected to improve on these observations by resolving individual stars, enabling a better understanding of each stream's stellar populations and allowing astronomers to spot stellar streams of various sizes in many more galaxies. Credit: Carlin et al. (2016), based on images from Martínez-Delgado et al. (2008, 2010)

NASA's Roman mission prepares to handle a massive amount of data

The Nancy Grace Roman Space Telescope (Roman) team is preparing for the deluge of data the mission will return by creating simulations, scouting the skies with other telescopes, calibrating Roman’s components, and more. Simulations will be used to test algorithms, estimate Roman’s scientific return, and fine-tune observing strategies so that the most can be learned about the universe. Roman will also identify interesting targets that observatories such as NASA’s James Webb Space Telescope can zoom in on for more detailed studies.

As part of a mission to uncover the mysteries of dark energy, scientists from around the world will work together to maximize the potential of the Roman telescope, which is expected to launch by May 2027. To ensure that scientists are equipped with the necessary tools, various teams and individuals will contribute their efforts to the cause. Julie McEnery, the senior project scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, said that the team is laying a foundation by harnessing the science community at large, with the goal of performing powerful scientific research right from the start. Simulations play a vital role in this preparation phase: scientists can use them to test algorithms, estimate Roman’s scientific returns, and fine-tune observation strategies.

The teams will sprinkle different cosmic phenomena through a simulated dataset and then run machine learning algorithms to see how well they can automatically find the phenomena. Given Roman’s enormous data collection rate, identifying underlying patterns quickly and efficiently will be crucial. During its five-year primary mission, the Roman telescope is expected to amass 20,000 terabytes (20 petabytes) of observations containing trillions of individual measurements of stars and galaxies.
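
The paragraph above describes an injection-and-recovery approach: planting synthetic signals in simulated data and checking whether automated algorithms can find them. The sketch below is only a toy illustration of that idea, not Roman's actual pipeline; the image size, source brightness, and detection threshold are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sky" image: Gaussian background noise (assumed size and units).
image = rng.normal(loc=100.0, scale=5.0, size=(512, 512))

# Inject synthetic point sources ("sprinkle phenomena") at known pixels.
n_injected = 50
xy = rng.integers(10, 502, size=(n_injected, 2))
image[xy[:, 0], xy[:, 1]] += 40.0  # assumed source amplitude, well above the noise

# Simple detector: flag pixels more than 5 sigma above the background level.
background, noise = np.median(image), image.std()
detections = np.argwhere(image > background + 5.0 * noise)

# Recovery fraction: how many injected sources the detector found (within 1 pixel).
found = sum(
    any(np.all(np.abs(det - src) <= 1) for det in detections) for src in xy
)
print(f"Recovered {found}/{n_injected} injected sources")
```

A real test would swap the thresholding step for the machine learning algorithm under evaluation and report completeness and false-positive rates as functions of source brightness.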

Preparing for the launch of the Roman Space Telescope is a complex process, as every observation made by the telescope will be used by multiple teams for different scientific purposes. Scientists will carry out preliminary observations using other telescopes such as the Hubble Space Telescope, the Keck Observatory, and PRIME. These observations will help to optimize Roman’s observations and better understand the data the mission will deliver.

Astronomers will explore ways to combine data from different observatories and use multiple telescopes in tandem. For instance, combining observations from PRIME and Roman would help astronomers learn more about objects detected via microlensing, in which the warping of space-time by an intervening object briefly magnifies a background star. Roman scientists will also use archived Hubble data to learn about the history of cosmic objects and identify interesting targets that telescopes such as the James Webb Space Telescope can study in detail.

Planning for each Roman science case will take many teams working in parallel. Scientists will need to consider everything required to study a particular class of object, from algorithms for detecting dim objects and techniques for measuring star positions precisely to understanding detector effects and developing effective strategies for imaging stellar streams.

One team is developing processing and analysis software for Roman’s Coronagraph Instrument, which will demonstrate several cutting-edge technologies that could help astronomers directly image planets beyond our solar system. They will simulate the different objects and planetary systems the Coronagraph could reveal, from dusty disks surrounding stars to cold, old worlds similar to Jupiter.

The mission’s science centers are getting ready to manage Roman’s data pipeline and establish systems for planning and executing observations. They will convene a survey definition team to determine Roman’s optimal observation plans in detail based on all the preparatory information generated by scientists and the interests of the broader astronomical community.

The team is excited to set the stage for Roman and ensure that each of its future observations will contribute to a wealth of scientific discoveries.

NASA’s DSOC is composed of a flight laser transceiver attached to Psyche and a ground system that will send and receive laser signals. Clockwise from top left: the Psyche spacecraft with DSOC attached, flight laser transceiver, downlink ground station at Palomar, and downlink detector.

NASA demos deep space optical communications

NASA's Deep Space Optical Communications (DSOC) experiment will showcase laser, or optical, communications from as far away as Mars. The technology involves equipment in space and on Earth, including a flight laser transceiver, two ground telescopes, and a high-power near-infrared laser transmitter. Despite the challenges of faint laser photon signals and a lag of over 20 minutes at the farthest distance, the experiment is a groundbreaking step toward transmitting data from deep space at much higher rates. DSOC will be launched on Oct. 12 as part of NASA's Psyche mission. It will pave the way for future missions to Mars by testing key technologies that could allow the transmission of denser science data and even streaming video from the Red Planet.

NASA is testing a new technology called DSOC, which uses lasers to increase data transmission rates from deep space. Until now, NASA has used only radio waves to communicate with missions that travel beyond the Moon. Optical communications, much like fiber optics replacing old telephone lines on Earth, promise much higher data rates throughout the solar system, with 10 to 100 times the capacity of the state-of-the-art radio systems spacecraft use today. This will help enable future human and robotic exploration missions and support higher-resolution science instruments.

The tech demo involves equipment both in space and on Earth. While NASA's Psyche spacecraft relies on traditional radio communications for mission operations, the DSOC flight laser transceiver, an experiment attached to the spacecraft, features both a near-infrared laser transmitter and a sensitive photon-counting camera. The transceiver is designed to send high-rate data to Earth and to receive a laser beam sent from Earth, but it is only one part of the technology demonstration.

Since there is no dedicated infrastructure on Earth for deep space optical communications, two ground telescopes have been updated to communicate with the flight laser transceiver. The Optical Communications Telescope Laboratory at JPL's Table Mountain Facility in Southern California has been fitted with a high-power near-infrared laser transmitter for the technology demonstration; it will deliver a modulated laser signal to DSOC's flight transceiver and serve as a beacon, or pointing reference, so the returning laser beam can be aimed accurately back at Earth. Meanwhile, the 200-inch (5.1-meter) Hale Telescope at Caltech's Palomar Observatory in San Diego County, California, has been equipped with a special superconducting, high-efficiency detector array to collect the data sent by the flight transceiver.

DSOC faces unique challenges as it aims to transmit data at high rates over distances of up to 240 million miles (390 million kilometers) during the first two years of Psyche's six-year journey to the asteroid belt. As Psyche travels farther from Earth, the laser photon signal weakens, making it increasingly difficult to decode the data, and the photons take longer to arrive, resulting in a lag of over 20 minutes at the tech demo's farthest distance. Because the positions of Earth and the spacecraft keep changing while the photons are in transit, the DSOC ground and flight systems must point to where the ground receiver (at Palomar) and the flight transceiver (on Psyche) will be when the photons arrive.
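
A quick back-of-envelope check of the lag quoted above, using the maximum demo distance given in the text (purely illustrative arithmetic):

```python
# One-way light travel time at DSOC's maximum demo distance.
distance_km = 390e6            # ~240 million miles, from the text
c_km_s = 299_792.458           # speed of light in km/s

t_seconds = distance_km / c_km_s
print(f"{t_seconds / 60:.1f} minutes one way")  # ~21.7 minutes, i.e. "over 20 minutes"
```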

Several advanced technologies must work together to ensure that the lasers are accurately pointed and that high-bandwidth data reaches Earth from deep space. Precise pointing of both the flight laser transceiver and the ground-based laser transmitter is crucial; it is comparable to hitting a dime from a mile away while the dime is moving. The transceiver must therefore be isolated from spacecraft vibrations that could nudge the laser beam off target. Initially, Psyche will orient the flight transceiver toward Earth, while autonomous systems on the transceiver, assisted by the uplink beacon laser from Table Mountain, control the fine pointing of the downlink laser signal toward Palomar Observatory.
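
For a rough sense of the pointing precision implied by the dime-from-a-mile comparison (illustrative arithmetic; a US dime is about 17.9 mm across):

```python
import math

dime_m = 0.0179                # diameter of a US dime, in meters
mile_m = 1609.344              # one mile, in meters

angle_rad = dime_m / mile_m    # small-angle approximation
print(f"{angle_rad * 1e6:.1f} microradians "
      f"(~{math.degrees(angle_rad) * 3600:.1f} arcseconds)")
# ~11 microradians, roughly 2 arcseconds
```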

JPL has developed a cryogenically cooled superconducting nanowire photon-counting array receiver, which is integrated into the Hale Telescope. The instrument's high-speed electronics record the arrival time of single photons so the signal can be decoded. The DSOC team has also developed new signal-processing techniques to extract information from the weak laser signals transmitted over distances of tens to hundreds of millions of miles.
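
The article does not detail the coding scheme, but photon-arrival-time decoding is often illustrated with pulse-position modulation (PPM), where the time slot containing a laser pulse carries the information. The sketch below is a toy illustration of that idea under assumed slot counts and durations, not DSOC's actual flight or ground software:

```python
import numpy as np

# Toy PPM decoder: each symbol frame is divided into M time slots, and the
# slot containing the laser pulse encodes log2(M) bits. A photon-counting
# receiver decodes a symbol by finding the slot with the most detected photons.

M = 16                 # slots per symbol frame (assumed value)
slot_ns = 0.5          # slot duration in nanoseconds (assumed value)
rng = np.random.default_rng(42)

def decode_frame(photon_times_ns: np.ndarray) -> int:
    """Return the index of the slot with the most photon arrivals."""
    counts, _ = np.histogram(photon_times_ns, bins=M, range=(0, M * slot_ns))
    return int(np.argmax(counts))

# Simulate one frame: signal photons cluster in slot 9, plus stray background counts.
true_slot = 9
signal = rng.uniform(true_slot * slot_ns, (true_slot + 1) * slot_ns, size=5)
background = rng.uniform(0, M * slot_ns, size=3)
print(decode_frame(np.concatenate([signal, background])))  # -> 9
```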

NASA has pursued optical communications projects for years with the aim of revolutionizing communication in space. In 2013, the Lunar Laser Communications Demonstration resulted in record-breaking uplink and downlink data rates between Earth and the Moon. In 2021, the Laser Communications Relay Demonstration launched to test high-bandwidth optical communications relay capabilities from geostationary orbit, enabling spacecraft to communicate with Earth even without a direct line of sight. And last year, NASA's TeraByte InfraRed Delivery (TBIRD) system achieved the highest-ever data rate from a satellite in low-Earth orbit to a ground-based receiver.

The latest project, called DSOC, is taking optical communications beyond the Moon, paving the way for high-bandwidth communications in deep space. This has the potential to lead to high-data-rate communications that can support streaming and high-definition imagery. This technology could be crucial in enabling humanity's next giant leap when NASA sends astronauts to Mars.

Energy consumption of AI could be equivalent to that of a small country

AI could have a large energy footprint in the future, potentially exceeding the power demands of some countries. Improvements in AI efficiency can end up increasing overall demand, an example of Jevons' Paradox. Google processes up to 9 billion searches a day, and if every search used AI, the company would need about 29.2 TWh of electricity a year, roughly Ireland's annual electricity consumption.

Artificial intelligence (AI) is believed to help coders code faster, make daily tasks less time-consuming, and improve driving safety. However, a recent commentary published in the journal Joule by the founder of Digiconomist suggests that the tool's massive adoption could produce a large energy footprint that, in the future, might exceed the power demands of some countries.

The author of the commentary, Alex de Vries, a Ph.D. candidate at Vrije Universiteit Amsterdam, states "Looking at the growing demand for AI services, it’s very likely that energy consumption related to AI will significantly increase in the coming years."

Generative AI, which produces text, images, or other data, has grown rapidly since 2022, with OpenAI’s ChatGPT among the most prominent examples. Training these AI tools requires large amounts of data and is an energy-intensive process. Hugging Face, an AI company based in New York, reported that its multilingual text-generating AI tool consumed about 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for a year.
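
A quick sanity check on that comparison (simple arithmetic; the per-home figure is the implied average, not a number from the commentary):

```python
training_mwh = 433   # reported training consumption
homes = 40           # "enough to power 40 average American homes for a year"

per_home_mwh = training_mwh / homes
print(f"{per_home_mwh:.1f} MWh per home per year")
# ~10.8 MWh, close to the roughly 10-11 MWh a typical US household uses annually
```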

Furthermore, AI's energy demand does not end with training. De Vries's analysis shows that when the tool generates data based on prompts, every text or image it produces uses a significant amount of computing power and energy. For example, ChatGPT may consume 564 MWh of electricity every day.

Companies worldwide are striving to improve the efficiency of AI hardware and software to make the technology less energy-intensive, but an increase in efficiency often leads to an increase in demand. According to de Vries, technological advances of this kind tend to produce a net increase in resource use, a phenomenon known as Jevons' Paradox.

Making these tools more efficient and accessible can open up more applications and allow more people to use them, de Vries says. For instance, Google has been incorporating generative AI into its email service and testing an AI-powered version of its search engine. The company currently processes up to 9 billion searches a day. Based on these figures, de Vries estimates that if every Google search used AI, it would require approximately 29.2 TWh of electricity per year, equivalent to the annual electricity consumption of Ireland.
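
The per-search energy implied by that 29.2 TWh figure can be recovered with simple arithmetic (an illustration of the estimate's structure, not a number quoted in the commentary):

```python
searches_per_day = 9e9   # "up to 9 billion searches a day"
annual_twh = 29.2        # de Vries's estimate if every search used AI

wh_per_search = (annual_twh * 1e12) / (searches_per_day * 365)
print(f"~{wh_per_search:.1f} Wh per AI-assisted search")  # ~8.9 Wh each
```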

However, de Vries notes that this extreme scenario is unlikely to occur in the short term because of the high cost of additional AI servers and bottlenecks in the AI server supply chain. Nevertheless, AI server production is projected to grow rapidly in the coming years. Based on those projections, de Vries estimates that worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually by 2027.

This amount is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina, and Sweden. Furthermore, improvements in AI efficiency could also enable developers to repurpose some computer processing chips for AI use, which could further increase AI-related electricity consumption.

“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy-intensive, so we don't want to put it in all kinds of things where we don’t actually need it,” de Vries says.

This schematic illustrates the most geophysically plausible explanation for the abundance of HSE metals present in the Earth’s mantle. During the long period of bombardment, impactors would strike the Earth and deliver materials. (a) Liquid metals would sink in the locally produced impact-generated magma ocean before percolating through the partially molten zone beneath. (b) Compression causes the metals in the molten zone to solidify and sink. (c) Then thermal convection mixes and redistributes the metal-impregnated mantle components over long geologic time frames.

New research proposes impact-driven mixing of mantle materials to explain the mantle's current composition, shedding light on Earth's precious metals

A new study has found a geophysically plausible scenario to explain the abundance of certain precious metals, including gold and platinum, in the Earth’s mantle.

Scientists hypothesize that early in Earth’s evolution, about 4.5 billion years ago, the Earth sustained an impact with a Mars-sized planet and the Moon formed from the debris that was ejected into an Earth-orbiting disk.

The study's simulations show how impact-driven mixing of mantle materials could have kept the metals from sinking completely into the Earth’s core, and how mantle convection could have redistributed those materials and retained HSEs in the mantle.

Dr. Simone Marchi from Southwest Research Institute collaborated on a recent study that found the first geophysically plausible scenario explaining the abundance of precious metals in the Earth's mantle, including gold and platinum. The simulations carried out by scientists suggest that an impact-driven mixing of mantle materials could prevent the metals from completely sinking into the Earth's core.

The Earth sustained an impact with a Mars-sized planet about 4.5 billion years ago, resulting in the formation of the Moon from the debris ejected into an Earth-orbiting disk. The so-called "late accretion" followed, during which planetesimals as large as our Moon impacted the Earth, delivering materials like highly "siderophile" elements (HSEs) - metals with a strong affinity for iron - that were integrated into the young Earth.

Previous simulations of impacts penetrating Earth's mantle showed that only a small fraction of a planetesimal's metallic core is available to be assimilated into Earth's mantle, while most of these metals, including HSEs, quickly drain down into the Earth's core. That raises the question: how did Earth end up with its precious metals? To explain the mix of metal and rock in the present-day mantle, the researchers developed new simulations.

The relative abundance of HSEs in the mantle points to delivery via impact after Earth's core had formed; however, retaining those elements in the mantle proved difficult to model - until now. The new simulation considered how a partially molten zone under a localized impact-generated magma ocean could have stalled the descent of planetesimal metals into Earth's core.

The researchers modeled mixing an impacting planetesimal with mantle materials in three flowing phases - solid silicate minerals, molten silicate magma, and liquid metal. The rapid dynamics of such a three-phase system, combined with the long-term mixing provided by convection in the mantle, allows HSEs from planetesimals to be retained in the mantle.

In this scenario, an impactor would crash into the Earth, creating a localized liquid magma ocean where heavy metals sink to the bottom. When metals reach the partially molten region beneath, the metal would quickly percolate through the melt and, after that, slowly sink toward the bottom of the mantle. During this process, the molten mantle solidifies, trapping the metal. That's when convection takes over, as heat from the Earth's core causes a very slow creeping motion of materials in the solid mantle, and the ensuing currents carry heat from the interior to the planet's surface.
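
The sinking of dense liquid metal through molten silicate described above is often approximated, in the simplest textbook treatment, by Stokes settling. The relation below is shown only as an illustration of why metal droplets sink quickly in a low-viscosity magma ocean; it is not the three-phase model used in the study, and the symbols are generic.

```latex
% Stokes settling speed of a liquid-metal droplet in molten silicate
% (textbook approximation, not the study's model).
% v: settling speed, r: droplet radius, g: gravity,
% \rho_{\mathrm{m}}, \rho_{\mathrm{s}}: metal and silicate melt densities, \mu: melt viscosity.
v \;=\; \frac{2}{9}\,\frac{\left(\rho_{\mathrm{m}} - \rho_{\mathrm{s}}\right)\, g\, r^{2}}{\mu}
```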

"Mantle convection refers to the process of rising hot mantle material and sinking colder material," lead author Dr. Jun Korenaga from Yale University said. "The mantle is almost entirely solid although, over long geologic time spans, it behaves as a ductile and highly viscous fluid, mixing and redistributing mantle materials, including HSEs accumulated from large collisions that took place billions of years ago."

A multi-wavelength view of the surroundings of the supermassive black hole Sgr A* (yellow X). Stars appear in red, dust in blue. Many of the young stars in the star cluster IRS13 are hidden by the dust or outshone by the bright stars. Credits: Florian Peißker / University of Cologne

German astrophysicists discover a turbulent fountain of youth with a remarkable formation history in the center of our galaxy

A team of researchers led by Dr. Florian Peißker at the University of Cologne’s Institute of Astrophysics has discovered an unexpectedly large number of young stars in the vicinity of a supermassive black hole. The study, titled ‘The Evaporating Massive Embedded Stellar Cluster IRS 13 Close to Sgr A*. Detection of a Rich Population of Dusty Objects in the IRS13 Cluster’ and published in The Astrophysical Journal, suggests that the star cluster IRS13, discovered over twenty years ago, migrated towards the supermassive black hole through a variety of processes, with its formation history being turbulent.

The researchers used a wide variety of data gathered over several decades with various telescopes to characterize the cluster members in detail. The stars are only a few hundred thousand years old, which is very young for stars in this environment. Given the region's high-energy radiation and strong tidal forces, such a large number of young stars in the direct vicinity of the supermassive black hole is unexpected.

In addition to the discovery of young stars, the researchers found water ice at the center of our galaxy for the first time. The James Webb Space Telescope (JWST) was used for this part of the study; the prism on board the telescope that was used to record a spectrum of the Galactic Center free of atmospheric interference was developed at the Institute of Astrophysics in the working group led by Professor Dr. Andreas Eckart, a co-author of the publication. The presence of water ice around very young stellar objects is another independent indicator of the young age of some stars near the black hole.

The researchers also found that IRS13 has a turbulent formation history and that it migrated towards the supermassive black hole through friction with the interstellar medium, collisions with other star clusters, or internal processes. At a certain distance, the cluster was captured by the gravity of the black hole. In this process, the dust surrounding the cluster may have formed a bow shock at the cluster's leading edge, much like the bow wave at the tip of a ship moving through water. The resulting increase in dust density stimulated further star formation, which explains why these young stars sit mostly at the top, or front, of the cluster.

According to Dr. Peißker, the analysis of IRS13 and the accompanying interpretation of the cluster is the first attempt to unravel a decades-old mystery about the unexpectedly young stars in the Galactic Center. Dr. Michal Zajaček, the second author of the study and a scientist at Masaryk University in Brno (Czech Republic), added that the star cluster IRS13 may be the key to unraveling the origin of the dense star population at the center of our galaxy. The researchers have gathered extensive evidence that very young stars in the immediate vicinity of the supermassive black hole may have formed in star clusters such as IRS13, and they have identified star populations of different ages in this cluster so close to the center of the Milky Way.

Further Information:
https://iopscience.iop.org/article/10.3847/1538-4357/acf6b5