Swiss astrophysicist takes a closer look at Jupiter’s origin

Researchers of the University of Zurich (UZH) and the National Centre of Competence in Research (NCCR) PlanetS have investigated Jupiter’s formation history in great detail. Their results suggest that the giant planet migrated far from its origin and collected large amounts of material on its journey.

Image caption: Ravit Helled, Professor of Theoretical Astrophysics at the University of Zurich and member of the NCCR PlanetS. Image: Jos Schmid

One of the most important open questions in planetary formation theory is the story of Jupiter’s origin. Using sophisticated supercomputer modeling, researchers of the University of Zurich (UZH) and the National Centre of Competence in Research (NCCR) PlanetS now shed new light on Jupiter’s formation history. Their results were published in the journal The Astrophysical Journal Letters.

A curious enrichment of heavy elements

When the Galileo spacecraft released a probe that parachuted into Jupiter’s atmosphere in 1995, it showed among other things that heavy elements (elements heavier than helium) are enriched there. At the same time, recent structure models of Jupiter that are based on gravity field measurements by the Juno spacecraft suggest that Jupiter’s interior is not uniform but has a complex structure.

“Since we now know that the interior of Jupiter is not fully mixed, we would expect the heavy elements to sit in a giant gas planet’s deep interior, as heavy elements are mostly accreted during the early stages of planetary formation”, begins study co-author Ravit Helled, Professor at the University of Zurich and member of the NCCR PlanetS. “Only in later stages, when the growing planet is sufficiently massive, can it effectively attract large amounts of light gases like hydrogen and helium. Finding a formation scenario of Jupiter that is consistent with the predicted interior structure as well as with the measured atmospheric enrichment is therefore challenging yet critical for our understanding of giant planets”, Helled says. Of the many theories that have so far been proposed, none could provide a satisfying answer.

A long migration

“Our idea was that Jupiter had collected these heavy elements in the late stages of its formation by migrating. In doing so, it would have moved through regions filled with so-called planetesimals – small planetary building blocks that are composed of heavy element materials – and accumulated them in its atmosphere”, study lead-author Sho Shibata, who is a postdoctoral researcher at the University of Zurich and a member of the NCCR PlanetS, explains.

Yet, migration by itself is no guarantee of accreting the necessary material. “Because of complex dynamical interactions, the migrating planet does not necessarily accrete the planetesimals in its path. In many cases, the planet actually scatters them instead – not unlike a shepherding dog scattering sheep”, Shibata points out. The team therefore had to run countless simulations to determine whether any migration pathways resulted in sufficient material accretion.

“What we found was that a sufficient number of planetesimals could be captured if Jupiter formed in the outer regions of the solar system – about four times further away from the Sun than where it is located now – and then migrated to its current position. In this scenario, it moved through a region where the conditions favored material accretion – an accretion sweet spot, as we call it”, Shibata reports.

A new era in planetary science

Combining the constraints from the Galileo probe and Juno data, the researchers have finally come up with a satisfying explanation. “This shows how complex giant gas planets are and how difficult it is to realistically reproduce their characteristics”, Ravit Helled points out.

“It took us a long time in planetary science to get to a stage where we can finally explore these details with updated theoretical models and numerical simulations. This helps us close gaps in our understanding not only of Jupiter and our solar system but also of the many observed giant planets orbiting far away stars”, Helled concludes.

Repeats are key to understanding humanity's genome

A team of researchers from institutions around the globe has finally filled in the gaps in the human reference genome, and repeated sections are more abundant and more important than previously realized

It was like a map of New York missing all of Manhattan. The human reference genome finally has all its blank spots filled in, and seeing everything we missed the first time around is both repetitive—and enlightening.

“We’re realizing there’s a lot of human variation out there,” says geneticist Rachel O’Neill, director of UConn’s Institute for Systems Genomics. And in a counterintuitive twist, the variation comes in large part from the repeats.

A significant amount of human genetic material turns out to be long, repetitive sections that occur over and over. Although every human has some repeats, not everyone has the same number of them. And the difference in the number of repeats is where most of the human genetic variation is found.

This insight—that the repeats are important—is one of many significant findings from the Telomere-to-Telomere (T2T) project, a globe-spanning collaboration of institutions that filled in the missing sections of the original human genome assembly. O’Neill is a principal investigator on that project and an author on four of the six T2T papers that were published in Science yesterday.

“It took the invention of new methods of DNA sequencing and computational analysis, and the dedication of a remarkable team of scientists, to complete the reading of the 8% of the human genome that was too complex and repetitive in its structure to be resolved 20 years ago. It was worth the wait — a rich array of surprising architectural features is revealed, with major consequences for understanding human evolution, variation, and biological function,” says Francis Collins, White House Science Advisor and former director of the National Institutes of Health.

As amazing as it was at the time, the original Human Genome Project left about 8% of the genome blank.  

“That’s the equivalent of an entire chromosome in human DNA,” O’Neill says. That last 8% includes numerous genes and repetitive areas. Most of the newly added DNA sequences were near the repetitive telomeres (long, trailing ends of each chromosome) and centromeres (dense middle sections of each chromosome).

The blanks were a result of the “short-read” technology the Human Genome Project used. It was the only technology for genome mapping available 20 years ago, and it could read only the equivalent of a few words of the genetic code at a time. For example, imagine a section of the genome consisting of the sentence “All work and no play makes Jack a dull boy,” repeated nine times in a row. The short-read technology would reveal only pieces of it, such as “All work”, “Jack a”, “makes Jack”, et cetera. The researchers pieced those short sections together to make the sentence “All work and no play makes Jack a dull boy,” but they had no way to know it was repeated nine times.
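The limitation can be made concrete with a toy sketch in Python (the sentence, the 12-character read length, and the function name are illustrative, not actual sequencing chemistry): once a tandem repeat is longer than any single read, the set of distinct short reads from two copies is identical to the set from nine copies, so no assembly built from those reads can recover the copy number.

```python
def short_reads(genome: str, read_len: int) -> set[str]:
    """Every distinct fixed-length window a short-read sequencer could report."""
    return {genome[i:i + read_len] for i in range(len(genome) - read_len + 1)}

sentence = "All work and no play makes Jack a dull boy. "

# Two tandem copies and nine tandem copies yield exactly the same
# set of 12-character reads, so short reads cannot count the repeats.
print(short_reads(sentence * 2, 12) == short_reads(sentence * 9, 12))  # True
```

A sufficiently long read, by contrast, spans the whole repeated region at once, so the nine copies are simply read off directly.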

The T2T project, however, had better tools. The new long-read DNA technology can read entire sentences, even paragraphs, at a time. So the researchers were able to see large chunks, or even full sections, of repeats.

“Generating a truly gapless sequence of the human genome is a major milestone. We would have loved to have done this 20 years ago, but the technology had to advance. This new reference is a truly solid foundation, without cracks, on which to understand human biology. There are no missing pieces!” says Bob Waterston, a biologist at the University of Washington who worked on the original Human Genome Project.

Many early-career researchers and trainees played pivotal roles in the T2T project. At UConn, Savannah Hoyt, Gabrielle Hartley, and Patrick Grady in Rachel O’Neill’s lab, and Luke Wojenski in Leighton Core’s lab, were deeply involved in the work. One of their major contributions was developing a compendium of the repeats in the genome. They found that the repetitive sections contained mobile elements, which are sections capable of jumping from one part of the genome to another (the classic example is the jumping genes that change the color of corn kernels, such as from red to white); viruses; and new repeats no one had identified before, including some that carry genes. Some of the giant repeats occur in 10, 20, or 30 back-to-back copies and contain genes that could account for much of human diversity. In the example sentence from earlier, imagine “Jack” is a gene. One person might have 5 copies of it. Another might have 25.
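The copy-number idea behind the “Jack” example can be sketched the same way (the sequences and names here are invented for illustration): once a complete, gapless sequence of each genome is in hand, comparing tandem repeat counts between two people reduces to a simple scan.

```python
def copy_number(genome: str, repeat_unit: str) -> int:
    """Count non-overlapping occurrences of a tandem repeat unit."""
    return genome.count(repeat_unit)

unit = "All work and no play makes Jack a dull boy. "  # imagine "Jack" is a gene
person_a = "flanking DNA " + unit * 5 + " flanking DNA"
person_b = "flanking DNA " + unit * 25 + " flanking DNA"

print(copy_number(person_a, unit), copy_number(person_b, unit))  # 5 25
```

With the gapped, short-read reference this comparison was impossible for exactly the reason illustrated earlier: the repeat count itself was the missing information.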

The T2T team got the first look at complete sequences for the central parts of every human chromosome. Called centromeres, they join the different arms of each X-shaped chromosome together. O’Neill’s team found that centromeres contain known mobile elements as well as new repeats. Much of the DNA in the centromere seems to be important for maintaining a cell’s genetic information through the generations. Centromeres are already known to play a role in DNA replication when cells reproduce, and if they shift their position in the chromosome significantly they can give rise to entirely new species. The complete, gapless centromere sequences constructed by the T2T project will allow a much more nuanced understanding of human centromeres and what they do. The next steps in human genetics research can use the complete T2T genome assembly as a jumping-off point to identify interesting areas of our DNA.

“The next phase of research will sequence the genomes of many different people to fully grasp human diversity, diseases, and our relationship to our closest relatives, the other primates,” O’Neill says.

New NV Energy Foundation grant supports wildfire preparedness in Nevada

As the climate warms, wildfires in the Sierra Nevada are happening at unprecedented sizes and intensities, threatening communities and resources throughout Nevada and California. For fire managers trying to understand and predict fire behavior, access to accurate information for decision-making has never been more important.

Image caption: Screenshot of a simulation of the Caldor Fire created with the weather-fire-smoke model. Green lines indicate wind direction, red and yellow areas indicate the fire perimeter, and the gray cloud represents smoke. Credit: Adam Kochanski/San Jose State University and Tim Brown/DRI.

A generous grant from the NV Energy Foundation will provide $150,000 to support DRI’s development of an advanced modeling tool, based on the Weather Research and Forecasting (WRF) model, that simulates weather, fire, and smoke for firefighting and prescribed fire operations. Forecasts and simulations produced by this model will be available to NV Energy’s fire mitigation team and other professionals from the prescribed fire and air quality communities in Nevada and California through the work of the California and Nevada Smoke and Air Committee (CANSAC).

“We are committed to protecting our customers and the environment from the increasing risks of natural disasters, which include wildfires,” said Doug Cannon, NV Energy's president and chief executive officer. “The NV Energy Foundation is proud to support DRI in the development of this technology that will help firefighters better assess fire risk and keep our communities safe.”

Funds from the new NV Energy Foundation grant will be used to expand the current high-performance supercomputer system that is used by CANSAC. The system will provide an interface where users such as prescribed fire managers can conduct simulations of fire spread and smoke behavior.

The model will allow for risk assessment of specific locations by modeling different burn scenarios, help meteorologists identify small-scale wind flows that could have adverse effects on fire spread and behavior, and provide critical air quality forecasts for wildfires or burn day decisions. Simulations can be run for near future forecasting (a few days out) or longer-term scenario modeling for projects that might occur a year or more into the future.

“This tool will be useful to wildfire-fighting operations as well as for prescribed fire planning, which is essential to getting some of our fire-adapted ecosystems back into balance,” said Tim Brown, Ph.D., director of DRI’s Western Regional Climate Center. “By supporting the development of this tool, the NV Energy Foundation is providing a great resource to fire managers in Nevada and California and helping to ensure the safety of firefighters and communities across these two states."

“With this generous grant, the NV Energy Foundation will play a key role in developing new technology that will be used to solve real-world problems in fire mitigation and fire safety,” said DRI President Kumud Acharya, Ph.D. “This project is an amazing example of how community organizations like NV Energy can partner with DRI scientists to develop solutions to the problems that face our society and environment.”