Technical snowmaking on the Gemsstock. (Photo: Valentin Luthiger)

Swiss scientists investigate whether snowmaking can secure skiing over the Christmas holidays

For many people in Switzerland, holidays in the snow are as much a part of the end of the year as Christmas trees and fireworks. As global warming progresses, however, white slopes are becoming increasingly rare. Researchers at the University of Basel have calculated how well one of Switzerland’s largest ski resorts can remain snow-reliable with technical snowmaking through the year 2100, and how much water that snow will require.

The future for ski sports in Switzerland looks anything but rosy – or rather white. Current climate models predict that there will be more precipitation in winter in the coming decades, but that more of it will fall as rain instead of snow. Despite this, one investor recently spent several million Swiss francs on expanding the Andermatt-Sedrun-Disentis ski resort. A short-sighted decision they will regret in the future?

A research team led by Dr. Erika Hiltbrunner from the Department of Environmental Sciences at the University of Basel has now calculated the extent to which this ski resort can maintain its economically important Christmas holidays and a ski season of at least 100 days with and without snowmaking. The team collected data on the aspect (compass orientation) of the slopes and on where, when, and with how much water snow is produced at the ski resort. They then applied the latest climate change scenarios (CH2018) in combination with the SkiSim 2.0 simulation software for projections of snow conditions with and without technical snowmaking. The results of their investigations were recently published in the International Journal of Biometeorology.

No guarantee of a white Christmas

According to the results, the use of technical snow can indeed guarantee a 100-day ski season – in the higher parts of the ski resort (at 1,800 meters and above), at least. But business is likely to be tight during the Christmas holidays in the coming decades, as the weather at that time of year, and in the weeks before, will often not be cold enough. In the scenario with unabated greenhouse gas emissions, the Sedrun region in particular will no longer be able to offer guaranteed snow over Christmas in the longer term. New snow guns may alleviate the situation to a certain extent, say the researchers, but will not resolve the issue completely.

“Many people don’t realize that you also need certain weather conditions for snowmaking,” explains Hiltbrunner. “It must not be too warm or too humid, otherwise there will not be enough evaporative cooling for the sprayed water to freeze in the air and come down as snow.” Warm air absorbs more moisture and so, as winters become warmer, it also gets increasingly difficult or impossible to produce snow technically. In other words: “Here, the laws of physics set clear limits for snowmaking.”
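The combination of temperature and humidity that matters here is the wet-bulb temperature: snow guns can typically only operate when it falls below roughly -2 °C (that threshold varies by equipment and is illustrative). A minimal sketch using Stull's empirical approximation, which assumes roughly sea-level pressure and moderate relative humidity:

```python
import math

def wet_bulb_stull(temp_c, rh_percent):
    """Approximate wet-bulb temperature (deg C) from air temperature (deg C)
    and relative humidity (%), using Stull's (2011) empirical fit.
    Assumes roughly sea-level pressure; RH should be between ~5% and 99%."""
    T, RH = temp_c, rh_percent
    return (T * math.atan(0.151977 * math.sqrt(RH + 8.313659))
            + math.atan(T + RH) - math.atan(RH - 1.676331)
            + 0.00391838 * RH ** 1.5 * math.atan(0.023101 * RH)
            - 4.686035)

# At the same air temperature of -1 deg C, dry air gives a wet-bulb
# temperature well below freezing (snowmaking possible), while very humid
# air keeps the wet bulb close to the air temperature (snowmaking marginal):
print(round(wet_bulb_stull(-1.0, 30.0), 1))   # dry air
print(round(wet_bulb_stull(-1.0, 95.0), 1))   # humid air
```

This is why warmer, moister winters shut down snowmaking even on days when the air temperature is still slightly below zero.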

540 million liters

The skiing will still go on, however, because technical snowmaking at least enables resort operators to keep the higher ski runs open for 100 consecutive days – even toward the end of the century, and even if climate change continues unabated. But there is a high price to be paid for this. The researchers’ calculations show that water consumption for snowmaking will increase significantly, by about 80% for the resort as a whole. In an average winter toward the end of the century, consumption would thus amount to about 540 million liters of water, compared with 300 million liters today.

But this increase in water demand is still relatively moderate compared with other ski resorts, the researchers emphasize. Earlier studies had shown that water consumption for snowmaking in the Scuol ski resort, for example, would increase by a factor of 2.4 to 5, because the area covered with snow there would have to be greatly expanded to guarantee snow reliability.

For their analysis, the researchers considered 30-year periods, which mask large year-to-year fluctuations. In addition, extreme events are not captured in the climate scenarios. In the snow-poor winter of 2017, for example, water consumption for snowmaking in one of the three sub-areas of Andermatt-Sedrun-Disentis tripled.

Conflicts over water use

Today, some of the water used for snowmaking in the largest sub-area of Andermatt-Sedrun-Disentis comes from the Oberalpsee. A maximum of 200 million liters may be withdrawn annually for this purpose. If climate change continues unabated, this source of water will last until the middle of the century, at which point new sources will have to be exploited. “The Oberalpsee is also used to produce hydroelectric power,” says Dr. Maria Vorkauf, lead author of the study, who now works at the Agroscope research station. “Here, we are likely to see a conflict between the water demands for the ski resort and those for hydropower generation.”

At first, this ski resort may even benefit from climate change – if lower-lying and smaller ski resorts are obliged to close, tourists will move to larger resorts at higher altitudes, one of which is Andermatt-Sedrun-Disentis.

What is certain is that increased snowmaking will drive up costs and thus also the price of ski holidays. “Sooner or later, people with average incomes will simply no longer be able to afford them,” says Hiltbrunner.

Pxhere.com

Dutch scientists use AI for predicting calving problems before insemination

A small percentage of cows will experience problems when calving, and breeders would like to know which cows are at risk. Using the vast dataset of the Dutch cattle breeding company CRV, computer scientists at the University of Groningen used artificial intelligence to develop a predictive model that could, in theory, halve the number of calving problems.

Cattle breeding is data science. Breeding firms provide semen from bulls and register the success of their offspring. Data on the milk yield of the cows and many other characteristics are collected and stored in a vast database, together with the genetic data from all the animals. This allows the companies to attribute an ‘estimated breeding value’ to the animals and find matches for optimal breeding.

Risk

One aspect of breeding is the birthing of calves. In about 3.3 percent of all cases, some kind of complication occurs during calving, which is referred to as dystocia. "This could range from the calf needing to be pulled to needing veterinary intervention," explains Ahmad Alsahaf. "There are models to predict the risk of dystocia, but these work with data only available after insemination. We wanted to produce a model that could predict the risk before insemination."

Alsahaf now works as a postdoctoral researcher at the Department of Biomedical Sciences of Cells & Systems of the University Medical Center Groningen. He developed the predictive model for dystocia during his Ph.D. project at the Intelligent Systems research group at the Bernoulli Institute for Mathematics, Computer Science, and Artificial Intelligence at the University of Groningen in the Netherlands.

Challenges

"We were asked to create this model for the cattle breeding company CRV and they gave us a large dataset comprising information on cows and bulls," says Alsahaf. "We first used a machine-learning system to analyze the data and create a provisional model. Then, we checked if the most important risk factors made sense. They did and, therefore, we proceeded to build a full model."

There were two main challenges: the first was to clean up and compile the available data. The second was that only 3.3 percent of pregnant cows experience dystocia. "This meant that there was a huge imbalance in our dataset," explains Alsahaf. To solve this, he created a large number of subsets with balanced data and aggregated those to train the predictive model. "Subsequently, we tested this model on a subset of the data that was not used for training and studied the results." It turned out that the model performed significantly better than chance.
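The article does not spell out the exact resampling scheme, but a common way to build balanced subsets from a heavily imbalanced dataset is to pair all minority cases with an equal-sized random draw from the majority class, train one model per subset, and aggregate their predictions. A minimal sketch on toy data (all shapes and the incidence rate are illustrative, mimicking the ~3.3% mentioned above):

```python
import numpy as np

rng = np.random.default_rng(0)

def balanced_subsets(X, y, n_subsets):
    """Yield balanced subsets of an imbalanced binary dataset (y: 1 = dystocia).
    Each subset keeps every minority case plus an equal-sized random draw,
    without replacement, from the majority class."""
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    for _ in range(n_subsets):
        sample = rng.choice(majority, size=minority.size, replace=False)
        idx = np.concatenate([minority, sample])
        yield X[idx], y[idx]

# Toy data: 3000 animals, 5 features, ~3.3% positive cases
X = rng.normal(size=(3000, 5))
y = (rng.random(3000) < 0.033).astype(int)

for X_sub, y_sub in balanced_subsets(X, y, n_subsets=3):
    assert y_sub.mean() == 0.5   # each subset is exactly balanced
```

In practice one would fit a classifier on each subset and average the predicted risks, so every minority example is used while no single model sees a 30:1 class imbalance.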

"A colleague of ours calculated that, under ideal circumstances, our model could roughly halve the risk of dystocia. But this requires an ideal combination of bull and cow, which is not always possible." Nevertheless, the model can help farmers and the breeding company to assess the risk of a particular mating before insemination. "This is important since, so far, all other models require information gathered after insemination, which means you are not really preventing complications."

Credit: Carlos Padilla

ALMA successfully restarts observations after cyberattack

Forty-eight days after suspending observations due to a cyberattack, the Atacama Large Millimeter/submillimeter Array (ALMA) is observing the sky again. The computing staff has worked diligently to rebuild the affected JAO computer system servers and services. This is a crucial milestone in the recovery process. 

On 29 October, ALMA suffered a cyberattack. The computing staff took immediate countermeasures to avoid loss and damage to scientific data and IT infrastructure. The attack affected various critical operational servers and computers. 

“The challenge was to securely restore all the communication and computer systems as quickly as possible. We established an aggressive plan that required coordination with the ALMA partnership worldwide,” explains Jorge Ibsen, Head of the ALMA Computing Department. “Thanks to the active engagement of everyone in the partnership worldwide, especially the Computing, Engineering, and Science Operations staff, and the cybersecurity experts from ESO, NAOJ, and NRAO, we managed to resume observations as planned.”

In the coming weeks, the focus will be on recovering testing infrastructure and systems like the ALMA website and other services, which will allow the recovery of all the functionalities existing before the cyberattack. 

ALMA Director Sean Dougherty celebrates: “It is fantastic to be back doing science observations once again! It has been an enormous challenge to rebuild our systems to return to observing securely. Thanks to everyone at the JAO and across the ALMA partnership for attaining this impressive milestone.”

Chinese researchers perform hydrodynamic simulations to explore the progenitor systems of type Ia supernovae

Ph.D. candidate CUI Yingzhen and Prof. MENG Xiangcun from the Yunnan Observatories of the Chinese Academy of Sciences (CAS) performed hydrodynamic simulations on the common-envelope wind model of type Ia supernovae (SNe Ia) and revealed the mass loss mechanism and the main observational features of white dwarf binaries in the common-envelope wind phase. 

The study was published in Astronomy & Astrophysics. 

SNe Ia are among the most energetic events in the Universe. They are used as cosmological distance indicators, which led to the discovery of the accelerating expansion of the Universe. 

One of the most popular progenitor models of SNe Ia is the single-degenerate model, in which a carbon-oxygen white dwarf accretes material from a non-degenerate companion star to increase its mass, and eventually undergoes a thermonuclear explosion. The problem with this model is that when the mass transfer rate exceeds a certain critical value, the accreted envelope of the white dwarf expands and eventually forms a common envelope around the binary system, which may prevent the occurrence of SNe Ia. 

The common-envelope wind model is a modified single-degenerate model that can in principle address the above-mentioned problem by suggesting a strong mass loss at the surface of the common envelope. However, it is not clear how the mass loss at the surface of the common envelope arises and what the observational characteristics of such systems are. 

The researchers carried out detailed hydrodynamic simulations of the common-envelope wind model and found that such systems are always dynamically unstable and consequently produce dramatic mass loss, resulting in an envelope mass of only a few thousandths of a solar mass. 

By analyzing the internal structure, they found that this instability was driven by ionization-recombination processes of hydrogen and helium in the envelope, the same mechanism that excites pulsations in classical Cepheids. In the Hertzsprung-Russell diagram, the center of the evolutionary trajectory of the common-envelope wind model also lies within the classical Cepheid instability strip, implying that such systems may appear as periodic variable stars. 

This result can provide theoretical guidance for the subsequent observational search for the progenitor system of SNe Ia. 

Brown researchers bypass the need for massive data sets by combining machine learning with active learning techniques

When it comes to predicting disasters brought on by extreme events (think earthquakes, pandemics, or “rogue waves” that could destroy coastal structures), computational modeling faces an almost insurmountable challenge: Statistically speaking, these events are so rare that there is just not enough data on them to use predictive models to accurately forecast when they’ll happen next.

But a team of researchers from Brown University and Massachusetts Institute of Technology says it doesn’t have to be that way.

In a new study, the scientists describe how they combined statistical algorithms — which need fewer data to make accurate, efficient predictions — with a powerful machine learning technique developed at Brown and trained it to predict scenarios, probabilities, and sometimes even the timeline of rare events despite the lack of a historical record on them.

Doing so, the research team found that this new framework can provide a way to circumvent the need for massive amounts of data that are traditionally needed for these kinds of computations, instead essentially boiling down the grand challenge of predicting rare events to a matter of quality over quantity.

“You have to realize that these are stochastic events,” said George Karniadakis, a professor of applied mathematics and engineering at Brown and a study author. “An outburst of a pandemic like COVID-19, environmental disaster in the Gulf of Mexico, an earthquake, huge wildfires in California, a 30-meter wave that capsizes a ship — these are rare events and because they are rare, we don't have a lot of historical data. We don't have enough samples from the past to predict them further into the future. The question that we tackle in the paper is: What is the best possible data that we can use to minimize the number of data points we need?”

The researchers found the answer in a sequential sampling technique called active learning. These types of statistical algorithms are not only able to analyze data input into them, but more importantly, they can learn from the information to label new relevant data points that are equally or even more important to the outcome that’s being calculated. At the most basic level, they allow more to be done with less.
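The study itself uses acquisition criteria tailored to rare, extreme outcomes, but the generic active learning loop it builds on can be sketched with simple uncertainty sampling: fit a cheap surrogate ensemble, query the expensive model only where the ensemble disagrees most, and repeat. Everything below (the simulator, the polynomial surrogates, the budget) is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_simulator(x):
    """Stand-in for a costly model run (hypothetical test function)."""
    return np.sin(3 * x) + 0.5 * x

# Candidate pool of possible inputs and a small initial design
pool = np.linspace(0.0, 2.0, 200)
train_x = list(rng.choice(pool, size=4, replace=False))
train_y = [expensive_simulator(x) for x in train_x]

for _ in range(10):
    # Surrogate ensemble: small polynomial fits on bootstrapped data
    preds = []
    for _ in range(20):
        idx = rng.integers(0, len(train_x), len(train_x))
        coeffs = np.polyfit(np.array(train_x)[idx], np.array(train_y)[idx], 2)
        preds.append(np.polyval(coeffs, pool))
    # Acquisition step: label the point where the ensemble disagrees most
    uncertainty = np.std(preds, axis=0)
    x_next = pool[np.argmax(uncertainty)]
    train_x.append(x_next)
    train_y.append(expensive_simulator(x_next))

print(len(train_x))   # 14 labeled points instead of 200
```

The rare-event version replaces the plain disagreement criterion with one that also weights candidate points by how extreme their predicted outcome is, which is what lets the framework target precursors of disasters rather than average behavior.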

That’s critical to the machine learning model the researchers used in the study. Called DeepONet, the model is a type of artificial neural network, which uses interconnected nodes in successive layers that roughly mimic the connections made by neurons in the human brain. DeepONet is known as a deep neural operator. It’s more advanced and powerful than typical artificial neural networks because it’s actually two neural networks in one, processing data in two parallel networks. This allows it to analyze giant sets of data and scenarios at breakneck speed to spit out equally massive sets of probabilities once it learns what it’s looking for.
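The two parallel networks are conventionally called the branch net, which encodes the input function sampled at fixed sensor points, and the trunk net, which encodes the location where the output is evaluated; their embeddings are combined with a dot product. A minimal untrained NumPy sketch of that structure (all layer sizes and the input function are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp(sizes):
    """Random (untrained) weights for a small fully connected network."""
    return [(rng.normal(scale=0.1, size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:   # tanh on hidden layers only
            x = np.tanh(x)
    return x

# DeepONet: G(u)(y) is approximated by branch(u) . trunk(y)
m, p = 50, 20                     # number of sensors, embedding width
branch = mlp([m, 64, p])          # sees the input function u at m sensors
trunk = mlp([1, 64, p])           # sees the coordinate y of the query

u = np.sin(np.linspace(0, np.pi, m))    # one input function, sampled
y = np.array([[0.25]])                  # one query location
prediction = forward(branch, u[None, :]) @ forward(trunk, y).T
print(prediction.shape)                 # one scalar output per (u, y) pair
```

Training adjusts both networks jointly so that this dot product matches the true operator output at many (function, location) pairs.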

The bottleneck with this powerful tool, especially as it relates to rare events, is that deep neural operators need tons of data to be trained to make calculations that are effective and accurate.

In the paper, the research team shows that, combined with active learning techniques, the DeepONet model can be trained on what parameters or precursors to look for that lead up to the disastrous event someone is analyzing, even when there are not many data points.

“The thrust is not to take every possible data and put it into the system, but to proactively look for events that will signify the rare events,” Karniadakis said. “We may not have many examples of the real event, but we may have those precursors. Through mathematics, we identify them, which together with real events will help us to train this data-hungry operator.”

In the paper, the researchers apply the approach to pinpointing parameters and different ranges of probabilities for dangerous spikes during a pandemic, finding and predicting rogue waves, and estimating when a ship will crack in half due to stress. For example, with rogue waves — ones that are greater than twice the size of surrounding waves — the researchers found they could discover and quantify when rogue waves will form by looking at probable wave conditions that nonlinearly interact over time, leading to waves sometimes three times their original size.

The researchers found their new method outperformed more traditional modeling efforts, and they believe it presents a framework that can efficiently discover and predict all kinds of rare events.

In the paper, the research team outlines how scientists should design future experiments so that they can minimize costs and increase forecasting accuracy. Karniadakis, for example, is already working with environmental scientists to use the novel method to forecast climate events, such as hurricanes.

The study was led by Ethan Pickering and Themistoklis Sapsis from MIT. DeepONet was introduced in 2019 by Karniadakis and other Brown researchers. They are currently seeking a patent for the technology. The study was supported with funding from the Defense Advanced Research Projects Agency, the Air Force Research Laboratory, and the Office of Naval Research.