ACADEMIA
Saving substantial supercomputing capacity with a new algorithm
The control of modern infrastructure such as intelligent power grids requires substantial supercomputing capacity. Scientists at the Interdisciplinary Centre for Security, Reliability and Trust (SnT) at the University of Luxembourg have developed an algorithm that could revolutionise these processes. With their new software, the SnT researchers can forgo considerable amounts of supercomputing capacity, enabling what they call micro mining. Their work, which the team headed by Prof. Yves Le Traon presented at the International Conference on Software Engineering and Knowledge Engineering, earned the scientists a Best Paper Award at the event.
Modern infrastructure – from telephone networks and alarm systems to power supply systems – is controlled by supercomputer programmes. This intelligent software continuously monitors the state of the equipment, adjusts system parameters if they deviate, or generates error messages. To monitor the equipment, the software compares its current state with its past state by continuously measuring the status quo, accumulating the data, and analysing it. That consumes a considerable portion of the available supercomputing capacity. Thanks to their new algorithm, the SnT researchers' software no longer has to continuously analyse the state of the monitored system the way established techniques do. Instead, when analysing the system, it moves seamlessly between state values that were measured at different points in time.
"In particular the operation of distributed installations such as power grids of today will benefit from our programme", says Dr. François Fouquet, managing the project at SnT with Dr. Jacques Klein: "In these smart grids, as they are referred to, many smaller individual components like solar cells, rectifiers, and other components must be monitored and controlled. For the investment and operating costs to remain economically acceptable, they have to be equipped with small, simple control units." These kinds of small embedded microprocessors cannot continuously measure the system states, store the data, and evaluate it in real-time.
Thomas Hartmann, who is completing his doctoral dissertation as part of the project, explains the new SnT approach: "Our software stores only the changes of the system state at specific points in time. To evaluate the current situation in the network correctly, our algorithm automatically identifies suitable measurement values from the past. It pulls the right measurement values from the archive to carry out a correct analysis of the current state – essentially jumping back and forth in time. That translates into an enormous reduction in computing overhead, and thus an increase in computing efficiency for the same standard of security and dependability."
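The principle can be illustrated with a minimal sketch – ours, not the SnT implementation, whose internals the article does not describe. Each signal keeps a history that records a value only when it actually changes, and the state of the system at any point in time is resolved by looking up, per signal, the most recent value recorded at or before that instant. The names ChangeLog, record, value_at and state_at are hypothetical.

```python
# Minimal sketch of change-only storage with time-indexed lookup.
# Not the SnT algorithm itself; an illustration of the idea described above.
from bisect import bisect_right
from collections import defaultdict


class ChangeLog:
    """Stores a measurement only when a signal's value changes."""

    def __init__(self):
        # signal name -> parallel lists of timestamps and values, kept in time order
        self._times = defaultdict(list)
        self._values = defaultdict(list)

    def record(self, signal, timestamp, value):
        values = self._values[signal]
        if values and values[-1] == value:
            return  # value unchanged: nothing is stored, saving memory and processing
        self._times[signal].append(timestamp)
        values.append(value)

    def value_at(self, signal, timestamp):
        """Return the last value recorded at or before `timestamp` ("jumping back in time")."""
        idx = bisect_right(self._times[signal], timestamp)
        return self._values[signal][idx - 1] if idx else None

    def state_at(self, timestamp):
        """Reconstruct the full system state at an arbitrary point in time."""
        return {signal: self.value_at(signal, timestamp) for signal in self._times}


# Usage: only three of the six voltage readings are stored,
# yet the state at any instant can still be reconstructed.
log = ChangeLog()
for t, v in [(0, 230), (1, 230), (2, 231), (3, 231), (4, 231), (5, 229)]:
    log.record("voltage", t, v)
print(log.state_at(3.5))  # {'voltage': 231}
```

In this toy example the archive holds half as many entries as a continuous recording would, while any past or present state remains recoverable – the kind of reduction in overhead the researchers describe.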
The researchers next want to field-test their process. As in the first part of the project, they are collaborating with Creos, the Luxembourg power grid operator and a participant in the SnT Partnership Program. "Thanks to this collaboration, our research has always remained in step with corporate realities", says Prof. Yves Le Traon: "We hope our fundamental development work will trigger a leap in smart grid technology."