Image Source: FreeImages

Discovering a real-world model of rogue waves with machine learning

Researchers have successfully developed a practical model for predicting rogue waves in real-world ocean settings using machine learning algorithms. This development could have significant implications for the safety of seafarers and coastal communities.

Rogue waves, also known as monster waves, have threatened ships and offshore structures for centuries. These waves can reach heights of up to 26 meters and have long been the stuff of sailors' legends. Although the first rogue wave was measured by digital instruments in 1995, scientific understanding of these waves was, until recently, limited to anecdotal evidence and sailors' tales.

In a groundbreaking study, researchers at the University of Copenhagen's Niels Bohr Institute have utilized artificial intelligence (AI) methods to discover a mathematical model that predicts the occurrence of rogue waves. By analyzing vast amounts of ocean movement data, they have been able to determine the likelihood of encountering a monster wave at sea. This discovery is essential for the shipping industry as it provides a tool to assess the risk of encountering dangerous waves and allows for the selection of safer routes.

Rogue waves have always been a topic of fascination for scientists and sailors alike. These enormous waves, seemingly appearing out of nowhere, pose a grave danger to ships and offshore platforms. With the advent of digital instruments and AI technology, researchers have been able to shed light on the causes and characteristics of rogue waves.

Artificial intelligence played a crucial role in unraveling the mysteries of rogue waves in the study conducted by researchers from the Niels Bohr Institute. Using various AI methods, including symbolic regression, the researchers distilled data on more than a billion waves into a mathematical model. Unlike typical machine learning models, which output a single prediction for each input, symbolic regression produces an explicit equation: in effect, a recipe for a rogue wave.
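
The idea behind symbolic regression can be illustrated with a toy example: enumerate a small space of candidate formulas and keep whichever best reproduces observed data. The expression grammar, coefficients, and data below are invented for illustration; the study's actual method searched vastly richer sea-state features and expression spaces.

```python
import itertools

# Toy symbolic regression: enumerate candidate formulas and coefficient
# pairs, and keep whichever best fits the data. Grammar and data are
# illustrative assumptions, not the study's setup.

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]  # "observations" secretly generated by y = 2x + 1

candidates = [
    ("a*x + b",     lambda x, a, b: a * x + b),
    ("a*x*x + b",   lambda x, a, b: a * x * x + b),
    ("a*x*x + b*x", lambda x, a, b: a * x * x + b * x),
]
coeffs = [-2.0, -1.0, 0.0, 1.0, 2.0]

def mse(f, a, b):
    # Mean squared error of a candidate formula against the observations.
    return sum((f(x, a, b) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

best = min(
    ((name, a, b, mse(f, a, b))
     for name, f in candidates
     for a, b in itertools.product(coeffs, repeat=2)),
    key=lambda item: item[3],
)
print(best)  # recovers the form "a*x + b" with a=2, b=1 and zero error
```

The payoff is the same as in the study, in miniature: the search returns a human-readable equation rather than an opaque prediction.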

Researchers from the Niels Bohr Institute have discovered a mathematical model that predicts the occurrence of rogue waves. To develop it, the team, led by Dion Häfner, combined a vast amount of data on ocean movements and sea states: wave measurements from buoys at 158 different locations, amounting to more than 700 years' worth of wave height and sea state information. This dataset, consisting of more than a billion waves, allowed the machine learning algorithms to identify the variables that contribute to the formation of these extreme waves.

The researchers' study revealed that rogue waves are not as rare as previously believed. Their dataset registered over 100,000 waves that met the criteria for rogue waves, which corresponds to roughly one monster wave occurring somewhere in the ocean every day. Not all of them are of extreme size, however: many only just clear the conventional threshold of twice the height of the surrounding waves.
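
A common operational criterion flags a wave as "rogue" when its height exceeds twice the significant wave height, the mean of the highest third of waves in a record. A minimal screening along those lines, using made-up buoy measurements:

```python
# Flag rogue waves using the conventional criterion H > 2 * Hs, where Hs
# (significant wave height) is the mean of the highest third of waves.
# The wave record below is invented for illustration.

def significant_wave_height(heights):
    top_third = sorted(heights, reverse=True)[: max(1, len(heights) // 3)]
    return sum(top_third) / len(top_third)

def rogue_waves(heights):
    hs = significant_wave_height(heights)
    return [h for h in heights if h > 2 * hs]

record = [1.1, 0.9, 1.4, 1.0, 1.2, 0.8, 1.3, 5.6, 1.0]  # metres
print(round(significant_wave_height(record), 2), rogue_waves(record))  # 2.77 [5.6]
```

Applied to 700 years' worth of buoy records, this kind of screening is what turns a billion raw waves into a catalogue of rogue events.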

The most dominant factor in the formation of rogue waves is a phenomenon known as "linear superposition." When two wave systems cross, their elevations add, and for a brief period they can reinforce one another, increasing the chance of high crests and deep troughs and giving rise to extremely large waves. This finding contradicts the long-held belief that rogue waves are driven mainly by nonlinear effects, in which one wave briefly steals energy from its neighbors and grows to an abnormal size.
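
Linear superposition is easy to demonstrate numerically: when two wave trains cross, their surface elevations simply add, so for a brief interval the combined crest approaches the sum of the two amplitudes. The amplitudes and frequencies below are arbitrary illustrative values.

```python
import math

# Two wave trains with slightly different frequencies passing the same
# point: by linear superposition their elevations add, so the combined
# crest briefly nears a1 + a2, far higher than either train alone.

a1, a2 = 1.0, 0.8    # amplitudes (m), illustrative
w1, w2 = 1.00, 1.15  # angular frequencies (rad/s), illustrative

times = [0.01 * n for n in range(20_000)]  # 200 s at 10 ms resolution
combined = [a1 * math.sin(w1 * t) + a2 * math.sin(w2 * t) for t in times]

peak = max(combined)
print(round(peak, 2))  # close to a1 + a2 = 1.8, though each train peaks at <= 1.0
```

No nonlinearity is needed for this spike: the two trains drift in and out of phase, and the rare moments of alignment are exactly the crossings the model identifies as risky.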

The discovery of the mathematical model has significant implications for the shipping industry. With approximately 50,000 cargo ships sailing worldwide at any given time, encountering a monster wave is a constant concern. By utilizing the researchers' algorithm, shipping companies can assess the risk of encountering dangerous waves and plan alternative routes accordingly. This newfound ability to predict the occurrence of rogue waves will undoubtedly enhance safety in maritime transportation.

The researchers have made both their algorithm and research publicly available, along with the weather and wave data they used. This accessibility allows interested parties, such as public authorities and weather services, to calculate the probability of encountering rogue waves easily. The algorithm also exposes its intermediate calculations, making its reasoning transparent and relatable to humans, a significant step towards bridging the gap between AI and human understanding.

In conclusion, the discovery of the mathematical model that predicts the occurrence of rogue waves is a significant milestone in understanding and mitigating the risks associated with these extreme ocean phenomena. The researchers from the Niels Bohr Institute have harnessed the power of AI to analyze an enormous dataset and identify the causal variables that contribute to the formation of rogue waves. This newfound knowledge will undoubtedly enhance safety in the shipping industry and contribute to a better understanding of the physics behind these awe-inspiring natural phenomena.

Image Credit: Tsunetomo Yamada from TUS

Japanese researchers accelerate the phase identification of multiphase mixtures with deep learning

Crystalline materials are crucial components in various industries such as semiconductors, pharmaceuticals, photovoltaics, and catalysts. These materials have an ordered, three-dimensional structure made up of atoms, ions, or molecules. As scientists continue to design novel materials to address emerging challenges, the need for precise identification methods becomes increasingly essential. Powder X-ray diffraction is the most widely used method to identify the structure of crystalline materials. However, accurately identifying different types of crystals in multiphase samples can be complex and time-consuming.

To expedite the phase identification process, researchers have turned to innovative data-driven methods, such as machine learning. While substantial progress has been made in utilizing machine learning for known phases, identifying unknown phases in multiphase samples remains a challenge. In a recent study published in the Advanced Science journal, researchers from Tokyo University of Science, National Defense Academy, National Institute for Materials Science, Tohoku University, and The Institute of Statistical Mathematics proposed a deep learning model that can detect a previously unknown quasicrystalline phase present in multiphase crystalline samples.

The Role of Deep Learning in Phase Identification

Deep learning is a subset of machine learning that involves training artificial neural networks with multiple layers to learn patterns and make predictions. In the context of phase identification, deep learning can be used to analyze X-ray diffraction patterns and distinguish different phases within multiphase samples. This approach offers the potential to significantly reduce the time and effort required for accurate identification.

Developing the Deep Learning Model

To develop their deep learning model, the researchers created a "binary classifier" from an ensemble of 80 convolutional neural networks. They trained the model using synthetic multiphase X-ray diffraction patterns that represented the expected patterns associated with the icosahedral quasicrystal (i-QC) phase. The model's performance was assessed using both synthetic patterns and a database of actual patterns. Remarkably, the model achieved a prediction accuracy of over 92%.
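
The ensemble idea can be sketched as follows: many independently trained binary classifiers each score a diffraction pattern, and their pooled vote decides whether the target phase is present. The threshold-rule "classifiers" below are hypothetical stand-ins for trained CNNs, used only to show the voting mechanics.

```python
import random

# Schematic ensemble voting: 80 binary "classifiers" each inspect a 1D
# diffraction pattern; the averaged vote decides whether the target phase
# is present. The threshold rules are stand-ins, not real trained CNNs.

random.seed(0)  # reproducible stand-in "training"

def make_classifier(threshold):
    # Stand-in for one trained CNN: votes "present" if the pattern's
    # strongest peak exceeds its learned intensity threshold.
    return lambda pattern: max(pattern) > threshold

# 80 members with slightly different learned thresholds.
ensemble = [make_classifier(0.5 + random.uniform(-0.05, 0.05)) for _ in range(80)]

def predict(pattern, voters):
    votes = sum(1 for clf in voters if clf(pattern))
    return votes / len(voters)  # fraction voting "phase present"

strong = [0.1, 0.2, 0.9, 0.3]  # clear candidate peak
weak = [0.1, 0.2, 0.3, 0.2]    # no candidate peak

print(predict(strong, ensemble))  # 1.0
print(predict(weak, ensemble))    # 0.0
```

Pooling many imperfect classifiers dampens the idiosyncratic errors of any single one, which is the usual motivation for ensembling in classification tasks like this.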

Successful Identification of Unknown Phases

The researchers tested their deep learning model on multiphase Al-Si-Ru alloys, which contained an unknown i-QC phase. The model successfully identified the presence of the unknown i-QC phase when screening 440 measured diffraction patterns from unknown materials in six different alloy systems. The presence of the i-QC phase was further confirmed through microstructure and composition analysis using transmission electron microscopy. Importantly, the model was able to identify the i-QC phase even when it was not the most prominent component in the mixture.

The proposed deep learning model is not limited to i-QC phases: it can potentially also identify decagonal and dodecagonal quasicrystals (QCs), and it can be applied to a wide range of crystalline materials. This versatility makes the model useful for accelerating the phase identification of multiphase samples across many industries.

This deep learning model is a significant breakthrough for materials science. It enables researchers to identify unknown quasicrystalline phases efficiently and to explore the potential applications of these materials, opening a path to new materials with enhanced properties for energy storage, carbon capture, and advanced electronics.

The success of the deep learning model in identifying unknown phases showcases the potential of artificial intelligence and machine learning in accelerating scientific research. As researchers refine and expand upon these methods, the identification of complex multiphase mixtures will become faster and more accurate. This approach will streamline materials development processes and pave the way for discoveries and advancements in various scientific fields.

In summary, the development of a deep learning model for the rapid identification of unknown quasicrystalline phases in multiphase samples is a significant milestone in materials science. By harnessing the power of artificial intelligence and machine learning, researchers have an innovative approach to overcome the challenges associated with complex phase identification. This approach has the potential to revolutionize industries reliant on crystalline materials, leading to the discovery of new and improved materials for a wide range of applications. As the field of deep learning continues to advance, we can expect further breakthroughs in phase identification and materials research.

The need for improvement in energy policy simulations

The widespread adoption of nuclear power was predicted by supercomputer simulations over forty years ago. However, the fact that we still rely heavily on fossil fuels for energy today suggests that these simulations require improvement. To assess the efficacy of current energy policies, a team of researchers recently examined an influential model from the 1980s that projected a dramatic increase in nuclear power usage. Their findings revealed that the simulations used to inform energy policy often incorporate unreliable assumptions and lack transparency about their limitations. This article will delve into the importance of improving energy policy simulations, the need for transparency, and potential ways to enhance these models.

The Role of Energy Policies

Energy policies play a crucial role in shaping how we produce and consume energy. They have wide-ranging impacts on various aspects such as job creation, cost management, climate change, and national security. These policies are formulated based on simulations, also known as mathematical models, which forecast variables like electricity demand and technology costs. However, it is essential to recognize that these forecasts may not always be accurate or comprehensive. The recent study published in the journal Risk Analysis highlights the unreliable assumptions inherent in energy policy simulations and emphasizes the necessity for greater transparency and understanding of their limitations.

Unreliable Assumptions in Energy Policy Simulations

The research team discovered that the simulations used to inform energy policy often incorporate unreliable assumptions. These assumptions can significantly impact the accuracy of the forecasts and potentially lead to flawed decision-making. Therefore, it is imperative to identify and address these limitations to improve the reliability of energy policy models. One potential solution proposed by the researchers is the implementation of sensitivity auditing, a method that evaluates the assumptions made in the models. By subjecting the simulations to rigorous scrutiny, policymakers can gain a better understanding of the uncertainties associated with these models.

Importance of Transparency in Energy Policy Modeling

Transparency is key when it comes to energy policy modeling. By openly acknowledging the limitations and uncertainties of these simulations, a more informed and democratic debate can take place. Energy policy affects everyone, and decisions based on flawed models can have far-reaching consequences. Therefore, it is crucial for policymakers to be upfront and transparent about the assumptions and uncertainties inherent in these models. This transparency will enable stakeholders to make more informed decisions and contribute to the improvement of energy policy modeling.

Enhancing Energy Policy Simulations

To enhance energy policy simulations, the research team recommends implementing new methodologies and practices. Sensitivity auditing, as mentioned earlier, is one such approach that evaluates the assumptions made in the models. By subjecting these assumptions to rigorous testing, policymakers can gain a better understanding of their impact on the overall projections. This method can help identify potential shortcomings and improve the accuracy of the simulations.
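
A minimal flavor of such testing is a one-at-a-time sensitivity check: perturb each assumption and see how far the projection moves. The toy compound-growth demand model and numbers below are invented for illustration and are not any real policy model.

```python
# One-at-a-time sensitivity check on a toy 20-year electricity-demand
# projection: nudge each assumption by +/-10% and report how far the
# forecast moves. Model and numbers are illustrative only.

base = {"demand_now": 100.0, "growth_rate": 0.02, "years": 20}

def forecast(p):
    # Simple compound-growth projection of demand.
    return p["demand_now"] * (1 + p["growth_rate"]) ** p["years"]

baseline = forecast(base)

for name in ("demand_now", "growth_rate"):
    for factor in (0.9, 1.1):
        tweaked = dict(base)
        tweaked[name] = base[name] * factor
        swing = (forecast(tweaked) - baseline) / baseline * 100
        print(f"{name} x{factor}: {swing:+.1f}% change in forecast")
```

Even this crude check reveals which assumptions the projection is most fragile to; a full sensitivity audit goes further and questions whether the model's structure and framing are appropriate at all.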

In addition to sensitivity auditing, the researchers propose exploring new ways to test and validate energy policy simulations. By incorporating real-world data and conducting sensitivity analyses, policymakers can gain a more realistic understanding of the uncertainties associated with these models. This approach can lead to more robust and reliable energy policy decisions.

The Politics of Modeling

The implications of improving energy policy simulations extend beyond the field of energy. In a chapter of a forthcoming book titled "The Politics of Modeling," the lead author, Dr. Samuele Lo Piano, discusses the broader significance of this research. The chapter explores the complexities and uncertainties posed by human-caused socio-economic and environmental changes. It presents four real-world applications of sensitivity auditing in various fields, including public health, education, human-water systems, and food provision systems. These examples highlight the applicability of sensitivity auditing in different domains and emphasize the need for improved modeling practices across multiple sectors.

Conclusion

The reliance on fossil fuels for energy despite the predictions made by earlier simulations indicates a need for improvement in energy policy modeling. The recent study emphasizing the unreliable assumptions and lack of transparency in these models highlights the importance of addressing these limitations. By implementing sensitivity auditing and exploring new methods for testing and validating simulations, policymakers can enhance the accuracy and reliability of energy policy decisions. Transparency about the uncertainties associated with these models is crucial for informed and democratic debates on energy policy. Ultimately, by improving our understanding of energy policy simulations, we can make more effective decisions and transition towards a cleaner and more sustainable energy future.

Lenovo faces 16% sales drop in Q3

Lenovo Group has reported a significant setback in its financial performance for the third quarter ending in September. The company's revenue fell by 16% year-on-year to $14.41 billion. This decline marks the fifth consecutive quarter of declining sales for Lenovo.

In this article, we will explore the key highlights and challenges faced by Lenovo in the September quarter of its 2023/24 fiscal year. We will delve into the company's financial performance, its strategies to address the decline, and its future outlook in the technology sector.

Financial Performance

For the quarter ended September 2023, Lenovo Group reported revenue of $14.41 billion, representing a 16% decline compared to the same period last year. This decline can be attributed to ongoing challenges in the technology sector, including the global chip shortage and supply chain disruptions caused by the pandemic.

Despite the decline in revenue, Lenovo maintained a gross profit margin of 17.5%, a record high for the company. Its diversified growth engines played a significant role in the result, with non-PC businesses contributing 40% of total revenue, about three percentage points more than a year earlier.

However, Lenovo experienced a significant drop in net income, with a profit attributable to equity holders of $249 million, representing a 54% decline compared to the same period last year. The company's earnings per share also decreased by 2.45 cents, reflecting the challenging market conditions.

Challenges and Recovery Strategies

Lenovo Group has been facing several challenges that have impacted its financial performance in the third quarter. The company's excess inventory from the previous year, coupled with the global chip shortage, has resulted in supply chain disruptions and lower sales.

To address these challenges and drive recovery, Lenovo is implementing several strategies. The company is focusing on operational excellence, continuous investment in innovation, and executing its intelligent transformation strategy. These efforts have contributed to consecutive quarter-on-quarter performance improvements, indicating a positive trajectory toward recovery.

Additionally, Lenovo is leveraging its expertise in artificial intelligence (AI) to drive growth and profitability. The company's hybrid AI model, pocket-to-cloud portfolio, strong ecosystem, and partnerships position it uniquely to capitalize on the exponential growth opportunities in the AI sector. Lenovo's ongoing investment in innovation, particularly in AI technologies and capabilities, will further strengthen its ability to capture the potential of AI and drive sustainable growth.

Business Segment Highlights

Solutions and Services Group (SSG)

Lenovo's Solutions and Services Group (SSG) achieved record revenue and operating profit in the third quarter. The group reported revenue of $1.9 billion and an operating margin of 20%. Support services and software remained SSG's core profit engines, while managed services and project and solution services continued to expand, together accounting for 56% of SSG's revenue.

SSG's hero offerings, including Digital Workplace Solutions (DWS), Hybrid Cloud, and sustainability solutions and services, have gained strong momentum. The group focuses on providing smart solutions tailored to specific vertical industries, leading to breakthrough customer deals in multiple markets. Moreover, Lenovo's new hybrid AI Professional Services Practice enables enterprises to leverage hybrid infrastructure and AI to transform their businesses.

Infrastructure Solutions Group (ISG)

Lenovo's Infrastructure Solutions Group (ISG) faced headwinds in the third quarter, with a decline in revenue to $2 billion. This decline can be attributed to wider macroeconomic industry headwinds, economic slowdowns, and platform migration. However, ISG delivered a strong performance in storage, software, and services. The storage business reached an all-time revenue record.

Looking ahead, ISG aims to capitalize on the development of hybrid AI, which is expected to drive growth and diversification in the global ICT infrastructure market. Lenovo's rich portfolio of infrastructure products and solutions positions the company well to address this market opportunity. ISG will continue to enhance its portfolio competitiveness and operational excellence to resume growth and profitability.

Intelligent Devices Group (IDG)

Lenovo's Intelligent Devices Group (IDG) maintained its market leadership in the third quarter, despite market challenges. IDG retained its global number-one position in both PC shipments and activations. The group reported a decline in revenue to $11.5 billion but maintained an industry-leading operating margin of 7.4%.

The smartphone business within IDG achieved double-digit premium-to-market shipment growth, driven by sales of the Razr and a higher mix of premium products. Looking ahead, IDG plans to leverage generative AI to accelerate the launch of next-generation AI devices, including the introduction of an AI PC next year. Lenovo will continue to invest in technology innovation to drive growth and ensure long-term competitiveness in the market.

Environmental, Social, and Governance (ESG) Achievements

Lenovo has been recognized for its environmental and social achievements during the third quarter. The company was included in the 2023 Hang Seng Corporate Sustainability Index, achieving the highest score in the IT industry. Additionally, Lenovo received "Champion" status in the Canalys Global Sustainability Ecosystems Leadership matrix. The company was also named an EPEAT Climate Champion, with over 400 products registered as part of the first EPEAT Climate+ designated products listing.

In line with its commitment to sustainability, Lenovo joined the UN Global Compact Forward Faster initiative to accelerate private sector action for the UN's 17 Sustainable Development Goals. These achievements highlight Lenovo's dedication to promoting environmental responsibility and social impact.

Conclusion

Lenovo Group faced a significant setback in its financial performance during the third quarter, with a 16% decline in revenue compared to the previous year. The company continues to navigate challenges, including excess inventory and supply chain disruptions caused by the pandemic and the global chip shortage.

Despite these challenges, Lenovo remains focused on executing its intelligent transformation strategy, investing in innovation, and leveraging its AI capabilities to drive growth and profitability. Lenovo's Solutions and Services Group, Infrastructure Solutions Group, and Intelligent Devices Group have shown resilience and potential for future growth.

Furthermore, Lenovo's commitment to environmental sustainability and social impact has been recognized through various ESG achievements. As the technology sector continues to evolve, Lenovo is well-positioned to adapt and thrive, building a more inclusive, trustworthy, and smarter future for everyone.

Rocket exhaust on the Moon: Unveiling the surface effects

For NASA, the exploration of the Moon has always been a fascinating endeavor. With the Artemis program, the space agency is planning to take lunar missions to new heights by establishing a sustained human presence on the Moon. To achieve this goal, a deep understanding of how future landers interact with the lunar surface during landing and liftoff is crucial.

Landing on the Moon is a complex and challenging task. The Moon has no atmosphere, so parachutes cannot slow a descending craft, and even its low gravity (about one-sixth of Earth's) must be counteracted: spacecraft therefore rely on rocket engines to control their descent. Firing those engines close to the ground, however, creates supersonic plumes of hot gas that interact with the lunar surface, causing various hazards and potential risks.
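
The scale of the problem can be fixed with a back-of-the-envelope calculation: lunar surface gravity is about 1.62 m/s², so the thrust needed merely to hover is the lander's mass times that value. The 15-tonne mass below is an illustrative assumption, not a specific vehicle.

```python
# Back-of-the-envelope hover thrust on the Moon versus Earth.
# Lunar surface gravity ~1.62 m/s^2, about one-sixth of Earth's 9.81 m/s^2.
# The lander mass is a hypothetical figure for illustration.

g_moon = 1.62            # m/s^2
g_earth = 9.81           # m/s^2
lander_mass = 15_000.0   # kg (hypothetical)

hover_thrust_moon = lander_mass * g_moon    # F = m * g
hover_thrust_earth = lander_mass * g_earth

print(f"Moon:  {hover_thrust_moon:.0f} N")   # 24300 N
print(f"Earth: {hover_thrust_earth:.0f} N")  # 147150 N
```

Tens of kilonewtons of hot exhaust directed straight at loose regolith is exactly the plume-surface interaction the simulations described below are built to study.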

When a spacecraft lands or takes off from the Moon, the intense forces generated by the rocket engine plumes can have significant consequences. These forces kick up dust, eject rocks, and create visual obstructions and dust clouds that can interfere with navigation and scientific instrumentation. Moreover, the plumes can erode the lunar surface underneath the lander, posing risks to the stability of the lander and the safety of astronauts.

To better understand and predict the interactions between rocket engine plumes and the lunar surface, researchers at NASA's Marshall Space Flight Center in Huntsville, Alabama, have developed new software tools. These tools are designed to simulate and predict plume-surface interactions for various NASA projects and missions, including the Human Landing System and Commercial Lunar Payload Services initiative. 
One remarkable achievement of the NASA Marshall team is the simulation of the Apollo 12 lander engine plumes interacting with the lunar surface. Through their simulation, they were able to closely match the predicted erosion patterns with the actual landing event. The simulation shows the fluctuating radial patterns of shear stress, which is the lateral force applied over a surface and a leading cause of erosion when fluids flow across it.

To achieve such accurate simulations, NASA utilized the power of supercomputers. The Pleiades supercomputer at NASA's Ames Research Center in California's Silicon Valley played a crucial role in running the simulations. Over several weeks of runtime, the simulations generated terabytes of data, providing valuable insights into plume-surface interactions.

The framework used for these simulations is called the Descent Interpolated Gas Granular Erosion Model (DIGGEM). This framework was funded through NASA's Small Business Innovation Research program, emphasizing the agency's commitment to technological advancements in space exploration. The DIGGEM framework, along with the Loci/CHEM+DIGGEM code, has been refined and optimized through direct support for flight projects within NASA's Exploration Systems Development Mission Directorate.

The insights gained from these simulations and software tools play a crucial role in minimizing risks associated with future lunar missions. By predicting cratering and visual obscuration, NASA can ensure the safety of spacecraft and crew during landing and takeoff. These advancements are not only essential for the Artemis program but also for future Mars landers and other off-world missions.

In conclusion, the study of plume-surface interactions is a critical aspect of lunar missions. NASA's research and simulations conducted at the Marshall Space Flight Center provide valuable insights into how rocket engine plumes interact with the lunar surface during landing and liftoff. By understanding these interactions, NASA can minimize risks and ensure the success of future missions, including the ambitious Artemis program. With the aid of supercomputers and advanced software tools, the agency continues to push the boundaries of space exploration and pave the way for human exploration of celestial bodies beyond the Moon.