BIG DATA
ASU taps Star-P to study the human side of supercomputing
The high performance computing (HPC) industry has tended to emphasize raw hardware performance, heralding ever-faster processors and parallel architectures. But a new initiative at Arizona State University (ASU) seeks to turn the focus to the human side of supercomputing, studying new tools and techniques that help researchers work faster, more easily and more productively.

The High Performance Computing Initiative (HPCI) at ASU's Ira A. Fulton School of Engineering is exploring a range of future programming paradigms for HPC systems, comparing them against traditional parallel programming methods. The study is funded under the U.S. Department of Defense's User Productivity Enhancement and Technology Transfer (PET) program, which aims to gather the best ideas, algorithms and software tools emerging from the national HPC centers and deploy them to the DoD user community.
The HPCI is using Star-P from Interactive Supercomputing Inc. in a user productivity study of large-scale computing, investigating how factors such as user interface, ease of use, interactive discovery and time to solution shape an optimal computing paradigm. The university has deployed Star-P on a 2,000-processor multicore parallel system and made it available to more than 100 students and faculty members for a variety of complex modeling, simulation and analytical applications.
Star-P lets students and faculty build algorithms and models on their desktops using familiar mathematical tools such as MATLAB®, Python and R, then run them instantly and interactively on the parallel system with little to no modification. The HPCI is comparing this approach against traditional parallel programming, in which applications must be rewritten in low-level languages such as C with MPI before they can run on parallel systems. That reprogramming can take months for large, complex problems, so Star-P can yield dramatic improvements in productivity, or "time to answer," and make problem solving an iterative, interactive process.
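To make that contrast concrete, here is a minimal sketch of the gap the article describes, written in Python: a short serial prototype next to a hand-parallelized rewrite of the same column-average calculation. It is illustrative only; NumPy stands in for the desktop tools named above, and mpi4py stands in for the C-with-MPI style of reprogramming. Neither is part of Star-P or the ASU study.

# A minimal, illustrative sketch (not Star-P code): the same computation written
# twice, once as a desktop prototype and once with explicit message passing.
import numpy as np

def column_means_serial(n_rows=1_000_000, n_cols=8, seed=0):
    # Desktop prototype: a few lines, no parallel bookkeeping.
    rng = np.random.default_rng(seed)
    data = rng.random((n_rows, n_cols))
    return data.mean(axis=0)

def column_means_mpi(n_rows=1_000_000, n_cols=8, seed=0):
    # Hand-parallelized rewrite (mpi4py assumed here as a stand-in for C with MPI):
    # the programmer now owns data decomposition, communication and reduction.
    from mpi4py import MPI  # requires an MPI installation
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Block-decompose the rows across ranks; each rank generates its own slice.
    local_rows = n_rows // size + (1 if rank < n_rows % size else 0)
    rng = np.random.default_rng(seed + rank)
    local = rng.random((local_rows, n_cols))

    # Combine partial column sums with an explicit global reduction.
    local_sum = local.sum(axis=0)
    total_sum = np.empty(n_cols)
    comm.Allreduce(local_sum, total_sum, op=MPI.SUM)
    return total_sum / n_rows

if __name__ == "__main__":
    # The serial prototype runs anywhere; the MPI version must be launched with
    # something like "mpirun -n 4 python this_script.py".
    print(column_means_serial()[:3])

Even in this toy case, the message-passing version has to manage data decomposition and an explicit reduction; scaling that bookkeeping across a full research code is where the months of reprogramming effort go.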
"Our mission at the university is to not only provide HPC resources for research, but to also innovate new approaches to high performance computing," said Dr. Dan Stanzione, director of the High Performance Computing Initiative at the Ira A. Fulton School of Engineering. "Star-P will help us develop new programming paradigms that remove the complexity and other productivity-hindering roadblocks from our HPC resources, making them available to a wider group of users."
The Star-P-based user productivity research at ASU falls under the PET program's Electromagnetics and Network Systems (ENS) area of study, and will focus primarily on applications in those two fields. According to the DoD, improving the usability of the computational environments at the nation's HPC centers is critical to the department's computing modernization efforts. Studies of computational environments cover every aspect of the user's interface to HPC resources: programming environments (debuggers, libraries, solvers, higher-order languages, and tools for performance analysis, prediction and optimization), computing platforms (common queuing, clusters, distributed data and metacomputing), reusable parallel algorithms, user access tools (portals and web-based access to HPC resources), and consistency across the HPC centers in locating these capabilities.