UB pharmacy prof builds AI-powered supercomputer model to predict disease progression during aging

The model could support the assessment of long-term chronic drug therapies and help clinicians develop more effective treatments for complex diseases

Using artificial intelligence, a team of University at Buffalo researchers has developed a novel system that models the progression of chronic diseases as patients age. 

Published in October in the Journal of Pharmacokinetics and Pharmacodynamics, the model assesses metabolic and cardiovascular biomarkers – measurable biological indicators such as cholesterol levels, body mass index, glucose, and blood pressure – to calculate health status and disease risks across a patient’s lifespan.

The findings are critical due to the increased risk of developing metabolic and cardiovascular diseases with aging, a process that has adverse effects on cellular, psychological and behavioral processes.

“There is an unmet need for scalable approaches that can provide guidance for pharmaceutical care across the lifespan in the presence of aging and chronic co-morbidities,” says lead author Murali Ramanathan, Ph.D., professor of pharmaceutical sciences in the UB School of Pharmacy and Pharmaceutical Sciences. “This knowledge gap may be potentially bridged by innovative disease progression modeling.”

The model could facilitate the assessment of long-term chronic drug therapies and help clinicians monitor treatment responses for conditions such as diabetes, high cholesterol, and high blood pressure, which become more frequent with age, says Ramanathan.

Additional investigators include the first author and UB School of Pharmacy and Pharmaceutical Sciences alumnus Mason McComb, Ph.D.; Rachael Hageman Blair, Ph.D., associate professor of biostatistics in the UB School of Public Health and Health Professions; and Martin Lysy, Ph.D., associate professor of statistics and actuarial science at the University of Waterloo.

The research examined data from three case studies within the third National Health and Nutrition Examination Survey (NHANES) that assessed the metabolic and cardiovascular biomarkers of nearly 40,000 people in the United States. 

Biomarkers, which also include measurements such as temperature, body weight, and height, are used to diagnose, treat and monitor overall health and numerous diseases.

The researchers examined seven metabolic biomarkers: body mass index, waist-to-hip ratio, total cholesterol, high-density lipoprotein cholesterol, triglycerides, glucose, and glycohemoglobin. The cardiovascular biomarkers examined include systolic and diastolic blood pressure, pulse rate, and homocysteine.

By analyzing changes in metabolic and cardiovascular biomarkers, the model “learns” how aging affects these measurements. With machine learning, the system uses a memory of previous biomarker levels to predict future measurements, which ultimately reveal how metabolic and cardiovascular diseases progress over time.
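As a rough illustration of this idea, the sketch below predicts a patient's next biomarker measurement from a memory of previous ones using a simple autoregressive fit. This is not the authors' model, and all the numbers are made up for demonstration:

```python
# Illustrative sketch only: the published model is more sophisticated,
# but the core idea -- predicting a future biomarker value from a
# "memory" of earlier measurements -- can be shown with a least-squares
# autoregressive fit. All data below are hypothetical.

def fit_ar1(series):
    """Least-squares fit of x[t+1] = a * x[t] + b over a series."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

def predict_next(series, a, b):
    """Predict the next measurement from the most recent one."""
    return a * series[-1] + b

# Hypothetical glucose measurements (mg/dL) at successive checkups.
glucose = [92.0, 95.0, 97.5, 101.0, 104.0]
a, b = fit_ar1(glucose)
print(round(predict_next(glucose, a, b), 1))  # projects the upward trend
```

A real disease-progression model would track many biomarkers jointly and condition on age, but the fit-then-extrapolate loop is the same in spirit.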

How two UMass Amherst scientists are balancing the planet's natural carbon budget

New research is first to pin down the mechanics of CO2 fluxes in rivers and streams

A pair of researchers at the University of Massachusetts Amherst recently published the results of a study that is the first to take a process-based modeling approach to understand how much CO2 rivers and streams contribute to the atmosphere. The team focused on the East River watershed in Colorado’s Rocky Mountains and found that their new approach is far more accurate than traditional approaches, which overestimated CO2 emissions by up to a factor of 12. An early online version of the research was recently published by Global Biogeochemical Cycles.

Scientists refer to the total CO2 circulating through the earth and the atmosphere as the carbon budget. This budget includes both anthropogenic sources of CO2, such as those that come from burning fossil fuels, and more natural sources of CO2 that are part of the planet’s regular carbon cycle. “In the era of global climate change,” says Brian Saccardi, a graduate student in geosciences at UMass Amherst and lead author of the new research, “we need to know what the baseline levels of CO2 are, where they come from and how those physical processes of carbon emission work.” Without such a baseline, it is difficult to know how the earth is changing as CO2 levels increase.

[Photo: Brian Saccardi collecting stream data from the East River watershed, Colorado]

Streams and rivers are among the many natural sources of CO2. Scientists have long known this, but the amount has proved very difficult to pin down. In part, this is because CO2 emissions fluctuate rapidly, and it is impracticable to physically monitor all of the earth’s river networks. So scientists typically rely on statistical models to estimate how much CO2 streams and rivers emit. The problem, Saccardi explains, is that these models don’t account for the full complexity of how CO2 moves from groundwater into a stream or river, what happens to it once there, and how much is emitted to the atmosphere.

“This is the first time we’re accounting for the physical processes themselves,” says Matthew Winnick, professor of geosciences at UMass Amherst and the paper’s co-author. “We need to know how each step of the movement of CO2 works, so we know how they will react to climate change.”

Saccardi and Winnick designed, tested, and validated a “process-based” model that relies on the laws of physics as well as empirical measurements to arrive at its estimates. The pair took 121 measurements of streams in the remote East River watershed in Colorado, against which they could test their new model. And the results were clear: according to the research, their model is far more accurate than the standard approaches.
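To give a sense of the physics a process-based approach builds on, the sketch below computes an areal CO2 flux from standard gas-exchange theory. The formula F = k * (C_water - C_eq) is a common textbook starting point, not the paper's full model, and the numbers are invented for illustration:

```python
# Minimal sketch of the gas-exchange physics underlying stream CO2
# emission estimates: flux across the air-water interface is commonly
# written F = k * (C_water - C_eq), where k is the gas transfer
# velocity and C_eq is the dissolved concentration in equilibrium with
# the atmosphere. Values below are illustrative, not the study's data.

def co2_flux(k_m_per_day, c_water, c_eq):
    """Areal CO2 flux in mmol m^-2 day^-1.

    k_m_per_day : gas transfer velocity (m/day)
    c_water     : dissolved CO2 in the stream (mmol/m^3)
    c_eq        : dissolved CO2 at atmospheric equilibrium (mmol/m^3)
    """
    return k_m_per_day * (c_water - c_eq)

# A hypothetical supersaturated mountain stream outgasses to the air:
print(co2_flux(k_m_per_day=5.0, c_water=60.0, c_eq=20.0))  # prints 200.0
```

A full process-based model would additionally track how groundwater inputs, downstream transport, and changing turbulence alter k and C_water along the channel; this snippet shows only the exchange step.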

Though Saccardi and Winnick are quick to point out that their conclusions apply to the East River watershed only, they have plans to apply their process-based model more widely and suspect that their new method may help to radically reevaluate the earth’s natural carbon budget.

German engineers design turbo boost for AI to predict new compounds for materials

A new algorithm has been designed to help discover previously unknown material compounds. It was developed by a team from Martin Luther University Halle-Wittenberg (MLU), a public, research-oriented university in the cities of Halle and Wittenberg in the State of Saxony-Anhalt, Germany. Working with professors from Friedrich Schiller University Jena, and Lund University in Sweden, the researchers designed a form of artificial intelligence (AI) based on machine learning that can perform complex calculations within a very short space of time. This has enabled the team to identify several thousand potential new compounds using a computer. The study was published in the journal "Science Advances."

Inorganic materials are essential for humans. For example, they form the basis for solar cells and for new advancements in semiconductor electronics that are used in technical devices. Around 50,000 stable inorganic compounds have already been identified. "However, considerably more may theoretically exist - if they can be produced artificially," says Dr. Miguel Marques, professor of physics at MLU. There are two basic ways to discover these unknown materials: in the laboratory, via countless experiments on different substances, or through computer simulation. The latter has increasingly become standard in recent years, says Marques: "The problem is that many earlier approaches require a lot of computing power and are slow to produce results."

The researchers, therefore, developed a new method based on machine learning. Instead of performing whole calculations, the computer predicts their final results. "In other words, we want to obtain the results of the calculations without having to do the actual calculations," says Jonathan Schmidt from MLU, first author of the new study. "This requires two things: an algorithm that carries out the desired task, and a dataset which can be used to train the algorithm," adds the physicist. The team used several databases containing over 2.4 million compounds. "The calculations on which these databases are based have a combined calculation time of 100 to 200 million hours," says Schmidt.   
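The surrogate-model idea can be illustrated with a toy example. The features, target values, and the simple nearest-neighbor "learner" below are all stand-ins invented for demonstration; the study used databases of millions of compounds and a far more capable model:

```python
# Illustrative sketch of a surrogate model: instead of running an
# expensive quantum-mechanical calculation for each candidate compound,
# learn from a database of already-computed results and predict the
# outcome directly. Everything here is hypothetical stand-in data.
import math

# Made-up training set: (feature vector, precomputed formation energy).
# Features might encode composition, e.g. averaged elemental properties.
training = [
    ([1.0, 0.2], -1.8),
    ([0.4, 0.9], -0.6),
    ([0.8, 0.5], -1.2),
]

def predict(features):
    """Return the formation energy of the most similar known compound."""
    _, energy = min(training, key=lambda t: math.dist(t[0], features))
    return energy

# Screening a new candidate becomes a lookup-speed prediction rather
# than hours of simulation:
print(predict([0.9, 0.4]))  # prints -1.2 (nearest neighbor is [0.8, 0.5])
```

The training data are exactly the "dataset which can be used to train the algorithm" Schmidt describes: a one-time investment of compute (here, the precomputed energies) that the model then amortizes over every new prediction.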

The new AI searches for new materials much faster than previous methods and is expected to soon also predict their electrical and optical properties. The researchers have already been able to identify several thousand possible candidates. "Of course, promising material candidates and their properties have to be confirmed by experiments and investigated further. However, we are very confident that most of our predictions will be confirmed," says Marques.