University of Luxembourg life sciences research leads to new supercomputer model supporting cancer therapy

Researchers from the Life Sciences Research Unit (LSRU) of the University of Luxembourg have developed a supercomputer model that simulates the metabolism of cancer cells. They used the program to investigate how combinations of drugs could be used more effectively to stop tumor growth. The biologists have now published their findings in the scientific journal EBioMedicine, part of the prestigious Lancet group.

The metabolism of cancer cells is optimized to enable fast growth of tumors. "Their metabolism is much leaner than that of healthy cells, as they are just focused on growth. However, this makes them more vulnerable to interruptions in the chain of chemical reactions that the cells depend on. Whereas healthy cells can take alternative routes when one metabolic path is disabled, this is more difficult for cancer cells," explains Thomas Sauter, Professor of Systems Biology at the University of Luxembourg and lead author of the paper. "In our study, we investigated how drugs or combinations of drugs could be used to switch off certain proteins in cancer cells and thereby interrupt the cell's metabolism."

To this end, the researchers created digital models of healthy and cancerous cells and fed them with gene sequencing data from 10,000 patients in The Cancer Genome Atlas (TCGA) of the U.S. National Cancer Institute (NCI). Using these models, the researchers were able to simulate the effects of different active substances on the cells' metabolism, allowing them to identify drugs that inhibited cancer growth without affecting healthy cells. The models make it possible to filter out drugs that do not work or are toxic, so that only the promising ones are tested in the lab.
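The underlying idea, simulating how blocking a reaction affects a cell's growth, can be illustrated with a toy flux balance analysis model. This is a minimal sketch under simplified assumptions (a made-up two-metabolite, four-reaction network and SciPy's linear programming solver), not the published patient models:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1 takes up a nutrient (M1), R2 and R3 are alternative
# pathways converting M1 into a precursor (M2), R4 turns M2 into biomass.
S = np.array([
    [1, -1, -1,  0],   # mass balance for M1
    [0,  1,  1, -1],   # mass balance for M2
])

def max_growth(knockouts=()):
    """Maximize biomass flux (R4) at steady state, S @ v = 0.

    A 'drug' is simulated by forcing the targeted reaction's flux to zero.
    """
    bounds = [(0, 10), (0, 10), (0, 10), (0, 10)]
    for k in knockouts:
        bounds[k] = (0, 0)
    res = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=[0, 0],
                  bounds=bounds, method="highs")
    return -res.fun

print(max_growth())        # healthy: both pathways open, growth is possible
print(max_growth((1,)))    # blocking R2 alone: the cell reroutes via R3
print(max_growth((1, 2)))  # blocking both pathways halts growth entirely
```

A "leaner" cancer-like model would simply lack the alternative pathway R3, so a single knockout would already drive its growth to zero while the healthy model keeps growing, which is the selectivity the study screens for.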

With the help of the models, they tested about 800 medications, of which 40 were predicted to inhibit cancer growth. About 50 percent of these drugs were already known as anti-cancer therapeutics, but 17 of them are so far only approved for other treatments. "Our tool can help with so-called 'drug repositioning', which means that new therapeutic purposes are found for existing medication. This could significantly reduce the cost and time of drug development," Prof. Sauter said.

The particular advantage of the approach is the efficiency of its mathematical method. "We managed to create 10,000 patient models within one week, without the use of high-performance computing. This is exceptionally fast," comments Dr. Maria Pacheco, a postdoctoral researcher at the University of Luxembourg and first author of the study. Dr. Elisabeth Letellier, principal investigator of the Molecular Disease Mechanisms group at the University of Luxembourg and a collaborator on the study, adds: "In the future, this could allow us to build models of individual cancer patients and virtually test drugs in order to find the most efficient combination. This could also bring fresh hope to patients for whom known therapies have proven ineffective."

So far, the models have been tested only for colorectal cancer, but the algorithm in principle works for all types of cancer, according to Thomas Sauter. He and his team are currently considering developing commercial applications for their method.

Using artificial intelligence to deliver personalized radiation therapy

Newly published Cleveland Clinic-led research first to use medical scans to inform dose delivery

New Cleveland Clinic-led research shows that artificial intelligence (AI) can use medical scans and health records to personalize the dose of radiation therapy used to treat cancer patients.

In the study, published today in The Lancet Digital Health, the research team developed an AI framework based on patient computerized tomography (CT) scans and electronic health records. This new AI framework is the first to use medical scans to inform radiation dosage, moving the field forward from generic dose prescriptions to more individualized treatments.

[Photo caption: New research led by Mohamed Abazeed, M.D., Ph.D., of Cleveland Clinic shows that artificial intelligence (AI) can use medical scans and health records to personalize the dose of radiation therapy used to treat cancer patients. Credit: Russell Lee]

Currently, radiation therapy is delivered uniformly. The dose delivered does not reflect differences in individual tumor characteristics or patient-specific factors that may affect treatment success. The AI framework begins to account for this variability and provides individualized radiation doses that can reduce the treatment failure probability to less than 5 percent.

"While highly effective in many clinical settings, radiotherapy can greatly benefit from dose optimization capabilities," says lead author Mohamed Abazeed, M.D., Ph.D., a radiation oncologist at Cleveland Clinic's Taussig Cancer Institute and a researcher at the Lerner Research Institute. "This framework will help physicians develop data-driven, personalized dosage schedules that can maximize the likelihood of treatment success and mitigate radiation side effects for patients."

The framework was built using CT scans and the electronic health records of 944 lung cancer patients treated with high-dose radiation. Pre-treatment scans were input into a deep-learning model, which analyzed the scans to create an image signature that predicts treatment outcomes. Using sophisticated mathematical modeling, this image signature was combined with data from patient health records - which describe clinical risk factors - to generate a personalized radiation dose.
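The final step of that pipeline, turning a risk estimate into a dose, can be sketched as follows. Everything here is a hypothetical placeholder (the logistic failure model, the coefficients, the function names), not the published framework; it only illustrates how an image signature and clinical covariates could jointly determine an individualized dose:

```python
import numpy as np

# Made-up clinical risk weights for two illustrative covariates.
W_CLIN = np.array([0.3, 0.2])

def failure_prob(dose, image_sig, clinical,
                 w_img=0.8, w_dose=0.05, bias=-1.0):
    """Hypothetical logistic model: a higher image-signature score and
    higher clinical risk raise the failure probability; a higher dose
    lowers it."""
    z = bias + w_img * image_sig + clinical @ W_CLIN - w_dose * dose
    return 1.0 / (1.0 + np.exp(-z))

def personalized_dose(image_sig, clinical, target=0.05, lo=0.0, hi=300.0):
    """Smallest dose with predicted failure probability <= target,
    found by bisection (failure_prob is monotone decreasing in dose)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if failure_prob(mid, image_sig, clinical) <= target:
            hi = mid
        else:
            lo = mid
    return hi
```

Under this toy model, a patient with a riskier image signature is assigned a higher dose to reach the same sub-5-percent failure target, which is the individualization the article describes.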

"The development and validation of this image-based, deep-learning framework is exciting because not only is it the first to use medical images to inform radiation dose prescriptions, but it also has the potential to directly impact patient care," said Dr. Abazeed. "The framework can ultimately be used to deliver radiation therapy tailored to individual patients in everyday clinical practices."

Several other factors set this first-of-its-kind framework apart from similar clinical machine learning approaches. The technology developed by the team uses an artificial neural network that merges classical machine learning with the power of a modern neural network: the network itself determines how much prior knowledge to use to guide predictions about treatment failure, so the influence of that prior knowledge is tunable. This hybrid approach is well suited to clinical applications, since most clinical datasets at individual hospitals are modest in size compared to the non-clinical datasets behind other well-known AI applications (e.g., online shopping or ride-sharing).
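The tunable-prior idea can be sketched as a learnable gate that blends a classical, prior-knowledge score with a neural-network score. The gate below is only an illustration of the concept, not the network's actual architecture:

```python
import math

def hybrid_score(prior_score, nn_score, gate_param):
    """Blend a classical (prior-knowledge) prediction with a
    neural-network prediction. In training, gate_param would be learned
    like any other weight; squashing it through a sigmoid keeps the
    mixing weight in (0, 1)."""
    a = 1.0 / (1.0 + math.exp(-gate_param))
    return a * prior_score + (1.0 - a) * nn_score
```

With a large positive `gate_param` the prediction leans almost entirely on the classical score, which helps when the training set is small; with a large negative one it defers to the network, which helps when data is plentiful.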

Additionally, this framework was built using one of the largest datasets of patients receiving lung radiotherapy, yielding greater accuracy and limiting false findings. Lastly, each clinical center can use its own CT datasets to customize the framework and tailor it to its specific patient population.

"Machine learning tools, including deep learning, are poised to play an important role in healthcare," says Dr. Abazeed. "This image-based information platform can provide the ability to individualize multiple cancer therapies but more immediately is a leap forward in radiation precision medicine."

DePaul University computer scientist earns NSF CAREER grant to study reproducibility

Assistant Professor Tanu Malik's container method makes it easier to reproduce and compare scientific experiments

Reproducibility is the cornerstone of science. In order for scientists to make advancements, they must be able to validate and build on each other's work. Now that so much science relies on computations and data, many researchers are struggling to share their computational artifacts in ways that are usable for others, said Tanu Malik, an assistant professor in DePaul University's College of Computing and Digital Media.

"We have results that are generated through computational artifacts but are being presented on PDF papers. As a researcher, there are no easy means for verifying the results being presented," said Malik. "Emailing and sharing through websites are old methods. We need more efficient and usable methods to verify results from complex scientific experiments."

[Photo caption: Tanu Malik, assistant professor in DePaul University's College of Computing and Digital Media, was awarded a Faculty Early Career Development (CAREER) grant, the National Science Foundation's most prestigious award in support of early-career faculty. Credit: DePaul University/Jeff Carrion]

Now, the National Science Foundation has awarded Malik a Faculty Early Career Development (CAREER) grant to support her work laying the foundation for establishing reproducibility in real-world computational and data science. Malik's project will also increase awareness of the need for computational reproducibility tools through a research and education plan involving scientists, students, and instructors. The five-year, $498,889 research grant is NSF's most prestigious award in support of early-career faculty.

Hitting on an idea

Malik knew she was onto something in 2013, while working as a research associate scientist at the University of Chicago with a group of geoscientists. Spread across seven universities, the geoscientists had been trying for several years to share data and run their computations together, but it wasn't working. Malik and her colleagues created a tool, called the Sciunit container (http://sciunit.run), that could package not just the data but also the programs and environments in which the information had been created.

Malik's system gave them results in 30 minutes.

"They were able to run this tool, and it gathered everything from different machines and made it portable. It became a huge thing," Malik said. She had discovered that it isn't enough to share program code and data; researchers also need what's called the "compute environment" to ensure that the data is processed in the same way and produces essentially the same outputs. Malik likened the alternative to downloading a new program onto your personal computer only to find it won't run. "That's the kind of situation we're trying to avoid."

The solution, said Malik, is to make it all portable -- the data, the program, the operating system -- so that others can move ahead and reproduce research, faster. At that time, NSF recognized the importance of the work with a $1.3 million grant, and Malik moved her research to DePaul in 2016.

"DePaul gave me the bandwidth to actually go deeper into this problem and really think from a computational aspect. I am looking at how containers should be designed to make them really robust for different kinds of computations," said Malik, who co-directs the Data Systems and Optimization Lab in DePaul's School of Computing.

Reproducibility as a spectrum

Malik's work will also make it easier for researchers to judge whether their own attempts at an experiment are reproducible or not. Her research aims to define the phases of reproducibility in computational research.

"You may want to do verification with different data sets, with different input parameters. So how do we make that verification fast? The underlying technology that we use in all of this is what is known as data provenance. It's capturing the provenance of the entire compute, or the history of how exactly it happened. And this time, this is what you have changed," Malik explained.
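A minimal sketch of the provenance idea, assuming a simple dictionary-based record rather than Malik's actual system: each derived artifact remembers the step and inputs that produced it, so a lineage query can walk back to the raw data.

```python
provenance = {}

def record(output, step, inputs):
    """Record that `output` was produced by `step` from `inputs`."""
    provenance[output] = {"step": step, "inputs": list(inputs)}

def lineage(artifact):
    """Answer 'how did I generate this file?' by walking the provenance
    graph back to raw inputs (artifacts with no recorded producer)."""
    node = provenance.get(artifact)
    trail = [artifact]
    if node is not None:
        for parent in node["inputs"]:
            trail.extend(lineage(parent))
    return trail

# A two-step pipeline: raw data is cleaned, then plotted.
record("clean.csv", "clean_step", ["raw.csv"])
record("figure.png", "plot_step", ["clean.csv"])
```

Verifying a changed rerun then reduces to diffing two such records: same graph, different step or input, and you know exactly "what you have changed."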

The term data provenance is derived from the art world, said Malik, and it refers to how data was created.

"Data always interests me," said Malik. "And the provenance of data seemed like a cool thing to study. You always look at your files -- and I think, 'how did I generate this file?' These are questions that come very naturally when I'm working, and I felt that provenance is important and wanted to explore it more."

Recognition and work ahead

The CAREER grant is awarded to scientists who have the potential to serve as academic role models in research and education, and who can lead advances in the mission of their department or organization.

At the heart of this exploration is Malik's work with students at DePaul. This spring, she created an advanced graduate course in the School of Computing on containers and reproducibility, and she said students were enjoying the work. The CAREER grant will allow Malik to engage more students with her work, especially in DePaul's data science program. She also hopes to engage more women, as the representation of women in computer science still lags, said Malik.

"The number of women who get funded in this area is abysmally low -- so I think it's a big deal," said Malik. "I just feel honored to have that opportunity. If I could share somehow that would be fantastic."

Malik added that coming to DePaul has helped give her the time and space to do the work she "always wanted to do."

"I have been doing this work for some time now, and the fact that this work is being recognized, that we did make an impact in a few lives by making it simpler, it feels good," said Malik. "NSF has recognized my work and is helping us to expand this further to make a greater impact. That's the ultimate fun, to make a dent in this hard problem."