Roundtable: the future of AI in clinical research

We spoke to four leaders harnessing AI in clinical trials about its challenges and their vision for AI in the future.


Artificial Intelligence (AI) is quickly weaving itself into the fabric of our lives across all industries.

The life sciences have made a dramatic leap into the digital age in recent years, and AI is increasingly recognised for its potential to reduce costs and accelerate every stage of clinical research and drug development – from matching patients with clinical trials and managing data to discovering drugs themselves.

In general, pharma has been quite slow to adopt AI, but a growing number of companies are now partnering with smaller AI start-ups for their drug discovery expertise, while others are actively hiring for in-house AI roles. According to CB Insights, healthcare AI start-ups raised over $6bn in 2020.


For this roundtable discussion, Kezia Parkins put three questions to some of the founders, CEOs and thought leaders harnessing AI to improve clinical trials, to discover how they use it, the challenges it poses, and what they see coming down the pipeline for AI in clinical research in the years ahead.

We heard from:

Wout Brusselaers, CEO and founder of Deep6 AI
Charles Fisher, CEO and founder of Unlearn AI
Jo Varshney, CEO and founder of VeriSIM Life
Brian Dranka, vice president of marketing at Deep Lens

Kezia Parkins: How are you using AI to enhance and accelerate clinical trials?

Wout Brusselaers, CEO and founder of Deep6 AI

Wout Brusselaers: We use a variety of natural language processing (NLP), graph vectorisation, and supervised/unsupervised learning techniques on clinical data to improve and accelerate critical steps in the clinical trials process. We combine and mine all patient data, structured and unstructured, from a variety of sources into high-dimensional patient graphs, which allow for real-time precision matching and patient identification. This helps design better study protocols; de-risks site and PI [principal investigator] selection based on actual, current eligible patient counts; accelerates patient recruitment and diversity across far-flung site networks; and allows us to track study endpoints and data in near-real time.

We augment our AI with a specific UI [user interface] for all major stakeholders in clinical trials: researchers, treating physicians, patients, sponsors and CROs [contract research organisations], so they can all make better, faster decisions based on information pulled from real-time, real-world clinical data.
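As a very rough illustration of the kind of text-based matching Brusselaers describes – not Deep6 AI's actual pipeline, which combines structured data and graph techniques – a toy sketch might score unstructured patient notes against trial eligibility text with a simple vector model. The notes, criteria and names below are invented.

```python
# Toy sketch of text-based patient-to-trial matching: score unstructured
# patient notes against trial eligibility text with a simple vector model.
# Illustrative only; the criteria and notes are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

criteria = (
    "Adults with stage III or IV non-small cell lung cancer, "
    "EGFR mutation positive, no prior targeted therapy."
)

patient_notes = {
    "patient_a": "62-year-old with stage IV NSCLC, EGFR exon 19 deletion, treatment naive.",
    "patient_b": "55-year-old with stage II breast cancer, ER positive, post-surgery.",
}

# Fit a shared vocabulary over the criteria and the notes, then rank
# patients by cosine similarity to the eligibility text.
vectoriser = TfidfVectorizer(stop_words="english")
matrix = vectoriser.fit_transform([criteria] + list(patient_notes.values()))
scores = cosine_similarity(matrix[0], matrix[1:]).ravel()

for (patient_id, _), score in sorted(
    zip(patient_notes.items(), scores), key=lambda x: -x[1]
):
    print(f"{patient_id}: similarity {score:.2f}")
```

A real system would also use structured fields (labs, diagnoses, genomics) and clinical language models rather than bag-of-words similarity, but the ranking idea is the same.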

Charles Fisher, CEO and founder of Unlearn AI

Charles Fisher: Unlearn uses AI to create digital twins of subjects in randomised controlled trials (RCTs) that simulate how those subjects may respond if assigned to the control group. This enables clinical trial sponsors to run trials that achieve the desired statistical power with smaller control groups. These Twintelligent RCTs generate evidence that is just as reliable as standard RCTs, in a fraction of the time.
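The statistical intuition behind smaller control groups – sketched here in general terms, not as Unlearn's proprietary method – is that a prognostic model explaining part of the outcome variance shrinks the residual variance in the treatment-versus-control comparison, so fewer control subjects are needed for the same power. A minimal sketch, assuming an illustrative effect size and a model with R² of 0.5:

```python
# Rough sketch: how a prognostic model that explains part of the outcome
# variance can shrink the sample needed for a fixed statistical power.
# All numbers below are illustrative assumptions, not trial data.
from statsmodels.stats.power import TTestIndPower

treatment_effect = 5.0   # assumed mean difference on the outcome scale
outcome_sd = 20.0        # assumed outcome standard deviation
r_squared = 0.5          # assumed variance explained by a prognostic model

analysis = TTestIndPower()

# Unadjusted two-arm comparison
d_unadjusted = treatment_effect / outcome_sd
n_per_arm = analysis.solve_power(effect_size=d_unadjusted, alpha=0.05, power=0.8)

# Adjusted comparison: residual SD shrinks by sqrt(1 - R^2), so the
# standardised effect size grows and fewer subjects are needed.
residual_sd = outcome_sd * (1 - r_squared) ** 0.5
d_adjusted = treatment_effect / residual_sd
n_adjusted = analysis.solve_power(effect_size=d_adjusted, alpha=0.05, power=0.8)

print(f"Unadjusted: ~{n_per_arm:.0f} subjects per arm")
print(f"With prognostic adjustment: ~{n_adjusted:.0f} subjects per arm")
```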

Jo Varshney: VeriSIM Life (VSL) is a futuristic drug development company with a vision to solve the decades-old translatability problem within drug development and ensure the clinical success of drugs intended for highly unmet needs.

To realise our vision, we are redefining the translational landscape in drug development by creating a 'virtual drug development engine' that integrates seamlessly at different stage-gates. Our cloud-based, full-stack solution offers sophisticated predictions to optimise portfolio management and increase success rates.

We use a first-in-class AI-driven bio-simulation platform to translate, scale, and accelerate drug development insights and de-risk R&D decisions by predicting the clinical benefits before human trials. Designed by world-class experts over several years of R&D, our powerful platform combines machine learning, mechanistic modelling, secure data storage and massive computational power to provide insights and prioritise the most informative experiments much earlier in the drug development process.

Our competitive advantage lies in integrating AI with mechanistic models, extensive validation, scalable solutions, and the ability to run complex simulations in minutes to hours rather than days or months. Currently, we are working with top-25 pharma and biotech companies, national and federal labs, hospitals, and academic research institutions, and we very recently launched our pharmaceutical subsidiary to develop drug assets at unprecedented speed for rare disorders.

Brian Dranka: Patient recruitment is one of the most challenging aspects of conducting a clinical trial. On-time recruitment of patients for trials has immense benefits to patients as well as trial sponsors – sponsors get to test the safety and efficacy of a particular therapy and move it along the path of development, and patients receive access to cutting-edge therapies sooner.  However, many of these clinical trials – particularly precision oncology trials – have become more complex and therefore, identifying the right patients in an often very narrow window of time becomes even more difficult.

Deep Lens uses AI and deep learning models to ingest patient medical records, pathology reports and genomic data and compare them against study protocols. Through this process, potentially eligible patients are identified automatically, right at the time of a patient's diagnosis. The heavy lifting of pre-screening patients is minimised for the site's clinical research team, giving them time to focus on patient care. The result is an acceleration of trial enrolment by as much as 300% compared with historical accrual rates.

KP: For what areas of drug development and clinical trials do you think AI will be most useful in the future?

WB:  We’re just seeing the tip of the iceberg for the potential and eventual uses of AI in clinical trials and drug development. Google announced some recent progress towards using AI to predict the structure of proteins and how they may interact with each other and with the human body. That sounds very promising, but the bar is really high. In terms of clinical trials, I believe AI will continue to bring much greater precision to each step of the process. Soon many isolated, manual processes, driven by habit rather than by data, will be radically overhauled.

AI can also help in developing and testing synthetic patient cohorts. Where today we use select data as a sample to represent the larger population of real patients out there, we may flip that around in the future and create a synthetic population of digital, AI-built patients, using real-life patients as a control sample to confirm and guide the digital avatars' features. Combining those digital drug and digital patient models could allow for massive scaling of interaction testing. Pretty exciting!

CF: AI will enable the design and execution of faster, more reliable, and more patient-centric clinical trials by leveraging more data from each patient and data from patient populations to generate better evidence. In addition, digital solutions are generally composable so that one can leverage multiple technologies within a single clinical trial to realise substantial improvements in efficiency.

Jo Varshney, CEO and founder of VeriSIM Life

JV: Computational tools such as in silico methods have traditionally been used for drug design and discovery. But AI/ML can take existing processes to a new level, making them more efficient and accurate without the expense of animal trials. AI has the potential to transform drug discovery by rapidly accelerating the R&D timeline, making drug development cheaper and faster and improving the probability of approval. AI can also increase the effectiveness of drug repurposing research. AI technologies make possible innovations that are fundamental to transforming clinical trials, such as seamlessly combining Phases I and II, developing novel patient-centred endpoints, and collecting and analysing real-world data.

BD: AI technology is now widely deployed across the drug development value chain. Natural language processing is an application of AI that is of particular interest because it has the potential to create structure from the vast amount of unstructured data that exists in biomedical and clinical research. For example, hundreds of thousands of genome sequences have been recorded; however, integrating those data into a usable data model at scale requires interpretation of metadata and integration with medical record data to be truly useful. This is one of the areas where the impact of AI is already being felt, and its utility will continue to grow as new data are unlocked. Benefits might include advances in the understanding of adverse event data or automated clinical workflows that aid in care management.
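As a toy illustration of the structuring Dranka describes – real systems rely on trained clinical NLP models rather than hand-written rules – a sketch might pull a few structured fields out of free-text pathology language. The report text and patterns below are invented.

```python
# Toy illustration of turning unstructured clinical text into structured
# fields. Real NLP pipelines use trained models; this is only a sketch
# with made-up report text and hand-written patterns.
import re

report = (
    "Pathology: invasive ductal carcinoma, grade 2. "
    "ER positive, PR negative, HER2 negative. Stage IIA."
)

patterns = {
    "er_status":   r"ER\s+(positive|negative)",
    "pr_status":   r"PR\s+(positive|negative)",
    "her2_status": r"HER2\s+(positive|negative)",
    "stage":       r"Stage\s+([IV]+[A-C]?)",
}

structured = {}
for field, pattern in patterns.items():
    match = re.search(pattern, report, flags=re.IGNORECASE)
    structured[field] = match.group(1) if match else None

print(structured)
# e.g. {'er_status': 'positive', 'pr_status': 'negative',
#       'her2_status': 'negative', 'stage': 'IIA'}
```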

KP: What are some of the hurdles involved in using AI, regulatory or otherwise?

WB: From our perspective, as a smaller, cutting-edge company that does not have the deep pockets or clout of some of the bigger players, there are a few 'rate limiters': adoption by healthcare organisations is very slow and labour-intensive. We are lucky to have partnered with some very innovative, science-driven HCOs [healthcare organisations] who understand that the smart, early application of AI in clinical research brings outsized returns and sets them up as 'winners' in an increasingly data-driven – and therefore increasingly competitive, winner-takes-all – clinical landscape. Their high performance will attract more studies, sponsors and talent to these organisations, fuelling a virtuous cycle. This also affects further AI development: more data and more users create feedback loops that allow AI performance to keep improving and to address new use cases. I therefore believe that some of these hurdles will be unevenly distributed and may hurt organisations with fewer resources and less access to innovation. It is important for their patients and care staff that they do not fall behind as AI increasingly informs patient care and research.

CF: It's a misconception that AI-based tools face significant regulatory hurdles to adoption in clinical research – current regulatory guidance lays a clear path for leveraging AI in drug development if the applications are chosen appropriately. Rather, the biggest challenge to the development of novel AI-based tools for drug development is the curation of high-quality, representative data sets that are sufficient for training AI models to perform the desired tasks.

JV: Current barriers to entry and adoption include: unclear tangible benefits to the industry, including a lack of real-world examples of impact on R&D costs, timelines or overall probability of success; the complexity of machine learning, which makes interpreting AI insights challenging and results in a lack of buy-in from biopharma stakeholders; limited availability of relevant, readily usable proprietary biopharma data – AI-based companies often need significant time to clean, identify and extract the data required to produce outcomes; the high stakes for biopharma if things go wrong, compounded by stakeholders' limited in-house AI expertise to fully evaluate its potential and limitations; and, finally, unclear acceptance criteria from regulatory bodies.

Brian Dranka, vice president of marketing at Deep Lens

BD: Because AI models require data at scale, maintaining the privacy of the large volume of patient data is of utmost importance. Efforts in data tokenisation are enabling analysis across various data sources in new and useful ways. Additionally, a lack of data standards often causes problems with the analysis and effective use of the data. Deep Lens has built data standardisation and harmonisation directly into its software platform to help ensure all data can be accessed easily.
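As a simplified illustration of the tokenisation idea – production services add identifier normalisation, key management and governance that are omitted here – a keyed hash can turn a patient identifier into a consistent token, so datasets can be joined on the token rather than on the raw identity. The key and identifiers below are made up.

```python
# Simplified sketch of tokenising patient identifiers so records can be
# linked across datasets without exposing the raw identity. The secret key
# and identifiers here are invented placeholders.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-managed-key"

def tokenise(identifier: str) -> str:
    """Return a deterministic, keyed token for a patient identifier."""
    normalised = identifier.strip().lower().encode("utf-8")
    return hmac.new(SECRET_KEY, normalised, hashlib.sha256).hexdigest()

# The same identifier yields the same token in both datasets, so records
# can be joined on the token instead of on name or medical record number.
ehr_token = tokenise("MRN-0012345")
genomics_token = tokenise("MRN-0012345")
assert ehr_token == genomics_token
print(ehr_token[:16], "...")
```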

This article is part of a special series by GlobalData Media on artificial intelligence.