Graeme Dennis tells Pf why an overhaul in biopharmaceutical data management is necessary in 2021.
Rarely, if ever, has the process of drug development garnered so much public and media interest as it has over the last year. The unrivalled task for pharmaceutical and biopharmaceutical companies – creating and testing a vaccine in months rather than years – appears to be hugely successful. People all over the world are now poised to receive novel, life-saving vaccinations. In this whirlwind process, the biopharmaceutical industry has learned a lot, not only about what is possible, but also about new efficiencies to sustain and old methods to retire.
Research and development
There remain fundamental challenges across research and development (R&D) to be addressed as we move further into this decade of industrialised biology and biopharma 4.0. Despite great innovations in some essential areas (eg, high throughput process development and in silico data science), the biopharmaceutical industry still awaits the full benefits of digital transformation. It is here – in a massive, but still-fragmented data framework – that much of the opportunity for improvement lies.
This is because as therapeutic possibilities increase, so do complexity and volume – much of it in the form of an endless stream of data flowing back and forth between numerous specialty departments, R&D centres across cities and continents, and regulatory systems. These elements all contribute to a multidimensional data ecosystem that is a staggering challenge to store, manage, and draw insight from. In 2021 and beyond, the industry will need to embrace technologies that address longstanding issues in workflow, process quality, collaboration, and data analysis. This puts strain on the capabilities of current electronic solutions, like Laboratory Information Management Systems (LIMS) and Electronic Laboratory Notebooks (ELNs). Overhauling and streamlining the process with management systems that span the entire development lifecycle is a critical step in clearing many of the current roadblocks. As innovations deepen and technologies advance, gaps in these areas will transition from simple deficiencies to unsustainable risks.
The time and cost of delivering new therapies and vaccines remains one of the biggest challenges for our industry. Current research1 suggests the average cost to develop a new drug sits at $1.3 billion. With development timelines ranging from 8 to 16 years and attrition rates as high as 88%, it’s clear that the current state of biopharma drug development is an intolerable barrier to the timely and cost-effective treatment of disease.
“Even today, a whopping 60% of work is carried out in paper or spreadsheets”
As mentioned, a major contributor is poor data management – in particular, its current, inefficient, highly siloed nature. Data collection often occurs in a single notebook, instrument, or database; there is no true collaboration platform allowing staff to observe the data, discuss it, analyse it, and draw insights from it. Even today, a whopping 60% of work is carried out in paper or spreadsheets. This type of disjointed collection often requires highly skilled process scientists and engineers to devote as much as half of their workday to finding, reconciling, and assembling data maintained in disparate systems – a lamentable and expensive use of their time. Also cumbersome – and costly – is the fact that up to 30% of work is subject to rework because the data describing process execution and outcomes simply can’t be located. This creates a serious roadblock to efficiently accessing the process and quality data needed to make forward progress.
The costs are compounded by the high-risk nature of biologics development – risky because even under the best of circumstances, biological production conditions are a major source of unpredictable behaviour, from environmental variation, to contamination, to equipment issues. In the context of these risks, operational burdens manifest as an inability to understand the root causes of unexpected outcomes and to make informed, timely decisions as a therapeutic candidate progresses. These handicaps threaten the progress of any single development campaign, affecting time to market, attrition rates, time to trials, the ability to make informed decisions on drug candidate progression and more. The resulting delays are in the order of 6–18 months.
Ineffective data management clearly has huge effects on the time and cost of developing a new drug, especially factoring in the substantial quality and auditing overheads involved in demonstrating integrity and compliance in manual data management. Regulatory filings and tech transfer take significantly longer and are more cumbersome than they need to be.
Our team has calculated that, taking into account all the variables that can slow development, without an effective data management system spanning the entire lifecycle of a drug, the development timeline of a biologic can be prolonged by up to three years. This significant delay impacts not just a company’s bottom line, but the patients who will benefit from the therapeutic.
Streamlining the lifecycle
Navigating the complex and evolving R&D landscape with legacy tools is clearly not sustainable. The remedy is a BioPharmaceutical Lifecycle Management (BPLM) platform – an operational foundation for drug development process workflows, with components that allow complete integration into the development ecosystem, all the way from early development to clinical supply. This new product category addresses the many challenges presented by current digital solutions, leading to fewer human errors and improved accuracy.
“The time and cost of delivering new therapies and vaccines remains one of the biggest challenges for our industry”
Removing paper and bringing process and analytical data together right where the process is executed contributes to a contextualised data ‘backbone’ that streamlines the entire lifecycle. Such a system permits comprehensive search capabilities that help users find what they are looking for in critical context, reducing the need for unnecessary duplication. A BPLM will enable companies to navigate the complexities and inefficiencies of the drug development lifecycle much more effectively, while reaping the transformational benefits that come from a well-curated process and quality data backbone. Importantly, a comprehensive data management platform like this will fully support the requirements of advancing process science and in silico methods.
Here at IDBS, we have developed Polar™, which is designed to be rapidly deployed to solve workflow, process quality, collaboration and data analysis challenges that afflict currently available software. It will reduce manual data processing and transcription, leading to fewer human errors and improved overall accuracy, and includes comprehensive search capabilities that help users find what they are looking for, reducing the need to unnecessarily duplicate processes and driving innovation.
The bottom line
Many companies are still struggling to fully embrace the power of digital infrastructure, even now, when increased efficiency and data-driven insight make all the difference in the race to market novel products that will transform patients’ health and quality of life. Those who embrace rigorous approaches to unlocking the potential of their data will emerge as industry leaders in the coming decade, and be rewarded with the greatest commercial success as well. Those who accept business-as-usual will miss these rewards, costing themselves time and money through human error and rework. More importantly, they will delay life-changing pharmaceuticals from reaching the market and patients.
Graeme Dennis is Commercial Director of Preclinical Pharma at IDBS. Go to www.idbs.com
1 Wouters OJ, McKee M, Luyten J. Estimated Research and Development Investment Needed to Bring a New Medicine to Market, 2009-2018. JAMA. 2020;323(9):844–853.