Large Molecule Research: Higher Targeting Efficiencies Still Leave Room for Automation

The rise of biopharmaceuticals has transformed research paradigms and continues to displace traditional small molecule therapies in the market. Biologics are forecast to account for more than half of the Top 100 selling drugs by 2022, up from roughly one-third in 2012.

Driving these therapies to market has required significant investment in upfront R&D efforts. With an average biopharmaceutical development timeframe of 8 years, preclinical research programs for large molecules hitting the market today were defined almost a decade ago, pointing to the maturity of upfront preclinical research despite the novelty of large molecule therapeutics.

However, while use of automation in small molecule drug discovery research is well-established – think flexible, integrated automation capable of screening millions of compounds at dedicated screening centers – the use of automation in large molecule discovery lags. One estimate pegs large molecule research at only one-tenth the current level of automation of small molecule research.

This difference could be explained by higher success rates in large molecule research programs. With 30% of initial molecules ultimately approved vs. only 22% in small molecule, it can be argued that the targeting efficiency of biologics reduces the number of molecules that need to be screened upfront, and thus lowers overall automation needs from a pure volume standpoint. Primary screens for small molecules are on the scale of 10^6 compounds, while for antibodies they remain limited to just 10^4. Although large molecule therapeutics have been successfully developed despite this discrepancy, growing competition in the space will necessitate novel approaches. Automation can be central to this differentiation, with several opportunities that already stand out today:

  • Sample Conservation – Automation can be leveraged to minimize the high degree of sample loss that occurs in upfront sample prep, which is notably time-consuming and laborious for large molecules. In antibody discovery, the creation of hybridoma fusions results in a 99% yield loss on average, drastically reducing potential screening volumes. Precise and uniform robotic processes can help minimize the errors and subsequent losses associated with manual handling.
  • Diversity Requirements – Although large molecules are considered more targeted than small molecules, optimizing exact binding characteristics requires mining an extremely diverse genetic repertoire. In antibody research, the desire for increased library diversity is shown by trends toward immunization of new animal species (e.g., camels) and growth in phage display technologies. Automation is a clear solution for efficiently processing larger and more diverse large molecule libraries, bringing primary screens up from tens of thousands to hundreds of thousands of molecules.
  • Data Management – A growing library of data is being produced in the pursuit of large molecule drug development. Tracking each molecular entity through to its experimental output forges feedback loops that can drive the efficiency of future experimentation, and this is made possible by both automation hardware and software. One example of closing the loop via automation is the partnership between Genedata, Titian and HighRes.
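The sample conservation arithmetic above can be sketched in a few lines. This is a back-of-envelope illustration using only the figures cited in this article (a ~99% yield loss at hybridoma fusion, small molecule screens at ~10^6 compounds, antibody screens at ~10^4); the starting pool size is an assumption chosen to connect those two figures, not experimental data.

```python
def surviving_candidates(starting_pool: int, yield_loss: float) -> int:
    """Return the number of candidates left after a lossy prep step."""
    return int(starting_pool * (1.0 - yield_loss))

# Hypothetical starting pool comparable to a small molecule primary screen
starting_b_cells = 10**6
fusion_yield_loss = 0.99  # ~99% average loss at hybridoma fusion, per the text

hybridomas = surviving_candidates(starting_b_cells, fusion_yield_loss)
print(hybridomas)  # 10000 — the ~10^4 antibody screening scale cited above
```

The point of the sketch: upstream sample loss, not screening capacity alone, caps antibody screens two orders of magnitude below small molecule screens, which is why automation aimed at reducing prep losses expands effective library size.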

While drug development efforts remain robust across the board, some predict that long-term pharma returns will suffer as the “low hanging fruit” opportunities dissipate. Treatments have been developed for less complex disease states, and generic competition stands to enter a range of indications previously addressed. Launching future therapeutics requires achieving a new standard of efficacy or tackling the complex indications that have yet to yield a success.

Although these developments certainly signal an increasingly competitive market, this does not mean that pharma returns must suffer. Beyond topline revenue potential, the costs of developing and delivering new therapeutics must be taken into account, and here automation has a clear role to play. Automation has the capacity to improve precision and data quality, repurpose labor toward higher-order experimental design tasks, and reduce supply waste, among other benefits. These operational efficiencies all contribute to improved outcomes at lower total expense and are at the heart of driving returns. In maturing markets, growing competition only increases the urgency with which proper tools, namely automation, must be employed to deliver the same results.
