Screening methodologies

15 December, 2016

Counting compounds in or out?

In the search for new drugs, the harvesting of low-hanging fruit and increased regulatory hurdles have decreased R&D productivity. Many candidates incur enormous financial losses by failing very late in clinical trials, often owing to safety or toxicity concerns rather than lack of efficacy. So, are we using the best methods to select our hit compounds?

Target-based high-throughput screening 

In the 1990s, companies turned to target-based primary screening as a way to screen more compounds reliably. Target-based screening typically uses overexpressed proteins identified as playing a key role in the disease area to create a consistent, sensitive model. The assays are robust biochemical ones that are easily automated to run at high throughput, which matters now that a typical pharmaceutical compound library contains around 2–3 million compounds. However, once hit compounds are identified, their effects still need to be confirmed in a biological system.
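To put that library size in context, a back-of-the-envelope calculation shows how many assay plates a full-deck screen implies. The 2–3 million figure comes from the text; the 2.5 million midpoint, the plate formats and the 32 control wells per plate are illustrative assumptions, not a real screening layout.

```python
# Rough estimate of plates needed to screen a large compound library.
# All layout figures below are illustrative assumptions.
LIBRARY_SIZE = 2_500_000          # midpoint of the "2-3 million" in the text
CONTROL_WELLS = 32                # assumed wells reserved per plate for controls

for wells_per_plate in (384, 1536):
    compound_wells = wells_per_plate - CONTROL_WELLS
    plates = -(-LIBRARY_SIZE // compound_wells)   # ceiling division
    print(f"{wells_per_plate}-well plates needed: {plates:,}")
```

Even in 1536-well format the screen still spans well over a thousand plates, which is why read speed per plate matters so much in primary screening.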

Target-based screening has generated many ‘hits’, but it has also produced a greater failure rate in the clinic, due to poor target validation early on in the drug discovery process. The main cost in drug discovery is not the initial high-throughput screen; it is the stages after that.

“People often talk about screening as finding ‘hit’ compounds. Paradoxically, in my opinion, it is about screening out the compounds that are no good – so they aren’t pursued and money is not wasted in the later stages of drug discovery,” says Dr Paul Wylie, TTP Labtech’s Head of Applications.

Phenotypic Screening: gaining physiological relevance

There is now a renewed interest in phenotypic screening as a discovery tool. “The real benefit”, says Dr Wylie, “is seeing what each compound actually does to a cell; not just to an isolated protein in a non-physiological environment.”

High Content Screening (HCS) is a standard automated approach for observing phenotypic changes within the cell in response to a compound. Normally, multiple features of each individual cell or organism present are measured, and it is this multiparametric, single-cell readout that underpins the approach’s true power.

“HCS enables phenotypic assays that can measure changes within a cell, or movement between cells, or permit the analysis of specific sub-populations of cells in a heterogeneous mix, that would be difficult or impossible to perform with other technologies,” explains Dr Wylie.
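The sub-population analysis Dr Wylie describes boils down to treating each cell as a row of measured features and gating on them. The sketch below illustrates the idea in plain Python; the feature names, values and thresholds are invented for illustration and do not correspond to any particular assay.

```python
# Per-cell, multiparametric HCS data: each entry is one cell, each key a
# measured feature. Names, values and thresholds are illustrative only.
cells = [
    {"id": 1, "nucleus_area": 95.0,  "marker_intensity": 1200.0},
    {"id": 2, "nucleus_area": 140.0, "marker_intensity": 300.0},
    {"id": 3, "nucleus_area": 100.0, "marker_intensity": 1500.0},
]

# Gate a sub-population out of the heterogeneous mix: cells with small
# nuclei AND a strong marker signal (assumed response criteria).
responders = [c for c in cells
              if c["nucleus_area"] < 120 and c["marker_intensity"] > 1000]

fraction = len(responders) / len(cells)
print(f"responding sub-population: {fraction:.0%}")
```

A single-value biochemical readout averages over the whole well; this per-cell gating is what lets HCS report that, say, only a sub-population responded, which a bulk assay would miss.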

“High content assays are amongst the most demanding assays to run on a large scale, since they involve live cells, multiplexing of fluorescent dyes, proteins or probes, and multiple readouts,” he continues.

The vast majority of HCS readers are based on automated microscopy. “Initially, these techniques, whilst powerful, were difficult to use, relatively slow and presented barriers to a mass uptake within pharma,” says Dr Wylie. “Over the last 15 years, many of the issues have been addressed and, today, they are used in mainstream R&D for phenotypic screening, which is smaller scale.”

The reasons for not using them in primary screening are:

  • microscope-based high content imaging systems are relatively slow compared to target-based screening methods
  • they are generally incompatible with high density (1536-well) microplates
  • they generate large data sets

“Sophisticated IT storage systems are available to archive these large amounts of data”, says Dr Wylie, “but the IT infrastructure to handle the data effectively and – crucially – retrieval of the data can still be a pain point.”

Another problem is that modern screens have focussed on more physiologically relevant cell models, which now include primary cells and induced pluripotent stem (iPS) cells. “These cells are ever closer to a true in vivo model, but the numbers normally required to run a phenotypic screen can make the cost prohibitive,” explains Dr Wylie.

Is primary screening afraid of phenotypic assays?

“With the availability of significant improvements in assay reagents, labware and readers, I believe it is now possible to run a cost-effective, robust approach to screening entire compound libraries in a phenotypic manner,” says Dr Wylie.

Laser-scanning imaging cytometers (LSICs), such as acumen Cellista (TTP Labtech), combine object recognition with bulk read speeds (typically under 5 minutes per plate). They require far fewer cells (typically 100 per well) and produce very small file sizes, so the IT requirements are the same as for target-based screening. This means that LSICs like acumen Cellista are ideally placed to offer practical, high-throughput, full-deck phenotypic screening.
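The sub-5-minutes-per-plate figure above translates directly into daily throughput. A quick estimate, taking the 5-minute read time from the text and assuming 20 hours of unattended running per day (an assumption, not a vendor specification):

```python
# Throughput estimate for a reader at ~5 minutes per plate (from the text).
# The 20-hour run day and 1536-well format are illustrative assumptions.
MINUTES_PER_PLATE = 5
RUN_HOURS_PER_DAY = 20

plates_per_day = (RUN_HOURS_PER_DAY * 60) // MINUTES_PER_PLATE
wells_per_day = plates_per_day * 1536
print(f"~{plates_per_day} plates/day, ~{wells_per_day:,} wells/day")
```

At that rate a multi-million-compound library becomes a matter of days rather than months, which is the practical case for running phenotypic assays in primary screening.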

“The costs, it is true, are higher than traditional target-based screening approaches,” states Dr Wylie, “but it is surely a much cheaper way forward than pursuing poor compounds that ultimately fail at later stages of drug development. Screening is about excluding the compounds that are not useful, not necessarily about finding the hits.”