Phenotypic screening: dealing with data complexity

16 July, 2014

Most people are fully aware of the bottlenecks that image-based phenotypic screening programs can create: working through tens or even hundreds of thousands of compounds to root out a viable candidate molecule is a huge drain on time and resources. What is less well recognized, or discussed, is the overwhelming data burden that high content screening creates. With individual plates generating gigabytes of data, storage and processing can quickly slow progress.


The challenge is to balance data richness against manageable data volumes. Imaging with a standard 10x objective, for example, captures only a fraction of an individual well. While this reduces file size compared with imaging the whole well, it also samples only part of the well, which can misrepresent the assay. This issue is usually addressed by 'stitching' the small frames together into a larger composite image of the well. Such a limited field of view also typically restricts samples to small, flat objects such as cells, prohibiting research using larger subject matter such as embryos or even whole organisms (e.g. C. elegans). Couple this imaging limitation with the generation of vast data sets and the consumption of valuable laboratory time, and you are left with a surprisingly inefficient screening program that compromises both the richness and the speed of output.
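In its simplest form, stitching just concatenates adjacent fields of view into one composite array. The sketch below (using hypothetical frame data, and ignoring real-world concerns such as overlap registration and illumination correction) illustrates the idea with NumPy:

```python
import numpy as np

def stitch_frames(frames, rows, cols):
    """Stitch equally sized 2-D frame arrays, acquired in row-major
    order, into a single composite well image."""
    assert len(frames) == rows * cols, "frame count must match the grid"
    # Join each row of fields side by side, then stack the rows vertically.
    grid_rows = [np.hstack(frames[r * cols:(r + 1) * cols]) for r in range(rows)]
    return np.vstack(grid_rows)

# Example: a 2x2 grid of four 512x512 fields from a 10x objective
fields = [np.full((512, 512), i, dtype=np.uint16) for i in range(4)]
well = stitch_frames(fields, rows=2, cols=2)
print(well.shape)  # (1024, 1024)
```

Production stitching software additionally aligns overlapping frame edges and corrects for vignetting, but the memory cost scales the same way: the composite holds every pixel of every field, which is why whole-well imaging multiplies data volume so quickly.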

More imagers, more problems

A common approach to solving the problem of processing time is, understandably, to purchase more imagers. More imagers means increased throughput, which correlates to progress, right? Not necessarily. If you are already generating very large data sets with a single imager, you will generate concomitantly more data with more imagers, which only adds to an already difficult-to-manage data burden. Consequently, many labs are now experiencing problems with data storage, processing and analysis – a whole new bottleneck has emerged!

An optimized solution

Fortunately, approaches to phenotypic screening are evolving quickly, and there is a better way. The acumen® Cellista is a microplate-based laser scanning cytometer designed to provide single-shot, whole-well, content-rich cytometric and image-based analysis. With features like an F-theta widefield objective for whole-well imaging and a PMT detector array for simultaneous multiplexing, imaging is both advanced and streamlined. Currently the fastest imager on the market, the acumen can process 300,000 wells per day while simultaneously exporting single-shot whole-well TIFF images. Optimized workflows turn data into a manageable, content-rich resource rather than a burden.


To find out more about our imaging solution that reduces data burden, contact our team.