Life science research is changing. A push for more reliable data, higher sample numbers, and uniform execution of protocols is driving an increase in the adoption of automation. Automation allows scientists to accurately carry out standard techniques, like polymerase chain reaction (PCR) and next generation sequencing (NGS), and is driving developments in genomics and fields that rely on genomics techniques, such as synthetic biology.
Today, several key challenges exist around genomics and molecular biology applications:
The COVID pandemic has hampered reagent and plastic supplies to labs worldwide. To cope with increasing demand for genomics applications across academic and industrial R&D, as well as in drug development and diagnostics for healthcare, many labs turned to miniaturization to help them save on reagents, decrease costs, reduce plastic waste, and ultimately get more reliable and robust data in their processes.
In this page we discuss miniaturization and automation, how they help tackle today’s challenges, and how precision volume reduction benefits your genomics applications without sacrificing quality.
Miniaturization is the term for scaling down the volume of a reaction mixture or assay in molecular biology. Miniaturization is useful for assay development, design of experiment (DoE) runs, and high-throughput screening. As research shifts to single cell applications, miniaturization can help researchers do a lot more with a single sample, given the low input volumes involved.
Scaling down common molecular biology and genomics protocols is simple because these applications use active enzymes that are both sensitive and precise, making them highly amenable to working volumes of 1/10 the prescribed volume or even lower (though this may be restricted by the kit being used).
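As a simple illustration of what scaling to 1/10 volume means in practice, the sketch below scales a hypothetical 20 µL reaction recipe. The component names and volumes are illustrative only, not from any specific kit:

```python
# Hypothetical 20 µL reaction recipe; names and volumes are illustrative only.
standard_mix_ul = {
    "2x polymerase master mix": 10.0,
    "forward primer (10 µM)": 1.0,
    "reverse primer (10 µM)": 1.0,
    "template DNA": 2.0,
    "nuclease-free water": 6.0,
}

def scale_recipe(recipe_ul, factor):
    """Scale every component volume by `factor`, rounding to 0.01 µL."""
    return {reagent: round(vol * factor, 2) for reagent, vol in recipe_ul.items()}

mini_mix_ul = scale_recipe(standard_mix_ul, 0.1)  # 1/10 scale -> 2 µL total
for reagent, vol in mini_mix_ul.items():
    print(f"{reagent}: {vol} µL")
```

Note that the smallest component here drops to 0.1 µL, which is already below what most hand-held pipettes can deliver reliably.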
At such low volumes, miniaturization of protocols must take advantage of automated liquid handling systems. Liquid handlers can perform standard to high throughput execution of protocols, reduce overall costs, and increase data quality through automated sample tracking. Automation also helps reliability by removing human error from the equation, especially for mentally tiring manual processes, and allows the researcher to spend more time on research, data handling, and analysis.
In addition to the obvious savings on reagent, miniaturization uses less of the sample in question. This allows the researcher to gain more data points from a single sample, allowing more strategic use of further assays.
Strategic use of miniaturization proved useful during the COVID-19 pandemic, as key diagnostic reagents became scarce, forcing clinicians to consider scaling down approved protocols.
In our experience, miniaturization reduces experimental cost at least by 75% while preserving cell/library success rate and method sensitivity.
Other main advantages of miniaturization are:
Find out more about how automation and miniaturization can help testing and reduce bottlenecks.
Automated liquid handling adds benefits such as:
These factors are good to keep in mind when starting your miniaturization and automation journey, as you can build your vision of a lab of the future today.
There are several reasons to miniaturize a protocol: to save on cost of reagents, to maximize the data you can extract from a sample, or simply to minimize waste in the lab. Why you require miniaturization will dictate how you go about it and which automated platform you may require (see Automation considerations below).
Before you miniaturize you must ask a few key questions.
Many common kits can be miniaturized as the protocols are largely additive. Additive protocols require dispensing only, with no mixing of the sample. At low volumes, such as those used in a miniaturized protocol, homogenization of the reagents occurs if they are added with sufficient velocity, in a process called turbulent mixing. Additionally, as volumes are reduced, diffusion has a greater effect on mixing efficiency.
Miniaturized protocols make use of magnetic beads for separation of DNA and cell lysate rather than standard centrifugation and filtration techniques. There are many types of magnetic bead available, each with different densities, diameters, and functions depending on your requirements. If bead separation is new to you, find a type that suits your application in advance by consulting a miniaturization specialist.
There are drawbacks to manually scaling down a protocol. As you scale down the working volume, pipetting error becomes a more prominent danger to the accuracy of your experiment, especially when working with viscous reagents. For instance, a 0.1 µL variance in manual pipetting has less impact in a 20 µL total volume than in a miniaturized 2 µL total volume. That is before you factor in potential human error. Automated liquid handling is required for reliable pipetting and effective miniaturization of a protocol.
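The arithmetic behind that example can be sketched in a few lines (a toy calculation, not tied to any particular instrument):

```python
# A fixed pipetting variance has a growing relative impact as volume shrinks.
def relative_error_pct(variance_ul, total_volume_ul):
    return variance_ul / total_volume_ul * 100

print(relative_error_pct(0.1, 20.0))  # 0.5% of a standard 20 µL reaction
print(relative_error_pct(0.1, 2.0))   # 5% of a miniaturized 2 µL reaction
```

The same absolute variance costs ten times more relative accuracy at 1/10 scale, which is why manual pipetting becomes the limiting factor.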
Over the years, multi-well plates have been condensed to fit more wells. The 96- and 384-well plates are common; however, 1,536-well plates are also in use. These are difficult to handle with multiple reagents, even with electronic pipettes. Additionally, these manual handling tasks are a poor use of a skilled researcher’s brainpower. More and more labs are moving towards automated liquid handling, not only for the immediate benefits of reliability and accuracy in carrying out experiments, but also because it increases the complexity of the experiments that can be done while freeing up the researcher to focus on experimental design and data analysis.
When we talk about high-throughput screening (HTS) and ultra-high-throughput screening (uHTS), this means samples are processed on an order of up to 100,000 per day for HTS and higher in the case of uHTS. Many genomics labs use medium to high throughput for DNA library screenings or whole genome sequencing projects.
The volume range and the way in which liquid is handled are inextricably linked. When miniaturizing protocols, you will require a specialist liquid handler operating a pipetting range in the nanoliter – microliter (nL – μL) range. More standard automated liquid handling platforms operate in the microliter to milliliter (μL – mL) range.
Platforms that use air displacement pipettes, which work like a standard hand-held pipette, can be disturbed by air pressure and may require additional calibration when aspirating and dispensing viscous reagents, such as those containing glycerol, or volatile reagents such as ethanol.
Platforms that use positive displacement tips, where a plunger comes into direct contact with the reagent, are unaffected by viscosity and air pressure. This style of dispensing is suited to additive kit protocols and uses a lower dead volume of reagent than others. When repeat-dispensing small volumes from a larger tip to multiple wells, the same tip cannot be used to mix by pipetting. This is not an issue in miniaturized protocols, however, as homogenization of the sample occurs by turbulent mixing and diffusion.
Acoustic liquid handlers, as the name suggests, use sound to transfer precision droplets of liquid to a target well. Like air displacement pipettes, these need to be calibrated to the viscosity of the reagents. By not using plastic tips, these devices significantly reduce the amount of waste produced by a lab. However, they are unable to transfer ‘intra-plate’ or handle higher transfer volumes, as they rely on an inverted destination plate positioned directly over the source plate, which can be limiting for some applications.
Always ask vendors to provide full information on the range, variance, and accuracy of their pipetting/dispensing technology, across a range of representative liquids matched to your application, to ensure it fits with the needs of your miniaturized protocol.
It may seem obvious to check the footprint of a new piece of lab kit, but with robotics, it is equally important to be aware of any moving parts or additional space that may be required when installing. A tabletop unit may fit your bench perfectly, but if the vendor also specifies a clearance between the device and the wall, a tight space may compromise safety and make it a poor fit.
A major barrier to adopting automation – if not the biggest barrier – is the worry that specialized knowledge is required to operate the device. Automated liquid handling is, for the most part, rather straightforward now. Several automated platforms are available as ‘plug and play’ and require little or no software coding expertise. Vendors typically provide training and may assist with protocol development. Additionally, find out if a copy of the software is freely available to install on other computers. This will allow you to design experiments away from the lab, and simply upload the file when you get in – a real advantage when stuck working from home!
With automation comes a big change to your supply chain. If the COVID-19 pandemic has taught us anything in science, it is to be mindful of where our consumables and reagents originate and if they will be delayed. With strategic implementation of miniaturization and automation in your lab, you can regain control of your needs and better cope with supply chain issues in the future.
Miniaturization addresses key supply chain bottlenecks by using much less volume of expensive reagents. When choosing an automation platform, speak to the vendor about tip needs and where to obtain them, and if the supply chain is secure, reliable, and fast. You may need tips in a hurry. An additional bonus is that automated liquid handling translates to less time in the lab for the researcher, meaning fewer gloves used. Implementing automation for the first time can also be a great opportunity for your lab to address the dirty little secret of science: single use plastic waste.
A 2015 study estimated that labs created 5.5 million tons of plastic waste every year. Unfortunately, life science labs cannot recycle plastics contaminated with biological matter – it is typically sent for incineration – or use eco-friendly bioplastics because they can interact with biological and chemical materials in kits and samples. If we cannot recycle, we must reduce our use of plastics to create less waste.
This depends on the kit chemistry and sample type. Some NGS kits are robust at a wide range of volumes and sample concentrations, while others allow up to a 4-fold volume reduction. Sample type (e.g. single cells vs bulk DNA/RNA, or high vs low RNA content) should also be considered. Generally single cell workflows allow substantial miniaturization as the input is just a few picograms of genetic material. In the case of bulk RNA and DNA miniaturization, it is still possible but should be less extreme to ensure sufficient coverage and depth of sequence data. Samples for bacterial and viral genomic amplicons, due to their low complexity, can be processed using low volumes of the reagents.
Yes, there is a risk associated with reducing volumes unless you use a fit-for-purpose system. For precision liquid handling at very low volumes, automated liquid handlers like the mosquito HV genomics and dragonfly discovery use true positive displacement technology. Here, the plunger comes into direct contact with the reagent, and liquid dispensing is unaffected by viscosity and air pressure. This style of dispensing is suited to additive kit protocols and uses a lower dead volume of reagent than others.
For sample transfer, discrete disposable tips should be used per sample. For reagent additions non-contact dispensing is a good option as this typically utilizes disposable sterile tips which do not contact the target plate or tube. They also dispense liquid without splashing. If contamination is a major concern, check if it is possible to place the liquid handler in a laminar flow hood.
Speed of reagent dispensing is important. Consider a humidity chamber and/or cooling modules to decrease the evaporation rate. Evaporation will always be slower in a smaller diameter well, so miniaturized reactions may work better in a 384 well plate than in a 96 well plate. In other words, the smaller the better!
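The effect of well diameter can be roughed out by comparing exposed liquid surface areas. The diameters below are assumed typical values and the wells are approximated as circular; check your plate’s specification sheet for real dimensions:

```python
import math

# Approximate well diameters (assumed typical values; wells treated as circular).
DIAMETER_96_MM = 6.9
DIAMETER_384_MM = 3.6

def exposed_area_mm2(diameter_mm):
    """Exposed liquid surface area of a circular well."""
    return math.pi * (diameter_mm / 2) ** 2

a96 = exposed_area_mm2(DIAMETER_96_MM)
a384 = exposed_area_mm2(DIAMETER_384_MM)
print(f"96-well exposed surface:  {a96:.1f} mm²")
print(f"384-well exposed surface: {a384:.1f} mm²")
print(f"ratio: {a96 / a384:.1f}x")  # roughly 3.7x less surface exposed per well
```

Less exposed surface per well means a slower evaporation rate, all else being equal.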
Assuming positive displacement technology is used for both pipetting and dispensing steps in miniaturized workflows, the amount of dead volume depends more on the labware the tips fill from. Choose low dead volume plates or reservoirs.
Automation reduces human error by precise execution of protocols, minimizing unnecessary repetition. Technologies like positive displacement pipette tips do not physically contact biological samples when dispensing reagent and can be disposed of in an eco-friendly manner. When miniaturizing protocols, a single tip can dispense multiple volumes, thereby reducing the number of tips required for a given process. As mentioned in the previous section, acoustic liquid handling platforms do not require tips at all, although they require a specific type of source plate into which the samples or reagents to be dispensed must be loaded.
Switching from 96 well plates to more compact 384 well formats requires a thermocycler with an adequate heating block. You may simply need to buy the required block, though some instruments may not be compatible, in which case a new thermocycler must be purchased.
While it might be possible to reduce reagent cost, if consumables (plates, tips) are expensive then total savings can be much lower than expected. This has prompted a shift towards more compact plates, such as 384 well plates.
Automated liquid handling devices nowadays typically have user friendly software, are easy to program (no programming skills required) and are beneficial, especially in an academic environment. Getting the most from your technology may extend beyond simple technical support to also include Field Application Scientists that provide scientific advice and assist laboratory teams in the installation of instruments and design of scientific applications. As each vendor and device is different, do check before purchase if any specialist knowledge is required, what tech support comes with the package, details of license agreements, and of course training.
Ideally not in our lifetime! Most devices are designed with an open platform that allows them to be flexible and change direction if needed.
Most genomics techniques use expensive enzymes and chemical reagents. As research becomes more specific, looking at the single cell level, the challenge faced by researchers is how to extract the most information from less sample volume in a cost-effective way. Genomics techniques are amenable to miniaturization because the enzymatic reactions and genetic material samples they use can be efficiently scaled down, allowing researchers to do more with less.
Genomics is the study of genomes, covering everything from their structure and function to their editing and evolution.
Genomics is an interdisciplinary field using advancements in molecular biology tools to assess the makeup of a genome, the transcription of its genes, and more. Common techniques include CRISPR, NGS, PCR, and quantitative reverse transcription PCR (RT-qPCR), as well as sequencing-based approaches such as DNA-seq and the transcriptomic technique RNA-seq.
PCR and NGS are revolutionizing diagnostics and public health as advancements in genomic technology help usher in the era of personalized medicine, better diagnostic techniques, and new therapeutics.
NGS is powering precision technologies like single cell sequencing, keeping track of mutations accumulating in SARS-CoV-2 during the COVID pandemic, and creating masses of high-quality data for artificial intelligence platforms to trawl. These data are now playing a role in population genomics and conservation.
Genomics is also crucial for the emerging field of synthetic biology/engineering biology in identifying and engineering novel metabolic pathways. These can be used for producing bio-based sources of currently unsustainable or petrochemical-derived compounds, antimicrobial compounds, and therapeutics.
PCR makes copies of a piece of DNA in a matter of hours, which can then be used in cloning or sequencing, or simply run on a gel and viewed. Real Time PCR (qPCR) is used to quantify DNA copy numbers, with practical application in diagnostics such as SARS-CoV-2 detection. RT-qPCR quantifies gene expression, as the reverse transcriptase step creates the DNA template from RNA.
PCR works in a stepwise process: Denaturation of DNA; Annealing of oligonucleotide primers; and Extension of a new DNA strand with DNA polymerase. This process is repeated, typically for 25 or more cycles. Primers and nucleotides in the mixtures can be modified to add features such as restriction enzyme sites or fluorescent tags. RT-qPCR and qPCR use the same method, but additionally use fluorescence to quantify DNA amplification as it happens, so there is no need for a gel run.
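The cyclic doubling described above compounds quickly, as a minimal model shows (idealized amplification; real reactions eventually plateau):

```python
# Idealized PCR: each cycle multiplies template copies by (1 + efficiency).
# Real reactions plateau, so this holds only for the early cycles.
def pcr_copies(initial_copies, cycles, efficiency=1.0):
    """Copies after `cycles` rounds at the given per-cycle efficiency (0-1)."""
    return initial_copies * (1 + efficiency) ** cycles

print(pcr_copies(10, 25))       # 10 copies -> 335,544,320 after 25 perfect cycles
print(pcr_copies(10, 25, 0.9))  # ~93 million at 90% per-cycle efficiency
```

This exponential behavior is why picogram inputs suffice, and it is the basis of quantification in qPCR: the cycle at which fluorescence crosses a threshold reflects the starting copy number.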
The PCR mixture is straightforward and entirely additive, lending itself to easy automation. qPCR and RT-qPCR are best suited to miniaturization, as the fluorescence detection of amplification gives an immediate result, whereas the small amount of DNA amplified by a miniaturized standard PCR may not yield enough volume to visualize on an electrophoresis gel. In functional genomics, PCR is used to create recombinant DNA – typically a gene of interest on a plasmid vector – for expression in a cell.
PCR is the backbone of molecular biology. Its applications in biotechnology, therapeutics development and diagnostics make it an essential tool for every life science researcher. The technique is highly sensitive, requiring only picograms of DNA template in order to work, and is therefore highly amenable to miniaturization. Miniaturizing PCR for diagnostics can help stretch reagent use and reduce costs of diagnostic tests such as for COVID-19.
Learn more about miniaturizing PCR:
Introducing nano-scale quantitative polymerase chain reaction | Drystad et al. 2018.
Saliva is less sensitive than nasopharyngeal swabs for COVID-19 detection in the community setting | Becker et al. 2020.
Plasmid vectors are assembled (and disassembled) using a variety of methods, including type II restriction enzymes, Golden Gate assembly, and Gibson assembly. Using PCR amplicons of molecular parts – such as promoters, replicons, transposons, and terminators – plasmid vector backbones can be built with gene circuits and properties to study the functional and structural genomics of an organism.
Cloning and vector assembly commonly use type II restriction enzymes. Restriction enzyme recognition sites are straightforward to add onto PCR primers, making DNA fragments simple to clone into a vector.
Golden Gate assembly uses these enzymes sequentially, so a vector of multiple parts can be assembled in a one-pot reaction. The major drawback of type II restriction enzyme sites is that they tend to leave ‘scars’, or traces of themselves, in the backbone of the plasmid vector or DNA sequence, shifting open reading frames or putting added distance between a gene and its promoter sequence.
Gibson assembly allows for scarless cloning. This technique makes use of DNA’s ability to anneal to itself. Fragments of the vector and insert are designed with approximately 20-40 bp of DNA overlapping the neighboring fragment. Three enzymes are required to complete the cloning: a 5’ exonuclease pares back one strand of DNA on each side, allowing the fragments to fit together; DNA polymerase closes the gaps after the DNA has annealed; and DNA ligase seals the nicks.
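A basic design check for those overlaps can be sketched as follows. The sequences are hypothetical placeholders, and a real design would also check melting temperature and secondary structure:

```python
# Check that adjacent Gibson fragments share a terminal overlap in the
# recommended 20-40 bp window. Sequences are made-up placeholders.
def overlap_length(frag_a, frag_b, min_len=20, max_len=40):
    """Longest suffix of frag_a that is also a prefix of frag_b,
    searched within [min_len, max_len]; 0 if none is found."""
    for n in range(min(max_len, len(frag_a), len(frag_b)), min_len - 1, -1):
        if frag_a[-n:] == frag_b[:n]:
            return n
    return 0

overlap = "ACGTACGTACGTACGTACGT"       # 20 bp shared junction
frag_a = "ATGCGT" + overlap            # vector fragment ending in the overlap
frag_b = overlap + "TTGACC"            # insert fragment starting with it
print(overlap_length(frag_a, frag_b))  # 20
```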
The importance of the ability to introduce DNA to an organism cannot be overstated. From functional genomics to the development of gene therapies, introducing DNA – whether to add a new gene product, overproduce an existing gene product, or delete, silence, or interfere with existing genes – is fundamental to the field of genomics.
Restriction enzyme reactions, Golden Gate, and Gibson assemblies can be performed in one pot with incubation steps, making them favored cloning techniques among those who use automation. All of these methods are suitable for miniaturization, which can help reduce costs by minimizing reagents and save time on library preparation.
NGS is a rapid DNA sequencing method that is useful for reading whole or partial genomes, confirming DNA sequence of PCR amplicons, and can be used in diagnostics. The technique can also be applied to cDNA and RNA, resulting in rapid and accurate transcriptomic analyses.
Next generation sequencing works like PCR but uses fluorescently labeled deoxynucleoside triphosphates (dNTPs). During the extension step, DNA polymerase incorporates fluorescent dNTPs, which emit a signal. This signal is picked up by a detector, recording the DNA sequence in real time.
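The cycle-by-cycle readout can be pictured with a toy model; the channel-to-base mapping below is arbitrary, not that of any real sequencing chemistry:

```python
# Toy sequencing-by-synthesis readout: each cycle the detector records which
# fluorescent channel lit up; mapping channels to bases recovers the sequence.
CHANNEL_TO_BASE = {"red": "A", "green": "C", "blue": "G", "yellow": "T"}

def basecall(channel_signals):
    """Convert one detected channel per cycle into a base sequence."""
    return "".join(CHANNEL_TO_BASE[channel] for channel in channel_signals)

print(basecall(["red", "yellow", "green", "blue", "red"]))  # ATCGA
```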
NGS is commonly applied to whole genome sequencing due to its speed but is also used for smaller fragments such as PCR products and plasmid vectors, lending itself to every area of biological research. It is also used to monitor gene expression from cDNA generated from RNA, allowing researchers to take a snapshot of gene expression at a particular time in a cell. This method is called RNA-seq and has replaced less reliable hybridization microarray technologies. Its sensitivity even enables single cell applications of these methods. NGS can also be used in population genomics and diagnostics.
To prepare a library for NGS, you must first consider the downstream procedure it will be used for. The protocol below can be miniaturized to a final volume of 4 µL. Check out this demo of library preparation.
Because NGS is based on the principles of RT-qPCR (elongation and fluorescence) and the protocol is simply additive, it is easily miniaturized. This saves on very costly reagents (polymerase enzyme, buffers, fluorescent dNTPs), which currently stand in the way of achieving the $100 genome using NGS technology.
Miniaturization technologies for cost-effective AmpliSeq library preparation for next generation sequencing | Ogiso-Tanaka et al. 2018.
Efficient high throughput NGS Sample Preparation | SPT Labtech.
Increasing NGS experimental power without increasing costs
The high cost of library production for Illumina NGS is a critical factor in the design of under-powered experiments. Dr. Stuart Levine, Director of the BioMicro Center at MIT, has developed methodologies to dramatically lower library preparation costs without sacrificing data quality. Performance data from the Collibri ES DNA Library Prep Kits demonstrate a 10-15 fold reduction in costs.
A major limitation of using single cell sequencing to study gene expression across a tumor is that, after biopsy, it can be difficult to trace an analyzed single cell back to its precise location in the tissue. Spatial transcriptomics is a method of mapping gene activity across a tissue. This differs significantly from bulk RNA-seq, which gives just an average across the whole sample.
A tissue sample is placed on a glass slide or in a plate arrayed with oligonucleotides containing positional barcodes and cDNA primers to capture mRNA from each position. The tissue is fixed and permeabilized, allowing capture of the mRNA. The captured fragments are converted to cDNA and sequenced using NGS and the positional barcodes allow for mapping back to the point of origin.
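The barcode-to-position mapping step can be sketched as below. The barcodes, coordinates, and read layout are invented for illustration; real platforms use longer barcodes plus unique molecular identifiers and error correction:

```python
# Map reads back to tissue positions via the positional barcode that
# prefixes each read. All identifiers here are invented for illustration.
BARCODE_LEN = 6
BARCODE_TO_POSITION = {
    "AACGTG": (0, 0),
    "TTGCAA": (0, 1),
    "GGATCC": (1, 0),
}

def map_read(read):
    """Return ((row, col), cDNA fragment), or None for unknown barcodes."""
    barcode, fragment = read[:BARCODE_LEN], read[BARCODE_LEN:]
    position = BARCODE_TO_POSITION.get(barcode)
    return (position, fragment) if position is not None else None

print(map_read("AACGTGATGGCGTTT"))  # ((0, 0), 'ATGGCGTTT')
```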
Spatial knowledge – physically where in the tissue the cell originated – is often lost in single cell RNA-seq. To get a better idea of how gene expression occurs in a dynamic, spatially organized tissue such as the liver, spatial transcriptomics is required to visualize and gain a quantitative analysis of the transcriptome in tissue samples. Spatial transcriptomic data is more resolved and can detect DNA perturbations on the transcriptomic level, allowing more accurate diagnoses.
As with other forms of single cell sequencing, miniaturization of spatial transcriptomics using automated liquid handling results in robust, reliable execution of the protocol while saving on expensive and sensitive reagents and making use of small sample volumes.
It should be noted that miniaturization can only be performed on a plate (as opposed to on-chip methods) and is dependent on the volume in which the sample has been added to the plate in question.
Learn more about miniaturizing spatial transcriptomics:
Clustered regularly interspaced short palindromic repeats (CRISPR) and the associated (Cas) proteins form a bacterial DNA-editing system that effectively acts as a microbial immune system. Genetic editing is a useful technique in functional genomics. It allows the researcher to add or remove from the genome regions suspected to be involved in regulation of gene expression, as well as alter the location of genes on chromosomes to observe the effect location has on gene activity.
When a bacterium encounters a bacteriophage (a virus that infects bacteria), CRISPR proteins may recognize its DNA, physically chop it up, and insert parts of it into the bacterial genome so the invader can be recognized more readily in the future. This bacterial DNA editing mechanism works in other cell types, including human cells. The CRISPR proteins can be re-targeted using a special type of RNA called guide RNA (crRNA), allowing precision modifications to be made anywhere in any genome. Once the CRISPR-Cas complex is guided to the target site, it makes a cut in the DNA. The host organism then fixes the cut with its own DNA repair machinery. This is where another piece of DNA matching the sequence upstream and downstream of the cut can recombine into the genome, via a process called homologous recombination.
CRISPR has many applications in R&D gene editing, knocking in and knocking out DNA. In one recent example from healthcare genomics, CRISPR allows for rapid in vivo functional genomics in patient-derived xenografts, where tumor tissue is transplanted into a mouse model. Recent research has shown that CRISPR can perform targeted gene disruption, allowing clinicians to observe genetic dependencies within the tumor and analyze mechanisms of acquired drug resistance by site-specific editing.
CRISPR efficiency often depends on the target in question, and editing may not always be effective. Large CRISPR libraries can be generated and screened with ease using automation and miniaturization to keep costs low while maintaining reliable results. Much like other vector assemblies (see Molecular Cloning), a CRISPR vector library with crRNA and DNA repair sequences is amenable to miniaturization, as the process is largely additive.
Synthetic biology/engineering biology is a precision approach to biological research. Based on the Design-Build-Test-Learn framework, automation and software play a major role in carrying out experiments and providing robust and reliable data.
Much like the silicon chip industry uses standardized parts to build its products, this field aims to do the same. Standardized molecular parts – such as promoters, terminators, and riboswitches – make up the toolkit relied on in many protocols. Gene circuits, metabolic pathways, and vectors are engineered, built, tested, and improved upon in an iterative design process.
The field aims to support a growing, sustainable bio-based economy through the development of new R&D tools, bio-based materials and products, and therapeutics. The next generation of bio-based products could be entirely synthesized from standardized DNA parts in the lab.
Synthetic biology/engineering biology relies heavily on many of the NGS and genomics tools listed above and encourages the uptake of automation to ensure valid, reproducible results. Miniaturization helps by reducing environmentally hazardous reagent use and minimizing plastic waste, chiming with the bio-based philosophy driving the field forward. These techniques are highly amenable to miniaturization and, in carrying out high-throughput iterative cycles of experiments and vector/gene circuit building, may benefit from reduced costs by using miniaturized protocols.
This guide covers everything you need to consider when automating and miniaturizing genomics applications, including:
...and much more! Click here to download the guide