Reducing variability in biomedical research

Andrew Peck is a Principal Business Development Manager at Waters Corporation (MA, US) in the Biomedical Research Group. Andrew has over 12 years of experience in the Life Sciences industry covering pharmaceutical, diagnostics, and biomedical research market segments.

Andrew received his PhD in Experimental Neuroscience from the University of Connecticut (CT, US) and conducted his post-doc in the Ashton Graybiel Spatial Orientation Lab at Brandeis University (MA, US). He is interested in the business of science and how technology can push our understanding of biological systems forward. Andrew is particularly interested in how LC–MS/MS workflows can help bring greater resolution to the complexities of human disease and lead to new discoveries for therapeutic interventions.

Biomedical research is largely centered on making new discoveries about human health and disease. The ultimate goal is to understand how human biology works and when and why it goes awry. With a strong foundation in the mechanisms of health and disease, advances in diagnostics and therapeutic interventions can be generated, bringing cures to recalcitrant diseases such as cancer and neurodegenerative pathologies.

In the steady, albeit oftentimes wobbly, march of science, we are continually striving to push back the fog that obscures our understanding of the biological, chemical, and physical mechanisms that generate and guide health and disease. Significant gains include:

  • The mapping of the human genome.
  • The build out of expansive protein databases for a variety of species, including humans.
  • The generation of novel therapeutic agents to address a host of diseases and improve the lives of many patients and their caregivers.

Yet, in spite of all this tremendous advancement in our knowledge, the engine of innovation has stuttered and the translation of new discoveries into products improving human health has slowed. We all know the statistics: thousands of INDs filed yet only tens of NMEs approved; tens of thousands of new biomarkers discovered and only a handful of new diagnostics introduced; internal rate of return steadily decreasing for decades [1–3]. The innovative application of engineering to immune function has opened the door to a new horizon of cancer treatments and offers hope for numerous patients with otherwise bleak survival profiles. Someday, we all might be ‘vaccinated’ against common forms of cancer (if such common forms truly exist), but even these incredible advancements are more of a technical and engineering feat than a deeper understanding of biology. We have known about T-cells and the function of the immune system for decades. The revolutionary advancement of new oncology therapies, such as CAR-T, lies in enabling a person’s immune system to recognize something that was previously invisible.


So why has the return on investment in biomedical research been less than dazzling? Open a page of a high-impact journal or attend a talk at a highly regarded conference and it is typical to see results presented that are statistically significant but with correlation (r) values under 0.5. Research that produces correlation values under 0.5, even if statistically significant, accounts for less than 25% of the variability in the dependent variable. That means that over 75% of the variability in the variable under study is due to some other factor, or host of factors, not accounted for, or perhaps even considered, in the experimental design. Alternatively, the unaccounted-for variability might stem from analytical variability, or from the inability of the measurement methods used to accurately characterize the extraordinarily complex nature of biology. Regardless of its origin, variability appears to be at the heart of why so much research cannot be reproduced.
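The arithmetic behind this claim is simply the coefficient of determination, r², which gives the fraction of variance in the dependent variable explained by the correlated factor. A quick sketch in Python illustrates how little an r of 0.5 actually explains:

```python
# For a Pearson correlation r, the fraction of variance in the
# dependent variable explained by the relationship is r squared;
# everything else is unexplained variability.
for r in (0.3, 0.5, 0.7, 0.9):
    explained = r ** 2
    unexplained = 1 - explained
    print(f"r = {r:.1f}: {explained:.0%} explained, {unexplained:.0%} unexplained")
# r = 0.5 gives 25% explained, 75% unexplained
```

So even a "significant" correlation of 0.5 leaves three quarters of the variability in the phenomenon unaccounted for.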

Technology advancements can help to control variability. Two of the primary strengths of liquid chromatography–tandem mass spectrometry (LC–MS/MS) can help address the variability inherent in biomedical research. The first is the ability to measure with extraordinary accuracy and precision: specificity. The second is the ability to quantitatively measure multiple biomolecules in a single sample: multi-analyte measurement capability.

Nature is exquisitely efficient. A single progenitor biomolecule, for example, can give rise to hundreds of structurally and chemically similar metabolites depending on which enzymatic pathways are up- or down-regulated. And often, two metabolites derived from the same parent molecule will have opposing bioactivity: vasodilation for one metabolite, say, and vasoconstriction for another. Being able to quantitatively measure both metabolites with high precision and accuracy from the same biological sample gives a much richer representation of a biological state than only being able to measure the level of the combined metabolites. With LC–MS/MS, multiple isomers can be selectively measured in the same sample.
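As a toy illustration of why this matters (the metabolite names and concentrations below are hypothetical, chosen only to make the point), a combined measurement of two opposing isomers can look identical across two very different biological states, while an isomer-resolved measurement separates them cleanly:

```python
# Hypothetical: two isomeric metabolites of the same parent molecule
# with opposing bioactivity. A combined (summed) assay cannot tell
# these two states apart; isomer-resolved quantitation can.
state_1 = {"isomer_A_vasodilator": 8.0, "isomer_B_vasoconstrictor": 2.0}
state_2 = {"isomer_A_vasodilator": 2.0, "isomer_B_vasoconstrictor": 8.0}

for label, sample in (("state 1", state_1), ("state 2", state_2)):
    combined = sum(sample.values())           # what a non-selective assay sees
    ratio = sample["isomer_A_vasodilator"] / sample["isomer_B_vasoconstrictor"]
    print(f"{label}: combined = {combined:.1f}, A/B ratio = {ratio:.2f}")
# Both states report the same combined total (10.0), but the A/B
# ratio (4.00 vs 0.25) distinguishes them.
```

The combined total is blind to which isomer dominates; only the selective, per-isomer measurement recovers the biologically meaningful difference.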

Disease is rarely related to a single mutation; often, multiple perturbations to the biological system are implicated. Tools and technologies that measure single points in the disease milieu provide keyhole information; much of the larger picture remains out of view. The ability of LC–MS/MS workflows to simultaneously quantitate multiple analytes from the same sample broadens the vantage point for understanding the mechanisms underlying disease. Consider the arachidonic acid pathway. Ubiquitous in many tissues, arachidonic acid gives rise to hundreds of bioactive metabolites through enzymatic and non-enzymatic pathways. These metabolites, generically called eicosanoids, are implicated in the onset, propagation, and resolution of many conditions, from cancer to metabolic syndrome to inflammation. Measuring only arachidonic acid, or only one of its metabolites, may miss key information about the underlying tissue microenvironment supporting the emergence or resolution of disease. Leveraging the specificity and multi-analyte capabilities of LC–MS/MS workflows enables the accurate and precise measurement of whole families of eicosanoids and provides a richer view of potential mechanisms of disease.

The good news is that, with continual advancements in integration and ease of use, LC–MS/MS systems are more accessible and approachable than ever. Gone are the days when a mass spectrometer would fill an entire room and require teams of highly trained scientists to run it. Today’s integrated LC and MS systems easily fit onto a lab benchtop and can be successfully run by lab teams without formal MS training. Although method development can take time and expertise, tools are becoming more readily available to ease the effort of bringing assays online. With the Waters Targeted Omics Method Library, for example, users can download, import, and turn up a range of assays with a few mouse-clicks. In a matter of hours, not months, a lab can be enabled to run multiple panels of metabolites.


Further advancements in the application of LC–MS/MS workflows are on the horizon. The combination of affinity-based enrichment or depletion of samples with the specificity and multi-analyte capabilities of LC–MS/MS opens the door to leveraging the best of both technologies. A number of recently published studies have highlighted the challenges of immunoassay-based quantitative methods [4,5], particularly their inability to distinguish between isomers or other structurally similar biomolecules. The promiscuity of antibodies, however, can be a benefit when combined with the specificity of LC–MS/MS. Using antibody capture of analytes of interest on the front end, followed by quantitative LC–MS/MS analysis, offers the promise of robust and highly specific analytical workflows.

Biomedical research is enormously hard work, conducted in an environment that can be challenged by funding and complicated by the whims of governments. But equally, it is conducted by some of the brightest, most dedicated, and hardest-working people, focused on improving our understanding of human health and disease in order to provide actionable strategies for intervention. With the continual improvement of analytical tools and techniques, and their broader deployment, variability in biomedical research can be diminished and more discoveries can be translated into improving the well-being of people worldwide.