In this instalment of Robert MacNeill’s (Covance) column, ‘The Biomarker Bandwagon’s Journey over Curious Calibration Curves’, Robert explores the information to be gleaned from calibration curves and how these insights could fuel innovation in the field of biomarker quantification.
Robert MacNeill received his Bachelor’s degree with Honors in Chemistry from Heriot-Watt University, then his MSc in Analytical Chemistry from the University of Huddersfield, both in the United Kingdom. Robert is also a Chartered Chemist and a Fellow of the Royal Society of Chemistry. He has 22 years of experience in all aspects of quantitative bioanalytical LC–MS/MS method development, 13 of them heading method development activities at the Princeton site that has housed HLS/Envigo (now Covance), and is a regular author and peer reviewer for the journal Bioanalysis; he is a recognized expert and innovator in the field.
I don’t think I’ve sensed more of a general buzz about biomarkers than at the moment. They are more pivotally involved in drug development than ever. Like other topics of discussion in the industry, biomarkers have seemed to go through ups and downs in discussion intensity over the years, but amidst this, and given the current state of play, it is certainly safe to say that this field is particularly important and engaging, has its fair share of controversy and is not short of analytical challenges.
My R&D team has been fortunate enough to have had a handful of interesting dalliances with these challenges over the years, at least in the LC–MS domain. Then, around a year ago, I had the honor of guest editing a special focus issue of Bioanalysis concerned with the reliability of methods for the quantification of biomarkers using the surrogate matrix approach. That amounted to a very valuable immersion in the goings-on at the time. One aspect it helped to underline was that this surrogate matrix approach (where the ‘matrix’ can often be a simple solution) has proven to be generally the most favored, over the alternative surrogate analyte approach which, although highly potent for reliability, comes with important drawbacks: the time and expense of synthesizing stable-isotopically labeled analogs, of which two are called for, one to serve as the surrogate analyte itself and the other as the internal standard.
This article is part of Robert MacNeill’s (Covance) quarterly column for Bioanalysis Zone, which focuses on quantitative method design. Click here to read past instalments of the column!
The shared objective in all approaches is proving parallelism, the rugged road to reliability upon which the ‘Biomarker Bandwagon’ wants to be rolling. For the surrogate matrix/solution approach, demonstrating parallelism can be very challenging. Parallelism is where the output, the LC–MS peak-area-derived response that translates into the slope of a calibration line and thereby defines sensitivity, matches between the surrogate element and the authentic element within the complete methodology. Such fertile ground for the ‘Biomarker Bandwagon’ to navigate with a view to innovation, especially for those who enjoy the music being played, much of which emanates from the more in-depth utilization and assessment of calibration graphs.
The obvious abnormality in a regularly prepared calibration curve, for those used to working with xenobiotics, is the distinct and statistically clear intercept on the response axis where the nominal concentration is zero. It is a direct indication that a native, endogenous level is present. The question then becomes: how do we measure it? And, to touch on one smidgeon of controversy, what degree of precision do we associate with that measurement? The answer to the former question is that there are a few options, with perhaps one or two avenues as yet unexplored; the latter question cannot really be answered succinctly.
Here is one option. At hand is the wonderful tool of standard addition, which we can readily use to calculate the endogenous concentration of an analyte in a given matrix, usually in support of results obtained by separate means. Standard addition can be used provided we have enough sample, and there is no need for any isotopologue to be involved. It involves spiking several aliquots of entirely unmodified matrix, using entirely unmodified analyte reference material, at a selection of distinct nominal concentrations, as would be done in any calibration curve preparation. The subsequent analysis gives the scenario described in the previous paragraph: typically a line whose characteristic slope represents the sensitivity, with a positive intercept on the response axis and hence a negative intercept on the concentration axis. The negative of the concentration-axis intercept is the calculated endogenous concentration. The result does rely on a bit of extrapolation, which can make some ‘Bandwagon’ passengers slightly uncomfortable, but with well-designed experiments and methods it has proven generally very steadfast. Confidence in the result is reinforced by larger numbers of replicates at each nominal level and by the use of an internal standard that performs adequately within the method.
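For readers newer to the technique, the arithmetic of standard addition can be sketched in a few lines. The concentrations and responses below are purely illustrative, not drawn from any real assay; the endogenous level falls out as the negative of the concentration-axis intercept of the fitted line.

```python
# Illustrative standard-addition calculation (hypothetical values).
# Aliquots of the same unmodified matrix are spiked at known nominal
# concentrations; a line is fitted to response vs spiked concentration.
import numpy as np

spiked = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # nominal additions, ng/mL
response = np.array([1.2, 2.2, 3.2, 5.2, 9.2])       # e.g. analyte/IS area ratios

# Least-squares line: response = slope * spiked + intercept.
slope, intercept = np.polyfit(spiked, response, 1)

# The line crosses the concentration axis at -intercept/slope, so the
# endogenous concentration is the negative of that x-intercept.
endogenous = intercept / slope
print(f"slope = {slope:.3f}, endogenous ≈ {endogenous:.1f} ng/mL")
```

With these invented numbers the fitted slope is 0.200 and the endogenous level works out to 6.0 ng/mL; in practice, replicates at each level and a well-behaved internal standard tighten the confidence in that extrapolated figure.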
There is another profoundly important drive behind the standard addition experiment, and much the same torque is driving our ‘Bandwagon.’ Because no surrogates of any type are involved, there is absolutely no question about the integrity of the slope of the calculated calibration line, and standard addition has therefore been popularly proposed as a complementary means of establishing parallelism wherever surrogate-based calibration curves are used. It is a classic pair-up of extrapolative and interpolative techniques to achieve a valuable objective, really allowing the ‘Bandwagon’ to go places.
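The parallelism assessment itself can be boiled down to a slope comparison. The following is a minimal sketch with made-up data, and the ±10% acceptance window is a placeholder of my own choosing: real acceptance criteria vary by laboratory, assay and regulatory context.

```python
# Illustrative parallelism check: compare the slope of a standard-addition
# line in authentic matrix with the slope of a surrogate-matrix calibration.
# All numbers are hypothetical; the 10% window is a placeholder criterion.
import numpy as np

def fitted_slope(conc, resp):
    """Least-squares slope of response vs nominal concentration."""
    return np.polyfit(conc, resp, 1)[0]

conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])            # ng/mL
authentic_resp = np.array([1.2, 2.2, 3.2, 5.2, 9.2])     # standard addition
surrogate_resp = np.array([0.05, 1.06, 2.04, 4.05, 8.1]) # surrogate curve

s_auth = fitted_slope(conc, authentic_resp)
s_surr = fitted_slope(conc, surrogate_resp)
pct_diff = 100.0 * abs(s_auth - s_surr) / s_surr
parallel = pct_diff <= 10.0  # placeholder acceptance window
print(f"authentic slope {s_auth:.3f}, surrogate slope {s_surr:.3f}, "
      f"Δ = {pct_diff:.1f}%, parallel = {parallel}")
```

The extrapolative standard-addition line vouches for the authentic-matrix sensitivity, while the interpolative surrogate curve does the routine quantification; matching slopes is what licenses the pairing.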
I hope this brief outline of what I believe is an invaluable technique gives a taste, at least to the newer generations of analytical scientists, of the sometimes-veiled usefulness within calibration curves. Once the purpose of confirming parallelism is served, the other parameters in the equation describing the standard addition line can be used to glean further insight, and not only for calculating the endogenous level in the matrix sample used for the standard addition construction, depending on the experimental setup and purpose. I hope, too, that this awareness of the meanings and implications of calibration curves, alongside the ever-important foundations of method ruggedness, can fuel innovation in fields like biomarker quantification, not to mention fuel the figurative ‘Bandwagons’ merrily traversing these fields.