Interview with Erin Chambers (Waters Corporation): Addressing challenges in peptide and protein bioanalysis


Erin Chambers, Principal Scientist, Waters Corporation, discusses large molecule protein quantification with Bioanalysis Zone, as part of our Spotlight on large molecule analysis by LC–MS.

What do you believe are the main challenges for quantification of proteins using LC–MS?

There are certainly many, many challenges involved in LC–MS protein quantification. Some of the more obvious are those related to the transition of our bioanalytical scientists from small molecule to large molecule work. This transition is difficult because of the lack of expertise in handling large molecules, the complexity and unfamiliar nature of the workflows, or simply the multitude of workflow options out there, all of which make even knowing where to begin problematic.

Looking at these complex workflows, sample preparation immediately jumps out as an area of concern. Relative to many conventional ‘small molecule’ techniques such as solid phase extraction (SPE), protein precipitation (PPT), or liquid–liquid extraction (LLE), the sample prep required for a quantitative protein assay in particular is incredibly extensive and laborious. We are now talking about needing to perform digestion, which itself has many steps to optimize and many parameters within those steps that require optimization. In addition, about half of those doing this work also perform an affinity-based protein-level purification as well as a peptide-level clean-up using either affinity or SPE.

If we are talking about protein bioanalysis specifically, finding a suitable surrogate peptide to represent that protein is still extremely labor-intensive, even with some of the software tools that exist. There is no simple, streamlined way to accomplish this. There is no question that the process is time-consuming – steps can be followed that will produce a list of peptide options, but there is no guarantee that the list will contain a peptide that is MS-sensitive and selective, or one devoid of problematic amino acids such as methionine and other undesirable features. Furthermore, depending on where in the protein you pull that surrogate peptide from, you may actually end up with different pharmacokinetic results, for example.
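
As a rough illustration of what that in silico screening step can look like, here is a minimal Python sketch (not any particular vendor's tool) that performs a simple tryptic digest and filters candidate peptides by length and by commonly avoided residues. The sequence, thresholds, and exclusion rules are illustrative assumptions; uniqueness against the background proteome and the actual MS response of each peptide would still have to be confirmed.

```python
import re

def tryptic_digest(sequence):
    """Cleave after K or R, except when followed by P -- the standard trypsin rule."""
    return [p for p in re.split(r'(?<=[KR])(?!P)', sequence) if p]

def candidate_surrogates(sequence, min_len=6, max_len=25,
                         avoid=("M", "C", "NG", "DG")):
    """Filter tryptic peptides by length and by residues/motifs that are often
    avoided (Met oxidation, Cys, deamidation-prone NG/DG). Thresholds are
    illustrative only."""
    return [pep for pep in tryptic_digest(sequence)
            if min_len <= len(pep) <= max_len
            and not any(bad in pep for bad in avoid)]

# Toy example -- a real workflow would also check uniqueness against the
# proteome and rank the survivors empirically by MS response.
print(candidate_surrogates("MKWVTFISLLFLFSSAYSRGVFRRDAHK"))
# -> ['WVTFISLLFLFSSAYSR']
```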

Finally, I think that there is a tendency to lean towards using proteomic approaches, simply because that is an established area for protein analysis. This does not always serve the bioanalytical chemist well, as proteomic methods were designed with a different end goal in mind.

What challenges does sample preparation for LC–MS analysis of proteins present and how can these challenges be combated?

We are talking here about separating proteins or peptides from other endogenous proteins and peptides – these are large molecules made up of a finite list of amino acids, with, relatively speaking, very small differences between them. The greatest problem therefore is specificity! This is especially true when using a nominal mass instrument such as a triple quadrupole. While simple bi-phasic chemical or physical separations like PPT or LLE are often sufficient for small molecule work, separating a protein of interest from others in plasma or another biological sample may require a first-step generic or specific affinity capture, followed by another isolation step at the peptide level, which can be either SPE or affinity. The SPE that is most effective in terms of specificity for signature peptides is based on what we call ‘mixed-mode’ sorbents. These are sorbents that have both a reversed-phase and an ion-exchange moiety and so provide two orthogonal modes of separation. This is especially effective when the peptides are bound to the sorbent by ion exchange, as this is orthogonal to the subsequent LC separation, which is typically performed in the reversed-phase dimension. Here a strong cation exchanger is optimal: trypsin, the most commonly used cleavage enzyme, cleaves at arginine and lysine, so the resulting peptides will always have a positively charged handle for the negatively charged cation-exchange sorbent.

The other aspect of sample prep that is difficult is knowing when and how to apply affinity purification, and to what degree. At the protein level, there are several options. Along the continuum from simplest and least selective to most selective, we have ‘none’ (meaning digest the sample whole, as is), PPT or pellet digestion, generic affinity, specific affinity, and finally both protein- and peptide-level affinity. The simplest approach is simply digesting the whole plasma or serum sample, without any pre-purification. This works well where detection limits are not challenging, let’s say where limits of detection (LODs) are in the ~100 ng/mL to several µg/mL range. Of course the actual sensitivity is related to the unique peptide itself and its MS intensity. Whole plasma/serum digestion is best suited to early discovery. Going one step further, one can perform a dialed-in protein precipitation and subsequently digest the pellet. This can remove 60–80% or more of the serum albumin if the correct solvent, in the right ratio, is used.

Moving along the continuum, generic affinity capture such as Protein A or G, which isolates the IgG fraction, is also quite common. In this case, only the IgG fraction remains in the sample. One must be aware, though, that this is still multiple mg/mL of protein. Using a generic Protein A capture, detection limits as low as 10 ng/mL using only 35 µL of plasma have been reported, although LODs closer to 100 or several hundred ng/mL are more common. Of course a specific affinity capture, or even an anti-human kappa capture used in preclinical species, will provide the most selective extraction at the protein level, resulting in significant improvements in detection limits. If one wants to get really elaborate, specific or generic affinity at the protein level coupled to affinity at the peptide level, while expensive and laborious, can provide very high sensitivity.
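
To make the continuum above concrete, here is a small, purely illustrative Python sketch that maps a required detection limit onto one of the clean-up tiers just described. The thresholds simply echo the rough ranges quoted above and are not a substitute for method development.

```python
def suggest_protein_prep(required_lod_ng_ml: float) -> str:
    """Map a required LOD (ng/mL) onto a sample-prep tier, using the rough
    ranges discussed above. Illustrative only: the achievable LOD always
    depends on the surrogate peptide and its MS response."""
    if required_lod_ng_ml >= 100:   # ~100 ng/mL to several ug/mL
        return "whole plasma/serum digestion (or PPT and pellet digestion)"
    if required_lod_ng_ml >= 10:    # ~10 ng/mL reported with generic capture
        return "generic affinity (Protein A/G) capture, digest, peptide-level SPE"
    return "specific affinity capture, optionally with peptide-level affinity"

print(suggest_protein_prep(50))   # -> generic affinity (Protein A/G) capture ...
```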

The final comment I’d make, though there are many more, is that non-specific binding is a real concern in sample prep of any biologic. Especially if one is coming from a small molecule background, this can be one of the toughest problems to overcome, simply because many do not recognize its symptoms and so do not realize it is occurring. There are ways to circumvent this problem, such as using vials or plates designed for this purpose, and/or the use of carrier proteins, in addition to proper solvent choice.

In your opinion, when compared with ligand binding assays (LBAs), what are the main advantages gained when using LC–MS for protein quantification?

Back in 2008, when we first started presenting holistic strategies for peptide quantification using LC–MS, we identified the primary drivers responsible for moving folks away from LBAs and towards LC–MS. These have not changed, and are relevant not only to peptide but also to protein quantification. Among the most important advantages that LC–MS provides are a higher degree of accuracy and precision (one has only to look to regulatory guidance to see the reflection of this), the ability to easily multiplex, broader linear dynamic range (which means less sample dilution), ease of assay transfer from site to site or lab to lab, the ability to distinguish closely related species such as degradation products and metabolites from the peptide or protein of interest, faster method development times (typically days to a few weeks versus 6 months to a year for LBAs), use of a common, comfortable analytical platform technology for bioanalysis labs in pharma or CROs, and finally greater selectivity.

Related to the advantages of LC–MS are the common shortcomings of LBAs, such as cross-reactivity, availability of reagents, reliability and reproducibility of reagents, and perhaps most importantly the lack of standardization. Of course LBAs have a few key advantages, such as the use of small sample volumes, high sensitivity, and ease of use. While LBAs may be easy to transfer from lab to lab because they are designed for less specialized operators, the poor reagent reproducibility can make it difficult for different labs to achieve the same quantitative results.

Even the advantages of LBAs are slowly being eroded as LC–MS technology advances, whether through the use of microflow to increase sensitivity for small sample volumes, improvements in ease of use bringing MS to the masses, or fundamental improvements in triple quadrupole (TQ) and high-resolution MS (HRMS) systems.

What are some key challenges one should be prepared for when transitioning from small to large molecule quantification?

I think the biggest surprises are the degree, magnitude and frequency of non-specific binding; the lack of specificity and the difficulty obtaining it; the need to be aware of and test for anti-drug antibody (ADA) effects; poor solubility of target peptides or proteins; sensitivity (as the molecular weight of the analyte gets larger, there are fewer moles of analyte per unit weight, so it is much harder to achieve pg/mL or ng/mL detection limits); and the changes in chromatographic performance and in the analytical tools and workflows required.
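
The sensitivity point is easy to see with a little arithmetic. The sketch below (a generic unit conversion, not tied to any particular assay) converts the same 1 ng/mL mass concentration into molarity for a nominal 500 Da small molecule and a 150 kDa monoclonal antibody.

```python
def ng_per_ml_to_nmol_per_l(conc_ng_ml: float, mw_da: float) -> float:
    """1 ng/mL equals 1 ug/L; dividing by MW (g/mol) gives umol/L, so
    multiply by 1000 to express the result in nmol/L."""
    return conc_ng_ml / mw_da * 1000.0

small_molecule = ng_per_ml_to_nmol_per_l(1.0, 500)       # ~2 nM
mab = ng_per_ml_to_nmol_per_l(1.0, 150_000)              # ~0.0067 nM (~6.7 pM)
print(f"{small_molecule:.2f} nM vs {mab:.4f} nM "
      f"(~{small_molecule / mab:.0f}x fewer moles of the mAb at 1 ng/mL)")
```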

Essentially, the rules are different. Parameters that might have a negligible impact on small molecule assays suddenly become critical. I’ve alluded to many of these when talking about sample preparation and the instrumentation. Furthermore, one should expect that a different assay (MS for example) may be used in discovery more often, while LBAs are still more common further into development or for the analysis of patient samples. In addition, while LC–MS offers the opportunity for the standardization that is difficult to achieve with LBAs, unless one is following a kitted approach (with pre-measured, traceable reagents and highly standardized generic protocols), transferring these complex assays from site to site or sponsor to CRO can be especially challenging.

Are there specific qualities/attributes of the hardware or software that are advantageous when quantifying large molecules?

In terms of the instrumentation platform, no one size fits all here: there are various complementary techniques and a continuum of solutions.

Triple quadrupoles are the best for sensitivity in targeted analysis, especially if a bottom-up approach is taken (i.e., digestion). HRMS can provide added selectivity when adjustments in chromatography or sample prep are either not possible or not desired, and that added selectivity can result in improved sensitivity. While triple quadrupole instruments may be optimal for bottom-up techniques, HRMS has clear benefits for subunit or intact protein analysis. We should consider the LC platform as well: one of the values of small-particle LC is that it increases peak capacity, which reduces the risk of co-elutions. This is particularly important for separating like from like – large molecules with highly conserved composition, distinguished only by very minor differences. Microflow should be considered for biomarker work, especially where sample volumes are limited and ultra-high sensitivity is required.
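
As a back-of-the-envelope view of the peak-capacity argument, the sketch below uses the common gradient approximation n ≈ 1 + t_gradient / w (with w the 4-sigma peak width). The gradient time and peak widths are invented numbers chosen only to show that halving peak width roughly doubles the number of resolvable components.

```python
def gradient_peak_capacity(gradient_time_min: float, peak_width_min: float) -> float:
    """Common approximation for gradient peak capacity: n ~ 1 + t_g / w."""
    return 1 + gradient_time_min / peak_width_min

# Same 10-minute gradient; narrower peaks from smaller particles (illustrative widths)
print(gradient_peak_capacity(10, 0.12))   # ~84 resolvable components
print(gradient_peak_capacity(10, 0.06))   # ~168 resolvable components
```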

In terms of software, we need predictive software: in silico digestion and in silico fragmentation modeling, for example, are both valuable. Both help not only predict, but also identify and confirm, observed peptides and/or fragments. There are certain seemingly small details that are important in in silico fragmentation models, such as the ability to accurately fragment peptides with disulfide bridges. One also needs a tuning regime designed to work with large molecules, for example one that takes into account rules specific to peptides, such as fragments appearing at m/z higher than the precursor, or the ability to eliminate multiply charged versions of water losses or adducts if one desires. It is also valuable if the software can rank fragments to help the user find those that are most likely to be specific, based on certain researched rules. Finally, the software needs to be able to take all these permutations of fragments, transitions, signal-to-noise and sensitivity, and present them in a way that allows the user to confidently review and interpret the results, ultimately yielding the most robust and sensitive assay.
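
For a sense of what in silico fragmentation involves, here is a minimal, self-contained Python sketch (monoisotopic residue masses hard-coded, no modifications or disulfide handling) that generates b- and y-ion m/z values for a toy tryptic peptide. It also illustrates the peptide-specific rule mentioned above: singly charged fragments of a doubly charged precursor appear at m/z values above the precursor m/z.

```python
# Monoisotopic amino acid residue masses (Da)
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
           "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
           "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
           "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
           "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931}
WATER, PROTON = 18.010565, 1.007276

def mz(neutral_mass: float, z: int) -> float:
    """m/z of an ion carrying z protons."""
    return (neutral_mass + z * PROTON) / z

def fragment_ions(peptide: str, max_z: int = 2) -> dict:
    """Return b- and y-ion m/z values for charge states 1..max_z."""
    ions = {}
    for i in range(1, len(peptide)):
        b = sum(RESIDUE[aa] for aa in peptide[:i])            # b ion neutral mass
        y = sum(RESIDUE[aa] for aa in peptide[i:]) + WATER    # y ion neutral mass
        for z in range(1, max_z + 1):
            ions[f"b{i} ({z}+)"] = round(mz(b, z), 4)
            ions[f"y{len(peptide) - i} ({z}+)"] = round(mz(y, z), 4)
    return ions

peptide = "GVFR"   # toy tryptic peptide
precursor_2plus = mz(sum(RESIDUE[aa] for aa in peptide) + WATER, 2)
ions = fragment_ions(peptide)
# A singly charged y ion of the 2+ precursor sits above the precursor m/z:
print(round(precursor_2plus, 4), ions["y3 (1+)"])   # ~239.64 vs ~421.26
```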

What are some new and more challenging molecules bioanalytical scientists are working on?

Recently, antibody–drug conjugates (ADCs), fusion proteins, and conjugated molecules in general have taken the spotlight. In the case of ADCs specifically, this is primarily due to the number of different analytical tests that need to be performed and the lack of simple, streamlined solutions to perform these analyses. For example, one needs to quantify conjugated and unconjugated payload, determine the drug-to-antibody ratio (DAR), and quantify total and conjugated antibody. Scientists are still searching for the best way to tackle these analytical problems. At the moment it looks as though solutions require a combination of intact and digestion approaches, accurate mass and triple quadrupole instrumentation, and protocols designed to maintain the fidelity of the ADC. Due to the complexity and the broad range of expertise required, it can be difficult to find a single source for support.
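
For the DAR piece specifically, one widely used calculation is an intensity-weighted average over the drug-load species observed in a deconvoluted intact-mass spectrum. The short sketch below assumes such peak areas are already in hand; the numbers are invented purely for illustration.

```python
def average_dar(load_intensities: dict) -> float:
    """Intensity-weighted average drug-to-antibody ratio, computed from
    deconvoluted intact-mass peak areas keyed by drug load (number of payloads)."""
    total = sum(load_intensities.values())
    return sum(load * area for load, area in load_intensities.items()) / total

# Hypothetical peak areas for drug loads 0-8 of a cysteine-conjugated ADC
peaks = {0: 5, 2: 20, 4: 35, 6: 25, 8: 15}
print(f"average DAR ~ {average_dar(peaks):.1f}")   # -> average DAR ~ 4.5
```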

I think that’s why forums and dialogs such as this are so important to the scientific community, to share and increase everyone’s large molecule bioanalytical knowledge and understanding collectively. It is something that vendors need to be a big part of.