Thank you to everyone who attended the live webinar: Oligonucleotide and mRNA therapeutics: bioanalytical challenges and lessons learned. Below is a transcription of the Q&A session held during the webinar, as well as responses to the questions posed during the live event that we did not have time to answer. We hope this is a useful resource and thank our webinar attendees and our speakers, Jamil Hantash (Tandem Labs) and Laixin Wang (Tandem Labs), for their time.
The following is a transcription of the Q&A session.
How comparable are the data obtained from different platforms?
This can only be answered by comparing data from real study samples; unfortunately, we do not have many opportunities to do this. We have had the opportunity to compare LC-UV and LC-MS and found the two agree within 20%, so the accuracy is quite good. The LC-UV data were usually slightly higher for some samples compared to the LC-MS data, but the difference was within 20%. For the comparison between LC-MS/MS and the hybridization-ELISA assay, clients have told us the results were actually very comparable, but we do not have the data in house.
If you had the choice between a low-resolution triple quadrupole (MRM) or an HR-MS Q Exactive (140K resolution) for oligo quantification, what would be your preferred choice?
It will largely depend on the structure and the mass of the oligonucleotide itself. We have 3 QE and 9 API5000 mass spectrometers, and we usually compare which gives the best sensitivity for a particular analyte. At the moment, about 80% of analytes show better sensitivity on the API5000 triple quadrupole and 20% show better sensitivity on the QE. The background and interference really depend on the particular oligo, the transition, and the specific molecular weight.
Is an ADA assay required for both oligo and RNA therapeutics?
To my knowledge and based on what has been published, the agency has not made a final decision on the requirement of an ADA for oligos or RNA or DNA therapeutic drugs. However, we have seen from our clients an increasing number of requests to have these assays developed and validated before they go to clinical trials. Part of their submission to the agency involves indicating that these assays are up and running and ready if needed. So the answer is: we would like, and we encourage people, to have these assays developed and validated and ready, just in case the agency requests them. This will mean that the samples can be quickly analyzed, and no further delays to their clinical trial and their submission will be encountered.
How do you choose the labelled RNA/oligo to set up an ADA assay?
We take the full-length unmodified therapeutic compound that the client has, and try to find a way to conjugate a biotin or a sulfo-tag (ruthenium) to certain parts or certain sequences of the compound. In some cases this is challenging because conjugation is not feasible, or because it would land on a part of the sequence that could interfere with epitope binding. In these cases, we work with our partners to engineer a conjugation to a certain sequence, or to add an arm onto the drug itself, that will allow us to attach the biotin and ruthenium tags. These tags would be in positions far away from any binding epitope that may be recognized if an ADA develops in a patient or subject taking the drug. So the process involves looking at the sequence with the client and our partners to engineer the most suitable approach to add those tags.
How do you guarantee the stability of RNA samples after collection knowing that the half life is only a few minutes?
That has been the challenge when collecting samples (e.g. plasma, serum) for these kinds of drugs. PAXgene whole blood collection tubes contain a unique cocktail that preserves the RNA upon collection. The stabilisers and nuclease inhibitors that are part of that cocktail will inhibit any digestion of the compound after collection. So the PAXgene collection tube is a technology that must be used when collecting blood samples; it allows us to manipulate and extract the samples smoothly and to guarantee that the analyte collected at that time is intact and is not going to experience any further degradation.
Why should you use a multiplex assay for a single RNA analyte?
A multiplex assay does not need to be used when you have a single analyte. If you really want to push the LOQ to single-digit picogram/mL and you have seen variability in your quality controls in terms of recovery, then you have to add an internal standard, i.e., select a housekeeping gene that you can normalise your data to. If you go down that route, then you have to use the multiplex format, because you will have one bead specific to your analyte and another for the housekeeping gene that serves as your internal standard. If your single-analyte assay on the Luminex platform works, the LOQ is achievable, and the reproducibility is there, then there is no need to normalise your response to a housekeeping gene and you do not need a multiplex assay.
Is it possible to multiplex LC fluorescence for oligo quantitation to improve throughput?
Yes, we definitely can multiplex the system, but usually we choose not to, because the cost of a fluorescence detector is minimal compared to the complications of multiplexing. Another option, if we restrict ourselves to a single injection, is to label the analytes with different fluorescent dyes; we can then, at least in different time periods, monitor different excitation/emission pairs. A further possibility is to use the same detector to simultaneously monitor two or more excitation/emission pairs. We are pursuing all options at the moment.
Is there a difference in sensitivity between QQQ and HRMS?
It really depends on the oligo of interest. Some oligos show better sensitivity on the QQQ, while others show better sensitivity on the HR-MS Q Exactive. It really depends on the structure, length, molecular weight, and the matrix to be evaluated.
Where does carryover happen – in the autosampler or everywhere? Do you have any mitigation advice?
Usually carryover happens in the column, but sometimes it also happens in the autosampler. To address carryover in the column, adding a backflushing step with a proper reagent can usually solve the problem, and a higher column temperature usually helps. With the autosampler, you have to experiment with different wash solvents and different washing programs.
For the Luminex platform, what type of response do you typically see for a N-1 or N-2 oligo metabolite?
When we build the assays, we target a calibration range with a consistent low end. For example, if the LOQ for the parent is 1 picogram/mL, we would target an LOQ of 1 picogram/mL for the N-1, N-2, or shorter metabolites as well. The upper range of the curve will be shortened in most cases, because capture of the metabolites in the well is always challenging and they show a narrower linear range compared to the parent.
The following are responses to unanswered questions posed during the live event.
Which platform do you recommend for an oligonucleotide analysis?
It will depend on the structure of the oligonucleotides and the study purposes. LC-MS/MS and LC-HRAM (high-resolution accurate-mass) have the best specificity. They usually have good sensitivity for short oligonucleotides, but the sensitivity drops as the length of the oligonucleotide increases. Usually hybridization LC-fluorescence works better for longer oligonucleotides.
Have you considered the use of super charging agents for the MS analysis?
We have not, but definitely would consider trying.
Do you think the method will work using API4000?
Yes. The API4000 would work, but it may not be as sensitive as the API5000. Also, surprisingly, we have found that the API6500 does not work better than the API5000 for any of our tested oligonucleotides.
What was the resolution for your oligonucleotide on Q-Exactive at the m/z of your oligonucleotide?
Both 70K and 140K were used in my presentation, but the resolution is 70K for all of the validated assays used in routine analysis.
What is the shortest sequence that can be detected with either of the techniques?
LC-MS/MS and LC-HRAM have no lower limit (the analyte can be as short as one or two nucleotides). The hybridization-based assays require approximately a 10-mer.
What are the causes of the poor data before normalization if all samples are treated on a “per volume” basis?
The variability in the extraction can only be accounted for by normalizing the data to a gene that follows the same pattern when treated with the same reagents during the extraction. It is exactly the same as normalizing the biomarker data from urine samples to creatinine.
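The arithmetic behind this kind of normalization can be sketched in a few lines. The function and the numbers below are purely illustrative (they are not assay data from the webinar); the idea is only that a recovery loss affecting both the analyte and the housekeeping gene equally cancels out of the ratio:

```python
def normalize(analyte_conc, housekeeping_conc, housekeeping_ref=1.0):
    """Scale the measured analyte concentration by the ratio of a chosen
    reference housekeeping-gene level to the level measured in the same
    extract, so extraction losses common to both species cancel out."""
    return analyte_conc * (housekeeping_ref / housekeeping_conc)

# Two hypothetical replicate extractions of the same sample with
# different recoveries: (analyte ng/mL, housekeeping gene ng/mL)
raw = [(4.0, 0.8), (5.0, 1.0)]
normalized = [normalize(a, h) for a, h in raw]
# Both replicates converge on the same normalized value, because the
# recovery loss affected both species proportionally.
```

This is exactly the logic of the creatinine analogy above: the housekeeping signal stands in for how much of the sample actually survived extraction.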
Do you have an assay for microRNA or small RNA?
Yes, both can be analyzed using Luminex single-plex assays.
What are the advantages of the ABA assay versus hybridization ELISA?
The sensitivity and the complexity of the oligo or mRNA is what dictates the platform. In all cases, the order of preference is LC/MS/MS, then hybridization ELISA, then the Luminex platform.
Can you manipulate the charge state distribution towards one particular charge to increase sensitivity?
Yes. The pH and/or ion-pair reagents in the LC mobile phase can impact the charge distribution, but it will always have multiple peaks.
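For readers less familiar with why "multiple peaks" always remain: in negative-ion ESI an oligonucleotide appears as a family of [M − nH]^n− ions, each at its own m/z. A minimal sketch of that envelope, using the standard m/z relation and a hypothetical molecular weight (not one of the compounds discussed in the talk):

```python
PROTON = 1.007276  # mass of a proton, Da

def mz(neutral_mass, charge):
    """m/z of the [M - nH]^n- ion for charge state n (negative-ion ESI)."""
    return (neutral_mass - charge * PROTON) / charge

M = 6367.0  # hypothetical ~20-mer oligonucleotide, Da
envelope = {n: round(mz(M, n), 2) for n in range(3, 9)}
# Mobile-phase pH or ion-pair reagent can shift intensity toward one of
# these charge states, but signal still appears at several m/z values.
```

Tuning the mobile phase therefore concentrates intensity in the charge state you monitor rather than eliminating the others.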
In which species have you managed to generate high quality positive control for the ADA assay?
Rabbits are usually the best, with goats being the second best, for generating polyclonal antibodies.
What mobile phase did you use for LC-MS method?
HFIP (hexafluoroisopropanol)/TEA (triethylamine) or HFIP/DIPEA (N,N-diisopropylethylamine) buffers.
Is this kind of ADA assay largely used by biotech or pharma to support clinical development? What about target or drug tolerance?
The ADA assay is the normal platform used in industry. Target or drug tolerance is usually not an issue for mRNAs, but can be an issue for oligonucleotides due to the extended half-life. The acid dissociation is the best approach for resolving the issue.
For the high resolution method (using 140k resolution), it can get 1 ng/ml LLOQ, but how many data points can you get for LLOQ? Can you get enough data points?
This is a very good question. Yes, we can get about 8 points. For the QE, the scan speed is 1.5 scans per second at 140K resolution and 3 scans per second at 70K. We did lose half of the data points when the resolution increased from 70K to 140K; however, the remaining points seemed sufficient for accurate quantitation. We usually use 70K resolution for assays used in routine analysis.
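The trade-off quoted above is simple arithmetic: points across the peak ≈ peak width × scan rate. A back-of-envelope sketch, where the peak width is an assumed value chosen only to illustrate the halving (it is not a number from the talk):

```python
def points_across_peak(peak_width_s, scan_rate_hz):
    """Approximate number of scans landing within one LC peak."""
    return int(peak_width_s * scan_rate_hz)

peak_width = 5.5  # assumed baseline peak width in seconds (illustrative)
pts_70k = points_across_peak(peak_width, 3.0)   # ~16 points at 70K
pts_140k = points_across_peak(peak_width, 1.5)  # ~8 points at 140K
```

With a typical guideline of at least 8-10 points needed to define a peak for quantitation, this shows why 140K sits near the edge and why 70K is preferred for routine work.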
Have you tried other housekeeping genes?
Each project is different and the housekeeping genes are chosen based on the analyte.
For the hybridization assays, when you normalize with the PPDI housekeeping gene, does it mean that the 2 plex was modified to a 3 plex, i.e., did you add a third bead in your assay?
Yes, the 2 plex were modified to a 3 plex by the addition of a third bead in the assay.
Is there more room to increase sensitivity for LC-MS assays and what would you try in order to achieve that?
Yes. It depends on the sample volume and the structure of the target oligonucleotides.
Is it possible for you to walk through the normalization procedure with PPIB? Normalization can sometimes leave a unitless value that leads to fold change calculations, yet in the table I assume your values are ng/ml.
You are absolutely correct. The units would be ng/ml per X ng/ml of PPIB; X is usually defined up front, and thereafter we use "ng/ml (normalized)" as the unit.
With regards to HRMS would you suggest Orbitrap technology or TOFs?
This is a great question. We only have Thermo Orbitraps in the lab and we like this platform, as do other users. The Orbitrap also does not appear to need calibration as often as a TOF.
Was the difference in charge state distribution between the QQQ and QE representative of many oligonucleotides, or isolated to that specific example?
The difference is common, but the example is an extreme case.
What are the advantages of using a glass vial over a polypropylene vial?
We have observed that negatively charged oligonucleotides tend to interact more with the surface of polypropylene containers (i.e., they are "sticky"), whereas glass shows less of this adsorption.
To view the webinar on demand click here.