In this short interview Roger shares his experience in microsampling and what drew him to the field. He discusses some of the key challenges still faced by the community.
Could you tell us more about your current work?
I’ve had the fortune of managing an organization that supports predominantly nonclinical bioanalysis, drug metabolism and pharmacokinetics (PK); this includes management of an ADME group doing drug metabolism, biodistribution and bioanalytical work on both large and small molecules. I also have responsibility for analytical work supporting nonclinical toxicology, as well as formulation support.
My current work spans the whole gamut of DMPK, from discovery through Phase I and on to Phase III bioanalysis. So it includes a lot of the bits and pieces that go into drug development.
What does microsampling mean to you?
Very simply and practically, a microsample is a small aliquot of less than 50 μl. That aliquot could be serum, plasma, blood or even cerebrospinal fluid.
What began your interest in microsampling?
My interest in microsampling actually occurred in parallel with the need to improve assay sensitivity.
It really kicked off for me in the 2000s, when we were doing a lot of discovery screening involving new molecular entities that medicinal chemists would synthesize. They were very keen on understanding the pharmacokinetics. You didn’t have a lot of test article to dose a lot of animals, which led to finding techniques for even smaller volumes – to reduce the amount of test article being dosed but still come up with good-quality data.
And serial sampling was the ultimate technique for data quality. So with those two components – a small amount of test article and high-quality data, as fast as possible – this led to techniques and workflows that relied on small sample volumes.
What are the benefits of microsampling?
Well, if I didn’t say the three Rs I’d be frowned upon. So really the replacement, refinement and reduction of animal use – that’s key. But there’s also the quality associated with that: the better you can do serial sampling, the better the quality of the data coming out of what you are analyzing. When you’re using fewer animals on study – and this is more strictly in the discovery and nonclinical space – everybody is happier. There is a cost reduction as well, so you get the quality along with a cheaper study.
Serial sampling has been taken as the gold standard for pharmacokinetic analysis: there’s less variability when you’re taking blood out of the same animal, and it’s also faster – you don’t have to wait as long to get that sample. What’s more, with the volumetrically accurate devices now available, like volumetric absorptive microsampling (VAMS), you can take samples of 10 μl much faster and with less stress on the animals.
Certainly the accuracy there not only improves the overall quality but also the PK timing. Trying to get a blood sample every 15, 30 and 45 min, while you’re trying to warm an animal up for the normal approach to bleeding, can often lead to problems – for example, if the animal squirms you just can’t get the blood flow. So new microsampling devices like VAMS improve the overall quality of the data with less animal stress.
How has the demand for microsampling increased in recent years?
Well, I would argue microsampling has always been there – any time that you can do bioanalysis with as little sample as possible and push detection limits, that’s microsampling. I think the industry buzzword of microsampling has been confused a little bit with dried samples. In recent years dried blood spots (DBS) seem to be the real push, and at least microsampling is coming into the vernacular for bioanalysis. And really I think that DBS has its place in simplifying sample shipments, the opportunity for home-based sampling, pediatric trials and all of those kinds of things.
But microsamples have always been there. What has happened is that instead of doing bioanalysis on 0.5 ml plasma aliquots, with fairly extensive concentration workflows to get the sensitivity we need, as instruments got more and more sensitive this became 50 μl, and now it is 5 μl.
Currently we’re doing nanoliters on platforms like the Gyros workstation – so it’s evident that microsampling has always been there; the technology has just improved and gotten easier.
What is the biggest challenge microsampling faces today?
I would say getting regulatory approval – certainly this is true in the USA. The US FDA is taking a very conservative approach. For regulatory studies – and this is both in GLP toxicology and the clinical space – you need to do complementary liquid assays. It’s just about understanding that 25 μl is more than adequate and that this actually qualifies as a microsample.
When it comes to overall perception, it is the regulatory approval of DBS that has slowed the real explosion of that technology, but there are ways around that – again, I think the VAMS technology is the route that will become very routine. Companies and sponsors are very conservative, and if you say you are doing something new, or let’s say out of the ordinary, they get very nervous. That’s understandable: if I’m spending $1 million on a toxicology study, I don’t want any doubt.
And so there is that conservatism in the USA; when we get European sponsors they are a little more amenable, with a focus on the three Rs. But what we’ve seen is a conservatism that shows doubt – “are you sure it’s going to work?” We’ve obviously done all of our due diligence within our laboratory and, as I said, we’ve been doing small sample volumes for 20 years. There is no real issue. But it’s DBS that gets all the press, and issues like hematocrit create doubt in people’s minds when you say microsamples. So we just say we do small volumes and avoid that issue altogether.
Despite the many known benefits of microsampling, why do you think it is still taking time to be adopted?
I think it is the regulatory approval within the USA; in Europe that’s not really an issue – the regulators are much more amenable to accepting it. Again, I’m really talking about dried matrices, and that’s where I think there are opportunities for clinical studies where you can do home-based sampling. That will force the situation where people will have to do it that way, and as that body of data builds – and certainly a lot of large pharmas are currently doing pioneering work in this – regulators will start to get comfortable. It will take a little time to get there; I think in the USA by 2020 there will be enough data for the regulators to say “you know what, this is really routine”.
Certainly in my space, a lot of the work in nonclinical GLP toxicology is just advancing as instrument sensitivity improves. As we do more and more with biologics, the assay formats are already in the microsampling paradigm. That’s been accepted for many years – it’s the vocabulary around microsampling being associated with dried matrices that may really be slowing down full implementation. That will disappear as people get more educated about what microsampling is; there is no mystery around it, people will get comfortable, and it will become as routine in the clinical space as it is in the nonclinical.
Where do you see the field in the next 5–10 years?
The conservatism will disappear; I think microsampling will become absolutely routine. The push for instrument sensitivity will continue to drive vendors toward more and more sensitive equipment, and in parallel that will reduce the volume needed for bioanalysis. Regulators will start approving new drug applications (NDAs) and biologics license applications (BLAs) that use microsampling, the concern will vanish, and I think that will happen within the next 5 years.
The tools available for handling small volumes, the accuracy of the sampling devices, the automation – all of these will continue to improve. It is not trivial to do 384-well plates, but we currently do them. It is not yet routine; 5 years from now, when we are going to move up to the next platform, we would like to say “384 is easy”. So the ease of the workflows and the accuracy of how we go about doing the bioanalysis will improve, and with that the regulators will become very comfortable; it will just become part of the normal course of business.
If you could give a piece of advice to fellow colleagues looking to implement microsampling as part of their biosampling strategy in their research, what would you say to them?
First you have to assure them that the study will be valid. There has been enough out there about regulators not approving it and having to do all this extra work and so forth. So my first piece of advice, after doing it for 20 years, would be that it is just DBS that is the issue at the moment, and even that is being resolved.
Secondly, it won’t cost more – in fact it will cost less, and that’s always a good thing when you’re trying to keep your budget in check. As soon as we have a discussion with sponsors, we say “look, it’s not a bad thing, the data are valid and it will cost you less”. Also, it is not more difficult – again, we have been doing it for a long time, and devices like volumetric absorptive microsamplers are making it even easier. It’s easier on the animals too.
At the end of the day you need an internal champion within the organization to drive it, to be the resource who asks “can we do this?” You’ve got to get over that activation barrier, and that’s true of any incremental change – you still need someone to push and say let’s give it a go, let’s do the internal due diligence to make sure we can actually take a 10-μl sample from a mouse. You need that!
And you always have to communicate – you have to communicate with your sponsors, letting them know the studies are going well and that the data quality is there, and then just continue to reinforce those advantages. That’s really the advice you would give to anybody about any new workflow. It does work, it’s routine. There’s nothing to be scared about. There’s no need for conservatism. It works.
It is an exciting time to be doing bioanalysis – going in 20 years from 0.5 ml to now thinking of 2 μl as routine, that’s exciting. I’d like to see what the next 10 years look like; it’s almost unimaginable that we’ve dropped multiple logs in volume while sensitivity has been greatly enhanced. Picograms have become routine, and we used to think micrograms were tough, so it is an exciting time!
About Roger Hayes
Roger Hayes, PhD, is Senior Vice President, DMPK, at MPI Research (Mattawan, MI, US). Dr Hayes has held numerous leadership positions in the global life sciences industry and academia, leading teams in the development of state-of-the-art bioanalytical and analytical techniques – including mass spectrometry, chromatography, and automation – in bringing medical and chemical products to market. For nearly two decades, he has led strategic and research initiatives for large pharmaceutical companies that included both GLP and non-GLP preclinical studies as well as clinical trials. Most recently, he served as President of Bioanalytical Operations at Cetero Research (ND, US), where he focused on establishing overall corporate direction for bioanalytical and analytical services. Dr Hayes has published extensively and has taught numerous aspects of LC/MS method development. He is an active member of the American Society for Mass Spectrometry and the American Association of Pharmaceutical Scientists.
This interview was featured in an interactive supplement on microsampling published on Bioanalysis Zone.