Microsampling – how the past may bring the future to disease diagnosis

Written by Dr James Rudge, Global Microsampling Specialist, Neoteryx (UK)

James Rudge has served as the Global Microsampling Specialist since January 2015. Prior to joining Neoteryx, Dr Rudge worked for Phenomenex for 14 years and is a co-inventor of the Mitra Microsampling Device and the Volumetric Absorptive Matrix Sampling (VAMS) technology. At Phenomenex, Dr Rudge held a number of roles including Key Account Manager, Field Service Specialist and, latterly, European Business Development Manager for Clinical. These roles allowed him to collaborate with customers on a wide range of projects, regularly working in customer laboratories globally to develop novel sample preparation and LC / LC–MS methods. Dr Rudge graduated from the University of Wales, Swansea with a BSc (Hons) II(i) in Biochemistry and a PhD in Organic Chemistry, where he worked on novel chemiluminescent probes for immunoassays.

In the early 1960s, a physician called Robert Guthrie, working in Buffalo (NY, USA), sparked a revolution in the way children are tested for inborn errors of metabolism. In doing so, he and his research group were responsible for saving the lives of thousands of children year on year, often allowing them to live almost normal lives. He achieved this, in part, using a microsampling method. This was, of course, DBS, or dried blood spot testing. What Dr Guthrie realized was that there were a number of inborn errors of metabolism which, if caught early and with the right intervention, could be treated effectively. One of these illnesses was phenylketonuria, or PKU. It is estimated that this neurological condition affects about 1 in 10,000 babies (depending on the population) and is caused by a genetic defect in the production of an enzyme called phenylalanine hydroxylase. Abnormally low levels of this enzyme cause an accumulation of phenylalanine (PHE), which in turn affects the uptake of other large neutral amino acids in the brain, thus causing abnormalities in brain development. Thankfully, there is an effective treatment for the condition, achieved through a diet low in phenylalanine plus medication if needed.

Scientists in the 1960s were restricted to the technology of the day; the modern LC–MS would have seemed like something out of Star Trek to them. Today, in many hospitals, it is the modern mass spectrometer (MS) that is used to diagnose PKU in a matter of seconds. Ironically, it can be argued that if Dr Guthrie had had a modern mass spectrometer available, DBS may never have become the popular technique that it is today. The reason is that DBS served not only as a convenient way to collect and transport blood samples, but also as part of the analytical assay itself. DBS takes its legacy from the science of bacterial inhibition assays, a science dating back to 1889!

A bacterial inhibition assay works by placing a sample-impregnated paper disk on an agar plate seeded with compounds that either inhibit or promote microbial growth. Indeed, it was a variant of this technique which led Alexander Fleming to discover the antibacterial property of penicillin. Using a similar idea, Dr Guthrie took small drops of blood from new-borns and spotted them onto paper, after which he allowed the paper to dry. He then punched out disks of blood and placed the disks onto his agar plates. The high concentration of PHE in the dried blood of babies with PKU promoted rings of bacterial growth around the spot; blood with normal PHE levels did not promote growth. This was a neat and simple diagnostic which carries a huge legacy to present-day neonatal screening. In some states in the USA, over 50 conditions are now screened by DBS.

If Dr Guthrie had had the modern MS, would he have been led to use paper disks? Maybe he would, or maybe he would have developed a conventional serum blood test. Who knows? However, his brilliant legacy remains to this day. Moreover, dried matrix sampling such as DBS is now used in many different fields, such as therapeutic drug monitoring and the diagnosis of viral diseases like HIV and hepatitis.

The success of the new-born screening initiative lay in its simplicity. However, one observation soon became apparent: even though DBS worked really well for semi-quantitative assays like that for PKU, the development of fully quantitative assays was more of a challenge. Indeed, in recent years there have been efforts to address this by solving a number of fundamental quantitation issues with DBS. One such issue is the volumetric hematocrit effect, where the viscosity of the blood (due to the % hematocrit; HCT) can bias the results when a sub-punch is measured from the DBS sample. This has been solved by the emergence of fixed-volume sampling devices and also 'whole spot' extraction techniques. Another issue is the blood-to-plasma ratio effect: the whole-blood concentration of compounds found only in the plasma fraction will be inversely proportional to the % HCT of that sample. This raises a fundamental debate in blood measurements as to which matrix we should measure our analytes in.

Blood vs Plasma

I’ve been to a number of meetings where the question of blood vs. plasma has been debated. Quite simply, which is the ‘better’ matrix in which to measure the physiological concentration of a certain biomarker or drug? For simplicity, I’m going to use ‘plasma’ to refer to both plasma and serum. It must be noted that for some assays there are differences between plasma and serum; however, I’m not going to address these in this article as the differences are often subtle. The big difference between plasma and blood is that plasma is optically clear and blood is not, and this determines the type of analytical methodology that can be used.

There are certain cases where plasma is the only option. An example of this is the measurement of plasma potassium. Potassium is actively pumped into cells, so any degree of hemolysis will skew the data and the assay will fail. Conversely, there are some instances where it is vital that whole blood is measured and not plasma. One example is the measurement of the anti-rejection drug tacrolimus, which partitions primarily into blood cells. There are, however, plenty of assays where plasma or blood could equally be analysed. One of these is the detection of hydroxylated vitamin D metabolites. These biomarkers are found only in the plasma fraction and not in the cellular fraction. To compare plasma and blood concentrations of the vitamin, either the % HCT must be measured for each sample, or an average HCT value is taken and the observed concentration is corrected with respect to the HCT fraction. Alternatively, new reference ranges must be agreed on for blood. So it can be argued that there are many assays where, given the right analytical technologies, a dried blood assay could be just as effective as the common wet plasma assay.
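The HCT correction described above can be sketched numerically. For an analyte confined to the plasma fraction, the whole-blood concentration is roughly the plasma concentration scaled by the plasma fraction (1 − HCT), so a plasma-equivalent value can be estimated by dividing the measured blood concentration by (1 − HCT). The function name and the example numbers below are illustrative assumptions, not values from this article:

```python
def plasma_equivalent(blood_conc: float, hct: float) -> float:
    """Estimate the plasma-equivalent concentration of an analyte
    that resides only in the plasma fraction of whole blood.

    blood_conc : concentration measured in whole blood (any unit)
    hct        : hematocrit as a fraction (e.g. 0.45 for 45%)
    """
    if not 0.0 <= hct < 1.0:
        raise ValueError("hematocrit must be a fraction in [0, 1)")
    # Plasma-only analyte: blood_conc ~= plasma_conc * (1 - hct)
    return blood_conc / (1.0 - hct)

# Example: 55 nmol/L measured in whole blood at an assumed HCT of 0.45
# corresponds to roughly 100 nmol/L in plasma.
print(round(plasma_equivalent(55.0, 0.45)))  # 100
```

In practice the HCT used here would be either a per-sample measurement or an agreed population average, which is exactly the choice the text describes.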

So why Plasma?

There are a number of arguments as to why many assays have been developed from plasma. The first and most important is the choice of detector. Many of the assays run on routine clinical analyzers today work by measuring a change in the wavelength or intensity of light, or some other change in the electromagnetic spectrum of a sample. The methods by which this is achieved range from simple enzyme-catalyzed reactions to sophisticated immunoassay experiments, but all use light to detect changes. What is key for these assays is that plasma is required because it allows light to pass through it. Blood, being the colour that it is, is incompatible with many of these assays. Furthermore, instrument manufacturers will often declare a maximum allowable % hemolysis.

Another reason why plasma is chosen over blood is that plasma is seen as a simpler matrix in which to measure analytes. A further argument for plasma is based on what is more physiologically relevant, blood or plasma? There are compelling arguments on both sides. Nevertheless, with the emergence of newer technologies like LC–MS and some immunoassays, assays developed from blood are indeed possible. And blood does offer some distinct advantages.

An argument for blood

Taking a child for a blood test is one of the most stressful experiences that both a parent and child can undergo. Moreover, the smaller the child, the less free circulating blood there is. Blood volumes depend on the age and size of the individual. An average 70 kg adult has approximately 5 L of blood, which is plenty for standard blood tests, where vacutainers hold up to 10 mL of blood per tube. However, a neonate weighing 3 kg has only approximately 250 mL of blood, so specialised blood tubes are needed to collect small volumes of blood (~1 mL).

Care needs to be taken over the total allowable amount of blood that can be taken from an individual, and there are various guidelines in the literature for what this limit is. Let’s take 5% of total blood volume per 30-day period as an example: this is only about 12 mL for a 3 kg neonate! That allowance is quickly consumed if regular blood draws are needed. For this reason, the smaller the blood samples, the better for the patient. Harvesting good-quality plasma as a microsample is possible but poses technical and practical challenges. Analyzing whole blood samples solves this issue; however, analyzing whole blood microsamples on clinical chemistry analyzers also poses huge technical challenges. There must be a better way to conduct blood tests? The answer could come from the emerging Omics revolution.
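The arithmetic behind the 12 mL figure above is simple to reproduce. As a sketch, assuming a circulating blood volume of roughly 80 mL per kg of body weight (an illustrative round figure; the true value varies with age, and the article's own numbers imply ~70–85 mL/kg) and the example 5% per-30-day limit:

```python
def max_draw_ml(weight_kg: float,
                blood_ml_per_kg: float = 80.0,
                limit_fraction: float = 0.05) -> float:
    """Estimate the maximum blood volume (mL) that may be drawn
    from a patient over one 30-day period.

    weight_kg       : patient weight in kilograms
    blood_ml_per_kg : assumed circulating blood volume per kg
    limit_fraction  : allowable fraction of total blood per period
    """
    total_blood_ml = weight_kg * blood_ml_per_kg
    return total_blood_ml * limit_fraction

# A 3 kg neonate: ~240 mL total blood, so only ~12 mL per 30 days.
print(round(max_draw_ml(3.0), 1))   # 12.0
# A 70 kg adult, for comparison: hundreds of mL per 30 days.
print(round(max_draw_ml(70.0), 1))  # 280.0
```

A single 10 mL vacutainer would consume most of the neonate's monthly allowance, which is exactly why microsamples of tens of microlitres are so attractive.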

Are Microsampling and Omics the answer?

Omics is a broad term for the emerging science of global screening. Omics is all about finding the proverbial needle in the haystack, whether the haystack is the proteome, the genome or the metabolome. Omics requires very sensitive instruments with high levels of selectivity. The sensitivity of some of these instruments is allowing some labs to employ microsamples in their very promising research.

In conclusion, imagine a future in which a universal detector could rapidly map out the phenome and the genome of an individual from a single drop of blood. If a significant abnormality is detected, data is sent immediately to a physician via a smart device, and the physician can then respond in real time with an intervention. What we are talking about, of course, is part of the ‘personalised medicine’ revolution, whose infancy we are already seeing. If this happens, then Dr Robert Guthrie’s microsampling legacy will have evolved into the mainstay of clinical diagnosis.

 

This article was featured in an interactive supplement on microsampling published on Bioanalysis Zone.