Brian Beato, Ph.D., has served as a senior scientist in mass spectrometry at AIT Bioscience in Indianapolis, Indiana since its inception in January 2009. His primary function continues to be developing and validating quantitative bioanalytical LC-MS/MS methods, but he has a deep-seated passion for increasing efficiency and preventing errors in regulated laboratory environments. As a Laboratory Director at a bioanalytical CRO, Brian is keenly aware that bioanalysis does not end in the laboratory. In this commentary he argues that automation should not either.
The topic of automation, at least among bioanalytical scientists, usually conjures up thoughts of laboratory instrumentation. Autosamplers connected to myriad bioanalytical instruments have long been the norm. Modern bioanalytical laboratories use automated robotic liquid handlers for parallel processing of samples, 96 or more at a time. Automated plate washers, automated cell counters, and numerous other examples of laboratory automation greatly enhance productivity compared to performing such tasks manually. As an industry, we have spent countless millions on new laboratory technology over the years to increase efficiency via automation, enabling us to generate more data and information per scientist than ever before. Perhaps one reason for this significant investment in laboratory automation is that many of us who justify and influence purchasing decisions are current or former bench scientists who tend to desire the latest and greatest instrumentation and software available, whenever possible.
However, one impact of increased laboratory automation and efficiency is that the data and information bottleneck moves downstream, out of the laboratory and into quality control (QC), quality assurance (QA), and reporting functions. Regardless of specifically where a bottleneck exists, the intended customer still suffers delays in data or report release. Thus, limiting automation to the laboratory is not necessarily sufficient. In most cases, QC, QA, and reporting endeavors have remained largely manual compared to upstream laboratory activities. When laboratory automation yields increased data and information flowing downstream, more non-laboratory staff are typically required to audit and report it. Making matters worse, it is often easier financially to obtain additional hardware and software than to increase headcount.
Ideally, we in the bioanalytical community should rethink automation to encompass all related workflows, including traditionally downstream activities. To reduce constrictions, we should consider making investments in automation and new technology in QC, QA, and reporting, commensurate with investments made in the laboratory.
As one example, QC is an area ripe for automation via technology and software investment. The fact is, most auditors expect to find errors when auditing bioanalytical studies and facilities. It is usually only a question of how many errors will be found, and of their severity. When QC is performed after-the-fact, any and all errors found have already occurred. While there are numerous paths available to address these errors, whether big or small, nothing beats error prevention. Error prevention can be accomplished via automated QC in real time, up front, rather than afterwards. Automated QC can be realized through the use of advanced electronic laboratory notebook (ELN) functionality serving as process control software, instead of merely as a simple “electronic sticker book”.
Consider, for instance, a regulated laboratory environment in which an analyst, eager to weigh out a reference standard, neglects to verify the accuracy of the balance with certified weights. While by no means the worst standard operating procedure deviation one could encounter, consider when the issue would typically be discovered, and how much downstream work would already be affected by then. That reference standard is used to prepare a stock solution, which in turn generates calibration standards or QC samples, which are later used during validation or sample analysis. Because QC activities typically take place after-the-fact, we can each contemplate how many days might pass before the deviation is discovered, and how timelines, bottom lines, and customer confidence would be affected by the re-work or justification it requires. Again, while this is not the worst error imaginable, it is illustrative. Similar errors, from use of expired or incorrect reagents to use of incorrect method versions or matrix, all add up to erode data quality.
Now imagine instead preventing such errors through automated process control software in the form of an advanced ELN. When the analyst attempts to use the balance, the ELN workflow checks the most recent balance verification and compares its verified range to the mass to be weighed. If a sufficient, standard operating procedure-compliant verification has not yet been performed, the workflow forces the user to perform one before the reference standard may be weighed out. This prevents a cascade of subsequent experiments from ever taking place without documentation of proper balance verification. Similar business logic can be programmed into workflows to prevent all sorts of errors and oversights, enabling bioanalytical scientists to spend more time and energy focused on science.
Actually, some of us are already beyond imagining this type of expanded automation, and have implemented rather extensive "smart" ELNs used for process control. Here, the QC process is no longer a downstream, after-the-fact function, but occurs in real time in a comprehensive, highly automated manner, to prevent errors. The extent to which errors can be prevented is up to the user. Naturally, highly impactful errors that tend to occur frequently are programmed out first. But while error checks are being programmed into automated electronic workflows, why stop at preventing only critical errors? As with all programming, time invested up front can be multiplied many times over in future time savings. Quality increases, re-work decreases, and turnaround times improve.
In addition to redesigning the QC function, we’ve used the automated “smart” ELN concept to also help automate the QA and reporting processes. Automation helps these traditionally downstream processes keep pace with ongoing efficiency improvements in the laboratory. This new type of automation investment may also be more aligned with technology investments in the laboratory, since it does not require traditional headcount increases.
Bioanalytical science and engineering have long been generating automation innovations that continually increase data and information output from the laboratory. There is no reason to suspect the pace of such innovation will slow any time soon. Those who ignore the need for increased automation in related bioanalytical areas, such as QC, QA, and reporting, risk an increasingly inefficient future. The good news is that technology already exists, and merely awaits deployment to more completely automate bioanalysis, end-to-end.