Conquering Data Integrity by Eliminating Human Error
Microbial Solutions
Matthew Paquette

Five Frequently Asked Questions about data integrity, risk mitigation and human error in endotoxin testing

Data integrity is a critical element of an organization’s quality program, and oversights have recently been brought to the forefront by regulatory agencies citing violations and inadequacies in inspection findings, audits, and warning letters. Reducing the risk of human error in our manufacturing and laboratory processes helps ensure that we comply with data integrity laws and regulations while building quality into our everyday practices and keeping our patients and our drugs safe. As laboratory testing shifts from observation-based qualitative methods to less subjective quantitative methods, a focus on data integrity becomes even more critical to ensuring overall product quality and patient safety.

Our recent webinar, Data Integrity: Eliminating Risk & Human Error in Endotoxin Testing, discussed in detail how an organized, risk-based approach to closing process gaps caused by human error allows users to demonstrate that the data generated to prove product quality is accurate and has integrity. Here are five of the most frequently asked questions from webinar attendees on what can be done to mitigate errors and maintain data integrity in the lab.

What are some other processes in the laboratory that could be deemed too subjective and that present a high risk for human error?

As an industry, we look critically at our current processes and methods, and we should recognize that a majority of these methods can be considered antiquated given how our systems and manufacturing processes have evolved. Furthermore, as we take a deeper dive into each process, we should understand that these methods rely heavily on the human factor to produce an accurate, recordable result.

Other industries, such as the airline and automotive industries, have studied and published work on the effects of human error and how to mitigate risk, so we have an opportunity to apply those lessons to gain operational efficiencies and remain compliant. One realization is that many laboratory processes fall into the duplication category of human error risk mitigation, meaning that to prevent human errors, another set of eyes is required to review and confirm work that has already been performed. This recent trend, known as the “four eyes principle”, is typically implemented to further safeguard against subjectivity in result reporting. Reliance on this relatively ineffective approach suggests that analyst-dependent errors are becoming more apparent in traditional testing.

Other laboratory processes that fall into the duplication category tend to also be very subjective in nature. One highly subjective method is the traditional plated bioburden test, performed frequently to look for microorganisms in manufacturing processes, manufacturing suites, and water systems. This test relies on an analyst to perform a plate count for recordable data and carries an inherent risk of inaccuracies. As our laboratories strive to operate more efficiently and do more with less, our analysts’ bandwidth is increasingly stretched thin. This creates the potential for mistakes by overworked and distracted analysts, where plate counts could be missed altogether. Another factor is how the analyst defines a colony on a plate. Different analysts count colonies in different ways, leading to different plate counts for the same plate. Which analyst should be considered correct?
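The analyst-to-analyst variability described above can be quantified. The sketch below uses entirely hypothetical counts (not data from the webinar) to show how a simple coefficient of variation across analysts reading the same plate makes the subjectivity visible:

```python
# Illustrative sketch with hypothetical data: quantifying analyst-to-analyst
# variability in manual plate counts via the coefficient of variation (CV).
from statistics import mean, stdev

# Hypothetical CFU counts reported by four analysts reading the SAME plate.
counts_by_analyst = {"A": 47, "B": 52, "C": 44, "D": 55}

values = list(counts_by_analyst.values())
cv_percent = stdev(values) / mean(values) * 100  # sample stdev / mean, as %

print(f"Mean count: {mean(values):.1f} CFU")
print(f"CV across analysts: {cv_percent:.1f}%")
```

A CV near zero would indicate analysts agree closely; a CV of several percent or more, as in this made-up example, signals exactly the kind of subjectivity that duplication-based review tries, imperfectly, to catch.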

Other laboratory processes that fall into the duplication category are the microorganism identification methods that rely on the human factor to make a subjective call on a microorganism’s identity. These methods include gram staining, for determining whether an organism is gram negative or gram positive, and API strips, which rely on a color change to identify microorganisms in our manufacturing processes, end products, or water systems. One thing to keep in mind is the common theme across all processes that require the duplication category for risk mitigation: these processes depend on a very subjective reading for results, and the human factor is central to those readings.

How are regulators looking at automation in terms of data integrity and human error?

Overall, regulators are becoming very familiar with laboratory and process automation as part of our manufacturing and laboratory processes. As the regulatory agencies grow accustomed to this technology, they gain a better understanding of how these automation technologies interact with an organization’s quality systems and problem-solving strategies. As the data integrity problem evolves along with an organization’s strategy for mitigating risk, so too does the regulators’ view of the two main factors associated with the data integrity concept: software compliance and the risk of human error. Regulators view automation favorably as a solution for mitigating the risk of human error and for making our processes more data driven and less reliant on subjective readings. They recognize that by working toward the elimination of human error in our processes, we are not only making our processes more efficient but also increasing the quality of the products we produce. This has an overall positive impact on the health of the patients we serve across all industries.

What kinds of programs can be created to investigate and mitigate risk to human error in the laboratory?

Companies that operate in regulated industries understand that they are obligated to perform well-documented investigations that drive to root cause, using data and facts to support any decisions made as a result of the investigation. Most firms also recognize that a well-rounded investigation makes their processes as efficient as possible, with the root cause analysis ensuring that past problems do not recur.

Companies perform investigations and mitigate risk in a variety of ways, usually involving an organized strategy predicated on utilizing available data and facts for problem solving and root cause analysis. One method common across many industries is the Define, Measure, Analyze, Improve, and Control strategy, often referred to as DMAIC. This strategy uses elements from Six Sigma to provide a highly organized structure that defines the scope of the problem, evaluates baseline data, relies on visual brainstorming to reach root cause, implements process improvements based on collected data, and controls the future state of the process to prevent backsliding. Overall, the DMAIC strategy provides firms with a structured, proven method for solving complex process problems and investigations while recording the decision-making process, so that they remain compliant and can defend their actions to regulatory agencies.
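The Measure and Control steps of DMAIC ultimately reduce to before-and-after numbers. The sketch below, using hypothetical figures (not from any real firm), shows the kind of baseline comparison an investigation team might record to demonstrate that an improvement actually worked:

```python
# Illustrative sketch with hypothetical numbers: a DMAIC-style baseline
# ("Measure") versus post-improvement ("Control") error-rate comparison.

def error_rate(errors: int, total: int) -> float:
    """Fraction of records affected by human error."""
    return errors / total

# Measure: baseline from hypothetical deviation records (18 of 1,200 records).
baseline = error_rate(errors=18, total=1200)

# Improve/Control: rate after an assumed automation rollout (3 of 1,200).
post_improvement = error_rate(errors=3, total=1200)

reduction = (baseline - post_improvement) / baseline * 100

print(f"Baseline error rate: {baseline:.2%}")
print(f"Post-improvement:    {post_improvement:.2%}")
print(f"Relative reduction:  {reduction:.0f}%")
```

Recording the calculation alongside the raw counts is what lets a firm defend the improvement decision to a regulator with data rather than assertion.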

The important thing to keep in mind when a firm sets up its strategy for performing investigations and mitigating the risk of human error is that regulators are increasingly exposed to highly organized decision-making methods. This means regulators increasingly expect investigations to utilize some form of problem solving and risk mitigation based on representative data collected from that firm’s specific process, along with decisions that are scientifically sound.

Are laboratories spending a lot of time investigating results from assays where the root cause is later deemed to be human error?

Investigations at firms operating in any regulated industry tend to involve cross-functional teams consisting of colleagues from quality control, quality assurance, manufacturing operations, and engineering, plus a lead investigator or project manager. Developing a cross-functional team requires a great deal of effort, time, and organization, but the reward is high given the complexity of the problems being addressed. Most firms have documentation in place that defines how long an investigation should take, but in my experience, investigations involving human error tend to be very complex and involve taking a critical look at the processes used to obtain a result or value. Because manufacturing or laboratory processes need to be evaluated in these types of investigations, firms tend to spend a large amount of time and effort resolving investigations involving the human factor. This is one key reason to adopt automation in our manufacturing and laboratory processes: to mitigate our risk as an industry of human error and data integrity violations.


Do you see rapid micro methods as a potential solution to data integrity issues in the micro lab?

Rapid micro methods continue to have an important influence on the industries that have implemented them. Not only have they made firms more efficient, but as regulatory issues have evolved, they have also led to great leaps in strategies for compliance and process improvement. Many rapid micro methods have turned very subjective assays into assays with discrete results. This, of course, increases a firm’s understanding of its process and provides the potential for greater control over the future state of its manufacturing or laboratory processes. Making our processes less subjective directly improves an organization’s ability to mitigate the risk of human error by making processes less reliant on duplication and moving them into the categories of error prevention or error proofing. Implementing rapid micro methods allows for a heavier adoption of manufacturing or laboratory automation, which is the most effective way for a firm to mitigate its risk and exposure to potential human error, and it directly strengthens the firm’s stance on data integrity compliance. It is also important to recognize the other organizational benefits of adopting automated systems, including freeing up highly trained laboratory staff to focus on more value-added development or investigational opportunities, whether that means more complex laboratory assays, LEAN implementation and continuous improvement, or other process control activities.

To learn more about reducing your organization’s exposure to human error and strengthening your firm’s stance on data integrity, please contact us at [email protected] or visit our website.