
The Use of Predictive Analytics to Improve Quality in Clinical Trials

Quality Assurance: Independent or an Enabler – or Can it Be Both?

The impact of poor quality in a clinical trial, often discovered late in the process, can not only add the cost of the necessary re-work but can, in some cases, lead to rejection by a regulatory authority. Using predictive analytics and scorecards throughout the lifecycle of a study can, in many instances, prevent a quality issue from occurring, or mitigate its impact in others. The need to apply this approach in Quality Assurance (QA) is becoming more important due to the increasing complexity of clinical research and the increased use of technology to capture patient data remotely, as is the case with decentralised and hybrid trials.

Quality Assurance – An Evolving Discipline

The traditional role of the QA department is to provide oversight of processes and patient safety by way of audits. This approach involves an auditor looking at a sample of data in a process and making a judgement on whether the process is robust and fit-for-purpose, or has gaps. The challenge with such an approach is that it is reactive: it involves a deep assessment of why the process isn’t working, or why there was noncompliance with the process, followed by a Corrective Action and Preventive Action (CAPA). Too often, however, we see repeat CAPAs with the same root cause and the same preventative action, i.e. retraining. Yet, just as technology and the insights we can gain from innovation are changing and evolving many areas of the clinical trial process, there is a real opportunity to develop new and innovative processes in the area of QA.

The role of QA departments is moving beyond audits at a given point in time towards building processes and procedures that enhance the value they deliver on an ongoing basis. Predictive analytics is one of the key approaches enabling this expansion of how QA departments support the clinical trial process. This, coupled with the use of scorecards in the appropriate settings, can prevent serious data quality issues from occurring in a study. In this article we’ll outline how using data analytics can deliver greater value and drive an expansion of the QA department’s role.

What’s Enabling this Evolution?

Predictive analytics is an approach that examines data, trends and content to answer the question, “What is likely to happen?” It involves looking for patterns and trends in real-time throughout the lifecycle of a clinical trial.
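As a concrete illustration of answering “what is likely to happen?”, a QA team might project next month’s audit-finding count from the trend in recent months. The sketch below is a deliberately simplified, hypothetical model (an ordinary least-squares line over equally spaced periods), not a prescribed methodology; the function names and data are illustrative assumptions.

```python
def linear_trend(values: list[float]) -> tuple[float, float]:
    """Least-squares slope and intercept over equally spaced periods."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) \
        / sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x


def forecast_next(values: list[float]) -> float:
    """Extrapolate the fitted trend one period ahead."""
    slope, intercept = linear_trend(values)
    return slope * len(values) + intercept


# Illustrative monthly finding counts: a rising trend flags a process
# likely to produce more findings next month.
monthly_findings = [2, 3, 5, 6]
projected = forecast_next(monthly_findings)
```

In practice a real model would account for study phase, site mix and seasonality; the point here is only that a trend, refreshed in real-time, can be turned into a forward-looking signal rather than a retrospective count.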

In the field of clinical trials, we already have a vast amount of data available to us from external audits, regulatory inspections and internal audits. Putting the findings into agreed, consistent categories before data mining is a critical step in being able to refresh the data in real-time. Following this step, an experienced expert and a data analyst can examine the root cause of each category. This allows the QA department to direct the business to the areas that lead to repeat findings: for example, the issue may be a sub-optimal process, or a repeat human error within an over-complicated process. The development of scorecards can also bring considerable insight into the process. They are the product of a snapshot assessment of a process (or multiple processes) embedded in a study, providing a “score” or colour code indicating whether a process is under control or out of compliance, even before a study has begun.
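The scorecard mechanism described above can be sketched in code. The thresholds, category names and red/amber/green scheme below are illustrative assumptions, not the authors’ actual methodology; the idea is simply that each assessed process yields a score that maps to a colour code.

```python
from dataclasses import dataclass

# Hypothetical pass-rate thresholds for the colour coding (an assumption).
RAG_THRESHOLDS = [(0.9, "green"), (0.75, "amber")]  # below 0.75 -> "red"


@dataclass
class ProcessCheck:
    """One assessed process: checks passed out of checks performed."""
    name: str
    passed: int
    total: int


def score(check: ProcessCheck) -> float:
    return check.passed / check.total if check.total else 0.0


def rag_status(value: float) -> str:
    for threshold, colour in RAG_THRESHOLDS:
        if value >= threshold:
            return colour
    return "red"


def scorecard(checks: list[ProcessCheck]) -> dict[str, str]:
    """Map each process name to its colour code for the study scorecard."""
    return {c.name: rag_status(score(c)) for c in checks}
```

A study team could run such a scorecard before first patient in, against processes as designed, so an out-of-compliance (“red”) process is visible before it generates findings.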

The Benefits of Predictive Analytics and Scorecards

Our experience has shown a number of clear benefits from the use of predictive analytics, not least the ability to assess risks both before and during a study, thereby allowing us to combine initial, inherent and operational risks. It has allowed for timely conversations with clinical trial team members in advance of a process being executed. The approach steers the team in the right direction, avoiding the need to perform root cause analysis after something has gone wrong.