The Fall Magi Conference is over, but there is still plenty to learn from the event. To help with that, here is a summary of the session Ken Lownie, our Head of North American Operations, did with Dr. Penelope Manasco, CEO of MANA RBM, about the future of clinical trial remote monitoring and, in particular, risk-based remote monitoring. Let’s get into it.

A Real-World Example of Remote Monitoring

Ken kicked things off by talking about one Agatha customer who switched to remote monitoring when COVID hit. Suddenly they couldn’t go onsite to view study documents or check processes.

That customer – the medical device company Optos – was already using Agatha’s eTMF application, so they reached out to Agatha to discuss their remote monitoring requirements. They were looking for a secure, cloud-based solution that would let them set up a separate workspace for each site as a repository for its binder documents. As the sponsor, Optos could then access each workspace remotely to monitor site documents, perform QC (quality control) checks, and issue comments or notes.

Working with Optos’s requirements, Agatha quickly developed an electronic investigator site file (eISF) application that Optos continues to use today. Ken shared a conversation he had with the Optos lead user, Giulia Bignami, where she noted four insights from performing clinical trial remote monitoring:

  1. Basic collection, sharing, and QC are going well. However, sites initially perceived the change as a burden (even though it is actually easier!)
  2. They saw improved QC checks (including workflow processes and full audits) when they got into production use.
  3. After COVID, Optos expects to adopt a hybrid model. The cost savings related to remote monitoring are too good to ignore.
  4. From here, Optos is considering the remote monitoring application for other needs, including supporting a Post Clinical Survey process, which addresses a new EU regulatory requirement.

There’s More to Remote Monitoring Than Documents

Optos’s focus was on remote monitoring of documents and processes for clinical studies. But there is more to consider, especially regarding remote risk-based quality management (RBQM). Dr. Penelope Manasco shared the work she does in this area.

Dr. Manasco said that with the introduction of ICH E6 (R2), the ICH wanted to move to more rapid reviews – think every week – instead of planning monthly or quarterly reviews. There was a lot of focus on risk management, including:

  • Trials should be conducted to ensure the rights, safety, and well-being of human subjects.
  • The laws and guidelines must be followed.
  • Sponsors and CROs could tell if there was a systematic error (i.e., quality tolerance limits were exceeded).

To ensure these risks are managed appropriately, there are three fundamental approaches to remote oversight.

  1. Targeted SDV (source data verification) – look at the data fields defined as critical. The challenge with this approach is that it can’t identify a systematic error.
  2. Statistical Outliers & KRIs (key risk indicators) – look for outliers in different categories (e.g., sites not performing like the other sites). This approach requires a lot of data to work well. If you don’t have enough data, you risk getting false positives, or worse, false negatives. Also, if all sites have the same problem, you won’t identify the systematic error.
  3. Science-Driven Oversight – the approach Dr. Manasco preferred. In this approach, you define the errors to look for from day one. It’s typically protocol-specific.

Which one is right? Dr. Manasco said some sponsors do 1 and 2, but you don’t need either of those if you choose to do 3.
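To make the small-sample caveat of the second approach concrete, here is a minimal, hypothetical sketch of a z-score check on one KRI. The site names, query rates, and thresholds are invented for illustration, not drawn from the session:

```python
# Hypothetical sketch: flagging site-level KRI outliers with a z-score test.
# All site names and query rates below are invented for illustration.

def flag_outlier_sites(kri_by_site, threshold=2.0):
    """Return sites whose KRI value lies more than `threshold`
    standard deviations from the mean across all sites."""
    values = list(kri_by_site.values())
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    std = variance ** 0.5
    if std == 0:
        # All sites look identical, so a shared (systematic) error
        # is invisible to this check -- the limitation noted above.
        return []
    return [site for site, v in kri_by_site.items()
            if abs(v - mean) / std > threshold]

# Invented example: data queries per 100 data points, by site.
query_rates = {"Site A": 3.1, "Site B": 2.8, "Site C": 3.4, "Site D": 9.7}

# With only four sites, the obvious outlier inflates the standard
# deviation so much that a z > 2 rule misses it (a false negative);
# loosening the threshold catches it.
print(flag_outlier_sites(query_rates, threshold=2.0))  # []
print(flag_outlier_sites(query_rates, threshold=1.5))  # ['Site D']
```

The empty result at the stricter threshold is exactly the "not enough data" failure mode described above: with few sites, even a glaring outlier may not be flagged.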

Compliance and Quality Aren’t the Same Thing

When you are performing clinical trial remote monitoring, it’s essential to understand that compliance and quality are not the same thing, Ken pointed out. Just because you are adhering to regulations and guidelines does not mean you are actually driving improvements in quality. That requires planning and proactive action, not just conformance with regulations.

Dr. Manasco vehemently agreed and said that, in addition, companies are often trying to fit new technology into old, deeply ingrained methods. She urged a rethinking of quality processes based on risk-based models as PART OF adopting new technologies and added that the industry needs to do a better job of quality oversight.

Both agreed that the FDA is leading the way on new approaches and technologies. Though sometimes maligned as creating obstacles, in this area at least, the FDA is urging progress based on new methods and models.

Managing the Shift to Remote Study Processes

Dr. Manasco offered some guidance for managing the shift to clinical trial remote monitoring. First, she said to focus on improving the following areas:

  1. Human Subject Protection (routinely a source of findings in inspections) – includes protecting PHI and obtaining informed consent.
  2. Primary and Secondary Endpoints and associated data – focus on the analysis and process data.
  3. Safety Data – including SAEs and recognizing SUSARs (suspected unexpected serious adverse reactions). These must be assessed for relatedness to the study drug and for seriousness, and also compared against what has been reported before.
  4. IP Management – this area gets skipped all the time.
  5. Protocol Compliance – keep track of deviations in real-time.
  6. GCP Documentation – if you don’t document it, then you didn’t do it.

Once you’ve identified these six areas, you need to develop a quality plan that outlines who will do the assessments and how you will capture findings, issues, and CAPAs. The point is that while new technologies are available for remote monitoring, new quality oversight strategies have to be combined with the new tools. Baking it all into a comprehensive quality plan served as the summation of the session.

You can listen to the whole session on demand right here.
