Jump to content

Good Clinical Practice Committee

News and Updates

Including:

Committee Activities
GCP QA Clinic Review from the 2016 Annual Conference
Course Review: Fraud and Misconduct
Seminar Review: Data Integrity


GCP Committee Activities, January 2017
Angelika Tillmann, GCP Committee Secretary

During the last quarter of 2016, the GCP Committee held two face-to-face Committee meetings.

Activities during this quarter included:

  • Continued participation in the Programme Committee for the 2016 annual conference and attendance at the conference itself.
  • Continuous monitoring of the GCP Discussion Forum on the RQA website; several interesting questions were answered by members of the GCP Committee, with full answers published in the Q&A section of the website
  • Continued preparation of training courses, new seminars and webcasts for 2016, e.g.:
    • Training Course on Sponsor and Vendor Cooperation in Clinical Trials – 8th November 2016 as a pre-conference course
    • Seminar on Risk Management is under discussion
  • Provision of comments on four EMA guidance documents released for consultation

GCP QA Clinic Review
Trish Henley

It’s that time of year again. A time when we join together in a room filled with thousands of combined years of QA and GCP experience to debate, to discuss, to argue those burning topics.  It’s the annual GCP QA Clinic. 

This year we decided to shake it up a little bit by embracing technology and spiking the PowerPoints with some voting questions. 

Chris Shepherd, the commendable compere, started us off with a controversial ice-breaker: did we think the UK’s awesome MPs would vote for Brexit or not?  Seems we might be desperately putting our faith in our MPs, with 58% saying they thought they would vote to stay in the EU.  But I’m cynical.  Alors, au revoir mes amis!

To the topics of discussion!  Barney Horne started us off with a discussion on what requirements there are for reporting quality deviations.  Turns out most of the delegates use Excel to collect deviations (58%), with only 20% using a bespoke IT system.  I felt perfectly at home, as I too use Excel, albeit not validated, nor audit trailed, nor with very good versioning.  But collect the deviations we do, and somehow, in some way, we will get them into the clinical study report (CSR). 

Note: a comment from the audience should be heeded.  The MHRA would prefer to peruse the entire spreadsheet in electronic form, so they can play with the data and sort it and stuff, rather than see a print-out.  Because then they can see what columns you may have *accidentally* deleted or hidden.

Cathy Dove then provided us with background on remote trials, and just how remote we can go.  From the trial presented, pretty remote!  This raised some very interesting points as to what type of trial could be conducted remotely (most seem to prefer non-interventional as opposed to phase I-II), how to audit (do we visit the participant at home and check through their medicine cabinets for proper storage of the investigational medicinal product (IMP)?), and what about data integrity issues surrounding equipment and its proper use?  Is the device reliable at sending data?  Do we know if it stops working?  How are we checking consent?  How do we know the participant understands?  How do we know the intended person is consenting and not a 10-year-old miscreant?  Do we check passport photos or fingerprints?  And do we need consent to do that first?  Do we use video diaries?  45% thought that this would work for phase IV studies only.  But, just a few questions to consider!

Simon Molloy then gave us a presentation on CAPAs and introduced new terms (to me at least): little CAPAs and big CAPAs.  Makes them rather personable.  Most delegates (75%) have a formal mechanism for recording and investigating non-compliance and tracking resultant CAPAs.  However, a comment from the audience was made that if the question had asked how many companies had ‘effective’ CAPA mechanisms, the percentage would have been significantly lower!  Not all CAPAs are the same, so some organisations include only the big ones (inspection findings, critical and major findings from audits), whereas some delegates track everything, even the little ones, as sometimes the little things become big things, and how else do you keep an eye on things?  Most delegates have a CAPA system based on audit findings (45%), with the rest of the delegates split between tracking everything or using another internal system.  It’s the unknown unknowns that can really become a pickle later on.

Kerry Bunyan (the RQA award winner!) then provided us with a thought-provoking presentation on the value of sponsor audits from the CRO perspective.  She firstly asked whether these were good value for money and use of the CRO’s time, to which the vast majority said they were!  Kerry then gave us a quick and dirty calculation of roughly how much they cost her company, and asked whether the money could not be better placed elsewhere.  Is there another way that the CRO could demonstrate their amazingness?  Could they produce metrics based on inspection findings etc. for their sponsors?  Intriguingly, a proposal for an accreditation scheme was mooted by the audience.  If the MHRA accredit the phase I units, why not have a register of ‘good’ CROs?  An intriguing prospect, perhaps a topic to be covered at a future conference?  It was noted that part of the sponsor’s audit visit was to look at their own trial master file (TMF), which will change as we move towards more electronic TMFs.  There was a plea for sponsors to be clearer up front regarding what they want from CROs, rather than asking at the end. 

We finished the clinic with some details on the US chapter and upcoming regional fora in that part of the world.

And last but not least, delegates told us what training courses and seminars they would like to have over the next year.  And so it shall be done.

Course Review: Fraud and Misconduct: Detection, Investigation and Management, 16 September 2016

Trish Henley (Course Principal), Quality & Governance Manager, London School of Hygiene & Tropical Medicine

The morning dawned rainy, wet, windy, with a chill in the room (from the efficient air conditioning).  Perfect, then, for a day of fraud and misconduct!  Five brave souls entered the room with trepidation.  What would they learn?  How would this be translated into procedures for their own organisations?  How would this help with the for-cause audit scheduled for next week?  And did the RQA really have a course about committing fraud and misconduct, and how not to get caught?

To set the tone for the session, we all made introductions and shared what we hoped to achieve from the session.  Expectations and aims stemmed from learning more about detecting fraud, approaches to audit when fraud is suspected and how to manage within a CAPA system. This was helpful to keep us on track, but also to review how we did at the end of the day. 

Paul Strickland introduced the history of fraud and misconduct, and included some interesting cases from some very well-respected scientists, including the “father of a biological science”.  You’ll just have to attend to find out who.  The key to defining fraud is the intention to deceive; it does not include honest error.  Following feedback from the previous session in 2014, we also included some recent examples of fraud in clinical science, and followed one special case throughout the presentations. 

Barney Horne followed this with a session that drilled down into the differences between fraud and ‘just’ non-compliance.  There is a big grey area surrounding non-compliance, with varying degrees of severity, which can sometimes venture into misconduct.

Nicky Dodsworth presented on the differences between investigations, routine audits and for-cause audits.  A highlight was learning more about observational and interview techniques.  Playing dumb is not just for blondes; it is a truly effective technique to lull your victim into a false sense of security, and plays to their ego as they are keen to show off their vastly superior intelligence, which often tells you far more than they intended.  Success!  Nicky also presented on red flags, which are different to white flags, both of which might be present at site, although with very different meanings.  If the white flag is raised at site, be aware.  If you find red flags at site, also be aware!  Be aware of flags in general.

The post-lunch session took the form of reviewing a large spreadsheet to look for these supposed red flags.  I’m not sure that it got everyone’s blood going, but it definitely strained the eyes!  The workshop was aimed at honing our data scrutiny skills in finding where the data just didn’t make sense, or was too similar, or too good to be true.

For their sins, the delegates then had to endure me speaking for 30 minutes on policy development and process improvement.  It was riveting.

Barney and Paul followed this with a double-act on reporting, follow-up and legal consequences.  The recent case study discussed in the very first session was explored further and we could demonstrate exactly what happens when a principal investigator (PI) is very naughty.

The final session was a workshop looking at what delegates would do when faced with various scenarios.  All case studies were based on real events, and made for some very interesting discussions on how we would approach the situation and how we could prevent it from happening in the first place. 

Special thanks to our wonderful tutors: Nicky Dodsworth, Barney Horne and Paul Strickland.

Seminar Review: Data Integrity

'Integrity is doing the right thing when no one is watching' – C.S. Lewis

The new RQA Data Integrity Seminar took place for the first time on 14th September 2016 at the Novotel, Heathrow.  The day was an introduction to data integrity principles, looking at data life-cycle management, exploring regulatory expectations, raising awareness of current data integrity trends and sharing knowledge.  The presenters used real-life examples alongside scenarios to highlight areas where things could potentially go wrong.

The seminar was attended by representatives from pharma, CROs, laboratories, and academic organisations who worked across GCP, GLP, GMP and GVP.

The day started off with an introduction to data integrity, looking at some definitions and the impact that people, systems, processes and the culture of organisations can have on data integrity.  This was followed by a simple practical example which followed a blood sample through the data process, from sample collection to result reporting, looking at all the factors that could potentially impact the validity of the result.

The next session took a look at why data integrity is such a hot topic at the moment and considered the various guidance documents on the subject and the expectations of regulators: ensuring that data are reliable to support good decision making, to make good medicines and to keep patients (like you and me) safe.

A review of some real-life case studies and recent examples of data integrity inspection findings across the GxPs from around the world followed.  This put the impact of data integrity failures into perspective and looked at the different challenges faced by the regulators over the past five years compared with the previous 50 years.

Implementing data integrity policies in different types of organisations was the next discussion point.  Having appropriate arrangements and mechanisms for data integrity in place is an expectation, but deciding how to do this is the difficult part.  Determining where to start (quick hits vs the long game), scope, requirements/references, governance, roles and responsibilities, accountability and definitions were all considered.

The final presentation covered how to approach data integrity audits and how to get over that IT technical barrier.  The best advice was to talk to the people involved in the process, try to speak the same language and gain an understanding of the data flow and the associated systems and procedures.  Connect the system to the process and understand how the systems involved support the process.  Clarify the terminology, even down to the definition of the term ‘data’, and understand the difference between ‘meta’, ‘source’ and ‘raw’ data.  And don’t forget the importance of audit trail review, system configuration, validation and segregation of duties.

The last part of the day was a workshop where the delegates were presented with some fictional data and background information, which included adverse event data from a clinical trial, anti-doping data from a fictional eastern European country and customer satisfaction data from a budget airline.  They were asked to interrogate the data, map out the data flow, identify key control steps in the process and look at where data integrity could have been compromised.

Delegates found all the deliberate, as well as some unintentional, flaws in the data and reported back the results of their review to the rest of the group.

The day finished off with some closing remarks before people departed to many and varied locations across the UK and Europe leaving some nice comments and good feedback which prompted the presenters to ask each other, when shall we do this again? Watch this space.