What’s in a name? Not much, according to Shakespeare. But the man who penned “A rose by any other name would smell as sweet” was a 16th-century poet and playwright, not a 21st-century clinical research professional. For us, names matter. Despite recent efforts to standardize the definition of Source Data Verification (SDV), the term still means different things to different people, and that needs to be navigated very carefully.
TransCelerate Defines Source Data Review (SDR)
TransCelerate BioPharma (TCB), an industry consortium of 21 large member companies, has outlined a comprehensive approach to risk-based monitoring (RBM) and developed supporting analysis tools and a 5-part training curriculum. In its 2013 RBM position paper, TCB coined the term Source Data Review (SDR) to describe the review of source data for protocol compliance, quality of documentation, and site processes. TCB characterizes SDR as a high-value activity, in contrast to the low-value activity of simply verifying accurate transcription of data from source documentation to CRFs.
For example, suppose clinical trial source documentation indicated Subject #8 took a 50 mg tablet of Imitrex, but failed to note that the patient took the medication for migraine relief. Further suppose that neither the medication nor the migraine was recorded on the CRF. Transcription verification would detect that the patient’s use of Imitrex was not transcribed onto the CRF. But it would not detect the absence of migraine data, because the source document and the CRF match on that point; the migraine is missing from both. It’s the SDR process that would investigate why the patient took Imitrex, and eventually discover that the migraine event was missing from both the source and the CRF. [1]
Law And Order: SDV
(newSDR + newSDV = oldSDV)
In the process of coining the term SDR, TCB formally narrowed the definition of SDV to mean transcription verification only.
Here’s the potential problem. SDV is a decades-old term that has often encompassed much more than transcription, frequently including the very activities TCB now attributes only to SDR. It’s both true and unfortunate that SDV has sometimes been performed poorly, even robotically, but that doesn’t mean the traditional definition of SDV was limited to consistency checks between source and CRF data.
Along with many in industry, FDA traditionally took a broader view of SDV, judging from this 2009 FDA warning letter excerpt.
“100% SDV was completed for Subject #123 at Site #456, according to Monitoring Reports; however, study monitors failed to identify that no physical examination, wound assessment, or overall clinical assessment was documented in study source documentation or on the CRFs for this subject’s Day-8 visit.”
According to the TCB definition, SDV would never have been able to detect this error because the source and CRF were consistent with each other; both were missing the same data. Yet, it’s clear that FDA fully expected the Sponsor’s SDV procedures to detect the data omission.
To What Extent Has the TCB Terminology Been Adopted?
Presumably, TCB member companies have adopted the new terminology, but many others have not. Most professionals my colleagues and I interact with in the course of doing business, at industry forums, and through social media don’t mean “transcription verification only” when they say SDV. For them, a CRA performing SDV would have caught that missing migraine. Such a mismatch in usage could lead to significant misunderstandings about metrics, monitoring expectations and conduct, and trial management. Despite the strong influence of both the consortium and its individual members, we can’t pretend the issue was entirely settled by TCB’s RBM publication. The reality that the newer, much narrower definition of SDV may be limited to those “at the sharp end of making RBM a reality” [2] should concern us.
TCB and others have published statistics showing the low percentage of SDV-generated queries [3], leading to the conclusion that SDV does not contribute much to data quality. In response, companies are beginning to dramatically scale back on their SDV percentage. That’s all incredibly logical, and totally consistent with the RBM philosophy of committing resources where they’re needed and conserving resources where they won’t have much effect. In contrast, since SDR is a high-value activity, it contributes significantly more to data quality than SDV. It follows, then, that levels of SDR should be higher than levels of SDV under an RBM approach. That’s precisely what the examples in the TCB position paper show. Any confusion, then, between the new (transcription-only) definition of SDV and the traditional (all-inclusive) definition of SDV could be disastrous. Sponsor and CRO personnel involved in planning, managing, and conducting a clinical trial must have the same interpretation of SDV, or risk unintentionally decreasing the level of a vital QC activity.
“When you say SDV, do you mean…?”
The distinction between true source data review and simple transcription verification is a useful one. We just need to be careful to define our terms clearly, use them consistently, and confirm that whomever we’re talking with shares the same interpretation. This may mean defining the term SDV in SOPs, Monitoring Plans, Clinical Trial Agreements, and other pertinent documents.
By Laurie Meehan
-----------------------------------------------------------------------------------------------------------------
[1] Actually, this review may not be done onsite at all, but might be performed remotely. The TCB position paper states that “SDR is necessary to evaluate areas that do not have an associated data field in the CRF or system available for more timely remote review.” Since the CRF would have data fields for both the concomitant med (entered after SDV detected it missing) and the associated migraine (still missing after SDV), the remote data review would be able to detect that the Imitrex noted in the CRF did not have an associated adverse event in the CRF.
[2] This is a phrase I borrowed, with gratitude, from Algorithmic, Inc.’s insightful blog post.
[3] TCB member companies report that SDV queries, on average, represented 7.8% of total queries generated. SDV queries on critical data represented just 2.4% of total queries. At the end of June, using data from over 1,000 studies available through the Medidata Clinical Cloud, TCB completed a broad analysis of the impact that SDV and SDR have on data quality. Results of this analysis will be published later this fall.