Hundreds of millions of dollars are spent to get a compound from concept to registration. Not only are costs a concern, but the risk for a sponsor launching multiphase clinical trials is extremely high. At the end of the clinical trial process, after having made the required investments, sponsors may find that the data does not support their hypothesis and the compound is abandoned.
More than half of all image-related clinical trial query stoppages result from preventable human errors, each causing a delay of up to seven weeks. These errors range from technical oversights, such as inconsistent data entry and improper configuration of scanning modalities, to simple mistakes such as missing signatures or authorizations and illegible handwriting.
So how can technology play a positive role in this process? The following tips show how, with proper tools and a validated delivery platform for the submission of imaging data by sites, a trial sponsor can mitigate the risk of error and ensure the highest quality of its data at the source.
1. Improve workflow automation: Site coordinators and investigators are involved in several trials from various sponsors, each requiring different types of submission to imaging core labs (ICLs). Since assembling and sending data for clinical trials is a small and infrequent task for overburdened site coordinators, expecting them to remember dozens of submission steps from visit to visit is not reasonable. With varying requirements, it's important to have a smart workflow-based platform that presents the right tool at the right time and eliminates opportunities for the sender to make decisions that are inconsistent with the specific protocol.
2. Implement consistency and quality checks: The user interface must be consistent across all tools to reduce training requirements and the possibility of unnecessary queries. Even though the trial impact of a single data point may not be major, the aggregate of these situations can cloud the overall picture. Think of it like a loom on which you are weaving a certain design. A single thread breaking during the process might not affect the final product; losing many, however, can leave the pattern dull and harder to recognize. If errors or inconsistencies are detected within a step, the tool should alert the user and help them correct the mistake.
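As an illustration of such a check, consider a small validator that inspects a submission record and returns actionable messages for each problem found, so the user can fix every issue before the data leaves the site. The field names and protocol limits here are invented examples, not part of any real system.

```python
# Illustrative sketch (field names and limits are invented): validate a
# submission record and return one actionable message per detected issue.
def check_submission(record: dict) -> list[str]:
    issues = []
    if not record.get("investigator_signature"):
        issues.append("Missing investigator signature.")
    if record.get("modality") not in {"MRI", "CT"}:
        issues.append(f"Modality {record.get('modality')!r} is not allowed by the protocol.")
    if record.get("slice_thickness_mm", 0.0) > 3.0:
        issues.append("Slice thickness exceeds the 3 mm protocol limit.")
    return issues
```

Returning all issues at once, rather than failing on the first, mirrors how a good submission tool lets the coordinator correct everything in a single pass.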
3. Minimize manual data entry: The right tools presented in the most efficient order will help the study coordinator:
- Enter the correct values in the de-identification process.
- Automatically replace tags for which site- or trial-level constants can be used.
- Prepopulate transmittal forms as much as possible to ensure consistency.
- Facilitate the inclusion of additional files or information in the submission.
- Perform automated checks, on-site, of protocol compliance.
- Identify if any data is outside protocol parameters to enable corrections for the current and future submissions.
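The de-identification and tag-replacement steps above can be sketched as follows. The tag names and constants are illustrative only; a real tool would operate on DICOM headers (for example via a library such as pydicom) rather than a plain dictionary, and the de-identification profile would be defined by the protocol.

```python
# Minimal sketch of automated tag replacement during de-identification.
# Tag names and constants are illustrative, not a real protocol definition.
SITE_CONSTANTS = {
    "InstitutionName": "SITE-042",       # site-level constant
    "ClinicalTrialSponsorName": "ACME",  # trial-level constant
}
IDENTIFYING_TAGS = {"PatientName", "PatientBirthDate"}  # tags to strip

def deidentify(header: dict) -> dict:
    """Strip identifying tags and apply site/trial constants automatically."""
    cleaned = {k: v for k, v in header.items() if k not in IDENTIFYING_TAGS}
    cleaned.update(SITE_CONSTANTS)  # constants applied with no human input
    return cleaned
```

Because the constants come from configuration rather than typing, the same values appear in every submission from the site, which is exactly the consistency the transmittal-form prepopulation step aims for.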
4. Integrate with downstream systems: Whether using a commercial EDC/CTMS system or an internal subject data-tracking application, a project manager should not be required to enter information about a subject twice. All submissions and their data should automatically update the systems at the ICL, CRO, and sponsor in real time.
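One way to picture this single-entry, real-time fan-out is a publish/subscribe pattern: the submission event is recorded once and every downstream system receives the same record. The in-memory callbacks below stand in for what would, in practice, be calls to the EDC/CTMS APIs at the ICL, CRO, and sponsor; none of these names refer to a real product.

```python
# Hedged sketch: one submission event updates every subscribed downstream
# system, so subject data is never entered twice. Subscribers here are
# in-memory callbacks standing in for EDC/CTMS API endpoints.
from typing import Callable

subscribers: list[Callable[[dict], None]] = []

def register(system: Callable[[dict], None]) -> None:
    """Subscribe a downstream system (ICL, CRO, or sponsor) to submissions."""
    subscribers.append(system)

def publish_submission(event: dict) -> None:
    """Deliver the same submission record to every downstream system."""
    for system in subscribers:
        system(event)

icl_log: list[dict] = []
sponsor_log: list[dict] = []
register(icl_log.append)
register(sponsor_log.append)
publish_submission({"subject": "SUBJ-001", "visit": "Baseline"})
```

The key property is that the project manager's single action produces identical records everywhere, eliminating the divergence that duplicate manual entry invites.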
5. Directly deliver into the trial repository: User-entered data should not be subjected to manual re-entry at the core lab/CRO. Since quality checks have already occurred during assembly of the submission, re-entry adds no value; it can introduce errors, degrade data quality, and break traceability back to the source.
Providing investigator sites with the right tools to assemble and submit their clinical trial data, and automatically checking the quality and completeness of their work prior to submission, greatly enhances the efficiency with which ICLs convert this input into usable trial data.