The following story is not a fairy tale. It is "your worst nightmare" for some, "recognition at last" for others, and an important lesson for all of us. It's the story of a great study, made possible by the unpaid collaboration of 16 ICUs and the tremendous work of a PhD student and a research nurse. Yet, two years after publication in JAMA, a single error came to light, and it had a major impact on the study outcome.
The study was a cluster-randomized cross-over study in 16 ICUs. In 8 ICUs, all eligible patients (those with an expected ICU stay of more than 48 hours) received Selective Digestive Decontamination (SDD) for 12 months, after which all eligible patients received Selective Oropharyngeal Decontamination (SOD), also for 12 months. In the other 8 ICUs the order was reversed (first SOD, then SDD). Both interventions were implemented as standard care with minimal data collection. Most ICUs agreed to enter the data manually in an electronic CRF, whereas some preferred an automatic extraction of the required data from their own electronic patient records. Although time-saving for the ICU staff, the latter procedure necessitated manual recoding to harmonize the data files.
In November 2016 Evelien Oostdijk, lead scientist of the trial, and now ICU resident in our hospital, received a phone call from her PhD successors. Over coffee she heard about a discrepancy between the results of her study as published in JAMA and the analysis of the same data for an Individual Patient Data meta-analysis. Not long after, it was clear that the coding of the intervention (0 or 1) had been reversed for one of the largest participating ICUs, which provided 16% of all patients. Each month that ICU had delivered an Excel file containing admission and discharge dates, with hours and minutes in the same cell. To harmonize these data with those from the other ICUs, the hours and minutes needed to be removed, requiring several copy-and-paste procedures. And that is where the human error occurred.
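Purely as an illustration of how the manual step could have been scripted: the sketch below assumes cells in a hypothetical "DD-MM-YYYY HH:MM" format (the trial's actual file layout is not described here) and strips the time component in one reproducible pass instead of repeated copy-and-paste.

```python
from datetime import datetime

def strip_time(cell: str) -> str:
    """Parse a hypothetical 'DD-MM-YYYY HH:MM' cell and keep the date only."""
    return datetime.strptime(cell.strip(), "%d-%m-%Y %H:%M").strftime("%d-%m-%Y")

# Illustrative rows standing in for one monthly Excel delivery.
rows = [
    {"admission": "03-01-2010 14:25", "discharge": "07-01-2010 09:10"},
]
harmonized = [{k: strip_time(v) for k, v in row.items()} for row in rows]
```

A script like this fails loudly on an unexpected cell format, whereas a slipped paste in a spreadsheet fails silently.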
So, all original data sets were checked again, the analyses were repeated, and now there was a small but statistically significant benefit for patient survival during the SDD period, which had not been apparent previously. We immediately contacted all co-investigators, sent a correction of the conclusion to all intensivists and clinical microbiologists in the country, and asked JAMA for a retraction and replacement, published today. The lessons for all of us:
- Think of the unthinkable when it comes to checking the quality of your data.
- If you think that everything possible has been checked, check again.
- Always let others use your original data for new (or just the same) analyses.
So, what now about SDD? The good news is that there are now two large cluster-randomized cross-over studies in which SDD does better than its comparator (better than nothing in one study and better than SOD in the other). That reduces the likelihood that the benefits were due to differences in patient characteristics, which cannot be fully excluded in such trials. The results of the Individual Patient Data meta-analysis will appear soon.
And what about Evelien Oostdijk? She has been a shining example of scientific integrity.
One thought on “Let others use your data!”
Thank you for sharing your experience. Please note that manual data manipulation should always be avoided; instead, use reproducible code to convert master files into analysis files.
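One way to picture the commenter's suggestion: a conversion script that re-derives the intervention coding from the randomization schedule rather than trusting a hand-edited column. Everything here is a hypothetical sketch (the field names "icu", "period", "intervention" and the schedule are invented for illustration, not taken from the trial's files).

```python
# Hypothetical per-ICU randomization schedule: which intervention each
# ICU should have delivered in each 12-month period.
EXPECTED = {("ICU-A", 1): "SDD", ("ICU-A", 2): "SOD"}

def convert(master_rows):
    """Turn master-file rows into analysis rows, verifying the coding."""
    analysis = []
    for row in master_rows:
        key = (row["icu"], int(row["period"]))
        expected = EXPECTED[key]
        # Fail loudly on any mismatch instead of silently carrying a
        # reversed 0/1 coding into the analysis file.
        if row["intervention"] != expected:
            raise ValueError(f"coding mismatch for {key}: {row['intervention']}")
        analysis.append({"icu": row["icu"],
                         "period": row["period"],
                         "sdd": 1 if expected == "SDD" else 0})
    return analysis
```

Because the 0/1 column is computed, not pasted, rerunning the script on the original master files reproduces the analysis file exactly, and any coding reversal surfaces as an error rather than two years later.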