Two Thirds of Clinical Trials Unpublished 2 Years Later

Diana Swift

February 22, 2016

Leading academic research centers are doing a poor and variable job of disseminating clinical trial results, according to a cross-sectional analysis published online February 17 in BMJ, and that is leading to serious information gaps and ethical lapses.

Despite there being moral and sometimes legal obligations to circulate results in the public domain, Ruijun Chen, MD, from the Department of Medicine, University of California, San Francisco, and colleagues found that just 29% of completed trials at 51 major research centers had been published 2 years after completion, and a mere 13% had submitted their results to ClinicalTrials.gov.

"There is no excuse for the fact that researchers are using resources and conducting experiments on humans, taking up their time and maybe exposing them to risk, and then failing to report the results. Why, is beyond me," corresponding author Harlan M. Krumholz, MD, a professor of medicine at Yale University, New Haven, Connecticut, told Medscape Medical News.

Delayed data dissemination denies healthcare providers and researchers crucial information. "If all the trial data had been made available in a timely way, then people would have realized the cardiovascular risks of Vioxx [Merck] 2 years before it was taken off the market in 2004," said Dr Krumholz, who coauthored a study of pooled trial data showing increased risk with the drug as early as 2000.

The researchers called for timely action to correct this lapse in commitment to the investigative mission and failure to follow through on the research process. "Additional tools and mechanisms are needed to rectify this lack of timely reporting and publication, as they impair the research enterprise and threaten to undermine evidence based clinical decision making," the authors write.

Identifying centers with at least 40 interventional trials registered on the national database ClinicalTrials.gov and having primary completion dates from October 2007 to September 2010, the investigators looked at the proportion of trials that either published results in peer-reviewed journals or reported them to ClinicalTrials.gov, both overall and within 24 months of completion.

They identified 4347 trials at the 51 centers, 23% of which enrolled more than 100 participants and 33.5% of which investigated cancers and other neoplasms. The studies varied widely in size, phase, design, masking, and randomization.

Overall, the results of 2892 (66.5%) of the trials had been disseminated (defined as published or reported on ClinicalTrials.gov) as of July 2014, with the findings of 1560 trials (35.9%) disseminated within 24 months of completion. In 25.7% of cases, more than 24 months had elapsed from completion to dissemination.

Across institutions, the proportion of trials that disseminated results within 24 months ranged from 16.2% (University of Nebraska) to 55.3% (University of Minnesota), and the proportion published within 24 months of completion ranged from 10.8% (University of Nebraska) to 40.3% (Yale University). The proportion of results reported on ClinicalTrials.gov ran from 4.1% (Memorial Sloan-Kettering Cancer Center) to 55.4% (MD Anderson Cancer Center).

The overall rate of dissemination ran from 45.9% at the University of Nebraska to 76.7% at the Universities of Minnesota and Rochester. There was also a more than twofold variation in median time from study completion to publication or reporting, from 13.9 months (University of California, Irvine) to 28.3 months (Boston University).

Previous studies found similar suboptimal dissemination rates, with 25% to 50% of trials remaining unpublished several years after completion.

Dr Krumholz is at a loss to understand the lapse in data sharing, which he considers immoral. "It's not hard to report your results. It only takes about an hour," he said. Nor does reporting to the national trial database jeopardize future publication in a peer-reviewed journal, he said. "Some studies are small, but you have to ask: If the results are not important enough to be reported, was the study important enough to be done? You should not have conducted the study if you can't provide timely results."

He noted that those with concerns about a drug could perhaps tap into trial results circuitously by checking relevant studies registered on ClinicalTrials.gov and asking the investigators directly to share their unreported data.

And although researchers do need adequate time to ensure accurate data, "some, inexplicably, never share their findings," Dr Krumholz said. In his view, less effort should be put into identifying reasons and more into rectifying the unjustifiable status quo: "We can spend our time understanding the causes, or we can just fix it."

Dr Krumholz and two coauthors are recipients of a research agreement from Johnson & Johnson (Janssen), through Yale University, to develop methods of clinical trial data sharing, and of contracts from the Centers for Medicare & Medicaid Services to develop and maintain performance measures that are used for public reporting. Dr Krumholz and one coauthor are recipients of a research agreement from Medtronic, through Yale University, to develop methods of clinical trial data sharing, and of a grant from the US Food and Drug Administration to develop methods for postmarket surveillance of medical devices. Dr Krumholz also chairs a cardiac scientific advisory board for UnitedHealth. The other authors have disclosed no relevant financial relationships.

BMJ. Published online February 17, 2016.
