Enough With the Coffee Research and Other Distractions

John Mandrola, MD


August 02, 2017

After 2 decades of practicing medicine, I have come to believe that distraction is one of the clinician's greatest foes. We miss important stuff because nonsense steals our full attention.

Having spent the past few years reviewing and reporting on studies, I think distraction also impedes medical progress. The new worship of page views and clickability exacerbates the problem of distracting, wasteful research. Good studies (signal) are drowned out by observational nonsense (noise). In these days of information overload, attention has never been more vital.

Examples abound.

A prominent medical journal recently published two more observational studies on coffee intake. Both showed that coffee intake associates with lower mortality and better cardiovascular outcomes.[1,2] But the literature is replete with studies showing the same thing.[3–6] There are even randomized controlled trials in patients with advanced heart disease showing that caffeine is safe.[7]

Coffee in sensible doses is not harmful; this is clear. Sure, in some people, it makes their palpitations worse. Affected people can simply not drink coffee. Let's stop studying coffee and move our attention to more pressing issues.

Coffee studies look reasonable compared with nutrition studies. In 2012, Dr Jonathan Schoenfeld (Harvard University, Boston, MA) and Dr John Ioannidis (Stanford University, CA) published a systematic cookbook review in which they found that associations with cancer have been claimed for most common food ingredients.[8] The authors note that many single studies on a food have implausibly large effects that then shrink in meta-analyses. Of course they do; you don't have to be an epidemiologist to know that cancer or heart disease risk depends on far more than a single food substance. Nonetheless, the latest blueberry or quinoa study steals our attention.

Chocolate and Clickbait

Then there is chocolate. Please let the chocolate studies stop. Most proclaim chocolate intake associates with better outcomes.[9] My wife, Staci, loves (perhaps needs) dark chocolate. Chocolate is her espresso. But if Staci's good health continues, is it due to the chocolate or to her impeccable adherence to the Mediterranean diet, regular exercise, and regimented sleep pattern? That's the problem: lots of chocolate lovers also practice other good health habits.

Another obvious fact that need not take up journal space is the association between poor living conditions and poor health outcomes. Earlier this year, a highly publicized study in the Lancet showed that living close to heavy traffic was associated with a higher incidence of dementia.[10] Is this helpful? Of course highway fumes are unhealthy, but could other factors have led to the increased risk of dementia? If people had a choice, would they choose to live next to a highway?

Clickbait research can afflict the highest-level medical journals. In April, at the time of the Boston Marathon, the New England Journal of Medicine published an observational study showing in-hospital mortality for acute MI was 3.3 percentage points higher in marathon-affected hospitals. Response time for these hospitals averaged 4.4 minutes longer.[11] The confidence intervals were wide, including the chance of lower mortality in marathon-affected hospitals; the control groups were historical, not randomized, and confounding was likely. But even if the results were true, would it surprise anyone that time-sensitive conditions might suffer when road closures are in force? Everyone knows it's best to have an MI or stroke during the daytime hours of a weekday.

Most recently, some marijuana studies have become a distraction. At the ACC meeting in 2016, a large observational study found MI patients who reported recent marijuana use had a lower odds ratio for in-hospital mortality (OR 0.83) compared with those who did not use the substance.[12] Need for mechanical ventilation, however, was higher in the marijuana group (OR 1.19). A commenter noted that this was "both interesting positive and negative information," but "there's probably a lot we don't know about mechanisms." In the press room, I bemoaned to another journalist how weak and useless this study was. He agreed but said he had to cover it because it had "marijuana" in the title.

I don't mean to say marijuana isn't relevant. It is. Legalization has clearly increased the number of adult users.[13] But what we need more of is focused studies of plausible actions or harms. Wasteful are marijuana studies showing that regular heavy users think more slowly and snack more.

Another danger from weak observational studies is their appeal to confirmation bias. For instance, I strongly believe overtreatment in cardiology leads to harm. That's why it was easy for me to look past the weaknesses of an observational study published in JAMA Internal Medicine showing high-risk patients with heart failure and cardiac arrest had lower 30-day mortality during dates of national cardiology meetings.[14] The authors observed that during these days fewer MI patients underwent PCI procedures but mortality rates were similar. The authors rightly spend many words on the limitations of the analysis, but a Google search chronicles the mainstream media's coverage claiming that cardiac care improves with fewer procedures. Maybe it does, but this study can't say that. If I wanted to be clever and misleading, I could make PowerPoint slides from this paper to bolster my bias on overtreatment.

Hope for Improvement

Two recent papers point to areas of promise in stemming the distractions from wasteful research.

The first is a well-publicized report from 70 scientists who propose lowering the default P-value threshold for statistical significance for claims of new discoveries from 0.05 to 0.005.[15] The authors acknowledge that this change in threshold "will not address problems of multiple hypothesis testing, P-hacking, low power, or other biases like confounding," but argue that "reducing the P-value threshold complements—but does not substitute for—solutions to these problems." I was well into my years of study review before I learned that use of P=0.05 means being wrong about new discoveries at least 30% of the time.[16]
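That 30% figure falls out of back-of-envelope arithmetic. Here is a minimal sketch in Python, assuming (purely for illustration) that one in every eleven tested hypotheses is a real effect and that studies have 80% power; these priors are my assumptions, not figures from the cited report:

```python
# Illustrative false-discovery arithmetic (assumed priors, not data
# from the cited paper): prior odds of a true effect of 1:10 and
# 80% statistical power.

alpha = 0.05   # significance threshold
power = 0.80   # chance a real effect reaches significance

# Per 11 hypotheses tested: 1 true effect, 10 nulls.
true_positives = 1 * power     # expected true discoveries (0.8)
false_positives = 10 * alpha   # expected false discoveries (0.5)

fdr = false_positives / (true_positives + false_positives)
print(f"Share of 'discoveries' that are false at P<{alpha}: {fdr:.0%}")
```

Under these assumptions, roughly 38% of significant findings are false positives; less optimistic priors or lower power push the figure higher still.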

Not all observational studies are distractions. In a review article published in the August 3, 2017 issue of the New England Journal of Medicine, former CDC director Dr Thomas Frieden describes numerous ways observational studies can help inform public health.[17] Population studies after public-health initiatives (vaccination programs or smoking bans) and registries for rare diseases were two examples he cited in which observational data can be helpful.


It's unlikely that an opinion column from one doctor will stem the onslaught of wasteful, distracting research studies. But some simple things may help lessen the distraction of overhyped research:

  • Journals could be more restrictive in their acceptance of weak papers and more transparent in listing the weaknesses of a study. Why not put limitations in the abstract?

  • Authors and journals should tone down the press releases. Please. I am far more likely to read and report on a study that exudes honesty in the lead paragraph of a press release.

  • Readers need to be more informed and skeptical about common biases. Think more about confounding factors and reverse causation. And always remember that correlation does not mean causation.
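That last warning about confounding can be made concrete with a small simulation (a hedged sketch; every number here is invented). A hidden "healthy lifestyle" variable drives both chocolate intake and good outcomes, so chocolate looks protective even though, by construction, it does nothing:

```python
# Simulated confounding: chocolate has NO effect on outcomes here,
# yet the raw comparison makes it look protective, because a hidden
# "healthy lifestyle" variable drives both. All rates are invented.
import random

random.seed(0)
rows = []
for _ in range(10_000):
    healthy = random.random() < 0.5                      # hidden confounder
    chocolate = random.random() < (0.7 if healthy else 0.3)
    good_outcome = random.random() < (0.9 if healthy else 0.5)
    rows.append((chocolate, good_outcome))

def rate(rows, choc):
    """Share of good outcomes among those with the given chocolate habit."""
    outcomes = [o for c, o in rows if c == choc]
    return sum(outcomes) / len(outcomes)

print(f"good outcome | chocolate:    {rate(rows, True):.2f}")
print(f"good outcome | no chocolate: {rate(rows, False):.2f}")
```

The chocolate eaters fare markedly better, and an observational analysis that never measured the lifestyle variable would report exactly the kind of association this column complains about.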

