John Mandrola, MD


November 15, 2016

A common scenario: a patient presents with new-onset atrial fibrillation (AF) and multiple risk factors for stroke. Everyone agrees that anticoagulant therapy is warranted to reduce the risk of stroke.

The question is: Which agent—warfarin or a new oral anticoagulant (NOAC) drug?

Multiple clinical trials have compared these agents, and though the absolute differences are small, NOAC drugs reduce the risk of stroke, major bleeding, and intracranial bleeding relative to warfarin.

A provocative observational study from Swedish researchers[1] presented as a rapid-fire oral abstract here at the American Heart Association 2016 Scientific Sessions sheds light on the decision to use warfarin or a NOAC in newly diagnosed patients with AF.

Recall that a major limitation of all warfarin vs NOAC trials was the generally poor quality of warfarin management. The time in therapeutic range (TTR) varied from 55% to 64%, meaning warfarin-treated patients were under- or overanticoagulated more than a third of the time. Although this situation mirrors the real world, especially here in the US, it's not the case in countries like Sweden. In 2015, the mean TTR in Swedish general practice was a remarkable 73.4%.

The question thoughtful clinicians ask is this: Because the absolute differences in the clinical trials were so small, would better warfarin management negate the benefit of NOACs?

Short of fully opening the data sets of the industry-sponsored trials, which is unlikely, the next best way to answer this question is through large national registries like those in Sweden.

For this study, researchers included patients starting oral anticoagulation (warfarin=36,317; NOAC=12,694) for nonvalvular AF between July 1, 2011, and December 31, 2014. Background and outcome data came from well-validated Swedish clinical registers. Researchers used propensity-score matching to adjust for differences between the nonrandomized treatment groups.

Propensity matching allowed comparison of about 12,000 patients on warfarin and NOACs. Patient characteristics of the cohort were consistent with a typical AF population: mean age 72, 58% male, 19% with prior stroke, and a mean CHA2DS2-VASc score of 3.3. Average follow-up was only 300 days. Distribution of NOACs was dabigatran (Pradaxa, Boehringer Ingelheim), 40.3%; rivaroxaban (Xarelto, Bayer/Janssen), 31.2%; and apixaban (Eliquis, Bristol-Myers Squibb/Pfizer), 28.5%.
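
For readers who want a concrete picture of what propensity-score matching does, here is a minimal, hypothetical sketch in Python. It is not the authors' actual method or covariate set; it simply shows the general idea: fit a logistic model predicting treatment assignment from baseline covariates, then pair each NOAC patient with the warfarin patient whose score is closest, within a caliper.

import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_match(X, treated, caliper=0.05):
    # X: (n, k) array of baseline covariates (e.g., age, sex, prior stroke)
    # treated: boolean array, True = NOAC, False = warfarin (hypothetical labels)
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    controls = list(np.where(~treated)[0])
    pairs = []
    for i in np.where(treated)[0]:
        if not controls:
            break
        j = min(controls, key=lambda c: abs(ps[i] - ps[c]))  # nearest-score control
        if abs(ps[i] - ps[j]) <= caliper:
            pairs.append((i, j))
            controls.remove(j)  # 1:1 matching without replacement
    return pairs

Outcomes are then compared only within the matched pairs. As discussed below, this balances measured covariates but cannot address unmeasured confounders.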

In terms of efficacy end points, the annual incidence of all-cause stroke and systemic embolism was 1.58% in the warfarin group and 1.35% in the NOAC group, a difference that was not statistically significant. Likewise, all-cause stroke and ischemic stroke were not significantly different.

The major finding was that patients on NOAC had a 51% lower annual rate of hemorrhagic strokes than those on warfarin (0.16% vs 0.35%, respectively; hazard ratio [HR] 0.49; 95% CI 0.28–0.86). Major bleeding was also observed less often in the NOAC group (2.76% vs 3.61%, respectively; HR 0.78; 95% CI 0.67–0.92).
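
To put those relative numbers in absolute terms, a quick back-of-the-envelope calculation (my own arithmetic from the reported annual rates, not figures from the abstract) gives the absolute risk reduction and the implied one-year number needed to treat:

def arr_and_nnt(rate_warfarin_pct, rate_noac_pct):
    arr = rate_warfarin_pct - rate_noac_pct  # absolute risk reduction, percentage points per year
    nnt = 100.0 / arr                        # patients treated for 1 year to avoid one event
    return round(arr, 2), round(nnt)

print(arr_and_nnt(0.35, 0.16))  # hemorrhagic stroke: ~0.19 points/yr, NNT ~526
print(arr_and_nnt(3.61, 2.76))  # major bleeding: ~0.85 points/yr, NNT ~118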

Differences in GI bleeding, all-cause mortality, and MI did not reach significance.

The authors concluded that compared with newly instituted warfarin treatment with a mean TTR of 70%, NOACs caused fewer major and intracranial bleeds but were not superior in preventing strokes or systemic embolism.

Considering the Limitations

Before these results provoke you too much, let's consider the limitations.

These are retrospective, observational data from nonrandomized groups. Propensity matching is better than no matching, but it's hard to balance all the confounders. Another limitation is that the study included only new warfarin prescriptions, so the analysis may not pertain to patients already established on warfarin. What's more, average follow-up in this study was less than a year.

That said, this is an interesting signal.

It's hard to do warfarin management better than Sweden. This is a key point because one interpretation of the warfarin-vs-NOAC trials was that the drugs are of equal efficacy, but NOACs did better because they impart less pharmacokinetic variability. These observational data suggest the possibility that even when warfarin management is improved, NOACs provide enhanced safety. In an email, Dr Sanket Dhruva (Yale University, New Haven, CT) wrote that "this study lends confidence that the [randomized controlled trial] results of the three NOACs were not due solely to TTRs in the 50s and 60s for warfarin-treated patients." But he added a word of caution on the methods: "A propensity-score–matched analysis may still have unmeasured confounders."

Helping newly diagnosed AF patients decide which anticoagulant to take is difficult. It's so close. More and more, I've been leaning toward the NOACs. I think more about reducing the burden of being a patient. NOAC therapy minimizes the typical warfarin inconveniences—frequent INR checks and diet–drug and drug–drug interactions.

If this interesting safety signal is confirmed in other studies, it's hard not to favor NOAC drugs over warfarin. If NOACs are safer and equal (or better) in efficacy, should they not be favored?

JMM
