One- and 2-year Flare Rates After Treat-to-Target and Tight-Control Therapy of Gout

Results From the NOR-Gout Study

Till Uhlig; Lars F. Karoliussen; Joe Sexton; Tore K. Kvien; Espen A. Haavardsholm; Fernando Perez-Ruiz; Hilde Berner Hammer


Arthritis Res Ther. 2022;24(88) 

This study examined flare frequency and predictors of flares over 2 years in gout patients actively treated with ULT. Four out of five patients experienced a flare during year 1, but only one in four during year 2. Flares occurred most frequently during months 3–6 (46.8%).

Importantly, crystal depositions at baseline were evaluated by three methods (subcutaneous tophi, ultrasound, and DECT), and all three methods predicted flares at months 9–12, while ultrasound and DECT also predicted flares at year 2. This is a novel finding, and the determination of crystal load by three methods and over 2 years strengthens the validity of these results. Crystal depositions resolve only slowly during therapy, and flares must therefore be expected in patients with a high crystal burden.

We also found that patients with high self-efficacy for gout pain had an independently lower risk of flares during months 9–12, whereas patients with previous colchicine experience at baseline had an increased risk of flares. We have previously shown in NOR-Gout that high self-efficacy contributes to achieving the target SUA level at 1 year.[19]

Patients with frequent flares may have used colchicine more frequently both before and during the first months of the study. Colchicine use in this study may therefore be more an indicator of frequent flares and disease severity, and our non-randomized design does not allow us to study the prevention of flares with colchicine.

Interestingly, no other demographic or lifestyle factors, SUA, or medication predicted flares in our study. While high SUA does increase gout incidence and flare recurrence,[30] no relevant relationship between low SUA and flares was found in a systematic review based on RCTs,[31] whereas results from the extension studies indicated that lowering and maintaining serum urate to < 360 μmol/L was associated with some reduction in the occurrence of gout flares, in line with some other studies.[12,16,32] Thus, the association between low SUA levels and reduction in flares seems weak. Flares have also been associated with decreases and fluctuations in urate levels in response to pegloticase treatment,[33] a finding which supports the hypothesis that it is not momentary SUA levels but rather fluctuations that could initiate an inflammatory process manifested as a flare.

Other studies find frequent flares early after initiating ULT[3,34] or over time,[32] and especially during the first 3–6 months after initiating ULT.[15,35] In a recent randomized controlled trial, gout flares were even increased in the active ULT arm during the first year but reduced in year 2 compared with the usual-care arm.[36] We report a high frequency of flares during all quarters of the first year, but mainly during months 3–6, when many patients no longer used prophylactic treatment with colchicine. We set flares during months 9–12 as the primary clinical outcome, expecting that after ambitious ULT the SUA levels would by then have been low and stable for some time. In our study, we planned for patients to receive prophylactic colchicine only for the first few months, as previously recommended,[20] but treatment was not strictly supervised, and only a minority of patients were still using colchicine at 6 months as recommended in the most recent EULAR recommendations from 2016.[13] The observed high frequency of flares during months 3–6 supports consistent flare prophylaxis after initiating ULT.

The absence of consistent clinical predictors of flares was also observed in a long-term evaluation after incident gout.[37] Other studies find that alcohol consumption[38] and comorbidities such as hypertension and diabetes are associated with more flares.[39] In patients with a gout flare during a hospital stay, flares can be predicted based on factors observed before admission.[40]

The reporting of flares in clinical studies of gout has not been standardized, and various methods have been used. Flares in gout show high variation,[3] and there are challenges with flare reporting, including the quality of flares.[21] The lack of a standardized and validated flare definition prevents comparisons and within-group discrimination[41] but can now be overcome with a validated method for self-report.[9]

Our study is large, with frequent follow-up visits, and shows that while the promoted urate target is realistic in daily clinical practice, gout flares must be expected.

Limitations of our study include the single-center design. Secondly, flare assessment was mainly self-reported, and the study was initiated before publication of validated self-reported flare criteria.[9] Thirdly, recall bias most likely affected reported flares, especially during year 2, as there were no study visits between the 12- and 24-month follow-ups. A patient diary for flare reporting could have reduced recall bias; however, the consultation with study nurses at the 2-year visit gave an opportunity to recall flares from the preceding year. Finally, the observational nature of our study and the lack of a control group do not allow causal inferences.

Our study finds frequent flares with increasing cumulative incidence during the first year, even though ULT led to low SUA levels already after 3–4 months.[19] Four out of five patients must expect at least one flare during the first year of ULT, but flares are clearly less frequent during the second year. The degree of crystal deposition at baseline was associated with the frequency of flares during both years, supporting that ULT needs to be optimized to achieve the treatment target and remove depositions. Further research should apply a validated definition of flares and investigate whether flares decrease in intensity and duration during treat-to-target ULT.