The Comparative Effect of Exposure to Various Risk Factors on the Risk of Hyperuricaemia

Diet Has a Weak Causal Effect

Ruth K. G. Topless; Tanya J. Major; Jose C. Florez; Joel N. Hirschhorn; Murray Cadzow; Nicola Dalbeth; Lisa K. Stamp; Philip L. Wilcox; Richard J. Reynolds; Joanne B. Cole; Tony R. Merriman


Arthritis Res Ther. 2021;23(75) 

Our previous study[10] concluded, using percent variance explained, that common genetic variants contribute more to urate levels in the non-gout population than overall diet does. Using attributable fraction measures, we arrive at the same conclusion, importantly also in a cohort of people with gout. Previously, the summed percent variance explained by the 30 genetic variants for urate levels was 8.7%, considerably greater than the variance explained by the DASH diet.[10] Here, the summed PAFs for the 30 genetic variants were 141 to 146%, considerably greater than the PAF for following the DASH diet recommendations in cohorts 1 and 2. Thus, empirically for HU at least, the two approaches of variance decomposition and attributable fractions provide similar support for a greater relative role of common inherited genetic variation than of overall diet in determining urate levels and HU. In the gout cohort, the attributable fractions for urate-lowering therapy were greater than those for diet and BMI < 25 kg/m2. Acknowledging the limitation that we were unable to build compliance, medication dose, and dosing to target into our models (which would contribute to under-estimating the effect of urate-lowering therapy), our data emphasise the importance of gold-standard clinical practice (urate-lowering therapy) in managing HU in gout patients. While weight reduction has established benefits, including for the co-morbidities of gout, our data demonstrate the greater impact of urate-lowering therapy in managing HU in gout.
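Summed PAFs exceeding 100% may look surprising, but Levin's formula makes clear why attributable fractions for multiple overlapping exposures are not mutually exclusive and need not sum to one. A minimal sketch (the prevalence and relative-risk pairs below are invented for illustration and are not values from the study):

```python
def paf(prevalence, rr):
    """Levin's population attributable fraction:
    PAF = p * (RR - 1) / (1 + p * (RR - 1))."""
    x = prevalence * (rr - 1.0)
    return x / (1.0 + x)

# Invented (prevalence, relative-risk) pairs for three overlapping
# exposures -- illustrative only.
exposures = [(0.77, 2.5), (0.60, 2.0), (0.50, 1.8)]
pafs = [paf(p, rr) for p, rr in exposures]

# Because exposures overlap, the individual fractions are not
# mutually exclusive and their sum can exceed 100%.
print([round(f, 3) for f in pafs], "sum =", round(sum(pafs), 3))
```

Each exposure here is individually modest, yet the summed fraction passes 100%, as with the 30 urate variants in the text.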

While it is debatable whether public health efforts should be directed to primary prevention of HU, given the lack of evidence that HU is directly causal of conditions other than gout,[44] two considerations can be drawn. One, efforts would need to focus on interventions for which there is unequivocal evidence that a substantial impact can be made. This is not the case for the DASH diet (our AAF estimate in a multivariable model of the proportion of cases of HU prevented by following a DASH diet was only 6 to 7%). Two, the proportion of cases of HU in the population attributable to being overweight or obese was 24 to 26% in the same model, only slightly more than the proportion attributable to SLC2A9 rs12498742 (22 to 24%). It may seem incongruous to compare these exposures in the context of possible public health approaches to preventing primary HU, given that exposure to a common genetic variant cannot be prevented. It is, however, possible to modify the impact of a genetic variant. The uricosuric drugs benzbromarone and probenecid inhibit the reuptake of filtered urate by GLUT9 (encoded by SLC2A9);[45] it is therefore conceptually possible to target individuals carrying the rs12498742 urate-raising allele to improve urate excretion and prevent HU. From a public health perspective, this is likely a more tractable intervention (in that it targets a single measurable exposure) than preventing obesity, which is caused by multiple environmental and genetic exposures that are not well understood.

That individual foods and estimates of dietary habits associate strongly with urate levels in observational data[10] does not necessarily translate into a clinically significant causal effect. It is instructive to compare association data for the DASH diet score[10] with data from an RCT of the effect of the DASH diet on serum urate levels[27]: the association data show a decrease of 0.023 mmol/L [0.38 mg/dL] between the least and most DASH-like diets in the US population,[10] very similar to the 0.021 mmol/L [0.35 mg/dL] decrease when comparing the DASH diet with an 'average American diet' in the RCT.[27] In both cases, this is a relatively small change attributable to dietary habits, consistent with the evidence presented here for a weak, BMI-mediated causal relationship between diet and urate levels.
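The paired mmol/L and mg/dL figures can be checked with the standard conversion for urate (molecular weight of uric acid ≈ 168.1 g/mol, so mg/dL ≈ mmol/L × 16.81); a quick sketch:

```python
URATE_MW = 168.11  # g/mol for uric acid

def urate_mmol_to_mgdl(mmol_per_l):
    # mg/dL = mmol/L * molecular weight / 10
    return mmol_per_l * URATE_MW / 10.0

# The observational (0.023 mmol/L) and RCT (0.021 mmol/L) DASH
# differences convert to roughly 0.39 and 0.35 mg/dL, agreeing with
# the bracketed values in the text to within rounding.
for mmol in (0.023, 0.021):
    print(mmol, "mmol/L ->", round(urate_mmol_to_mgdl(mmol), 2), "mg/dL")
```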

A further incongruity is the apparent inconsistency between the two dairy-related MR analyses and results from RCTs.[46–48] Using MR as a complementary approach to investigating causality, we found only a small number of weak causal associations between dietary habits and serum urate. Interestingly, two of the significant causal associations represent opposing dietary habits, namely preferentially drinking skim milk or preferentially drinking milk with a higher fat content. The causal effects observed were consistent with these being opposing habits: skim milk consumption associated with increased urate, whilst higher-fat milk consumption associated with decreased urate, at approximately equivalent effect sizes (0.050 mmol/L [0.84 mg/dL] vs. −0.044 mmol/L [−0.73 mg/dL], respectively). However, whilst these results are consistent with each other, they are not consistent with prior studies of milk and dairy proteins in relation to urate levels. Observational studies have reported an inverse relationship between consumption of dairy products and serum urate levels.[7,10,49–52] Many of these studies do not separate dairy products by fat content; however, those that do have found the effect to be limited to consumption of skim or low-fat dairy products.[10,51] RCTs have supported these observational findings;[46–48] in particular, consumption of skim milk products acutely lowered serum urate levels by approximately 10% in 16 healthy adult men.[46] The apparent inconsistency between the MR and RCT results can be explained by the influence of BMI on the MR analysis. BMI appears to be a common upstream cause in the two dairy-related MR associations reported here, and these two dairy-related dietary habits are highly correlated with BMI, several measures of body fat, and weight-loss-related traits, including making major dietary changes to lose weight[26] (Table S4). It is plausible that the MR results reflect dietary recommendations given to individuals with a higher BMI (drink skim or low-fat milk), explaining the contradictory results seen here.

Our BMI genetic instrument explains more variance in the type of milk consumed (~0.5%) than the milk-type instruments themselves do (0.04 to 0.1%),[26] highlighting an important limitation of these analyses. Genetic instruments for dietary habits likely explain only a small fraction of phenotypic variance[26] or may be linked to diet through indirect mechanisms, potentially subjecting the MR analysis to bias towards the null, pleiotropy, or confounding. While multiple MR approaches were used to address some of these pitfalls, future investigation using more biologically based genetic instruments for diet may illuminate previously undetectable causal relationships.

In the sample sets of European ancestry studied here, SLC2A9 rs12498742 had a considerably greater PAF than the ABCG2 rs2231142 variant (29 to 32% vs. 6%, respectively; Table S2). This reflects both the 1.7-fold greater effect size of rs12498742 on serum urate levels and the higher prevalence of its urate-increasing allele (77% vs. 11%).[35] In contrast, in a Japanese study, the PAF for rs2231142 was 29%, compared with 19% for being overweight or obese,[53] suggesting that, for any primary prevention of HU in the Japanese population, targeting ABCG2 dysfunction would be a strategy worth considering. The rs2231142 risk allele frequency is 29% in the East Asian population compared with 9% in the European population. The authors of the Japanese study concluded that ABCG2, at least, is a stronger risk factor for HU than other 'typical' environmental risk factors.[53]
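The frequency dependence of the PAF follows directly from Levin's formula: holding the relative risk fixed, the East Asian rs2231142 risk-allele frequency (29%) yields a PAF nearly three times that of the European frequency (9%). A sketch with a hypothetical relative risk (the value 2.0 is invented for illustration, not an estimate from the study):

```python
def paf(prevalence, rr):
    # Levin's formula: PAF = p * (RR - 1) / (1 + p * (RR - 1))
    x = prevalence * (rr - 1.0)
    return x / (1.0 + x)

RR = 2.0  # hypothetical per-carrier relative risk, illustrative only
for population, freq in [("European", 0.09), ("East Asian", 0.29)]:
    print(f"{population}: PAF = {paf(freq, RR):.3f}")
```

With the same relative risk, the population fraction attributable to the variant scales strongly with allele frequency, which is why the same ABCG2 variant carries far more weight for primary prevention in the Japanese population.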

One limitation of the gout cohort analysis is the possibility of selection (collider) bias resulting from conditioning the sample set on gout ascertainment, which would serve, when testing variables that are risk factors for gout per se, to bias effect sizes towards the null or even in the opposite direction.[54] This phenomenon likely explains the reduced (reversed for age) effect sizes for age, sex, BMI, and diuretic exposure for both risk of HU and change in serum urate levels, and the reduced variance explained, evidenced by non-overlapping 95% CIs compared with cohorts 1 to 3 (Table 2 and Table 3). For SLC2A9, effect sizes and variance explained were lower in the gout cohort, although some confidence intervals overlapped. However, for diet and alcohol, there was no difference in effect sizes (the 95% CIs overlapped), suggesting that collider bias does not have a substantive effect on these estimates within the gout cohort. We note that the prevalence of non-adherence to the healthy eating diet was very similar between the UK Biobank gout cohort and the equivalent non-gout cohort (cohort 3), indicating that a diagnosis of gout did not change dietary behaviour. Selection bias will not influence our effect estimates for urate-lowering therapy; however, estimates for this exposure are likely inflated in the UK Biobank owing to healthy volunteer selection bias,[55] which would over-estimate the effect of urate-lowering therapy because of a more compliant demographic. Our estimate of OR = 20.2 (Table 2) is considerably higher than the hazard ratio of 4.5 reported for achieving target urate in a gout cohort drawn from the UK primary care population.[56] While our estimate is not representative of the general population, it does indicate that the relative effect of urate-lowering therapy on HU and serum urate levels may be greater when compliance with therapy is higher.