Uric Acid and Evolution

Bonifacio Álvarez-Lario; Jesús Macarrón-Vicente

Rheumatology. 2010;49(11):2010-2015. 

Hypotheses on Evolutionary Advantages of the Loss of Uricase

Several independent mutations in the uricase gene occurred during the evolution of hominids, as well as in Old and New World monkeys. These mutations have been interpreted as clear evidence of an important evolutionary advantage for the early primates with increased UA.[21,24] Likewise, since purine degradation is much less complete in higher animals than in those we consider lower, certain enzymes must have been lost during animal evolution, and it is assumed that this loss conferred some evolutionary advantage.[15] On the other hand, if UA were merely a harmful waste product, it would be difficult to explain why the kidneys recover 90% of filtered UA[25] instead of eliminating it. Both hominid evolution and the physiology of renal urate handling thus point to UA as something beneficial to be retained rather than something harmful to be removed. These facts have led various authors to propose the hypotheses discussed below on the evolutionary advantages of the loss of uricase and the subsequent increase in UA.

Higher Antioxidant Capacity and Greater Longevity

One paradox of metabolism is that, while the large majority of complex life forms require oxygen to live, oxygen is a highly reactive molecule that damages living organisms by producing reactive oxygen species (ROS). ROS are present in cells under physiological conditions and become toxic when their production rate exceeds the antioxidant defence capacity of the cell.[26] Oxidative stress therefore denotes a disturbance of the pro-oxidant–antioxidant balance in favour of the former.

The system that prevents and repairs oxidative damage includes enzymes such as superoxide dismutase and glutathione peroxidase, together with antioxidants and radical scavengers such as vitamin E and the β-carotenes in the lipid portion of cells, and glutathione, ascorbic acid and UA in the aqueous phase.[17] UA is one of the most important antioxidants in human biological fluids: it is a powerful radical scavenger and can also chelate metal ions such as iron and copper, converting them into poorly reactive forms unable to catalyse free-radical reactions.[26] UA is thought to contribute >50% of the antioxidant capacity of blood.[26,27] For this reason, Ames et al.[16] proposed that the loss of uricase expression and the subsequent increase in UA levels had the evolutionary benefit of increasing antioxidant capacity, thereby extending the life expectancy of hominids and decreasing age-specific cancer rates. The loss of uricase may be linked to the earlier loss of the capacity to synthesize vitamin C,[28] which occurred 40–50 million years ago through a mutation in L-gulono-lactone oxidase; because the primates of that epoch ate large quantities of vitamin C in their diet, the mutation was harmless at the time.[22] In later epochs the diet changed, vitamin C intake fell and the resulting loss of antioxidant capacity could be offset by the loss of uricase and the increase in UA.[22]

Oxidative stress has been associated with ~100 physiological and pathological conditions, including ageing and cancer.[26] Macromolecular oxidative damage products accumulate gradually with age, as does ROS production by the mitochondria, and both are inversely related to the maximum life expectancy of the species.[29] However, there is little evidence that the antioxidant effect of UA increased life expectancy in hominids. Although there are data supporting a role for oxidative stress in ageing in invertebrates,[28,30] the evidence in mammals is less clear.[31,32] In fact, in mice with a combined deficiency of the antioxidant enzymes superoxide dismutase and glutathione peroxidase, increased levels of oxidative stress parameters and more neoplasia in older mice were observed, but no decrease in life expectancy.[32] Likewise, doubling superoxide dismutase activity in mice did not extend life expectancy.[30] Nor is there any evidence that patients with high UA live longer; in fact, patients with gout have an increased risk of death, mainly from cardiovascular causes.[33–35] Furthermore, other mammals, such as elephants, have achieved survival similar to that of humans, and greater than that of most other higher primates, despite having uricase and UA levels as low as 0.2 mg/dl.[36]

To Maintain Blood Pressure during Low Salt Ingestion

Fossil evidence suggests that hominids of the Miocene epoch (24–6 million years ago) inhabited sub-tropical forests and were woodland quadrupeds with a diet based mainly on fruit.[37,38] The sodium content of the diet at the beginning of the Palaeolithic period, in the mid-Pleistocene (1–2 million years ago), was very low, ~690 mg/day (1.9 g NaCl), compared with a mean of 4000 mg/day (10 g NaCl) in the current American diet. Sodium ingestion by Miocene hominids was probably even lower, since they ate only fruit and leaves; with such a strict vegetarian diet, intake is estimated at only 225 mg/day (0.6 g NaCl).[17,22] Watanabe et al.[17] demonstrated that an increase in UA can maintain blood pressure under conditions of low salt ingestion, both acutely (by stimulating the renin–angiotensin system) and chronically (by inducing salt sensitivity through the development of microvascular and interstitial renal disease). The increase in blood UA could thus have enabled hominids to maintain blood pressure in times of low salt ingestion, and it has been suggested that this UA-driven rise in blood pressure may have been essential for hominids to maintain an upright posture.[27]
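As an illustrative aside (a check on the figures above, not a calculation from the original paper), the sodium amounts can be converted to their NaCl equivalents using the approximate molar masses of sodium (23 g/mol) and chlorine (35.5 g/mol):

$$
m_{\mathrm{NaCl}} \;=\; m_{\mathrm{Na}}\times\frac{M_{\mathrm{Na}}+M_{\mathrm{Cl}}}{M_{\mathrm{Na}}} \;\approx\; m_{\mathrm{Na}}\times\frac{58.5}{23} \;\approx\; 2.54\,m_{\mathrm{Na}}
$$

so 690 mg of sodium corresponds to ~1.8 g NaCl, 4000 mg to ~10.2 g and 225 mg to ~0.57 g, close to the rounded figures quoted in the text.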

Various studies have shown that hyperuricaemia increases the risk of developing hypertension within the following 5 years, independently of other risk factors.[39–41] Furthermore, hyperuricaemia is also common among adults with pre-hypertension, particularly when there is microalbuminuria. The observation that hyperuricaemia precedes the development of hypertension indicates that it is not merely a consequence of hypertension per se.[4] On the other hand, increased levels of UA have been observed in 40–60% of patients with untreated hypertension and in almost 90% of adolescents with recent-onset essential hypertension.[4] The strength of the relationship between UA levels and hypertension decreases with the age of the patient and the duration of hypertension, suggesting that UA may be more important in younger patients with recent-onset hypertension.[42] However, even accepting the association between UA and hypertension, this increase in blood pressure appears to be more a consequence of the loss of uricase than the reason for it. In fact, other herbivorous mammals of the epoch that are still with us today, with diets presumably as low in salt as those of the hominids, managed to adapt while retaining uricase activity. In any case, if the gain was the ability to maintain blood pressure in times of low salt ingestion, one would expect evolution to have found a way out of the opposite problem we now face, with a very high salt intake in the diet.[38]

UA and Intelligence

The oldest hypothesis was put forward by Orowan,[43] based on the structural similarity between UA and brain stimulants such as caffeine and theobromine. According to this idea, the loss of uricase activity and the subsequent increase in UA levels could have produced a quantitative and qualitative leap in the intellectual capacity of hominids during evolution. It has been suggested that UA, like other purines, can stimulate the cerebral cortex and that the superior intellectual power of the higher primates may be partly due to these higher levels of UA.[44] Consistent with this idea is the finding that glutamic acid, which is involved in the endogenous production of UA, seemed to improve cognitive function when given therapeutically in cases of mental retardation.[45]

Several authors have found a significant correlation between UA levels and higher intelligence in children and young adults,[46–48] as well as an association between gout and higher intelligence. Sofaer and Emery[44] studied the presence of gout in highly gifted people (intelligence quotient >148) and their families, observing that the prevalence of gout in males with an average age of 36 years was 1.8%, higher than that in the general population aged 58 years (1.5%), and that the prevalence of gout among family members of both sexes at a mean age of 34 years was double (0.6%) that of the general population aged 44 years (0.3%). That is to say, highly gifted people and their families showed a higher prevalence of gout at earlier ages than the general population. However, other authors have not found this association between UA and higher intelligence,[49] and the observed findings are difficult to separate from the dietary and social habits associated with economic, cultural and intellectual status.

On the other hand, if we associate the higher intelligence of the Homo genus with the marked increase in its brain volume that occurred over a relatively short time, it is unlikely that the loss of uricase was involved in these changes if we accept the dating of the mutations. As mentioned earlier, the loss of uricase in hominids is thought to have occurred in the Miocene epoch, with the fundamental mutations dated to >13–15 million years ago;[20,21] however, the large increase in cerebral volume occurred much later. Australopithecus afarensis was already bipedal 3.5–4 million years ago and had a brain capacity of 375–500 cc, similar to that of the great apes today; this capacity tripled over a short period of ~2.5 million years in the Homo genus.[50]

UA and Neuroprotection

Other authors have suggested that the evolutionary advantage of an increase in UA could be its antioxidant activity in the brain.[51] Scott and Hooper[51] argued that the brain is very vulnerable to oxidative damage because it has a high metabolic rate, consuming one-fifth of the oxygen we breathe each day, and because it contains abundant lipid material with a high content of unsaturated fatty acids. The antioxidant defence mechanisms against lipid peroxidation in the brain could therefore be of great importance in preventing oxidative damage in an increasingly complex brain.[51]

There is increasing evidence that UA has protective effects against diseases such as multiple sclerosis and neurodegenerative conditions such as Parkinson's disease, Alzheimer's disease and amyotrophic lateral sclerosis.[25] In all these conditions, UA levels lower than those of controls have been reported and have been associated with a higher prevalence and a worse course of disease, which has led to proposals to raise UA therapeutically to improve prognosis.[25,52–55] For example, gout and multiple sclerosis appear to be mutually exclusive, in that there are no reported cases of multiple sclerosis in patients with gout.[25] However, the level of UA needed to achieve these neuroprotective effects, and whether the improvements obtained by raising UA levels are clinically relevant, remain unknown.

The link between these different diseases could be the role played by oxidative stress in their aetiology and, in particular, the damaging effects on neurons of peroxynitrite, a powerful oxidant formed by the reaction of superoxide with nitric oxide.[51,56] UA prevents peroxynitrite formation by neutralizing cellular superoxide, thereby preventing its reaction with nitric oxide.[56] UA does not appear to be a direct scavenger of peroxynitrite in vivo, since peroxynitrite reacts with CO2 almost 1000 times faster than with UA.[57] However, UA is a scavenger of the free radicals, such as CO3•− and NO2•, that are formed from the breakdown of peroxynitrite.[25,26] Thus, a reduced concentration of UA could decrease the body's capacity to counter the actions of peroxynitrite and other free radicals on the various neuronal components.[26] Besides its antioxidant effects, UA may also exert neuroprotection through astroglia-mediated mechanisms that prevent glutamate-induced toxicity.[25,58] It does not seem likely that protection against these types of disease, which are more prevalent at advanced ages, was the cause of the loss of uricase. Nevertheless, it shows that UA has an important role in neuronal activity, with increasing levels of UA favouring the development of more complex neuronal functions.
