Can You Have Your Cake and Eat It Too? The Sunlight D-lemma

M.F. Holick


The British Journal of Dermatology. 2016;175(6):1129-1131. 

In this issue of the BJD, Felton et al.[1] report the results of their elegantly designed study determining the impact of low-level simulated summer sunlight exposure, as would be experienced at U.K. latitudes, on vitamin D status and skin DNA damage in healthy adults with skin types 2 and 5.

The photosynthesis of vitamin D has been occurring on this planet for more than 500 million years, ever since phytoplankton began producing vitamin D during sun exposure.[2] Although vitamin D's function in early life forms is not well understood, recent evidence suggests that vitamin D3 stabilizes membranes, which may be important in regulating immune function.[3]

Throughout evolution, many organisms, including vertebrates, have depended on sun exposure for their vitamin D requirement. Our hairy, dark-skinned ancestors in equatorial Africa had both their hair and their skin pigment as mechanisms to prevent sun-induced skin damage. As our ancestors evolved, they lost their protective hair and in turn increased melanin production as an effective natural sunscreen. However, even with the increased amount of melanin in the skin, a small amount of vitamin D3-producing ultraviolet B (UVB) radiation still reached the basal layer, promoting the production of vitamin D3 throughout the epidermis, which was essential for the maintenance of skeletal health throughout life. As our ancestors migrated north and south of the equator, the increase in the zenith angle of the sun reduced the number of vitamin D3-producing UVB photons reaching the skin below what was needed to produce an adequate amount of vitamin D3. To compensate for the resulting vitamin D deficiency, mutations occurred that reduced skin melanin content in those who migrated to the farthest northern and southern regions, including the Neanderthals, who likely had a Celtic skin tone.[4]

However, the loss of skin pigment now permitted UVB-sensitive macromolecules, including DNA, to absorb the solar UVB radiation that penetrated the epidermis. This absorption caused thymine dimerization and other alterations in DNA structure, increasing the risk of developing nonmelanoma skin cancer.[5] The Surgeon General's report from the United States and many dermatology societies have promoted abstinence from any direct sun exposure, which is thought to be a major contributor to the worldwide vitamin D deficiency epidemic.[6]

In support of this message was a recent study of Danish adults exposed to high-intensity sunlight during a vacation in the Canary Islands. Petersen et al.[7] not only observed improvement in their vitamin D status but also significant and concerning cutaneous DNA damage, as measured by increased urinary cyclobutane pyrimidine dimers (CPD), a surrogate for DNA damage. Thus, it was suggested that you could not have your cake and eat it too, i.e. take advantage of the beneficial effect of sun exposure for producing the vital vitamin D3 without significant DNA damage in the skin. From an evolutionary perspective, however, this makes little sense for the survival of the species, i.e. being dependent on solar UVB for bone development and health while at the same time increasing the risk for skin cancer. Female infants with infantile rickets would have had a flat pelvis with a small pelvic outlet, making normal childbirth difficult. This is thought to be the driver for the loss of skin pigmentation as people migrated north and south of the equator.[8] Those who migrated into the far northern reaches of the European continent ultimately lost most of their skin pigment, permitting the vitally important vitamin D-producing UVB radiation to enter the epidermis and produce vitamin D3.[8]

However, the skin of the Danes, with skin types 1 and 2, was not designed to be exposed to high-intensity sunlight for an average of 38 h over 6 days in an environment much farther south than where their ancestors evolved. Felton et al.[1] explored the possibility that those with skin type 2, whose skin had evolved to produce adequate vitamin D3 when exposed to sunlight at U.K. latitudes, had also developed mechanisms to repair the damage caused by that same exposure. They exposed healthy adults with little skin pigmentation (i.e. skin type 2) to low-level simulated U.K. June midday sunlight (equivalent to 13–17 min, six times weekly) and evaluated its effect on raising blood levels of 25-hydroxyvitamin D [25(OH)D; a measure of vitamin D status], while monitoring various outcome measures related to cutaneous DNA damage. As expected, they observed a significant 49% increase in circulating levels of 25(OH)D, the result of 7-dehydrocholesterol in the epidermis absorbing UVB radiation (290–315 nm) and being converted to previtamin D3. Previtamin D3 is then converted within a few hours, by a membrane-enhanced isomerization, to vitamin D3. Once formed, vitamin D3 travels to the liver and is converted to 25(OH)D3.[8] However, as UVB penetrates the epidermis to form vitamin D3, it is also absorbed by DNA, resulting in the formation of CPD and other pyrimidine photoproducts that, if unrepaired, have been associated with increased risk for nonmelanoma skin cancer.[9] The exposure also produced CPD-positive nuclei in keratinocytes of exposed skin compared with photoprotected skin of the same volunteer, demonstrating the consequence of DNA absorbing this radiation. However, 24 h after 6 weeks of exposure, significant clearing of the CPD-positive nuclei was observed. This corresponded to undetectable levels of CPD in the urine and no change or accumulation from baseline in another marker of DNA damage, urinary 8-oxo-2'-deoxyguanosine (8-oxo-dG), a measure of oxidatively damaged DNA.

Felton et al.[1] also conducted the study in Asian volunteers with skin type 5. Comparing skin type 2 with type 5, they found more DNA damage in those with type 2, supporting the view that our ancestors who migrated further from the equator were at a disadvantage with respect to UVB skin protection. As previously reported, increased skin-protecting pigmentation efficiently absorbs UVB radiation and therefore also reduces the number of photons absorbed by 7-dehydrocholesterol, decreasing the effectiveness of the sun in producing vitamin D3.[8] They observed this as well, demonstrating a statistically nonsignificant increase in serum 25(OH)D levels in their Asian subjects.

The skin has a large capacity to produce vitamin D3. It has been estimated that exposure in a bathing suit to 1 minimal erythemal dose (MED) is equivalent to ingesting approximately 15 000–20 000 IU of vitamin D.[8] Furthermore, Maasai herders, who have skin type 6, have circulating levels of 25(OH)D averaging 48 ng mL−1.[10] Achieving such levels requires the ingestion of 3000–5000 IU daily.[8] It is likely that our hunter-gatherer equatorial ancestors, exposed to daily sunlight, were making this amount of vitamin D3 while their high melanin skin content prevented significant DNA damage. However, as our ancestors migrated to higher latitudes, the effectiveness of the sun for producing vitamin D was markedly diminished. A major target was the melanocortin 1 receptor (MC1R), which regulates pigmentation in humans and other vertebrates. Impairment of its activity is associated with the pale Celtic skin types 1 and 2.[3,4]
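The dose arithmetic above can be sketched in a few lines of Python. This is a back-of-the-envelope illustration only: it assumes vitamin D3 production scales linearly with the fraction of a minimal erythemal dose (MED) received, which is a deliberate simplification, and it uses the standard factor of 2.496 to convert serum 25(OH)D from ng/mL to nmol/L. The function names are illustrative, not from any published model.

```python
# Back-of-the-envelope arithmetic from the figures quoted above.
# Assumption (simplification): vitamin D3 production scales linearly
# with the fraction of a minimal erythemal dose (MED) received.

MED_EQUIVALENT_IU = (15_000, 20_000)  # full-body exposure to 1 MED, in IU
NG_PER_ML_TO_NMOL_PER_L = 2.496       # standard 25(OH)D unit conversion

def vitamin_d_from_sun(fraction_of_med: float) -> tuple[float, float]:
    """Estimated vitamin D3 (IU, low-high range) from a fraction of 1 MED."""
    low, high = MED_EQUIVALENT_IU
    return (low * fraction_of_med, high * fraction_of_med)

def ng_ml_to_nmol_l(ng_ml: float) -> float:
    """Convert a serum 25(OH)D concentration from ng/mL to nmol/L."""
    return ng_ml * NG_PER_ML_TO_NMOL_PER_L

# A quarter-MED exposure lands at roughly 3750-5000 IU, consistent with
# the 3000-5000 IU daily intake cited for Maasai-level 25(OH)D levels.
print(vitamin_d_from_sun(0.25))        # (3750.0, 5000.0)
print(round(ng_ml_to_nmol_l(48), 1))   # 48 ng/mL ~ 119.8 nmol/L
```

Nothing here models erythema, latitude, or skin type; it simply makes the quoted equivalences easy to check and re-express in SI units.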

To compensate for the lack of the natural melanin sunscreen, other mechanisms developed to repair UVB-induced damage to DNA in the skin and prevent skin cancer.[9] Several enzymes (excinucleases) have been identified that specifically recognize UVB-induced DNA damage and excise the damaged portion, with follow-up repair by DNA ligase.[11] The rapid disappearance of CPD-positive skin cells 24 h after the final UV exposure in both skin types 2 and 5, and the inability to detect urinary CPD after single and multiple exposures to simulated U.K. sunlight in the Felton et al.[1] study, provide strong evidence that these mechanisms are in place to reduce the risk of UVB-induced skin cancer. The likely reason no significant DNA damage was observed in the Felton et al.[1] study, in contrast to the Danish study,[7] is that the Danes were exposed to an excessive amount of sunlight for which their skin was not adapted.

Therefore, you can have your cake and eat it too. The World Health Organization recognizes that limited sun exposure is an important source of vitamin D3 while warning against sunburning. The results of the study by Felton et al.[1] support the concept that sensible sun exposure that does not cause sunburn, and is appropriate for the person's skin type, can prevent vitamin D deficiency and its negative health consequences with little concern that this exposure will increase the risk for skin cancer. However, sun exposure beyond that recommended for each person's skin type should be avoided, either by covering the exposed area or by using sunscreen, to prevent the DNA damage associated with longer exposure, as was evident in the Danish study.[7]

It is now recognized that several other photochemical processes occur in the skin during exposure to sunlight.[6] These include the production of nitric oxide and carbon monoxide, which can help lower blood pressure and reduce the risk for heart disease, and beta-endorphin, which improves the feeling of well-being, reducing the risk for depression. In addition, sunlight stimulates keratinocyte proopiomelanocortin gene expression, enhancing adrenocorticotropin hormone (ACTH) production, as well as enhancing the gene expression of several cytokines, all of which help to modulate immune function to reduce the risk for autoimmune diseases and cutaneous infections.[6] The results from Felton et al.[1] should provide healthcare regulators, especially those who have advocated sun abstinence out of concern for increased risk for skin cancer, with a new perspective on how the sun and our skin work in concert, allowing us to take advantage of the many beneficial effects of sensible sun exposure while minimizing the risk for skin cancer.