    Folate has long been thought to reduce the risk of cancer by influencing DNA synthesis, repair, and methylation [8,66]. Very high intakes of folate, however, may also increase the risk of cancer. For instance, folate plays a crucial role in nucleotide synthesis, which is needed to support DNA synthesis and cell growth in rapidly proliferating cells such as those seen in pre-neoplastic or malignant lesions [7,67]. Indeed, several mouse models have shown that once pre-neoplastic lesions are present, higher intakes of folate promote their progression [68,69]. It should be noted, however, that folate supplementation in these studies was very high (four to ten times the basal dietary requirement of mice) and that we found a positive association not with total folate (which included supplemental folate with higher bioavailability) but with dietary folate, which represents a lower level of intake. The recommended dietary intake (RDI) of folate is 400 µg/d and the upper limit of folate intake is 1000 µg/d. In the cohorts, median total folate levels in the 4th and 5th quintiles were above the RDI but below the upper limit; for dietary folate, the top quintile of intake was close to the RDI. Furthermore, in the presence of ultraviolet radiation, folic acid degrades to 6-formylpterin and pterin-6-carboxylic acid, compounds that act as photosensitizers and generate reactive oxygen species [15,70,71]. Experimental studies using melanocytes and keratinocytes from human skin have found that these folic acid metabolites induce DNA damage and may contribute to skin photocarcinogenesis [14,72,73]. However, we did not find any association with total folate, which included supplemental folic acid. Moreover, a recent meta-analysis of 13 randomised trials by Vollset et al. found no association between folic acid supplementation and overall or site-specific cancer, including melanoma [74].
That study, however, was limited by the small number of melanoma cases (n = 126) and by trials with short follow-up (average treatment duration of 5.2 years). Our study had several strengths. First, we collected and adjusted for a number of potential confounders. Second, it benefitted from a large sample size and a long duration of follow-up. Third, its prospective design likely minimized potential biases such as recall or selection bias. Our study also had a few limitations. First, the generalizability of our findings may be limited, as both cohorts comprised white, educated U.S. health professionals. This lack of diversity, however, may help to reduce residual confounding by socioeconomic status, ethnicity, and healthcare access. Second, dietary data were self-reported and may have been subject to non-differential misclassification; to minimize this error, we used cumulatively averaged intakes from multiple time points. By contrast, nutritional biomarkers are not affected by self-reporting bias, and serum and red blood cell folate, for example, have been shown to be robust indicators of folate status [75]. Future studies could therefore investigate the association between these nutrients and melanoma risk using nutritional biomarkers. Third, although we adjusted for several known melanoma risk factors such as history of sunburns and cumulative ultraviolet flux [76], we did not have information on hours of outdoor sun exposure. Finally, residual confounding remains a potential limitation.
    Financial support: Supported by the National Institutes of Health (CA186107, CA167552, and CA198216) and by a Research Career Development Award from the Dermatology Foundation.
    Introduction The pathogenesis of colorectal cancer (CRC) is complex, with both genetic and lifestyle factors known to contribute to its development [1]. Food and nutrition play a major role in both the prevention and the causation of CRC, and there is evidence of a dose-response relationship between intake of both red and processed meat and the risk of developing CRC [2,3]. Red meat is defined as all mammalian muscle meat and includes beef, veal, pork, lamb, mutton, horse, and goat [3-5]. Processed meat is meat that has been modified through smoking, curing, salting, fermentation, or other processes to improve its flavour or preservation; it can include red meat but also other types of meat, such as poultry, offal, or meat by-products [3-5]. Both the World Cancer Research Fund (WCRF) and the International Agency for Research on Cancer (IARC) agree that there is convincing evidence that a high intake of processed meat increases CRC risk, and that red meat is probably carcinogenic [2,3]. The mechanisms behind the carcinogenicity of red and processed meat remain unclear, but the most common hypotheses highlight the possible roles of heterocyclic amines (HCAs), polycyclic aromatic hydrocarbons (PAHs), heme iron, and the curing agents nitrate and nitrite; these mechanisms are described elsewhere [6-9].