Iron, Cholesterol, and the Risk of Cancer in an 18-Year Cohort

Abstract

The iron-catalyzed oxidation of serum lipids is hypothesized to generate oxidative stress, which appears to play an important role in the pathogenesis of many cancers. Previous research has produced conflicting results regarding the independent contributions of cholesterol and iron to cancer risk. The purpose of this study was to test for an interaction between iron and cholesterol on cancer risk. The present cohort study was an analysis of the National Health and Nutrition Examination Survey I (NHANES I) database linked with the NHANES I Epidemiologic Followup Study. Baseline serum iron and total cholesterol values were obtained for 7,448 adults, who were followed for the development of cancer over 18-21 years. Population weights were applied to create Cox proportional hazards models of time to the development of cancer for the entire U.S. adult population (n=72,602,523). Control variables included age, race, gender, smoking, body mass index, chronic cough, chronic hepatitis, chronic/recurrent colitis or enteritis, and gastrointestinal bleeding. Independent elevations of either iron or total cholesterol were not significantly related to the development of cancer in the adjusted model. However, the combination of iron and total cholesterol above the 75th percentile was associated with a significant increase in the risk of all cancers (HR 1.39, 95% CI 1.00-1.94). Iron and cholesterol above the 80th and 85th percentiles increased the hazard ratio for cancer further, to 1.51 (CI 1.10-2.08) and 1.61 (CI 1.07-2.43), respectively. These results support the hypothesis that the iron-induced oxidation of serum lipids is important in the pathogenesis of cancer.
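
The abstract describes a survey-weighted Cox proportional hazards model with a joint iron/cholesterol exposure dichotomized at a percentile cutoff. The sketch below is not the authors' code; it is a minimal illustration of that kind of analysis, assuming a hypothetical linked NHANES I / NHEFS analysis file, hypothetical column names (serum_iron, total_chol, sample_weight, followup_years, cancer_event, etc.), and covariates that are already numeric or dummy-coded.

```python
# Illustrative sketch only: weighted Cox model with a joint
# iron/cholesterol indicator, fit with the lifelines package.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical linked NHANES I / NHEFS analysis file.
df = pd.read_csv("nhanes1_nhefs_linked.csv")

# Exposure: both serum iron and total cholesterol above the 75th percentile.
iron_cut = df["serum_iron"].quantile(0.75)
chol_cut = df["total_chol"].quantile(0.75)
df["high_iron_chol"] = ((df["serum_iron"] > iron_cut) &
                        (df["total_chol"] > chol_cut)).astype(int)

# Control variables listed in the abstract (assumed numeric/dummy-coded).
covariates = ["high_iron_chol", "age", "race", "gender", "smoking",
              "body_mass_index", "chronic_cough", "chronic_hepatitis",
              "colitis_enteritis", "gi_bleeding"]

cph = CoxPHFitter()
cph.fit(df[covariates + ["followup_years", "cancer_event", "sample_weight"]],
        duration_col="followup_years",
        event_col="cancer_event",
        weights_col="sample_weight",   # NHANES population weights
        robust=True)                   # robust (sandwich) variance for weighted data
cph.print_summary()                    # hazard ratios and 95% CIs, incl. high_iron_chol
```

Note that simply passing population weights approximates, but does not fully reproduce, a complex-survey analysis: a design-based approach would also account for NHANES strata and primary sampling units when estimating variances.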

Keywords