Lack of Sun in Teen Years Linked to Nearsightedness Later On

Teens and young adults who spend more time outdoors may be less likely to become nearsighted later in life than those who spend less time outdoors, a new study suggests. People in the study who had greater exposure to ultraviolet B (UVB) radiation between ages 14 and 39, which the researchers estimated from the participants’ time in sunlight, were less likely to be nearsighted at age 65 than those with less UVB exposure, the researchers found. “Increased UVB exposure was associated with reduced myopia, particularly in adolescence and young adulthood,” the researchers wrote in the study, published December 1 in the journal JAMA Ophthalmology.

Trained researchers examined the participants’ eyesight and collected blood samples to measure the levels of vitamin D in their blood, because previous research had linked higher vitamin D concentrations to a lower risk of nearsightedness. It turned out that people who had been exposed to higher levels of UVB radiation as teens and young adults, a factor closely tied to how much time a person spends outdoors in sunlight, were less likely to be nearsighted at age 65 than those who had been exposed to lower levels. This finding is in line with previous research, published in 2015 in the journal JAMA, which suggested that children who spent more time outdoors had a lower risk of becoming nearsighted. However, in contrast to that earlier research, the new study did not find a link between higher vitamin D levels and a person’s risk of developing nearsightedness, the researchers said. And although the new study shows a link between higher UVB exposure and a lower risk of nearsightedness, it does not prove a cause-and-effect relationship between the two.