Both genetic and environmental factors play significant roles in the onset of vitiligo, an autoimmune disease that results in the loss of color in blotches of skin, according to a pair of new journal articles by researchers at the University of Colorado School of Medicine.
The findings also show that while the tools for scientific understanding of the genetic basis of a complex disease like vitiligo have advanced, there are still many other as-yet unidentified factors that contribute to vitiligo’s onset.
“Vitiligo has been perhaps the easiest of all complex diseases to sort out,” says senior author Richard A. Spritz, MD, director of the Human Medical Genetics Program and professor of pediatrics at the CU School of Medicine, in a media release from University of Colorado Anschutz Medical Campus. “Through years of previous studies, we have identified what could be called a ‘vitiligo parts list’ of 50 common contributory genes/risk variants.”
Spritz and his co-authors reviewed two types of vitiligo cases — simplex and multiplex. In most instances, vitiligo appears in individuals with no family history of the disease; these are referred to as simplex cases. In multiplex cases, other family members also have vitiligo.
A paper by Spritz and his co-authors in the American Journal of Human Genetics combines the 50 common vitiligo risk variants into a vitiligo “genetic risk score,” and then compares the simplex and multiplex cases.
“The paper could be called a first chapter to the ‘vitiligo instruction manual,’” Spritz states. “We found that the vitiligo genetic risk score is higher in the multiplex families than in the simplex cases, and the more affected relatives in the family, the higher the risk score. That means that vitiligo in multiplex families and simplex cases is basically the same, but that the families with multiple affected relatives have higher genetic risk. That means that the same treatments probably will be effective in both types of cases.”
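In essence, a genetic risk score of this kind is a weighted sum: for each risk variant, the number of risk alleles a person carries is multiplied by that variant's effect weight, and the products are summed. The sketch below illustrates the general idea; the variant names and weights are hypothetical, not the actual 50 vitiligo loci or effect sizes from the study.

```python
# Minimal sketch of a polygenic genetic risk score: a weighted sum of
# risk-allele counts (0, 1, or 2 copies per variant) across variants.
# Variant IDs and weights here are illustrative placeholders only.

def genetic_risk_score(allele_counts, weights):
    """Return the weighted sum of risk-allele counts across variants."""
    return sum(weights[variant] * count for variant, count in allele_counts.items())

# Hypothetical per-variant effect weights (e.g., log odds ratios).
weights = {"rsA": 0.30, "rsB": 0.15, "rsC": 0.45}

# One person's genotype: copies of the risk allele at each variant.
person = {"rsA": 2, "rsB": 1, "rsC": 0}

score = genetic_risk_score(person, weights)
print(round(score, 2))  # 2*0.30 + 1*0.15 + 0*0.45 = 0.75
```

Under this framing, the study's observation is that members of multiplex families tend to carry more (or stronger) risk alleles, pushing their summed score higher than in simplex cases, even though the underlying variants are the same.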
That finding complicates efforts by scientists and physicians to predict who might be affected by vitiligo. Simplex and multiplex cases seem to mostly involve the same underlying genetic variants, with different patients simply carrying different combinations of risk variants. Such a finding complicates the use of predictive personalized medicine to diagnose and treat complex diseases, Spritz said, because there do not appear to be genetically defined patient subgroups with different underlying biology who might respond differently to personalized treatments.
In the second article, published as a letter to the editor in the Journal of Investigative Dermatology, Spritz and his co-authors note that the average age of vitiligo onset has changed dramatically over recent decades, the release explains.
“Vitiligo converted from being principally a pediatric-onset to principally an adult-onset disease over the period 1970-2004,” Spritz continues, in the release. “That is amazing. Our genes haven’t changed over that period of time; altered genes or even gene effects don’t seem to be the cause. This must reflect some beneficial environmental change that somehow delays or reduces vitiligo triggering in people who are genetically susceptible. What was it? We don’t know.”
The authors write that one or more environmental changes seem to have altered triggering of vitiligo and delayed disease onset, with a similar pattern both in North America and in Europe.
“While this apparently beneficial change provides an extraordinary inroad to discover vitiligo environmental triggers, the number of potential candidates is enormous,” Spritz and his colleagues write.
Among just a few of the possibilities in the United States: The Clean Air Acts of 1963 and 1970, the Nuclear Test Ban Treaty of 1963, the Water Quality Act of 1969, the establishment of the Occupational Safety and Health Administration in 1970.
Globally, sunscreens with sun protection factor ratings were introduced in 1974. Even eating habits may contribute. The authors note that yogurt consumption became more common in the early 1970s, which potentially altered the gut microbiome for many people.
[Source(s): University of Colorado Anschutz Medical Campus, Science Daily]