Economic and social factors can determine the prevalence and severity of a health condition, as well as exacerbate co-morbidities. This article reviews forces that have influenced the appearance and persistence of nutritional rickets over the past 400 years. Childhood rickets first appeared between 1600 and 1640 as an epidemic disorder in the south and west of England, when the woolen trade developed as a home labor industry. Later, with the coal-fueled factory system, agrarian laborers migrated to cities, creating a wage-earning class. Their children were raised in smog-ridden, crowded conditions and became victims of an urban form of rickets that persisted from the 1640s to the 1930s.
With the discovery of vitamin D, derived both from dietary sources and from photocutaneous biosynthesis driven by exposure to ultraviolet B wavelengths from the sun, and with recognition of its use as a therapeutic agent, epidemic rickets largely disappeared. A pro-sunshine era followed, from roughly 1930 to 1965. Since then, rising rates of skin cancer have greatly tempered enthusiasm for sun exposure and led to recommendations for sun protection strategies.
Beginning in the 1960s, cases of nutritional rickets have been seen in dark-skinned immigrants moving to northern latitudes. Rickets often occurred in association with vitamin D-deficient mothers who exclusively breast-fed their offspring. Vitamin D deficiency is also more evident in young children, adolescents, pregnant women, and the elderly, many of whom remain indoors and do not use vitamin supplements.
These factors - child labor, the nearly universal use of coal as a fuel, dietary deprivation during wars, migration of dark-skinned populations to regions where sunshine is scarce, and cancer-associated fear of sun exposure - have allowed nutritional rickets to persist despite the knowledge of medical practitioners, nutritionists, and even parents. This article reviews these factors and their impact on the health of children.
Keywords: rickets; vitamin D; history; etiology