Category:Public health in the United States

From Wikipedia, the free encyclopedia

Articles related to public health in the United States, "the science and art of preventing disease, prolonging life and promoting health through the organized efforts and informed choices of society, organizations, public and private, communities and individuals". Analyzing the determinants of health of a population and the threats it faces is the basis for public health.