
The U.S. medical field is less dominated by white men than it used to be, but there are still few Black and Hispanic doctors, dentists and pharmacists, a new study finds. The study, which looked at trends over the past 20 years, found that white men no longer make up the majority of physicians and…