One of the first things Nathaniel Villanueva did when applying to medical school was check U.S. News & World Report's Best Medical Schools rankings. "Is this school going to prepare me for a career in medicine?" asked the 26-year-old, who is now a fourth-year student at the Mount Sinai School of Medicine. "The higher the ranking, the better opportunities I'll have in the future."
Every year thousands of prospective students use the U.S. News rankings to help decide which medical schools to apply to and ultimately attend. The rankings also have a major impact on medical schools, which view them as a way to measure their own performance.
In celebration of the rankings' 20th anniversary, U.S. News and the Mount Sinai Medical Center recently held the first-ever conference where medical school deans, key experts, and students discussed the rankings methodology and how it could be improved.
"The fact of the rankings is ... they are here to stay. This is a consumer-driven need," said Brian Kelly, U.S. News editor, who urged students to use the rankings as one of many tools during their search.
U.S. News sends out surveys to every medical school each year. That data is then used to calculate rankings for the Best Medical Schools in research, primary care, and several other specialties such as geriatrics and pediatrics.
Reputation and student statistics including MCAT scores, GPA, and acceptance rate are factored into all of the rankings. National Institutes of Health (NIH) research activity is factored into the score for research schools, while the proportion of new M.D. graduates entering primary care specialties is used in the primary care rankings.
The two reputation scores, which together account for 40 percent of a school's ranking, are its largest component and were the most hotly debated topic at the conference. One reputation survey goes to medical school deans and top administrators, and a separate survey goes to residency program directors; both groups are asked to rate the quality of each medical school's program from 1 (marginal) to 5 (outstanding).
Nearly half of all medical school deans filled out the survey last year, a remarkably high response rate, but only 17 percent to 19 percent of residency program directors responded. Surprisingly, according to Robert Morse, U.S. News director of data research, who recently conducted a survey of attitudes toward the rankings, residency directors reported finding the rankings more helpful than the medical school deans did.
[See Morse's take on the historic medical school deans meeting.]
"I think what's frustrating everybody ... is that there's nothing really in the formula that is really evaluating the quality of medical education," said Robert Alpern, dean of the Yale University School of Medicine. "That would be so much more useful to the applicants, to the students. And it would incentivize us to do a better job in education."
Robert Golden, dean of the School of Medicine and Public Health at the University of Wisconsin—Madison, added that most deans simply don't know enough about other teaching programs to judge them. "We have a general idea of who gets the best students … but we don't really know ... how much small group teaching they're doing and how many of their teachers are actually good teachers," he said.
The panel members suggested other ways to rate program quality, such as the number of hours of clinical experience each student receives, the types of research opportunities students are offered, or how many students participate in a rotation program overseas.
[Learn about the call to reform medical school education.]
The panel also had mixed feelings about the factors that contribute to the remaining 60 percent of the rankings score.
Even though most of the deans felt that the MCAT was a reliable indicator of how well a student will perform in medical school, they questioned its relevance to a school's overall ranking. Nancy Andrews, dean of the Duke University School of Medicine, worried that MCAT scores penalize schools with more diversity.