Each year, U.S. News ranks professional-school programs in business, education, engineering, law, and medicine. These rankings are based on two types of data: expert opinions about program quality and statistical indicators that measure the quality of a school's faculty, research, and students. These data come from surveys of more than 1,200 programs and some 11,000 academics and professionals that were conducted in fall 2008.
As you research course offerings and weigh different schools' intangible attributes, U.S. News's rankings can help you compare programs' academic excellence. It's important that you use the rankings to supplement—not replace—careful thought and your own inquiries.
We also rank programs in the sciences, social sciences, humanities, and many other areas, including selected health specialties. These rankings are based solely on the ratings of academic experts. This year, we've produced new rankings of Ph.D. programs in criminology, economics, English, history, political science, psychology, and sociology. We've also produced new rankings of master's programs in library and information studies.
In addition to these new rankings, we republish older rankings, also based solely on peer ratings, of programs in various health fields, Ph.D. programs in the sciences, and master's programs in public affairs and public policy and in fine arts.
To gather the peer opinion data, we asked deans, program directors, and senior faculty to judge the academic quality of programs in their field on a scale of 1 ("marginal") to 5 ("outstanding").
In business, education, engineering, law, and medicine, we also surveyed professionals who hire new graduates. For the first time, the two most recent years' surveys were averaged to compute the assessment scores among professionals. The statistical indicators used in our rankings of business, education, engineering, law, and medical schools fall into two categories: inputs, or measures of the qualities that students and faculty bring to the educational experience, and outputs, measures of graduates' achievements linked to their degrees.
Different output measures are available for different fields. In business, we used starting salaries after graduation and the time it takes graduates to find jobs. In law, we looked at the time it takes new grads to get jobs, plus state bar exam passage rates.
This year, we modified our main law school rankings methodology. In calculating a school's overall ranking, we used the combined fall 2008 admissions data for both full-time and part-time entering students for three variables: median LSAT scores, median undergraduate grade-point averages, and the acceptance rate. U.S. News's previous law school ranking methodology used only the full-time entering student data for those three admissions variables. This change improves the methodology because U.S. News now compares each law school's entering class against every other's on the basis of the entire student body, which produces the most complete comparison. Since 1990, data for part-time J.D. students have been included in computing all the other statistical variables used in the U.S. News law school ranking methodology. For more details, go to the overall law school ranking methodology.
Also, for the first time, U.S. News is ranking 87 part-time J.D. law programs. These part-time rankings are based solely on the ratings of law school academics. For more details, go to the part-time law J.D. program methodology.
To arrive at a school's rank, we examined the data for each quality indicator. Where appropriate, we inverted indicators for which low values suggest higher quality, such as acceptance rates. We then standardized the value of each indicator about its mean. The weights applied to the indicators reflect our judgment about their relative importance, as determined in consultation with experts in each field. The final scores were rescaled: The highest-scoring school was assigned 100, and the other schools' scores were recalculated as a percentage of that top score. The scores were then rounded to the nearest whole number, and the schools were placed in descending order. Every school's performance is presented relative to the other schools with which it is being compared. A school with an overall score of 100 did not necessarily top out on every indicator; rather, it accumulated the highest composite score.
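The scoring steps described above can be sketched in code. The example below is a minimal illustration with made-up schools, indicators, and weights, not U.S. News's actual data or exact formula; in particular, shifting the weighted composites to be nonnegative before rescaling is an assumed detail, since standardized scores can sum to a negative number:

```python
import statistics

def overall_scores(indicators, weights, invert=frozenset()):
    """Illustrative composite scoring: standardize each indicator about
    its mean, apply weights, and rescale so the top school scores 100.

    indicators: {school: {indicator_name: value}}
    weights:    {indicator_name: weight}
    invert:     indicator names where a low raw value suggests higher
                quality (e.g. acceptance rate); their sign is flipped.
    """
    schools = list(indicators)
    z = {s: {} for s in schools}
    for name in weights:
        vals = [indicators[s][name] for s in schools]
        mean, sd = statistics.mean(vals), statistics.pstdev(vals)
        for s in schools:
            score = (indicators[s][name] - mean) / sd if sd else 0.0
            # Flip "lower is better" indicators such as acceptance rate.
            z[s][name] = -score if name in invert else score
    # Weighted composite of the standardized indicators.
    composite = {s: sum(w * z[s][n] for n, w in weights.items())
                 for s in schools}
    # Assumed detail: shift composites to be nonnegative, then express
    # each as a percentage of the top score, rounded to whole numbers.
    low = min(composite.values())
    shifted = {s: c - low for s, c in composite.items()}
    top = max(shifted.values())
    return {s: round(100 * v / top) if top else 100
            for s, v in shifted.items()}

scores = overall_scores(
    indicators={"Alpha": {"peer": 5.0, "accept": 0.10},
                "Beta":  {"peer": 4.0, "accept": 0.30},
                "Gamma": {"peer": 3.0, "accept": 0.50}},
    weights={"peer": 0.6, "accept": 0.4},
    invert={"accept"},
)
print(scores)  # Alpha leads on both indicators, so it scores 100
```

The school names, indicator names, and weights here are hypothetical; the real rankings use many more indicators, with field-specific weights.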
A school's rank reflects the number of schools that sit above it; if three schools are tied at 1, the next school will be ranked 4, not 2. Tied schools are listed alphabetically.
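This tie-handling rule (standard competition ranking) can be illustrated with a short sketch; the scores and school names below are hypothetical:

```python
def competition_ranks(scores):
    """Assign ranks so that tied schools share a rank and the next
    school's rank is 1 + the number of schools above it: three schools
    tied at 1 are followed by a school ranked 4, not 2.
    Ties are ordered alphabetically, as in the published tables."""
    ordered = sorted(scores, key=lambda s: (-scores[s], s))
    ranks = {}
    for i, school in enumerate(ordered):
        if i and scores[school] == scores[ordered[i - 1]]:
            ranks[school] = ranks[ordered[i - 1]]  # share the tied rank
        else:
            ranks[school] = i + 1  # 1 + number of schools above
    return ranks

print(competition_ranks({"Alpha": 100, "Beta": 100,
                         "Gamma": 100, "Delta": 90}))
# Three schools tied at 1; the next school is ranked 4.
```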