Preview: Methodology Changes for 2014 Best Colleges Rankings

U.S. News gives a detailed look inside the 2014 Best Colleges methodology changes.

One of the methodology changes in the 2014 Best Colleges rankings is a reduction in the importance of high school class standing.

The 2014 U.S. News Best Colleges rankings will launch on Tuesday, Sept. 10. For these new rankings, we made significant changes to the methodology to reduce the weight of input factors that reflect a school's student body and increase the weight of output measures that signal how well a school educates its students.

Below, we offer a sneak peek of these changes. 

1. High school class standing: We reduced the weight assigned to the high school class standing of newly enrolled students and gave slightly more weight to SAT and ACT scores. 

It is clear from the data that U.S. News collects that, as each year passes, the proportion of high school graduates with class rank on their transcripts is falling. As a result, the measure is less representative of each college's freshman class than it was five or 10 years ago. 

As noted in the National Association for College Admission Counseling's 2012 "State of College Admission" report, since 1993 "the factor showing the largest decline in importance is class rank." This same research shows that admissions test scores are growing in importance. 

To better reflect this reality, the student selectivity indicator in our ranking model has been adjusted so that the weight of high school class standing drops from 40 percent to 25 percent, and the weight of SAT and ACT scores rises from 50 percent to 65 percent. 

At the same time, the weight of student selectivity overall declines from 15 percent to 12.5 percent so that we place less emphasis on inputs. This move reduces the effective weight of class rank from 6 percent to 3.125 percent. 
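The effective weights above follow from multiplying each indicator's share within student selectivity by the overall selectivity weight. A minimal sketch of that arithmetic (illustrative only, not U.S. News code; the test-score figures are derived from the percentages stated above):

```python
# Effective weight of an indicator = (category weight) x (indicator's share within the category)
old_selectivity, new_selectivity = 0.15, 0.125   # overall student selectivity weight
old_rank_share, new_rank_share = 0.40, 0.25      # class rank's share within selectivity
old_test_share, new_test_share = 0.50, 0.65      # SAT/ACT share within selectivity

# Class rank: 6 percent of the total model drops to 3.125 percent
old_rank_weight = old_selectivity * old_rank_share   # 0.06
new_rank_weight = new_selectivity * new_rank_share   # 0.03125

# SAT/ACT scores: effective weight rises from 7.5 percent to 8.125 percent
old_test_weight = old_selectivity * old_test_share   # 0.075
new_test_weight = new_selectivity * new_test_share   # 0.08125
```

Note that even though the SAT/ACT share within selectivity rose, the lower overall selectivity weight means its effective weight in the full model grew only modestly.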

2. Graduation rate performance: We expanded the use of the graduation rate performance indicator to all of the Best Colleges ranking categories; for the first time it will apply to Regional Universities and Regional Colleges. 

This important outcome measure focuses on the difference between each school's predicted graduation rate (as calculated by U.S. News based on several characteristics of the incoming class closely linked to college completion, such as test scores and Pell Grants) and its actual graduation rate. 

It gives credit to schools that have a higher-than-expected graduation rate. This factor will have a weight of 7.5 percent in the ranking model for all schools. 
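In essence, the indicator is the gap between the graduation rate a school actually achieves and the rate predicted from its incoming-class profile. A minimal sketch of that comparison (the prediction itself is a stand-in here; U.S. News computes it from incoming-class characteristics such as test scores and Pell Grant share, using a model that is not reproduced in this article):

```python
def graduation_rate_performance(actual_rate, predicted_rate):
    """Gap between actual and predicted graduation rates, in percentage points.

    Positive values mean the school graduates more students than its
    incoming-class profile would predict, and the ranking model gives
    it credit for that over-performance.
    """
    return actual_rate - predicted_rate

# Hypothetical school: predicted to graduate 68 percent, actually graduates 75 percent
performance = graduation_rate_performance(75, 68)  # over-performs by 7 points
```

A school whose actual rate falls below its predicted rate would get a negative value, signaling under-performance relative to the students it enrolls.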

3. Other ranking factors: We changed other weights in the ranking model to further emphasize outcome measures. 

The weight of the peer assessment score was reduced from 25 percent to 22.5 percent in the Regional Universities and Regional Colleges categories; the weight of graduation and retention rates was increased for National Universities and National Liberal Arts Colleges to 22.5 percent from 20 percent. 

Since we added graduation rate performance as a ranking factor for Regional Universities and Regional Colleges, the weights for the graduation rates themselves and retention rates dropped from 25 percent to 22.5 percent. 

As a result of the changes described above, many schools' ranks will change in the 2014 edition of the Best Colleges rankings compared with the 2013 edition. 

Beyond the ranking methodology changes, we'll use clearer footnotes to indicate the schools that did not report to U.S. News fall 2012 SAT and ACT scores for all first-time, first-year degree-seeking students with these scores – including athletes, international students, minority students, legacies, those admitted by special arrangement and those who started in the summer of 2012. 

The footnotes also include schools that declined to tell us whether all students with test scores were represented. 

The value of footnoted SAT and ACT scores is discounted in the Best Colleges ranking model. This practice is not new; since the 1997 rankings, we have discounted the value of such schools' reported scores, since leaving some students out could mean that lower scores are omitted.

More details about our methodologies will be available on Sept. 10.