US News Rankings Methodology: What's Really Behind the Numbers

David


People crave rankings. But what do these rankings mean? How are they actually formulated? The 2017 US News and World Report college rankings came out recently, and while people love consuming them, they often fail to ask what exactly goes into the US News rankings methodology.

Asking this question is important because these rankings are not perfect - no rankings are. Relying on a sometimes flawed methodology becomes a problem when students prioritize rankings over personal fit during the college application process.

So, with the latest rankings recently released (and with all sorts of buzz around them - UChicago at #3? Stanford at #5?), this article takes a look at what is behind the US News rankings methodology.

How the US News Rankings Methodology is Problematic

In a survey conducted by the National Association of Colleges and Employers (NACE), students were asked which factors mattered most in selecting a college. The most popular considerations are listed in order below:

  • Ability to Obtain a Good Job After Graduation
  • The Cost of Attendance
  • Visit to the Campus
  • Size of the School and Student Body
  • Admissions Potential for Top Graduate/Professional Schools

You will note that none of these factors is directly accounted for in the US News rankings methodology; employment outcomes, graduate school admissions potential, and cost of attendance are ignored entirely.

To take just the first point: the US News rankings methodology completely ignores job prospects after college, one of the primary concerns of applicants. This means that a school like Dartmouth College (#11 in USNWR), which boasts the highest mid-career salaries of any school in the United States, isn't rewarded for producing graduates who find solid and lucrative employment. If post-graduation salary statistics are what you care about, compare the list below from The New York Times to the latest U.S. News and World Report rankings.

 

[Chart: PayScale mid-career salary rankings by school]

Looking at the Numbers

Now, let's take a look at a few of the criteria in the US News rankings methodology and why they are sometimes problematic. You can read an explanation of these criteria from Robert Morse - the head of the U.S. News and World Report rankings operation - here.

1. Undergraduate Academic Reputation, 22.5%

This “prestige” ranking is based on a survey sent to the country’s university and college presidents, provosts, and admissions deans, as well as a few high-school officials, asking them to grade every school on a 1-5 scale. So, for instance, the President of Vanderbilt University is asked to rate all 262 other national universities. This approach is hardly scientific, and it has several flaws.

This is not to say that reputational rankings can't be helpful. They certainly can be, but only if they rank institutions along a single dimension and rely on people with specialized knowledge. A better example than the US News rankings methodology is the Wall Street Journal's method of ranking colleges based exclusively on the opinions of corporate recruiters, whose judgments are directly relevant to the question being asked.

2. Graduation and Retention Rates, 22.5%

This metric is broken down into two sub-categories: six-year graduation rate (80%) and first-year retention rate (20%). These rates are seemingly solid points to include in the methodology - particularly the first-year retention rate.

Yet certain top schools, such as Stanford, are notorious for dropouts who leave to pursue startup ventures. The most famous example, as we all know, is Mark Zuckerberg dropping out of Harvard to found Facebook. The issue with this metric is that schools like those in Silicon Valley, which pride themselves on their entrepreneurial spirit and support for student ventures, are penalized in the US News rankings methodology for dropouts who go on to become major contributors to the business and tech worlds.

3. Faculty Resources, 20%

The justification for this category making up a fifth of the rankings is that the more satisfied students are with their contact with professors, the more they will learn. The underlying point - that student engagement is critical - is accurate; the problem is how to measure it. And the US News rankings methodology struggles to do so.

This 20% is broken down into the following:

  • Proportion of classes with fewer than 20 students (30%)
  • Proportion of classes with 50 or more students (10%)
  • Faculty salaries (35%)
  • Proportion of professors with the highest degrees in their fields (15%)
  • Student-to-faculty ratio (5%)
  • Proportion of faculty who are full time (5%)
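To see how these sub-weights roll up into a single faculty resources score, here is a minimal sketch in Python. It assumes each indicator has already been normalized to a 0-100 scale where higher is better; U.S. News does not publish its exact normalization, and the example school's numbers below are invented.

```python
# Minimal sketch: combining the six faculty resources sub-weights into one score.
# Assumes each indicator is already normalized to 0-100 (higher = better);
# the example values are hypothetical.

FACULTY_RESOURCE_WEIGHTS = {
    "classes_under_20": 0.30,        # proportion of classes with fewer than 20 students
    "classes_50_or_more": 0.10,      # inverted, so fewer large classes scores higher
    "faculty_salary": 0.35,
    "terminal_degree_share": 0.15,   # professors holding the highest degree in their field
    "student_faculty_ratio": 0.05,   # inverted, so a lower ratio scores higher
    "full_time_faculty_share": 0.05,
}

def faculty_resources_score(indicators):
    """Weighted sum of the normalized indicators."""
    return sum(FACULTY_RESOURCE_WEIGHTS[name] * value
               for name, value in indicators.items())

hypothetical_school = {
    "classes_under_20": 70.0,
    "classes_50_or_more": 85.0,
    "faculty_salary": 60.0,
    "terminal_degree_share": 90.0,
    "student_faculty_ratio": 75.0,
    "full_time_faculty_share": 80.0,
}

print(faculty_resources_score(hypothetical_school))  # roughly 71.75
```

The arithmetic rewards whatever raises the inputs - smaller reported sections, higher salaries - whether or not students notice any difference in the classroom.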

What do these statistics have to do with measuring how satisfied students are with their professors?

Almost nothing.

Statistics can also be manipulated. Clemson, for example, managed to move up from 38 to 22 in USNWR's rankings of public research universities. Clemson allegedly did so in part by artificially manipulating class sizes with no real benefit to students: it reportedly bumped sections with 20 students down to 19, while letting a class with 60 students rise to 65, in order to improve its standing in the “Percentage of Classes with Fewer than 20 Students” category. These changes had no tangible effect on students' interactions with professors; they were simply a means of playing into the US News rankings methodology.
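A quick worked example, with hypothetical numbers, shows why that kind of shuffling pays off in the “Percentage of Classes with Fewer than 20 Students” category:

```python
# Hypothetical illustration of the Clemson-style class-size shuffle.
# The same 125 students are taught either way; only the reported metric changes.

def pct_classes_under_20(class_sizes):
    return 100 * sum(1 for size in class_sizes if size < 20) / len(class_sizes)

before = [20, 20, 25, 60]   # two sections sit exactly at 20 students
after  = [19, 19, 25, 62]   # trim each to 19 and push the extra students into the big lecture

print(pct_classes_under_20(before))  # 0.0  -> no class counts as "small"
print(pct_classes_under_20(after))   # 50.0 -> half the classes now count as "small"
```

Students' day-to-day experience is essentially unchanged, but the school's reported figure jumps from 0% to 50%.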

In 1996, then-Stanford President Gerhard Casper penned a letter to the new editor of USNWR sharply criticizing the artificial manipulation of school ranks from year to year as a transparent ploy to sell more magazines. His commentary on the faculty resources category (pasted below) is particularly illuminating:

[Image: excerpt from Gerhard Casper's 1996 letter criticizing the faculty resources category]

4. Student Selectivity, 12.5%

The rankings race encourages schools to increase their selectivity. How do they do this? It's simple: they make it easier to apply. Stanford - the most selective school in the US in recent years - holds this title at least in part because it has lowered application fees and streamlined the application technology, allowing thousands more students to easily submit applications. To be fair, Stanford's Dean of Admissions has spoken out against the obsession with selectivity as simply another way to manipulate rankings. But if applicants respond to the US News rankings methodology, then schools must as well. Always remember: colleges are businesses too - even public universities.

5. Financial Resources, 10%

More money does not always translate into better educational outcomes if it is not spent in ways that directly impact students. It is often spent instead on things like artificially inflated salaries meant to game the US News rankings methodology. Schools with large endowments in particular, such as Harvard ($36.4 billion), do not allocate all of their financial resources to students' needs on a year-to-year basis.

6. Graduation Rate Performance, 7.5%

This might be the most educationally counterproductive statistic of them all. The US News rankings methodology tries to measure a school's effectiveness by comparing a “predicted” graduation rate with its actual graduation rate. Former Stanford President Gerhard Casper captures why this is a problematic metric with a pointed example in his 1996 letter:

“The California Institute of Technology offers a rigorous and demanding curriculum that undeniably adds great value to its students. Yet, Caltech is crucified for having a "predicted" graduation rate of 99% and an actual graduation rate of 85%. Did it ever occur to the people who created this "measure" that many students do not graduate from Caltech precisely because they find Caltech too rigorous and demanding - that is, adding too much value - for them? Caltech could easily meet the "predicted" graduation rate of 99% by offering a cream-puff curriculum and automatic A's. Would that be adding value? How can the people who came up with this formula defend graduation rate as a measure of value added? And even if they could, precisely how do they manage to combine test scores and "education expenditures" - itself a suspect statistic - to predict a graduation rate?”

Essentially, this point of the US News rankings methodology punishes good schools for challenging their students to become better.
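The comparison itself is simple arithmetic. Here is a minimal sketch, assuming the metric is just the gap between the actual and predicted six-year graduation rates; the regression U.S. News uses to produce the predicted rate (built on inputs like test scores and spending) is not reproduced here.

```python
# Minimal sketch of "graduation rate performance" as a simple gap between
# actual and predicted six-year graduation rates. The regression that produces
# the predicted rate is not modeled here.

def graduation_rate_performance(actual, predicted):
    """Positive = the school 'over-performs' its prediction; negative = 'under-performs'."""
    return actual - predicted

# Caltech figures cited in Casper's 1996 letter: predicted 99%, actual 85%.
print(graduation_rate_performance(actual=85.0, predicted=99.0))  # -14.0
```

By this arithmetic Caltech “under-performs” by 14 points, which is exactly Casper's complaint: the negative number says nothing about whether the curriculum added value.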

7. Alumni Giving Rate, 5%

Keeping a nearly perfect score in this category is not particularly difficult: just ask every one of your alumni for one cent. Plenty of schools do something close to this, sending dozens of e-mails per year asking alumni for a mere one-dollar contribution. Is this how you want to make 5% of your decision on where to attend college - whether enough alumni could be bothered to donate a dollar?
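Taken together, the seven category weights above sum to 100%. As a rough illustration of how they roll up into a single number, here is a minimal sketch assuming each category score has already been normalized to 0-100; the final rescaling U.S. News applies (setting the top school to 100) is omitted, and the example scores are invented.

```python
# Illustrative composite built from the seven category weights discussed above.
# Assumes each category score is already normalized to 0-100; example scores
# are hypothetical, and U.S. News's final rescaling step is omitted.

CATEGORY_WEIGHTS = {
    "undergraduate_academic_reputation": 0.225,
    "graduation_and_retention":          0.225,
    "faculty_resources":                 0.200,
    "student_selectivity":               0.125,
    "financial_resources":               0.100,
    "graduation_rate_performance":       0.075,
    "alumni_giving_rate":                0.050,
}
assert abs(sum(CATEGORY_WEIGHTS.values()) - 1.0) < 1e-9

def composite_score(category_scores):
    return sum(CATEGORY_WEIGHTS[name] * score
               for name, score in category_scores.items())

hypothetical_school = {
    "undergraduate_academic_reputation": 90,
    "graduation_and_retention":          95,
    "faculty_resources":                 80,
    "student_selectivity":               85,
    "financial_resources":               70,
    "graduation_rate_performance":       60,
    "alumni_giving_rate":                40,
}

print(composite_score(hypothetical_school))  # roughly 81.75
```

Seen this way, the overall rank is just a weighted average of proxies - which is why a small change in any single input can move a school several places.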

A Path Forward

None of this is meant to attack US News and World Report. It is meant to provide insight into the issues with rankings generally.

The US News rankings methodology is flawed because there is simply no direct, objective way to measure the quality of an institution. As Malcolm Gladwell writes, there is no way to measure “how well a college manages to inform, inspire, and challenge its students. So the U.S. News algorithm relies instead on proxies for quality - and the proxies for educational quality turn out to be flimsy at best.”

And many times, it is difficult to tell whether schools are making the best decisions when they chase rankings. For example, if UChicago has increased its selectivity in recent years (the school has one of the highest average standardized test scores), it will climb in the rankings. That selectivity, by itself, does nothing for current students. But if the school's move to #3 in 2017 allows it to attract more distinguished faculty or gain more resources for students on campus, then perhaps it is a good thing that it played into the US News rankings methodology.

Applicants buy into rankings, which increases schools' responsiveness to rankings, which in turn increases the importance of rankings generally. It's essentially a positive feedback loop.

These rankings can be useful for those who are just getting familiar with the various colleges in the United States and their primary strengths. But that is as far as your reliance on them should go, because the quality of a particular college cannot be measured by a set of data points in a one-size-fits-all fashion. Additionally, among the top schools, the differences in rankings are practically meaningless because the schools score so similarly. If you're deciding between the #1-ranked school and the #5-ranked school, there is really no difference in quality - only minute differences hidden within the US News rankings methodology.


Looking at the different rankings is a good place to start building your college list. But parsing through the US News rankings methodology shows that there is no perfect way to rank a school. And more than that, no ranking can tell you which school will be the best for you, individually. The best school isn't necessarily the school with the top ranking, the highest average SAT scores, or the most prestigious professors. It's whatever school is the best fit for your needs, major, potential career, personality, and interests. As with anything related to education and your future, you need to ask the tough questions and do the research. There are no shortcuts to hard work in education. Your dreams won't work unless you do!

 

Schedule a free consultation to find out how we can help you get accepted.