The 5 most questionable college and university rankings of 2015
Since 1983, there have been two major Rites of September for U.S. higher education: the new academic year begins, and U.S. News and World Report releases its much-maligned college and university rankings. These rankings are perennially welcomed back with a hazing of well-founded criticism. And yet they return year after year, like a super-duper senior, forever three credits shy of a Stats degree but totally pumped to celebrate homecoming.
So much sharp critical commentary has been lobbed at U.S. News that it’s become fashionable to criticize the criticism. The Atlantic’s John Tierney referred to this annual outpouring of anti-U.S. News sentiment as a “bray-a-thon” just before hopping on his own dismissive donkey. Meanwhile, a growing pool of competing college ranking systems has emerged, all vying for equal scrutiny. How can students, parents, politicians and academic leaders possibly weigh all the relative flaws of these rival scoring systems?
The only answer is to provide them with a definitively numbered list. Here are the five most questionable college and university rankings of 2015 that were not compiled by U.S. News and World Report. As my contribution to this year’s rankings bray-a-thon, think of it as five irritable burros trailing after an ornery mule.
5. NICHE
Coming in at five is Niche, formerly College Prowler, with its student-based ranking and review system. Part of the fast-growing Rate My Professors genre of college review websites, Niche clearly understands that today’s students value peer reviews far more than institutional surveys. But while student reviews are heavily weighted in Niche’s final rankings, many of the ranked institutions have too few student respondents to yield statistically meaningful results. If you think this fact would prevent Niche from joining the college and university rankings game, you would be wrong.
An entertainingly dismissive College Times piece from last year evaluated 18 college review websites for their value to prospective students, and offered this assessment of Niche/College Prowler’s methodology:
“Niche.com arbitrarily mixes data from government databases, school administrators, and students themselves, without communicating to visitors which data is which. Yes, they literally allow campus marketing directors to login and update school profiles however they see fit.”
Niche doesn’t hesitate to apply its free-ranging methodology to such highly subjective subcategories as the Friendliest College (it’s Brigham Young!). It has also started ranking K-12 schools, and suggests those rankings could one day replace SAT/ACT scores as a tool for helping admissions staff better rank prospective college students.
Brigham Young University in Provo, Utah, is the nation's friendliest college, according to Niche. Photo: Jaren Wilkey/Wikimedia Commons.
4. FORBES
You have to hand it to Forbes for advancing the critical arguments against U.S. News while simultaneously establishing their own annual college and university rankings. It’s like GQ taking on People’s “Sexiest Man Alive” with its more sophisticated “Men of the Year” award.
A recent post on Forbes’ education blog by Jessica Brondo Davidoff compared the U.S. News rankings to such quaint cultural relics as cassette tapes and Casey Kasem’s Top 40: things that used to be wildly popular but are now nostalgic rather than relevant. According to Forbes, its methodology trumps U.S. News’ because it incorporates contemporary return-on-investment (ROI) statistics:
“This is a new age of return-on-investment education, the very heart of our definitive ranking. Our focus is on just one measurement: outcomes. From low student debt and high graduation rates to student satisfaction and career success, these outstanding institutions are worth it.”
Aside from the many issues with ROI statistics, one of the biggest problems with Forbes’ system is something that typically isn’t viewed as a problem at all: the way it ties college rankings to positional wealth and status. If someone makes more money selling junk bonds than working as a commercial loan officer, it isn’t because they got their MBA from the Wharton School at the University of Pennsylvania rather than from a lower-ranked institution. However, as Susan Engel points out in a recent Salon article, “The underlying expectation is that academic performance is always and only a matter of comparison.”
Forbes named Pomona College, in Claremont, Calif., America's top college. Photo: Dave & Margie Hill/Kleerup/Creative Commons.
3. TIMES HIGHER EDUCATION WORLD UNIVERSITY RANKINGS—WORLD REPUTATION RANKINGS
U.S. News entered the field of international university rankings in 2014, going up against established giants Times Higher Education (THE) and ShanghaiRanking Consultancy’s Academic Ranking of World Universities. Perhaps this is what inspired THE to introduce a spin-off rankings system based solely on the most controversial and discredited part of U.S. News’ rankings: the reputation score. As THE describes it:
“The Times Higher Education World Reputation Rankings 2015 employ the world’s largest invitation-only academic opinion survey to provide the definitive list of the top 100 most powerful global university brands. . . (The) reputation league table is based on nothing more than subjective judgement—but it is the considered expert judgement of senior, published academics—the people best placed to know the most about excellence in our universities.”
While THE’s survey is more extensive than U.S. News’, with over 10,500 responses from 142 countries, it still falls prey to the inherent statistical flaws and response biases of reputation scoring. THE’s World Reputation Rankings of the top 100 universities could only assign individual positions down to #50; beyond that point, the scores clustered so tightly that the remaining universities had to be grouped into alphabetized bands.
2. U.S. DEPARTMENT OF EDUCATION RATINGS SYSTEM/COLLEGE SCORECARD
The U.S. Department of Education’s proposed college ratings system earned the #2 spot on this list before it was officially abandoned on Sept. 12, after a two-year development process. It remains on the list even though it is being replaced by a new college scorecard system, both as a reminder of how close the USDE came to adopting the highly criticized ratings and because many pertinent questions remain to be asked about the scorecard system.
The idea of tying public funding to a federally administered college ratings system was controversial from the beginning, with the major organizations representing U.S. colleges and universities voicing concern and opposition. While opponents of the ratings generally agreed that holding poorly performing institutions more accountable was an important goal, they argued that the federal data available for many of the proposed metrics were too incomplete to support meaningful ratings. Other critics were concerned that the system’s performance metrics would end up punishing schools that serve predominantly low-income and minority students.
The USDE’s newly unveiled scorecard system no longer sets out to rate institutional performance. Instead, it aims to give students and families a research tool for sorting and comparing institutional data on considerations such as annual cost, graduation rates, and average salaries after graduation. Many of the scorecard’s new data sets are more comprehensive than what was previously available, especially the income data provided by the IRS. At the same time, critics have been quick to point out that this flood of new data is hardly value-neutral, and that it could further obscure the extent to which equity issues have a greater impact on personal earnings than where students got their degrees.
It also leaves people with the classic scoring system dilemma. If you’re sifting through the data and two students have identical 4.0 GPAs, how can you possibly know which one is the better student unless you can confidently say “The one who went to Yale”?
Though it costs $237,700 for four years to attend Harvey Mudd College in Claremont, Calif., the school offers a $985,300 ROI, the best figure in the nation, according to PayScale. Photo: Imagine/Wikimedia Commons.
1. PAYSCALE COLLEGE ROI REPORT
PayScale has boiled the complexity of ranking colleges down to a single measurement: how much money will students earn relative to what they paid for their degrees? In PayScale’s rating system, college is a transactional investment, the same as a stock or mutual fund purchase, and the way you measure success is return on investment after graduation.
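To make that premise concrete, here is a minimal sketch of how a naive cost-versus-earnings calculation might work. It is an illustration only: the 20-year horizon, the baseline-salary comparison, the function name naive_college_roi and the salary figures are assumptions made for the example (only the $237,700 four-year cost echoes the Harvey Mudd figure cited above), not PayScale’s actual formula.

```python
# A minimal sketch of a naive "college ROI" calculation, for illustration only.
# The 20-year horizon and the idea of netting an earnings premium against the
# total cost of attendance are assumptions; this is NOT PayScale's formula.

def naive_college_roi(total_4yr_cost: float,
                      median_annual_pay: float,
                      baseline_annual_pay: float,
                      years: int = 20) -> float:
    """Cumulative earnings above a baseline, minus the cost of the degree.

    total_4yr_cost      -- sticker price for four years of attendance
    median_annual_pay   -- median pay reported by graduates of the school
    baseline_annual_pay -- assumed pay without the degree (e.g., high school median)
    years               -- horizon over which earnings are accumulated
    """
    earnings_premium = (median_annual_pay - baseline_annual_pay) * years
    return earnings_premium - total_4yr_cost


if __name__ == "__main__":
    # Hypothetical inputs: the $237,700 four-year cost cited above, plus
    # invented salary numbers chosen only to show the mechanics.
    roi = naive_college_roi(total_4yr_cost=237_700,
                            median_annual_pay=95_000,
                            baseline_annual_pay=35_000)
    print(f"Naive 20-year net ROI: ${roi:,.0f}")
```

Even this toy version shows where the trouble starts: every input is a blunt institutional average, with nothing in it about the students themselves.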
If the idea of an absolute ROI for an institutional degree, independent of student aptitude, performance or aspiration, sounds too simplistic, that’s because it is. In addition to all the ROI calculation errors pointed out by economists from P3 and Arizona State University, PayScale lets graduates self-report their income. The rankings also make students who earned an education degree from a regional comprehensive university and went on to become high school teachers look as if they got shafted because they didn’t pay more to go to MIT.
PayScale’s College ROI rankings will eventually supplant U.S. News and World Report as the primary target of critical scrutiny, especially since commentators like William Bennett use them to argue that only 150 universities in the country are worth their price tag. In the meantime, major media outlets will trumpet their release, analysts will continue to say that it’s better than no ROI standard at all, and the science- and engineering-oriented West Coast Ivy Harvey Mudd will continue to be ranked at or near the top.
This year marks the 20th anniversary of Reed College’s decision to stop submitting survey information for the U.S. News rankings. The thoughtful explanation on Reed’s website and the articulate editorial that former President Colin Diver penned for The Atlantic in 2005 are outstanding critiques of the college rankings mindset and methodology. They also reaffirm the core values underlying higher education, and the true value that individual schools can provide to their students. It is an understanding of value that allows us to view questionable college ranking systems as unnecessary rather than inevitable. As Diver concluded:
“Before I came to Reed, I thought I understood two things about college rankings: that they were terrible, and that they were irresistible. I have since learned that I was wrong about one of them.”