Admit it. You and I cared too much about U.S. News & World Report in high school.
My parents, who know little about the structure of the American educational system, let alone individual universities, can probably recite the top 20 national universities on the U.S. News & World Report Best National University Rankings. Whenever I bring up the name of a college, their first response is always about how highly ranked the institution is.
While students and families worldwide plan their next four years according to a media company’s annual college rankings, few take the time to scrutinize how U.S. News ranks the colleges, or think critically about where its data comes from.
This March, Michael Thaddeus, a mathematics professor at Columbia University, published a 21-page report documenting Columbia's dubious ascent from 18th place in 1988 to No. 2 in 2021. His report questioned Columbia's stated class sizes, percentage of faculty with terminal degrees, spending on instruction, and student-faculty ratio — all important factors U.S. News considers when calculating its annual college rankings.
Although Columbia initially defended its data, in June it announced that it would not participate in the next U.S. News college rankings, giving its admissions office more time to audit the numbers. More problematic, however, was U.S. News' decision to include Columbia in its 2022-2023 rankings anyway, without referencing official data reported by the school. This time, Columbia fell from its previous No. 2 spot to No. 18. Because Columbia declined to submit new data, U.S. News collected information about the university on its own, relying on outside sources.
Instances of cheating in college rankings are not unheard of, but Columbia is certainly the highest-ranked and among the most reputable universities to have gamed the rankings system in recent years. Worse, the university tried to cover its tracks, adamantly defending its data until Thaddeus' report gained the public's attention.
Perhaps the most ridiculous decision in this series of events was U.S. News' choice to keep Columbia in the rankings at all, despite receiving no new data from the university this year.
What does it say about a university's quality of education when it plummets from No. 2 to No. 18 in one year? Should students expect Columbia to be nine times worse than it was a year ago, as the rank ratio would suggest? Or was the No. 18 a marketing tactic by U.S. News to increase views of, and discussion about, its new rankings, thus making more profit? Whatever the purpose, Columbia's sudden plummet and U.S. News' haste to rank Columbia again without due investigation reflect the meaninglessness of a ranking system that encourages universities to lie to get ahead while the company turns a blind eye.
Temple University's business school faced a fraud case a year ago: its former dean, Moshe Porat, boosted the school's rankings by submitting false reports to U.S. News. He ordered his staff to send inaccurate information because he and Isaac Gottlieb, a tenured professor at Temple's Fox School of Business, believed that U.S. News & World Report would not take the initiative to investigate.
Remember, ranking universities is a business. U.S. News, like many other publishers, relies on people discussing its rankings and paying annual subscriptions to turn a profit. Its business priority is not to ensure data accuracy, but to allow some eye-catching variation in each year's new rankings so that people will view them, argue about them, and eventually pay for them. Granted, rankings must maintain some base-level legitimacy and accuracy to stay in business, but for a company that profits from ranking schools year after year, variation is always welcome.
Colin Diver, former president of Reed College and former dean of Penn Law School, suspects that U.S. News doesn’t want to spend its resources on independent auditing and fears that schools won’t cooperate with an audit requirement to qualify for ranking. In fact, U.S. News’ recent FAQ page also acknowledges that it “relies on schools to accurately report their data” to ensure ranking objectivity and fairness.
Makers of the college rankings lack the resources or incentives to scrutinize data, while universities have all the incentives to lie or bend the rules to get ahead in college rankings.
After my previous article questioned Penn's method of calculating its student-to-faculty ratio in its yearly Common Data Set reports, Penn's newly published 2020-2021 Common Data Set showed an approximately 14% drop in the number of faculty with doctorates or other terminal degrees compared with reports from the past decade. It also raised its reported student-to-faculty ratio from 6:1 to 7:1, the first such increase in 10 years.
In response to my questions, Penn's Office of Institutional Research & Analysis responded that "the decision to provide headcounts for faculty in stand-alone graduate programs" this year caused the change. In previous years, Penn's Common Data Set reports did not distinguish faculty in stand-alone graduate programs from undergraduate faculty, lumping them together as one figure.
If one institution could lump undergraduate and stand-alone graduate program faculty together for years without scrutiny, how can we ensure that the hundreds, if not thousands, of institutions submitting their data to U.S. News every year are honest and ethical? The Penn community also deserves an explanation of why Penn has suddenly changed its faculty calculation methods and how it arrived at its numbers.
Penn is not an outlier in gaming the system. However, Penn can become the outlier that stands up against the rankings business and calls out the inaccuracies festering within college rankings. While many of us take pride in Penn's rise from No. 8 to No. 7 in this year's U.S. News & World Report ranking, we should also recognize the arbitrary, shady, and unethical ranking process that is slowly making rankings a sham.
At the end of the day, rankings will always exist, and they deliver advice, good or bad, efficiently. However, a sole fixation on comprehensive rankings when selecting schools distorts the true purpose of ranking colleges. Instead, look at the specific rankings that relate most to you. Regional rankings, program-specific rankings, or a list of colleges with the "most students receiving merit aid" all provide useful information for students from diverse backgrounds. Better yet, create your own ranking, personalized to your preferences such as geography, school size, and gender distribution.
Nowadays, many of us are obsessed with pursuing the “best” through the perspectives of other people. We fail to realize that what’s best for ourselves is not necessarily the same for others. The same goes for selecting your dream school.
SAM ZOU is a College senior studying political science from Shenzhen, China. His email is firstname.lastname@example.org.