College administrators: Rankings can be problematic

How do you quantify this?

It’s college-rankings season in America — the time when even the most patient college administrators roll their eyes.

The release of the annual U.S. News & World Report Best Colleges rankings, the most influential of them all, is a climax of sorts to a business that makes money off ranking schools by any number of things: academic caliber, value for money — even food.

Each year, the public eats those rankings up — as well as lists and rankings from Forbes, The Princeton Review, Washington Monthly, and a number of other national and international sources.

And yet flaws in the rankings’ philosophy and methodology mean students who shoot for only the highest-ranked school they can reach are bound to be disappointed, higher education officials warn. And college administrators are often left to answer questions about why their schools have inexplicably jumped or dropped in the lists — even though nothing at their schools has really changed from one year to the next.

“Most college presidents I know would prefer that the rankings didn’t exist,” said MaryAnn Baenninger, president of the College of St. Benedict and chair of the Board of the Minnesota Private College Council.

She told me:

“But we’re realists. … You ignore (them) at your peril. You have constituents out there who are paying attention to this stuff.”

The U.S. News rankings for undergraduate institutions, first published in 1983, collect data on 1,600 colleges and universities and then rank them in various categories and by types of college. The magazine weighs as many as 16 factors, such as academic reputation, faculty resources, graduation rates and student selectivity.

“I understand the attraction,” Macalester College President Brian Rosenberg said. “You’re taking a very complicated thing, and appear to make it much simpler. The notion that you can look at the list and see this school is better than that school, which can be very reassuring to people.”

Whether a “best” school exists

But for years, critics have attacked the idea of a “best” university, saying it’s impossible to quantify something as intangible as educational excellence, or say that one school is best for all, or add up how much better one school is than another — especially one considered a peer.

And U.S. News’ director of data research, Robert Morse, acknowledges that the rankings are just a rough tool to help gauge what colleges are a good fit — especially for students from cash-strapped high schools that don’t have enough college counselors to offer much help.

“(We want) to stress that the rankings not be used as the sole basis to choose a college,” Morse said. “U.S. News is not saying that parents should draw a line in the sand and say, ‘We should only go to schools above (rank) 10 or between 10 and 20.’”

Admissions consultant Todd Johnson, owner of College Admissions Partners, agrees.

Choosing a school based on a ranking “is where it becomes problematic,” he said. “If you’re looking at No. 2 vs. No. 50, yeah — maybe there’s a difference. But No. 2 vs. No. 5 — that’s ridiculous. And between No. 2 and 20, probably not much difference, either.”

Concerns over methodology

The philosophy of ranking aside, many critics say the methodology itself is flawed. Among the problems:

    • It’s too subjective. As much as a quarter of the score for some rankings depends on surveys of top college administrators and high school counselors, who are asked to rate America’s many universities — an hours-long task. The idea is that administrators are good judges of each other’s colleges, but as University of Minnesota Provost Tom Sullivan said, “Do I really know what’s going on at St. Louis University? No.”
    • Changes in methodology cause swings. Constant tweaks to the way the rankings are put together — such as how they classify schools, how much weight they give to various factors, and the inclusion of new schools — mean many colleges will rise and fall a few notches. But when many schools are clustered close together, a minute drop in one factor can cause a dramatic drop in a college’s standing, critics say. That’s deceptive, because campuses just don’t change much in one year, college officials say.
    • Departments are lumped together. Even Ivy League colleges have their less-than-stellar departments, just as many of their less-prestigious cousins have top-notch programs that only those in the field know about. U.S. News does break out business and engineering programs and their specialties, for example, but that’s of little use to students looking for the best schools in other fields of study. (University of Michigan associate professor in education Michael Bastedo suggests ranking departments — not whole institutions — and letting faculty fill out the forms, since they tend to have a much better understanding of each other’s departments.)
    • It’s ultimately based on money. Macalester’s Rosenberg joined others in saying that a large chunk of what U.S. News weighs — such as faculty resources (class size, salaries, etc.), instructional spending and the alumni giving rate — is financial. The magazine “would do better to rank according to how much money is spent per student,” he said. And colleges that try to trim extra costs are actually penalized because their per-pupil spending goes down. “Isn’t it peculiar that you’re rewarding inefficiency?” he asked. “If you can spend less to get the same outcome, shouldn’t that be seen as a good thing?”
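The clustering problem critics describe above can be shown with a toy calculation. The scores and school names below are entirely hypothetical — they are not actual U.S. News data — but they illustrate the mechanism: when overall scores sit a fraction of a point apart, a small dip in one school’s score translates into a large fall in rank.

```python
def rank_schools(scores):
    """Return a dict mapping school -> rank (1 = best), by descending score."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {school: i + 1 for i, school in enumerate(ordered)}

# Ten hypothetical schools separated by just a tenth of a point each.
scores = {f"School {chr(65 + i)}": 90.0 - 0.1 * i for i in range(10)}

before = rank_schools(scores)

# School B's score dips by 0.45 points -- about half of one percent...
scores["School B"] -= 0.45

after = rank_schools(scores)

# ...and it falls four places in the ranking.
print(before["School B"], "->", after["School B"])  # prints: 2 -> 6
```

The same half-point dip would barely register if the schools’ scores were spread several points apart — which is the critics’ point about tightly packed lists.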

Morse has said surveys of reputation are important because they provide the intangible elements of a school that raw data just can’t show. And he told MPR that the aggregated responses of college leaders are still a fairly accurate indication of a school’s standing.

Large changes in a school’s ranking are more of a problem for smaller schools, he said, because changes in data aren’t diluted as they are at large universities. That said, the actual data does change year to year, so the occasional drop or jump in the rankings is valid, he maintained. Ranking departments is a task too large and complicated for the magazine’s resources, he said, and the role of money is not something one can wish away.

“Schools with limited resources aren’t producing the same level of academic programs or service or academic package that schools with resources are,” he said.

Rankings’ effect on higher education

So how have U.S. News rankings affected schools after a quarter of a century or so?

They’ve produced some odd dynamics, Michigan’s Bastedo explained in a recent article for Oberlin College’s alumni magazine:

  1. The ranking itself helps shape college leaders’ perception of a college — not the other way around. The more years that college presidents see a ranking, he suggests, the more that ranking influences how they see a college. At some point, he told MPR, reputation and ranking “are starting to feed off one another, so they’re not different anymore.”
  2. Students don’t really pay attention to rankings — only higher-ed insiders do. A 2006 study by UCLA found that only one in six students said rankings strongly influenced their choice of a college, Bastedo wrote. In most cases, moving up or down some notches had little if any effect on admissions. When it came to funding, it was a relatively small group — alumni, out-of-state students, and those who made decisions on federal research funding applications — who really cared.
  3. Rankings can be a self-fulfilling prophecy. College officials begin to believe the rankings, and change how they manage the colleges to do even better in them. Clemson University’s president vowed in 2001 to move into the top 20, and Clemson officials reportedly influenced U.S. News surveys and data to move it up in the rankings. Oddly enough, such attempts to game the system ultimately bring little benefit, Bastedo said. “I think it’s actually fairly difficult to change the indicators enough to move up in a significant way.”

Morse said he hadn’t seen Bastedo’s research, but remained “skeptical” that his methods could come to that conclusion.

Ultimately, Bastedo told MPR, “The irony is that rankings are designed to serve those outside of education, (but) it just turns out that the people really interested are in higher education. We’re looking at ourselves.”

Movements against the rankings

The higher education sector’s distaste for rankings has spawned numerous education journal articles with titles such as “Flawed rankings,” “Fixed Rankings?” “Obsession with Rankings Goes Global,” and even “The Madness of Rankings.”

Several dozen college presidents signed a letter in 2007 pledging not to participate in U.S. News’ survey of their peers’ reputations or use the rankings in marketing or other publicity.

But efforts to combat the ratings don’t appear to have affected them much.

On his college’s Web site, Macalester’s Rosenberg wrote a piece called, “The Great Ratings Debate: Should Macalester join the boycott of U.S. News & World Report?”

His answer: No.

“I am inclined to think that more information is better than less and that consumers should determine which forms are most useful and reliable,” he wrote. “Attempts to suppress even the most baseless and scurrilous publications have rarely succeeded.”

As a possible antidote, he pointed to the Annapolis Group, an organization of about 130 liberal arts colleges that announced in 2007 that it would start working on its own system for comparing colleges. Since then, though, the project appears to have gone inactive.

St. Benedict’s Baenninger, who was involved in the project, said, “Over time it’s better to leave (such work) to other more encompassing organizations,” and mentioned that the National Association of Independent Colleges and Universities has produced its own set of online comparisons, called U-Can.

Meanwhile, college presidents continue to view rankings with mixed emotions.

“There’s the old cliche: Presidents love rankings when they’re good, and when they’re not, they think they don’t matter,” Hamline University President Linda Hanson told MPR last year.

Baenninger’s bio page on St. Benedict’s Web site, for example, lists the college’s place in the rankings, as well as its rise under Baenninger’s tenure. And on another part of the site, St. Benedict’s touts its appearance in several ratings and college guides — but also includes some caveats about such listings.

The college doesn’t send out a press release when the U.S. News rankings come out, and neither does Macalester anymore.

And would tooting their own horns help anyway?

Probably not too much, as Bastedo has said.

Student use

John Lawlor, founder of higher-education marketing firm The Lawlor Group in Eden Prairie, said students and families generally don’t rely heavily on rankings, and use them merely as a starting point in their search.

“Rankings influence consideration” of a college, he said. “They don’t influence selection.”

That said, Phil Trout, a college counselor at Minnetonka High School, said a few parents and students do give the rankings their complete trust, and come to his appointments with their ranked listings marked up and highlighted.

Like many in higher education, Trout tells them not to shoot for the most highly ranked college, but for the one that seems best matched to the student’s personality and interests.

Rankings do provide a lot of backup material and data that show some of the personality of a campus, college officials say. And different ranking systems use different criteria, so parents need to know what’s important to the rankers to understand how they’re skewed.

Still, they’re not the main game.

Minnetonka graduate Jennifer Friedlander, now a junior at Middlebury College in Vermont, wrote in an e-mail to MPR, “Overall, I think rankings matter if you are not an involved student. If you have a sport, activity, hobby that a school can really foster, that is where you go.”

But U.S. News rankings did act as a red flag for Minnetonka High parent David Drueding, who said, “I’m worried if a school was poorly ranked. I’m worried if it’s at the bottom.”

He views ratings with a healthy dose of skepticism, but says, “I’m glad they’re there. I’d hate to go out into the world without something.”