
It seems like everyone is trying to get in on the college ranking game, even the president. Earlier this month, the Obama administration rolled out the College Scorecard, a new data-driven method for evaluating colleges and universities. What sets the College Scorecard apart from all those other rankings is its focus on student outcomes. Rather than measuring nebulous things like reputation, the College Scorecard focuses on three key metrics: how much a school costs, what percentage of students graduate, and how much money students earn after graduation. There is also information available about diversity, average debt at graduation, and other key factors.
The Scorecard is supposed to help students determine which schools will give them “more bang for [their] educational buck.”
Certain schools don’t look particularly appealing when viewed through this lens. For example, MICA costs far more per year than the national average ($35,607 per year, after factoring in aid), but ten years after enrolling, the school’s alumni earn less than the national average ($31,400). A year at Hopkins, by comparison, costs less than MICA, though still more than average ($26,596), and its alumni go on to earn a whole lot more ($69,200). Schools like the University of Baltimore and various local community colleges also end up looking much better when evaluated under this system. (Compare and contrast various local schools here.)
Of course, this kind of comparison also highlights some of the weaknesses of the College Scorecard; after all, no one goes to art school because they think it’s going to make them buckets of money after graduation. As critics of the Scorecard have pointed out, schools that focus on the liberal arts more than STEM fields tend to lose out in these kinds of calculations because the benefits they provide are often more difficult to quantify. But on the whole, more information is a good thing for anyone embarking on the college quest.