What Makes Our Best Colleges Rankings Different

There is no shortage of rankings out there claiming to identify the "best colleges". Some are great tools to help students begin their college search. When we decided to publish our Best Colleges Rankings, we vowed not to do so unless we thought ours would add something to the overall discussion on college selection. This article outlines why we think we have accomplished that goal.

Rankings For Better Decisions

We believe our rankings are better. Click any of the links below to read more about each factor.

  • Outcome Focused: If a college doesn't prepare graduates for success, is it really a good college?

  • Objective: We don't rely on subjective survey data.

  • No Division into Arbitrary Lists: Most students don't know why they would want to go to a research university over a liberal arts college. We give them a way to compare all colleges together.

  • Inclusive of All Colleges: Very few students will make it into the "top 100", so we rank all colleges we have data on. We know there are many colleges doing a great job educating the majority of students who cannot get into highly selective universities.

  • Customizable: One size does NOT fit all! This is never more true than it is with college selection!

Outcome Focused

Ranking colleges based primarily on inputs (data determined before admission, e.g. test scores) is like ranking the quality of hospitals based on the health of their patients before they were admitted. Almost all college rankings use some input data like test scores, and we are no exception. However, input data make up a smaller part of our ranking, and we take a different approach in a number of ways.

  • We don't include acceptance rates, as they can be deceptive (the more students a college convinces to apply, the lower its acceptance rate appears).

  • We look at the percentage of full-time teachers, not just full-time faculty, which exposes many of the colleges that do not classify part-time adjuncts as "faculty".

  • Most of the input data we do include is not weighted very heavily.

We instead focus on actual outcomes such as:

  • Graduation & retention rates (common, but relevant)

  • Student loan default rates

  • Early and mid-career earnings of graduates from each college

Earnings Boost, Not Average Earnings

Some rankings do include salary information, but they use average earnings for all students at the college. This can paint a misleading picture that says more about the mix of majors at a school than about its educational quality. A college that focuses on the liberal arts will almost always have graduates who earn less on average than a school focused on engineering. That doesn't mean the liberal arts school does a worse job educating its students.

We take a different approach. Based on data from the Department of Education, we estimate the average earnings for graduates of each major at a particular college. We then compare this number to the salaries of graduates with the same major at other colleges. This tells us whether graduates in a particular major at a particular college are earning more than, less than, or about the same as their peers. If they are earning more than their peers, this is reflected as an "earnings boost".

We then take a weighted average of the "boosts" for each major at a college to get an overall average "boost" to graduates' earning power at that college.
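
Here is a minimal sketch of how a per-major boost and its weighted average could be computed. The figures, major names, and the choice of graduate counts as weights are illustrative assumptions for the example, not our actual data or formula.

```python
# Illustrative sketch of the "earnings boost" idea described above.
# All numbers and field names are hypothetical examples.

# Median earnings by major, averaged across all colleges (the peer baseline).
peer_median_by_major = {
    "English": 38_000,
    "Mechanical Engineering": 65_000,
}

# Earnings and graduate counts for the same majors at one hypothetical college.
college_majors = {
    "English": {"earnings": 42_000, "graduates": 120},
    "Mechanical Engineering": {"earnings": 63_000, "graduates": 80},
}

def earnings_boost(college_earnings: float, peer_earnings: float) -> float:
    """Percent difference versus graduates with the same major elsewhere."""
    return (college_earnings - peer_earnings) / peer_earnings

def overall_boost(majors: dict, peers: dict) -> float:
    """Average of the per-major boosts, weighted here by number of graduates."""
    total_grads = sum(m["graduates"] for m in majors.values())
    return sum(
        earnings_boost(m["earnings"], peers[major]) * m["graduates"]
        for major, m in majors.items()
    ) / total_grads

print(f"Overall earnings boost: {overall_boost(college_majors, peer_median_by_major):+.1%}")
```

In this toy example the English graduates out-earn their peers while the engineering graduates slightly lag theirs, and the weighted average nets out to a small positive boost (about +5%).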

We think this method of measuring salaries is a better approach to determining how well graduates are doing in the workplace relative to the competition.

No Subjective Insider Surveys

Opinion surveys may provide some insight into the college landscape, but they are by no means objective.  Asking college administrators to rank themselves and their competition makes no sense.  Beyond the obvious problems of bias, there is the simple fact that no administrator could possibly have in-depth knowledge on all of the colleges out there.  Their knowledge would be limited to a small handful of schools, and even then their personal experience may have very little to do with how well a college is actually educating students.

The great risk with such an approach is that the ranking turns into a popularity contest. Colleges often become fixated on boosting their image to other professionals in the hopes of moving up the rankings, instead of focusing on becoming better at providing a quality education. The whole system creates a self-fulfilling prophecy where the best schools continue to be well known and respected by all college administrators and so they continue to rank well.  It raises the question, does a school rank well because it offers a quality education or does it rank well simply because it has ranked well in the past?

Survey data is weighted fairly heavily in some rankings out there, and we don't think that serves students and their families trying to make one of the most important decisions of their lives. Once you get into the top 1% of universities, survey data may become more useful, but even so that leaves out the other 99% of students looking for a better education.

No Forced Filters

Rather than break 4-year colleges into separate, mutually-exclusive sub-lists, we rank all 4-year colleges together.  We believe that doing otherwise is withholding valuable information from students.  Grouping colleges into separate lists may make sense for administrators and professionals, but it does not make sense for the students who are doing the searching. Most students are not looking strictly at colleges that have a doctoral program (National colleges)  or those that don't (Regional) or only at Liberal Arts schools. Students and parents should have the ability to compare all 4-year colleges against each other.

If there were truly different methodologies that got to the heart of the differences between these colleges, that might be one thing, but breaking them up while using the same methodology for each group does not accomplish anything other than serving as a filter.  We allow students to filter schools of a certain type only if they want to. We believe the power should be in the hands of the consumer (the student), and that someone else should not be deciding which schools they can and cannot compare to each other.

We Rank All Colleges

Well, maybe not all of them, as there are some that don't report enough data, but we don't impose any artificial cut-offs in our list by only showing the top 10, top 100 or the top 50%.  Imposing arbitrary cut-offs hurts those considering schools not on that list (which may be the majority of students).  If a ranking methodology is valid for the top 50% of colleges it should be just as valid for the bottom 50%.  The colleges on the bottom of the list may not like it, but our business is helping students, so we publish our Best Colleges rankings on almost 1,400 4-year colleges.

Certainly a college whose mission is to accept all students is going to rank very differently than one that is highly selective.  Is the highly selective college "better" at educating its students than the open enrollment college?  That's really hard to tell.

What isn't hard to tell is that if the academic caliber of your fellow students is relevant to the top colleges, it is just as relevant to the bottom ones.  If your odds of graduating from a bottom school are only 15%, you just might be that hardworking student who is the exception, but it's still relevant to know that your odds are not good and that most of your classmates won't finish college.

Arguing that a college has a "different mission" and so should not be ranked misses the point of what rankings are supposed to be. College rankings are meant to help students make more informed decisions about their higher education.

At a macro level, colleges with different missions may have a point.  But at a micro level, the level of a single student trying to better themselves, how the top colleges rank may be entirely irrelevant.  The only realistic options for a student with average or below-average grades (again, the majority of students) may be within those bottom tiers.  For them, understanding the difference between the #800 and #900 ranked school is as important as it is for the top students in the country to understand the difference between #1 and #100.

Anti "One-Size-Fits-All" Rankings

While I just used far too many words to explain why our rankings are better, I will now let you in on a little secret.

Rankings are Misleading, Including Ours.  

I know what you're thinking. I just spent all this time describing how great our rankings are, and now I'm saying they're misleading?

Well, let me rephrase that. Rankings by themselves are nowhere near enough information.  They are woefully inadequate when it comes to helping students make better decisions about their higher education.  Far too many students are encouraged to rely heavily on rankings to make their decision, when rankings should be just one of many tools students use to determine their best fit.

Rankings are just a starting point.  Granted, we want to give students the best starting point possible, but they are step one of many more steps to come.  The problem with most rankings is that they stop at rankings.  That is just where we get started.

There is no single answer to the question "What is the best college?".  It's a false question.  The only question that matters is "What is the best college for YOU?".  Each student has different strengths, abilities, interests, goals, values, limitations, preferences, etc.  One-size-fits-all rankings assume every student is exactly the same.

How long would you use a search engine that only returned the same list of top 10 pages regardless of what you entered as a search term?  Whether you were searching for "best Thai restaurants in NYC", "cheapest flights to San Francisco" or "how to make a key lime pie", if the top 10 pages were all the same you would find it to be a pretty useless search tool.

You can quickly see the uselessness of a search engine that acted as though all people were searching for the same thing. So why should you accept that when doing a search for one of the most important and expensive purchases of your life, your college education?

Custom Rankings

As a first level of going beyond one-size-fits-all rankings, we allow a student to filter most of our rankings to find a list that is more relevant to them.  For someone who wants to go to school in New York, seeing a list of the top schools nationwide isn't really helpful, so we allow students to see college rankings by state and region.  Where there are state-specific values (like costs), our rankings adjust to reflect that.

We further allow students to filter by things like public vs. private, academic ability (odds of getting in), likelihood of getting financial aid and more.  For our sports rankings, we allow someone to view rankings by their specific sport and division.  Beyond just filters, we enable a student to customize the weighting of factors in many of our rankings based on what they think is most important.
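
To make the idea of customizable weighting concrete, here is a rough sketch of how user-chosen weights could re-order a ranked list. The factor names, scores, and weights are hypothetical assumptions for the example, not our actual ranking formula.

```python
# Hypothetical illustration of re-weighting ranking factors.
# Factor names, scores, and weights are made up for the example.

# Normalized factor scores (0-1) for a few hypothetical schools.
schools = {
    "College A": {"outcomes": 0.85, "cost": 0.40, "aid": 0.70},
    "College B": {"outcomes": 0.65, "cost": 0.90, "aid": 0.55},
}

def rank(schools: dict, weights: dict) -> list:
    """Sort schools by a weighted average of their factor scores."""
    total = sum(weights.values())

    def score(factors: dict) -> float:
        return sum(factors[f] * w for f, w in weights.items()) / total

    return sorted(schools, key=lambda name: score(schools[name]), reverse=True)

# A student who cares mostly about outcomes...
print(rank(schools, {"outcomes": 3, "cost": 1, "aid": 1}))  # ['College A', 'College B']
# ...versus one who is most sensitive to cost.
print(rank(schools, {"outcomes": 1, "cost": 3, "aid": 1}))  # ['College B', 'College A']
```

Same underlying data, different priorities, different order. That is the point of letting the student control the weights.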

Beyond Filters to True Matching

One of the problems with filters is that they are often limiting.  If a student only has a mild preference for small schools, excluding medium schools may hide schools that are a much better fit for them overall.  This approach also assumes that the student has well-informed preferences, which many students don't.
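
As a purely illustrative sketch of that difference, compare a hard filter with a soft preference below. The school names, sizes, fit scores, and bonus value are assumptions for the example, not how our matching actually works.

```python
# Illustrative contrast between a hard filter and a soft preference.
# All names and numbers are hypothetical.

schools = [
    {"name": "Small College",  "size": "small",  "overall_fit": 0.70},
    {"name": "Medium College", "size": "medium", "overall_fit": 0.90},
]

# Hard filter: a mild preference for small schools removes the better overall fit entirely.
hard = [s for s in schools if s["size"] == "small"]

# Soft preference: give the preferred size a modest bonus instead of excluding the rest.
SIZE_BONUS = 0.05
soft = sorted(
    schools,
    key=lambda s: s["overall_fit"] + (SIZE_BONUS if s["size"] == "small" else 0.0),
    reverse=True,
)

print([s["name"] for s in hard])  # ['Small College']
print([s["name"] for s in soft])  # ['Medium College', 'Small College']
```

With the hard filter, the medium school never even appears; with the soft preference, it still ranks first because its overall fit outweighs the mild size preference.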

There is a better way to truly find the best-fit college for a student, and that is what we are building.  If you can imagine what would happen if Consumer Reports, eHarmony and Priceline somehow combined their DNA to create offspring focused on college selection, then you have an idea of what we are working on.  Stay tuned for more on this over the next couple of months.

Feedback Wanted

Whether you like our approach, love it, or hate it, let us know.  We are constantly trying to improve our rankings and our overall ability to help students identify the best-fit college and major for them.  All feedback is welcome.

Additional Resources

Some additional resources you might find interesting.