Ranking law schools is a big deal. It’s no secret that the legal world is prestige-obsessed. Everything is ranked and ranking is everything.
So how about the rankers themselves? One of Manhattan LSAT Prep’s bloggers, Mary Adkins, got the idea to rank the law school rankings. It’s a good idea, but I strongly disagree with her analysis. For the most part, she just took the rankings at their word. The problem is that law school rankings can be big liars. They all have an agenda of some kind. Fortunately, some are there to help you. Get information from the helpful rankings, and take the rest with a large pinch of salt. In this post, we sort out which are gold and which are garbage.
We will rank four law school rankings, with analysis of their merits (or lack thereof) to follow. The heavy hitter in the group is the US News and World Report Rankings. These incredibly influential rankings have for years now played a big role in applicants’ decisions about where to attend law school.
Next we looked at the recently published Above The Law Rankings, created by the widely read legal blog. Above The Law published its first rankings this year to address some perceived shortcomings of the USNWR rankings. One of their chief aims is to account for the quality of the jobs going to graduates of different schools.
Then we took a look at some rankings published a couple of days ago by a brand new blog, Tipping The Scales. Tipping The Scales is run by a journalist, John A. Byrne, who heads the business school blog “Poets and Quants.”
The last thing we look at is Law School Transparency’s ‘Employment Scores.’ Law School Transparency is a blog devoted to helping students make well-informed choices about where to attend law school and how much to pay. LST has also been a major force in getting law schools to publish detailed and accurate data on students’ employment outcomes.
Without further ado, here is how we stack up these four recent law school rankings:
#1: The Law School Transparency Employment Score Reports
Bottom line: Incredibly helpful and user-friendly. Use should be required before you can apply to law school.
#2: Above The Law’s Top 50 Law Schools
Bottom line: Best multiple metric ranking. Shows you where to get the good jobs.
#3: The US News And World Report Best Law Schools Rankings
Bottom line: Still accurate, still evil.
#4: Tipping The Scales Debut Ranking of Law Schools
Bottom line: Nothing more than a conversation piece. Laughably self-serious for something so imbecilic. Hopefully no one’s fooled.
To help sort the trash from the treasure, let’s take a look at how these rankings rate the top 14 law schools. The top 14 is an informal category for the schools that have always sat atop the US News rankings (left-most column) since they began in 1987:
[Table: the top 14 schools as ordered by US News (left-most column), Above The Law, and Tipping The Scales, alongside Law School Transparency’s Employment Scores. The Employment Score column, from best to worst:]

Chicago – 94.9% (7.9% school-funded positions)
UVA – 94.5% (14.9% school-funded positions)
UPenn – 94.4% (2.6% school-funded positions)
Columbia – 93.4% (8.1% school-funded positions)
Stanford – 91.2% (2.2% school-funded positions)
NYU – 91.1% (12% school-funded positions)
Harvard – 87.1% (2.2% school-funded positions)
Berkeley – 85.9%
Cornell – 85.3% (0.5% school-funded positions)
Duke – 85.3%
Michigan – 82.5% (0.8% school-funded positions)
University of California Irvine – 82.1%
Yale – 82%
Northwestern – 75.9% (0.3% school-funded positions)
So what’s going on in these rankings? Anyone in the know about law school rankings is probably screaming, “What is wrong with the Tipping The Scales rankings? How the hell are Duke and Northwestern outranking NYU and Chicago?” We wondered the same thing, so let’s tackle that first:
How Not To Rank Law Schools: The #4 Tipping The Scales Rankings
You can criticize US News up and down for a number of reasons, and we’ll do it later on, but one thing about the US News ranking of the T14 (see the left-most column) is that it pretty closely approximates the broader perception in the legal world of how these law schools stack up. Schools will change spots a little every year (otherwise US News wouldn’t sell many copies of their rankings), but positions generally remain about the same, stuck in clusters that mirror the legal world’s collective opinion.
The usual clusters, if you ask public opinion, are as follows: Harvard, Yale, and Stanford are almost universally considered the top 3. These are the schools everyone is dying to get into. Next come Columbia, Chicago, and NYU, all of which are solidly dominant in big-law hiring. Then come Berkeley, Penn, Virginia, and Michigan: great schools that grant solid hiring prospects but tend to pull in slightly weaker applicants for a variety of reasons. The last cluster is Northwestern, Duke, Georgetown, and Cornell. These schools carry a strong national reputation; however, their students may experience a little more difficulty lining up a quality job in this economy.
So how does Tipping The Scales come up with a ranking that completely upends the usual setup? Are they seeing something hidden about the underlying value of these schools? Something that the rest of us aren’t? No, Tipping The Scales is just trying to cause a sensation by using a deeply flawed methodology to stir things up.
Here’s what they had to say about their own rankings: “Tipping the Scales’ ranking zeroes in on two key dimensions of the J.D. experience: the quality of the students getting into a law school and the success of the graduates going out. Bottom line is, these metrics are simple to understand and they get at what really counts in a law school education.” (Source)
The premise is fine, but they failed miserably to measure these ‘key dimensions’ in any meaningful way:
They measure the quality of students getting into a law school with just two numbers: a school’s median LSAT score and its acceptance rate. Together, those two numbers make up a full half of the whole ranking.
Acceptance rate is a terrible way to distinguish between a broad group of schools. Why? Because different law schools aren’t all going after the same candidates. To quickly illustrate the point: NYU’s acceptance rate is 27.9%, whereas Minnesota’s is a lower 23.2%. Is Minnesota a more selective school? No. NYU attracts applications from, and selects, much stronger candidates. That’s a point on which no reasonable person can disagree. Yet this ranking system would give Minnesota the edge on a full quarter of the ranking’s weight.
Acceptance rate is only a good way to measure the pulling power of a school when comparing it with its own peer group. NYU is a top school going after the same applicants as Harvard, Yale, Stanford, Columbia, and Chicago. Within this group, UChicago and NYU have the hardest time getting those applicants to enroll, so they have higher acceptance rates than the others.
Acceptance rate is also subject to tampering through ‘yield protection,’ the practice of waitlisting highly qualified students on the grounds that such students are likely to accept offers from more prestigious schools. Arguably, this tampering already occurs because of the tiny (2.5%) weight US News puts on acceptance rates in its rankings.
So basing a quarter of a ranking system on acceptance rates is a terrible idea. I can’t swear whoever came up with the idea doesn’t have Oreo McFlurry where their brains should be, but then again, I can’t not swear it either. It would be fine if their other metrics somehow made up for the flaw, but that doesn’t happen. The problems just grow from there. I’m going to switch to bullet points so this doesn’t take all day:
- The only other measure of student quality, median LSAT (25% of the rankings), is incredibly blunt. Schools sometimes have very similar medians but differ vastly on every other measure of student quality. Look at Northwestern’s 170 median (#7 on these rankings, screen-captured below) versus Chicago’s 171. Not much difference. Now look at Chicago’s 25th-75th percentile GPA range of 3.65-3.96 versus Northwestern’s 3.38-3.84. Do the same 25th-75th comparison for LSAT and you see Chicago’s 167-173 over Northwestern’s 164-171. These numbers consistently measure exactly what Tipping The Scales says it wants to measure: the quality of the students getting into a law school. So why ignore them? Because they couldn’t shake up the traditional rankings as much without ignoring this stuff.
- Median Private Sector Salary (12.5%). Notice how it’s the same for every school on the list below? It doesn’t distinguish them at all. If you looked instead at what the bottom half of the class is making, you would see real differences between these schools, differences that matter for the other ‘key dimension’ these rankings pretend to measure: job outcomes. Yes, it’s true that this metric starts to matter when you rank schools below the T14, but a more discerning methodology would help there as well.
- Median Public Interest Salary (12.5%) is the second most glaringly stupid metric that Tipping The Scales uses. It’s useless because (1) the sample size for a given school is often ridiculously tiny, and (2) people who go into these jobs typically aren’t motivated by salary. Salaries at public interest jobs mostly vary by where in the country the job is located; pay isn’t tied to prestige. This metric yields numbers that are wholly arbitrary.
- Jobs upon graduation (25%) makes up the rest of the ranking. That’s it. This is the only useful metric they give a reasonable amount of weight to. Unfortunately, it’s drowned out by the other poorly designed metrics, leading to paradoxical results. See, for instance, how Northwestern is bested by Duke despite winning every category except acceptance rate (well, they tied on private sector median salary, but every T14 school did). Duke and Northwestern beat UChicago and NYU purely on the strength of acceptance rate and public interest salary, even though Duke and NU have far worse job stats.
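To see how a scheme like this can flip an ordering, here is a minimal sketch of a weighted composite in the style Tipping The Scales describes. The weights are the ones listed above; the min-max normalization and every school figure below are illustrative assumptions, not their actual method or data:

```python
# Sketch of a Tipping-The-Scales-style weighted composite.
# Weights are from the article; normalization scheme and all
# school numbers are made up purely for illustration.

WEIGHTS = {
    "median_lsat": 0.25,
    "acceptance_rate": 0.25,    # lower is treated as "better"
    "private_salary": 0.125,
    "public_salary": 0.125,
    "jobs_at_graduation": 0.25,
}

# Hypothetical schools (fabricated numbers, not real data).
schools = {
    "School A": {"median_lsat": 171, "acceptance_rate": 0.28,
                 "private_salary": 160_000, "public_salary": 45_000,
                 "jobs_at_graduation": 0.95},
    "School B": {"median_lsat": 170, "acceptance_rate": 0.18,
                 "private_salary": 160_000, "public_salary": 60_000,
                 "jobs_at_graduation": 0.85},
    "School C": {"median_lsat": 164, "acceptance_rate": 0.45,
                 "private_salary": 160_000, "public_salary": 40_000,
                 "jobs_at_graduation": 0.60},
}

def normalize(values, invert=False):
    """Min-max scale to [0, 1]; invert when a lower raw value is 'better'."""
    lo, hi = min(values), max(values)
    if hi == lo:                      # identical values distinguish nothing,
        return [0.5] * len(values)    # e.g. private salary medians in the T14
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1 - s for s in scaled] if invert else scaled

names = list(schools)
scores = dict.fromkeys(names, 0.0)
for metric, weight in WEIGHTS.items():
    column = normalize([schools[n][metric] for n in names],
                       invert=(metric == "acceptance_rate"))
    for name, value in zip(names, column):
        scores[name] += weight * value

for name in sorted(scores, key=scores.get, reverse=True):
    print(f"{name}: {scores[name]:.3f}")
```

With these made-up numbers, School B outranks School A on the strength of acceptance rate and public interest salary alone, despite a lower LSAT median and much weaker job placement. That is the same kind of inversion that puts Duke and Northwestern over UChicago and NYU.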
So why did I go to so much trouble to trash a ranking by some silly new blog? Well, first off, it was a good exercise in viewing the anatomy of a poorly designed ranking. Tipping The Scales claims to be going for simplicity, but they wound up giving us deception.
Now, I’m not overly worried about my audience. I know you are smart enough not to take GW at full price over NYU just because they are mashed together in some ranking. However, law school rankings should be there to help people make important comparisons between schools.
Here’s what I suspect Tipping The Scales did instead: they sat there and thought, “How can we make a ranking with the least amount of work that flips a bunch of stuff around?” I’m not mad at them for doing that. It might be their idea of fun. Where it becomes a problem is when you try to sell it as something serious and meaningful, which they do, big time.
There are two pages of BS pushing it as an important ranking before they actually get to the list. Bad rankings might make for more clicks, but I hope people realize that’s their agenda. Clicks, nothing more. When and if their big-budget blog attracts some readers, people new to the law school world might put some stock in their rankings. That’s a big problem for me.
The Gold Standard: The #1 Law School Transparency Employment Score Reports
The Law School Transparency Employment Score isn’t really a ‘ranking’ as such — it’s probably best viewed as an intelligent alternative to traditional law school rankings.
In our chart, we listed the top 14 schools in order of best employment score. The employment score provides a quick look at the percentage of graduates employed in long-term, full-time positions (excluding solo practitioners). Though we placed them in order here, Law School Transparency wisely warns against putting too much stock in rankings. Instead, the idea is to look in detail at each school. Go to their score reports and you can view a wealth of data that will help you actually evaluate each school’s employment numbers. With this data, Law School Transparency’s reports help inform you about the only dimension that truly matters: a law school’s ability to get you a job.
Here is a screen capture of one of their score reports:
This report shows the University of Michigan’s Employment Score (the percentage of graduates in full-time, long-term positions requiring a J.D., excluding solo practice, nine months after graduation), its Under-Employment Score (the percentage of graduates in jobs not requiring a J.D.), and its ‘Unknown Score,’ the percentage of graduates whose status or employment type is unknown.
Now, Law School Transparency are the good guys, so they admit their list has limitations (read the disclosure here). The biggest is that it does not examine the quality of the underlying jobs. In their own words: “we treat all long-term, full-time legal jobs with employers the same. For example, a job with a large law firm counts the same as a job with a very small law firm, even though we have data for this distinction. We do not, however, have data for distinguishing among lawyer jobs at large law firms. Wide variances by pay, prestige, practice settings, and practice specialties exist.”
As they say themselves, it should just be a starting point for your research (that, by the way, is true of any law school ranking). However, their approach gets a big ‘gold star.’ LST reports give you a clear picture of your likely fate should you attend X law school. There is detailed enrollment data there too if you want to look at student quality. It’s a fount of detailed, useful information, and it’s there purely to help you. It’s damned refreshing to see. I wish it had been around back when I was applying to law school in 2008; you had to be Sherlock Holmes to find honest employment data back then, if you could find it at all.
Just a note: some of you might be wondering why Yale sits so far down the list when you line the schools up by employment score. It has nothing to do with their merit. Yale law grads have a higher tendency to take non-law jobs; they have the opportunity to do a wide range of things, and some of them take it. It’s simple self-selection.
The Best Synthesis: The #2 Above The Law Rankings
Now the whole argument behind Law School Transparency’s score reports is that you can’t reduce complex data to a single index in a meaningful way. In other words, rankings will always be meaningless. There’s a lot in that, but if it’s possible to do a meaningful ranking for the top schools, I think Above The Law’s Rankings have managed it.
Above The Law’s rankings are going to be most useful in tandem with LST’s score reports, because they do a passable job of measuring what LST did not: the quality of the jobs that students can get from a given school. Here’s where they are coming from:
“The basic premise underlying the ATL approach to ranking schools: the economics of the legal job market are so out of balance that it is proper to consider some legal jobs as more equal than others. In other words, a position as an associate with a large firm is a “better” employment outcome than becoming a temp doc reviewer or even an associate with a small local firm. That might seem crassly elitist, but then again only the Biglaw associate has a plausible prospect of paying off his student loans.” (source)
You may or may not like the way they framed it, but the fact is that right now some schools put you in a better position to pay off your debt than others (even when you factor in PAYE). Anyways, that’s their agenda — to make you aware that not all law jobs are created equal, and to tell you which schools send their graduates on to the good stuff. Let’s look at their methods for a moment:
- Placement in SCOTUS clerkships (7.5%)
- Number of alumni who are active federal judges (7.5%)
- ATL alumni ratings (10%): surveys of each school’s alumni
- Education costs (15%)
- Employment score (30%): full-time, long-term jobs requiring bar passage, like LST’s employment score
- Quality jobs score (30%): placement in the highest-paying law firms (using the National Law Journal’s “NLJ 250”) and the percentage of graduates obtaining federal judicial clerkships
The focus here is purely on outcomes.
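Lay ATL’s published weights out the same way as before and the emphasis becomes concrete. A quick sketch (the category names are my own shorthand; the percentages are from the list above, and the grouping into “outcome” metrics is my own reading):

```python
# ATL's stated category weights, keyed by my own shorthand names.
atl_weights = {
    "scotus_clerkships": 0.075,
    "federal_judge_alumni": 0.075,
    "alumni_ratings": 0.10,
    "education_costs": 0.15,
    "employment_score": 0.30,
    "quality_jobs_score": 0.30,
}

# Every category except alumni ratings and cost measures where
# graduates actually end up working.
outcome_metrics = {"scotus_clerkships", "federal_judge_alumni",
                   "employment_score", "quality_jobs_score"}
outcome_share = sum(w for k, w in atl_weights.items() if k in outcome_metrics)

print(f"Total weight: {sum(atl_weights.values()):.2f}")
print(f"Outcome-focused share: {outcome_share:.0%}")
```

Three quarters of the index rides directly on employment outcomes; compare that with Tipping The Scales, which spends half its weight on LSAT medians and acceptance rates.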
Now, I saw some complaints on the internet about the first two categories. None of them had much merit. These categories are an excellent proxy for a given school’s prestige. You can criticize the notion of ranking schools by prestige all you want, but meanwhile employers make their decisions about where to hire from on the basis of school prestige. If it’s a problem, it’s a systemic problem, and not one Above The Law can do much about.
A ranking that takes good account of prestige can accurately tell you where students get hired for the good gigs, and this one does: the first two metrics fill in detail while the quality jobs score does most of the work. If you want to critique prestige rankings, critique the student quality scores (LSAT, acceptance rate, etc.) used in other rankings. Those are also just measurements of prestige, only more remote from why prestige matters. Prestige is what gets you a good job, so why not just measure a school by the jobs it can get you? That’s what ATL has done.
Manhattan LSAT’s blogger Mary Adkins made the point that these rankings are for a particular set of people. That’s true. Once you leave the top 14, the lower schools start performing very poorly by these metrics (note the index scores on their ranking page). That, of course, is the point: right now, the good jobs are scarce. More than ever, they are going to a select group. Above The Law’s aim is to help law school hopefuls understand that, and they’ve done an excellent job.
Three cheers especially for their explanation of the rankings, which lacks double-talk and nonsense about how “simple metrics are the best.” If and when lawschooli.com gets presumptuous enough to rank the law schools, we’ll be copying the straightforward approach and not the Tipping The Scales one.
The Terrible Dinosaur: The #3 US News And World Report Rankings
I won’t be able to say anything about US News that hasn’t already been said. They rely heavily on opinion surveys, which do a good job of measuring prestige and, thereby, the job prospects you’ll get from a school. For those interested, you can read more about their methodology.
These opinion surveys are the subject of a lot of back and forth. The more idiotic objections are that they are really, really subjective (for instance, Tipping The Scales says: “Those opinion surveys are little more than popularity contests because deans and faculty have only limited knowledge of what is going on at schools other than their own”). That might be true, but the surveys accurately reflect the very same beliefs about law schools that drive hiring practices. In that sense they provide a useful metric.
The more serious problem is that the surveys might create a feedback loop: opinion surveys guide placement in the rankings, which then shapes the next round of surveys, and so on. Schools might be unable to improve their reputation despite making strong real-world improvements. My personal opinion, however, is that US News actually takes good account of real changes.
So what, if anything, is really wrong with the US News rankings? USNWR rankings are so dominant and influential that it’s not clear whether reality shapes them or they shape reality. Law schools change their behavior based on how US News ranks them. The real problem with US News is how they have handled this role and the incentives they’ve given law schools, some of which are very harmful. I’m not sure it’s fair to call the US News rankings “evil,” but let’s do it anyways.
Don’t like that you need a 172 LSAT for a shot at your dream school? Blame US News. Schools can jostle each other in the rankings by stealing students with higher numbers from each other, leading them to ignore other, less quantifiable factors that might indicate a candidate’s potential as a law student.
Don’t like how much law school costs? You can blame a lot of that on US News too. US News’s ranking methodology places a lot of weight on expenditures per student, so schools race each other to get ‘better’ (more expensive) everything: instruction, facilities, clinics, you name it. All of this eventually goes on the students’ tab. You can debate exactly how much US News has egged on the enormous growth in law school costs, but there is little question their role is significant.
US News is in a position of power. The problem is that they don’t seem to want to do anything good with it, like lowering law school costs, which they likely could do by no longer ranking schools on expenditures. It seems that as long as they sell magazines and keep their dominant position, they are happy. I would rather the rankings crown were passed to someone responsible like Above The Law. Better still, just use the score reports.
There. I’m out of breath, so let’s sum up again:
#1 – Law School Transparency’s Score Reports: Incredibly helpful and user-friendly. Use should be required before you can apply to law school.
#2 – Above The Law Rankings: Makes an important point. Shows you where to get the best pair of golden handcuffs.
#3 – US News: Still accurate but still evil.
#4 – Tipping the Scales: Nothing more than a conversation piece. Laughably self-serious for something so imbecilic. I hope no one’s fooled.