Are Alternative Law School Rankings Any Better than US News?

8 Responses

  1. Anthony says:

    None of the issues you raise indicate “serious flaws” with those rankings. Of course elite law firm placement says nothing about AUSA jobs, law professor placement, etc. I said that in my article, and I told the WSJ reporter the same thing (though it didn’t make the article). But I’ve never argued that my elite law firm placement rankings correlate with clerkship placement, and have never advocated that law schools should abandon every other goal in order to maximize national firm placement. The purpose of my elite law firm placement rankings is to measure elite law firm placement, and the intended audience for my rankings is prospective law students who want to work in elite law firms.

    Now this isn’t to say that there aren’t problems with how some of these ranking systems are actually used. For instance, if U.S. News claims that its rankings measure the earning power of law school graduates, then there is clearly a disconnect between the stated purpose of the rankings (measuring earning power) and the methodology employed by the ranker (inputs like the # of volumes in the law library have nothing to do with the earning power of graduates — sticking another 10,000 books in the law library isn’t going to get graduates more jobs or higher salaries). Similarly, there would be a disconnect between purpose and audience if U.S. News marketed its rankings exclusively to law professors (who likely have little or no interest in the earning power of graduates, but are more interested in factors like academic reputation, professor salaries, faculty teaching loads, etc.).

    However, I don’t think any of the rankings listed (with the exception of U.S. News, and possibly some of Leiter’s survey-based rankings) suffer from very serious purpose/methodology or purpose/audience problems (though obviously some improvements could be made to all of them, like having the SCOTUS rankings adjust for school size). If people are using my elite firm placement study as a proxy for public defender placement, or if people are using law journal citations to measure law school academic reputation, then the problem is not with the original study but with the people who are using those rankings improperly.

  2. Bruce Boyden says:

    I also don’t think “elite law firm placement” is a bad measuring stick; but the problem is coming up with the right denominator. I suspect that students who place well at elite law firms are equally able to place well at whatever they’d like to do, whether it’s public interest, government, or what have you. It’s just that firm placements are easier to count. So, you could measure school quality by dividing the number of placements in the Am Law 200 by all those who tried to get such positions.

    Of course, that’s the rub. The easiest denominator is just the total number of students in the graduating class. But using that figure assumes that the proportion of students who try to get large firm jobs is constant across all schools. But that’s probably false, at least toward the top of the rankings. Also, using firm placement probably advantages schools that are local maximums — the best of the closest, that is. I think this is why Chicago comes out so well in Anthony’s rankings. Still, I don’t think anything about such a ranking says that firm jobs are better than other jobs, they’re just easier to measure.

  3. Anonymous Person says:

    An interesting alternative to Mr. Ciolli’s study, inspired by Professor Boyden’s comment, might be to consider elite law firm placement but analyzing only placement outside of the locality of the school. That is, for example, don’t consider how Columbia places in NYC, but how Columbia places in other markets. The result would evidence which schools are really “national” ones.

  4. Anthony says:

    Anonymous Person: That information is already in my study–see Appendices A (regional TQS) and C (regional per capita). The national rankings that appeared in the WSJ (Appendix B in the paper) are an amalgamation of the nine regional rankings.

    Bruce: Actually, local schools, including elite local schools, tend to do worse in their home regions than elite non-local schools. For instance, in my study Chicago placed better than NYU in the Mid Atlantic region, but NYU placed better than Chicago in the Midwest. If you skim the placement rankings for individual regions I mentioned above, you’ll see that this was a pretty consistent trend.

    Why this counterintuitive result? My theory: supply and demand. Yes, all else equal, a prestigious local school will likely have advantages over an equally prestigious out-of-state school that would translate into better job placement (e.g. strong local alumni network, easy for local firms to do on campus recruiting, etc.). But all else isn’t equal–for one thing, students from the prestigious local school are in very high supply, whereas students from the prestigious out-of-state school are relatively scarce. As a result, firms can afford to be more picky with students from the local school compared to students from the out-of-state school, explaining why Chicago students place better than NYU students in New York and NYU students place better than Chicago students in Chicago.

    Of course there are other possible explanations. For instance, one could argue that NYU students who want to work in Chicago need higher grades to get an elite firm job than NYU students who want to work in New York, and thus lots of NYU students who want to work in Chicago try to get jobs there, fail because of their grades, and then “settle” for New York, giving the impression that NYU places better in Chicago than it really does. I don’t buy this explanation because I have seen no evidence suggesting that 1) the grade distribution of out-of-state job seekers (at elite law schools at least) differs significantly from the grade distribution of in-state job seekers or that 2) most law students are highly geographically flexible (in fact anecdotal evidence suggests the opposite). Of course, the fact that the relevant data either doesn’t exist or is locked away in the registrar’s or career services office makes it impossible to conclusively prove or disprove this claim.

    As for the denominator, I agree that attempts would be ideal, but that data either doesn’t exist or isn’t available. However, I don’t think students in the graduating class is a good denominator (in fact Leiter’s improper use of that denominator in his employment study is one of the reasons I decided to do a study of my own). You’re correct that law schools differ significantly in the percentage of students who pursue firm vs. non-firm careers. For instance, for the period of my study, 80% of Columbia grads pursued firm careers while only 45% of Yale grads went to firms directly after graduation. Of course a lot of the people not going into firms were clerking, so an adjustment had to be made there (I saw a Yale career services study that indicated that clerks go into law firms after their clerkships at almost the same rate as non-clerks, so at least that adjustment was relatively easy to make).

    However, regional preferences are an extremely important factor that needs to be taken into account, especially given the audience for such a study–after all, when it comes down to it most law students aren’t thinking “Which law school will maximize my chances of getting a great law firm job anywhere in the United States?” (though I’m sure some do) but rather “Which law school will maximize my chances of getting a great law firm job in [insert name of preferred legal market]?” Regional adjustments are particularly important because legal markets differ dramatically. For instance, at the time of my study the Mid-Atlantic region had almost double the number of jobs (and better average quality of jobs) compared to the Pacific region. Since 78% of Columbia’s class works in the Mid-Atlantic region while only 15% of Stanford’s class works in that region, having the denominator be total class size (or even total class size reduced by those who never attempted to get a firm job) without making any adjustment for student geographical preferences or regional legal market conditions would produce very flawed results that a prospective student couldn’t rely upon. So, ideally the denominator would be the total number of students who attempt to get a job at a law firm in Region A. Then, once you have data for Regions A, B, C, D, E, and so on, you can aggregate it (likely after making some more adjustments) in order to measure national placement as best as possible.
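    The denominator logic Anthony describes can be sketched in a few lines. This is a minimal illustration, not the study’s actual methodology: the function name and the figures are hypothetical (loosely echoing the percentages mentioned above), and the clerkship adjustment simply folds clerks back into the pool of firm-job seekers, per the Yale career services finding he cites.

    ```python
    def regional_placement_rate(placements, class_size, pct_seeking_firms,
                                clerks, clerk_to_firm_rate=1.0):
        """Estimate elite-firm placement rate for one region.

        The denominator approximates "students who sought a firm job in
        this region": graduates going straight to firms, plus clerks
        assumed to enter firms after their clerkships at roughly the
        same rate as non-clerks.
        """
        seekers = class_size * pct_seeking_firms + clerks * clerk_to_firm_rate
        return placements / seekers

    # Hypothetical school: 200 graduates, 45% go straight to firms,
    # 60 clerk first, and 85 land elite-firm jobs in the region.
    rate = regional_placement_rate(placements=85, class_size=200,
                                   pct_seeking_firms=0.45, clerks=60)
    print(round(rate, 3))  # 85 / (90 + 60) seekers
    ```

    Using raw class size as the denominator instead would treat the 45%-to-firms school and an 80%-to-firms school identically, which is exactly the distortion being objected to.
    
    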

  5. Anthony says:

    Ugh, after rereading that I wish I had proofread before hitting the post button. Guess that’s what I get for working on a lengthy screed while doing MBE questions.

  6. I’ve long advocated ranking Law Schools by their access to a quality Major League Baseball park…coincidentally, my alma mater, Maryland Law, is right up the street from Camden Yards and I believe would be ranked first using my totally objective methodology.

    I realize my method would hurt some over-hyped schools such as Yale, UVA and Duke but, hey – prospective students and employers need to know all the facts.

  7. ADERUS MILAN says:

    People are agonizing over these rankings, wondering if they should exist at all…and the answer is a resounding YES! Then we ask what methods are best, and most have at least some methods, even those that put location on par with prestige. Rankings force businesses and other entities to up their performances and that is always a good thing for any consumer. If there’s one dysfunction we can all look at, it is the employers (mainly prestigious law firms) that put too much emphasis on rankings…and it has a nasty trickle-down (to the students and their LSAT scores and, to a lesser extent, grades) effect. Firms need to start doing the legwork to look for the John Edwardses (UNC) and Johnnie Cochrans (Loyola, CA) of the world. And they could even search more for those diamonds in the rough at third-tier schools (and there are lots of them). Law students study virtually the same material, cases and the like. No ranking can determine how a student uses his education, no matter what school he attends. There are many so-called “lower-tiered” schools I would love to attend, but I can’t, because I’m scared I won’t have a job when I graduate. And I know I’ll be a great attorney no matter where I go, but firms don’t recognize that this is the case for many…and that many great lawyers do not come from top-10 schools.

  8. ADERUS MILAN says:

    I meant to say that most rankings have at least some validity