Category: Law School (Rankings)


A Sarbanes-Oxley Act for the U.S. News Rankings

Love them or hate them, the U.S. News and World Report rankings have serious implications. If a school rises on the list, it may become more desirable, attracting more applicants and better hires, and the opposite may happen if a school’s ranking drops. With so much at stake, and with so many complicated factors to be calculated and self-reported by each school, moral hazard is inevitably present. And that’s a troubling incentive, for the same reasons that the pressure to “make the numbers” each quarter is problematic in the corporate context.

But at least in the corporate arena, the accuracy of the data reported is audited, and the CEOs of the companies have to certify that the data is correct. This was re-emphasized by the Sarbanes-Oxley Act, passed in 2002, whose goals are better corporate governance as well as accuracy and transparency in reporting. With the U.S. News rankings, we’re talking about a magazine, a private entity, that reports the data as it is given to it.

While I’m confident – or at least, I hope – that the vast majority of the reporting is above-board, I’ve also heard rumors about situations that seem slightly shaky. Some folks – while themselves professing to report everything correctly – think that other schools are not being upfront. Perhaps not exactly dishonest, mind you, but there are suspicions that some of the numbers are the result of skillful dodges or artful interpretations. Of course, this undermines the legitimacy of the vast majority of the schools that report the numbers accurately.

So what about some type of Sarbanes-Oxley analogue for these rankings? I don’t think the system could be hurt in any way, and it might be considerably helped, by an occasional random check, every couple of years, of the numbers and how they are derived. Not all the numbers, but just enough to keep everyone on their toes. That way, even if people disagree about the rankings, about what factors should be included, about what weight to give the rankings, and so on, at least everyone is starting from the same place, and there’s more of a feeling that the information is accurate.



Rankings and Precision

A very interesting take on B-school rankings, from organizational scholar Rakesh Khurana (via Pub Sociology):

Rankings provide the illusion of scientific rigor vis-à-vis a process that actually calls for careful judgment and nuanced interpretation. It is one thing to give Wharton, Tuck, or Columbia a rating as a top business school; this leaves some room for interpretation. However, to say that Wharton is number one, Columbia number 3 and Tuck number 2 indicates a level of precision that just cannot be achieved, except on the cover of a newsmagazine and then in the minds of students.

I’ve previously suggested that law school rankings have some real benefits in reducing search costs, and I continue to think that rankings are helpful for many people. However, the problem of quantification and incommensurability, as ably discussed in Khurana’s post, is one of the real weaknesses of an ordinal ranking system like that used by U.S. News.


US News Rankings: A Chart of the Past Decade

Co-authored with Dan Filler.

The US News rankings have captivated legal academia. The rankings have had a tremendous effect on student decisions about which law schools to attend. They have also had an impact on which authors receive offers from law reviews (the letterhead effect) and the choices that authors make when faced with multiple publication offers. As one might expect, given that US News wants to sell magazines, schools move around in the rankings to some degree. Each year the rankings produce a few new “haves” and a gang of fresh “have-nots.”

We have created a chart of the trends in the US News ranking for the top 25 law schools over the past decade. Below is a small version of the chart; click on it for the full-size version. More analysis is below the chart.

As the chart demonstrates, there are some bands of stability and some areas of volatility. The same six schools have occupied the top six positions for the last decade. There has been little movement in the top 15. But below the top 15, schools dance around quite substantially.

When students choose law schools, they should remain focused on the forest and not get lost in the trees. Focusing on year-to-year changes can be misleading. For example, in 2006, Wash. U. moved up five spots, from 24 to 19. But a year earlier, it dropped from 20 to 24. What is the real Wash. U.? Over time, one can see a dramatic change: Wash. U. was in the high twenties and low thirties until it leveled out at 25 in 2002. In another example, if one looked at GW in 1998, it was ranked 20. But at that time the 20 was an anomaly, as GW was ranked in the mid-twenties in the surrounding years. Since 2004, GW has been consistently ranked either 19 or 20. To the extent that the US News rankings have any value at all, it is evident only in long-term trends, not in yearly fluctuations.

There are other instances where the US News rankings are simply a game of musical chairs for certain groups of schools. For example, Berkeley, Virginia, and Michigan have engaged in a US News game of cat-and-mouse over the past decade. When one school drops, students may become crestfallen. Prospective students may shift their preferences. However, over time, the ordering of these schools appears just to shuffle around a lot, with no discernible pattern. Relying on the US News rankings to choose among these three law schools is like choosing one’s hometown based on today’s weather report.

Below is the full data set; click on the chart for a larger image.

UPDATE: We have corrected an error pointed out in the comments. The charts now reflect Virginia’s correct 2006 ranking of #8. We also learned that a commenter at xoxohth has created even more comprehensive data charts here in Microsoft Excel format.

One more point regarding what looking at the rankings temporally tells us. For many schools, the rankings don’t change very much. And even the big changes are often simply a reflection of the fact that so many schools are tied or nearly tied; hence, a small nudge upward or downward can lead to a big fluctuation in rank. If people look at any given year and compare it to the year before, they might assume that there is some kind of progress for certain schools and some regress for others. But if they look at the big picture, there is lasting change for only a few schools.

For example, take Berkeley. From 1997 to 2006, it was ranked 7, 10, 8, 9, 7, 10, 13, 11, and 8. So it is basically where it started, but sometimes it was a “Top 10” school and sometimes it wasn’t. Of course, in the real world, Berkeley did not actually go on such a journey. If one looks at the rankings when Berkeley went from 10 to 8, she might think: “Berkeley is on the move. It’s now firmly in the top 10.” But a few years later, Berkeley would not only fall to 10, but would plunge as low as 13. One might be tempted to think: “Oh my, Berkeley’s really plunging now. They must be doing something wrong.” Now, Berkeley’s back in the top 10. Should we think “progress”? No. There’s no progress. Berkeley is basically where it always was: in the top 10, where it clearly belongs, in my opinion. The only change is where US News places it in the rankings. Therefore, looking at the rankings temporally suggests that one shouldn’t take US News ranking changes in any one year very seriously.


US News Law School Rankings: A Comparison With 1998 And 1995

Not surprisingly, there’s been some discussion of the new US News rankings here, here, here, here, and here. In an effort to produce entertaining, if ultimately useless, information, I decided to dig into my US News archives to produce comparisons between the new list and some older rankings. I refer to the rankings by year of publication, so the new rankings are 2006 (though they are marketed as 2007).

I’ve tried to do three things in this post. First, I’ve listed the schools that experienced the greatest shifts in reputational numbers comparing the 1998 rankings to the new list. I chose 1998 because that year US News switched to a 1–5 scale for measuring reputations. Second, I’ve produced a comparison of the ranking of law schools, by academic reputation, between the 1995 rankings (the oldest material in my personal files) and the new list. In 1995, US News expressed academic reputation in terms of national rank, rather than absolute numbers. Third, I’ve compared the overall US News ranking of the top 30 schools in 1995 with the new ranking.

Comparing the academic reputation numbers from 1998 and 2006 (although US News is marketing the new list as the “2007 rankings,” I will refer to all rankings by year of release), no school moved more than 0.3 points up or down. Here is a list of the schools that moved up or down 0.3. Note that only one school – Michigan State, which had just acquired Detroit College of Law – moved up 0.3. The rest all dropped.

Baylor (-.3)

Case Western Reserve (-.3)

Duquesne (-.3)

Kansas (-.3)

Michigan State (+.3)

Nebraska (-.3)

Richmond (-.3)

St. Mary’s (-.3)

South Dakota (-.3)

SMU (-.3)

Wayne State (-.3)

West Virginia Univ. (-.3)

Wisconsin (-.3)

What about those schools that had big overall moves – like George Mason (from the second tier, unranked, to 37), Washington University in St. Louis (from 29 to 19), or the University of Toledo (from the fourth tier – the bottom 20 – to 93)? Mason moved up 0.2, Wash U. went up 0.1, and Toledo actually dropped 0.1. Hawaii, which moved dramatically from 50 to 93, maintained exactly the same faculty reputation numbers.

The lawyer and judge reputation numbers showed more variation. Here are the top movers over that eight-year span:



The Big Law School Shuffle and the US News Rankings

The US News rankings are officially out (here), although advance copies floating around the blogosphere spoiled the exciting surprises.

As usual, there was some small shuffling here and there. US News sure has designed a great gimmick to captivate the world of legal academia in a near-hypnotic spell. We eagerly watch what is in essence a rather boring snail’s race, where each year, some schools inch up a few paces and some fall a few paces behind. US News gives us just enough shuffling in the race to keep us in suspense, but in reality, this race has the pacing and excitement of a 100-page law review article.

With that said, I confess I’m captivated by this silly race myself, and Paul Caron has a nifty summary of the schools making the biggest shuffles forward and backward in the race. I’m pleased that GW has inched up one notch. Now that the results are out, we’ll all have to wait until next spring to see the snails do their little shuffle again.

Last year at PrawfsBlawg, my co-blogger Kaimi had a very interesting post on the US News rankings:

Everybody loves to bash the US News rankings. Especially Brian Leiter. There is evidence that schools “game the system.” There are absurd results — precipitous drops for University of Washington and University of Kansas. There was even that dark time when the rankings placed NYU above Columbia — sacrilege by any standards, and irrefutable proof of flawed methodology. But even with all of its warts — and they are many — the US News list serves a valuable purpose. It’s cheap, accessible, and easily digestible, and it’s right more often than not. And frankly, it would be pretty ridiculous to expect much more from a $3.50 magazine. With U.S. News, the reader gets exactly what she pays for.

Exactly. We love to gripe about the US News rankings — and with good reason, for the rankings are stupid — but what should we expect from a magazine’s gimmick to sell issues? Actually, I think that the folks at US News are quite brilliant. Why should they invest the time and money to do the rankings properly? They’ve figured out a way to do the rankings cheaply yet with just enough plausibility to have them be widely accepted. They have no particular expertise in legal education, yet their rankings wield tremendous influence over it. They’ve figured out a way to shuffle up the rankings just enough each year so that we keep coming back to find out what’s going on. We gripe and gripe about it in the legal academy, yet what do we do about it? We still play along with US News. Of course, we have to, since so many folks take the rankings seriously. However, what’s to stop us from working on developing alternative ranking systems, as Brian Leiter has done? Or at the very least, why don’t we try to work with US News to get them to improve upon their rankings? Until that time, we’ll continue to be slaves to a magazine.


Law Review Citations and Law School Rankings

There’s no shortage of writing on law reviews or law school rankings, to say the least. So why not combine the two?

Questions about law review rankings abound. How does one compare offers from journals at relatively equal schools? Is it better to publish with a journal that is more frequently cited or with one at a higher-ranked law school? Is it better to publish with the main law journal at a top-40ish law school or with a secondary journal at a top-10 law school? Questions about law school rankings abound as well, particularly for schools outside of the top 30 or so. (Or so it seems to me.)

I’m partial to citation studies as a way of judging quality. I know that citations have lots of problems as a way of ranking journals (or individual authors). However, I like the objectivity citation studies provide. And so I’m partial to the Washington and Lee Law Library’s website, which provides comprehensive data on citations to hundreds of law journals by other journals and by courts. I’ve found it useful in trying to draw some comparisons between journals. Other people often draw comparisons between journals by looking to the US News ranking of the journal’s school.



On Rankings Bias; or, Why Leiter’s rankings make Texas look good — and why that’s not a bad thing

Recent blog posts by Paul Caron and Gordon Smith note that creators of alternative law school rankings often seem to create ranking systems on which their own schools excel. A possible implication — not an implication that Smith or Caron makes, but one that various Leiter critics have been making for some time — is that these alternative rankings are merely a form of naked self-promotion by their creators. In its simplest form, this argument would go something like this: “Brian Leiter promotes his rankings because they rank Texas higher than U.S. News does, and this makes Leiter look better.”

In response, Leiter has asserted through blog posts and comments that his rankings do not necessarily make Texas look better. His recent statements focus on the fact that he lists student quality on his new rankings page. He writes:

My institution, Texas, ranks 8th in faculty quality measured by reputation, 9th in faculty quality measured by impact, and 16th or 18th in student quality, depending on the measure used. Texas ranks 15th in US News, as it has for quite some time now. Texas thus ranks both more highly and more lowly in my ranking systems, depending on the measures used.

This is a singularly unconvincing fig leaf. Everyone knows that the 2000 and 2002 Leiter rankings did not weight student quality particularly heavily; they measured mostly faculty reputation, and they clearly gave an edge to Leiter’s school. (This is readily apparent from a look at Leiter’s archives section). Thus, for some time now, the Leiter rankings have placed Texas higher than the U.S. News list.

Is this cause for concern? Does this suggest that the Leiter rankings are simply self-promotion? Actually, there is a much more innocuous explanation.
