# How to Fill Out the US News Law School Rankings Form

### 12 Responses

This is just brilliant, Dan!

2. Tim Zinnecker says:

How about tossing the form in the trash can?

3. anon says:

Ever heard of decimal numbers? Problem solved.

But the question isn’t whether each individual dean’s rankings can appropriately distinguish between law schools. The only real question is whether the average rankings across all deans do, and those are two different things. For example, say 98% of deans give Yale a 5 and 2% give it a 4; 94% give Stanford a 5 and 6% give it a 4; and 75% give NYU a 5, 24% give it a 4, and 1% give it a 3. Yale ends up with a reputational score of 5.0, Stanford gets a 4.9, and NYU gets a 4.7.

The point is, because we’re aggregating scores, what ends up mattering is not just what scores deans can give, but also how those scores are distributed over a large number of deans. That will make the results more accurately reflect reality, even if most deans are giving schools 5’s against their will.
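The arithmetic behind the example above can be sketched in a few lines of Python. The percentages are the commenter's hypothetical vote distributions, not real survey data:

```python
# Hypothetical shares of deans giving each 1-5 score (from the example above).
distributions = {
    "Yale": {5: 0.98, 4: 0.02},
    "Stanford": {5: 0.94, 4: 0.06},
    "NYU": {5: 0.75, 4: 0.24, 3: 0.01},
}

for school, votes in distributions.items():
    # Weighted average of the 1-5 scores across all responding deans.
    avg = sum(score * share for score, share in votes.items())
    print(f"{school}: {avg:.2f} (published as {round(avg, 1)})")
```

Even though every dean hands in a coarse integer, the aggregate separates the three schools: Yale averages 4.98, Stanford 4.94, and NYU 4.74, which round to the published 5.0, 4.9, and 4.7.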

All that being said, your general point that giving deans a scale of only 1 to 5, rather than, say, 1 to 100 will make for less precise results is true, and I don’t really see what harm there would be in allowing for finer gradations.

5. Problem solved says:

Here’s what you do, Dean Solove: George Washington gets a 5. Every other school ranked anywhere near you gets a 1. Schools in the bottom 100 where your buddies are deans can have 3s or 4s.

Then your ranking goes up (or barely keeps pace, as your peers race to out-manipulate you) and U.S. News gets to publish its “rankings.”

6. Daniel Solove says:

But how many deans, acting in good faith on a 1-5 scale, would give Yale or Harvard a 4? Or a 3? The deans doing this likely either have a sense of law school reputation that diverges vastly from the norm, or they’re gaming the system.

7. notgood says:

4,
Considering that publishing reputations/rankings tends to reinforce them, we might steer away from finer gradations if we thought they would increase the already-present distortion of individuals’ personal valuations of the schools, and that such an increase was undesirable. From a consumer’s perspective, however, I do appreciate the additional detail.

8. Orin Kerr says:

Excellent questions, Dan.

9. Question says:

Regardless of the score, the “reputation” factor is self-serving. What region are the practitioners and judges from? I’m guessing the bulk of them are from NYC, Boston, LA, DC, and other “elite” cities. Because of that bias, I doubt a school from the South or smaller cities will ever have reputation scores as high as schools in those “elite” cities.

I know respondents are “supposed to” skip rating schools they are unfamiliar with, but is that really the case? I’m sure a lot of people mark schools they are not familiar with as “1” – if they are not familiar with them, they are not “reputable.” Since we don’t know the people surveyed or their geographic distribution, I don’t see how they can use “reputation” as such a strong factor in the rankings.

10. Justice Scalia says:

I’ve got an idea.

Why don’t the powers that be at US News rank the law school deans first (on a 1-5 scale) and then weight each dean’s responses so that the higher-ranked deans have more influence?

That’d be great. More worthless statistics.

11. Anon says:

Those who suggest that decimal points, or small numbers of people down-voting a school, fix the problem are confusing precision with accuracy. A number isn’t more accurate just because it has more decimal places.

For example, suppose that I’m thinking of a number between one and five, and all of you try to guess what it is. The fact that we average your guesses doesn’t make the result any more “accurate.” In fact, any decimal places in the average would guarantee that it misses the whole number I’m thinking of; only if you all unanimously guessed the actual number would the average be accurate. I’m not suggesting that “reputation” is the same as an imagined number, but treating more decimal places as a proxy for accuracy is ludicrous.

Further, the idea that we can accurately declare a school that gets 98% 5s to be better than a school that gets 96% 5s (from a different base of voters, no less, if the deans are truly only voting on schools about which they have some knowledge) is absurd. What’s worse, the “reputation” is largely influenced by last year’s ranking in US News, isn’t it? Yale is not a better school than Harvard. Nor is Harvard a better school than Michigan. Or Cornell or Stanford or any of the others. The rankings are the journalists revenge on law schools and their friends who went to law school instead of journalism school.

P.S. The number I was thinking of was 3.
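The guessing-game point can be made concrete with a toy calculation. The guesses below are made up for illustration; the secret number is the commenter's 3:

```python
import statistics

secret = 3                      # the number the commenter was thinking of
guesses = [2, 3, 4, 5]          # hypothetical guesses from four readers

avg = statistics.mean(guesses)  # 3.5: extra precision, but not the answer
print(avg, avg == secret)       # 3.5 False
```

Whenever the average carries a fractional part, it cannot equal the whole number being guessed, no matter how many decimal places it is reported to: precision without accuracy.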

12. Anon says:

I wish there were no typo in the word journalists above, but that is the uneditable nature of an anonymous post.