Category: Law School (Rankings)


Bob Morse’s Stunning Complacency

Why is Bob Morse, US News’s ranking guru, so unafraid of competition? There’s plenty of evidence that his rankings are fragile – for example, changes at the “bottom” of the scale have unexpected effects on schools at the top because the magazine engages in a statistically bizarre forced normalization – and that they measure factors with little relevance to any underlying “quality” that matters in the market for legal jobs.  But he persists in acting like an incumbent politician, prioritizing optics over real reform.

Here’s an example.  A few weeks back, Bob Morse issued a stern warning to law school administrators out to game his rankings.  In response to a problem created by “openness about our ranking model,” Morse took a strong step in the direction of reform by…wait for it…threatening certain schools with punishment for gaming their employed-at-graduation statistic.  For those who follow the rankings, this was a particularly galling and obnoxious post.  The rankings model isn’t at all “open”: for most categories of concern, US News engages in hidden manipulations of dubious value that make replicating the results quite difficult.  See, e.g., LSAT percentile scoring, COLA adjustments, normalization, and the treatment of missing data.  Indeed, the rankings would likely fail the very low bar for openness and replication set by even a student-edited law journal, let alone a peer-reviewed publication.  And the irony is that savvy administrators would find the employed-at-graduation statistic of very little interest, because its relative contribution to a school’s rank is significantly dampened by its lack of variance.  To put it another way, Morse seized on “gaming” of the factor with the second-weakest connection to overall ranking success (ahead of only library volumes).  Leiter said it best: “fiddling while Rome burns.”

Why did Morse’s team focus on this particular statistic, as opposed to working on real reform? It’s all about the perception of legitimacy: an increasing number of schools weren’t reporting their employment numbers (because the formula for imputing missing values in this input factor produced a helpful number).  When Paul Caron pointed this out, it was embarrassing for Morse, especially once Caron’s post was picked up widely.  Real reform might produce dramatic changes that call earlier rankings into question, and it would add cost besides.  This change, on the other hand, will have at most a marginal effect on scores, and it costs essentially nothing.  Mission accomplished!

Still, the question remains unanswered: what makes Bob Morse so convinced that incumbency renders his flawed product insensitive to normal market corrections?


US News Law School Rankings

I wanted to make an observation about the rankings that were released yesterday.  Duke Law School achieved something truly remarkable.  All of their 2008 graduates (100%) were employed in the six months following graduation.  No other school — not even Harvard or Yale — managed this.  And to have that result as the stock market was imploding is even more amazing.

Or Duke’s claim is just . . . er . . . false.  But I would never accuse a Top 11 school of misrepresenting the facts.


Robert Morse’s Response on the US News Law School Rankings

Over at the WSJ Law Blog, Ashby Jones contacted Robert Morse to get his reaction to my post about how raters should fill out the US News law school rankings forms:

We caught up with Bob Morse, the director of data services for U.S. News, who said in his estimation, the 1-5 options generally speaking matched up with the level of knowledge held by the raters. “We’ve felt that the level of judgment isn’t granular enough to provide a wider scale.”

He also said that because the survey reports the results of the reputation question out to the tenths place, “we’re actually publishing it on a scale of 50; the results average out to be more granular.”

Morse defends the granularity of the US News rankings by pointing out that the average scores do have decimal points.  That is true, but it doesn’t address the problem I pointed out in my post — the individuals doing the ratings can’t accurately reflect their sense of how one school compares to another.

Who gives Yale a 4 on only a 1-5 scale?  Or Harvard?  Either this person has a very different theory of law school reputation or is trying to game the system.

Let’s say that there are 10 reviewers, and they rate as follows:

Yale scores: 5, 5, 5, 5, 5, 5, 5, 5, 5, 4 = 4.9 average

Harvard scores: 5, 5, 5, 5, 5, 5, 5, 5, 4, 3 = 4.7 average

Morse would conclude that the difference between Yale and Harvard is meaningful.  I would conclude that the difference is attributable to either (1) a fluke due to quirky beliefs of a very small number of raters; or (2) gaming by some raters.  I just don’t see how, on a 1-5 scale, Yale or Harvard would get any less than a 5 on all the forms.  Their averages should both be 5.0.  Any differences are the result of flukes or gaming and shouldn’t be taken seriously.
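To make the arithmetic concrete, here’s a quick back-of-the-envelope sketch in Python, using the hypothetical ten-rater panel above (not real survey data), showing how a single dissenting vote on a coarse 1-5 scale moves the published average:

```python
# Hypothetical 1-5 reputation scores from the example above (ten raters each).

def published_average(scores):
    """Average the ratings and round to tenths, the precision US News reports."""
    return round(sum(scores) / len(scores), 1)

yale = [5] * 9 + [4]          # nine 5s, one 4
harvard = [5] * 8 + [4, 3]    # eight 5s, one 4, one 3

print(published_average([5] * 10))  # 5.0 -- a unanimous panel
print(published_average(yale))      # 4.9 -- one dissenter costs a tenth of a point
print(published_average(harvard))   # 4.7 -- two dissenters open a "meaningful" gap
```

The whole difference between a 5.0 and a 4.7 is the work of one or two raters on the panel.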

These problems exist beyond Yale and Harvard — they persist for the entire US News survey because there’s not enough granularity.  If there’s a granularity problem for individual raters that makes their ratings flawed, then the problem doesn’t just disappear by aggregating flawed ratings.

With all due respect to Morse, I must also disagree with his first point.  As my post demonstrated, Morse is wrong that “the level of judgment isn’t granular enough to provide a wider scale.”  If he’s right, then readers of my post should conclude that my hypothetical dean is way outside the norm.  But I’m willing to bet that, among those in the academy, most raters would rank the schools I mentioned in the post (Yale, Michigan, Cornell, USC, Emory, and American) in the order listed, and would agree that they sit at different reputational levels.

If this is the consensus, then Morse’s 5 point scale isn’t granular enough for individual raters.  And aggregating scores that are assigned based on a system of rating that isn’t granular enough doesn’t fix the problem.  It just means that the outliers control the outcome, and those outliers are either people with views way outside the norm or people who are gaming the system.  I don’t think we want the ratings to turn on what the outliers are doing.

I appreciate Morse’s response to Ashby Jones, and would be interested in his response to my points above.

Please note that I’m not an expert on statistics, so I’m open-minded about my claims.  If there’s a statistics expert among our readers, I’d be very interested in your thoughts.


Rearranging Prof. Leiter’s Data

Just as law school deans heed the US News rankings, for better or worse, they may heed scholarly impact rankings, including a recently released tabulation by U. Chicago Professor Brian Leiter. It counts highly cited scholars over the past five years, with attribution and ordering (a) by school, (b) within school, and (c) across a dozen subject areas. Hats off to Professor Leiter for taking what must have been scores of hours to assemble the data.

When studying it, deans may find it worthwhile to rearrange the data in any number of ways. Most rearrangements will likely track the results as presented, though some interesting variations may appear. As an example, Professor Leiter’s report ranks schools according to the weighted citations of all scholars within the school. An alternative would rank schools according to the number of times a school’s scholars are named in the high-impact lists by subject matter.

In that rearrangement, there are small differences in the rankings of the top seven schools, but many more marked differences further down the line.  Following are the top seven, ranked by the number of times their scholars appear in the high-impact rankings by subject (that number appears in parentheses, followed in separate parentheses by the ranking in Professor Leiter’s report).
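For readers who want to replicate the exercise, here is a minimal sketch of the alternative tally described above. The data are entirely hypothetical stand-ins, not Professor Leiter’s actual lists:

```python
# Minimal sketch (hypothetical data): rank schools by how many times their
# scholars appear in the subject-area high-impact lists, rather than by
# weighted citation totals.

from collections import Counter

# Hypothetical mapping of subject-area list -> home schools of the scholars named.
subject_lists = {
    "Constitutional Law": ["Yale", "Harvard", "Chicago"],
    "Torts":              ["Harvard", "NYU"],
    "Law & Economics":    ["Chicago", "Yale", "Chicago"],
}

appearances = Counter(school for scholars in subject_lists.values() for school in scholars)
for school, count in appearances.most_common():
    print(f"{school}: {count} appearance(s) in subject-area lists")
```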


How to Fill Out the US News Law School Rankings Form

Every year, US News compiles its law school rankings by relying heavily on reputation ratings by law professors (mainly deans and associate deans) and by practitioners and judges.  Raters are asked to assign a score (from 1 to 5) to each of the roughly 200 law schools on the form.  A 5 is the highest score and a 1 is the lowest.  While many factors that go into the US News ranking have been criticized, the reputation ratings are by and large considered one of the best components of the ranking system.  But should they be?

Let’s assume a knowledgeable dean filling out the form in good faith.  How is he or she to go about it?

Here’s my hypothetical dean’s stream of consciousness:

Okay, I think Yale is the top law school, so I’ll give it a 5.

What about Michigan?  Great school, but not quite as high as Yale.  I’ll give it a 4.

Cornell is an excellent school too, one of the best.  But it’s not Yale or Harvard, so I can’t give it a 5.  It’s not as good as Michigan in my view, so I can’t give it a 4.  I gave Penn and Berkeley 4s too, and I think Cornell isn’t quite at the same level.  So it’s a 3.

What about USC?  Another excellent school, but it’s not as high as Cornell.  So it’s a 2.

Ruh-roh!  I’m not even out of the top 20, and I have 160+ law schools to assign scores to, and I only have one number left.  But I must go on!

How about Emory?  That’s a bit lower than USC in my view, so I’ll give it a 1.

What about American?  Another terrific school, but I think Emory’s better.  I can’t give American a 0.    What do I do?  Okay, I guess I’ll give it a 1 as well.

But I’m not even out of the top 50.  Yikes!  I’ve run out of numbers.  Maybe I’ll call Robert Morse and ask him if I can start assigning negative numbers.   What do I do?

Time to try some math.   To make things easy, I’ll assume there are roughly 200 law schools.  And I have 5 numbers to assign — 1, 2, 3, 4, and 5.  Assuming an equal number of schools assigned to each number, that’s 40 schools for each number.  OMG!  So I need to give schools I rank 1-40 a score of 5, schools I rank 41-80 a 4,  schools I rank 81-120 a 3, schools I rank 121-160 a 2, and every other school a 1.  But that’s ridiculous.  The law school I think ranks #40 isn’t anywhere near the law school ranked #1.  This system is impossible.

Okay, maybe I give a score of 5 to the top 10 law schools, then a score of 4 to the next 90 law schools, then 3s and 2s to schools 101-150, and 1s to the rest.   But that still lumps too many schools together.  If every person filling out the form did what I did, then there would be no way to distinguish the top 10 from one another, and no way to distinguish among the next 90.  Only real outlier scores would determine the difference.

Dear Mr. Morse — what am I to do?  Please help me!

Does anyone have any advice for our poor dean?  How are people to fill out the US News ranking forms in good faith to reflect accurately their sense of law school reputations?
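For the quantitatively inclined, here is a minimal sketch of the forced-bucketing arithmetic our dean runs through above, assuming roughly 200 schools and an equal number of schools per score (purely illustrative):

```python
# Minimal sketch (illustrative only): the dean's forced-bucketing arithmetic,
# assuming roughly 200 schools and an equal number of schools per 1-5 score.

N_SCHOOLS = 200
SCORES = [5, 4, 3, 2, 1]
bucket_size = N_SCHOOLS // len(SCORES)   # 40 schools per score

def score_for_rank(rank):
    """Map a dean's private rank (1 = best) to a 1-5 survey score."""
    return SCORES[min((rank - 1) // bucket_size, len(SCORES) - 1)]

print(score_for_rank(1))    # 5 -- the school the dean thinks is #1 ...
print(score_for_rank(40))   # 5 -- ... gets the same score as the school ranked #40
print(score_for_rank(41))   # 4
print(score_for_rank(200))  # 1
```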


Fred Yen Continuing as Guest

We are delighted that our esteemed guest blogger, Professor Alfred Yen (Boston College), who has been with us this past month (and before), will stay for another one.  (You can see my post introducing Fred, my former colleague, here.)

In March, Fred contributed an amazingly insightful, thoughtful, reflective, and useful series of seven posts called Thoughts on Choosing a Law School.   The 7-part series broke down as follows: (1) limited utility of popular rankings; (2) curriculum;  (3) faculty staffing of instruction; (4) subject matter distinction; (5) faculty strength; (6) physical facilities; and (7) faculty publishing record.

These posts were formally directed to students considering which law school’s admissions offer to accept, but they also mean a great deal to us suppliers of legal education.   We’re grateful for these contributions.  And we’re delighted Fred will be back to contribute more wisdom, on these and the many other subjects within his capacious expertise.


Exclusive: The US News Law School Rankings — Leaked Memo

A few years ago, I was able to obtain a leaked memo about the US News & World Report law school rankings.  This year, my anonymous source was again able to snatch from the desk of Robert Morse a memo describing how he ranked schools.  Without further ado, here’s the secret memo and the rankings everyone has been eagerly anticipating.



Choosing a law school, part 7

In this post, I’m going to argue that prospective students should care whether a law school’s faculty publishes. Not everyone agrees, and we’ve all had professors who were great scholars but indifferent classroom teachers. I also freely concede that teaching ability does not necessarily go hand-in-hand with scholarly ability, so that a school’s best teachers need not be its best publishers. Nevertheless, I still think that faculty who publish have a better chance of offering outstanding classes than those who do not.

To illustrate, I’ll reveal a bit about two classes I have taught: copyright and evidence. I’ve published a reasonable amount about copyright, including a casebook published by West. By contrast, I’ve published nothing about evidence, with my background in that area coming from my work as a litigator.

Students have rated both of these classes well. In fact, I don’t think there’s any significant variation in the numbers. Yet I firmly believe that I teach copyright better than evidence, because the things I learn from research and publishing enable me to give copyright a deeper and more nuanced treatment. I know more about the overall structure of the area, respond better to student questions, and challenge students in more ways in copyright than in evidence.

Now granted, I don’t think this is something that students always pick up. My evidence class is pretty “black letter,” sticking to how lawyers need to work through evidentiary problems in courtrooms. This makes sense given how students will use evidence, and I think students feel that the course serves them well. Nevertheless, I am aware that I don’t blend in the “big theory” issues as well as I could because I don’t know them that well.

By contrast, I pack a lot into my copyright course. This sometimes frustrates students. Some only want “black letter” law (something that is, at best, elusive in copyright). Some dislike what they consider theoretical digressions from what they need to know for practice. I could teach copyright to that lowest common denominator, but I choose not to. And I like to think that my students come to appreciate that the complexity they encounter ultimately serves them well when they deal with the subject’s frustrating ambiguity in practice. In short, although I teach what I think is a good, competent evidence course, the academic “ceiling” in my copyright class is much higher.

To be clear, I am not saying that publishing is the only thing that prospective students should care about in evaluating a law school’s faculty. As I suggested in an earlier post, some law schools clearly value teaching and their professors are accessible to students in ways that can matter a great deal. Students should visit schools, talk to existing students, and see if classes are well-received. Such inquiry will probably identify a number of schools that appear to have good teaching. At this point, I think it makes sense for a prospective student to then compare publication records of the faculties to see how often they will learn from professors who are at the forefront of their fields.


Choosing a law school, part 6

Every prospective student notices the physical facilities of a school when he or she visits. Wood paneling, marble floors, and grand foyers create impressions about whether a law school is well-funded and a “nice” place to study. I’d like to suggest a few other ways in which prospective students should evaluate a school’s facilities.

The most important space for students is the classroom. When you visit a school, look at some large and small classrooms and evaluate the sight lines and acoustics, preferably by sitting in on a live class. Do students sit in a pattern where they can see and hear each other? Can they hear the professor? You might be surprised at the number of classrooms where heating or air conditioning interferes with voices. This might not seem bad in the traditional lecture class you had in college, because professors can always wear a mike. But in law school, the Socratic method makes it important to hear what your classmates say. It’s impossible to follow along if you can’t. In addition to sight lines and acoustics, you might also look at the front of the room. Is there full audio-visual capability, with a computer for the professor? Is there enough whiteboard or blackboard space? Is the screen large enough for easy viewing by students?

Next, I would suggest looking at the individual and group work space available for students. Individual work space exists primarily in the library. There needs to be ample seating to support students during high demand periods like exams or major writing projects. Is there seating of the kind you prefer to work in? Long tables? Individual carrels? Big, padded chairs to sit in while reading? Is there ample Internet access, wired or wireless? You are going to spend a lot of time studying in law school. Unless you are sure that your apartment or house provides you with the space you need, you will likely spend a lot of time in these facilities.

Group work space exists in libraries and sometimes elsewhere throughout the school. How many small conference rooms are there that students can reserve? I personally wouldn’t be too happy with only a few. At certain times of the year such as moot court competitions, there is a lot of student collaboration going on, and demand for these spaces can get pretty heavy.

One other type of important student work space involves the facilities of any clinical programs. If the school has clinics where students actually represent clients, are there proper rooms where client meetings and interviews can be held, and separate areas where students can do work and maintain case files? Clinics are expensive to run, and it is not uncommon for schools to trim those costs by providing clinic facilities that don’t fully support the clinics’ work. If you think a clinic will be a big part of your legal education, this could matter.

Finally, I suggest looking at the spaces where students can gather informally. Is there a good student lounge or other gathering place like a cafeteria? Are there seats in hallways where you can sit for conversations? Granted, these amenities may not seem terribly important, but their absence impairs the creation of a community where students get to know and support each other.

All of the things mentioned here seem pretty obvious, perhaps so obvious that one would think every law school would take care of them. It may well be the case that the schools you’re comparing will all have good physical facilities. But you might also be surprised at how often schools, even some of the top schools, have facilities that don’t fully support their educational ambitions.


Choosing a law school, part 5

I thought I would say a bit about faculty – the people who teach all those classes in the curriculum. Every law school will tell you that its faculty is excellent, and with justification. Law teaching jobs are sufficiently desirable that law schools generally have many, many qualified applicants for openings. Law schools today hire very well qualified people. Nevertheless, I would like to suggest one way in which prospective students can evaluate whether a particular faculty will provide a good educational experience.

Professors come in many types. For purposes of this post, however, we can get by with a distinction between permanent faculty and part-time faculty (frequently called adjuncts). For permanent faculty, law teaching is their full-time job. Part-time faculty, as their name implies, generally have another job and devote a relatively small amount of their time to law teaching. They generally teach one class at a school, often in the early morning or evening, and they frequently do so year after year.

A good school should have the vast majority of its courses, particularly first-year courses and basic doctrinal upper-year courses, taught by permanent faculty. This is not to say that part-time faculty can’t do a good job. Many are good, dedicated teachers. Nevertheless, full-time faculty are at the school, present for students in ways that would be impossible for part-time faculty. They have more time to focus on teaching, and they bring cutting-edge expertise based on their research into the classroom. There are, of course, areas in which part-time faculty can do a better job than permanent faculty. For example, skills courses or courses focused on specialized topics related to practice (e.g., business planning) benefit from the day-to-day practical experience of adjuncts.

Accreditors attach significant importance to the principle that law students should be taught primarily by full-time faculty, and they will give law schools trouble if the principle is violated. Surprisingly, however, law schools sometimes overuse part-time faculty. This happens because, at some schools, permanent faculty do not want to teach first-year or other basic courses. Student enrollments in those classes are high, so teaching them takes more time than teaching the smaller seminars that may be more closely related to a faculty member’s research. It’s obviously hard for schools to force tenured professors to teach classes they don’t want to teach. Indeed, faculty who don’t want to teach a class may not do a good job.

For prospective students, a law school that does not put its full-time faculty in basic classes raises a question that needs to be answered. Do the school and its faculty really give sufficient priority to teaching students? Every school will of course answer yes, but sometimes actions speak louder than words.