
Category: Law School (Rankings)


Law School Capture

My blogging schtick is grousing about legal education. I do this mainly on moneylaw and classbias, and I serve as a technical advisor to privilegelaw – a blog best read from its earliest posts forward. In many respects I think legal education has been captured by, and run for the convenience of, faculty who are far more often than not the children of privilege. (If you are already preparing to comment, I ask that you skip it if the comment is about a law professor who is not a child of privilege.) As I blog along this month, these themes will become more developed. First, here is a test to examine your own school for its level of capture.

Before taking the test, some clarifications. There is good capture and bad capture. I can imagine a law school captured by the faculty and, with or without help from the administration, run for the benefit of stakeholders. This would be a faculty that constantly asks "What should we be doing?" and matches the answer against what it is actually doing. On the other hand, capture can mean that a faculty runs the law school for its own convenience, with only modest limitations imposed by others, and even then observing those limits is part of a pattern of self-interested behavior.

Second, from time to time I get an email that carries with it the assumption that all my grousing is about my own school. Wrong! The examples are not all taken from my school, and if you really called my bluff I would not bet that my school is any different from the average. So how does your school stack up on the capture quiz? (You can give your school partial points.)

1. Are classes scheduled midweek and midday even though this creates conflicts that limit student choices? (1 point for a yes.)

2. Has your school seriously reviewed any of its foreign programs, centers, institutes, or degree programs in the past two years? (1 point for a no.)

3. Does your school depend on adjuncts to teach mainline courses while offering small-enrollment specialized courses taught by full-time professors? (1 point for a yes.)

4. Does your school have a high curve that is sometimes defended by not wanting to hurt students' feelings, or by other justifications that amount to "I do not want to actually have to evaluate anyone"? (1 point for a yes.)

5. Do colleagues propose programs that are needed even though they will not actually be teaching, traveling, or receiving a reduced teaching load if the program is adopted? (1 point for a no.)

6. Can students graduate having taken half or more of their classes on a pass/fail basis? (See question 4.) (1 point for a yes.)

7. Does your school encourage massive, barely supervised externships that generate tuition dollars, provide free labor, and, by the way, mean less teaching? (1 point for a yes.)

8. Does your administration mass-mail glossy reports listing every conceivable thing faculty submit as reportable? (1 point for a yes.)

9. Does your dean appear to be afraid to suggest that the school should do better and then hold people accountable? (1 point for a yes.)

10. Is the norm that just about everyone is gone by 11 AM on Friday? (1 point for a yes.)

If you scored a 10, it’s best to go into receivership and start from scratch.

If you are in the 7-9 range you will probably be a 10 soon.

If you are 4-6, I think you are average and a few hires could move you either way.

If you are 3 or less, congratulations.
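For anyone who wants to tally mechanically, the scoring above reduces to a simple sum. Here is a minimal sketch (the function name is mine, and the handling of fractional totals that fall between the quiz's integer bands is my own guess; remember that for questions 2 and 5 a "no" earns the point, so pass in points earned, not raw yes/no answers):

```python
# Minimal capture-quiz scorer. Each entry is the points earned on one of the
# ten questions, between 0 and 1 (partial credit allowed, per the post).

def capture_score(answers):
    """Sum the ten per-question scores and map the total to the post's verdicts."""
    if len(answers) != 10:
        raise ValueError("the quiz has exactly ten questions")
    total = sum(answers)
    if total == 10:
        verdict = "go into receivership and start from scratch"
    elif total >= 7:
        verdict = "you will probably be a 10 soon"
    elif total >= 4:
        verdict = "average; a few hires could move you either way"
    else:
        verdict = "congratulations"
    return total, verdict

print(capture_score([1, 0, 1, 1, 0, 1, 0, 0, 0, 0]))
```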


Is Sorting Law School’s Only Function?

Bainbridge and others are abuzz over Rush and Matsuo's paper, Does Law School Curriculum Affect Bar Examination Passage? An Empirical Analysis of the Factors Which Were Related to Bar Examination Passage between 2001 and 2006 at a Midwestern Law School. The paper reports that simply taking "bar courses" generally does not improve performance on the Bar Exam.

The paper is clearly written but not (for me) surprising: it fits unpublished research I’ve seen, and common sense. I’d bet that a large minority of all law professors, and a majority of law professors hired since 1990, haven’t sat for the Bar in the jurisdiction hosting their law school. It would be surprising if teaching behind this veil of ignorance could significantly improve test scores for marginal students. You can’t teach to a test you haven’t seen.

But if that’s true, two questions come to mind. The first has been addressed by some commentators already, and boils down to: if not bar courses, what courses should law students take? Josh Wright responds: antitrust! Sam Kamin disagrees: professors you like! As for me, I offered the following comments in a package of diverse suggestions on this topic from my colleagues distributed to our first year students at the end of the Spring term:

I recommend that you select courses that are challenging and intrinsically interesting. This means tailoring course selection to your abilities (take a tax course, especially if you are afraid of math); and interests (recall what made you excited about the Law before coming here). The data I have seen do not correlate Bar passage with any particular package of courses, but rather with your overall performance and work ethic. Certain employers may expect to see foundational courses like corporations and evidence on your transcript, but I believe those expectations are the exception rather than the rule. The bottom line: take classes that will make you want to come to school in the morning.

Maybe such advice is helpful, maybe not. But regardless, it doesn’t answer the big (second) question, which is this: is there a point to law school beyond sorting students?


Law School Ranking: Measurement vs. Characterization

I have long been concerned about negative externalities from ranking systems. Perhaps people and institutions are always prone to try to distinguish themselves. If so, Brian Leiter’s expert consultation for the MacLean’s rankings of Canadian law schools may be a good thing, since, as he states, it resulted in “a ranking system that can not be gamed, that does not depend on self-reported data, and is not an indecipherable stew of a dozen different ingredients.”

However, Benjamin Alarie at Toronto has critiqued Leiter’s efforts. Here are a few issues:

One of the central problems with how Faculty quality is measured is that it doesn’t assess influence in publications aside from 33 Canadian law journals. As an initial matter, I think it is fair to say that academics seek to publish in places with the most active audiences for particular types of research—for example, the best journals to publish law and economics research in are likely to be American peer reviewed journals such as the Journal of Legal Studies, or the American Law and Economics Review, or even professional economics journals.

[I]t is unlikely that frequency of citation is a perfect proxy for quality; for example, overly provocative papers are sometimes cited for being so provocative.

It is unclear what the threshold used was for including the firms as among the “elite firms” used by Maclean’s.

[A national reach] measure . . . misses “international reach” of the law schools that regularly place students in the excluded top New York and Boston firms, in international NGOs, and in various other attractive positions.

[By the way, I put the links into those block quotes above.]

I think these are all valid points, but the problem is even larger. The consumers of these rankings are, by and large, students looking for a good education and firms looking for well-trained lawyers. Why so much focus on whether the law schools are feeders for “top” firms? Perhaps the best law teaching is that which manages to train people for diverse careers in law.

Finally, a more philosophical point.



Law (Professor) Blog Ranking

[UPDATES IN RED] With the assistance of our intern, Sam Yospe, I decided to update the law blog ranking project first completed by Roger Alford at Opinio Juris. The following list ranks 41 law professor blogs according to traffic (as calculated by The Truth Laid Bear). To minimize distortion, we applied average monthly data and ran the measurements about two weeks ago. This list includes only blogs that have at least one law professor as a regular blogger; we exclude blogs that focus entirely on politics or current events, and blogs that are not tracked by Truth Laid Bear. Some blogs, like Patently-O, appear to be tracked only inconsistently by TLB and are not included in this list for the time being.

While this list ranks blogs by traffic, we have also included Truth Laid Bear's own weighted rankings. TLB ranks blogs using an algorithm that accounts for a "link score," a measure of how often blogs are linked to by other blogs. While the ranking by traffic that appears below and TLB's ranking are related, the correlation appears to be statistically insignificant. For example, Bainbridge's blog is ranked second by TLB among legal blogs, yet by traffic it ranks ninth. Conversely, Sentencing Law and Policy is ranked third among all legal blogs in traffic, yet it is ranked 2,164th by TLB, a lower ranking than some legal blogs that receive less traffic.

These data suggest that there is significant heterogeneity in the audience of legal blogs, as some blogs seem to have wide audiences of readers not shared by others, and (indeed) exist in entirely different communal spaces. This fractured audience finding challenges my flat traffic thesis. Importantly, this post does not intend to suggest a thing about the relative quality of the blogs ranked, nor those that are not mentioned. This isn’t even a popularity contest.
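The mismatch between the traffic ordering and TLB's link-score ordering can be quantified with a rank correlation. A minimal, purely illustrative sketch (the five example ranks are invented, not the actual TLB or traffic figures, and a real claim of statistical insignificance would also require a p-value, e.g. from scipy.stats.spearmanr):

```python
# Spearman's rank correlation via the classic formula for tie-free ranks:
# rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), where d is the per-item rank gap.

def spearman(xs, ys):
    """Spearman's rho for two equal-length lists of ranks without ties."""
    n = len(xs)
    d2 = sum((x - y) ** 2 for x, y in zip(xs, ys))
    return 1 - 6 * d2 / (n * (n * n - 1))

traffic_rank = [1, 2, 3, 4, 5]   # rank of each blog by monthly traffic (invented)
link_rank    = [2, 1, 5, 3, 4]   # rank of the same blogs by TLB link score (invented)
print(spearman(traffic_rank, link_rank))
```

A rho near 1 would mean the two orderings largely agree; a value near 0 supports the fractured-audience reading above.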


Can Lawyers Afford Not to Play the Rankings Game?

In an article in National Jurist, rankings expert Brian Leiter was quoted as saying that “The more info and the more competing measures there are out there, the less concerned law schools will be about pleasing their U.S. News master.” In a different setting, I too have been enamored of a diversity of rankings. I’ve also hoped that law schools would more formally recognize, say, their top 10% of brief-writers, researchers, or oral advocates, elevating the visibility of those with exceptional skills in areas outside of exam-taking.

However, Leigh Jones reports that there are some costs associated with a diversity of rankings:

By some estimates, law firms have about 200 chances each year to participate in rankings, awards programs or so-called “league table” publications that they hope will distinguish them from the competition. Not only are firms finding their marketing resources stretched thin by the onslaught, but they also say it is getting tougher to wade through the rubbish. “Not a day goes by that I don’t come across another one from someone I’ve just never heard of,” said Lloyd Pearson at White & Case.

Pearson is the "communications manager at the 1,907-attorney firm," and "was brought aboard last year to handle the flood of surveys, questionnaires, phone calls and research related to awards and rankings that the firm pursues each year." What happens to firms that can't hire someone to manage the information overload?

Unfortunately, avoiding the rat race may not be much of an option. As law schools learned to their chagrin, an “echo chamber” effect can cause early ratings to become self-reinforcing. This dynamic sheds new light on lawsuits against websites that purport to rank or score lawyers. Plaintiffs may rightly worry that a low initial rating will become a self-fulfilling prophecy, handicapping their chances at getting good cases and thereby pushing them further down the pecking order.

Hat Tip: Eric Goldman.


Are Alternative Law School Rankings Any Better than US News?

The WSJ has an article on alternatives to the infamous US News law school rankings. According to the article: "In the last two years, at least a dozen upstart Web sites, academic papers and blogs have stepped in with surveys of their own to feed the hunger for information on everything from the quality of the faculty to what a school's diploma might be worth to future employers." It includes this chart of some alternative rankings of law schools:

[Chart: alternative law school rankings]

In my opinion, all of these rankings have serious flaws.

US News — The reputation surveys are only given to deans and just one or two faculty members (a very unrepresentative sample of faculty). The reputation surveys are too easy to game. And the reputation scores of 1 through 5 are not granular enough. For example, Yale has an academic reputation score of 4.9, Harvard 4.8, and Stanford 4.7. That means that people in the surveys are rating these schools with 4s or less. Who gives less than a 5 to any of these schools on a 1-5 scale? Some of the other numbers factored into the US News equation are quite silly and can be easily cooked, with schools using accounting tricks that would make Enron officials blush.

Supreme Court Clerkship Placement — This is a ridiculous way to rank schools. Getting a Supreme Court clerkship is like winning the lottery. There are far more qualified people than positions, and getting one certainly takes merit, but it also takes a lot of luck. Part of it depends on the connections of a school's professors, who can place clerks with feeder judges or may even have influence with a Supreme Court Justice. Nobody seriously goes to law school planning on a Supreme Court clerkship. And because the measure is based on the total number of clerks, the ranking in the WSJ column is meaningless, since some schools are much larger than others (Harvard is more than twice the size of Yale).



Is This The Beginning of the End for U.S. News Undergrad Rankings, and Will Law School Rankings Survive the Collapse?

The New York Times reports today that the presidents of dozens of liberal arts colleges have agreed to stop participating in U.S. News' college rankings survey. According to the report, the Annapolis Group, an association of liberal arts colleges, released a statement that a majority of the 80 college presidents attending its annual meeting had declared their intent not to participate in the U.S. News rankings. The move follows on the heels of similar efforts by college presidents earlier this year, and of a widely publicized critique of the rankings system last month in the Chronicle of Higher Education.

Has the liberal arts world finally decided that enough is enough? The Times quotes Judith Shapiro, president of Barnard College: “Frankly, it had bubbled up to the point of, why should we do this work for them? … [T]his is not our project.” Of course, the jury is still out on whether the liberal arts colleges’ nascent rebellion will have legs. Not surprisingly, some schools at the top of the food chain – e.g., #2 Amherst – plan to continue to cooperate with U.S. News, and want further “discussion” of the issue. Still, this latest move by liberal arts colleges seems to be more than mere window dressing.

All of this has me wondering: If U.S. News loses its undergrad rankings cash cow, will the law school rankings be far behind? Or might the law school rankings survive, even if the undergrad rankings collapse? Put differently, are there reasons why the law school world will (and perhaps should) continue to “do U.S. News’ work for them”?

I can think of a couple of reasons why law school rankings might survive, despite the collapse of undergrad rankings.



May SSRN Download Counts

From the Department of Possibly Misleading Information comes the Law School SSRN Rankings for May. See previous installments here and here. I originally highlighted in blue schools that significantly outperformed my impression of their popularly conceived rank; and in red those that underperformed. But then I reconsidered and concluded that this was an unproductive exercise. So I am presenting these data without further interpretation.

By Total Downloads

1 Harvard University – Harvard Law School 209695

2 University of Chicago – Law School 188088

3 Columbia University – Columbia Law School 149467

4 Stanford Law School 136852

5 University of Texas at Austin – School of Law 124802

6 University of California, Los Angeles – School of Law 114838

7 Yale University – Law School 111644

8 Georgetown University – Law Center 103398

9 George Washington University – Law School 91441

10 University of California, Berkeley – School of Law (Boalt Hall) 81539

11 University of Southern California – Law School 80660

12 University of Illinois – College of Law 79954

13 Vanderbilt University – School of Law 79510

14 New York University – School of Law 74137

15 University of Minnesota – Twin Cities – School of Law 70470

16 University of Pennsylvania Law School 62323

17 Duke University – School of Law 50452

18 University of Michigan at Ann Arbor – Law School 49161

19 Emory University – School of Law 48911

20 George Mason University – School of Law 46048

21 University of San Diego – School of Law 45385

22 University of Virginia – School of Law 42921

23 Boston University – School of Law 35095

24 Ohio State University – Michael E. Moritz College of Law 34651

25 Northwestern University – School of Law 34541

26 Boston College – Law School 33519

27 Florida State University – College of Law 32810

28 Yeshiva University – Benjamin N. Cardozo School of Law 30092

29 Cornell University – School of Law 29511

30 Fordham University – School of Law 28711

31 Loyola Law School – Los Angeles 25838

32 Michigan State University – College of Law 25441

33 Temple University – James E. Beasley School of Law 17913

34 Washington University, St. Louis – School of Law 17536

35 New York Law School 17094

36 Case Western Reserve University – School of Law 16896

37 Indiana University School of Law – Bloomington 16583

38 Rutgers, The State University of New Jersey – School of Law-Camden 14568

39 University of North Carolina at Chapel Hill – School of Law 14084

40 Washington and Lee University – School of Law 13188

41 University of Maryland – School of Law 12471

42 University of Colorado Law School 12403

43 Notre Dame Law School 12216

44 Brooklyn Law School 12096

45 University of Tennessee, Knoxville – College of Law 11952

46 University of Cincinnati – College of Law 11671

47 University of Iowa – College of Law 11637

48 University of California, Davis – School of Law 11310

49 University of Arizona – James E. Rogers College of Law 11202

50 University of Wisconsin – Law School 11106


Three Views of Education as an Associative Good

The Posner-Becker blog had a good discussion of education rankings 2 months ago. I was particularly struck by Posner’s observations on the self-fulfilling prophecy aspect of rankings:

The effect of college ranking on the education industry is unclear, but my guess is that it is negative. . . . Given the high costs of actually evaluating colleges, employers and even the admissions committees of professional and graduate schools are likely to give weight to a school's rank, and this will give applicants an incentive to apply to the highest-ranking school that they have a chance of being admitted to (if they can afford it). The result will be to increase the school's rank, because SAT scores and other measures of the quality of admitted students are an important factor in a college's ranking. That increase in turn will attract still better applicants, which may result in a further boost in the school's rank. The result may be that a school will attract a quality of student, and attain a rank, that is disproportionate to the quality of its teaching program.

Henry Hansmann wrote an interesting piece on this phenomenon, calling education an “associative good,” since, “when choosing which producer to patronize, a consumer is interested not just in the quality and price of the firm’s products, but also in the personal characteristics of the firm’s other customers” (emphasis added). Hansmann concludes by wondering if “the increasing technological sophistication of our society, which is fueling the trend toward stratification among the elite educational institutions, will someday produce technologies that make it less important for elite higher education to be a residential experience, and hence remove much of the associative character of higher education.” Franklin Snyder offers evidence that blogging is one such technology.

But don’t underestimate dominant interests’ passion for rankings, cautions McKenzie Wark (whose book page for the source I’m quoting interestingly fails to mention that it was published by Harvard University Press). He claims that “Education is organized as a prestige market, in which a few scarce qualifications provide entree to the highest paid work, and everything else arranges itself in a pyramid of prestige and price below. Scarcity infects the subject with desire for education as a thing that confers a magic ability to gain a ‘salary’ with which to acquire still more things.” In other words, the rankings are the purest form of artificial scarcity . . . a precious commodity in an era when the diminishing scarcity of resources that meet basic needs limits their contribution to economic growth. Wark worries that education will “split[] into a minimal system meant to teach servility to the poorest workers and a competitive system offering the brighter workers a way up the slippery slope to security and consumption.”

I’ll expressly disclaim endorsement of any of these three theories. I just find it interesting how the staid and sober observations of a Posner can resonate with Wark’s radical theory, once we interpose the “associative goods” concept.


Defending Alabama

The University of Alabama, that is, and in particular Dean Ken Randall. Randall has been called to task by Brian Leiter and Gordon Smith for his comments to the Tuscaloosa News about Alabama’s rise in the new US News rankings. Randall said: “It is a proud day for our campus, the legal profession, and the entire state of Alabama. We have proven that our state can offer premier educational opportunities.” Brian places Randall (and others) in the Decanal Hypocrisy Hall of Fame. Gordon called these the “most over the top comments of the season.”

OK, everyone knows I’m biased. Alabama is my academic alma mater, a place where I spent my first eight years in teaching. But there are a couple of reasons why I think this criticism of Randall is harsh. The first relates to the H-word (hypocrisy). Bucking the dominant “official line,” Ken Randall declined to sign the LSAC letter critiquing the US News rankings. (Gordon notes this.) People may disagree with Randall’s decision, as well as with his comments, but they can’t quarrel with his consistency.

Second, Randall’s comments are capable of a more generous reading. For example, his second point – that the school has proven that Alabama can offer premier educational opportunities – is not necessarily a claim that this new ranking provides the proof. Indeed, if you had listened to Randall travel across the state, you would have heard him offer that same message for years – well before the new ranking. This is simply a point of pride and a bit of marketing. And it’s something else – something that folks in Alabama, Mississippi, Arkansas and the like will appreciate: it’s an opportunity to respond to both the external critiques, and the internal self-image problems, of a state that hasn’t always excelled in education. In this sense, the comment is both a retort to outsiders and a rallying cry to residents. The US News rankings were an opportunity to get this message into the papers, but it is a message he has been effectively delivering for many years.

Finally, as to the most apparently problematic comment – that the new rankings mark a proud day for the campus, the profession, and the state – I think the critiques overstate the case. First, this comment was offered to the local mass media and sounds in the language of sports and competition – something anyone in Alabama would recognize as part of the state’s patois. It is also a way to stir up donors. Like his comment about offering a premier education in the state, these words are designed to convince alumni that the law school is a worthy investment.

I suspect that someone could raise a defense of the other offending deans, so I don’t mean to damn them by my failure to comment. And I also don’t mean to argue that US News offers “accurate” rankings. I now teach at a school that is new and utterly unranked. That fact is surely disconcerting to some potential students. Yet I would also put many aspects of our program head to head with schools in the Top 50 – including, yes, Alabama. So in this sense, these rankings very much hurt Drexel. (And as I’ve shown previously, most newer schools do quite badly in the US News reputation competition.) But these rankings do provide some information to students (and, by the way, potential faculty) who might otherwise know very little about the University of Alabamas of the world. And they also produce significant benefits for schools that do well – in terms of money, faculty recruiting and student recruiting. Isn’t it just as disingenuous to act like the rankings are no big deal, then quietly reap their rewards? And isn’t that what most of the other schools in the Top 50 do every day?