Category: Law School (Rankings)

Targeted Rankings Marketing (a/k/a the Law Porn Avalanche)?

I was lucky enough to be granted tenure by KU over the summer. That makes me the most recently tenured faculty member at my school and part of a key demographic in the rankings world. As it happens, over the last two weeks, the volume of law porn in my mailbox has probably increased ten-fold. Are we now in a world where law schools specifically target potential rankings voters (deans and the most recently tenured faculty members) with mountains of law school updates and brochures? Assuming I am experiencing targeted marketing, and am not the subject of some cruel joke, where do law schools get the list of newly tenured faculty? From AALS? US News? Or is some poor employee at each institution toiling away, scanning every law school’s webpage for subtle changes?

The Fundamental Problem with the US News School Rankings

Last week, all the law schools in America were holding their collective breath for the latest pronouncement by US News about how their schools ranked. For law schools, as for other graduate schools and universities, the US News rankings play an enormously influential role. The rankings affect the number and quality of applicants. Employers use the rankings too, so the rankings affect job opportunities. The careers of law school deans can rise and fall on the rankings. Key decisions about legal education are made based on their potential effect on a school’s rank, as are admissions and financial aid decisions.

In the law school world, grumbling about the US News rankings never ceases. The rankings use a formula that takes into account a host of factors that are often not very relevant, that can easily be misreported, skewed, or gamed, and that ultimately say little of value about the quality or reputation of a school. Each year, I read fervent outcries to US News to improve their formula. These cries are deftly answered with a response that is typically a variant of the following: “We’ll look into this. We are always looking to improve our ranking formula.” Not much changes, though. The formula is tweaked a little bit, but the changes are never dramatic.

And yet each year, we keep grumbling, keep hoping that someday Godot will arrive and US News will create a truly rigorous ranking.

We should stop hoping.

It isn’t going to happen. This is because there is a fundamental problem at the heart of the US News rankings — doing a rigorous and more accurate ranking is at odds with the economic interest of US News, which is to make money by selling its rankings to eager buyers each year and by getting people to visit its site.

Read More

US News Rankings – The Biggest Loser

Imagine if the input-based approach used by US News were applied to the TV show The Biggest Loser. Currently, contestants win the show by losing the largest percentage of their body weight. The input (starting weight) is controlled for by using the percentage decline in weight rather than the final weight or the raw number of pounds lost. A system like the one US News uses would not control for starting weight; it would simply use final weight, regardless of starting weight, as part of the metric of success. Even worse, the US News system would give bonus points in some form to contestants who started out lighter. In other words, in the bizarro-US-News version of The Biggest Loser, a 120 lb. contestant who gains 20 lbs. would beat a 350 lb. contestant who loses 150 lbs. In our world of law schools, deans do far better by attracting high-score students who ultimately don’t make good lawyers than by attracting low-score students who have better long-term success in the marketplace.
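For the quantitatively inclined, the two scoring rules can be made concrete with a minimal sketch using the numbers above. The code and the particular form of the "bonus for starting light" are my own inventions for illustration; the post doesn't specify how US News's analogue would translate to weight loss.

```python
# A minimal, purely illustrative sketch of the two scoring rules.

def percent_lost(start, final):
    """The Biggest Loser's actual metric: share of body weight lost."""
    return (start - final) / start

def bizarro_score(start, final):
    """A US-News-style metric: reward a low final weight, plus a
    bonus for having started out lighter (bonus form is hypothetical)."""
    return -final - 0.5 * start

light = (120, 140)   # starts at 120 lbs., gains 20 lbs.
heavy = (350, 200)   # starts at 350 lbs., loses 150 lbs.

# Under the real rule, the heavy contestant wins decisively:
assert percent_lost(*heavy) > percent_lost(*light)

# Under the bizarro rule, the lighter starter wins despite gaining weight:
assert bizarro_score(*light) > bizarro_score(*heavy)
```

The assertions pass: the heavy contestant loses about 43% of body weight while the light one gains weight, yet the bizarro score still prefers the lighter finisher.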

US News Rankings – Negative Incentives of Input Focus

The focus on inputs by the US News Rankings creates a pernicious rat race in which resources are over-allocated to recruiting students with the scores needed to maintain or improve a school’s medians. In a down market like today’s, that competition is fierce, and scholarship dollars are overwhelmingly given to students with high scores regardless of need. Further, a rankings-focused dean will spend more on attracting students than on educating them (assuming the money would otherwise have gone to classroom instruction in some form). A one-point drop in the LSAT median can result from a single above-median student making a last-minute decision not to attend law school, and that single-point drop could cause a US News rankings decline and a dean’s firing. Instead of pursuing well-rounded, diverse, and interesting entering classes, deans must fight tooth-and-nail, allocating personnel and financial resources, to meet arbitrary statistical benchmarks that are essentially products of the previous year’s ranking.

US News Rankings – A Different Role for Inputs

Instead of our current world where higher LSAT/GPA numbers lead to more choices, I want to consider a hypothetical admissions model where students are randomly assigned to schools and forced to attend the schools selected for them. In such a world, LSAT/GPA scores would be relatively equal across all schools (in comparison to the present distribution). In formulating a rankings system from a student perspective, we would only care about what the schools do to improve the relatively equal incoming student quality (measured by job placement or other output variables).

Still, particularly for smaller schools, there would be some statistical variation in incoming LSAT/GPA scores, and we would want to control for those differences in assessing output variables such as job placement. An ideal rankings system would discount success attributable to higher incoming student quality and, conversely, give extra credit to schools that outperform their lower incoming scores. So, in a world where variations between entering classes are small and unlikely to have substantial effects on the overall rankings, a good statistician would still want to control for the expected variation in class quality. But, strangely, in a world where the variations in entering class quality are very large, US News not only doesn’t control for entering class quality, it actually adds it to its overall formula in a prominent manner. This makes little sense if the goal is truly to measure law school quality (however it is defined) and to aid students in their decision-making.

US News Rankings – Focus on Inputs

In my discussion of how the US News Law School Rankings create negative incentives for law schools, I want to start by examining the misplaced emphasis among the components of the rankings. One of the oddities of the US News rankings is that the quality of incoming students (as measured by median GPA and LSAT) factors so prominently in how schools are ranked: the two factors account for 22.5% of a school’s overall ranking. Yet what exactly do we expect to learn from the median GPA and LSAT scores? First, those two factors tell us, in the aggregate, how the last entering class perceived the relative value of each law school. Students with the highest LSAT/GPA numbers typically have the greatest number of choices in terms of schools and scholarships. As a result, GPA/LSAT medians are highly correlated with the previous year’s ranking. And, thus, the previous year’s ranking largely predicts the subsequent year’s ranking. Second, the numbers give us a crude sense of the quality of the student body before it receives any law school education. Such considerations offer guidance to potential employers, but we might ask why they are important in any ranking system seeking to assess law school quality from a student’s perspective.
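To see how heavily weighted inputs can swamp outputs, here is a hedged sketch of a US-News-style weighted score. Only the 22.5% figure comes from the discussion above; the remaining categories, their weights, and the school values are hypothetical placeholders, with every component assumed pre-normalized to a 0-1 scale.

```python
# Hypothetical weighted-score sketch. Only the 0.225 weight on median
# LSAT/GPA is from the post; everything else is a placeholder.

WEIGHTS = {
    "median_lsat_gpa": 0.225,   # stated weight for the two input factors
    "reputation":      0.40,    # hypothetical
    "placement":       0.20,    # hypothetical
    "expenditures":    0.175,   # hypothetical
}

def overall_score(school):
    """Weighted sum over the (normalized) category values."""
    return sum(WEIGHTS[k] * school[k] for k in WEIGHTS)

# Two schools identical except for inputs and placement outcomes:
school_a = {"median_lsat_gpa": 0.95, "reputation": 0.65,
            "placement": 0.70, "expenditures": 0.50}
school_b = {"median_lsat_gpa": 0.70, "reputation": 0.65,
            "placement": 0.85, "expenditures": 0.50}

# School A trails B on placement, yet wins overall on the strength of
# its incoming medians alone:
assert overall_score(school_a) > overall_score(school_b)
```

With these (made-up) numbers, a 0.25 edge in incoming medians outweighs a 0.15 deficit in placement, illustrating why recruiting high scorers can pay a dean better than producing good outcomes.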

ABA Task Force on Legal Education: Down with Status

Good news for law professors now submitting articles seeking offers from high-status journals: the importance of status in American law schools is over-rated and is about to be reduced. At least that is the urging of an American Bar Association Task Force Working Paper released last Friday addressing contemporary challenges in U.S. legal education.

Obsession with status is a culprit in the woes of today’s American law schools and faculty, the Working Paper finds. It charges law professors with pitching in to redress prevailing woes by working to reduce the role of status as a measure of personal and institutional success. The group’s only other specific recommendation for law faculty is to become informed about the topics the 34-page Working Paper chronicles so we might help out as needed by our schools.

Much of the rest of the Working Paper is admirable, however, making the two specific recommendations to law faculty not only patently absurd but strange in context.   After all, the Working Paper urges reform of ABA/AALS and state regulations with a view toward increasing the variety of law schools. It calls for serious changes in the way legal education is funded, though it admits that the complex system of education finance in the U.S. is deeply and broadly problematic and well beyond the influence of a single professional task force.

The Task Force urges US News to stop counting expenditure levels as a positive factor in its rankings.  It stresses problems arising from a cost-based rather than market-based method of setting tuition. It notes a lack of business mind-sets among many in legal education.  It questions the prevailing structure of professorial tenure; degree of scholarship orientation; professors having institutional leadership roles; and, yes, faculty culture that makes status an important measure of individual and institutional success.

But amid all that, law professors have just two tasks: becoming informed and demoting status.  So there must be some hidden meaning to this idea of status as a culprit and the prescription for prawfs to reduce the importance of status as a measure of success.  I am not sure what it is. The Working Paper does not explain or illustrate the concept of status or how to reduce its importance.

I’ll try to be concrete about what it might mean. Given the other problems the Task Force sees with today’s law faculty culture (tenure, scholarship, and leadership roles), I guess they are suggesting that faculty stop making it important whether: Read More

What Would Happen if USNews Didn’t Weigh Money?

Recently the ABA announced that it will no longer collect expenditures data from law schools: Leiter and Merritt offer thoughts on how that decision will influence the USWR rankings. Both posts are interesting, though somewhat impressionistic. Leiter thinks that state schools will benefit and Yale will lose its #1 spot; Merritt believes that USWR should reconfigure its method. [Update: Bodie adds his two cents.]

It’s well known that the influence of particular categories of data on the ranking can’t be determined simply by reading the charts that the magazine provides. Paul Caron notes that the rankings depend on inputs that aren’t displayed (like expenditures). But it gets worse: (1) the point accumulation of each school influences that of every other school; (2) USWR changes the raw data through manipulations that are not well explained (placement discounts for law-school-funded positions) or are simply obscure (CoL adjustments for expenditures); (3) many schools don’t report information, and USWR doesn’t advertise its missing-data imputation method; and so on. Bottom line: the rankings are very, very fragile. (Many would say they are meaningless except at 10,000 feet.) Luckily, Ted Seto’s work enables everyone to give their best shot at approximating each year’s ranking. Seto argues that variance within a category turns out to influence the final scores as much as the purported weight that USWR assigns to it.

As a thought experiment, I decided to estimate what would happen if each school’s expenditure data were set to the average school’s expenditure. I then used Seto’s method on 2011-2012 historic data to estimate the rankings in the absence of expenditure variance. This basically eliminates the influence of expenditures as a category. (A perhaps better, but more time-consuming, approach would be to eliminate the expenditure categories altogether and re-jigger the equation accordingly.) My back-of-the-napkin approach produces some wacky results, particularly at the lower end of the ranking scale. To keep it simple, after the jump I’ll focus on the top ten winners and losers from the elimination of expenditure variance in the 2013 t100 and then offer some thoughts.
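For readers who want to replicate the spirit of the exercise, here is a simplified sketch. This is not Seto’s actual method, and the schools, category values, and weights below are all invented; the point is only the mechanic of replacing every school’s expenditure value with the mean and re-ranking.

```python
# Simplified expenditure-neutralization sketch (not Seto's method).
# All schools, values, and weights are invented for illustration.

schools = {
    "A": {"expenditure": 0.90, "other": 0.60},
    "B": {"expenditure": 0.40, "other": 0.70},
    "C": {"expenditure": 0.55, "other": 0.65},
}
W_EXP, W_OTHER = 0.30, 0.70  # hypothetical category weights

def rank(data):
    """Order school names by weighted score, best first."""
    score = lambda s: W_EXP * s["expenditure"] + W_OTHER * s["other"]
    return sorted(data, key=lambda name: score(data[name]), reverse=True)

before = rank(schools)

# Level the playing field: give every school the mean expenditure.
mean_exp = sum(s["expenditure"] for s in schools.values()) / len(schools)
leveled = {name: {**s, "expenditure": mean_exp} for name, s in schools.items()}
after = rank(leveled)

# School A, the biggest spender, drops from first to last once the
# expenditure variance is removed.
print("with expenditure variance:", before)
print("variance removed:        ", after)
```

The real exercise is harder because, as noted above, each school’s point total influences every other school’s and the magazine’s manipulations of the raw data are opaque, which is why the results get wacky toward the bottom of the table.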

Read More

Ranking: Law v. Undergrad

Inspired by this 2007 Taxprof post, I decided to compare the 2013 US News undergrad ranking to the 2013 overall law school ranking. This project was a bit more complicated than it was six years ago, due both to scandal and to the proliferation of regional rankings. But, ignoring schools that aren’t present on both lists, the results are illuminating. For figures, follow me after the jump.

Read More

Free Advice to Incoming Law Review Boards

While academics agonize, law journal editors toil to manage the fire hose of submissions, real and fake expedites, and the uncertainty that comes with a new job. Many journal editors now seem to have the goal of “improving their ranking.” Seven years ago (!) I wrote some advice on that topic. It seems mostly right as far as it goes, but I want to revise and extend those comments below, in letter form.

Read More