Is It Harder to Get a Top Law Review Placement Today?

6 Responses

  1. Jeff Lipshaw says:

    My rejection e-mails over the last couple of years from the loosely defined top schools, when they mentioned it, cited numbers on the order of 1,000 submissions for the 10-12 they would select. I was visiting the Widener campus at Harrisburg a couple of years ago, and the editor there told me the Review got 200 submissions for roughly the same number of selections.

    If that is the case, even accounting for multiple submissions, then it seems like the vast majority of pieces never get published, much less in the loosely defined top reviews. That raises another interesting question: how many articles get submitted in total to ALL reviews, and how many actually get accepted?
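
    Taking those figures at face value (they are rough recollections, not hard data), the implied acceptance rate would be roughly 1% at the top reviews versus 5-6% at Widener. A minimal Python sketch of that arithmetic:

    ```python
    # Back-of-the-envelope acceptance rates implied by the figures above.
    # The submission and slot counts are rough recollections, not hard data.

    def acceptance_rate(submissions: int, slots: int) -> float:
        """Implied acceptance rate, as a percentage."""
        return 100.0 * slots / submissions

    # Loosely defined "top" reviews: ~1,000 submissions for 10-12 slots.
    for slots in (10, 12):
        print(f"Top review, {slots} slots: {acceptance_rate(1000, slots):.1f}%")

    # Widener (Harrisburg): ~200 submissions for roughly the same number of slots.
    for slots in (10, 12):
        print(f"Widener, {slots} slots: {acceptance_rate(200, slots):.1f}%")
    ```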

  2. The missing data here have to do with the number of law reviews, and with the correlation between law review rank and quality of pieces published. Consider the following stories:

    All the Print That Fits: The expansion in submissions has been accompanied by an equal (or greater) expansion in terrible submissions. Many more pieces are written, but most of them never stand a chance at the top journals and never would have. Competition for the top N is still among roughly the same-size pool of highly polished and/or big-name articles.

    Survival of the Fattest: The new entrants have driven up the stakes for everyone. Tenure committees still look for top N publications, so everyone still tries to place there, and will write as long and as hard as they can in order to make it. Since the game is zero-sum at that level, the result is an arms race, with more citations, more gigantic empirical studies, more grandiose claims, etc. The amount of work that goes into a creditable top N submission has skyrocketed.

    Nobody Knows Anything: The selection process is now, as it has always been, statistically indistinguishable from the case in which all editors make their selections with a dartboard. Raw numerical competition drives down the odds for everyone. Unlike in the previous story, there is not much authors can do about it. (Indeed, the dominant strategy becomes writing many articles, rather than highly polished ones — more submissions means more tickets in the submission lottery, as the quick sketch below illustrates.)

    All three stories may have an element of truth to them. I’m not sure that raw submission counts would tell us anything meaningful in distinguishing among them.
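
    To make the lottery point in the third story concrete, here is a minimal sketch under a purely illustrative assumption: every submitted article has the same small, independent chance of acceptance (1% here, an assumption rather than data).

    ```python
    # Sketch of the "submission lottery" intuition: if acceptance is effectively
    # random with per-article probability p, an author who submits k articles
    # places at least one with probability 1 - (1 - p)**k.
    # p = 0.01 is an illustrative assumption, not an empirical figure.

    def p_at_least_one(p: float, k: int) -> float:
        """Probability of at least one acceptance from k independent tries."""
        return 1.0 - (1.0 - p) ** k

    for k in (1, 3, 5, 10):
        print(f"{k} article(s): {p_at_least_one(0.01, k):.1%} chance of at least one placement")
    ```

    Under that model, quantity straightforwardly beats polish, which is the sense in which writing many articles becomes the dominant strategy.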

  3. Lawrence Cunningham says:

    The Solove hypothesis seems correct. In addition, there are about 20% more US law schools today than a generation or two ago.

  4. TwoL says:

    What is the impact of specialized journals? Do top Crim law professors want to be in top “general” journals, or in top journals focusing on their area?

  5. Miriam Cherry says:

    Bepress would know …

  6. Miriam — The problem with BePress’s statistics is that they are very distorted and inaccurate. Several top law reviews prefer that professors upload papers directly through their websites or send a hard copy, so many professors don’t use BePress for those journals. I, for example, don’t use BePress for Harvard, Yale, Columbia, Penn, etc.; I submit through their websites instead. So the stats are quite skewed.

    The best statistics would come from the journals themselves. I know that many journals keep track; each article is logged in their system. So the stats are out there — if only some kind law review editors would share them with us!