Chick Sexers To Be Automated: Are Lawyers Next?

Dave Hoffman

Dave Hoffman is the Murray Shusterman Professor of Transactional and Business Law at Temple Law School. He specializes in law and psychology, contracts, and quantitative analysis of civil procedure. He currently teaches contracts, civil procedure, corporations, and law and economics.

6 Responses

  1. Jeff Lipshaw says:

    Dave, assuming that this isn’t tongue-in-chick, I’m not worried about neural net pattern recognition software replacing human judgment, even among lawyers.

    First, Kahan’s description, even if it’s accurate about some aspects of practicing law, isn’t accurate about how transactional lawyers learn judgment. Second, he has a point about “authoritative certification.” The distinction between arguments from authority and arguments from merit has something to do with what successful transactional lawyers learn despite their inculcation in legal education. (As to what sounds like your concluding point, I agree that there’s a guild effect, but it manifests itself in an entrenched way of thinking, not necessarily in its entrance requirements.) Third, more generally, if professional judgment can be automated at all, it will only be in the most trivial of ways.

    Shameless self-promotion alert: I talk (too extensively perhaps) about judgment, including how it differs from algorithm and induction, in The Venn Diagram of Business Lawyering Judgments: Toward a Theory of Practical Metadisciplinarity, coming soon to a law review near you.

  2. It’s not truly a case of lawyer automation — the machine doesn’t do exactly what the humans did. Rather, it’s an advance in process, effectively avoiding the judgment-call fork in the road by moving the determination to an earlier stage where it can be automated (estrogen detection in the egg). I’m not sure what that means on balance. Maybe the real danger isn’t that lawyers will be automated, but that the river will find a more efficient course that avoids the Lawyer Rapids altogether.

    Great picture of the chick doing a back flip.

  3. A.J. Sutter says:

    I’m with Jeff in rejecting the narrative of how transactional lawyers learn judgment. One might also think the recent brouhaha over Toyota’s possible software problems would reduce the enthusiasm for delegating judgment to bots. And since many negotiations are preludes to ongoing relationships between the principals, clients who’d entrust the necessary people skills to software ipso facto would be so clueless that they’d be beyond human help, in any case.

  4. Marc Blitz says:

    I suspect that — as in the movie, Blade Runner — we lawyers and law teachers already are automated and just haven’t realized it yet. Once we do realize it, then implementing the Carnegie Report on Legal Education will be a whole lot easier: We’ll simply install the next version of the software into our circuits (or perhaps directly into those of our students, making three years of Socratic questioning and exam-taking unnecessary).

    I’m guessing your post was, as Jeff Lipshaw already suggests as a possibility, just tongue-in-cheek, meant to underscore the irony of chick sexers being replaced by machines only a few years after Dan Kahan cited chick sexing as an example of an astonishing feat that happens by intuition alone and is not reducible to rules. (As Wikipedia helpfully notes, “The ‘example of the chicken sexers’ is famous in several debates in philosophy, especially in the internalism/externalism debate in epistemology.” http://en.wikipedia.org/wiki/Chick_sexing)

    Of course, the fact that one task that humans have learned to perform by intuition might also be performed by a machine — a task involving a binary choice (male or female) — doesn’t automatically mean that all such tasks, including those that are a whole lot more complex, are also ready to be automated. Richard Horsey suggests in “The Art of Chick Sexing” that chick sexing is an example, and only one example, of a process where “expert categorization does take place as a result of the recognition of specific features, but that since the process is not accessible to introspection, we are not aware that categorization is feature-based” ( http://www.phon.ucl.ac.uk/publications/WPL/02papers/horsey.pdf ). It may be that some legal reasoning tasks will one day be taken over by machines, but they may take rather longer to replace than chick sexing, and some components of legal reasoning may not be replaceable in this way at all — to borrow again from Jeff Lipshaw’s discussion of Cass Sunstein’s 2001 paper, Of Artificial Intelligence and Legal Reasoning ( http://lawprofessors.typepad.com/legal_profession/2009/12/artificial-intelligence-and-legal-wisdom.html ). Fortunately, even if computers can’t be lawyers, that doesn’t mean they can’t be clients. See Larry Solum, Legal Personhood for Artificial Intelligences ( http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1108671 ).
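    To make “feature-based” concrete, here is a toy sketch (entirely invented for illustration; the features, numbers, and code have nothing to do with Horsey’s paper or with the actual sexing machines): a simple classifier that learns a binary label from a couple of explicit, made-up features.

        import numpy as np

        rng = np.random.default_rng(0)

        # Two made-up "features" per chick; label 1 = male, 0 = female.
        # Everything here is synthetic: the point is only that once the
        # features are explicit, a binary categorization rule can be learned.
        X = rng.normal(size=(200, 2))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

        # Plain logistic regression fit by gradient descent.
        w, b = np.zeros(2), 0.0
        for _ in range(500):
            p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(label = 1)
            w -= 0.5 * (X.T @ (p - y)) / len(y)
            b -= 0.5 * np.mean(p - y)

        pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
        print("training accuracy:", (pred == y).mean())

    The contrast with the human sexer is that the machine’s “features” are explicit and inspectable, whereas, on Horsey’s account, the expert’s are not accessible to introspection at all.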

  5. Blunt Instrument says:

    Never discount the benefits of a very entrenched guild.

  6. dave hoffman says:

    Thanks for everyone’s comments. Marc expresses what I wanted to say more articulately than I ever could. One thing to consider is whether, and to what extent, lawyers’ fees over-emphasize the amount of legal judgment involved (as opposed to, say, the implementation of judgment, which could be automated).