Archive for the ‘Symposium (Convicting the Innocent)’ Category
posted by Dan Simon
I would like to underscore Brandon’s point about reform efforts that are currently underway. While for the most part, the criminal justice process is stuck in a bad place (thanks to a large degree to the US Supreme Court), it is refreshing to note that a few local and state jurisdictions are moving ahead with thoughtful reforms.
Just two weeks ago, the Oregon Supreme Court handed down a groundbreaking decision regarding the admissibility of eyewitness identification testimony. The decision in State v. Lawson is noteworthy in at least four respects (http://www.publications.ojd.state.or.us/docs/S059234.pdf).
(1) The Court explicitly endorsed the view that the admissibility of identification testimony should hinge on its reliability. This decision constitutes a notable break from the US Supreme Court, which recently decided that exclusion of evidence on Federal Due Process grounds is intended to regulate the police, not to guarantee the reliability of the testimony. The Supreme Court thus denied relief to a defendant who could not prove that the suggestibility of an identification was the product of police misconduct (see Perry v. New Hampshire; http://www.supremecourt.gov/opinions/11pdf/10-8974.pdf). Importantly, the Oregon court thereby underscored the primacy of accuracy over competing goals of the criminal justice process.
(2) In sharp contrast to the majority’s decision in Perry v. New Hampshire, the Oregon court relied heavily on social science, namely, on experimental psychological research. As such, the Lawson decision conveys sensitivity to the limited accuracy of eyewitness identification testimony as well as to the myriad factors that affect its accuracy. The Court also cited Brandon’s “Convicting the Innocent” to underscore the contribution of eyewitness misidentification to the problem of false convictions.
(3) The Oregon court acknowledged the defendants’ predicament of bearing the burden of proof to show suggestibility “when the state — as the administrator of that procedure — controls the bulk of the evidence in that regard.” The Court thus drew attention to the “informational disadvantage,” which is one of the less familiar factors that skew the adversarial process against criminal defendants (see “In Doubt,” pp. 44, 182).
(4) In conclusion, the Court shifted the burden of proof onto the prosecution, requiring that the state first show that the ID testimony was not obtained by means of suggestive procedures. Such a showing does not preclude the defendant from countering the prosecution’s evidence. This is a bold and innovative move. I expect that it will not only curb the admissibility of ID testimony in court, but also make a favorable impact upstream, by inducing the police to conduct ID procedures in a more meticulous manner.
It should be noted that the Oregon decision comes on the heels of New Jersey’s landmark decision regulating lineup procedures (State v. Henderson: http://njlaw.rutgers.edu/collections/courts/supreme/a-8-08.opn.html), which was followed up by a reform of the pertinent jury instructions (http://www.judiciary.state.nj.us/pressrel/2012/jury_instruction.pdf).
Also notable in this regard is a legislative effort in Texas that has produced a thoughtful Model Policy for conducting eyewitness identification procedures (the Model Policy was drafted by the Bill Blackwood Law Enforcement Management Institute of Texas at Sam Houston State University: http://www.lemitonline.org/publications/ewid.html).
posted by Steven Drizin
Thanks for inviting me to participate.
Welcome from Chicago, the “False Confession Capital of the United States,” or so we were dubbed in a recent episode of 60 Minutes.
The show featured two cases from the Chicagoland area from the 1990s which are reminiscent of New York’s Central Park Jogger case. In these two cases — known as the Englewood Four and the Dixmoor Five — nine black teens were charged and later convicted in separate rape-murders (in the interest of full disclosure, I was one of the defense attorneys in both of these cases). DNA evidence at the time of trial excluded the boys, but prosecutors persisted in their belief that the boys had gang-raped their victims before murdering them. Their theory was that these nine teenage boys must have raped these women but failed to ejaculate.
Recent, more sophisticated DNA testing, however, told a different story. When the DNA profiles were placed in the CODIS database, they hit on two different adult male convicted rapists with long and violent rap sheets. As in the Jogger case, some in law enforcement still insisted that the boys were guilty. Fortunately, a Cook County judge rejected these claims, finding it unbelievable that nine teenage boys would engage in sexual intercourse without leaving a DNA trace. In a classic case of what Dan Medwed calls “the prosecution complex,” however, Cook County State’s Attorney Anita Alvarez continued to defend the convictions on 60 Minutes, a move which some have speculated may have damaged her career. For a more detailed account of these cases, see Joshua A. Tepfer et al., Convenient Scapegoats: Juvenile Confessions and Exculpatory DNA in Cook County, 18 Cardozo J. L. & Gender 631-684 (Spring 2012).
As I watched 60 Minutes, my mind turned to the ways in which the four books reviewed here related to the Englewood and Dixmoor cases. The confessions in these cases were filled with the kinds of details that only the true perpetrators could have known. When CBS News Correspondent Byron Pitts first read Terrill Swift’s 21-page false confession, he was convinced of Terrill’s guilt because of the level of detail in the confession. It’s the same way I felt back in 2002 when I watched the video confessions of the Central Park Jogger defendants. How could these teenagers produce such detailed confessions if they were innocent? As Brandon Garrett points out in Convicting the Innocent, there is only one explanation — police contamination (fact-feeding). In cases of multiple false confessions, all the police really need is one contaminated false confession. That confession then becomes the script which other detectives use to browbeat the co-defendants into adopting.
In the absence of an electronic recording of the entire interrogation, suspects don’t stand much of a chance of convincing a jury that their confessions were coerced or false. As Dan Simon points out in In Doubt, jurors have great difficulty accepting the idea that anyone would confess to a crime they did not commit, especially a murder or other crime that could result in death or a lengthy prison sentence. Moreover, in a classic case of the “cover-up being worse than the crime,” the trial process actually encourages prosecutors to prepare police witnesses to testify that the details came from the suspect in an uninterrupted narrative with no prompting, prodding or persuasion from the police. It’s not a case of suborning perjury. Without a recording, everyone has plausible deniability about police contamination. The collective memory of the officers involved — often months or years after the fact — is washed clean of any contamination. During trial preparation, prosecutors prepare the officers to testify in ways that are persuasive to juries. There is nothing more persuasive than a police officer who testifies that the reason he knew the suspect was guilty was because the suspect came up with details that only the true perpetrator could have known.
In both the Dixmoor and Englewood cases, prosecutors solidified their cases by getting one of the defendants to plead guilty, and in Dixmoor, persuaded the most vulnerable defendant to testify against his co-defendants. As Stephanos Bibas points out in The Machinery of Criminal Justice, today the leverage that prosecutors exert in plea negotiations — a result of the draconian, often mandatory sentences for crimes — can convince even the innocent to plead guilty. The fact that one defendant pleaded guilty in each of the Englewood and Dixmoor cases only solidified the system’s belief that these were righteous convictions and no doubt contributed to the State’s Attorney’s prosecution complex about the cases.
The problem of police contamination was not discovered by Brandon Garrett. Professor Richard Leo and Richard Ofshe have been writing about it for years. But the most enduring contribution of Professor Garrett is the finding that police contamination is epidemic in false confessions, not episodic. The prevalence of contamination provides the single strongest argument for electronic recording of the interrogation process. By slowing down the action of the interrogation, jurors can actually see the way in which facts are leaked to suspects during the interrogation. A recording should also prevent police officers from plausibly denying contamination and prosecutors from preparing them to testify that the suspect was the source of the details.
Will more transparency of the interrogation process disinfect it of contamination? Will knowledge of contamination lead judges and juries to make more reliable decisions in confession cases? Will jurors see contamination or will they be blinded by their faith in the power of confession evidence? Will these recordings make appellate judges less likely to affirm convictions based on contaminated confessions? The jury is still out on all these questions. The answers, however, may well depend on the ability of criminal defense attorneys to operationalize Professor Garrett’s research in criminal courtrooms throughout the United States — to make jurors see how contamination corrupts the search for truth. See Laura H. Nirider, Joshua A. Tepfer, and Steven A. Drizin, Combating Contamination in Confession Cases, 79 U. Chi. L. Rev. 837-862 (Spring 2012). I like our chances.
posted by Karen Newirth
I also thank Danielle and Brandon for including me in this symposium, and am very happy to join the discussion of four very important works on the state of the criminal justice system in America today.
The reference to the Central Park Five in Danielle’s original post highlights one of the most important qualities of Convicting the Innocent: it uses the powerfully told stories of the exonerated to bring to life the new and important detail about the causes of wrongful convictions that Garrett’s research has uncovered. The result is the fullest picture to date of the scope of the “nightmarish reality” that has led to 301 DNA-based exonerations in this country. Convicting the Innocent is not only a great read for lawyers and lay people alike, it is also a powerful tool for bringing about much-needed systemic change. Dan Medwed’s post appropriately asks whether the works being discussed here urge change that is gradual and specific or change that is revolutionary, going to the heart of the adversary system. In the context of eyewitness misidentification – the leading contributing cause of wrongful convictions, occurring in (as Garrett found) 75 percent of the first 250 exonerations – we see great success in effecting change in both courts and police precincts alike. Brandon Garrett’s research has been critical to these successful reform efforts.
As the attorney responsible for the Innocence Project’s work in the area of eyewitness identification, I have relied on Convicting the Innocent in my efforts to educate attorneys, judges and policy makers about the perils of misidentification and the flaws in the current legal framework for evaluating identification evidence at trial that is applied in nearly all jurisdictions in the United States. That legal framework, set forth by the Supreme Court in Manson v. Brathwaite, directs courts to balance the effects of improper police suggestion in identification procedures with certain “reliability factors” – the witness’s opportunity to view the perpetrator, the attention paid by the witness, the witness’s certainty in the identification, the time between the crime and confrontation and the accuracy of the witness’s description. (These factors are not exclusive, but most courts treat them as if they are.)
Psychological research in the area of perception and memory has offered conclusive evidence that the identified reliability factors are not well-correlated with accuracy; do not objectively reflect reality to the extent that they are self-reported; and – most critically – are inflated by suggestion, leading to the perverse result that the more suggestive the identification procedure, the higher the measures of reliability under the Manson test.
Garrett’s work in Convicting the Innocent adds an important dimension to the psychological research – and makes even more urgent the call to reform the Manson test – by demonstrating that the Manson test failed in the cases of the 190 exonerees who were convicted based, at least in part, on identification evidence that was either not challenged or admitted as reliable under Manson. Garrett’s work shows just how the Manson reliability factors fail to ensure reliability: in most cases reviewed by Garrett, the witnesses had poor viewing opportunities; had only a few seconds to see the perpetrator’s face, which was often disguised or otherwise obscured; made identifications weeks or months after the crime; and provided descriptions that were substantially different from the wrongly accused’s appearance. In addition, almost all of the witnesses in the cases reviewed by Garrett expressed complete confidence at trial – stating for example that “there is absolutely no question in my mind” (Steven Avery’s case); that “[t]his is the man or it is his twin brother” (Thomas Doswell’s case) – although DNA later proved that these witnesses were entirely wrong. Perhaps most striking of all of Garrett’s research findings in the area of eyewitness misidentification is that in 57 percent of the trials with certain eyewitnesses, the witnesses had expressed earlier uncertainty (strongly suggesting that the identification was unreliable), but only 21 percent of these witnesses admitted their earlier uncertainty.
The Innocence Project has relied on Garrett’s research in advocating for the reform of the legal framework for evaluating identification evidence in courts around the country, from the U.S. Supreme Court (Perry v. New Hampshire) to state supreme courts from Oregon (State v. Lawson) and Washington (State v. Allen) to New Jersey (State v. Henderson) and Pennsylvania (State v. Walker). In two of these cases – Henderson and Lawson – high courts found that Manson fails to ensure reliability and implemented new legal tests that better reflect the scientific research and, we hope, will better prevent wrongful convictions based on eyewitness misidentification. Both the Henderson and Lawson courts cited Convicting the Innocent in rendering their decisions, demonstrating just how powerful a force for change Garrett’s work is.
posted by Daniel Medwed
Special thanks to Danielle for organizing this symposium, Brandon for his generosity in expanding its scope, and Dan S and Stephanos for their participation.
I recently published a book review of Convicting the Innocent in Criminal Justice Ethics, lauding the work as a brilliant contribution to the field. A hallmark of Brandon’s work generally, and his book in particular, is that he focuses on what we know rather than speculating about what we don’t. He carefully deconstructed the first 250 documented DNA exonerations and, in the process, shed new light on what went awry in those cases. The result is an incredible book that shines a spotlight on various aspects of the criminal justice system that had long remained shrouded in darkness. For instance, we knew that false confessions were a major factor in wrongful convictions, but we – or at least I – did not know that in the vast majority of those cases (38 of 40) the false confessions contained details about the crime that only the true perpetrator would know, suggesting that law enforcement had either inadvertently or intentionally fed this information to the suspect.
Recent books by Stephanos and Dan S likewise shed light on criminal justice issues that previously existed largely in the shadows. Stephanos artfully explores the assembly-line nature of our modern criminal justice system in which plea bargains come off the conveyor belt in crisp, neat packages that make it harder to evaluate their underlying accuracy. Dan S offers a thorough analysis of the key moments in the criminal process where innate psychological biases can yield flawed decisions and increase the likelihood of error. I focus on the discretionary decisions of prosecutors, which often occur behind the scenes and can help produce and prolong wrongful convictions.
One pressing question that emerges from all four of these works, I think, relates to the magnitude of any potential reforms: Should we adopt a gradual or evolutionary approach and tinker with problems one by one? Alter interrogations to avoid unintentional dissemination of information by the police to suspects, implement greater review of pleas, change how identification procedures are handled, and so forth. Or should we take a more revolutionary approach and consider revamping the adversary system itself on a much more fundamental level?
posted by Stephanos Bibas
I’d like to thank the entire Concurring Opinions crew for hosting this mini-symposium and Danielle, Brandon, and my fellow mini-symposiasts for organizing this discussion of our recent works. In some ways, I seem like the odd one out here. The other books under discussion are more concerned with factually wrongful convictions, explaining where, how, and why our system on occasion convicts the factually innocent.
By contrast, my recent book, The Machinery of Criminal Justice, focuses more on the moral justice of both substantive outcomes and the procedures we use to get there. In the colonial era, criminal justice was fundamentally a morality play. The central drama was about both factual guilt and moral desert. Victims prosecuted in their own name, defendants defended themselves, and juries and spectators sat in judgment on both guilt and public punishment. The point of the system was to let people see justice done, indeed to do justice themselves, so wrongdoers would visibly pay their debts to society and the victim and earn forgiveness and reintegration. There was almost no permanent exiling of ex-cons, as remarkably few colonial Americans were executed or banished. To oversimplify, most were welcomed back after a transparent, public morality play.
The professionalization of criminal justice over the course of the nineteenth and twentieth centuries undoubtedly brought various benefits, including the ability to handle staggering caseloads. But it came at the unacknowledged cost of transferring almost all power from laymen to lawyers, as prosecutors supplanted victims, defense lawyers silenced defendants, plea bargains eclipsed juries, and increasingly long prison sentences supplanted temporary shaming and restitution. That not only cut out the lay actors who looked at cases through the lens of common-sense morality rather than jaded professional perspectives. It also focused on cookie-cutter dispositions and bottom-line sentence numbers, ignoring the procedural justice that is central to laymen’s evaluation of fairness.
Superficially, my approach might seem at odds with the other three books. After all, one might think that miscarriages of justice call for more expert reforms and fewer bumblers in the system. Yet, as Brandon’s first post suggested, the plea-bargaining assembly line imperils both factual accuracy and moral justice (both procedural and substantive) simultaneously. It engenders enormous agency costs, making it far harder to ferret out injustices that result from hidden plea bargains and thus bypassing the disciplining exercise of public trials. Prosecutors exert tremendous leverage to secure guilty pleas, offering sweetheart deals to cooperating witnesses (sometimes a necessary evil, sometimes not) and threatening far heavier sentences for those who refuse to play ball. Public scrutiny is essential to catching injustices, but most bargained-for convictions are rubber-stamped without any meaningful scrutiny or testing. Most defendants who plead guilty are probably factually guilty, but some are not. And even factually guilty defendants deserve varying sentences, but the bureaucratic imperatives of plea bargaining tailor punishments to the needs of assembly-line efficiency, not just justice.
In short, our criminal justice system is a broken machine running almost on auto-pilot. It needs to be more transparent and democratically accountable for its failures. We cannot abolish plea bargaining, as we need its ability to handle staggering caseloads. But we can hope that the exposure of factual and moral injustices can prompt rethinking, forcing us to slow down the assembly line, to increase the quality of convictions and punishments even if that means reducing the quantity and doing more triage. Books like Brandon’s, Dan’s, and Dan’s can, I hope, prompt more oversight and public involvement to ensure both factual and moral justice.
posted by Brandon Garrett
That image is from the false confession of Ronald Jones, a man whose tragic story begins my book, Convicting the Innocent: Where Criminal Prosecutions Go Wrong. In fact, it is an image of his entire false confession, at least the statement that the detectives had typed at the end of eight grueling hours of interrogation in Chicago in the mid-1980s. I turned the statement into a word cloud to illustrate the words that Jones had repeated the most. In his statement, Jones was unfailingly polite, and according to the police stenographer, at least, he responded “Yes, Sir,” as the detectives asked him questions. In reality, he alleged at trial, detectives had brutally threatened him, beat him, and told him what to say about a crime he did not commit. The jury readily sentenced Jones to death for a brutal rape and murder on Chicago’s South Side.
The word cloud shows why the jury put Jones on death row. Some of the most prominent words, after “Yes, Sir,” are key details about the crime scene: that there was a knife, that the murder occurred in the abandoned Crest hotel, that the killer left through a window. Jones protested his innocence at trial, but those facts were powerfully damning. The lead detective testified at trial that, in the interrogation room, Jones had told them exactly how the victim was assaulted and killed, and had finally signed that confession statement. The detectives said they brought Jones to the crime scene, where Jones supposedly showed them where and how the murder occurred. After his trial, Jones lost all of his appeals. Once DNA testing was possible in the mid-1990s, he was denied DNA testing by a judge who was so convinced by his confession statement that he remarked, “What issue could possibly be resolved by DNA testing?”
In my book, I examined what went wrong in the first 250 DNA exonerations in the U.S. Jones was exonerated by a post-conviction DNA test. Now we know that his confession, like 40 other DNA exoneree confessions, was not just false, but likely contaminated during a botched interrogation. Now we know that 190 people had eyewitnesses misidentify them, typically due to unsound lineup procedures. Now we know that flawed forensics, in about half of the cases, contributed to a wrongful conviction. Now we know that informants, in over 50 of the cases, lied at trial. Resource pages with data from the book about each of these problems, and with material from these remarkable trials of exonerees, are available online.
Returning to Ronald Jones’ false confession, the Supreme Court has not intervened to regulate the reliability of confessions, such as by asking courts to inquire whether there was contamination, or simply requiring videotaping so that we know who said what and whether the suspect actually knew the actual facts of the crime. Typical of its rulings on the reliability of evidence in criminal cases, the Court held in Colorado v. Connelly that though a confession statement “might be proved to be quite unreliable . . . this is a matter to be governed by the evidentiary laws of the forum . . . not by the Due Process Clause of the Fourteenth Amendment.” Preventing wrongful convictions has largely fallen on the states. I end the book with optimism that we are starting to see stirrings of a criminal justice reform movement.
posted by Danielle Citron
This week, we will be hosting an online symposium on Brandon Garrett’s Convicting the Innocent: Where Criminal Prosecutions Go Wrong (Harvard University Press 2011) (just released in paperback). Garrett’s book exposes the “nightmarish reality” of systemic flaws in our justice system that result in wrongful convictions. Those flaws include false and coerced confessions, troubling eyewitness procedures, invalid forensic testimony, corrupt statements by jailhouse informers, and the judiciary’s overweening procedural focus and blind eye to actual factual innocence. Garrett demonstrates that “[w]hat makes the trials of exonerees so frightening is that they show how the case against an innocent person may not seem weak. The case may seem uncannily strong.” In the New York Times Sunday Book Review, Professor Jeffrey Rosen described Convicting the Innocent as “a gripping contribution to the literature of injustice, along with a galvanizing call for reform.”
In reading Garrett’s book, it was hard for me to shake my own memories of the Central Park Jogger trial in 1990-91. I was a trial preparation assistant at the Manhattan District Attorney’s Office for the bureau next to the one in charge of prosecuting the case. In the mornings when delivering files to the various courtrooms, we’d witness the march to the courthouse: prosecutor Liz Lederer’s team, the defendants’ family members, their lawyers, activist Al Sharpton, and Bill Tatum, editor of the New York Amsterdam News. Public conversation was divisive. The mainstream media cast the defendants as a “wilding” mob of black teenagers who descended on the petite white jogger; the Amsterdam News decried the arrest and prosecution as racial injustice. Even though forensic evidence exonerated the defendants (the FBI lab conclusively determined that the semen found on the victim’s sock did not belong to any of the defendants), the jury convicted them. The justice system failed us: we convicted the innocent. Ken Burns and Sarah Burns’ film The Central Park Five is an important documentary companion to Garrett’s important work on our flawed system of justice.
Along with Garrett’s book, we will be discussing three new books that intersect well with the problems he tackles in Convicting the Innocent: Stephanos Bibas’s The Machinery of Criminal Justice (Oxford University Press 2012), Daniel S. Medwed’s Prosecution Complex: America’s Race to Convict and its Impact on the Innocent (New York University Press 2012), and Dan Simon’s In Doubt (Harvard University Press 2012).
To discuss the books, we will be joined by an exciting group of scholars:
posted by Danielle Citron
Next week, we will be hosting an online symposium on Brandon Garrett’s Convicting the Innocent: Where Criminal Prosecutions Go Wrong (Harvard University Press 2011) (just released in paperback). When I first approached Brandon about bringing a group together to discuss his ground-breaking work, he suggested that we highlight and discuss three new books that intersect well with the problems he tackles in Convicting the Innocent: Stephanos Bibas’s The Machinery of Criminal Justice (Oxford University Press 2012), Daniel S. Medwed’s Prosecution Complex: America’s Race to Convict and its Impact on the Innocent (New York University Press 2012), and Dan Simon’s In Doubt (Harvard University Press 2012). On Monday, Professor Garrett will kick off the symposium by providing an overview of all four books and their contributions. That will no doubt spark a conversation about our criminal justice system, including the value of DNA evidence, the black box nature of plea bargains and its troubling implications, prosecutorial misconduct and discretion, cognitive biases affecting juries, among other issues. Brandon Garrett will be joined by an exciting group of scholars: