Greetings from Indianapolis and the National Legal Aid and Defender Association (NLADA) annual conference! The event has gathered many of the best minds and providers in the legal services field. There is real momentum for collecting data, making use of promising technologies, and implementing evidence-based practices. All great news for the A2J Lab.
It was only fitting, then, that I attended the second annual Empirical Methods in Legal Scholarship Workshop, hosted by Alex Stremitzer of UCLA Law School (along with Eric Talley of Columbia Law). The two-day conference (Nov. 4-5) brought together economists, psychologists, and, yes, lawyers to discuss groundbreaking papers. Most of the studies presented used lab participants; a few engaged with actors in the field. Regardless of the subject matter or even the country in which the experiments were run, all eight papers leveraged the gold standard of randomization.
Of particular interest to the A2J Lab’s work were submissions from Dan Simon (USC), examining the effect of the “adversarial mindset” in lab-replicated legal proceedings; Avani Sood (Berkeley), testing how different legal rules and irrelevant facts affect guilty verdicts and punishments among MTurk participants; and our colleague Holger Spamann’s (Harvard) work with federal judges exploring the impact of precedent and irrelevant information on criminal appeals. Keynote speaker Ian Ayres (Yale) invited us to think about how to apply field experiments to a set of vexing legal questions.
Just as the NLADA conference has assembled the best in the legal services business, the EMLS Workshop showcased the vanguard of legal scholarship’s experimental turn. We discussed methodological issues, from the role of theory to external and construct validity. Exciting debates (at least for an assembly of law professors) emerged. Even if consensus wasn’t always reached, the conversations proved that empiricists are no longer content with observational data analysis. The law & economics revolution of the 1960s opened up to include sophisticated empirical methods by the 1980s. But to really understand how people interact with the law and legal processes, we have to randomize. And the Workshop showed how far we have advanced in that direction.
We at the A2J Lab, however, think the field can progress even further. Building on the work we are conducting, more legal scholars should partner with actual lawyers representing actual clients in actual matters. All of Sophie’s prior posts asking “Why RCTs?” point to the need for transporting these methods into the “real world.” In the meantime, we hope that our portfolio of studies convinces fellow researchers, as much as the bench and bar, of the promise that field experimentation holds.