Happy New Year!

Happy (nearly) New Year from all of us here at the Lab!

We’re excited for all we accomplished in 2017. This past year, we’ve seen the Lab grow in size and impact.

We now have over 6,360 participants enrolled in the Lab’s evaluations. We’re collaborating with 38 partners, including court systems, legal aid organizations, and other academic institutions. Over 75 student team members, along with our staff, have developed over 1,850 pages of self-help materials, as well as two digital self-help tools, to test for efficacy as we seek to learn the best way to help pro se defendants.

As the Lab runs more and more studies, our impact increases—and so do our costs.

In 2018, we’re hoping to double the number of studies we have in the field, but we can’t do it without your support.

If you’re thinking about making any final gifts in 2017, would you consider making a contribution to help the Lab continue to learn the best ways to help people with legal problems? Your gift will be put to immediate use in support of the Lab’s mission.

We look forward to sharing more news of our work in 2018!

More information on our Default Part II study in four graphs

We’ve been working on some new data representations for our Problem of Default Part II study, which is now in the field in Boston. This Part II study doesn’t have its own non-intervention control group (meaning, all of the groups we’re evaluating are receiving some sort of intervention). This is because Part I already demonstrated that even limited intervention has a statistically significant effect on defendants’ answer and appearance rates compared with no intervention. Part II seeks to build on that knowledge by testing whether some interventions are more effective than others.

That said, we always like to be as thorough as possible as we design our studies. To that end, before we launched Part II, we did some analysis of existing court case data for all small claims cases filed in 2016 to gather some baseline information. We’ve created four graphs, now live on a new study web page. (If you haven’t seen the study volume tracker, that’s worth a look as well.)

The graphs contain a lot of information, and, if you’re not familiar with statistics or the intricacies of programs available in Massachusetts courts, they might be a little difficult to read.

Before we drill into an example, we have a few notes on the definitions of the different variables. One variable is whether a hearing in a case was scheduled on a Lawyer for the Day (LFD) program day. The Massachusetts Lawyer for the Day program is a pro bono legal service that provides advising to pro se litigants in some courts on certain days of the week; exact services and availability vary from court to court. Another variable is whether a defendant fails to appear (FTAs) at a given hearing.[1] The graphs break down the data across these two variables, at different courts, in four ways:

  • Whether the defendant ever failed to appear (FTA’d) at any hearing that was held
  • Whether the defendant failed to appear at the first hearing that was held
  • Whether the defendant’s first scheduled hearing fell on a day when the Lawyer for the Day (LFD) program was operating at the court and the defendant appeared at that hearing
  • Whether any of the defendant’s scheduled hearings fell on an LFD program day and the defendant appeared at one or more of those hearings
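As a sketch of how these four per-case indicators might be computed from hearing records, consider the following Python. The record layout and field names here are our own illustration, not the Lab’s actual data schema:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Hearing:
    held: bool           # the hearing actually took place
    scheduled_lfd: bool  # scheduled on a Lawyer for the Day (LFD) program day
    appeared: bool       # the defendant appeared

def case_flags(hearings: List[Hearing]) -> Tuple[bool, bool, bool, bool]:
    """Return the four per-case indicators, in the order listed above.
    `hearings` must be in scheduled order."""
    held = [h for h in hearings if h.held]
    # Ever FTA'd at any hearing that was held
    ever_fta = any(not h.appeared for h in held)
    # FTA'd at the first hearing that was held
    first_fta = bool(held) and not held[0].appeared
    # First scheduled hearing was on an LFD day and the defendant appeared
    first = hearings[0] if hearings else None
    first_lfd_appeared = first is not None and first.scheduled_lfd and first.appeared
    # Any scheduled hearing was on an LFD day and the defendant appeared at it
    any_lfd_appeared = any(h.scheduled_lfd and h.appeared for h in hearings)
    return ever_fta, first_fta, first_lfd_appeared, any_lfd_appeared
```

Note that the first two indicators are computed over hearings that were actually held, while the two LFD indicators are computed over scheduled hearings, mirroring the definitions above.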

Let’s take a look at an example data point:

In this example, the circled dot represents the proportion of study-ineligible cases (indicated by color) in Cambridge Small Claims Court (y-axis). The dot’s size shows that the cases it represents make up about 0.4 of the total cases in that court, which works out to around 325 cases (0.4 of the court’s 811 cases in the sample).

The dot shows that in almost 25% of the study-ineligible cases in Cambridge Small Claims Court, the first hearing was scheduled on a Lawyer for the Day program day and the defendant appeared at that hearing.
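The dot-size arithmetic can be checked directly; the 0.4 share and the 811-case total are the figures quoted above:

```python
total_cases = 811   # Cambridge Small Claims Court cases in the 2016 sample
dot_share = 0.4     # fraction of the court's cases this dot represents

# 811 * 0.4 = 324.4, which the post rounds to "around 325" cases
cases_in_dot = total_cases * dot_share
```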

Our hope is that these graphs, along with the frequently updated study volume information, provide a window into the study’s design and progress as we move forward. Look for more updates on data from this and our other studies in early 2018.

[1] In Boston Municipal Court (Civil), the defendant FTAs if the defendant does not file an answer or does not appear at the first hearing; the defendant does not FTA if the defendant does both of those things.

In The News

Over the past few weeks, we’ve been talking about a few news stories here at the Lab. We thought they might be of interest to you as well. If you’re looking for some reading, consider the following:

Happy reading!

RCTs in law: the Shriver studies

As you may remember from a previous post, the A2J Lab is developing an RCT in Providence, Rhode Island, to study the effectiveness of triage in summary eviction cases.

Part of our interest in studying eviction is that it’s a topic very much on policymakers’ minds. Because housing instability continues to receive a lot of attention (see, e.g., Matthew Desmond’s Evicted), more resources and political action tend to follow. The even better news is that those resource allocations and policy changes have been based, at least sometimes, on empirical research.

As the Lab has documented elsewhere, access to justice interventions often aren’t studied at all. If they are, the studies are often observational rather than randomized—and readers know how important we think randomization is here at the Lab!

We’re always very excited to see the work of other legal studies teams who think so as well. One recent example is the evaluation of the California Shriver Civil Counsel Act. Among other things, the Act enhanced tenant representation in eviction cases. As part of a trial for additional funding provided by the Act, the State studied the impact of seven pilot programs designed to increase access to legal representation among low-income populations in California. More than simply moving toward empirical data, the report notes that “[i]mportantly, for a limited period of time, three pilot projects randomly assigned litigants to receive Shriver full representation or no Shriver services, and data for these two groups were compared.”

We’re excited to see other researchers embrace randomized evaluations and policymakers appreciate their findings. The ultimate hope is that this progress continues to transform the legal profession into a more evidence-based one.

Why RCTs? Recent study on stents is one example

This past week, we’ve been avidly watching reactions to a new study, published in The Lancet, about the efficacy of using stents to help patients with chest pain. The New York Times ran an article on the study; so did The Atlantic.

If you haven’t been following this (potential) bombshell of an RCT, the study found no value in using stents to combat heart pain. Why is this such big news? Partially because using stents for cardiac pain is big business. According to the study’s authors, more than 500,000 patients receive the procedure annually for chest discomfort.

It’s also big news because it goes against intuition, even the sort that medical laypeople possess. Without evidence to the contrary, it might seem logical that opening blocked arteries with a stent would reduce chest pain. No wonder doctors adopted the practice with vigor! Now there are data that don’t back up that perception. Even in medicine, a field long conditioned to accepting the validity of empirical research, studies will bump up against the fallacy of conventional wisdom.

That fact doesn’t surprise us at the A2J Lab. What did grab our attention is that the authors received permission to run the study at all. As we mentioned in a recent post, all RCTs in the U.S. need to receive institutional approval before human subjects can enroll in a study. Based on our experience, it would be fairly startling if this type of study, which flies so baldly in the face of “conventional wisdom,” were to receive approval in the United States. An ethical review committee could have responded that this evaluation would prevent some participants from receiving a “benefit,” namely the treatment they “need.” The deeper held the belief, the harder it is to accept or allow the introduction of contrary evidence. That’s why we need to test interventions rigorously, particularly when resources are scarce and lives are at stake.

One final note on the study’s design. Some medical researchers have critiqued the study as vulnerable to “Type II error”: in short, they contend that the sample size (here, about 200) is too small to rule out a false negative. Having a sufficient sample size is an important component of any RCT. The Lab, for example, uses power analysis to maximize the chance that a study will have enough observations to detect an effect, should that effect really exist. But a study’s sample size isn’t the only factor in determining its validity; it’s also important to know how generalizable the results are, regardless of their statistical significance.
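To make the power-analysis idea concrete, here is a minimal sketch, in Python using only the standard library, of the standard two-proportion sample-size formula. The 50%-to-60% appearance-rate figures are hypothetical illustrations, not numbers from any Lab study:

```python
from math import ceil, sqrt
from statistics import NormalDist

def two_proportion_sample_size(p1: float, p2: float,
                               alpha: float = 0.05,
                               power: float = 0.80) -> int:
    """Per-arm sample size needed to detect a difference between
    proportions p1 and p2 with a two-sided test at level `alpha`
    and the given power (normal approximation)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical: detect a rise in appearance rates from 50% to 60%
n_per_arm = two_proportion_sample_size(0.50, 0.60)
```

The formula makes the stent critique easy to see: detecting even a ten-percentage-point difference at conventional levels requires several hundred observations per arm, so a study of about 200 participants risks missing a real effect.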

This is just one more example of why RCTs are important. Have you seen others recently? Share them with us in the comments or on social media.

Top 10 A2J Research Priorities #10: Ex Ante Law for Human Beings

We’ve reached the final installment in our Top 10 A2J Research Priorities series. While this is the last of the videos, it’s just the beginning of our work on these topics.

This week’s segment features Faculty Director Jim Greiner on Ex Ante Law for Human Beings.

RITES Goes Local

We’re excited to announce a new collaborative venture on the Rhode Island Triage and Eviction Study (“RITES”) team. Five outstanding students at the Roger Williams University School of Law have joined the project to conduct summary eviction court observations and interview unrepresented tenant-defendants who are experiencing the eviction process themselves. These raw data, so to speak, will give everyone working on the project a better understanding of the current system and inform ongoing study design. For example, information about what tenants would have liked to know before their hearings will enhance the self-help materials designed for the evaluation.

In addition to this unique learning experience, their participation strengthens the project’s connections within the Providence community. These dedicated students—under the supervision of Professor Jonathan Gutoff; Director of the Feinstein Center for Pro Bono & Experiential Education, Laurie Bannon; and Director of Pro Bono & Community Partnerships, Eliza Vorenberg—will bring important new perspectives and added capacity to the growing RITES field organization.

This partnership is an exciting one for us at the Lab. One of our core missions is to work with the next generation of legal practitioners and scholars and cultivate dedication to making the law an evidence-based profession. Having those future leaders actively participate in all phases of a Lab study is the most effective way for us to do just that.

We also prioritize having multiple partners work together on RCTs to maximize their impact. Having RWUSOL join us and the Rhode Island Center for Justice as we study the latter’s triage process for assigning representation in eviction cases will only improve our work. We all will benefit from each other’s perspectives, ideas, and insights into the understudied process of distributing scarce legal resources.

Stay tuned for more on this venture!

MA Guardianship Policy Institute Event this December

The Lab’s current Guardianship Service of Process Study is going strong in the field, and we’re not the only ones actively thinking about the topic. On Wednesday, Dec. 6, the Massachusetts Guardianship Policy Institute will facilitate “A National Perspective on Guardianship and Decisional Support,” a day-long conference exploring the state of both areas in several different contexts. You can learn more about the conference here.

This event will present several different perspectives, and the Lab is always looking for new lenses through which to view the fields we study. The A2J Lab doesn’t have an opinion about whether guardianship or its alternatives are good or bad, or about the situations in which any alternative is appropriate. We believe that as long as guardianship is a step that the law makes available, its availability should turn on the criteria outlined in governing law and not (as it appears to now) on whether a petitioner has a lawyer or is able to negotiate the procedural hurdles involved in a petition’s processing.

In the News

We’re happy to share two great articles from the past week that feature the work of the Lab.

The first, “The Justice Gap: America’s unfulfilled promise of ‘equal justice under law’” by Lincoln Caplan, is a longform piece in Harvard Magazine that puts the Lab’s studies of self-help materials in the context of the larger debate about how best to address the access to justice gap. It’s a great read if you’re interested in learning more about the history of legal aid and how the work of the Lab fits into that framework.

The second, “Unicorns: RCTs, the Social Sciences, and IRBs” by Tonya Ferraro, published on Ampersand, the blog of Public Responsibility in Medicine and Research (PRIM&R), draws on Jim Greiner’s research on the history of RCTs in the legal profession. Ensuring that studies involving human subjects, such as those the Lab runs, are ethical is an important part of the legal research process, and one that both lawyers and review boards can be unfamiliar with because of the paucity of RCTs in law. The article describes how the IRB process can adapt to meet the needs of such studies.

For those of you who don’t engage in academic research, “IRB” is short for Institutional Review Board; institutions whose faculty and students engage in research establish their own committees, which ensure that any studies involving human subjects meet federal ethical standards. (If you’re unfamiliar with IRBs and how they operate, you can learn more here.)

See any great work about access to justice in the press? Share it with us on Twitter or Facebook.

Top 10 A2J Research Priorities #9: Alternatives to Litigation

This video series describes what we consider to be the top 10 access to justice research priorities. The quick 2-3 minute videos will wake you up and get you excited about the ways that experimentation, research, and (of course) RCTs can improve access to justice.

Here’s the latest installment: Alternatives to Litigation.
