By Gwyneth K. Shaw
Several UC Berkeley Law experts spoke at a recent webinar organized by the Santa Clara County public defender’s office to update lawyers across the state about new tools, research, and tactics involving technology in the criminal justice system.
Criminal Law & Justice Center Executive Director Chesa Boudin and Professors Colleen V. Chien ’02, Andrea Roth, and Rebecca Wexler spoke at the webinar, which offered a way for members of the California Bar to fulfill the recently introduced “tech” Continuing Legal Education requirement.
Chien outlined new work she’s involved in on several fronts, including her scholarly research on AI, access to justice, and the criminal justice system, and efforts to offer practical aid to systems-impacted people under California’s Racial Justice Act (RJA) and the state’s “Clean Slate” laws.
Enacted in 2020, the RJA prohibits bias based on race, ethnicity, or national origin in charges, convictions, and sentences issued in court. It permits a challenge to a criminal conviction if a judge, attorney, law enforcement officer, expert witness, or juror exhibited bias or animus towards the defendant because of their race, ethnicity, or national origin — or used racially discriminatory language during the trial.
To make a case for relief on the basis of a pattern of racial disparity, however, the law requires evidence of that disparity, which is often hard to come by for individual defendants. Chien and a team of computer scientists, law students, and an economist built an innovative database drawing on the California Department of Justice’s Criminal Offender Record Information records of arrests, court actions, convictions, and sentences in California. Users can sort by a variety of variables, including a defendant’s race, the charges, the county, and the year.
The tool, developed by the Paper Prisons Initiative, a grant-funded research project Chien founded and leads, makes it possible to access data supporting a prima facie case, leveling the playing field and enabling more people to assess and bring their potential claims.
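The kind of query such a tool supports can be sketched in a few lines of Python. The snippet below is illustrative only, assuming a hypothetical CSV extract with columns named race, offense, county, year, and outcome; the Paper Prisons database’s actual schema and interface may differ:

```python
import pandas as pd

# Hypothetical extract of CORI-derived records; real column names may differ.
df = pd.read_csv("cori_extract.csv")  # columns: race, offense, county, year, outcome

# Narrow to the charge, county, and period relevant to a client's RJA claim.
subset = df[
    (df["offense"] == "PC 11351")          # hypothetical charge code
    & (df["county"] == "Santa Clara")
    & (df["year"].between(2015, 2020))
]

# Compare conviction rates across racial groups to surface any disparity.
rates = (
    subset.groupby("race")["outcome"]
    .apply(lambda s: (s == "conviction").mean())
    .sort_values(ascending=False)
)
print(rates)
```

Grouping or filtering on other columns works the same way, which is what lets an individual defendant assemble the pattern evidence the RJA requires.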
The database is constantly updated and has the potential to fulfill the promise of the RJA, Chien said.
“We’ve gotten a lot of inquiries from public defenders, which we’re really happy to see, but we’ve also got a lot of inquiries from individual people,” she said.
UC Berkeley student Alonzo Harvey, who was formerly incarcerated and works with both Chien and Boudin, told attendees about a friend who’s helping others with RJA claims.
“The tool shows the racial disparities and then we’re able to use that to tell our stories behind what’s going on in the criminal justice system,” he said. “We know it’s built on racism, but it’s a lot easier when the statistics are being highlighted behind our stories.”
Santa Clara County Deputy Public Defender Jake Rhodes talked about a project he’s working on with Chien involving another California law aimed at automatically clearing the records of people who have completed their sentences and probation.
“This is a great benefit to people, but it’s only a benefit if, number one, the person knows about it, and number two, this is effectual,” he said. “And we’re seeing that this is not effectual in reaching the counties.”
Santa Clara, a large county, has over 260,000 expungement cases, Rhodes said. But getting notice to the people affected, and helping them understand how to proceed, is a huge challenge.
That’s where Chien and her team come in. In a notification and research initiative that Chien has analogized to Juneteenth, she is working with behavioral scientists to develop a protocol for letting people know about their new freedom from criminal records. Chien is also teaming up with UC Berkeley computer science Professor Irene Chen to develop a chatbot that helps people navigate the legal questions and options that open up once their records are cleared.
“The goal of the rule isn’t just to change the law and pat ourselves on the back, it’s to really make sure that it actually has impact on people’s lives and is meaningful for them, and that they can actually use that information in a way that’s going to benefit them,” said Chien, who’s a faculty co-director of the Berkeley Center for Law & Technology (BCLT) and on the faculty advisory board of Boudin’s center.
“Often, what we’re scared about with respect to AI is prediction or taking away human discretion, but in these projects we’re talking about leveraging AI’s capabilities for things like translation, and we know legalese is a language people do not understand,” she added. “I think this is where you can see that generative AI can be leveraged very powerfully.”
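By way of illustration only, a plain-language step in such a pipeline might look like the sketch below. Everything here is hypothetical: the notice text, the statute citation, the prompt wording, and call_model(), which stands in for whatever generative model the team actually uses.

```python
# Illustrative only: a plain-language rewriting step for record-clearance
# notices. The notice text and prompt are invented for the example, and
# call_model() is a placeholder for a real generative-AI API call.

LEGALESE = (
    "Pursuant to Penal Code section 1203.425, the above-referenced "
    "conviction has been granted automatic relief."
)

prompt = (
    "Rewrite this legal notice in plain, eighth-grade English, keeping "
    "every fact accurate, then translate it into Spanish:\n\n" + LEGALESE
)

def call_model(prompt: str) -> str:
    """Placeholder: substitute a real LLM client call here."""
    raise NotImplementedError

print(prompt)  # in a real pipeline: print(call_model(prompt))
```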
A courtroom blueprint
Roth, who’s been writing about the evidentiary issues raised by machines and software for nearly a decade and is also a BCLT faculty co-director, offered participants a how-to guide for thinking about — and potentially challenging or defending — these kinds of materials in court. Few rules govern the assertions of automated systems, she said, which presents a challenge for prosecutors and defense lawyers.
“What I’m hoping to suggest to you is that you start thinking broadly in reverse about what you would need to meaningfully scrutinize this proof, and then to think about what legal hooks you could use to get it,” she said.
Roth outlined some of the legal tests and considerations lawyers should keep in mind, noting that prosecutors, too, should be able to ensure they understand and trust the results of machine-generated evidence they’re presenting, and should realize that the defense might present these tools as well.
Understanding what you need to know about the validity of a given result, Roth emphasized, is critical to crafting the argument for getting access. She gave as an example two DNA testing devices that came to different conclusions about the same sample because each method was making different assumptions about DNA markers — an avenue for challenging the evidence at trial, but only if you know how to uncover it.
“If you want the source code, you’re going to have to explain why you need it, and you may need to talk to an expert about that,” Roth said. “If this program could talk, and you could submit it to a deposition, what would you ask the program about its assumptions, or the hypotheticals it considered?”
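To make that concrete, here is a toy Python calculation, not any vendor’s actual model, showing how two programs that assume different allele drop-out probabilities can reach opposite conclusions about the same single-locus sample:

```python
# Toy single-locus likelihood-ratio calculation, illustrating how one
# modeling assumption (the allele drop-out probability) can flip the
# conclusion. A simplified sketch, not any vendor's actual algorithm.

ALLELE_FREQ = 0.1  # hypothetical population frequency of the observed allele

def likelihood_ratio(dropout_prob: float) -> float:
    # Hp (prosecution): the suspect contributed, but only one of his two
    # alleles appears in the evidence; the model must explain the missing
    # allele as drop-out.
    p_given_hp = (1 - dropout_prob) * dropout_prob
    # Hd (defense): an unknown heterozygous person carrying the observed
    # allele contributed instead.
    p_given_hd = 2 * ALLELE_FREQ * (1 - ALLELE_FREQ)
    return p_given_hp / p_given_hd

for dropout in (0.05, 0.30):  # two programs, two drop-out assumptions
    lr = likelihood_ratio(dropout)
    verdict = "favors prosecution" if lr > 1 else "favors defense"
    print(f"drop-out={dropout:.2f}  LR={lr:.2f}  ({verdict})")
```

Under the low drop-out assumption the same evidence cuts in the defendant’s favor; under the high one it tips toward the prosecution, which is exactly why Roth urges lawyers to dig into a program’s underlying assumptions.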
Roth, who has presented to the Advisory Committee on the Federal Rules of Evidence to help develop new rules for machine-generated proof, said a potential draft rule being explored would make this evidence, including output from AI-based programs, subject to the same level of methodological scrutiny applied to a human expert.
“I think you can use this as a hook to be able to impeach a machine in any way that you would otherwise be able to impeach an expert,” she said.
Wide-ranging efforts
Boudin discussed some of the ways technology-aided systems, often acquired by police forces through grants, can create unintended problems. Using the license plate readers sold by Flock Safety as an example, he noted that while the images are very useful for tracking down stolen cars, the sheer volume of such incidents can divert police resources away from more serious crimes.
In addition, Boudin said, technology is making it easier for police to make arrests without a corresponding increase in capacity elsewhere in the criminal system.
“What does it do to the number of judges, the number of trial courtrooms, to the number of public defenders, and to the technical capacities to process or to challenge or to interrogate the validity of the evidence that is being generated by these new technologies?” Boudin said. “Are we going to have money available to hire technology experts in public defender’s offices? Are we going to have more courtrooms available to try more cases that are being generated by these kinds of low-hanging fruit arrests that rely heavily on technology?”
Wexler explained some recent and potential developments around the federal Stored Communications Act (SCA), which protects the privacy of electronic communications held by third-party providers, limiting what the government can demand and what a company can disclose voluntarily. The way today’s tech companies, including Meta and Snapchat, are using the law amounts to “a profound and growing injustice in which major technology companies have distorted this statute to block criminal defense subpoenas,” she said.
Also a BCLT faculty co-director, Wexler offered a model brief for attendees to use and detailed a case before the California Supreme Court that could upend the process. Lawyers for the criminal defendant in the case have argued that the tech companies’ data-mining policies mean the SCA doesn’t apply to them, and therefore they must comply with subpoenas and produce what the defense is asking for.
“Not only does it mean that the defense subpoenas will be enforceable, but it also means that none of the SCA protections, none of its limitations on government access to stored communications, none of its limitations on technology companies’ voluntary disclosures, would apply to most major service providers today,” Wexler said. “On the one hand, that shock to the system could potentially encourage Congress to enact a new federal data privacy law updated for today’s technologies. But on the other hand, if Congress doesn’t do that, then one of the few significant federal data privacy laws that we have on the books will be dead.”