NEW SEARCH TECHNOLOGIES HELP FACULTY ANALYZE OUTCOMES AND SHAPE STRATEGIES
Professor Justin McCrary likens data analytics tools to a first-rate GPS, at least when it comes to legal research. “They prevent ill-advised turns and get you where you want to go faster,” he says.
Data analytics and new search technologies are helping Berkeley Law scholars improve their work’s predictive quality. Their hope: that efficiently harnessing unwieldy information will lead to more effective litigation, legislation, and adjudication.
“It’s exciting to have people from different disciplinary perspectives using similar tools,” says McCrary, who directs UC Berkeley’s Social Science Data Laboratory. “We’re beginning to see how these tools can help increase efficiency and transparency in many legal areas.”
McCrary recently co-built a Web portal for the California Attorney General’s Office with Steve Raphael of the Goldman School of Public Policy. It includes extensive information on arrests, violence against police, and deaths in police custody. “The portal is a more helpful release and analysis of data than we’ve seen from any other attorney general,” he says.
McCrary also helped the U.S. Equal Employment Opportunity Commission develop data systems that reveal factors influencing private sector and governmental diversity. On another project, he teamed with fellow professor Robert Bartlett to analyze how various features of U.S. financial markets influence high-frequency trading.
Berkeley Law students are also learning about these new techniques and their importance. McCrary and professor Kevin Quinn teach Litigation and Statistics, which illuminates how big data is shaping legal practice—particularly litigation.
Quinn and professor Mark Gergen used new search models to analyze cases before the New York State Court of Appeals—a key court in tort and contract jurisprudence—from the first half of the 20th century. They unearthed implicit relationships between decisions in these cases and recurring areas of disagreement among the judges: moralistic versus pragmatic, liberal versus conservative, stability versus flexibility. The professors are conducting a similar study of California Supreme Court cases.
“Much of what legal academics pursue is rooted in large bodies of text—court opinions, agency regulations, or statutes,” Quinn says. “These new statistical methods streamline our analysis of these texts and have the potential to open new areas of research.”
Professor Eric Biber and former colleague Eric Talley have begun analyzing appellate briefs and opinions in National Environmental Policy Act (NEPA) cases. They will try to identify patterns that correlate with outcomes—leading to predictions about how courts will rule in similar cases. Such predictions could help determine whether a NEPA-based challenge to a development project is likely to succeed, or guide those weighing whether to contest an agency’s NEPA compliance.
“Much of this predictive work has been done in private law,” Biber says. “We want to learn how to do it in public law, which might be tougher because it involves a wider range of statutes. But it’s vital for nongovernmental entities and small organizations to have access to these methods.”
—Andrew Cohen