Striving for Algorithmic Justice

Law professors and students look behind the curtain of computer programs

Ellen Goodman is director of the Rutgers Institute for Information Policy & Law.

By Sam Starnes

Algorithms are everywhere. These sets of step-by-step instructions that tell computers how to complete tasks can determine what ads you see on a website, who is eligible for a loan, and whom employers hire. Algorithms also inform many decisions made in the criminal justice system, and governments use all manner of algorithms to decide how benefits, resources, and sanctions are distributed. “People don’t realize how much influence algorithms have on our lives,” said Ellen Goodman, a professor at Rutgers Law School in Camden who is director of the Rutgers Institute for Information Policy & Law. “Many of the decisions that people have made in the past are being made by algorithms.”

And like decisions made by people, algorithms are often tainted with bias. Computers do what they are told, or what the data they are fed tell them to do. “Concerns about fairness and equity arise when the algorithms have been trained on biased data,” Goodman said. “These algorithms may make recommendations to police departments or housing departments or social services that are themselves biased and disadvantage certain populations.”

As an example, Goodman cites how police departments often use data to create maps that assign officers to patrol particular areas. “You would want to send police to where there is a lot of criminal activity,” she said. “What’s a proxy for criminal activity that algorithms use? Arrests. But those records reflect histories of over-policing and over-arresting, so you end up perpetuating that.”

She cited arrests for marijuana possession as an example: African American youth are arrested for marijuana possession far more often than white youth, so their neighborhoods loom larger in the arrest data and draw more police attention. “You send the police to neighborhoods with more historic arrest data and then there are more arrests,” Goodman said. “It’s a reinforcement of historic data.”
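
To make that feedback loop concrete, here is a minimal Python sketch, a hypothetical toy model rather than any department’s actual system, in which patrols are assigned in proportion to historical arrest counts even though the underlying behavior in every neighborhood is identical.

    # Toy simulation of the arrest-data feedback loop described above.
    # All numbers are invented for illustration; this is not a real policing model.
    import random

    random.seed(0)

    # Assumed starting arrest counts for three otherwise identical neighborhoods.
    arrests = {"A": 120, "B": 60, "C": 60}
    TOTAL_PATROLS = 30  # patrols available per cycle (assumed)

    for cycle in range(5):
        total = sum(arrests.values())
        for hood, count in list(arrests.items()):
            # Patrols are allocated in proportion to past arrests -- the "proxy."
            patrols = round(TOTAL_PATROLS * count / total)
            # Each patrol records an arrest with the same probability everywhere,
            # so any gap in the data comes only from where patrols were sent.
            arrests[hood] += sum(random.random() < 0.5 for _ in range(patrols))
        print(f"cycle {cycle + 1}: {arrests}")

Even though every neighborhood behaves identically in this toy model, the one that starts with the most recorded arrests keeps receiving the most patrols, so its lead in the data grows with every cycle.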

Brooke Lewis CCAS’15, RLAW’17, associate counsel for criminal justice reform at the New Jersey Institute for Social Justice (NJISJ) in Newark, said algorithms are used to inform many other governmental decision-making processes, such as determining pretrial release, sentencing, and screening allegations of child abuse. “These are very serious governmental actions that have serious consequences for people’s liberty and autonomy,” said Lewis, a native of Gibbsboro, New Jersey, who earned her undergraduate degree in political science and her law degree from Rutgers–Camden’s B.A./J.D. program.

As a student, Lewis worked as a research assistant for Goodman, helping with research for Goodman’s article “Algorithmic Transparency for the Smart City,” published in 2018 in the Yale Journal of Law & Technology. Goodman is a noted speaker and researcher on the trend of “smart cities,” which use pervasive data gathering and integration, big data analytics, and artificial intelligence to manage many city functions. Lewis said what she learned about algorithms while working with Goodman has been very useful in her current work in criminal justice reform. “One of the most dangerous things about using these types of algorithms is the level of deference people inherently give them,” Lewis said, noting that she was speaking on her own behalf and not that of NJISJ. “People look at it and say, ‘These are numbers. It’s an algorithm. It’s a computer. We are not having any human judgment.’ But that’s not true. There is human judgment because there is still a human who sits down and decides what factors matter and how much each factor matters.”
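
Lewis’s point can be seen even in a trivial scoring formula. The Python sketch below is purely illustrative, with invented factor names and weights rather than those of any real pretrial or screening tool, but it shows where the human judgment lives: someone decides which factors count and how much each one counts.

    # Illustrative risk-score calculation. The factors and weights below are
    # invented for this example; they are not drawn from any actual tool.

    WEIGHTS = {                 # chosen by people, not discovered by the computer
        "prior_arrests": 2.0,
        "age_under_25": 1.5,
        "failed_to_appear": 3.0,
    }

    def risk_score(person: dict) -> float:
        """Weighted sum of the selected factors for one person."""
        return sum(WEIGHTS[factor] * person.get(factor, 0) for factor in WEIGHTS)

    # Example: the prior-arrests input itself reflects where police were sent,
    # so a score that looks like neutral arithmetic can carry that history forward.
    print(risk_score({"prior_arrests": 3, "age_under_25": 1, "failed_to_appear": 0}))
    # prints 7.5

The number that comes out looks objective, but it inherits both the weighting choices and whatever bias is baked into the inputs.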

Kayvon Paul, a second-year Rutgers Law student in Camden, is working as a research assistant with Goodman this semester and helping her to track down policies on the use of algorithms in cities such as Dallas, Texas, and Detroit, Michigan. “It’s eye-opening to see that when you remove humans from the decision-making process, it has the potential to do more harm than good,” Paul said.

Paul, a part-time law student from Asbury Park, New Jersey, has worked as a government lobbyist in Trenton. He said Goodman’s efforts to advocate for transparency in how algorithms function are critical going forward. “All governments are going to transition to the use of algorithms and smart technology,” he said. “The only question is when?”

An expert in freedom of information law, Goodman said that looking behind the curtain of how algorithms are created is crucial to ensuring they are effective and just. Freedom of information laws, she said, must be rigorous in ensuring transparency of the data inputs and computational programs behind decisions that affect so many lives. “It’s very important that decision makers understand the meaning of algorithms,” Goodman said. “We must, through freedom of information laws, fight for algorithmic justice and end the unfairness, opacity, and lack of due process inherent in many algorithms.”

For more on a related topic, see “Harnessing Hateware” from the fall 2020 Rutgers–Camden Magazine.
