Economy, housing, and a number of other areas. In some cases, these outcomes may be a result of historically racist police tactics. As Harvard Law and Computer Science Professor Jonathan Zittrain observed in a recent talk, "If it's not outlandish to think arrest rates are affected by things they really shouldn't be—by demographics or innate personal characteristics—the algorithm would happily predict what a police officer would do rather than what the accused would do." Christopher Griffin, Research Director of the Harvard Access to Justice Lab, agreed, explaining a challenge in a model his team is building: "An outcome is measured not as a charge, certainly not as a conviction, but as an arrest. That could very much be not related at all to the underlying criminal behavior, but to the practices of law enforcement." Some vendors, like algorithmic policing company Azavea, have sought to mitigate bias by deemphasizing certain arrest data, particularly for racially loaded drug and nuisance crimes.

A trickier issue is how to handle situations where a racial or ethnic group is indeed more likely to commit a crime, not just more likely to be targeted by police. Such trends are often the result of a long and complex history of prejudicial treatment across many walks of American life. Do you target these residents, put them in jail, and thereby perpetuate the very systems that made these groups more likely to commit these crimes in the first place? That does not seem like a satisfying response, but a substantive analysis of these issues is outside the scope of this article. What this article does argue is that identifying threats of crime using good data rather than police intuition can in fact reduce bias in policing, that keeping considerations of bias in mind while developing such initiatives will drive continual improvement, and that cities need to have conversations about perpetuating bias before deploying these kinds of initiatives.

MAKING ALGORITHMS PUBLICLY ACCESSIBLE

The potential for prejudice in social listening initiatives points to a broader problem common to many analytics initiatives: the algorithms employed are often not publicly available.