
…and are thus not subject to challenge from residents. Cities typically employ algorithms developed by private contractors, who are unwilling to release detailed information for fear of revealing proprietary methods to competitors, while cities themselves hesitate to release source code, citing cybersecurity concerns and the risk that people could game public systems. Residents are therefore unable to assess the impartiality of the tools used to target people as potential criminals.

A lack of access to algorithms is a problem not only for ensuring equity, but also for confirming that the information gathered is accurate. The experience of Fresno City Councilmember Clint Olivier shows that social media mining can be a misleading source of information. Representatives from "Beware," a company that produces social listening software assigning residents and properties threat levels of green, yellow, or red based on their social posts, presented their product to the Fresno City Council. During the presentation, Olivier asked the representatives to look up his own threat level; his property came back yellow. In this case, Olivier's property appeared risky because Beware analyzes addresses over seven-year periods, and a former occupant may have had a criminal history. Social mining may also miscategorize residents based on hyperbolic or sarcastic posts, or by misidentifying them entirely. Similar algorithmic tools, such as one that predicts recidivism risk from resident data, have proven no more accurate than assessments by non-experts.

One could imagine such cases of mistaken identity leading to unnecessary escalation. As Olivier colorfully explained, "even though it's not me that's the yellow guy, your officers are going to treat whoever comes out of that house in his boxer shorts as the yellow guy." Believing that Olivier was potentially dangerous, officers might take excessive precautions and use force, creating an unnecessarily tense situation.

According to Kortz, in order to ensure due process, governments need to make their algorithms available to residents in one way or another. "To be able to challenge algorithms, they need to be auditable in some sense," he explained. "While companies shouldn't have to reveal their algorithms in their entirety, they should…
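To make the failure mode concrete, here is a minimal Python sketch of this kind of address-window scoring. It is an assumption-laden illustration, not Beware's method, which is proprietary and unpublished: the Incident and threat_level names, the severity weights, and the color thresholds are all invented. It shows both how a fixed seven-year lookback on an address attributes a former occupant's record to the current resident, and what the kind of audit trail Kortz calls for might expose.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sketch only: every name, weight, and threshold here is
# invented. The structural point is that scoring an ADDRESS over a fixed
# seven-year lookback attaches prior occupants' records to whoever
# lives there now.

LOOKBACK = timedelta(days=7 * 365)  # the seven-year analysis window

@dataclass
class Incident:
    when: date
    severity: int  # assumed weighting, e.g. 1 = minor report, 3 = violent offense

def threat_level(incidents: list[Incident], occupant_since: date,
                 today: date) -> tuple[str, list[str]]:
    """Score an address and return (color, audit_trail).

    The audit trail is the piece residents would need in order to see
    that a "yellow" rating was driven entirely by events predating the
    current occupant's tenancy.
    """
    window_start = today - LOOKBACK
    score = 0
    trail = []
    for inc in incidents:
        if inc.when < window_start:
            continue  # outside the seven-year window, ignored
        score += inc.severity
        who = "current occupant" if inc.when >= occupant_since else "PRIOR occupant"
        trail.append(f"{inc.when}: severity {inc.severity} ({who})")
    # Invented thresholds for the green/yellow/red banding.
    color = "green" if score < 3 else ("yellow" if score < 8 else "red")
    return color, trail

# A resident who moved in last year still inherits the address's past:
history = [Incident(date(2020, 6, 1), 3), Incident(date(2021, 2, 10), 2)]
color, trail = threat_level(history, occupant_since=date(2024, 5, 1),
                            today=date(2025, 4, 28))
print(color)            # "yellow", driven entirely by prior-occupant incidents
print("\n".join(trail))
```

Returning the trail alongside the color is the design choice at issue: it is precisely the information a resident like Olivier would need to challenge a yellow rating, and precisely what opaque deployments withhold.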
