A San Francisco animal shelter deployed a K5 security robot to patrol sidewalks outside its facility, which was perceived as targeting homeless people and led to public outcry and the robot's removal.
In November 2017, the San Francisco SPCA deployed a Knightscope K5 security robot to patrol the sidewalks and parking areas around its Mission District facility. The 400-pound, 5-foot-tall robot was equipped with four cameras capable of reading 300 license plates per minute and cost $6 per hour to rent. The SPCA stated it deployed the robot to address security concerns including break-ins, vandalism, harassment of staff, and discarded needles around its campus. However, homeless individuals camping near the facility perceived the robot as targeting them, with some calling it the 'anti-homeless robot.' The robot faced physical attacks, including being knocked over and having barbecue sauce smeared on its sensors. After a San Francisco Business Times article quoted SPCA president Jennifer Scarlett saying the robot was 'much easier to navigate than an encampment,' public outcry erupted on social media. The city then ordered the SPCA to remove the robot from public sidewalks or face $1,000 daily fines for operating without a permit. Following widespread criticism and accusations of targeting homeless people, the SPCA suspended the pilot program and apologized, stating that its words were 'ill-chosen' and did not reflect its values.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by AI, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for, and unfair representation of, those groups.
Human
Due to a decision or action made by humans
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed