Miami police used Clearview AI facial recognition technology to identify and arrest protester Oriana Albornoz, who was accused of throwing rocks at officers during the May 30 protests, without disclosing the use of the technology in arrest documentation.
On May 30, 2020, during protests outside Miami Police headquarters, Oriana Albornoz, 25, was allegedly captured on video throwing rocks at police officers, injuring one officer's leg. Miami Police used Clearview AI facial recognition software to identify Albornoz by matching surveillance footage against publicly available photos from social media sites including Facebook, LinkedIn, and Instagram. She was arrested a month later and charged with battery on a police officer; she pleaded not guilty.

The police arrest report stated only that she was 'identified through investigative means,' without mentioning facial recognition technology. NBC 6 investigators uncovered the use of Clearview AI through their own reporting.

Miami Police policy prohibits using facial recognition to surveil constitutionally protected activities such as protesting, but permits its use when crimes are committed. The department maintains logs and conducts monthly audits of facial recognition searches, and states that a positive match alone does not constitute probable cause for arrest. Clearview AI builds its database by scraping publicly available photos from social media.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that memorize and leak sensitive personal data or infer private information about individuals without their consent. Unexpected or unauthorized sharing of data and information can compromise users' expectation of privacy, facilitate identity theft, or cause loss of confidential intellectual property.
AI system: due to a decision or action made by an AI system
Intentional: due to an expected outcome from pursuing a goal
Post-deployment: occurring after the AI model has been trained and deployed