An Airbnb host used AI-manipulated images to claim a guest had caused thousands of dollars in damage, leading Airbnb to initially charge the guest over $7,000 before eventually refunding her after she proved the images were fabricated.
A London-based academic booked a one-bedroom apartment in Manhattan for two-and-a-half months but left early, after seven weeks, due to safety concerns. The host then claimed she had caused over $16,000 in damage, submitting manipulated images, including photos of a cracked coffee table whose inconsistencies indicated AI manipulation or digital fabrication. The host alleged damage to multiple items, including a mattress stained with urine, a robot vacuum, a sofa, a microwave, a TV and an air conditioner. Airbnb initially ruled in favor of the host after reviewing the photos and ordered the guest to pay $7,000. The guest appealed, providing eyewitness testimony and demonstrating visual discrepancies in the images that proved fabrication. After Guardian Money questioned Airbnb about the case, the company accepted the appeal and eventually refunded the full booking cost of approximately $5,700. The host, who was listed as a 'superhost', was warned by Airbnb for violating its terms and told he would be removed from the platform if similar reports were made in future. Airbnb apologized and launched an internal review of how the case was handled.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
Human — Due to a decision or action made by humans
Intentional — Due to an expected outcome from pursuing a goal
Post-deployment — Occurring after the AI model has been trained and deployed