A researcher found that TikTok's recommendation algorithm appeared to suggest accounts to follow based on physical appearance, creating 'face-based filter bubbles': following someone led to recommendations of people who looked similar in race, hair color, age, and other physical characteristics.
Marc Faddoul, an AI researcher at UC Berkeley specializing in algorithmic fairness, conducted an experiment on TikTok's user recommendation system. When a user follows a new account, TikTok suggests other accounts to follow, and Faddoul observed that these suggestions appeared to be based on physical-appearance similarities, including race, gender, hair color, age, body profile, and even visible disabilities. He repeated the experiment with a fresh account, documented similar results, and shared them on Twitter, calling the effect 'face-based filter bubbles.' BuzzFeed News replicated the experiment with comparable outcomes; following a hijabi creator, for example, led to recommendations of other women wearing hijabs.

TikTok denied that its algorithm uses race or profile pictures for recommendations, stating that it relies on collaborative filtering over user behavior: users who follow account A tend to also follow account B, so B is suggested to new followers of A. Faddoul noted, however, that collaborative filtering can still reproduce biases already present in user behavior and create feedback loops that make it harder for underrepresented creators to gain followers.

This was not TikTok's first controversy: in December, the company had admitted to suppressing content from queer, fat, and disabled users.
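The collaborative-filtering scheme TikTok described ("users who follow account A also follow account B") can be sketched in a few lines. This is a minimal illustration, not TikTok's actual system; the follow graph, account names, and `suggest` function are all hypothetical. It shows how purely behavioral co-follow counts can surface clustered suggestions without the algorithm ever seeing a face.

```python
from collections import Counter

# Hypothetical follow graph: user -> accounts they follow.
# Any clustering here comes from user behavior, not from the algorithm.
follows = {
    "u1": {"creator_a", "creator_b"},
    "u2": {"creator_a", "creator_b", "creator_c"},
    "u3": {"creator_a", "creator_c"},
    "u4": {"creator_a", "creator_b"},
    "u5": {"creator_x", "creator_y"},
}

def suggest(just_followed, follows, k=2):
    """Item-based collaborative filtering: rank accounts most often
    co-followed by users who also follow `just_followed`."""
    counts = Counter()
    for accounts in follows.values():
        if just_followed in accounts:
            for other in accounts - {just_followed}:
                counts[other] += 1
    return [account for account, _ in counts.most_common(k)]

# Following creator_a surfaces the accounts its followers cluster around.
print(suggest("creator_a", follows))  # ['creator_b', 'creator_c']
```

The feedback loop Faddoul describes follows directly: if users act on these suggestions, each accepted suggestion adds another co-follow edge, strengthening the cluster and making accounts outside it even less likely to be surfaced.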
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by an AI system, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes or unfair representation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed
No population impact data reported.