An Italian court ruled that Deliveroo's algorithmic ranking system discriminated against delivery workers by penalizing riders who canceled pre-booked shifts on short notice, without distinguishing between legally protected reasons, such as illness or strikes, and other reasons.
A court in Bologna, Italy, ruled that the reputational-ranking algorithm used by food delivery platform Deliveroo discriminated against delivery workers in violation of local labor laws. The algorithm determined each rider's 'reliability': riders who failed to cancel pre-booked shifts at least 24 hours in advance saw their 'reliability index' fall, and workers with higher reliability scores were given priority access to shifts in busier time blocks. The court found that the algorithm's failure to distinguish between legally protected reasons for not working, such as illness or exercising the right to strike, and other reasons constituted discrimination. The case was brought by three unions, including CGIL, Italy's largest trade union. The court ordered Deliveroo to pay €50,000 to the applicants plus legal costs and to publish the ruling on its website. Deliveroo stated that the shift-booking system in question is no longer used in Italy or any other market, and that riders now have complete flexibility with no booking systems or work obligations.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Unequal treatment of individuals or groups by an AI system, often based on race, gender, or other sensitive characteristics, resulting in unfair outcomes for or unfair representation of those groups.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed