
Lack of robustness

Category: Risk Domain

AI systems that fail to perform reliably under varying conditions are exposed to errors and failures with potentially significant consequences, especially in critical applications or areas requiring moral reasoning.

"Robustness characterizes the resilience of an AI system's output against minor changes in the input domain. A great variation in an AI system's response to small input changes indicates unreliable outputs." (p. 10)
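The definition above can be turned into a simple empirical check: perturb an input slightly many times and record how much the model's output moves. The sketch below is illustrative only and is not from Schnitzer2024; the `output_variation` helper, the perturbation budget `epsilon`, and the toy "models" are all assumptions introduced for this example.

```python
import numpy as np

def output_variation(model, x, epsilon=0.01, n_trials=100, seed=0):
    """Estimate (non-)robustness: the worst observed change in the
    model's output under small random perturbations of input x."""
    rng = np.random.default_rng(seed)
    baseline = model(x)
    worst = 0.0
    for _ in range(n_trials):
        noise = rng.uniform(-epsilon, epsilon, size=x.shape)
        worst = max(worst, abs(model(x + noise) - baseline))
    return worst

# Toy stand-ins for an AI system's scalar output (hypothetical models):
smooth = lambda x: float(np.tanh(x).sum())              # robust
brittle = lambda x: float(np.sign(np.sin(1000 * x)).sum())  # unreliable

x = np.full(4, 0.5)
print(output_variation(smooth, x))   # small deviation
print(output_variation(brittle, x))  # large deviation
```

A small value indicates the output is resilient to minor input changes; a large value flags exactly the unreliability the quoted definition describes.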
