Concept drift
Risk Domain
Challenges in understanding or explaining the decision-making processes of AI systems, which can lead to mistrust, difficulty in enforcing compliance standards or holding relevant actors accountable for harms, and the inability to identify and correct errors.
"Concept drift refers to a change in the rela- tionship between input variables and model output. If not treated appropriately, concept drift can reduce the reliability of AI systems."(p. 11)
Entity: Who or what caused the harm
Intent: Whether the harm was intentional or accidental
Timing: Whether the risk is pre- or post-deployment
Other risks from Schnitzer2024 (24)
Risk | Risk domain | Entity | Intent | Timing
Inadequate specification of ODD | 7.3 Lack of capability or robustness | Human | Unintentional | Pre-deployment
Inappropriate degree of automation | 7.2 AI possessing dangerous capabilities | AI system | Unintentional | Post-deployment
Inadequate planning of performance requirements | 7.3 Lack of capability or robustness | Human | Unintentional | Pre-deployment
Insufficient AI development documentation | 7.4 Lack of transparency or interpretability | Human | Other | Pre-deployment
Inappropriate degree of transparency to end users | 7.4 Lack of transparency or interpretability | Human | Other | Pre-deployment
Choice of untrustworthy data source | 7.0 AI System Safety, Failures & Limitations | Human | Unintentional | Pre-deployment