Privacy Violations
AI systems that memorize and leak sensitive personal data, or that infer private information about individuals without their consent. Unexpected or unauthorized sharing of data and information can undermine users' expectations of privacy, facilitate identity theft, or cause the loss of confidential intellectual property.
"EAI systems interact with huge amounts of data, creating significant privacy concerns. These systems are often trained on vast corpora and process a variety of data modalities—spanning visual, auditory, and tactile information—during deployment [12]. Like text-based virtual AI models, which are known to memorize and expose personally identifiable information [75, 76], commercial robots have been shown to disclose proprietary information through simple prompts [61]." (p. 5)
Supporting Evidence (1)
"Whereas virtual AI systems are constrained to collect data from either virtual interfaces or fixed points in the physical world (e.g. security cameras collecting facial-recognition data), EAI’s mobility and the vast array of sensors used in EAI technologies expand concerns about unauthorized data collection. For example, EAI systems can monitor user behavior, infer physical preferences, and potentially contribute to future model training without the informed consent of those being observed beyond the limitations of immobile microphones or security cameras [77–79]. Bad actors within governments or corporations could gain access to private data streams and monitor users’ movements 24/7, providing significant leverage over individuals to squash dissent or achieve personal power [80]." (p. 5)
Other risks from Perlo et al. (2025) (12)
Economic Risks: 6.0 Socioeconomic & Environmental
Purposeful or malicious harm: 4.2 Cyberattacks, weapon development or use, and mass harm
Accidental harm
Misinformation: 3.1 False or misleading information
Labour Displacement: 6.2 Increased inequality and decline in employment quality
Socioeconomic Inequality: 6.2 Increased inequality and decline in employment quality