Scammers are using AI tools like ChatGPT to create fake student profiles and written responses to fraudulently enroll in California community colleges and steal millions in federal and state financial aid.
Scammers are exploiting California's open enrollment policies for community colleges by creating fake student applications, using AI tools, particularly ChatGPT, to generate the written responses used for identity verification. The California Community Colleges Chancellor's Office reports that the share of applications that are fake rose from 20% in 2021 to 34% in 2024. These 'Pell runners' create AI-generated student profiles, submit minimal AI-generated coursework, and disappear after collecting federal Pell Grants of up to $7,400. In the past 12 months alone, scammers have stolen over $10 million in federal funds and $3 million in state funds. The problem worsened after COVID-19 pandemic restrictions loosened financial aid requirements and moved coursework online. Despite California spending $150 million on cybersecurity since 2022 and contracting with companies like ID.me for verification, scammers continue to adapt with new techniques. Teachers are forced to act as fraud detectors, identifying fake students who often impersonate vulnerable populations such as homeless, undocumented, or former foster care students. The fraud harms real students by taking enrollment spots and forcing instructors to police their own classrooms. Nationwide, these crimes cost institutions over $100 million in 2023, a tenfold increase from pre-2020 levels.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fabricated individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed
No population impact data reported.