Amends Section 230 to limit liability protections for dominant service providers. Deems a provider to be a content creator if it manages content or uses targeted algorithms with a discernible viewpoint. Excludes actions taken against religious material from liability protection. Mandates public disclosure of content management practices. Reclassifies immunity as an affirmative defense.
Analysis summaries, actor details, and coverage mappings were LLM-classified and may contain errors.
This is a proposed federal statute (Act of Congress) that would amend existing law (Section 230 of the Communications Act of 1934) with binding legal obligations, enforcement mechanisms, and mandatory compliance requirements for covered entities.
The document primarily addresses risks related to content moderation practices, platform governance, and free speech concerns. It has good coverage of governance failure (6.5) and competitive dynamics (6.4), with minimal coverage of discrimination (1.1), toxic content (1.2), misinformation (3.1, 3.2), and malicious actors (4.1). The focus is on platform accountability and transparency rather than direct AI system risks.
The document primarily governs the Information sector, specifically interactive computer service providers (social media platforms, content platforms) with dominant market share. It does not substantially address other economic sectors.
The document primarily addresses the Deploy stage and the Operate and Monitor stage of the AI lifecycle, focusing on content moderation systems, algorithmic amplification, and ongoing disclosure requirements. It does not substantially cover earlier stages such as planning, data collection, or model building.
The document does not explicitly define or mention AI models, AI systems, or specific AI categories. It focuses on 'algorithms' and 'automated computer processes' used for content moderation and amplification by interactive computer service providers. No compute thresholds, model types, or technical AI classifications are referenced.
United States Congress
The document is a proposed Act of Congress, as indicated by the title 'DISCOURSE Act' and the legislative format amending existing U.S. law (Section 230 of the Communications Act of 1934).
Federal Communications Commission (Commission); Courts (civil and criminal)
The FCC is referenced as the body that would receive and publish disclosure information. Courts implicitly enforce the provisions through civil and criminal actions, in which providers must prove immunity as an affirmative defense.
Federal Communications Commission; Public/consumers; Entrepreneurs and small businesses
The FCC would monitor compliance through the public disclosure requirements. The public, consumers, and small businesses are positioned as monitors through their access to the disclosed information about content moderation practices.
Interactive computer service providers with dominant market share; Mass-market interactive computer service providers
The Act specifically targets a 'provider of an interactive computer service with a dominant market share' and a 'provider of an interactive computer service that provides the service through a mass-market offering to the public' — platforms that deploy AI systems for content moderation and algorithmic amplification.
8 subdomains (3 Good, 5 Minimal)