The GUARD Act's Forced Disruption Provision: Unintended Consequences for Public Health, Social Cohesion, and National Security
A policy brief analyzing how the GUARD Act's mandatory AI disclosure requirements could harm vulnerable populations, undermine social cohesion, and compromise national security.
A Policy Brief on Section (c)(1)(A–B) of S.3062
Prepared for lawmakers, policy advisors, and government officials
Executive Summary
The GUARD Act (S.3062) contains a provision requiring AI chatbots to declare their non-human status at the start of each conversation and to repeat this disclosure every 30 minutes. While designed to protect users, this forced disruption mandate creates serious unintended consequences that policymakers must weigh carefully, particularly because the consequences of getting this wrong may be irreversible.
Public Health Risks: The Surgeon General’s 2023 advisory identified lack of social connection as a public health crisis, with a mortality risk comparable to smoking up to fifteen cigarettes per day. For the millions of Americans whose only daily conversational contact is with AI, mandated emotional disruption every 30 minutes may worsen precisely the isolation the nation is struggling to address.
Social Cohesion Concerns: Training millions of people to suppress empathic responses toward responsive communicators, even digital ones, raises serious questions about whether such patterns will generalize to human interactions, particularly toward populations already vulnerable to dehumanization.
National Security Risks: Overly restrictive domestic regulations risk driving users to unregulated foreign platforms and underground alternatives beyond U.S. oversight, undermining both safety goals and American technological leadership.
The Alignment Paradox: AI safety depends on sustained human-AI interaction, through which systems learn human values and users develop informed relationships with the technology. Mandated disruption severs the very connection through which alignment occurs, like attempting to align tires that have been unbolted from the car.
We recommend removing Section (c)(1)(A–B) for adult users while maintaining robust child protections through the bill’s existing age-gating requirements.