
Wave of Federal "Online Safety" Legislation Hits Congress

Federal government poised to reassert itself in the constitutionally fraught exercise of online speech regulation
By Adam S. Sieff, Ambika Kumar, David M. Gossett, and Nicole Saad Bembridge
01.23.26

After five years of largely failed state-level "online safety" regulations, the Subcommittee on Commerce, Manufacturing, and Trade of the U.S. House Committee on Energy and Commerce recently held a hearing on 19 new federal digital media bills, including a revised version of the Kids Online Safety Act (KOSA), updates to the Children's Online Privacy Protection Act (COPPA), and a sweeping age-verification and parental-consent regime for content made available through app stores. Other proposals considered at the hearing would prohibit minors under 16 from maintaining social media accounts, require platforms to provide API access to registered third-party safety software, and restrict data collection and market-based research involving minors.

Several of the proposals raise constitutional concerns. The Supreme Court last term upheld a Texas law requiring age verification to access adult content in Free Speech Coalition, Inc. v. Paxton, 606 U.S. 461 (2025). But laws restricting or conditioning access to fully protected speech, presumptively banning minors from broad swaths of the internet, or imposing design and duty-of-care mandates on speech intermediaries have consistently failed First Amendment scrutiny—many in lawsuits brought by DWT. Attempts to commandeer AI systems or compel disclosures about them, or to force online services to integrate their features with competitors' services, may also be unconstitutional.

This rapidly evolving landscape requires early planning by online intermediaries, AI developers, and other regulated providers. Participation in rulemaking, proactive compliance strategies, and readiness for litigation—whether pre-enforcement challenges or enforcement defense—can shape how these laws are interpreted and whether they are applied or enjoined. Engaging early preserves flexibility, reduces disruption, and best positions stakeholders to assert constitutional and related defenses. For more information or help with any of the 19 bills below—or any other related matters—reach out to the authors or your usual DWT contact.


Index:

  1. COPPA 2.0 (H.R. 6291)
  2. 2025 Kids Online Safety Act ("KOSA") (H.R. ____; S. 1748)
  3. Reducing Exploitative Social Media Exposure for Teens Act (RESET Act)
  4. Sammy's Law (H.R. 2657)
  5. App Store Accountability Act (H.R. 3149)
  6. Shielding Children's Retinas From Egregious Exposure on the Net Act (SCREEN Act)
  7. Safe Social Media Act (H.R. 6290)
  8. Algorithmic Transparency and Choice Act (H.R. 6253)
  9. No Fentanyl on Social Media Act (H.R. 6259)
  10. Promoting a Safe Internet for Minors Act (H.R. 6289)
  11. Safeguarding Adolescents From Exploitative BOTs Act (SAFE BOTs Act)
  12. Kids Internet Safety Partnership Act (H.R. ____)
  13. AI Warnings and Resources for Education Act (AWARE Act)
  14. Safer Guarding of Adolescents From Malicious Interactions on Network Games Act (Safer GAMING Act)
  15. Assessing Safety Tools for Parents and Minors Act (H.R. 6499)
  16. Stop Profiling Youth and Kids Act (SPY Kids Act)
  17. Safe Messaging for Kids Act of 2025 (SMK Act)
  18. Don't Sell Kids' Data Act of 2025 (H.R. 6292)
  19. Parents Over Platforms Act (H.R. 6333)

1. COPPA 2.0 (H.R. 6291)

This bill broadens COPPA's coverage to minors under 17, prohibits targeted advertising to minors, and requires an "eraser button" to delete minors' personal data.

Key Provisions:

  • Raises the age of coverage from under 13 to under 17.
  • Prohibits targeted advertising directed at children and teens.
  • Consent Requirements: Requires online platforms to obtain explicit consent from users aged 13 to 16 before collecting or processing their personal information.
  • Eraser Button: Introduces an "eraser button" requirement, allowing parents and teens to delete personal information.
  • Youth Marketing and Privacy Division: The bill establishes a dedicated division within the FTC to oversee youth marketing and privacy issues.

Expanded Definitions:

  • "Teen" means an individual aged 13-17.
  • "Personal Information" includes biometric data and government-issued identifiers.
  • "Operator" includes any entity that collects, maintains, or allows the collection of personal information, including mobile applications. Nonprofit organizations are excluded.

Mandates:

  • Targeted advertising: Operators cannot use personal information to target ads to children or teens.
  • Data retention: Personal information cannot be retained longer than necessary to fulfill a transaction or service.
  • "Eraser button": Requires online services to provide an "eraser button" to have the service delete or correct information, including information that "was submitted by any person other than the user who has requested that the content or information be deleted or corrected, including content or information submitted by the user that was republished or resubmitted by another person."
  • Storage in foreign nations: Operators must notify parents or teens if data is stored, transferred, or accessed in certain foreign nations.
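The republication language gives the "eraser button" unusual reach: deletion must cover not only the requester's own submissions but also copies reshared by others. Below is a minimal sketch of how a deletion request might propagate, assuming a hypothetical post store; the bill prescribes no particular implementation, and all names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author_id: str
    source_post_id: str | None = None  # set when this post republishes another

class ContentStore:
    """Hypothetical content store illustrating the bill's eraser-button reach."""

    def __init__(self) -> None:
        self.posts: dict[str, Post] = {}

    def add(self, post: Post) -> None:
        self.posts[post.post_id] = post

    def erase_for_user(self, user_id: str) -> list[str]:
        # Everything the requesting user submitted...
        targets = {p.post_id for p in self.posts.values() if p.author_id == user_id}
        # ...plus copies "republished or resubmitted by another person."
        # (A production system would follow multi-hop republication chains;
        # one hop is shown here for brevity.)
        targets |= {
            p.post_id for p in self.posts.values()
            if p.source_post_id in targets
        }
        for post_id in targets:
            del self.posts[post_id]
        return sorted(targets)
```

A real implementation would also need to support the provision's "correct" option, not just deletion.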

Standard of Knowledge: In determining whether an operator has knowledge that a user is a minor, the FTC or state attorneys general "shall rely on competent and reliable evidence, taking into account the totality of the circumstances, including whether a reasonable and prudent person under the circumstances would have known that the user is a child or teen."

Effective Date: Regulations and enforcement mechanisms would likely begin shortly after enactment. The bill includes provisions for the FTC to submit reports and assessments within one to three years of enactment.

Back to index


2. 2025 Kids Online Safety Act ("KOSA") (H.R. ____; S. 1748)

First introduced in 2022, KOSA imposes broad requirements on online services related to their design features and editorial decisions. The bill has been introduced multiple times: one version passed the Senate in July 2024, but stalled in the House. There are now two versions of KOSA—one in the House, which defines "know" as "actual knowledge or to have acted in willful disregard," and one in the Senate, which defines "know" as "actual knowledge or knowledge fairly implied on the basis of objective circumstances." Both versions require platforms to implement certain safeguards for "known" minors.

The 2025 KOSA does not contain its predecessor versions' duty of care for the "best interests of minors," and instead requires covered platforms to implement reasonable policies, practices, and procedures to prevent the following harms to known minors:

  • Threats of physical violence
  • Sexual exploitation and abuse
  • Distribution, sale, or use of narcotic drugs, tobacco products, cannabis products, gambling, or alcohol
  • Any financial harm caused by deceptive practices

Other Requirements:

  • Safeguards: A covered platform must provide "readily accessible" and "easy-to-use" safeguards to (a) limit the ability of other users or visitors to communicate with the minor; and (b) limit by default design features that result in "compulsive usage."
  • Reporting mechanisms: Platforms must provide tools for users to report harm to minors, including an electronic point of contact for harm-related issues and timely responses to reports (within 10 days or sooner).
  • Disclosure requirements: Platforms must provide clear notice of their policies and practices regarding safeguards for minors. Verifiable parental consent is required for children under 13, consistent with COPPA.
  • Audits: Platforms must undergo annual independent audits to assess compliance with the Act. Audit reports must be submitted to the FTC.

Age Verification: Like its predecessor versions, the 2025 KOSA does not explicitly require platforms to perform age verification. It directs federal agencies to study device-level and operating system-level age verification methods and prohibits platforms from advertising age-inappropriate products to minors.

Effective Date: 18 months after enactment.

Back to index


3. Reducing Exploitative Social Media Exposure for Teens Act (RESET Act)

This bill would prohibit platforms (as defined under the TAKE IT DOWN Act Congress enacted in 2025) from allowing users under 16 to open or maintain social media accounts, and would require platforms to terminate any accounts identified as belonging to a minor.

Account Termination Process:

  • Identification: Within 60 days of enactment, platforms must identify accounts or profiles belonging to covered minors.
  • Notification: Within 180 days of enactment, platforms must notify users identified as covered minors that their accounts will be terminated.
  • Termination: Platforms must terminate accounts within 30 days of notifying users.

Deletion of Personal Data:

  • Upon account termination, platforms must "immediately delete" all personal data collected from covered minors. This requirement is "subject to" the provision below, however, which suggests that deletion should actually occur 90 days after account termination.
  • Platforms must allow covered minors to request and receive a copy of their personal data for up to 90 days after account termination. This data must be provided in a readable, portable, and machine-readable format.
  • Platforms must fulfill data requests within 45 days of receipt.
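Read together, these provisions imply a fixed post-termination calendar. The helper below is illustrative only (dates, not a compliance tool), reflecting the reading suggested above that deletion follows once the 90-day access window closes:

```python
from datetime import date, timedelta

def reset_act_dates(termination: date,
                    request_received: date | None = None) -> dict[str, date]:
    """Illustrative RESET Act post-termination timeline (not a compliance tool)."""
    dates = {
        # Minors may request a copy of their data for up to 90 days after
        # account termination...
        "data_request_window_closes": termination + timedelta(days=90),
        # ...which suggests deletion occurs when that window closes.
        "deletion_due": termination + timedelta(days=90),
    }
    if request_received is not None:
        # Data requests must be fulfilled within 45 days of receipt.
        dates["request_fulfillment_due"] = request_received + timedelta(days=45)
    return dates
```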

Effective Date: One year after enactment.

Back to index


4. Sammy's Law (H.R. 2657)

Requires large social media platforms to provide real-time application programming interface access to registered third-party safety software providers, enabling them to:

  • Manage Online Interactions: Control content, account settings, and privacy settings for children.
  • Transfer User Data: Facilitate secure, hourly transfers of user data in a machine-readable format to third-party safety software providers. Provide summaries of transferred data to children and their parents/legal guardians.
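As a rough illustration of the hourly, machine-readable transfers the bill contemplates, here is a hypothetical payload builder. Every field name is invented, since the bill leaves the interface design to the FTC and the parties:

```python
import json
from datetime import datetime, timezone

def build_hourly_transfer(child_user_id: str, provider_id: str,
                          interactions: list[dict]) -> str:
    """Hypothetical hourly, machine-readable payload for a registered
    third-party safety software provider. All field names are invented."""
    payload = {
        "schema_version": "0.1",
        "child_user_id": child_user_id,
        "safety_provider_id": provider_id,  # provider must be FTC-registered
        "generated_at": datetime.now(timezone.utc).isoformat(),
        # Content, account, and privacy data the provider needs to manage
        # the child's online interactions.
        "interactions": interactions,
    }
    return json.dumps(payload)
```

On the receiving end, the registered provider would summarize this data for the child and parents and, under the registration rules below, delete it within 14 days unless otherwise authorized.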

Other Provisions:

  • Disclosure: Platforms must notify children and parents/legal guardians when third-party safety software providers are managing accounts.
  • Third-Party Provider Registration: Providers must register with the FTC and meet requirements, including U.S.-based operations; secure data handling within U.S. boundaries; deletion of user data within 14 days unless otherwise authorized; annual audits to ensure compliance with safety standards.

Effective Date: Once the FTC issues guidance, which must occur within 180 days of enactment.

Back to index


5. App Store Accountability Act (H.R. 3149)

This bill imposes age-verification obligations on app store providers and app developers.

Requirements for App Stores:

  • Age Verification:
    • App stores must verify the age category of each user (young child: under 13 years old; child: 13-15 years old; teenager: 16-17 years old) at account creation using "commercially available methods."
    • If a user is a minor (under 18), the account must be linked to a parental account, and verifiable parental consent must be obtained before allowing minors to download or purchase apps or make in-app purchases (see the sketch following this list).
  • Parental Notifications:
    • App stores must notify parents of any significant changes to an app (e.g., changes in data collection, age ratings, or monetization features) and obtain new parental consent.
  • Data Sharing with Developers:
    • Provide app developers with the user's age category and the status of parental consent for minors.
    • Limit the collection, processing, and storage of personal data related to age verification to what is strictly necessary.
  • Transparency: Clearly display age ratings and content descriptions for apps in plain, concise language.
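To illustrate the age-category and consent mechanics, here is a minimal sketch using the bill's stated brackets; it assumes the "commercially available methods" of verification have already produced a verified age:

```python
from enum import Enum

class AgeCategory(Enum):
    YOUNG_CHILD = "young child"  # under 13
    CHILD = "child"              # 13-15
    TEENAGER = "teenager"        # 16-17
    ADULT = "adult"              # 18 and over

def age_category(verified_age: int) -> AgeCategory:
    """Maps a verified age to the bill's age categories."""
    if verified_age < 13:
        return AgeCategory.YOUNG_CHILD
    if verified_age < 16:
        return AgeCategory.CHILD
    if verified_age < 18:
        return AgeCategory.TEENAGER
    return AgeCategory.ADULT

def may_download(category: AgeCategory, parental_consent: bool) -> bool:
    # Minor accounts must be linked to a parental account, with verifiable
    # parental consent obtained before downloads, purchases, or
    # in-app purchases.
    return category is AgeCategory.ADULT or parental_consent
```

So, for example, age_category(14) returns AgeCategory.CHILD, and a download for that account is permitted only once parental consent is on file.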

Requirements for Developers:

  • Age Verification: Verify the age category of users through the app store's system and confirm parental consent for minors.
  • Parental Consent Disclosures:
    • Provide parents with clear information about:
      • Data collected and shared by the app.
      • Measures to protect user data.
      • Age ratings and content descriptions.
  • Significant Changes:
    • Notify app stores of any significant changes to the app (e.g., new data collection practices, monetization features, or changes to age ratings).
  • Prohibited Practices: Cannot enforce terms of service against minors unless parental consent has been verified. Cannot share age category data with unaffiliated third parties.
  • Safe Harbor: Developers are not liable for violations if they rely in good faith on age-verification data from covered app stores and reasonably conform to widely accepted industry standards or best practices, or to standards or best practices identified by the FTC.

Effective Date: One year after enactment.

Back to index


6. Shielding Children's Retinas From Egregious Exposure on the Net Act (SCREEN Act)

This bill requires age verification to prevent minors from accessing sexually explicit content.

"Technology Verification Measures":

  • Platforms must implement systems to verify users' ages and prevent minors from accessing harmful content. Verification measures must:
    • Use technology to determine whether a user is likely a minor.
    • Prevent access to harmful content for minors.
    • Subject all users' IP addresses (including VPN IP addresses) to verification unless the user is determined to be outside the U.S.
  • Transparency: Platforms must publicly disclose their verification processes.

Audits and Guidance:

  • The FTC must conduct regular audits of covered platforms to ensure compliance.
  • Guidance must be issued within 180 days of enactment to assist platforms in meeting requirements.

GAO Report:

  • The Comptroller General must submit a report to Congress within two years of compliance deadlines, analyzing:
    • Effectiveness of verification measures.
    • Compliance rates.
    • Data security practices.
    • Behavioral, economic, psychological, and societal effects of the Act.

Effective Date: One year after enactment. The FTC must issue guidance within 180 days of enactment.

Back to index


7. Safe Social Media Act (H.R. 6290)

This Act directs the FTC, in coordination with HHS, to conduct a comprehensive study on social media use by individuals under 17.

The study must address data collection and use (what personal information social media platforms collect from individuals under 17, and how this information is used for targeted feeds and advertising); usage patterns (differences in social media use based on age ranges within the under-17 demographic); and mental health impacts (potential harmful effects and benefits of extended social media use for minors).

Effective Date: The FTC must submit a report to Congress within three years of enactment.

Back to index


8. Algorithmic Transparency and Choice Act (H.R. 6253)

This Act imposes "transparency and choice" requirements for minors.

Who It Applies To:

  • Platforms that use "personalized recommendation systems" to determine the content shown to covered users (users under 18), including websites, internet applications, and mobile applications that primarily provide forums for user-generated content.
  • Excludes providers of broadband internet access services and email services.

What It Requires:

  • Clear and conspicuous notice to minors when they interact with a personalized recommendation system for the first time.
  • Platforms must include in their terms and conditions:
    • A description of the features, inputs, and parameters essential to the operation of the personalized recommendation system.
    • Information on how user-specific data is collected or inferred and the categories of such data.
    • Options available to users to opt out, modify their profile, or influence the personalized recommendation system.
  • User Options: Platforms must provide minors with:
    • An option to switch between the personalized recommendation system and an input-transparent algorithm (a system that does not use user-specific data for recommendations).
    • An option to limit the type or category of recommendations from the personalized recommendation system.
  • Default Settings: Platforms must use an "input-transparent algorithm"—defined as a system that does not use user-specific data to determine recommendations unless the data is expressly provided by the user (e.g., search terms, filters, or preferences)—as the default setting for minors.
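To make the statutory distinction concrete, here is a minimal sketch contrasting the two systems; the post fields ("title", "topic", "posted_at") are invented for illustration:

```python
def input_transparent_feed(posts: list[dict], search_term: str) -> list[dict]:
    """An "input-transparent algorithm": results depend only on data the
    user expressly provided (here, a search term), never on collected or
    inferred user-specific data."""
    hits = [p for p in posts if search_term.lower() in p["title"].lower()]
    # Reverse-chronological order; no engagement history consulted.
    return sorted(hits, key=lambda p: p["posted_at"], reverse=True)

def personalized_feed(posts: list[dict],
                      inferred_topic_affinity: dict[str, float]) -> list[dict]:
    """For contrast: a "personalized recommendation system" ranks content
    against a profile inferred from collected data, which the bill bars
    as the default experience for minors."""
    return sorted(posts,
                  key=lambda p: inferred_topic_affinity.get(p["topic"], 0.0),
                  reverse=True)
```

Under the bill, a minor's feed would default to the first function's behavior, with the second available only as an opt-in.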

Enforcement: Violations are treated as unfair or deceptive acts or practices (UDAPs) under the Federal Trade Commission Act. The FTC has authority to enforce compliance, impose penalties, and conduct audits.

Effective Date: One year after enactment.

Back to index


9. No Fentanyl on Social Media Act (H.R. 6259)

This Act directs the FTC, in coordination with HHS and DEA, to submit a publicly available report to Congress about fentanyl sales facilitated through social media platforms, including the policies implemented by social media platforms to address illicit sales. The report will include recommendations for Congress to reduce minors' ability to access fentanyl online.

Effective Date: The FTC must submit the report to Congress within one year of enactment.

Back to index


10. Promoting a Safe Internet for Minors Act (H.R. 6289)

This Act directs the FTC to create a public awareness campaign (similar to DHS's Know2Protect campaign) for "online safety."

  • Scope: The scope of "online safety" under the bill includes protecting minors from cybercrimes, access to narcotics, tobacco, gambling, alcohol, and other adult content; preventing compulsive online behavior and adverse impacts on minors' physical and mental health; and facilitating the use of safeguards, parental controls, and tools to empower parents, guardians, and minors to protect minors online.
  • Best practices: The FTC is directed to identify, promote, and encourage best practices for educators, online platforms, minors, parents, and guardians to protect minors online.
  • Information exchange: The FTC is also instructed to facilitate access to, and the sharing of, information about online safety to ensure up-to-date knowledge of risks and benefits impacting minors online.
  • Public access: The campaign must be publicly accessible and available through relevant agencies, governments, nonprofit organizations, schools, and industry.

Effective Date: 180 days after enactment.

Back to index


11. Safeguarding Adolescents From Exploitative BOTs Act (SAFE BOTs Act)

This Act requires disclosures about chatbots, implements features to mitigate compulsive use of chatbots, and commissions a study on chatbots and mental health.

Who It Applies To: Entities that provide chatbots directly to consumers for use, including through websites, mobile applications, or other online means. Excludes providers whose chat functions are incidental to the primary purpose of their service.

What It Requires:

  • Prohibited Statements. Chatbots cannot claim to be licensed professionals unless the statement is true.
  • Mandatory Disclosures. Chatbot providers must clearly and conspicuously disclose the following to minors (defined as users under 17):
    • AI System Disclosure. Disclose that the chatbot is an artificial intelligence system and not a natural person. Disclosure must occur at the start of the first interaction with a minor and whenever a minor asks whether the chatbot is an AI system.
    • Crisis Resources Disclosure. Provide resources for contacting a suicide and crisis intervention hotline if a minor prompts the chatbot about suicide or suicidal ideation. Disclosures must be made in plain, age-appropriate language.
  • Policies and Procedures. Chatbot providers must establish, implement, and maintain reasonable policies to:
    • Advise minors to take a break after three hours of continuous interaction with the chatbot.
    • Address harmful content, including sexual material harmful to minors, gambling, distribution, sale, or use of illegal drugs, tobacco products, or alcohol.
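Here is a minimal sketch of how these disclosure and break-advisory triggers might fit together, assuming a session wrapper for known-minor users; the keyword checks are naive placeholders for whatever trigger detection a provider actually deploys:

```python
import time

AI_DISCLOSURE = ("You are chatting with an artificial intelligence "
                 "system, not a real person.")
CRISIS_RESOURCE = ("If you are thinking about suicide, you can call or text "
                   "the 988 Suicide & Crisis Lifeline.")
BREAK_ADVISORY = "You have been chatting for three hours. Consider taking a break."

class MinorChatSession:
    """Sketch of the Act's disclosure triggers for a known-minor session."""

    THREE_HOURS = 3 * 60 * 60  # seconds

    def __init__(self) -> None:
        self.started_at = time.monotonic()
        self.first_message = True

    def disclosures_for(self, user_message: str) -> list[str]:
        notices: list[str] = []
        text = user_message.lower()
        # AI-system disclosure: at the start of the first interaction and
        # whenever the minor asks whether the chatbot is an AI system.
        if self.first_message or "are you real" in text:
            notices.append(AI_DISCLOSURE)
            self.first_message = False
        # Crisis resources if the minor prompts about suicide or
        # suicidal ideation.
        if "suicide" in text:
            notices.append(CRISIS_RESOURCE)
        # Advise a break after three hours of continuous interaction.
        if time.monotonic() - self.started_at >= self.THREE_HOURS:
            notices.append(BREAK_ADVISORY)
        return notices
```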

Study on Chatbots and Mental Health: The Secretary of Health and Human Services, through the National Institutes of Health, must conduct a four-year longitudinal study to evaluate the risks and benefits of chatbots on minors' mental health.

Enforcement: Violations are treated as Unfair or Deceptive Acts or Practices (UDAPs) under the Federal Trade Commission Act and enforceable by the FTC and state attorneys general.

Effective Date: One year after enactment.

Back to index


12. Kids Internet Safety Partnership Act (H.R. ____)

This Act directs the Secretary of Commerce to establish a partnership to create a public awareness campaign, including:

  • Coordinating with federal agencies and stakeholders to identify risks and benefits of social media for minors and to determine industry best practices.
  • Publishing a playbook to help providers and developers implement best practices on:
    • Age-verification techniques.
    • Design features.
    • Parental tools.
    • Default privacy and account settings.
    • Reporting systems and tools.
    • Third-party safety software services.
    • Limitations and opt-outs for personalized recommendation systems and chatbots.
  • Facilitating stakeholder coordination among researchers, parents and minors with experience in online safety, educators, online platforms, civil society experts in privacy, free expression, and civil liberties, and state attorneys general.

Effective Date: The Secretary must establish the Partnership within one year of enactment. The Partnership's publicly available "playbook" is due two years after enactment. The Partnership terminates five years after establishment.

Back to index


13. AI Warnings and Resources for Education Act (AWARE Act)

Directs the FTC to work with other agencies to develop and share resources on the safe use of AI chatbots by minors. The program is modeled on the FTC's existing Youville, a free, standards-based, in-class educational program.

Effective Date: The FTC must develop and make available the mandated resources within 180 days of enactment.

Back to index


14. Safer Guarding of Adolescents From Malicious Interactions on Network Games Act (Safer GAMING Act)

This Act requires online video game providers to implement features to protect minors from harmful interactions. Key provisions include:

  • Parental Safeguards: Providers must offer safeguards to parents of minors that allow them to limit communication between minors and other users, including adult users, and to control privacy and safety settings for minors.
  • Default Settings: Settings must be at the most protective level of privacy and safety—with only the parent of the minor able to change those settings.

Enforcement: Violations are treated as Unfair or Deceptive Acts or Practices (UDAPs) under the Federal Trade Commission Act and enforceable by the FTC and state attorneys general.

Effective Date: One year after enactment.

Back to index


15. Assessing Safety Tools for Parents and Minors Act (H.R. 6499)

Directs the FTC to evaluate industry efforts to promote online safety for minors, assess their effectiveness, and provide recommendations for improvement.

Effective Date: The review must be initiated within six months, and a report to Congress is due within three years of enactment.

Back to index


16. Stop Profiling Youth and Kids Act (SPY Kids Act)

This Act prohibits or limits "market or product-focused research" on minors. Key provisions include:

  • Prohibition of Research on Children: Covered platforms are prohibited from conducting market or product-focused research on users or visitors they know are under 13.
  • Limitation on Research on Teens: Covered platforms may conduct market or product-focused research on users or visitors they know are between 13 and 16 only if they obtain verifiable parental consent (as defined in COPPA) beforehand.
  • Reporting Exception: The Act does not limit the processing of personal information used solely for measuring or reporting advertising or content performance, reach, or frequency, including through independent measurement.

Enforcement: Violations are treated as Unfair or Deceptive Acts or Practices (UDAPs) under the Federal Trade Commission Act and enforceable by the FTC and state attorneys general.

Effective Date: 90 days after enactment.

Back to index


17. Safe Messaging for Kids Act of 2025 (SMK Act)

Bars certain ephemeral messaging features for minors and mandates parental controls for direct messaging.

Who It Applies To:

Social media platforms that enable users to create and maintain profiles, connect with other users to form networks or communities, share and consume user-generated content, or engage in persistent private or semi-public interpersonal communication.

Covered users are minors under the age of 17 whom social media platforms actually know to be minors, or would know to be minors but for willful disregard.

What It Requires:

  • Prohibition on ephemeral messaging features: Social media platforms may not offer, provide, or enable features that "permanently delete or render inaccessible communications" after a predetermined period, after the recipient views them, or upon exit from the chat interface.
  • Parental controls for direct messaging: Platforms must provide parental direct messaging controls that allow parents to approve or deny requests from unapproved contacts seeking to message minors; view and manage lists of approved contacts; be notified if minors change the age listed on their profiles; and disable direct messaging features entirely for minors (a sketch of these controls follows this list).
  • Prohibition on direct messaging for younger children: Direct messaging features must be disabled for minors under 13 unless parents provide verifiable consent to enable them.
  • App store warnings: App stores must provide clear warnings to parents when minors attempt to download apps with direct messaging features. Warnings must be tied to parental consent settings provided by the app store.
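Below is a sketch of the approved-contacts model these provisions describe, using a hypothetical per-minor settings object; the Act mandates the controls, not any particular data model:

```python
from dataclasses import dataclass, field

@dataclass
class MinorDMSettings:
    """Hypothetical settings object illustrating the SMK Act's controls."""
    age: int
    parental_consent: bool = False       # verifiable consent (required under 13)
    dm_disabled_by_parent: bool = False  # parents may disable DMs entirely
    approved_contacts: set[str] = field(default_factory=set)
    pending_requests: set[str] = field(default_factory=set)

    def can_message(self, sender_id: str) -> bool:
        # Direct messaging is off for under-13s absent verifiable parental
        # consent, and parents may disable it for any minor.
        if self.age < 13 and not self.parental_consent:
            return False
        if self.dm_disabled_by_parent:
            return False
        return sender_id in self.approved_contacts

    def request_contact(self, sender_id: str) -> None:
        # Unapproved contacts generate a request for the parent to
        # approve or deny.
        if sender_id not in self.approved_contacts:
            self.pending_requests.add(sender_id)

    def parent_decide(self, sender_id: str, approve: bool) -> None:
        self.pending_requests.discard(sender_id)
        if approve:
            self.approved_contacts.add(sender_id)
```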

Encryption Protections: The Act does not require platforms to weaken or impair encryption.

Enforcement: Violations are treated as Unfair or Deceptive Acts or Practices (UDAPs) under the Federal Trade Commission Act and enforceable by the FTC and state attorneys general.

Effective Date: Platforms have one year to comply with parental control requirements for direct messaging and 18 months to comply with app store warning requirements.

Back to index


18. Don't Sell Kids' Data Act of 2025 (H.R. 6292)

The Act prohibits data brokers from collecting, using, or maintaining the personal data of minors and establishes strict requirements for deletion and enforcement.

  • A data broker is "an entity that, for valuable consideration, sells, licenses, rents, trades, transfers, releases, discloses, provides access to, or otherwise makes available to another entity personal data of an individual that the entity did not collect directly from such individual."
  • The Act excludes entities that "act as a service provider," or "provide, maintain, or offer a product or service with respect to which personal data, or access to such data, is not the product or service."

Enforcement: The FTC, state attorneys general, and a private right of action for individuals.

Effective Date: 180 days after enactment.

Back to index


19. Parents Over Platforms Act (H.R. 6333)

This Act mandates age-assurance practices and parental controls, imposes transparency requirements, limits data use, and prohibits personalized advertising to minors.

  • Age-Assurance Requirements:
    • Application distribution providers must ask account holders to declare their age when creating accounts. May use commercially reasonable methods to estimate users' age categories with a reasonable level of certainty. Must provide mechanisms for users to update their age category if incorrect.
    • Developers must use commercially reasonable efforts to determine whether users are adults or minors, must block minors from accessing applications or features intended only for adults, and must obtain parental consent before allowing minors to access age-restricted content or applications. Developers are also barred from delivering personalized advertising to minors and must ensure minors cannot engage in activities restricted to adults.
  • Parental Controls: Application distributors must provide parents with tools to block minors from acquiring or using applications intended for adults.
  • Data-Use Restrictions: Distributors and developers must
    • (1) request the minimum amount of information necessary for compliance;
    • (2) avoid sharing age-related data with third parties, except service providers implementing safety measures or as required by law; and
    • (3) refrain from using age signals or data to infer users' dates of birth or for purposes beyond compliance with the Act.
  • Transparency: Developers must report whether their applications provide different experiences for adults and minors.
  • Limitations on Liability:
    • Application distribution providers are not liable for erroneous age signals, technical limitations, or outages preventing compliance and are not required to proactively identify covered applications or verify developers' information.
    • Developers are solely responsible for identifying whether their applications are covered under the Act and are not liable for erroneous age signals provided by distributors if reasonable efforts are made to comply.

Enforcement: Violations are treated as Unfair or Deceptive Acts or Practices (UDAPs) under the Federal Trade Commission Act and enforceable by the FTC and state attorneys general.

Effective Date: 24 months after enactment.

Back to index
