Trust Issues: April 2026
In This Issue
- New California Age Assurance Law Requires Age Verification Signals That May Trigger Need to Comply with Other Youth Privacy and Safety Laws
- GSA Extends Comment Period on Requirements for AI Providers
- FBI's Operation Winter Shield Provides Cybersecurity Guidance Based on Recent Investigations
- FCC Bans Foreign-Made Routers
- NY RAISE Act Amendments Signed by Governor
- Enforcement Focus: CalPrivacy Fines PlayOn Sports, Requires Changes to How It Targets Ads to Minors and Provides Opt-Out Mechanisms
- Recent and Upcoming Events and Key Deadlines
New California Age Assurance Law Requires Age Verification Signals That May Trigger Need to Comply with Other Youth Privacy and Safety Laws
In less than nine months, app developers must begin requesting—and operating system providers must begin sending—real-time age verification signals indicating whether an individual is within a designated age range whenever that individual downloads an app onto their mobile device, desktop, or other general-purpose computing device.
The California Digital Age Assurance Act (the Act), which becomes effective on January 1, 2027, is designed to protect minors online by helping app developers fulfill their obligations under federal and state laws that regulate minors' access to online content and features. App developers that receive the age verification signal will be deemed to have "actual knowledge" of consumers' ages and therefore will need to comply with the Children's Online Privacy Protection Act (COPPA) and other state youth privacy and safety laws that restrict the use and disclosure of minors' personal data.
In this way, compliance with the Act—like compliance with other age-assurance laws—will have a cascading effect, triggering the potential application of a host of other online youth privacy and safety laws. Many companies have avoided learning their users' ages (and, therefore, complying with those laws) by not collecting birthdates or other age-related information; for them, these obligations will be new.
The Act Regulates App Developers and Operating System Providers
The Act imposes obligations on both operating system providers (Providers), defined as people or entities that develop, license, or control the operating system software on a computer, mobile device, or "any other general purpose computing device," and app developers (Developers), defined as people who own, maintain, or control an application, whether run on a computer, mobile device, or other general purpose computing device.
As these definitions make clear, the Act is broad—it applies not just to mobile apps, but to all software applications, including those on desktop computers and any other computing device that can access a covered application store or download an application. Moreover, it applies not just to people and entities who develop operating systems or own applications, but to those who "license" and "control" operating system software or "maintain" or "control" applications. The Act does not define "device," so it could cover smart TVs, fitness trackers, gaming consoles, and other Internet of Things devices. Nor does the Act define "application," meaning that common functional apps, such as calendars, to-do lists, and weather apps, could be included, as could apps downloaded with updates.
Here's how the Act works:
- Providers must implement an application programming interface through which they can send to Developers upon request a digital signal indicating the user's age range as either (1) under 13 years old, (2) at least 13 and under 16 years old, (3) at least 16 and under 18 years old, or (4) at least 18 years old (Age Bracket Data).
- Providers must require account holders to provide the device end user's birth date or age range (or both) when they set up accounts. While the Act requires account holders to be 18 years of age (or the parent or guardian of a device user who is under 18 years of age), it does not address whether or how a Provider should verify the account holder's—as opposed to the device user's—age.
- Developers must request a digital signal from a Provider whenever a user downloads and launches an app. (The Act also allows a Developer to request a digital signal from a "covered application store," but does not require covered application stores to send such signals. "Covered application store" means a publicly available website, app, online service, or platform that distributes and facilitates the download of apps from third-party developers to users of computers, mobile devices, or any other general-purpose computing device.)
- After receiving a Developer's request for a signal, Providers must send only the minimum information necessary to comply with the Act and cannot share the Age Bracket Data with a third party for purposes unrelated to the Act.
- Developers must treat Age Bracket Data as a primary indicator of the device user's age range unless the Developer has internal "clear and convincing information" that a device user's age is different than the range indicated by the Age Bracket Data provided. Developers may not request more information from Providers or covered application stores than is necessary to comply with the Act and may not share the Age Bracket Data with third parties for purposes other than complying with the Act.
- If an account was set up before January 1, 2027, a Developer must provide an accessible interface for the age bracket signal before July 1, 2027.
- If an application was updated after January 1, 2026, and downloaded to a device before January 1, 2027, the Developer must request a signal with respect to that user before July 1, 2027.
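The four statutory age ranges can be pictured with a short sketch. This is purely illustrative: the Act does not prescribe any technical format for the signal, and the bracket labels and function below are hypothetical, not an implementation of any Provider's actual API.

```python
from datetime import date

# The four statutory age ranges ("Age Bracket Data"); labels are hypothetical.
BRACKETS = [
    (0, 12, "under_13"),
    (13, 15, "13_to_under_16"),
    (16, 17, "16_to_under_18"),
    (18, None, "18_plus"),
]

def age_bracket(birth_date: date, today: date) -> str:
    """Map a self-reported birth date to the statutory age bracket."""
    # Compute age in whole years, subtracting 1 if this year's birthday
    # has not yet occurred.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    for low, high, label in BRACKETS:
        if age >= low and (high is None or age <= high):
            return label
    raise ValueError("birth date is in the future")
```

For example, a device user born June 1, 2015 would fall in the "under 13" bracket as of April 2026, while a user born January 1, 2012 would fall in the "at least 13 and under 16" bracket.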
The Act takes a different—and more privacy protective—approach to age verification than the App Store Accountability Acts (ASAAs) that were enacted in 2025 in Utah, Texas, and Louisiana. Unlike the Act, which requires account holders to self-report the age range of device end users, the ASAAs require companies to use "commercially reasonable" methods to verify a user's age, including by collecting sensitive personal data, such as government IDs or biometric information.
Developers' "Actual Knowledge" That Device Users Are Minors Will Trigger Other Youth Privacy and Safety Laws
As noted above, a Developer's "actual knowledge" or "willful disregard" of a device user's status as a minor triggers application of COPPA and state privacy and safety laws governing minors' personal data. For many years, Developers have been able to avoid these obligations by not collecting birthdates or other age-related information from consumers, but age verification requirements are making this increasingly infeasible. Indeed, the Act expressly states that Developers who receive a signal containing Age Bracket Data will be deemed to have "actual knowledge" of the age range of the device user to whom the signal pertains "across all platforms of the application and points of access of the application, even if the [D]eveloper willfully disregards the signal." (Emphasis added).
Developers therefore must not only take steps to comply with the Act but also ensure that they implement policies and procedures to comply with COPPA and youth privacy and safety laws that will apply to them after they begin receiving Age Bracket Data.
Developers should consider whether the following types of youth privacy and safety laws could apply to them:
- COPPA: COPPA applies to operators of websites and online services that have actual knowledge they collect personal information from children under 13 years of age. Developers subject to COPPA will need to adopt and implement policies and procedures to comply with COPPA and the COPPA Rule, including by providing a direct notice to, and obtaining verifiable consent from, parents of device users known to be under 13 years of age. Compliance with COPPA will satisfy most state consumer privacy law requirements regarding the handling of personal information collected from users under 13 years of age.
- California Consumer Privacy Act (CCPA) requirements related to teens: The CCPA prohibits a business from selling or sharing the personal information of consumers whom the business knows or willfully disregards are under 16 years of age without either obtaining such consumers' consent (if the consumers are at least 13 and under 16 years of age) or parental consent (if the consumers are under 13 years of age). If a developer is a "business" under the CCPA, it will need to ensure it does not "sell" or "share" the personal information of a device user who is under 16 without obtaining the required consent.
- Other state consumer privacy laws that impose heightened protections on personal data collected from teens: Developers will need to consider whether Age Bracket Data relating to California users puts them on notice that minors in other states are using their app, such that they could be "willfully disregarding" information about those users' ages. If so, they will need to comply with applicable state consumer privacy laws that protect the personal information of a broader range of minors. Like California, Connecticut raised the age of protected minors to under 16, while Maryland raised the age to under 18. These state consumer privacy laws require controllers to obtain opt-in consent from such minors before "selling" or processing their personal data for targeted advertising. Developers will either need to take a one-size-fits-all approach and refrain from sales and targeted advertising without consent for all users under 18 years of age or take measures (e.g., tag and segment personal data based on state of residence) to ensure compliance on a case-by-case basis.
- Age Appropriate Design Code Acts: Several states have enacted Age Appropriate Design Code Acts (AADC Acts). These laws differ with respect to what triggers their application. For instance, the Maryland AADC will apply to certain companies doing business in Maryland that have evidence regarding audience composition indicating that their online service is routinely accessed by a significant number of minors; the Vermont AADC will apply when certain companies doing business in Vermont know or should have known that their audiences are composed of at least 2% minors; and the Nebraska AADC will apply to certain companies doing business in Nebraska that have actual knowledge that at least 2% of their audience are minors. The laws' substantive obligations differ as well, so Developers will need to determine whether they meet the various jurisdictional thresholds and, if so, how best to establish a compliance program that covers all of them. For instance, the Vermont AADC Act requires covered businesses to exercise a duty of care regarding minors' personal data, while the Nebraska AADC Act focuses on providing minors and parents with tools to control how minors' personal data is used.
- Laws governing access to social media and addictive feeds: Many states—including Nebraska, Arkansas, Virginia, and Louisiana, among others—impose restrictions on minors' access to and use of social media platforms. For instance, the Virginia Consumer Data Protection Act was amended last year to require operators of social media platforms to limit a minor's use of such platforms to one hour per day, although the daily time limit can be increased or decreased with verified parental consent. (The Virginia law defines minors as users under 16 years of age.) And New York's Stop Addictive Feeds Exploitation (SAFE) Kids Act requires online platforms to obtain verifiable parental consent before providing "addictive feeds" to users under 18 years of age. (The SAFE Act defines an "addictive feed" as a website, app, or online service "in which multiple pieces of media generated or shared by users" are "recommended, selected, or prioritized for display to a user based, in whole or in part, on information associated with the user or the user's device" unless certain exceptions apply.)
- Other state youth safety laws: Numerous states have enacted youth safety laws designed to protect minors from harms associated with certain types of digital services. For instance, the Texas Securing Children Online Through Parental Empowerment (SCOPE) Act restricts digital service providers (owners or operators of websites, applications, programs, or software with internet connectivity that collect personal information) that offer certain social features to minors from engaging in certain types of data processing without parental consent. Unlike the AADC Acts, the SCOPE Act does not have a jurisdictional trigger, although it exempts certain entities, including small businesses as defined by the Small Business Administration. The SCOPE Act requires digital service providers, among other things, to disclose how they use algorithms to provide content to minors, limit the collection of personal information from minors, implement a strategy to prevent minors' exposure to harmful content, and create and give parents access to tools that allow them to supervise minors' use of the service. Similarly, the Connecticut act concerning online privacy, data and safety protections (SB 3) requires controllers to use "reasonable care" to avoid any heightened risk of harm to minors caused by an online service, product, or feature; Colorado and Montana have enacted similar requirements. And California law requires an operator of a companion chatbot that knows a user is a minor to disclose that the user is interacting with artificial intelligence; to provide, by default, a clear and conspicuous notification at least every three hours reminding the user to take a break and that the chatbot is artificially generated and not human; and to institute reasonable measures to prevent the chatbot from producing sexually explicit visual material or suggesting that the minor engage in sexually explicit conduct.
Finally, Washington Governor Bob Ferguson recently signed HB 2225, which similarly will require operators of certain chatbots that know a user is under 18 years of age to provide clear and conspicuous notice that the chatbot is artificially generated and not human and implement reasonable measures to both prevent the chatbot from generating or producing sexually explicit content or suggestive dialogue with minors and prohibit the use of manipulative engagement techniques that engage or prolong an emotional relationship with the user.
Contact: Nancy Libin
Back to top
GSA Extends Comment Period on Requirements for AI Providers
The General Services Administration (GSA) released a draft contract clause—GSAR 552.239-7001, "Basic Safeguarding of Artificial Intelligence Systems"—that would impose wide-ranging obligations on contractors providing AI solutions to the federal government. The proposed clause would require GSA contractors to grant agencies an irrevocable license to use AI systems for "any lawful Government purpose"; to use only "American AI systems" (i.e., AI systems developed and produced in the United States); to establish neutrality requirements for AI-generated outputs that ban "ideological dogmas such as Diversity, Equity, Inclusion"; and to give the government extensive audit rights to assess "bias, truthfulness, safety, and unsolicited ideological content." An AI system also would be prohibited from refusing to produce data outputs or conduct analyses based on the contractor's "discretionary policies." These and other obligations in the proposed clause stretch well beyond terms that AI providers typically use for private-sector clients. With respect to data security, providers would be required to implement "reasonable technical, administrative, physical, and organizational safeguards" to secure government data and "eyes off" procedures to restrict human review of government data, and would need to report security incidents within 72 hours.
Following industry pushback, GSA extended the public comment deadline to April 3 and indicated the clause would be pushed to a future round of contract modifications.
Contact: Michael T. Borgia, Alix Town, Andrew Lewis
Back to top
FBI's Operation Winter Shield Provides Cybersecurity Guidance Based on Recent Investigations
The Federal Bureau of Investigation (FBI) is continuing its Operation Winter Shield to provide cybersecurity guidance based on real-world FBI investigations. Upon the program's launch in late January, the FBI released a set of 10 core actions for organizations to improve their resilience against cyber intrusions. These recommendations draw on recent investigations and reflect observed adversary behavior. The 10 actions are:
- Adopt phish-resistant authentication
- Implement a risk-based vulnerability management program
- Track and retire end-of-life technology on a defined schedule
- Manage third-party risk
- Protect security logs and preserve them for an appropriate period of time
- Maintain offline, immutable backups and test restoration processes
- Identify, inventory, and protect internet-facing systems and services
- Strengthen email authentication and malicious content protections
- Reduce administrator privileges
- Exercise incident response plans with all stakeholders
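Several of the items above lend themselves to simple automated checks. As one illustration of the email-authentication recommendation, the sketch below (with hypothetical function names; the FBI guidance does not prescribe any particular code or tooling) parses a DMARC TXT record and confirms that the policy is set to quarantine or reject failing mail, a common enforcement baseline.

```python
def parse_dmarc(record: str) -> dict:
    """Parse a DMARC TXT record ('v=DMARC1; p=reject; ...') into tag/value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

def is_enforcing(record: str) -> bool:
    """True when the domain's policy quarantines or rejects mail that fails DMARC."""
    tags = parse_dmarc(record)
    return tags.get("v") == "DMARC1" and tags.get("p") in ("quarantine", "reject")
```

For example, `is_enforcing("v=DMARC1; p=reject; rua=mailto:dmarc@example.com")` returns `True`, while a monitoring-only record such as `"v=DMARC1; p=none"` returns `False`. In practice, the record would be fetched via a DNS TXT lookup on the `_dmarc` subdomain rather than supplied as a string.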
Following the initial launch, the FBI has continued to provide guidance and examples from recent matters through multiple speeches, interviews, field office events, and media appearances. Operation Winter Shield is expected to continue through April.
While the FBI's guidance is nonbinding, federal and state regulators may reference the FBI's 10 core steps during investigations and enforcement actions on cybersecurity matters.
Contact: Michael T. Borgia, Ryan Burns
Back to top
FCC Bans Foreign-Made Routers
The Federal Communications Commission (FCC) recently took action to ban new consumer-grade routers produced in any foreign country from being sold in the United States. The FCC's action came after a "national security determination" by a White House-convened interagency body that "routers produced in a foreign country, regardless of the nationality of the producer, pose an unacceptable risk to the national security of the United States and to the safety and security of U.S. persons." The national security determination cites Salt Typhoon and several other attacks linked to China that targeted routers and other U.S. communications infrastructure as evidence of the cybersecurity threats posed by foreign-produced routers.
On March 23, 2026, the FCC added all "consumer-grade routers produced in a foreign country" to its Covered List, thereby prohibiting such routers from receiving an FCC equipment authorization. Routers without an equipment authorization cannot be imported, marketed, or sold in the United States. Consumers may continue to use routers they previously purchased or acquired, and retailers may continue to import, market, and sell routers already authorized by the FCC.
Manufacturers may seek an exception in the form of a Conditional Approval from the Department of War or the Department of Homeland Security. Guidance on the Conditional Approval process indicates that manufacturers will be required to submit a "U.S. Manufacturing and Onshoring Plan" as part of their application.
Contact: Michael T. Borgia, Kasey McGee
Back to top
NY RAISE Act Amendments Signed by Governor
New York Governor Kathy Hochul signed a bill on March 27, 2026, repealing and replacing the Responsible AI Safety and Education Act (the RAISE Act), aligning the law more closely with California's AI transparency framework. Effective January 1, 2027, the negotiated changes to the RAISE Act impose new transparency, reporting, and governance obligations on "frontier developers" and "large frontier developers" that develop or deploy "frontier models" (as those terms are defined in the RAISE Act), including requirements to publish a frontier AI framework (i.e., documented technical and organizational protocols to manage, assess, and mitigate catastrophic risks); disclose information about model capabilities and uses; and report critical safety incidents to a new oversight office within the New York Department of Financial Services. The law reflects a broader trend of states continuing to legislate and promulgate AI regulations alongside federal efforts to limit or preempt state laws and regulations, creating potential regulatory tension. Organizations developing advanced AI systems should begin evaluating whether they fall within the RAISE Act's scope and consider steps to prepare for new compliance, disclosure, and incident-reporting obligations.
Contact: Michael T. Borgia, Apurva Dharia, Andrew Lewis
Back to top
Enforcement Focus: CalPrivacy Fines PlayOn Sports and Requires Changes to How It Targets Ads to Minors and Provides Opt-Out Mechanisms
CalPrivacy's Stipulated Final Order (Final Order) with PlayOn Sports (PlayOn), announced March 3, 2026, provides expanded guidance on selling personal information of minors, warns again about deficient opt-out preference signal and opt-out mechanism practices, sets boundaries on notice-only tracker/cookie banners, and provides insights on how CalPrivacy views remedial measures taken before a business is aware it is under investigation.
Major themes include:
- Practices involving minors and vulnerable populations continue to be a high priority.
  - CalPrivacy will closely evaluate opt-out and "notice only" cookie banners involving platforms that primarily serve minors and other vulnerable populations.
- Remedial measures taken before a target knows it is under investigation help but will not insulate the target from enforcement or substantial fines and compliance obligations.
  - CalPrivacy fined PlayOn Sports $1,100,000 despite PlayOn's substantial changes to its privacy practices, made even before it knew it was under investigation. Without those changes, the fine would likely have been much higher.
- Enhanced compliance obligations are probably worse than the fine.
  - The compliance obligations and regulatory intrusions ordered by CalPrivacy—including the requirement that the PlayOn Board of Directors review and approve annual risk assessments through 2030—impose a greater burden on PlayOn than the fine.
  - This demonstrates once again that businesses must look beyond the headline-grabbing penalty amounts to understand the full regulatory impact of these investigations.
Background
PlayOn provides an all-in-one service for digital ticketing, streaming, fundraising, concessions, merchandise sales, and website management to high schools in California and every other U.S. state. PlayOn claims to have sold 30 million tickets to high school events nationwide. Platform users, who are almost all students seeking tickets to school-sponsored events like football games, purchase tickets through PlayOn and then use their mobile phone as their ticket into the event. Sometimes, the PlayOn platform is the only way tickets are sold for an event, which means that high school students have to agree to PlayOn's privacy practices in order to attend.
Investigation
CalPrivacy's Enforcement Division opened an investigation into PlayOn's privacy practices sometime in 2024. In December 2024, before hearing from the Enforcement Division, PlayOn significantly changed its website, privacy policy, and notice banners. For example, PlayOn updated its website to recognize and process a consumer's opt-out preference signal, which was not the case previously. PlayOn also revised its notice banner to allow consumers to "accept" or "reject" tracking technologies. In addition, PlayOn substantially revised its privacy policies to align with the CCPA.
The Enforcement Division revealed the investigation to PlayOn sometime after that. The Final Order noted that "[d]uring the course of the investigation, PlayOn cooperated with the Enforcement Division and produced documents, answered questions, and engaged with the Enforcement Division in candid discussions about PlayOn's privacy practices." Based on these changes, CalPrivacy focused on the period from January 1, 2023, through December 31, 2024. The Enforcement Division noted that PlayOn engaged in only one targeted advertising campaign on its ticketing platform, which it viewed as sufficient evidence of sale and sharing of students' personal information.
PlayOn Failed to Offer an Effective Method for Submitting a Request to Opt Out of Sale/Sharing and to Recognize Opt-Out Preference Signals
Before PlayOn updated its practices at the end of 2024, PlayOn's website had a notice-only cookie banner that blocked users from accessing event tickets unless they agreed to PlayOn's third-party trackers and privacy policy. When using PlayOn's ticketing platform on a phone or mobile device (necessary to access electronic tickets), the notice-only banner covered the part of the screen that allowed the consumer to use the event ticket. Thus, high school students attempting to access tickets to school events had to select "agree" regardless of how they accessed the platform in order to obtain and use their ticket.
At that time, PlayOn's only first-party methods for users to opt out of sale or sharing involving third-party trackers were an email address and a toll-free phone number. PlayOn's privacy policy directed users to opt out of sale and sharing involving third-party trackers using third-party tools provided by the Network Advertising Initiative (NAI) and the Digital Advertising Alliance (DAA). The Enforcement Division concluded that, in doing so, PlayOn "violated the company's responsibility to provide its own method for consumers to opt-out."
PlayOn's website also did not honor opt-out preference signals from user browsers. CalPrivacy has consistently flagged this failure in its enforcement actions as violating CCPA regulations.
PlayOn's Privacy Notice Was Deficient
PlayOn's privacy notice said that it did not sell consumers' personal information and did not explain how it would process opt-out preference signals. PlayOn's "Your Privacy Choices" link failed to include information required in a notice to allow consumers to opt out of sale and sharing and directed users to the opt-out methods above that CalPrivacy found to be deficient. Some of these problems may have resulted from the fact that PlayOn's privacy policy was last updated in July 2022, and its practices likely changed over time—a reminder that businesses need to update their privacy notices and practices as their data processing and applicable laws change.
Fines and Compliance Obligations
The Enforcement Division viewed PlayOn's cooperation during the investigation favorably, noting that "[t]he Agency recognizes and credits PlayOn's remediation efforts. PlayOn has substantially revised its practices and has committed substantial financial and other resources to remediating the shortcomings identified in this Stipulated Final Order, even before PlayOn learned about the Agency's investigation."
Nevertheless, CalPrivacy fined PlayOn $1,100,000. CalPrivacy also mandated additional compliance obligations beyond generally requiring CCPA compliance, such as third-party contracting terms and opt-out preference signal recognition. Of these, the most challenging are CalPrivacy's enhanced risk assessment obligations. CalPrivacy required PlayOn to conduct a risk assessment by January 1, 2027, and for three years after that "before any material change in the processing of users' Personal Information," and to include review of each risk assessment by the PlayOn Board of Directors, whose members' names would be recorded in the assessment.
Takeaways
CalPrivacy staff have indicated publicly that they view enforcement actions as a way to guide compliance by developing a body of precedent with a factual context. Consistent with that approach, the PlayOn Stipulated Final Order contains compliance lessons for businesses.
- Procedure matters: Opt-Out Preference Signal and other opt-out process failures are easy for regulators to spot and must be high priority for regulated businesses.
- Consumer context matters: A notice-only banner that might work for some sites will likely not be acceptable for sites that serve minors and other vulnerable populations identified in the Final Order, such as those whose information is sold by data brokers.
- It isn't just the fine that is the problem: The multiyear close regulatory scrutiny and Board of Directors review could concern some businesses more than the fine.
Contact: David Rice
Back to top
Recent and Upcoming Events and Key Deadlines
April 9: Adam Greene and Diane Butler will present "Navigating Health Care Privacy Laws When Responding to Immigration Enforcement" at the 2026 HIPAA COW Spring Virtual Conference.
April 15: Entities subject to the New York Department of Financial Services' (NYDFS) cybersecurity regulations must file their annual attestations of compliance with NYDFS.
April 22: Michael T. Borgia will join a panel discussion titled "The Use of AI in Cyberattacks and Cyber Defense" at the Incident Response Forum D.C. 2026. If you are interested in attending this event virtually, please contact Alexandra Petot.