This year, Data Privacy Day, January 28, fell on a Sunday, allowing us extra time to contemplate – between NFL championship games – the top ten things to keep your eyes on in US privacy in 2024. The coming year will continue to feature a dizzying pace of developments, including federal and state legislation, regulation, litigation and platform rules. Among the numerous developments, the spotlight will be shining brightly on perceived privacy harms associated with AI; the privacy of health data, in particular information related to reproductive health; kids’ and teens’ information; and the practices of data brokers and ad tech vendors. Here’s what we see coming down the pike:
1. AI governance is all the rage. Ever since ChatGPT launched to enormous fanfare in November 2022, businesses and policymakers have been enmeshed in nonstop discussions about the technological paradigm shift to AI, which some claim is as profound as the emergence of the Internet. Anyone dealing with the federal government needs to adhere to the President’s Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (the “EO”). Published last October, the EO tasks government agencies with developing rules around safety, security, privacy, competition and the mitigation of bias and discrimination. The FTC has also been active in the AI space, hosting a Tech Summit on AI last week and including language on AI fairness – mitigating mistakes and bias and enhancing transparency – in recent enforcement actions such as the Rite Aid case (see, in particular, Commissioner Bedoya’s statement). Importantly, as more comprehensive consumer privacy laws come into force at the state level, provisions on automated decision-making proliferate and will shape how AI may be used. Most immediately, businesses are watching the California Privacy Protection Agency’s draft regulations on automated decision-making technology (ADMT). Further on the horizon, especially in an election year, Senate Majority Leader Chuck Schumer continues to support federal AI legislation, which has broad bipartisan appeal and would include a significant privacy component.
2. Sensitive data: health. In the past, privacy protection for health information largely meant compliance with HIPAA, with its narrow scope governing covered entities and their business associates. No more. Over the past couple of years, primarily since the Supreme Court’s Dobbs decision, federal and state policymakers have been laser-focused on protecting “consumer health data” across a wide swath of industries and technologies far exceeding HIPAA’s limited reach. In federal enforcement by the FTC and in a new class of health-specific privacy laws, most notably Washington State’s My Health My Data Act (MHMDA), which comes into force on March 31st, consumer health data is defined broadly. The term includes not only patient medical information but also information about consumers’ fitness, wellness, diet, gender-affirming care and pregnancy status, as well as the precise geolocation of individuals visiting stores or locations providing related products or services, and even biometric data. Unlike any privacy law except Illinois’ notorious Biometric Information Privacy Act (BIPA), the MHMDA features a broad and robust private right of action, which promises to unleash a wave of privacy litigation. The FTC, meanwhile, has been bringing case after case against websites and apps processing consumer health information, including by reinvigorating its long-forgotten Health Breach Notification Rule.
3. Sensitive data: kids. Not only consumer health data but also other sensitive categories of data, most notably kids’ and teens’ information, are drawing intense regulatory focus. Over the winter holidays, the FTC announced long-awaited revisions to its COPPA Rule. More than four years in the making and integrating more than 175,000 comments from industry and civil society, the new rules would require separate opt-in consent for targeted advertising; limit COPPA’s “support for the internal operations” exception; and codify existing restrictions on education technology (ed tech) providers. Already in 2023, the FTC vigorously enforced against alleged COPPA violations, including in cases against Amazon’s Alexa, Fortnite developer Epic Games and ed tech provider Edmodo. With policymakers increasingly weighing in on the content available online to kids and teens, particularly on social media, age verification requirements are becoming commonplace. Alas, verifying age and identity often entails collecting biometric information, in tension with state privacy and biometric protection laws, placing companies between a rock and a hard place. Importantly, in this context, all eyes are on the Ninth Circuit Court of Appeals as it hears California Attorney General Rob Bonta’s appeal of the preliminary injunction issued by the U.S. District Court for the Northern District of California in September 2023, which enjoined implementation of the California Age Appropriate Design Code Act (“AADCA”). If it survives judicial scrutiny, the AADCA would significantly expand the scope of COPPA-style protections: whereas COPPA applies to sites or apps with “actual knowledge” that they collect information from kids under 13, the AADCA applies to any site or app “likely to be accessed by children,” defined as minors under 18.
4. Sensitive data: location. Less than a month into 2024, the FTC has already settled multiple cases involving vendors’ collection and sharing of consumers’ precise geolocation data. In X-Mode Social, the FTC enforced against a location analytics data broker and SDK provider, leveraging its unfairness authority to assert that disclosing consumers’ sensitive location information constitutes a tangible privacy harm. Specifically, the FTC alleged that the company sold “audience segments” tied to users’ mobile advertising IDs (MAIDs), derived from location data and given names revealing sensitive locations. Similarly, in the InMarket Media case, the FTC alleged that a location data aggregator and SDK developer did not verify that first parties appropriately collected precise geolocation, including by obtaining users’ affirmative express consent. The FTC treated any unique persistent identifier, such as a MAID or IDFA, that was combined with location data as location data itself; and it adopted the CCPA’s threshold for precise geolocation: a radius of 1,850 feet. These cases, in addition to earlier FTC cases such as InMobi and Kochava, demonstrate the agency’s theory of privacy harm, which is anchored in consumers’ sensitive data and its potential to expose consumers to bias, discrimination, stigma or harassment.
5. Data brokers in scope. For more than a decade, data brokers have garnered more than their share of regulatory attention, including a landmark policy report from the FTC and data broker laws from California to Vermont. Now more than ever, data brokers face regulatory headwinds, with new laws requiring registration in Texas and Oregon and California’s DELETE Act promising to be a game changer. Under the DELETE Act, the California Privacy Protection Agency (“CPPA”) has been charged with creating an accessible deletion mechanism by January 1, 2026, enabling California consumers to submit a single verifiable request instructing every registered data broker that possesses their personal information to delete it. If that’s not enough, the Consumer Financial Protection Bureau (“CFPB”) has launched a rulemaking effort under the Fair Credit Reporting Act (“FCRA”), which seeks to expand the regulation of data brokers by subjecting more companies to stringent FCRA obligations.
6. Biometrics: introducing annihilative liability. Exactly a year ago, in a startling decision, the Illinois Supreme Court ruled in White Castle that a separate claim accrues under BIPA each time a company scans or transmits an individual’s biometric identifier or information without prior informed consent. The upshot was stark: the regional hamburger chain faced damages of up to $17 billion, exposure the court itself dubbed “annihilative liability.” Last May, the FTC issued a Policy Statement on Biometric Information, warning that “false or unsubstantiated claims relating to the validity, reliability, performance, fairness, or efficacy of technologies using biometric information” may violate the FTC Act. And in fact, the FTC’s recent enforcement action against Rite Aid prohibits the pharmacy chain from using facial recognition technology in stores for five years, alleging that it failed to implement reasonable procedures to prevent harm to consumers resulting from inaccuracies, lack of transparency, and bias and discrimination. By including biometric data in the definition of consumer health data, the MHMDA, with its broad private right of action, will further turn up the heat on vendors and users of biometric technologies, even as regulators increasingly require websites to verify their users’ age.
7. Ad-tech crackdown. Once seen as online controls with little popular use or appeal, universal opt-out mechanisms (UOOMs), also known as opt-out preference signals, are gaining traction based on regulatory initiatives in states including California and Colorado. Both the California and Colorado Attorneys General have endorsed the Global Privacy Control (GPC) as a valid implementation of a UOOM, meaning that users will increasingly be able to opt out of publishers and vendors sharing their data with the flip of a single switch. Indeed, in its latest round of draft regulations, the CPPA suggested consumers would have the right to opt out of even first-party behavioral targeting. Coming on top of Apple’s App Tracking Transparency framework and Google’s deprecation of third-party cookies and rollout of its Privacy Sandbox, rising opt-out rates would weigh heavily on the profitability of ad tech vendors. Ad tech companies face increasing pressure to innovate on privacy and implement data governance controls, or else.
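For readers curious about the mechanics, the GPC signal itself is technically simple: under the GPC proposal, a participating browser or extension attaches a Sec-GPC: 1 header to outgoing requests and exposes a navigator.globalPrivacyControl property to scripts. Below is a minimal sketch, in TypeScript using Node’s built-in http module, of how a server might detect the signal; the optedOut flag and the downstream handling are illustrative assumptions on our part, not requirements of the GPC proposal or any state regulation.

```typescript
// Minimal sketch (not legal advice): detecting the Global Privacy Control
// signal server-side. Participating browsers attach a "Sec-GPC: 1" request
// header; what a business must do in response depends on the applicable
// state law, so the handling below is purely illustrative.
import * as http from "http";

const server = http.createServer((req, res) => {
  // Node lowercases incoming header names.
  const optedOut = req.headers["sec-gpc"] === "1";

  if (optedOut) {
    // Hypothetical handling: suppress "sale"/"share" of personal information
    // for this request, e.g., skip calls out to third-party ad tech vendors.
    console.log(`GPC opt-out signal received for ${req.url}`);
  }

  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(optedOut ? "GPC opt-out honored\n" : "No GPC signal detected\n");
});

server.listen(8080);
```

On the client side, scripts can make the equivalent check against the navigator.globalPrivacyControl property before firing tracking pixels or ad tech calls.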
8. Privacy litigation gets hot and hotter. 2023 was a landmark year for privacy litigation. In addition to enormous BIPA claims, class action plaintiffs pursued novel legal theories under state wiretapping laws and federal privacy legislation. Plaintiffs sued publishers and vendors over their deployment of chatbots, session replay scripts, pixels and SDKs, with some cases surviving motions to dismiss. We anticipate continued activity in 2024 as more plaintiffs’ counsel pursue crafty new litigation claims.
9. New Fintech Privacy Requirements? We also expect to see new privacy-related requirements in the fintech space. In October 2023, the CFPB announced its proposal on “Personal Financial Data Rights,” which would give consumers the right to access and transfer their financial information between banks and other financial entities. As discussed above, the CFPB also published a proposed outline that would expand the interpretation of the FCRA to cover data aggregators when they engage in “assembly or evaluation” of consumer information. These proposals, if adopted, will bring new compliance challenges to fintech businesses.
10. State Enforcement. 2024 will bring additional enforcement activity not just at the federal level and from class action plaintiffs but also by state attorneys general and the CPPA. As of March 29, 2024, the CPPA can begin enforcing the CPRA Regulations. Where will it begin? Perhaps recent sweeps provide some insight. Late last week, California AG Rob Bonta announced an investigative sweep targeting businesses with popular streaming apps and devices, alleging that they failed to comply with the CCPA’s opt-out requirements for selling or sharing consumer personal information. California has conducted previous investigative sweeps, and last year, just days after the state’s new law came into force, Colorado’s Attorney General sent compliance inquiries to covered entities, citing priorities that include sensitive data processing requirements and opt-out requests for targeted advertising and profiling. Perhaps even more notably, in December 2023, the Federal Communications Commission (“FCC”) and the attorneys general of Connecticut, Illinois, New York, and Pennsylvania entered into Memoranda of Understanding to strengthen and formalize cooperation on privacy, data protection, and cybersecurity investigations and enforcement. We expect additional enforcement sweeps and inquiries in 2024, and we anticipate that these actions will lead to more enforcement activity at the state level, even as some states complain about a lack of resources to pursue enforcement actions.