Aristotle, the Greek philosopher and polymath of the fourth century BC, is credited with the idea behind the phrase horror vacui – nature abhors a vacuum – meaning that in nature, a vacuum or void is not a stable state. Fast forward 2,400 years, and the same holds true for tech regulation. If Washington policymakers take their foot off the gas, don’t expect privacy and AI laws to slow down. Enter Sacramento, Austin and Albany, Tallahassee, Trenton and Olympia. This land is your land, and it is rife with tech policy initiatives. No federal privacy law? Introducing 20 states – and counting. No federal AI regulation? Prepare for an avalanche of state action.
As we enter the early days of 2025, it is an ideal moment to consider what the new year may bring to privacy regulation. We are in a period of continued evolution, marked by the emergence of new technologies, most notably artificial intelligence (AI), growing awareness of personal data rights, and rising public demand for stronger protections. This year, however, the shift is not only technological but also political, with Republicans consolidating power in Washington, holding control over the executive, legislative, and, for the most part, judicial branches.
While some observers have suggested this may signal a deregulatory approach to privacy and AI, in the U.S. the absence of federal regulation may very well open the floodgates to state laws. Blue states in particular are expected to double down on protecting consumer health information and on imposing anti-bias and anti-discrimination measures on developers and deployers of AI.
With a diverse and complex regulatory landscape at both the federal and state levels, businesses and individuals alike are navigating the implications of this shift. At the state level, privacy legislation continues to gain momentum, with states like California, Texas, and Colorado leading the charge, setting precedents with groundbreaking laws, while other states move toward similar measures.
Here are the key privacy and AI issues that will shape 2025:
Federal Regulation –
Consider this curveball: with Republicans in control of both chambers of Congress as well as the White House, federal privacy legislation could actually get done. To be sure, Republican lawmakers have historically been more circumspect about privacy regulation. At the same time, industry has been clamoring for a federal law that would preempt the growing patchwork of state privacy laws. While we wouldn’t hold our breath waiting for a federal privacy law, such an initiative could gain momentum in Congress. If it happens, with Sen. Ted Cruz (R-TX) at the helm of Senate Commerce and Rep. Brett Guthrie (R-KY) chairing House Energy and Commerce, we predict a proposal for a new comprehensive federal law that looks more like the Texas Data Privacy and Security Act than the last iteration of a federal privacy bill, the American Privacy Rights Act (APRA). That is, a federal law with strong preemption, no private right of action, less emphasis on civil rights and data minimization, and a stronger focus on security. Short of a federal privacy law, we may well see kids’ privacy legislation pass in this Congress. Recent efforts, such as the Kids Online Safety and Privacy Act (“KOSPA”), have strong bicameral and bipartisan support. If lawmakers want to show they can get something done, this would be a good place to start.
The new Federal Trade Commission (FTC), led by Trump’s choice for chair, sitting Commissioner Andrew Ferguson, and holding a Republican majority with new Commissioner Mark Meador, will look starkly different from outgoing Chair Lina Khan’s agency. For starters, the agency is all but certain to scrap its “Commercial Surveillance and Data Security Rulemaking”. In fact, we’re unlikely to even hear the term “commercial surveillance” under this leadership. Moreover, incoming Chair Ferguson has already stated that he intends to roll back the agency’s recent efforts to become a de facto AI enforcement agency, focusing instead on its traditional role as a competition and consumer protection regulator. This intent is evident in his Rytr dissent, where Ferguson stated the agency “certainly should not chill innovation by threatening to hold AI companies liable for whatever illegal use some clever fraudster might find for their technology”. On privacy, we expect the FTC to shift from an activist enforcement stance, which often relies on the unfairness prong of its Section 5 authority, to a more conservative approach grounded in deception.
State Law Frenzy –
You can surely expect 2025 to mark a pivotal year in state privacy regulation, as more states implement comprehensive privacy laws that offer robust consumer rights. Many of these laws reflect a trend toward greater consumer control over personal data, corporate transparency, and accountability for organizations that process sensitive information. In 2024, the number of comprehensive state privacy laws in effect grew from five to eight, with the laws in Texas, Oregon and Montana coming into force. This year, the tally will balloon to 16, as privacy laws go into effect in Tennessee, Delaware, Iowa, New Jersey, New Hampshire, Nebraska, Maryland and Minnesota. Some of these laws, including those in Delaware, Iowa, Nebraska and New Hampshire, already entered into force on January 1, and New Jersey’s is imminent on January 15. Further, in certain states where such laws are already in effect, new requirements are kicking in. For example, in Texas and Connecticut, businesses must now respect Universal Opt-Out Mechanisms (UOOMs), as illustrated in the sketch below. All in all, 19 states (plus Florida, whose law’s scope is somewhat more limited) have passed a comprehensive privacy law, with many others in the pipeline.
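For businesses wondering what “respecting a UOOM” means in practice, the most widely adopted such mechanism is the Global Privacy Control (GPC) signal, which participating browsers and extensions transmit as a Sec-GPC: 1 HTTP header on each request. The following is a minimal, illustrative sketch of server-side detection in TypeScript using Express; the middleware name and the opt-out handling are hypothetical placeholders, and what a compliant response actually requires will vary by state law.

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Illustrative middleware: detect the Global Privacy Control signal.
// Per the GPC specification, supporting browsers send "Sec-GPC: 1"
// with each HTTP request.
function detectGpc(req: Request, res: Response, next: NextFunction): void {
  res.locals.gpcOptOut = req.header("Sec-GPC") === "1";
  next();
}

app.use(detectGpc);

app.get("/", (_req: Request, res: Response) => {
  if (res.locals.gpcOptOut) {
    // Hypothetical handling: treat the signal as an opt-out of the
    // "sale" or "sharing" of personal data for targeted advertising,
    // e.g., by not loading third-party ad tags for this visitor.
    res.send("GPC detected: serving page without third-party ad trackers.");
  } else {
    res.send("Serving page with the standard ad stack.");
  }
});

app.listen(3000);
```

On the client side, the same signal is exposed to scripts as the navigator.globalPrivacyControl property, so tag-management logic can check it before firing advertising pixels.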
In addition to directly affecting consumer rights and business obligations in their respective states, these laws may come to have a broader impact by influencing discussions around federal privacy legislation. Importantly, the new Maryland law adopted the data minimization paradigm from one of the stalled federal bills, marking a shift in US state law from a permissive, largely notice-and-opt-out approach to a stricter, GDPR-like regime limiting the collection of data to only what is “necessary”.
While for several years businesses approached the fracturing US privacy landscape as “California plus”, grounding their compliance efforts in the CCPA while keeping an eye on state-by-state developments, this approach will now give way to a more integrative strategy. California is no longer the strictest regime, though it continues to be an outlier in its application to employees’ data, and new frameworks, such as Maryland’s data minimization principles or Washington’s sector-specific My Health My Data Act, will need to be accounted for. State laws also diverge in applicability thresholds – for example, Texas introduced a small business exemption based not on revenue thresholds but on whether an entity qualifies as a small business under the definitions of the United States Small Business Administration. And while some states provide entity-level exemptions for businesses covered by federal laws such as HIPAA, GLBA or FERPA, others allow only a much more limited data-level exemption. Other differences manifest in the treatment of specific categories of data, such as kids’ information or biometrics, and in the scope of consumer privacy rights.
Health Privacy –
Don’t be surprised if the red wave in the federal elections leads to a blue counterwave in state efforts to protect consumer health information, particularly – though not only – in the context of reproductive rights and gender-affirming care. Already, back in 2023, Washington state enacted the My Health My Data Act, which has since been emulated by legislation in Nevada. Over the past couple of months, we’ve seen a reproductive data privacy geofencing bill in New York (A 5517); a bill specifically scoped to reproductive data privacy in Michigan (SB 1082); a bill on AI and mental health data in Texas (HB 1265); multiple bills on location and health data in California (AB 45 and AB 67); and a bill on privacy in the context of reproductive rights and gender-affirming care in Virginia (SB 754).
At the federal level, we will see significant changes to the Health Insurance Portability and Accountability Act of 1996 (HIPAA) regulations in the near future. Recently, the U.S. Department of Health and Human Services (HHS) proposed updates to the HIPAA Security Rule to strengthen the security of electronic protected health information (ePHI). The proposed rule outlines several major revisions to the Security Rule, including the removal of the distinction between “required” and “addressable” implementation specifications, the introduction of compliance deadlines and documentation requirements, and more detailed risk analysis guidelines. Additionally, the rule proposes new notice requirements when workforce members’ access to ePHI is changed or terminated, updates to contingency planning and security incident response procedures, and the introduction of new security controls. It further mandates the encryption of ePHI both at rest and in transit, along with the required use of multi-factor authentication.
Kids’ Privacy –
Few areas are as contentious – yet draw as much bipartisan convergence – as kids’ privacy. Policymakers, educators and parents alike assert that something must be done about kids’ privacy, but what? Everyone is anxious about kids’ use of devices, apps and social media sites that generate and consume a gushing stream of data, but regulatory protective measures present formidable risks to free speech and online anonymity. As discussed above, this may be one area where Congress steps up and delivers federal legislation. The latest bipartisan efforts include KOSPA, sponsored by Sens. Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT), and COPPA 2.0, sponsored by Sens. Ed Markey (D-MA) and Bill Cassidy (R-LA).
In this area too, states haven’t been sitting on their hands. Just this past year, New York passed two important kids’ privacy laws: the Child Data Protection Act, which focuses on kids’ data protection, and the Stop Addictive Feeds Exploitation (SAFE) For Kids Act, which targets addictive apps and technologies. Meanwhile, Maryland passed an Age-Appropriate Design Code Act, modeled after the California law that, for now, is held up in constitutional litigation.
Regulators too have waded into this space. While the FTC’s commercial surveillance rulemaking is kaput, its COPPA Rule refresh is very much viable. Furthermore, enforcement agencies from the FTC to the California and Texas AGs have demonstrated vigor in pursuing violations of kids’ privacy rights. We expect state-level interest in and enforcement of children’s privacy rights to grow in 2025 and beyond.
Sensitive Data, Ad Tech and Data Brokers –
In 2024, we saw the regulation of data brokers veer, for the first time, off a privacy track and onto one anchored in national security. As tensions with China heightened over national security and trade policy, policymakers recognized concerns around foreign governments’ access to US persons’ sensitive personal data. Consequently, Congress passed, and the Biden administration signed, legislation intended to protect the sensitive personal information of U.S. persons from being accessed by China or entities under its control, and the President issued an accompanying Executive Order. While the “TikTok law” – the Protecting Americans from Foreign Adversary Controlled Applications Act of 2024 – drew a lot of media attention, Congress also passed the Protecting Americans’ Data from Foreign Adversaries Act of 2024 (PADFA) as an add-on to a supplemental appropriations bill supporting Israel and Ukraine. In addition, the President issued Executive Order (EO) 14117 on Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern, which was accompanied by extensive DOJ rulemaking published just days before the end of 2024. While PADFA focuses on preventing data brokers from selling personally identifiable sensitive data of U.S. individuals to a foreign adversary country or an entity controlled by one, the EO has a much broader remit. It applies not only to data brokers but to any entity that makes Americans’ bulk sensitive personal data or certain U.S. Government-related data accessible to countries of concern, notably China (including Hong Kong), and extends beyond data sales to investment, employment and vendor agreements involving such entities and access to data.
Data brokers attracted the scrutiny of other regulators too. Last month, the CFPB issued a proposed rule for Protecting Americans from Harmful Data Broker Practices. The rule would apply the protections under the Fair Credit Reporting Act to a category of data brokers that have traditionally not been viewed as consumer reporting agencies, if such businesses sold information about a consumer’s income or financial tier, credit history, credit score, or debt payments, regardless of how the information is used. The rule also asserts that advertising and marketing are not “permissible purposes” for which consumer reporting agencies may furnish consumer reports. However, given the November election results, the fate of the rule – and indeed of the CFPB itself – has become unclear.
The FTC too focused on enforcement actions against data brokers. In a series of enforcement actions, starting with InMarket and X-Mode at the beginning of 2024 and culminating with Mobilewalla and Gravy Analytics at the end of the year, the agency cracked down on data brokers selling location data, particularly in sensitive contexts. Importantly for this industry, the FTC held that brokers cannot simply rely on contractual language with data providers to verify consumer consent; rejected data brokers’ creation of sensitive location segments, such as ones tied to healthcare clinics, places of worship, LGBTQ gatherings, or political rallies; and cracked down on brokers’ practice of monetizing data inappropriately collected and retained from real-time bidding (RTB) exchanges. The FTC’s decisions carry implications not only for data brokers, but also for companies up and down the data supply chain, requiring businesses to ensure consumers provide verifiable consent, honor opt-outs, block sensitive locations, disclose retention practices, and verify data hygiene and accountability. Industry groups too tightened their best practices to prohibit the use, sale, and transfer of precise location information related to “sensitive points of interest”.
The FTC’s criticism of the ad tech industry extended beyond data brokers. In September, the agency released a lengthy report titled, “A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services,” with recommendations addressing data minimization, restrictions on data sharing, protection of kids’ and teens’ personal information, and automated decision making. In a November blog post, the FTC criticized Data Clean Rooms, stating they “are not rooms, do not clean data, and have complicated implications for user privacy, despite their squeaky-clean name.” The FTC added: “Companies shouldn’t view Data Clean Rooms as a way to get around their obligations under the law or the promises they have made to consumers.”
States too kept their eyes on the data broker industry. With Oregon’s data broker law going into effect and the Texas Attorney General announcing an enforcement sweep against unregistered data brokers, registration requirements now figure in four states: California, Vermont, Oregon and Texas. In recent rulemaking under California’s DELETE Act, the CPPA narrowed the “direct relationship” exception in the statutory definition of “data broker” to situations where “a consumer intentionally interacts with a business for the purpose of obtaining information about, accessing, purchasing, using, or requesting the business’s products or services within the preceding three years.” This means the CPPA would consider a business to be a data broker, even one with a direct consumer relationship, so long as it sold personal information about consumers that it did not collect directly from them. While still a year out, the DELETE Act requires the CPPA to establish a “Deletion Request and Opt-Out Platform” by January 1, 2026. By August 2026, data brokers will have to honor deletion requests submitted through the platform and check it for new requests every 45 days. As a result, opt-out rates are expected to skyrocket.
And if the intentions of policymakers weren’t clear enough, the powerful attorneys general of Texas and California announced broad enforcement sweeps against data brokers, culminating in a set of fines and penalties.
Artificial Intelligence –
Did anyone say AI? No technological or business development has captured the public’s imagination over the past few years like the AI revolution. Whether it’s generative AI putting personal data and intellectual property at risk or artificial general intelligence (AGI) threatening human existence, everyone is searching for AI governance frameworks. One of President-Elect Trump’s campaign promises was to repeal President Biden’s Executive Order on AI – and, more broadly, his AI policy – on day one in office. The Trump Administration will address AI policy primarily through the lens of (a) competition with China on innovation; (b) national security; and (c) energy policy. While Trump’s 2019 Executive Order on AI paid tribute to civil rights, that issue was nowhere near as prominent there as in Biden’s 2023 Executive Order or his 2022 “Blueprint for an AI Bill of Rights.”
But here too, the states may have the final say. By stepping back from AI governance and ethics, the Administration will effectively cede territory to the states. Already, Colorado is implementing its Artificial Intelligence Act, focused on AI systems rendering consequential decisions, and California’s AB 2013 requires generative AI providers to ensure training data transparency. California will likely revisit SB 1047, the vetoed bill that would have required developers of “frontier models” to implement prescriptive safety measures. In Texas, Rep. Giovanni Capriglione (R), who authored the state’s privacy law, recently submitted his Responsible AI Governance Act. Indeed, dozens, if not hundreds, of AI bills are making their way through the legislative pipeline in state capitals.
Litigation –
Historically, privacy wasn’t a field of vibrant litigation. Enforcement was largely the remit of regulatory agencies, such as the FTC and HHS OCR. No more. The past couple of years have seen a surge in privacy litigation, with the plaintiffs’ bar weaving privacy causes of action out of federal and state laws ranging from the VPPA to CIPA, BIPA and even trap-and-trace legislation and Daniel’s Law. These cases target business practices ranging from integrating cookies, pixels, session replay tools or SDKs, to using third parties to offer customer service chatbots or track email open rates. Despite some recent decisions the industry views as helpful, such as the Massachusetts Supreme Judicial Court’s decision in Vita v. New England Baptist Hospital, rejecting the application of wiretapping laws to the use of pixel technology on websites, and the Ninth Circuit’s decision in Zellmer v. Meta, holding that under BIPA, biometric identifiers must be capable of identifying a person, plaintiffs are likely to continue to pepper businesses with demand letters and claims threatening mass arbitration or class action litigation.
***
Privacy regulation and enforcement activity were neither quiet nor predictable last year. Rather, the landscape of privacy and AI law in 2024 continued to evolve at breakneck speed, with both federal and state governments taking a more active role in defining the regulatory framework. Key developments included a growing focus on protecting consumer health data, enhancing children’s privacy, and addressing the practices of data brokers. Privacy litigation surged, underscoring the urgency of robust protections and risk mitigation measures. Looking ahead to 2025 and beyond, we can expect continued efforts to standardize privacy regulation, with increased emphasis on consumer rights, transparency, and the accountability of companies handling sensitive data. And wherever federal policymakers and regulators take a step back, expect state lawmakers and attorneys general to press forward. There’s never a vacuum in technology regulation, an area that draws intense media focus and broad political appeal. The intersection of technology and regulation will remain dynamic, requiring both innovation and adaptation to secure individuals’ privacy in an interconnected world. As new challenges arise, the push for stronger enforcement, clearer guidelines for businesses, and more comprehensive protections for vulnerable groups like children will be central to the ongoing dialogue. The coming years will shape the future of privacy and AI law, balancing innovation with the protection of personal freedoms.
Reprinted with permission from IAPP