O’Melveny Worldwide

Data Security and Privacy Predictions for 2024

February 22, 2024

US companies collecting consumer or employee data already face a complex web of overlapping obligations from federal and state laws and regulators, and we anticipate that the web, at least domestically, will spread wider in 2024 across five crucial domains: (1) enforcement of new federal disclosure obligations, (2) new comprehensive state data privacy laws in more states, (3) new state, federal, and international laws addressing AI and data privacy, (4) more litigation concerning cookies, and (5) more enforcement actions and litigation and proposed legislation concerning children’s privacy. Internationally, though, companies seeking to transfer data between Europe and the US should enjoy greater certainty and consistency as the US and its allies across the Atlantic have reached new agreements that standardize expectations.  

1. Federal regulators will increase scrutiny of investment adviser cybersecurity practices, while Congress will attempt to establish national privacy standards.  

In 2023, the federal government continued to try to keep pace with increasing cybersecurity threats with new regulations from the SEC and FTC, although efforts in Congress to pass concrete statutory changes stalled. There were signs of legislative action early in the year with the introduction of the Data Privacy Act (“DPA”) in February, but the bill is no longer progressing.

New cybersecurity reporting requirements adopted by the SEC in July 2023 are now in effect: as of December 18, 2023 for reporting cybersecurity incidents to the SEC on Form 8-K, and for fiscal years ending on or after December 15, 2023 for reporting in the Form 10-K annual report. The rules are meant to enhance and standardize disclosures by public companies regarding cybersecurity risk management, strategy, governance, and incidents.1 Registrants experiencing a material cybersecurity incident must now file a Form 8-K under Item 1.05 within four business days of determining that the incident is material and describe the material aspects of the incident’s nature, scope, timing, and impact. In their Form 10-K annual reports, registrants must also describe their processes for assessing, identifying, and managing material risks from cybersecurity threats, as well as the roles the board of directors and management play in overseeing those processes. Foreign private issuers experiencing an incident must file similar reports on Forms 6-K and 20-F. 

In October 2023, the FTC adopted an amendment to its Safeguards Rule,2 which will require financial institutions to notify the FTC in the event of an unauthorized acquisition of unencrypted customer information involving 500 or more customers. The notice, required beginning in May 2024, must include a general description of the event, the type of information involved, and the number of consumers affected. 

The first quarter of 2024 is likely to see continued attempts by the federal government to protect data privacy and enhance cybersecurity on several fronts. In November, the Digital Accountability and Transparency to Advance Privacy Act (“DATA Privacy Act”) was introduced in the Senate to establish national data privacy standards.3 The SEC has also signaled its intentions for the new year, stating in its 2024 Examination Priorities that its Division of Examinations plans to focus on broker-dealers and investment advisers. Specifically, the SEC intends to scrutinize how broker-dealers and advisers protect investor information and records, and how firms promote cyber resiliency, prevent account intrusions, and safeguard personally identifiable information.4 The SEC will also review registrants’ policies and governance procedures, their responses to cyber-related incidents, including ransomware attacks, and their staff training on preventing identity theft and protecting customer information. Moving forward in 2024, organizations should review their cybersecurity and data management policies to ensure that they are taking appropriate steps to protect consumer data. Board members, executives, and management should also examine their own protocols to avoid liability under the new requirements.

2. States will continue to lead the way on US privacy laws. 

If recent trends continue, more states will pass comprehensive data privacy laws in 2024. Before 2023, only five states – California, Colorado, Connecticut, Utah, and Virginia – had passed comprehensive consumer data privacy legislation. Eight more states joined the group last year: Delaware, Florida, Indiana, Iowa, Montana, Oregon, Tennessee, and Texas. Legislatures in some 40 states have introduced or are considering some form of consumer privacy bill,5 so it is likely that more regulation is on the way, and soon.

While obligations under some state statutes overlap, complying with the regulations in one state does not guarantee that a company is complying in another. For example, data privacy laws in Utah6 and Iowa7 do not provide consumers with the right to correct errors in their personal data, while laws in California8 and Colorado9 do afford such rights. Companies should consider auditing their data collection and storage practices to ensure that they are complying in all the states where they do business, have an active presence, or from which they collect or maintain resident data. Note that not all the new laws are effective yet: laws in Texas, Oregon, and Montana become effective later in 2024, and Indiana’s becomes effective in 2026. 

California, a pioneer in data privacy laws and the state with arguably the broadest comprehensive state privacy law, added to its regulatory regime with the California Privacy Rights Act (“CPRA”), which became fully effective at the start of 2023. Employee data are no longer excluded from protection under the California Consumer Privacy Act. Employees now have the right to know how personal information is collected, to delete certain personal information, to correct inaccuracies, and to opt out of or limit the use or sharing of certain data. Employers are now required to provide employees with comprehensive data privacy notices—similar to those given to consumers—describing what kind of personal data is collected and how it is used. Companies that have not already taken steps to comply with the CPRA should consider mapping the employee data that they maintain, developing and distributing written policies and procedures for employee data, and establishing a process for responding to employee requests. 

3. State, federal, and international governments will attempt to pass regulations concerning artificial intelligence.  

Given how fast artificial intelligence and other automated technologies have developed, we expect to see continued focus on regulating these technologies on both the state and federal levels in 2024. Companies should continue to pay careful attention to the types of information they feed to artificial intelligence tools and evaluate obligations to conduct risk assessments based on the use of automated decision-making technologies. 

The second half of 2023 saw a surge in regulatory efforts. President Biden’s Executive Order on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence in October required heads of various US agencies to set guidelines regarding the development and use of artificial intelligence.10 Over the next few months, the US will likely see changes at the federal level influenced by this executive order’s requirements. 

In other artificial intelligence developments at the federal level, the Department of Health and Human Services (“HHS”) finalized the Health Data, Technology, and Interoperability (“HTI-1”) rule.11 HTI-1 establishes cutting-edge transparency requirements for artificial intelligence and other predictive algorithms that are part of certified health information technology.12 The purpose is to promote responsible artificial intelligence and make it possible for clinical users to access reliable information about the algorithms they use to make decisions.13  

The California Privacy Protection Agency (“CPPA”), established under the CPRA, has likewise proposed regulations addressing the use of artificial intelligence with respect to consumers’ personal information. At its December 8, 2023 meeting, the CPPA board discussed revised risk assessment regulations, which included, among other proposals, additional detail on the training of artificial intelligence and on disclosures surrounding the use of information.14 The draft automated decision-making technology (“ADT”) regulations would require businesses to provide consumers with an opt-out notice before using ADT, would expand the scope of the consumer’s right to access information to include information about the business’s use of ADT, and would impose special provisions for processing children’s information. As the CPPA board continues discussing these draft regulations in 2024, businesses should monitor their evolution; they may indicate areas of enforcement in 2024.

At the state and local level, artificial intelligence regulation has focused on, and likely will continue to address, the use of artificial intelligence for significant life decisions and opportunities, including employment, credit, and housing. Effective July 5, 2023, a New York City law amending the City’s administrative code made it unlawful for employers to use automated employment decision tools (“AEDT”) for employment decisions unless the tool has been subject to a bias audit within the last year.15 Washington, DC introduced the “Stop Discrimination by Algorithms Act of 2023” to prohibit users of algorithmic decision-making from making eligibility determinations in a discriminatory manner.16 However, the bill has stalled since February of 2023.17 It is likely that other large municipalities will pass similar regulations.

Efforts to regulate AI are gaining momentum in the EU, too. On December 8, 2023, EU countries and European Parliament members reached a provisional deal on rules for using artificial intelligence.18 Following a risk-based approach, the rules require high-risk artificial intelligence systems to comply with strict requirements and call for consumers to receive notice if they are interacting with artificial intelligence entities rather than human beings. Fines for violating the rules range from the greater of €15 million or 3% of global annual turnover to €7.5 million or 1.5% of global annual turnover for supplying incorrect information. Over the course of 2024, it is likely that the EU AI Act and reaction to it—especially toward compliance obligations and enforcement risk—will, much like the GDPR, influence the regulatory approach in the US.

4. Cookie litigation decisions will continue to create uncertainty for website providers. 

Litigation claims arising from companies inaccurately describing their cookie-collection practices have become increasingly prevalent in California. Claims have been brought under a variety of laws, including the California Invasion of Privacy Act (“CIPA”),19 California’s Comprehensive Computer Data Access and Fraud Act (“CDAFA”),20 the California Consumer Privacy Act (“CCPA”),21 California’s Consumers Legal Remedies Act (“CLRA”),22 California’s Unfair Competition Law (“UCL”),23 and California common law, including intrusion upon seclusion, negligent or fraudulent misrepresentation, and unjust enrichment. While at least some of these laws seem ill-suited to this type of claim, courts are still sorting through these relatively novel theories. 

For example, the case law relating to wiretap and invasion-of-privacy claims brought under CIPA has not been fully settled. Courts have gone both ways on the question of whether third-party technology embedded on a website intercepts communications as a third party (which requires consent) or participates as a party to the communication (which does not). In In re Facebook, Inc. Internet Tracking Litig., 956 F.3d 589, 608 (9th Cir. 2020), the court found, at least at the pleading stage and without addressing every element of the claim, that Facebook was not exempt from liability under CIPA as a party to the communication where the plaintiffs alleged that the defendants had caused the simultaneous, unknown duplication and transmission of the plaintiffs’ communications with third-party sites. However, the court in In re Google Inc. Cookie Placement Consumer Priv. Litig., 806 F.3d 125, 152 (3d Cir. 2015), held that Google was a party to the electronic transmissions that formed the bases of the plaintiffs’ wiretapping claims and that those claims must be dismissed. This year, it will be important for businesses to keep an eye on the development of CIPA case law to determine whether wiretapping claims may be brought against website providers for their use of cookies.  

As for CDAFA claims, in Brown v. Google LLC, 525 F. Supp. 3d 1049, 1075 (N.D. Cal. 2021), the plaintiffs alleged that Google had stated that browsing history, cookies, and site data would not be saved during private browsing sessions, but that users’ browsers were still transmitting data to Google’s servers. The court held that, under those allegations, the plaintiffs stated a claim under CDAFA by alleging that Google’s hidden code transmitted data without notice while the users were in private browsing mode. Similarly, In re Nickelodeon Consumer Privacy Litig., 827 F.3d 262 (3d Cir. 2016), held that Nickelodeon may have committed the common law tort of intrusion upon seclusion by tracking consumers through a third party’s use of cookies because its website stated that it did not collect personal information about kids. Companies will need to remain vigilant about the language in their privacy policies and cookie banners describing how they collect personal information.

The use of cookie banners on websites has become a recommended practice to comply with the notice-at-collection requirement under the CCPA and to avoid wiretap and invasion-of-privacy claims. These banners should permit users to manage their cookie settings by choosing which categories of cookies they want to remain active. Over the course of 2024, it is likely that we will see an uptick in cookie-related claims, brought both by individuals and as class actions.  
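By way of illustration only, the sketch below shows one way a website might implement such a category-level cookie banner in TypeScript: non-essential cookies are set only for categories the visitor has affirmatively allowed, and the visitor’s choices are stored so the banner is not shown again. The element IDs, storage key, and cookie names are hypothetical assumptions for this example; the sketch is not drawn from any particular statute, regulation, or vendor tool and is not advice on what any given law requires.

```typescript
// Minimal, hypothetical sketch of a category-level cookie-consent banner.
// Non-essential cookies are written only for categories the visitor allows,
// and the choice is persisted so the banner is not shown again.
// All identifiers (CONSENT_KEY, element IDs, cookie names) are illustrative.

interface ConsentState {
  analytics: boolean;
  advertising: boolean;
}

const CONSENT_KEY = "cookie-consent"; // assumed localStorage key

function readConsent(): ConsentState | null {
  const raw = localStorage.getItem(CONSENT_KEY);
  return raw ? (JSON.parse(raw) as ConsentState) : null;
}

function saveConsent(state: ConsentState): void {
  localStorage.setItem(CONSENT_KEY, JSON.stringify(state));
}

// Set cookies only in the categories the visitor has opted into.
function applyConsent(state: ConsentState): void {
  if (state.analytics) {
    // Placeholder for loading an analytics tag; here we just set a cookie.
    document.cookie = "site_analytics=enabled; path=/; max-age=31536000";
  }
  if (state.advertising) {
    document.cookie = "site_ads=enabled; path=/; max-age=31536000";
  }
}

// Show the banner only if no choice has been recorded yet; until the visitor
// chooses, no non-essential cookies are set.
function initConsentBanner(): void {
  const existing = readConsent();
  if (existing) {
    applyConsent(existing);
    return;
  }
  const banner = document.createElement("div");
  banner.innerHTML = `
    <p>We use cookies. Choose which categories to allow:</p>
    <label><input type="checkbox" id="consent-analytics"> Analytics</label>
    <label><input type="checkbox" id="consent-advertising"> Advertising</label>
    <button id="consent-save">Save choices</button>`;
  document.body.appendChild(banner);

  banner.querySelector<HTMLButtonElement>("#consent-save")!.addEventListener("click", () => {
    const state: ConsentState = {
      analytics: banner.querySelector<HTMLInputElement>("#consent-analytics")!.checked,
      advertising: banner.querySelector<HTMLInputElement>("#consent-advertising")!.checked,
    };
    saveConsent(state);
    applyConsent(state);
    banner.remove();
  });
}

// Run once the page has loaded.
window.addEventListener("DOMContentLoaded", initConsentBanner);
```

The key design point in this sketch is that the default is restrictive: no analytics or advertising cookie is written until the visitor has made an affirmative choice, which mirrors the notice-and-choice posture described above.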

5. Children’s privacy and interactions with online platforms will continue to be a battleground for enforcement actions and litigation. 

Children’s privacy has become a central focus of legislative and regulatory efforts at both the state and federal level.

Over the course of 2023, we saw increased scrutiny by administrative agencies, especially the FTC, of the use of children’s data. In May 2023, the FTC and DOJ filed a complaint against Amazon relating to Amazon’s retention of children’s voice recordings collected by Alexa-enabled devices in violation of the COPPA Rule.24 Under the resulting settlement, Amazon was required to pay the FTC $25 million and to change its deletion practices, including by deleting inactive child accounts. Less than one month later, Microsoft agreed to pay the FTC $20 million to settle charges that it violated COPPA by collecting personal information from children without notifying their parents or obtaining their parents’ consent.25 

The FTC’s agenda for the next few months reflects this increased focus on children’s privacy issues. The FTC is currently seeking comment (with the comment period closing March 11, 2024) on proposed changes to the COPPA Rule, which has not been updated since 2013.26 The proposed changes would include, among other things, (i) a requirement of separate verifiable parental consent to disclose children’s information to third-party advertisers; (ii) more robust data security requirements, including the implementation of a written children’s personal information security program; and (iii) a public, written data retention policy for children’s personal information. The revised Rule would also have a significant impact on EdTech platforms, limiting the use of students’ personal information to school-authorized educational purposes rather than commercial purposes. 

At the state level, California has taken the lead by enacting the California Age-Appropriate Design Code Act (“CAADCA”), which was initially set to take effect on July 1, 2024. The CAADCA dramatically expands children’s privacy protections by (a) extending those protections to children under 18, five years older than the age covered by COPPA, and (b) requiring businesses providing online services likely to be accessed by children under 18 to, among other things, conduct data protection impact assessments, estimate the age of users, provide default privacy settings that offer a higher level of protection for children, use language appropriate for children in privacy notices, provide transparency when children are being tracked or monitored, and refrain from using children’s personal information for certain activities deemed harmful, such as selling personal information or tracking a child’s precise geolocation.  

When and whether the CAADCA will take effect is still uncertain. In September 2023, the US District Court for the Northern District of California granted a preliminary injunction blocking the law from going into effect; that ruling is currently on appeal.27

This increasing regulatory focus on children’s privacy is likely to be matched by litigation concerning the collection, use, and sale of children’s data and the impact of social media and other online platforms on children. Meta recently became the target of such litigation, with a coalition of state attorneys general alleging that the company (i) knowingly designed and deployed harmful features on its platforms that purposefully addict children and teens and (ii) collected and maintained the personal information of children under 13 without their parents’ consent.28 As we move into 2024, companies offering online platforms that may be accessible to children should continue to monitor the data collected from children and ensure that appropriate parental consents are obtained to avoid the potential risk of an enforcement action or litigation.  

6. Companies operating internationally will enjoy some consistency—for now—in transferring data between Europe and the US.

New agreements in 2023 between the US and the European Union and between the US and the United Kingdom provided long-awaited clarity for businesses seeking to transfer data out of Europe and into the US. The EU-US Data Privacy Framework (“DPF”) and the UK-US Data Bridge both took effect last year, creating consistent standards for companies seeking to transfer data across the Atlantic. Companies under the jurisdiction of the Federal Trade Commission (“FTC”) or the Department of Transportation (“DOT”) can now file a single registration to begin transferring data out of the EU or the UK and into the United States without using Standard Contractual Clauses or International Data Transfer Agreements. 

Companies seeking to take advantage of this new registration system self-certify with the Department of Commerce that they will follow DPF requirements, which include providing transparent notice about data processing to data subjects, allowing data subjects to opt out of disclosure to third parties for purposes other than those for which the data were initially collected, and ensuring data security and appropriate handling of subject data by the collecting company and any third parties that receive the data. Companies must also create mechanisms to investigate complaints regarding non-compliance with the DPF and agree to be subject to investigation and enforcement by the FTC and DOT.  

Once a company registers and complies with the DPF, it can use the same system to opt in to the UK-US Data Bridge. Certain UK data, such as genetic and biometric data, must be labeled as “sensitive” before transfer to the US, and the transferring party must take additional precautions to ensure that the data are protected. Also, some data, such as “journalistic data” gathered for publication, broadcast, or public communication, cannot be transferred via the bridge.  

The certainty and ease of this agreed system is still subject to future change, however. The EU and US have reached agreements concerning data transfers in the past that were struck down by the European Court of Justice over concerns that EU data subjects were not adequately protected against US government surveillance.29 Activists who previously brought suits in the European Court of Justice have signaled an intent to do so again.30 The new UK-US Data Bridge also faces scrutiny: the UK Information Commissioner’s Office, an independent regulator that reports to the UK Parliament, has published an opinion questioning the sufficiency of the protections offered to UK citizens under the agreement. 

Companies seeking to take advantage of the current structures should ensure that they comply with the DPF and register with the Department of Commerce. Depending on the type of data transferred, companies may opt to continue using alternative methods such as Standard Contractual Clauses or International Data Transfer Agreements.


1 The new and amended SEC rules include: 17 C.F.R. § 229.10-1305 (2023); 17 C.F.R. § 229.106 (2023); 17 C.F.R. § 229.601 (2023); 17 C.F.R. § 232.10-903 (2023); 17 C.F.R. § 232.405 (2023); 17 C.F.R. § 239.13 (2023); 17 C.F.R. § 240.13a-11 (2023); 17 C.F.R. § 240.15d-11 (2023); 17 C.F.R. § 249.220f (2023); 17 C.F.R. § 249.306 (2023); 17 C.F.R. § 249.308 (2023); 17 C.F.R. § 249.10 (2023).
2 16 C.F.R. §314.2 (2023); 16 C.F.R. §314.4 (2023).
3 Digital Accountability and Transparency to Advance Privacy Act, S. 3337, 118th Cong. § 1 (2023).
4 Division of Examinations Release, Securities and Exchange Commission, 2024 Examination Priorities, (2023), https://www.sec.gov/files/2024-exam-priorities.pdf.  
5 Heather Morton, 2023 Consumer Data Privacy Legislation, National Conference of State Legislatures (Sept. 28, 2023) https://www.ncsl.org/technology-and-communication/2023-consumer-data-privacy-legislation.
6 Utah Code § 13-61-103.
7 Iowa Code Ann. § 715D.1 (West).
8 Cal. Civ. Code § 1798.100 et seq.
9 Colo. Rev. Stat. § 6-1-1301.
10 Exec. Order No. 14110, 88 FR 75191 (2023).
11 See 45 CFR 170 (2024); 45 CFR 171 (2024); see also, Press Release, Department of Health and Human Services, HHS Finalizes Rule to Advance Health IT Interoperability and Algorithm Transparency (Dec. 13, 2023), https://www.healthit.gov/topic/laws-regulation-and-policy/health-data-technology-and-interoperability-certification-program
12 See id.
13 See id.
14 The revised regulations can be found here: https://cppa.ca.gov/meetings/materials/20231208.html
15 N.Y.C. Local Law No. 144, § 20-871.
16 Washington, D.C. Council Bill B25-0114.
17 Stop Discrimination by Algorithms Act of 2023, Washington, D.C. Council Bill B25-0114 (dccouncil.gov).
18 Press Release, Council of the EU, Artificial intelligence act: Council and Parliament strike a deal on the first rules for AI in the world (Dec. 9, 2023), https://www.consilium.europa.eu/en/press/press-releases/2023/12/09/artificial-intelligence-act-council-and-parliament-strike-a-deal-on-the-first-worldwide-rules-for-ai/.  
19 Cal. Penal Code § 630 et seq.
20 Cal. Penal Code § 502.
21 Cal. Civ. Code § 1798.192.
22 Cal. Civ. Code §§ 1750–1784.
23 Cal. Bus. & Prof. Code § 17200.
24 Press Release, FTC, FTC and DOJ Charge Amazon with Violating Children’s Privacy Law by Keeping Kids’ Alexa Voice Recordings Forever and Undermining Parents’ Deletion Requests, https://www.ftc.gov/news-events/news/press-releases/2023/05/ftc-doj-charge-amazon-violating-childrens-privacy-law-keeping-kids-alexa-voice-recordings-forever.
25 Press Release, FTC, FTC Will Require Microsoft to Pay $20 million over Charges it Illegally Collected Personal Information from Children without Their Parents’ Consent, https://www.ftc.gov/news-events/news/press-releases/2023/06/ftc-will-require-microsoft-pay-20-million-over-charges-it-illegally-collected-personal-information.  
26 Press Release, FTC, FTC Proposes Strengthening Children’s Privacy Rule to Further Limit Companies’ Ability to Monetize Children’s Data, https://www.ftc.gov/news-events/news/press-releases/2023/12/ftc-proposes-strengthening-childrens-privacy-rule-further-limit-companies-ability-monetize-childrens.
27 Press Release, California Office of Attorney General, Attorney General Bonta Continues Defense of California’s Age Appropriate Design Code Act, https://oag.ca.gov/news/press-releases/attorney-general-bonta-continues-defense-california%E2%80%99s-age-appropriate-design.
28 Press Release, New York State Attorney General, Attorney General James and Multistate Coalition Sue Meta for Harming Youth, https://ag.ny.gov/press-release/2023/attorney-general-james-and-multistate-coalition-sue-meta-harming-youth.
29 Schrems v. Data Protection Commissioner, C-362/14, Judgment, ¶ 87 (Oct. 6, 2015); Data Protection Commissioner v. Facebook Ireland Limited, C-311/18, Judgment, ¶ 197 (July 16, 2020).
30 Joseph Duball, DPC 2022: EU-US Data Privacy Framework on track, Schrems challenge to come, IAPP (Nov. 17, 2022) https://iapp.org/news/a/at-dpc-2022-eu-us-data-privacy-framework-on-track-schrems-challenge-to-come/.


This memorandum is a summary for general information and discussion only and may be considered an advertisement for certain purposes. It is not a full analysis of the matters presented, may not be relied upon as legal advice, and does not purport to represent the views of our clients or the Firm. Randall W. Edwards, an O’Melveny partner licensed to practice law in California, Sid Mody, an O’Melveny partner licensed to practice law in Texas, Scott W. Pink, an O’Melveny special counsel licensed to practice law in California, Joshua Goode, an O’Melveny associate licensed to practice law in the District of Columbia, Emily Losi, an O’Melveny associate licensed to practice law in New York, Sean Milde, an O’Melveny associate licensed to practice law in New York, and Kayla Tanaka, an O’Melveny associate licensed to practice law in California, contributed to the content of this newsletter. The views expressed in this newsletter are the views of the authors except as otherwise noted.

© 2024 O’Melveny & Myers LLP. All Rights Reserved. Portions of this communication may contain attorney advertising. Prior results do not guarantee a similar outcome. Please direct all inquiries regarding New York’s Rules of Professional Conduct to O’Melveny & Myers LLP, Times Square Tower, 7 Times Square, New York, NY, 10036, T: +1 212 326 2000.