2026 Data Security and Privacy Compliance Checklist: Key US State Law Updates, AI Rules, COPPA Changes, and Global Data Protection Risks
April 13, 2026
If your organization handles consumer, employee, or government data, 2026 is shaping up to be a year that demands closer attention to privacy and security compliance. The biggest pressure points come from expanding state privacy laws, new AI-related obligations, updated children’s privacy rules, and evolving international frameworks. This update highlights the most important legal and regulatory changes businesses should be tracking now. It also offers a practical checklist to help teams tighten privacy hygiene before enforcement risk grows.
Key Takeaways:
- As of March 2026, 20 US states now have comprehensive privacy laws, with Indiana, Kentucky, and Rhode Island taking effect in 2026 and adding new assessment, notice, and transparency obligations.
- California, Connecticut, Colorado, Maryland, and Minnesota are raising the bar on risk assessments, profiling, biometric data, opt-out tools, and privacy notice accuracy.
- AI regulation is expanding fast, with California, Colorado, and the EU increasing transparency, disclosure, and governance expectations for automated decision-making tools.
- Children’s privacy remains a major enforcement focus, and the updated COPPA rule brings an April 22, 2026 compliance deadline for new consent, retention, and disclosure requirements.
- The practical 2026 checklist centers on updating privacy notices, maintaining data and AI inventories, testing opt-out tools, strengthening vendor oversight, and formalizing risk assessment procedures.
1. US States Continue to Create an Evolving Compliance Landscape
State law is still the primary source of privacy obligations in the US. We recommend companies annually update their privacy notices to reflect new state coverage, 2026 rule changes, and heightened expectations around transparency, opt-out signals, sensitive data, and automated decision-making. In 2025, eight comprehensive state laws became effective: Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee, followed by three more on January 1, 2026: Indiana, Kentucky, and Rhode Island. This brings the total number of states with comprehensive privacy laws to over 20. Beyond general disclosure obligations, new state laws and updates to existing state laws create new compliance obligations companies should proactively address.
A. Newly Enacted State Laws for 2026
Kentucky & Indiana: Each effective as of January 1, 2026, the Kentucky Consumer Data Protection Act (KYCDPA) and the Indiana Consumer Data Protection Act (INCDPA) apply to entities that conduct business in Kentucky or Indiana or otherwise produce products or services that are targeted to residents of the respective state; subject entities also must “control or process” the personal data of either (i) at least 100,000 Kentucky or Indiana residents, or (ii) at least 25,000 Kentucky or Indiana residents while deriving over 50% of gross revenue from the sale of personal data.1
The KYCDPA and the INCDPA require subject entities to conduct Data Protection Impact Assessments related to certain processing activities (e.g., targeted advertising, sensitive data, etc.). The KYCDPA Data Protection Impact Assessment obligations apply to processing activities that occur on or after June 1, 2026. The INCDPA Data Protection Impact Assessment does not carry with it a similar grace period; such assessments are required for processing activities that occurred on or after December 31, 2025.
Rhode Island: Effective as of January 1, 2026, the Rhode Island Data Transparency and Privacy Protection Act (RHDPA) applies to entities that conduct business in Rhode Island or otherwise produce products or services that are targeted to Rhode Island residents; subject entities also must “control or process” the personal data of either (i) at least 35,000 Rhode Island residents, or (ii) at least 10,000 Rhode Island residents while deriving over 20% of gross revenue from the sale of personal data.2
The RHDPA requires subject entities to conduct Data Protection Assessments for certain processing activities deemed to represent a “heightened risk of harm” to consumers. There is no grace period for such assessments—they are required for such processing activities that occur on or after January 1, 2026.
Notably, the RHDPA includes a broadly applicable privacy notice requirement which applies to all commercial websites and internet service providers that conduct business in Rhode Island or that have customers in Rhode Island and “collect, store, or sell customers’ personally identifiable information,” even those who are not otherwise subject to the RHDPA’s other provisions.3
B. Certain Other State Obligations: Updates, Amendments, and 2025 Laws
California: On January 1, 2026, California launched the Delete Request and Opt-out Platform (DROP), implementing California’s Delete Act by allowing residents to submit a single deletion request to all registered data brokers at once. Beginning August 1, 2026, registered data brokers must process deletion requests within 45 days after receiving any request made pursuant to DROP.4 Organizations that collect, aggregate, or sell consumer data should evaluate whether registration is required and ensure they have the technical capacity to integrate and comply with DROP by the August 1, 2026 processing deadline.
The California Privacy Protection Agency (“CalPrivacy”) has also recently begun rulemaking activity focused on (1) streamlining consumers’ ability to exercise privacy rights and (2) opt-out preference signals; a preliminary public written comment period is underway, with comments accepted through early April.5 This signals that CalPrivacy will continue to focus on making it easier for consumers to exercise their privacy rights.
Connecticut: Effective July 1, 2026, Connecticut’s latest amendments to the Connecticut Data Privacy Act (CTDPA) fine‑tune data minimization and overhaul profiling and transparency duties. Controllers must now limit collection of personal information to what is “reasonably necessary and proportionate” to disclosed purposes, and processing of sensitive data must be “reasonably necessary in relation to the purposes” for which it is processed, with separate consent required to sell sensitive data. The amendments expand profiling protections by extending opt‑out rights for consumers to “any automated decision” producing legal or similarly significant effects, adding a new right to contest such profiling decisions and requiring dedicated profiling “impact assessments” for covered uses beginning August 1, 2026 (discussed further below). The amendments also expand privacy notice obligations by requiring subject entities to disclose whether personal data is collected, used, or sold for training large language models.6
Maryland: Effective October 1, 2025, the Maryland Online Data Privacy Act (MODPA) set a new standard for personal data restrictions by requiring subject entities to limit the collection and processing of personal information to what is “reasonably necessary and proportionate” to provide or maintain a specific product or service requested by the consumer, moving beyond a purely notice-and-consent framework. In particular, controllers may not collect or process sensitive data except where the collection or processing is strictly necessary to provide or maintain a consumer-requested product or service and the subject entity obtains the consumer’s consent.7
Minnesota: Effective July 31, 2025, the Minnesota Consumer Data Privacy Act (MCDPA) contains expanded transparency and profiling-related rights for consumers. Consumers can request that subject entities provide a list of the specific third parties to whom the entity has actually disclosed the consumer’s personal data. On profiling, Minnesota now provides a distinct right (with a corresponding obligation on the controller to comply) to question the results of profiling used for decisions with legal or similarly significant effects, to receive the rationale behind each decision, and, critically, to be informed of actions the consumer could have taken, and can take in the future, to secure a different decision. Consumers also have the right to review and correct the underlying data, with re-evaluation of the decision based on the corrected information.8
We expect intensified state enforcement around the clarity and accuracy of privacy notices and the real-world functionality of linked rights tools (especially opt-outs, cookie banners, and automated signals), building on visible 2025 actions and pending 2026 matters.9 Businesses should ensure that public-facing disclosures match actual processing and that consumer privacy controls operate as described across the channels in which the business operates (e.g., website, mobile applications); those controls should also be load-tested to confirm they can handle the technical and network demands generated by the volume of requests.
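For teams validating that automated opt-out signals actually work end to end, the sketch below shows a minimal server-side check for the Global Privacy Control (GPC) signal, which browsers send as a `Sec-GPC: 1` request header. The header name and value follow the GPC specification; the surrounding request model and profile fields are simplified assumptions for illustration, not a complete compliance implementation (real deployments must also persist the opt-out and honor the client-side `navigator.globalPrivacyControl` property).

```python
def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True if the request carries a valid Sec-GPC opt-out signal."""
    # Header lookup should be case-insensitive; per the GPC spec, the
    # only valid opt-out value is the string "1".
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


def apply_privacy_preferences(headers: dict, profile: dict) -> dict:
    """Treat a GPC signal as an opt-out of sale/sharing for this consumer.

    The profile keys here are hypothetical; map them to whatever your
    consent-management system actually stores.
    """
    if gpc_opt_out_requested(headers):
        profile = {**profile, "sale_opt_out": True, "sharing_opt_out": True}
    return profile
```

A check like this belongs in automated tests for every channel (web, mobile web) so that the "visible confirmation" the regulations expect reflects behavior that genuinely occurred.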
2. Specific Risk Assessment and Audit Requirements
Organizations should expect more explicit, formalized obligations to conduct privacy and security risk reviews before launching higher‑risk data uses, with several regimes now mandating documented assessments. In practice, this means aligning executive attestations and submissions to jurisdiction‑specific timelines and building and maintaining programs that scope and evaluate activities such as targeted advertising, sensitive data handling, automated decision‑making, and biometric processing.
California: In 2025, California finalized and adopted CCPA regulations which became effective January 1, 2026, requiring comprehensive privacy risk assessments before initiating processing that presents a “significant risk” to consumer privacy, including selling/sharing personal information, processing sensitive personal information, using ADMT for significant decisions, certain profiling in HR/education contexts, and training ADMT/biometric technologies. Covered activities that occurred before 2026 require such an assessment by December 31, 2027, with a senior-executive attestation and summary due April 1, 2028, and annual submissions thereafter.
The new rules also establish mandatory, independent cybersecurity audits for businesses meeting specified revenue/data-volume thresholds, with phased first certifications due April 1 in 2028–2030 based on size. California further expanded opt-out preference signal obligations, including visible confirmation that requests (e.g., GPC) were processed, and clearer symmetry/consent rules for cookie banners, requiring status indication on websites beginning this year.10
Colorado: Recent expansions to the Colorado Privacy Act (CPA) create biometric-specific obligations, effective as of July 1, 2025, including broader notice and consent requirements, a mandate to adopt a written biometric policy with retention and prescribed deletion timelines, and prohibitions on selling biometric identifiers without consent. Applicability is broadened to entities processing any amount of biometric identifiers or data collected from Colorado residents, even if they do not otherwise meet CPA thresholds, and controllers must provide clear notice at or before collection specifying purposes, retention periods, and processor disclosures. Given that biometric data remains sensitive data under the CPA, these changes increase when data protection assessments are required, particularly where biometrics are used for identification or in high-risk contexts.
Connecticut: Beginning August 1, 2026, amendments to the Connecticut Data Privacy Act (CTDPA) require any controller that engages in profiling for the purpose of making a decision that produces a legal or similarly significant effect concerning a consumer to conduct a dedicated “impact assessment” for that profiling, a requirement separate from the CTDPA’s existing data protection assessment obligations for certain high-risk processing. These new impact assessment requirements apply only to processing activities created or generated on or after August 1, 2026 (i.e., the obligation is not retroactive).
3. Artificial Intelligence: Regulations are Expanding
Across jurisdictions, 2026 brings a convergence of privacy and AI governance requirements that make early planning and system inventories essential. Notably, California’s updated CCPA framework (including privacy risk assessments and audits) addresses obligations around automated decision‑making, profiling, and AI training, while the EU AI Act’s Article 50 transparency obligations take full effect on August 2, 2026.
California & Colorado: California’s recent updates to its CCPA program bring automated decision‑making technology (ADMT) into companies’ compliance considerations, requiring privacy risk assessments for high‑risk processing and related disclosures in notices. Organizations should expect assessments to cover uses such as profiling, decisions with legal or similarly significant effects, use of personal information to train ADMT, and certain AI‑related inferences.
Companies should align privacy notices and user interfaces with cross‑state ADMT expectations, including clear explanations of automated processing and opt‑out mechanics where applicable. For example, Colorado mandates detailed explanations of automated decisions, foreseeable risks of algorithmic discrimination, and the nature, source, and extent of information collected and used by the ADMT.11
European Union: The EU AI Act continues its phased implementation. Rules governing general-purpose AI models became applicable in August 2025, and transparency obligations under Article 50—including requirements that humans be informed when interacting with AI systems and that AI-generated content and deep fakes be clearly labeled—remain on track to take effect on August 2, 2026.12 However, under the EU’s Digital Omnibus simplification package, the enforcement timeline for high-risk AI systems may be substantially delayed. The Council of the EU recently published an update noting that application dates of December 2, 2027 for stand-alone high-risk AI systems and August 2, 2028 for high-risk AI systems embedded in certain products may be included in the package.13
Organizations should maintain an up‑to‑date inventory of AI and automated decision-making tools, classify risk, and map data uses in anticipation of new assessment, disclosure, and audit duties taking effect in the US and EU. Regulators are emphasizing transparency, human oversight, and verifiable operational controls, not just policies—meaning programs should operationalize inventory, testing, and monitoring now. Both the California and the EU regimes expect that organizations have a current register of AI systems, documented risk classification, and clear disclosures. Building these foundations in early 2026 reduces remediation costs and supports consistent notices, opt‑out handling, and cross‑regional governance.
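As a concrete starting point for the AI/ADMT inventory described above, the sketch below models a single register entry with a risk classification and a simple flag for when a documented assessment is likely needed. The field names and risk tiers are assumptions for illustration only; actual classifications must track the definitions in the applicable regime (for example, the EU AI Act’s risk categories or California’s ADMT rules), and legal review should drive the final taxonomy.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class RiskTier(Enum):
    # Hypothetical tiers for illustration; align to the governing regime.
    MINIMAL = "minimal"
    LIMITED = "limited"   # e.g., transparency duties may apply
    HIGH = "high"         # e.g., assessments and audits may apply


@dataclass
class AISystemRecord:
    name: str
    owner: str                       # accountable business team
    purpose: str                     # documented use case
    personal_data_categories: List[str] = field(default_factory=list)
    risk_tier: RiskTier = RiskTier.MINIMAL
    legal_significant_effects: bool = False  # triggers profiling rights
    last_assessment: Optional[str] = None    # ISO date of latest review


def needs_assessment(rec: AISystemRecord) -> bool:
    """Flag records that likely require a documented risk assessment."""
    return rec.risk_tier is RiskTier.HIGH or rec.legal_significant_effects
```

Keeping the register in a structured, queryable form (rather than prose documents) makes it far easier to produce the disclosures, assessments, and audit artifacts the US and EU regimes described above will demand.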
4. Child Privacy Laws
2025 brought renewed federal focus to children’s privacy, with significant changes to the Children’s Online Privacy Protection Act (COPPA) rule and public statements from FTC officials that COPPA enforcement is a priority. COPPA applies to websites and online services that are directed to children under 13 or that have actual knowledge they have collected personal information from such children.
In 2025, the Federal Trade Commission finalized changes to COPPA setting new requirements around the collection, use, and disclosure of children’s personal information. The updated COPPA Rule took effect on June 23, 2025, and carries a compliance deadline of April 22, 2026. Relevant changes include (1) requiring opt-in parental consent for targeted advertising and other disclosures of children’s personal information to third parties, (2) limiting data retention, (3) increasing Safe Harbor14 programs’ transparency by requiring the Safe Harbor programs to publicly disclose their membership lists and report additional information, and (4) adding options for verifiable parental consent when children’s personal information is not disclosed, such as a “text plus” option.15
With children’s privacy being an active enforcement priority and the COPPA rule tightened, companies operating child‑directed services or handling data of users under 13 should promptly verify consent flows, narrow data retention, and refresh disclosures and Safe Harbor governance to match the new requirements. Prioritizing opt‑in parental consent for targeted advertising and third‑party sharing, implementing minimized retention, and adopting approved consent methods will reduce enforcement risk and better align operations with the updated rule set.
5. GDPR – Pending Developments
In late 2025, the European Commission introduced the Digital Omnibus Package, which proposes targeted amendments to the GDPR to simplify compliance and provide clearer legal bases for modern data uses—especially around AI (discussed above), research, and incident reporting. The package would create a relative, entity-specific approach to the definition of personal data—clarifying that data is not covered personal data for an entity that lacks means “reasonably likely to be used” to make such data capable of identifying the underlying individual.16 It would recognize AI development and operation as a potential legitimate interest (subject to a balancing test, safeguards, and an unconditional right to object) and introduce a narrowly framed allowance to handle residual special‑category data in AI datasets under strict minimization and protective measures. The package would also clarify that scientific research may constitute a legitimate interest compatible with further processing.
On breach notifications and related governance, the proposal would shift the law to a more explicitly risk‑based and harmonized regime. For example, controllers would notify authorities and individuals only where a breach is likely to pose a high risk, the deadline for notification would extend from 72 to 96 hours, and the European Data Protection Board (EDPB) would propose a common template and risk indicators. Breach reporting obligations would move to a single EU entry point designed to streamline potentially overlapping regimes.
The package would also centralize Data Privacy Impact Assessment (DPIA) practices by tasking the EDPB with producing EU‑wide lists of the type of processing requiring/not requiring a DPIA and providing a standard methodology and template for executing DPIAs.17
While the Digital Omnibus Package signals potential clarity for subject entities through its simplification, we expect there to be ongoing negotiation between Member States and regulators—subject entities should continue to track the developments and potential adoption of the Digital Omnibus Package and to maintain compliance with the existing laws until any amendments are finalized and in effect.
6. International – Certain Laws to Consider
While the US and EU remain at the forefront of data privacy regulation, other countries continue to enter the regulatory space. Globally operating entities should carefully consider their global compliance obligations. We highlight two relevant regimes here:
India: India’s Digital Personal Data Protection Act (DPDP) utilizes a phased roll-out of compliance obligations, with a key 2026 milestone—registration and oversight of “consent managers” beginning in November 2026. The consent manager under the DPDP operates as a fiduciary of a data controller, giving individuals a platform to give, manage, review, and withdraw consent. Registration of such consent managers is limited to India‑incorporated entities meeting eligibility criteria such as a minimum net worth and those that possess sufficient technical, operational, and financial capacities. Full substantive compliance for data fiduciaries follows in May 2027 (e.g., notices, consent, breach reporting, security safeguards, and rights enablement).18
With full operation of the DPDP coming ever closer, 2026 should be used to finalize consent architecture, select/register consent managers, and prepare compliance integrations and governance.
Australia: For entities subject to Australia’s data protection regimes, starting December 10, 2026, privacy policies must provide specific transparency around automated decision‑making (ADM), including disclosure of the personal information used in ADM, decisions made solely by automated means, and decisions in which automation is substantially involved, in each case for decisions that meaningfully affect individuals’ rights or interests. Organizations should review and update disclosures to explain ADM use cases and impacts in plain language ahead of the effective date.19
2026 Data Security & Privacy Compliance Check-In: Key Action Items for Organizations
Access a PDF version of the checklist here.
As data security and privacy obligations continue to expand—driven by newly effective state laws, heightened risk assessment and audit requirements, AI-specific regulations, updates to children’s privacy rules, and evolving international data protection regimes—organizations should take proactive steps to evaluate and strengthen their compliance programs. The following checklist highlights priority action items drawn from significant US and international developments taking effect in or around 2026.
- Update privacy notices on an annual cadence and upon material change.
- Ensure coverage of newly effective state laws, enhanced transparency on sensitive data, opt-out signals, and automated decision-making; align public-facing disclosures with actual processing to avoid enforcement risk.
- Maintain a current enterprise data and AI/ADMT inventory.
- Map personal data flows, identify AI/ADMT use cases, classify risk levels, and document data uses to support disclosures, assessments, and audits under US state laws and the EU AI Act timelines.
- Implement and regularly test privacy preference opt-outs (including universal opt-out signals).
- Validate end-to-end functionality of GPC/opt-out preference signals with visible confirmation, ensure cookie banner symmetry/consent, and test performance across channels and load.
- Prepare for independent cybersecurity audits where applicable.
- Track thresholds and phase-ins, with first certifications due April 1 in 2028–2030 depending on size.
- Strengthen third-party/vendor management.
- Update DPAs and diligence to reflect state-specific transparency and profiling/ADMT obligations (e.g., Minnesota’s right to a list of specific third parties actually receiving data; Colorado processor disclosures for biometrics); verify vendor-operated rights mechanisms function as described.
- Create entity-specific procedures for applicable risk assessments.
- Cover high-risk processing (selling/sharing PI, sensitive PI, significant-impact ADMT/profiling, training ADMT/biometrics) and track jurisdictional deadlines.
1 Kentucky Revised Statutes §§ 367.3611 et seq., https://apps.legislature.ky.gov/law/Statutes/chapter.aspx?id=39092; Indiana Code §§ 24-15 et seq., https://iga.in.gov/laws/ic/downloads.
2 Rhode Island General Laws §§ 6-48.1 et seq., https://webserver.rilegislature.gov/Statutes/TITLE6/6-48.1/INDEX.htm.
3 Rhode Island General Laws § 6-48.1-3.
4 https://cppa.ca.gov/regulations/pdf/data_broker_reg_delete_act_statute_eff_20260101.pdf
5 Track developments at: https://cppa.ca.gov/regulations/.
6 Connecticut SB 1295, https://www.cga.ct.gov/2025/ACT/PA/PDF/2025PA-00113-R00SB-01295-PA.PDF.
7 Maryland Commercial Law Code § 14-4701 et seq., https://mgaleg.maryland.gov/mgawebsite/Laws/StatuteText?article=gcl§ion=14-4701&enactments=false.
8 Minnesota Statutes §§ 325M.10 et seq., https://www.revisor.mn.gov/statutes/cite/325M.
9 For instance, in February 2026 the California Attorney General reached a settlement with the Walt Disney Company, requiring payment of $2.75 million in civil penalties related to Disney’s failure to provide CCPA-compliant opt-out rights to consumers; see California Won’t Let it Go: Attorney General Bonta Announces $2.75 Million Settlement with Disney, Largest CCPA Settlement in California History, https://oag.ca.gov/news/press-releases/california-wont-let-it-go-attorney-general-bonta-announces-275-million. Connecticut’s Attorney General likewise reached a settlement with TicketNetwork, Inc., requiring it to pay $85,000 in penalties and agree to comply with the CTDPA after failing to provide a compliant privacy notice under the act; see Attorney General Tong Announces $85,000 Settlement with TicketNetwork for Violations of the Connecticut Data Privacy Act, https://portal.ct.gov/ag/press-releases/2025-press-releases/attorney-general-tong-announces-settlement-with-ticketnetwork.
10 https://cppa.ca.gov/regulations/pdf/ccpa_updates_cyber_risk_admt_appr_text.pdf.
11 https://cppa.ca.gov/regulations/pdf/ccpa_updates_cyber_risk_admt_appr_text.pdf; Colorado SB24-205, https://leg.colorado.gov/bills/sb24-205.
12 https://artificialintelligenceact.eu/article/50/.
14 COPPA Safe Harbor programs are FTC-approved industry self-regulatory programs whose members are deemed COPPA-compliant when they follow the programs’ guidelines (e.g., Children’s Advertising Review Unit or kidSAFE).
15 16 CFR § 312, https://www.federalregister.gov/documents/2025/04/22/2025-05904/childrens-online-privacy-protection-rule.
16 Note, the Digital Omnibus Package remains subject to ongoing drafting and negotiation. We flag that the provisions concerning personal data are subject to significant opposition and may likely be dropped from any finalized package. https://www.edpb.europa.eu/news/news/2026/digital-omnibus-edpb-and-edps-support-simplification-and-competitiveness-while_en.
17 https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52025PC0837; See also https://digital-strategy.ec.europa.eu/en/faqs/digital-package.
This memorandum is a summary for general information and discussion only and may be considered an advertisement for certain purposes. It is not a full analysis of the matters presented, may not be relied upon as legal advice, and does not purport to represent the views of our clients or the Firm. Randall W. Edwards, an O’Melveny partner licensed to practice law in California; Sid Mody, an O’Melveny partner licensed to practice law in Texas; Scott W. Pink, an O’Melveny special counsel licensed to practice law in California; Reema Shah, an O’Melveny partner licensed to practice law in New York; and Andrew Kus, an O’Melveny associate licensed to practice law in California, contributed to the content of this newsletter. The views expressed in this newsletter are the views of the authors except as otherwise noted.
© 2026 O’Melveny & Myers LLP. All Rights Reserved. Portions of this communication may contain attorney advertising. Prior results do not guarantee a similar outcome. Please direct all inquiries regarding New York’s Rules of Professional Conduct to O’Melveny & Myers LLP, 1301 Avenue of the Americas, Suite 1700, New York, NY, 10019, T: +1 212 326 2000.