The European Court of Justice Tightens the Requirements for Credit Scoring under the GDPR

by Katja Langenbucher

Professor Katja Langenbucher (photo courtesy of author)

The quality of a credit scoring model depends on the data it has access to. Yesterday, the European Court of Justice (ECJ) decided its first landmark case on data protection in a credit-scoring context. The court issued a preliminary ruling involving a consumer’s request against a German company (“Schufa”) to disclose credit-score-related data. The practice of credit reporting and credit scoring varies enormously across Europe. Somewhat similar to the US, the UK has separate credit reporting and scoring agencies. In France, the central bank manages a centralized database that is accessible to credit institutions, which establish their own proprietary scoring models. In Germany, a private company (the “Schufa”) has a de facto monopoly, holding data on 68 million German citizens and establishing the enormously widespread “Schufa” score. Banks look to that score when extending credit, as do landlords, mobile phone companies, utility suppliers, and, sometimes, potential employers. This everyday use stands in stark contrast with a lack of transparency as to which data Schufa collects and how it models the score.

Continue reading

Privacy Experts Share Tips for Managing an Effective Privacy Program from PCCE’s Fall Security, Privacy, and Consumer Protection Conference

Left to Right: James Haldin, Judy Titera, Melissa Harrup, Nicole Friedlander, and Avi Gesser (©Hollenshead: Courtesy of NYU Photo Bureau)

On November 17, 2023, the NYU Law Program on Corporate Compliance and Enforcement (PCCE) hosted a standing-room-only full-day conference on Security, Privacy, and Consumer Protection. The conference addressed issues such as managing effective cybersecurity and privacy compliance programs, the use of “dark patterns” to manipulate consumer choices, whether privacy regulation and enforcement actions actually prompt firms to update their privacy policies, and the new amendments to the New York Department of Financial Services cybersecurity rules. A full agenda of the conference, along with speaker bios, is available here. In this post, several participants from the panel on Managing an Effective Privacy Program in a Time of Increasing Regulatory and Legal Risk share further thoughts on the issue.

Continue reading

The EU AI Act – Navigating the EU’s Legislative Labyrinth

by Avi Gesser, Matt Kelly, Martha Hirst, Samuel J. Allaman, Melissa Muse, and Samuel Thomson

From left to right: Avi Gesser, Matt Kelly, Martha Hirst, Samuel J. Allaman, and Melissa Muse. Not pictured: Samuel Thomson. (Photos courtesy of Debevoise & Plimpton LLP).

As legislators and regulators around the world are trying to determine how to approach the novel risks and opportunities that AI technologies present, the draft European Union Artificial Intelligence Act (the “EU AI Act” or the “Act”) is a highly anticipated step towards the future of AI regulation. Despite recent challenges in the EU “trilogue” negotiations, proponents still hope to reach a compromise on the key terms by December 6, with a view to passing the Act in 2024; most of its provisions would then become effective sometime in 2026.

As one of the few well-progressed AI-specific laws currently in existence, the EU AI Act has generated substantial global attention. Analogous to the influential role played by the EU’s GDPR in shaping the contours of global data privacy laws, the EU AI Act similarly has the potential to influence the worldwide evolution of AI regulation.

This blog post summarizes the complexities of the EU legislative process to explain the current status of, and next steps for, the draft EU AI Act. It also includes steps that businesses may want to start taking now in preparation for incoming AI regulation.

Continue reading

$10 Million Penalty Against D.E. Shaw a Major Step in SEC’s Enforcement of Rule 21F-17(a)

by Benjamin Calitri

Benjamin Calitri (Photo courtesy of Kohn, Kohn & Colapinto LLP)

The SEC recently charged an investment advisor, D.E. Shaw, with Rule 21F-17(a) violations for including clauses in its employment and severance agreements that prohibited whistleblowing. For these violations, D.E. Shaw was fined $10 million. This is a significant development in the enforcement of Rule 21F-17(a), as the penalty is over twenty times larger than the previous highest penalty for a Rule 21F-17(a) violation.

It remains to be seen whether sanctions of this size are the new normal for Rule 21F-17(a) actions, but the D.E. Shaw case is undoubtedly a major development. The action dramatically changes the cost-benefit analysis for companies seeking to use contracts to silence whistleblowers and sends a clear message that the SEC is taking violations of Rule 21F-17(a) seriously.

Continue reading

An Ounce of Prevention is Worth a Pound of Cure . . . or an Imposed Compliance Monitorship: A Fresh Look at the DOJ’s Corporate Enforcement Toolkit Applied to Sanctions and Export Controls Enforcement

by Brent Carlson and Michael Huneke

From left to right: Brent Carlson and Michael Huneke (Photos courtesy of authors)

In our last article, we discussed the evolution of export controls penalties.[1] Beyond monetary penalties, the U.S. Department of Justice (“DOJ”) has additional items in its corporate enforcement toolkit that dramatically increase the cost of non-compliance. These include the DOJ’s new policies that require companies to claw back or withhold executive compensation, require CEOs and chief compliance officers to make pre-release compliance certifications, and expand the grounds for appointing independent compliance monitors.

Such corporate enforcement trends significantly increase the value of making front-end investments to avoid the “pound of cure.” In this post, we take a “fresh look” at these trends with an eye towards sanctions and export controls enforcement and offer practical guidance for dealing with them.

Continue reading

Former Prosecutors and Crypto Experts Comment on the Binance/Changpeng Zhao Enforcement Actions

The NYU Program on Corporate Compliance and Enforcement (PCCE) is following the recent federal enforcement actions against Binance, the world’s largest cryptocurrency exchange, and its founder Changpeng Zhao. In this post, crypto experts, former prosecutors, and the former Superintendent of the New York Department of Financial Services offer their expert insights on these developments.

Left to right: Maria Vullo, Eugene Ingoglia, Daniel Payne, Ijeoma Okoli, and Paul Krieger (Photos courtesy of authors)

Continue reading

California Privacy Protection Agency Publishes Draft Regulations on Automated Decisionmaking Technology

by Hunton Andrews Kurth LLP

On November 27, 2023, the California Privacy Protection Agency (“CPPA”) published its draft regulations on automated decisionmaking technology (“ADMT”). The regulations propose a broad definition for ADMT that includes “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.” ADMT also would include profiling, which would mean the “automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”

Continue reading

From Washington to Brussels: A Comparative Look at the Biden Administration’s Executive Order and the EU’s AI Act

by Marianna Drake, Marty Hansen, Lisa Peets, Will Capstick, Jayne Ponder, and Yaron Dori

Top left to right: Marianna Drake, Marty Hansen, and Lisa Peets. Bottom left to right: Will Capstick, Jayne Ponder, and Yaron Dori. (Photos courtesy of Covington & Burling LLP)

On October 30, 2023, days ahead of government leaders convening in the UK for an international AI Safety Summit, the White House issued an Executive Order (“EO”) outlining an expansive strategy to support the development and deployment of safe and secure AI technologies (for further details on the EO, see our blog here). As readers will be aware, the European Commission released its proposed Regulation Laying Down Harmonized Rules on Artificial Intelligence (the EU “AI Act”) in 2021 (see our blog here). EU lawmakers are currently negotiating changes to the Commission text, with hopes of finalizing the text by the end of this year, although many of its obligations would only begin to apply to regulated entities in 2026 or later.

The EO and the AI Act stand as two important developments shaping the future of global AI governance and regulation. This blog post discusses key similarities and differences between the two.

Continue reading

Hackers Turned Whistleblowers: SEC Cybersecurity Rules Weaponized Over Ransom Threat

by Andrew J. Ceresney, Charu A. Chandrasekhar, Luke Dembosky, Avi Gesser, Matthew E. Kaplan, Erez Liebermann, Benjamin R. Pedersen, Steven J. Slutzky, Jonathan R. Tuttle, Matt Kelly, and Kelly Donoghue

Top left to right: Andrew J. Ceresney, Charu A. Chandrasekhar, Luke Dembosky, Avi Gesser, Matthew E. Kaplan, and Erez Liebermann
Bottom left to right: Benjamin R. Pedersen, Steven J. Slutzky, Jonathan R. Tuttle, Matt Kelly, and Kelly Donoghue (Photos courtesy of Debevoise & Plimpton LLP)

On November 7, 2023, the prolific ransomware group AlphV (a/k/a “BlackCat”) reportedly breached software company MeridianLink’s information systems, exfiltrated data, and demanded payment in exchange for not publicly releasing the stolen data. While this type of cybersecurity incident has become increasingly common, the threat actor’s next move was less predictable. AlphV filed a whistleblower tip with the U.S. Securities and Exchange Commission (the “SEC”) against its victim for failing to publicly disclose the cybersecurity incident. AlphV wrote in its complaint[1]:

We want to bring to your attention a concerning issue regarding MeridianLink’s compliance with the recently adopted cybersecurity incident disclosure rules. It has come to our attention that MeridianLink, in light of a significant breach compromising customer data and operational information, has failed to file the requisite disclosure under Item 1.05 of Form 8-K within the stipulated four business days, as mandated by the new SEC rules.

As we have previously reported, the SEC adopted final rules mandating disclosure of cybersecurity risk, strategy, and governance, as well as material cybersecurity incidents. This includes new Item 1.05 of Form 8-K, which, beginning December 18, will require registrants to disclose certain information about a material cybersecurity incident within four business days of determining that a cybersecurity incident it has experienced is material. Though AlphV jumped the gun on the applicability of new Item 1.05, its familiarity with, and exploitation of, its target’s public disclosure obligations is a further escalation in a steadily increasing trend of pressure tactics by leading ransomware groups.

Continue reading

EU Advocate General Defines “Identity Theft” and Reaffirms GDPR Compensation Threshold

by Kristof Van Quathem and Aleksander Aleksiev 

Left to right: Kristof Van Quathem and Aleksander Aleksiev (Photos courtesy of Covington & Burling LLP)

EU Advocate General Collins has reiterated that individuals’ right to claim compensation for harm caused by GDPR breaches requires proof of “actual damage suffered” as a result of the breach, and “clear and precise evidence” of such damage – mere hypothetical harms or discomfort are insufficient. The Advocate General also found that unauthorised access to data does not amount to “identity theft” as that term is used in the GDPR.

The right for individuals to claim compensation for data breaches has long been a controversial and uncertain aspect of the GDPR – see our previous blogs here, here, here, and here for example.

Continue reading