Tag Archives: Katja Langenbucher

AI Judgment Rule(s)

by Katja Langenbucher

Professor Katja Langenbucher (photo courtesy of author)

In an upcoming paper, I explore whether the use of AI to enhance decision-making brings about radical change for legal doctrine or is, by contrast, just another new tool. The essay submits that we must rethink the law’s implicit assumption that (and how) humans make the decisions that corporate law regulates. If those implicit assumptions about how people make decisions shift, the legal rules built on them need review.

Decision-making is the cornerstone of corporate life and of keen interest to a variety of scholarly disciplines, ranging from rational-actor theories and behavioral approaches to neuro-economics and psychology. The law has its own theories of decision-making. Many are normative, specifying decision procedures and outcomes. In addition, the law rests on implicit theories of decision-making: a legal rule will look different if, for instance, it assumes that decision-making follows optimal choice patterns rather than that heuristics and biases guide human decisions.


The European Court of Justice Tightens the Requirements for Credit Scoring under the GDPR

by Katja Langenbucher


Professor Katja Langenbucher (photo courtesy of author)

The quality of a credit scoring model depends on the data it has access to. Yesterday, the European Court of Justice (ECJ) decided its first landmark case on data protection in a credit-scoring context, issuing a preliminary ruling on a consumer’s request against a German company (“Schufa”) to disclose credit-score-related data. The practice of credit reporting and credit scoring varies enormously across Europe. Somewhat similar to the US, the UK has separate credit reporting and scoring agencies. In France, the central bank manages a centralized database accessible to credit institutions, which build their own proprietary scoring models. In Germany, a private company (the “Schufa”) holds a de facto monopoly, maintaining data on 68 million German citizens and producing the enormously widespread “Schufa” score. Banks look to that score when extending credit, as do landlords, mobile phone companies, utility suppliers, and, sometimes, potential employers. This everyday use stands in stark contrast with the lack of transparency as to which data Schufa collects and how it models the score.


Privacy Experts React to Meta’s 1.2 Billion Euro Data Transfer Fine

Editor’s Note: NYU Law’s Program on Corporate Compliance and Enforcement (PCCE) is following the developments from the recently announced, record-breaking fine against Meta Platforms, Inc. for alleged violations of Europe’s General Data Protection Regulation (GDPR) over transfers of personal data from the EU to the U.S. The relevant decisions of the Irish Data Protection Commission and the European Data Protection Board are available here and here. The question of compliance with rules for trans-Atlantic data transfers is subject to significant legal uncertainty and political disputes between the relevant jurisdictions. In this post, privacy experts offer insights into the decision.


From left to right: Joe Jones, Katja Langenbucher, Thomas Streinz, and Trisha Sircar. (Photos courtesy of the authors)


Explaining MiCA: Part of the EU’s Approach to Crypto and Digital Asset Regulation

by Katja Langenbucher


Professor Katja Langenbucher

FTX, Kraken, TerraLuna, and similar cases have recently prompted the SEC to move ahead with a long list of enforcement actions. While some applaud the securities regulator’s push, others criticize its lack of explicit rule-making. Still others would prefer that a banking regulator step in and authorize a national trust bank charter for issuers of stablecoins. Against this background, the upcoming EU Markets in Crypto Assets Regulation (MiCA) provides an illustration of a tailor-made regime combining elements of securities and banking regulation.

MiCA is part of the larger EU digital finance package, which includes rules on operational resilience (DORA), a DLT pilot regime for security tokens, and amendments to several financial services Directives. Arguably, the “libra/diem scare” to monetary autonomy was a main driver pushing the EU Commission to consider new legislation. Additionally, the differing speed of legislators across EU Member States brought about the risk of unhelpful regulatory competition, suggesting a level-playing-field strategy instead.


Regulating AI – The Next “Brussels Effect”?

by Katja Langenbucher


How to deal with the challenges of artificial intelligence has been at the forefront of lawmakers’ and regulators’ initiatives around the world. In August, the FTC announced that it is exploring rules to tackle commercial surveillance; SEC Chair Gary Gensler has voiced concerns over AI in the fintech space; and the CFTC has issued its own “primer” on AI. This morning, the Council of the European Union adopted its common position on a new regulation, the “Artificial Intelligence Act.” Most other EU institutions have already issued their comments, and the EU Parliament is scheduled to hold a final vote on the report in the first quarter of 2023.


European Parliament Adopts New Whistleblower Directive

by Dr. Katja Langenbucher

The history of whistleblower protection under European law is short. Ten European countries have provided effective protection for whistleblowers in their national laws; elsewhere, protection has remained fragmented and uneven across policy areas. Only since 2014 have EU institutions been obliged to introduce internal rules protecting whistleblowers who are officials of the EU institutions. By the end of 2015, the EU Parliament had adopted similar rules. The EU Commission expressed general support for whistleblower protection in 2016, at that time chiefly concerned with tax evasion, and in 2017 started a public consultation on the topic. The Parliament followed up with a resolution and an own-initiative report by its Committee on Legal Affairs, leading to the “Proposal for a directive of the European Parliament and of the Council on the protection of persons reporting on breaches of Union law (COM(2018)0218).” On 16 April 2019, the European Parliament adopted this proposal. Once the Council approves it, Member States will have two years to transpose its rules.

Whistleblower protection has to strike a delicate balance. For whistleblowers to come forward, they need safe reporting channels and effective protection against retaliation. Corporations, for their part, have a legitimate interest in avoiding reputational damage when disclosed accusations turn out to be false. The Directive aims to reconcile both concerns by (1) defining the areas of law eligible for whistleblowing, (2) framing a profile of the whistleblower qualifying under the new rules, (3) setting out the type of information covered, (4) establishing the reporting channels, and (5) specifying the protection offered.