The European Court of Justice Tightens the Requirements for Credit Scoring under the GDPR

by Katja Langenbucher

Professor Katja Langenbucher (photo courtesy of author)

The quality of a credit scoring model depends on the data it has access to. Yesterday, the European Court of Justice (ECJ) decided its first landmark case on data protection in a credit-scoring situation. The court issued a preliminary ruling on a consumer’s request against a German company (“Schufa”) to disclose credit-score-related data. The practice of credit reporting and credit scoring varies enormously across Europe. Somewhat similar to the US, the UK has separate credit reporting and scoring agencies. In France, the central bank manages a centralized database that is accessible to credit institutions, which build their own proprietary scoring models. In Germany, a private company (the “Schufa”) holds a de facto monopoly: it keeps data on 68 million German citizens and produces the enormously widespread “Schufa” score. Banks look to that score when extending credit, as do landlords, mobile phone companies, utility suppliers and, sometimes, potential employers. This everyday use stands in stark contrast to the lack of transparency as to which data Schufa collects and how it models the score.

By way of comparison, in the US, consumer reporting agencies such as Equifax, Experian, or TransUnion collect relevant data. They compile consumer reports and furnish them to credit scoring agencies such as FICO. Credit scoring agencies, in turn, build scores to help lenders assess a potential borrower’s creditworthiness. The Fair Credit Reporting Act of 1970 (FCRA), an early piece of US data protection law, provides privacy rights in consumer reports. The Dodd-Frank Act amended two provisions of the FCRA and granted rule-making authority to the Consumer Financial Protection Bureau. Consumer rights include, for instance, one free consumer report a year, a right to dispute the completeness or accuracy of information in the consumer’s file, and a credit freeze relating to identity theft (Solove & Schwartz, Privacy Law Fundamentals, 2019, pp. 111-116). Title 15, United States Code, § 1681g requires that consumer reporting agencies, upon request of the consumer, disclose “all information in the consumer’s file”. However, this right does not cover “any information concerning credit scores or any other risk scores or predictors relating to the consumer”. Instead, § 1002.9(a)(2), (b)(2) of Regulation B, which implements the Equal Credit Opportunity Act (ECOA), requires creditors to provide what is called the “ECOA notice”. If a creditor takes adverse action, it must notify the applicant. The notification must include a statement of specific reasons – the key factors for denying credit. The underlying policy goal is the ECOA’s prohibition on creditors discriminating based on certain protected characteristics. The ECOA notice supports a consumer in litigation against a lender over discriminatory underwriting practices.
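To make the idea of “key factors” concrete, the following sketch shows one common way a points-based scorecard can generate adverse-action reasons, the so-called “max points lost” method: attributes are ranked by how far the applicant’s points fall below the best attainable value. All attribute names, point values, and the cutoff here are hypothetical, invented purely for illustration; they are not drawn from any actual FICO or lender model.

```python
# Minimal sketch of deriving the "key factors" behind an adverse decision
# from a simple points-based scorecard. Everything below is hypothetical.

SCORECARD = {
    # attribute: (applicant's points, maximum attainable points)
    "payment_history":    (35, 60),
    "credit_utilization": (10, 50),
    "length_of_history":  (25, 30),
    "recent_inquiries":   (5, 20),
}
CUTOFF = 100  # hypothetical approval threshold

def score(card):
    """Total score is the sum of points earned across attributes."""
    return sum(points for points, _ in card.values())

def key_factors(card, top_n=2):
    """Rank attributes by points lost relative to the best attainable
    value (the common 'max points lost' method for adverse-action reasons)."""
    lost = {name: best - points for name, (points, best) in card.items()}
    return sorted(lost, key=lost.get, reverse=True)[:top_n]

total = score(SCORECARD)
if total < CUTOFF:
    print(f"Score {total} below cutoff {CUTOFF}; key factors:")
    for reason in key_factors(SCORECARD):
        print(" -", reason)
```

Run on the invented numbers above, the applicant scores 75 and the notice would name credit utilization and payment history as the principal reasons, which is the kind of factor-level statement the ECOA notice requires.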

The question submitted by a German court for a preliminary ruling concerned Art. 22 para. 1 of the GDPR. This rule grants the consumer “the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”. Para. 2 of that rule lists exceptions. There is no such right if the relevant decision is, lit. a, “necessary for entering into or performance of a contract”, if, lit. b, it “is authorized by Union or Member State law (…) which also lays down suitable measures to safeguard that data subject’s rights and freedoms and legitimate interests” or if, lit. c, “it is based on the data subject’s explicit consent”.

The decision has two parts. First, the ECJ answers the submitting court’s question whether Art. 22 para. 1 GDPR covers Schufa’s credit reporting and scoring activities. This involves an interpretation of the terms (i) “decision”, (ii) “based solely on automated processing”, and (iii) “legal or similar effects which significantly affect” the consumer. Second, the ECJ provides the submitting court with a more general assessment of scoring in the context of automated decision-making and guidance on the next steps that court must take.

The ECJ answers the first question in the affirmative, rejecting a core argument of Schufa. The agency had submitted that Art. 22 does not cover credit reporting and scoring for the simple reason that it is the lender, not the scoring agency, that takes a “decision”. Note that, in the US, Fintech data aggregators have raised a similar argument. To escape requirements under the FCRA, some claim to be mere data conduits, rather than credit reporting agencies (Jackson/Tahyar, Fintech Law: The Case Studies, https://projects.iq.harvard.edu/files/fintechlaw/files/fintech_law_the_case_studies_ebook.pdf, pp. 238-241). The ECJ disagrees, opting for a broad reading of the term “decision”. It found support for this reading in recital (71), which explains that a “decision (…) may include a measure, evaluating personal aspects”. Additionally, this recital lists the “automatic refusal of an online credit application” as an example of a similarly significant effect on the consumer. Following the Advocate General’s opinion, the ECJ looks to the entire process of data collection, scoring, profiling, and credit underwriting. It understands Schufa’s scoring as itself a “decision”, at least if the score is of “paramount importance” to the lender’s action. Any other outcome, so the ECJ held, would risk a lacuna whenever several parties are involved: the scoring agency would not take the relevant “decision”, and the lender could not furnish the disclosure the consumer is entitled to receive under the GDPR, because only the scoring agency has access to this information.

It was uncontested among the parties that Schufa’s scoring procedure used “solely automated processing”. As to “legal or similar effects which significantly affect the consumer”, the ECJ references evidence the submitting court had collected. An unsatisfactory credit score, so this court found, will in nearly all cases lead to a denial of credit. Under these circumstances, the credit score is an element of “paramount importance” in the decision-making procedure that leads to a denial of credit. It follows that, in this case, Schufa’s credit scoring activity qualifies as a “decision” entailing “legal or similar effects” under Art. 22 para. 1 GDPR.

This brings the ECJ to the second part of its decision. The GDPR speaks of a “right” not to be subject to a decision based on automated processing. Still, the ECJ interprets the rule as a prohibition that applies automatically, without requiring the consumer to start legal proceedings. Under this reading, Art. 22 para. 2 GDPR contains the three exceptions listed above. The ECJ mentions only para. 2 lit. b, the exception the submitting court had cited. The rule allows Member States to authorize automated decision-making. German federal law on data protection (Section 31 of the Federal Data Protection Act, BDSG) provides a rule on the “use of a probability value regarding specific future behavior of a natural person”.

The ECJ encourages the submitting court to decide whether this rule is a suitable exception under Art. 22 para. 2 lit. b GDPR. However, it makes clear that the GDPR’s requirements are tough to meet. The Member State authorization must, under para. 2 lit. b, lay “down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”. Under Art. 22 para. 4, scoring may use protected characteristics, but only in narrow circumstances. General requirements under Art. 5 and 6, and the more focused rules in Art. 13-15 GDPR, further highlight the GDPR’s skepticism towards automated decision-making. To explain the reason for this cautious approach, the ECJ once again refers to recital (71). Automation increases the risk of discrimination on the basis of protected characteristics unless the processing is fair and transparent, takes the specific context of the data processing into account, uses appropriate statistical models, corrects wrong data entries, and minimizes errors.

One important detail is hidden in a brief remark by the court. Art. 15 para. 1 lit. h GDPR grants the consumer not only the right to obtain information on the existence of automated decision-making, but also “meaningful information about the logic involved”. For credit-scoring agencies such as Schufa, this entails the requirement to explain the “logic involved” to consumers. It remains to be seen whether courts will take a more holistic approach, looking for something akin to the US ECOA notice, or will instead require details of the scoring methodology. For scoring agencies that use AI-generated models, especially of the black-box variant, information of that type will rarely be possible to deliver.
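What “meaningful information about the logic involved” could look like is easiest to see for a transparent model. The following sketch assumes a simple logistic scoring model; the feature names and weights are entirely invented, and Schufa’s actual methodology is undisclosed.

```python
import math

# Hypothetical, transparent logistic scoring model. Feature names and
# weights are invented for illustration and reflect no real agency's model.
WEIGHTS = {
    "on_time_payment_rate": 1.2,     # higher is better
    "credit_utilization": -2.0,      # higher utilization lowers the score
    "recent_address_changes": -0.4,  # frequent moves lower the score
}
INTERCEPT = 0.5

def explain(applicant):
    """Estimate repayment probability and report each feature's additive
    contribution to the log-odds -- one plausible reading of 'meaningful
    information about the logic involved'."""
    contributions = {k: WEIGHTS[k] * v for k, v in applicant.items()}
    logit = INTERCEPT + sum(contributions.values())
    probability = 1 / (1 + math.exp(-logit))
    return probability, contributions

prob, parts = explain({"on_time_payment_rate": 0.9,
                       "credit_utilization": 0.7,
                       "recent_address_changes": 2})
print(f"estimated repayment probability: {prob:.2f}")
for feature, contribution in sorted(parts.items(), key=lambda kv: kv[1]):
    print(f"  {feature}: {contribution:+.2f} to the log-odds")
```

For a linear model like this, each input’s effect on the outcome can be stated exactly. For a deep or otherwise opaque model, no such clean additive decomposition exists, which is precisely the difficulty for black-box scoring noted above.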

The practical impact of the court’s decision turns on whether a credit score is of “paramount importance” in an underwriting process. The submitting court’s evidence suggested that this was the case. Schufa, by contrast, has encouraged its clients to make sure that they consider enough supplementary factors to rule out the score’s paramount importance in an underwriting decision. Still, for companies engaged in credit scoring, as well as for lenders, there is more new EU law to look out for. Under the upcoming EU Artificial Intelligence Act, algorithmic credit scoring and creditworthiness evaluation are high-risk use cases that trigger enhanced compliance requirements. Similar to the GDPR’s recital (71), the AI Act’s recital (37) highlights the risks of algorithmic discrimination due to, for instance, historical bias. The new Consumer Credit Directive of October 18, 2023 mentions discriminatory credit underwriting in recitals (29) and (31). Cancer survivors get special protection against discrimination, recital (48), and the use of alternative data for credit underwriting purposes will be limited, recital (55). Art. 6 of that Directive includes a prohibition against discriminatory underwriting similar to the US ECOA.

Katja Langenbucher is a law professor at Goethe-University’s House of Finance in Frankfurt, an affiliated professor at SciencesPo, Paris, and a long-term guest professor at Fordham Law School, NYC. She is also a SAFE Fellow with the Leibniz Institute for Financial Research SAFE.

The views, opinions and positions expressed within all posts are those of the author(s) alone and do not represent those of the Program on Corporate Compliance and Enforcement (PCCE) or of the New York University School of Law. PCCE makes no representations as to the accuracy, completeness and validity of any statements made on this site and will not be liable for any errors, omissions or representations. The copyright of this content belongs to the author(s) and any liability with regard to infringement of intellectual property rights remains with the author(s).