Category Archives: Consumer Financial Protection Bureau (CFPB)

Consumer Financial Protection Bureau Stands Up to Protect Whistleblowers from Overly Broad NDAs

by Benjamin Calitri

Benjamin Calitri (photo courtesy of the author)

Protections for whistleblowers against overly expansive non-disclosure agreements (NDAs), which aim to prevent whistleblowers from providing information to law enforcement and regulators, have expanded rapidly over the past year. The Securities and Exchange Commission (SEC) has put teeth into its enforcement of Rule 21F-17(a) by imposing larger monetary sanctions. The Commodity Futures Trading Commission (CFTC) brought its first enforcement action under Regulation 165.19(b), against Trafigura, for the use of NDAs meant to silence whistleblowers. The latest agency to take action against overly expansive NDAs is the Consumer Financial Protection Bureau (CFPB), which has announced that its employee protection regulation applies to NDAs that seek to silence whistleblowers.

Continue reading

CFPB “Firing On All Cylinders” After Surviving Constitutional Challenge To Funding Structure

by Nowell D. Bamberger, Elsbeth Bennett, and Andrew Khanarian

From left to right: Nowell D. Bamberger, Elsbeth Bennett and Andrew Khanarian. (Photos courtesy of Cleary Gottlieb Steen & Hamilton LLP)

The Supreme Court recently upheld the Consumer Financial Protection Bureau’s funding structure in a 7–2 decision that will likely pave the way for renewed regulatory activity by the agency in the near future. 

Enacted as part of the Dodd-Frank Act, the CFPB’s unique funding structure permits the agency to annually request an unspecified portion of funds from the Federal Reserve System, subject to an inflation-adjusted cap. In rejecting a constitutional challenge to this funding structure by several trade associations, the Supreme Court held in Consumer Financial Protection Bureau v. Community Financial Services Association of America that the Appropriations Clause merely requires Congress to identify the source and purpose of federal funds, and that Congress’s one-time appropriation for the CFPB in the Dodd-Frank Act meets that minimal constitutional standard. The seven-member majority largely aligned in their reasoning that the Constitution’s text and history, as well as early congressional practice, endorsed funding mechanisms such as this one, and thus provided broad legal support for the fiscal independence of agencies that are delegated substantial powers. As a practical matter, this decision will likely jumpstart long-delayed regulatory and enforcement work at the CFPB, including the vacated payday lending rules that were the subject of this litigation.

Continue reading

Mitigating AI Risks for Customer Service Chatbots

by Avi Gesser, Jim Pastore, Matt Kelly, Gabriel Kohan, Melissa Muse, and Joshua A. Goland

Top left to right: Avi Gesser, Jim Pastore, and Matt Kelly. Bottom left to right: Gabriel Kohan, Melissa Muse and Joshua A. Goland (photos courtesy of Debevoise & Plimpton LLP)

Online customer service chatbots have been around for years, allowing companies to triage customer queries with pre-programmed responses that address customers’ most common questions. Now, Generative AI (“GenAI”) chatbots have the potential to change the customer service landscape by answering a wider variety of questions, on a broader range of topics, and in a more nuanced and lifelike manner. Proponents of this technology argue that companies can achieve better customer satisfaction while reducing the costs of human-supported customer service. But the risks of irresponsible adoption of GenAI customer service chatbots, including increased litigation and reputational risk, could eclipse their promise.

We have previously discussed risks associated with adopting GenAI tools, as well as measures companies can implement to mitigate those risks. In this Debevoise Data Blog post, we focus on customer service chatbots and provide some practices that can help companies avoid legal and reputational risk when adopting such tools.

Continue reading

With the Fintech Sector’s Return to Explosive Growth, Here Are Top U.S. Legal Issues to Watch

by Jamillia Ferris, Vinita Kailasanath, Christine Lyon, Jan Rybnicek, and David Sewell

Left to right: Jamillia Ferris, Vinita Kailasanath, Christine Lyon, Jan Rybnicek, and David Sewell (photos courtesy of Freshfields Bruckhaus Deringer LLP)

Freshfields recently hosted a U.S. Fintech Hot Topics Webinar to highlight on-the-ground insights from our Antitrust and Competition, Data Privacy and Security, Financial Services Regulatory, and Transactional teams. The fintech sector has recently seen a return to explosive growth and is expected to continue growing rapidly notwithstanding regulatory and economic headwinds. Our top takeaways from the panel discussion are below, and the full recording is available here.

Continue reading

Executive Order Prohibits Transfer of Sensitive Personal Data to “Countries of Concern”

by Patrick J. Austin and John Pilch

From left to right: Patrick J. Austin and John Pilch

On February 28, 2024, U.S. President Joe Biden issued the Executive Order on Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern (EO), which authorizes the U.S. Attorney General to restrict large-scale transfers of personal data to “countries of concern.” The “countries of concern” identified in the EO include China (along with Hong Kong and Macau), Russia, Iran, North Korea, Cuba, and Venezuela, according to a summary issued by the White House.

Continue reading

President Biden Issues Executive Order Granting Authorities to Regulate the Transfer of Sensitive U.S. Data to Countries of National Security Concern

by Eric J. Kadel Jr., Sharon Cohen Levin, Nicole Friedlander, Anthony J. Lewis, Andrew J. DeFilippis, Joshua Spiegel, and George L. McMillan

Top left to right: Eric J. Kadel Jr., Sharon Cohen Levin, Nicole Friedlander, Anthony J. Lewis.
Bottom left to right: Andrew J. DeFilippis, Joshua Spiegel and George L. McMillan. (Photos courtesy of Sullivan & Cromwell LLP).

SUMMARY

On February 28, 2024, President Biden issued Executive Order 14117, “Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern” (the “Executive Order”), delegating new authorities to the U.S. Department of Justice (“DOJ”) and other agencies to regulate the transfer of sensitive U.S. data to countries of national security concern. The Executive Order focuses primarily on personal and other sensitive information, such as U.S. persons’ financial information, biometric data, personal health data, geolocation data, and information relating to government personnel and facilities.[1]

Continue reading

CFPB Report Highlights Role of Big Tech Firms in Mobile Payments

by the Consumer Financial Protection Bureau

Apple and Google set regulations on “tap-to-pay” that can impact innovation and competition

The Consumer Financial Protection Bureau (CFPB) published a new issue spotlight highlighting the impacts of Big Tech companies’ policies and practices that govern tap-to-pay on mobile devices like smartphones and watches. Apple currently forbids banks and payment apps from accessing the tap-to-pay functionality on Apple iOS devices and imposes fees through Apple Pay. Google’s Android operating system does not currently have such a policy. The issue spotlight explains how regulations imposed by mobile operating systems can have a significant impact on innovation, consumer choice, and the growth of open and decentralized banking and payments in the U.S.

“Regulations imposed by Big Tech firms have a big impact on whether consumers and businesses can make payments using third-party apps,” said CFPB Director Rohit Chopra. “We are carefully evaluating Big Tech’s role in our banking and payments system.”

Continue reading

Consumer Advisory: Your Money Is at Greater Risk When You Hold It in a Payment App, Instead of Moving It to an Account with Deposit Insurance

Editor’s Note: The NYU Law Program on Corporate Compliance and Enforcement is following the recent banking failures and policy developments arising from the crisis. In this post, the Consumer Financial Protection Bureau (CFPB) highlights the risk of holding money in uninsured accounts at payment apps.

by the Consumer Financial Protection Bureau

More than three quarters of adults in the United States have used a payment app, sometimes called a P2P (peer-to-peer or person-to-person) app. Widely used nonbank payment apps include PayPal, Venmo, and Cash App. The apps can be used on a computer or mobile device to send money to someone else without writing a check or handing over cash.

Young adults use payment apps even more frequently. According to a March 2022 survey by Consumer Reports, 85 percent of consumers aged 18 to 29 have used one of these apps.

Continue reading

National Association of Attorneys General’s 2023 Consumer Protection Spring Conference

by Courtney M. Dankworth, Avi Gesser, Paul D. Rubin, Jehan A. Patterson, Sam Allaman, and Melissa Muse

From top left to right: Courtney M. Dankworth, Avi Gesser, and Paul D. Rubin.
From bottom left to right: Jehan A. Patterson, Sam Allaman, and Melissa Muse.
(Photos courtesy of Debevoise & Plimpton)

On May 10–12, 2023, the National Association of Attorneys General (the “NAAG”) held its Spring 2023 Consumer Protection Conference to discuss the intersection of consumer protection issues and technology. During the portion of the conference that was open to the public, panels featuring federal and state regulators, private legal practitioners, and industry experts discussed potential legal liabilities and consumer risks related to artificial intelligence (“AI”), online lending, and targeted advertising.

In this Debevoise Update, we recap some of the panels and remarks, which emphasized regulators’ increased scrutiny of the intersection of consumer protection and emerging technologies, focusing on the leading themes from the conference: transparency, fairness, and privacy.

Continue reading

Federal Agencies Will Jointly Look for Bias and Discrimination in AI

by Bradford Hardin, K.C. Halm, Aisha Smith, and Matt Jedreski

From left to right: Bradford Hardin, K.C. Halm, Aisha Smith, and Matt Jedreski (photos courtesy of Davis Wright Tremaine LLP)

DOJ, FTC, CFPB, and EEOC Announce Joint Commitment to Use Existing Consumer Protection and Employment Authority to Oversee Use of Artificial Intelligence

On April 25, 2023, the Federal Trade Commission (FTC), the Civil Rights Division of the U.S. Department of Justice (DOJ), the Consumer Financial Protection Bureau (CFPB), and the U.S. Equal Employment Opportunity Commission (EEOC) released a joint statement highlighting their commitment to “vigorously use [their] collective authorities to protect individuals” with respect to artificial intelligence and automated systems (AI), which have the potential to negatively impact civil rights, fair competition, consumer protection, and equal opportunity. These regulators intend to use their existing authority to enforce consumer protection and employment laws, which apply regardless of the technology used for making decisions or delivering products and services. The joint statement outlines several key areas of focus for the agencies: ensuring that AI does not result in discriminatory outcomes, protecting consumers from unfair, deceptive, or abusive acts or practices (UDAAP), preventing anticompetitive practices that may be facilitated or exacerbated by AI, and promoting responsible and transparent development of AI systems. Rather than operate as if AI is unregulated, businesses should ensure their use of AI complies with existing laws and regulations.

Continue reading