Category Archives: Data Privacy

Policing Your Own Jardin – France Signals Eagerness to Take Control of Its White Collar Enforcement

by Antoine F. Kirry, Alexandre Bisch, Frederick T. Davis, Robin Lööf, Line Chataud, Ariane Fleuriot, Fanny Gauthier, and Alice Stosskopf

In light of well-publicized U.S. enforcement actions against French companies (Alstom, Total, Technip, Alcatel, BNP), the French government asked MP Raphaël Gauvain to consider measures to protect French companies faced with foreign extraterritorial judicial and administrative procedures. His long-awaited report was published on June 26, 2019. Entitled “Restoring French and European Sovereignty and protecting our companies from extraterritorial laws and measures,” this 100-page document points out the lack of effective legal tools available to French companies facing such proceedings. Drawing on these findings, the report makes several recommendations. Continue reading

SCOTUS Expands Scope of FOIA Trade Secrets and Commercial Information Exemption

by Michael S. Flynn, Randall D. Guynn, Michael Kaplan, Neil H. MacBride, Paul J. Nathanson, Annette L. Nazareth, Margaret E. Tahyar, and Eric B. Lewin

The Supreme Court has updated an important Freedom of Information Act (“FOIA”) exemption for the digital age.  In Food Marketing Institute v. Argus Leader Media (PDF: 125 KB), the Supreme Court last week significantly expanded the scope of FOIA Exemption 4.  FOIA Exemption 4 is the exemption most commonly claimed by private-sector entities when seeking to protect competitively sensitive information that must be disclosed to a federal agency.  It shields from disclosure “trade secrets and commercial or financial information obtained from a person and privileged or confidential.”[1]  Beginning with a D.C. Circuit decision in 1974, National Parks & Conservation Ass’n v. Morton, 498 F.2d 765 (D.C. Cir. 1974), courts have interpreted FOIA Exemption 4 narrowly.  For commercial or financial information to be “confidential,” a number of federal courts of appeals have required a showing of “substantial competitive harm” from disclosure.  That showing has proven difficult to make in practice, and in this digital age there is an increasing awareness that information and data are valuable.  The majority opinion in Food Marketing, written by Justice Gorsuch, squarely repudiated the “substantial competitive harm” test in favor of a less difficult standard, thereby broadening Exemption 4.

It is significant that the justices were unanimous in rejecting the “substantial competitive harm” test.  They disagreed, however, about whether harm has any role to play in Exemption 4.  In an opinion concurring in part and dissenting in part, Justice Breyer explained that he “would clarify that a private harm need not be ‘substantial’ so long as it is genuine.”[2]  In contrast, the majority would not apply a harm test at all, reasoning that such a test is not supported by the statute.  Instead, the majority explained its test as follows: Continue reading

District Court Finds Allegations That Data Breach Exposed Publicly Available and Non-Sensitive Personal Information Sufficient for Article III Standing

Potentially signaling an expansion of the scope of constitutional standing in data breach cases, a district court in the Northern District of California recently held that the exposure of users’ non-sensitive, publicly available personal information may be sufficient to establish an injury-in-fact.[1]

Background: The decision was issued in a class action lawsuit brought against Facebook, alleging breach of contract, negligence, and violation of the California Unfair Competition Law, among other state law claims, based on a 2018 data breach.  The breach resulted from a coding vulnerability that allowed hackers to steal information from 15 million users.  Though the stolen information included usernames and basic contact information (i.e., phone numbers and email addresses), and in some cases also included users’ birthdates, hometowns, workplaces, education information, religious views, and prior activities on Facebook, the plaintiffs did not allege the breach of information traditionally considered sensitive, such as social security numbers or credit card information.  In its motion to dismiss the complaint, Facebook argued that the named plaintiffs had not established Article III standing because they had not alleged any particularized injury: the stolen information was publicly available, and the only potential injury was the minimal time spent deleting phishing emails.  The court rejected Facebook’s argument, holding that one plaintiff had adequately alleged two injuries: (i) the substantial risk of future identity theft and (ii) the lost time responding to the data breach.

Regarding the risk of identity theft, the court rejected Facebook’s argument that the plaintiff had not suffered an injury-in-fact because the breach involved no sensitive information.  Despite recognizing that all of the information was otherwise publicly available, the court nonetheless determined that information “need not be sensitive to weaponize hackers in their quest to commit further fraud or identity theft.”  In the court’s view, an “‘increased risk of identity theft’” can occur even when the stolen information is not traditionally sensitive personal information, because the proper inquiry is not “the minutia” of what information had been taken, but whether the data “gave hackers the means to commit fraud or identity theft.”

Here, the court viewed the stolen information as equivalent to sensitive information because it was “immutable,” personally identifying, and of a nature and amount to “provide further ammo . . . to g[i]ve hackers the means to commit fraud or identity theft.”  The public availability of the information was “irrelevant,” because “constructing this information from random sources bit by bit” would be difficult for hackers.  The court also inferred that the goal of the breach was to facilitate fraud and identity theft, emphasizing the plaintiff’s receipt of phishing emails and text messages after the breach, and the hackers’ use of searches to cull information from millions of users.

The court also held that the lost time spent responding to the data breach could constitute an economic injury.  Under the court’s reasoning, even de minimis time spent sorting through phishing emails could be sufficient based on an expectation that “[m]ore phishing e‑mails will pile up” over time.

Takeaway: Courts remain split over the threshold for alleging standing in data breach cases.  Although the Second, Fourth, and Eighth Circuits have determined that allegations based on the risk of future harm are insufficient, the D.C., Third, Sixth, Seventh, Ninth, and Eleventh Circuits have held that alleging a substantial risk of future harm is sufficient to satisfy the Article III injury requirements.  But the question remains—when are allegations of future harm too “speculative” to constitute an injury?  This month’s decision in Schmidt v. Facebook answered that the exposure of a sufficient amount of public, non-sensitive information may create a future risk of harm that is as substantial and imminent as the exposure of social security numbers because it makes social engineering attacks easier.  Should this decision gain traction among other courts, it would ease plaintiffs’ burden to establish standing in a broad array of data breach lawsuits.

Footnotes

[1] Schmidt v. Facebook, Inc., No. C 18-05982 WHA (JSC), 2019 WL 2568799 (N.D. Cal. June 21, 2019).

Continue reading

Regulating the Use of Data in the United Kingdom’s Financial Sector

by Alun Milford

It is just over a year since the European Union’s General Data Protection Regulation came into force. It strengthened Europe’s already highly evolved legal framework for the protection of personal data and provided for much heavier penalties for breaches of those protections than had hitherto been available. For example, under the old law the maximum penalty the United Kingdom’s regulator could impose for a data protection breach was £500,000; under the new law, the maximum penalty throughout Europe is the higher of 20,000,000 euros or 4% of the firm’s annual worldwide turnover in the preceding financial year. The prospect of penalties on this scale has concentrated the minds of businesses with European operations, whether headquartered there or not.
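To put the new ceiling in concrete terms, here is a minimal sketch of that calculation (our illustration, not part of the post itself; the function name and the turnover figure are purely hypothetical):

def gdpr_fine_ceiling(annual_worldwide_turnover_eur: float) -> float:
    # The cap described above is the greater of EUR 20,000,000 or 4% of the
    # firm's annual worldwide turnover for the preceding financial year.
    return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)

# Example: a hypothetical firm with EUR 1 billion in turnover faces a ceiling
# of EUR 40 million, compared with the old UK maximum of GBP 500,000.
print(gdpr_fine_ceiling(1_000_000_000.0))  # 40000000.0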

For firms in the United Kingdom’s regulated financial sector, a particular concern was the prospect of having to comply with two distinct regulatory frameworks – one for the conduct of business, the other for the protection of personal data – policed by two distinct regulators, the Financial Conduct Authority and the Information Commissioner’s Office, each of which now has the power to impose very significant sanctions for the same conduct. In this blog I consider the functions of the respective regulators, the areas of overlap or common interest in their work and the way in which the regulators have indicated they will approach those areas of common interest. Continue reading

The Biggest Risk with CCPA May Be Cybersecurity, Not Privacy: 10 Things Companies Are Doing Now to Prepare

by Avi Gesser, Matthew Kelly, Will Schildknecht, and Clara Y. Kim

By now, most major U.S. companies are generally aware of the new privacy requirements (PDF: 187 KB) that will be imposed by the California Consumer Privacy Act (“CCPA”) when it goes into effect on January 1, 2020, including data access and deletion rights for consumers as well as restrictions on selling personal information.  But, at least in the short term, it is likely that the CCPA’s cybersecurity requirements will have the most significant impact on companies.

Unfortunately, the CCPA does not spell out its cybersecurity requirements explicitly.  Rather, it creates a private right of action for California consumers against companies that have experienced a cyber breach in which those consumers’ personal information has been taken by an unauthorized person.  A successful action requires that the exfiltration or disclosure be of unencrypted personal data and result from the company’s violation of its duty to implement and maintain reasonable security procedures and practices. § 1798.150(a)(1). Continue reading

Preparing for the California Consumer Privacy Act in an Evolving Privacy Landscape

by David A. Katz, Marshall L. Miller, and Zachary M. David

Just a month after the European Union’s General Data Protection Regulation (GDPR) (PDF: 146 KB) took effect, California enacted the most expansive data privacy law in the United States to date.  The California Consumer Privacy Act (CCPA), which is scheduled to go into effect on January 1, 2020, will impose unprecedented data obligations on companies doing business in California, requiring increased data use transparency and the observance of novel consumer data rights.  Notwithstanding any GDPR compliance fatigue, companies need to take steps to prepare for compliance with the CCPA. 

The CCPA was a hastily crafted legislative package passed to preempt a statewide ballot initiative set to qualify for California’s November 2018 ballot.  The initiative—which promised to be even more far-reaching—was withdrawn by its ballot sponsors in exchange for passage of the CCPA.  The statute remains a work in progress, with numerous legislative amendments currently under consideration and implementing regulations from the California Attorney General expected this fall. Continue reading

Part III: Our Last Look at the CCPA’s Definition of “Personal Information”

by Craig A. Newman and Jonathan (Yoni) Schenker

In our third and final installment on the California Consumer Privacy Act’s (CCPA) expansive definition of “personal information,” we look at other sections of the CCPA that either limit the applicability of the law’s “personal information” definition or exclude information from coverage under the law.

The CCPA excludes information that otherwise meets the definition of “personal information” if the information is already governed under specified federal or state statutes or regulations. Cal. Civ. Code §§ 1798.145(c)-(f)[1]. The CCPA also adopts a narrower definition of “personal information” when conferring a private right of action in the context of a data breach. Id. § 1798.150; see id. § 1798.81.5(d)(1)(A). As we will discuss in a later post, when a private litigant files a data breach lawsuit, the CCPA’s broad definition of “personal information” does not apply; instead, the narrower definition in the state’s existing data breach statute controls.

Our three-part series is designed to help businesses identify whether they hold information covered under the law, while also highlighting the potential pitfalls in the definition as we await interpretative regulations from the California Attorney General and potential amendments from the state’s legislature. In Part I[2], we explored the breadth of the definition, which is unprecedented in the United States. In Part II[3], we explored the law’s two explicit exclusions from the “personal information” definition for “publicly available” and “deidentified or aggregate consumer information,” noting the lack of clarity in the language of the law. Finally, we conclude our series with a look at the rest of the statute for exclusions from, and limitations to, the information covered under the CCPA. Continue reading

Ephemeral Messaging for Businesses: Balancing the Risks of Keeping and Deleting Data by Default

by Avi Gesser, Daniel F. Forester, and Mengyi Xu

One way for companies to decrease their cybersecurity risks, as well as their risks from new privacy regulations, is through data minimization—significantly reducing the amount of data they collect and retain.  By deleting old data and collecting less new data, companies will have less sensitive information to protect and process in accordance with their regulatory obligations.  But getting rid of old data isn’t easy, in part because of the legal limitations on what can be deleted.  We have previously written about these challenges, as well as the benefits of data minimization, which include reducing:

  • the growth of a company’s data over time, and the associated storage costs;
  • lost productivity associated with searching large volumes of irrelevant data;
  • the cybersecurity and privacy risks of having large volumes of unneeded data, especially considering CCPA and GDPR-type rights of access and erasure;
  • internal audit and compliance risks;
  • contractual risks (e.g., obligations to clients and customers to delete data once it is no longer needed); and
  • the volume of documents that may be unhelpful to the company in potential, but not yet reasonably anticipated, litigation or regulatory inquiries.

Continue reading

Part II: A Closer Look at the CCPA’s Definition of “Personal Information”

by Craig A. Newman and Jonathan (Yoni) Schenker

Our three-part series on the California Consumer Privacy Act’s (CCPA) expansive definition of “personal information” is designed to help businesses identify whether they hold information covered under the law, while also highlighting the potential pitfalls in the definition as we await interpretative regulations from the California Attorney General and potential amendments from the state’s legislature. In Part I[1], we explored the breadth of the definition. We now turn to the law’s two explicit exclusions from the definition of “personal information.”

The CCPA excludes two categories of information from its definition of “personal information”: “publicly available information” and “consumer information that is deidentified or aggregate consumer information.” Cal. Civ. Code § 1798.140(o)(2)[2]. As we discuss below, the statute’s definitions of both terms are far from clear, and as with other aspects of the CCPA, interpretative regulations will be useful in assisting businesses as they work their way through both exceptions. Continue reading

Part I: A Closer Look at California’s New Privacy Regime: The Definition of “Personal Information”

by Craig A. Newman and Jonathan (Yoni) Schenker

The California Consumer Privacy Act (CCPA) is set to become “operative” on January 1, 2020.  As we have written[1] in earlier[2] blog[3] posts[4], the CCPA is the most sweeping consumer privacy law in the country.

And the CCPA isn’t set in stone. The California Attorney General’s office recently concluded a public comment period as it prepares to draft interpretative regulations mandated by the CCPA. Not surprisingly, industry lobbyists are out in full force advocating for the legislature to amend the law. Yet with January 1st approaching, businesses potentially affected by the CCPA must start preparing for the law’s implementation.

In an effort to assist organizations in complying with the CCPA’s requirements – and all its moving pieces – we are taking a closer look over the next few months at key aspects of the law. If the CCPA changes, we will highlight those changes on this blog as well. Continue reading