In the last few years, we have seen a dramatic increase in the purchase and sale of alternative data—a shorthand for big data sets, such as satellite images of parking lots, drug approvals, credit card purchases, cellphone data on retail foot traffic, and construction permits. According to alternativedata.org, the alternative data industry is projected to be worth $350 million in 2020. Bloomberg LP's recent announcement that it will offer a product giving clients access to large volumes of alternative data underscores how widely this information is used in making investment decisions, and it is prompting hedge fund managers and institutional investors to seek even more untapped alpha-generating data sets. Not surprisingly, all this activity is attracting increased regulatory scrutiny.
One long-standing law that applies to alternative data is the Gramm-Leach-Bliley Act (“GLBA”), which requires financial institutions to provide privacy disclosures when they share nonpublic personal information with nonaffiliated third parties. Accordingly, GLBA places some limits on financial institutions’ use of alternative data—such as what banks can do with their vast amounts of valuable personal loan or credit information. To comply with GLBA, banks generally anonymize or de-identify the information they hold before selling it.
A new piece of legislation that will extend these kinds of obligations beyond financial institutions is the California Consumer Privacy Act (“CCPA”), which goes into effect in 2020. CCPA requires companies to obtain consent from customers before selling their personal data to third parties, but it expressly does not apply to consumer information that is de-identified. Neither GLBA nor CCPA provides any guidance as to what level of anonymization is sufficient. The concern arises from the increasing ability of sophisticated algorithms, drawing on publicly available big data sets, to re-identify certain kinds of anonymized data. We anticipate that the question of how much de-identification is enough will be a subject of future regulatory guidance and litigation.
Another new regulatory development is Vermont’s data broker registry, which covers credit reporting agencies, but not retailers or hotels that sell customer data. The law has three main parts: (1) a registration requirement for data brokers, (2) a minimum data security standard for data brokers, and (3) a prohibition on fraudulent acquisition of certain types of data or use of such data to commit bad acts.
Proposed federal privacy laws, such as the CONSENT Act, may also limit the use of alternative data by requiring companies to obtain consent from customers before collecting and selling their personal data. This would be particularly important for smartphone applications that collect and sell information about consumers without disclosure, or with disclosures that are buried in lengthy privacy policies that no one reads.
The most significant new regulatory action on alternative data has come from the New York State Department of Financial Services (“NYDFS”), which has been a leading regulator in cybersecurity. Earlier this year, the NYDFS issued “Insurance Circular Letter No. 1,” which gives specific guidance on the proper use of alternative data in the life insurance industry, in response to the prevalent use of such data in underwriting practices. Examples of data being used in assessing life insurance coverage include review of photos of individuals on social media, retail purchase history, and geographic location tracking. The Circular imposes two obligations on life insurers regulated by the NYDFS. First, insurers using alternative data must independently confirm that any given source of data “does not use and is not based in any way on race, color, creed, national origin, status as a victim of domestic violence, past lawful travel, or sexual orientation in any manner, or any other protected class.” Second, insurers using alternative data to make any adverse underwriting decision for a particular consumer must disclose to that consumer the details of all information on which the insurer relied in making the decision, including the specific sources of the alternative data.
As more businesses use more kinds of alternative data to assist in business decisions that affect consumers, an increase in regulation is inevitable. We expect that these new regulatory initiatives are just the first of many, and we will be discussing any important developments in this area at the Davis Polk Cyber Blog.
The views, opinions and positions expressed within all posts are those of the author alone and do not represent those of the Program on Corporate Compliance and Enforcement (PCCE) or of New York University School of Law. PCCE makes no representations as to the accuracy, completeness and validity of any statements made on this site and will not be liable for any errors, omissions or representations. The copyright of this content belongs to the author, and any liability with regard to infringement of intellectual property rights remains with the author.