What Goes on in the Shadows: FTC Action Against Data Broker Sheds Light on Unfair and Deceptive Sale of Consumer Location Data

by Lesley Fair

Lesley Fair (photo courtesy of the author)

SCENE:   A crowded city or a dark street. (Cue ominous music)

ACTION:  The camera focuses on the protagonist as they make their way to a location – maybe it’s a place of worship, a doctor’s office, or a reproductive health clinic. They think they’re alone, but what they don’t know is that they’re being tailed. What’s more, highly personal information regarding their whereabouts will be shared with third parties, all without their knowledge or consent.

A private eye novel? A detective thriller on a streaming service? No. According to a proposed settlement announced by the FTC, it was all in a day’s work for data broker X-Mode Social, which has billed itself as the “2nd largest US location data company.” The FTC says X-Mode and its corporate successor Outlogic, LLC sold consumers’ raw location data without their informed consent and without placing effective limits on how X-Mode’s customers used the sensitive information they bought from the company.

Data broker X-Mode collected precise consumer geolocation data from a wide variety of sources: third-party apps with X-Mode’s software development kit (SDK) built in, its own mobile apps, and information it bought from other data brokers and aggregators. The company then compiled consumers’ location data and sold it to hundreds of clients for advertising, brand analytics, and other marketing purposes. According to the complaint, the company also sold it to private government contractors. The FTC says this was done without consumers’ informed consent and without disclosing all of the purposes for which consumers’ data would be used. And it’s the reality of the data broker industry that, in many cases, consumers had no idea who X-Mode was or that they were X-Mode’s “product.”
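
To make the mechanics concrete, here is a minimal, hypothetical Python sketch – not X-Mode’s or Outlogic’s actual SDK, field names, or schema – of the kind of raw record a location SDK embedded in a partner app typically reports: a persistent mobile advertising identifier, precise coordinates, a timestamp, and the app that reported it. Because the same advertising identifier shows up in pings from different apps, records from many sources can be stitched into a single device’s movement trail.

```python
# Hypothetical illustration only -- not X-Mode's/Outlogic's actual SDK or data schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LocationPing:
    """One raw location record of the kind an embedded SDK might report."""
    ad_id: str         # mobile advertising identifier; persists across apps on a device
    latitude: float    # precise coordinates, often accurate to within tens of meters
    longitude: float
    timestamp: datetime
    source_app: str    # the partner app whose embedded SDK reported the ping

def movement_trail(pings: list, ad_id: str) -> list:
    """Stitch pings reported by many different apps into one device's time-ordered trail."""
    return sorted((p for p in pings if p.ad_id == ad_id), key=lambda p: p.timestamp)

if __name__ == "__main__":
    pings = [
        LocationPing("38400000-8cf0-11bd-b23e-10b96e40000d", 39.9612, -82.9988,
                     datetime(2023, 3, 1, 12, 40, tzinfo=timezone.utc), "game_app"),
        LocationPing("38400000-8cf0-11bd-b23e-10b96e40000d", 39.9690, -83.0114,
                     datetime(2023, 3, 1, 8, 15, tzinfo=timezone.utc), "weather_app"),
    ]
    for p in movement_trail(pings, "38400000-8cf0-11bd-b23e-10b96e40000d"):
        print(p.timestamp, p.latitude, p.longitude, p.source_app)
```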

How personal was the data X-Mode compiled and sold? It wasn’t an anonymous aggregation of zeroes and ones. According to the FTC, the data X-Mode sold was capable of matching an individual consumer’s mobile device with the exact locations they visited. (In fact, some companies offer services to match that data to individual consumers.) How targeted was the information? X-Mode advertised that its location data “is 70% accurate within 20 meters or less.” And how vast was the data set X-Mode collected? According to the complaint, “Through its own apps, partner apps, and other data brokers, X-Mode daily has ingested over 10 billion location data points from all over the world.”
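
As a simplified illustration of why roughly 20-meter precision is identifying, the sketch below matches a single location ping against a short list of known points of interest using the haversine distance. The coordinates, place names, and radius are illustrative assumptions, not data from the case; the point is that at this precision a ping typically resolves to one specific building.

```python
# Simplified, hypothetical sketch: why ~20-meter precision can pinpoint a specific building.
# The coordinates and points of interest below are illustrative only.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two latitude/longitude points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))   # mean Earth radius ~6,371 km

POINTS_OF_INTEREST = {          # hypothetical name -> (latitude, longitude)
    "health clinic": (39.96120, -82.99880),
    "single-family residence": (39.96905, -83.01140),
}

def visited_places(ping_lat: float, ping_lon: float, radius_m: float = 20.0) -> list:
    """Names of points of interest within radius_m of a single ping."""
    return [name for name, (lat, lon) in POINTS_OF_INTEREST.items()
            if haversine_m(ping_lat, ping_lon, lat, lon) <= radius_m]

if __name__ == "__main__":
    # A ping roughly 10 meters from the hypothetical clinic resolves to that one building.
    print(visited_places(39.96128, -82.99875))   # ['health clinic']
```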

The FTC says that until May 2023, the company didn’t have policies in place to remove sensitive locations from the raw location data it sold. In addition, the company didn’t implement appropriate safeguards for how its clients used that data, putting consumers’ sensitive personal information at risk. The complaint also alleges the company failed to employ necessary technical safeguards and oversight to ensure that it honored requests by some Android users to opt out of tracking and personalized ads.
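
The sketch below is a hypothetical illustration – not the company’s actual controls or language from the order – of the two safeguards the complaint says were missing: screening raw pings recorded near sensitive locations out of the data before it is shared, and dropping pings from devices whose users opted out of tracking. The site list, buffer radius, and record layout are assumptions made for illustration.

```python
# Hypothetical sketch of the missing safeguards: scrub pings near sensitive sites
# and honor device-level tracking opt-outs before any data is shared or sold.
from math import cos, radians

SENSITIVE_SITES = [             # illustrative coordinates of sensitive locations
    (39.96120, -82.99880),      # e.g., a reproductive health clinic
    (39.95500, -83.00200),      # e.g., a place of worship
]
SENSITIVE_RADIUS_M = 100.0      # assumed buffer; a real program would tune this per site type

def approx_distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Equirectangular approximation -- adequate at city scale for a screening filter."""
    meters_per_deg_lat = 111_320.0
    meters_per_deg_lon = 111_320.0 * cos(radians(lat1))
    dy = (lat2 - lat1) * meters_per_deg_lat
    dx = (lon2 - lon1) * meters_per_deg_lon
    return (dx * dx + dy * dy) ** 0.5

def sellable(ping: dict, opted_out_ids: set) -> bool:
    """True only if the ping is neither from an opted-out device nor near a sensitive site."""
    if ping["ad_id"] in opted_out_ids:      # honor opt-out of tracking / personalized ads
        return False
    return all(approx_distance_m(ping["lat"], ping["lon"], lat, lon) > SENSITIVE_RADIUS_M
               for lat, lon in SENSITIVE_SITES)

def scrub(pings: list, opted_out_ids: set) -> list:
    """Filter a batch of raw pings down to those eligible for sharing."""
    return [p for p in pings if sellable(p, opted_out_ids)]

if __name__ == "__main__":
    batch = [
        {"ad_id": "device-A", "lat": 39.96125, "lon": -82.99882},  # near the clinic: dropped
        {"ad_id": "device-B", "lat": 39.98000, "lon": -82.95000},  # elsewhere: kept
        {"ad_id": "device-C", "lat": 39.98000, "lon": -82.95000},  # opted out: dropped
    ]
    print(scrub(batch, opted_out_ids={"device-C"}))   # only device-B's ping remains
```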

The complaint describes the kinds of threats to consumers’ privacy posed by X-Mode’s conduct. For example, “[T]he location data could be used to track consumers who have visited women’s reproductive health clinics and as a result, may have had or contemplated sensitive medical procedures such as an abortion or in vitro fertilization. Using the data X-Mode has made available, it is possible for third parties to target consumers visiting such healthcare facilities and trace that mobile device to a single-family residence.”

Furthermore, X-Mode has used consumers’ geolocation data to create catalogs of people with shared characteristics and has even created custom lists for clients. For example, X-Mode had a contract with a private clinical research company to provide information about consumers who had visited certain medical offices in the Columbus, Ohio area – data the company wanted for marketing purposes.

The seven-count complaint charges X-Mode/Outlogic with multiple instances of unfair or deceptive conduct, in violation of the FTC Act. To settle the case, the company has agreed to make major changes to how it does business going forward. Among other things, the proposed order puts substantial limits on sharing certain sensitive location data and requires the company to develop a comprehensive sensitive location data program to prevent the use and sale of consumers’ sensitive location data. X-Mode/Outlogic also must take steps to prevent clients from associating consumers with locations that provide services to LGBTQ+ individuals or with locations of public gatherings like marches or protests. In addition, the company must take effective steps to see to it that clients don’t use their location data to determine the identity or location of a specific individual’s home. And even for location data that may not reveal visits to sensitive locations, X-Mode/Outlogic must ensure consumers provide informed consent before it uses that data. Finally, X-Mode/Outlogic must delete or render non-sensitive the historical data it collected from its own apps or SDK and must tell its customers about the FTC’s requirement that such data should be deleted or rendered non-sensitive.

Another noteworthy aspect of the proposed settlement: X-Mode/Outlogic must give consumers an easy way to withdraw their consent for the collection and use of their location data, to require the deletion of any location data that was previously collected, and to request the identity of any individual and business to whom their personal data has been sold or shared. Once the proposed settlement is published in the Federal Register, the FTC will accept public comments for 30 days.

The FTC’s action in this case suggests three fundamental principles about the privacy of consumers’ location data.

The status quo is a “no go” when it comes to the collection and sale of consumer location information. Many companies have made a practice of suctioning up every available piece of location data and compiling mountains of highly sensitive information without consumers’ consent. What some businesses haven’t grasped – including some data brokers who operate behind the scenes and in the shadows – is that consumers’ personal information isn’t just another “raw material” for companies to exploit commercially. That’s especially true for location data. Just because your business has access to location information doesn’t mean you’re free to use it any way you choose.

Contract clauses about data use are a start, but they’re not enough. According to the complaint, in some instances, X-Mode’s contracts with clients included clauses that at least facially appeared to put limits on third parties’ use of data – but words on a piece of paper aren’t enough. When the stakes are so high, privacy-conscious companies can’t just talk the talk. They need to take steps to ensure compliance.

Because the unauthorized and illegal trafficking in location data is a key concern for consumers and the FTC, it should matter to your company. Whose responsibility is it to ensure that consumers have consented to the collection and sharing of their location information? Savvy businesses assume it’s theirs, and that legal obligation is a two-way street. Information about particularly sensitive locations, like where people worship or seek medical help, shouldn’t be used at all. And don’t sell other location data – or buy it – without proof of informed consumer consent. Although every participant in the location data marketplace is responsible for complying with the law, the FTC’s action in this case sends a specific message to location data brokers to re-evaluate their practices for legal compliance.

Lesley Fair is a Senior Attorney at the Federal Trade Commission and a Lecturer at the law schools of George Washington University and Catholic University of America. This post first appeared on the Federal Trade Commission’s Business Blog.  

The views, opinions and positions expressed within all posts are those of the author(s) alone and do not represent those of the Program on Corporate Compliance and Enforcement (PCCE) or of the New York University School of Law. PCCE makes no representations as to the accuracy, completeness, and validity of any statements made on this site and will not be liable for any errors, omissions, or representations. The copyright of this content belongs to the author(s), and any liability with regard to infringement of intellectual property rights remains with the author(s).