On August 29, the Federal Trade Commission announced it had filed a landmark lawsuit against data broker Kochava for “selling geolocation data from hundreds of millions of mobile devices” that can be used to trace individuals’ time-stamped movements to sensitive locations. These include reproductive health clinics, places of worship, addiction recovery centers, and shelters for the unhoused and survivors of domestic violence. Kochava, the FTC alleges, “is enabling others to identify individuals and exposing them to threats of stigma, stalking, discrimination, job loss, and even physical violence.”
In response, the Idaho-based company claims that it “operates consistently and proactively in compliance with all rules and laws, including those specific to privacy.” In other words, Kochava relied on a core defense in the data broker playbook: Well, it’s legal.
But this is like saying you’ve read every book on the subject when all that’s been written is a waiting-room brochure. In a colossal failure of US policymaking—and, in many cases, a product of deliberate attempts to undermine or neglect marginalized people’s privacy—the US has weak privacy laws in general. Very few laws in the US even relate to data brokers, let alone constrain their actions. Nonetheless, the fact that Kochava is not breaking the law doesn’t make its behavior harmless—and it doesn’t make the company immune from lawsuits, either. The FTC’s case could establish that this kind of data surveillance, monetization, and exploitation is an unfair or deceptive business practice, exposing brokers to punishment. And it has the argument to get there.
Despite the lack of privacy laws, the FTC can still bring companies to court for engaging in “unfair or deceptive acts or practices.” FTC lawsuits against data brokers are not unprecedented, but they have typically focused on behavior such as facilitating criminal scams. By suing Kochava for brokering individuals’ geolocation data without their knowledge, and exposing them to risk, the FTC is effectively laying a broader foundation for acting against data-brokerage harms.
While data brokers and other tech companies (from Experian to financial data broker Yodlee) have absurdly claimed that their data is “anonymized,” Kochava’s billions of data points are anything but. The company provided mobile advertising IDs—which let marketers track the person behind a device—paired with people’s location information, making it possible for a buyer to “identify the mobile device’s user or owner,” as the lawsuit claims. Kochava also exposed individuals to risk in a simpler fashion: If you have someone’s entire location history, you can easily uncover their identity. Phones lying on a nightstand from 10 pm to 6 am, for example, might indicate a home address, just as phones in the same office building or retail store from 9 to 5 could signal a place of employment. The FTC says Kochava knew this and even tried to profit off it—suggesting “Household Mapping” as a potential data use case on the Amazon Web Services Marketplace, where a buyer could “group devices by dwelling time and frequency at shared locations to map individual devices to households.” Selling this information blatantly puts many people at risk, especially the already marginalized and vulnerable.
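To make concrete how thin the “anonymized” defense is, the sketch below shows, in a few lines of Python, how a buyer of this kind of feed could infer a probable home location from overnight dwell patterns. It is an illustration only: the device ID, coordinates, timestamps, night-time window, and coordinate rounding are all invented assumptions, not a description of Kochava’s data or any broker’s actual tooling.

```python
from collections import Counter
from datetime import datetime

# Hypothetical, invented records: (advertising ID, timestamp, latitude, longitude).
pings = [
    ("ad-id-123", "2022-06-01T23:10:00", 43.4917, -112.0341),
    ("ad-id-123", "2022-06-02T02:45:00", 43.4917, -112.0341),
    ("ad-id-123", "2022-06-02T05:30:00", 43.4918, -112.0340),
    ("ad-id-123", "2022-06-02T13:00:00", 43.4665, -112.0370),  # a daytime ping elsewhere
]

def likely_home(records, night_start=22, night_end=6, precision=3):
    """Return the coarse grid cell where a device most often appears overnight."""
    overnight = Counter()
    for _ad_id, ts, lat, lon in records:
        hour = datetime.fromisoformat(ts).hour
        if hour >= night_start or hour < night_end:
            # Round coordinates into ~100 m cells so repeated overnight pings cluster.
            overnight[(round(lat, precision), round(lon, precision))] += 1
    return overnight.most_common(1)[0][0] if overnight else None

print(likely_home(pings))  # (43.492, -112.034): the device's probable home cell
```

Grouping several devices that keep turning up in the same overnight cell is, in essence, the “Household Mapping” use case the FTC describes.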
Data brokers’ entire business model is premised on secretly gathering, analyzing, and selling or otherwise monetizing people’s information. Just within the location data realm, companies have been caught advertising Americans’ real-time GPS locations, quietly surveilling Black Lives Matter protesters to identify individuals’ characteristics, and providing location data to law enforcement agencies like the FBI and Immigration and Customs Enforcement (ICE), without needing a warrant. Even after the Supreme Court overturned Roe v. Wade, numerous data brokers continued selling location data related to abortion clinic visits, and some only agreed to stop when called out in the press and by members of Congress. Earlier this month, NextMark CEO Joe Pych told Politico, in a supposed defense of his own firm’s behavior, that “as far as I know, there’s no law today that prohibits prenatal mailing lists.” Whether these practices further domestic and intimate partner violence, enable warrantless surveillance of overpoliced communities, or put women and LGBTQ+ people at risk of stalking and physical violence, many data brokers continue selling location information anyway.
If data brokers look at surveillance of vulnerable communities and claim not to understand the harm of collecting and selling this kind of data, they are either outright lying or simply do not care. If they secretly collect individuals’ locations, link them to identities, and sell them online (facilitating the tracking of people going to churches and mosques, hospitals and health clinics, queer nightclubs and anti-policing rallies), and then cook up an “it’s not illegal” defense, they are pushing bad-faith arguments. In a state of constant surveillance and an absence of privacy regulation, legality is not the determinant of harm.
Critically, the agency alleges that Kochava violated the “unfair or deceptive acts or practices” clause of the FTC Act, because it unfairly sells highly sensitive location information that poses a risk of “substantial injury” to consumers. Individuals, tracked without fully knowing and understanding the surveillance, cannot reasonably avoid these harms on their own. So, for all that Kochava claims the FTC is perpetuating “misinformation surrounding data privacy,” this case may further solidify the fact that brokering people’s highly sensitive information is grounds for legal action.
As state legislatures remain slow to enact more privacy laws, and congressional initiatives on the issue stagnate, with some members even refusing to touch data brokerage harms, the FTC’s case may be the country’s best shot. The agency should press hard on its case and take every measure to link the sale of location data to outcomes like stalking, discrimination, and other kinds of data exploitation; otherwise, this highly sensitive information will remain on the open market and continue to harm millions of Americans.