FTC chief pledges tight rein on data
Edith Ramirez, the chairwoman of the Federal Trade Commission, laid out the case for strong consumer protections governing private industry’s use of big data, as the agency asks Congress for the power to levy civil fines against businesses with weak consumer data security.
Speaking at the Aspen Forum, Ramirez offered “A View From the Lifeguard’s Chair,” as her keynote was titled, an allusion to her roots in coastal Southern California.
“The already intricate data-collection ecosystem is becoming even more complex,” said Ramirez, whose term as commissioner ends in 2015. Ramirez pointed to the “Internet of Things” as a growing technology that will test the bounds of the law.
In addition to online and cell phone data, “households with smart appliances such as refrigerators, televisions, thermostats... will soon be widespread,” Ramirez said. “These devices will be connected to the Internet, collecting information that will end up in the hands of manufacturers, service providers and others. What are the privacy and security implications? These are questions we are thinking about at the FTC,” she said.
“The fact that big data may be transformative does not mean that the challenges it poses are, as some claim, novel or beyond the ability of our legal institutions to respond.”
The FTC, an independent federal agency that turns 100 years old next year, believes it has “an obligation” to protect consumer privacy, said Ramirez. Congress directed the FTC to prevent unfair commercial practices — “conduct that substantially harms consumers, or threatens to substantially harm consumers, which consumers cannot reasonably avoid, and where the harm outweighs the benefits,” said Ramirez, who before joining the FTC was a partner in the Los Angeles office of Quinn Emanuel Urquhart & Sullivan, a law firm specializing in intellectual property litigation whose clients have included Google, Shell Oil, Motorola, Samsung and Sony.
Many companies are using data technology “in ways that implicate individual privacy,” Ramirez said. “The FTC’s role isn’t to stand in the way of innovation; it is to ensure that these advances are accompanied by sufficiently rigorous privacy safeguards.”
Ramirez argued for a consumer data framework that limits the collection of consumer data, as opposed to “after-the-fact restriction” limiting certain uses of data that’s collected, as many tech firms would prefer. “The indiscriminate collection of data violates the First Commandment of data hygiene: Thou shall not collect and hold onto personal information unnecessary to an identified purpose. Keeping data on the off chance that it might prove useful is not consistent with privacy best practices,” Ramirez argued.
There’s also the risk of what Ramirez called “data determinism” taking hold in institutional practices. “Individuals may be judged not because of what they’ve done, or what they will do in the future, but because inferences or correlations drawn by algorithms suggest they may behave in ways that make them poor credit or insurance risks, unsuitable candidates for employment or admission to schools or other institutions, or unlikely to carry out certain functions.”
Ramirez noted that she is a big fan of big data. “The fact that decision-by-algorithm may be less than perfect is not to condemn the enterprise. Far from it. Using data-driven analytics to improve decision-making may be an important step forward. After all, human decision-making is not error-free. People often make imperfect decisions for a variety of reasons, including incomplete information, poor decisional tools, or irrational bias. But the built-in imperfections in the decision-by-algorithm process demand transparency, meaningful oversight and procedures to remediate decisions that adversely affect individuals who have been wrongly categorized by correlation. At the very least, companies must ensure that by using big data algorithms they are not accidently classifying people based on categories that society has decided — by law or ethics — not to use, such as race, ethnic background, gender, and sexual orientation.”
Last year the FTC called on data brokers — a relatively new industry that supplies mostly corporate clients with consumer data for business intelligence or targeted advertising — to give consumers access to their information through an easy-to-find, easy-to-use common portal, and the agency argued for legislation giving consumers the ability to access, dispute or suppress data held by brokers.
Further establishing its role as cop on the consumer data beat, the FTC has issued subpoenas to nine data brokers, investigating “the nature and sources of the consumer information the data brokers collect; how they use, maintain, and disseminate the information; and the extent to which they allow consumers to access and correct their information or opt out of having their personal information sold,” Ramirez said.
The FTC has used its “unfairness authority” against companies that fail to provide reasonable data security, suing the Wyndham hotel chain last year over data security practices that led to three data breaches. In all, the agency has brought more than 40 data security cases under its unfairness and deception authority against companies that failed to provide reasonable security safeguards.
Along with the landmark Fair Credit Reporting Act of 1970, the FTC enforces the Children’s Online Privacy Protection Act, which requires companies to obtain parental consent before collecting personal information from children under 13; the agency recently updated its rules under that law to cover social media and mobile applications.
The FTC is pushing Congress for the power to secure civil penalties against businesses that “fail to maintain reasonable security,” Ramirez said. The agency is also urging Congress to pass “baseline privacy legislation” that would increase transparency about companies’ collection of user information, among other goals.