In a case that could have far-reaching implications for how companies are held liable for data security lapses, the FTC issued an order and opinion unanimously overturning its Chief Administrative Law Judge’s (ALJ) November 2015 dismissal of charges that LabMD’s allegedly lax data security measures were unfair practices under Section 5 of the FTC Act (see our coverage of […]
Thank you to everyone who participated in last week’s webinar “Privacy Shield: What You Need to Know,” in which we explored how companies demonstrate compliance with the Privacy Shield principles, what it takes to move from Safe Harbor to Privacy Shield, and more. A copy of the slide deck and recorded webinar are now available on our blog.
On Monday, May 16, 2016, the Supreme Court of the United States issued its highly anticipated opinion in Spokeo, Inc. v. Robins, a case that examined the question of whether a plaintiff who sued for a technical violation of the Fair Credit Reporting Act could maintain Article III standing for a class action without claiming any real-world injury. The case before the Court involved a putative class action brought against petitioner Spokeo, Inc., a company that generates profiles about people based on information obtained through computerized searches. Respondent Thomas Robins was one of the people with a profile on Spokeo’s website. According to Robins, the information on that profile was inaccurate. Robins filed a class-action complaint against Spokeo in federal court, alleging violations of the FCRA, which requires consumer reporting agencies to “follow reasonable procedures to assure maximum possible accuracy of” consumer reports. The Ninth Circuit held that by alleging the violation of a statutory right Robins had satisfied the injury-in-fact requirement of Article III standing.
This week the FTC released a web-based tool to assist mobile app developers in determining which federal privacy laws apply to their mobile health applications. The tool asks developers a series of ten targeted questions that help a user determine whether HIPAA, FTC, and/or FDA rules and regulations might apply.
On March 15, 2016, the Federal Trade Commission reached an agreement with Lord & Taylor to settle charges that the luxury department store brand engaged in allegedly deceptive native advertising practices by failing to disclose and accurately represent its relationship to online magazines and fashion “influencers” who promoted the brand. This latest enforcement action follows the FTC’s release of a policy statement on native advertising practices and a companion set of guidelines for businesses. The action provides a cautionary tale with practical lessons about the importance of transparency in marketing strategies that mimic the look and feel of surrounding content.
FTC Commissioner Julie Brill will join Hogan Lovells US LLP as a partner and co-leader of the Privacy and Cybersecurity Practice on 1 April. Commissioner Brill was appointed by President Obama to the FTC in 2010 and will complete her service on 31 March.
The February 29, 2016 announcement of the new EU-U.S. data transfer framework—the Privacy Shield—was accompanied by over 130 pages of documentation and significantly more operational details than its predecessor, Safe Harbor. We have reviewed the Privacy Shield materials and published a comprehensive breakdown of the changes from Safe Harbor to Privacy Shield and the practical impact on business: Inside the New and Improved EU-U.S. Data Transfer Framework.
On February 29, 2016, and after more than two years of negotiations with the U.S. Department of Commerce, the European Commission released its draft Decision on the adequacy of the new EU–U.S. Privacy Shield program, accompanied by new information on how the Program will work. The Privacy Shield documentation is significantly more detailed than that associated with its predecessor, the EU-U.S. Safe Harbor, as it describes more specifically the measures that organizations wishing to use the Privacy Shield must implement. Importantly, the Privacy Shield provides for additional transparency and processes associated with U.S. government access to the personal data of EU individuals.
The FTC wants companies to listen. More precisely, the FTC wants companies to pay attention to and promptly respond to reports of security vulnerabilities. That’s a key takeaway from the Commission’s recent settlement with ASUSTek. In its complaint against the Taiwanese router manufacturer, the FTC alleged that ASUS misrepresented its security practices and failed to reasonably secure its router software, citing the company’s alleged failure to address vulnerability reports as one of the Commission’s primary concerns. The settlement reiterates the warnings contained in the FTC’s recent Start with Security Guide and prior settlements with HTC America and Fandango: the FTC expects companies to implement adequate processes for receiving and addressing security vulnerability reports within a reasonable time.
On January 31, 2016, the Silicon Flatirons Center for Law, Technology, and Entrepreneurship at the University of Colorado hosted its annual Digital Broadband Migration Symposium. The theme of this year’s conference was “The Evolving Industry Structure of the Digital Broadband Landscape.” The two-day conference brought together an array of leaders from government, academia, and industry to examine the role of regulatory oversight, antitrust law, and intellectual property policy in regulating industry structure and to discuss what policy reforms may be appropriate for the constantly changing digital broadband environment. As outlined below, a recurring topic throughout this year’s conference was the relationship between privacy, security, and the evolving digital landscape.
If you’ve ever opened your washing machine to find white socks turned a pale shade of pink, you can relate to the sentiment of Buzzfeed UK’s piece “14 Laundry Fails We’ve All Experienced.” Humorous and empathetic, the piece mimicked Buzzfeed’s editorial tone and style, but also subtly promoted the message of a commercial advertiser—in this case, Dylon, a color dye manufacturer. And in what may be a sign of things to come in the US, the piece drew the attention of the U.K.’s advertising regulator, the Advertising Standards Authority, which cited Buzzfeed for failing to make the piece “obviously identifiable” as commercial content, a violation of the U.K.’s Committee on Advertising Practices Code.
On Wednesday, January 5, the FTC released a report titled “Big Data: A Tool for Inclusion or Exclusion?” The Report addresses the effects of the growing use of big data analytics on low-income and underserved populations, and the FTC’s role in monitoring and regulating the impacts of this commercial use of big data. There are two high-level takeaways from the Report: First, big data is a powerful tool that can be used to include or to exclude. Used responsibly, it can be a key to unlocking opportunities for underprivileged and underserved classes; but, when used with disregard for its effects, big data can serve to shut the underprivileged and underserved out of those same opportunities. Second, the FTC will be the cop on the beat. The Report’s emphasis on the tools at the FTC’s disposal for regulating the use of big data analytics signals that the FTC intends to make use of its enforcement powers where it can.
On November 13, 2015, the Federal Trade Commission’s Chief Administrative Law Judge dismissed an FTC administrative complaint based on LabMD’s alleged failure to provide “reasonable and appropriate” security for personal information maintained on its computers. The ALJ concluded that complaint counsel failed to prove that LabMD’s alleged practices constituted an unfair trade practice. Specifically, according to the ALJ’s initial decision, complaint counsel failed to prove by a preponderance of the evidence the first prong of the three-part unfairness test – that the alleged unreasonable conduct caused or is likely to cause substantial injury to consumers, as required by Section 5(n) of the FTC Act. The case is notable for being the first data security case tried before an ALJ and one of only two instances in which a company has fought the FTC’s decision to move forward with an enforcement action based on allegations that the company engaged in unfair practices because of inadequate data security. Companies have otherwise voluntarily entered into consent decrees without admitting liability. In the other instance in which a company did not capitulate to an FTC enforcement action, Wyndham moved to dismiss the FTC’s lawsuit against it in federal district court for lack of jurisdiction. Wyndham lost in the district court, and on interlocutory appeal the federal court of appeals upheld that ruling but remanded the case to the district court for a trial on the merits, which will assess whether Wyndham’s alleged unreasonable data security practices meet the unfairness factors in Section 5(n) of the FTC Act. Accordingly, as the ALJ did here, the court in Wyndham will consider whether the practices and the data breaches there caused or were likely to cause substantial consumer injury under the first prong of an unfairness inquiry.
Data privacy and security regulators don’t always agree. Take the Federal Trade Commission, for example. In recent years, FTC commissioners have disagreed about the role that cost-benefit analyses should play and the types of consumer harms that should be considered in the FTC’s data privacy and security enforcement actions. For organizations that rely on the collection and use of consumer information, understanding the different viewpoints at the FTC and how those viewpoints may influence future enforcement is vital to evaluating risk. On Thursday, November 5, 2015, the Future of Privacy Forum will take up those issues as it celebrates its new home and its new partnership with Washington & Lee University School of Law by hosting a panel discussion addressing the future of Section 5 of the FTC Act. Panelists David Vladeck (former Director of the FTC’s Bureau of Consumer Protection) and James Cooper (former Acting Director of the FTC’s Office of Policy Planning) will examine key Section 5 issues.
The status of consumer data security law in the United States is at a crossroads. Last week, the White House released a discussion draft of its Consumer Privacy Bill of Rights Act of 2015, which would require businesses collecting personal information to maintain safeguards reasonably designed to ensure the security of that information. And yesterday, the Third Circuit held oral argument in FTC v. Wyndham Worldwide Corp., in which the district court last April denied Wyndham’s challenge to the Federal Trade Commission’s data security enforcement efforts.
On May 27, the Federal Trade Commission issued a report on the data broker industry that found data brokers operate with a “fundamental lack of transparency.” The commission unanimously recommended that Congress consider enacting legislation to make data broker practices more visible to consumers and to give consumers greater control over the immense amounts of personal information about them that are collected and shared by data brokers. Not well-recognized at the time were a number of concerns, mini-dissents if you will, expressed by Federal Trade Commissioner Josh Wright. I recently asked Commissioner Wright some questions about his “dissent by footnotes.”
Less than two months after the European Commission issued a report urging the Federal Trade Commission to step up enforcement of the EU-U.S. Safe Harbor framework, the FTC announced a settlement with twelve companies — including an Internet service provider, makers of consumer goods, three National Football League teams, and a developer of mobile applications — over allegations that they deceptively claimed to be certified under Safe Harbor. According to the FTC, each of these companies represented that it maintained an active Safe Harbor certification with the U.S. Department of Commerce when in fact it did not.
The Federal Trade Commission (“FTC”) recently issued a revised guidance (“Guide”) on the Red Flags Rule (“Rule”) (see “Fighting Identity Theft with the Red Flags Rule: A How-To Guide for Business”). The Red Flags Rule requires certain businesses to develop, implement and administer an identity theft protection program. The purpose of this Guide is to […]
On October 11, 2012, the U.S. Government Accountability Office (GAO) issued a report titled “Mobile Device Location Data: Additional Federal Actions Could Help Protect Consumer Privacy.” Requested by Sen. Al Franken (D-MN), the Report recognizes the efforts of Federal agencies to protect consumer privacy when using mobile devices but calls for additional action.
Following up on a public workshop held earlier this year, today the Federal Trade Commission (FTC) issued a set of truth-in-advertising and privacy guidelines for mobile device application (app) developers. Titled “Marketing Your Mobile App: Get it Right From the Start,” the guidelines provide an overview of key issues for all app developers to consider.
The Federal Trade Commission yesterday announced settlements with two companies over security breaches caused by peer-to-peer (P2P) file sharing software. The settlements require the companies to establish and maintain comprehensive information security programs and to undergo data security audits by independent auditors every other year for 20 years.
Today the Federal Trade Commission (FTC) issued its long-awaited privacy report, “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers,” which is intended to articulate “best practices” for companies that collect and use consumer data, and to assist Congress as it considers new privacy legislation.
A draft bill circulating on the Hill would impose new regulations on companies involved in the mobile “app” ecosystem, including wireless service providers, equipment manufacturers, device retailers, operating system providers, website operators, and other online service providers.