On October 13, the Federal Trade Commission held a workshop on drone privacy and cybersecurity as part of its Fall Technology Series. Close watchers of the drone privacy debate would recognize the arguments presented at the FTC workshop as reminiscent of the comprehensive and productive debate over drone privacy that played out before the National Telecommunications and Information Administration earlier this year. The NTIA process concluded with the release of Best Practices for drone privacy supported by a diverse group of industry members and civil society representatives. Although the FTC’s workshop was in many ways a reprise of the NTIA multi-stakeholder debate, it was notable insofar as the public gained new insights into FTC staff views on drone privacy and cybersecurity.
The Federal Trade Commission recently presented an analysis of how its approach to data security over the past two decades compares with the Framework for Improving Critical Infrastructure Cybersecurity issued in 2014 by the National Institute of Standards and Technology and strongly endorsed by the White House. The FTC first explains that the comparison rests on a faulty premise, as the Framework is not designed to be a compliance checklist. Instead, in this new blog post, the FTC outlines how its enforcement actions comport with the Framework’s five Core functions—Identify, Protect, Detect, Respond, and Recover—and emphasizes how both the Framework and the FTC’s approach highlight risk assessment and management, along with implementation of reasonable security measures, as the touchstones of any data security compliance program.
The FTC today announced a request for public comment on the Standards for Safeguarding Consumer Information Rule. The FTC promulgated the Safeguards Rule in 2002, implementing Title V of the Gramm-Leach-Bliley Act, which required federal agencies to establish standards for the administrative, technical, and physical safeguards employed by financial institutions for certain information. In addition to general requests for comment, the FTC requested that five specific issues be addressed, which we have outlined below. Comments are due by November 7, 2016.
On July 25, 2016, Hogan Lovells hosted a Silicon Valley dinner as part of its 2025 dinner series. The theme of the dinner was “I’m from Mars, You’re from Venus: The Tech Community and its Future Relationship with Government”. The discussion, moderated by Deirdre Mulligan of UC Berkeley, focused on the tech community’s view of regulatory, law enforcement, and national security issues in the U.S. and in Europe, and on how the tech industry will be affected by the upcoming U.S. elections as well as Brexit.
A new report from the Department of Health and Human Services Office of the National Coordinator for Health Information Technology highlights data protection gaps in the U.S. for health data from wearable devices, social media, and emerging technologies. The report, “Examining Oversight of the Privacy & Security of Health Data Collected by Entities Not Regulated by HIPAA,” identifies several areas in which privacy and security protections for health data have lagged behind technological developments that are expanding the collection of health data outside the traditional venues for health care.
In a case that could have far-reaching implications for how companies are held liable for data security lapses, the FTC issued an order and opinion unanimously overturning its Chief Administrative Law Judge’s (ALJ) November 2015 dismissal of charges that LabMD’s allegedly lax data security measures were unfair practices under Section 5 of the FTC Act (see our coverage of […]
Thank you to everyone who participated in last week’s webinar “Privacy Shield: What You Need to Know,” in which we explored how companies demonstrate compliance with the Privacy Shield principles, what it takes to move from Safe Harbor to Privacy Shield, and more. A copy of the slide deck and recorded webinar are now available on our blog.
On Monday, May 16, 2016, the Supreme Court of the United States issued its highly anticipated opinion in Spokeo, Inc. v. Robins, a case that examined the question of whether a plaintiff who sued for a technical violation of the Fair Credit Reporting Act could maintain Article III standing for a class action without claiming any real-world injury. The case before the Court involved a putative class action brought against petitioner Spokeo, Inc., a company that generates profiles about people based on information obtained through computerized searches. Respondent Thomas Robins was one of the people with a profile on Spokeo’s website. According to Robins, the information on that profile was inaccurate. Robins filed a class-action complaint against Spokeo in federal court, alleging violations of the FCRA, which requires consumer reporting agencies to “follow reasonable procedures to assure maximum possible accuracy of” consumer reports. The Ninth Circuit held that by alleging the violation of a statutory right, Robins had satisfied the injury-in-fact requirement of Article III standing.
This week, the FTC released a web-based tool to assist mobile app developers in determining which federal privacy laws apply to their mobile health applications. The tool asks developers a series of ten targeted questions that help a user determine whether HIPAA, FTC, and/or FDA rules and regulations might apply.
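The tool’s question-and-answer approach can be pictured as a simple decision flow. The Python sketch below is purely illustrative: the question names and rules are invented, drastically simplified placeholders, not a restatement of the tool’s actual ten questions, and certainly not legal guidance.

```python
# Toy sketch in the spirit of the FTC's interactive tool.
# The questions and rules below are hypothetical simplifications;
# the real tool (and counsel) are the authoritative sources.

def applicable_regimes(creates_health_records: bool,
                       offered_by_hipaa_covered_entity: bool,
                       is_medical_device_function: bool) -> set:
    """Return regimes that *might* apply under these toy rules."""
    regimes = set()
    if creates_health_records and offered_by_hipaa_covered_entity:
        regimes.add("HIPAA")  # Privacy and Security Rules may apply
    if is_medical_device_function:
        regimes.add("FDA")    # medical device regulation may apply
    regimes.add("FTC")        # Section 5 of the FTC Act applies broadly
    return regimes

# A consumer wellness app with no covered-entity tie and no device function:
print(sorted(applicable_regimes(True, False, False)))  # prints ['FTC']
```

The point of the flow, as with the real tool, is that the regimes are not mutually exclusive: an app can fall under more than one set of rules at once.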
On March 15, 2016, the Federal Trade Commission reached an agreement with Lord & Taylor to settle charges that the luxury department store brand engaged in allegedly deceptive native advertising practices by failing to disclose and accurately represent its relationship to online magazines and fashion “influencers” who promoted the brand. This latest enforcement action follows the FTC’s release of a policy statement on native advertising practices and a companion set of guidelines for businesses. The action provides a cautionary tale with practical lessons about the importance of transparency in marketing strategies that mimic the look and feel of surrounding content.
FTC Commissioner Julie Brill will join Hogan Lovells US LLP as a partner and co-leader of the Privacy and Cybersecurity Practice on April 1. Commissioner Brill was appointed by President Obama to the FTC in 2010 and will complete her service on March 31.
The February 29, 2016 announcement of the new EU-U.S. data transfer framework—the Privacy Shield—was accompanied by over 130 pages of documentation and significantly more operational details than its predecessor, Safe Harbor. We have reviewed the Privacy Shield materials and published a comprehensive breakdown of the changes from Safe Harbor to Privacy Shield and the practical impact on business: Inside the New and Improved EU-U.S. Data Transfer Framework.
On February 29, 2016 and after more than two years of negotiations with the U.S. Department of Commerce, the European Commission released its draft Decision on the adequacy of the new EU–U.S. Privacy Shield program, accompanied by new information on how the Program will work. The Privacy Shield documentation is significantly more detailed than that associated with its predecessor, the EU-U.S. Safe Harbor, as it describes more specifically the measures that organizations wishing to use the Privacy Shield must implement. Importantly, the Privacy Shield provides for additional transparency and processes associated with U.S. government access to the personal data of EU individuals.
The FTC wants companies to listen. More precisely, the FTC wants companies to pay attention, and promptly respond, to reports of security vulnerabilities. That’s a key takeaway from the Commission’s recent settlement with ASUSTek. In its complaint against the Taiwanese router manufacturer, the FTC alleged that ASUS misrepresented its security practices and failed to reasonably secure its router software, citing the company’s alleged failure to address vulnerability reports as one of the Commission’s primary concerns. The settlement reiterates the warnings contained in the FTC’s recent Start with Security Guide and prior settlements with HTC America and Fandango: the FTC expects companies to implement adequate processes for receiving and addressing security vulnerability reports within a reasonable time.
On January 31, 2016, the Silicon Flatirons Center for Law, Technology, and Entrepreneurship at the University of Colorado hosted its annual Digital Broadband Migration Symposium. The theme of this year’s conference was “The Evolving Industry Structure of the Digital Broadband Landscape.” The two-day conference brought together an array of leaders from government, academia, and industry to examine the role of regulatory oversight, antitrust law, and intellectual property policy in regulating industry structure and to discuss what policy reforms may be appropriate for the constantly changing digital broadband environment. As outlined below, a recurring topic throughout this year’s conference was the relationship between privacy, security, and the evolving digital landscape.
If you’ve ever opened your washing machine to find white socks turned a pale shade of pink, you can relate to the sentiment of Buzzfeed UK’s piece “14 Laundry Fails We’ve All Experienced.” Humorous and empathetic, the piece mimicked Buzzfeed’s editorial tone and style, but also subtly promoted the message of a commercial advertiser—in this case, Dylon, a color dye manufacturer. And in what may be a sign of things to come in the US, the piece drew the attention of the U.K.’s advertising regulator, the Advertising Standards Authority, which cited Buzzfeed for failing to make the piece “obviously identifiable” as commercial content, a violation of the U.K. Committee of Advertising Practice Code.
On Wednesday, January 5, the FTC released a report titled “Big Data: A Tool for Inclusion or Exclusion?” The Report addresses the effects of the growing use of big data analytics on low-income and underserved populations, and the FTC’s role in monitoring and regulating the impacts of this commercial use of big data. There are two high-level takeaways from the Report: First, big data is a powerful tool that can be used to include or to exclude. Used responsibly, it can be a key to unlocking opportunities for underprivileged and underserved classes; but, when used with disregard for its effects, big data can serve to shut the underprivileged and underserved out of those same opportunities. Second, the FTC will be the cop on the beat. The Report’s emphasis on the tools at the FTC’s disposal for regulating the use of big data analytics signals that the FTC intends to make use of its enforcement powers where it can.
On November 13, 2015, the Federal Trade Commission’s Chief Administrative Law Judge dismissed an FTC administrative complaint based on LabMD’s alleged failure to provide “reasonable and appropriate” security for personal information maintained on its computers. The ALJ concluded that complaint counsel failed to prove that LabMD’s alleged practices constituted an unfair trade practice. Specifically, according to the ALJ’s initial decision, complaint counsel failed to prove by a preponderance of the evidence the first prong of the three-part unfairness test—that the alleged unreasonable conduct caused or is likely to cause substantial injury to consumers, as required by Section 5(n) of the FTC Act. The case is notable for being the first data security case tried before an ALJ and one of only two instances in which a company has fought the FTC’s decision to move forward with an enforcement action based on allegations that the company engaged in unfair practices because of inadequate data security. Companies have otherwise voluntarily entered into consent decrees without admitting liability. In the other instance where a company did not capitulate to an FTC enforcement action, Wyndham moved to dismiss the FTC’s lawsuit against it in federal district court based on lack of jurisdiction. Wyndham lost in the district court, and on interlocutory appeal the federal court of appeals upheld that ruling but remanded the case to the district court for a trial on the merits, which will assess whether Wyndham’s alleged unreasonable data security practices meet the unfairness factors in Section 5(n) of the FTC Act. Accordingly, as the ALJ did here, the court in Wyndham will consider whether the practices and the data breaches there caused or were likely to cause substantial consumer injury under the first prong of an unfairness inquiry.
Data privacy and security regulators don’t always agree. Take the Federal Trade Commission, for example. In recent years, FTC commissioners have disagreed about the role that cost-benefit analyses should play and the types of consumer harms that should be considered in the FTC’s data privacy and security enforcement actions. For organizations that rely on the collection and use of consumer information, understanding the different viewpoints at the FTC and how those viewpoints may influence future enforcement is vital to evaluating risk. On Thursday, November 5, 2015, the Future of Privacy Forum will look at those issues as it celebrates its new home and its new partnership with Washington & Lee University School of Law by hosting a panel discussion addressing the Future of Section 5 of the FTC Act. Panelists David Vladeck, former Director of the FTC’s Bureau of Consumer Protection, and James Cooper, former Acting Director of the Office of Policy Planning, will look at key Section 5 issues.
The status of consumer data security law in the United States is at a crossroads. Last week, the White House released a discussion draft of its Consumer Privacy Bill of Rights Act of 2015, which would require businesses collecting personal information to maintain safeguards reasonably designed to ensure the security of that information. And yesterday, the Third Circuit held oral argument in FTC v. Wyndham Worldwide Corp., in which the district court last April denied Wyndham’s challenge to the Federal Trade Commission’s data security enforcement efforts.
On May 27, the Federal Trade Commission issued a report on the data broker industry that found data brokers operate with a “fundamental lack of transparency.” The commission unanimously recommended that Congress consider enacting legislation to make data broker practices more visible to consumers and to give consumers greater control over the immense amounts of personal information about them that are collected and shared by data brokers. Not well-recognized at the time were a number of concerns, mini-dissents if you will, expressed by Federal Trade Commissioner Josh Wright. I recently asked Commissioner Wright some questions about his “dissent by footnotes.”
Less than two months after the European Commission issued a report urging the Federal Trade Commission to step up enforcement of the EU-U.S. Safe Harbor framework, the FTC announced a settlement with twelve companies — including an Internet service provider, makers of consumer goods, three National Football League teams, and a developer of mobile applications — over allegations that they deceptively claimed to be certified under Safe Harbor. According to the FTC, each of these companies represented that it maintained an active Safe Harbor certification with the U.S. Department of Commerce when in fact it did not.
The Federal Trade Commission (“FTC”) recently issued a revised guidance (“Guide”) on the Red Flags Rule (“Rule”) (see “Fighting Identity Theft with the Red Flags Rule: A How-To Guide for Business”). The Red Flags Rule requires certain businesses to develop, implement and administer an identity theft protection program. The purpose of this Guide is to […]
On October 11, 2012, the U.S. Government Accountability Office (GAO) issued a report titled “Mobile Device Location Data: Additional Federal Actions Could Help Protect Consumer Privacy.” Requested by Sen. Al Franken (D-MN), the Report recognizes the efforts of federal agencies to protect consumer privacy when using mobile devices but calls for additional action.