The Federal Trade Commission and National Highway Traffic Safety Administration are co-hosting a workshop on June 28, 2017, to explore the privacy and security issues raised by automated and connected vehicle technologies. The agencies seek to examine the types of data such technologies collect, store, transmit, and share; the potential benefits and challenges posed by the technologies; the privacy and security practices of vehicle manufacturers; the roles that federal agencies should play in regulating privacy and security issues; and how self-regulatory standards apply to connected vehicle privacy and security issues. In advance of the workshop, the FTC and NHTSA are seeking public comment on privacy and security issues. Comments may be submitted through April 20, 2017.
As previously reported, on Thursday, March 9th, the Federal Trade Commission (FTC) hosted a forum on the consumer implications of recent developments in artificial intelligence (AI) and blockchain technologies. This is the second of two entries on the March 9th FinTech Forum and focuses on the discussions surrounding blockchain technologies, in which panelists reflected on the nascent stage of the technology, industry representatives expressed confusion over the applicability of current regulation, and regulators expressed a lack of clarity over jurisdictional questions.
On Thursday, March 9th, the Federal Trade Commission (FTC) hosted a forum on the consumer implications of recent developments in artificial intelligence (AI) and blockchain technologies. The FTC acknowledged the benefits of technological developments in AI and blockchain technologies, but stressed that advancements in these technologies must be coupled with an awareness of and active engagement in identifying and minimizing associated risks. This blog post focuses on the AI discussion, which addressed how the values of privacy, autonomy, and fairness are affected by the advent of AI systems as well as how to ensure safety and security in the development and deployment of individual and connected AI systems.
On January 23, 2017, fourteen months after hosting a workshop to review the multi-device, multi-platform digital landscape, the FTC issued a staff report on cross-device tracking summarizing the FTC’s 2015 workshop and providing a set of related recommendations. In this post, we look at the FTC’s previous advice on cross-device tracking, key takeaways from the FTC report, and how the guidance aligns with the Digital Advertising Alliance’s (DAA) self-regulatory principles for cross-device tracking, which become enforceable on February 1, 2017.
In June 2015, the Federal Trade Commission held a workshop on The “Sharing” Economy: Issues Facing Platforms, Participants, and Regulators. The Commission also solicited public comments on the topic, receiving more than 2,000 comments in response. On November 17, the Commission issued a report summarizing the issues explored in the workshop and the public comments. The report emphasized that the workshop (and its ensuing summary) was not intended “as a precursor to law enforcement” but rather “an opportunity to learn more” about this rapidly evolving business model and to aid “the Commission, as well as regulators, consumer groups, platforms, participants using the platforms, incumbent firms, and others” to address the unique issues raised by sharing economy platforms.
The Federal Communications Commission’s long-awaited – and much debated – privacy rules for Internet Service Providers have now been adopted. The agency approved the rules by a 3-2 vote along political party lines last Thursday. Several of the FCC requirements are particularly notable for being more restrictive than the Federal Trade Commission’s standards for consumer online privacy. In this post we provide an overview of some of the new FCC rules and highlight key areas where the FCC’s requirements diverge from the FTC’s framework.
Please join us for our November 2016 Privacy and Cybersecurity Events.
Close followers of the cases FTC v. Wyndham Worldwide Corp. and In the Matter of LabMD know that the litigation has prompted increased Congressional oversight of the Federal Trade Commission’s data security enforcement practices. Prior to Wyndham and LabMD, Congressional debates on the FTC’s data security practices centered on whether the Commission should have additional tools to address these issues, including traditional rulemaking authority to create new data security rules, civil penalty authority to fine violators, or authority over the activities of non-profit entities. To the extent Congress questioned the FTC’s enforcement decisions in this pre-Wyndham and LabMD era, those inquiries typically focused on the length of time of FTC settlement agreements, while relatively little attention was paid to how the Commission provided notice of its data security standards or how the Commission chose its enforcement targets. Wyndham and LabMD fundamentally shifted this debate.
On October 13, the Federal Trade Commission held a workshop on drone privacy and cybersecurity as part of its Fall Technology Series. Close watchers of the drone privacy debate would recognize the arguments presented at the FTC workshop as reminiscent of the comprehensive and productive debate over drone privacy played out before the National Telecommunications and Information Administration earlier this year. The NTIA process concluded with the release of Best Practices for drone privacy supported by a diverse group of industry members and civil society representatives. Although the FTC’s workshop was in many ways a reprise of the NTIA multi-stakeholder debate, the workshop was notable insofar as the public gained new insights into FTC staff views on drone privacy and cybersecurity.
The Federal Trade Commission recently presented an analysis of how its approach to data security over the past two decades compares with the Framework for Improving Critical Infrastructure Cybersecurity issued in 2014 by the National Institute of Standards and Technology and strongly endorsed by the White House. The FTC first explains how this question has a faulty premise, as the Framework is not designed to be a compliance checklist. Instead, in this new blog post, the FTC outlines how the FTC’s enforcement actions comport with the Framework’s five Core functions—Identify, Protect, Detect, Respond, and Recover—and emphasizes how both the Framework and the FTC’s approach highlight risk assessment and management, along with implementation of reasonable security measures, as the touchstones of any data security compliance program.
The FTC today announced a request for public comment on the Standards for Safeguarding Consumer Information Rule. The FTC promulgated the Safeguards Rule in 2002, implementing Title V of the Gramm-Leach-Bliley Act, which required federal agencies to establish standards for the administrative, technical, and physical safeguards employed by financial institutions for certain information. In addition to general requests for comment, the FTC requested that five specific issues be addressed, which we have outlined below. Comments are due by November 7, 2016.
On July 25, 2016, Hogan Lovells hosted a Silicon Valley dinner as part of its 2025 dinner series. The theme of the dinner was “I’m from Mars, You’re from Venus: The Tech Community and its Future Relationship with Government”. The discussion, moderated by Deirdre Mulligan of UC Berkeley, focused on the tech community’s view of regulatory, law enforcement, and national security issues in the U.S. and in Europe, as well as how the tech industry will be affected by the upcoming U.S. elections and by Brexit.
A new report from the Department of Health and Human Services Office of the National Coordinator for Health Information Technology highlights data protection gaps in the U.S. for health data from wearable devices, social media, and emerging technologies. The report, “Examining Oversight of the Privacy & Security of Health Data Collected by Entities Not Regulated by HIPAA,” identifies several areas in which privacy and security protections for health data have lagged behind technological developments that are expanding the collection of health data outside the traditional venues for health care.
In a case that could have far-reaching implications for how companies are held liable for data security lapses, the FTC issued an order and opinion unanimously overturning its Chief Administrative Law Judge’s (ALJ) November 2015 dismissal of charges that LabMD’s allegedly lax data security measures were unfair practices under Section 5 of the FTC Act (see our coverage of […]
Thank you to everyone who participated in last week’s webinar “Privacy Shield: What You Need to Know,” in which we explored how companies demonstrate compliance with the Privacy Shield principles, what it takes to move from Safe Harbor to Privacy Shield, and more. A copy of the slide deck and recorded webinar are now available on our blog.
On Monday, May 16, 2016, the Supreme Court of the United States issued its highly anticipated opinion in Spokeo, Inc. v. Robins, a case that examined the question of whether a plaintiff who sued for a technical violation of the Fair Credit Reporting Act could maintain Article III standing for a class action without claiming any real-world injury. The case before the Court involved a putative class action brought against petitioner Spokeo, Inc., a company that generates profiles about people based on information obtained through computerized searches. Respondent Thomas Robins was one of the people with a profile on Spokeo’s website. According to Robins, the information on that profile was inaccurate. Robins filed a class-action complaint against Spokeo in federal court, alleging violations of the FCRA, which requires consumer reporting agencies to “follow reasonable procedures to assure maximum possible accuracy of” consumer reports. The Ninth Circuit held that by alleging the violation of a statutory right, Robins had satisfied the injury-in-fact requirement of Article III standing.
The FTC released this week a web-based tool to assist mobile app developers in determining which federal privacy laws apply to their mobile health applications. The tool asks developers a series of ten targeted questions that help a user determine whether HIPAA, FTC, and/or FDA rules and regulations might apply.
On March 15, 2016, the Federal Trade Commission reached an agreement with Lord & Taylor to settle charges that the luxury department store brand engaged in allegedly deceptive native advertising practices by failing to disclose and accurately represent its relationship to online magazines and fashion “influencers” who promoted the brand. This latest enforcement action follows the FTC’s release of a policy statement on native advertising practices and a companion set of guidelines for businesses. The action provides a cautionary tale with practical lessons about the importance of transparency in marketing strategies that mimic the look and feel of surrounding content.
FTC Commissioner Julie Brill will join Hogan Lovells US LLP as a partner and co-leader of the Privacy and Cybersecurity Practice on 1 April. Commissioner Brill was appointed by President Obama to the FTC in 2010 and will complete her service on 31 March.
The February 29, 2016 announcement of the new EU-U.S. data transfer framework—the Privacy Shield—was accompanied by over 130 pages of documentation and significantly more operational details than its predecessor, Safe Harbor. We have reviewed the Privacy Shield materials and published a comprehensive breakdown of the changes from Safe Harbor to Privacy Shield and the practical impact on business: Inside the New and Improved EU-U.S. Data Transfer Framework.
On February 29, 2016 and after more than two years of negotiations with the U.S. Department of Commerce, the European Commission released its draft Decision on the adequacy of the new EU–U.S. Privacy Shield program, accompanied by new information on how the Program will work. The Privacy Shield documentation is significantly more detailed than that associated with its predecessor, the EU-U.S. Safe Harbor, as it describes more specifically the measures that organizations wishing to use the Privacy Shield must implement. Importantly, the Privacy Shield provides for additional transparency and processes associated with U.S. government access to the personal data of EU individuals.
The FTC wants companies to listen. More precisely, the FTC wants companies to pay attention to and promptly respond to reports of security vulnerabilities. That’s a key takeaway from the Commission’s recent settlement with ASUSTek. In its complaint against the Taiwanese router manufacturer, the FTC alleged that ASUS misrepresented its security practices and failed to reasonably secure its router software, citing the company’s alleged failure to address vulnerability reports as one of the Commission’s primary concerns. The settlement reiterates the warnings contained in the FTC’s recent Start with Security Guide and prior settlements with HTC America and Fandango: the FTC expects companies to implement adequate processes for receiving and addressing security vulnerability reports within a reasonable time.
On January 31, 2016, the Silicon Flatirons Center for Law, Technology, and Entrepreneurship at the University of Colorado hosted its annual Digital Broadband Migration Symposium. The theme of this year’s conference was “The Evolving Industry Structure of the Digital Broadband Landscape.” The two-day conference brought together an array of leaders from government, academia, and industry to examine the role of regulatory oversight, antitrust law, and intellectual property policy in regulating industry structure and to discuss what policy reforms may be appropriate for the constantly changing digital broadband environment. As outlined below, a recurring topic throughout this year’s conference was the relationship between privacy, security, and the evolving digital landscape.
If you’ve ever opened your washing machine to find white socks turned a pale shade of pink, you can relate to the sentiment of Buzzfeed UK’s piece “14 Laundry Fails We’ve All Experienced.” Humorous and empathetic, the piece mimicked Buzzfeed’s editorial tone and style, but also subtly promoted the message of a commercial advertiser—in this case, Dylon, a color dye manufacturer. And in what may be a sign of things to come in the US, the piece drew the attention of the U.K.’s advertising regulator, the Advertising Standards Authority, which cited Buzzfeed for failing to make the piece “obviously identifiable” as commercial content, a violation of the U.K.’s Committee on Advertising Practices Code.