In June 2015, the Federal Trade Commission held a workshop on The “Sharing” Economy: Issues Facing Platforms, Participants, and Regulators. The Commission also solicited public comments on the topic, receiving more than 2,000 comments in response. On November 17, the Commission issued a report summarizing the issues explored in the workshop and the public comments. The report emphasized that the workshop (and its ensuing summary) was not intended “as a precursor to law enforcement” but rather as “an opportunity to learn more” about this rapidly evolving business model and to aid “the Commission, as well as regulators, consumer groups, platforms, participants using the platforms, incumbent firms, and others” in addressing the unique issues raised by sharing economy platforms.
Ever since the first draft of the EU-US Privacy Shield framework was published in early 2016, groups opposed to the idea have indicated their intent to challenge the legality of the framework under EU law. Recently, the privacy advocacy group Digital Rights Ireland (DRI) made good on that promise. Following the filing of a formal complaint on September 15 asking the Court of Justice of the European Union to annul the framework, DRI has now made public the details of its complaint.
The Federal Communications Commission’s long-awaited – and much debated – privacy rules for Internet Service Providers have now been adopted. The agency approved the rules by a 3-2 vote along political party lines last Thursday. Several of the FCC requirements are particularly notable for being more restrictive than the Federal Trade Commission’s standards for consumer online privacy. In this post we provide an overview of some of the new FCC rules and highlight key areas where the FCC’s requirements diverge from the FTC’s framework.
Close followers of the cases FTC v. Wyndham Worldwide Corp. and In the Matter of LabMD know that the litigation has prompted increased Congressional oversight of the Federal Trade Commission’s data security enforcement practices. Prior to Wyndham and LabMD, Congressional debates on the FTC’s data security practices centered on whether the Commission should have additional tools to address these issues, including traditional rulemaking authority to create new data security rules, civil penalty authority to fine violators, or authority over the activities of non-profit entities. To the extent Congress questioned the FTC’s enforcement decisions in this pre-Wyndham and LabMD era, those inquiries typically focused on the length of time of FTC settlement agreements, while relatively little attention was paid to how the Commission provided notice of its data security standards or how the Commission chose its enforcement targets. Wyndham and LabMD fundamentally shifted this debate.
On October 13, the Federal Trade Commission held a workshop on drone privacy and cybersecurity as part of its Fall Technology Series. Close watchers of the drone privacy debate would recognize the arguments presented at the FTC workshop as reminiscent of the comprehensive and productive debate over drone privacy that played out before the National Telecommunications and Information Administration earlier this year. The NTIA process concluded with the release of Best Practices for drone privacy supported by a diverse group of industry members and civil society representatives. Although the FTC’s workshop was in many ways a reprise of the NTIA multi-stakeholder debate, the workshop was notable insofar as the public gained new insights into FTC staff views on drone privacy and cybersecurity.
Some of the largest cyber attacks in recent memory have employed an army of connected home devices to achieve their goals. This co-opting of connected home devices owned by consumers around the world occurs without those consumers’ knowledge or consent. For example, in mid-September, several thousand devices—home routers, Internet-connected video cameras, and digital video recorders—were used to create a “botnet” that collectively pounded the security researcher Brian Krebs’ website with 620 gigabits of data per second. At the time, the attack was thought to be the largest in history. An even larger army was assembled a few days later for an attack on the French hosting provider OVH that peaked at over one terabit of traffic per second. These distributed denial-of-service attacks were successful because they exploited basic security vulnerabilities in connected home devices, such as default passwords used to access administrator settings.
This week, the Online Trust Alliance turned its attention from manufacturers to consumers by releasing a checklist of basic steps that consumers can take to improve the privacy and security “hygiene” of their connected home and wearable devices. Just as smoke detectors require periodic battery changes, the OTA warns that IoT devices also benefit from regular checkups.
On August 29, 2016, the Federal Aviation Administration’s long-awaited small unmanned aircraft systems rule went into effect, for the first time broadly authorizing commercial drone operations. This is a positive step, as drones have great safety and efficiency benefits for the public. Nevertheless, the American public remains concerned about drone privacy issues.
The FTC today announced a request for public comment on the Standards for Safeguarding Consumer Information Rule. The FTC promulgated the Safeguards Rule in 2002, implementing Title V of the Gramm-Leach-Bliley Act, which required federal agencies to establish standards for the administrative, technical, and physical safeguards employed by financial institutions for certain information. In addition to general requests for comment, the FTC requested comment on five specific issues, which we outline below. Comments are due by November 7, 2016.
On July 25, 2016, Hogan Lovells hosted a Silicon Valley dinner as part of its 2025 dinner series. The theme of the dinner was “I’m from Mars, You’re from Venus: The Tech Community and its Future Relationship with Government”. The discussion, moderated by Deirdre Mulligan of UC Berkeley, focused on the tech community’s view of regulatory, law enforcement, and national security issues in the U.S. as well as in Europe, and on how the tech industry will be affected by the upcoming U.S. elections as well as Brexit.
In a case that could have far-reaching implications for how companies are held liable for data security lapses, the FTC issued an order and opinion unanimously overturning its Chief Administrative Law Judge’s (ALJ) November 2015 dismissal of charges that LabMD’s allegedly lax data security measures were unfair practices under Section 5 of the FTC Act (see our coverage of […]).
The U.S. Department of Education and Department of Justice recently weighed in on the obligations of school districts, colleges, and universities to provide civil rights protections for transgender students. On May 13, 2016, the Departments issued a Dear Colleague Letter that summarizes the responsibilities of school districts, colleges, and universities that receive federal financial assistance under the Departments’ interpretation of federal law, including Title IX of the Education Amendments of 1972 and the Family Educational Rights and Privacy Act (FERPA). Here, we focus on the DCL’s guidance pertinent to compliance with FERPA.
A three-judge panel of the U.S. Court of Appeals for the Second Circuit today unanimously reversed a lower court’s denial of Microsoft’s motion to quash a warrant seeking the content of emails for a customer of its Outlook.com email service. The decision is surprising in that U.S. courts, including the Second Circuit, have traditionally enforced government process seeking documents or data stored abroad from entities that have control over the information under the test of “control, not location.” This case could have a significant impact on cloud providers’ decisions to store information abroad. It also serves, in the midst of debates about the newly enacted Privacy Shield and the recent challenge to Standard Contractual Clauses now before the Court of Justice of the European Union, as a counterbalance to arguments some make that the U.S. legal system does not respect personal privacy.
On Monday, May 16, 2016, the Supreme Court of the United States issued its highly anticipated opinion in Spokeo, Inc. v. Robins, a case that examined the question of whether a plaintiff who sued for a technical violation of the Fair Credit Reporting Act could maintain Article III standing for a class action without claiming any real-world injury. The case before the Court involved a putative class action brought against petitioner Spokeo, Inc., a company that generates profiles about people based on information obtained through computerized searches. Respondent Thomas Robins was one of the people with a profile on Spokeo’s website. According to Robins, the information on that profile was inaccurate. Robins filed a class-action complaint against Spokeo in federal court, alleging violations of the FCRA, which requires consumer reporting agencies to “follow reasonable procedures to assure maximum possible accuracy of” consumer reports. The Ninth Circuit held that by alleging the violation of a statutory right Robins had satisfied the injury-in-fact requirement of Article III standing.
On April 5, 2016, the National Telecommunications and Information Administration initiated an inquiry to review the potential benefits and challenges presented by the Internet of Things. In its Notice and request for public comment (RFC), NTIA is seeking input on the current IoT technological and policy landscape with a goal of developing recommendations—in the form of a Green Paper—as to whether and how the federal government should play a role in fostering the advancement of IoT technologies.
On March 15, 2016, the Federal Trade Commission reached an agreement with Lord & Taylor to settle charges that the luxury department store brand engaged in allegedly deceptive native advertising practices by failing to disclose and accurately represent its relationship to online magazines and fashion “influencers” who promoted the brand. This latest enforcement action follows the FTC’s release of a policy statement on native advertising practices and a companion set of guidelines for businesses. The action provides a cautionary tale with practical lessons about the importance of transparency in marketing strategies that mimic the look and feel of surrounding content.
Fifteen months after forming an Internet of Things working group, on March 2, 2016, the Online Trust Alliance released a final version of its IoT Framework along with a companion Resource Guide that provides explanations and additional resources. The voluntary Framework sets forth thirty suggested guidelines that provide criteria for designing privacy, security, and sustainability into connected devices. The creation of the OTA IoT principles represents a potential starting point for achieving privacy- and security-protective innovation for IoT devices.
On Thursday, Federal Communications Commission Chairman Tom Wheeler circulated a highly anticipated broadband data privacy and security Notice of Proposed Rulemaking to the other Commissioners, slating the proposals for a full Commission vote at the agency’s March 31 Open Meeting. The rules would apply to internet service providers, but organizations throughout the online data ecosystem will want to pay close attention to this rulemaking and be prepared to comment on the FCC’s proposals.
On March 2, 2016, the Consumer Financial Protection Bureau announced its first data security enforcement action in the form of a Consent Order with online payment platform Dwolla, Inc. The five-year Consent Order is based on CFPB allegations that Dwolla engaged in deceptive acts and practices by misrepresenting to consumers that it had “reasonable and appropriate data security practices.” Dwolla neither admitted nor denied that it engaged in data security misrepresentations. The CFPB fined Dwolla $100,000, enjoined it from making further misrepresentations, and required it to develop a written, comprehensive data security program, designate a person responsible for the program, provide employee training, conduct risk assessments, and undergo annual independent third-party audits, among other things. The CFPB also placed primary responsibility for compliance with the Consent Order on Dwolla’s board of directors.
The FTC wants companies to listen. More precisely, the FTC wants companies to pay attention to, and promptly respond to, reports of security vulnerabilities. That’s a key takeaway from the Commission’s recent settlement with ASUSTek. In its complaint against the Taiwanese router manufacturer, the FTC alleged that ASUS misrepresented its security practices and failed to reasonably secure its router software, citing the company’s alleged failure to address vulnerability reports as one of the Commission’s primary concerns. The settlement reiterates the warnings contained in the FTC’s recent Start with Security Guide and prior settlements with HTC America and Fandango: the FTC expects companies to implement adequate processes for receiving and addressing security vulnerability reports within a reasonable time.
If you’ve ever opened your washing machine to find white socks turned a pale shade of pink, you can relate to the sentiment of BuzzFeed UK’s piece “14 Laundry Fails We’ve All Experienced.” Humorous and empathetic, the piece mimicked BuzzFeed’s editorial tone and style, but also subtly promoted the message of a commercial advertiser—in this case, Dylon, a color dye manufacturer. And in what may be a sign of things to come in the US, the piece drew the attention of the U.K.’s advertising regulator, the Advertising Standards Authority, which cited BuzzFeed for failing to make the piece “obviously identifiable” as commercial content, a violation of the U.K.’s Committee on Advertising Practices Code.
On Wednesday, January 5, the FTC released a report titled “Big Data: A Tool for Inclusion or Exclusion?” The Report addresses the effects of the growing use of big data analytics on low-income and underserved populations, and the FTC’s role in monitoring and regulating the impacts of this commercial use of big data. There are two high-level takeaways from the Report: First, big data is a powerful tool that can be used to include or to exclude. Used responsibly, it can be a key to unlocking opportunities for underprivileged and underserved classes; but, when used with disregard for its effects, big data can serve to shut the underprivileged and underserved out of those same opportunities. Second, the FTC will be the cop on the beat. The Report’s emphasis on the tools at the FTC’s disposal for regulating the use of big data analytics signals that the FTC intends to make use of its enforcement powers where it can.
One of the most common devices in the emerging Internet of Things (IoT) was reportedly discovered to have a bug. According to the research firm Fortinet, a popular fitness tracker was vulnerable to wireless attacks through its unsecured Bluetooth port. A savvy attacker could install malware wirelessly within ten seconds—simply by coming within a few feet of the tracker. When the device’s owner returned home to sync daily activity with a computer, the malware could, in principle, infect the computer as well.
The Right to be Forgotten Law imposes an obligation on search engines that disseminate adverts targeted at consumers located in Russia to remove search results listing information on individuals where such information is unlawfully disseminated, untrustworthy, outdated, or irrelevant (i.e., the information is no longer substantially relevant to the individual in question due to subsequent events or the individual’s actions). The Law includes two exemptions under which a search engine does not have to comply: (i) information about events that constitute a crime for which the limitation period for criminal liability has not expired; and (ii) information about crimes committed by an individual whose conviction record has not been erased.
On November 13, 2015, the Federal Trade Commission’s Chief Administrative Law Judge dismissed an FTC administrative complaint based on LabMD’s alleged failure to provide “reasonable and appropriate” security for personal information maintained on its computers. The ALJ concluded that complaint counsel failed to prove that LabMD’s alleged practices constituted an unfair trade practice. Specifically, according to the ALJ’s initial decision, complaint counsel failed to prove by a preponderance of the evidence the first prong of the three-part unfairness test – that the alleged unreasonable conduct caused or is likely to cause substantial injury to consumers, as required by Section 5(n) of the FTC Act. The case is notable for being the first data security case tried before an ALJ and one of only two instances in which a company has fought the FTC’s decision to move forward with an enforcement action based on allegations that the company engaged in unfair practices because of inadequate data security. Companies have otherwise voluntarily entered into consent decrees without admitting liability. In the other instance where a company did not capitulate to an FTC enforcement action, Wyndham moved to dismiss the FTC’s lawsuit against it in federal district court based on lack of jurisdiction. Wyndham lost in the district court, and on interlocutory appeal the federal court of appeals upheld that ruling but remanded the case to the district court for a trial on the merits, which will assess whether Wyndham’s alleged unreasonable data security practices meet the unfairness factors in Section 5(n) of the FTC Act. Accordingly, as the ALJ did here, the court in Wyndham will consider whether the practices and the data breaches there caused or were likely to cause substantial consumer injury under the first prong of the unfairness inquiry.