On July 25, 2016, Hogan Lovells hosted a Silicon Valley dinner as part of its 2025 dinner series. The theme of the dinner was “I’m from Mars, You’re from Venus: The Tech Community and its Future Relationship with Government.” The discussion, moderated by Deirdre Mulligan of UC Berkeley, focused on the tech community’s view of regulatory, law enforcement, and national security issues in the U.S. and in Europe, and on how the tech industry will be affected by the upcoming U.S. elections and by Brexit.
In a case that could have far-reaching implications for how companies are held liable for data security lapses, the FTC issued an order and opinion unanimously overturning its Chief Administrative Law Judge’s (ALJ) November 2015 dismissal of charges that LabMD’s allegedly lax data security measures were unfair practices under Section 5 of the FTC Act.
The U.S. Department of Education and Department of Justice recently weighed in on the obligations of school districts, colleges, and universities to provide civil rights protections for transgender students. On May 13, 2016, the Departments issued a Dear Colleague Letter (DCL) that summarizes the responsibilities of school districts, colleges, and universities that receive federal financial assistance under the Departments’ interpretation of federal law, including Title IX of the Education Amendments of 1972 and the Family Educational Rights and Privacy Act (FERPA). Here, we focus on the DCL’s guidance pertinent to compliance with FERPA.
A three-judge panel of the U.S. Court of Appeals for the Second Circuit today unanimously reversed a lower court’s denial of Microsoft’s motion to quash a warrant seeking the content of emails for a customer of its Outlook.com email service. The decision is surprising in that U.S. courts, including the Second Circuit, have traditionally enforced government process seeking documents or data stored abroad from entities that have control over the information, under the “control, not location” test. This case could have a significant impact on cloud providers’ decisions to store information abroad. It also serves, in the midst of debates about the newly enacted Privacy Shield and the recent challenge to Standard Contractual Clauses now before the Court of Justice of the European Union, as a counterbalance to arguments that some make about the U.S. legal system not respecting personal privacy.
On Monday, May 16, 2016, the Supreme Court of the United States issued its highly anticipated opinion in Spokeo, Inc. v. Robins, a case that examined whether a plaintiff who sued for a technical violation of the Fair Credit Reporting Act (FCRA) could maintain Article III standing for a class action without claiming any real-world injury. The case before the Court involved a putative class action brought against petitioner Spokeo, Inc., a company that generates profiles about people based on information obtained through computerized searches. Respondent Thomas Robins was one of the people with a profile on Spokeo’s website. According to Robins, the information on that profile was inaccurate. Robins filed a class-action complaint against Spokeo in federal court, alleging violations of the FCRA, which requires consumer reporting agencies to “follow reasonable procedures to assure maximum possible accuracy of” consumer reports. The Ninth Circuit held that by alleging the violation of a statutory right, Robins had satisfied the injury-in-fact requirement of Article III standing.
On April 5, 2016, the National Telecommunications and Information Administration (NTIA) initiated an inquiry to review the potential benefits and challenges presented by the Internet of Things (IoT). In its Notice and request for public comment (RFC), NTIA is seeking input on the current IoT technological and policy landscape with a goal of developing recommendations—in the form of a Green Paper—as to whether and how the federal government should play a role in fostering the advancement of IoT technologies.
On March 15, 2016, the Federal Trade Commission reached an agreement with Lord & Taylor to settle charges that the luxury department store brand engaged in allegedly deceptive native advertising practices by failing to disclose and accurately represent its relationship to online magazines and fashion “influencers” who promoted the brand. This latest enforcement action follows the FTC’s release of a policy statement on native advertising practices and a companion set of guidelines for businesses. The action provides a cautionary tale with practical lessons about the importance of transparency in marketing strategies that mimic the look and feel of surrounding content.
Fifteen months after forming an Internet of Things working group, on March 2, 2016, the Online Trust Alliance (OTA) released a final version of its IoT Framework along with a companion Resource Guide that provides explanations and additional resources. The voluntary Framework sets forth thirty suggested guidelines that provide criteria for designing privacy, security, and sustainability into connected devices. The OTA IoT principles represent a potential starting point for achieving privacy- and security-protective innovation for IoT devices.
On Thursday, Federal Communications Commission Chairman Tom Wheeler circulated a highly anticipated broadband data privacy and security Notice of Proposed Rulemaking to the other Commissioners, slating the proposals for a full Commission vote at the agency’s March 31 Open Meeting. The rules would apply to internet service providers, but organizations throughout the online data ecosystem will want to pay close attention to this rulemaking and be prepared to comment on the FCC’s proposals.
On March 2, 2016, the Consumer Financial Protection Bureau (CFPB) announced its first data security enforcement action in the form of a Consent Order with online payment platform Dwolla, Inc. The five-year Consent Order is based on CFPB allegations that Dwolla engaged in deceptive acts and practices by misrepresenting to consumers that it had “reasonable and appropriate data security practices.” Dwolla neither admitted nor denied that it engaged in data security misrepresentations. The CFPB fined Dwolla $100,000, enjoined it from making further misrepresentations, and required that it develop a written, comprehensive data security program, designate a person responsible for the program, provide employee training, conduct risk assessments, and undergo independent third-party audits annually, among other things. The CFPB also placed primary responsibility for compliance with the Consent Order on Dwolla’s board of directors.
The FTC wants companies to listen. More precisely, the FTC wants companies to pay attention to and promptly respond to reports of security vulnerabilities. That’s a key takeaway from the Commission’s recent settlement with ASUSTeK. In its complaint against the Taiwanese router manufacturer, the FTC alleged that ASUS misrepresented its security practices and failed to reasonably secure its router software, citing the company’s alleged failure to address vulnerability reports as one of the Commission’s primary concerns. The settlement reiterates the warnings contained in the FTC’s recent Start with Security Guide and prior settlements with HTC America and Fandango: the FTC expects companies to implement adequate processes for receiving and addressing security vulnerability reports within a reasonable time.
If you’ve ever opened your washing machine to find white socks turned a pale shade of pink, you can relate to the sentiment of Buzzfeed UK’s piece “14 Laundry Fails We’ve All Experienced.” Humorous and empathetic, the piece mimicked Buzzfeed’s editorial tone and style, but also subtly promoted the message of a commercial advertiser—in this case, Dylon, a color dye manufacturer. And in what may be a sign of things to come in the U.S., the piece drew the attention of the U.K.’s advertising regulator, the Advertising Standards Authority, which cited Buzzfeed for failing to make the piece “obviously identifiable” as commercial content, a violation of the U.K. Committee of Advertising Practice (CAP) Code.
On Wednesday, January 5, the FTC released a report titled “Big Data: A Tool for Inclusion or Exclusion?” The Report addresses the effects of the growing use of big data analytics on low-income and underserved populations, and the FTC’s role in monitoring and regulating the impacts of this commercial use of big data. There are two high-level takeaways from the Report. First, big data is a powerful tool that can be used to include or to exclude. Used responsibly, it can be a key to unlocking opportunities for underprivileged and underserved classes; but, when used with disregard for its effects, big data can serve to shut the underprivileged and underserved out of those same opportunities. Second, the FTC will be the cop on the beat. The Report’s emphasis on the tools at the FTC’s disposal for regulating the use of big data analytics signals that the FTC intends to make use of its enforcement powers where it can.
One of the most common devices in the emerging Internet of Things (IoT) was reportedly discovered to have a bug. According to the research firm Fortinet, a popular fitness tracker was vulnerable to wireless attacks through its unsecured Bluetooth port. A savvy attacker could install malware wirelessly within ten seconds—simply by coming within a few feet of the tracker. When the device’s owner returned home to sync daily activity with a computer, the malware could, in principle, infect the computer as well.
Russia’s Right to be Forgotten Law imposes an obligation on search engines that disseminate adverts targeted at consumers located in Russia to remove search results listing information on individuals where such information is unlawfully disseminated, untrustworthy, outdated, or irrelevant (i.e., the information is no longer substantially relevant to the individual in question due to subsequent events or the actions of individuals). The Law includes two exemptions under which a search engine does not have to comply: (i) information on events containing indications of a crime for which the limitation period for criminal liability has not expired; and (ii) information on crimes committed by an individual whose conviction record has not been erased.
On November 13, 2015, the Federal Trade Commission’s Chief Administrative Law Judge dismissed an FTC administrative complaint based on LabMD’s alleged failure to provide “reasonable and appropriate” security for personal information maintained on its computers. The ALJ concluded that complaint counsel failed to prove that LabMD’s alleged practices constituted an unfair trade practice. Specifically, according to the ALJ’s initial decision, complaint counsel failed to prove by a preponderance of the evidence the first prong of the three-part unfairness test—that the alleged unreasonable conduct caused or is likely to cause substantial injury to consumers, as required by Section 5(n) of the FTC Act. The case is notable for being the first data security case tried before an ALJ and one of only two instances in which a company has fought the FTC’s decision to move forward with an enforcement action based on allegations of unfair practices arising from inadequate data security. Companies have otherwise voluntarily entered into consent decrees without admitting liability. In the other instance in which a company did not capitulate to an FTC enforcement action, Wyndham moved to dismiss the FTC’s lawsuit against it in federal district court based on lack of jurisdiction. Wyndham lost in the district court, and on interlocutory appeal the federal court of appeals upheld that ruling but remanded the case to the district court for a trial on the merits, which will assess whether Wyndham’s alleged unreasonable data security practices meet the unfairness factors in Section 5(n) of the FTC Act. Accordingly, as the ALJ did here, the court in Wyndham will consider whether the practices and the data breaches there caused or were likely to cause substantial consumer injury under the first prong of an unfairness inquiry.
On November 5, 2015, the Federal Communications Commission Enforcement Bureau announced a $595,000 settlement agreement with Cox Communications, Inc. to resolve an investigation into whether the company failed to properly protect its customers’ personal information when electronic data systems were breached in August 2014. According to the FCC, Cox exposed the personal information of numerous customers and failed to report the breaches through the Commission’s established breach-reporting portal.
Data privacy and security regulators don’t always agree. Take the Federal Trade Commission, for example. In recent years, FTC commissioners have disagreed about the role that cost-benefit analyses should play and the types of consumer harms that should be considered in the FTC’s data privacy and security enforcement actions. For organizations that rely on the collection and use of consumer information, understanding the different viewpoints at the FTC and how those viewpoints may influence future enforcement is vital to evaluating risk. On Thursday, November 5, 2015, the Future of Privacy Forum will look at those issues as it celebrates its new home and its new partnership with Washington & Lee University School of Law by hosting a panel discussion addressing the future of Section 5 of the FTC Act. Panelists David Vladeck (former Director of the FTC’s Bureau of Consumer Protection) and James Cooper (former Acting Director of the Office of Policy Planning) will look at key Section 5 issues.
On September 11, 2015, the Federal Communications Commission Enforcement Bureau issued citations to F.N.B. Corporation and Lyft, Inc., a ride-sharing service, for Telephone Consumer Protection Act violations pertaining to the marketing rules.
The National Institute of Standards and Technology released the draft Framework for Cyber-Physical Systems on September 18. The Framework is intended to serve as a common blueprint for the development of safe, secure, and interoperable systems as varied as smart energy grids, wearable devices, and connected cars. The NIST Cyber-Physical Systems Public Working Group developed the draft document over the past year with input from several hundred experts from industry, academia, and government. NIST will be accepting public comment on the draft for the next 45 days.
For the past several years, California’s Legislature has actively sought to regulate unmanned aerial systems (UAS), including, but not only, through privacy-related legislation. In the 2014 session, one bill passed and was signed by Governor Brown. It bans the use of UAS to capture images or record voices of people without their permission, and is widely regarded as an anti-paparazzi law, aimed at protecting the many celebrities—and their children—in California’s entertainment industry. However, the wording of the bill more broadly protects individuals’ privacy from visual or audio recording in a manner that is “offensive to a reasonable person … under circumstances in which the [person] had a reasonable expectation of privacy” if the recording could not have been made without either trespassing or using special equipment. The bill is codified at California Civil Code section 1708.8. In the 2015 session, the California Legislature introduced five more bills, covering a range of issues.
Last month, bankrupt company RadioShack settled with a coalition of seventeen attorneys general to destroy most of the company’s customer data in its files. The agreement was part of a Bankruptcy Court-approved $26.2 million sale of RadioShack’s assets.