On 1 October 2019, the Court of Justice of the European Union (CJEU) handed down a crucial decision, Case C-673/17 (Planet49), affecting the way consent is obtained on the internet. In that case, the German Federal Court of Justice had referred a number of questions to the CJEU regarding the validity of consent to cookies placed by a website operating an online lottery.
On 19 July the French Data Protection Authority (CNIL) published new guidelines on cookies and trackers. These replace the existing Recommendation No. 2013-378 of 5 December 2013, are intended to align with the relevant GDPR provisions and have been produced in anticipation of the future ePrivacy Regulation. The guidelines will be supplemented, at a later stage, by sectoral recommendations setting out practical methods for obtaining consent. These sectoral recommendations will be included in a final version of the guidelines on cookies and trackers open for public consultation, which will then be subject to final adoption by the CNIL (expected early 2020).
It’s no secret that a hot topic, perhaps the hot topic, in the European data protection world at present is the interplay between the GDPR and the e-Privacy Directive, in particular how it affects online advertising involving cookies. The European Data Protection Board recently released an opinion on this topic, and on 21 March the Court of Justice of the European Union released Advocate-General Szpunar’s opinion in the case of Planet49, which discusses the requirements for valid consent, in the context of both cookies under the e-Privacy Directive and more general data processing under the GDPR.
On February 27, 2019, the Federal Trade Commission (“FTC”) announced that it settled with the operators of a video social networking app for a record civil penalty of $5.7 million under the Children’s Online Privacy Protection Act (“COPPA”). This FTC COPPA action was notable not just for the size of the penalty, but also because of the joint statement by the two Democratic Commissioners, Rebecca Slaughter and Rohit Chopra, that future FTC enforcement should seek to hold corporate officers and directors accountable for violations of consumer protection law.
On December 4, 2018, the New York Attorney General (NYAG) announced that Oath Inc., which was known until June 2017 as AOL Inc. (AOL), has agreed to pay a $4.95 million civil penalty to settle allegations that AOL’s ad exchange practices violated the Children’s Online Privacy Protection Act (COPPA). The $4.95 million penalty is the largest ever assessed by any regulator in a COPPA enforcement matter.
In the digital age, data is everything. “Big Data” feeds countless business processes and offerings. Businesses rely on data to enhance revenue and drive efficiency, whether by better understanding the needs of existing customers, reaching new ones in previously unimagined ways, or obtaining valuable insights to guide a wide array of decisions. Data also drives developments in artificial intelligence, automation, and the Internet of Things. Come 2020, the California Consumer Privacy Act (“CCPA”) may significantly impact businesses’ data practices, with new and burdensome compliance obligations such as “sale” opt-out requirements and, in certain circumstances, restrictions on tiered pricing and service levels. This entry in Hogan Lovells’ ongoing series on the CCPA will focus on implications for data-driven businesses: the rapidly increasing number of businesses that rely heavily on consumer data, whether for marketing, gaining marketplace insights, internal research, or use as a core commodity.
A U.S. court has recently ruled that an EU citizen’s privacy rights and the GDPR do not trump a U.S. litigant’s right to obtain discovery, including videotaped depositions. In d’Amico Dry d.a.c. v. Nikka Finance, Inc., CA 18-0284-KD-MU, Dkt. No. 140 (Adm. S.D. Ala. Oct. 19, 2018), a federal magistrate denied an EU citizen’s motion […]
Words matter. Nowhere is this truer than in legislation, where word choices—often the product of long debate and imperfect compromise—determine the scope and impact of a law. Legislative history can speak volumes about those word choices, and the unique legislative history of the California Consumer Privacy Act of 2018 (CCPA) only highlights the importance of understanding the terms used in the act. We thus focus here on discussing some of the CCPA’s key definitional terms.
Judging by the number of calls and the intensity of the discussions about how to comply with the cookie consent requirement in a post-GDPR world, this issue has become a top worry for organisations and data protection officers. Partly due to the visibility of the mechanisms used to collect this consent, and partly due to the potential implications of operating a website without cookies, the dilemma over which solution to deploy has become a serious business decision. Different business stakeholders are often at odds with each other, and matters are being escalated to decision makers who have never before been involved in the technically complex and largely misunderstood world of cookies. The tension is rising, and yet no single approach has emerged as the preferred one. So everyone is anxious to find a way to keep doing what they have always done while still complying with the law. Is this panic justified?
The General Data Protection Regulation became applicable on 25 May 2018. Given the urgent need to adapt Law no. 78-17 of 6 January 1978 to the new European Union framework, the French Government initiated an accelerated legislative procedure. This procedure led to the French National Assembly’s adoption, in a final reading on 14 May 2018, of the bill on personal data protection. However, some French Senators lodged a constitutional complaint against the law on 16 May 2018.
Nothing challenges the effectiveness of data protection law like technological innovation. You think you have cracked a technology neutral framework and then along comes the next evolutionary step in the chain to rock the boat. It happened with the cloud. It happened with social media, with mobile, with online behavioural targeting and with the Internet of Things. And from the combination of all of that, artificial intelligence is emerging as the new testing ground. 21st century artificial intelligence relies on machine learning, and machine learning relies on…? You guessed it: Data. Artificial intelligence is essentially about problem solving and for that we need data, as much data as possible. Against this background, data privacy and cybersecurity legal frameworks around the world are attempting to shape the use of that data in a way that achieves the best of all worlds: progress and protection for individuals. Is that realistically achievable?
Recently, the Russian Data Privacy Authority, Roskomnadzor, organized an Open Doors Day in honor of the International Data Privacy Day. During the occasion, Roskomnadzor officers presented on the authority’s 2017 enforcement activities. They followed this presentation with an open question and answer period, during which they responded to numerous questions raised by attendees. This post summarizes the key takeaways.
It is finally here. This is the year of the GDPR. A journey that started with an ambitious policy paper about modernising data protection almost a decade ago – a decade! – is about to reach flying altitude. No more ‘in May next year this, in May next year that’. Our time has come. Given the amount of attention that the GDPR has received in recent times, data protection professionals are in high demand but we are ready. We knew this was coming and we have had years to prepare. However, even the most seasoned practitioners are at risk of being engulfed by the frantic fire-fighting mood out there. The hamster wheel of GDPR compliance is spinning faster and faster, but it is precisely now when we must look up, see the bigger picture and focus on getting the important things right.
Following the European Commission and European Parliament’s proposed versions of the EU Regulation on Privacy and Electronic Communications, we are now waiting for the Council of the European Union to agree their position before discussions between the three bodies can begin. A discussion paper from the Bulgarian Presidency of the Council dated 11 January 2018 shows that the Council is still considering multiple options in relation to several critical issues.
You may not have noticed it, but despite all of the distractions caused by Brexit and the General Data Protection Regulation (Regulation (EU) 2016/679), the UK Information Commissioner’s Office has been extremely active on the enforcement front in recent times. One of the features of this activity has been the variety of infringements targeted and, in particular, the focus on e-mail marketing. More specifically, the ICO has taken enforcement action by way of monetary penalties against well-known consumer brands such as Flybe, Honda, Morrisons and Moneysupermarket, for practices that might not have been seen as so out of order in the past. However, given the current tough stance taken by the ICO in connection with direct marketing practices, it would not be surprising to see future enforcement actions in this area.
The Information Commissioner’s Office has issued a £70,000 fine against Flybe and a £13,000 fine against Honda Motor Europe Ltd for breaching Regulation 22 of the Privacy and Electronic Communications Regulations by sending emails requesting individuals to update their marketing preferences.
Data brokers are organisations that obtain data from a variety of sources and then sell or license it to third parties. Many trade in personal data, which is purchased by their customers for several purposes, most commonly to support marketing campaigns. The UK data protection regulator has for some time been actively enforcing against organisations that buy individuals’ personal data for direct marketing purposes without first conducting appropriate due diligence to ensure that those individuals have adequately consented to receiving marketing communications. However, in a recently issued monetary penalty notice, the ICO indicated that it may be shifting its enforcement strategy. This post discusses the latest developments.
Part 6 of Future-Proofing Privacy: Profiling Restrictions versus Big Data. Profiling and big data analytics are set to play a pivotal role in the growth of the digital economy. From cookie-based tracking to people’s interaction through social media, the size and the degree of granularity of our digital footprints have created unprecedented opportunities for business development and service delivery. The scale of data collection, data sharing and data analysis has not gone unnoticed by public policy makers, and this has led to the inclusion of special rules addressing profiling in the Regulation. In fact, from the point of view of those businesses seeking to benefit from data analytics, the provisions dealing with profiling are likely to become the most crucial aspect of the entire Regulation.
Part 4 of Future-Proofing Privacy: Justifying Data Uses – From Consent to Legitimate Interests. Currently, under the Data Protection Directive, each instance of data processing requires a legal justification – a “ground for processing”. This fundamental feature of EU data protection law will remain unchanged under the Regulation. However, the bar for showing the existence of certain grounds for processing will be set higher. This is especially true with regard to consent.
On September 11, 2015, the Federal Communications Commission Enforcement Bureau issued citations to F.N.B. Corporation and Lyft, Inc., a ride-sharing service, for Telephone Consumer Protection Act violations pertaining to the telemarketing rules.
Spain is well known for having one of the most restrictive data protection regimes in the European Union. It also has some of the highest penalties (fines of up to € 600,000 per infringement), and a data protection authority – the Spanish Data Protection Agency (AEPD) – with a reputation for being one of the fiercest in the EU. Moreover, these penalties do not exist only on paper; they are applied on a regular basis by the AEPD. For instance, in the past few years, it has imposed fines of € 450,000, € 900,000 and € 1,400,000.
Profiling and Big Data analytics are set to play a pivotal role in the growth of the digital economy. From cookie-based tracking to people’s interaction through social media, the size and the degree of granularity of our digital footprints have created unprecedented opportunities for business development and service delivery. The scale of data collection, data sharing and data analysis has not gone unnoticed by public policy makers, and this has led to the inclusion of special rules addressing profiling in the Regulation. In fact, from the point of view of those businesses seeking to benefit from data analytics, the provisions dealing with profiling are likely to become the most crucial aspect of the entire Regulation. This entry is an excerpt from Hogan Lovells’ “Future-proofing privacy: A guide to preparing for the EU Data Protection Regulation.”
Under the Data Protection Directive, each instance of data processing requires a legal justification – a “ground for processing”. This fundamental feature of EU data protection law remains unchanged under the draft Regulation. However, the bar for showing the existence of certain grounds for processing will be set higher, particularly in relation to consent. This entry is an excerpt from Hogan Lovells’ “Future-proofing privacy: A guide to preparing for the EU Data Protection Regulation.”