Words matter. Nowhere is this truer than in legislation, where word choices—often the product of long debate and imperfect compromise—determine the scope and impact of a law. Legislative history can speak volumes about those word choices, and the unique legislative history of the California Consumer Privacy Act of 2018 (CCPA) only highlights the importance of understanding the terms used in the act. We thus focus here on discussing some of the CCPA’s key definitional terms.
Judging by the number of calls and the intensity of the discussions about how to comply with the cookie consent requirement in a post-GDPR world, this issue has become a top worry for organisations and data protection officers. Partly because of the visibility of the mechanisms used to collect this consent, and partly because of the potential implications of operating a website without cookies, the dilemma over which solution to deploy has become a serious business decision. Different business stakeholders are often at odds with each other, and matters are being escalated to decision makers who have never been involved in the technically complex and largely misunderstood world of cookies. The tension is rising, and yet no approach has emerged as the preferred one among all involved. So everyone is anxious to find a way to do what they have always done and still comply with the law. Is this panic justified?
The General Data Protection Regulation became applicable on 25 May 2018. In light of the urgency of adapting Law no. 78-17 of 6 January 1978 to the new European Union law, the French Government initiated an accelerated procedure. This procedure led to the adoption, in final reading by the French National Assembly, of the bill on personal data protection on 14 May 2018. However, a group of French Senators lodged a constitutional complaint against the law on 16 May 2018.
Nothing challenges the effectiveness of data protection law like technological innovation. You think you have cracked a technology-neutral framework and then along comes the next evolutionary step in the chain to rock the boat. It happened with the cloud. It happened with social media, with mobile, with online behavioural targeting and with the Internet of Things. And from the combination of all of that, artificial intelligence is emerging as the new testing ground. 21st-century artificial intelligence relies on machine learning, and machine learning relies on…? You guessed it: data. Artificial intelligence is essentially about problem solving, and for that we need data, as much data as possible. Against this background, data privacy and cybersecurity legal frameworks around the world are attempting to shape the use of that data in a way that achieves the best of all worlds: progress and protection for individuals. Is that realistically achievable?
Recently, the Russian data protection authority, Roskomnadzor, held an Open Doors Day in honor of International Data Privacy Day. During the event, Roskomnadzor officers presented on the authority’s 2017 enforcement activities, followed by an open question-and-answer session in which they responded to numerous questions from attendees. This post summarizes the key takeaways.
It is finally here. This is the year of the GDPR. A journey that started with an ambitious policy paper about modernising data protection almost a decade ago – a decade! – is about to reach flying altitude. No more ‘in May next year this, in May next year that’. Our time has come. Given the amount of attention that the GDPR has received in recent times, data protection professionals are in high demand but we are ready. We knew this was coming and we have had years to prepare. However, even the most seasoned practitioners are at risk of being engulfed by the frantic fire-fighting mood out there. The hamster wheel of GDPR compliance is spinning faster and faster, but it is precisely now when we must look up, see the bigger picture and focus on getting the important things right.
Following the European Commission and European Parliament’s proposed versions of the EU Regulation on Privacy and Electronic Communications, we are now waiting for the Council of the European Union to agree its position before discussions between the three bodies can begin. A discussion paper from the Bulgarian Presidency of the Council, dated 11 January 2018, shows that the Council is still considering multiple options on several critical issues.
You may not have noticed it, but despite all of the distractions caused by Brexit and the General Data Protection Regulation (Regulation (EU) 2016/679), the UK Information Commissioner’s Office has been extremely active on the enforcement front in recent times. One of the features of this activity has been the variety of infringements targeted and, in particular, the focus on e-mail marketing. More specifically, the ICO has taken enforcement action by way of monetary penalties against well-known consumer brands such as Flybe, Honda, Morrisons and Moneysupermarket, for practices that might not have been seen as so out of order in the past. However, given the current tough stance taken by the ICO in connection with direct marketing practices, it would not be surprising to see future enforcement actions in this area.
The Information Commissioner’s Office has issued a £70,000 fine against Flybe and a £13,000 fine against Honda Motor Europe Ltd for breaching Regulation 22 of the Privacy and Electronic Communications Regulations by sending emails asking individuals to update their marketing preferences.
Data brokers are organisations that obtain data from a variety of sources and then sell or license it to third parties. Many trade in personal data, which their customers purchase for a range of purposes, most commonly to support marketing campaigns. The UK data protection regulator has for some time been actively enforcing against organisations that buy individuals’ personal data for direct marketing purposes without first conducting appropriate due diligence to ensure that those individuals have adequately consented to receiving marketing communications. However, in a recently issued monetary penalty notice, the ICO indicated that it may be shifting its enforcement strategy. This post discusses the latest developments.
Part 6 of Future-Proofing Privacy: Profiling Restrictions versus Big Data. Profiling and big data analytics are set to play a pivotal role in the growth of the digital economy. From cookie-based tracking to people’s interaction through social media, the size and the degree of granularity of our digital footprints have created unprecedented opportunities for business development and service delivery. The scale of data collection, data sharing and data analysis has not gone unnoticed by public policy makers, and this has led to the inclusion of special rules addressing profiling in the Regulation. In fact, from the point of view of those businesses seeking to benefit from data analytics, the provisions dealing with profiling are likely to become the most crucial aspect of the entire Regulation.
Part 4 of Future-Proofing Privacy: Justifying Data Uses – From Consent to Legitimate Interests. Currently, under the Data Protection Directive, each instance of data processing requires a legal justification – a “ground for processing”. This fundamental feature of EU data protection law will remain unchanged under the Regulation. However, the bar for showing the existence of certain grounds for processing will be set higher. This is especially true with regard to consent.
On September 11, 2015, the Federal Communications Commission Enforcement Bureau issued citations to F.N.B. Corporation and Lyft, Inc., a ride-sharing service, for Telephone Consumer Protection Act violations pertaining to its marketing rules.
Spain is well known for having one of the most restrictive data protection regimes in the European Union. It also has some of the highest penalties (fines of up to €600,000 per infringement) and a data protection authority – the Spanish Data Protection Agency (AEPD) – with a reputation for being one of the fiercest in the EU. Moreover, these penalties exist not only on paper; the AEPD applies them on a regular basis. In the past few years, for instance, it has imposed fines of €450,000, €900,000 and €1,400,000.
The CNIL, France’s data protection authority, published a new recommendation on 25 February 2014 relating to the collection of credit card information, replacing an older 2003 recommendation. The new recommendation, which represents a de facto standard for online merchants and payment services providers who collect data from French consumers, is more prescriptive than its predecessor, particularly regarding how online merchants should seek consent for the retention of credit card information.
The continued uncertainty around the draft EU Data Protection Regulation presents something of a challenge for data controllers. It is clear that the Regulation could require them to make significant changes to how they handle individuals’ data, but the ongoing fundamental political disagreements make it difficult to predict which changes will make it into the final form of the legislation. So it is interesting to see the UK ICO’s blog recommendations on where to start in preparing for the reforms, which highlight three areas: consent, breach notification, and privacy by design.
The EU Parliament’s Committee on Civil Liberties, Justice and Home Affairs (“LIBE”) voted on Monday to adopt its report on the draft General Data Protection Regulation and the separate Directive for the law enforcement sector. This vote sets out the Parliament’s position for its negotiations with the Council and Commission (known as the “trialogue” stage). The Committee aims to have a plenary Parliamentary vote in March before the Parliamentary elections.
On October 17, Jan Albrecht, rapporteur to the EU Parliament’s Committee on Civil Liberties, Justice and Home Affairs (“LIBE”), issued a release in which he claims that “Edward Snowden and the PRISM scandal laid the ground” for including a prohibition against telecommunications and Internet companies transferring data to other countries’ governmental authorities unless otherwise permitted by EU law. Albrecht’s release offers 10 points to describe the draft Regulation that LIBE is scheduled to vote upon on October 21. If LIBE adopts the draft, the Parliament, Council, and Commission will begin work on negotiating the final legislation, which parliamentarians hope will be adopted before elections in May 2014.
A recent federal court opinion raises concerns that privacy cases alleging violations of a standard user license agreement may be susceptible to class certification. Last week, the U.S. District Court for the Northern District of Illinois certified a class in a consumer privacy lawsuit against comScore, Inc. Plaintiffs allege that comScore exceeded the scope of the […]
The European Union’s Article 29 Data Protection Working Party (“WP29“), which consists of the data protection authorities of the 27 European Union Member States, has published its “Opinion on Apps in Smart Devices“, adopted on 27 February 2013 (the “Opinion“).

Applicability of EU laws

According to WP29, the 1995 Data Protection Directive applies to all […]