Territoriality will continue to be one of the most vexing problems for data regulation in 2018. One aspect of this debate relates to whether a U.S. judge can compel the disclosure of personal data located in Europe without using international treaty mechanisms. This issue is currently being considered by the United States Supreme Court in the case United States v. Microsoft. The case involves the question of whether a U.S. statute relating to search warrants can be interpreted as extending to a search for data located outside the United States; in this case, the data is located in Ireland. The U.S. Court of Appeals found that, in the absence of express wording in the statute relating to extraterritorial application, the statute should be interpreted as being limited to searches conducted within the territory of the United States. The Supreme Court is currently reviewing the case. In December 2017, the European Commission filed an amicus brief urging the Supreme Court to give due consideration to the principles of international comity and territoriality when interpreting the U.S. statute.
Hot on the heels of the European Commission’s official review of the functioning of the EU-U.S. Privacy Shield framework, the Article 29 Working Party of EU data protection regulators has issued its own report on the matter. The summary of findings by the Working Party, which draws from both written submissions and oral contributions, begins by commending U.S. authorities for their efforts in establishing a procedural framework to support the operation of Privacy Shield but quickly shifts to the Working Party’s concerns. Should the concerns not be addressed by the time of the second joint review, the Working Party notes that its members will “take appropriate action,” including bringing the Privacy Shield adequacy decision before national courts for reference to the Court of Justice of the European Union for a preliminary ruling.
The complexity of the EU General Data Protection Regulation is often alleviated by the guidance of regulatory authorities, who contribute their practical interpretation of the black letter of the law and provide welcome certainty. However, the latest draft guidelines issued by the Article 29 Working Party on automated decision-making have thrown up a particular curveball that bears further investigation. It relates to whether Article 22(1) of the GDPR should be read as a right available to data subjects or as a straightforward prohibition for controllers.
The steady trickle of GDPR guidance from the Article 29 Working Party continues. Fresh from finalising its guidance on data portability, lead supervisory authorities and data protection officers, the Working Party has published draft guidance on data protection impact assessments, the full text of which is available on the Working Party website. Comments can be submitted to the Working Party by 23 May 2017, after which the guidance will be finalised.
The Article 29 Working Party held its April plenary meeting last week, where it continued its work preparing for the GDPR, adopted an opinion on the draft e-Privacy Regulation, and discussed the annual review of Privacy Shield.
The UK Information Commissioner’s Office has just published draft guidance on consent under GDPR. This is an interesting move given that the Article 29 Working Party has promised guidance on the same topic later this year, but reading the guidance makes it clear why the ICO decided to prioritise it: many of the practices which it identifies as unacceptable are fairly common in the UK, meaning many companies are going to have to re-think their approach to legitimising their data processing.
No one could accuse the EU Article 29 Working Party of not delivering as promised. Following its recently held December plenary meeting, the WP29 has released three separate guidelines setting out its interpretation of some key aspects of the General Data Protection Regulation, namely: data portability, data protection officers, and lead supervisory authorities. At the same time, the WP29 has confirmed its role as the “EU centralised body” for handling individual complaints under the Privacy Shield and the re-establishment of its enforcement subgroup in charge of coordinating cross-border enforcement actions. We explore the three guidelines in this post.
The European Commission has actively promoted the importance of mHealth following its 2014 consultation. One of the initiatives to emerge from the Commission has been the Privacy Code of Conduct for mHealth apps. The Code was drafted by a working group set up in January this year, and the final draft was published on 7 June and submitted to the Article 29 Working Party for its consideration and approval. If and when it receives the Working Party’s approval, it could then be relied upon by app developers wishing to demonstrate a good standard of data protection compliance. The Code is an example of the type of initiative that is increasingly likely to develop under the forthcoming EU General Data Protection Regulation.
The February 29, 2016 announcement of the new EU-U.S. data transfer framework—the Privacy Shield—was accompanied by over 130 pages of documentation and significantly more operational details than its predecessor, Safe Harbor. We have reviewed the Privacy Shield materials and published a comprehensive breakdown of the changes from Safe Harbor to Privacy Shield and the practical impact on business: Inside the New and Improved EU-U.S. Data Transfer Framework.
Connected cars can generate large volumes of data, including data on engine performance, location, and driver behaviour. The European Commission has convened multi-stakeholder groups to figure out how to organize access to that data in a safe, competitively neutral, and privacy-friendly way. Two recent reports shed light on the principles under consideration for data sharing infrastructures in the EU. And legislative and regulatory developments in the EU will likely have a substantial impact on connected car deployments.
Following the announcement by the European Commission of the newly agreed EU-US Privacy Shield, the missing piece of the jigsaw was the Article 29 Working Party’s stance on the adequacy of the existing mechanisms in place—in particular, standard contractual clauses and binding corporate rules. So after two days of intense discussions, the Working Party has issued a statement with its latest position, a follow-up to its original reaction to the invalidation of Safe Harbor last October. The bottom line: the Working Party still does not view US government surveillance laws as sufficiently protective of privacy—a position which calls into question all transfers of personal data to the US, regardless of the methods used to legitimise the transfer—but it will reconsider this position in light of the Privacy Shield in the coming months.
It’s close to 7pm on a Friday evening and my team are trying their best to manage our clients’ stress and frantic desperation. Jokes about how much they love Max Schrems are shared by email. In the meantime, we are diligently working our way through endless charts of dataflows and attempting to cover every single […]
The EU General Data Protection Regulation has been called the most lobbied piece of legislation in the history of the EU. Before Christmas last year, what is likely to be the final text of the GDPR emerged from the EU trilogue negotiations. Victoria Hordern, Senior Associate at Hogan Lovells, explores what the new GDPR will mean for those collecting and handling health data, and examines a number of the provisions and themes that impact the use of health data.
The EU’s Article 29 Working Party issued a statement today on the recent Schrems decision invalidating the adequacy of the EU-U.S. Safe Harbor framework, emphasizing that affected businesses should start to put in place legal and technical solutions in a timely manner to meet EU data protection standards. The statement gave a January 2016 deadline for companies to come into compliance with the ruling, at which point EU data protection authorities would be “committed to take all necessary and appropriate actions, which may include coordinated enforcement actions.” In response, we publish here a high-level analysis of the possible options available for companies—including the EU Standard Contractual Clauses, Intra-Group Agreements and other ad-hoc contracts, Binding Corporate Rules, Safe Harbor 2.0, and consent—and the pros and cons of choosing each one.
In our previous post we outlined the key issues regarding mHealth devices and services from a privacy law perspective. Now, we go further into the details and discuss the scope of the personal data involved, especially relating to sensitive health data. We introduce the relevant statutory requirements in the EU and the legal opinions of the Article 29 Working Party and the European Data Protection Supervisor, and look ahead to the upcoming European General Data Protection Regulation. Against this legal background, one core question we will examine is whether information collected and processed by lifestyle apps and devices must be classified as health data and fall under the strict requirements of European data protection laws.
Thank you to everyone who participated in today’s webinar “Safe Harbor Invalidated – What Next?”, in which we analyzed the implications of yesterday’s decision by the Court of Justice of the European Union invalidating the EU-U.S. Safe Harbor Framework. A copy of the slide deck and a link to a recording of the webinar are attached to this post.
Following on from the Article 29 Working Party’s Opinion in June, the European Data Protection Supervisor has now published his own recommendations for the proposed General Data Protection Regulation. Unsurprisingly, given that the EDPS is a member of the Working Party, the views expressed are in line with that Opinion. At this point you may be tempted to stop reading, but wait, there is more. In addition to expressing his vision of the GDPR (more on which below) and producing his own recommendations for every single article of the GDPR, the EDPS has demonstrated his commitment to practicality by making this all available as a mobile app. The app allows you to select which of the drafts you wish to see side by side, scroll rapidly through the contents to select a particular article, or search on the whole text so you can see at a glance what each version says, for example, about pseudonymisation or profiling. Whilst the app may have limited appeal, and is unlikely to keep small children entertained on long car journeys, it will be a thing of joy for its target audience.
The mobile health sector is rapidly developing and revolutionising the healthcare market. More and more consumers share information such as medical and physiological conditions, lifestyles, daily activity and geolocation via all kinds of health-related mobile applications and devices. The growing success of mHealth, however, inevitably casts a spotlight on compliance with privacy protection laws. Data protection agencies and supervisory bodies in the EU recently raised concerns about the collection, processing and use of customers’ data by mHealth apps and mobile devices. This blog introduces the key hot spots involving mHealth and data protection laws, before we dig deeper into these issues in a series of posts on this blog over the coming weeks.
Accountability has been described by the Article 29 Working Party as a way of “showing how responsibility is exercised and making this verifiable”. Accountability is far from being a new concept. It was introduced back in 1980 in the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. This entry is an excerpt from Hogan Lovells’ “Future-proofing privacy: A guide to preparing for the EU Data Protection Regulation.”
On 9 March, the Council of the EU issued a partial general approach on a key chapter of the EU Data Protection Regulation which has implications for the regulation of health data. The Council’s stance has been welcomed by a number of healthcare commentators as it promotes a more flexible approach to the use of health data and accords with the tenor of the revised version of the draft Regulation that emerged from the Council in December last year.
The Court of Justice of the European Union has today published its decision in the case of Ryneš and has found that domestic CCTV which films a public area cannot be exempt from the obligations contained in the EU Data Protection Directive by virtue of the “household exemption”.
Addressing the French Parliamentary Commission on Digital Rights, CNIL and Article 29 Working Party Chair Isabelle Falque-Pierrotin commented on the current state of negotiations of the proposed European General Data Protection Regulation, warning that excessive reliance on a risk-based approach could undermine fundamental rights. A risk analysis is useful as a guide to allocate resources, but should not affect the underlying rights of the data subject, she said. To illustrate her point, Falque-Pierrotin used the analogy of a home owner who lives in a part of the city where burglaries are frequent. The risk-based approach means that the home owner will buy more locks for doors, and that police authorities may devote more resources to patrolling. It does not mean, however, that home owners have different rights depending on where they live. Falque-Pierrotin is concerned that the current negotiations on the risk-based approach may confuse these two concepts, leading to a situation where individuals’ rights are reduced or ignored for low-risk processing.
The “one-stop-shop” EU data protection regulator was originally presented as one of the fundamental pillars of the future Data Protection Regulation, but now hangs in the balance in the EU legislative process. This post provides the latest on the status of one-stop-shop in the Council of the EU, where it is currently being debated.
The Article 29 Working Party’s new opinion on anonymization techniques provides a useful primer on randomization and generalization (i.e., data aggregation) techniques used to anonymize data sets. The opinion analyzes each technique against three re-identification risks: the ability to single out individuals after the anonymization technique has been applied; the linkability of the anonymized data sets to other data sets; and finally the possibility of inferring information about individuals from the anonymized data sets. Organizations depending on anonymization for compliance with the Data Protection Directive would be well advised to review their anonymization processes to determine if they comport with the standards set out in the opinion.