A close observer of the GDPR will have noticed that, in several places, individual EU Member States can implement derogations from the GDPR requirements. Of course, as a regulation under EU law, the GDPR leaves less scope for local flexibility than the current EU Data Protection Directive 95/46. Yet in a number of key areas the GDPR does permit an EU Member State to enact local laws that give a more locally relevant flavour to particular aspects of compliance.
On 11 April 2017 the Cyberspace Administration of China published a circular calling for comments on its draft Measures for the Security Assessment of Personal Information and Important Data Transmitted Outside of the People’s Republic of China (the Draft Export Review Measures). Public comments are open through 11 May 2017.
The main legislative purpose of the Draft Export Review Measures is to clarify the process and requirements relating to the data localisation provisions in the Cyber Security Law, one of the most controversial aspects of that law. While the Draft Export Review Measures do add a significant level of implementing detail on the practicalities of compliance, we expect that for many multinational corporations with operations in, or doing business with, China, the clarifications do not go in the direction they would have wanted. In particular, the Draft Export Review Measures significantly expand the scope of the localisation requirement, potentially applying it to all businesses collecting data in China.
Hogan Lovells has released a guide highlighting the key provisions in the Draft Export Review Measures, including an overview of the significant points for commentary. The full guide is available here. Please refer to the contacts at the end of the guide for related inquiries.
The Article 29 Working Party held its April plenary meeting last week, where it continued its work preparing for the GDPR, adopted an opinion on the draft e-Privacy Regulation, and discussed the annual review of Privacy Shield. Highlights of the meeting included:
The UK ICO has published what it describes as a feedback request on profiling and automated decision-making, with the intention that responses will “help inform the UK’s contribution to the WP29 guidelines due to be published later this year.”
Given the growing importance of profiling to most businesses, companies should consider whether they wish to contribute their views, particularly on areas where they believe more guidance is needed on what the GDPR’s requirements mean in practical terms. For example, the GDPR focuses on profiling that has a “legal” or “significant” effect, and the ICO discussion paper contains its “initial thoughts” on what might constitute significant effects, which include “causing individuals to change their behaviour in a significant way.” As the ICO acknowledges, what amounts to a “legal” or “significant” effect can be somewhat subjective, so this is an opportunity for businesses that engage in profiling to put forward their opinions and influence future guidance.
The deadline for responses is 28 April.
If you care enough about privacy issues to be a regular reader of this blog, you probably know that one of the Big Changes under GDPR will be the introduction of “accountability” as a legal obligation, i.e. it will now be a requirement that a data controller is able to demonstrate its compliance with the principles relating to processing of personal data set out in Article 5 of the GDPR. You may even have started thinking about what this means for your organisation: how are you going to get your development teams to adopt privacy by design and default? What are you doing about data minimisation? Do you apply appropriate levels of encryption to your personal data? In our ever-more digitally driven world, it’s easy to get caught up in the sophisticated stuff, but a recent UK ICO decision reminds us that accountability is about the simple stuff as well. Which brings us to filing cabinets.
The Information Commissioner’s Office (ICO) has issued a £70,000 fine against Flybe and a £13,000 fine against Honda Motor Europe Ltd for breaching Regulation 22 of the Privacy and Electronic Communications Regulations (PECR) by sending emails asking individuals to update their marketing preferences. The two cases confirm important points about consent to electronic marketing under PECR.
The Federal Trade Commission (FTC) and National Highway Traffic Safety Administration (NHTSA) are co-hosting a workshop on June 28, 2017, to explore the privacy and security issues raised by automated and connected vehicle technologies. The agencies are looking to explore the types of data such technologies collect, store, transmit, and share; the potential benefits and challenges posed by the technologies; the privacy and security practices of vehicle manufacturers; the roles that federal agencies should play in regulating privacy and security issues; and how self-regulatory standards apply to connected vehicle privacy and security issues.
In advance of the workshop, the FTC and NHTSA are seeking public comment on privacy and security issues. Comments may be submitted through April 20, 2017, and the agencies have noted a number of topics of particular interest.
As previously reported, on Thursday, March 9th, the Federal Trade Commission (FTC) hosted a forum on the consumer implications of recent developments in artificial intelligence (AI) and blockchain technologies. This is the second of two entries on the March 9th FinTech Forum. Today’s post focuses on blockchain technologies. Coverage of the opening remarks and the AI discussion may be found here.
On Thursday, March 9th, the Federal Trade Commission (FTC) hosted a forum on the consumer implications of recent developments in artificial intelligence (AI) and blockchain technologies. This was the FTC’s third forum on issues in FinTech. Previous FinTech Forums covered marketplace lending and crowdfunding and peer-to-peer payments.
In opening remarks, the FTC acknowledged the benefits of technological developments in AI and blockchain technologies: AI promises better decision-making and personalized consumer technologies, while blockchain technologies could increase the efficiency of financial transactions and eliminate the need for intermediaries, among other benefits. But the FTC stressed that advancements in these technologies must be coupled with an awareness of, and active engagement in, identifying and minimizing the associated risks. For AI, this means countering biased or incomplete results, improving the transparency of decision-making, and addressing a general lack of consumer awareness and understanding. For blockchain, it means strengthening data security, increasing oversight, and preventing abuse of the technology. The need to carefully consider the challenges raised by these technological advancements was echoed by panelists throughout the forum, suggesting that the FTC will likely expect companies in these industries to have assessed, and taken steps to mitigate, the novel risks they face as they continue to innovate in these spaces.
This is the first of two entries on the March 9th FinTech Forum. Today’s post focuses on Artificial Intelligence, with coverage of blockchain technologies to follow.
The UK Information Commissioner’s Office has just published draft guidance on consent under GDPR. This is an interesting move given that the Article 29 Working Party has promised guidance on the same topic later this year, but reading the guidance makes it clear why the ICO decided to prioritise it: many of the practices which it identifies as unacceptable are fairly common in the UK, meaning many companies are going to have to re-think their approach to legitimising their data processing.