The European Data Protection Board (EDPB) has recently published its Opinion on the (United Kingdom) Information Commissioner’s list of processing activities which would require a Data Protection Impact Assessment under the GDPR. In its Opinion, the EDPB appears to be moving away from the idea that processing of genetic or location data, on its own, might be enough to trigger the mandatory DPIA requirements of the GDPR. This news will perhaps come as a relief to organisations currently struggling to come to grips with the “new” DPIA process and the resources and time that it demands. But, should we be surprised by the EDPB’s Opinion and will it have a significant impact in practice on the way organisations consider and conduct DPIAs?
The draft text of the EU-UK withdrawal agreement was published by the UK Government and the European Union yesterday, providing some of the first concrete indicators of the possible direction of travel in the area of data protection. In this post, we discuss ten initial conclusions from the draft text.
Unless there is a political earthquake (some would say a miracle), Brexit will happen on 29 March 2019. Upon Brexit the UK will cease to be an EU Member State and become a so-called ‘third country’. As a result, UK-based organisations, which in the context of transfers of personal data to countries outside the EU have always been exporters, will become importers of data originating from the EU. This is a serious concern because transfers of personal data from the EU to third countries are severely restricted. So a key UK Government objective from day one has been to ensure that the UK is regarded as an adequate jurisdiction, which would allow unconstrained transfers of personal data from the EU. But will it be?
The UK Government has announced a new three-tier charging structure for data controllers to ensure the continued funding of the Information Commissioner’s Office. The new structure will come into effect on 25 May 2018, coinciding with the GDPR becoming applicable.
To date, the main legacy of the Brexit referendum of 2016 appears to be a country split in half: some badly wish the UK would continue to be a member of the EU and some are equally keen to leave. Yet, there seems to be at least one thing on which Remainers and Leavers will agree: nobody knows exactly what is going to happen. The same is true of the effect of Brexit on UK data protection. However, as Brexit day approaches, it is becoming imperative for those with responsibility for data protection compliance to make some crucial strategic decisions. To help with that process, here are some pointers about what we know and what we don’t know.
On September 13, the U.K. government introduced in Parliament the Data Protection Bill. The main aim of the bill is to implement the General Data Protection Regulation (EU) 2016/679 into U.K. domestic law. However, as perhaps reflected in the length and complexity of the bill, it is also intended to do several other things. This post outlines key observations on the structure and content of the bill.
The Information Commissioner’s Office ruled, on 3 July 2017, that the Royal Free NHS Foundation Trust had failed to comply with the Data Protection Act 1998 when it provided the details of 1.6 million patients to Google DeepMind as part of a trial diagnosis and detection system for acute kidney injury, and required the Trust to sign an undertaking. The investigation brings together some of the most potent and controversial issues in data privacy today: sensitive health information and its use by the public sector to develop solutions, combined with innovative technology driven by a sophisticated global digital company. This analysis provides insight into the investigation into Google DeepMind, with a focus on how the General Data Protection Regulation may impact the use of patient data going forward.
You may not have noticed it, but despite all of the distractions caused by Brexit and the General Data Protection Regulation (Regulation (EU) 2016/679), the UK Information Commissioner’s Office has been extremely active on the enforcement front in recent times. One of the features of this activity has been the variety of infringements targeted and, in particular, the focus on e-mail marketing. More specifically, the ICO has taken enforcement action by way of monetary penalties against well-known consumer brands such as Flybe, Honda, Morrisons and Moneysupermarket, for practices that might not have attracted scrutiny in the past. Given the ICO’s current tough stance on direct marketing practices, it would not be surprising to see further enforcement actions in this area.
The Digital Economy Bill passed into UK law on Thursday 27 April 2017 amidst the flurry of activity known as the ‘wash up’ period before the dissolution of Parliament, ahead of the early general election in the UK to be held on 8 June. The Digital Economy Act introduces measures to “modernise the UK for enterprise,” and includes plans for public sector data sharing, direct marketing and age verification for online pornography, amongst other measures. An overview of these measures is set out in this post.
The UK ICO has published what it describes as a feedback request on profiling and automated decision-making, with the intention that responses will “help inform the UK’s contribution to the WP29 guidelines due to be published later this year.” The deadline for responses is 28 April.
If you care enough about privacy issues to be a regular reader of this blog, you probably know that one of the Big Changes under GDPR will be the introduction of “accountability” as a legal obligation, i.e. it will now be a requirement that a data controller is able to demonstrate its compliance with the principles relating to processing of personal data set out in Article 5 of the GDPR. You may even have started thinking about what this means for your organisation: how are you going to get your development teams to adopt privacy by design and default? What are you doing about data minimisation? Do you apply appropriate levels of encryption to your personal data? In our ever-more digitally driven world, it’s easy to get caught up in the sophisticated stuff, but a recent UK ICO decision reminds us that accountability is about the simple stuff as well. Which brings us to filing cabinets.
The Information Commissioner’s Office has issued a £70,000 fine against Flybe and a £13,000 fine against Honda Motor Europe Ltd for breaching Regulation 22 of the Privacy and Electronic Communications Regulations by sending emails asking individuals to update their marketing preferences.
The UK Information Commissioner’s Office has just published draft guidance on consent under GDPR. This is an interesting move given that the Article 29 Working Party has promised guidance on the same topic later this year, but reading the guidance makes it clear why the ICO decided to prioritise it: many of the practices which it identifies as unacceptable are fairly common in the UK, meaning many companies are going to have to re-think their approach to legitimising their data processing.
Last week, the UK’s Information Commissioner’s Office published a monetary penalty notice, which fined a private healthcare company, HCA International, £200,000 for its failure to keep sensitive data secure.
Data brokers are organisations that obtain data from a variety of sources and then sell or license it to third parties. Many trade in personal data, which is purchased by their customers for several purposes, most commonly to support marketing campaigns. The UK data protection regulator has for some time been actively enforcing against organisations that buy individuals’ personal data for direct marketing purposes without first conducting appropriate due diligence to ensure that those individuals have adequately consented to receiving marketing communications. However, in a recently issued monetary penalty notice, the ICO indicated that it may be shifting its enforcement strategy. This post discusses the latest developments.
The UK Information Commissioner’s Office (the “ICO”) recently published further guidance on encryption on its blog. The ICO has taken the position for some time that if a business holds sensitive personal information on portable or mobile devices, it should protect that information using appropriate encryption software. If that does not occur and such information is compromised, the ICO has stated that it may pursue regulatory action. The guidance does not modify the ICO’s position on encryption, but it does explain in layman’s terms what the ICO means by encryption and the different types of encryption that are available, so non-technical data protection officers may find it a helpful introduction to this topic.