With the deadline for a no-deal Brexit looming—the UK’s exit date from the European Union is now slated for April 12—companies certified to the EU-U.S. Privacy Shield should update their Privacy Shield privacy policies if they have not done so already to ensure that they are able to lawfully receive personal data from the UK post-Brexit.
Right now, the whole of the U.K. appears to be standing in the same spot, looking over a precipice. However, this is not the moment to look away. As politicians struggle to find a magic formula for a prosperous Brexit, businesses are stepping up their efforts to mitigate the damage of a possible “no-deal Brexit.” The data protection community is no different. The proposed withdrawal agreement would have preserved the status quo in data protection terms, at least until the end of the transition period in December 2020. However, if the U.K. leaves the EU without a deal, the implications for international data flows and privacy compliance generally will be severe. Therefore, British pragmatism demands an urgent and thorough approach to preparing for the eventuality of a no-deal Brexit.
The European Data Protection Board (EDPB) has recently published its Opinion on the (United Kingdom) Information Commissioner’s list of processing activities which would require a Data Protection Impact Assessment under the GDPR. In its Opinion, the EDPB appears to be moving away from the idea that processing of genetic or location data, on its own, might be enough to trigger the mandatory DPIA requirements of the GDPR. This news will perhaps come as a relief to organisations currently struggling to come to grips with the “new” DPIA process and the resources and time that it demands. But, should we be surprised by the EDPB’s Opinion and will it have a significant impact in practice on the way organisations consider and conduct DPIAs?
The draft text of the EU-UK withdrawal agreement was published by the UK Government and the European Union yesterday, providing some of the first concrete indicators of the possible direction of travel in the area of data protection. In this post, we discuss ten initial conclusions from the draft text.
Unless there is a political earthquake (some would say a miracle), Brexit will happen on 29 March 2019. Upon Brexit, the UK will cease to be an EU Member State and become a so-called ‘third country’. As a result, UK-based organisations, which in the context of transfers of personal data to countries outside the EU have always been exporters, will become importers of data originating from the EU. This is a serious concern because transfers of personal data from the EU to third countries are severely restricted. So a key UK Government objective from day one has been to ensure that the UK is regarded as an adequate jurisdiction, which would allow unconstrained transfers of personal data from the EU. But will it be?
The UK Government has announced a new three-tier charging structure for data controllers, intended to ensure the continued funding of the Information Commissioner’s Office. The new structure will come into effect on 25 May 2018, to coincide with the GDPR becoming applicable.
To date, the main legacy of the Brexit referendum of 2016 appears to be a country split in half: some badly wish the UK would continue to be a member of the EU and some are equally keen on making a move. Yet, there seems to be at least one thing on which Remainers and Leavers will agree: nobody knows exactly what is going to happen. The same is true of the effect of Brexit on UK data protection. However, as Brexit day approaches, it is becoming imperative for those with responsibility for data protection compliance to make some crucial strategic decisions. To help with that process, here are some pointers about what we know and what we don’t know.
On September 13, the U.K. government introduced in Parliament the Data Protection Bill. The main aim of the bill is to implement the General Data Protection Regulation (EU) 2016/679 into U.K. domestic law. However, as perhaps reflected in the length and complexity of the bill, it is also intended to do several other things. This post outlines key observations on the structure and content of the bill.
The Information Commissioner’s Office ruled, on 3 July 2017, that the Royal Free NHS Foundation Trust had failed to comply with the Data Protection Act 1998 when it provided 1.6 million patient records to Google DeepMind as part of a trial diagnosis and detection system for acute kidney injury, and required the Trust to sign an undertaking. The investigation brings together some of the most potent and controversial issues in data privacy today: sensitive health information and its use by the public sector to develop solutions, combined with innovative technology driven by a sophisticated global digital company. This analysis provides insight into the investigation of Google DeepMind, with a focus on how the General Data Protection Regulation may affect the use of patient data going forward.
You may not have noticed it, but despite all of the distractions caused by Brexit and the General Data Protection Regulation (Regulation (EU) 2016/679), the UK Information Commissioner’s Office has been extremely active on the enforcement front in recent times. One of the features of this activity has been the variety of infringements targeted and, in particular, the focus on e-mail marketing. More specifically, the ICO has taken enforcement action by way of monetary penalties against well-known consumer brands such as Flybe, Honda, Morrisons and Moneysupermarket, for practices that might not have attracted much scrutiny in the past. Given the tough stance the ICO is currently taking on direct marketing practices, further enforcement actions in this area would come as no surprise.
The Digital Economy Bill passed into UK law last Thursday 27 April 2017 amidst the flurry of activity known as the ‘wash up’ period before the dissolution of Parliament and ahead of the early general election in the UK to be held on 8 June. The Digital Economy Act introduces measures to “modernise the UK for enterprise,” and includes plans for public sector data sharing, direct marketing and age verification for online pornography, amongst other measures. An overview of these measures is set forth in this post.
The UK ICO has published what it describes as a feedback request on profiling and automated decision-making, with the intention that responses will “help inform the UK’s contribution to the WP29 guidelines due to be published later this year.” The deadline for responses is 28 April.
If you care enough about privacy issues to be a regular reader of this blog, you probably know that one of the Big Changes under GDPR will be the introduction of “accountability” as a legal obligation, i.e. it will now be a requirement that a data controller is able to demonstrate its compliance with the principles relating to processing of personal data set out in Article 5 of the GDPR. You may even have started thinking about what this means for your organisation: how are you going to get your development teams to adopt privacy by design and default? What are you doing about data minimisation? Do you apply appropriate levels of encryption to your personal data? In our ever-more digitally driven world, it’s easy to get caught up in the sophisticated stuff, but a recent UK ICO decision reminds us that accountability is about the simple stuff as well. Which brings us to filing cabinets.
The Information Commissioner’s Office has issued a £70,000 fine against Flybe and a £13,000 fine against Honda Motor Europe Ltd for breaching Regulation 22 of the Privacy and Electronic Communications Regulations by sending emails asking individuals to update their marketing preferences.
The UK Information Commissioner’s Office has just published draft guidance on consent under GDPR. This is an interesting move given that the Article 29 Working Party has promised guidance on the same topic later this year, but reading the guidance makes it clear why the ICO decided to prioritise it: many of the practices which it identifies as unacceptable are fairly common in the UK, meaning many companies are going to have to re-think their approach to legitimising their data processing.
Last week, the UK’s Information Commissioner’s Office published a monetary penalty notice, which fined a private healthcare company, HCA International, £200,000 for its failure to keep sensitive data secure.
Data brokers are organisations that obtain data from a variety of sources and then sell or license it to third parties. Many trade in personal data, which their customers purchase for several purposes, most commonly to support marketing campaigns. The UK data protection regulator has for some time been actively enforcing against organisations that buy individuals’ personal data for direct marketing purposes without first conducting appropriate due diligence to ensure that those individuals have adequately consented to receiving marketing communications. However, in a recently issued monetary penalty notice, the ICO indicated that it may be shifting its enforcement strategy. This post discusses the latest developments.
The UK’s Information Commissioner’s Office is known to prefer an “engaging” rather than an enforcement approach with organisations. However, when looking at the “action we’ve taken” page on the ICO website the ICO’s enforcement activity seems to be increasing by the day. While the ICO has stated that it wants to focus its enforcement efforts going forward on unsolicited marketing, such as nuisance messages and calls, breaches of security requirements have to date attracted the majority of the ICO’s enforcement attention. Therefore, organisations operating in the UK would be well-served to focus on understanding and adhering to the ICO’s expectations for data security compliance.
The UK Information Commissioner and the Secretary of State for Justice have entered into Memoranda of Understanding on the handling of information requests in relation to national security cases under the UK’s Data Protection Act, Freedom of Information Act and Environmental Information Regulations. The new Memoranda set out guidelines as to how the Information Commissioner’s Office and government departments will cooperate with one another in cases where a government department refrains from disclosing information to an individual, or the ICO, on the basis of national security.
The continued uncertainty around the draft EU Data Protection Regulation presents something of a challenge for data controllers. It’s clear that it could require them to make significant changes to how they handle individuals’ data, but the ongoing fundamental political disagreements make it difficult to predict which changes will make it into the final form of the legislation. So it is interesting to see the recommendations on the UK ICO’s blog on where to start in preparing for reforms, highlighting three areas: consent, breach notification, and privacy by design.
On 14 October, the Article 29 Working Party of EU data protection commissioners published a Working Document providing guidance on obtaining consent for cookies, some eighteen months after the effective date of the so-called “cookie consent law,” which required EU websites to obtain consent from Internet users before placing cookies on their devices. The document analyses, to some extent, the practices most commonly used by website operators to obtain the required consent, and attempts to answer the question of what measures would “be legally compliant for a website operating across all EU Member States.”
The UK Information Commissioner’s Office (the “ICO”) recently published further guidance on encryption on its blog. The ICO has taken the position for some time that if a business holds sensitive personal information on portable or mobile devices, it should protect that information using appropriate encryption software. If that does not occur and such information is compromised, the ICO has stated that it may pursue regulatory action. The guidance does not modify the ICO’s position on encryption, but it does explain in layman’s terms what the ICO means by encryption and the different types of encryption that are available, so non-technical data protection officers may find it a helpful introduction to this topic.
The UK First Tier Tribunal issued a decision on August 21 finding that the Information Commissioner’s Office (ICO) was wrong to impose a £250,000 fine on Scottish Borders Council in relation to an incident where pension records of former Council employees were discovered overflowing from recycling bins outside a local supermarket. The Tribunal held that the contravention, while serious, was not of a kind likely to cause substantial damage or substantial distress, which is a requirement for imposing such a penalty. The decision may have implications for the ICO’s approach to imposing monetary penalties in the future.