03 Mar 2023

Data Protection update – February 2023


Welcome to the Stephenson Harwood Data Protection bulletin, covering the key developments in data protection law from February 2023.

While February may be the shortest month of the year, it was far from quiet on the data protection front as regulators across the UK, US and EU stepped up enforcement efforts.

In the UK, the Information Commissioner's Office experienced both wins and setbacks. It helpfully clarified its approach to regulating the use of Facial Recognition Technology, setting out a number of recommendations after a local council was found using the technology to manage "cashless catering" in school canteens. But the regulator also suffered a setback after a tribunal upheld an appeal from Experian over the credit reference agency's handling of personal data for direct marketing purposes. The First-Tier Tribunal rejected the ICO's findings that Experian's privacy notice lacked transparency and that using credit reference data for direct marketing purposes was inherently unfair.

In the US, the Federal Trade Commission announced its first enforcement action under its Health Breach Notification Rule, which requires vendors of personal health records to notify consumers following a data breach revealing unsecured information. It fined digital healthcare platform GoodRx Holdings $1.5 million after finding that the company had shared individuals' health data with Facebook, Google and others.

In the EU, the European Commission helpfully set out a plan to streamline cooperation between the data protection authorities of member states. The European Commission has yet to publish details on how the initiative will work in practice, but as it seeks to harmonise the handling of cross-border cases it could mean big changes to the 'one-stop-shop' regime in this area – so watch this space.

In this month's issue:

Data protection

Cyber security

Enforcement

Civil litigation

Data protection

European Commission to introduce bi-monthly progress checks on large-scale GDPR cases

Following a complaint by the Irish Council for Civil Liberties ("ICCL") and subsequent suggestions by the European Ombudsman, the European Commission has confirmed that it will require all EEA national supervisory data protection authorities to share details of large-scale cross-border investigations under the EU General Data Protection Regulation ("EU GDPR") with it on a bi-monthly basis. All information will be shared on a confidential basis, giving the European Commission greater oversight of the enforcement of the EU GDPR and the protection of data subjects' rights throughout the EEA.

The European Commission's enhanced supervision over the actions of data supervisory authorities looks set to accelerate investigations and enforcement action and could ensure a more harmonised approach to enforcement action under the EU GDPR. ICCL senior fellow, Dr Johnny Ryan, has commented that the increased supervision by the European Commission signals "the beginning of true enforcement of the GDPR, and of serious European enforcement against Big Tech".

European Commission planning to streamline cooperation between national data protection authorities

The European Commission has also proposed a legislative initiative to streamline cooperation between the data protection authorities of member states. The European Commission has stated that this initiative will promote a more harmonised approach by data protection authorities when considering cross-border cases. This could amount to an effective overhaul of the 'one-stop shop' regime currently in practice, implementing a new regime for consistency across member states. The European Commission has yet to publish details on how this initiative will work in practice, but it has opened a 'call for evidence', running from 24 February 2023 to 24 March 2023, to receive feedback to "develop and fine-tune" the initiative.

Feedback can be submitted to the European Commission here.

People's Republic of China introduces new SCCs for data transfers from China

On 24 February 2023, the Cyberspace Administration of China officially published the Measures on the Standard Contract for Export of Personal Information ("Measures on Standard Contract"). A standard contract for the export of personal data from the People's Republic of China ("PRC") to other countries ("PRC Standard Contract"), similar to the Standard Contractual Clauses ("SCCs") used under the EU GDPR, is attached to the Measures on Standard Contract, which will be effective from 1 June 2023. 

Unlike the SCCs, the PRC Standard Contract only has one universal template regardless of the roles of the data importer and data exporter as controller or processor. The PRC Standard Contract cannot be adopted where the data exporter is a critical information infrastructure operator (such as a telecoms or public utility provider), is processing the personal data of more than 1 million data subjects, has exported the personal data of more than 100,000 individuals (in aggregate) since 1 January of the preceding year, or has exported the sensitive personal data of more than 10,000 individuals (in aggregate) since 1 January of the preceding year. In these instances, the data exporter will instead be required to carry out a mandatory data export security assessment.

For further detail on the Measures on Standard Contract and the PRC Standard Contract, an alert from our associate firm in the PRC (Wei Tu) is available on our data protection hub.

MEPs and EDPB express their views on the draft EU-US Data Privacy Framework

In December last year, the European Commission published a draft adequacy decision ("Draft Decision") endorsing the proposed EU-US Data Privacy Framework ("Framework"). A short article on the Draft Decision can be found here. That action came after US President Joe Biden signed Executive Order 14086 on Enhancing Safeguards for U.S. Signals Intelligence Activities ("EO").

In the last month, there have been two developments in relation to the Draft Decision and the Framework. The first came in the form of a draft motion of the Members of the European Parliament ("MEPs") making up the Committee on Civil Liberties, Justice and Home Affairs ("LIBE Committee") on 14 February 2023 (the "Motion"), which states that the Framework "fails to create actual equivalence in the level of protection" and urges the European Commission "not to adopt the adequacy finding".

The Motion takes the view that the Draft Decision does not offer EU businesses legal certainty, with the LIBE Committee remaining concerned about the Framework becoming subject to future legal challenges. Some of the potential areas for future challenge are identified as follows:

  • EU and US laws have differing concepts of necessity and proportionality;
  • The redress process under the Framework does not provide sufficient transparency and impartiality to data subjects;
  • The EO can be amended at any time by the US president and, unlike all other third-country beneficiaries of adequacy decisions, the US does not have a federal data protection law;
  • The EO does not address two key concerns raised by the CJEU in Schrems II. Namely, it neither prohibits the bulk collection of data by public authorities nor does it apply to data accessed by public authorities via other means, such as under the US Cloud Act or the US Patriot Act, by commercial data purchases, or by voluntary data sharing agreements.

The Motion has since been followed by a non-binding opinion of the European Data Protection Board ("EDPB") on 28 February 2023 ("EDPB Opinion") which highlights the "substantial improvement" in the Framework when compared to the old Privacy Shield regime. Unlike the Motion, the EDPB Opinion is overall fairly positive about the Framework and resulting Draft Decision, acknowledging that the test of essential equivalence under the EU GDPR does not require data protection safeguards in the US to be identical to those in the EU. In this regard, the EDPB makes some suggested improvements in relation to how the Framework interacts with US laws, including in relation to data subject rights requests, onward transfers and profiling or automated decision making. Critically, the EDPB recognises that these suggestions are not the issues that were criticised by the CJEU in Schrems II, but instead speak to pre-existing concerns over the overall equivalence of US law to the EU GDPR.

The EDPB Opinion also sets out some areas in which the Framework falls short, as far as access to personal data by US national security authorities is concerned. The EDPB refers to its European Essential Guarantees for surveillance measures from 2021 ("Guarantees") and proposes that the European Commission adopt the Draft Decision on the condition that US intelligence agencies implement updated policies and procedures that meet the requirements of the Guarantees.

So, what is next? The European Commission needs the Draft Decision to be approved by a committee composed of representatives of the EU member states. In addition, the European Parliament has a right of scrutiny over adequacy decisions. We now await a final vote by the European Parliament on the Draft Decision (which is expected to take place during the spring). If approved, the Framework could be operational as soon as July.

Irish DPC takes the EDPB to court to have direction annulled

The European Commission's new approach of enhanced supervision (as reported on above) follows recent controversy regarding the Irish Data Protection Commission's ("DPC") handling of complaints against Big Tech companies. The DPC responded to recent criticisms of its own decision-making processes by criticising the EDPB's oversight of the DPC's enforcement action against Meta Platforms Ireland Limited ("Meta"), as reported in our last bulletin. Subsequently, the DPC has now issued multiple claims against the EDPB in the Court of Justice of the European Union ("CJEU"). Although the details of these claims have not yet been published, they are likely to have been made pursuant to Article 263 of the Treaty on the Functioning of the European Union, which allows the CJEU to examine the legality of legal acts of bodies, offices or agencies of the Union that produce legal effects, in a process similar to that of judicial review in the United Kingdom.

As reported in our December 2021/January 2022 bulletin, Meta attempted a similar course of action in relation to the EDPB's decision on WhatsApp which led the DPC to issue a €225 million fine against Meta. However, in December 2022 the CJEU ruled the claim brought by Meta to be "inadmissible".

It remains to be seen what the DPC's precise arguments will be and whether or not these arguments will be enough to convince the CJEU that the EDPB "overreached" its authority by requiring the DPC in its decision to initiate a fresh investigation into Meta's platforms' processing of special category personal data.

European telecom giants to form ad tech joint venture

A group of four of Europe's largest telecoms providers have announced their plans to form a joint venture company aimed at providing a privacy-by-design digital marketing technology platform ("Platform"). Each of the telecoms providers – Deutsche Telekom AG, Orange SA, Telefónica SA and Vodafone Group Plc – will hold a 25% stake in the joint venture.

In a statement Vodafone noted that the Platform has been designed with a focus on compliance with the EU GDPR and European Directive 2002/58/EC ("ePrivacy Directive"), as well as increased transparency for consumers using the Platform. Consumers will only receive communications from brands where they have given affirmative opt-in consent and will also be able to revoke this consent with a single click through the Platform. The Platform is already undergoing a trial in Germany to evaluate the Platform's transparency and the control it gives to consumers.

In the UK, digital marketing is governed by the UK GDPR, the Data Protection Act 2018 ("DPA 2018") and the Privacy and Electronic Communications (EC Directive) Regulations 2003 ("PECR"), which implements the ePrivacy Directive into UK law. As with their European equivalents, the UK GDPR, DPA 2018 and PECR require organisations carrying out digital marketing to obtain opt-in consent to receiving marketing communications, which most companies do by asking individual users to tick a box if they wish to receive such communications. Similarly, under Regulation 6 PECR, when using cookies or other tracking technologies (to enable targeted or behavioural digital marketing), organisations must tell people that the cookies or tracking technologies are there, explain what they do and why, and get the individual's consent to store the cookie on their device.

The Platform appears to provide consumers with greater control over their consents by allowing users to manage their consents for multiple brands in one place. Moreover, the Platform is designed so that consumers' consents are anonymised when data is shared with brands in order to protect the consumers' personal data.

Whilst the Platform has not yet been adopted for commercial use, it appears to reflect a wider emphasis across the EEA and UK on protecting consumers' data protection rights in relation to digital marketing, including through the publication of updated guidance from the Information Commissioner's Office ("ICO") on e-mail direct marketing (as reported on in our October bulletin).

Cyber security

10 million JD Sports customers' personal data at risk following cyber-attack

Following a cyber-attack on JD Sports Fashion plc ("JD Sports"), the personal data of ten million customers who made online orders between November 2018 and October 2020 may be at risk. JD Sports made clear that while full payment card details should not be at risk, as it did not hold full card details for customers (only the last 4 digits of their card), personal data such as names, addresses, and phone numbers could be accessed by cyber-attackers.

Neil Greenhalgh, the Chief Operating Officer of JD Sports, apologised to affected customers and advised them to be "vigilant about potential scam emails, calls and texts", providing details on how to report these.

The cyber-attack has been reported to the ICO and JD Sports has stated that it is reviewing its cybersecurity with external specialists. This incident, along with the recent cyber-attack on Royal Mail (as discussed in our December 2022/January 2023 bulletin), is a stark warning to companies holding large amounts of customer personal data. Companies should ensure that they are appropriately safeguarding their systems and have a clear policy in place for preventing and managing cyber-attacks. While limiting the card information held (often by using third-party payment processors) helps to reduce the potential financial impact on data subjects, the ancillary information held by consumer-facing companies is generally more than enough to trigger obligations to notify both regulators and data subjects, which invariably leads to consequential reputational damage. Notably, several claimant law firms have posted data breach claim websites looking to bring claims on behalf of impacted data subjects, a common feature of large-scale publicly reported breaches.

New security-related obligations for Internet of Things products in the UK

On 6 December 2022, the Product Security and Telecommunications Infrastructure Act 2022 (the "Act") received Royal Assent and came into force. As discussed in our December 2021/January 2022 bulletin, the Act's purpose is to create a new regulatory regime for the security of consumer-connectable products and for the provision of electronic communications infrastructure.

The Act includes obligations and legal duties for manufacturers, authorised representatives, importers and distributors (the "Relevant Persons").

These duties for the Relevant Persons include duties to:

  • comply with certain security requirements;
  • make statements of compliance and retain a copy of the statement of compliance or a summary, which the Relevant Persons can be required to make available;
  • where relevant, take action if a manufacturer fails to comply, investigate a manufacturer's potential failure to comply, and not supply products of a manufacturer that fails to comply;
  • maintain records/maintain a record of investigations of non-compliance; and
  • for importers and distributors alone, not supply products that are believed to be non-compliant and prevent such products from being made available to customers in the UK.

The Minister for Digital, Culture, Media and Sport stated that the Act and its supporting legislation "will also strengthen cyber protection to make sure the UK has the strongest security regime for smart tech in the world". The Act is given practical effect by a range of penalties for non-compliance. Potential penalties include a fine of the greater of £10 million or 4% of qualifying worldwide revenue, which may be followed by fines of £20,000 per day for those who fail to remediate and comply with the Act once notified. The authority enforcing the Act will also be able to issue stop, recall or compliance notices, and failure to adhere to these will be a criminal offence.

While the Act is now in force, many of its substantive obligations will be brought into force through future regulations. These subsequent regulations are set to introduce obligations for manufacturers, importers and distributors, amongst others. The Government has stated that Relevant Persons will have at least 12 months to comply with the Act and any subsequent regulations, but organisations likely to be subject to this legislation are advised to start building the necessary organisational infrastructure to ensure compliance. Organisations should account for the costs associated with the new obligations under the Act, ensure that records of compliance are maintained, and ensure that remediation plans are feasible.

The Act can be read in full here.

Seven members of Russian cybercrime gang sanctioned by HM Treasury

On 9 February 2023, HM Treasury's Office of Financial Sanctions Implementation ("OFSI") released a notice (the "Notice") freezing the assets of seven Russian individuals who are alleged members of the Russian cybercrime gang Trickbot, and banning these individuals from travelling to the UK. These sanctions have been implemented pursuant to the Cyber (Sanctions) (EU Exit) Regulations 2020 (S.I. 2020/597), which allow an individual's funds and economic resources to be frozen if they are "conducting or directing cyber activity that undermines, or is intended to undermine, the integrity, prosperity or security of the United Kingdom or a country other than the United Kingdom; international organisations; and non-governmental organisations whose purposes relate to the governance of international sport or the Internet".

The sanctioned individuals are said to have been involved in the deployment of ransomware in both the UK and US. Consequently, they have also been sanctioned in the US by the US Department of the Treasury's Office of Foreign Assets Control, in accordance with Executive Order 13694, as amended by Executive Order 13757. The estimated value of the funds extorted using the ransomware is said to be over £27 million, with entities such as hospitals, local authorities, and schools targeted in attacks. The sanctions have been announced in the midst of a large-scale, ongoing investigation by the National Crime Agency ("NCA").

The Notice also obliges any person or entity potentially connected financially or economically to the sanctioned individuals to check whether they hold any accounts, funds or economic resources for these individuals and, if so, to freeze them.

The OFSI has simultaneously released updated guidance relating to ransomware and sanctions (the "Ransomware Guidance"). The Ransomware Guidance seeks to outline mitigating steps which can be taken to allow the OFSI and NCA to resolve breach cases involving ransomware payment through alternative means to monetary penalties and criminal investigations.

The key takeaways from the Ransomware Guidance are:

  • Ransomware Payments: The OFSI repeats the general advice that paying a ransom does not prevent attackers from accessing networks and does not guarantee a victim regaining access to their data or computer. The OFSI also states that there have been cases in which paying victims have been subject to further attacks utilising the same (non-remediated) vulnerability. Although in the UK (and many other jurisdictions) there is no universal ban on payment of ransoms, in cases where the threat group is subject to sanctions or suspected of being involved in terrorism, payment may be illegal, and the OFSI sets out the considerations it will take into account in deciding whether to take enforcement action where payment is made to a designated group.

    This advice follows on from an open letter sent by the ICO and National Cyber Security Centre ("NCSC") to the Law Society in July 2022, in which the ICO and NCSC stated that payment of a ransom would not be taken into account in mitigation by the ICO when considering whether to take enforcement action. Both can be seen as indicative of the Government's approach, which to date has been firmly to discourage ransom payments, on strong public policy and practical grounds, without taking the step of making payments outright illegal.

    We would note that attribution of an attack is often difficult in cyber incidents; sanctioned groups will usually not advertise their involvement, making it hard to determine whether an attacker is subject to sanctions.
  • Due Diligence: Organisations should implement effective due diligence measures to manage risks of financial sanctions breaches. Specific measures are not specified by the OFSI and the onus is on organisations to ensure that they have sufficient measures in place to not commit a breach of financial sanctions.
  • Reporting: Alongside any regulatory reports required, ransomware attacks can be reported via the Where to Report a Cyber Incident portal, which guides users through a series of questions and directs them to Action Fraud for a ransomware incident or, potentially, to the NCSC (which forms part of GCHQ).
  • Payments to designated persons: Where a payment has been made to a designated person in breach of sanctions regulations, the OFSI encourages voluntary reporting as a complete voluntary disclosure of a breach of sanctions will generally be a mitigating factor in the OFSI's assessment of the case and any enforcement action. Aggravating factors will also be considered, such as a lack of compliance with regulatory standards by regulated individuals.

Outside of the above, the Ransomware Guidance also offers practical steps to consider if there is a ransomware attack, tools and online platforms/services provided by the NCSC for the purpose of cyber resilience and mitigation and key contacts for reporting ransomware attacks and obtaining further information.

The Notice can be read here, the US Department of the Treasury's press release can be read here and the Ransomware Guidance can be read here. The joint letter from the NCSC and ICO to the Law Society can be read here.

Negotiation history between Royal Mail and LockBit released following ransomware attack

Following the ransomware attack on Royal Mail, the negotiation history between Royal Mail International and ransomware gang, LockBit, has been released. On 14 February 2023, LockBit leaked the full transcript of the live chat between Royal Mail and itself.

The transcript highlighted some of the common negotiation strategies either side employs, notably in response to LockBit's ransom demand for $80 million made in February 2023. For example, Royal Mail's negotiator is seen requesting sample decryption evidence by providing files to be decrypted. This is common to obtain 'proof of life' that the decryption works. The LockBit threat actor is seen rejecting some on the basis that it believes they could be used to restore functionality without the decryptor. 

On the other hand, the LockBit threat actor focused on the alleged annual revenue of Royal Mail, citing news sources as evidence that the Royal Mail could pay (which the Royal Mail negotiator disputed). The threat actor also referred to the possible fines that regulators could impose, stating that payment would save Royal Mail from a much larger fine. This is a common tactic from threat actors, but as noted elsewhere in this update, regulators have publicly stated they will not take into account payment of ransoms when considering whether to issue a fine, or the size of any fine to be issued. Further, payment of a ransom does not obviate the need to report a breach to regulators, as the breach occurred when the threat actor accessed and exfiltrated data, regardless of whether it later publishes it.

The negotiation appeared to end without achieving a resolution, and on 23 February various outlets reported that some 44GB of Royal Mail data had been leaked onto the dark web, with a further ransom demand issued in respect of the remaining unpublished data in the threat actor's hands.

Enforcement

Federal Trade Commission announces first enforcement action under Health Breach Notification Rule

The US Federal Trade Commission ("FTC") has announced the first enforcement action under its Health Breach Notification Rule, which requires vendors of personal health records and related entities to notify consumers following a data breach revealing unsecured information. 

GoodRx Holdings Inc. ("GoodRx") was found by the FTC to have shared individuals' health data with Facebook and Google amongst others and agreed a US$1.5 million civil penalty with the FTC for the breach.

The FTC's proposed order for the breaches (which must be approved by a US federal court to be effective) suggests that, in the US, opt-in consent is required to use any health data for advertising and that health data should not be shared without the relevant individual's knowledge. This echoes the decision of the ICO to fine Easylife Limited in the UK for using inferred health data to target digital marketing to individuals and the fines issued by a number of European data protection supervisory authorities to Clearview AI Inc. in relation to its collection of biometric data without individuals' knowledge for similar purposes.

The FTC's action against GoodRx highlights the importance of ensuring any health or other special category personal data is held and used in accordance with applicable data protection laws. Any special category personal data collected should only be shared with great caution and only when full transparency can be given to the individual whose data is being shared.

Former RAC employee fined over road traffic victims' data theft

A former RAC employee has been found to have stolen the personal data of at least twenty-one individuals relating to their involvement in road traffic accidents following an ICO investigation. The former employee, who pleaded guilty to stealing data in breach of the DPA 2018, was fined £5,000 and ordered to pay a victim surcharge and court costs.

In the ICO's statement on the action, the ICO's Director of Investigations, Stephen Eckersley, noted that: "receiving nuisance calls can be hugely frustrating and people often wonder how these companies got their details in the first place. This case shows one such way that it happens. But also shows that those who do this crime will be caught, will be convicted and justice will be served."

Whilst ICO action against individuals is rare, this action illustrates that the ICO will take action against individuals as well as organisations for breaching data protection laws, particularly where these breaches result in distress to data subjects. 

ICO makes recommendations following school group's use of Facial Recognition Technology

The ICO has made a number of recommendations to North Ayrshire Council (the "Council") following the Council's controversial use of Facial Recognition Technology in its schools. The ICO's recommendations centre around three key requirements when using biometric data and children's data.

Having a valid lawful basis for processing special category data

Where biometric data is used to uniquely identify a natural person, the data used will be classified as special category data and require both a lawful basis for processing under Article 6 UK GDPR and a condition for processing special category data under Article 9 UK GDPR to be lawfully processed. In this instance, the Council purported to rely on consent and explicit consent as its lawful bases for processing special category data. Investigating the matter, the ICO found a number of issues with the forms used to collect these consents from data subjects. 

Firstly, the ICO found that the forms did not present the use of Facial Recognition Technology as optional. For a public authority to rely on consent as a lawful basis for processing, it must have given individuals a genuine choice to accept or reject the processing.

The ICO also found that the forms used a number of technical terms that were unlikely to have been understood by the data subjects. 

The Council's use of the wording "I do wish to grant consent to participate in the use of facial recognition systems within the school" in its forms was deemed by the ICO to be too vague and broad to properly grant explicit consent. Whenever collecting explicit consent, it is critical that any consent collected is given in both an explicit and unambiguous manner. Such explicit consent should be in a clear oral or written statement, must specify the nature of the special category data the individual is consenting to have collected, and must be separate from any other more general consent.

Transparency

The ICO noted in its recommendations that the right for individuals to be informed about how their personal data is collected and used is a key transparency requirement under the UK GDPR. Pursuant to Article 12 and Recital 58 UK GDPR, when using children's data there is a particular need to make efforts to provide this information to the relevant children in a way that they "can easily understand". Particular care should also be taken to ensure that children understand the potential risks with collecting their data and their rights under the UK GDPR and other applicable data protection legislation.

Any communications with data subjects should also not be misleading. The ICO found in this instance that the Council had underplayed the complexity of the Facial Recognition Technology, giving a misleading impression that the technology was better tested and more commonplace than it is in reality.

Data Protection Impact Assessment

The ICO found the Council had made a number of errors when completing its Data Protection Impact Assessment ("DPIA") in relation to the Facial Recognition Technology. Before processing is commenced, a comprehensive DPIA, complying with the Article 35 UK GDPR requirements, should be completed by the controller. This should be signed and dated by a senior employee or the data protection officer ("DPO"), prior to the processing commencing. In its recommendations to the Council, the ICO reiterated the importance of consulting and involving the DPO throughout the process of completing the DPIA. Any residual risks highlighted in the DPIA must either be mitigated or the ICO must be consulted before processing commences, so as not to impinge on the rights of data subjects.

The ICO's report on the recommendations given to the Council can be read in full here.

Civil litigation

Experian appeal case leads to reaffirmation of the status of legitimate interests

The First-Tier Tribunal (the "Tribunal") has upheld an appeal by credit reference agency Experian Limited ("Experian") in relation to its processing of data from a number of sources for direct marketing purposes. Of particular interest are the Tribunal's findings on legitimate interests as a basis for processing for direct marketing purposes.

The judgment follows enforcement action taken by the ICO against Experian in October 2020 (which we reported on in detail here), after a two-year investigation into the use of personal data by Experian's direct marketing arm. That data was acquired from a variety of public sources (such as the electoral register), internal sources (Experian's credit reference business) and third-party sources, and was used to build profiles of 51 million adults, which Experian sold on to third parties for direct marketing purposes. The profiles combined name and address information with up to 13 other attributes, from which Experian produced modelled datapoints reflecting 'predictions' about the likelihood of people having certain characteristics.

The ICO concluded that Experian had breached the UK GDPR by processing the personal data of data subjects for direct marketing purposes without consent, and that Experian could not rely on the legitimate interests basis to render its processing lawful. In particular, it was not appropriate for Experian to process data for direct marketing purposes on the basis of its legitimate interests when that data had originally been obtained on the basis of consent.

While Experian had, in October 2020, introduced a pop-up notice on its Customer Information Platform (its consumer information portal) collating the required privacy information, the ICO held that this was not sufficient. As a result, Experian was conducting 'invisible processing', breaching its transparency obligations and processing data without the consent of data subjects. The ICO's enforcement notice ("Enforcement Notice") required Experian to amend its practices within nine months, including by providing a UK GDPR-compliant privacy notice to all data subjects, failing which it faced a fine of up to £20 million or 4% of its annual global turnover.

Experian appealed the Enforcement Notice (the "Appeal"), asserting that the ICO had applied the law incorrectly, reached flawed conclusions on the facts, and imposed disproportionate requirements. Experian argued that the Enforcement Notice was an attempt by the ICO to impose its subjective preferences as if they were legal requirements under the GDPR, and that its effect would be to force Experian to adopt an unworkable consent-based model for offline marketing services, effectively shutting down its business in this area.

There were five grounds of appeal underpinning this broad argument:

  1. Experian's offline marketing services help to achieve effective and efficient marketing and ensure data subjects receive relevant marketing materials;
  2. Experian uses public data to build statistical models without tracking internet activity or locations, which does not support the ICO's assertion that data subjects would not expect Experian's processing activities and would be distressed by them;
  3. The ICO made incorrect assumptions about Experian's business model and did not apply the UK GDPR in a fair, proportionate and otherwise lawful manner. Contrary to the ICO's conclusions, Experian's business model did not rely on its processing being invisible so as to avoid the requirements of the UK GDPR;
  4. The ICO's approach was disproportionate, inconsistent with parts of the UK GDPR and contrary to public policy, given the economic harm which would result from the Enforcement Notice; and
  5. The ICO's conclusions would make Experian's privacy notice less meaningful, depriving it of effective, user-friendly layering and structuring, and would require Experian to send communications to data subjects that are unnecessary and irritating.

The Tribunal overturned the ICO's Enforcement Notice and confirmed that the legitimate interests ground can be relied upon for direct marketing activities. Much of the decision turns on fact-specific points, but some of the key areas of note are as follows:

  • Legitimate interests: Legitimate interests can be an appropriate lawful basis for direct marketing. The Tribunal noted that this is reflected in Recital 47 UK GDPR, which recognises that direct marketing may be a legitimate interest, and that this requires balancing the interests of the controller and the individual in the particular circumstances.
  • Transparency: The Tribunal considered the costs of informing data subjects. While it would cost a significant amount to provide privacy notices to all UK adults whose data is held, the Tribunal concluded that these were necessary costs, describing transparency obligations and their costs as "a business expense which should have been incurred over time as a matter of routine compliance". It found that Experian's Customer Information Platform notice, even if it had not been previously, is now compliant with the UK GDPR and clear, and that, while the processing may be surprising to the data subject, the information on the Customer Information Platform was "sufficiently prominently displayed".
  • Two key contraventions: The Tribunal found two principal contraventions of the UK GDPR: (i) in respect of the failure to provide privacy notices to 5.3 million data subjects whose data was obtained from open sources such as the Electoral Register or Companies House; and (ii) in respect of the lack of lawful grounds for processing personal data obtained from third-party suppliers that had been originally obtained on the basis of consent. This reflected the ICO's view that it was not possible to "convert" processing of data obtained via third parties based on consent to processing on the basis of Experian's legitimate interests. However, the Tribunal accepted that Experian no longer carried out this practice of 'switching' legal bases when it obtained personal data from third parties for direct marketing purposes in this way.
  • Proportionality: On the basis of its findings, the Tribunal held that ordering notification to all impacted data subjects was disproportionate. The Tribunal found that the ICO needed to consider the positive benefits to consumers of the processing and that it had failed to do so in the Enforcement Notice. It considered the benefits of using the data for purposes such as keeping mailing lists up to date and allowing businesses to reduce duplicate names and similar errors. The Tribunal also considered the minimal negative impact on the individual, the benefit of receiving a marketing leaflet aligned with their interests, and that, on the facts, it was unlikely that a data subject would succeed in a damages claim.
  • The Tribunal did order Experian to provide privacy notices to the 5.3 million data subjects whose data was obtained from open sources, to be provided within 12 months, subject to some exceptions.

The Tribunal's decision can be read here. It is not yet known whether the ICO will appeal the decision.

CJEU rules on DPOs' conflicts of interest

The CJEU has ruled that data protection officers ("DPOs") may hold other roles and carry out other obligations or tasks so long as they do not result in a conflict of interest.

The 9 February 2023 decision from the CJEU follows a request for a preliminary ruling from the Federal Labour Court of Germany in a claim relating to a former DPO. The former DPO of X-Fab Dresden, who was also chair of the company's works council, was dismissed, with his former employer arguing that his two roles posed a risk of a conflict of interests.

The CJEU's decision focused on interpreting Article 38 EU GDPR, which does not allow DPOs to be dismissed or penalised for performing their tasks. It clarified that the aim of Article 38 EU GDPR is to ensure that DPOs remain independent from controllers and processors. Article 38 does not prevent member states from imposing stricter conditions for the dismissal of a DPO, but any such conditions must not be incompatible with the EU GDPR.

In respect of determining when a conflict of interest arises, the CJEU concluded that a DPO "cannot be entrusted with tasks or duties which would result in him or her determining the objectives and methods of processing personal data on the part of the controller or its processor". In other words, the DPO must remain able to review those objectives and methods independently, rather than setting them.

The decision sheds some light on which additional tasks and duties it is appropriate for DPOs to carry out alongside their DPO role. The focus should always be on ensuring that DPOs are not placed in a position in which they determine processing, or in which their other role may conflict with ensuring compliance with the EU GDPR.

The decision from the CJEU can be read in full here.

German Data Protection Authorities rule on third-country parent companies and data transfers

The conference of German Data Protection Authorities ("DSK") has published a decision on whether the risk that a parent company based in a third country could instruct an EEA-based subsidiary to grant access to the personal data it holds ("Access Risk") can be considered a data transfer under Article 44 EU GDPR. The DSK's decision, which appears to have been influenced by a judgment of the Oberlandesgericht Karlsruhe (in the proceedings 15 Verg 8/22), was that the Access Risk is not, in isolation, to be considered a data transfer. However, the DSK noted that an Access Risk should be taken into account by a controller when assessing the reliability of its processor in accordance with Article 28(1) EU GDPR, as it may indicate that the processor is unable to give sufficient guarantees of the implementation of the technical and organisational measures required to protect the rights of data subjects and meet the requirements of the EU GDPR.

Where an Access Risk exists, determining the reliability of the EEA-based processor with a parent company in a third country requires an assessment of all the relevant criteria, such as the likelihood of requests from the parent company to transfer data to the third country, or assurances on how conflicts between the laws of the member state and the third country are managed. According to the DSK, for a processor subject to an Access Risk to be used, particularly stringent safeguards must be in place to ensure that the Access Risk is mitigated.

The DSK have referred their decision on this topic to the EDPB for further determination. The DSK's decision can be read (in German) here.

KEY CONTACT

Alison Llewellyn
Managing associate

T: +44 20 7809 2278 | Office: London