
04 Aug 2020

Data Protection update - July 2020


Welcome to our data protection bulletin, covering the key developments in data protection law from July 2020.

Data protection

Cyber security

Regulatory enforcement


Other EU countries

Civil litigation

Data protection

European Court struck down EU-US data sharing agreement

On 16 July 2020, the Court of Justice of the European Union ("CJEU") delivered its long-awaited decision in the case of Data Protection Commissioner v Facebook Ireland Ltd, Maximillian Schrems and others (Case C-311/18), also known as the "Schrems II case".

The CJEU held that the Privacy Shield is invalid and cannot be used as a transfer mechanism. It held that the Standard Contractual Clauses ("SCCs") are valid as they contain effective mechanisms to ensure compliance with EU law, given that they allow for transfers of personal data to be suspended or prohibited if the recipient cannot comply with them.

For more information on the decision and the background facts, please read our full update: Privacy Shield invalid: what next for international data transfers?

As a result of the Schrems II decision, those transferring data under the Privacy Shield will need to use an alternative safeguard as the Privacy Shield has been declared invalid with immediate effect. Further, those transferring data using the SCCs will be required to conduct a review of the adequacy of any third country laws on data safeguards to ensure essential equivalence, as compared with EU standards.

The invalidity of the Privacy Shield due to the lack of safeguards available under US law may cast doubt on whether even the SCCs can properly be used to transfer personal data to the US or other countries subject to similar national laws. There may be particular concern if a US recipient of data, or its service providers, is subject to the US law requirements to share certain data with the government under Section 702 of the Foreign Intelligence Surveillance Act (FISA) and Executive Order 12333 (E.O. 12333). The Irish Data Protection Commissioner will consider this issue in the context of Facebook’s data exports and guidance from the European Data Protection Board (“EDPB”) is also expected.

The EDPB has already published FAQs on the case and the ICO has issued a statement. For more information on this and on your next steps for data transfers in the wake of this case, please see our separate update: EDPB FAQs: what are your next steps for data transfers post-Privacy Shield?

The European Commission publishes advice on how businesses can prepare for data protection changes following the Brexit transition period

With less than half a year to go until the end of the transition period, the European Commission released a communication to businesses and Member States on steps they should consider to prepare for some of the inevitable changes to come. In relation to data protection and data transfers, the European Commission reminds businesses that as of 1 January 2021, transfers of personal data to the United Kingdom can continue but they will have to comply with specific rules and safeguards as set out in the GDPR. Whilst there is a chance the European Union will adopt a unilateral “adequacy” decision for the UK (deeming that the UK offers an adequate level of data protection), this is not guaranteed. As such, the Commission advises businesses and public administrations to take the necessary steps to ensure GDPR compliance with respect to any personal data transfers to the United Kingdom. This can be achieved by having appropriate safeguards in place including binding corporate rules or through specific derogations (bearing in mind, of course, the decision in Schrems II).

EDPB established new register containing decisions of national supervisory authorities across the EU

EU-wide co-operation in policing data protection compliance is increasingly prevalent. The GDPR places national data protection agencies under a duty to cooperate on cases which have cross-border elements, so as to ensure uniform application of the GDPR: the "One-Stop-Shop" mechanism.

On 25 June 2020, the EDPB published a new register which contains decisions issued by national data protection agencies on matters investigated pursuant to this mechanism. According to the EDPB, this initiative "will be valuable to data protection practitioners who will gain access to information showcasing how [Supervisory Authorities] work together to enforce the GDPR in practice". The register will provide access to the decisions and contain summaries of the decisions in English.

The new register can be accessed here.

ICO and Department of Health and Social Care publish guidance on collecting personal data for Covid-19 track and trace efforts

As more and more businesses re-open and, in turn, face an obligation to collect customer data to help in the fight against coronavirus, the Department of Health and Social Care ("DHSC") and the ICO have each published guidance for businesses on how to collect information securely. The DHSC's guidance provides comprehensive information on the purpose of maintaining records as well as practical guidance on the kind of information to collect and who to collect it from, how to maintain records in compliance with the GDPR, when information should be shared with NHS Test and Trace, and how NHS Test and Trace will take steps to minimise transmission. The ICO has issued a statement explaining how its practical advice to businesses (which we reported on in the June update) complements and supports the DHSC's guidance, and reassuring small businesses that it can offer advice and support for coronavirus-related data protection issues.

Government admits that NHS Test and Trace system breaches GDPR

As the UK attempts to return to normality amidst the Covid-19 pandemic, a crucial element of this endeavour is the NHS Test and Trace system. Whilst the roll-out of the UK's contact-tracing app continues to be delayed, the UK government launched the manual Test and Trace system on 28 May 2020. On 20 July 2020, the DHSC admitted that the government had not conducted a data protection impact assessment ("DPIA") before implementing the Test and Trace system. As the system requires individuals to share personal information such as their name, date of birth and places they recently visited, the GDPR requires that a DPIA be carried out. It has also been reported that the system has led to various data breaches, including email mishaps and unredacted personal information being shared in training materials.

A DHSC spokesperson has said that an "overarching DPIA" was "in development". The Test and Trace system is currently facing a potential legal challenge by the Open Rights Group with regard to the UK government’s failure to conduct a DPIA.

The European Commission sets up a voluntary gateway to assist with interoperability of contact tracing apps

On 15 July, the European Commission announced that it had adopted an implementing decision to facilitate a voluntary gateway service to help ensure the interoperability of contact tracing and warning apps across the EU in the fight against Covid-19. The decision sets out rules to be followed under which the European Commission acts as a provider of technical and organisational solutions for the gateway, and processes pseudonymised personal data on behalf of EU governments. This gateway should allow citizens, particularly those travelling in the EU, to only have to install one app rather than downloading a separate app for each country. This decision builds on the interoperability guidelines and technical specifications which have been agreed between Member States and the European Commission (which we reported on in our May and June updates).

The impact of the new Hong Kong Security Law on privacy, cybersecurity and data flows

The National Security Law (the "New Law") came into effect in Hong Kong on 30 June 2020. The New Law has been introduced by China's national legislature and is intended to prevent, stop and punish acts in Hong Kong that threaten national security, including secessionist and subversive activity as well as foreign interference and terrorism. The New Law represents a major shift away from the "one country, two systems" governing model that has applied in Hong Kong since 1997, and Hong Kong will now see far greater powers and controls lying with the Chinese government. The change is likely to have significant implications for the technology and service sectors across privacy, cybersecurity and data.

The Personal Data (Privacy) Ordinance 2012 ("PDPO") is the main data privacy legislation currently in effect in Hong Kong. It is thought that the New Law will enable mainland China to become increasingly involved in enforcing the existing PDPO. In addition, the New Law is likely to give law enforcement in China access to personal data held in Hong Kong. This access will limit Hong Kong's chances of committing to free data flow arrangements. Last year, Hong Kong signed a bilateral free trade agreement with Australia with an e-commerce chapter supporting the free flow of financial data. There is now some concern as to whether Hong Kong will actually be able to implement the agreement given the new limitations on free data flow. It is also thought that China's own Cybersecurity Law will gradually be introduced in Hong Kong over the next few years, bringing with it law enforcement access to data, security and localisation requirements for mainland citizens' and organisations' data, and critical information infrastructure protection measures.

Large tech companies are already pulling back from Hong Kong, with Facebook, Twitter, Microsoft and Google all suspending their cooperation with Hong Kong law-enforcement agencies at the beginning of this month while they conduct human rights assessments of the New Law. The drafting of the New Law has been described as vague and lacking clear definitions, which raises alarm bells for organisations currently operating in Hong Kong as they try to work out what it means for data security and transfers. Police have been given extensive powers under the New Law, including the right to request that service providers remove content considered to be in breach of the law. Companies also risk being subject to hefty fines, or up to six months in prison, for failure to comply with a request. Whilst it remains unclear quite how rigorously the New Law will be enforced by China, its broad provisions are certainly sparking fears among tech organisations.

Given the decision in Schrems II, careful thought will have to be given to whether it is appropriate to transfer personal data to Hong Kong from the EU, and, if so, what safeguards should now be put in place by transferors.

The EDPB adopts draft guidelines on PSD2

The PSD2 (the second Payment Services Directive) gives payment service providers the ability to access, process and store personal data necessary for providing their services where the payment services user has granted explicit consent. Under the GDPR, however, payment service providers may rely on another legal basis for accessing, processing and storing personal data, such as necessity for the performance of a contract, legitimate interests or compliance with legal obligations. Guidance issued by the EDPB suggests payment service providers must comply with both the PSD2 and the GDPR, despite concern in the industry about how to manage the interplay between the two regimes and the apparent conflict between the PSD2's requirement for explicit consent and the alternative legal bases for processing personal data under the GDPR. The EDPB has now published draft guidelines (the "Guidelines"), the main purpose of which is to clarify the data protection aspects of the PSD2 and the relationship between the relevant provisions of the GDPR and the PSD2. The Guidelines state that the processing of special categories of personal data is generally prohibited, except where the data subject has given explicit consent or the processing is necessary for reasons of substantial public interest in line with the GDPR. The Guidelines also clarify that, pursuant to the PSD2, a data subject must give their consent for account information service providers ("AISPs") and payment initiation service providers ("PISPs") to use, access or store any data for purposes other than performing the service specifically requested by the user, unless the processing is otherwise laid down in applicable law. The Guidelines are still in draft form and have been submitted for public consultation.

Heathrow airport is developing a facial recognition check-in

Heathrow airport, as part of the ICO's Sandbox programme (set up for organisations to develop innovative technology in collaboration with the ICO), is designing an automated check-in process for airports, with the passenger ID checks instead conducted using Facial Recognition Technology ("FRT"). The FRT would verify the passenger's identity from an ID source and create an "on the day" image. A template of the individual's face would then be created at the airport and matched against another template created from the individual's passport image. This biometric matching process would be repeated at subsequent touchpoints throughout the check-in route to confirm that it is still the same person, without the passenger having to provide documentation. The aim of the project is to expedite the passenger's journey through the airport. The ICO has warned that Heathrow airport will need to build its biometric data processing around the collection of consent and explicit consent which, according to the report, may be the most appropriate legal basis for the processing. During the programme, Heathrow's proposal of using layered affirmative actions to obtain explicit consent was deemed by the ICO to be non-compliant with data protection laws. The final report issued by the ICO provides interesting insight into some of the key data protection issues faced by Heathrow during the project, including the role of controller, the complexity of the other legal frameworks within which airports must operate on a global scale, and finding a practical and compliant method of obtaining consent.

Irish Data Protection Commission faces judicial review challenge for slow process

On 25 May 2018, NOYB filed four complaints against Google, Instagram, WhatsApp and Facebook, with the French, Belgian, German and Austrian data protection authorities respectively, for allegedly forcing users to consent to data processing, which would not constitute valid consent under the GDPR. NOYB is a non-profit organisation based in Austria which advocates for data protection. It was founded by Maximilian Schrems, who is known for his legal challenges against Facebook's transfer of personal data between the EU and USA.

Of the four complaints NOYB filed, only the French data protection authority has concluded its investigation, culminating in the EUR 50 million fine against Google reported in our June 2020 bulletin. The Austrian, German and Belgian authorities forwarded the complaints to the Irish Data Protection Commissioner (the "Irish DPC"), as the companies concerned have their EU headquarters in Ireland. Since then, NOYB has claimed that the Irish DPC has made little progress on the complaints against Instagram and WhatsApp. For these two companies, the Irish DPC has only issued "Draft Inquiry Reports", and there are at least five more steps that must be completed before users receive a final decision.

As two years have passed since the complaints were first raised, NOYB filed an application for judicial review of the Irish DPC's slow progress. NOYB sought a declaration that the Irish DPC had failed to carry out investigations into the complaints within a reasonable period, in effect depriving EU citizens of their GDPR rights. NOYB's application for judicial review was approved by the Irish High Court on 6 July 2020.

NOYB's judicial review challenge may be a sign of more future challenges against the Irish DPC, which is under increasing pressure to deal with data protection complaints made against many large technology companies who have moved their EU headquarters to Ireland.

EDPS warns EU Institutions to carefully monitor their data protection impact assessments (“DPIAs”)

The European Data Protection Supervisor (the "EDPS") (the data protection authority for the EU Institutions) has issued a report showing the results of a survey of 40 EU Institutions on how they carry out DPIAs. DPIAs are required under Regulation (EU) 2018/1725 where EU Institutions are processing information which presents a high risk to individuals' rights and freedoms; the GDPR imposes an equivalent requirement on controllers where the processing of personal data is considered particularly high risk. DPIAs are ultimately designed to assist with managing and minimising risk. The report suggests DPIAs may be taking a back seat, with only four of the EU Institutions surveyed having finalised their DPIAs and the majority saying their DPIAs are still in progress. The report looks at some of the key factors considered by EU Institutions when deciding whether and how to conduct a DPIA, including triggering criteria, what was considered "high risk", the use of internally developed processes and the role of Data Protection Officers in the DPIA process. The report serves as an interesting insight and an important reminder that DPIAs should be seriously considered when processing high-risk personal data. The EDPS emphasises that it expects to carry out more regular surveys on the use of DPIAs to monitor and ensure compliance.

Cyber security

Cybersecurity capabilities in the EU come under scrutiny as the European Commission launches a public consultation on the revision of the NIS Directive and a cybersecurity certification scheme

On 25 June 2020, the European Commission adopted an Inception Impact Assessment announcing the revision of the NIS Directive. The NIS Directive aims to improve national level cybersecurity capabilities and increase cyber incident response co-operation among Member States. Member States have implemented the NIS Directive with a variety of different approaches, which has led to significant inconsistencies and fragmentation in the regulatory landscape. The European Commission feels it is now necessary to revise the NIS Directive in line with its key policy objective to ensure “Europe is fit for the digital age”. By the end of 2020, the European Commission aims to assess whether cybersecurity has improved across the EU, identify existing and emerging issues in this space and identify and quantify the regulatory costs and benefits. The public consultation remains open until 13 August 2020 with the aim to complete the review by the end of 2020. It is thought the review may result in the proposal of a new legal act to manage cybersecurity across the EU.

Separately, on 2 July 2020, the European Union Agency for Cybersecurity (“ENISA”) launched a month-long consultation for the first candidate cybersecurity certification scheme which aims to replace the existing schemes operating under the SOG-IS MRA for ICT products, to add new elements and to extend the scope to cover all EU Member States. It will look into the certification of ICT products’ cybersecurity, based on certain criteria and standards. ENISA Executive Director Juhan Lepassaar said “Cybersecurity certification aims to promote trust in ICT products, processes and services while at the same time tackling the fragmentation of the internal market, thus reducing the costs for those operating in the Digital Single Market”.

The European Commission is conscious of the rapid digitalisation of society, which has only been expedited by the Covid-19 pandemic. The pandemic itself brings with it new cyber threats and challenges that require adaptive and innovative responses, which makes these consultations particularly relevant.

Regulatory enforcement


Update on the ICO’s enforcement action against British Airways and Marriott

In July 2019, the ICO issued notices of intent to fine British Airways £183,390,000 and Marriott International £99,200,396 for alleged breaches of GDPR. In our April 2020 bulletin, we reported that the ICO had delayed the enforcement process against these companies to 18 May 2020 and 1 June 2020 respectively. Given that there has been no further update from the ICO, British Airways or Marriott International, it would appear that the enforcement process has been further delayed.

However, on 31 July 2020, British Airways’ owner, IAG, released its Q2 2020 Financial Results which indicate that it now expects the ICO's fine to amount to EUR 22 million (albeit making clear that this amount is merely the "management's best estimate" and that the ICO has not issued a final penalty notice). If this is correct, the fine would be approximately 10% of the amount initially included in the ICO’s notice of intent.

If so, this is in line with the ICO's recent statement on revisiting fines in light of the Covid-19 pandemic which noted: “before issuing fines we take into account the economic impact and affordability. In current circumstances, this is likely to mean the level of fines reduces.”

Given that Marriott International also operates in a sector which is facing unprecedented challenges due to the pandemic, it seems equally likely that Marriott's fine will be a fraction of that referred to in the ICO's notice of intent.

ICO admonishes police practice on mobile phone extraction

On 18 June 2020, the ICO published its report on mobile phone data extraction ("MPE") by the England and Wales police in conducting criminal investigations. The report was prompted by concerns that the police's MPE practices may breach the GDPR requirements for lawful processing of personal data, given that mobile phones are now more likely than before to contain an abundance of personal and intimate information.

The report found that police MPE practices were inconsistent across England and Wales. It found that some police forces collected excessive amounts of personal data which were then stored "without an appropriate basis in existing data protection law". To that end, the ICO has recommended that a statutory code of practice be introduced to provide "greater clarity and foreseeability about when, why and how" the police will carry out MPE. Further, the police should implement measures to ensure that extracted data is retained "no longer than necessary".

Nevertheless, the ICO accepted that MPE regulation "is a complex area" and must be tackled through the lenses of data protection, criminal justice and human rights legislation.

The report can be read here.

UK and Australian data protection authorities jointly investigate AI company

On 9 July 2020, the ICO and Australian Information Commissioner announced a joint investigation into Clearview AI Inc's use of "scraped" data and biometrics of individuals. Clearview AI is headquartered in New York. It provides facial recognition software which is used by private companies and law enforcement agencies.

According to the announcement, Clearview's database reportedly contains more than three billion images that it claimed to have "scraped" from the internet, such as through social media platforms. Users of its facial recognition app can upload a photo of an individual, and the app will match the uploaded photo to one in its database.

Other EU countries


France: Conseil d'Etat partially annuls CNIL cookie guidelines

The Conseil d'Etat, France's highest administrative court, annulled parts of the cookie guidelines issued by the CNIL (France's data protection authority). In the cookie guidelines issued by the CNIL on 4 July 2019 (the "CNIL Guidelines"), the CNIL prohibited the use of "cookie walls" (the practice of blocking access to a site for users who do not consent to the use of cookies).

The Conseil d'Etat found that the prohibition of "cookie walls" in the CNIL Guidelines was overly general. While it recognised that the GDPR permitted the CNIL to create guidelines, the Conseil d'Etat found that the CNIL exceeded its authority by imposing the "cookie walls" prohibition. Crucially, the Conseil d'Etat did not rule on whether the "cookie walls" prohibition went against what was required by the GDPR.

This decision highlights the potential for diverging national guidelines across the EU, despite the harmonisation on basic data privacy principles under the GDPR. Readers are reminded to consult local guidelines on data protection for the offices they have across the EU.

The annulment decision can be accessed here (in French).


Italy: Garante fines UniCredit and Wind Tre

Garante, the Italian data protection authority, issued two significant fines recently. The first was issued on 10 June 2020 against UniCredit bank for violation of the Italian Personal Data Protection Code (Italy's pre-GDPR data protection legislation). Garante found that personal data of more than 700,000 UniCredit customers was unlawfully accessed by its external commercial partner between April 2016 and July 2017. This was as a result of UniCredit's failure to implement adequate security measures. In imposing the fine of EUR 600,000, credit was given to UniCredit's voluntary self-reporting after it discovered the data breach.

The second substantial fine, of EUR 16.7 million, was issued on 9 July 2020 against Wind Tre SpA, an Italian telecommunications company. Wind Tre was found to have unlawfully processed customers' personal data in its direct marketing practices. In particular, Wind Tre had conducted marketing exercises without customers' consent. Further, some customers were unable to exercise their right to object to the processing of their data for direct marketing purposes.

Readers are reminded that they must obtain the customer's valid consent before engaging in direct marketing exercises, and customers must be given the option to exercise their right to opt out of data processing for marketing purposes at any time.


Germany: AOK fined EUR 1.24 million over lottery data

The German data protection authority fined AOK Baden Wuerttemberg ("AOK"), the statutory health insurance company of a German state, EUR 1.24 million for a data protection breach. AOK had conducted lotteries on various occasions and collected personal data of participants for the purposes of the lotteries. While AOK had put in place measures to protect the data collected from unlawful processing, it transpired that 500 participants' data slipped through these safety measures. The German data protection authority concluded that AOK had implemented insufficient technical and organisational measures to deal with the personal data collected, in breach of Article 32 of the GDPR.

This is a reminder of the importance of strict implementation of internal systems put in place to process personal data collected.


Netherlands: BKR fined EUR 830,000 over access to credit data

On 6 July 2020, the Dutch data protection authority (the "Dutch DPA") fined the Dutch Credit Registration Bureau ("BKR") EUR 830,000 for GDPR breaches. BKR maintains the Dutch central credit information system. It is a source of information for various companies, financial institutions and municipalities on credit information, insolvency checks, and sanction screening.

The Dutch DPA found that, to access their personal data under BKR's system free of charge, data subjects had to send a written request by post accompanied by a copy of a passport. Such requests could only be made once a year and took up to 28 days to process. Alternatively, data subjects could opt to pay EUR 4.95 each year for unlimited, instant access to their personal data. The Dutch DPA held that this system violated Article 12 of the GDPR, which requires that data subjects be given access to their personal data free of charge.

Readers are reminded that their procedures for data subjects to request personal data held on them should not impede their right of access.


Spain: AEPD fines Xfera Móviles and Orange

Agencia Española de Protección de Datos ("AEPD"), the Spanish data protection authority, issued fines in July 2020 against Xfera Móviles (a Spanish telecom company) and the Spanish unit of Orange (a French telecom operator) for GDPR breaches.

Xfera was fined EUR 70,000 for unlawfully processing a customer's personal data. This is the ninth fine Xfera has received from the AEPD in the past six months.

Separately, the Spanish unit of Orange was fined EUR 80,000 for breaching Article 6 of the GDPR. The issue related to a complainant whose personal data was used to enter into six telephone service contracts without their consent. When the contracts were defaulted on, Orange included the complainant's personal data on a credit blacklist, without checking that it had the complainant's consent to do so. Orange then carried out debt recovery actions. The AEPD found that Orange had failed to undertake "the minimum diligence required" to verify audio recordings from the contracted phone lines of several people acting on behalf of the complainant. The AEPD concluded that Orange had failed to prove it had a lawful basis for processing the complainant's data.


Belgium: Google fined EUR 600,000 over right to be forgotten

On 14 July 2020, the Belgian data protection authority fined Google Belgium EUR 600,000 for failing to remove search results related to an unnamed CEO. This is one of the highest fines issued to date in Belgium.

The CEO had requested Google to remove 12 Google search results that appeared to suggest ties between the CEO and a specific political party, or which referred to a harassment complaint that was set aside in 2010. Google refused to do so for various reasons, claiming that the pages did not exist, were inaccessible or did not meet Google's criteria for removal.

Google said that it plans to appeal the fine as it did not believe that the case met the criteria for delisting search results. Google's stance is that "it was in the public's interest that this reporting remain searchable".

Civil litigation

Individuals named in controversial "Steele Dossier" successful in data protection claim

The Steele Dossier was produced in 2016 by Orbis, an English company established by two British former public officials, pursuant to instructions from US lawyers. It related to Orbis' findings on intelligence concerning any links that might exist between Russia, Vladimir Putin and Donald Trump. Extracts from the Steele Dossier were subsequently published by Buzzfeed News.

In Aven, Fridman & Khan v Orbis Business Intelligence Ltd [2020] EWHC 1812 (QB), the claimants brought proceedings against Orbis for breaches of the Data Protection Act 1998 in connection with the Steele Dossier. They argued that "Memorandum 112" of the published memoranda contained personal data relating to them, and that the personal data were inaccurate (contrary to the fourth data protection principle) and had been processed by Orbis unfairly or unlawfully (contrary to the first data protection principle).

Warby J found for the claimants. In doing so, he drew out a number of useful points for data protection practitioners including, but not limited to, the following:

  • He held that the memorandum's information that the claimants gave significant favours to, and received them from, Vladimir Putin was "personal data", as the information was "biographically significant". Further, the memorandum implied that the claimants delivered "illicit cash" to Vladimir Putin through a third party. He found this implication of criminality to amount to "sensitive personal data" under section 2(g) of the Data Protection Act 1998. In making these findings, he emphasised that such analyses demand that information which might potentially constitute personal data be considered in the context of the whole document in which it is located;
  • With regard to the alleged breaches of the data protection principles, he found that the first data protection principle had not been breached. However, he agreed that the fourth principle was breached with regard to the allegations of illicit payments. On the facts, although Orbis had accurately recorded what it was told by its sources, it had not taken reasonable steps to verify the criminality allegation. Therefore, the allegation was inaccurate. In undertaking this analysis, he drew significantly on principles derived from the law of defamation; and
  • He ordered Orbis to rectify its records to correct the inaccuracy and awarded compensation of £18,000 to the first and second claimants for loss of autonomy, distress and reputational damage, again, drawing on principles derived from the law of defamation.

UK football players suing for unlawful use of personal data

In July 2020, it was announced that Global Sports Data and Technology Group is spearheading a group action, dubbed "Project Red Card", against various companies for breach of the GDPR in relation to the processing of the personal data of many professional football players. The Group was founded by Russell Slade, former Cardiff City Football Club manager, and Jason Dunlop.

The claim is brought on behalf of at least 400 current and former professional football players across the UK, including players from the Premier League, EFL, National League and Scottish Premiership. They allege that certain gaming, betting and data-processing companies have used their personal statistics without their consent and without compensating them.

The use of players' statistics and personal data in the gaming industry is prolific; gaming companies obtain consent from the clubs or leagues to access the players' personal data to build avatars of the players for games. However, it is alleged that, in breach of GDPR, the players themselves are not currently asked to consent to the processing of their personal data.

Group and class actions relating to data subjects' rights are growing in number, a trend that is likely to continue as data subjects become more aware of their rights under the GDPR.

First-tier Tribunal, rather than the ICO, is responsible for enforcing compliance with decisions it has rendered

In Moss v Information Commissioner & Royal Borough of Kingston upon Thames [2020] UKUT 174 (AAC), Jacobs J held that the First-tier Tribunal ("FTT"), rather than the Information Commissioner's Office ("ICO"), bears the onus of enforcing compliance with its own judgments.

Mr Moss made a Freedom of Information Act 2000 request to the Royal Borough of Kingston upon Thames ("Kingston") on 16 February 2016. Kingston refused to provide the information requested, and the ICO agreed with Kingston that the cost of compliance would exceed the appropriate limit, an exemption available under section 12 of the Freedom of Information Act 2000 ("FOIA 2000"). Mr Moss challenged the ICO's decision. While the FTT agreed with the ICO on the section 12 point, it allowed Mr Moss's appeal in part, holding that Kingston had breached its duty to advise and assist under section 16 of FOIA 2000. The FTT then issued a substituted decision notice.

Mr Moss alleged that Kingston had failed to comply with the substituted decision notice, and applied to the ICO for enforcement. The ICO took the view that enforcement was not its responsibility, but that of the FTT General Regulatory Chamber. Subsequently, Mr Moss applied to the FTT for an order that Kingston was in contempt of court, and made a parallel application to the FTT to certify to the High Court an offence of contempt of court against Kingston and the ICO. The FTT struck out both applications, causing Mr Moss to appeal to the Upper Tribunal.

The Upper Tribunal accepted the ICO's submission that it did not have the power to enforce a decision notice which has been subject to an appeal. To permit it to have such a power would undermine the legal principle that the executive body cannot override a judicial act (R (Evans) v HM Attorney General [2015] UKSC 21). However, the ICO is the enforcement authority for decision notices which have not been appealed.

As a result, the Upper Tribunal held that the FTT was right to refuse the application to commit Kingston for contempt, because it had no such power. However, the FTT was wrong to strike out the application to certify Kingston for contempt, because it did have that power. While this case concerned FOIA 2000, the reasoning will likely also apply under the Data Protection Act 2018, save that under the 2018 Act certification of contempt will be to the Upper Tribunal, not the High Court.