Personal Data | Cybersecurity | Data Innovation
Europe
03/14/2023 – European Union Agency for Cybersecurity | Report | Cybersecurity of AI and Standardisation
On 14 March 2023, the European Union Agency for Cybersecurity published a report on Cybersecurity of AI and Standardisation.
The objective of the report is to provide an overview of standards (existing, being drafted, under consideration and planned) related to cybersecurity of artificial intelligence, assess their scope and identify gaps in standardisation.
For further information: ENISA Website
03/14/2023 – European Parliament | Regulation | Data Act
On 14 March 2023, the European Parliament adopted the draft Data Act.
The Data Act aims to boost innovation by removing barriers obstructing access by consumers and businesses to data.
For further information: European Parliament Website
02/28/2023 – European Data Protection Board | Opinion | EU-US Data Privacy Framework
On 28 February 2023, the European Data Protection Board adopted its opinion on the draft adequacy decision regarding the EU-US Data Privacy Framework.
The European Data Protection Board welcomes substantial improvements such as the introduction of requirements embodying the principles of necessity and proportionality for US intelligence gathering of data and the new redress mechanism for EU data subjects. At the same time, it expresses concerns and requests clarifications on several points.
For further information: EDPB Website
02/24/2023 – European Data Protection Board | Guidelines | Transfers, Certification and Dark Patterns
On 24 February 2023, the European Data Protection Board published the final versions of three guidelines.
Following public consultation, the European Data Protection Board has adopted three sets of guidelines in their final version: the Guidelines on the interplay between the application of Article 3 and the provisions on international transfers as per Chapter V GDPR; the Guidelines on certification as a tool for transfers; and the Guidelines on deceptive design patterns in social media platform interfaces.
For further information: EDPB Website
02/15/2023 – European Commission | Decision | Whistleblowing
On 15 February 2023, the European Commission announced its decision to refer eight Member States to the Court of Justice of the European Union for failing to transpose Directive (EU) 2019/1937 on the protection of persons who report breaches of Union law by 17 December 2021.
The relevant Member States are the Czech Republic, Germany, Estonia, Spain, Italy, Luxembourg, Hungary, and Poland.
For further information: European Commission Website
01/18/2023 – European Data Protection Board | Report | Cookie Banner Taskforce
On 18 January 2023, the European Data Protection Board adopted its final report of the cookie banner task force.
The French Supervisory Authority and its European counterparts adopted the report summarizing the conclusions of the task force in charge of coordinating the responses to the questions on cookie banners raised by the complaints of the None Of Your Business Association. The main points of attention discussed concern the modalities for accepting and refusing the storage of cookies and the design of banners.
For further information: EDPB Website
01/16/2023 – European Union | Regulation | Digital Operational Resilience Act
The Digital Operational Resilience Act (“DORA”) entered into force on 16 January 2023.
The DORA aims to ensure that financial-sector information and communication technology (“ICT”) systems can withstand security threats and that third-party ICT providers are monitored.
For further information: Official Journal Website
01/12/2023 – Court of Justice of the European Union | Decision | Right of access
On 12 January 2023, the Court of Justice of the European Union ruled that everyone has the right to know to whom their personal data has been disclosed.
The data subject’s right of access to personal data under the GDPR entails, where those data have been or will be disclosed to recipients, an obligation on the part of the controller to provide the data subject with the actual identity of those recipients, unless it is impossible to identify those recipients or the controller demonstrates that the data subject’s requests for access are manifestly unfounded or excessive within the meaning of the GDPR, in which cases the controller may indicate to the data subject only the categories of recipient in question.
For further information: Press Release
Austria
02/01/2023 – Austrian Parliament | National Council | Whistleblowing
On 1 February 2023, the Austrian National Council adopted the law implementing Directive (EU) 2019/1937 on the protection of persons who report breaches of Union law (“the Whistleblowing Directive”).
For further information: Austrian Parliament Website
Belgium
02/15/2023 – House of Representatives | Legislation | Whistleblowing
On 15 February 2023, the Whistleblowing Law for the private sector, which partially transposes the Whistleblowing Directive, entered into force.
For further information: Whistleblowing Law
Bulgaria
01/27/2023 – Bulgarian National Assembly | Legislation | Whistleblowing
On 27 January 2023, the Bulgarian National Assembly adopted the Whistleblower Protection and Public Disclosure Act (“PWIPDA”), transposing the Whistleblowing Directive.
For further information: CPDP Website [BG]
Czech Republic
03/07/2023 – Czech Supervisory Authority | FAQ | Cookies
On 7 March 2023, the Czech Supervisory Authority (“UOOU”) published a FAQ on cookie banners and consent.
For further information: UOOU Website [CZ]
Denmark
02/20/2023 – Danish Supervisory Authority | Decision | Cookie Walls
On 20 February 2023, the Danish Supervisory Authority issued two decisions regarding the use of cookie walls on websites and published general guidelines on the use of such consent solutions.
The Danish Supervisory Authority generally found that a model whereby the website visitor can access the content of a website in exchange for either giving consent to the processing of his or her personal data or paying an access fee meets the requirements of the data protection rules for valid consent.
For further information: Danish DPA Website [DK]
01/20/2023 – Danish Supervisory Authority | Guidelines | Storage and Consent
On 20 January 2023, the Danish Supervisory Authority published guidance on the storage of personal data for the purpose of demonstrating compliance with the data protection rules on consent.
For further information: Danish DPA Website [DK]
Finland
02/17/2023 – Finnish Supervisory Authority | Sanction | GDPR Violation
On 17 February 2023, the Finnish Supervisory Authority issued an administrative fine of €440,000 against a company for failing to comply with the authority’s order to rectify its practices.
In particular, the authority stated that the company failed to erase inaccurate payment default entries saved into the credit information register due to inadequate practices. The authority stresses that the processing of payment default information has a significant impact on the rights and freedoms of individuals.
For further information: Finnish DPA Website
France
03/28/2023 – French Supervisory Authority | Sanction | Geolocation Data
On 28 March 2023, the French Supervisory Authority (“CNIL”) announced that it imposed a fine of €125,000 on a scooter rental company for geolocating its customers on an almost permanent basis.
The CNIL found failures to comply with several obligations, namely the obligations to ensure data minimization, to provide a contractual framework for the processing operations carried out by a processor, and to inform users and obtain their consent before writing and reading information on their personal devices.
For further information: CNIL Website
03/15/2023 – French Supervisory Authority | Investigation | Smart Cameras
On 15 March 2023, the French Supervisory Authority (“CNIL”) announced that “smart” cameras, mobile apps, and bank and medical records would be its priority topics for investigations in 2023.
The CNIL carries out investigations on the basis of complaints received and current events, as well as annual priority topics. In 2023, it will focus on the use of “smart” cameras by public actors, the use of the personal credit repayment incidents file, the management of health files, and mobile apps.
For further information: CNIL Website
02/09/2023 – French Supervisory Authority | Guidance | Data Governance Act
On 9 February 2023, the French Supervisory Authority (“CNIL”) published guidance on the economic challenges of implementing the Data Governance Act.
For further information: CNIL Website
01/26/2023 – French Supervisory Authority | Statement | Artificial Intelligence
On 26 January 2023, the French Supervisory Authority (“CNIL”) announced the creation of an Artificial Intelligence (“AI”) Department and the start of its work on learning databases.
The CNIL is creating an AI Department to strengthen its expertise on these systems and its understanding of the risks to privacy while preparing for the implementation of the European regulation on AI. In addition, the CNIL has announced that it will propose initial recommendations on machine learning databases.
For further information: CNIL Website
01/24/2023 – Ministry of Home Affairs | Legislation | Cyberattack Risk Insurance
On 24 January 2023, the French Parliament adopted the LOPMI Act that authorizes the insurability of “cyber-ransoms” paid by victims, subject to the prompt filing of a complaint.
For further information: LOPMI
01/17/2023 – French Supervisory Authority | Sanction | Consent
On 17 January 2023, the French Supervisory Authority (“CNIL”) imposed a €3 million fine on a company which publishes video games for smartphones.
The company was using an essentially technical identifier for advertising purposes without the users’ consent.
For further information: CNIL Website
01/04/2023 – French Supervisory Authority | Sanction | Consent
On 4 January 2023, the French Supervisory Authority (“CNIL”) imposed an €8 million administrative fine on a technology company because it did not collect the consent of French users before depositing and/or writing identifiers used for advertising purposes on their terminals.
The CNIL found that the advertising targeting settings were pre-checked by default and that the user had to perform a large number of actions in order to deactivate this setting.
The CNIL explained the amount of the fine by the scope of the processing, the number of people concerned in France, the profits the company made from advertising revenue indirectly generated from data collected by these identifiers, and the fact that the company has since reached compliance.
For further information: CNIL Website
Germany
03/22/2023 – Supervisory Authorities | Opinion | “Pure Subscription Models”
On 22 March 2023, the Conference of the Independent Data Protection Authorities of Germany (DSK) adopted an opinion on so-called “pure subscription models” on websites.
The opinion assesses pure (no-tracking) subscription models and alternative free consent-based tracking models and provides criteria to assess these alternative access instruments on websites.
For further information: DSK Website [DE]
03/15/2023 – Supervisory Authorities | BfDI | Activity Report
On 15 March 2023, the Federal Commissioner for Data Protection and Freedom of Information (BfDI), Ulrich Kelber, presented the BfDI’s Activity Report for 2022.
For further information: BfDI [DE]
03/15/2023 – Supervisory Authorities | Activity Reports
On 15 March 2023, the Commissioners for Data Protection and Freedom of Information of Baden-Württemberg, Hamburg and Schleswig-Holstein presented their activity reports for 2022.
The activity reports cover various data protection and freedom of information topics. For example, in Schleswig-Holstein, data breaches remained frequent while the number of complaints dropped, with video surveillance being the main cause of complaints. The reports emphasize the need to proactively address risks such as artificial intelligence and data sharing.
For further information: ULD Website [DE] and LfDI-BW Website [DE] and HmbBfDI Website [DE]
03/01/2023 – Supervisory Authorities | Opinion | EU-US Data Privacy Framework
The Hamburg Supervisory Authority (on 1 March 2023) and the German Supervisory Authority (on 28 February 2023) both issued an opinion on the draft adequacy decision on the EU-US Data Privacy Framework.
For further information: Bundestag Website [DE] and BfDI [DE]
02/13/2023 – German Competition Authority | Decision | US Data Transfers
On 13 February 2023, the German Competition Authority (“BKartA”) issued a ruling on data transfers under the GDPR.
In particular, the authority ruled that a company relying on a German subsidiary of a US parent company as a data processor cannot be excluded from a contract bid due to possible violations of the GDPR.
For further information: BKartA Website [DE]
02/09/2023 – ArbG Oldenburg | Decision | Claim for Damages
On 9 February 2023, the Oldenburg Labor Court ordered a company to pay a former employee damages of €10,000 under Article 82 of the GDPR for failing to comply with an information request under Article 15(1) of the GDPR, without requiring the employee to establish any additional (immaterial) harm.
In the court’s opinion, the violation of the GDPR itself already resulted in immaterial harm to be compensated, so no additional proof of harm was required.
Ireland
02/27/2023 – Irish Supervisory Authority | Sanction | Security
On 27 February 2023, the Irish Supervisory Authority (“DPC”) imposed a fine of €750,000 on a banking company for inadequate data security measures.
The inquiry was initiated after the notification to the DPC of a series of 10 data breaches. In this context, the DPC found that the technical and organizational measures in place at the time were not sufficient to ensure the security of the personal data processed.
For further information: DPC Website
02/23/2023 – Irish Supervisory Authority | Sanction | Security
On 23 February 2023, the Irish Supervisory Authority (“DPC”) imposed a €460,000 fine on a health care provider.
The DPC initiated an inquiry after receiving a personal data breach notification related to a ransomware attack affecting the patient data of 70,000 people. The DPC considered that the health care provider failed to ensure that the personal data were processed in a manner that ensured appropriate security.
For further information: DPC Website
01/16/2023 – Irish Supervisory Authority | Sanction | CCTV
On 16 January 2023, the Irish Supervisory Authority (“DPC”) imposed a €50,000 fine on a company and a temporary ban on its processing of personal data with CCTV cameras for violations of the GDPR.
For further information: DPC Website
Italy
03/30/2023 – Italian Supervisory Authority | Temporary limitation | AI Chatbot
On 30 March 2023, the Italian Supervisory Authority (“Garante”) imposed an immediate temporary limitation on the processing of Italian users’ data by a US-based company developing and managing an AI chatbot.
The Garante opened a probe over a suspected breach of the GDPR. The authority alleged “the absence of any legal basis that justifies the massive collection and storage of personal data in order to ‘train’ the algorithms underlying the operation of the platform”. The authority also accused the company of failing to check the age of its users.
For further information: Garante Website [IT]
03/09/2023 – Council of Ministers | Legislation | Whistleblowing
On 9 March 2023, the Italian Council of Ministers approved the whistleblowing legislative decree.
The Council of Ministers announced, on 9 March 2023, the approval, after final review, of the legislative decree transposing the Whistleblowing Directive into Italian law.
For further information: Governo Italiano Website [IT]
02/21/2023 – Italian Supervisory Authority | Sanction | Marketing Practices
The Italian Supervisory Authority (“Garante”) announced, on 21 February 2023, that it issued, on 15 December 2022, a €4.9 million fine against an energy company for various instances of non-compliance with the GDPR, including unlawful marketing practices.
For further information: Garante Website [IT]
02/03/2023 – Italian Supervisory Authority | Temporary limitation | AI Chatbot
On 3 February 2023, the Italian Supervisory Authority (“Garante”) issued an order concerning an AI chatbot, noting that tests it performed identified risks for minors and vulnerable individuals.
The US-based developer was ordered to terminate the processing of data relating to Italian users and to inform the Garante within 20 days of any measures taken to implement its orders.
For further information: Garante Website
Netherlands
02/22/2023 – Dutch Supervisory Authority | Statement | Camera Settings
On 22 February 2023, the Dutch Supervisory Authority (“AP”) published a statement on changes made by a car manufacturer to the settings of the built-in security cameras of its cars, following an investigation of these cameras by the AP.
For instance, the cars may still capture camera images, but only when the user activates that function.
For further information: AP Website [NL]
02/18/2023 – House for Whistleblowers | Legislation | Whistleblowing
On 18 February 2023, the House for Whistleblowers announced the entry into force of the Whistleblower Protection Act.
For further information: AP Website [NL]
Norway
03/01/2023 – Norwegian Supervisory Authority | Preliminary conclusion | Analytics Tool
On 1 March 2023, the Norwegian Supervisory Authority (“Datatilsynet”) published its preliminary conclusion in a case concerning the use of a US-based company’s analytics tool, finding that the use of this tool is not in line with the GDPR.
For further information: Datatilsynet Website [NO]
02/06/2023 – Norwegian Supervisory Authority | Sanction | GDPR Violation
On 6 February 2023, the Norwegian Supervisory Authority (“Datatilsynet”) fined a company operating fitness centers NOK 10 million (approximately €912,940) for various GDPR violations (e.g., lawfulness of processing, transparency and data subjects rights).
For further information: Datatilsynet Website [NO]
Portugal
01/27/2023 – Portuguese Supervisory Authority | Guidelines | Security Measures
On 27 January 2023, the Portuguese Supervisory Authority (“CNPD”) published guidelines on security measures intended to minimize the consequences of attacks on information systems.
In light of the increase in cyberattacks on information systems, the guidelines aim to inform controllers and processors of their legal obligations and list the organizational and technical measures that organizations must consider.
For further information: Press release [PT]
Romania
03/28/2023 – President of Romania | Legislation | Whistleblowing
Law No. 67/2023, which amends Article 6(2) of Law No. 361/2022 on the protection of whistleblowers in the public interest, was published in the Official Gazette on 28 March 2023 and entered into force on 31 March 2023.
For further information: CDEP Website [RO]
Spain
03/16/2023 – Spanish Supervisory Authority | Sanction | Data Minimization
The Spanish Supervisory Authority (“AEPD”) published, on 16 March 2023, its decision in which it imposed a fine of €100,000 on a telecommunications company for violation of the data minimization principle.
For further information: AEPD Website [ES]
03/15/2023 – Spanish Supervisory Authority | Sanction | GDPR Violation
On 15 March 2023, the Spanish Supervisory Authority (“AEPD”) fined a bank €100,000 for a violation of the GDPR.
In particular, the bank used the information provided by the claimant and her child to open several accounts in the name of the child without consent, even though this was not necessary for the services requested.
For further information: AEPD Website [ES]
03/15/2023 – Spanish Supervisory Authority | Sanction | Data Portability
The Spanish Supervisory Authority (“AEPD”) published, on 15 March 2023, a decision in which it imposed a fine of €136,000 on a telecommunications company for completing a data portability request without ensuring the security of the personal data of the client.
For further information: AEPD Website [ES]
03/13/2023 – Spanish Senate | Legislation | Whistleblowing
The Spanish Law 2/2023 implementing the EU Whistleblower Directive was published in the Official Gazette on 20 February 2023 and entered into force on 13 March 2023.
For further information: BOE Website [ES]
United Kingdom
03/28/2023 – UK Supervisory Authority | Guidance | Direct Marketing
On 28 March 2023, the UK Supervisory Authority (“ICO”) issued guidance to businesses operating in regulated private sectors (e.g., finance, communications or utilities) on direct marketing and regulatory communications.
The guidance aims to help businesses identify when a regulatory communication message might count as direct marketing. If the message is direct marketing, it also covers what businesses need to do to comply with data protection and ePrivacy law.
For further information: ICO Website
03/16/2023 – UK Supervisory Authority | Sanction | GDPR Violations
On 16 March 2023, the UK Supervisory Authority (“ICO”) reached an agreement with a retailer to reduce the monetary penalty notice issued for breaching the GDPR from £1,350,000 to £250,000.
The ICO found that the company was making assumptions about customers’ medical conditions, based on their purchase history, in order to sell them further health-related products. The processing involved special category data, and the ICO concluded that it had been conducted without a lawful basis. The retailer appealed the decision, which led to an agreement to reduce the monetary penalty notice, taking into account that the retailer had stopped the unlawful processing.
For further information: ICO Website
03/15/2023 – UK Supervisory Authority | Guidelines | AI and Data Protection
The UK Supervisory Authority (“ICO”) announced on 15 March 2023 that it had updated its guidance on artificial intelligence (“AI”) and data protection.
The ICO indicates that the changes respond to requests from UK industry to clarify requirements for fairness in AI.
For further information: ICO Website
03/13/2023 – UK Supervisory Authority | Guidance | Data Protection by Default
On 13 March 2023, the UK Supervisory Authority (“ICO”) published new guidance to help user experience designers, product managers and software engineers embed data protection into their products and services by default.
The guidance looks at key privacy considerations for each stage of product design, from kick-off to post-launch. It includes both examples of good practice and practical steps that organisations can take to comply with data protection law when designing websites, apps or other technology products and services.
For further information: ICO Website
03/08/2023 – UK Government | Legislation | Cookies
On 8 March 2023, the UK Government re-introduced legislation aiming to cut down paperwork for businesses and reduce unnecessary cookie pop-ups.
The Data Protection and Digital Information Bill was first introduced in summer 2022 and paused in September 2022 so ministers could engage in a co-design process with business leaders and data experts. According to the government, this was to ensure that the new regime built on the UK’s high standards for data protection and privacy and sought to ensure data adequacy while moving away from the “one-size-fits-all” approach of the European Union’s GDPR.
For further information: UK Government Website
02/16/2023 – UK Supervisory Authority | Guidance | Protection of Children
On 16 February 2023, the UK Supervisory Authority (“ICO”) issued a series of recommendations to game developers to ensure the protection of children and compliance with data protection laws.
For further information: ICO Website
This newsletter has been prepared by the EU Privacy team of Gibson Dunn. For further information, you may contact us by email:
- Ahmed Baladi – Partner, Co-Chair, PCCP Practice, Paris (abaladi@gibsondunn.com)
- Vera Lukic – Partner, Paris (vlukic@gibsondunn.com)
- Kai Gesing – Partner, Munich (kgesing@gibsondunn.com)
- Joel Harrison – Partner, London (jharrison@gibsondunn.com)
- Alison Beal – Partner, London (abeal@gibsondunn.com)
- Thomas Baculard – Associate, Paris (tbaculard@gibsondunn.com)
- Roxane Chrétien – Associate, Paris (rchretie@gibsondunn.com)
- Christoph Jacob – Associate, Munich (cjacob@gibsondunn.com)
- Yannick Oberacker – Associate, Munich (yoberacker@gibsondunn.com)
- Clémence Pugnet – Associate, Paris (cpugnet@gibsondunn.com)
© 2023 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice. Please note, prior results do not guarantee a similar outcome.
The Council of Europe Has Adopted the First International Treaty on Artificial Intelligence.
- Executive Summary
On May 17, 2024, the Council of Europe adopted the first ever international legally binding treaty on artificial intelligence, human rights, democracy, and the rule of law (Convention)[1]. In contrast to the forthcoming EU AI Act[2], which will apply only in EU member states, the Convention is an international, potentially global treaty with contributions from various stakeholders, including the US. The ultimate goal of the Convention is to establish a global minimum standard for protecting human rights from risks posed by artificial intelligence (AI). The underlying core principles and key obligations are very similar to the EU AI Act, including a risk-based approach and obligations considering the entire life cycle of an AI system. However, while the EU AI Act encompasses comprehensive regulations on the development, deployment, and use of AI systems within the EU internal market, the AI Convention primarily focuses on the protection of universal human rights of people affected by AI systems. It is important to note that the Convention, as an international treaty, does not impose immediate compliance requirements; instead, it serves as a policy framework that signals the direction of future regulations and aims to align procedures at an international level.
- Background and Core Principles
The Convention was drawn up by the Committee on Artificial Intelligence (CAI), an intergovernmental body bringing together the 46 member states of the Council of Europe, the European Union, and 11 non-member states (namely Argentina, Australia, Canada, Costa Rica, the Holy See, Israel, Japan, Mexico, Peru, the United States of America, and Uruguay) as well as representatives of the private sector, civil society, and academia. Such multi-stakeholder participation has been shown to promote acceptance of similar regulatory efforts. The main focus lies on the protection of human rights, democracy, and the rule of law, the core guiding principles of the Council of Europe, by establishing common minimum standards for AI systems at the global level.
- Scope of Application
The Convention is in line with the updated OECD definition of AI, which provides for a broad definition of an “artificial intelligence system” as “a machine-based system that for explicit or implicit objectives infers from the input it receives how to generate outputs such as predictions, content, recommendations, or decisions that may influence physical or virtual environments.” The definitions of AI systems in the EU AI Act, the updated OECD definition, and US Executive Order (“US EO”) 14110 are generally aligned, as they all emphasize machine-based systems capable of making predictions, recommendations, or decisions that impact physical or virtual environments, with varying levels of autonomy and adaptiveness. However, the EU and OECD definitions highlight post-deployment adaptiveness, while the US EO focuses more on the process of perceiving environments, abstracting perceptions into models, and using inference for decision-making.
Noteworthy is the emphasis on the entire life cycle of AI systems (similar to the EU AI Act). The Convention is primarily intended to regulate the activities of public authorities – including companies acting on their behalf. However, parties to the Convention must also address risks arising from the use of AI systems by private companies, either by applying the same principles or through “other appropriate measures,” which are not specified. The Convention also contains exceptions, similar to those laid down by the EU AI Act. Its scope excludes:
- activities within the lifecycle of AI systems relating to the protection of national security interests, regardless of the type of entities carrying out the corresponding activities;
- all research and development activities regarding AI systems not yet made available for use; and
- matters relating to national defense.
- Obligations and Principles
The Convention is principles-based and therefore by its nature formulated in high-level commitments and open-ended terms. It contains several principles and obligations requiring the parties to take measures to ensure the protection of human rights, the integrity of democratic processes, and respect for the rule of law. These core obligations are familiar, as they also form the basis of the EU AI Act. The core obligations include:
- measures to protect the individual’s ability to freely form opinions;
- measures ensuring adequate transparency and oversight requirements, in particular regarding the identification of content generated by AI systems;
- measures ensuring accountability and responsibility for adverse impacts;
- measures to foster equality and non-discrimination in the use of AI systems, including gender equality;
- the protection of privacy rights of individuals and their personal data;
- to foster innovation, the parties are also obliged to enable the establishment of controlled environments for the development and testing of AI systems.
Two other key elements of the Convention are that each party must have the ability to prohibit certain AI systems that are incompatible with the Convention’s core principles and must provide accessible and effective remedies for human rights violations. The examples given in the Convention show that current issues have been taken into account; election interference, for instance, is one of the risks addressed.
- Criticism and Reactions
The Convention has been criticized by civil society organizations[3] and the European Data Protection Supervisor[4]. The main points of criticism include:
- Broad Exceptions: The Convention includes exceptions for national security, research and development, and national defense. Critics argue that these loopholes could undermine essential safeguards and lead to unchecked AI experimentation and use in military applications without oversight. Similar criticism has been levelled at the EU AI Act.
- Vague Provisions and Private Sector Regulation: The Convention’s principles and obligations are seen as too general, lacking specific criteria for enforcement. Critics highlight the absence of explicit bans on high-risk AI applications, such as autonomous weapons and mass surveillance. Additionally, the Convention requires addressing risks from private companies but does not specify the measures, leading to concerns about inconsistent regulation.
- Enforcement and Accountability: The Convention mandates compliance reporting but lacks a robust enforcement mechanism. Critics argue that without stringent enforcement and accountability, the Convention’s impact will be limited. There are also concerns about the adequacy of remedies for human rights violations by AI systems, due to vague implementation guidelines.
- Implementation and Entry into Force
The parties to the Convention need to take measures for sufficient implementation. In order to take account of different legal systems, each party may opt to be directly bound by the relevant Convention provision or take measures to comply with the Convention’s provisions. Overall, the Convention provides only for a common minimum standard of protection; parties are free to adopt more extensive regulations. To ensure compliance with the Convention, each party must report to the Conference of the Parties within two years of becoming a party and periodically thereafter on the activities it has undertaken.
- Next Steps and Takeaways
The next step is for States to sign the declaration of accession. The Convention will be opened for signature on September 5, 2024. It is expected, although not certain, that the Council of Europe Member States and the other 11 States (including the US) that contributed to the draft Convention will become parties.
In the EU, the Convention will complement the EU AI Act, sharing its risk-based approach and similar core principles. Given the very general wording of the Convention’s provisions and the broad exceptions to its scope, it seems that the EU AI Act, adopted on May 21, 2024, remains the most comprehensive and prescriptive set of standards in the field of AI, at least in the EU. However, as the Convention will form the bedrock of AI regulation in the Council of Europe, it is to be expected that the European Court of Human Rights (ECtHR) will in the future draw inspiration from the Convention when interpreting the European Convention on Human Rights (ECHR). This may have significant cross-fertilisation effects for EU fundamental rights law, including in the implementation of the EU AI Act, as the ECHR forms the minimum standard of protection under Article 52(3) of the Charter of Fundamental Rights of the European Union (Charter). Both States and private companies will therefore have to be cognisant of the potential overlapping effects of the Convention and the EU AI Act.
__________
[1] See Press release here. See the full text of the Convention here.
[2] On May 21, 2024, the Council of the European Union finally adopted the AI Regulation (AI Act). For details on the EU AI Act, please also see: https://www.gibsondunn.com/artificial-intelligence-review-and-outlook-2024/.
[3] See https://ecnl.org/sites/default/files/2024-03/CSOs_CoE_Calls_2501.docx.pdf.
[4] See https://www.edps.europa.eu/press-publications/press-news/press-releases/2024/edps-statement-view-10th-and-last-plenary-meeting-committee-artificial-intelligence-cai-council-europe-drafting-framework-convention-artificial_en#_ftnref2.
The following Gibson Dunn lawyers assisted in preparing this update: Robert Spano, Joel Harrison, Christoph Jacob, and Yannick Oberacker.
Gibson Dunn lawyers are available to assist in addressing any questions you may have regarding these issues. Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any leader or member of the firm’s Artificial Intelligence, Privacy, Cybersecurity & Data Innovation or Environmental, Social and Governance (ESG) practice groups:
Artificial Intelligence:
Cassandra L. Gaedt-Sheckter – Palo Alto (+1 650.849.5203, cgaedt-sheckter@gibsondunn.com)
Vivek Mohan – Palo Alto (+1 650.849.5345, vmohan@gibsondunn.com)
Robert Spano – London/Paris (+44 20 7071 4902, rspano@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213.229.7186, evandevelde@gibsondunn.com)
Privacy, Cybersecurity and Data Innovation:
Ahmed Baladi – Paris (+33 (0) 1 56 43 13 00, abaladi@gibsondunn.com)
S. Ashlie Beringer – Palo Alto (+1 650.849.5327, aberinger@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33 180, kgesing@gibsondunn.com)
Joel Harrison – London (+44 20 7071 4289, jharrison@gibsondunn.com)
Jane C. Horvath – Washington, D.C. (+1 202.955.8505, jhorvath@gibsondunn.com)
Rosemarie T. Ring – San Francisco (+1 415.393.8247, rring@gibsondunn.com)
Environmental, Social and Governance (ESG):
Susy Bullock – London (+44 20 7071 4283, sbullock@gibsondunn.com)
Elizabeth Ising – Washington, D.C. (+1 202.955.8287, eising@gibsondunn.com)
Perlette M. Jura – Los Angeles (+1 213.229.7121, pjura@gibsondunn.com)
Ronald Kirk – Dallas (+1 214.698.3295, rkirk@gibsondunn.com)
Michael K. Murphy – Washington, D.C. (+1 202.955.8238, mmurphy@gibsondunn.com)
Selina S. Sagayam – London (+44 20 7071 4263, ssagayam@gibsondunn.com)
© 2024 Gibson, Dunn & Crutcher LLP. All rights reserved. For contact and other information, please visit us at www.gibsondunn.com.
Attorney Advertising: These materials were prepared for general informational purposes only based on information available at the time of publication and are not intended as, do not constitute, and should not be relied upon as, legal advice or a legal opinion on any specific facts or circumstances. Gibson Dunn (and its affiliates, attorneys, and employees) shall not have any liability in connection with any use of these materials. The sharing of these materials does not establish an attorney-client relationship with the recipient and should not be relied upon as an alternative for advice from qualified counsel. Please note that facts and circumstances may vary, and prior results do not guarantee a similar outcome.
On 24 April 2024, the Corporate Sustainability Due Diligence Directive[1] (“CSDDD” or “Directive”) was finally passed by the European Parliament (“Parliament”), marking the end of the key stages of the legislative process, after four years. The CSDDD establishes far-reaching mandatory human rights and environmental obligations on both European Union (“EU”) and non-EU companies meeting certain turnover thresholds, starting from 2027. Those obligations apply with respect to a company’s own operations and those of its subsidiaries—but also to those carried out by a company’s “business partners” in the company’s “chain of activities”.[2] Generally, the CSDDD, one of the most debated pieces of European legislation of recent times, establishes an obligation on in-scope companies to:
- identify and assess (due diligence) adverse human rights and environmental impacts;
- prevent, mitigate and bring to an end / minimise such adverse impacts; and
- adopt and put into effect a transition plan for climate change mitigation which aims to ensure—through best efforts—compatibility of the company’s business model and strategy with limiting global warming to 1.5 °C in line with the Paris Agreement.
The CSDDD also sets out minimum requirements (including the ability for claims to be made by trade unions or civil society organisations) of a liability regime to be implemented by EU Member States for violation of the obligation to prevent, mitigate and bring to an end / minimise adverse impacts.
Key Takeaways
1. Legislative History
As reported in our earlier article,[3] in April 2020, the European Commission (“Commission”) announced its intention to propose a directive requiring companies to undertake mandatory human rights and environmental due diligence across their value chains, and a legislative proposal followed in February 2022.[4] At that time, some Member States had already adopted national due diligence laws,[5] and the Commission considered it important to ensure a level playing field for companies operating within the internal market. The Directive was further intended to contribute to the EU’s transition towards a sustainable economy and sustainable development through the prevention and mitigation of adverse human rights and environmental impacts in companies’ supply chains.
After multiple rounds of negotiations and material amendments submitted by all EU institutions, as well as extensive negotiations between Member States, the Permanent Representative Committee of the Council of the European Union (“Council”) endorsed the draft Directive on 15 March 2024, with the Parliament voting in favour on 24 April 2024.[6]
Notably, the CSDDD crystallises into hard law at the EU level certain voluntary international standards on responsible business conduct, such as the UN Guiding Principles on Business and Human Rights (“UNGPs”), the OECD Guidelines for Multinational Enterprises, the OECD Guidance on Responsible Business Conduct, and sectoral direction. Prior to the CSDDD coming into force, these voluntary instruments will continue to offer valuable “best practice” guidance to in-scope companies.
2. Scope of Application and Timing
The Directive will apply to EU companies (i.e., companies formed in accordance with the legislation of a Member State) where a company meets the following thresholds (in each instance measured in the last financial year for which annual financial statements have been or should have been adopted):
- (a) has more than 1,000 employees on average (including in certain circumstances, temporary agency workers) and a net worldwide turnover of more than EUR 450 million;[7] or
- (b) is the ultimate parent company of a group that collectively reaches the thresholds in (a); or
- (c) has entered into or is the ultimate parent company of a group that entered into franchising or licensing agreements in the EU in return for royalties where these royalties amount to more than EUR 22.5 million and provided that the company had or is the ultimate parent company of a group that had a net worldwide turnover of more than EUR 80 million.
The Directive has extra-territorial effect since it also applies to non-EU companies (i.e., companies formed in accordance with the legislation of a non-EU country), if that company:
- (a) has generated a net turnover in the EU of more than EUR 450 million; or
- (b) is the ultimate parent company of a group that collectively reaches the thresholds under (a); or
- (c) has entered into or is the ultimate parent company of a group that entered into franchising or licensing agreements in the EU in return for royalties where these royalties amount to more than EUR 22.5 million in the EU and provided that the company had or is the ultimate parent company of a group that had a net turnover of more than EUR 80 million in the EU.
For the Directive to apply, for both EU and non-EU companies, the threshold conditions must have been satisfied for at least two consecutive financial years. Smaller companies operating in the “chain of activities” of in-scope companies will also be indirectly affected because of contractual requirements imposed on them by companies within the scope of the Directive (discussed further below).
It is notable that the scope of application of the CSDDD is more limited than that of the Corporate Sustainability Reporting Directive (“CSRD”),[8] which (save with respect to franchisors or licensors) applies both lower employee and turnover thresholds. Whilst the CSDDD is expected to apply to around 5,500 companies, the CSRD covers approximately 50,000 companies.
3. Obligations on In-scope Companies
(a) Adopt Human Rights and Environmental Due Diligence
The Directive introduces so-called human rights and environmental “due diligence obligations”. These apply to a company’s own operations, those of its subsidiaries, and those of its direct and indirect business partners throughout their “chain of activities”. The Directive defines “chain of activities” as activities of a company’s:
- upstream business partners,[9] relating to the production of goods or the provision of services by the company, including the design, extraction, sourcing, manufacture, transport, storage and supply of raw materials, products or parts of the products and development of the product or the service; and
- downstream business partners, relating to the distribution, transport and storage of the product, where the business partners carry out those activities for the company or on behalf of the company.[10]
Companies will be required to:
- develop a due diligence policy[11] that ensures risk-based due diligence, and integrate due diligence into their relevant policies and risk management systems;
- identify and assess actual or potential adverse human rights and environmental impacts (which are defined by reference to obligations or rights enshrined in international instruments),[12] including mapping operations to identify general areas where adverse impacts are most likely to occur and to be most severe; and
- prevent and mitigate potential adverse impacts and bring to an end / minimise the extent of actual adverse impacts. Where it is not feasible to prevent, mitigate, bring to an end or minimise all identified adverse impacts at the same time to their full extent, companies must prioritise the steps they take based on the severity and likelihood of the adverse impacts.
In each instance, companies will be required to take “appropriate measures”; that is, measures that “effectively addres[s] adverse impacts in a manner commensurate to the degree of severity and the likelihood of the adverse impact”.[13] Such measures must take into account the circumstances of the specific case, including the nature and extent of the adverse impact and relevant risk factors.
With regards to the prevention of potential adverse impacts, companies are required (amongst other obligations) to:
- develop and implement a prevention action plan, with reasonable and clearly defined timelines for the implementation of appropriate measures and qualitative and quantitative indicators for measuring improvement;
- seek contractual assurances from a direct business partner that it will ensure compliance with the company’s code of conduct / prevention action plan, including by establishing corresponding contractual assurances from its partners if their activities are part of the company’s chain of activities;
- make necessary financial or non-financial investments, adjustments or upgrades, such as into facilities, production or other operational processes and infrastructures; and
- provide targeted and proportionate support for an SME[14] which is a business partner of the company.
Similar obligations are imposed in the context of bringing actual adverse impacts to an end.
Notably, regarding (b), companies must verify compliance. To do so, the CSDDD states that companies “may refer to” independent third-party verification, including through industry or multi-stakeholder initiatives.[15]
The financial sector has more limited obligations. “Regulated financial undertakings” are only subject to due diligence obligations for their own operations, those of their subsidiaries and the upstream part of their chain of activities. Such undertakings are expected to consider adverse impacts and use their “leverage” to influence companies, including through the exercise of shareholders’ rights.
(b) Adopt / Put into Effect a Climate Transition Plan
Companies will also be required to adopt and put into effect a climate change mitigation transition plan (“CTP”), to be updated annually, which aims to ensure that a company’s business model and strategy are compatible with limiting global warming to 1.5°C in line with the Paris Agreement and the objective of achieving climate neutrality, including intermediate and 2050 climate neutrality targets. The CTP should also address, where relevant, the exposure of the company to coal-, oil- and gas-related activities.
The CTP must contain: (a) time-bound targets in five-year steps from 2030 to 2050 including, where appropriate, absolute greenhouse gas emission reduction targets for scope 1, 2 and 3 emissions; (b) description of decarbonisation levers and key actions planned to reach the targets identified in (a); (c) details of the investments and funding supporting the implementation of the CTP; and (d) a description of the role of the administrative, management and supervisory bodies with regard to the CTP.[16]
Companies which report a CTP in accordance with the CSRD or are included in the CTP of their parent undertaking are deemed to have complied with the CSDDD’s CTP obligation. Regulated financial undertakings will also have to adopt a CTP ensuring their business model complies with the Paris Agreement.
(c) Provide Remediation
Consistent with the right to a remedy under the UNGPs, Member States must ensure that where a company has caused or jointly caused an actual adverse impact, it will provide “remediation”.[17] This is defined in the Directive as “restoration of the affected person or persons, communities or environment to a situation equivalent or as close as possible to the situation they would be in had an actual adverse impact not occurred”.[18] Such remediation should be proportionate to the company’s implication in the adverse impact, including financial or non-financial compensation to those affected and, where applicable, reimbursement of any costs incurred by public authorities for necessary remedial measures.
(d) Meaningfully[19] Engage with Stakeholders
Companies are required to effectively engage with stakeholders. This includes carrying out consultations at various stages of the due diligence process, during which companies must provide comprehensive information.
(e) Establish a Notification Mechanism and Complaints Procedure
Member States must ensure that companies provide the possibility for persons or organisations with legitimate concerns regarding any adverse impacts to submit complaints.[20] There should then be a fair, publicly available, accessible, predictable and transparent procedure for dealing with complaints, of which relevant workers, trade unions and other workers’ representatives should be informed. Companies should take reasonably available measures to avoid any retaliation.
Notification mechanisms must also be established through which persons and organisations can submit information about adverse impacts.
Companies will be allowed to fulfil these obligations through collaborative complaints procedures and notification mechanisms, including those established jointly by companies, through industry associations, multi-stakeholder initiatives or global framework agreements.
(f) Monitor and Assess Effectiveness
Member States shall ensure that companies carry out periodic assessments of their own operations and measures, those of their subsidiaries and, where related to the chain of activities of the company, those of their business partners. These will assess implementation and monitor the adequacy and effectiveness of the identification, prevention, mitigation, bringing to an end and minimisation of the extent of adverse impacts.
Where appropriate, assessments are to be based on qualitative and quantitative indicators and carried out without undue delay after a significant change occurs, but at least every 12 months and whenever there are reasonable grounds to believe that new risks of the occurrence of those adverse impacts may arise.[21]
(g) Communicate Compliance
Companies will be required to report on CSDDD matters by publishing an annual statement on their website within 12 months of the end of their financial year, unless they are subject to sustainability reporting obligations under the CSRD. The CSDDD does not introduce any new reporting obligations in addition to those under the CSRD.[22]
The contents of the annual statement will be defined by the Commission through a subsequent implementing act.
4. Enforcement and Sanctions
The Directive requires Member States to designate independent “supervisory authorities” to supervise compliance (“Supervisory Authority”).[23] A Supervisory Authority must have adequate powers and resources, including the power to require companies to provide information and carry out investigations. Investigations may be initiated by the Supervisory Authorities’ own motion or as a result of substantiated concerns raised by third parties.
Supervisory Authorities are to be empowered to “at least”: (a) order the cessation of infringements, the abstention from any repetition of the relevant conduct and the taking of remedial measures; (b) impose penalties; and (c) adopt interim measures in case of imminent risk of severe and irreparable harm.
Sanctions regimes adopted by Member States must be effective, proportionate and dissuasive. This includes pecuniary penalties with a maximum limit of not less than 5% of the in-scope company’s worldwide net turnover.[24] Additionally, the Directive stipulates that any decision of a Supervisory Authority containing penalties is: (a) published, (b) publicly available for at least five years; and (c) sent to the “European Network of Supervisory Authorities” (“naming and shaming”).[25]
Besides these sanctions, compliance with the CSDDD’s obligations can be used as part of the award criteria for public and concession contracts.
5. Civil Liability of Companies
Member States must establish a civil liability regime for companies which intentionally or negligently fail to comply with the CSDDD’s obligations and where damage has been caused to a person’s legal interest (as protected under national law) as a result of that failure.[26] However, a company cannot be held liable if the damage was caused only by its business partners in its chain of activities.
Member States must provide for “reasonable conditions” under which any alleged injured party may authorise a trade union, a non-governmental human rights or environmental organisation, another NGO, or a national human rights institution to bring actions to enforce the rights of the alleged injured party.[27]
The Directive requires a limitation period for bringing actions for damages of at least five years and, in any case, not shorter than the limitation period laid down under general civil liability regimes of Member States.
Regarding compensation, Member States are required to lay down rules that fully compensate victims for the damage they have suffered as a direct result of the company’s failure to comply with the Directive. However, the Directive states that deterrence through damages (i.e., punitive damages) or any other form of overcompensation should be prohibited.
6. Next Steps / Implementation
The Directive must now be formally adopted by the Council and will subsequently come into force on the 20th day following that of its publication in the Official Journal of the EU, which is expected to occur in the first half of 2024. Once the Directive enters into force, Member States will need to transpose it into national law within two years, i.e., by mid-2026.
Depending on their size, companies will have between three to five years from the Directive entering into force to implement its requirements (i.e., likely until between 2027 and 2029):
- three years (i.e., likely in 2027) for (a) EU companies with more than 5,000 employees and EUR 1,500 million net worldwide turnover, and (b) non-EU companies with more than EUR 1,500 million net turnover generated in the EU.
- four years (i.e., likely in 2028) for: (a) companies with more than 3,000 employees and EUR 900 million net worldwide turnover and (b) non-EU companies with more than EUR 900 million net turnover generated in the EU; and
- five years (i.e., likely in 2029) for companies with more than 1,000 employees and EUR 450 million turnover.
7. Relationship between the CSDDD and other EU Laws Protecting Human Rights and the Environment
The Directive is part of a series of EU regulations which aim to protect human rights and the environment through both reporting and due diligence obligations. Such regulations include the CSRD and the Sustainable Finance Disclosure Regulation, which impose mandatory reporting obligations, as well as the Regulation on Deforestation-free Products, the Conflict Minerals Regulation, the Batteries Regulation and the Forced Labour Ban Regulation (which, coincidentally, was also approved by the European Parliament on 24 April 2024),[28] which impose due diligence requirements on companies in certain sectors / circumstances.
In this context, the CSDDD will become the “default” EU due diligence regime. The Directive expressly provides that its obligations are without prejudice to other, more specific EU regimes, meaning that if a provision of the CSDDD conflicts with another EU regime providing for more extensive or specific obligations, then the latter will prevail.
8. Practical Considerations for In-Scope Companies
Given the significance of expectations and liabilities in the CSDDD, in-scope companies would be well advised to commence preparation now, notwithstanding the implementation timeframe. Indeed, the types of measures that the CSDDD requires to be implemented will take time to operationalise. Functions and entities across multinationals will need to be engaged in that implementation, and it is prudent to involve key internal stakeholders (including legal and compliance functions) in that process from the outset.
The types of next steps in-scope companies should be considering now include:
First, mapping current and potentially future upstream and downstream business relationships to understand where any human rights and environmental risks exist. Any gaps or concerns should be addressed. Additionally, effective systems should be implemented to continually monitor risks within the chain of activities.
Second, putting in place a risk-based due diligence policy containing a description of the company’s approach, as well as supplier codes of conduct, which describe the rules and principles to be followed throughout the company and its subsidiaries. Codes of conduct should apply to all relevant corporate functions and operations, including procurement, employment and purchasing decisions.
Third, considering whether it is appropriate to involve lawyers in the development of internal due diligence systems in order to seek to apply privilege to relevant communications and documentation. This is particularly important given the: (a) matrix of legal regulation which applies in this space; and (b) envisaged regulatory and civil liability regimes.
Fourth, inserting appropriate contractual language into business partner contracts. The CSDDD requires the Commission, in consultation with Member States and stakeholders, to adopt guidance in this regard. However, the Commission has 30 months from the entry into force of the CSDDD to adopt such guidance.
Fifth, training employees—and being cognisant that training should not be limited just to those persons directly involved with sustainability compliance and reporting. Employees should understand how to spot adverse human rights and environmental impacts and understand the actions to be taken when they do.
Sixth, establishing operational level grievance mechanisms for rights holders, their representatives and civil society organisations. Such mechanisms act not only as a tool to remedy and redress but can be harnessed preventively as an early warning system for the identification and analysis of adverse impacts.
Seventh, engaging meaningfully with stakeholders, which will require companies to identify their relevant stakeholders and to design effective engagement processes.
Last, given the overlapping nature of some of the EU directives and regulations in this space (as well as laws at the Member State level), mapping all relevant obligations to ensure consistent compliance and drive efficiencies where practicable. It is notable that the Directive explicitly states that it does not prevent Member States from imposing further, more stringent obligations on companies—so companies will want to keep this under review.
__________
[1] European Parliament legislative resolution of 24 April 2024 on the proposal for a directive of the European Parliament and of the Council on Corporate Sustainability Due Diligence and amending Directive (EU) 2019/1937.
[2] Art. 1(a) of the Directive.
[3] See our previous client alert addressing Mandatory Corporate Human Rights Due Diligence.
[4] See our previous client alert addressing the European Commission’s draft directive on “Corporate Sustainability Due Diligence”.
[5] See, for example, France’s “Loi de Vigilance” enacted in 2017, which inserted provisions into the French Commercial Code imposing substantive requirements on companies in relation to human rights and environmental due diligence. Specifically, companies with more than 5,000 employees in France (or 10,000 employees in France or abroad) are required to establish, implement and publish a “vigilance plan” to address risks within their supply chains or which arise from the activities of direct or indirect subsidiaries or subcontractors. Such plans should also include action plans to mitigate those risks and prevent damage, as well as a monitoring system to ensure that the plan is effectively implemented. (See our previous client alert addressing global legislative developments and proposals in the burgeoning field of mandatory corporate human rights due diligence). Meanwhile in Germany, the Supply Chain Due Diligence Act 2023 (the “SCDDA”) was enacted, imposing due diligence obligations on companies with a statutory seat in Germany and more than 1,000 employees, regardless of revenue. In many instances, the CSDDD and the SCDDA obligations overlap, although there are some differences. For example, whilst the CSDDD extends obligations to the company’s “chain of activities”, the SCDDA focuses primarily on direct suppliers. An in-scope company may also be required to conduct due diligence on its indirect suppliers if the company has substantiated knowledge of grievances or violations of the law. The German legislator is expected to align the obligations under the CSDDD and the SCDDA, as it did in relation to the CSRD.
[6] Press Release of the European Parliament, 24 April 2024, “Due diligence: MEPs adopt rules for firms on human rights and environment”.
[7] Turnover of branches of the relevant entity is also to be taken into account when calculating whether a threshold has been reached.
[8] See our previous client alert addressing the CSRD.
[9] See Art. 3(1)(f) of the Directive, which defines “business partner” as “an entity (i) with which the company has a commercial agreement related to the operations, products or services of the company or to which the company provides services pursuant to point (g) (‘direct business partner’), or (ii) which is not a direct business partner but which performs business operations related to the operations, products or services of the company (‘indirect business partner’)”.
[10] See Art. 3(1)(g) of the Directive.
[11] See Art. 5 of the Directive. The company’s risk-based due diligence policy should be developed in consultation with its employees and their representatives and be updated after a significant change or at least every 24 months (Art. 7(3) of the Directive). It shall contain all of the following: (a) a description of the company’s approach, including in the long term, to due diligence; (b) a code of conduct describing rules and principles to be followed throughout the company and its subsidiaries, and the company’s direct or indirect business partners; and (c) a description of the processes put in place to integrate due diligence into the relevant policies and to implement due diligence, including the measures taken to verify compliance with the code of conduct and to extend its application to business partners.
[12] See Art. 3(1)(b) and (c). Adverse environmental impacts are defined as an adverse impact on the environment resulting from the breach of the prohibitions and obligations listed in Part I, Section 1, points 15 and 16 (the prohibition of causing any measurable environmental degradation and the right of individuals, groupings and communities to lands and resources and the right not to be deprived of means of subsistence), and Part II of the Annex to the Directive, which includes, for example, the obligation to avoid or minimise adverse impacts on biological diversity, interpreted in line with the 1992 Convention on Biological Diversity and applicable law in the relevant jurisdiction. Adverse human rights impacts are defined as an adverse impact on one of the human rights listed in Part I, Section 1, of the Annex to the Directive, as those human rights are enshrined in the international instruments listed in Part I, Section 2, of the Annex to the Directive, for example, The Convention on the Rights of the Child and The International Covenant on Civil and Political Rights.
[13] See Art. 3(1)(o) of the Directive.
[14] This is defined in Art. 3(1)(i) of the Directive as “a micro, small or a medium-sized undertaking, irrespective of its legal form, that is not part of a large group…”.
[15] Art. 10(5) of the Directive.
[16] Art. 22 of the Directive.
[17] Art. 12 of the Directive.
[18] Art. 3(1)(t) of the Directive.
[19] Whilst the text of Art. 13(1) of the Directive refers to “effective” engagement with stakeholders, the title of the provision refers to “meaningful” engagement, which is also found in the Recitals.
[20] Art. 14 of the Directive.
[21] Art. 15 of the Directive.
[22] Art. 16 of the Directive.
[23] Art. 24(1) of the Directive. For France and Germany, we expect the “Supervisory Authority” to be the same authority as is currently overseeing compliance with their analogous due diligence regimes.
[24] Art. 27(4) of the Directive.
[25] Art. 27(5) of the Directive.
[26] Art. 29 of the Directive.
[27] Art. 29(3)(d) of the Directive.
[28] See Press Release of the European Parliament on 23 April 2024, “Products made with forced labour to be banned from EU single market”.
Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. If you wish to discuss any of the matters set out above, please contact the Gibson Dunn lawyer with whom you usually work, any member of Gibson Dunn’s Environmental, Social and Governance (ESG) practice group, or the following authors in London, Paris and Munich:
London:
Selina S. Sagayam – London (+44 20 7071 4263, ssagayam@gibsondunn.com)
Susy Bullock – London (+44 20 7071 4283, sbullock@gibsondunn.com)
Stephanie Collins – London (+44 20 7071 4216, scollins@gibsondunn.com)
Alexa Romanelli – London (+44 20 7071 4269, aromanelli@gibsondunn.com)
Harriet Codd (+44 20 7071 4057, hcodd@gibsondunn.com)
Paris:
Robert Spano – Paris/London (+33 1 56 43 14 07, rspano@gibsondunn.com)
Munich:
Ferdinand Fromholzer (+49 89 189 33-270, ffromholzer@gibsondunn.com)
Markus Rieder (+49 89 189 33-260, mrieder@gibsondunn.com)
Katharina Humphrey (+49 89 189 33-217, khumphrey@gibsondunn.com)
Julian von Imhoff (+49 89 189 33-264, jvonimhoff@gibsondunn.com)
Carla Baum (+49 89 189 33-263, cbaum@gibsondunn.com)
Melina Kronester (+49 89 189 33-225, mkronester@gibsondunn.com)
Julian Reichert (+49 89 189 33-229, jreichert@gibsondunn.com)
Marc Kanzler (+49 89 189 33-269, mkanzler@gibsondunn.com)
© 2024 Gibson, Dunn & Crutcher LLP. All rights reserved. For contact and other information, please visit us at www.gibsondunn.com.
Attorney Advertising: These materials were prepared for general informational purposes only based on information available at the time of publication and are not intended as, do not constitute, and should not be relied upon as, legal advice or a legal opinion on any specific facts or circumstances. Gibson Dunn (and its affiliates, attorneys, and employees) shall not have any liability in connection with any use of these materials. The sharing of these materials does not establish an attorney-client relationship with the recipient and should not be relied upon as an alternative for advice from qualified counsel. Please note that facts and circumstances may vary, and prior results do not guarantee a similar outcome.
In a judgment dated 14 October 2021 related to the so-called “Case of the Century”, the Paris Administrative Court (the Court) ordered the State to make good the consequences of its failure to reduce greenhouse gas (GHG) emissions. In this respect, the Court ordered that the excess of the GHG emissions cap set by the first carbon budget be offset by 31 December 2022 at the latest. The French Government remains free to choose the appropriate measures to achieve this result.
I. Background to the Judgment
In March 2019, four non-profit organizations filed petitions before the Court seeking recognition of the French State’s failure to combat climate change, an order requiring the State to compensate both their moral prejudice and the ecological damage, and an end to the State’s failures to meet its obligations.
In a judgment dated February 3, 2021, the Court ruled that the State should compensate for the ecological damage caused by the failure to comply with the objectives set by France in terms of reducing GHG emissions and, more specifically, the objectives contained in the carbon budget for the period 2015-2019. As a reminder, France has defined a National Low-Carbon Strategy (SNBC), which describes both a trajectory for reducing GHG emissions until 2050 and short- and medium-term objectives. The latter, called carbon budgets, are emission ceilings expressed as an annual average over five-year periods, which must not be exceeded. The Court also ordered a further investigation before ruling on the evaluation and concrete methods of compensation for this damage (please see Gibson Dunn’s previous client alert).
In the separate Grande Synthe case, the Council of State – France’s highest administrative court – on 1 July 2021 enjoined the Prime Minister to take, before 31 March 2022, all appropriate measures to bend the curve of GHG emissions produced on national territory so as to ensure its compatibility with the 2030 GHG emission reduction targets set out in Article L. 100-4 of the Energy Code and Annex I of Regulation (EU) 2018/842 of 30 May 2018.
II. The steps in the reasoning followed by the Court
First, the Court considers that it is dealing solely with a dispute seeking compensation for the environmental damage caused by the exceeding of the first carbon budget and the prevention or cessation of the damage found and that it is for the Court to ascertain, at the date of its judgment, whether that damage is still continuing and whether it has already been the subject of remedial measures.
On the other hand, the Court considers that it is not for it to rule on the sufficiency of the measures likely to make it possible to achieve the objective of reducing GHGs by 40% by 2030 compared to their 1990 level, which is a matter for the litigation brought before the Council of State.
Second, the Court considers that it can take into account, as compensation for the damage and prevention of its aggravation, the very significant reduction in GHG emissions linked to the COVID-19 crisis rather than to the action of the State.
However, the Court finds that the data relating to the reduction of GHG emissions for the first quarter of 2021 do not make it possible to consider as certain, in the current state of the investigation, that this reduction would repair the damage and prevent it from worsening. It concludes that the damage still to be remedied amounts to 15 Mt CO2eq.
Third, the Court considers that it can apply articles 1246, 1249 and 1252 of the Civil Code, which give it the power to order an injunction in order to put an end to an ongoing injury and prevent its aggravation.
The State argued in its defense that the injunction issued by the Council of State in its decision of 1 July 2021 already made it possible to repair the ecological damage observed. The Court nevertheless considers that the injunction issued by the Conseil d’Etat aims to ensure compliance with the overall objective of a 40% reduction in GHG emissions in 2030 compared to their 1990 level and that it does not specifically address the compensation of the quantum of the damage associated with exceeding the first carbon budget. Since the injunction sought from the Court is specifically intended to put an end to the damage and prevent it from worsening, the Court considers that it is still useful and that the non-profit organizations are entitled to request that it be granted.
Fourth, the Court indicated that “the ecological damage arising from a surplus of GHG emissions is continuous and cumulative in nature since the failure to comply with the first carbon budget has resulted in additional GHG emissions, which will be added to the previous ones and will produce effects throughout the lifetime of these gases in the atmosphere, i.e. approximately 100 years. Consequently, the measures ordered by the judge in the context of his powers of injunction must be taken within a sufficiently short period of time to allow, where possible, the damage to be made good and to prevent or put an end to the damage observed”.
As the State failed to demonstrate that the measures to be taken pursuant to the Climate Act of 20 August 2021 will fully compensate for the damage observed, the Court then ordered “the Prime Minister and the competent ministers to take all appropriate sectoral measures to compensate for the damage up to the amount of the uncompensated share of GHG emissions under the first carbon budget, i.e. 15 Mt CO2eq, and subject to an adjustment in the light of the estimated data of the [Technical Reference Centre for Atmospheric Pollution and Climate Change] known as of 31 January 2022, which make it possible to ensure a mechanism for monitoring GHG emissions”.
In view of (i) the cumulative effect of the harm linked to the persistence of GHGs in the atmosphere and the damage likely to result therefrom, and (ii) the absence of information making it possible to quantify such harm, the Court orders that the abovementioned measures be adopted within a period sufficiently short to prevent their aggravation.
Finally, it adds that:
(i) “the concrete measures to make reparation for the injury may take various forms and therefore express choices which are within the free discretion of the Government”;
(ii) repair must be effective by 31 December 2022, which means that measures must be taken quickly to achieve this objective;
(iii) no penalty payment is imposed in addition to the injunction.
III. The aftermath of the Judgment
The Government has two months in which to appeal against the Judgment. If the Judgment is appealed, the application for enforcement will have to be submitted to the Administrative Court of Appeal in Paris.
If the Government decides not to contest the Judgment, it will have to take the necessary measures for each of the sectors identified in the SNBC (transport, agriculture, construction, industry, energy, waste), which will probably mean imposing new standards on economic actors and individuals.
If, on December 31, 2022, the non-profit organizations consider that the Judgment has not been properly executed, i.e., if the measures taken by the Government have not made it possible to repair the damage up to the amount of 15 Mt CO2eq, they will be able to refer the matter to the Court so that it may order, after investigation, a measure to execute the Judgment, which will most likely be a penalty payment.
As a reminder, in a decision of August 4, 2021, the Council of State ordered the State to pay the sum of 10 million euros to various organizations involved in the fight against air pollution for not having fully implemented its previous decisions regarding its failure to improve air quality in several areas in France.
The following Gibson Dunn attorneys assisted in preparing this client update: Nicolas Autet and Grégory Marson.
Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any of the following lawyers in Paris by phone (+33 1 56 43 13 00) or by email:
Nicolas Autet (nautet@gibsondunn.com)
Grégory Marson (gmarson@gibsondunn.com)
Nicolas Baverez (nbaverez@gibsondunn.com)
Maïwenn Béas (mbeas@gibsondunn.com)
© 2021 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
Overshadowed in the media by the historic judgment of 3 February 2021 by the Administrative Court of Paris in the “Affaire du siècle” (the Case of the century), a ruling by the Versailles Administrative Court of Appeal (the Court) on 29 January 2021 could also result in a historic ruling by the Court of Justice of the European Union (the CJEU). Indeed, upon referral by the Court, the CJEU will be called upon to rule on the existence of a right to breathe clean air and on the liability incurred by EU Member States where they disregard their air quality obligations (Case C-61/21).
I. Context of the ruling rendered by the Court
Under Directive 2008/50/EC of 21 May 2008 on “ambient air quality and cleaner air for Europe” (the Directive), Member States must establish zones and agglomerations throughout their territory in which air quality is assessed (Article 4).
Article 13(1) of the Directive requires Member States to ensure that levels of fine particulate matter (PM10), carbon monoxide or nitrogen dioxide (NO2) do not exceed limit values set out in an annex.
Article 23(1) of the Directive provides that where these limit values are exceeded by levels of pollutants in ambient air, Member States must, in the given zone or agglomeration, adopt “air quality plans”. Where the limit values are exceeded after the deadline for their application, the air quality plans must provide for appropriate measures so that the period of exceedance is kept as short as possible.
At the end of 2019, following an action for failure to fulfil obligations brought by the European Commission, the Court of Justice of the European Union ruled that France had failed to fulfil its obligations under Articles 13(1) and 23(1) of the Directive with regard to NO2 for several French regions, including the Paris region (CJEU, 24 October 2019, case C-636/18). On 30 October 2020, the European Commission announced that it would bring a new action against France before the CJEU for failure to fulfil obligations, this time concerning the excessive level of PM10 in the air.
For its part, the Conseil d’Etat (Council of State, France), the highest administrative court in France, had already ruled in 2017 that, given the persistence of observed exceedances of PM10 and NO2 concentrations in the air, the air quality plans for certain areas, including the Paris region, had to be considered insufficient with respect to the obligations and thresholds set by the Directive. The Conseil d’Etat had then enjoined the State to take the necessary measures to bring PM10 and NO2 concentrations below the limit values (CE, 12 July 2017, No. 394254). In a decision dated 10 July 2020, the Conseil d’Etat considered that the French State had not complied with the injunctions issued in the decision of 12 July 2017, and imposed a €10 million penalty on the State if it did not justify having taken the required measures within six months of the decision (CE, ass., 10 July 2020, No. 428409). In light of the publicly available information, the Conseil d’Etat should soon rule on whether the French State has finally fulfilled its obligations.
It is in this context that the Court, sitting in plenary session, was called upon to rule on the action for damages brought by an applicant, resident of the Paris region, who attributed his various allergies to air pollution. The applicant considered that the deterioration of the air quality resulted in particular from the disregard by the French authorities of the obligations set by Articles 13(1) and 23(1) of the Directive.
II. Reasoning steps followed by the Court
It has been consistently held that “the principle of State liability for loss and damage caused to individuals as a result of breaches of [Community] law for which it can be held responsible is inherent in the system of the [Treaty on the Functioning of the European Union]” (CJEU, 5 March 1996, cases C-46/93 and C-48/93).
The CJEU also recalls that a right to reparation is recognized by European law if the following three conditions are met:
- the rule of law infringed must be intended to confer rights on individuals;
- the breach must be sufficiently serious, it being specified that this is the case if the breach has persisted despite a judgment by the CJEU finding the infringement in question to be established;
- there must be a direct causal link between the breach of the obligation resting on the State and the damage sustained by the injured parties.
In the present case, since it was seized of a claim for damages based on the breach of the Directive, i.e. of a norm of European law, the Court had to verify whether the three conditions mentioned above were met.
In order to determine whether the first condition had been met, the Court had first to decide whether Articles 13(1) and 23(1) of the Directive, which the applicant claimed had been disregarded, conferred a “right” on him. In other words, the Court had to determine whether these Articles conferred a “right to breathe clean air” capable of giving rise to a compensation claim.
As early as 2014, the CJEU had indicated that Articles 13(1) and 23(1) allowed “persons directly concerned by the limit value being exceeded” to obtain, before the national authorities and courts, the establishment of an air quality plan in accordance with the requirements of Article 23 (CJEU, 19 November 2014, case C-404/13). It is, moreover, this right that was implemented by the Conseil d’Etat in the 2017 and 2020 decisions outlined above.
The Court probably considered that the right thus available to individuals to compel Member States to implement the obligations laid down by the Directive did not necessarily imply the recognition for their benefit of a “right to breathe clean air”, the disregard of which could give rise to an action for damages.
Since the answer was uncertain and the issue was related to the scope of a European norm, the Court chose to refer two questions to the CJEU for a preliminary ruling on Articles 13(1) and 23(1) of the Directive in order to obtain the appropriate interpretation of these Articles.
The first question asks whether Articles 13(1) and 23(1) of the Directive give individuals, in the event of a sufficiently serious breach by an EU Member State of the obligations arising therefrom, a right to obtain from the Member State in question compensation for damage to their health which has a direct and certain causal link with the deterioration of air quality.
If the answer to the first question is affirmative, the Court then asked the CJEU to specify the conditions under which this right arises, in particular with regard to the date on which the existence of the breach attributable to the Member State in question must be assessed.
III. Possible consequences of the Court’s ruling
If the CJEU were to answer the first of the questions asked by the Court in the affirmative, it would then be for the Court to determine whether the other two conditions for the French State’s liability to be characterized are met.
Insofar as France has already been the subject of an infringement judgment for failure to comply with its obligations with respect to NO2 (CJEU, 24 October 2019, cited above), the condition relating to a sufficiently serious breach of a right conferred on individuals does not seem to pose any particular difficulty.
It will then be up to the Court to assess whether there is a direct causal link between the violation and the damage claimed by the applicant, it being specified that this demonstration will depend on the answer given by the CJEU to the second question, namely from what date the existence of the violation attributable to the Member State in question must be assessed, and will probably require recourse to a medical expert opinion.
The recognition of a right to breathe clean air capable of being the subject of an action for compensation would very probably constitute a strong constraint weighing on the Member States of the European Union. In this respect, it should be emphasized that France is far from being the only country in the European Union to have been condemned for failure to comply with the obligations set out in Articles 13(1) and 23 of the Directive: Italy has been condemned for systematic and persistent exceedance of the PM10 limit values (CJEU, 10 November 2020, case C-644/18), and the United Kingdom and Germany have been condemned in the same way, but for NO2 (CJEU, 4 March 2021, case C-664/18 and CJEU, 3 June 2021, case C-635/18). The question of a possible compensation claim based on disregard of the right to breathe clean air could thus have repercussions across all EU Member States.
The following Gibson Dunn attorneys assisted in preparing this client update: Nicolas Autet and Grégory Marson.
Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any of the following lawyers in Paris by phone (+33 1 56 43 13 00) or by email:
Nicolas Autet (nautet@gibsondunn.com)
Grégory Marson (gmarson@gibsondunn.com)
Nicolas Baverez (nbaverez@gibsondunn.com)
Maïwenn Béas (mbeas@gibsondunn.com)
© 2021 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
In response to a claim brought by several environmental advocacy groups (the Associations), which sought to obtain the recognition of the French State’s failure to act in response to climate change, the Administrative Court of Paris (the Court) ruled, for the first time in French law, in a judgment of February 3, 2021, that such a liability action against the State was admissible, that the ecological damage alleged by the Associations was established and that the French State was partially responsible for it. The Court ordered a further investigation in order to determine the measures that it could enjoin the French State to adopt to repair the highlighted damage and prevent its aggravation.
I. Context of the ruling rendered by the Court
The Court’s ruling comes in the wake of several rulings by the Conseil d’Etat, the French highest Administrative Court, which reveal an intensification of judicial scrutiny of the State’s compliance with its obligations in environmental matters in general, and in connection with climate change in particular.
In a ruling of July 10, 2020, the Conseil d’Etat found that the Government had not taken the measures requested to reduce air pollution in 8 areas in France, as the judge had ordered in a decision of July 12, 2017. To compel it to do so, the Conseil d’Etat imposed a penalty payment of 10 million euros for each semester of delay, the highest amount ever imposed to force the State to comply with a judgment of the administrative courts (CE, Ass., 10 July 2020, Les Amis de la Terre, no. 428409).
In a Grande Synthe ruling of November 19, 2020, the Conseil d’Etat ruled for the first time on a case concerning compliance with commitments to reduce greenhouse gas emissions. Indeed, the city of Grande-Synthe referred the matter to the Conseil d’Etat after the refusal of the Government to comply with its request for additional measures to be taken to meet the goals resulting from the Paris Agreement. The Conseil d’Etat first ruled that the request of the city, a coastal city particularly exposed to the effects of climate change, was admissible. On the merits, the Conseil d’Etat noted, firstly, that although France has committed to reducing its emissions by 40% by 2030, in recent years it has regularly exceeded the emission ceilings it had set itself and, secondly, that the decree of April 21, 2020 postponed most of the reduction efforts beyond 2020. According to the High Administrative Court, it is not necessary to wait until the 2030 deadline to exercise control over the State’s actions since the control of the trajectory that the State has set itself is relevant in ecological matters. Before ruling definitively on the request, the Conseil d’Etat asked the Government to justify, within three months, that its refusal to take additional measures is compatible with compliance with the reduction trajectory chosen to achieve the objectives set for 2030. If the justifications provided by the Government are not sufficient, the Conseil d’Etat may then grant the municipality’s request and cancel the refusal to take additional measures to comply with the planned trajectory to achieve the -40% target by 2030 (CE, November 19, 2020, Commune de Grande-Synthe et al., no. 427301), or even impose obligations on the French State. According to the information provided by representatives of the Conseil d’Etat, the decision could be taken before Summer 2021.
Moreover, in a ruling of January 29, 2021, the Versailles Administrative Court of Appeal referred a question to the Court of Justice of the European Union to determine whether the rules of the European Union law should be interpreted as opening up to individuals, in the event of a sufficiently serious breach by a European Union Member State of the obligations resulting therefrom, a right to obtain from the Member State in question compensation for damage affecting their health that has a direct and certain causal link with the deterioration of air quality (CAA Versailles, January 29, 2021, no. 18VE01431).
II. Reasoning steps followed by the Court
First, the Court ruled on the admissibility of the action for compensation for ecological damage brought by the Associations against the French State. In order to recognize the Associations’ status as victims, the Court had to acknowledge the existence of a fault, damage and a causal link between the fault and the damage.
First of all, it recalled that, pursuant to Article 1246 of the French Civil Code, “Any person responsible for ecological damage is required to repair it”. Implicitly, the Court considered that this provision is applicable to the State. Article 1248 of the French Civil Code provides that “The action for compensation for ecological damage is open to any person having the capacity and interest to act, [such as] associations approved or created for at least five years at the date of the institution of proceedings which have as their purpose the protection of nature and the defense of the environment”. After having examined the purpose set out in the Associations’ by-laws, which mention environmental protection and, in some cases, explicitly the fight against climate change, the Court considered that their liability action was admissible.
Second, the Court had to rule on the existence of ecological damage, bearing in mind that such damage consists of “a non-negligible damage to the elements or functions of ecosystems or to the collective benefits derived by mankind from the environment” (Article 1247 of the French Civil Code). In this respect, it should be emphasized that the Conseil Constitutionnel considered that the legislature could validly exclude from the compensation mechanism thus established the compensation for negligible damage to the elements, functions and collective benefits derived by mankind from the environment (Decision no. 2020-881 QPC of February 5, 2021). Consequently, it is up to the courts to determine, on a case-by-case basis, according to the facts of the case, what the notion of “non-negligible damage” covers.
In order to characterize the existence of non-negligible damage, the Court first relied on the work of the Intergovernmental Panel on Climate Change (IPCC), from which it concluded “that the constant increase in the average global temperature of the Earth, which has now reached 1°C compared to the pre-industrial era, is due mainly to greenhouse gas emissions [resulting from human activity]. This increase, responsible for a modification of the atmosphere and its ecological functions, has already caused, among other things, the accelerated melting of continental ice and permafrost and the warming of the oceans, resulting in an accelerating rise in sea level”.
It also drew on the work of the National Observatory on the Effects of Global Warming, a body attached to the Ministry of Ecological Transition and responsible in particular for describing, through a certain number of indicators, the state of the climate and its impacts on the entire national territory. The Court found that “in France, the increase in average temperature, which for the 2000-2009 decade amounts to 1.14°C compared to the 1960-1990 period, is causing an acceleration in the loss of glacier mass, particularly since 2003, the aggravation of coastal erosion, which affects a quarter of French coasts, and the risk of submersion, which poses serious threats to the biodiversity of glaciers and the coastline, is leading to an increase in extreme climatic phenomena, such as heat waves, droughts, forest fires, extreme rainfalls, floods and hurricanes, which are risks to which 62% of the French population is highly exposed, and is contributing to the increase in ozone pollution and the spread of insects that are vectors of infectious agents such as dengue fever or chikungunya”.
In light of all these elements, the Court considered that the ecological damage claimed by the Associations had to be considered as established.
Third, the Court had to identify the obligations of the States in responding to climate change in order to, in a second stage, rule on possible breaches in relation to these obligations.
The Court considered that it arose in particular from the provisions of the Paris Agreement of December 12, 2015, as well as from European and national standards relating to the reduction of greenhouse gas emissions, that the French State had committed to take effective action against climate change in order to limit its causes and mitigate its harmful consequences. From this perspective, the Court recalled that the French State had chosen to exercise “its regulatory power, in particular by conducting a public policy to reduce greenhouse gas emissions emitted from the national territory, by which it undertook to achieve, at specific and successive deadlines, a certain number of objectives in this area”.
The Court then examined compliance with the greenhouse gas emission reduction trajectories that the State had set itself in order to determine whether it had failed to meet its obligations. To do so, it relied in particular on the annual reports published in June 2019 and July 2020 by the High Council for the Climate, an independent body whose mission is to issue opinions and recommendations on the implementation of public policies and measures to reduce France’s greenhouse gas emissions. In its two reports, the High Council for the Climate noted that “the actions of France are not yet commensurate with the challenges and objectives it has set itself” and noted the lack of substantial reduction in all the economic sectors concerned, particularly in the transportation, agriculture, construction and industry sectors.
The Court concluded that the French State should be regarded as having failed to carry out the actions that it had itself recognized as likely to reduce greenhouse gas emissions. The State’s wrongful failure to meet its commitments was thus established, as was the causal link between that failure and the ecological damage previously identified. The Court therefore considered that part of that damage was attributable to the failure of the French State to act.
Fourth, the Court had to rule on the modalities of reparation of the ecological damage. Under the terms of the law, such reparation is to be made primarily in kind. It is only where reparation measures are impossible or inadequate that the judge orders the liable person to pay damages to the plaintiff, such damages being allocated to the reparation of the environment.
The Court considered that, in the current state of the investigation of the case, it was not in a position to determine the measures “that must be ordered to the State” to repair the observed damage or to prevent its future aggravation. It therefore ordered a further two-month investigation in order to identify the measures in question.
Fifth, it ordered the State to pay each of the Associations a symbolic sum of one euro as compensation for the moral prejudice caused to them by the failure to meet the greenhouse gas emission reduction targets.
III. Follow-up to the Court’s ruling
The Court’s ruling, which holds the State liable for not having implemented the measures necessary to achieve the greenhouse gas emission reduction targets, is a landmark decision in French law.
The second ruling that will be rendered following the two-month additional investigation ordered by the Court could constitute another historic decision if the Court were to enjoin the State – as the terms of the Ruling seem to imply – to implement a number of specific measures aimed at achieving the expected reduction targets, if necessary within a set timeframe. When this judgment takes effect, possibly before Summer 2021, it will then be necessary to examine the impact of the measures that would thus be ordered on the economic sectors and companies likely to be affected.
At this stage of the proceedings, it is not possible to determine whether or not the French State will decide to appeal the ruling rendered by the Court to the Administrative Court of Appeal of Paris. If the latter were to uphold the ruling, the French State could then appeal to the Conseil d’Etat. A final decision on the issue at stake in this case could thus only be made in several years’ time.
The Court’s ruling could also have the immediate effect of modifying the provisions of the “Bill to combat climate change and strengthen resilience to its effects” which will be debated in the French Parliament from the end of March 2021. During the discussion, parliamentarians in favor of strengthening the provisions of this law could rely on the Court’s ruling to motivate and justify their position.
The following Gibson Dunn attorneys assisted in preparing this client update: Nicolas Autet and Gregory Marson.
Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any of the following lawyers in Paris by phone (+33 1 56 43 13 00) or by email:
Nicolas Autet (nautet@gibsondunn.com)
Gregory Marson (gmarson@gibsondunn.com)
Nicolas Baverez (nbaverez@gibsondunn.com)
Maïwenn Béas (mbeas@gibsondunn.com)
© 2021 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
For the third consecutive year, following the publication of Gibson Dunn’s ninth annual U.S. Cybersecurity and Data Privacy Outlook and Review on Data Privacy Day, we offer this separate International Outlook and Review.
Like many recent years, 2020 saw significant developments in the evolution of the data protection and cybersecurity landscape in the European Union (“EU”):
- On 16 July 2020, the Court of Justice of the EU (“CJEU” or “Court”) struck down as legally invalid the EU-U.S. Privacy Shield, on which some companies relied to transfer personal data from the EU to the U.S. While companies are turning to other frameworks to transfer personal data, such as Standard Contractual Clauses (“SCCs”) and Binding Corporate Rules (“BCRs”), EU law also compels these companies to ensure that the transferred personal data will be safeguarded.
- As a consequence of the COVID-19 pandemic, a number of public, corporate and workplace practices have emerged to limit the spread of the virus, all of which have privacy implications. In response, many EU Member States have issued rules and guidelines with respect to the processing of personal data in the context of the pandemic.
- Negotiations among EU Member States have been ongoing regarding the adoption of a new e-Privacy Regulation, due to replace the nearly 20-year-old e-Privacy Directive. Meanwhile, EU supervisory authorities have continued to publish guidance on cookie practices and other e-privacy matters, as well as to impose heavy fines on companies in breach of cookie-related requirements.
- Before Brexit was completed on 31 December 2020, the EU and the UK adopted the Trade and Cooperation Agreement, which includes an overall six-month “bridging mechanism” to cover transfers of personal data into the UK. The European Commission and the UK are in negotiations to adopt an adequacy decision that can enable the free flow of personal data beyond this six-month period, as in the pre-Brexit scenario.
In addition to the EU, different legal developments occurred in other jurisdictions around the globe, including in other European jurisdictions, the Asia-Pacific region, the Middle East, Africa and Latin America.
We cover these topics and many more in this year’s International Cybersecurity and Data Privacy Outlook and Review.
__________________________________________
Table of Contents
A. International Data Transfers
1. The Schrems II Ruling
2. Guidance Adopted by the EDPB and Member State Authorities
3. Conclusions on Data Transfers
1. Guidance Adopted by Supervisory Authorities
2. Guidance at EU Member State Level
3. Next Challenges for the Fight against the COVID-19 Pandemic
1. Guidance Adopted by the EDPB and Member State Authorities
2. Reform of the e-Privacy Directive
3. Enforcement in Relation to Cookies
D. Cybersecurity and Data Breaches
1. Guidance and Initiatives Adopted by ENISA
2. Enforcement in Relation to Cybersecurity
1. Transfers from and into the EU/EEA and the UK
2. Transfers from and into the UK and other Jurisdictions
F. Other Significant Developments in the EU
II. Developments in Other European Jurisdictions: Switzerland, Turkey and Russia
1. Access Restriction Trend in Privacy Laws Enforcement
2. The Russian Data Protection Authority Has Continued to Target Large, Multinational Digital Companies
3. Legislative Updates
1. The Revised FADP
2. The Swiss-U.S. Privacy Shield
1. Turkish Data Protection Authority and Board Issues a Number of Regulations, Decisions and Guidance Documents
2. Turkish Data Protection Act Continues to be Enforced
III. Developments in Asia-Pacific, Middle East and Africa
1. New Developments in Chinese Legislation
2. Enforcement of Chinese Data Protection and Cybersecurity Legislation
1. Legislative initiatives
2. Regulatory opinions and guidance
3. Enforcement of data protection laws
M. Other Developments in Africa
N. Other Developments in the Middle East
O. Other Developments in Southeast Asia
IV. Developments in Latin America and in the Caribbean Area
B. Other Developments in South America
1. Argentina
2. Chile
3. Colombia
4. Mexico
5. Uruguay
__________________________________________
I. European Union
A. International Data Transfers
1. The Schrems II Ruling
On 16 July 2020, the CJEU struck down as legally invalid the EU-U.S. Privacy Shield, which some companies had relied upon to transfer personal data from the EU to the U.S. The Court also ruled that the Standard Contractual Clauses (“SCCs”) approved by the European Commission, another mechanism used by many companies to transfer personal data outside of the EU, remained valid with some caveats. The Court’s landmark decision has forced companies on both sides of the Atlantic to reassess their data transfer mechanisms, as well as the locations where they store and process personal data.[1]
2. Guidance Adopted by the EDPB and Member State Authorities
Following the Schrems II ruling, several supervisory authorities shared their views and opinions on its interpretation.[2] For its part, the UK Information Commissioner’s Office (“ICO”) invited companies to continue transferring data on the basis of the invalidated Privacy Shield, whereas several German authorities advised against doing so.
These initial reactions were superseded by the Frequently Asked Questions (“FAQ”) document issued by the European Data Protection Board (“EDPB”) on 23 July 2020. In its FAQs on Schrems II, the EDPB stated, in particular, the following:
i. No “grace” period is granted for entities that relied on the EU-U.S. Privacy Shield. Entities relying on the now invalidated Privacy Shield should immediately put in place other data transfer mechanisms or frameworks.
ii. Data controllers relying on SCCs and BCRs to transfer data should contact their processors to ensure that the level of protection required by EU law is respected in the third country concerned. If personal data is not adequately protected in the importing third country, the controller or the processor responsible should determine what supplementary measures would ensure an equivalent level of protection.
iii. If data transferred cannot be afforded a level of protection essentially equivalent to that guaranteed by EU law, data transfers should be immediately suspended. Companies willing to continue transferring data under these circumstances should notify the competent supervisory authority(ies).[3]
In October 2020, the U.S. Department of Commerce and the European Commission announced that they had initiated discussions to evaluate the potential for a new version of the Privacy Shield that would be compliant with the requirements of the Schrems II ruling.[4]
Pending the discussions between the EU and the U.S. on a new data transfer framework, on 10 November 2020, the EDPB issued important new guidance on transferring personal data out of the EEA, namely:
i. Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data,[5] which aim to provide a methodology for data exporters to determine whether and which additional measures would need to be put in place for their transfers; and
ii. Recommendations 02/2020 on the European Essential Guarantees (“EEG”) for surveillance measures,[6] which aim to update the EEG, in order to provide elements to examine whether surveillance measures allowing access to personal data by public authorities in a receiving country, whether national security agencies or law enforcement authorities, can be regarded as a justifiable interference.
The EDPB’s guidance lessened some of the uncertainty caused by the Schrems II ruling. However, since this guidance was issued in the form of a public consultation closing on 21 December 2020, it may be subject to further changes or amendments.
In the Recommendations on supplementary transfer tools, the EDPB recommends that data exporters: (i) map all transfers of personal data to third countries and verify that the data transferred is adequate, relevant and limited to what is necessary; (ii) verify the transfer tool on which the transfers are based; (iii) assess whether there is anything in the law or practice of the third country that may impinge on the effectiveness of the appropriate safeguards, and document this assessment; (iv) identify and adopt additional measures (examples are provided in Annex 2 of the Recommendations); (v) take any formal procedural steps that the adoption of the supplementary measure may require; and (vi) re-evaluate at appropriate intervals the level of protection afforded to the data transferred. Although the guidance takes the form of non-binding recommendations, companies that transfer personal data outside of the EEA would be well served to review their approach to such transfers in light of the EDPB guidance.
On 12 November 2020, the European Commission published a draft implementing decision on SCCs for the transfer of personal data to third countries along with a draft set of new SCCs. The new SCCs include several modules to be used by companies, depending on the transfer scenario and designation of the parties under the GDPR, namely: (i) controller-to-controller transfers; (ii) controller-to-processor transfers; (iii) processor-to-processor transfers; and (iv) processor-to-controller transfers.
These new SCCs also incorporate some of the contractual supplementary measures recommended by the EDPB, as described above. They were open for public consultation, which closed on 10 December 2020, and the final set of new SCCs is expected to be adopted in early 2021. At this stage, the draft provides for a grace period of one year during which it will be possible to continue to use the old SCCs for the performance of contracts concluded before the entry into force of the new SCCs.[7]
In addition, on 12 November 2020 the European Commission also published draft SCCs for contracts between controllers and processors. These SCCs are intended to be optional (the parties may choose to continue using their own data processing agreements) and were also open for public consultation, which closed on 10 December 2020. The final SCCs are likewise expected to be adopted in early 2021.[8]
On 15 January 2021, the EDPB and European Data Protection Supervisor adopted joint opinions on both sets of SCCs (one opinion on the SCCs for contracts between controllers and processors, and another one on SCCs for the transfer of personal data to third countries).[9]
3. Conclusions on Data Transfers
As explained above, 2020 was a year of changes when it comes to data transfer mechanisms.
The EU-U.S. Privacy Shield, once believed to have put an end to the issues raised by the EU-U.S. Safe Harbour, has, like its predecessor, been deemed insufficient to safeguard the data protection rights of individuals in the EU. It is expected that, with a change in the U.S. federal administration and the need for authorities to provide legal certainty and facilitate cross-border commercial activity in the current economic context, the EU and the U.S. will work swiftly towards a mechanism that can resolve the issue of transatlantic transfers once and for all.
The adoption of new SCCs, expected to occur in 2021, will also bring more certainty to companies that relied on this framework to transfer personal data. The new sets of SCCs will cover wider scenarios than those under the current framework, reducing implementation costs and limiting uncertainty. However, given the limited grace period expected to apply to pre-GDPR SCCs, and the introduction of changes to the new SCCs, companies should take the opportunity to review the new contractual framework and adapt it to their data transfer needs.
B. COVID-19 Pandemic
The COVID-19 pandemic and the ensuing health crisis have led to the emergence of new practices to limit the spread of the virus, such as the roll-out of tracing apps and the implementation of temperature checks at public administration buildings or at the workplace. These practices involve the processing of various health data, and may therefore have privacy implications. At the same time, remote working has increased the exposure of companies and their employees to cybersecurity risks, such as the use of private (unprotected and non-certified) devices to review, print or process company information.[10]
1. Guidance Adopted by Supervisory Authorities
On 19 March 2020, the EDPB adopted a statement on the processing of personal data in the context of COVID-19. In the statement, the EDPB emphasised that while data protection rules should not hinder the fight against the virus, data controllers and processors must ensure the protection of personal data even in these exceptional times.[11]
Further, on 17 April 2020, the European Commission set out the criteria and requirements that applications supporting the fight against COVID-19 must meet in order to ensure compliance with data protection regulations.[12] Building on this guidance, the EDPB adopted Guidelines on geolocation and other tracing tools in the context of the COVID-19 outbreak as well as Guidelines on the processing of health data for research purposes in the context of the COVID-19 outbreak.[13]
Since the beginning of the pandemic, European authorities have also focused on pooling resources at the EU level. The European Commission and the EDPB published materials relating to the interoperability between the Members States’ contact tracing applications, in order for users to be able to rely on a single app wherever they are located in the EU.[14]
The EDPS also issued a Preliminary Opinion on the European Health Data Space, which aims to promote better exchange and access to different types of health data within the EU.[15]
2. Guidance at EU Member State Level
Member State supervisory authorities have also issued their own guidance with respect to the processing of personal data in the context of the COVID-19 pandemic. Although authorities have emphasised the general principles set forth under the GDPR, they have failed to adopt a unified approach.
As regards national tracing applications, the UK ICO issued a notice on the joint initiative by two tech companies to enable the use of Bluetooth technology in contact tracing applications,[16] as well as on the development of contact tracing applications in accordance with the principles of privacy by design and privacy by default.[17] In France, the French supervisory authority (the “CNIL”) opened and closed a formal enquiry into the national tracing app sponsored and developed by the French government,[18] after requesting the Ministry of Solidarity and Health to remedy certain breaches identified in the app.[19] In Germany, as in France, the authority emphasised that the use of the national COVID-19 app should be voluntary.[20]
On a different note, supervisory authorities have also intervened to varying degrees in the testing and tracing efforts of public authorities. In the UK, for example, the ICO issued a notice on the recording and retention of personal data in support of the test and trace scheme, in which it advised in particular to collect only the data requested by the government, not to reuse the data for other purposes, and to delete the data as soon as it is no longer necessary.[21] In Germany, a regional supervisory authority even issued warnings for excessive health data requests.[22]
Supervisory authorities have also issued substantial guidance in respect of measures to fight the COVID-19 pandemic in an employment context, for example, in the UK,[23] France,[24] Italy,[25] Belgium[26] and the Netherlands.[27] The topics covered by supervisory authorities include the implementation of tests and the monitoring of employees, the reporting of sensitive information to the employer, and in turn the communication of such information to the health authorities, as well as remote work.
The use of smart and thermal cameras has also been strictly regulated both in France and in Germany.[28]
3. Next Challenges for the Fight against the COVID-19 Pandemic
While data protection laws were not meant to hinder the deployment of the measures necessary to trace and contain the spread of the virus, EU supervisory authorities have been adamant that such measures should not come at the expense of privacy.
Privacy standards are likely to remain high as Member States roll out their vaccination plans and prepare for the post-COVID-19 economic recovery. In the Member States, for example, the monitoring of doses and the medical supervision of patients are generally carried out by qualified medical staff and by health and pharmaceutical institutions. However, there is still some debate as to whether private and public institutions can issue or request vaccination “passports” or certificates to facilitate the safe movement of people.[29] With regard to tracing and detection data, public administrations and companies have to assess the retention periods that properly apply to the storage and archiving of such information.
C. E-Privacy and Cookies
Against the backdrop of the ongoing EU discussions on the future e-Privacy Regulation, guidance has been released by Member State supervisory authorities. Meanwhile, significant fines continue to be imposed on companies that do not comply with applicable e-privacy rules.
1. Guidance Adopted by the EDPB and Member State Authorities
On 4 May 2020, the EDPB adopted updated Guidelines (05/2020) on consent, which now specifically address the practice of so-called “cookie walls” (a practice that consists of making access to online services and functionalities conditional on a user’s consent to cookies). Among other things, the EDPB explicitly states in these Guidelines that continuing to browse a website does not meet the requirements for valid consent.[30]
As a result of the additional clarifications provided by the EDPB, the Spanish supervisory authority (“AEPD”) updated its guidance on the use of cookies, denying the validity of consent obtained through cookie walls or continued browsing.[31]
In France, the CNIL followed the different approach set by the French supreme administrative court, which in a 2020 ruling invalidated the CNIL’s general and absolute ban on cookie walls. The CNIL consequently adopted amending guidelines and a recommendation on the use of cookies and other tracking devices, offering practical examples of how users’ consent may be collected.[32]
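To make the practical effect of this guidance more concrete, the sketch below shows one hypothetical way a website operator might gate non-essential cookies behind an explicit, per-purpose opt-in, treating the absence of any choice (including mere continued browsing) as refusal. It is a minimal illustration only; the function names, storage key and cookie categories are assumptions for the example and do not reflect any authority’s prescribed implementation.

```typescript
// Illustrative sketch only: per-purpose cookie consent with no cookie wall and
// no consent inferred from continued browsing (names and categories are assumed).

type Purpose = "analytics" | "advertising";

interface ConsentRecord {
  // Absence of a stored record means "no choice made yet" and is treated as refusal.
  choices: Record<Purpose, boolean>;
  timestamp: string;
}

const STORAGE_KEY = "cookie-consent"; // hypothetical storage key

function readConsent(): ConsentRecord | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ConsentRecord) : null;
}

function saveConsent(choices: Record<Purpose, boolean>): void {
  // Only an explicit user action (e.g., clicking "Save choices") should call this;
  // scrolling or continued browsing never does.
  const record: ConsentRecord = { choices, timestamp: new Date().toISOString() };
  localStorage.setItem(STORAGE_KEY, JSON.stringify(record));
}

function mayUseCookiesFor(purpose: Purpose): boolean {
  const consent = readConsent();
  // Privacy by default: no record, or a purpose left unticked, means no trackers.
  return consent?.choices[purpose] === true;
}

// Example: load a (placeholder) analytics script only after an explicit opt-in.
if (mayUseCookiesFor("analytics")) {
  const script = document.createElement("script");
  script.src = "https://example.com/analytics.js"; // placeholder URL
  document.head.appendChild(script);
}

// Usage: saveConsent({ analytics: true, advertising: false }) when the user saves their choices.
```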
2. Reform of the e-Privacy Directive
The e-Privacy Regulation was proposed by the European Commission in 2017 in order to update the rules applicable to digital and online data processing and to align e-privacy laws with the GDPR. Although ambitious and promising at first, the project has stalled: eight successive presidencies of the Council of the EU have been unable to push it over the finish line.
In January 2021, the Portuguese Presidency of the Council of the EU (January to June 2021) proposed a new version (the 14th) of the e-Privacy Regulation, with the aim of simplifying the text and further aligning it with the GDPR.[33]
While the new Regulation is not expected to apply before 2022, its adoption process should be closely monitored in order to anticipate the compliance efforts that will be required, in particular in view of the shorter transition period (reduced from 24 to 12 months) set out in the Portuguese Presidency’s proposal.
3. Enforcement in Relation to Cookies
In parallel, Member State supervisory authorities continued to enforce their national e-privacy legislation transposing the e-Privacy Directive.
In Spain, a social network service was fined €30,000 for breaching the rules relating to cookies, specifically because its cookie banner did not enable users to reject the use of trackers or to give consent per type of cookie.[34] Similarly, the AEPD imposed a fine of the same amount on an airline for implementing a “cookie wall” on its website.[35]
In France, hefty fines have been imposed for violations of the legal provisions on cookies. First, two companies of a food and goods retail distribution group were fined €2,250,000 and €800,000, respectively, for various violations, including the automatic setting of cookies on users’ terminals.[36] More recently, two U.S. tech companies were fined €100 million and €35 million, respectively, for violations of the legal framework applicable to cookies. In particular, the CNIL observed that these companies placed advertising cookies on users’ computers without obtaining prior consent and without providing adequate information.[37]
D. Cybersecurity and Data Breaches
As in previous years, EU and Member State supervisory authorities and cybersecurity agencies have continued to be active in the adoption of measures and decisions that enhance and enforce cybersecurity standards.
1. Guidance and Initiatives Adopted by ENISA
The EU Agency for Cybersecurity (“ENISA”) has the mandate to increase the protection of public and private networks and information systems, to develop and improve cyber resilience and response capacities, and to develop skills and competencies in the field of cybersecurity, including the management of personal data.
In 2020, ENISA continued to issue guidelines and to spearhead initiatives to achieve these objectives:
- On 27 January 2020, ENISA released an online platform to assist companies with the security of personal data processing. Among other things, the platform focuses on the analysis of technical solutions for the implementation of the GDPR, including the principle of privacy by design. The platform may assist data controllers and processors in determining their approach when developing personal data protection policies.[38]
- On 4 February 2020, ENISA published a report outlining frameworks, schemes and standards of possible future EU cybersecurity certification schemes. The report focuses in particular on the current standards applied to fields such as the Internet of Things, cloud infrastructure and services, the financial sector and electronic health records. The Report also addresses gaps in the current cybersecurity certification schemes, paving the way for the adoption of future EU cybersecurity certification schemes.[39]
- On 19 March 2020, ENISA issued a report on security requirements for digital service providers and operators of essential services, based on Directive (EU) 2016/1148 of 6 July 2016 Concerning Measures for a High Common Level of Security of Network and Information Systems Across the Union (“NISD”) and the GDPR. Among other things, the report proposes and outlines a risk-based approach to security. It identifies the guidelines relevant to NISD and GDPR security measures, recommends the establishment of certification mechanisms, and highlights the need for competent EU bodies and research bodies to continue providing specialised guidance on state-of-the-art data protection and security techniques.[40]
- On 9 June 2020, ENISA made available a visual tool to increase transparency about cybersecurity incidents. The tool provides information on eight years of telecommunications security incidents and four years of trust services incident reports, covering in total some 1,100 cybersecurity incidents notified under EU legislation over more than nine years. In its release, ENISA noted that, over the last four years, system failure was the most common cause of both telecom security incidents and trust services incidents.[41]
Finally, it is worth noting the Strategy for a Trusted and Cyber Secure Europe, released by ENISA on 17 July 2020. The Strategy sets out ENISA’s strategic objectives for boosting cybersecurity, preparedness and trust across the EU, with the aim of achieving a high common level of cybersecurity throughout the Union. It lists seven objectives, including effective cooperation amongst operational actors within the EU in the event of massive cyber incidents, a high level of trust in secure digital solutions, and efficient and effective cybersecurity information and knowledge management for Europe.[42]
2. Enforcement in Relation to Cybersecurity
Member State supervisory authorities have been particularly active in sanctioning data breaches and the lack of appropriate security measures, with significant monetary penalties.
For example, in the UK, three sanctions have been especially significant. First, an airline company was fined £20 million following a 2018 cyberattack that compromised the personal and financial data of more than 400,000 of its customers for over two months.[43] ICO investigators found that the airline should have identified the weaknesses in its security and resolved them with security measures that were available at the time, which would have prevented the cyberattack.
Second, a hotel chain was fined £18.4 million after an estimated 339 million guest records worldwide were affected by a cyberattack that occurred in 2014 but remained undetected until September 2018.[44] According to the ICO, the investigation revealed failures by the hotel chain to put in place appropriate technical or organisational measures to protect the personal data processed on its systems, as required by the GDPR. In both cases, the ICO significantly reduced the fine originally contemplated in its notice of intention to fine, taking into account the companies’ representations and the economic impact of the COVID-19 pandemic when setting the final amount.
Third, a ticket sales and distribution company was fined £1.25 million for failing to comply with its security obligations in the context of a cyberattack on a chatbot installed on its online payment page, potentially affecting the data of 9.4 million people.[45] The ICO concluded that the company had failed to assess the risks of using a chatbot on its payment page, to identify and implement appropriate security measures to negate those risks, and to identify the source of suggested fraudulent activity in a timely manner.
In Germany, a telecommunications service provider was fined by the German Federal Data Protection Authority for insufficient data security procedures in a call centre, which led to the inappropriate disclosure of the mobile phone number of an individual, who then complained to a data protection authority. While the fine initially amounted to €9.5 million, it was challenged by the telecommunications service provider and later reduced to €900,000 by the competent district court in Bonn.
More recently, in Ireland, a social network service was fined €450,000 in relation to a 2019 data breach. This decision is particularly significant, as it is the outcome of the first application of the GDPR dispute resolution mechanism: the Irish Data Protection Commission adopted its decision following a prior binding decision by the EDPB.[46]
On 30 July 2020, the Council of the EU imposed its first ever sanctions in response to cyberattacks. The Council adopted restrictive measures, including a travel ban and an asset freeze, against six individuals and three entities responsible for or involved in various cyberattacks. In addition, EU persons and entities are forbidden from making funds available to those listed.[47]
E. The UK and Brexit
The UK regained full autonomy over its data protection rules at the end of the Brexit transition period, on 31 December 2020. However, before Brexit was concluded, the EU and the UK entered into the EU-UK Trade and Cooperation Agreement on 30 December 2020.[48] This Agreement regulates data flows from the EU/EEA to the UK under a so-called “bridging mechanism”, and sets a timeline for the adoption of an EU-UK adequacy decision thereafter.
The Trade and Cooperation Agreement includes mechanisms that enable the UK to make changes to its data protection regime or to exercise its international transfer powers, subject to mutual agreement, without affecting the bridging mechanism. The EU does not have the power to block changes to the UK’s framework or the exercise of those powers; however, if the EU objects to changes envisaged by the UK and the UK implements them despite those objections, the EU/EEA-UK bridge will be terminated.
1. Transfers from and into the EU/EEA and the UK
As indicated above, the bridging mechanism contained in the EU-UK Trade and Cooperation Agreement covers personal data transfers from the EU/EEA to the UK. Under the Agreement, it will apply for a maximum period of six months, unless an adequacy decision comes into effect earlier. An EU adequacy decision for the UK, which is expected to be adopted in 2021, would enable the continued free flow of personal data from the EEA to the UK thereafter, without the need to implement additional safeguards.
Notwithstanding the stability offered by the Trade and Cooperation Agreement, the UK Government has advised companies to put in place alternative transfer mechanisms that may safeguard personal data received from the EEA against any interruption to the free flow of personal data.[49] SCCs have been identified as the most relevant mechanism that organisations may resort to in order to safeguard such transfers.
Conversely, as regards personal data transfers from the UK to the EU/EEA and Gibraltar, the conditions under which such transfers may be made will, according to the UK Government, remain unchanged and unrestricted.[50]
2. Transfers from and into the UK and other Jurisdictions
The transfer of personal data from third countries and territories to the UK generally raises questions of legal compliance in the exporting jurisdiction. The impact of Brexit has been particularly significant regarding the regulation of data transfers into the UK from jurisdictions that were already covered by an adequacy decision of the European Commission.
Pre-Brexit, the European Commission had made findings of adequacy in respect of personal data transfers to a number of jurisdictions.[51] These adequacy decisions generally cover the transfer of personal data from the EU/EEA to those jurisdictions. However, in order to obtain and maintain these adequacy decisions, those jurisdictions put in place legal restrictions on (onward) transfers of personal data to countries outside the EEA, which now include the UK.
To resolve potential issues concerning transfers of personal data from these jurisdictions to the UK, the governments of most of them have issued statements and resolutions, and have even modified their legal regimes, in order to permit the continued transfer of personal data to the UK. The UK ICO has indicated that it is continuing to work with these jurisdictions to make specific arrangements for transfers of personal data to the UK.[52]
On the UK side, the 2019 Brexit regulations applicable to data protection matters recognised the European Commission’s adequacy decisions, and rendered permissible cross-border transfers of personal data to these jurisdictions.[53] The Government and the ICO are working on the adoption of new UK adequacy regulations, to confirm that particular countries, territories or international organisations ensure an adequate level of protection, so as to allow transfers of personal data from the UK to these jurisdictions, without the need for adoption of additional safeguards. SCCs and other mechanisms for lawful international data transfers may be put in place to cover transfers of personal data from the UK to jurisdictions not covered by adequacy decisions.
F. Other Significant Developments in the EU
More generally, this year has been marked by the adoption of important EDPB Guidelines. In addition to those mentioned above, the EDPB released new Guidelines on the concepts of controller and processor, on the targeting of social media users, and on data protection by design and by default.[54]
Furthermore, as mentioned in Sections I.A to D above, hefty fines were imposed, in particular in France, where the €100 million fine imposed on a tech company was the highest penalty ever imposed by a supervisory authority as of the end of December 2020.
Fines were also imposed on topics other than those addressed above. In particular, in Germany, the Hamburg supervisory authority fined a retail company €35.3 million for illegally collecting and storing sensitive personal data of employees, such as information about health conditions, religious beliefs and family matters. According to the authority’s investigation, data about the personal lives of the company’s employees had been collected comprehensively and extensively by supervisors since at least 2014 and stored on the company’s network drive. This information was accessible to up to 50 managers and was used, among other things, to create profiles of individual employees in order to evaluate their work performance and to take employment decisions. In sum, the company’s practice amounted to a number of data protection violations, including the lack of a legal basis for the processing, unlawful processing of the data, and the absence of controls limiting the storage of, and access to, the data.[55]
Significant monetary penalties have also been imposed due to the lack of valid consent under the GDPR:
- In Italy, two telecommunications operators were fined approximately €17 million and €12 million, respectively, for carrying out hundreds of unsolicited marketing communications without having obtained users’ prior consent, for failing to allow users to exercise their right to object to the processing, and for aggressive telemarketing practices.[56]
- In Spain, the AEPD fined a bank €5 million for violations of the right to information and for lack of valid consent. In particular, the bank used imprecise terminology in its privacy policy and provided insufficient information about the categories of personal data processed, especially in relation to customer data obtained through financial products, services and channels. Moreover, the bank failed to obtain consent before sending promotional SMS messages, and did not have in place a specific mechanism for obtaining consent from customers and account managers.[57]
As regards the requirements for valid consent under the GDPR, the CJEU, in its ruling on Orange România SA v Autoritatea Naţională de Supraveghere a Prelucrării Datelor cu Caracter Personal, decided that valid consent cannot be inferred from a preselected box in a contract for the provision of telecommunications services, whereby the customer allegedly consents to the collection and storage of his/her identity document. The Court specified that this is also the case where the customer is misled as to the possibility of concluding the contract if he/she refuses to consent to the processing of his/her data, or where the freedom to choose to object to that collection and storage is affected by the requirement to complete an additional form setting out that refusal.[58]
In addition to increased scrutiny by data protection authorities, there is also a slight upward trend in private enforcement actions brought by consumers and (former) employees. These actions primarily relate to the enforcement of transparency and access rights as well as to claims for compensation for alleged GDPR violations.
II. Developments in Other European Jurisdictions: Switzerland, Turkey and Russia
As explained in the 2020 International Outlook and Review, the increasing impact of digital services in Europe and the overhaul brought about by the GDPR in the EU have continued to influence the regulatory and enforcement actions of jurisdictions in the vicinity of the EU.
A. Russia
1. Access Restriction Trend in Privacy Laws Enforcement
Russian data privacy laws have continued to be heavily enforced by the Russian Federal Service for the Supervision of Communications, Information Technology and Mass Communications (“Roskomnadzor”). This activity reflects the growing priority and concern that personal data protection represents for the Russian population. According to Roskomnadzor’s statistics, the number of complaints concerning personal data protection increased to 50,300 in the previous year. The largest number of complaints related to the actions of the owners of internet sites, including social networks, credit institutions, housing and communal services organisations, and collection agencies.[59]
The most notable activity of Roskomnadzor in 2020 was its use of its regulatory powers to manage the activities of numerous Internet-based services. Below we describe three noteworthy cases in which access to an Internet resource was restricted by Roskomnadzor until the respective company satisfied certain expectations and/or requests of the regulator.
On 29 January 2020, Roskomnadzor announced that it would restrict access to the mail service of a tech company. In doing so, Roskomnadzor noted that the service was used by cybercriminals to send false messages under the guise of reliable information, and that the company had categorically refused Roskomnadzor’s repeated requests to provide the information needed for inclusion in the register of information dissemination organisers on the Internet.[60] The company has since taken action to address the situation, and the service is currently accessible to Russian users.
On 20 February 2020, Roskomnadzor took a similar measure and temporarily restricted access to another email service provider.[61] The authority stated that, in 2019 and in February 2020, the email service had been used by cyber-attackers to send false messages, under the guise of reliable information, about the mass “mining” (planting of explosives) of social and transport infrastructure and ships in the Russian Federation.
On 18 June 2020, Roskomnadzor also announced that it had lifted the requirement to restrict access to the messaging application of a tech company.[62] This decision was paired with Roskomnadzor’s declaration of its readiness to cooperate with internet companies operating in Russia to quickly suppress the spread of terrorist and extremist information, child pornography, and the promotion of suicide and drugs. In addition, Roskomnadzor noted that, through joint efforts with leading Russian and foreign companies, it had removed, on average, 2,500 materials relating to suicidal behaviour, 1,300 materials of an extremist and terrorist nature, 800 materials propagandising drug use, and 300 materials containing pornographic images of minors each week.
2. The Russian Data Protection Authority Has Continued to Target Large, Multinational Digital Companies
In 2020, Roskomnadzor continued its trend of targeting large, multinational digital companies. On 31 January 2020, the authority announced that it had initiated administrative proceedings against two social network services.[63] In particular, Roskomnadzor stated that these companies did not meet the requirement to localise the data of Russian users on servers located in the Russian Federation.
Following the authority’s proceedings, on 13 February 2020, the Tagansky District Court of Moscow fined both social network services RUB 4 million (approx. €45,000) for these violations.[64] The Court affirmed the authority’s finding that one of the companies had violated Russia’s legal requirement to record, organise and store the personal data of Russian citizens in databases located in the Russian Federation.[65]
3. Legislative Updates
Several notable laws were adopted at the end of 2020.
New amendments to the Code of Administrative Offenses of the Russian Federation introduce considerable fines for failure to delete prohibited information upon the request of Roskomnadzor.[66] Fines can be imposed on hosting providers, or any person enabling others to publish information on the Internet, for failure to restrict access to prohibited information, and on owners of websites or Internet resources for failure to delete prohibited information. The fines may be up to RUB 4,000,000 (approx. €45,000) for a first offence and up to 10% of the company’s annual turnover for the preceding calendar year (but not less than RUB 4,000,000) for a subsequent offence. If the prohibited information contains propaganda of extremism, child pornography, or drugs, liability is increased to up to RUB 8,000,000 (approx. €90,000) for a first offence or up to 20% of the company’s annual revenue for the preceding calendar year (but not less than RUB 8,000,000) for a subsequent offence. This law, which is aimed at establishing liability for hosting providers and owners of websites and information resources who fail to restrict access to, or delete, information whose dissemination is prohibited in Russia, came into force on 10 January 2021.
Another amendment to Russian law[67] significantly increases the risk that internet resources will be blocked in Russia. The law introduces the status of owner of an Internet resource involved in violations of the fundamental human rights of Russian citizens. The Prosecutor General, in consultation with the Russian Foreign Ministry, may assign this status to the owner of an Internet resource that discriminates against materials from the Russian media. Such a decision can be made if the internet resource limits access to socially important information on the basis of nationality or language, or in connection with the imposition of sanctions against Russia or its citizens. If the owner of the internet resource censors or otherwise restricts access to the accounts of Russian media, Roskomnadzor is entitled to restrict access to that internet resource, fully or partially. This law came into force on 10 January 2021.
The law amending the Personal Data Law significantly changes the legal landscape with regard to the processing of publicly available personal data.[68] Under the new law, data controllers making personal data publicly available for further processing by third parties must obtain individuals’ explicit consent, which must not be bundled with any other consent, and data subjects have a wide range of rights in this regard.
Third parties who intend to process publicly available personal data have three options: (i) to rely on the consent obtained by the controller when making the data publicly available, subject to compliance with the rules on data processing; (ii) to rely on the consent provided by an individual to Roskomnadzor via a dedicated web-based platform to be set up under the law, again subject to compliance with the rules on data processing; or (iii) to ensure on their own that they have appropriate legal grounds under the general requirements of the Russian Personal Data Law. These rules will enter into force on 1 March 2021.
In addition, the new law introduces an obligation for data controllers to publish information on the terms of processing and on any prohibitions and conditions applicable to the processing of personal data that a data subject has permitted to be disseminated to an unlimited number of persons. These requirements will come into force on 1 July 2021. Separately, under amendments to the Law on Information, Information Technologies, and Information Protection, a resource that is considered a social network will be included in a register maintained by Roskomnadzor.[69] These amendments impose moderation obligations on social networks regarding the content published by users, and require them to make certain information available on their websites.
In practice, social networks will now be required to identify and restrict access to illegal content.[70] Furthermore, the owner of a social network must post the following information on it: (i) its name, email address and an electronic form for sending requests about illegal content; (ii) annual reports on the results of the consideration of such requests and of monitoring activities; and (iii) the terms of use of the social network. This amendment will enter into force on 1 February 2021.
The recently adopted laws evidence a trend of increasing regulation of the IT industry in Russia. With these new rules, the Russian authorities expand the regulatory mechanisms that may affect the activities of websites, news media, social media, social networks and video hosting services in Russia.
B. Switzerland
1. The Revised FADP
On 25 September 2020, the Swiss Parliament adopted the revised version of the Federal Act on Data Protection 1992 (“Revised FADP”).[71] The Revised FADP is not yet in force, as it was subject to an optional referendum until 14 January 2021 (no referendum was called). The Federal Council will decide on the date of entry into force, which is expected during 2021 or at the beginning of 2022. The specific date is particularly important because the Revised FADP does not provide for any transitional periods.
One of the main reasons behind the adoption of the Revised FADP was to ensure that the EU recognises Switzerland as providing an adequate level of protection to personal data according to GDPR standards.
The most significant differences between the Revised FADP and the previous version are the following:
- The Revised FADP now expressly codifies the effects doctrine recognised in international law, subject to the principles governing civil and criminal enforcement, which remain in place.[72] Hence, the Revised FADP will also apply to persons domiciled outside Switzerland if they process personal data and that processing has an effect in Switzerland.
- Personal data pertaining to legal entities is no longer covered by the Revised FADP, which is in line with the GDPR and most foreign data protection laws.[73]
- The Revised FADP extends the definition of sensitive data by adding two new categories: (i) genetic data; and (ii) biometric data that uniquely identifies an individual.[74]
- The Revised FADP now contains a legal definition of profiling that corresponds to the definition in the GDPR.[75]
- The Revised FADP distinguishes controllers and processors.[76]
- Like the GDPR, the Revised FADP contains provisions concerning data protection by design and by default.[77]
- The Revised FADP provides that a processor can hire a sub-processor only with the prior consent of the controller.[78]
- Under the Revised FADP and subject to specific exemptions, controllers and processors must maintain records of data processing activities under their respective responsibility. The former duty to notify data files to and register with the Federal Data Protection and Information Commissioner (“FDPIC”) has been abolished.[79]
- Under the Revised FADP and under specific conditions, controllers that are domiciled or resident abroad and process personal data of Swiss individuals must designate a representative in Switzerland.[80]
- The Revised FADP provides that individuals must be given certain minimum information at the time of collection[81] and have a new right to intervene in the case of automated decision-making.[82]
- Under the Revised FADP, the FDPIC will have the power to issue binding decisions. However, it will not have the unilateral power to impose fines, unlike most data protection authorities in Europe – resort to Swiss courts will be required.
- Controllers are required to conduct a Data Protection Impact Assessment (“DPIA”) where the processing entails a high risk to the privacy or fundamental rights of data subjects.[83]
- Controllers will have an obligation to notify the FDPIC of data breaches that result in a high risk for data subjects.[84]
- The Revised FADP introduces the right to data portability, which was not covered by the previous data protection law.[85]
- The maximum amount of sanctions for individuals will be CHF 250,000 (approx. €232,000),[86] and the Revised FADP also extends criminal liability to the violation of additional data protection obligations.
As can be seen, there are significant similarities between the Revised FADP and the GDPR. The entry into force of the Revised FADP is therefore expected to lead to continuity in the cross-border data transfers between the EU and Switzerland.
2. The Swiss-U.S. Privacy Shield
On 8 September 2020, the FDPIC published an assessment of the Swiss-U.S. Privacy Shield in which it found that the cross-border transfer mechanism did not guarantee an adequate level of protection for data transfers from Switzerland to the U.S.[87] Prior to the FDPIC’s assessment, the CJEU had delivered its judgment in Schrems II in July 2020,[88] which invalidated the European Commission’s decision on the EU-U.S. Privacy Shield.
The FDPIC identified two key problems with the Swiss-U.S. Privacy Shield, namely: (i) the lack of an enforceable legal remedy for concerned persons in Switzerland, in particular because the effectiveness of the Ombudsman mechanism cannot be assessed owing to a lack of transparency; and (ii) the inability to assess the decision-making powers of the Ombudsman and its independence from U.S. intelligence services. Since the FDPIC’s assessment is a soft-law instrument with no legally binding effect, the Swiss-U.S. Privacy Shield remains valid and binding for registered companies unless and until it is repealed or annulled, either on a case-by-case basis by the competent Swiss courts or in its entirety by the U.S.
C. Turkey
1. Turkish Data Protection Authority and Board Issue a Number of Regulations, Decisions and Guidance Documents
In 2020, the Turkish Data Protection Authority (“KVKK”) and the Turkish Data Protection Board (the “Board”) continued to issue a number of statements, decisions and guidance documents regarding the application and enforcement of Turkish data protection provisions. We outline and briefly explain below the most relevant ones:
- On 16 December 2020, the KVKK issued a statement on the data protection rules related to publicly available personal data. In the statement, the KVKK acknowledged that the Law on Protection of Personal Data No. 6698 (“Turkish Data Protection Act”) allows personal data to be processed where the data concerned has been made available to the public by the data subject themselves.[89] However, the KVKK clarified that the concept of “making data public” has a narrow meaning under the Turkish Data Protection Act, and only covers scenarios where the data subject intended the data to be public for the purposes of the subsequent processing – merely making personal data accessible to the public is not sufficient.
- On 26 October 2020, the KVKK issued a statement on cross-border data transfers outside of Turkey.[90] The statement noted that the Turkish Data Protection Act allowed a grace period for compliance with the relevant data transfer provisions, and that several deadlines had been extended due to the COVID-19 pandemic. The KVKK also committed to eliminating and correcting any misunderstandings arising from the interpretation and implementation of the Act, which had led to criticism from practitioners and scholars. As a start, the KVKK clarified that the Board will assess the adequacy of foreign jurisdictions for data transfers on the basis of a number of factors, including reciprocity concerning data transfers between the importing country and Turkey. The KVKK also indicated that “Binding Corporate Rules” (“BCRs”) may be used for data transfers between multinational group companies. Indeed, on 10 April 2020, the KVKK had introduced BCRs into Turkish data protection law for use in cross-border personal data transfers within multinational groups.[91] In its announcement, the KVKK described the undertaking letter procedure for data transfers outside of Turkey, and stated that although undertaking letters make bilateral data transfers easier, they may be inadequate for data transfers between multinational group companies. The KVKK therefore identified BCRs as another means that can be used for international data transfers between group companies.
- On 17 July 2020, the KVKK issued a statement on the de-indexing of personal data from search engine results,[92] based on the Board’s decision No. 2020/481.[93] The KVKK stated that it had evaluated the applications submitted to it regarding requests to de-index web search results within the scope of the “right to be forgotten”. The Board decided that search engines should be considered “data controllers” under the Turkish Data Protection Act, that individuals may first convey their de-indexing requests to the search engines and then file complaints with the KVKK, and that search engines should carry out a balancing test between fundamental rights and freedoms and the public interest. Additionally, the KVKK published a criteria document,[94] indicating that de-indexing requests should be assessed against the criteria set out therein, which are largely based on the Article 29 Working Party’s guidelines on the implementation of the CJEU’s judgment in the Costeja case.
- On 26 June 2020, the KVKK issued a statement on the obligation to inform data subjects.[95] The statement addresses the general rules already set out in the Turkish Data Protection Act and secondary legislation concerning data controllers’ obligation to inform. The KVKK indicated that privacy policies or data processing policies should not be used to fulfil the obligation to inform and that privacy notices should therefore be kept separate from those texts. The KVKK then listed several examples of deficiencies and unlawful practices relating to the obligation to inform.
- In the context of the COVID-19 pandemic, on 9 April 2020, the KVKK issued a statement on the processing of location data in light of the pandemic.[96] The statement highlights that many other countries have used and allowed the use of personal data, such as the health, location and contact information of individuals, to identify those who carry or are at risk of carrying the disease. The KVKK reminded data controllers that the processing of such data needs to be carried out within the framework of the basic principles enshrined in the Turkish Data Protection Act.
2. Turkish Data Protection Act Continues to be Enforced
2020 was also a year in which the KVKK enforced the Turkish Data Protection Act in a number of data protection proceedings.
On 6 February 2020, the KVKK fined an undisclosed bank TRY 210,000 (approx. €27,800) for illegally processing personal data to gain potential customers.[97] The case concerned the creation of bank accounts without the knowledge or consent of individuals, using information gained by the bank via a third party. The KVKK found that the bank had acted in breach of its security obligations to prevent unlawful processing of personal data.
On 22 July 2020, the KVKK fined an automotive company TRY 900,000 (approx. €101,840) for violations related to a transfer of personal data carried out on the basis of the Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data (“Convention 108”).[98] The company sought to rely on the fact that the receiving country was a party to Convention 108 and, therefore, offered sufficient protection to personal data imported from Turkey. However, the KVKK held that the fact that a receiving country is a party to Convention 108 is not in itself sufficient to establish adequate protection of data. The data transfer had thus been carried out in breach of the Turkish Data Protection Act, without the data subjects’ consent and without the benefit of any of the exceptions set out in the Act. It is worth noting, in this regard, that the KVKK has yet to publish the list of countries deemed to provide sufficient protection under Turkish law. Finally, the decision notes that the data controller failed to comply with its data security obligations, as it had failed to prevent the unlawful processing and transfer of personal data. The KVKK ordered the data controller to delete/destroy the personal data unlawfully transferred outside of Turkey.
On 16 April 2020, the KVKK fined a gaming company TRY 1,100,000 (approx. €120,000) for failing to notify the KVKK of a data breach within seventy-two (72) hours of becoming aware of it, and for failing to take the required data security measures.[99]
On 27 February 2020, the KVKK fined an e-commerce company TRY 1,200,000 (approx. €120,000): TRY 1,100,000 for failing to fulfil its data security obligations and TRY 100,000 for failing to comply with the obligation to inform data subjects.[100] In addition, the Board ordered the data controller to revise its data processing processes, privacy policy, Conditions of Sale and Use and Cookie Notice in accordance with the identified irregularities and in line with the Turkish Data Protection Act. The Board stated in its decision that: (i) the fact that the privacy policy contains a large amount of general information about personal data processing does not mean that data subjects are duly informed; (ii) although data processing activities start with cookies as soon as a user enters the website, the information obligation is not complied with at any stage, such as the setting of cookies or member login to the website; (iii) explicit consent is not obtained for commercial electronic communications or for the cross-border transfer of personal data; and (iv) given that the undertaking letters submitted for cross-border transfers of personal data have not been approved and the safe countries have not been announced, the data controller may only transfer personal data abroad on the basis of the data subjects’ explicit consent.
III. Developments in Asia-Pacific, Middle East and Africa
A. Australia
The Australian government released the Terms of Reference and Issues Paper for the review of the Privacy Act 1988, and solicited public submissions by 29 November 2020. This wholesale review may update key provisions of the Privacy Act 1988, for example by increasing maximum civil penalties, creating a binding privacy code for social media platforms, strengthening notification and consent requirements, modifying the rules on international data transfers, and expanding the definition of personal information. The government plans to issue a discussion paper seeking specific feedback on preliminary outcomes and possible areas of reform in early 2021.
B. China
1. New Developments in Chinese Legislation
The most significant piece of legislation in China for the protection of personal data is the Cybersecurity Law, which came into effect on 1 June 2017. Two additional laws entered the legislative pipeline in 2020: the Draft Personal Information Protection Law[101] (“Draft PIPL”) and the Draft Data Security Law (“Draft DSL”). Once adopted, these three legal instruments (the Cybersecurity Law, the Draft DSL and the Draft PIPL) are expected to form the fundamental legal framework for cybersecurity and data protection in China.
The Draft PIPL is intended to be a general data protection law that could harmonise the currently fragmented legislative framework. However, even after the adoption of the Draft PIPL, personal information protection in China would remain sector-based.
The Draft PIPL was partially inspired by the GDPR, but it contains important differences that prevent a common cross-border approach (for example, regarding the legal grounds for data processing, there is no legal basis equivalent to the controller’s legitimate interest). Using a single privacy framework for EU and Chinese operations would consequently not result in adequate compliance.
The Draft PIPL introduces substantial new fines. For example, data processors would be subject to fines of up to RMB 50 million (approx. €8 million, or $7.4 million), or up to 5% of the company’s revenue for the previous year.[102] In addition, the Cyberspace Administration of China would also have the competence to blacklist organisations and individuals for misusing data subjects’ data.[103]
On 18 November 2020, the Centre for Information Policy Leadership (“CIPL”) submitted recommendations on possible modifications of the Draft PIPL in order to ensure the protection of China’s citizens, businesses and government data,[104] including the following:
- The Draft PIPL includes definitions for sensitive personal information,[105] including biometric, financial, ethnic and religious information. The CIPL suggested a risk-based approach to assess personal data processing, rather than providing categories of predefined “sensitive information”.
- According to the CIPL, exemptions should be provided to the general requirement to appoint data protection officers and representatives, in line with other foreign privacy laws like the GDPR.
- The Draft PIPL should explain further what conditions or factors are required to satisfy the Cyberspace Administration’s security assessment for cross-border transfers of personal data.
- The Draft PIPL should clarify what constitutes a “serious” unlawful act.
- Finally, the CIPL recommended that organisations be afforded a two-year grace period from the date that the Draft PIPL is passed, to be fully compliant.
The other major legislative proposal, the Draft DSL, is intended to provide the fundamental rules of data security for both personal and non-personal data. The intended scope of application of the Draft DSL is broad, applying to “activities” (actions including collection, storage, processing, use, supply, trade and publishing) regarding “data” (any record of information in electronic or non-electronic form).
Finally, on 1 January 2021, the Civil Code of the People’s Republic of China, adopted by the third session of the 13th National People’s Congress, entered into force. The Civil Code applies to all businesses in general (without distinguishing between controllers and processors), and introduces rules for the protection of personal information, including its collection, use, disclosure and processing.
2. Enforcement of Chinese Data Protection and Cybersecurity Legislation
In August 2020, the China Banking and Insurance Regulatory Commission (“CBIRC”) imposed two separate fines of RMB 1 million ($150,000) on two banks.[106] In both cases, the banks were fined for failing to protect the personal data of credit card customers.
C. Hong Kong SAR
On 30 June 2020, the Law of the People’s Republic of China on Safeguarding National Security in the Hong Kong Special Administrative Region (the “NSL”), passed by the Standing Committee of the National People’s Congress of the People’s Republic of China (the “PRC”), became effective in Hong Kong. The NSL empowers law enforcement authorities to search electronic devices and premises that may contain evidence of related offenses and to carry out covert surveillance upon approval of the Chief Executive; criminalizes acts of terrorism, subversion, secession, or collusion with foreign or external forces to endanger national security; and holds incorporated and unincorporated entities accountable for violations of the NSL.
Furthermore, the Committee for Safeguarding National Security (the “Committee”), which consists of specified Hong Kong officials and an advisor appointed by the Central People’s Government of the PRC (the “CPG”), is established pursuant to the NSL and assumes various duties, including formulating work plans and policies, advancing enforcement mechanisms and coordinating significant operations for safeguarding national security in Hong Kong. Decisions made by the Committee are not subject to judicial review.
The Office for Safeguarding National Security of the CPG (the “Office”) may in specified circumstances assume jurisdiction over serious or complex cases which would be difficult or ineffective for Hong Kong to handle in light of, for example, involvement of a foreign country or external elements. Such cases shall be investigated by the Office and, upon prosecution by a body designated by the Supreme People’s Procuratorate, adjudicated by a court designated by the Supreme People’s Court of the PRC.
The NSL applies not only to offenses committed, or having consequences, in Hong Kong by any person or entity, but also to offenses committed against Hong Kong from outside Hong Kong by any person or entity.
D. India
1. Legislative initiatives
As indicated in the 2020 International Outlook and Review, the Personal Data Protection Bill 2019 (“PDP Bill”) was introduced in Parliament on 11 December 2019. It was adapted from the draft data protection legislation presented to the Ministry of Electronics and Information Technology on 27 July 2018 by the committee of experts led by Justice Srikrishna.[107] Thereafter, the PDP Bill was referred to a Joint Parliamentary Committee for review. As of January 2021, the PDP Bill is in its final stages of deliberation and is expected to be promulgated soon. Several industry bodies and stakeholders were asked to depose before the Joint Parliamentary Committee to give their views on the amendments made to the PDP Bill and the desired features of a national data protection law. Until the PDP Bill is enacted, the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011 continue to govern data protection in India.
In September 2019, the Ministry of Electronics and Information Technology constituted a committee of experts (the “Committee”) to devise a framework for the regulation of non-personal data. On 12 July 2020, the Committee released a Report on the Non-Personal Data Governance Framework (“NPD Framework”),[108] in which it emphasised that the regulation of non-personal data is necessary to incentivise innovation, create value from data sharing, address privacy concerns, and prevent harm. The NPD Framework was met with criticism for imposing compulsory data-sharing obligations and onerous compliance requirements on entities collecting and managing non-personal data. After reviewing feedback from the public and stakeholders, the Committee released a revised version of the NPD Framework on 1 January 2021, in which it provided several clarifications to the earlier draft and delineated the respective scope of the PDP Bill and the NPD Framework. The NPD Framework is still under public consultation and has yet to be presented to Parliament as a bill for the promulgation of a single national-level regulation establishing rights over non-personal data collected and created in India.
In August 2020, the Government of India also proposed a data-sharing framework for the fintech sector. The National Institution for Transforming India (“NITI Aayog”) released a draft framework on the Data Empowerment and Protection Architecture,[109] which will be implemented by four financial sector regulators – the Reserve Bank of India, the Securities and Exchange Board of India, the Insurance Regulatory and Development Authority, and the Pension Fund Regulatory and Development Authority – together with the Ministry of Finance. The draft aims to institute a mechanism for secure, consent-based data sharing in the fintech sector, which may be an important step towards empowering individuals in relation to their personal data, and to enable individuals to share their financial data across banks, insurers, lenders, mutual fund houses, investors, tax collectors, and pension funds in a secure manner.
In August 2020, the Government of India also launched the National Digital Health Mission (“NDHM”), a visionary project intended to digitise the entire healthcare ecosystem of India. The National Health Data Management Policy, 2020,[110] which came into force on 15 December 2020, is the first step in realising the NDHM’s guiding principle of “security and privacy by design” for the protection of data principals’ personal digital health data. It is intended to serve as a guidance document across the National Digital Health Ecosystem and sets out the minimum standard of data privacy protection for data relating to the physiological and psychological health of individuals in India.
2. Regulatory opinions and guidance
Indian institutions have also adopted certain measures in response to the challenges resulting from the COVID-19 pandemic. For instance, on 18 March 2020, the Data Security Council of India (“DSCI”) issued best practices on working from home in light of COVID-19.[111] The guidance notes, among other things, that virtual private networks should only be used on company-owned devices, that employees should access company data and applications through a browser-based webpage or virtual desktop, and that a risk assessment should be conducted when selecting a remote access method. In addition, the guidance outlines a basic mandate for organisations and employees, which includes protecting the confidentiality of valuable transactions and sensitive financial documents when working from home.
In a similar vein, on 24 April 2020, the DSCI published its guidelines on data privacy during the COVID-19 pandemic, which highlight the privacy implications of COVID-19 for different sets of stakeholders and recommend privacy and data protection practices.[112] The guidelines address healthcare privacy considerations and note the importance of notifying patients of all information that is collected, having specific protocols in place to ensure that consent is obtained, having internal and external audit mechanisms to assess privacy measures, and using health data solely for the specific purposes for which it was collected. Finally, the guidelines set out working-from-home considerations for both employers and employees, noting the importance of revisiting data protection strategies and data management practices, remaining compliant with regulatory obligations, conducting Data Protection Impact Assessments to ascertain privacy risks, and spreading privacy awareness and training across organisations.[113]
The DSCI also published its Report for Enabling Accountable Data Transfers from India to the United States Under India’s Proposed Personal Data Protection Bill on 8 September 2020[114] (“Report on Data Transfers”). The purpose of the Report on Data Transfers is to make additional recommendations to the existing draft of the PDP Bill to enable free flow of data between countries, especially with the U.S. owing to the value it adds to India’s digital economy, and to provide solutions for facilitating India-US data transfers. The Report on Data Transfers also suggests, among other things, that the PDP Bill’s provision on the creation of codes of practice should include certification requirements in order to increase interoperability between different privacy regimes as well as facilitate cross-border transfer mechanisms.
On 2 September 2020, the Artificial Intelligence Standardisation Committee of the Department of Telecommunications released its Indian AI Stack discussion paper.[115] The discussion paper notes that the AI Stack will, among other things, provide secure storage environments that simplify the archiving and extraction of data based on data classification; ensure the protection of data through data federation, data minimisation, an open algorithm framework, defined data structures, interfaces and protocols, and monitoring, auditing and logging; and ensure the legitimacy of backend services.
3. Enforcement of data protection laws
In 2020, the Government of India adopted three decisions to block applications following information that they were engaging in activities which were prejudicial to the integrity and the national security of India.[116]
In particular, the Government had received complaints regarding the misuse of mobile application data and the stealing and secret transmission of users’ data in an unauthorised manner to servers located outside of India. As a result, on 29 June 2020, the Government decided to disallow the use of 59 applications to safeguard the interests of Indian mobile and internet users.[117] Similarly, on 2 September 2020[118] and 29 November 2020,[119] the Indian Government decided to block a further 118 and 43 mobile applications, respectively, for misusing users’ data and engaging in activities prejudicial to the sovereignty, integrity and defence of India, as well as the security of the state and public order. According to the Government, the applications’ practices raised concerns that they were collecting and sharing data in a manner which compromised the personal data of users, posing a severe threat to the security of the State.
On 23 November 2020, the Orissa High Court delivered an important judgment emphasising the need to recognise the right to be forgotten, noting the presence of objectionable images and videos of rape victims on social media platforms.[120] The court emphasised that the principle of purpose limitation is already embodied in law by virtue of the precedent of the Supreme Court’s judgment in K.S. Puttaswamy v. Union of India, and that capturing images and videos with the consent of the victim cannot justify the subsequent misuse of such content. The court referred to existing case law and the PDP Bill, which provide for the right to be forgotten. Accordingly, the court recognised the right to be forgotten as a right in rem and stressed that, in the absence of legislation, victims may nevertheless seek appropriate orders to have offensive posts erased from public platforms to ensure the protection of their right to privacy.
E. Indonesia
On 24 January 2020, a draft of the Personal Data Protection Act (“PDP Bill”) was submitted to the Indonesian House of Representatives.[121] The PDP Bill consolidates the rules related to personal data protection in Indonesia, and is anticipated to establish data sovereignty and security as the keystone of Indonesia’s data protection regime.[122]
On 1 September 2020, the Ministry of Communication and Information Technology of Indonesia (“Kominfo”) issued a statement claiming that the PDP Bill would be completed by mid-November 2020.[123] However, it appears that the COVID-19 pandemic has led to delays in the adoption of the Bill.
Finally, on 10 March 2020, Kominfo submitted a new draft regulation on the Management of Privately Managed Electronic System Organiser (“Draft Regulation”) for approval. The Draft Regulation is intended to serve as an implementing regulation of Government Regulation No. 71 of 2019 on the Implementation of Electronic Systems and Transactions, which, as noted in the 2020 International Outlook and Review, became effective in October 2019.
F. Israel
On 29 November 2020, the Israeli Ministry of Justice (“MoJ”) launched a public consultation on the introduction of amendments to the Protection of Privacy Law 5741-1981.[124] The MoJ also launched, on 23 July 2020, a public consultation on proposed amendments to privacy law database registration requirements which would reduce the scope of the obligation to register a database and amend certain definitions contained in the law.[125]
Moreover, the Privacy Protection Authority (“PPA”) published a number of reports and recommendations on a series of topics, including:
- privacy protection in the context of epidemiological investigations,
- security recommendations following security incidents,
- the protection of privacy in the context of money transfers and app payments,
- data processing and storage service providers,
- smart transportation services,
- digital monitoring tools for COVID-19 contact tracing,
- GSS assistance in contact tracing,
- recommendations in the context of the COVID-19 pandemic (e.g., remote learning, privacy for individuals entering workplaces, medical institutions privacy compliance).
Following the CJEU’s decision to annul the EU-U.S. Privacy Shield in Schrems II, the PPA issued, on 29 September 2020, a statement regarding transfers of personal information from Israel to the U.S. In this statement, the PPA indicated that data transfers from Israel to the U.S. could no longer rely on the EU-U.S. Privacy Shield or the Transfer of Information Regulations, and that alternative exceptions provided for in Section 2 of the Regulations could only be used where applicable. The PPA nonetheless clarified that personal data could be transferred from Israel to EU Member States, as well as to countries which will cease to be EU Member States but will continue to apply and enforce the provisions of EU Law on the protection of personal data.[126]
On the enforcement side, in 2020 the PPA identified and investigated a number of violations, including the leak of personal data of 6.5 million Israeli voters.[127] The PPA also offered security recommendations following the security incident at an insurance company.
G. Japan
On 5 June 2020, the Parliament of Japan adopted a bill to amend the currently applicable general data protection law, the Act on the Protection of Personal Information (“APPI”).[128]
Under the bill, the rights of data subjects are expanded. For example, once the amendments to the APPI take effect, data subjects will be entitled to request that an organisation delete their personal information, but only if certain requirements are met. The scope of this right therefore remains narrower than the right to erasure and the right to object under the GDPR.
Regarding data retention periods, the currently applicable law provides that any data which is to be erased within six months is not considered “retained personal data”, and is therefore not subject to data subject requests. The amendments will abolish this six-month rule, and data subjects will be able to exercise their data-related rights regardless of the retention period.
Under the currently applicable law, organisations should “duly make an effort” to report data breaches to the Personal Information Commission (“PIC”). In contrast, the bill will introduce a mandatory obligation to notify data breaches, obliging organisations to report data breaches to the PIC and to notify the affected data subjects if their rights and interests are infringed. Although this requirement is similar to the corresponding provisions in the GDPR, the latter sets a strict deadline of 72 hours for notification, while the bill requires “prompt” reporting.
The amended APPI will include the concept of “pseudonymously processed information”, which similarly to the GDPR will mean personal information that cannot be used to identify an individual unless combined with other information. Pseudonymously processed information will not be subject to some requirements, such as requests for disclosure, utilisation, or correction. In the event of a data breach concerning pseudonymously processed information, reporting to the PIC will not be mandatory.
One of the main goals of the bill is to address the increasing risks associated with cross-border data transfers. Under the new provisions, data subjects should be informed about the details of any data transfer to a third party located in a foreign country. The bill has also increased the criminal penalties, such as the penalty for violating an order of the PIC (100 million yen; approx. €800,000). However, administrative fines will not be introduced.
The bill is expected to enter into force no later than June 2022. The new rules will bring the APPI into closer alignment with the EU’s data protection standards and strengthen Japan’s data protection regime.
H. Malaysia
On the legislative side, on 14 February 2020, a public consultation paper was released proposing amendments to the Malaysian Personal Data Protection Act 2010, which currently regulates data protection in Malaysia.[129] If adopted, the amendments would introduce significant changes to Malaysia’s data protection regime, including: the obligatory appointment of a data protection officer, mandatory breach reporting, the introduction of civil litigation against data users, the implementation of technical and organisational measures such as data portability and privacy by design, and the broadening of the Malaysian Personal Data Protection Act’s scope to data processors. Many of the proposed amendments have been inspired by the GDPR and aim to bring the Malaysian regime closer to EU data protection standards.
On 29 May 2020, the Department of Personal Data Protection (“PDP”) released advisory guidelines on the handling of personal data by businesses under the Conditional Movement Control Order.[130] The advisory guidelines highlight that only names, contact numbers, and the dates and times of attendance can be collected from customers, and require a clearly visible notice detailing the purpose of collection. The PDP also advises that personal data should only be collected for informational purposes and must be permanently deleted six months after the Control Order is terminated.
I. Singapore
As explained in the 2020 International Outlook and Review, data protection in Singapore is currently governed by the Personal Data Protection Act 2012 (“Singapore PDPA”).
The Personal Data Protection Commission (“PDPC”) conducted a review of the Singapore PDPA and, on 14 May 2020, the PDPC released a joint statement with the Ministry of Communications and Information announcing the launch of an online public consultation on a bill to amend the Singapore PDPA and the Spam Control Act 2007 (“SCA”).[131]
On this basis, the proposed amendments to the Singapore PDPA, which address Singapore’s evolving digital economy needs, and related amendments to the SCA were passed in Parliament on 2 November 2020.[132] The bill introduced several notable amendments, including mandatory data breach notification requirements, enabling meaningful consent where necessary, and providing consumers with greater autonomy over their personal data through the incorporation of a data portability obligation.[133] Moreover, the bill strengthened the enforcement powers of the PDPC.[134]
Subsequently, on 20 November 2020, the PDPC issued the draft Advisory Guidelines on Key Provisions of the Personal Data Protection (Amendment) Bill (“Draft Advisory Guidelines”).[135] The Draft Advisory Guidelines provide clarifications on key provisions in the bill, covering, inter alia, the framework for the collection, use, and disclosure of personal data, mandatory breach notification requirements, financial penalties, and offences for mishandling personal data. The Draft Advisory Guidelines will be finalised and published when the amendments to the Singapore PDPA come into effect, i.e., upon their signing and publication in the Gazette, which is expected in early 2021.
J. South Korea
In January 2020, the National Assembly of the Republic of Korea adopted amendments (“Data 3 Act”) to the Personal Information Protection Act 2011 (“PIPA”)[136] and to other main data protection laws. The adoption of the Data 3 Act meant the implementation of a more streamlined approach to personal data protection in South Korea. In addition, it is expected that these legislative changes will facilitate the adequacy assessment under the GDPR and the adoption of an adequacy decision from the European Commission.
The Data 3 Act aims to extend the powers of the Personal Information Protection Commission (“PIPC”), which will be the supervisory authority for any data breaches. Data protection issues are currently handled by several different agencies, but with the entry into force of the reforms these will now be handled exclusively by the PIPC. In addition, the PIPC will have the competence to impose fines similar to those provided under the GDPR.
The Data 3 Act introduced to the PIPA the concept of “pseudonymised information” (i.e., personal information processed in a manner that cannot be used to identify an individual unless combined with other information). Pseudonymised information may be processed without the consent of the data subject for purposes of statistical compilation, scientific research, and record preservation for the public interest.
Finally, it should be noted that the cross-border transfer of the personal data of Korean data subjects has remained restricted as their consent is required prior to transferring their personal data abroad.
K. Thailand
As noted in the 2020 International Outlook and Review, the Personal Data Protection Act 2019 (“Thailand PDPA”), which is the first consolidated data protection law in Thailand, was originally expected to come into full effect on 27 May 2020. However, in May 2020, the government of Thailand approved a Royal Decree to postpone the application of the Thailand PDPA until 31 May 2021, citing the negative effects of the COVID-19 pandemic as one of the main reasons for doing so.[137]
Subsequently, on 8 June 2020, the Ministry of Digital Economy and Society (“MDES”) issued a statement on the Thailand PDPA’s postponement, noting that government agencies, and private and public institutions, were not ready for the enforcement of the legislation.[138] This was followed by a notice published by the MDES on 17 July 2020 for data controller requirements and security measures to be implemented during the postponement period of the Thailand PDPA.[139]
The Thailand PDPA is largely modelled on the GDPR and contains many similar provisions, although the two instruments differ in areas such as anonymisation. Moreover, the Thailand PDPA provides for the creation of the Personal Data Protection Committee (“PDPC”), which is yet to be fully established. As such, the MDES is currently acting as the supervisory authority for any data protection–related issues within Thailand. Once created, the PDPC is expected to adopt notices and regulations to clarify and guide data controllers and other stakeholders on how to prepare for and remain compliant with the requirements under the Thailand PDPA by 27 May 2021.
L. United Arab Emirates
On 19 November 2020, the Abu Dhabi Global Market (“ADGM”)[140] announced the issuance of a public consultation on proposed new Data Protection Regulations 2020 amending the existing Data Protection Regulations 2015.[141] The proposed draft aims to align the ADGM with certain international standards, especially the GDPR,[142] and introduces, amongst other things, the following elements: definitions, the principles of accountability and transparency, the processing of special categories of data, individual rights, security obligations, and the notification of data breaches. The proposed data protection framework is intended to have a broad scope of application, covering the processing of personal data in the context of the activities of an establishment in ADGM, regardless of whether the processing takes place in ADGM. In a similar vein, it will apply to natural persons, whatever their nationality or place of residence, excluding cases where a data controller is connected to ADGM only because it uses a data processor located inside the ADGM. In the latter case, the Proposed Data Protection Framework would not apply to the data controller.[143]
On 1 July 2020, the Dubai International Financial Centre (the “DIFC”) published the Data Protection Regulations, which entered into effect on the same date as the Data Protection Law No. 5 of 2020.[144] The Regulations comprise provisions regarding, in particular, the content and format to be followed by personal data processing records, activities requiring data processing notifications to the Data Protection Commissioner, conditions for transferring data outside of the DIFC, and fines. Moreover, in September 2020, the DIFC became a fully accredited member of the Global Privacy Assembly (“GPA”).[145]
M. Other Developments in Africa
Data protection authorities in Africa have generally been monitoring compliance with data protection requirements, especially in the context of the COVID-19 pandemic. Moreover, Nigeria and other African nations have developed a framework that aims to harmonise laws on data protection and the digital economy.[146]
Egypt: On 17 July 2020, Resolution No. 151 of 2020 (“Egypt Data Protection Law”) was approved and published in the official gazette, and it came into force three months later.[147] The Egypt Data Protection Law governs the processing of personal data carried out electronically, in part or in full, and grants data subjects rights in relation to the processing of their personal data. The key elements that the law provides for are the following:
- consent is the main legal basis for the processing of personal data;
- conditions and principles for data processing must be respected;
- the Centre for the Protection of Personal Data is the regulatory body aiming to maintain compliance with the Egypt Data Protection Law; and
- the law also covers the processing of sensitive personal data, cross-border transfers and electronic direct marketing practices, and provides for monetary penalties and criminal sanctions for violations of the Egypt Data Protection Law itself.
Kenya:[148] The Information Technology Industry Council (“ITI”) announced, on 28 April 2020, that it had submitted comments to the Office of the U.S. Trade Representative on the U.S. and Republic of Kenya Trade Agreement negotiations. These comments include measures that should ensure protection of personal data by taking into account best international practices for privacy and interoperability, strengthen regulatory practices in emerging technologies such as artificial intelligence and machine learning, and promote risk-based cybersecurity and vulnerability disclosure in alignment with international standards.[149] The formal negotiations were launched in July 2020.[150]
Namibia: Namibia has not yet enacted comprehensive data protection legislation. On 24 February 2020, the Council of Europe organised, in coordination with Namibia’s Ministry of Information and Communication Technology, a two-day stakeholders’ consultation workshop on a draft data protection bill for Namibia.[151] A draft of the bill is expected to be published in 2021.
Nigeria: In Nigeria, data privacy is currently protected by a comprehensive data protection regime comprising a variety of laws, regulations, and guidelines. As underlined in a statement issued on 27 January 2020 by the National Information Technology Development Agency (“NITDA”), the Nigeria Data Protection Regulation concerns the use, collection, storage or transfer of personal data and is intended to provide a clear framework for data protection in Nigeria. However, according to the Nigerian Communications Commission, appropriate legal instruments must be put in place in order to strengthen cybersecurity.[152]
The NITDA issued, on 17 May 2020, its Guidelines for Management of Personal Data by Public Institutions in Nigeria.[153] On 20 August 2020, the NITDA published the Draft Data Protection Bill 2020 for public comments. The Draft Bill aims primarily to promote a code of practice that ensures the protection of personal data and its lawful, fair and transparent processing in accordance with the principles set out in the Draft Bill, while taking into account the legitimate interests of commercial organisations as well as government security agencies. In addition, the Draft Bill provides for a Data Protection Commissioner, an impartial, independent and effective regulatory authority.
South Africa:[154] In 2013, the Protection of Personal Information Act (“POPIA”) was signed into law by the President of South Africa and the Information Regulator was established as the supervisory authority. In June 2020, the President announced that certain essential remaining sections of POPIA would commence to apply on 1 July 2020 and that, following a 12-month transition period, public and private bodies would need to comply from 30 June 2021.
In addition, on 3 April 2020, the South African Regulator published a guidance note on processing personal information during the Coronavirus pandemic encouraging proactive compliance by responsible parties when processing personal information belonging to COVID-19 cases and their contacts.[155]
Togo: On 9 December 2020, the National Assembly announced that it had adopted a draft decree on the organisation and functioning of the body for the protection of personal data, the IPDCP, which will have a power of investigation and enforcement in order to support the government’s policy on personal data protection.[156]
Rwanda: A final draft of the data protection bill was approved and published on 27 October 2020 by the Office of the Prime Minister of the Republic of Rwanda.[157] The Bill includes provisions on data subject rights, general rules for data collection and processing, and procedures for data activities, such as transfers, sharing and retention.[158] Moreover, the Ministry of ICT and Innovation (MINICT) published, on 5 May 2020, COVID-19 guidelines addressing cybersecurity measures.[159]
N. Other Developments in the Middle East
Whereas data protection in the region was previously addressed mainly in sectoral regulations, privacy laws are progressively emerging across the Middle East.
Oman: On 12 July 2020, the State Council of the Sultanate of Oman announced that it had held discussions on the draft law on the protection of personal data, which comprises in particular provisions regarding the role of the Ministry of Technology and Communications, the responsibility to protect the rights of personal data owners, and the obligations of controllers and processors, as well as the applicable sanctions.[160] The State Council also announced on 10 September 2020 that it had discussed a new draft law dealing with cybersecurity, the content of which had been approved in part by the Technology and Innovation Committee of the State Council.
Pakistan: Data protection is still governed through sectoral legislation. However, the Ministry of Information Technology and Telecommunication (“MOITT”) finalised the draft Personal Data Protection Bill 2020, which was presented to the Cabinet of Pakistan for approval.[161] The bill, which was introduced in April 2020, sets out general requirements for personal data collection and processing and contains several provisions similar to those found in the GDPR, but is silent regarding the right to data portability and does not require data controllers to notify data subjects of data breaches. In addition, the MOITT adopted, on 18 November 2020, social media rules setting out measures and obligations applicable to social media and internet providers in order to prevent unlawful online content and to protect national security.[162]
O. Other Developments in Southeast Asia
Throughout 2020, developments related to the data protection and cybersecurity landscape occurred in certain other jurisdictions in the south-eastern subregion of Asia, including the following:
Cambodia: While the country does not have a general personal data protection law or a data protection authority, there have been recent legislative developments addressing relevant areas. In particular, a draft cybercrime law is currently being prepared that would regulate Cambodia’s cyberspace and security, aiming to prevent and combat cyber-related crimes.
Philippines: On 9 March 2020, the APEC Cross-Border Privacy Rules (“CBPR”) system Joint Oversight Panel approved the Philippines’ application to join the APEC CBPR system. As such, the Philippines became the ninth APEC economy to join the CBPR system.
The institutions in the Philippines have been particularly active in formulating data protection measures and statements to address issues relating to the collection and processing of data in the wake of the COVID-19 pandemic. On 1 June 2020, the Philippines created a task force in order to drive practical responses to privacy issues emerging from the pandemic.
Vietnam: The data protection framework in Vietnam remains fragmented, with relevant provisions found in numerous laws. In 2020, the government of Vietnam issued Decree No. 15/2020/ND-CP, providing for regulations on penalties for administrative offences in the sectors of post, telecommunications, radio frequency, information technology, and electronic transactions, which has been in effect since 15 April 2020. In February 2020, however, a draft personal data protection decree was released, which has already undergone public consultation. The draft decree sets out principles of data protection, including purpose limitation, data security, data subject rights, and the regulation of cross-border data transfers. Moreover, the draft decree contains provisions on obtaining the consent of data subjects, the technical measures needed to protect personal data, and the creation of a data protection authority.
IV. Developments in Latin America and in the Caribbean Area
A. Brazil
The biggest data protection development in Brazil in 2020 was the entry into force of Law No. 13.709 of 14 August 2018, the General Personal Data Protection Law[163] (as amended by Law No. 13.853[164] of 8 July 2019) (“LGPD”) on 18 September 2020. The specific enforcement provisions of the LGPD are expected to enter into force on 1 August 2021, further to an additional law passed in June 2020.
Compared to the EU’s GDPR, the LGPD shows both differences and similarities. The definitions of “personal data” are very similar in both instruments, both having the goal of assuring a high level of protection for any “information related to an identified or identifiable natural person”. Thus, anonymised data falls expressly out of scope in the two jurisdictions, with the caveat on the Brazilian side that anonymised data used to create or enhance the behavioural profile of a natural person may also be deemed personal data, provided that the person concerned can be identified in the process.
Both legislations apply to the processing of personal data carried out by both public and private entities, online and offline. As for the territorial scope, the rules apply to organisations that are physically present in the EU and Brazil as well as to organisations that, although not located in those states/regions, may offer goods or services there. When it comes to the handling of sensitive data, the LGPD sets forth a narrower list of legal grounds that can be relied upon to legitimise the processing of such data, such as the necessity to comply with a legal obligation, to protect the life and physical safety of the subject or a third party, for the exercise of rights in contractual or judicial proceedings, and for the prevention of fraud.
The LGPD offers ten legal grounds for the processing of personal data, which are comparable to the ones provided in the GDPR. Among these, four grounds have no direct equivalent in the GDPR, namely the conduct of studies by research bodies, the exercise of rights in judicial, administrative, and arbitral proceedings, the protection of health in procedures conducted by health professionals and health entities, and the protection of credit.
Both the LGPD and the GDPR expressly provide for a set of rights granted to data subjects with respect to their personal data. Both norms recognise individuals’ right of access to their personal data, right to be informed of processing activities based on their personal data, and rights of rectification and erasure. Although the rights prescribed in both pieces of legislation are fairly similar, arguably the major element that sets the two norms apart is the timeframe for responding to data subject requests. While on the European side organisations must generally respond within one month of receipt of a request, the LGPD provides a 15-day period for complying with access requests, and requests for the exercise of other rights must be responded to immediately.
The role of data protection officers (“DPOs”) is fairly similar under both legislations. DPOs are legally tasked with acting as a point of contact between the organisation they represent, the supervisory authorities, and data subjects, as well as advising and orienting the organisation they represent with regard to its data protection obligations. There are, however, two major differences between the Brazilian and the EU rules concerning the position of DPOs. The first one is that the GDPR expressly specifies instances where an organisation is required to appoint a DPO, while the LGPD makes no such limitation, thus obliging virtually every organisation subject to its scope to appoint one. The second difference is that, while the GDPR establishes the need for DPOs to be independent within the organisational structure of their organisations and also to be provided with monetary and human resources to fulfil their tasks, the LGPD does not provide such express guidance.
A significant difference between the two instruments is their enforcement. The legal structure of the Brazilian supervisory authority lacks some traits of independence and autonomy when compared to the structure provided for under the GDPR. However, the LGPD has introduced a number of sanctions that can be imposed by the ANPD, such as public disclosure of a violation, erasure of personal data relating to a violation, and even a temporary suspension of data processing activities. The entry into force of the provisions of the LGPD governing administrative sanctions has been deferred to 1 August 2021.
On 23 September 2020, Bill 4695/2020,[165] seeking to protect the personal information of students when using distance learning platforms, was introduced. The bill would require distance learning platforms to follow the data processing requirements provided by the LGPD and, whenever possible, to use the technology without collecting and sharing personal and sensitive data revealing the racial origin, religious or political beliefs, or genetic data of users. Furthermore, the bill requires that personal data be processed only where prior and express consent has been obtained.
Finally, on 18 December 2020, the National Telecommunications Agency (“Anatel”) approved the Cybersecurity Regulation[166] applicable to the telecommunications sector. The regulation is intended to promote cybersecurity in telecommunications networks and services and to support ongoing supervision of the market, infrastructures, and the adoption of proportional corrective measures. Moreover, the regulation imposes an obligation on telecommunication providers to develop, maintain and implement a detailed cybersecurity policy, which must cover, inter alia, national and international norms, best practices, risk mapping, incident response times, and the sharing and sending of information to Anatel. The regulation came into force on 4 January 2021.
B. Other Developments in South America
1. Argentina
On 28 January 2020, the Argentinian data protection authority (“AAIP”) issued a resolution[167] against a telecommunication company for violations of Law No. 26.951 (“DNC Law”).[168] In particular, the AAIP issued a fine of ARS 3,000,000 (approx. €45,000) for 248 charges relating to violations of Article 7 of the DNC Law, which provides that those who advertise, offer, sell or give away goods or services by means of telephone communications may not address any individual who is registered in the “Do Not Call” registry.
On 6 June 2020, the AAIP imposed a fine[169] of ARS 280,000 (approx. €3,770) on a tech company for violations of the Personal Data Protection Act No. 25.326 of 2000. In particular, the AAIP found that the company did not allow a user to access their personal data in their email account and related applications after changes to their passwords were made by an unauthorised third party.
2. Chile
On 1 June 2020, the Chilean Transparency Council (“CPLT”) announced that an audit of 12,000 purchase orders made by 86 organisations in the health sector had revealed some disclosures of sensitive personal data of patients without their express consent.[170] Moreover, the CPLT highlighted that in some cases the data had even been made public through online platforms. To remedy that, the CPLT has offered technical support to the Chilean Ministry of Health.[171]
3. Colombia
On 26 November 2020, the Colombian data protection authority (“SIC”) announced that it had issued an order[172] requiring a videoconference service provider (with no physical presence in Colombia) to implement new measures guaranteeing the security of the personal data of its users in Colombia. SIC emphasised that the measures should be effective and meet the standards of data security required under the Colombian Data Protection Law, and required the company to provide a certificate issued by an independent data security expert. SIC’s order raises significant jurisdictional questions, since the Colombian Data Protection Law does not apply to processing that occurs outside of Colombia (and there was no allegation that any processing in violation of the Law occurred in Colombia).[172a]
Throughout 2020, SIC also imposed a number of fines on various companies for non-compliance with data protection rules. Some of the largest and most notable fines were imposed on a health company[173] and on financial institutions.[174]
4. Mexico
Since the beginning of the COVID-19 pandemic, the Mexican data protection authority, the National Institute of Transparency, Access to Information and Data Protection (“INAI”), has undertaken a series of actions to inform the general public on how to protect their personal data and to provide guidelines for data controllers on how to process personal and sensitive personal data.
Among these actions, it became imperative to remind health-related data controllers, including public and private hospitals, of their legal obligations under the Mexican data protection laws when processing the personal data of patients diagnosed with COVID-19. This was especially important because Mexican data protection laws consider health-related data to be sensitive and thus require stronger security measures for its processing.
One of the INAI’s first actions was the launch, on 29 March 2020, of a COVID-19 microsite[175] dedicated specifically to providing useful information and guidelines for protecting personal data and ensuring transparency during the pandemic. The microsite has been a useful tool for both data subjects and data controllers handling personal data processed as a result of the COVID-19 pandemic.
On 2 April 2020, the INAI released a statement calling for the adoption of extreme precautions with regard to the personal data of COVID-19 patients.[176] Medical personnel handling such data must use strict administrative, physical and technical safeguards to avoid any loss, destruction or improper use. The INAI also recommended that only the minimum necessary personal data be collected, and only for the purposes of preventing and containing the spread of the virus. The communication also addresses the responsibility that all data processors bear when handling personal data.
As the pandemic grew, on 13 July 2020, the INAI expressed its concerns about deficiencies in the health sector’s processing of the personal data of COVID-19 patients. Francisco Javier Acuña Llamas, the then President Commissioner of the INAI, noted that databases containing the data of COVID-19 patients must be kept for a specific period of time and not indefinitely, and that all transfers of sensitive personal data must comply with the requirements of the Mexican data protection laws. He also recognised that the Global Privacy Assembly, to be held in Mexico in 2021, should have at its core a discussion of the impact of the pandemic.[177]
The pandemic also gave rise to situations that had not previously been routinely considered: many companies allowed their employees to work from home. In response, on 8 April 2020, the INAI issued recommendations for the protection of personal data in a home-office environment. These guidelines highlighted the need to implement security measures, including using only computer equipment provided by the employer, avoiding public connections, using only official communication channels to share information, and protecting with passwords all equipment used at home for work-related activities.[178]
These developments also prompted amendments to the Federal Labor Law,[179] which now regulates remote work and establishes the obligations of both employers and employees when working from home. This illustrates how, as a result of the COVID-19 pandemic, a new normality is taking hold and is likely to remain.
The pandemic is far from over, and it poses challenges not only for the processing of sensitive personal data, but also for the implementation of health checkpoints in public spaces and in home-working arrangements. It has changed the way organisations protect their information from loss or improper access, putting cybersecurity at the forefront for every organisation. It has also changed how organisations interact with clients and how products and services are purchased, with commerce increasingly moving online. This will bring challenges not only for companies’ operations, but also for how they collect and process data subjects’ information.
5. Uruguay
On 21 February 2020, the Council of Ministers adopted Decree No. 64/020 on the Regulation of Articles 37-40 of Law No. 19.670 of 15 October 2018 and Article 12 of Law No. 18.331 of 8 November 2008.[180]
The Decree regulates new personal data protection obligations with major changes, including requiring all database owners and data controllers to report security incidents involving personal data to the Uruguayan data protection authority within a maximum of 72 hours. Reports must contain relevant information relating to the security incident, including the actual or estimated date of the breach, the nature of the personal data affected and possible impacts of the breach.
The Decree establishes the obligation to assess the impact of a breach when data processing involves specially protected data, large volumes of personal data (i.e., data of over 35,000 persons) and international data transfers to countries not offering an adequate level of protection. The Decree obliges public entities, and private entities that focus on the processing of sensitive personal data or of large volumes of data, to appoint a data protection officer.
[1] See http://curia.europa.eu/juris/document/document.jsf;jsessionid=2BDC80771D0FB7EA8B6F60B9A3C4F572?text=&docid=228677&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=20032710.
[2] See, e.g., https://www.cnil.fr/en/invalidation-privacy-shield-cnil-and-its-counterparts-are-currently-analysing-its-consequences (French CNIL); https://www.aepd.es/es/derechos-y-deberes/cumple-tus-deberes/medidas-de-cumplimiento/transferencias-internacionales/comunicado-privacy-shield (Spanish AEPD).
[5] See https://edpb.europa.eu/our-work-tools/public-consultations-art-704/2020/recommendations-012020-measures-supplement-transfer_en.
[6] See https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_recommendations_202002_europeanessentialguaranteessurveillance_en.pdf.
[7] See https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12741-Commission-Implementing-Decision-on-standard-contractual-clauses-for-the-transfer-of-personal-data-to-third-countries.
[8] See https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12740-Data-protection-standard-contractual-clauses-between-controllers-processors-located-in-the-EU-implementing-act-
[9] See https://edpb.europa.eu/news/news/2021/edpb-edps-adopt-joint-opinions-new-sets-sccs_en#:~:text=The%20EDPB%20and%20EDPS%20have,of%20contractual%20clauses%20(SCCs).&text=The%20Controller%2DProcessor%20SCCs%20will,between%20controllers%20and%20their%20processors.
[10] See, e.g., https://www.enisa.europa.eu/news/executive-news/top-tips-for-cybersecurity-when-working-remotely. On 15 March 2020, the Director of the ENISA shared some views on teleworking conditions during COVID-19. The Director recommended that individuals work with a secure Wi-Fi connection and have up-to-date security software, regularly update their anti-virus systems and make periodic backups. Employers should also provide regular feedback to their employees on the procedures to follow in case of problems.
[11] See https://edpb.europa.eu/sites/edpb/files/files/news/edpb_statement_2020_processingpersonaldataandcovid-19_en.pdf.
[13] See https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_202003_healthdatascientificresearchcovid19_en.pdf and https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_20200420_contact_tracing_covid_with_annex_en.pdf.
[14] See https://ec.europa.eu/health/sites/health/files/ehealth/docs/contacttracing_mobileapps_guidelines_en.pdf and https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_statementinteroperabilitycontacttracingapps_en.pdf.
[15] See https://edps.europa.eu/sites/edp/files/publication/20-11-17_preliminary_opinion_european_health_data_space_en.pdf.
[17] See https://ico.org.uk/media/for-organisations/documents/2617676/ico-contact-tracing-recommendations.pdf.
[18] See https://www.cnil.fr/fr/application-stopcovid-la-cnil-tire-les-consequences-de-ses-controles and https://www.cnil.fr/fr/tousanticovid-la-cnil-revient-sur-levolution-de-lapplication-stopcovid.
[19] See https://www.cnil.fr/sites/default/files/atoms/files/deliberation_du_24_avril_2020_portant_avis_sur_un_projet_dapplication_mobile_stopcovid.pdf.
[21] See https://ico.org.uk/global/data-protection-and-coronavirus-information-hub/contact-tracing-protecting-customer-and-visitor-details/ and https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/09/data-protection-guidance-for-collecting-customer-information/.
[22] See https://www.datenschutz.rlp.de/fileadmin/lfdi/Dokumente/Pruefschritte_Datenuebermittlung_in_Drittlaender_nach_Schrems_II.pdf.
[23] See https://ico.org.uk/global/data-protection-and-coronavirus-information-hub/coronavirus-recovery-data-protection-advice-for-organisations/.
[24] See https://www.cnil.fr/fr/coronavirus-covid-19-les-rappels-de-la-cnil-sur-la-collecte-de-donnees-personnelles and https://www.cnil.fr/fr/les-questions-reponses-de-la-cnil-sur-le-teletravail.
[26] See https://www.dataprotection.ie/sites/default/files/uploads/2020-07/Data%20Protection%20implications%20of%20the%20Return%20to%20Work%20Safely%20Protocol.pdf.
[27] See https://autoriteitpersoonsgegevens.nl/nl/nieuws/ap-onderzoekt-meten-temperatuur-werknemers-tijdens-corona.
[28] See https://www.cnil.fr/fr/cameras-dites-intelligentes-et-cameras-thermiques-les-points-de-vigilance-de-la-cnil-et-les-regles and https://www.datenschutzkonferenz-online.de/media/dskb/20200910_beschluss_waeremebildkameras.pdf.
[29] See https://ec.europa.eu/health/sites/health/files/vaccination/docs/2019-2022_roadmap_en.pdf.
[32] See https://www.cnil.fr/fr/cookies-et-autres-traceurs-la-cnil-publie-des-lignes-directrices-modificatives-et-sa-recommandation.
[36] See https://www.cnil.fr/fr/sanctions-2250000-euros-et-800000-euros-pour-carrefour-france-carrefour-banque.
[37] See https://www.cnil.fr/en/cookies-financial-penalties-60-million-euros-against-company-google-llc-and-40-million-euros-google-ireland andhttps://www.cnil.fr/en/cookies-financial-penalty-35-million-euros-imposed-company-amazon-europe-core.
[39] See https://www.enisa.europa.eu/publications/recommendations-for-european-standardisation-in-relation-to-csa-ii/.
[40] See https://www.enisa.europa.eu/news/enisa-news/security-requirements-for-operators-of-essential-services-and-digital-service-providers.
[41] See https://www.enisa.europa.eu/news/enisa-news/spotlight-on-incident-reporting-of-telecom-security-and-trust-services.
[42] See https://www.enisa.europa.eu/news/enisa-news/enisa-unveils-its-new-strategy-on-cybersecurity-for-a-trusted-and-cyber-secure-europe.
[43] See https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/10/ico-fines-british-airways-20m-for-data-breach-affecting-more-than-400-000-customers/.
[44] See https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/10/ico-fines-marriott-international-inc-184million-for-failing-to-keep-customers-personal-data-secure/.
[45] See https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/11/ico-fines-ticketmaster-uk-limited-125million-for-failing-to-protect-customers-payment-details/.
[46] See https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_bindingdecision01_2020_en.pdf and https://edpb.europa.eu/sites/edpb/files/decisions/final_decision_-_in-19-1-1_9.12.2020.pdf.
[47] See https://www.consilium.europa.eu/en/press/press-releases/2020/07/30/eu-imposes-the-first-ever-sanctions-against-cyber-attacks/.
[48] See https://ec.europa.eu/info/sites/info/files/draft_eu-uk_trade_and_cooperation_agreement.pdf.
[50] See https://ico.org.uk/for-organisations/dp-at-the-end-of-the-transition-period/data-protection-now-the-transition-period-has-ended/the-gdpr/international-data-transfers/.
[51] The adequacy decisions adopted by the European Commission currently cover Andorra, Argentina, Canada (commercial organisations only), Faroe Islands, Guernsey, Isle of Man, Israel, Japan (private-sector organisations only), Jersey, New Zealand, Switzerland and Uruguay.
[52] See https://ico.org.uk/for-organisations/dp-at-the-end-of-the-transition-period/data-protection-now-the-transition-period-has-ended/the-gdpr/international-data-transfers/.
[53] See Schedule 21 of the Data Protection Act 2018, as enacted by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019.
[54] See https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202007_controllerprocessor_en.pdf, https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202008_onthetargetingofsocialmediausers_en.pdf and https://edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-42019-article-25-data-protection-design-and_en.
[56] See https://www.garanteprivacy.it/home/docweb/-/docweb-display/docweb/9435753 and https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9485754.
[59] The statistics are available (in Russian) at https://rkn.gov.ru/news/rsoc/news71528.htm.
[60] Press release (in Russian) available at https://rkn.gov.ru/news/rsoc/news71612.htm. For more information in English see https://www.reuters.com/article/us-russia-protonmail-idUSKBN1ZS1K8.
[61] Press release (in Russian) available at https://rkn.gov.ru/news/rsoc/news72026.htm.
[62] Press release (in Russian) available at https://rkn.gov.ru/news/rsoc/news73050.htm. For more information (in English) see https://www.ft.com/content/b1e76905-29f2-4ac0-99e0-7af07cef280d. See also the 2020 Privacy and Cybersecurity International Review and Outlook.
[63] Press release (in Russian) available at https://rkn.gov.ru/news/rsoc/news71720.htm. More information (in English) available at https://www.themoscowtimes.com/2020/01/31/russia-threatens-facebook-twitter-with-100k-fines-a69126.
[64] Press release (in Russian) available at https://mos-gorsud.ru/rs/taganskij/news/mirovoj-sudya-rajona-taganskij-rassmotrel-dela-ob-administrativnyh-pravonarusheniyah-v-otnoshenii-tvitter-ink-i-fejsbuk-ink. More information (in English) available at https://www.themoscowtimes.com/2020/02/13/russia-fines-twitter-and-facebook-63000-each-over-data-law-a69280.
[65] See https://www.themoscowtimes.com/2020/11/26/facebook-pays-russia-50k-fine-for-not-localizing-user-data-a72152.
[66] The law (in Russian) available at http://publication.pravo.gov.ru/Document/View/0001202012300050.
[67] The law (in Russian) available at http://publication.pravo.gov.ru/Document/View/0001202012300002.
[68] The law (in Russian) available at http://publication.pravo.gov.ru/Document/View/0001202012300044.
[69] The law (in Russian) available at http://publication.pravo.gov.ru/Document/View/0001202012300062.
[70] The Russian laws define the notion of illegal content broadly. Inter alia, illegal content is materials containing public calls for terrorist activities or publicly justifying terrorism, other extremist materials, as well as materials promoting pornography, the cult of violence and cruelty, and materials containing obscene language.
[71] The text of the Revised FADP (in German) is available at https://www.parlament.ch/centers/eparl/curia/2017/20170059/Schluzssabstimmungstext%203%20NS%20D.pdf.
[72] See Revised FADP, Article 3.
[73] See Revised FADP, Article 5(a).
[74] See Revised FADP, Article 5(c).
[75] See Revised FADP, Article 5(f).
[76] See Revised FADP, Article 5(j) and (k).
[77] See Revised FADP, Article 7.
[78] See Revised FADP, Article 9(3).
[79] See Revised FADP, Article 12.
[80] See Revised FADP, Article 14.
[81] See Revised FADP, Article 19.
[82] See Revised FADP, Article 21.
[83] See Revised FADP, Article 22.
[84] See Revised FADP, Article 24.
[85] See Revised FADP, Article 28.
[86] See Revised FADP, Articles 60-63.
[87] Press release available at https://www.admin.ch/gov/en/start/documentation/media-releases.msg-id-80318.html.
[88] Judgment of the Court of 16 July 2020 in Case C-311/18 – Data Protection Commissioner v Facebook Ireland Limited and Maximillian Schrems, available at http://curia.europa.eu/juris/document/document.jsf?text=&docid=228677&pageIndex=0&doclang=en&mode=lst&dir=&occ=rst&part=1&cid=9791227.
[89] Full statement (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6843/-ALENILESTIRME-HAKKINDA-KAMUOYU-DUYURUSU.
[90] Full statement (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6828/YURTDISINA-VERI-AKTARIMI-KAMUOYU-DUYURUSU.
[91] Full statement (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6728/YURT-DISINA-KISISEL-VERI-AKTARIMINDA-BAGLAYICI-SIRKET-KURALLARI-HAKKINDA-DUYURU.
[92] Full statement (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6777/Kisilerin-Ad-ve-Soyadi-ile-Arama-Motorlari-Uzerinden-Yapilan-Aramalarda-Cikan-Sonuclarin-Indeksten-Cikarilmasina-Yonelik-Talepler-Hakkinda-Kamuoyu-Duyurusu.
[93] Full decision (in Turkish) available at https://kvkk.gov.tr/Icerik/6776/2020-481.
[94] Full criteria (in Turkish) available at https://kvkk.gov.tr/SharedFolderServer/CMSFiles/68f1fb19-5803-4ef8-8696-f938fb49a9d5.pdf.
[95] Full statement (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6765/AYDINLATMA-YUKUMLULUGUNUN-YERINE-GETIRILMESI-HAKKINDA-KAMUOYU-DUYURUSU.
[96] Full statement (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6726/COVID-19-ILE-MUCADELEDE-KONUM-VERISININ-ISLENMESI-VE-KISILERIN-HAREKETLILIKLERININ-IZLENMESI-HAKKINDA-BILINMESI-GEREKENLER-2-.
[97] Full text of the Decision (in Turkish) available at https://kvkk.gov.tr/Icerik/6733/2020-103.
[98] Full text of the Decision (in Turkish) available at https://kvkk.gov.tr/Icerik/6790/2020-559.
[99] Full text of the Decision (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6763/2020-286.
[100] Full text of the Decision (in Turkish) available at https://www.kvkk.gov.tr/Icerik/6739/2020-173.
[101] English text available at https://www.newamerica.org/cybersecurity-initiative/digichina/blog/chinas-draft-personal-information-protection-law-full-translation/.
[102] See Article 62 of the Draft PIPL.
[103] See Article 42 of the Draft PIPL.
[104] See https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/cipl_comments_on_chinas_draft_personal_information_protection_law__18_november_2020_-_english_.pdf.
[105] See Article 29 of the Draft PIPL.
[106] CBIRC decisions (in Chinese) available at http://www.cbirc.gov.cn/branch/shanghai/view/pages/common/ItemDetail.html?docId=920602&itemId=1000 and http://www.cbirc.gov.cn/branch/shanghai/view/pages/common/ItemDetail.html?docId=920603&itemId=1000.
[107] For the draft data protection legislation presented to the Ministry of Electronics and Information Technology on 27 July 2018 by the committee of experts led by Justice Srikrishna, see https://meity.gov.in/writereaddata/files/Data_Protection_Committee_Report.pdf.
[108] Report on Non-Personal Data Governance Framework available at https://static.mygov.in/rest/s3fs-public/mygov_159453381955063671.pdf.
[109] See “Data Empowerment and Protection Architecture: A Secure Consent-Based Data Sharing Framework to Accelerate Financial Inclusion – Draft for Discussion” (August 2020), available at https://niti.gov.in/sites/default/files/2020-09/DEPA-Book_0.pdf.
[110] See the National Health Data Management Policy, available at https://ndhm.gov.in/assets/uploads/NDHM%20Health%20Data%20anagement%20Policy.pdf.
[111] See DSCI, “Work from Home – Best Practices” (18 March 2020), available at https://www.dsci.in/sites/default/files/DSCI-WorkfromHomeAdvisory-1.pdf.
[112] See DSCI, “COVID-19: Data Privacy Outlook” (24 April 2020), available at https://www.dsci.in/sites/default/files/DSCI_COVID19_Data_Privacy_Outlook.pdf.
[113] See also DSCI, “Business Resiliency and Security During COVID-19” (24 May 2020), available at https://www.dsci.in/sites/default/files/Business-Resiliency-and-Security.pdf.
[114] See DSCI, “Report on Data Transfers” (8 September 2020), available at https://www.dsci.in/sites/default/files/documents/resource_centre/DSCI-CIPL-Accountable-Data-Transfer-Report.pdf.
[115] The Discussion Paper is available at https://www.tec.gov.in/pdf/Whatsnew/ARTIFICIAL%20INTELLIGENCE%20-%20INDIAN%20STACK.pdf.
[116] See “India bans 43 more mobile apps as it takes on China”, Reuters (25 November 2020), available at https://uk.reuters.com/article/uk-india-china-apps/india-bans-43-more-mobile-apps-as-it-takes-on-china-idUKKBN2841QI.
[117] The press release and a list of the apps that were blocked are available at https://pib.gov.in/PressReleasePage.aspx?PRID=1635206#.XvoIE9L3Qpw.whatsapp.
[118] The press release and a list of the apps that were blocked are available at https://pib.gov.in/PressReleasePage.aspx?PRID=1650669.
[119] The press release and a list of the apps that were blocked are available at https://www.pib.gov.in/PressReleasePage.aspx?PRID=1675335.
[120] Case BLAPL/4592/2020 Subhranshu Rout @ Gugul v State of Odisha available at https://www.medianama.com/wp-content/uploads/display_pdf.pdf.
[121] Press release (in Indonesian) available at https://www.kominfo.go.id/content/detail/24039/siaran-pers-no-15hmkominfo012020-tentang-presiden-serahkan-naskah-ruu-pdp-ke-dpr-ri/0/siaran_pers; the PDP Bill (in Indonesian) is available at https://web.kominfo.go.id/sites/default/files/users/4752/Rancangan%20UU%20PDP%20Final%20%28Setneg%20061219%29.pdf.
[122] Press release (in Indonesian) available at https://www.kominfo.go.id/content/detail/24041/menkominfo-indonesia-akan-menjadi-negara-ke-5-di-asean-pemilik-uu-pdp/0/berita_satker.
[123] Press release (in Indonesian) available at https://www.kominfo.go.id/content/detail/29084/siaran-pers-no-104hmkominfo092020-tentang-pemerintah-apresiasi-pandangan-fraksi-terhadap-ruu-pdp/0/siaran_pers.
[124] Press release (in Hebrew) available at https://www.gov.il/he/departments/publications/Call_for_bids/amendment_privacy_protection
[125] Press release (in Hebrew) available at https://www.tazkirim.gov.il/s/law-item/a093Y00001RdRNXQA3/%D7%AA%D7%96%D7%9B%D7%99%D7%A8-%D7%97%D7%95%D7%A7-%D7%94%D7%92%D7%A0%D7%AA-%D7%94%D7%A4%D7%A8%D7%98%D7%99%D7%95%D7%AA-%D7%AA%D7%99%D7%A7%D7%95%D7%9F-%D7%9E%D7%A1-%D7%94%D7%92%D7%93%D7%A8%D7%95%D7%AA-%D7%95%D7%A6%D7%9E%D7%A6%D7%95%D7%9D-%D7%97%D7%95%D7%91%D7%AA-%D7%94%D7%A8%D7%99%D7%A9%D7%95%D7%9D-%D7%94%D7%AA%D7%A9%D7%A3-2020?language=iw
[126] See “Opinion regarding cross-border transfers of personal data, from Israeli based organisations to organisations based in countries complying with the data protection legislation of the EU” (1 July 2020), available at https://www.gov.il/en/Departments/publications/reports/personaldata_the_european_union.
[127] See “Personal data of all 6.5 million Israeli voters is exposed” (10 February 2020), available at https://www.nytimes.com/2020/02/10/world/middleeast/israeli-voters-leak.html. Press release, “Data Breach at Shirbit” (1 December 2020), available at https://www.gov.il/en/departments/news/news_shirbit.
[128] English version of the APPI available at https://www.ppc.go.jp/files/pdf/Act_on_the_Protection_of_Personal_Information.pdf.
[129] Department of Personal Data Protection, “Public Consultation Paper No. 10/2020 – Review of Personal Data Protection Act 2010 (Act 709)” (14 February 2020), available at https://www.pdp.gov.my/jpdpv2/assets/2020/02/Public-Consultation-Paper-on-Review-of-Act-709_V4.pdf. See also a press release of 26 August 2020, where the Malaysian government announced the continued discussions on amending the Personal Data Protection Act 2010 (in Malay), available at https://www.kkmm.gov.my/awam/berita-terkini/17616-bernama-26-ogos-2020-kerajaan-masih-bincang-keperluan-pinda-akta-perlindungan-data-peribadi.
[130] Advisory guidelines (in Malay) available at https://www.kkmm.gov.my/images/AdHoc/200529-ADVISORY.pdf.
[131] See “MCI and PDPC launch online public consultation on Personal Data Protection (Amendment) Bill 2020”, Press Release (14 May 2020), available at https://www.mci.gov.sg/pressroom/news-and-stories/pressroom/2020/5/MCI-and-PDPC-launch-online-public-consultation-on–Personal-Data%20Protection-Amendment-Bill-2020; “Public Consultation on the Draft Personal Data Protection (Amendment) Bill” (28 May 2020), available at https://www.mci.gov.sg/public-consultations/public-consultation-items/public-consultation-on-the-draft-personal-data-protection-amendment-bill.
[132] See Bill No. 37/2020 Personal Data Protection (Amendment) Bill, available at https://www.parliament.gov.sg/docs/default-source/default-document-library/personal-data-protection-(amendment)-bill-37-2020.pdf; Ministry of Communications and Information, “Amendments to the Personal Data Protection Act and Spam Control Act Passed”, Press Release (2 November 2020), available at https://www.mci.gov.sg/pressroom/news-and-stories/pressroom/2020/11/amendments-to-the-personal-data-protection-act-and-spam-control-act-passed.
[133] See “Opening Speech by Mr S Iswaran, Minister for Communications and Information, at the Second Reading of the Personal Data Protection (Amendment) Bill 2020 on 2 November 2020” (2 November 2020), available at https://www.mci.gov.sg/pressroom/news-and-stories/pressroom/2020/11/opening-speech-by-minister-iswaran-at-the-second-reading-of-pdp-(amendment)-bill-2020.
[134] See “Amendments to the Personal Data Protection Act and Spam Control Act Passed”, Press Release (2 November 2020), available at https://www.mci.gov.sg/pressroom/news-and-stories/pressroom/2020/11/amendments-to-the-personal-data-protection-act-and-spam-control-act-passed.
[135] See PDPC, “Draft Advisory Guidelines on Key Provisions of the Personal Data Protection (Amendment) Bill” (20 November 2020), available at https://www.pdpc.gov.sg/-/media/Files/PDPC/PDF-Files/Advisory-Guidelines/Draft-AG-on-Key-Provisions/Draft-Advisory-Guidelines-on-Key-Provisions-of-the-PDP-(Amendment)-Bill-(20-Nov-2020).pdf?la=en.
[136] English version of PIPA available at https://elaw.klri.re.kr/kor_service/lawView.do?hseq=53044&lang=ENG.
[137] Royal Decree (in Thai) available at http://www.ratchakitcha.soc.go.th/DATA/PDF/2563/A/037/T_0001.PDF.
[138] MDES statement (in Thai) is available at https://www.mdes.go.th/news/detail/2760–ดีอีเอส–เตรียมคลอดประกาศมาตรการคุ้มครองความปลอดภัยข้อมูลส่วนบุคคลฯ.
[139] MDES notice (in Thai) available at http://www.ratchakitcha.soc.go.th/DATA/PDF/2563/E/164/T_0012.PDF.
[141] See “Abu Dhabi Global Market Launches Public Consultation on New Data Protection Regulatory Framework” by Natasha G. Kohne, Jenny Arlington, Sahar Abas & Mazen Baddar, GDPR, International Privacy (7 December 2020), available at https://www.akingump.com/en/experience/practices/cybersecurity-privacy-and-data-protection/ag-data-dive/abu-dhabi-global-market-launches-public-consultation-on-new-data-protection-regulatory-framework.html.
[142] See “ADGM commences Public Consultation on proposed new Data Protection Regulations” (19 November 2020), available athttps://www.adgm.com/media/announcements/adgm-commences-public-consultation-on-proposed-new-data-protection-regulations.
[144] See Data Protection Regulations, available athttps://www.difc.ae/files/9315/9358/7756/Data_Protection_Regulations_2020.pdf and Data Protection Law No. 5 of 2020, available athttps://www.difc.ae/files/6215/9056/5113/Data_Protection_Law_DIFC_Law_No._5_of_2020.pdf.
[145] For the full list of accredited GPA members, see https://globalprivacyassembly.org/participation-in-the-assembly/list-of-accredited-members/.
[146] See “Africa to harmonise laws for data protection, digital economy” by Gloria Nwafor, Guardian (8 October 2020), https://guardian.ng/appointments/africa-to-harmonise-laws-for-data-protection-digital-economy/?_sm_au_=iVV7MH8JqKDPF0RFFcVTvKQkcK8MG.
[147] See “Sisi endorses law on personal data protection”, Egypt Today (18 July 2020), available athttps://www.egypttoday.com/Article/1/89794/Sisi-endorses-law-on-personal-data-protection.
[148] Kenya’s high court ruled that the country’s new digital ID scheme could continue with some conditions and stronger regulations. The court banned the collection of DNA and geolocation data, See “Court orders safeguards for Kenyan digital IDs, bans DNA collecting“, by Humphrey Malalo, Omar Mohammed, (31 January 2020), available athttps://www.reuters.com/article/us-kenya-rights/court-orders-safeguards-for-kenyan-digital-ids-bans-dna-collecting-idUSKBN1ZU23D
[149] See “ITI Comments on the U.S.-Kenya Trade Agreement Negotiation” (27 April 2020), https://www.itic.org/policy/ITIUS-KenyaFTAComments_27APR2020_FINAL.pdf and “ITI: U.S.-Kenya Trade Agreement Can Set New Global Benchmark for Digital Trade” (28 April 2020), available athttps://www.itic.org/news-events/news-releases/iti-u-s-kenya-trade-agreement-can-set-new-global-benchmark-for-digital-trade.
[150] See “Joint Statement Between the United States and Kenya on the Launch of Negotiations Towards a Free Trade Agreement” (7 August 2020), available athttps://ustr.gov/node/10204.
[151] See “Stakeholders’ Consultation Workshop on the Data Protection Bill in Namibia” (24-26 February 2020), available athttps://www.coe.int/en/web/cybercrime/glacyplusactivities/-/asset_publisher/ekq5KxUZwAqU/content/glacy-stakeholders-consultation-workshop-on-the-data-protection-bill-in-namibia?inheritRedirect=false&redirect=https%253A%252F%252Fwww.coe.int%252Fen%252Fweb%252Fcybercrime%252Fglacyplusactivities%253Fp_p_id%253D101_INSTANCE_ekq5KxUZwAqU%2526p_p_lifecycle%253D0%2526p_p_state%253Dnormal%2526p_p_mode%253Dview%2526p_p_col_id%253Dcolumn-4%2526p_p_col_count%253D1.
[152] See “Pantami Reiterates FG’s Commitment to Strengthening Cybersecurity” (14 April 2020), available athttps://www.ncc.gov.ng/media-centre/news-headlines/783-pantami-reiterates-fg-s-commitment-to-strengthening-cybersecurity.
[154] See “Annual Report for the 2019/2020 Financial Year”, available athttps://www.justice.gov.za/inforeg/docs/anr/ANR-2019-2020-InformantionRegulatorSA.pdf and “South Africa must implement privacy laws to protect citizens, says UN expert” (12 March 2020), available athttps://mg.co.za/article/2020-03-12-south-africa-must-implement-privacy-laws-to-protect-citizens-says-un-expert/. Moreover, two significant incidents were reported: Experian South Africa announced a data incident affecting 24 million South Africans and 793,749 businesses, see “Experian South Africa curtails data incident” (19 August 2020), available athttps://www.experian.co.za/content/dam/marketing/emea/soafrica/za/assets/experian-south-africa-statement-19082020.pdf. Nedbank announced a data incident concerning 1.7 million clients, see “Nedbank warns clients of potential impact of data incident at Computer Facilities (Pty) Ltd”, https://www.nedbank.co.za/content/nedbank/desktop/gt/en/info/campaigns/nedbank-warns-clients.html.
[155] See “Guidance Note on the Processing of Personal Information in the Management and Containment of COVID-19 Pandemic in terms of the Protection of Personal Information Act 4 of 2013 (POPIA),” available athttps://www.justice.gov.za/inforeg/docs/InfoRegSA-GuidanceNote-PPI-Covid19-20200403.pdf and Press Release (3 April 2020), available athttps://www.justice.gov.za/inforeg/docs/ms-20200403-GuidanceNote-PPI-Covid19.pdf.
[156] See “Conseil des ministres: un projet de décret sur la protection des données à caractère personnel adopté” (9 December 2020), available athttps://presidence.gouv.tg/2020/12/09/conseil-des-ministres-un-projet-de-decret-sur-la-protection-des-donnees-a-caractere-personnel-adopte/.
[157] See “Statement on cabinet decisions of 27th October 2020”, available athttps://www.primature.gov.rw/index.php?id=131&tx_news_pi1%5Bnews%5D=933&tx_news_pi1%5Bcontroller%5D=News&tx_news_pi1%5Baction%5D=detail&cHash=7a012c144e6b2eb6d384a0bf1f153c26.
[158] See “Rwanda data protection bill approved” (29 October 2020), available athttps://iapp.org/news/a/rwanda-data-protection-bill-approved/#:~:text=30%20and%20included%20provisions%20on,as%20transfers%2C%20sharing%20and%20retention.
[159] See Cybersecurity Regulation n˚ 010/r/cr-csi/rura/020 of 29/05/2020, available athttps://rura.rw/fileadmin/Documents/ICT/Laws/Cybersecurity_Regulation_in_Rwanda.pdf.
[160] See “Oman: Latest developments in data protection and cybersecurity,” Alice Gravenor, PWC-Middle East (19 November 2020), available athttps://www.pwc.com/m1/en/media-centre/articles/oman-latest-developments-data-protection-cybersecurity.html.
[161] See Draft Personal Data Protection Bill (9 April 2020), available athttps://moitt.gov.pk/SiteImage/Misc/files/Personal%20Data%20Protection%20Bill%202020%20Updated(1).pdf.
[162] See social media rules adopted (6 October 2020), available athttps://moitt.gov.pk/SiteImage/Misc/files/Corrected%20Version%20of%20Rules.pdf.
[163] Law (in English) available athttps://www.lgpdbrasil.com.br/wp-content/uploads/2019/06/LGPD-english-version.pdf.
[164] Law (in Portuguese) available athttps://www.in.gov.br/en/web/dou/-/lei-n-14.058-de-17-de-setembro-de-2020-278155040.
[165] Bill (in Portuguese) available at: https://www.camara.leg.br/proposicoesWeb/prop_mostrarintegra;jsessionid=E847373CA5B3CD6F9C0FEF1AA14EED13.proposicoesWebExterno1?codteor=1931814&filename=Tramitacao-PL+4695/2020.
[166] Regulation (in Portuguese) available at https://sei.anatel.gov.br/sei/modulos/pesquisa/md_pesq_documento_consulta_externa.php?eEP-wqk1skrd8hSlk5Z3rN4EVg9uLJqrLYJw_9INcO760LFI_pHFdPDvhssf6GcKAE5_GJovBZUfi7_h9SO4EFu4GZ_rtRSkPAMggKV38swnbODIuh_k2ClcCwWdtg0X.
[167] Decision (in Spanish) available athttps://www.argentina.gob.ar/sites/default/files/rs-2020-33-apn-aaip.pdf.
[168] Law (in Spanish) available athttp://servicios.infoleg.gob.ar/infolegInternet/anexos/230000-234999/233066/norma.htm.
[169] Decision (in Spanish) available athttps://www.argentina.gob.ar/sites/default/files/rs-2020-25457045-apn-aaip_google.pdf.
[170] Press release (in Spanish) available athttps://www.consejotransparencia.cl/fiscalizacion-del-cplt-descubre-vulneracion-de-la-privacidad-de-pacientes-en-compras-de-hospitales-y-servicios-de-salud/.
[171] Letter (in Spanish) available athttps://www.consejotransparencia.cl/wp-content/uploads/2020/06/N%C2%B0000746-Patricio-Ferna%CC%81ndez-Pe%CC%81rez.-Superintendente-de-Salud.pdf.
[172] Order (in Spanish) available athttps://www.sic.gov.co/sites/default/files/files/Normativa/Resoluciones/Res%2074519%20DE%202020%20ZOOM(1).pdf.
[172a] See https://www.sic.gov.co/slider/superindustria-ordena-la-plataforma-zoom-reforzar-medidas-de-seguridad-para-proteger-los-datos-personales-de-los-colombianos
[173] The imposed fine was of COP 894,365,280 (approx. €214,524), after confirming the violation of the personal data of a data subject whose data was being processed by EPS. Full Resolution available at https://www.sic.gov.co/sites/default/files/files/Normativa/Resoluciones/1%20Apelacio%CC%81n%2018-179365%20%20EPS%20SANITAS%20VP%20F%20(1)%20(1).pdf.
[174] For the first bank, the imposed fine was of COP 702,000,000 (approx. €171,400) for including information that was not of a financial or credit nature in the credit history of 288,753 Colombians. Full Resolution available athttps://www.sic.gov.co/sites/default/files/files/Normativa/Resoluciones/SANCIO%CC%81N%20CIFIN.pdf; for the second bank, the imposed fine was of COP 269,046,492 (approx. €60,030) for violating a data subject’s right to deletion. Full Resolution of SIC available athttps://www.sic.gov.co/sites/default/files/files/Normativa/Resoluciones/19-141889%20VP.pdf; for the third bank, the imposed fine was of COP 356,070,000 (approx. €80,910) for violations of Law 1581 of 2012 and Decree 4886 of 2011. Full decision of SIC available athttps://www.sic.gov.co/sites/default/files/files/Noticias/2019/RE10720-2020(1).pdf.
[175] Press release (in Spanish) available athttps://home.inai.org.mx/wp-content/documentos/SalaDePrensa/Comunicados/Comunicado%20INAI-102-20.pdf.
[176] Press release (in Spanish) available athttps://home.inai.org.mx/wp-content/documentos/SalaDePrensa/Comunicados/Comunicado%20INAI-106-20.pdf.
[177] Press release (in Spanish) available athttps://home.inai.org.mx/wp-content/documentos/SalaDePrensa/Comunicados/Comunicado%20INAI-228-20.pdf.
[178] Press release (in Spanish) available athttps://home.inai.org.mx/wp-content/documentos/SalaDePrensa/Comunicados/Comunicado%20INAI-113-20.pdf.
[179] Mexico’s Official Gazzete publication of January 11, 2021 that modifies section XII Bis of the Federal Labor Law available athttp://dof.gob.mx/nota_detalle.php?codigo=5609683&fecha=11/01/2021.
[180] Decree (in Spanish) available athttps://www.impo.com.uy/bases/decretos/64-2020
The following Gibson Dunn lawyers assisted in the preparation of this article: Ahmed Baladi, Alexander Southwell, Alejandro Guerrero, Vera Lukic and Clémence Pugnet.
Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the firm’s Privacy, Cybersecurity and Consumer Protection practice group:
Europe
Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com)
James A. Cox – London (+44 (0) 20 7071 4250, jacox@gibsondunn.com)
Patrick Doris – London (+44 (0) 20 7071 4276, pdoris@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com)
Bernard Grinspan – Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com)
Penny Madden – London (+44 (0) 20 7071 4226, pmadden@gibsondunn.com)
Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com)
Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com)
Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com)
Sarah Wazen – London (+44 (0) 20 7071 4203, swazen@gibsondunn.com)
Asia
Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com)
Connell O’Neill – Hong Kong (+852 2214 3812, coneill@gibsondunn.com)
Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com)
United States
Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com)
Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com)
Matthew Benjamin – New York (+1 212-351-4079, mbenjamin@gibsondunn.com)
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com)
Ashley Rogers – Dallas (+1 214-698-3316, arogers@gibsondunn.com)
Deborah L. Stein – Los Angeles (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)
Cassandra L. Gaedt-Sheckter – Palo Alto (+1 650-849-5203, cgaedt-sheckter@gibsondunn.com)
© 2021 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
On 10 November 2020, the European Data Protection Board (EDPB) issued important new guidance on transferring personal data out of the European Economic Area (EEA). The guidance addresses a key question for many companies: how to transfer personal data out of the EEA to the United States or other countries not recognized by the European Commission as ensuring an adequate level of protection for personal data. The guidance thus begins to lessen some of the uncertainty caused by the July 2020 ruling of the Court of Justice of the European Union (“CJEU”) in the landmark Schrems II decision.
The EDPB’s guidance has been published for consultation by citizens and stakeholders until 21 December 2020, and may thus be subject to further changes or amendments. Although the guidance takes the form of non-binding recommendations, companies that transfer personal data out of the EEA would be well-served to review their approach to such transfers in light of the EDPB guidance.
I. Context
As a reminder, under the EU’s omnibus privacy law, the General Data Protection Regulation (GDPR), a transfer of personal data out of the EEA may take place if the receiving country ensures an adequate level of data protection, as determined by a decision of the European Commission. In the absence of such an adequacy decision, the exporter may proceed with such a data transfer only if it has put in place appropriate safeguards.
In the Schrems II ruling in July 2020, the CJEU invalidated the EU-U.S. Privacy Shield, which had been a framework used by companies transferring personal data from the EEA to the U.S. to provide reassurance that the data would be protected after the transfer. The CJEU’s decision allowed the use of the Standard Contractual Clauses, known as the “SCCs,” approved by the European Commission, to continue as another framework or method to cover such transfers. However, the CJEU required companies to verify, prior to any transfer of personal data pursuant to the SCCs, whether data subjects would be granted a level of protection in the receiving country essentially equivalent to that guaranteed within the EU, pursuant to the GDPR and the EU Charter of Fundamental Rights.[i]
The Court specified that the assessment of that level of protection must take into consideration both the contractual arrangements between the data exporter and the recipient and, as regards any access by the public authorities of the receiving country, the relevant aspects of the legal system of that third country.
Due to their contractual nature, SCCs cannot bind the public authorities of third countries, since they are not party to the contract. Consequently, under Schrems II, data exporters may need to supplement the guarantees contained in the SCCs with supplementary measures to ensure compliance with the level of protection required under EU law in a particular third country.
The EDPB issued on 10 November 2020 two sets of recommendations:
- Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data, which are aimed at providing a methodology for data exporters to determine whether and which additional measures would need to be put in place for their transfers; and
- Recommendations 02/2020 on the European Essential Guarantees (EEG) for surveillance measures, which are aimed at updating the EEG[ii], in order to provide elements to examine whether surveillance measures allowing access to personal data by public authorities in a receiving country, whether national security agencies or law enforcement authorities, can be regarded as a justifiable interference.
II. Recommendations on how to identify and adopt supplementary measures
The EDPB describes a roadmap of the steps to adopt in order to determine if a data exporter needs to put in place supplementary measures to be able to legally transfer data outside the EEA.
Step 1 – Know your transfers. The data exporter should map all transfers of personal data to countries outside the EEA (and verify that the data transferred is adequate, relevant and limited to what is necessary in relation to the purposes for which it is transferred to and processed in the third country). A minimal, illustrative sketch of such a mapping appears after Step 6 below.
Step 2 – Verify the transfer tool on which the transfer relies. If the European Commission has already declared the country as ensuring an adequate level of protection for personal data, there is no need to take any further steps, other than monitoring that the adequacy decision remains valid.
In the absence of an adequacy decision, the data exporter and the data importer would need to rely on one of the transfer tools listed under Article 46 of the GDPR (including the SCCs) for transfers that are regular and repetitive. Derogations provided for in Article 49 of the GDPR[iii] may be relied on only in some cases of occasional and non-repetitive transfers.
Step 3 – Assess if there is anything in the law or practice of the third country that may impinge on the effectiveness of the appropriate safeguards of the transfer tools relied on, in the context of the transfer (see section III below).
The recommendations specify that: (i) the data importer should be in a position to provide the relevant sources and information relating to the third country in which it is established and the laws applicable to it; and (ii) the data exporter may also refer to several sources of information (e.g., case law of the CJEU and of the European Court of Human Rights; adequacy decisions in the country of destination if the transfer relies on a different legal basis; national caselaw or decisions taken by independent judicial or administrative authorities competent on data privacy and data protection of third countries).
If the assessment reveals that the receiving country’s legislation impinges on the effectiveness of the transfer tool contained in Article 46 of the GDPR, Step 4 should be implemented[iv].
Step 4 – Identify and adopt supplementary measures to bring the level of protection of the data transferred up to the EU standard of “essential equivalence”.
Supplementary measures may have a contractual[v], technical[vi], or organizational[vii] nature—and combining diverse measures may enhance the level of protection and contribute to reaching EU standards.
The EDPB provides for:
- Various examples of measures that are dependent upon several conditions being met in order to be considered effective (e.g., technical measures such as encryption or pseudonymization; contractual measures such as obligation to use specific technical measures, transparency obligations, obligations to take specific actions, empowering data subjects to exercise their rights; organizational measures such as internal policies for governance of transfers especially with groups of enterprises, transparency and accountability measures, organization methods and data minimization measures, adoption of standards and best practices); and
- A non-exhaustive list of factors to identify which supplementary measures would be most effective: (a) format of the data to be transferred (i.e. in plain text, pseudonymized or encrypted); (b) nature of the data; (c) length and complexity of the data processing workflow, number of actors involved in the processing, and the relationship between them; (d) possibility that the data may be subject to onward transfers, within the same receiving country or even to other third countries.
The EDPB clarifies that certain data transfer scenarios may not lead to the identification of an effective solution to ensure an essentially equivalent level of protection for the data transferred to the third country. Therefore, in these circumstances, supplementary measures may not qualify to lawfully cover data transfers (e.g., where transfer to processors requires access to data in clear text or remote access to data for business purposes).
In addition, the EDPB specifies that contractual and organizational measures alone will generally not overcome access to personal data by public authorities of the third country. Thus, there will be situations where only technical measures might impede or render ineffective such access.
If no supplementary measure can ensure an essentially equivalent level of protection for a specific transfer, in particular if the law of the receiving country prohibits the application of the possible supplementary measures envisaged (e.g., prohibits the use of encryption) or otherwise prevents their effectiveness, the transfer should be avoided, suspended or terminated.
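Where technical measures such as encryption can be effective (see the conditions listed in footnote [vi]), the sketch below shows, in simplified form and for illustration only, what exporter-side encryption with exporter-held keys might look like in practice. It is a minimal example under stated assumptions, not a compliance recipe: the library, algorithm and key-handling choices are ours for the example and would still need to be assessed against those conditions.

```python
# Hypothetical sketch only: one way an exporter might apply "strong encryption
# before transmission" with keys retained solely under its own control.
# The library choice (cryptography/Fernet, i.e. AES-128-CBC with HMAC-SHA256)
# and the key handling shown are illustrative assumptions.
from cryptography.fernet import Fernet

def encrypt_before_transfer(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt data inside the EEA; only the ciphertext is sent to the importer."""
    key = Fernet.generate_key()        # generated and stored by the exporter
    ciphertext = Fernet(key).encrypt(plaintext)
    return key, ciphertext             # the key never leaves the exporter's control

if __name__ == "__main__":
    key, blob = encrypt_before_transfer(b"backup of personal data")
    # Only `blob` is transmitted to the third-country hosting provider;
    # `key` remains solely with the exporter (or another EEA-based entity).
    print(len(blob), "encrypted bytes ready for transfer")
```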
Step 5 – Implement procedural steps if effective supplementary measures have been identified[viii].
For example, this could consist of entering into an amendment to complete the SCCs to provide for the supplementary measures. When the SCCs themselves are modified or where the supplementary measures added “contradict” directly or indirectly the SCCs, the procedural step should consist in requesting the authorization from the competent supervisory authority.
Step 6 – Re-evaluate at appropriate intervals, i.e., monitor developments in the third country that could affect the initial assessment.
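Returning to Step 1, and purely as an illustration, the sketch below shows one possible way to record such a transfer mapping in structured form. The field names and example values are assumptions made for the example; the EDPB recommendations do not prescribe any particular format.

```python
# Hypothetical sketch only: a minimal structure for the transfer mapping
# described in Step 1. Field names and example values are illustrative.
from dataclasses import dataclass

@dataclass
class TransferRecord:
    data_categories: list[str]   # categories of personal data transferred
    purpose: str                 # purpose of the transfer and processing
    destination_country: str     # third country where the importer is located
    importer: str                # receiving entity
    transfer_tool: str           # e.g. "SCCs", "BCRs", "adequacy decision"
    onward_transfers: bool       # whether further onward transfers may occur

# Example register with a single, hypothetical entry.
register = [
    TransferRecord(
        data_categories=["customer contact details"],
        purpose="CRM hosting",
        destination_country="United States",
        importer="Hypothetical cloud provider",
        transfer_tool="SCCs",
        onward_transfers=False,
    ),
]

# Each record can then be walked through Steps 2 to 6 (transfer tool,
# third-country assessment, supplementary measures, procedural steps,
# periodic re-evaluation).
```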
III. Recommendations on how to assess the level of protection of a third country (Step 3)
The “Recommendations 02/2020 on the European Essential Guarantees for surveillance measures” specify the four EEGs to be taken into consideration in assessing whether surveillance measures allowing access to personal data by public authorities in a receiving country (whether national security agencies or law enforcement authorities) can be regarded as a justifiable interference. Such EEGs should be seen as the essential guarantees to be found in the receiving country when assessing the interference (rather than a list of elements to demonstrate that the legal regime of a third country as a whole provides an essentially equivalent level of protection):
- Processing should be based on clear, precise and accessible rules;
- Necessity and proportionality with regard to the legitimate objectives pursued must be demonstrated;
- An independent oversight mechanism should exist; and
- Effective remedies need to be available to the individual.
IV. Consequences
Many companies will likely continue to transfer personal data outside of the EEA on the basis of the transfer tools listed under Article 46 of the GDPR (including the SCCs and binding corporate rules). Companies, in particular data exporters, must therefore document the efforts implemented in order to ensure that the level of protection required by EU law will be complied with in the third countries to which personal data are transferred.
Such efforts should include, first, to assess whether the level of protection required by EU law is respected in the relevant third country and, if this is not the case, to identify and adopt supplementary measures (technical, contractual and/or organizational) to bring the level of protection of the data transferred up to the EU standard of “essential equivalence”. If no supplementary measure can ensure an essentially equivalent level of protection for a specific transfer, the transfer should be avoided, suspended or terminated.
It is difficult to predict how local supervisory authorities will assess compliance efforts or sanction non-compliant transfers. While the EDPB’s recommendations are to be implemented on a case-by-case basis depending on the specifics of the transfer concerned, it cannot be excluded that supervisory authorities will independently assess the level of protection offered by certain receiving countries and identify relevant supplementary measures.
In addition, these recommendations raise sensitive issues with respect to Brexit, and come at a critical moment in the Brexit negotiations. The U.K. will, in the event of a “No-Deal” Brexit, become a third state from the end of the transition period on 31 December 2020, and there is unlikely to be, at least immediately, an adequacy decision in place in respect of the U.K. One might reasonably expect that, given its membership throughout the currency of the GDPR and the forerunner directive, an adequacy decision in favor of the U.K. would be rapidly forthcoming. While that would be a determination for the European Commission, the EDPB has expressed reservations, making specific reference to the October 2019 agreement between the U.K. and the U.S. on Access to Electronic Data for the Purpose of Countering Serious Crime, which, it says, “will have to be taken into account by the European Commission in its overall assessment of the level of protection of personal data in the UK, in particular as regards the requirement to ensure continuity of protection in case of “onward transfers” from the UK to another third country.” The EDPB has indicated that if the Commission presents an adequacy decision in favor of the U.K., it will express its own view in a separate opinion. Absent an adequacy decision, transfers from the EEA to the U.K. would fall to be treated in the same way as transfers to other third countries, requiring consideration of Articles 46 and 49, SCCs, supplementary measures, etc.
A separate question is how the U.K. will, post-Brexit transition, treat these recommendations from the EDPB, and the question of transfers to third countries generally (and to the U.S. specifically). It cannot be excluded that this may be among the first areas in which we begin to see a limited divergence between EU and U.K. data privacy laws.
It is also worth noting that on 12 November 2020, the European Commission published a draft implementing decision on SCCs for the transfer of personal data to third countries along with a draft set of new SCCs[ix]. The new SCCs include several modules to be used by companies, depending on the transfer scenario and designation of the parties under the GDPR, namely: (i) controller-to-controller transfers, (ii) controller-to-processor transfers, (iii) processor-to-processor transfers and (iv) processor-to-controller transfers. These new SCCs also incorporate some of the contractual supplementary measures recommended by the EDPB as described above. They are open for public consultation until 10 December 2020 and the final new set of SCCs is expected to be adopted in early 2021. At this stage, the draft provides for a grace period of one year during which it will be possible to continue to use the old SCCs for the execution of contracts concluded before the entry into force of the new SCCs[x].
In light of the above, we recommend that companies currently relying on SCCs consult with their data protection officer or counsel to evaluate tailored ways to document and implement the steps to be taken in order to minimize the risks associated with continued data transfers to non-EEA countries, particularly to the U.S.
________________________________
[i] The Charter of Fundamental Rights brings together all the personal, civic, political, economic and social rights enjoyed by people within the EU in a single text.
[ii] The European Essential Guarantees were originally drafted in response to the Schrems I judgment (CJEU judgment of 6 October 2015, Maximillian Schrems v Data Protection Commissioner, Case C‑362/14, EU:C:2015:650).
[iii] Under article 49.2 of the GDPR, a transfer to a third country or an international organization may take place only if the transfer is not repetitive, concerns only a limited number of data subjects, is necessary for the purposes of compelling legitimate interests pursued by the controller which are not overridden by the interests or rights and freedoms of the data subject, and the controller has assessed all the circumstances surrounding the data transfer and has on the basis of that assessment provided suitable safeguards with regard to the protection of personal data.
[iv] The CJEU held, for example, that Section 702 of the U.S. FISA does not respect the minimum safeguards resulting from the principle of proportionality under EU law and cannot be regarded as limited to what is strictly necessary. This means that the level of protection of the programs authorized by 702 FISA is not essentially equivalent to the safeguards required under EU law. As a consequence, if the data importer or any further recipient to which the data importer may disclose the data is subject to 702 FISA, SCCs or other Article 46 of the GDPR transfer tools may only be relied upon for such transfer if additional supplementary technical measures make access to the data transferred impossible or ineffective.
[v] Example of contractual measures: The exporter could add annexes to the contract with information that the importer would provide, based on its best efforts, on the access to data by public authorities in the destination country, including in the field of intelligence (provided the legislation complies with the EDPB European Essential Guarantees). This might help the data exporter to meet its obligation to document its assessment of the level of protection in the third country. Such a measure would be effective if (i) the importer is able to provide the exporter with these types of information to the best of its knowledge and after having used its best efforts to obtain it, and (ii) this obligation imposed on the importer is a means to ensure that the exporter becomes and remains aware of the risks attached to the transfer of data to a third country.
[vi] Example of technical measures: A data exporter uses a hosting service provider in a third country to store personal data, e.g., for backup purposes. The EDPB considers that such an encryption measure provides an effective supplementary measure if (i) the personal data is processed using strong encryption before transmission, (ii) the encryption algorithm and its parameterization (e.g., key length, operating mode, if applicable) conform to the state of the art and can be considered robust against cryptanalysis performed by the public authorities in the recipient country taking into account the resources and technical capabilities (e.g., computing power for brute-force attacks) available to them, (iii) the strength of the encryption takes into account the specific time period during which the confidentiality of the encrypted personal data must be preserved, (iv) the encryption algorithm is flawlessly implemented by properly maintained software the conformity of which to the specification of the algorithm chosen has been verified, e.g., by certification, (v) the keys are reliably managed (generated, administered, stored, if relevant, linked to the identity of an intended recipient, and revoked), and (vi) the keys are retained solely under the control of the data exporter, or other entities entrusted with this task which reside in the EEA or a third country, territory or one or more specified sectors within a third country, or at an international organization for which the Commission has established in accordance with Article 45 of the GDPR that an adequate level of protection is ensured.
[vii] Example of organizational measures: Regular publication of transparency reports or summaries regarding governmental requests for access to data and the kind of reply provided, insofar as publication is allowed by local law. The information provided should be relevant, clear and as detailed as possible. National legislation in the third country may prevent disclosure of detailed information. In those cases, the data importer should employ its best efforts to publish statistical information or a similar type of aggregated information.
[viii] It is worth noting that the EDPB indicates that it will provide more details “as soon as possible” on the impact of the Schrems II judgment on other transfer tools (in particular binding corporate rules and ad hoc contractual clauses).
[ix] This set of new SCCs should be distinguished from the new draft of clauses published by the Commission on the same day which relates to Article 28.3 of the GDPR (also called SCCs by the Commission). This new draft of clauses will only be optional (the parties may choose to continue using their own data processing agreements) and is also subject to public consultation until 10 December 2020.
[x] Provided the contract remains unchanged, with the exception of necessary supplementary measures. In case of relevant changes to the contract or new sub-contracting, by contrast, the old SCCs must be replaced by the new ones.
The following Gibson Dunn lawyers prepared this client alert: Ahmed Baladi, Ryan T. Bergsieker, Patrick Doris, Kai Gesing, Alejandro Guerrero, Vera Lukic, Adelaide Cassanet, and Clemence Pugnet. Please also feel free to contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the Privacy, Cybersecurity and Consumer Protection Group:
Europe
Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com)
James A. Cox – London (+44 (0)20 7071 4250, jacox@gibsondunn.com)
Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com)
Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com)
Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com)
Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com)
Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com)
Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com)
Asia
Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com)
Connell O’Neill – Hong Kong (+852 2214 3812, coneill@gibsondunn.com)
Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com)
United States
Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com)
Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com)
Matthew Benjamin – New York (+1 212-351-4079, mbenjamin@gibsondunn.com)
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com)
Deborah L. Stein – Los Angeles (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)
© 2020 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
On June 19, 2020, the French Administrative Supreme Court (“Conseil d’Etat”) dismissed Google LLC’s appeal against the French Data Protection Authority’s decision of January 21, 2019, imposing a fine of 50 million euros on Google LLC. This fine is, to date, the highest fine issued under the GDPR in the European Union. The decision is now final with no further possibility of appeal before French courts.
This Client Alert lays out the key aspects and implications of the decision.
I. Context of the decision
On January 21, 2019, the French Data Protection Authority (“CNIL”) imposed a fine of 50 million euros on Google LLC for breach of EU transparency and information obligations and lack of valid consent to process data for targeted advertising purposes under the General Data Protection Regulation (EU) 2016/679 (“GDPR”).
Google filed an appeal against the CNIL’s decision with the French Administrative Supreme Court, which issued its final ruling on June 19, 2020. Google raised two requests for preliminary rulings from the Court of Justice of the European Union (one in relation to the jurisdiction of the CNIL and another in relation to the consent mechanism the company had used) but the French Administrative Supreme Court determined that there was no need to refer such questions to the Court of Justice of the European Union.
II. CNIL’s jurisdiction
In its ruling, the French Administrative Supreme Court first confirmed the CNIL’s jurisdiction.
In order to challenge the CNIL’s jurisdiction, Google claimed that its main establishment, as defined under the GDPR, was Google Ireland Limited, which is its head office in Europe, has human and financial resources and assumes responsibility for “many organizational functions” in Europe.
In doing so, Google tried to demonstrate that the Irish supervisory authority (the DPC) should have had jurisdiction in this matter considering the one-stop-shop mechanism provided by the GDPR under which an organization established in multiple EU Member States shall have, as its sole interlocutor, the supervisory authority of its “main establishment” (also called the “lead supervisory authority”). Under the GDPR, the “main establishment” should correspond to the place of the central administration in the EU, unless decisions on the purposes and means of data processing are taken in another establishment which has the power to have such decisions implemented, in which case the latter establishment should be considered the main establishment.
The French Administrative Supreme Court found that Google Ireland Limited did not exercise, at the time of the challenged conduct, control over the other European affiliates of the company, so that it could not be regarded as the “central administration,” and that Google LLC was determining alone the purposes and means of the processing. The Court noted that Google Ireland Limited was assigned new responsibilities in relation to data processing in Europe, but highlighted that this new scope of responsibility was in any event effective only after the date the CNIL issued its decision.
The Court also pointed out that while the CNIL cooperated with other supervisory authorities in the EU in relation to its jurisdiction, none of them raised a concern with respect to the CNIL’s exercise of jurisdiction, and the Irish supervisory authority even publicly stated at that time that it was not the lead supervisory authority of Google LLC.
Therefore, the Court rejected Google’s jurisdictional arguments, including the request for preliminary rulings from the Court of Justice of the European Union regarding that issue.
III. GDPR violation
The French Administrative Supreme Court also confirmed the breaches identified by the CNIL regarding Google’s transparency and information obligations, as well as the lack of valid consent to process its users’ personal data for targeted advertising purposes.
First, the Court found that Google’s consumer disclosures were scattered, thus hindering the accessibility and clarity of information for users, while the data processing carried out was particularly intrusive.
Furthermore, with respect to the validity of the consent collected, the Court confirmed that the information Google provided to consumers that was related to targeted advertising was not presented in a sufficiently clear and distinct manner for the user’s consent to be valid. In particular, the consent was collected in a global manner for various purposes and through a pre-ticked box, which do not meet the requirements of the GDPR. In that respect, the French Administrative Supreme Court also determined that there was no need to raise a request for preliminary rulings from the Court of Justice of the European Union.
Finally, the French Administrative Supreme Court stated that the administrative fine of 50 million euros was not disproportionate and confirmed its amount.
IV. Conclusion
This decision is an important reminder that providing clear disclosures and consent mechanisms are key obligations to be complied with when a company’s processing of personal data is subject to the GDPR, as shortcomings in those areas may lead to significant monetary sanctions. For organizations with multiple subsidiaries or affiliates in the EU, this decision also illustrates the importance of clarifying their corporate organization, identifying their main establishment in the EU, and ensuring that this main establishment satisfies the criteria set out in the GDPR in order to benefit from the one-stop-shop mechanism.
The following Gibson Dunn lawyers prepared this client alert: Ahmed Baladi, Vera Lukic, Adelaide Cassanet, Clemence Pugnet, and Ryan T. Bergsieker. Please also feel free to contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the Privacy, Cybersecurity and Consumer Protection Group:
Europe
Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com)
James A. Cox – London (+44 (0)20 7071 4250, jacox@gibsondunn.com)
Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com)
Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com)
Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com)
Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com)
Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com)
Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com)
Asia
Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com)
Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com)
United States
Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com)
Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com)
Matthew Benjamin – New York (+1 212-351-4079, mbenjamin@gibsondunn.com)
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com)
Deborah L. Stein – Los Angeles (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)
© 2020 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.