EU Privacy + Security Intensive



SLIDE 1

Oct 14, 2019

EU Privacy + Security Intensive

Jan Dhont Partner, Wilson Sonsini Goodrich & Rosati John Bowman Senior Principal, Promontory, an IBM Company

SLIDE 2

Speaker

Jan Dhont

Partner, Wilson Sonsini Goodrich & Rosati

Privacy and Cybersecurity

Counseling on privacy and cybersecurity matters for more than 20 years, with substantive experience working with global and U.S.-based public and private companies in a wide variety of industries. A frequent speaker on topics relating to privacy, and widely known in data privacy and security circles.

SLIDE 3

Speaker

Privacy and data protection consultant since 2014. Previously Head of EU and International Data Protection Policy at the UK Ministry of Justice. In that role, he was Head of Delegation and Lead Negotiator for the UK Government on the GDPR, the Law Enforcement Directive and Convention 108.

John Bowman

Senior Principal, Promontory, an IBM Company

SLIDE 4

EU Privacy + Security Intensive

Session A – 09:00am – 10:15am
  • GDPR: Where are we now?
Session B – 10:45am – 12:00pm
  • GDPR: What is happening across the EU and across borders?
Session C – 1:30pm – 2:45pm
  • New Issues for 2020 and Beyond
Session D – 3:15pm – 4:30pm
  • Practical exercise

SLIDE 5

Session A: GDPR: Where are we now?

1. Implementation in Practice
2. EDPB Guidelines
3. Member State Laws
4. Enforcement

SLIDE 6

Session A: GDPR: Where are we now?

  • 1. Implementation in Practice

SLIDE 7

Implementation in Practice

Key Aspects of GDPR implementation

Substantial amount of activity up to GDPR go-live on May 25, 2018. Many organizations transitioned from change-program activity to business as usual by:

  • Establishing a privacy office and networks (CPO, ‘privacy champions’)
  • Appointing Data Protection Officer(s) and other key personnel
  • Identifying the main establishment in the EU
  • Providing mandatory GDPR training for staff
  • Updating policies, processes, guidance and contracts
  • Maintaining a record of processing activities
  • Managing marketing lists and deciding whether to reobtain consent (the ‘GDPR Deluge’)
  • Incorporating data protection into broader digital strategies and risk/compliance approaches

GDPR PR Done!!

SLIDE 8

Implementation in Practice

Key post-GDPR Challenges

  • Retaining budget and interest in data protection from influencers and decision makers
  • Keeping track of regulatory guidance, court rulings and enforcement proceedings
  • Responding to data subject rights requests, complaints and litigation
  • Reconciling the need to provide a high standard of data protection with Big Data processing, AI, adtech, etc.
  • Operating manual systems and workarounds
  • Benchmarking and maturity assessments
  • Over- or underreporting breaches
  • Applying GDPR as a baseline global framework and reconciling it with emerging privacy laws (CCPA, LGPD)
  • Recruiting and retaining staff
  • Planning for the unexpected (e.g. breach incidents)

GDPR PR Done!!

SLIDE 9

Session A: GDPR: Where are we now?

  • 2. EDPB Guidelines
SLIDE 10

EDPB Guidelines - Overview

Selected European Data Protection Board (EDPB) Guidelines

  • Consent
  • Transparency
  • Data Breach Notification
  • Data Protection Officers

Former WP29 guidelines were endorsed by the EDPB: https://edpb.europa.eu/our-work-tools/general-guidance/gdpr-guidelines-recommendations-best-practices_en

Guidelines have also been published on:

  • Right to data portability
  • Codes of conduct and monitoring bodies, certification
  • Identifying a lead supervisory authority
  • Standard application of BCRs
  • Adequacy referential
  • Article 49 derogations
  • Setting administrative fines
  • Territorial scope of GDPR
  • Automated Decisions & Profiling
  • Data Protection Impact Assessments
SLIDE 11

EDPB Guidelines – Consent

  • Obtaining consent does not negate or diminish a controller’s obligation to observe the other GDPR principles, such as:
    – Fairness
    – Necessity
    – Proportionality
    – Data quality
  • Consent does not legitimize the collection of data that is not necessary in relation to a specified purpose

General Rules for Valid Consent

SLIDE 12

EDPB Guidelines – Consent

Under Art. 4(11), “consent” means any:
  – Freely given
  – Specific
  – Informed; and
  – Unambiguous
indication of the data subject’s wishes by which he or she signifies agreement to the processing of personal data relating to him or her, by making a statement or a clear affirmative action.

Elements of Consent

SLIDE 13

EDPB Guidelines – Consent

  • To obtain valid consent, the data subject must be informed of a specific, explicit, legitimate purpose for the intended processing activity
  • If the controller later wishes to process the data for a new purpose, the controller must seek a new consent from the data subject for the new processing purpose
  • A controller that seeks consent for various purposes must provide a separate opt-in for each purpose

Specific

SLIDE 14

EDPB Guidelines – Consent

For consent to be informed, at least the following is required for obtaining valid consent:
  – The controller’s identity
  – The purpose of each of the processing operations for which consent is sought
  – What data or type of data will be collected and used
  – The existence of the right to withdraw consent
  – Information about the use of the data for decisions based solely on automated processing or profiling
  – If the consent relates to transfers, information about the possible risks of data transfers to third countries in the absence of an adequacy decision and appropriate safeguards

  • Where the consent sought is to be relied upon by multiple, joint controllers, or if the data is to be transferred to, or processed by, other controllers, these organizations should also be named.

Informed

SLIDE 15

EDPB Guidelines – Consent

  • When seeking consent, the controller should:
    – Use clear and plain language:
      • Intelligible
      • So that the data subject can easily identify who the controller is, and what they are agreeing to
      • Adapted to the targeted audience

How to Provide Information about Consent /1

SLIDE 16

EDPB Guidelines – Consent

– Distinguishable from other matters
  • In a separate document
  • Not hidden in general terms and conditions
– In an easily accessible form
  • Consider using a layered way of presenting information to avoid excessive disturbance of the user experience or product design.

How to Provide Information about Consent /2

SLIDE 17

EDPB Guidelines – Consent

  • It must be clear that the data subject has consented to the particular processing.

  • Use of pre-ticked boxes is invalid
  • Silence or inactivity is not an indication of choice

Unambiguous Indication of Wishes

SLIDE 18

EDPB Guidelines – Consent

Explicit consent is required where serious data protection risks emerge:

  • Article 9: Special categories of data

– Note that there are other ways to legitimize the collection and processing of special categories of data that do not rely on consent.

  • Article 49:

– Transfer of data to third countries in the absence of adequate protection
– Note that there are other ways to legitimize the cross-border transfer of personal data that do not rely on consent

  • Article 22:

– Automated decision-making including profiling

Consent v. Explicit Consent /1

SLIDE 19

EDPB Guidelines – Consent

  • How to obtain “explicit consent”:
    – “Double opt-in”:
      • The controller sends an email explaining the proposed use, and the data subject replies to the email “I agree”.
      • After the reply is sent, the data subject receives a verification code that must be clicked or used.

Consent v. Explicit Consent /2
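The double opt-in flow above can be sketched in code. This is a minimal illustrative sketch only: the function names, token scheme and in-memory stores are assumptions, and a real system would persist records and send actual emails.

```python
import secrets
import time

# Hypothetical in-memory stores (illustrative only).
pending = {}    # token -> (email, purpose, requested_at)
consents = {}   # email -> consent record

def request_explicit_consent(email: str, purpose: str) -> str:
    """Step 1: explain the proposed use and issue a verification token.

    In practice, the controller emails the data subject, who replies
    "I agree" and then receives this token as a link or code to click.
    """
    token = secrets.token_urlsafe(16)
    pending[token] = (email, purpose, time.time())
    return token

def confirm_consent(token: str) -> bool:
    """Step 2: the data subject uses the token, completing the double opt-in."""
    entry = pending.pop(token, None)
    if entry is None:
        return False  # unknown or already-used token: no valid consent
    email, purpose, requested_at = entry
    consents[email] = {
        "purpose": purpose,
        "requested_at": requested_at,
        "confirmed_at": time.time(),
    }
    return True

t = request_explicit_consent("alice@example.com", "marketing emails")
assert confirm_consent(t) is True
assert confirm_consent(t) is False  # a token cannot be reused
```

Popping the token on first use means a forwarded or replayed link cannot create a second consent record, which keeps the confirmation step an unambiguous act of the data subject.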

SLIDE 20

EDPB Guidelines – Consent

  • Controller must be able to demonstrate that data subject gave consent.

It must keep a record of consent statements received to show:
  – How consent was obtained
  – When consent was obtained
  – The information provided to the data subject at the time
  – That the data subject was informed
  – That the controller’s workflow met all relevant criteria for valid consent

  • As a best practice, consent should be refreshed at appropriate intervals.

Ability to Demonstrate Valid Consent
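One way such a record might be captured in code, as an illustrative sketch only (the field names are assumptions derived from the list above, not a prescribed format):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    """Evidence that valid consent was obtained (accountability)."""
    data_subject_id: str
    purpose: str                  # the specific purpose consented to
    method: str                   # how consent was obtained
    obtained_at: datetime         # when consent was obtained
    information_provided: str     # notice shown to the data subject at the time
    workflow_checks: List[str] = field(default_factory=list)  # criteria verified
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        """Consent counts only while it has not been withdrawn."""
        return self.withdrawn_at is None

record = ConsentRecord(
    data_subject_id="ds-001",
    purpose="newsletter",
    method="double opt-in via email",
    obtained_at=datetime.now(timezone.utc),
    information_provided="v3 consent notice",
    workflow_checks=["freely given", "specific", "informed", "unambiguous"],
)
assert record.is_active()
```

Keeping the notice text (or a version reference) alongside the timestamp is what lets the controller later show *what* the data subject was told, not merely *that* consent was ticked.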

SLIDE 21

EDPB Guidelines – Consent

  • The controller must ensure that the data subject can withdraw consent as easily as he/she gave it, and at any given time
  • The data subject must be informed of:
    – His/her right to withdraw consent
    – How to exercise this right
  • The method for withdrawing consent must be “as easy” as giving consent:
    – Using the same electronic interface as when giving consent
    – The data subject must be able to withdraw consent without detriment:
      • Free of charge
      • Without lowering the service level
  • The controller must stop the processing action concerned, and delete the data or anonymize it
    – Except if the data is to be used for other purposes, under a different lawful basis

Withdrawal of Consent

SLIDE 22

EDPB Guidelines – Transparency

Article 12 requires that communications made to data subjects be:

  • Concise
  • Transparent
  • Intelligible
  • Easily accessible
  • In clear and plain language
    – Especially when dealing with children and people with disabilities
  • In writing or by other means, including electronic means
    – May be provided orally if requested by the data subject
  • Provided free of charge

Requirements for Communications

SLIDE 23

EDPB Guidelines – Transparency

  • By providing it directly
  • By linking to it
  • By clearly signposting it
  • As an answer to a natural language question
  • In an online layered privacy statement
  • In FAQs
  • By way of contextual pop-ups that activate when a data subject fills in an online form
  • In an interactive digital context through a chat box interface

Easily Accessible

SLIDE 24

EDPB Guidelines – Transparency

  • Avoid complex sentence and language structures
  • Provide concrete and definitive information
  • Do not use abstract or ambivalent terms
  • Do not leave room for different interpretations
  • Provide a clear statement of the purposes of, and legal basis for, the processing
  • Avoid language qualifiers such as “may”, “might”, “some”, “often” and “possible”

Clear and plain language /1

SLIDE 25

EDPB Guidelines – Transparency

  • Use bullets and indents
  • Use the active instead of the passive form
  • Avoid excess nouns
  • Avoid overly legalistic, technical or specialist language or terminology
  • When targeting data subjects speaking other languages, provide translations, ensure that those translations are accurate, and ensure that phraseology and syntax make sense

Clear and plain language /2

SLIDE 26

EDPB Guidelines – Transparency

  • Layered privacy notice
    – Make sure that the layers do not provide conflicting information

  • Privacy dashboard
  • Just-in-time notice
  • Hard copy: paper, leaflets, flowcharts, cartoons
  • Telephone environment: oral explanation; pre-recorded information
  • Screenless smart technology, IoT, Wi-Fi tracking: icons, QR codes, voice alerts
  • CCTV, drone: public signage, visible boards

Formats for Providing Information

SLIDE 27

EDPB Guidelines – Transparency

  • As part of its accountability obligations, the data controller must be able to demonstrate that personal data is processed in a transparent manner
  • Accountability as applied to transparency applies:
    – At the point of collection of the data
    – Throughout the processing life cycle
    – When changing the content or terms of existing privacy statements:
      • Timing for notification of changes
      • Making data subjects aware of the risk

Accountability Obligations

SLIDE 28

EDPB Guidelines – Transparency

  • Accountability requires the controller or processor to be able to demonstrate:
    – The rationale for a decision
    – Why the decision was made, and how the potential impact on the data subject was evaluated
  • Example: when a data controller/processor seeks to rely on an exception, it is expected to:
    – Carry out an analysis of the exception, and how it applies to the particular situation
    – Assess the information against the impact and effects on the data subject of implementing that solution
    – Keep a record of the evaluation, analysis and decision

Demonstrating Accountability

SLIDE 29

EDPB Guidelines – Data Breaches

  • A “personal data breach” is a breach of security leading to the accidental or unlawful:
    – Destruction
    – Loss
    – Alteration
    – Unauthorized disclosure of, or
    – Unauthorized access to
    personal data transmitted, stored or processed
  • Three types of personal data breaches:
    – Confidentiality breach: disclosure, access (e.g., phishing)
    – Integrity breach: alteration (e.g., man-in-the-middle attack)
    – Availability breach: loss of access (even if temporary), destruction (e.g., denial of service)

Types of Data Breaches

SLIDE 30

EDPB Guidelines – Data Breaches

Notification requirements in the GDPR

RISK – Notification to the Supervisory Authority (Art. 33 GDPR)

A controller shall notify every data breach, UNLESS the breach is unlikely to result in a risk to the rights and freedoms of natural persons, e.g.:

  • If the disclosed data was already publicly available; or
  • If the data is unintelligible due to strong encryption.

Deadline: “Without undue delay and, where feasible, no later than 72 hours”.

HIGH RISK – Communication to Data Subjects (Art. 34 GDPR)

A controller shall notify data subjects of a breach if it is likely to result in a high risk to the rights and freedoms of natural persons, EXCEPT if:

  • The controller adopted protective measures on the affected data (e.g., encryption); or
  • The controller adopted measures which ensure that the risk won’t materialize.

Deadline: “Without undue delay”.
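The two-tier rule above can be expressed as a simple decision helper. This is an illustrative sketch only: the risk levels and field names are assumptions, and a real risk assessment is case-by-case, not a lookup.

```python
from datetime import datetime, timedelta, timezone
from enum import Enum

class Risk(Enum):
    NONE = 0   # unlikely to result in a risk (e.g., strongly encrypted data)
    RISK = 1   # risk to the rights and freedoms of natural persons
    HIGH = 2   # high risk

def notification_duties(risk: Risk, became_aware: datetime) -> dict:
    """Map an assessed risk level to Art. 33/34 duties and the 72-hour mark."""
    notify_sa = risk in (Risk.RISK, Risk.HIGH)   # Art. 33: any risk triggers SA notice
    notify_subjects = risk is Risk.HIGH          # Art. 34: only high risk
    return {
        "notify_supervisory_authority": notify_sa,
        # "without undue delay and, where feasible, no later than 72 hours"
        "sa_deadline": became_aware + timedelta(hours=72) if notify_sa else None,
        "notify_data_subjects": notify_subjects,  # "without undue delay"
    }

aware = datetime(2019, 10, 14, 9, 0, tzinfo=timezone.utc)
duties = notification_duties(Risk.HIGH, aware)
assert duties["notify_supervisory_authority"] and duties["notify_data_subjects"]
assert duties["sa_deadline"] == aware + timedelta(hours=72)
```

Note that the 72-hour clock runs from the moment of *awareness* (discussed on the next slides), not from the moment the breach occurred.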

SLIDE 31

EDPB Guidelines – Data Breaches

  • A controller becomes “aware” when it has a reasonable degree of certainty that a security incident has occurred and has led to the compromise of personal data.
  • The GDPR requires controllers to have in place appropriate technical and organizational measures to establish immediately whether a breach has taken place.
  • When a controller can be considered to be “aware” of a particular breach will depend on the circumstances of the specific breach.
  • The emphasis should be on prompt action to investigate an incident, to determine whether personal data have indeed been breached and, if so, to take remedial action and notify if required.

When the Controller Becomes Aware

SLIDE 32

EDPB Guidelines – Data Breaches

  • Detection
    – After first being informed by a third party, or having detected a security incident, the controller may undertake a short period of investigation in order to establish whether a breach has in fact occurred.
  • Investigation
    – During the period of investigation, the controller may not be regarded as being “aware”; however, the initial investigation should begin as soon as possible and establish with a reasonable degree of certainty whether a breach has taken place. A more detailed investigation can then follow.
  • Awareness
    – Once the controller has become aware, with a reasonable level of certainty, that a breach has occurred, and the conditions for notification in Art. 33(1) have been met, the breach must be notified without undue delay and, where feasible, not later than 72 hours.
    – During this period, the controller should assess the likely risk to individuals in order to determine whether the requirements for notification have been triggered, and the actions needed to address the breach.

Being “informed” is not being “aware”

SLIDE 33

EDPB Guidelines – Data Breaches

  • When assessing a breach, the controller should consider the specific circumstances of the breach, including the severity of the potential impact and the likelihood of occurrence, taking into account the following criteria:
    – Type of breach
    – Nature, sensitivity and volume of data
    – Ease of identification of individuals
    – Severity of the consequences to the individual
    – Special characteristics of the individuals (children, vulnerable individuals)
    – Special characteristics of the data controller
    – Number of affected individuals
  • See also ENISA’s “Recommendation for a Methodology of the Assessment of the Severity of Personal Data Breaches”: https://www.enisa.europa.eu/publications/dbn-severity

Criteria for assessing a breach

SLIDE 34

EDPB Guidelines – Data Breaches

  • The ability to detect, address and report a breach in a timely manner is an essential element of the requirement that each controller and processor have in place appropriate technical and organizational measures to ensure an appropriate level of security of personal data
  • The controller should have:
    – Internal processes in place to be able to:
      • detect a breach (e.g., data flow and log analyzers)
      • address a breach (e.g., upwards reporting to management)
    – An incident response plan
    – Arrangements with any processors that the controller uses:
      • Obligation to notify the controller
      • Specific requirements for prompt notification to help the controller meet the 72-hour deadline
    – A process for keeping documentation of the breach as it develops

Preparedness

SLIDE 35

EDPB Guidelines – Data Breaches

  • When a breach takes place in the context of cross-border processing:
    – The controller should notify the lead supervisory authority.
    – If the controller is not established in the EU/EEA, the controller’s EU Representative should notify the supervisory authority of the Member State where the EU Representative is located.

  • In case of doubt, (also) notify the local supervisory authority where the breach has taken place.
  • When drafting an Incident Response Plan, identify the applicable lead supervisory authority.

Cross-border Processing – Where to Notify?

SLIDE 36

EDPB Guidelines – Data Breaches

  • Data subjects should be contacted directly, unless doing so would involve a disproportionate effort.
    – In that case, the communication may be made using public communication.
  • Dedicated messages should be used, i.e., not combined with other information.
  • Examples of approved transparent communication methods:
    – Direct messaging (email, SMS, direct message)
    – Prominent website banners or notifications
    – Postal communications
    – Prominent advertisements in print media
    – Use of different languages
  • The EDPB encourages:
    – The use of means that maximize the chances of properly communicating the information, such as using several channels concurrently.
    – Cooperation with supervisory authorities and law enforcement.

How to Contact Affected Individuals

SLIDE 37

EDPB Guidelines – Data Breaches

  • Whether or not notification is required, the controller must keep documentation of all breaches.
  • Companies should establish a register of breaches and keep records of their assessment of each identified breach.
  • The supervisory authority can request to see these records:
    – The breach, its causes, what took place, and which personal data were affected
    – Effects and consequences of the breach
    – Remedial action taken by the controller
    – Reasoning for the decision to report / not report the breach
    – Reason for any delay in reporting
    – Any relevant evidence

Record Keeping Obligations
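Such a register might be modeled as below. This is an illustrative sketch only; the field names are inferred from the list above and are not a prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class BreachRecord:
    """One entry in an internal breach register (illustrative)."""
    occurred_at: datetime
    description: str                  # the breach and what took place
    causes: str
    data_affected: List[str]          # which personal data were affected
    effects: str                      # effects and consequences
    remedial_action: str
    reported_to_sa: bool
    reporting_rationale: str          # reasoning for reporting or not
    delay_reason: Optional[str] = None
    evidence: List[str] = field(default_factory=list)

register: List[BreachRecord] = []
register.append(BreachRecord(
    occurred_at=datetime(2019, 10, 1, 8, 30),
    description="Phishing email led to unauthorized mailbox access",
    causes="Compromised credentials",
    data_affected=["names", "email addresses"],
    effects="Possible disclosure of contact details",
    remedial_action="Password reset, MFA enabled",
    reported_to_sa=True,
    reporting_rationale="Risk to rights and freedoms could not be excluded",
))
assert len(register) == 1 and register[0].reported_to_sa
```

Recording the rationale even for *non*-reported breaches is the point of the register: it is what the supervisory authority will ask to see.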

SLIDE 38

EDPB Guidelines – Data Breaches

  • Obtaining cooperation of processors/vendors
  • Dependency on vendor(s); vendor(s) physically hold the data
  • Keep control tight (joint and several liability)
  • Upward trend of class actions by not-for-profits
  • Complex interaction with investigations by the DPAs
  • Courts may grant injunctive relief
  • Processors need a strategy to deal with multiple controllers
  • Data breaches often trigger investigations and fines by supervisory authorities

Practical Considerations

SLIDE 39

EDPB Guidelines - DPO

GDPR Art. 37(1):

  • The core activity of the controller/processor consists of processing:
    – that requires regular and systematic monitoring of data subjects on a large scale; or
    – of special categories of data on a large scale; or
    – of personal data relating to criminal convictions and offenses on a large scale; or
  • Processing is carried out by a public authority or body

EU Member State laws may add other grounds.

When is a DPO required?
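The Art. 37(1) triggers above amount to a simple disjunction, which can be sketched as follows (the predicate names are assumptions for illustration; the substantive questions behind each flag, such as "large scale", are discussed on the following slides):

```python
def dpo_required(
    is_public_authority: bool,
    core_regular_systematic_monitoring_large_scale: bool,
    core_special_categories_large_scale: bool,
    core_criminal_data_large_scale: bool,
) -> bool:
    """Art. 37(1) GDPR: a DPO is mandatory if any trigger applies.

    Member State law may add further grounds, which this sketch omits.
    """
    return (
        is_public_authority
        or core_regular_systematic_monitoring_large_scale
        or core_special_categories_large_scale
        or core_criminal_data_large_scale
    )

# A hospital processing health data (special categories) on a large scale:
assert dpo_required(False, False, True, False) is True
# A small shop with only routine payroll processing:
assert dpo_required(False, False, False, False) is False
```

The hard part in practice is not the disjunction but deciding whether each flag is true; the analysis behind that decision should itself be documented under the accountability principle.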

SLIDE 40

WHEN IS A DPO REQUIRED? (Art. 37 GDPR)

Source of this slide: DPO Network Europe

SLIDE 41

EDPB Guidelines - DPO

Definition

  • The key operations necessary to achieve the objectives of the business; and/or
  • The activities where data processing forms an inextricable part of the activity of the business
  • Not the routine activities of a company (e.g. payroll, IT support)

Examples

  • Hospital → processing of health data is a core activity, but payroll is not; payroll is “accessory” to the business.
  • Surveillance company → processing of visitors’ biometric data is a core activity.
  • Payroll service processor → processing of personal data to perform a hospital’s payroll activity is a “core activity” for the payroll processor.

Key Terms: Core Activity

SLIDE 42

EDPB Guidelines - DPO

Sample criteria
  – Number of individuals; proportion of the relevant population
  – Volume of data to be processed
  – Duration, permanence of the data processing activity
  – Geographical extent of the processing activity

Examples
  – Patient data in a hospital
  – Travel data using travel cards
  – Geolocation of customers
  – Processing of insured persons’ data by an insurance company

Key Terms: Large Scale

SLIDE 43

EDPB Guidelines - DPO

“Concept clearly includes all forms of tracking and profiling on the Internet, including for the purpose of behavioral advertising”

Examples
  – Location tracking
  – Loyalty programs
  – Retargeting; behavioral advertising
  – Monitoring wellness, fitness and health care data
  – CCTV; operating a telecom network; providing telecom services
  – Connected devices, smart cars, home automation
  – Profiling and scoring for risk assessment (e.g. credit scoring)
  – Fraud prevention; detection of money laundering
  – Establishment of insurance premiums

Key Terms: Regular and Systematic Monitoring

SLIDE 44

EDPB Guidelines - DPO

Clearly required? → Go to next slide

Not clearly required:
  • Conduct an analysis of the relevant factors; the analysis is part of the documentation under the accountability principle
  • If the conclusion is “no DPO needed”, keep a record of the analysis and of the conclusions
  • The analysis should be updated as the business, its activities and its services change

Risks
  • WP29 “encourages voluntary efforts” to appoint a DPO
  • Beware: if the person responsible for personal data is named “DPO”, the business is expected to treat that individual as a GDPR DPO. Use a different title to avoid confusion
  • WARNING: a DPO, once appointed, whether voluntarily or mandatorily, is designated for ALL processing operations conducted by the business

To DPO or not to DPO?

SLIDE 45

EDPB Guidelines - DPO

  • The DPO is not responsible in case of non-compliance with the GDPR
  • GDPR Art. 24(1): the business is required:
    – to ensure compliance
    – to be able to demonstrate that the processing is performed in accordance with the GDPR
  • The DPO:
    – Is independent
    – Should be given the possibility to make his/her dissenting opinion clear
    – Cannot be dismissed for providing his/her advice

DPO Responsibilities

SLIDE 46

EDPB Guidelines - DPO

GDPR Art. 38(3): the organization must enable the DPO to act in an independent manner:

  • No instructions by the business regarding the exercise of the DPO’s tasks
  • No dismissal or penalty for the performance of the DPO’s tasks
  • No conflict of interest with possible other tasks and duties. The DPO cannot wear several hats:
    – CEO, COO, CFO
    – CMO, HR, IT
    – Other roles that involve the determination of the purposes and means of the processing

Independence: No Conflicts

SLIDE 47

EDPB Guidelines - DPO

GDPR Art. 39(1)(b)

  • Collect information to identify processing activities
  • Analyze and check the compliance of the processing activities
  • Inform, advise and issue recommendations to the controller/processor

GDPR Art. 39(1)(d) & (e)

  • Cooperate with supervisory authority
  • Act as a contact point with the supervisory authority

Tasks: Monitor Compliance

SLIDE 48

EDPB Guidelines - DPO

GDPR Art. 35(2), 37(1)(c) – DPIA

  • The controller must seek the advice of the DPO when conducting a DPIA.
  • The DPO does not perform the DPIA, but provides advice on:
    – Whether or not to carry out a DPIA
    – What methodology to follow when carrying out the DPIA
    – Whether to carry out the DPIA in-house or to outsource it
    – What technical and organizational measures should be used to mitigate any risks to the rights and interests of the data subjects
    – Whether the DPIA has been correctly carried out
    – Whether its conclusions (to go ahead, or to apply safeguards) are in line with the GDPR

Data Protection Impact Assessments

SLIDE 49

EDPB Guidelines - DPO

  • Key factor: accessibility
  • The EDPB:
    – Recommends that the DPO be located in the EU, whether or not the business is established in the EU, BUT
    – States that where a business has no establishment in the EU, a DPO may be able to carry out his/her activities more effectively if located outside the EU

Location of DPO

SLIDE 50

EDPB Guidelines - DPO

DPO must be designated on the basis of professional qualities and in particular expert knowledge:

  • Expertise in national and EU data protection laws & practices
  • In-depth understanding of GDPR
  • Understanding of the processing operations to be carried out
  • Understanding of information technology and data security
  • Knowledge of the business sector and other organizations
  • Ability to promote a data protection culture within the organization.

DPO’s Professional Qualities

SLIDE 51

Internal DPO

PROS
  • Deeper understanding of the company’s privacy culture.
  • Easy access to processing operations and relevant stakeholders.
  • May be elected among trusted professionals.
  • May be associated with management more easily compared to an external DPO.
  • Appears as the face of the company towards the SA and individuals, which enhances trust.
  • May have access to sensitive and confidential business information without fear of external disclosure.

CONS
  • May be difficult to find due to hyper-specialization of the role.
  • May implement restrictive policies due to a strict interpretation of the GDPR.
  • Has a protected status similar to that of a union representative.

External DPO

PROS
  • Entirely dedicated to privacy.
  • Can leverage experience from other clients.
  • Flexibility re work arrangements and risk allocation due to the contractual relationship.
  • Ability to tailor a solution to the company’s temporary or urgent needs.
  • May complement where an internal DPO is in place, e.g. consultation for specific projects such as inventory preparation.

CONS
  • Less familiar with the company’s privacy culture and expectations.
  • Less direct access to processing operations, management, and relevant stakeholders in the company.
  • May not find his or her place as a team player within the company.
  • Less involvement in strategy.
  • May value personal interest in keeping the client over the company’s interest.

Should We Appoint an Internal or an External DPO?

SLIDE 52

Session A: GDPR: Where are we now?

  • 3. Member State Laws

SLIDE 53

Introduction

  • The Regulation has direct effect…
  • … but allows Member States to specify or derogate on certain points, e.g.:
    – Organization of Supervisory Authorities is a national matter
    – Restrictions to data subject rights (Art. 23 GDPR)
    – Employment-related processing, scientific research, national ID, … (Chapter IX GDPR)

SLIDE 54

Member State Laws

Belgium: Data Protection Act 2018

SLIDE 55

Member State Laws

BE Data Protection Act 2018

  • I. The Act of 3 December 2017 establishing the Data Protection Authority:
    – Reshapes the pre-GDPR Belgian regulator officially into the ‘Belgian Data Protection Authority’
    – Grants it new powers in accordance with the GDPR
  • II. The Act of 30 July 2018 on the protection of natural persons with regard to the processing of personal data:
    – Implements / supplements provisions of the GDPR
    – Covers the interaction between law enforcement and data protection law

SLIDE 56

Member State Laws

Belgium: Data Protection Act 2018 example provisions

  • Limitation of data subject rights (e.g. the right to be informed about the processing and the right to obtain access to data) in the context of law enforcement and national security.
  • The age at which a child can consent to the processing of their personal data for online purposes is set at 13 years of age.
  • Safeguards for the processing of criminal convictions and offences or related security measures:
    – Limited purposes available (e.g. necessary for the management of disputes; for scientific, historical or statistical research; or archiving).
    – The controller must implement the same safeguards as for the processing of health data.
  • Safeguards for the processing of genetic, biometric or health data:
    – Companies must keep a list of the categories of persons with access to such sensitive data, with a role description in relation to the data.
    – Specific regime for the processing of data for scientific purposes.

SLIDE 57

Member State Laws

BE Data Protection Authority

  • Operational since April 25, 2019
  • “Dispute resolution chamber” and dedicated “inspection body”
  • Broad investigation powers (e.g. unannounced on-site inspections, orders to stop processing, penalty payments, publication of decisions on its website)
  • https://www.wsgrdataadvisor.com/2019/04/belgian-dpa/

SLIDE 58

Member State Laws

France: Data Protection Act 2018

SLIDE 59

Member State Laws

French Data Protection Act 2018

The French Data Protection Act No. 78-17 (“DPA”) was first adopted on 6 January 1978. Its current version is 50 pages long (approx. 40,000 words).

  • The DPA was amended to implement the GDPR via Law No. 2018-493 of 20 June 2018. It was subsequently revised via Ordinance No. 2018-1125 of 12 December 2018 and complemented via implementing Decree No. 2019-563 of 29 May 2019.
  • The revised DPA reiterates pre-GDPR data protection principles and focuses on derogations and other aspects for which the GDPR allows national deviations.
  • The recent revision of the DPA is also aimed at implementing the provisions of EU Directive 2016/680 (the “Police Directive”) under French law.

SLIDE 60

Member State Laws

France: Data Protection Act 2018 example provisions

Processing of personal data post- mortem

  • The DPA provides for a French-specific

right which allows individuals to give instructions on how their personal data is stored, deleted, and further disclosed after their death.

  • An individual may be appointed to

carry out these instructions and request their implementation from the data controller.

  • Data subjects have the right to be

informed of their post-mortem rights.

  • Scope. The provisions of the DPA

which deviate from, or complement the GDPR, apply to individuals who reside in France even if the data controller is established outside of France.

  • Limitation of data subject rights. In accordance with Article 23 GDPR, the DPA provides some restrictions on data subjects’ rights, including limitations on the right to be informed about the processing in the context of law enforcement and national security.

  • Consent from minors. The age at which a child can consent to the processing of their personal data for online purposes is set at 15 years of age.

  • Specific “right to be forgotten” for minors. Where the data subject was a minor at the time of data collection, he or she can obtain the deletion of the data “as soon as possible”.

slide-61
SLIDE 61

Member State Laws

United Kingdom: Data Protection Act 2018

slide-62
SLIDE 62

Member State Laws

UK Data Protection Act 2018

The DPA received Royal Assent on 23 May 2018. It is 339 pages long (approx. 136,000 words).

  • The DPA recognises that most processing is subject to

the GDPR

  • Applies a broadly equivalent regime to certain types of

processing not in the GDPR (“applied GDPR”)

  • Implements EU Law Enforcement Directive
  • Implements Convention 108 with regards to processing

by intelligence services

  • Articulates role of the Information Commissioner
  • Sets out the enforcement regime
slide-63
SLIDE 63

Member State Laws

slide-64
SLIDE 64

Member State Laws

UK: Data Protection Act 2018 example provisions

Criminal offences which can be prosecuted by the ICO include:

  • Selling or offering to sell unlawfully obtained data
  • Retention of personal data without the consent of the controller
  • Unauthorised re-identification of de-identified information
  • Alteration of data to prevent disclosure following a data subject rights request

Courts may impose fines; imprisonment is subject to other acts, e.g. the Computer Misuse Act.

  • Government may specify limits on the fees that a controller may apply when dealing with subject access requests (e.g. for unfounded or excessive requests)
  • Controller must provide an appeal process for “qualifying significant decisions” based solely on automated processing
  • Govt. may make further provisions to safeguard data subject rights in relation to automated decision-making
  • The age at which a child can consent to the processing of their personal data for online purposes is set at age 13
  • Processing of special categories/criminal data, or restrictions of rights, where substantial public interest, e.g.:
  • Fraud prevention
  • Suspicion of terrorist financing or money laundering
  • Insurance
  • Political parties
  • Sport (e.g. anti-doping)
slide-65
SLIDE 65

Member State Laws

Ireland: Data Protection Act 2018

slide-66
SLIDE 66

Member State Laws

Ireland: Data Protection Act 2018

The Data Protection Act 2018 was signed into law on 24 May 2018. It is 184 pages long (approx. 68,000 words). Its implementation has the following key effects:

  • It transposes the Law Enforcement Directive.
  • It retains elements of the Data Protection Acts

1988 and 2003 for purposes of national security, defence and international relations of the State.

  • It contains new substantial enforcement powers

for the Irish Data Protection Commission ('DPC’).

slide-67
SLIDE 67

Member State Laws

Ireland: Data Protection Act 2018 example provisions

Criminal offences under the DPA include:

  • Disclosure of personal data

without prior authorization from the data controller

  • Unauthorized disclosure by a

data processor

  • Selling of data following

unauthorized disclosure

  • Obstruction of an authorized officer under the DPA

Both fines and imprisonment are available to courts. The Act permits the processing of health data for insurance and pension purposes where it is necessary and proportionate for the purpose of:

  • A policy of insurance or life

assurance

  • A policy of health insurance or

health-related insurance,

  • An occupational pension,

retirement annuity contract or pension arrangement

  • The mortgaging of property
  • The age at which a child can consent to the processing of their personal data for online purposes is set at 16 years of age.
  • A right to be forgotten for children. Controllers must erase ‘without undue delay’ personal data collected in relation to the offer of online services to a child if requested by a data subject.

slide-68
SLIDE 68

Session A: GDPR: Where are we now?

  • 4. Enforcement
slide-69
SLIDE 69

The New Enforcement Paradigm

Extended Powers of Data Protection Authorities (“DPAs”)

Investigative Powers

  • Order companies to provide information
  • Auditing
  • Dawn raids (obtain access to premises, including equipment/means)

Corrective Powers

  • Issue warnings/issue reprimands
  • Order to bring processing into compliance (in a specified manner/time period)
  • Order to communicate personal data breaches to individuals
  • Order ban on processing/suspension of data flows
  • Withdraw certifications
  • Impose administrative fines

Specific Remedies

  • Right to submit complaints with DPA (Art. 77 GDPR)
  • Effective remedy against a DPA (Art. 78 GDPR)
  • Effective judicial remedy against a controller or processor (Art. 79 GDPR)
  • Representation of data subjects by not-for-profit bodies (Art. 80 GDPR)

Liability Regime

  • Civil Liability. Right to receive compensation from a controller or processor for damage suffered as a result of a GDPR infringement (Art. 82 GDPR)
  • Administrative Fines. DPAs must ensure “effective, proportionate and dissuasive” application of administrative fines (Art. 83 GDPR)
  • Criminal Liability. Left to national law; reserved for the most serious violations and often requires “intent”

Powers and Remedies

slide-70
SLIDE 70

The New Enforcement Paradigm

The Numbers

  • Most common triggers for investigations:
  • 51% complaints by data subjects
  • 32% data breach notifications
  • 63% of complaints and notifications until May 2019 have been closed; 37% are still pending
  • Most DPAs have increased their budget and staff to face increasing workload

slide-71
SLIDE 71

Evolution Cross-Border Cases

Source: European Data Protection Board

slide-72
SLIDE 72

Consumer Rights Activism

slide-73
SLIDE 73

Investigations on the Rise

Dutch DPA investigates the data processing agreements of 30 organizations. CNIL carried out 310 investigations, including in relation to CCTV systems and data subjects’ rights. Greek DPA carried out investigations on 65 controllers. Irish DPC initiated investigations on more than 45 companies (e.g., Facebook, Twitter, and Instagram). Swedish DPA investigates Google for collection of location data and web browsing histories.

  • Generally not much visibility
  • DPAs are getting up to speed
  • Expect the number of investigations to increase
  • Proactive vs reactive

* May 2019 figures

slide-74
SLIDE 74

Fines and Litigation on the Rise

slide-75
SLIDE 75

Fines and Litigation on the Rise

Key UK Enforcement Decisions

  • July 2019: Intention to fine Marriott International, Inc. £99 million (approx. $120 million)
  • A cyber incident exposed 339 million records (30 million belonged to EEA residents, of which 7 million were in the UK)
  • Incident related to a vulnerability in the systems of Starwood Hotels, which Marriott subsequently acquired
  • ICO found that Marriott had undertaken insufficient due diligence when it acquired the Starwood system

  • July 2019: Intention to fine British Airways £183 million (approx. $224 million)
  • Incident resulted from user traffic to the BA website being diverted to a fraudulent site. About 500,000 records were compromised
  • ICO found that information was compromised by poor security arrangements, including log-in, payment card and travel booking details, names and addresses

Both cases were investigated under the GDPR one-stop shop system with the UK acting as lead authority. They are both awaiting a final decision following representations from the companies and DPAs of affected EU residents.

slide-76
SLIDE 76

Your #1 Priority: Info Sec

  • 50,000 EUR fine by Italian DPA for failure to use strong security for passwords
  • 20,000 EUR fine by DPA of Baden-Württemberg for storage of passwords in clear text without pseudonymization or encryption
  • 400,000 EUR fine by CNIL for lack of authentication measures controlling access to rental documents
  • Up to 900,000 EUR fine by Dutch DPA for lack of multi-factor authentication in the context of health-related data processing
  • 61,500 EUR fine by Latvian DPA for failure to implement effective access logs
  • 400,000 EUR fine by Portuguese DPA for absence of regular checks of a hospital’s access logs

slide-77
SLIDE 77

Calculation of Fines

  • Perceived lack of transparency
  • Perceived inconsistency

Default fine / fine range / example:

  • Cat. 1: 100 K EUR / 0 – 200 K EUR / insufficient record keeping
  • Cat. 2: 310 K EUR / 120 K – 500 K EUR / insufficient processing agreements, independence of DPO
  • Cat. 3: 525 K EUR / 300 K – 750 K EUR / failure to notify breaches, lack of transparency
  • Cat. 4: 725 K EUR / 450 K – 1 M EUR / unlawful processing

German DSK Fine Methodology

[Group turnover / 360 days] × severity factor (1 – 14.4) × classification score (multiplier) × aggravating/mitigating factors
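The DSK formula above multiplies a turnover-based daily rate by several factors; a minimal sketch of that arithmetic is below. The function name and the example factor values are our own illustrative assumptions, not the official DSK tables.

```python
# Hedged sketch of the German DSK fine methodology described above.
# The concrete factor values used in the example are illustrative
# assumptions, not the official DSK classification tables.

def dsk_fine(group_turnover_eur: float,
             severity_factor: float,
             classification_multiplier: float,
             adjustment: float = 1.0) -> float:
    """(Group turnover / 360 days) x severity factor (1-14.4)
    x classification score x aggravating/mitigating factor."""
    daily_rate = group_turnover_eur / 360  # turnover-based daily rate
    return daily_rate * severity_factor * classification_multiplier * adjustment

# Example: EUR 36M group turnover, mid-range severity, neutral adjustments.
# Daily rate = 100,000 EUR; fine = 100,000 x 4.0 x 1.5 = 600,000 EUR.
fine = dsk_fine(36_000_000, severity_factor=4.0, classification_multiplier=1.5)
```

The key design point of the methodology is that the starting amount scales with group turnover, so the same violation category yields proportionally larger fines for larger companies.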

slide-78
SLIDE 78

Triggers For Enforcement Actions

Triggers

  • Individual initiates court proceedings against company
  • NGO initiates court proceedings against company
  • Individual/competitor files complaint with DPA
  • NGO files complaint with DPA
  • DPA investigates issue on its own initiative

  • High profile security incident
  • Issues in another country
  • Novel product or service
  • Privacy scandal
slide-79
SLIDE 79

A Few Possible Proceedings

Civil (interim measures)

First instance Appeals Court Supreme Court

Civil (on the merits)

First instance Appeals Court Supreme Court

Administrative

DPA Administrative Appeals Court Administrative Supreme Court

Criminal

First instance Appeals Court Supreme Court

EDPB Decisions

CJEU

Procedural law is a matter of national law – harmonization in EU is lacking

slide-80
SLIDE 80

How to Challenge EDPB and DPA Decisions?

EDPB decisions can be challenged before CJEU

  • Bring action for annulment of EDPB decision before CJEU
  • Limited circumstances under which action can be brought (Art. 263 TFEU)
  • Only the DPA to whom the EDPB decision is addressed and those who are (individually and directly) concerned by the EDPB decision can bring an action
  • Within 2 months following publication of the EDPB decision on the EDPB website or of notification
  • Lengthy procedure before CJEU

DPA decisions can be challenged before national courts

  • Challenge DPA decision before national courts of DPA’s country, under procedural law of that country
  • National courts exercise full jurisdiction and examine questions of fact and law
  • National courts may always ask CJEU how to interpret EU law
  • If a DPA decision implementing an EDPB decision is challenged, the EDPB decision must be forwarded to the national court
  • If a national court considers the EDPB decision to be invalid, the CJEU must rule on validity (but it cannot be challenged by parties who had the possibility to bring an action for annulment)

slide-81
SLIDE 81

Looking Forward

  • DPAs state they were only “flexing their muscles” in 2018 – period of transition

is over and organizations must move beyond a base level of compliance

  • CNIL and ICO have issued statements that they will start to use the full breadth of their regulatory powers, including fines
  • Regulators gained additional resources since May 2018
  • Top priorities:
  • Strong accountability programs (including knowledgeable DPO) [ICO]
  • Focus on how companies deal with data subject rights [CNIL]
  • Focus on controller/processor designations and what this means in terms of direct and contractual responsibility [CNIL]
  • Focus on direct marketing via its Direct Marketing Code [ICO]
slide-82
SLIDE 82

Enforcement Focus – Primary Work Streams

  • Where it can go wrong
  • Information security and data breach
  • Complaint management
  • Vendor management
  • Interaction with individuals and SAs
  • Data subject rights
  • Processing records
  • Documentation of compliance efforts
  • Core compliance – outward facing compliance
  • Core policies and notices
  • Data transfers
slide-83
SLIDE 83

Session B: GDPR: What is happening across EU and across Borders

1. Cross-border Data Transfers 2. CJEU Rulings 3. Brexit 4. Influence on Global Laws

slide-84
SLIDE 84

Session B: GDPR: What is happening across EU and across Borders

  • 1. Cross-border

Data Transfers

slide-85
SLIDE 85

GDPR Data Transfer: Decision Tree

  • 1. Is country of import white-listed? (can also be a sector or region) (Art. 45) *
  • 2. Appropriate safeguards? (Art. 46)
  • BCRs
  • EC Standard Contractual Clauses
  • National Standard Contractual Clauses
  • Codes of Conduct
  • Certification
  • Ad hoc Contract
  • 3. Derogations? (Art. 49)
  • Consent
  • Performance of contract
  • Important reasons of public interest
  • Legal claims
  • Vital interests
  • Legitimate interest + notification

* EU-US Privacy Shield would be a “white-listed” regime. * Codification of WP29 Guidance; preference for appropriate safeguards over derogations.
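The three-step sequence above (adequacy, then safeguards, then derogations) can be sketched as a simple decision function. The function and parameter names are our own; this is a simplification of the slide's decision tree, not legal advice.

```python
# Illustrative sketch of the Art. 44-49 GDPR transfer decision tree above.
# Names are our own; real analysis requires assessing each mechanism's validity.

def transfer_basis(country_whitelisted: bool,
                   has_safeguards: bool,
                   has_derogation: bool) -> str:
    # Step 1: adequacy decision for the importing country/sector/region (Art. 45)?
    if country_whitelisted:
        return "transfer permitted under adequacy decision (Art. 45)"
    # Step 2: appropriate safeguards such as BCRs or SCCs (Art. 46)?
    if has_safeguards:
        return "transfer permitted under appropriate safeguards (Art. 46)"
    # Step 3: narrow derogations, e.g. explicit consent, legal claims (Art. 49)?
    if has_derogation:
        return "transfer permitted under a derogation (Art. 49)"
    return "transfer not permitted"
```

The ordering mirrors the WP29 codification noted above: safeguards are checked before derogations, because derogations are interpreted narrowly and used only as a last resort.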

slide-86
SLIDE 86

Data Transfer Mechanisms Compared

Under the GDPR (Arts. 44-50), which supersedes the Data Protection Directive (Arts. 25, 26) effective May 25, 2018:

  • Transfers are permissible only where the third country ensures an adequate level of protection, appropriate safeguards are in place, or a derogation applies
  • The GDPR transfer regime is similar to the Directive’s but more detailed
  • Controllers and processors are responsible for compliance (under the Directive, controllers only)

Mechanisms compared (scope / legal certainty / burden):

  • Commission Adequacy: limited to countries recognized as providing adequate protection / high legal certainty / low burden
  • Privacy Shield: EU to US transfers only / invalidation risk / medium burden
  • BCRs (GDPR Art. 47): intra-group transfers, flexible, regulator approved / high legal certainty / high upfront burden, low ongoing
  • Model Contracts: limited to contracting parties / invalidation risk likely / medium burden, documentation required
  • Codes of Conduct/Seals (GDPR Arts. 40-43): sector/company specific; new, mechanisms must be developed; seals valid for up to 3 years with option to renew / high upfront burden, intensive maintenance
  • Derogations (e.g., explicit consent, contractual performance): limited in scope, narrowly interpreted, limited to certain business situations / low legal certainty
slide-87
SLIDE 87

Update on Data Transfers

  • Less red tape
  • Schrems II – the big unknown
  • Invalidation of SCCs?
  • Invalidation of Privacy Shield?
  • Effect on BCRs?
  • New model clauses in the pipeline (?)
  • Certification and Codes of Conduct taking a slow start

slide-88
SLIDE 88

BCRs under the GDPR – What’s New?

  • Formal recognition by the European Legislator (Art. 47 GDPR)
  • Data transfer permit requirements have been abolished
  • Available to a “group of enterprises engaged in a joint economic activity”
  • Working Party Documents on BCRs have been overhauled:

– WP 256 rev.01 – Content requirements BCR-C – WP 257 rev.01 – Content requirements BCR-P – WP 263 rev.01 – Cooperation Procedure for Approval of BCRs (Procedural Guideline) – WP 264 – Application Form for BCR-C – WP 265 – Application Form for BCR-P

  • Pre-GDPR BCRs must be updated (!)
slide-89
SLIDE 89

The Advantages of Implementing BCRs

A flexible, scalable mechanism for intra-group transfers
  – Accommodates new data flows, group members and geographies
  – Provides legal certainty for transfers
  – Enhances customer confidence

A foundation for a coherent global privacy program
  – Necessitates mapping of existing data flows and controls
  – Harmonizes approach to privacy within group and raises awareness
  – Drives future compliance measures
  – Signals corporate responsibility to DPAs

A means to achieve compliance with the GDPR or to benefit from GDPR compliance efforts
  – SA approval will hinge on compliance with EU rules, including new GDPR requirements
  – Get to know your Lead SA
  – Companies that implement BCRs are less likely to face the GDPR’s potentially draconian sanctions

Fosters compliance with other national data protection laws (e.g. CCPA)

slide-90
SLIDE 90

Approval Process from the Regulator’s Perspective

* Concerned SAs are determined based on the countries from where transfers are to take place or, in the case of a BCR-P, all SAs since a processor established in a Member State may provide services to controllers in potentially all Member States.

New!

slide-91
SLIDE 91

Can BCRs be Invalidated?

What can data subjects or SAs do in theory?

  • Individuals can submit complaints to SAs
  • Individuals can sue SAs that do not take action before national courts and obtain injunctive relief (Art. 78 GDPR) – national courts may obtain an ECJ preliminary ruling
  • SAs can also act autonomously and stop data exports in specific cases (Art. 58 GDPR)
  • SAs can, at the same time, trigger the “urgency procedure” (Art. 66 GDPR) and obtain an EDPB “binding decision” compelling other SAs also to stop data exports
  • Consumer organizations can do the same (Art. 80 GDPR)

Some risk to BCRs and adequacy

  • Same U.S. government protections as for SCCs and Privacy Shield
  • BCRs cannot stop surveillance by non-EU national authorities

On the other hand, significant legal arguments support BCRs

  • GDPR itself mentions and strongly supports BCRs
  • Considered the gold standard for protecting personal data
  • No pending litigation against BCRs: thus, even if the CJEU holds against SCCs and Privacy Shield, that ruling would have no direct legal bearing on the adequacy of BCRs.

slide-92
SLIDE 92

Session B: GDPR: What is happening across EU and across Borders

  • 2. CJEU Rulings
slide-93
SLIDE 93

Recent CJEU Judgments

  • Joint controllers
  • Both the community and its

members are joint controllers regarding data collected by members in the context of door- to-door preaching organized by the community.

  • Irrelevant that the community

does not have access to the specific data.

  • Joint controllers
  • A Facebook fan page

administrator is a joint controller of the data collected through cookies on the page.

  • Irrelevant that page

administrators do not have access to cookies, but only to aggregate statistics.

Wirtschaftsakademie Schleswig-Holstein (June 5, 2018) / Jehovan todistajat (July 10, 2018)

slide-94
SLIDE 94

Recent CJEU Judgments

Joint controllers for (i) collection and (ii) transmission; controller for further processing.

ECJ: By integrating FB’s code into its website, FashionID has allowed the collection and transmission of personal data to FB. FashionID is also considered to obtain an economic advantage (optimizing advertisement; visibility on FB platform).

slide-95
SLIDE 95

Recent CJEU Judgments

Fashion ID (July 29, 2019)

  • Joint control. Website operator and plugin providers are joint controllers.
  • Website operator is responsible for notice and consent, since the data is collected and transmitted as soon as individuals visit the website. However, the ECJ did not exclude the possibility to rely on legitimate interest.

  • No decision as to whether the Facebook “like button” involves storing or

access triggering consent under Directive 2002/58.

slide-96
SLIDE 96

Recent CJEU Judgments

Action Points:

  • 1. Identify and assess use of third-party plug-ins in

websites/apps

  • 2. Review notice and consent strategy
  • 3. Review data privacy terms in contracts with plug-

in providers

slide-97
SLIDE 97

Recent CJEU Judgments

I want to receive marketing

I allow analysis and third party advertising cookies

Planet49 GmbH (October 1, 2019) Questions to ECJ: (i) is the practice to drop cookies based on pre-ticked boxes compatible with ePrivacy Directive? ; (ii) what information must be provided to meet obligations under same Directive?

slide-98
SLIDE 98

Recent CJEU Judgments

Planet49 GmbH (October 1, 2019)

  • Active consent. No implicit consent; consent requires a positive action; no pre-ticked boxes.
  • Users must be informed of cookie duration and third-party access to cookies. Provide sufficient information on cookie functionality.
  • Consent is required for storage of information.
slide-99
SLIDE 99

ICO/CNIL Cookie Guidance

  • ECJ caselaw confirms position of ICO/CNIL
  • Consent for all cookies, except essential cookies
  • No implied consent
  • Consent should include a list of third-party cookies
  • Cookie walls are disfavored (not allowed in France)
slide-100
SLIDE 100

Session B: GDPR: What is happening across EU and across Borders

  • 3. Brexit
slide-101
SLIDE 101

Brexit – Background /1

  • UK joined the European Communities on 1 January

1973

  • UK referendum on membership June 1975: 67%

voted Yes to stay in EC, 33% voted No

  • Key EU milestones during UK membership:
  • First European Parliament elections held (1979).
  • Maastricht Treaty (1992): European Union

formally established.

  • Euro introduced (2002): UK does not

participate.

  • 2004 expansion. Largest single enlargement of

the EU

  • Lisbon Treaty (2007): European Council formed,

Charter of Fundamental Rights has legal effect.

slide-102
SLIDE 102

Brexit – Background /2

  • UK referendum on membership June 2016: 52% voted to leave

EU, 48% voted to remain. PM David Cameron resigns, Theresa May becomes PM

  • UK letter invoking Article 50 (Treaty on the European Union)

exit procedure delivered to EU on 29 March 2017. General election held June 2017.

  • UK’s withdrawal from the EU was set for 29 March 2019.
  • Then delayed to 12 April 2019, then delayed to 31 October

2019

  • UK participates in European Elections May 2019
  • Prime Minister Theresa May resigns, Boris Johnson becomes

PM 24 July 2019

  • European Council meets on 17/18 October 2019
slide-103
SLIDE 103

Brexit – Current Position

On 25 November 2018, European Council approved a number of negotiating documents including:

  • Draft Withdrawal Agreement (585 pages): the legally binding ‘divorce’

agreement covering citizens’ rights, the financial settlement (c. £39 billion GBP) and the transition period up until end of 2020 (extendable for one or two years).

  • Political Declaration on the Future Relationship between the EU and the

UK (26 pages): a non-binding statement on the future economic and security relationship including a shared commitment to maintaining high personal data protection standards.

slide-104
SLIDE 104

Brexit – What Next?

  • The Protocol on Ireland and Northern Ireland (the Backstop) has been the main obstacle to UK Parliament agreement of the Withdrawal Agreement
  • CJEU ruled that the UK could unilaterally revoke its Article 50 withdrawal notice, but this appears unlikely at present

  • Prime Minister states that UK must leave EU on 31 October 2019 “do or

die” but Parliament had legislated to prevent No Deal exit

  • Deal: under the transition period, EU law will continue to apply in the UK

but the UK will not participate in EU institutions and structures

  • No Deal: UK will become a third country vis-à-vis the EU upon exit
slide-105
SLIDE 105

Brexit – Impact on Data Transfers

Deal: data transfers from the EEA to the UK can continue in accordance with current practice until the end of the transition period.

No Deal: appropriate data transfer instruments, such as standard data protection clauses or binding corporate rules, must be implemented to enable the transfer of personal data from the EEA to the UK. UK-based companies with ICO-approved BCRs must identify the most appropriate Lead Supervisory Authority in an EU Member State in order for the BCRs to remain valid.

Deal or No Deal: UK adequacy negotiations will commence, but there may be challenges to that decision being made (e.g. national security processing concerns, omission of GDPR Article 48).

Position in UK

  • UK already recognises EEA countries as adequate so UK to EEA transfers are permitted.
  • UK will permit transfers to countries and territories that the EU has determined to be adequate (e.g. Switzerland, Argentina, Israel, the Faroe Islands and British Crown Dependencies), and existing SCCs can be used

  • Under the Privacy Shield, organisations based in the U.S. will need to update their commitments to include

the UK

slide-106
SLIDE 106

Brexit – Regulatory and Policy Impact

Regulatory Affairs

Deal: One-stop shop will still apply to data controllers and processors based in the UK during the transition period in terms of cross-border data processing activities that will also apply in the EU. UK may be invited to attend EDPB on a case-by-case basis but will not have a vote.

No Deal: One-stop shop will not apply. UK-based controllers must identify the most appropriate lead authority in the EU in order to use the one-stop shop. Otherwise, they should appoint an EU-based representative.

Deal or No Deal: UK ICO will seek to maintain relationships with EDPB but will also seek to extend influence in the wider world.

Public Policy

  • ‘UK GDPR’ comes into effect upon exit day – amends application of GDPR in UK and Data Protection Act

2018

  • Upon exit, UK will no longer participate in negotiations in the Council of the EU, attend the European

Council, have MEPs or participate in institutions and agencies of the EU

  • UK may make own rules and adequacy decisions following exit
slide-107
SLIDE 107

Brexit – Regulatory and Policy Impact

  • Check instruments in place to enable

cross-border transfers following exit

  • Check whether a new one-stop shop

lead authority should be identified or an EU-based representative should be appointed

  • Consider whether a new BCR lead

authority needs to be identified

  • Monitor the political situation and check

regulators’ advice (ICO, EDPB and other EU DPAs) on a regular basis

  • Plan for all contingencies
slide-108
SLIDE 108

Session B: GDPR: What is happening across EU and across Borders

  • 4. Influence on

Global Laws

slide-109
SLIDE 109

Influence on Global Laws

California Consumer Privacy Act (CCPA):

  • Similarities with GDPR include obligations on transparency, access, deletion rights,

security obligations, parental or guardian consent, requirements to update privacy policies and financial penalties

  • Differences include limited scope (threshold revenue for business), application to consumer data only and no specific data protection authority

Brazil General Data Protection Law (LGPD)

  • Similarities with GDPR include extraterritorial scope where processing involves offering goods or services to individuals in Brazil, data subject rights and lawful grounds for processing, and privacy by design and by default
  • An independent DPA was originally excluded from the law, but an amendment to the law created the National Data Protection Authority (ANPD)

slide-110
SLIDE 110

Session C: New Issues for 2020 and Beyond

1. ePrivacy Regulation 2. Convention 108+ 3. Digital Ethics 4. AI in the EU

slide-111
SLIDE 111

Session C: New Issues for 2020 and Beyond

  • 1. ePrivacy

Regulation

slide-112
SLIDE 112

ePrivacy Regulation

ePrivacy Regulation (ePR)

CHAPTER I: GENERAL PROVISIONS (Art 1: Subject matter; Art 2: Material scope; Art 3: Territorial scope; Art 4: Definitions; Art 4a: Consent)

CHAPTER II: PROTECTION OF ELECTRONIC COMMUNICATIONS OF END-USERS AND OF THE INTEGRITY OF THEIR EQUIPMENT (Art 5: Confidentiality of ECD; Art 6: Permitted processing of ECD; Art 7: Storage and erasure of ECD; Art 8: Protection of end-users’ terminal equipment)

CHAPTER III: END-USERS’ RIGHTS TO CONTROL ELECTRONIC COMMUNICATIONS (Art 10: Information and options for privacy settings to be provided; Art 11: Restrictions; Art 12: Presentation and restriction of calling and connected line identification; Art 13: Exceptions to presentation and restriction of calling ID; Art 14: Incoming call blocking; Art 15: Publicly available directories; Art 16: Unsolicited and direct marketing communications)

CHAPTER IV: INDEPENDENT SUPERVISORY AUTHORITIES AND ENFORCEMENT (Art 18: Independent supervisory authorities; Art 19: European Data Protection Board; Art 20: Cooperation and consistency procedures)

CHAPTER V: REMEDIES, LIABILITIES AND PENALTIES (Art 21: Remedies; Art 22: Right to compensation and liability; Art 23: General conditions for imposing administrative fines; Art 24: Penalties)

CHAPTER VI: DELEGATED AND IMPLEMENTING ACTS (Art 25: Exercise of the delegation; Art 26: Committee)

CHAPTER VII: FINAL PROVISIONS (Art 27: Repeal; Art 28: Monitoring and evaluation clause; Art 29: Entry into force and application)

Version 0.4, based on FI Pcy text of 18 Sept 2019

slide-113
SLIDE 113

ePrivacy Regulation: Key Features /1

  • The European Commission published a proposal in January 2017 for a regulation to replace the ePrivacy

Directive of 2002 and amendments of 2009. Negotiations in Council of EU (member states) continue.

  • The proposed ePrivacy Regulation (ePR) lays down rules regarding respect for private life and communications. It protects communications of natural and legal persons
  • Material scope (selected), applies to:
  – Electronic communications content (ECC) & electronic communications metadata (ECM) [together = electronic communications data (ECD)]
  – Software permitting electronic communications, e.g. browsers and messaging apps, and provides rules on cookies
  – Information on end-users’ terminal equipment (TE), e.g. PCs, phones, connected devices
  – Sending of direct marketing communications (DMC)
  • Territorial scope, applies to:
  – Electronic Communications Services (ECS) providers processing ECD of end-users in EU
  – Sending of DMC to end-users in EU
  • Provides that ECD shall be confidential and that monitoring, surveillance etc. shall be prohibited
slide-114
SLIDE 114

ePrivacy Regulation: Key Features /2

  • Processing of ECD by providers of electronic communications networks (ECN) and ECS permitted if (selected):
  – End-user has provided consent
  – Necessary for the performance of a contract (e.g. billing, subscription)
  – Necessary for security, network management or optimisation
  – Council amendment (July 2018) proposes that ECM based on consent may be further processed subject to safeguards such as limiting processing to pseudonymised data
  • Without prejudice to permitted processing conditions, ECS providers shall erase or anonymise ECC and ECM after receipt by the intended recipient unless required for billing purposes and related challenges
  • Processing of end-user TE or collecting information from end-user equipment prohibited unless (selected):
  – End-user has provided consent
  – Necessary for providing an information society service (ISS) requested
  – Necessary for transmitting electronic communications over an ECN
  • Use of ECS to send DMC prohibited unless based on end-user consent (including ‘soft opt-in’), and where the right to object is provided

slide-115
SLIDE 115

ePrivacy Regulation: Potential Impacts

Security and confidentiality of communications

  • Potential impact where hardware and software need to be updated to maintain security and confidentiality
  • f communications

Protecting terminal equipment of end-users

  • Technical measures may need to be applied to protect end-user terminal equipment and to conduct DPIAs

related to new processing use-cases Cookie consent, privacy settings and processing notices

  • Action may need to be taken to update cookie consent and provide privacy settings in apps

Processing of location data and other metadata

  • Measures may need to be put in place to protect terminal equipment and to provide effective notice and

consent regarding the use of connected devices

Unsolicited and direct marketing communications

  • Measures need to be put in place to provide for notice and consent related to marketing, and to provide for the right to object

Administrative fines

  • ePR provides for GDPR-level fines, i.e. a maximum fine of 4% of global annual turnover or €20 million, whichever is higher.
slide-116
SLIDE 116

Session C: New Issues for 2020 and Beyond

  • 2. Convention 108+
slide-117
SLIDE 117

Convention 108+

  • The Council of Europe comprises 47 member states

including all members of the EU

  • All member states have signed-up to the European

Convention on Human Rights (1950)

  • The European Court of Human Rights oversees

implementation of the Convention in member states

  • Convention for the Protection of Individuals with

regard to the Processing of Personal Data (Convention 108) adopted in 1981

  • Revision (Convention 108+) adopted in 2018
  • Modernisation addressed challenges arising from new technologies and strengthened effective implementation

slide-119
SLIDE 119

Convention 108+

Key elements of Convention 108+

  • The principles of transparency, proportionality, accountability, data minimisation, privacy by design, etc.

are now acknowledged as key elements of the protection mechanism and have been integrated in the modernised instrument.

  • The concept of ‘file’ is abandoned. ‘Controller of a data file’ is replaced by ‘data controller’, in addition to

which the terms ‘processor’ and ‘recipient’ are used.

  • The scope of application includes both automated and non-automated processing
  • Each Party has to adopt in its domestic law the measures necessary to give effect to the provisions of the

Convention.

  • The catalogue of sensitive data has been extended to include genetic and biometric data, as well as data

processed for the information they reveal relating to trade-union membership or ethnic origin

  • The right not to be subject to a decision affecting the data subject which is based solely on automated processing, without the data subject having his or her views taken into consideration, is introduced.

  • Transborder flows of data to a recipient that is not subject to the jurisdiction of a Party are subject to an

appropriate level of protection being in place; either by law, or by ad hoc or approved standardised safeguards that are legally binding and enforceable (notably contractual clauses or binding corporate rules)

slide-120
SLIDE 120

Convention 108+

GDPR Recital 105: Consideration of International Agreements for an Adequacy Decision

Apart from the international commitments the third country or international organisation has entered into, the Commission should take account of obligations arising from the third country’s or international organisation’s participation in multilateral or regional systems in particular in relation to the protection of personal data, as well as the implementation of such obligations. In particular, the third country’s accession to the Council of Europe Convention of 28 January 1981 for the Protection of Individuals with regard to the Automatic Processing of Personal Data and its Additional Protocol should be taken into account. The Commission should consult the Board when assessing the level of protection in third countries or international organisations.

The following non-member states have ratified the Convention

  • Argentina
  • Cabo Verde
  • Mauritius
  • Mexico
  • Morocco
  • Senegal
  • Tunisia
  • Uruguay
slide-121
SLIDE 121

Session C: New Issues for 2020 and Beyond

  • 3. Digital Ethics
slide-122
SLIDE 122

The Evolving Digital Landscape

slide-123
SLIDE 123

What is Digital Ethics?

  • It is the evaluation of moral problems related

to data and algorithms

  • It demonstrates a commitment to the greater

good, taking into account a broad set of principles and stakeholders

  • It encourages organisations to embed trust

into the way that they interact with customers, stakeholders and society

slide-124
SLIDE 124

What is Digital Ethics?

Giovanni Buttarelli, European Data Protection Supervisor (2014-2019) (Opening speech, ICDPPC 2018, 24 October 2018)

  • “We are now living through a generational shift in

the respect for privacy. This shift is towards establishing sustainable ethics for a digitised society.

  • “Ethics come before, during and after the law. It informs how laws are drafted, interpreted and revised. It fills the gaps where the law appears to be silent. Ethics is the basis for challenging laws.”
slide-125
SLIDE 125

What is Digital Ethics?

Elizabeth Denham, UK Information Commissioner (Video address at techUK Digital Ethics Summit, 13 December 2018)

  • “Fairness is where we crash against the questions of

should we do this, rather than can we do this. Fairness is a legal principle in the GDPR and we cannot expect it to be silent against ‘should’ type questions.

  • Every time we make a ruling on the fairness

principle, we reduce the real estate available for those who operate in the ethically questionable but legally acceptable realm.”

slide-126
SLIDE 126

Digital Ethics Themes

Trustworthiness, accountability and honesty

  • Being fair and transparent, implementing policies,

demonstrating compliance with regulations, explaining and recording the logic behind decision-making

Testing and evaluation

  • Testing of processes in order to assess impacts and outcomes, including addressing any potential biases

Benefits to society

  • Keeping the ‘greater good’ in mind and developing

corresponding processes and policies

  • Using ethics and wider issues of corporate

responsibility to provide a platform for innovation

slide-127
SLIDE 127

Digital Ethics: What not to do

Collect and process information in a way that infringes human rights

  • For example, in ways that individuals or society would find intrusive or offensive, does not meet their reasonable expectations, or infringes rights such as liberty, security and privacy

Manipulate the social conscience of society, a group of people or the mental state and views of an individual

  • For example, by manipulating thoughts and feelings of individuals,

distributing propaganda, or subverting democratic principles and the rule of law

Participate in processes that put individuals in danger

  • For example, research and testing, deployment of artificial intelligence in

decision-making linked to technology and sharing sensitive information about individuals

slide-128
SLIDE 128

Ethically Sound Decision-Making

Ethical Position

This overlap is the optimal decision-making level the organisation would like to achieve: what is technically feasible, legal, desired and ethical.

slide-129
SLIDE 129

Session C: New Issues for 2020 and Beyond

  • 4. AI in the EU
slide-130
SLIDE 130

Turing Test

Can a computer “trick” a person into thinking the computer is a person too?

  • A person communicates with another person and a robot via a teleprinter. If the person cannot tell who is who in more than 50% of cases, the robot is “intelligent”
  • No robot has so far passed the Turing test; however, some have claimed victory with a 33% success rate

slide-131
SLIDE 131

Artificial Intelligence

  • AI refers to systems that display intelligent behavior by analyzing their

environment and taking actions – with some degree of autonomy – to achieve specific goals

  • AI – based systems can be:

– purely software-based, acting in the virtual world (e.g., voice assistants, image analysis software, search engines, speech and face recognition systems), or
– embedded in hardware devices (e.g., advanced robots, autonomous cars, drones or Internet of Things (IoT) applications)

  • Data collected:

– structured data: organized according to pre-defined models (e.g., a relational database)
– unstructured data: no known organization (e.g., an image or a piece of text)

slide-132
SLIDE 132

Artificial Intelligence

Artificial Intelligence Machine Learning Deep Learning

slide-133
SLIDE 133

Machine Learning

  • Subset of AI
  • Computers run algorithms that “learn”, i.e. automatically improve through experience

  • Types of Machine Learning:

– Supervised (majority of cases):

  • An algorithm is learning from the dataset as the student learns from the teacher. We know

the correct answers, the algorithm makes predictions and is corrected by the “teacher”. Learning stops when algorithm achieves an acceptable level of performance

– Unsupervised:

  • There is no teacher and no correct answers. Algorithms are left alone to make guesses and to discover and present structure in the data

– Reinforcement learning:

  • AI systems are free to take decisions and are rewarded or penalised depending on whether the decision was good or bad (like a dog being rewarded with a treat if it behaves well)
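The supervised "teacher" setup described above can be sketched in a few lines of plain Python. The dataset and the 1-nearest-neighbour rule below are toy illustrations, not part of the deck: the labelled training set plays the role of the teacher, and the algorithm's answers on new points can be judged against known correct labels.

```python
# A minimal supervised learner: 1-nearest-neighbour classification.
# The labelled training examples act as the "teacher" with the correct answers.

def nearest_neighbour(train, point):
    """Predict the label of `point` from labelled examples in `train`."""
    def dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda ex: dist(ex[0], point))
    return label

# Toy dataset: (features, label) pairs the "teacher" has already labelled.
train = [((1.0, 1.0), "small"), ((1.2, 0.8), "small"),
         ((8.0, 9.0), "large"), ((9.0, 8.5), "large")]

print(nearest_neighbour(train, (1.1, 0.9)))  # near the "small" cluster
print(nearest_neighbour(train, (8.5, 9.2)))  # near the "large" cluster
```

In a real supervised pipeline the algorithm would also be scored against held-out labelled data, and training would stop once performance is acceptable, exactly as the slide describes.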

slide-134
SLIDE 134

Deep Learning

  • Subset of Machine Learning
  • Artificial neural networks, i.e. algorithms inspired by the human

brain, learn from large amounts of data. Like humans learn from experience, the algorithm performs a task repeatedly and makes minor tweaks to improve outcome

  • Deep Learning process consists of two main phases:

– training: process of labeling large amounts of data and determining their matching characteristics
– inferring: the Deep Learning AI draws conclusions and labels new, previously unseen data using the knowledge gained in training

  • Advantages:

– overall approach more accurate and reliable – less need of human guidance
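The two phases above can be illustrated with the smallest possible "network": a single artificial neuron trained by repeated minor weight tweaks, then used to label unseen inputs. The threshold task and all numbers below are invented for the illustration.

```python
# One artificial neuron, trained by repeated small weight adjustments,
# mirroring the slide's two phases: training on labelled data, then inference.

def train_neuron(data, epochs=2000, lr=0.5):
    """Perceptron-style training: nudge weights whenever a prediction is wrong."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in data:
            pred = 1.0 if w * x + b > 0 else 0.0
            err = target - pred
            w += lr * err * x  # minor tweak toward a better outcome
            b += lr * err
    return w, b

# Training phase: labelled examples (inputs below 5 -> 0, above 5 -> 1).
labelled = [(1, 0), (2, 0), (3, 0), (7, 1), (8, 1), (9, 1)]
w, b = train_neuron(labelled)

# Inference phase: label new, previously unseen inputs with the learned weights.
def infer(x):
    return 1 if w * x + b > 0 else 0

print(infer(2.5), infer(7.5))
```

Deep learning stacks many layers of such units and tunes millions of weights, but the training-then-inference split is the same.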

slide-135
SLIDE 135

AI in Healthcare

  • Worldwide spending on AI is estimated to reach $58 billion by 2021
  • More than half of AI spending is coming from healthcare, banking,

and retail

  • Key clinical health AI applications can potentially create $150

billion in annual savings for the US healthcare economy by 2026

slide-136
SLIDE 136

Application Of AI In The Health Sector

Assisted robotic surgery
Detection, diagnosis and treatment
Medical image analysis and labeling
Drug discovery & supply
Prediction of disease trends
Healthcare Operational Effectiveness (HOE)
Virtual nursing assistants

slide-137
SLIDE 137

Application Of AI In The Health Sector (cont.)

Personalized medicine
Patient monitoring and care
Telehealth/Telemedicine
Electronic Health Records
Internet of Medical Things
Wearables and health monitoring

slide-138
SLIDE 138

Asimov’s Rules Of Robotics

  • 1. A robot may not injure a human being or, through

inaction, allow a human being to come to harm

  • 2. A robot must obey orders given by human beings except

where such orders would conflict with the First Law

  • 3. A robot must protect its own existence as long as such

protection does not conflict with the First or Second Laws

slide-139
SLIDE 139

AI and EU Privacy: The Main Touchpoints

slide-140
SLIDE 140
  • 1. Automated Decision-Making
  • Right not to be subject to decisions based solely on

automated processing, including profiling, which produces legal effects or similarly significantly affects the individual

  • Automated Decision-Making is permitted if:

– Necessary for entering into or the performance of a contract
– Authorized by law
– Based on the data subject’s explicit consent

slide-141
SLIDE 141
  • 2. Anonymized or Personal Data?
  • Anonymization, or de-identification, can be a moving target.

Seemingly anonymized datasets can be reversed when the computer gets “too smart”

  • AI systems can be effective in re-identifying an individual by, e.g.

cross-referencing available datasets

  • Certain jurisdictions consider the re-identification of anonymized

datasets to be a criminal offence, even if it is unintentional (e.g. UK Data Protection Act 2018 S.171)
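Cross-referencing available datasets, as described above, is the classic linkage attack: an "anonymised" dataset stripped of names is joined with an auxiliary public dataset on shared quasi-identifiers. The toy records and field names below are invented purely to illustrate the mechanism:

```python
# Sketch of re-identification by cross-referencing two datasets on
# quasi-identifiers (postcode, birth year, sex). All data is fictional.

anonymised_health = [
    {"postcode": "EC1A", "birth_year": 1980, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "SW1A", "birth_year": 1975, "sex": "M", "diagnosis": "diabetes"},
]

public_register = [
    {"name": "A. Smith", "postcode": "EC1A", "birth_year": 1980, "sex": "F"},
    {"name": "B. Jones", "postcode": "SW1A", "birth_year": 1975, "sex": "M"},
]

QUASI_IDS = ("postcode", "birth_year", "sex")

def reidentify(health_rows, register_rows):
    """Join the datasets on quasi-identifiers, restoring names to diagnoses."""
    matches = []
    for h in health_rows:
        for r in register_rows:
            if all(h[k] == r[k] for k in QUASI_IDS):
                matches.append({"name": r["name"], "diagnosis": h["diagnosis"]})
    return matches

for match in reidentify(anonymised_health, public_register):
    print(match)
```

AI systems scale this same join to far larger and messier datasets, which is why removing direct identifiers alone rarely achieves anonymisation in the GDPR sense.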

slide-142
SLIDE 142
  • 3. Data Minimization / Purpose Limitation
  • Only “adequate, relevant and limited” personal data can be processed in relation to the purposes of the processing

  • The wealth of AI capabilities seems at odds with purpose limitation

  • Secondary purposes that were not envisioned in the

initial data collection need to be GDPR compliant (e.g. obtain new consent)
slide-143
SLIDE 143
  • 4. Legal Basis for Processing
  • Considering the voluminous amounts of data processed by AI

technologies and their various sources, how can one determine whether the personal data was lawfully collected?

  • If the data subject is not clearly informed, can he or she give

informed and free consent?

  • What is the limit of legitimate interest when we base

processing activities on this ground?

slide-144
SLIDE 144
  • 5. Privacy by Design and Privacy by Default
  • Every company must implement appropriate technical and organizational measures integrating necessary safeguards into the processing

  • In practice, AI algorithms will have to be designed so as to

respect these principles. This may limit the amount of data which can be lawfully processed

  • With the oceans of data in AI applications it may be particularly difficult, if not impossible, to fulfil such requirements

slide-145
SLIDE 145

CNIL’S Report on AI

  • Principle of Fairness: algorithm should be fair towards its users, citizens, society.

Not at odds with collective interests

  • Principle of continued attention and vigilance: need for an ongoing state of alert as regards the development and societal impact of AI

slide-146
SLIDE 146

CNIL’S Report on AI

slide-147
SLIDE 147
  • The European Commission launched the pilot phase of the ethics guidelines for

trustworthy AI in July 2019. The ethics guidelines will be finalized in early 2020 and will include specific checklist items that companies can use when deciding whether to deploy an AI algorithm.

  • Trustworthy AI principles, in the pilot form, include:
  • Human agency and oversight;
  • Robustness and safety;
  • Privacy and data governance;
  • Transparency;
  • Diversity, non-discrimination and fairness;
  • Societal and environmental well-being;
  • Accountability

Some AI Developments

slide-148
SLIDE 148

Session D: GDPR in Practice

1. Legitimate Interest Test 2. Data Protection Impact Assessment 3. Group Exercise

slide-149
SLIDE 149

Session D: GDPR in Practice

  • 1. Legitimate

Interest Test

slide-150
SLIDE 150

Legitimate Interest Test /1

Legitimate interest is one of six legal grounds for processing personal data but is possibly the most controversial.

  • Art. 6(1)(f): “(Processing shall be lawful only if…) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.”
  • Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks.

slide-151
SLIDE 151

Legitimate Interest Test /2

  • Select which of the six lawful grounds for processing personal data can be best relied upon

for the specified purpose. There is no hierarchy of grounds.

  • Legitimate interests may be a suitable ground for processing where such processing has

benefit for the controller and the data subject concerned (or a third party to whom the data are disclosed)

  • Circumstances where a legitimate interest basis could be used:

– To achieve legitimate and lawful business objectives of the organisation
– Where the reasonable expectations of the data subject have been taken into account
– Processing for fraud protection
– Processing for direct marketing, although consent needs to be relied upon for electronic direct marketing under ePrivacy rules
– Managing marketing opt-outs and suppression requests
– Physical, IT and network security
– Carrying out risk assessments and due diligence tests

slide-152
SLIDE 152

Legitimate Interest Test /3

  • Consider the purposes of the processing, the safeguards in place and what the data subject

would reasonably expect in terms of the way that data concerning them is processed

  • If you think there could be a problem with the way in which the data is processed, then there probably is (‘cool’ v. ‘creepy’)

  • Consider avoiding a legitimate interest basis where there is an intention to use the data in a way which may raise objections from data subjects, or where there is a risk of harm to individuals

  • Legitimate interest cannot be used where special categories of data (sensitive data) are

being processed

  • Ensure that the legitimate interest relied upon is documented and set out in the notification of processing sent to data subjects
  • Data subjects have the right to object to processing based on legitimate interest.
  • Where a data subject objects to the processing, the controller shall no longer process the data unless they can demonstrate compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject
slide-153
SLIDE 153

Legitimate Interest Test /4 UK ICO Legitimate Interest Test

  • Purpose test: are you pursuing a legitimate

interest?

  • Necessity test: is the processing necessary for

that purpose?

  • Balancing test: do the individual’s interests override the legitimate interest?
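The ICO's three-part test is conjunctive: all three parts must be answered favourably before Art. 6(1)(f) can be relied upon. As an illustrative screening helper only (not legal advice, and the parameter names are invented), the logic is simply:

```python
# The ICO three-part legitimate interest assessment as a screening helper.
# Purpose test AND necessity test AND balancing test must all be satisfied.

def legitimate_interest_assessment(pursuing_legitimate_interest: bool,
                                   processing_necessary: bool,
                                   individual_interests_override: bool) -> bool:
    """Return True only if all three parts of the ICO test are favourable."""
    return (pursuing_legitimate_interest
            and processing_necessary
            and not individual_interests_override)

# Fraud prevention: legitimate purpose, necessary, balance favours controller.
print(legitimate_interest_assessment(True, True, False))  # passes
# Intrusive profiling: balancing test fails, so the whole ground fails.
print(legitimate_interest_assessment(True, True, True))   # fails
```

In practice each input is itself the documented outcome of a reasoned assessment, recorded so the controller can demonstrate accountability.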
slide-154
SLIDE 154

Session D: GDPR in Practice

  • 2. Data Protection

Impact Assessment

slide-155
SLIDE 155

Data Protection Impact Assessment

GDPR: Article 35 Data protection impact assessment (abridged)

1. Where a type of processing, in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.

2. The controller shall seek the advice of the data protection officer, where designated, when carrying out a data protection impact assessment.

3. A data protection impact assessment referred to in paragraph 1 shall in particular be required in the case of:
(a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;
(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10; or
(c) a systematic monitoring of a publicly accessible area on a large scale.
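Article 35(3) lists three cases in which a DPIA is required in particular; a DPIA is triggered if any one of them applies. As an illustrative sketch (the tag strings below are invented labels, not GDPR terminology), the screening step can be expressed as:

```python
# Article 35(3) DPIA screening sketch: a DPIA is required if any listed
# case applies to the envisaged processing. Tag names are illustrative.

ARTICLE_35_3_CASES = {
    "systematic_extensive_profiling_with_legal_effects",  # Art. 35(3)(a)
    "large_scale_special_categories_or_criminal_data",    # Art. 35(3)(b)
    "large_scale_systematic_public_area_monitoring",      # Art. 35(3)(c)
}

def dpia_required(processing_characteristics: set) -> bool:
    """True if the processing matches any Article 35(3) case."""
    return bool(ARTICLE_35_3_CASES & processing_characteristics)

# e.g. large-scale processing of Art. 9(1) health data triggers a DPIA:
print(dpia_required({"large_scale_special_categories_or_criminal_data"}))
print(dpia_required({"small_scale_internal_hr_records"}))
```

Note that Article 35(1) is broader than this list: any processing likely to result in a high risk requires a DPIA, and supervisory authorities publish their own lists of triggering operations.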

slide-156
SLIDE 156

GDPR and the Notion of Risk

GDPR: Article 35 Data protection impact assessment (abridged)

1. Where a type of processing, in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.

2. The controller shall seek the advice of the data protection officer, where designated, when carrying out a data protection impact assessment.

3. A data protection impact assessment referred to in paragraph 1 shall in particular be required in the case of:
(a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;
(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10; or
(c) a systematic monitoring of a publicly accessible area on a large scale.

slide-157
SLIDE 157

Example DPIA Process Flow

slide-158
SLIDE 158

Session D: GDPR in Practice

  • 3. Group Exercise
slide-159
SLIDE 159

Scenario – applying the GDPR in practice

we-r-board.com

the social network for surfers

  • 1. Marketing offices globally, including in 6 EU countries. Designated main establishment in the Netherlands
  • 2. Collects, processes and transfers personal data worldwide, including between the EU and U.S.
  • 3. Planning to introduce a new product – the eSurfBoard – which collects data and connects to the internet
  • 4. Planning to launch the eSurfBoard in six weeks
  • 5. Planning to use a legitimate interest basis for data processing, but data protection risks have been identified.

slide-160
SLIDE 160

eSurfBoard features

eSurfBoard

the connected surfboard!

  • Integrated 5G, wifi and Bluetooth connectivity
  • GPS-enabled location tracking
  • Onboard camera and live transmission of surfing action
  • Built in phone and screen on the board
  • Transmits health and heart-rate data via the connected wristband
  • Live dashboard of all user data on we-r-board.com
  • Notifies users of when surfer friends are in the area or on the same beach
  • Takes water samples (temperature, pollution, wave height) and automatically

transmits such data to the social media platform

  • Emergency function to send notice when a surfer is in trouble – activated when the surfer is separated by a certain distance from the board

  • Enables discounts at local surf bars and shops based on location tracking
  • Acceptance of targeted advertising on the board’s screen enables discounts
slide-161
SLIDE 161

Group Exercise

  • we-r-board.com has designed a great new product – eSurfBoard – which

has lots of features

  • In order that eSurfBoard can launch in EU markets and be compliant with

the GDPR, the data controller has decided it would be easier and less costly to rely on legitimate interest rather than consent for the processing activities connected to the board.

  • The Data Protection Officer has, however, determined that a full Data Protection Impact Assessment should be undertaken due to potential risks

1. Can we-r-board.com rely on legitimate interest for any processing activities?
2. What should be taken into account when undertaking a DPIA of the eSurfBoard and related data processing activities?