Demystifying the Role of AI in Privacy Management – Darren Abernethy – PowerPoint PPT Presentation




SLIDE 1

October 16, 2019

Demystifying the Role of AI in Privacy Management

Darren Abernethy TrustArc Maggie Gloeckle A+E Networks Hilary Lane Ravi Pather CryptoNumerics

SLIDE 2

Introduction

SLIDE 3

Demystifying the Role of AI in Privacy Management

Agenda

  • Introductions
  • Privacy and the Business Challenges
  • Introduction to “Automated Intelligence”
  • Challenges with Managing Privacy Compliance
  • “Automated Intelligence” Use Cases
  • Key takeaways
SLIDE 4

Privacy Funding

POLL: Privacy Funding

What is the level of privacy funding at your organization?

  • A. Low – we have significantly underinvested in privacy
  • B. Medium – we have inconsistently invested in privacy
  • C. High – we have consistently invested in privacy
  • D. I don’t even want to think about it
SLIDE 5

Privacy Ownership

POLL: Organizational Compliance Ownership

Who owns privacy compliance within your organization?

  • A. Office of the General Counsel or Legal Department
  • B. Compliance
  • C. Information Technology – CISO / CIO
  • D. C-Suite Executive(s) – CFO / COO / CTO / CEO
  • E. Management by Committee
SLIDE 6

Privacy and Business Challenges

SLIDE 7

Privacy and the Business Challenges

Privacy as a strategic business imperative: maximum data value, balanced by responsible data stewardship.

[Diagram: Privacy ↔ Stakeholder Alignment]

SLIDE 8

Privacy and the Business Challenges

How are Data and Privacy Compliance Evolving?

[Timeline graphic, 2016–2020: Data Security → GDPR Phase 1 → HIPAA → GDPR Phase 2 → PIPEDA → CCPA (Sep ’19 / Jan ’20) → Privacy Automation]

The trajectory:

  • Securing Data: Threat Detection & Loss Prevention
  • Privacy Compliance is getting more complex
  • Incentive to use provably de-identified data grows
  • Failure to comply with Privacy is damaging
  • Demands Privacy by Design (PbD)
  • Automate Risk Assessment of re-identification (demonstrate PIA)
  • Ensure risk of re-identification has been removed
  • Retain analytical value in data

By era:

  • Data Security: focus on data security; loss prevention; intrusion detection
  • GDPR Phase 1: legal meaning; consent management; right to be forgotten; GRC risk and privacy tools
  • GDPR Phase 2 / CCPA: re-identification risk; anonymization; demand for secondary use of data increasing; data utility required; demonstrating compliance is hard; privacy non-compliance; security tools don’t work for privacy
  • Privacy Automation: invest in privacy; leverage automated tools
SLIDE 9

Introduction to “Automated Intelligence”

SLIDE 10

The Road to “Automated Intelligence”

Data → Information → Knowledge → Insights → Wisdom

Route 2 Intelligence

SLIDE 11

From Text to Machine

SLIDE 12

Speed Meets Nuance

Data is Fast Laws are Complex

Business, economic, healthcare, security, and political leaders and their teams rely on vast data sources and deep analytics to make rapid, critical decisions. Regulatory obligations can attach to data as rapidly as it moves or is put to a new purpose. Most laws, however, aren’t written to be applied quickly as companies, data, systems, business partners, and the activities in which they are involved fall in and out of scope.

SLIDE 13

Defining “Intelligence” for Privacy

Term: in·tel·li·gence \ in-ˈte-lə-jən(t)s \
Definition: the ability to learn or understand or to deal with new or trying situations

Term: ar·ti·fi·cial in·tel·li·gence \ ˌär-tə-ˈfi-shᵊl in-ˈte-lə-jən(t)s \
Definition: a branch of computer science dealing with the simulation of intelligent behavior in computers

Term: ma·chine learn·ing \ mə-ˈshēn ˈlər-niŋ \
Definition: the process by which a computer is able to improve its own performance (as in analyzing image files) by continuously incorporating new data into an existing statistical model
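The machine-learning definition above – improving performance by continuously incorporating new data into an existing statistical model – can be illustrated with a deliberately tiny sketch. The `OnlineMeanModel` class is a made-up toy, not a real library API: it updates a running statistic incrementally, without ever reprocessing old data.

```python
class OnlineMeanModel:
    """Toy 'model' that predicts the running mean of values seen so far,
    improving its estimate as each new observation arrives."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, value: float) -> None:
        # Incremental mean update: incorporate new data into the existing
        # statistic without storing or re-reading past observations.
        self.n += 1
        self.mean += (value - self.mean) / self.n

    def predict(self) -> float:
        return self.mean


model = OnlineMeanModel()
for observation in [10.0, 12.0, 14.0]:
    model.update(observation)

print(model.predict())  # 12.0
```

Real systems swap the running mean for a statistical model (e.g., an incrementally trained classifier), but the update-as-data-arrives loop is the same shape.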

SLIDE 14

Defining “Intelligence” for Privacy

Term: al·go·rithm \ ˈal-gə-ˌri-t͟həm \
Definition: a step-by-step procedure for solving a problem or accomplishing some end in a finite number of steps that frequently involves repetition of an operation

Term: au·to·mat·ed pri·va·cy in·tel·li·gence \ ȯ-tə-ˌmā-təd ˈprī-və-sē in-ˈte-lə-jən(t)s \
Definition: algorithmic, data-driven contextual insights about privacy requirements that drive actionable priorities within operational workflows to streamline privacy management decisions and drive alignment across teams and stakeholders

SLIDE 15

Automated Decision-Making (ADM)

ADM is the ability to make decisions by technological means. Solely automated ADM is ADM without any human involvement. ADM can be based on data collected directly, data collected from third-party sources, or derived or inferred data. GDPR addresses risks related to Automated “Individual” Decision-Making, i.e., ADM about individuals. ADM used for privacy intelligence leverages information about organization business practices and privacy metadata.

Data integrity, accuracy, and completeness are as critical to privacy intelligence as they are to the nuanced legal and regulatory advice and guidance provided by expert advisors.
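The “solely automated” distinction above can be sketched in code: a system can auto-resolve low-risk cases while routing higher-risk decisions about individuals to a human reviewer, so they are no longer solely automated. This is a minimal illustration; `assess_request`, `Decision`, and the 0.7 threshold are hypothetical names and values, not any real product’s API.

```python
from dataclasses import dataclass


@dataclass
class Decision:
    outcome: str
    solely_automated: bool


def assess_request(risk_score: float, threshold: float = 0.7) -> Decision:
    """Auto-approve low-risk requests; escalate the rest to a human."""
    if risk_score < threshold:
        return Decision(outcome="auto-approved", solely_automated=True)
    # GDPR's ADM concerns attach to solely automated decisions about
    # individuals, so high-risk cases get meaningful human review.
    return Decision(outcome="escalated-for-human-review", solely_automated=False)


print(assess_request(0.2))  # auto-approved, solely automated
print(assess_request(0.9))  # escalated, human in the loop
```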

SLIDE 16

Challenges with Managing Privacy Compliance

SLIDE 17

Privacy Management Challenges

▪ Emerging multiple privacy regimes and increasing complexity
▪ Privacy compliance is here and is a legal requirement
▪ Privacy by Design and Privacy by Default is a legal recommendation
▪ Anonymised data is today foundational to data & privacy compliance
▪ Risk of re-identification is high if data is not properly anonymised
▪ Risk of fines and brand damage for non-compliant anonymised data
▪ Balancing with data utility, critical for business data science and analytics
▪ Consequences for non-compliance
▪ Secondary-use applications such as data science, analytics, and monetization are now in scope
▪ In-scope overhead will be prohibitive in using such data
▪ Privacy breach, brand and reputational damage, and questions of ethical use of data

SLIDE 18

Enterprise Challenge: Fragmentation

  • Legal – contracts; identifies legal requirements
  • Compliance – policy tools, spreadsheets; privacy compliance, access requests, enforcement
  • IT Security – encryption, hashing, tokenization; data security and protection tools
  • Business – business & customer insights; consumers of data
  • Risk – GRC risk tools, spreadsheets; manages risk, defines policy
  • Data Science – Python, R, SAS, Tableau; analytics, AI & ML, data insights

Each function is manual and fragmented:

  • Privacy stakeholders are fragmented and operate in silos
  • Privacy compliance is outpacing organizational capability to respond
  • Risk of re-identification of data is a risk-and-compliance exposure
  • Data protection tools break the analytical value of data for data science and analytics

SLIDE 19

“Automated Intelligence” Use Cases

SLIDE 20

“Automated Intelligence” Use Cases

  • 1. Consumer Privacy Rights / DSRs
  • 2. Incident Response
  • 3. De-Identification and Risk
  • 4. Data Discovery and Risk
SLIDE 21

Use Case 1: Consumer Privacy Requests

Intelligence Filter

Requestor | Type | Required?
1. California | Do Not Sell | Yes, under CCPA if applicable
2. Texas | Access | Yes, under HIPAA and TMPA if applicable
3. Nevada | Do Not Sell | Yes, under Nevada law if applicable
4. Brazil | Correction | Yes
5. Singapore | Deletion | No
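The intelligence-filter table above is essentially a lookup from (jurisdiction, request type) to an obligation. A minimal sketch, assuming a plain dictionary as the rule store (the slide does not describe an implementation); the rows are from the table, the structure and function names are ours:

```python
# (jurisdiction, request type) -> (required?, governing law or None)
FILTER = {
    ("California", "Do Not Sell"): (True, "CCPA, if applicable"),
    ("Texas", "Access"): (True, "HIPAA and TMPA, if applicable"),
    ("Nevada", "Do Not Sell"): (True, "Nevada law, if applicable"),
    ("Brazil", "Correction"): (True, None),   # slide says "Yes" without citing a statute
    ("Singapore", "Deletion"): (False, None),
}


def is_required(jurisdiction: str, request_type: str):
    """Return (required?, basis); unknown combinations go to legal review."""
    return FILTER.get((jurisdiction, request_type), (None, "needs legal review"))


print(is_required("California", "Do Not Sell"))  # (True, 'CCPA, if applicable')
print(is_required("Singapore", "Deletion"))      # (False, None)
```

A production filter would also encode applicability tests (residency, data categories, thresholds) rather than a flat table, but the routing idea is the same.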

SLIDE 22

Use Case 2: Incident Response

SLIDE 23

Use Case 3: De-Identification & Risk

Why enterprise-class privacy automation is now required:

▪ Build data protection by design and by default (Privacy by Design)
  – Build an architectural point of control for policy enforcement
  – Automate risk assessment for re-identification
  – Generate fully anonymised datasets with confidence

▪ Reduce risk of non-compliance
  – Invest in privacy automation now, as we invested in data security five years ago
  – Privacy breach and non-compliance is now a corporate liability and exposure
  – Harmonize Legal, Risk & Compliance, Data Science, and Business teams into a single process with privacy automation

▪ Data-driven data science demand will grow
  – Make privacy an integrated layer of data science architectures
  – Balance privacy compliance with data of high analytical value
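One common way to make the “automated risk assessment for re-identification” above concrete is to measure k-anonymity over quasi-identifiers: the smaller the smallest group of records sharing the same quasi-identifier values, the easier re-identification becomes. A hedged sketch – the slide does not specify this metric, and the dataset and column names below are invented for illustration:

```python
from collections import Counter

# Toy records; "zip", "age_band", "gender" act as quasi-identifiers.
records = [
    {"zip": "94105", "age_band": "30-39", "gender": "F", "diagnosis": "flu"},
    {"zip": "94105", "age_band": "30-39", "gender": "F", "diagnosis": "cold"},
    {"zip": "94110", "age_band": "40-49", "gender": "M", "diagnosis": "flu"},
]
QUASI_IDENTIFIERS = ("zip", "age_band", "gender")


def k_anonymity(rows, quasi_ids):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    classes = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return min(classes.values())


k = k_anonymity(records, QUASI_IDENTIFIERS)
print(k)  # 1: the lone 94110 record is unique, hence easily re-identifiable
```

An automated pipeline would run such a check before release and refuse (or further generalize) any dataset whose k falls below a policy threshold.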

SLIDE 24

Use Case 3: De-Identification & Risk

SLIDE 25

Use Case 3: De-Identification & Risk

SLIDE 26

Use Case 4: Data Discovery & Risk

SLIDE 27

Summary: Automated Intelligence for Privacy Management

  • Data Protection by Default and by Design
    – Build a systems-based architectural point of control for policy enforcement
    – Use emerging, “state-of-the-art” tools to meet and demonstrate data compliance
  • Fully Anonymize Data and Demonstrate Compliance
    – De-identify direct identifiers and apply privacy protection to indirect identifiers
    – Automate risk assessment to demonstrate privacy compliance
    – Move to automated, systems-based risk-of-re-identification assessment vs. manual “two eyes” approaches
  • Legal Basis for Secondary-Purpose Use of Customer Data
    – “Legitimate Interest Processing” (LIP) is more flexible than consent for data science (GDPR)
    – Identifiable data is in scope (CCPA & PIPEDA)
    – Organisational and technical controls are required to support de-identification of data
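The split above between direct identifiers (de-identify) and indirect identifiers (apply privacy protection) can be sketched as two different transforms: replace direct identifiers with salted one-way hashes, and coarsen indirect ones so they no longer single out a record. A minimal illustration only; the column names, salt handling, and the 3-digit ZIP-prefix rule are assumptions, not a stated technique from the deck:

```python
import hashlib


def pseudonymize(value: str, salt: str = "per-dataset-secret") -> str:
    """Replace a direct identifier with a salted one-way hash (truncated)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]


def generalize_zip(zip_code: str) -> str:
    """Coarsen an indirect identifier: keep only the 3-digit ZIP prefix."""
    return zip_code[:3] + "**"


record = {"name": "Jane Doe", "zip": "94105", "diagnosis": "flu"}
deidentified = {
    "name": pseudonymize(record["name"]),  # direct identifier: hashed
    "zip": generalize_zip(record["zip"]),  # indirect identifier: generalized
    "diagnosis": record["diagnosis"],      # retained for analytical value
}
print(deidentified["zip"])  # 941**
```

Keeping the analytical column untouched while transforming identifiers is what the slides mean by balancing compliance with data utility.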

SLIDE 28

Key Takeaways

SLIDE 29

Key Takeaways

  • Data is fast, but laws are increasingly complex
  • Privacy intelligence = automated intelligence that:
    ○ delivers contextual privacy insights
    ○ that drive actionable priorities
    ○ within operational workflows
    ○ to streamline privacy management decisions and
    ○ align teams and stakeholders
  • Automated Decision-Making (ADM) is the ability to make decisions by technological means
  • ADM used for privacy intelligence leverages information about organization business practices and privacy metadata
  • Data integrity, accuracy, and completeness are as critical to the development of privacy intelligence as they are to nuanced legal and regulatory advice and guidance provided by expert advisors

SLIDE 30

Resources

  • FPF Release: The Privacy Expert’s Guide to AI and Machine Learning – https://fpf.org/2018/10/18/fpf-release-the-privacy-experts-guide-to-ai-and-machine-learning/
  • Understanding Artificial Intelligence and Machine Learning – https://fpf.org/2019/05/20/understanding-artificial-intelligence-and-machine-learning/
  • Warning Signs: Identifying Privacy and Security Risks to Machine Learning Systems – https://fpf.org/2019/09/20/warning-signs-identifying-privacy-and-security-risks-to-machine-learning-systems/
  • An Introduction to Machine Learning Algorithms – https://blogs.oracle.com/datascience/an-introduction-to-machine-learning-algorithms
  • Competition and Consumer Protection Implications of Algorithms, Artificial Intelligence, and Predictive Analytics – https://www.ftc.gov/public-statements/2018/11/competition-consumer-protection-implications-algorithms-artificial

SLIDE 31

Questions + Contact

Darren Abernethy

Senior Counsel TrustArc 415-766-6451 darren@trustarc.com

Maggie Gloeckle

Privacy & Compliance A+E Networks 212-551-1570 margaret.gloeckle@aenetworks.com

Hilary Lane

Former Chief Privacy Officer

NBCUniversal 917-224-4402 hilary@hilarylane.com

Ravi Pather

Vice President CryptoNumerics

+447747024321

Ravi@CryptoNumerics.com