
Personal Data Management: The User’s Perspective

Appendix B: Detailed Research Survey Results

Note: This report is based primarily on research commissioned by the International Institute of Communications from IPSOS UU between March-August 2012, funded by the Microsoft Corporation.


Personal Data Management

Global Qualitative Study

Research conducted and report prepared by IPSOS UU. Funded by Microsoft, 2012.


RESEARCH OBJECTIVES

This study is part of research being conducted to identify issues for consideration in developing a global policy framework on personal data management. The key objective of this qualitative study is to gain insights into the mental model of how users think about personal data, including:

  • Explore users’ boundaries regarding the application and integration of their personal data, specifically:
    • The types of personal data collected;
    • The means of collecting personal data (i.e., active vs. passive); and,
    • The ways of using personal data.
  • Understand the risks/harms users associate with having their personal data collected and used.
  • Identify which entities users trust with regard to the collection, usage, and protection of their personal data (e.g., service providers, government, intermediaries).
  • Understand motivations for user self-management of their online data.
  • Identify issues for further quantitative study.


EXECUTIVE SUMMARY

PERSONAL DATA POLICY HYPOTHESIS

The hypothesis for a personal data management policy coming out of this research is:

  • There is a need for a more holistic and nuanced policy framework on personal data management that:
    • Is based on responsible and trusted data exchanges amongst stakeholders.
    • Respects a set of personal data principles that empower and protect users from harm.
    • Recognizes that users already take actions to actively manage and control the context for sharing data.
    • Acknowledges different data attract different levels of sensitivity, and there are a number of variables that set the context for sharing data.
  • Privacy is one aspect of a more holistic discussion on personal data management.


RELATIONSHIP WITH TECHNOLOGY

Technology enables users to do more but also can make them feel powerless. Consequently, the perception of control becomes a critical factor and may be affected by data context variables.

  • Positive associations with technology include:
    • Makes things users currently do more efficient, thus making life easier
    • Enables users to do things that were not previously possible or had required more effort in the past
    • Enables users to stay connected with others, the environment (e.g., news, weather), and the world
    • Is fun/entertaining.
  • Negative associations with technology include:
    • Changes quickly, is difficult to keep up with
    • Increases users’ dependence on technology, resulting in a decreasing sense of control
    • Decreases personal privacy.

USER TYPE CHARACTERIZATION

Four personal attributes uncovered in this research make up the dimensions of user types:

                                     Toronto   Shanghai   Hamburg   US
Personal data awareness              Low       Low        Mid       Mid
Trust in government                  High      Low        High      Low
Perceptions of own accountability    High      Low        High      High
Desire for control                   High      High       High      High


DATA CONTEXT VARIABLES IMPACTING USER SENSITIVITY

User sensitivities to their data access/use are impacted by 7 key variables. These variables define the data context:

  • Type of Data
  • Type of Entity
  • Trust in Service Provider
  • Collection Method
  • Device Context
  • Data Usage
  • Value Exchange

Type of Data – What type of data is it? Unless it is considered a vital part of the service offered, banking data is most sensitive for all users, followed by government identification, health information, and peer contact information.

Type of Entity – Who is accessing it? Most users say they do not want the government to access any private data. They are also concerned about unknown vendors accessing their personal data without their permission. Of all service vendors, users consistently have least trust in social network companies and most trust in banks.

Trust in Service Provider – What gives users confidence in service providers? Although they report there is no guarantee, or they may not have a choice, users say 3 elements impact their trust levels: (1) reputation – brand familiarity, word-of-mouth recommendations, a personal relationship, company size; (2) location – organizations with a local/national presence are more likely to abide by regulations; and (3) free vs. paid services – a trade-off between user risk and free services, understanding that free services generate income by selling user data.

Collection Method – How is the data collected? As collection methods (and awareness thereof) move from active to passive, users become more and more wary, as they feel a lack of control over their data. Jurisdiction is a factor in acceptance of more passive methods, as users do not feel the need to protect their information (e.g., image) when in the public domain (vs. at home).

Device Context – Which device is being used? Users differ in their views of which device is safest to access data from; no one device is universally deemed “safest”, but the type of device is a relevant factor.

Data Usage – How is the data being used? Users do not want their data to be used without their knowledge, nor by an unknown vendor. They also have negative reactions to automated uses of data, as they feel it lacks flexibility and control for the user.

Value Exchange – What is the user getting out of it? Users are willing to exchange personal data for an immediate personal benefit that reflects the value of their data. The most appealing benefits include discounts, better service/improved product, and convenience/time savings. A benefit to the social good is not as motivating as a personal benefit, but can be a factor.


DIFFERENT TYPES OF TRUST

In interacting with a service, user-participants differentiate 3 different types of trust:

  • Service provider – how much they trust the entity, and separately, the individuals/employees that make up the entity, to protect their data.
    • There is no distinction between the service, the application, the service provider platform, or the device.
  • Other users of the service – how much they trust other users not to misuse their data, e.g., “reposting” of their data into different contexts.
  • Contents – how much they trust the integrity of the contents being shared as part of the service by other users and by the service provider, e.g., photos, tweets, news articles, etc. (primarily an issue in Shanghai).

PERCEIVED RISKS

User-participants consider 5 “worst case scenarios” resulting from the misuse of their personal data. They have difficulty differentiating “risks” from “harms” and say that “risks” represent a potential negative outcome, while “harm” represents that negative outcome becoming a reality.

  • Data sold to 3rd party/harassment – an annoyance from being targeted/engaged by an unknown SP that has accessed their personal data.
  • Identity theft/fraud – concerns about hackers or rogue employees accessing users’ financial/ID information and stealing their money or identity.
  • Reputation among peers – concerns about personal/private data being shared with their friends and family, resulting in public embarrassment or judgment.
  • Discrimination/penalization – fear that personal data may be misinterpreted, resulting in user discrimination, penalization, or even persecution.
  • Physical/emotional harm – concern that data misuse may result in physical harm (e.g., child predators) or cause mental anguish (e.g., breaking into house).

(The original chart rated each scenario per market as: Primary concern, Major concern, Minor concern, or Not a concern.)


RISK MITIGATION

Most user-participants say that while they are concerned about their personal data, they have no means of protecting it. They are resigned to the risks associated with sharing personal data and feel they are currently doing everything they can to protect themselves.

  • They sign in to express personal context and use multiple identities to manage various personas.
    • Many use their email address as a login for their important services or high-value transactions (e.g., banking) to ensure they have clear communications and transparency into their accounts with these services.
    • Some create special identities/pseudonyms for social sites and more specific online activities (e.g., gaming, forums) due to the lack of trust in these service providers and/or the other users accessing the service and their data.
    • Many create new identities or use old email addresses specifically for junk mail.
  • They keep accounts separate from family members to keep their information private.
  • They volunteer as little data as possible to vendors, or input fake information.

USER PERCEPTIONS OF ACCOUNTABILITY

User-participants see an accountability role for all entities involved (to varying degrees in each market), with the greatest expectations for service providers and government.

User – Users feel they are accountable for the data choices and decisions they make (e.g., opting into Terms & Conditions agreements, what specific data they supply). Trigger points for motivating users to be accountable for their data include fear of plausible risks. The amount or degree to which they think they can control their data also impacts their sense of accountability. They admit they do not take much responsibility over their data now (i.e., do not read T&Cs even of service providers they do not know), so they primarily rely on service provider reputation. Users say they are willing to take additional steps to control their data (e.g., review the data service providers have about them) but question the amount of time and effort they would actually spend on this.

Intermediaries – An intermediary (a 3rd party acting as a data conduit between users and service providers, with government certification) could safeguard and manage users’ data on their behalf.

Government – Government is expected to establish, administer, and enforce (via fines, etc.) personal data protection regulations.

Service Providers – Service providers are expected to keep data safe, use data “properly”, be transparent with users, and abide by personal data regulations. Users would also like to be given the right of self-determination.


EXPECTED DATA PRINCIPLES

In order to mitigate their concerns about personal data collection and usage, users desire and expect 5 key data principles.

1. Control – user-participants desire access to a graduated level of control (e.g., initial opt-in/out decision, ongoing permissions for data access, automation approval, enabling data sharing, and requesting that data be deleted when the relationship is over), but some question the amount of time/effort they would actually spend exercising this control.

2. Transparency – user-participants want clear and understandable insights into which entities have their data and how it is being used so that they can administer their control more effectively. However, they are unsure as to how these should be delivered or whether the information would be consumed.

3. Value Exchange – in exchange for their data, user-participants expect to receive a benefit (e.g., financial reward, enhanced service, convenience, accuracy) which they perceive to be of comparable value to their data.

4. Accountability – to varying degrees across markets, user-participants see an important role of accountability for each of the entities involved: government, service providers, and users. In Hamburg and Shanghai (and Denver to a lesser extent) there is also a role for an intermediary.

5. Protection/Enforcement/Redress – user-participants want the assurance that any regulations that are placed by the government will be enforced in order to deter the abuse of their personal data by service providers.

KEY DIFFERENCES BY MARKET

Personal data awareness
  • Denver: Somewhat aware
  • Toronto: Not really aware
  • Hamburg: Somewhat aware
  • Shanghai: Not really aware

Most sensitive data types
  • Denver: Banking, gov’t ID, and children
  • Toronto: Banking, gov’t ID, and children
  • Hamburg: Banking, health, and contacts
  • Shanghai: Banking, gov’t ID, contacts, and own image

Greatest perceived risks
  • Denver: Identity theft/fraud
  • Toronto: Identity theft/fraud
  • Hamburg: Third party access to data/harassment
  • Shanghai: Third party access to data/harassment

Accountability
  • Denver: Users should be accountable for own data
  • Toronto: Users should be accountable for own data
  • Hamburg: Users should be accountable for own data
  • Shanghai: Users are not accountable for own data (SPs are)

Trust in service providers
  • Denver: Most trusted: banks; least trusted: Facebook
  • Toronto: Most trusted: banks; least trusted: Facebook
  • Hamburg: Most trusted: banks and online shopping vendors; least trusted: Facebook
  • Shanghai: Most trusted: MSN; least trusted: QQ

Employer rights to data
  • Denver: Only work-related data
  • Toronto: To all data in the workplace
  • Hamburg: Only work-related data
  • Shanghai: To some work-related data

Role for intermediary
  • Denver: Some role
  • Toronto: No role
  • Hamburg: Some role
  • Shanghai: Major role

Level of trust in government
  • Denver: Low trust in government to protect users
  • Toronto: High trust in government to protect users
  • Hamburg: High trust in government to protect users
  • Shanghai: Only men trust government to protect users

Identity management
  • Denver: 3-5 IDs, most say device is primary
  • Toronto: 3-5 IDs, split between ID and device as primary
  • Hamburg: 3-5 IDs, most say device is primary
  • Shanghai: 4-5 IDs, most say device is primary


DETAILED FINDINGS

THE USER PERSPECTIVE OF PERSONAL DATA



USERS’ RELATIONSHIP WITH TECHNOLOGY

Technology enables users to do more but also makes them feel powerless. Consequently, the perception of control becomes a critical factor and may be affected by data context variables.

POSITIVE ASSOCIATIONS

  • Makes things users currently do more efficient, thus making life easier
  • Enables users to do things that were not previously possible or had required more effort
  • Enables users to stay connected with others, the environment (e.g., news, weather), and the world
  • Is fun/entertaining

NEGATIVE ASSOCIATIONS

  • Changes quickly, is difficult to keep up with
  • Reduces face-to-face communications
  • Weakens personal relationships
  • Increases users’ dependence on technology, resulting in a decreasing sense of control
  • Decreases personal privacy

“It’s super that there’s technology. There are many new things that make things easier in both my professional and private life.” Hamburg

“When it works, it works great. But everything is now dependent on technology.” Toronto

“On the one hand it is very convenient. But our ability is fading. I cannot write a lot of Chinese characters now because I use the computer too much.” Shanghai

“[In 2030] I’m afraid that [technology] might impact the family. Maybe the traditional attachment will be reduced.” Shanghai

“I kind of feel like we’re children in a candy store. It’s so new and it’s so exciting – we’re so distracted by it. I don’t think we have yet matured. We’re still consuming as much as we can.” Denver

Today . . . In 2030 . . .

  • Most users do not envision the impact of technology in the future changing from positive to negative or vice-versa.
  • Instead, most say it will evolve so as to even further enhance its impact today (e.g., even more convenient, even less face-to-face communication, etc.).

AWARENESS OF PERSONAL DATA ECOSYSTEM

Within each of the markets, user-participants have mixed levels of awareness of how data is accessed and used today, along a spectrum:

  • Zero knowledge of ecosystem
  • Limited knowledge; have seen data reflected in online ads
  • Experience with data misusage (e.g., “they sold my data”)
  • Fully aware of ecosystem (e.g., “my data is exchanged for assets”)

Most users in Toronto and Shanghai fall into the “limited knowledge” area, as many see their data reflected by entities other than the service provider they were originally dealing with. While some have limited awareness, many users in Hamburg and Denver are aware of and understand the ecosystem and can articulate what their personal data is used for.



DATA CONTEXT VARIABLES IMPACTING USER SENSITIVITY

User-participants’ sensitivities to their data access/use are impacted by 7 key variables. These variables define the data context:

  • Type of Data
  • Type of Entity
  • Trust in Service Provider
  • Collection Method
  • Device Context
  • Data Usage
  • Value Exchange


DATA CONTEXT VARIABLES IMPACTING USER SENSITIVITY, CONT’D

1. Type of Data – What type of data is it? Unless it is considered a vital part of the service offered, banking data is most sensitive for all users, followed by government identification, health information, and peer contact information.

2. Type of Entity – Who is accessing it? Most users say they do not want the government to access any private data. They are also concerned about unknown vendors accessing their personal data without their permission. Of all service vendors, users consistently have least trust in social network companies and most trust in banks.

3. Trust in Service Provider – What gives users confidence in service providers? Although they report there is no guarantee, or they may not have a choice, users say 3 elements impact their trust levels: (1) reputation – brand familiarity, word-of-mouth recommendations, a personal relationship, company size; (2) location – organizations with a local/national presence are more likely to abide by regulations; and (3) free vs. paid services – a trade-off between user risk and free services, understanding that free services generate income by selling user data.

4. Collection Method – How is the data collected? As collection methods (and awareness thereof) move from active to passive, users become more and more wary, as they feel a lack of control over their data. Jurisdiction is a factor in acceptance of more passive methods, as users do not feel the need to protect their information (e.g., image) when in the public domain (vs. at home).

5. Device Context – Which device is being used? Users differ in their views of which device is safest to access data from; no one device is universally deemed “safest”, but the type of device is a relevant factor.

6. Data Usage – How is the data being used? Users do not want their data to be used without their knowledge, nor by an unknown vendor. They also have negative reactions to automated uses of data, as they feel it lacks flexibility and control for the user.

7. Value Exchange – What is the user getting out of it? Users are willing to exchange personal data for an immediate personal benefit that reflects the value of their data. The most appealing benefits include discounts, better service/improved product, and convenience/time savings. A benefit to the social good is not as motivating as a personal benefit, but can be a factor.

TYPE OF DATA BEING SHARED/ACCESSED

There are specific types of data that user-participants are sensitive to sharing unless they are relevant to the service provided and yield a benefit to the user.

(Original chart of most sensitive data types by market covered: banking, gov’t issued ID, children, name/address, health, user image, and friends/family contacts.)


DATA SHARED WITH SERVICE PROVIDERS TODAY

User-participants say service providers currently have access to both their contributed and observed data, although many cannot immediately recall the amount of data they provided to register for services.

(Original table of user-reported data that service providers have access to mapped the data types name, user ID, location/address, email, age/date of birth, demographics/life stage, gov’t identification, financial account data, relevant user activity, friends/family, and photos/videos against the provider types social media, shopping, gaming, banks, employer, and university.)

TYPE OF ENTITY ACCESSING THE DATA

In all markets, user-participants say trusted vendors and employers can access/use relevant personal data, while the government should not have access to any private data.

Vendors
  • They are comfortable giving vendors access to their data if they are trustworthy and provide some benefit to the user in exchange.
  • The “Trust in Service Providers” pages explore the elements driving user trust in service providers.

Employers
  • They agree employers have a right to employee data, but to varying degrees across markets:
    • The employer should have access to both personal and professional data, as the employee should be cognizant of their activities when in the workplace.
    • The employer should only have access to work-related data.
    • The employer should only have access to administrative data, and some professional data (e.g., client lists) remains the property of individual employees.

Government
  • Users in Toronto, Hamburg, and Denver oppose giving the government access to their private data, with the concern that it may be used against them (although some users in Toronto say the government would not likely sell their data to a third party).

“I will have a positive attitude because the store provides a discount. It’s a benefit for me. I would fill out this form and will fill in my real information.” Shanghai

“The employers shouldn’t know about my lunch specials. It’s private I think.” Hamburg

“Jobs are asking for people’s Facebook passwords. That should be illegal. It’s invasion of privacy, right? They can’t use it at the workplace.” Denver

“I’m concerned that Revenue Canada will ask eBay for my information in order to collect additional taxes.” Toronto

Moderator: Should the government have access to your personal information? “I don’t think that’s a question. They already do. Anything they want. They shouldn’t though; they shouldn’t.” Denver


ELEMENTS OF TRUST IN SERVICE PROVIDERS

User-participants identify 3 elements they say drive their trust, and thereby their confidence, in service providers that collect and use their personal data: reputation, location, and free vs. paid services.

  • While trust is considered an important criterion, participants say that it is not the only essential aspect of a data exchange relationship; they are not willing to provide more, or more sensitive, information to a service provider simply because they trust it.
  • They say that, in addition to trusting the service provider, they should still feel that the data shared is relevant to the service, and they should receive a benefit in exchange.
  • When trust is lost, users say it can be regained if the service provider is transparent about the issue, fixes the problem, and compensates the user.

“What would make me 100% comfortable? If the risk is perceived to be low. If my friends are using it. And, if it is delivering value to me.” Toronto

Moderator: So if you’re dealing with an entity that has a good reputation, would you be willing to give them more information? “It depends what kind of information they were looking for. Date of birth, I’d give them that.” Moderator: Social Security number? “No. That’s just an invitation to identity theft.” Moderator: What if it’s a company with a good reputation? “That’s why they wouldn’t ask for that in my opinion. Why would they need that? A company with a good reputation that is soliciting me to purchase a magazine subscription is not going to need my Social Security number.” Denver

REPUTATION

User-participants say they can trust a service brand with a good reputation, which is built on four key pillars:

1. Brand familiarity
  • Well-known brand, name I recognize
  • Has its reputation at stake so not likely to do anything malicious

2. Word of mouth recommendation
  • Others are using the service and have not had any issues
  • Particularly important for start-up service providers

3. Personal relationship
  • Have had a positive and lengthy relationship with the service provider
  • Have had an issue with the provider which was resolved satisfactorily

4. Size of company
  • More likely to have the technical capabilities to keep data safe
  • Can afford higher salaries, so can hire the best people

“I would not sign up if I don’t know who the company is. I’d [look on a search engine] it.” Toronto

“Even if it’s like weird, you’re going to do it because [your friends have] already accepted it. You know that they’ve already done it and they’re okay and they’re alive.” Toronto

“For me, it’s sort of word of mouth. Like, if I hear from enough people that an app or some online service is good enough and worth it, I’ll breeze through the sign up process; I’m going for it.” Denver

“If I were a customer for 30 years, then it wouldn’t matter [what information about me they have] because they are my store.” Hamburg

“In [online retailer] I have trust. It’s a big company. A little shop I would be more suspicious of. I might be wrong if it is reputable.” Hamburg


LOCATION

User-participants in all markets state that they have more trust in entities with local ties or that obey local laws.

  • In Toronto, users say they would be more comfortable knowing that a service provider accessing and using their data was based in Canada, as it is more likely to obey local laws. They are particularly wary of service providers based in Asia and some parts of Eastern Europe.
  • Some users in Shanghai say they would prefer that the service provider be global (vs. China-based), as it is more likely to have a positive reputation and better technical capabilities. However, some want the service provider to have an office in Shanghai so they know who to contact if there is a dispute.
  • In Hamburg, users say they are not as concerned about where service providers are located as long as they obey German personal data protection laws. Some do feel that German companies are more likely to respect and be compliant with German laws.
  • While users in Denver are fairly cognizant that large, global entities may store data anywhere in the world, they feel it would be more secure in the US or Canada. A few add that they are comfortable with entities located in Europe, as they say data protection laws are stronger there.

“[Companies outside Canada] will have different laws governing what they can do with your personal information. Anything that’s outside of Canada, just naturally I’m more sceptical. I don’t even really know why.” Toronto

“I think foreign companies are more advanced. You don’t have to worry if it is credible. But I hope this company has branches in China because you want to deal with Chinese people.” Shanghai

“I would prefer that it was local or German. European is also okay. It depends on the idea and who is behind it... It’s important that they fulfil German laws and data protection.” Hamburg

“We live in a virtual world, so as long as it’s the company that they say they are and where they are, that’s fine. I would say in any of the non-terrorist countries, sure.” Denver

“There’s more secure places in the world. This is virtual; that’s kind of different, so not in Mexico. I feel like it had better be in the United States or Canada.” Denver

FREE VS. PAID SERVICES

Some identify a trade-off between paying for services and the amount of risk they accept in order to access them. Some participants in Shanghai and Hamburg say they have more trust in service providers to which they pay a fee.

Free services
  • Users understand that they must exchange their data for the ability to access the free service.
  • They acknowledge that vendors will sell their data to third parties in order to make revenue.
  • There is sentiment that they are put at greater risk, which they feel they do have a choice to accept or not.

Paid services
  • If users pay for the software/service, they expect their data to be used only by the service provider.
  • Some also expect that they have more rights if there is a problem or dispute.
  • A few users say in these circumstances they are more likely to read the service’s terms and conditions so they are fully aware of these rights.

“I used [social networking site] for free. They don’t charge. If you want the privacy to be completely hidden, it is not possible. If it is free, it means I accept the risk.” Shanghai

“It’s like if something is free, you are the product. Like with [email site], you give them your information, whether it’s true or not, they are going to be able to figure it out based on the emails you send and receive.” Denver

“[Social networking site] is only free because of advertising. If I don’t want advertising then I have to pay for it.” Hamburg

“If I paid for the service, then I will fine them. I have a bigger chance of winning [a dispute]. If it is free, then I wouldn’t know what to do.” Shanghai


HOW THE DATA IS BEING COLLECTED

User-participants are largely consistent across markets in their sensitivities towards the different data collection methods explored in the scenarios

Passively Collected Data

  • Users are generally hesitant about the idea of passively collected data with concerns about

lack of user control over when/how it is collected and how it might be used.

  • Many users indicate that acceptance of would increase if they:

− Were familiar with and trusted the service provider collecting the data. − Were able to negotiate the value exchanged for their data. − Were provided clarity about how the data is collected and what it is used for. − Had ability to choose which types of data and data sources can be accessed and forwarded on by the service provider. Autonomously Collected Data Data gathered (a) while user is actively involved in a transaction, but without user awareness (e.g., location data). (b) Automatically without user awareness of a transaction taking place (e.g., CCTV capture).  Mixed reactions Inferred Data Derived from analytics that aggregate data collected with other available data sets.  Some negative reactions as they feel it is too far out of their control.

Actively Collected Data

  • Users are most comfortable with the idea of data that they themselves have inputted or volunteered, as they feel they have more control.

Collection Method


PERCEPTIONS OF ACTIVELY COLLECTED DATA

“If I put information out there, I would expect others to access it. If it was a disadvantage, I wouldn’t put that information out there.” Hamburg
“I would always prefer to register [versus being observed]. I believe that would be the most correct path to use.” Hamburg

User-participants are most comfortable with the idea of data that they themselves have inputted or volunteered as they feel they have more control over this

Generally, users prefer that service providers collect (and use) data this way, as users can:

  • Choose whether or not to register for the service provider initially.
  • Decide what real data to contribute vs. being anonymous.
  • Determine which service providers see what data (e.g., through the usage of several online personas).

Collection Method


SLIDE 17


“If I am recorded in public places, I’m okay with this data. There is no risk. If there are other people who are hostile, maybe that person has done something.” Shanghai
“That’s kind of a legal issue really. If they put something in your personal home without your knowledge. But the police do that all the time. I don’t know. They do it on TV. If it’s not community property and it’s your own personal space, I don’t think it’s okay. I think you have a right to your privacy and that’s why we all have our own.” Denver
“Knowing that someone knows where you are at all times is scary.” Toronto
“I’m just a face in a huge crowd. It’s not like they’ve got specific information about me.” Toronto
“What they do with the [camera footage] data, if you look at what the software can do, like detecting faces. I’m an analyzed algorithm and I have nothing to protect me.” Hamburg

Reactions to passively collected data differ based on where it is collected, who it benefits, and how it is analyzed

Where it is collected (e.g., jurisdiction)

  • In public: somewhat expected and out of the user’s control; the user is less likely to be individualized.
  • At work: justified (the employer has rights to this data); not likely to be shared outside of the office.
  • At home: too intimate and personal; reputation and security concerns.

Who it benefits

  • The user: gives them a reward or convenience.
  • Society: data is used to enhance public security (e.g., locating and catching criminals).
  • The service provider: users are monitored and data is used inappropriately or sold to a third party.

How it is analyzed

  • In aggregate with other users: less risk of users being analyzed and targeted individually; safety in numbers.
  • Individually: concerns that too many details are provided to the service provider (e.g., facial recognition).

PERCEPTIONS OF PASSIVELY COLLECTED DATA

Collection Method


“There is access to so much information that they did not have before. It may hurt me.” Toronto
“The more you reveal about yourself, the more possibilities they have to search for stuff about me. I wouldn’t want them to know everything about me.” Hamburg
“I think there is a privacy line and that is maybe where technology is not good – if they get too much information on a person and you don’t even know what they have on you.” Denver
“I don’t think that is an issue because that is kind of anonymous – you are not saying this person has a high body temperature and we need to turn up the air conditioning while she is in the room. The right data anonymously would be fine.” Denver

Some user-participants have negative reactions to the idea of inferred data as they feel as though it is too far out of their control

PERCEPTIONS OF INFERRED DATA

  • For most, the concept of inferred data is not one they are familiar with, so they are initially ambivalent in their reactions.
  • Some report that they are uncomfortable that this type of data can be collected by service providers, especially if collected without their knowledge.
  • Others say they are less concerned that inferred data is being collected, but worry where the data is going and how it is being used (although most agree that the usage of inferred data to benefit the user is okay).
  • As with other forms of data, users say they are comfortable with the concept of inferred data as long as it is analyzed and used in aggregate with other users’ data.

Collection Method


SLIDE 18


Some users say both devices are equally safe because:

  • The user may conduct the same activities on both devices.
  • Neither device is perfectly safe from hackers.

“I don’t have any privacy settings for my microblog currently because I always log in from my phone, so I think it is safe.” Shanghai
“I feel like if somebody’s going to access them, I’d rather have it on my phone over my computer, because I keep a lot more personal information on the computer than I do on the phone.” Denver
“It can suggest different things depending on what device I’m accessing the service from.” Toronto
“It doesn’t make a difference; I do the same activities on both devices (mobile phone and PC). The only difference is the localization services on the mobile phone. They often localize you, and I don’t have that on the laptop at all.” Hamburg
“Anything that I use is private, whether it is my phone or my laptop.” Denver

Across each market, participants differ in their views of which devices are safest to access data from

  • Users vary in their behaviours in sharing data and services on their devices.
  • Many access every consumer service on every device, while some do delegate specific devices for specific services (but this is usually driven by usability/functionality reasons).
  • Employees tend to access both work and personal data/services from work devices, unless this is strictly prohibited by the employer.

Mobile phone
  • Safe because: not shared with other users; not used to access public Wi-Fi; does not have personal files stored on it; has less data overall on it.
  • Not safe because: used everywhere; easy to lose; has all communication forms; SPs can access users’ location via GPS.

PC/laptop
  • Safe because: only used at home; less likely to be lost; not all forms of communication (e.g., SMS) are conducted on it; no localization/GPS services.
  • Not safe because: shared with other users; used to access public Wi-Fi; has personal files (e.g., financial, photos) stored on it; has more data than smartphones.

SAFEST DEVICES TO ACCESS DATA FROM

Device Context


HOW DATA ARE BEING USED

User-participants want their data to be used . . .

  • in a way that allows them to negotiate a value, benefit, or reward;
  • to inform the service provider of changes that would improve users’ experience; and
  • by a service provider that they feel they can trust.

User-participants do not want their data to be used . . .

  • without their awareness;
  • for the sole benefit of the service provider;
  • in a way other than what has been communicated to or understood by the user; and
  • by an unknown third party without their permission.

“If it is accompanied with a good intention and keeps the information to itself and it wants to make the customer happy, then it’s okay. But, if it sends the data to someone else, it’s not funny anymore.” Hamburg
“It’s good if [the device uses my playing habit data] to improve the system and how it works. [The downside is if] they’re using it without you wanting them to use it.” Toronto
“If for environment protection purposes they need to install a camera in my home, that’s why I am cooperating. But if they research a new product and use my data for business purposes, I’m not willing to let that information be collected.” Shanghai
“I guess there is a part of me that also feels like ‘Alright, so if I give you all this information, what are you really going to do with it?’ That’s kind of like the part of me that’s a little bit naïve as to what they are really going to do with the information.” Denver

Generally, participants are comfortable sharing data when it is used the way they expect or when they receive benefits that they think are equivalent to the perceived value of their data

See section “Personal Data Risks” to explore risks that users are most concerned about

Data Usage


SLIDE 19


AUTOMATED USES OF DATA

  • While some users say that automation could be convenient, most would be concerned that their data might be used incorrectly, resulting in mistakes and/or embarrassment (e.g., emails going to the wrong contacts).
  • As such, they would want to have the final decision on automated suggestions.
  • It would also be important that the service provider use the data with some protection measures in place:
    − The service provider should not have access to specific data content.
    − The service provider should not share personal data with other users.

“Paying bills is fine but making appointments is difficult due to the irregularity of my schedule.” Toronto
“If it is so automatic, I have nothing to do. There’s no choice for me to make.” Shanghai
“This makes us totally dependent. In 10 years’ time you lose your independence. You don’t know how things work anymore. One day I won’t know how to do things anymore. I needn’t live anymore.” Hamburg
“It is not your business. I am a very private person and it is not your business if I go get my haircut or I am going for a walk today. I believe that there are still other motives out there. Even though we are using this for marketing purposes only, I still sometimes think that there is a hidden agenda or it can be used some other way than what it was initially intended for.” Denver

Overall, user-participants have negative responses to automated uses of data, as they feel such uses lack flexibility and do not give users enough control

Hamburg and Shanghai users in particular have strong negative responses to automation, as they feel it takes away their independent ability to make decisions. US users have negative reactions towards services acting autonomously, as they say implicit actions seem suspicious and/or feel too aggressive (as found in a separate research study).

Data Usage


WHAT THE USER IS GETTING OUT OF IT

  • Participants in all markets are cognizant of a data/value exchange, as they explicitly state they are willing to provide more of their personal information for more benefit received, especially if they receive that benefit immediately.
  • They carefully weigh the exchange trade-off when determining how much information they are willing to share with an entity.
  • Some say they consider whether or not they require the service in the first place and/or if they could get this benefit via some other channel (e.g., through an existing relationship or via their own efforts) without having to share their data.

“Why would I even want their services if I’m not going to get a whole lot out of it?” Denver
“I would not give out my weight information normally. But if Wal-Mart gave me a 50% discount, then yes.” Toronto
“I would give my household income information if I can get something right away; otherwise, it’s none of their business.” Toronto
“The navigation service is free; it provides you a service, so I think it’s okay that they get your personal information.” Shanghai
“That’s how they pay for it – if they can’t do it that way, then they’ve got to charge us for the usage. A lot of network maintenance goes on in back.” Denver
“It makes life easier but it wouldn’t be that much effort to do it yourself.” Hamburg

User-participants expect to get some sort of value when service providers access and use their personal data

Hamburg, Shanghai, and Denver users in particular articulate a clear understanding that users are required to share their data in order to receive free services.

Fair Value Exchange


SLIDE 20


CONSENT TO TRUST

A few user-participants feel they must trust a service provider simply because they want to use the service

  • In some cases, they are motivated by particularly appealing benefits (e.g., discounts, convenience, connectivity) and so are willing to trust an unknown service provider and take the risk.

“I have to accept this passively, because I need it. You go to buy things in a supermarket and you apply for membership cards to get a discount, and the supermarket records your consumption.” Shanghai
“You need to keep contact with society so you need to shoulder the risk.” Shanghai
“I feel like crap every time. I don’t want to do it but I know I have to. You’ll become a social outcast. You can’t not do it or how are you going to live?” Toronto
“I think that’s just part of how it is nowadays. I’m resigned to it maybe.” Denver

In Shanghai and Toronto in particular, participants discuss a lack of choice, as they must trust the service provider in order to use a service. A few say that they feel pressured to adopt the service because they would feel like “a social outcast” otherwise. US participants describe a sentiment of feeling “resigned” to consenting to trust an entity. Often, they will trust an unknown service provider and share their information simply because they feel this data is already “out there”.

Fair Value Exchange


PERSONAL DATA RISKS


SLIDE 21

User-participants across all markets have varying levels of concern about the key risks they associate with the misuse of their personal data

PERCEIVED RISKS

(Concern levels range from: Not a concern – Minor concern – Major concern – Primary concern)

Data sold to 3rd party / harassment: An annoyance from being targeted/engaged by an unknown SP that has accessed their personal data.

Identity theft / fraud: Concerns about hackers or rogue employees accessing users’ financial/ID information and stealing their money or identity.

Reputation among peers: Concerns about personal/private data being shared with their friends and family, resulting in public embarrassment or judgment.

Discrimination / penalization: Fear that personal data may be misinterpreted, resulting in user discrimination, penalization, or even persecution.

Physical/emotional harm: Concern that data misuse may result in physical harm (e.g., child predators) or cause mental anguish (e.g., breaking into a house).


RISK: DATA SOLD TO A THIRD PARTY

  • User-participants in all markets describe this risk as ‘harassment’, as they are being targeted and engaged by a service provider with which they do not have a connection.
  • Toronto, Hamburg, and Denver users experience this through direct marketing or advertising (e.g., online or mail).
  • Some Shanghai users also report having experienced more aggressive tactics, with unknown service providers calling them or showing up at their door.

“I think what is uncomfortable is that there are companies out there now that know certain things about you that you didn’t expect. So, it’s a surprise. It kind of seems like manipulation because they know all of these things.” Toronto
“After I delivered my baby, there were nanny services calling me. I think my information was leaked by the hospital. That’s negligence.” Shanghai
“I imagine that third parties would annoy me with advertising or too many suggestions. There’s no real damage. It’s harmless when it comes to my physical health and wellbeing, but I simply wouldn’t want to be annoyed with advertising.” Hamburg
“If I didn’t fill anything out, but I just showed up and got a discount, I wouldn’t see anything wrong with that.” Denver

As this is something many users experience, it is the most top-of-mind concern as a primary or major annoyance

Some Hamburg and Toronto users feel that it is creepy to have an unknown provider know them somewhat intimately and prefer to be marketed to more anonymously. They also say that they feel as though they are being manipulated to buy products or services that they would not otherwise spend their money on. That said, they do not feel as though they are at risk of any real pain or harm. A few users in Denver say they would not mind their data being sold to a third party if they received a financial benefit (e.g., a discount) as a result.

SLIDE 22


  • User-participants across all markets discuss concerns about hackers or rogue employees accessing their financial information and stealing their money.
  • Some say their credit card information is safer to share than their debit card information because:
    − Payment providers have measures in place to safeguard their customers from fraud.
    − Debit cards grant entities direct access to users’ bank accounts.
“My bank account information cannot be known. Criminals have a lot of technology and can guess my password, and my finances are at risk.” Shanghai
“Bank account information causes a direct loss. Other information does not have as big of an impact on me.” Shanghai
“Somebody I know, today, was telling me that they had no mortgage on their property and someone had stolen their identity and mortgaged their home.” Toronto
“Identity theft. That’d be the worst. Because it’s really hard to recover.” Denver

Identity theft or fraud is a key concern, as it can cause significant financial harm; most report having first- or second-hand experience with identity theft

This risk is particularly top of mind for users in Shanghai, as there are many first- and second-hand experiences of social media accounts and user identities being stolen and used to con friends and family out of money. A few users in Shanghai, Denver, and Toronto also note the potential for their identity to be stolen (via their government identification) and used for criminal purposes. Identity theft is a major concern for users in Denver, as most have either first- or second-hand experience with credit card or debit card information being stolen.

RISK: IDENTITY THEFT/FRAUD


RISK: REPUTATION AMONG PEERS

  • Many say they explicitly control what personal data is shared out to relatives and peers (especially spouses).
  • For most, there is a potential for public embarrassment or judgment should a service provider share their private information with friends or family, e.g.:
    − How often they shop, how much they spend.
    − How much money they have.
    − Where they have been/where they go.
    − What their home looks like.
  • Alternatively, users say they do not care how they are perceived by service providers (i.e., whether or not they have constructed an accurate profile of the user).

“I don’t want people to find out information about me that I don’t want to be known.” Toronto
“I don’t want my girlfriend to know that I went shopping there. It’s not dramatic but it is something private.” Hamburg
“I don’t want other people seeing the current situation of my home because it is messy. It would affect my image.” Shanghai
“They might send something to your school address. It might affect my reputation. Other people will think I’m weird and people might point at me.” Shanghai
“I feel like it’s your reputation on [social networking site]. People steal passwords and click on these crazy videos. And then they’ve got a girl with no clothes on their page. Everyone’s like, ‘Why are you posting all this stuff?’ They’re like, ‘Oh my God, my password’s been stolen.’ You go to get a job later, and they’re like ‘Oh, a few years ago, you posted a bunch of pictures of all these girls.’” Denver

In Shanghai in particular, participants worry about their reputation among their friends and family being impacted as a result of their data being exposed

The worst case scenario for Shanghai users is a ‘flesh search’ which has the purpose of identifying and exposing individuals to public humiliation.


SLIDE 23


RISK: DISCRIMINATION/PENALIZATION/PERSECUTION

  • Users in Hamburg and Denver (and also Toronto, to a lesser extent) say they worry that a service provider might publicly share out their activities or behavioral patterns.
  • They raise concerns about being judged or discriminated against as a result, such as:
    − Reading a newspaper article might label a user with a particular political affiliation.
    − Having their financial data shared with a service provider might disadvantage a user with missed opportunities.
    − An employer might track an employee’s movements in the office and reprimand them for a lack of productivity.
    − A potential employer or financial lender might see pictures of a job candidate drinking on the weekend, resulting in a lost opportunity.
  • Some users look back to history (e.g., WWII) at how people were identified and persecuted based on demographic data (e.g., race, religion, etc.).

“News articles. It gives a strong insight into my political interests. Something could be detected. In other countries, they could potentially have pressure put on them.” Hamburg
“The employer might use the information to the disadvantage of the employee if she spends too much time in the kitchen or the copier room.” Hamburg
“It also depends on what you believe humanity is capable of. When they rounded up the Japanese in the US, that wasn’t very long ago, and that was done through the census. So that was people having to put out their information. This is in a sense like a census, but not like your race and all that stuff, but it’s like down to your profile. Like, are you homosexual or not? What’s your religious background?” Denver
“Another big one [private data] would be religion. What happens if the Holocaust happens again?” Toronto

A concern of some participants is that their personal data might be misinterpreted and they may be discriminated against or persecuted as a result


RISK: PHYSICAL/EMOTIONAL HARM

“I’m concerned about perverts calling my daughter.” Toronto
“I wouldn’t want anyone to know when I’m not home. It’s nobody’s business when I’m on holiday. The door will be wide open for burglars.” Hamburg
“Some criminal mind could be tracking this woman, knowing when her apartment is vacant in case she has millions of dollars worth of jewelry they can go in and steal. They know where she works. They can cause her all kinds of grief at work. They know her daily routine, which if it is not used in a good manner is purely evil.” Denver
“I feel a bit concerned if my address is known by other people. If someone comes to my door, something bad will happen.” Shanghai

In each market, a few participants convey concerns that the use of certain types of personal data may result in physical or emotional harm

  • Parents in Toronto and Denver say they are not willing to share information about their children with service providers, with concerns that predators may gain access (either via hacking or a rogue employee) and target their family.
  • Some Denver and Hamburg users have similar concerns with sharing their location information, which criminals may access so as to determine when they are not at home.
  • A few female users in Shanghai have experienced, or worry about, third parties (to which their data has been sold) showing up at their home. While they do not specifically articulate concerns about being physically harmed, they say that this situation would ‘terrify’ or ‘frighten’ them.


SLIDE 24


REALITY OF RISKS

Toronto participants say they generally feel safe

  • Some are not concerned about risks, as they have not personally been the victim of data abuse or do not feel they are a target (e.g., they do not have anything to steal or hide).
  • Others say they feel safe and protected because of where they live, with a few older users pointing out that Canada has a privacy commissioner.

Shanghai participants tend to worry more about data security

  • Some users in Shanghai indicate that the subject of personal information risk is not something they consider often, if at all, so do not feel they are personally at risk.
  • A few say that they change their information (e.g., phone number, address) frequently enough that they have nothing to worry about.

Hamburg participants seem more aware of risks

  • Many say they take their privacy into consideration every time they interact with an online service provider (e.g., carefully considering what data to share).
  • But a few say they might not think of it unless a personal data issue is brought up in the media.
  • Some mention current privacy regulations in Germany.

Denver participants are generally concerned

  • Many feel there is a good chance that their financial data will be at risk despite the security measures banks take.
  • While they feel that physical or emotional harm is less likely, they are cautious.
  • That said, some users indicate that the research discussion was the first time they gave the subject significant thought.

“They would leave me alone. I’m 50 years old and I’ve been good so far.” Toronto
“I think that we’re lucky that we live in Canada, so we don’t really know what the effects are of people knowing too much information about us. And, we have a privacy commissioner.” Toronto
“I feel okay sharing my information. I’m not concerned because my information is always changing and I haven’t had trouble for years.” Shanghai
“As soon as we get a discount, we forget about [our privacy concerns] except if we read about [social networking site] right before that. Then we are alert.” Hamburg
“I feel like [identity theft is] a reality.” Denver, Focus Group
“Definitely the worst case scenario [is data] used against me in a court of law. [But] it is unlikely because I don’t do things that would cause that.” Denver

In considering the risks associated with their personal data, user-participants consider the likelihood of whether or not they would actually experience harm


SENSE OF FUTILITY

“Once the information is out there, it’s out there.” Toronto
“I’m helpless. I’m in their hands and they can simply do what they want.” Hamburg
“It’s not like I bought a thing and it was damaged. If someone is harassing me, the government can’t help me. There’s nowhere to complain. They are too arrogant; they won’t serve me. I don’t think I have a lot of rights.” Shanghai
“Life is full of risks. If you get in the car there is a risk. If you walk there is a risk. And this is our life now. This is the risk we take for all the benefits. I’m not a real big user, but I can see huge benefits to stuff.” Denver

In each market, most participants say that while they are concerned about their personal data, they feel they have no means of protecting it

“All my information is out there anyway.”
“If I didn’t want to share my information, I wouldn’t use the service.”
“It’s out of my control.”
“Corporations are getting more information about people every day.”
“I’m doing everything that I can be doing.”
“If something bad happens, there’s nothing I can do about it.”
“That’s the way the world is and I just have to live with it.”


SLIDE 25


RISK MITIGATION MEASURES

  • Examples of the most commonly executed risk mitigation measures include:
    − Use and management of several online personas, with consolidation into a few for important services and others for more specific purposes (e.g., spam, financial, work, gaming, shopping, friends).
    − Keeping accounts separate from other family members to ensure information is kept private from others.
    − Volunteering as little data as possible, or entering faked information.
    − Using pseudonyms to maintain anonymity.
    − Following public recommendations to refrain from sharing sensitive data (e.g., government IDs, financial info) and setting/re-setting privacy settings on social media sites.
    − Researching the reputation of new service providers.

“If you use one pseudonym everywhere, it’s easier to abuse it if someone finds out the password.” Hamburg
“Hotmail has the most junk. I guess I look at it the most frequently and can’t get rid of everything. My [email account] – only my husband, my daughter and that’s it.” Denver
“I don’t know my husband’s password because he prioritizes privacy and personal space.” Shanghai
“I don’t put a lot of my information out there. I put in fake information.” Toronto
“The news says it’s not safe [to share credit card information] so I don’t do things that are not safe.” Toronto
“Reading something about identity theft; you shouldn’t do this or you should do that. I will go in and check and see if I am matching what they say you should do.” Denver

Participants report that they do take some measures to mitigate the perceived risks

Shanghai participants discuss additional measures that are more specific to the security of their devices or online profile, such as installing firewalls or managing complex passwords.


ACCOUNTABILITY OF ENTITIES INVOLVED


SLIDE 26

User-participants see an accountability role for all entities involved (to varying degrees in each market), with the greatest expectations for service providers and government.

USER PERCEPTIONS OF ACCOUNTABILITY

  • Users feel they are accountable for the data choices and decisions they make (e.g., opting into Terms & Conditions agreements, what specific data they supply).
  • An intermediary (a 3rd party acting as a data conduit between users and service providers, with government certification) could safeguard and manage users’ data on their behalf.
  • Government is expected to establish, administer, and enforce (via fines, etc.) personal data protection regulations.
  • Service providers are expected to keep data safe, use data “properly”, be transparent with users, and abide by personal data regulations. Users would also like to be given the right of self-determination.

(Entities involved: User; Service Providers; Vendors; Employers; Intermediaries; Government.)

USER PERCEPTIONS OF ACCOUNTABILITY, CONT’D

Markets vary in perceptions of entity accountability: each market rates Users/Self, Vendors, Employers, Intermediary, and Government from High Accountability through Some and Low Accountability to None.

SLIDE 27


USER ACCOUNTABILITY TODAY

  • Most participants say that they accept a service provider’s Terms & Conditions agreement without reading it first.
  • Some account for this by saying that the agreement is often too long and complicated for them to read or understand.
  • They often feel pressured to blindly accept the agreement if they are adamant about using the service – the alternative is to opt out of using the service.
  • They often know (in the back of their minds) that by opting into the service, they are making a choice to take the risk of accepting the agreement without reviewing it first.
  • In doing so, they feel that if something does go wrong, the user would be the only accountable party.

“They all use legal jargon that the public does not understand. It’s a deterrent to reading it.” Toronto
“You always have to accept the general terms. I don’t read them all. I trust they will be alright.” Hamburg
“Sometimes it’s my own fault. I didn’t read the agreement. You make the decision, you have the responsibility.” Shanghai
“You’re the reason why it’s not getting managed correctly. You gave them the right to take [the information], and responsibility.” Denver

There is a sentiment that today, users are accountable for their personal data when they accept the conditions (and risk) of the service, even if they acknowledge that they rarely read the Terms & Conditions

A few users in Shanghai and Hamburg feel that this agreement serves the interest of the service provider, and does not address their own rights to protection. They would like to have a two-way contract in place that outlines the service provider’s obligations and the user’s rights.

EXPECTATIONS OF USERS (SELF)

  • Trigger points motivating these users to be accountable for their data include:
  • Risks that they feel are plausible or that they have first- or second-hand experience of.
  • Data types and sources that they feel they can control through their choices (e.g., what data to contribute, what devices to access services on, what connections between service providers they will allow).
  • Many say they could be more diligent with their current mitigation measures.
  • On an aided basis, there is some willingness to do more, such as:
  • Becoming educated on how service providers use personal data.
  • Reviewing own data that service providers currently have.
  • However, some question how much effort they would commit to these activities, especially if they feel the government should be regulating the use of data.

“I’m of full age. It’s my fault. Unlucky you if it’s not clear what you sign.” Hamburg

“We should be less carefree with our information. We don’t know who is looking at it.” Toronto

“I would review the information companies have on me initially, and then only again if there is any issue.” Toronto

“I don’t feel like investing my time to look at these things. I fully trust that someone has verified the legal character.” Hamburg

“They are accountable for keeping it safe, but you are responsible for giving it to them.” Denver

“[Do you have responsibility?] (Laughing) No, I didn’t leak the data.” Shanghai

Participants in all markets (except for Shanghai) feel they are responsible for making sure their data is used appropriately

Beyond their current security measures, users in Shanghai are not motivated to be accountable for the protection of their own personal data.


slide-28
SLIDE 28


  • Responsibilities of vendors as outlined by users (within the context of a lack of trust in the vendor and limited negotiating ability) include:

 Vendor should be transparent with users.
 Vendor should be registered/certified to access/use data.
 Vendor should control/regulate employees interacting with user data.
 Vendor should take security measures to protect user data from hackers.
 Vendor should not share user data without the user’s consent.
 Vendor should notify users of data compromises.
 Vendor should be monitored or regulated by the government.

Participants say vendors are ultimately responsible for making sure their personal data is used the way users want it to be used and for keeping their data secure

“If [the service provider] wishes me to use the software, then I would expect him to deal with it in a trustful way.” Hamburg

“The enterprise should have regulations to control their working staff.” Shanghai

“The people who collect the information should be responsible for it not being managed properly.” Toronto

“You trust the bank. They had credit card information stolen; it’s a huge entity. You’d think this would never happen to [bank], but it happened, so you can’t blame them because they went through all their security precautions, and someone just smart enough just kept working at the code and working at the code. It’s just like the flu viruses.” Denver

EXPECTATIONS OF VENDORS


Some users in Denver feel that service providers should not be held accountable if they are doing everything in their power to keep user data safe and it is still stolen.


  • Specific responsibilities of employers as outlined by users include:

 Employer should limit access to data to users’ immediate manager.
 Employer should not share user data outside of the company.
 Employer should take security measures to protect user data from hackers.
 Employer should delete user data (particularly personal data relating to the users themselves) when the relationship is severed.

  • Expectations of employers are more limited, as users do not feel as though they are in a position to negotiate, or would leave if trust was absent.

EXPECTATIONS OF EMPLOYERS

Expectations of employers are somewhat different as participants say the relationship between the employer and the user is a special one

“They probably would have the opportunity to sell things – the fact that she has a list of favorite restaurants. I just don’t think it needs to go beyond – they should not be able to sell your information.” Denver

“The employer should have a firewall that simply cannot be overcome. I can imagine a company collecting data and selling it to others who then try to sell me something I don’t want.” Hamburg

“If I am still in the company it’s okay, but if I quit my job, the company should delete my information.” Shanghai

“You have a choice here. Either you take the job or you don’t take the job. If it’s not acceptable, you don’t have to take it.” Toronto

“The company collecting the data, if they are doing it legally, should sign a declaration stating the purpose that the user would have to approve.” Hamburg

Given Toronto users’ attitude that the employer has rights to all data in the workplace, they feel that the user must be cognizant of, and responsible for, data relating to personal activities in the workplace (e.g., using the work PC for personal email). Users in Hamburg add that there should be some transparency for users, as employers should clearly communicate how user data is being used (e.g., monitoring productivity, monitoring energy usage, etc.).


slide-29
SLIDE 29


EXPECTATIONS OF INTERMEDIARY

  • Intermediaries (“3rd party” to Shanghai participants) are seen as a neutral solution to address the lack of trust in government or service providers to have access to all personal data.
  • Most want this entity to have the “regulatory” or “enforcement” authority of the government, as well as the perceived technical capabilities of the service providers.
  • Responsibilities of an intermediary as outlined by users include:

 Intermediary should store all of the user data in a central location.
 Intermediary should take security measures to protect user data from hackers.
 Intermediary should provide service providers service-relevant data only.
 Intermediary should provide the user with a means (e.g., a portal and code) by which they can access, view, and edit their personal data.
 Intermediary should be monitored or regulated by the government.

  • Very few say they would be willing to pay for an intermediary to manage their data on their behalf; instead, they feel this ought to be subsidized by the service provider and/or the government.

“I want a third party to manage the data. There must be a high requirement for a safe data center. Your information is saved at the center, not the store. It has a powerful defense against hackers and won’t disclose information to third parties.” Shanghai

“Someone in-between wouldn’t be interested in how much time I spend in a specific room.” Hamburg

“I don’t think that they’re necessarily going to manage your profiles and stuff like that, but I think that they are going to make you aware of how businesses are using your information.” Denver

“I think the [service provider] should pay for it. I don’t want to pay.” Shanghai

“I would not pay a third party … it doesn’t concern me enough.” Toronto

Many participants in Shanghai, some in Hamburg, and a few in Denver see a role for an intermediary to safeguard and manage users’ data on their behalf

Participants in Toronto do not see the need or purpose for an intermediary to manage their data.


EXPECTATIONS OF GOVERNMENT

  • They say there should be regulations in place to protect personal data – with these in place, they feel any available services should be safe to use.
  • Responsibilities of the government as outlined include:

 State should establish regulations/laws to protect users and their personal data.
 State should license or certify service providers to collect and use personal data.
 State should monitor the collection/usage of personal data by service providers.
 State should enforce regulations with punishment.

“The government can make sure that my information is not being shared.” Toronto

“The country should give some regulations in this field. The government has administrative power. If they make the law, then those people might be criminal.” Shanghai

“It would be nice if the government verifies [the usage of my data]. Something like “state examined” so you see that someone’s really looked at it. Like an independent foundation testing consumer products.” Hamburg

“I think they have more power or more strength behind, or it can impose more stricter punishment versus me.” Denver

“I don’t trust them to regulate. I think because they always take it a step further. They may come in and say ‘Well, we’re going to pass this law because it’ll help something’, but then they’ll realize they can push it further.” Denver

Participants expect government to establish, administer and enforce personal data protection regulations, but not monitor or control personal data

Some participants in Hamburg feel there might also be a role for a not-for-profit consumer protection agency (like TUV) to certify service providers who collect/use consumers’ personal information.


Participants in Denver are divided in their expectations of the government – some feel it is the only entity powerful enough to protect user data, while others feel it would abuse this position.

slide-30
SLIDE 30


EXPECTED DATA PRINCIPLES

Participants discuss 5 key data principle expectations:

1. Control
2. Transparency
3. Fair Value Exchange
4. Accountability
5. Protection/Enforcement

slide-31
SLIDE 31


CONTROL

  • 1. Point of entry: Opting in/out of service (choice)
  • 2. Exchange set-up: Giving the service provider permission to access data for a specific purpose; negotiating value in exchange for data
  • 3. Result actions: Approving automations; giving permission for data sharing with others

  • Across all markets, they want a graduated level of control over how their personal data is accessed and used throughout their relationship with a service provider.
  • However, some question how much time/effort they would spend exercising this control once they consider the amount of personal data there would be to manage.

“Allow me to pick and choose what information I want you to have.” Denver

“I [should] have the possibilities of controlling that I don’t have today.” Hamburg

“You should have the choice from the get-go. Go back and pick what you want them to do with the data.” Toronto

“If they ask for my permission and I agree, then the card center can transfer my information to another party. It’s authorization. We need such a process.” Shanghai

“I want the controls to be there but I will not necessarily read everything.” Toronto

“But it comes down to time, how much time do I have? Do I really have the time to sit there and go back through all that data and decide – ‘I don’t want them to have it, but that’s okay for these guys to have it.’?” Denver

Participants say control is a key requirement even if they take no measures to control their data today, and they question the amount of time and effort required


RIGHT TO DELETE DATA

  • Participants across all markets raise the concern that any data remaining after the relationship has ended would be out of their control and might be sold to a third party.

“The information might get leaked to other companies. I can’t imagine what they would do with it. They might call or send you advertisements.” Shanghai

“Information that is still important for the company should still be kept, but personal information should be deleted.” Hamburg

“I think it somehow needs to be deleted and removed from the system. There are certain things from an employer standpoint that you have to keep as far as personnel records, but it does not really do them any good to keep what temperature she likes.” Denver

“If I leave the company, maybe this information is still there. I will require the company to delete the information. But you don’t know if they will. You don’t know if they have a back-up, so I need a third party to keep the information.” Shanghai

As an aspect of control, participants expect that their data be deleted once their relationship with an employer has been severed

In Shanghai, users saw a need for an intermediary to oversee and manage this data deletion process on behalf of the user, both for vendor and employer relationships.

Some in Hamburg and Denver specify expectations that personal data relating to an employee should be deleted while data relating to the corporation could be kept.


slide-32
SLIDE 32


TRANSPARENCY

  • Ultimately, they feel as though the only information currently provided (i.e., the Terms & Conditions agreement) is too difficult to read and internalize.
  • As such, they would like vendors to clearly and succinctly provide the following information:

 Who has my data?
 How is it being used?
 Who is my data being sold to, and when?
 If there is a dispute, who can I complain to?

“Normally they make everything so legal so when you read it, it’s not straightforward. I think that would solve a major problem if they can make everything straightforward where it’s like, this is what we’re going to do here.” Denver

“It’s important that it’s very open. Make plans transparent. [Show why the service provider] wants the data, what it will do, the purpose of collection, the objective and the benefit of the person who is receiving the data.” Hamburg

“Tell me if you want to sell [my data]. Then I should get a cut of it.” Toronto

“If I tell them about my information, I need to sign a contract with regulations about information leaks.” Shanghai

Greater transparency and insight on how their data is being used is desired, even if users have no clear idea of how such information should be shared or consumed

Many participants in Shanghai, and a few in Hamburg, mention that the provision of a two-way contract would give them greater insight into the service provider’s obligations and user rights.


FAIR VALUE EXCHANGE

“I would not give out my weight information normally. But if [retail outlet] gave me a 50% discount, then yes.” Toronto

“Even if they didn’t pay you, if it made it more convenient for you, I would do that.” Denver

“Only safety, that’s the only reason for cameras.” Hamburg

“It depends on what they want it for. If they’re environmentally trying to take in information about my energy usage, sure they can track me. If it will help in the future. I think the social benefit is for the good of all.” Denver

“It makes sense down the line from an ecological and economical point of view.” Hamburg

User-participants are willing to exchange personal data for immediate personal benefit, but few consider social good

  • Personal benefits that are the most appealing include:
  • Discounts.
  • Better service or an improved product.
  • Convenience/time savings.

While participants in Toronto and Denver are willing to exchange their data to benefit the social good (e.g., environmental protection, public safety, etc.) this is motivating for only a few users elsewhere. Only one participant each in Shanghai and Hamburg say they would share their data to benefit the environment. A few in Hamburg say they would be willing to be videotaped in public places in order to aid security measures (which ultimately is a personal benefit).


slide-33
SLIDE 33


ACCURACY

  • All participants feel that an inaccurate benefit delivered (e.g., email group suggestions, restaurant suggestions) given the personal data accessed would be annoying to the user, especially if it requires their time and effort to remedy.
  • There is more lenience given to free services, while users expect greater precision from services for which they pay a charge.

“They’ve got to be pretty accurate to be worthwhile. Otherwise, what’s the point of getting information? There’s no benefit to you.” Denver

“If they make suggestions, it should be precise or it will be annoying.” Hamburg

“Precision is important. If I need a device to do such management for me, it’s because I’m busy. If it’s not precise, it will make me busier.” Shanghai

“The more you pay [for the service], the more accurate the service would have to be.” Toronto

“Accuracy is important if the information is being sent out.” Toronto

As a dimension of value exchange, they say accuracy is vital; otherwise, users receive no benefit


PROTECTION/ENFORCEMENT/REDRESS

  • They expect that service providers would have to abide by these regulations and should be penalized if they do not.
  • Generally, participants have some difficulty articulating what redress against service providers should be.
  • Potential punishments users consider include:
  • Fines
  • Imprisonment
  • Compensation paid back to the user

“Fines don’t work. They should imprison the CEO.” Toronto

“I guess it would be fines. There are always some type of way to punish it, whether it is fines or whatever – again, that is beyond my expertise. Possibly fines and erasing whatever information that they gather.” Denver

“There’s nothing you can do if your privacy is breached. I would want economic compensation. Pay me back exactly what I have lost.” Shanghai

“I think obviously financially they should be held accountable. If it is neglect I think there could be some criminal – there probably are not statutes enacted now that would hold them criminally liable other than civilly. I think the consumer – us as consumers should hold them accountable.” Denver

User-participants that want the government to protect personal information expect it to establish regulations as well as a clear due process for dispute resolution

Some in Denver say it is individuals, and not the government, who should hold service providers accountable via civil action suits when their personal data is misused.
