SLIDE 1

Social Computing
MICHAEL BERNSTEIN
CS 376

SLIDE 2

Recall: Sociotechnical system

Emergent behaviors result from interactions between social relationships and technological interventions.

SLIDE 3

Recall…

- Facebook usage increases all types of social capital, especially bridging social capital
  [Ellison, Steinfield and Lampe, JCMC '07]

SLIDE 4

Recall…

- The Strength of Weak Ties
  [Granovetter '73]

SLIDE 5

Recall…

- Systems and applications research
  - FeedMe
  - ReMail
  - Chat Circles
  - Link Different

SLIDE 6

Recall…

- Can we observationally model tie strength?
- Most predictive features:
  - Days since last communication
  - Days since first communication
  - Wall words exchanged
  - Mean strength of mutual friends
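The modeling idea above can be sketched as a simple linear predictor over the slide's four features. This is a minimal illustration only: the feature names, weights, and scoring rule below are invented for the sketch, not the paper's fitted model.

```python
# Hypothetical sketch: tie strength as a weighted sum of observational
# features. Weights are illustrative, not empirically fitted.

def tie_strength(features, weights):
    """Linear model: weighted sum over the observed features."""
    return sum(weights[k] * features[k] for k in weights)

friend = {
    "days_since_last_comm": 3,        # recency of communication
    "days_since_first_comm": 900,     # duration of the relationship
    "wall_words_exchanged": 450,      # communication intensity
    "mean_mutual_friend_strength": 0.6,
}
weights = {
    "days_since_last_comm": -0.01,    # more recent contact -> stronger tie
    "days_since_first_comm": 0.001,   # longer history -> stronger tie
    "wall_words_exchanged": 0.002,
    "mean_mutual_friend_strength": 0.5,
}
print(tie_strength(friend, weights))
```

In practice such a model would be trained against self-reported tie-strength labels; the point here is only the shape of the feature vector.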

SLIDE 7

Operationalizing theory

SLIDE 8

Presentation of Self in Everyday Life
[Goffman 1959]

- Established face-to-face interaction between people as an object of study
- Metaphor: life as performance
- People work to guide the impressions that others form of them
  - On-stage: public life
  - Off-stage: private life

SLIDE 9

The Many Faces of Facebook
[Zhao et al., CHI '13]

- Facebook appears monolithic, but there are three functional regions
- Method: semistructured interviews
  - Performance region (for now)
  - Exhibition region (for later)
  - Personal region (for reflection)

Example posts: "CS 376 is the best and I am studying hard right now!" / "I got into Stanford! English major, here I come!" / "After a lot of soul-searching, English isn't for me…"

SLIDE 10

Estimating audience size
[Bernstein et al., CHI 2013]

- Method: compare survey estimates ("How many people do you think saw your most recent update?") against server logs
- Facebook users underestimate their audience size by 4x
- Median reach is 35% of friends per post and 61% per month
- Many users want larger audiences, but they already have them
- How might our activities change if we are incorrectly estimating our audience size?
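The survey-versus-logs comparison can be sketched in a few lines. Everything below (viewer names, counts, the survey guess) is invented for illustration; the real study used Facebook's server logs.

```python
# Illustrative sketch of the method: compare a user's survey guess
# against logged unique viewers per post.
post_viewers = [                        # logged unique viewers per post
    {"alice", "bob", "carol"},          # an older post
    {"alice", "bob", "carol", "dan"},   # the most recent post
]
friend_count = 10
survey_guess = 1  # "How many people do you think saw your most recent update?"

logged_recent = len(post_viewers[-1])             # actual audience of last post
underestimate = logged_recent / survey_guess      # the perception gap
reach_per_post = [len(v) / friend_count for v in post_viewers]
# Monthly reach: distinct friends who saw at least one post
monthly_reach = len(set().union(*post_viewers)) / friend_count
print(underestimate, reach_per_post, monthly_reach)
```

Note how monthly reach exceeds any single post's reach, which is the shape of the 35%-per-post versus 61%-per-month result.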

SLIDE 11

Reasoning about Facebook's algorithms
[Eslami et al., CHI 2015]

- What are people's mental models of social news feed algorithms?
- Result: over half of Facebook users were unaware that the News Feed algorithm exists
- "Initial reactions for these previously unaware participants were surprise and anger."
- "Participants were most upset when close friends and family were not shown in their feeds."

SLIDE 12

Motivating participation

SLIDE 13

Motivation: why participate?

- Intrinsic motivators: drawn from my own desire to complete a goal or task
  - Examples: pleasure, hobby, developing a skill, demonstrating a skill
- Extrinsic motivators: do not derive from my relationship with the goal or task
  - Examples: money, graduation, points, badges
- Motivation crowding theory: applying external motivators to an intrinsically motivated task reduces participation

SLIDE 14

Contributions via uniqueness
[Beenen et al., CSCW '04]

- Social loafing: why should I contribute if many others could as well?
- Hypothesis: calling out the uniqueness of contributions will increase participation
- Method: rating campaign on MovieLens
  - "As someone with fairly unusual tastes, you have been an especially valuable user of MovieLens [...] You have rated movies that few others have rated: [...]"
- Result: participants in the uniqueness condition rated 18% more movies

SLIDE 15

Contributions via goal-setting
[Beenen et al., CSCW '04]

- Specific, high-challenge goals are known to increase performance on tasks
- Hypotheses
  - H1: specific numeric goals will produce more participation than "do your best" goals
  - H2: individual goals will produce more participation than group goals
- Method: rating campaigns on the MovieLens web site
- Results
  - H1 confirmed (about 3 extra ratings)
  - H2 disconfirmed (group goals produced more participation)

SLIDE 16

Experts and questions

SLIDE 17

Answer Garden
[Ackerman and Malone, OIS '90]

- An "organizational memory" system: knowing what the company knows
- Main idea: members leave traces for others to solve their questions
- A precursor to Yahoo! Answers, Quora, and Aardvark

SLIDE 18

Expertise recommendation
[McDonald and Ackerman, CSCW '00]

- Recommend people, not documents
- Goal: help organizations know who can tackle each problem

SLIDE 19

Aardvark: social search engine
[Horowitz and Kamvar, WWW '10]

- Technical challenge: question routing over IM
- Uses a joint model over topical relevance and social distance
- Interesting equilibrium: people were more willing to answer questions than ask them!
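One way to picture a joint model over relevance and social distance is a single score that rewards topical expertise and penalizes social hops. This is my own simplification for intuition; the scoring rule, candidate names, and numbers are assumptions, not Aardvark's actual model.

```python
# Hypothetical routing score: high topical relevance and low social
# distance both make a candidate a better target for the question.

def routing_score(topic_relevance, social_distance, alpha=1.0):
    """Relevance discounted by social distance (hops in the network)."""
    return topic_relevance / (1.0 + alpha * social_distance)

candidates = {
    # name: (relevance to the question topic, hops from the asker)
    "close_expert": (0.9, 1),
    "distant_expert": (0.95, 4),
    "close_novice": (0.2, 1),
}
best = max(candidates, key=lambda name: routing_score(*candidates[name]))
print(best)
```

Under this toy rule a slightly less expert but socially closer candidate wins over a distant specialist, which matches the system's social-search intuition.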

SLIDE 20

Codeopticon
[Guo, UIST '15]

- Enables one tutor to help many students learning programming at once
- Visualizations help find "stuck" students

SLIDE 21

Leadership and collective action

SLIDE 22

What makes a leader?

- Reader-to-leader framework [Preece and Shneiderman, AIS Trans. HCI '09]
  - Readers → Contributors → Collaborators → Leaders
  - Goal: guide users into each new stage
  - See also: legitimate peripheral participation [Lave and Wenger '91]
- Leaders are born, not made [Panciera, Halfaker, Terveen, GROUP '09]
  - Power editors on Wikipedia do more work than others, even from their first day on Wikipedia

SLIDE 23

One-sided gatekeeping
[Keegan and Gergle, CSCW '10]

- How powerful are leaders in open communities like Wikipedia?
- Method
  - Data-mine nominations for breaking news articles on the Wikipedia homepage
  - Stories were nominated and voted on by elite, middle-class, and newbie editors
- Result: "one-sided gatekeeping"
  - Elite editors could block nominations, but had no special ability to get their own nominations approved

SLIDE 24

No place to participate
[Suh et al., WikiSym '09]

- Wikipedia's growth curve can be fit to an ecological population model with a fixed resource limitation
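The fixed-resource idea can be made concrete with the standard logistic growth curve, where growth slows as the population approaches a carrying capacity K. The parameter values below are illustrative only, not the paper's fitted values.

```python
import math

# Logistic growth sketch: P(t) = K / (1 + A * e^(-r*t)), with
# A = (K - p0) / p0 so that P(0) = p0. Parameters are illustrative.
def logistic(t, K=100_000.0, r=0.1, p0=100.0):
    """Closed-form logistic curve: growth saturates at capacity K."""
    A = (K - p0) / p0
    return K / (1 + A * math.exp(-r * t))

# Early on the curve grows rapidly; later it flattens near K,
# the qualitative shape of Wikipedia's editor/article growth in the paper.
print(round(logistic(10)), round(logistic(200)))
```

The telltale signature is the plateau: once the fixed resource (e.g. easily contributable content) is consumed, growth stalls no matter how many newcomers arrive.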

SLIDE 25

More decline
[Halfaker et al., American Behavioral Scientist '13] and [Wikimedia]

SLIDE 26

Combating censorship
[Hiruncharoenvate, Lin and Gilbert, ICWSM '15]

- The Chinese government censors sensitive topics on social media
- However, homophones can be difficult for censors to distinguish from intended use
  - 和谐 ("harmony", slang for censorship) vs. 河蟹 ("river crab")
- This work introduces an algorithm that decomposes words and nondeterministically creates homophones that are likely to confuse censors
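The core move, swapping a sensitive word for a same-sounding alternative chosen at random, can be sketched as below. The homophone table, token names, and substitution rule are toy assumptions for illustration; the paper's actual algorithm decomposes real Chinese words and scores candidate homophones.

```python
import random

# Toy homophone-substitution sketch. Both tables are invented stand-ins
# keyed by a shared pronunciation (here, the pinyin "hexie").
HOMOPHONES = {
    "hexie": ["harmony-word", "river-crab-word"],
}
PRONUNCIATION = {"harmony-word": "hexie", "river-crab-word": "hexie"}

def perturb(tokens, sensitive, rng=random):
    """Replace each sensitive token with a random same-sounding token."""
    out = []
    for tok in tokens:
        if tok in sensitive:
            out.append(rng.choice(HOMOPHONES[PRONUNCIATION[tok]]))
        else:
            out.append(tok)
    return out

print(perturb(["post", "harmony-word"], {"harmony-word"}))
```

The nondeterminism matters: if every writer picked the same substitute, censors could simply add it to the blocklist.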

SLIDE 27

Social influences on the wisdom of crowds

SLIDE 28

Unpredictability in an artificial cultural market
[Salganik, Dodds, and Watts, Science '06]

- Puzzle: it is extremely difficult for experts to predict which songs, movies and books will be hits
- Method: 14,000 participants download free music from an online site
  - Random assignment: no download info, or one of eight "worlds" that all start with zero downloads
- Result: huge variance in download counts across worlds
  - The best songs rarely did poorly and the worst songs rarely did well, but any other outcome was possible
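The divergence across worlds comes from cumulative advantage: once social influence is visible, early random downloads snowball. A toy simulation of that dynamic, with all parameters invented, shows identical songs ending up with very different counts in different worlds.

```python
import random

# Toy cumulative-advantage simulation in the spirit of the experiment:
# each listener picks a song with probability proportional to its
# current download count, so early luck compounds.
def simulate_world(n_songs=8, n_listeners=2000, rng=None):
    rng = rng or random.Random()
    downloads = [1] * n_songs  # start counts at 1 so any song can be picked
    for _ in range(n_listeners):
        pick = rng.choices(range(n_songs), weights=downloads)[0]
        downloads[pick] += 1
    return downloads

# Identical songs, three independent "worlds": rankings diverge by seed.
for seed in range(3):
    print(sorted(simulate_world(rng=random.Random(seed)), reverse=True))
```

Because the songs are identical in this toy model, any ordering is possible; the experiment's extra finding is that real quality constrains, but does not determine, the outcome.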

SLIDE 29

Reputation systems
[Resnick and Zeckhauser, Adv. Appl. Microeconomics '02]

- Reputation is a core signal in social systems
- Study of eBay feedback
  - Despite incentives to free-ride, over half of eBay transactions leave feedback
  - Feedback is almost always positive
  - High reputations didn't lead to higher seller prices
  - Evidence of reciprocation and retaliation

SLIDE 30

Credibility and online rumors
[Mitra and Gilbert, ICWSM 2015]

- Social media are a space for spreading information, but how much misinformation do they spread?
- CREDBANK: a corpus of 60 million tweets annotated by humans to indicate how credible each event is
- 24% of events in the Twitter stream are seen as not credible…

SLIDE 31

Exploration and visualization

SLIDE 32

Exploring social data

- Social media data can help us understand the world around us
- For example: dips in tweet volume show when people are attending to Obama during his State of the Union address [Shamma et al., CSCW Horizons '10]
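The tweet-volume analysis reduces to bucketing timestamps and looking for quiet intervals. A minimal sketch; the timestamps and the one-tweet threshold are invented for illustration.

```python
from collections import Counter

# Bucket tweet timestamps (seconds into the address) by minute; quiet
# minutes suggest the audience is watching rather than tweeting.
timestamps = [5, 20, 40, 61, 130, 135, 140, 150]
per_minute = Counter(t // 60 for t in timestamps)
quiet_minutes = [m for m in range(max(per_minute) + 1) if per_minute[m] <= 1]
print(dict(per_minute), quiet_minutes)
```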

SLIDE 33

Social data exploration
[Heer, Viégas and Wattenberg, CHI '07]

SLIDE 34

Skills for social computing research

- Skills for understanding and for designing social computing systems are complementary
- Understanding: computational social science methods and theory
  - Social psychology, sociology, data mining
- Designing: the core challenge is designing for emergent behavior

SLIDE 35

Discussion rooms

Rotation   Littlefield 107   Littlefield 103
a          12                34
b          24                13
c          14                23
d          34                12
e          13                24
f          23                14