SLIDE 1

Let’s Read!

Junsoo Park Youngbo Shim Sang-Gyun An 20164320 20164350 20164352

SLIDE 2

The trigger point of our project is...

SLIDE 3
SLIDE 4

32 reading responses out of 66 papers

SLIDE 5
SLIDE 6

We are novice researchers

  • Unfamiliar with the discipline
  • Little background knowledge
SLIDE 7

It causes...

1. Poor time-efficiency

SLIDE 8

It causes...

1. Poor time-efficiency
2. Difficulty in having a critical view on a paper

SLIDE 9

Problem statement

Novice researchers, who are unfamiliar with a discipline and have little background knowledge, commonly face 1) poor time-efficiency and 2) difficulty in having a critical view on a paper.

SLIDE 10

Existing solution: a paper with professor’s annotation

  • Helpful visual cues made by an expert (professor)
  • → At a glance, I can figure out which parts are important
  • → I can see which parts he likes or dislikes (critical view)

SLIDE 11

Limitation of existing solution

  • Experts are rare, expensive, and less motivated.
  • Hard-copy papers lack scalability.
SLIDE 12

Let’s Read: an online paper-reading platform where a group of novice researchers read a paper together, with helpful visual cues naturally generated by themselves.

SLIDE 13

System design

SLIDE 14

Interface walk-through video

SLIDE 15

Workflow

1. Upload paper & create group (or join an existing group)
2. Make highlights individually
3. Vote on others’ highlights
4. Aggregate all highlights

SLIDE 16

Workflow

1. Upload paper & create group (or join an existing group)
2. Make highlights individually (by highlighting)
3. Vote on others’ highlights
4. Aggregate all highlights

SLIDE 17

Workflow

1. Upload paper & create group (or join an existing group)
2. Make highlights individually (by highlighting)
3. Vote on others’ highlights (by crowd)
4. Aggregate all highlights (by computer)
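The final "aggregate all highlights" step could be sketched as below. The character-offset data model (each highlight is a `(start, end)` span per user) and the merging rule are assumptions; the slides do not specify the implementation.

```python
# Sketch: merge per-user highlight spans into group highlights.
# A character offset is kept if at least `min_votes` users highlighted it.
from collections import Counter

def aggregate_highlights(user_spans, min_votes):
    """user_spans: one list of (start, end) spans per user.
    Returns merged (start, end) spans covered by >= min_votes users."""
    counts = Counter()
    for spans in user_spans:
        for start, end in spans:
            for i in range(start, end):
                counts[i] += 1
    kept = sorted(i for i, c in counts.items() if c >= min_votes)
    # merge consecutive kept offsets into contiguous spans
    merged, run = [], []
    for i in kept:
        if run and i == run[-1] + 1:
            run.append(i)
        else:
            if run:
                merged.append((run[0], run[-1] + 1))
            run = [i]
    if run:
        merged.append((run[0], run[-1] + 1))
    return merged
```

For example, if three users highlight spans (0, 5), (2, 8), and (3, 5), only offsets 2–4 are covered by at least two users, so the group highlight is the single span (2, 5).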

SLIDE 18

Design consideration

SLIDE 19

4-color highlighter

  • Each color corresponds to
    ○ (normal highlight)
    ○ :) (like)
    ○ :( (dislike)
    ○ ? (I don’t know)
  • Based on
    ○ Guideline for reading response (critical review)
      ■ summary / likes / dislikes
    ○ Interview from pilot study
      ■ P3: “I need an ‘I don’t know’ color”
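The four semantic categories could be represented with a simple lookup like the one below; the color names are purely illustrative, since the slides do not say which colors were used.

```python
# Hypothetical mapping from highlighter color to semantic category.
# Color names are placeholders, not taken from the actual system.
HIGHLIGHT_KINDS = {
    "yellow": "normal",     # plain highlight (summary)
    "green": "like",        # :)
    "red": "dislike",       # :(
    "blue": "dont_know",    # ?  (added after P3's pilot-study feedback)
}

def kind_of(color):
    """Return the semantic category for a highlighter color."""
    return HIGHLIGHT_KINDS.get(color, "unknown")
```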

SLIDE 20

Quality control issue

  • P2, P3: “Some highlights seem irrelevant”
SLIDE 21

Quality control issue

  • P2, P3: “Some highlights seem irrelevant”

[Chart: number of votes per highlighted word]

SLIDE 22

Quality control issue

  • P2, P3: “Some highlights seem irrelevant”

[Chart: number of votes per highlighted word, with a vote-count threshold and a max-# cutoff]
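The threshold-and-cap filter suggested by the chart could look like the sketch below; the dict-based highlight records and parameter names are assumptions.

```python
def filter_highlights(highlights, threshold, max_count):
    """Keep only highlights with at least `threshold` votes,
    capped at `max_count` items (highest-voted first)."""
    kept = [h for h in highlights if h["votes"] >= threshold]
    kept.sort(key=lambda h: h["votes"], reverse=True)
    return kept[:max_count]
```

Irrelevant one-off highlights (few votes) fall below the threshold, while the cap keeps the page from being flooded even when many highlights are popular.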

SLIDE 23

Quality control issue

  • Concern about sabotage
    → we require a KAIST e-mail account
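A minimal domain check for this requirement might look like the following; this is a naive sketch (a real deployment would also send a confirmation mail), and the function name is hypothetical.

```python
def is_kaist_account(email):
    """Accept only addresses on the kaist.ac.kr domain (naive check)."""
    email = email.strip().lower()
    if "@" not in email:
        return False
    return email.rsplit("@", 1)[1] == "kaist.ac.kr"
```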

SLIDE 24

Usability issue

  • P2: “Long distance of mouse travel hinder me from changing color”
SLIDE 25

Usability issue

  • P2: “Long distance of mouse travel hinder me from changing color”
SLIDE 26

Deployment & result

SLIDE 27
SLIDE 28
SLIDE 29

Deployment for Reading Response

  • Target users: CS492 Crowdsourcing classmates
  • Target task: reading response for “The Future of Crowd Work” (due Dec 12, midnight)
  • Advertised on Piazza early in the morning of Dec 12.
  • 7 users visited (including 3 of us)
  • Unfortunately, none of them were students from our class (except us).
  • It was difficult to measure the impact of our platform on reading responses.
SLIDE 30

How much did users get involved?

  • In 3 days of service, a total of 13 human users registered (excluding test & troll accounts)
  • Total of 127 highlight blocks
    ○ 77 (!) , 25 :) , 9 :( , 16 ?
  • However, 99 of the highlights were contributed by the 3 of us…
    ○ Avg. 33 highlights each for a 12-page paper
  • The remaining 10 users made 28 highlights
    ○ Many of them were just trying to see if highlighting works

SLIDE 31

Qualitative survey

How much time did you save?

  • Slight increase in paper-reading time
    ○ 103 min → 110 min
  • Significantly reduced reading-response writing time
    ○ 42 min → 27 min

How well did you understand the paper?

  • 4.0 / 5 → 4.0 / 5

How helpful was highlighting when reading the paper?

  • 3.0 / 5

How helpful was highlighting for writing the reading response?

  • 4.6 / 5
SLIDE 32

Qualitative survey

How was the experience of Multi-Color Highlighting?

Pros

  • Really helpful for writing a reading response
  • I can easily recall the parts which I like or dislike
  • I became more eager to semantically differentiate my highlighting

Cons

  • It might disturb the natural reading flow.
  • Ambiguous points make it hard to decide which color to use

SLIDE 33

Qualitative survey

How was the experience of seeing other people’s highlights?

Pros

  • Trying to understand others’ highlighting helped me think about the issue deeply.
  • I tended to follow others’ highlights
  • I noticed a good point I would have missed otherwise

Cons

  • I tended to follow others’ highlights unconsciously

Suggestions

  • Providing a (statistical) reason to trust others’ highlights

SLIDE 34

Discussion

SLIDE 35
Limitations & Implications

  • User study: why did it fail?
    ○ Recruitment
    ○ Usability issues

SLIDE 36

Difficulty of recruitment

  • Need a full-step process for evaluation
    ○ A fully highlighted paper by various users
  • Hard to find motivated readers
    ○ Limited crowd pool

SLIDE 37

SLIDE 38
Usability Issues

  • Loading forever
    ○ Due to Flask issues

SLIDE 39

Usability Issues

  • Browser compatibility
  • Minor bugs
    ○ Highlighting not working sometimes
    ○ Couldn’t type numbers (1~5) in the ID entry
    ○ etc.

SLIDE 40

Limitations & Implications

  • Quality Control
    ○ ‘Quality’ varies by individual

SLIDE 41

Limitations & Implications

  • Quality Control
    ○ ‘Quality’ varies by individual
    ○ Highlight density varies
      ■ Introduction, Abstract > Conclusion > Implementation, Discussion...

SLIDE 42

Possible improvements

  • Highlight slider
    ○ Controls the group-highlight appearance
    ○ By adjusting the vote threshold
    ○ Could serve user preferences
    ○ Could enforce sparsely highlighted parts
    ○ Slider endpoints: “Show All” / “Show None”
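One way the slider could map to a vote threshold is sketched below; the linear mapping and the function names are assumptions, not the authors' design.

```python
def slider_to_threshold(position, max_votes):
    """Map a slider position in [0, 1] to a vote threshold:
    0 -> threshold 0 ("Show All"), 1 -> max_votes + 1 ("Show None")."""
    position = min(max(position, 0.0), 1.0)
    return int(round(position * (max_votes + 1)))

def visible_highlights(highlights, position, max_votes):
    """Return only the highlights whose vote count meets the threshold."""
    t = slider_to_threshold(position, max_votes)
    return [h for h in highlights if h["votes"] >= t]
```

Dragging toward "Show None" raises the threshold, so only the most-voted highlights survive; this is how one slider could serve both readers who want dense cues and readers who want sparse ones.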

SLIDE 43

Possible improvements

  • Reading progress
    ○ Show paper-reading progress based on the last highlighted paragraph
    ○ Gamification property
    ○ Raises goal visibility
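The progress estimate described above could be computed as follows; treating the last highlighted paragraph as a proxy for reading position is the slide's idea, but the exact formula is an assumption.

```python
def reading_progress(last_highlighted_paragraph, total_paragraphs):
    """Estimate reading progress as a fraction in [0, 1], using the
    index of the reader's last highlighted paragraph (1-based)."""
    if total_paragraphs <= 0:
        return 0.0
    return min(last_highlighted_paragraph / total_paragraphs, 1.0)
```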

SLIDE 44

Intrinsic problems

  • Early-user disadvantage
    ○ Early users of the service couldn’t benefit much from group highlights.
    ○ An intrinsic problem of the system
    ○ Motivate them by providing interactive assets

[Figure: “Early user view” vs. “Later user view”]

SLIDE 45

Intrinsic problems

  • Do users really read on the computer screen?
    ○ P3: “Though I normally read articles in printed papers, if all systems are like this, I would try to read in this platform.”

SLIDE 46

Let’s Read!

SLIDE 47

Thank you for listening…

Q&A

SLIDE 48

Appendix

SLIDE 49

Possible improvements

  • User verification
    ○ E-mail verification
    ○ Group-leader invitation
    ○ Could enhance quality control & recruitment

SLIDE 50

Quality control issue

Wizard of Oz

SLIDE 51

Related work: Amazon Kindle

SLIDE 52

Analyzing Highlights


SLIDE 53

Quality control issue

SLIDE 54

Quality control issue

→ Opacity weighted by # of votes
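The vote-weighted opacity idea could be implemented with a linear scale like the sketch below; the minimum-opacity floor (0.15) is an assumption added so unvoted highlights remain visible.

```python
def opacity_for(votes, max_votes, floor=0.15):
    """Scale highlight opacity linearly with vote count.
    Returns a value in [floor, 1.0]; the floor keeps a highlight
    with zero votes faintly visible (floor value is an assumption)."""
    if max_votes <= 0:
        return floor
    return floor + (1.0 - floor) * min(votes / max_votes, 1.0)
```

More-voted highlights render darker, so group consensus is visible at a glance without hiding minority highlights entirely.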

SLIDE 55