User-behavior analytics for video streaming QoE assessment (PowerPoint presentation)



SLIDE 1

User-behavior analytics for video streaming QoE assessment

Ricky K. P. Mok, The Hong Kong Polytechnic University

1 AIMS2016

SLIDE 2

Measuring the QoE is hard!

SLIDE 3

A simple QoE model

Diagram: network path metrics (RTT, packet loss rate, throughput, …) drive application layer metrics (start-up delay, rebuffering events, quality level switches, …), which map through an unknown relationship ("?") to Quality of Experience, shaped further by subjective factors (playback smoothness, picture quality, expectation, past experiences, usage habit, …).
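The "?" is the unknown mapping from measurable metrics to perceived quality. As a purely illustrative sketch (the weights, metric choices, and clamping below are assumptions, not the model from the talk), a toy linear mapping could look like:

```python
# Toy linear QoE model: map application-layer metrics to a 1-5,
# MOS-like score. All weights are made up for illustration only.

def toy_qoe_score(startup_delay_s, rebuffer_events, quality_switches):
    """Return a 1-5 score (higher is better) from app-layer metrics."""
    score = 5.0
    score -= 0.3 * startup_delay_s   # slow start-up hurts QoE
    score -= 0.8 * rebuffer_events   # stalls hurt the most
    score -= 0.2 * quality_switches  # frequent quality switches distract
    return max(1.0, min(5.0, score))  # clamp to the MOS range

print(toy_qoe_score(0, 0, 0))      # 5.0 (perfect session)
print(toy_qoe_score(100, 10, 10))  # 1.0 (clamped floor)
```

Real QoE also depends on subjective factors such as expectation and past experience, which is exactly why the slide draws the mapping as a question mark.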

SLIDE 4

QoE Crowdtesting

  • Lab-based subjective assessment
  • Crowdsourcing-based subjective assessment

Diagram: in lab-based assessment, the experimenter assigns tasks to subjects, who watch from a video streaming server over an emulated network environment. In crowdsourcing-based assessment, workers reach the video streaming server over the Internet, and the experimenter collects network/application performance together with MOS ratings.

SLIDE 5

User behavior analytics

  • User behavior reflects cognitive processes
  • Generated by users
  • User-viewing activities
  • Improve QoE inference
  • Worker behavior
  • Detect low-quality workers in QoE crowdtesting

SLIDE 6

QoE inference

  • User-viewing activities can be triggered by the reaction to impairment events.

SLIDE 7

QoE inference

  • User-viewing activities
  • Pause/Resume
  • Change of player sizes
  • Reload
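These activities can be captured with simple instrumentation of the player page. A minimal sketch of such a logger, where the class and event names are assumptions for illustration:

```python
import time

# Sketch: a minimal logger for user-viewing activities, so that pauses,
# resizes, and reloads can later be correlated with impairment events.
# The class and event names are illustrative assumptions.

class ViewingActivityLog:
    def __init__(self):
        self.events = []  # list of (timestamp, activity_name) tuples

    def record(self, activity, timestamp=None):
        """Record an activity; defaults to the current wall-clock time."""
        if timestamp is None:
            timestamp = time.time()
        self.events.append((timestamp, activity))

    def count(self, activity):
        """How many times a given activity occurred."""
        return sum(1 for _, name in self.events if name == activity)

log = ViewingActivityLog()
log.record("pause", 12.4)   # user paused 12.4 s into playback
log.record("resume", 15.1)
log.record("resize", 30.0)  # player window resized
print(log.count("pause"))  # 1
```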

SLIDE 8

QoE inference

  • Correlate the occurrence of the activities with perceivable impairments
  • Quantify the activities into metrics
  • Model the QoE using application layer metrics and the user-viewing activities
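The first step, correlating activities with impairments, can be sketched in code. The five-second window and the event representation below are assumptions for illustration:

```python
# Sketch: keep only user activities that occur within a short window
# after an impairment event. The 5-second window is an assumption.

def correlated_activities(impairments, activities, window_s=5.0):
    """Return (time, name) activities within window_s after any impairment."""
    hits = []
    for t_act, name in activities:
        if any(0 <= t_act - t_imp <= window_s for t_imp in impairments):
            hits.append((t_act, name))
    return hits

impairments = [10.0, 42.0]  # rebuffering start times, in seconds
activities = [(12.4, "pause"), (30.0, "resize"), (44.0, "pause")]
print(correlated_activities(impairments, activities))
# [(12.4, 'pause'), (44.0, 'pause')]
```

Counting such correlated occurrences per session is one way to quantify the activities into metrics for the model.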

SLIDE 9

Findings

  • An activity can be triggered within a few seconds after some application events
  • Pause
  • Reduce the screen size
  • Compare two models
  • 1. Start-up delay, rebuffering frequency, and rebuffering duration
  • 2. Model 1 + number of pauses and screen-size reductions
  • The explanatory power is significantly increased, by 8%.
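A comparison of this kind can be reproduced on synthetic data. Everything in the sketch below is made up; only the structure (a baseline model versus the same model plus two behavior metrics) mirrors the finding, with R^2 standing in for explanatory power:

```python
import numpy as np

# Sketch: compare the explanatory power (R^2) of two linear QoE models
# on synthetic data. The data-generating process is invented here.

def r_squared(X, y):
    """Fit ordinary least squares with an intercept; return R^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(0)
n = 200
startup, rebuf_freq, rebuf_dur = rng.random((3, n))
pauses, resizes = rng.random((2, n))
mos = (5 - startup - 2 * rebuf_freq - rebuf_dur
       - 0.5 * pauses - 0.5 * resizes
       + 0.3 * rng.standard_normal(n))

model1 = np.column_stack([startup, rebuf_freq, rebuf_dur])
model2 = np.column_stack([startup, rebuf_freq, rebuf_dur, pauses, resizes])
print(r_squared(model1, mos) < r_squared(model2, mos))  # True
```

Because model 2 nests model 1, its R^2 can never be lower; the interesting question, as on the slide, is by how much it rises.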

SLIDE 10

Detecting low-quality workers

  • Worker behavior on the question page
  • Clicks
  • Time delay between two questions
  • Mouse cursor movement
  • Trajectory
  • Speed/Acceleration
  • Low-quality workers behave differently from normal workers
  • A model can be trained to filter out workers who cheat the system
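Speed and acceleration features can be derived from a sampled cursor trajectory. A minimal sketch, where the feature definitions are assumptions rather than the talk's exact metrics:

```python
import math

# Sketch: derive speed and |acceleration| features from a mouse-cursor
# trajectory of (t, x, y) samples. Feature definitions are illustrative.

def cursor_features(trajectory):
    """Return (mean speed, mean |acceleration|) for a list of (t, x, y)."""
    speeds, times = [], []
    for (t0, x0, y0), (t1, x1, y1) in zip(trajectory, trajectory[1:]):
        speeds.append(math.hypot(x1 - x0, y1 - y0) / (t1 - t0))
        times.append(t1)
    accels = [abs(v1 - v0) / (t1 - t0)
              for (v0, t0), (v1, t1) in zip(zip(speeds, times),
                                            zip(speeds[1:], times[1:]))]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(speeds), mean(accels)

# A short trajectory: a quick move, then the cursor stops.
traj = [(0.0, 0, 0), (1.0, 3, 4), (2.0, 3, 4)]
print(cursor_features(traj))  # (2.5, 5.0)
```

Features like these, computed per worker, are what a classifier can be trained on to separate cheaters from normal workers.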

SLIDE 11

An example

SLIDE 12

Findings

  • Ten worker-behavior metrics are designed
  • 80% of low-quality workers can be detected
  • Compared with CrowdMOS, our method has
  • Lower false positives and false negatives
  • Our method is independent of the ratings
  • More suitable for measuring the QoE
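False-positive and false-negative rates of such a filter can be computed as follows; the labels and predictions here are synthetic, not the study's data:

```python
# Sketch: false-positive / false-negative rates for a worker filter.
# Label 1 = low-quality worker, 0 = normal worker. Data is synthetic.

def error_rates(y_true, y_pred):
    """Return (false-positive rate, false-negative rate)."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return fp / y_true.count(0), fn / y_true.count(1)

y_true = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]  # ground truth
y_pred = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]  # filter's decisions
print(error_rates(y_true, y_pred))  # (0.2, 0.2)
```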

SLIDE 13

A new model

Diagram: the new model adds user behavior analytics (mouse cursor movement, clicks, pause/resume, …) alongside network path metrics and application layer metrics, linking subjective factors to Quality of Experience.

SLIDE 14

Challenges

  • How can user behavior be incorporated into measurement infrastructures?
  • The user behavior can be application-specific and platform-specific (desktop vs. mobile)
  • Collaboration from either service providers or users is required to collect the user behavior
  • Privacy issues?

SLIDE 15

Thanks

  • neprobe.org

cs.rickymok@connect.polyu.hk
