The MOVIS project: Performance Monitoring System for Video Streaming Networks


SLIDE 1

www.nr.no

The MOVIS project

Performance Monitoring System for Video Streaming Networks

Wolfgang Leister

April 2006


MOVIS – Participating Institutions

Research project supported by the Research Council of Norway
► TV2 Interaktiv, Bergen
► Norsk Regnesentral, Oslo
► IRT, München
► Nextgentel, Sandsli
► Lyse Tele, Stavanger
► Never.no, Oslo
► Secrets to Sports, Asker
► Nettfokus, Lundesæter
► Nimsoft, Oslo

SLIDE 2

The Challenge

► Content providers distribute content to
▪ end-users as multimedia streams
▪ ISPs for redistribution
► The ISPs stream to end-users from their own servers
► In case of QoS degradation in delivery:
▪ the end-user gets reduced visual quality
▪ communicate the problem before the user reacts
▪ help content providers / ISPs to identify problems
▪ keep logs for quality assurance

SLIDE 3

Streaming Content Distribution

[Diagram: video sources (cameras, AV matrix) feed encoders, databases, a video file server, and a live distribution server at each provider (Provider 1–3, default server); content streams over the Internet into the ISP network.]

SLIDE 4

Infrastructure at ISP

[Diagram: at the ISP, a streaming server, the content provider's default server, and a file server connect through routers/switches to a DSLAM; content providers feed in over the Internet.]


End user equipment scenarios

[Diagram: end-user equipment scenarios with differing viewing characteristics: a single PC, a home network with access point / router and WLAN, digital TV, ...]

SLIDE 5

Considerations for MOVIS

► Online measurements of end-user QoS for streamed content
► Emphasis on end-to-end performance
► Measure / collect perceived and calculated QoS
► Report QoS values back to content providers and ISPs
► Metrics, protocols, architectural issues
► Measurements in all parts of the delivery chain
► End user is in the mass market
► End-user interaction
► End-user equipment configuration beyond control of providers
► Internal structures of ISPs must not be revealed


How to solve the problem?

► Research
▪ Metrics for measuring video quality
▪ Subjective / objective assessment of video quality
▪ Relations between technical and perceived QoS
► Development
▪ Agent-based system to report perceived video quality to the ISP and content provider
▪ Interfacing with systems in use
► Evaluations
▪ Field trials, user studies

SLIDE 6

Test Methods

[Diagram: three test-method pipelines, each passing original content through encoder, streaming server, transmission network, and decoder to produce processed content:
► Picture comparison test method – compares original and processed content directly and outputs an objective quality rating.
► Feature extraction test method – extracts features from both original and processed content and compares them to derive impairment parameters.
► Single-ended test method – a monitoring system plus model derives impairment parameters from the processed content alone.]

Characteristics of Test Methods

► Subjective (user) vs. objective (technical)
► Picture (directly) vs. signal (indirectly)
► In-service vs. out-of-service
► Real-time vs. deferred time
► Continuous vs. samples

SLIDE 7

QoS measurements / metrics

► Collecting QoS observations
▪ Intrusive (controlled injection of content)
▪ Non-intrusive (passive observation)
► QoS metric: a carefully specified quantity, related to the performance and reliability of the service, that we would like to know the value of
► Networking QoS metrics
► Perceived QoS
▪ Laboratory
▪ Real-time


QoS measurements / metrics

► Perceived QoS metrics
▪ Impact of networking characteristics
▪ Impact of codec characteristics
► Assessment
▪ Subjective assessment (end-user evaluations)
▪ Objective assessment (technically deduced)
► Methods
▪ DVB: ETR 290, TR 101 290 (Measurement guidelines for DVB systems), TR 101 291
▪ BT.500
▪ SAMVIQ – BT.700

SLIDE 8

QoS measurements / metrics

► Networking QoS metrics
▪ Connectivity, one-way delay, two-way delay, throughput, loss, jitter, ...
▪ Initiatives:
  • IETF IPPM (IP Performance Metrics): RFC 2330, RFC 2680, RFC 2681, OWAMP, ...
  • ITU-T G.107: rating of transmission quality end-to-end
► Nettfokus – mobile SLM
▪ http://www.knowyourSLA.com/
► NimBUS
▪ http://www.nimsoft.com/
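Among the networking metrics listed above, jitter for streamed media is commonly estimated with the interarrival-jitter filter from RTP (RFC 3550, not cited on the slide); a minimal sketch, shown only as an illustration of the metric:

```python
def interarrival_jitter(send_ts, recv_ts):
    """RFC 3550 interarrival jitter: J += (|D| - J) / 16, where D is the
    difference in relative transit time between consecutive packet pairs.
    Timestamps in seconds; returns the smoothed jitter estimate."""
    j = 0.0
    prev_transit = None
    for s, r in zip(send_ts, recv_ts):
        transit = r - s
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            j += (d - j) / 16
        prev_transit = transit
    return j

# Packets sent every 20 ms; one-way delay alternates between 50 and 60 ms
send = [i * 0.020 for i in range(5)]
recv = [s + d for s, d in zip(send, [0.050, 0.060, 0.050, 0.060, 0.050])]
print(interarrival_jitter(send, recv))
```

The 1/16 gain is the smoothing constant mandated by RFC 3550, so momentary delay spikes raise the estimate only gradually.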


NIMSOFT

► Architecture built around a message bus (NimBUS)
► All types of traffic / networks
► API libraries available
► Agent-based

SLIDE 9

Nettfokus / MobileSLM

[Diagram: a master node coordinates slave nodes across firewalls, the Internet, and enterprise networks; www.knowyoursla.com]

► Management solution for SLM (www.knowyoursla.com)
► Designed for multimedia traffic
► Dedicated machines inject traffic
► Agent-based


APDEX

► Numerical measure of end-user satisfaction
► Ratings range from unacceptable to excellent
► Used for response times
► Three end-user categories
▪ #Satisfied
▪ #Tolerating
▪ #Frustrated
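The three categories combine into the Apdex score as (#Satisfied + #Tolerating/2) / total samples; a minimal sketch, where the 0.5 s threshold is an illustrative choice, not a value from the slides:

```python
def apdex(response_times, t=0.5):
    """Apdex score in [0, 1]: satisfied count plus half the tolerating
    count, divided by the total number of samples.

    Per the Apdex convention, a sample is 'satisfied' if <= t seconds,
    'tolerating' if <= 4*t, otherwise 'frustrated'. t is the target
    response-time threshold (illustrative default here).
    """
    satisfied = sum(1 for r in response_times if r <= t)
    tolerating = sum(1 for r in response_times if t < r <= 4 * t)
    return (satisfied + tolerating / 2) / len(response_times)

# 3 satisfied, 1 tolerating, 1 frustrated out of 5 samples -> 0.7
print(apdex([0.2, 0.3, 0.4, 1.1, 3.0]))  # 0.7
```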

SLIDE 10

G.107 – E-model - VQM

► Need models for establishing the relation between objective and subjective quality
► G.107 / E-model is an example for audio
► VQM for video
▪ Document: NTIA Report 02-392, "Video Quality Measurement Techniques", by Pinson and Wolf: http://www.its.bldrdoc.gov/n3/video/documents.htm
▪ Developed by ITS / NTIA of the U.S. Dept. of Commerce


G.107 / E-Model

► Model for audio / IP telephony
► Uses impairment factors
► Related standards:
▪ G.108
▪ G.113
▪ G.175
▪ G.562
▪ ...
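The E-model combines its impairment factors into a transmission rating R = Ro − Is − Id − Ie + A, which G.107 maps to an estimated MOS; a minimal sketch using the standard mapping formula (the default parameter values below are illustrative, not from the slides):

```python
def e_model_r(ro=93.2, is_=1.4, id_=0.0, ie=0.0, a=0.0):
    """Transmission rating factor R = Ro - Is - Id - Ie + A (ITU-T G.107).

    Ro: basic signal-to-noise ratio; Is: simultaneous impairments;
    Id: delay impairments; Ie: equipment (codec/packet-loss) impairment;
    A: advantage factor. Defaults are illustrative near-default values.
    """
    return ro - is_ - id_ - ie + a

def r_to_mos(r):
    """Map R to an estimated MOS with the G.107 conversion formula."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

# E.g. a codec with equipment impairment Ie = 11 (illustrative value)
r = e_model_r(ie=11.0)
print(round(r_to_mos(r), 2))
```

The cubic term in `r_to_mos` is why MOS saturates near both ends of the R scale: small impairments barely move a high rating, while a mid-range rating is very sensitive to them.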

SLIDE 11

VQM – Video Quality Metric

► Objective measurement method
► How and what to measure
► Formulas to calculate the VQM value
► Relation VQM ↔ subjective

www.nr.no

The MOVIS-factor M

[Diagram: delivery chain from original content through encoder, streaming server, content-provider network, ISP network, user network, and user equipment to the end user, with a MOVIS ancillary channel alongside; each stage is annotated with its contribution: codec, parameter settings and encoder type; streaming-server type and settings; networking parameters, topology and type; user-equipment type and parameters; viewing conditions (not part of MOVIS).]

The MOVIS-factor M combines per-stage factors: MO (original content), ME(Codec) (encoder), MS(Codec, Cont) (streaming server), MN(Codec) (network), MU(Codec) (user equipment), and MV(Cont) (viewing), plus an advantage factor MA(...).
SLIDE 12

SAMVIQ – ITU-R BT.700

► Standard for video quality assessment in multimedia
► SAMVIQ submitted as a draft standard to ITU-R SG6 WP 6Q
► Builds on experiences from MUSHRA (audio quality measurement)
► Uses a hidden reference and a low anchor; the user knows the total scale and may change a vote; scale from 0 to 100
► High reliability, comparability with other test labs, absolute results
► Use scenarios:
▪ Measure impact of different bit rates (MOVIS WP2)
▪ Comparison of codecs (MOVIS WP2)
▪ Minimum rate for specified quality (MOVIS WP2)
▪ Impact of network errors (MOVIS WP3)

SLIDE 13

SAMVIQ: GUI used for sessions


SAMVIQ: Structure of test sessions

[Diagram: in each SAMVIQ test sequence the explicit reference (ref) is always accessible, while the hidden reference and the algorithms under test are randomly assigned to the stimulus buttons, e.g.:
► 1st sequence: D = Algo. 3, C = hidden reference, B = Algo. 1, A = Algo. 2
► 2nd sequence: D = Algo. 2, C = Algo. 1, B = hidden reference, A = Algo. 3
► k-th sequence: D = Algo. 2, C = Algo. 3, B = Algo. 1, A = hidden reference
Example: Algo. 1: WM9, CIF, 168 kbps; Algo. 2: WM9, CIF, 1032 kbps]
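The per-sequence shuffling shown above can be sketched as follows; this is a simplified illustration of the randomization idea, not the MOVIS test software:

```python
import random

def samviq_assignment(algorithms, seed=None):
    """Randomly assign the hidden reference and each algorithm under test
    to the stimulus buttons (A, B, C, ...) for one SAMVIQ test sequence.
    The explicit reference 'ref' is always presented separately."""
    rng = random.Random(seed)
    stimuli = ["hidden reference"] + list(algorithms)
    rng.shuffle(stimuli)  # fresh random order for every sequence
    buttons = [chr(ord("A") + i) for i in range(len(stimuli))]
    return dict(zip(buttons, stimuli))

# One possible button order for a sequence with three codecs under test
print(samviq_assignment(["Algo. 1", "Algo. 2", "Algo. 3"], seed=1))
```

Because the hidden reference is shuffled in with the test conditions, a rating below 100 for it exposes an unreliable viewer, which is one source of SAMVIQ's reliability claim.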
SLIDE 14

Sequences used in assessment

Scenes:
► Skiing – bright
► Rugby – less detail
► Rainman – high frequencies
► Barcelona – colourful


Results: influence of content on quality

[Plot: mean score with confidence intervals (scale: Bad, Poor, Fair, Good, Excellent) versus total bit rate (400–1200 kbps) for Windows Media 9 in CIF format; curves for the global mean and for the Skiing, Rugby, Rainman, and Barcelona sequences.]