Caching in HTTP Adaptive Streaming: Friend or Foe? Danny Lee, Ali C. Begen, Constantine Dovrolis (PowerPoint presentation)



SLIDE 1

Caching in HTTP Adaptive Streaming: Friend or Foe?

Ali C. Begen, Danny Lee, Constantine Dovrolis

SLIDE 2

Objectives

  • What happens when a cache lies between a content server and client?
  • What is the simplest scenario that results in bitrate oscillations?
  • How can we prevent bitrate oscillations in the presence of caching?

SLIDE 3

Outline

  • Overview of adaptive streaming over HTTP
  • Oscillations due to interaction between cache and client

  • A traffic shaping solution
  • Simulation description
  • Experiments and Results
  • Conclusions

SLIDE 4

Outline

  • Overview of adaptive streaming over HTTP
  • Oscillations due to interaction between cache and client

  • A traffic shaping solution
  • Simulation description
  • Experiments and Results
  • Conclusions

SLIDE 5

Adaptive Streaming over HTTP

  • Media is split into “segments”, encoded in multiple bitrates
  • Clients adaptively request segments based on their estimate of available bandwidth
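The request loop above can be sketched in a few lines. The ladder below is the deck's representation set from slide 14; the safety margin and the function name are illustrative assumptions, not something the slides specify.

```python
# Minimal sketch of a client's per-segment bitrate selection.
# BITRATES_KBPS is the deck's ladder; the 0.9 margin is a hypothetical choice.

BITRATES_KBPS = [256, 768, 1500, 2800, 4500]

def select_bitrate(estimated_bw_kbps, margin=0.9):
    """Pick the highest representation not exceeding a fraction of the
    client's estimated available bandwidth."""
    usable = estimated_bw_kbps * margin
    candidates = [b for b in BITRATES_KBPS if b <= usable]
    return candidates[-1] if candidates else BITRATES_KBPS[0]
```

For example, an estimate of 2000 Kbps leaves 1800 Kbps usable, so the client requests the 1.5 Mbps representation.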

[Figure: a media stream split into segments, each encoded at multiple bitrates (256 Kbps, 1.0 Mbps, 2.0 Mbps); the requested bitrate varies per segment]

SLIDE 6

Outline

  • Overview of adaptive streaming over HTTP
  • Oscillations due to interaction between cache and client
– How does it cause problems?

  • A traffic shaping solution
  • Simulation description
  • Experiments and Results
  • Conclusions

SLIDE 7

Caching – A Simple Model

  • Caches may be deployed to reduce upstream bandwidth usage, and provide better downstream latency and bandwidth to clients
  • Media segments are transferred over HTTP, and may be cached in the cache server
  • For a cache in an access network, typically CS < CC

[Figure: Origin Server → Cache Server → Client. CS = origin-to-cache bandwidth; CC = cache-to-client bandwidth]

SLIDE 8

Erroneous Bandwidth Estimation due to Cache Hit

  • Clients retrieving cached segments will receive them at CC
– Causes an artificially high bandwidth estimate
  • But faster is better, right?

[Figure: a cache hit is delivered to the client at 5 Mbps (CC), while a cache miss is delivered at 1.6 Mbps (CS)]

SLIDE 9

Bitrate Oscillations

  • Problems arise when a mid-quality bitrate video is cached
  • Typically, consecutive segments are cached
  • Clients switch between high and low quality video every few seconds – annoying!
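The hit/miss feedback loop behind this oscillation can be reproduced with a toy simulation. The 1.5/2.0 Mbps bitrates and the 5.0/1.6 Mbps link speeds follow the sequence shown on this slide; the ladder and the one-step switching rule are simplifying assumptions.

```python
# Toy reproduction of cache-induced bitrate oscillation: the client's
# bandwidth estimate is the delivery rate of the last segment, which is
# CC (5.0 Mbps) on a cache hit and CS (1.6 Mbps) on a miss.

CS, CC = 1.6, 5.0                          # Mbps (slide 8 example)
LADDER = [0.256, 0.768, 1.5, 2.0, 2.8]     # hypothetical ladder incl. 1.5/2.0
CACHED = {1.5}                             # only the mid-quality rate is cached

def next_bitrate(current, observed_rate):
    """Step up if the observed rate supports the next representation,
    step down if it cannot sustain the current one."""
    idx = LADDER.index(current)
    if idx + 1 < len(LADDER) and observed_rate > LADDER[idx + 1]:
        return LADDER[idx + 1]
    if observed_rate < current:
        return LADDER[max(idx - 1, 0)]
    return current

bitrate, history = 1.5, []
for _ in range(6):
    history.append(bitrate)
    observed = CC if bitrate in CACHED else CS   # hit -> fast, miss -> slow
    bitrate = next_bitrate(bitrate, observed)
# history alternates: 1.5, 2.0, 1.5, 2.0, ...
```

The run never settles: every cache hit inflates the estimate, every resulting miss deflates it.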

[Figure: oscillation sequence. The client requests a 1.5 Mbps segment; a cache hit returns it at 5.0 Mbps, so the client increases its requested bitrate to 2.0 Mbps. That request is a cache miss and the segment is returned at 1.6 Mbps, so the client decreases back to 1.5 Mbps, hits the cache again, and the cycle repeats]

SLIDE 10

Outline

  • Overview of adaptive streaming over HTTP
  • Oscillations due to interaction between cache and client

  • A traffic shaping solution
  • Simulation description
  • Experiments and Results
  • Conclusions

SLIDE 11

Video Shaping Intelligent Cache (ViSIC)

  • Control the bitrates requested by the client by shaping the download speed
– Prevent erroneous bandwidth estimates
– Smooth fluctuations in available bandwidth
  • Cache Server Implementation
– Independent of client and origin server
– Reduce upstream bandwidth usage
– Serve cached segments faster than no-cache

SLIDE 12

The Shaping Algorithm – High Level Description

  • Estimate CS and CC from all traffic passing through the cache server
  • Select a target bitrate we want the client to use
– Detect long-duration changes in available bandwidth and allow increases/decreases in the target bitrate
– Otherwise favor the bitrates of cached segments
  • Shape downloads from the cache
– Use a higher rate than the target bitrate
– But lower than what causes clients to switch rates
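The target-selection step above can be sketched as follows, assuming the long-term-change signal comes from a separate bandwidth monitor (not shown); the names and structure are illustrative, not ViSIC's actual implementation.

```python
# Sketch of the target-bitrate selection at the cache: keep the client on
# the current (typically cached) bitrate unless the available bandwidth
# has changed for a long duration.

LADDER_KBPS = [256, 768, 1500, 2800, 4500]  # deck's representation ladder

def choose_target_kbps(current_kbps, path_bw_kbps, long_term_change):
    """On a long-duration bandwidth change, retarget to the highest
    representation the path supports; otherwise hold the current bitrate
    to avoid oscillation."""
    if not long_term_change:
        return current_kbps
    fits = [b for b in LADDER_KBPS if b <= path_bw_kbps]
    return fits[-1] if fits else LADDER_KBPS[0]
```

Holding the target during short-lived fluctuations is what absorbs the cache-hit speedups that would otherwise trigger a switch.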

[Figure: the client requests 1.5 Mbps segments; the cache, which knows CS and CC and holds 1.5 Mbps segments, shapes the download to 1.9 Mbps]

SLIDE 13

Outline

  • Overview of adaptive streaming over HTTP
  • Interaction between cache and client
  • A traffic shaping solution
  • Simulation description
– Setup
– Client and Standard Cache
  • Experiments and Results
  • Conclusions

SLIDE 14

Simulation Description – Setup

  • Compare different Cache Server scenarios: ViSIC, Standard Cache and No-cache
  • We varied CS and CC
  • Representation Bitrates: 256 Kbps, 768 Kbps, 1.5 Mbps, 2.8 Mbps, 4.5 Mbps

[Figure: Origin Server → Cache Server → Client. CS = origin-to-cache bandwidth; CC = cache-to-client bandwidth]

SLIDE 15

Simulation Description – Client and Cache

  • Client

– Simulates a typical adaptive streaming player
– Adjusts requested bitrates based on average segment throughput
– If the video buffer level falls below a low threshold, engage “panic mode”

  • Standard Cache

– Simulates a traditional Web cache
– Cache hit: serve files at maximum speed
– Cache miss: serve files at upstream speed

SLIDE 16

Outline

  • Overview of adaptive streaming over HTTP
  • Oscillations due to interaction between cache and client

  • A traffic shaping solution
  • Simulation description
  • Experiments and Results
– Constant Bandwidth
– Fluctuating Bandwidth
  • Conclusions

SLIDE 17

Constant Bandwidth

[Figure: requested representation bitrate (Mbps) vs. time (s) under constant available bandwidth, comparing No-Cache, Standard Cache, and ViSIC against the available bandwidth]

SLIDE 18

Fluctuating Bandwidth - ViSIC

[Figure: requested representation bitrate (Mbps) vs. time (s) for ViSIC under fluctuating available bandwidth, with annotated regions: bandwidth spikes and drops while serving cached segments, and while serving uncached segments]

SLIDE 19

Fluctuating Bandwidth – Stability

[Figure: instability metric vs. segment count for No-Cache, Standard Cache, and ViSIC]

SLIDE 20

Outline

  • Overview of adaptive streaming over HTTP
  • Oscillations due to interaction between cache and client

  • A traffic shaping solution
  • Simulation description
  • Experiments and Results
  • Conclusions

SLIDE 21

Conclusions

  • A cache server in the path of an HTTP adaptive streaming client can cause problems
– Bitrate oscillations, buffer draining
  • Cause: cache hits lead to erroneous bandwidth estimates
– Clients overestimate the actual path bandwidth
– Clients request segments that are unsustainable

  • Traffic shaping at the cache can prevent oscillations and buffer drains
– Maintains cache benefits over no-cache

SLIDE 22

Acknowledgments

Ashok Narayanan

Ashok presented the cache-induced instability problem at the Adaptive Media Transport Workshop, organized by Cisco, in June 2012

SLIDE 23

In Memory Of

Saamer Akhshabi 1 April 1987 – 6 March 2014

SLIDE 24

Thanks

SLIDE 25

Typical Behavior of a Player

  • Estimates available bandwidth using a running average of per-segment TCP throughput measurements
  • Adaptive segment bitrate selection
– Increase if throughput is high (i.e., can support higher-bitrate segments)
– Decrease if throughput is lower than the current bitrate (i.e., the transfer is slower than real time)
  • Client buffer levels affect the state
– “Panic mode” to recover from a low-buffer situation
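This player behavior can be sketched as follows; the window size and panic threshold are illustrative assumptions, not values from the deck.

```python
# Sketch of the player's state: a running average of per-segment throughput
# drives rate selection, and a low buffer level triggers panic mode.

from collections import deque

class PlayerState:
    def __init__(self, window=5, panic_threshold_s=4.0):
        # window and panic_threshold_s are hypothetical tuning values
        self.samples = deque(maxlen=window)   # recent per-segment throughputs
        self.panic_threshold_s = panic_threshold_s

    def observe_segment(self, throughput_kbps):
        """Record the measured TCP throughput of one segment download."""
        self.samples.append(throughput_kbps)

    def estimated_bw_kbps(self):
        """Running average over the most recent segment throughputs."""
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

    def in_panic(self, buffer_level_s):
        """Panic mode: buffer fell below the low threshold, so the player
        should drop to the lowest bitrate to refill quickly."""
        return buffer_level_s < self.panic_threshold_s
```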

SLIDE 26

Selecting the Target Bitrate – More Details (i)

  • By shaping the download speed, we can cause the client to select specific bitrates – the Target Bitrate
  • Two possibilities:
1. Stay at the current segment bitrate
  • Cache hit: avoid erroneous high-bandwidth estimation
  • Absorb short-term CS bandwidth fluctuations
  • Guard against CS bandwidth decreases when serving cached segments
2. Change to a bitrate supported by the available path bandwidth
  • Allow the client to adapt to long-term bandwidth increases/decreases

SLIDE 27

Selecting the Target Bitrate – More Details (ii)

Case I: When CS ≤ CC

  • For a cache hit
– If CS has a long-term increase, use the current path bandwidth
– Else shape at the current segment bitrate
  • For a cache miss
– If CS has a long-term increase or decrease, use the current path bandwidth
– Else shape at the current segment bitrate

Case II: When CS > CC

  • Whether cache hit or miss
– If CC has a long-term increase, use the current path bandwidth
– Else shape at the current segment bitrate
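The two cases above form a small decision table. A direct transcription follows, with the cache-hit and bandwidth-trend inputs assumed to be computed elsewhere (cache lookup, long-term trend detection); the function name is illustrative.

```python
# Decision table for the target-bitrate cases: returns True when the target
# should follow the current path bandwidth, False when the cache should keep
# shaping at the current segment bitrate.

def use_path_bw(cs_le_cc, cache_hit, cs_increase, cs_decrease, cc_increase):
    if cs_le_cc:                           # Case I: CS <= CC
        if cache_hit:
            return cs_increase             # only a long-term CS increase retargets
        return cs_increase or cs_decrease  # miss: follow any long-term CS change
    return cc_increase                     # Case II: CS > CC, hit or miss alike
```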

SLIDE 28

The Shaping Algorithm - More Details

  • Make use of the higher available bandwidth between cache server and client
– Serve a segment at a higher speed than the target bitrate
  • Solution:
1. Select the next higher representation bitrate
2. Multiply it by a factor β (we used 0.9)
  • Shape at a higher speed than the current bitrate, but lower than the next higher representation
– Make use of CS < CC
– Better performance than No-Cache
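The two-step rule can be written directly. The ladder and β = 0.9 are the deck's; the top-of-ladder guard is an added assumption, since the slides do not cover the case where no higher representation exists.

```python
# Shaping rate = beta * (next higher representation bitrate): faster than the
# target, but below the rate that would make the client switch up.

LADDER_KBPS = [256, 768, 1500, 2800, 4500]

def shaped_rate_kbps(target_kbps, beta=0.9):
    idx = LADDER_KBPS.index(target_kbps)
    if idx == len(LADDER_KBPS) - 1:
        # assumption: at the top of the ladder there is nothing higher to
        # guard against, so serve at the target bitrate itself
        return float(target_kbps)
    return beta * LADDER_KBPS[idx + 1]
```

With a 1.5 Mbps target this yields 0.9 × 2800 = 2520 Kbps, matching the 2.52 Mbps worked example on the next slide.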

SLIDE 29

The Shaping Algorithm - Example

  • Representation Bitrates: 256 Kbps, 768 Kbps, 1.5 Mbps, 2.8 Mbps, 4.5 Mbps
  • CC: 4.0 Mbps
  • Target Bitrate: 1.5 Mbps
  • Shaped Bitrate: 0.9 × 2.8 Mbps = 2.52 Mbps
  • The client will continue to request 1.5 Mbps segments, but receive them at a higher rate!

[Figure: shaping example over time. The target bitrate is the rate we want the client to stay at; the shaped rate sits above it but below the next higher representation bitrate, beyond which the client would switch up, all within the available CC bandwidth]

SLIDE 30

Simulation Description – Standard Cache

  • Functions as a cut-through cache
– Intercepts HTTP requests to the server
– Files not in the cache don’t need to be fully downloaded before being served
  • Keeps a list of files that represents the cached segments
  • If a file exists on disk, it is a cache hit: served at CC
  • Cache miss
– Starts a new transfer; the file is served one RTT later
– Effective bandwidth is CS
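A minimal model of this serving behavior, used as the baseline in the simulation; the function shape is illustrative.

```python
# Standard (non-shaping) cache: hits are served at the downstream rate CC,
# cut-through misses at the upstream rate CS with one extra RTT of delay.

def serve(segment, cached, cs_mbps, cc_mbps, rtt_s):
    """Return (serving_rate_mbps, extra_delay_s) for one segment request."""
    if segment in cached:
        return cc_mbps, 0.0       # cache hit: full downstream speed
    return cs_mbps, rtt_s         # miss: upstream speed, served one RTT later
```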

SLIDE 31

Fluctuating Bandwidth – Full Results

[Figure: requested representation bitrate (Mbps) vs. time (s) under fluctuating available bandwidth: No-Cache, Standard Cache, and ViSIC against the available bandwidth]

SLIDE 32

Fluctuating Bandwidth – Playback Start Time

  • Playback starts after the client buffer is full

Scenario          Playback Start (sec)
ViSIC             11.954
No-cache          14.258
Standard Cache    15.664

SLIDE 33

Fluctuating Bandwidth – Buffer Fullness

We aim to keep the buffer full

  • Allows the user to seek in the buffered region
  • Minimal quality disruptions during playback

[Figure: buffer fullness over time. The playback buffer stays near full for ViSIC in all scenarios; Panic Mode is engaged multiple times for No-Cache and Standard Cache]