Caching in HTTP Adaptive Streaming: Friend or Foe?
Ali C. Begen Danny Lee Constantine Dovrolis
Objectives
– What happens when a cache lies between a content server and client?
– What is the simplest scenario that results in bitrate oscillations?
– How can we prevent bitrate oscillations in the presence of caching?
HTTP adaptive streaming basics
– Media is divided into “segments”, encoded in multiple bitrates
– Clients request segments based on their estimate of available bandwidth
[Figure: a media stream served as segments at varying bitrates, e.g., 256 Kbps, 1.0 Mbps, 2.0 Mbps]
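The client's selection step can be sketched as a toy function: pick the highest ladder bitrate that fits the bandwidth estimate. The bitrate ladder is taken from the evaluation setup later in the deck; the function name is illustrative, not a real player API.

```python
# Hypothetical sketch of adaptive-streaming rate selection: choose the
# highest representation bitrate that the bandwidth estimate can sustain.

LADDER_BPS = [256_000, 768_000, 1_500_000, 2_800_000, 4_500_000]

def select_bitrate(estimated_bw_bps: float) -> int:
    """Return the highest representation bitrate <= the bandwidth estimate."""
    candidates = [r for r in LADDER_BPS if r <= estimated_bw_bps]
    return max(candidates) if candidates else LADDER_BPS[0]

print(select_bitrate(1_600_000))  # a 1.6 Mbps estimate selects 1500000 (1.5 Mbps)
```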
What happens when a cache lies between the content server and client?
– How does it cause problems?
Why caching?
– Caches reduce upstream bandwidth usage, and provide better downstream latency and bandwidth to clients
– Segments are ordinary HTTP objects, so they can be cached in the cache server
Notation
– CS: origin server to cache server bandwidth
– CC: cache server to client bandwidth
[Figure: Origin Server → Cache Server → Client]

Cache hits inflate the client’s estimate
– Clients request segments over the end-to-end path, but receive cached segments at CC
– Causes an artificially high bandwidth estimation
[Figure: CS = 1.6 Mbps, CC = 5 Mbps; a cache hit is served at 5 Mbps, while a cache miss is limited to 1.6 Mbps]
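A toy calculation with the numbers from this slide shows how cache hits inflate a throughput-averaging estimator. The CS/CC values are from the slide; the particular mix of hits and misses is made up for illustration.

```python
# Cache hits are delivered at CC = 5 Mbps even though the origin path
# CS is only 1.6 Mbps, so a per-segment throughput average overshoots.

CS = 1.6e6  # origin-to-cache bandwidth (bps), from the slide
CC = 5.0e6  # cache-to-client bandwidth (bps), from the slide

def segment_throughput(cache_hit: bool) -> float:
    # A hit is served at the downstream rate CC; a miss is limited by CS.
    return CC if cache_hit else min(CS, CC)

# Assumed hit/miss pattern for illustration: three hits, one miss.
samples = [segment_throughput(hit) for hit in (True, True, False, True)]
estimate = sum(samples) / len(samples)  # average per-segment throughput
print(f"estimate = {estimate / 1e6:.2f} Mbps")  # prints "estimate = 4.15 Mbps"
```

The estimate (4.15 Mbps) is far above what the 1.6 Mbps origin path can actually sustain for uncached segments.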
The oscillation scenario
– Oscillations arise when a mid-quality bitrate video is cached, i.e., continuous segments are cached at an intermediate bitrate
– The client switches between high and low quality video every few seconds – annoying!

Walkthrough (Origin Server → Cache Server → Client):
1. Client requests a segment at the 1.5 Mbps bitrate; cache hit, so the segment is returned at 5.0 Mbps
2. Client increases the requested bitrate
3. Client requests a segment at the 2.0 Mbps bitrate; cache miss, so the request goes to the origin and the segment is returned at 1.6 Mbps
4. Client decreases the requested bitrate
5. Client requests a segment at the 1.5 Mbps bitrate; cache hit again, returned at 5.0 Mbps – and the cycle repeats
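The walkthrough above can be reproduced with a toy loop. The 5.0/1.6 Mbps delivery rates and the cached 1.5 Mbps representation come from the slide; the naive up/down switching client and the ladder details are illustrative assumptions.

```python
# Toy reproduction of the oscillation walkthrough: only the 1.5 Mbps
# representation is cached; hits arrive at 5.0 Mbps, misses at 1.6 Mbps,
# and a naive client switches up after a fast hit and down after a slow miss.

LADDER = [0.256, 0.768, 1.5, 2.0, 2.8]  # Mbps; 2.0 taken from the slide
CACHED = {1.5}                          # only mid-quality segments cached

def next_bitrate(current: float) -> float:
    throughput = 5.0 if current in CACHED else 1.6  # Mbps, as in the slide
    idx = LADDER.index(current)
    if throughput > current and idx + 1 < len(LADDER):
        return LADDER[idx + 1]  # overestimate from a cache hit -> switch up
    if throughput < current and idx > 0:
        return LADDER[idx - 1]  # slow cache miss -> switch back down
    return current

rates, r = [], 1.5
for _ in range(6):
    rates.append(r)
    r = next_bitrate(r)
print(rates)  # oscillates: [1.5, 2.0, 1.5, 2.0, 1.5, 2.0]
```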
ViSIC: stabilize the client by shaping the download speed
– Prevent erroneous bandwidth estimates
– Smooth fluctuations in available bandwidth
– Independent of client and origin server
– Reduce upstream bandwidth usage
– Serve cached segments faster than no-cache
Shaping through the cache server
– Detect long-duration changes in available bandwidth and allow an increase/decrease in the target bitrate
– Otherwise favor the bitrates of cached segments
Serving from the cache
– Use a higher rate than the target bitrate
– But lower than what causes clients to switch rates
[Figure: Cache Server ↔ Client, over bandwidths CS and CC]
– Client: “1.5 Mbps, please”
– Cache server (1.5 Mbps segments cached): “OK, download at 1.9 Mbps”
– Client: “Score!”
Evaluation setup
– Three scenarios: ViSIC, Standard Cache, and No-cache
– Representation bitrates: 256 Kbps, 768 Kbps, 1.5 Mbps, 2.8 Mbps, 4.5 Mbps
[Figure: Origin Server → Cache Server → Client, with CS (origin-to-cache bandwidth) and CC (cache-to-client bandwidth)]
Client model
– Simulates a typical adaptive streaming player
– Adjusts requested bitrates based on measured throughput
– If the video buffer level falls below a low threshold, engage “panic mode”
Standard cache model
– Simulates a traditional Web cache
– Cache hit: serve files at maximum speed
– Cache miss: serve files at upstream speed
[Figure: requested representation bitrate (Mbps) vs. time (s), 0–200 s; curves for Available Bandwidth, No-Cache, Standard Cache, and ViSIC]
[Figure: requested representation bitrate (Mbps) vs. time (s), 0–500 s; Available Bandwidth and ViSIC]
[Figure: instability metric vs. segment count for No-Cache, Standard Cache, and ViSIC, under bandwidth spikes and drops with both cached and uncached segments]
Conclusions
– A cache between the content server and the streaming client can cause problems
– Bitrate oscillations, buffer draining
– Cache hits lead to erroneous bandwidth estimations
– Clients overestimate the actual path bandwidth
– Clients request segments that are unsustainable
– Shaping the download speed at the cache prevents these problems
– Maintains cache benefits over no-cache
Acknowledgment: Ashok Narayanan
Ashok presented the cache-induced instability problem at the Adaptive Media Transport Workshop, organized by Cisco, in June 2012.
Saamer Akhshabi 1 April 1987 – 6 March 2014
Client rate adaptation
– Bandwidth estimate: average of per-segment TCP throughput measurements
– Increase if throughput is high (i.e., can support higher bitrate segments)
– Decrease if throughput is lower than the current bitrate (i.e., transfer is slower than real time)
– “Panic mode” to recover from a low-buffer situation
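These rules can be sketched as follows. Only the increase/decrease/panic logic comes from the slide; the panic threshold value and the function shape are assumptions for illustration.

```python
# Sketch of the client adaptation rules: average per-segment TCP throughput,
# step the bitrate up/down against that average, and engage "panic mode"
# when the buffer falls below a low threshold.

LADDER = [0.256, 0.768, 1.5, 2.8, 4.5]  # Mbps, from the evaluation setup
PANIC_THRESHOLD_S = 5.0                 # assumed low-buffer threshold (seconds)

def adapt(current: float, throughputs: list, buffer_s: float):
    """Return (next_bitrate, panic_engaged)."""
    if buffer_s < PANIC_THRESHOLD_S:
        return LADDER[0], True          # panic: drop to the lowest bitrate
    avg = sum(throughputs) / len(throughputs)
    idx = LADDER.index(current)
    if idx + 1 < len(LADDER) and avg >= LADDER[idx + 1]:
        return LADDER[idx + 1], False   # throughput supports a step up
    if avg < current and idx > 0:
        return LADDER[idx - 1], False   # slower than real time: step down
    return current, False

print(adapt(1.5, [3.0, 2.9, 3.1], buffer_s=20.0))  # -> (2.8, False)
```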
Target bitrate
– ViSIC shapes traffic to steer the client to select specific bitrates – the target bitrate
1. Stay at the current segment bitrate while continuous segments are cached
2. Change to the bitrate supported by the available path bandwidth when it has long-term increases/decreases
Choosing the shaping rate
Case I: When CS ≤ CC
– If CS has a long-term increase, use the current path bandwidth; else shape at the current segment bitrate
– If CS has a long-term increase or decrease, use the current path bandwidth; else shape at the current segment bitrate
Case II: When CS > CC
– If CC has a long-term increase, use the current path bandwidth; else shape at the current segment bitrate
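One possible reading of these rules in code. Hedged: the slide lists two bullets under Case I, which this sketch collapses into a single increase-or-decrease test, and the trend booleans stand in for an unspecified long-term change detector.

```python
# Illustrative transcription of the Case I / Case II shaping rules.
# cs_trend_* and cc_trend_up are assumed outputs of some long-term
# bandwidth change detector (not specified in the deck).

def shaping_decision(cs: float, cc: float, segment_bitrate: float,
                     cs_trend_up: bool, cs_trend_down: bool,
                     cc_trend_up: bool) -> float:
    path_bw = min(cs, cc)  # current end-to-end path bandwidth
    if cs <= cc:  # Case I: upstream link is the bottleneck
        if cs_trend_up or cs_trend_down:
            return path_bw          # follow a long-term change in CS
        return segment_bitrate      # otherwise hold the current bitrate
    # Case II: downstream link is the bottleneck
    if cc_trend_up:
        return path_bw
    return segment_bitrate

# Steady state with CS = 1.6 Mbps, CC = 5 Mbps: shape at the segment bitrate.
print(shaping_decision(1.6, 5.0, 1.5, False, False, False))  # -> 1.5
```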
Serving speed between the cache server and client
– Serve a segment at a higher speed than the target bitrate, but lower than the next higher representation
1. Select the next higher representation bitrate
2. Multiply it by a factor β (we used 0.9)
– Makes use of CS < CC
– Better performance than No-Cache
Example
– Representation bitrates: 256 Kbps, 768 Kbps, 1.5 Mbps, 2.8 Mbps, 4.5 Mbps
– Target bitrate 1.5 Mbps → next higher representation is 2.8 Mbps
– Shaping rate: 0.9 × 2.8 Mbps = 2.52 Mbps
– Clients keep requesting 1.5 Mbps segments, but receive them at a higher rate!
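The β rule from this example as a small function. The ladder and β = 0.9 come from the slides; the function name is illustrative.

```python
# Shape at beta times the next higher representation bitrate, so the client
# is served faster than the target bitrate but stays below the switch-up point.

LADDER_MBPS = [0.256, 0.768, 1.5, 2.8, 4.5]
BETA = 0.9  # value used in the talk

def shaping_rate(target_mbps: float) -> float:
    idx = LADDER_MBPS.index(target_mbps)
    next_higher = LADDER_MBPS[min(idx + 1, len(LADDER_MBPS) - 1)]
    return BETA * next_higher

print(shaping_rate(1.5))  # approximately 2.52 Mbps, matching the slide
```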
[Figure: rate vs. time, annotated with Avail-BW (“we have this much CC bandwidth”), Target Bitrate (“we want the client to stay at this rate”), Next Higher BR (“client will switch up if we shape beyond this”), and the Shaped Rate (“we shape at this rate”)]
Cache misses
– The cache intercepts HTTP requests to the server
– Files not in the cache don’t need to be fully downloaded to be served
– The cache starts a new transfer and serves the file one RTT later; the effective bandwidth is CS
[Figure: requested representation bitrate (Mbps) vs. time (s), 0–500 s; Available Bandwidth, No-Cache, Standard Cache, and ViSIC]
Playback start times (playback starts after the client buffer is full):

Scenario          Playback Start (sec)
ViSIC             11.954
No-cache          14.258
Standard Cache    15.664
Playback buffer behavior
– We aim to keep the buffer full; staying in the buffered region means no disruptions during playback
– The playback buffer is near full for ViSIC in all scenarios
– Panic mode is engaged multiple times for No-cache and Standard Cache
[Figure: playback buffer level over time, with markers where panic mode engaged]