SLIDE 1
Fundamental Limits of Caching
Urs Niesen, jointly with Mohammad Maddah-Ali
Bell Labs, Alcatel-Lucent
SLIDE 2
SLIDE 3
Video on Demand
Video on demand is getting increasingly popular: Netflix streaming, Amazon Instant Video, Hulu, Verizon/Comcast On Demand, ...
⇒ Places significant stress on service providers' networks
SLIDE 4
Video on Demand
Video on demand is getting increasingly popular: Netflix streaming, Amazon Instant Video, Hulu, Verizon/Comcast On Demand, ...
⇒ Places significant stress on service providers' networks
⇒ Caching (prefetching) can be used to mitigate this stress
SLIDE 5
Caching (Prefetching)
[Plot: normalized demand vs. time of day]
SLIDE 6
Caching (Prefetching)
[Plot: normalized demand vs. time of day]
High temporal traffic variability
SLIDE 7
Caching (Prefetching)
[Plot: normalized demand vs. time of day]
High temporal traffic variability Caching can help smooth traffic
SLIDE 8
The Role of Caching
Conventional beliefs about caching:
SLIDE 9
The Role of Caching
Conventional beliefs about caching:
- Caches useful to deliver content locally
SLIDE 10
The Role of Caching
Conventional beliefs about caching:
- Caches useful to deliver content locally
- Local cache size matters
SLIDE 11
The Role of Caching
Conventional beliefs about caching:
- Caches useful to deliver content locally
- Local cache size matters
- Statistically identical users ⇒ identical cache content
SLIDE 12
The Role of Caching
Conventional beliefs about caching:
- Caches useful to deliver content locally
- Local cache size matters
- Statistically identical users ⇒ identical cache content
Insights from this work:
SLIDE 13
The Role of Caching
Conventional beliefs about caching:
- Caches useful to deliver content locally
- Local cache size matters
- Statistically identical users ⇒ identical cache content
Insights from this work:
- The main gain in caching is global
SLIDE 14
The Role of Caching
Conventional beliefs about caching:
- Caches useful to deliver content locally
- Local cache size matters
- Statistically identical users ⇒ identical cache content
Insights from this work:
- The main gain in caching is global
- Global cache size matters
SLIDE 15
The Role of Caching
Conventional beliefs about caching:
- Caches useful to deliver content locally
- Local cache size matters
- Statistically identical users ⇒ identical cache content
Insights from this work:
- The main gain in caching is global
- Global cache size matters
- Statistically identical users ⇒ different cache content
SLIDE 16
Problem Setting
[Diagram: server connected through a shared link to K users, each with a cache]
SLIDE 17
Problem Setting
[Diagram: server with N files (N ≥ K for simplicity) connected through a shared link to K users, each with a cache]
SLIDE 18
Problem Setting
[Diagram: server with N files connected through a shared link to K users, each with a cache of size M]
SLIDE 19
Problem Setting
[Diagram: server with N files connected through a shared link to K users, each with a cache of size M]
Placement: cache arbitrary function of files (linear, nonlinear, ...)
SLIDE 20
Problem Setting
[Diagram: server with N files connected through a shared link to K users, each with a cache of size M]
Delivery:
SLIDE 21
Problem Setting
[Diagram: server with N files connected through a shared link to K users, each with a cache of size M]
Delivery:
- requests are revealed to server
SLIDE 22
Problem Setting
[Diagram: server with N files connected through a shared link to K users, each with a cache of size M]
Delivery:
- requests are revealed to server
- server sends arbitrary function of files
SLIDE 23
Problem Setting
[Diagram: server with N files connected through a shared link to K users, each with a cache of size M]
Delivery:
- requests are revealed to server
- server sends arbitrary function of files
SLIDE 24
Problem Setting
[Diagram: server with N files connected through a shared link to K users, each with a cache of size M]
Question: smallest worst-case rate R(M) needed in delivery phase?
SLIDE 25
Conventional Caching Scheme
N files, K users, cache size M
[Diagram: each user caches an M/N fraction of each file]
SLIDE 26
Conventional Caching Scheme
N files, K users, cache size M
[Diagram: each user caches an M/N fraction of each file]
SLIDE 27
Conventional Caching Scheme
N files, K users, cache size M
[Diagram: each user caches an M/N fraction of each file]
SLIDE 28
Conventional Caching Scheme
N files, K users, cache size M
[Diagram: each user caches an M/N fraction of each file]
SLIDE 29
Conventional Caching Scheme
N files, K users, cache size M
[Diagram: each user caches an M/N fraction of each file]
SLIDE 30
Conventional Caching Scheme
N files, K users, cache size M
[Diagram: each user caches an M/N fraction of each file]
Performance of conventional scheme: R(M) = K · (1 − M/N)
SLIDE 31
Conventional Caching Scheme
N files, K users, cache size M
[Diagram: each user caches an M/N fraction of each file]
Performance of conventional scheme: R(M) = K · (1 − M/N)
Caches provide content locally ⇒ local cache size matters
Identical cache content at users
SLIDE 32
Conventional Caching Scheme
N = 2 files, K = 2 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 33
Conventional Caching Scheme
N = 4 files, K = 4 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 34
Conventional Caching Scheme
N = 8 files, K = 8 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 35
Conventional Caching Scheme
N = 16 files, K = 16 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 36
Conventional Caching Scheme
N = 32 files, K = 32 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 37
Conventional Caching Scheme
N = 64 files, K = 64 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 38
Conventional Caching Scheme
N = 128 files, K = 128 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 39
Conventional Caching Scheme
N = 256 files, K = 256 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 40
Conventional Caching Scheme
N = 512 files, K = 512 users
[Plot: rate R vs. cache size M, conventional scheme]
SLIDE 41
Proposed Caching Scheme
N files, K users, cache size M
Design guidelines advocated in this work:
- The main gain in caching is global
- Global cache size matters
- Different cache content at users
SLIDE 42
Proposed Caching Scheme
N files, K users, cache size M
Design guidelines advocated in this work:
- The main gain in caching is global
- Global cache size matters
- Different cache content at users
Performance of proposed scheme: R(M) = K · (1 − M/N) · 1/(1 + KM/N)
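The two rate expressions can be evaluated side by side; this is a minimal sketch of my own (function names assumed), not code from the talk:

```python
def conventional_rate(N, K, M):
    # Conventional scheme: each of the K users is unicast the
    # uncached (1 - M/N) fraction of its requested file.
    return K * (1 - M / N)

def proposed_rate(N, K, M):
    # Proposed scheme: the same local gain, multiplied by the
    # global multicasting gain 1 / (1 + K*M/N).
    return K * (1 - M / N) / (1 + K * M / N)

# Example: N = K = 32, cache size M = 8
# conventional: 32 * (1 - 8/32) = 24 file transmissions
# proposed:     24 / (1 + 8)    ≈ 2.67 file transmissions
```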
SLIDE 43
Proposed Caching Scheme
N = 2 files, K = 2 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 44
Proposed Caching Scheme
N = 4 files, K = 4 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 45
Proposed Caching Scheme
N = 8 files, K = 8 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 46
Proposed Caching Scheme
N = 16 files, K = 16 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 47
Proposed Caching Scheme
N = 32 files, K = 32 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 48
Proposed Caching Scheme
N = 64 files, K = 64 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 49
Proposed Caching Scheme
N = 128 files, K = 128 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 50
Proposed Caching Scheme
N = 256 files, K = 256 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 51
Proposed Caching Scheme
N = 512 files, K = 512 users
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 52
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
SLIDE 53
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
[Diagram: server with files A and B]
SLIDE 54
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Files split into halves: A = (A1, A2), B = (B1, B2)
SLIDE 55
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: both users cache (A1, B1); server holds A = (A1, A2), B = (B1, B2)
SLIDE 56
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: both users cache (A1, B1); server holds A = (A1, A2), B = (B1, B2)
SLIDE 57
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: both users cache (A1, B1); server sends A2
SLIDE 58
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Demands: both users request A; each caches (A1, B1); server sends A2
SLIDE 59
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Demands: both users request A; each caches (A1, B1); server sends A2
⇒ Identical cache content at users
⇒ Gain from delivering content locally
SLIDE 60
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: both users cache (A1, B1); server holds A = (A1, A2), B = (B1, B2)
SLIDE 61
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: both users cache (A1, B1); server sends A2 and B2
SLIDE 62
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Demands: user 1 requests A, user 2 requests B; server sends A2 and B2
SLIDE 63
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
Demands: user 1 requests A, user 2 requests B; server sends A2 and B2
⇒ Multicast only possible for users with same demand
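Replaying this conventional delivery in code (my own sketch; file contents and names are made up) shows why the rate is one full file: the two missing halves must be sent separately.

```python
# N = 2 files A, B, split into halves as in the slides.
A, B = b"AAAAaaaa", b"BBBBbbbb"
A1, A2 = A[:4], A[4:]
B1, B2 = B[:4], B[4:]

cache = (A1, B1)                # both users store the same halves
# Demands: user 1 requests A, user 2 requests B.
sent = [A2, B2]                 # two separate unicasts; no single message helps both

rec_A = cache[0] + sent[0]      # user 1 combines cached A1 with received A2
rec_B = cache[1] + sent[1]      # user 2 combines cached B1 with received B2
assert rec_A == A and rec_B == B

rate = sum(len(m) for m in sent) / len(A)   # 1.0 file, matching R(1) = 2 * (1 - 1/2)
```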
SLIDE 64
Recall: Conventional Scheme
N = 2 files, K = 2 users, cache size M = 1
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 65
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
SLIDE 66
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
[Diagram: server with files A and B]
SLIDE 67
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Files split into halves: A = (A1, A2), B = (B1, B2)
SLIDE 68
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: user 1 caches (A1, B1), user 2 caches (A2, B2); server holds A = (A1, A2), B = (B1, B2)
SLIDE 69
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: user 1 caches (A1, B1), user 2 caches (A2, B2); server holds A = (A1, A2), B = (B1, B2)
SLIDE 70
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: user 1 caches (A1, B1), user 2 caches (A2, B2); user 1 still needs A2, user 2 still needs B1
SLIDE 71
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: user 1 caches (A1, B1), user 2 caches (A2, B2); user 1 still needs A2, user 2 still needs B1
SLIDE 72
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Placement: user 1 caches (A1, B1), user 2 caches (A2, B2); server sends A2 ⊕ B1
SLIDE 73
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Demands: user 1 requests A, user 2 requests B; server sends A2 ⊕ B1
SLIDE 74
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Demands: user 1 requests A, user 2 requests B; server sends A2 ⊕ B1
⇒ Different cache content at users
⇒ Multicast to 2 users with different demands
SLIDE 75
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
Demands (A, A): server sends A2 ⊕ A1
Demands (A, B): server sends A2 ⊕ B1
Demands (B, A): server sends B2 ⊕ A1
Demands (B, B): server sends B2 ⊕ B1
⇒ Works for all possible user requests
⇒ Simultaneous multicasting gain
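The coded delivery can be verified end to end; below is a minimal sketch of my own (byte strings stand in for file halves), replaying the demand pair (A, B):

```python
def xor(x, y):
    # Bitwise XOR of two equal-length byte strings.
    return bytes(a ^ b for a, b in zip(x, y))

A, B = b"AAAAaaaa", b"BBBBbbbb"
A1, A2 = A[:4], A[4:]
B1, B2 = B[:4], B[4:]

cache1 = {"A1": A1, "B1": B1}   # user 1 caches different halves...
cache2 = {"A2": A2, "B2": B2}   # ...than user 2

# Demands: user 1 requests A, user 2 requests B.
msg = xor(A2, B1)               # single multicast message, half a file long

# User 1 cancels its cached B1 to recover A2; user 2 cancels A2 to recover B1.
rec_A = cache1["A1"] + xor(msg, cache1["B1"])
rec_B = xor(msg, cache2["A2"]) + cache2["B2"]
assert rec_A == A and rec_B == B   # both users served at rate 1/2
```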
SLIDE 76
Proposed Scheme
N = 2 files, K = 2 users, cache size M = 1
[Plot: rate R vs. cache size M, conventional vs. proposed scheme]
SLIDE 77
Proposed Scheme
N files, K users, cache size M
Scheme can be generalized to arbitrary:
- Number of files N
- Number of users K
- Cache size M
SLIDE 78
Proposed Scheme
N files, K users, cache size M
Scheme can be generalized to arbitrary:
- Number of files N
- Number of users K
- Cache size M
Enables multicast to KM/N + 1 users with different demands
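For integer t = KM/N, the generalized scheme in the coded caching literature splits each file into C(K, t) pieces and sends one coded message per subset of t + 1 users; a quick numerical check (my own sketch) confirms this combinatorial count matches the rate formula on these slides:

```python
from math import comb

def proposed_rate(N, K, M):
    # Rate formula from the slides: K * (1 - M/N) / (1 + K*M/N).
    return K * (1 - M / N) / (1 + K * M / N)

def coded_rate_combinatorial(K, t):
    # C(K, t+1) coded messages, each 1/C(K, t) of a file:
    # one XOR per (t+1)-subset of users.
    return comb(K, t + 1) / comb(K, t)

# The two expressions agree whenever t = K*M/N is an integer.
for K, t in [(4, 1), (8, 2), (16, 4)]:
    N, M = K, t  # choose N = K files and M = t, so K*M/N = t
    assert abs(proposed_rate(N, K, M) - coded_rate_combinatorial(K, t)) < 1e-9
```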
SLIDE 79
Comparison of the Two Schemes
N files, K users, cache size M
Conventional scheme: R(M) = K · (1 − M/N)
Proposed scheme: R(M) = K · (1 − M/N) · 1/(1 + KM/N)
SLIDE 80
Comparison of the Two Schemes
N files, K users, cache size M
Conventional scheme: R(M) = K · (1 − M/N)
Proposed scheme: R(M) = K · (1 − M/N) · 1/(1 + KM/N)
Rate without caching: K
SLIDE 81
Comparison of the Two Schemes
N files, K users, cache size M
Conventional scheme: R(M) = K · (1 − M/N)
Proposed scheme: R(M) = K · (1 − M/N) · 1/(1 + KM/N)
Rate without caching: K
Local caching gain: 1 − M/N
Significant when local cache size M is of order N
SLIDE 82
Comparison of the Two Schemes
N files, K users, cache size M
Conventional scheme: R(M) = K · (1 − M/N)
Proposed scheme: R(M) = K · (1 − M/N) · 1/(1 + KM/N)
Rate without caching: K
Local caching gain: 1 − M/N
Significant when local cache size M is of order N
Global caching gain: 1/(1 + KM/N)
Significant when global cache size KM is of order N
SLIDE 83
Comparison of the Two Schemes
N files, K users, cache size M
Conventional scheme: R(M) = K · (1 − M/N)
Proposed scheme: R(M) = K · (1 − M/N) · 1/(1 + KM/N)
Rate without caching: K
Local caching gain: 1 − M/N
Significant when local cache size M is of order N
Global caching gain: 1/(1 + KM/N)
Significant when global cache size KM is of order N
⇒ Global gain factor can be Θ(K) times smaller than the local gain factor, i.e., an additional rate reduction of order K
SLIDE 84
Can We Do Better?
Theorem: The proposed scheme is optimal to within a constant factor in rate.
SLIDE 85
Can We Do Better?
Theorem: The proposed scheme is optimal to within a constant factor in rate.
⇒ Information-theoretic bound
SLIDE 86
Can We Do Better?
Theorem: The proposed scheme is optimal to within a constant factor in rate.
⇒ Information-theoretic bound
⇒ Constant is independent of problem parameters N, K, M
SLIDE 87
Can We Do Better?
Theorem: The proposed scheme is optimal to within a constant factor in rate.
⇒ Information-theoretic bound
⇒ Constant is independent of problem parameters N, K, M
⇒ No other significant gain besides local and global
SLIDE 88
Conclusions
A New Approach to Caching
SLIDE 89
Conclusions
A New Approach to Caching
The main gain in caching is global
⇒ Multicast to users with different demands
SLIDE 90
Conclusions
A New Approach to Caching
The main gain in caching is global
⇒ Multicast to users with different demands
Global cache size matters
SLIDE 91
Conclusions
A New Approach to Caching
The main gain in caching is global
⇒ Multicast to users with different demands
Global cache size matters
Statistically identical users ⇒ different cache content
SLIDE 92
Conclusions
A New Approach to Caching
The main gain in caching is global
⇒ Multicast to users with different demands
Global cache size matters
Statistically identical users ⇒ different cache content
Significant improvement over conventional caching schemes
⇒ Reduction in rate up to order of number of users
SLIDE 93