



Chair of Network Architectures and Services
Department of Informatics
Technical University of Munich

From FIFO to Predictive Cache Replacement

Daniel Meint
advised by Stefan Liebald
Thursday, 11th October 2018


Introduction

Structure of this talk

  1. Introduction
  2. Evaluation of Replacement Strategies
  3. Traditional Cache Replacement
     • Arrival-based Replacement
     • Recency-based Replacement
     • Frequency-based Replacement
  4. Adaptive Cache Replacement
  5. Predictive Cache Replacement for Web-based Multimedia Content
  6. Conclusion
  7. Bibliography

  • D. Meint — Cache Replacement


Introduction

Caching

“We are therefore forced to recognize the possibility of constructing a hierarchy of memories, each of which has greater capacity than the preceding but which is less quickly accessible.” [2]

Caches offer high speed, but (relatively) low capacity.

  • Hardware caches used by the CPU/GPU
  • Software caches
    • Database systems
    • Operating systems
    • Web browsers
    • Web servers
    • ...

Introduction

Cache Management

Two central heuristics govern the content of any cache:

Fetch Policy determines when information is brought into the cache.

  • Demand Fetching
  • Anticipatory Fetching or Prefetching

Removal Policy determines what information leaves the cache.

  • widely studied
  • 50+ proposals from simple to complex
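The two policies can be read as the two hooks of a minimal cache. The sketch below (all names are illustrative, not from the talk; demand fetching only) shows where each plugs in:

```python
# Minimal cache skeleton showing where fetch and removal policy plug in.
# All names here are illustrative; this is not code from the talk.

class Cache:
    def __init__(self, capacity, evict_policy):
        self.capacity = capacity
        self.store = {}
        self.evict_policy = evict_policy    # removal policy: picks a victim key

    def get(self, key, fetch):
        if key in self.store:               # cache hit
            return self.store[key]
        value = fetch(key)                  # fetch policy: here, on demand only
        if len(self.store) >= self.capacity:
            del self.store[self.evict_policy(self.store)]
        self.store[key] = value
        return value

# FIFO-like removal: evict the key inserted earliest (dicts preserve order).
c = Cache(2, evict_policy=lambda store: next(iter(store)))
c.get("a", str.upper); c.get("b", str.upper); c.get("c", str.upper)
# "a", the oldest entry, was evicted; "b" and "c" remain
```

A prefetching variant would call `fetch` ahead of demand; the removal policy slot is where the 50+ proposals mentioned above differ.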

Evaluation of Replacement Strategies

Performance Metrics

  Metric                     Definition             Concern
  Hit Ratio (HR)             Σᵢ hᵢ / Σᵢ fᵢ          general; small cache systems
  Byte Hit Ratio (BHR)       Σᵢ hᵢ·sᵢ / Σᵢ fᵢ·sᵢ    network bandwidth savings
  Delay Savings Ratio (DSR)  Σᵢ hᵢ·dᵢ / Σᵢ fᵢ·dᵢ    experienced retrieval time

where
  hᵢ  total number of references to object i that were satisfied from the cache
  fᵢ  total number of references to object i
  sᵢ  size of object i
  dᵢ  mean delay to fetch object i into the cache
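These ratios can be computed directly from per-object trace statistics. The sketch below uses made-up numbers to show how BHR and DSR diverge from HR once object sizes and fetch delays are weighted in:

```python
# Computing HR, BHR, and DSR from per-object trace statistics.
# The per-object numbers below are made up for illustration.

def metrics(stats):
    """stats maps object id -> (h_i, f_i, s_i, d_i) as defined above."""
    hr  = sum(h for h, f, s, d in stats.values()) / sum(f for _, f, _, _ in stats.values())
    bhr = sum(h * s for h, _, s, _ in stats.values()) / sum(f * s for _, f, s, _ in stats.values())
    dsr = sum(h * d for h, _, _, d in stats.values()) / sum(f * d for _, f, _, d in stats.values())
    return hr, bhr, dsr

stats = {"a": (8, 10, 1000, 0.2),    # small, fast object, mostly hits
         "b": (1, 5, 50000, 1.5)}    # large, slow object, mostly misses
hr, bhr, dsr = metrics(stats)
# BHR and DSR fall below HR because the large, slow object "b" misses often
```

This is why a policy can look strong on HR yet save little bandwidth or latency: the three metrics weight the same hits differently.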


Traditional Cache Replacement

Key-based Strategies

Objects have certain characteristics that may influence the replacement decision:

  • arrival: When was the object admitted to the cache?
  • recency: When was the object requested last?
  • frequency: How often has the object been requested?
  • size: How much space does the object take?
  • cost: How expensive is it to fetch this object?
  • expiration: When is this object going to become invalid?


Arrival-based Replacement

First In, First Out (FIFO)

FIFO treats the cache as a queue: new objects are enqueued on admission, and on eviction the object that arrived earliest is dequeued, regardless of how recently or how often it was requested. Walking through the request sequence A B A C A D with three slots (newest arrival on the left):

  • request A → [A]
  • request B → [B A]
  • request A → [B A] (hit; FIFO does not reorder on a hit)
  • request C → [C B A]
  • request A → [C B A] (hit)
  • request D → [D C B]; A, the oldest arrival, is dequeued
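The FIFO behaviour above fits in a few lines. A minimal sketch (illustrative, keys only, no payloads):

```python
# A FIFO cache matching the request sequence above: hits change nothing.
from collections import OrderedDict

class FIFOCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = OrderedDict()           # insertion order = arrival order

    def request(self, key):
        """Returns True on a cache hit."""
        if key in self.queue:
            return True                      # hit: FIFO ignores recency and frequency
        if len(self.queue) >= self.capacity:
            self.queue.popitem(last=False)   # dequeue the oldest arrival
        self.queue[key] = None
        return False

fifo = FIFOCache(3)
hits = [fifo.request(x) for x in "ABACAD"]
# hits == [False, False, True, False, True, False]; A was dequeued last
```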


Recency-based Replacement

Least Recently Used (LRU)

LRU evicts the item that was not requested for the longest time. Walking through the request sequence A B A C A D with three slots (MRU on the left, LRU on the right):

  • request A → [A]
  • request B → [B A]
  • request A → [A B] (hit; A moves to the MRU position)
  • request C → [C A B]
  • request A → [A C B] (hit)
  • request D → [D A C]; B, the least recently used item, is evicted
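The same walkthrough as runnable code; a minimal sketch on top of `OrderedDict`, which already maintains the recency order:

```python
# An LRU cache matching the request sequence above.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()           # last = MRU, first = LRU

    def request(self, key):
        """Returns True on a cache hit."""
        if key in self.items:
            self.items.move_to_end(key)      # a hit refreshes recency
            return True
        if len(self.items) >= self.capacity:
            self.items.popitem(last=False)   # evict the least recently used
        self.items[key] = None
        return False

lru = LRUCache(3)
for x in "ABACAD":
    lru.request(x)
# final state, MRU last: C, A, D; B was evicted
```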


Recency-based Replacement

Scan-vulnerability of LRU

A scan is a large, sequential, one-time request stream. Consider the request sequence ... A B A C A D X Y Z A D A C C A ... with three slots. Just before the scan the cache holds [D A C] (MRU to LRU); after the one-time requests X Y Z it holds [Z Y X]. The scan has flushed the frequently requested objects A, C, and D, all of which will now miss even though they are requested again immediately afterwards.


Frequency-based Replacement

Least Frequently Used (LFU)

LFU evicts the object with the lowest reference count. Rationale: exploit long-term popularity patterns; some items are "hotter" than others.

Web object popularity

  • follows a Zipf-like distribution [1, 3].
  • is often highly dynamic [5].

Two variants are distinguished: Perfect LFU tracks reference counts for all objects ever seen, while In-Cache LFU tracks counts for cached objects only.
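A minimal In-Cache LFU sketch, which also reproduces the pollution effect shown on the next slide. The admission rule here (a newcomer enters with count 1 and loses ties against cached objects) is one possible variant, assumed for illustration; real implementations differ in tie-breaking and aging:

```python
# Sketch of In-Cache LFU: reference counts exist only while an object is
# cached. The tie-breaking admission rule below is an assumed variant.

class InCacheLFU:
    def __init__(self, capacity):
        self.capacity = capacity
        self.count = {}                      # counts for cached objects only

    def request(self, key):
        """Returns True on a cache hit."""
        if key in self.count:
            self.count[key] += 1
            return True
        if len(self.count) < self.capacity:
            self.count[key] = 1
        else:
            victim = min(self.count, key=self.count.get)
            if self.count[victim] < 1:       # newcomer (count 1) loses ties
                del self.count[victim]       # count is discarded with the object
                self.count[key] = 1
        return False

lfu = InCacheLFU(2)
for k in "XXXYYY" + "ABABAB":
    lfu.request(k)
# X and Y (count 3 each) remain cached; A and B were never admitted
```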


Frequency-based Replacement

In-Cache LFU pollution

Request sequence: X X X Y Y Y A B A B A B A B A B A B ...

With two slots, X and Y each accumulate a reference count of 3 and then stay cached: every later request for A or B competes with a count of 1, immediately ranks as least frequently used, and cannot displace the stale X (3) and Y (3). The once-popular objects pollute the cache indefinitely.


Adaptive Cache Replacement

ARC

[Figure 1: Simplified ARC directory¹: the lists B1, T1, T2, B2, with L1 = B1 ∪ T1 and L2 = T2 ∪ B2, each ordered from LRU to MRU]

  • L1 contains items accessed exactly once (recency component)
  • L2 contains items accessed at least twice (frequency component)
  • T1 and T2 form the actual cache (main cache)
  • B1 and B2 only contain metadata (ghost cache)

¹ For an in-depth description, refer to [6, 7, 8].
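The ARC request-handling logic is compact enough to sketch. The following is a Python rendering of the pseudocode in [6] (T1/T2 hold cached keys, B1/B2 ghost keys, p is the adaptation target for |T1|); it is a sketch for illustration, not a drop-in implementation:

```python
# Sketch of the Adaptive Replacement Cache after Megiddo & Modha [6].
from collections import OrderedDict

class ARC:
    def __init__(self, c):
        self.c, self.p = c, 0                              # capacity, target |T1|
        self.t1, self.t2 = OrderedDict(), OrderedDict()    # cached keys, last = MRU
        self.b1, self.b2 = OrderedDict(), OrderedDict()    # ghost lists (metadata)

    def _replace(self, in_b2):
        # Evict the LRU of T1 into ghost B1, or the LRU of T2 into ghost B2.
        if self.t1 and (len(self.t1) > self.p or (in_b2 and len(self.t1) == self.p)):
            key, _ = self.t1.popitem(last=False)
            self.b1[key] = None
        else:
            key, _ = self.t2.popitem(last=False)
            self.b2[key] = None

    def request(self, x):
        """Process one reference; returns True on a cache hit."""
        if x in self.t1:                       # hit in T1: promote to T2
            del self.t1[x]; self.t2[x] = None
            return True
        if x in self.t2:                       # hit in T2: refresh recency
            self.t2.move_to_end(x)
            return True
        if x in self.b1:                       # ghost hit: grow the recency side
            self.p = min(self.c, self.p + max(len(self.b2) // len(self.b1), 1))
            self._replace(False); del self.b1[x]; self.t2[x] = None
            return False
        if x in self.b2:                       # ghost hit: grow the frequency side
            self.p = max(0, self.p - max(len(self.b1) // len(self.b2), 1))
            self._replace(True); del self.b2[x]; self.t2[x] = None
            return False
        # complete miss
        if len(self.t1) + len(self.b1) == self.c:
            if len(self.t1) < self.c:
                self.b1.popitem(last=False); self._replace(False)
            else:
                self.t1.popitem(last=False)    # B1 empty: drop the LRU of T1
        elif len(self.t1) + len(self.t2) + len(self.b1) + len(self.b2) >= self.c:
            if len(self.t1) + len(self.t2) + len(self.b1) + len(self.b2) == 2 * self.c:
                self.b2.popitem(last=False)
            self._replace(False)
        self.t1[x] = None                      # admit the new key on the recency side
        return False

# Scan-resistance demo: warm two hot keys, then scan six cold ones.
arc = ARC(4)
for x in "ABABAB":
    arc.request(x)
for x in "CDEFGH":                             # one-time scan fills T1 only
    arc.request(x)
assert arc.request("A") and arc.request("B")   # hot keys in T2 survived the scan
```

The demo at the bottom shows why ARC resists the scan that defeated LRU above: one-time requests circulate through T1 while the twice-requested keys sit in T2.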


Adaptive Cache Replacement

ARC Performance

ARC is scan-resistant and low-overhead.

  workload   cache size (MB)   LRU (HR %)   ARC (HR %)
  P1               16            16.55        28.26
  P2               16            18.47        27.38
  P3               16             3.57        17.12
  P4               16             5.24        11.24
  P5               16             6.73        14.27
  ConCat           16            14.38        21.67
  Merge(P)        128            38.05        39.91
  DS1            1024            11.65        22.52
  SPC1           4096             9.19        20.00
  S1             2048            23.71        33.43
  S2             2048            25.91        40.68
  S3             2048            25.26        40.44
  Merge(S)       4096            27.62        40.44

sources: [7, 8]


Predictive Cache Replacement for Web-based Multimedia Content

Two separate phases need to be distinguished:

  1. Prediction phase (various techniques)
  2. Removal phase (P-LFU)

[Diagram: Prediction → Replacement Policy]


Predictive Cache Replacement for Web-based Multimedia Content

Prediction Example

Figure 2: Optimal fit of the exponential and Gaussian distributions to an example cumulative request pattern. Source: [5]


Predictive Cache Replacement for Web-based Multimedia Content

Predictive Least Frequently Used (P-LFU)

The prediction phase feeds P-LFU, which removes the item with the smallest number of predicted future requests.
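The removal step itself is a one-liner once predictions exist. In this sketch the predictor interface and all numbers are assumptions for illustration; [5] obtains predictions by fitting exponential and Gaussian models to cumulative request patterns:

```python
# Sketch of the P-LFU removal step. The predictor interface and the
# numbers below are illustrative assumptions, not the method of [5].

def plfu_evict(cache, predict):
    """Remove and return the cached key with the fewest predicted requests."""
    victim = min(cache, key=predict)
    del cache[victim]
    return victim

# Hypothetical predicted future request counts per object.
predicted = {"a": 9, "b": 50, "c": 16}
cache = {"a": object(), "b": object(), "c": object()}
victim = plfu_evict(cache, predicted.get)    # "a" has the fewest predictions
```

Compared with classic LFU, the ranking key is the predicted future count rather than the observed past count, so a fading object is evicted even while its historical count is still high.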


Predictive Cache Replacement for Web-based Multimedia Content

On-Demand Fetching vs. Prefetching

Under the demand fetching model there is an upper bound of roughly 50 % HR for any replacement policy [4, 10]. Prefetching results in

  • higher hit rate
  • higher network load

The effectiveness of prefetching is measured by

  • Coverage: fraction of total misses eliminated by prefetching
  • Accuracy: fraction of prefetched items that were eventually requested

Synonyms: anticipatory fetching; buffering (esp. for video content)
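Both measures follow directly from trace counts; a small sketch with made-up numbers:

```python
# Computing prefetching effectiveness from trace counts.
# All numbers below are made up for illustration.

def prefetch_metrics(misses_without, misses_with, prefetched, prefetched_used):
    coverage = (misses_without - misses_with) / misses_without
    accuracy = prefetched_used / prefetched
    return coverage, accuracy

cov, acc = prefetch_metrics(misses_without=200, misses_with=120,
                            prefetched=100, prefetched_used=60)
# 40 % of misses were eliminated; 60 % of prefetched items were used
```

The tension from the slide is visible in the formulas: raising coverage by prefetching more aggressively tends to lower accuracy, and every inaccurate prefetch is pure extra network load.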


Conclusion

  • Replacement is a central design issue in any caching scheme
  • There are various proposals beyond what is covered in this talk
  • Strategies can be compared along different metrics, including hit ratio, byte hit ratio, and delay savings ratio
  • LRU is the most widely used policy and performs "good enough" [9] in most scenarios
  • ARC outperforms LRU and keeps overhead low
  • Multimedia caching is of special interest for future research

Bibliography

[1] A. Balamash and M. Krunz. An Overview of Web Caching Replacement Algorithms. IEEE Communications Surveys & Tutorials, 6(2), 2004.

[2] A. W. Burks, H. H. Goldstine, and J. von Neumann. Preliminary Discussion of the Logical Design of an Electronic Computing Instrument. 1946.

[3] M. Busari and C. Williamson. On the Sensitivity of Web Proxy Cache Performance to Workload Characteristics. In Proceedings of IEEE INFOCOM 2001, volume 3, pages 1225–1234. IEEE, 2001.

[4] P. Cao and S. Irani. Cost-Aware WWW Proxy Caching Algorithms. In USENIX Symposium on Internet Technologies and Systems, volume 12, pages 193–206, 1997.

[5] J. Famaey, F. Iterbeke, T. Wauters, and F. De Turck. Towards a Predictive Cache Replacement Strategy for Multimedia Content. Journal of Network and Computer Applications, 36(1):219–227, 2013.

[6] N. Megiddo and D. S. Modha. ARC: A Self-Tuning, Low Overhead Replacement Cache. In FAST, volume 3, pages 115–130, 2003.

[7] N. Megiddo and D. S. Modha. One Up on LRU. ;login: (The Magazine of the USENIX Association), 28:7–11, 2003.

[8] N. Megiddo and D. S. Modha. Outperforming LRU with an Adaptive Replacement Cache Algorithm. Computer, 37(4):58–65, 2004.

[9] S. Podlipnig and L. Böszörmenyi. A Survey of Web Cache Replacement Strategies. ACM Computing Surveys (CSUR), 35(4):374–398, 2003.

[10] J. Wang. A Survey of Web Caching Schemes for the Internet. ACM SIGCOMM Computer Communication Review, 29(5):36–46, 1999.