Tutorial 9: cache memory. Why use a cache? (PowerPoint PPT Presentation)



SLIDE 1

Tutorial 9: cache memory

SLIDE 2

Why use a cache?

  • Main memory (VRAM/DRAM) is slow!
  • To deal with this, the 𝛾-machine speed is reduced to match the memory read and write speed.
  • To make the machine faster, one can use an intermediate smaller and faster memory between the processor and the main memory: a cache.
  • The cache associates memory addresses with their values (taken from the main memory).

SLIDE 3

Basic working principle

  • Reading a value from memory in the presence of a cache is simple:
  • 1. Check whether the cache memory contains the address
  • 2. If it does, read the associated value from the cache
  • 3. Otherwise, read the value from main memory, save it in the cache, and return it
  • This usually works because memory accesses are not random. They follow these two principles:
  • Temporal locality principle: a recently accessed address is likely to be accessed again soon
  • Spatial locality principle: addresses close to a recently accessed address are likely to be accessed soon
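The three steps above can be sketched as follows. This is a minimal illustration, not code from the tutorial: `main_memory`, `cache`, and `read` are assumed names, with main memory modelled as a dictionary and an unbounded cache.

```python
# Toy model: main memory and an unbounded fully associative cache,
# both as address -> word dictionaries (illustrative names).
main_memory = {0x10: 42, 0x14: 7}
cache = {}

def read(address):
    # 1. Check whether the cache memory contains the address
    if address in cache:
        # 2. Cache hit: read the associated value from the cache
        return cache[address]
    # 3. Cache miss: read the value from main memory,
    #    save it in the cache, and return it
    value = main_memory[address]
    cache[address] = value
    return value
```

A second read of the same address is then served from the cache without touching main memory.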
SLIDE 4

Cache memory variants

  • Totally associative cache
  • Totally associative cache in blocks
  • Direct mapped cache
  • Set associative cache
SLIDE 5

Totally associative cache

For each memory address A, store its corresponding word. Select a location using a replacement policy.

Pros:

  • Simple

Cons:

  • One comparator and one address per stored word
  • Does not fully exploit the locality principle
  • Need for a replacement policy (can be costly to implement)

SLIDE 6

Associative cache in blocks

For each address A, it stores the N consecutive words starting with the one stored at A.

Pros:

  • Exploits the locality principle better
  • Better capacity: one comparator for N stored words

Cons:

  • Need for a replacement policy, which can be costly
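The block-fetch idea can be sketched as follows, assuming word-granularity addresses and a block size of N = 4 (both assumptions; the slide leaves N abstract), with an unbounded cache to keep replacement out of the picture.

```python
N = 4                                        # assumed block size
main_memory = {a: a * 10 for a in range(32)} # toy contents
cache = {}                                   # block start address -> list of N words

def read(address):
    base = (address // N) * N                # start of the block containing address
    if base not in cache:
        # Miss: load the whole block of N consecutive words at once
        cache[base] = [main_memory[base + i] for i in range(N)]
    return cache[base][address - base]
```

After one miss on address 5, the neighbouring addresses 4 to 7 are already cached, which is how spatial locality is exploited.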

SLIDE 7

Direct mapped cache (in blocks)

Uses a part of the (memory) address as the cache address!

Pros:

  • No need for a replacement policy
  • Only one comparator is needed

Cons:

  • Not possible to store simultaneously the contents of different memory addresses sharing the same cache address
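The address split can be sketched as follows: the low bits of the memory address select the cache line and the remaining high bits are kept as a tag. The cache size of 8 lines is an assumption for illustration.

```python
NUM_LINES = 8                               # assumed number of cache lines
main_memory = {a: a + 100 for a in range(64)}
cache = [None] * NUM_LINES                  # each line holds (tag, word) or None

def read(address):
    index = address % NUM_LINES             # low bits: the cache address
    tag = address // NUM_LINES              # high bits: the tag
    line = cache[index]
    if line is not None and line[0] == tag:
        return line[1]                      # hit: tag matches (one comparator)
    value = main_memory[address]
    cache[index] = (tag, value)             # miss: new entry replaces the old one
    return value
```

Addresses 3 and 11 map to the same line, so caching 11 necessarily evicts 3: this is the limitation listed under "Cons".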

SLIDE 8

Set associative cache

Compromise between the associative cache and the direct mapped cache: N direct mapped caches (in blocks or not). Selection of the cache using a replacement policy.

Pros:

  • Can store the contents of memory addresses having the same cache address

Cons:

  • Need for a replacement policy, but for a large enough cache, random selection yields results almost as good as LRU
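The compromise can be sketched as N direct mapped "ways" searched in parallel, with random replacement among the ways on a miss (the policy the slide notes performs almost as well as LRU). The sizes N_WAYS = 2 and NUM_SETS = 4 are assumptions for illustration.

```python
import random

N_WAYS = 2                                  # assumed number of ways
NUM_SETS = 4                                # assumed number of sets
main_memory = {a: a * 3 for a in range(64)}
# ways[w][s] holds (tag, word) or None: N direct mapped caches side by side
ways = [[None] * NUM_SETS for _ in range(N_WAYS)]

def read(address):
    index = address % NUM_SETS              # cache address, shared by all ways
    tag = address // NUM_SETS
    for w in range(N_WAYS):                 # one comparator per way
        line = ways[w][index]
        if line is not None and line[0] == tag:
            return line[1]                  # hit in way w
    # Miss: pick a way at random (the replacement policy) and fill it
    value = main_memory[address]
    ways[random.randrange(N_WAYS)][index] = (tag, value)
    return value
```

With two ways, two addresses sharing the same cache address can both be resident, which a direct mapped cache cannot do.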