Meta-Learning Neural Bloom Filters - Jack Rae, Sergey Bartunov, Tim Lillicrap



SLIDE 1

Meta-Learning Neural Bloom Filters

Jack Rae, Sergey Bartunov, Tim Lillicrap

SLIDE 2

Meta-Learning Neural Bloom Filters - Jack Rae, Sergey Bartunov, Tim Lillicrap

Architecture

Interested in neural networks with compressive, distributed memories.

Problem

There is a growing trend of using neural networks to replace classical data structures.

SLIDE 3

Bloom Filter
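For reference, the classical Bloom filter the talk builds on: a fixed-size bit array with k hash functions, supporting inserts and approximate membership queries with no false negatives and a tunable false-positive rate. A minimal Python sketch (the sizes and salted-SHA-256 hashing are illustrative choices, not from the talk):

```python
import hashlib

class BloomFilter:
    """Classical Bloom filter: an m-bit array and k hash functions.

    Queries may return false positives, but never false negatives."""

    def __init__(self, m_bits=1024, k_hashes=3):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray(m_bits)  # one byte per bit, for simplicity

    def _indices(self, item):
        # Derive k bit positions by hashing the item with k different salts.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for idx in self._indices(item):
            self.bits[idx] = 1

    def __contains__(self, item):
        # Member iff all k addressed bits are set.
        return all(self.bits[idx] for idx in self._indices(item))
```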

SLIDE 4

Bloom Filter

The Case for Learned Index Structures

Kraska et al. (2017)

SLIDE 5


Case for Meta-Learning

Data structures are often not created in isolation. For example, a Bigtable database may contain 10,000 tablets that share a common rowkey schema and query distribution. Meta-learning exploits this: slow-learn the common distribution across tablets, then fast-learn each tablet's specific set.

[Figure: a Bigtable cluster of tablets, with one Bloom Filter per tablet]
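The slow/fast split above can be sketched as an episodic training loop: the outer (slow) loop iterates over tablets drawn from the shared distribution, while the inner (fast) step writes one tablet's storage set into a fresh memory. Everything below is a toy skeleton, not the authors' training code; `sample_episode`, the key format, and the set-valued memory are placeholders for the learned components:

```python
import random

def sample_episode(schema, set_size=8):
    # Toy stand-in for sampling one tablet from the shared rowkey
    # distribution: keys share a common prefix (the "schema").
    stored = {f"{schema}:{random.randrange(10_000)}" for _ in range(set_size)}
    queries = [f"{schema}:{random.randrange(10_000, 20_000)}" for _ in range(set_size)]
    return list(stored), queries

def meta_train(n_episodes=100):
    for _ in range(n_episodes):
        positives, negatives = sample_episode("user")
        memory = set()              # fresh memory per episode (per tablet)
        for key in positives:       # fast step: write this tablet's set
            memory.add(key)
        # Slow step: a real model would compute a membership loss over
        # positives/negatives here and update the shared encoder weights.
        false_neg = sum(k not in memory for k in positives)
        assert false_neg == 0       # exact memory admits no false negatives
    return n_episodes
```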

SLIDE 6


Neural Bloom Filter

SLIDE 7


Database Task

Space reduction over a classical Bloom Filter when storing a set of 5,000 strings.
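For context on the classical baseline: an optimal Bloom filter needs m = -n ln(p) / (ln 2)^2 bits for n items at false-positive rate p, about 1.44 log2(1/p) bits per item regardless of item size. A quick calculation for the slide's 5,000-string set (the 1% rate is an illustrative choice, not from the talk):

```python
import math

def bloom_bits(n_items, fp_rate):
    """Optimal classical Bloom filter size: m = -n * ln(p) / (ln 2)^2 bits."""
    return math.ceil(-n_items * math.log(fp_rate) / math.log(2) ** 2)

bits = bloom_bits(5000, 0.01)   # ~47,926 bits, roughly 5.9 KB at a 1% FP rate
```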

SLIDE 8


Speed Benchmark

[1] Query-efficient Bloom Filter, Chen et al. (2007)
[2] The Case for Learned Index Structures, Kraska et al. (2018)

SLIDE 9


Talk to me at my poster: #43

More experiments in the paper:
- Comparisons to MemNets, DNCs, and LSTMs.
- Image tasks with varying structure.
- Model ablations across different learned algorithms.

(Too small to see here, so come to the poster for the real deal.)