CSE 373: Binary heaps. Michael Lee. Monday, Feb 5, 2018.


slide-1
SLIDE 1

CSE 373: Binary heaps

Michael Lee Monday, Feb 5, 2018

1

slide-3
SLIDE 3

Course overview

The course so far...

- Reviewing manipulating arrays and nodes
- Algorithm analysis
- Dictionaries (tree-based and hash-based)

Coming up next:

- Divide-and-conquer, sorting
- Graphs
- Misc topics (P vs NP, more?)

2

slide-4
SLIDE 4

Timeline

When are we getting project grades/our midterm back? Tuesday or Wednesday

3

slide-11
SLIDE 11

Timeline

Do we have something due soon?

- Project 3 will be released today or tomorrow
- Due dates:
  - Part 1 due in two weeks (Fri, Feb 16)
  - Full project due in three weeks (Fri, Feb 23)
- Partner selection
  - Selection form due Fri, Feb 9
  - You MUST find a new partner...
  - ...unless both partners email me and petition to stay together

4

slide-12
SLIDE 12

Today

Motivating question:

Suppose we have a collection of “items”. We want to return whatever item has the smallest “priority”.

5

slide-14
SLIDE 14

The Priority Queue ADT

Specifically, we want to implement the Priority Queue ADT:

The Priority Queue ADT
A priority queue stores elements according to their "priority". It supports the following operations:

- removeMin: remove and return the element with the smallest priority
- peekMin: find (but do not remove) the smallest element
- insert: add a new element to the priority queue

6
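For comparison (not part of this lecture's implementation), Python's standard-library heapq module provides exactly this "min" priority queue interface on top of a binary heap stored in a plain list:

```python
import heapq

# heapq maintains the binary-heap invariant on an ordinary list.
pq = []
heapq.heappush(pq, 5)    # insert
heapq.heappush(pq, 2)
heapq.heappush(pq, 7)

print(pq[0])              # peekMin: the smallest element sits at index 0 -> 2
print(heapq.heappop(pq))  # removeMin -> 2
print(heapq.heappop(pq))  # removeMin -> 5
```

Note that heapq exposes functions over a list rather than a class, but the three ADT operations map onto it directly.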

slide-16
SLIDE 16

The Priority Queue ADT

An alternative definition: instead of yielding the element with the smallest priority, yield the one with the largest priority:

The Priority Queue ADT, alternative definition
A priority queue stores elements according to their "priority". It supports the following operations:

- removeMax: remove and return the element with the largest priority
- peekMax: find (but do not remove) the largest element
- insert: add a new element to the priority queue

The way we implement both is almost identical – we just tweak how we compare elements. In this class, we will focus on implementing a "min" priority queue.

7

slide-17
SLIDE 17

Initial implementation ideas

Fill in this table with the worst-case runtimes:

Idea                   removeMin    peekMin      insert
Unsorted array list    ?            ?            ?
Unsorted linked list   ?            ?            ?
Sorted array list      ?            ?            ?
Sorted linked list     ?            ?            ?
Binary tree            ?            ?            ?
AVL tree               ?            ?            ?

8

slide-21
SLIDE 21

Initial implementation ideas

Fill in this table with the worst-case runtimes:

Idea                   removeMin    peekMin      insert
Unsorted array list    Θ(n)         Θ(n)         Θ(1)
Unsorted linked list   Θ(n)         Θ(n)         Θ(1)
Sorted array list      Θ(1)         Θ(1)         Θ(n)
Sorted linked list     Θ(1)         Θ(1)         Θ(n)
Binary tree            Θ(n)         Θ(n)         Θ(log(n))
AVL tree               Θ(log(n))    Θ(log(n))    Θ(log(n))

8

slide-23
SLIDE 23

Initial implementation ideas

We want something optimized for both frequent inserts and removes. An AVL tree (or some tree-ish thing) seems good enough... right?

Today: learn how to implement a binary heap. peekMin is O(1), and insert and remove are still O(log(n)) in the worst case. However, insert is O(1) in the average case!

9

slide-26
SLIDE 26

Binary heap invariants

Idea: adapt the tree-based method.
Insight: in a tree, finding the min is expensive! Rather than having it to the left, have it on the top!
(Figures: a BST or AVL tree vs. a binary heap)

10

slide-29
SLIDE 29

Binary heap invariants

We now need to change our invariants...

Binary heap invariants
A binary heap has three invariants:

- Num children: Every node has at most 2 children
- Heap: Every node is smaller than its children
- Structure: Every heap is a "complete" tree – it has no "gaps"

11
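As a quick illustrative sketch (mine, not from the slides): if a complete tree is stored in an array in level order, with the children of index i at 2i + 1 and 2i + 2 (an encoding developed later in this lecture), the structure invariant is automatic and the heap invariant is easy to check:

```python
def is_min_heap(a):
    """Check the heap invariant on an array-encoded complete binary tree.

    Storing a complete tree in an array in level order leaves no gaps,
    so the structure invariant holds by construction; we only need to
    verify that every node is no larger than its children.
    """
    n = len(a)
    for i in range(n):
        for child in (2 * i + 1, 2 * i + 2):
            if child < n and a[i] > a[child]:
                return False
    return True

print(is_min_heap([2, 4, 5, 6, 11, 7, 7, 10, 9]))  # True: a valid min-heap
print(is_min_heap([4, 2, 3, 5, 7, 6]))             # False: root 4 > child 2
```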

slide-30
SLIDE 30

Example of a heap

A broken heap:
(Figure: a tree with values 2, 4, 5, 6, 11, 7, 10, 9)

12

slide-31
SLIDE 31

Example of a heap

A fixed heap:
(Figure: a tree with values 2, 4, 5, 6, 11, 7, 7, 10, 9)

13

slide-32
SLIDE 32

The heap invariant

Are these all heaps?
(Figures: several candidate trees)

14

slide-34
SLIDE 34

Implementing peekMin

How do we implement peekMin?
(Figure: the heap 2, 4, 5, 6, 11, 7, 7, 10, 9)
Easy: just return the root. Runtime: Θ(1).

15

slide-37
SLIDE 37

Implementing removeMin

What about removeMin? Step 1: Just remove it!
(Figure: the heap with the root removed, leaving 4, 5, 6, 11, 7, 7, 10, 9 and a gap at the top)
Problem: Structure invariant is broken – the tree has a gap!

16

slide-40
SLIDE 40

Implementing removeMin

How do we fix the gap? Step 2: Plug the gap by moving the last element to the top!
(Figure: 11 moved to the top of 4, 5, 6, 7, 7, 10, 9)
Problem: Heap invariant is broken – 11 is not smaller than 4 or 7!

17

SLIDE 44

Implementing removeMin

How do we fix the heap invariant? Step 3: "percolate down" – keep swapping the node with its smallest child.
(Figure: 11 swapped down, restoring the heap 4, 5, 6, 11, 7, 7, 10, 9)
And we're done!

18

slide-46
SLIDE 46

Practice

Practice: What happens if we call removeMin?
After removing min:
(Figure: the heap 9, 10, 17, 20, 19, 14, 16, 15, 11, 18, 24, 22, 13)

19

slide-49
SLIDE 49

Analyzing removeMin

The percolateDown algorithm

percolateDown(node) {
  while (node.data is bigger than children) {
    swap data with smaller child
  }
}

The runtime? findLastNodeTime + removeRootTime + numSwaps × swapTime
This ends up being: n + 1 + log(n) · 1
...which is in Θ(n).

20
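As a concrete (hypothetical) rendering of steps 1-3, here is a runnable removeMin sketch over the array encoding introduced later in this lecture; the names remove_min and _percolate_down are my own:

```python
def remove_min(heap):
    """removeMin on a list-encoded binary min-heap (children of i at 2i+1, 2i+2)."""
    smallest = heap[0]
    last = heap.pop()             # grab the last element (plugging the gap)...
    if heap:
        heap[0] = last            # ...by moving it to the top,
        _percolate_down(heap, 0)  # then restore the heap invariant.
    return smallest

def _percolate_down(heap, i):
    n = len(heap)
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        smallest = i
        # Pick the smaller of the (up to two) children, if any beats heap[i].
        if left < n and heap[left] < heap[smallest]:
            smallest = left
        if right < n and heap[right] < heap[smallest]:
            smallest = right
        if smallest == i:         # heap invariant holds here; done
            return
        heap[i], heap[smallest] = heap[smallest], heap[i]
        i = smallest

h = [2, 4, 5, 6, 11, 7, 7, 10, 9]
print(remove_min(h))  # 2
print(h)              # [4, 6, 5, 9, 11, 7, 7, 10]
```

With the array encoding, finding the last node is constant time, so only the swaps cost anything.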

slide-52
SLIDE 52

Implementing insert

What about insert? Suppose we insert 3 – what happens? Step 1: insert at the last available node.
(Figure: 3 inserted as the last leaf of 2, 4, 5, 6, 11, 7, 7, 10, 9)
Problem: heap invariant broken! 7 is not smaller than 3!

21

slide-56
SLIDE 56

Implementing insert

How do we fix the heap invariant? Step 2: "percolate up" – keep swapping the node with its parent until the heap invariant is fixed.
(Figure: 3 swapped up, giving the heap 2, 3, 5, 6, 11, 4, 7, 7, 10, 9)
All ok!

22

slide-58
SLIDE 58

Practice

Practice: What happens if we insert 3?
After inserting 3:
(Figure: the heap 3, 10, 17, 20, 19, 14, 16, 15, 9, 18, 24, 22, 11, 13)

23

slide-61
SLIDE 61

Analyzing insert

The percolateUp algorithm

percolateUp(node) {
  while (node.data is smaller than parent) {
    swap data with parent
  }
}

The runtime? findLastNodeTime + addNodeToLastTime + numSwaps × swapTime
This ends up being: n + 1 + log(n) · 1
...which is in Θ(n).

24
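Steps 1 and 2 can likewise be sketched as runnable code (my own naming, again using the array encoding introduced later, where the parent of index i sits at (i - 1) // 2):

```python
def insert(heap, value):
    """insert on a list-encoded binary min-heap (parent of i at (i - 1) // 2)."""
    heap.append(value)               # step 1: add at the last available node
    i = len(heap) - 1
    while i > 0:                     # step 2: percolate up
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:  # heap invariant holds; done
            break
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent

h = [2, 4, 5, 6, 11, 7, 7, 10, 9]
insert(h, 3)
print(h)  # [2, 3, 5, 6, 4, 7, 7, 10, 9, 11]
```

This mirrors the worked example above: 3 percolates past 11 and 4 and stops below the root 2.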

slide-63
SLIDE 63

Analyzing removeMin, part 2

Problem: But wait! I promised worst-case Θ(log(n)) insert and average-case Θ(1) insert. This algorithm is Θ(n) in both the worst and average case!
Why: Finding and modifying the last node is slow: it requires a traversal! Can we speed it up?

25

slide-64
SLIDE 64

Analyzing removeMin, part 2

Remember this slide?

Idea                   removeMin    peekMin      insert
Unsorted array list    Θ(n)         Θ(n)         Θ(1)
Unsorted linked list   Θ(n)         Θ(n)         Θ(1)
Sorted array list      Θ(1)         Θ(1)         Θ(n)
Sorted linked list     Θ(1)         Θ(1)         Θ(n)
Binary tree            Θ(n)         Θ(n)         Θ(log(n))
AVL tree               Θ(log(n))    Θ(log(n))    Θ(log(n))

26

slide-66
SLIDE 66

Analyzing removeMin, part 2

Observation:

- Arrays let us find and append to the end quickly
- Trees let us have nice log(n) traversal behavior

The trick: Why pick one or the other? Let's do both!

27

slide-69
SLIDE 69

The array-based representation of binary heaps

Take a tree and fill an array in the level-order of the tree.
(Figure: a complete binary tree with level order A, B, C, D, E, F, G, H, I, J, K, L)

How do we find the parent? parent(i) = ⌊(i − 1) / 2⌋
The left child? leftChild(i) = 2i + 1
The right child? rightChild(i) = 2i + 2

The array, filled in level order:

Index: 0  1  2  3  4  5  6  7  8  9  10  11
Value: A  B  C  D  E  F  G  H  I  J  K   L

28
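These index formulas can be sanity-checked directly against the level-order array (a small sketch of my own):

```python
def parent(i):      return (i - 1) // 2
def left_child(i):  return 2 * i + 1
def right_child(i): return 2 * i + 2

# Level-order array for a complete tree with nodes A through L.
a = ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L']

print(a[left_child(0)], a[right_child(0)])  # B C  (children of the root A)
print(a[parent(5)])                         # C    (parent of F at index 5)
print(a[parent(11)])                        # F    (parent of L at index 11)
```

Note these formulas assume 0-based indexing; with 1-based indexing they become 2i, 2i + 1, and ⌊i / 2⌋.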

slide-72
SLIDE 72

Finding the last node

If our tree is represented using an array, what's the time needed to find the last node now?
Θ(1): just use this.array[this.size - 1].
...assuming the array has no 'gaps'. (Hey, it looks like the structure invariant was useful after all.)

29

slide-75
SLIDE 75

Re-analyzing insert

How does this change the runtime of insert? Runtime of insert:
findLastNodeTime + addNodeToLastTime + numSwaps × swapTime
...which is: 1 + 1 + numSwaps × 1
Observation: when percolating, we usually only need to percolate up a few times! So, numSwaps ≈ 1 in the average case, and numSwaps ≈ height = log(n) in the worst case!

30

slide-78
SLIDE 78

Re-analyzing removeMin

How does this change the runtime of removeMin? Runtime of removeMin:
findLastNodeTime + removeRootTime + numSwaps × swapTime
...which is: 1 + 1 + numSwaps × 1
Observation: unfortunately, in practice, we usually must percolate all the way down. So numSwaps ≈ height ≈ log(n) on average.

31
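The contrast between the two average cases can be sanity-checked empirically. This sketch (my own; the name insert_swaps is hypothetical) counts percolate-up swaps over many random inserts and shows the average stays near a small constant rather than growing like log(n):

```python
import random

def insert_swaps(heap, value):
    """Insert into a list-encoded min-heap, returning the number of swaps."""
    heap.append(value)
    i, swaps = len(heap) - 1, 0
    while i > 0 and heap[(i - 1) // 2] > heap[i]:
        p = (i - 1) // 2
        heap[i], heap[p] = heap[p], heap[i]
        i, swaps = p, swaps + 1
    return swaps

random.seed(373)
heap, total = [], 0
N = 100_000
for _ in range(N):
    total += insert_swaps(heap, random.random())

# The average swap count stays small (well below log2(100000) ≈ 17),
# consistent with the Θ(1) average-case claim for insert.
print(total / N)
```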