Lecture: Data structures (DAT037). Nils Anders Danielsson, 2014-11-14.



SLIDE 1

Lecture Data structures (DAT037)

Nils Anders Danielsson 2014-11-14

SLIDE 2

Today

▶ Binary trees.
▶ Priority queues.

  ▶ Binary heaps.
  ▶ Leftist heaps.

SLIDE 3

Binary trees

SLIDE 4

Binary trees

▶ A binary tree is either empty or a node.
▶ A node may contain a value.
▶ A node has two subtrees (possibly empty):
  a left one and a right one.
▶ Terminology:

  ▶ Parent, child, sibling, grandchild etc.
  ▶ Root, leaf.

SLIDE 5

Binary trees

One representation:

data Tree a = Empty | Node (Tree a) a (Tree a)

Example:

tree :: Tree Integer
tree = Node (Node Empty 2 Empty)
            1
            (Node Empty 3 (Node Empty 5 Empty))

SLIDE 6

Binary trees

Another representation:

class Tree<A> {
  class TreeNode {
    A contents;
    TreeNode left;   // null if left child is missing.
    TreeNode right;  // null if right child is missing.
  }

  TreeNode root;  // null if tree is empty.
}

SLIDE 7

Binary trees

Height:

▶ Empty trees have height -1.
▶ Otherwise: The number of steps from the root to the deepest leaf.

SLIDE 8

Binary trees

Height:

▶ Empty trees have height -1.
▶ Otherwise: The number of steps from the root to the deepest leaf.

height :: Tree a -> Integer
height Empty        = -1
height (Node l _ r) = 1 + max (height l) (height r)

SLIDE 9

Binary trees

Height:

▶ Empty trees have height -1.
▶ Otherwise: The number of steps from the root to the deepest leaf.

int height(TreeNode n) {
  if (n == null) {
    return -1;
  } else {
    return 1 + Math.max(height(n.left), height(n.right));
  }
}

SLIDE 10

int s(TreeNode n) {
  if (n == null) {
    return 0;
  } else {
    return 1 + s(n.left) + s(n.right);
  }
}

What is the result of applying s to the root of the following tree?

[Tree diagram with seven nodes labelled 1, 2, 4, 6, 7, 3 and 5.]

SLIDE 11

Priority queues

SLIDE 12

Priority queues

Queues where every element has a certain priority. Interface (example):

▶ Constructor for empty queue.
▶ insert: Inserts element.
▶ find-min: Returns minimum element.
▶ delete-min: Deletes minimum element.
▶ decrease-key: Decreases priority.
▶ merge: Merges two queues.
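Java's standard library provides a binary-heap-based priority queue that covers the first four operations (it has no built-in decrease-key or merge). A small usage sketch; the class name PQDemo is chosen only for illustration:

```java
import java.util.PriorityQueue;

class PQDemo {
    // Inserts 5, 2 and 7 into a java.util.PriorityQueue (a binary min-heap)
    // and returns the two smallest elements via find-min and delete-min.
    static int[] demo() {
        PriorityQueue<Integer> pq = new PriorityQueue<>(); // empty queue
        pq.add(5);              // insert
        pq.add(2);
        pq.add(7);
        int first = pq.peek();  // find-min
        pq.poll();              // delete-min
        int second = pq.peek();
        return new int[] { first, second };
    }

    public static void main(String[] args) {
        int[] r = demo();
        System.out.println(r[0] + " " + r[1]); // prints "2 5"
    }
}
```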

SLIDE 13

Priority queues

Some applications:

▶ Scheduling of processes.
▶ Sorting.
▶ Dijkstra’s algorithm (3rd assignment).

2nd assignment: Implement priority queue.

SLIDE 14

If you implement the priority queue ADT with lists, what is the worst case time complexity of insert and delete-min?

▶ insert: Θ(1), delete-min: Θ(1).
▶ insert: Θ(1), delete-min: Θ(n).
▶ insert: Θ(n), delete-min: Θ(1).
▶ insert: Θ(n), delete-min: Θ(n).

SLIDE 15

[Plot comparing n and log₂ n, for n up to 100.]

SLIDE 16

Binary heaps

SLIDE 17

Binary heaps

Heap-ordered complete binary trees.

Heap-ordered

Every node is smaller than or equal to its children.

Complete binary tree

The tree is as shallow as possible: every level is completely filled, except possibly the last one, which is filled from the left.

A binary heap of size n has height Θ(log n).
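Using the 1-based array layout introduced later in the lecture (root at index 1, children of node i at indices 2i and 2i + 1), heap order can be checked with a short loop. A sketch with names of my own choosing; the array layout itself guarantees completeness:

```java
class HeapCheck {
    // Checks the heap-order property on a 1-based array representation:
    // every node a[i] must be <= its children a[2i] and a[2i+1].
    // a[0] is unused; the n elements are stored in a[1..n].
    static boolean heapOrdered(int[] a, int n) {
        for (int i = 1; 2 * i <= n; i++) {
            if (a[i] > a[2 * i]) return false;                   // left child
            if (2 * i + 1 <= n && a[i] > a[2 * i + 1]) return false; // right child
        }
        return true;
    }
}
```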

SLIDE 18

Identify the binary heaps.

[Six tree diagrams:
A: nodes 1, 2, 4, 5, 3, 6.
B: nodes 1, 7, 8, 9, 3, 6.
C: nodes 1, 8, 7, 9, 3.
D: nodes 1, 7, 8, 9, 9, 3, 6.
E: nodes 1, 7, 8, 9, 1, 2.
F: nodes 1, 2, 9, 2, 2, 3, 2.]
SLIDE 19

Implementation of binary heaps

▶ Empty queue: Empty tree.
▶ find-min: Return the root.
▶ insert: Insert at the end. Percolate up.
▶ delete-min: Remove the root. Move the final element to the top. Percolate down.
▶ decrease-key: Change priority, percolate up (or down for increase-key).

Percolate up/down until the tree is heap-ordered.
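The percolation steps above can be sketched on the array representation from the next slides. A minimal version (int elements, fixed capacity, no error handling; all names illustrative):

```java
class BinHeap {
    int[] a;  // 1-based: the n elements live in a[1..n]; a[0] is unused.
    int n;

    BinHeap(int capacity) { a = new int[capacity + 1]; n = 0; }

    // insert: place the element at the end and percolate it up.
    void insert(int x) {
        a[++n] = x;
        int i = n;
        while (i > 1 && a[i / 2] > a[i]) {   // parent larger: swap
            int tmp = a[i]; a[i] = a[i / 2]; a[i / 2] = tmp;
            i = i / 2;
        }
    }

    // delete-min: move the final element to the top and percolate it down.
    void deleteMin() {
        a[1] = a[n--];
        int i = 1;
        while (2 * i <= n) {
            int child = 2 * i;               // pick the smaller child
            if (child + 1 <= n && a[child + 1] < a[child]) child++;
            if (a[i] <= a[child]) break;     // heap order restored
            int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
            i = child;
        }
    }

    int findMin() { return a[1]; }
}
```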

SLIDE 20

Time complexity

If the nodes can be found quickly:

▶ Empty queue: Θ(1).
▶ find-min: Θ(1).
▶ insert: O(log n) (maybe amortised).
▶ delete-min: O(log n) (maybe amortised).
▶ decrease-key: O(log n).

(Assuming that comparisons take constant time.)

Most nodes are located “close” to the leaves. Average time complexity of insertion: O(1).

SLIDE 21

Implementation of binary heaps

One can represent the tree using an array (2nd assignment).

▶ The root at position 1.
▶ The last element at position n.
▶ The first empty cell at position n + 1.
▶ Node i’s left child: 2i.
▶ Node i’s right child: 2i + 1.
▶ Node i’s parent (i > 1): ⌊i/2⌋.
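The index arithmetic can be captured in three one-line helpers (hypothetical names; in Java, integer division already computes ⌊i/2⌋ for positive i):

```java
class HeapIndex {
    // Index arithmetic for the 1-based array layout.
    static int left(int i)   { return 2 * i; }
    static int right(int i)  { return 2 * i + 1; }
    static int parent(int i) { return i / 2; }  // floor(i/2), valid for i > 1
}
```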

SLIDE 22

What is the result of applying delete-min to the following binary heap?

[Tree diagram with nodes 1, 4, 8, 9, 3, 5 and 6.]

A: 3 4 5 6 8 9
B: 3 5 6 4 8 9
C: 3 4 8 9 5 6
D: 3 4 5 8 9 6

SLIDE 23

decrease-key

decrease-key: How can the node to update be located quickly? One can use an extra data structure, for example a hash table.
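One possible sketch of this idea: maintain a hash table from element to array index (assuming distinct elements), and keep it in sync on every swap. The names and details here are illustrative, not the course's implementation:

```java
import java.util.HashMap;

class IndexedHeap {
    int[] a = new int[100];  // 1-based heap array
    int n = 0;
    HashMap<Integer, Integer> pos = new HashMap<>();  // element -> index

    void swap(int i, int j) {
        int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
        pos.put(a[i], i);  // keep the index map in sync
        pos.put(a[j], j);
    }

    void percolateUp(int i) {
        while (i > 1 && a[i / 2] > a[i]) { swap(i, i / 2); i /= 2; }
    }

    void insert(int x) {
        a[++n] = x;
        pos.put(x, n);
        percolateUp(n);
    }

    // decrease-key: find the node via the hash table, then percolate up.
    void decreaseKey(int oldValue, int newValue) {
        int i = pos.get(oldValue);  // O(1) expected lookup
        pos.remove(oldValue);
        a[i] = newValue;
        pos.put(newValue, i);
        percolateUp(i);
    }

    int findMin() { return a[1]; }
}
```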

SLIDE 24

Leftist heaps

SLIDE 25

merge

▶ Merging two binary heaps, implemented using arrays, seems to be inefficient.
▶ With leftist heaps: O(log n) (assuming O(1) comparisons).

SLIDE 26

Leftist heaps

▶ Heap-ordered (pointer-based) binary trees, with an extra invariant (later).
▶ Basic operation: merge.
▶ Easy to implement insert and delete-min in terms of merge.

SLIDE 27

Leftist heaps, first attempt

-- Invariant for Node x l r:
--  * x is smaller than or equal to
--    all elements in l and r.
data PriorityQueue a
  = Empty
  | Node a (PriorityQueue a) (PriorityQueue a)

empty :: PriorityQueue a
empty = Empty

isEmpty :: PriorityQueue a -> Bool
isEmpty Empty        = True
isEmpty (Node _ _ _) = False

SLIDE 28

Leftist heaps, first attempt

insert :: Ord a => a -> PriorityQueue a -> PriorityQueue a
insert x t = merge (Node x Empty Empty) t

-- Precondition: The queue must not be empty.
findMin :: PriorityQueue a -> a
findMin Empty        = error "findMin: Empty queue."
findMin (Node x _ _) = x

-- Precondition: The queue must not be empty.
deleteMin :: Ord a => PriorityQueue a -> PriorityQueue a
deleteMin Empty        = error "deleteMin: Empty."
deleteMin (Node _ l r) = merge l r

SLIDE 29

Leftist heaps, first attempt

merge is implemented by going down the right spines:

merge :: Ord a => PriorityQueue a -> PriorityQueue a -> PriorityQueue a
merge Empty r = r
merge l Empty = l
merge l@(Node xl ll rl) r@(Node xr lr rr) =
  if xl <= xr
  then Node xl ll (merge rl r)
  else Node xr lr (merge l rr)

SLIDE 30

What is the worst case time complexity of merge? Assume that both queues have n elements, and that comparisons take constant time.

▶ Θ(1).
▶ Θ(log n).
▶ Θ(n).
▶ Θ(n log n).
▶ Θ(n²).
▶ Θ(n² log n).

SLIDE 31

Leftist heaps

▶ Trees may be very unbalanced: no left children, only right children.
▶ This makes merge linear (in the worst case).
▶ Solution: Ensure that the right spine is short.

SLIDE 32

Leftist heaps

Null path length

▶ -1 for empty trees.
▶ Otherwise: The number of steps from the root to the closest node with at most one child.

npl :: PriorityQueue a -> Integer
npl Empty        = -1
npl (Node _ l r) = 1 + min (npl l) (npl r)

SLIDE 33

Leftist heaps

Null path length

▶ -1 for empty trees.
▶ Otherwise: The number of steps from the root to the closest node with at most one child.

height :: PriorityQueue a -> Integer
height Empty        = -1
height (Node _ l r) = 1 + max (height l) (height r)

SLIDE 34

Leftist heaps

Leftist

For Node x l r: npl l ≥ npl r. This implies:

▶ The number of nodes on the right spine is 1 + npl t.
▶ 1 + npl t is O(log n), where n is the size of t.

Thus the right spine is short.

SLIDE 35

Leftist heaps

Leftist heap invariants

1. Heap-ordered.
2. Leftist.
SLIDE 36

Identify the leftist heaps.

[Six tree diagrams:
A: nodes 1, 2, 4, 5, 3, 6.
B: nodes 1, 7, 9, 3, 6, 6.
C: nodes 1, 7, 9, 6, 3, 6.
D: nodes 1, 7, 9, 3, 8, 8, 9.
E: nodes 1, 7, 9, 1, 2.
F: nodes 1, 2, 3, 4.]
SLIDE 37

Leftist heaps

▶ The previous implementation of merge sometimes breaks the leftist invariant.
▶ Simple fix: When necessary, swap the left and right subtrees.

SLIDE 38

Leftist heaps

Old code:

merge :: Ord a => PriorityQueue a -> PriorityQueue a -> PriorityQueue a
merge Empty r = r
merge l Empty = l
merge l@(Node xl ll rl) r@(Node xr lr rr) =
  if xl <= xr
  then Node xl ll (merge rl r)
  else Node xr lr (merge l rr)

SLIDE 39

Leftist heaps

New code:

merge :: Ord a => PriorityQueue a -> PriorityQueue a -> PriorityQueue a
merge Empty r = r
merge l Empty = l
merge l@(Node xl ll rl) r@(Node xr lr rr) =
  if xl <= xr
  then node xl ll (merge rl r)
  else node xr lr (merge l rr)

SLIDE 40

Leftist heaps

Smart constructor used to enforce leftist invariant:

-- Precondition for node x l r:
--  * x is smaller than or equal to
--    all elements in l and r.
node :: a -> PriorityQueue a -> PriorityQueue a -> PriorityQueue a
node x l r =
  if npl l >= npl r
  then Node x l r
  else Node x r l

SLIDE 41

Leftist heaps

One final tweak:

▶ The recursive calculation of npl is unnecessary.
▶ Fix: Store the npl values in the nodes.

-- Invariants: ...
data PriorityQueue a
  = Empty
  | Node Integer a (PriorityQueue a) (PriorityQueue a)

npl :: PriorityQueue a -> Integer
npl Empty          = -1
npl (Node n _ _ _) = n

SLIDE 42

What is the result of applying deleteMin to the following leftist heap?

[Tree diagram with nodes 1, 4, 8, 9, 3, 5 and 6.]

[Four answer tree diagrams:
A: nodes 3, 5, 4, 8, 6, 9.
B: nodes 3, 4, 8, 6, 9, 5.
C: nodes 3, 4, 8, 5, 6, 9.
D: nodes 3, 4, 8, 9, 5, 6.]
SLIDE 43

Time complexities

Operation       Binary heap      Leftist heap (immutable)
find-min        O(1)             O(1)
delete-min      O(log n)         O(log n)
insert          O(1) (average)   O(log n)
decrease-key    O(log n)         O(n)
merge           O(n)             O(log n)

(Assuming that comparisons take constant time.)

SLIDE 44

Other priority queue data structures

Comparison on Wikipedia.

SLIDE 45

Summary

▶ Binary trees.
▶ Priority queues.

  ▶ Binary heaps.
  ▶ Leftist heaps.

Next time:

▶ Hash tables.