SLIDE 1

Algorithm Theory, WS 2012/13 Fabian Kuhn 1

Priority Queue / Heap

  • Stores (key,data) pairs (like dictionary)
  • But, different set of operations:
  • Initialize‐Heap: creates new empty heap
  • Is‐Empty: returns true if heap is empty
  • Insert(key,data): inserts (key,data)‐pair, returns pointer to entry
  • Get‐Min: returns (key,data)‐pair with minimum key
  • Delete‐Min: deletes minimum (key,data)‐pair
  • Decrease‐Key(entry,newkey): decreases key of entry to newkey
  • Merge: merges two heaps into one
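The interface above maps closely onto a binary min-heap. A minimal sketch in Python using the standard-library `heapq` module; the class and method names are illustrative, not from the slides, and Decrease-Key and Merge are omitted since `heapq` supports neither efficiently:

```python
import heapq

class PriorityQueue:
    """Sketch of the slide's interface on top of heapq (a binary
    min-heap).  Decrease-Key and Merge are omitted: heapq supports
    neither efficiently."""

    def __init__(self):                # Initialize-Heap
        self._heap = []

    def is_empty(self):                # Is-Empty
        return not self._heap

    def insert(self, key, data):       # Insert(key, data)
        entry = (key, data)
        heapq.heappush(self._heap, entry)
        return entry                   # the "pointer" to the entry

    def get_min(self):                 # Get-Min
        return self._heap[0]

    def delete_min(self):              # Delete-Min
        return heapq.heappop(self._heap)
```

For example, after inserting (5, "b") and (2, "a"), Get-Min returns the pair with the minimum key, (2, "a").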
SLIDE 2

Implementation of Dijkstra’s Algorithm

Store nodes in a priority queue, use the distances d(v) as keys:

  • 1. Initialize d(s) := 0 and d(v) := ∞ for all v ≠ s
  • 2. All nodes are unmarked
  • 3. Get unmarked node v which minimizes d(v)
  • 4. Mark node v
  • 5. For all e = (v, u) ∈ E: d(u) := min{d(u), d(v) + w(v, u)}
  • 6. Until all nodes are marked
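The loop above can be sketched with a heap-based queue. Since Python's `heapq` has no decrease-key, this sketch pushes a fresh entry instead and skips outdated ones ("lazy deletion"); the graph representation and names are assumptions:

```python
import heapq

def dijkstra(graph, s):
    """Dijkstra with a heapq priority queue.  heapq has no
    decrease-key, so a new entry is pushed instead and stale
    entries are skipped.  graph: node -> [(neighbor, weight)]."""
    dist = {v: float("inf") for v in graph}   # d(v) := infinity ...
    dist[s] = 0                               # ... and d(s) := 0
    marked = set()
    pq = [(0, s)]
    while pq:                                 # until all nodes are marked
        d, v = heapq.heappop(pq)              # unmarked v minimizing d(v)
        if v in marked:
            continue                          # stale entry, skip
        marked.add(v)                         # mark node v
        for u, w in graph[v]:                 # relax all edges (v, u)
            if d + w < dist[u]:
                dist[u] = d + w
                heapq.heappush(pq, (dist[u], u))
    return dist
```

For example, on a graph with edges s→a (weight 1), s→b (weight 4), a→b (weight 2), the distances from s are 0, 1, 3.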
SLIDE 3

Analysis

Number of priority queue operations for Dijkstra:

  • Initialize‐Heap: 1
  • Is‐Empty: O(|V|)
  • Insert: |V|
  • Get‐Min: O(|V|)
  • Delete‐Min: |V|
  • Decrease‐Key: ≤ |E|
  • Merge: 0

SLIDE 4

Priority Queue Implementation

Implementation as min‐heap:  complete binary tree, e.g., stored in an array

  • Initialize‐Heap: O(1)
  • Is‐Empty: O(1)
  • Insert: O(log n)
  • Get‐Min: O(1)
  • Delete‐Min: O(log n)
  • Decrease‐Key: O(log n)
  • Merge (heaps of size m and n, m ≤ n): O(m · log n)
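The O(log n) bound for Decrease-Key in an array-based min-heap comes from sifting the decreased entry up one level per step. A small illustrative sketch (the function name is an assumption, not from the slides):

```python
def sift_up(heap, i):
    """After heap[i] was decreased, restore the min-heap property by
    swapping with the parent (index (i-1)//2) until the entry is not
    smaller than its parent -- one swap per level, hence O(log n)."""
    while i > 0 and heap[i] < heap[(i - 1) // 2]:
        parent = (i - 1) // 2
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent
    return i

h = [1, 5, 3, 9, 6]   # a valid min-heap stored in an array
h[4] = 0              # Decrease-Key: 6 -> 0
sift_up(h, 4)
assert h == [0, 1, 3, 9, 5]
```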
SLIDE 5

Better Implementation

  • Can we do better?
  • Cost of Dijkstra with complete binary min‐heap implementation:

O((|V| + |E|) · log |V|)

  • Can be improved if we can make decrease‐key cheaper…
  • Cost of merging two heaps is expensive
  • We will get there in two steps:

Binomial heap → Fibonacci heap

SLIDE 6

Definition: Binomial Tree

Binomial tree of order 0: a single node. Binomial tree of order k ≥ 1: two binomial trees of order k − 1, where the root of one becomes the leftmost child of the root of the other.

SLIDE 7

Binomial Trees

SLIDE 8

Properties

  • 1. Tree B_k has 2^k nodes
  • 2. Height of tree B_k is k
  • 3. Root degree of B_k is k
  • 4. In B_k, there are exactly (k choose i) nodes at depth i
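The four properties can be checked mechanically for small k. In this sketch a node is represented as the list of its children and B_k is built by the standard linking rule; the representation is chosen for illustration, not the slides' data structure:

```python
from math import comb

def binomial_tree(k):
    """Build B_k.  A node is the list of its children: B_0 is [],
    and B_k is one B_(k-1) whose root gains another B_(k-1) as its
    leftmost child."""
    if k == 0:
        return []
    return [binomial_tree(k - 1)] + binomial_tree(k - 1)

def size(t):
    return 1 + sum(size(c) for c in t)

def height(t):
    return 1 + max(map(height, t)) if t else 0

def nodes_at_depth(t, d):
    if d == 0:
        return 1
    return sum(nodes_at_depth(c, d - 1) for c in t)

k = 4
t = binomial_tree(k)
assert size(t) == 2 ** k                        # property 1
assert height(t) == k                           # property 2
assert len(t) == k                              # property 3: root degree
assert all(nodes_at_depth(t, i) == comb(k, i)   # property 4
           for i in range(k + 1))
```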
SLIDE 9

Binomial Coefficients

  • Binomial coefficient: (n choose k)
  • (n choose k): # of k‐element subsets of a set of size n
  • Property: (n choose k) = (n−1 choose k−1) + (n−1 choose k)
  • Pascal triangle: rows list the coefficients (n choose 0), … , (n choose n)
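Pascal's rule and the triangle can be verified directly with Python's `math.comb`:

```python
from math import comb

# Pascal's rule: a k-subset of {1,...,n} either contains element n
# (choose k-1 from the rest) or it does not (choose k from the rest).
for n in range(1, 12):
    for k in range(1, n):
        assert comb(n, k) == comb(n - 1, k - 1) + comb(n - 1, k)

# The rows of Pascal's triangle are exactly these coefficients:
triangle = [[comb(n, k) for k in range(n + 1)] for n in range(5)]
assert triangle[4] == [1, 4, 6, 4, 1]
```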
SLIDE 10

Number of Nodes at Depth i in B_k

Claim: In B_k, there are exactly (k choose i) nodes at depth i.
SLIDE 11

Binomial Heap

  • Keys are stored in nodes of binomial trees of different order

For n nodes: there is a binomial tree of order i iff bit i of the base‐2 representation of n is 1.

  • Min‐Heap Property:

Key of node v ≤ keys of all nodes in the sub‐tree of v
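The correspondence between the binary representation of n and the trees present can be computed directly; the helper name is illustrative:

```python
def tree_orders(n):
    """Orders i of the binomial trees B_i present in a binomial heap
    with n nodes: exactly the positions of the 1-bits of n."""
    return [i for i in range(n.bit_length()) if (n >> i) & 1]

# 11 nodes, binary 1011 -> trees B_0, B_1, B_3 are present,
# and their sizes 2^0 + 2^1 + 2^3 add up to 11 again.
assert tree_orders(11) == [0, 1, 3]
assert sum(2 ** i for i in tree_orders(11)) == 11
```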

SLIDE 12

Example

  • 11 keys: 2, 5, 8, 9, 12, 14, 17, 18, 20, 22, 25
  • Binary representation of n = 11: 1011

→ trees B_0, B_1, and B_3 present

(Figure: the three binomial trees holding the keys 14, 25, 5, 20, 12, 22, 9, 18, 2, 8, 17.)

SLIDE 13

Child‐Sibling Representation

Structure of a node:

  • parent: pointer to the parent node
  • key, degree
  • child: pointer to the first (leftmost) child
  • sibling: pointer to the next sibling

SLIDE 14

Link Operation

  • Unite two binomial trees of the same order into one tree:

B_{k−1} ⨁ B_{k−1} ⇒ B_k

  • Time: O(1)

(Figure: linking two trees, roots 12 and 18, with keys 20, 15, 22, 40, 25, 30.)

SLIDE 15

Merge Operation

Merging two binomial heaps:

  • For i := 0, 1, … , log n:

If there are ≥ 2 binomial trees B_i: apply link operation to merge two of them into one binomial tree B_{i+1}

Time: O(log n)
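Merging behaves exactly like binary addition of the two heap sizes: two trees B_i link into one "carry" tree B_{i+1}. A sketch that tracks only the multiset of tree orders (names and representation are assumptions):

```python
def merge_orders(a, b):
    """Merge two binomial heaps, tracking only the tree orders.
    Whenever two B_i meet, they link into one B_(i+1) -- the same
    rule as a carry in binary addition of the heap sizes."""
    limit = max(a + b, default=0) + len(a) + len(b) + 2
    count = [0] * (limit + 1)
    for i in a + b:
        count[i] += 1
    result = []
    for i in range(limit):
        if count[i] >= 2:                  # link a pair of B_i ...
            count[i + 1] += count[i] // 2  # ... carry into order i+1
            count[i] %= 2
        if count[i] == 1:
            result.append(i)
    return result

# heaps of 11 = B_0,B_1,B_3 and 13 = B_0,B_2,B_3 merge to 24 = B_3,B_4
assert merge_orders([0, 1, 3], [0, 2, 3]) == [3, 4]
```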

SLIDE 16

Example

(Figure: merging the example heaps; keys 14, 25, 5, 20, 12, 22, 9, 18, 2, 8, 17, 13.)

SLIDE 17

Operations

Initialize: create empty list of trees

Get minimum of queue: time O(1) (if we maintain a pointer to it)

Decrease‐key at node v:

  • Set key of node v to the new key
  • Swap with parent until min‐heap property is restored
  • Time: O(log n)

Insert key x into queue Q:

  • 1. Create queue Q′ of size 1 containing only x
  • 2. Merge Q and Q′
  • Time for insert: O(log n)
SLIDE 18

Operations

Delete‐Min Operation:

  • 1. Find tree T with minimum root key
  • 2. Remove T from queue → queue Q′
  • 3. Children of T's root form a new queue Q′′
  • 4. Merge queues Q′ and Q′′
  • Overall time: O(log n)
SLIDE 19

Delete‐Min Example

(Figure: delete‐min on the example heap; keys 14, 25, 2, 20, 12, 22, 9, 18, 5, 8, 17.)

SLIDE 20

Complexities Binomial Heap

  • Initialize‐Heap: O(1)
  • Is‐Empty: O(1)
  • Insert: O(log n)
  • Get‐Min: O(1)
  • Delete‐Min: O(log n)
  • Decrease‐Key: O(log n)
  • Merge (heaps of size m and n, m ≤ n): O(log n)
SLIDE 21

Can We Do Better?

  • Binomial heap:

insert, delete‐min, and decrease‐key cost O(log n)

  • One of the operations insert or delete‐min must cost Ω(log n):
    – Heap‐Sort: insert n elements into the heap, then take out the minimum n times
    – (Comparison‐based) sorting costs at least Ω(n log n).
  • But maybe we can improve decrease‐key and one of the other two operations?
  • Structure of binomial heap is not flexible:
    – Simplifies analysis, allows to get strong worst‐case bounds
    – But, operations almost inherently need at least logarithmic time

SLIDE 22

Fibonacci Heaps

Lazy‐merge variant of binomial heaps:

  • Do not merge trees as long as possible…

Structure: A Fibonacci heap H consists of a collection of trees satisfying the min‐heap property.

Variables:

  • H.min: root of the tree containing a minimum key
  • H.rootlist: circular, doubly linked, unordered list containing the roots of all trees
  • H.size: number of nodes currently in H
SLIDE 23

Trees in Fibonacci Heaps

Structure of a single node v:

  • v.child: points to a circular, doubly linked and unordered list of the children of v
  • v.left, v.right: pointers to siblings (in the doubly linked list)
  • v.mark: will be used later…

Advantages of circular, doubly linked lists:

  • Deleting an element takes constant time
  • Concatenating two lists takes constant time

(Node fields: left, parent, right, key, degree, child, mark.)

SLIDE 24

Example

Figure: Cormen et al., Introduction to Algorithms

SLIDE 25

Simple (Lazy) Operations

Initialize‐Heap:

  • H.rootlist := H.min := null

Merge heaps H and H′:

  • concatenate root lists
  • update H.min

Insert element e into H:

  • create new one‐node tree containing e → heap H′
  • merge heaps H and H′

Get minimum element of H:

  • return H.min
SLIDE 26

Operation Delete‐Min

Delete the node with minimum key from H and return its element:

1. m := H.min;
2. if H.size > 0 then
3.   remove H.min from H.rootlist;
4.   add H.min.child (list) to H.rootlist;
5.   H.Consolidate();
     // Repeatedly merge nodes with equal degree in the root list
     // until degrees of nodes in the root list are distinct.
     // Determine the element with minimum key.
6. return m

SLIDE 27

Rank and Maximum Degree

Ranks of nodes, trees, heap:

Node v:

  • rank(v): degree of v (number of children)

Tree T:

  • rank(T): rank (degree) of the root node of T

Heap H:

  • rank(H): maximum degree of any node in H

Assumption (n: number of nodes in H):

  rank(H) ≤ D(n), for a known function D(n)

SLIDE 28

Merging Two Trees

Given: Heap‐ordered trees T, T′ with rank(T) = rank(T′)

  • Assume: min‐key of T ≤ min‐key of T′

Operation link(T, T′):

  • Removes tree T′ from the root list and adds T′ to the child list of the root of T
  • rank(T) := rank(T) + 1
  • T′.mark := false
SLIDE 29

Consolidation of Root List

Array A of pointers, to find roots with the same rank:

Consolidate:
1. for i := 0 to D(n) do A[i] := null;
2. while H.rootlist ≠ null do
3.   T := "delete and return first element of H.rootlist"
4.   while A[rank(T)] ≠ null do
5.     T′ := A[rank(T)];
6.     A[rank(T)] := null;
7.     T := link(T, T′)
8.   A[rank(T)] := T
9. Create new H.rootlist and H.min

Time: O(|H.rootlist| + D(n))
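The inner linking loop can be sketched by tracking each tree as a (rank, min-key) pair; the rank array becomes a dictionary here, and the link step simply keeps the smaller root key — a simplification of the real pointer manipulation, with assumed names:

```python
def consolidate(rootlist):
    """Consolidate a root list of trees given as (rank, min_key)
    pairs: repeatedly link trees of equal rank (the smaller key
    becomes the new root, the rank grows by one) until all ranks
    in the root list are distinct."""
    slots = {}                       # plays the role of the array A
    for rank, key in rootlist:
        while rank in slots:         # same rank present: link
            other = slots.pop(rank)
            key = min(key, other[1]) # smaller root key wins
            rank += 1
        slots[rank] = (rank, key)
    return sorted(slots.values())

roots = [(0, 14), (0, 2), (0, 7), (1, 5), (1, 9)]
out = consolidate(roots)
assert len({r for r, _ in out}) == len(out)   # ranks now distinct
assert min(k for _, k in out) == 2            # minimum key preserved
```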

SLIDE 30

Consolidate Example

(Figure: six animation steps consolidating the root list with keys 14, 25, 5, 20, 12, 22, 9, 18, 2, 8, 17, 1, 13, 15, 3, 7, 19, 31; the intermediate trees are not recoverable from the text.)

SLIDE 36

Operation Decrease‐Key

Decrease‐Key(v, x): (decrease the key of node v to the new value x)

1. if x ≥ v.key then return;
2. v.key := x; update H.min;
3. if v ∈ H.rootlist ∨ v.parent.key ≤ v.key then return
4. repeat
5.   p := v.parent;
6.   H.cut(v);
7.   v := p;
8. until v.mark = false ∨ v ∈ H.rootlist;
9. if v ∉ H.rootlist then v.mark := true;

SLIDE 37

Operation Cut(v)

Operation H.cut(v):

  • Cuts v's sub‐tree from its parent and adds v to the rootlist

1. if v ∉ H.rootlist then
2.   // cut the link between v and its parent
3.   rank(v.parent) := rank(v.parent) − 1;
4.   remove v from v.parent.child (list)
5.   v.parent := null;
6.   add v to H.rootlist

(Figure: cutting the sub‐tree rooted at 25; keys 25, 2, 8, 1, 13, 15, 3, 7, 19, 31.)

SLIDE 38

Decrease‐Key Example

  • Green nodes are marked

(Figure: heap with keys 14, 25, 5, 20, 12, 22, 9, 18, 2, 8, 17, 1, 13, 15, 3, 7, 19, 31, before and after a Decrease‐Key operation.)

SLIDE 39

Fibonacci Heap Marks

History of a node v:

  • v is being linked to a node: v.mark := false
  • a child of v is cut: v.mark := true
  • a second child of v is cut: cut v as well

  • Hence, the boolean value v.mark indicates whether node v has lost a child since the last time v was made the child of another node.

SLIDE 40

Cost of Delete‐Min & Decrease‐Key

Delete‐Min:

1. Delete min. root and add H.min.child to H.rootlist — time: O(1)
2. Consolidate H.rootlist — time: O(length of H.rootlist + D(n))

  • Step 2 can potentially be linear in n (size of H)

Decrease‐Key (at node v):

1. If new key < parent key, cut sub‐tree of node v — time: O(1)
2. Cascading cuts up the tree as long as nodes are marked — time: O(number of consecutive marked nodes)

  • Step 2 can potentially be linear in n
  • Exercises: Both operations can take Θ(n) time in the worst case!
SLIDE 41

Cost of Delete‐Min & Decrease‐Key

  • Cost of delete‐min and decrease‐key can be Θ(n)…
    – Seems a large price to pay to get insert and merge in O(1) time
  • Maybe, the operations are efficient most of the time?
    – It seems to require a lot of operations to get a long rootlist and thus, an expensive consolidate operation
    – In each decrease‐key operation, at most one node gets marked: we need a lot of decrease‐key operations to get an expensive decrease‐key operation
  • Can we show that the average cost per operation is small?
  • We can → requires amortized analysis
SLIDE 42

Amortization

  • Consider a sequence o_1, o_2, … , o_n of n operations (typically performed on some data structure D)
  • t_i: execution time of operation o_i
  • T := t_1 + t_2 + ⋯ + t_n: total execution time
  • The execution time of a single operation might vary within a large range (e.g., t_i ∈ O(1), … , O(n))
  • The worst‐case overall execution time might still be small
    → the average execution time per operation might be small in the worst case, even if single operations can be expensive

SLIDE 43

Analysis of Algorithms

  • Best case
  • Worst case
  • Average case
  • Amortized worst case

What is the average cost of an operation in a worst‐case sequence of operations?

SLIDE 44

Example: Binary Counter

Incrementing a binary counter: determine the bit‐flip cost (counter starts at 00000):

Operation   Counter Value   Cost
 1          00001           1
 2          00010           2
 3          00011           1
 4          00100           3
 5          00101           1
 6          00110           2
 7          00111           1
 8          01000           4
 9          01001           1
10          01010           2
11          01011           1
12          01100           3
13          01101           1
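The flip cost of the i-th increment is the number of bit positions that differ between i−1 and i, which reproduces the cost column; the total over n increments stays below 2n:

```python
def flip_cost(i):
    """Bits flipped when the counter goes from i-1 to i: the XOR of
    the two values has a 1 exactly at each changed position."""
    return bin((i - 1) ^ i).count("1")

# Reproduces the cost column for operations 1..13:
assert [flip_cost(i) for i in range(1, 14)] == \
       [1, 2, 1, 3, 1, 2, 1, 4, 1, 2, 1, 3, 1]

# Total cost of n increments is below 2n -> O(1) amortized per step:
for n in (10, 100, 1000):
    assert sum(flip_cost(i) for i in range(1, n + 1)) < 2 * n
```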

SLIDE 45

Accounting Method

Observation:

  • Each increment flips exactly one 0 into a 1:

001001111 ⟹ 001010000

Idea:

  • Have a bank account (with initial amount 0)
  • Paying x to the bank account costs x
  • Take "money" from the account to pay for expensive operations

Applied to binary counter:

  • Flip from 0 to 1: pay 1 to bank account (cost: 2)
  • Flip from 1 to 0: take 1 from bank account (cost: 0)
  • Amount on bank account = number of ones

→ We always have enough "money" to pay!
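The invariant "account balance = number of ones" can be checked by simulation; the bookkeeping below follows the payment scheme described above:

```python
# Simulate the accounting scheme: each 0->1 flip deposits one coin
# (net cost 2), each 1->0 flip is paid by a coin from the account
# (net cost 0).  The balance always equals the number of 1-bits.
credit = 0
counter = 0
for _ in range(1000):
    new = counter + 1
    ones_to_zero = bin(counter & (counter ^ new)).count("1")
    credit -= ones_to_zero                 # withdraw for the 1->0 flips
    credit += 1                            # deposit for the 0->1 flip
    assert credit >= 0                     # never overdrawn
    assert credit == bin(new).count("1")   # balance = number of ones
    counter = new
```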

SLIDE 46

Accounting Method

Op.   Counter   Cost   To Bank   From Bank   Net Cost   Credit
 0    00000     –      –         –           –          0
 1    00001     1      1         0           2          1
 2    00010     2      1         1           2          1
 3    00011     1      1         0           2          2
 4    00100     3      1         2           2          1
 5    00101     1      1         0           2          2
 6    00110     2      1         1           2          2
 7    00111     1      1         0           2          3
 8    01000     4      1         3           2          1
 9    01001     1      1         0           2          2
10    01010     2      1         1           2          2

SLIDE 47

Potential Function Method

  • Most generic and elegant way to do amortized analysis!

– But, also more abstract than the others…

  • State of data structure / system: S ∈ 𝒮 (state space)
  • Potential function Φ: 𝒮 → ℝ≥0
  • Operation i:
    – t_i: actual cost of operation i
    – S_i: state after execution of operation i (S_0: initial state)
    – Φ_i := Φ(S_i): potential after execution of operation i
    – a_i := t_i + Φ_i − Φ_{i−1}: amortized cost of operation i

SLIDE 48

Potential Function Method

Operation : actual cost: amortized cost: Φ Φ Overall cost: ≔

  • Φ Φ
SLIDE 49

Binary Counter: Potential Method

  • Potential function:

Φ_i := number of ones in the counter after operation i

  • Clearly, Φ_0 = 0 and Φ_i ≥ 0 for all i ≥ 0
  • Actual cost t_i:
    • 1 flip from 0 to 1
    • t_i − 1 flips from 1 to 0
  • Potential difference: Φ_i − Φ_{i−1} = 1 − (t_i − 1) = 2 − t_i
  • Amortized cost: a_i = t_i + Φ_i − Φ_{i−1} = 2
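With Φ_i taken as the number of ones after the i-th increment, the amortized cost works out to exactly 2 for every step, which a short loop can confirm:

```python
def ones(x):
    return bin(x).count("1")

# Phi_i = number of ones after the i-th increment; Phi_0 = 0.
# Amortized cost a_i = t_i + Phi_i - Phi_(i-1) is exactly 2.
phi_prev = ones(0)
for i in range(1, 1000):
    t = bin((i - 1) ^ i).count("1")     # actual cost t_i (bit flips)
    phi = ones(i)
    assert t + phi - phi_prev == 2      # amortized cost is 2
    phi_prev = phi
```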
SLIDE 50

Back to Fibonacci Heaps

  • Worst‐case cost of a single delete‐min or decrease‐key operation is Ω(n)
  • Can we prove a small worst‐case amortized cost for delete‐min and decrease‐key operations?

Remark:

  • Data structure that allows operations O_1, … , O_k
  • We say that operation O_i has amortized cost a_i if for every execution the total time is

T ≤ Σ_{i=1}^{k} n_i · a_i,

where n_i is the number of operations of type O_i

SLIDE 51

Amortized Cost of Fibonacci Heaps

  • Initialize‐heap, is‐empty, get‐min, insert, and merge have worst‐case cost O(1)
  • Delete‐min has amortized cost O(log n)
  • Decrease‐key has amortized cost O(1)
  • Starting with an empty heap, any sequence of n operations with at most n_d delete‐min operations has total cost (time) O(n + n_d · log n).
  • Cost for Dijkstra: O(|E| + |V| · log |V|)