Encoding/Decoding, Lower Bounds
Russell Impagliazzo and Miles Jones
Thanks to Janine Tiefenbruck
http://cseweb.ucsd.edu/classes/sp16/cse21-bd/
May 9, 2016
Encoding: Fixed Density Strings
Idea: use marker bit to indicate when to interpret output as a position.
- Fix window size.
- If there is a 1 in the current "window" in the string, record a 1 to interpret the next bits as a position, then record its position and move the window over.
- Otherwise, record a 0 and move the window over.
Example: n = 12, k = 3, window size n/k = 4. How do we encode s = 011000000010?
- Window "0110" (positions 1-4): interpret the next bits as the position of a 1; this position is 01. Output so far: 101
- Window "1000" (positions 3-6, starting just after that 1): the position is 00. Output so far: 101100
- Window "0000" (positions 4-7): no 1s in this window, so record a 0. Output so far: 1011000
- Window "0001" (positions 8-11): the position is 11. Output so far: 1011000111
Now we can stop recording, since we have seen all three ones. Final encoding: 1011000111
Encoding: Fixed Density Strings
procedure WindowEncode (input: b1b2…bn, with exactly k ones and n-k zeros)
- 1. w := floor(n/k)
- 2. count := 0
- 3. location := 1
- 4. While count < k:
- 5. If there is a 1 in the window starting at the current location
- 6. Output 1 as a marker, then output the position of the first 1 in the window.
- 7. Increment count.
- 8. Update location to immediately after the first 1 in this window.
- 9. Else
- 10. Output 0.
- 11. Update location to the next index after the current window.
Uniquely decodable?
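The procedure above can be sketched as a short Python function (the name and 0-indexing are illustrative; it assumes n/k is a power of two, so each in-window position fits in log2(n/k) bits):

```python
import math

def window_encode(s, n, k):
    """Sketch of WindowEncode for a length-n binary string s with exactly k ones."""
    w = n // k                        # window size
    b = int(math.log2(w))             # bits per in-window position
    out = []
    count = 0
    loc = 0                           # current window start (0-indexed)
    while count < k:
        window = s[loc:loc + w]
        p = window.find("1")          # -1 if the window has no 1
        if p >= 0:
            out.append("1" + format(p, "0{}b".format(b)))  # marker + position
            count += 1
            loc += p + 1              # move to just after this 1
        else:
            out.append("0")           # empty window
            loc += w
    return "".join(out)
```

On the slide's example, window_encode("011000000010", 12, 3) returns "1011000111", matching the trace above.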
Decoding: Fixed Density Strings
procedure WindowDecode (input: x1x2…xm, target is exactly k ones and n-k zeros)
- 1. w := floor(n/k)
- 2. b := floor(log2(w))
- 3. s := empty string
- 4. i := 1
- 5. While i <= m
- 6. If xi = 0
- 7. s += 0…0 (w zeros)
- 8. i += 1
- 9. Else
- 10. p := decimal value of the bits xi+1…xi+b
- 11. s += 0…0 (p zeros)
- 12. s += 1
- 13. i := i + b + 1
- 14. If length(s) < n
- 15. s += 0…0 (n - length(s) zeros)
- 16. Output s.
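A matching Python sketch of the decoder (illustrative name; again assumes n/k is a power of two, with w >= 2):

```python
import math

def window_decode(x, n, k):
    """Sketch of WindowDecode: invert WindowEncode's output back to a
    length-n binary string with exactly k ones."""
    w = n // k                        # window size
    b = int(math.log2(w))             # bits per in-window position
    s = []
    i = 0
    while i < len(x):
        if x[i] == "0":               # empty window: emit w zeros
            s.append("0" * w)
            i += 1
        else:                         # marker: next b bits give the 1's position
            p = int(x[i + 1:i + 1 + b], 2)
            s.append("0" * p + "1")
            i += b + 1
    out = "".join(s)
    return out + "0" * (n - len(out))   # pad trailing zeros to length n
```

On the running example, window_decode("1011000111", 12, 3) recovers "011000000010".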
Encoding/Decoding: Fixed Density Strings
Correctness? E(s) = result of encoding string s of length n with k 1s, using WindowEncode. D(t) = result of decoding string t to create a string of length n with k 1s, using WindowDecode. Well-defined functions? Inverses? Goal: For each s, D(E(s)) = s. Strong Induction!
Encoding/Decoding: Fixed Density Strings
Output size? Assume n/k is a power of two. Consider s a binary string of length n with k 1s. How many bits is E(s)? A. n-1 B. log2(n/k) C. Depends on where 1s are located in s
Encoding/Decoding: Fixed Density Strings
Output size? Assume n/k is a power of two. Consider s a binary string of length n with k 1s. For which strings is E(s) shortest? A. More 1s toward the beginning. B. More 1s toward the end. C. 1s spread evenly throughout.
Encoding/Decoding: Fixed Density Strings
Output size? Assume n/k is a power of two. Consider s, a binary string of length n with k 1s. Best case: 1s toward the beginning of the string. E(s) has
- one bit for each 1 in s, to indicate that the next bits denote a position in a window;
- log2(n/k) bits for each 1 in s, to specify the position of that 1 in a window;
- k such 1s;
- no bits representing 0s, because all 0s occur in windows with 1s or after the last 1.
Total: |E(s)| = k log2(n/k) + k
Encoding/Decoding: Fixed Density Strings
Output size? Assume n/k is a power of two. Consider s a binary string of length n with k 1s. Worst case : 1s toward the end of the string. E(s) has
- Some bits representing 0s since there are no 1s in first several windows.
- One bit for each 1 in s to indicate that next bits denote positions in window.
- log2(n/k) bits for each 1 in s to specify position of that 1 in a window.
- k such 1s.
What's an upper bound on the number of these bits? A. n  B. n-k  C. k  D. 1  E. None of the above.
Encoding/Decoding: Fixed Density Strings
Output size? Assume n/k is a power of two. Consider s a binary string of length n with k 1s. Worst case : 1s toward the end of the string. E(s) has
- At most k bits representing 0s since there are no 1s in first several windows.
- One bit for each 1 in s to indicate that next bits denote positions in window.
- log2(n/k) bits for each 1 in s to specify position of that 1 in a window.
- k such ones.
Total |E(s)| <= k log2(n/k) + 2k
Encoding/Decoding: Fixed Density Strings
Output size? Assume n/k is a power of two. Consider s a binary string of length n with k 1s. k log2(n/k) + k <= | E(s) | <= k log2(n/k) + 2k
Encoding/Decoding: Fixed Density Strings
Output size? Assume n/k is a power of two. Consider s, a binary string of length n with k 1s. Given |E(s)| <= k log2(n/k) + 2k, we need at most k log2(n/k) + 2k bits to represent all length-n binary strings with k 1s. Hence, there are at most 2^(k log2(n/k) + 2k) many such strings. C(n,k) = # length-n binary strings with k 1s <= 2^(k log2(n/k) + 2k) = (n/k)^k · 4^k = (4n/k)^k
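This bound is easy to sanity-check numerically; a minimal Python check on the running example n = 12, k = 3 (math.comb computes C(n,k)):

```python
from math import comb, log2

n, k = 12, 3
bits = k * log2(n / k) + 2 * k        # upper bound on |E(s)|: 12 bits here
assert comb(n, k) <= 2 ** bits        # 220 <= 2^12 = 4096
assert 2 ** bits == (4 * n / k) ** k  # 2^(k log2(n/k) + 2k) = (4n/k)^k = 16^3
```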
Bounds for Binomial Coefficients
Using windowEncode(): Lower bound? Idea: find a way to count a subset of the fixed density binary strings. Some fixed density binary strings have one 1 in each of k chunks of size n/k.
How many such strings are there? A. n^n  B. k!  C. (n/k)^k  D. C(n,k)^k  E. None of the above.
Bounds for Binomial Coefficients
Using windowEncode(): C(n,k) <= (4n/k)^k. Using evenly spread strings: C(n,k) >= (n/k)^k. Counting helps us analyze our compression algorithm. Compression algorithms help us count.
Theoretically Optimal Encoding
A theoretically optimal encoding for length-n binary strings with k 1s would use the ceiling of log2(C(n,k)) bits. How?
- List all length n binary strings with k 1s in some order.
- To encode: Store the position of a string in the list, rather than the string itself.
- To decode: Given a position in list, need to determine string in that position.
Use lexicographic (dictionary) ordering …
Lex Order
String a comes before string b if, at the first position where they differ, a is smaller. I.e., a1a2…an <lex b1b2…bn means there exists j such that ai = bi for all i < j AND aj < bj. Which of these comes last in lex order? A. 1001  B. 0011  C. 1101  D. 1010  E. 0000
Lex Order
E.g. Length n=5 binary strings with k=3 ones, listed in lex order:
Original string s    Encoded string (i.e. position in this list)
00111                0 = 0000
01011                1 = 0001
01101                2 = 0010
01110                3 = 0011
10011                4 = 0100
10101                5 = 0101
10110                6 = 0110
11001                7 = 0111
11010                8 = 1000
11100                9 = 1001
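The list above can be reproduced directly in Python (the helper name is illustrative): generate every fixed-density string and sort, since string comparison in Python is exactly lex order.

```python
from itertools import combinations

def fixed_density_lex(n, k):
    """All length-n binary strings with exactly k ones, in lex order."""
    strings = ("".join("1" if i in ones else "0" for i in range(n))
               for ones in combinations(range(n), k))
    return sorted(strings)

table = fixed_density_lex(5, 3)
# table[0] is "00111" and table[9] is "11100", as in the list above
```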
Lex Order: Algorithm?
Need two algorithms, given specific n and k: s -> E(s,n,k) and p -> D(p,n,k). Idea: use recursion (reduce & conquer).
Lex Order: Algorithm?
For E(s,n,k): in the lex-ordered list, every string starting with 0 comes before every string starting with 1.
- Any string that starts with 0 must have position before C(n-1,k), the number of length-(n-1) binary strings with k 1s.
- Any string that starts with 1 must have position at or after C(n-1,k); the strings at or after it continue as length-(n-1) binary strings with k-1 1s.
Lex Order: Algorithm?
procedure lexEncode (b1b2…bn, n, k)
- 1. If n = 1,
- 2. return 0.
- 3. If b1 = 0,
- 4. return lexEncode(b2…bn, n-1, k)
- 5. Else
- 6. return C(n-1,k) + lexEncode(b2…bn, n-1, k-1)
For E(s,n,k):
- Any string that starts with 0 must have position before C(n-1,k).
- Any string that starts with 1 must have position at or after C(n-1,k).
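The recursion translates almost line-for-line into Python (illustrative name; math.comb computes C(n,k)):

```python
from math import comb

def lex_encode(s, n, k):
    """Position of s in the lex ordering of length-n binary strings with k ones."""
    if n == 1:
        return 0
    if s[0] == "0":
        # s sits inside the leading block of strings that start with 0
        return lex_encode(s[1:], n - 1, k)
    # skip past the C(n-1,k) strings that start with 0
    return comb(n - 1, k) + lex_encode(s[1:], n - 1, k - 1)
```

Against the n = 5, k = 3 table above, lex_encode("10011", 5, 3) returns 4, as expected.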
Lex Order: Algorithm?
procedure lexDecode (p, n, k)
- 1. If n = k,
- 2. return 111…1  // length-n string of all 1s
- 3. If p < C(n-1,k),
- 4. return "0" + lexDecode(p, n-1, k)
- 5. Else
- 6. return "1" + lexDecode(p - C(n-1,k), n-1, k-1)
For D(p,n,k):
- Any position before C(n-1,k) must correspond to a string that starts with 0.
- Any position at or after C(n-1,k) must correspond to a string that starts with 1.
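The decoder is the mirror image in Python (illustrative name):

```python
from math import comb

def lex_decode(p, n, k):
    """The length-n binary string with k ones at position p in lex order."""
    if n == k:
        return "1" * n                    # only one candidate string remains
    if p < comb(n - 1, k):
        return "0" + lex_decode(p, n - 1, k)
    return "1" + lex_decode(p - comb(n - 1, k), n - 1, k - 1)
```

Against the n = 5, k = 3 table above, lex_decode(9, 5, 3) returns "11100", inverting lexEncode.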
Victory!
Using lexEncode and lexDecode, we can represent any fixed-density length-n binary string with k 1s as a number in the range 0 through C(n,k)-1. So it takes the ceiling of log2(C(n,k)) bits to store fixed-density binary strings using lex order. Theoretical lower bound: log2(C(n,k)). Same! So this encoding algorithm is optimal.
Another application of counting … lower bounds
Sorting algorithm: performance was measured in terms of number of comparisons between list elements What's the fastest possible worst case for any sorting algorithm?
Another application of counting … lower bounds
Tree diagram represents possible comparisons we might have to do, based on relative sizes of elements.
Rosen p. 761 a, b, c distinct integers
Another application of counting … lower bounds
Maximum number of comparisons for an algorithm is the height of its tree diagram.
Another application of counting … lower bounds
How many leaves will there be in a decision tree that sorts n elements? A. 2^n  B. log n  C. n!  D. C(n,2)  E. None of the above.
Another application of counting … lower bounds
What's the fastest possible worst case for any sorting algorithm? Maximum number of comparisons for an algorithm is the height of its tree diagram. For any algorithm, what would be the smallest possible height? What do we know about the tree? * Internal nodes correspond to comparisons: depends on the algorithm. * Leaves correspond to possible input arrangements: n! of them.
Another application of counting … lower bounds
How does height relate to number of leaves? Theorem: There are at most 2^h leaves in a binary tree with height h. Which of the following is true? A. It's possible to have a binary tree with height 3 and 1 leaf. B. It's possible to have a binary tree with height 1 and 3 leaves. C. Every binary tree with height 3 has 1 leaf. D. Every binary tree with height 1 has 3 leaves. E. None of the above.
Another application of counting … lower bounds
How does height relate to number of leaves? Theorem: There are at most 2^h leaves in a binary tree with height h. Proof: By induction on the height h >= 0. Base case: WTS that there are at most 2^0 leaves in a binary tree with height h = 0. What trees have height 0?
Another application of counting … lower bounds
If a binary tree has height 0, its only node is the root, which is then also the only leaf. So the number of leaves is 1 = 2^0 in the only possible tree with h = 0.
Another application of counting … lower bounds
Induction Step: Let h be some integer >= 0 and assume (as the IH) that there are at most 2^h leaves in a binary tree with height h. WTS that there are at most 2^(h+1) leaves in a binary tree with height h+1. Consider a tree U with height h+1. How can we relate it to trees of height h so that we can use the IH? Remove all the leaves of U. This gives a new tree, T, of height h. By the IH, T has at most 2^h leaves. To get from T back to U, we add back the leaves of U. How many are there? For each leaf of T, there are at most 2 leaves in U. So: # leaves in U <= 2(# leaves in T) <= 2(2^h) = 2^(h+1).
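The theorem can also be spot-checked empirically; a minimal Python sketch, with an ad hoc tree representation (a node is a pair of optional children, None meaning absent):

```python
import random

def height(t):
    """Height in edges; a single node has height 0, an absent subtree -1."""
    if t is None:
        return -1
    return 1 + max(height(t[0]), height(t[1]))

def leaves(t):
    """Count nodes with no children."""
    if t is None:
        return 0
    if t[0] is None and t[1] is None:
        return 1
    return leaves(t[0]) + leaves(t[1])

def random_tree(depth):
    """A random binary tree of height at most depth."""
    if depth == 0 or random.random() < 0.3:
        return (None, None)
    left = random_tree(depth - 1) if random.random() < 0.8 else None
    right = random_tree(depth - 1) if random.random() < 0.8 else None
    return (left, right)

random.seed(1)
for _ in range(1000):
    t = random_tree(6)
    assert leaves(t) <= 2 ** height(t)   # the theorem, checked on samples
```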
Another application of counting … lower bounds
What's the fastest possible worst case for any sorting algorithm? Maximum number of comparisons for an algorithm is the height of its tree diagram. Each tree diagram must have at least n! leaves, so its height must be at least log2(n!). I.e., the fastest possible worst-case performance of sorting is log2(n!) comparisons.
Another application of counting … lower bounds
What's the fastest possible worst case for any sorting algorithm? log2(n!). How big is that? Lemma: For n > 1, n^(n/2) <= n! <= n^n. Proof: n! <= n^n because each of the n factors is at most n. For the lower bound, pair the factors: (n!)^2 = (1·n)(2·(n-1))…(n·1), and each pair k(n+1-k) >= n, so (n!)^2 >= n^n and n! >= n^(n/2).
Another application of counting … lower bounds
What's the fastest possible worst case for any sorting algorithm? log2(n!). How big is that? Lemma: For n > 1, n^(n/2) <= n! <= n^n. Theorem: log2(n!) is in Θ(n log n). Proof: For n > 1, taking the logarithm of both sides in the lemma gives (n/2) log2(n) <= log2(n!) <= n log2(n), i.e., log2(n!) is in Θ(n log n).
Another application of counting … lower bounds
What's the fastest possible worst case for any sorting algorithm? log2(n!). How big is that? Theorem: log2(n!) is in Θ(n log n). Therefore, the best sorting algorithms will need Θ(n log n) comparisons in the worst case. I.e., it's impossible for a comparison-based algorithm to do better than Merge Sort (in the worst case).
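The sandwich (n/2)·log2(n) <= log2(n!) <= n·log2(n) behind the Θ(n log n) claim can be checked numerically in Python:

```python
from math import factorial, log2

for n in (2, 8, 32, 128):
    lg = log2(factorial(n))    # minimum worst-case number of comparisons
    # both inequalities from the lemma, after taking log2 of each side
    assert (n / 2) * log2(n) <= lg <= n * log2(n)
```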
Representing undirected graphs
Strategy:
- 1. Count the number of simple undirected graphs.
- 2. Compute lower bound on the number of bits required to represent these graphs.
- 3. Devise algorithm to represent graphs using this number of bits.
What's true about simple undirected graphs? A. Self-loops are allowed. B. Parallel edges are allowed. C. There must be at least one vertex. D. There must be at least one edge. E. None of the above.
Rosen p. 641-644
Representing undirected graphs: Counting
In a simple undirected graph on n (labeled) vertices, how many edges are possible? A. n^2  B. n(n-1)  C. C(n,2): possibly one edge for each set of two distinct vertices.  D. 2^C(n,2)  E. None of the above.
** Recall notation: C(n,k) = n! / (k!(n-k)!) **
Representing undirected graphs: Lower bound
How many different simple undirected graphs on n (labeled) vertices are there? A. n^2  B. n(n-1)  C. C(n,2)  D. 2^C(n,2): for each possible edge, decide whether it is in the graph or not.  E. None of the above. Conclude: the minimum number of bits to represent simple undirected graphs with n vertices is log2(2^C(n,2)) = C(n,2) = n(n-1)/2
Representing undirected graphs: Algorithm
Goal: represent a simple undirected graph with n vertices using n(n-1)/2 bits.
Idea: store the adjacency matrix, but since
- the diagonal entries are all zero (no self loops), and
- the matrix is symmetric (undirected graph),
only store the entries above the diagonal.
How many entries of the adjacency matrix are above the diagonal? A. n^2  B. n(n-1)  C. C(n,2)  D. 2^n  E. None of the above.
Can be stored as 0111101100 which uses C(5,2) = 10 bits.
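This scheme can be sketched in Python (function names are illustrative), flattening the strict upper triangle row by row:

```python
def encode_graph(adj):
    """Flatten the strict upper triangle of a 0/1 adjacency matrix to a bit string."""
    n = len(adj)
    return "".join(str(adj[i][j]) for i in range(n) for j in range(i + 1, n))

def decode_graph(bits, n):
    """Rebuild the symmetric adjacency matrix (zero diagonal) from the bits."""
    adj = [[0] * n for _ in range(n)]
    it = iter(bits)
    for i in range(n):
        for j in range(i + 1, n):
            adj[i][j] = adj[j][i] = int(next(it))
    return adj
```

For any simple graph on n vertices, the output has exactly n(n-1)/2 bits, and decode_graph(encode_graph(adj), n) recovers adj.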
Representing undirected graphs: Algorithm
Decoding: What simple undirected graph is encoded by the binary string 011010 110101 111111 000000 110101 110010? (Answer choices A and B were graphs shown on the slide.)
- C. Either one of the above.
- D. Neither one of the above.
Representing directed graphs: Counting
In a simple directed graph on n (labeled) vertices, how many edges are possible? A. n^2  B. n(n-1): choose a starting vertex, then choose a different ending vertex.  C. C(n,2)  D. 2^C(n,2)  E. None of the above.
Simple graph: no self loops, no parallel edges.
Representing directed graphs: Counting
How many different simple directed graphs on n (labeled) vertices are there? A. n^2  B. n(n-1)  C. C(n,2)  D. 2^C(n,2)  E. None of the above.
Representing directed graphs: Counting
Another way of counting that there are 2^(n(n-1)) simple directed graphs with n vertices: for each of the C(n,2) pairs of distinct vertices {v,w}, specify whether there is
* no edge between them,
* an edge from v to w but no edge from w to v,
* an edge from w to v but no edge from v to w, or
* edges both from v to w and from w to v.
Each of the C(n,2) pairs has 4 options. Product rule! (4)(4)…(4) = 4^C(n,2) = 4^(n(n-1)/2) = 2^(n(n-1))
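The product-rule count can be verified by brute force in Python for a small n, enumerating one of 4 options per unordered pair:

```python
from itertools import combinations, product
from math import comb

n = 3
pairs = list(combinations(range(n), 2))        # the C(n,2) unordered pairs
# 4 options per pair: no edge, v->w only, w->v only, or both
digraphs = list(product(range(4), repeat=len(pairs)))
assert len(pairs) == comb(n, 2)
assert len(digraphs) == 4 ** comb(n, 2) == 2 ** (n * (n - 1))   # 64 for n = 3
```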
Representing directed graphs: Lower bound
Conclude: the minimum number of bits to represent simple directed graphs with n vertices is log2(2^(n(n-1))) = n(n-1)
Representing directed graphs: Algorithm
Encoding: For each of the n vertices, indicate which of the other vertices it has an edge to. How would you encode the graph shown on the slide using bits (0s and 1s)?
- A. 123232443
- B. 0110 0000 0101 0010
- C. 110 000 011 001
- D. None of the above.
Representing directed graphs: Algorithm
Decoding: Given a string of 0s and 1s of length n(n-1),
- Define vertex set { 1, …, n }.
- First n-1 bits indicate edges from vertex 1 to other vertices.
- Next n-1 bits indicate edges from vertex 2 to other vertices.
- etc.
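The decoding just described, with its matching encoder, can be sketched in Python (illustrative names):

```python
def encode_digraph(adj):
    """For each vertex i, list bits for edges from i to every other vertex."""
    n = len(adj)
    return "".join(str(adj[i][j]) for i in range(n) for j in range(n) if j != i)

def decode_digraph(bits, n):
    """Invert encode_digraph: n-1 bits per vertex, skipping the diagonal."""
    adj = [[0] * n for _ in range(n)]
    it = iter(bits)
    for i in range(n):
        for j in range(n):
            if j != i:
                adj[i][j] = int(next(it))
    return adj
```

The encoding uses exactly n(n-1) bits, matching the lower bound above.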