  1. On the Capacity of Information Networks
  January 28, 2005
  April Rasala Lehman
  Joint work with Nicholas Harvey, Robert Kleinberg, and Eric Lehman (MIT)

  2. “There is as yet no unified theory of network information flow. But there can be no doubt that a complete theory of communication networks would have wide implications for the theory of communication and computation.” - Cover & Thomas, Elements of Information Theory

  3. History of Network Coding
  • Breakthrough [Ahlswede et al. ’00]: the existence of a multicast solution depends on a min-cut condition.
  • Algebraic framework [Koetter & Médard ’03]: led to a randomized, distributed, fault-tolerant algorithm for multicast [Ho et al. ’03].
  • Deterministic algorithms for multicast [Jaggi et al. ’03, Harvey et al. ’05].

  4. The Network Coding Problem
  Given:
  • A directed acyclic graph G.
  • An integral capacity c(u, v) for each edge (u, v).
  • k commodities, each with a set of source nodes and a set of sink nodes.
  [Figure: an example network with two sources at the top, two sinks at the bottom, and edges labeled a through f.]

  5. The Idea of Network Coding
  • There is one message for each commodity: every source of the commodity knows the message, and every sink wants it.
  • A message is a single symbol from an alphabet Σ.
  • Each edge of capacity c can transmit c symbols from Σ.
  • Question: does a solution exist?
  [Figure: the butterfly network. One source has bit x, the other has bit y; the shared middle edge carries x ⊕ y; one sink wants y, the other wants x.]
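The figure's XOR trick can be checked directly. A minimal sketch (Python; the topology is the standard butterfly example, assumed to match the figure):

```python
# Butterfly network: each sink sees one bit directly and the coded
# bottleneck symbol x XOR y, from which it recovers the other bit.
def butterfly(x: int, y: int) -> tuple[int, int]:
    coded = x ^ y         # the single symbol on the bottleneck edge
    at_sink1 = coded ^ x  # this sink also sees x directly, so it gets y
    at_sink2 = coded ^ y  # this sink also sees y directly, so it gets x
    return at_sink1, at_sink2

for x in (0, 1):
    for y in (0, 1):
        assert butterfly(x, y) == (y, x)
```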

  6. This Talk: from Existence to Optimization
  • Consider the size of the alphabet Σ.
  ◦ The model of network coding that works for multicast doesn’t work well in general.
  ◦ We need a notion of “rate”.
  • What is the maximum achievable communication rate in a network?
  ◦ Explore bounds based on cut conditions.
  ◦ Develop entropy inequalities based on graph structure.
  • What is the maximum rate in an undirected network?

  7. Alphabet Size

  8. Who Cares About Alphabet Size?
  • A small alphabet means simple, efficiently computable edge functions.
  • A large alphabet implies large latency.
  • Computing the edge functions naively needs Ω(log |Σ|) bits of memory at each node.
  • An upper bound on |Σ| would imply that the network coding problem is decidable.

  9. Our Results - The Bad News
  • Sometimes an enormous alphabet is required!
  ◦ An n-node network may require an alphabet of size |Σ| = 2^(e^Ω(n^(1/3))).
  ◦ A solution may exist but be hopelessly unwieldy!
  • Nonmonotonicity:
  ◦ An instance may be solvable with a 4-symbol alphabet, but not with a 1000-symbol alphabet!
  ◦ We can’t fix a single large alphabet size, e.g. 2^64.

  10. Building Block: Network I_k
  • I_k has messages M_1, ..., M_k and P_1, ..., P_k.
  [Figure: network I_k, with edges of capacity 2k-2, k-1, and 2 feeding sinks that want, variously: all M’s; all P’s; all M’s and P’s; all M’s with M_i replaced by P_j; and all P’s with P_j replaced by M_i.]
  Lemma 1: I_k is solvable iff |Σ| = q^k for some integer q, i.e. iff |Σ| is a perfect k-th power.

  11. Doubly-Exponential Lower Bound
  • Network I_k has O(k^2) nodes and requires |Σ| to be a perfect k-th power.
  • Let J_n consist of the disjoint networks I_2, I_3, I_5, I_7, I_11, ..., I_p, where p is the largest prime less than n^(1/3).
  ⇒ J_n has O(n) nodes and there is a solution if and only if |Σ| = C^(2·3·5·7·11···p) for some integer C ≥ 2, so |Σ| = C^(e^Ω(n^(1/3))) ≥ 2^(e^Ω(n^(1/3))).
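A rough numeric illustration (not from the talk; taking the smallest constant C = 2 gives the smallest solvable alphabet):

```python
def primes_up_to(m):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (m + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(m ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

# The smallest solvable |Σ| for J_n is 2 raised to the product of the
# primes below n^(1/3) -- already astronomical for modest n.
for n in (1_000, 27_000, 125_000):
    exponent = 1
    for q in primes_up_to(round(n ** (1 / 3))):
        exponent *= q
    print(f"n = {n:>7}: smallest solvable |Σ| = 2^{exponent}")
```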

  12. Our Results - The Good News
  • If each edge can send one additional bit, then the minimum alphabet size is O(1).
  • Our bad example is an artifact of using the network at 100% capacity.
  • Are we wasting our time with this model?
  • Tweak the model?
  ◦ Messages are drawn from an alphabet Γ.
  ◦ Each edge transmits one symbol from a larger alphabet Σ.
  ◦ Rate = log|Γ| / log|Σ|.
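A small numeric illustration of the rate definition (not from the slides; the alphabet sizes are hypothetical):

```python
from math import log2

def rate(msg_alphabet_size: int, edge_alphabet_size: int) -> float:
    """Rate = log|Gamma| / log|Sigma|."""
    return log2(msg_alphabet_size) / log2(edge_alphabet_size)

print(rate(4, 4))  # 1.0: edge symbols no larger than the messages
print(rate(4, 8))  # 0.666...: each edge carries one extra bit per 2-bit message
```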

  13. What is the Maximum Achievable Rate?

  14. What is the Maximum Achievable Rate?
  • Open problem, except for multicast, where the max rate = the min-cut between the source and any sink.
  • Is there a cut-based upper bound on rate for the general problem?
  • Do information-theoretic tools give a better upper bound?

  15. Sparsity
  • The sparsity of a cut A ⊆ E is (capacity of the edges in A) / (number of commodities with no remaining source-sink path).
  • The sparsity of a graph is the minimum sparsity over all cuts.
  • There exist directed graphs in which the maximum rate > sparsity.
  [Figure: a directed example with sparsity 1/2 and rate 1.]
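A brute-force sketch of the definition (Python; the example instance is the butterfly network, an assumed reconstruction of the slide's figure):

```python
from itertools import combinations

def reachable(edges, src):
    """Nodes reachable from src along directed edges."""
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        for a, b in edges:
            if a == u and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

def sparsity(edges, cap, commodities):
    """Brute force: min over cuts A of cap(A) / #commodities disconnected by A."""
    best = float("inf")
    for r in range(1, len(edges) + 1):
        for cut in combinations(edges, r):
            rest = [e for e in edges if e not in cut]
            separated = sum(1 for s, t in commodities
                            if t not in reachable(rest, s))
            if separated:
                best = min(best, sum(cap[e] for e in cut) / separated)
    return best

# Assumed butterfly instance: cutting the bottleneck (u, v) disconnects
# both commodities, so sparsity is 1/2, yet the coding rate is 1.
edges = [("s1", "u"), ("s2", "u"), ("u", "v"),
         ("v", "t1"), ("v", "t2"), ("s1", "t2"), ("s2", "t1")]
cap = {e: 1 for e in edges}
print(sparsity(edges, cap, [("s1", "t1"), ("s2", "t2")]))  # 0.5
```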

  16. Meagerness
  • A set of commodities P is separated by a cut if there is no remaining path from a source of any commodity in P to a sink of any commodity in P.
  • The meagerness of a graph is the minimum, over all sets of commodities P and all cuts that separate P, of (capacity of the edges in the cut) / |P|.
  • The maximum rate ≤ meagerness in directed graphs.
  [Figure: a directed example with meagerness 1 and rate 1.]
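The same brute-force style works for meagerness (a sketch under the same assumed butterfly instance, where meagerness 1 matches the achievable coding rate of 1):

```python
from itertools import combinations

def reachable(edges, src):
    """Nodes reachable from src along directed edges."""
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        for a, b in edges:
            if a == u and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

def meagerness(edges, cap, commodities):
    """Brute force: min over commodity sets P and cuts separating P of cap(cut)/|P|."""
    best = float("inf")
    for k in range(1, len(commodities) + 1):
        for P in combinations(commodities, k):
            sinks = {t for _, t in P}
            for r in range(1, len(edges) + 1):
                for cut in combinations(edges, r):
                    rest = [e for e in edges if e not in cut]
                    if all(sinks.isdisjoint(reachable(rest, s)) for s, _ in P):
                        best = min(best, sum(cap[e] for e in cut) / len(P))
    return best

edges = [("s1", "u"), ("s2", "u"), ("u", "v"),
         ("v", "t1"), ("v", "t2"), ("s1", "t2"), ("s2", "t1")]
cap = {e: 1 for e in edges}
print(meagerness(edges, cap, [("s1", "t1"), ("s2", "t2")]))  # 1.0
```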

  17. Sometimes Max Rate < Meagerness
  • The meagerness is 1.
  • This flow solution has rate 2/3. Best possible?
  [Figure: the flow solution on a three-commodity network, with messages from Γ = {0,1}^2, edge symbols from Σ = {0,1}^3, and edges carrying fractions 1/3 and 2/3 of the messages.]

  20. Better Bounds Through Entropy
  • We obtain strictly better bounds on rate through entropy arguments.
  ◦ Show the max rate is 2/3 for the previous example.
  ◦ This implies meagerness is a loose upper bound on rate.
  • The entropy of a random variable X is the information in X measured in bits.
  ◦ The entropy of X is denoted H(X).
  ◦ The entropy of X and Y together is H(X, Y).

  21. Entropy View of Network Coding
  • Suppose messages are selected independently and uniformly from Γ.
  • As a result, the symbol transmitted on each edge is a random variable.
  • The structure of the graph and the properties of entropy imply constraints that a network code must satisfy.
  [Figure: a network with sources S_a, S_b, S_c at the top, intermediate edges F and G, and sinks T_b, T_c, T_a at the bottom.]

  22. Entropy and Network Coding
  • Properties of entropy:
  ◦ Nonnegative: H(U) ≥ 0.
  ◦ Nondecreasing: H(U, x) ≥ H(U).
  ◦ Submodular: H(U) + H(V) ≥ H(U ∪ V) + H(U ∩ V).
  • Constraints on a network coding solution:
  ◦ Uniformity of sources: H(S_A) = log|Γ|.
  ◦ Independence of sources: H(S_A, S_B) = H(S_A) + H(S_B).
  ◦ Sources = sinks: H(S_A, U) = H(T_A, U) for all U.
  ◦ Edge capacity: H(e) ≤ log|Σ|.
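The three properties can be sanity-checked numerically on any joint distribution. A minimal sketch (Python, not from the talk), using empirical entropies of correlated bits:

```python
from collections import Counter
from math import log2

def entropy(samples, variables):
    """Empirical joint entropy H(variables) from equally likely samples."""
    counts = Counter(tuple(s[v] for v in sorted(variables)) for s in samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Hypothetical joint distribution: uniform bits x, y and z = x XOR y.
samples = [{"x": x, "y": y, "z": x ^ y} for x in (0, 1) for y in (0, 1)]

U, V = {"x", "y"}, {"y", "z"}
assert entropy(samples, U) >= 0                               # nonnegative
assert entropy(samples, U | {"z"}) >= entropy(samples, U)     # nondecreasing
assert (entropy(samples, U) + entropy(samples, V)
        >= entropy(samples, U | V) + entropy(samples, U & V)) # submodular
```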

  23. One More Condition: Downstreamness
  • U is downstream of V if all paths from a source to an edge in U intersect V.
  • If U is downstream of V, then H(V) = H(U, V).
  • Ex 1: T_b is downstream of {S_a, F}, so H(S_a, F) = H(S_a, T_b, F).
  • Ex 2: T_a is downstream of {S_b, G}, so H(S_b, G) = H(T_a, S_b, G).
  • Ex 3: T_c is downstream of {F, G}, so H(F, G) = H(T_c, F, G).
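A minimal sketch of the check (Python; the adjacency below is a hypothetical topology chosen to be consistent with the figure and the three examples, not taken from the talk):

```python
# U is downstream of V iff no source can reach U once V is deleted,
# i.e. every source-to-U path intersects V.
def downstream(adj, sources, U, V):
    seen, stack = set(), [s for s in sources if s not in V]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        stack.extend(w for w in adj.get(u, ()) if w not in V)
    return seen.isdisjoint(U)

adj = {"Sa": ["F", "G", "Tb"], "Sb": ["F", "G", "Ta"], "Sc": ["F", "G"],
       "F": ["Tb", "Tc"], "G": ["Ta", "Tc"]}
sources = {"Sa", "Sb", "Sc"}
assert downstream(adj, sources, U={"Tb"}, V={"Sa", "F"})  # Ex 1
assert downstream(adj, sources, U={"Ta"}, V={"Sb", "G"})  # Ex 2
assert downstream(adj, sources, U={"Tc"}, V={"F", "G"})   # Ex 3
```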

  26. Proof: Max Rate = 2/3

  27. Combining the conditions on the example network:
  H(S_a, F) + H(S_b, G)
  = H(S_a, T_b, F) + H(T_a, S_b, G)       [downstreamness]
  = H(S_a, S_b, F) + H(S_a, S_b, G)       [sources = sinks]
  ≥ H(S_a, S_b, F, G) + H(S_a, S_b)       [submodularity]
  = H(S_a, S_b, T_c, F, G) + H(S_a, S_b)  [downstreamness]
  ≥ H(S_a, S_b, T_c) + H(S_a, S_b)        [nondecreasing]
  = H(S_a, S_b, S_c) + H(S_a, S_b)        [sources = sinks]
  = 5 log|Γ|                              [independence and uniformity of sources]

  28. On the other hand, H(S_a, F) ≤ H(S_a) + H(F) ≤ log|Γ| + log|Σ|, and likewise H(S_b, G) ≤ log|Γ| + log|Σ|. Combining, 2 log|Σ| + 2 log|Γ| ≥ 5 log|Γ|, so 2 log|Σ| ≥ 3 log|Γ|, and the rate log|Γ| / log|Σ| is at most 2/3. The flow solution achieves 2/3, so the max rate = 2/3.
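A concrete rate-2/3 code making the bound tight can be checked mechanically. A sketch (Python; the edge functions F and G are assumptions chosen to match the flow solution, with the node feeding F seeing all three sources):

```python
from itertools import product
from math import log2

# Messages a, b, c are 2 bits each (Γ = {0,1}^2); each edge carries
# 3 bits (Σ = {0,1}^3). F forwards b plus the low bit of c; G forwards
# a plus the high bit of c.
for a, b, c in product(range(4), repeat=3):
    F = (b, c & 1)
    G = (a, c >> 1)
    assert F[0] == b                # T_b decodes b from F
    assert G[0] == a                # T_a decodes a from G
    assert F[1] | (G[1] << 1) == c  # T_c decodes c from F and G

# The entropy bound 2 log|Σ| >= 3 log|Γ| holds with equality here:
assert 2 * log2(8) == 3 * log2(4)   # rate = log|Γ| / log|Σ| = 2/3
```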

  40. What is the Maximum Rate?
  • Simple cut-based characterizations of the max rate are unsatisfactory.
  ◦ Sparsity is wrong for directed graphs.
  ◦ Meagerness is a loose upper bound.
  • Do the entropy conditions give a tight upper bound on rate?
  ◦ Unknown in general.
  ◦ There are many inequalities and many ways to combine them; the result is a giant LP.

  41. Further Results: Coding in Undirected Graphs
  • How do we even model this?
  ◦ Rule out cyclic dependencies between edge functions.
  ◦ An edge’s capacity bounds the information flow in the two directions.
  • The entropy conditions carry over, e.g. downstreamness.
  • Sparsity is a loose upper bound on rate.
  • Conjecture: in an undirected graph, the maximum multicommodity flow = the maximum network coding rate.
  • We prove this for an infinite class of “interesting” graphs.

  42. Okamura-Seymour Example
  • 4 commodities; each edge has capacity 1.
  • Sparsity 1.
  • Maximum multicommodity flow 3/4.
  • The maximum rate with network coding is also 3/4!
  [Figure: the Okamura-Seymour instance, with sources s(a), s(b), s(c), s(d) and sinks t(a), t(b), t(c), t(d) arranged around the graph.]
