Network (Coding) Security: Known knowns, Unknown knowns, and Unknowns

Sidharth Jaggi, The Chinese University of Hong Kong


SLIDE 1

Sidharth Jaggi, The Chinese University of Hong Kong

Network (Coding) Security:

Known knowns, Unknown knowns, and Unknowns

SLIDE 2

Known knowns: Background

SLIDE 3

What is security?

  • Merriam-Webster

The quality or state of being secure: as a) : freedom from danger : safety b) : freedom from fear or anxiety c) : freedom from the prospect of being laid off

SLIDE 4

Background – Communication Scenario

(Figure: Alice communicates with Bob over a network (min-cut shown); the bad guy Calvin “controls” a subset of f links/nodes.)

Source → Encoder → network containing adversaries → Decoder: the message is encoded into R packets, and the decoder recovers it from the received collection, of which f packets may be eavesdropped or jammed.

Setting: mostly “large” alphabets (packets); mostly acyclic networks; tools from info/coding theory and crypto.

SLIDE 5

Background – Communication Scenario

(Figure: Source → Encoder → network containing adversaries → Decoder; R packets sent, and the received collection may include f eavesdropped or jammed packets.)

Design goals:
  • Secrecy
  • Robustness to erasures/errors
  • More later…
SLIDE 6

Secrecy

(Figure: Alice → Bob, with Zr eavesdropped links; example: C = 1, Zr = 1, but secrecy rate 1 possible!)

  • Cai-Yeung: Secrecy rate C-Zr achievable (intuition – “network wiretap channel”)
  • Feldman et al: small field-sizes, random codes, efficient
  • Silva et al: “Universal” codes (rank-metric/subspace codes)
  • Multiple other works… Rouayheb et al, Bhattad et al, Ngai et al, …
  • Cui et al – LP formulation that’s never worse than C-Zr
  • General problem still open
  • Node eavesdropper problem even harder
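The Cai-Yeung “network wiretap” intuition above can be seen in the smallest toy case. The sketch below is our own illustration (not code from the talk): pad the message with a uniform random key across two parallel links, so any single eavesdropped link is statistically independent of the message.

```python
# Toy "network wiretap" code: min-cut 2, one eavesdropped link (Zr = 1),
# secrecy rate C - Zr = 1. Alice sends the key r on link 1 and m XOR r on
# link 2; each link alone is a uniform random byte, but Bob decodes from both.
import secrets

def encode(m):
    r = secrets.randbelow(256)   # uniform one-time-pad byte
    return r, m ^ r              # (link 1, link 2)

def decode(x1, x2):
    return x1 ^ x2

m = 0b10110001
x1, x2 = encode(m)
assert decode(x1, x2) == m       # Bob recovers the message
```

Cai-Yeung's C − Zr construction generalizes this padding to arbitrary networks by linearly mixing C − Zr message symbols with Zr key symbols.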
SLIDE 7

Erasures

Zw erased links

  • Kötter-Médard: Rate C-Zw possible. Optimal.
  • Ho et al: expected throughput for random erasures, efficient random distributed codes
  • Dana et al: Even correlated random erasures (interference) rate computable, efficiently attainable
  • Silva et al: Rank-metric codes for worst-case erasures
  • Node-erasures: Capacity based on node-cut attainable
SLIDE 8

“Random” Error-correction

packet errors/ i.i.d. symbol errors

  • Song et al/Borade et al: Symbol errors: Separation between link-by-link error-correction/network coding
  • Silva et al: Rate C-Zw efficiently attainable end-to-end with random packet errors (rank-metric codes)
SLIDE 9

Error-detection

Zw noisy links

  • Omniscient Calvin: Rate R < C-Zw possible with error-detection. Optimal.
  • Ho et al: Any rate; as long as there is at least one path Calvin does not control (see/jam), errors can be detected. Optimal.
SLIDE 10

Adversarial errors

Zw jammed links

  • Cai-Yeung: Rate C-2Zw possible. Optimal. Network Singleton bound/Network GV codes
  • Jaggi et al/Kötter-Kschischang/Kötter-Kschischang-Silva: Efficient codes achieving C-2Zw
  • Jaggi et al: If Calvin not omniscient, C-Zw possible in some scenarios (more on this later)
  • Node adversary problem much harder (more on this later).
SLIDE 11
Addenda …

  • Cryptography (computational assumptions)
  • List-decoding
  • Rateless codes
  • ...

SLIDE 12

Unknown knowns part I:

Reliable and Secure Communication over Adversarial Multipath Networks

Qiaosheng (Eric) Zhang, Mayank Bakshi, Sidharth Jaggi, Swanand Kadhe, Alex Sprintson

Codes, Algorithms, Networks: Design & Optimization in Information Theory

SLIDE 13

Motivating Example 1

(Figure: Alice transmits to Bob over C different frequencies (frequency vs. amplitude); the adversary controls Z of them; achievable rate R.)

SLIDE 14

Motivating Example 2

(Figure: C different frequencies; the adversary now eavesdrops on all the frequencies. Reed-Solomon codes achieve rate C/2, per the Singleton bound.)

SLIDE 15

Motivating Example 3

(Figure: C different frequencies; what rate R is achievable now? ??? — somewhere between C/2 and C.)

SLIDE 16

Alternate Motivation

  • C computers
  • Administrator: wants to store a file.
  • How? By distributing it across C computers.

(Figure: servers 1, 2, 3, …, C.)

SLIDE 17
Alternate Motivation

  • Administrator: wants to store a file.
  • But hacker has read/write privileges on some servers…

SLIDE 18
Alternate Motivation

  • Goals:
    (1) The hacker cannot corrupt the file — reliability.
    (2) The hacker cannot decipher the contents — secrecy.

SLIDE 19

Basic model

Optimal rate by regime:
  • Weak adversary regime: C − ZRW − ZWO
  • Strong adversary regime: C − 2·ZRW − ZWO

(Figure: the regimes as tetrahedra in (ZRO, ZWO, ZRW) space — weak adversary regime: tetrahedron OV1V2V3; strong adversary regime: tetrahedron V1V2V3V4; boundary planes ZRW + ZRO + ZWO ≤ C and 2·ZRW + ZRO + ZWO = C.)

[1] Zhang et al, ITW 2015, Talking secretly, reliably and efficiently: A “complete” characterization
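A minimal sketch (function name and interface are ours) evaluating the optimal-rate expressions on this slide — C − ZRW − ZWO in the weak-adversary regime, C − 2·ZRW − ZWO in the strong-adversary regime — clipped at zero:

```python
# Hypothetical helper illustrating the slide's rate characterization.
def optimal_rate(C: int, zwo: int, zrw: int, strong: bool) -> int:
    rate = C - 2 * zrw - zwo if strong else C - zrw - zwo
    return max(rate, 0)

print(optimal_rate(5, 1, 1, strong=False))  # weak regime: 3
print(optimal_rate(5, 1, 1, strong=True))   # strong regime: 2
```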

SLIDE 20

Basic model

  • Non-causal condition (Model 0): one-shot transmission.

(Figure: Alice transmits (x1, x2, x3); one link is overwritten, so Bob receives (x1, y2, x3).)

[1] Zhang et al, ITW 2015, Talking secretly, reliably and efficiently: A “complete” characterization

SLIDE 21

Causality/feedback

  • Effect of causality ? (Model 1)
  • Cannot see the future
  • Stuck to fixed channels

(Figure: symbols x_{ij} sent on link i in round j; Calvin cannot see the future and is stuck to fixed channels, so Bob receives, e.g., (y21, x31, x11), (y22, x32, x12), (y23, x33, x13).)

SLIDE 22

Causality/feedback

  • Effect of passive feedback ? (Model 2)

(Figure: as in Model 1, but the received symbols (y21, x31, x11), (y22, x32, x12) are also fed back to Alice before the next round.)

SLIDE 23

Problem Statement

  • Multi-round transmission without feedback (Model 1)
  • System diagram: the encoder maps message m and random key k to X = Enc(m, k); the decoder outputs Dec(Y) from the received Y.

(Figure: per-round transmitted symbols x_{ij} and received symbols y_{ij}.)

SLIDE 24

Problem Statement

  • Multi-round transmission with passive feedback (Model 2)
  • System diagram, j-th round (j = 1, 2, …): the encoder maps (m, k, Y(1), Y(2), …, Y(j−1)) to X(j); the decoder outputs Dec(Y) from the received codeword.
  • Can also provide secrecy (if desired) at lower rates.

SLIDE 25

Jamming models

  • Additive jamming (wireless network): yi = xi + ei
  • Overwrite jamming (wired network / storage system): yi = ei

SLIDE 26


Results: A “Complete” Characterization

SLIDE 27

Overview of main results (additive)

(Figure: optimal-rate regions for one-shot transmission (Model 0), multi-round transmission without feedback (Model 1), and multi-round transmission with passive feedback (Model 2), drawn as polytopes with vertices O, V1, …, V5 in (ZRO, ZWO, ZRW) space.)

SLIDE 28

Overview of main results (overwrite)

(Figure: the corresponding polytopes for overwrite jamming, with an additional vertex V6.)

SLIDE 29

Multi-round transmission without feedback (additive)

  • Key idea for achievability:
  • Self-hashing
  • Pairwise-hashing [Jag06]

(Figure: each link i carries its payload Xi together with a key Ki and hash Hi; a write-only (ZWO) corruption turns Xi into Yi, which fails the hash check.)

SLIDE 30

Multi-round transmission without feedback (additive)

  • Key idea for achievability:
  • Self-hashing
  • Pairwise-hashing [Jag06]

(Figure: each link i carries its payload Xi, random keys Kij, and pairwise hashes Hij of the other links’ payloads.)

SLIDE 31

Pairwise-hashing

  • What’s the hash function? Each payload Xi is N symbols over Fq; the hash is a “linearized polynomial” in those symbols (p the field characteristic).

SLIDE 32
Key idea for achievability: Pairwise-hashing [Jag06]

  • Case 1: Link 1 and Link 2 are pairwise-consistent — each link’s hash of the other’s payload checks out.

(Figure: payloads X1, X2, X3 with keys Kij and hashes Hij.)

SLIDE 33

Key idea for achievability: Pairwise-hashing [Jag06]

  • Case 2: Calvin rewrites link 2 (a ZRW link), so Bob receives Y2; Link 2 and Link 3 are not pairwise-consistent (the recomputed hash H’32 does not match).

(Figure: as before, with corrupted keys/hashes on link 2.)

SLIDE 34

Pairwise-hashing Analysis

  • Receiver Bob:
  • Construct a graph with C vertices.
  • Connect two vertices if consistent.
  • Find the largest clique (count node-degree).

(Figure: consistency graph on 9 vertices; the uncorrupted links form the largest clique.)
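Bob's three steps above can be sketched as follows; the `consistent` predicate here is a stand-in for the actual pairwise hash comparisons of the preceding slides:

```python
# Toy sketch of the decoder's clique step.
def largest_clique_by_degree(C: int, consistent) -> set:
    """Build the consistency graph on C vertices and keep high-degree ones."""
    degree = {i: sum(consistent(i, j) for j in range(C) if j != i)
              for i in range(C)}
    # Honest links are mutually consistent, so they form a large clique;
    # counting node-degree suffices to separate them when they are a majority.
    return {i for i, d in degree.items() if d >= C // 2}

# Example with 9 links: links 0-5 honest, links 6-8 controlled by Calvin
# (Calvin keeps his own links mutually consistent, but they clash with
# the honest majority).
honest = set(range(6))
def consistent(i, j):
    return (i in honest) == (j in honest)

print(sorted(largest_clique_by_degree(9, consistent)))  # [0, 1, 2, 3, 4, 5]
```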

SLIDE 35

Main Results

  • E.g., additive jamming:
  • Calvin’s clique: zrw + zro vertices.
  • Encoder’s clique: C − zrw − zwo vertices.

(Figure legend: zrw, zro, zwo, and “untouched” links.)

SLIDE 36
Key idea for achievability: Pairwise-hashing [Jag06]

  • Decoder: check pairwise-consistency.
  • Errors are detectable if C − zrw − zwo > zrw + zro.
  • Achievable rate: R = C − zrw − zwo = C − ZW.

(Figure: consistency graph G; edges marked consistent (“yes”) or not (“no”), with ZRW/ZRO/ZWO links indicated.)

SLIDE 37

Converse: “Stochastic” symmetrization

  • “Stochastic” Singleton-type bound: Calvin swaps in consistent fake symbols so that Bob cannot distinguish message m from m’.

(Figure: true codeword (X1, X2, X3, X4, X5) for m vs. fake (X1, X2’, X3’, X4’, X5’) for m’; Bob may receive either (X1, X2’, X3’, X4, X5) or (X1, X2, X3, X4’, X5’).)

E.g., overwrite jamming with C = 5, Zro = 1, Zwo = 2, Zrw = 0: since C ≤ zro + 2(zwo + zrw), we get R ≤ C − 2(zwo + zrw) = 1.

SLIDE 38

Stochastic Singleton bound

  • Calvin observes (first) Zro links
  • Picks (consistent) X’(m,r)~Pr(X(m,r)|xro)
  • (Not necessarily uniform)
  • Picks (uniformly) one of two subsets to be zwo
  • Transmits symbols from X’(m,r) on Zwo
  • To prove: Bob is confused between two alternatives.

(Figure: Bob’s two indistinguishable receptions, as on the previous slide.)

SLIDE 39
Stochastic Singleton bound

  • To prove: Bob is confused between two alternatives.
  • Rate too high ⇒ sufficiently large uncertainty in the message.
  • Sufficiently large uncertainty in the message ⇒ Calvin’s fake message differs from the true message (Fano’s inequality).
  • Bayes’ theorem ⇒ both messages are equally likely given the Y observed by Bob.

SLIDE 40

Multi-round transmission with passive feedback

  • Two-phase code (works for …)
  • Phase 1: erasure code (handles ZW erasures).
  • Phase 2:
    • Uncorrupted links: send random keys and hashes.
    • Corrupted links: send random vectors.

(Figure: phase-1 transmissions x1(1), x2(1), x3(1) and receptions y1(1), y2(1), y3(1); phase-2 keys K1, K2, K3 and hashes H1, H2, H3.)

SLIDE 41

Multi-round transmission with passive feedback

  • Weak adversary regime:
  • Two-phase code
  • Strong adversary regime:
  • Converse: Symmetrization argument

(Figure: the rate-region polytope with vertices O, V1, …, V4 in (ZRO, ZWO, ZRW) space.)

SLIDE 42

Summary of Results

(Figure: summary table of optimal rates, with an example.)

SLIDE 43

Addenda

  • Reliability and secrecy: message rate decreases by ZR.
  • Computationally efficient encoding and decoding.
  • Unequal link-capacity networks: waterfilling.
  • Information-theoretically optimal.
SLIDE 44

SLIDE 45

Unknown knowns part II: End-to-End Error-Correcting Codes in Networks with Worst-Case Symbol Errors

Qiwen Wang, Sidharth Jaggi

SLIDE 46

Networks with Noise

Prior work and this work, by setting:
  • Noiseless: throughput [ACLY00]; computationally efficient [LYC03], [KM03]; distributed [HKMKE03].
  • Noisy, packet errors (random and arbitrary): throughput [YC06], [YYZ08]; computationally efficient [SKK10], [SKK08], [SK09].
  • Noisy, symbol errors: random [SYC06]; arbitrary — this work.

(Figure: source-S-to-sink-t network carrying bit-streams.)

SLIDE 47

Networks with Noise

(Same taxonomy table as the previous slide; figure: the S-to-t network, now with corrupted bit-streams.)

[YC06] R. W. Yeung and N. Cai. Network error correction, part I: basic concepts and upper bounds. Communications in Information and Systems, 6(1):19–36, 2006.
[YYZ08] S. Yang, R. W. Yeung, and Z. Zhang. Weight properties of network codes. European Transactions on Telecommunications, 19(4):371–383, 2008.
[SKK08] D. Silva, F. R. Kschischang, and R. Kötter. A rank-metric approach to error control in random network coding. IEEE Transactions on Information Theory, 54(9):3951–3967, 2008.
[SK09] D. Silva and F. R. Kschischang. On metrics for error correction in network coding. IEEE Transactions on Information Theory, 55(12):5479–5490, 2009.
[SKK10] D. Silva, F. R. Kschischang, and R. Kötter. Communication over finite-field matrix channels. IEEE Transactions on Information Theory, 56(3):1296–1305, 2010.

SLIDE 48

Networks with Noise

(Same table, now also citing [B02] alongside [SYC06] for random symbol errors; figure: the network with per-link encoder/decoder pairs.)

[B02] S. P. Borade. Network information flow: limits and achievability. In Proc. IEEE International Symposium on Information Theory, Lausanne, Switzerland, June 2002.
[SYC06] L. Song, R. W. Yeung, and N. Cai. A separation theorem for single-source network coding. IEEE Transactions on Information Theory, 52(5):1861–1871, 2006.

SLIDE 49

Worst-case Noise: Example

(Figure: network from source S to sink t with min-cut C; each link carries N bits, shown as bit-streams.)

SLIDE 50

Worst-case Noise: Example

(Figure: the same network with corrupted bit-streams; min-cut C, number of links E = 2C, N bits on each link.)

Out of 2CN bits in the network, p·2CN bits are flipped. What is the rate region, and what schemes achieve it, for this noisy network (rates normalized by CN)?

SLIDE 51

Revisit: Point-to-Point Communication

(Figure: length-n input X, length-n output Y, with pn worst-case bit-flips in between.)

SLIDE 52

Revisit: Point-to-Point Communication

(Figure: length-n input X, length-n output Y, pn bit-flips; the rate-vs-p plot is divided into regions: computationally efficiently achievable, achievable, unknown (?), and non-achievable.)

SLIDE 53

Benchmark 1

Link-by-link error-correcting codes (Gilbert-Varshamov construction).

(Figure: each link runs its own encoder/decoder; in the worst case all p·2CN bit-flips concentrate on one link, i.e. up to 2pC·N flips on a single N-bit link.)

Achievable rate: R = 1 − H(4Cp).
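A quick numeric check of the benchmark rate (helper names are ours; H is the binary entropy; a GV code correcting an adversarial error fraction δ has rate 1 − H(2δ), and here δ = 2Cp per link):

```python
import math

def H(x: float) -> float:
    """Binary entropy; H(0) = H(1) = 0."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def benchmark1_rate(C: int, p: float) -> float:
    """Benchmark 1: each N-bit link may suffer up to 2pC*N worst-case flips,
    so a per-link GV code gives rate 1 - H(4Cp), or 0 once 4Cp >= 1/2."""
    delta = 4 * C * p
    return 0.0 if delta >= 0.5 else 1.0 - H(delta)

print(round(benchmark1_rate(C=2, p=0.01), 4))  # 0.5978
```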

SLIDE 54

Benchmark 2

(Figure: each link again runs its own encoder/decoder, now designed for up to (2pC/k)·N bit-flips per link.)

Per-link rate: R_link = 1 − H(4Cp/k).

SLIDE 55

Benchmark 2

(Figure: since the total number of bit-flips is p·2CN, at most k links can exceed (2pC/k)·N flips; an outer end-to-end code handles those links.)

At most k links corrupted, so

R = (1 − 2k/C)·R_link = (1 − 2k/C)·(1 − H(4Cp/k)).

SLIDE 56

Worst-case Noise: Example

(Figure: same S-to-t network — min-cut C, number of links E = 2C, N bits per link; out of 2CN bits in the network, p·2CN are flipped; internal operations over F_{2^m}.)

What is the rate region, and what schemes achieve it, for this noisy network (rates normalized by CN)?

SLIDE 57

Main Results

Converses:
  • Hamming-type bound: R ≤ 1 − (E/C)·H(p).
  • Plotkin-type bound: R ≤ 1 − 2p·E/C; if p ≥ (C/(2E))·(1 − C/(2E)), then R = 0.
  • Elias-Bassalygo-type bound: R ≤ 1 − (E/C)·H((1 − √(1 − 4p))/2).

Achievable schemes:
  • Gilbert-Varshamov: coherent and non-coherent GV-type codes achieve rates at least 1 − (E/C)·H(2p), with decoding complexity 2^{O(n)}.
  • Zyablov: concatenated network codes achieve rates at least max over inner rates r of r·(1 − 2p·(E/C)/H⁻¹(1 − r)), with decoding complexity n^{O(1)}.

  • Coherent: the internal coding coefficients are known in advance.
  • Non-coherent: the internal coding coefficients are unknown in advance.
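Using the Hamming-type converse R ≤ 1 − (E/C)·H(p) and the GV-type achievable rate 1 − (E/C)·H(2p) from this slide (our reading of the garbled formulas), a quick numeric comparison at the parameters E = 100, C = 50 used in the plots on the following slides:

```python
import math

def H(x: float) -> float:
    """Binary entropy; H(0) = H(1) = 0."""
    return 0.0 if x <= 0.0 or x >= 1.0 else -x*math.log2(x) - (1-x)*math.log2(1-x)

def hamming_bound(p: float, E: int, C: int) -> float:
    """Converse (our reading of the slide): R <= 1 - (E/C) * H(p)."""
    return max(0.0, 1.0 - (E / C) * H(p))

def gv_rate(p: float, E: int, C: int) -> float:
    """Achievability (our reading): GV-type codes reach 1 - (E/C) * H(2p)."""
    d = 2 * p
    return 0.0 if d >= 1.0 else max(0.0, 1.0 - (E / C) * H(d))

E, C = 100, 50  # parameters matching the plots on the next slides
for p in (0.001, 0.01, 0.05):
    print(f"p={p}: GV rate >= {gv_rate(p, E, C):.3f}, "
          f"Hamming bound <= {hamming_bound(p, E, C):.3f}")
```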

SLIDE 58

Main Results

(Figure: the bounds plotted for E = 100, C = 50, m = 2.)

SLIDE 59

Main Results

(Figure: the bounds plotted for E = 100, C = 50, m = 3.)

SLIDE 60

Main Results

(Figure: the bounds plotted for E = 9, C = 2, m = 2.)

SLIDE 61

Model

(Figure: source α to sink β; min-cut C, each link carries N bits; internal coding over F_{2^m}.)

SLIDE 62

Model

(Figure: equivalently, each link carries n = N/m symbols over F_{2^m}.)

[KM03] R. Kötter and M. Médard. An algebraic approach to network coding. IEEE/ACM Transactions on Networking, 2003.
[HKMKM03] T. Ho, R. Kötter, M. Médard, D. R. Karger, and M. Effros. The benefits of coding over routing in a randomized setting. In Proc. IEEE International Symposium on Information Theory, Yokohama, Japan, June 2003.

SLIDE 63

Model

(Figure: end-to-end, the sink receives Y = T·X, with X the C×n matrix of transmitted symbols and T the C×C network transform over F_{2^m}.)

SLIDE 64

Finite field to binary field

  • One packet = n symbols s1 s2 … sn over F_{2^m}; writing each symbol si as m bits bi1 bi2 … bim, a packet transmits mn bits, i.e. an m×n binary matrix over F_2.

(Figure: example S-to-t network with links 1, 2, 3, 4; a packet X1 over F_4 shown both as F_4 symbols and as the corresponding binary matrix.)

[JEHM04] S. Jaggi, M. Effros, T. Ho, and M. Médard. On linear network coding. In Proc. 42nd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, 2004.

SLIDE 65

Finite field to binary field

(Figure: worked example with coding coefficients f_{1,3} = 1 and f_{2,4} = 1: transmitted X = (2 1 3 2 2) over F_4, network transform T, and received Y = TX, shown both over F_4 and as the equivalent binary matrices over F_2.)

[JEHM04] S. Jaggi, M. Effros, T. Ho, and M. Médard. On linear network coding. In Proc. 42nd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, 2004.

SLIDE 66

Finite field to binary field

  • Key fact: multiplication by a fixed t ∈ F_{2^m} is F_2-linear, so it can be written as an m×m binary matrix T over F_2 — computing t·x over F_{2^m} equals computing T·x over F_2.

[JEHM04] S. Jaggi, M. Effros, T. Ho, and M. Médard. On linear network coding. In Proc. 42nd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, 2004.
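A small self-contained check (our own bit encoding of F_4, not from the slides) of the key fact above: multiplying by w ∈ F_4 acts on the bit representation exactly as the binary matrix [[1,1],[1,0]].

```python
# F_4 = {0, 1, w, w+1} with w^2 = w + 1; element x = a*w + b stored as bits (a, b).
def f4_mul(x, y):
    """Multiplication in F_4 via polynomial arithmetic mod w^2 + w + 1."""
    a, b = x
    c, d = y
    # (a w + b)(c w + d) = ac w^2 + (ad + bc) w + bd, and w^2 = w + 1
    hi = (a * c + a * d + b * c) % 2
    lo = (a * c + b * d) % 2
    return (hi, lo)

def matmul_w(bits):
    """Multiplication by w as the binary matrix [[1,1],[1,0]] acting on (a, b)."""
    a, b = bits
    return ((a + b) % 2, a % 2)

w = (1, 0)
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert f4_mul(w, x) == matmul_w(x)   # the two descriptions agree
print("multiplication by w over F_4 = binary matrix [[1,1],[1,0]] over F_2")
```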

SLIDE 67

Noiseless Network

(Figure: over F_{2^m}, the received C×n matrix equals the C×C network transform times the transmitted C×n matrix.)

SLIDE 68

Noiseless Network

(Figure: over F_2, the same map becomes a Cm×Cm binary transform acting on the Cm×n binary matrix of transmitted bits.)

SLIDE 69

With Noise

(Figure: the same network, now with errors injected on the links.)

SLIDE 70

Noise Model

(Figure: worst-case bit-flip error matrix Z, of size Em×n: no more than pEmn ones, arbitrarily distributed; E = number of edges in the network.)

SLIDE 71

Noise Model

(Figure: the m rows of Z corresponding to edge 1 hold the error bits on the 1st edge; example S-to-t network with links 1–4 and a sample binary Z over F_2.)

SLIDE 72

Noise Model

(Figure: same as the previous slide, highlighting the error bits on the 1st edge.)

SLIDE 73

(Figure: the noiseless worked example again — f_{1,3} = 1, f_{2,4} = 1, X = (2 1 3 2 2) over F_4, and Y = TX shown over F_4 and F_2.)

SLIDE 74

(Figure: the same example with noise: the sink now receives Y = TX + Z, e.g. Y = (3 3 3 2 2) over F_4; over F_2 this reads Y = T̂·X + (binary error terms), where T̂ is the binary expansion of T.)

SLIDE 75

(Figure: over F_2, the Cm×n received matrix Y equals T̂ (Cm×Cm) times X (Cm×n), plus a Cm×Em binary transfer matrix times the error matrix Z (Em×n).)

SLIDE 76

(Figure: a single 1 in Z — one bit-flip on one link — adds the corresponding column of the network’s binary transfer matrix to Y.)

SLIDE 77

(Figure: multiple 1s in Z add the corresponding columns of the transfer matrix.)

SLIDE 78

Transform Metric

(Figure: Y = T̂X + error terms; if the i-th column of Y differs from the i-th column of T̂X, that difference must be a sum of at least d_i transfer-matrix columns.)

Claim: d_i is a distance metric.

SLIDE 79

Transform Metric

(Figure: transform-metric illustration, continued.)

SLIDE 80

Transform Metric

(Figure: transform-metric illustration, continued.)

SLIDE 81

Transform Metric

(Figure: as above.)

Claim: d_T(T̂X, Y) = Σ_{i=1}^{n} d_i is a distance metric.

SLIDE 82

Main Results

Converses:
  • Hamming-type bound: R ≤ 1 − (E/C)·H(p).
  • Plotkin-type bound: R ≤ 1 − 2p·E/C; if p ≥ (C/(2E))·(1 − C/(2E)), then R = 0.
  • Elias-Bassalygo-type bound: R ≤ 1 − (E/C)·H((1 − √(1 − 4p))/2).

Achievable schemes:
  • Gilbert-Varshamov: coherent and non-coherent GV-type codes achieve rates at least 1 − (E/C)·H(2p), with decoding complexity 2^{O(n)}.
  • Zyablov: concatenated network codes achieve rates at least max over inner rates r of r·(1 − 2p·(E/C)/H⁻¹(1 − r)), with decoding complexity n^{O(1)}.

  • Coherent: the internal coding coefficients are known in advance.
  • Non-coherent: the internal coding coefficients are unknown in advance.

SLIDE 83

Gilbert-Varshamov-Type Bound (coherent)

(Figure: greedy packing of the space of all Cm×n binary matrices with balls of radius 2pEmn around codewords TX(1), TX(2), TX(3), ….)

SLIDE 84
  • Need an upper bound on the volume of the ball B_T̂(TX, 2pEmn).
  • The number of different Y — equivalently, of different ẐZ — is bounded above by the number of different Z, which equals Σ_{i ≤ 2pEmn} C(Emn, i).
  • The summation is bounded above by (2pEmn + 1)·2^{H(2p)·Emn}.
  • Lower bound on the size of the codebook: 2^{Cmn} / [(2pEmn + 1)·2^{H(2p)·Emn}] = 2^{Cmn·(1 − (E/C)·H(2p)) − log(2pEmn + 1)}.
  • Asymptotically in n, the rate of coherent GV-type codes is at least 1 − (E/C)·H(2p).

SLIDE 85

SLIDE 86

Unknown knowns part III: Arbitrarily Varying Networks

Peida Tian, Oliver Kosut, Sidharth Jaggi

SLIDE 87

Background – Related Work

Node-based jamming adversary:
  • Calvin can eavesdrop on all links and jam the outgoing links of any A nodes; goal: reliable communication.
  • Upper bounds: bounds inherited from the link-based adversary (too pessimistic); cut-set bound [Kosut et al] (not tight in general).
  • Lower bounds (achievability): routing bounds [Che et al] (unicast); polytope codes [Kosut et al].

SLIDE 88

Shared secrets – “Arbitrarily Varying Networks”

  • Calvin: eavesdrop on all links, jam the outgoing links of any A nodes; goal: reliable communication.
  • Higher rate possible — how about negligible shared secrets between the source and every node?

SLIDE 89

Shared secrets – “Arbitrarily Varying Networks”

(Figure: example network; min-cut = 2 after deleting the adversarial node.)

  • Capacity: natural “erasure” outer bound.
  • Code strategy: authenticate packets; intermediate nodes verify and delete corrupted packets.
  • Challenge

SLIDE 90

Shared secrets – “Arbitrarily Varying Networks”

  • Key tool: hash function h(⋅) based on a linearized polynomial.
  • Idea: verify any linear combination b·Y1 + c·Y2 using the hashes of Y1 and Y2; detect & delete corrupted packets.
  • Our code: computationally efficient and rate-optimal.

SLIDE 91

Shared secrets – “Arbitrarily Varying Networks”

Sketch of the hash functions:

h(Y1, t1) = t1² + Σ_{l=1}^{o} y_{1l}·t1^{q^l},   h(Y2, t2) = t2² + Σ_{l=1}^{o} y_{2l}·t2^{q^l}

  • h(b·Y1 + c·Y2, t1) can be computed from h(Y1, t1), h(Y1, t2), h(Y2, t1), h(Y2, t2), using:
  • properties of linearized polynomials, and
  • the Schwartz-Zippel lemma.
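A toy numeric sketch of the linearized-polynomial property behind these hashes. We use our own miniature field GF(9) and the plain linearized part Σ_l y_l·t^{q^l} (without the slide's extra t² term): since t ↦ t^{q^l} is additive in characteristic q, the hash is additive in its evaluation point.

```python
# Toy sketch over GF(9) = F_3[x]/(x^2 + 1): elements are pairs (a, b) = a + b*x.
P = 3
def add(u, v): return ((u[0] + v[0]) % P, (u[1] + v[1]) % P)
def mul(u, v):
    a, b = u; c, d = v
    return ((a * c - b * d) % P, (a * d + b * c) % P)   # using x^2 = -1
def power(u, e):
    r = (1, 0)
    for _ in range(e): r = mul(r, u)
    return r

def h(Y, t):
    """Linearized-polynomial hash h(Y, t) = sum_l y_l * t^(3^l) over GF(9)
    (a stripped-down stand-in for the hash on the slide)."""
    acc = (0, 0)
    for l, y in enumerate(Y, start=1):
        acc = add(acc, mul(y, power(t, P ** l)))
    return acc

Y = [(1, 2), (0, 1), (2, 2)]
t1, t2 = (1, 1), (2, 0)
# Frobenius maps t -> t^(3^l) are additive in characteristic 3, so:
assert h(Y, add(t1, t2)) == add(h(Y, t1), h(Y, t2))
print("h(Y, t1 + t2) == h(Y, t1) + h(Y, t2)")
```

This additivity (plus the Schwartz-Zippel lemma to bound forgery probability) is what lets intermediate nodes verify linear combinations of packets from the constituent hashes.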

SLIDE 92

SLIDE 93

Unknowns… Less well-understood

At least to me…

SLIDE 94

Layers of secrecy

  • Anonymity: “Who is hiding something?”
  • Secrecy (IT or crypto): “What is s/he hiding?”
  • Deniability/steganography: “Is s/he hiding something?”

SLIDE 95

Motivating Scenario

Edward (e.g. whistleblower) wants to talk to Glenn (journalist) without being caught by Nancy (oppressive regime).

  • Reliability/secrecy: “What could Ed be talking about?”
  • Deniability: “Is Ed talking to Glenn?”
  • Hidability: “Is Ed a cat lover or a whistleblower?”

SLIDE 96

Layers of robustness

  • Network-error correction – what is s/he saying?
  • Network function computation – what does s/he mean?
  • Network tomography – who’s messing with us?
SLIDE 97

Future work…

SLIDE 98