Fundamental Limits of Distributed Encoding
Nastaran Abadi Khooshemehr, Mohammad Ali Maddah-Ali
Sharif University of Technology
International Symposium on Information Theory (ISIT) 2020, June 2020
Classical Coding

[Figure: a source transmits over a channel.]

Shannon approach: probabilistic errors.
Hamming approach: adversarial errors.
Singleton bound: if $B_q(n, d)$ is the maximum number of possible codewords in a $q$-ary block code of length $n$ and minimum distance $d$, then
$$B_q(n, d) \le q^{\,n-d+1}.$$

Gilbert–Varshamov bound: with $B_q(n, d)$ as above,
$$B_q(n, d) \ge \frac{q^n}{\sum_{j=0}^{d-1} \binom{n}{j} (q-1)^j}.$$

Griesmer bound: if $n(k, d)$ is the minimum length of a binary code of dimension $k$ and minimum distance $d$, then
$$n(k, d) \ge \sum_{i=0}^{k-1} \left\lceil \frac{d}{2^i} \right\rceil.$$

and many more …
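As a quick numerical sanity check of these bounds, here is a small sketch (the function names are mine, not from the talk):

```python
from math import comb, ceil

def singleton_bound(q, n, d):
    # Singleton: B_q(n, d) <= q^(n - d + 1)
    return q ** (n - d + 1)

def gilbert_varshamov_bound(q, n, d):
    # Gilbert-Varshamov: B_q(n, d) >= q^n / sum_{j=0}^{d-1} C(n, j)(q - 1)^j
    denom = sum(comb(n, j) * (q - 1) ** j for j in range(d))
    return q ** n / denom

def griesmer_bound(k, d):
    # Griesmer: a binary code of dimension k and minimum distance d needs
    # length at least sum_{i=0}^{k-1} ceil(d / 2^i)
    return sum(ceil(d / 2 ** i) for i in range(k))

# Binary codes with n = 7, d = 3:
print(singleton_bound(2, 7, 3))          # 32
print(gilbert_varshamov_bound(2, 7, 3))  # 128/29, so at least 5 codewords exist
print(griesmer_bound(4, 3))              # 7, met by the [7, 4, 3] Hamming code
```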
[Figure: instead of a single source and channel, the source data is divided into shards (Shard 1, Shard 2, Shard 3, …).]
[Figure: distributed source nodes (Source node 1, Source node 2, Source node 3) and a set of encoding nodes.]

The decoder connects to some of the encoding nodes.
[Figure: an adversarial source node sends different messages to different encoding nodes.]

With more variables than equations, it is impossible to decode.
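A toy illustration of the "more variables than equations" failure (the field size and vectors are my own, not from the talk):

```python
# Toy example: over GF(5), each encoding node stores a linear combination
# of the three source messages. If the decoder only collects two
# independent equations for three unknowns, two different message vectors
# explain the same observations, so decoding is impossible.
P = 5  # a small prime field, chosen for illustration

def encode(coeffs, messages):
    # One encoding node's stored symbol: <coeffs, messages> mod P
    return sum(c * m for c, m in zip(coeffs, messages)) % P

equations = [(1, 1, 1), (1, 2, 3)]  # two equations, three unknowns
y1 = (0, 1, 2)
y2 = (1, 4, 3)  # y1 plus (1, 3, 1), a null-space vector of both equations

obs1 = [encode(c, y1) for c in equations]
obs2 = [encode(c, y2) for c in equations]
print(y1 != y2, obs1 == obs2)  # True True: indistinguishable to the decoder
```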
[Figure: source nodes, encoding nodes, and the decoder.]

We need the decoder to decode the messages of the honest nodes correctly. The decoder has no information about the adversaries or their behavior. We do not care about the messages of the adversaries.
Notation:
K: the number of adversarial source nodes
L: the number of source nodes
w: the maximum number of distinct messages of one adversarial source node
N: the number of encoding nodes
u: the number of encoding nodes that the decoder needs to connect to

Running example: L = 3, N = 5, K = 1, w = 3.

(Informally: at least how many encoding nodes does the decoder need to connect to?)
Theorem (general code)
In an (N, L, K, w) distributed encoding system, u* = L + K(w − 1) + 1.
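Evaluated on the running example (L = 3, K = 1, w = 3), assuming the threshold formula above:

```python
def u_star(L, K, w):
    # General-code threshold: u* = L + K(w - 1) + 1
    return L + K * (w - 1) + 1

print(u_star(3, 1, 3))  # 6
```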
The achievable scheme: encoding node $i$ computes
$$g(y_{i1}, \ldots, y_{iL}) = \prod_{j=1}^{L} \cdots$$
where the coefficients $\beta_{i1}, \ldots, \beta_{iL}$ are chosen independently and uniformly at random from the field: a nonlinear code with a nice structure.
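The role of the i.i.d. uniform coefficients can be illustrated with a hedged sketch (my own example, not the talk's exact construction): over GF(q), a random linear functional separates any two fixed distinct message vectors for all but a 1/q fraction of coefficient choices.

```python
from itertools import product

q, L = 5, 3
y1, y2 = (0, 1, 2), (4, 4, 0)  # two arbitrary distinct message vectors

# Count coefficient vectors beta for which the two message vectors
# produce the same encoded symbol sum_j beta_j * y_j (mod q).
collisions = 0
for beta in product(range(q), repeat=L):
    s1 = sum(b * y for b, y in zip(beta, y1)) % q
    s2 = sum(b * y for b, y in zip(beta, y2)) % q
    if s1 == s2:
        collisions += 1

print(collisions, q ** L)  # 25 125: exactly a 1/q = 1/5 fraction collide
```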
We consider a partitioning of the encoding nodes and form a set of nonlinear equations. In some steps, we transform it into another set of nonlinear equations. We then use Bézout's theorem to bound the number of feasible but undesirable solutions among all options for the messages.
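For reference, the standard form of Bézout's bound invoked here (a general statement with my own assumptions, since the talk's exact variant is not shown):

```latex
% Bézout bound: if f_1, \dots, f_n are polynomials in n variables over an
% algebraically closed field and their common solution set is finite, then
\[
\#\{\, x : f_1(x) = \cdots = f_n(x) = 0 \,\}
  \;\le\; \prod_{i=1}^{n} \deg f_i .
\]
```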
The decoder does not know the adversaries or their behavior, so it would be confused between two contradicting feasible solutions.
Theorem (linear code)
In an (N, L, K, w) distributed encoding system where f_1, …, f_N are linear functions, u*_linear = L + 2K(w − 1).
Recall the general-code theorem: u* = L + K(w − 1) + 1, strictly smaller than u*_linear whenever K(w − 1) > 1. Linear code is not good enough!
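The gap between the two thresholds, assuming the two formulas above (helper names are mine):

```python
def u_star_general(L, K, w):
    # General (possibly nonlinear) codes: u* = L + K(w - 1) + 1
    return L + K * (w - 1) + 1

def u_star_linear(L, K, w):
    # Linear codes only: u*_linear = L + 2K(w - 1)
    return L + 2 * K * (w - 1)

# The gap u*_linear - u* = K(w - 1) - 1 is nonnegative whenever an
# adversary can equivocate at all (K >= 1, w >= 2).
for L, K, w in [(3, 1, 3), (5, 2, 2), (10, 3, 4)]:
    gap = u_star_linear(L, K, w) - u_star_general(L, K, w)
    print(L, K, w, gap)  # gap = K*(w - 1) - 1
```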
Conclusion: in distributed encoding, adversarial source nodes may send inconsistent messages to the encoding nodes; general (nonlinear) codes tolerate this with fewer decoder connections than linear codes.