

SLIDE 1

CS573 Data Privacy and Security Secure Multiparty Computation

General Constructions Li Xiong

SLIDE 2

Last Lecture

  • Symmetric & Public key encryption
  • Secure Multiparty Computations
  • Problem and security definitions
  • General constructions
  • Oblivious Transfer
SLIDE 3

Secure Multiparty Computation

  • A set of parties with private inputs
  • Parties wish to jointly compute a function of

their inputs so that certain security properties are preserved

  • Properties must be ensured even if some of the

parties maliciously attack the protocol

  • Can model any cryptographic task
SLIDE 4

Security Requirements

  • Consider a secure auction (with secret bids):

– An adversary may wish to learn the bids of all parties – to prevent this, require PRIVACY
– An adversary may wish to win with a lower bid than the highest – to prevent this, require CORRECTNESS
– But the adversary may also wish to ensure that it always gives the highest bid – to prevent this, require INDEPENDENCE OF INPUTS
– An adversary may try to abort the execution if its bid is not the highest – require FAIRNESS

SLIDE 5

Security Requirements

  • Privacy: only the output is revealed
  • Correctness: the function is computed correctly
  • Independence of inputs: parties cannot choose inputs based on others’ inputs
  • Fairness: if one party receives output, all parties receive output
  • Guaranteed output delivery
SLIDE 6

Defining Security

  • Option 1: analyze security concerns for each

specific problem

– Auctions: as in the previous slide
– Elections: privacy, correctness and fairness only (?)

  • Problems:

– How do we know that all concerns are covered?
– Definitions are application dependent and need to be redefined from scratch for each task

SLIDE 7

Defining Security

  • Option 2: general definition that captures all

(most) secure computation tasks

  • Properties of any such definition

– Well-defined adversary model

  • Semi-honest, Malicious

– Well-defined execution setting

  • Stand-alone, concurrent general composition

– Security guarantees are clear and simple to understand

SLIDE 8

Defining Security: the Ideal/Real Paradigm

  • What is the best we could hope for?

– An incorruptible trusted party
– All parties send their inputs to the trusted party (over perfectly secure communication lines)
– The trusted party computes the output
– The trusted party sends each party its output (over perfectly secure communication lines)
– This is the ideal world

  • What can an adversary do?

– Just choose its input…

  • Semi-honest: simulator given input/output

generates the adversary’s view

SLIDE 9

Today

  • Cont. Secure Multiparty Computations
  • Problem and security definitions
  • General constructions
SLIDE 10

Construction paradigms

  • We sketch a couple of paradigms used in the

construction of secure multiparty protocols.

  • Passively-secure computation for two-parties

– Use oblivious transfer to securely select a value

  • Passively-secure computation with shares

– Use a secret sharing scheme such that the data can be reconstructed from a sufficient number of shares

  • From passively-secure protocols to actively-

secure protocols

– Use zero-knowledge proofs to force parties to behave in a way consistent with the passively-secure protocol

SLIDE 11

Secret Sharing Scheme

  • Distributing a secret amongst n participants,

each of whom is allocated a share of the secret

  • The secret can be reconstructed only when a

sufficient number (t) of shares are combined together

– (t, n)-threshold scheme

SLIDE 12

Trivial Secret Sharing Scheme

  • Splitting

– Encode the secret as an integer S.
– Give each player i (except one) a random integer ri.
– Give the last player the number S − (r1 + r2 + … + rn−1).
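The splitting step above can be sketched in a few lines of Python. This is a minimal illustration (the `split_secret` helper and the 2^32 modulus are assumptions for concreteness, not part of the slides): n−1 players get random values and the last share is chosen so the sum recovers S.

```python
import secrets

MOD = 2**32  # work modulo a fixed value so shares stay bounded

def split_secret(secret: int, n: int) -> list:
    """Trivial n-out-of-n splitting: n-1 random shares, last share fixes the sum."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % MOD)  # S - (r1 + ... + r_{n-1})
    return shares

def reconstruct(shares: list) -> int:
    """All n shares are needed; any subset reveals nothing about the secret."""
    return sum(shares) % MOD

shares = split_secret(42, 5)
assert reconstruct(shares) == 42
```

Note that this is an (n, n)-threshold scheme: all shares are required, and any n−1 of them are jointly uniform.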

SLIDE 13

(t, n) threshold scheme

  • Shamir’s scheme (1979)

– It takes t points to define a polynomial of degree t−1
– Create a degree t−1 polynomial with the secret as the constant coefficient and the remaining coefficients picked at random
– Find n points on the curve and give one to each of the players
– At least t points are required to fit the polynomial
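A minimal Python sketch of the scheme described above, assuming arithmetic over a prime field (the prime and helper names are illustrative choices, not from the slides):

```python
import secrets

P = 2**127 - 1  # a Mersenne prime; all arithmetic is over GF(P)

def make_shares(secret: int, t: int, n: int) -> list:
    """Degree t-1 polynomial with the secret as constant term; n points on it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(points: list) -> int:
    """Lagrange interpolation at x = 0 from any t points."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234  # any 3 of the 5 shares suffice
```

Fewer than t points are consistent with every possible secret, which is what makes the threshold information-theoretic.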

SLIDE 14

The GMW Paradigm

  • “Can we design protocols that remain secure even when some parties behave maliciously?”*

  • GMW (Goldreich, Micali and Wigderson)
  • Paradigm for designing secure computation

protocols against malicious adversaries

  • Secure computation for more than two

parties, computing Boolean circuits

* Secure Multi-Party Computation, by M.M. Prabhakaran

SLIDE 15

The GMW Paradigm

  • Construct a protocol for the semi-honest

model

  • “Compile it” to obtain a protocol that is secure

for the malicious model

– Compilation involves forcing the parties to follow the protocol

  • It may be more efficient to work differently
SLIDE 16

General GMW Construction

  • For simplicity – consider two-party case
  • Let f be the function that the parties wish to

compute

  • Represent f as an arithmetic circuit with

addition and multiplication gates

  • Aim – compute gate-by-gate, revealing only

random shares each time

SLIDE 17

Random Shares Paradigm

  • Let a be some value:

– Party 1 holds a random value a1
– Party 2 holds a+a1
– Note that without knowing a1, a+a1 is just a random value revealing nothing of a
– We say that the parties hold random shares of a

  • The computation will be such that all

intermediate values are random shares (and so they reveal nothing).
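For bit values, the addition in the sharing above is XOR (addition mod 2). A minimal sketch of creating random shares of a bit, assuming the two-party Boolean case:

```python
import secrets

def share_bit(a: int) -> tuple:
    """Split bit a into two XOR-shares; each share alone is a uniform bit."""
    a1 = secrets.randbelow(2)  # Party 1's share: uniformly random
    a2 = a ^ a1                # Party 2's share: a + a1 (mod 2)
    return a1, a2

a1, a2 = share_bit(1)
assert a1 ^ a2 == 1  # shares reconstruct a, but each alone reveals nothing
```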

SLIDE 18

Circuit Computation

  • Stage 1: each party randomly shares its input with

the other party

  • Stage 2: compute gates of circuit as follows

– Given random shares to the input wires, compute random shares of the output wires

  • Stage 3: combine shares of the output wires in order

to obtain actual output

[Circuit diagram: a Boolean circuit of AND, OR and NOT gates over Alice’s inputs and Bob’s inputs]

SLIDE 19

Addition Gates

  • Input wires to gate have values a and b:

– Party 1 has shares a1 and b1
– Party 2 has shares a2 and b2
– Note: a1+a2=a and b1+b2=b

  • To compute random shares of output c=a+b

– Party 1 locally computes c1=a1+b1
– Party 2 locally computes c2=a2+b2
– Note: c1+c2=a1+a2+b1+b2=a+b=c
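The local computation above can be sketched directly (XOR as addition mod 2; the `add_gate` helper is an illustrative name, not from the slides):

```python
def add_gate(a1: int, b1: int, a2: int, b2: int) -> tuple:
    """Addition gate on shares: each party adds locally, no communication needed."""
    c1 = a1 ^ b1  # Party 1, using only its own shares
    c2 = a2 ^ b2  # Party 2, using only its own shares
    return c1, c2
```

The key point is that addition gates are free: no interaction is required, and the resulting c1, c2 are again random shares of c.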

SLIDE 20

Multiplication Gates

  • Input wires to gate have values a and b:

– Party 1 has shares a1 and b1
– Party 2 has shares a2 and b2
– Wish to compute c = ab = (a1+a2)(b1+b2)

  • Party 1 knows its concrete share values a1 and

b1.

  • Party 2’s shares a2 and b2 are unknown to

Party 1, but there are only 4 possibilities (00,01,10,11)

SLIDE 21

Multiplication (cont)

  • Party 1 prepares a table as follows:

– Row 1 corresponds to Party 2’s input 00
– Row 2 corresponds to Party 2’s input 01
– Row 3 corresponds to Party 2’s input 10
– Row 4 corresponds to Party 2’s input 11

SLIDE 22

Multiplication (cont)

  • Party 1 prepares a table as follows (Let r be

a random bit chosen by Party 1):

– Row 1 contains the value ab+r when a2=0, b2=0
– Row 2 contains the value ab+r when a2=0, b2=1
– Row 3 contains the value ab+r when a2=1, b2=0
– Row 4 contains the value ab+r when a2=1, b2=1
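Party 1's table construction can be sketched as follows (the `mult_table` name is an illustrative choice; arithmetic is mod 2, so ab+r is computed as an XOR):

```python
def mult_table(a1: int, b1: int, r: int) -> list:
    """Party 1's four-row table; the row index encodes Party 2's shares (a2, b2)."""
    rows = []
    for a2 in (0, 1):
        for b2 in (0, 1):
            a, b = a1 ^ a2, b1 ^ b2    # reconstruct the wire values a and b
            rows.append((a & b) ^ r)   # ab + r over GF(2)
    return rows

# Matches the concrete example on the next slide: a1=0, b1=1, r=1
assert mult_table(0, 1, 1) == [1, 1, 0, 1]
```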

SLIDE 23

Concrete Example

  • Assume: a1=0, b1=1
  • Assume: r=1

Row | Party 2’s shares | Output value
----|------------------|---------------------
 1  | a2=0, b2=0       | (0+0)·(1+0)+1 = 1
 2  | a2=0, b2=1       | (0+0)·(1+1)+1 = 1
 3  | a2=1, b2=0       | (0+1)·(1+0)+1 = 0
 4  | a2=1, b2=1       | (0+1)·(1+1)+1 = 1

SLIDE 24

The Gate Protocol

  • The parties run a 1-out-of-4 oblivious transfer

protocol

  • Party 1 plays the sender: message i is row i of

the table.

  • Party 2 plays the receiver: it inputs 1 if a2=0

and b2=0, 2 if a2=0 and b2=1, and so on…

  • Output:

– Party 2 receives c2=c+r – this is its output
– Party 1 outputs c1=r
– Note: c1 and c2 are random shares of c, as required
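The whole gate protocol can be sketched end to end. This is a functional illustration only: the 1-out-of-4 oblivious transfer is modeled as a direct table lookup (a real OT protocol would hide Party 2's index from Party 1 and the other three rows from Party 2):

```python
import secrets

def mult_gate(a1: int, b1: int, a2: int, b2: int) -> tuple:
    """Multiplication gate on shares; OT abstracted as a lookup."""
    r = secrets.randbelow(2)           # Party 1's random bit, its output share
    table = []                         # Party 1's four OT messages
    for g2 in (0, 1):
        for h2 in (0, 1):
            table.append(((a1 ^ g2) & (b1 ^ h2)) ^ r)
    # Party 2's OT choice index is determined by its own shares (a2, b2)
    c2 = table[2 * a2 + b2]            # received via 1-out-of-4 OT
    c1 = r
    return c1, c2
```

As required, c1 ⊕ c2 = r ⊕ (ab ⊕ r) = ab, and each party's output share alone is uniformly random.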

SLIDE 25

Summary

  • By computing each gate this way, at the end the parties hold shares of the output wires
  • The function output is obtained by the parties simply sending their output-wire shares to each other

SLIDE 26

Security

  • Reduction to the oblivious transfer protocol
  • Assuming security of the OT protocol, parties only

see random values until the end. Therefore, simulation is straightforward.

  • Note: correctness relies heavily on semi-honest

behavior (otherwise can modify shares).

  • Theorem: any functionality f can be securely

computed in the semi-honest model.

SLIDE 27

Remark

  • The semi-honest model is often used as a tool

for obtaining security against malicious parties.

  • In many (most?) settings, security against

semi-honest adversaries does not suffice.

  • In some settings, it may suffice.

– One example: hospitals that wish to share data.

SLIDE 28

Lecture 3

SLIDE 29

Generalize to n parties

  • The setting:
  • Parties P1,…,Pn
  • Inputs x1,…,xn (bits, but can be easily generalized)
  • Outputs y1,…,yn
  • The protocol:
  • Each party shares its input bit
  • Scan the circuit gate by gate

– Input values of the gate are shared by the parties
– Run a protocol computing a sharing of the gate’s output value
– Repeat

  • Publish outputs
SLIDE 30

Protocol for semi-honest setting

  • The protocol:
  • Each party shares its input bit
  • The sharing procedure:

– Pi has input bit xi
– It chooses random bits ri,j for all j≠i
– Sends bit ri,j to Pj
– Sets its own share to ri,i = xi + (Σj≠i ri,j) mod 2
– Therefore Σj=1…n ri,j = xi mod 2

  • Now every Pj has n shares, one for each input xi of

each Pi.
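The sharing procedure above can be sketched for one party (the `share_input` helper is an illustrative name; indices are 0-based here):

```python
import secrets

def share_input(x_i: int, n: int, i: int) -> list:
    """P_i's shares of its input bit: shares[j] goes to P_j; they sum to x_i mod 2."""
    shares = [secrets.randbelow(2) for _ in range(n)]  # random r_{i,j} for j != i
    # P_i keeps r_{i,i} = x_i + sum of the bits it sent out (mod 2)
    shares[i] = (x_i + sum(shares[j] for j in range(n) if j != i)) % 2
    return shares

s = share_input(1, n=4, i=0)
assert sum(s) % 2 == 1  # the n shares reconstruct x_i
```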

SLIDE 31

Protocol for semi-honest setting

  • The protocol computes shares of the output

wires.

  • Each party sends its share of an output wire

to the party Pi that should learn that output.

  • Pi can then sum the shares, obtain the value

and output it.

SLIDE 32

The Malicious Case

  • The above protocol is not secure against

malicious adversaries:

  • What can go wrong with malicious behavior?
  • Using shares other than those defined by the

protocol, using arbitrary inputs to the OT protocol and sending wrong shares of output wires…

  • In the OT protocol we saw, the receiver can easily

and undetectably learn both of the sender’s inputs

SLIDE 33

Proving Security

  • Recall the definition

– Simulator interacts with a trusted party

  • Simulator sends corrupted parties’ inputs
  • Simulator receives corrupted parties’ outputs

– Output distribution of simulator and the honest parties is like in a real execution

  • Input extraction

– In order for the honest parties to output the same in a real and ideal execution, the simulator must extract the input used by the adversary

SLIDE 34

Malicious Adversaries

  • We will show a compiler which forces the

parties to operate as in the semi-honest model (GMW)

  • The basic idea:
  • In every step, each Pi proves in zero knowledge

that its messages were computed according to the protocol

SLIDE 35

Forcing Good Behavior

  • AIM: a party should prove that the

message it is sending is correct.

– That is, it is consistent with the protocol instructions, given the input and random tape that are committed, and the incoming messages (which are public).

SLIDE 36

Tool: Zero Knowledge

  • Problem setting: a prover wishes to prove a

statement to the verifier so that:

– Zero knowledge: the verifier will learn nothing beyond the fact that the statement is correct
– Soundness: the prover will not be able to convince the verifier of a wrong statement

SLIDE 37

Zero Knowledge

  • Prover P, verifier V, language L
  • P proves that x ∈ L without revealing anything
  • Completeness: V always accepts when x ∈ L and an honest P and V interact
  • Soundness: V accepts with negligible probability when x ∉ L, for any cheating prover P*

– Computational soundness: only holds when P* is polynomial-time

  • Zero-knowledge: there exists a simulator S such that S(x) is indistinguishable from the verifier’s output after a real proof execution

SLIDE 38

A Warmup

  • Assume that each Qj runs a deterministic program Πj. The compiler is the following:
  • Each Qj commits to its input yj by sending Dj(sj, yj), where sj is a random string used for the commitment
  • Let Uj^t be the transcript of Qj at step t, i.e. all messages received and sent by Qj until that step
  • Define the language Mj = { Uj^t s.t. ∃ yj, sj such that all messages sent by Qj until step t are the output of Πj applied to yj, sj and to all messages received by Qj up to that step }
  • When sending a message in step t, prove in zero-knowledge that Uj^t ∈ Mj

SLIDE 39

Recall: GMW

  • General methodology takes any secure semi-honest

two-party computation protocol and compiles it into a protocol that is secure against malicious adversaries

  • Two stages:
  • Stage 1: Show a protocol for securely computing any

functionality in the semi-honest adversarial model

  • Stage 2: Construct a protocol compiler that takes any

semi-honest protocol and “converts” it into a protocol that is secure in the malicious model

  • As this compiler is generic, it can be applied to any

semi-honest protocol

SLIDE 40

Obtaining Security

Three goals:

  • Force the adversary to use a fixed input

– Furthermore, make it possible for the ideal-model simulator/adversary to extract this input.

  • Force the adversary to use a uniform random

tape

  • Force the adversary to follow the protocol

exactly (consistently with their fixed input and random tape)

SLIDE 41

The compiler

  • Transformation from security against honest-but-curious

adversaries to security against malicious (active) adversaries works by combining three building blocks:

  • Input commitment scheme

– Each party commits to its input

  • Coin tossing protocol
  • The parties generate random tapes for each other

– Initial idea: random tape of Pi is defined as S1,i⨁S2,i⨁ …⨁Sn,i, where Sj,i is chosen by Pj

  • Protocol emulation phase:

– Run the protocol while proving that the parties’ operations comply with their inputs and random tapes

SLIDE 42

Stage 1: Input Commitment

Preliminaries: bit commitment. The committer holds a bit that the receiver does not know.

  • Commit Stage:

– Committer has a bit σ ∈ {0,1}
– Receiver obtains a commitment string c

  • Commitments of bit 0 and bit 1 appear identical to the receiver
  • Reveal Stage:

– Committer sends a decommit message to the receiver
– Receiver uses the decommit message and c to obtain σ

SLIDE 43

Bit Commitment

Security Properties:

  • Binding: for every c, there exists only one value σ for which decommitment is accepted

– A given commitment cannot be opened in more than one way
– The committer cannot change the committed value by sending a different key in the reveal phase

  • Hiding: the receiver cannot distinguish a commitment string to 0 from a commitment string to 1

– Commitments of different bits look indistinguishable to the receiver

SLIDE 44

Protocols

  • Commitment using public-key encryption:

– Committer chooses a key-pair (pk, sk)
– Committer sends (pk, c = Epk(σ)) to the receiver

  • Decommitment:

– Committer sends the secret key sk to the receiver
– Receiver verifies that sk is associated with pk and decrypts, obtaining σ
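A commit/reveal flow can be sketched as follows. This is a hash-based stand-in for the public-key construction on the slide (hiding rests on the random key, binding on collision resistance of the hash); the `commit`/`reveal` names are illustrative:

```python
import hashlib
import secrets

def commit(bit: int) -> tuple:
    """Commit stage: c = H(key || bit). Returns (commitment, decommit key)."""
    key = secrets.token_bytes(32)
    c = hashlib.sha256(key + bytes([bit])).digest()
    return c, key

def reveal(c: bytes, key: bytes, bit: int) -> bool:
    """Reveal stage: receiver recomputes the hash and checks it matches c."""
    return hashlib.sha256(key + bytes([bit])).digest() == c

c, key = commit(1)
assert reveal(c, key, 1)       # correct opening is accepted
assert not reveal(c, key, 0)   # the committer cannot open c to the other bit
```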

SLIDE 45

Proving Security

  • Proof of security:
  • Q1 is corrupted: verify the proof and extract the “witness”; send (y, s) to the trusted party
  • Q2 is corrupted: commit to garbage and run the zero-knowledge simulator

SLIDE 46

Coin Tossing

  • Aim: fix uniform random tape of each party

– Allow a pair of distrustful parties to agree on a common random value

  • In honest-but-curious adversary model: (Trivial)

– Alice can choose a random bit and send it to Bob.

  • In malicious adversary model: does not work

– Instead of selecting a random bit, Alice can choose a bit that is favorable to her and affect the bias of the output

  • Coin tossing of a bit:

– Alice and Bob agree on a common random bit
– Alice and Bob obtain the same uniform (pseudorandom) bit r

SLIDE 47

Coin Tossing

  • To ensure the output is random, we need to make the outcome dependent on both parties
  • Idea: Alice chooses a random bit rA, Bob chooses a random bit rB, and they output rA ⊕ rB

– Then even if one of the parties tries to cheat, as long as the other party plays fairly the outcome will be truly random

  • Problem: this protocol cannot be realized in the communication model

– When Alice and Bob communicate, one of them has to go first. But if Alice sends her bit rA first, then Bob can make his choice rB dependent on rA

  • if Bob wants the outcome to be zero, he can set rB = rA
  • Solution: with help of commitment
SLIDE 48

Coin Tossing Protocol [Blum]

  • Protocol:

– Make use of a commitment scheme
– Alice chooses a random bit rA, computes a commitment cA to rA and sends cA to Bob
– Bob chooses a random bit rB and sends it to Alice
– Alice decommits, revealing rA

  • Outputs:

– Both parties output r = rA ⊕ rB
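Blum's protocol can be sketched in Python. The commitment is a hash-based stand-in (as in the bit-commitment sketch earlier), and both roles run in one function purely for illustration:

```python
import hashlib
import secrets

def blum_coin_toss() -> int:
    """Both roles in one function, for illustration only."""
    # Alice commits to her random bit
    rA = secrets.randbelow(2)
    key = secrets.token_bytes(32)
    cA = hashlib.sha256(key + bytes([rA])).digest()
    # Bob, having seen only the commitment cA, sends his random bit
    rB = secrets.randbelow(2)
    # Alice decommits; Bob verifies the opening before accepting
    assert hashlib.sha256(key + bytes([rA])).digest() == cA
    return rA ^ rB

r = blum_coin_toss()
assert r in (0, 1)
```

Because Bob chooses rB while rA is hidden (hiding) and Alice cannot change rA afterwards (binding), neither party alone can bias r.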

SLIDE 49

Augmented Coin Tossing

  • Recall: coin tossing is for choosing

random tapes of parties.

  • But, Party 1 does not know Party 2’s

random tape in reality!

  • Augmented coin-tossing:

– Party 1 obtains a random string r – Party 2 obtains a commitment to r

SLIDE 50

Security

  • Q2 is corrupted:

– Simulator receives (c, s) from the trusted party
– Simulator rewinds in each iteration to make each bit correct
– Note that the simulator does not get the decommitment of cj as in Blum
– However, it can run all the way to the end and run the extractor for the proof
– Quite complex

  • Q3 is corrupted:

– Simulator receives d from the trusted party
– Simulator runs the first part honestly with the adversary
– Simulator gives d at the end and simulates the zero-knowledge proof
SLIDE 51

Protocol Emulation

  • At this stage, each party holds a commitment

to the other party’s input and random tape.

  • A protocol is a deterministic function of a

party’s input, random tape and series of incoming messages.

  • Therefore, the commitments can be used to

force the parties to follow the protocol instructions.

SLIDE 52

Protocol Compilation

  • Given any protocol, construct a new protocol

as follows:

– Both parties commit to their inputs
– Both parties generate uniform random tapes
– Parties send messages to each other; each message is proved “correct” with respect to the original protocol, using zero-knowledge proofs
SLIDE 53

Resulting Protocol

  • Theorem: if the initial protocol was secure

against semi-honest adversaries, then the compiled protocol is secure against malicious adversaries.

  • Proof:

– Show that even malicious adversaries are limited to semi-honest behavior
– Show that the additional messages from the compilation reveal nothing

SLIDE 54

Summary

  • GMW paradigm:

– First, construct a protocol for semi-honest adversaries
– Then, compile it so that it is also secure against malicious adversaries

  • There are many other ways to construct secure

protocols – some of them significantly more efficient.

  • Efficient protocols against semi-honest

adversaries are far easier to obtain than for malicious adversaries.

SLIDE 55

Slides credits

  • Tutorial on secure multi-party computation, Lindell

www.cs.biu.ac.il/~lindell/research-statements/tutorial-secure-computation.ppt

  • Introduction to secure multi-party computation, Vitaly Shmatikov, UT Austin

http://www.cs.utexas.edu/~shmat/courses/cs380s_fall09/15smc.ppt

  • Introduction to Cryptography, Yehuda Lindell, Bar Ilan University, IL

http://crypto.biu.ac.il/sites/default/files/

  • Information Security Management, UTC
  • Cryptography fall 2012, Andrej Bogdanov, the Chinese University of Hong Kong

http://www.cse.cuhk.edu.hk/~andrejb/csc5440/

  • Parallel Coin-Tossing and Constant-Round Secure Two-Party Computation, Yehuda

Lindell, https://www.iacr.org/archive/crypto2001/21390170.pdf