Practical Secure Two-Party Computation and Applications, Lecture 4 (PowerPoint PPT presentation)
SLIDE 1

Practical Secure Two-Party Computation and Applications

Lecture 4: Hardware-Assisted Cryptographic Protocols Estonian Winter School in Computer Science 2016

SLIDE 2

Motivation

SLIDE 3

Two Areas

Cryptographic Protocols
  • strong security and privacy guarantees
  • often low performance (e.g., due to high communication)

Secure Hardware
  • provides secure (key) storage and a trusted execution environment
  • is getting cheaper
SLIDE 4

Cryptographic Protocols + Hardware?

  • Hardware Accelerators
  • allow speeding up computations via parallelism (e.g., GPU, FPGA, Cell processor, …)

  • Secure Hardware
  • offers possibilities beyond software-only solutions: secure storage and a secure execution environment
  • can be used to construct more efficient protocols (less computation, less communication)

SLIDE 5

Hardware-Assisted Cryptographic Protocols

  • Where do we use secure HW tokens?
  • banking, SIM cards, pay-TV, passports, health cards, …

  • Why do we use HW tokens?
  • SW alone is often not secure/efficient/sufficient/…

  • Benefits for practice?
  • security, efficiency, unclonability, (money), …

[Diagram: Sender S sends token T to Receiver R; S and R then run protocol Π]

SLIDE 6

Overview of this lecture

[Overview diagram: Hardware Tokens in Practice and in Theory (as setup assumption); special-purpose protocols: Oblivious Transfer, Private Set Intersection; generic protocols: Yao, GMW]

SLIDE 7

Hardware Tokens in Practice

SLIDE 8

Popular Secure Hardware

  • Smartcards
  • contact cards (e.g., Java Cards)
  • contactless cards (e.g., Mifare DESFire)

  • Cryptographic Coprocessors
  • special-purpose secure co-processors (e.g., Trusted Platform Module, TPM)
  • general-purpose, user-programmable secure co-processors (e.g., IBM 4765)

SLIDE 9

Overview: Smart Card Technology

  • Memory-Only (e.g., calling cards)
  • simple data storage (read/write or read-only)
  • usually hard-wired authentication schemes (e.g., using a PIN)

  • Wired Logic (e.g., MIFARE-based electronic tickets)
  • hard-wired state machine for encrypted and/or authenticated memory access
  • multi-application support
  • static file system
  • contact or contactless interface

  • Secure Microcontroller (e.g., JavaCard)
  • microcontroller, operating system, read/write memory
  • computation and file system managed by the operating system
  • contact and/or contactless interface

SLIDE 10

Secure Microcontroller Smart Cards

  • Central Processing Unit (CPU): 8, 16, or 32 bit
  • Cryptographic Co-Processor
  • symmetric encryption (typically DES, 3DES, or AES)
  • cryptographic hashing (typically SHA-1)
  • MAC (typically CBC-MAC)
  • public-key encryption and signatures (typically RSA/ECC)
  • True Random Number Generator (TRNG): e.g., for the generation of keys and nonces
  • Counters: e.g., for memory access control
  • Communication Interface
  • contact interface (1.5 – 12 MB/s)
  • contactless interface (4 – 848 KB/s)
  • Protection against physical attacks (e.g., against side-channel and/or invasive attacks)
  • typically compliant to FIPS 140-2, often certified under Common Criteria
  • includes environmental sensors (e.g., to detect voltage, frequency, or temperature variations)
  • Read-Only Memory (ROM) (> 256 KB): contains operating system and applications; initialized during manufacturing of the device
  • Non-Volatile Memory (NVM) (> 128 KB)
  • read/write memory holding data after power-off
  • stores user data (e.g., device serial number, application data, cryptographic secrets)
  • supports only a limited number of writes (> 50,000); e.g., EEPROM and/or Flash
  • Random Access Memory (RAM) (> 25 KB): read/write memory losing data after power-off; stores temporary data during operation

SLIDE 11

Trusted Platform Module (TPM)

  • Current implementation is a cryptographic co-processor
  • hardware-based random number generation
  • small set of cryptographic functions: key generation, signing, encryption, hashing, MAC

  • Offers additional functionalities
  • registers for secure platform integrity measurement and reporting
  • secure storage (ideally tamper-resistant)

  • Embedded into the platform’s motherboard
  • acts as a “Root of Trust”
  • TPM must be trusted by all parties

  • Many vendors already ship their platforms with a TPM

SLIDE 12

Programmable Secure Coprocessor: IBM 4758

  • General-purpose secure coprocessor
  • hard- and firmware maintained by IBM
  • user can run own operating system and applications
  • true random number generator
  • symmetric crypto in HW (e.g., AES, SHA)
  • modular arithmetic in HW (e.g., RSA, DSA)

  • Security functions
  • integrity self-check (processor internally performs secure boot)
  • hardware-protected storage including hardware-based access control

  • Tamper-responding hardware design
  • sensors detect tampering (e.g., temperature, voltage, or radiation variations)
  • automatically erases secrets when tampering is detected
  • certified under FIPS 140-2 Level 4 (highest level of security)

SLIDE 13

How Secure is Secure Hardware?

  • Attacks [Kömmerling,Kuhn Smartcard’99]
  • micro-probing: obtain direct physical access to the device’s memory
  • side-channel attacks: analyze analog characteristics of supply and interface connections and the electromagnetic radiation of the device
  • fault-injection attacks: observe the device’s behavior under abnormal conditions (e.g., unspecified supply voltage or operating temperature, focused ion beam)
  • example: hardware attack on a TPM chip [Tarnovsky BlackHat’10]

  • Protection mechanisms
  • against side-channel attacks: randomized program flows; obfuscation of input data and intermediate results
  • against micro-probing and fault-injection attacks: tamper-detection mechanisms (e.g., temperature, voltage, frequency sensors) that erase secret data on tampering attempts

[Image: Focused Ion Beam Microscope]

➩ Security of secure hardware is always a trade-off

SLIDE 14

Hardware Tokens in Theory

SLIDE 15

State and Type

  • Stateless tokens (read-only)
  • initialized once (e.g., with a key)
  • do not hold any long-term state between sessions
  • fewer opportunities for HW attacks (e.g., no counter)
  • more difficult protocol design

  • Stateful tokens (read/write)
  • hold long-term state between sessions
  • need secure non-volatile memory
  • simple HW functionality: secure monotonic counter

  • Capabilities and resources
  • usually very small amount of secure memory
  • symmetric crypto vs. public-key crypto, …

SLIDE 16

Trust Models

  • I trust my own token:
  • use as an accelerator (e.g., GPU, FPGA, …)

  • All tokens are “good” (trusted by both parties):
  • similar to the Common Reference String model, where a Trusted Third Party (TTP) generates well-formed tokens instead of a well-formed CRS

  • I don’t trust your token (trusted by sender only):
  • Receiver does not trust a token issued by the Sender (as Receiver does not trust the Sender either)
  • Sender S could send a cheating token (or one infected with a HW Trojan)

  • I don’t trust my own token (trusted by nobody):
  • Receiver could break into the token and extract secrets by circumventing physical protection mechanisms (e.g., side-channel attacks)

SLIDE 17

Adversary Models

  • Semi-honest (honest-but-curious, passive):
  • all parties honestly follow the protocol
  • adversary tries to learn additional information from the corrupted parties’ state (including messages seen)
  • appropriate for many real-life applications (e.g., protection against insider attacks)

  • Covert:
  • corrupted parties deviate from the protocol
  • cheating is detected with constant probability (e.g., ½)

  • Malicious (active):
  • corrupted parties deviate from the protocol
  • adversary can succeed with at most negligible probability (e.g., 2⁻⁸⁰)

  • Universal Composability:
  • security against active adversaries plus secure universal protocol composition (sequential & concurrent)

SLIDE 18

Hardware Tokens as Setup Assumption

SLIDE 19

Overcome Impossibility Results of UC

The Universal Composability (UC) Framework [Canetti FOCS’01]
  • security against active adversaries plus secure universal protocol composition (sequential & concurrent)
  • impossibility result: no non-trivial two-party functionality can be UC-realized in the plain model (without any trusted setup)
  • a setup assumption is required to overcome this impossibility result

SLIDE 20

Setup Assumptions for UC

Find minimal but “practical” assumptions for UC (ranging from trust in a third party to trust in technology):
  • Common Reference String (CRS) [Canetti,Fischlin CRYPTO’01]
  • Public-Key Registration Service [Barak,Canetti,Nielsen,Pass FOCS’04]
  • Government-issued “Signature Cards” [Hofheinz,Müller-Quade Moraviacrypt’05]
  • Tamper-proof Hardware Tokens (trusted by issuer only): next slides …

SLIDE 21

Crypto Founded on Tamper-Proof HW

  • Sender constructs a hardware token implementing any desired (poly-time) functionality (e.g., a programmable smartcard)
  • Receiver and Adversary can use the token as a black box only, i.e., observe its input/output behavior

  • Trust model
  • token not trusted by Receiver
  • token communicates with Receiver only: not with the outside world, or only up to a certain number of bits

  • Different models
  • symmetric (both parties construct a tamper-proof token)
  • asymmetric (only one party constructs a tamper-proof token)

SLIDE 22

Crypto Founded on Tamper-Proof HW

  • Model tamper-proof HW (wrapper functionality)
  • Construct (computational) UC-commitments
  • Drawback: the reduction from UC-commitments to UC-secure computation is not efficient in practice

Protocol                                            | #Tokens | Stateful | PK operations
[Katz EUROCRYPT’07], [Damgard,Nielsen,Wichs TCC’09] | 2       | Yes      | Yes
[Chandran,Goyal,Sahai EUROCRYPT’08]                 | 2       | No       | Yes
[Moran,Segev EUROCRYPT’08]                          | 1       | Yes      | No

SLIDE 23

Hardware Tokens for Oblivious Transfer (OT)

SLIDE 24

Oblivious Transfer (OT)

  • Fundamental primitive for secure computation
  • SW-based OT protocols (e.g., [Naor,Pinkas SODA’01], [Ishai,Kilian,Nissim,Petrank CRYPTO’03])
  • interaction (at least two messages)
  • k = 128 bit communication per OT using OT extension

[Diagram: OT functionality. Sender S inputs s0, s1 ∈ {0,1}^t; Receiver R inputs choice bit b ∈ {0,1} and obtains sb]
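The ideal OT functionality in the diagram can be written down directly. A minimal Python sketch (the function name is illustrative); the point is the interface, not a secure realization:

```python
def ot(sender_inputs, choice_bit):
    """Ideal 1-out-of-2 oblivious transfer: the receiver with choice
    bit b learns s_b and nothing about s_{1-b}; the sender learns
    nothing about b. A real protocol realizes this without a trusted
    party, e.g., via public-key crypto or a hardware token."""
    s0, s1 = sender_inputs
    return s1 if choice_bit else s0


# Sender holds two t-bit strings, receiver holds a choice bit.
s0, s1 = b"secret-0", b"secret-1"
assert ot((s0, s1), 0) == b"secret-0"
assert ot((s0, s1), 1) == b"secret-1"
```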

SLIDE 25

Non-Interactive OT using One-Time Memory

  • base OT on a very simple HW assumption to remove interaction
  • One-Time Memory (OTM): implements the OT functionality in HW
  • a tamper-proof bit x ensures at most one query
  • either s0 or s1 is read and revealed
  • no crypto and no communication at all
  • OTM trusted by both players

[Diagram: Sender S stores s0, s1 in OTM token T (with tamper-proof bit x) and sends T to Receiver R, who queries it once on b ∈ {0,1} and obtains sb]  [Goldwasser,Kalai,Rothblum CRYPTO’08]
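A one-time memory is exactly the OT functionality frozen into hardware. A minimal Python model of the one-query restriction (class and attribute names are illustrative):

```python
class OneTimeMemory:
    """Model of an OTM token [Goldwasser,Kalai,Rothblum CRYPTO'08]:
    stores (s0, s1); a tamper-proof bit ensures at most one query,
    after which both values become inaccessible."""

    def __init__(self, s0, s1):
        self._values = (s0, s1)
        self._used = False  # models the tamper-proof bit x

    def query(self, b):
        if self._used:
            raise RuntimeError("OTM already queried once")
        self._used = True
        result = self._values[b]
        self._values = None  # both secrets are gone after one read
        return result


otm = OneTimeMemory(b"s0", b"s1")
assert otm.query(1) == b"s1"   # first query reveals exactly one value
try:
    otm.query(0)               # a second query is impossible
except RuntimeError:
    pass
```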

SLIDE 26

Interactive OT using HW Token

  • Token T:
  • stateless
  • symmetric crypto only
  • trusted by Sender S only
  • security against covert adversaries (Receiver R can detect cheating w.h.p.)

[Diagram: Sender S sends token T to Receiver R; S and R then run protocol Π]  [Kolesnikov TCC’10]

SLIDE 27

Interactive OT using HW Token

[Kolesnikov TCC’10]

[Protocol diagram omitted. Summary: In the setup phase, Sender S (inputs s0, s1 ∈ D = {0,1}^t) initializes T with keys k0, k1 ∈R D and sends T to Receiver R (input b ∈ {0,1}). R randomly chooses a test domain Dt and obtains test keys from T, which lets R verify T’s answers. In the online phase, R queries T on a live value and a test value in random order, so T can only guess on which query to cheat without being caught. S sends encryptions e0, e1 of its masked inputs; R can decrypt exactly one of the two messages and obtains sb.
Notation: Enc/Dec is a CPA-secure encryption scheme, F is a strong PRP, t is the symmetric security parameter.]

SLIDE 28

Hardware Tokens for Private Set Intersection


SLIDE 29

Private Set Intersection (PSI)

  • Use-case examples:
  • organizations: find joint properties
  • governments: find joint criminal suspects
  • companies: find joint customers

  • SW-based PSI protocols (e.g., [DeCristofaro,Kim,Tsudik ASIACRYPT’10])
  • at least O(|X| + |Y|) communication

[Diagram: PSI functionality between Sender S (set X) and Receiver R (set Y); the Receiver learns X ∩ Y]

SLIDE 30

Private Set Intersection with Hardware Token

[Hazay,Lindell CCS’08]

  • O(|X| + |Y|) symmetric crypto, O(|X|) communication
  • runtime on a Java Card ≈ 60 ms · |Y|
  • security: Universal Composability
  • token T trusted by both parties

[Protocol diagram omitted. Summary: In the setup phase, Sender A (set X = {x1, …, x_nA}) chooses a key k and an acknowledgment value OK ∈R D, initializes token T with k, OK, and nB, and sends T to Receiver B (set Y = {y1, …, y_nB}). In the online phase, B queries T on each yj ∈ Y and obtains ȳj = Fk(yj); after the last query, T invalidates k and releases OK, which B forwards to A as proof that T answers no further queries. A then sends X̄ = {Fk(x)}x∈X, and B outputs X ∩ Y = {yj | ȳj ∈ X̄}.
Notation: F is an SPRP over D = {0,1}^t, where t is the symmetric security parameter (e.g., F = AES, t = 128).]
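The core of the token-based PSI flow can be mimicked in a few lines of software. A minimal Python sketch of the [Hazay,Lindell CCS’08] idea, with HMAC-SHA256 standing in for the SPRP F (a simplification for illustration; the `Token` class and names are ours, not from the protocol specification):

```python
import hashlib
import hmac
import os


def F(k, x):
    # Stand-in for F_k: HMAC-SHA256 used as a PRF on string elements.
    return hmac.new(k, x.encode(), hashlib.sha256).digest()


class Token:
    """Sender-issued token: answers at most n_B queries under key k,
    then invalidates k and releases the acknowledgment value OK."""

    def __init__(self, k, ok, n_b):
        self._k, self._ok, self._left = k, ok, n_b

    def query(self, y):
        if self._left == 0:
            raise RuntimeError("key invalidated")
        self._left -= 1
        return F(self._k, y)

    def done(self):
        if self._left > 0:
            raise RuntimeError("queries remaining")
        self._k = None          # invalidate k
        return self._ok


# Setup phase: sender A draws k and OK and initializes the token.
X = {"alice", "bob", "carol"}
Y = {"bob", "dave", "carol"}
k, OK = os.urandom(32), os.urandom(32)
T = Token(k, OK, len(Y))

# Online phase: receiver B evaluates F_k on its set via the token ...
Y_bar = {y: T.query(y) for y in Y}
assert T.done() == OK  # B proves to A that k has been invalidated
# ... then A reveals its PRF-evaluated set and B intersects.
X_bar = {F(k, x) for x in X}
intersection = {y for y, yb in Y_bar.items() if yb in X_bar}
print(sorted(intersection))  # ['bob', 'carol']
```

The OK value models the slide's acknowledgment step: only after the token has used up its query budget (so B can learn nothing about elements outside Y) does A release X̄.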

SLIDE 31

PSI with Untrusted Hardware Token

[Fischlin,Pinkas,Sadeghi,Schneider,Visconti CT-RSA’11]

  • Extends the PSI protocol of [Hazay,Lindell CCS’08]:
  • similar HW requirements and complexity (up to small constant factors)
  • token(s) can be trusted less:

Trust in token        | Security guarantee
fully trusted         | UC
untrusted by Receiver | fall-back: output correctness w.r.t. covert adversaries
untrusted by Sender   | fall-back: input privacy w.r.t. malicious adversaries

  • Any cheating attempt
  • can breach only correctness
  • but not privacy
  • is detected with high probability
SLIDE 32

Receiver does not trust T: Privacy

  • If R sends nothing to S, R’s inputs remain private
  • Re-order messages to remove the OK message
  ➩ non-interactive version of the protocol of [Hazay,Lindell CCS’08]

[Protocol diagram omitted. Summary: A initializes T with key k, PRF seed s, and nB. For the j-th query, T returns the masked value ȳ′j = Fk(yj) ⊕ pj with pad pj = fs(j), and invalidates k after nB queries. A sends X̄ = {Fk(x)}x∈X together with the pads pj = fs(j); only afterwards can R unmask ȳj = ȳ′j ⊕ pj and output X ∩ Y = {yj | ȳj ∈ X̄}.
Notation: F is an SPRP, f is a PRF.]

SLIDE 33

Receiver Does Not Trust T: Correctness

  • Idea (adapted from [Kolesnikov TCC’10]):
  • use a live run plus a test run ➩ R can check the correctness of T’s answers

[Protocol diagram omitted. Summary:
1) R sends a live and a test value (r, rT) to S.
2) R obtains the test key KT = Fk(rT), but not the live key K = Fk(r).
3) R sends the live and test values to T in random order ➩ T can only guess which value is the live one.
4) R uses the test key to verify T’s test responses, so a cheating T is caught with high probability.]

SLIDE 34

Sender A Does Not Trust T: Privacy

  • Idea: use multiple tokens (from different manufacturers)
  • Assumption: the receiver can break into all but one token
  • Tool: sequential composition
  • the composition of PRPs remains secure even if all but one key is known

[Protocol diagram omitted. Summary: A initializes two tokens T1, T2 and the previous protocol is run through both tokens in sequence: the masked output of T1 is fed into T2, and A sends X̄ = {FK2(FK1(x))}x∈X; R outputs X ∩ Y = {yj | ȳ2,j ∈ X̄}. Even if the receiver extracts the key of one token, the remaining token keeps A’s inputs private.]
SLIDE 35

Hardware Tokens for Yao

SLIDE 36

SFE using Garbled Circuits

Transfer of the GC is the major communication bottleneck!

[Diagram omitted. Summary: In the setup phase, Server S garbles the circuit for f(·,·) (e.g., the comparison x < y) and sends the garbled circuit C̃ to Client C; C obtains the garbled values x̃ for its private input x = x1, …, xn via OT: (x̃; ⊥) ← OT(x; (x̃0, x̃1)). In the online phase, S sends its garbled input ỹ for y = y1, …, yn and C evaluates f(x, y) = C̃(x̃, ỹ) gate by gate using the garbled tables; the table for a gate g holds the four encryptions E(x̃1^a, ỹ1^b; c̃1^{g(a,b)}) for a, b ∈ {0,1}.]
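The garbled-table idea above can be seen concretely for a single gate. A textbook-style Python sketch of garbling and evaluating one AND gate, using hash-derived pads to encrypt output wire labels (all names are ours; real implementations add point-and-permute and other optimizations so the evaluator decrypts exactly one row):

```python
import hashlib
import os
import random

LBL = 16  # wire-label length in bytes


def pad(ka, kb):
    # Encryption pad derived from the two input wire labels.
    return hashlib.sha256(ka + kb).digest()[:LBL]


def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))


def garble_and_gate():
    # One random label per wire value: x[bit], y[bit], z[bit].
    x = [os.urandom(LBL) for _ in range(2)]
    y = [os.urandom(LBL) for _ in range(2)]
    z = [os.urandom(LBL) for _ in range(2)]
    # Encrypt the output label z[a AND b] under each input-label pair.
    table = [xor(pad(x[a], y[b]), z[a & b])
             for a in range(2) for b in range(2)]
    random.shuffle(table)  # hide which row encodes which input pair
    return x, y, z, table


def evaluate(table, kx, ky):
    # The evaluator holds one label per input wire and tries all rows;
    # with point-and-permute only one row would need decrypting.
    p = pad(kx, ky)
    return [xor(row, p) for row in table]


x, y, z, table = garble_and_gate()
for a in range(2):
    for b in range(2):
        candidates = evaluate(table, x[a], y[b])
        # The correct output label z[a AND b] is always recoverable.
        assert z[a & b] in candidates
```

The evaluator learns only wire labels, never the plaintext bits, which is what lets a token (or the server) garble while the client evaluates.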

SLIDE 37

Embedded SFE

Use HW to remove the communication bottleneck between C and S:

  • S provides C with a tamper-proof HW token T (not trusted by C)
  • applications: mobile phones, Pay-TV
  • T generates the GC locally on behalf of S
  • T needs few HW resources only (constant memory + symmetric crypto)

[Diagram: S shares key k with token T; inside C’s device, T derives the garbled circuit f̃i from k and fi; C evaluates z = f(x, y)]
[Järvinen,Kolesnikov,Sadeghi,Schneider FC’10]

Additionally, C can use a HW accelerator to evaluate the GC more efficiently.
[Järvinen,Kolesnikov,Sadeghi,Schneider CHES’10]

SLIDE 38

Hardware Tokens for GMW

  • D. Demmler, T. Schneider, M. Zohner: Ad-Hoc Secure Two-Party Computation on Mobile Devices using Hardware Tokens. In USENIX Security’14.

SLIDE 39

Motivation

Smartphones are awesome (contacts, calendar, location, banking, messaging, games, …) but limited in computation, memory, communication, and battery life.

SLIDE 40

Generic Secure Computation

Yao’s GC and the GMW protocol have inherent limitations:

  • O(|f|) symmetric crypto
  • O(k · |f|) communication for symmetric security parameter k

⇒ too inefficient for mobile devices
Yao implementations on PC: Fairplay [MNPS04], FastGC [HEKM11]; on mobile: Mobile Yao [HCE11]

[Diagram: Alice (private input x) and Bob (private input y) both obtain f(x, y)]

SLIDE 41

Secure Computation Applications

Finding shared contacts (PSI)

[Diagram: Alice’s and Bob’s contact lists (Bart, Lisa, Homer, Krusty, Maggie, Marge); shared contacts Homer and Maggie are marked]

Scheduling a meeting

SLIDE 42

Secure Computation Applications

Scheduling a meeting with location information

SLIDE 43

Our Setting

SLIDE 44

Secure Computation Phases

[Diagram: the computation is split into three phases: Init, Setup, and Online; precomputation in the first two phases is proportional to |f|, and in the Online phase the parties provide inputs x and y and obtain f(x, y)]

SLIDE 45

Multiplication Triple Generation

[Diagram: parties A and B obtain shares of multiplication triples in the precomputation phases]
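The multiplication triples precomputed here are Beaver triples over GF(2). A minimal Python sketch of how one precomputed triple lets two parties compute an AND of XOR-shared bits in the online phase (the dealer, which the smartcard plays in the token-aided setting, is modeled as a plain function; all names are illustrative):

```python
import random


def share(bit):
    """XOR-share a bit between parties A and B."""
    r = random.randint(0, 1)
    return r, bit ^ r


def gen_triple():
    """Dealer (the smartcard, in the token-aided GMW setting) outputs
    shares of a multiplication triple with c = a AND b."""
    a, b = random.randint(0, 1), random.randint(0, 1)
    c = a & b
    return share(a), share(b), share(c)


def and_gate(x_shares, y_shares, triple):
    """Online AND of shared bits using one precomputed triple."""
    (a0, a1), (b0, b1), (c0, c1) = triple
    x0, x1 = x_shares
    y0, y1 = y_shares
    # Both parties open d = x XOR a and e = y XOR b (this leaks
    # nothing, since a and b are uniformly random masks).
    d = (x0 ^ a0) ^ (x1 ^ a1)
    e = (y0 ^ b0) ^ (y1 ^ b1)
    # z_i = c_i ^ (d AND b_i) ^ (e AND a_i), plus d AND e at one party;
    # then z0 XOR z1 = x AND y.
    z0 = c0 ^ (d & b0) ^ (e & a0) ^ (d & e)
    z1 = c1 ^ (d & b1) ^ (e & a1)
    return z0, z1


# Correctness check over all input combinations.
for x in range(2):
    for y in range(2):
        z0, z1 = and_gate(share(x), share(y), gen_triple())
        assert z0 ^ z1 == (x & y)
```

Because the triple is input-independent, it can be generated offline (e.g., on the smartcard), leaving only cheap XORs and two opened bits for the online phase.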

SLIDE 46

Multiplication Triple Sets

[Diagram: multiplication triple sets of sizes 2, 4, 8, 14]

SLIDE 47

Android Apps for Secure Computation

[Diagram: Alice and Bob each run an SC App, connected via Wi-Fi Direct; the G&D MSC Service and the MT Set Service provide multiplication triples from the smartcard]

SLIDE 48

Benchmarks - General

Giesecke & Devrient Mobile Security Card SE 1.0
  • microSD JavaCard
  • memory: 75 KB EEPROM / 1750 bytes RAM
  • AES: 16 KB/s

Samsung Galaxy S3
  • 4× 1.4 GHz ARM CPU
  • 1 GB RAM, 16 GB flash storage

Interactive OT extension: 11,000 MT/s
Init phase on smartcard: 5,800 MT/s

SLIDE 49

Benchmarks - Applications

Scheduling for a week with 392 time slots; 512 contacts with 32 bits each (Sort-Compare-Shuffle circuit from [HEK12])

           | Scheduling | Location Scheduling | Common Contacts
|f| / d(f) | 392 / 1    | 280,605 / 87        | 799,232 / 79
Init       | 0.37 s     | 48.5 s              | 137.9 s
Setup      | 1.3 s      | 1.8 s               | 2.5 s
Online     | 0.003 s    | 0.82 s              | 1.9 s
Ad-Hoc     | 1.3 s      | 2.6 s               | 4.4 s

Related work: [HCE11] 3.82 s and 1,468.0 s; [HEK12] 4.95 s (16-bit coordinates)

SLIDE 50

Comparison with Related Work

Protocol               | f unknown in init phase | low ad-hoc communication | low ad-hoc computation
Yao’s Garbled Circuits | ✓                       | ✘                        | ✘
Token Yao              | ✘                       | ✓                        | ✘
GMW                    | ✓                       | ✘                        | ✘
Token GMW              | ✓                       | ✓                        | ✓

Summary:
  • Mobile secure computation is becoming practical (but requires a smartcard)
  • Trusted hardware enables secure offline pre-computation
SLIDE 51

Conclusion

  • Cryptographic Protocols + Secure Hardware
  • can reduce the complexity of protocols
  • many protocols initially assume a fully trusted token
  • to cover real-life threats, they can be modified to provide some level of security/correctness guarantees even if tokens are
  • malicious (e.g., hardware Trojans), or
  • broken into (e.g., side-channel attacks)

SLIDE 52

Literature

  • W. Aiello, Y. Ishai, O. Reingold: Priced oblivious transfer: how to sell digital goods. In EUROCRYPT’01.
  • E. De Cristofaro, J. Kim, G. Tsudik: Linear-complexity private set intersection protocols secure in malicious model. In ASIACRYPT’10.
  • D. Demmler, T. Schneider, M. Zohner: Ad-hoc secure two-party computation on mobile devices using hardware tokens. In USENIX Security’14.
  • M. Fischlin, B. Pinkas, A.-R. Sadeghi, T. Schneider, I. Visconti: Secure set intersection with untrusted hardware tokens. In CT-RSA’11.
  • S. Goldwasser, Y. T. Kalai, G. N. Rothblum: One-time programs. In CRYPTO’08.
  • C. Hazay, Y. Lindell: Constructions of truly practical secure protocols using standard smartcards. In ACM CCS’08.
  • Y. Huang, P. Chapman, D. Evans: Privacy-preserving applications on smartphones. In HotSec’11.
  • Y. Huang, D. Evans, J. Katz: Private set intersection: are garbled circuits better than custom protocols? In NDSS’12.
  • Y. Huang, D. Evans, J. Katz, L. Malka: Faster secure two-party computation using garbled circuits. In USENIX Security’11.
  • Y. Ishai, J. Kilian, K. Nissim, E. Petrank: Extending oblivious transfers efficiently. In CRYPTO’03.
  • K. Järvinen, V. Kolesnikov, A.-R. Sadeghi, T. Schneider: Embedded SFE: offloading server and network using hardware tokens. In FC’10.
  • K. Järvinen, V. Kolesnikov, A.-R. Sadeghi, T. Schneider: Garbled circuits for leakage-resilience: hardware implementation and evaluation of one-time programs. In CHES’10.
  • V. Kolesnikov: Truly efficient string oblivious transfer using resettable tamper-proof tokens. In TCC’10.
  • O. Kömmerling, M. G. Kuhn: Design principles for tamper-resistant smartcard processors. In Smartcard’99.
  • D. Malkhi, N. Nisan, B. Pinkas, Y. Sella: Fairplay - a secure two-party computation system. In USENIX Security’04.
  • M. Naor, B. Pinkas: Efficient oblivious transfer protocols. In SODA’01.
  • C. Tarnovsky: Deconstructing a ‘secure’ processor. In Black Hat Briefings Federal, 2010.
  • A. C. Yao: How to generate and exchange secrets. In FOCS’86.