

SLIDE 1

From obfuscation to white-box crypto: relaxation and security notions

Matthieu Rivain

WhibOx 2016, 14 Aug, UCSB

SLIDE 2

What does this program do?

([]+/H/)[1&11>>1]+(+[[]+(1-~1<<1)+(~1+1e1)+(1%11)+(1|1>>1|1)+(~1+1e1)+(.1^!1)])[[([]+!![ 11])[11^11]+[[{}]+{}][1/1.1&1][1]]+([[]+111/!1][+!1][([{}]+{})[1e1>>1]+[[],[]+{}][1&11>> 1][1|[]]+([]+[][111])[1&1]+[{},1e1,!1+{}][~~(1.1+1.1)][1^1<<1]+(11/!{}+{})[1-~1<<1]+[!!{ }+[]][+(11>11)][[]+1]+(/^/[1.11]+/&/)[.1^!1]+[{},[{}]+{},1][1&11>>1][1+1e1+1]+([]+!!{})[ .1^!1]+([]+{}+[])[[]+1]+[!!{}+{}][!11+!111][[]+1]]+[])[(!/~/+{})[1|1<<1]+[/=/,[]+[][1]][ 1&11>>1][1&1>>1]+([]+{})[~~(1.1+1.1)]+[1,!1+{}][1%11][1^1<<1]+(111/[]+/1/)[~1+1e1+~1]+[! !/-/+[]][+(11>11)][1]]((1<<1^11)+((+(1<1))==([]+/-/[(!![11]+[])[+!1]+(!!/-/+{})[1-~1]+([ ]+!/~/)[1-~1]+(!!/-/+{})[!111+!111]])[11%11]),-~11>>1)](~1-~1e1<<1<<1)+([]+{111:1111}+[] )[11111.1%11.1*111e11|!11]+({}+/W/)[1+~1e1-(~11*1.1<<1)]+(+[[]+(1|1>>1)+(1|1>>1|1)+(11-1 >>1)+(1e1>>1|1)+(1e1>>1)+(1>>11)+(11>>>1)])[[(!!{}+[])[11>>>11]+[[]+{}][.1^!1][111%11]]+ ([11/[]+[]][111%111][([{}]+[{}])[1e1>>1]+[[],[{}]+[{}]][1|1>>1|1][1|[]]+([][11]+[])[[]+1 ]+[{},1e1,![1]+/~/][1<<!1<<1][1<<1^1]+(1/!1+{})[11+1>>1]+[!!/-/+{}][+(111>111)][111%11]+ ([][11]+/&/)[1&1>>1]+[{},[]+{}+[],1][[]+1][11-~1+11>>1]+([]+!!/-/)[11>>11]+([]+{})[1|1>> 1|1]+[[]+!!{}][1>>>1][1&11]]+[])[(!{}+[])[1^1<<1]+[/=/,[]+[][1]][1<<1>>1][!111+!111]+([] +{}+[])[1<<1^1>>1]+[1,![11]+[]][1|1>>1][1|1<<1|1]+(11/[]+/1/)[-~11>>1]+[!![111]+{}][+[]] [1|1>>1]]((1e1-1)+((1&1>>1)==([]+/-/[(!!{}+{})[+(1>1)]+(!!/-/+{})[1|1<<1]+(!1+{})[1|1<<1 |1]+(!!/-/+{})[11.11>>11.11]])[1&1>>1]),1-~1<<1)](~1-~1e1<<1<<1)+(/^!/+[])[1+!![11%111]]

SLIDE 3

What does this program do?


Answer: it prints “hello world”

SLIDE 4

What does this program do?

#define _ -F<00||--F-OO--;
int F=00,OO=00;
main(){F_OO();printf("%1.3f\n",4.*-F/OO/OO);}
F_OO()
{
_-_-_-_
_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_-_-_-_-_
_-_-_-_-_-_-_-_
_-_-_-_
}

SLIDE 5

What does this program do?


Answer: it computes π

SLIDE 6

What is (cryptographic) obfuscation?


SLIDE 8

What is obfuscation?

Obfuscation is the deliberate act of creating obfuscated code, i.e. [...] that is difficult for humans to understand. Obfuscators make reverse engineering more difficult [...] but do not alter the behavior of the obfuscated application. – wikipedia

⇒ make a program unintelligible while preserving its functionality

SLIDE 9

Why obfuscation?

∎ To protect some secret inside a program
▸ the algorithm itself (e.g. a factoring program)

[Diagram: an intelligible program hiding an efficient factoring algorithm: N = p · q ↦ (p, q)]

▸ some private data used by the program (e.g. conditional data access)

[Diagram: a program embedding private data and pwd, f: if pwd correct then disclose f(data)]

∎ Obfuscating a hello-world program is useless

SLIDE 10

Defining obfuscation

Program

∎ a word in a formal (programming) language: P ∈ L
∎ a function execute ∶ L × {0,1}* → {0,1}*, execute ∶ (P, in) ↦ out
∎ P implements a function f ∶ A → B if ∀a ∈ A ∶ execute(P, a) = f(a) (denoted P ≡ f)
∎ P1 and P2 are functionally equivalent if P1 ≡ f ≡ P2 for some f (denoted P1 ≡ P2)
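These definitions can be made concrete with a small sketch (illustrative names, checked by exhaustion over a finite toy domain; none of this is from the slides): a program is any callable, execute runs it, and equivalence is a pointwise comparison.

```python
# Toy model of the definitions above: a "program" is a Python callable,
# execute(P, x) runs it, and P ≡ f / P1 ≡ P2 are checked over a finite domain A.

def execute(P, inp):
    return P(inp)

def implements(P, f, domain):
    """P ≡ f : execute(P, a) == f(a) for every a in the domain."""
    return all(execute(P, a) == f(a) for a in domain)

def equivalent(P1, P2, domain):
    """P1 ≡ P2 : both implement the same function on the domain."""
    return all(execute(P1, a) == execute(P2, a) for a in domain)

square_fast = lambda x: x * x
square_slow = lambda x: sum(x for _ in range(x))  # same function for x >= 0

A = range(16)
assert implements(square_fast, lambda x: x ** 2, A)
assert equivalent(square_fast, square_slow, A)
```

The point of the sketch: equivalence is a statement about I/O behavior only; the two square programs above are very different as source code.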

SLIDE 11

Defining obfuscation

Obfuscator

∎ algorithm O mapping a program P to a program O(P) st:
∎ functionality: O(P) ≡ P
∎ efficiency: O(P) is efficiently executable
∎ security:
▸ (informal) O(P) is hard to understand
▸ (informal) O(P) protects its data

How to formally define the security property?

SLIDE 12

Virtual Black-Box (VBB) Obfuscation

∎ O(P) reveals nothing more than the I/O behavior of P
∎ Any adversary on O(P) can be simulated with a black-box access to P

SLIDE 13

Virtual Black-Box (VBB) Obfuscation

∎ O(P) reveals nothing more than the I/O behavior of P
∎ Any adversary on O(P) can be simulated with a black-box access to P

[Diagram: the adversary A gets O(P) and outputs 0/1; the simulator S makes oracle queries x ↦ P(x) and outputs 0/1]

∣Pr[A(O(P)) = 1] − Pr[S^P() = 1]∣ ≤ ε

SLIDE 14

Impossibility result

∎ VBB-O does not exist on general programs (CRYPTO’01)
∎ Counterexample:

uint128_t cannibal(prog P, uint128_t password) {
  uint128_t secret1 = 0xe075b4f4eabf4377c1aa7202c8cc1ccb;
  uint128_t secret2 = 0x94ff8ec818de3bd8223a62e4cb7c84a4;
  if (password == secret1) return secret2;
  if (execute(P, null, secret1) == secret2) return secret1;
  return 0;
}

O(cannibal)(O(cannibal),0) = secret1
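The self-application attack can be sketched in a few lines of Python, with callables standing in for programs (the constants are the ones from the slide; the obfuscator O below is a stand-in modeled only by its functionality-preservation property, which is exactly what the attack exploits):

```python
# Barak et al. counterexample: black-box access to cannibal reveals nothing
# (the secrets are 128-bit), but having O(cannibal) as *data* lets us feed
# the program to itself and extract secret1.

SECRET1 = 0xE075B4F4EABF4377C1AA7202C8CC1CCB
SECRET2 = 0x94FF8EC818DE3BD8223A62E4CB7C84A4

def cannibal(prog, password):
    if password == SECRET1:
        return SECRET2
    # Feed the other program its own secret and test its answer:
    if prog is not None and prog(None, SECRET1) == SECRET2:
        return SECRET1
    return 0

def O(P):
    """Stand-in obfuscator: functionality-preserving by construction."""
    return lambda *args: P(*args)

obf = O(cannibal)
assert obf(obf, 0) == SECRET1   # the obfuscated program eats itself
```

Because any correct obfuscator must preserve this behavior, no obfuscator can make cannibal leak nothing more than its I/O behavior.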

SLIDE 15

Indistinguishability obfuscation (iO)

∎ Restricted to circuits, i.e. programs without branches/loops
∎ For any two programs P1 and P2 st P1 ≡ P2 and ∣P1∣ = ∣P2∣, the obfuscated programs O(P1) and O(P2) are indistinguishable

[Diagram: A gets O(P1) and outputs 0/1; A gets O(P2) and outputs 0/1]

∣Pr[A(O(P1)) = 1] − Pr[A(O(P2)) = 1]∣ ≤ ε

SLIDE 16

Best possible obfuscation

∎ Anything that can be learned (efficiently) from O(P) can be learned from any P′ ≡ P with ∣P′∣ ≈ ∣P∣

[Diagram: the adversary A gets O(P) and outputs 0/1; the simulator S gets any P′ ≡ P and outputs 0/1]

∣Pr[A(O(P)) = 1] − Pr[S(P′) = 1]∣ ≤ ε

SLIDE 17–23

iO and BPO are equivalent

∎ iO ⇒ BPO: the BPO simulator S, given any P′ ≡ P with ∣P′∣ ≈ ∣P∣, runs A on O(P′); since O(P′) ≃ O(P) under iO, S(P′) ≃ A(O(P))

∎ BPO ⇒ iO: for P1 ≡ P2, taking P′ = P1 in the BPO guarantee gives A(O(P1)) ≃ S(P1) ≃ A(O(P2)), hence O(P1) ≃ O(P2)

[Diagrams of the two reductions, built step by step over slides 17–23]

∎ We use iO in the rest of the presentation

SLIDE 24

What is white-box cryptography?


SLIDE 28

What is white-box cryptography?

“the attacker is assumed to have [...] full access to the encrypting software and control of the execution environment”

⇒ obfuscation restricted to encryption (or another crypto primitive)

“Our main goal is to make key extraction difficult.”

⇒ relaxed security requirements

“While an attacker can clearly make use of the software itself [...], forcing an attacker to use the installed instance at hand is often of value to DRM systems providers.”

⇒ encryption software ≠ secret key

– Chow et al. (DRM 2002)

SLIDE 29

What is white-box cryptography?

∎ Obfuscation restricted to a specific class of crypto primitives
∎ Typically, SPN ciphers:

[Diagram: SPN cipher: plaintext m, round keys k1, ..., kn, S-box layers S, linear layers L, ciphertext c]

∎ Running example: F = {AESk(⋅) ∣ k ∈ {0,1}^128}
∎ White-box obfuscator: k ↦ WB-AESk ≡ AESk(⋅)

SLIDE 30

Strongest possible WBC

∎ VBB obfuscation restricted to AES

[Diagram: the adversary A gets WB-AESk and outputs 0/1; the simulator S queries the oracle m ↦ AESk(m) and outputs 0/1]

∎ Impossibility result does not apply
∎ The AES-LUT program achieves VBB
▸ but does not fit into 10^9 ⋅ 10^9 ⋅ 10^9 TB
∎ How to build a compact VBB AES implementation?
▸ could be impossible to achieve

SLIDE 31

What does iO-AES mean?

∎ iO restricted to AES: O(Pk) ≃ O(P′k) for any Pk ≡ P′k ≡ AESk
∎ Example of iO AES obfuscator:
1. k ← extract-key(Pk)
2. return reference implem AESk
▸ probably inefficient obfuscator!
∎ If a (compact) VBB AES implementation exists: O(Pk) ≃ O(VBB-AESk) ⇒ efficient iO ⇔ VBB
∎ So what does iO-AES mean?

SLIDE 32

Defining WBC

[Diagram: obfuscation scale ranging over simple AES, VBB AES, iO AES, with a “?” marking the middle ground]

∎ We need something
▸ relaxed compared to VBB
▸ meaningful compared to iO

SLIDE 33

Defining WBC

[Diagram: same obfuscation scale, the “?” now labeled “further white-box security notions”]

∎ We need something
▸ relaxed compared to VBB
▸ meaningful compared to iO
⇒ further notions

SLIDE 34

What could we expect from WBC?

SLIDE 35

What could we expect?

∎ The least requirement: key extraction must be difficult

[Diagram: A gets WB-AESk and outputs k]

∎ Easy to satisfy for some variant of AES: Ek(⋅) = AESh(⋅) with h = H(k)
▸ H one-way ⇒ simple AESh implem unbreakable
∎ We should expect more
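The key-hashing trick can be sketched as follows (HMAC-SHA256 stands in for AES so the snippet stays dependency-free; the point is only which value the white-box program embeds):

```python
# E_k(.) = AES_{H(k)}(.) variant: the white-box program embeds h = H(k),
# never k itself, so "extracting the key" means inverting SHA-256.

import hashlib, hmac

def make_whitebox(k: bytes):
    h = hashlib.sha256(k).digest()       # h = H(k): the only key-derived value embedded
    def wb_encrypt(m: bytes) -> bytes:   # plays the role of the simple AES_h implem
        return hmac.new(h, m, hashlib.sha256).digest()
    return wb_encrypt

k = b"master key"
wb = make_whitebox(k)
# Functionality matches the reference E_k, yet the code/data only contain h:
ref = hmac.new(hashlib.sha256(k).digest(), b"msg", hashlib.sha256).digest()
assert wb(b"msg") == ref
```

This is exactly why unbreakability alone is too weak a goal: the scheme above is trivially UBK-secure while offering no protection against code-lifting.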

SLIDE 36

What could we expect?

∎ Code-lifting cannot be avoided
▸ the adversary can always use the software
∎ Code-lifting could be made unavoidable
▸ force the adversary to use the software
∎ The software should then constrain the adversary
▸ be less convenient to distribute
▸ have restricted functionalities
▸ include security features

SLIDE 37

Less convenient to distribute

∎ Example: make the implementation huge and incompressible

[Diagram: WB-AESk is > 10 GB, while a plain AESk implementation is < 10 KB]

∎ Possible use case: DRM

SLIDE 38

Restrict the software functionalities

∎ Example: make the implementation one-way

[Diagram: A can run WB-AESk to map m ↦ c but cannot invert c ↦ m]

∎ Namely: turning AES into a public-key cryptosystem
∎ Possible use case: light-weight signature scheme

SLIDE 39

Include security features

∎ Example: adding a password

WB-AESk,π(π̂, m): if (π̂ == π) return AESk(m) else return ⊥

[Diagram: A queries (π̂, m) and gets c = AESk(m) only for the right password; brute force takes time O(2^∣π∣)]

∎ WB implem ⇒ software secure element
∎ Possible use case: payment with token
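A minimal sketch of the password feature, again with HMAC standing in for AESk (illustrative construction, not the talk's; in a real white-box the comparison itself would also have to be obfuscated):

```python
# Password-protected white-box: the program releases an encryption only when
# given the right PIN, so an attacker with full code access still pays O(2^|pi|)
# guesses -- a "software secure element".

import hashlib, hmac, itertools

def make_wb(k: bytes, pi: bytes):
    def wb(pi_hat: bytes, m: bytes):
        return hmac.new(k, m, hashlib.sha256).digest() if pi_hat == pi else None
    return wb

wb = make_wb(b"secret key", pi=b"07")

# Attacker who can run the program but does not know pi must brute-force it:
for guess in itertools.product("0123456789", repeat=2):
    pi_hat = "".join(guess).encode()
    if wb(pi_hat, b"msg") is not None:
        break
assert pi_hat == b"07"   # 2-digit PIN found after at most 10^2 queries
```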

SLIDE 40

Include security features

∎ Example: include a tracing mechanism

[Diagram: from white-box instances WB-AESk,id1, WB-AESk,id2, ..., WB-AESk,idt, an adversary A produces a pirate program Π ≡ AESk(⋅); the tracer T recovers some id ∈ {id1, id2, ..., idt}]

∃ T st ∀ A ∶ WB-AESk,id ↦ Π ≡ AESk(⋅) ⇒ T(Π) = id

∎ Possible use case: pay-TV

SLIDE 42

White-box security notions

SLIDE 43

Security notions for symmetric ciphers

∎ Encryption scheme: E = (K, M, E, D)
▸ E, D ∶ K × M → M
▸ E(k,⋅) = D(k,⋅)⁻¹
∎ White-box compiler: CE ∶ (k, r) ↦ [E^r_k] ≡ E(k,⋅)
∎ Attack model:
▸ target: a white-box encryption program [Ek] = CE(k, $)
▸ CPA (chosen plaintext attack): unavoidable
▸ CCA (chosen ciphertext attack): oracle for D(k,⋅)
▸ RCA (recompilation attack): oracle for CE(k, $)
∎ Attack goals:
▸ break (extract k), compress, inverse, be untraced

SLIDE 44

Unbreakability

[Game: the challenger draws k ← $, r ← $ and sends [E^r_k] = CE(k, r) to the adversary A; A outputs a guess k̂ and wins if k̂ = k. In UBK-CCA, A also gets a decryption oracle c′ ↦ m′ = D(k, c′); in UBK-RCA, a recompilation oracle [E^r′_k] ← CE(k, $)]

CE is (τ, ε)-secure wrt UBK-{CPA/CCA/RCA} ⇔ ∀ A running in time τ: Pr[k̂ = k] ≤ ε
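The UBK experiment can be written as a self-contained harness (illustrative names; the cipher is an HMAC stand-in and the "compiler" is the trivially-unbreakable key-hashing one from earlier, not a real white-box AES):

```python
# UBK-CPA game: the challenger compiles a white-box program for a random key
# and the adversary must output k_hat == k.

import secrets, hashlib, hmac

def E(k: bytes, m: bytes) -> bytes:
    """Reference scheme E(k, .), an HMAC stand-in for a block cipher."""
    return hmac.new(hashlib.sha256(k).digest(), m, hashlib.sha256).digest()

def CE(k: bytes, r: bytes):
    """White-box compiler: [E^r_k] embeds only h = H(k)."""
    h = hashlib.sha256(k).digest()
    return lambda m: hmac.new(h, m, hashlib.sha256).digest()

def ubk_cpa_game(adversary) -> bool:
    k = secrets.token_bytes(16)       # k <- $
    r = secrets.token_bytes(16)       # r <- $
    k_hat = adversary(CE(k, r))       # A gets the white-box program
    return k_hat == k                 # success event of the UBK game

# An adversary that only runs the program cannot beat random guessing:
wins = sum(ubk_cpa_game(lambda prog: b"\x00" * 16) for _ in range(100))
assert wins == 0                      # Pr[k_hat = k] ~ 2^-128 per run
```

The CCA/RCA variants would simply hand the adversary extra callables for D(k,·) and CE(k, $).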

SLIDE 45

One-Wayness

[Game: the challenger draws k ← $, r ← $, m ← $, computes c = E(k, m) and sends ([E^r_k], c) to A; A outputs m̂ and wins if m̂ = m. In OW-CCA, A also gets a decryption oracle D(k,⋅); in OW-RCA, a recompilation oracle CE(k, $)]

CE is (τ, ε)-secure wrt OW-{CPA/CCA/RCA} ⇔ ∀ A running in time τ: Pr[m̂ = m] ≤ ε

SLIDE 46

Incompressibility

∎ Distance between a program P and a function f ∶ X → Y

∆(P, f) = ∣{x ∈ X st P(x) ≠ f(x)}∣ / ∣X∣

∎ If ∆(P, f) = 0 then P ≡ f
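The distance computed literally over a small toy domain (exact fractions avoid float round-off; the example functions are illustrative):

```python
# Delta(P, f) = |{x in X : P(x) != f(x)}| / |X|

from fractions import Fraction

def distance(P, f, domain):
    bad = sum(1 for x in domain if P(x) != f(x))
    return Fraction(bad, len(domain))

X = list(range(8))
f = lambda x: x ^ 0b101                            # reference function
P_exact = lambda x: x ^ 0b101                      # P ≡ f
P_close = lambda x: 0 if x == 0 else x ^ 0b101     # wrong on the single input 0

assert distance(P_exact, f, X) == 0                # Delta = 0  =>  P ≡ f
assert distance(P_close, f, X) == Fraction(1, 8)   # "close but not equal"
```

The incompressibility game below tolerates programs at small distance δ precisely because a pirate might ship an almost-correct decryptor.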

SLIDE 47

Incompressibility

[Game: the challenger draws k ← $, r ← $ and sends [E^r_k] = CE(k, r) to A; A outputs a program P and wins if ∆(P, E(k,⋅)) ≤ δ and ∣P∣ ≤ λ. In INC-CCA, A also gets a decryption oracle D(k,⋅); in INC-RCA, a recompilation oracle CE(k, $)]

CE is (τ, ε)-secure wrt (λ, δ)-INC-{CPA/CCA/RCA} ⇔ ∀ A running in time τ: Pr[∆(P, E(k,⋅)) ≤ δ ∧ ∣P∣ ≤ λ] ≤ ε

SLIDE 48

Incompressibility

(λ, δ)-INC only makes sense for: δ ≈ 0 and ∣ref implem∣ < λ < min over (k, r) of ∣[E^r_k]∣

SLIDE 49

Toy example

∎ Encryption scheme E
▸ E ∶ (k, m) ↦ m^e ∈ G
▸ D ∶ (k, m) ↦ m^(e⁻¹ mod ω) ∈ G
▸ k = (G, ω, e)
▸ G: RSA group with secret order ω
▸ e ∈ [2, ω) coprime to ω
∎ White-box compiler CE ∶ (k, r) ↦ [E^r_k]
▸ [E^r_k] computes m^f in G
▸ blinded exponent: f = e + r ⋅ ω
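The toy scheme runs end to end with small parameters (a real deployment would use a large modulus with secret order; the small primes here are purely illustrative):

```python
# Blinded-exponent toy white-box: [E^r_k] computes m^f mod N with f = e + r*omega,
# which equals m^e because m^omega = 1 in G = Z_N^* (omega = Carmichael lambda(N)).

from math import gcd, lcm

p, q = 1009, 1013              # secret toy primes; G = Z_N^*
N = p * q
omega = lcm(p - 1, q - 1)      # exponent of G: m^omega = 1 mod N when gcd(m, N) = 1
e = 65537
assert gcd(e, omega) == 1      # e coprime to omega

def compile_E(e, omega, r):
    """White-box compiler C_E: the program is 'compute m^f mod N'."""
    return e + r * omega       # blinded exponent f

f = compile_E(e, omega, r=123456)
m = 4242
assert gcd(m, N) == 1
c = pow(m, e, N)               # reference E(k, m)
assert pow(m, f, N) == c       # functionality: [E^r_k] ≡ E(k, .)

d = pow(e, -1, omega)          # decryption: D(k, c) = c^(e^-1 mod omega)
assert pow(c, d, N) == m
# Recovering e (or omega) from f amounts to learning the group order:
# this is the ORD[G] assumption behind the INC claim on the next slide.
```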

SLIDE 50

Toy example

∎ CE is OW-CPA under RSA[G]
▸ RSA[G]: it’s hard to compute x^(1/e) for x ←$ G
∎ CE is (λ, 0)-INC-CPA (with λ ≈ log f) under ORD[G]
▸ ORD[G]: it’s hard to compute the order of x ←$ G
▸ wrt an adversary producing algebraic programs

SLIDE 51

Toy example

∎ Disclaimer: toy example
▸ OW part = RSA
▸ INC part inefficient (linear in the size)
∎ Designing E with (efficient) OW CE = designing a PK encryption scheme
∎ Designing E with (efficient) INC CE = designing an incompressible encryption scheme
∎ White-box crypto is about designing a compiler for an existing encryption scheme
∎ Real challenge: design a OW and/or INC compiler for AES

SLIDE 52

Traceability

∎ White-box implem of the decryption (pay-TV use case)
∎ Principle: include secret perturbations of the decryption functionality

[D^r_{k,C}] = CE(k, r; C)

where [D^r_{k,C}](c) = ⊥ if c ∈ C ⊆ M, and Dk(c) otherwise
SLIDE 53

Traceability

∎ Perturbation-Value Hiding (PVH) security:

[Game: the challenger draws k ← $, r ← $, computes [D^r_{k,C}] = CE(k, r; C), draws c ←$ C and sends ([D^r_{k,C}], c) to A; A outputs m̂ and wins if m̂ = D(k, c). In the RCA variant, A also gets a recompilation oracle [D^r′_{k,C′}] ← CE(k, $; C′) for sets C′ ⊇ C]

CE is (τ, ε)-secure wrt C-PVH ⇔ ∀ A running in time τ: Pr[m̂ = D(k, c)] ≤ ε

slide-54
SLIDE 54

Traceability

∎ User i gets Pi = CE(k, ri; Ci)
▸ for random sets C1 ⊆ C2 ⊆ ⋯ ⊆ Cn ⊆ M
∎ Pirate program from t traitors: Π = A(Pi1, Pi2, ..., Pit)
▸ with ∆(Π, D(k,⋅)) negligible
∎ PVH security ⇒ linear tracing procedure

p(i) = Pr[c ←$ Ci ∖ Ci−1 ∶ Π(c) = D(k, c)]

[Plot: p(i) against i, dropping from 1 at the traitor indices i1, i2, i3]
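The linear tracing procedure can be walked through on a toy decryptor (everything below is a stand-in: the "cipher" is a dict, the nested sets are deterministic rather than random, and the pirate simply redistributes one traitor's program):

```python
# Linear tracing: user i's program is perturbed on C_i, so a pirate built
# from user j answers wrong on C_j and right outside it; measuring p(i) on
# the rings C_i \ C_{i-1} pinpoints j.

M = list(range(200))                       # toy ciphertext space
D = {c: (7 * c + 1) % 200 for c in M}      # toy decryption function D(k, .)
n = 6

# Nested sets C0 = {} ⊆ C1 ⊆ ... ⊆ Cn (deterministic stand-in for random ones):
C = [set(range(10 * i)) for i in range(n + 1)]

def compile_D(Ci):
    """[D^r_{k,C}]: perturbed (here: returns None) on Ci, correct elsewhere."""
    return lambda c: None if c in Ci else D[c]

users = [compile_D(C[i + 1]) for i in range(n)]
pirate = users[3]                          # the pirate is a copy of user 3

def p(i):
    """p(i) = Pr[c <- C_i \\ C_{i-1} : Pi(c) = D(k, c)]."""
    ring = list(C[i + 1] - C[i])
    return sum(pirate(c) == D[c] for c in ring) / len(ring)

trace = [p(i) for i in range(n)]
# p(i) vanishes up to the traitor's index and jumps to 1 afterwards;
# the last index where it is 0 identifies the traitor:
traitor = max(i for i, v in enumerate(trace) if v == 0.0)
assert traitor == 3
```

PVH security is what prevents the pirate from detecting, and answering correctly on, the perturbed points, which would flatten the p(i) curve and defeat the tracer.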

SLIDE 55–58

Traceability

[Plot variants over slides 55–58: the p(i) curve when the pirate outputs the majority or the unanimous answer of the traitor programs, and its shape under PVH insecurity vs. PVH security]

SLIDE 59

Security hierarchy

∎ If E is a secure encryption scheme

INC
⇓
OW ⇒ UBK ⇐ PVH

SLIDE 60

Security hierarchy

∎ If E is a secure encryption scheme

INC
⇓
VBB ⇒ OW ⇒ UBK ⇐ PVH ⇐ VBB

SLIDE 61

Security hierarchy

∎ If E is a secure encryption scheme

VBB ⇒ INC
⇓
VBB ⇒ OW ⇒ UBK ⇐ PVH ⇐ VBB

SLIDE 62

Conclusion

SLIDE 63

Conclusion

∎ WBC can be defined as a restriction of cryptographic obfuscation
▸ subset of programs (e.g. keyed permutations)
▸ relaxed security notions
∎ More work needed to
▸ refine / define alternative security notions
▸ build candidate white-box compilers
∎ Open challenge: INC/OW/PVH-implementation of AES

SLIDE 64

Final thoughts

∎ Industrial usage has outpaced science in the field of WBC
▸ Digital content protection (pay-TV, DRM)
▸ Mobile payments
▸ Software protection
∎ Yet no secure solution is available in the public literature
∎ Should we rely on the secret-spec model?
▸ Academic cryptographer: “over my dead body!”
▸ Industrial cryptographer: “only choice I have (for now)”
∎ Open question: who beats whom?
▸ secret-spec designer vs. state-of-the-art cryptanalyst

SLIDE 65

Biblio

∎ Obfuscation notions (VBB, iO, BPO)
▸ “On the (Im)possibility of Obfuscating Programs” (Barak et al., CRYPTO 2001)
▸ “On Best-Possible Obfuscation” (Goldwasser–Rothblum, TCC 2007)
∎ White-box crypto (introduction, first constructions)
▸ “A White-Box DES Implementation for DRM Applications” (Chow et al., DRM 2002)
▸ “White-Box Cryptography and an AES Implementation” (Chow et al., SAC 2002)
∎ Presented white-box security notions
▸ “White-Box Security Notions for Symmetric Encryption Schemes” (Delerablée et al., SAC 2013)
∎ Related works
▸ “Towards Security Notions for White-Box Cryptography” (Saxena–Wyseur–Preneel, ISC 2009)
▸ “White-Box Cryptography Revisited: Space-Hard Ciphers” (Bogdanov–Isobe, CCS 2015)
▸ “Efficient and Provable White-Box Primitives” (Fouque et al., ePrint 2016)

SLIDE 66

Questions?