Making sure crypto stays insecure

Daniel J. Bernstein

University of Illinois at Chicago & Technische Universiteit Eindhoven
Terrorist in Hong Kong prepares to throw deadly weapon at Chinese government workers. Image credit: Reuters.


Drug-dealing cartel “Starbucks” invades city in Morocco; begins selling addictive liquid. Image credit: Wikipedia.

Pedophile convinces helpless child to remove most of her clothing; sexually abuses child in public. Image credit: Child pornographer.

Criminal organization calling itself “The Guardian” sells classified government secrets. Image credit: The Guardian.

We have to watch and listen to everything that people are doing so that we can catch terrorists, drug dealers, organized criminals, pedophiles, murderers, etc.

We try to systematically monitor and record all Internet traffic. But what if it’s encrypted?

This talk gives some examples of how we’ve manipulated the world’s crypto ecosystem so that we can understand almost all of this traffic.

Other useful strategies, not covered in this talk:

  • Manipulate software ecosystem so that software stays insecure.

  • Break into computers; access hundreds of millions of disks, screens, microphones, cameras.

  • Add back doors to hardware. e.g. 2012 U.S. government report says that Chinese-manufactured routers provide “Chinese intelligence services access to telecommunication networks”.

Some important clarifications:

  • 1. “We” doesn’t include me. I want secure crypto.

  • 2. Their actions violate fundamental human rights.

  • 3. I don’t know how much of today’s crypto ecosystem was deliberately manipulated.

This talk is actually a thought experiment: how could an attacker manipulate the ecosystem for insecurity?

Timing attacks

2005 Osvik–Shamir–Tromer: 65ms to steal Linux AES key used for hard-disk encryption. Attack process on same CPU but without privileges. Almost all AES implementations use fast lookup tables. Kernel’s secret AES key influences table-load addresses, influencing CPU cache state, influencing measurable timings of the attack process. 65ms: compute key from timings.

2011 Brumley–Tuveri: minutes to steal another machine’s OpenSSL ECDSA key. Secret branch conditions influence timings. Most cryptographic software has many more small-scale variations in timing: e.g., memcmp for IPsec MACs.

Many more timing attacks: e.g. 2014 van de Pol–Smart–Yarom extracted Bitcoin secret keys from 25 OpenSSL signatures.
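The table-lookup leak can be sketched as a toy model. This is not the real attack code, and the cache geometry here is merely a typical assumption (64-byte lines, 32-bit table entries); it only shows why a secret-indexed load is observable:

```python
# Toy model of a secret-indexed table lookup (NOT real AES):
# the first-round AES lookup is roughly T[p ^ k], so which cache
# line the load touches depends on the secret key byte k.

CACHE_LINE_BYTES = 64   # typical x86 cache line (assumption)
ENTRY_BYTES = 4         # AES T-tables hold 32-bit entries

def touched_cache_line(key_byte: int, plaintext_byte: int) -> int:
    """Cache line (relative to the table base) hit by T[p ^ k]."""
    index = plaintext_byte ^ key_byte
    return (index * ENTRY_BYTES) // CACHE_LINE_BYTES

# A 256-entry table spans 16 cache lines; an attacker who observes
# which line is hot (e.g. via prime+probe timing) learns the top
# 4 bits of p ^ k, hence the top 4 bits of the key byte for known p.
lines = {touched_cache_line(0x2A, p) for p in range(256)}
print(len(lines))  # 16 distinct cache lines
```

Repeating this over many plaintexts, and over later rounds, is what turns cache-line observations into the full key.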

Manufacture public denials that such attacks exist. Maybe terrorists Alice and Bob won’t try to stop the attacks.

2001 NIST “Report on the development of the Advanced Encryption Standard (AES)”: “A general defense against timing attacks is to ensure that each encryption and decryption operation runs in the same amount of time. … Table lookup: not vulnerable to timing attacks.”

2008 RFC 5246 “The Transport Layer Security (TLS) Protocol Version 1.2”: “This leaves a small timing channel, since MAC performance depends to some extent on the size of the data fragment, but it is not believed to be large enough to be exploitable, due to the large block size of existing MACs and the small size of the timing signal.”

2013 AlFardan–Paterson “Lucky Thirteen: breaking the TLS and DTLS record protocols”: exploit these timings; steal plaintext.
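The kind of “small timing channel” being waved away is easy to reproduce. A minimal sketch, with an operation counter standing in for wall-clock time (this is an illustration, not the actual TLS record-processing code):

```python
def leaky_compare(a: bytes, b: bytes):
    """Early-exit comparison, memcmp-style.

    Returns (equal, ops), where ops counts how many byte
    comparisons ran before the function returned.
    """
    ops = 0
    for x, y in zip(a, b):
        ops += 1
        if x != y:
            return False, ops
    return len(a) == len(b), ops

secret = b"correct-mac-tag!"  # hypothetical 16-byte MAC tag

# Work done (hence time taken) grows with the matching prefix --
# exactly the per-byte signal an attacker averages out of the noise
# to recover the secret one byte at a time.
_, ops_wrong_first = leaky_compare(b"x" + secret[1:], secret)
_, ops_wrong_last = leaky_compare(secret[:-1] + b"x", secret)
print(ops_wrong_first, ops_wrong_last)  # 1 16
```

Lucky Thirteen exploits the same principle one level up: the number of compression-function calls in the MAC check depends on secret padding.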

Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract.

What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings!
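An equality check built solely from such instructions (XOR and OR) looks like the sketch below; Python stands in for the machine-level idea. In real code one would reach for the standard library’s `hmac.compare_digest`, which serves exactly this purpose:

```python
def ct_equal(a: bytes, b: bytes) -> bool:
    """Equality check with no secret-dependent branch or early exit:
    only XOR and OR touch the data, and every byte is always processed."""
    if len(a) != len(b):   # lengths are assumed to be public
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y      # accumulates any mismatch; never exits early
    return diff == 0
```

The running time depends only on the (public) length, never on where the first mismatch occurs, so the per-byte timing signal from the previous example disappears.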

2013 AlFardan–Paterson “Lucky Thirteen: breaking the TLS and DTLS record protocols”: exploit these timings; steal plaintext. Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.”


all size “Lucky TLS and exploit plaintext. Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.” Fund variable-time software, maybe with “countermeasures” that make the timings difficu for researchers to analyze but that are still breakable with our computer resources.

slide-61
SLIDE 61

Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings! Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.” Fund variable-time software, maybe with “countermeasures” that make the timings difficult for researchers to analyze but that are still breakable with our computer resources.

slide-62
SLIDE 62

Some instructions have no data flow from their inputs to CPU timings: e.g., logic instructions, constant-distance shifts, multiply (on most CPUs), add, subtract. What if Alice and Bob use crypto software built solely from these instructions? Yikes: we won’t see anything from timings!

Try to scare implementors away from constant-time software. e.g. “It will be too slow.” “It’s too hard to write.”

Fund variable-time software, maybe with “countermeasures” that make the timings difficult for researchers to analyze but that are still breakable with our computer resources.

Continue expressing skepticism that constant time is needed. e.g. 2012 Mowery–Keelveedhi–Shacham “Are AES x86 cache timing attacks still feasible?”, unfortunately shredded by 2014 Irazoqui–Inci–Eisenbarth–Sunar “Wait a minute! A fast, cross-VM attack on AES”.
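The idea can be sketched in a few lines (illustrative only: Python itself makes no timing guarantees, so real code would use `hmac.compare_digest` or carefully written C/assembly):

```python
# Constant-time-style comparison: only XOR/OR on every byte, with no
# data-dependent branch or early exit, so the work performed does not
# depend on where -- or whether -- the inputs differ.
def ct_equal(x: bytes, y: bytes) -> bool:
    if len(x) != len(y):      # lengths are treated as public here
        return False
    diff = 0
    for a, b in zip(x, y):
        diff |= a ^ b         # accumulates any difference
    return diff == 0
```

Contrast with the naive `x == y`, which typically stops at the first mismatching byte and so leaks the length of the matching prefix through timing.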

slide-66
SLIDE 66

What if terrorists Alice and Bob use a different cipher for which constant-time implementations are simple and fast? Yikes!

Don’t standardize that cipher. e.g. choose Rijndael as AES, not higher-security Serpent. Watch out for any subsequent standardization efforts. Discourage use of the cipher. Pretend that standardization is a guarantee of security while anything non-standard has questionable security.
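One such cipher (our example; the slide names none) is Bernstein’s own Salsa20/ChaCha family: the whole cipher is 32-bit add, fixed-distance rotate, and XOR, so the obvious implementation is already constant-time. The ChaCha quarter-round:

```python
# ChaCha quarter-round: nothing but 32-bit add, XOR, and constant-
# distance rotation -- exactly the instructions with no data flow
# from inputs to CPU timings. No table lookups, no branches.
MASK = 0xffffffff

def rotl32(x: int, r: int) -> int:
    return ((x << r) | (x >> (32 - r))) & MASK

def quarter_round(a: int, b: int, c: int, d: int):
    a = (a + b) & MASK; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK; b = rotl32(b ^ c, 7)
    return a, b, c, d
```

The implementation matches the quarter-round test vector in RFC 8439.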

slide-70
SLIDE 70

Padding oracles

1998 Bleichenbacher: Decrypt SSL RSA ciphertext by observing server responses to ≈10^6 variants of ciphertext. SSL first inverts RSA, then checks for “PKCS padding” (which many forgeries have). Subsequent processing applies more serious integrity checks. Server responses reveal pattern of PKCS forgeries; pattern reveals plaintext.
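The check that creates the oracle looks roughly like this (hypothetical sketch; the function name is ours):

```python
# PKCS#1 v1.5 framing test on an RSA-decrypted block: 0x00 0x02, at
# least 8 nonzero padding bytes, then a 0x00 separator. Many random
# forgeries pass this test, and the server's accept/reject pattern is
# the oracle that Bleichenbacher's attack queries about a million times.
def looks_like_pkcs1_v15(m: bytes) -> bool:
    if len(m) < 11 or m[0] != 0x00 or m[1] != 0x02:
        return False
    sep = m.find(0, 2)      # first 0x00 after the 0x02 marker
    return sep >= 10        # implies at least 8 nonzero padding bytes
```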

slide-74
SLIDE 74

Design cryptographic systems so that forgeries are sent through as much processing as possible. e.g. Design SSL to decrypt and check padding before checking a serious MAC. Broken by padding-oracle attacks such as BEAST and POODLE. e.g. Design “encrypt-only” IPsec options. Broken by 2006 Paterson–Yau for Linux and 2007 Degabriele–Paterson for RFCs.
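The two orders of operations can be contrasted in toy form (the decryption step is elided and all names are ours; real TLS details differ):

```python
import hmac
import hashlib

def mac(key: bytes, data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

def pkcs7_ok(pt: bytes) -> bool:
    n = pt[-1]
    return 1 <= n <= 16 and pt[-n:] == bytes([n]) * n

# The slide's anti-pattern: a forgery reaches the padding check first,
# and the two failure modes are distinguishable -- a padding oracle.
def pad_then_mac_receive(key: bytes, pt: bytes, tag: bytes) -> bytes:
    if not pkcs7_ok(pt):
        raise ValueError("bad padding")              # observable error #1
    body = pt[:-pt[-1]]
    if not hmac.compare_digest(mac(key, body), tag):
        raise ValueError("bad MAC")                  # observable error #2
    return body

# Encrypt-then-MAC order: every forgery dies at one uniform check,
# before padding (or anything else) processes attacker-chosen bytes.
def etm_receive(key: bytes, ct: bytes, tag: bytes) -> bytes:
    if not hmac.compare_digest(mac(key, ct), tag):
        raise ValueError("bad record")               # single failure mode
    return ct   # decrypt and unpad only after authentication
```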

slide-78
SLIDE 78

Randomness

1995 Goldberg–Wagner: Netscape SSL keys had <50 bits of entropy.

2008 Bello: Debian/Ubuntu OpenSSL keys for years had <20 bits of entropy.

2012 Lenstra–Hughes–Augier–Bos–Kleinjung–Wachter and 2012 Heninger–Durumeric–Wustrow–Halderman broke the RSA public keys for 0.5% of all SSL servers. The primes had so little randomness that they collided.

slide-82
SLIDE 82

Make randomness-generation code extremely difficult to audit. Have each application maintain its own RNG “for speed”. Maintain separate RNG code for each application. “For simplicity” build this RNG in ad-hoc ways from the inputs conveniently available to that application. Pay people to use backdoored RNGs such as Dual EC. Claim “provable security”.

slide-87
SLIDE 87

What if the terrorists merge all available inputs into a central entropy pool? This pool can survive many bad/failing/malicious inputs if there is one good input. Merging process is auditable. Yikes!

Claim performance problems in writing to a central pool, reading from a central pool. Modify pool to make it unusable (random) or scary (urandom).
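A toy version of such a pool (hash-based stand-in, not a production design; all names are ours):

```python
import hashlib

# Central entropy pool: every source is hashed into one state. The
# output is unpredictable as long as ANY absorbed input was; bad or
# malicious inputs cannot cancel out a good one. The merging logic
# is a few auditable lines.
class EntropyPool:
    def __init__(self) -> None:
        self.state = bytes(32)

    def absorb(self, source_bytes: bytes) -> None:
        self.state = hashlib.sha256(self.state + source_bytes).digest()

    def extract(self, n: int) -> bytes:
        # n <= 32 in this sketch; step the state so outputs
        # do not reveal it.
        out = hashlib.sha256(b"out" + self.state).digest()[:n]
        self.state = hashlib.sha256(b"step" + self.state).digest()
        return out
```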

slide-91
SLIDE 91

What if the terrorists realize that RNG speed isn’t an issue? Make it an issue! Design crypto to use randomness as often as possible. This also complicates tests, encouraging bugs. e.g. DSA and ECDSA use a new random number k to sign m; could have replaced k with H(s; m). 1992 Rivest: “the poor user is given enough rope with which to hang himself”. 2010 Bushing–Marcan–Segher–Sven “PS3 epic fail”: PS3 forgeries.
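The H(s; m) replacement can be sketched directly (SHA-256 as a stand-in hash, `q` for the group order; deterministic nonces were later standardized along these lines in RFC 6979):

```python
import hashlib

# Derive the DSA/ECDSA per-signature number k from the long-term
# secret s and the message m instead of from an RNG. The same (s, m)
# always gives the same k, so signing becomes testable, and two
# different messages never silently share a k -- exactly the failure
# behind the PS3 forgeries.
def deterministic_k(s: bytes, m: bytes, q: int) -> int:
    h = hashlib.sha256(s + hashlib.sha256(m).digest()).digest()
    return 1 + int.from_bytes(h, "big") % (q - 1)   # force 0 < k < q
```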

slide-98
SLIDE 98

Pure crypto failures

2008 Stevens–Sotirov–Appelbaum–Lenstra–Molnar–Osvik–de Weger exploited MD5 ⇒ rogue CA for TLS. 2012 Flame: new MD5 attack.

Fact: By 1996, a few years after the introduction of MD5, Preneel and Dobbertin were calling for MD5 to be scrapped.

We managed to keep MD5. How? Speed; standards; compatibility.



slide-104
SLIDE 104

2014: DNSSEC uses RSA-1024 to “secure” IP addresses. e.g. the dnssec-deployment.org address is signed by RSA-1024.

Fact: Analyses in 2003 concluded that RSA-1024 was breakable; e.g., 2003 Shamir–Tromer estimated 1 year, ≈10^7 USD.

DNSSEC’s main excuse for sticking to RSA-1024: speed. “Tradeoff between the risk of key compromise and performance.”
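A back-of-the-envelope way to see why the 2003 analyses condemned RSA-1024 is the standard GNFS cost heuristic L_N[1/3, (64/9)^(1/3)]. The model below is a rough sketch (constants and o(1) terms ignored), not the Shamir–Tromer hardware design:

```python
import math

def gnfs_cost(bits: int) -> float:
    """Heuristic general-number-field-sieve cost L_N[1/3, (64/9)^(1/3)]
    for factoring a `bits`-bit modulus N."""
    ln_n = bits * math.log(2)        # ln N for a `bits`-bit modulus
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

ratio = gnfs_cost(2048) / gnfs_cost(1024)

# Doubling the key size buys roughly 2^30 more heuristic security,
# which is why RSA-1024 looked breakable for ~10^7 USD of 2003 hardware
# while RSA-2048 did not.
assert 2 ** 25 < ratio < 2 ** 35
```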


slide-109
SLIDE 109

How to convince terrorists that secure crypto is too slow? Many techniques: obsolete data, incompetent benchmarks, fraud.

Example: “PRESERVE contributes to the security and privacy of future vehicle-to-vehicle and vehicle-to-infrastructure communication systems by addressing critical issues like performance, scalability, and deployability of V2X security systems.” (preserve-project.eu)


slide-113
SLIDE 113

“[In] most driving situations … the packet rates do not exceed 750 packets per second. Only the maximum highway scenario … goes well beyond this value (2,265 packets per second). … Processing 1,000 packets per second and processing each in 1 ms can hardly be met by current hardware. As discussed in [32], a Pentium D 3.4 GHz processor needs about 5 times as long for a verification … a dedicated cryptographic co-processor is likely to be necessary.”


slide-119
SLIDE 119

Compare to “NEON crypto” on a 1GHz Cortex-A8 core: 5.48 cycles/byte (1.4 Gbps), 2.30 cycles/byte (3.4 Gbps) for Salsa20, Poly1305; 498349 cycles (2000/second), 624846 cycles (1600/second) for Curve25519 DH, verify.

A 1GHz Cortex-A8 was a high-end smartphone core in 2010: e.g., Samsung Exynos 3110 (Galaxy S); TI OMAP3630 (Motorola Droid X); Apple A4 (iPad 1/iPhone 4). 2013: Allwinner A13, $5 in bulk.


slide-124
SLIDE 124

What if the terrorists hear about fast secure crypto? Yikes! Similar to the constant-time story. Don’t standardize good crypto. Discourage use of good crypto.

If the good crypto persists, try to bury it behind a huge menu of bad options. Advertise “cryptographic agility”; actually cryptographic fragility. Pretend that this “agility” justifies using breakable crypto.

slide-130
SLIDE 130

Precomputed signatures

Try to build cryptographic fragility into many layers of the system. e.g. Complicate the protocols. Split cryptographic security into “the easy problem” of protecting integrity and “the hard problem” of protecting confidentiality.

e.g. argue against encrypted SNI since DNS is unencrypted, and argue against encrypted DNS since SNI is unencrypted.


slide-136
SLIDE 136

Solve “the easy problem” by precomputing signatures. Insist that the protocol allow precomputation “for speed”. e.g. DNSSEC.

The protocol has trouble handling dynamically generated answers, and unpredictable questions; also, trouble guaranteeing freshness. Deployment hits many snags.

Argue that it’s too early to look at “the hard problem” when most data is still unsigned.


slide-139
SLIDE 139

More strategies

Divert “crypto” funding and human resources into activities that don’t threaten mass surveillance. Set up centralized systems encrypting data to companies that collaborate with us. More distraction: build systems breakable by active attacks. Declare crypto success without encrypting the Internet.
