Secure Hardware


SLIDE 1

Secure Hardware

HOW CAN WE PROTECT OUR HARDWARE? HOW CAN OUR HARDWARE PROTECT ITSELF?

SLIDE 2

O’Reilly

SLIDE 3

Trusted or Trustworthy?

  • An object is trusted if and only if it operates as expected.
  • An object is trustworthy if and only if it is proven to operate as expected.
  • Peter Neumann (paraphrased)
SLIDE 4

SLIDE 5

What Do TPMs Do? (Trusted Computing)

  • Store secrets (keys)
  • Protect secrets against infected systems
  • Provide a repository for trusted system measurements that a compromised platform cannot lie about when interrogated
  • TPMs can best be described as platform trust coprocessors
  • TPMs are also very useful in enabling verified boot of OSes and hypervisors
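The idea of protecting secrets against an infected system can be sketched with a minimal toy model (this is not the TPM 2.0 API; every name here is hypothetical): a secret-unsealing key is derived from a root secret bound to an expected PCR value, so a platform whose boot measurements differ derives a different, useless key.

```python
import hashlib
import hmac

def seal_key(root_secret: bytes, expected_pcr: bytes) -> bytes:
    # The unsealing key is bound to an expected PCR value: a platform
    # whose measurements (and hence PCR) differ derives a different key.
    return hmac.new(root_secret, expected_pcr, hashlib.sha256).digest()

root_secret = b"device-unique secret that never leaves the TPM"
good_pcr = hashlib.sha256(b"expected boot measurements").digest()
bad_pcr = hashlib.sha256(b"tampered boot measurements").digest()

# An infected boot chain changes the PCR, so the derived key changes too.
assert seal_key(root_secret, good_pcr) != seal_key(root_secret, bad_pcr)
```

A real TPM performs this binding in hardware, releasing the sealed secret only when the live PCR contents match the sealing policy.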

SLIDE 6

Other Useful Features of TPMs

  • Policy-protected persistent secure storage (for keys, certificates, and other data)
  • Keys
    – Generate, store, and use symmetric and asymmetric keys
    – Key hierarchy: key-encrypting keys, platform keys, endorsement keys, storage root keys
    – Key cache
    – Key migration
    – Key usage policy enforcement
  • Random number generation
  • Limited crypto in TPM 2.0 (sign, verify, encrypt, decrypt)
  • Most TPM chips are FIPS 140-2 validated
  • Common Criteria protection profile for the chips themselves (not the systems they are in)

IBM Confidential
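The key-hierarchy idea can be illustrated with a minimal sketch (again a toy model, not the TPM key-derivation scheme; the labels are hypothetical): each child key is derived under its parent, so only the root of the hierarchy needs hardware protection.

```python
import hashlib
import hmac

def derive_child(parent_key: bytes, label: bytes) -> bytes:
    # Each key is derived under its parent, so children never need to be
    # stored: they can be re-derived on demand from the protected root.
    return hmac.new(parent_key, label, hashlib.sha256).digest()

storage_root_key = hashlib.sha256(b"secret held inside the TPM").digest()
key_encrypting_key = derive_child(storage_root_key, b"KEK/storage")
data_key = derive_child(key_encrypting_key, b"records/2024")
```

Compromising a leaf key exposes only what that key protects; the storage root key, and everything derived under other labels, stays safe.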

SLIDE 7

Platform Configuration Registers (PCRs)

  • In volatile storage in the TPM
  • SHA-256 cumulative hash
  • Initialized to zero at TPM initialization
  • NEVER written to directly, always extended:

PCR_new = SHA-256(PCR_old || new value)
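The extend rule above can be sketched directly (a toy model of a PCR using only hashlib, not a TPM interface):

```python
import hashlib

def pcr_extend(pcr_old: bytes, measurement: bytes) -> bytes:
    # PCR_new = SHA-256(PCR_old || measurement): the register is never
    # written directly, only folded forward.
    return hashlib.sha256(pcr_old + measurement).digest()

pcr = bytes(32)  # initialized to zero at TPM initialization
pcr = pcr_extend(pcr, hashlib.sha256(b"bootloader image").digest())
pcr = pcr_extend(pcr, hashlib.sha256(b"kernel image").digest())
# No write can set the PCR to an arbitrary value: the final value depends
# on every measurement extended so far, and on their order.
```

Because SHA-256 is one-way, malicious software cannot "undo" an earlier extend to hide the fact that it was measured.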

SLIDE 8

SLIDE 9

SLIDE 10

SLIDE 11

Because a PCR holds a cumulative hash, its value alone does not reveal what was measured or in what order; the platform must also send its measurement log so the verifier can replay the extends.
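A verifier's replay of the log can be sketched as follows (a toy model continuing the PCR sketch; not a real attestation protocol):

```python
import hashlib

def replay_log(event_log):
    # Recompute the PCR a verifier should expect by replaying every
    # extend recorded in the measurement log, in order.
    pcr = bytes(32)
    for event in event_log:
        pcr = hashlib.sha256(pcr + hashlib.sha256(event).digest()).digest()
    return pcr

log = [b"bootloader", b"kernel", b"initrd"]
expected_pcr = replay_log(log)
# Reordering (or editing) the log yields a different value, so a log that
# disagrees with the TPM-signed PCR quote is caught immediately.
assert replay_log([b"kernel", b"bootloader", b"initrd"]) != expected_pcr
```

The trust anchor is the PCR value signed by the TPM, not the log itself: the log is untrusted input that merely explains how the trusted value arose.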

SLIDE 12

Extreme HW Security

SLIDE 13

The Threat Model

– Who are the attackers?
  • buggy software from business partners (intent is not required!)
  • system administrators?
  • employee insiders?
  • thrill-seeking hackers?
  • terrorists?
  • organized nation-states?
– What resources do they have?
– What attacks are anticipated?

SLIDE 14

What can a successful attacker gain?

– notoriety
– money
– life and limb of others
– a marketable identity
– defeat of an enemy in wartime
– control of
  • a flight deck?
  • the brake system?
  • the on-board movie?
  • the radio of the BMW next to me in traffic?

SLIDE 15

How will my product defend itself?

Physical
– its own enclosure
– the enclosure of the system it’s in
– a secure environment (inside the Pentagon)
– inspection / human beings / dogs

Logical / software
– Encryption of sensitive material
– Authentication of operators
– Integrity of incoming commands and software
– Integrity of on-board memory
– Integrity of the bootstrap / factory initialization

SLIDE 16

Why secure hardware?

  • Because we are all doing . . .
    – Increasingly important operations
    – In increasingly distributed environments
    – That are increasingly open
  • As a consequence . . .
    – We need to trust machines we cannot control
    – And to which motivated adversaries may have direct access

SLIDE 17

What is a secure coprocessor?

  • A general-purpose computing environment that withstands physical and logical attacks
  • It runs only the programs it’s supposed to
  • It runs them unmolested
  • One can (remotely) tell the difference between the real program on the real thing and a clever impersonator
  • An attacker might carry out destructive analysis on one or more devices, yet not break the security of the whole system
  • Usually incorporates high-performance crypto, but it is not just a fast crypto box

SLIDE 18

What do applications need from secure hardware?

  • Acceleration of security operations (e.g. cryptography, random number generation)
  • Physical protection of information assets
    – Cryptographic keys
    – Electronic valuables (e.g. e-cash, postage, coupons)
    – Software (e.g. meters, risk calculations)
  • Enablement of network security operations (e.g. intrusion detection)

SLIDE 19

Some applications for secure hardware

  • Network security
    – Intrusion and virus defense, filtering, virtual private networks
  • E-commerce
    – Electronic payments, e-postage/tickets/coupons, smart card personalization
  • Data centers
    – Secure databases (e.g. healthcare, corporate secrets), entertainment content
  • Banking and finance
    – Securities trading, transaction processing and funds transfer
  • Government and military
    – Benefits transfer, government suppliers, high-assurance systems for defense, intelligence, justice

SLIDE 20

IBM PCI-X Cryptographic Coprocessor (PCIXCC)

▪ Announced in September 2003
▪ Greatly improved performance
▪ PCI-X and network interface
▪ Same physical / logical security feature set as the 4758
▪ Certified FIPS 140-2 Level 4

SLIDE 21

4764 Hardware Architecture

[Block diagram, flattened in extraction: CPU (266+ MHz PowerPC SoC), time-of-day clock, FLASH, DRAM (64MB ECC), ROM, battery-backed RAM, proprietary memory interlock, I/O (PCI-X, Ethernet, serial port), tamper-responding membrane; proprietary pipelining hardware, DES/TDES engine, modular math engine, SHA-1 engine, hardware random number generator, active tamper-detection and response circuitry]

New pipeline and crypto ASICs with much better performance (e.g. 50X in some cases)

SLIDE 22 (20 January 2006)

Daughter card

SLIDE 23

Inner copper enclosure

SLIDE 24

Tamper-sensing mesh/membrane

SLIDE 25

Outer copper shell and potting material

SLIDE 26

Completed assembly

SLIDE 27

IBM 4769 PCIe Cryptographic Coprocessor (HSM), 2019

SLIDE 28

IBM 4769 PCIe Cryptographic Coprocessor (HSM)

SLIDE 29

Matchbox: Solution Overview

  • Allows collaborative processing amongst parties that do not trust each other
  • One or more parties provide data (kept encrypted)
  • One or more parties make requests (request encrypted, processing in the secure coprocessor)
  • One or more parties get the results (transmission encrypted)

Solution Components

  • IBM 4758
  • MatchBox software
SLIDE 30

Matchbox: Solution

  • All processing and distributed results are based on automatically enforced contracts
  • Sensitive processing is confined within the secure coprocessor (IBM 4758) and is not observable from outside
  • Sensitive data outside the coprocessor is always encrypted
  • Secure matching of biometrics

[Diagram: many parties (Agency 1, Agency 2, Law Enforcement, Airline 1, Airline 2, an airport, Railroad 1, a railroad) each connect to the Matchbox service over authenticated/encrypted (4758) links; secure matching takes place inside the 4758]

SLIDE 31 (9/29/19)

FIPS 140-2 Physical Security Requirements

Level 1 – production grade
Level 2 – tamper evident
Level 3 – tamper resistant
Level 4 – tamper responding

SLIDE 32

FIPS 140-2 Level 1 – Production Grade analogy

Ordinary snap-cap bottles

SLIDE 33

FIPS 140-2 Level 2 – Tamper Evident analogy

Tamper-evident bottle caps
▪ alert the user to tampering
▪ require inspection (rely on humans for protection)

SLIDE 34

FIPS 140-2 Level 3 – Tamper Resistant analogy

Child-resistant medicine bottle caps resist opening by unauthorized users (children). Tamper resistance → the device begins to protect itself.

SLIDE 35

FIPS 140-2 Level 4 – Tamper Responding analogy

The PillSafe™ stacks drug tablets next to a stable chemical reactant that can destroy the drugs. Attempts to force the mechanism or penetrate the bottle cause instant destruction of the medication. Tamper responding → the device protects itself.

See http://www.healthcarepackaging.com/archives/2007/03/futuristic_pill_container_zaps.php

SLIDE 36

FIPS Level 1 (out of 4) – the bare minimum

Requirements are mainly cumulative across levels, with a few exceptions.

  • Mainly algorithmic compliance (“our AES is compatible with yours”)
  • Limited testing outside algorithm verification
  • An important indication if certified (but check the security policy)
  • Double-check algorithm certificates (key sizes, modes, etc.)
  • State transition diagrams
SLIDE 37

FIPS Level 2 (out of 4)

  • Level 1 and . . .
  • Crypto users are identified, acting in roles
  • Not very useful for software modules
  • Limited use for modules which provide only raw operations
  • Must authenticate the user’s role
  • Visible tamper evidence is not very relevant today:
    – One does not actually see most devices interacted with
    – Most devices themselves have disappeared into systems (or onto chips)
    – This was still feasible when FIPS 140-1 was written
    – Still useful in some restricted environments (example: ATMs)

SLIDE 38

FIPS and CC roles and identities

  • Users must be identified / authenticated somehow
  • Users can be companies or software, not necessarily human beings, e.g.
    – A human being in Personnel
    – IBM as a microcode and operating system loader
    – An operating system as an application loader

SLIDE 39

FIPS Level 3 (out of 4)

  • Level 2 and . . .
  • Infeasible for practical software, unless protected by a Level 3+ enclosure (plus other features)
  • Additional, extensive testing over Levels 1-2
  • User separation / identification required (more than in Level 2)
SLIDE 40

FIPS Levels 3-4 (out of 4)

  • Sample tamper-detecting/evident box (Level 4)
  • One needs to certify the tamper-evident box itself, not the product hosting it
  • There are different, obvious hacks around this, some truly innovative
SLIDE 41

FIPS Level 4

  • Level 4: tamper response
    – Highest level achievable (much stronger than Level 3)
    – Module actively protects itself against:
      • Physical intrusion
      • Environmental parameters
      • Optionally, side-channel attacks
  • Level 4-specific features
    – Active testing for environment-related failures
    – Some formal verification required
  • Very few Level 4 devices exist (almost all from IBM)
  • Level 4 devices may be deployed to untrusted environments

SLIDE 42

Detection – Physical Unclonable Functions (PUFs)

  • Do not store the key in digital form
  • Generate the key during use, then erase it
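The PUF idea can be sketched as a toy model (the response here is simulated; a real PUF derives it from manufacturing variation and needs error correction, e.g. a fuzzy extractor, because raw readouts are noisy):

```python
import hashlib

def derive_key(puf_response: bytes, challenge: bytes) -> bytes:
    # The key is never stored in digital form: it is recomputed from the
    # physical response each time it is needed, then erased after use.
    return hashlib.sha256(puf_response + challenge).digest()

# Stand-in for a device-unique response; on real silicon this comes from
# uncontrollable process variation, so it cannot be cloned.
response = hashlib.sha256(b"simulated silicon fingerprint").digest()
key = derive_key(response, b"challenge-7")
# ... use the key ...
key = bytes(len(key))  # best-effort erasure once the operation completes
```

Since the key exists only transiently, an attacker who images the powered-off chip finds no stored key material to extract.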
SLIDE 43

COMPOSITE SECURITY

SLIDE 44

Sometimes the sum of the parts… is a HOLE!

David Safford, IBM Research

SLIDE 45

Composite Evaluation: frequently misunderstood

Common false assumptions:
– “By using a certified product/library, added components themselves become certified”
– “By layering multiple certified components, the whole system becomes certified”

Reality: unless very carefully applied, composing two evaluations can actually result in LESS security than each component had on its own.

Common false assumptions: “By using a certified product/library, added components themselves become certified” “By layering multiple certified components, the whole system becomes certified” Reality: Unless very carefully applied, composing two evaluations can actually result in LESS security than each component had on its own