Hardware Security: Trusted Execution Environments (TEEs) & Trusted Computing — Erik Poll, Digital Security, Radboud University Nijmegen



SLIDE 1

Hardware Security

Trusted Execution Environments (TEEs) & Trusted Computing

Erik Poll Digital Security Radboud University Nijmegen

slide-2
SLIDE 2
  • Classic hardware-backed security solutions
  • Today:

more modern & flexible alternatives

  • esp. for users

smartcard for users HSM (Hardware Security Module) in the back-end

SLIDE 3

Trusted Computing & TEE

  • Trusted Computing

– initiative in early 2000s to ‘secure’ PCs & laptops running a standard OS on a standard processor – using additional piece of hardware, the Trusted Platform Module (TPM) – led by industry alliance Trusted Computing Group (TCG)

  • Trusted Execution Environments (TEEs)

– environment to run some code with drastically reduced TCB – using special hardware features of processor – very different architectures to do this:

  • ARM Trustzone

provides 1 secure world and 1 insecure world

  • Intel SGX

provides many secure enclaves

SLIDE 4

Goals of this lecture

Understanding

  • Security objectives
  • Technologies / architectures / techniques to reach these objectives
  • Attacker models aka threat models
  • Trusted Computing Base (TCB)

In other words: – What are the security properties? – How are these properties guaranteed? – How good are these guarantees?

SLIDE 5

Understanding TEE technologies & impact

  • Complex mechanisms involving entire execution stack:
  • HW, I/O, OS, application layer
  • Understanding security goals (what it achieves)

given the mechanisms (how it does that) can be tricky

  • Sometimes obscured by marketing blurb
  • Potentially big impact on

– trustworthiness of computing infrastructure we use every day – privacy – openness – economics & monopolies

SLIDE 6

Beware the dreaded T-word...

TEE = Trusted Execution Environment

Statements involving the word ‘trust’ are

  • likely to be bullshit
  • often wrong or vacuously true

Do you think that “trusted execution environments” exist? Can you give examples?

SLIDE 7

Example trusted execution environments

SLIDE 8

Beware the dreaded T-word...

  • All execution environments are trusted (if they are used)

by some party, for some purpose

  • The T in TEE probably means highly trustworthy

for some guarantees, for some attacker model

  • Who is trusting what or whom, and for which properties?

At least three parties are involved

  • user
  • software / service provider
  • device manufacturer (OEM)

and often more: certifiers, TTPs, CAs, ...

SLIDE 9

Trust vs Trustworthiness

  • Someone or something is trusted if they potentially can

compromise your security.

  • Someone or something is trustworthy if they will not

compromise your security.

  • Trustworthiness comes in many levels
  • Trust is a binary property

But – Trust is a binary property for some particular property

  • Eg you might have to trust X for the confidentiality of some

data, but not for the availability of that data – Defence in depth (eg having detection as well as prevention) provides different degrees of trust

SLIDE 10

Trustworthiness & TCB

  • Size & complexity of the Trusted Computing Base (TCB) is

the crucial ‘measure’ of trustworthiness

  • Ideally, TCB should be

– small

  • esp. limited amount of software

– simple – well-understood – well-analysed & maybe certified

  • ideally verified using formal methods

SLIDE 11

Motivating example: the SIM card

What are the security advantages for the telco?

  • The phone hardware & software are not in the TCB for

authentication

  • The telco does not have to trust the phone to keep crypto

keys for authentication confidential

  • The telco only has to trust the SIM

for confidentiality of keys and integrity of code

Limitation: the user has to type in the PIN code to unlock the SIM, so some phone hw & sw is in the TCB for confidentiality of the PIN

Goal of TEEs: Can we get such security guarantees without a separate processor?

And hence: without the limited resources such a separate processor has

SLIDE 12

Attacker Models for TEE

SLIDE 13

Attacker models: for crypto, protocols, software

1. Crypto attacker

wants to extract keys

2. Dolev-Yao attacker

wants to break security guarantees of a protocol

3. I/O attacker

wants to create any unwanted behaviour with malicious input – broader objectives than 1 & 2

  • eg including DoS, RCE

– lower abstraction level: looking at the code, not abstract algorithm or protocol

  • eg including buffer overflows

SLIDE 14

More powerful I/O attacker models for TEEs

  • malware attacker

access to the same platform

  • incl. common I/O channels

and persistent storage

(eg spoofing/phishing)

  • code attacker

controls (some) code used by victim app

(eg code injection attack via buffer overflow, or with malicious library)

  • platform attacker

eg malware attacker that managed to compromise the OS

TEEs also aim to protect against code & platform attacks!

SLIDE 15

Defending against code/platform attacker?

You might think that defending against code or platform attacker is hopeless. But all is not lost! Possible defences:

  • runtime integrity checks on code, eg

– stack canaries, CFI (Control Flow Integrity), ... against buffer overflows

– TPM reporting a hash of the software stack

  • sandboxing aka compartmentalisation, eg

– software-based Java (Card) sandboxing – hardware-based Flicker sessions & Intel SGX enclaves

Of course, such mechanisms do come with a TCB.

SLIDE 16

Physical attacks?

Irrelevant for online attacks, but possibly relevant in some situations. Two very different scenarios:

  • Attacker with physical access

eg removing a hard disk from a stolen laptop to access raw data (by-passing OS security)

  • User is not trusted

by service provider or device manufacturer, eg DRM (Digital Rights Management). Note: the user is now included in the attacker model

SLIDE 17

Security Goals

SLIDE 18

Goals of TEE - conceptually

  • integrity & confidentiality of code & data, even if the OS is compromised

→ isolated execution and secure storage

  • integrity & confidentiality of user I/O, even if the OS is compromised

→ trusted path for trusted I/O

SLIDE 19

First attempt at defining TEE

Platform that provides applications with the security guarantee of isolation

  • integrity of behaviour
  • integrity & confidentiality of data, at rest & during execution

against very powerful attacker

  • malware on the same platform
  • and even (partial) compromise of the application or platform

with a high level of trustworthiness

  • minimal TCB
  • ultimately relying on hardware

and mechanisms to attest to the integrity of the system

  • as basis for others to trust it

SLIDE 20

TEE security goals (1) – ‘isolation’

  • Isolated Execution

Execution of an application cannot be compromised. Integrity & confidentiality of code and of data in use.

  • Secure Storage

Integrity, confidentiality and freshness of data at rest.

  • Trusted Path: a secure path to and from the user

Integrity & confidentiality of communication

  • secure attention sequence, eg Ctrl-Alt-Delete on Windows, or the Home button on iOS & Android, is a special case of Trusted Path

This is nothing new! Any OS aims to provide these properties.

SLIDE 21

These security goals in a picture

Spoofing remains a tricky concern

  • an app can know it has exclusive use of display or keyboard, but how can the human user know who it is talking to?

SLIDE 22

TEE security goals (2) – ‘assurance’

Who & what are we dealing with? Can we trust this?

from perspective of an app, remote party, or local human user

  • Platform Integrity

– Can we trust or verify platform integrity?

  • (Remote) Attestation

– Can a (remote) party verify integrity of platform or app?

  • Identification & Authentication

– Can we authenticate the identity of a platform or app? – Ultimately, this requires some device identity

  • Secure Provisioning

– Mechanism to send data to specific software module on a specific device

  • eg for DRM, updating, or sync-ing apps across devices

SLIDE 23

How to provide 'isolation'?

  • use different physical devices

– very secure (+) and provides Trusted Path (+) – but costly & cumbersome (–)

  • classic OS

– huge TCB (–)

  • hypervisor

– smaller TCB (+)

  • (multi-application) smartcard

– even smaller TCB (+) – also some protection against physical attacks (+) – limited resources (–), no I/O to user (–), let alone Trusted Path (–)

  • TPM & TEE hardware-based solutions (+) (+) (+) ??

SLIDE 24

SIM card as TEE?

  • The phone OS is in the TCB for I/O with the user
  • The phone OS is not in the TCB for authentication to the network

[figure: Phone – Main CPU (Phone OS, App) and separate Baseband chip]

SLIDE 25

SIM card as TEE?

  • What is in the TCB when you unlock your SIM card? Upon booting the phone, the main phone OS may still be partly excluded from the TCB?

  • In any case, malware on the phone could phish this – by faking this display!

SLIDE 26

TEE vs smartcard

Unlike a separate smartcard, ideally a TEE should offer

  • fast & full memory access
  • running at full CPU speed
  • interacting with normal I/O

SLIDE 27

TEE technologies

SLIDE 28

TEE technologies (chronologically)

  • TPM
  • Flicker

– Uses TPM and SVM

  • AMD’s SVM ≈ Intel’s TXT (formerly LaGrande)
  • Intel IPT (Identity Protection Technology)
  • ARM TrustZone

– used by Trustonic ≈ Samsung KNOX

  • Trustonic also uses Global Platform idea from Java Card
  • Intel SGX

– newer, but related to the older Flicker approach

SLIDE 29

Trusted Computing & TPM

SLIDE 30

Trusted Computing

  • Initiative by industry consortium

– initially TCPA (Trusted Computing Platform Alliance), succeeded by TCG (Trusted Computing Group) including Microsoft, AMD, Intel, IBM, HP,....

  • Goal: common open spec of TPM (Trusted Platform Module)
  • TPM is separate chip on the motherboard

– that monitors the CPU & offers services to the CPU, aka protected capabilities that use shielded locations,

  • incl. authenticated boot

NB the main CPU remains in control!

SLIDE 31

Platform Integrity: Secure vs Authenticated Boot

32

[source: Ekberg et al., The Untapped Potential of Trusted Execution Environments on Mobile Devices, IEEE Security & Privacy 2014]

[figure: secure boot vs authenticated boot]

SLIDE 32

Secure vs Authenticated Boot

  • Secure Boot

At each step of the boot process, before code is loaded & executed, its integrity is checked – eg using code signing. The boot process is halted if this check fails. – The integrity checks have to be trusted, of course

  • Authenticated Boot

At each step of the boot process, a cryptographic hash of the code is computed (an integrity measurement) and chained with the earlier hash. – The boot process is never halted, but the integrity measurements can be checked later – The computation, storage & reporting of integrity measurements has to be trusted, of course

  • hence.... the TPM
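The measurement chain of authenticated boot can be sketched as follows. This is a toy model (not the actual TPM command interface): `extend` mirrors the TPM's PCR-extend operation, PCR := H(PCR || H(code)), and the stage names are hypothetical placeholders.

```python
import hashlib

def extend(pcr: bytes, code: bytes) -> bytes:
    # PCR extend: new PCR value = H(old PCR || H(measured code))
    return hashlib.sha256(pcr + hashlib.sha256(code).digest()).digest()

# authenticated boot: measure each stage before handing over control to it
pcr = bytes(32)  # PCR starts at all-zeros on reset
for stage in [b"BIOS", b"bootloader", b"OS kernel"]:
    pcr = extend(pcr, stage)

# the final value identifies the exact sequence of loaded software;
# changing any stage (or the order) yields a different measurement
tampered = bytes(32)
for stage in [b"BIOS", b"evil bootloader", b"OS kernel"]:
    tampered = extend(tampered, stage)
assert tampered != pcr
```

Because each new value hashes in the old one, a later stage cannot "undo" an earlier measurement – which is why the boot process never needs to halt: tampering is simply recorded.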

SLIDE 33

Protected Capabilities of TPM

  • Crypto, incl. secure key storage & random number generation
  • Integrity metric reporting

– chip can compute & report integrity measurements

  • stored in PCRs (Platform Configuration Registers)

– for attesting to the state of device, incl. authenticated boot

  • Special kind of secure storage: sealing of data

– access to data is conditional on the device being in a particular state

  • ie you can only access the data if the integrity measure of

the code is a certain value

  • Typical use case: DRM
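Sealing can be sketched as a policy check against the current PCR value. This toy model only captures the access-control logic; a real TPM additionally encrypts the sealed data under a chip-internal storage key, and the function names here are hypothetical.

```python
import hashlib

def seal(data: bytes, expected_pcr: bytes):
    # toy model: bind the data to the PCR value at sealing time
    # (a real TPM also encrypts the data under a storage key it never releases)
    return (expected_pcr, data)

def unseal(blob, current_pcr: bytes) -> bytes:
    bound_pcr, data = blob
    if current_pcr != bound_pcr:
        # platform is running different software than when the data was sealed
        raise PermissionError("platform state differs from sealing state")
    return data

good_pcr = hashlib.sha256(b"approved software stack").digest()
blob = seal(b"DRM content key", good_pcr)

assert unseal(blob, good_pcr) == b"DRM content key"  # same stack: released
evil_pcr = hashlib.sha256(b"modified software stack").digest()
try:
    unseal(blob, evil_pcr)                           # different stack: refused
except PermissionError:
    pass
```

This is exactly the DRM use case: the content key is only released to the measured, approved software stack.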

SLIDE 34

Using TPM for TEE?

Basic idea:

  • TPM measures a hash of all software loaded since BIOS boot, incl. the OS, and can attest to its integrity

so that

  • software running on the machine and external parties can verify

system state (remote attestation)

  • access to remote services or local data can be conditional on

system state – by using sealed storage of data

  • eg this file can only be opened for a given software stack

– by using remote attestation for remote services

  • eg attesting that this is a genuine Intel processor running a

correct version of Windows XP

SLIDE 35

Trusted Computing controversy

Lots of debate about: openness, privacy, and control

  • TPM cannot prevent user running Linux on Intel hardware,

but can prevent LibreOffice on Linux from opening .doc files – by using sealed storage

  • TPM is ‘a way for Bill Gates to make the Chinese pay for software’?
  • Privacy concern: TPM has a unique serial number

– But DAA (Direct Anonymous Attestation) allows anonymous remote attestation to reduce the privacy impact

  • attesting that eg. 'this is some legitimate copy of Windows

running on some AMD machine'

More info:

  • Ross Anderson’s FAQ

http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html

  • [Felten, Understanding Trusted Computing, IEEE Security & Privacy 2003]

SLIDE 36

Trusted Computing ?

Big practical problems built-in from start

  • Software stack is far too dynamic

– with continuous patching of OS, variety in device drivers, etc., the chance that ‘identical’ computers produce identical integrity measurement is small

  • OS is far too big to be trusted as TCB

– the idea that checking the integrity of boot sequence incl. the entire OS will ensure absence of malware is silly

  • Microsoft stopped development of NGSCB aka Palladium,

their intended ‘trusted OS’ that would use the TPM, in 2004.

  • The TPM is still used for BitLocker

SLIDE 37

Flicker & SGX

  • providing secure sessions/enclaves on the main CPU

SLIDE 38

Dynamic Root of Trust in TPM v1.2

  • TPM v1.2 added support for dynamic PCRs

– not for integrity measurement starting at boot, but for integrity measurement starting from later point in time

– set to -1 on boot; can be set to 0 by CPU, to record integrity measurement from that point on

  • Special register PCR 17 :

– can only be reset by one special instruction of CPU

  • SKINIT on AMD SVM, SENTER on Intel TXT/LaGrande

– resets the CPU, disables interrupts and DMA – measures & executes Secure Loader Block

SLIDE 39

Flicker TEE

Flicker uses the TPM with dynamic PCRs for trusted execution. To switch to secure mode:

1. all normal execution (incl. OS) is suspended
2. Flicker session: a small piece of code is executed using SKINIT – with its code integrity measurement in PCR 17 – possibly accessing & updating sealed memory
3. normal execution (incl. OS) resumes

Code executed in a Flicker session is isolated from all other execution:

  • No code executed before or after can influence or observe it
  • Only 250 lines of software in TCB
  • Downside: the code cannot use any OS services

SLIDE 40

Normal Execution vs Execution using Flicker

  • Normal execution: the OS is in the TCB for the entire App
  • Flicker: part of the App, S, is executed in a Flicker session – the OS is no longer in the TCB for S

SLIDE 41

Flicker TEE

  • sensitive code fragment called PAL (Piece of Application Logic)
  • the PAL is included in the SLB (Secure Loader Block) that is passed to the SKINIT instruction

Example uses:

  • running some crypto code with access to key material in sealed

memory

  • a password check with access to password

[McCune et al., Flicker: An Execution Infrastructure for TCB Minimization, EuroSys 2008]

SLIDE 42

Intel SGX

Parts of app can be done in secure enclaves

  • Similar to Flicker session, so main OS no longer in TCB
  • Each enclave has its own code & data, but can access all

memory of the app – Confidentiality & integrity of code & data protected – Entry points into enclave's code are secured

  • to stop ROP (Return-Oriented Programming) style attacks

[figure: App1 and App2 with Enclave 1 and Enclave 2 (code + data), accessed via the Enclave API, on Intel SGX alongside the OS]

SLIDE 43

Intel SGX – capabilities & limitations

  • HW provides Isolation, Attestation, Sealed Storage
  • Context switch to enclave is fast
  • Can provide Trusted Path in combination with Intel IPT?
  • But: side-channel attacks on SGX exist

– Malicious enclave can extract RSA private key used by other enclave on same machine – Malicious enclave code is impossible to detect or analyse, as it is protected by the enclave mechanism

Eg [Malware Guard Extension: Using SGX to Conceal Cache Attacks, Cybersecurity Vol 3, Jan 2020]

SLIDE 44

Intel IPT

  • using separate processor

for trusted execution

45

slide-45
SLIDE 45

Intel IPT (Identity Protection Technology)

Separate processor providing a Java VM

  • can be used for crypto, incl. key storage & RNG

– integrated with Windows Crypto API – example use case: One-Time-Password (OTP) generation

  • also controls the display, so can provide Trusted Path


[source: van Rijswijk-Deij et al., Using Trusted Execution Environments in Two-Factor Authentication, Open Identity 2013]

SLIDE 46

What the user sees

IPT for secure PIN entry

SLIDE 47

What malware on the main CPU would see

IPT for secure PIN entry

http://www.intel.com/content/www/us/en/architecture-and-technology/identity-protection/protected-transaction-display.html

SLIDE 48

ARM Trustzone

  • providing a secure & an insecure world

SLIDE 49

ARM TrustZone

Instead of two processors, one trusted and one untrusted, ARM TrustZone is a single processor (SoC) offering 2 modes:

  • ‘normal world’ and ‘secure world’

– extra 33rd bit on the bus, to indicate the mode – device could have an indicator (eg LED) for the mode – separation of memory, peripherals, DMA, and interrupts – context switch between worlds is slow

  • Intended use

– untrusted OS, eg Android, runs in the normal world, providing a REE (Rich Execution Environment) for normal apps – secure world provides a TEE for sensitive applications & services (aka trustlets)

  • TrustZone available on majority of modern smartphones, but

mainly for manufacturer-internal purposes

SLIDE 50

ARM TrustZone

[figure: ARM TrustZone – Normal World (REE: app 1, app 2) and Secure World (TEE: trustlet 1, trustlet 2)]

SLIDE 51

TrustZone SoC hardware architecture


[source: Ekberg et al., The Untapped Potential of Trusted Execution Environments on Mobile Devices, IEEE Security & Privacy 2014]

SLIDE 52

TrustZone software architecture


[source: Ekberg et al., The Untapped Potential of Trusted Execution Environments on Mobile Devices, IEEE Security & Privacy 2014]

SLIDE 53

Secure storage in untrusted world?

Persistent storage can be done in the untrusted world, if we use encryption plus integrity & freshness checks. The trusted app still needs some secure storage in the trusted world

  • for crypto keys, for confidentiality & integrity
  • for sequence numbers, to ensure freshness (Data Rollback Protection)
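The rollback-protection idea can be sketched as follows: only the MAC key and a monotonic counter live in the trusted world, while the (data, counter, tag) blob sits in untrusted storage. A minimal sketch under stated assumptions: the class and method names are hypothetical, and confidentiality is omitted for brevity (in practice the data would additionally be encrypted, eg with AES-GCM).

```python
import hashlib, hmac, os

class TrustedState:
    # lives inside the TEE: the key and expected counter are the only
    # things that need secure storage in the trusted world
    def __init__(self):
        self.key = os.urandom(32)
        self.counter = 0

    def store(self, data: bytes):
        # returns a blob that is safe to keep in *untrusted* storage
        self.counter += 1
        tag = hmac.new(self.key, self.counter.to_bytes(8, "big") + data,
                       hashlib.sha256).digest()
        return (data, self.counter, tag)   # encryption omitted in this sketch

    def load(self, blob) -> bytes:
        data, ctr, tag = blob
        good = hmac.new(self.key, ctr.to_bytes(8, "big") + data,
                        hashlib.sha256).digest()
        if not hmac.compare_digest(tag, good):
            raise ValueError("integrity check failed")
        if ctr != self.counter:
            raise ValueError("stale blob: rollback detected")
        return data

tee = TrustedState()
old = tee.store(b"balance=100")
new = tee.store(b"balance=50")
assert tee.load(new) == b"balance=50"
try:
    tee.load(old)   # replaying the old (validly MAC'ed) version is detected
except ValueError:
    pass
```

Note that the old blob still carries a valid MAC – the integrity check alone cannot catch the replay; only the counter comparison provides freshness.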

SLIDE 54

Android Secure Key Storage API

A trustlet accessible to 3rd-party apps

  • if no TEE is available, a software fallback for key storage is used
  • Keys cannot be extracted from the device, or used by a malicious app

– But Android is in the TCB for determining whether app 1 or app 2 is asking for its keys

  • Protecting keys with a user PIN helps, until attackers intercept it

[figure: TrustZone Normal World (app 1, app 2 on Android OS) and Secure World (Secure Key Storage trustlet holding Keys, on Qualcomm SEE OS)]


[Tim Cooijmans et al., Analysis of Secure Key Storage Solutions on Android, SPSM 2014]

SLIDE 55

Trustonic

  • TrustZone only provides two worlds

– protection is one-way: trusted is protected from untrusted, not vice versa

  • Trustonic provides multiple isolated environments within the secure world

– like Global Platform isolates applets on a JavaCard smartcard

  • Samsung KNOX does something similar

SLIDE 56

Trustonic/KNOX software architecture


[source: Ekberg et al., The Untapped Potential of Trusted Execution Environments on Mobile Devices, IEEE Security & Privacy 2014]

SLIDE 57

Recent analysis of TrustZone security failures

Cerdeira et al., SoK: Understanding the Prevailing Security Vulnerabilities of TrustZone-assisted TEEs, IEEE S&P 2020

  • SoK = Systemisation of Knowledge

Security problems due to

  • software bugs in trusted OS and trusted apps
  • architectural deficiencies
  • large attack surface, dangerous API calls,

no ASLR, no stack canaries, …

  • hardware issues
  • voltage & clock manipulations (CLKSCREW)
  • micro-architectural side-channels via caches, branch

prediction, or RowHammering

SLIDE 58

Comparison & Conclusions

SLIDE 59

TEE technologies – Recap

  • TPM
  • Flicker

– Using TPM & AMD’s SVM ≈ Intel’s TXT (formerly LaGrande)

  • Intel SGX
  • Intel IPT
  • ARM TrustZone with Trustonic

≈ Samsung KNOX

SLIDE 60

Performance & functionality

  • TPM

– slow – fixed & limited functionality, not programmable

  • Flicker

– TEE code runs on the main CPU – no access to OS or drivers, so no/very limited I/O

  • IPT

– more powerful than TPM & programmable, but less powerful than main CPU

  • Trustzone and SGX

– TEE code runs on main CPU

  • for Trustzone RAM may be limited

– Trustzone provides 1 secure & 1 insecure world, SGX provides many secure enclaves


Note: some of these provide no trusted path; others provide some trusted path, in some configurations

SLIDE 61

Openness, control, privacy

  • TPM specs are open & standardised.

Solutions like SGX, Trustzone, IPT are vendor-specific

  • IPT, Trustonic, KNOX are ‘closed’

any code running in TEE requires permission from Intel, Trustonic or Samsung

  • Who is in control?

the chip manufacturer (eg Intel), the handset manufacturer (aka OEM, eg Samsung), the OS vendor (eg Google or Apple), the telco (eg Vodafone), the user ???

Even if solution is technically interesting, conflicting interests may prevent take-up & limit use.

  • Unique device IDs and certificates held in TEE are privacy

concerns

SLIDE 62

Conclusions

  • TEEs offer interesting possibilities: for Isolated Execution,

Secure Storage, Trusted Path and Remote Attestation

  • TEE will become big(ger) in near future.

Lots of TEEs out there! What will they be used for?

Newer solutions include TrustZone-M for IoT devices

  • Openness, Privacy and Control?
  • Economics rather than security may (will?) determine

success or failure

SLIDE 63

References

  • Murdoch, Introduction to Trusted Execution Environments, slides
  • Felten, Understanding Trusted Computing, IEEE Security & Privacy 2003
  • Ross Anderson’s Trusted Computing FAQ, http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html
  • McCune et al., Flicker: An Execution Infrastructure for TCB Minimization, EuroSys 2008
  • Vasudevan et al., Trustworthy Execution on Mobile Devices: What Security Properties Can My Mobile Platform Give Me?, Trust 2012
  • Ekberg et al., The Untapped Potential of Trusted Execution Environments on Mobile Devices, IEEE Security & Privacy 2014
  • Malware Guard Extension: Using SGX to Conceal Cache Attacks, https://arxiv.org/abs/1702.08719
  • Cerdeira et al., SoK: Understanding the Prevailing Security Vulnerabilities of TrustZone-assisted TEEs, IEEE S&P 2020
  • van Rijswijk-Deij and Poll, Using Trusted Execution Environments in Two-Factor Authentication, Open Identity 2013
  • Cooijmans et al., Analysis of Secure Key Storage Solutions on Android, SPSM 2014