Interoperability-Guided Testing of QUIC Implementations using Symbolic Execution

Felix Rath, Daniel Schemmel, Klaus Wehrle
https://comsys.rwth-aachen.de
EPIQ Workshop, Heraklion, Greece, 2018-12-04

Motivation

QUANT? Mozquic? mvfst? picoquic? AppleQUIC?

Are the implementations interoperable?


Results Summary

  • State-of-the-art software testing approaches work
    ▶ Can analyze complex communications
    ▶ Uncovered subtle, hard-to-detect bugs
  • But: Deep interoperability testing requires more insight into implementations
  • Our idea: Comparing belief states of endpoints
    ▶ Based on a common definition
    ▶ Provided by implementations

We would like to propose the development of a common way to query implementations for their current belief state.


Testing Interoperability

Observations:

  • Checking an implementation for standard compliance is problematic
  • Requires a new “implementation” in the form of a specification
  • Even then: Verification of implementations is almost impossible
  • Many implementations are already available
  • → Test multiple implementations against each other
  • Interoperability issues occur when “things go wrong”
  • Result: Either non-compliance or a bug in the standard


“Things Going Wrong”

Example 1: 4 open streams + 1 closed stream vs. 5 open streams

Example 2: a malformed (?) packet


Belief State

Observations:

  • Endpoints track the state of the connection
  • Adjusted due to received and sent packets, time, . . .
  • QUIC is standardized, so this state should be similar
  • → State can be compared across implementations
  • → Belief State

In practice:

  • Unclear what comprises the belief state of a QUIC connection
  • Extraction requires deep understanding of each implementation


Testing Robustness

Observations:

  • Protocol implementations are often exposed to arbitrary input
  • Should be stable and safe → as few bugs as possible
  • Therefore: Also test implementations for robustness
  • Main attack vector: Packet handling code
  • Test receiver functions as thoroughly as possible
  • Detect cases that lead to errors


Symbolic Execution

int parse_packet(char* data, size_t len){
    if(len >= 1)
        if(data[1] == 'x')
            return 1;
    return 0;
}

With data = symbolic() and len = symbolic(), execution forks at if(len >= 1): the path constrained by ¬(len ≥ 1) returns 0, while the path with len ≥ 1 reads data[1] (an out-of-bounds access when len == 1) and forks again at if(data[1] == 'x') into return 1 and return 0.

  • Exhaustive exploration
  • Find bugs/assertions


SymEx of QUIC Implementations

SymEx requires . . .

  • a single process and a single entry point (e.g., main(), parse_packet())
  • symbolically running the executed code

QUIC implementations . . .

  • are libraries (no explicit entry point)
  • use a lot of external/kernel functionality (TLS, networking, . . .)

Challenge: How to bridge these gaps?


Symbolic Test Scenarios

Solution 1

Define concrete test scenarios that use symbolic input.

  • open_conn + λ
  • open_stream + λ
  • send_request + λ
  • send_response + λ
  • close_conn + λ


Mock Libraries

Solution 2

Replace external functionality with mock implementations.

For this work:

  • OS (UNIX Sockets, ...): enable symbolic packet data
  • Crypto (OpenSSL, ...): make encryption transparent
  • Networking (Libev, ...): enable applications

[Diagram: Server and Client connected through a Libev mock, OpenSSL mocks, and a Socket mock; the Socket mock provides symbolic drops (sym. drop) and symbolic modifications (sym. mod)]

Case Study

Setup

  • Picoquic client
  • QUANT server
  • Per-library frontends
  • Three test scenarios
  • Interoperability: Failed connections + Timeouts
  • Robustness: Symbolic packet modifications + drops


Evaluation

Setup

  • Draft 14
  • Six configurations
  • 8h per configuration
  • 32GB memory limit

Configuration  Instrs/s  Time[h]  ICov[%]  BCov[%]  TSolver[%]  MaxMem[GB]  Unique errors
sym-stream      1725742     0:01    38.96    24.81        0.06        0.16              2
sym-version      232139     0:25    38.87    24.83       83.83        0.15              1
sym-drop         432753     8:00    38.85    25.31        0.02       11.97              1
sym-mod-1        380751     8:00    41.10    27.04        0.79       32.44
sym-mod-5        241116     7:00    40.11    26.11        8.35       33.02
sym-mod-10         4118     8:01    32.11    18.79       78.78        5.34              1

Sym-Stream

Symbolic Input

  • No stream
  • Stream without response
  • Stream with 1 byte response

Bugs detected

  • Interoperability bug (known beforehand)
  • Use-after-free


Sym-Version

Symbolic Input

  • Version announced by the picoquic client

Bugs detected

  • No connection when the version is 0xbabababa


Sym-Drop

Symbolic Input

  • Every packet dropped symbolically

Bugs detected

  • Nullpointer dereference when certain packets are dropped
    ▶ Testcase drops the 4th, 5th and 7th packets; the error occurs on the 9th
    ▶ Points toward a hard-to-detect issue


Sym-Mod-X

Symbolic Input

  • Made the first X bytes of each sent packet symbolic

Bugs detected

  • Error when the first 10 bytes of the first packet are changed
    ▶ Testcase changes them to [0xff, 0x01, 0x01, 0x01, 0x01, 0x67, 0xff, 0xff, 0xff]
    ▶ Also points toward a hard-to-detect issue


Future Work

Belief-State

  • Definition of the belief-state
  • Standard interface
  • Use this for divergence testing
  • Would also enable thorough testing using other approaches

Scalability

  • 10 symbolic bytes are already difficult
  • Some simple strategies available
  • Take a closer look at the structure/semantics of packets

Conclusion

The goal: Interoperability testing

Main findings:

  • Software testing approaches (e.g., SymEx) are promising
  • But: A common definition of an interoperability measure, e.g., the Belief State, is desirable for testing

Case study:

  • Tested picoquic and QUANT
  • Concrete test scenarios with symbolic input
  • Mock libraries to enable SymEx
  • Found five bugs
    ▶ Including two deep interaction issues