
SLIDE 1

Comparing the Effectiveness of Penetration Testing and Static Code Analysis
Detection of SQL Injection Vulnerabilities in Web Services

Nuno Antunes, Marco Vieira
nmsa@dei.uc.pt, mvieira@dei.uc.pt
University of Coimbra, Portugal

PRDC 2009

SLIDE 2

Web Services

- Web services are becoming a strategic component in a wide range of organizations
  - Components that can be remotely invoked
  - Well-defined interface
- Web services are extremely exposed to attacks
  - Any existing vulnerability will most probably be uncovered/exploited
- Both providers and consumers need to assess services' security

Marco Vieira, PRDC 2009, November 16-18, Shanghai, China

SLIDE 3

Web Services Environment

SLIDE 4

SQL Injection vulnerabilities…

```java
public String auth(String login, String pass) throws SQLException {
    String sql = "SELECT * FROM users WHERE "
               + "username='" + login + "' AND "
               + "password='" + pass + "'";
    ResultSet rs = statement.executeQuery(sql);
    (…)
}

public void delete(String str) throws SQLException {
    String sql = "DELETE FROM table "
               + "WHERE id='" + str + "'";
    statement.executeUpdate(sql);
}
```

With the input ' OR 1=1 -- the first query becomes:

    "SELECT * FROM users WHERE username='' OR 1=1 -- ' AND password=''";

With the input ' OR ''=' the second becomes:

    "DELETE FROM table WHERE id='' OR ''=''";


SLIDE 5

Developers must…

- Apply best coding practices
- Perform code analysis
  - Manual code analysis (reviews, inspections)
  - Automated static code analysis
- Perform tests
  - Manual penetration testing
  - Automated penetration testing (vulnerability scanners)


SLIDE 6

Penetration testing

- Widely used by developers
- Consists of stressing the application from the point of view of an attacker
  - "black-box" approach
- Uses specific malicious inputs
  - e.g., for SQL Injection: ' OR 1=1
- Can be performed manually or automatically
  - Many tools available, including commercial and open-source
- Does not require access to the code


SLIDE 7

Static code analysis

- "white-box" approach
- Consists of analyzing the source code of the application, without executing it
- Looks for potential vulnerabilities
  - Among other types of software defects
- Can be performed manually or automatically
  - Tools provide an automatic way of highlighting possible coding errors
- Does require access to the code (or bytecode)


SLIDE 8

Our goal…

- Evaluate several automated penetration testing tools and static analysis tools
  - In a controlled environment
- Focus on two key measures of interest:
  - Coverage: the percentage of existing vulnerabilities that are detected by a given tool
  - False positive rate: the number of reported vulnerabilities that in fact do not exist
- Target only SQL Injection vulnerabilities
  - Extremely relevant in Web Services


SLIDE 9

Steps

- Preparation
  - Select the penetration testers and static code analyzers
  - Select the Web Services to be considered
- Execution
  - Use the tools to identify potential vulnerabilities
- Verification
  - Perform manual verification to confirm that the vulnerabilities identified by the tools do exist (i.e., are not false positives)
- Analysis
  - Analyze the results obtained and systematize the lessons learned


SLIDE 10

Web Services tested

- Eight Web Services, with a total of 25 operations
- Four of the services are based on the TPC-App performance benchmark
- Four other services have been adapted from code publicly available on the Internet
- All are implemented in Java and use a relational database


SLIDE 11

Web Services characterization


SLIDE 12

Tools studied

- Penetration testing
  - HP WebInspect
  - IBM Rational AppScan
  - Acunetix Web Vulnerability Scanner
  - [Antunes 2009]
- Static code analysis
  - FindBugs
  - Yasca
  - IntelliJ IDEA
- Decided not to mention the brand of the tools in the results
  - VS1, VS2, VS3, VS4 (in no particular order)
  - SA1, SA2, SA3 (in no particular order)


SLIDE 13

Tools and environment configuration

- Penetration testing
  - Underlying database restored before each test
    - Avoids the cumulative effect of previous tests
    - Guarantees that all the tools started the service testing in a consistent state
  - If allowed by the testing tool, information about the domain of each parameter was provided
    - If the tool requires an exemplar invocation per operation, the exemplar respected the input domains of the operation
    - All the tools in this situation used the same exemplar
- Static code analysis
  - Configured to fully analyze the services' code
  - For the analyzers that use binary code, the deployment-ready version was used


SLIDE 14

Web Services manual inspection

- It is essential to correctly identify the vulnerabilities that exist in the services' code
- A team of experts was invited to review the source code looking for vulnerabilities
  - False positives were eliminated by cross-checking the vulnerabilities identified by different people
- A key difficulty is that different tools report (and count) vulnerabilities in different ways
  - Penetration testing: a vulnerability for each vulnerable parameter
  - Static analysis: a vulnerability for each vulnerable line in the service code


SLIDE 15

Vulnerabilities found


SLIDE 16

Penetration testing results


SLIDE 17

Examples of penetration testing limitations


```java
public void operation(String str) {
    try {
        String sql = "DELETE FROM table "
                   + "WHERE id='" + str + "'";
        statement.executeUpdate(sql);
    } catch (SQLException se) {}
}

public String dumpDepositInfo(String str) {
    try {
        String path = "//DepositInfo/Deposit"
                    + "[@accNum='" + str + "']";
        return csvFromPath(path);
    } catch (XPathException e) {}
    return null;
}
```

operation: no return value, and exceptions related to SQL malformation do not leak out to the invoker. dumpDepositInfo: the same lack of output information, leaving the penetration tester nothing to observe.

SLIDE 18

Static code analysis results


SLIDE 19

Examples of static analysis limitations


```java
public void operation(String str) {
    int i = Integer.parseInt(str);
    try {
        String sql = "DELETE FROM table "
                   + "WHERE id='" + str + "'";
        statement.executeUpdate(sql);
    } catch (SQLException se) {}
}

public String dumpDepositInfo(String str) {
    try {
        String path = "//DepositInfo/Deposit"
                    + "[@accNum='" + str + "']";
        return csvFromPath(path);
    } catch (XPathException e) {}
    return null;
}
```

operation: analyzers report a vulnerability because the SQL query is a non-constant string, even though Integer.parseInt(str) rejects any non-numeric input before the query is built. dumpDepositInfo: depending on the complexity of the csvFromPath method, a static analysis tool may not be able to find the vulnerability.
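The first method shows how pattern-level reports turn into false positives: Integer.parseInt throws before the query is ever built, so no injection payload can reach the database. A runnable sketch of that control flow (the `buildDelete` helper is hypothetical; the slide's method executes the statement rather than returning it):

```java
public class ParseIntGuard {
    // Mirrors the control flow of slide 19's first method: parseInt runs
    // before the query string is assembled, so any non-numeric payload
    // aborts the call and the concatenated SQL is never executed.
    static String buildDelete(String str) {
        int i = Integer.parseInt(str);  // throws NumberFormatException on injection payloads
        return "DELETE FROM table WHERE id='" + str + "'";
    }

    public static void main(String[] args) {
        System.out.println(buildDelete("42"));  // only numeric ids get this far
        try {
            buildDelete("' OR ''='");           // the payload never reaches the query
        } catch (NumberFormatException e) {
            System.out.println("payload rejected");
        }
    }
}
```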

SLIDE 20

Penetration testing vs Static analysis (1)

- Coverage


SLIDE 21

Penetration testing vs Static analysis (2)

- False positives


SLIDE 22

Key observations

- The coverage of static code analysis is typically higher than that of penetration testing
- False positives are a problem for both approaches
  - But have more impact in the case of static analysis
- Different tools report different vulnerabilities in the same piece of code
  - Frequently, even tools implementing the same approach
- Very poor results overall!


SLIDE 23

Conclusions

- The effectiveness of vulnerability detection tools is very low
- How to improve penetration testing?
  - Increase the representativeness of the workload
  - Guarantee high coverage
  - Improve the attacks performed
  - Improve the vulnerability detection algorithms
- How to improve static analysis?
  - Include new vulnerable code patterns
- Merge penetration testing and static analysis?


SLIDE 24

Questions?
