

  1. Comparing the Effectiveness of Penetration Testing and Static Code Analysis: Detection of SQL Injection Vulnerabilities in Web Services. Nuno Antunes, Marco Vieira. PRDC 2009. nmsa@dei.uc.pt, mvieira@dei.uc.pt. University of Coimbra, Portugal.

  2. Web Services n Web services are becoming a strategic component in a wide range of organizations n Components that can be remotely invoked n Well defined interface n Web services are extremely exposed to attacks n Any existing vulnerability will most probably be uncovered/exploited n Both providers and consumers need to assess services ’ security 2 Marco Vieira PRDC 2009, November 16-18, Shangai, China

  3. Web Services Environment

  4. SQL Injection vulnerabilities

     Attack input: ' OR 1=1 --

     public String auth(String login, String pass) throws SQLException {
         String sql = "SELECT * FROM users WHERE " +
             "username='" + login + "' AND " +
             "password='" + pass + "'";
         ResultSet rs = statement.executeQuery(sql);
         // Resulting query:
         // SELECT * FROM users WHERE username='' OR 1=1 -- ' AND (…) password=''
     }

     Attack input: ' OR ''='

     public void delete(String str) throws SQLException {
         String sql = "DELETE FROM table " +
             "WHERE id='" + str + "'";
         statement.executeUpdate(sql);
         // Resulting query: DELETE FROM table WHERE id='' OR '' = ''
     }
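The effect of the first payload can be reproduced in isolation. This is a minimal sketch: `buildQuery` is a hypothetical helper that mirrors the string concatenation in the vulnerable auth() above, not part of the original services.

```java
public class InjectionDemo {
    // Mirrors the string concatenation used by the vulnerable auth() above.
    public static String buildQuery(String login, String pass) {
        return "SELECT * FROM users WHERE username='" + login
                + "' AND password='" + pass + "'";
    }

    public static void main(String[] args) {
        // The payload closes the username literal, makes the WHERE clause
        // always true, and comments out the password check with "--".
        System.out.println(buildQuery("' OR 1=1 --", "anything"));
        // → SELECT * FROM users WHERE username='' OR 1=1 --' AND password='anything'
    }
}
```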

  5. Developers must ...
     - Apply best coding practices
     - Perform code analysis
       - Manual code analysis (reviews, inspections)
       - Automated static code analysis
     - Perform tests
       - Manual penetration testing
       - Automated penetration testing (vulnerability scanners)
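One such coding practice can be sketched as whitelist input validation; `isValidId` and its pattern are illustrative assumptions, and parameterized queries remain the primary defense against SQL injection.

```java
public class InputValidation {
    // Whitelist validation: accept only identifiers matching a strict
    // pattern before they ever reach a SQL string. This complements,
    // but does not replace, parameterized queries.
    public static boolean isValidId(String id) {
        return id != null && id.matches("[A-Za-z0-9_-]{1,32}");
    }

    public static void main(String[] args) {
        System.out.println(isValidId("user_42"));    // legitimate id: true
        System.out.println(isValidId("' OR ''='")); // injection payload: false
    }
}
```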

  6. Penetration testing
     - Widely used by developers
     - Consists in stressing the application from the point of view of an attacker ("black-box" approach)
     - Uses specific malicious inputs
       - e.g., for SQL Injection: ' or 1=1
     - Can be performed manually or automatically
       - Many tools available, including commercial and open-source
     - Does not require access to the code
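A black-box scanner of this kind boils down to a payload list plus a response oracle. The sketch below is an assumption about how such tools work in general; the payloads and error signatures are illustrative and are not taken from any of the tools studied in the paper.

```java
import java.util.List;

public class PenTestSketch {
    // Malicious inputs a scanner might inject into each parameter
    // (including the slide's example payload).
    public static final List<String> PAYLOADS =
            List.of("' OR 1=1 --", "' OR ''='", "'");

    // Response oracle: database error text leaking into the reply suggests
    // the input reached the SQL interpreter unsanitized.
    public static boolean looksVulnerable(String responseBody) {
        String r = responseBody.toLowerCase();
        return r.contains("sqlexception")
                || r.contains("syntax error")
                || r.contains("unclosed quotation");
    }
}
```

In a real scanner each payload would be sent per parameter over HTTP/SOAP and every response fed to the oracle; as slide 17 shows, services that swallow exceptions defeat exactly this kind of oracle.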

  7. Static code analysis
     - "White-box" approach
     - Consists in analyzing the source code of the application, without executing it
     - Looks for potential vulnerabilities, among other types of software defects
     - Can be performed manually or automatically
       - Tools provide an automatic way of highlighting possible coding errors
     - Does require access to the code (or bytecode)

  8. Our goal ...
     - Evaluate several automatic penetration testing tools and static analysis tools in a controlled environment
     - Focus on two key measures of interest:
       - Coverage: the percentage of existing vulnerabilities that are detected by a given tool
       - False positives rate: the number of reported vulnerabilities that in fact do not exist
     - Target only SQL Injection vulnerabilities, extremely relevant in Web Services
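The two measures can be written down directly. The counts in the example below are hypothetical, not results from the paper.

```java
public class ToolMetrics {
    // Coverage: percentage of the vulnerabilities known to exist
    // that the tool actually reported.
    public static double coverage(int trueDetections, int existing) {
        return 100.0 * trueDetections / existing;
    }

    // False positives rate: percentage of the tool's reports that do not
    // correspond to a real vulnerability.
    public static double falsePositiveRate(int falseReports, int totalReports) {
        return 100.0 * falseReports / totalReports;
    }

    public static void main(String[] args) {
        // Hypothetical tool: found 30 of 40 existing vulnerabilities,
        // and 10 of its 40 reports were false.
        System.out.println(coverage(30, 40));          // 75.0
        System.out.println(falsePositiveRate(10, 40)); // 25.0
    }
}
```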

  9. Steps
     - Preparation
       - Select the penetration testers and static code analyzers
       - Select the Web Services to be considered
     - Execution
       - Use the tools to identify potential vulnerabilities
     - Verification
       - Perform manual verification to confirm that the vulnerabilities identified by the tools do exist (i.e., are not false positives)
     - Analysis
       - Analyze the results obtained and systematize the lessons learned

  10. Web Services tested
     - Eight Web Services, with a total of 25 operations
     - Four of the services are based on the TPC-App performance benchmark
     - The other four have been adapted from code publicly available on the Internet
     - All implemented in Java and using a relational database

  11. Web Services characterization

  12. Tools studied
     - Penetration testing: HP WebInspect, IBM Rational AppScan, Acunetix Web Vulnerability Scanner, [Antunes 2009]
     - Static code analysis: FindBugs, Yasca, IntelliJ IDEA
     - We decided not to associate results with tool brands:
       - VS1, VS2, VS3, VS4 (in no particular order)
       - SA1, SA2, SA3 (in no particular order)

  13. Tools and environment configuration
     - Penetration testing
       - Underlying database restored before each test
         - Avoids the cumulative effect of previous tests
         - Guarantees that all the tools started testing the service in a consistent state
       - If allowed by the testing tool, information about the domain of each parameter was provided
       - If the tool required an exemplar invocation per operation, the exemplar respected the input domains of the operation; all tools in this situation used the same exemplar
     - Static code analysis
       - Configured to fully analyze the services' code
       - For the analyzers that work on binary code, the deployment-ready version was used

  14. Web Services manual inspection
     - It is essential to correctly identify the vulnerabilities that exist in the services' code
     - A team of experts was invited to review the source code looking for vulnerabilities
     - False positives were eliminated by cross-checking the vulnerabilities identified by different people
     - A key difficulty is that different tools report (and count) vulnerabilities in different ways:
       - Penetration testing: one vulnerability for each vulnerable parameter
       - Static analysis: one vulnerability for each vulnerable line in the service code

  15. Vulnerabilities found

  16. Penetration testing results

  17. Examples of penetration testing limitations

     // No return value; exceptions related to SQL malformation
     // do not leak out to the invoker
     public void operation(String str) {
         try {
             String sql = "DELETE FROM table " +
                 "WHERE id='" + str + "'";
             statement.executeUpdate(sql);
         } catch (SQLException se) {}
     }

     // Lack of output information
     public String dumpDepositInfo(String str) {
         try {
             String path = "//DepositInfo/Deposit" +
                 "[@accNum='" + str + "']";
             return csvFromPath(path);
         } catch (XPathException e) {}
         return null;
     }

  18. Static code analysis results

  19. Examples of static analysis limitations

     // Analyzers identify the vulnerability because the SQL query
     // is built from a non-constant string
     public void operation(String str) {
         int i = Integer.parseInt(str);
         try {
             String sql = "DELETE FROM table " +
                 "WHERE id='" + str + "'";
             statement.executeUpdate(sql);
         } catch (SQLException se) {}
     }

     // Depending on the complexity of the csvFromPath method, a static
     // analysis tool may not be able to find the vulnerability
     public String dumpDepositInfo(String str) {
         try {
             String path = "//DepositInfo/Deposit" +
                 "[@accNum='" + str + "']";
             return csvFromPath(path);
         } catch (XPathException e) {}
         return null;
     }

  20. Penetration testing vs Static analysis (1)
     - Coverage

  21. Penetration testing vs Static analysis (2)
     - False positives

  22. Key observations
     - The coverage of static code analysis is typically higher than that of penetration testing
     - False positives are a problem for both approaches, but have more impact in the case of static analysis
     - Different tools report different vulnerabilities in the same piece of code
       - Even tools implementing the same approach
     - Frequently, very poor results!
