Detecting SQL Injection Vulnerabilities in Web Services


  1. Detecting SQL Injection Vulnerabilities in Web Services
     Nuno Antunes, Marco Vieira  {nmsa, mvieira}@dei.uc.pt
     CISUC, Department of Informatics Engineering, University of Coimbra
     LADC 2009

  2. Outline
     - Web Services
     - Web Security Scanners
     - New Approach for the Detection of SQL Injection Vulnerabilities in WS
     - Experimental Evaluation
     - Conclusions

  3. Web Services
     - Web services are becoming a strategic component in a wide range of organizations
       - Components that can be remotely invoked
       - Well-defined interface
     - Web services are extremely exposed to attacks
       - Any existing vulnerability will most probably be uncovered/exploited
     - Both providers and consumers need to assess services' security

  4. Web Services Environment

  5. Examples of Vulnerabilities

     Attack input: ' OR 1=1 --

         public String auth(String login, String pass) throws SQLException {
             String sql = "SELECT * FROM users WHERE " +
                          "username='" + login + "' AND " +
                          "password='" + pass + "'";
             ResultSet rs = statement.executeQuery(sql);
             // Resulting query:
             // SELECT * FROM users WHERE username='' OR 1=1 -- ' AND (…) password=''
         }

     Attack input: ' OR ''='

         public void delete(String str) throws SQLException {
             String sql = "DELETE FROM table " +
                          "WHERE id='" + str + "'";
             statement.executeUpdate(sql);
             // Resulting query:
             // DELETE FROM table WHERE id='' OR '' = ''
         }
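For contrast with the snippets above, here is a minimal sketch (not part of the original slides) of the same authentication check written with a parameterized query, so that attack strings such as ' OR 1=1 -- are bound as plain data; the class name SafeAuth and the users table layout are illustrative assumptions taken from the slide's snippet.

    import java.sql.*;

    // Illustrative sketch only: a parameterized version of the auth() example.
    public class SafeAuth {
        public static boolean auth(Connection conn, String login, String pass)
                throws SQLException {
            String sql = "SELECT * FROM users WHERE username=? AND password=?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, login);   // bound as a literal, never parsed as SQL
                ps.setString(2, pass);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next();     // authenticated only if a matching row exists
                }
            }
        }
    }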

  6. Web Security Scanners
     - An easy and widely used way to test applications for vulnerabilities
     - Use fuzzing techniques to attack applications (penetration testing)
     - Perform thousands of tests in an automated way
     - What is the effectiveness of these tools?
     - Can programmers rely on these tools?

  7. Experimental Study
     - Leading commercial scanners applied to public web services
       - 300 web services tested, randomly selected
       - 4 scanners used
     - Approach:
       - Preparation: select services and scanners
       - Execution: test the services using the scanners
       - Verification: identify false positives
       - Analysis: analysis and systematization of results

  8. Scanners

  9. Overall results analysis

     Vulnerability Type                          VS1.1          VS1.2          VS2            VS3
                                               # Vuln.  # WS  # Vuln.  # WS  # Vuln.  # WS  # Vuln.  # WS
     SQL Injection                                217     38     225     38      25      5      35     11
     XPath Injection                               10      1      10      1       0      0       0      0
     Code Execution                                 1      1       1      1       0      0       0      0
     Possible Parameter-Based Buffer Overflow       0      0       0      0       0      0       4      3
     Possible Username or Password Disclosure       0      0       0      0       0      0      47      3
     Possible Server Path Disclosure                0      0       0      0       0      0      17      5
     Total                                        228     40     236     40      25      5     103     22

  10. SQL Injection
      [Stacked-bar chart: SQL Injection vulnerabilities reported per scanner, broken down into confirmed vulnerabilities, doubtful cases, and false positives]

                         VS1.1        VS1.2        VS2        VS3
      Confirmed           116          116          17         21
      Doubtful             14 (6.5%)    26 (11.6%)   0          5 (14%)
      False positives      87 (40%)     83 (37%)     8 (32%)    9 (25.7%)
      Total reported      217          225          25         35

  11. Can we do better?
      - Yes, we can! :)
      - We propose a new approach to detect SQL Injection vulnerabilities in web services code
      - Main improvements:
        - A representative workload to exercise the services and understand the expected behavior
        - A broader set of attacks
        - Well-defined rules to analyze the service's responses
          - To improve coverage and remove false positives
        - Completely automatic

  12. Execution Steps
      1. Prepare the tests
         1.1. Gather information about the web service's operations, call parameters, data types, and input domains
         1.2. Generate the workload
      2. Execute the tests
         2.1. Execute the workload to understand the expected behavior of the service in the absence of attacks
         2.2. Perform the attacks to trigger faulty behaviors and disclose SQL Injection vulnerabilities
      3. Analyze the responses to detect and confirm the vulnerabilities

  13. Prepare the Tests: Gather Information
      - Web service interfaces are described in a WSDL file
      - This file is processed automatically to obtain:
        - Operations
        - Call parameters
        - Data types
      - The valid values for each parameter (i.e., input domains) have to be provided by the user
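A minimal sketch of this step, assuming the WSDL4J (javax.wsdl) library, which the slides do not name: it lists each operation together with its input parameters and declared types. For document/literal services the parameters are described in the schema types section, which a complete implementation would also have to walk.

    import javax.wsdl.*;
    import javax.wsdl.factory.WSDLFactory;
    import javax.wsdl.xml.WSDLReader;
    import java.util.Map;

    // Sketch: extract operations, call parameters, and data types from a WSDL file.
    public class WsdlInfo {
        public static void main(String[] args) throws WSDLException {
            WSDLReader reader = WSDLFactory.newInstance().newWSDLReader();
            Definition def = reader.readWSDL(args[0]);   // path or URL of the WSDL

            for (Object pt : def.getPortTypes().values()) {
                for (Object op : ((PortType) pt).getOperations()) {
                    Operation operation = (Operation) op;
                    System.out.println("Operation: " + operation.getName());

                    if (operation.getInput() == null
                            || operation.getInput().getMessage() == null) {
                        continue;   // no input message to inspect
                    }
                    Map<?, ?> parts = operation.getInput().getMessage().getParts();
                    for (Object p : parts.values()) {
                        Part part = (Part) p;
                        Object type = part.getTypeName() != null
                                ? part.getTypeName() : part.getElementName();
                        System.out.println("  parameter: " + part.getName()
                                + "  type: " + type);
                    }
                }
            }
        }
    }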

  14. Prepare the Tests: Generate the Workload
      - Two options:
        - User-defined workload
        - Random workload
      - The random workload is generated automatically:
        - Generate test values for each input parameter
        - Generate test calls for each operation
        - Select test calls for each operation
          - It may be unfeasible to use a workload based on all the test calls generated (e.g., due to time constraints)
          - It is up to the user to specify the size of this subset
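An illustrative sketch of random workload generation (the slides do not show how test values are produced, so the value generators below are assumptions): one random value per parameter, and one test call per set of parameter values of an operation.

    import java.util.*;

    // Sketch: random test values per (simplified) XSD parameter type, used to
    // build the valid test calls that make up the random workload.
    public class RandomWorkload {
        private static final Random RND = new Random();

        static Object randomValue(String xsdType) {
            switch (xsdType) {
                case "xsd:int":
                case "xsd:integer": return RND.nextInt(10_000);
                case "xsd:boolean": return RND.nextBoolean();
                case "xsd:string":  return randomString(1 + RND.nextInt(12));
                default:            return randomString(8);   // fallback assumption
            }
        }

        static String randomString(int len) {
            StringBuilder sb = new StringBuilder(len);
            for (int i = 0; i < len; i++) {
                sb.append((char) ('a' + RND.nextInt(26)));
            }
            return sb.toString();
        }

        // One test call = one concrete value for each parameter of an operation.
        static Map<String, Object> randomCall(Map<String, String> paramTypes) {
            Map<String, Object> call = new LinkedHashMap<>();
            paramTypes.forEach((name, type) -> call.put(name, randomValue(type)));
            return call;
        }
    }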

  15. Execute the Tests: Configuration

  16. Execute the Tests: Types of Attacks
      - Examples: [table of example attack strings shown on the slide]
      - A total of 137 attack types
      - The list can be continuously improved
        - Just add new attack patterns to a configuration file
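A minimal sketch of reading the attack patterns from a configuration file (the slides only say such a file exists; its name and one-pattern-per-line format are assumptions), so that new attack types can be added without touching the code.

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.List;
    import java.util.stream.Collectors;

    // Sketch: load attack patterns (one per line, '#' for comments) from a file.
    public class AttackPatterns {
        static List<String> load(Path configFile) throws IOException {
            return Files.readAllLines(configFile).stream()
                    .map(String::trim)
                    .filter(line -> !line.isEmpty() && !line.startsWith("#"))
                    .collect(Collectors.toList());
        }

        public static void main(String[] args) throws IOException {
            // Example file contents (classic SQL Injection probes):
            //   ' OR 1=1 --
            //   ' OR ''='
            List<String> patterns = load(Paths.get("attack-patterns.txt"));
            System.out.println(patterns.size() + " attack patterns loaded");
        }
    }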

  17. Execute the Tests: Attack Generation
      - Attacks are generated by mutating the workload test calls
        - Valid values are replaced by malicious values
      - The number of attacks can be extremely large, e.g.:
        - 3 operations with 5 parameters each
        - A workload with 25 test calls per operation
        - 137 attack types → 52,500 attacks
      - The tool allows specifying the number of test calls to be used for attack load generation
        - The original test calls are ranked based on their ability to help detect vulnerabilities
        - e.g., test calls that lead to valid web service responses (i.e., no error) are at the top of the list
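A sketch of the mutation step described above (illustrative only): each attack call is a copy of a valid test call in which the value of exactly one parameter is replaced by one attack pattern, which is what makes the attack load grow to test calls x parameters x attack types.

    import java.util.*;

    // Sketch: generate the attack load by mutating the workload test calls,
    // replacing one valid parameter value at a time with a malicious value.
    public class AttackGenerator {
        static List<Map<String, Object>> mutate(List<Map<String, Object>> testCalls,
                                                List<String> attackPatterns) {
            List<Map<String, Object>> attacks = new ArrayList<>();
            for (Map<String, Object> call : testCalls) {
                for (String param : call.keySet()) {
                    for (String pattern : attackPatterns) {
                        Map<String, Object> attack = new LinkedHashMap<>(call);
                        attack.put(param, pattern);   // one malicious value per attack
                        attacks.add(attack);
                    }
                }
            }
            return attacks;   // |test calls| x |parameters| x |attack patterns|
        }
    }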

  18. Analyze the Responses
      [Figure: rules for analyzing the responses; legend: W = valid (workload) call, A = attack call]

  19. Experimental Evaluation
      - Web services tested: 262 public web services
      - Four steps:
        - Preparation: select a large set of web services
        - Execution: use the vulnerability scanners to scan the services and identify potential vulnerabilities
        - Verification: perform manual testing to confirm that the identified vulnerabilities do exist
        - Analysis: analyze the results and compare the effectiveness of our tool to the commercial ones

  20. Raw Results for Public Web Services

  21. What about false positives? (1)
      - Manual checking
      - Reported vulnerabilities are false positives if:
        - The error/answer obtained is related to a robustness problem and not to a SQL command (e.g., NumberFormatException)
        - The error/value in the response is not caused by the elements "injected" by the tool (i.e., the same problem occurs when the service is executed with valid inputs)

  22. What about false positives? (2)
      - Reported vulnerabilities are confirmed if:
        - It is possible to observe that a SQL command was invalidated by the values "injected" by the tool
        - The "injected" values lead to exceptions raised by the database server
        - It is possible to access unauthorized services or web pages (e.g., by breaking the authentication process using SQL Injection)
      - If none of these rules can be applied, the reported vulnerability is classified as doubtful
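The rules on the last two slides are applied by manual checking; purely as an illustration, a simplified automated approximation could look like the sketch below. The string heuristics (searching the fault text for "NumberFormatException" or SQL error markers) are assumptions, not the authors' actual criteria, and the rule about accessing unauthorized services is omitted because it cannot be checked from a single response.

    // Illustrative sketch of the classification rules (simplified; see caveats above).
    public class VulnerabilityClassifier {

        enum Verdict { FALSE_POSITIVE, CONFIRMED, DOUBTFUL }

        static Verdict classify(String attackResponse, String validResponse) {
            // False positive: robustness problem rather than a SQL problem
            if (attackResponse.contains("NumberFormatException")) {
                return Verdict.FALSE_POSITIVE;
            }
            // False positive: the same problem also occurs with valid inputs
            if (attackResponse.equals(validResponse)) {
                return Verdict.FALSE_POSITIVE;
            }
            // Confirmed: the injected values made the database server raise an error
            if (attackResponse.contains("SQLException")
                    || attackResponse.toLowerCase().contains("syntax error")) {
                return Verdict.CONFIRMED;
            }
            // None of the rules applied
            return Verdict.DOUBTFUL;
        }
    }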

  23. Results for Public Web Services

  24. Detection Coverage
      - Based on limited knowledge
        - Probably we don't know all the existing vulnerabilities
