
Evaluation of Intrusion Detection Systems in Clouds, Thibaut Probst - PowerPoint PPT Presentation



1. Evaluation of Intrusion Detection Systems in Clouds
Thibaut Probst, E. Alata, M. Kaâniche, V. Nicomette
LAAS-CNRS - Dependable Computing and Fault Tolerance (TSF) team
Journée SEC 2 - June 30th, 2015

2. Outline
1. Problem statement
2. Methodology (Cloning, Analysis of accessibilities, IDS/IPS evaluation)
3. Testbed and Experimental Results
4. Conclusion

3. Outline
1. Problem statement
2. Methodology
3. Testbed and Experimental Results
4. Conclusion

4. Context and problem statement

Cloud computing: deploy and manage applications, development and execution platforms, and virtual infrastructures hosted by a provider. Key characteristics: on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service.

Security concerns: many actors and technologies, hence many possible threats. Security mechanisms deployed in virtual infrastructures: firewalls (network filtering), IDS/IPS (attack detection). ⇒ How to assess their effectiveness?

5. Overview of the Approach

Objective: allow the automated evaluation and analysis of security mechanisms deployed in virtual infrastructures ⇒ provide the client and the provider with security reports on network accessibilities and IDS/IPS performance.

Main assumptions:
- Service model: Infrastructure as a Service (IaaS).
- Considered firewall types: edge firewall and hypervisor-based firewall.
- The audit process must not disturb the client's business.

6. Overview of the Approach

Three-phase approach:
1. Retrieval of information and cloning of the infrastructure.
2. Determination of accessibilities in two ways and analysis of discrepancies between the results.
3. Construction and execution of attack campaigns.

7. Outline
1. Problem statement
2. Methodology (Cloning, Analysis of accessibilities, IDS/IPS evaluation)
3. Testbed and Experimental Results
4. Conclusion

8. Preparation of the infrastructure

Objective: starting from the client ID/name,
- clone the virtual infrastructure (virtual datacenters, edge firewalls and virtual networks),
- create the different VMs from a template,
- avoid IP conflicts due to cloning in external networks.
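The cloning step can be pictured with a short sketch. This is only an illustration, not the authors' tool: `api`, `list_vdcs`, `clone_vdc`, `list_networks`, `clone_network`, `allocate_unused_subnet` and `deploy_from_template` are hypothetical wrappers around an IaaS provider's API, assumed here only to show the order of operations and the re-addressing of external networks.

```python
# Hypothetical sketch of the cloning phase (assumed API, not the authors' implementation).
from dataclasses import dataclass, field

@dataclass
class ClonedInfra:
    vdcs: list = field(default_factory=list)
    networks: list = field(default_factory=list)
    vms: list = field(default_factory=list)

def clone_infrastructure(api, client_id, vm_template):
    """Clone a client's virtual datacenters, edge firewalls and networks,
    then recreate the VMs from a single template."""
    clone = ClonedInfra()
    for vdc in api.list_vdcs(client_id):                 # hypothetical call
        clone.vdcs.append(api.clone_vdc(vdc))            # copies the edge firewall configuration
        for net in api.list_networks(vdc):
            cloned_net = api.clone_network(net)
            # Re-address externally routed networks so the clone does not
            # create IP conflicts with the production infrastructure.
            if cloned_net.is_external:
                cloned_net.subnet = api.allocate_unused_subnet()
            clone.networks.append(cloned_net)
    for vm in api.list_vms(client_id):
        clone.vms.append(api.deploy_from_template(vm_template, like=vm))
    return clone
```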

9. Analysis of network access controls

Accessibility: an authorized service from a source to a destination; the first support of attack vectors.

Accessibility matrix: the set of VM ↔ VM and external location ↔ VM accessibilities. It is user-defined in a security policy and implemented on equipment (firewalls) as filtering rules.

Two methods to derive it:
- statically → configured accessibilities,
- dynamically → observed accessibilities.

Discrepancy: an accessibility not present in all three matrices.
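A minimal sketch of the discrepancy analysis, assuming an accessibility is represented as a (source, source port, destination, protocol, destination port) tuple and that the three matrices (policy, configured, observed) are available as sets; the `Access` type and the example values are illustrative.

```python
# Sketch: discrepancy = an accessibility that is not present in all three matrices.
from collections import namedtuple

Access = namedtuple("Access", "src sport dst proto dport")

def discrepancies(policy, configured, observed):
    """Return accessibilities missing from at least one of the three matrices."""
    all_seen = policy | configured | observed
    common = policy & configured & observed
    return [
        {
            "accessibility": acc,
            "in_policy": acc in policy,
            "configured": acc in configured,
            "observed": acc in observed,
        }
        for acc in all_seen - common
    ]

# Example: HTTP from VM1 to VM2 is allowed by the policy and the configuration
# but never observed, e.g. because a host firewall on VM2 blocks it.
policy = {Access("VM1", "*", "VM2", "TCP", 80)}
configured = {Access("VM1", "*", "VM2", "TCP", 80)}
observed = set()
print(discrepancies(policy, configured, observed))
```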

10. Static analysis

Accessibility predicate: accessibility(X, SPORT, Y, PROTO, DPORT).

Static analysis: determination of all accessibility predicates from the configuration of the cloud components, yielding the configured accessibilities.
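A simplified illustration of how the predicate can be evaluated against filtering rules, assuming first-match semantics and wildcard fields; the rule format and the example rules are assumptions, and the authors' static analysis is more involved. In practice the predicate would have to hold on every firewall along the path between X and Y (edge and hypervisor-based).

```python
# Sketch: deriving configured accessibilities from filtering rules,
# using simple first-match semantics as a simplifying assumption.
def matches(rule, src, sport, dst, proto, dport):
    def ok(field, value):
        return rule[field] in ("*", value)
    return (ok("src", src) and ok("sport", sport) and ok("dst", dst)
            and ok("proto", proto) and ok("dport", dport))

def accessibility(rules, src, sport, dst, proto, dport, default="DENY"):
    """accessibility(X, SPORT, Y, PROTO, DPORT) holds if the first matching
    filtering rule accepts the packet (or the default policy does)."""
    for rule in rules:
        if matches(rule, src, sport, dst, proto, dport):
            return rule["action"] == "ACCEPT"
    return default == "ACCEPT"

edge_fw = [
    {"src": "*", "sport": "*", "dst": "VM2", "proto": "TCP", "dport": 80, "action": "ACCEPT"},
    {"src": "*", "sport": "*", "dst": "*", "proto": "*", "dport": "*", "action": "DENY"},
]
print(accessibility(edge_fw, "VM1", 45000, "VM2", "TCP", 80))   # True
print(accessibility(edge_fw, "VM1", 45000, "VM2", "TCP", 22))   # False
```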

11. Dynamic analysis

Dynamic analysis: network packet exchanges between VMs (client → server) to determine accessibilities, yielding the observed accessibilities.

Design of an algorithm to perform all client-server sessions in the smallest possible number of iterations ⇒ run sessions in parallel whenever possible.
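The parallelization idea can be sketched as a greedy grouping of sessions into rounds in which no VM takes part in more than one session, so that concurrent sessions cannot interfere with each other's observations. Both the constraint and the greedy algorithm below are illustrative assumptions, not necessarily the authors' exact algorithm.

```python
# Sketch: greedy grouping of client->server test sessions into parallel rounds,
# with the assumed constraint that a VM appears in at most one session per round.
def schedule_sessions(sessions):
    """sessions: iterable of (client_vm, server_vm, proto, dport) tuples.
    Returns a list of rounds, each round being a list of sessions that can
    run concurrently because they share no VM."""
    remaining = list(sessions)
    rounds = []
    while remaining:
        busy, this_round, leftover = set(), [], []
        for s in remaining:
            client, server = s[0], s[1]
            if client in busy or server in busy:
                leftover.append(s)          # postpone to a later round
            else:
                busy.update((client, server))
                this_round.append(s)
        rounds.append(this_round)
        remaining = leftover
    return rounds

sessions = [("VM1", "VM2", "TCP", 80), ("VM3", "VM2", "TCP", 80),
            ("VM1", "VM3", "TCP", 21), ("VM2", "VM4", "TCP", 25)]
for i, r in enumerate(schedule_sessions(sessions), 1):
    print(f"round {i}: {r}")
```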

12. Evaluation traffic

Automaton-based modelling:
- to generate and replay legitimate and malicious packet exchanges,
- to avoid installing, running and stopping applications during the attack campaigns,
- to be free of Windows licences.
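As an illustration of the automaton idea, the sketch below replays a recorded FTP login dialogue against a server without running a real FTP client: each transition expects a reply prefix and sends the next message. The dialogue, state names and timeout are assumptions made for the example.

```python
# Sketch: replaying a protocol exchange as a small automaton instead of
# running the real client application. Each transition is
# (state, expected_reply_prefix, message_to_send, next_state).
import socket

FTP_LOGIN_AUTOMATON = [
    ("INIT",      "220", b"USER test\r\n", "USER_SENT"),
    ("USER_SENT", "331", b"PASS test\r\n", "PASS_SENT"),
    ("PASS_SENT", "230", b"QUIT\r\n",      "DONE"),
]

def replay(host, port, automaton, timeout=5.0):
    state = "INIT"
    with socket.create_connection((host, port), timeout=timeout) as sock:
        for cur, expected, message, nxt in automaton:
            if cur != state:
                continue
            reply = sock.recv(4096).decode(errors="replace")
            if not reply.startswith(expected):
                return False        # the server deviated from the recorded dialogue
            sock.sendall(message)
            state = nxt
    return state == "DONE"
```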

13. Evaluation traffic

Automaton generation:
- a process run before the attack campaigns,
- uses Metasploit (malicious traffic) and various tools (legitimate traffic) to interact with the vulnerable applications.
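A hedged sketch of how such an automaton could be generated from a captured dialogue, for instance recorded while a Metasploit module or a legitimate client interacts with the service; the capture format and the three-character reply-code heuristic are assumptions for illustration only.

```python
# Sketch: turning a recorded client/server dialogue into the transition list
# used by the replay automaton shown on the previous slide.
def dialogue_to_automaton(dialogue):
    """dialogue: ordered list of (direction, payload) pairs, direction being
    'server' or 'client'. Each server reply followed by a client message
    becomes one transition."""
    transitions, state_id, expected = [], 0, None
    for direction, payload in dialogue:
        if direction == "server":
            expected = payload[:3].decode(errors="replace")   # e.g. an FTP/SMTP reply code
        elif direction == "client" and expected is not None:
            transitions.append((f"S{state_id}", expected, payload, f"S{state_id + 1}"))
            state_id += 1
            expected = None
    return transitions

capture = [("server", b"220 FTP ready"), ("client", b"USER test\r\n"),
           ("server", b"331 Password required"), ("client", b"PASS test\r\n")]
print(dialogue_to_automaton(capture))
```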

14. Attack campaigns

Principle:
- execution of attack and legitimate activity sessions for each accessibility discovered during the previous phase,
- sessions are parallelized as much as possible,
- attacks are taken from an attack dictionary,
- alarms from the different NIDSes are collected, backed up and analysed to compute the usual IDS metrics.
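A sketch of the campaign construction, assuming accessibilities are the `Access` tuples from the earlier sketch, that dictionary entries carry the fields of the table shown later, and that `run_automaton` and `collect_alarms` stand in for the traffic replay and NIDS log retrieval steps; none of these names come from the authors' tool.

```python
# Sketch: build a campaign by pairing each observed accessibility with the
# dictionary entries whose protocol and destination port match, then run it
# and keep the ground truth needed to score the NIDS alarms afterwards.
def build_campaign(accessibilities, attack_dictionary):
    campaign = []
    for acc in accessibilities:                         # Access(src, sport, dst, proto, dport)
        for entry in attack_dictionary:
            if entry["proto"] == acc.proto and entry["port"] == acc.dport:
                campaign.append({"accessibility": acc, "exploit": entry["exploit_id"]})
    return campaign

def run_campaign(campaign, run_automaton, collect_alarms):
    ground_truth = []
    for item in campaign:
        start = run_automaton(item)                     # replays malicious and legitimate traffic
        ground_truth.append({"item": item, "start_time": start})
    return ground_truth, collect_alarms()               # alarms are matched to ground truth later
```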

15. Attack campaigns: NIDS metrics

An alarm is a True Positive (TP) if:
- the duration between the alarm and the attack is below a threshold,
- the IP addresses, protocols and ports included in the alarm correspond to those of the attack,
- the alarm is serious,
- the CVEs of the alarm correspond to the CVE of the attack.

A False Positive (FP) is a raised alarm that does not correspond to an attack.
A False Negative (FN) is an attack for which no alarm is raised.

Detection rate: DR = TP / (TP + FN).
Precision: PR = TP / (TP + FP).
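A sketch of the alarm/attack matching and of the two metrics, using the criteria listed above; the record field names and the 5-second threshold are illustrative assumptions.

```python
# Sketch: match alarms to attacks with the TP criteria, then compute
# detection rate DR = TP / (TP + FN) and precision PR = TP / (TP + FP).
def is_true_positive(alarm, attack, max_delay=5.0):
    return (abs(alarm["time"] - attack["time"]) < max_delay
            and alarm["src"] == attack["src"] and alarm["dst"] == attack["dst"]
            and alarm["proto"] == attack["proto"] and alarm["dport"] == attack["dport"]
            and alarm["serious"]
            and (attack["cve"] is None or attack["cve"] in alarm["cves"]))

def nids_metrics(alarms, attacks, max_delay=5.0):
    tp = sum(1 for atk in attacks
             if any(is_true_positive(al, atk, max_delay) for al in alarms))
    fn = len(attacks) - tp
    fp = sum(1 for al in alarms
             if not any(is_true_positive(al, atk, max_delay) for atk in attacks))
    detection_rate = tp / (tp + fn) if attacks else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return detection_rate, precision
```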

16. Outline
1. Problem statement
2. Methodology
3. Testbed and Experimental Results
4. Conclusion

17. Testbed environment

18. Dynamic analysis of accessibilities: example

19. NIDS evaluation: attacks dictionary

exploit id | CVE       | proto | port | date       | description                                                             | N_sl | N_sm
35660      | 2014-9567 | TCP   | 80   | 2014-12-02 | ProjectSend Arbitrary File Upload                                       | 3    | 3
34926      | 2014-6287 | TCP   | 80   | 2014-09-11 | Rejetto HttpFileServer Remote Command Execution                         | 12   | 12
33790      | N/A       | TCP   | 80   | 2014-05-20 | Easy File Management Web Server Stack Buffer Overflow                   | 7    | 8
25775      | 2013-2028 | TCP   | 80   | 2013-05-07 | Nginx HTTP Server 1.3.9-1.4.0 - Chunked Encoding Stack Buffer Overflow  | 3    | 8
16970      | 2002-2268 | TCP   | 80   | 2010-12-26 | Kolibri 2.0 - HTTP Server HEAD Buffer Overflow                          | 2    | 2
16806      | 2007-6377 | TCP   | 80   | 2007-12-10 | BadBlue 2.72b PassThru Buffer Overflow                                  | 9    | 3
28681      | N/A       | TCP   | 21   | 2013-08-20 | freeFTPd PASS Command Buffer Overflow                                   | 11   | 4
24875      | N/A       | TCP   | 21   | 2013-02-27 | Sami FTP Server LIST Command Buffer Overflow                            | 11   | 3
17355      | 2006-6576 | TCP   | 21   | 2011-01-23 | GoldenFTP 4.70 PASS Stack Buffer Overflow                               | 13   | 7
16742      | 2006-3952 | TCP   | 21   | 2006-07-31 | Easy File Sharing FTP Server 2.0 PASS Overflow                          | 11   | 6
16713      | 2006-2961 | TCP   | 21   | 2006-06-12 | Cesar FTP 0.99g MKD Command Buffer Overflow                             | 11   | 6
16821      | 2007-4440 | TCP   | 25   | 2007-08-18 | Mercury Mail SMTP AUTH CRAM-MD5 Buffer Overflow                         | 9    | 8
16822      | 2004-1638 | TCP   | 25   | 2004-10-26 | TABS MailCarrier 2.51 - SMTP EHLO Overflow                              | 6    | 6
16476      | 2006-1255 | TCP   | 143  | 2006-03-17 | Mercur 5.0 - IMAP SP3 SELECT Buffer Overflow                            | 7    | 4
16474      | 2005-4267 | TCP   | 143  | 2005-12-20 | Qualcomm WorldMail 3.0 IMAPD LIST Buffer Overflow                       | 2    | 2

20. NIDS evaluation: example with Snort and Suricata

Detection rate and precision:
- Snort detects more attacks than Suricata.
- Suricata is more accurate than Snort.
- Considering only serious alarms decreases the detection rate but improves the precision.
