

SLIDE 1


Evaluation of Intrusion Detection Systems in Clouds

Thibaut Probst, E. Alata, M. Kaâniche, V. Nicomette

LAAS-CNRS - Dependable Computing and Fault Tolerance (TSF) team

Journée SEC 2 - June 30th, 2015


SLIDE 2


1. Problem statement
2. Methodology
   - Cloning
   - Analysis of accessibilities
   - IDS/IPS evaluation
3. Testbed and Experimental Results
4. Conclusion


SLIDE 3


Outline

1. Problem statement
2. Methodology
3. Testbed and Experimental Results
4. Conclusion


SLIDE 4


Context and problem statement

Cloud computing

- Deploy and manage applications, development and execution platforms, and virtual infrastructures hosted by a provider.
- On-demand self-service, broad network access, resource pooling, rapid elasticity, measured service.

Security concerns

- Many actors and technologies → many possible threats.
- Security mechanisms deployed in virtual infrastructures: firewalls (network filtering), IDS/IPS (attack detection).

⇒ How to assess their efficiency?


SLIDE 5


Overview of the Approach

Objective

Allow the automated evaluation and analysis of security mechanisms deployed in virtual infrastructures:
⇒ provide the client and the provider with security reports on network accessibilities and IDS/IPS performance.

Main assumptions

- Service model: Infrastructure as a Service (IaaS).
- Considered firewall types: edge firewall and hypervisor-based firewall.
- The audit process should not disturb the client's business.


SLIDE 6


Overview of the Approach

Three-phase approach

1. Retrieval of information and cloning of the infrastructure.
2. Determination of accessibilities in two ways and analysis of discrepancies in the results.
3. Construction and execution of attack campaigns.

SLIDE 7


Outline

1. Problem statement
2. Methodology
   - Cloning
   - Analysis of accessibilities
   - IDS/IPS evaluation
3. Testbed and Experimental Results
4. Conclusion


SLIDE 8


Preparation of the infrastructure

Objective

- From the client ID/name, clone the virtual infrastructure (virtual datacenters, edge firewalls and virtual networks).
- Create the different VMs from a template.
- Avoid IP conflicts in external networks caused by cloning (see the sketch below).
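The slides do not detail how IP conflicts are avoided; the fragment below is a minimal sketch of one plausible remapping scheme, where the function name, the test range and the example addresses are all assumptions.

```python
# Hypothetical illustration (not the authors' tool): give every cloned VM
# a fresh address in a dedicated test range and keep a mapping back to the
# original, so filtering rules can be rewritten consistently on the clone.
import ipaddress

def remap_addresses(original_ips, test_net="10.99.0.0/24"):
    """Map each original VM address to an unused address in test_net."""
    pool = ipaddress.ip_network(test_net).hosts()
    return {ip: str(next(pool)) for ip in original_ips}

mapping = remap_addresses(["192.168.1.10", "192.168.1.11"])
print(mapping)  # {'192.168.1.10': '10.99.0.1', '192.168.1.11': '10.99.0.2'}
```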


SLIDE 9


Analysis of network access controls

Accessibility:

- An authorized service from a source to a destination.
- The first support of attack vectors.

Accessibility matrix:

- Set of VM ↔ VM and external location ↔ VM accessibilities.
- User-defined in a security policy and implemented on equipment (firewalls) as filtering rules.

Two methods to derive it:

- Statically → configured accessibilities.
- Dynamically → observed accessibilities.

Discrepancy: an accessibility that does not appear in all three matrices (security policy, configured, observed); see the sketch below.
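Assuming each matrix is represented as a set of accessibility tuples, the discrepancy check reduces to set operations; the hosts and rules below are made up for illustration.

```python
# Each matrix is a set of (src, sport, dst, proto, dport) tuples.
policy     = {("ext", "any", "web-vm", "TCP", 80)}
configured = {("ext", "any", "web-vm", "TCP", 80),
              ("ext", "any", "web-vm", "TCP", 8080)}  # stale filtering rule
observed   = {("ext", "any", "web-vm", "TCP", 80)}

# A discrepancy is an accessibility not present in all three matrices.
discrepancies = (policy | configured | observed) - (policy & configured & observed)
print(discrepancies)  # flags the TCP/8080 accessibility
```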


SLIDE 10


Static analysis

- Accessibility predicate: accessibility(X, SPORT, Y, PROTO, DPORT).
- Static analysis: determination of all accessibility predicates from the configuration of the cloud components → configured accessibilities (see the sketch below).
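The slides do not show how the predicate is evaluated against the configuration; the sketch below illustrates a first-match evaluation over a hypothetical rule set (rules, hosts and ports are all made up).

```python
# Minimal sketch (not the authors' tool): the accessibility predicate
# evaluated against a hypothetical list of firewall filtering rules.
from itertools import product

# Hypothetical filtering rules: (src, dst, proto, dport, action).
RULES = [
    ("web-vm", "db-vm",  "TCP", 3306, "allow"),
    ("any",    "web-vm", "TCP", 80,   "allow"),
    ("any",    "any",    "any", None, "deny"),   # default deny
]

def accessibility(x, sport, y, proto, dport):
    """True if the first matching rule allows the flow (first-match wins)."""
    for src, dst, p, dp, action in RULES:
        if (src in (x, "any") and dst in (y, "any")
                and p in (proto, "any") and dp in (dport, None)):
            return action == "allow"
    return False  # no rule matched: default deny

# Enumerate configured accessibilities over a small test space.
vms, ports = ["web-vm", "db-vm", "ext"], [80, 3306]
for x, y, dport in product(vms, vms, ports):
    if x != y and accessibility(x, 1025, y, "TCP", dport):
        print(f"accessibility({x}, 1025, {y}, TCP, {dport})")
```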


SLIDE 11


Dynamic analysis

- Dynamic analysis: network packet exchanges between VMs (client → server) to determine accessibilities → observed accessibilities.
- Design of an algorithm to perform all client-server sessions in the smallest possible number of iterations ⇒ run sessions in parallel when possible (see the sketch below).
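The authors' algorithm is not given on the slide; the greedy batching below illustrates the parallelization idea under the assumption that each VM can take part in at most one session per iteration.

```python
# Hypothetical sketch of the parallel-scheduling idea: pack as many
# client->server test sessions per iteration as possible, assuming each
# VM plays at most one role (client or server) per iteration.
def schedule_sessions(sessions):
    """sessions: list of (client_vm, server_vm, proto, dport) tuples.
    Returns a list of iterations, each a batch of conflict-free sessions."""
    pending = list(sessions)
    iterations = []
    while pending:
        busy, batch, remaining = set(), [], []
        for s in pending:
            client, server = s[0], s[1]
            if client not in busy and server not in busy:
                batch.append(s)
                busy.update((client, server))
            else:
                remaining.append(s)  # retry in a later iteration
        iterations.append(batch)
        pending = remaining
    return iterations

plan = schedule_sessions([
    ("vm1", "vm2", "TCP", 80),
    ("vm3", "vm4", "TCP", 21),
    ("vm1", "vm3", "TCP", 25),   # conflicts with both sessions above
])
for i, batch in enumerate(plan, 1):
    print(f"iteration {i}: {batch}")  # 2 iterations instead of 3
```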


SLIDE 12


Evaluation traffic

Automata-based modeling

- To generate and replay legitimate and malicious packet exchanges (see the sketch below).
- To avoid installing, running and stopping applications during the attack campaigns.
- To be free of Windows licences.
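The encoding of the automata is not shown; the fragment below sketches one plausible encoding, a list of (payload, expected reply prefix) transitions driven over a TCP socket. The FTP exchange and all names are assumptions, not the authors' implementation.

```python
# Illustrative sketch: a traffic session modeled as a finite automaton
# whose transitions are "send payload, expect reply prefix". Replaying it
# needs only a socket, so no server application has to run on the sender.
import socket

# Hypothetical automaton for a minimal FTP login exchange.
FTP_LOGIN_AUTOMATON = [
    (b"",              b"220"),  # initial state: wait for the banner
    (b"USER test\r\n", b"331"),
    (b"PASS test\r\n", b"230"),
    (b"QUIT\r\n",      b"221"),
]

def replay(host, port, automaton, timeout=5.0):
    """Drive the automaton against host:port; return True if every
    expected reply prefix was observed."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        for payload, expected in automaton:
            if payload:
                sock.sendall(payload)
            reply = sock.recv(4096)
            if not reply.startswith(expected):
                return False
    return True
```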


SLIDE 13


Evaluation traffic

Automata generation

- Performed before the attack campaigns.
- Use of Metasploit (malicious traffic) and various tools (legitimate traffic) to interact with the vulnerable applications.


SLIDE 14


Attack campaigns

Principle

- Execution of attack and legitimate activity sessions for each accessibility discovered during the previous phase.
- Parallelize the sessions as much as possible.
- Use of an attack dictionary (see the sketch below).
- Alarms from the different NIDSes are collected, backed up and analysed to compute the usual IDS metrics.
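A plausible way to build a campaign from the dictionary, assuming each accessibility is a tuple and each dictionary entry carries the protocol and port it targets; the data shapes are assumptions, while the exploit IDs come from the dictionary slide further down.

```python
# Sketch of campaign construction: for each discovered accessibility,
# select the dictionary attacks that fit its protocol and destination
# port, so only feasible attacks are replayed over that accessibility.
ATTACK_DICTIONARY = [  # excerpt of the dictionary shown on SLIDE 19
    {"exploit_id": 34926, "cve": "2014-6287", "proto": "TCP", "port": 80},
    {"exploit_id": 16742, "cve": "2006-3952", "proto": "TCP", "port": 21},
]

def build_campaign(accessibilities):
    """accessibilities: iterable of (src, sport, dst, proto, dport)."""
    return [(src, dst, attack["exploit_id"])
            for src, sport, dst, proto, dport in accessibilities
            for attack in ATTACK_DICTIONARY
            if attack["proto"] == proto and attack["port"] == dport]

print(build_campaign([("ext", 1025, "web-vm", "TCP", 80)]))
# [('ext', 'web-vm', 34926)]
```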


SLIDE 15


Attack campaigns

NIDS metrics

An alarm is counted as a True Positive if:

- the duration between the alarm and the attack is below a threshold;
- the IP addresses, protocols and ports included in the alarm correspond to those of the attack;
- the alarm is serious;
- the CVEs of the alarm correspond to the CVE of the attack.

A False Positive is a raised alarm that does not correspond to an attack.
A False Negative is an attack for which no alarm was raised.

Detection rate: DR = TP / (TP + FN)
Precision: PR = TP / (TP + FP)
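Putting the criteria and the formulas together, a minimal sketch of the metric computation; the field names, the time threshold and the dict-based event encoding are assumptions.

```python
# Compute detection rate and precision from collected alarms and the
# executed attacks. Events are dicts; alarms additionally carry a
# "serious" flag so the serious-alarms-only variant can be evaluated.
THRESHOLD_S = 60.0  # assumed maximum delay between attack and alarm

def matches(alarm, attack):
    """True-positive criteria: time window plus field-by-field match."""
    return (abs(alarm["time"] - attack["time"]) < THRESHOLD_S
            and all(alarm[k] == attack[k]
                    for k in ("src", "dst", "proto", "dport", "cve")))

def rates(alarms, attacks, serious_only=False):
    if serious_only:  # the variant discussed on the results slide
        alarms = [al for al in alarms if al["serious"]]
    tp = sum(any(matches(al, at) for al in alarms) for at in attacks)
    fn = len(attacks) - tp
    fp = sum(not any(matches(al, at) for at in attacks) for al in alarms)
    dr = tp / (tp + fn) if attacks else 0.0    # DR = TP / (TP + FN)
    pr = tp / (tp + fp) if (tp + fp) else 0.0  # PR = TP / (TP + FP)
    return dr, pr
```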


SLIDE 16


Outline

1. Problem statement
2. Methodology
3. Testbed and Experimental Results
4. Conclusion


SLIDE 17


Testbed environment


SLIDE 18


Dynamic analysis of accessibilities: example


SLIDE 19


NIDS evaluation: attack dictionary

exploit id | CVE       | proto | port | date       | description                                                             | Nsl | Nsm
35660      | 2014-9567 | TCP   | 80   | 2014-12-02 | ProjectSend Arbitrary File Upload                                       | 3   | 3
34926      | 2014-6287 | TCP   | 80   | 2014-09-11 | Rejetto HttpFileServer Remote Command Execution                         | 12  | 12
33790      | N/A       | TCP   | 80   | 2014-05-20 | Easy File Management Web Server Stack Buffer Overflow                   | 7   | 8
25775      | 2013-2028 | TCP   | 80   | 2013-05-07 | Nginx HTTP Server 1.3.9-1.4.0 - Chunked Encoding Stack Buffer Overflow  | 3   | 8
16970      | 2002-2268 | TCP   | 80   | 2010-12-26 | Kolibri 2.0 - HTTP Server HEAD Buffer Overflow                          | 2   | 2
16806      | 2007-6377 | TCP   | 80   | 2007-12-10 | BadBlue 2.72b PassThru Buffer Overflow                                  | 9   | 3
28681      | N/A       | TCP   | 21   | 2013-08-20 | freeFTPd PASS Command Buffer Overflow                                   | 11  | 4
24875      | N/A       | TCP   | 21   | 2013-02-27 | Sami FTP Server LIST Command Buffer Overflow                            | 11  | 3
17355      | 2006-6576 | TCP   | 21   | 2011-01-23 | GoldenFTP 4.70 PASS Stack Buffer Overflow                               | 13  | 7
16742      | 2006-3952 | TCP   | 21   | 2006-07-31 | Easy File Sharing FTP Server 2.0 PASS Overflow                          | 11  | 6
16713      | 2006-2961 | TCP   | 21   | 2006-06-12 | Cesar FTP 0.99g MKD Command Buffer Overflow                             | 11  | 6
16821      | 2007-4440 | TCP   | 25   | 2007-08-18 | Mercury Mail SMTP AUTH CRAM-MD5 Buffer Overflow                         | 9   | 8
16822      | 2004-1638 | TCP   | 25   | 2004-10-26 | TABS MailCarrier 2.51 - SMTP EHLO Overflow                              | 6   | 6
16476      | 2006-1255 | TCP   | 143  | 2006-03-17 | Mercur 5.0 - IMAP SP3 SELECT Buffer Overflow                            | 7   | 4
16474      | 2005-4267 | TCP   | 143  | 2005-12-20 | Qualcomm WorldMail 3.0 IMAPD LIST Buffer Overflow                       | 2   | 2

SLIDE 20


NIDS evaluation: example with Snort and Suricata

Detection rate and precision

- Snort detects more attacks than Suricata.
- Suricata is more precise than Snort.
- Considering only serious alarms decreases the detection rate but improves the precision.


SLIDE 21


Outline

1. Problem statement
2. Methodology
3. Testbed and Experimental Results
4. Conclusion


SLIDE 22


Conclusion

Proposed approach

- Analysis of network access controls and evaluation of IDS/IPS.
- Fully automated process with no impact on the client's business.

Contributions

- Cloning of virtual infrastructures.
- Static analysis of network access controls.
- Dynamic analysis of network access controls.
- Construction and execution of attack campaigns through automata-based methods.

Ongoing work and perspectives

- Autonomous system to automatically launch security audits.
- Extension of the prototypes to other cloud solutions (OpenStack...).
