
Using Rule-Based Activity Descriptions to Evaluate Intrusion Detection Systems


  1. Using Rule-Based Activity Descriptions to Evaluate Intrusion Detection Systems. Dominique Alessandri <dal@zurich.ibm.com>, September 2001

  2. Motivation and Goals of this Work
     Motivation: unresolved issues with respect to ID-architecture components, i.e. IDSes:
     - High failure rate of IDSes (false positives and false negatives)
     - Insufficient understanding of the semantics attached to alarms generated by diverse IDSes (alarm correlation and design of ID-architectures)
     - Difficulty of IDS testing (specific to the environment, static setup, heavyweight procedures)
     Goals of this work:
     - Evaluate IDSes with respect to their potential detection capabilities and their potential for failure
     - Identify combinations of IDSes that provide increased ID coverage

  3. Classification of Attacks
     We need an attack classification in order to identify a representative set of attacks for evaluating IDSes.
     Issues with existing attack classifications (e.g. Howard, Cohen, Neumann, Kumar and others):
     - None of the existing classifications allows attacks to be classified with respect to aspects relevant to an IDS, namely all aspects of attacks that are potentially observable by an IDS.
     - Aspects of attacks and vulnerabilities are often not clearly separated.
     - Classification categories are not distinct.
     We therefore need to develop a classification of attacks that classifies attacks according to criteria relevant to ID.

  4. Classification of IDSes
     We need a description scheme for IDSes that enables us to determine what an IDS is able to deduce from a given activity.
     Issue with existing taxonomies and classifications:
     - Existing classifications (e.g. Debar et al., Axelsson, Lunt, Jackson and others) are not sufficiently systematic and/or detailed.
     We therefore need to develop a description scheme for IDSes that is systematic and detailed enough to make it possible to analyze what IDSes are able to deduce from activities (a small illustrative sketch follows below).
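The slides do not show the description scheme itself, so the following is only a hypothetical illustration of what such a scheme might capture; the IDSDescription fields and the example characteristics are invented for this sketch, not taken from the work.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IDSDescription:
    """Hypothetical, highly simplified IDS description: which characteristics
    of an activity the IDS observes, and which of them trigger an alarm."""
    name: str
    observes: frozenset        # e.g. {"http-uri", "tcp-flags", "syslog-entry"}
    alarm_criteria: frozenset  # observable characteristics that raise alarms

def deducible(ids: IDSDescription, activity_characteristics: set) -> set:
    """What the IDS can deduce from a given activity: only those of the
    activity's characteristics that the IDS actually observes."""
    return set(ids.observes) & activity_characteristics

# Example: a coarse description of a network-based signature matcher.
net_matcher = IDSDescription(
    name="network signature matcher",
    observes=frozenset({"http-uri", "tcp-flags", "payload-pattern"}),
    alarm_criteria=frozenset({"payload-pattern"}),
)
print(deducible(net_matcher, {"http-uri", "file-access", "payload-pattern"}))
```

A systematic scheme along these lines is what makes it possible to reason about an IDS's detection potential without executing it against a testbed.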

  5. Our Approach to IDS Evaluation (workflow diagram)
     - A classification scheme for attacks and activities in general leads to a classification of attacks (using VulDa), the selection of attacks, and descriptions of activities (incl. attacks).
     - An IDS description scheme leads to IDS descriptions.
     - Both kinds of descriptions feed the RIDAX tool (Rule-based Intrusion Detection Analysis and eXamination).
     - The evaluation yields potential true/false positives/negatives, an analysis of alarm-set semantics, and precision and coverage estimates for single IDSes and for combinations of IDSes (see the sketch below).
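To make the evaluation step concrete, here is a minimal sketch of the underlying idea, using hypothetical data structures rather than the actual RIDAX rules: described activity classes are matched against a described alarm condition and binned into potential true/false positives/negatives.

```python
# Minimal sketch of the evaluation idea (hypothetical structures, not RIDAX itself).
# An activity class is a set of observable characteristics plus an is-attack flag;
# the IDS is reduced to the set of characteristics on which it raises an alarm.

def evaluate(ids_alarm_criteria: set, activity_classes: dict) -> dict:
    """Bin each described activity class into potential TP/FP/FN/TN for one IDS."""
    outcome = {"true_pos": [], "false_pos": [], "false_neg": [], "true_neg": []}
    for name, (characteristics, is_attack) in activity_classes.items():
        alarm = bool(ids_alarm_criteria & characteristics)  # would the IDS alarm?
        if is_attack and alarm:
            outcome["true_pos"].append(name)
        elif is_attack:
            outcome["false_neg"].append(name)
        elif alarm:
            outcome["false_pos"].append(name)
        else:
            outcome["true_neg"].append(name)
    return outcome

# Toy input: two attack classes and one benign activity class (all invented).
activities = {
    "cgi-exploit":   ({"http-uri", "payload-pattern"}, True),
    "ftp-bounce":    ({"ftp-command"}, True),
    "long-http-uri": ({"http-uri"}, False),
}
print(evaluate({"http-uri"}, activities))
# cgi-exploit -> potential true positive, ftp-bounce -> potential false negative,
# long-http-uri -> potential false positive.
```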

  6. A new IDS Evaluation Paradigm
     Comparison of paradigms: existing work by Lippmann et al. (and others) vs. our approach (rule-based evaluation).
     - Goal: provide measures to judge the quality of IDSes and to support the selection of IDSes vs. provide coverage and precision estimates of single IDSes and IDS combinations to support ID-architecture design
     - Implementation: evaluation of specific versions and configurations of real IDSes vs. evaluation of the potential of IDSes based on a description of their capabilities
     - Realization: evaluation testbed, replay of recorded traffic vs. description of IDSes, attacks and benign activity using Prolog rules
     - What is evaluated: IDS implementation and configuration vs. the potential of the technology used
     - Environment: specific and given (testbed) vs. independent
     - Input: real attacks and background activity (traffic) vs. descriptions of classes of attacks and of benign activity
     - Input variation: known variants of given attacks selected vs. generated systematically (see the sketch after this list)
     - Results wrt. true positives: list of specific attacks the IDS can detect vs. list of attack classes the IDS can potentially detect
     - Results wrt. false positives: list of false positives observed during the evaluation process vs. list of activity classes that potentially cause false positives
     - Analysis of results: number and percentage of detected attacks and false positives vs. potential and precision estimates for a normalized input set
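The "input variation generated systematically" row is where the rule-based paradigm differs most visibly from testbed replay. As a rough sketch only, with entirely made-up variation dimensions, systematic variation can be thought of as enumerating the cross product of the options listed in an attack-class description:

```python
from itertools import product

# Hypothetical attack-class description: each dimension lists variations an
# attacker could choose. The dimensions and values are invented for this sketch.
cgi_exploit_class = {
    "encoding":      ["plain", "url-encoded", "unicode"],
    "fragmentation": ["none", "tcp-segmented"],
    "target":        ["phf", "test-cgi"],
}

def variants(attack_class: dict):
    """Systematically enumerate every combination of the described variations."""
    dims = sorted(attack_class)
    for combo in product(*(attack_class[d] for d in dims)):
        yield dict(zip(dims, combo))

print(sum(1 for _ in variants(cgi_exploit_class)))  # 3 * 2 * 2 = 12 variant classes
```

Replaying recorded traffic, by contrast, only ever covers the variants that happen to be present in the recording.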

  7. Estimates (Rating Precision vs. Attack Detection Recall)
     - Goal: find the combination of IDSes that meets our requirements (e.g. 80% coverage, 80% rating precision); the optimum lies at concurrently 100% coverage and 100% rating precision.
     - Chart: recall (coverage) plotted against rating precision, both from 0% to 100%, with separate series for single IDSes and for combinations of 2, 3, 4 and 5 IDSes.
     - Considered IDSes: Snort (simple configuration), Snort (all features), DaemonWatcher for httpd and ftpd (IBM), and WebIDS (IBM).
     (A sketch of such a combination search follows below.)
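As a rough sketch of the combination search behind this plot (the per-IDS profiles below are invented, not the paper's data, and "precision" here is only a stand-in for the paper's rating precision), the coverage of a combination can be taken as the union of the attack classes its members potentially detect, keeping only combinations that meet the 80%/80% requirement:

```python
from itertools import combinations

# Invented toy profiles: for each IDS, the attack classes it potentially detects
# and the benign activity classes that potentially cause false positives.
ATTACK_CLASSES = {f"a{i}" for i in range(10)}
IDS_PROFILES = {
    "ids-A": ({"a0", "a1", "a2", "a3"}, {"b0"}),
    "ids-B": ({"a3", "a4", "a5", "a6"}, {"b1", "b2"}),
    "ids-C": ({"a6", "a7", "a8", "a9"}, set()),
}

def score(combo):
    """Coverage = fraction of attack classes detected by at least one member.
    'Precision' = detected attack classes / (detected + false-positive classes),
    used here only as a stand-in for the paper's rating precision."""
    detected = set().union(*(IDS_PROFILES[i][0] for i in combo))
    false_pos = set().union(*(IDS_PROFILES[i][1] for i in combo))
    coverage = len(detected) / len(ATTACK_CLASSES)
    precision = len(detected) / (len(detected) + len(false_pos))
    return coverage, precision

for size in (1, 2, 3):
    for combo in combinations(IDS_PROFILES, size):
        cov, prec = score(combo)
        if cov >= 0.8 and prec >= 0.8:  # the 80% coverage / 80% precision requirement
            print(combo, f"coverage={cov:.0%}", f"precision={prec:.0%}")
```

With these toy numbers the pair ("ids-A", "ids-C") is the smallest combination that clears both thresholds, which mirrors the slide's point that combinations of IDSes can reach coverage/precision targets that no single IDS meets.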
