Proactive Detection of Network Security Incidents – A Study
  1. Proactive Detection of Network Security Incidents – A Study Andrea Dufkova (ENISA) Piotr Kijewski (CERT Polska/NASK) FIRST 2012 Conference 21st June 2012, Malta www.enisa.europa.eu

  2. OUR TALK TODAY … i. Links with ENISA work ii. Facts about the study iii. Dive into the research findings iv. Impact of the study in Poland v. Open questions vi. Recommendations www.enisa.europa.eu

  3. Background information ENISA CERT relations/operational security – focus in 2012 - studies • Definition of baseline capabilities of national and governmental CERTs • Training and exercises • Cybercrime prevention • Information sharing and alerting • Early warning 3 www.enisa.europa.eu

  4. Some Facts Project ran for ½ year Study published in December 2011 … 133 pages to read, but… Inventory of services/tools and mechanisms (pages 27-98) 16 shortcomings (pages 108-127) 35 recommendations (pages 128-132) Where to get the study: http://www.enisa.europa.eu/activities/cert/support/proactive-detection 4 www.enisa.europa.eu

  5. Problem definition Reactive approach: wait for incoming incident reports (internal/external) vs. Proactive approach: actively look for incidents taking place • Subscribe to external services informing about problems • Deploy internal monitoring tools/mechanisms Provide a sort of 'early warning' service from the constituent's (client's) perspective 5 www.enisa.europa.eu
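To make the proactive model above concrete, the sketch below shows one minimal way a team might consume such an external service: it pulls a plain-text feed of reported IP addresses and flags entries that fall inside the constituency's netblocks. The feed URL, the one-address-per-line format and the example CIDRs are illustrative assumptions, not taken from the study.

```python
#!/usr/bin/env python3
"""Minimal sketch of the proactive model: pull an external feed of reported
bad IPs and flag any that belong to our own netblocks. Feed URL, feed format
and CIDR list are hypothetical."""
import ipaddress
import urllib.request

FEED_URL = "https://feeds.example.org/compromised-ips.txt"   # hypothetical feed
CONSTITUENCY = [ipaddress.ip_network(n) for n in ("192.0.2.0/24", "198.51.100.0/24")]

def fetch_indicators(url: str) -> list[str]:
    """Download a plain-text feed, skipping comments and blank lines."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        lines = resp.read().decode("utf-8", errors="replace").splitlines()
    return [ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")]

def in_constituency(ip_str: str) -> bool:
    """True if the reported IP falls inside one of our netblocks."""
    try:
        ip = ipaddress.ip_address(ip_str)
    except ValueError:
        return False          # ignore malformed feed entries
    return any(ip in net for net in CONSTITUENCY)

if __name__ == "__main__":
    for ip in filter(in_constituency, fetch_indicators(FEED_URL)):
        print(f"early warning: {ip} reported in external feed")
```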

  6. Objectives Inventory of available methods, activities and information sources for proactive detection of network security incidents Identify good practice and recommended measures What needs to be done to improve and by whom 6 www.enisa.europa.eu

  7. Target audience National / governmental and other CERTs Abuse teams Data providers new or already established .... 7 www.enisa.europa.eu

  8. Approach Authors of the study – ENISA experts and CERT Polska / NASK (contractor) Main steps: Desktop research Survey among CERTs (>100 invitations, 45 responses) Analysis Expert group (active survey participants, other experts) • Meeting • Mailing list 8 www.enisa.europa.eu

  9. Survey Respondent profile [Pie chart: respondents by sector – government/public administration, ISP, academic, commercial company, financial, other (please specify); shares shown: 33%, 32%, 14%, 12%, 7%, 2%] 9 www.enisa.europa.eu

  10. Survey How do you feel about the incident information sources you currently have? We are fully satisfied with the information sources we currently have (4%). We would consider trying other sources to improve (47%). We feel an information deficit in general – we think there are significantly more incidents we do not know about (49%). We feel we have too many information sources (0%). 10 www.enisa.europa.eu

  11. Survey What would you like to improve? Number of responses: accuracy 15, coverage 13, timeliness 11, ease of use 6, resources required 5. 11 www.enisa.europa.eu

  12. Survey How do you obtain incident-related data about your constituency? [Bar chart: number of responses per data source type – internal monitoring, monitoring of external sources, monitoring of commercial sources, incoming incident reports (reactive), monitoring of closed sources – each rated as primary source, auxiliary source, or not used] 12 www.enisa.europa.eu

  13. Survey Resources available. Options: we do process all incoming information, but only higher-priority incidents are further handled – more input information would leave even more lower-priority incidents without attention; we can fully handle the current amount of incident information and could handle even more; we can fully handle the current amount of incident information, but would not be able to handle more; we cannot properly handle even the amount of incident-related information currently available. [Pie chart: shares shown: 45%, 31%, 13%, 11%] 13 www.enisa.europa.eu

  14. Survey External sources of information [Bar chart: per-source ratings, with the 'excellent' and 'good' rates given for timeliness, accuracy of results, ease of use, coverage and resources required summed up for each source] 14 www.enisa.europa.eu

  15. Survey CERTs that use the most popular source (Shadowserver): 40% 15 www.enisa.europa.eu

  16. Survey External sources of information Do you use any closed sources of information you cannot disclose? No 39% Yes 61% 16 www.enisa.europa.eu

  17. Survey Internal tools used [Bar chart: number of responses per tool; answer options: I use it; I don't use it but plan to use it in future; I used it in the past, but dropped it; I never used it and will not use it; no answer] 17 www.enisa.europa.eu

  18. Survey Do you collect data about other constituencies? Yes 45%, no 43%, cannot tell 7%, not sure 5%. 18 www.enisa.europa.eu

  19. Survey Do you share this information? No 48% Yes 52% 19 www.enisa.europa.eu

  20. Survey Under what rules do you share? Options: limited access, anyone (public), public subscription based, commercial, other. [Pie chart: shares shown: 56%, 18%, 15%, 7%, 4%] www.enisa.europa.eu

  21. Survey CERTs that collect info about others and share it: 23.4% 21 www.enisa.europa.eu

  22. Survey Do you correlate? No 20% Yes 80% 22 www.enisa.europa.eu

  23. Survey How do you correlate information from multiple sources? Ad hoc (56%), automated system (18%), both ad hoc and an automated system (26%). 23 www.enisa.europa.eu
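As an illustration of what "automated" correlation can mean in practice, the sketch below merges indicator lists from several feeds and keeps only those reported by more than one source, a common way to raise confidence before acting. The feed names, file layout and threshold are hypothetical; the study does not prescribe a particular implementation.

```python
"""Toy correlation step: combine per-feed indicator files and report
indicators confirmed by at least two independent sources. Feed names,
one-indicator-per-line layout and the threshold are assumptions."""
from collections import defaultdict
from pathlib import Path

FEED_FILES = {                      # hypothetical local copies of feed data
    "feed_a": Path("feed_a.txt"),
    "feed_b": Path("feed_b.txt"),
    "feed_c": Path("feed_c.txt"),
}

def load_feed(path: Path) -> set[str]:
    """Read one indicator (IP, domain, URL) per line, ignoring comments."""
    if not path.exists():
        return set()
    return {
        line.strip()
        for line in path.read_text(encoding="utf-8").splitlines()
        if line.strip() and not line.startswith("#")
    }

def correlate(feeds: dict[str, Path], min_sources: int = 2) -> dict[str, list[str]]:
    """Map each indicator to the feeds that reported it; keep multi-source hits."""
    seen: dict[str, list[str]] = defaultdict(list)
    for name, path in feeds.items():
        for indicator in load_feed(path):
            seen[indicator].append(name)
    return {ind: srcs for ind, srcs in seen.items() if len(srcs) >= min_sources}

if __name__ == "__main__":
    for indicator, sources in sorted(correlate(FEED_FILES).items()):
        print(f"{indicator}: confirmed by {', '.join(sources)}")
```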

  24. Survey CERTs that automate the correlation process in any way: 35.2% 24 www.enisa.europa.eu

  25. Analysis Evaluation criteria: timeliness, accuracy, ease of use, coverage, resources required, scalability (for internal tools), extensibility (for internal tools). A significant degree of subjectivity is present (expert judgement, survey responses, workgroup expert opinions). 25 www.enisa.europa.eu

  26. Summary of external sources
  Service | Timeliness | Accuracy of results | Ease of use | Coverage | Resources required
  DNS-BH Malware Domain Blocklist | Fair | Good | Excellent | Excellent | Excellent
  MalwareURL | Good | Good | Excellent | Excellent | Excellent
  DSHIELD | Excellent | Fair | Good | Excellent | Excellent
  Google Safe Browsing Alerts | Good | Fair | Good | Excellent | Good
  HoneySpider Network (as a service) | Excellent | Fair | Good | Fair | Excellent
  AusCERT | Good | Good | Good | Good | Excellent
  Cert.br data feed | Good | Good | Fair | Good | Good
  FIRE | Good | Good | Fair | Good | Good
  Team Cymru - TC Console | Excellent | Good | Good | Excellent | Excellent
  EXPOSURE | Good | Good | Excellent | Good | Excellent
  AmaDa | Excellent | Good | Excellent | Fair | Excellent
  Malware Domain List | Excellent | Good | Excellent | Good | Excellent
  Zeus/SpyEye Tracker | Good | Excellent | Excellent | Fair/Good | Excellent
  The Spamhaus Project Datafeed | Excellent | Good | Good | Excellent | Good
  Shadowserver Foundation | Good | Good | Excellent | Good/Excellent | Excellent
  SGNET | Good | Excellent | Good | Fair | Good
  ARAKIS | Good | Good | Excellent | Good | Excellent
  Malc0de database | Excellent | Good | Excellent | N/A | Excellent
  ParetoLogic URL Clearing House | Excellent | Good | Good | N/A | Good
  SpamCop | Excellent | Good | Good | Excellent | Good
  Arbor ATLAS | Good | Good | Excellent | Excellent | Excellent
  CBL (Composite Blocking List) | Excellent | Excellent | Fair/Good | Excellent | Good
  Cert.br Spampots | Excellent | N/A | Good | Fair | Fair
  Team Cymru's CAP | Good | Excellent | Excellent | Excellent | Good
  Project Honeypot | Good | Good | Excellent | Excellent | Good/Excellent
  Malware Threat Center | Good | Fair | Excellent | Fair | Good
  Smart Network Data Services | Good | Good | Excellent | Excellent | Good
  Malware Patrol | Excellent | N/A | Excellent | N/A | Excellent
  Zone-H | Excellent | Excellent | Good | Good | Fair/Excellent
  Cisco IronPort SenderBase | Excellent | Good/Excellent | Excellent | Excellent | Good
  26 www.enisa.europa.eu

  27. Top 5 recommended external sources • Shadowserver Foundation (http://www.shadowserver.org) • Zeus/SpyEye Tracker (https://spyeyetracker.abuse.ch, https://zeustracker.abuse.ch) • Google Safe Browsing Alerts (http://safebrowsingalerts.googlelabs.com) • Malware Domain List (http://www.malwaredomainlist.com/) • Team Cymru's CSIRT Assistance Program (http://www.team-cymru.org/Services/CAP/) 27 www.enisa.europa.eu

  28. Summary of internal tools
  Category | Timeliness | Accuracy of results | Ease of use | Coverage | Resources required | Scalability | Extensibility
  Client honeypot | Excellent | Fair-Excellent | Fair/Good | Fair/Good | Good | Excellent | Fair
  Server honeypot | Excellent | Good | Good | Good | Good | Good | Good
  Firewalls | Excellent | Fair | Good | Fair/Good | Good | Excellent | Fair-Excellent
  IDS/IPS | Excellent | Good | Good | Fair-Excellent | Fair/Good | Good | Fair-Excellent
  Netflow | Excellent | Good | Fair | Fair/Good | Fair | Good/Excellent | Good
  Sandboxes | Excellent | Fair/Good | Fair | N/A | Fair | Fair-Excellent | Fair-Excellent
  Darknet | Excellent | Good | Fair | Fair-Excellent | Fair | Good | Fair
  Passive DNS monitoring | Excellent | Good/Excellent | Good | Fair/Good | Good | Good/Excellent | Fair
  Spamtrap | Excellent | Fair/Good | Fair | Fair | Good | Good | Good
  Web Application Firewalls | Excellent | Good/Excellent | Fair | Fair | Fair | Good | Good
  App logs | - | - | - | - | - | - | -
  Antivirus | Excellent | Good | Good | Fair-Excellent | Good | Good | N/A
  28 www.enisa.europa.eu
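To give a flavour of the internal-tool categories above, here is a minimal sketch in the spirit of netflow-based monitoring: it reads flow records from a CSV export and flags sources that contact an unusually large number of distinct destinations, a crude horizontal-scan heuristic. The file name, column names and threshold are assumptions for illustration, not part of the study.

```python
"""Crude netflow-style heuristic: flag source IPs that touch many distinct
(dst_ip, dst_port) pairs within the exported window (possible scanning).
The CSV layout (columns src_ip, dst_ip, dst_port) and the threshold of 200
distinct destinations are illustrative assumptions."""
import csv
from collections import defaultdict

FLOWS_CSV = "flows.csv"      # hypothetical flow export
THRESHOLD = 200              # distinct (dst_ip, dst_port) pairs per source

def suspected_scanners(path: str, threshold: int = THRESHOLD) -> dict[str, int]:
    """Count distinct destinations per source and return those above threshold."""
    targets = defaultdict(set)          # src_ip -> set of (dst_ip, dst_port)
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            targets[row["src_ip"]].add((row["dst_ip"], row["dst_port"]))
    return {src: len(dsts) for src, dsts in targets.items() if len(dsts) >= threshold}

if __name__ == "__main__":
    for src, count in sorted(suspected_scanners(FLOWS_CSV).items(), key=lambda x: -x[1]):
        print(f"possible scanner {src}: {count} distinct destinations")
```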
