1. Measuring security and cybercrime
Daniel R. Thomas
Cambridge Cybercrime Centre, Department of Computer Science and Technology, University of Cambridge, UK
SecHuman 2018
GPG: 5017 A1EC 0B29 08E3 CF64 7CCD 5514 35D5 D749 33D9
Firstname.Surname@cl.cam.ac.uk

2. Format
1. Group warm up (5 minutes)
2. Short lecture (35 minutes)
3. Experimental design and review (50 minutes)
3.1 Designing an experiment to measure security or cybercrime (30 minutes)
3.2 Plenary feedback (20 minutes)

3. What is security and how do we measure it?
▶ Discuss in groups for 2 minutes
▶ Then we will listen to some of the ideas

4. Measuring security and cybercrime is important
▶ Is security getting better or worse?
▶ Did this intervention work?
▶ Is there a difference in security between these products?

5. Two examples of security measurement research
Drawing out the principles, insights, and mistakes as we go along.
▶ Measuring security of Android
▶ Measuring DDoS attacks (cybercrime)

6. Security metrics for the Android ecosystem 1
Daniel R. Thomas, Alastair R. Beresford, Andrew Rice, Daniel Wagner
https://androidvulnerabilities.org/
1 Daniel R. Thomas, Alastair R. Beresford, and Andrew Rice. 2015. Security metrics for the Android ecosystem. In ACM CCS Workshop on Security and Privacy in Smartphones and Mobile Devices (SPSM). ACM, Denver, Colorado, USA, (Oct. 2015), 87–98. ISBN: 978-1-4503-3819-6.

7. Smartphones contain many apps written by a spectrum of developers
How “secure” is a smartphone?

8. Root/kernel exploits are harmful
▶ Root exploits break the permission model
▶ Cannot recover to a safe state
▶ In 2012, 37% of Android malware used root exploits
▶ We’re interested in critical vulnerabilities, exploitable by code running on the device

9. Hypothesis: devices are vulnerable because they are not updated
▶ Anecdotal evidence was that updates rarely happen
▶ Android phones are sold on 1–2 year contracts

10. No central database of Android vulnerabilities: so we built one

11. Device Analyzer gathers statistics on mobile phone usage
(2015 numbers)
▶ Deployed May ’11
▶ 30 000 contributors
▶ 4 000 phone years
▶ 180 billion records
▶ 10 TB of data
▶ 1089 7-day active contributors

12. Device Analyzer gathers a wide variety of data
Including system statistics:
▶ OS version and build number
▶ Manufacturer and device model
▶ Network operators

13. Is the ecosystem getting updated?

14. Google data: device API levels
[Stacked area chart: proportion of devices running each API level (up to 23) over time, Oct 2012 – Oct 2015.]

15. Are devices getting updated?

16. LG devices by OS version

17. Connecting the two data sets: assume OS version → vulnerability
▶ We have an OS version from Device Analyzer
▶ We have vulnerability data with OS versions
▶ Match on OS version and build number and assign:
▶ Vulnerable
▶ Maybe invulnerable
▶ Invulnerable (not known vulnerable)
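The assignment step above can be sketched as a simple lookup. This is a toy illustration with hypothetical build data (the real vulnerability data came from androidvulnerabilities.org), not the project's analysis code:

```python
# Hypothetical vulnerability data: OS version -> set of affected builds.
VULNERABLE_BUILDS = {
    "4.0.4": {"IMM76D", "IMM76I"},
    "4.1.1": {"JRO03C"},
}

def classify(os_version: str, build: str) -> str:
    """Toy three-way assignment: 'vulnerable' if the build is known
    affected, 'maybe invulnerable' if the OS version appears in the
    data but that build does not, else 'invulnerable'."""
    builds = VULNERABLE_BUILDS.get(os_version)
    if builds is None:
        return "invulnerable (not known vulnerable)"
    if build in builds:
        return "vulnerable"
    return "maybe invulnerable"

print(classify("4.0.4", "IMM76D"))  # vulnerable
print(classify("4.0.4", "IMM76K"))  # maybe invulnerable
print(classify("4.4.4", "KTU84P"))  # invulnerable (not known vulnerable)
```

The three-way split matters because device build strings are noisy: when only the OS version matches, the device cannot be confidently labelled either way.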

18. Vulnerability varies over time
[Stacked area chart, Oct 2012 – Oct 2015: on average 70% of devices vulnerable, 11% maybe invulnerable, 19% invulnerable; vulnerability dates marked: zergRush, APK duplicate file, Fake ID, last AVO entry.]

19. The FUM metric measures the security of Android devices
f: free from (known) vulnerabilities
u: updated to the latest version
m: mean unfixed vulnerabilities
FUM = 4·f + 3·u + 3 · 2/(1 + e^m)
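The score can be computed directly from the formula on this slide; a minimal sketch, where f and u are proportions in [0, 1] and the example inputs are hypothetical rather than measured values:

```python
import math

def fum_score(f: float, u: float, m: float) -> float:
    """FUM score out of 10: f = proportion free from known
    vulnerabilities, u = proportion updated to the latest version,
    m = mean number of unfixed vulnerabilities per device."""
    return 4 * f + 3 * u + 3 * (2 / (1 + math.exp(m)))

# A perfect device line scores 10:
print(fum_score(1.0, 1.0, 0.0))  # 10.0

# A hypothetical device line: 30% free from known vulnerabilities,
# 20% on the latest version, 1.5 unfixed vulnerabilities on average.
print(round(fum_score(0.3, 0.2, 1.5), 2))
```

The logistic term 2/(1 + e^m) equals 1 when m = 0 and decays towards 0 as unfixed vulnerabilities accumulate, so the three components contribute at most 4 + 3 + 3 points.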

20. Galaxy Nexus
[Stacked area chart, Aug 2011 – Aug 2015: proportion of Galaxy Nexus devices on each OS version and build number (2.3.3 GRI40 through 4.4.4 KTU84Q, plus “other”).]

21. Lack of security updates
[Two charts, Aug 2011 – Feb 2015: HTC Desire HD A9191 devices stay on 2.3.3 GRI40 / 2.3.5 GRJ90; Symphony W68 devices stay on 4.2.2 JDQ39.]

22. Comparing manufacturers
[Bar chart of FUM scores (0–7, broken into f, u and m components) for LG, Samsung, HTC, Alps, Walton, Nexus devices, Motorola, Sony, Asus and Symphony.]

23. Why is fixing vulnerabilities hard? The software ecosystem is complex
▶ Division of labour:
▶ Open source software
▶ Core OS production
▶ Driver writer
▶ Device manufacturer
▶ Retailer
▶ Customer
▶ Apple and Google have different models
▶ Hypothesis: Apple’s model is more secure

24. Google to the rescue
Recommended:
▶ Play Store
▶ Verify apps
▶ Android Security Patch Level
▶ Later: Android Enterprise

25. What happened next?
▶ Plenty of press coverage
▶ Contacts with Google, manufacturers, UK Home Office
▶ FTC cites the work
▶ Google uses graphs to pressure manufacturers to improve update provision
▶ We move on: no further collection of vulnerability data, no updated scores

26. 1000 days of UDP amplification DDoS attacks 2
Daniel R. Thomas, Richard Clayton, Alastair R. Beresford
2 Daniel R. Thomas, Richard Clayton, and Alastair R. Beresford. 2017. 1000 days of UDP amplification DDoS attacks. In APWG Symposium on Electronic Crime Research (eCrime). IEEE, (Apr. 2017).

27. UDP scanning
[Diagram: the attacker (192.168.25.4) sends a small query, “big.gov IN TXT”, to a reflector (8.8.8.8) (1); the reflector returns an extremely long response to the attacker (2).]

28. UDP reflection DDoS attacks
[Diagram: the attacker (192.168.25.4) sends the same small “big.gov IN TXT” query to the reflector (8.8.8.8), but with the source address spoofed as the victim (172.16.6.2); the reflector sends the extremely long response to the victim.]
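The attack works because the reflector's response is far larger than the spoofed request, so the attacker's bandwidth is multiplied. The bandwidth amplification factor is simply the ratio of the two sizes; a toy calculation with made-up packet sizes (not measured values from the paper):

```python
def amplification_factor(request_bytes: int, response_bytes: int) -> float:
    """Bandwidth amplification factor: bytes the reflector sends to the
    victim per byte the attacker spends on the spoofed request."""
    return response_bytes / request_bytes

# Hypothetical DNS TXT lookup: a 64-byte query drawing a 3200-byte
# answer lets the attacker aim 50x their own bandwidth at the victim.
print(amplification_factor(64, 3200))  # 50.0
```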

29. We run lots of UDP honeypots
QOTD, CHARGEN, DNS, NTP, SSDP, SQLMon, Portmap, mDNS, LDAP
▶ Median 65 nodes since 2014
▶ Hopscotch emulates abused protocols
▶ Sniffer records all resulting UDP traffic
▶ (Try to) only reply to black hat scanners
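The core loop of such a honeypot can be sketched in a few lines. This is a minimal illustration, not the actual Hopscotch code: record every incoming datagram, then answer with a fixed QOTD-style response; a real deployment would first check the source address against known white hat scanners and stay silent for them.

```python
import socket

def serve_one(sock: socket.socket, log: list) -> None:
    """Handle one datagram: log it, then send a QOTD-style reply."""
    data, addr = sock.recvfrom(4096)
    log.append((addr, data))  # keep the raw packet for later analysis
    sock.sendto(b"An apple a day keeps the doctor away.\r\n", addr)

# Loopback demonstration: one "scanner" packet, one logged reply.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))  # ephemeral port for the demo
server.settimeout(5)
addr = server.getsockname()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(5)
client.sendto(b"\n", addr)  # any datagram triggers a QOTD reply

log = []
serve_one(server, log)
print(client.recvfrom(4096)[0].decode().strip())
```

Logging before replying matters: the recorded packets are the measurement data, whether or not the scanner ever uses the honeypot in an attack.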

30. Total attacks estimated using capture-recapture
[Venn diagram: honeypot subset A observes 160 attacks, subset B observes 200; 80 attacks are seen by both (80 only in A, 80 in both, 120 only in B). Estimated population: 400 ± 62.]
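The estimator behind these numbers is the classic Lincoln-Petersen capture-recapture formula; a minimal sketch (the ±62 interval on the slide comes from a variance estimate not reproduced here):

```python
def lincoln_petersen(n_a: int, n_b: int, n_both: int) -> float:
    """Lincoln-Petersen estimate of total population size from two
    independent samples: N = (A * B) / overlap."""
    return n_a * n_b / n_both

# Figures from the slide: subset A of the honeypots sees 160 attacks,
# subset B sees 200, and 80 attacks are seen by both subsets.
print(lincoln_petersen(160, 200, 80))  # 400.0
```

Intuitively, the overlap measures coverage: if 80 of subset A's 160 attacks also hit subset B, then B saw roughly half of everything, so the total is about twice B's count.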

31. [Chart: estimated number of attacks per day (log scale) for CHARGEN, DNS, NTP and SSDP, July 2014 – July 2017.]

32. [Chart: proportion of all attacks that we observe, by protocol (CHARGEN, DNS, NTP, SSDP), July 2014 – July 2017.]

33. [Chart: number of honeypots in operation (#A and #A+B), July 2014 – July 2017.]

34. [Chart: proportion of all attacks that we observe by protocol, overlaid with the number of honeypots in operation (#A, #A+B), July 2014 – July 2017.]

35. This was ethical
▶ We reduce harm by absorbing attack traffic
▶ We don’t reply to white hat scanners (no timewasting)
▶ We used leaked data for validation; this was necessary and did not increase harm
▶ Further discussion of the ethics of using leaked data for research tomorrow

36. This is a solvable problem
▶ BCP38/SAVE
▶ Follow the money
▶ Enforce the law
▶ Warn customers it is illegal

37. Experimental design [30 minutes]
How would you measure the relative security of different:
MH: Elections
RE: Smartphones
OB: Online payment providers
HER: Offices
GE: IoT manufacturers
E: Cycle lock manufacturers
DU: Operating systems
DO: Residential ISPs
BOT: CPU vendors
BO: Banks
What data would you need to collect? How would you collect it? Would it be possible to cheat your measurement without actually improving security?

38. Plenary discussion [20 minutes]
Feedback from each group on their experimental design.

39. Thank you! Questions?
Daniel R. Thomas
Daniel.Thomas@cl.cam.ac.uk
@DanielRThomas24
https://www.cl.cam.ac.uk/~drt24/
GPG: 5017 A1EC 0B29 08E3 CF64 7CCD 5514 35D5 D749 33D9
Daniel Thomas is supported by the EPSRC [grant number EP/M020320/1].
