SDN / NFV panel @ SBRC
Sébastien Tandel sta@hpe.com slideshare.net/standel
May 2017
Sébastien Tandel
– Working within the HPE Aruba CTO office as a Principal Architect
– Technologist with sound business knowledge
– Software engineer with sound knowledge of hardware
– Product-focused, with sound experience across all innovation waves (research & advanced development)
– Led me to drive many programs, from Software-Defined Infrastructure & Intelligent Edge to Security Analytics
– Contributions to several aspects of SDN / NFV since 2010
– First Software-Defined Lync demo @ ONS’13
– First Software-Defined Security demo (IPS coupled to security analytics) @ ONS’14
– First HW-accelerated SFC (MAC Chaining) demo, including legacy physical SFCs, @ SIGCOMM’16
– Distributed Software-Defined Load Balancer; IoT Universal Profiler (identification & anomaly behavior detection)

Views and opinions expressed are my own and do not necessarily reflect the views or opinions of my employer.
SDN & NFV Markets by 2020
[Chart: SDN data-center and SDN/NFV service-provider market sizes by 2020, compared with Facebook’s 2015 net income]
Gartner Hype Cycle => customer-focused and realistic!
Physical IPS appliance: the 10,000-foot hardware architecture
[Diagram: Intrusion Prevention System internals]
– A pre-filtering stage, driven by the attack-signature set, sees 100% of the traffic at line rate (fast path).
– Only ~10% of traffic goes on to deep packet inspection: a ‘hardware’ DPI stage (NPU, slower) and a ‘software’ DPI stage (CPU, slow path).
– Input is clean & malicious traffic; output is clean traffic only.
– 90% of traffic never reaches deep inspection.
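As a hedged illustration of the pre-filtering idea above, the sketch below classifies flows into a fast path and a deep-inspection path. The signature set and field names are hypothetical simplifications, not the appliance’s actual rules:

```python
# Illustrative pre-filter: coarse matches send the ~10% suspicious flows
# to deep packet inspection; everything else stays on the fast path.
# The "signature set" below is hypothetical, not a real IPS rule set.

SUSPICIOUS_PORTS = {23, 445, 3389}  # e.g. telnet, SMB, RDP

def pre_filter(flow):
    """Return 'deep_inspect' for suspicious flows, 'fast_path' otherwise."""
    if flow["dst_port"] in SUSPICIOUS_PORTS:
        return "deep_inspect"
    return "fast_path"

flows = [
    {"src_ip": "10.0.0.5", "dst_port": 443},
    {"src_ip": "10.0.0.6", "dst_port": 445},
]
print([pre_filter(f) for f in flows])  # ['fast_path', 'deep_inspect']
```

In the real appliance this match runs at line rate in hardware; the point of the sketch is only the split between coarse matching and deep inspection.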
A story of decomposition: pre-filtering as a micro-VNF
[Diagram: pre-filtering distributed to a switch]
– A switch performing pre-filtering sits between the clients and the destination, seeing 20 Gb/s of traffic.
– Regular traffic (~90%, the clean ~18 Gb/s) is forwarded directly to the destination.
– Only the suspicious ~2 Gb/s (less than 2 Gb/s in practice) is redirected, so the IPS processes roughly 10% of the traffic.
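The forwarding behavior of the pre-filtering switch can be sketched as an ordered match/action table, OpenFlow-style. The field names and action strings here are illustrative, not a real switch pipeline:

```python
# Ordered match/action rules for the pre-filtering switch: suspicious
# traffic is redirected to the IPS; the default rule forwards directly.
rules = [
    {"match": {"dst_port": 445}, "action": "redirect_to_ips"},       # suspicious
    {"match": {},                "action": "forward_to_destination"} # default
]

def apply_rules(pkt):
    # First matching rule wins, as in a flow table.
    for rule in rules:
        if all(pkt.get(k) == v for k, v in rule["match"].items()):
            return rule["action"]

print(apply_rules({"dst_port": 445}))  # redirect_to_ips
print(apply_rules({"dst_port": 80}))   # forward_to_destination
```

The default (empty-match) rule is what keeps the ~90% clean traffic off the IPS entirely.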
µVNF changes cost/performance
IPS                          | Max inspection throughput (Gb/s) | List price (US$) | US$ per Gb/s of inspection
TippingPoint S7500           | 20                               | 500,000          | 25,000
Snort (4 proc)               | 2                                | 10,000           | 5,000
pre-filtering µVNF + Snort   | 20                               | 10,000           | 500
✓ State-of-the-art product performance
✓ 50x cheaper than TippingPoint
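The last column follows directly from list price divided by inspected throughput; the quick check below recomputes it from the table’s own figures:

```python
# Cost per Gb/s of inspection = list price / max inspection throughput.
# Figures taken from the comparison table above.
products = {
    "TPT S7500":                  {"gbps": 20, "price": 500_000},
    "Snort (4 proc)":             {"gbps": 2,  "price": 10_000},
    "pre-filtering uVNF + Snort": {"gbps": 20, "price": 10_000},
}
for name, p in products.items():
    print(f"{name}: {p['price'] / p['gbps']:.0f} US$/Gb/s")
# TPT S7500: 25000 US$/Gb/s
# Snort (4 proc): 5000 US$/Gb/s
# pre-filtering uVNF + Snort: 500 US$/Gb/s
```

The 25,000 / 500 ratio is where the “50x cheaper than TippingPoint” claim comes from: the µVNF keeps Snort’s price while pre-filtering lets it protect the full 20 Gb/s.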
Confidential
Software-Defined Security: IPSaaS model
Creating a Security Control plane
[Diagram: IPSaaS]
– Devices 1–3 connect through switches performing pre-filtering; a service function chain (SFC) steers matching traffic to the IPS.
– An IPSaaS app on the SDN Controller dynamically sets up the redirection of suspicious traffic to the IPS.
– The controller distributes the attack-signature set to the pre-filtering switches.
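A minimal sketch of what such an IPSaaS controller app does, assuming a hypothetical northbound API (`Controller`, `install_rule`, and the action strings are stand-ins, not a real controller interface):

```python
class Controller:
    """Hypothetical SDN controller northbound API (illustrative only)."""
    def __init__(self):
        self.flow_rules = []

    def install_rule(self, switch, match, action):
        self.flow_rules.append((switch, match, action))

def deploy_ipsaas(ctrl, switches, signatures, ips_port):
    """Distribute the attack-signature set as pre-filtering rules on every
    edge switch and steer only matching (suspicious) traffic to the IPS."""
    for sw in switches:
        for sig in signatures:
            ctrl.install_rule(sw, sig, f"output:{ips_port}")  # into the SFC
        ctrl.install_rule(sw, {}, "normal")  # clean traffic: forward directly

ctrl = Controller()
deploy_ipsaas(ctrl, ["sw1", "sw2"], [{"tcp_dst": 445}], ips_port=7)
print(len(ctrl.flow_rules))  # 4
```

Two switches each get one signature rule plus one default rule, hence four rules; adding a signature to the set updates every pre-filtering switch in one pass.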
Software-Defined Security: Closing the loop
Making Sense of Security Events & Automating Remediation Actions
[Diagram: closed-loop Software-Defined Security]
– Devices 1–3 connect through switches; an SFC steers suspicious traffic to the IPS.
– The IPS and other sensors send security events to Security Analytics.
– Based on the analytics verdict, the Software-Defined Security SDN Controller takes automated remediation actions, e.g. redirecting a device’s traffic to another security sensor or blocking Device 2.
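The closed loop above can be sketched as a simple event-to-action mapping; the event fields and action names are illustrative, not a product API:

```python
def remediate(event):
    """Map an analytics verdict on a device to an automated action:
    block it, redirect it to another security sensor, or leave it alone."""
    if event["verdict"] == "malicious":
        return ("block", event["device"])
    if event["verdict"] == "suspicious":
        return ("redirect_to_sensor", event["device"])
    return ("allow", event["device"])

events = [
    {"device": "Device 2", "verdict": "malicious"},
    {"device": "Device 3", "verdict": "suspicious"},
]
for e in events:
    print(remediate(e))
# ('block', 'Device 2')
# ('redirect_to_sensor', 'Device 3')
```

In practice the actions would be pushed back to the SDN controller as flow rules, which is what closes the loop between analytics and enforcement.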
Summary
– Physical µVNF; Software-Defined Security; Big Data applied to security
– It’s a model that works for other use cases:
  Ø Cloud-first (SaaS) for fast TTM
  Ø Edge-computing model
  Ø µVNF for better scale and cost/performance
– Open APIs to avoid vendor lock-in & fragmentation
sta@hpe.com www.slideshare.net/standel
Current Infrastructure Security Architecture
[Diagram: siloed security sensors — IPS, DNS, DDoS, NBAD]
– Security boxes sit at fixed places and are manually connected.
– The edge and east-west traffic are weakly protected => BYOD, IoT exposure.
– Security boxes are unaware of each other: no collaboration => security gaps.
Software-Defined Security: a Security Control Plane to Rule Them All
[Diagram: security control plane]
– Ingests signals (e.g. DNS requests) and security events from the sensors.
– Coordinates the sensors with one another.
– Enforces policy in real time and near real time.
Software-Defined Security & Intelligent Edge
Key Takeaways
Product: Performance × Time-To-Market
[Chart: performance and time-to-market scores (0–100) for software vs. hardware, from very bad to very good]
– Better performance with hardware (improving scale & price).
– Longer to reach market with hardware (slower innovation).
Key Takeaways
– Software is an excellent starting point to test the market.
– How do you evolve? What may remain in software? Open interfaces?