1 / 39
Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks
David L. Dill, Stanford University
2 / 39
Based on: “Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks”
Guy Katz, Clark Barrett, David Dill, Kyle Julian, and Mykel Kochenderfer
Available on arXiv: 1702.01135
Accepted at the CAV conference in July.
3 / 39
An emerging solution to a wide variety of problems
A “black box” solution
Goal:
Prove that certain properties of a neural network are guaranteed to hold
4 / 39
ACAS Xu: a new standard being developed for collision-avoidance advisories for unmanned aircraft
Produces advisories:
1. Strong left (SL)
2. Weak left (WL)
3. Weak right (WR)
4. Strong right (SR)
5. Clear of conflict (COC)
Implementation: an array of 45 neural networks
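As a rough sketch of how an individual network is used (an illustration only, not the authors' code): each of the 45 networks maps an encounter state to one score per advisory, and an advisory is selected from those scores. The minimal-score selection rule and the example score vector below are assumptions made for this sketch.

```python
from enum import Enum

class Advisory(Enum):
    SL = 0   # strong left
    WL = 1   # weak left
    WR = 2   # weak right
    SR = 3   # strong right
    COC = 4  # clear of conflict

def select_advisory(scores):
    # Pick the advisory with the minimal network output score
    # (a cost-style convention; an assumption in this sketch).
    best_index = min(range(len(scores)), key=lambda i: scores[i])
    return Advisory(best_index)

# Hypothetical score vector from one of the 45 networks:
print(select_advisory([1.5, 0.9, 2.1, 3.0, 0.2]))  # Advisory.COC
```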
5 / 39
Neural networks generalize to previously-unseen inputs
Goal: show that certain properties hold for all inputs
Crucial for increasing the level of confidence in the network
6 / 39
Can we manually prove properties?
Networks are too big to reason about manually
7 / 39
A possible answer: automatic verification
… but existing verification tools don’t scale up!
A difficult (NP-complete) problem
8 / 39
The difficulty: Rectified Linear Units (ReLUs)
ReLU(y) = max(0, y)
[Figure: a small example network; its two hidden nodes compute the weighted sums]
1⋅1 + (−2)⋅3 + 0⋅(−2) = −5
2⋅1 + 0⋅3 + (−1)⋅(−2) = 4
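To make the computation above concrete, here is a minimal plain-Python sketch (an illustration, not code from the paper) that reproduces the two weighted sums and then applies the ReLU to each:

```python
def relu(y):
    # ReLU(y) = max(0, y)
    return max(0.0, y)

def weighted_sum(weights, inputs):
    return sum(w * x for w, x in zip(weights, inputs))

inputs = [1.0, 3.0, -2.0]      # input values from the slide's example
w1 = [1.0, -2.0, 0.0]          # weights into the first hidden node
w2 = [2.0, 0.0, -1.0]          # weights into the second hidden node

s1 = weighted_sum(w1, inputs)  # 1*1 + (-2)*3 + 0*(-2) = -5
s2 = weighted_sum(w2, inputs)  # 2*1 + 0*3 + (-1)*(-2) = 4
print(relu(s1), relu(s2))      # 0.0 4.0
```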
9 / 39
ReLUs require case-splitting:
A property holds iff it holds for all sub-problems
n ReLUs imply 2^n combinations
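A naive approach makes the blow-up explicit: fix each ReLU to one of its two linear phases (inactive: output 0; active: output equals input) and solve each resulting linear sub-problem. The helper below is hypothetical, written only to illustrate the counting:

```python
from itertools import product

def relu_phase_assignments(n):
    # Yield every way of fixing each of n ReLUs to "inactive"
    # (output 0) or "active" (output equals input). Each choice
    # reduces the network to a purely linear problem.
    yield from product(("inactive", "active"), repeat=n)

print(sum(1 for _ in relu_phase_assignments(3)))  # 8 = 2**3 sub-problems
print(2 ** 300)  # a 300-ReLU network: far too many sub-problems to enumerate
```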
10 / 39
Reluplex: a technique for solving linear programs with ReLUs
Extends the simplex method
Does not require case splitting in advance
Scales to the ACAS Xu networks
11 / 39
Reluplex
Evaluation on the ACAS Xu Networks
Conclusion
13 / 39
Split each ReLU into:
1. A weighted-sum variable y^x
2. An activation variable y^b

Allow ReLU violations (y^b ≠ ReLU(y^x)); fix them gradually during simplex
Split cases (y^x < 0, y^x ≥ 0) only as a last resort
[Figure: example network with input y1, ReLU pairs (y2^x, y2^b) and (y3^x, y3^b), output y4, and edge weights 1, −1, 1, 1]
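As a toy illustration of the pairing idea (a conceptual sketch, not the Reluplex algorithm itself), the class below keeps a weighted-sum variable and an activation variable for one ReLU, detects a violation of y^b = ReLU(y^x), and repairs it by updating one side; the names follow the slide's notation:

```python
def relu(v):
    return max(0.0, v)

class ReluPair:
    # One ReLU split into a weighted-sum variable (yx) and an
    # activation variable (yb). The constraint yb == relu(yx) may be
    # temporarily violated during the search and is repaired lazily.
    def __init__(self, yx, yb):
        self.yx, self.yb = yx, yb

    def violated(self):
        return self.yb != relu(self.yx)

    def repair(self):
        # One possible fix: update the activation side to match the
        # weighted sum. (Updating yx to match yb is the other option.)
        self.yb = relu(self.yx)

pair = ReluPair(yx=-5.0, yb=4.0)  # assignment left by simplex pivots
print(pair.violated())             # True: 4.0 != max(0, -5.0)
pair.repair()
print(pair.yb, pair.violated())    # 0.0 False
```

In Reluplex, such repairs are interleaved with ordinary simplex pivots, and only a ReLU that keeps getting repaired is eventually split.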
14 / 39
Can we always find a solution using pivots and updates?
No: sometimes we get into a loop
Then we have to split on ReLU variables
Use a Satisfiability Modulo Theories (SMT) framework to manage the decision tree resulting from splitting
An SMT solver combines propositional case analysis (i.e. splitting) with reasoning in a theory (in this case, linear arithmetic with ReLUs)
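A minimal sketch of what one such split produces, with constraints represented as plain strings standing in for the solver's tableau (the helper and constraint syntax are hypothetical):

```python
def split_on_relu(constraints, yx, yb):
    # Replace one problematic ReLU by its two linear phases, yielding
    # two sub-problems. Constraints are plain strings in this sketch;
    # a real solver would assert them in its simplex tableau.
    inactive = constraints + [f"{yx} <= 0", f"{yb} = 0"]
    active = constraints + [f"{yx} >= 0", f"{yb} = {yx}"]
    return inactive, active

base = ["y1 >= 0.5", "y4 <= 2.0"]
inactive_case, active_case = split_on_relu(base, "y2^x", "y2^b")
print(inactive_case)  # [..., 'y2^x <= 0', 'y2^b = 0']
print(active_case)    # [..., 'y2^x >= 0', 'y2^b = y2^x']
```

The original query is satisfiable iff at least one of the two sub-problems is; the SMT framework manages the resulting decision tree.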
15 / 39
Reluplex: a method for solving linear programs with ReLUs
Sound and terminating
Efficient
16 / 39
Reluplex
Evaluation on the ACAS Xu Networks
Conclusion
17 / 39
1. No unnecessary turning advisories
2. Alerting regions are consistent
3. Strong alerts do not appear when vertical separation is large
18 / 39
Satisfiable properties: z ≥ d for output node z
[Table: per-property solver runtimes on satisfiable properties χ2–χ9, with a row per solver (CVC4 among them); most entries are not recoverable from the extraction]
19 / 39
“If the intruder is near and approaching from the left, the network advises strong right”
Proof time: 01:29:29 (using 4 machines)
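A property like this is typically checked by refutation: constrain the inputs to the region of interest, assert the negation of the desired conclusion, and ask the solver for satisfiability; UNSAT proves the property. The bounds and constraint strings below are hypothetical placeholders, not the paper's actual encoding:

```python
def refutation_query(input_region, negated_conclusion):
    # The property "inputs in region => conclusion" holds iff
    # (region AND NOT conclusion) is unsatisfiable.
    return input_region + negated_conclusion

query = refutation_query(
    input_region=[
        "distance <= NEAR_BOUND",         # hypothetical 'intruder is near'
        "LEFT_MIN <= angle <= LEFT_MAX",  # hypothetical 'from the left'
    ],
    negated_conclusion=[
        "NOT (strong right is the selected advisory)",  # placeholder
    ],
)
print(query)  # hand this conjunction to the solver; UNSAT = property proved
```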
20 / 39
“For a distant intruder, the network advises COC”
Proof time: 01:25:15 (using 4 machines)
21 / 39
“If vertical separation is large and the previous advisory is weak left, the network advises COC or weak left”
Time to find a counterexample: 11:08:22 (using 1 machine)
22 / 39
Reluplex
Evaluation on the ACAS Xu Networks
Conclusion
23 / 39
Reluplex: a technique for solving linear programs with ReLUs
Neural networks and properties can be encoded as Reluplex inputs
Scalable
Sound and terminating
24 / 39
More complete verification of ACAS Xu
Improving performance (e.g. sum of infeasibilities)
Soundness guarantees
More expressiveness
25 / 39
Available on arXiv: 1702.01135