Guy Katz, Aleksandar Zeljić, Clark Barrett, Haoze Wu, et al.
Hebrew University · Stanford University

Robustness properties: |
Fixing all ReLUs yields a linear problem
Exponentially many cases: 2^#ReLUs (see the case split below)
Simplex + Case-splitting
Re-implementation of Reluplex algorithm
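As a sketch of the case split behind these bullets (notation ours, not from the poster): a single ReLU constraint y = max(0, x) becomes linear once its phase is fixed, which is why a network with n ReLUs has up to 2^n phase combinations to consider.

```latex
% ReLU case split: the constraint is linear within each phase.
y = \max(0, x)
\;\Longleftrightarrow\;
\bigl(x \ge 0 \,\wedge\, y = x\bigr) \;\vee\; \bigl(x \le 0 \,\wedge\, y = 0\bigr)
```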
Fully-connected feed-forward (FCFF) and convolutional DNNs with piece-wise linear activation functions
Network formats: .nnet, .pb (TensorFlow)
Properties: .txt, via Python interface
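A minimal usage sketch of the Python interface. The module and call names (maraboupy, read_nnet, setLowerBound/setUpperBound, solve) reflect the public API as we understand it and should be treated as assumptions; the file name and numeric values are made up.

```python
# Sketch: encode a simple reachability/robustness query via the Python interface.
import numpy as np
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")      # hypothetical .nnet file
input_vars = np.array(network.inputVars).flatten()   # attribute shapes vary by version
output_vars = np.array(network.outputVars).flatten()

# Constrain each input to a small box around a reference point x0.
x0 = np.zeros(len(input_vars))                       # illustrative reference point
delta = 0.05
for var, val in zip(input_vars, x0):
    network.setLowerBound(var, val - delta)
    network.setUpperBound(var, val + delta)

# Ask whether the first output can exceed a threshold anywhere in the box:
# SAT yields a witness input, UNSAT proves it cannot.
network.setLowerBound(output_vars[0], 3.0)
result = network.solve()                             # return shape varies by version
print(result)
```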
Simplex - finds a satisfying assignment of linear constraints
SMT - manages splits of piece-wise linear constraints
Bound tightening - propagates bounds through the network
ReLU splits - case splits on ReLU phases that simplify the problem (see the sketch below)
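An illustrative, heavily simplified sketch of how these components could interact. This is not Marabou's actual code, only the Reluplex-style loop the bullets describe; all helper names (tighten_bounds, simplex_solve, ...) are hypothetical.

```python
# Simplified Reluplex-style loop: simplex handles the linear constraints,
# the SMT core manages case splits on the piece-wise linear (ReLU) constraints,
# and bound tightening prunes cases before they are split.
def verify(query):
    stack = [query]                      # each entry: linear constraints + unfixed ReLUs
    while stack:
        q = stack.pop()
        q = tighten_bounds(q)            # propagate bounds; may fix ReLU phases for free
        if q.is_trivially_unsat():
            continue                     # this case is closed
        assignment = simplex_solve(q.linear_constraints)
        if assignment is None:
            continue                     # linear part infeasible: case closed
        violated = q.first_violated_relu(assignment)
        if violated is None:
            return "SAT", assignment     # all ReLUs satisfied: witness found
        # Guess: split the violated ReLU into its active / inactive phases.
        stack.append(q.with_relu_active(violated))
        stack.append(q.with_relu_inactive(violated))
    return "UNSAT", None                 # every case closed
```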
Exponential search space
How can we focus the search?
Narrowing input ranges can fix hundreds of ReLUs (illustrated in the sketch below)
Which ranges do we choose? On which inputs?
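A small sketch of why narrowing input ranges helps: propagating interval bounds through one affine-plus-ReLU layer fixes every ReLU whose pre-activation interval does not straddle zero. The weights, biases, and ranges below are illustrative, not taken from the poster's benchmarks.

```python
import numpy as np

def fixed_relus(W, b, lo, hi):
    """Count ReLUs whose phase is fixed by interval bound propagation
    through a single layer y = ReLU(W x + b), with x in [lo, hi]."""
    W_pos, W_neg = np.maximum(W, 0), np.minimum(W, 0)
    pre_lo = W_pos @ lo + W_neg @ hi + b   # lower bound on pre-activation
    pre_hi = W_pos @ hi + W_neg @ lo + b   # upper bound on pre-activation
    return int(np.sum((pre_lo >= 0) | (pre_hi <= 0)))

rng = np.random.default_rng(0)
W = rng.standard_normal((200, 5))
b = rng.standard_normal(200)
wide = fixed_relus(W, b, np.full(5, -1.0), np.full(5, 1.0))
narrow = fixed_relus(W, b, np.full(5, 0.4), np.full(5, 0.6))
print(wide, narrow)   # the narrower input box typically fixes many more ReLUs
```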
[Figure: results per benchmark, reported as SAT / ALL UNSAT / TIMEOUT]
./marabou --dnc --num-workers=4 --online-split=4 --init-to=5s --to-factor=1.5

| Benchmark set | # | Marabou | Reluplex | ReluVal | Planet |
|---|---|---|---|---|---|
| ACAS Xu | 135 | ✔ | ✔ | ✔ | ✔ |
| TwinStream | 81 | ✔ | ✔ | | |
| CollisionAvoidance | 500 | ✔ | ✔ | | |
[Figure: scaling experiment]
Network-level reasoning - Symbolic bound tightening
Parallel Divide & Conquer algorithm
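A schematic sketch of the divide & conquer idea: split the input region into sub-queries, solve each with a small timeout, and re-split with a larger budget whenever a sub-query times out. The helper names (solve_with_timeout, split_widest_dimension) are hypothetical, and the default parameters merely mirror the flags in the command line above under the assumption that they control the initial timeout, timeout growth factor, and number of online splits; in Marabou the sub-queries would be handled by parallel workers.

```python
# Schematic divide-&-conquer driver (illustrative only, not Marabou's implementation).
def divide_and_conquer(region, initial_timeout=5.0, timeout_factor=1.5, online_split=4):
    work = [(region, initial_timeout)]
    while work:
        sub_region, timeout = work.pop()
        result = solve_with_timeout(sub_region, timeout)
        if result == "SAT":
            return "SAT"                 # counterexample found in this sub-region
        if result == "UNSAT":
            continue                     # sub-region closed
        # Timeout: split the region further and retry with a larger budget.
        for piece in split_widest_dimension(sub_region, pieces=online_split):
            work.append((piece, timeout * timeout_factor))
    return "UNSAT"                       # all sub-regions closed
```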