Grünbacher, S., Hasani, R., Lechner, M., Cyranka, J., Smolka, S. A., & Grosu, R. (2020). On The Verification of Neural ODEs with Stochastic Guarantees. arXiv. http://hdl.handle.net/20.500.12708/141429
E191-01 - Research Unit Cyber-Physical Systems
Date (published): 2020
Number of Pages: 12
Preprint Server: arXiv
Abstract:
We show that Neural ODEs, an emerging class of time-continuous neural networks, can be verified by solving a set of global-optimization problems. For this purpose, we introduce Stochastic Lagrangian Reachability (SLR), an abstraction-based technique for constructing a tight Reachtube (an over-approximation of the set of reachable states over a given time horizon), and provide stochastic guarantees in the form of confidence intervals for the Reachtube bounds. SLR inherently avoids the infamous wrapping effect (accumulation of over-approximation errors) by performing local optimization steps to expand safe regions instead of repeatedly forward-propagating them, as is done by deterministic reachability methods. To enable fast local optimizations, we introduce a novel forward-mode adjoint sensitivity method to compute gradients without the need for backpropagation. Finally, we establish asymptotic and non-asymptotic convergence rates for SLR.
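The abstract mentions a forward-mode sensitivity method that computes gradients without backpropagation. As a rough illustration of the general forward-sensitivity idea only (not the paper's actual method), the sketch below augments a toy scalar ODE dx/dt = tanh(w*x) with its variational equation ds/dt = sech^2(w*x) * (x + w*s), where s = dx/dw, and integrates both forward in time with explicit Euler; the function name, step size, and test values are made up for this example.

```python
import math

def simulate_with_sensitivity(w, x0, t_end, dt=1e-3):
    """Integrate dx/dt = tanh(w * x) together with its forward
    sensitivity s = dx/dw, which obeys the variational ODE
    ds/dt = sech^2(w * x) * (x + w * s), using explicit Euler.
    Returns (x(t_end), dx(t_end)/dw)."""
    x, s = x0, 0.0
    for _ in range(int(round(t_end / dt))):
        f = math.tanh(w * x)
        df = 1.0 - f * f                 # sech^2(w * x)
        # Tuple assignment evaluates both right-hand sides with the
        # old (x, s), i.e. one joint explicit Euler step.
        x, s = x + dt * f, s + dt * df * (x + w * s)
    return x, s

# Sanity check: the propagated sensitivity should agree with a
# central finite-difference estimate of dx/dw.
eps = 1e-6
x1, s = simulate_with_sensitivity(0.5, 1.0, 1.0)
fd = (simulate_with_sensitivity(0.5 + eps, 1.0, 1.0)[0]
      - simulate_with_sensitivity(0.5 - eps, 1.0, 1.0)[0]) / (2 * eps)
```

Because the variational Euler step is exactly the derivative of the discrete Euler update, the propagated sensitivity matches the finite-difference estimate up to floating-point error, with no reverse pass over the trajectory.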