Stability of neural ODEs by a control over the expansivity of their flows
Arturo De Marinis,
Nicola Guglielmi,
Stefano Sicilia,
Francesco Tudisco
preprint (2025)
Abstract
We propose a method to enhance the stability of a neural ordinary
differential equation (neural ODE) by means of a control over the Lipschitz
constant $C$ of its flow. Since it is known that $C$ depends on the logarithmic
norm of the Jacobian matrix associated with the neural ODE, we tune this
quantity by perturbing the Jacobian matrix with a perturbation that is as
small as possible in the Frobenius norm. To do so, we introduce an
optimization problem and propose a nested two-level algorithm: the inner
level computes the optimal perturbation of a given Frobenius norm, while the
outer level tunes the perturbation amplitude.
We embed the proposed algorithm in the training of the neural ODE to improve
its stability. Numerical experiments on the MNIST and FashionMNIST datasets
show that an image classifier that includes a neural ODE in its architecture
and is trained with our strategy is more stable than the same classifier
trained in the standard way, and is therefore more robust and less vulnerable
to adversarial attacks.
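For intuition: if $\mu$ bounds the logarithmic 2-norm of the Jacobian along the trajectories, the flow satisfies $\|x(t) - y(t)\| \leq e^{\mu t} \|x(0) - y(0)\|$, so over a time horizon $T$ one can take $C = e^{\mu T}$, and pushing $\mu$ below a target value directly controls the expansivity. The snippet below is a minimal NumPy sketch of the two-level idea on a single constant Jacobian, not the paper's algorithm: the inner level runs projected gradient descent over perturbations of unit Frobenius norm, and the outer level bisects the amplitude eps until the logarithmic norm of the perturbed matrix meets the target. All names and parameters here are illustrative.

import numpy as np

def log_norm_2(A):
    # Logarithmic 2-norm of A: largest eigenvalue of its symmetric part.
    return np.linalg.eigvalsh(0.5 * (A + A.T))[-1]

def inner_direction(A, eps, n_steps=100, lr=0.05):
    # Inner level (sketch): minimize E -> log_norm_2(A + eps * E) over the
    # unit Frobenius sphere by projected gradient descent. The gradient with
    # respect to E is eps * u u^T, with u the leading eigenvector of the
    # symmetric part; we step against u u^T (eps only rescales the step).
    n = A.shape[0]
    E = -np.eye(n) / np.sqrt(n)                    # feasible start, ||E||_F = 1
    for _ in range(n_steps):
        S = 0.5 * ((A + eps * E) + (A + eps * E).T)
        u = np.linalg.eigh(S)[1][:, -1]            # leading eigenvector
        E = E - lr * np.outer(u, u)                # descend on the eigenvalue
        E = E / np.linalg.norm(E)                  # project back to the sphere
    return E

def stabilize(A, target=0.0, tol=1e-6):
    # Outer level (sketch): bisection on the amplitude eps so that the
    # logarithmic norm of the perturbed matrix meets the target value.
    if log_norm_2(A) <= target:
        return np.zeros_like(A)                    # already non-expansive
    eps_lo, eps_hi = 0.0, 2.0 * np.linalg.norm(A)  # assumed to bracket the root
    while eps_hi - eps_lo > tol:
        eps = 0.5 * (eps_lo + eps_hi)
        if log_norm_2(A + eps * inner_direction(A, eps)) > target:
            eps_lo = eps
        else:
            eps_hi = eps
    return eps_hi * inner_direction(A, eps_hi)

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Delta = stabilize(A, target=0.0)                   # aim for a non-expansive flow
print(log_norm_2(A), log_norm_2(A + Delta), np.linalg.norm(Delta))

In the paper's setting the Jacobian varies along the trajectory and the control is embedded in the training loop, so the toy inner loop above only stands in for the eigenvalue-optimization machinery developed there.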
Please cite this paper as:
@article{demarinis2025stability,
title={Stability of neural ODEs by a control over the expansivity of their
flows},
author={De Marinis, Arturo and Guglielmi, Nicola and Sicilia, Stefano and Tudisco, Francesco},
journal={arXiv preprint arXiv:2501.10740},
year={2025}
}
Links:
arxiv
Keywords:
neural ode
deep learning
neural networks
adversarial attacks
nonlinear eigenvalues
eigenvalue optimization