PuSH - Publication Server of Helmholtz Zentrum München

Sparsity in Continuous-Depth Neural Networks.

In: Advances in Neural Information Processing Systems 35 (NeurIPS 2022).
Neural Ordinary Differential Equations (NODEs) have proven successful in learning dynamical systems, in the sense of accurately recovering the observed trajectories. While different types of sparsity have been proposed to improve robustness, the generalization properties of NODEs for dynamical systems beyond the observed data remain underexplored. We systematically study the influence of weight and feature sparsity on forecasting as well as on identifying the underlying dynamical laws. Besides assessing existing methods, we propose a regularization technique to sparsify "input-output connections" and extract relevant features during training. Moreover, we curate real-world datasets consisting of human motion capture and human hematopoiesis single-cell RNA-seq data to realistically analyze different levels of out-of-distribution (OOD) generalization in forecasting and dynamics identification, respectively. Our extensive empirical evaluation on these challenging benchmarks suggests that weight sparsity improves generalization in the presence of noise or irregular sampling. However, it does not prevent learning spurious feature dependencies in the inferred dynamics, rendering the learned models impractical for predictions under interventions or for inferring the true underlying dynamics. In contrast, feature sparsity does help recover sparse ground-truth dynamics compared to unregularized NODEs.
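The abstract contrasts weight sparsity with feature sparsity on the "input-output connections" of the NODE's vector field. As a hedged illustration (not the authors' implementation), the sketch below builds a tiny NumPy vector field f(x) = W2 tanh(W1 x), forecasts by explicit-Euler rollout, and computes a group-lasso penalty over the input columns of W1: driving a column to zero removes that feature's influence on the inferred dynamics. All names, sizes, and the exact penalty form are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 3, 8                      # state dimension, hidden width (illustrative)
W1 = rng.normal(size=(h, d))     # input->hidden weights; column j carries feature j
W2 = rng.normal(size=(d, h))     # hidden->output weights

def vector_field(x):
    """dx/dt = W2 tanh(W1 x): a small MLP vector field, as in a NODE."""
    return W2 @ np.tanh(W1 @ x)

def euler_rollout(x0, dt=0.01, steps=100):
    """Explicit-Euler integration of the learned dynamics (forecasting)."""
    x = x0.copy()
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * vector_field(x)
        traj.append(x.copy())
    return np.stack(traj)

def feature_sparsity_penalty(W):
    """Group lasso over input columns: sum_j ||W[:, j]||_2.
    Added to the training loss, it pushes whole columns to zero,
    pruning entire input features from the dynamics."""
    return np.sum(np.linalg.norm(W, axis=0))

traj = euler_rollout(np.ones(d))
print(traj.shape)                          # (101, 3)

# Zeroing column 0 of W1 makes the dynamics independent of feature 0:
W1[:, 0] = 0.0
```

In a real training loop this penalty would be weighted and minimized jointly with the trajectory-reconstruction loss via automatic differentiation; the point here is only that column-wise (group) sparsity, unlike elementwise weight sparsity, removes a feature's dependency entirely.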
Publication type Article: Conference contribution
Corresponding Author
ISSN (print) / ISBN 1049-5258
Conference Title Advances in Neural Information Processing Systems
Source details Volume: 35