PuSH - Publication Server of Helmholtz Zentrum München

Stapor, P. ; Schmiester, L. ; Wierling, C.* ; Merkt, S.* ; Pathirana, D.* ; Lange, B.M.H.* ; Weindl, D. ; Hasenauer, J.

Mini-batch optimization enables training of ODE models on large-scale datasets.

Nat. Commun. 13:34 (2022)
Open Access Gold
Creative Commons License
Quantitative dynamic models are widely used to study cellular signal processing. A critical step in modelling is the estimation of unknown model parameters from experimental data. As model sizes and datasets are steadily growing, established parameter optimization approaches for mechanistic models become computationally extremely challenging. Mini-batch optimization methods, as employed in deep learning, have better scaling properties. In this work, we adapt, apply, and benchmark mini-batch optimization for ordinary differential equation (ODE) models, thereby establishing a direct link between dynamic modelling and machine learning. On our main application example, a large-scale model of cancer signaling, we benchmark mini-batch optimization against established methods, achieving better optimization results and reducing computation time by more than an order of magnitude. We expect that our work will serve as a first step towards mini-batch optimization tailored to ODE models and enable modelling of even larger and more complex systems than what is currently possible.
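
To illustrate the idea described in the abstract, the following minimal Python sketch applies plain mini-batch gradient descent to a toy one-parameter ODE model: each pass over the data draws small random subsets of experimental conditions, simulates the ODE only for those conditions, and takes a gradient step on the resulting loss. The model, data, and optimizer settings are invustrated here are invented for this example and are not the paper's large-scale cancer-signalling model or its software; a real implementation would also compute gradients via sensitivity analysis rather than the finite differences used here for brevity.

import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical one-state ODE model: dx/dt = -k * x, x(0) = x0.
# "Experiments" differ only in their initial condition; each provides one
# noisy measurement of x at time T_OBS. All names below are illustrative
# and not taken from the paper or its software.

rng = np.random.default_rng(0)
K_TRUE = 0.5
T_OBS = 2.0
N_EXPERIMENTS = 200

x0_all = rng.uniform(1.0, 5.0, size=N_EXPERIMENTS)  # per-experiment initial states
y_all = x0_all * np.exp(-K_TRUE * T_OBS) + rng.normal(0.0, 0.05, N_EXPERIMENTS)  # noisy data


def simulate(log_k, x0):
    """Integrate the ODE for one experiment and return x(T_OBS)."""
    k = np.exp(log_k)  # optimize in log space, as is common for rate constants
    sol = solve_ivp(lambda t, x: -k * x, (0.0, T_OBS), [x0],
                    t_eval=[T_OBS], rtol=1e-8, atol=1e-10)
    return sol.y[0, -1]


def batch_loss(log_k, idx):
    """Mean squared error over the experiments in the mini-batch `idx`."""
    residuals = [simulate(log_k, x0_all[i]) - y_all[i] for i in idx]
    return np.mean(np.square(residuals))


def batch_grad(log_k, idx, eps=1e-4):
    """Central finite-difference gradient of the mini-batch loss."""
    return (batch_loss(log_k + eps, idx) - batch_loss(log_k - eps, idx)) / (2 * eps)


# Plain mini-batch gradient descent; the paper benchmarks more elaborate
# optimizer variants, which are not shown here.
log_k = np.log(0.1)   # initial guess for the rate constant, in log space
learning_rate = 0.2
batch_size = 20

for epoch in range(15):
    order = rng.permutation(N_EXPERIMENTS)
    for start in range(0, N_EXPERIMENTS, batch_size):
        idx = order[start:start + batch_size]
        log_k -= learning_rate * batch_grad(log_k, idx)
    print(f"epoch {epoch:2d}  k = {np.exp(log_k):.4f}  (true {K_TRUE})")

Because each parameter update only requires simulating a small subset of the experimental conditions, the cost per iteration stays roughly constant as the number of conditions grows, which is the scaling advantage the abstract refers to.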
Publication type Article: Journal article
Document type Scientific Article
Corresponding Author
ISSN (print) / ISBN 2041-1723
e-ISSN 2041-1723
Citation details Volume: 13, Issue: 1, Article Number: 34
Publisher Nature Publishing Group
Publishing Place London
Non-patent literature Publications
Reviewing status Peer reviewed
Grants Bundesministerium für Wirtschaft und Energie
Bundesministerium für Bildung und Forschung (BMBF)
BMBF-DLR