CCN 2022 Contributed Talk on Neural ODEs

Date:

This talk explores how standard (discretized) deep-learning implementations of neuroscience models often distort their underlying differential equations, limiting both task performance and dynamical accuracy. We present neural ODEs as a more faithful alternative that allows large-scale models to be trained end-to-end with accurate, adaptive ODE solvers. Using predictive coding and the horizontal gated recurrent unit (hGRU) as case studies, we show that neural-ODE implementations yield more stable dynamics and better task performance than conventional Euler-based updates. These results highlight neural ODEs as a powerful framework for scaling biologically inspired models without sacrificing their continuous-time structure.
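To make the Euler-vs-adaptive-solver contrast concrete, here is a minimal toy sketch (not from the talk, and using SciPy rather than any model discussed in it): on the linear ODE dy/dt = -5y, a fixed-step forward Euler update with a coarse step leaves the method's stability region and diverges, while an adaptive Runge-Kutta solver tracks the true solution to tolerance.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy dynamics: dy/dt = -5y, exact solution y(t) = exp(-5t).
def f(t, y):
    return -5.0 * y

t_end, y0 = 1.0, 1.0
y_true = np.exp(-5.0 * t_end)

# Fixed-step forward Euler with a coarse step. For this ODE the
# stability limit is dt < 2/5, so dt = 0.5 makes the iterates
# oscillate and grow instead of decaying.
dt, y_euler = 0.5, y0
for _ in range(int(t_end / dt)):
    y_euler = y_euler + dt * f(0.0, y_euler)

# Adaptive Runge-Kutta (RK45 by default): the step size is chosen
# automatically to meet the requested error tolerances.
sol = solve_ivp(f, (0.0, t_end), [y0], rtol=1e-8, atol=1e-10)
y_adaptive = sol.y[0, -1]

print(f"true:     {y_true:.6f}")
print(f"euler:    {y_euler:.6f}")    # far from the true value at dt = 0.5
print(f"adaptive: {y_adaptive:.6f}") # close to exp(-5)
```

The same principle motivates the neural-ODE framing of recurrent models: treating the recurrent update as a continuous-time ODE lets solver tolerances, rather than a hand-picked step size, control the accuracy of the integrated dynamics.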

Recording