Coupling Liquid Time‑Constant Encoders with Modern Hopfield Memory
Abstract
Continuous-time neural networks provide adaptive dynamics but rely on a single hidden state to encode both fast input fluctuations and longer-term context. This shared representation forces rapidly changing inputs to overwrite slower contextual signals, so the model loses past information as new observations arrive. In contrast, biological perceptual systems maintain stable behaviour under evolving sensory input by integrating ongoing signals with stored associative patterns rather than relying on a single evolving state.

Motivated by this distinction, we study a simple coupling of Liquid Time-Constant Networks (LTCs) with a Modern Hopfield Network (MHN) that serves as a content-addressable memory. At each time step, the liquid state is projected into a query, the MHN retrieves a memory vector, and the two representations are concatenated before a readout layer. We analyse this coupling under standard norm and Lipschitz assumptions and show that the combined representation remains bounded. We further show that the retrieval map contracts gradients for parameters upstream of the memory query, providing a mechanism for reducing curvature in the loss landscape.

On public time-series benchmarks, the coupled LTC-MHN model improves mean accuracy by 2.3% over competitive recurrent and continuous-time baselines and reduces the estimated Hessian trace by roughly an order of magnitude relative to a standalone LTC encoder, with the largest gains on classification tasks and competitive performance on a regression task. Qualitative analyses of training curves, loss landscapes, and latent embeddings support the interpretation that Hopfield retrieval smooths optimization and encourages more compact, linearly separable class manifolds. Code will be released upon publication.
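The coupling step described in the abstract can be sketched concretely. The following is a minimal NumPy illustration, not the released implementation: `W_q` (the query projection) and `beta` (the inverse temperature of the softmax retrieval) are assumed names, and the LTC encoder producing the liquid state `h` is omitted.

```python
import numpy as np

def hopfield_retrieve(q, M, beta=1.0):
    # Modern Hopfield retrieval: a softmax-weighted combination of the
    # stored patterns, weighted by their similarity to the query.
    # q: (d,) query vector; M: (n_patterns, d) stored memory patterns.
    scores = beta * (M @ q)                    # similarity of q to each pattern
    weights = np.exp(scores - scores.max())    # numerically stable softmax
    weights /= weights.sum()
    return weights @ M                         # retrieved memory, shape (d,)

def coupled_step(h, M, W_q, beta=1.0):
    # One coupling step: project the liquid state h to a query, retrieve
    # from memory, and concatenate state and retrieval for the readout.
    q = W_q @ h
    m = hopfield_retrieve(q, M, beta)
    return np.concatenate([h, m])
```

With a large `beta`, retrieval sharpens toward the single stored pattern nearest the query; with a small `beta`, it blends several patterns, which is the smooth retrieval map the gradient-contraction argument relies on.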