Free Dynamics of Feature Learning Processes - Archive ouverte HAL
Preprint (2022)

Free Dynamics of Feature Learning Processes


Regression models typically recover a noisy signal as a combination of regressors, called features in machine learning, which are themselves the result of a learning process. The alignment of the prior feature covariance matrix with the signal is known to play a key role in the generalization properties of the model, i.e. its ability to make predictions on data unseen during training. We present a statistical physics picture of the learning process. First we revisit ridge regression to obtain compact asymptotic expressions for the train and test errors, making manifest the conditions under which efficient generalization occurs. This is established via an exact train-to-test sample error ratio combined with random matrix properties. Along the way, an effective ridge penalty (precisely the train-to-test error ratio) emerges in the form of a self-energy, which offers a very simple parameterization of the problem. This formulation proves convenient for tackling the learning process of the feature matrix itself. We derive an autonomous dynamical system, in terms of the elementary degrees of freedom of the problem, that determines the evolution of the relative alignment between the population matrix and the signal. A macroscopic counterpart of these equations is also obtained, and various dynamical mechanisms are unveiled, allowing one to interpret the dynamics of simulated learning processes and to reproduce trajectories of single experimental runs with high precision.
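The quantities named in the abstract can be illustrated numerically. The sketch below is not the paper's derivation; it is a minimal, assumed setup: ridge regression on Gaussian features drawn from an anisotropic population covariance, with the signal aligned to the top covariance direction, where the empirical train-to-test error ratio (the quantity the abstract relates to an effective ridge penalty) is simply measured.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 100          # samples, features
lam = 0.1                # explicit ridge penalty
sigma = 0.5              # label noise level

# Anisotropic population covariance: the alignment of its top
# directions with the signal governs generalization.
eigvals = np.linspace(2.0, 0.1, d)
X = rng.standard_normal((n, d)) * np.sqrt(eigvals)

w_star = np.zeros(d)
w_star[0] = 1.0          # signal aligned with the top covariance direction
y = X @ w_star + sigma * rng.standard_normal(n)

# Ridge estimator: (X^T X + lam * n * I)^{-1} X^T y
w_hat = np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ y)
train_err = np.mean((X @ w_hat - y) ** 2)

# Fresh test set drawn from the same population
m = 5 * n
X_test = rng.standard_normal((m, d)) * np.sqrt(eigvals)
y_test = X_test @ w_star + sigma * rng.standard_normal(m)
test_err = np.mean((X_test @ w_hat - y_test) ** 2)

# Empirical train-to-test error ratio
ratio = train_err / test_err
print(f"train={train_err:.4f}  test={test_err:.4f}  ratio={ratio:.4f}")
```

Because the fit overfits the training sample, the ratio sits strictly between 0 and 1; the paper's claim is that, asymptotically, this ratio plays the role of an effective regularization parameter.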
Origin: Files produced by the author(s)

Dates and versions

hal-03878500 , version 1 (29-11-2022)


Public Domain



Cyril Furtlehner. Free Dynamics of Feature Learning Processes. 2022. ⟨hal-03878500⟩


