Conference paper, Year: 2019

First Exit Time Analysis of Stochastic Gradient Descent Under Heavy-Tailed Gradient Noise

Thanh Huy Nguyen
Umut Şimşekli
Mert Gürbüzbalaban
Gael Richard

Abstract

Stochastic gradient descent (SGD) has been widely used in machine learning due to its computational efficiency and favorable generalization properties. Recently, it has been empirically demonstrated that the gradient noise in several deep learning settings exhibits non-Gaussian, heavy-tailed behavior. This suggests that the gradient noise can be modeled by using α-stable distributions, a family of heavy-tailed distributions that appear in the generalized central limit theorem. In this context, SGD can be viewed as a discretization of a stochastic differential equation (SDE) driven by a Lévy motion, and the metastability results for this SDE can then be used to illuminate the behavior of SGD, especially in terms of ‘preferring wide minima’. While this approach brings a new perspective for analyzing SGD, it is limited in the sense that, due to the time discretization, SGD might behave significantly differently from its continuous-time limit. Intuitively, the behaviors of these two systems are expected to be similar to each other only when the discretization step is sufficiently small; however, to the best of our knowledge, there is no theoretical understanding of how small the step-size should be chosen in order to guarantee that the discretized system inherits the properties of the continuous-time system. In this study, we provide a formal theoretical analysis in which we derive explicit conditions on the step-size such that the metastability behavior of the discrete-time system is similar to its continuous-time limit. We show that the behaviors of the two systems are indeed similar for small step-sizes and we identify how the error depends on the algorithm and problem parameters. We illustrate our results with simulations on a synthetic model and neural networks.
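To make the setup concrete, the following minimal Python sketch (not the authors' code) simulates the kind of discretized Lévy-driven recursion the abstract describes, on a one-dimensional double-well objective, and records the first exit time from a neighborhood of a minimum. The objective, the η^(1/α) noise scaling, and all parameter values are illustrative assumptions for this sketch, not the paper's exact construction.

```python
import numpy as np
from scipy.stats import levy_stable

def grad(x):
    # Gradient of the double-well objective f(x) = (x**2 - 1)**2 / 4,
    # whose minima sit at x = -1 and x = +1.
    return x**3 - x

def first_exit_time(eta=0.01, alpha=1.8, sigma=0.1, radius=0.5,
                    max_iters=100_000, seed=0):
    """Iterate x_{k+1} = x_k - eta * grad(x_k) + eta**(1/alpha) * sigma * L_k,
    where L_k is symmetric alpha-stable noise, starting at the minimum x = 1.
    Return the first k at which the iterate leaves the ball |x - 1| <= radius."""
    rng = np.random.default_rng(seed)
    # Draw all noise increments up front; alpha = 2 recovers Gaussian noise.
    noise = levy_stable.rvs(alpha, 0.0, size=max_iters, random_state=rng)
    x = 1.0
    for k in range(max_iters):
        x = x - eta * grad(x) + eta**(1.0 / alpha) * sigma * noise[k]
        if abs(x - 1.0) > radius:
            return k + 1
    return max_iters  # no exit observed within the iteration budget

if __name__ == "__main__":
    for a in (1.5, 1.8, 2.0):
        print(f"alpha = {a}: exit after {first_exit_time(alpha=a)} iterations")
```

Heavier tails (smaller α) produce occasional large jumps, so exits from narrow basins tend to happen earlier than under Gaussian noise; this is the intuition behind the ‘preferring wide minima’ behavior mentioned above. The paper's contribution is to quantify, via explicit step-size conditions, when this discrete recursion inherits the first-exit-time behavior of the continuous-time SDE.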
Main file
first-exit-time-analysis-of-stochastic-gradient-descent-under-heavy-tailed-gradient-noise.pdf (586.46 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02372376, version 1 (20-11-2019)

Identifiers

  • HAL Id: hal-02372376, version 1

Cite

Thanh Huy Nguyen, Umut Şimşekli, Mert Gürbüzbalaban, Gael Richard. First Exit Time Analysis of Stochastic Gradient Descent Under Heavy-Tailed Gradient Noise. 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Dec 2019, Vancouver, Canada. ⟨hal-02372376⟩
125 views
120 downloads
