Conference paper, 2018

Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression

Abstract

In high dimension, it is customary to consider Lasso-type estimators to enforce sparsity. For standard Lasso theory to hold, the regularization parameter should be proportional to the noise level, which is often unknown in practice. A remedy is to consider estimators such as the Concomitant Lasso, which jointly optimize over the regression coefficients and the noise level. However, when data from different sources are pooled to increase sample size, noise levels differ and new dedicated estimators are needed. We provide new statistical and computational solutions to perform heteroscedastic regression, with an emphasis on brain imaging with magneto- and electroencephalography (M/EEG). When instantiated to de-correlated noise, our framework leads to an efficient algorithm whose computational cost is not higher than for the Lasso, while addressing more complex noise structures. Experiments demonstrate improved prediction and support identification with correct estimation of noise levels.
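The concomitant idea summarized above, jointly estimating the regression coefficients and the noise level, can be sketched as a simple alternating scheme. The snippet below is only an illustrative sketch, not the authors' implementation: the scaling convention of the objective, the function name concomitant_multitask_lasso, the choice of sigma_min, and the use of scikit-learn's MultiTaskLasso for the coefficient subproblem are all assumptions made for the example.

    import numpy as np
    from sklearn.linear_model import MultiTaskLasso

    def concomitant_multitask_lasso(X, Y, lam, sigma_min=1e-3, n_iter=20):
        """Alternating minimization (one possible scaling convention) of
            ||Y - X B||_F^2 / (2 n sigma) + sigma / 2 + lam * ||B||_{2,1}
        over B and sigma >= sigma_min."""
        n = X.shape[0]
        B = np.zeros((X.shape[1], Y.shape[1]))
        sigma = max(sigma_min, np.linalg.norm(Y) / np.sqrt(n))
        for _ in range(n_iter):
            # B-step: with sigma fixed, the subproblem is a Multi-Task Lasso
            # whose effective regularization is lam * sigma.
            mtl = MultiTaskLasso(alpha=lam * sigma, fit_intercept=False, max_iter=2000)
            mtl.fit(X, Y)
            B = mtl.coef_.T  # scikit-learn stores coefficients as (n_tasks, n_features)
            # sigma-step: closed form, clipped below so the objective stays well posed.
            sigma = max(sigma_min, np.linalg.norm(Y - X @ B) / np.sqrt(n))
        return B, sigma

For fixed sigma, the coefficient subproblem reduces to a standard Multi-Task Lasso, which is consistent with the abstract's claim that the cost is comparable to a plain Lasso solver; the noise-level update itself is a closed form.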
Main file: massias18a.pdf (793.77 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01812011, version 1 (11-06-2018)

Identifiers

  • HAL Id: hal-01812011, version 1

Cite

Mathurin Massias, Olivier Fercoq, Alexandre Gramfort, Joseph Salmon. Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression. 21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018), Apr 2018, Lanzarote, Spain. ⟨hal-01812011⟩