Conference paper, 2022

Optimizing transformations for contrastive learning in a differentiable framework

Abstract

Current contrastive learning methods use random transformations, sampled from a large list with fixed hyperparameters, to learn invariance from an unannotated database. Following previous works that introduce a small amount of supervision, we propose a framework to find optimal transformations for contrastive learning using a differentiable transformation network. Our method improves performance in the low annotated data regime, both in accuracy on the supervised task and in convergence speed. In contrast to previous work, no generative model is needed for transformation optimization. Transformed images retain the information relevant to solving the supervised task, here classification. Experiments were performed on 34,000 2D slices of brain Magnetic Resonance Images and 11,200 chest X-ray images. On both datasets, with 10% of labeled data, our model achieves better performance than a fully supervised model trained with 100% of the labels.
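The abstract itself contains no code. As a purely illustrative sketch of the core idea, a transformation with learnable parameters optimized end-to-end together with a contrastive encoder, here is a minimal PyTorch example. The brightness/contrast transformation, the tiny encoder, and all names are hypothetical choices, not the paper's architecture; the paper additionally steers the transformation with a small supervised signal, which is omitted here.

```python
# Illustrative sketch, not the authors' code: a transformation with
# learnable parameters (hypothetical brightness/contrast) trained jointly
# with a contrastive encoder, so gradients reach the transformation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiffTransform(nn.Module):
    """Differentiable transformation with learnable magnitude parameters."""
    def __init__(self):
        super().__init__()
        self.brightness = nn.Parameter(torch.tensor(0.1))
        self.contrast = nn.Parameter(torch.tensor(1.2))

    def forward(self, x):
        # Contrast scaling around the per-image mean plus a brightness
        # shift; both are differentiable w.r.t. the parameters.
        mean = x.mean(dim=(2, 3), keepdim=True)
        return (x - mean) * self.contrast + mean + self.brightness

class SimpleEncoder(nn.Module):
    """Tiny CNN encoder producing L2-normalized embeddings."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=1)

def nt_xent(z1, z2, tau=0.5):
    """Standard NT-Xent (SimCLR-style) contrastive loss."""
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)              # (2N, d), already normalized
    sim = z @ z.t() / tau                       # cosine similarities
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))  # exclude self-similarities
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)        # positive pair = other view

encoder, transform = SimpleEncoder(), DiffTransform()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(transform.parameters()), lr=1e-3
)

x = torch.rand(8, 1, 64, 64)  # stand-in batch for MRI slices / chest X-rays
for _ in range(5):
    z1 = encoder(transform(x))  # view produced by the learned transformation
    z2 = encoder(x)             # identity view
    loss = nt_xent(z1, z2)      # gradients also flow into DiffTransform
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Note that the contrastive loss alone would push the learned transformation toward the identity (identical views trivially minimize the loss); in the paper, the transformation optimization is guided by a small amount of supervision from the classification task, omitted from this sketch for brevity.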

Dates and versions

hal-03761521, version 1 (26-08-2022)

Identifiers

HAL Id: hal-03761521

Cite

Camille Ruppli, Pietro Gori, Roberto Ardon, Isabelle Bloch. Optimizing transformations for contrastive learning in a differentiable framework. Medical Image Learning with Limited & Noisy Data (MILLanD), MICCAI Workshop, Sep 2022, Singapore. ⟨hal-03761521⟩