Downbeat tracking with multiple features and deep neural networks
Abstract
In this paper, we introduce a novel method for the automatic estimation of downbeat positions from music signals. Our system relies on the computation of musically inspired features capturing important aspects of music such as timbre, harmony, rhythmic patterns, and local similarities in both timbre and harmony. It then uses several independent deep neural networks to learn higher-level representations. The downbeat sequences are finally obtained through a temporal decoding step based on the Viterbi algorithm. A comparative evaluation conducted on varied datasets demonstrates the efficiency of our approach and its robustness across different music styles.
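To illustrate the temporal decoding step mentioned above, the sketch below shows a minimal Viterbi decoder over a bar-position state space. It is not the authors' implementation: the state space (fixed 4/4 meter), the transition model, and the per-beat downbeat probabilities (stand-ins for the neural network outputs) are all simplifying assumptions made for illustration.

```python
# Minimal Viterbi decoding sketch for downbeat tracking (illustrative only).
# Assumes a fixed 4/4 meter and per-beat downbeat probabilities from some
# upstream classifier; here they are synthetic placeholders.
import numpy as np


def viterbi_downbeats(downbeat_prob, beats_per_bar=4, p_advance=0.9):
    """Decode the most likely bar position for each beat.

    downbeat_prob : (n_beats,) array, probability that each beat is a downbeat
                    (hypothetical network output, not the paper's features).
    Returns the indices of beats decoded as downbeats (bar position 0).
    """
    downbeat_prob = np.asarray(downbeat_prob, dtype=float)
    n, k = len(downbeat_prob), beats_per_bar

    # Observation likelihoods: position 0 "emits" a downbeat, the other
    # positions share the complementary probability.
    obs = np.empty((n, k))
    obs[:, 0] = downbeat_prob
    obs[:, 1:] = ((1.0 - downbeat_prob) / (k - 1))[:, None]

    # Transitions: advance one bar position per beat with high probability,
    # otherwise jump uniformly (a crude allowance for errors or meter changes).
    trans = np.full((k, k), (1.0 - p_advance) / k)
    for s in range(k):
        trans[s, (s + 1) % k] += p_advance

    log_obs = np.log(obs + 1e-12)
    log_trans = np.log(trans + 1e-12)

    # Forward pass with backpointers.
    delta = np.full(k, np.log(1.0 / k)) + log_obs[0]
    psi = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        scores = delta[:, None] + log_trans      # scores[from, to]
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_obs[t]

    # Backtrack the best state sequence.
    states = np.empty(n, dtype=int)
    states[-1] = delta.argmax()
    for t in range(n - 2, -1, -1):
        states[t] = psi[t + 1, states[t + 1]]
    return np.flatnonzero(states == 0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic probabilities: every fourth beat looks like a downbeat.
    probs = np.clip(np.tile([0.9, 0.1, 0.1, 0.1], 8) + rng.normal(0, 0.05, 32), 0, 1)
    print(viterbi_downbeats(probs))  # expected: beats 0, 4, 8, ...
```

In this toy setting the decoder recovers every fourth beat as a downbeat; the paper's actual decoding operates on the learned network outputs rather than these placeholder probabilities.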