Hadrien Foroughmand Aarabi will defend his thesis on the 14th of January, at 2PM
Towards Global Tempo Estimation and Rhythm-Oriented Genre Classification Based On Harmonic Characteristics of Rhythm
You can follow his defense on YouTube: here
The thesis was directed by Geoffroy Peeters at IRCAM (Analysis and Synthesis of Sounds team) and financed by the European H2020 project FuturePulse.
Jury members:
Roland Badeau (Professor, Télécom ParisTech),
Emmanuel Vincent (Senior Research Scientist, INRIA - LORIA),
Emilia Gomez (Professor, Universitat Pompeu Fabra (UPF) - MTG),
Matthew Davies (Senior Researcher, Centre for Informatics and Systems of the University of Coimbra - Department of Informatics Engineering),
Emmanuel Saint-James (Associate Professor (HDR), Sorbonne Université - LIP6).
The automatic detection of rhythmic structure in music is one of the challenges of the research field of Music Information Retrieval (MIR).
The advent of technology dedicated to the arts has enabled the emergence of new musical trends, generally described by the term "Electronic/Dance Music" (EDM), which encompasses a plethora of sub-genres.
This type of music, often intended for dancing, is characterized by its rhythmic structure.
We propose a rhythm-based analysis of what defines certain musical genres, including EDM sub-genres.
To do so, we want to perform an automatic global tempo estimation task and a genre classification task based on rhythm.
Tempo and genre are two intertwined aspects since genres are often associated with rhythmic patterns that are played in specific tempo ranges.
So-called "handcrafted" tempo estimation systems, based on the extraction of rhythm-related features, have proven effective.
More recently, with the appearance of annotated databases, so-called "data-driven" systems relying on deep learning approaches have advanced the state of the art on both tasks.
In this thesis, we propose methods at the crossroads between "handcrafted" and "data-driven" systems.
The development of a new representation of rhythm, combined with deep learning using convolutional neural networks, forms the basis of all our work.
In this thesis, we present our Deep Rhythm method in detail, as well as several extensions of it, based on musical intuitions, that improve performance on both tasks.
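To make the idea of estimating a global tempo from the harmonic structure of rhythm concrete, here is a minimal, hypothetical sketch — not the Deep Rhythm method itself. It computes a Fourier tempogram of an onset-strength envelope and picks the strongest periodicity within a plausible BPM range; the function name, parameters, and the synthetic test signal are all illustrative assumptions.

```python
import numpy as np

def estimate_tempo(onset_envelope, frame_rate, bpm_range=(60, 200)):
    """Illustrative global tempo estimator (NOT the Deep Rhythm method).

    Takes an onset-strength envelope sampled at `frame_rate` frames per
    second, computes its Fourier magnitude spectrum (a simple Fourier
    tempogram), and returns the BPM of the strongest periodicity inside
    `bpm_range`.
    """
    n = len(onset_envelope)
    # Remove the DC component so the zero-frequency bin does not dominate.
    spectrum = np.abs(np.fft.rfft(onset_envelope - onset_envelope.mean()))
    freqs = np.fft.rfftfreq(n, d=1.0 / frame_rate)  # periodicities in Hz
    bpms = freqs * 60.0                             # convert Hz -> BPM
    # Restrict to the plausible tempo range, then take the peak.
    mask = (bpms >= bpm_range[0]) & (bpms <= bpm_range[1])
    return bpms[mask][np.argmax(spectrum[mask])]

# Synthetic onset envelope: narrow pulses at 2 Hz (120 BPM),
# sampled at 100 frames per second for 30 seconds.
frame_rate = 100
t = np.arange(0, 30, 1.0 / frame_rate)
envelope = (np.sin(2 * np.pi * 2.0 * t) > 0.99).astype(float)
print(estimate_tempo(envelope, frame_rate))  # close to 120 BPM
```

A pulse train like this has energy at the fundamental (2 Hz) and at integer harmonics (4 Hz, 6 Hz, ...); it is this harmonic series of the rhythm spectrum that the restriction to a tempo range disambiguates here, and that the thesis exploits in a learned representation.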