Comparing content and context based similarity for musical data
Department of Forestry and Management of the Environment and Natural Resources, Democritus University of Thrace, Pandazidou 193, Orestiada 68200, Greece
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Physics: acoustics
Psychology. Ethology
FRANCIS
Similarity measurement between two musical pieces is a hard problem. Humans perceive such similarity by employing a large amount of contextually semantic information. Commonly used content-based methodologies rely on data descriptors of limited semantic value and are thus approaching a performance upper bound. Recent research on contextual information assigned as free-form text (tags) in social networking services has shown tags to be highly effective in improving the accuracy of music similarity. In this paper, a large-scale similarity measurement (20k real music pieces) is performed using mainstream off-the-shelf methodologies that rely on both content and context. In addition, the accuracy of the examined methodologies is tested not only against objective metadata but also against real-life user listening data. Experimental results illustrate the conditionally substantial gains of the context-based methodologies, as well as the relatively loose match between these methods and similarity derived from real user listening data.
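
The abstract contrasts content-based similarity (computed from audio descriptors) with context-based similarity (computed from social tags). The following minimal Python sketch illustrates that contrast using cosine similarity over example vectors; the specific descriptors, tag vocabulary, vector values, and the choice of cosine similarity are illustrative assumptions and are not taken from the paper.

    import numpy as np

    def cosine_similarity(a, b):
        # Cosine similarity between two non-zero feature vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Content-based view: similarity from audio descriptors
    # (hypothetical averaged MFCC-style features per track).
    features_track_a = np.array([1.2, -0.4, 0.9, 0.1])
    features_track_b = np.array([1.0, -0.5, 1.1, 0.2])
    content_sim = cosine_similarity(features_track_a, features_track_b)

    # Context-based view: similarity from social tag counts over a shared
    # hypothetical tag vocabulary, e.g. ["rock", "indie", "electronic", "jazz"].
    tags_track_a = np.array([42.0, 17.0, 3.0, 0.0])
    tags_track_b = np.array([35.0, 20.0, 1.0, 2.0])
    context_sim = cosine_similarity(tags_track_a, tags_track_b)

    print(f"content-based similarity: {content_sim:.3f}")
    print(f"context-based (tag) similarity: {context_sim:.3f}")

In such a setup, either similarity score can then be compared against ground truth built from objective metadata (e.g. shared genre or artist) or from user listening data, which is the kind of evaluation the abstract describes.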