Title: Modeling concept dynamics for large scale music search
Keywords: music information retrieval
Source: Shen, J., Pang, H., Wang, M., Yan, S. (2012). Modeling concept dynamics for large scale music search. SIGIR'12 - Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval: 455-464. ScholarBank@NUS Repository. https://doi.org/10.1145/2348283.2348346
Abstract: Continuing advances in data storage and communication technologies have led to an explosive growth in digital music collections. To cope with their increasing scale, we need effective Music Information Retrieval (MIR) capabilities like tagging, concept search and clustering. Integral to MIR is a framework for modelling music documents and generating discriminative signatures for them. In this paper, we introduce a multimodal, layered learning framework called DMCM. Distinguished from the existing approaches that encode music as an ensemble of order-less feature vectors, our framework extracts from each music document a variety of acoustic features, and translates them into low-level encodings over the temporal dimension. From them, DMCM elucidates the concept dynamics in the music document, representing them with a novel music signature scheme called Stochastic Music Concept Histogram (SMCH) that captures the probability distribution over all the concepts. Experiment results with two large music collections confirm the advantages of the proposed framework over existing methods on various MIR tasks. © 2012 ACM.
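The record does not include the paper's implementation details, but the abstract's core idea, a document-level signature that is a probability distribution over concepts, can be sketched roughly. In this minimal illustration (the function name, input layout, and mean-pooling aggregation are assumptions, not the authors' published method), per-frame concept scores over the temporal dimension are normalized and averaged into one histogram per music document:

```python
import numpy as np

def concept_histogram(frame_scores):
    """Sketch of a stochastic concept histogram.

    frame_scores: array of shape (n_frames, n_concepts) with non-negative
    per-frame concept scores (e.g. classifier outputs over time).
    Returns one distribution over all concepts for the whole document.
    """
    scores = np.asarray(frame_scores, dtype=float)
    # Turn each frame's scores into a distribution over concepts.
    per_frame = scores / scores.sum(axis=1, keepdims=True)
    # Pool over the temporal dimension to get a document-level histogram.
    return per_frame.mean(axis=0)

# Example: 4 frames, 3 concepts
hist = concept_histogram([[1, 0, 1],
                          [2, 1, 1],
                          [0, 1, 1],
                          [1, 1, 2]])
```

Because each output sums to 1, two documents' signatures can be compared with any distance between probability distributions (e.g. Euclidean or KL divergence) for search and clustering.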
Source Title: SIGIR'12 - Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval
Appears in Collections: Staff Publications
Files in This Item: There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.