|Title:||Automatic music soundtrack generation for outdoor videos from contextual sensor information|
|Citation:||Yu, Y., Shen, Z., Zimmermann, R. (2012). Automatic music soundtrack generation for outdoor videos from contextual sensor information. MM 2012 - Proceedings of the 20th ACM International Conference on Multimedia: 1377-1378. ScholarBank@NUS Repository. https://doi.org/10.1145/2393347.2396493|
|Abstract:||We present a system to automatically generate soundtracks for user-generated outdoor videos (UGV) based on concurrently captured contextual sensor information with mobile apps for the ACM Multimedia 2012 Google challenge: Automatic Music Video Generation. Our method addresses the use case of making "a video much more attractive for sharing by adding a matching soundtrack to it." Our system correlates viewable scene information from sensors with geographic contextual tags from OpenStreetMap. The co-occurrence of geo-tags and mood tags is investigated from a set of categories of the web site Foursquare.com, and a mapping from geo-tags to mood tags is obtained. Finally, a music retrieval component returns music based on matching mood tags. The experimental results show that our system can successfully create soundtracks that are related to the mood and situation of UGVs and therefore enhance the enjoyment of viewers. Our system sends only sensor data to a cloud service and is therefore bandwidth efficient since video data does not need to be transmitted for analysis. © 2012 Authors.|
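The abstract describes a pipeline that maps geo-tags (derived from OpenStreetMap scene information) to mood tags via co-occurrence statistics over Foursquare-style venue categories, then retrieves music by mood tag. The paper does not give implementation details; the following is a minimal sketch under assumed data structures, where the venue records, tag names, and music library are all hypothetical illustrations rather than the authors' actual data or code.

```python
from collections import Counter, defaultdict

# Hypothetical venue records: each pairs a geo-tag (a Foursquare-style
# venue category) with the mood tags observed for that venue.
venue_records = [
    ("beach", ["relaxed", "happy"]),
    ("beach", ["relaxed"]),
    ("stadium", ["energetic", "excited"]),
    ("park", ["relaxed", "happy"]),
    ("stadium", ["energetic"]),
]

# Count how often each mood tag co-occurs with each geo-tag.
cooccurrence = defaultdict(Counter)
for geo_tag, mood_tags in venue_records:
    cooccurrence[geo_tag].update(mood_tags)

# Map each geo-tag to its most frequently co-occurring mood tag.
geo_to_mood = {geo: moods.most_common(1)[0][0]
               for geo, moods in cooccurrence.items()}

# Hypothetical music library indexed by mood tag.
music_by_mood = {
    "relaxed": ["acoustic_sunset.mp3"],
    "energetic": ["drum_and_bass.mp3"],
}

def soundtrack_for(geo_tag):
    """Return candidate tracks for the dominant mood of a geo-tag."""
    mood = geo_to_mood.get(geo_tag)
    return music_by_mood.get(mood, [])
```

In this sketch, `soundtrack_for("beach")` selects "relaxed" music because that mood co-occurs most often with the beach category; the actual system would derive such counts from Foursquare.com category data rather than a hand-written list.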
|Source Title:||MM 2012 - Proceedings of the 20th ACM International Conference on Multimedia|
|Appears in Collections:||Staff Publications|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.