Immensely popular video sharing websites such as YouTube have become the most important sources of music information for Internet users and the most prominent platforms for sharing live music. The audio quality of this huge volume of live music recordings, however, varies significantly due to factors such as environmental noise, location, and recording device. Yet most video search engines do not take audio quality into consideration when retrieving and ranking results. Given that most users prefer live music videos with better audio quality, we propose the first automatic, non-reference audio quality assessment framework for live music video search online. We first construct two annotated datasets of live music recordings: the first contains 500 human-annotated pieces, and the second contains 2,400 synthetic pieces systematically generated by adding noise effects to clean recordings. We then formulate the assessment task as a ranking problem and solve it with a learning-based scheme. To validate the effectiveness of our framework, we perform both objective and subjective evaluations. Results show that our framework significantly improves the ranking performance of live music recording retrieval and can prove useful for various real-world music applications.
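The abstract does not detail the learning-based ranking scheme, so the following is only a minimal sketch of one common way to cast non-reference quality assessment as a pairwise learning-to-rank problem: recordings are summarised by assumed audio features (MFCC statistics via librosa), and a linear model is trained on feature differences of quality-labelled pairs (RankSVM style). The feature set, model, and variable names are illustrative assumptions, not the authors' implementation.

```python
# Sketch: pairwise learning-to-rank for non-reference audio quality (assumed approach).
import numpy as np
import librosa
from sklearn.svm import LinearSVC

def audio_features(path):
    """Summarise a recording as the mean and std of its MFCCs (assumed feature set)."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def make_pairs(X, quality):
    """Turn per-item quality scores into pairwise difference samples."""
    diffs, labels = [], []
    for i in range(len(X)):
        for j in range(len(X)):
            if quality[i] != quality[j]:
                diffs.append(X[i] - X[j])
                labels.append(1 if quality[i] > quality[j] else 0)
    return np.array(diffs), np.array(labels)

# Usage (paths and quality scores come from the annotated dataset):
# X = np.vstack([audio_features(p) for p in paths])
# D, y = make_pairs(X, quality)
# ranker = LinearSVC().fit(D, y)
# Rank new recordings by projection onto the learned weight vector:
# scores = X @ ranker.coef_.ravel()
```

Any pairwise or listwise ranker could be substituted here; the key point is that the model learns an ordering of recordings by perceived quality rather than an absolute quality score.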
Music is inherently abstract and multidimensional, yet existing music search engines are usually inconvenient or too complicated for users to create multidimensional music queries, leading to an intention gap between users' music information needs and the queries they input. In this paper, we present a novel content-based music search engine, the Intelligent & Interactive Multidimensional mUsic Search Engine (i²MUSE), which enables users to input music queries with multiple dimensions efficiently and effectively. Six musical dimensions are explored in this study: tempo, beat strength, genre, mood, instrument, and vocal. Users can begin a query from any dimension and interact with the system to refine it. Once the parameters of some dimensions have been set, i²MUSE intelligently highlights a suggested parameter and grays out an unsuggested parameter for every other dimension, helping users express their music intentions and avoid parameter conflicts in the query. In addition, i²MUSE provides a real-time illustration of the percentage of matched tracks in the database, and users can set the relative weight of each specified dimension. We have conducted a pilot user study with 30 subjects and validated the effectiveness and usability of i²MUSE.
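To make the weighted multidimensional query idea concrete, here is a minimal sketch under an assumed data model (each track carries a categorical value per dimension); it is not the i²MUSE implementation. Tracks are scored by weighted agreement with the specified dimensions, and the percentage of fully matched tracks is reported, mirroring the real-time match indicator described above.

```python
# Sketch: weighted multidimensional track matching (assumed data model).
from dataclasses import dataclass

DIMENSIONS = ["tempo", "beat_strength", "genre", "mood", "instrument", "vocal"]

@dataclass
class Track:
    title: str
    attrs: dict  # e.g. {"tempo": "fast", "genre": "jazz", "mood": "relaxed"}

def match_score(track, query, weights):
    """Weighted fraction of specified query dimensions that the track satisfies."""
    total = sum(weights.get(d, 1.0) for d in query)
    hit = sum(weights.get(d, 1.0) for d, v in query.items() if track.attrs.get(d) == v)
    return hit / total if total else 0.0

def search(tracks, query, weights=None):
    """Rank tracks by weighted match and report the percentage of fully matched tracks."""
    weights = weights or {}
    ranked = sorted(tracks, key=lambda t: match_score(t, query, weights), reverse=True)
    matched = sum(1 for t in tracks if match_score(t, query, weights) == 1.0)
    return ranked, 100.0 * matched / max(len(tracks), 1)

# Example: query two dimensions, weighting mood twice as heavily as genre.
# ranked, pct = search(tracks, {"genre": "jazz", "mood": "relaxed"},
#                      {"mood": 2.0, "genre": 1.0})
```

The dimension suggestion and gray-out behaviour would sit on top of such a scoring function, e.g., by checking which candidate values of an unset dimension still leave matching tracks in the database.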