A study of musical timbre semantics was conducted with listeners from two different linguistic groups. In two separate experiments, native Greek- and English-speaking participants were asked to describe 23 musical instrument tones of variable pitch using a predefined vocabulary of 30 adjectives. The common experimental protocol facilitated the investigation of the influence of language on musical timbre semantics by allowing direct comparisons between linguistic groups. Data reduction techniques applied to the data of each group revealed three salient semantic dimensions that shared common conceptual properties across linguistic groups, namely luminance, texture, and mass. The results supported the universality of timbre semantics. A correlation analysis between physical characteristics and semantic dimensions associated: i) texture with the energy distribution of harmonic partials, ii) thickness (a term related to either mass or luminance) and brilliance with inharmonicity and spectral centroid variation, and iii) F0 with mass or luminance, depending on the linguistic group.
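As a rough illustration of one acoustic descriptor mentioned above, the Python sketch below computes a frame-wise spectral centroid trajectory and summarizes its variation as a standard deviation; the function name, frame length, and hop size are illustrative assumptions and not the authors' actual analysis pipeline.

    import numpy as np

    def spectral_centroid_variation(signal, sr, frame_len=2048, hop=512):
        # Frame-wise spectral centroid (Hz) and its standard deviation,
        # one common way to quantify "spectral centroid variation".
        # frame_len and hop are illustrative choices, not the study's settings.
        window = np.hanning(frame_len)
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
        centroids = []
        for start in range(0, len(signal) - frame_len + 1, hop):
            mag = np.abs(np.fft.rfft(signal[start:start + frame_len] * window))
            if mag.sum() > 0:
                centroids.append(np.sum(freqs * mag) / np.sum(mag))
        centroids = np.array(centroids)
        return centroids, centroids.std()

    # Example: a synthetic tone whose upper partial grows louder over time
    sr = 44100
    t = np.linspace(0, 1.0, sr, endpoint=False)
    tone = np.sin(2 * np.pi * 220 * t) + 0.5 * t * np.sin(2 * np.pi * 880 * t)
    trajectory, variation = spectral_centroid_variation(tone, sr)
    print(f"mean centroid {trajectory.mean():.1f} Hz, variation (std) {variation:.1f} Hz")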
The current study expands our previous work on interlanguage musical timbre semantics by examining the relationship between semantics and perception of timbre. Following Zacharakis, Pastiadis, and Reiss (2014), a pairwise dissimilarity listening test involving participants from two separate linguistic groups (Greek and English) was conducted. Subsequent multidimensional scaling analysis produced a 3D perceptual timbre space for each language. The comparison between perceptual spaces suggested that timbre perception is unaffected by native language. Additionally, comparisons between semantic and perceptual spaces revealed substantial similarities which suggest that verbal descriptions can convey a considerable amount of perceptual information. The previously determined semantic labels "auditory texture" and "luminance" featured the highest associations with perceptual dimensions for both languages. "Auditory mass" failed to show any strong correlations. Acoustic analysis identified energy distribution of harmonic partials, spectral detail, temporal/spectrotemporal characteristics and the fundamental frequency as the most salient acoustic correlates of perceptual dimensions.
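The sketch below outlines how a 3D perceptual space can be derived from pairwise dissimilarity ratings with non-metric multidimensional scaling, here using scikit-learn on a toy dissimilarity matrix; the matrix, the number of tones, and the MDS configuration are placeholders and do not reproduce the study's data or settings.

    import numpy as np
    from sklearn.manifold import MDS

    # Toy symmetric dissimilarity matrix for eight hypothetical instrument tones;
    # in the study such a matrix would come from averaged listener ratings.
    rng = np.random.default_rng(0)
    d = rng.uniform(1, 9, size=(8, 8))
    dissim = (d + d.T) / 2
    np.fill_diagonal(dissim, 0.0)

    # Non-metric MDS into three dimensions from the precomputed dissimilarities
    mds = MDS(n_components=3, metric=False, dissimilarity="precomputed", random_state=0)
    space = mds.fit_transform(dissim)  # rows = tones, columns = perceptual dimensions
    print(space.round(2))
    print("stress:", round(mds.stress_, 3))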
Sensorimotor activity in response to motion-reflecting audiovisual stimulation is studied in this article. EEG recordings, and especially the Mu-rhythm over the sensorimotor cortex (C3, CZ, and C4 electrodes), were acquired and explored. An experiment was designed to provide auditory (Modest Mussorgsky's "Promenade" theme) and visual (a synchronized walking human figure) stimuli to advanced music students (AMS) and to non-musicians (NM) as a control group. EEG signals were analyzed using fractal dimension (FD) estimation (Higuchi's, Katz's, and Petrosian's algorithms) and statistical methods. Experimental results from the midline electrode (CZ) based on the Higuchi method showed significant differences between the AMS and NM groups, with the former displaying a substantial sensorimotor response during auditory stimulation and a stronger correlation with the acoustic stimulus than the latter. This observation was linked to mirror neuron system activity, a neurological mechanism that allows trained musicians to detect action-related meanings underlying the structural patterns of musical excerpts. In contrast, the responses of the AMS and NM groups converged during audiovisual stimulation, owing to the dominant presence of human-like motion in the visual stimulus. These findings shed light on aspects of music perception and demonstrate the potential of FD to capture different states of cortical activity.
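Of the three FD estimators named above, Higuchi's algorithm is sketched below as a minimal Python implementation; the kmax value and the white-noise sanity check are illustrative choices, not the study's settings.

    import numpy as np

    def higuchi_fd(x, kmax=8):
        # Higuchi's fractal dimension of a 1-D signal; kmax is an illustrative choice.
        x = np.asarray(x, dtype=float)
        n = len(x)
        mean_lengths = []
        for k in range(1, kmax + 1):
            lengths = []
            for m in range(k):
                idx = np.arange(m, n, k)               # sub-series x[m], x[m+k], ...
                if len(idx) < 2:
                    continue
                dist = np.abs(np.diff(x[idx])).sum()   # curve length of the sub-series
                norm = (n - 1) / ((len(idx) - 1) * k)  # Higuchi's normalization factor
                lengths.append(dist * norm / k)
            mean_lengths.append(np.mean(lengths))
        # FD is the slope of log L(k) against log(1/k)
        k_vals = np.arange(1, kmax + 1)
        slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(mean_lengths), 1)
        return slope

    # Sanity check: white noise should yield an FD close to 2
    print(round(higuchi_fd(np.random.default_rng(1).standard_normal(2000)), 2))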
Electroencephalogram (EEG) recordings, and especially the Mu-rhythm over the sensorimotor cortex that relates to the activation of the mirror neuron system (MNS), were acquired from two subject groups (orchestral musicians and nonmusicians) in order to explore the action representation processes involved in the perception and performance of musical pieces. Two types of stimuli were used: an auditory one consisting of an excerpt of Beethoven's Fifth Symphony, and a visual one presenting a conductor directing an orchestra performing the same excerpt. Three tasks were conducted: auditory stimulation, audiovisual stimulation, and visual-only stimulation. The acquired signals were processed using fractal analysis [time-dependent fractal dimension (FD) estimation] and statistical analysis (analysis of variance, Mann-Whitney). Experimental results showed significant differences between the two groups: desynchronization of the Mu-rhythm, which can be linked to MNS activation, was observed during all tasks for the musicians' group, whereas the nonmusicians' group exhibited a similar response only when the visual stimulus was present. The mobility of the conductor was also correlated with the estimated FD signals, showing significantly higher correlation for musicians than for nonmusicians. The present study sheds light on the difference in action representation in auditory perception between musicians and nonmusicians and paves the way for a better comprehension of the underlying mechanisms of the MNS.
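A minimal sketch, assuming the higuchi_fd helper from the previous example, of how a time-dependent FD trace can be computed over sliding windows and how two groups' values might be compared with a Mann-Whitney test; the window parameters and the per-subject numbers are hypothetical illustrations, not the study's pipeline or data.

    import numpy as np
    from scipy.stats import mannwhitneyu

    def time_dependent_fd(eeg, sr, win_sec=1.0, hop_sec=0.25, kmax=8):
        # FD trace over overlapping windows, reusing higuchi_fd from the previous
        # sketch; window length and overlap are assumptions, not the study's settings.
        win, hop = int(win_sec * sr), int(hop_sec * sr)
        return np.array([higuchi_fd(eeg[s:s + win], kmax)
                         for s in range(0, len(eeg) - win + 1, hop)])

    # Hypothetical per-subject mean FD values for the two groups (illustrative numbers only)
    musicians_fd = np.array([1.62, 1.58, 1.65, 1.60, 1.59])
    nonmusicians_fd = np.array([1.48, 1.52, 1.50, 1.47, 1.53])
    stat, p = mannwhitneyu(musicians_fd, nonmusicians_fd, alternative="two-sided")
    print(f"Mann-Whitney U = {stat}, p = {p:.3f}")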