Previous research has shown that auditory cues can influence the flavor of food and drink; for instance, wine tastes better when preferred music is played. We investigated whether background music can modify judgments of the specific flavor profile of a beverage, as opposed to mere preference. To explore the nature of this crosstalk between auditory and gustatory perception, we hypothesized that the ‘flavor’ of the background music carries over to the perceived flavor (i.e., descriptive and evaluative aspects) of beverages. First, we collected ratings of the subjective flavor of different music pieces. Then, in a between-subjects design, we crossed the music backgrounds with taste evaluations of several beverages: participants tasted four beverage samples under two contrasting audio conditions and rated their taste experiences. This was indeed the case: the emotional flavor of the music had the hypothesized effects on the perceived flavor of the beverages. We had also hypothesized that the effect would be stronger for music novices than for music experts, and weaker for aqueous solutions than for wines; however, neither music expertise nor liquid type produced additional effects. We discuss the implications of this audio-gustatory interaction.
The quality of a concert hall primarily depends on its acoustics. But does visual input also have an impact on musical enjoyment? Does the color of ambient lighting modulate the perceived music quality? And are certain colors perceived to fit better than others with a given music piece? To address these questions, we performed three within-subjects experiments. We carried out two pretests to select four music pieces differing in tonality and genre, and 14 lighting conditions of varying hue, brightness, and saturation. In the main experiment, we applied a fully crossed repeated-measures design. Under each lighting condition, participants rated the musical variables ‘Harmonic’, ‘Powerful’, ‘Gloomy’, and ‘Lively’ and their overall liking of the music pieces, as well as the perceived fit of music and lighting. Subsequently, participants evaluated music and lighting separately by rating the same variables as before, as well as their emotional impact (valence, arousal, dominance). We found that music and lighting that had been rated similarly in terms of valence and arousal in the unimodal conditions were judged to match better when presented together. Accordingly, tonal (atonal) music was rated to fit better with weakly saturated (highly saturated) colors. Moreover, some characteristics of the lighting carried over to the music: just as red lighting was rated as more powerful than green and blue lighting, music was evaluated to be more powerful under red compared to green and blue lighting. We conclude that listening to music is a multisensory process enriched by impressions from the visual domain.
Sound by itself can be a reliable source of information about an object’s size. For instance, we are able to estimate the size of objects merely on the basis of the sound they make when falling on the floor. Moreover, loudness and pitch are crossmodally linked to size. We investigated whether sound has an effect on size estimation even in the presence of visual information, that is, whether manipulating the sound produced by a falling object influences visual length estimation. Participants watched videos of wooden dowels hitting a hard floor and estimated their lengths. Sound was manipulated by (A) increasing (decreasing) the overall sound pressure level, (B) swapping sounds among the different dowel lengths, and (C) increasing (decreasing) pitch. Results showed that dowels were perceived to be longer with increased sound pressure level (SPL), but there was no effect of swapped sounds or pitch manipulation. However, in a sound-only condition, main effects of length and pitch manipulation were found. We conclude that we are able to perceive subtle differences in the acoustic properties of impact sounds and use them to deduce object size when visual cues are eliminated. In contrast, when visual cues are available, only loudness is potent enough to exert a crossmodal influence on length perception.
Numerous studies indicate that listening to music and reading are processes that interact in multiple ways. However, these interactions have rarely been explored with regard to the role of emotional mood. In this study, we first conducted two pilot experiments to assess the emotional mood conveyed by four classical music pieces and by four narrative text excerpts. In the main experiment, participants were asked to read the texts while listening to the music and to rate their emotional state in terms of valence, arousal, and dominance. Subsequently, they rated the text and music of the multisensory event in terms of perceived mood, liking, immersion, and music-text fit. We found a mutual carry-over effect of happy and sad moods from music to text and vice versa. Contrary to our expectations, this effect was not mediated by the valence, arousal, or dominance experienced by the participants. Moreover, we found a significant interaction between music mood and text mood: texts were liked better, were rated as being of higher quality, and induced greater immersion when text mood and music mood corresponded. The role of mood congruence when listening to music while reading should not be ignored and deserves further exploration.