This review critically evaluates the literature concerning the impact of visual appearance cues (including colour, foam, and cloudiness) on people's perception in the beer category. The authors assess both the sensory expectations that are elicited by the visual appearance of beer, and the extent to which those expectations carry over to influence the actual tasting experience. Beer is a particularly intriguing category to study, since differing production rules across countries mean that there is not always the same scope to modify a beer's colour in order to meet perceived consumer demands. What is more, there is currently disagreement in the literature concerning the impact of beer colour and foam on people's expectations of beer prior to tasting, and on their multisensory flavour perception while tasting. Given how much beer is consumed annually, it is surprising that more research has not been published assessing the undoubtedly important role of visual appearance in this beverage category. Part of the reason may simply be that it is difficult to create consistent experimental stimuli, given how rapidly the head of a beer dissipates after serving.
Our emotions influence our perception. In order to determine whether emotion influences the perception of beer, 32 participants watched either a scene from the movie WALL-E, to induce joviality, or a short clip from The Shawshank Redemption, to induce sadness. The participants were then required to sample up to 250 mL of Yenda Pale Ale beer and rate it on a variety of taste and flavor characteristics (e.g., bitterness), before completing the Positive and Negative Affect Schedule-X (PANAS-X). The data were analyzed using Bayesian t-tests and Null Hypothesis Significance Tests (NHSTs). After applying conservative corrections for multiple comparisons, the NHSTs failed to reach statistical significance. However, the effect sizes suggested that inducing joviality, relative to inducing sadness, resulted in the beer being rated as (a) tasting more pleasant, (b) tasting sweeter, and (c) being of higher quality. Following the induction of joviality, participants were also willing to pay more for the beer. The Bayesian analyses indicated that induced emotion can influence flavor perception for complex taste stimuli. The effect sizes and Bayesian analyses are interpreted in terms of Feelings-as-Information theory. These preliminary findings can tentatively be applied to real-world environments such as venues that serve and/or market alcohol.
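The Bayesian t-tests mentioned in the abstract above can be illustrated with a minimal sketch of the default JZS (Jeffreys–Zellner–Siow) Bayes-factor computation for an independent-samples comparison, following Rouder et al.'s standard formulation. This is not the authors' actual analysis code, and the group sizes and rating values below are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats, integrate

def jzs_bayes_factor(t, n1, n2, r=np.sqrt(2) / 2):
    """BF10 for an independent-samples t-test under the default
    JZS (Cauchy, scale r) prior on standardized effect size."""
    n_eff = n1 * n2 / (n1 + n2)   # effective sample size
    nu = n1 + n2 - 2              # degrees of freedom

    def integrand(g):
        # Marginal likelihood under H1, integrating over the g-prior
        return ((1 + n_eff * r**2 * g) ** -0.5
                * (1 + t**2 / ((1 + n_eff * r**2 * g) * nu)) ** (-(nu + 1) / 2)
                * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))

    numerator, _ = integrate.quad(integrand, 0, np.inf)
    denominator = (1 + t**2 / nu) ** (-(nu + 1) / 2)  # likelihood under H0
    return numerator / denominator

# Hypothetical pleasantness ratings for the two mood-induction groups
rng = np.random.default_rng(0)
joy = rng.normal(6.0, 1.5, 16)
sad = rng.normal(5.0, 1.5, 16)
t_stat, p_val = stats.ttest_ind(joy, sad)
bf10 = jzs_bayes_factor(t_stat, 16, 16)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}, BF10 = {bf10:.2f}")
```

A BF10 above 1 indicates evidence for a group difference, below 1 evidence for the null; unlike a corrected p-value, it can quantify support in either direction.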
This review critically evaluates the literature documenting the impact of training on people's perception of beer. In certain circumstances, training has been shown to improve people's ability to match and identify beers, and to discriminate between beers and between the distinctive attributes of beer. However, a reasonably consistent finding is that the benefits of beer training do not seem to generalise to novel beers (i.e., those not experienced during training). As such, training would appear to improve people's ability to label their percepts and/or their recognition memory for beers, rather than necessarily influencing perception itself. Given how much beer is consumed annually, it is surprising that there has not been more published research into the role that training plays in this particular beverage category. Part of the reason for this may relate to (a) the limited role of experts in this field (e.g., primarily restricted to product testing and quality control, as compared to their much more prominent role in the world of wine), (b) the fact that training is an expensive and time-consuming process, and (c) the mistaken assumption that the effects of training are similar across different beverage categories (e.g., beer and wine). We suggest that further efforts are therefore required before it will be possible to confidently conclude that training lowers the perceptual thresholds, and enhances the perceptual discrimination abilities, of beer experts above those of novices.
According to the temporal principle of multisensory integration, cross-modal synchronisation of stimulus onset facilitates multisensory integration. This is typically observed as a greater response to multisensory stimulation relative to the sum of the constituent unisensory responses (i.e., superadditivity). The aim of the present study was to examine whether the temporal principle extends to the cross-modal synchrony of amplitude-modulation (AM) rate. It is well established that psychophysical sensitivity to AM stimulation is strongly influenced by AM rate, where the optimum rate differs according to sensory modality. This rate-dependent sensitivity is also apparent from EEG steady-state response (SSR) activity, which becomes entrained to the stimulation rate and is thought to reflect neural processing of the temporal characteristics of AM stimulation. In this study we investigated whether cross-modal congruence of AM rate reveals both psychophysical and EEG evidence of enhanced multisensory integration. To achieve this, EEG SSR and psychophysical sensitivity to simultaneous acoustic and/or vibrotactile AM stimuli were measured at cross-modally congruent and incongruent AM rates. While the results provided no evidence of superadditive multisensory SSR activity or psychophysical sensitivity, the complex pattern of results did reveal a consistent correspondence between SSR activity and psychophysical sensitivity to AM stimulation. This indicates that entrained EEG activity may provide a direct measure of cortical activity underlying multisensory integration. Consistent with the temporal principle of multisensory integration, increased vibrotactile SSR responses and psychophysical sensitivity were found for cross-modally congruent relative to incongruent AM rates. However, no corresponding increase in auditory SSR or psychophysical sensitivity was observed for cross-modally congruent AM rates.
This complex pattern of results can be understood in terms of the likely influence of the principle of inverse effectiveness, whereby the temporal principle of multisensory integration was only evident in the context of reduced perceptual sensitivity, for the vibrotactile but not the auditory modality.
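The amplitude-modulated stimuli described in the abstract above can be sketched in a few lines: a sinusoidal carrier multiplied by a slower sinusoidal envelope, with congruent conditions sharing one AM rate across modalities. The carrier frequencies, AM rates, and sampling rate below are illustrative placeholders, not the study's actual parameters.

```python
import numpy as np

def am_signal(carrier_hz, am_rate_hz, duration_s=1.0, fs=44100, depth=1.0):
    """Sinusoidal carrier amplitude-modulated at am_rate_hz.

    The SSR described in the abstract entrains at the envelope
    (AM) rate, not the carrier frequency.
    """
    t = np.arange(int(duration_s * fs)) / fs
    envelope = 1 + depth * np.sin(2 * np.pi * am_rate_hz * t)
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

# Cross-modally congruent: both modalities share one AM rate
auditory = am_signal(carrier_hz=500, am_rate_hz=21)
tactile = am_signal(carrier_hz=250, am_rate_hz=21)
# Incongruent: the two modalities are modulated at different rates
tactile_incongruent = am_signal(carrier_hz=250, am_rate_hz=40)
```

In the spectrum, modulation at rate f_m places sidebands at carrier ± f_m; psychophysically, it is the envelope rate that drives the rate-dependent sensitivity the abstract describes.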