Visual perception of architectural spaces and human aesthetic experience in these spaces have recently received considerable interest in cognitive science. However, it has been difficult to arrive at a common understanding of aesthetic experience for architectural space, since different studies use different scales to measure it. In this interdisciplinary study spanning cognitive science and architecture, we aim to provide an empirically driven, systematic characterization of human aesthetic experience and to investigate which aspects of architectural spaces affect it. To this end, we manipulated various architectural variables in virtual reality, including the shape of the curvilinear boundaries of architectural spaces as well as their size, light, texture, and color. We then had people evaluate these spaces on a comprehensive list of scales commonly used in the literature and applied principal component analysis to reveal the key dimensions of aesthetic experience. Our findings suggest that human aesthetic experience can be reduced to three key dimensions: familiarity, excitement, and fascination. Each of these dimensions is differentially affected by the architectural variables, revealing how the dimensions differ from one another. In sum, our study provides a comprehensive framework for characterizing human aesthetic experience in virtual architectural spaces with curved boundaries.
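As a minimal sketch of the kind of analysis described above, the following Python snippet applies principal component analysis to a participants-by-scales ratings matrix and inspects how many components account for most of the variance. The simulated data and all variable names are illustrative assumptions, not the study's actual materials or analysis pipeline.

```python
# Hypothetical sketch: reducing a set of aesthetic-rating scales to a few
# underlying dimensions with PCA. Data are simulated for illustration.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
ratings = rng.normal(size=(120, 20))  # 120 evaluations x 20 rating scales

# Standardize each scale first so no scale dominates by raw variance alone
z = StandardScaler().fit_transform(ratings)

pca = PCA()
scores = pca.fit_transform(z)

# Explained variance ratios help decide how many components to retain
print(pca.explained_variance_ratio_[:5])

# Loadings of each original scale on the first three components; in the study,
# the retained dimensions were interpreted as familiarity, excitement,
# and fascination
loadings = pca.components_[:3].T
print(loadings.shape)  # (20 scales, 3 components)
```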
The present study investigates how gender stereotypes affect people's gender attribution to social robots. To this end, we examined whether a robot can be attributed a gender depending on the action it performs. The study consists of three stages. In the first stage, we determined masculine and feminine actions through a survey of 54 participants. In the second stage, we selected a gender-neutral robot by having 76 participants rate several robot stimuli on a masculine-feminine spectrum. In the third stage, we created short animation videos in which the gender-neutral robot from stage two performed the masculine and feminine actions determined in stage one. We then asked 102 participants to evaluate the robot in the videos on the masculine-feminine spectrum, rating the videos both according to their own view (self-view) and according to how they thought society would evaluate the robot (society-view). We also used the Socialization of Gender Norms Scale (SGNS) to identify individual differences in gender attribution to social robots. We found a main effect of action type (feminine vs. masculine) on both self-view and society-view reports: the gender-neutral robot was reported to be feminine when it performed feminine actions and masculine when it performed masculine actions. This effect was more pronounced in the society-view reports than in the self-view reports: when the robot performed masculine actions, it was rated as more masculine in the society-view reports than in the self-view reports, and when it performed feminine actions, it was rated as more feminine in the society-view reports than in the self-view reports. In addition, the SGNS predicted the society-view reports (for feminine actions) but not the self-view reports. In sum, our study suggests that people can attribute gender to social robots depending on the actions they perform.
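For illustration, here is a minimal sketch of the two analyses summarized above: a within-subject comparison of ratings for masculine versus feminine actions, and a regression of society-view ratings on SGNS scores. The simulated data, effect sizes, and variable names are assumptions made for the sketch, not the study's data or exact statistical models.

```python
# Hypothetical sketch: (1) main effect of action type on masculinity ratings,
# (2) SGNS scores predicting society-view ratings for feminine actions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 102
# Simulated per-participant mean ratings on a masculine-feminine scale
self_masc = rng.normal(5.5, 1.0, n)  # self-view, masculine actions
self_fem = rng.normal(3.5, 1.0, n)   # self-view, feminine actions
soc_fem = rng.normal(3.0, 1.0, n)    # society-view, feminine actions
sgns = rng.normal(0.0, 1.0, n)       # SGNS scores

# Within-subject comparison: masculine vs. feminine actions (self-view)
t, p = stats.ttest_rel(self_masc, self_fem)
print(f"action-type effect: t={t:.2f}, p={p:.3g}")

# Does SGNS predict society-view ratings for feminine actions?
slope, intercept, r, p_reg, se = stats.linregress(sgns, soc_fem)
print(f"SGNS -> society-view: b={slope:.2f}, p={p_reg:.3g}")
```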