Can shared experience and dialogue on social touch be orchestrated in playful smart public spaces? In smart city public spaces, where physical and virtual realities are currently merging, new forms of social connection, interfaces and experiences are being explored. Within art practice, such explorations include new forms of affective social communication, adding social and sensorial connections to enable and enhance empathic, intimate experience in playful smart public space. This chapter explores a novel design for shared intimate experience of playful social touch in three orchestrations of 'Saving Face', staged in the different cultural and geographical environments of smart city (semi-)public spaces in Beijing, Utrecht and Dessau-Berlin. These orchestrations are purposefully designed to create a radically unfamiliar sensory synthesis that disrupts the perception of 'who sees and who is being seen, who touches and who is being touched'. Participants playfully 'touch themselves and feel being touched, to connect with others on a screen'. All three orchestrations show that shared experience and dialogue on social touch can be mediated by playful smart city technologies in public spaces, but that this relies on the design of mediated, intimate and exposed forms of 'self-touch for social touch', ambivalent relations, exposure of dialogue, and hosting.
Can social engagement and reflection be designed through social touch in the public spaces of today's smart cities? This paper explores ludic, playful design for shared engagement and reflection in public spaces through social touch. In two Artistic Social Labs (ASLs), presented internationally in public spaces, a radically unfamiliar sensory synthesis is created, in which the perception of 'who sees and who is being seen, who touches and who is being touched' is disrupted. Participants playfully 'touch themselves and feel being touched, to connect with others on a screen'. On the basis of the findings in the ASLs, guidelines are proposed for orchestrating social engagement and reflection through social touch as play.
Can shared intimate experience of social touch be mediated through multi-brain computer interface (multi-brain BCI) interaction in public space? Two artistic EEG KISS orchestrations, both multi-modal, multi-brain BCIs, are shown to create novel shared experiences of social touch in public space. These orchestrations purposefully disrupt and translate known forms of face-to-face connection and sound, to re-orchestrate unfamiliar sensory syntheses of seeing, hearing, touching and moving, connected to data visualization and audification of brain activity. The familiar sensory relations between 'who you kiss and who is being kissed, what you see and what you hear' are captured in a model of digital synaesthetics in multi-modal, multi-brain BCI interaction for social touch. This model links hosted self-disclosure, witnessing, dialogue and reflection to intimate experience in public space through syntheses of the senses. As such, the model facilitates the design of new shared intimate experiences of multi-modal, multi-brain BCI interaction through social touch in public space.
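The abstract does not describe how brain activity is audified in the EEG KISS orchestrations. As a minimal, hypothetical sketch of audification in general (not the authors' pipeline), the Python snippet below writes an EEG-like signal to a sound file at an audio sampling rate, so that its slow rhythms are transposed into the audible range; the channel, rates, synthetic signal and file name are illustrative assumptions.

    # Minimal audification sketch (illustrative only; not the EEG KISS pipeline).
    # Assumption: one EEG channel sampled at 256 Hz; writing the same samples
    # at 44.1 kHz compresses time by ~172x and shifts a 10 Hz alpha rhythm
    # to roughly 1.7 kHz, making the activity audible.
    import numpy as np
    from scipy.io import wavfile

    EEG_RATE = 256        # assumed EEG sampling rate (Hz)
    AUDIO_RATE = 44100    # playback rate (Hz)
    DURATION_S = 120      # seconds of (synthetic) EEG to audify

    # Synthetic stand-in for recorded EEG: a 10 Hz alpha rhythm plus noise.
    t = np.arange(EEG_RATE * DURATION_S) / EEG_RATE
    eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(t.size)

    # Audification: keep the raw samples but declare the audio rate, so the
    # recording is simply played back faster rather than resampled.
    pcm = (eeg / np.max(np.abs(eeg)) * 32767).astype(np.int16)
    wavfile.write("eeg_audification.wav", AUDIO_RATE, pcm)

More elaborate sonification strategies (for example, mapping band power to pitch or timbre) are possible; the direct-playback approach above is only the simplest reading of "audification".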