For instance, in studies on affective communication in human-computer interaction (e.g., Tajadura-Jiménez and Västfjäll, 2008), the EmojiGrid could enable users to repeatedly report perceived affect (e.g., by projecting a pointer-controlled beam onto the grid). Such an application could also serve the affective annotation of multimedia (Chen et al., 2007; Soleymani et al., 2008; Runge et al., 2016; Suhaimi et al., 2018), personalized affective multimedia retrieval (e.g., through query-by-emoji: Cappallo et al., 2019), multimedia recommender systems (Hanjalic and Xu, 2005; Xu et al., 2008; Zhang et al., 2010; Lopatovska and Arapakis, 2011; Koelstra et al., 2012; Zhao et al., 2013), the real-time affective appraisal of multimedia entertainment (Fleureau et al., 2012), affective feedback in serious gaming applications (Anolli et al., 2010), and affective multimedia generation (e.g., music: Kim and André, 2004). We are currently implementing the EmojiGrid in a multisensory VR environment (the Sensiks Sensory Reality Pod: www.sensiks.com) as an interface that lets users select and adjust the desired multisensory (visual, auditory, tactile, and olfactory) affective experience.
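To make the continuous-reporting use case concrete, the following minimal sketch (not part of the original EmojiGrid implementation) shows how timestamped pointer positions on the grid could be converted into valence and arousal ratings for annotation logging. It assumes a normalized coordinate system with the horizontal axis mapping to valence and the vertical axis to arousal, in line with the EmojiGrid layout; the 1-9 rating scale, the function names, and the logging structure are illustrative choices, not a specification.

```python
import time
from dataclasses import dataclass


@dataclass
class AffectSample:
    timestamp: float  # seconds since epoch
    valence: float    # 1 (unpleasant) .. 9 (pleasant); hypothetical scale
    arousal: float    # 1 (calm) .. 9 (excited); hypothetical scale


def grid_to_affect(x_norm: float, y_norm: float,
                   lo: float = 1.0, hi: float = 9.0) -> tuple:
    """Map a normalized pointer position on the grid to (valence, arousal).

    x_norm and y_norm lie in [0, 1], with (0, 0) at the lower-left corner:
    x increases toward pleasant, y toward high arousal.
    """
    x = min(max(x_norm, 0.0), 1.0)  # clamp to the grid area
    y = min(max(y_norm, 0.0), 1.0)
    valence = lo + x * (hi - lo)
    arousal = lo + y * (hi - lo)
    return valence, arousal


def record_sample(log, x_norm: float, y_norm: float) -> None:
    """Append one timestamped affect report, e.g. per pointer event."""
    valence, arousal = grid_to_affect(x_norm, y_norm)
    log.append(AffectSample(time.time(), valence, arousal))


# Example: three successive pointer positions reported during a video clip.
log = []
for pos in [(0.8, 0.7), (0.5, 0.5), (0.2, 0.9)]:
    record_sample(log, *pos)
for s in log:
    print(f"t={s.timestamp:.2f}  valence={s.valence:.1f}  arousal={s.arousal:.1f}")
```

Sampling such reports at a fixed rate, or on every pointer movement, would yield the kind of continuous affect trace needed for annotating multimedia streams or driving a recommender system in real time.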