“…The movies' emotional impact on users is assessed while they watch, both to provide user feedback and to catalog the movies: through biosensors such as EEG, EDA, and a webcam capturing facial expressions; by having users engage in self-assessment and annotation of the movies, using different models or interfaces such as categorical emotions, the self-assessment manikin, and the emotional wheel (the latter two based on the valence-arousal (VA) dimensions) [23]; and by articulating with other project tasks, where content-based features are extracted from the videos, mostly from audio, subtitles, and image. This application also integrates our previous As Music Goes By [20,21,33], allowing users to search, visualize, and explore music and movies from the complementary perspectives of music versions, artists, quotes, and movie soundtracks. Next, we present and discuss our emotional model approach, which we aim to keep rich and expressive, yet effective, flexible, and easy to understand; and the movie visualization and search features based on emotional impact, both as a whole and over time, with multimodal interfaces for different contexts of use.…”
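To make the VA-based emotional model more concrete, the sketch below shows one minimal way such annotations could be represented and aggregated: categorical self-assessments are mapped to illustrative valence-arousal coordinates (loosely inspired by Russell's circumplex), and per-user annotations are summarized both as a whole and over time. All names, coordinates, and the windowed aggregation are assumptions for illustration, not the project's actual implementation.

```python
from dataclasses import dataclass
from statistics import mean

# Illustrative VA coordinates for a few categorical emotions, loosely
# following Russell's circumplex model; the project's actual mapping
# (if any) may differ. Values lie in [-1, 1] for valence and arousal.
CATEGORY_TO_VA = {
    "joy":     (0.8, 0.5),
    "fear":    (-0.6, 0.7),
    "sadness": (-0.7, -0.4),
    "calm":    (0.4, -0.6),
}

@dataclass
class Annotation:
    """One self-assessment: a timestamp (seconds into the movie) plus VA values."""
    time: float
    valence: float
    arousal: float

def from_category(time: float, label: str) -> Annotation:
    """Convert a categorical self-assessment into VA coordinates."""
    valence, arousal = CATEGORY_TO_VA[label]
    return Annotation(time, valence, arousal)

def movie_profile(annotations: list[Annotation], window: float = 60.0):
    """Summarize a movie's emotional impact as a whole (overall mean VA)
    and along time (mean VA per fixed-length time window)."""
    buckets: dict[int, list[Annotation]] = {}
    for ann in annotations:
        buckets.setdefault(int(ann.time // window), []).append(ann)
    timeline = {
        idx * window: (mean(a.valence for a in group),
                       mean(a.arousal for a in group))
        for idx, group in sorted(buckets.items())
    }
    overall = (mean(a.valence for a in annotations),
               mean(a.arousal for a in annotations))
    return overall, timeline

# Example: two categorical annotations collapsed into a movie profile.
profile, timeline = movie_profile(
    [from_category(30.0, "calm"), from_category(95.0, "fear")]
)
```

A per-window timeline like this is one simple basis for visualizing emotional impact along time and for searching movies by overall VA position; biosensor streams (EEG, EDA, facial expressions) could feed the same representation once converted to VA estimates.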