multi-modal sensory data relevant to a range of application domains and problem contexts where interpreting human behaviour is central. The overall motivation and driving theme of the special issue pertains to artificial intelligence-based methods and tools that may serve a foundational purpose toward the high-level semantic interpretation of large-scale, dynamic, multi-modal sensory data or data streams. A crucial focus of the special issue has been on foundational methods supporting the development of human-centred technologies and cognitive interaction systems aimed at assistance and empowerment, e.g. in everyday life and activity, and in professional problem solving involving creative and analytical decision-making, planning, etc.
Multi-Modal Event and Activity Interpretation

The multi-modality alluded to in the context of this special issue stems from the inherent synergistic value of the integrated processing and interpretation of a range of data sources that are common in cognitive interaction systems, computational cognition, and human-computer interaction scenarios. Multi-modal data sources that may be envisaged include, but are not limited to, one or more of the following:

• Visuo-spatial imagery:
  - Image, video, video and depth (RGB-D), point clouds.
  - Geospatial satellite imagery, remote sensing data, crowd-sourced data, survey data.
• Movement and interaction data:

Abstract This special issue presents interdisciplinary research, at the interface of artificial intelligence, cognitive science, and human-computer interaction, focussing on the semantic interpretation of human behaviour. The special issue constitutes an attempt to highlight and steer foundational methods research in artificial intelligence, in particular knowledge representation and reasoning, for the development of human-centred cognitive assistive technologies. Of specific interest and focus have been application outlets for basic research in knowledge representation and reasoning and in computer vision for the cognitive, behavioural, and social sciences.