Over recent years a consensus has grown that new-generation interfaces should turn their focus to the human element by enriching human-computer communication with an affective dimension. Affective generation of autonomous agent behaviour aspires to give computer interfaces emotional states that relate to and take into account both user and system-environment considerations: internally, through computational models of artificial hearts (emotion and personality), and externally, through believable multi-modal expression augmented with quasi-human characteristics. Computational models of affect address the problems of how agents arrive at a given affective state and how such states are expressed through natural multimodal communicative interaction. Much of this work targets the entertainment environment and generally does not address the real-time requirements of multi-agent systems, where behaviour changes dynamically based on agent goals as well as shared data and knowledge. This paper discusses one of the requirements for real-time realisation of Personal Service Assistant interface characters. We describe an operational approach to enabling the computational perception required for the automated generation of affective behaviour through inter-agent communication in multi-agent real-time environments. The research investigates the potential of extending current agent communication languages so that they not only convey the semantic content of knowledge exchange but also communicate affective attitudes about the shared knowledge. This provides a necessary component of the framework required for real-time autonomous agent development, with which we may bridge the gap between current research in psychological theory and practical implementation of social multi-agent systems.
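The idea of carrying affective attitudes alongside semantic content could be sketched, for illustration only, as a KQML-style performative extended with an affect slot. All field names here (e.g. `:affect`, `valence`, `arousal`) are assumptions for the sake of the sketch, not part of any standard agent communication language or of the approach described in this paper:

```python
from dataclasses import dataclass, field

@dataclass
class AffectiveMessage:
    """Hypothetical KQML-style message with an added affective-attitude slot."""
    performative: str                # e.g. "tell", "ask-if"
    sender: str
    receiver: str
    content: str                     # semantic content of the knowledge exchange
    affect: dict = field(default_factory=dict)  # attitude toward the content (illustrative)

def encode(msg: AffectiveMessage) -> str:
    """Render the message in a KQML-like s-expression form."""
    affect_str = " ".join(f":{k} {v}" for k, v in msg.affect.items())
    return (f"({msg.performative} :sender {msg.sender} :receiver {msg.receiver} "
            f':content "{msg.content}" :affect ({affect_str}))')

msg = AffectiveMessage(
    performative="tell",
    sender="assistant-1",
    receiver="user-proxy",
    content="meeting moved to 15:00",
    # Assumed dimensional encoding (valence/arousal) of the sender's attitude:
    affect={"valence": -0.4, "arousal": 0.7},
)
print(encode(msg))
```

The point of the sketch is that the affective annotation travels in the same message as the propositional content, so a receiving agent can appraise both in one step rather than inferring attitude from separate channels.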