“…Emotions are expressed through various indicators in humans, and many of these indicators have previously been analyzed to provide affective knowledge to machines, focusing on facial expressions [5], [6], vocal features [7], [8], [9], body movements and postures [10], [11], [12], [13], and the integration of all of them in emotion analysis systems [14], [15], [16]. However, humans cannot always expect robots to react in a timely and sensible manner, especially if the robots have not been able to recover all of the affective information through their sensors.…”