The paper presents research conducted within the framework of the STHENOS project (www.sthenos.gr), which aims to develop methodologies and tools for composing pervasive human-centered systems capable of understanding the human state (identity, emotions, and behavior) in assistive environments using audiovisual and biological signals. The proposed systems and applications offer services such as support for elderly, disabled, and chronically ill patients, detection of critical situations from audiovisual content, and biosignal and neurophysiological analysis for the detection of pathologies (e.g., Alzheimer's disease) and for treatment follow-up. The paper provides an overview of the three perspectives of human-centered computing studied in the STHENOS project, namely: audiovisual activity and status recognition, affective computing, and neurophysiological analysis.