This paper discusses natural communication for partner robots based on an emotional model driven by visual perception. Partner robots should have various capabilities of perceiving, acting, communicating, and surviving through physical and emotional interaction with a human. First, we discuss the role of an emotional model for a partner robot. Next, we propose the concept of natural communication based on the emotional model; the partner robot produces utterances according to the output of the emotional model. Furthermore, this paper proposes a method for extracting human facial landmarks to realize natural communication for partner robots. Evolutionary computation is used for human face detection and human tracking based on adaptive color template matching, in which a temporary template pattern is updated according to the search result in the previous image. After constructing a normalized human face image, the positions and features of facial landmarks are extracted. Finally, we show experimental results of emotional communication based on visual perception.
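As a rough illustration of the tracking step described above, the following is a minimal sketch (not the authors' implementation) of adaptive color template matching driven by a simple evolutionary search over candidate positions. Assumptions not taken from the paper: images are NumPy RGB arrays, the fitness is a plain mean absolute color difference, and the function names (`fitness`, `evolutionary_search`, `update_template`) and parameters (`alpha`, `sigma`, `pop_size`) are hypothetical; the actual method may use a different evolutionary operator set and similarity measure.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(image, template, x, y):
    """Negative mean color difference between the template and the image patch at (x, y)."""
    h, w, _ = template.shape
    patch = image[y:y + h, x:x + w].astype(np.float32)
    if patch.shape != template.shape:          # candidate falls outside the image
        return -np.inf
    return -np.abs(patch - template).mean()

def evolutionary_search(image, template, pop_size=30, generations=20, sigma=8):
    """Search for the best template position with a simple elitist evolutionary loop."""
    h, w, _ = template.shape
    H, W, _ = image.shape
    # Initial population: random top-left corners of candidate patches.
    pop = np.column_stack([rng.integers(0, W - w, pop_size),
                           rng.integers(0, H - h, pop_size)])
    for _ in range(generations):
        scores = np.array([fitness(image, template, x, y) for x, y in pop])
        elite = pop[np.argsort(scores)[-pop_size // 2:]]           # keep the better half
        parents = elite[rng.integers(0, len(elite), pop_size)]     # resample parents
        noise = rng.normal(0, sigma, parents.shape).astype(int)    # Gaussian mutation
        pop = np.clip(parents + noise, [0, 0], [W - w - 1, H - h - 1])
    scores = np.array([fitness(image, template, x, y) for x, y in pop])
    return tuple(pop[int(np.argmax(scores))])                      # best (x, y)

def update_template(image, template, best_xy, alpha=0.3):
    """Blend the temporary template toward the matched region of the current frame."""
    x, y = best_xy
    h, w, _ = template.shape
    matched = image[y:y + h, x:x + w].astype(np.float32)
    return (1 - alpha) * template + alpha * matched
```

A per-frame tracking loop would call `evolutionary_search` on each new frame and then pass the returned position to `update_template`, so that the temporary template follows the search result of the previous image, in the spirit of the adaptive template update described above.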