Looking at objects is one of the most basic sensorimotor behaviours and requires calibration of the perceptual and motor systems. Recently, we introduced a neural-dynamic architecture in which the sensorimotor transformations that lead to precise saccadic gaze shifts are initially learned and are autonomously updated whenever changes in the environment or in the motor plant of the agent require adaptation. Here, we demonstrate how allocentric, gaze-direction-independent memory representations may be formed in this architecture and how sequences of precise gaze shifts to memorised targets may be generated. Our simulated robotic experiments demonstrate the functioning of the architecture on an autonomous embodied agent.
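As a minimal sketch of the kind of gaze-direction-independent remapping referred to above (not the paper's neural-dynamic implementation; the function names and the simple additive remapping are our own illustration), a retinally coded target can be combined with the current gaze direction to form an allocentric memory, and a later saccade back to that memory subtracts whatever the gaze direction happens to be at recall time:

```python
import numpy as np

# Illustrative sketch only: the architecture in the paper realises these
# transformations with learned neural-dynamic fields, not explicit vector sums.

def to_allocentric(retinal_target, gaze_direction):
    """Remap a retinally coded target (deg) into gaze-independent coordinates."""
    return np.asarray(retinal_target) + np.asarray(gaze_direction)

def saccade_to_memory(allocentric_target, current_gaze):
    """Gaze-shift command (deg) that brings a memorised target back on-axis."""
    return np.asarray(allocentric_target) - np.asarray(current_gaze)

# Memorise a target seen 10 deg to the right while looking 5 deg to the left...
memory = to_allocentric(retinal_target=[10.0, 0.0], gaze_direction=[-5.0, 0.0])
# ...then, after an intervening gaze shift, saccade back to the memorised target.
command = saccade_to_memory(memory, current_gaze=[12.0, 3.0])
print(memory, command)  # [5. 0.] [-7. -3.]
```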