We are developing an embedded vision system for the humanoid robot iCub, inspired by the biology of the mammalian visual system, including concepts such as stimulus-driven, asynchronous signal sensing and processing. The system comprises stimulus-driven sensors, a dedicated embedded processor and an event-based software infrastructure for processing visual stimuli. These components are integrated with the standard machine vision modules currently implemented on the robot, in a configuration that exploits the best features of both: the high-resolution, color, frame-based vision of conventional cameras and the low redundancy, wide dynamic range and high temporal resolution of neuromorphic event-based sensors. This approach seeks to combine various styles of vision hardware with sensorimotor systems to complement and extend the current state of the art.
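To make the hybrid configuration concrete, the sketch below illustrates a DVS-style address event (timestamp, pixel address, polarity), the typical output of a neuromorphic event-based sensor, and a simple accumulation of events over a time window into an image that a conventional frame-based module could consume. The type and function names, field widths and sensor dimensions are illustrative assumptions for exposition only, not the actual iCub or YARP interfaces.

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Hypothetical DVS-style address event: the asynchronous, stimulus-driven
    // output of an event-based sensor (names and field widths are illustrative).
    struct AddressEvent {
        uint32_t timestamp;  // microseconds since sensor start
        uint16_t x, y;       // pixel address on the sensor array
        int8_t   polarity;   // +1: brightness increase, -1: brightness decrease
    };

    // Accumulate events falling in [tStart, tEnd) into a 2D count image, one
    // simple way to hand event data to a conventional frame-based pipeline.
    std::vector<int> accumulateEvents(const std::vector<AddressEvent>& events,
                                      uint32_t tStart, uint32_t tEnd,
                                      int width, int height) {
        std::vector<int> frame(width * height, 0);
        for (const auto& e : events) {
            if (e.timestamp >= tStart && e.timestamp < tEnd)
                frame[e.y * width + e.x] += e.polarity;
        }
        return frame;
    }

    int main() {
        // Two illustrative events on a small 8x8 sensor array.
        std::vector<AddressEvent> events = {
            {100, 3, 2, +1},
            {250, 3, 2, -1},
        };
        auto frame = accumulateEvents(events, 0, 200, 8, 8);
        std::printf("accumulated polarity at (3,2): %d\n", frame[2 * 8 + 3]);
        return 0;
    }

The design point conveyed by the sketch is that each event carries only the address and sign of a local brightness change, so redundant, unchanging regions of the scene generate no data, while the frame-based modules retain full-resolution color imagery for tasks that require it.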