Resource-constrained embedded devices that operate on images are becoming increasingly common. Examples include remote low-power smart sensors, wireless sensor networks, autonomous cameras, eye tracking devices, etc. The principal requirements of such devices are real-time operation, low power consumption, low heat dissipation, and a low MIPS budget. These requirements can be fulfilled by using approximated versions of the original image processing algorithms. The EyeDee™ embedded eye tracking solution (developed by SuriCog) is the world's first solution using the eye as a real-time mobile digital cursor while maintaining full mobility. Being an example of a resource-constrained embedded system, it consists of a wearable device (Weetsy™ frame) capturing images of the human eye and an embedded pre-processing device (Weetsy™ pre-processing board) sending these eye images over a transmission medium (wired or wireless) to a remote processing unit for further gaze reconstruction. This paper introduces image compression approaches for resource-constrained devices in general and describes aspects of their implementation on the Weetsy™ pre-processing board in particular.
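To make the capture/pre-process/transmit pipeline described above concrete, the sketch below shows a minimal per-frame loop of the kind such a pre-processing board might run. It is an illustrative assumption, not SuriCog's actual firmware: the camera and the wire/wireless link are stubbed (capture_frame, link_send are hypothetical names), and a simple 2×2 averaging downsample stands in for a real approximated compression step.

```c
/*
 * Illustrative sketch only (hypothetical names, not actual EyeDee™ code):
 * capture an eye image, reduce it with an approximated operation, and
 * hand the smaller payload to the transmission link.
 */
#include <stdint.h>
#include <stdio.h>

#define FRAME_W 160
#define FRAME_H 120

/* Stub: pretend to read one 8-bit grayscale frame from the eye camera. */
static void capture_frame(uint8_t img[FRAME_H][FRAME_W])
{
    for (int y = 0; y < FRAME_H; ++y)
        for (int x = 0; x < FRAME_W; ++x)
            img[y][x] = (uint8_t)((x ^ y) & 0xFF);   /* synthetic test pattern */
}

/* Approximated reduction: 2x2 block averaging quarters the payload at the
 * cost of resolution, the kind of accuracy/effort trade-off made to meet
 * real-time, low-power, low-MIPS constraints. */
static size_t reduce_frame(const uint8_t src[FRAME_H][FRAME_W],
                           uint8_t dst[FRAME_H / 2][FRAME_W / 2])
{
    for (int y = 0; y < FRAME_H; y += 2)
        for (int x = 0; x < FRAME_W; x += 2)
            dst[y / 2][x / 2] = (uint8_t)((src[y][x] + src[y][x + 1] +
                                           src[y + 1][x] + src[y + 1][x + 1]) / 4);
    return (size_t)(FRAME_H / 2) * (FRAME_W / 2);
}

/* Stub: pretend to push the payload onto the wire/wireless link toward the
 * remote unit that performs gaze reconstruction. */
static void link_send(const uint8_t *payload, size_t len)
{
    (void)payload;
    printf("sent %zu bytes to remote processing unit\n", len);
}

int main(void)
{
    static uint8_t raw[FRAME_H][FRAME_W];
    static uint8_t packed[FRAME_H / 2][FRAME_W / 2];

    /* One iteration of the per-frame loop; firmware would repeat it continuously. */
    capture_frame(raw);
    size_t n = reduce_frame(raw, packed);
    link_send(&packed[0][0], n);
    return 0;
}
```

In a real deployment the downsampling step would be replaced by one of the image compression approaches discussed in this paper, but the overall structure of the loop remains the same.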