Haptic interaction significantly enriches our experience with computers, and with cyberworlds in particular. However, it requires content creators to build physical (haptic) models of the virtual objects, which is not always possible or feasible, especially when real images or videos serve as elements of interaction. We therefore propose image-driven haptic rendering, in which a displayed image, real or simulated, is used as the source of force-feedback calculations at any pixel touched by the haptic device. We introduce the main idea and describe its implementation as a core algorithm for image-driven haptic rendering, as well as for several particular cases: haptic rendering of the dominant colors, textures, and contours of the objects displayed in the images. Implementations of the proposed method in a desktop tangible image application and in web-based haptic video communication are presented as a proof of concept.
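To make the per-pixel force-feedback idea concrete, the following is a minimal illustrative sketch (not the paper's actual algorithm): pixel intensity is treated as a height map, and the force returned for a touched pixel is derived from the local intensity gradient. The function name, the stiffness parameter, and the gradient-based force model are all assumptions introduced for illustration.

```python
import numpy as np

def haptic_force_at_pixel(image, x, y, stiffness=0.5):
    """Hypothetical sketch: treat grayscale intensity as a height map and
    return a 3D force at pixel (x, y). Lateral components push the haptic
    probe away from rising slopes; the normal component grows with
    "height". This is an illustrative model, not the paper's method."""
    h = image.astype(float) / 255.0      # normalized height map in [0, 1]
    gy, gx = np.gradient(h)              # intensity gradients (rows, cols)
    fx = -stiffness * gx[y, x]           # lateral force opposing the slope
    fy = -stiffness * gy[y, x]
    fz = stiffness * h[y, x]             # normal force from local "height"
    return np.array([fx, fy, fz])

# Synthetic test image: a bright disc ("bump") on a dark background.
yy, xx = np.mgrid[0:64, 0:64]
img = (255 * np.exp(-((xx - 32)**2 + (yy - 32)**2) / 200.0)).astype(np.uint8)

# Touching left of the bump's center: the force pushes the probe leftward
# (negative x) and upward (positive z), as if sliding off a raised surface.
f = haptic_force_at_pixel(img, 20, 32)
```

In a real system this computation would run inside the haptic servo loop (typically at ~1 kHz), with the image sampled at the device's current projected position.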