The rapid growth of haptic data significantly improves users' immersion in multimedia interaction. As a result, the study of the Haptic Audio-Visual Environment (HAVE) has attracted the attention of the multimedia community. A challenging task in realizing such a system is the synchronization of multiple sensorial signals, which is critical to user experience. Despite extensive efforts on audio-visual synchronization, there is still a lack of haptic-aware multimedia synchronization models. In this work, we propose a timestamp-independent synchronization method for haptic-visual signal transmission. First, we exploit the sequential correlations during delivery and playback in a haptic-visual communication system. Second, we develop a key-sample extraction for haptic signals based on force-feedback characteristics and a key-frame extraction for visual signals based on deep object detection. Third, we align the key samples and key frames to synchronize the corresponding haptic-visual signals. Without relying on timestamps in the signal flow, the proposed method remains effective and is more robust to complicated network conditions. Subjective evaluation also shows a significant improvement in user experience with the proposed method.
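To make the timestamp-independent idea more concrete, the minimal Python sketch below illustrates one possible realization of the pipeline summarized above: haptic key samples are taken as force-magnitude peaks (a simplified stand-in for the force-feedback criterion), visual key frames are taken from the confidence scores of an unspecified object detector, and the two ordered event sequences are paired to estimate a playback offset without any timestamps. All function names, thresholds, and rates are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def extract_key_samples(force, threshold=1.0):
    """Haptic key samples: local peaks of force magnitude above a threshold
    (a simplified stand-in for the force-feedback-based criterion)."""
    mag = np.linalg.norm(force, axis=1)  # per-sample force magnitude
    peaks = [i for i in range(1, len(mag) - 1)
             if mag[i] > threshold and mag[i] >= mag[i - 1] and mag[i] > mag[i + 1]]
    return np.array(peaks)

def extract_key_frames(detector_scores, threshold=0.8):
    """Visual key frames: frames where a (hypothetical) object detector
    reports a contact-relevant object with high confidence."""
    return np.flatnonzero(np.asarray(detector_scores) > threshold)

def estimate_offset(key_samples, key_frames, haptic_rate, frame_rate):
    """Pair key events in order and estimate the haptic-to-video playback
    offset in seconds, so no timestamps are needed in the signal flow."""
    n = min(len(key_samples), len(key_frames))
    if n == 0:
        return 0.0
    t_haptic = key_samples[:n] / haptic_rate
    t_video = key_frames[:n] / frame_rate
    return float(np.median(t_video - t_haptic))

# Toy usage: a 1 kHz force stream and a 30 fps detector-score stream.
rng = np.random.default_rng(0)
force = rng.normal(0, 0.1, size=(2000, 3))
force[500] = force[1500] = [0.0, 0.0, 2.0]   # two simulated contact events
scores = np.zeros(60)
scores[18], scores[48] = 0.9, 0.95           # detector fires near the contacts
offset = estimate_offset(extract_key_samples(force),
                         extract_key_frames(scores),
                         haptic_rate=1000.0, frame_rate=30.0)
print(f"estimated playback offset: {offset:.3f} s")
```

In this toy example the two streams disagree by 0.1 s, and the median over paired key events recovers that offset; the ordinal pairing is what allows the receiver to resynchronize even when network jitter makes arrival order and delay unreliable.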