When dealing with large-scale real-world images, online hashing provides an efficient scheme for fast retrieval and compact storage. It converts high-dimensional streaming data into compact binary hash codes while preserving the structural relationships between samples in the Hamming space. Existing works usually update the hashing function based on the similarity among input data, or design a codebook that assigns code words to each individual input sample. However, assigning code words to multiple samples while retaining the balanced similarity of the image instances remains challenging. To address this issue, we propose a novel discriminative similarity-balanced online hashing (DSBOH) framework. In particular, we first obtain a Hadamard codebook that guides the generation of discriminative binary codes according to label information. Then, we maintain the correlation between newly arriving data and previously arrived data through a balanced similarity matrix, which is also generated from semantic information. Finally, we combine the Hadamard codebook and the balanced similarity matrix into a unified hashing function to simultaneously preserve discrimination and balanced similarity. The proposed method is solved by an alternating optimization technique. Extensive experiments on the CIFAR-10, MNIST, and Places205 datasets demonstrate that the proposed DSBOH outperforms several state-of-the-art online hashing methods in terms of both effectiveness and efficiency.
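To make the codebook idea concrete, the sketch below illustrates one common way a Hadamard codebook can be built and used to look up per-class target codes; it is a minimal illustration under our own assumptions (the helper name `build_hadamard_codebook` and the use of `scipy.linalg.hadamard` are hypothetical choices), not the authors' implementation.

```python
import numpy as np
from scipy.linalg import hadamard

def build_hadamard_codebook(num_classes, code_length, seed=0):
    """Illustrative sketch: assign each class a distinct row of a Hadamard
    matrix as its target binary code (entries in {-1, +1}). Rows of a
    Hadamard matrix are mutually orthogonal, so the per-class target codes
    stay well separated in Hamming space."""
    # Hadamard matrices exist for powers of two; round up to the next
    # power of two large enough for both the code length and class count,
    # then truncate columns to the desired code length.
    n = 1 << (max(code_length, num_classes) - 1).bit_length()
    H = hadamard(n)
    rng = np.random.default_rng(seed)
    rows = rng.choice(n, size=num_classes, replace=False)
    return H[rows, :code_length]        # shape: (num_classes, code_length)

# Usage: look up the target code word for each labeled streaming sample.
codebook = build_hadamard_codebook(num_classes=10, code_length=32)
labels = np.array([3, 7, 1])            # labels of a newly arrived data chunk
target_codes = codebook[labels]         # (3, 32) target codes in {-1, +1}
```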