Supervised cross-modal hashing has attracted considerable attention recently. However, most existing methods learn binary codes or hash functions in a batch-based scheme, which is inefficient in an online scenario, i.e., when data points arrive in a streaming fashion. Online hashing is a promising solution; however, several challenges remain, e.g., how to effectively exploit semantic information, how to solve the binary optimization problem discretely, and how to efficiently update hash codes and hash functions. To address these issues, in this paper, we propose a novel supervised online cross-modal hashing method, i.e., Label EMbedding ONline hashing, LEMON for short. It builds a label embedding framework including label similarity preserving and label reconstructing, which can generate discriminative binary codes and reduce the computational complexity. Furthermore, it not only preserves the pairwise similarity of incoming data, but also establishes a connection between newly arriving data and existing data through inner product minimization on a block similarity matrix. In light of this, it can exploit more similarity information and make the optimization less sensitive to incoming data, leading to effective binary codes. In addition, we design a discrete optimization algorithm to solve the binary optimization problem without relaxation, thereby reducing the quantization error. Moreover, its computational complexity depends only on the size of the incoming data, making it efficient and scalable to large-scale datasets. Extensive experimental results on three benchmark datasets demonstrate that LEMON outperforms several state-of-the-art offline and online cross-modal hashing methods in terms of both accuracy and efficiency.

CCS CONCEPTS
• Computing methodologies → Learning paradigms; • Information systems → Multimedia and multimodal retrieval.
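For concreteness, the inner product minimization on a block similarity matrix mentioned in the abstract can be illustrated by a generic online similarity-preserving objective of the following form; the notation ($B^{(t)}$, $\tilde{B}^{(t-1)}$, $S^{(t)}$, $r$, $n_t$) is illustrative only and does not necessarily match LEMON's exact formulation:

% Illustrative sketch, not the exact LEMON objective.
% B^{(t)} (n_t x r): binary codes of the newly arriving chunk, to be learned;
% \tilde{B}^{(t-1)}: codes of previously seen data, kept fixed;
% S^{(t)}: block similarity matrix between new data and all data; r: code length.
\begin{equation*}
\min_{B^{(t)} \in \{-1,+1\}^{n_t \times r}}
\left\| \, r S^{(t)} \;-\; B^{(t)}
\begin{bmatrix} B^{(t)} \\ \tilde{B}^{(t-1)} \end{bmatrix}^{\!\top}
\right\|_F^2
\end{equation*}

Under this sketch, only the block involving $B^{(t)}$ is optimized at round $t$, so the per-round cost scales with the size of the incoming chunk rather than with the whole database.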