With the rapid development of brain-computer interfaces (BCIs), human visual decoding, an important research direction for BCIs, has attracted substantial attention. However, most visual decoding studies have focused on decoding graphics and images. In this paper, we first demonstrate the feasibility of a new task-irrelevant, simple, fast-stimulus BCI experimental paradigm that relies on visual evoked potentials (VEPs) elicited during colour observation. Additionally, features of visual colour information were identified through reliable real-time decoding. We recruited 9 subjects without colour blindness, who were asked to observe red, green, and blue screens in turn with an interstimulus interval of 1 second. The machine learning results showed that the visual colour classification accuracy reached a maximum of 93.73%. The latencies evoked by the visual colour stimuli fell within the P300 range: 176.8 milliseconds for the red screen, 206.5 milliseconds for the green screen, and 225.3 milliseconds for the blue screen. These experimental results show that VEPs can be used for reliable real-time colour decoding.
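To make the decoding setup concrete, the following is a minimal sketch of a three-class colour classification pipeline over single-trial VEP epochs. The abstract does not specify the classifier or feature extraction used, so the standardised-flattened-epoch features, linear SVM, and synthetic placeholder data below are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch: 3-class colour decoding from single-trial EEG epochs.
# The classifier and features here (flattened epochs + linear SVM) are
# assumptions for illustration; the paper's actual pipeline is not stated.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: epochs of shape (n_channels, n_samples); real data
# would be VEP epochs time-locked to colour-screen onset.
n_trials, n_channels, n_samples = 180, 32, 200
X = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 3, size=n_trials)  # 0 = red, 1 = green, 2 = blue

clf = make_pipeline(
    StandardScaler(),        # z-score each flattened feature across trials
    SVC(kernel="linear"),    # linear decision boundary between colours
)

# Flatten each epoch into a feature vector and estimate accuracy with
# 5-fold cross-validation.
scores = cross_val_score(clf, X.reshape(n_trials, -1), y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.3f}")
```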