Augmented Reality (AR) technologies enhance the real world by integrating contextual digital information about physical entities. However, inconsistencies between physical reality and digital augmentations, arising from errors in the visualized information or mismatches with the user's mental context, can considerably degrade the user experience. This study characterizes the brain dynamics associated with processing incongruent information in an AR environment. We designed an interactive paradigm in which participants manipulated a Rubik's cube serving as a physical referent. Congruent and incongruent information about the cube's current state was presented via symbolic (digits) and non-symbolic (graphs) stimuli, allowing us to examine the impact of different means of data representation. The analysis of electroencephalographic (EEG) signals from 19 participants revealed centro-parietal N400 and P600 components following the processing of incongruent information, with significantly longer latencies for non-symbolic stimuli. Additionally, we explored the feasibility of exploiting incongruency effects for brain-computer interfaces. To this end, we implemented decoders based on linear discriminant analysis, support vector machines, and EEGNet, achieving comparable performance with all methods. Successfully decoding incongruency-induced modulations can inform a system about the user's current mental state without requiring explicit input, enabling more coherent and contextually appropriate AR interactions.