Biometric signals have been extensively used for user identification and authentication because they exhibit characteristics that are unique to each person. The variation between the brain signals (EEG) of different people makes such signals especially suitable for biometric user identification. However, the characteristics of these signals are also influenced by the user's current condition, including their affective state. In this paper, we analyze the significance of the affect-related component of brain signals in the context of subject identification. Consistent results across three different public datasets suggest that the dominant component of the signal is subject-related, but that the affective state also contributes and influences identification accuracy. Results show that identification accuracy increases when the system has been trained with EEG recordings corresponding to affective states similar to that of the sample to be identified. This improvement holds independently of the features and classification algorithm used, and it is generally above 10% in a rigorous setting where the training and validation datasets share no data from the same recording days. This finding highlights the potential benefit of considering affective information in applications that require subject identification, such as user authentication.
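
The evaluation protocol described above can be illustrated with a minimal sketch. The code below is not the authors' implementation: the synthetic data generator, the SVM classifier, and all variable names are illustrative assumptions (the paper reports that the effect holds across different features and classifiers). It contrasts affect-matched against affect-mismatched training under a day-disjoint train/test split, mirroring the rigorous setting where training and validation share no recording days.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for EEG feature vectors (hypothetical dimensions):
# 10 subjects x 2 affective states x 4 recording days x 20 samples, 32 features.
n_subj, n_affect, n_days, n_samp, n_feat = 10, 2, 4, 20, 32
subj_sig = rng.normal(size=(n_subj, n_feat))            # subject-related component (dominant)
affect_sig = 0.5 * rng.normal(size=(n_affect, n_feat))  # weaker affect-related component

X, subject, affect, day = [], [], [], []
for s in range(n_subj):
    for a in range(n_affect):
        for d in range(n_days):
            feats = subj_sig[s] + affect_sig[a] + rng.normal(size=(n_samp, n_feat))
            X.append(feats)
            subject += [s] * n_samp
            affect += [a] * n_samp
            day += [d] * n_samp
X = np.vstack(X)
subject, affect, day = map(np.array, (subject, affect, day))

# Rigorous split: training and test sets come from different recording days.
train = day < 3
test = ~train

def identify(train_mask, test_mask):
    """Train a subject-identification classifier and return test accuracy."""
    clf = make_pipeline(StandardScaler(), SVC())
    clf.fit(X[train_mask], subject[train_mask])
    return clf.score(X[test_mask], subject[test_mask])

# Affect-matched: training samples share the test samples' affective state.
matched = np.mean([identify(train & (affect == a), test & (affect == a))
                   for a in range(n_affect)])
# Affect-mismatched: training samples come from a different affective state.
mismatched = np.mean([identify(train & (affect != a), test & (affect == a))
                      for a in range(n_affect)])

print(f"affect-matched identification accuracy:    {matched:.3f}")
print(f"affect-mismatched identification accuracy: {mismatched:.3f}")
```

Under these assumptions, the matched condition scores higher because the affect-related component is consistent between training and test data; the size of the gap depends entirely on the synthetic signal strengths chosen here, not on the paper's reported figures.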