The study of emotions has gained interest in the field of sensory and consumer research. More accurate information can be obtained by studying physiological behavior alongside self-reported responses. The aim of this study was to identify physiological and self-reported responses towards visual stimuli and to predict self-reported responses using biometrics. Panelists (N = 63) were exposed to 12 images (ten from the Geneva Affective PicturE Database (GAPED), two based on common fears) and a questionnaire (Face scale and EsSense). Emotions from facial expressions (FaceReader™), heart rate (HR), systolic pressure (SP), diastolic pressure (DP), and skin temperature (ST) were analyzed. Multiple regression analysis was used to predict self-reported responses from biometrics. Results showed that physiological responses, together with self-reported responses, separated the images by cluster analysis into positive, neutral, and negative categories, consistent with the GAPED classification. Emotional terms with high or low valence were predicted by a general linear regression model using biometrics, whereas calm, which lies at the center of the dimensional model of emotion, was not predicted. When images were analyzed by category, all emotional terms could be predicted within the positive and neutral categories, while the negative category predicted only Happy, Sad, and Scared. HR predicted emotions in the positive (R² = 0.52 for Scared) and neutral (R² = 0.55 for Sad) categories, while ST predicted emotions in positive images (R² = 0.55 for Sad; R² = 0.45 for Calm).
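The multiple regression approach described above can be sketched as follows. This is a minimal illustration, not the authors' actual analysis: all data are synthetic, and the predictor names (HR, SP, DP, ST) and coefficients are assumptions chosen only to show how a self-reported emotion score would be regressed on biometric measurements and an R² computed.

```python
# Hypothetical sketch of multiple linear regression predicting a
# self-reported emotion score from biometrics (HR, SP, DP, ST).
# All data below are synthetic and illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 63  # number of panelists, matching the study's sample size

# Synthetic biometric predictors with plausible means/spreads
X = np.column_stack([
    rng.normal(72, 8, n),    # HR, heart rate (bpm)
    rng.normal(120, 10, n),  # SP, systolic pressure (mmHg)
    rng.normal(80, 7, n),    # DP, diastolic pressure (mmHg)
    rng.normal(33, 1, n),    # ST, skin temperature (deg C)
])

# Synthetic self-reported score generated from known coefficients + noise
true_beta = np.array([0.05, 0.02, -0.03, 0.4])
y = X @ true_beta + 1.5 + rng.normal(0, 0.2, n)

# Fit y = b0 + X @ beta by ordinary least squares
A = np.column_stack([np.ones(n), X])  # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination (R^2) of the fitted model
y_hat = A @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

In the study itself, separate models of this form would be fitted per emotional term and per image category, which is how category-specific R² values such as 0.52 for Scared arise.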