Tactile-enhanced multimedia is generated by synchronizing traditional multimedia clips with an electric heater and a fan to produce hot and cold air effects. The objective is to give viewers a more realistic and immersive experience of the multimedia content. The response to this enhanced multimedia content (mulsemedia) is evaluated in terms of appreciation/emotion using human brain signals. We observe and record electroencephalography (EEG) data using a commercially available four-channel MUSE headband. A total of 21 participants voluntarily took part in this study for the EEG recordings. We extract frequency domain features from five different bands of each EEG channel. Four emotions, namely happy, relaxed, sad, and angry, are classified using a support vector machine in response to the tactile-enhanced multimedia. An accuracy of 76.19% is achieved with the frequency domain features, compared to 63.41% with the time domain features. Our results show that the selected frequency domain features could be better suited for emotion classification in mulsemedia studies.
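
To make the described pipeline concrete, the sketch below illustrates one plausible way to compute band-power features over five frequency bands of a four-channel EEG recording and classify four emotion labels with a support vector machine. The band limits, sampling rate, helper names, and SVM settings are illustrative assumptions for exposition, not the authors' exact implementation.

```python
# Hypothetical sketch: band-power features from 4-channel EEG + SVM classifier.
# Band limits, sampling rate, and names are assumptions, not the paper's code.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 256  # assumed MUSE sampling rate in Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(epoch):
    """epoch: array of shape (4, n_samples) -> 20 log band-power features."""
    feats = []
    for channel in epoch:
        freqs, psd = welch(channel, fs=FS, nperseg=FS * 2)
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(np.trapz(psd[mask], freqs[mask]))  # integrate PSD over band
    return np.log(np.array(feats) + 1e-12)  # log power stabilizes the scale

def classify_emotions(epochs, labels):
    """epochs: (n_trials, 4, n_samples); labels: 0..3 for happy/relaxed/sad/angry."""
    X = np.array([band_power_features(e) for e in epochs])
    clf = SVC(kernel="rbf", C=1.0)
    return cross_val_score(clf, X, labels, cv=5).mean()
```

Under these assumptions, each trial yields 20 features (4 channels x 5 bands), and cross-validated SVM accuracy can then be compared against an analogous time domain feature set.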