Determining a person's mood is an important step in human-robot interaction. In this paper, we propose a human-inspired approach in which changes in emotions, elicited through emotion induction, are used to determine a person's mood. Emotion induction, which can be performed through robot actions or by showing video clips, stimulates changes in the person's emotions and thereby reduces the observation time needed to estimate mood. Because these emotional changes are biased by the person's mood, a robot can use them to infer that mood. To do so, we induced happy emotions by showing a comical clip and measured the intensities of the happy and sad emotions as well as the intensity of the neutral state. We then extracted a feature set, including both time-domain and frequency-domain features, which is used to determine the person's mood. The approach has been implemented and compared to a no-emotion-induction baseline, showing better results. Based on the classification results, the approach distinguishes good from bad moods with an accuracy of 91.5% and a mean absolute error of 0.1, although the neutral state was not distinguished as reliably.
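As a rough illustration of the feature pipeline sketched above, one might compute time- and frequency-domain statistics over the measured emotion-intensity signals and feed them to a classifier. The specific features, sampling rate, and the use of an SVM in the snippet below are assumptions for illustration, not the paper's exact method.

```python
# Hypothetical sketch: extract time- and frequency-domain features from
# per-frame emotion intensities (happy, sad, neutral) and classify mood.
# Feature choices, sampling rate, and the SVM classifier are assumptions.
import numpy as np
from numpy.fft import rfft, rfftfreq
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def extract_features(intensities, fs=10.0):
    """intensities: array of shape (n_frames, 3) holding happy, sad, and
    neutral intensities per frame; fs: assumed sampling rate in Hz."""
    feats = []
    for ch in range(intensities.shape[1]):
        x = intensities[:, ch]
        # Time-domain statistics of the intensity signal.
        feats += [x.mean(), x.std(), x.max() - x.min(),
                  np.mean(np.abs(np.diff(x)))]
        # Frequency-domain features from the magnitude spectrum.
        power = np.abs(rfft(x - x.mean())) ** 2
        freqs = rfftfreq(len(x), d=1.0 / fs)
        feats += [power.sum(),                                # total power
                  (freqs * power).sum() / (power.sum() + 1e-12)]  # centroid
    return np.array(feats)

# Toy usage with synthetic recordings; labels: 1 = good mood, 0 = bad mood.
rng = np.random.default_rng(0)
X = np.array([extract_features(rng.random((200, 3))) for _ in range(40)])
y = rng.integers(0, 2, size=40)
clf = make_pipeline(StandardScaler(), SVC()).fit(X, y)
print(clf.predict(X[:5]))
```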