Touch is an important nonverbal form of interpersonal interaction that is used to communicate emotions and other social messages. As interactions with social robots are likely to become more common in the near future, these robots should also be able to engage in tactile interaction with humans. Therefore, the aim of the research presented in this dissertation is to work towards socially intelligent robots that can understand and respond to human touch. To become a socially intelligent actor, a robot must be able to sense, classify and interpret human touch and respond to it in an appropriate manner. To this end we present work that addresses different parts of this interaction cycle.

After the introduction in Part I of the dissertation, we have taken a data-driven approach in Part II. We have focused on the sense and classify steps of the interaction cycle to automatically recognize social touch gestures such as pat, stroke and tickle from pressure sensor data.

In Chapter 2 we present CoST: Corpus of Social Touch, a dataset containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants (gentle, normal and rough) on a pressure-sensitive mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies of up to 60%; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions and gesture variants affected the recognition accuracy.

In Chapter 3 we describe the outcome of a machine learning challenge on touch gesture recognition. This challenge was issued to the research community working on multimodal interaction, with the goal of sparking interest in the touch modality and promoting the exploration of data processing techniques from other, more mature modalities for touch recognition.
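The classify step described for Chapter 2 (summary features extracted from pressure sensor captures, fed to a standard classifier) can be illustrated with a minimal sketch. This is not the dissertation's actual pipeline: the 8x8 sensor grid and the gesture names echo the CoST setup, but the synthetic data, the feature set (mean, maximum, standard deviation of pressure, and capture length) and the choice of a random forest are assumptions made here for demonstration only.

```python
# Illustrative sketch of touch gesture classification from pressure data.
# All data here is synthetic; features and classifier are assumptions,
# not the method used in the dissertation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical base pressure levels per gesture, used only to synthesize data.
BASE_PRESSURE = {"pat": 40.0, "stroke": 20.0, "tickle": 10.0}

def synth_capture(gesture):
    """Generate a fake capture: a sequence of frames over an 8x8 pressure grid."""
    n_frames = int(rng.integers(20, 60))
    return BASE_PRESSURE[gesture] + rng.normal(0.0, 5.0, size=(n_frames, 8, 8))

def features(capture):
    """Per-capture summary features: mean, max, std of pressure, and duration."""
    return np.array([capture.mean(), capture.max(), capture.std(), len(capture)])

gestures = ["pat", "stroke", "tickle"]
X, y = [], []
for label, g in enumerate(gestures):
    for _ in range(60):
        X.append(features(synth_capture(g)))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"test accuracy: {acc:.2f}")
```

On real sensor data the gesture classes overlap far more than in this toy example, which is why accuracies in Chapter 2 top out around 60% rather than near-perfect.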
Two datasets were made available containing labeled pressure sensor data of social touch gestures: the CoST dataset presented in Chapter 2 and the Human-Animal Affective Robot Touch (HAART) gesture set. The most important outcomes of the challenge were: (1) transferring techniques from other modalities, such as image processing, speech and human action recognition, provided valuable feature sets; (2) gesture classification confusions were similar despite the various data processing methods that were used.

In Part III of the dissertation we present three studies on the use of social touch in interaction with robot pets. We have mainly focused on the interpret and respond steps of the interaction cycle to identify which touch gestures a robot pet should understand, how touch can be interpreted within a social context, and in which ways a robot can respond to human touch.

In Chapter 4 we present a study whose aim was to gain more insight into the factors that are relevant to interpreting the meaning of touch within a social context. We elicited touch behaviors by letting participants interact with a robot pet com...