In this study we evaluate the convergent validity of a new graphical self-report tool (the EmojiGrid) for the affective appraisal of perceived touch events. The EmojiGrid is a square grid labeled with facial icons (emoji) expressing different levels of valence and arousal. The EmojiGrid is language independent and efficient (a single click suffices to report both valence and arousal), making it a practical instrument for studies on affective appraisal. We previously showed that participants can intuitively and reliably report their affective appraisal (valence and arousal) of visual, auditory, and olfactory stimuli using the EmojiGrid, even without additional (verbal) instructions. However, because touch events can be bidirectional and dynamic, these earlier results cannot simply be generalized to the touch domain. In this study, participants reported their affective appraisal of video clips showing different interpersonal (social) and object-based touch events, using either the validated 9-point SAM (Self-Assessment Manikin) scale or the EmojiGrid. The valence ratings obtained with the EmojiGrid and the SAM are in excellent agreement. The arousal ratings show good agreement for object-based touch and moderate agreement for social touch. For social touch, and at more extreme levels of valence, the EmojiGrid appears more sensitive to arousal than the SAM. We conclude that the EmojiGrid can also serve as a valid and efficient graphical self-report instrument for measuring the human affective response to a wide range of tactile stimuli.