Is this joke really funny? Judging the mirth by audiovisual laughter analysis

S. Petridis, Maja Pantic

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    25 Citations (Scopus)
    225 Downloads (Pure)

    Abstract

    This paper presents the results of an empirical study suggesting that, while laughter is a very good indicator of amusement, the kind of laughter (unvoiced laughter vs. voiced laughter) is correlated with the mirth of laughter and could potentially be used to judge the actual hilarity of the stimulus joke. For this study, an automated method for audiovisual analysis of laughter episodes exhibited while watching movie clips or observing the behaviour of a conversational agent has been developed. The audio and visual features, based on spectral properties of the acoustic signal and facial expressions respectively, have been integrated using feature-level fusion, resulting in a multimodal approach to distinguishing voiced laughter from unvoiced laughter and speech. The classification accuracy of such a system tested on spontaneous laughter episodes is 74%. Finally, preliminary results are presented which provide evidence that unvoiced laughter can be interpreted as less gleeful than voiced laughter, and consequently the detection of these two types of laughter can be used to label multimedia content as mildly funny or very funny respectively.
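    The feature-level fusion described in the abstract combines both modalities into a single feature vector per observation before any classification takes place (as opposed to decision-level fusion, which combines classifier outputs afterwards). A minimal sketch of this idea, with illustrative feature names and dimensions that are assumptions, not taken from the paper:

    ```python
    import numpy as np

    # Hypothetical per-frame features (dimensions are illustrative only):
    # audio_feats  - spectral features of the acoustic signal
    # visual_feats - facial-expression features tracked from video
    rng = np.random.default_rng(0)
    n_frames = 6
    audio_feats = rng.normal(size=(n_frames, 13))   # e.g. cepstral coefficients
    visual_feats = rng.normal(size=(n_frames, 20))  # e.g. facial point features

    # Feature-level fusion: concatenate the two modalities into one joint
    # vector per frame; a single classifier is then trained on this vector.
    fused = np.concatenate([audio_feats, visual_feats], axis=1)
    print(fused.shape)  # (6, 33)
    ```

    A downstream classifier (the paper does not specify one here) would then be trained on `fused` to separate voiced laughter, unvoiced laughter, and speech.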
    Original language: Undefined
    Title of host publication: IEEE International Conference on Multimedia and Expo (ICME'09)
    Place of Publication: Los Alamitos
    Publisher: IEEE
    Pages: 1444-1447
    Number of pages: 4
    ISBN (Print): 978-1-4244-4291-1
    DOIs
    Publication status: Published - 2009
    Event: IEEE International Conference on Multimedia and Expo, ICME 2009 - New York, NY, USA
    Duration: 28 Jun 2009 - 3 Jul 2009

    Publication series

    Publisher: IEEE Computer Society Press
    ISSN (Print): 1945-788X

    Conference

    Conference: IEEE International Conference on Multimedia and Expo, ICME 2009
    Period: 28/06/09 - 3/07/09
    Other: 28 June - 3 July 2009

    Keywords

    • METIS-264325
    • IR-69560
    • Implicit content based indexing
    • EWI-17211
    • audiovisual laughter detection
    • HMI-MI: MULTIMODAL INTERACTIONS
    • HMI-HF: Human Factors
    • EC Grant Agreement nr.: FP7/211486
