Human Computing and Machine Understanding of Human Behavior: A Survey

Maja Pantic, Alex Pentland, Antinus Nijholt, Thomas Huang

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    124 Citations (Scopus)
    206 Downloads (Pure)

    Abstract

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next-generation computing, which we will call human computing, should be about anticipatory user interfaces that are human-centered, built for humans and based on human models. These interfaces should transcend the traditional keyboard and mouse to include natural, human-like interactive functions, including understanding and emulating certain human behaviors such as affective and social signaling. This article discusses a number of components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, how far we are from enabling computers to understand human behavior.
    Original language: Undefined
    Title of host publication: ACM SIGCHI Proceedings Eighth International Conference on Multimodal Interfaces
    Editors: F. Quek, Jie Yang
    Place of Publication: New York
    Publisher: Association for Computing Machinery (ACM)
    Pages: 239-248
    Number of pages: 10
    ISBN (Print): 1-59593-541-X
    DOIs
    Publication status: Published - Nov 2006
    Event: 8th International Conference on Multimodal Interfaces, ICMI 2006 - Banff, Canada
    Duration: 2 Nov 2006 - 4 Nov 2006
    Conference number: 8

    Publication series

    Name
    Publisher: ACM
    Number: suppl 2

    Conference

    Conference: 8th International Conference on Multimodal Interfaces, ICMI 2006
    Abbreviated title: ICMI
    Country: Canada
    City: Banff
    Period: 2/11/06 - 4/11/06

    Keywords

    • METIS-237803
    • EC Grant Agreement nr.: FP6/506811
    • EWI-8625
    • IR-66738
