A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. To realise this prediction, next-generation computing should develop anticipatory user interfaces that are human-centred: built for humans and based on naturally occurring multimodal human communication. These interfaces should transcend the traditional keyboard and mouse, and should have the capacity to understand and emulate human communicative intentions as expressed through behavioural cues, such as affective and social signals. This article discusses how close we are to the goal of human-centred computing and Human-Centred Intelligent Human-Computer Interaction (HCI²) that can understand and respond to multimodal human communication.
- Number of pages: 20
- Journal: International Journal of Autonomous and Adaptive Communications Systems
- Publication status: Published - 1 Aug 2008
- EC Grant Agreement nr.: FP7/211486
- EC Grant Agreement nr.: FP6/0027787
- HMI-MI: MULTIMODAL INTERACTIONS