Abstract
In this talk we survey recent research views on non-traditional brain-computer interfaces (BCI), that is, interfaces that process brain activity input but are designed for the 'general population' rather than for clinical purposes. Control of applications can be made more robust by fusing brain activity information with other information, either explicitly provided by the user (such as commands) or extracted by interpreting his or her behavior (movements, posture, gaze, facial expression, nonverbal speech) or by sensing (neuro-)physiological characteristics (skin conductivity, heart rate, brain activity). Traditional BCI research, guided by clinical applications and focused on patients who have no means other than their brain activity for control and communication, has neglected a possible role for BCI in multimodal interaction with intelligent devices. We will emphasize such a role and will also look at some roadmaps for BCI research (2020 and beyond) and at how their authors see the development of BCI for intelligent systems.
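As a minimal illustration of the fusion idea mentioned in the abstract, the Python sketch below combines per-modality command probabilities by weighted averaging (decision-level fusion). The modalities, command set, and weights are illustrative assumptions, not details from the talk.

```python
import numpy as np

# Hypothetical set of application commands; not from the talk.
COMMANDS = ["select", "scroll", "back"]

def fuse_decisions(modality_probs: dict[str, np.ndarray],
                   weights: dict[str, float]) -> str:
    """Weighted decision-level fusion: average the class-probability
    vectors from each modality, weighted by an (assumed) per-modality
    reliability, and return the most likely command."""
    total = np.zeros(len(COMMANDS))
    norm = 0.0
    for name, probs in modality_probs.items():
        w = weights.get(name, 0.0)
        total += w * probs
        norm += w
    fused = total / norm  # renormalize back to a probability distribution
    return COMMANDS[int(np.argmax(fused))]

# Example: an uncertain EEG decision is disambiguated by gaze and heart rate.
print(fuse_decisions(
    {"eeg":  np.array([0.40, 0.35, 0.25]),
     "gaze": np.array([0.70, 0.20, 0.10]),
     "hr":   np.array([0.50, 0.30, 0.20])},
    weights={"eeg": 0.5, "gaze": 0.3, "hr": 0.2},
))  # -> "select"
```

Decision-level fusion is only one option; feature-level fusion (concatenating per-modality features before classification) is the other common design choice, trading robustness to a failing modality for the chance to exploit cross-modal correlations.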
Original language | Undefined |
---|---|
Pages | 1-1 |
Number of pages | 1 |
DOIs | |
Publication status | Published - Sept 2016 |
Event | 4th IIAE International Conference on Intelligent Systems and Image Processing, ICISIP 2016 (conference number: 4), Kyoto International Community House, Kyoto, Japan, 8 Sept 2016 → 12 Sept 2016. https://www2.ia-engineers.org/icisip2016/ · http://www2.ia-engineers.org/icisip2016/keynote-speakers/ |
Conference
Conference | 4th IIAE International Conference on Intelligent Systems and Image Processing, ICISIP 2016 |
---|---|
Abbreviated title | ICISIP 2016 |
Country/Territory | Japan |
City | Kyoto |
Period | 8/09/16 → 12/09/16 |
Internet address | https://www2.ia-engineers.org/icisip2016/ |
Keywords
- EWI-27180
- IR-104107
- HMI-MI: MULTIMODAL INTERACTIONS
- Affective Computing
- User monitoring
- Multi-modal interaction
- Intelligent sensors
- Brain-Computer Interfaces
- HMI-CI: Computational Intelligence