Over the last several years, brain-computer interface (BCI) research has grown well beyond its initial goal of providing basic communication for people whose severe disabilities prevent them from communicating otherwise. Because BCIs rely on direct measures of brain activity, users need not move at all to convey information. In the early years of the field, however, BCI systems had little to offer healthy users: most people can communicate quickly and easily by speaking or typing, so why would they use a BCI?

Several answers to this question have been proposed. Healthy users whose hands or voice are occupied experience a 'situational disability' and may welcome even a low-bandwidth hands-free interface. Others may use a BCI simply for fun; the idea of communicating through brainwaves alone is new and exciting to many people. BCIs can also detect emotion, arousal, or other user states that would be burdensome, irritating, or impossible to convey by other means. This last category comprises 'affective BCIs' (aBCIs): BCIs that detect affect.

Information about a user's affect could be used to modify that user's interaction with software and other people in real time. For example, the alphaWoW system changes a user's World of Warcraft avatar when the EEG indicates an increased level of stress. Users could press a button to send the same information, but doing so creates distraction and burden precisely when the user is stressed and may need both hands free for in-game challenges. Similarly, games and other applications might adapt automatically when users seem overburdened or bored; users could press a button to increase or decrease their workload, but may prefer to stay focused on the task while the environment adapts on its own.
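An alphaWoW-style adaptation of this kind can be sketched as a simple band-power threshold: estimate the relative power of the EEG alpha band and map it to an avatar state. Everything below is an illustrative assumption rather than the published system's implementation: the band edges, the threshold, the elf/bear mapping, and the function names are all hypothetical.

```python
import numpy as np

def alpha_band_power(signal, fs, band=(8.0, 12.0)):
    """Relative EEG power in the alpha band for one channel.

    Returns the fraction of (non-DC) spectral power that falls in
    `band`; higher values are conventionally read as more relaxed.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[in_band].sum() / psd[1:].sum()  # exclude the DC bin

def choose_avatar(relative_alpha, threshold=0.3):
    """Hypothetical alphaWoW-style rule: high alpha (relaxed) -> elf,
    low alpha (stressed) -> bear. The 0.3 threshold is illustrative."""
    return "elf" if relative_alpha >= threshold else "bear"

# Synthetic 4-second, 128 Hz "recordings": a dominant 10 Hz rhythm
# stands in for a relaxed user, a 25 Hz rhythm for a stressed one.
fs = 128
t = np.arange(0, 4, 1 / fs)
print(choose_avatar(alpha_band_power(np.sin(2 * np.pi * 10 * t), fs)))  # elf
print(choose_avatar(alpha_band_power(np.sin(2 * np.pi * 25 * t), fs)))  # bear
```

A deployed system would of course do more than this sketch: smooth the band-power estimate over sliding windows, calibrate the threshold per user, and hold a state briefly to avoid the avatar flickering between forms.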
- HMI-MI: MULTIMODAL INTERACTIONS
- Brain-Computer Interfaces