ABSTRACT

This review discusses the potential use of the sense of touch (haptics) in brain–computer interfaces (BCIs), a line of research that commenced a decade ago. In motor imagery BCIs, haptics can provide feedback that feels more intuitive to users and yields results comparable to visual feedback. In event-related potential (ERP) BCIs, haptic stimuli (mainly vibrations delivered to the torso, wrists, or fingers) are used to present options to the user. Spatial, selective tactile attention enhances specific ERP components in much the same way that visual attention does. The performance of these tactile ERP BCIs has improved rapidly over the past few years and now surpasses that of gaze-independent visual BCIs. This gaze independence is of interest for users who lack full control over their eye movements or whose visual channel is at risk of overload. Haptic stimuli are also being investigated for steady-state evoked potential BCIs. So far, this BCI paradigm has not been very successful with haptic stimuli. Recent reports, however, show that adding haptic steady-state stimuli to a motor imagery BCI may improve overall performance. To advance haptic BCIs, we should further optimize hardware, BCI paradigms, and classification algorithms tuned to the characteristics of haptic stimulation.