abstract
- Control and communication in man and machine have been active areas of research since the early 1940s, and since then the use of computing machines for human enhancement, augmentation, and rehabilitation has been broadly investigated. One active area of such research is the interface between the human brain and the computer: brain-computer interfacing (BCI), or neuroprostheses. Current examples of functional BCIs typically control cursor movement on a computer screen, but require extensive subject training and significant, if not full, cognitive focus. We propose an alternative approach to implementing a BCI for controlling a digital hearing aid, in which the speech signal is autonomously modified based on the identification of an electrophysiological response, or affective state. Using a support vector machine binary classifier, our model demonstrated the efficacy of single-trial identification of affective states as an enhanced method of hearing-neuroprosthesis control, achieving a communication transfer rate of 240 bits/minute.
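As a rough illustration of the classification step named in the abstract, the sketch below trains a linear support vector machine on synthetic single-trial feature vectors and estimates accuracy by cross-validation. The feature layout (band powers per EEG channel), trial counts, labels, and library choice are assumptions for illustration, not the data or pipeline used in this work.

```python
# Minimal sketch, assuming synthetic data: single-trial binary classification of
# affective state with a linear-kernel SVM. Feature layout, trial counts, and
# labels are placeholders, not the authors' recordings or preprocessing.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_trials, n_features = 200, 64                 # assumed: band-power features per EEG channel
X = rng.normal(size=(n_trials, n_features))    # placeholder single-trial feature vectors
y = rng.integers(0, 2, size=n_trials)          # placeholder binary affective-state labels

# Standardize features, then fit the binary SVM; 5-fold cross-validation gives
# a rough estimate of single-trial classification accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"single-trial accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

On real data, the random feature matrix would be replaced by features extracted from recorded single-trial EEG epochs, and the reported transfer rate would follow from the classifier's accuracy and decision rate.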