Roth D, Westermeier F, Brübach L, Feigl T, Schell C, Latoschik ME (2019)
Publication Language: English
Publication Type: Conference contribution, Original article
Publication year: 2019
Publisher: ACM
Pages Range: 564-565
Conference Proceedings Title: Mensch und Computer 2019 - Workshopband
URI: https://dl.gi.de/bitstream/handle/20.500.12116/25205/571.pdf
The perception and expression of emotion are fundamental parts of social interaction. This project aims to utilize neuronal signals to augment avatar-mediated communication. We recognize emotions with a brain-computer interface (BCI) and supervised machine learning. Using an avatar-based communication interface that supports head tracking, gaze tracking, and speech-to-animation, we leverage the BCI-based affect detection to visualize emotional states.
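The abstract does not specify the features or classifier used for affect recognition. The following is a minimal sketch of one plausible EEG affect-classification pipeline, assuming band-power features (theta/alpha/beta via Welch's method) and an SVM classifier; the sampling rate, epoch shapes, and labels are placeholders, not details from the paper.

```python
# Sketch of an EEG-based affect classification pipeline (assumptions:
# band-power features, SVM classifier, binary valence labels).
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 250  # assumed EEG sampling rate in Hz

# Frequency bands commonly used in affect recognition (assumption).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epochs: np.ndarray) -> np.ndarray:
    """Mean spectral power per channel and band, computed with Welch's method.

    epochs: array of shape (n_epochs, n_channels, n_samples).
    Returns an array of shape (n_epochs, n_channels * n_bands).
    """
    freqs, psd = welch(epochs, fs=FS, nperseg=FS, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[..., mask].mean(axis=-1))  # (n_epochs, n_channels)
    return np.concatenate(feats, axis=-1)

# Placeholder data standing in for labeled EEG epochs.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((120, 8, 2 * FS))  # 120 epochs, 8 channels, 2 s each
labels = rng.integers(0, 2, size=120)           # e.g., low vs. high valence

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, band_power_features(epochs), labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```

In a live avatar-mediated setting, the trained classifier's per-epoch prediction would drive the visualization of the user's emotional state on the avatar, alongside the head-tracking, gaze-tracking, and speech-to-animation channels described above.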
APA:
Roth, D., Westermeier, F., Brübach, L., Feigl, T., Schell, C., & Latoschik, M.E. (2019). Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions. In Gesellschaft für Informatik e.V. (Eds.), Mensch und Computer 2019 - Workshopband (pp. 564-565). Hamburg, DE: ACM.
MLA:
Roth, Daniel, et al. "Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions." Mensch und Computer 2019 - Workshopband, edited by Gesellschaft für Informatik e.V., ACM, 2019, pp. 564-565.