Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions

Roth D, Westermeier F, Brübach L, Feigl T, Schell C, Latoschik ME (2019)


Publication Language: English

Publication Type: Conference contribution, Original article

Publication year: 2019

Publisher: ACM

Pages Range: 564-565

Conference Proceedings Title: Mensch und Computer 2019 - Workshopband

Event location: Hamburg, DE

URI: https://dl.gi.de/bitstream/handle/20.500.12116/25205/571.pdf

DOI: 10.18420/muc2019-ws-571

Abstract

The perception and expression of emotion are a fundamental part of social interaction. This project aims to use neuronal signals to augment avatar-mediated communication. We recognize emotions with a brain-computer interface (BCI) and supervised machine learning. Using an avatar-based communication interface that supports head tracking, gaze tracking, and speech-to-animation, we leverage the BCI-based affect detection to visualize emotional states.
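The abstract outlines a pipeline from EEG signals to visualized affect but does not describe the implementation. The following is a minimal Python sketch of the general idea of supervised affect classification from EEG, assuming band-power features, an SVM classifier (scikit-learn), a 250 Hz sampling rate, and example labels; none of these choices are taken from the paper.

import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Assumed frequency bands (Hz); the paper does not specify its feature extraction.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epoch, fs=250):
    """epoch: (n_channels, n_samples) EEG segment -> flat band-power feature vector."""
    freqs, psd = welch(epoch, fs=fs, nperseg=min(fs, epoch.shape[-1]))
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in BANDS.values()]
    return np.concatenate(feats)

def train_affect_classifier(epochs, labels, fs=250):
    """Fit a supervised classifier on labeled EEG epochs (labels e.g. 'positive'/'negative')."""
    X = np.stack([band_power_features(e, fs) for e in epochs])
    return make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, labels)

def predict_affect(clf, epoch, fs=250):
    """Classify one new EEG epoch; the predicted state could then drive the avatar's emotion display."""
    return clf.predict(band_power_features(epoch, fs).reshape(1, -1))[0]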

How to cite

APA:

Roth, D., Westermeier, F., Brübach, L., Feigl, T., Schell, C., & Latoschik, M.E. (2019). Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions. In Gesellschaft für Informatik e.V. (Eds.), Mensch und Computer 2019 - Workshopband (pp. 564-565). Hamburg, DE: ACM.

MLA:

Roth, Daniel, et al. "Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions." Mensch und Computer 2019 - Workshopband, Hamburg. Ed. Gesellschaft für Informatik e.V., ACM, 2019. 564-565.
