Roth D, Brübach L, Westermeier F, Schell C, Feigl T, Latoschik ME (2019) A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data
Publication Language: English
Publication Type: Conference contribution, Original article
Publication year: 2019
Publisher: ACM Digital Library
City/Town: New York, NY, USA
Conference Proceedings Title: Proceedings of the Symposium on Spatial User Interaction (SUI'19)
ISBN: 978-1-4503-6975-6
URI: https://dl.acm.org/citation.cfm?id=3357251.3360018
In this demonstration, we present a prototype of an avatar-mediated social interaction interface that supports the replication of head and eye movements in distributed virtual environments. In addition to retargeting these natural behaviors, the system can augment the interaction with a visual presentation of affective states. We derive these states from neuronal data captured by electroencephalographic (EEG) sensing, combined with a machine-learning-driven classification of emotional states.
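The abstract only states that affective states are derived from EEG data via machine-learning classification; it does not specify features, classifier, or label set. The following is a minimal illustrative sketch of such a pipeline, assuming band-power features, an SVM classifier, and a three-class neutral/positive/negative scheme. None of these choices are taken from the paper.

```python
# Hypothetical sketch (not the authors' implementation): deriving a discrete
# affective state from EEG band-power features and mapping it to a label that
# could drive an avatar's affect display.
import numpy as np
from sklearn.svm import SVC

N_CHANNELS, N_BANDS = 8, 4  # assumed montage size and number of bands

def band_power_features(eeg_window: np.ndarray, fs: float = 250.0) -> np.ndarray:
    """Collapse one EEG window (channels x samples) into band-power features."""
    freqs = np.fft.rfftfreq(eeg_window.shape[1], d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg_window, axis=1)) ** 2
    bands = [(4, 8), (8, 13), (13, 30), (30, 45)]  # theta, alpha, beta, gamma
    feats = [power[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in bands]
    return np.concatenate(feats)  # shape: (N_CHANNELS * N_BANDS,)

# Train on labeled calibration windows; random data stands in for recordings.
rng = np.random.default_rng(0)
X_train = np.stack([band_power_features(rng.standard_normal((N_CHANNELS, 500)))
                    for _ in range(60)])
y_train = rng.integers(0, 3, size=60)  # placeholder labels for illustration
clf = SVC(kernel="rbf").fit(X_train, y_train)

# At runtime: classify the current window and forward the state to the avatar.
state = clf.predict(band_power_features(rng.standard_normal((N_CHANNELS, 500)))[None])[0]
print({0: "neutral", 1: "positive", 2: "negative"}[int(state)])
```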
APA:
Roth, D., Brübach, L., Westermeier, F., Schell, C., Feigl, T., & Latoschik, M. E. (2019). A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data. In C. W. Borst, A. K. Kulshreshth, G. Bruder, S. Serafin, C. Sandor, K. Johnsen, J. Ye, D. Roth, & S. Jung (Eds.), Proceedings of the Symposium on Spatial User Interaction (SUI'19). New York, NY, USA: ACM Digital Library.
MLA:
Roth, Daniel, et al. "A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data." Proceedings of the Symposium on Spatial User Interaction (SUI'19), New Orleans. Ed. Christoph W. Borst, Arun K. Kulshreshth, Gerd Bruder, Stefania Serafin, Christian Sandor, Kyle Johnsen, Jinwei Ye, Daniel Roth, and Sungchul Jung. New York, NY, USA: ACM Digital Library, 2019.