AmbiGaze: Direct Control of Ambient Devices by Gaze

Conference contribution
(Conference paper)


Publication details

Author(s): Velloso E, Wirth M, Weichel C, Esteves A, Gellersen H
Editor: ACM New York
Publisher: ACM
Place of publication: New York
Year of publication: 2016
Conference proceedings: Proceedings of the 2016 ACM Conference on Designing Interactive Systems
Page range: 812-817
ISBN: 978-1-4503-4031-1
Language: English


Abstract


Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze only through smooth pursuit tracking. We propose a design space of means of exposing functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.
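The abstract describes the mechanism only in prose; the following minimal Python sketch (an illustrative assumption, not code from the paper) shows the kind of smooth pursuit matching it refers to. Each device exposes an animated target, and a sliding window of gaze samples is compared against each target's trajectory, here via Pearson correlation, a common matching criterion in pursuit-based interfaces; the target whose motion the eyes follow most closely is selected. The window length, threshold, and all function names are hypothetical.

import numpy as np

def pursuit_correlation(gaze_xy, target_xy):
    # gaze_xy and target_xy are (N, 2) arrays of x/y positions sampled over
    # the same time window (assumes both axes vary within the window).
    # Return the mean Pearson correlation of the x and y components as a
    # pursuit-matching score.
    corr_x = np.corrcoef(gaze_xy[:, 0], target_xy[:, 0])[0, 1]
    corr_y = np.corrcoef(gaze_xy[:, 1], target_xy[:, 1])[0, 1]
    return (corr_x + corr_y) / 2.0

def select_target(gaze_window, target_windows, threshold=0.8):
    # Score every animated target against the current gaze window and return
    # the index of the best match, or None if no score clears the
    # (illustrative) threshold.
    scores = [pursuit_correlation(gaze_window, t) for t in target_windows]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

Because the score compares relative motion rather than absolute gaze position, this style of matching typically works without per-user calibration, and selection only fires while the user deliberately follows a moving target, which is how pursuit-based interaction sidesteps the Midas touch problem mentioned in the abstract.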



FAU authors / FAU editors

Wirth, Markus
Stiftungs-Juniorprofessur für Sportinformatik (Digital Sports)


Citation formats

APA:
Velloso, E., Wirth, M., Weichel, C., Esteves, A., & Gellersen, H. (2016). AmbiGaze: Direct control of ambient devices by gaze. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (pp. 812-817). New York, NY: ACM.

MLA:
Velloso, Eduardo, et al. "AmbiGaze: Direct Control of Ambient Devices by Gaze." Proceedings of the 2016 ACM Conference on Designing Interactive Systems. New York: ACM, 2016. 812-817.


Last updated 2018-04-19 at 03:30