AmbiGaze: Direct Control of Ambient Devices by Gaze

Velloso E, Wirth M, Weichel C, Esteves A, Gellersen H (2016)


Publication Language: English

Publication Type: Conference Contribution

Publication year: 2016

Publisher: ACM

City/Town: New York

Pages Range: 812-817

Conference Proceedings Title: Proceedings of the 2016 ACM Conference on Designing Interactive Systems

Event location: Brisbane, Australia

ISBN: 978-1-4503-4031-1

URI: http://dl.acm.org/citation.cfm?doid=2901790.2901867

DOI: 10.1145/2901790.2901867

Abstract

Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze only through smooth pursuit tracking. We propose a design space of means of exposing functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.

How to cite

APA:

Velloso, E., Wirth, M., Weichel, C., Esteves, A., & Gellersen, H. (2016). AmbiGaze: Direct control of ambient devices by gaze. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (pp. 812-817). New York, NY: ACM.

MLA:

Velloso, Eduardo, et al. "AmbiGaze: Direct Control of Ambient Devices by Gaze." Proceedings of the 2016 ACM Conference on Designing Interactive Systems, ACM, 2016, pp. 812-817.
