Pre-CHI 2026

Fabrication

GazeZoom: Exploration of Gaze-Assisted Multimodal Techniques for Panning and Zooming

Yilong Lin, Mingyu Han, Weitao Jiang, Seungwoo Je, Ian Oakley

at 15:00 · Live in F.101 b/c · 15 min

Zooming and panning are fundamental input actions for exploring complex 2D and 3D scenes and data such as images, maps, and designs. Multi-touch zoom/pan interactions have proven effective on mobile devices and have been directly ported to HMDs, where they are typically accomplished by analogous but relatively large-scale movements of both hands. We argue that such motions are inefficient and induce fatigue, and we explore how the eye-tracking features of HMDs can be leveraged to achieve improvements. We evaluated three interaction techniques that combine gaze with two-handed, one-handed, and head-based input in a study (N=24) that contrasts them against a baseline two-handed technique. The results indicate that the gaze-assisted two- and one-handed techniques outperform the baseline (17%-36% faster), while our head-based technique achieves performance similar to the baseline but leaves the hands free for other tasks. We further developed a VR application demonstrating these techniques and validating their practical applicability.
