Lost in Style: Gaze-driven Adaptive Aid for VR Navigation

1 George Mason University, 2Japan Broadcasting Corporation, 3University of Massachusetts-Boston, 4National Chiao Tung University, 5Facebook Reality Labs
[Figure: left, a street intersection in the VR scene; middle, a user wearing a VR headset while his gaze is being tracked; right, the same intersection with an arrow showing the direction the user should take to reach his/her destination.]

(a) While navigating a scene in virtual reality, (b) the user's gaze sequence can indicate his/her need for navigation help and (c) an aid is displayed accordingly.


A key challenge for virtual reality level designers is striking a balance between preserving the immersiveness of VR and providing users with on-screen navigation aids. These aids are often necessary for wayfinding in virtual environments with complex paths.

We introduce a novel adaptive aid that maintains the effectiveness of traditional aids while giving designers and users control over how often help is displayed. Our adaptive aid uses gaze patterns to predict a user's need for navigation help in VR and displays mini-maps or arrows accordingly. Using a dataset of gaze angle sequences collected from users navigating a VR environment, annotated with the moments when users requested aid, we trained an LSTM to classify gaze sequences as indicating a need for navigation help, and display an aid when such a need is predicted. We validated the efficacy of the adaptive aid for wayfinding against other commonly used wayfinding aids.
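The pipeline described above, in which a window of gaze angles is fed to an LSTM whose output decides whether an aid is shown, can be sketched as follows. This is an illustrative NumPy forward pass with randomly initialized weights, not the paper's trained model; the hidden size, function names, and decision threshold are all assumptions for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_classify(gaze_angles, params):
    """Run a single-layer LSTM over a 1-D gaze-angle sequence and return
    P(user needs navigation help) from a sigmoid output head.
    `params` holds illustrative random weights; the paper's model would
    use weights learned from the collected gaze dataset."""
    Wx, Wh, b, w_out, b_out = params
    H = Wh.shape[1]
    h = np.zeros(H)          # hidden state
    c = np.zeros(H)          # cell state
    for angle in gaze_angles:
        # Gates stacked along one vector: [input, forget, output, candidate]
        z = Wx * angle + Wh @ h + b
        i, f, o = (sigmoid(z[k * H:(k + 1) * H]) for k in range(3))
        g = np.tanh(z[3 * H:])
        c = f * c + i * g
        h = o * np.tanh(c)
    return sigmoid(w_out @ h + b_out)

def init_params(hidden=8, seed=0):
    """Random weights for the sketch (scalar gaze-angle input)."""
    rng = np.random.default_rng(seed)
    Wx = rng.normal(0.0, 0.1, 4 * hidden)
    Wh = rng.normal(0.0, 0.1, (4 * hidden, hidden))
    b = np.zeros(4 * hidden)
    w_out = rng.normal(0.0, 0.1, hidden)
    return Wx, Wh, b, w_out, 0.0

# Illustrative use: a short window of gaze angles (degrees).
params = init_params()
p = lstm_classify([2.0, -15.0, 30.0, -25.0, 18.0], params)
show_aid = p > 0.5   # display the mini-map/arrow when predicted need is high
```

In the actual system, the threshold on the predicted probability is one of the knobs that lets designers and users tune how often help appears.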


Games/Play, Virtual/Augmented Reality, Eye Tracking




@inproceedings{alghofaili2019lost,
  author = {Alghofaili, Rawan and Sawahata, Yasuhito and Huang, Haikun and Wang, Hsueh-Cheng and Shiratori, Takaaki and Yu, Lap-Fai},
  title = {Lost in Style: Gaze-driven Adaptive Aid for VR Navigation},
  booktitle = {Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems},
  series = {CHI '19},
  year = {2019},
  publisher = {ACM},
  location = {Glasgow, UK},
  keywords = {Games/Play, Virtual/Augmented Reality, Eye Tracking}
}


We thank Kristen Laird for her help in conducting the user studies. This research is supported by the National Science Foundation under award number 1565978. We are grateful to the anonymous reviewers for their constructive comments.