Labeling Out-of-View Objects in Immersive Analytics to Support Situated Visual Searching

Tica Lin, Yalong Yang, Johanna Beyer, Hanspeter Pfister

View presentation: 2022-10-19T20:00:00Z
Figure: AR label design for objects outside the field-of-view (FOV). (a) Our user study uses a VR HMD to simulate consistent AR conditions. (b) The user is surrounded by spatially sparse objects and can see labels for both in-view and out-of-view objects on the AR screen; labels for out-of-view objects are placed on the FOV boundary to support embodied navigation. (c) A simulated grocery shopping experience with AR labels.

Prerecorded Talk

The live footage of the talk, including the Q&A, can be viewed on the session page, Immersive Analytics and Situated Visualization.

Keywords

Object Labeling, Mixed / Augmented Reality, Immersive Analytics, Situated Analytics, Data Visualization

Abstract

Augmented Reality (AR) embeds digital information into objects in the physical world. Data can be shown in situ, enabling real-time visual comparisons and object search in real-life user tasks, such as comparing products or looking up scores in a sports game. While there have been studies on designing AR interfaces for situated information retrieval, there has been only limited research on AR object labeling for visual search tasks in the spatial environment. In this paper, we identify and categorize different design aspects of AR label design and report on a formal user study of labels for out-of-view objects to support visual search tasks in AR. We design three visualization techniques for out-of-view object labeling in AR, which encode, respectively, the relative physical position (height-encoded), the rotational direction (angle-encoded), and the label values (value-encoded) of the objects. We further implement two traditional in-view object labeling techniques, where labels are placed either next to the respective objects (situated) or at the edge of the AR FOV (boundary). We evaluate these five label conditions in three visual search tasks for static objects. Our study shows that out-of-view object labels are beneficial for finding objects outside the FOV, for spatial orientation, and for comparing multiple spatially sparse objects. Angle-encoded labels, which provide directional cues to the surrounding objects, achieved the best overall performance and the highest user satisfaction. We discuss the implications of our findings for future immersive AR interface design.
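
To make the boundary placement and angle encoding concrete, the sketch below shows one way such a label cue could be computed: the object's signed yaw relative to the view direction decides whether it gets a situated in-view label or an angle-encoded cue clamped to the FOV boundary. This is a minimal illustration under our own assumptions (a purely horizontal FOV, positions and view direction given as (x, y, z) tuples, and a hypothetical label_anchor helper); it is not the implementation used in the paper.

```python
import math

def label_anchor(user_pos, forward, obj_pos, fov_deg=90.0):
    """Classify an object as in-view or out-of-view and suggest where
    its label cue should go. A minimal sketch, not the paper's code."""
    # Direction from the user to the object on the horizontal (x, z) plane.
    dx = obj_pos[0] - user_pos[0]
    dz = obj_pos[2] - user_pos[2]
    # Signed yaw of the object relative to the view direction,
    # wrapped to [-180, 180) degrees.
    view_yaw = math.atan2(forward[0], forward[2])
    obj_yaw = math.atan2(dx, dz)
    angle = math.degrees((obj_yaw - view_yaw + math.pi) % (2 * math.pi) - math.pi)

    if abs(angle) <= fov_deg / 2.0:
        # In view: place a situated label next to the object itself.
        return ("situated", angle)
    # Out of view: clamp the label to the nearer FOV boundary and keep
    # the signed angle as the directional (angle-encoded) cue.
    side = "right" if angle > 0 else "left"
    return ("boundary", side, angle)

# Example: an object behind and to the right of a user looking down +z.
print(label_anchor((0, 0, 0), (0, 0, 1), (2, 0, -1), fov_deg=90.0))
# -> ('boundary', 'right', 116.56...)
```

A height-encoded or value-encoded variant would follow the same skeleton but map the object's relative elevation or its label value, rather than the yaw angle, onto the boundary cue.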