IEEE VIS 2024 Content: SpatialTouch: Exploring Spatial Data Visualizations in Cross-reality


Lixiang Zhao - Xi'an Jiaotong-Liverpool University, Suzhou, China

Tobias Isenberg - Université Paris-Saclay, CNRS, Orsay, France; Inria, Saclay, France

Fuqi Xie - Xi'an Jiaotong-Liverpool University, Suzhou, China

Hai-Ning Liang - The Hong Kong University of Science and Technology (Guangzhou), Guangzhou, China

Lingyun Yu - Xi'an Jiaotong-Liverpool University, Suzhou, China

Room: Bayshore I

Session time: 2024-10-17T18:45:00Z
Exemplar figure: SpatialTouch, a cross-reality environment that integrates a monoscopic 2D surface (an interactive screen with touch and pen input) with a stereoscopic 3D space (an augmented-reality HMD) to jointly host spatial data visualizations.
Keywords

Spatial data, immersive visualization, cross-reality, interaction techniques

Abstract

We propose and study a novel cross-reality environment that seamlessly integrates a monoscopic 2D surface (an interactive screen with touch and pen input) with a stereoscopic 3D space (an augmented-reality HMD) to jointly host spatial data visualizations. This approach combines the best of two conventional methods of displaying and manipulating spatial 3D data, enabling users to fluidly explore diverse visual forms using tailored interaction techniques. Effective 3D exploration techniques are pivotal for conveying the intricate spatial structures of such data, often at multiple spatial or semantic scales, across application domains that require diverse visual representations. To understand user reactions to our new environment, we began with an elicitation user study in which we captured their responses and interactions. We observed that users adapted their interaction approaches based on the perceived visual representations, with natural transitions in spatial awareness and actions as they navigated across the physical surface. Our findings then informed the development of a design space for spatial data exploration in cross-reality. On this basis, we developed cross-reality environments tailored to three distinct domains: 3D molecular structure data, 3D point cloud data, and 3D anatomical data. In particular, we designed interaction techniques that account for the inherent features of interaction in both spaces, facilitating various forms of input, including mid-air gestures, touch interactions, pen interactions, and combinations thereof, to enhance the users' sense of presence and engagement. We assessed the usability of our environment with biologists, focusing on its use for domain research. In addition, we evaluated our interaction-transition designs with virtual- and mixed-reality experts to gather further insights. As a result, we provide design suggestions for cross-reality environments, emphasizing interaction with diverse visual representations and seamless interaction transitions between 2D and 3D spaces.
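A prerequisite for the seamless 2D-to-3D interaction transitions the abstract describes is registering input on the physical surface within the coordinate frame shared with the AR headset. The following is a minimal conceptual sketch (not the authors' implementation) of that registration step, assuming a planar surface whose pose in world space is known from tracking; the function and parameter names are hypothetical:

```python
import numpy as np

def touch_to_world(u, v, surface_pose, surface_size):
    """Map a normalized touch coordinate (u, v in [0, 1]) on a tracked
    interactive surface into the 3D world frame shared with an AR HMD.

    surface_pose: 4x4 homogeneous pose matrix of the surface in world space
                  (assumed to come from an external tracking system).
    surface_size: (width, height) of the surface in meters.
    """
    w, h = surface_size
    # Express the touch point in the surface's local frame:
    # origin at the surface center, z = 0 on the surface plane.
    local = np.array([(u - 0.5) * w, (v - 0.5) * h, 0.0, 1.0])
    # Transform into world coordinates via the surface pose.
    return (surface_pose @ local)[:3]

# Example: a 60 x 40 cm screen whose local frame coincides with the world frame.
pose = np.eye(4)
center = touch_to_world(0.5, 0.5, pose, (0.6, 0.4))  # -> [0.0, 0.0, 0.0]
```

With such a mapping, a pen or touch event on the monoscopic surface and a mid-air gesture in the stereoscopic space can both address the same world-space data point, which is what allows an interaction to transition continuously between the two spaces.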