
Eliciting Model Steering Interactions from Users via Data and Visual Design Probes

Anamaria Crisan

Maddie Shang

Eric Brochu

Room: Bayshore II

2024-10-16T13:30:00Z
Keywords

Design Probes, Interactive Machine Learning, Model Steering, Semantic Interaction

Abstract

Visual and interactive machine learning (IML) systems are becoming ubiquitous as they empower individuals with varied machine learning expertise to analyze data. However, it remains challenging to align interactions with visual marks to a user’s intent for steering machine learning models. We explore using data and visual design probes to elicit users’ desired interactions for steering ML models via visual encodings within IML interfaces. We conducted an elicitation study with 20 data analysts with varying expertise in ML. We summarize our findings as target-interaction pairs, which we compare to prior systems to assess the utility of the probes. We additionally surfaced insights about factors influencing how and why participants chose to interact with visual encodings, including refraining from interacting. Finally, we reflect on the value of gathering such formative empirical evidence via data and visual design probes ahead of developing IML prototypes.