Modeling a user's interactions is intimately tied to many areas of research
in the fields of HCI and Visual Analytics. Most notably, developing adaptive
visual interfaces and effectively prefetching from large datasets first
require understanding the user's behavior and analytical process. In this
work, we demonstrate the potential of using a user's mouse movements and
clicks to achieve this goal. In an online study, we gathered users'
interactions as they performed a complex visual search task. Our results
indicate a significant difference between the search strategies employed by
users who were quick at completing the task and those who were slow.