
VAST Paper Types

Further Clarification on Paper Types in VAST 2016

VAST has two tracks, the TVCG track and the Conference-only track, which correspond to different levels of originality, rigor, and significance. In general, VAST papers should be written, submitted, and reviewed in the same way as papers at the other two VIS conferences (i.e., InfoVis and SciVis), following the detailed submission guidelines. However, with the rapid development of the science, technology, and applications of visual analytics, it is sensible to adjust our understanding of VAST publications from time to time. Beyond the discussion of the five paper types in the shared guidelines, we provide the following clarifications about paper types for VAST 2016.

In visual analytics, concepts, theories, algorithms, techniques, designs, systems, empirical studies, and applications normally create a context where analysis, visualization, and interaction are integrated to optimize the combination of human and machine capabilities. It is this context that differentiates VAST from the other conferences in VIS: the data involved can be spatial or non-spatial, the techniques can be human-centric or machine-centric, and the application domain can be almost any academic discipline, industry, business sector, or governmental operation. Within such a context, an individual VAST paper may focus strongly on one aspect where its novel contributions reside, or place its emphasis on the integration of different aspects.

VAST papers typically fall into one of these six categories:

  • Theory and Model
  • Technique and Algorithm
  • Design Study
  • Empirical Study (referred to as the Evaluation type in the shared guidelines)
  • System
  • Application (as a separate category from the Design Study type in the shared guidelines)

Theory and Model

  • Fundamentals of visual analytics.
  • Conceptual understanding and modelling of visual analytics (e.g., definitions, taxonomies, analytic frameworks, and research methods, etc.).
  • Philosophical and sociological discourses of visual analytics (e.g., human vs machine, ethics, data security, uncertainty, biases, and privacy, etc.).
  • Perception and cognition in visual analytics.
  • Mathematical abstraction and modelling of visual analytics processes.
  • Concepts and models that govern quality metrics and benchmarks for evaluating visual analytics processes and systems.

Examples:
  • Megan Monroe et al. “Temporal event sequence simplification.” VAST 2013 Honorable Mention.

Technique and Algorithm

  • Visualization techniques in visual analytics processes.
  • Close integration of technical components of visual analytics (e.g., statistical analysis, human-defined and machine-learned algorithms, knowledge representations, visualization/interaction techniques and methodologies, etc.) for supporting visual data mining.
  • Visual analytics for supporting the advancement of non-visual technical components of visual analytics (e.g., visual analytics for supporting model development, simulation, learning, monitoring, and optimization).
  • Integrated data acquisition, management, retrieval, processing and transformation in visual analytics (e.g., multi-sources; multi-resolution; data provenance; uncertainty; real world measures; textual, audio, visual and other media; factual, statistical, semantic, synthesized, and hypothesized data; etc.).
  • VA techniques for spatial and non-spatial data, temporal data, streaming data, quantitative and qualitative data, text and document data, and so on.
  • Techniques for production, presentation, and dissemination of VA results.

Examples:
  • Thomas Mühlbacher and Harald Piringer. “A partition-based framework for building and validating regression models.” VAST 2013 Best Paper.
  • Stef van den Elzen et al. “Reducing snapshots to points: a visual analytics approach to dynamic network exploration.” VAST 2015 Best Paper.

Empirical Study

  • Understanding human-centric components in visual analytics processes (e.g., perception, cognition, interaction, communication, collaboration, etc.).
  • Understanding human capabilities and limitations in data intelligence (e.g., exploration, navigation, sensemaking, context awareness, knowledge discovery, learning, argumentation, causality reasoning, accountability, biases, etc.).
  • Understanding visual signatures in data intelligence (e.g., patterns of clusters, patterns of anomalies, etc.).
  • Understanding the potential merits and demerits of technologies in visual analytics (e.g., display technologies, interactive technologies, automated analytics, crowdsourcing analytics, and so on).
  • Human-centric comparative studies on aspects of visual analytics (e.g., visual representations, interaction techniques, active learning, visual analytics literacy, requirements analysis, etc.).
  • Evaluation methodologies for visual analytics techniques and systems in real world environments.
  • Different quantitative and qualitative (including ethnographic) forms of empirical studies (e.g., lab-based studies, field studies, crowdsourcing, group discussions, surveys, interviews, user experience observation, shadowing, case studies and casebook construction, etc.).
  • Transformation of scenarios and data captured in studies to benchmark problems and data-driven metrics.

Examples:
  • Narges Mahyar and Melanie Tory. “Supporting communication and coordination in collaborative sensemaking.” VAST 2014 Best Paper.
  • Hua Guo et al. “A case study using visualization interaction logs and insight metrics to understand how analysts arrive at insights.” VAST 2015 Honorable Mention.

Design Study

  • Designing disseminative visual analytics (e.g., storytelling, illustration and animation, public engagement, etc.).
  • Designing observational visual analytics (e.g., multivariate data, streaming data, multimedia data, geospatial data, spatio-temporal data, etc.).
  • Designing analytical visual analytics (e.g., clustering, anomaly detection, association and network analysis, correlation, causality, uncertainty, etc.).
  • Designing model-developmental visual analytics (e.g., exploring parameter space and model space, supporting dimensionality reduction and machine learning, the model-developmental life cycle, etc.).
  • Design methodologies for real world visual analytics systems and users.

Examples:
  • Jian Zhao et al. “#FluxFlow: Visual analysis of anomalous information spreading on social media.” VAST 2014 Honorable Mention.

System

  • Methodologies for engineering real world visual analytics systems.
  • System platforms (from wearable devices to desktops to large infrastructures, and from architectures and software libraries (toolkits), to standalone systems and apps, to online services and open-source repositories).
  • Comparative studies on real world visual analytics systems.
  • Development tools for the software lifecycle of visual analytics systems, including requirements analysis, system specification, system design, system implementation, system testing, user evaluation, and system maintenance.
  • Addressing challenges in real world visual analytics systems (e.g., provenance management, scalability, uncertainty, open testbeds, etc.).
  • Automation, customization, personalization, and interoperability.
  • Best practices (e.g., interoperability, workflow design, cost-benefit analysis, standardization, etc.).

Examples:
  • Tanja Blascheck et al. “VA2: A visual analytics approach for evaluating visual analytics applications.” VAST 2015 Honorable Mention.

Application

  • Delivering visual analytics solutions to applications in academic disciplines (e.g., physical sciences, biological and medical sciences, engineering sciences, social sciences, arts and humanities, and sports sciences).
  • Delivering visual analytics solutions to applications in industries and governance.
  • Delivering visual analytics solutions to applications in public services and entertainment (e.g., resilience, healthcare, transport, sports, tourism, broadcasting, and social media).

Examples:
  • Conglei Shi et al. “LoyalTracker: Visualizing loyalty dynamics in search engines.” VAST 2014 Honorable Mention.

Multi-type Papers. It is important to note that a VAST paper can present a mixture of contributions that fall into different categories. For example, a new technique may be presented in conjunction with an important application; a design study may be led by an empirical study and supported by qualitative evaluation; a theoretical model may be supported by evidence from a real world application; and so forth. Reviewers should appreciate the combined value of the mixed contributions rather than shoehorning such a paper into a specific category, while authors should appreciate that such a paper may present a challenge to some reviewers and may not be reviewed consistently.

Authors’ Perspective. The VIS guidelines, together with cited papers and reports, provide authors, especially less experienced authors, with useful guidance to organize their research activities and structure papers. Even experienced authors should not overlook such guidelines.

Reviewers’ Perspective. Meanwhile, since VAST research usually features innovation, creativity, and cross-disciplinarity, reviewers should not use these guidelines as a simplistic checklist for acceptance or rejection. The goal of the review process is to bring the most exciting or important advances in visual analytics to the VIS attendees, TVCG readers, and the larger community. Hence the role of a reviewer should be closer to that of a judge at a talent show than to that of a hygiene inspector.

Reviewing is essentially an evaluation process, with reviews intended to offer a balanced assessment of originality, rigor, and significance. For VAST, we particularly welcome papers that excel in at least one of these three aspects while being adequate in the others. We equally welcome papers that feature significant impact on visual analytics applications and/or interdisciplinary research activities (e.g., with machine learning, cognitive sciences, and so on). Reviewers are encouraged to appraise positively Application papers and Empirical Study papers that feature one of the following qualities.