
Reproducible Visualization Research: How Do We Get There?

Organizers

Organizer: Enrico Bertini
Panelists: Juliana Freire, Gordon Kindlmann, Tamara Munzner, Tim Dwyer
Description

Reproducible research refers to the ability of third parties to independently re-create and test the results described in research papers. As the field of Visualization grows and matures, it is necessary to promote research standards that lead to reliable work that people can trust and cite with confidence. For instance, it is desirable to ensure that readers can access the data, parameters, and software needed to replicate the results described in a paper.

Reproducible computational research is not a problem of Visualization alone. The whole area of Computational Science is affected by the need to make results more trustworthy and accessible. IEEE Computing in Science & Engineering published a special issue on reproducibility in 2009, and Science published one in 2011 [Peng 2011]. Vandewalle et al. ran a study on reproducibility in Signal Processing in 2009 [Vandewalle et al. 2009]. The ACM SIGMOD conference (the top conference in Databases) launched the "Experimental Reproducibility Effort" in 2008, an attempt to introduce a systematic mechanism to promote the publication of research that meets high reproducibility standards.

Comparatively, the field of Visualization (including InfoVis, SciVis, and VAST) has shown little interest in, and commitment to, promoting higher reproducibility standards. Given the field's focus on data, experimental evaluation, and computational methods, it is especially important to devise mechanisms that encourage easy access to, testing of, and re-use of proposed solutions.

The main goal of the panel is to raise awareness of this issue and to start a constructive discussion about potential mechanisms for increasing the adoption of reproducibility. As discussed elsewhere [Freire et al. 2012], several degrees of reproducibility can be expected, and alternative mechanisms and standards can be used to achieve them. The panel will be an opportunity to openly compare and contrast these alternatives.