The ACM Task Force on Data, Software, and Reproducibility in Publication

Bernard Rous, ACM Director of Publications (Emeritus January 2017)

Independent verification of published research results has been neither systematic nor even typical in computer science, despite the fact that innovation in the field relies heavily on experimentation with software, hardware, and data.

New impetus to address this situation has come from government mandates that grant applications include data management plans for sharing experimental outcomes, as well as from grass-roots recognition that access to the experimental artifacts underlying reported results is the best way to ensure scientific integrity and advance the field.

In November 2013, the ACM Executive Committee held a Strategic Retreat to address the future of ACM. With respect to ACM’s publishing program and its Digital Library platform, two general directions were set:

  1. Develop a vision for what will constitute published content in the future, and
  2. Develop a vision for a new service‐centric ACM Digital Archive that supports the preservation and use of published content as well as the work‐processes of the computing research community.

From these directives, the ACM Publications Board established a Digital Library Standing Committee chaired by Ron Boisvert. This Committee in turn launched two projects, one on DL Analytics and the other, of primary interest here, a Task Force on Data, Software, and Reproducibility in Publication led by Michael Lesk of Rutgers and Alex Wade of Microsoft Research.

While other disciplines have focused more on the data and methodology aspects of reproducibility, various ACM communities have tackled the problem of re-running software, with experimental variables, hardware configurations, and data all considered as fundamental aspects of the problem.

The DL Task Force pulled together members of ACM SIG and journal communities working on reproducibility, as well as representatives from the sister societies SIAM and IEEE. For the most part, these communities had been acting in isolation. At a workshop held in July 2015, they learned the details of one another's efforts. Despite clear recognition of substantive differences among sub-disciplines, the Task Force members realized they were all grappling with certain common problems and resolved to craft a Best Practices Guideline for Data, Software, and Reproducibility in Publication.

The Task Force agreed that it is too early to be prescriptive about the Best Practices (hence the plural), but that specific recommendations are definitely needed to:

  • Clarify basic definitions, evaluation criteria, and branding for: replicability, repeatability, reproducibility, re-usability, availability
  • Motivate and incentivize: authors, reviewers, program committees, editorial boards
  • Adopt/invent standard metadata descriptions: for software, data, methodologies
  • Enable: artifact evaluation processes in automated submission workflows
  • Encourage: sharing of artifacts
  • Define: acceptable storage and packaging formats
  • Support and integrate: internal and external data and software depositories
  • Identify, cite, and link: artifacts as first-class publication objects
  • Curate and preserve: artifacts for future re-use

Since the July 2015 workshop, members of the Task Force have been actively pursuing artifact evaluation in the publication of several ACM journals and conferences. Initial branding of this work now appears in the Digital Library. Historical data on replicated results has been gathered and will be added to the DL records. Collectively, over 650 pages of source materials have been contributed to inform the Best Practices Guideline. A second workshop will be held in early May 2016 to refine this input.

Concurrent to these activities, ACM and members of the Task Force have worked with Bruce Childers of the University of Pittsburgh to secure an Alfred P. Sloan Foundation grant for a study of The Impact of Emerging Platforms for Artifact Review and Active Curation on the ACM Digital Library.

This comparative study of curation platforms and the services each offers is necessary because sub-disciplinary requirements are quite diverse. The ACM Transactions on Mathematical Software, for example, has been publishing refereed software associated with studies of the properties and performance of relatively small, self-contained algorithms since 1975. More recently, it developed a broader initiative to formally replicate computational results. The ACM Transactions on Modeling and Computer Simulation has also adopted this process. At the other end of the spectrum, the ACM Conference on Supercomputing needs to evaluate large-scale parallel execution on high-performance networked systems. Studies of cyber-physical and embedded computing systems present still other dimensions that must be considered in evaluating and reproducing experimental results.

The ACM Digital Library will need to interoperate with a number of recommended data repositories and software curation platforms. DL users will need to inspect, interact with, and be able to manipulate new types of digital objects. The platforms themselves may need to adapt to some Digital Library requirements and interfaces in order to become fully useful in scholarly publication.

Through these activities and feedback from the broader CS community, the ACM Digital Library will be ideally positioned to serve emerging requirements for sharing software, data, and other artifacts, leading to increased scientific accountability and the adoption of improved experimental practices.

The Sloan-funded study is intended to help ACM develop its recommendations for the Best Practices Guideline. It will identify specific community use-cases and evaluate exemplar curation platforms against their needs. Some of these platforms will be integrated with the ACM Digital Library as pilots to test and refine interfaces. Feedback from all the ACM communities involved in artifact evaluation is an integral part of the Sloan study.

 

Members of the Task Force: Michael Lesk, Alex Wade, Ron Boisvert, Bruce Childers, David Grove, Dirk Beyer, Gianluca Setti, Grigori Fursin, Gwendal Simon, Henning Schulzrinne, Juliana Freire, Limor Peer, Lin Uhrmacher, Yolanda Gil, Micah Altman, Jim Crowley, Michael Heroux, Michael Miksis, Michela Taufer, Patrick Madden, Randy LeVeque, Philippe Bonnet, Sheila Morrisey, Simon Harper, Gerry Grenier, Stratos Idreos, Tim Hopkins, Wilfred Pinfold, Mark Gross, Jack Davidson, Joe Konstan, Nik Dutt, Victoria Stodden, Stephen Spencer, Roch Guerin, Bernie Rous, Wayne Graves, Scott Delman, Craig Rodkin.
