ISSTA 2024
Mon 16 - Fri 20 September 2024 Vienna, Austria
co-located with ISSTA/ECOOP 2024

The Artifact Evaluation process is a community service that helps authors of accepted papers provide more substantial supplements to their papers, so that future researchers can more effectively build on and compare with previous work.

Call for Artifacts

ISSTA 2024 invites submissions for artifact evaluation. Research artifacts are digital objects that were either created by the authors of a research article for use in their study or generated by their experiments.

The artifact evaluation (AE) process aims to foster reproducibility and reusability. Reproducibility refers to researchers or practitioners being able to validate the paper’s results using the provided artifact. Reusability means that researchers can extend the artifact or use it in a different context or for a different use case. Overall, the artifact evaluation process allows our field to progress by incentivizing and supporting authors to make their artifacts openly available and to improve their quality. Furthermore, a formal artifact evaluation documents the quality of the published research through widely recognized badges displayed directly on the published papers. It is therefore common for high-quality conferences, such as ISSTA, to offer the authors of accepted papers an artifact evaluation service before publication.

More details can be found in the ACM guidelines on Artifact Review and Badging (Version 1.1).

Submission and Preparation Overview

The following instructions provide an overview of how to prepare an artifact for submission. Please also read the instructions and explanations in the subsequent sections on this page before submission.

  1. Prepare Your Artifact: Along with the artifact itself, prepare a README file (with a .txt, .md, or .html extension) containing two sections:
    • Getting Started, to demonstrate how to set up the artifact and validate its general functionality (e.g., based on a small example) in less than 30 minutes.
    • Detailed Description, to describe how to validate the paper’s claims and results in detail.
  2. Include a Preprint of Your Paper: A preprint of the paper associated with the artifact must be included in the submission package. This is essential to assess whether the artifact adequately supports the claims made in the paper. The preprint should be the accepted version of the paper to ease the artifact evaluation process, allowing reviewers to directly correlate the claims with the provided artifacts.
  3. Upload the Artifact: Use Zenodo or a similar service to acquire a DOI for your artifact. This step ensures that your artifact is accessible and can be properly cited.
  4. Submit Through HotCRP: Provide the DOI and additional information about your artifact, including the paper abstract and the included preprint, using HotCRP (link available soon).

The Artifact Evaluation Process

The following provides a detailed explanation of the scope of artifacts, the goal of the evaluation process, and the submission instructions.

Scope of Artifacts

Artifacts can be of many different types, including (but not limited to):

  • Tools, which are standalone systems.
  • Data repositories storing, for example, logging data, system traces, or raw survey data.
  • Frameworks or libraries, which are reusable components.
  • Machine-readable proofs (see the guide on Proof Artifacts by Marianna Rapoport).

If you are unsure whether your artifact can be submitted to the AE process, please contact the AE chairs.

Evaluation Objectives and Badging

The evaluation of the artifacts targets three different objectives:

  • Availability (“Artifacts Available” badge): The artifact should be available and accessible to everyone interested in inspecting or using it. As detailed below, an artifact has to be uploaded to Zenodo to obtain this badge.
  • Functionality (“Artifacts Evaluated – Functional” badge): The main claims of the paper should be backed up by the artifact.
  • Reusability (“Artifacts Evaluated – Reusable” badge): Other researchers or practitioners should be able to inspect, understand, and extend the artifact.

Each objective is assessed as part of the evaluation process, and each successful outcome is rewarded with the corresponding ACM badge.

Availability

Your artifact should be made available via Zenodo, a publicly funded platform aiming to support open science. The artifact needs to be self-contained. During upload, you will be required to select a license and provide additional information, such as a description of the artifact. Zenodo will generate a DOI, which is required for the artifact evaluation submission (HotCRP). Note that, once published, the artifact is immediately public and can no longer be modified or deleted. However, it is possible to upload an updated version of the artifact, which receives a new DOI (e.g., to address reviewer comments during the kick-the-tires response phase).

The default storage for Zenodo is currently limited to 50 GB per artifact but can be extended on request (see the “Policies” section of the Zenodo FAQ). Still, please keep the size reasonably small to support reviewers in the process.
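Most authors will simply use the Zenodo web interface, but the upload can also be scripted via Zenodo’s documented REST API. The following Python sketch outlines the flow, assuming you have created a personal access token with deposit permissions; the file name and all metadata values are placeholders to adapt:

    import requests

    ACCESS_TOKEN = "<your Zenodo personal access token>"  # placeholder
    BASE = "https://zenodo.org/api/deposit/depositions"

    # Create an empty deposition (a draft record).
    deposition = requests.post(
        BASE, params={"access_token": ACCESS_TOKEN}, json={}
    ).json()

    # Upload the self-contained artifact archive into the deposition's file bucket.
    with open("artifact.zip", "rb") as fp:
        requests.put(
            deposition["links"]["bucket"] + "/artifact.zip",
            data=fp,
            params={"access_token": ACCESS_TOKEN},
        )

    # Attach minimal metadata; all values below are illustrative.
    metadata = {"metadata": {
        "title": "Artifact for <paper title>",
        "upload_type": "software",
        "description": "Research artifact accompanying our ISSTA 2024 paper.",
        "creators": [{"name": "Doe, Jane"}],
    }}
    requests.put(
        f"{BASE}/{deposition['id']}",
        params={"access_token": ACCESS_TOKEN},
        json=metadata,
    )

    # Publish the deposition: the record becomes public and Zenodo mints the DOI.
    requests.post(
        f"{BASE}/{deposition['id']}/actions/publish",
        params={"access_token": ACCESS_TOKEN},
    )

Remember that publishing is irreversible, so verify the archive contents before the final step; the DOI of the published record is what you later enter in HotCRP.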

Functionality

To judge the functionality and reusability of an artifact, two to three reviewers will evaluate every submission. The reviewers will evaluate the artifact in detail and validate that it backs up the paper’s important claims.

The README file is crucial for guiding reviewers through the evaluation process and should include:

Getting Started: This section must outline the necessary steps to set up the artifact and verify its general functionality. This could involve:

  • Listing the artifact’s requirements, with considerations for different operating systems or environments.
  • Providing detailed instructions for initializing the artifact, whether that involves compiling source code, running a virtual machine or container, or other setup processes.
  • Specifying the commands or actions reviewers should perform to verify basic functionality, including expected outcomes and the approximate time required for these steps. The goal is for reviewers to complete this part within 30 minutes.

Detailed Description: This section should demonstrate how the artifact supports each claim and result presented in the paper. It may include:

  • Step-by-step instructions to replicate the experiments or analyses.
  • Explanation of how the artifact’s outputs validate the paper’s claims.
  • Any necessary background information or context to understand the artifact’s operation and significance.

These are the main requirements to achieve the “Artifacts Evaluated – Functional” badge.
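To make these expectations concrete, the outline below shows one possible shape for such a README. It is purely illustrative: the section names follow the requirements above, but all file names, commands, outputs, and timings are invented placeholders.

    # Artifact for “<paper title>”

    ## Getting Started (approx. 25 minutes)
    Requirements: Docker 24+, 10 GB of free disk space.
    1. Load the container image: docker load -i artifact-image.tar
    2. Run the smoke test: ./run_small_example.sh
       Expected output: “All 12 checks passed” (roughly 5 minutes).

    ## Detailed Description
    - Claim 1 (Table 2 in the paper): run ./run_experiment_1.sh
      (about 2 hours); results are written to results/table2.csv.
    - Claim 2 (Section 5.2): follow docs/claim2.md for step-by-step
      replication instructions.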

Reusability

For the “Artifacts Evaluated – Reusable” badge, all requirements for the “Artifacts Evaluated – Functional” badge need to be met as a prerequisite. When submitting your artifact to HotCRP, you are required to argue whether and why your artifact should receive an “Artifacts Evaluated – Reusable” badge. A typical reusable artifact is expected to exhibit one or more of the following characteristics:

  • The artifact is highly automated and easy to use.
  • It is comprehensively documented, and the documentation describes plausible scenarios for how it could be extended.
  • The artifact contains everything necessary for others to extend it. For example, a tool artifact includes its source code, all dependencies that are not commonly available, and working instructions for compiling it. Containers or virtual machines with all requirements pre-installed are preferred.
  • The README should contain or point to other documentation that is part of the artifact and describes use case scenarios or details beyond the scope of the paper. Such documentation is not limited to text; for example, a video tutorial could demonstrate how the artifact could be used and evaluated more generally.

In general, the wide variety of artifacts makes it difficult to come up with an exact list of expectations. The points above should be seen as a guideline for authors and reviewers of what to provide and what to expect. In case of any doubt, feel free to contact the Artifact Evaluation Chairs.

Distinguished Artifact Awards

Artifacts that go above and beyond the expectations of the Artifact Evaluation Committee will receive a Distinguished Artifact Award.

FAQ

  • Is the reviewing process double-blind? No, the reviewing process is single-blind: the reviewers will know the authors’ identities, while the reviewers’ identities are kept hidden from the authors. Authors can thus submit artifacts that reveal their identities.
  • How can we submit an artifact that contains private components (e.g., a commercial benchmark suite)? One option is to upload only the public part of the artifact to Zenodo and to share a link to the private component that is visible only to the reviewers, by specifying the link in the “Bidding Instructions and Special Hardware Requirements” HotCRP field. If this is not possible, another option is to provide reviewers access to a machine on which they can also interact with the artifact’s private component. Both options must adhere to the single-blind reviewing process (i.e., they must not reveal the reviewers’ identities). Whether an “Availability” badge will be awarded for partially available artifacts will be determined based on the AEC’s evaluation.

Contact

If you have any questions or comments, please reach out to the Artifact Evaluation Chairs.

Call for Artifact Evaluation Committee Nominations

We are pleased to invite you to contribute your expertise as a reviewer on the Artifact Evaluation Committee for ISSTA 2024. This role is an excellent opportunity for senior PhD students, postdocs, and professionals who have previously engaged in the AE process, either as authors or reviewers. While such experience is highly valued, it is not a strict requirement.

What We’re Looking For

  • Expertise in Software Testing and Analysis: Your specialized knowledge in these fields is crucial for providing high-quality reviews.
  • Commitment to Constructive Evaluations: We value your dedication to providing detailed and constructive feedback.
  • Availability: The ability to review two to three artifacts between July 5 and July 24, 2024.

Reviewer Responsibilities

  • Evaluate Artifacts Thoroughly: Assess the quality, reproducibility, and relevance of the submissions.
  • Provide Clear, Constructive Feedback: Your insights will significantly assist authors in improving their work.
  • Participate in Committee Discussions: Engage in deliberations to determine the allocation of artifact badges.

Benefits of Being a Reviewer

  • Stay Informed on Emerging Research: Keep abreast of the latest developments in software testing and analysis.
  • Network with Peers: Connect with other senior researchers and professionals in the field.
  • Earn Recognition: Your contributions will be acknowledged on the ISSTA 2024 website and in conference proceedings.

Application Process

  • Self-Nomination: Please express your interest by filling out this Google form.
  • Application Deadline: January 31, 2024.
  • Decision Notification: Decisions on reviewer selections will be communicated at the beginning of March, 2024.

Your expertise and contributions as a reviewer are invaluable to the success of ISSTA 2024. We eagerly anticipate your participation and look forward to receiving your application.

Questions? Use the ISSTA Artifact Evaluation contact form.