ECOOP 2022
Mon 6 - Fri 10 June 2022 Berlin, Germany

Traditionally, technical research papers are published without any accompanying artifacts (tools, data, models, videos, etc.), even though these artifacts may serve as crucial and detailed evidence for the quality of the results that the associated paper offers. Artifacts support the repeatability of experiments and precise comparison with alternative approaches, thus raising the quality of the research area as a whole. They may also make it easier for other researchers to perform their own experiments, thus helping the original authors disseminate their ideas in detail. Hence, artifacts should be taken seriously and recognized separately.

The AE process at ECOOP 2022 is a continuation of the AE process at previous ECOOP editions and at several other conferences, including ESEC/FSE, OOPSLA, PLDI, ISSTA, HSCC, and SAS; see the authoritative Artifact Evaluation for Software Conferences website.

Call for Artifacts

Research artifacts are digital objects that were either created by the authors of a research article to be used as part of their study or generated by their experiments (adapted from https://www.acm.org/publications/policies/artifact-review-and-badging-current).

It has become common practice at many conferences to offer the authors of accepted papers an artifact evaluation, i.e., a systematic review of their research artifacts prior to publication. The benefit for authors is (at least) two-fold: (a) each positively evaluated research artifact can be trusted to fulfill certain quality standards and (b) the positive evaluation outcome is documented with a badge on the published paper.

ECOOP has a long-standing tradition of artifact evaluation, dating back to 2013. This year, for the first time, the offer is no longer restricted to accepted papers: we offer artifact evaluation for every single submission to ECOOP 2022. In addition to providing feedback on the artifacts, we will make positive evaluation results available to the technical PC to be taken into consideration in the paper peer review process, so that artifact submissions can help improve the overall review score.

Artifact Preparation Guidelines

When preparing artifacts, we encourage you to read the HOWTO for AEC Submitters. We would also like to give artifact authors general guidance on what makes an artifact good or bad, together with suggestions for good practices; a minimal packaging sketch follows the lists below. In a nutshell, committee members want artifacts that:

  • Contain all dependencies (Docker image / VM)
  • Have few setup steps
  • Have getting started guides where all instructions are tested
  • Include some documentation on the code and layout of the artifact
  • Have a short run reviewers can try first (several minutes max)
  • Show progress messages (percentage complete) during longer runs

Authors should avoid:

  • Downloading content over the internet during experiments or tests
  • Closed source software libraries, frameworks, operating systems, and container formats
  • Experiments or tests that run for multiple days. If the artifact takes several days to run or requires special hardware, please get in touch with the AEC chairs, let us know of the issue, and provide us with (preferably SSH) access to a self-hosted platform on which the artifact can be evaluated. If this is a viable option for your artifact, you can also provide the full artifact together with a reduced input set (in addition to the full set) that partially reproduces your results in a shorter time.

Authors of Proof Artifacts: We encourage authors and reviewers of mechanized proofs to consult the recent guidelines for submitting and reviewing proof artifacts.
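To make the recommendations above concrete, here is a minimal sketch of a self-contained, Docker-based packaging. It is only an illustration under assumed names: the image tag ecoop22-artifact and the scripts run_short.sh and run_full.sh are hypothetical placeholders, not something prescribed by the AEC.

    # Minimal sketch of a self-contained Docker packaging (all names hypothetical).
    # The Dockerfile pins a base image and installs every dependency at build time,
    # so nothing is downloaded from the internet during the experiments themselves.
    cat > Dockerfile <<'EOF'
    FROM ubuntu:20.04
    RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
     && rm -rf /var/lib/apt/lists/*
    COPY . /artifact
    WORKDIR /artifact
    # Default command: the short "kick-the-tires" run (a few minutes at most).
    CMD ["./run_short.sh"]
    EOF

    docker build -t ecoop22-artifact .

    # Reviewers try the short run first; the full experiments should print
    # progress messages (e.g., "completed 40/200 benchmarks") during long runs.
    docker run --rm ecoop22-artifact
    docker run --rm ecoop22-artifact ./run_full.sh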

Artifact Packaging Guidelines

When packaging your artifact for submission, please take the following into consideration: your artifact should be as accessible to the AEC members as possible, and it should be easy for them to quickly make progress in the evaluation. Please provide some simple scenarios describing concretely how the artifact is intended to be used; for a tool, this would include specific inputs to provide or actions to take, and the expected output or behavior in response. In addition to these tightly controlled scenarios that you prepare for the AEC members to try out, it is very useful to suggest some variations along the way, so that the AEC members can see that the artifact is robust enough to tolerate experimentation.

To avoid problems with software dependencies and installation during artifact review, artifacts must be made available either as a Docker image (https://www.docker.com/) or as a virtual machine image in OVF/OVA format containing the artifact already installed. The artifact must be provided as a self-contained archive file, using a widely supported archive format (e.g., zip, tgz). Please use widely supported open formats for documents, and preferably CSV or JSON for data.
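For a Docker-based artifact, exporting to such a self-contained archive can be as simple as the following sketch (the image and file names are again hypothetical placeholders):

    # Export the image, including all dependencies, into one archive file.
    docker save ecoop22-artifact | gzip > ecoop22-artifact.tar.gz

    # Reviewers can restore and run it without any further network access:
    docker load -i ecoop22-artifact.tar.gz
    docker run --rm ecoop22-artifact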

Double-blind artifact review: Please note that artifact evaluation for ECOOP 2022 will be double-blind. Please make sure you do not reveal author identities, e.g., through user names in VM/Docker file systems.
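A quick, non-exhaustive way to check for such leaks before submission is to inspect the image from the inside; this sketch assumes the hypothetical image name from above, and /path/to/artifact is a placeholder for your unpacked artifact directory:

    # Look for telltale user names and home directories inside the image.
    docker run --rm ecoop22-artifact sh -c 'whoami; ls /home /root 2>/dev/null'

    # Grep the unpacked artifact for e-mail addresses that could identify authors.
    grep -rEl '[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}' /path/to/artifact | head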

Artifact Submission

Submission link: https://ecoop22aec.hotcrp.com/

Every submission must include:

  • A Markdown-formatted file providing an overview of the artifact. Please use this template; a purely illustrative outline is sketched after this list.
  • A URL for downloading the artifact. Please make sure to use hosting platforms for your artifacts that do not track IP addresses, as this would undermine the double-blind review process.
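Purely for orientation, a hypothetical overview file might cover points such as the following. This is NOT the official template; use the template linked above for actual submissions.

    # Hypothetical outline of the overview file (NOT the official template,
    # which is linked above and should be used for actual submissions).
    cat > overview.md <<'EOF'
    # Artifact overview: <paper title, anonymized>

    ## Download
    <URL of the archive, hosted on a platform that does not track IP addresses>

    ## Claims supported by the artifact
    <which results, tables, and figures of the paper the artifact backs up>

    ## Getting started ("kick the tires")
    <tested, step-by-step instructions for the short run>
    EOF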

Review Process

Submitted artifacts will go through a two-phase evaluation:

  • Kick-the-tires: Reviewers check the artifact's integrity and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, a VM that won't start, immediate crashes on the simplest example). Authors are informed of the outcome and given a 4-day period to respond (see Kick-the-Tires Response Period below).
  • Artifact assessment: Reviewers evaluate the artifacts, checking whether they live up to the claims the authors make in the accompanying documentation. If they do, the positive result of the evaluation, along with a judgment of the importance of these claims for the novelty and technical soundness of the submitted papers, will be taken into account in the paper review process.

An artifact may be awarded one or more of the following badges (see ACM Artifact Review and Badging for details):

  • Functional - The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.
    For the artifact evaluation, we rely on the claims that authors submit along with the artifact. For inclusion of these results in the paper review process, the role of these claims for the novelty and soundness of the paper will be evaluated.

  • Reusable - The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing are facilitated.
    For the evaluation of the artifact’s reusability, we will rely on the reuse scenarios that the authors describe in their documentation. For inclusion of the results in the paper review process, the relevance of the documented scenarios will be judged by the PC.

  • Available - Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI for the object is provided.
    Repositories used to archive data should have a declared plan to enable permanent accessibility (e.g., as in the retention policies for Zenodo or FigShare). The artifact’s DOI or URL must be included in the camera-ready version of the paper.

Artifacts that exceed these quality expectations will receive a Distinguished Artifact award. The selection procedure will be based on review scores and feedback from the artifact evaluation committee.

Kick-the-Tires Response Period

Authors will be given a 4-day period to read and respond to the kick-the-tires reports of their artifacts. Authors may be asked for clarifications in case the committee encounters problems that may prevent reviewers from properly evaluating the artifact. The author response period will resemble paper rebuttals, i.e., there will be no interactive discussions. AEC members will try to phrase any issues they encounter with an artifact as concisely as possible, and authors are expected to address these issues in a single response.

Notes

The evaluation process is based on and consistent with the ACM Artifact Review and Badging policy and NISO's guidelines for reproducibility badging. However, neither ACM nor NISO is involved in the implementation or the evaluation process on behalf of ECOOP.