This page provides relevant information for authors.
JHPS has a rapid publication process.
The stages of the submission process for JHPS are as follows:
- Validation phase: Our initial non-public quality check of the submitted manuscript, leading to tentative initial acceptance or immediate rejection. If the article draft meets the required quality standards and is appropriate for the scope of the journal, it is published in our incubator area.
- Review phase: The manuscript is open for comments from peers and the broader public according to our guidelines for reviewers. The author should react to the feedback and suggestions of the reviewers and make changes as necessary. We consider this exchange of ideas to be an extended community shepherding.
- Acceptance: The article will be assigned to an issue and must be uploaded to Zenodo providing appropriate metadata regarding the journal.
- Feedback phase: The commentable version of the article remains online in the issue section for comments from the public. For at least six months, the authors must consider these comments and respond accordingly. This may lead to revised versions of the article – these will remain published in the same issue but we will inform the readers of updates.
More information is provided in the detailed review process.
The following criteria are considered for the acceptance of the article:
- Scope: Does it address one or multiple topics relevant for the readers?
- Significance: Is the article significant for the respective topic, i.e., contributing to the state of the art?
- Readability: Is the technical English understandable and the article well structured to follow a narrative? JHPS publishes only in English.
- Presentation: Is the general presentation appealing, e.g., diagrams, images?
- References: Is important related work covered and discussed in the article? Are references up to date?
- Correctness: Are conducted experiments correct (also in their workflow description) and is the article convincing?
For acceptance into the incubator, the lead reviewer must conclude that the paper meets a quality standard that makes approval of the manuscript after the community review likely. JHPS will only accept articles for final publication that meet a high standard in the aforementioned criteria.
In detail, the timeline for the publication and the process is organized as follows:
- Preparation of the manuscript
- A manuscript must have at least eight pages in the given format (excluding references).
- Typically, articles range from 16 to 34 pages. The journal has no explicit page limit.
- You should follow the description of reproducible workflows.
- Ensure you follow our ethical guidelines.
- There are two alternative publication templates for the authors: a Word template in Google Doc and a LaTeX template. The preparation and interaction are slightly different for them, but the general workflow is the same.
- Template in LaTeX: clone our GitHub repository. Check the documentation of the template in the repository. In this case, the reproducible workflow can be stored in the same repository. Note that we have created a workflow that combines the advantages of Google Doc for the review with the benefits of LaTeX typesetting and Git version control. In a nutshell, a Google Doc mirrors a LaTeX file and allows for comments, while a plugin synchronizes the file with a Git repository and allows publication of revised paper versions. We will set up the Google Doc for you.
- Word template in Google Doc: you may edit it using Google Docs. Clone the existing template (select “File”, “Create a Copy”). Check the documentation of the style in the document.
- We use the JHPS Manuscript Central for managing the manuscripts.
- The main author should register and submit the manuscript via the provided form, including the following information:
- A link to the manuscript draft in one of the required styles.
- A link to a GitHub repository with the reproducible analysis workflow and optionally, the scripts to repeat the experiments.
- Validation phase
- Within two hours, you should receive an automatic confirmation (please check your SPAM folder).
- Within seven business days, an initial check for validity is made and a lead reviewer will be assigned. The role of the lead reviewer is to check that the manuscript meets the minimum quality standard of the journal.
- Within 14 days, the lead reviewer will report to the editorial office if the article is a candidate for acceptance to the incubator or if it must be rejected due to a mismatch of topic or quality standards. The editorial office will report back to the main author.
- The article candidate will be published in our incubator area. We will clone your Google Doc and repository, granting you and your co-authors write access to the copied documents.
- Review phase
- The article remains in the incubator for at least one month and until at least five open reviews have been added to the article voting for acceptance – at least one review must be from an editorial board member.
- Note that if the author does not make improvements, the article can remain in the incubator for a longer period of time.
- Papers for which the author does not respond to comments will be removed after a month of inactivity.
- The authors may withdraw their article any time during the review phase by email, which will lead to the removal of the article from the webpage.
- Acceptance
- The article will be assigned to the next issue of the journal. Authors are notified and receive details of the uploading process for Zenodo, including the issue metadata referencing the journal. This metadata will look like “The Journal of High-Performance Storage, Issue 1, 2019, https://jhps.vi4io.org/issues/1/”.
- The authors should update the acknowledgment section, taking into consideration the role of the reviewers and their relevant contributions.
- The author must update the footnote of the article according to the issue metadata.
- The author must upload the current version of the article to Zenodo providing the metadata regarding the journal.
- Enter all relevant metadata of the article, e.g., author information.
- Add the community: “Journal of High-Performance Storage”
- Select as Upload type: “Publication”
- In “Basic Information”, click on “Reserve DOI”
- Under “Journal”, add the issue metadata we provided to you with the acceptance email.
- The author must reply to the email of the acceptance notification providing the DOI from Zenodo.
- The article will then be published in the next issue of the journal.
The webpage will include metadata of the article:
- BibTeX, links to the PDF, and the DOI on Zenodo.
- The link to the commentable version of the article.
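As an illustration, the BibTeX entry for an article might look roughly like the following sketch. The author names, title, and DOI are placeholders; the journal name and issue URL follow the example issue metadata shown above.

```bibtex
@article{doe2019example,
  author  = {Doe, Jane and Doe, John},
  title   = {An Example Article Title},
  journal = {The Journal of High-Performance Storage},
  number  = {1},
  year    = {2019},
  doi     = {10.5281/zenodo.0000000},
  url     = {https://jhps.vi4io.org/issues/1/}
}
```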
- Feedback phase
- The community may provide further comments on the article in the issue section. Thus, we preserve the Google Doc (and LaTeX) revisions for further input from the community.
- For at least six months, the authors must consider these comments and respond accordingly. We will sanction future submissions from authors who do not fulfill this responsibility.
- The feedback may lead to a minor revision of the article that will need to be updated on Zenodo.
JHPS is committed to the ethical standards of publications. Authors must accept their responsibilities:
- The article is the original work of the listed author(s) and does not omit the authorship of relevant contributors.
- The article must not be currently submitted to any other journal or academic venue.
- Already published artifacts (e.g., diagrams, text snippets) that are used in the article must be referenced correctly. That includes previously published articles by the authors or any third-party author.
- The supplied reproducibility workflow and data are exactly as executed on the system.
- Authors must reply to the comments provided by the community and consider addressing them in revisions of the article.
The journal will react harshly to plagiarism and copyright infringements and reserves the right to remove articles from published issues in such cases.
The aim of the JHPS initiative is to bring reproducibility and validation of experiments into the digital age by enabling experiments to be digitally reproduced. Digital reproductions are perfect and do not degrade with the number of copies. This is in contrast to analog reproductions which degrade with each copy of a copy.
For experiments to be digitally reproduced like music, one has to have a name for the experiment (i.e., a URL), a player, and a standard format that instructs the player how to reproduce the experiment. Digitally reproducing experiments is not a new concept and has been in practice in the DevOps community for years in the context of testing. It is called Continuous Integration (CI), a practice that provides testing as a service by executing software delivery pipelines whenever a new change is pushed into a software repository. Popular examples are Travis, Jenkins, and GitLab CI. Such a CI service turns out to be a convenient entry point for “playing” an experiment, not only for the authors but also for reviewers so they can easily validate reproducibility. This is why we also call digital reproducibility “automated reproducibility.”
While the JHPS Reproducibility Initiative is agnostic to the tools that are used to allow playing an experiment, we promote a standard format specification and an infrastructure for digitally reproducing experiments. The standard format specification is the GitHub Actions (GHA) workflow language, which is based on HCL. The publicly available open-source GHA toolset provides experimenters with a myriad of reusable components and tools that greatly reduce the time it takes to implement and replay experiments associated with a scientific exploration. Popper, one such tool, conveniently generates configuration files for various CI services, allowing authors to share a URL that others can use to replay and inspect the output of an experiment.
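As a rough sketch (not an official JHPS template), a minimal workflow in the HCL-based GHA format might look like the following; the Docker image and script path are hypothetical placeholders for an author's own analysis.

```hcl
workflow "paper experiments" {
  resolves = ["run analysis"]
}

# Each action runs in a container, so reviewers can replay it
# on any machine or CI service without additional setup.
action "run analysis" {
  uses = "docker://python:3.7-slim"
  runs = ["python", "workflows/analysis.py"]
}
```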
Our experience is that experiments specified this way are not only reproducible but also significantly increase the personal productivity of students and researchers, even though this does require some curiosity about well-established tools that have recently revolutionized the world of software engineering. We will do our best to support authors who are willing to try out this process.
Levels of Reproducibility
JHPS annotates the papers with badges according to the level of reproducibility that can be achieved:
- Silver: The analysis workflow is reproducible, i.e., 1) the reviewer can check and understand the steps of the analysis workflow and 2) is able to run and verify the analysis workflow on some other publicly available infrastructure or their own machine without any additional coding. This could be achieved using scripts, Popper, or Docker.
- Gold: Silver plus the experimental workflow is reproducible but the access to the test environment is restricted, i.e., the reviewer convinced themselves that they can run the experimentation workflow on a restricted environment if they had access to it. This includes that the reviewer can read and understand the steps of the experimentation workflow.
Theoretical papers also fall into this category but require that any generated graph can be recreated and all required data is included. A theoretical paper cannot achieve the Platinum badge.
- Platinum: Gold plus the reviewer is able to run and verify the experimentation workflow on some publicly available infrastructure without any additional coding, e.g., on a continuous integration infrastructure like Travis.
We recognize that the mandatory execution of workflows might be a constraint that inhibits execution on some systems. Therefore, to meet the requirements for our badges, authors can still conduct their analysis or experiments using a traditional set of scripts, as long as they meet the criteria.
JHPS does not accept papers with experimental data that do not meet a minimal level of reproducibility. Theoretical papers that include any automatically generated graph must include a reproducible workflow to recreate the graphic.
Getting Started with Popper
To get started with the syntax of the GitHub Actions (GHA) workflow language, visit the workflow language and runtime documentation.
Popper is a tool for executing GHA workflows. Please take a look at the official documentation, in particular the Getting Started guide, for a 5-minute hands-on tutorial on how to write and run your first GHA workflow.
Read this “How To” guide on writing a workflow for an existing set of experimentation scripts.
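For instance, an existing set of experimentation shell scripts could be wrapped in a workflow roughly like the following sketch; the image and script paths are hypothetical. Running `popper run` in the repository would then execute the chain of actions.

```hcl
workflow "existing experiment scripts" {
  resolves = ["plot results"]
}

action "run experiment" {
  uses = "docker://ubuntu:18.04"
  runs = ["bash", "scripts/run_experiment.sh"]
}

# Depends on the experiment step; executes only after it succeeds.
action "plot results" {
  needs = ["run experiment"]
  uses = "docker://ubuntu:18.04"
  runs = ["bash", "scripts/plot_results.sh"]
}
```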
A list of workflow examples can be found on the webpage.
If you have any questions regarding writing GHA workflows or the use of Popper, please do not hesitate to ask on the public Gitter channel or to contact the board.