Following the successful introduction of the artifact evaluation process at MLSys'19 and related initiatives at NeurIPS'19 and ASPLOS'20 (where more than half of the accepted papers submitted artifacts for evaluation), we will continue the reproducibility initiative at the Conference on Machine Learning and Systems 2020.
The Conference on Machine Learning and Systems promotes reproducibility of experimental results and encourages code and data sharing to help the community quickly validate and compare alternative approaches. Authors of accepted papers are invited to formally describe supporting materials (code, data, models, workflows, results) using the standard Artifact Appendix template and submit it to the Artifact Evaluation process (AE).
Note that this submission is voluntary and will not influence the final decision on your paper. The goal is to help authors validate the experimental results from their accepted papers through an independent AE Committee in a collaborative way, and to help readers find articles with available, functional, and validated artifacts. For example, the ACM Digital Library already allows one to search for papers with available artifacts and reproducible results.
Please prepare your artifacts and appendix using the following guidelines. You can then submit your paper with the standard artifact appendix via the dedicated MLSys AE website before January 15, 2020. Your submission will be reviewed according to the following guidelines. Please do not forget to describe the required hardware, software, data sets, and models in your artifact abstract - this is essential for finding appropriate evaluators! You can find examples of Artifact Appendices in these MLSys'19 papers.
- Artifact Evaluated (Functional)
At the end of the process, we will explain how to add these badges and the artifact appendix to your camera-ready paper.