The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD) released its paper acceptance results this week, and many researchers whose papers were rejected have taken to social media to question the review process.
As one of the world’s top international conferences in data mining, KDD is known for a strict paper review process that yields an annual acceptance rate of no more than 20 percent. This year’s figure is just 14 percent, down from 18 percent in 2018.
Paper submission requirements became more stringent this year. KDD 2019 adopted a double-blind review system, with authors’ names and organizational information concealed from reviewers. Additionally, KDD 2019 introduced reproducibility as “an important factor in the review process of the paper,” and only papers with a two-page appendix covering reproducibility would be considered for the Best Paper Award. However, there seems to have been some confusion as to whether, for example, GitHub links would satisfy this requirement.
University of Pittsburgh researcher Konstantinos Pelechrinis tweeted that his submitted paper included a link to a GitHub repository with data and code to fulfill the reproducibility requirements, but was rejected for lack of a specific reproducibility appendix.
Some people joked that the reviewers might have used the April Fool’s online spoof autoreject.org to automatically generate the reviews.
Whether these complaints are reasonable or merely sour grapes, the fact remains that this year’s KDD acceptance rate is much lower than those of other top-tier computer science conferences. While a low acceptance rate can signal high standards, clearly not everyone shares this view.
KDD 2019 will be held August 4 to 8 in Anchorage, Alaska.
Author: Reina Qi Wan | Editor: Michael Sarazen