UAI 2017 - Reviewing Criteria
Below we provide guidelines for reviewers on how to write reviews, covering both the content of reviews and how the numerical scoring system works.
Writing Reviews: Content
For each paper you will provide written comments under each of the headings below. Your review should address both the strengths and weaknesses of the paper: identify the areas where you believe the paper is particularly strong and those where it is particularly weak. This will be very valuable to the program chairs and the SPC.
Summary
In this section, please explain in your own words the problem the paper is trying to address and the approach it takes. Where do the main accomplishments of the paper lie? Optionally, you may also include a summary of your opinion of the paper's impact and significance.
Novelty
This is arguably the single most important criterion for selecting papers for the conference. Reviewers should reward papers that propose genuinely new ideas, papers that truly depart from the "natural" next step in a given problem or application. We recognize that novelty can sometimes be relative, and we ask reviewers to assess it in the context of the respective problem or application area. It is not the duty of the reviewer to infer what aspects of a paper are novel; the authors should explicitly point out how their work is novel relative to prior work. Assessment of novelty is obviously a subjective process, but as a reviewer you should try to assess whether the ideas are truly new, or whether they are novel combinations, adaptations, or extensions of existing ideas.
Technical Quality
Are the results technically sound? Are there obvious flaws in the conceptual approach? Are claims well supported by theoretical analysis or experimental results? Did the authors ignore (or appear unaware of) highly relevant prior work? Are the experiments well thought out and convincing? Are there obvious experiments that were not carried out? Will it be possible for other researchers to replicate these results? Are the data sets and/or code publicly available? Is the evaluation appropriate? Did the authors discuss the sensitivity of their algorithm/method/procedure to parameter settings? Did the authors clearly assess both the strengths and weaknesses of their approach?
Significance
Is this really a significant advance in the state of the art? Is this a paper that people are likely to read and cite in later years? Does the paper address an important problem (e.g., one that people outside UAI are aware of)? Does it raise new research issues for the community? Is it a paper that is likely to have lasting impact? Is this a paper that researchers and/or practitioners might find useful 5 or 10 years from now? Is this work that other researchers can build on?
Clarity and Quality of Writing
Please make full use of the range of scores for this category so that we can identify poorly written papers early in the process. Is the paper clearly written? Does it adequately inform the reader? Is there good use of examples and figures? Is it well organized? Are there problems with style and grammar? Are there issues with typos, formatting, references, etc.? It is the responsibility of the authors to write clearly; it is not the duty of the reviewers to extract information from a poorly written paper. Do not assume that the authors will fix problems before the final camera-ready version is published: unlike journal publications, there will not be time to carefully check that accepted papers are properly written. Think of future readers trying to extract information from the paper; it may be better to advise the authors to revise the paper and submit to a later conference than to accept and publish a poorly written version. However, if the paper is likely to be accepted, feel free to make suggestions to improve its clarity, and provide details of any typos you have found.
Other Comments for the Authors
This is an optional section on the review form for additional comments to the authors that do not naturally fit into any of the sections above.
Confidential Comments to the SPC and Program Chairs
This is another optional section. If there are any comments that you would like to communicate to the SPC and program chairs, but that you do not wish the authors to see, they can go in this section. In particular, you can include rankings or comparisons among the papers you have reviewed for UAI.
Please be as precise as you can in your comments to the authors and avoid vague statements. Your criticism should be constructive where possible: if you are giving a low score to a paper, explain clearly to the authors what actions they could take to improve their paper in the future. For example, if you think the work is incremental relative to prior work, cite the specific prior work you are referring to. Or if you think the experiments are not very realistic or useful, let the authors know what they could do to improve them (e.g., more realistic data sets, larger data sets, different evaluation metrics, sensitivity analyses, etc.).
Writing Reviews: Numeric Scoring
For UAI 2017 we are using a 10-point scoring system. We strongly encourage you to use the full range of scores, if appropriate for your papers. Try not to put all of your papers in a narrow range of scores in the middle of the scale, e.g., 3’s, 4’s, and 5’s. Don’t be afraid to assign 1’s/2’s, or 9’s/10’s, if papers deserve them. If you are new to the UAI conference (or have not attended for a number of years) you may find it useful to take a look at online proceedings from recent UAI conferences to help calibrate your scores. The scoring system is as follows: