Call for Tutorials - Review

By Yanina Bellini Saibene

January 18, 2023

When the call for tutorials closes, we will have the proposals to review, but to be ready at that stage we have to complete some tasks first:

Invite the reviewers

You can start inviting reviewers while you are writing the call. In the invitation, you should include the review due date and an estimate of how much time the task will take. Here is an example of an invitation email; we used this text for useR! 2021:

Template email invite reviewer

Subject: useR! 2021 - Invitation to program committee

Text: Hi,

We are pleased to announce that {conference name}, the next International {conference name}, will take place online on {place, date and time}.

Given your expertise in R, we would like to invite you to be a member of the {conference year} Program Committee. This will greatly help us to consolidate the preferences for the contributions that will be presented during {conference year}.

We would like you to review some one-page tutorial proposals if you accept this invitation. At this time, it is not possible to anticipate how many proposals we will assign you. However, we will do our best to keep your service as a program committee member to a maximum of eight hours.

We expect that your service will take place approximately during the last three weeks of {month and year} once the tutorial submission process has been completed. We thank you for considering this invitation and hope that you might be able to contribute to the conference.

Should you have any questions, please do not hesitate to contact us.

Looking forward to hearing from you at your earliest convenience, before {date}.

Best wishes,

{Name of the chairs of the program committee}

Chairs of the Program Committee

These reviewers will be part of the Program Committee. They will carefully evaluate each submission’s overall quality, research scope, technical content, pedagogical proposal, and appeal to a reasonable fraction of the conference community. Remember, we detailed all these aspects in the call.

I recommend at least two reviews for each submission. In case of a significant discrepancy between the two reviewers, you can request a third review. Because the review is not blind (the instructors' identities are known), keep in mind that a reviewer may have a conflict of interest with the authors; in that case, the proposal should be reviewed by someone else.

You should consider these aspects when deciding how many reviewers to invite; the sketch below shows one way to make the assignments while respecting declared conflicts.
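
If you track the proposals, the reviewer pool, and the declared conflicts of interest in plain data frames, the assignment itself can be scripted. Here is a minimal sketch in R; the inputs (`proposals`, `reviewers`, `conflicts`) and their structure are hypothetical, not part of any conference's tooling. It gives each proposal to the two least-loaded reviewers without a declared conflict:

```r
# Assign `n_reviews` reviewers to each proposal, skipping declared
# conflicts of interest and balancing the workload.
# `proposals`: character vector of proposal IDs (hypothetical)
# `reviewers`: character vector of reviewer names (hypothetical)
# `conflicts`: data frame with columns `proposal` and `reviewer`
assign_reviewers <- function(proposals, reviewers, conflicts, n_reviews = 2) {
  load <- setNames(rep(0, length(reviewers)), reviewers)
  out <- data.frame()
  for (p in proposals) {
    excluded <- conflicts$reviewer[conflicts$proposal == p]
    eligible <- setdiff(reviewers, excluded)
    # Pick the least-loaded eligible reviewers for this proposal
    chosen <- names(sort(load[eligible]))[seq_len(n_reviews)]
    load[chosen] <- load[chosen] + 1
    out <- rbind(out, data.frame(proposal = p, reviewer = chosen))
  }
  out
}

# Example: "Ana" declared a conflict with proposal T02
assign_reviewers(
  proposals = c("T01", "T02", "T03"),
  reviewers = c("Ana", "Luis", "María", "Jo"),
  conflicts = data.frame(proposal = "T02", reviewer = "Ana")
)
```

The least-loaded rule also helps keep every reviewer close to the time budget promised in the invitation email.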

Create a rubric or criteria for the review

Reviewers will fill out a form with a review rubric/criteria to help choose the final list of tutorials. Here is an example of potential criteria (we used similar measures at useR! 2021 and JAIIO); a sketch after the list shows how to turn these ratings into a ranking:

  • Relevance: here, reviewers address how relevant and interesting the tutorial is for this conference edition. Based on the proposed topic, target audience, language, and motivation, they indicate the tutorial’s relevance for the conference. The values for the scoring are: Very good, Good, Acceptable, Weak, and Very weak.

  • Pedagogy: indicate the clarity and level of the pedagogical proposal of the tutorial, with emphasis on the description, outline, learning objectives, duration, and target audience. Include your rating of materials such as websites and GitHub repositories, if they exist. The values for the scoring are: Very good, Good, Acceptable, Weak, and Very weak.

  • Originality: according to the proposed topic, target audience, pedagogical proposal, language, and motivation, indicate the tutorial’s originality for the conference. Reviewers have to check whether the tutors offered the tutorial in past editions. The values for the scoring are: Very good, Good, Acceptable, Weak, and Very weak.

  • Accessibility: considering our accessibility guidelines for tutorials, evaluate the tutorial’s accessibility proposal and the accessibility of any linked materials. The values for the scoring are: Very good, Good, Acceptable, Weak, and Very weak.

  • Instructor: brief comments on the instructor’s qualifications and the quality of their instruction. In addition, it is helpful to know details such as:

    • The tutor’s experience working with the global community or in their region: for example, for useR!, we take into account whether they are R-Ladies or RUG organizers.
    • The scope and popularity of their courses or materials: for example, for JAIIO, we gather information on whether they are SADIO instructors.
    • Any knowledge of their instructional ability: for LatinR, we take into account whether they are certified instructors or trainers with The Carpentries or RStudio.
    • Any other information that may be of interest when ranking the tutorial: for example, for useR!, we tried to make room for tutors new to the conference and to have geographical and language representation.

  • Overall Evaluation: this is an overall evaluation of the proposal. The values are: Very good (must be accepted), Good (I recommend that it be accepted), Limit (at the limit of what is acceptable), Weak (I recommend rejecting it), and Very weak (I strongly recommend that it be rejected).

  • Brief comments for the program and content tutorial committee: reviewers can explain their scoring or give more details for the program committee, such as suggesting a change in the type of contribution. This is the place to describe details that matter for the selection.

  • Brief comments for the authors of the proposal: these comments provide constructive feedback to the authors so they can improve the tutorial and future submissions.

  • Level: it is a good idea to ask reviewers to categorize the proposal according to the level they think the tutorial audience should have and to compare it with what the authors state. The values can be Beginner, Intermediate, and Advanced, or Novice, Competent practitioner, and Expert.
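
When the forms come back, these qualitative labels are easier to compare if you map them to numbers and average them per proposal. A minimal sketch in base R, where the `reviews` data frame and its columns are hypothetical:

```r
# Map the rubric labels to numbers (5 = Very good ... 1 = Very weak)
rubric_scale <- c("Very good" = 5, "Good" = 4, "Acceptable" = 3,
                  "Weak" = 2, "Very weak" = 1)

# One row per review; the data frame and column names are made up
reviews <- data.frame(
  proposal  = c("T01", "T01", "T02", "T02"),
  relevance = c("Very good", "Good", "Acceptable", "Weak"),
  pedagogy  = c("Good", "Good", "Very good", "Acceptable")
)

# Average the criteria within each review, then the reviews per proposal
reviews$score <- (rubric_scale[reviews$relevance] +
                  rubric_scale[reviews$pedagogy]) / 2
ranking <- aggregate(score ~ proposal, data = reviews, FUN = mean)
ranking[order(-ranking$score), ]
```

A large gap between the two scores for the same proposal is also a convenient trigger for the third review mentioned above.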

You will share these review criteria with the reviewers in the email you send with the review instructions. Here is an example of that email:

Template email to send reviewers the proposals and review instructions

Text: Dear {name},

Thank you for agreeing to review tutorial proposals. We are now sending you {the total number of proposals} to review.

{explanation on how to access and use the review infrastructure}

The review criteria are:

{review criteria}

Please complete the reviews by {date}, and please reach out early if there is any problem. Thank you again for your time, and please don’t hesitate to contact us if anything is unclear.

Kind regards,

{conference year} Program Committee

{committee members' names}

To keep the process on schedule, it is advisable to communicate with the reviewers while the reviews are underway. You can remind them of the due dates and ask whether you can help in any way. Here is an email example (a sketch for filling it in programmatically follows the template):

Template email for review reminder for tutorials

Subject: [Friendly Reminder] Tutorial {conference year} review

Text: Hi,

Thank you for agreeing to review tutorial abstracts for {conference year}.

We are writing to you since we are approaching the tutorial abstract review deadline ({date}). Meeting this date is critical for us to carry out the final selection process on time.

Please let us know if you have any questions/issues with the abstract review process; we are happy to help.

Thanks in advance for your time and contribution.

Kind regards,

{conference year} Program Committee {names of the chairs/head of program committee}
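
If you keep the reviewer list and their progress in a spreadsheet, reminders like the one above can be filled in programmatically instead of one by one. A minimal sketch with the glue package; the tracking table, its columns, and the deadline are made-up placeholders:

```r
library(glue)

# Hypothetical tracking table: one row per reviewer
reviewers <- data.frame(
  name    = c("Ana", "Luis", "María"),
  email   = c("ana@example.org", "luis@example.org", "maria@example.org"),
  pending = c(3, 0, 1)
)

# Draft a reminder only for reviewers who still have pending reviews
to_remind <- reviewers[reviewers$pending > 0, ]
glue_data(
  to_remind,
  "To: {email}\nHi {name},\n\n",
  "You have {pending} tutorial review(s) pending; the deadline is {deadline}. ",
  "Please let us know if we can help in any way.",
  deadline = "2021-04-30"  # placeholder date
)
```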

Finally, you should thank the reviewers for their work when the process is finished. Here is a template for this task:

Template email to thank the tutorial reviewers

Subject: Tutorial {conference year} review

Text: Hi,

Thank you so much for completing your reviews for {conference year}. Your work contributes to the organization of a quality, global, and accessible conference.

We very much appreciate your time and dedication. Thank you!

Kind regards,

{conference year} Program Committee {names of the chairs/head of the program committee}

Set up the review infrastructure

This is probably the same system you use for the rest of the conference submissions, but it may also be a solution specific to tutorials.

For example, for useR! 2021, we used a Shiny app plus self-developed spreadsheets for the tutorials and the ConfTool software for the rest of the conference. For useR! 2022, we used REDCap for all submissions and a self-developed Shiny app for the reviews.

In the case of LatinR, we use Google Forms and Google Sheets for the tutorials and EasyChair for the rest of the submissions. For JAIIO, we host our own instance of Open Conference Systems and use it for all submissions and reviews.

All these options have different costs in time, money, and barriers to use. You need to balance their pros and cons in the context of your conference.
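
Whatever tool you pick, it helps if the reviews end up somewhere you can script against. With the Google Forms option, for instance, the responses land in a spreadsheet that you can pull straight into R. A minimal sketch with the googlesheets4 package, where the sheet URL and the `proposal` column are placeholders for your own form:

```r
library(googlesheets4)

# Read the review responses from the spreadsheet linked to the form;
# the URL is a placeholder for your own sheet
reviews <- read_sheet("https://docs.google.com/spreadsheets/d/<sheet-id>")

# Quick progress check: how many reviews has each proposal received?
table(reviews$proposal)
```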

Next steps

In the next post, we will cover the final selection and the communication of the results to the authors, with an emphasis on providing constructive feedback, and how to create the final agenda.