Course Evaluations

Resource Overview

What you need to know to make the most of the student course evaluation process

Have questions?

Contact me today!

Rick Moore

Assistant Director of Assessment and Evaluation

(314) 935-9171


Effective Evaluations

How can you make the most out of student course evaluations?

At WashU, course evaluations are administered by the Office of the Registrar in partnership with individual schools and programs. Details about the logistics of course evaluations can be found at WashU Course Evals. This includes the login to your Course Evaluation Dashboard, contact information for school/program eval administrators, and answers to frequently asked questions.

But there’s more to course evaluations than just the logistics of their administration. Below you can find advice and tips for designing your own custom questions (Question Personalization), encouraging students to complete evaluations, and interpreting the results when they come in.

We also offer confidential one-on-one consultations on the entire course evaluation process, from Question Personalization to using student feedback effectively.


Question Personalization

  • Instructors can add up to 3 customized questions to each of their courses. These questions can be used to gather information on aspects of your course that are not covered by your school or program’s standard evaluation questions. Several pre-defined question types are available, matching the question formats used by each school.
  • This “Question Personalization” period for most semester-long courses begins 11 days before evaluations are opened to students and lasts for one week.
  • You will get an email when the QP period begins that lets you know the exact dates this option is available for your course.
  • Further logistical details are available at WashU Course Evals.
Deciding what to ask
  • Remember, QP is optional; you don’t HAVE to ask anything.
  • If you do decide to ask your own questions, return to your course’s learning goals. Consider asking questions related to the learning goals that are most important to the course.
  • Consider asking about aspects of your course that your program or department’s standard questions don’t cover.
  • Focus on student experience rather than asking students to directly rate the quality of parts of the course; this can help reduce potential bias. For example, instead of asking whether a lecture on a difficult topic was well-designed, ask whether students felt they understood the material better after the lecture.
  • Ask questions whose answers might change how you do something in the course. If students’ responses to a question couldn’t guide you in making changes, you may want to reconsider asking it.
Open vs. Closed Questions

You can ask either open or closed questions in QP.

Open questions:

  • Respondents type in text.
    • E.g. “What did you find useful about our Friday review sessions?”
  • Open questions are good for getting at details and learning things you might not have thought to ask about in a closed question.
  • BUT, many respondents skip open questions, the answers can be difficult to interpret, and analyzing them can be time-consuming in larger classes.

Closed questions:

  • Respondents select an option from the choices provided (i.e. multiple choice).
    • E.g. “To what extent do you agree or disagree with the following statement? ‘The two-stage exam helped me learn.’”
      Strongly agree / Agree / Neither agree nor disagree / Disagree / Strongly disagree
  • Closed questions are easier to analyze than open questions, and students are more likely to respond.
  • BUT, responses to closed questions have less depth than open questions, and do not always provide enough context to make effective change.

In sum, balance the size of your class against your goals when deciding whether to ask open or closed questions. In a smaller class, open questions can help you gather in-depth feedback. In a larger class, closed questions can give you a general overview of your students’ perspectives. You can also schedule a consultation with the CTL or send us draft questions for feedback.

Question Writing Tips

Below are some suggestions to keep in mind as you write your custom questions.

Be clearer than you think you need to be

When designing questions, avoid ambiguities in either the question or the answer choices that might lead different people to interpret the items differently.

Less clear:

“I did the reading.”

1 – 7 (Strongly disagree – Strongly agree)


Clearer:

“I always read the assigned reading completely.”

1 – 7 (Strongly disagree – Strongly agree)

Avoid leading questions

Check your questions to make sure that they don’t accidentally push respondents to answer a certain way.


Leading:

“In the past, most students have usually enjoyed the visit to the library archives. How would you rate the quality of this semester’s archives visit?”

1 – 5 (Poor – Excellent)


Better:

“How would you rate the quality of this semester’s archives visit?”

1 – 5 (Poor – Excellent)

Avoid asking two questions at once (aka “double-barreled” questions)


Double-barreled:

“The lectures and readings helped me learn.”

1 – 5 (Strongly disagree – Strongly agree)


Better (split into two questions):

“The lectures helped me learn.”

1 – 5 (Strongly disagree – Strongly agree)

“The readings helped me learn.”

1 – 5 (Strongly disagree – Strongly agree)

Make sure answers to closed questions are appropriate

Unmatched and unbalanced:

“The assignments helped me learn.”

1 – 5 (Poor – Excellent)


Matched and balanced:

“The assignments helped me learn.”

1 – 5 (Strongly disagree – Strongly agree)

These suggestions for writing questions are loosely based on tips from Bryman (2012).

Encouraging Student Response

Talk about evals
  • Students do not necessarily understand what impact, if any, completing course evaluations will have (Hoel and Dahl 2019).
  • It’s therefore important to explain why course evaluations are meaningful to you. Showing students that you care about course evaluations can increase response rates (Chapman and Joines 2017).
  • Also tell students what you plan to do with the results. If you’ve made changes to your courses in the past based on student feedback, tell them that too.
  • Describe how low response rates hurt your ability to make changes, by making it harder to figure out what is working and what isn’t in the course.
Give time to complete in class
  • Consider reserving class time for students to complete their evals on their devices.
  • The beginning of class works better for this than the end (Standish et al. 2018).
  • Provide the link to the evals portal in case your students have misplaced their evaluation email.
Consider Incentivizing Response
  • You can see the class response rate when the evaluation period is still open. (Although you cannot see which individual students have and have not completed their evaluations.)
  • This information can be used to create class incentives:
    • If [blank]% of students complete their evals by [a certain date] then everyone gets an extra credit point / class let out early / later due date on the final paper / etc.
  • Class incentives, even when small, have been shown to increase student response rates (Boysen 2016).
Demonstrate you care what your students think
  • Ask for feedback from your students throughout the semester.
  • Demonstrate that you take it seriously.
  • Evidence suggests instructors who ask for mid-semester feedback may get higher response rates on end-of-semester evals (Stanny and Arruda 2017).

Interpreting Results

Coming in January!

Look to this space for resources on interpreting the results of evaluations, incorporating feedback into your courses, and upcoming events related to evaluation results.


Boysen, Guy A. 2016. “Using Student Evaluations to Improve Teaching: Evidence-Based Recommendations.” Scholarship of Teaching and Learning in Psychology 2(4):273–84. doi: 10.1037/stl0000069.

Bryman, Alan. 2012. Social Research Methods. 4th ed. Oxford: Oxford University Press.

Chapman, Diane D., and Jeffrey A. Joines. 2017. “Strategies for Increasing Response Rates for Online End-of-Course Evaluations.” International Journal of Teaching and Learning in Higher Education 29(1):47–60.

Chávez, Kerry, and Kristina M. W. Mitchell. 2020. “Exploring Bias in Student Evaluations: Gender, Race, and Ethnicity.” PS: Political Science & Politics 53(2):270–74. doi: 10.1017/S1049096519001744.

Hoel, Anniken, and Tove Irene Dahl. 2019. “Why Bother? Student Motivation to Participate in Student Evaluations of Teaching.” Assessment & Evaluation in Higher Education 44(3):361–78. doi: 10.1080/02602938.2018.1511969.

Murray, Dakota, Clara Boothby, Huimeng Zhao, Vanessa Minik, Nicolas Bérubé, Vincent Larivière, and Cassidy R. Sugimoto. 2020. “Exploring the Personal and Professional Factors Associated with Student Evaluations of Tenure-Track Faculty.” PLOS ONE 15(6):e0233515. doi: 10.1371/journal.pone.0233515.

Peterson, David A. M., Lori A. Biederman, David Andersen, Tessa M. Ditonto, and Kevin Roe. 2019. “Mitigating Gender Bias in Student Evaluations of Teaching.” PLoS One 14(5):e0216241. doi: 10.1371/journal.pone.0216241.

Standish, Trey, Jeff A. Joines, Karen R. Young, and Victoria J. Gallagher. 2018. “Improving SET Response Rates: Synchronous Online Administration as a Tool to Improve Evaluation Quality.” Research in Higher Education 59(6):812–23. doi: 10.1007/s11162-017-9488-5.

Stanny, Claudia J., and James E. Arruda. 2017. “A Comparison of Student Evaluations of Teaching with Online and Paper-Based Administration.” Scholarship of Teaching and Learning in Psychology 3(3):198–207. doi: 10.1037/stl0000087.

Have suggestions?

If you have suggestions of resources we might add to these pages, please contact us:

ctl@wustl.edu
(314) 935-6810
Mon - Fri, 8:30 a.m. - 5:00 p.m.