Did that go well? … (Collecting midterm feedback from your students)

When you teach, it’s natural to wonder how it’s going. Are the students getting it? Are they interested in the course content? Do they find the examples you are using as illuminating as you do? If you are teaching this fall, consider collecting feedback from your students about their class experience so far. Now, 6-7 weeks from the beginning of the term, is a good time to solicit feedback. It is far enough into the semester that you and your students have had time to adjust, but still far enough from the end of the term to leave you time to tweak course components if needed.

The goal of collecting feedback from your students at midterm is to gather information that can help you improve the design of your course. We refer to this as formative assessment.

There are a variety of approaches to collecting midterm feedback. The technique most commonly used here at MIT is an in-class or outside-of-class survey, which is the focus of this blog post. Regardless of the approach, you’ll want to balance soliciting specific feedback on essential aspects of the course with giving students the opportunity to offer suggestions about aspects that may not have been on your radar. You’ll also want to consider how long it will take students to complete your midterm survey.

Designing your survey

Take a moment to consider what information would be most useful to you. Brainstorm aspects of your course for which you want feedback:

  • The structure and organization of the lectures?
  • The in-class activities?
  • The readings?
  • The online simulations that you just developed last year?
  • The way you have structured in-class discussion?
  • The in-class demonstrations?
  • The pace of the course?
  • The length of the homework assignments?
  • The level of challenge of the course?

The most basic midterm feedback surveys look something like this:

  1. What aspect of this course/lab/recitation is most helpful to you?
  2. What aspect of this course/lab/recitation is least helpful to you?
  3. Please list any suggestions you have about how to improve this course/lab/recitation.

This is a short, simple survey. It’s easy to put together, and students should be able to respond relatively quickly. It is used extensively, and can provide practical and useful suggestions.

A potential drawback of using only broad, open-ended questions is that it’s difficult to predict what information you will get. So, if you did brainstorm a list of specific and essential aspects of your course, consider writing questions that address those aspects directly. You could ask about them using free-response questions, multiple-choice questions, a Likert scale, or a mix of these approaches. Here are some examples:

  • Which in-class activities aid your comprehension of course material the most? [free response]
  • How many hours per week do you spend on this course outside of class (include time on Psets, reading, etc.)?
    • 0-2 hrs
    • 3-5 hrs
    • 6-8 hrs
    • 9-10 hrs
    • > 10 hrs
  • [Screenshot of a survey question authored in the Qualtrics Survey Service @MIT]

Closed-ended questions are more quantitative and will give you a sense of whether something is or is not useful for a large fraction of the students in the class.
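If your closed-ended responses end up in a spreadsheet (for example, a CSV export from your survey tool), a few lines of Python can tally them for you. The sketch below is only an illustration; the file name and column label are hypothetical placeholders, not the export format of any particular tool.

```python
# Minimal sketch: tally closed-ended midterm survey responses from a CSV export.
# "midterm_feedback.csv" and the column name "hours_per_week" are hypothetical;
# adjust them to match the layout of your own survey export.
import csv
from collections import Counter

def tally(filename, column):
    """Count how often each answer choice appears in the given column."""
    with open(filename, newline="") as f:
        counts = Counter(row[column] for row in csv.DictReader(f) if row[column])
    total = sum(counts.values())
    for choice, n in counts.most_common():
        print(f"{choice:>10}: {n:3d}  ({100 * n / total:.0f}%)")

tally("midterm_feedback.csv", "hours_per_week")
```

Even a rough tally like this makes it easy to see, for example, whether most of the class reports spending more than 10 hours per week on the course outside of class.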

Distributing your survey: Paper or Online?

The screenshot of the question above, which I authored in the Qualtrics Survey Service @MIT, might make you wonder, “Should I ask my students to respond to my survey on paper or online?” There is no clear-cut answer to that question. Online course evaluations typically see lower response rates than paper ones [1,2]; however, email reminders and announcements from instructors can help mitigate that [3,4]. An advantage of online evaluations is that students typically provide more detailed responses to open-ended questions than they do on paper [1,5,6].

What do I do when I receive the midterm survey responses?

Read through the responses and look for themes. Are there reasonable changes you can make to the course in response? Will those changes help students learn? Are students requesting something that you have no control over or are unwilling to change? Make some notes, make some decisions, and then share your responses with your students in class. Thank them for their feedback, summarize the major themes, and tell them what you will (and will not) be changing about the course in response, and why. Your students will know that you read their feedback and that you care about their experience in your course.

Here’s an example of instructor response to student feedback:

Many of you find the real-world examples that I present in class very useful. I’m glad. I’ll try to continue to highlight where the course concepts come into play in the real world throughout the remainder of the term. Some of you commented that you’d rather work alone on the in-class exercises instead of in small groups. Let me clarify how I’d like you to approach the in-class exercises. Definitely think through the problems and try to get started on your own. Then, in your small groups, I’d like you to compare problem-solving approaches and, as needed, help each other become ‘unstuck’. I’ve selected challenging problems for these exercises so that you can benefit from teaching each other. The research shows that this will help you learn…

Notice that in the example above, the professor did not simply bow to the students’ wish to work alone on the in-class exercises; instead, she explained her rationale for structuring the in-class work as she did.

If you read through your midterm feedback and aren’t quite sure what action to take, contact us. We are happy to review the feedback with you and make recommendations for specific changes or for ‘staying the course.’ And, as is true for all interactions with TLL’s teaching and learning consultants, the conversation will be confidential.

(Note: While the debate over the validity and reliability of course evaluations continues, keep in mind that much of this debate centers around the use of course evaluations in the faculty tenure and promotion process, and not so much around the utility of using the data to inform course revisions. Sadly, because of the undue emphasis put on course evaluations for faculty promotion and tenure, I think sometimes people forget that, when well-formulated, the data from course evaluations can be used to inform and improve future offerings of the course.)

References:

[1] Anderson, H. M., J. Cain, and E. Bird. “Online Student Course Evaluations: Review of Literature and a Pilot Study.” American Journal of Pharmaceutical Education 69, no. 1, article 5 (2005).

[2] Avery, R. J., W. K. Bryant, A. Mathios, H. Kang, and D. Bell. “Electronic Course Evaluations: Does an Online Delivery System Influence Student Evaluations?” Journal of Economic Education 37, no. 1 (2006).

[3] Norris, J., and C. Conn. “Investigating Strategies for Increasing Student Response Rates to Online Delivered Course Evaluations.” Quarterly Review of Distance Education 6, no. 1 (2005).

[4] Berk, R. A. “Top 20 Strategies to Increase the Online Response Rates of Student Rating Scales.” International Journal of Technology in Teaching and Learning 8, no. 2 (2012).

[5] Donovan, J., C. E. Mader, and J. Shinsky. “Constructive Student Feedback: Online vs. Traditional Course Evaluations.” Journal of Interactive Online Learning 5, no. 3 (2006).

[6] Kasiar, J. B., S. L. Schroeder, and S. G. Holstad. “Comparison of Traditional and Web-Based Course Evaluation Processes in a Required, Team-Taught Pharmacotherapy Course.” American Journal of Pharmaceutical Education 66 (2002).

(Image courtesy of MIT OpenCourseWare)
