Online Course Evaluation FAQ

Why are we talking about this?

At the March 11, 2019 Faculty Meeting, Patrik Hultberg listed the following reasons to move away from paper forms to an online delivery system:

  1. To ensure that all instructors receive their course evaluations back before the start of every term. Currently we struggle to achieve this between fall and winter and, especially, between winter and spring.
  2. To accommodate the evaluation of all courses taught at K, including partial-credit courses. This is a request from the Faculty Personnel Committee, and we do not have the resources to achieve it today, especially in a timely manner.
  3. To save resources in the Provost’s Office.
  4. To customize the evaluation forms. In particular, to remove questions that are not relevant to a particular course (such as Part 3, Service-Learning and Labs), and to allow instructors to add a small number of questions that address particular issues in the course.
  5. To better accommodate team-taught and multi-section courses.
  6. To give students an opportunity to add more written comments. In particular, to give students an opportunity to comment on any numerical score given.
  7. To increase our ability to evaluate scores and comments (scores through better reporting and statistical approaches; comments by having all of them be legible).
  8. To give students more time to thoughtfully fill out the course evaluations, if they wish.
  9. To increase anonymity and to add the possibility of assessing results based on various demographic factors (if we want).

Whose idea was this, anyway?

The faculty Teaching and Learning Committee (TLC), in response to feedback from staff in Information Services (IS) and the Provost’s Office, worked in winter and spring of 2019 to learn about the state of the art in online course evaluations and a number of vendors’ products for carrying them out. With guidance from the Faculty Personnel Committee, the TLC organized a pilot study in spring 2019 involving 15 faculty volunteers and 17 courses.

Can we give it a try before we decide?

We already have! With guidance from the Faculty Personnel Committee, the TLC organized a pilot study in spring 2019. Patrik Hultberg announced the pilot and asked for volunteers at the tenth-week faculty meeting in winter 2019.

The pilot study included student evaluation data from 17 courses taught by 15 instructors (8 women and 7 men). The sample included tenured faculty volunteers diverse with respect to division, discipline, course level, and course size. The sample intentionally didn’t include non-tenured faculty, to safeguard against any possible negative repercussions in the tenure review process.

What did we learn from the pilot study?

The results of the pilot study show:

  • Response rates were identical to those of the traditional paper-and-pencil format when students were given time in class to complete the online evaluation form. These results contrast starkly with a pilot study conducted at K a decade ago.
  • Student responses were more accurate: written responses were legible, bubbling errors were eliminated, and numerical counts were accurate because the scanning errors frequently encountered in the current format were eliminated.
  • Evaluation responses were available immediately at the end of the term. The current process takes weeks to complete and consumes many hours of staff time that could be better spent ensuring smooth operation of other areas of our academic mission.
  • Numerical averages didn’t change: in comparing historical averages for each course/instructor pair with the online results, the data from the 17 courses in the pilot gave no evidence that the overall numbers for “course” and “instructor” differ between the online and the traditional paper formats.

What did the students in the pilot study courses think?

Students liked the online format: in a follow-up survey, 87% of the responding students from the pilot study courses responded favorably, reporting that, compared with their previous experiences with the paper format, their experience with the online format was the same (7%), slightly better (30%), or much better (50%). Among students who had negative comments about the experience, most complained about not having the opportunity to complete the evaluation form during class.

When would we switch from paper to online format?

If a switch to the online format is approved by the faculty at the tenth-week fall faculty meeting, the winter 2020 course evaluations would be administered online.

How would the online evaluations be implemented?

The College would contract with SmartEvals, one of several vendors from whom quotes were solicited. SmartEvals was the most economical and also the most flexible in working with the specific requirements of K. Each term, the Academic Computing staff in Information Services will work with the Registrar’s Office to make sure the registration information for all K courses is transmitted to SmartEvals. Student feedback will be processed by SmartEvals and will be available to instructors immediately following the grade submission deadline. The IS staff at the College will download and archive the evaluation data each term.

What will SmartEvals do with the data collected?

Here are the relevant sentences from the User Agreement (Gap Technologies, Inc. is the parent company of SmartEvals): “Under no circumstances will Gap Technologies, Inc. use any information provided directly or indirectly by Customer or users, including but not limited to email addresses, for any use other than administering the Service for Customer’s use. Gap Technologies may use the collected student evaluation data for its own statistical purposes so long as Gap Technologies, Inc. removes all personally identifiable information (including names, student identifiers, course names, and email addresses) from such data.”

Will the online format result in any change in the course evaluation average numbers?

No! In comparing historical averages for each course/instructor pair with the online results, the data from the 17 courses in the pilot study gave no evidence that the overall numbers for “course” and “instructor” differ between the online and the traditional paper formats.

Will the online format result in any change in the quality or quantity of student narrative feedback?

The pilot study results show that narrative feedback is certainly more legible in the online format than in the paper format. Furthermore, participating instructors have reported anecdotally that students’ narrative responses are more numerous and more thoughtful than with the paper format.

Is it possible to solicit course-specific feedback using the online format?

Yes! For historical continuity, the current collection of course evaluation questions will remain. In addition, instructors can include their own customized course-specific questions. What’s more, questions about lab sections and service-learning components will be automatically omitted from courses to which they don’t apply.

How do the instructors in the pilot study feel about the experience of the online evaluations?

In a survey conducted after the pilot, participating instructors were neutral to generally positive. By acting on their thoughtful suggestions about the administration of the surveys (timing, duration, student notification, in-class versus out-of-class completion), as well as on feedback from the faculty during the fall discussions, we will make the process fit better with faculty and student expectations and workflow.

I’m feeling feelings about this. How can my views be heard?

Please email Rick Barth with your questions and concerns. Your feedback will determine how we proceed with this.