Moving course evaluations from the traditional pencil-and-paper format to an online ratings system significantly increased the number of evaluations that were completed.
Despite the ability to opt out, 90.76 percent of students chose to participate in the online course evaluations for spring 2009 and 85.38 percent participated for fall 2009, according to a report released by the Provost’s Office. Before the online evaluation system was put into place, participation rates were between 70 and 75 percent.
While no changes were made to the questions asked on the surveys, students were given a greater period of time in which to complete the evaluations.
In previous years, evaluation forms were distributed in class near the end of the semester — students who were absent that day simply missed the opportunity to fill out an evaluation.
The online evaluations allow for more “thoughtful feedback,” said Associate Provost for Education Andrew Binns.
“Filling out the forms online while sitting at your computer allows for greater consideration than quickly filling out a paper form at the end of a class period,” he said.
Another change was instituting an online “grade pathway.”
“This was the single most important element to the online system,” said Director for Education in the Office of the Provost Rob Nelson. “All students had to either complete the online course evaluation or at least opt out of taking the survey before they could access their final grades.”
As a result, students like Engineering sophomore Haarika Kamani completed evaluations for all of their classes.
“I received several e-mails telling me I wouldn’t get my grades if I didn’t at least log on to complete the evaluations,” Kamani said. “Once I logged on to the web site, it only made sense to fill them in.”
Additionally, the average quality rating decreased with the online system.
However, the decrease in average quality rating was not a result of the increase in participation rates or the extension of the evaluation period, according to the report.
In addition, while the average differences in quality ratings observed were statistically significant, the actual differences were minor.
The average course quality rating decreased only slightly, from 3.08 in spring 2008 to 2.92 in spring 2009.
College sophomore Rachel Taube is one student who took the time this semester to fill out all of her evaluations.
“I want other students to know if a teacher is good or bad. The teacher also deserves the recognition of good reviews, but if they performed badly, I want them to know that, too,” she said.
However, Taube acknowledged that she filled out evaluations for all her classes even before the move to the online system.
“I don’t mind either way, both were really quick,” she added.
On the other hand, this year’s results would be more accurate because “doing it online guarantees everyone will do it, so the idea we get from Course Review is more valid,” Kamani said.