Some question accuracy of professor evaluations in choosing classes
By Christopher DaCosta
Each registration period, sophomore Danielle Camardo routinely refers to the faculty evaluation database compiled on Santa Clara's Web site in order to make what she calls the "best course selection."
"I find that the evaluations act as a good indicator of what I can expect from a certain professor and his or her course," said Camardo.
In fact, many students act much like Camardo during the frantic and often stressful registration period. The evaluation database can be a valuable resource, either tipping the scales in favor of one professor or realistically conveying the difficulty and workload associated with a course.
Sophomore James Hutchinson used the evaluations to pick his schedule last quarter. "I managed to have a course load in which the work was evenly distributed," said Hutchinson. "The section asking 'does this course have more work than your other courses?' really helped me make my choices."
With so many students relying on faculty evaluations for course decisions, accuracy remains a big concern. An exact protocol for distributing and collecting evaluations has been devised to maximize accuracy. Evaluations are conducted toward the end of the quarter with the professor absent from the room.
"I personally leave the classroom and have a student collect them and bring them to me," said Adjunct OMIS Professor James Rowan. "The student is instructed to seal the envelope in which they are deposited before bringing them back to me."
Once Rowan has received the envelope, he signs it and returns it to the student, who subsequently gives it to the department administrative assistant.
Results from evaluations are then compiled and calculated by the Information Technology (IT) department, which runs the forms through a Scantron computer program. This part of the process is overseen by Michael Bonfert, manager of the computer operations department, and John Storer, the project leader.
According to Storer and Bonfert, the computer responsible for tabulating results from evaluations is accurate despite the problem of damaged Scantron forms.
"If a Scantron is wrinkled or damaged, we feed it in manually," said Bonfert. "No evaluations are thrown out by the computer."
In fact, Storer said that the computer is not the basis for accuracy in calculating a faculty member's results.
"Accuracy is based on how well our instructions are followed and how things are carried out," Storer said.
Bonfert added that accuracy really depends on the honesty of students.
Despite the precautions taken by the IT staff to ensure that their specific protocol is followed, some students feel that faculty evaluations do not accurately portray a faculty member's teaching ability or the difficulty of their class.
Sophomore Camille Johnson thinks student opinions aren't necessarily an appropriate way to gauge faculty effectiveness.
"The subjectivity of the evaluation process doesn't make it accurate enough," said Johnson. "Results can be tainted by students who are receiving a good grade or a bad grade."
Johnson is concerned that some students can be biased toward classes that satisfy requirements within their major.
This raises new questions for users of the evaluations, especially department chairs and school deans who may consider the results in "faculty annual reviews, and tenure and promotion decisions," according to the course evaluations online disclaimer.
"I personally take the evaluations seriously," said Rowan. "The written comments are more helpful than the numerical evaluations; I read each evaluation and make decisions about my course and my pedagogical methods based on my students' input."
Though he can speak only from his own experience, Rowan has found that his students take the evaluation process as seriously as he does.
"I personally think that [the evaluations] are a relatively fair reflection on teaching ability," Rowan said. "I get a rather high percentage of students who take the time to make thoughtful comments."
With her registration appointment fast approaching, Camardo says she will weigh evaluations less in her decisions.
"I probably rely less on the evaluations than I have in the past, but I still think that referring to them can't hurt too much."