By Suzanne Wertheim, Ph.D.
Recently, there’s been a lot of press about a new study showing that student evaluations of college instructors do not actually correlate with teaching effectiveness. In fact, the study shows that student evaluations are better at measuring the unconscious bias of the evaluators (here, students) than they are at assessing the performance of workers (here, teaching assistants).
The study looked at two groups of students. The first group, in France, was made up of students taking large classes with multiple sections taught by teaching assistants (TAs). In their evaluations, male TAs were systematically assessed as teaching better than their female colleagues. This then raises the question – what if those male TAs were just better teachers? The researchers analyzed results for final exams, and found that the students who had been taught by female TAs actually did better on average than the students who had been taught by males. So, at least based on final exams, the female TAs, while rated lower, were overall more effective teachers than the males.
The second group of students was enrolled in an online course in the US and unknowingly took part in an experiment. This course had sections taught by two TAs, one male and one female. Each TA taught the same number of students, half the time using their real name and half the time using the other TA's name. In other words, half the time each TA was teaching in disguise: the same exact online (text-only) teaching, just under a different name. Once again, the students evaluated the performance of the "male" instructors as better. When the male TA taught using a female name, he was rated lower than when he taught as himself. And when the female TA taught using a male name, she was rated higher than when she taught as herself. However, students taught by the female TA, in either guise, performed significantly better in the course.
What's especially interesting about this second study is that the skewed evaluations extended even to "objective" measures. One example is whether instructors returned assignments in a timely fashion. The course software recorded when assignments had been returned, and the return time was identical for each TA. Even so, students rated their "male" TA as better at this teaching task.
In France, it was male students who showed slightly more bias against their female instructors. In the US, it was female students who were slightly more biased against female instructors. But the end result for the people being evaluated was always the same: being female meant that your work was systematically evaluated as less good, even when it was identical to, or possibly better than, the work of your male colleagues.
These kinds of studies are helpful when thinking about bias in all kinds of workplaces, not just the classroom. Because the researchers had access to large numbers of evaluations, could run comparisons, and could analyze performance outcomes, they were able to find clear evidence of gender bias. The same unconscious bias found in the students is also present in managers and co-workers in all kinds of workplaces because it stems from the same cultural patterns and expectations. It’s just harder to spot and harder to prove, both because we can’t get access to the same volume and kinds of data, and because it can be expressed in ways that are more subtle and harder to measure.
As a linguistic anthropologist, I have been researching expressions of bias, particularly in the tech world. I’ve collected data by interviewing women in tech, attending panels on diversity and inclusion, and analyzing published surveys and first-person narratives. What I’ve found is that everyday interactions, mundane and innocuous seeming for many, may make women in inequitable workplaces feel marginalized, demeaned, sexualized, uncredited, or unwanted – in other words, evaluated unfairly.
For example, conversations when we meet someone new are an opportunity to establish placement on a hierarchy – are we on the same level, or should one of us be marked as higher? They also establish where boundaries should be drawn – are we in-group or out-group? A getting-to-know-you conversation is actually an evaluation of sorts, and language is used both to reflect and to establish a social relationship. Data shows that for women in tech, these conversations can be a means of marking them as lower or as out-group when in fact they are neither. In these conversations, women are sometimes downgraded in terms of their work position – for example, assumed to be assistants because they sit near an executive, rather than recognized as the software engineers they are. (In the same vein, female professors are often assumed to be graduate students, and female doctors to be nurses or medical students.) Or the men they're talking to may push back at their claimed status by questioning their credentials in detail, which suggests without stating it explicitly that they find their claims of expertise not to be believable. Or they may evaluate the women as not worth talking to at all, avoiding eye contact or addressing questions and responses to male colleagues only.
When this “getting to know you” conversation is part of a job interview, the ramifications of gender bias can be even more significant. In addition, these kinds of evaluations can appear in performance reviews, where coded language may show that female workers are being evaluated more negatively and with different criteria than their male colleagues. By systematically studying linguistic expressions of bias, and training managers and co-workers how to recognize and avoid them, we can work towards more equitable evaluations.
Bio: Suzanne Wertheim is the Founder and CEO of Worthwhile Research & Consulting, a boutique firm providing customized diversity training and communication workshops, legal consulting and continuing legal education, and communication coaching.
This article was originally published in the February 2016 edition of the ACBA Labor and Employment Section's newsletter.