Inside Beat

Rate My Professors' bias problem deserves failing grades

While Rate My Professors presents itself as a useful tool, it might not be as helpful as we think. – Photo by Rate My Professors / Facebook

To me, one of the hardest parts of starting a new semester (besides having to unlearn the previous semester’s schedule) is trying to get a feel for what kinds of instructors I’ll be studying under for the next four months.

Daniel Handler, better known as Lemony Snicket, once said, “First impressions are often entirely wrong.” But when it comes to professors, I usually advise my friends and classmates to err on the side of caution and drop any class whose professor may prove to be an issue later on — even if that unease is only a gut feeling at the present time.

When it comes to education, from whom you learn is half the battle. Sometimes, you’ll be lucky enough to have a friend who previously took a teacher’s class and can fill you in on what sort of person they are. But, realistically, it’ll be up to you and you alone to decide during those crucial first few classes whether to swap the course for another or stay the course.

You might be thinking, “If only there was a place I could see a quick rating (perhaps on a scale from one to five stars) of my new professors based on what their previous students had to say in order to better decide if I want to keep their course.” Trust me, I get it.

I, too, wish there was a publicly available database on everyone I had to work with in order to know what sort of person they are — a way to get to know them better without actually going through the work of overcoming the awkward small talk phase of becoming acquainted with someone. But save for social media stalking (and the moral ambiguity that entails), there unfortunately isn’t really a way to judge what sort of person somebody is without actually meeting them first and forming your own opinion.

But in May 1999, American software engineer John Swapceinski tried his hand at solving this problem in the education sector by creating a new website. Its aim was to allow students from universities across the U.S., Canada and the U.K. to, as its eventual name would denote, rate their professors.

The site was rebranded as RateMyProfessors.com (RMP) in 2001, which has remained its name since, despite ownership of the site changing hands multiple times over its 23 years in operation.

In addition to a simple one- to five-star rating, RMP allows students to create new listings for any instructor not already on the site as well as indicate more specific aspects of their teaching style, such as the professor’s reliance on a textbook or their attendance policies.

Although this site may seem to solve the problem previously mentioned, the reality of the situation is unfortunately far too complicated to be solved by a simple review forum.

For starters, think about what sort of person would utilize a website like that. Universities can barely get students to fill out course satisfaction surveys baked into their own online class management software. So even assuming all of the reviews on RMP were made in good faith, you’ll likely only see the feelings of students who felt passionately enough about a professor to either praise or pan them on a third-party website.

And again, that’s assuming everything said on the site, where students can create as many anonymous accounts as they wish, is both accurate and valid — which I (unsurprisingly) believe not to be the case.

In February 2015, “The Guardian” contributor Laura Bates published an article titled “Female academics face huge sexist bias – no wonder there are so few of them.” Bates' piece discusses a tool created by Northeastern University assistant professor Benjamin Schmidt that allows users to search approximately 14 million student reviews on RMP for specific terms, visualizing how often each term appears in reviews for male professors versus female professors.

The results were as stunning as they were worrying. Bates found that terms such as “brilliant” or “intelligent” were more likely to appear in reviews for male professors, while words like “annoying,” “harsh” or “unfair” were more prevalent in reviews for female professors.

Interestingly, Bates found that although a significant portion of the reviews mentioned a professor’s physical appearance, this was a phenomenon that applied to both male and female instructors — just with different words for each. For example, although “hot” was a fairly commonplace result across both sexes, “sexy” appeared more often for male teachers, while “beautiful” was more likely to describe female teachers.

The very fact that a significant enough portion of the sample RMP reviews discussed a professor’s attractiveness demonstrates the site’s lack of validity. Since there are no credential checks in place, students can review professors they’ve never had at a university they may not even attend.

So, unfortunately, you will still need to attend that infamous professor’s first class you need for your major to see if you can tough it out — at least until humanity comes up with a proper means of digitally getting an accurate feel of a stranger’s personality.
