academia, writing

A website, GenderAnalyzer, attempts to guess the gender of a blog's author from the writing. It is built on uClassify's text-classification service. I figured it would be interesting to run a dozen sociology bloggers through it, half male, half female. The results: Ts’i mahnu uterna (Correct), Kieran Healy (Correct), Kristina B (Wrong), What is the What? (Wrong), Eszter Hargittai (Wrong), Shakha’s posts on Scatterplot (Wrong), Jeremy’s posts on Scatterplot (Correct), Tina’s posts on Scatterplot (Wrong), GradMommy (Wrong), Rachel’s Tavern (Wrong), Chris Uggen (Correct), and Peter Levin (Correct). Now, there appears to be a great deal wrong with this little program: when I ran these tests, its self-reported success rate was barely over 50%. But it also made me think about what this says about reifying gender… and about scholarly writing in academia. Based on just these few trials within this social field, it was incorrect 100% of the time for the female bloggers, and roughly 17% of the time (one of six) for the male bloggers.
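The per-group error rates above can be tallied directly. A quick sketch (which blogger falls in which group is inferred from the results reported in the post, Shakha being the one misclassified male):

```python
# Tally of the GenderAnalyzer results listed above.
# Each entry: (blogger, gender, guess_was_correct) — taken from the post.
results = [
    ("Ts'i mahnu uterna", "M", True),
    ("Kieran Healy", "M", True),
    ("Kristina B", "F", False),
    ("What is the What?", "F", False),
    ("Eszter Hargittai", "F", False),
    ("Shakha (Scatterplot)", "M", False),
    ("Jeremy (Scatterplot)", "M", True),
    ("Tina (Scatterplot)", "F", False),
    ("GradMommy", "F", False),
    ("Rachel's Tavern", "F", False),
    ("Chris Uggen", "M", True),
    ("Peter Levin", "M", True),
]

for gender in ("M", "F"):
    group = [r for r in results if r[1] == gender]
    wrong = sum(1 for r in group if not r[2])
    # M: 1/6 wrong (~17%); F: 6/6 wrong (100%)
    print(f"{gender}: {wrong}/{len(group)} wrong ({wrong / len(group):.0%})")
```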


6 thoughts on “blogender”

  1. I wonder if the fact that all the women tested here are academics makes a difference. I imagine that our training in academic-ese bleeds through to our blog writing. And academic-ese is very “masculine.” I did this once, and it was pretty confident I was male, too. Go figure.

  2. I would guess as much as well… Looking at ‘academic’ blogs, I imagine that this was the case. What might be of more interest is how/why male academics can come up ‘wrong.’ Shakha was the only example of that… I wonder what it was that tilted the program in the ‘female’ direction…

  3. Interesting. I think it’s a stretch to suggest that all of the women listed here have a uniform blogging style so this idea of “academic-ese” doesn’t seem very likely to me, but of course I have no idea how the system works and what it considers so perhaps there are a few academic-type phrases it looks for that may have tipped the scale.

  4. Eszter: I didn’t intend to suggest that all of the women cited here write uniformly… The program itself seems to reify gendered language, and I was just positing what that might mean. Implicit in your comment is that ‘academic-type phrases’ might tip the scale. I would guess as much as well, but that would mean that women, according to the program, use fewer academic-type phrases.

    Since I first collected the above ‘data,’ at least two things on the GenderAnalyzer site have changed. First, they must have recalibrated their formula: it now correctly identifies more of the blogs (identifying your blog as written by a female and Shakha’s posts on Scatterplot as written by a male), still misidentifying only What Is the What and Tina’s Scatterplot posts. It also moved its analysis of Peter Levin’s blog from ‘male’ to ‘female’ (but only slightly). Which brings me to the second change: it now reports how certain it is, as a percentage. It finds a 76% chance your blog is written by a woman. It states that mine is written by a man, but is only 55% sure, calling my blog ‘gender neutral.’ (Kieran’s blog, 73% certainty; Peter Levin’s, only 51%.) Slightly more interesting.

    Beyond gender, perhaps it says something about the language academics use, even when writing in a more public venue. I wonder whether this program, paired with the Flesch–Kincaid readability test, could say something about gender bias in these sorts of automated assessments. Just some thoughts.
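    For reference, the Flesch–Kincaid grade level mentioned above is just a weighted sum of average sentence length and average syllables per word. A rough sketch (using a crude vowel-group heuristic for syllables, so the numbers are only approximate):

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of vowels; real syllable
    # counting needs a pronunciation dictionary.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    # Flesch–Kincaid grade level:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)
```

    Denser, more polysyllabic academic prose scores a higher grade level than plain prose of the same sentence length, which is the kind of difference such a pairing would surface.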
