Hello all. It’s been a while since I’ve posted anything, but I figured I might resurrect this dead blog (I think blogging as a whole is dead and I missed the opportune window for this to be really anything) to put a bookend on something I have been talking about without end for the past six days – Helen Pluckrose, James Lindsay, and Peter Boghossian’s hoaxing of “Academic Grievance Studies”.
Now, for those of you who aren’t familiar with this piece of investigative journalism, the story is that these three academic philosophers recently took it upon themselves to expose a certain section of academia known for its strictly activist bent. Typically under the guise of social justice issues, these journals, a couple of which are highly ranked in women’s studies, are notorious for publishing postmodernist, anti-colonialist, incomprehensible word-salad nonsense aimed at airing grievances about Western culture and its issues.
This piece has been pretty controversial, especially on Twitter. Those of you following the blog from there have probably seen me arguing over my opinions on this with a number of folks, including well-respected academics (academics whom, despite their ad-homineming me into oblivion by quoting me out of context, putting me on blast to their followers, and dismissing me as a raging leftist and “Nazi justifier,” I still highly respect). I think because of this, I’m just going to put my opinions on here and leave this thing alone permanently. It’s simply not worth the time or effort to argue with people with whom I have more in common than not.
To make this fair and easily readable, and so that nothing I say can be taken out of context, I am going to give you what I think was good, what was bad, and what was downright ugly about this thing, and keep my comments open. If anyone wants to talk about this, they’re free to address issues on here or to me on Twitter, but otherwise I’m not voluntarily jumping into any more fights over this.
Before we get too involved, I should remind people a little about where I stand in terms of grievance studies specialists and other academics of this type. Despite being an anthropologist, my interests are almost purely scientific. For the last three years of my life, I have been a primatologist in anthropology departments, studying the sensory systems and ecological relationships between different primates and their predators. Over the past year or so, I have begun to shift to the fields of cultural evolution and evolutionary psychology (both of which are controversial and problematic fields for grievance types). My own background, which strongly reflects the general attitude of the anthropology departments I came from, has been science first, conclusions second. This is an attitude I’ve held both on my blog and in my professional life.
To this end, I have also published against anti-scientific rhetoric coming from within academia. This includes an essay on The Selfish Nature of Human Cooperation for Areo (the magazine which published the hoax we’re discussing) and another for Quillette detailing the dangers of anthropological cultural relativism and its demonstrated, self-professed ties to European nationalist movements. I’m not going to lay out my entire CV here, but you get the point: I’m on the other side from these grievance studies scholars.
Now, starting with this thing, I have to say that these three academic journalists went well out of their way to give it to these scholars and show what utter nonsense they are willing to peddle. The article showed that, in addition to having few standards for rigor, these journals employ reviewers whose main mission in life appears to be pushing their political ideology. The sheer ridiculousness of some of the arguments, and the reviewers’ unwillingness to comment on the very clear political biases present in these studies, speaks for itself. In a sense, they’ve done what the account @RealPeerReview does in reverse. Instead of showing us published nonsense, they showed us nonsense published.
I think the biggest contribution of this thing has been in raising awareness of what’s going on in these fields. Although RealPeerReview has been tweeting these academic shortcomings for over two years, their visibility has been largely restricted to the people following that account on Twitter. With this hoaxing came a documentary, a press kit, and a large amount of hype, which has made sure everyone is talking about it. This was an excellent example of investigative journalism doing its job.
This is where I’ve gotten into trouble on Twitter and with my friends. I had a lot of problems with the methodology of the whole thing, and I think under further scrutiny this hoaxing has less merit than its authors and ardent supporters think it has. To be fair, this wasn’t a scientific study, it was a piece of investigative journalism, and as I said above, it did its job to that extent.
On the other hand, the fact that I, and seemingly so much of the rest of academia, can pick this thing apart is not a good thing, because when it comes down to departmental decisions and time for this deadwood to be burned, these bad actors in liberal arts departments are going to be able to say, “It seems to me there wasn’t really a hoax at all.” It also reveals that the scope of this thing really wasn’t that wide. I’ll list some main points here.
- Experiment Design: It was interesting to see how wide these guys were able to cast their net in exposing these grievance studies. This can be a good thing in that a number of disciplines were exposed rather than just one journal. Yet at the same time, I think the problem is the variance here. Fat Studies, Poetry Therapy, and Porn Studies are nonsense journals. They are unranked on both Journal Citation Reports and Eigenfactor. My intuition is that submission to these journals was a way of padding just how many of these silly articles got accepted, because a number of others were rejected from higher-ranked journals. This is just an intuition, but I wonder what a more controlled experiment might have looked like: aiming for six or seven equally ranked journals; sticking to three journals and attempting to continuously resubmit nonsense; or sticking to one ranked field – perhaps Women’s Studies – and picking the top ten journals.
Perhaps what is also important to remember, in fairness, is that none of these three academic journalists are scientists or statisticians. I don’t say this disparagingly whatsoever, but we can’t expect them to have built in actual controls, or to have known what controls their exposé needed in order to be effective.
- Data: The hoax papers accepted by the three highest-ranked journals were accepted with fake data. Now, if you wanted to show that these fields are not rigorous in assessing data, this might be something to highlight, but the replication crisis in psychology has gotten so bad that Lee Jussim has literally suggested running betting pools on every new study that comes out – because it seems that random online bettors can accurately predict which studies are actually going to replicate. I feel this is a bad time to cast the first stone. Moreover, the data presented in these articles were purportedly very strong – the paper on dog rape claimed 1,000 hours of observational data, and the paper on male anal penetration claimed simply to report what its subjects had reported to the researchers.
I have always felt that in science there are no stupid questions. Assuming the data shown in these articles were true, and as strong as they purported to be, and assuming other fields like social psychology or primatology would accept the same values with an equal level of statistical rigor, these results demand to be answered rather than simply dismissed as offensive to human sentiments.
- Scope: Speaking of casting the first stone, in the documentary the authors discuss how the hoax has exposed fields including sociology (none of these journals are ranked in sociology), but they fail to acknowledge where these journals actually are ranked. I might remind you all that the “breastaurants” paper they put out was published in Sex Roles – the number one journal for Women’s Studies, but also the number 9 journal for Social Psychology and the number 13 journal for Developmental Psychology. On Eigenfactor, Gender, Place & Culture is ranked 25 for Geography. Again, perhaps they should have stuck to the top ten journals listed for Women’s Studies for consistency – who knows whether these issues are just as pervasive in Social Psychology and Developmental Psychology?
- Effectiveness: A point made by computational biologist and geneticist Michael Eisen on Twitter is that these 20 papers were submitted a total of 48 times. Seven were accepted and seven were still under review when they cancelled their experiment. This means that despite having seven acceptances, there were at least 34 rejections, with as many as another seven on the way.
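For what it’s worth, the arithmetic here is easy to check. A trivial sketch (the figures are the ones reported in the hoax write-up and Eisen’s tally, not independently verified):

```python
# Submission tallies as reported for the hoax (reported values,
# not independently verified).
total_submissions = 48  # 20 papers, submitted 48 times in total
accepted = 7            # accepted or published
in_review = 7           # still under review when the hoax was halted

# Every other submission must have been rejected.
rejections = total_submissions - accepted - in_review
print(rejections)  # → 34

# Acceptance rate among submissions with a known outcome.
print(round(accepted / (accepted + rejections), 2))  # → 0.17
```

In other words, even taking the authors’ own numbers at face value, roughly one in six resolved submissions was accepted.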
- The Ideas: Again, in the documentary, James Lindsay says that “what appears beyond dispute is that making absurd and horrible ideas politically fashionable can get them validated at the highest levels of academic grievance studies.” He also argues that studies which push a certain political agenda should be discounted. First and foremost – and this seems to be the most controversial point made on Twitter – not all of these ideas are stupid in essence. People refer to the defense of some of these arguments as a “self-own,” but taking the article in Fat Studies, one could easily argue for changes in standards of attractiveness across time, and across space today. Assuming they existed (they did), Venus figurines depicting a thicc woman during the Upper Paleolithic probably reflected the desirability of a healthier, better-fed lifestyle compared to the periodic starvation these peoples likely went through. This image of attraction was, indeed, shaped by cultural norms.
The section above mostly focused on methodological issues with treating this hoax as a controlled experiment. This section is just on things that rubbed me the wrong way about the affair. First and foremost, I want to bring to light the Hitler thing, which I personally believe is completely unacceptable and downright dishonest. One of the articles accepted, titled “Our Struggle is My Struggle: Solidarity Feminism as an Intersectional Reply to Neoliberal and Choice Feminism,” has been advertised as a rewording of Hitler’s Mein Kampf submitted to the journal Affilia, which is the 18th-ranked Women’s Studies journal and the 26th-ranked Social Work journal on Eigenfactor. Out of all of the articles published, this one seems to be the biggest “Aha! Gotcha!” for the strongest supporters of the hoax, and I am shocked to see how people have supported this.
What the hell were they thinking with this? Drawing on Chapter 12 of Mein Kampf, the authors and others claimed to have accurately reworded Hitler to show that these grievance studies scholars are akin to fascists. But the fact of the matter is, almost any idea you have ever had, and almost anything you have ever said, has also been uttered by some genocidal maniac. Agreeing with what are arguably some of the mildest and most straightforward sentences in a book full of the grossest sentiments of 1925 Germany does not mean that you share those sentiments. Why is this so hard for people to understand?
This passage in particular has continuously been highlighted to show that the journal article actually carried these Nazi sentiments, the issue supposedly being its calls for solidarity and absolute intolerance:
But my God, man! These reviewers were given the above sentences completely contextless and asked to assess them at face value, and we have to do the same. Why should we not have an absolute intolerance of oppression? Is feminism’s capacity to effect change exclusively guaranteed by its ability to achieve solidarity, or not? When we make jokes about feminist movements “eating themselves,” this is exactly the stuff we are referring to. These ideas, which are not actually Hitler’s word for word so much as generic strategic concepts, need to be addressed at face value rather than dismissed through guilt by association.
I find it ironic that in my discussions about this – with otherwise balanced yet heterodoxically aligned people who have themselves been accused of Nazi associations for their sentiments – I have been called a Nazi for the first time in my life. When you set up the rules of the game such that the only way for people to disagree with you is by agreeing with Hitler, I don’t think you’re doing yourself any favors.
Maybe most of what I’ve said is nitpicking, and this is just my mind acting as a scientist’s mind should when presented with what claims to be a systematic analysis of a reported phenomenon. I want to note that most of the scientists I have discussed this with tend, like myself, to err on the side of caution when assessing the claims. The two main questions my colleagues and I are asking are: 1) Is there a systemic problem? 2) Does this hoax show the problem to be systemic? My general, anecdotal consensus is that we all agree there’s a problem, but that this hoax fails to show its pervasiveness. On the other hand, my non-scientist friends almost unanimously support this hoax, meaning that despite not convincing some of us nitpickers, this stunt has done a decent job of exposing the issue to the public.
In terms of effectiveness, I’m not sure where things will go from here, since academics don’t seem convinced, but this may be a springboard for other things. Perhaps this study will replicate Sokal and nothing will happen. Perhaps more rigorous studies will follow, or perhaps we will begin indexing the bad articles we see, as RealPeerReview does. As with studies of p-hacking, we can run text and sentiment analyses of articles, meta-analyses of methodologies looking for falsifiability, and a suite of other things to ensure nonsense isn’t being pushed. Despite the data being completely false, and the argument that one could just as easily hoax physics, psychology, chemistry, or biology, perhaps what we need is a discussion about discussion sections and the sentiments we push with our data. Ultimately, the importance of this whole thing, for its good, bad, and ugly parts, is that now there is a debate and dialogue about this whole mess, and while I’m choosing to step out of it, I’m hoping something productive comes of this.
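As a rough illustration of the kind of text analysis I mean, here is a toy screen that flags abstracts with a high density of ideologically loaded jargon. The term list, the metric, and the example abstract are entirely hypothetical – a sketch of the idea, not a validated instrument:

```python
import re
from collections import Counter

# Hypothetical list of loaded terms; a real screen would need a
# validated lexicon and baseline rates per field.
LOADED_TERMS = {"hegemonic", "problematize", "neoliberal", "discursive"}

def loaded_term_rate(text: str) -> float:
    """Occurrences of loaded terms per 1,000 words of text."""
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    hits = sum(counts[term] for term in LOADED_TERMS)
    return 1000 * hits / len(words)

# A made-up abstract, for illustration only.
abstract = ("We problematize the hegemonic and discursive framing "
            "of neoliberal institutions.")
print(loaded_term_rate(abstract))  # → 400.0
```

Obviously jargon density alone proves nothing; the point is only that screens like this are cheap to build at scale and could complement one-off hoaxes with actual measurement.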
Like my blog and want to support my writing? Please consider supporting me on Patreon.
One thought on “Sokal Squared: The Good, the Bad, and the Ugly (and the Outlook)”
Nice job. I am worried that people can just use the “I was just hoaxing” excuse behind bad data.