Spotting Fakery?

I previously pointed to an article on people manipulating Amazon rankings for their books; today there is a bigger brouhaha over whether somebody has manipulated the New York Times bestseller list. The method used (if true) isn’t new, and political books have been prone to this approach before: buy lots of copies of the book from the right bookshops and head up the rankings.

One thing new to me from those articles was this site. It claims to analyse reviews on sites like Amazon and Yelp and then rate them in terms of how “fake” they seem to be. The mechanism looks at reviewers and review content, looks for relationships with other reviews, and also rates reviewers who only ever give positive reviews as less reliable. Now, I don’t know if their methods are sound or reliable, so take the rest of this with a pinch of salt for the time being.
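I don’t know Fakespot’s actual algorithm, but the two heuristics described above (penalising reviewers who only ever give top marks, and flagging reviewers whose history overlaps suspiciously with someone else’s) could be sketched like this. This is entirely my own toy guesswork, not Fakespot’s code, and the thresholds are made up:

```python
# Toy sketch of the kind of heuristics described above (my guesses,
# not Fakespot's actual method): penalise reviewers who only ever give
# five stars, and reviewers whose review history heavily overlaps with
# another reviewer's (possible coordinated/click-farm activity).

from collections import defaultdict

def reviewer_scores(reviews):
    """reviews: list of (reviewer, product, stars) tuples.
    Returns a 0-1 reliability score per reviewer (1.0 = no red flags)."""
    by_reviewer = defaultdict(list)  # reviewer -> [(product, stars)]
    for reviewer, product, stars in reviews:
        by_reviewer[reviewer].append((product, stars))

    scores = {}
    for reviewer, items in by_reviewer.items():
        score = 1.0
        ratings = [s for _, s in items]
        # Red flag 1: more than one review, and every one is five stars.
        if len(ratings) > 1 and all(s == 5 for s in ratings):
            score -= 0.4
        # Red flag 2: review history largely shared with another reviewer.
        products = {p for p, _ in items}
        for other, other_items in by_reviewer.items():
            if other == reviewer:
                continue
            overlap = products & {p for p, _ in other_items}
            if len(overlap) / len(products) > 0.8:
                score -= 0.4
                break
        scores[reviewer] = max(score, 0.0)
    return scores

# Hypothetical data: alice and bob review the same books, all 5 stars;
# carol gives mixed ratings to unrelated products.
reviews = [
    ("alice", "book_a", 5), ("alice", "book_b", 5), ("alice", "book_c", 5),
    ("bob",   "book_a", 5), ("bob",   "book_b", 5), ("bob",   "book_c", 5),
    ("carol", "book_a", 4), ("carol", "gadget", 2),
]
scores = reviewer_scores(reviews)
```

Running this marks alice and bob down on both counts while carol keeps a clean score, which is roughly the behaviour the site describes for “only ever positive” reviewers.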

Time to plug some things into their machine, but what? Steve J No-Relation Wright has very bravely volunteered to start reading Vox Day’s epic fantasy book because it was available for $0, so why not see what Fakespot has to say about “Throne of Bones”?


Ouch… but to some extent we already know that the comment section of Vox’s blog is full of willing volunteers ready to do sycophantic stuff and/or trolling/griefing at Vox’s request. Arguably those are genuine reviews; they are just hard to distinguish from click-farm fakery. Think of it as a kind of Turing Test, which his right-wing minions repeatedly fail by acting like… well, minions.

How reliable is this? There’s no easy way to tell. As a side-by-side experiment, I put in Castalia’s attempt at a spoiler campaign versus the mainstream SF book they were trying to spoil:

Ironically, the reviews that Vox complains about probably improve the Fakespot rating of the reviews: many negative reviews from real people make the quality rating of the reviews look better. I also don’t see a way, in general, for Fakespot to distinguish fake NEGATIVE reviews, i.e. to show that the poor ratings of a book aren’t genuine.

[A note of caution: the site doesn’t re-analyse automatically, so the analysis you get may be out of date. The initial ratings for those two books were different but changed when I clicked the option to re-analyse.]

The basic report seems to assume that fake reviews exist for the purpose of a seller artificially boosting a book, rather than somebody maliciously trying to make a book look bad.



8 responses to “Spotting Fakery?”

  1. I’m thinking of buying a new appliance, and a couple of the models I was pondering are there, one with a B and one with a C. It’s a pretty hefty expense, so this is good to know.

    They also show a couple of suspicious reviewers and why they consider them unreliable.

    “Stone Sky” has a 5.0 rating and over 90% non-fake.


  2. Ahhh this is quite addictive, especially when I’m supposed to be working. I ran some more pup stuff:

    A (>90% reliable): SJWs Always Lie (VD); Mutiny in Space (“Rod Walker” *cough VD cough*; Castalia); Forbidden Thoughts Anthology (various, superversive)

    B (80 – 90% reliable): The Missionaries (“Owen Stanley”; Castalia); Nethereal (Brian Niemeier)

    D (I think <70% reliable): Honor at Stake (Declan Finn) – 46% reliable; Star Realms: Rescue Run (Jon Del Arroz) – 65% reliable

    Initial findings:
    1. The rabid-adjacent base does seem to pass the Turing test most of the time. I wonder if it's earlier reviews causing Throne of Bones to fail and there's been a change in tactics since.
    2. Oh, Declan.


  3. The Corrosion report identifies several potentially fake reviewers, and you can click through to see their Amazon profiles. They don’t strike me as fake as such, just sycophantic VD fans who give him five stars for everything. Mind you, identifying that sort of review is probably as valid a service as spotting genuine astroturfing.


  4. One reason that reviews by Rabids will look more valid than they really are to Fakespot is that the same minions who give 5 star reviews to Rabid works will also have given 1-star reviews to works by Scalzi, Leckie, Jemisin, Alexandra Erin, etc. — so they will pass the test of not having only posted positive reviews, the way that paid reviewers do.

