What happens to indie writers when they don’t get 5 stars on their Amazon reviews? What happens when they get a 3- or 3.5-star rating? If you don’t have a publishing house paying for your advertisements (not that they do even a decent job of book promotion that I’ve observed — you could probably do better yourself), what does it mean to have 5 stars… or even fewer?
Those five stars turn out to be important; buyers with stars in their eyes give them weight, but there’s no telling how many of our readers actually stop to check the quality of the reviews behind them. Reviews are a bit of a crystal ball, it turns out. Or they can be.
Not all reviews are created equal. I’m pulling on some stats that came from a really brilliant Amazon data review in 2013 (A Statistical Analysis of 1.2 Million Amazon Reviews), but there are newer — though somewhat less openly accessible — numbers for you mad-statisticians to math with!
Max Woolf, a data scientist with credentials reaching into Buzzfeed and back to Apple, is responsible for some of my insights here. The rest of the supposition (and whinging)? That’s all me.
Astronomy vs. Astrology
People who run the numbers on Amazon and other book-sales sites are looking at solid market behaviour. I consider them astronomers — star-gazers who have real data and can make and test hypotheses. They can project ahead, sort of like the Data Guy does. This separates them from the rest of us muddlers who engage, to some degree, in the astrology of stargazing. Oddly, Amazon reviews — all reviews, really — work the exact same way. There are helpful and unhelpful reviewers, according to Amazon’s metrics, but it may be more truthful to say that some reviews carry more weight than others.
“All the longer reviews have high helpfulness; there are very, very few unhelpful reviews that are also long.” – Max Woolf
Reviews are opinions, of course, but Amazon won’t let you come in and write a review that consists entirely of ‘U suck’ or ‘Great book!’ There’s a minimum character requirement, which means there’s an effort that must be made. That alone weeds out a lot of casual trolls and haters. A 100+ character minimum? Too much work! Also, Amazon’s helpful/unhelpful rating of reviews can be used to weed out personal attacks and junk reviews foisted on authors (we’ve seen this with even top-sellers, a la ‘It’s x because she’s Mormon’). But authors aren’t allowed to sculpt, or even delete, reviews the way you might on, say, Fanfiction.net. This is good, because as a consumer you see people’s honest opinions and not a statistical skew, and bad because, in a massive machine like Amazon, it’s nigh impossible to contest a review no matter how it might attempt to blackball, or even gaslight.
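Amazon’s actual validation logic is internal and undocumented, but the effect of a length floor is easy to sketch. This is a toy illustration only — the `accepts_review` name and the exact 100-character threshold are my assumptions, taken from the figure mentioned above:

```python
MIN_REVIEW_CHARS = 100  # illustrative threshold; Amazon's real rule may differ


def accepts_review(text: str) -> bool:
    """Toy gatekeeper: reject any review under the character minimum."""
    return len(text.strip()) >= MIN_REVIEW_CHARS


# Drive-by one-liners get filtered out...
print(accepts_review("Great book"))
# ...while anything requiring actual effort clears the bar.
print(accepts_review("I loved the pacing, the characters, and the twist. " * 3))
```

Even a filter this crude raises the cost of trolling: ‘U suck’ takes two seconds, but a hundred characters takes a paragraph of intent.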
It may be wise to be wary even when a 3-star comes up. I’ve gotten glowing reviews before (‘I loved the book!’) that were 3-star. Not all people agree on what the star ratings on Amazon really mean, but we still base book purchases on them.
Truthfully, there’s nothing tying star ratings to any specific type of review. They’re essentially the next galaxy over from their associated review text; one in no way has to reflect the real content of the other. Now: is this bad?
Your fate is written in the stars… or is it?
Yes. Sort of. Also no.
I’ve seen blogs predicting the doom of whole genres due to 5-star reviews, and, I admit, a book with 5 stars across hundreds or thousands of readers is a rare thing. It’s a benchmark — a J.K. Rowling, ‘era-founding’ thing. But I think readers usually are being honest, even when they have no idea how to write an actual, book-oriented review, and this is despite the spate of paid reviews from Fiverr that has Amazon totally discounting the reviews of indie writers on any other book (why not throw the baby out with the bathwater, Amazon — it’s just a baby, right?). As writers, I doubt we have any place extolling the idea that reviewers should be assholes instead of fans: be tougher, be critical, be x, y, z, ‘You’re not being honest enough!‘ (yes, I’ve seen it). Writers probably have zero business in their own reviews — or so I’ve learned. It’s reader-response driven, and reviews really do have the power to bring out a writer’s dark side. (I’ve been there – reviews can be a bit like alcohol, you gotta cut yourself off before you have a capital-P Problem.)
A large number of 5-stars can, in fact, look suspicious in big number sets. But I doubt it’s damning enough to sink the entire enterprise of indie authorship (as I’ve also read in blogs). Amazon Prime TV series can have 4.5- and 5-star ratings across huge numbers of people and survive. So can indie books. We’re not magic fairies, after all. I’d like for some blogger to sweep in and tell those reviewers ‘You aren’t being honest enough‘. Riiight. Because we have the authority to police our readers. What horse-poop.
With 510,434 separate reviewers behind 1.2 million reviews, the sample set is large, but not… what some people might expect. Those 500,000-odd reviewers must be motivated, though. Motivated people tend to be passionate. Passionate people may write longer and more positive reviews (they can also be livid and write more negative ones, btw). The numbers demonstrated that the average star-score is also rising over time — an indication that reviewers are learning the system, and, perhaps, thinning out as reviews become more normalized and less faddish (again, this last bit is review-astrology).
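That rising-average claim is the kind of thing anyone can check against a review dump. A minimal sketch, assuming you have (year, stars) pairs — the sample data below is entirely fabricated for illustration; only the upward drift echoes what Woolf’s analysis reports:

```python
from collections import defaultdict

# Fabricated stand-in for the real 1.2M-review data set.
reviews = [
    (2010, 4), (2010, 3), (2010, 5), (2010, 1),
    (2011, 5), (2011, 4), (2011, 2), (2011, 5),
    (2012, 5), (2012, 5), (2012, 4), (2012, 3),
    (2013, 5), (2013, 5), (2013, 5), (2013, 4),
]


def average_stars_by_year(reviews):
    """Return {year: mean star rating} so the drift is easy to eyeball."""
    totals = defaultdict(lambda: [0, 0])  # year -> [sum of stars, count]
    for year, stars in reviews:
        totals[year][0] += stars
        totals[year][1] += 1
    return {year: s / n for year, (s, n) in sorted(totals.items())}


print(average_stars_by_year(reviews))
```

If the yearly means climb, reviewers as a population are handing out more stars over time — which is exactly the normalization the paragraph above speculates about.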
Point is, the numbers point at what I call a passion-trend in reviews.
Skyview coordinates + star numerology
“More than half of the reviews give a 5-star rating. Aside from perfect reviews, most reviewers give 4-star or 1-star ratings, with very few giving 2-stars or 3-stars relatively.” – Max Woolf
I can’t know the minds of reviewers, but this was useful data to me! What does this 1.2-million-review data set say about how consumers comprehend star ratings? Basically, this is what I take from the statistics:
- If people went through the trouble to write a review, a lot of them are passionate enough about the subject to give it either:
- a 5 ‘SO worthwhile!‘
- or a 1 ‘Die in fire, product!‘
- If they gave it a 4-star that’s a strong recommendation. Rejoice!
- 2 and 3 stars are neutral / may not exist / a phenomenon I call Schrödinger’s review.
- 1 star reviews deserve a read to see if the person is fair in their criticism. Are they:
- Furious about the product for legitimate reasons?!
- Attacking unrelated elements like shipping, the condition of the box, packaging, or angry that the post does X?
- Unfairly attacking the writer him- or herself (passive-aggressively or otherwise)?
Basically, I saw an inverted bell curve with a skew toward the higher numbers. Customers appear to be feeling out (or averaging out) a consensus on what the star-rating system at Amazon means. Maybe readers are going back to books and writers that reward their interests? Maybe Amazon’s categories and keywords are becoming more refined? I’m not sure, but, for me, the numbers shed some light on the matter!
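The inverted-bell shape is easy to see if you histogram the ratings yourself. In this sketch the counts are invented to echo the shape Woolf describes — 5-star dominant, 1-star a distant second, 2- and 3-star nearly empty — not his actual figures:

```python
from collections import Counter

# Invented counts shaped like the distribution Woolf reports.
ratings = [5] * 60 + [4] * 15 + [1] * 12 + [3] * 8 + [2] * 5


def star_distribution(ratings):
    """Return the fraction of reviews at each star level, 1 through 5."""
    counts = Counter(ratings)
    total = len(ratings)
    return {star: counts.get(star, 0) / total for star in range(1, 6)}


dist = star_distribution(ratings)
for star in range(5, 0, -1):
    print(f"{star} stars {'#' * round(dist[star] * 50)}")
```

Printed top to bottom, the bars bulge at 5, dip through 4–2, and tick back up at 1: the Schrödinger’s-review hollow in the middle, rendered in ASCII.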
How about you?