Archive for December, 2013


M-60 vs. Leopard 1 vs. Chieftain vs. AMX-30

[Photo of an M-60A3: "The American M-60: MUCH better than I grew up believing!"]

Like anyone who is both my age and an author of military fiction, I took a keen interest in the cutting-edge military affairs of the Reagan era. Quite by chance, I learned just how wrong the open-source material of that period could be.

Back in the day, there was a consensus about the second generation of Western main battle tanks, those of the 1960s and '70s, updated versions of which remain in service today. The French AMX-30 was the lighter and more mobile tank; the British Chieftain was the heavily armored, heavily armed one (a 120mm rifled gun against the 105mm rifled guns of the others), and consequently the slow one; and the German Leopard 1 and American M-60 were about the same and somewhere in the middle. Everyone repeated this conventional wisdom over and over again: experts in the media, defense ministries, even games!

This was the picture that held until these vehicles started finding their way into private museums, which is where the surprising truth came out and where I happened to stumble upon it. Take frontal armor, where it is thickest: the M-60 has 155mm, and while the turret face of the Chieftain is indeed thicker still, the vaunted Leopard 1's thickest armor is only 70mm! In other words, the supposedly equivalent M-60 has 3 1/3 more inches of steel up front (85mm, at 25.4mm to the inch). For that matter, the supposedly lighter AMX-30 has 10mm more frontal armor than the Leopard 1.

Now, when you realize how many soldiers in how many countries operated these vehicles for more than 30 years without all the comparative data getting out (never mind how many factory workers were involved in building the things), it's a testament to just how many people can keep a secret when the reason for keeping it is so obviously important. I must also comment on how sneaky the marketing guys in Germany were, selling as many inferior tanks as they did everywhere from Australia to Canada to Chile!

Given that the Leopard 1 turned out to be in last place, I have to wonder whether the Leopard 2 is as good as it's cracked up to be. Unlike the Challenger or the M-1 Abrams, the Leopard 2 has never really been tested in combat.

Fake Reviews on Amazon

Fake reviews on Amazon are nothing new. They are so commonplace, in fact, that they have attracted attention from Forbes, the Huffington Post, and the Wall Street Journal. Amazon has cracked down on companies using dozens or hundreds of fake profiles to sell fake reviews, but fakes are still posted every day. Going to one of those faker companies isn't even necessary; all the dishonest really need is a social network full of people willing to post false reviews on their behalf.

Don’t Blame Amazon
Amazon's customer review system is actually better than most review systems on the internet, because at least it requires an account holder to make two separate and distinct purchases before being allowed to comment on products. That barrier is set much higher than on almost all other retail and social media sites, which ask only for a valid e-mail address. Frankly, I don't see what else Amazon could realistically do to combat fake reviewing, except raise the bar higher.

Yet the fake reviews matter, because they are about more than (in the case of books) author ego. Amazon's algorithm incorporates the star rating of a product into its computations of whether and where that product should appear in those "Recommended For You" ads in its e-mails and on its website. So, by plumping up their own ratings and/or sabotaging a competitor's, merchants and authors can enhance the visibility of their own offerings. I've seen plenty of examples of both.
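
To make the incentive concrete, here is a minimal, purely hypothetical sketch of how a star rating might feed a recommendation score. Amazon's actual algorithm is not public, so the function name, the weights, and the formula here are all my own inventions for illustration only:

    # Purely hypothetical illustration; Amazon's real ranking algorithm is not
    # public. This only shows WHY inflating or sabotaging star ratings would
    # move a product up or down a recommendation list.

    def visibility_score(avg_rating: float, review_count: int, sales_rank: int) -> float:
        """Toy score: higher means more likely to be recommended (weights invented)."""
        rating_term = avg_rating / 5.0                 # normalize 1-5 stars to 0.2-1.0
        volume_term = min(review_count, 100) / 100.0   # diminishing returns on review count
        rank_term = 1.0 / (1 + sales_rank)             # a better (lower) sales rank helps
        return 0.5 * rating_term + 0.2 * volume_term + 0.3 * rank_term

    # A competitor dragged from 4.8 to 3.5 stars loses visibility:
    print(visibility_score(4.8, 40, 2000))  # ~0.56
    print(visibility_score(3.5, 48, 2000))  # ~0.45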

Fakes, Friendly and Malicious
Coming from a publishing background, I already had a decent notion of what organic, natural Amazon review feedback looks like, and that serves as a model for detecting when an author is piling on fake reviews. Let's say an author with an established fan following releases a new book, and on Amazon alone that book sells 300 to 500 copies per day in its first month. That is roughly 12,000 copies on Amazon, plus however many were sold out of brick-and-mortar bookshops.

My experience is that a book like that, with sales of 12,000+ in its first 30 days, might reasonably expect up to 30 reviews from fans who bought it, read it immediately, and responded within that same 30 days. Most of those will be five-star, but some will be four-star and even three-star, since some fans won't be fully satisfied, and some are the type who never give out five-star reviews in the first place.

Since Stonewall Goes West came out, I've kept an eye on the books that have popped up alongside it in Amazon's recommendations. With that baseline in mind, it's pretty obvious that when a new or little-known author, one whose sales in the first month add up to 500 copies or fewer (in some cases I've tracked, 100 or fewer), winds up with 10, 20, or even 40 reviews in the space of just a few weeks, all solidly five-star, most or all of them are fake.
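
That smell test can be written down as back-of-the-envelope arithmetic. The review rate below (roughly 30 reviews per 12,000 first-month Amazon sales, or 0.25%) comes from my own observations above; the `looks_suspicious` helper and its tolerance multiplier are illustrative inventions, not a calibrated detector:

    # Back-of-the-envelope smell test based on the organic baseline above:
    # roughly 30 reviews per 12,000 first-month Amazon sales, i.e. ~0.25%.
    ORGANIC_REVIEW_RATE = 30 / 12_000

    def looks_suspicious(first_month_sales: int, review_count: int,
                         slack: float = 4.0) -> bool:
        """Flag a book whose review count far exceeds what its sales support.

        `slack` is an arbitrary tolerance multiplier, not a calibrated threshold.
        """
        expected = first_month_sales * ORGANIC_REVIEW_RATE
        return review_count > slack * expected

    # An unknown author selling ~500 copies but sporting 40 five-star reviews:
    print(looks_suspicious(500, 40))      # True  (expected ~1.25 organic reviews)
    print(looks_suspicious(12_000, 30))   # False (right on the organic baseline)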

Conversely, some people clearly view book sales as a zero-sum contest and try to undermine those they perceive as competitors by slashing away at them with bad, fake reviews. Most of the authors I've spoken to on the subject report having someone give them a two-line canard of a review, saying little more than "this book sucks and is full of grammatical errors" and attached to a low star rating, with that being the only review on record for that account.

Beyond the self-interested, there are also the simple, old-fashioned trolls. Amazon is still the internet, and wherever the internet provides an audience and grants people the chance to express their opinions, trolls will gather.

The sabotage tactics work because of simple arithmetic. It takes two five-star reviews to balance one two-star review out to a reasonable four-star average, so just a handful of low ratings can drag a book down, and therefore off Amazon's internal advertising. Something like that happened to me: after a summer of rising sales, I was "carpetbombed" with malicious, fake reviews over the course of several weeks. By October, Amazon wasn't advertising me anymore, and my sales fell by half.
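
The averaging arithmetic is easy to check. The two-five-stars-per-two-star claim comes straight from the paragraph above; the "carpetbombing" numbers below are invented purely to illustrate the effect:

    # Two five-stars offset one two-star into a four-star average:
    ratings = [5, 5, 2]
    print(sum(ratings) / len(ratings))  # 4.0

    # A handful of malicious one-stars drags a well-reviewed book down fast
    # (illustrative numbers, not my actual review history):
    organic = [5] * 20 + [4] * 5          # 25 genuine reviews, a 4.8 average
    carpetbombed = organic + [1] * 8      # plus 8 fake one-star reviews
    print(sum(organic) / len(organic))            # 4.8
    print(sum(carpetbombed) / len(carpetbombed))  # ~3.88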

How to Spot Fake Reviews
Here are the factors I keep in mind to spot fakes. No single factor should be taken as confirming a review as fake, but the more boxes a review checks off, the more likely it is illegitimate. (A rough sketch of these checks in code follows the list.)

  • Short and insubstantial: If a review is just long enough to meet Amazon’s minimum word count, and offers no details of the product in question, it is probably a fake.
  • Reviews all say the same thing, forming a pattern: Have you ever noticed how the pundits who appear on news programs often repeat the same points as all the other pundits from their camp, drawn from specified “talking points”? Fake reviewers sometimes work from a script too, so if you see a dozen reviews in a row that say “this book is like Tom Clancy meets Steven Pressfield!” one right after the other, most or all of them are fake.
  • The reviewer is an obvious shell: Check out the reviewer’s profile, where you can find his other reviews. If all of them are short and insubstantial, the profile is probably a fake. If the review in question is the only one there, it’s almost certainly a fake. Conversely, a record that shows substance and diversity most likely isn’t fake.
  • Amazon Verified Purchase: This badge proves the account in question actually bought the item. While not conclusive in and of itself (an author’s friend might buy the book, and then write a glowing review without reading it until much later), it is suggestive.
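
Put together, those factors translate naturally into a rough score. The sketch below is just my checklist rendered in code; the word-count cutoff and the idea of counting checked boxes are my own illustrative choices, and a high score means "look closer," not proof:

    from dataclasses import dataclass

    @dataclass
    class Review:
        text: str
        verified_purchase: bool
        reviewer_review_count: int   # total reviews on the reviewer's profile
        reviewer_all_short: bool     # every review on the profile is short/insubstantial
        duplicate_phrase: bool       # echoes the same "talking point" as other reviews

    def suspicion_score(r: Review) -> int:
        """Count how many boxes a review checks; thresholds are illustrative only."""
        score = 0
        if len(r.text.split()) < 25:        # short and insubstantial (arbitrary cutoff)
            score += 1
        if r.duplicate_phrase:              # part of a scripted pattern
            score += 1
        if r.reviewer_review_count <= 1:    # a one-review shell account
            score += 1
        elif r.reviewer_all_short:          # or a profile full of throwaway reviews
            score += 1
        if not r.verified_purchase:         # never actually bought the item
            score += 1
        return score                        # 0-4: the more boxes, the more suspect

    shill = Review("Best book ever! Like Tom Clancy meets Steven Pressfield!",
                   verified_purchase=False, reviewer_review_count=1,
                   reviewer_all_short=True, duplicate_phrase=True)
    print(suspicion_score(shill))  # 4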

FYI: At the time I wrote this blog post, as far as I know only three of my friends and family had finished my book and filed a review, and one of them gave me a four-star instead of a five!

I also decline to name names here, since I have no interest whatsoever in pissing in someone else’s cornflakes.