More Dragon Stats

I’m still playing with my great-big-spreadsheet of Dragon Award finalists. I think yesterday’s data shows conclusively that 2019 is different in the character of the finalists but I want to dig deeper into that.

The change in character raises the question of a change in voters. There are roughly three possibilities: more, fewer, or the same:

  • Maybe more people nominated this year, boosting an organic vote that reduced the impact of more factional votes.
  • Maybe fewer people nominated this year, but mainly in the form of fewer factional voters, reducing their overall impact.
  • Maybe about the same numbers of people nominated but there was a shift in who voted and how they voted.

I don’t mean ‘factional voters’ in a pejorative sense; I just need a name for people voting because of a specific call-to-action from an author or author group they liked.

A fourth possibility is that very few people vote each year and the choices have substantial input from the admins, i.e. the choices are curated. We know from previous years that there’s some sensible curation going on with the categories works appear in (e.g. ‘The Fifth Season’ and ‘Changeling Island’ appearing in two categories in 2016, or in other years works that authors had boosted for one category appearing in another).

Without data there is no way to tell. Some pertinent facts are:

  • Promotion of the awards remains weak. ‘Dragon Award’ has an SEO problem. I just did a search on “Dragon Award” on Google for the past week and the top hits were Cora’s blog, File 770 and here, which (no offence to any of those august organs) suggests the announcement didn’t get the wider coverage it needs.
  • Boosting of the awards this year was much lower. I saw fewer people asking for nominations from fans. Really hard to quantify but I thought there was noticeably less interest. Not sure why.
  • On the plus side, Larry Correia has consistently promoted the awards, so there’s no reason to assume that fewer of his wider circle of fans nominated works this year.
  • Red Panda Fraction’s eligibility sheet made it a lot easier for people to find works and nominate.
  • On the “huh?” side of things, Larry Correia is a finalist although he asked his fans not to nominate him. And Ian McEwan is a finalist, which just seems odd.

But for the sake of argument let us assume that the wider pool of Larry Correia/Baen voters was the same and that there were at least some extra voters because of Red Panda’s activities. Notably, the proportion of Baen works nominated went *down*, so the impact of Baen voters was reduced, which, putting it all together, implies the total vote went up. Maybe. Without numbers this remains guesswork.

So here is a graph. There are more graphs below the fold along with explanations but as they take up a lot of space I’m hiding the rest from casual view.

You clicked for more, you magnificent nerd!

So I’ve been trying to tackle publishing categories and they are a pain. In the line graph above I first separated out Baen and Castalia House and then I classified everybody else as either traditionally published or not.

Baen’s best year was 2016, Castalia’s was 2017. I just looked at novels, so the one comic Vox Day got on the ballot in 2018 doesn’t show up. 2018 was also the best year for the non-trad books. An interesting exercise would be to see which books were available on Kindle Unlimited that year but it is an exercise beyond my patience.
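The tallying behind that line graph is simple enough to sketch in Python. The records below are placeholders standing in for the real spreadsheet rows, not the actual finalist data, so the numbers are purely illustrative:

```python
from collections import Counter

# Placeholder (year, publisher-type) records; the real rows live in the
# spreadsheet, so these counts are illustrative only.
finalists = [
    (2016, "Baen"), (2016, "trad"), (2016, "non-trad"),
    (2017, "Castalia"), (2017, "non-trad"),
    (2018, "trad"), (2018, "non-trad"), (2018, "non-trad"),
]

# One count per (year, type) pair gives one line-graph series per type.
counts = Counter(finalists)
series = {
    ptype: [counts[(year, ptype)] for year in (2016, 2017, 2018)]
    for ptype in ("Baen", "Castalia", "trad", "non-trad")
}
print(series["non-trad"])  # → [1, 1, 2]
```

Each series is then just one line on the chart, which is why separating Baen and Castalia out first matters: they would otherwise be swallowed by the broader trad/non-trad split.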

Grouping nominees by publisher is an utter pain in the bottom. ‘Publisher’ is a terrible category for this set of data where it means quite different kinds of roles and organisations. Some publishers are essentially just the author, others are a group of friends, some are weird schemes like Inkshares and traditional publishers are a maze of imprints and companies owned by other companies.

With mainstream imprints I have tried to go to the broadest level of ownership for the biggest names. Works from a mainstream publisher that only appears once or twice I’ve lumped together as ‘medium’; these are typically big publishers that aren’t owned by Macmillan, Hachette etc. (or at least I don’t think they are). CBS is mainly Simon & Schuster.

“Amazon”, well, everything is Amazon and nearly nothing is. In the end the only works classified as Amazon are those published by 47North, Amazon’s quasi-imprint.

“lf-published” is self-published but I messed up cropping the picture. “Small” is just a grab-bag of any publisher name where I couldn’t tell if it was a bijou trad publisher or some guy formatting ebooks for Kindle Unlimited. “CKP” is Chris Kennedy’s suite of imprints, many with Baen-adjacent authors. It’s a significant player in the Dragon Awards but not always obviously so, because it has several brands in play.
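To make that grouping reproducible rather than ad hoc, the imprint roll-up can be expressed as a small lookup. The mapping here is illustrative and deliberately incomplete, not the actual table behind the spreadsheet:

```python
# Illustrative imprint-to-owner mapping (assumed, not exhaustive).
OWNER = {
    "Tor": "Macmillan",
    "Orbit": "Hachette",
    "Del Rey": "Penguin Random House",
    "Simon & Schuster": "CBS",
    "47North": "Amazon",
}

# Owners big enough to get their own column in the charts.
BIG_OWNERS = {"Macmillan", "Hachette", "Penguin Random House", "CBS"}

# Names kept as their own category regardless of size.
STANDALONE = {"Amazon", "Baen", "Castalia House", "CKP"}

def classify(publisher: str) -> str:
    """Roll a raw publisher name up to one of the chart's categories."""
    owner = OWNER.get(publisher, publisher)
    if owner in BIG_OWNERS or owner in STANDALONE:
        return owner
    # Everything unrecognised lands in the grab-bag pending a manual check.
    return "small"
```

So `classify("Tor")` rolls up to Macmillan, while an unknown name falls through to the “small” grab-bag, which matches how the hand-sorting above actually went.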

I sorted the categories with trad (minus Baen) on one side and non-trad on the other.

Here we go, year by year, to see how things change.

2016: First year out, with Tor, Orbit, Roc and Del Rey filling out the big three’s numbers. Strong showing from Baen. Nothing from smaller trad publishers (aside from Baen), i.e. my ‘medium’ category.

2017: The weakest year for the trad publishers. Inkshares were the surprise player. CKP makes its first showing.

2018: No more Castalia House. CKP makes a big showing. The distribution among the trad publishers is very different, with a greater variety of publishers among them. The decline in self-published works may just be more writers calling themselves a publisher or joining with friends.

2019: Hachette, Penguin and Macmillan are back to similar proportions as 2016/17. The gain for trad-pub works has been in other publishers. CKP is still strong.

33 thoughts on “More Dragon Stats”

  1. Chris Kennedy’s books are all in Kindle Unlimited. Ditto for Nick Cole and Jason Anspach and Michael Anderle and Craig Martelle’s LMBPN Publishing and Michael Scott Earle, when he was still around. Castalia House books used to be in Kindle Unlimited, but Vox pulled them out at one point.

    In general, I suspect that most self-published authors and authors writing for author collectives nominated for Dragon Awards are in KU; because military science fiction, LitRPG, harem/reverse harem are almost entirely Kindle Unlimited genres.


  2. It seems to me that some of our Scrappiest friends haven’t changed their level of effort but their results are deteriorating over time, suggesting they’re getting overtaken by an increasing number of “ordinary” DC voters.

    The general lack of promotion is just weird though. Maybe they took the view that last year’s results didn’t look too bad, a decent mix of trad with some factions without anything truly embarrassing, and decided they were on track.
    Certainly I’ve not seen any negative reactions from finalists or the noisy withdrawals of earlier years.


  3. My own guess is that they have a secret jury that looks at the nominations list and picks a list of finalists based on a long list of some sort. That isn’t actually a bad approach, if that’s what they’re really doing. It would also explain why they wouldn’t want to release any actual numbers.

    As far as puppy stuff, they lost their influence when their White Nationalist wing moved on to other projects. The ones who remain are simply too few to have much effect on anything.


      1. I was thinking something more along the lines of “We select 5 finalists from the top 20 most-nominated works.” Although a curated list might allow the curators to just add works on their own say-so, a work’s complete absence from the nominations would suggest it might not have much appeal to the target audience.


      2. I like the idea of doing crowdsourcing for the longlist and then a jury for the shortlist. It works well with letting anyone vote as long as they have one (or several) email addresses.

        On the other hand, it doesn’t sit as well with the fan-award angle that was the initial selling point. Which might be why they’re quiet.


      3. It’s an elegant suggestion but is there any evidence that they’re actually doing that? The simpler answer is that the results stem from differing groups voting – have there been any finalists who were genuinely inexplicable?


      4. To me the McEwan book is the clearest evidence possible that the Admins are curating the list of finalists based not just on what’s nominated, but on what’s popular on the various Bestseller lists.


      5. I’ll admit the McEwan being on there is weird (odd, bizarre, wtf…) but I can’t see why anyone would think it ‘improves’ the dragon award ballot – what on earth does it represent to them?
        (Ok, I suppose someone selecting for controversy might pick it out, but nothing else seems to fit into that category)


      6. The McEwan strikes me as exactly the sort of thing that an Admin who is not widely-read in SFF might see on a bestseller list and add to the ballot in the interests of representing a wide range of tastes.


      7. “The McEwan strikes me as exactly the sort of thing that an Admin who is not widely-read in SFF might see on a bestseller list and add to the ballot in the interests of representing a wide range of tastes.”

        This. And also to add a note of respectability/high-classness/literary legitimacy.


      8. Tbh I’d have added McEwan’s name just for the opportunity to try and contact him and explain what it was all about….


      9. Ah, JJ beat me to it but I do think Larry’s nomination is a bit odd. 2018 he wasn’t a finalist even though he had an eligible book. He’d asked fans not to nominate him and apparently they didn’t. Maybe he turned the nomination down in 2018?


      10. Re: Larry’s nomination.

        I figured that since that crowd all think they are masters of four-dimensional chess or whatever, clearly when Larry says “don’t nominate me”, what he really means is “nominate me but give me the option to humble-brag that despite my biggest fans honoring my wishes and not nominating me, I made the ballot and it would be an insult to the voters not to accept…”

        Who knows?


      11. What makes the nomination of “Machines Like Me” by Ian McEwan seem so strange to people? There really aren’t very many alternate history novels out there, and this may well be the only one lots of people read during the past eligibility period.


  4. I could have sworn Amazon and Publishers Weekly partnered on a prize, which was discontinued and replaced with BookLife. No idea what it was called…


  5. The award runners may not have had much interest in going along with Larry Correia in order to have balance in their selections for the Fantasy Award. He obviously had votes, as fans voted for him in spite of his request or didn’t know he’d made the request. And that made him a candidate for curating. If he doesn’t want it, he can withdraw his nomination.

    My guess would be, since we can’t know definitively, that voting has gone up a bit in number but organized campaigns by voting blocs have declined and are being more heavily discounted by the award runners to get a broader list.

    Fewer requests to nominate from authors may be due in part to the confusing disorganization of this year’s awards, continued understanding from many authors that the results are not a legitimate vote count and concerns from controversies over authors promoting themselves for awards in general. There are a lot of awards and the Dragons are not very prominent yet and don’t have much backing from DragonCon, so many authors may have decided not to bother.


  6. CF, I don’t think there is anything like an “organic vote” for this award. It mostly looks like a tussle between different fan groups.

