Page Views and the Dragon Award

There is a common impression that the character of the Dragon Awards has changed this year. I thought I might use the Wikipedia page view metric (see here) to see if I could quantify that change in a different way.

An immediate obstacle with using the page view figure is that the distribution is very Zipf-like. That makes averages misleading, because the odd Stephen King or Margaret Atwood creates a big change in the mean score. To overcome that issue, and also to show the authors who don’t have Wikipedia pages, I’ve grouped the data in bins that get proportionately bigger. The first bin is 0 to 10 (basically people who don’t have a Wikipedia page), then 10 to 50, then 50 to 100, then 100 to 500 etc., up to 100,000 or more, which is basically Stephen King.
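The binning scheme can be sketched in a few lines of Python. The bin edges come from the description above; exactly which bin a boundary value falls into is my assumption, not something the post specifies.

```python
# A sketch of the binning scheme described above: bins that grow
# roughly proportionately, so a handful of huge page-view counts
# can't distort the picture. Bin edges are taken from the post.
import bisect

EDGES = [10, 50, 100, 500, 1_000, 5_000, 10_000, 50_000, 100_000]
LABELS = ["< 10", ">= 10", ">= 50", ">= 100", ">= 500",
          ">= 1,000", ">= 5,000", ">= 10,000", ">= 50,000", "> 100,000"]

def bin_label(views: int) -> str:
    """Return the label of the bin that a 30-day page-view count falls into."""
    return LABELS[bisect.bisect_right(EDGES, views)]
```

For example, `bin_label(5)` gives `"< 10"` (no real Wikipedia presence) while `bin_label(216776)` gives `"> 100,000"`.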

One major caveat: the page view numbers are as they stood in September 2020 in all cases. So figures for past years reflect the counts for those authors now, not as they were in the year of the award.

This is the table for the book categories (I haven’t gathered the data for people in the comic book categories).

Bin          Counts (2016–2020)     Total
< 10         42, 62, 45, 34, 44     227
≥ 10         1, 1, 1                3
≥ 50         2, 2, 1                5
≥ 100        5, 4, 8, 8, 6          31
≥ 500        2, 1, 3                6
≥ 1,000      12, 10, 9, 14, 15      60
≥ 5,000      3, 1, 4, 4, 2          14
≥ 10,000     6, 9, 4, 3, 5          27
≥ 50,000     2, 1, 1                4
> 100,000    1                      1
Winners and Finalists (book categories)

Obviously, there are many ways you can group this data but I think it shows some sensible groupings.

Bin          Counts (2016–2020)     Total
< 10         1, 1, 1, 2, 3          8
≥ 50         1                      1
≥ 100        1, 1                   2
≥ 500        2                      2
≥ 1,000      3, 3, 2, 2, 2          12
≥ 5,000      1, 3, 1, 1             6
≥ 10,000     4, 2, 1                7
≥ 50,000     1, 1                   2
> 100,000    1                      1
Winners (book categories)

These tables don’t suggest any substantial changes to the Dragon Awards. There are ups and downs but the overall character seems to be similar: a mix of big names (e.g. in 2016, Terry Pratchett and Brandon Sanderson) down to names that are famous within their Amazon niches (e.g. Nick Cole).

However, if we look at just the ‘headline’ categories defined by the broad genres Science Fiction, Fantasy, and Horror (I thought I should include Horror) we see a different story.

Bin          Counts (2016–2020)     Total
< 10         7, 12, 12, 2           33
≥ 10         1, 1                   2
≥ 50         1, 2, 1                4
≥ 100        2, 2, 3, 1             8
≥ 500        2                      2
≥ 1,000      5, 6, 2, 6, 10         29
≥ 5,000      1, 1, 3, 2             7
≥ 10,000     2, 3, 3, 2, 5          15
≥ 50,000     1, 1                   2
> 100,000    1                      1
Winners and Finalists in Science Fiction, Fantasy and Horror

In these three categories, the authors are (by the page view metric) more notable in 2020 than in previous years.

What about gender? The Dragon Awards have been very male-dominated, both in absolute terms and even more so in comparison to contemporary awards. Using the page view metric groups, a shift becomes clearer.

Bin          Counts (2016–2020)     Total
< 10         3, 5, 4, 3, 2          17
≥ 10                                0
≥ 50         1                      1
≥ 100        2, 1, 3, 3, 2          11
≥ 500        2                      2
≥ 1,000      2, 3, 3, 6, 10         24
≥ 5,000      2, 1, 2, 2             7
≥ 10,000     3, 2, 1, 1             7
≥ 50,000     1                      1
> 100,000                           0
Authors using she/her pronouns (book categories)

The substantial increase is among women authors in the 1,000 to 5,000 range. The difference in gender balance becomes clearer in aggregate across the years.

Group        He/him   She/her   Total   % he   % she
< 10           77       17        94     82%    18%
≥ 10            3        0         3    100%     0%
≥ 50            4        1         5     80%    20%
≥ 100          20       11        31     65%    35%
≥ 500           4        2         6     67%    33%
≥ 1,000        36       24        60     60%    40%
≥ 5,000         7        7        14     50%    50%
≥ 10,000       20        7        27     74%    26%
≥ 50,000        3        1         4     75%    25%
> 100,000       1        0         1    100%     0%
Gender split 2016-2020 (book categories)
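For what it’s worth, the percentage columns are simple proportions of each bin’s total, rounded to whole percents; a quick Python check:

```python
# The percentage columns above are just each pronoun count over the
# bin total; e.g. the "< 10" group has 77 he/him and 17 she/her authors.
def split_pct(he: int, she: int) -> tuple[str, str]:
    """Return (% he, % she) as whole-percent strings."""
    total = he + she
    return (f"{he / total:.0%}", f"{she / total:.0%}")
```

Here `split_pct(77, 17)` reproduces the 82%/18% of the first row.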

The gender balance improves with grouping size up to the ≥ 5,000 group and then declines. Interestingly, that 50-50 split also holds among winners in that group, with three each.

So, yes, the Dragons are changing, but only in places. Down ballot, finalists still tend to be less notable and more male in a way that’s not very different from 2016.

Authors: which ones get looked up?

A perennial question around award nominees is just how significant the authors being honoured are. It’s a tricky question, particularly as there is no good data about book sales. Amazon ranks are mysterious, and Goodreads data may be a reflection of a particular community.

I’m currently taking a few baby steps into web scraping data and I was playing with Wikipedia. Every Wikipedia article has a corresponding information page with some basic metadata about the article. For example, here is the info page for the article on the writer Zen Cho. On that page is a field called “Page views in the past 30 days” that gives the figure stated. As a first attempt at automating some data collection, it’s a relatively easy piece of data to get.

So, I put together a list of authors from my Hugo Award and Dragon Award lists, going back a few years (I think to 2013). Not all of them have Wikipedia pages, partly because they are early in their careers but also because Wikipedia does a poor job of representing authors who aren’t traditionally published. Putting the ‘not Wiki notable’ authors aside, that left me with 163 names. With a flash of an algorithm I had a spreadsheet of authors ranked by the current popularity of their Wikipedia page.

Obviously this is very changeable data. A new story, a tragedy, a scandal or a recent success might change the number of page views significantly from month to month. However, I think it’s fairly useful data nonetheless.

So what does the top 10 look like?

1. Stephen King – 216,776
2. Margaret Atwood – 75,427
3. Brandon Sanderson – 72,265
4. Terry Pratchett – 55,591
5. Rick Riordan – 43,484
6. N. K. Jemisin – 34,756
7. Cixin Liu – 32,372
8. Sarah J. Maas – 21,852
9. Ian McEwan – 20,468
10. Neal Stephenson – 20,058

The rest of the top 30 look like this:

11. Robert Jordan – 19,169
12. Ted Chiang – 17,635
13. Owen King – 16,041
14. Jim Butcher – 15,493
15. James S. A. Corey – 15,109
16. Stephen Chbosky – 14,490
17. Leigh Bardugo – 13,787
18. China Miéville – 13,580
19. Andy Weir – 13,057
20. Harry Turtledove – 11,452
21. Cory Doctorow – 11,362
22. Jeff VanderMeer – 11,243
23. John Scalzi – 10,796
24. Chuck Tingle – 10,763
25. Ben Aaronovitch – 10,493
26. Brent Weeks – 10,271
27. Ken Liu – 9,003
28. Tamsyn Muir – 9,002
29. Alastair Reynolds – 8,951
30. Kim Stanley Robinson – 8,879

Those numbers show a big Zipf-like distribution, declining quickly by rank. John Scalzi has Chuck Tingle levels of fame on this metric.
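The Zipf-like claim can be roughly checked by fitting a power law to the top-10 figures from the table above (a least-squares fit on the log-log values; this is a back-of-the-envelope check, not a proper statistical fit):

```python
# Rough check of the Zipf-like shape: fit views ~ C * rank**(-s)
# by least squares on log-log values, using the top-10 figures above.
import math

views = [216776, 75427, 72265, 55591, 43484,
         34756, 32372, 21852, 20468, 20058]
xs = [math.log(rank) for rank in range((1), len(views) + 1)]
ys = [math.log(v) for v in views]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
# For a classic Zipf distribution the fitted exponent is near -1.
```

On these ten values the fitted exponent does come out close to -1, i.e. views falling roughly in inverse proportion to rank.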

OK, so I know people want to know where some of our favourite antagonists are, so here are some of the notable names from the Debarkle years.

40. Vox Day – 5,271
45. Larry Correia – 4,455
60. John Ringo – 2,878
81. John C. Wright – 1,251
111. Brad R. Torgersen – 560
123. Sarah A. Hoyt – 407
140. L. Jagi Lamplighter – 229
152. Dave Freer – 102
153. Lou Antonelli – 101
156. Brian Niemeier – 81

Day probably gets a lot more views due to people looking him up because of his obnoxious politics. Larry Correia is in a respectable spot in the 40s. He is just below Martha Wells, who has 4,576 page views, which is essentially the same number given how these figures can change from day to day. John Ringo is just above Chuck Wendig and Rebecca Roanhorse (2,806 and 2,786). John C. Wright is sandwiched between Tade Thompson and Sarah Gailey.

You can see the full list here.

Let me know if you find any errors.

Some More Dragon Award Stats

I’ve done a very rough division of publishers into big, medium and small. Big is easy to define: the parent company is some sort of huge corporate entity doing huge corporate entity things. Medium? Well, don’t ask me for a good definition, but it’s the sort of company that gets bought out by a huge corporate entity. Basically, I lumped in all sorts of publishers that had many significant properties and what looks like a corporate structure. “Small” is an even worse crime against taxonomy and is basically everybody else, from Castalia House to LMBPN to self-published works.

Here’s how the awards split by publisher size, winner/finalist and gender (as marked by pronouns) for all printed categories (books and comics).

Publisher size   % he/him   % she/her
Big               66.27%     33.14%
Medium            86.79%     12.26%
Small             82.52%     17.48%
Grand Total       76.46%     23.02%

On pronouns, I don’t think there is a finalist who uses pronouns other than he/him or she/her, but I may be wrong. The totals don’t add up to 100% because there are a few finalists listed only as “various”, which I didn’t include.

Generally, gender representation is more equitable with the bigger publishers and worse with the medium sized publishers.

The change in character of the award is mapped out in this table.

                 Big      Medium   Small
Grand Total      44.71%   28.04%   27.25%
All finalists

Aside from 2017, the percentages for “medium” have been fairly stable. The growth in works from the big publishers has been at the expense of the small publishers/self-published works.

Here is how those percentages shift when just looking at the winners.

winners only

Some Dragon Award Numbers

Officially, fewer people voted, but the exact number is unclear:

“More than 8,000 fans cast ballots for Dragon Award winners, selected from among 93 properties in 15 categories covering the full range of fiction, comics, television, movies, video gaming, and tabletop gaming.”

Last year’s press release used the same wording but said “10,000”.

Some decent winners this year but the gender balance is still way off. This table shows the number of authors who weren’t men in all the story categories (novels and comics).

Year          Finalist   Winner   Withdrawn   Grand Total   Percent
Grand Total      81        12         2            95        25.47%

The major change is in the two headline categories of Best SF and Best Fantasy, where Erin Morgenstern is the first woman to win a Dragon Award in those categories.

2020 Dragon Awards

Well, I can say what I like about the Dragon Awards but their livestream award announcement beat the Hugo Award in terms of efficiency and general presentation.

The winners (I missed the games) are:

  1. Best Science Fiction Novel: The Last Emperox by John Scalzi
  2. Best Fantasy Novel (Including Paranormal): The Starless Sea by Erin Morgenstern
  3. Best Young Adult / Middle Grade Novel: Finch Merlin and the Fount of Youth by Bella Forrest
  4. Best Military Science Fiction or Fantasy Novel: Savage Wars by Jason Anspach & Nick Cole
  5. Best Alternate History Novel: Witchy Kingdom by D. J. Butler
  6. Best Media Tie-In Novel: Firefly – The Ghost Machine by James Lovegrove
  7. Best Horror Novel: The Twisted Ones by T. Kingfisher
  8. Best Comic Book: Avengers by Jason Aaron, Ed McGuinness
  9. Best Graphic Novel: Battlestar Galactica Counterstrike by John Jackson Miller, Daniel HDR
  10. Best Science Fiction or Fantasy TV Series: The Mandalorian – Disney+
  11. Best Science Fiction or Fantasy Movie: Star Wars: The Rise of Skywalker by J. J. Abrams

Also, Siobhan Carroll won the Eugie Award for “He Can Creep”, which was a personal favourite.

I’m counting pronouns

I woke up late because I was working late on things from the mundane physical world, which means ploughing through spreadsheets and making columns think they are rows and rows think they are columns. This meant this morning I had less time for my entertainment, which means ploughing through spreadsheets and making columns think they are rows and rows think they are columns. Specifically, I wanted to wrangle the IGNYTE award finalists into a spreadsheet with a similar format to the one I’m using to collate Dragon Award finalists.

One bonus for the IGNYTE awards is that the finalist list typically includes the publisher where it is relevant. So that’s good. A downside (purely from a tabulating-things perspective) is that a single finalist can have as many as 20+ names attached, but that is a good challenge. Often things that look convenient from a data perspective (‘a book’ has ‘an author’) reveal assumptions about our world. Writing a list of stuff is something imbued with socio-political perspectives that can literally trip you up[1]. Counting people means categorising people, and categorising people is not a simple thing.

A key issue is counting gender. There are good reasons for tabulating gender because it is an almost universal issue of social disparity across the world. We can see a lot about social change and inequality by looking first at gender, and we have roughly two major categories of people (male, female) plus some proportionally smaller ones[2]. But gender is also complex, and I’m not sure about the best way of counting it. Also, even though I primarily present gender stats in aggregate, I am still going through lists of authors and sticking them in a gender box.

Looking at the ‘how’ of the classifications, I decided to modify the way I was tabulating gender. What I did with the Dragons was check Wikipedia entries or other sources of author bios, looking for indications of gender… but really what I was doing was checking pronoun usage. So, if what I was actually doing was counting pronoun usage, then a smarter move was to tabulate PRONOUNS rather than gender, i.e. instead of ‘other/non-binary’, ‘male’, ‘female’, use the categories ‘they/other’, ‘he’, ‘she’. That way the presentation of the data matches what I was actually doing.
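In spreadsheet terms that tabulation amounts to mapping each recorded pronoun string onto one of three boxes and counting. A minimal sketch (the finalist names and pronouns here are placeholders, not real award data):

```python
# Tally finalists by pronoun category rather than gender, as described.
# The finalists listed below are illustrative placeholders only.
from collections import Counter

def pronoun_category(pronouns: str) -> str:
    """Map a recorded pronoun string onto the three tabulation boxes."""
    p = pronouns.strip().lower()
    if p.startswith("she"):
        return "she"
    if p.startswith("he"):
        return "he"
    return "they/other"

finalists = [
    ("Author A", "he/him"),
    ("Author B", "she/her"),
    ("Author C", "they/them"),
    ("Author D", "she/her"),
]
tally = Counter(pronoun_category(p) for _, p in finalists)
```

The point of the mapping is that it records what was actually checked (the pronouns in the bio) rather than an inference about gender.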

I don’t know if that’s the best approach but it has another benefit for a different question about the gender of authors. In note [1] below, the issue of James S. A. Corey was pertinent: two people who author books under a single name. Daniel Abraham and Ty Franck are both men, but what gender is James S. A. Corey? It’s not just an abstruse philosophical question, because if we are thinking about how sexism plays out in awards or book purchases etc., the presentation of gender is relevant. “Robert Galbraith” is a pseudonym used by the author J.K. Rowling, and at least initially it was a secret that Galbraith was Rowling, which presents an interesting question when classifying books by the gender of their author.

Historically this has a further layer. Many women have used male pseudonyms (or made their gender less obvious by using initials) as a way to avoid sexism in book purchasing etc. However, some authors in the past who used pseudonyms of a different gender to their ‘everyday’ names did so for other reasons, i.e. as a means of exploring their own gender. Nor are those mutually exclusive motivations. Authors regarded socially as female may have chosen male pseudonyms both to avoid sexism and to express their own understanding of their gender, and it won’t always be clear which.

There’s a sort of moral to this story, which is the unsurprising conclusion that gender is complex. The specific point in this case is that classifying authors by gender is complex REGARDLESS of your views on gender. You could have quite regressive views on gender (e.g. J.K. Rowling) but that doesn’t change the fact that there are cases where an author’s gender can be hard to classify (e.g. Robert Galbraith).

[1] e.g. back in 2019 some people scoffed that I’d made an error saying 10 men had won Dragon Awards in the two headline categories, because four years and two categories comes to 8, so how could it be 10, etc.

[2] A reminder that ‘proportionally small’ can add up to a lot of actual people.

The Webtoon Short Story Contest

Webtoon is an online comic company that uses the Korean style of online comics and has become an increasingly popular platform for graphic stories. I mentioned Webtoon most recently in my summary of James Davis Nicoll’s Hugo Packet, where he reviewed this series.

Where there are stories gathered together there are story competitions, and Webtoon is no different. They recently held their Short Story Contest, with the winners announced here. It’s a juried award with cash prizes that splits winners and runners-up into two categories: “Brain” for stories that blow your mind and “Heart” for stories that warm your heart (Rules and FAQs).

“Why are you telling us all this, Camestros?” I hear you say. Why? Because these are interesting graphic stories told in a vertically scrolling format, and I think you might like some of them. I don’t need a hidden agenda to do that, do I? I don’t need ulterior motives just to point out fun or interesting genre-related stories?


Remember Comicsgate, aka The Crappiest Gate, and how Vox “I have never been a neo-Nazi” Day’s Castalia House had its own comics imprint, Arkhaven, and how they were going to have a movie etc. etc.? Recently, Arkhaven started putting a limited number of their comics on Webtoon. Arkhaven submitted one of their series, “Midnight’s War”, to the Webtoon Short Story Contest. It’s a story about a vigilante superhero in a dystopian future where vampires rule the world (you can work out the subtext, given it is coming from a company that’s published overtly QAnon-themed comics).

Well, surprise, surprise, it didn’t win. Fair’s fair: the artwork is slicker than the rest of Arkhaven’s offerings, but it has dull dialogue and is not an obvious fit with the general style of Webtoon. It’s also not really a short story as such (more an opening chapter), and it’s violent, while the rules expressly stated that entries should not be “excessively violent”… and so on.

Anyway, no surprise it didn’t get anywhere.

Day though isn’t happy:

“And it wasn’t just unawarded. Midnight’s War somehow didn’t even qualify as one of the 36 runners-up despite being one of the top 10 ranked in Popularity and earning a higher rating than two out of the three Silver winners.

This tells me that Arkhaven needs to seriously rethink our plan to use Webtoons as a platform. While some of the winning stories were pretty good, precisely none of the art in any of the winners or runners-up was even close to that of Midnight’s War. It’s now clear that it’s just not the sort of thing that they are ever going to promote to their readership.” [warning]

It’s a bad week for the right in literary awards, but it’s actually worse than it looks.

Arkhaven already has its own website, and Day has boasted about the money made from crowdfunding and Amazon sales etc. So why the mini-tantrum about not winning an award on a website where Arkhaven is giving away their material?

It is the law of diminishing returns, or rather the law of a very saturated and limited audience. Each of Day’s multiple schemes over the years has been tapping the same cloud of people for money. To be fair to Day and his publishing schemes, he has actually produced products (books, comics etc.) that have actual words and pictures. It’s not a con as such, but Arkhaven is facing the same problem Castalia House had: the products never break out beyond this limited audience. Worse, back in 2014 the potential audience (disaffected right-leaning online people who like SFF) was broader and less factional. Day’s own antics have pissed off multiple people over time: back in 2013/14 Mad Genius or Larry Correia would still make excuses for Day, in 2015 they distanced themselves to create deniability over Rabid Puppies, and by 2016 they started pretending he didn’t exist. Similar disputes with people further to the right (e.g. Gab or other putative leaders of the alt-right) have divided the potential audience for his product further. Feuding with Ethan Van Sciver over who owns “Comicsgate”, in a bald-men-fighting-over-a-broken-comb dispute, divided that audience further still.

Webtoons was a recruitment drive and it didn’t work.

So what’s actually changed with the Dragons?

There is much wailing and gnashing of teeth and lamentations unto heaven about this year’s crop of Dragon Award finalists. However, the level of woe is not matched by volume — it’s really just from a small corner, i.e. Declan Finn and Brian Niemeier. Other authors of a woeful-hound aspect are either ignoring the ballot or are phlegmatic about it.

Elsewhere, the very trad-pub nature of the ballot has resulted in more positive press about the Dragons. How come? It’s no great mystery. Put books on a ballot list that are published by companies with a social media department and you will get more and wider coverage.

However, what has really changed here? To see what is different I want to point to this post from almost exactly a year ago.

I drew various doughnut graphs showing a breakdown of publishers at an aggregate level. I won’t copy over all the graphs from that post but here is 2016 – the first Dragon Award year.

The distinctive aspect of the Dragons was on the left-hand side of the graph, for both good and bad: many self-published works or small-press works (often coalitions of self-published authors). Baen was a major force, and there was the added presence of Vox “I have never been a neo-Nazi” Day’s Castalia House.

Here is how 2020 looks:

There is still a significant chunk of small and medium publishers there (including new names like Aethon Press). However, self-published works and Baen have only a single book each, and the behemoths of publishing (via smaller imprints) have a big slice of the doughnut (to mix bakery-product metaphors).

Here is a fancy gimmick that doesn’t quite work to show the change from 2019 to 2020.

While it seems like a big change in character, the big publishers’ share has grown in line with how it has been growing over time with the Dragons.

The biggest change is the loss of any finalists from Chris Kennedy Publishing, whose stable of authors has been a steady source of finalists over the years. Baen has also declined, but it has been declining in numbers in the Dragons since 2016. Here’s a somewhat arbitrary grouping to show the changes:


If put under torture, I don’t think I’d be able to offer a consistent rationale for how I split things between trad and non-trad (e.g. Amazon’s 47North I counted as non-trad despite being owned by a huge company). Baen’s decline is a consistent trend. I guess if I could classify authors as “Baen adjacent” (e.g. Chris Kennedy again, or Christopher Ruocchio, who is published by DAW but is a Baen editor), the numbers would be bigger.

Is the change this year Covid-19 related? The pandemic is such a substantial presence this year that I can’t dismiss that out of hand. However, given that the Dragon Awards is and always has been a primarily online activity with only nominal connection to the DragonCon event (aside from the award ceremony), I’m sceptical. Overall I believe the data suggests that 2020 is not an unusual year but really just a continuation of an existing trend. Over time the Dragon Awards have featured more finalists published by imprints of the big publishing companies.

We have two competing hypotheses for the Dragon Awards:

  • They are a genuine popular vote, with finalists and winners determined by simple counts of online voters.
  • They are a somewhat curated set of finalists and winners that use online votes as advice and information.

These results do not help us discover where the truth lies between those two hypotheses. We might expect that over time votes will become more mainstream. Many, many people read independently published SFF, but the books most widely read in common are more likely (in the long run) to be from big publishers. So over time we would expect a popular-vote award to become more mainstream.

Good news for the Dragons? Yes and no. Yes in so far as some past winners have been hyper-dodgy. No because the smaller and self-published works were a distinctive aspect of the Dragons that distinguished them from other awards. However, rather like the down-ballot of the Hugo Awards helps promote short fiction and fan-creators, the down-ballot of the Dragons is still quite dragony in the MilSF and Alt-history categories. Arguably, very mainstream finalists in the headline categories brings more attention to the less trad-pub books further down.

Yet the Dragons are also stuck with their initial legacy: improved status going forward gives the original 2016 winners more status in the future.