Visualized TV show ratings over time

Hahahaha, the last season of Dexter. Good job, team.

4 Likes

The last season of Dexter didn’t hit the highs the show reached in earlier seasons, but I don’t think the decline was that significant. And the ending was arguably a fitting end for the character, even if it wasn’t perfectly satisfying.

1 Like

It looks to me like most of these “trends” are imposed on a wildly random dataset. Especially if you measure from the x-axis (i.e. a zero baseline), it pretty much looks like “what trend?”

1 Like

What I find interesting is how many “quality” shows like “Breaking Bad” and “Mad Men” and “Doctor Who” go from low numbers at the beginning of each season to high numbers at the end, then back to roughly the same low at the start of the next season.

It’s almost like by the end of a declining show’s run most of the people who don’t like it have stopped watching/rating… 😛
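
For anyone who wants to check that pattern, here’s a rough sketch of how you might compute a per-season slope. The table and the numbers in it are made up for illustration, since I don’t have the actual IMDb episode data handy:

```python
# Hypothetical per-episode ratings; the values are invented.
import numpy as np
import pandas as pd

episodes = pd.DataFrame({
    "season":  [1, 1, 1, 1, 2, 2, 2, 2],
    "episode": [1, 2, 3, 4, 1, 2, 3, 4],
    "rating":  [7.8, 8.0, 8.3, 8.9, 7.9, 8.1, 8.4, 9.0],
})

# Least-squares slope within each season: a positive slope means
# ratings tend to climb from the premiere to the finale.
slopes = {
    season: np.polyfit(grp["episode"], grp["rating"], 1)[0]
    for season, grp in episodes.groupby("season")
}
print(slopes)  # rating points gained per episode, per season
```

If most seasons show a positive slope that resets at each premiere, that’s the sawtooth pattern described above.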

2 Likes

Am I misunderstanding how these graphs work? How does the above “Dexter” graph square with this, from Wikipedia: “The original broadcast of the series finale — shown at 9 p.m. on September 22, 2013 — drew 2.8 million viewers, the largest overall audience in Showtime’s history”?

Yeah, I noticed that too. Check out The Wire: http://graphtv.kevinformatics.com/tt0306414

The ratings are from IMDb user votes, not viewership numbers.

2 Likes

The graphs are of IMDb users’ ratings of each episode, not of the total number of views. So 2.8 million viewers watched the last episode of Dexter, but I guess very few of them actually liked it.

Game of Thrones shows the same thing very strongly, except that when you re-scale the y-axis to run from 0 to 10 the effect all but disappears, and any residual trend would be swamped by error bars.
See: http://en.wikipedia.org/wiki/Misleading_graph#Truncated_graph
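
Here’s a quick illustration of the truncation effect, using invented ratings rather than real GoT data. The same numbers are plotted twice: once on an auto-zoomed axis and once on the full 0-10 scale:

```python
# The same made-up ratings on a truncated axis vs. the full 0-10 scale.
import matplotlib.pyplot as plt

ratings = [8.0, 8.1, 8.3, 8.2, 8.5, 8.6, 8.8, 9.1]
episodes = range(1, len(ratings) + 1)

fig, (zoomed, full) = plt.subplots(1, 2, figsize=(8, 3))
zoomed.plot(episodes, ratings, marker="o")
zoomed.set_title("Auto-scaled (truncated) axis")  # trend looks dramatic

full.plot(episodes, ratings, marker="o")
full.set_ylim(0, 10)                              # full rating scale
full.set_title("Full 0-10 axis")                  # trend nearly flattens

for ax in (zoomed, full):
    ax.set_xlabel("Episode")
    ax.set_ylabel("IMDb rating")
plt.tight_layout()
plt.show()
```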

The Littlest Hobo visualization is sad.

http://graphtv.kevinformatics.com/tt0078644

Are you suggesting the ratings don’t actually go up over the course of each season? Those look like statistically significant trends to me, truncated scales or no. At least for The Wire they do; I’m not sure about Game of Thrones. The R value is small, but so is the p-value.

No, the trend is up, but I’m not sure that it’s statistically significant. I mean, it might be, but I’m not inclined to work it out right now. My point was simply that GoT, for example, seems to show a very strong upward trend across each season, until you notice that the y-axis is truncated. When you re-scale it, that trend all but disappears.

The Wire is similar, but (checks) the trend is still obvious (though not as dramatic) after re-scaling. I noticed the same effect across most of the shows I looked at. shrug Something to be aware of, I guess.
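
For anyone who does feel like working it out, here’s a minimal sketch using scipy, again with made-up ratings rather than the real episode data. linregress reports both r and a p-value for a nonzero slope, which is how a trend can have a small r yet still be statistically significant:

```python
# Fit a line to hypothetical episode ratings and test the slope.
from scipy.stats import linregress

episodes = list(range(1, 11))
ratings = [8.0, 8.2, 8.1, 8.4, 8.3, 8.5, 8.4, 8.6, 8.7, 8.8]

fit = linregress(episodes, ratings)
print(f"slope={fit.slope:.3f}, r={fit.rvalue:.2f}, p={fit.pvalue:.4f}")
# A small r means the line explains little of the variance; a small p
# means the upward slope is unlikely to be chance. Both can hold at once.
```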

Same for Seinfeld.

Probably a good way to decide which Doctor Who seasons are worth watching and which aren’t, based on each season’s trend.

Consistently great, I guess, apart from the season finales (I don’t remember S4 being that much worse…).

It’s interesting, but you actually do have to zoom to the 7-9 range to see the true spread of values. I can’t find the paper at the moment, but this blog post describes how IMDb’s choice of offering too many rating options ends up producing less deviation and skewing towards the high end: http://ezinearticles.com/?Using-Smaller-Response-Scales-in-Online-Surveys&id=3442219
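
Here’s a rough simulation of that clustering; the distribution parameters are invented, purely to make the point. When nearly all votes land between 7 and 9, the spread is tiny relative to the full 0-10 scale, so an untruncated plot looks flat:

```python
# Simulate high-skewing ratings and measure their spread.
import numpy as np

rng = np.random.default_rng(0)
votes = np.clip(rng.normal(loc=8.0, scale=0.6, size=10_000), 1, 10)

print(f"mean={votes.mean():.2f}, std={votes.std():.2f}")
# With a standard deviation well under 1 point, zooming to the 7-9
# range is the only way to see episode-to-episode variation.
```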

This topic was automatically closed after 5 days. New replies are no longer allowed.