beschizza at March 24th, 2014 10:25 — #1
tanya_d at March 24th, 2014 11:16 — #2
Hahahaha, the last season of Dexter. Good job team.
ktaylor38854 at March 24th, 2014 11:21 — #3
The last season of Dexter didn't reach the highs the show managed in its other seasons, but I don't think the decline was that significant. And the ending was arguably a fitting end for the character, even if it wasn't perfectly satisfying.
boundegar at March 24th, 2014 11:46 — #4
It looks to me like most of these "trends" are imposed on a wildly random dataset. Especially when you measure from the x-axis, it pretty much looks like "what trend?"
mwmpreece at March 24th, 2014 12:00 — #5
What I find interesting is how many "quality" shows like "Breaking Bad" and "Mad Men" and "Doctor Who" go from low numbers at the beginning of each season to high numbers at the end, then back to roughly the same low at the start of the next season.
rev at March 24th, 2014 12:08 — #6
It's almost like by the end of a declining show's series most of the people who don't like it have stopped watching/rating...
mwmpreece at March 24th, 2014 13:11 — #7
Am I misunderstanding how these graphs work? How does the above "Dexter" graph square with this, from Wikipedia: "The original broadcast of the series finale — shown at 9 p.m. on September 22, 2013 — drew 2.8 million viewers, the largest overall audience in Showtime's history"?
hughstimson at March 24th, 2014 13:21 — #8
Yeah I noticed that too. Check out The Wire: http://graphtv.kevinformatics.com/tt0306414
hughstimson at March 24th, 2014 13:21 — #9
The ratings are from IMDB user votes, not viewership numbers.
psychonaut at March 24th, 2014 13:23 — #10
The graphs are for IMDb users' ratings of the episode, not for the total number of views. So 2.8 million viewers watched the last episode of Dexter, but I guess very few of them actually liked it.
jons at March 24th, 2014 14:18 — #11
Game of Thrones shows the same thing very strongly, except that when you re-scale the y-axis to run from 0 to 10 the effect all but disappears, and any residual would be swamped by error bars.
genre_slur at March 24th, 2014 16:17 — #12
hughstimson at March 24th, 2014 19:16 — #13
Are you suggesting the ratings don't actually go up over the course of each season? Those look like statistically significant trends to me, truncated scales or no. At least for The Wire they do, not sure about Game of Thrones. Small R value but small p value.
jons at March 24th, 2014 20:10 — #14
No, the trend is up, but I'm not sure that it's statistically significant. I mean, it might be, but I'm not inclined to work it out right now. My point was simply that GoT, for example, seems to show a very strong upward trend across each season, until you notice that the Y axis is truncated. When you re-scale it, that trend all but disappears.
The Wire is similar, but (checks) the trend is still obvious (though not as dramatic) after re-scaling. I noticed the same effect across most of the shows I looked at. shrug Something to be aware of, I guess.
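For anyone who wants to check this kind of thing themselves, here's a quick sketch of the point being argued: with enough episodes, a small per-episode upward drift can be statistically significant (small p value) while still explaining only a modest share of the variance (small R²), which is why it looks dramatic on a truncated axis and flat on a 0–10 one. The numbers below are made up for illustration, not the real GraphTV/IMDb data.

```python
# Synthetic episode ratings: a slight upward drift plus noise.
# (Illustrative numbers only -- not actual IMDb ratings.)
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
episodes = np.arange(100)                                   # episode index
ratings = 8.0 + 0.005 * episodes + rng.normal(0, 0.3, episodes.size)

fit = linregress(episodes, ratings)
# A small p value says the slope is unlikely to be zero;
# a small R^2 says the trend explains little of the scatter.
print(f"slope={fit.slope:.4f}  R^2={fit.rvalue**2:.3f}  p={fit.pvalue:.2e}")
```

On a y-axis clipped to, say, 7.5–9 the fitted line looks steep; on the full 0–10 scale it's nearly flat, even though the p value is identical either way.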
teapot at March 24th, 2014 22:20 — #15
Same for Seinfeld.
Probably a good way to decide which Doctor Who seasons are worth watching and which aren't, based on the season trend.
garygoldfinch at March 25th, 2014 07:54 — #16
Consistently great I guess apart from the season finales (don't remember S4 being that much worse...)
mrscience at March 27th, 2014 16:32 — #17
It's interesting, but you actually do have to scale to the 7–9 range to see the true range of values. I can't find the paper at the moment, but this blog post describes how IMDb's choice of offering too many rating options ends up reducing deviation and skewing ratings toward the high end: http://ezinearticles.com/?Using-Smaller-Response-Scales-in-Online-Surveys&id=3442219
beschizza at March 29th, 2014 10:25 — #18
This topic was automatically closed after 5 days. New replies are no longer allowed.