Disagree with the premise. New movies tend to have more plot holes, less characterization and worse writing. Improved effects only rarely make up that margin. I also find the following stories just as plausible as yours: “New movies are over-represented on the IMDB top 250 because they get bolstered by excited fans who just saw the film and haven’t yet taken the time to digest the movie or see how it dates and who, often, haven’t seen the old movies on the list.” The Return of the King is not better than Blade Runner.
For what it’s worth, when you look at rankings, they almost always exhibit a recency bias (Murray gives some examples in Human Accomplishment as he tries to correct for it), so I am pretty skeptical that the IMDB would exhibit an anti-recency bias.
Parts of that argument are unfair. People voting on new movies often haven’t seen the old ones, but obviously most people who voted on the old ones haven’t seen the new ones, either. I’m not sure whether “see how it dates” is a good criterion either—it’s basically saying that what we think of as good movies changes over time, which isn’t ever an argument against any particular movie. If we want to keep the ratings more modern, we could weight new votes more than old votes.
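One concrete way to "weight new votes more than old votes" would be exponential decay by vote age. This is purely a hypothetical sketch — IMDb doesn't publish per-vote timestamps or its actual weighting formula, so the function, the half-life parameter, and the sample votes here are all invented for illustration:

```python
import math

def time_weighted_rating(votes, half_life_years=10.0):
    """Average ratings with weights that halve every `half_life_years`.

    votes: list of (rating, age_in_years) pairs, where age is how long
    ago the vote was cast.
    """
    weights = [0.5 ** (age / half_life_years) for _, age in votes]
    total = sum(w * r for (r, _), w in zip(votes, weights))
    return total / sum(weights)

# With a 10-year half-life, a 20-year-old vote counts a quarter as much
# as a fresh one: (9.0 * 1 + 7.0 * 0.25) / 1.25
print(time_weighted_rating([(9.0, 0.0), (7.0, 20.0)]))  # → 8.6
```

The half-life controls how "modern" the list stays: a short half-life makes the ranking track current taste, a long one preserves the verdict of past voters.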
If IMDB exists 100 years from now, it will probably at least be effective at comparing non-recent movies from different time ranges. Any two movies being compared would each have had a chance at both the new-fan vote and the old-fan vote. Assuming value drift, it’s not clear that the comparison would be meaningful, but it would at least be fair.
I don’t think it’s obvious that people voting on old movies haven’t seen new movies.
It might be likely that people who watch many old movies are less likely to have seen new movies, but it could just be that these people watch more movies in general.
It’s obvious not because of any character trait that those people have, but because the majority of the voting on those movies was done before the new movies came out.
I was going to say that this is likely more true for movies coming out in 2011 than in 2000, which I still believe somewhat, but cursory research indicates that IMDb was started in 1990 and acquired by Amazon in 1998, so even in the year 2000 IMDb was fairly large and therefore probably had reasonable traffic.
When I thought about older movies, I was thinking especially of movies from the ’50s and ’60s, where all of the votes necessarily came in long after the movies were released, rather than movies from the ’90s, where the effect you mention should be pretty strong.
So my new hypothesis in your vein would be that ’90s movies should be over-represented compared with ’00s movies.
Null hypothesis, from the data I’ve referenced in my other comments: approximately 37 movies from the ’90s.
Actual data: 40 movies from the 1990s in the top 250. So signs point to movie quality being essentially constant across time, at least at the decade level of granularity. (I’ll take another look at 1998–2003 specifically, the five years after the Amazon acquisition in which the site presumably had the most traffic, but I feel like I’m privileging the hypothesis here.)
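The back-of-the-envelope comparison can be made explicit. Treating each of the 250 slots as an independent draw with the probability implied by the ~37-movie null (an assumption — the comment doesn't say how the 37 was derived), a binomial check shows that 40 observed vs. 37 expected is well within noise:

```python
import math

N = 250          # size of the top list
expected = 37    # null-hypothesis count of '90s movies (from the other comments)
observed = 40

p = expected / N                     # implied per-slot probability of a '90s movie
sigma = math.sqrt(N * p * (1 - p))   # binomial standard deviation
z = (observed - expected) / sigma

print(f"sigma = {sigma:.2f}, z = {z:.2f}")  # → sigma = 5.61, z = 0.53
```

An excess of 3 movies is about half a standard deviation, so this data can't distinguish "quality is constant across decades" from a modest bias in either direction.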
/done with my silly arguing for the day.
6 from 1998, 6 from 1999, 5 from 2000, basically exactly what I’d expect, nothing super high.
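The per-year eyeball check can be quantified the same way. Assuming the ~37-per-decade null is spread evenly over ten years (an even within-decade spread is my assumption, not something the data establishes), each year should contribute about 3.7 movies, with roughly Poisson noise:

```python
import math

# Decade-level null (~37 movies per decade), spread evenly over ten years.
expected_per_year = 37 / 10
sigma = math.sqrt(expected_per_year)  # Poisson noise on a small count

for year, count in [(1998, 6), (1999, 6), (2000, 5)]:
    z = (count - expected_per_year) / sigma
    print(f"{year}: observed {count}, z = {z:+.2f}")
```

Each year comes out around one standard deviation above expectation, i.e. mildly elevated but nothing a few noisy counts can't produce, which matches the "nothing super high" reading.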
I agree in large part, but it seems likely that value drift plays a role, too.