Empirical Evidence that Flickchart is the Future of Film Criticism
The homepage of Flickchart states “If they’re all 5 star movies, which one is the best?” I put that to the test to determine just how effective their system is, and just how broken traditional rating methods are.
I keep a running diary of all the films I watch, complete with date, film, rating, and check marks indicating whether it’s the first time I’ve seen the movie and if I watched it on TCM. There are very few other things to do with the “Latino Images in Film” spiral notebooks I won from TCM a while back. I’m almost positive Cantinflas would have done the same thing. I watch a movie, record it in the diary an hour later after it has sunk in, and then get it on Flickchart when I get the chance. The current book is at over 500 movies since July ’09. Now, you’d expect the ratings I come up with right after seeing a movie to line up with their positions on Flickchart, right?
Here are some of those 500+ movies, selected completely at random. All of them are in what I consider to be their homes on Flickchart; except for the odd match-up or two, those ranks aren’t changing. I’ll organize them by Flickchart ranking (out of the 1,870 movies currently on my chart) so you can see the disparity:
Flickchart #25 – 7 Brides for 7 Brothers – Diary = 7.5
Flickchart #63 – Peeping Tom – Diary = 9.0
Flickchart #101 – Stormy Weather – Diary = 8.0
Flickchart #195 – Ricochet – Diary = 8.7
Flickchart #210 – Back to the Future II – Diary = 7.7
Flickchart #239 – Rocky III – Diary = 9.6
Flickchart #249 – The Narrow Margin – Diary = 5.9
Flickchart #312 – The Dark Knight – Diary = 7.8
Flickchart #355 – The Innocents – Diary = 8.7
Flickchart #395 – Star Trek – Diary = 9.4
Flickchart #461 – 9 – Diary = 6.8
Flickchart #468 – The Curious Case of Benjamin Button – Diary = 5.4
Flickchart #473 – Coraline – Diary = 4.2
Flickchart #709 – Duck Soup – Diary = 7.8
Flickchart #818 – Gremlins – Diary = 8.4
I knew listening to Ice-T and not dropping science would eventually pay off. Look at that evidence. I have pages of it. My least favorite movie from that random selection (according to Flickchart) got a higher diary rating than my favorite. At their core, those numerical ratings mean nothing; they’re too arbitrary. Even fellow cancer survivor and my personal hero, Roger Ebert, falls victim to the archaic ratings system. He just wrote up The Big Lebowski in his “Great Movies” essay series, one of only a couple hundred films to get such treatment, yet he gave it only three stars when he first saw it. All of a sudden a 7.5 becomes a 10? Rotten Tomatoes shows him with roughly 1,000 four-star reviews, and that doesn’t even count the 3.5-star reviews. How does a three-star movie suddenly outrank a majority of those? Simple: you can’t give a movie a true and precise rank until you drop two of them in a red square and click one. In other words, ranking a film means putting it in perspective against every other movie in your consciousness, and that is exactly what Flickchart specializes in.
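The idea behind those red-square matchups can be sketched in code. This is only an illustration of pairwise-comparison ranking, not Flickchart’s actual algorithm: here a new film is placed into an existing ranked list by binary search, so the only question ever asked is “this one or that one?”, never “how many stars?” The film titles and the `taste` table are hypothetical stand-ins for a user’s preferences.

```python
# A minimal sketch of ranking by pairwise matchups, in the spirit of
# "If they're all 5 star movies, which one is the best?" This is NOT
# Flickchart's actual algorithm, just a simple binary-insertion analogue.

def insert_by_matchup(ranking, new_film, prefers):
    """Insert new_film into ranking (best first) using pairwise matchups.

    `prefers(a, b)` returns True if the user picks film `a` over film `b`.
    Binary search means only about log2(len(ranking)) matchups are needed.
    """
    lo, hi = 0, len(ranking)
    while lo < hi:
        mid = (lo + hi) // 2
        if prefers(new_film, ranking[mid]):
            hi = mid          # new film wins the matchup: it ranks above
        else:
            lo = mid + 1      # new film loses: it ranks below
    return ranking[:lo] + [new_film] + ranking[lo:]

# Hypothetical taste, encoded only for this demo: lower number wins matchups.
taste = {"Rocky III": 1, "Star Trek": 2, "Gremlins": 3, "Duck Soup": 4}
prefers = lambda a, b: taste[a] < taste[b]

chart = ["Rocky III", "Gremlins"]
chart = insert_by_matchup(chart, "Star Trek", prefers)
chart = insert_by_matchup(chart, "Duck Soup", prefers)
print(chart)  # ['Rocky III', 'Star Trek', 'Gremlins', 'Duck Soup']
```

The point of the sketch is that the output is a strict ordering: every film ends up above or below every other film, with no way for two “5-star movies” to tie.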
I’d love to see the day when critics grew a pair and gave thoughtful, meaningful ratings instead of the generic status quo. Really, what does two stars even mean, anyway? Could you imagine seeing the latest critics’ reviews, and instead of giving a movie two stars, one read “Flickchart #5734/8126”? They could show the two movies ranked just above and below it to give their readers some actual perspective. Opening up about how they rank movies, and what their tastes are in general, by joining the Flickchart community might just help save their dying profession.
This post is part of our User Showcase series. You can find Daniel as espin39 on Flickchart. If you’re interested in submitting your own story or article describing your thoughts about movies and Flickchart, read our original post on how to become a guest writer here on the Flickchart Blog.