How To Refashion Your Flickchart Rankings As Ratings

Nathan Chase

Nathan Chase is a co-founder and the designer of Flickchart. He’s also a multimedia designer & developer living in central Florida, an online culture and social networking enthusiast, a proud father, an avid PC gamer, and an incessant movie watcher known for an eclectic musical taste, who often writes and performs music – on the drums, guitar, piano, or computer.

You can find Nathan on Flickchart as Zampa, and email him at nathan@flickchart.com.


10 Responses

  1. HungryTyger says:

    @One Small Popcorn –

    There certainly can be merits to each system, in addition to the weaknesses. I agree that defining what the number of stars means is important if you are going to use that system. But from there how would you create an ordered list of the movies you have rated? Obviously all of the five-starred movies would be above the four-starred movies, but then what? That is part of what Flickchart attempts to address.

    You bring up a good point, though. Knowing the order of a Flickchart user’s list would not necessarily tell you how “good” they think a movie is. If that user only ever saw terrible movies, knowing which movie is in their #1 spot is not very helpful. (Unless you’re into terrible movies.) Likewise, if a user only ever has seen great movies, there’s no sense in mocking whichever movie is in their dead last spot.

    The good news in this scenario is that most people have seen movies that cover the spectrum, so looking at another user’s Top 20 movies actually is meaningful. But you’re right that the user would have to define where the cutoff is, for them, on their flickchart to differentiate between the good, the bad, and the ugly. That currently does not exist on the site. We also do not yet have a way for users to indicate what “weight” they would give a particular movie. For example, just how much more does Nathan like Back to the Future over Back to the Future 2? You can’t absolutely tell that yet, even though we know they are respectively #5 and #24 out of 1015 movies on his list. That is something we are working toward to further refine users’ lists and the combined, global list on the site.

    Thanks for your post. We really appreciate hearing others’ thoughts on the whole thing, and feedback like that helps us hone our own opinions of the process.

  2. FitFortDanga says:

    Your conversion of ranking to rating is severely flawed. The percentage of films above a movie does not translate to a numerical evaluation of that movie’s merits. If someone scores the lowest in his class on a test, that does not mean his score is 0. If he’s in the middle of the curve, it does not mean his score is 50. You are assuming that all the movies you’ve seen are perfectly aligned from “bad” to “good” in even, incremental gradations.

  3. Robards says:

    If you’re talking about distance between films, an interface where you drag movies away from each other to show how far apart in your estimations they are would be cool, and could result in some kind of rankings list incorporating spaces between individual or groups of films. The films would still be in ranked order, but the intervals/gaps between them on the visibly reproduced scale would not be regular.

    If you wanted to represent Star Wars as far and away the greatest movie of all time, then you’d drag it higher up the top of your scale. The 2nd movie would still be 2nd, but there’d be a huge visible gap between it and Star Wars.

    So you would do your movie vs movie ranking, then tweak your final list by adding space between movies or groups of movies.

    This could allow the user to communicate weighting, without having to go the rating route.

    Also, to indicate your recommended cutoff point, you could be given the opportunity to place a separator line or something. You could have as many of these as you saw fit.

    Add these to the thousands of unsolicited requests you guys get every day!
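Robards’ drag-to-weight idea could be represented with a simple data structure: each film keeps its rank order but also carries a position on a continuous scale, so the gaps between entries can be uneven. A minimal sketch of that idea (the film names and numbers are purely illustrative, not from Flickchart):

```python
# Hypothetical sketch: a ranked list where each film also carries a
# position on a 0-100 scale, so visible gaps between entries can vary.
weighted_list = [
    ("Star Wars", 100.0),     # far-and-away favorite
    ("The Godfather", 78.0),  # still #2, but a big visible gap above it
    ("Jaws", 75.5),           # close behind #2
]

def is_valid(entries):
    """The order must still match the ranking: positions strictly decrease."""
    positions = [pos for _, pos in entries]
    return all(a > b for a, b in zip(positions, positions[1:]))
```

Dragging a film up or down would only change its position value; the ranking itself stays intact as long as `is_valid` holds.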

  4. Nathan Chase says:

    @FitFortDanga – Yes. That’s exactly what I’m implying. The top of my list represents the best, while the bottom of my list represents the worst. Everything else is in the gradient of most liked to least liked. That’s why I posit that the position of a film on my list most accurately reflects my opinion of said film.
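The conversion under debate here amounts to a percentile: a film’s rating is the fraction of the list it outranks. A minimal sketch, with the function name and one-decimal rounding being my own choices; the example ranks are the #5 and #24 of 1015 mentioned above:

```python
def rank_to_rating(rank, total, scale=100):
    """Convert a 1-based rank into a percentile-style rating:
    the #1 film scores `scale`, the last-place film scores 0."""
    if not 1 <= rank <= total:
        raise ValueError("rank must be between 1 and total")
    if total == 1:
        return float(scale)  # a one-film list is trivially the best
    # Fraction of the other films that this film outranks.
    return round((total - rank) / (total - 1) * scale, 1)

back_to_the_future = rank_to_rating(5, 1015)     # ≈ 99.6
back_to_the_future_2 = rank_to_rating(24, 1015)  # ≈ 97.7
```

FitFortDanga’s objection is precisely that this mapping assumes the films are evenly spaced in quality, which the percentile alone cannot guarantee.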

  5. johnmason says:

    It’s all so true. I mean, there is probably a good solid chunk of my Top 100 that I’ve given 10 stars on IMDB. (What can I say? I’m generous, like Ebert.)

    I’ve often thought of comparing my IMDB rankings to my Flickchart, and adjusting one or the other accordingly, but I usually just opt to let my Flickchart work itself out. My IMDB rankings are often based on my feelings after just having seen a particular film; I often wind up changing them after another viewing.

    But, of course, that’s the same with Flickchart, too. Very recently, I rewatched Groundhog Day. Subsequently, it jumped over 100 spots on my Flickchart. Think I gave it an 8 on IMDB, but it’s now in my Top 200. Such a subjective thing….

  6. johnmason says:

    Even now, I’m looking at my Top 50 and seeing a couple of problems. Most of those films I genuinely consider to deserve a “10” rating (including The Fugitive, currently at #50). But I’m noticing a couple of films on there (Die Hard with a Vengeance being a really good example) that I would only consider to be an 8 or 9. Odd….

  7. Charlie Johnson says:

    Both rankings and ratings serve a purpose. I tend to use ratings as an immediate response to a film. They’re good shorthand both for reviewing and for seeing how good someone thinks a film is. It takes all of five seconds to come up with a rating, and if you want to, you can go back and change it later.

    Rankings take time and effort to produce. For some (including me) there is endless fun to be had in the tinkering. They can reflect your changing opinion on a film more subtly than a rating system, and they give rise to fun debates with friends over beer and a pizza.

    Both have their place, but as HungryTyger says, just because you have a nice list of ranked films, it doesn’t imply a linear relationship with your ratings. There’s a correlation between my rankings and ratings which is especially close at the top and bottom ends, but more mixed up in the middle.

  8. I think this point has been made in these comments, more or less, but the real idea is to give your lowest ranked film an F, and call F a grade of 50, rather than a grade of 0. The problem with considering all films from 0–60 to be an F is that F gets a waaaaay bigger share of the available numbers than D, C, B or A. It makes the math a little more difficult, but maybe that’s the best way to make Primer the B that it is, rather than the D that this system indicates it is. (By the way, I’m more inclined to agree with the D grade for Primer — I was not only on the “hard to follow” bandwagon, but was on the “this film is obtuse and pointless” bandwagon. But that’s just me. I guess I’d say it was a good effort that failed more noticeably than most noble failures fail.)
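This commenter’s remapping can be sketched in a few lines: compress the 0–100 percentile into the 50–100 band before applying the usual grade cutoffs, so each letter gets an equal ten-point share. A hedged sketch (the function name, `floor` parameter, and cutoffs are my own choices based on the comment):

```python
def percentile_to_grade(pct, floor=50):
    """Map a 0-100 percentile onto letter grades, compressing the scale
    so the worst film lands at `floor` (an F at 50) rather than at 0.
    With floor=50, each grade (F, D, C, B, A) covers ten points."""
    remapped = floor + pct * (100 - floor) / 100
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if remapped >= cutoff:
            return grade
    return "F"
```

Under this remapping, a film sitting at the 60th percentile (a D on the raw scale) remaps to 80 and earns a B, which is exactly the Primer scenario the comment describes.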

  9. Nathan Chase says:

    Right, I mainly used the letter grades to show another form of merit. I agree that it’s probably unfair to give anything past my 60% mark an F, but it does mean something if those films are that far below so many others…