Tracking a Mythical Beast: Two Methods of Measuring Changes in Your Flickchart
I have a dream that someday my Flickchart will be in perfect order, and that I’ll never make a choice that changes a movie’s position. In the rational part of my brain, I know that day will never come. My feelings about movies change as I rewatch them, read about them, start to forget them, or work them over in my mind. But I can’t shake the notion that the more I rank movies, the more “accurate” my chart should become and the less it should change.
So a couple of years ago I created a system to track my chart’s changes mathematically. I wanted to determine whether its rate of change was slowing down.
Every time I made a choice that changed a movie’s ranking, I made a tally mark. There were two kinds of tally marks: one for movies that rose between 10 and 99 places on the chart, and one for movies that rose 100 or more places. In my quest for perfection, small changes are less troubling than large ones. A movie that rose fewer than 10 places, I generously figured, was not to be blamed for so small an offense. My Dinner With Andre did exactly that, edging past The Sundowners while I was preparing this article. Both are movies I esteem highly, so I’m content with either order.
I did this for 30,000 rankings or more. Every 500 rankings I counted the tally marks and hoped there would be fewer than there were in the last set. Much to my dismay, the number of tally marks was very consistent. A typical set of 500 rankings resulted in about 25 changes of 10-99 places and 1 change of 100 or more places. A really demoralizing run might result in 4 or even 5 100-place changes. No 100-place changes happened while I was preparing this article, but Doctor Zhivago came close when it rocketed 81 places to take the spot ahead of Disney’s Frozen in my top 200.
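The old system amounts to a bucket counter over each set of 500 rankings. Here is a minimal sketch of it; the function name, the bucket labels, and the example numbers are my own illustration, not anything from Flickchart:

```python
from collections import Counter

def tally_changes(move_sizes):
    """Count ranking changes per bucket for one set of rankings.

    move_sizes: number of places each affected movie rose.
    Moves under 10 places were not tallied at all in the old system.
    """
    buckets = Counter()
    for places in move_sizes:
        if places >= 100:
            buckets["100+"] += 1
        elif places >= 10:
            buckets["10-99"] += 1
    return buckets

# A typical set of 500 rankings: ~25 mid-size changes and 1 large one,
# plus plenty of small moves that the old system ignored.
example = [12] * 25 + [150] + [3] * 40
print(tally_changes(example))  # Counter({'10-99': 25, '100+': 1})
```

The `Counter` is just a convenience; two integer tallies would do the same job, which is exactly why the system turned out to be too coarse.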
I was careful not to let my dismay at the number of tally marks I was accumulating affect my judgment, but I must admit there were times when I may have let close matchups be decided on the basis of their current rankings, giving preference to the higher-ranked one. I could feel the brown bar judging me. More than usual, I mean.
My insensitive tally marks held surprisingly steady, but I did not despair for long. I realized there was a flaw in my system: the 100-place changes aside, my 10-99 range was simply too wide to reveal a trend. A movie that moved 95 places and one that moved 11 counted identically, so the typical size of the changes could have been shrinking within that band without my tally ever showing it. The number of changes in my chart might have been holding steady even as the actual rate of change was decreasing. I had designed an experiment ill-suited to answer my question.
At the beginning of 2015 I created a new system, one whose results and effectiveness remain to be seen. Instead of tally marks, I note the actual number of places each film rises. At the end of 500 rankings I simply find the average. I also record changes of less than 10 places, even though they don’t really bother me.
Between rankings 64,000 and 64,500, movies that changed position changed 40 places on average. From rankings 64,500 to 65,000, the average jump was 37 places. It ticked back up to 41 places between rankings 65,000 and 65,500. The average movement so far is 39 places, just a little more than Peter Pan moved to supplant Rebel Without a Cause this week on my chart. What’s up with Disney movies not being in the right spot?
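The new system is even simpler in code: record how far each affected movie moved, then average those distances at the end of each 500-ranking window. A minimal sketch, with illustrative numbers of my own choosing:

```python
def average_movement(move_sizes):
    """Average number of places moved, over movies that actually changed position."""
    moved = [places for places in move_sizes if places > 0]
    if not moved:
        return 0.0  # a window with no changes at all: the dream scenario
    return sum(moved) / len(moved)

# Unlike the old tally, small moves (under 10 places) are recorded too,
# so a shrinking average is visible even when the change count holds steady.
print(average_movement([55, 25, 40]))  # 40.0
```

Because every move contributes its actual size, this statistic can keep falling toward the 10-place threshold (or the dreamed-of 0) long after the raw count of changes has flattened out.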
My new system is more demanding than the old, but it has already reassured me that the average change in my chart is less than 50 places. I will run this experiment for a long time to find out if the average is dropping. If it ever drops below 10 places, I will be ready to call my chart as perfect as it can reasonably be.
If it ever reaches 0, I’ll know I’m dreaming.
If you have a better tracking system, or see a fatal flaw in mine, please let me know.