Me and My Nine Iron

November 29, 2010

Why IMDb is better than Rotten Tomatoes

Filed under: For your pleasure — BJ @ 4:20 pm

When people want to see how good a movie is, they go to one of two sites: IMDb or Rotten Tomatoes. It really irks me when I hear anyone cite a movie’s “Tomatometer,” let alone decide whether to see the movie based on its Rotten Tomatoes score. Here, as briefly as I can manage, is why you should stop checking Rotten Tomatoes and go to IMDb.

The fundamental difference lies in the voting system. Rotten Tomatoes only records whether a reviewer liked a movie or not, whereas IMDb lets the voter rate it on a scale of 1-10, with 10 being the best. The more choices a voter is given, the finer-grained the results will be. It doesn’t take a math major to figure out that ten choices allow for a more accurate grade of a movie than two, especially when the two choices force you to either like it or not. It’s also the reason I bumped my own grading scale from four stars to five: I found even four too limiting. That alone should sell you, but if it doesn’t, read on.
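To see the difference in resolution concretely, here is a toy sketch with made-up numbers (not either site’s real data or formula): the same ten hypothetical voters produce a lukewarm 10-point average but a flawless binary approval score.

```python
# A toy illustration with invented numbers -- not either site's actual
# formula. Ten hypothetical voters score a movie on a 1-10 scale;
# the binary system counts a score of 6 or above as "liked it".
opinions = [10, 9, 9, 8, 6, 6, 6, 6, 6, 6]

imdb_style = sum(opinions) / len(opinions)                      # 1-10 average
rt_style = 100 * sum(s >= 6 for s in opinions) / len(opinions)  # % positive

print(f"10-point average: {imdb_style:.1f}/10")   # 7.2/10 -- good, not great
print(f"binary approval:  {rt_style:.0f}%")       # 100% -- looks flawless
```

The binary tally throws away everything the voters said except the sign, which is exactly the loss of accuracy the paragraph above describes.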

If you’re just looking to see whether a movie is universally agreed upon, this argument is moot. Toy Story 3 boasts an 8.8 on IMDb and a 99% Tomatometer. Similarly, Tyler Perry’s For Colored Girls has a 4.1 and a 34%. The question is: do you want to know how good a movie is, or do you just want to know if people liked it? The former is where I find moviegoers misguided in choosing Rotten Tomatoes, for reasons I’ll explain shortly.

It also bothers me that Tomatometers are so polarizing. Given the voting guide, it makes sense that a boatload of movies sit in the 90s. But again, the problem is that people compare movies based on these scores, and while you can say a 94% is practically the same as a 93%, that gap isn’t the difference between some people giving it a 9 instead of a 10; it’s the difference between some people not liking the movie at all. And that’s a little scary.
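The same point can be shown with purely hypothetical score distributions (again, invented numbers, not real site data): two movies can both clear 90% on a binary meter while being nowhere near the same quality.

```python
# Two hypothetical score distributions (1-10 scale, 6+ counts as
# positive). Both clear 90% on a binary meter, yet they describe
# very different movies.
barely_liked = [6] * 10                          # ten tepid thumbs-up
mostly_loved = [10, 10, 9, 9, 9, 9, 9, 9, 9, 3]  # loved, with one detractor

def approval(scores):
    """Percentage of voters whose score counts as positive."""
    return 100 * sum(s >= 6 for s in scores) / len(scores)

def average(scores):
    """Plain mean on the 1-10 scale."""
    return sum(scores) / len(scores)

print(approval(barely_liked), average(barely_liked))   # 100.0 6.0
print(approval(mostly_loved), average(mostly_loved))   # 90.0 8.6
```

A binary percentage can’t distinguish ten shrugs from nine raves and one pan; a 1-10 average can, which is why the near-identical Tomatometers above can hide very different movies.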

I was wondering the other day why the best movie of all time on IMDb has only a 9.2. I can’t stand unrealistic expectations, and surely there has to be at least one movie in history that was flawless for two hours. But it makes sense. Like the teacher who never hands out ‘A’s, you have to reserve the perfect score to prevent a slippery slope: well, if that movie got a 10, and this movie’s as good if not better than that movie, then so should this one. And that’s exactly where Rotten Tomatoes ends up; its top 100 of all time are all 100%. Who knows how many hundreds more stand at 100%?

IMDb has only three notable movies that can claim a 9.0 or better. And for those unfamiliar with its scale, a 7.0 is a solid movie, and anything that starts with an 8 is an absolute must-watch. I’ll list the year’s best movies per IMDb alongside their Tomatometers to put Rotten Tomatoes’ wildly inaccurate rankings in perspective.

Inception – 8.9; 87%
Toy Story 3 – 8.7; 99%
How to Train Your Dragon – 8.1; 98%

From these three movies alone, we can already see two clear differences: Rotten Tomatoes’ relatively low regard for Inception, and its near-flawless reception of How to Train Your Dragon, both of which I disagree with. Having seen all three titles, I agree with IMDb’s ordering and wouldn’t have a problem with the first two flip-flopping.

First, Inception. Per Rotten Tomatoes, it’s the 34th-best movie of the year! That’s just ridiculous. IMDb has it tied for fifth all time. You’ve seen it. You tell me which it’s closer to. How to Train Your Dragon, in my opinion, is generously ranked on IMDb as well. It’s on the same level as Despicable Me, which holds a solid 7.6. Yet Rotten Tomatoes has it as the second-best wide-release American movie of the year. Again, if you’ve seen it, you tell me which you’re leaning towards – a good movie or an Oscar contender for Best Picture.

Those are pretty big differences for just two movies – two of the year’s best, no less. Imagine all the differences among older movies; they’re out there. Maybe I’ve been making the wrong argument the whole time. What I should have been saying, quite simply, is that Rotten Tomatoes can’t tell the difference between Oscar-caliber and Tyler Perry-capable, with far too many arguable Tomatometers – something I can’t say about the very consistent IMDb. Try it. Look up any movie you want. All this, and you’d think I’m getting paid by IMDb.

So make the right choice, and go IMDb. (But if not, tell me why you go to Rotten Tomatoes in the Comments.)

BJ

18 Comments »

  1. I agree with Strangecloud; just look at this movie http://www.imdb.com/title/tt5222624/ (similar to One Direction’s “This Is Us” but with 5 Seconds of Summer instead). Not many people rated it, and most 5SOS fans probably rated it a 10 out of 10; that is why it might have gotten such a high rating (9.0/10.0)

    Comment by rd1452002 — December 30, 2015 @ 2:30 am | Reply

  2. When it comes to movie ratings on IMDb and movie ratings on Rotten Tomatoes, three key aspects differentiate the two. First, the quality of the screenwriting, and fast, or nonlinear film editing, is what most film critics use to rate movies on IMDb. On the other hand, Rotten Tomatoes focuses on the emotional and/or charismatic way a film is presented as controlled by the director. Take, for example, Martin Scorsese’s The Wolf of Wall Street. In this film, Terence Winter (screenwriter) provided high quality dialogue and fast moving action to express the success and conflicts of stockbrokers in the 1980s, which is why the film received a high rating on IMDb. However, Martin Scorsese (director) failed to visualize Terence Winter’s script in a more emotional and/or charismatic way that reflects on the true nature, rise, and fall of Wall Street stockbrokers. In other words, this film was primarily meant to entertain, which is why it received a lower rating on Rotten Tomatoes. Now take, for example, Moneyball, starring Brad Pitt. This is a film that received a very high rating on Rotten Tomatoes considering the director presented highly emotional and charismatic action by defining a baseball team’s courage to change their strategy of winning games. If only the screenwriter had focused on using better and more intricate dialogue, it would have received a higher rating on IMDb. A film that defines both an emotional and charismatic style of directing, along with outstanding screenwriting and dialogue use, is Steven Spielberg’s Saving Private Ryan, which is why it received very high ratings on Rotten Tomatoes and on IMDb.

    Comment by Chris Walczyk — April 26, 2015 @ 2:47 pm | Reply

  3. Neither. Metacritic rules all. IMDB voters are all just fanboys and haters, whereas RT just tells you if most critics liked a movie or not, not how much they liked it.* Metacritic uses a weighted average of critics’ ratings. That’s what I want.

    * RT does have an average rating too, but it is in small print.

    Comment by giantslor — March 9, 2015 @ 2:25 pm | Reply

  4. […] to waste my time investigating this so you don’t have to, using a different sort of analysis than here or here or here. My methods: I downloaded the IMDB plaintext data files containing all their […]

    Pingback by Are movies getting longer? A short history. - Extracurricular — December 6, 2014 @ 11:32 am | Reply

  5. […] to waste my time investigating this so you don’t have to, using a different sort of analysis than here or here or here. My methods: I downloaded the IMDB plaintext data files containing all their […]

    Pingback by IMDB vs Rotten Tomatoes (RT)? | Movie Statistics — July 22, 2014 @ 6:34 am | Reply

  6. […] I decided to waste my time investigating this so you don’t have to, using a different sort of analysis than here or here or here. […]

    Pingback by IMDB vs Rotten Tomatoes (RT)? | Movie Statistics — July 21, 2014 @ 3:36 pm | Reply

  7. Thanks for laying out the differences in rating systems used by each site. I have wondered about that off and on for some time. I learned something today.

    To me the Inception rating seems more or less consistent. An 8.9 vs. an 87% is relatively the same. After all, both are on a 1-100 scale, just arrived at by different means.

    I agree that a genuine, heartfelt “score” can do more than a “percentage” to convey the quality of the film as opposed to just “likeability,” but it can be problematic itself. E.g., somebody feels an 8 is off the charts and will never rate that high for fear of wasting it, waiting until hell freezes over, etc. This sounds crazy and you’d expect people to exercise common sense, but common sense ain’t always so common.

    Thanks for an excellent post and enlightening some of us as to the uniquely different systems used by each site.

    Comment by Joshua Myles Gibson — May 11, 2014 @ 3:38 pm | Reply

    IMDb is an American viewpoint; you can get a British perspective from Rotten Tomatoes.

    Comment by Strangecloud — February 14, 2014 @ 4:18 pm | Reply

    • In an earlier comment someone said that the top rated movie
      on IMDB is only a 9.2 because perhaps it’s NOT the best movie of all
      time. I think there’s a much simpler reason; here is my theory:

      Each person will have their own interpretation of what the numbers 1 to 10 mean
      on the IMDB rating system. But I reckon most voters, like me,
      will reserve a rating of 10 for their personal, all time favourites.
      And logic says you can’t have many of
      these, say about 5 or 10 films; certainly no more than 20.
      To have very many more than this wouldn’t make sense;
      if someone were to ask you what your all-time favourite films are and
      you were to reel off a list of 30 films, they’d probably repeat the
      question and hope for a more sensible answer.
      If you had 30 films on your list, would you be able to at least remember
      what they are? I doubt it.

      So, since these “shortlists” are probably so small and selective for each
      individual voter, there’s no surprise that the average rating across all voters
      struggles to get anywhere near a 10.

      I’m sure someone at IMDB would be able to confirm or deny this by looking at the average number of 10s each voter
      awards.

      Comment by John — March 7, 2014 @ 8:25 am | Reply

  9. IMDb seems like it could be more accurate.

    Comment by mace — June 16, 2013 @ 7:07 am | Reply

  10. IMDB seems to combine the critic score and the user score.

    Comment by mace — June 16, 2013 @ 7:06 am | Reply

  11. Interesting origin on RT, although I disagree on the point you make about the functionality of professional critic reviews on that site, which can also be found on IMDb, by the way. As you know, I don’t check RT, but many of my friends do. And I do know that as average moviegoers, they check one thing on that site – its Tomatometer. Which brings me back to my argument about how shallow a movie’s grade is when it’s based solely on whether someone simply liked the movie or not.

    The 9.2 movies you ask of are The Shawshank Redemption and The Godfather. Opinions? Absolutely. But to say that you can’t form a “best of” list from opinions or a consensus of hundreds of thousands of people means you take yourself too seriously. I’m afraid to ask what you view as the best movies of all-time. But then again, who are you to say that?

    Comment by BJ — January 31, 2012 @ 10:47 pm | Reply

  12. The difference between RT and IMDB is thus:

    People who value films from an artistic viewpoint (i.e. not merely as entertainment), as well as those who have experience or an educational background in filmmaking and, therefore, have extensive knowledge on the subject, most likely prefer RT, as it aggregates reviews from professional critics, who also have extensive knowledge on filmmaking and film criticism. These reviews tend to offer insight from an analytical mind, one that has viewed hundreds, if not thousands, of movies over many years and has opened their mind to everything from silent films to the Golden Age to present day, including foreign films, indie productions and, oftentimes, even underground movies. Their reviews use sophisticated language, literate references and filmmaking terminology to dissect a film’s structure and determine its success in various aspects. Furthermore, RT was initially created as a review collection site, before all the other bells and whistles were added. Although RT does also provide user reviews, does anyone really go there for the user reviews? I know I don’t.

    IMDB, on the other hand, was generated as a database of information to be used as a reference for the film industry and moviegoers alike. The reviews found there are not by professional critics; they are written by average moviegoers EVERYWHERE. That means you can read reviews from individuals, just like yourself, in other countries. These reviews can be from anyone who is old enough to write/type. That also means that the reviews use simpler language and, hence, will be more accessible to those who do not have a strong command of the English (or other) language, who have little to no knowledge of filmmaking or who only go to the movies for two hours of escapism entertainment and nothing else: hence, 95% of the general public. These reviews are perfect for the general public, because they are from people just like them: individuals who merely want to be entertained by a movie and, most probably, they will share similar opinions. There is nothing wrong with this, just so long as we realize where these opinions originate and for whom they are intended.

    In other words, if I were an average moviegoer, one who wants nothing more than to sit in a darkened theatre and be entertained by a movie, while eating my popcorn, I might visit IMDB or Amazon to read reviews from average people like myself. If at least 70% of the user reviews are positive, then it’s a good bet that I will enjoy it, too. Considering myself a film buff, with over 25 years’ experience critiquing movies (and many more simply watching them) and over 20 years’ employment within the film industry, I place myself in the other category, since I frequently disagree with the general public. For example, I really disliked “Toy Story 3”, and I was never a fan of “Toy Story”, to begin with. Also, I thought “The Social Network” was a bore and “Inception” was good, at best. Now, some might be tempted to call me a “contrarian” but that is far from the truth and an easy label to place on those who don’t share public opinion. My tastes, however, are more refined than those of the moviegoing masses. Yes, I know I sound like a snob but, with my film background and expertise, I feel that I have earned the right to sound like one. I challenge anyone who has seen fewer than 5000 movies in their lifetime (or who has seen but a mere handful of movies released before 1980 and fewer of those released between 1900-1960) to claim they know more about movies than I do . . .

    . . . which brings me to another difference between RT and IMDB. The critics whose reviews appear on RT can be easily researched to examine their credentials, background, experience and track record, all of which aid in determining their credibility. The individuals whose reviews appear on IMDB, however, can only be researched in terms of their track record; that is, if they use the same identity in all their reviews. But one has no idea of their background in film or, even, experience in movie viewing. For all one knows, their review could be for the first movie they’ve ever seen; though this is doubtful, I believe my point is made.

    Imagine, if you will, two separate websites created to offer criticism of art (e.g. paintings, sculpture, architecture, etc.): one contributed to and viewed, primarily, by professional art critics; the other, with criticism by and for visitors and potential visitors, respectively, of a given art museum or gallery. Which site, do you think, would offer the most educated and insightful analysis and critique of the art at hand; one based on a wealth of knowledge, background information and experience; one that might guide you in developing your own interpretation and opinion? Which site would you visit if you merely wish to wander through a gallery of pretty pictures, knowing that other people were satisfied by this pleasant afternoon stroll and not caring to know the meaning or intention of the artist?

    “I was thinking the other day why the best movie of all-time on IMDb only has a 9.2.” I’m afraid to ask to which movie you refer, but this statement alone undermines your argument, as it is, in itself, an opinion. Who are you to call whatever film you are referring to “the best movie of all-time”? Perhaps it’s a 9.2 because it is NOT the best movie of all-time?

    Comment by Minutemaus@aol.com — January 29, 2012 @ 12:11 pm | Reply

  13. It’s an opinion, not a bias. All critics do it.

    I personally won’t get into what I think the best movies ever are. I find it impossible. But like I said in the post, I’d lean towards fifth-best ever rather than 34th of the year.

    Comment by BJ — August 19, 2011 @ 11:59 pm | Reply

  14. I loved Inception, but you’re biased as fuck to say what deserves what score when there are so many more aggregates such as Metacritic, RT User score, RT top critics average, RT average.

    Do you believe that Inception is the 5th best film of all time as IMDB says?

    Comment by Anonymous — August 18, 2011 @ 2:12 am | Reply

  15. Actually, IMDb and Rotten Tomatoes both have separate ratings for critics and users. For IMDb, the rating on the movie’s page reflects the users, which could be slightly different from its critic rating, which is what’s used if it’s in the Top 250. I’m talking a 0.1 difference higher among users, if that. Also, you need to be a regular voter for your rating to count on IMDb. I think there’s an exact minimum number, but you need to vote on so many movies. For my post, I used only critic ratings.

    Just looking at the numbers on Metacritic is proving itself to be dumb. And as you say, someone assigning an accurate number up to 100 based on words by another person is scientifically impossible to do. On the other end of the spectrum, 100 is way too large a scale to be rating movies. And just like Rotten Tomatoes, seeing scores in the 20s is so off-putting. Saw 3D – 23? Due Date – 51? Inception? Shudder.

    Comment by Bryan — November 29, 2010 @ 10:45 pm | Reply

  16. I’ve been waiting for this post.

    I gotta confess: I use Rotten Tomatoes. But I also use IMDb, and I agree with pretty much everything you say. IMDb’s system provides a more comprehensive estimation of how good a movie is. I have only one problem: any doofus on the internet can chime in and affect the overall score.

    I value the opinion of paid, professional critics (even though I don’t always agree with them), which I think you do as well. I prefer a site that rates movies on the aggregate of reviews by respected critics from respected publications. That’s why I also (along with RT and IMDb) check Metacritic (http://www.metacritic.com/).

    They separate critic reviews from user reviews. Users rate movies on a scale of 1 to 10, while critics are allowed the larger range of 1 to 100. But the one flaw in their system is the 1 to 100 scale: the scores aren’t given by the critics themselves. Instead I think they have people who read these critics’ reviews and “objectively” choose an appropriate score. If Roger Ebert gushes about a movie, for example, Metacritic gives it a 100 (which isn’t rare). If the NY Times loves a film but has one moderate complaint, MC will give it a 95. And so on.

    Don’t ask me how it works. I have no idea how the site differentiates a 93 rating from a 94 rating, as if there’s some formula; it seems pointless. But I find the site reliable, generally. And, as with any other site, you have to be cognizant of how many reviews there are. An average score of 90 from two reviews isn’t more dependable than an average of 85 from 30 reviews.

    For 2010, MC’s top rated movies (with 35+ reviews):

    The Social Network – 95
    Toy Story 3 – 92
    A Prophet – 90
    Winter’s Bone – 90
    Tillman Story – 86
    The Kids Are All Right – 86
    Lebanon – 86
    Inside Job – 86

    You don’t want to know what Inception got (74).

    Comment by Chris Le — November 29, 2010 @ 9:06 pm | Reply

    • In my world, the MC list would read (using the same titles and numbers given):

      Lebanon – 95
      Winter’s Bone – 92
      Inside Job – 90
      The Kids Are All Right – 90
      A Prophet – 86
      Inception – 86
      Tillman Story – 86
      The Social Network – 86
      Toy Story 3 – 74

      Comment by Minutemaus@aol.com — January 29, 2012 @ 12:24 pm | Reply

