How Does Rotten Tomatoes Make Money?

As the reviews come in, the Tomatometer weighs the positive reviews against the negative ones and assigns the film or television show an overall rating of Fresh or Rotten. Rotten Tomatoes is careful in its critic curation: it counts only reviewers who have been regularly publishing movie reviews over the last two years and who are considered active by Rotten Tomatoes' standards. While the pool of accepted reviewers runs into the thousands (see the Tomatometer-approved critics criteria), usually only several hundred are actively reviewing any given film.

And Top Critics are counted with a separate score. So while the Rotten Tomatoes rating system is really just a general consensus, you can see some of the more renowned critics in a space of their own. You also get a more rounded picture because you can see how the audience feels: the Audience Score is the percentage of users who have rated the movie or show positively.
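For illustration, here is a minimal sketch of that arithmetic in Python. It assumes nothing beyond what is described above (the score is the share of positive reviews) plus the 60 percent Fresh cutoff Rotten Tomatoes documents; the function name and example numbers are invented, and this is not the site's actual code.

```python
# Illustrative sketch only -- not Rotten Tomatoes' code.
# Assumes: score = positive reviews as a share of all counted reviews,
# and the documented 60 percent cutoff between Fresh and Rotten.

def tomatometer(positive_reviews: int, total_reviews: int) -> tuple[float, str]:
    """Return the score as a percentage plus a Fresh/Rotten label."""
    if total_reviews == 0:
        raise ValueError("no reviews counted yet")
    score = 100 * positive_reviews / total_reviews
    label = "Fresh" if score >= 60 else "Rotten"
    return round(score, 1), label

# Example: 140 positive reviews out of 200 counted -> (70.0, 'Fresh').
# The Audience Score works the same way, with positive user ratings instead.
print(tomatometer(140, 200))
```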

There is also a section for Verified Ratings, which includes only users who have actually bought tickets. The most interesting finds are the titles that get a green splat from critics and a full bucket of popcorn from the audience. And while reviews are opinion to some extent, the site boasts something called Certified Fresh, which brings a little more objectivity to the critique.

To qualify, a title needs a consistently high Tomatometer score of at least 75 percent and a minimum number of counted reviews, including some from Top Critics. If it meets these requirements, it is automatically flagged for review. When the Rotten Tomatoes staff can determine the movie or show is unlikely to fall below these numbers, it achieves Certified Fresh status.
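As a rough sketch of that flagging step, the snippet below encodes the wide-release criteria in Python. The real decision is a manual staff call rather than code; the function and parameter names here are invented, and the thresholds (a 75 percent score, at least 80 counted reviews, five of them from Top Critics) reflect the requirements Rotten Tomatoes publishes for wide releases.

```python
# Illustrative sketch of the Certified Fresh flagging described above.
# Thresholds follow the published wide-release criteria; names are invented.

def flag_for_certified_fresh(score: float,
                             review_count: int,
                             top_critic_reviews: int) -> bool:
    """Flag a wide release so staff can review it for Certified Fresh status."""
    return score >= 75 and review_count >= 80 and top_critic_reviews >= 5

# A movie at 82 percent with 120 reviews (7 from Top Critics) gets flagged;
# staff then confirm the score is unlikely to slip below the threshold.
print(flag_for_certified_fresh(82.0, 120, 7))  # True
```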

Because the Rotten Tomatoes rating system is so general, the Certified Fresh designation gives the site more objective credibility. So, what's the bottom line? With the movie theater business under constant assault from the rise of streaming services, audiences are less and less likely to venture out to the movies.

If they do happen to make it outside the house, they'll likely be extra picky about how they spend their money. Will they choose an "untested" wildcard movie or one that has general approval from fans and critics?

The answer is self-evident. On its surface, the Rotten Tomatoes rating system and Tomatometer seem to be a legitimate resource for the discerning consumer. Of course, the name more straightforwardly evokes the supposed old-time practice of hurling fruit at unsatisfactory stage performers. In that spirit, the site also offers a second, more Yelp-like rating called the Audience Score, determined by hundreds of thousands of Rotten Tomatoes users who grade movies from 0.5 to 5 stars.

Tim Ryan's maximalist archival project befits the growth of the site. Founded in 1998 by Berkeley postgrads who wanted to rate Jackie Chan movies, Rotten Tomatoes matured into a powerhouse by proving its usefulness to corporate America.

Steve Jobs, an early evangelist, name-checked the site during his keynote presentations. In 2010 it was bought by Flixster, which was bought the following year by corporate overlord Warner Bros. Now, when you browse for showtimes on Fandango, the country's dominant ticket seller, you'll see a Tomatometer beside each release. For studios, the Tomatometer has become a ubiquitous marketing tool, while news coverage of the scores has become its own odd internet subgenre.

As the site's influence grew, it inevitably led to a reckoning. In 2017, producers started blaming low scores for the dismal performance of expensive summer fare like the Baywatch reboot and the latest terrible Pirates of the Caribbean installment. Casual conspiracy theorists, meanwhile, imagined that Rotten Tomatoes intentionally goosed movie scores according to the wishes of studio bosses. While there is no evidence that curators can be bought, the site's Audience Score is definitely corruptible.

In 2018 and 2019, it fell prey to a trolling epidemic, as bigoted male comic book fans appeared to bull-rush the site to take down the audience scores of superhero movies like Black Panther and Captain Marvel, whose stars they deemed unacceptably black or female.

All of a sudden, along with the rest of the internet, Rotten Tomatoes was not to be trusted. The crowds were not wise. Still, there is an authoritative allure in the site's numerical scores. As a Rotten Tomatoes user, I reflexively—and nonsensically—trust a Fresh 60 percent Tomatometer over a Rotten 59 percent.

Yet the numbers themselves, as I found, can be close to meaningless. And it raises the question: What's the best way to choose? Or, more to the point, who do you trust? Rotten Tomatoes' office, which it shares with the larger Fandango staff, has a Silicon Valley feel. Walls you can write on. Walls you can remove. Pods, booths, nooks. The orange of Fandango's logo everywhere.

But this meeting felt less startup and more extremely random J-school seminar. The meeting works like this: curators submit articles that may or may not be reviews, and the room decides whether they are. That's it. Rotten Tomatoes will not consider reported features, tweets, or—to its eternal credit—recaps. Today's submissions include a Guardian piece on 30 Rock's overreliance on celebrity guests, a rambling discussion on a culture podcast, and an Entertainment Weekly piece about the short-lived daytime program The Bonnie Hunt Show.

All were swiftly labeled nonreviews. Robert Fowler, a TV curator, laid out the problem with one submission: "In this case, I think it's kind of a byproduct of a very established television critic maybe being a little bored by his subject matter." Was it a review or not? Nobody could tell. Meetings like this are crucial to maintaining Tomatometer integrity. Few contemplate this more than Jeff Giles. Bearded, wearing a Henley and a flannel shirt when I met him, he exudes steadiness and chill, which is a good quality to have when you read Joker reviews for a living.

A New Hampshire resident who mostly works remotely, Giles has been curating for Rotten Tomatoes for years. Giles, 45, leads the theatrical department. That sounds grander than it is. Of Rotten Tomatoes' four dozen employees, just 12 are curators. Three work on historical reviews. Seven monitor the content fire hose that is peak TV.

That leaves just two, including Giles, working full-time on movies. Giles, who was in Beverly Hills on a regular visit, stared at his laptop while I observed his daily labors. Each curator is responsible for a list of publications. The job: evaluate a review's freshness, then trawl for a good pull quote to slap on the website. One review is meandering and difficult to evaluate. Craving a challenge, I ask Giles for a tougher call. He cites a condescending but lighthearted review he had already logged of the Downton Abbey movie.

No quota for superlatives, no scale for snark. Critics love movies and want them to be good, and we try to be honest when we see one that doesn't measure up. That doesn't mean the audience can't like a movie with a Rotten rating, or hate a movie with a Fresh rating. It's no insult to critics when audience opinion diverges.

In fact, it makes talking and thinking about movies more interesting. It's helpful to get a quick sense of critical consensus, even if it's somewhat imprecise. Many people use Rotten Tomatoes to get a rough idea of whether critics generally liked a film. And that, frankly, is what makes art, entertainment, and the world at large interesting: Not everyone has the same opinion about everything, because people are not exact replicas of one another.

Most critics love arguing about movies, because they often find that disagreeing with their colleagues is what makes the job fun. A good Rotten Tomatoes score indicates strong critical consensus, and that can be good for smaller films in particular. The result, distributors hope, is increased interest and ticket sales when a movie opens in other cities. The Big Sick, for instance, became one of last summer's most beloved films, helped along by its 98 percent rating. But a bad score for a small film can help ensure that it closes quickly, or plays in fewer cities overall.

Its potential box office earnings, in turn, will inevitably take a hit. Conversely, a good Rotten Tomatoes score doesn't necessarily guarantee that a film will be a hit. Still, studios certainly seem to believe the score makes a difference. Last summer, studios blamed Rotten Tomatoes scores (and, by extension, critics) when poorly reviewed movies like Pirates of the Caribbean: Dead Men Tell No Tales, Baywatch, and The Mummy performed below expectations at the box office.

The Emoji Movie, for example, was critically panned, garnering an abysmal 6 percent Rotten Tomatoes score. And the more you think about it, the less surprising it is that plenty of people bought tickets to The Emoji Movie in spite of its bad press: it's an animated movie aimed at children that faced virtually no theatrical competition, and it opened during the summer, when kids are out of school.

Great reviews might have inflated its numbers, but almost universally negative ones didn't seem to hurt it much. The Mummy gave Tom Cruise his biggest global opening ever. If there is a Rotten Tomatoes effect, it seems to only extend to the American market. Plenty of people would like you to believe that the weak link between box office earnings and critical opinion proves that critics are at fault for not liking the film, and that audiences are a better gauge of its quality.

Fans LOVE the movie. Huge positive scores. Baywatch ended up with a very comfortably Rotten 19 percent Tomatometer score, compared to a just barely Fresh 62 percent audience score. We critics are also a rather reserved and nerdy bunch, not regularly armed with venom and knives.

But somehow, I suspect that younger ticket buyers — an all-important demographic — lacked nostalgia for a decades-old lifeguard TV show, and thus weren't so sure about seeing Baywatch in the first place. Likewise, I doubt that a majority of Americans were ever going to be terribly interested in the fifth installment of the Pirates of the Caribbean franchise (which notched a 30 percent Tomatometer score and a 64 percent audience score), especially when they could just watch some other movie.

But with lackluster reviews, the average moviegoer just had no reason to give them a chance. Big studio publicists, however, are paid to convince people to see their films, not to candidly discuss the quality of the films themselves.

Consider, for example, the case of the aforementioned Emoji Movie. I and most other critics hoped the movie would be good, as is the case with all the movies we see.

It screened for press on a Wednesday night at 5 pm, and then the review embargo lifted at 3 pm the next day, mere hours before the first public showtimes. So even setting aside the weak correlation between negative reviews and a low box office, the tight timing meant its first-weekend returns were largely insulated from any harm bad press might have done.

Such close timing can also backfire; critics liked this summer's Captain Underpants, for example, but the film was screened too late for the positive reviews to measurably boost its opening box office. That first-weekend number is important, because if a movie is the top performer at the box office, or if it simply exceeds expectations, as Dunkirk and Wonder Woman did this summer, its success can function as good advertising for the film, which means its second-weekend sales may also be stronger.

And that matters, particularly when it means a movie is outperforming its expectations, because it can actually shift the way industry executives think about what kinds of movies people want to watch. The implication was that Fox believed the movie would be a critical success, and indeed, it was — the movie has a 97 percent Tomatometer score and an 86 percent audience score.


