Hacker News

> 3. RT is based on people's opinions, but news should be fact checked.

Except that RT doesn't like to hear people's opinions when they differ from the mainstream. Unfortunately it's easy to box that up as "trolling" and sweep it under the rug.



If you want to go on a rant, is it too much to ask that you list examples and explain how they apply, rather than assume everyone here is as on top of Internet drama as you might be? Your examples might even make it clear why an RT for news would be even more difficult than the one that exists for movies.


I guess it might be better to repackage this as something that isn't a rant.

RT is a company that is currently owned by Warner Bros. (through a chain of ownership). This alone is enough for someone to raise the possibility that they may not be completely unbiased in how they treat each movie, even without any evidence of such behavior. Conflicts of interest, in my own experience, have never required evidence of any further problem beyond the conflict of interest itself. Since the end result is merely movie ratings, which are a fully subjective experience, it isn't worth the time digging any further into it.

When it comes to a news rating site that will likely be deciding things of a more factual nature (whether news is true or not, or at least somewhat based on true events), I think a potential conflict of interest has larger implications. Whoever owns the site will have a conflict of interest with regard to at least some news, and the ability of that site to be used by other groups like non-profits, NGOs, or even governments will need to be limited wherever the conflict of interest occurs.

I don't think the difference is in the difficulty of making an RT for news, but in the downstream impacts of the conflict of interest that exists between owner and purpose.


While that's true as far as the possibility of a conflict of interest goes, do you have any proof or data showing they've actually exploited that conflict for their own gain? Undermining trust in the authority of the aggregate scores is a huge risk I'm sure they'd much rather avoid in order to extract as much value out of the platform as possible.

I know a bunch of folks recently complained that they were trying to manipulate the scores for Captain Marvel, but that was also amid a huge organized campaign to torpedo the audience scores even before it was released for viewing. So they had a legitimate reason to go in and try to "fix" the scores. Whether the new values are "accurate" is of course up for debate (for philosophical values of "accurate"), but most of the criticisms I've seen have had reasonable explanations behind them.


When avoiding a conflict of interest matters, there is normally no standard requiring evidence of actual impropriety for behavior to be deemed inappropriate. Merely failing to avoid the conflict of interest is enough. Granted, for something like a civil lawsuit or criminal charges this alone isn't enough and evidence is required, but for matters such as professional ethics, not disclosing a conflict of interest is a violation in itself.

Like I said, with movie reviews it does not matter (at least in my opinion). The issue I have is that I don't think such a carefree attitude can be taken when it comes to rating the truthfulness or applicability of news.

As for any examples of actions, I do not have access to the sort of data needed to determine whether any such actions were taken, nor do I have the resources or desire necessary to even begin investigating. Also, especially in relation to recent events that touch on a more political scale, I'd rather not deal with the issue of evidence being trusted based on the extent to which it confirms existing views (a cognitive bias common to humanity that isn't particularly worse in this case, but which I have seen result in flame wars on other forums).


Perfectly reasonable. Thanks


The conflict-of-ownership argument also doesn't work, since Captain Marvel's studio, Disney-Fox, is the biggest competitor to Rotten Tomatoes's AT&T/Warner and Comcast/NBC Universal owners.


Pretty unimportant but just clearing up ownership info. Warner Bros/AT&T are minority owners of Fandango which owns Rotten Tomatoes. Comcast/NBC Universal are majority owners.


I was going off my memory, which has now been shown to be incorrect. It appears to be too late to edit the comment. Apologies for spreading any incorrect information.


Hah, I wasn't blasting you or anything! Sorry if I made it seem that way. I did try to say it was unimportant to the overall convo. If anything it reinforces your argument, since it's two of the big 5 movie studios.


That's a good question. I'm not sure what the other poster had in mind, and I suspect it's not the same, but RT has a bunch of rating problems that would probably become far more egregious for news than they are for movies. (Leaving aside the tricky question of how to translate the formats: is a movie like a news story, or a news source? Does news really have an existing aggregation-worthy source like movie reviews?) RT provides percentage scores for both audience and critic reviews. Those percentages are not averaged scores, but percentage of positive reviews. There's good reason for this in both cases, but it causes a lot of weird side effects.

- For audience reviews, reviewers rate out of 5 stars, and the score is the percentage of reviews at least 3.5 stars.

The goal is to insulate scores from extreme opinions and odd voting patterns. On IMDB, which doesn't do this, a few low scores can destroy a movie's rating, and movies that invite careful analysis are effectively penalized. The Dark Knight ends up rated higher than Schindler's List, but the latter has lots of slightly-imperfect scores with incredibly positive review text.

The basic side effect is the loss of all meaningful score data. That punishes daring movies: 100 4-star reviews equals 100% while 50 3-star and 50 5-star reviews equals 50%, but I would much rather see the second movie. It also hides deep flaws: The Boondock Saints outscores 2001, but a look at the reviews shows that its "rotten" ratings are often 1 and 2-star reviews calling it garbage, while 2001 is full of 3-star reviews like "slow and confusing, but fascinating". Another problem here is that naive binary scores suffer from reviewer skew and vote-bombing campaigns. It's less bad than IMDB because extreme scores are 'softened', but still worse than systems which relate individual scores to the reviewer's other opinions. BeerAdvocate, of all things, takes on this problem with interesting meta-info like "reviewer's average distance from consensus".
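To make the contrast concrete, here's a rough sketch of the two aggregation schemes in Python. The 3.5-star cutoff is RT's stated audience threshold; the movies and numbers just mirror the hypothetical above.

```python
# Two ways to aggregate star ratings (out of 5):
# RT-style "percent positive" vs. a plain IMDb-style average.

def rt_audience_score(ratings):
    """Percent of ratings at or above 3.5 stars (RT's audience method)."""
    positive = sum(1 for r in ratings if r >= 3.5)
    return 100 * positive / len(ratings)

def mean_score(ratings):
    """Plain average, rescaled to a percentage."""
    return 100 * (sum(ratings) / len(ratings)) / 5

safe_movie = [4.0] * 100                # 100 four-star reviews
daring_movie = [3.0] * 50 + [5.0] * 50  # polarizing: half 3s, half 5s

print(rt_audience_score(safe_movie))    # 100.0
print(rt_audience_score(daring_movie))  # 50.0
print(mean_score(safe_movie))           # 80.0 -- identical averages,
print(mean_score(daring_movie))         # 80.0 -- wildly different RT scores
```

Same average, but the percent-positive collapse makes the polarizing movie look half as good.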

- For critical reviews, each review is reduced to a binary opinion and the score is a weighted average of positive review count.

The goal there is to condense varied or absent rating schemes into an aggregation-friendly score and avoid reviewer skew. Metacritic, which doesn't do this, ends up systematically distorted by which outlets review things. Unscored reviews like Ars Technica's (on video games) are skipped, while sites with extremely skewed scores are included by some unspecified weighting system. As a result, Metacritic lists game reviews averaging 75-89 as "generally favorable", but there's a widespread understanding that any major-release product below 85 is terrible. (GameSpot once infamously fired a reviewer for scoring a game 6/10.) It's largely the same problem as Uber and Amazon ratings: does a perfect score mean "exceptional" or "nothing wrong"? RT tries to dodge the problem by saying "good" or "bad", with a weighting to accommodate a reviewer's 'signal strength'.

The side effect here is, again, the total loss of secondary data. RT sometimes gets reviews actively wrong; something like "despite lovely action sequences, this movie is a meandering waste of time" will get parsed as "lovely action sequences". That's probably an automation error, but lots of other reviews can't be clearly declared positive or negative even by humans. And within a categorization, there's a great deal more lost data. Ebert produces 'fresh' for The Avengers because "It provides its fans with exactly what they desire," but he produces 'fresh' for There Will Be Blood by declaring it "A force beyond categories."
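For what it's worth, the critic side can be sketched the same way. The fresh/rotten calls and per-outlet weights below are invented for illustration; RT doesn't publish its actual weighting scheme.

```python
# Critic aggregation as described above: each review is reduced to a
# binary fresh/rotten call, then combined as a weighted positive fraction.
# Weights and classifications here are hypothetical.

def critic_score(reviews):
    """reviews: list of (is_fresh: bool, weight: float) pairs."""
    total = sum(w for _, w in reviews)
    positive = sum(w for fresh, w in reviews if fresh)
    return 100 * positive / total

reviews = [
    (True, 1.0),   # enthusiastic rave            -> fresh
    (True, 1.0),   # lukewarm "worth a watch"     -> also just fresh
    (False, 1.5),  # pan from a higher-weighted outlet -> rotten
]
print(round(critic_score(reviews), 1))  # 57.1
```

Note how the rave and the lukewarm review contribute identically once binarized; that's the secondary-data loss described above.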

- The reason all of this works alright for Rotten Tomatoes is that it's fundamentally answering one question: "will I enjoy watching this?" The Avengers isn't on par with There Will Be Blood, but if you like the description and it's rated fresh, you'll enjoy either one. If you see that critics hate The Boondock Saints and audiences don't, well, you probably know whether you're somebody who likes that kind of cult classic.

But what of news? People already shred Snopes whenever it tries to deal with assessments like "factually correct but misleading" or "incorrect statistic but the real number also supports the same thesis". The entire point of news reviews is to not rely on "will I enjoy this story?", so all the things RT quietly glosses over will become critical problems.


> BeerAdvocate, of all things, takes on this problem with interesting meta-info like "reviewer's average distance from consensus".

There's a lot to be said for intelligently dealing with ratings and their metadata in ways like this.

For example, the Kappa statistic: https://en.wikipedia.org/wiki/Inter-rater_reliability

Presenting crowdsourced ratings as mere averages -- or really any collapse onto a single scalar -- is a big step up from no info at all, but it's hardly the best one can do even with well-known statistical techniques.
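As a toy example, Cohen's kappa for two raters (one of the inter-rater reliability statistics on that page) is only a few lines: it compares observed agreement against the agreement you'd expect by chance given each rater's label frequencies.

```python
# Cohen's kappa for two raters over binary labels ("fresh"/"rotten").
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's per-label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["fresh", "fresh", "rotten", "fresh", "rotten", "fresh"]
b = ["fresh", "rotten", "rotten", "fresh", "rotten", "fresh"]
print(round(cohens_kappa(a, b), 3))  # 0.667
```

A kappa of 1 is perfect agreement, 0 is chance-level, so this pair agrees well beyond chance but far from perfectly.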


Yep, I really wish sites would either give me a histogram of raw scores, provide more thoughtful interpretations, or both.

Histograms are a screamingly obvious way of distinguishing "mediocre" from "some good some bad", which is one of the most common needs with things like Amazon products. But beyond that, there's so much more to be done. You can weight or shift scores by reviewer's average, reviewer's average distance from consensus, or a dozen other things. A one-star review from someone who uses Yelp exclusively to call out bad experiences is relevant, but a one-star review from someone who often gives 4-5 is far more interesting.
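A rough sketch of that baseline-centering idea (the reviewers and data are invented for illustration): express each score as a deviation from the reviewer's own average, so a 1-star from a habitual 5-star reviewer stands out more than one from a serial complainer.

```python
# Center each rating on the reviewer's personal mean. Hypothetical data.

def centered_scores(reviews_by_user):
    """reviews_by_user: {user: [(item, stars), ...]}
    Returns {item: [deviation_from_reviewer_mean, ...]}."""
    deviations = {}
    for user, reviews in reviews_by_user.items():
        avg = sum(stars for _, stars in reviews) / len(reviews)
        for item, stars in reviews:
            deviations.setdefault(item, []).append(stars - avg)
    return deviations

data = {
    "habitual_raver": [("cafe", 5), ("diner", 5), ("bar", 1)],
    "serial_griper":  [("cafe", 1), ("diner", 2), ("bar", 1)],
}
devs = centered_scores(data)
# The raver's 1-star on "bar" sits ~2.7 stars below their own mean,
# while the griper's 1-star is only ~0.3 below theirs.
print(devs["bar"])
```

The same raw score carries very different information depending on who gave it, which is exactly what a plain average throws away.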

Maybe the weirdest thing is that a lot of this is done to catch fake/paid reviewers, but it's not extended to providing clearer info overall. Even the fight against fake reviews would be much more achievable if it shifted from a binary "take down or don't" to a more flexible approach to maximizing review value.


The one that exists for movies adjusted its public-facing numbers the moment the public voiced a contrarian opinion.

What would make a similar site for news safe from such trust abuse?


[flagged]


This is why my hackles are raised the instant somebody starts a "people should be exposed to various viewpoints" discussion. I want to give the good-faith interpretation, which would be, maybe, a healthy exposure to different economic-system philosophies or something. But nowadays I find the people making this argument, or "just promoting the right to free speech," are really exactly the kind of people who casually toss around "sjw" as an accusation.

Like, downvoted because HN is a hive of SJWs? As if. Progressive, sure, but a hive of SJWs, as you undoubtedly mean the word, is a laughable accusation.

Nowadays it seems that when someone wants "multiple viewpoints" to be presented, what they really want is the right to continue to bully people. This would have been an extremely presumptuous thing to say, but, crististm, you've gone and made it obvious for me by trotting out the old sjw accusation.


> Progressive, sure

There are many words I'd use to describe HN, but "progressive" would be way, way down the list.

> Nowadays it seems when someone wants "multiple viewpoints" to be presented what they really want is the right to continue to bully people.

It's pretty much a dead cert. Along with "I'm just asking questions".


Interesting that you consider it bullying if an individual abrasively states different viewpoints from the crowd.

Is it bullying if the crowd uses similar tactics to silence the individual?

Perhaps it depends on whether you agree with the crowd's opinion or not.


I disagree that all instances of "silencing" are bullying.

Take this journalist's recent flight experience: https://mobile.twitter.com/i/events/1110170398589480961

Some young woman was on her own on a flight and was getting creeped on by an older man. The journalist stepped in, flight attendants stepped in, the man was berated and eventually forced to move.

Now those words - the man was berated and forced to move. If a group of schoolkids did that to some random nerdy kid on the playground, and we used those words to describe the situation, that's bullying. In the case of the flight, it obviously isn't.

So, I don't think it's so much about whether I agree with the crowd's opinion, but a greater context.

At an LGBT parade there's a fire and brimstone pastor shouting about how everyone's going to hell. The crowd is shouting back. Is that bullying? I don't really think so. I think it's the pastor that came to do the bullying.


Just to be clear, I should have said swarm. This is not about the whole of HN, and I've been here long enough to see how it adjusted its direction from the top.

I should probably know better than to voice my opinion to a group of strangers, but no, I will not apologize for what I think.

I voiced my opinion and that part of HN crowd that heard me considered that I should shut up. Such is life


You're getting downvoted and told to "shut up" primarily because of your use of the word "sjw", because it's a dumb meaningless slur used to drag anybody with socially progressive values. It's not a criticism, it's just a petty insult.

If you have a genuine problem with the social direction of HN, there are ways to raise those concerns without regurgitating offensive alt-right memes.


Did you mistake Rotten Tomatoes for Russia Today?


I think he's referring to when they deleted reviews that were affected by a cultural phenomenon.



