Review scores versus game quality *spawn

Nesh

Another note: I realize a few people around here hate the Xbox lineup this generation, but I think it's better than the PS lineup. None of that matters though, it's just our personal opinions. The pro reviewers in the aggregate believe that MS has more good 8.5 games than Sony, but Sony has more home-run 9s than MS. Working within that reality is the only honest way to have this discussion. People should probably stop calling Halo, Starfield and Forza "disappointing". If 8.5 is disappointing to you, then why isn't Spider-Man Miles Morales disappointing? Why isn't Horizon FW disappointing because it didn't score 9s from critics? It's not honest. It wasn't all doom and gloom for Sony when all they could put out was remakes and cross-gen titles for 2 years after launch. The brand was powerful enough that they couldn't do any wrong in some people's eyes. MS owned 5 studios in 2017 and now they have 40. As AAA games take 5 years to make, we're only now going to see the fruits of MS's pre-gen acquisitions, not yet the 30 studios that have since been acquired. Be honest about the potential here. It's more than "meaningful" in my opinion.

Xbox fans, myself included, expected too much. 25 million units vs. 50 million PS5s isn't a terrible start given the challenges. Outside of Japan that's probably 25 million to 45 million, plus making up ground on PC. If the 2023 sales ratio of 3:1 continues, then Sony will sell another 75 million units and MS will sell another 25 million. That works out to roughly 120:50 outside Japan, which is on the low end of MS's predictions. Not good, but MS can probably live with it until they get the studios churning out games consistently.
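
If anyone wants that math spelled out, here's a rough back-of-the-envelope sketch; every number in it is just my assumption from above, not an official figure:

```python
# Back-of-the-envelope lifetime projection using the assumptions above
# (installed bases, the 3:1 ratio and the extra 25M Xbox sales are all
# guesses, not official figures).

ps5_base  = 45_000_000   # PS5 sold so far outside Japan (assumed)
xbox_base = 25_000_000   # Series S/X sold so far (assumed)

ratio = 3                                  # assumed ongoing PS5:Xbox sales ratio
xbox_additional = 25_000_000               # assumed remaining Xbox sales
ps5_additional = xbox_additional * ratio   # 75 million at 3:1

print(f"PS5 total:  {ps5_base + ps5_additional:,}")    # 120,000,000
print(f"Xbox total: {xbox_base + xbox_additional:,}")  # 50,000,000
```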
I don't think it is the meta score that makes people think these games are disappointing. There seems to be a discrepancy between the official scores and people's experiences, or at least what they were expecting from these titles.
 
I don't think it is the meta score that makes people think these games are disappointing. There seems to be a discrepancy between the official scores and people's experiences, or at least what they were expecting from these titles.

You mean like The Last of Us II has a user score 25 points lower than the critic score? Review bombing happens all the time. Unless you're telling me that critics are right when they score a Sony game high, but wrong when they score an MS game high... Btw, Starfield's user score is only 13 points lower than its critic score, just like God of War Ragnarok's. Halo Infinite's difference is only 8 points. I guess that means GoWR is overrated compared to Halo Infinite. Please give up the false narrative that Xbox doesn't have good exclusives. They're just not to your tastes and that's ok.
 
They seem to be dogmatically pushing the cloud, play-anywhere-on-any-display philosophy instead of doing what's needed now.
How are they being "dogmatic" by just offering xCloud as an option? Lol. MS lets you play on S, X, cloud, Steam Deck and PC, and somehow they're dogmatic.

Listen, we get it, you don't like the Xbox lineup, but critics don't agree with you. I'm not going to waste any more time trying to teach you that 85% is considered a good score for a game and not "disappointing".

These forums deserve better than your straw man arguments. I'm mainly an Xbox gamer, but I would never come here and claim that TLoU2 was a failure or disappointment just because it scored 25 points less with users than critics. It wouldn't be honest.
 
How are they being "dogmatic" by just offering xCloud as an option? Lol. MS lets you play on S, X, cloud, Steam Deck and PC, and somehow they're dogmatic.

Listen, we get it, you don't like the Xbox lineup, but critics don't agree with you. I'm not going to waste any more time trying to teach you that 85% is considered a good score for a game and not "disappointing".

These forums deserve better than your straw man arguments. I'm mainly an Xbox gamer, but I would never come here and claim that TLoU2 was a failure or disappointment just because it scored 25 points less with users than critics. It wouldn't be honest.
I didn't ask you to teach me, and I don't think you're in a position to teach anyone; we all have our own views. My point was that critic review scores are no longer dependable, and I'll add, especially with Xbox exclusive games. And I say this as someone who only owns and plays the Series X this gen. This eventually impacts whatever software they release, be it for a next-gen console, a Pro model, or no new console at all in 2026.
 
This. Review scores are no longer dependable.
Review scores have never been dependable; they're typically the subjective view of a single person. This used to be really evident from podcasts like IGN's Game of the Year, which gathers 4-5 IGN editors to discuss their favourite games of the year and how the official IGN reviews varied from their own.

It used to be better. Back in the 8-bit days of Crash! and Zzap!64 (for the C64), every review was the product of three reviewers, and each review carried a factual account of things like gameplay mechanics, but also a couple of paragraphs from each of the three reviewers. That was about as good as it ever was, with scores being a composite of the three people's views.
 
Review scores have never been dependable; they're typically the subjective view of a single person. This used to be really evident from podcasts like IGN's Game of the Year, which gathers 4-5 IGN editors to discuss their favourite games of the year and how the official IGN reviews varied from their own.

It used to be better. Back in the 8-bit days of Crash! and Zzap!64 (for the C64), every review was the product of three reviewers, and each review carried a factual account of things like gameplay mechanics, but also a couple of paragraphs from each of the three reviewers. That was about as good as it ever was, with scores being a composite of the three people's views.
Fully agree, makes sense, although I wasn't yet born in the 80s. At the end of the day they're quite subjective, including the highly regarded IGN. I used to find critic reviews somewhat dependable during the PS3/Xbox 360 generation. In the PS2/Xbox gen in the early 2000s it wasn't seen as a failure if a game didn't get an 8 or above out of 10. Some of my favourite games during the PS2/Xbox era had some of the worst critic reviews, like Dark Angel on the PS2/Xbox, or God Hand on the PS2.

I think another problem is that these days a score of 7/10 is seen as a failure, so there's a need for pro console reviewers to position games in such a way that they at least receive 8/10. I also think last-gen Xbox titles got more negative reviews than they deserved (my subjective opinion), and this gen I think some of the things Xbox has published have gotten better critic reviews than they deserved (again, my subjective opinion).
 
After the reviews of Cyberpunk on PS4, I think that they have zero reliability; it's not just opinion.
 
IGN always came across to me as reviews from people who don’t actually play games. I would say the reviews in the old video game magazines were the best. You had a separate score from 3-5 different people, thereby giving you a range of opinions.
 
After the reviews of Cyberpunk on PS4, I think that they have zero reliability; it's not just opinion.
The weird thing about IGN is that they sometimes care about bugs and performance, and sometimes not. The fact that they re-reviewed Cyberpunk on PS4/XBone and gave it a 4 (down from the initial 9 that all versions had) because of performance is insane. Not that the scores are different now; what's insane is that they gave all versions of the game the same score to begin with. And 4 is shockingly bad when you consider a game like Skull Island: Rise of Kong is a 3.

Prey (the Arkane one) was also a 4, until they got bullied into changing it to an 8. I'm sure they will say that it got patched and the bugs they cared about are gone, but that's also true of PS4/XBO Cyberpunk to a large degree. And it's also true of Redfall, which is also an IGN 4. Miles Morales, which had a fairly common crash-to-dashboard bug at launch, still scored a 9 and never had an amended review. Baldur's Gate 3 has a bug that deletes your save on Series consoles, a bug not unlike Prey's that earned its 4, yet it sits at a 10.

And all versions of Castlevania: Symphony of the Night share the same review/score. That's the PS1 version, the Saturn version, the iOS/Android release, the 360 Arcade release, and the PSP version. There are huge differences in some of those ports, in content, performance and the general quality of the packages.
 
It’s not. Those are even better due to a larger sample size.
I think one of his fundamental points was that there was a more detailed, objective review of gameplay mechanics. On the other hand, you're right that a larger, statistically significant sample size is better.
 
After the reviews of Cyberpunk on PS4, I think that they have zero reliability; it's not just opinion.
Cyberpunk has a 57 Metacritic on PS4 and IGN gave it 4/10. The initial Cyberpunk reviews were positive because journalists were not sent console versions.
 
After the reviews of Cyberpunk on PS4, I think that they have zero reliability; it's not just opinion.
Were there good reviews of Cyberpunk on PS4? Because everything I saw when reviews dropped the day before release was of the PC version, as that code had been released a couple of weeks earlier, whereas the PS4/XBO code was only released the day before, and review sites like IGN, GameSpot, and GiantBomb made that really clear.

I did buy Cyberpunk on PS4 to play on PS5, but only after I'd seen PlayStation Access stream that setup for about three hours first, so I knew exactly what I was getting. What I didn't count on was CD Projekt Red making the PS4 version better by dialling back NPC/car counts and graphics, so ironically it got worse on PS5 as it got better on PS4. ¯\_(ツ)_/¯
 
Why are we doing metacritic list wars in this thread?
Because Starfield and Forza Motorsport are very good games and it hurts some people's egos. :)


As far as I know, Cyberpunk got a lot of criticism and bad reviews, and today it is called one of the best games in the world. So Starfield has every chance of being the best game in the universe a little later...

Oh My God... :LOL:
 
It’s not. Those are even better due to a larger sample size.
Aggregator sites are only "better" if you are willing to accept that some sites' aggregation algorithms are not flat means/medians but use unpublished weightings, ergo in practice some sites' scores carry more weight than others. For example, Metacritic has a team of editors who arbitrarily decide, with no transparency, that site A's scores are better than site B's, based purely on the site and not the reviewer or the review. Which sounds like some grade A bullshit in terms of objectivity to me.
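
To make that concrete, here's a minimal sketch of how unpublished weights can pull an aggregate score away from the flat average of the very same reviews; the outlet names and weights below are invented, since Metacritic doesn't publish its real ones:

```python
# Minimal sketch of how unpublished weights can shift an aggregate score
# away from the flat average of the very same reviews.
# Outlet names and weights are invented for illustration only.

reviews = {
    "Outlet A": (90, 1.5),   # (score, hypothetical weight)
    "Outlet B": (85, 1.0),
    "Outlet C": (60, 0.5),
}

flat_mean = sum(score for score, _ in reviews.values()) / len(reviews)
weighted_mean = (
    sum(score * weight for score, weight in reviews.values())
    / sum(weight for _, weight in reviews.values())
)

print(f"Flat mean:     {flat_mean:.1f}")      # 78.3
print(f"Weighted mean: {weighted_mean:.1f}")  # 83.3
```

Same three reviews, two different headline numbers, which is exactly why the editors' choice of weights matters.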

Having been on B3D for a while now, I've observed that there is nobody on this forum with whom I agree on most/many/some games. I may agree with their opinion on one game but not on another. Which really tells you what we all know: other people's views really are no use at all for working out what type of game you will like.
 
Aggregator sites are only "better" if you are willing to accept that some sites' aggregation algorithms are not flat means/medians but use unpublished weightings, ergo in practice some sites' scores carry more weight than others. For example, Metacritic has a team of editors who arbitrarily decide, with no transparency, that site A's scores are better than site B's, based purely on the site and not the reviewer or the review. Which sounds like some grade A bullshit in terms of objectivity to me.
Sure, but you can see the reviews for yourself and read as many as you want to get opinions. If the advantage of reviews from yesteryear was "You had a separate score from 3-5 different people, thereby giving you a range of opinions," now you have 30+ opinions to read instead of just 3-5. You don't have to take the Metascore at face value any more than you needed to take Edge's or Crash!'s or Zzap!64's final score.
 
Sure, but you can see the reviews for yourself and read as many as you want to get opinions. If the advantage of reviews from yesteryear was "You had a separate score from 3-5 different people, thereby giving you a range of opinions," now you have 30+ opinions to read instead of just 3-5. You don't have to take the Metascore at face value any more than you needed to take Edge's or Crash!'s or Zzap!64's final score.
As techuse was referring to sample size, I interpreted that as a comment on the usefulness of a metascore, rather than what I think you're suggesting, which is using aggregator sites simply as an alternative to finding reviews with Google.

I cannot recall seeing reviews where individual reviewers provided different scores, but here is Zzap!64's review of Bubble Bobble. The review team played each game, and the review was a product of that experience, with one set of scores. You will rarely get a cohesive position when multiple people individually assess something subjective, compared to working together and discussing it as a group. The group exists to keep outliers, preference and bias from intruding.

This is obviously a more resource-intensive approach and, given how long some modern games are now, probably isn't practical.
 
Aggregator sites are only "better" if you are willing to accept that some sites' aggregation algorithms are not flat means/medians but use unpublished weightings, ergo in practice some sites' scores carry more weight than others. For example, Metacritic has a team of editors who arbitrarily decide, with no transparency, that site A's scores are better than site B's, based purely on the site and not the reviewer or the review. Which sounds like some grade A bullshit in terms of objectivity to me.

Having been on B3D for a while now, I've observed that there is nobody on this forum with whom I agree on most/many/some games. I may agree with their opinion on one game but not on another. Which really tells you what we all know: other people's views really are no use at all for working out what type of game you will like.
I wasn’t aware there was a weighting bias. I thought it was just an average.
 