Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I'm not sure why you're bringing up PC even now when I am speaking in a vacuum about what the consoles can do. And I don't know what market competition has to do with what I'm talking about when I am referring to what developers can bring out of the platform.
Your main issue with the article was that it called the consoles 'weak'

And I evidenced that, compared to the other options out there for consumers, they are weak.
Your premise is wrong, as the CPUs in the current consoles are much better than the Jaguar CPUs
Those CPUs are still old, and in current games they get battered by a £100 Intel quad core.

And my comment about them using old mid-range PC GPUs is factually correct.
I wasn't just referring to graphics either, but to what developers will be able to do at 60fps when fully dedicating their time to the new machines. Games at 60fps with scale, scope, and detail all at the same time were rare last gen, and now they don't have to be, even if it still requires compromises.
And they'll be rare this gen too.
Your obsession with PC vs console is strange.
You are welcome to evidence this obsession, as this instance represents an extremely rare occasion where I've brought PC up.

What's happened is simply this: you posted an article claiming the consoles are weak, which you didn't like or agree with; I agreed with that article and provided examples as to why, and you didn't like it.
 
Davis.anthony has been arguing PS5 vs PC with me, in favour of the PS5. I doubt he's in for some classic, intended platform warring.
I honestly think he's a jack of all trades.
 
They are weak; they're comparable to four-year-old mid-range PC GPUs, and we are already seeing that in games in relation to what you get on PC.

A £400 RTX 3060 Ti alone is ~30% faster in raster and, depending on the game, up to twice as fast in ray tracing.
They could have been as powerful as a 4090 and the first two years would still have released titles that looked like this. This is the first year ever where cross-gen games on consoles were really a thing.

The expected console experience is new consoles and, therefore, an immediate generational boost in graphics. That didn't happen this time; it was largely older technologies designed for older machines scaled up to the next generation.

The power envelope is not the heart of the problem; the issue is that there was no discernible difference for users, because the power was spent drawing more pixels or at a higher frame rate.

I disagree with bringing PC into this argument. It's not at the heart of the problem for the console experience this generation. The article is disingenuous, and used it as a setup to route readers to their "build PCs" article, as noted in its final paragraph.

All they did was name drop a ton of jargon that means shit all to an average reader.

Very definition of garbage.
 
I think it's slower but it's still coming, the big jump over last gen. We got a glimpse of it with Fortnite: graphics are vastly improved in geometry detail and lighting while keeping the 60fps target. It's a good sign of things to come.
Not to mention the massive gains in loading times, which are part of the overall experience.
 
But isn't the context of this discussion based on an article lamenting the issue of having to move off the 4k60 target to a lower resolution (upscaled) and 30 fps?

Last-generation consoles only had to deal with a pseudo-target transition of 720p30/1080p30 to 1080p30. If the pseudo-target expectation for this gen moves from 1080p30 to 4k60, then that alone will eat a lot of the hardware gains. Realistically, if they relax those targets there is simply a lot more headroom; we're talking about something akin to doubling the hardware resources available per frame, if not more in some aspects.
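To put that "eats a lot of the hardware gains" point in rough numbers, here's an illustrative back-of-envelope pixel-throughput comparison (my own sketch, not from the thread; it ignores upscaling and the fact that per-pixel cost also grows):

```python
# Raw pixel throughput = width * height * fps.
# Comparing the last-gen pseudo target (1080p30) against a native 4K60 expectation.

def pixel_rate(width, height, fps):
    return width * height * fps

last_gen_target = pixel_rate(1920, 1080, 30)  # 1080p30 baseline
this_gen_target = pixel_rate(3840, 2160, 60)  # native 4K60 expectation

print(this_gen_target / last_gen_target)  # -> 8.0
```

So meeting native 4K60 demands roughly 8x the pixel throughput of 1080p30 before a single asset gets more detailed, which is why relaxing the target frees up so much headroom per frame.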

Having 4k60 be common now is possibly also just a by-product of still largely being cross-gen. Scalability in terms of fidelity/scope is not as easy as just adding raw frames and resolution.

There is also the general issue that perceived fidelity/scope has diminishing returns as time progresses, and we're definitely further into the curve than last gen. As such, my feeling is that if people really want that next jump along those lines, they're going to have to accept moving off 4k60. The UE5 1440p30 "guideline/recommendation" isn't without its merits, basically.

Another thing: aside from the move to SSD storage, I don't know if anything in this generation is really "exotic" or "revolutionary" compared to last gen, relative to previous gen updates. Of course it's faster, but it's just faster, more or less along the same lines. At the risk of starting a multifaceted IHV debate: had, say, the GPU incorporated a more robust/stronger RT element or ML elements, I could see more "creative" ways for developers to make more out of seemingly less, due to the larger departure from the previous platforms. Ultimately, I guess we'll have to see how the SSD element can truly be leveraged.
 
But isn't the context of this discussion based on an article lamenting the issue of having to move off the 4k60 target to a lower resolution (upscaled) and 30 fps?
Nah, because the console market has not changed in this respect. If you want to criticize something, anything really, you need to criticize it for what it is trying to be, not what you want it to be. They are first and foremost consoles; six years from now, the latest PS5 and XSX will still have exactly the same hardware from 2020.

Lamenting the fact that consoles are aging is perhaps the dumbest article one could write; they are still selling PS4s, for crying out loud. This article should never have been written, and the console market does not care about anything written in it. If the goal were ensuring every single game ran at 4K60 for the next 10 years, you'd be forced to constantly upgrade your system on a bi-annual basis, and consoles don't do that. A mid-gen refresh may be the closest you will ever see to it, and we are not likely to see that again.

Another thing: aside from the move to SSD storage, I don't know if anything in this generation is really "exotic" or "revolutionary" compared to last gen, relative to previous gen updates. Of course it's faster, but it's just faster, more or less along the same lines. At the risk of starting a multifaceted IHV debate: had, say, the GPU incorporated a more robust/stronger RT element or ML elements, I could see more "creative" ways for developers to make more out of seemingly less, due to the larger departure from the previous platforms. Ultimately, I guess we'll have to see how the SSD element can truly be leveraged.
There's a lot of work to make a game fully leverage the SSD. I suspect it's as painful to plumb into engines as trying to get Ray Tracing plumbed into everything. As soon as you go the hard route of designing games around super fast IO and super high speed streaming, you're cutting off a big portion of PC users and older consoles. I don't think they are ready to take that plunge yet, so a large portion of games are just leveraging the IO as faster loading.

If they ever get to the point where they feel, yeah, let's do this, asset sizes could balloon significantly, you would let the engine handle LOD, and games should see a generational leap in texture quality, geometry, level and asset differentiation (no longer one model per enemy type, but perhaps hundreds!), encounter design, etc. Those are all things that very few titles have. I mean, I could make the same argument for RT: you have to cut off the entire old T&L pipeline to make a game look correct with all-RT-based lighting. 4A Games came somewhat close with their Enhanced Edition of Metro, but it was a full retrofit, not a ground-up build where they designed entire levels with RTGI in mind.
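As a rough illustration of why "designing games around super fast IO" changes the math, here's a sketch of how much data can be streamed in per 60fps frame. The bandwidth figures are my own illustrative assumptions (~100 MB/s for a last-gen HDD, ~5500 MB/s raw for a current-gen NVMe SSD), not measured numbers:

```python
# Data available to stream per frame at a given fps:
# bandwidth (MB/s) divided by frames per second.

def mb_per_frame(bandwidth_mb_s, fps=60):
    return bandwidth_mb_s / fps

hdd = mb_per_frame(100)    # assumed last-gen hard drive
ssd = mb_per_frame(5500)   # assumed current-gen NVMe SSD, raw

print(f"HDD: {hdd:.2f} MB/frame, SSD: {ssd:.2f} MB/frame")
# -> HDD: 1.67 MB/frame, SSD: 91.67 MB/frame
```

Under these assumptions an engine that trusts the SSD can pull in well over an order of magnitude more asset data per frame, which is exactly what ballooning asset sizes and engine-managed LOD would depend on.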
 
I think it's slower but it's still coming, the big jump over last gen. We got a glimpse of it with Fortnite: graphics are vastly improved in geometry detail and lighting while keeping the 60fps target. It's a good sign of things to come.
Not to mention the massive gains in loading times, which are part of the overall experience.
Sorry, but Fortnite's RT, besides being top-notch technically, still looks like a PS3 game ;d
 
Interesting? Like a selective comparison to favour an argument? I banged in two GPUs that have similar part numbers and that I know are somewhat related! 🤣 I don't know these GPUs.

I had invited other, more knowledgeable people to make the comparison and answer my question (and the wider question of what factors may be influencing results) for me. As no one stepped up to the plate, I quickly did a search for one example. I don't even know if those numbers are accurate on that website.

Anyone wanting to correct me and present better data analysis and a fairer comparison to reach an answer is welcome. Indeed, someone who knows these GPUs should! By all means make a fairer comparison using other GPUs. Seeing as I did such a shit job, why don't you spend less time pointing out the 1070 is an 'interesting' choice, pick a few GPU comparisons, and make us a little table of theoretical fill-rate data and glyph benchmark deltas?

We still have people positing theories comparing XBSX to PS5 without even considering the other numerous references available. Why use science when just imagining things is so much quicker and easier? ¯\_(ツ)_/¯

* rant caveat - this assumes your choice of 'interesting' was accurate to intention. It's possible you meant something like "unfortunate" or "well, I wouldn't have picked the 1070" or similar.
My apologies for not being clearer; I meant 'interesting' in the most literal sense of the word. I wasn't insinuating that you picked that card to somehow misrepresent the benchmark. :) Interesting in that the 1070 has a unique architectural quirk that is rarely mentioned or thought about nowadays, which might actually be relevant here and could be clearly reflected in this particular benchmark. Edit: To be even clearer, I was genuinely excited by the choice and the 1070's inclusion for the above reason.
 
I think it's slower but it's still coming, the big jump over last gen. We got a glimpse of it with Fortnite: graphics are vastly improved in geometry detail and lighting while keeping the 60fps target. It's a good sign of things to come.
Not to mention the massive gains in loading times, which are part of the overall experience.
I think Fortnite with UE5.1 is an example of the tech the current gen is capable of using, which will allow for what I'm talking about. After all, Lumen and Nanite are limited by a baseline of performance, which is shown by the 120fps mode being off the table for PS5 and Series X where that tech is concerned. But at 60fps, impressive things are still very possible.

But I'd use Horizon Forbidden West as a better example. After a few patches, the 60fps mode has made the 30fps mode almost functionally pointless, and it is a world away from what the PS4 version is doing at 30.

Other games created from scratch will show what is possible at 60fps with great image quality for this generation. Like I was saying earlier, devs are a lot less shackled in what they can do in a 16ms budget this gen than ever before. And that will show in the games that are made once the cross-gen period ends, I think. Although I don't doubt 30fps games will show up, it's much more of a design-choice consideration than previously.


But isn't the context of this discussion based on an article lamenting the issue of having to move off the 4k60 target to a lower resolution (upscaled) and 30 fps?
That is the false context I was complaining about, and which I thought was really nonsense, because the article is not just noting res or fps reductions in two games in a vacuum.

It's peddling a stupid narrative, based on those two games alone being 30fps, that the consoles are already too weak to handle games that will come out in the future, and that "the promises (which ones?) of the console manufacturers have not been kept".

When in reality the consoles will handle the games that are created for them just fine, and being the baseline machines for those games will by default let developers optimize for whatever needs they want, as consoles always have done for games made for them.

Targets like 120/60/30, and how the game runs in general, are purely developer choices in an industry dominated by consoles being the lead platform.

We have seen open worlds far more impressive than Gotham Knights running at 60fps on PS5 and Series X. Hell, the previous Batman game on the base PS4 runs and looks better than it. And A Plague Tale possibly could have run at 60 on these machines, if they had wanted to make a mode for it. But the article is more interested in "haha, 30fps lol, consoles weak so fast", and that's what I was objecting to.
 
Have you tried to play a random PS3 game (running at 60fps) lately?
Besides fighting games, Call of Duty, and a few action-combat games like Dante's Inferno and the Ninja Gaiden games, I don't recall any 60fps games on PS3 at all. Time makes one forget how weird that generation really was. I think there was literally only one arcade racer at 60, Burnout Paradise.
 
Sorry, but I strongly disagree ;d I played it, and yes, when you're looking at the RT reflections or comparing to the old version you can admire UE5, but it still looks simple/bad.

You're confusing art direction with technology. Silent Hill 2 will use the same technology and will look better from your point of view because of its art direction.
 