What do you prefer for games: Framerates and Resolutions? [2020]

What would you prioritize?


No one knows that; in fact, going by official specs, the only thing one can say is that the performance delta is at least 17%. The PS5's GPU performance is constrained by a power envelope shared with the CPU; the XSX GPU's is not.
And that's just going by the raw performance metric. But we do know that MS has designed its system to exploit a whole host of performance tools/hacks: VRS, mesh shaders, stateless rendering (I'm currently reading some very interesting patents regarding GPU command buffers and GPU-driven primitive culling), and upscaling via DirectML. Sony has confirmed none of those features, and yet they are core to the expected efficiency gains of the RDNA 2 architecture. VRS alone can yield a 10% performance gain on some titles, and Sony has been dodging DF's questions about it for months.
This isn't a reasonable point to make through comparison, however. Each of these features could have a varying effect on performance, and they aren't standard enough across platforms for the type of discussion we're having. Not to mention the complete lack of data surrounding said features, and whether they'd even be used in more than 5-10% of title releases.
 
This is not the point. Spencer was running his mouth about how we have now reached the limit of photorealism. We now know that this is BS. We are not even close to photorealism without full-scene ray tracing. That marble demo, as flawed as it is, shows us the full possibility of advanced hardware ray tracing and is a clear step up from the current gen. These next-gen machines are simply not strong enough to provide it. While I understand the technical and economic reasons why this is so, there is no need to push the BS that this is as far as we can actually go.

I don’t think the demo is “flawed”. I agree that it looks really good. But it is definitely limited, and that’s my point. I’d love to have full RT in complex games but that’s unarguably a bit far off for now. Especially on consoles!
 
But while very cool, VRR is still a feature that the vast majority of us do not and will not have on our TVs for years. I sure as hell will not upgrade my 4K/HDR TV any time soon.

Fair one. But I think I edited my post a little too late for your good self to see: I referenced God of War which ran at 1080p with an unlocked framerate. VRR may well have made that more pleasing to the eye, but it still looked better than the 4K mode on my old 1080p TV.

So if there were to be a situation in which the XSX ran at 4K60, yet the PS5 couldn't hold the same framerate, I would be very surprised if there wasn't at least an option to run at an unlocked framerate.
 
Fair one. But I think I edited my post a little too late for your good self to see: I referenced God of War which ran at 1080p with an unlocked framerate. VRR may well have made that more pleasing to the eye, but it still looked better than the 4K mode on my old 1080p TV.

So if there were to be a situation in which the XSX ran at 4K60, yet the PS5 couldn't hold the same framerate, I would be very surprised if there wasn't at least an option to run at an unlocked framerate.
My point isn't worth debating in the grand scheme of things. If it does happen, it will be in a very low percentage of cases; nothing of any real value. I just wanted to point out that the rule isn't absolute, and that there are likely grey/edge cases that titles fall into, for whatever reason, in which such a scenario could occur. I attempted to showcase how such a thing could happen using benchmarks to make my point stronger, and failed; in retrospect, I think it would have been easier to just use the 20% differential.

My posts weren't meant to persuade members to believe any more than this, but I need to work on the precision of my messaging if people are getting their wires crossed.
 
This isn't a reasonable point to make through comparison, however. Each of these features could have a varying effect on performance, and they aren't standard enough across platforms for the type of discussion we're having. Not to mention the complete lack of data surrounding said features, and whether they'd even be used in more than 5-10% of title releases.

VRS is already widely available on NVIDIA's Turing platform and is expected to replace the aggressive dynamic resolution that has plagued consoles this gen. Sony is the odd man out if it does not have it on the PS5.

AI is the next frontier. DLSS is what truly makes the RTX cards shine, and expect NVIDIA to crank it up to another level with Ampere. Having some type of acceleration for ML workloads is a must if consoles want to compete with what will become mid-range cards over the next two years.
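For anyone curious what VRS looks like from the developer's side, here's a minimal per-draw sketch against the D3D12 API (Tier 1 VRS, the level Turing exposes; the draw recording itself is a stand-in):

```cpp
// Minimal per-draw VRS sketch (D3D12 Tier 1). Assumes the adapter
// reports D3D12_VARIABLE_SHADING_RATE_TIER_1 or better and that
// cmdList is an ID3D12GraphicsCommandList5.
#include <d3d12.h>

void RecordDrawsWithVrs(ID3D12GraphicsCommandList5* cmdList)
{
    // Coarse rate: one pixel-shader invocation covers a 2x2 block.
    // Geometry is still rasterised at full resolution; only the
    // shading work drops, which is the whole trick.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... record draws that tolerate coarse shading here
    //     (sky, fast-moving or peripheral surfaces) ...

    // Back to full rate for detail-critical draws (UI, hero assets).
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    // ... record the remaining draws ...
}
```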
 
What I want to know is whether the 10GB of GDDR6 at 560GB/s might be a limiting factor if graphics memory use hits that ceiling and the GPU has to fall back on the 6GB portion at 336GB/s. Yes, developers will likely work around this, but could it be a limiting factor if it ever becomes an issue?

We know the PS5 will have a constant 448GB/s of bandwidth across its full 16GB of GDDR6 at its disposal, so it's likely developers will use more than 10GB when designing games for the PS5.
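A rough back-of-the-envelope shows why the split matters (a sketch, under the simplifying assumption that traffic to the two pools serialises on the shared bus, making the blend a harmonic rather than arithmetic mean):

```cpp
// Blended effective bandwidth for the XSX's split memory pool,
// as a function of how much traffic lands in the slow 6GB region.
// Pure illustration; real contention behaviour is more complex.
#include <cstdio>

int main()
{
    const double fastBW = 560.0; // GB/s, 10GB GPU-optimal pool
    const double slowBW = 336.0; // GB/s, 6GB standard pool

    for (double slowFrac : {0.0, 0.1, 0.25, 0.5})
    {
        double fastFrac = 1.0 - slowFrac;
        double blended  = 1.0 / (fastFrac / fastBW + slowFrac / slowBW);
        std::printf("%2.0f%% slow-pool traffic -> ~%3.0f GB/s effective\n",
                    slowFrac * 100.0, blended);
    }
    return 0;
}
```

Under that (admittedly crude) model, even a quarter of the traffic hitting the slow pool pulls the effective figure down to roughly 480GB/s, and at half it drops below the PS5's flat 448GB/s; presumably that's why developers are expected to keep bandwidth-hungry resources in the 10GB pool.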
 
VRS is already widely available through the turing platform and is expected to replace the aggressive dynamic resolution that has plagued consoles this gen. Sony is the odd man out if it does not have it on the PS5.

I'm pretty sure Sony has already confirmed that VRS is on the PS5 and that it's a basic feature of the RDNA 2 design.
 
What I want to know is whether the 10GB of GDDR6 at 560GB/s might be a limiting factor if graphics memory use hits that ceiling and the GPU has to fall back on the 6GB portion at 336GB/s. Yes, developers will likely work around this, but could it be a limiting factor if it ever becomes an issue?

We know the PS5 will have a constant 448GB/s of bandwidth across its full 16GB of GDDR6 at its disposal, so it's likely developers will use more than 10GB when designing games for the PS5.
With sampler feedback, you don't actually need that much more memory; it acts as a memory capacity multiplier. I do have doubts about whether the 560GB/s of bandwidth is enough, though.
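To put toy numbers on the "capacity multiplier" idea (all figures assumed for illustration; the multiplier depends entirely on how much of a texture is actually sampled each frame):

```cpp
// Toy illustration of sampler-feedback-driven streaming savings:
// a 16K x 16K texture at 1 byte/texel, full mip chain resident,
// versus keeping only the tiles feedback says were sampled.
#include <cstdio>

int main()
{
    const double mip0MB      = 16384.0 * 16384.0 / (1024.0 * 1024.0); // 256 MB
    const double fullChainMB = mip0MB * 4.0 / 3.0;  // lower mips add ~1/3
    const double sampledFrac = 0.4;                 // assumed fraction of tiles touched

    const double residentMB  = fullChainMB * sampledFrac;
    std::printf("everything resident: %.0f MB\n", fullChainMB);
    std::printf("feedback-driven:     %.0f MB (~%.1fx multiplier)\n",
                residentMB, fullChainMB / residentMB);
    return 0;
}
```

With those assumed numbers the multiplier comes out around 2.5x, which is in the ballpark MS has quoted for Sampler Feedback Streaming.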
 
VRS is already widely available on NVIDIA's Turing platform and is expected to replace the aggressive dynamic resolution that has plagued consoles this gen. Sony is the odd man out if it does not have it on the PS5.

AI is the next frontier. DLSS is what truly makes the RTX cards shine, and expect NVIDIA to crank it up to another level with Ampere. Having some type of acceleration for ML workloads is a must if consoles want to compete with what will become mid-range cards over the next two years.
You can only compare those features on a per-title basis. When they come up in multiplatform titles you can get an idea, but I don't expect the savings to be the same from title to title.
So in a general sense, sure, a lot of what you've written can be used, but you can't use it to gauge general GPU performance. Those features are very title-specific.
 
...is expected to replace the aggressive dynamic resolution that has plagued consoles this gen.
It's not a substitute. Dynamic res reduces all drawing demands by the scale reduction; with lots of transparency effects, for example (a primary reason to downscale is to reduce bandwidth demands), dynamic res cuts the number of transparent pixels to be drawn. VRS only reduces the shading of some surfaces: it preserves geometry resolution but lowers shading resolution, introducing 'blur' on surfaces, and it cannot pull overall scene rendering demands down significantly. VRS is an optimisation for normal rendering, reducing detail where it isn't needed, not a tool for adapting to fluctuating framerates.

The best tools there will be dynamic res and (ML?) upscaling; good upscaling will be very hard to notice in the brief moment of a frame-resolution change. That, or VRR, although that's unlikely to be targeted for games expected to be played on TVs. I guess with lots of PC ports, cross-platform titles might include it as a bonus for console gamers with VRR-capable sets.
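For contrast with VRS, this is roughly the shape of a dynamic-res controller; a sketch with arbitrary budget and gain values, not any engine's actual logic:

```cpp
// Sketch of a dynamic-resolution controller: nudge the render
// scale based on how last frame's GPU time compares to budget.
// All constants are illustrative.
#include <algorithm>

struct DynResState { double scale = 1.0; }; // 1.0 = native resolution

void UpdateDynRes(DynResState& s, double gpuFrameMs)
{
    const double budgetMs = 16.6; // 60fps target
    const double gain     = 0.05; // damping so the scale doesn't oscillate

    // Over budget -> shrink; under budget -> creep back toward native.
    double error = (budgetMs - gpuFrameMs) / budgetMs;
    s.scale = std::clamp(s.scale * (1.0 + gain * error), 0.5, 1.0);
}
```

Pair that with a decent upscaler and, as above, the brief dips in internal resolution become very hard to spot.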
 
That was an Activision developer, or was it EA? Forgot, sorry.
Had a look.

From the article I found, Activision.

https://segmentnext.com/2020/04/09/ps5-has-vrs/

There's no confirmation from Cerny, but he does say that the GE has "easy optimisation for geometry culling", which could mean VRS, or I suppose a custom VRS where developers control what is shown or not, i.e. exactly like VRS.

PlayStation 5 houses “a custom AMD GPU based on their RDNA 2 technology,” according to system architect Mark Cerny. GE, part of RDNA 2, offers developers unparalleled control over triangles and other primitives, and easy optimisation for geometry culling. Hence, as the lead artist from Activision pointed out above, both are necessary parts of the same equation.
 
Had a look.

From the article I found, Activision.

https://segmentnext.com/2020/04/09/ps5-has-vrs/

There's no confirmation from Cerny, but he does say that the GE has "easy optimisation for geometry culling", which could mean VRS, or I suppose a custom VRS where developers control what is shown or not, i.e. exactly like VRS.
GE is more in line with mesh shaders, which alter the front end of the unified shader pipeline.
VRS determines the amount of shading for a designated area. You can go extremely low, doing a single calculation and spreading the value over a wide block of pixels, or you can go short and narrow, performing a single calculation over a thin strip.

Without VRS, you perform a calculation for every single pixel.
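The arithmetic is straightforward; here's the pixel-shader invocation count for one 4K frame at the standard VRS rates (ignoring quad granularity and helper lanes):

```cpp
// Pixel-shader invocations per 3840x2160 frame at various VRS
// coarse rates (block width x block height).
#include <cstdio>

int main()
{
    const long long w = 3840, h = 2160;
    const int rates[][2] = { {1, 1}, {1, 2}, {2, 2}, {2, 4}, {4, 4} };

    for (const auto& r : rates)
        std::printf("%dx%d rate: %lld invocations\n",
                    r[0], r[1], (w / r[0]) * (h / r[1]));
    return 0;
}
```

So a 2x2 rate cuts shading work on the affected surfaces to a quarter, and 4x4 to a sixteenth, while the triangle edges themselves stay at full resolution.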
 
Video might be easier actually:

Video on GE or Mesh Shaders:

The reason you want to use GE/mesh shaders is to make sure you're only drawing the triangles you need, in the amounts you need.
Once you've done all that work up front, the shading of those triangles/pixels occurs; that's where VRS steps in.
One of those tweets talks about how VRS is useless if you don't run the GE first, which makes sense: otherwise you're still shading things that didn't need to be shaded.

So cull and transform as required with GE/Mesh Shaders.
Then VRS.

If you don't use VRS, you can lean on things like dynamic resolution or reconstruction instead.
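As pseudocode, the ordering being described (all function names here are hypothetical stand-ins for whole engine stages, not real API calls):

```cpp
// Hypothetical frame flow showing where each technique slots in.
void CullAndTransform() { /* GE / mesh shaders: drop unseen triangles */ }
void ShadeWithVrs()     { /* coarse-shade surfaces that tolerate it  */ }
void Reconstruct()      { /* dynamic res / reconstruction fallback   */ }

void RenderFrame(bool hasVrs)
{
    CullAndTransform();      // first: don't process what will never be seen
    if (hasVrs)
        ShadeWithVrs();      // then: shade the survivors more cheaply
    else
        Reconstruct();       // no VRS? lean on resolution instead
}
```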
 
GE is more in line with mesh shaders, which alter the front end of the unified shader pipeline.
VRS determines the amount of shading for a designated area. You can go extremely low, doing a single calculation and spreading the value over a wide block of pixels, or you can go short and narrow, performing a single calculation over a thin strip.

Without VRS, you perform a calculation for every single pixel.

It still doesn't prove that the PS5 won't have VRS, or that its culling method will be better or worse than the Xbox version's.

From the article and Twitter:

Variable Rate Shading is nice for saving cycles, but VRS’ optimization capability doesn’t hold a handle to the Geometry Engine’s capabilities. VRS without GE means you’re still processing vertices you can/should eliminate in earlier stages to begin with. More free compute/memory.

 
It still doesn't prove that the PS5 won't have VRS, or that its culling method will be better or worse than the Xbox version's.

From the article and Twitter:



No, I didn't say that the PS5 doesn't have VRS. I'm just showing you the difference between what VRS is and what the GE is, as per that tweet.
VRS doesn't cull; mesh shaders and the GE do.

And if there's any consolation, most modern engines do culling through compute shaders, which would also skip GE/mesh shaders. But they'll use them in the event that they're faster, and there's certainly more incentive to use mesh shaders now given their flexibility and power.
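For reference, the core test such a compute pass performs is the same one you'd write on the CPU; a sketch of a sphere-vs-frustum cull (the six planes are assumed normalised, inward-facing, and extracted elsewhere from the view-projection matrix):

```cpp
// Sphere-vs-frustum visibility test, the heart of a GPU compute
// culling pass, shown CPU-side for clarity.
struct Plane  { float nx, ny, nz, d; };  // inside when dot(n, p) + d >= 0
struct Sphere { float x, y, z, radius; };

bool IsVisible(const Sphere& s, const Plane (&frustum)[6])
{
    for (const Plane& p : frustum)
    {
        float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
        if (dist < -s.radius)   // entirely behind one plane: culled
            return false;
    }
    return true;                // inside or intersecting the frustum
}
```

A compute-shader version runs this per object over a big buffer of bounds and writes the survivors into an indirect argument buffer, which is why it can bypass the GE/mesh-shader path entirely.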
 