What do you prefer for games: Framerates and Resolutions? [2020]

What would you prioritize?


The two are completely different, so stop making definitive statements like "GE is exactly like VRS".
 
A video might actually make it easier:

Video on GE / Mesh Shaders: [embedded video]

The reason you want to use GE/mesh shaders is to make sure you're only drawing the triangles you need, in the amounts you need.
Once you've done all that work up front, the shading of those triangles/pixels occurs. That's where VRS steps in.
I know how VRS works and have seen all the videos.
One of those tweets talks about how VRS is useless if you don't run the GE pass first, which makes sense: you're still shading things that don't need to be shaded.

So cull and transform as required with GE/Mesh Shaders.
Then VRS.

If you don't use VRS, you can use things like dynamic resolution or reconstruction instead.

Yep, which approach culls triangles/pixels better is still to be proven, I'd guess. Bring on next-gen.
 
Yep, which approach culls triangles/pixels better is still to be proven, I'd guess. Bring on next-gen.
I'm not understanding why culling is so important. I understand the need to cull as much as possible if you're pushing higher and higher asset detail, and this type of thing is incredibly useful for a unified shader pipeline. But there are multiple ways to cull large amounts of triangles using compute now. We see this in action with UE5: they've moved largely away from the FF pipeline to a software rasterizer on compute.
 
Yep, which approach culls triangles/pixels better is still to be proven, I'd guess. Bring on next-gen.
What do you mean, which is better?
XSX also has mesh shaders (which do the culling you're talking about) as well as VRS.

I don't think anyone has said PS5 doesn't have VRS; what has been said is that it's not been confirmed, even when they've been asked explicitly.
 
Well, I apologize; I didn't watch this particular video, and I was running off information from the original article and interpreted it poorly. But this is good info.

I was wrong about the benchmark, but that still doesn't change the nature of my argument.
You are largely fixated on the wrong argument; I'm not saying XSX will run 4K60 and therefore PS5 will run 4K30.

I'm saying that if a game is optimized exactly for 4K60 on XSX, then by default PS5 will run below 4K60. Since the next closest vsync framerate is 30fps, the options are to clip to it or to scale resolution. If your engine does not support scalable resolution, you will clip.

I'm not saying there is a power differential of that magnitude between the two GPUs. I'm saying that in the real world, where console games run at fixed frame rates, these types of scenarios can happen.
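To put rough, purely illustrative numbers on that scenario (a sketch only, assuming the game is GPU-bound, frame rate scales linearly with GPU throughput, and the gap is the ~18% paper figure discussed below):

```python
# Illustrative sketch only: what a fixed GPU deficit does under a hard vsync.
xsx_fps = 60.0        # assume the title is tuned to exactly 4K60 on XSX
gpu_gap = 0.18        # assumed raw GPU gap (the paper TF difference)

ps5_fps = xsx_fps / (1.0 + gpu_gap)        # ~50.8 fps if purely GPU-bound
print(f"Unlocked PS5 frame rate: {ps5_fps:.1f} fps")

# With vsync and no dynamic resolution, the next stable step down is 30 fps.
vsync_steps = [60, 30, 20]
clipped = max(s for s in vsync_steps if s <= ps5_fps)
print(f"Clipped frame rate: {clipped} fps")

# Or hold 60 fps by shedding pixels, if the engine supports resolution scaling.
pixel_scale = ps5_fps / xsx_fps            # ~0.85 of the pixel count
axis_scale = pixel_scale ** 0.5            # ~0.92 per axis (~3535x1989 from 4K)
print(f"Resolution scale to hold 60 fps: {pixel_scale:.2f} "
      f"({axis_scale:.2f} per axis)")
```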

Performance between these two GPUs can be estimated through formulation.
But when it comes to retail releases and final specifications, formulation doesn't apply; we move on to probability and distributions.
If everything were as simple as formulation, we wouldn't have as many outliers as we do when comparing PS4 Pro and X1X, with some titles running 4K on X1X and 1080p on 4Pro. Does formulation explain that? No, it doesn't. But that was how things shipped. We have median values, and the 4Pro for the most part performed at the median with respect to performance vs X1X. But that doesn't remove the outliers, which has been my argument all this time. I am "speculating" that there will be some titles that fall into this category, because we've seen it happen before.

There were at least 2000 titles released last generation, and we are likely to have another 2000-odd or so. Are you so confident that there is no way outliers could occur in which XSX doubles the frame rate, or the resolution is doubled? I wouldn't bet on that, certainly not based on the data we have from the current mid-generation refresh. In the same vein, I wouldn't bet against PS5 and XSX running the same settings either, despite the 18-20% advantage XSX has over PS5.
From the graphs posted here, all I see is the rough equivalent of the PS5 part: an OC 5700 XT is about 6fps slower than a 2080, which is presumably slightly faster than the XSX. Now, the paper difference of 18% in compute performance is far larger here than the actual fps difference. It would take one hell of a turd of an optimization job on the PS5 version to have it run at 1440p vs 4K on XSX, or 30 vs 60. Not even Pro vs 1X sees this happen regularly, despite the monumentally bigger performance delta than PS5 vs XSX. Personally, I see the norm in multiplatform differences being native 4K on XSX, while PS5 runs either dynamic 4K or 1900p-2000p with CBR up to 4K, with the same settings otherwise. You can bank on it.
 
I'm not understanding why culling is so important? I understand the need to cull as much as possible if your'e running higher and higher assets. And this type of thing is incredibly useful for unified shader pipeline. But there are multiple ways to cull large amounts of triangles using compute now. We see this in action with UE5. They've moved largely away from the FF pipeline to a software rasterizer on compute.
The main advantage of mesh shaders is that they replace the vertex pipeline with compute that has a path into the rasterization pipeline. (I expect primitive shaders to be the same.)
This allows many algorithms, with fine-grained culling up front at the meshlet level (small groups of polygons).

E.g.:
Proper tessellation, unlike the current DX11 tessellation mess (no silly tessellation limits or fixed patterns).
Scattering objects into the scene and culling non-contributing ones (pebbles on a beach, tree bark), and culling parts which are occluded by something (possibly by coarse occluders etc.).
Particle systems, ribbons etc.
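Not actual GE or DX12 mesh-shader code, just a minimal Python sketch of the kind of per-meshlet test that fine-grained culling boils down to (bounding-sphere frustum rejection plus a normal-cone backface test); the data layout and the names here are illustrative assumptions, not any real API:

```python
import math
from dataclasses import dataclass

@dataclass
class Meshlet:
    """A small cluster of triangles (e.g. up to ~64 verts / ~126 tris)."""
    center: tuple        # bounding-sphere center in world space
    radius: float        # bounding-sphere radius
    cone_axis: tuple     # average facing direction of the cluster's triangles
    cone_cutoff: float   # precomputed sin of the normal-cone half-angle

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def normalize(v):
    length = math.sqrt(dot(v, v))
    return (v[0] / length, v[1] / length, v[2] / length)

def meshlet_visible(m, camera_pos, frustum_planes):
    # 1) Frustum test: reject if the bounding sphere lies entirely outside
    #    any plane (planes given as (normal, d) with normals pointing inward).
    for normal, d in frustum_planes:
        if dot(normal, m.center) + d < -m.radius:
            return False
    # 2) Normal-cone backface test (simplified: ignores the sphere's extent).
    #    If every triangle in the cluster faces away from the camera, the
    #    whole meshlet can be dropped before any vertex or pixel work happens.
    to_meshlet = normalize((m.center[0] - camera_pos[0],
                            m.center[1] - camera_pos[1],
                            m.center[2] - camera_pos[2]))
    if dot(m.cone_axis, to_meshlet) > m.cone_cutoff:
        return False
    return True

# A mesh shader (or a compute pre-pass) would run a test like this per
# meshlet and only emit triangles for the clusters that survive.
```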
 
From the graphs posted here, all I see is the rough equivalent of the PS5 part: an OC 5700 XT is about 6fps slower than a 2080, which is presumably slightly faster than the XSX. Now, the paper difference of 18% in compute performance is far larger here than the actual fps difference. It would take one hell of a turd of an optimization job on the PS5 version to have it run at 1440p vs 4K on XSX, or 30 vs 60. Not even Pro vs 1X sees this happen regularly, despite the monumentally bigger performance delta than PS5 vs XSX. Personally, I see the norm in multiplatform differences being native 4K on XSX, while PS5 runs either dynamic 4K or 1900p-2000p with CBR up to 4K, with the same settings otherwise. You can bank on it.
Yeah, we all see that as the norm. I wasn't talking about the norm.
And I would disagree that the paper difference of 18% is far larger than the actual fps difference.
43.1 × 1.18 ≈ 50.9fps.
The 2080 is 49.4fps on that chart. So that's actually lined up fairly well.
Except that super-overclocked thicc-boy GPU is running compute and memory clocks above PS5. If you used a standard 5700 XT OC, it would be around 40fps on average, making the gap roughly 40 vs 50fps.

Also, on the topic, you're still comparing against a largely unoptimized piece of unreleased hardware running sub-optimal drivers, etc.
If we look at the hardware differences:
The 2080 has 46 streaming multiprocessors (SMs), which are the equivalent of CUs
1500-1800 MHz core clock
256-bit bus, 8 GB of memory, 448 GB/s of bandwidth
64 × 2 × 46 × 1800 MHz ≈ 10.6 TF of FP32 performance, perhaps higher with more boost, I'm unsure.

Compared to an XSX:
52 CUs
1825 MHz
320-bit bus, 16 GB of memory and 560 GB/s of bandwidth.
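As a rough sanity check on those paper numbers, a sketch using the usual peak-FP32 formula (ALUs × 2 ops per clock for FMA × clock); the 2080 clocks below are the reference/FE boost figures, and actual boost varies by card:

```python
def peak_tflops(units, alus_per_unit, clock_mhz):
    """Theoretical FP32 peak: ALUs x 2 ops/clock (FMA) x clock."""
    return units * alus_per_unit * 2 * clock_mhz * 1e6 / 1e12

# RTX 2080: 46 SMs x 64 FP32 ALUs.
print(peak_tflops(46, 64, 1710))   # ~10.1 TF at the 1710 MHz reference boost
print(peak_tflops(46, 64, 1800))   # ~10.6 TF at the 1800 MHz FE boost

# Xbox Series X: 52 CUs x 64 ALUs at a fixed 1825 MHz.
print(peak_tflops(52, 64, 1825))   # ~12.15 TF
```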

From a pure hardware perspective, XSX still has more to give. After a two-week port, its performance was comparable to a 2080, and it's not yet done. In July the Gears 5 enhanced edition will be announced, and we'll know then. Considering that enhanced editions target 4K60 as per their badging, it's just a question of what settings it runs and how close to ultra each setting is set.
 
I don't think anyone has said PS5 doesn't have VRS,
In making comparisons of performance, some people are assuming VRS isn't present and suggesting XBSX's performance delta is higher as a result.

Has RDNA2 got VRS? If so, how can Sony not have it? Could it be locked away behind software patents? Maybe PS5 has the hardware but Sony/AMD need a way to use it that doesn't infringe patents, so they can't say 'no' or 'yes'? :???:
 
A bad framerate affects my gameplay much more than a mediocre resolution.

This is an interactive medium which requires real-time interaction. The choice most developers make baffles me.
 
The choice makes a lot more sense when you look at the history of game development and the impact of screenshots in print. The move to video advertising helps somewhat, but attention still relies heavily on static screenshots, in reviews and in social-media sharing.

Your game may be a better game for targeting 60fps, but it may well sell less than if it were 30fps. People share tweets of gorgeous screenshots, but no-one's going to share a tweet of someone saying, "love that this game is 60fps."
 
Your game may be a better game for targeting 60fps, but it may well sell less than if it were 30fps. People share tweets of gorgeous screenshots, but no-one's going to share a tweet of someone saying, "love that this game is 60fps."
Is there evidence for this?

Would TLOU2 have sold less than it has if it were a 60fps game, with less detail here and there? I'm not so sure, and we'll never really know. But would I have enjoyed it quite a bit more if it were? Oh yes.
 
Is there evidence for this?
Short of Insomniac's remarks, I don't think there's clear evidence. However, there's definitely evidence that good screenshots get you social-media coverage and marketing, and zero evidence that people share 'game is 60fps!' ;)

I've noticed, in trying to get any attention for my games and seeing how other marketing fares, that it's the pretties that get all the attention. Post "my game is a solid 60fps" and no-one will retweet it. Post a static screenshot of something realtime raytraced and looking amazing yet running at only 20fps, and it'll get a gazillion shares and likes. And none of those respondents will ask, 'what framerate does it run at?'

If you can get main-press coverage and a review extolling the value of your smooth gameplay and solid framerate, you can capitalise on the feature in marketing, but it can't stand alone as a selling point. Like romance, you have to be pretty to be noticed. No-one's going to crowd around a plain and dowdy looking girl/guy to see if they've got a great personality.
 
In making comparisons of performance, some people are assuming VRS isn't present and suggesting XBSX's performance delta is higher as a result.
Hmmm... Yeah, the thing is, it's not been confirmed, and they've been asked explicitly, from what I remember and what's been said here.
My expectation is that it does have it, though.
Has RDNA2 got VRS? If so, how can Sony not have it? Could it be locked away behind software patents? Maybe PS5 has the hardware but Sony/AMD need a way to use it that doesn't infringe patents, so they can't say 'no' or 'yes'? :???:
The reason I believe PS5 has it is because RDNA2 does.
That doesn't mean RDNA2's implementation is 100% the same as XSX's, either in hardware or in some form of software algorithm.
 
Short of Insomniac's remarks, I don't think there's clear evidence. However, there's definitely evidence that good screenshots get you social-media coverage and marketing, and zero evidence that people share 'game is 60fps!' ;)

I've noticed, in trying to get any attention for my games and seeing how other marketing fares, that it's the pretties that get all the attention. Post "my game is a solid 60fps" and no-one will retweet it. Post a static screenshot of something realtime raytraced and looking amazing yet running at only 20fps, and it'll get a gazillion shares and likes. And none of those respondents will ask, 'what framerate does it run at?'

If you can get main-press coverage and a review extolling the value of your smooth gameplay and solid framerate, you can capitalise on the feature in marketing, but it can't stand alone as a selling point. Like romance, you have to be pretty to be noticed. No-one's going to crowd around a plain and dowdy looking girl/guy to see if they've got a great personality.
But pretty games will be pretty enough for PR screenshots at both 30fps and 60fps. I don't know, I think it's all about positioning. KH3 was never really marketed as a 60fps game, but it sure as hell was marketed as a "Look! Disney! Pretty!" kind of game. Same for Doom Eternal, which to me is the pinnacle of tech this generation.
 
Yeah, we all see that as the norm. I wasn't talking about the norm.
And I would disagree that the paper difference of 18% is far larger than the actual fps difference.
43.1 × 1.18 ≈ 50.9fps.
The 2080 is 49.4fps on that chart. So that's actually lined up fairly well.
Except that super-overclocked thicc-boy GPU is running compute and memory clocks above PS5. If you used a standard 5700 XT OC, it would be around 40fps on average, making the gap roughly 40 vs 50fps.

Also, on the topic, you're still comparing against a largely unoptimized piece of unreleased hardware running sub-optimal drivers, etc.
If we look at the hardware differences:
The 2080 has 46 streaming multiprocessors (SMs), which are the equivalent of CUs
1500-1800 MHz core clock
256-bit bus, 8 GB of memory, 448 GB/s of bandwidth
64 × 2 × 46 × 1800 MHz ≈ 10.6 TF of FP32 performance, perhaps higher with more boost, I'm unsure.

Compared to an XSX:
52 CUs
1825 MHz
320-bit bus, 16 GB of memory and 560 GB/s of bandwidth.

From a pure hardware perspective, XSX still has more to give. After a two-week port, its performance was comparable to a 2080, and it's not yet done. In July the Gears 5 enhanced edition will be announced, and we'll know then. Considering that enhanced editions target 4K60 as per their badging, it's just a question of what settings it runs and how close to ultra each setting is set.
XSX was actually slightly slower than the 2080 there, so 57-58fps perhaps? The thicc boy was still clocked lower than PS5, so overall performance might be similar at ~43fps. The delta was definitely lower than the paper specs suggest. Also, you make it sound like XSX has all 16GB of GDDR6 at 560GB/s, when only 10GB of it is; the rest is slower than PS5's, so that's another limitation there.
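For reference, that split works out as below (a sketch: peak bandwidth = per-pin data rate × bus width ÷ 8, with 14Gbps GDDR6 in all three configurations):

```python
def peak_bandwidth_gbs(data_rate_gbps, bus_bits):
    """Peak bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8."""
    return data_rate_gbps * bus_bits / 8

# XSX: 10 GB is interleaved across the full 320-bit bus, the other 6 GB
# only across 192 bits of it.
print(peak_bandwidth_gbs(14, 320))   # 560 GB/s for the 10 GB "GPU-optimal" pool
print(peak_bandwidth_gbs(14, 192))   # 336 GB/s for the remaining 6 GB
# PS5: one uniform 16 GB pool on a 256-bit bus.
print(peak_bandwidth_gbs(14, 256))   # 448 GB/s
```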
 
In making comparisons of performance, some people are assuming VRS isn't present and suggesting XBSX's performance delta is higher as a result.

Has RDNA2 got VRS? If so, how can Sony not have it? Could it be locked away behind software patents? Maybe PS5 has the hardware but Sony/AMD need a way to use it that doesn't infringe patents, so they can't say 'no' or 'yes'? :???:
There's one way it could be.
If it's true I'll eat crow, and no one will like this idea anyway. But it's not impossible.

Let's start with why Sony made their own SSD solution and the Tempest engine:
because they didn't exist. If there was something available that they could leverage, they likely would have.

So there are customizations and then there are new technology builds.

So one should ask: if VRS Tier 2 is in RDNA 2, why did MS go and build an entire solution from scratch? Why not customize it? Why go and file your own entire patent for doing it and roll your own solution, bypassing AMD?

The only reasonable answer is that it doesn't exist or is incompatible. The only other options are that it performs much better than AMD's variant, or, lastly, pure incompetence.

If we go back a while, we discussed rumours of an RDNA 1.9; I ripped on it for a while, perhaps excessively.

I mean, what if both companies are using RDNA 1, because they need that GCN compatibility, and upgraded as much as they could to RDNA 2 but could not port everything over, this being one of the things left behind?

MS still wants it, so they roll their own solution. Sony is a question mark. But if they don't have it, or aren't using AMD's solution and have rolled their own, that would also be suspect. Perhaps RDNA 1.9 may actually be the truth.
 
Also, you make it sound like XSX has all 16GB of GDDR6 at 560GB/s, when only 10GB of it is; the rest is slower than PS5's, so that's another limitation there.
How much does the game have access to on PS5?

XSX, after removing the OS allocation, is left with roughly 3.5GB of slower memory.
Deduct the game engine,
audio,
AI,
level data,
whatever; how much do you expect to be left of that ~3.5GB? (A rough sketch follows below.)
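A back-of-the-envelope sketch of that point; every budget figure below is a made-up, purely illustrative assumption, not a known allocation from any game:

```python
# Hypothetical numbers only: how quickly a small "slow" pool gets consumed.
slow_pool_gb = 3.5   # XSX memory left outside the 10 GB GPU-optimal pool,
                     # after the OS reservation (widely reported figure)

# Illustrative CPU-side allocations (NOT real budgets from any title):
cpu_side_budget_gb = {
    "game engine / executable": 1.0,
    "audio": 0.5,
    "AI / gameplay state": 0.5,
    "level / streaming data": 1.0,
}

remaining = slow_pool_gb - sum(cpu_side_budget_gb.values())
print(f"Left over in the slow pool: {remaining:.1f} GB")   # 0.5 GB in this sketch
```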
So one should ask: if VRS Tier 2 is in RDNA 2, why did MS go and build an entire solution from scratch? Why not customize it? Why go and file your own entire patent for doing it and roll your own solution, bypassing AMD?
Are the patents incompatible with RDNA2's implementation?
Or could they modify/customize it?
 
I also think being 30fps potentially gives you a much higher chance of scoring better with the critics. You have GOTY material like God of War, RDR2, TLOU2, UC4, The Witcher 3, Bloodborne and Breath of the Wild all with 30fps as their primary target frame rate. Yet critics and the majority of gamers alike eat them up like cake without a sliver of uproar, which indicates people are generally fine with 30fps being the norm as long as it's locked without severe dipping. Now imagine cutting their base resolution in half to 720p and reducing the interactivity, asset quality etc., but you get 60fps. I bet a lot of base-console folks would not sit well with that at all; critics would look at the latest Naughty Dog games as if we were back in the PS3 days, due to the extreme blurriness on their full HD and 4K TVs. The same applies to next gen: you can't have your game looking subpar in an already highly competitive, visuals-based industry. More pretty pixels get you more attention, period. Only the hardcore B3Ders and forum-goers crave smoothness over eye candy.
 
I also think being 30fps potentially gives you a much higher chance of scoring better with the critics. You have GOTY material like God of War, RDR2, TLOU2, UC4, The Witcher 3, Bloodborne and Breath of the Wild all with 30fps as their primary target frame rate. Yet critics and the majority of gamers alike eat them up like cake without a sliver of uproar, which indicates people are generally fine with 30fps being the norm as long as it's locked without severe dipping. Now imagine cutting their base resolution in half to 720p and reducing the interactivity, asset quality etc., but you get 60fps. I bet a lot of base-console folks would not sit well with that at all; critics would look at the latest Naughty Dog games as if we were back in the PS3 days, due to the extreme blurriness on their full HD and 4K TVs. The same applies to next gen: you can't have your game looking subpar in an already highly competitive, visuals-based industry. More pretty pixels get you more attention, period. Only the hardcore B3Ders and forum-goers crave smoothness over eye candy.
Well you'd say that. You think with your... eyes! ;)
 