Your 2080 Ti performs almost the same as my 3070 does (a tad better, even). But I need high/medium textures to match the performance you're getting with very high textures, which is what spawned this entire discussion to begin with. Even with low textures I still lose some performance compared to you, because my VRAM, from the game's perspective, is still filled to its maximum (6.4 GB budget). My 3070 losing huge amounts of frames with very high textures is what causes Michael Thompson to claim the PS5 performs like a 3070 in this scene, which is false, since your 2080 Ti, which has exactly the same theoretical "gaming performance" as a 3070, performs 35% above the PS5 with exactly the same settings, textures included, thanks to having enough VRAM budget. (Actually, you too are limited somewhat: the 80% rule means your game only has a total of 8.8 GB budget for itself, so it is still possible that you lose a tiny bit of performance due to the game needing an extra 1-1.2 GB of data to be offloaded to normal RAM.) Overall, you get the intended performance of your card, whereas my 3070 and other 8 GB cards such as the 2070 do not, which is what led to this discussion.

The PS5 does use IGTI AA in the Fidelity Mode with 4K VRR, correct? It's also dynamic, and I'm not sure how close it stays to native 4K. Here is how it runs on my 2080 Ti at native 4K/TAA with Fidelity Mode settings in that scene. My 2080 Ti is paired with an i9-9900K.
View attachment 6912
In that scene, it fluctuates between 61 and 68 fps.
Here is how it runs at native 4K/IGTI Ultra Quality:
View attachment 6913
Once again, in that scene, it fluctuates between 75 and 81 fps. There is no DSR in any of those examples. Whatever the case, I think that calling the PS5 equivalent to a 3070/2080 Ti is a gross exaggeration. The 2080 Ti is A LOT faster in those scenes.
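For anyone who wants to sanity-check the numbers being thrown around here, this is the arithmetic in a nutshell. The 80% budget rule and the card sizes are the figures quoted in this thread; the PS5 frame rate at the end is a made-up placeholder only there to show how a "35% above" claim is computed, not a measured number:

```python
def game_vram_budget(physical_gb: float, usable_fraction: float = 0.80) -> float:
    """VRAM the game actually gets to use under the '80% rule' described above."""
    return physical_gb * usable_fraction

for card, vram_gb in [("RTX 3070 / 2070 (8 GB)", 8),
                      ("RTX 3080 (10 GB)", 10),
                      ("RTX 2080 Ti (11 GB)", 11)]:
    print(f"{card}: ~{game_vram_budget(vram_gb):.1f} GB in-game budget")
# -> 6.4 GB, 8.0 GB and 8.8 GB respectively

def uplift_percent(fps_a: float, fps_b: float) -> float:
    """How much faster (in %) fps_a is compared to fps_b."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical: if the PS5 averaged ~48 fps in that scene, a 2080 Ti averaging
# ~65 fps would be roughly 35% faster. The 48 fps is NOT a measured figure.
print(f"{uplift_percent(65, 48):.0f}%")
```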
I mean, I think what we have works pretty well.

Someone needs to build a machine-learning-based pixel counter!
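Half-joking aside, here is a toy sketch of what the manual, non-ML version of pixel counting boils down to: measure the stair-step length along a hard aliased edge and divide the output width by it. It assumes a clean, nearest-neighbour-style upscale and a well-chosen edge, which is exactly why people do it by hand on carefully picked frames; the function names are made up for illustration.

```python
import numpy as np

def run_lengths(row: np.ndarray) -> np.ndarray:
    """Lengths of runs of identical values along a 1-D slice of pixels."""
    change = np.flatnonzero(np.diff(row)) + 1           # positions where the value changes
    edges = np.concatenate(([0], change, [row.size]))   # run boundaries
    return np.diff(edges)

def estimate_render_width(edge_row: np.ndarray, output_width: int) -> float:
    """Guess the native render width from the stair-step size along an aliased edge.

    On a (near) nearest-neighbour upscale, a hard diagonal edge turns into steps
    whose horizontal length equals the upscale factor, so
    render_width ~= output_width / step_length.
    """
    steps = run_lengths(edge_row)
    core = steps[1:-1] if steps.size > 2 else steps     # drop the clipped end runs
    return output_width / float(np.median(core))

# Toy input: an edge with 3-pixel steps in a 3840-wide frame implies ~1280 px render width.
row = np.repeat(np.arange(10) % 2, 3)
print(estimate_render_width(row, 3840))                 # -> 1280.0
```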
I would actually like to get native Fidelity VRR unlocked-mode captures directly from that specific scene on PS5. That way we can confirm whether it is indeed native 4K or not. Logically it should be, since I don't see why a VRR mode would drop resolution, but I'm not quite sure. Considering the 2080 Ti punches 35% above it, though, that tells me it could indeed be native 4K. The game is not super heavy on ray tracing, evidenced by the 2080 Ti being able to get almost native 4K/60 fps with RT enabled.

To be fair, I also think it's worth calling out NVIDIA for selling $500 cards with 8GB of VRAM. The 1070 back in 2016 launched with 8GB. How do we still have 8GB enthusiast cards two generations later? The closest competitors to the 3070 are the 6700 XT and 6800 with 12GB and 16GB respectively. NVIDIA should have configured their GPUs so the 3070 could have 10GB and the 3080 12GB (because there's no way they would go with 16 and 20). That's the bare minimum in 2022 in my opinion.
3080 10GB vs 3080 12GB would be a really interesting comparison in this game.

They're actually slightly different and not the same GPU. If memory serves right, the 12GB has slightly more cores (8960 vs the 10GB's 8704) and a wider bus (384-bit vs 320-bit).
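For what it's worth, the wider bus alone is a measurable difference. A quick sketch, assuming both 3080 variants run their GDDR6X at 19 Gbps (the memory clock is my assumption, not something stated in this thread):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(320, 19.0))  # 3080 10GB -> 760.0 GB/s
print(peak_bandwidth_gb_s(384, 19.0))  # 3080 12GB -> 912.0 GB/s
```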
I believe it was also in DOOM Eternal where the 3080 (or 3070?) underperformed compared to the 2080 Ti again because of a frame buffer limitation.
In general, more extensive tests at native 4K with Very High textures would be a really interesting comparison all around.
Because there's been no need for them: why put 16GB on a GPU when all it's going to be running is multiplats built for the base PS4, which has 5.5GB of usable RAM? Now more VRAM makes sense, as the base spec has increased.

I don't disagree with that. It made sense with the 1070 and 2070, but absolutely none with the 3070. Those consoles were also designed to run at 1080p and often featured low-res textures (especially later in their lives), extremely low AF, and low-resolution effects, among other things. The 3070 was released right around the time that the current-gen consoles came out. It should have been obvious that 8GB was being incredibly stingy, especially for a card geared at 1440p+.
With VRAM it's a cost vs. 'is it worth it' question.

It is with NVIDIA, because only they do silly stuff like the 1060 3GB vs 6GB, the 3060 12GB vs 6GB, etc. Every card should come with an adequate amount of VRAM so you never have to wonder whether you'll have enough, or whether you should go with a lower-tier model that has more VRAM. AMD has enough VRAM at every tier that the amount is a non-factor when making a purchasing decision.
But how do you determine what's an adequate amount? I don't, and have never, questioned whether my 3060 Ti has enough VRAM.

For high-end and enthusiast cards, the amount of memory usable by the consoles is a good reference point. Out of 16GB, the PS5 has around 13.5GB usable for games, I believe, so anywhere from 12 to 16GB is adequate for an 80-series card. For a 70, 10GB is good enough.
8GB is fine for a 3060 Ti; it's an upper mid-range card for 1080p/1440p. My problem is with those high-end $500+ cards sporting 8GB and 10GB but advertised for 1440p/4K.

The PS5 is around 12.5GB, by the way, and that also includes what would be 'system RAM' on a PC.
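To put a rough number on how that console figure translates to a PC, here is a back-of-the-envelope sketch. The 12.5GB is the figure quoted just above; the split between 'VRAM-like' and 'system-RAM-like' usage is purely an assumption for illustration, since the real split varies per game:

```python
ps5_game_usable_gb = 12.5   # figure quoted above (unified memory available to games)
assumed_cpu_side_gb = 3.0   # hypothetical share that would live in system RAM on a PC
vram_like_gb = ps5_game_usable_gb - assumed_cpu_side_gb

print(f"~{vram_like_gb:.1f} GB of the PS5's game memory would be 'VRAM-like' on a PC")
# Under this (made-up) split, a 10-12GB card roughly lines up with the console budget.
```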
Let's look at the options I had when I was buying a new graphics card:
- RTX 3060ti with 8GB VRAM
- AMD 6750XT with 12GB VRAM
I bought the 3060 Ti. Why?
Because I prefer the trade-offs it will require in the future over the trade-offs the 6750 XT will require. Let me explain.
In the future the 3060 Ti, with its 8GB of VRAM, will likely have to drop to lower-quality textures, but thanks to its vastly superior ray tracing performance it will offer higher-quality lighting, reflections and shadows.
Meanwhile the 6750 XT will offer higher-quality textures with its 12GB of VRAM, but at the cost of lower-quality lighting, reflections and shadows due to its significantly weaker ray tracing performance.
Given the choice between the trade-offs each card may require in the future, I prefer higher-quality lighting, reflections and shadows over higher-quality textures.
But that's just me; others will prefer higher-quality textures.