Digital Foundry Article Technical Discussion [2023]

Man, everything they showed from the previous Spider-Man games looked much more pleasing in that analysis, even though Insomniac did improve some assets.
There is something about the lighting in Spider-Man 2 that just doesn't work very well.

Horizon Forbidden West: Burning Shores looks more like a proper PS5-only game, even though the engine is very much rooted in the PS4. And it's DLC, at that!

SM2, on the other hand...🤷‍♂️
 
Insomniac are usually very good at pushing technical boundaries while also being very efficient and productive, but I think that focus on efficiency and productivity has perhaps meant they haven't had the opportunity to push the technical side as hard as they normally like to this time around. Insomniac seem to be doing much of the heavy lifting for PlayStation in recent years in terms of filling out its first-party catalog. It's quite incredible how many proper AAA games they're able to produce compared to almost any other developer out there.
 
Quote: "Last week's showing raised questions about the game's tone, but also about its overall visual quality. Soon after, Insomniac Games developers stated the Spider-Man 2 footage we saw at the PlayStation Showcase didn't show the game's final version, something both Smith and Intihar reiterated."

Wait and see
 
I hate that this TLOU port has caused this VRAM obsession... VRAM won't make your GPU faster, ugly textures or not.
It's not an obsession when VRAM capacity has been stagnant for almost a decade. It's abnormal, and it's high time it was called out, because manufacturers were fleecing the consumer and hoarding the savings as profit. Every gen, VRAM capacity should increase along with GPU performance, bandwidth, etc. A new card should be better than its predecessor in every way, otherwise it has no reason to exist. That's why the 4060 Ti and the upcoming 4060 are an absolute waste of natural resources. They could have simply pulled a Rebrandeon and cut prices. Instead they increased prices and delivered a woeful product. If the 4060 Ti, which is actually a 4050 Ti in disguise, were $200, the conversation around the card would be different.

Finally, many people are missing what Jensen and Nvidia have been telling us, thinking it's a meme: "The more you buy, the more you save." As in, if you buy a more expensive card, you get better value than if you buy a cheaper card. It seems like that's what they've done with the Ada lineup, in an attempt to realign with the retail space. Again, at retail you get a better deal when you buy in bulk than when you buy a single item. I don't expect this to fly in the GPU marketplace, but if Intel and AMD tag along for the ride, then we're kinda screwed.
 
Finally, many people are missing what Jensen and Nvidia have been telling us, thinking it's a meme: "The more you buy, the more you save." As in, if you buy a more expensive card, you get better value than if you buy a cheaper card.
Agree with most of your post, but not this. Jensen was never addressing us, the graphics/consumer market, when he talks about 'the more you buy, the more you save'. That's addressed specifically to businesses that buy many processors, and it's about how you get a better deal the more processors you buy. That's it. You're reading something into it that absolutely isn't there, and it doesn't even make any sense anyway.
 
Agree with most of your post, but not this. Jensen was never addressing us, the graphics/consumer market, when he talks about 'the more you buy, the more you save'. That's addressed specifically to businesses that buy many processors, and it's about how you get a better deal the more processors you buy. That's it. You're reading something into it that absolutely isn't there, and it doesn't even make any sense anyway.
I don't know. If Jensen were never addressing us, that phrase would have been reserved for Computex only. Unfortunately that has not been the case, and he's used it at gaming-only events. Furthermore, if you look at the way their product stack has shifted over the past few generations, it's clearly evident that the best value has moved up the stack. The x60 tier no longer has the best value proposition. With Turing, it was all bad. Arguably, with Ampere, the best value proposition at MSRP was the 3080 10GB. With Ada, it's either the 4070 Ti (kinda) or the 4090. So his statement clearly seems relevant to the gaming community.
 
Every gen, VRAM capacity should increase

No, it shouldn't.

8GB GPUs may have been around for a very long time, but games that actually use that much VRAM have only been around for the last 12 months or so, so everyone who bought an 8GB AMD GPU 7 years ago now has a GPU that's simply too slow to run shit despite having enough VRAM. They've essentially paid for VRAM they never needed.
 
It's not an obsession when VRAM capacity has been stagnant for almost a decade. It's abnormal, and it's high time it was called out, because manufacturers were fleecing the consumer and hoarding the savings as profit. Every gen, VRAM capacity should increase along with GPU performance, bandwidth, etc. A new card should be better than its predecessor in every way, otherwise it has no reason to exist. That's why the 4060 Ti and the upcoming 4060 are an absolute waste of natural resources. They could have simply pulled a Rebrandeon and cut prices. Instead they increased prices and delivered a woeful product. If the 4060 Ti, which is actually a 4050 Ti in disguise, were $200, the conversation around the card would be different.

Finally, many people are missing what Jensen and Nvidia have been telling us, thinking it's a meme: "The more you buy, the more you save." As in, if you buy a more expensive card, you get better value than if you buy a cheaper card. It seems like that's what they've done with the Ada lineup, in an attempt to realign with the retail space. Again, at retail you get a better deal when you buy in bulk than when you buy a single item. I don't expect this to fly in the GPU marketplace, but if Intel and AMD tag along for the ride, then we're kinda screwed.
Nah, Jensen is saying Nvidia offers more value... DLSS 2 and 3 can turn a 4060 or a 4060 Ti into a 3080 with one click... 24GB of VRAM won't do that.
 
No, it shouldn't.

8GB GPUs may have been around for a very long time, but games that actually use that much VRAM have only been around for the last 12 months or so, so everyone who bought an 8GB AMD GPU 7 years ago now has a GPU that's simply too slow to run shit despite having enough VRAM. They've essentially paid for VRAM they never needed.

Not sure I'd fully agree with this.

Games that fully use 8GB of VRAM, sure, that's pretty recent. But games that can benefit from more than 4GB have been around for a while. Throw in Discord, a browser, multi-monitor, etc., and 8GB is great even on my RX 570.
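
For what it's worth, you can put a number on how much of the card's memory is already gone before a game even launches. A minimal sketch for Nvidia cards (it assumes nvidia-smi is on the PATH; on my RX 570 I just watch "Dedicated GPU memory" in Task Manager instead):

```python
# Minimal sketch: how much VRAM is already in use before a game launches.
# Assumes an Nvidia GPU with nvidia-smi available on PATH; AMD users can
# check "Dedicated GPU memory" in Windows Task Manager instead.
import subprocess

def vram_usage_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First line = first GPU; values are reported in MiB.
    used, total = (int(v) for v in out.strip().splitlines()[0].split(", "))
    return used, total

if __name__ == "__main__":
    used, total = vram_usage_mib()
    print(f"{used} MiB of {total} MiB in use; {total - used} MiB left for the game")
```

Run that with a browser, Discord and a second monitor active and you'll usually see a few hundred MiB already spoken for.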

The reason to go for 8GB was not to use it all, but rather not to be limited by 4GB. 6GB would probably have been fine, but that wasn't an option. A 4GB version would definitely have been a mistake (for me). And the 8GB version of this card was so damn cheap!
 
8GB GPUs may have been around for a very long time, but games that actually use that much VRAM have only been around for the last 12 months or so, so everyone who bought an 8GB AMD GPU 7 years ago now has a GPU that's simply too slow to run shit despite having enough VRAM. They've essentially paid for VRAM they never needed.
Doom Eternal came out in 2020, and it limited texture quality on GPUs with less than 8GB. It came out just before I upgraded from a 6GB RTX 2060 to an 8GB 2070 Super. It simply wouldn't let you set textures high enough to overload VRAM, which maintained performance by taking the ability to set things too high out of the user's hands.
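
Not the game's actual code, obviously, but the kind of clamp it applies is easy to picture. A hypothetical sketch (tier names and VRAM thresholds are made up for illustration):

```python
# Hypothetical sketch of the kind of clamp Doom Eternal applies: the highest
# selectable texture setting is capped by how much VRAM the card reports,
# so the user simply can't pick a tier the card can't hold.
# Tier names and thresholds are illustrative, not the game's actual values.

TEXTURE_TIERS = [          # (setting name, minimum VRAM in MiB it assumes)
    ("Low",        2048),
    ("Medium",     4096),
    ("High",       6144),
    ("Ultra",      8192),
    ("Nightmare", 10240),
]

def max_allowed_texture_setting(vram_mib: int) -> str:
    """Return the highest texture tier the detected VRAM can accommodate."""
    allowed = [name for name, needed in TEXTURE_TIERS if vram_mib >= needed]
    return allowed[-1] if allowed else TEXTURE_TIERS[0][0]

print(max_allowed_texture_setting(6144))   # "High"  on a 6GB RTX 2060
print(max_allowed_texture_setting(8192))   # "Ultra" on an 8GB 2070 Super
```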

As a person still with an 8GB GPU (the aforementioned 2070S), my main issue with recent releases isn't that I have to use settings lower than max to fit within the VRAM budget; it's that today's medium settings look worse than the low settings from just a few years ago.
 

For example, Gears 5 (2019), Rise of the Tomb Raider (2016), God of War (Feb 2022, so it just about counts), Far Cry 6 (2021), Days Gone (2021), Cyberpunk (2020), Horizon Zero Dawn (2021), RE3 Remake (2020), Doom Eternal (2020)... there are lots of games more than a year old that can use more than 4GB even without RTX.

If you're happy to fiddle with settings (which I'm sure everyone here is!), you can go medium on certain settings to gain FPS and high/ultra on textures (which costs very, very little).

And this is without considering, as I said earlier, losing hundreds of MB to browser tabs, Discord, multiple monitors, a high-res desktop or whatever else. It also doesn't take into account mods for games like Fallout and Skyrim, which are decidedly one of the perks of being a PC gamer.

And of course you really want to avoid going over the PCIe bus for video memory if you want to avoid poor lows, stutter and general jankiness, and this is particularly important for DX12. The money I spent on the extra VRAM was certainly not wasted, IMO. In fact, without it, my card would be underperforming much like a 4060 Ti does.
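
To put rough numbers on why spilling over PCIe hurts, here's some back-of-the-envelope arithmetic. The bandwidth figures are ballpark assumptions (~448 GB/s for the 2070S's GDDR6, ~16 GB/s for PCIe 3.0 x16), not benchmarks:

```python
# Back-of-the-envelope look at why spilling textures into system RAM over PCIe
# causes stutter. Bandwidth figures are ballpark assumptions, not measurements.
GDDR6_BW_GBPS  = 448.0   # ~448 GB/s local VRAM bandwidth on a 2070 Super
PCIE3_X16_GBPS = 16.0    # ~16 GB/s theoretical PCIe 3.0 x16

def ms_to_move(megabytes: float, bandwidth_gbps: float) -> float:
    """Milliseconds needed to move `megabytes` of data at `bandwidth_gbps` GB/s."""
    return megabytes / 1024 / bandwidth_gbps * 1000

spill_mb = 512  # half a gig of textures that didn't fit in VRAM
print(f"From local VRAM:   {ms_to_move(spill_mb, GDDR6_BW_GBPS):6.2f} ms")
print(f"Over PCIe 3.0 x16: {ms_to_move(spill_mb, PCIE3_X16_GBPS):6.2f} ms")
# ~1 ms vs ~31 ms -- nearly two whole 60fps frames, hence the hitching.
```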

The PS4 Pro has 5.5GB available for games; the X1X has 9GB. An RX 570 sits between those two in terms of performance. Having less than 4GB of physical VRAM available for games would be a detriment to the card, IMO, especially for cross-gen games.

I personally don't want to spend money on a graphics processor only to clip its wings with the amount of vram it has. I'll happily spend a bit more to avoid that.
 
Doom Eternal came out in 2020, and it limited texture quality on GPUs with less than 8GB. It came out just before I upgraded from a 6GB RTX 2060 to an 8GB 2070 Super. It simply wouldn't let you set textures high enough to overload VRAM, which maintained performance by taking the ability to set things too high out of the user's hands.

As a person still with an 8GB GPU (the aforementioned 2070S), my main issue with recent releases isn't that I have to use settings lower than max to fit within the VRAM budget; it's that today's medium settings look worse than the low settings from just a few years ago.
Well, that's not too surprising, as asset variety has increased significantly. In the past, medium and low "looked" better because the amount of asset variety was laughably bad. Go play older games and look at how the enemies mostly look the same, look at the repeating textures, look at the lack of animation variety, etc. All of this was done to minimize RAM usage. People saying "8GB GPUs should still be sufficient" are clearly not paying attention. I only hope developers continue to push asset variety, environment variety and enemy variety in spite of the 8GB complaints.

I mean, look at Cyberpunk, a game people rave about as looking next-gen due to path tracing (it doesn't): it literally has repeating characters everywhere.
 
For example, Gears 5 (2019), Rise of the Tomb Raider (2016), God of War (Feb 2022, so it just about counts), Far Cry 6 (2021), Days Gone (2021), Cyberpunk (2020), Horizon Zero Dawn (2021), RE3 Remake (2020), Doom Eternal (2020)... there are lots of games more than a year old that can use more than 4GB even without RTX.

If we use the RX 5500 XT as an example, a card that comes in both 4GB and 8GB versions and is around 10% faster than a GTX 1060, the difference in average frame rate between the 4GB and 8GB versions across TechPowerUp's review is 4fps at 1080p and 3fps at 1440p.

Those numbers compare cards from two different board partners, so clock speeds may also account for some of the difference.

At playable settings, 8GB was useless for a very long time unless the user purposefully did something to the game that made it need 8GB of VRAM.

And this is without considering, as I said earlier, losing hundreds of MB to browser tabs, Discord, multiple monitors, a high-res desktop or whatever else. It also doesn't take into account mods for games like Fallout and Skyrim, which are decidedly one of the perks of being a PC gamer.

That's such a specific and small use case that I feel it's irrelevant.

VRAM is, and always has been, a balancing act. We've had periods where AMD didn't give their GPUs enough VRAM to complement their core performance (HD 5800 series).

And there have also been instances where Nvidia didn't have enough VRAM for their GPU core performance (GTX 670/680).

And then there are bad ports. Is it fair to use those games to gauge whether a card has enough VRAM, or do you go with what the other 99% are doing?

The number of AAA games in 2023 that require more than 8GB of VRAM at 1080p at playable settings can be counted on your fingers.

8GB is fine for 1080p at playable settings, and 8GB is fine for 1440p in 99% of games at playable settings.
 
If we use the RX 5500 XT as an example, a card that comes in both 4GB and 8GB versions and is around 10% faster than a GTX 1060, the difference in average frame rate between the 4GB and 8GB versions across TechPowerUp's review is 4fps at 1080p and 3fps at 1440p.

Those numbers compare cards from two different board partners, so clock speeds may also account for some of the difference.

At playable settings, 8GB was useless for a very long time unless the user purposefully did something to the game that made it need 8GB of VRAM.
Surely it would be better to use minimum rather than average frame rates, since stuttering is what really affects the experience?
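
For what it's worth, both metrics come out of the same frametime log, and a quick sketch with made-up numbers shows how far they can diverge when a card starts hitching:

```python
# Minimal sketch: average fps vs 1% low fps from the same frametime capture.
# A VRAM-starved card can post a healthy average while the 1% low collapses.
def average_fps(frametimes_ms):
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    """FPS implied by the average of the worst 1% of frametimes."""
    worst = sorted(frametimes_ms, reverse=True)
    count = max(1, len(worst) // 100)
    return 1000 / (sum(worst[:count]) / count)

# Illustrative capture: mostly ~10 ms frames with occasional 80 ms hitches.
capture = [10.0] * 990 + [80.0] * 10
print(f"Average: {average_fps(capture):.0f} fps")          # ~93 fps
print(f"1% low:  {one_percent_low_fps(capture):.0f} fps")  # ~13 fps
```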
 
I mean, look at Cyberpunk, a game people rave about as looking next-gen due to path tracing (it doesn't): it literally has repeating characters everywhere.
The lighting is clearly next-gen, but I do agree there is an excess of duplicate assets, and assets of varying quality. The flip side is that duplicating assets is an optimization, and the lack of duplicate assets would therefore be a lack of optimization. It's great if a game can have a ton of variety in its assets, but if that means inferior image quality compared to a previous game, is it really worth it?
 
The lighting is clearly next-gen, but I do agree there is an excess of duplicate assets, and assets of varying quality. The flip side is that duplicating assets is an optimization, and the lack of duplicate assets would therefore be a lack of optimization. It's great if a game can have a ton of variety in its assets, but if that means inferior image quality compared to a previous game, is it really worth it?
As the available power increases, we should move to reduce asset duplication, and it's worth it. It's very immersion-breaking to fight the same-looking baddies for the entirety of a 30-hour game. It's completely unrealistic and detrimental to the believability of the game world. It's a compromise we should look to eradicate, and if 8GB of VRAM has to die as a result, let it die. In terms of image quality, developers are already using DLSS and similar technologies as a crutch, so I don't think much will change there.
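
Some back-of-the-envelope numbers (asset sizes entirely made up for illustration) show why duplication has been such an effective memory optimization in the first place, and why eradicating it eats VRAM:

```python
# Back-of-the-envelope sketch of why asset duplication saves memory.
# Sizes are assumed for illustration, not measured from any real game.
MB_PER_UNIQUE_CHARACTER = 60   # unique mesh + textures + animation set
MB_PER_INSTANCE         = 0.5  # per-instance transform/skinning/variation data

def character_vram_mb(unique_characters: int, total_on_screen: int) -> float:
    """Estimated character VRAM: shared assets once, plus cheap instance data."""
    return (unique_characters * MB_PER_UNIQUE_CHARACTER
            + total_on_screen * MB_PER_INSTANCE)

# A crowd of 200 NPCs built from 10 repeated character models...
print(f"{character_vram_mb(10, 200):.0f} MB")    # ~700 MB
# ...versus the same crowd with 200 genuinely unique characters.
print(f"{character_vram_mb(200, 200):.0f} MB")   # ~12100 MB
```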
 