Digital Foundry Article Technical Discussion [2023]

Join the true gaming elite: 1440p 240-360Hz with low settings.

Thankfully for me, while I can appreciate the extra smoothness that comes with 240 Hz in games, going from 120 Hz to 240 Hz isn't nearly as dramatic as going from 60 Hz to 120 Hz. I think 120 Hz might be good enough for me as a lower bound, with 240 Hz being something nice I can dream about, but it's not quite in the "must have" territory that 120 Hz represents. :)

As soon as I had a taste of 120 Hz on my friend's rig, I couldn't wait to get rid of 60 Hz gaming. I don't have the same feeling after trying 240 Hz and higher.

Regards,
SB
 
The '70' class Nvidia GPUs have always been 1440p GPUs (with the '60' class for 1080p and the '80' class and above for 4K), so why are you even talking about 4K performance for a '70' class Nvidia GPU?

Can we apply this same silly logic to the 7900 XT? Because unlike the 4070 Ti, the '900' class for AMD GPUs is a 4K GPU.
Feel free to apply that logic to AMD. The 7900 XT is overpriced; they just followed Nvidia with the price gouging. That card should be like $600 max. With regards to the '70' class GPUs, what Nvidia calls it doesn't matter. It's what they charge for it. The 4070, for example, should actually be the 4060, or the 4060 Ti at worst. Nvidia and AMD are out here price gouging and we have folks in here applauding it. It's very sad.
 
With regards to the '70' class GPUs, what Nvidia calls it doesn't matter. It's what they charge for it.

I disagree; it's still a 1440p-class GPU no matter how people try to swing it.

The high prices have just made people expect them to perform above their class (which I can kind of understand) but it's a 1440p card.
 
I have both (well, the RE4 demo anyway) along with a 12GB GPU, and I can assure you neither game comes even remotely close to 'obsoleting' it based on VRAM.

Putting aside the fact that TLOU shouldn't be used as a benchmark for anything given how much of a technical mess it is, it's also not particularly playable on a 4070 Ti at native 4K ultra settings, which is pretty much what you'd need to break 12GB. I can get the game comfortably below 12GB by turning on DLSS Quality and reducing the environment textures from Ultra to High with literally no visible difference in texture quality. And that's with the game supposedly reserving 2.5GB of VRAM for the rest of the system, which according to @yamaci17 it's not actually doing anyway (and which is enormous overkill for what most people actually need).

RE4 is a classic case of hyper-inflated VRAM requirements for no appreciable gain. You can max everything out while staying under 12GB by simply reducing shadow quality from Max to High, with no appreciable image quality loss. I suspect its AMD-sponsored roots are at play there.
It doesn't matter what the excuse is; bad ports and the like will always exist. Nvidia's job is to deliver value with their products, and so far the whole Ada line fails to deliver much value. If you purchase a graphics card for $800 USD, as in the case of the 4070 Ti, the card should be designed with longevity in mind. The 4070 Ti is not.
It's highly likely that even the $1600 4090 won't be able to play "next-gen ray tracing/path tracing games" at native 4K, and don't even get me started on the $1000 7900 XTX, so I really don't see why you're expecting that of the lower-end cards. Upscaling has already been targeted squarely at games with RT because the performance just isn't there to use it at high native resolutions. Try playing CP2077 at native 4K with path tracing enabled on a 4090 and see what happens.
The real question is why are you defending the card so strongly? Most people look at the Ada lineup and think it's a bad deal all around. I don't know why you have a vested interest in defending Nvidia.
No, this doesn't matter in the slightest. It's a far faster card than the 3070, by much more than 13%, in every scenario. Ada obviously has significant changes over Ampere (largely the heavily increased caches), which make it much less reliant on VRAM bandwidth than previous architectures.

I mean come on, the 7900XTX literally has less bandwidth than the Radeon VII. Are you suggesting that makes it a worse GPU somehow?
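
To illustrate why a bigger on-die cache offsets raw VRAM bandwidth, here's a purely back-of-the-envelope sketch. The hit rates below are made-up placeholders, not measured figures for any real GPU; only the 504 GB/s DRAM figure is the 4070 Ti's spec bandwidth:

```python
# Toy model: requests that hit the L2 cache never touch VRAM, so the bandwidth
# the shaders effectively see scales with 1 / (miss rate). This ignores L2's
# own bandwidth limits, latency, and everything else a real GPU does.

def effective_bandwidth(dram_bw_gbs: float, l2_hit_rate: float) -> float:
    """Rough 'effective' bandwidth given DRAM bandwidth (GB/s) and an L2 hit rate."""
    miss_rate = 1.0 - l2_hit_rate
    return dram_bw_gbs / miss_rate

# Hypothetical hit rates on the same 504 GB/s of DRAM bandwidth:
print(effective_bandwidth(504.0, 0.30))  # ~720 GB/s effective with a small cache
print(effective_bandwidth(504.0, 0.60))  # ~1260 GB/s effective with a big cache
```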
Since when is 30% far faster? I chose the bus to highlight the stagnation that's occurring with the 4070. The whole card is bad; I could have chosen the lack of change in CUDA cores or other specifications.
Hey, no one's saying that more VRAM wouldn't have been nice; of course it would. But that would also have made it more expensive, and the 12GB it does have is not even remotely going to obsolete it for the remainder of this generation. There may be the odd corner case where some minor compromises have to be made where they otherwise wouldn't if it had 16GB, but my money is on those being very few and far between, and very minor in nature. Nothing like the launch issues with Forspoken, for example, which meant 8GB GPUs had to suffer PS3-like textures at any setting (which has been resolved now, for the record, just like most of the recent 8GB hobbling issues).
That's really not the point. Nvidia went 7 years, or 3 architectures (1070 -> 2070 -> 3070), and delivered no improvement in memory capacity while increasing the price significantly. 16GB is the base expectation at the prices they're charging for a "70" class GPU, which is really a 4060 in disguise. We know they're playing funny games, as they tried to pass off a 4070 as a 4080 but got caught red-handed. They then successfully passed off a 4070 as a 4070 Ti and people lapped it up, claiming Nvidia had self-corrected. What a joke.

Again, why does this matter in the slightest? If the redesigned architecture allows it to run at a much higher clock speed, which in turn results in much higher performance, along with much improved performance per watt to boot, why does it matter that they achieved it that way?
Who says the redesigned architecture has anything to do with it? They went from Samsung's bad 8nm process to TSMC's "4N" process. If you just put the Ampere architecture on that process, you'd have gotten a huge boost in performance simply by doing nothing. All the things you're talking about are process-related. Like I said, when I see evidence that Ada is actually significantly faster clock for clock than Ampere, then I'll gladly give credit where credit is due. So far, I haven't seen any evidence of that at all.
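
For the record, that's all the "clock for clock" comparison would take. A minimal sketch where every number you'd plug in is your own measurement (nothing here is real benchmark data, and it assumes you're comparing cards with comparable unit counts):

```python
# Normalize measured performance by core clock to separate architectural
# (IPC-style) gains from gains that come purely from higher clocks on a
# better process node.

def perf_per_clock(fps: float, core_clock_mhz: float) -> float:
    """Frames per second delivered per MHz of sustained core clock."""
    return fps / core_clock_mhz

def clock_for_clock_gain(fps_new: float, clock_new_mhz: float,
                         fps_old: float, clock_old_mhz: float) -> float:
    """Relative clock-normalized gain of the new card over the old one.
    A value near 0 means the uplift is basically just the higher clocks."""
    return (perf_per_clock(fps_new, clock_new_mhz)
            / perf_per_clock(fps_old, clock_old_mhz)) - 1.0

# Hypothetical usage with placeholder numbers (NOT measurements):
# print(f"{clock_for_clock_gain(120, 2700, 90, 1900):+.1%}")
```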
 
It doesn't matter what the excuse is; bad ports and the like will always exist. Nvidia's job is to deliver value with their products, and so far the whole Ada line fails to deliver much value. If you purchase a graphics card for $800 USD, as in the case of the 4070 Ti, the card should be designed with longevity in mind. The 4070 Ti is not.
[attached benchmark chart]

What do we call the experience on a 7900XTX?
 
[attached benchmark chart]

What do we call the experience on a 7900XTX?

With 'longevity' in mind, it's very bad, but with some pretty significant caveats that depend on a number of possibilities:

1) FSR 3.0 sucks or has poor dev uptake.
2) Cyberpunk's path tracing becomes more commonplace outside of one game, and if it does, other devs are as good at VRAM management as CD Projekt is, so the 12GB of a 4070 Ti doesn't also become a bottleneck.

51 fps avg is a hell of a lot better than 24, btw, but it's also low enough that I would probably not be using Overdrive on a 4070 Ti either. Overdrive at this point is a nice look into the future, but whether that 'future' means we'll have a substantial number of games utilizing similar tech within the next 2 years is the question.
 
There will be enough new games with old-school ray tracing that will run worse on a 7900 XTX than on a 12GB Nvidia card like the 4070 Ti. The ARK remake has been announced with UE5 and RTXDI...

But this wasn't my point. VRAM is not the only part of a new GPU. And there are already enough games available in which the 7900 XTX isn't better than a 4070 Ti:
[attached benchmark chart]

Not everyone plays in 4K, not everyone cares about the latest AAA games.
 
Not everyone plays in 4K, not everyone cares about the latest AAA games.

Yup, and not everyone cares about the current state of RT. If nothing except the 4090 is fast enough in RT (which is the case for me, and even then not all games are fast enough with RT on the 4090), then RT just plain doesn't matter. In which case the 7900 XT/X, outside of its absurd price (just like the 40xx series), is pretty good.

Since RT doesn't matter to me, I'm more concerned with memory amount and non-RT performance still. For some people RT is the only thing that matters, and so, perhaps they are fine with paying more for less memory and less non-RT performance. And that's fine as well. :)

Regards.
SB
 
RDNA 3 in it’s currently available iterations does suck. That fact doesn't mean Ada is great. The cards are overpriced and I believe Nvidia’s margins on them are higher than previous GPUs of the equivalent class.
 
51 fps avg is a hell of a lot better than 24, btw, but it's also low enough that I would probably not be using Overdrive on a 4070 Ti either.

It runs at a pretty consistent and very smooth (i.e. no stutters) 60fps at 3840x1600, max settings, DLSS Auto on my system. I'm not sure anyone can reasonably say those are unplayable settings.

As to playing it without PT just to get a few extra fps or a slightly sharper image, that would be a huge waste IMO. Hell, I played through Crysis at something like 720p/30 because I wanted to experience the new pinnacle of graphics, and now I can do that at something resembling 4K at a pretty decent 60fps. Why on Earth would I drop back to 'current-gen' graphics just for that?
 
Speaking of Capcom's QA issues: I saw a ~300MB update for both RE2 and RE3 on Steam, so I went to the Steam forums to see if there were any changes since there were no patch notes (per usual), and apparently this patch removed the RT options. Tested both and can confirm: no more RT.

Sure this is a bug, but lol.
Really? You're using the DX12 versions and not the beta no-RT DX11 versions, right?

Why would they remove RT from the game?
 
It runs at a pretty consistent and very smooth (i.e. no stutters) 60fps at 3840x1600, max settings, DLSS Auto on my system. I'm not sure anyone can reasonably say those are unplayable settings.

I wouldn't say those are 'unplayable' settings either, but I'm going by the chart provided, which has it at 51fps at 1440p. Is that at native resolution with no DLSS 2, just frame gen? Or do you have a 4080/4090?

Really? You're using the DX12 versions and not the beta no-RT DX11 versions, right?

Why would they remove RT from the game?

Yes, it's the DX12 version; everyone is getting it. As to why: it's a shitty QA system, assuming it's not intentional, though I can't see why it would be after all this time.
 