Not generating the exact same bits in GPU product and architecture comparisons

trinibwoy

Very impressive debut for FSR 4. Tim kills it again with a great analysis too.

At one point in the video he makes a distinction between detail and sharpness, and that’s something I’m not really clear on. In theory sharpness can’t substitute for missing detail, but in practice I find post-processing sharpening can make textures appear more detailed by increasing contrast at color transitions.
 
A meta-analysis of FSR4's cost compared to FSR3/DLSS3.

Spider-Man Miles Morales:
4K: 5070Ti is 10% faster.
4K FSR3/DLSS3: 5070Ti is 12% faster.
4K FSR4/DLSS3: 5070Ti is 19% faster.

So the cost of FSR4 is 7% in this game vs FSR3.

Call of Duty Black Ops 6:
4K: 9070XT is 14% faster.
4K FSR3/DLSS3: 9070XT is 5% faster.
4K FSR4/DLSS3: 5070Ti is 12% faster.

So the cost of FSR4 in this game is 17% vs FSR3.

Horizon Forbidden West:
4K: 9070XT is 17% faster.
4K FSR3/DLSS3: 9070XT is 4% faster.
4K FSR4/DLSS3: 5070Ti is 1% faster.

So the cost of FSR4 in this game is 5% vs FSR3.

God of War Ragnarok:
4K: 5070Ti is 6% faster.
4K FSR3/DLSS3: 5070Ti is 5% faster.
4K FSR4/DLSS3: 5070Ti is 12% faster.

So the cost of FSR4 in this game is 6% vs FSR3.

Conclusion:
FSR4 has an average cost roughly 6% higher than FSR3, though it can go higher in some games (up to 17% here), and DLSS3 is faster than FSR3, which helps close several of the performance gaps that exist between the 5070Ti and 9070XT at native. The arithmetic is sketched below.
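
For anyone who wants to redo this with other games, here is a minimal Python sketch of one way to get per-game cost figures like these (the game names and percentages are the ones quoted above; reading the cost as the simple change in the gap is an approximation, since the quoted gaps are already rounded, so it can land a point off from the exact figures):

[CODE]
# Sketch of the arithmetic: the "cost" of FSR4 vs FSR3 is read as how much the
# 5070Ti's lead over the 9070XT grows when the 9070XT switches from FSR3 to FSR4
# while the 5070Ti stays on DLSS3.
# Leads are the percentages quoted above; positive = 5070Ti faster, negative = 9070XT faster.
games = {
    # game: (lead with FSR3/DLSS3, lead with FSR4/DLSS3)
    "Spider-Man Miles Morales": (12, 19),
    "Call of Duty Black Ops 6": (-5, 12),
    "Horizon Forbidden West":   (-4, 1),
    "God of War Ragnarok":      (5, 12),
}

costs = {game: fsr4_lead - fsr3_lead for game, (fsr3_lead, fsr4_lead) in games.items()}

for game, cost in costs.items():
    print(f"{game}: FSR4 cost vs FSR3 ~ {cost} percentage points")
[/CODE]

Nothing fancy - it just reprints the deltas. Swap in other games' FSR3/FSR4 gap numbers to extend it.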

 
Yeah, these heavier FSR4/DLSS4 models are going to weigh heavily on head-to-head comparisons. I suppose it’s technically part of GPU performance, so it shouldn’t be treated differently from any other workload in a game.

The bigger challenge is IQ comparisons. I can’t imagine these guys will continue to devote so much time and energy to doing zoomed-in video comparisons of upscaling methods, especially as they become more commonplace. Just like how nobody does texture filtering comparisons now, even though it was a big deal 20 years ago. Hopefully they can come up with some way to automate IQ evaluation.
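
One rough way that automation could look (purely a sketch, not anything HUB or any reviewer actually uses): capture the same frame from each upscaler plus a native or supersampled reference, then score the candidates with standard full-reference metrics like PSNR and SSIM from scikit-image. The file names below are hypothetical placeholders.

[CODE]
# Hypothetical sketch: score upscaler output against a reference frame using
# full-reference image quality metrics. Needs: pip install scikit-image imageio
import imageio.v3 as iio
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Hypothetical captures of the exact same frame (same camera, same game state).
reference = iio.imread("frame_native_4k.png")
candidates = {
    "FSR4 Quality": "frame_fsr4_quality.png",
    "DLSS4 Quality": "frame_dlss4_quality.png",
}

for name, path in candidates.items():
    img = iio.imread(path)
    psnr = peak_signal_noise_ratio(reference, img)
    # channel_axis=-1 tells SSIM the last axis holds the RGB channels (skimage >= 0.19)
    ssim = structural_similarity(reference, img, channel_axis=-1)
    print(f"{name}: PSNR {psnr:.1f} dB, SSIM {ssim:.3f}")
[/CODE]

The catch is that static-frame metrics miss exactly the temporal artifacts (shimmer, disocclusion fizzle) the zoomed-in videos are trying to show, so a real pipeline would need video-oriented metrics on matched clips.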
 
I think for me, FSR4 has crossed the "good enough" threshold where I don't really care what the differences are from DLSS. I'm more likely to be interested in performance differences between the two than quality differences. It'll get super complicated doing reviews. I still don't know if I'm interested in frame gen, because I haven't seen what good frame gen looks like in person. In theory I like it, but I don't know if either DLSS or FSR4 FG will pass the "good enough" bar for me.

There are just too many factors for a review if you want to cater to everyone. Like, I don't really care about native rendering at all anymore, except in a synthetic way like this gpu has 1.5x performance over this one. I'll literally never run a game at native if there's an option for FSR4. That requires a totally different review than the person that wants 4k native, or uses super-sampling etc.

Edit: Also, this Daniel Owen vid is very good. I used to not like him because I thought he rambled and his videos were way longer than they needed to be. He's gotten better, and I feel like there aren't a lot of wasted words, and he's breaking things down pretty clearly and addressing a bunch of use cases. You could criticize the selection of games, but overall I think his thought process is pointed in the right direction.
 
I think for me, FSR4 has crossed the "good enough" threshold where I don't really care what the differences are from DLSS. I'm more likely to be interested in performance differences between the two than quality differences. It'll get super complicated doing reviews. I still don't know if I'm interested in frame gen, because I haven't seen what good frame gen looks like in person. In theory I like it, but I don't know if either DLSS or FSR4 FG will pass the "good enough" bar for me.

There are just too many factors for a review if you want to cater to everyone. Like, I don't really care about native rendering at all anymore, except in a synthetic way like this gpu has 1.5x performance over this one. I'll literally never run a game at native if there's an option for FSR4. That requires a totally different review than the person that wants 4k native, or uses super-sampling etc.
And that's why reviews should be done apples to apples and leave scaling, framegen and whatnot to articles specifically about them, like I've been saying since day 1
 
There are just too many factors for a review if you want to cater to everyone. Like, I don't really care about native rendering at all anymore, except in a synthetic way like this gpu has 1.5x performance over this one. I'll literally never run a game at native if there's an option for FSR4. That requires a totally different review than the person that wants 4k native, or uses super-sampling etc.
This is wholly dependent on the review subject. If you're looking at a GPU review then upscaling options are just one of the features supported and they should be explored in a separate section (one thing I don't like is when these are used as the review baseline; "quality" vs "quality" is not an equal comparison). If we're talking about a game performance review though, then using upscaling seems unavoidable these days as most gamers will use it.
 
And that's why reviews should be done apples to apples and leave scaling, framegen and whatnot to articles specifically about them, like I've been saying since day 1

That’s not practical as scaling is gradually becoming the “normal” way to play some games. For a long time now we’ve essentially assumed that all hardware outputs the same image when that wasn’t always the case. One day we’ll assume the same for upscalers.
 
This is wholly dependent on the review subject. If you're looking at a GPU review then upscaling options are just one of the features supported and they should be explored in a separate section (one thing I don't like is when these are used as the review baseline; "quality" vs "quality" is not an equal comparison). If we're talking about a game performance review though, then using upscaling seems unavoidable these days as most gamers will use it.
Will they? The general gamer is a whole different story from forum dwellers. Unless it's turned on automatically, I'd dare say most gamers won't use it. A good example of this is NVIDIA's latest PR slide saying this many % use x tech now, when they're taking the numbers from the NVIDIA App, which automatically turns these features on, so in reality they're showing how many people actually turn them off, not on. It wasn't that long ago that some site (TPU?) did a questionnaire and under half of the respondents even knew what DLSS is - and that's on an actual tech site.

edit: this isn't the one I meant, but it's a little over a year old and shows people preferring native over scaling, and this is among forum dwellers: https://forums.overclockers.co.uk/t...e-updated-poll-choices-24-12-revote.18981788/
 
That’s not practical as scaling is gradually becoming the “normal” way to play some games. For a long time now we’ve essentially assumed that all hardware outputs the same image when that wasn’t always the case. One day we’ll assume the same for upscalers.

Exactly. I'm still for including native res results at least, but let's face it - if the 9070 did not have FSR4, it would be viewed as a non-starter by most reviewers, even if DLSS4 hadn't materialized. Having a better 'value' argument at native res is irrelevant if the reconstruction methods you present to players are so poor that it's basically native or nothing.

The 9070 is competitive precisely because AMD devoted engineering and die area resources to processing its AI model.
 
Will they? The general gamer is a whole different story from forum dwellers. Unless it's turned on automatically, I'd dare say most gamers won't use it. A good example of this is NVIDIA's latest PR slide saying this many % use x tech now, when they're taking the numbers from the NVIDIA App, which automatically turns these features on, so in reality they're showing how many people actually turn them off, not on.
What? NvApp doesn't turn anything on "automatically".

It wasn't that long ago that some site (TPU?) did a questionnaire and under half of the respondents even knew what DLSS is - and that's on an actual tech site.
Which isn't telling us anything. DLSS is viewed by most as a "free" performance upgrade. I'd argue that people who prefer "native" over everything are the minority in this picture, as the majority don't care about "native"; they only care about performance. A case to consider here is the UP (Ultra Performance) mode, which was created for 8K output, and yet I constantly see people saying they use it on their definitely-not-8K monitors.
 
A good example of this is NVIDIA's latest PR slide saying this many % use x tech now, when they're taking the numbers from the NVIDIA App, which automatically turns these features on, so in reality they're showing how many people actually turn them off, not on. It wasn't that long ago that some site (TPU?) did a questionnaire and under half of the respondents even knew what DLSS is - and that's on an actual tech site.

I don’t think that’s true. At least it’s never defaulted anything on for me. You have to opt in to “optimized settings” or something for that to happen.

Exactly. I'm still for including native res results at least, but let's face it - if the 9070 did not have FSR4, it would be viewed as a non-starter by most reviewers, even if DLSS4 hadn't materialized. Having a better 'value' argument at native res is irrelevant if the reconstruction methods you present to players are so poor that it's basically native or nothing.

The 9070 is competitive precisely because AMD devoted engineering and die area resources to processing its AI model.

Yeah, and when next-generation consoles are producing amazing-looking upscaled 4K images, nobody will think running PC games at native is a smart thing to do.
 
What? NvApp doesn't turn anything on "automatically".


Which isn't telling us anything. DLSS is viewed by most as a "free" performance upgrade. I'd argue that people who prefer "native" over everything are the minority in this picture, as the majority don't care about "native"; they only care about performance. A case to consider here is the UP (Ultra Performance) mode, which was created for 8K output, and yet I constantly see people saying they use it on their definitely-not-8K monitors.

The Nvidia App can automatically adjust your game settings to "optimized." I have no idea how many gamers are using that. Would be interesting to see if they're just tracking it from the Nvidia app or some driver telemetry. Ultimately I don't really care. There are lots of use cases for GPUs now because people like different things.
 