> 3070 Ti on the other hand might be more interesting.

Only if the 3070 was in fact bandwidth constrained.
> I guess they raised TDP to almost 300 watts to ensure a difference - meh.

Maybe the G6X power hog hit again?

> Maybe the G6X power hog hit again?

3070 Ti will be an interesting product as a comparison point for sure.
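For context on the bandwidth-vs-power trade being argued here, a quick back-of-the-envelope from the announced specs (assuming I have them right: 256-bit bus on both cards, 14 Gbps GDDR6 vs 19 Gbps GDDR6X, 220 W vs 290 W board power):

```cuda
// Back-of-the-envelope on the 3070 vs 3070 Ti trade-off. The specs below
// are assumptions taken from announcement coverage, not measurements.
#include <cstdio>

int main() {
    const double bus_bytes = 256.0 / 8.0;       // 256-bit bus -> 32 bytes/transfer
    const double bw_3070   = bus_bytes * 14.0;  // 14 Gbps GDDR6  -> 448 GB/s
    const double bw_3070ti = bus_bytes * 19.0;  // 19 Gbps GDDR6X -> 608 GB/s
    printf("3070:    %.0f GB/s at 220 W\n", bw_3070);
    printf("3070 Ti: %.0f GB/s at 290 W (+%.0f%% bandwidth for +%.0f%% power)\n",
           bw_3070ti, (bw_3070ti / bw_3070 - 1.0) * 100.0,
           (290.0 / 220.0 - 1.0) * 100.0);
    return 0;
}
```

If the 3070 wasn't actually bandwidth-limited, that extra ~36% of bandwidth has nowhere to go, which is the "only if" above.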
Why are we talking about these cards like we'll actually be able to get one?
> It pretty much never looked worse than basic monitor upscaling when normalized for performance. HUB was the only source which made such nonsense claims. What it did do is introduce an obvious "ML hallucination" style into the upscaled image, which may not have been to everyone's liking.

I think you are forgetting how bad DLSS 1 was. It didn't even manage to look better than a 1440p standard upscale despite being much slower.

> I think you are forgetting how bad DLSS 1 was. It didn't even manage to look better than a 1440p standard upscale despite being much slower.

That's one game, and the first one to get both DLSS and RT. Again, these claims can't be made universally based on such a limited sample size. On average it did manage to look better than a standard upscale, while being slower of course. And HUB isn't a source I consider trustworthy in these things.
> That's one game, and the first one to get both DLSS and RT. Again, these claims can't be made universally based on such a limited sample size. On average it did manage to look better than a standard upscale, while being slower of course. And HUB isn't a source I consider trustworthy in these things.

Do you think they are tampering with the footage or lying about what footage is what?

> Do you think they are tampering with the footage or lying about what footage is what?

I think that when their analysis is telling me something which I know is wrong, or which my own eyes don't see, then I can't trust their analysis. They are fine for CPU/motherboard and monitor reviews, but their GPU coverage for the last couple of years has been seriously lacking. You also seem to have missed the part about making generalized claims based on one sample.
A selection of gaming press articles that say otherwise doesn't prove your point. Other press, and posters on technical forums, were not impressed. DLSS 1.9 and 2.0 were Nvidia's response to the generally shared disappointment in 1.0.
Good grief. Our recollections clearly differ greatly.
> There weren't many samples during that time frame. Metro was also bad, but a patch made DLSS at least equal to the built-in upsampling. It was just those two and FFXV for a while. I believe the next game was half a year or so later, when Tomb Raider added it in. It wasn't any better there either.

So maybe you should look beyond that time frame and beyond HUB? Do the research.
> What DLSS 1 titles showed the technology in a positive light?

Depends on what "positive light" is to you.
So I was wondering about something: are we really sure DLSS 1.9 ran entirely on the shader cores? I remember running Nsight in 2019 with Control, and once DLSS 1.9 was activated it did actually use the tensor pipeline. So I am a bit confused. If it really was not using tensor cores, then Nvidia could make it available on older cards and AMD - a DLSS Lite, so to speak, as a direct answer to FSR. It would look a ton better than FSR, I imagine.

> Are we really sure DLSS 1.9 ran entirely on the shader cores? I remember running Nsight in 2019 with Control, and once DLSS 1.9 was activated it did actually use the tensor pipeline.

Nvidia routes all FP16 calculations through tensors, so perhaps that was what you were seeing. Just a guess.
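To make the "FP16 shows up on the tensor pipe" guess concrete, here is a minimal CUDA sketch (my own toy code, nothing to do with DLSS; the kernel names are made up, and the WMMA path needs a Volta-or-newer GPU, so compile with something like -arch=sm_70). Only the first kernel explicitly issues HMMA-class tensor instructions; the catch is that a tensor-pipe reading in a profiler only tells you such instructions executed, not whether they came from an ML network or from FP16 math the driver quietly routed there:

```cuda
// Two FP16 kernels: one explicitly on the tensor cores, one on the plain
// FP16 ALUs. Inputs are left uninitialized - only the instruction mix
// matters for a pipe-utilization comparison.
#include <cstdio>
#include <cuda_fp16.h>
#include <mma.h>
using namespace nvcuda;

// Explicit tensor-core path: one warp multiplies 16x16 FP16 tiles (HMMA).
__global__ void mma_16x16(const half *a, const half *b, float *c) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> fa;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> fb;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> fc;
    wmma::fill_fragment(fc, 0.0f);
    wmma::load_matrix_sync(fa, a, 16);
    wmma::load_matrix_sync(fb, b, 16);
    wmma::mma_sync(fc, fa, fb, fc);           // executes on the tensor pipe
    wmma::store_matrix_sync(c, fc, 16, wmma::mem_row_major);
}

// Plain FP16 path: ordinary ALU multiplies, no tensor units involved.
__global__ void fp16_mul(const half *a, const half *b, half *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = __hmul(a[i], b[i]);     // regular FP16 pipe
}

int main() {
    half *a, *b, *c16; float *c32;
    cudaMalloc(&a, 256 * sizeof(half));
    cudaMalloc(&b, 256 * sizeof(half));
    cudaMalloc(&c16, 256 * sizeof(half));
    cudaMalloc(&c32, 256 * sizeof(float));
    mma_16x16<<<1, 32>>>(a, b, c32);          // one warp per 16x16x16 MMA
    fp16_mul<<<1, 256>>>(a, b, c16, 256);
    cudaDeviceSynchronize();
    printf("done: %s\n", cudaGetErrorString(cudaGetLastError()));
    return 0;
}
```

Profiling both kernels with Nsight Compute and comparing a tensor-pipe metric (something like sm__inst_executed_pipe_tensor - name from memory, so treat it as approximate) would show the difference.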
> If it really was not using tensor cores, then Nvidia could make it available on older cards and AMD - a DLSS Lite, so to speak, as a direct answer to FSR. It would look a ton better than FSR, I imagine.

DLSS is a feature of new h/w; they use it to sell new GPUs. If they wanted to make something like FSR capable of running on any GPU out there, they would have made it years ago.
Another point is that they promote DLSS as a way to keep the RT performance hit minimal - and RT isn't really available on older GPUs.

But yeah, it would be cool if they made some NRDLSS (with NR standing for "not really") capable of running on all GPUs on the market.
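On the "runs on anything" point: a spatial upscaler really is just ordinary shader arithmetic, which is why FSR isn't tied to any special hardware. Here's a toy single-channel bilinear resample as a CUDA sketch (my own stand-in, not FSR code - the real FSR 1.0 is an edge-adaptive EASU pass plus RCAS sharpening, shipped as portable HLSL/GLSL):

```cuda
// Toy spatial upscale: nothing here but multiply-adds that any vendor's
// shader cores can run; no tensor units required.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void upscale_bilinear(const float *src, int sw, int sh,
                                 float *dst, int dw, int dh) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= dw || y >= dh) return;

    // Map destination pixel centers back into source space.
    float sx = (x + 0.5f) * sw / dw - 0.5f;
    float sy = (y + 0.5f) * sh / dh - 0.5f;
    int x0 = min(max((int)floorf(sx), 0), sw - 2);
    int y0 = min(max((int)floorf(sy), 0), sh - 2);
    float fx = fminf(fmaxf(sx - x0, 0.0f), 1.0f);
    float fy = fminf(fmaxf(sy - y0, 0.0f), 1.0f);

    // Blend the four nearest source texels.
    float top = src[y0 * sw + x0] * (1.0f - fx) + src[y0 * sw + x0 + 1] * fx;
    float bot = src[(y0 + 1) * sw + x0] * (1.0f - fx) + src[(y0 + 1) * sw + x0 + 1] * fx;
    dst[y * dw + x] = top * (1.0f - fy) + bot * fy;
}

int main() {
    // 2560x1440 -> 3840x2160: the "1440p standard upscale" case from above.
    int sw = 2560, sh = 1440, dw = 3840, dh = 2160;
    float *src, *dst;
    cudaMalloc(&src, sw * sh * sizeof(float));
    cudaMalloc(&dst, dw * dh * sizeof(float));
    dim3 block(16, 16), grid((dw + 15) / 16, (dh + 15) / 16);
    upscale_bilinear<<<grid, block>>>(src, sw, sh, dst, dw, dh);
    cudaDeviceSynchronize();
    printf("upscale: %s\n", cudaGetErrorString(cudaGetLastError()));
    return 0;
}
```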