Nvidia GeForce RTX 40x0 rumors and speculation

Very misleading. DLSS 3 results are not in any way a comparable measure of performance, especially since it's not universally available in all games. It's a nice feature, but it shouldn't be presented as a baseline performance uplift, because it's not.
It seems likely to me that all new games will support DLSS 3.

So the remaining question is: at 1440p, which existing games aren't already a fantastic experience on a 4070 Ti? A few of the most painful ones for Ampere, at least, support DLSS 3, so of the rest, which are still painful on Ada?

Of course, I'm ignoring the "latency" question. If you think you're a 120fps update-rate AND 120fps latency gamer, then DLSS 3 is just a pisstake.
 
If Videocardz is to be believed, then both AD106 and AD107 get a 128-bit bus.

Upd: CB.de is reporting the same:
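For a sense of what a 128-bit bus implies, the usual back-of-the-envelope bandwidth math is below; the data rates are purely illustrative placeholders I picked for the example, not rumored specs.

```python
# Rough peak memory bandwidth: bus width (bits) / 8 * effective data rate (Gbps).
# Data rates below are illustrative placeholders, not confirmed specs.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# A hypothetical 128-bit card with 18 Gbps GDDR6 vs a 192-bit one with 21 Gbps GDDR6X.
print(bandwidth_gb_s(128, 18))  # 288.0 GB/s
print(bandwidth_gb_s(192, 21))  # 504.0 GB/s
```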
 
https://wccftech.com/nvidia-geforce...s-leak-out-flagship-faster-than-the-rtx-3090/

The leaked benchmarks today show it is faster than the RTX 3090 and within striking distance of the 7900 XTX, so I guess it's going to be faster in ray tracing, though not when it comes to rasterization.
 
Wouldn't be the first time flashing a different BIOS on an Nvidia GPU got you more shader cores and VRAM.

That's a very old post. Setting aside questions about the source for a moment, we now know it was renamed to the 4070 Ti and the configuration is the same. The actual RTX 4070 is currently rumored to be cut down quite a bit, to 46 SMs (vs. the full 60).

Also, unless there's a major security leak, BIOS flashing to do that is not going to happen anymore. Not to mention they are generally laser-cutting and fusing off parts in hardware now, as opposed to just relying on software locks.

Of course, I'm ignoring the "latency" question. If you think you're a 120fps update-rate AND 120fps latency gamer, then DLSS 3 is just a pisstake.

I don't understand this take, although we're likely going to be rehashing old debates. DLSS 3 support mandates Reflex, which lowers latency compared to non-Reflex. If we go with that thinking, then is latency unacceptable for all current games not using Reflex?

Also, frame generation output can be 120 fps+.

Many games are also going to start to be CPU/memory bound at high refresh rates, even with the highest-end current CPU/memory configs. Not to mention, even for GPU buyers in this price range, I wouldn't expect everyone to be that high end on CPU/memory. As a fun, extreme example, one of the HUB reviewers apparently has an RTX 4090 paired with a 12400 in his personal machine.
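To put the latency side in rough numbers, here's a crude sketch; the figures are purely illustrative, and the "FG buffers one extra rendered frame" assumption is a simplification, not a measurement.

```python
# Crude latency model: FG roughly doubles presented frames, but input latency is still
# paced by the underlying render rate, plus (assumed) one extra rendered frame of
# buffering for interpolation. queued_frames approximates the render-ahead queue that
# Reflex trims.
def latency_ms(render_fps: float, fg: bool, queued_frames: float) -> float:
    frame_ms = 1000.0 / render_fps
    fg_buffer = frame_ms if fg else 0.0  # assumption: FG holds back ~1 rendered frame
    return (1.0 + queued_frames) * frame_ms + fg_buffer

native_120 = latency_ms(120, fg=False, queued_frames=1.0)  # native 120 fps, 1 queued frame
fg_120     = latency_ms(60,  fg=True,  queued_frames=0.0)  # 60 fps rendered, FG to ~120, Reflex
print(round(native_120, 1), round(fg_120, 1))              # ~16.7 ms vs ~33.3 ms
```

Both cases present ~120 fps, but the input-to-photon path differs, which is the whole "update rate vs latency" argument in a nutshell.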

My usual suspect for such cases is VRAM, but alas, in this case they are comparing a new 12GB card to an old 24GB one.
Will be interesting to see what this "Overdrive" mode does for Ada to be that much faster.

I'd guess it's the increased amount of ray tracing and leveraging SER (Shader Execution Reordering).
 
They were, nothing to wonder there.
So they got 1.5x in APTR while comparing the game running at 4K native to it running at 4K with DLSS Performance and FG?
Which means the card will be about half of a 3090 at 4K native?
...Nothing to wonder here?
 
Yeah, basically, when they're deliberately being vague: assume the worst, the most deceitful, the most sleazy, the most scummy.

Then when the real benchmarks come out and it turns out they were telling the truth: no disappointment, no anger. And if it turns out they were lying through their teeth: also no disappointment, no anger.

BTW, does anyone have the full VRAM size lineup? And the adjustable TDP ranges?
 
So they got 1.5x in APTR while comparing the game running at 4K native to it running at 4K with DLSS Performance and FG?
Which means the card will be about half of a 3090 at 4K native?
...Nothing to wonder here?
It's DLSS Q vs DLSS Q + FG
 
It's DLSS Q vs DLSS Q + FG

Yes. Meaning 1.5x performance without FG as a minimum. Obviously this is an extreme corner case, with most games likely falling around 3090/3090 Ti-level performance without FG. Given the lower bandwidth, I wouldn't be surprised if it's generally a bit slower than the Ti at native 4K.
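For anyone wanting to sanity-check these FG-inclusive comparisons themselves, a trivial back-of-the-envelope helper is below; both the example ratios and the ~2x FG multiplier are illustrative assumptions, not the actual slide numbers.

```python
# Back out an FG-off performance ratio from an FG-on marketing ratio, assuming FG
# multiplies presented frames by roughly 2x when not CPU-bound. Inputs are placeholders.
def fg_off_ratio(fg_on_ratio: float, fg_multiplier: float = 2.0) -> float:
    return fg_on_ratio / fg_multiplier

print(fg_off_ratio(3.0))       # 1.5:  a hypothetical "3x with FG" claim implies ~1.5x without FG
print(fg_off_ratio(3.0, 1.7))  # ~1.76: higher if FG scales worse than 2x in that title
```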
 
Insane for a laptop!
What I find most impressive is the fact that the 4090 laptop version competes with a 7900 XTX (and probably beats it at ray tracing), and then you see the power consumption... :oops: Just imagine that chip in a desktop computer; it would beat every device on earth efficiency-wise. That's the 4090 I like the most, mainly because of the form factor, and because the power consumption of the desktop 4090 is unrealistic for my 550W PSU.

 