NVidia Ada Speculation, Rumours and Discussion

I still find the narrative interesting with respect to being insistent on testing at "max settings" while putting RT in its own bracket.

Well it sorta makes sense since RT isn’t supported by many of the cards still in circulation today. Though it would be nice to see the end of ultra settings that make no visual difference.
 
Putting RT vs raster aside I think sites should just test a wide range of modern, popular, and demanding games at their max settings. With max settings obviously including full RT where available.

I wouldn't want to see them cut new and popular AAA non-RT games from the benchmark suite just because they run very fast, though. Maybe they could supersample them if the framerates are too stupid? i.e. Nvidia DLDSR and whatever the AMD equivalent is. Seeing Forza Horizon 5 at an internal 8K downsampled to 4K, for example, but still running at >100fps would be pretty sweet.
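If you want a rough feel for the cost, here's a quick back-of-envelope (my own assumed numbers; it only holds to the extent performance scales with pixel count):

```python
# Rough pixel-count math for supersampling (DSR/VSR-style), assuming frame rate
# scales inversely with pixels shaded -- a simplification, not a benchmark.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)    #  ~8.3 million pixels
internal_8k = pixels(7680, 4320)  # ~33.2 million pixels

scale = internal_8k / native_4k
print(f"8K internal is {scale:.1f}x the pixels of 4K")
print(f"A game doing 400+ fps at 4K would land around {400 / scale:.0f} fps at 8K internal")
```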
 
What like-for-like basis? A 5nm die costs more than an 8nm die of the same size. Of course we all want big performance increases at lower prices, but acting like there's some magic number we're entitled to is just silly.
You're not 'entitled' to anything whatsoever going by this. They could actually deliver worse performance per dollar with the next generation and, by your own logic, I'd get to call YOU entitled if you complained about that. Come on now.

Nothing I'm saying is unreasonable. And obviously my number isn't some magic figure that has to be adhered to strictly; I'm sure you understand that perfectly well. But a significant jump in performance per dollar each generation is literally the ONLY thing that makes new consumer GPUs properly worthwhile or worth being interested in.
 
Putting RT vs raster aside I think sites should just test a wide range of modern, popular, and demanding games at their max settings. With max settings obviously including full RT where available.
Sites already test that way, though?

I'd much prefer them testing at high raster settings (instead of ultra) and medium ray tracing, as ultra settings + ultra RT need a ton of performance that in most cases doesn't contribute much to image quality.
 
Sites already test that way, though?

I'd much prefer them testing at high raster settings (instead of ultra) and medium ray tracing, as ultra settings + ultra RT need a ton of performance that in most cases doesn't contribute much to image quality.

This was in response to earlier comments about testing the 4090 in RT games exclusively. Although in any 40x0 generation testing, I think I'd want to see settings maxed out on all tiers of GPU aside from the most extreme exceptions (Cyberpunk Psycho RT on a 4060, for example).

I'd certainly like to see ultra settings continue to be used in testing, as while they can often deliver little or even no visual improvement, that isn't always the case (look at Spider-Man's High vs Very High RT, for example), and with GPUs as powerful as the upcoming generation are likely to be, I want to see how they handle everything at max.
 
We are far from 4k 144 being easy in terms of rasterization. Unless we are now calling 4k DLSS performance mode 4k.
More importantly, we have yet to see any proper next gen games yet on PC. Demands will go up. Achieving 4k/120fps or even 4k/60fps is going to become more difficult in games that are really pushing things. Just compare how hard it is to run Bioshock at 4k with what it takes to run Horizon Zero Dawn at 4k. Massive world of difference in demands between a single generation.

Ray tracing will not be the only part of this, either. Far, far from it. There's still plenty of room to push graphics a whole lot without any ray tracing at all.

Anybody arguing that 'only rasterization should matter' or 'only ray tracing should matter' for benchmarks is being quite silly. We clearly still need a mix of both, which is why many reviewers sometimes separate them into different categories. Makes perfect sense.

As for whether ray tracing will become 100% ubiquitous in a few years, I doubt it. More common, sure. But until we get a new generation of consoles with far better ray tracing capabilities, devs are still going to build heavily around non-ray-traced solutions for plenty of things.
 
More importantly, we have yet to see any proper next gen games yet on PC. Demands will go up. Achieving 4k/120fps or even 4k/60fps is going to become more difficult in games that are really pushing things. Just compare how hard it is to run Bioshock at 4k with what it takes to run Horizon Zero Dawn at 4k. Massive world of difference in demands between a single generation.

Indeed. With many big console games still pushing 30fps in their Fidelity modes without RT, and with DRS at that, even a 4090 might be hard pressed to hit 144Hz at a locked 4K with ease. Something like Horizon Forbidden West, when it eventually hits the PC, might be a good example there.
 
You're not 'entitled' to anything whatsoever going by this.

Not exactly. You’re entitled to decide whether to spend your money or not.

Nothing I'm saying is unreasonable. And obviously my number isn't some magic figure that has to be adhered to strictly; I'm sure you understand that perfectly well. But a significant jump in performance per dollar each generation is literally the ONLY thing that makes new consumer GPUs properly worthwhile or worth being interested in.

Yes, and everyone's definition of significant is different. Mine is closer to 80-100% before I upgrade, but I'm not going to impose that standard on everyone else. I upgraded from Haswell to Zen 3 while other people upgraded much more frequently. There are people who think 10% is "significant".
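Just to make the metric concrete, here's the sort of math I mean, with made-up prices and frame rates (not leaks or predictions):

```python
# Toy perf-per-dollar comparison with made-up numbers; only the ratios matter.
def perf_per_dollar(avg_fps, price_usd):
    return avg_fps / price_usd

last_gen = perf_per_dollar(avg_fps=100, price_usd=700)  # hypothetical outgoing card
next_gen = perf_per_dollar(avg_fps=160, price_usd=900)  # hypothetical new card

gain = (next_gen / last_gen - 1) * 100
print(f"Raw performance: +60%, perf per dollar: +{gain:.0f}%")  # ~+24% here
```

A card can be a lot faster and still barely move perf per dollar if the price climbs with it, which is exactly why people's thresholds differ so much.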
 
Those on Ampere GPUs are most likely not even looking to upgrade to RTX 4000 GPUs. Heck, even most RTX 2060S-and-up owners aren't going to upgrade. It's not even really a need; anyone with an RTX 2060S/2070 or better will be tagging along fine, as that's ballpark console performance (plus some extra RT performance on top). Coupled with the improving DLSS, it's not really the largest user group that's looking to upgrade.
I think Pascal users, like those on a GTX 1060, 1070, etc., are far more interested in upgrading to a Lovelace GPU. For them it's a six-year (Pascal, 2016) run.

As for performance per dollar, it depends on how you look at it. If Ada delivers twice the performance, that's a large leap, especially if it isn't drawing much more power (better efficiency). Hardware and games have generally gotten more expensive across all platforms, unfortunately.
 
That depends on the VRAM situation. If 8 GB ends up with very bad-looking textures in next-gen games, I will look forward to upgrading it.
I don't think you have to be concerned with that. Future games will use virtual texturing / Sampler Feedback Streaming to get crisp texture detail with a low VRAM budget.

The Matrix Demo uses around 7 GB at 4K.
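For anyone unfamiliar with the idea, here's a very simplified sketch of feedback-driven tile streaming; the tile size and budget below are my own assumptions, not anything from UE5 or DirectX:

```python
# Simplified sketch of feedback-driven texture tile streaming (the idea behind
# virtual texturing / Sampler Feedback Streaming); real engines are far more
# involved than this.
from collections import OrderedDict

TILE_BYTES = 64 * 1024        # assume ~64 KB per compressed 256x256 tile
VRAM_BUDGET = 2 * 1024**3     # assume a 2 GB budget for streamed texture data

resident = OrderedDict()      # tile_id -> frame last sampled, kept in LRU order

def stream_tiles(sampled_tiles, frame):
    """Keep only tiles the GPU actually sampled recently resident in VRAM."""
    for tile in sampled_tiles:           # feedback reported by the GPU
        if tile in resident:
            resident.move_to_end(tile)   # mark as recently used
        resident[tile] = frame           # record; a real engine uploads data here
    # Evict the least-recently-used tiles once over budget
    while len(resident) * TILE_BYTES > VRAM_BUDGET:
        resident.popitem(last=False)
```

Because only the tiles actually being sampled need to live in VRAM, the working set tracks what's on screen rather than the full texture set on disk.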
 
That depends on the VRAM situation. If 8 GB ends up with very bad-looking textures in next-gen games, I will look forward to upgrading it.

True, but I'd take the wait-and-see route and see what happens when we truly leave cross-gen. 8GB would perhaps be the minimum, but not totally worthless either. The above poster was before me.
Anyway, current-gen titles, and console ports to PC in particular, aren't the best way to gauge whether 8GB is too small for VRAM or not. Obviously, shopping for a GPU now I'd go with >12.
 
Obviously, shopping for a GPU now I'd go with >12.
8-12 GB is fine for anything below 4K now and likely for some years into the future still.
You really need >12 GB only for 4K+RT modes which are somewhat unlikely to be playable on any <12 GB Ampere GPU anyway.
The only odd duck here is the 3080 10GB, and it would be a pity if Nvidia's suggestion for owners of that card is to upgrade to a 4080 12GB at the same price point.
I would very much prefer the 4080 16GB to launch at $700-900.
 
I still find the narrative interesting with respect to being insistent on testing at "max settings" while putting RT in its own bracket.

I don't mind if all sites tested both RT off and RT on. I don't expect RT to be relevant to me for at least another generation (so the one after Ada), but I'd be quite happy to be surprised if I can afford a GPU that can do either 3200x1800 or 4K native at 120 Hz with RT on, or, if DLSS Quality gets significantly better, that might also be decent to use with RT.

RT implementations also have to improve, in addition to cards I can afford being fast enough to use with RT, especially if it can't be locked to 4K/120 or 3200x1800 @ 120 Hz. Otherwise it's likely one of the first things I'll disable in order to get a locked 120 Hz.

Yes, it'll be nice to look at the RT when not playing in order to see how graphics rendering is advancing, but as with previous generations of cards, I MUCH prefer fluid and responsive gameplay over and above any graphical IQ level or setting. Just like I quite happily disabled shadows and AO in games (didn't help that they didn't look good 5+ years ago AND came with relatively larger performance hits) to hit a locked framerate, I'll quite happily do the same with RT.

So, while I find RT benchmarks interesting, they are almost completely irrelevant to me. But that said, I still want to see them as a technical reference point ... in addition to non-RT benchmarks.

Regards,
SB
 
Well it sorta makes sense since RT isn’t supported by many of the cards still in circulation today. Though it would be nice to see the end of ultra settings that make no visual difference.

Isn't that what Hard|OCP basically was trying to do? Reviews of cards at what they considered "playable" settings. And then commentary on settings they turned down that were the least impactful on the visibly rendered output?

I have no idea if they still do those types of reviews since I haven't been there in well over half a decade.

Regards,
SB
 
Isn't that what Hard|OCP basically was trying to do? Reviews of cards at what they considered "playable" settings. And then commentary on settings they turned down that were the least impactful on the visibly rendered output?

I have no idea if they still do those types of reviews since I haven't been there in well over half a decade.

Regards,
SB

They don't do reviews of any type anymore; that site shut down many years ago. Their approach was somewhat useful at the time, but only if you trusted their subjective opinions on IQ settings and their definition of playable.

Regarding VRAM, there are much smarter algorithms now for managing texture memory usage. However, the need to store entire BVHs in VRAM will make quick work of 8GB.
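For a rough feel of the scaling, here's a back-of-envelope with an assumed per-triangle cost (real acceleration-structure sizes vary a lot with vendor, build flags and compaction):

```python
# Back-of-envelope BVH footprint; the per-triangle cost is an assumption, not a
# measured figure for any particular GPU or API.
BYTES_PER_TRIANGLE = 48  # assumed average across nodes + leaf data

def bvh_size_gb(triangle_count):
    return triangle_count * BYTES_PER_TRIANGLE / 1024**3

for tris in (10e6, 50e6, 100e6):
    print(f"{tris / 1e6:.0f}M triangles -> ~{bvh_size_gb(tris):.1f} GB")
# 10M -> ~0.4 GB, 50M -> ~2.2 GB, 100M -> ~4.5 GB
```

Even with generous rounding, geometry-heavy scenes eat a large slice of an 8GB card before textures and render targets get a look in.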
 
Isn't that what Hard|OCP basically was trying to do? Reviews of cards at what they considered "playable" settings. And then commentary on settings they turned down that were the least impactful on the visibly rendered output?

I have no idea if they still do those types of reviews since I haven't been there in well over half a decade.

Regards,
SB
They used to use the term "real world benchmarks" to describe their methodology, and I never quite agreed with it. They started that when the GeForce FX came out and ATi/AMD (can't remember which they were then) was killing them in real benchmarks.
Kyle left HardOCP to go work for Intel's GPU department around the time they hired all the AMD people. He left Intel over a year ago to care for his son; I respect him for knowing his priorities, at least.
 
I don't think you have to be concerned with that. Future games will use virtual texturing / Sampler Feedback Streaming to get crisp texture detail with a low VRAM budget.

The Matrix Demo uses around 7 GB at 4K.
I will believe it when I see it. :)
True, but I'd take the wait-and-see route and see what happens when we truly leave cross-gen. 8GB would perhaps be the minimum, but not totally worthless either. The above poster was before me.
Anyway, current-gen titles, and console ports to PC in particular, aren't the best way to gauge whether 8GB is too small for VRAM or not. Obviously, shopping for a GPU now I'd go with >12.
Well, I look forward to seeing how the Avatar game will look on 8 GB. It also will supposedly be built upon ray tracing. If it produces okay textures, then I'm okay. I'm simply too traumatized by last-gen games which had super-low-res, ugly textures at anything below their maximum texture setting, like RDR 2, Cyberpunk and AC Origins. Some games are an exception to the rule, like Doom Eternal, but sadly, an exception. To me, last gen always felt like dropping texture quality even one notch degraded the textures so badly that the lower settings might as well not have existed.
 
Also, people should keep in mind that games have to be developed with the lowest common denominator in mind: the Series S. Based on my findings, the Series S runs texture settings that are similar to what a 4 GB PC GPU offers. That doesn't apply to PS5 ports of course.

So very likely, if you adjust texture quality settings accordingly, 8 GB and 6 GB will be enough for the whole generation.
 