NVidia Ada Speculation, Rumours and Discussion

Status
Not open for further replies.
Double ROPs double confirmed. What would be the point of that? UE5 hardly even uses them :)

You’ve been running various games through Nsight, do any of them have very high ROP usage?

DL2 is probably the most impressive implementation of open world RT so far. The softness and subtlety of the GI makes a huge difference in those scenes. Are there games that have achieved similar IQ without RT?
With baked lighting which is still fine for what current games are doing IMO.
 
With baked lighting which is still fine for what current games are doing IMO.
Yes, baked lighting is fine until you try the RT version of the same scene. When you go back to the faked lighting, it doesn't cut it anymore and you realize how much you were missing.
 
You’ve been running various games through Nsight, do any of them have very high ROP usage?

Not really. There are depth/gbuffer/post-processing passes that stress the ROPs and VRAM but they’re usually extremely quick. Speeding them up further won’t reduce overall frametimes much. ROPs consume a ton of write bandwidth, maybe it’s a cheap option for taking advantage of the new L2. Still a weird move given performance will likely be dominated by RT and shading in upcoming games.
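The write-bandwidth point is easy to sanity-check with a back-of-envelope calculation. Every number below (resolution, render-target count, overdraw) is an assumption for illustration, not measured from any game:

```python
# Rough ROP write bandwidth for a hypothetical G-buffer pass (all numbers assumed).
width, height = 3840, 2160     # 4K render target
bytes_per_pixel = 4 * 4        # four RGBA8 targets -> 16 bytes written per pixel (assumed)
overdraw = 1.5                 # average fragments written per pixel (assumed)
fps = 60

gb_per_sec = width * height * bytes_per_pixel * overdraw * fps / 1e9
print(f"~{gb_per_sec:.1f} GB/s of ROP writes for the G-buffer alone")
```

That's roughly 12 GB/s for a single pass at these assumed numbers; add depth, shadow maps and post-processing and ROP write traffic becomes a meaningful slice of VRAM bandwidth, which is where a large L2 could plausibly absorb some of it.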

With baked lighting which is still fine for what current games are doing IMO.

Meh, baked lighting is a cop out. You can always claim you don't need RT if you're happy with static lighting. The more valid question is what other games with dynamic lighting approach the same level of quality.
 
Not really. There are depth/gbuffer/post-processing passes that stress the ROPs and VRAM but they’re usually extremely quick. Speeding them up further won’t reduce overall frametimes much. ROPs consume a ton of write bandwidth, maybe it’s a cheap option for taking advantage of the new L2. Still a weird move given performance will likely be dominated by RT and shading in upcoming games.



Meh, baked lighting is a cop out. You can always claim you don't need RT if you're happy with static lighting. The more valid question is what other games with dynamic lighting approach the same level of quality.
Curious choice then. Nvidia must have done it for a reason, assuming it's true.

I don’t consider it a copout since the only thing games are doing with dynamic lighting is a useless day/night cycle. That can be done with various bakes a la AC Unity. What legitimate use have games made of dynamic lighting?
 
Maybe consider the implications of the ROP-per-GPC change with respect to the rest of the line, given how they seem to approach design? While AD102 greatly increases the number of GPCs over Ampere and would have gained more ROPs regardless, the rest of the line has the same or even fewer GPCs (AD104 at 5 vs GA104 at 6). Maybe also not-yet-announced future Tegra SoCs with only 1 GPC would now have 32 ROPs instead of 16.
 
I don’t consider it a copout since the only thing games are doing with dynamic lighting is a useless day/night cycle. That can be done with various bakes a la AC Unity. What legitimate use have games made of dynamic lighting?

If you think dynamic lighting is useless in games then this whole conversation is a bit pointless, isn't it? Have you never played Splinter Cell, or any game with lights that turn on and off, e.g. a flashlight? I'm not sure if you're joking...

Maybe consider the implications of the ROP-per-GPC change with respect to the rest of the line, given how they seem to approach design? While AD102 greatly increases the number of GPCs over Ampere and would have gained more ROPs regardless, the rest of the line has the same or even fewer GPCs (AD104 at 5 vs GA104 at 6). Maybe also not-yet-announced future Tegra SoCs with only 1 GPC would now have 32 ROPs instead of 16.

Yeah that's a good point. 32 ROPs per GPC may be overkill for AD102 but fine for other products. One thing I did notice is that ROPs get completely slammed at 8K. Maybe Nvidia is still going to push that nonsense :)
 
This is a pretty weak argument given we've seen almost nothing of actual next gen games so far. You're literally naming cross-gen titles here, and I'd also add that even of those two games you specifically mention, they do NOT actually look 'rather close' to how they do on the old generation hardware. They are way better looking on the proper XSX/PS5 machines. I also disagree completely that these games having ray tracing would substantially improve their visuals. Their lighting systems are already very good and both do a good job in terms of small scale shadows and ambient occlusion and whatnot to provide a good sense of depth and placement in scenes for the huge density of objects in them. Ray tracing may be able to refine these aspects, but it wouldn't transform the visuals by any means. I mean, these games both look better as a whole than basically all of the other 'last gen but now with ray tracing' cross gen games we've seen.

I mean, heck, most of us didn't even expect the new consoles to have ray tracing hardware whatsoever before their specs were announced. Any reasonable person would not have then surmised that, "Oh, without ray tracing acceleration there's not actually gonna be any significant room for improvement in graphics this generation". Maybe some clueless Nvidia fanboys might have thought that, but there would otherwise still be the expectation of a meaningful leap in the graphics and ambitions of games on much more powerful hardware, even if just in more predictable areas (though something like Nanite does prove there are still other revolutions possible).

Good developers will be able to make incredible looking next gen games on XSX/PS5 without ray tracing. The improvements achieved using RT just aren't always going to be worth the compromises that will need to be made elsewhere to make room for them. Again, if there weren't still big improvements possible elsewhere then sure, ray tracing would be a no-brainer use of the power overhead of the new consoles for almost any game, but this isn't the case, and the consoles are still limited, fixed-spec machines that also happen to not actually be very good at ray tracing. It basically cannot be the defining aspect of this upcoming generation. Not to say it won't be common, just that it's not necessarily gonna be the main wow factor in next gen titles, and not using it will not mean 'irrelevance' by any means.

Look at Avatar, which is aimed at current gen consoles and PC. Going by your assumptions that would mean the ray tracing hardware in the consoles would be sitting largely unused, which isn't the case now and won't be the case going forward either. Yes, it's weaksauce RT, but it's something, and certainly better than nothing. As many say, ray tracing is the true game changer this gen, and it shows when utilized well enough (even on consoles).

Studios can and will make incredible looking games on XSX/PC and even PS5 native games; then imagine what they can scale up to on capable PC GPUs, as per the topic, RTX 4000.
That doesn't take away that RT is a viable solution for e.g. lighting, not just to reduce dev time but also to largely increase fidelity.
 
This is a pretty weak argument given we've seen almost nothing of actual next gen games so far. You're literally naming cross-gen titles here, and I'd also add that even of those two games you specifically mention, they do NOT actually look 'rather close' to how they do on the old generation hardware.
I'm naming cross gen titles because that's the point - without RT being used as a base design feature you have these games where the best improvements over last gen are resolution and geometry. And yes, they do look really close between old and new gen, especially to an untrained eye. RT changes that completely as it has a very visible impact on how a game may look.

Good developers will be able to make incredible looking next gen games on XSX/PS5 without ray tracing.
Sure, but why? Do you know of much console h/w that wasn't used in games for that h/w?
This feels like a mantra of sorts - "they won't use it... they won't..."
They will. Simply because it's there, and console s/w tends to use whatever h/w there is.
The examples I gave are not using it not because it's too slow (HFW is a 30 fps game without using RT, btw) but because the games were designed for previous gen h/w.
If we go back to UE5, which will supposedly always (although I doubt that) have a s/w Lumen option - why wouldn't a console game use RT h/w in it for reflections, for example? Why wouldn't it use it for everything, in fact, considering that from what we've seen h/w Lumen seems to be faster than s/w Lumen at the same quality level?
Once we're past cross gen and games are made for new console h/w from scratch, the vast majority of them will try to use RT in some way or form. I expect a huge share of such games to actually drop support for non-RT versions of those same effects simply because they can - it will save time (money) and will likely improve graphics at the same time.
 
Not really. There are depth/gbuffer/post-processing passes that stress the ROPs and VRAM but they’re usually extremely quick. Speeding them up further won’t reduce overall frametimes much. ROPs consume a ton of write bandwidth, maybe it’s a cheap option for taking advantage of the new L2. Still a weird move given performance will likely be dominated by RT and shading in upcoming games.

+
maybe it's to make the GPU easier to cool by having some more dark silicon. Heat density is increasing substantially by going to 4 nm.
 
If you think dynamic lighting is useless in games then this whole conversation is a bit pointless, isn't it? Have you never played Splinter Cell, or any game with lights that turn on and off, e.g. a flashlight? I'm not sure if you're joking...

Yeah that's a good point. 32 ROPs per GPC may be overkill for AD102 but fine for other products. One thing I did notice is that ROPs get completely slammed at 8K. Maybe Nvidia is still going to push that nonsense :)

One perhaps not so common use case (but common enough!) for needing the ROPs is VR. Render resolutions with supersampling can hit 8K per eye, and the games tend to be pretty light on shading/compute. It's also the one scenario where MSAA is still around, even in new titles.

Even the AAA+ title that gets the most play time still has MSAA:

https://petrakeas.medium.com/half-l...settings-produce-a-sharper-image-4d17fb8c19bb
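For a sense of scale, here's a rough sample-count calculation for a supersampled VR target with MSAA. The per-eye resolution, render scale, and MSAA level are all assumptions for illustration, not any specific headset or game:

```python
# Rough pixel/sample counts for a supersampled VR render target (numbers assumed).
per_eye_w, per_eye_h = 2448, 2448   # assumed panel resolution per eye
render_scale = 1.5                  # 150% supersampling per axis (assumed)
msaa = 4                            # 4x MSAA multiplies color/depth samples

w, h = int(per_eye_w * render_scale), int(per_eye_h * render_scale)
samples = w * h * msaa * 2          # both eyes
print(f"{w}x{h} per eye -> {samples / 1e6:.0f} M samples per frame")
```

Over a hundred million samples per frame at these assumed settings, refreshed at 90 Hz, is exactly the kind of fill-rate-bound workload that extra ROPs would soak up.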
 
I'm naming cross gen titles because that's the point - without RT being used as a base design feature you have these games where the best improvements over last gen are resolution and geometry. And yes, they do look really close between old and new gen, especially to an untrained eye. RT changes that completely as it has a very visible impact on how a game may look.
Forza Horizon 5 and Horizon Forbidden West on XSX/PS5 do not look 'really close' to their last gen counterparts. You need this to be true or else your argument kind of falls apart, but it's just not.

Sure, but why? Do you know of much console h/w that wasn't used in games for that h/w?
This feels like a mantra of sorts - "they won't use it... they won't..."
They will. Simply because it's there, and console s/w tends to use whatever h/w there is.
The examples I gave are not using it not because it's too slow (HFW is a 30 fps game without using RT, btw) but because the games were designed for previous gen h/w.
If we go back to UE5, which will supposedly always (although I doubt that) have a s/w Lumen option - why wouldn't a console game use RT h/w in it for reflections, for example? Why wouldn't it use it for everything, in fact, considering that from what we've seen h/w Lumen seems to be faster than s/w Lumen at the same quality level?
Once we're past cross gen and games are made for new console h/w from scratch, the vast majority of them will try to use RT in some way or form. I expect a huge share of such games to actually drop support for non-RT versions of those same effects simply because they can - it will save time (money) and will likely improve graphics at the same time.
I very literally explained quite specifically WHY a developer might choose to not use ray tracing in their game. Several times in fact. Not gonna keep repeating myself if you're not gonna read what I'm actually saying.

Another sign you're not reading what I'm saying is this interpretation that I'm saying developers won't use RT. I very literally said that devs will use it and that it will even be reasonably common.

This is getting pointless cuz you're not actually discussing this in good faith whatsoever anymore.
 
Look at Avatar, which is aimed at current gen consoles and PC. Going by your assumptions that would mean the ray tracing hardware in the consoles would be sitting largely unused, which isn't the case now and won't be the case going forward either. Yes, it's weaksauce RT, but it's something, and certainly better than nothing. As many say, ray tracing is the true game changer this gen, and it shows when utilized well enough (even on consoles).

Studios can and will make incredible looking games on XSX/PC and even PS5 native games; then imagine what they can scale up to on capable PC GPUs, as per the topic, RTX 4000.
That doesn't take away that RT is a viable solution for e.g. lighting, not just to reduce dev time but also to largely increase fidelity.
Ray tracing hardware going unused in an RDNA2 GPU is not a big deal whatsoever. It's not like Nvidia hardware, where they've dedicated a notable transistor count to specific ray tracing functionality. AMD basically just dual-purposed the existing TMUs with some extra functionality.

And no, *some* ray tracing is not inherently better than nothing when you have fixed spec, limited hardware. Using that ray tracing means you have to eat a large chunk of the graphics budget. It's an opportunity cost factor, meaning you're going to have to limit improvements in some other area(s) if you use it.

And again, since you and Degustator keep twisting my comments - I am not saying devs will not use ray tracing. Obviously they will. We've seen it already. It can even be quite nice in the occasional title. But we can also see that these last gen games with XSX/PS5 updates that have added ray tracing still look like last gen games for the most part. Ray tracing isn't inherently some completely transformative graphics enhancement and there are lots of other areas of the graphics that can still be pushed much harder to provide 'next gen' visuals, or even the same areas just using cheaper non-RT solutions.

Again, if these consoles didn't have hardware-accelerated ray tracing at all (as we mostly all originally anticipated), they would still be capable of very significant leaps in graphics that we'd define as clearly next-gen.
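The opportunity-cost argument being made here can be sketched as simple frame-budget arithmetic. Every pass cost below is invented for illustration; real numbers vary wildly per game and platform:

```python
# Hypothetical 30 fps frame budget: ms spent on an RT pass is ms unavailable elsewhere.
frame_budget_ms = 1000 / 30           # ~33.3 ms per frame at 30 fps
pass_cost_ms = {                      # all costs invented for illustration
    "geometry + shadow maps": 9.0,
    "material shading":       11.0,
    "post-processing":        4.0,
    "RT reflections":         6.0,    # assumed cost of one RT pass on console
}
used = sum(pass_cost_ms.values())
print(f"used {used:.1f} of {frame_budget_ms:.1f} ms, "
      f"{frame_budget_ms - used:.1f} ms headroom left")
```

Drop the assumed RT pass and those 6 ms could instead buy denser geometry or higher resolution, which is the trade-off being described.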
 
Again, if these consoles didn't have hardware-accelerated ray tracing at all (as we mostly all originally anticipated), they would still be capable of very significant leaps in graphics that we'd define as clearly next-gen.

That's true when compared to last generation consoles. Without RT, though, the current generation consoles wouldn't bring much new to the table for PC folks. The shader/texture/memory performance required to render the latest console games has been available on PC for a long time. The main benefit of RT adoption on consoles is to free developers from being stuck supporting old rendering methods, which will hopefully unlock the potential of RT in cross platform PC titles going forward.

Having said that there's obvious benefit to having faster consoles. UE5 probably wouldn't be what it is today without console hardware that can run it.
 
Ray tracing hardware going unused in an RDNA2 GPU is not a big deal whatsoever. It's not like Nvidia hardware, where they've dedicated a notable transistor count to specific ray tracing functionality. AMD basically just dual-purposed the existing TMUs with some extra functionality.

And no, *some* ray tracing is not inherently better than nothing when you have fixed spec, limited hardware. Using that ray tracing means you have to eat a large chunk of the graphics budget. It's an opportunity cost factor, meaning you're going to have to limit improvements in some other area(s) if you use it.

And again, since you and Degustator keep twisting my comments - I am not saying devs will not use ray tracing. Obviously they will. We've seen it already. It can even be quite nice in the occasional title. But we can also see that these last gen games with XSX/PS5 updates that have added ray tracing still look like last gen games for the most part. Ray tracing isn't inherently some completely transformative graphics enhancement and there are lots of other areas of the graphics that can still be pushed much harder to provide 'next gen' visuals, or even the same areas just using cheaper non-RT solutions.

Again, if these consoles didn't have hardware-accelerated ray tracing at all (as we mostly all originally anticipated), they would still be capable of very significant leaps in graphics that we'd define as clearly next-gen.

It would be a big deal if it went largely unused in the consoles. It's being advertised as a game-changing feature in electronics stores and other ads. It's something console makers have been boasting about.
It is a departure into new technology in what we see on-screen. Being stuck with just rasterization would be very 'meh'. It's weaksauce on consoles but it certainly still has use cases.
And you have to start somewhere; next generation consoles will be more capable in the RT department. It's like the pixel/vertex shaders 20 years ago.

HFW looks quite good on the PS4; it's not a generational leap going to the PS5 version. No game has shown that 'true generational leap', as DF has noted. UE5 tech demos do, but that's using h/w ray tracing...
 
Something like TLOU 2 holds up rather well.
It is a gorgeous game, sure, but it's also a 6-year development cycle and a $100+ million budget. RT doesn't mean that any indie studio could produce something like that, of course - a lot of that budget is motion capture and voice acting too - but what it can certainly help with is reducing the art department cost required to craft that lighting, and especially to constantly update it during development as art direction changes for a particular scene over the course of the project.
 