Next gen lighting technologies - voxelised, traced, and everything else *spawn*

There is a difference between the developer not implementing mGPU, and the developer wanting to implement it but being blocked by the API. After using DX12 for the last two years, I have a basic enough understanding of the API, so I was making a case for RTX mGPU being possibly blocked outside the control of the game's developer.
 
DX12 can access all GPUs and, in theory, all resources on all GPUs.

It's just that driver-forced multi-GPU isn't a thing in DX12. Multi-GPU support has to be implemented explicitly, at the application level, by the application itself.
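To make "application level" concrete, here's a minimal sketch (mine, not from any shipping title; the function name is made up) of the explicit multi-adapter model in D3D12: the application enumerates the adapters itself, creates one device per GPU, and can query DXR support on each device independently. Nothing in the runtime or driver distributes work across them for you.

```cpp
// Minimal sketch of D3D12 explicit multi-adapter: enumerate every hardware
// adapter, create a device on each, and check its DXR tier. Everything beyond
// this point (splitting work, copying resources, syncing) is the app's job.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device5>> CreateDevicesOnAllAdapters()  // illustrative name
{
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return {};

    std::vector<ComPtr<ID3D12Device5>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP / software adapters

        // One independent device per physical GPU.
        ComPtr<ID3D12Device5> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // DXR capability is reported per device, so an explicit-mGPU title can
        // see the ray tracing tier of every GPU it created a device on.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            devices.push_back(device);
    }
    return devices;
}
```

Nothing in that path hides a second GPU's RT capability from the application; what the API doesn't give you is any automatic splitting of BVH builds or DispatchRays calls across devices, so that scheduling, plus cross-adapter copies, is entirely on the game.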
Yeah, that's basically what I was trying to say. When I said "these games can't communicate with the 2nd GPU", I was referring to these specific RT-enabled games, since they don't support mGPU, not that no game can access a second GPU. Badly worded.

So I was making a case for RTX mGPU being possibly blocked outside the control of the game's developer.
You've gone from "intentionally beheaded" to "possibly blocked". So can you elaborate on what about DX12 makes you think a game that is developed to support mGPU in DX12 wouldn't be able to access the RT cores on both GPUs? And who was intentionally beheading RTX in mGPU and why?
 
Since we're on the topic of explicit mGPU: Shadow of the Tomb Raider supports mGPU and scales very well on DX12. That being said, the status of their DXR patch is still very much unknown. It might be that they are attempting explicit mGPU with DXR, since they already support explicit mGPU really well.

On a side note, I'm unsure why everyone gave this game such a hard time. It's quite a graphics tour de force, and the scenes can be fairly complex.
 
[attached screenshot: D0SD2qwX4AEsUge.jpg]
 
If the author there makes a point of saying that it is a labour-intensive, faked system for GI that only works in specific scenes - what is the purpose of pointing it out as an alternative or a relevant performance metric?
The purpose, of course, is to perpetuate this fantasy that NVIDIA is scamming us all, and that the perfect, fantastical RT compute solution is right around the corner to save us all from evil NVIDIA.

It's no problem to spam this thread with useless videos that don't really describe what algorithms are used to generate the RT, what the limitations are, or even offer proof of the hardware and fps. I could make a video tomorrow claiming I achieved 300 fps with my hand-made RT demo on a GT 1030, and the compute folks would take it at face value and be none the wiser.

It's really getting tiring and ridiculous.
 
If the author there makes a point of saying that it is a labour-intensive, faked system for GI that only works in specific scenes - what is the purpose of pointing it out as an alternative or a relevant performance metric?
The point is pretty simple: counter the narrative (which is in part full of horseshit coming from a certain company, as seen in the screenshot I posted above) that you need to do things a certain way to reach a certain goal (similar to irobot's post). Which is obviously the whole narrative of this particular thread... Maybe you could enlighten us on how this is not an alternative for certain use cases? I may have missed the memo which says we should all bow to Nvidia's holy solutions... or maybe we are not on ResetEra/Gaf...
 
The purpose, of course, is to perpetuate this fantasy that NVIDIA is scamming us all, and that the perfect, fantastical RT compute solution is right around the corner to save us all from evil NVIDIA.
I think the purpose of the post (and the video in part) was to highlight the scamming from Nvidia on what an RTX "off" scene looks like, which was of course laughed about during and after their Turing release conference.

It's really getting tiring and ridiculous.
So are the Nvidia-are-gods posts.
 
If the author there makes a point of saying that it is a labour-intensive, faked system for GI that only works in specific scenes - what is the purpose of pointing it out as an alternative or a relevant performance metric?
I think for a balanced discussion it's important to showcase as much as possible before dismissing any of it.
That being said, there is a stark difference between a demo like the ones posted, and full implementation into a complex AAA title.

Of which DXR has been successfully bolted onto two titles so far, with a couple more expected later this year.
I don't know where these compute solutions will go, or whether they will surface into anything more significant, but it's good to keep tabs on what can be done outside the realm of RTX.

But obviously I think it's fair to say that we're not comparing this tech demo and its performance to what we see from Nvidia RTX cards in BFV and Metro. I mean, let's face it, Metro and BFV without RT already draw away tons of the available graphics performance; it's questionable how much performance is left on these cards for RT, which is why I still think hardware-accelerated RT is the way to go right now, at least for a generation or two.

At the core of the debate there is performance and there is labour cost. The second usually gets omitted in the discussion, but I believe it is at least as big a driver as the first, if not bigger.
 
I think the purpose of the post (and the video in part) was to highlight the scamming from Nvidia on what an RTX "off" scene looks like, which was of course laughed about during and after their Turing release conference.
What scamming?
So are the Nvidia-are-gods posts.
Where are those?
counter the narrative (which is in part full of horseshit coming from a certain company, as seen in the screenshot I posted above) that you need to do things a certain way to reach a certain goal (similar to irobot's post).
NVIDIA never made the claim that their way is the only way; they just stated that their way is the faster way RIGHT NOW, and the most usable way in games. All those unknown videos disprove absolutely NOTHING about NVIDIA's claim. In fact they do the opposite: they showcase how all compute implementations right now are lackluster and unusable, with tons of limitations.
 
Image Based Lighting has been around for quite a while and would be widely used by game studios if it were a viable solution.
 
I think the purpose of the post (and the video in part) was to highlight the scamming from Nvidia on what an RTX "off" scene looks like, which was of course laughed about during and after their Turing release conference.

Metro looks exactly like "RTX Off"...
 
Metro looks exactly like "RTX Off"...

Metro also has some of the worst non-RT lighting of any of the latest big games. This was surprising to me as in their past games they paid a lot of attention to lighting. It's almost as if it was done to better highlight their RT implementation. Either that or they spent so much time trying to get RT right that they didn't spend as much time on the RT off lighting.

Hence why, in a previous comment of mine, I mentioned that without side-by-side comparisons of Metro: Exodus against itself (i.e. when instead comparing it to other AAA games with good lighting implementations), it was sometimes difficult to see how RT in Metro improved things. However, at other times, in specific instances, it was noticeable.

Regards,
SB
 
Metro also has some of the worst non-RT lighting of any of the latest big games
Have you played the game?
The game has some of the best baked GI lighting ever conceived, better than most of the best games out there; it simply looks gorgeous without RTX.

However, what RTX does is correct the limitations of any baked GI solution, specifically in relation to dynamic, moving objects; that's where most of the difference comes from, and that's the biggest difference captured in any comparison screenshots. If you are having a hard time detecting RTX, it's because you are mostly looking at static objects; once stuff moves, the difference becomes stark.

RT also irons out the inevitable locations left untouched by the baked GI, of which there are several in any open world kind of game.
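A deliberately simple toy (my own sketch, nothing to do with Metro's actual engine) of why baked data goes stale on dynamic objects while a per-frame trace doesn't: the "bake" evaluates visibility once against the static scene, then the occluder moves at runtime and only the traced query notices.

```cpp
// Toy 1D example: baked visibility is frozen at bake time, traced visibility
// is re-evaluated every frame against the geometry as it is *now*.
#include <cstdio>

struct Box { float minX, maxX; };  // a 1D "occluder"

// Does the segment [a, b] on the x-axis pass through the box?
bool Occluded(float a, float b, const Box& box)
{
    const float lo = a < b ? a : b;
    const float hi = a < b ? b : a;
    return hi > box.minX && lo < box.maxX;
}

int main()
{
    const float shadePoint = 0.0f, lightPos = 10.0f;
    Box crate = {4.0f, 5.0f};  // where the crate sat when the level was baked

    // "Bake": visibility computed once, against the static scene only.
    const bool bakedLit = !Occluded(shadePoint, lightPos, crate);

    // At runtime the crate moves out of the light path (a dynamic object).
    crate = {20.0f, 21.0f};

    // "Trace": visibility recomputed this frame, against current geometry.
    const bool tracedLit = !Occluded(shadePoint, lightPos, crate);

    std::printf("baked says lit: %d, traced says lit: %d\n", bakedLit, tracedLit);
    // -> baked says lit: 0, traced says lit: 1 (the baked answer is stale)
    return 0;
}
```

Scale that up from one shadow test to full GI with thousands of moving objects and you get the kind of differences the comparison screenshots capture around anything that moves.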
 