Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Which is all Milk is saying, and he ain't wrong. ;)

Thanks for the summarization.
That is indeed what I'm saying. And I was originally going further.
Under the premise that the features supported by consoles end up becoming the baseline that defines what future games and engines focus on (a premise I thought was broadly accepted), it would actually be strategically advantageous for AMD to NOT support HW RT acceleration on PS5 and NextBOX.
>It would let them focus on more core features of their architecture.
>It would effectively sabotage, to some extent, the popularity of Nvidia-friendly approaches.
>It would divert dev focus from specific fixed-function features towards more generalized compute strategies, which AMD has the upper hand on.


And finally, assuming AMD expects Nvidia to keep pumping money into real-time RT now that they've committed to it and want to milk it:
>Sabotaging RT would leave their competitor pumping money into a marginal, niche feature.
>It would let AMD jump into RT hardware acceleration once it is more mature (matured under Nvidia's investments).

Because of all that, I said, if I were AMD, I'd not be in a rush to implement hardware features specific for accelerating RT in an RTX or DXR fashion. I'd be planning that for PS6/XB5 generation hardware.
 
Interesting theory, but I doubt it'd work. Raytracing is the future*, both in games and, more lucratively, in professional imaging. For AMD to stay away from raytracing to try and sabotage nVidia's efforts would only see nVidia gain industry prevalence. Just as AMD have Radeon Rays for the purposes of accelerating raytracing, they'll have investments in raytracing acceleration. Getting devs using those ideas in consoles will benefit all the other apps where it really matters.

* caveat, as ever, it's the results, and not the method, that's the future. Raytracing might be proven a totally inefficient dead-end!
 
Yeah, all my speculation is centered on RT for gaming.
The PRO market is a different beast, but just adding driver support for DXR through brute-force compute might not be that terrible for that market segment.
I'm taking AMD's much lower ability to invest in R&D compared to Nvidia into consideration here. They really have to choose what to focus on carefully.
 
I'm no PS4 player, but the games looked as shown from what I've seen.
BFV or Metro, however, do not look like the Star Wars and robot demos :(


They did not release their voxel GI, arguing it would be too slow on all platforms. So no games.


Your crystal ball is the only fantasy thing here. Wait and see.
1) I'm still waiting for the 1080p@60fps Uncharted 4 version. I'm sure it's dropping any day now :LOL:

2) Nothing stops other developers from implementing it if so they choose...

3) Predicting no RT hardware in next-gen consoles: allowed. Predicting RT hardware in next-gen consoles: not allowed.

UE4 never implemented voxel cone tracing... that's why no "games ended up using such technology". You can't use something that doesn't exist. Epic was literally high on fumes when UE4 was unveiled (they saw Cyril Crassin's paper on the subject a few months prior and thought that this would be it). The SVOGI shown when UE4 was unveiled was quickly scrapped because it was simply unusable on the hardware available at the time. UE4's development was a total mess and Epic was totally lost. This was before Tencent came to save the day, and way before they decided to give the engine away for free in a last Hail Mary attempt, which worked to a certain degree. It's the same thing they did with Fortnite, which was a total flop with a disastrous development cycle: they aped PUBG, made it F2P, and this time raked in billions.
You prove my point. They demoed SVOGI but because the performance wasn't there it was scrapped. Without RT hardware the performance isn't there...

Yeah, all my speculation is centered on RT for gaming.
The PRO market is a different beast, but just adding driver support for DXR through brute-force compute might not be that terrible for that market segment.
I'm taking AMD's much lower ability to invest in R&D compared to Nvidia into consideration here. They really have to choose what to focus on carefully.
AMD better not screw this up...
 
You prove my point. They demoed SVOGI but because the performance wasn't there it was scrapped. Without RT hardware the performance isn't there...

The performance would still not be there if PS4/XBONE level hardware were to suddenly have RT cores in them.
 
You don't know that with next-gen compute. I guess it all depends on what people mean by 'ray tracing hardware'. It is a pretty silly argument over whether consoles have a hardware feature or not when no-one can even say what exactly that hardware is. ;)
I agree, we can't know. BUT it's definitely better to have some sort of hardware that helps. It's like saying that, with enough compute power, we could simulate pixel shaders instead of handling them in hardware: of course we could get a software solution working, but it's obviously better to have a hardware solution, so that a really big chunk of processing power isn't used up just for that.

Now we know that some sorts of hardware can vastly improve RTRT, so we should expect just that (not necessarily RTX, but any other working HW formula that may be even better), not using only compute just because there will be plenty of power.
 
The performance would still not be there if PS4/XBONE level hardware were to suddenly have RT cores in them.
Not worthy of next-gen ray tracing, no.

You don't know that with next-gen compute. I guess it all depends on what people mean by 'ray tracing hardware'. It is a pretty silly argument over whether consoles have a hardware feature or not when no-one can even say what exactly that hardware is. ;)
Do you expect next-gen consoles to have vastly more compute power than the Titan V, which struggles to run ray tracing?

I agree, we can't know. BUT it's definitely better to have some sort of hardware that helps. It's like saying that, with enough compute power, we could simulate pixel shaders instead of handling them in hardware: of course we could get a software solution working, but it's obviously better to have a hardware solution, so that a really big chunk of processing power isn't used up just for that.

Now we know that some sorts of hardware can vastly improve RTRT, so we should expect just that (not necessarily RTX, but any other working HW formula that may be even better), not using only compute just because there will be plenty of power.
Indeed. NVIDIA tried a fully programmable rasterization pipeline a few years ago and predictably it was far slower than its hardware accelerated counterpart.
 
Do you expect next-gen consoles to have vastly more compute power than the Titan V, which struggles to run ray tracing?
It's not about power but how they operate. If compute shaders can be made to perform BVH intersection tests well, RT can be done on compute; that's the only part RTX is accelerating. Or, alternatively, perhaps compute won't accelerate the intersection tests but other aspects of the resolve will be accelerated, meaning the final result is just as good.
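As a concrete illustration of what "BVH intersection tests" means here, below is a minimal C++ sketch of the ray/AABB slab test that sits in the inner loop of BVH traversal. This is roughly the per-ray, per-node work a compute-shader fallback would run, and the kind of work dedicated RT units offload. It is illustrative only, not taken from any particular engine or API.

```cpp
#include <algorithm>
#include <cstdio>
#include <utility>

// Ray/AABB "slab" intersection test -- the core operation a BVH traversal
// performs for every node it visits, whether on RT cores or in a compute shader.
bool rayAABB(const float o[3], const float invD[3],
             const float bmin[3], const float bmax[3],
             float tMin, float tMax)
{
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (bmin[axis] - o[axis]) * invD[axis];   // entry distance on this axis
        float t1 = (bmax[axis] - o[axis]) * invD[axis];   // exit distance on this axis
        if (t0 > t1) std::swap(t0, t1);                   // ray points in the -axis direction
        tMin = std::max(tMin, t0);
        tMax = std::min(tMax, t1);
        if (tMax < tMin) return false;                    // slab intervals no longer overlap: miss
    }
    return true;
}

int main()
{
    // A ray starting in front of a unit box and pointing roughly at it.
    const float origin[3] = { 0.0f, 0.0f, -5.0f };
    const float dir[3]    = { 0.1f, 0.05f, 1.0f };
    const float invD[3]   = { 1.0f / dir[0], 1.0f / dir[1], 1.0f / dir[2] };
    const float bmin[3]   = { -1.0f, -1.0f, -1.0f };
    const float bmax[3]   = {  1.0f,  1.0f,  1.0f };
    std::printf("hit = %d\n", rayAABB(origin, invD, bmin, bmax, 0.0f, 1e30f) ? 1 : 0);
    return 0;
}
```

Running millions of these tests per frame in plain compute is exactly the "brute-force" path discussed above; the debate is whether that, plus clever resolve work, gets close enough to fixed-function traversal.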

When AMD introduced unified shaders, nVidia said they were slower than discrete shaders, which is why they didn't have them in the 7800. Now we all use unified shaders, and indeed their evolution into compute...
 
Do you expect next-gen consoles to have vastly more compute power than the Titan V, which struggles to run ray tracing?

The Titan V struggles at emulating a high-level implementation of ray tracing on PC, through a Windows 10 API so generic it probably misses dozens of optimisation opportunities it would have if DICE were building a game from the ground up with it as a baseline, and it probably also forces an approach that is less optimal than the one the developers would have chosen themselves.
 
The Titan V struggles at emulating a high-level implementation of ray tracing on PC, through a Windows 10 API so generic it probably misses dozens of optimisation opportunities it would have if DICE were building a game from the ground up with it as a baseline, and it probably also forces an approach that is less optimal than the one the developers would have chosen themselves.
That's a very interesting point, worth researching.

However, if next gen consoles don't have specialised RT hardware, let's see if they are powerful enough to have both RTRT and a really noticeable improvement in other graphical aspects. I truly hope we can get the easiest, most efficient solution. I don't really care if it's hardware or not as long as the results are the best, both in graphics and performance. It's just that unless we're proven otherwise, right now the only facts we're seeing (and playing) are significant improvements with a (niche) hardware solution. :???:
 
One day, maybe RTX hardware will not be a niche, but the argument that even now, it's not a niche, is utterly ridiculous.
That's why I added DXR and DX11 to the comparison. Any new tech starts out niche; that's not something worth reminding everyone about, or worth portraying as a massive factor in its adoption and spread into games. DX11 needed time to become widespread, and so does DXR.
You can deny this all you want. Consoles set the baseline.
Nope, maybe they do now, but in the future I highly doubt that, especially if they continue to be such middling and weak hardware that is outdated the moment it launches.
it only became popular when...
...drum roll...
...consoles with DX11 level HW features launched.
That's not correct. Crysis 2, Metro 2033, Battlefield 3, Arkham City, Deus Ex: Human Revolution, Dragon Age 2, Dirt 2/3, Far Cry 3, Max Payne 3, Hitman Absolution, Battlefield 4, Metro Last Light, Call of Duty Ghosts, Crysis 3, Arkham Origins, and many other visually striking games were released on PC using advanced DX11 features BEFORE the release of DX11 console hardware. I am sorry, but you are misleading yourself with the 'consoles lead' misconception. PC was always in the lead. Always will be, in fact.
because they had the money to do it. Still, even now that modern consoles do support tessellation, it is not a generation defining feature.
Because current consoles are weak, most games are released without tessellation on consoles, only to have it in full force on PC. Metro Exodus is the latest prime example of that.

No. If consoles don't have HW acceleration for RT, RTX will remain the "de facto" niche afterthought, just like AMD's HW tessellation was until consoles actually adopted it, by that time...
AMD was never in a position to push tessellation, especially since they introduced it outside of standard APIs the first time (TruForm), then half-assed it the second time. NVIDIA actually put in the effort and work to make that feature viable, the same way it's doing with RTX.

So unless you think NVIDIA's hardware RT approach is going to be outclassed in both effort and scale by AMD, which is highly unlikely, this point is not even worth discussing.

NVIDIA introduced RTX not inside an isolated bubble but as part of the standard DX implementation, and it involves much greater enhancements than tessellation ever did: DXR covers enhanced shadows, reflections, lighting, materials, etc. Comparing the scope of the two techs isn't even logical.

If consoles don't have RT hardware/implementation, NVIDIA will continue to expand its DXR implementation, driven by their large majority share and mindshare, until the whole market is filled with DXR-capable GPUs. And developers will exploit that, because once more the PC ecosystem is wildly different from consoles.


CUDA is still only a thing in professional software. It is irrelevant for consumer games, which is what I'm discussing.
The principle still applies. And it's not about CUDA alone; I mentioned DX11, VR and AI.
 
It's just that unless we're proven otherwise, right now the only facts we're seeing (and playing) are significant improvements with a (niche) hardware solution. :???:

It's what I meant before: results speak. Seeing is believing, and right now RT HW is the only viable option for modern gaming.
 
>It would let them focus on more core features of their architecture.
>It would effectively sabotage, to some extent, the popularity of Nvidia-friendly approaches.
>It would divert dev focus from specific fixed-function features towards more generalized compute strategies, which AMD has the upper hand on.
It will sabotage their own handle on a crucial tech, and it will rob their PC GPUs of a crucial IQ feature in the long run, handing the IQ advantage to NVIDIA on a silver platter. Years will pass and their GPUs will get treated as third-class citizens, their upper limit being running games at less than max quality. Especially when Intel introduces their version of DXR (which they will). People would buy AMD GPUs knowing they can't run the latest eye candy while their competitor can, which will further destroy AMD's image and trim their presence on PC down to the minimum. It will probably drive them out of the PC market completely and then subsequently out of the console market as well. Imagine ATi not supporting hardware T&L; they would have ended up the same way as 3dfx.

This is the most naive theory I've heard in a while.

When AMD introduced unified shaders, nVidia said they were slower than discrete shaders, which is why they didn't have them in the 7800. Now we all use unified shaders, and indeed their evolution into compute...
NVIDIA was the first to offer unified shaders on PC. Not AMD.
 
Nope, maybe they do now, but in the future I highly doubt that, especially if they continue to be such middling and weak hardware that is outdated the moment it launches.

So you are predicting the trend to shift. Why? The odds are against you. The natural bet is that things will behave as they have before.


That's not correct. Crysis 2, Metro 2033, Battlefield 3, Arkham City, Deus Ex: Human Revolution, Dragon Age 2, Dirt 2/3, Far Cry 3, Max Payne 3, Hitman Absolution, Battlefield 4, Metro Last Light, Call of Duty Ghosts, Crysis 3, Arkham Origins, and many other visually striking games were released on PC using advanced DX11 features BEFORE the release of DX11 console hardware. I am sorry, but you are misleading yourself with the 'consoles lead' misconception. PC was always in the lead. Always will be, in fact.

Yes, a dozen or so titles were built with DX9 as a baseline, and had additional DX11 options for niche ultra features. That's the future I predict for DXR if consoles don't support it.
Now, after PS4/XBONE launched, DX11 became the baseline, and games were built on engine architectures that were only possible under a DX11-level HW paradigm. You can't say the same about the titles you listed.

Because current consoles are weak, most games are released without tessellation on consoles, only to have it in full force on PC. Metro Exodus is the latest prime example of that.

Weak compared to what? To premium high-end cards of their time? What makes you think next-gen consoles won't be like that as well?
I agree with you that just having a hardware feature does not magically make it performant, nor the best use of resources. That's what happened to tessellation. Why would it be different with RT acceleration?

So unless you think NVIDIA's hardware RT approach is going to be outclassed in both effort and scale by AMD, which is highly unlikely, this point is not even worth discussing.

I think it's not worth pursuing for the approaching next-gen consoles. And as such, it will remain an extra "ultra" optional feature, and never a fundamental pillar of a game's engine and content design.


If consoles don't have RT hardware/implementation, NVIDIA will continue to expand its DXR implementation, driven by their large majority share and mindshare, until the whole market is filled with DXR-capable GPUs. And developers will exploit that, because once more the PC ecosystem is wildly different from consoles.

Developers will focus on whatever gives them the most bang for the buck, or rather profit per dev-time. Historically, that choice has always leaned towards making the baseline console version the best-looking they can, and whatever extra sparkle they can add for the PC version is either an afterthought or mostly financed by the IHV that wants to use it for marketing.

That's how it's been for the last two decades (save some exceptions already addressed, that become less frequent every year). I don't see why you think that trend will change, and honestly I feel like you think this will happen because that's what you want to happen. You are not thinking straight. You are wishful-thinking.
 
It will sabotage their own handle on a crucial tech, and it will rob their PC GPUs of a crucial IQ feature in the long run, handing the IQ advantage to NVIDIA on a silver platter. Years will pass and their GPUs will get treated as third-class citizens, their upper limit being running games at less than max quality. Especially when Intel introduces their version of DXR (which they will). People would buy AMD GPUs knowing they can't run the latest eye candy while their competitor can, which will further destroy AMD's image and trim their presence on PC down to the minimum. It will probably drive them out of the PC market completely and then subsequently out of the console market as well. Imagine ATi not supporting hardware T&L; they would have ended up the same way as 3dfx.

This is the most naive theory I've heard in a while.


NVIDIA was the first to offer unified shaders on PC. Not AMD.

Note that in my theory, AMD does not neglect RT. They just don't rush it. They incorporate it when it really is crucial. It isn't yet, and if the upcoming consoles don't support it, it won't be crucial anytime soon.
 
Is Navi's Super-SIMD approach directly more advantageous than GCN for the incorporation of some RT elements?

RT will probably be used on a small scale for specific game elements next gen. Fully RT games will probably be the domain of some experimental indie devs looking to show off what they can do.
 
Note that in my theory, AMD does not neglect RT. They just don't rush it. They incorporate it when it really is crucial. It isn't yet, and if the upcoming consoles don't support it, it won't be crucial anytime soon.
If AMD half-asses RT, then they will surrender both the IQ AND performance advantage to NVIDIA.
Most people here don't understand that AMD is in a true pickle with RT. Having your competitor provide superior IQ is not something you shrug at. It's something you think about solving before your existence is on the line.
Yes, a dozen or so titles were built with DX9 as a baseline, and had additional DX11 options for niche ultra features. That's the future I predict for DXR if consoles don't support it.
Those dozen titles helped sell millions of DX11 GPUs, and they also advanced the IQ front. For DXR to behave the same way as DX11 is really the optimal outcome; it's not something you take lightly.
I think it's not worth pursuing for the approaching next-gen consoles. And as such, it will remain an extra "ultra" optional feature, and never a fundamental pillar of a game's engine and content design.
Never say never. But again, even if it remains an Ultra option: people buy high-end GPUs for that Ultra option, and developers build these Ultra options for the people seeking them, and so that their games look good years after launch. Which means DXR will still get wider adoption.
Weak compared to what? To premium high-end cards of their time? What makes you think next-gen consoles won't be like that as well?
Yes, they were less than middle-class GPUs in the year of their introduction, then less than even that the year after. Heck, a 750 Ti was equivalent to a PS4 GPU. If next-gen consoles repeat the same cycle, then I can assure you that extra power on PC will not go to waste.

(save some exceptions already addressed, that become less frequent every year)
Wrongly addressed, though. Even the first Half-Life, Quake, Unreal and Doom were PC-only titles at launch. Even PUBG was a PC-only title, and consoles struggle to run it to this day. I don't know where this idea of consoles dictating how graphics work came from?!
 
You prove my point. They demoed SVOGI but because the performance wasn't there it was scrapped. Without RT hardware the performance isn't there...

Got to love it when people see or imagine whatever they want just to suit their agenda... Everybody knew from the get-go that SVOGI wasn't going to be a reality in 2012, even Epic. It was simply used as a marketing tool to prop up UE4, which was in development hell. Nobody had done anything with SVOGI at the time, as the technique was literally invented by Cyril Crassin (in collaboration with Fabrice Neyret, Miguel Sainz, Simon Green and Elmar Eisemann) a few months prior to UE4's announcement. Once Crassin became a full-time Nvidia employee (by the end of 2011), Nvidia approached Epic and pimped it to them as a joint marketing effort. Epic didn't magically develop Lightmass as a replacement in a few weeks once "SVOGI" was scrapped because they suddenly discovered it would be impossible to run on consoles or even high-end PCs. Seriously.
Those companies (Nvidia, Epic, AMD, Intel or whatever) are all about marketing BS to sell their products and make money. They don't care whether this technique is better than that one; they only care about which one runs/works better on their product. Nvidia has been promoting the hell out of SVOGI for years because that particular implementation was theirs and worked best on their GPUs. The new buzzword is RT, and Nvidia built dedicated HW to accelerate a single part of it, so now they are going to BS their way into claiming that this is the best thing since sliced bread (while continuing to work on SVOGI; they literally released VXGI 2.0 on the day DXR was unveiled, by the way, but quietly, so as not to mess up the marketing message). You only need RT "hardware" if you want to make RT the way Nvidia wants you to make it (and this applies to all companies).
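For readers unfamiliar with the technique being argued over: voxel cone tracing (the core of SVOGI/VXGI) approximates GI by marching widening cones through a mip-mapped voxelization of the scene instead of tracing individual rays. The C++ sketch below is purely illustrative; the function names and the fake voxel sampler are placeholders, not Epic's or Nvidia's code. It only shows the cone-marching loop whose many 3D texture fetches per pixel are what made the technique too heavy for 2012-era hardware.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Vec4 { float r, g, b, a; };   // pre-multiplied radiance + opacity

// Stand-in for a trilinear fetch from a mip-mapped 3D radiance texture.
// In a real renderer this would be a textureLod() on the voxelized scene.
Vec4 sampleVoxelMip(const Vec3& p, float mipLevel)
{
    (void)p;                                                // fake data, position ignored
    float a = std::min(1.0f, 0.05f * (mipLevel + 1.0f));    // pretend coarser mips are "denser"
    return {0.2f * a, 0.2f * a, 0.2f * a, a};
}

// March one cone from 'origin' along 'dir', widening by 'coneAngle' (radians).
// The mip level grows with the cone diameter, so distant samples read coarse
// voxels, but each pixel still pays many 3D fetches per cone.
Vec4 traceCone(const Vec3& origin, const Vec3& dir, float coneAngle,
               float voxelSize, float maxDist)
{
    Vec4 acc{0, 0, 0, 0};
    float dist = voxelSize;                                  // skip self-intersection
    while (dist < maxDist && acc.a < 0.95f) {
        float diameter = 2.0f * dist * std::tan(coneAngle * 0.5f);
        float mip      = std::log2(std::max(diameter / voxelSize, 1.0f));
        Vec3  p{origin.x + dir.x * dist, origin.y + dir.y * dist, origin.z + dir.z * dist};
        Vec4  s = sampleVoxelMip(p, mip);
        float w = 1.0f - acc.a;                              // front-to-back blending
        acc = {acc.r + w * s.r, acc.g + w * s.g, acc.b + w * s.b, acc.a + w * s.a};
        dist += std::max(diameter * 0.5f, voxelSize);        // step proportional to cone width
    }
    return acc;
}

int main()
{
    Vec4 gi = traceCone({0, 0, 0}, {0, 1, 0}, 0.6f, 0.25f, 50.0f);
    std::printf("accumulated GI: %.3f %.3f %.3f (alpha %.3f)\n", gi.r, gi.g, gi.b, gi.a);
    return 0;
}
```

A full SVOGI pass traces several such cones per pixel (plus a re-voxelization of dynamic geometry every frame), which is the cost the posts above are arguing about.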
 
I don't know where this idea of consoles dictating how graphics work came from?!

If you carefully read what I'm actually saying, I never claimed consoles dictate how graphics should work, but they do end up becoming the BASELINE on which the industry bases its graphics architecture decisions. That's based on the past 20 years of gaming history.
 