Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

There aren't even affordable PC GPU cards with ray tracing; do you really think there will be console hardware at $399 in 2019 that includes it too?

Everyone should temper their expectations.
Hope for Ray Tracing consoles, but plan for typical rasterizer consoles.
That is why I said 2020...
 
There aren't even affordable PC GPU cards with ray tracing; do you really think there will be console hardware at $399 in 2019 that includes it too?

Everyone should temper their expectations.
Hope for Ray Tracing consoles, but plan for typical rasterizer consoles.

RTX is not the only way to do RT... don't forget ImgTec did RT in a mobile SoC a long time ago. It could deliver 300 Mrays/sec in a mobile SoC in 2014! (https://appdevelopermagazine.com/im...res-provide-new-opportunities-for-unity-devs/) Meanwhile a big GPU like the Vega 64 manages... 300 Mrays/sec... so no, RT is not necessarily expensive. It needs to be smart. I am of course not expecting a full RT pipeline, but some hybrid approach. Just imagine the ImgTec ray tracer with the transistor budget of a console... it would most likely deliver a few gigarays...
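To put that 300 Mrays/sec figure in perspective (rough back-of-the-envelope arithmetic of my own, not from the article): at 1080p30 it works out to

$$\frac{300\times10^{6}\ \text{rays/s}}{1920\times1080\times30\ \text{px/s}}\approx 4.8\ \text{rays per pixel per frame}$$

... a primary hit plus a few shadow or reflection rays per pixel, which is exactly the kind of hybrid budget I have in mind.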
 
Uhm, no. Everyone's entire dream of PowerVR has always been "just imagine an imgtec gpu with a larger transistor budget"; it has never happened, nor will it ever happen.
 
Uhm, no. Everyone's entire dream of PowerVR has always been "just imagine an imgtec gpu with a larger transistor budget"; it has never happened, nor will it ever happen.
I am not saying PVR will ever come back... I am just saying PVR proved RT is possible within a constrained transistor and price budget, and that it is not just theoretical... they literally did it. PVR is no god, and pulling off crazy, amazingly smart things does not guarantee success... PVR proved that again by virtually dying...
What I am saying is, RT for consoles is not as far-fetched as people like to think... I don't understand why people would want to stay another generation (6 to 10 years) with rasterisation... Going RT mid-generation would just fragment the market and the development pipeline. And imagine the PC gamers ranting about consoles holding back the gaming industry all over again.
 
I don't understand why people would want to stay another generation (6 to 10 years) with rasterisation... Going RT mid-generation would just fragment the market and the development pipeline. And imagine the PC gamers ranting about consoles holding back the gaming industry all over again.

No one wants to. I never said they did, nor did anyone else. I'm only saying to be realistic and grounded in reality.

Everyone who doesn't spend $600+ on an RT GPU will be unable to play any RT-based PC game. Mainstream PC gamers are and always have been holding back enthusiast PC gamers. That has always been the situation. It's no different today. It'll be no different tomorrow. PC gamers only have other PC gamers to blame.
 
No one wants to. I never said they did, nor did anyone else. I'm only saying to be realistic and grounded in reality.

Everyone who doesn't spend $600+ on an RT GPU will be unable to play any RT-based PC game. Mainstream PC gamers are and always have been holding back enthusiast PC gamers. That has always been the situation. It's no different today. It'll be no different tomorrow. PC gamers only have other PC gamers to blame.

As someone currently gaming on a 1.2 TFLOP RX 550, I feel no shame. Come at me bro.

Anyone want to donate $500 to me? :p
 
There aren't even affordable PC GPU cards with ray tracing; do you really think there will be console hardware at $399 in 2019 that includes it too?

Define affordable?

Most of Nvidia's modern GPUs containing CUDA cores (e.g., Kepler's GK104, Fermi's GF100) can perform ray-tracing operations at the hardware level. The hitch, of course, is that the more CUDA cores you have, the faster those operations run (SLI/NVLink setups faster still). That's why programs like Adobe After Effects and Cinema 4D, which included RT features, were available for CUDA-based Nvidia cards. And if I'm not mistaken, AMD's GCN architecture (through asynchronous compute) could perform RT functions as well.
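To make the "RT on plain CUDA cores" point concrete, here's a minimal compute-only sketch (a toy of my own, not code from any of those products; every name in it is made up for illustration): one thread per pixel intersecting a primary ray with a hard-coded sphere. Nothing in it touches RT-specific hardware, so it runs on anything from Fermi onward and simply scales with CUDA core count.

```cuda
// Toy example: brute-force primary-ray/sphere intersection on plain CUDA
// cores -- no RT hardware involved, just ALU work like any GPGPU renderer.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void primaryRays(float* depth, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Camera at the origin looking down +z; one ray per pixel.
    float dx = (x + 0.5f) / w - 0.5f;
    float dy = (y + 0.5f) / h - 0.5f;
    float dz = 1.0f;

    // Unit sphere centered at (0,0,4): solve |t*d - c|^2 = r^2 for t.
    float cz = 4.0f, r = 1.0f;
    float a = dx * dx + dy * dy + dz * dz;
    float b = -2.0f * dz * cz;
    float c = cz * cz - r * r;
    float disc = b * b - 4.0f * a * c;

    // Store the hit distance, or -1 if the ray misses the sphere.
    depth[y * w + x] = disc >= 0.0f ? (-b - sqrtf(disc)) / (2.0f * a) : -1.0f;
}

int main() {
    const int w = 64, h = 64;
    float* d;
    cudaMalloc(&d, w * h * sizeof(float));
    primaryRays<<<dim3(w / 8, h / 8), dim3(8, 8)>>>(d, w, h);
    float center;
    cudaMemcpy(&center, d + (h / 2) * w + w / 2, sizeof(float),
               cudaMemcpyDeviceToHost);
    printf("depth at image center: %.3f\n", center); // ~3.0: sphere's front face
    cudaFree(d);
    return 0;
}
```

The GPGPU renderers of the Fermi/Kepler era were essentially this loop writ large: more CUDA cores meant more rays per second, nothing else.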

All that being said, what the next generation of PC GPU architectures (hopefully including gaming consoles as well) should offer is more specific RT features built into the GPU "core" designs, for more efficient execution of these operations rather than just brute-forcing through them like older architectures. Hopefully AMD's Navi, and the architectures that follow, have better ways of tackling RT operations.

Everyone should temper their expectations.
Hope for Ray Tracing consoles, but plan for typical rasterizer consoles.

No, son! I'm going full expectations... never not go full expectations. :runaway::mrgreen:
 
Sony can't just tell AMD, "Bro, did you see the Nvidia conference? Design me some RT. C U in a year."

Besides, RTX/DXR looks good but definitely V1 rough. Let it come to Vulkan. Let's see AMD's driver implementation. Let it cook on PC for several years. You'll get a better implementation in 2026.
 
Besides, RTX/DXR looks good but definitely V1 rough. Let it come to Vulkan. Let's see AMD's driver implementation. Let it cook on PC for several years. You'll get a better implementation in 2026.

Don't get me wrong, I don't think RT will be used in every PS5/XB2 game, maybe just a handful of first-party titles. I just don't see developers like Guerrilla or Naughty Dog aiming for 60fps in open-world titles when that budget can be spent on more complex geometry, prettier shaders and lighting, with some nice RT to boot. We shall see...
 
No one wants to. I never said they did, nor did anyone else. I'm only saying to be realistic and grounded in reality.

Everyone who doesn't spend $600+ on an RT GPU will be unable to play any RT-based PC game. Mainstream PC gamers are and always have been holding back enthusiast PC gamers. That has always been the situation. It's no different today. It'll be no different tomorrow. PC gamers only have other PC gamers to blame.
Totally agree on this point, but imagine a console market where one console goes RT and the other goes rasterisation... for comparison, just watch the Metro RTX demo.
Early adopters tend to be tech-savvy people and also influencers... When I pre-ordered my PS4, I had a sense that the specs made it "more future proof" than the other console, and many of my friends followed my advice.
If there is RT on one console and not on the other... where do you think geeks would lean (if prices are similar)?
Again, RT can be made cheap because you would not need the huge number of stream processors we use today. Currently you need a huge number of stream processors to inefficiently fake lighting; I think a large chunk of the rendering time is spent faking lighting, and it is very transistor- and power-inefficient.
What if you could get lighting and reflections (RT) done by a small, efficient, dedicated pipeline? Wouldn't it save on the otherwise overblown general-purpose stream engines? I know it is a paradigm shift backwards, when developers cried so strongly for unified shaders...
But let's face it, unified shaders are super inefficient at lighting from a transistor and power budget point of view.
If PVR engineers could do it in 2014, why would AMD and Sony engineers be less smart in 2018? (And they were most likely aware of MS's plans to introduce DXR for some time now...)
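To put rough numbers on that dedicated-pipeline idea (my own arithmetic, and assuming my earlier "few gigarays" guess holds): at 4K30, a 4 Gigarays/sec unit would have

$$\frac{4\times10^{9}\ \text{rays/s}}{3840\times2160\times30\ \text{px/s}}\approx 16\ \text{rays per pixel per frame}$$

to spend... a primary ray plus a healthy budget of shadow and reflection rays, without touching the shader array that renders the rest of the frame.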
 
I don't think these early demos of RT integration mean much, really. It is the future, and surely tech-oriented devs find it both wise to experiment with and personally satisfying to do so; Nvidia's and MS's pushes are the cherry on top.
But we've seen similar marketing-driven pushes for features before, like tessellation or PhysX, that once the hype died down turned out not to be the best use of resources for the time being. Even if you can somehow make RT run on a high-end card in a current-gen game, that doesn't mean it will be the best performance/graphics-gain trade-off for a game built from the ground up for a potential next-gen console.
 
You can see how rasterization will eventually hit a huge wall, though. Nvidia might be incredibly early to the party, but they'll be better equipped to handle it when pure rasterization is abandoned. Eventually, having AI cores and dedicated RT silicon will provide better results for the given die space. It's gonna take a few gens, and the next round of consoles will probably dabble in it a bit to give multiplatform developers a reason to push it on PC. It's gonna be expensive to develop both raster-only and hybrid paths, though.
 
We've been down this path many times: new technology is very inefficient and relatively expensive compared to a mature approach that's gone through multiple generations of refinement.

It's going to take a couple of cycles for RT to be viable. For now, let's hope we get higher-frame-rate 4K images that don't burn a disproportionate amount of power simply generating native 4K. Ideally we'd see more effort going into other areas that make the experience more immersive rather than simply prettier.
 
Totally agree on this point, but imagine a console market where one console goes RT and the other goes rasterisation... for comparison, just watch the Metro RTX demo.
Pretty, and clearly an improvement, but hyperbolic marketing. As if there's no other way to do dark games with dark rooms. Prebaked GI is a real thing that has been working nicely for a few years. As ever, marketing needs a worst-case fake comparison to really sell its product; set against the best the alternatives have to offer, their demo isn't such a stark contrast. There are also some great developments in voxel-based GI.

https://forum.beyond3d.com/threads/...nt-console-hardware-games-ps4-xo-wu-nx.57658/
http://www.sonicether.com/segi/

Scary games haven't struggled to create dark corners.
 
Yeah, I see current RT tech being at the "non-programmable shader" level of development. It's supported by a handful of high-end cards with an indeterminate perf hit versus standard raster techniques. Another few spins and it'll be the cheap, easy way to handle shadows and lighting, but we're not there yet. I would be very surprised to see much of this outside a handful of first-party titles next go-around. Much of the silicon work for XB2/PS5 is already done, so I think expecting a lot of RT next gen is setting yourself up for disappointment.

Console is still the prime dev target for volume and always will be, especially given how crap mass-market OEM GPUs continue to be (hi, DDR VRAM).
 
Yeah, I see current RT tech being at the "non-programmable shader" level of development. It's supported by a handful of high-end cards with an indeterminate perf hit versus standard raster techniques. Another few spins and it'll be the cheap, easy way to handle shadows and lighting, but we're not there yet. I would be very surprised to see much of this outside a handful of first-party titles next go-around. Much of the silicon work for XB2/PS5 is already done, so I think expecting a lot of RT next gen is setting yourself up for disappointment.

Console is still the prime dev target for volume and always will be, especially given how crap mass-market OEM GPUs continue to be (hi, DDR VRAM).
And everybody tends to "forget", thanks to Nvidia's marketing, that RT can be done without dedicated HW RT cores and works on any DX12-class GPU (via DXR/Vulkan). As a matter of fact, all the RT demos since the DXR announcement were done exactly that way on Volta (which used its tensor cores only for denoising; the RT itself was done in regular compute). Obviously slower today, but it remains to be seen whether dedicated HW-accelerated BVH traversal will even be required in the future with faster GPUs. As you said, RTRT is currently at the "non-programmable shader" level of development.
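For anyone curious what "RT as regular compute" actually looks like, the heart of it is just a stack-based BVH traversal loop. Here's a toy sketch of my own (hypothetical node layout, a hard-coded three-node tree; this is emphatically not the real DXR fallback code). The divergent while-loop is the part Turing's RT cores move into fixed-function hardware; on Volta, or any other DX12-class GPU, it runs as ordinary ALU work.

```cuda
// Toy BVH traversal as plain compute: loads, compares, and a stack.
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical node layout for illustration; real BVHs pack this tighter.
struct Node {
    float lo[3], hi[3]; // axis-aligned bounding box
    int left, right;    // child indices, -1 for leaves
};

// Slab test: does the ray (origin o, inverse direction inv) hit the box?
__device__ bool hitBox(const Node& n, const float* o, const float* inv) {
    float tmin = 0.0f, tmax = 1e30f;
    for (int a = 0; a < 3; ++a) {
        float t0 = (n.lo[a] - o[a]) * inv[a];
        float t1 = (n.hi[a] - o[a]) * inv[a];
        if (t0 > t1) { float t = t0; t0 = t1; t1 = t; }
        tmin = t0 > tmin ? t0 : tmin;
        tmax = t1 < tmax ? t1 : tmax;
    }
    return tmin <= tmax;
}

__global__ void traverse(const Node* nodes, int* hitLeaf) {
    // One hard-coded ray aimed at the left child's box.
    float o[3]   = {-1.0f, 0.0f, -5.0f};
    float d[3]   = {0.01f, 0.01f, 1.0f};
    float inv[3] = {1.0f / d[0], 1.0f / d[1], 1.0f / d[2]};

    // This loop is what dedicated BVH hardware accelerates; here it is
    // ordinary, divergent compute work.
    int stack[32];
    int sp = 0, found = -1;
    stack[sp++] = 0; // start at the root
    while (sp > 0) {
        int idx = stack[--sp];
        const Node& n = nodes[idx];
        if (!hitBox(n, o, inv)) continue;
        if (n.left < 0) { found = idx; break; } // leaf: record the hit
        stack[sp++] = n.left;
        stack[sp++] = n.right;
    }
    *hitLeaf = found;
}

int main() {
    // Three-node tree: a root spanning two leaf boxes side by side.
    Node h[3] = {
        {{-2, -1, -1}, {2, 1, 1}, 1, 2},   // root
        {{-2, -1, -1}, {0, 1, 1}, -1, -1}, // left leaf
        {{ 0, -1, -1}, {2, 1, 1}, -1, -1}, // right leaf
    };
    Node* dn; int* dh;
    cudaMalloc(&dn, sizeof(h));
    cudaMalloc(&dh, sizeof(int));
    cudaMemcpy(dn, h, sizeof(h), cudaMemcpyHostToDevice);
    traverse<<<1, 1>>>(dn, dh);
    int hit;
    cudaMemcpy(&hit, dh, sizeof(int), cudaMemcpyDeviceToHost);
    printf("hit leaf: %d\n", hit); // expect 1: the left leaf
    cudaFree(dn); cudaFree(dh);
    return 0;
}
```

No magic ray-tracing instructions anywhere, which is why the early DXR demos could run on Volta's regular SMs, and why the open question is just how much of a speedup fixed-function traversal ultimately buys.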
 
Tomb Raider RTX Demo (No On/Off comparison)

Metro Exodus RTX Demo (No On/Off comparison)

So the official NV YouTube channel has posted the Tomb Raider and Metro RT videos, but hasn't included any of the RT on/off stuff from the stage demo, so the stream capture is still the best way to see that.
 
If a next-gen console includes dedicated RT hardware, you can expect the same hyperbolic marketing. I think MS believes they lost the enthusiasts with early Xbox One marketing miscues, and things like this are how those buyers could be won back.

Second, I have no idea why we're talking about retail card prices. Nvidia controls the market and knows it can charge what it wants. What matters to a console vendor is die size plus licensing costs.

One can argue this benefits Nvidia either way: having more devs target its feature set reinforces the strength of its position on the desktop.
 
Until GPUs with RT sell for under $200, it's a complete non-starter to consider the tech suitable for inclusion in consoles. That's just the way the console BOM breaks down.
 