Now that Sony has given up the floor to MS at E3, there's more potential to push the DXR narrative as the direction of graphics for the industry.

I think some form of RT and hybrid rendering will be there one day, but I don't think it will be on next-generation consoles; it's too early.
See? Specialized hardware is required for speed, and maximizing flexibility at its expense is too costly to be worth it. RTX already gives us HRT at 1080p/60 fps as an add-on. Just imagine the performance with a renderer actually designed for HRT.

There are countless criticisms of the results from rasterising; for years and years people have pointed out what's wrong with them and their ugly hacks. However, it's the only way to get 3D graphics out of a computer at a decent framerate, and, via hybrid renderers, it will remain the only way until we have notably faster tech that allows full ray tracing, if that ever happens. The exception is special cases like traced SDFs, which again are only now becoming possible; it would have been really stupid to demand ditching rasterisation in favour of traced SDFs over the past few decades, because traced SDFs weren't possible then.
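For anyone unfamiliar with the traced-SDF case mentioned above, the usual technique is sphere tracing: step each ray forward by the distance the signed distance field guarantees is empty. A minimal, self-contained sketch of that loop (the scene function, step count and thresholds here are placeholders of my own, not from any particular renderer):

```cpp
#include <cmath>
#include <cstdio>

// Signed distance to a unit sphere at the origin (placeholder scene).
float sceneSDF(float x, float y, float z) {
    return std::sqrt(x * x + y * y + z * z) - 1.0f;
}

// Sphere tracing: advance along the ray by the distance the SDF guarantees
// is free of geometry, until we're close enough to call it a hit or give up.
// Returns the hit distance along the ray, or -1 on a miss.
float sphereTrace(float ox, float oy, float oz,    // ray origin
                  float dx, float dy, float dz) {  // ray direction (normalised)
    float t = 0.0f;
    for (int i = 0; i < 128; ++i) {
        float d = sceneSDF(ox + t * dx, oy + t * dy, oz + t * dz);
        if (d < 1e-4f) return t;   // close enough: hit
        t += d;                    // safe step forward
        if (t > 100.0f) break;     // ray left the scene
    }
    return -1.0f;
}

int main() {
    // One ray from z = -3 straight down +z; it hits the unit sphere at t = 2.
    std::printf("hit t = %f\n", sphereTrace(0, 0, -3, 0, 0, 1));
}
```

A real SDF renderer runs this per pixel in a shader and shades from the field's gradient, but the marching loop above is the core of it.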
"We don't want rasterised graphics because they suck! Give us raytraced games at one frame every four hours!!"
Emotional, you say. Not at all. One is reality. The other is emotional paranoia. One has a place here. The other does not.
Two years from now seems very feasible to me.

I think some form of RT and hybrid rendering will be there one day, but I don't think it will be on next-generation consoles; it's too early.
Doing something makes a great deal of sense for professional imaging. If RTX was a pro-only card and not released for gaming, its whole perception would be different. A cynical view sees RTX as a pro card released to gaming before the tech is ready for gaming. That's part of the discussion.
I've been trying to follow along and have learned some things. It could be a while before RT is built from the ground up, and that may be necessary for solid performance on reflections.

Currently, BFV with RT enabled causes about half (or more) of the traditional GPU resources that would normally be used to go to waste. That is a monumental waste of silicon under normal circumstances. It's the kind of thing you can only do on a freaking massive chip, when you have no competition at the high end and can also sell your GPU to professionals at a huge markup.
HRT may well be - and hopefully is - the future of affordable consumer rendering. But so far RTX doesn't prove this. Those figures for the 2070 are so bad that you simply wouldn't use it - at only 1440p on medium a 570 4GB user would have a massive fps and competitive advantage. Optimisations need to come very thick and very fast. Hopefully they will.
The performance hit is currently so bad that I'd think any dedicated RT hardware should be added to a console via an eDRAM-heavy chiplet. Very low latency access, very high internal bandwidth, and hopefully you could run it in parallel with one of the stages of the traditional pipeline to add little to no additional latency. Very high performance, and you don't need to add it to the base SKU. Because the kind of console gamers who jumped on board this gen at $300, but still expect "normal" versions of FIFA, Forza, GT, CoD, Battlefield and Halo (i.e. 60 fps), aren't going to want games that don't have the core, basic gameplay experience.
So much for it being a black box implementation limiting the creativity of developers.
That's GPGPU. They're talking about representing the data in graphics terms and using the graphics hardware, just interpreting the results differently. If the box was any colour other than black, perhaps a lot more could be done with it? The existence of some novel uses doesn't prove that the implementation doesn't restrict other novel uses (including performance-enhancing ones), or that options aren't being limited compared with what non-'black box' hardware would allow.

Some extra use cases for the BVH hardware in RTX: data lookup for complex lighting, and screen-space physics (that are also world aware). And we are only scratching the surface here.
So much for it being a black box implementation limiting the creativity of developers.
https://blog.demofox.org/2018/11/16/how-to-data-lookups-via-raytracing/
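The gist of that link, as a very rough CPU-side sketch (not the article's actual code, and the record layout below is made up purely for illustration): store each data record as a piece of geometry, and a lookup becomes a nearest-hit ray query. On RTX the BVH/intersection hardware would do the searching; the linear loop here only stands in for it.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Each data record is stored as a sphere in space; the payload is what we
// actually want to look up (e.g. a light index or a material id).
struct Record {
    float cx, cy, cz, radius;
    int payload;
};

// Nearest-hit ray query against the records. On RTX this search would be
// done by the acceleration-structure hardware; the loop just stands in.
int lookup(const std::vector<Record>& records,
           float ox, float oy, float oz,       // ray origin
           float dx, float dy, float dz) {     // ray direction (normalised)
    float bestT = 1e30f;
    int bestPayload = -1;
    for (const Record& r : records) {
        // Standard ray-sphere intersection test.
        float lx = r.cx - ox, ly = r.cy - oy, lz = r.cz - oz;
        float b = lx * dx + ly * dy + lz * dz;
        float c = lx * lx + ly * ly + lz * lz - r.radius * r.radius;
        float disc = b * b - c;
        if (disc < 0.0f) continue;
        float t = b - std::sqrt(disc);
        if (t > 0.0f && t < bestT) { bestT = t; bestPayload = r.payload; }
    }
    return bestPayload;
}

int main() {
    std::vector<Record> records = {
        {0, 0, 5, 1, 42},   // record "42" sits 5 units down +z
        {0, 0, 9, 1, 7},    // record "7" sits behind it
    };
    // The query ray hits the first record, so the lookup returns 42.
    std::printf("payload = %d\n", lookup(records, 0, 0, 0, 0, 0, 1));
}
```

The interesting part is that once the data lives in an acceleration structure, the same fixed-function traversal that traces shading rays can also be doing your searches.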
Which ones?

More games in the works with RT, interesting to say the least.
Could you elaborate on the "more games" part?

More games in the works with RT, interesting to say the least.
RT is impressive, but DLSS is something too, and having both in a title could be a thing.
Is the rasterization pipeline fully programmable? No. Why? Because it would be too slow.

That's GPGPU. They're talking about representing the data in graphics terms and using the graphics hardware, just interpreting the results differently. If the box was any colour other than black, perhaps a lot more could be done with it? The existence of some novel uses doesn't prove that the implementation doesn't restrict other novel uses (including performance-enhancing ones), or that options aren't being limited compared with what non-'black box' hardware would allow.
This is exactly the same as GPGPU versus compute. GPUs could be made to do non-graphics work by structuring the workload as graphics tasks to fit the hardware. The hardware was then upgraded to better support general purpose processing.
As Sebbbi says, "Where's the mesh shader hype".

A bit like all the people who said what the PS2 could or couldn't do.
DLSS should be able to up-rez from 1440p to 4K and look nearly native; its branch of research would cover topics like AA etc.

Could you elaborate on the "more games" part?
Also, I don't understand the fuss around DLSS unless your only point is trying to get some AA at 4K, but even then I would still probably pick 4K without AA rather than the 1440p-or-something upscaled and AA'd.
I don't know. I wasn't discussing whether the BVH was a black box or not. I was just pointing out that the existence of a GPGPU use of the hardware does not prove it to be transparent and flexible. Neither does that prove it's a 'black box'.

You keep calling RTX a black box, but really, exactly what things does it prevent you from doing?
I'm not arguing about the features because I don't know the specifics of the implementation. Others, such as milk, have I think spoken at length about what they'd like to see in the hardware as regards accessible memory access.

Do you have any specific examples of the things we would miss out on if it wasn't for its adoption, or is it just FUD?
You're repeating the same argument instead of moving the discussion forwards. The discussion has proceeded thus far...

If we didn't adopt it, at least in the short term I can tell you what we would miss out on: fast triangle-intersecting ray tracing.
As Sebbbi says, "Where's the mesh shader hype".
As this goes on, honestly, I'm seeing a lot of activity and discussion around DXR, more than I've seen with other new features released with Maxwell and Pascal. So I'm led to believe that they have enough information to go on to determine whether it's worthwhile to explore DXR.
Hoping to see Mesh and Primitive shaders make it for next gen as well. But DX12/Vulkan need to adopt them into the core API, as opposed to an extension, for them to become a standard for all.
I don't know if this is the correct thread for discussing it, but it is part of Turing.
I do understand that point, but the fact is that it loses detail in the process, and the only independent (even if NVIDIA was heavily involved) solution available so far breaks at least DoF, if not more, in the process.

DLSS should be able to up-rez from 1440p to 4K and look nearly native; its branch of research would cover topics like AA etc.
It's important because the argument against DXR is that the games can only run at 1080p. But with AI up-res we're looking at the ability to push that significantly higher.
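Back-of-the-envelope on why the up-res matters, using nothing but the pixel counts (no RTX-specific numbers assumed):

```cpp
#include <cstdio>

int main() {
    // Rough pixel-count comparison: native 4K vs a 1440p render that is
    // upscaled (DLSS-style) to 4K.
    const long long pixels4k    = 3840LL * 2160;  // 8,294,400
    const long long pixels1440p = 2560LL * 1440;  // 3,686,400
    std::printf("1440p is %.0f%% of the per-pixel ray budget of native 4K\n",
                100.0 * pixels1440p / pixels4k);  // ~44%
}
```

So tracing at 1440p and reconstructing to 4K needs well under half the per-pixel ray work of tracing natively at 4K, which is where the headroom comes from.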
There are other methods as well; I don't think DLSS is the only one (well, unfortunately NVIDIA coined the term), so we might be using the term liberally when we shouldn't.

I do understand that point, but the fact is that it loses detail in the process, and the only independent (even if NVIDIA was heavily involved) solution available so far breaks at least DoF, if not more, in the process.