"Ehh, I'd take PCF hacks over the residual aliasing and temporal smearing ... with exact hard shadows they can make it look quite nice."

One thing to take note of is that the denoiser in Shadow of the Tomb Raider is not temporal at all. It has zero ghosting. Purely spatial.
Interesting.
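To make the spatial-versus-temporal distinction concrete, here is a minimal sketch (illustrative C++ only, not Shadow of the Tomb Raider's actual filter): a purely spatial pass only ever reads the current frame's neighbours, so nothing stale can bleed in, whereas a temporal pass blends in reprojected history, and that history term is exactly what smears.

// Illustrative only - not any shipping denoiser.
#include <algorithm>
#include <cmath>
#include <vector>

struct Image {
    int w = 0, h = 0;
    std::vector<float> px;                      // one channel, e.g. a noisy shadow term
    float at(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return px[size_t(y) * w + x];
    }
};

// Purely spatial: the output is a weighted average of the current frame's
// neighbours, so a moving or disoccluded region can never drag in old data.
float denoiseSpatial(const Image& noisy, int x, int y, int radius) {
    float sum = 0.0f, wsum = 0.0f;
    for (int dy = -radius; dy <= radius; ++dy)
        for (int dx = -radius; dx <= radius; ++dx) {
            float w = std::exp(-float(dx * dx + dy * dy) / float(radius * radius + 1)); // Gaussian-ish falloff
            sum  += w * noisy.at(x + dx, y + dy);
            wsum += w;
        }
    return sum / wsum;
}

// Temporal, by contrast: blend the current sample with reprojected history.
// When reprojection or history rejection fails, this second term is the ghosting.
float denoiseTemporal(float currentSample, float reprojectedHistory, float alpha /* e.g. 0.1 */) {
    return alpha * currentSample + (1.0f - alpha) * reprojectedHistory;
}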
One could argue TR still needs a fallback to shadow maps in the distance, but that's what I hope stochastic LOD could solve. (Gradually removing alpha planes and lots of vegetation would work without popping.)
Also, TR is not designed for area lights, which would help a lot against the gamey look caused by ugly artificial point lights and probes lacking occlusion.
But that's only possible on a platform with guaranteed RT support, so I expect Sony will show this first.
"Thought I read somewhere that Cyberpunk only uses skydome occlusion and emissive stuff."

Though, there was some off-screen footage from Cyberpunk. Low-quality recording, but it looked like soft shadows everywhere and very good. From the readings I also thought they would do something similar to Exodus. I remember some indoors with nice area shadows from the video, also visible here:
"So how many rays per second does RTX get in ye average game scene?"

I think the metric that Nvidia provides makes it very difficult for us to gauge RT performance. I'm not sure if that is an all-encompassing metric or just for primary rays etc.
"Even if it's just primary and shadow rays my argument remains the same, with less than a billion per second you can get rid of a lot of the headaches of traditional rendering (fine grained occlusion culling, worrying about efficiency impact from triangle size etc)."

Triangle size still has an impact: if you trace against a tree that's only 10 px high on screen, the ray still has to descend down to hit a tiny subpixel leaf. LOD is still necessary.
You can force Quake 2 VKPT and Quake 2 RTX to just run primary rays, I am sure, with some console commands.
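To put a rough number on the triangle-size point: back-of-the-envelope, a tree built from around a million leaf triangles sits under a BVH roughly log2(10^6) ≈ 20 levels deep, and a ray that hits it has to walk most of those levels whether the tree covers 10 pixels or a thousand, while a raster LOD would long since have swapped the whole thing for a few hundred triangles or an imposter.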
Tracing primary visibility is likely too wasteful considering the high cost RT seems to have, and as long as there is raster HW we will use it. But ofc. this does not answer your question.
AFAIK the 10 GRays/s number from NV comes from tracing primary visibility against a single but detailed model and displaying the normal at the hit point. No materials or lighting, only perfectly coherent rays. This is what I got after asking the same question here a year ago - not sure about it.
The best resource for practical numbers still seems to be the early presentation from Remedy. They gave numbers for both Volta and Turing, and for different kinds of rays (AO, GI, probably shadow rays, but no primary rays).
I do not remember the numbers in detail, but my personal rule-of-thumb conclusion was 4-8 rays per pixel. In BFV it was less than one.
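For a sense of scale, back-of-the-envelope: at 2560x1440 and 60 fps, one ray per pixel is already about 0.22 GRays/s, so 4-8 rays per pixel works out to roughly 0.9-1.8 GRays/s - well below the 10 GRays/s headline, which fits if those rays are far less coherent (and do far more shading work per hit) than the marketing case above.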
Additionally, I remember these projects that used RT for primary visibility:
Quake 2 RTX,
and this:
Unfortunately both do too many other things to give us a clue about primary ray costs.
But I think it will take at least 5 years until the first GPUs appear that only emulate raster HW. Not sure, though. Moore's Law may just stop and we could never get rid of it.
"That said, its texturing is a bit limited. Try to create a realistic marble texture or a varnished pine desk."

Yeah, it's all about painting it... Even though there are some geometric approaches for wood, now that you talk about pine. Let me see if I can find it.

EDIT:
Not exactly the example you mentioned, but still...

Always fun to watch modellers do their work.

They're pretty good. I was trying to do something similar to make marble, but couldn't get anything to work. Wood grows in cylinders, so that's geometrically reproducible, whereas marble is fractal patterns. Granite could be made from loads of geometry pieces, but the blending and mixing of marble... well, if it can be done, it's not straightforward. That contrasts with super easy textures or even 3D procedural shaders. Is it possible to include 3D shaders in an SDF engine?
I've seen some painted marble stuff, but I can't remember in what dream... Funnily enough, I was just thinking about creating something with marble these days.
"Is it possible to include 3D shaders in an SDF engine?"

I don't know. Maybe we could ask @sebbbi. Where is he?
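On the wood-versus-marble point, here is a toy sketch (my own illustrative C++, not from any real engine; a cheap hash-based value noise stands in for proper Perlin noise): wood falls out of concentric rings around an axis, marble out of a sine warped by fractal turbulence, and both are plain functions of a 3D position, so they could be evaluated directly at the hit point a raymarched SDF returns.

// Toy 3D procedural textures: wood rings vs. marble (sine + turbulence). Illustrative only.
#include <cmath>
#include <cstdio>

static float hash3(int x, int y, int z) {
    unsigned n = unsigned(x) * 73856093u ^ unsigned(y) * 19349663u ^ unsigned(z) * 83492791u;
    n = (n << 13) ^ n;
    return ((n * (n * n * 15731u + 789221u) + 1376312589u) & 0x7fffffffu) / 2147483647.0f;
}

// Trilinearly interpolated value noise at a 3D point.
static float valueNoise(float x, float y, float z) {
    int xi = int(std::floor(x)), yi = int(std::floor(y)), zi = int(std::floor(z));
    float tx = x - xi, ty = y - yi, tz = z - zi;
    auto lerp = [](float a, float b, float t) { return a + (b - a) * t; };
    float c[2][2][2];
    for (int k = 0; k < 2; ++k)
        for (int j = 0; j < 2; ++j)
            for (int i = 0; i < 2; ++i)
                c[k][j][i] = hash3(xi + i, yi + j, zi + k);
    float x00 = lerp(c[0][0][0], c[0][0][1], tx), x10 = lerp(c[0][1][0], c[0][1][1], tx);
    float x01 = lerp(c[1][0][0], c[1][0][1], tx), x11 = lerp(c[1][1][0], c[1][1][1], tx);
    return lerp(lerp(x00, x10, ty), lerp(x01, x11, ty), tz);
}

// Sum of octaves ("turbulence") - the fractal part that marble needs.
static float turbulence(float x, float y, float z, int octaves) {
    float sum = 0.0f, amp = 0.5f, freq = 1.0f;
    for (int i = 0; i < octaves; ++i) {
        sum += amp * valueNoise(x * freq, y * freq, z * freq);
        amp *= 0.5f; freq *= 2.0f;
    }
    return sum;
}

// Wood: concentric rings around the y axis, slightly perturbed by noise.
float woodTexture(float x, float y, float z) {
    float r = std::sqrt(x * x + z * z);                    // distance from the trunk axis
    float rings = r * 12.0f + 2.0f * turbulence(x, y, z, 4);
    return 0.5f + 0.5f * std::sin(rings * 6.2831853f);     // 0..1 ring intensity
}

// Marble: a sine along one axis, heavily warped by turbulence.
float marbleTexture(float x, float y, float z) {
    float veins = x * 4.0f + 6.0f * turbulence(x, y, z, 5);
    return 0.5f + 0.5f * std::sin(veins * 3.14159265f);    // 0..1 vein intensity
}

int main() {
    // Evaluate at an arbitrary 3D point - e.g. the hit position an SDF raymarcher gives back.
    std::printf("wood(0.3,0.1,0.2)   = %f\n", woodTexture(0.3f, 0.1f, 0.2f));
    std::printf("marble(0.3,0.1,0.2) = %f\n", marbleTexture(0.3f, 0.1f, 0.2f));
}

The fractal part is exactly what makes marble the harder case: the wood rings read fine with little or no noise, while the marble veins only start to look right once several octaves of turbulence are layered in.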
Ehh, I'd take PCF hacks over the residual aliasing and temporal smearing ... with exact hard shadows they can make it look quite nice.
"I'd take pixel perfect shadows (given enough rays of course) over grainy low res shadow maps."

Hybrid Frustum Tracing is pixel-perfect shadows plus a PCSS hack (I used the wrong acronym before). Ray tracing would simplify the algorithm, but it gives the same result as irregular z-buffering.
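For reference, the "PCSS hack" presumably refers to the textbook PCSS penumbra estimate, sketched below (not NVIDIA's actual HFTS code): an average blocker depth is gathered around the shaded point, and the penumbra width then follows from similar triangles and drives the softening filter radius.

// Sketch of the standard PCSS penumbra estimate - not the HFTS implementation.
// wPenumbra = (dReceiver - dBlocker) / dBlocker * lightSize
float penumbraWidth(float receiverDepth, float avgBlockerDepth, float lightSize) {
    return (receiverDepth - avgBlockerDepth) / avgBlockerDepth * lightSize;
}

The bigger the receiver-blocker gap, the wider the filter, which is what produces the contact-hardening soft look on top of exact hard visibility.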
"Yeah, HFTS looked pretty good in The Division. It was dropped for Division 2 though. Wonder why."

The Division 2 being marketed by AMD is the most likely reason. But HFTS makes little sense when there's full ray tracing h/w available.