If I understood it correctly, visibility samples are full res, while texturing/shading is done at the target rate. With variable rate shading, is the base color/texture still sampled at the highest resolution?
SIMD width is not really indicative of warp size: G80 was 8-wide and took 4 clocks to execute an instruction. Changing the warp size would also increase scheduler cost significantly and break years of optimizations tailored for 32-wide warps.
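To make the G80 point concrete: the logical warp size is the physical SIMD width times the number of clocks an instruction occupies the pipeline, so narrower hardware can still present the same 32-wide warp. A quick sketch (the per-GPU numbers are illustrative, taken from the post above):

```python
# Logical warp size = physical SIMD lanes * clocks spent issuing one instruction.
def logical_warp_size(simd_lanes: int, clocks_per_instr: int) -> int:
    return simd_lanes * clocks_per_instr

# G80: 8 physical lanes, 4 clocks per instruction -> still a 32-wide warp.
assert logical_warp_size(8, 4) == 32

# A hypothetical 16-lane design issuing over 2 clocks would also keep 32-wide warps.
assert logical_warp_size(16, 2) == 32
```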
I'm assuming via Nvidia OptiX in their drivers? The demo is likely hand-tuned to OptiX, which means it probably won't work on Vega once the next W10 update hits (and AMD adds DXR to its drivers).
So does RTX not need DXR to work? I'm asking because the ray tracing in the Star Wars demo works on both Pascal and Turing without DXR.
More on Variable Rate Shading.
https://devblogs.nvidia.com/turing-variable-rate-shading-vrworks/
From 4x4 coarse shading up to 8x SSAA.
Not bad.
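That range spans roughly two orders of magnitude in shader invocations per pixel. A back-of-the-envelope sketch, using only the rates named above (this is just arithmetic, not the VRS API):

```python
# Shader invocations per pixel for a given coarse-shading block size,
# optionally multiplied by a supersampling factor.
def shades_per_pixel(block_w: int, block_h: int, samples: int = 1) -> float:
    return samples / (block_w * block_h)

coarse_4x4 = shades_per_pixel(4, 4)     # one invocation covers 16 pixels
full_rate = shades_per_pixel(1, 1)      # classic one shade per pixel
ssaa_8x = shades_per_pixel(1, 1, 8)     # 8 invocations per pixel

assert coarse_4x4 == 1 / 16
assert ssaa_8x / coarse_4x4 == 128      # 128x span between the two extremes
```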
Nope, it can also run as an acceleration layer for Vulkan RT and OptiX.
Unfortunately the screenshots of the upcoming 3dmark RT benchmark look quite unimpressive. Hopefully someone does a proper DXR or Vulkan benchmark.
The Star Wars Reflections demo is built using DXR. RTX accelerates DXR, but DXR can run on any DX12 GPU using the fallback layer.
Isn't the "look" irrelevant as long as it gives numbers to compare across hardware?
We already have a benchmark where looks are irrelevant, it’s called math.
3DMark actually used to be a benchmark in the literal sense by producing visuals that games of the time couldn’t match. It set expectations for what’s possible in the future when using the latest and greatest tech. It hasn’t been that for a long time. Now it just spits out a useless number and doesn’t look good doing it.
I miss the good old days of Nature and Airship.
It will certainly be interesting to see how it will finally happen and whether it needs an additional change to hardware.
Indeed.
Pretty cool. Still waiting on that killer app though for VR to really take off.
Despite the 8700K being ~30% faster than the 2700X at 1080p, it still can't keep a 2080 Ti fed. CPU progress has really fallen off a cliff in the past couple of years; this needs to change for both Intel and AMD.
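The bottleneck claim above can be framed as a simple min() model: the delivered frame rate is capped by whichever of the CPU or GPU is slower. The frame rates below are made up for illustration; only the ~30% CPU gap comes from the post:

```python
# Delivered fps is limited by the slower of the CPU-bound and GPU-bound rates.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

ryzen_cpu_fps = 100.0                 # hypothetical 2700X CPU-bound rate at 1080p
intel_cpu_fps = ryzen_cpu_fps * 1.3   # the quoted ~30% advantage for the 8700K
gpu_fps = 200.0                       # hypothetical 2080 Ti GPU-bound rate

# Even the faster CPU leaves the GPU partly idle at 1080p.
assert delivered_fps(intel_cpu_fps, gpu_fps) == 130.0
assert delivered_fps(ryzen_cpu_fps, gpu_fps) == 100.0
```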