CES 2025 Thread (AMD, Intel, Nvidia, and others!)

Huh? Nvidia has been the most vocal and active player tackling rendering pipeline latency in recent memory.
Yes ... after increasing it by 1 frame.

You're also not really addressing that decreasing it by far more through frameless rendering would kill G-Sync ... which is painful.
 
Or NVIDIA's own paper Temporally Dense Ray Tracing. Really though, it's all in the name. The rendering doesn't create any proper frames any more, it just creates samples ... to be used by some backend to create actual frames, v-synced frames.

This would obviously kill G-Sync if adopted, which is painful.
 
Or NVIDIA's own paper Temporally Dense Ray Tracing. Really though, it's all in the name. The rendering doesn't create any proper frames any more, it just creates samples ... to be used by some backend to create actual frames, v-synced frames.

This would obviously kill G-Sync if adopted, which is painful.
Nvidia could find some other way to sell a special module to display OEMs. Maybe the new module could make the display itself frameless, so whenever the renderer creates samples the corresponding pixels on the display are immediately updated instead of waiting for an entire frame to be ready.
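A toy sketch of that frameless-display idea, purely illustrative (the `FramelessDisplay` class and `push_sample` method are made up for this post, not any real display interface): each arriving sample updates its pixel immediately instead of waiting for a complete frame to be assembled.

```python
# Hypothetical "frameless display": no frame buffer handoff, each shading
# sample is written to its pixel the moment it arrives. All names here are
# invented for illustration.

class FramelessDisplay:
    def __init__(self, width: int, height: int):
        self.pixels = [[0.0] * width for _ in range(height)]
        self.updates = 0

    def push_sample(self, x: int, y: int, value: float) -> None:
        """Update one pixel immediately; there is no frame boundary to wait for."""
        self.pixels[y][x] = value
        self.updates += 1

disp = FramelessDisplay(4, 4)
for i in range(16):
    # samples trickle in one at a time; each is visible as soon as it lands
    disp.push_sample(i % 4, i // 4, i / 15.0)
```

The point of the sketch is just that "refresh" stops being a global event: the renderer's sample stream and the panel's pixel updates are coupled per sample, not per frame.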
 
Or NVIDIA's own paper Temporally Dense Ray Tracing. Really though, it's all in the name. The rendering doesn't create any proper frames any more, it just creates samples ... to be used by some backend to create actual frames, v-synced frames.

This would obviously kill G-Sync if adopted, which is painful.
Why would that kill gsync? You would just set the fps target the engine would be outputting. There is no reason why this target has to be specifically at vsync refresh.
 
Nvidia could find some other way to sell a special module to display OEMs. Maybe the new module could make the display itself frameless, so whenever the renderer creates samples the corresponding pixels on the display are immediately updated instead of waiting for an entire frame to be ready.

Nvidia already did this. The Nvidia-made gsync module is dead, but MediaTek has the license for it and they're building gsync functions and new gsync tech like Nvidia Pulsar into their display scalers. So it's third party, and it's fully integrated into the display scaler, a processor that every display already needs. The availability of gsync should actually grow, not shrink, because you don't need a separate costly module.

Nvidia doesn't look like they're abandoning gsync technology any time soon, and they seem to be growing a big partnership with MediaTek, who is doing all kinds of stuff with them (potentially a future desktop cpu).

From this summer:
 
Why would that kill gsync? You would just set the fps target the engine would be outputting. There is no reason why this target has to be specifically at vsync refresh.
Framegen has nearly deterministic processing time. When the position in time for the generated frames is decoupled from the rendered samples from the game engine, the FPS becomes not a target, but a parameter. If you can create approximate frames at arbitrary points in time reliably, you will create them at fixed intervals and display them v-synced, because it's the most sensible thing to do.
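A small simulation of what I mean by "FPS becomes a parameter" (my own toy illustration, not from any NVIDIA paper; `render_samples` and `reconstruct` are invented names): the renderer emits samples at irregular times, while reconstruction produces frames on a fixed v-sync grid from whatever samples have arrived.

```python
# Toy model: decoupled sample production vs. fixed-interval frame
# reconstruction. The renderer is not frame-paced at all; only the
# reconstruction step knows about the display's refresh interval.

from dataclasses import dataclass
import random

@dataclass
class Sample:
    t: float       # time the sample was shaded
    value: float   # stand-in for the shaded result

def render_samples(duration: float, mean_rate: float, seed: int = 0):
    """Emit samples at irregular arrival times (renderer is not frame-paced)."""
    rng = random.Random(seed)
    t, out = 0.0, []
    while t < duration:
        t += rng.expovariate(mean_rate)   # irregular inter-sample gaps
        out.append(Sample(t, rng.random()))
    return out

def reconstruct(samples, vsync_hz: float, duration: float):
    """Build one frame per fixed v-sync interval from the samples so far."""
    frame_dt = 1.0 / vsync_hz
    n_frames = round(duration * vsync_hz)
    frames, i = [], 0
    for k in range(1, n_frames + 1):
        t = k * frame_dt                  # frame time fixed by v-sync ...
        count = 0
        while i < len(samples) and samples[i].t <= t:
            count += 1                    # ... not by when rendering finished
            i += 1
        frames.append((t, count))
    return frames

frames = reconstruct(render_samples(0.1, 5000.0), vsync_hz=60.0, duration=0.1)
```

Every frame lands exactly on a 1/60 s boundary regardless of sample timing; the number of samples feeding each frame varies, but the presentation cadence never does.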
 
Framegen has nearly deterministic processing time. When the position in time for the generated frames is decoupled from the rendered samples from the game engine, the FPS becomes not a target, but a parameter. If you can create approximate frames at arbitrary points in time reliably, you will create them at fixed intervals and display them v-synced, because it's the most sensible thing to do.
Why is that any more sensible than displaying them at whatever refresh the display supports?
 
you will create them at fixed intervals and display them v-synced, because it's the most sensible thing to do.
Is not in opposition to
displaying them at whatever refresh the display supports

What no longer makes sense is VRR (ie. continuously adapting the refresh rate). Making VRR work as well as possible is what G-Sync is for.
 