CES 2025 Thread (AMD, Intel, Nvidia, and others!)

Huh? Nvidia has been the most vocal and active player tackling rendering pipeline latency in recent memory.
Yes ... after first increasing it by a frame.

You're also not really addressing that decreasing it by far more through frameless rendering would kill G-Sync ... which is painful.
 
Or NVIDIA's own paper Temporally Dense Ray Tracing. Really though, it's all in the name. The rendering doesn't create any proper frames any more, it just creates samples ... to be used by some backend to create actual frames, v-synced frames.

This would obviously kill G-Sync if adopted, which is painful.
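
To make the "samples in, v-synced frames out" split concrete, here is a minimal Python sketch of how such a decoupled pipeline could be organized. Everything in it (the sample buffer, the reconstruction stub, the 60 Hz cadence) is an illustrative assumption, not something taken from the paper:

```python
# Hypothetical sketch: the renderer emits shading samples continuously, and a
# separate backend resolves them into complete frames on a fixed v-sync cadence.
import time
import random
from collections import deque

VSYNC_HZ = 60
VSYNC_PERIOD = 1.0 / VSYNC_HZ

sample_buffer = deque()  # (timestamp, pixel_xy, color) produced by the renderer

def render_samples(now, budget=2000):
    """Renderer side: trace whatever the time budget allows and append the
    resulting samples; there is no notion of a 'frame' at this stage."""
    for _ in range(budget):
        xy = (random.randrange(1920), random.randrange(1080))
        sample_buffer.append((now, xy, (1.0, 1.0, 1.0)))

def reconstruct_frame(deadline):
    """Backend side: fold all samples up to the deadline into a full image
    (in practice a temporal/spatial filter, here just a count)."""
    used = [s for s in sample_buffer if s[0] <= deadline]
    remaining = [s for s in sample_buffer if s[0] > deadline]
    sample_buffer.clear()
    sample_buffer.extend(remaining)          # newer samples wait for the next frame
    return f"frame reconstructed from {len(used)} samples"

next_vsync = time.monotonic() + VSYNC_PERIOD
for _ in range(3):                           # simulate a few v-sync intervals
    while time.monotonic() < next_vsync:
        render_samples(time.monotonic())     # renderer keeps producing samples
        time.sleep(0.002)
    print(reconstruct_frame(next_vsync))     # present a v-synced frame
    next_vsync += VSYNC_PERIOD
```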
 
Or NVIDIA's own paper Temporally Dense Ray Tracing. Really though, it's all in the name. The rendering doesn't create any proper frames any more, it just creates samples ... to be used by some backend to create actual frames, v-synced frames.

This would obviously kill G-Sync if adopted, which is painful.
Nvidia could find some other way to sell a special module to display OEMs. Maybe the new module could make the display itself frameless, so whenever the renderer creates samples the corresponding pixels on the display are immediately updated instead of waiting for an entire frame to be ready.
 
Or NVIDIA's own paper Temporally Dense Ray Tracing. Really though, it's all in the name. The rendering doesn't create any proper frames any more, it just creates samples ... to be used by some backend to create actual frames, v-synced frames.

This would obviously kill G-Sync if adopted, which is painful.
Why would that kill gsync? You would just set the FPS target that the engine outputs at. There is no reason this target has to be specifically the vsync refresh rate.
 
Nvidia could find some other way to sell a special module to display OEMs. Maybe the new module could make the display itself frameless, so whenever the renderer creates samples the corresponding pixels on the display are immediately updated instead of waiting for an entire frame to be ready.

Nvidia already did this. The Nvidia-made gsync module is dead, but MediaTek has the license for it and they're building gsync functions and new gsync tech like Nvidia Pulsar into their display scalers. So it's third party and it's fully integrated into the display scaler, which is a processor every display already needs. The availability of gsync should actually grow, not shrink, because you don't need a separate costly module.

Nvidia doesn't look like they're abandoning gsync technology any time soon, and they seem to be growing a big partnership with MediaTek, who is doing all kinds of stuff with them (potentially a future desktop cpu).

From this summer:
 
Why would that kill gsync? You would just set the FPS target that the engine outputs at. There is no reason this target has to be specifically the vsync refresh rate.
Framegen has nearly deterministic processing time. Once the positions in time of the generated frames are decoupled from the samples rendered by the game engine, FPS stops being a target and becomes a parameter. If you can create approximate frames at arbitrary points in time reliably, you will create them at fixed intervals and display them v-synced, because it's the most sensible thing to do.
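
A toy illustration of "FPS as a parameter", under the assumption that the backend can synthesize an output for any timestamp; plain linear interpolation between the two nearest engine samples stands in for real frame generation, and all names and numbers are made up:

```python
# Toy sketch: the engine delivers samples at irregular times, but the output
# cadence is simply a chosen parameter.
def synthesize(samples, t):
    """samples: sorted (timestamp, value) pairs; returns the value at time t."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return v0 + w * (v1 - v0)
    return samples[-1][1]

engine_samples = [(0.000, 0.0), (0.013, 1.3), (0.031, 3.1), (0.044, 4.4)]  # irregular
TARGET_FPS = 120                  # a parameter, not something the engine must hit
step = 1.0 / TARGET_FPS
t = 0.0
while t <= engine_samples[-1][0]:
    print(f"present t={t:.4f} s value={synthesize(engine_samples, t):.2f}")
    t += step                     # frames land on a fixed, v-syncable grid
```

Note that the engine's irregular sample times never show up in the presentation schedule, which is the point about fixed intervals being the natural choice.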
 
Framegen has nearly deterministic processing time. Once the positions in time of the generated frames are decoupled from the samples rendered by the game engine, FPS stops being a target and becomes a parameter. If you can create approximate frames at arbitrary points in time reliably, you will create them at fixed intervals and display them v-synced, because it's the most sensible thing to do.
Why is that any more sensible than displaying them at whatever refresh the display supports?
 
you will create them at fixed intervals and display them v-synced, because it's the most sensible thing to do.
Is not in opposition to
displaying them at whatever refresh the display supports

What no longer makes sense is VRR (i.e. continuously adapting the refresh rate). Making VRR work as well as possible is what G-Sync is for.
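
To spell out the contrast with made-up per-frame costs: under VRR the refresh interval tracks whenever each rendered frame happens to finish, while a fixed-cadence framegen backend presents on the same grid regardless. A tiny sketch, purely illustrative:

```python
# Assumed per-frame render costs in milliseconds, for illustration only.
render_times_ms = [9.1, 14.7, 11.3, 18.2, 10.4]

# VRR: each refresh is scheduled when the frame completes, so intervals vary.
t = 0.0
for cost in render_times_ms:
    t += cost
    print(f"VRR   present at {t:6.1f} ms (interval {cost:.1f} ms)")

# Fixed cadence: generated frames are presented every 1000/120 ms regardless.
PERIOD_MS = 1000.0 / 120
for i in range(1, 8):
    print(f"fixed present at {i * PERIOD_MS:6.1f} ms (interval {PERIOD_MS:.2f} ms)")
```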
 
What no longer makes sense is VRR (i.e. continuously adapting the refresh rate). Making VRR work as well as possible is what G-Sync is for.
VRR makes, and will still make, a lot of sense. There will be legacy titles, there will be stuff like QMS, there will be latency-reducing things like Reflex that require VRR. The fact that something _can_ run at the vsync rate all the time doesn't mean it makes everything else obsolete.
 
Yet its abandonment in new feature titles would still hurt ... and really hurt the ability of G-Sync monitors to command a premium.

As long as they don't think/know that Intel (GFFE), AMD or Valve (VR) are going to force their hand, I think they'd rather not go that way.
 
I don't think that's a huge problem. NVIDIA does not make a lot of money from G-Sync (if any, considering the development cost). The fact that NVIDIA now works with MediaTek to integrate G-Sync into their scalers says a lot. G-Sync is now mainly a branding asset.
G-Sync modules had their place historically, back in the days when many LCD scalers were not very good and enabling VRR directly would cause serious quality degradation. Today that's rarely a problem and most VRR monitors work well enough for most people.
Also, VRR is still useful for other purposes, such as viewing video at odd frame rates. For example, showing 24 FPS movies on a 60 Hz LCD used to be a big problem. Obviously it isn't a problem on a 120 Hz or 144 Hz monitor, but it's still better with VRR, where you don't have to worry about the monitor's supported refresh rates at all.
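
The arithmetic behind that 24-on-60 example, with ideal timings assumed:

```python
# Why 24 FPS film on a 60 Hz panel judders, and why 120 Hz or VRR doesn't.
FILM_FPS, LCD_HZ = 24, 60
print(LCD_HZ / FILM_FPS)                     # 2.5 refreshes per film frame -> not an integer
# So 3:2 pulldown alternates holds of 3 and 2 refreshes per frame:
print(3 / LCD_HZ * 1000, 2 / LCD_HZ * 1000)  # 50.0 ms vs ~33.3 ms -> judder
# A 120 Hz panel divides evenly: every frame holds exactly 5 refreshes.
print(120 / FILM_FPS)
# With VRR the panel simply refreshes once per film frame, every 1/24 s.
print(1000 / FILM_FPS)                       # ~41.7 ms, independent of the panel's max rate
```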
 
Yet its abandonment in new feature titles would still hurt ... and really hurt the ability of G-Sync monitors to command a premium.

As long as they don't think/know that Intel (GFFE), AMD or Valve (VR) are going to force their hand, I think they'd rather not go that way.

That ship has already sailed. Gsync premium monitors are a dying breed.
 
G-Sync modules had their place historically, back in the days when many LCD scalers were not very good and enabling VRR directly would cause serious quality degradation. Today that's rarely a problem and most VRR monitors work well enough for most people.

Pulsar is still pretty new and needs to be integrated into the monitor. That they let MediaTek produce the hardware doesn't make it free, or make it any less of a brand builder for NVIDIA.
 
Let's see some figures on how much money gsync modules actually contribute to the bottom line. Until then, it's much ado about nothing.

Nvidia's bread and butter, by orders of magnitude more than anything else in their portfolio, is selling GPUs. It's going to be tough to convince me that gsync contributed even so much as a single percent to their overall EBITDA.
 
Let's see some figures on how much money gsync modules actually contribute to the bottom line. Until then, it's much ado about nothing.

Nvidia's bread and butter, by orders of magnitude more than anything else in their portfolio, is selling GPUs. It's going to be tough to convince me that gsync contributed even so much as a single percent to their overall EBITDA.
Probably impossible to come up with a figure, especially if MediaTek integrates it into their display controllers. At worst it was $500+ (the original G-Sync HDR module), but that was because that version was based on an FPGA (https://www.techpowerup.com/245463/nvidia-g-sync-hdr-module-adds-usd-500-to-monitor-pricing — an Altera FPGA costing $2000 in low quantities, estimated to drop to $500 when bought in volume).
 