CES 2025 Thread (AMD, Intel, Nvidia, and others!)

Show me some figures for how much money G-Sync modules actually contribute to the bottom line. Until then, it's much ado about nothing.

It's not about the money, it's about the branding. There is goodwill in interpolation and VRR ... and there is no pressing need to ditch that.

If they become aware of, say, Intel cooperating with some major tentpole game to force the issue, I'm sure they'll try to head them off at the pass. If VR keeps growing slowly, that might do it too ... eventually.
 
So you can pay extra for G-Sync ... which is only necessary for older and inferior games, while the new tentpole games you put nearly all your PR effort into obviate the need for it.

It would not be a good look, branding-wise. Not a look they are racing to manoeuvre into, unless forced to. Which could happen.
 
The G-Sync brand definitely means less than it did, because other companies have caught up in certain aspects or technologies. Then you have competing standards/certifications for HDR, like the VESA ones, which are likely better anyway. But G-Sync isn't totally irrelevant, because you have stuff like Nvidia Pulsar coming out, and it still has arguably the best variable overdrive. OLED monitors catching on does put a huge damper on a lot of the G-Sync space for premium monitors. I expect the brand name to keep weakening, but some of the tech to actually become more widely available because of the MediaTek partnership. Maybe we'll see the latency-measuring stuff, or wider VRR ranges on OLEDs. Long term, G-Sync is probably not a big play anymore. It's hard to convince gamers to pay the premium for the module, which is why it's now licensed out.
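For anyone curious what "variable overdrive" actually means mechanically, here's a rough Python sketch of the idea. Everything in it (the calibration table, the function names) is made up for illustration, not any vendor's actual scaler firmware: the point is just that the overdrive gain gets re-picked every frame from the current frame time, instead of being tuned once for a single fixed refresh rate.

```python
# Hypothetical sketch of variable overdrive under VRR. With a fixed refresh
# rate you tune one overdrive setting; with VRR the frame time varies, so the
# controller interpolates a gain per frame. Calibration values are invented.
CALIBRATION = [
    (4.2, 1.35),   # ~240 Hz frame time: aggressive overdrive, pixel must settle fast
    (6.9, 1.20),   # ~144 Hz
    (16.7, 1.05),  # ~60 Hz: mild overdrive; a 240 Hz tuning here would overshoot
]

def overdrive_gain(frame_time_ms: float) -> float:
    """Linearly interpolate the overdrive gain for the current frame time."""
    if frame_time_ms <= CALIBRATION[0][0]:
        return CALIBRATION[0][1]
    if frame_time_ms >= CALIBRATION[-1][0]:
        return CALIBRATION[-1][1]
    for (t0, g0), (t1, g1) in zip(CALIBRATION, CALIBRATION[1:]):
        if t0 <= frame_time_ms <= t1:
            w = (frame_time_ms - t0) / (t1 - t0)
            return g0 + w * (g1 - g0)

def drive_level(current: int, target: int, frame_time_ms: float) -> int:
    """Overshoot the requested pixel transition in proportion to the gain."""
    g = overdrive_gain(frame_time_ms)
    return max(0, min(255, round(current + g * (target - current))))
```

The reason a fixed tuning fails under VRR is visible in the table: the right amount of overshoot depends on how long the pixel holds its value, which is exactly what varies frame to frame.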
 
The G-Sync brand means either verified compatibility or high-end features these days. It's not that different from what it was previously. Even if we entertain the idea that the industry will decide to move backwards and remove VRR support from monitors at some point, I don't see why the G-Sync brand would go away. They'll just use it for different things.
 
Which then takes us full circle back to this:

Or NVIDIA's own paper, Temporally Dense Ray Tracing. Really though, it's all in the name. The rendering doesn't create any proper frames anymore; it just creates samples ... to be used by some backend to create actual frames, v-synced frames.
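To make the "samples, not frames" idea concrete, here's a toy Python sketch. The class names and the compose step are hypothetical stand-ins, not the paper's actual pipeline: the renderer shades samples continuously, untethered from the display, and a backend composes a complete frame at every fixed v-sync tick.

```python
# Toy model: rendering produces a stream of samples; presentation is a
# separate fixed-cadence consumer. All names here are illustrative.
import time

VSYNC_HZ = 120
FRAME_BUDGET = 1.0 / VSYNC_HZ

class SampleStream:
    """Renderer side: shades samples continuously, never whole frames."""
    def __init__(self):
        self.samples = []
    def shade_some_samples(self):
        # A real system would pick the most valuable pixels to (re)shade;
        # a timestamp stands in for a shaded sample here.
        self.samples.append(time.perf_counter())

class Reconstructor:
    """Backend side: turns whatever samples exist into a complete frame."""
    def compose_frame(self, samples):
        # Stand-in for reprojection/accumulation/denoising into a full image.
        return f"frame built from {len(samples)} samples"

stream, backend = SampleStream(), Reconstructor()
next_vsync = time.perf_counter() + FRAME_BUDGET
for _ in range(3):  # three display refreshes
    while time.perf_counter() < next_vsync:
        stream.shade_some_samples()      # render loop runs untethered
    print(backend.compose_frame(stream.samples))
    stream.samples.clear()
    next_vsync += FRAME_BUDGET           # presentation stays locked to v-sync
```

Note where the decoupling happens: frame pacing lives entirely in the presentation loop, so there is no variable frame delivery left for VRR to smooth over.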

This would obviously kill G-Sync if adopted, which is painful.

If it meant NVIDIA could sell a whole new generation of cards and software for a quarter trillion dollars, they'd not even blink while killing G-Sync, if it really ever came to that. At the same time, in such a future, the death of G-Sync is still quite overstated IMO. Millions of games and applications still work on rasterized individual frames, all of which could theoretically benefit from G-Sync-like capabilities. By then, output devices with ~1 kHz refresh rates would have us at the point where the refresh rate is functionally beyond being any sort of relevant bottleneck to begin with.
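The arithmetic behind that last point, as a quick sanity check:

```python
# With a fixed-cadence display, a finished frame waits at most one refresh
# interval for the next scanout, so the worst-case latency penalty versus
# perfect VRR shrinks directly with the rate.
for hz in (60, 144, 360, 1000):
    worst_wait_ms = 1000.0 / hz
    print(f"{hz:>5} Hz: worst-case wait for next refresh ~ {worst_wait_ms:.2f} ms")
# 60 Hz  -> ~16.7 ms worst case: very much worth VRR
# 1000 Hz -> ~1 ms worst case (~0.5 ms on average): below what VRR could
# meaningfully claw back
```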
 
Why would Nvidia care about G-Sync? It's not doing anything for their bottom line.
Indeed, NVIDIA would sacrifice it in a heartbeat if it meant they could redefine frames per second like that.

But anyway, NVIDIA won't drop G-Sync in that case; they will just evolve it to mean something different. For example, NVIDIA evolved it from VRR with variable overdrive and 0 Hz to max Hz coverage, to VRR with 1000-nit HDR and a multi-zone backlight under the "G-Sync Ultimate" brand ... simultaneously, they were also doing software VRR in laptops, which they backported to desktops under the "G-Sync Compatible" brand. And now they are evolving it to mean adaptive pulse modulation under the "G-Sync Pulsar" brand.
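As a rough illustration of what "adaptive pulse modulation" has to solve (the constants and names below are invented, not Pulsar's actual tuning): with one backlight strobe per refresh and a varying refresh interval, a fixed pulse width would make perceived brightness flicker as the frame rate moves. Scaling the pulse width to hold the duty cycle constant keeps brightness steady.

```python
# Sketch: strobing on top of VRR. Perceived brightness tracks the duty
# cycle (lit time / frame time), so hold duty constant as frame time varies.
TARGET_DUTY = 0.10  # illustrative: backlight lit for 10% of each frame

def pulse_width_ms(frame_time_ms: float) -> float:
    """Scale the strobe pulse with the frame interval to hold brightness."""
    return TARGET_DUTY * frame_time_ms

for ft in (6.9, 10.0, 16.7):  # ~144 Hz, 100 Hz, and 60 Hz frame times
    print(f"frame {ft:>4.1f} ms -> pulse {pulse_width_ms(ft):.2f} ms")
```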

In the future, I expect them to morph it into something completely different, maybe something related specifically to OLEDs or Mini-LEDs/Micro-LEDs or whatever; they will find a way to advance tech in premium displays and will adjust the G-Sync branding accordingly.
 
I just realized that NVIDIA is the only one left trying to advance PC display tech. They were the first with variable refresh rate, then they were the ones promoting very high quality HDR implementations (instead of the tons of shitty ones), they developed Ultra Low Motion Blur and advanced it to Pulsar, and they are fiercely pushing for higher and higher Hz displays with less and less latency. Other vendors stopped doing that years ago ... the last time AMD talked about FreeSync was about 5 years ago, when they announced their Premium Pro branding. After that, they went radio silent on any new display technologies.

This is sad. I remember the times when AMD dabbled in new display tech and introduced the awesome Eyefinity, and I remember when AMD released graphics demos showing off new techniques with every new architecture launch. Now we just have one company doing the heavy lifting across the entire PC spectrum (from monitors to software to graphics to video, etc.).

Bring those days back!
 