AMD demonstrates FreeSync, G-Sync equivalent?

Excellent stuff. Now hopefully Nvidia will be forced to follow suit with their own free implementation. Either that, or AMD starts properly supporting stereoscopic 3D gaming so that I can switch teams.
 
Excellent stuff. Now hopefully Nvidia will be forced to follow suit with their own free implementation. Either that, or AMD starts properly supporting stereoscopic 3D gaming so that I can switch teams.

It's not AMD's fault if devs support 3D Vision rather than D3D's own implementation, which AMD supports?
 
D3D's own implementation
There's a D3D implementation?

which AMD supports?
It doesn't really, hence the need for companies like TriDef.
There were originally two companies making 3D drivers for AMD cards. One has already dropped out; what if the other decides to follow suit?
Is it wise for AMD to leave 3D support in the hands of a small independent company?
 
Not all games require TriDef, only those that don't natively support 3D. For example, Tomb Raider supports stereoscopic 3D on AMD cards without TriDef.
 
Excellent stuff. Now hopefully Nvidia will be forced to follow suit with their own free implementation. Either that, or AMD starts properly supporting stereoscopic 3D gaming so that I can switch teams.

Since it's now part of the VESA standard, Nvidia doesn't have to come up with their own free implementation. They just have to use the standard.

Regards,
SB
 
Since it's now part of the VESA standard, Nvidia doesn't have to come up with their own free implementation. They just have to use the standard.

Regards,
SB

And take a huge hit on the G-Sync R&D? I don't really foresee Nvidia going the Adaptive-Sync route.

Edit: NM, you were just stating they could, not that they would.
 
The only problem is AMD promising to deliver in 6-12 months, which really means it is not a priority for the company right now. This could probably drag on for more than 12 months.

AMD is also saying most of their GCN 1.1 parts support the tech, so that rules out all GCN 1.0 hardware (even in the R 2xx lineup) and some 1.1 parts too, which is pretty strange considering they claimed all of their hardware already supported the tech.

http://www.anandtech.com/show/8008/...andard-variable-refresh-monitors-move-forward
 
AMD is also saying most of their GCN 1.1 parts support the tech, so that rules out all GCN 1.0 hardware (even in the R 2xx lineup) and some 1.1 parts too, which is pretty strange considering they claimed all of their hardware already supported the tech.
No. The GCN 1.1 parts listed are the first products that will support it. AMD has not formally ruled out GCN 1.0 parts; they just aren't currently on the supported list.

Will they be? At this time I don't know. But please don't read more into this than what has been stated.
 
Since it's now part of the VESA standard, Nvidia doesn't have to come up with their own free implementation. They just have to use the standard.

Regards,
SB

Yeah, that's what I meant: support it in drivers. I wouldn't rule it out. It may mean a loss on the G-Sync R&D, but Adaptive-Sync is a huge selling point, and if it's going to be available to AMD users at a much lower cost, that might cost NV sales, which may make the loss on the G-Sync R&D worth it.
 
Yeah, that's what I meant: support it in drivers. I wouldn't rule it out. It may mean a loss on the G-Sync R&D, but Adaptive-Sync is a huge selling point, and if it's going to be available to AMD users at a much lower cost, that might cost NV sales, which may make the loss on the G-Sync R&D worth it.


Is Adaptive-Sync actually the same as G-Sync? From what I understand, Adaptive-Sync will work great if frame times are predictable, but will suffer latency/lag problems when frame rates vary.
 
Is Adaptive-Sync actually the same as G-Sync? From what I understand, Adaptive-Sync will work great if frame times are predictable, but will suffer latency/lag problems when frame rates vary.

If that's the case, then NV could conceivably offer both options if they can demonstrate a significant enough quality difference.
 
Actually, G-Sync has a small performance penalty (read the AnandTech or Tom's review, IIRC). Adaptive-Sync doesn't necessarily have that same issue.
 
Actually, G-Sync has a small performance penalty (read the AnandTech or Tom's review, IIRC). Adaptive-Sync doesn't necessarily have that same issue.


We're talking about a few percent performance penalty for G-Sync; it's really just noise.

The problem with Adaptive-Sync, as I understand it, is that the GPU has to guess in advance how long it will take to render a frame, and set the VBLANK interval accordingly. So things work great until the framerate dips, at which point Adaptive-Sync's guesses are wrong, and so we have to repeat frames, introducing extra frames of latency. This extra latency happens at the exact moment when the user is most sensitive to jerkiness, because the framerate has tanked.
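Here's a toy sketch of what I mean (entirely hypothetical numbers and function, not how any real driver works; the "prediction" here just reuses the previous frame's render time):

```python
# Toy model of the scheme described above: the refresh window is set from a
# *guess* at the next frame's render time. When the guess is too low (the
# framerate dips), the frame misses its window and the previous frame is
# repeated, adding a full refresh of extra latency.

def count_repeated_frames(frame_times_ms, refresh_ms=16.7):
    guess = refresh_ms                       # initial prediction
    repeats = 0
    for actual in frame_times_ms:
        window = max(guess, refresh_ms)      # VBLANK interval set in advance
        if actual > window:                  # frame wasn't ready in time:
            repeats += 1                     # repeat the old frame (+latency)
        guess = actual                       # predict next from last frame
    return repeats

# Steady 60 fps: the guesses hold, nothing is repeated.
print(count_repeated_frames([16.7] * 10))                     # 0
# Framerate dips to ~30 fps: the transition frame misses its window.
print(count_repeated_frames([16.7, 16.7, 33.0, 33.0, 16.7]))  # 1
```

Note the repeat lands exactly on the dip, which is the "extra latency at the worst moment" problem.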

But feel free to correct me if I'm wrong.
 