AMD: Speculation, Rumors, and Discussion (Archive)

Status
Not open for further replies.
I'm also kinda forced to wait on Vega since I've got a Freesync monitor.

That's a scary situation. I hold no bias to either graphics card maker, but I still don't want to pick the "wrong" standard and get HD DVDed. I'm not suggesting that'll happen to freesync (or G-sync), but the idea of potentially being locked to any one graphics card brand for the (long) life of a monitor is a scary one.
 
Except that Adaptive Sync, I believe, is already a standard that has been adopted into DisplayPort, while G-Sync is proprietary (and thus not a standard). So if someone does care about standards, there is only one choice. Anyway, for laptops, even Nvidia skips its G-Sync module and uses the standard built into eDP. Basically, Nvidia could support Adaptive Sync if it wanted to, but other GPUs can't support G-Sync.
 
That's a scary situation. I hold no bias to either graphics card maker, but I still don't want to pick the "wrong" standard and get HD DVDed. I'm not suggesting that'll happen to freesync (or G-sync), but the idea of potentially being locked to any one graphics card brand for the (long) life of a monitor is a scary one.
Well, FreeSync™ is just AMD's software that makes use of the VESA Adaptive Sync standard, which Intel is also adopting, so the only time you are locked to a platform is when you purchase Nvidia G-Sync.
Nvidia is the odd duck here, locking people out if they don't buy G-Sync at a premium.
 
Adaptive Sync's price premium on monitors is on its way to becoming negligible, so it's not that one is "forced to stay with AMD", but rather that one "chose not to pay" for Nvidia's proprietary, overly expensive and rare solution.

I'll bet that 2nd-gen Pascal will bring adaptive sync support from day one, but there's a chance they will be dicks and never bring the feature to the current cards.
 
Proprietary stuff was bad when Nvidia was the little dog and big-dog 3dfx ruled the market with Glide. Today, G-Sync is only NV's third or fourth major proprietary scheme, after CUDA, PhysX and possibly also GameWorks (depending on which was introduced first, I can't remember).

You'd think you wouldn't have to twist a corporation's arm to have them abide by their own standards, but apparently we're expected to feel it's natural for them to do a complete 180 on their position once they become dominant in the marketplace. It's the law of the jungle (sociopaths, really, I'm figuring)...
 
I am not a fanboy of any company. I have both G-Sync and FreeSync monitors. I have both AMD and Nvidia GPUs. I have used both V-Sync and Fast Sync. In the end, the only thing I care about is the technology and the user experience.

Nvidia is very protective of its image and brand. They hand-pick panels for G-Sync monitors. They develop technology, custom FPGAs, Fast Sync, etc., to give their customers a better experience. Nvidia is a company and as such will try to protect its IP (think GameWorks) and make money off of its technology. I think it is fine to expect an ROI. I understand that some people are a bit upset and think G-Sync is unnecessarily expensive. I respectfully disagree. They are just trying to give their customers the best experience possible. I hope that Nvidia will change its stance on Adaptive-Sync, and knowing Nvidia, they might if the market demands it.

AMD is doing what makes sense to them. They are taking advantage of a technology standardized by VESA by utilizing and improving on it (through their new Crimson drivers). FreeSync is a good solution but not the best solution. AMD is trying to play off people's anger at the cost of G-Sync by effectively renaming Adaptive-Sync to AMD FreeSync. I want AMD to do well, but I have serious concerns about their direction as a company. I personally believe that AMD might not recover even with their newest GPU (Polaris) and CPU (Zen) architectures.
http://rebrn.com/re/lets-talk-about-v-sync-free-sync-g-sync-adaptive-sync-and-fast-s-2639948/
 
That's a scary situation. I hold no bias to either graphics card maker, but I still don't want to pick the "wrong" standard and get HD DVDed. I'm not suggesting that'll happen to freesync (or G-sync), but the idea of potentially being locked to any one graphics card brand for the (long) life of a monitor is a scary one.

My monitor comes in both FreeSync and G-Sync variants (Acer XR341CK). The otherwise identical G-Sync variant costs ~200 euros more. I would have had the money to go nVidia, but it would have meant paying more for the very same feature.

But if one already has a capable nVidia card inside, and the monitor manufacturer doesn't release such identical models (enabling direct comparisons)... then G-Sync probably doesn't look like such a bad idea.
 
That's a scary situation. I hold no bias to either graphics card maker, but I still don't want to pick the "wrong" standard and get HD DVDed. I'm not suggesting that'll happen to freesync (or G-sync), but the idea of potentially being locked to any one graphics card brand for the (long) life of a monitor is a scary one.

It's a funny comment when only Nvidia is trying to lock down their customers with this feature; the other one is an open standard.
 
Proprietary stuff was bad when Nvidia was the little dog and big-dog 3dfx ruled the market with Glide. Today, G-Sync is only NV's third or fourth major proprietary scheme, after CUDA, PhysX and possibly also GameWorks (depending on which was introduced first, I can't remember).

You'd think you wouldn't have to twist a corporation's arm to have them abide by their own standards, but apparently we're expected to feel it's natural for them to do a complete 180 on their position once they become dominant in the marketplace. It's the law of the jungle (sociopaths, really, I'm figuring)...

On those four? PhysX (from Ageia), CUDA (mostly based on Ageia code), GameWorks (a bit strange to count, as most of its features were introduced separately under other names), and then G-Sync.
 
Well, FreeSync™ is just AMD's software that makes use of the VESA Adaptive Sync standard, which Intel is also adopting, so the only time you are locked to a platform is when you purchase Nvidia G-Sync.
Nvidia is the odd duck here, locking people out if they don't buy G-Sync at a premium.

I know that's how things are "supposed" to work, but history has shown us that it doesn't always work that way. For example, HD DVD was developed by the DVD Forum and was supposed to be the "official" successor to DVD, but Sony managed to make Blu Ray succeed despite being the "odd duck". Meanwhile, Nvidia is technically the "odd duck", but they own like 80% of the consumer graphics market and they have tremendous mindshare in the pro market as well. I'm not saying that G-Sync will succeed, only that it's not a done deal that it won't (and vice versa).

Adaptive Sync's price premium on monitors is on its way to becoming negligible, so it's not that one is "forced to stay with AMD", but rather that one "chose not to pay" for Nvidia's proprietary, overly expensive and rare solution.

I'll bet that 2nd-gen Pascal will bring adaptive sync support from day one, but there's a chance they will be dicks and never bring the feature to the current cards.

When we're trying to predict the behavior of a company like Nvidia, my bet is definitely on whatever outcome lets Nvidia "be dicks". It's just a general rule of mine. Nvidia has enough control over the consumer graphics card market that I don't think they need to roll over quite yet.

EDIT - I feel like I may have just derailed this otherwise upstanding thread. Sorry about that... :/
 
It would appear that both Vega and Volta are new compute architectures (though with how broken and lacking in D3D12 features Polaris is, it would seem that Vega needs to be more than just a new compute architecture).

By the way, I can't help thinking it's no coincidence that AMD has named Vega with a V, to match Volta. And, erm, P(olaris/ascal). Sigh. I can't remember if Newton (matched by Navi) comes after Volta though...
 
Vega is a family of GPUs, so for example Vega 10 launching in February and Vega 11 in May could conceivably fit their wording.

This. There are two Vega chips, so claiming that Vega will be released in H1 2017 can simply mean that cards with both GPUs will be out by the end of June, and not that both will be released at the end of Q2.

AMD is still taking a lot of time getting out a competitor to GP104 and GP102, and that has been screwing with GPU value for consumers, but that slide doesn't mean this competitor won't be out until June next year.
 
This. There are two Vega chips, so claiming that Vega will be released in H1 2017 can simply mean that cards with both GPUs will be out by the end of June, and not that both will be released at the end of Q2.

AMD is still taking a lot of time getting out a competitor to GP104 and GP102, and that has been screwing with GPU value for consumers, but that slide doesn't mean this competitor won't be out until June next year.

I don't think anyone is trying to say that Q1 isn't part of 1H. That would be silly. I think some folks mean that AMD's wording implicitly suggests that a Q1 release (be it little Vega, big Vega, or both) is unlikely.

Of course, AMD could use little Vega right about now, because GP104 prices are hilariously unchecked at the moment. So it would definitely benefit AMD to have some little Vega cards on the market ASAP, e.g. Q1 2017. The need is 1000% there and we would all be happy to see it happen, but their wording doesn't inspire confidence for a Q1 release.
 