AMD Vega Hardware Reviews

In the comments they said they will do the same comparison between the 980 and the 1080.

Saying the Vega improvements were not for "performance" is not right... You can even make the argument that Vega is a regression compared to Polaris. Right now a 14nm Fiji with only a tweak to clock higher would do as well as Vega. You can't tell me that's normal and that nothing is wrong.

Except that making a part clock 70% higher than its predecessor is anything but a "tweak". A node shrink alone does not get you even half of that; the rest comes from the architecture improvements. In other words, a Fiji shrink would not magically give you Vega's clock speeds. I'm an nVIDIA fan through and through, but it's obvious to me that Vega has been received with far too much negativity and accusation, without due respect for the fact that it's meant to compete with several GPUs from nVIDIA, not just GP104.
 
AMD also said that a lot of transistors were spent to make sure that Vega would clock as high as it does. That is also part of the "all new architecture", since Fury was not capable of clocking nearly as high, while Maxwell was already a nice overclocker. You cannot just make a GPU and hope it will clock high. It has to be engineered for it, and AMD succeeded in that.
Would Fury be the most appropriate comparison point for Vega's architectural changes?
Where would Polaris fit? It doesn't have the process-node disadvantage Fury does, yet AMD never mentioned a massive investment in revamping its pipeline for clocks, and its clocks already cut Vega's clock advantage significantly.
 
I did not see anyone asking questions about its architecture.
That was a joke, right?

The difference between Vega and Pascal is that AMD has been promoting Vega's new features as if they were the second coming.

With Pascal, Nvidia talked about incremental changes. The Pascal GTX 1080 white paper spends exactly two pages on architecture, and it's a pure recap of Maxwell. The rest covers optimizing critical paths, noise reduction in the high-speed DRAM I/Os, and some other optimizations and minor features.

I'm an nVIDIA fan through and through, but it's obvious to me that Vega has been received with far too much negativity and accusation, without due respect for the fact that it's meant to compete with several GPUs from nVIDIA, not just GP104.
The reason for the criticism is that there are cases where Vega is no better than GP104.
 
I'd like to see how they plan on disabling 4 SMs on the 1080. And if they don't, then the comparison is completely worthless, as the balance shifts around.
I would pay for developer drivers that allowed me to disable compute units, geometry pipelines, ROPs and memory. Currently you need to physically acquire low-end boards to test how well the game runs on them. How hard would it be to simply allow disabling units in order to emulate mid/low-end models with the high-end model?
 
How hard would it be to simply allow disabling units in order to emulate mid/low-end models with the high-end model?
Wouldn't there be all kinds of internal queues, buffers and so on that might differ in size between low and high end? Finding ways to software-disable everything that could have a performance impact might be tricky, and if the main benefit is just to appease game developers, there might not be that much incentive... :)
 
The reason for the criticism is that there are cases where Vega is no better than GP104.
AMD obviously overhyped Vega; that it launched with disabled features after having spent six months out of the fab shows this. We don't know when, or even if, AMD is going to enable said features. Plural, even.

That said, Vega is a step towards AMD's return to the high-end GPU business. It's not an altogether terrible re-entry; the chip has merits despite its flaws. :p The generation after this will be much improved, I'd think, and the one after that even more so, with the experience gained here.
 
Would Fury be the most appropriate comparison point for Vega's architectural changes?
Where would Polaris fit? It doesn't have the process-node disadvantage Fury does, yet AMD never mentioned a massive investment in revamping its pipeline for clocks, and its clocks already cut Vega's clock advantage significantly.

Well, the article that started this conversation tested it against Fiji, not Polaris.
 
That was a joke, right?

The difference between Vega and Pascal is that AMD has been promoting Vega's new features as if they were the second coming.

Again, as I said a few posts ago, features != performance. There are things in Vega that do not necessarily translate into more raw performance, but into more functionality/flexibility. Apart from the stupid "poor Volta" stunt, AMD never promised that Vega would compete with the GTX 1080 Ti. Blame the rabid fans, who to this day still believe in a miraculous driver bringing a 30% performance boost, lol.

The reason for the criticism is that there are cases where Vega is no better than GP104.

In gaming workloads. In non-gaming workloads it's far ahead of GP104, closer to GP102/GP100 territory, so much so that nVidia was forced to enable extra performance on the Titan Xp! Where is RPM in GP104? That's right, it doesn't exist. You can't just blindly compare two chips with vastly different feature sets and market targets to make your point. Or you can, if you had already made up your mind about the conclusion before looking at the facts, that is.
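For context on what RPM (Rapid Packed Math) buys you: it packs two FP16 operations into each FP32 ALU lane, doubling peak half-precision throughput. A rough back-of-the-envelope using published boost clocks (my numbers, not from the reviews): Vega 64 has 4096 ALUs × 2 ops/clock × ~1.55 GHz ≈ 12.7 TFLOPS FP32, so roughly 25 TFLOPS FP16 with RPM, while GP104 (GTX 1080) is 2560 ALUs × 2 × ~1.73 GHz ≈ 8.9 TFLOPS FP32 and runs FP16 at only 1/64 of that rate.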
 
AMD obviously overhyped Vega; that it launched with disabled features after having spent six months out of the fab shows this. We don't know when, or even if, AMD is going to enable said features. Plural, even.

While AMD did hype things a bit, one should look at the chip for what it is and make an objective analysis of it. If you do that analysis under the influence of all the hype, then that analysis is anything but objective and is emotionally tainted. After all, this is still Beyond3D, no? Or have we come down to the standards of NeoGAF et al.?


That said, Vega is a step towards AMD's return to the high-end GPU business. It's not an altogether terrible re-entry; the chip has merits despite its flaws. :p The generation after this will be much improved, I'd think, and the one after that even more so, with the experience gained here.

Yes, I think this is the more balanced and fair approach.
 
While AMD did hype things a bit, one should look at the chip for what it is and make an objective analysis of it.
Is that going to change anything? It is as big as GP102 but currently performs like GP104 with ~100 W higher power consumption.
 
The problem (for AMD, IMHO) is that the market which appreciates all those extra features, and is willing to compromise because of them, might be really small. Gamers usually want high framerates first, moderate-to-low noise second, and low power consumption a distant third (probably also depending on whether you pay your electricity bill yourself and what a kWh costs you). Not many there appreciate tech which just might turn out to be good to have in the long run. Again, IMHO.
 
Is that going to change anything? It is as big as GP102 but currently performs like GP104 with ~100 W higher power consumption.

Performs like GP104 in gaming; performs like GP102 in CAD tasks and compute. We don't know how GP104 would perform, and how much power it would consume, if it had the same features. Hence the comparison is not entirely fair.
 
The problem (for AMD, IMHO) is that the market which appreciates all those extra features, and is willing to compromise because of them, might be really small. Gamers usually want high framerates first, moderate-to-low noise second, and low power consumption a distant third (probably also depending on whether you pay your electricity bill yourself and what a kWh costs you). Not many there appreciate tech which just might turn out to be good to have in the long run. Again, IMHO.

The market is small but profitable. AI has only just started and is up for grabs. Why would AMD go only after the gaming market when such opportunities exist? It would be dumb of them. Honestly, the big problem with Vega is that it arrived very late. Had it been released around GTX 1080 time, it would not be having such a bad time. It's not a bad product in itself, especially Vega 56.
 
Wouldn't there be all kinds of internal queues, buffers and so on that might differ in size between low and high end? Finding ways to software-disable everything that could have a performance impact might be tricky, and if the main benefit is just to appease game developers, there might not be that much incentive... :)
I know. But they could allow the same kinds of downgrades they already ship on the same die; there aren't that many dies nowadays. I wouldn't mind if you had to re-flash the card when you changed the settings. Also, disabling memory (to emulate 1 GB, 2 GB or 3 GB cards) should be doable in the driver without any need to re-flash the card: the driver could simply report a lower memory amount and never allocate anything in the disabled memory area.
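As a rough illustration of the idea (hypothetical names only, not any real driver API), the driver-side logic would not need to be much more than clamping what gets reported and never handing out resources beyond the clamp:

```c
/* Hypothetical sketch only -- not a real driver API or setting.
 * Shows the idea of a debug profile that makes the driver advertise a
 * cut-down configuration so a high-end board can emulate a lower SKU. */
#include <stdint.h>
#include <stdio.h>

struct gpu_caps {
    uint32_t compute_units;  /* CUs/SMs exposed to the work scheduler */
    uint64_t vram_bytes;     /* memory the driver reports and allocates from */
};

/* Clamp the real capabilities to the developer-chosen emulation profile. */
static struct gpu_caps apply_emulation_profile(struct gpu_caps real,
                                               uint32_t max_cus,
                                               uint64_t max_vram)
{
    struct gpu_caps out = real;
    if (max_cus && max_cus < out.compute_units)
        out.compute_units = max_cus;   /* mask off the extra CUs */
    if (max_vram && max_vram < out.vram_bytes)
        out.vram_bytes = max_vram;     /* never allocate past this point */
    return out;
}

int main(void)
{
    struct gpu_caps real = { 64, 8ULL << 30 };  /* e.g. 64 CUs, 8 GB */
    struct gpu_caps emu  = apply_emulation_profile(real, 36, 4ULL << 30);
    printf("reporting %u CUs, %llu MiB of VRAM\n",
           emu.compute_units, (unsigned long long)(emu.vram_bytes >> 20));
    return 0;
}
```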
 
The problem is, they are not alone in those markets. And the more markets with different demands on your circuitry you try to cater to, the more flexible your hardware has to be. Ultimately, you'd end up with a CPU, and CPUs were driven out of the graphics rendering market ages ago and have never really entered the AI market at all. Why? Because more specialized chips do the job faster, cheaper, and more efficiently.

I realize that AMD might not be in a position to develop chips for all possible target markets, but that ultimately leads to having disadvantages in each market compared to the "specialist" chips.
 