AMD Vega Hardware Reviews

Does he say, Prim Shaders are enabled in the driver? Because that's what the debate was about, not whether you could argue that part of the primitive shaders is enabled because, under this or that circumstance, part of what contributes to a fully armed and operable space station... errr... DSBR/Prim Shader, is already working.

Taken to the extreme, you could say the NCUs are generally working and are part of the primitive shader calculations, thus part of the primitive shaders is already working.
 
Does he say, Prim Shaders are enabled in the driver? Because that's what the debate was about,
Actually, I was following the comment you made not to confuse primitive shaders and DSBR, and then @Anarchist4000's description of the relationship. It wasn't particularly a nod to whether they're enabled or not.
 
I'm not quite sure I would agree with that. Vega 56 when undervolted and overclocked appears to pretty clearly outperform an overclocked 1070:
[Image: vDUUA5H.png]

However, the drivers are still so raw that only limited undervolting and overclocking testing has been done so far. The undervolting testing I've seen seems to suggest that Vega 56 should be able to deliver ~15-20% gains from overclocking while undervolting to around 230 watts or even slightly less, depending on silicon-lottery results (see GN's review article and undervolting livestream). That is before even considering how much Vega's half-finished drivers might be leaving on the table right now.

No idea what they tested there, and it shows no power consumption for the RX 56 either. If you undervolt and overclock, you can go up to the power limit. Power-saving mode, however, reduces the power limit, and with it the average clock speed and the performance. With both limited to 170W, I am certain a 1070 will run circles around the RX 56.
 
I'm assuming the primitive shaders are partially responsible for setting up the bins, although the lines are a bit blurry and not well documented. I need to check the Linux notes, but I thought they merged the first stages, which I think are surface and primitive, into a single shader, with the DSBR picking up from there. Frustum culling would ideally happen prior to tessellation, when desirable, which isn't indicated, but it couldn't be done safely without some vertex shading. The DSBR probably relies on a primitive shader to pack primitives or patches to facilitate sorting and maybe compress data, but that occurs transparently through the driver. I'm speculating on that, as it's something the driver would insert, but roughly sorting patches would be much easier than sorting triangles, for example, as they'd be related spatially.
DSBR can be enabled without primitive shaders. Games often perform culling in the HS already.
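
To make the culling idea concrete, here's a toy sketch (Python, purely conceptual; it has nothing to do with AMD's actual shaders, and all the names and math are invented) of rejecting primitives before any binning or pixel work happens:

```python
# Conceptual sketch only: the kind of early per-primitive culling being discussed
# (frustum and back-face tests run before binning/rasterization ever sees the
# triangle). NOT AMD's shader code; every name and formula here is made up.

def is_backfacing(v0, v1, v2):
    """Screen-space signed area; <= 0 means back-facing with a CCW-front convention."""
    area = (v1[0] - v0[0]) * (v2[1] - v0[1]) - (v2[0] - v0[0]) * (v1[1] - v0[1])
    return area <= 0.0

def outside_frustum(verts, width, height):
    """Trivial reject: all vertices beyond the same screen edge."""
    return (all(v[0] < 0 for v in verts) or all(v[0] > width for v in verts)
            or all(v[1] < 0 for v in verts) or all(v[1] > height for v in verts))

def primitive_pass(triangles, width, height):
    """Discard primitives early so binning and pixel work never touch them."""
    return [tri for tri in triangles
            if not outside_frustum(tri, width, height)
            and not is_backfacing(*tri)]

# Example: visible, off-screen, and back-facing triangles; only one survives.
tris = [
    [(10, 10), (100, 10), (10, 100)],      # front-facing, on screen
    [(-50, -50), (-10, -50), (-50, -10)],  # entirely off screen
    [(10, 10), (10, 100), (100, 10)],      # wound the other way (back-facing)
]
print(len(primitive_pass(tris, 1920, 1080)))  # -> 1
```

Whether that kind of test runs in the HS, in a driver-generated primitive shader, or in fixed-function hardware is exactly the part that isn't documented.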
 
I really don't understand why people still act surprised.
I mean, it's a typical AMD launch: an awesome card launched with a cringy show, and the next day we learn that the drivers are worse than dog s**t and are probably in pre-alpha mode.
 
Does he say, Prim Shaders are enabled in the driver? Because that's what the debate was about, not whether you could argue that part of the primitive shaders is enabled because, under this or that circumstance, part of what contributes to a fully armed and operable space station... errr... DSBR/Prim Shader, is already working.

I'm taking that to mean AMD ultimately makes everything a primitive shader for the purpose of driver-side optimizations, even if it only performs the standard pipeline functions. Therefore primitive shaders are always enabled. I don't recall Mantor mentioning primitive shaders specifically in that interview, only laying out an optimization (deferred attribute interpolation) that would only work with a primitive shader. So I'm deducing that primitive shaders are enabled, only usable by AMD currently, and, given the shape of everything, not optimal. I realize I'm hedging a lot there, but that's what's been presented, from what I've seen. Like I said above, it's not clear where the stages begin and end, as they don't follow the standard pipeline structure. In the Linux driver they were creating giant monolithic shaders spanning many steps, so primitive and DSBR may very well be the same shader.
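
For what it's worth, here's roughly what deferred attribute interpolation buys you, as a toy Python sketch; it is not AMD's implementation, and all the names and structure are invented for illustration:

```python
# Toy illustration of deferred attribute interpolation: shade only positions up
# front, cull, and fetch the remaining vertex attributes only for primitives
# that survive. Invented names; not AMD's code.

def shade_positions(tri, positions):
    # Cheap pass: only what culling/binning needs.
    return [positions[i] for i in tri]

def shade_attributes(tri, colors, uvs):
    # Expensive pass: run only for surviving primitives.
    return [(colors[i], uvs[i]) for i in tri]

def submit(triangles, positions, colors, uvs, cull):
    binned = []
    for tri in triangles:                  # tri = (i0, i1, i2) vertex indices
        pos = shade_positions(tri, positions)
        if cull(pos):                      # culled triangles never pay the attribute cost
            continue
        binned.append((pos, shade_attributes(tri, colors, uvs)))
    return binned

# e.g. drop everything behind the camera (z <= 0), keep the rest
visible = submit([(0, 1, 2)],
                 positions=[(0, 0, 1), (1, 0, 1), (0, 1, 1)],
                 colors=["red", "green", "blue"],
                 uvs=[(0, 0), (1, 0), (0, 1)],
                 cull=lambda pos: all(p[2] <= 0 for p in pos))
print(len(visible))  # -> 1
```

The point is just that the expensive attribute work moves behind the culling decision, which is only possible once the driver controls the whole front end as one shader.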

DSBR, according to the whitepaper, works with the Energy benchmark. So there's at least that.

DSBR can be enabled without primitive shaders. Games often perform culling in the HS already.
What I'm saying is that even a vertex shader gets turned into a primitive shader by the driver, so primitive shaders are almost always used, unless you find a way to skip that.
 
I really don't understand why people still act surprised.
I mean, it's a typical AMD launch: an awesome card launched with a cringy show, and the next day we learn that the drivers are worse than dog s**t and are probably in pre-alpha mode.

Because it's not an awesome card when you factor in price, power consumption, and what Nvidia is doing (or did, 14 months ago, damn). And they skipped a high-end generation for this, while they're losing ground on FPS per watt compared to Nvidia (or even Fiji in some cases). It's not a good direction at all.
 
No idea what they tested there, and it shows no power consumption for the RX 56 either. If you undervolt and overclock, you can go up to the power limit. Power-saving mode, however, reduces the power limit, and with it the average clock speed and the performance. With both limited to 170W, I am certain a 1070 will run circles around the RX 56.

Gamers Nexus' launch-day article on Vega 56 shows that they were able to achieve a Firestrike score of ~22,000 with an undervolt to 1.025v, a memory overclock to 980MHz, and a 1525MHz core clock at a power draw of ~210W. This and Joker's testing in the video I linked above both seem to indicate that an overclocked and undervolted Vega 56 can offer a clear performance advantage over an overclocked 1070 at ~210-240W power consumption (depending on the specifics of the undervolt and overclock achieved).
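
Doing the quick math on those figures (a minimal sketch; the only number below not taken from the post is the 1471MHz figure, which is just AMD's published Vega 56 boost spec used as an assumed baseline):

```python
# Arithmetic on the quoted GN undervolt run; the stock boost clock is an
# assumed baseline (AMD's published spec), everything else is from the post.
firestrike_score = 22000
power_draw_w     = 210      # approximate power during the run
core_clock_mhz   = 1525
stock_boost_mhz  = 1471     # assumed stock Vega 56 boost spec

print(round(firestrike_score / power_draw_w, 1))               # ~104.8 points per watt
print(round((core_clock_mhz / stock_boost_mhz - 1) * 100, 1))  # ~3.7% over the boost spec
```

Keep in mind that stock cards often sustain clocks below the boost spec, so the effective uplift from the overclock is likely larger than that last number suggests.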

Unfortunately, as I said, there is still a fairly limited amount of testing on this subject out there due to how buggy overclocking and undervolting are in Vega 56's launch drivers.
 
The lengths some are willing to go to in order to make a competitive case for Vega are hilarious. And even then, the best you can do is make it look less sucky against a chip that is clearly one rung down from the chip it should actually be competing against.

Ask not whether Vega is a miserable failure, but why.
 
The lengths some are willing to go to in order to make a competitive case for Vega are hilarious. And even then, the best you can do is make it look less sucky against a chip that is clearly one rung down from the chip it should actually be competing against.

Or that people are looking at ways to extract more efficient performance out of a new product that they are interested in, because we're PC enthusiasts. For many people, the concept of winners and losers doesn't exist, only interesting hardware with both its benefits and flaws.
 
I look at Vega in two different ways. From a consumer POV, Vega 56 is decent for the price/performance and features it offers. Power consumption is off, but still reasonable for Vega 56. From AMD's POV, Vega is a really poor product. It barely competes with a significantly smaller chip using "bulk" memory. AMD's profit margin must be much worse than Nvidia's.

I don't understand how the process can be blamed when the raw specs are there (memory bandwidth, flops, features). Somehow Vega fails to deliver what one would expect based on paper specs.
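
To put rough numbers on the paper-spec point, here are the commonly published launch specs (rounded from memory, so treat them as approximate and purely illustrative):

```python
# Approximate published launch specs: (FP32 TFLOPS at boost clock, memory bandwidth GB/s)
specs = {
    "Vega 64":  (12.7, 484),
    "GTX 1080": (8.9, 320),
    "Vega 56":  (10.5, 410),
    "GTX 1070": (6.5, 256),
}

for amd, nv in [("Vega 64", "GTX 1080"), ("Vega 56", "GTX 1070")]:
    flops = specs[amd][0] / specs[nv][0]
    bw = specs[amd][1] / specs[nv][1]
    print(f"{amd} vs {nv}: {flops:.2f}x FP32 throughput, {bw:.2f}x bandwidth")
```

Roughly 1.4-1.6x the raw throughput and bandwidth on paper, yet delivered gaming performance lands at rough parity, which is exactly the gap being pointed out.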
 
The lengths some are willing to go to in order to make a competitive case for Vega is hilarious. And even then the best you can do is make it look less sucky against a chip that it is clearly one rung down from the chip it should actually be competing against.

Ask not is Vega a miserable failure, but why.

Welp, the 56 is cheaper than I can find a 1070 for, and it's faster and supports FreeSync, which is an open standard, and the monitors are cheaper. I think it competes just fine. The 64 seems to compete well against the 1080 and is priced cheaper there too.

Yes, I'd like to know why performance isn't better and if it's something drivers can fix. But right now it seems to work.
 
That is something I cannot read from the article. Even undervolted (with dubious stability in all applications), they measured RX Vega at 210W + 25W (cables + slot). That is more than the whole system running Firestrike on a 1070 FE, at 212W combined.
 
That is something I cannot read from the article. Even undervolted (with dubious stability in all applications), they measured RX Vega at 210W + 25W (cables + slot). That is more than the whole system running Firestrike on a 1070 FE, at 212W combined.


Well, you know, my system eats more than 700W without OC (220W x2 for the GPUs, 140W for the CPUs, along with a 24V pump, plus fans, 3 SSDs, and 4 HDDs)...
 
Welp, the 56 is cheaper than I can find a 1070 for, and it's faster and supports FreeSync, which is an open standard, and the monitors are cheaper. I think it competes just fine. The 64 seems to compete well against the 1080 and is priced cheaper there too.
Well, the Vega 56 isn't out yet, and the initial pricing of $399 seems to be irrelevant now. It will likely release at $499 with bundled games at the very least.
 
Does Vega still have the primitive discard accelerator introduced in Polaris or would the primitive shaders (in theory) make that obsolete?
 
Does anyone know of anyone who has tested RX Vega vs. Fury X at the same clock speed? I know Gamers Nexus did the test with Vega FE (and those results were abysmal).
 
Or that people are looking at ways to extract more efficient performance out of a new product that they are interested in, because we're PC enthusiasts. For many people, the concept of winners and losers doesn't exist, only interesting hardware with both its benefits and flaws.
That's fine; my point is that this chip is a major letdown from an engineering standpoint. This sucks for everyone except NVIDIA.
Welp, the 56 is cheaper than I can find a 1070 for, and it's faster and supports FreeSync, which is an open standard, and the monitors are cheaper. I think it competes just fine. The 64 seems to compete well against the 1080 and is priced cheaper there too.
Is the 56 cheaper than the 1070? I can't find it listed on Newegg right now. And Vega 64 is not cheaper than the 1080; that is complete BS.
 
Does anyone know of anyone who has tested RX Vega vs. Fury X at the same clock speed? I know Gamers Nexus did the test with Vega FE (and those results were abysmal).
Abysmal? At worst it lost by just a tad, and even then probably mostly due to lower memory bandwidth (no, they didn't compensate for that); at best it beat the Fury X silly. I'm not sure how that's "abysmal".
 