AMD Vega 10, Vega 11, Vega 12 and Vega 20 Rumors and Discussion

I did some testing with Battlefront 2. With an 1800X at 720p I also get 320 fps with my Vega and 100% GPU usage. In Battlefield 1 I get 100 fps and my GPU sits at around 60% usage.

At 4K and low settings I also get 100 fps. It would be very interesting to see how a Titan Xp or a 1080 Ti behaves at 4K and low settings.
 
GPU usage doesn't tell you much about how the chip is actually being stressed, especially with Vega: in GTA V I see 70% reported usage both at 100% power draw and at under 100 W. Either Vega isn't behaving well with Ryzen or the drivers aren't there yet; either way AMD has a lot of work to do.
 
The Battlefront numbers at 4K and low settings look strange to me. I only get 100 fps, but AMD has reworked the front end, so I'd expect to see more than 100 fps with hardly any shader load.

I'm really interested in how a 1080 Ti behaves at these settings.

More interesting stuff: I set Mesh Quality, Terrain Quality, and Terrain Groundcover Quality to Ultra at 4K (all three settings increase polygon count, LOD, and tessellation). At 4K I only lose about 10% performance, dropping from 105 fps to 95 fps.
 
I also tested 10K now. I get 56 fps at a worst-case spot, and with the three mesh settings set to Ultra I get 50 fps, which again is roughly a 10% performance loss for more polygons, LOD, and tessellation.
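For reference, the two drops work out to roughly the same relative cost from the numbers above:

Code:
(105 - 95) / 105 ≈ 9.5%   (4K)
(56 - 50) / 56   ≈ 10.7%  (10K)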
 
This is hugely disappointing, unless there are still pieces of the puzzle missing after all this time. Basically the feature isn't doing shit either way, going by this quick-and-dirty test.

AMD's continued stonewalling on the subject isn't good either.
 
I'll take "primitive shaders are still disabled" for 200.
Perhaps. And yet again we come back to why they're still disabled after so much hoopla was made over them for so long, now well over three months post-launch with no ETA.

All we have to fall back on is speculation like "hard things are hard", with no genuine sense of whether that's actually what's holding up the show.
 
We don't even know what exactly primitive shaders are. No details.
I guess hard things are hard.
They're not foolish enough to get burned by that a third time; the Vega FE and reference RX Vega launches already happened because of it.
 
This site did some testing on the effects of enabling Vega's primitive binning in Linux.

https://www.phoronix.com/scan.php?page=news_item&px=Vega-Prim-Binning-Test

The effect appears to be negligible, but there very well could be something more complicated at play.
It's tuned for Raven, and the game selection likely isn't that reflective of DSBR with all elements enabled. For example, many Linux games are DirectX titles wrapped in OpenGL and ultimately end up CPU bound, so even if the feature were enabled it might not show, and testing at 1080p/low certainly doesn't help matters.

Primitive shaders do seem situationally enabled, but if you look at that Linux patch, right under the DSBR part it says:
Code:
/* While it would be nice not to have this flag, we are constrained
 * by the reality that LLVM 5.0 doesn't have working VGPR indexing
 * on GFX9.
 */
That file controls the pipeline, and primitive binning for Raven is on by default there, which may be worth noting. It's not a stretch to think indexing could affect binning.
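Just to illustrate the kind of gating that file does, here's a minimal, purely hypothetical C sketch (none of these names are the real Mesa/radeonsi identifiers) of how a driver could decide whether binning gets turned on: a per-chip default that a single workaround flag can silently override.

Code:
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical chip families -- stand-ins, not the real driver enums. */
enum chip_family { FAMILY_VEGA10, FAMILY_RAVEN };

struct gpu_info {
    enum chip_family family;
    bool broken_vgpr_indexing; /* workaround flag, like the LLVM 5.0 note */
};

/* Decide whether primitive binning should be enabled:
 * on by default for Raven (as in the patch), off for Vega10 unless the
 * user forces it, and always off while the indexing workaround is set. */
static bool binning_enabled(const struct gpu_info *info)
{
    const char *force = getenv("FORCE_BINNING"); /* hypothetical override */

    if (info->broken_vgpr_indexing)
        return false;                   /* the workaround wins over everything */

    if (force != NULL)
        return strcmp(force, "1") == 0; /* explicit user override */

    return info->family == FAMILY_RAVEN; /* default: Raven only */
}

int main(void)
{
    struct gpu_info vega  = { FAMILY_VEGA10, false };
    struct gpu_info raven = { FAMILY_RAVEN, false };

    printf("Vega10 binning: %s\n", binning_enabled(&vega) ? "on" : "off");
    printf("Raven  binning: %s\n", binning_enabled(&raven) ? "on" : "off");
    return 0;
}

The point is just that one workaround flag early in a decision like this can keep a feature off on one chip while it ships enabled on another, which is why that comment sitting right under the DSBR code is interesting.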
 