AMD Vega 10, Vega 11, Vega 12 and Vega 20 Rumors and Discussion

Would the fury driver be a logical starting point for them given the similarity of the design?
 
Geeforcer
Yes, you are right; I had to Google that one again. It was in January, when they were showing Vega vs. the GTX 1080 in Doom.
 
Where did this come from?
From me. I can provoke this in Dark Souls 3 by playing with camera angles. This can sometimes lead to one or two triangles from a wall being removed, allowing me to see through. I do not clip the camera into a wall for this. This happens with walls that are 20m away from the camera. My R9 290 and R9 380 did not do that.
 
From me. I can provoke this in Dark Souls 3 by playing with camera angles. This can sometimes lead to one or two triangles from a wall being removed, allowing me to see through. I do not clip the camera into a wall for this. This happens with walls that are 20m away from the camera. My R9 290 and R9 380 did not do that.
I have seen videos of Vega at extremely high overclocks with a lot of missing geometry in tests like Time Spy and Fire Strike. The results were invalidated because of this.
 
When binning was added to the Linux drivers, it came with a note stating it was tuned for Raven. That would be at odds with Raven being behind in development.

On the other hand, the Windows drivers for Raven Ridge are still based on 17.7, crash a lot, and lack configurability, according to APUsilicon.

I have seen videos of Vega at extremely high overclocks with a lot of missing geometry in tests like Time Spy and Fire Strike. The results were invalidated because of this.

Buildzoid had this issue when doing LN2. For me, this happens in DS3 on the stock BIOS with stock settings.
 
From me. I can provoke this in Dark Souls 3 by playing with camera angles. This can sometimes lead to one or two triangles from a wall being removed, allowing me to see through. I do not clip the camera into a wall for this. This happens with walls that are 20m away from the camera. My R9 290 and R9 380 did not do that.
I don't know how you conclude this is from DSBR. It's not the only change between Vega10 and your old cards and could be a software or hardware bug.
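For what it's worth, here is a minimal, hypothetical sketch of one way a binning rasterizer's small-primitive cull could produce exactly this symptom. Everything here (the heuristic, the threshold value, the geometry) is made up for illustration; it is not AMD's actual DSBR implementation.

```python
# Illustrative sketch only, NOT AMD's actual DSBR logic. The bug shown:
# culling any triangle whose screen-space area falls below a threshold
# will also drop long, thin slivers that still cover pixel sample
# centers, which is exactly the kind of shape a distant wall triangle
# becomes at a grazing camera angle.

def signed_area(a, b, c):
    """Signed screen-space area of triangle abc, in pixels^2."""
    return 0.5 * ((b[0] - a[0]) * (c[1] - a[1])
                  - (c[0] - a[0]) * (b[1] - a[1]))

def covers_sample(tri, p):
    """True if sample point p lies inside (or on an edge of) tri."""
    a, b, c = tri
    d1, d2, d3 = signed_area(a, b, p), signed_area(b, c, p), signed_area(c, a, p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

AREA_CULL_THRESHOLD = 1.0  # hypothetical threshold, in pixels^2

def buggy_cull(tri):
    """Overly aggressive: area alone says nothing about sample coverage."""
    return abs(signed_area(*tri)) < AREA_CULL_THRESHOLD

# A wall triangle seen nearly edge-on: 15 px long but only 0.1 px tall.
sliver = ((0.0, 10.45), (15.0, 10.5), (0.0, 10.55))

print(buggy_cull(sliver))                  # True: culled by the heuristic
print(covers_sample(sliver, (0.5, 10.5)))  # True: yet it covers this sample
```

A correct binner has to base rejection on sample coverage, not raw area, which is why the point above stands: the same visible symptom could equally come from an unrelated software or hardware bug, not necessarily DSBR.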
 
Raja saying "Fury drivers" did happen, AFAIK, but it was a year ago, at the first public showing, when the silicon was fresh from the factory.

My recollection is that the information came out on the AMD subreddit, in the post for that initial performance video, possibly in an AMA that happened at the same time. I'm 100% sure that one of the AMD reps chimed in that the drivers were modified Fury drivers: debugging enabled, with some tweaks to ignore the errors.
 
My recollection is that the information came out on the AMD subreddit, in the post for that initial performance video, possibly in an AMA that happened at the same time. I'm 100% sure that one of the AMD reps chimed in that the drivers were modified Fury drivers: debugging enabled, with some tweaks to ignore the errors.
I thought it was an AMD rep at one of the editors' tech days, saying something along the lines of going back to Fury as the foundation for the new drivers.
 
My recollection is that the information came out on the AMD subreddit, in the post for that initial performance video, possibly in an AMA that happened at the same time. I'm 100% sure that one of the AMD reps chimed in that the drivers were modified Fury drivers: debugging enabled, with some tweaks to ignore the errors.
I find this whole “modified drivers” discussion very strange. As if every new GPU is supposed to have a whole new driver, which is of course not true.

So when it’s more a continuous progression as new features get added, where do you draw the line between “modified Fury driver” and “true Vega driver”?
 
I am sorry, but if, after six months, numerous driver releases (including thousands-of-man-hours efforts such as Adrenalin), AIB boards, etc., anyone is still holding out hope that magic drivers will arrive as soon as AMD gets around to it...

Your best bet is that some of these features need to be supported explicitly in software to be taken advantage of... of course, that also means developers need to pour resources into codepaths for all those Vegas actually out there and used for gaming. In other words, small chance outside of "partner" games, where devrel pretty much furnishes the code.
 
That explicitly goes against what Rys was saying.

Right, but I think what Geeforcer was saying is that explicit software support is still your best bet for performance increases: if those features were already enabled in the drivers while no performance improvements showed up, then they made zero impact. Well, except for giving people hope.
 
Right, but I think what Geeforcer was saying is that explicit software support is still your best bet for performance increases: if those features were already enabled in the drivers while no performance improvements showed up, then they made zero impact. Well, except for giving people hope.

That’s exactly what I was trying to say. IMHO, everything that COULD come from the drivers in existing software WOULD have already arrived. My position is that IF we are to see big gains going forward, they will have to come via work on the developers’ part, whether in the form of patches to existing software or resource investment into projects still in development. IF it’s possible at all.
 
That’s exactly what I was trying to say. IMHO, everything that COULD come from the drivers in existing software WOULD have already arrived. My position is that IF we are to see big gains going forward, they will have to come via work on the developers’ part, whether in the form of patches to existing software or resource investment into projects still in development. IF it’s possible at all.
Or it comes from fixing bugs that are holding things back. Just two weeks ago, one got fixed that would have been a showstopper for certain key aspects of binning and primitive shaders. Not that long ago, even FP16 between Vega and pre-Vega wasn't working due to opcode issues. It stands to reason there are some complicated shader-compiler issues that are somewhat beyond the scope of just the drivers; the LLVM toolchain with SM6+ probably takes some work and would likely be the priority for some contracts.

Even without those features, Vega has acceptable performance, and fixing those issues wouldn't address the current supply problems either. I'm not sure there is any urgency outside of Raven, which doesn't really have any competition in the APU market anyway. Intel doesn't dominate graphics market share because Iris is faster; it's just cheaper and in a smaller form factor.
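On the FP16 opcode point above: Vega's "rapid packed math" operates on two half-precision lanes stored in one 32-bit register (e.g. the `v_pk_add_f16` instruction). Here is a rough NumPy emulation of what that computes, just to illustrate the concept; this is not actual GCN ISA or driver code.

```python
# Concept sketch of packed FP16 (two half floats per 32-bit word),
# emulated with NumPy. Not GCN ISA behavior, just the arithmetic idea.
import numpy as np

def pack_half2(lo, hi):
    """Pack two float16 values into one uint32 (lo in the low 16 bits)."""
    h = np.array([lo, hi], dtype=np.float16).view(np.uint16)
    return int(h[0]) | (int(h[1]) << 16)

def unpack_half2(word):
    """Recover the two float16 lanes from a packed 32-bit word."""
    h = np.array([word & 0xFFFF, word >> 16], dtype=np.uint16)
    return h.view(np.float16)

def pk_add_f16(a_word, b_word):
    """Per-lane half-precision add, in the spirit of v_pk_add_f16."""
    out = (unpack_half2(a_word) + unpack_half2(b_word)).view(np.uint16)
    return int(out[0]) | (int(out[1]) << 16)

a = pack_half2(1.5, 2.25)
b = pack_half2(0.5, 0.75)
print(unpack_half2(pk_add_f16(a, b)))  # lanes: 2.0 and 3.0
```

The point of the opcode remark is that pre-Vega GCN lacks these packed instructions, so a compiler emitting them for the wrong target (or vice versa) produces broken FP16 codegen, which is plausibly the kind of toolchain bug described above.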
 