AMD Vega 10, Vega 11, Vega 12 and Vega 20 Rumors and Discussion

If we look at Radeon history, it's full of nVidia killers, starting with, of course, R300. RV770, Cayman, and Polaris were merely outliers.
You see, in consumer market you need mindshare to compete.
To regain mindshare you need to launch something very, very killy and do it VERY loudly.
And since RTG failed to do that again and again, they will forever bear the mark of a cheapo brand for the poor.
 
RV770 was certainly a GTX 280 killer from a price/performance point of view. You got almost as much bang (like what, 80% of it on average?) for like HALF the price. The HD 4890 boards got crazy cheap towards the end, and today anything in the same performance class costs like three times as much. :p
 
R300 was the product of ArtX, or am I wrong? In recent years, is RTG getting more talented guys, or are they losing them to nVidia, Apple, etc.?
 
In recent years, is RTG getting more talented guys, or are they losing them to nVidia, Apple, etc.?
I think a lot of people are moving from company to company all the time. One guy goes from Google to Apple to Tesla to some new startup, etc on and on round and round. :D

The same seems to be the case in games development too, where some people have been at like 4, 5, 6 different big-name studios over the years and so on. Maybe they get hired for a project, finish it, then move on. That guy whatsisface, who helped AMD architect both K7 and Zen, seems to be a chronic vagabond. He left the company both times before the CPU he helped design was even released IIRC.
 
The drivers/fw/sw side of AMD is just plain sad. For instance, check AMD's ROCm issues at GH... I'm wondering who would pick Vega/AMD with this kind of support.
Forgive my ignorance but what is "GH"? GitHub?
Could you (or someone) expand a little bit on that?
 
R300 was real nice for its time. Kicked butt in Doom 3 at 1280*1024. The RAM chips on it overclocked well too, but got hot as hell. I put sinks on them and had a 120mm cardslot cooling fan pointed down at the board. The board was lying around for a long time, but I got rid of it some years ago because you just can't keep everything. Sad but true. :p
 
Actually, I think it was only HBCC, if it's the demo I'm thinking of.
Yes. And on a board with 2GB of memory enabled, not 8GB (or 16GB as on Vega FE).
There are figures for DSBR in their Vega white paper, though.

I'd be willing to bet that DSBR is alive and kicking, and one should be able to find it with a targeted benchmark. But expectations around HBCC, primitive shaders and DSBR are IMO seriously in lala land.
 
Yes. And on a board with 2GB of memory enabled, not 8GB (or 16GB as on Vega FE).
There are figures for DSBR in their Vega white paper, though.

I'd be willing to bet that DSBR is alive and kicking, and one should be able to find it with a targeted benchmark. But expectations around HBCC, primitive shaders and DSBR are IMO seriously in lala land.
DSBR is working, at least the fetch-once part.
The rest is taking a nap.
 
I think primitive shaders could be a big deal for AMD. Hopefully we'll find out sooner rather than later.
The concept behind primitive shaders is significant; however, they seem to be an evolution of the async-compute-based culling mechanisms. For games already using that capability, I wouldn't expect much. It should be more efficient, but to my understanding the quoted primitive rate is still bound by valid primitives.
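For anyone unfamiliar, the async-compute culling idea is roughly: a compute pass tests every triangle and compacts the survivors into a new index buffer before the draw is issued, so the front end never sees the culled primitives. A toy CPU-side sketch in Python (not an actual GPU shader; the function name and winding convention here are just illustrative):

```python
import numpy as np

def cull_backfaces(positions, indices, view_dir):
    """Compact an index list by dropping triangles that face away from
    view_dir. This mimics, on the CPU, what an async-compute culling
    pass does on the GPU before the real draw call."""
    kept = []
    for i0, i1, i2 in indices:
        a, b, c = positions[i0], positions[i1], positions[i2]
        normal = np.cross(b - a, c - a)    # unnormalized face normal
        if np.dot(normal, view_dir) < 0:   # front-facing w.r.t. view_dir
            kept.append((i0, i1, i2))
    return kept
```

With view_dir pointing from the camera into the scene, only front-facing triangles survive into the compacted index list; the GPU version does the same test in parallel and writes survivors with an atomic append.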

HBCC is situationally useful and seems to help in CPU limited scenarios. Makes sense considering the game has less management work to perform.

DSBR doubled performance in at least one benchmark. The whitepaper quoted significant bandwidth savings, so the jury is still out. In the Linux drivers they seem to want DSBR enabled for Raven, so it must be doing something.
 
According to AMD, or actually reproducible by someone?

I think this is in reference to the energy01 subtest of SPECviewperf.
Back when the slides were out, the footnotes showed that driver version 17.30 provided a score of 8.80 without DSBR and 18.90 with it.

Some of the Frontier Edition reviews tested energy01, and had scores roughly where AMD had the DSBR on.
The score without DSBR is somewhere around some old scores I saw for a Kepler-based Quadro K6000, whereas the enabled score is roughly in the neighborhood of a Pascal-based P5000.
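For what it's worth, those footnoted scores work out to a bit more than the "doubled performance" mentioned earlier in the thread:

```python
# Scores from AMD's footnotes quoted above (driver 17.30, energy01 subtest)
without_dsbr = 8.80
with_dsbr = 18.90

speedup = with_dsbr / without_dsbr
print(f"DSBR speedup: {speedup:.2f}x")  # about 2.15x
```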
 