AMD Vega 10, Vega 11, Vega 12 and Vega 20 Rumors and Discussion

2 months till the supposed launch and drivers are to blame? I think that is pretty far off the mark. I can understand a test here or there, but there are too many for them to be coincidental; saying this could be fake is a better way to go, though.
 
For not exposing support for atomics? I have no idea what else it could be beyond phasing out support for a recent feature.
 
Doesn't seem likely for some of the things they are showing; I would expect smaller compute tasks to improve along with the more complex ones.

Tessellation performance doesn't add up either.

These aren't small regressions either; they're fairly large. Unless AMD is deprecating some older features or extensions that the app uses? I don't think they would do that.
 
It does if you assume the demoed part is only using half the typical geometry pipes, with some improvements. Normalize for clocks and that Catmull3 rate is right at half of Fiji's. Going off the Catmull3/5 tests, they must have added primitive shader support to the benchmark, if that's actually a requirement: Catmull5 shows a 3x increase in geometry rate once tessellation kicks in, versus a 1.08x increase for Polaris 10 and 1.36x for Fiji. Primitive discard shouldn't apply there, since it's just triangles, and Fiji probably benefits from its bandwidth. Catmull aside, all those scores are virtually identical between Fiji and Vega when normalized for clock speeds (see the sketch below).

There are still the oddities of having both Vega 10 and 11 while only using one in the Instinct line alongside Fiji and Polaris 10. Everyone assumed the MI25 was half precision with the 2xFP16, but it could have been single precision, or the smaller part doubled up.
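
To make the clock normalization above concrete, here is a back-of-envelope sketch. The scores and the demo clock are made-up placeholders (only Fiji's 1050 MHz reference clock is a real figure), so it illustrates the method rather than the data:

```python
# Hypothetical geometry throughput scores (Mtris/s) -- NOT real measurements.
fiji = {"clock_mhz": 1050, "catmull3": 2000.0}  # 1050 MHz is Fiji's actual reference clock
vega = {"clock_mhz": 1500, "catmull3": 1430.0}  # demo clock and score are assumptions

# Normalize throughput to a per-MHz rate before comparing architectures.
fiji_per_mhz = fiji["catmull3"] / fiji["clock_mhz"]
vega_per_mhz = vega["catmull3"] / vega["clock_mhz"]

# With half the usual geometry pipes active, the clock-normalized rate
# should land near 0.5x of Fiji's.
print(f"Vega/Fiji per-clock ratio: {vega_per_mhz / fiji_per_mhz:.2f}")  # ~0.50
```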
 
If they added primitive shaders to one of the Catmull x5 tests, how would they even do that? It would have to be done in the application, not driver-side. In any case, I'm not sure what you're getting at with the MI25; are you talking about its TFLOPS? It's 2x FP16 that gets it to that figure, because that would be the only way to reach those numbers. It definitely wasn't SP TFLOPS they were talking about.
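
As a quick sanity check on that, assuming the MI25 is a 64-CU/4096-SP Vega 10 part (the SP count rumored in this thread) at an assumed ~1.5 GHz boost:

```python
# Peak-throughput arithmetic for a hypothetical 64-CU Vega part.
sps = 4096        # 64 CUs x 64 SPs -- rumored, not confirmed
fma_flops = 2     # FLOPs per SP per clock (one fused multiply-add)
clock_ghz = 1.5   # assumed boost clock

fp32_tflops = sps * fma_flops * clock_ghz / 1000  # ~12.3 TFLOPS single precision
fp16_tflops = 2 * fp32_tflops                     # packed 2xFP16 doubles it: ~24.6

# Reaching ~25 TFLOPS in FP32 alone would need a ~3 GHz clock, which is
# implausible, so the "25" in MI25 almost has to be the 2xFP16 rate.
print(f"FP32: {fp32_tflops:.1f} TFLOPS, FP16: {fp16_tflops:.1f} TFLOPS")
```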
 
A comparison of the 480 and 1060 in CompuBench: Vega vastly improves in the particle simulation test, where the 1060 is 50% faster than the 480.

https://compubench.com/compare.jsp?...ype2=dGPU&hwname2=NVIDIA+GeForce+GTX+1060+6GB

E.g. the P11-based E9260 should boost to around 1.4 GHz ("14 CUs, up to 2.5 TFLOPS"), while desktop cards boost up to 1.2 GHz.

That's not a pro card; the best P10 workstation card is clocked a bit slower than desktop (1243 MHz vs. 1266 MHz), while the P11 card is a whole 1 MHz faster.
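
For reference, the E9260's ~1.4 GHz figure falls straight out of its quoted specs, assuming GCN's usual 64 SPs per CU and 2 FLOPs per SP per clock:

```python
# Implied boost clock from the E9260's advertised numbers.
cus = 14
sps = cus * 64                         # 896 SPs at the standard GCN CU width
tflops = 2.5                           # "up to 2.5 TFLOPS"

clock_ghz = tflops * 1000 / (sps * 2)  # 2 FLOPs per SP per clock (FMA)
print(f"{clock_ghz:.2f} GHz")          # ~1.40 GHz
```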
 
Do we have any news concerning Vega? Last I heard it was confirmed for May (the Prey ad), but I'm without a GPU for the time being (sold my 970) and custom 1080 Tis will be a thing in April. Still, I'm willing to wait a bit if Vega is around the corner, if only to have more options.
 
Don't trust that.

I brought up the same thing a couple months ago and Ryan Smith was kind enough to point out that there is historical precedent for wonky specs on AMD's embedded products.

https://forum.beyond3d.com/posts/1945828/

You need to understand that for embedded parts, the final clocks are set by the design choices of the integrators, so it is just "up to"... (So anyway, there's not much value in those numbers for real integrations.)

Note, it is the same for Nvidia's parts: they give you xx clock for xx GFLOPS, but in the end it is the integrator who will set, or ask for, what they need based on their design choices.
 
"HDR" itself doesn't specify the format and there can be different depths available; consumer devices are settling towards the "HDR10". You will still need to have a tone mapping process to get the the correct format of the display.
 
http://wccftech.com/amd-radeon-rx-580-570-launch-delay-rumor/

By the way, I have a doubt: would rendering in HDR hurt performance, or rather boost it? I know game engines already render in HDR internally and tone-map it down, but you would also need to present much more information on screen if you want to output in HDR.
Performance talk starts @ Slide 66
http://www.frostbite.com/2017/03/high-dynamic-range-color-grading-and-display-in-frostbite/
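One concrete source of overhead is the render target format itself; a rough sketch of the framebuffer bandwidth difference (format choices here are illustrative, not necessarily what Frostbite uses):

```python
# Bytes per pixel for two common render target formats.
rgba8_bpp = 4     # 8 bits per channel, typical SDR target
rgba16f_bpp = 8   # FP16 per channel, typical HDR target

width, height = 3840, 2160  # 4K
mb = 1024 * 1024

sdr_mb = width * height * rgba8_bpp / mb    # ~31.6 MB per surface
hdr_mb = width * height * rgba16f_bpp / mb  # ~63.3 MB per surface

# Every read/write of the target costs twice the bandwidth with FP16,
# which is one reason HDR output isn't entirely free.
print(f"{sdr_mb:.1f} MB vs {hdr_mb:.1f} MB per full-screen pass")
```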
 
So we would need a faster card to get the same performance in HDR, but it seems it won't be a big difference.

One thing I've been waiting for is at least an estimated date for 1080p HDR monitors. I know companies want to sell $2,000 displays and they can't ask that for 1080p, but I'm pretty sure there is a huge market of people wanting to play at 1080p in HDR, especially when you see that rendering at 4K and downsampling to 1080p gives similar quality to native 4K once you add HDR. I think that would be the best compromise.

One thing AMD is really bad at is showing products only when they are ready. Who still remembers FreeSync 2? I added Mad Max HDR to my Amazon list and I'm still waiting to buy it.
 
Not sure how you can say that about FreeSync. There has been a steady stream of updates and messaging concerning panel support and overall support, especially as adoption has significantly outpaced G-Sync, and these are not even AMD's products. FreeSync has been announced as a technology, but it's going to take time for vendors to actually implement it, especially as it deals with even more nascent, bleeding-edge technology elements.
 

I'm talking about FS2, which is a specification, isn't it? One that will be given to the best monitors out there. That was 3 months ago and we are still waiting to see the first HDR gaming monitor go on sale. By the time we get the first FS2 monitor on the market, people will have forgotten about AMD's presentation (5 or 6 months earlier?).

My point is: don't present/announce something that you can't actually sell. I'm sure we'll be getting an FS2 monitor (I hope AMD extends the specification to 1080p displays), but I don't even know when that will happen; I've been waiting for an HDR display for so long. And I know this has nothing to do with AMD. I just think that if you make a presentation before being able to sell the product, people will get tired of waiting, especially for something like HDR, which we've been dreaming of for so long.
 
Well, it was during CES (the FS2 announcement), which is essentially the best time/place to announce it, and Nvidia announced their G-Sync HDR specs (which, much like FS2, have yet to materialize in consumer-available displays) there 2 days later... Frankly, you can't blame them (NV & AMD) for announcing this type of tech during CES at all. On the other hand, yes, AMD/RTG plain sucks at execution compared to Nvidia when it comes to going from GPU announcement to retail release...
 
Yes, very uncharacteristic of Nvidia to announce something they can't yet sell. I assume the display makers told AMD and Nvidia "we're going to have the displays ready by X date" and then changed the schedule. If the rumors are true, the first HDR display will arrive in 3 months at a $1,200 price and will be GS2; there is no word on FS2 that I'm aware of.
 
So this came out: https://www.bhphotovideo.com/c/product/1312886-REG/lg_32ud99_w_32_16_9_hdr10.html

2 things I don't understand:

1. It says 95% of P3, so it's not really HDR (both standards, HDR10 and DV, specify a color gamut of either P3 or Rec. 2020).
2. It says "max and typical" brightness of 550 and 350 nits, which again does not meet the HDR standards: 1,000 nits for DV and 10,000 for HDR10.

I also would like (if possible) clarification on whether this monitor is FS2 or just normal FS. I think it's FS1, since FS2 is supposed to be only for the best of the best, and this monitor doesn't even meet the standards.
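
On the 10k figure: HDR10 uses the SMPTE ST 2084 "PQ" transfer curve, whose decode side tops out at exactly 10,000 nits. A minimal sketch of that EOTF using the published ST 2084 constants:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal in [0, 1] to nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))  # 10000.0 nits: the ceiling of the HDR10 container
print(pq_eotf(0.5))  # ~92 nits: most of the code range covers low luminance
```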
 