For not exposing support for atomics? I have no idea what else it could be beyond phasing out support for a recent feature. Two months till the supposed launch and drivers are to blame? I think that's pretty far off the mark. I can understand a failed test here or there, but there are too many to be coincidence; "this could be fake" is a better way to go, though.
"Tessellation performance doesn't add up either."

It does if you assume the demoed part is only using half the typical geometry pipes, with some improvements. Normalize the clocks and that Catmull3 rate is right at half of Fiji's. Going off the Catmull3/5 tests, they must have added primitive shader support to the benchmark, if that's actually a requirement. Catmull5 shows a 3x geometry-rate increase once tessellation kicks in, versus the 1.08x increase on Polaris 10 and 1.36x on Fiji. Primitive discard shouldn't apply there, since it's just triangles, and Fiji probably benefits from its bandwidth. Catmull aside, all those scores are virtually identical between Fiji and Vega once normalized for clock speed.
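A quick sketch of that clock normalization, since triangles-per-clock is what isolates the pipe count from the clock speed. The rates and clocks below are placeholders, not the actual benchmark numbers:

```python
# Clock-normalize triangle throughput so clock speed drops out of the comparison.
# All figures are hypothetical placeholders, NOT the real benchmark results.
cards = {
    #            (clock_mhz, catmull3_mtris_per_s)
    "Fiji":      (1050, 4200),
    "Vega demo": (1200, 2400),
}

for name, (clock_mhz, mtris) in cards.items():
    # Mtris/s divided by MHz gives triangles per clock cycle.
    print(f"{name}: {mtris / clock_mhz:.2f} tris/clock")

# If Vega's tris/clock comes out at ~half of Fiji's, that's consistent with
# the demoed part running only half the usual geometry pipes.
```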
There are still the oddities of having both Vega 10 and Vega 11 while only one of them shows up in the Instinct line, alongside Fiji and Polaris 10. Everyone assumed the MI25 figure was half precision thanks to 2xFP16, but it could have been single precision, or the smaller part doubled up.
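Back-of-the-envelope on what the MI25's 25 TFLOPS could mean, using the GCN throughput formula and assuming a 64-CU part (the CU count is my assumption, not a confirmed spec):

```python
# Implied clocks for a 25 TFLOPS rating under each precision interpretation.
# The 64-CU count is an assumption for illustration, not a confirmed spec.
cus, alus_per_cu, flops_per_alu_clk = 64, 64, 2  # GCN: 64 ALUs/CU, 2 FLOPs/clock (FMA)

target_flops = 25e12
fp32_clock = target_flops / (cus * alus_per_cu * flops_per_alu_clk)
fp16_clock = fp32_clock / 2  # 2xFP16 doubles per-clock throughput

print(f"If 25 TFLOPS is FP32: {fp32_clock / 1e9:.2f} GHz needed")  # ~3.05 GHz, implausible
print(f"If 25 TFLOPS is FP16: {fp16_clock / 1e9:.2f} GHz needed")  # ~1.53 GHz, plausible
```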
E.g. the P11-based E9260 should boost to around 1.4 GHz ("14 CUs, up to 2.5 TFLOPS"), while the desktop cards boost up to 1.2 GHz.
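The 1.4 GHz figure falls straight out of the GCN throughput formula, TFLOPS = CUs × 64 ALUs × 2 FLOPs/clock × clock:

```python
# Back-calculate the implied boost clock from the quoted embedded spec.
cus, alus_per_cu, flops_per_alu_clk = 14, 64, 2  # GCN per-CU throughput
quoted_tflops = 2.5

clock_ghz = quoted_tflops * 1e12 / (cus * alus_per_cu * flops_per_alu_clk) / 1e9
print(f"Implied boost clock: {clock_ghz:.2f} GHz")  # ~1.40 GHz
```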
Don't trust that.
I brought up the same thing a couple months ago and Ryan Smith was kind enough to point out that there is historical precedent for wonky specs on AMD's embedded products.
https://forum.beyond3d.com/posts/1945828/
http://wccftech.com/amd-radeon-rx-580-570-launch-delay-rumor/
Btw, I have a question: would rendering in HDR hurt performance, or actually boost it? I know game engines already render in HDR internally and tonemap it back down, but you'd also need to present much more information on screen if you want to actually output in HDR.
Performance talk starts @ Slide 66
http://www.frostbite.com/2017/03/high-dynamic-range-color-grading-and-display-in-frostbite/
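Rough bandwidth math on why HDR output by itself shouldn't cost much: engines already render internally at FP16 HDR, and the main change for an HDR display is the swapchain format. HDR10 output (RGB10A2) is still 4 bytes per pixel, the same as 8-bit RGBA8. A hypothetical 4K/60 estimate:

```python
# Estimate final-framebuffer traffic for different swapchain formats at 4K/60.
width, height, fps = 3840, 2160, 60
pixels_per_s = width * height * fps

formats = {
    "RGBA8 (8-bit SDR)": 4,  # bytes per pixel
    "RGB10A2 (HDR10)":   4,  # 10-bit output, same footprint as RGBA8
    "FP16 (scRGB HDR)":  8,
}

for name, bpp in formats.items():
    print(f"{name}: {pixels_per_s * bpp / 1e9:.1f} GB/s")

# RGBA8 and RGB10A2 are identical (~2 GB/s); even an FP16 swapchain (~4 GB/s)
# is tiny next to the hundreds of GB/s a frame's render targets already move,
# so the presentation side of HDR should be close to free.
```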
Not sure how you can say that about Freesync. There has been a steady stream of updates and messaging about panel support and overall support, especially as adoption has significantly outpaced G-Sync, and these aren't even AMD's products. Freesync 2 has been announced as a technology, but it's going to take time for vendors to actually implement it, especially as it deals with even more nascent, bleeding-edge technology elements.
"I'm talking about FS2, which is a specification, isn't it? One that will be given to the best monitors out there. That was 3 months ago and we are still waiting to see the first HDR gaming monitor go on sale. By the time we get the first FS2 monitor on the market, people will have forgotten about AMD's presentation (5 or 6? months ago)."

Well, it was during CES (the FS2 announcement), which is essentially the best time/place to announce it, and nVidia announced their G-Sync HDR specs (which, much like FS2, have yet to materialize in consumer-available displays..) there 2 days later... Frankly, you can't blame them (NV & AMD) for announcing this type of tech at CES at all... On the other hand, yes, AMD/RTG plain sucks at execution compared to nVidia when it comes to GPU announcement -> retail release...
My point is: don't present/announce something that you can't actually sell. I'm sure we'll be getting a FS2 monitor (I hope AMD gives the specification to 1080p displays), but I don't even know when that will happen; I've been waiting for an HDR display for so long. And I know this has nothing to do with AMD. I just think that if you make a presentation before being able to sell the product, people will get tired of waiting, especially for something like HDR, which has been our wet dream for so long.
Yes, very uncharacteristic of Nvidia to announce something they can't yet sell. I assume the display makers told AMD and Nvidia "we're gonna have the displays ready by X date" and then changed the schedule. If the rumors are true, the first HDR display will arrive in 3 months at a $1200 price and will be GS2; there is no word on FS2 that I'm aware of.