> Aren't these new cards for neural networks exclusively? If so, is there any advantage in talking about DP?

Nvidia talked about DP when they announced the P100, so it's weird for AMD to not even mention it.
> Aren't these new cards for neural networks exclusively? If so, is there any advantage in talking about DP?

Some of the very large science labs will want to do both training and FP64 modelling.
> With AMD not even mentioning DP performance of the MI25, is it safe to assume there won't be half-rate DP?

AMD didn't talk about Bristol Ridge's GPU either, and then it comes with half-rate DP. Hmm.
> AMD didn't talk about Bristol Ridge's GPU either, and then it comes with half-rate DP. Hmm.

Does seem rather odd to add it in and then do away with it on your next major architecture launch.
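For anyone following the DP ratio arithmetic in this exchange: peak FP64 throughput is just the FP32 rate scaled by the architecture's DP ratio. A quick sketch (the shader count and clock below are illustrative assumptions, not announced specs for any of the cards discussed):

```python
# Peak GPU throughput: shaders x 2 ops/clock (FMA) x clock, scaled by the DP ratio.
def peak_tflops(shaders, clock_ghz, dp_ratio=1.0):
    """Peak throughput in TFLOPS; dp_ratio is the FP64:FP32 rate (1/2, 1/16, ...)."""
    return shaders * 2 * clock_ghz * dp_ratio / 1000.0

# Illustrative example: a 4096-shader part at 1.0 GHz.
fp32 = peak_tflops(4096, 1.0)                          # ~8.2 TFLOPS FP32
half_rate_fp64 = peak_tflops(4096, 1.0, dp_ratio=1/2)  # ~4.1 TFLOPS FP64 (half rate)
cut_down_fp64 = peak_tflops(4096, 1.0, dp_ratio=1/16)  # ~0.5 TFLOPS FP64 (consumer-style rate)
```

This is why the ratio matters so much to the big labs: dropping from 1/2 to 1/16 costs an order of magnitude in FP64 on the same silicon.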
> Since this is brought up, the 2016 MBP thingy is not really about the support of LPDDR4, but the lack of 64 Gbit x32 LPDDR3 (and apparently Apple isn't interested in calling for a part that would have super low economies of scale).

Gotcha, thanks for the correction. Are 64 Gb x32 LPDDR4 chips expected in the future?
> At least when I last checked the public product catalogues, you can't even achieve 16 GB in 128-bit DQ with what Micron and Samsung are offering in LPDDR4 (except SK Hynix, perhaps).

I'm not familiar with the x[number] nomenclature and related properties, and you mention Hynix in your post, but are eight Hynix 32 Gb x16 LPDDR4 chips a valid possibility for 32 GB? (Assuming a 128-bit bus.)
> Gotcha, thanks for the correction. Are 64 Gb x32 LPDDR4 chips expected in the future?

Yeah. x16 usually means the package has a 16-bit data bus, so eight of them would get you 32 GB. But I am not sure LPDDR4 is really available in x16. AFAIK the spec is two independent 16-bit channels per die, though. Moreover, the ballout count seems to suggest otherwise. It looks more like 24 Gb and 32 Gb are available only in x32/x64 packages.
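The capacity arithmetic being traded back and forth here can be sketched as a two-line calculation (the densities and widths below are just the configurations under discussion, not a claim about any vendor's actual catalogue):

```python
# Total DRAM capacity from per-package density and per-package data-bus width.
def config(density_gbit, package_width_bits, bus_width_bits):
    """Return (number of packages, total GB) needed to fill the memory bus."""
    n_packages = bus_width_bits // package_width_bits  # packages side by side on the bus
    total_gb = n_packages * density_gbit / 8           # Gbit -> GB
    return n_packages, total_gb

# Eight 32 Gb x16 packages on a 128-bit bus -> 32 GB, as asked above.
print(config(32, 16, 128))  # (8, 32.0)

# Four 32 Gb x32 packages on the same 128-bit bus -> only 16 GB.
print(config(32, 32, 128))  # (4, 16.0)
```

So the question really does hinge on whether x16 packages (or higher-density x32 parts) exist at all, which is what the replies above are debating.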
amdgpu: Add support for Polaris 12
https://lists.freedesktop.org/archives/amd-gfx/2016-December/004127.html
> And early benchmarks are pointing to lower-than-P11 performance:
> http://videocardz.com/64787/amd-polaris-12-spotted-in-linux-patches

I think it was to be expected. AMD already confirmed it in the Polaris architecture whitepaper (27th July 2016); here is a direct quote:

> Polaris-based GPUs have 1-4 geometry engines, depending on overall performance targets (e.g. the Radeon™ RX 460 GPU has two, while the Radeon™ RX 480 GPU has four).

;-)
Looks like I'm going to eat my virtual hat on this one.
OEM demands are a strange thing. I guess the best part of staying on top of the food chain like Apple or Microsoft is that you don't really have to deal with this bullshit.
Maybe they're going for a Shield TV 2 competitor with P12.
This may be crazy, but my future father-in-law runs his own business and we set him up with 3 monitors per PC. We are using really old Radeon 5x00 series cards that are no longer supported. If Polaris 12 sips power and runs 3 monitors, especially new 4K ones, we will get them for those PCs and ditch the old ones. Not everyone needs gaming power.
> R7 240 with an Oland chip? $10 per card after rebate.

That's not bad at all either. Might grab those for now, but they are surely getting close to end of life.
> That's not bad at all either. Might grab those for now, but they are surely getting close to end of life.

It's GCN 1.0. It's no spring chicken (can you believe the 7970 came out 5 years ago this week?!), but AMD won't be dropping 1.0 support for a while yet. I reckon it gets a few more years before it gets shuffled off to legacy, in part because AMD sold Pitcairn for so long.
> Do you think a custom version of Raven Ridge is a realistic possibility for Project Scorpio? (8 logical Ryzen cores / 6 TF Vega)

Hard to tell. It could also be an ARM core, going by the recent MS moves of getting Windows on ARM. Will be interesting to see what, if any, features get implemented on Scorpio that miss the Vega window.
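The "6 TF" figure falls out of shader count and clock in the usual GCN way. A sketch with purely illustrative numbers (nothing here is a confirmed Scorpio spec):

```python
# GCN-style peak FP32: shaders x 2 FLOPs/clock (one FMA) x clock.
def fp32_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

# Illustrative: 2560 shaders (40 CUs x 64) at ~1.17 GHz lands near 6 TFLOPS.
print(round(fp32_tflops(2560, 1.17), 2))  # ~5.99
```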