AMD: Speculation, Rumors, and Discussion (Archive)

http://videocardz.com/58031/amd-announces-polaris-architecture-gcn-4-0

I think this supports my theory that Fiji will continue to be AMD's flagship throughout 2016, and that the two new 14/16nm GPUs will be focused on power efficiency: one replacing Hawaii in its performance category, the other something that approaches Tonga but is very notebook-friendly (~75W in the desktop version, ~50W in the notebook version).





Like I said, one of the two new GPUs in 2016 is a chip that is set to replace the old GM107.
And if the new chips are coming in mid-2016, I seriously doubt they're going to replace Fiji (by then, Gemini will be ~4 months old), so the other GPU should be something in a lower performance segment that replaces Hawaii.
Hawaii is a big chip with an expensive PCB and lots of memory chips, and it's selling at rather low margins to compete with the much cheaper GM204. It might very well be the chip earning the lowest margins for AMD and their AIBs, hence the urgency in replacing it.
 
More slides are on Videocardz.

"The GPU design was modified to include new logical blocks. What’s new is Command Processor, Geometry Processor, Multimedia Cores, Display Engine and upgraded L2 Cache and Memory Controller."

Those are hardly new stages; it's just the usual vendor-specific naming found in any high-level block diagram.

From the strong emphasis on perf/Watt and form factors in the slides, I think AMD might want to promote the new architecture (and the new manufacturing process) on somewhat safer ground, in the form of a small mobile SKU, just like NV's first-gen Maxwell.
 
Hmm, does that also mean the L2 is now unified even with respect to the render backends? That's something I'd have expected for GCN 1.2 already :).

This slide did confuse me a little bit lol, because the L2 cache in every GCN generation is globally shared, at least that is what I thought and was told. So where is this Global Data Share, are they talking about the L1, and why is it the same grey as the L2 cache?

Maybe I'm just reading into it too much, but... marketing can make anything new lol.
 
I'm not talking about error, but about differences between models. Also, the chosen game may have some influence. E.g. an OCed MSI GTX 950 consumes ~20 watts more in Anno 2070 (compared to a reference GTX 950), but only ~5 watts more in Battlefield 4. I'm sure they found a worst-case situation for the most power-hungry model of GTX 950.
 
That's possible, but in the highest-consuming reviewed models I have seen, the whole system was in the neighborhood of 170 watts, so getting 150 watts for just a single card is kind of out in left field even as a worst-case scenario.
 
I am just hoping this thing has overclocking headroom (the high-end parts at least). I really enjoyed playing around with the BIOS on the Maxwell architecture, and I would definitely buy into AMD again if there is something equivalent. The slide with Battlefront seems promising for the low-end part (even though Battlefront heavily favors GCN over Maxwell).
 
Seems like system power consumption, so about 40-50W for the CPU if you assume that the 950 hits 90-100W.

The Polaris card then uses only ~36W if you assume the same CPU usage. Things change if it's easier on the CPU.

Small GPUs and mid-2016 availability doesn't bode well for the bigger chips.
 
It was total system power consumption measured at the wall. Per Anandtech, during AMD's live presentation they actually measured 88.1W for the Polaris system and 150W for the GTX 950.
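
For anyone who wants to sanity-check the earlier estimate, here is a minimal back-of-envelope sketch using the wall figures quoted in this thread; the GTX 950 board-power range is an assumption, not a measured value, and the calculation ignores PSU efficiency and any difference in CPU load between the two systems.

```python
# Back-of-envelope check of the estimate above. The wall readings are the ones
# quoted in this thread; the GTX 950 board power is an assumption (~90-100 W).
gtx950_system_w = 150.0   # wall power of the GTX 950 system (per Anandtech, live demo)
polaris_system_w = 88.1   # wall power of the Polaris system (same demo)

for gtx950_card_w in (90.0, 100.0):             # assumed GTX 950 board power
    rest_w = gtx950_system_w - gtx950_card_w    # CPU + rest of the platform, ~50-60 W
    polaris_card_w = polaris_system_w - rest_w  # assumes the platform draws the same
    print(f"950 at {gtx950_card_w:.0f} W -> non-GPU ~{rest_w:.0f} W, "
          f"implied Polaris card ~{polaris_card_w:.0f} W")
```

Under those assumptions the implied Polaris board power lands somewhere around 28-38W, in the same ballpark as the ~36W estimate above.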
 
It helps that the game is Battlefront, where AMD can hold something like a 10-25% advantage over competing NVIDIA cards.

Live presentation with working silicon :oops: :runaway:
 
Well, low end and midrange are volume sales. If they are able to get them out before nV, they will take market share back quickly, and that is in their best interest. I'm not sure nV will have a counter to this, if this is AMD's goal...
 
Yeah, whoever releases first will have the best product margins for a little while; just look at the NVIDIA 980, which in reality was priced way too high initially (it could be argued it still is), and even the 970 was rather high.
I doubt either manufacturer will want to give up that early profit-margin edge.
Cheers
 