AMD: Speculation, Rumors, and Discussion (Archive)

Status
Not open for further replies.
Wouldn't it be great if, instead of TDP, we used calories to measure the energy dissipated by GPUs? That would be a common metric everyone could compare, and it's actually possible to run the test at home.
 
So you think pricing has to do with power consumption alone? The fact that the Polaris 10 chip is 30% smaller and may use cheaper memory and cheaper PCB components counts for nothing compared to power consumption, and that's why they have to play the price card?

Considering AMD's operating margins are in the toilet, I doubt their production costs are a major factor in pricing. If Nvidia sold 150W 1070s at $199 they would make a crap ton more money than AMD, even with a larger die.
 
Well, the RX 480 at 150 watts vs. the Nvidia GTX 1070 at 150 watts, with the former performing similar to a 980 and the latter to a 980 Ti, tells us that the Polaris architecture has barely moved relative to Pascal from where it stood at 28 nm versus Nvidia's Maxwell in efficiency. Of course they have to play the price card, at least until the 1060 is out.

If it's just using a single 6-pin connector, then it's pretty likely that it's not 150W on the dot. Guru3D pegs GTX 1070 power usage at 161W.
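For reference, the connector configuration does bound nominal board power: the PCIe slot supplies up to 75W and each 6-pin connector adds another 75W (an 8-pin adds 150W). A minimal sketch of that arithmetic, with the caveat that real cards can transiently exceed these nominal limits:

```python
# Nominal board-power ceiling implied by a card's external power connectors,
# per the PCIe spec: slot = 75W, 6-pin = 75W, 8-pin = 150W.
# Illustrative only; measured power (e.g. Guru3D's 161W) can deviate.

SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_ceiling(connectors):
    """Nominal maximum board power for a list of external connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_ceiling(["6-pin"]))   # single 6-pin (RX 480 style): 150
print(board_power_ceiling(["8-pin"]))   # single 8-pin (GTX 1070 style): 225
```

So a single 6-pin design caps out at a nominal 150W, which is why the "150W on the dot" reading is suspect: vendors usually leave headroom under the ceiling.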
 
Well, the RX 480 at 150 watts vs. the Nvidia GTX 1070 at 150 watts, with the former performing similar to a 980 and the latter to a 980 Ti, tells us that the Polaris architecture has barely moved relative to Pascal from where it stood at 28 nm versus Nvidia's Maxwell in efficiency.
It's a little premature to make deep conclusions based just on TDP numbers, I think.

My conclusion for now is that AMD has managed to catch up with Nvidia in terms of perf/W, and probably perf/mm2. (It remains to be seen if they've also improved their ability to use BW effectively. The fact that it needs the same 256 bit memory system as a 1070 is not encouraging.)

I think that's good news for them, and it will make them much more competitive than they were, even if it doesn't put any distance between them and their competitor.

I'd give them a cookie for all of that if it weren't for the fact that it took them 2 more years to get to this point. This is business, not kindergarten. ;)
 
I think GCN cards are rated for high TDPs because compute loads fully stress the GPU, which changes the efficiency picture (in a good way) and pushes power consumption up. So an RX 480 may consume ~110W in gaming loads and around its TDP in GPGPU loads.


Nvidia instead rates their cards with average power consumption, once they feel secure that in no plausible condition will the cards have thermal/power problems. This TDP rating became a bit less aggressive with Pascal versus Maxwell.

Is it TDP or AMD's Typical Board Power, which Fury X has shown has a margin of error roughly equivalent to one Polaris 10?
At least with Polaris and unlike other AMD boards, the 6-pin connector gives them less wiggle room on "typically" not blowing out the marketing number for board consumption. It's not a perfect limit, but it's more definitive than what we've gotten for a while.

Would buying two 480s for Crossfire be a reliable way of getting 1080 levels of performance in DX12? I've heard that DX12 natively does Crossfire but I haven't looked into how reliable it is. I want 1080 levels of performance in DX12 but also want the more future-proof async in AMD GPUs.
At the dawn of the DX12 age, I'd rather wait and pay the premium, or pay the early adopter tax. That's just me, but I don't have the patience to sink more of my time into every game the developer/AMD louses up with every release/patch/driver update.
 
Getting dual RX 480s for "similar performance" to a GTX 1080 is indeed a less safe bet for day-one performance with good scaling (at least until DX12's mGPU matures properly, like 3dilettante said).

However, if you're going to get an adaptive sync display, the cost of the cards alone isn't the only money saver. You'll also have to deduct some $150 or more of G-Sync premium over FreeSync.
We can now find 34" IPS Curved Ultra-Wide monitors with Freesync for close to 800€, but I can't find a G-Sync equivalent for less than 1100€.
2* 8GB Polaris 10 + 34" FreeSync = 1300€
GTX 1080 + 34" GSync = 1800€

This is a 40% price difference. It's a premium that might make you think twice.
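A quick sanity check on that figure, using the bundle totals quoted above (the per-card splits are assumptions for illustration; only the 1300€ and 1800€ totals come from the post):

```python
# Back-of-the-envelope check of the quoted bundle prices (euros).
# Per-item splits are assumed; the totals match the post.
amd_bundle = 2 * 250 + 800   # 2x 8GB Polaris 10 (~250€ each, assumed) + FreeSync 34"
nv_bundle = 700 + 1100       # GTX 1080 (~700€, assumed) + G-Sync 34"

premium = (nv_bundle - amd_bundle) / amd_bundle
print(f"{premium:.0%}")      # ~38%, i.e. roughly the "40%" in the post
```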
 
Source for this insinuation, please.
Particularly for the "crap ton more" part.

Lol, you want a source for a hypothetical scenario? The source is me.

A $199 1070 will generate a lot more sales and revenue than a $199 480 for obvious reasons - stronger brand and higher performance. The relative die sizes would be inconsequential.
 
Lol, you want a source for a hypothetical scenario? The source is me.

A $199 1070 will generate a lot more sales and revenue than a $199 480 for obvious reasons - stronger brand and higher performance. The relative die sizes would be inconsequential.
Nvidia must not want money then.
 
Traditionally AMD has been a fair way behind NV in performance/FLOP, though. Fury X is an 8.6 TF card compared to Titan X, which is only 6.6 and faster. Polaris may change that, of course.
Excluding Fiji, GCN seems to perform equal to Nvidia cards with about 10-15% fewer FLOPS in the real world when boost clocks are in play.
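The perf/FLOP gap in the Fiji case can be sketched like this. Only the TFLOPS figures come from the thread; the relative-performance number for Fury X is an assumption ("slower than Titan X" quantified as 0.95 for illustration):

```python
# Sketch of the perf-per-FLOP comparison discussed above.
# TFLOPS figures are from the posts; the relative gaming performance
# numbers (Titan X = 1.00, Fury X = 0.95) are illustrative assumptions.
cards = {
    "Fury X":  (8.6, 0.95),
    "Titan X": (6.6, 1.00),
}

for name, (tflops, perf) in cards.items():
    print(f"{name}: {perf / tflops:.3f} relative perf per TFLOP")

ratio = (1.00 / 6.6) / (0.95 / 8.6)
print(f"Titan X advantage: {ratio - 1:.0%} more perf per FLOP")
```

Under these assumptions Titan X extracts on the order of a third more performance per FLOP, consistent with Fiji being the outlier the post excludes.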
 
Lol, you want a source for a hypothetical scenario? The source is me.

A $199 1070 will generate a lot more sales and revenue than a $199 480 for obvious reasons - stronger brand and higher performance. The relative die sizes would be inconsequential.

Revenue =/= profit. It's entirely possible to have incredibly high revenue and still be operating at a loss. It's also entirely possible to have low revenue and operate at a profit. For example, back during the GTX 280/260 versus Radeon 4870/4850 price wars, Nvidia had high revenue and operated at a loss (AMD also was operating at a loss, I believe).

A 199 USD GTX 1070 would have significantly lower profit margins (assuming it had a profit margin at all at that price) than a 199 USD RX 480.

Obviously it would sell a whole lot more if both were priced equally; AMD would just have to lower the MSRP and would still have the same or higher profit margins.

I doubt either company is wanting to get back into a price war again. As long as Nvidia has healthy sales, they aren't going to price pressure AMD just to price pressure AMD. The past few years Nvidia has been VERY focused on their operating margins. I don't see that changing. The only way it does change is if Rx 480 massively eats into GTX 1070/1080 sales. And I just don't see that happening, they are in 2 completely different market demographics.

Regards,
SB
 
I would not bet that the silicon cost is cheaper for the GloFo 14nm Polaris 10 process than for the TSMC 16nm 1070, even taking into account that the die is larger in Nvidia's case.
Isn't everything I have heard that TSMC has a ton of demand? We will learn when the Apple A10 comes. If they have lots of demand, then Apple gets a cheap price and TSMC raises prices on other wafers to boost margins a bit?
GloFo probably has more excess supply and so lower costs. Plus AMD has the WSA. Even making $1 million off of $100 million in wafers is a better margin than throwing $100 million away. Right?
 
I think this is a pretty smart move on AMD's part. It looks like it will very likely offer more performance and longevity than any GP106 device (assuming it isn't 256bit here). And $199 is an incredibly important price point, so kudos to AMD for recognizing & capitalizing on this.

Also looks like they are finally catching up in perf/w, but my educated guess is that they are likely still behind Nvidia (though probably very close now).

The one concern I have long-term is the clock speed. Obviously, if they could clock it higher they would... It probably won't be a huge issue for Polaris 10 / RX 480, since it looks to sit above GP106 (though if that comes at a 1.8GHz+ base clock, who knows) and has a time-to-market advantage as well. But I don't see how an RX 490 has much hope of competing with the 1080.

Still, definitely the right move for AMD to make, just wish I could buy one tomorrow (for my own sake and theirs)....
 
If they have lots of demand, then Apple gets a cheap price and TSMC raises prices on other wafers to boost margins a bit?
GloFo probably has more excess supply and so lower costs.
Not quite that simple. Obviously the fab has the cost of building out the facility & capability, which must be paid off. The more business you do the more that cost gets distributed, bringing costs per chip down.
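The amortization point can be made concrete with a toy model (all dollar figures made up for illustration; real fab economics are far messier):

```python
# Toy illustration of fab fixed-cost amortization: the more wafers a fab
# sells, the smaller the share of the build-out cost each wafer carries.
# All figures are assumptions for illustration only.

FIXED_COST = 5_000_000_000       # assumed fab build-out cost
VARIABLE_COST_PER_WAFER = 3_000  # assumed marginal cost per wafer

def cost_per_wafer(wafers):
    """Average cost per wafer once fixed costs are spread over volume."""
    return VARIABLE_COST_PER_WAFER + FIXED_COST / wafers

for volume in (100_000, 500_000, 1_000_000):
    print(f"{volume:>9} wafers -> ${cost_per_wafer(volume):,.0f} per wafer")
```

At 10x the volume, the fixed-cost share per wafer drops by 10x, which is why a fab with excess capacity has an incentive to fill it even at thin margins.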
 
It's also entirely possible to have low revenue and operate at a profit. For example, back during the GTX 280/260 versus Radeon 4870/4850 price wars, Nvidia had high revenue and operated at a loss (AMD also was operating at a loss, I believe).


nV hasn't had an operating loss in a long time, and I think that was the last time they reported one. But revenue also dropped in those quarters (Q1 2009 and Q2 2009).

AMD was also reporting a loss at the time.
 
We'll have to wait for actual reviews to see where AMD is with regards to perf/watt. A 6-pin connector just means that the power required is greater than 75 watts but lower than 150 watts. If the wccftech piece is anywhere close to what the RX 480 ends up at, it'll be anywhere from 110 watts up to 130 watts. If it's closer to the former, that's pretty good. If it's closer to the latter, then at least they are closer to Nvidia than they have been for the past few years.

Regards,
SB
 