So you think pricing has to do with power consumption alone? The fact that the Polaris 10 chip is 30% smaller and may use cheaper memory and cheaper PCB components counts for nothing compared to power consumption, and that's why they have to play the price card?
Well, the RX 480 at 150 watts versus the Nvidia GTX 1070 at 150 watts, with the former performing similarly to a 980 and the latter to a 980 Ti, tells us that Polaris relative to Pascal has barely moved an inch in efficiency from where it stood at 28 nm relative to Nvidia's Maxwell architecture. Of course they have to play the price card, at least until the 1060 is out.
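To put rough numbers on that efficiency gap, here is a quick back-of-envelope sketch; the relative-performance indices are assumptions for illustration, not measurements:

    # Back-of-envelope perf/W comparison at the same 150 W board power.
    # Performance indices are assumptions (GTX 980 = 100), not measurements.
    tdp_w = 150.0
    perf_rx480 = 100.0    # assumed: roughly GTX 980 level
    perf_gtx1070 = 135.0  # assumed: roughly GTX 980 Ti level

    eff_rx480 = perf_rx480 / tdp_w
    eff_gtx1070 = perf_gtx1070 / tdp_w
    print(f"RX 480:   {eff_rx480:.2f} perf units per watt")
    print(f"GTX 1070: {eff_gtx1070:.2f} perf units per watt")
    print(f"1070 efficiency advantage: {eff_gtx1070 / eff_rx480 - 1:.0%}")

Under those assumed numbers the 1070 comes out roughly a third more efficient at the same board power, which is the gap being described.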
If Nvidia sold 150W 1070s at $199 they would make a crap ton more money than AMD, even with a larger die.
It's a little premature to make deep conclusions based just on TDP numbers, I think.
I think GCN cards are rated for high TDPs because compute loads fully stress the GPU, which pushes power consumption up even though efficiency stays good. So an RX 480 may consume ~110W under gaming loads and around its TDP under GPGPU loads.
Nvidia instead rates their cards at average power consumption, once they feel confident that under no plausible condition will the cards have thermal/power problems. This TDP rating became a bit less aggressive with Pascal versus Maxwell.
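A toy illustration of that difference in rating philosophy; every wattage below is made up:

    # Toy illustration of the two rating philosophies; all wattages are invented.
    gaming_draw_w = [105, 112, 108, 115, 110]   # hypothetical per-game averages
    compute_draw_w = [148, 150, 145]            # hypothetical GPGPU stress loads

    avg_gaming = sum(gaming_draw_w) / len(gaming_draw_w)
    worst_case = max(compute_draw_w)
    print(f"Average gaming draw: {avg_gaming:.0f} W")  # an 'average power' style rating
    print(f"Worst-case compute draw: {worst_case} W")  # a conservative TDP style rating

The same silicon can honestly carry either number; it depends on which load profile the vendor rates against.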
At the dawn of the DX12 age, I'd rather wait and pay the premium, or pay the early adopter tax. That's just me, but I don't have the patience to pay more of my time into every game the developer/AMD louse up with every release/patch/driver update.
Would buying two 480s for CrossFire be a reliable way of getting 1080 levels of performance in DX12? I've heard that DX12 natively does CrossFire but I haven't looked into how reliable it is. I want 1080 levels of performance in DX12 but also want the more future-proof async compute in AMD GPUs.
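For what it's worth, here is a crude scaling estimate; every figure is an assumption, and multi-GPU scaling varies wildly per game and per DX12 title:

    # Crude CrossFire scaling estimate; every figure is an assumption
    # (GTX 980 = 100 baseline), and real scaling varies wildly per game.
    perf_rx480 = 100.0    # assumed single RX 480
    perf_gtx1080 = 165.0  # assumed GTX 1080
    cf_scaling = 0.7      # assumed average benefit from the second card

    perf_two_480 = perf_rx480 * (1 + cf_scaling)
    print(f"2x RX 480 at {cf_scaling:.0%} scaling: {perf_two_480:.0f}")
    print(f"GTX 1080: {perf_gtx1080:.0f}")

So on paper two 480s can land near a 1080, but only when the game's multi-GPU path actually scales, which is exactly the reliability question.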
Source for this insinuation, please.
Particularly for the "crap ton more" part.
Nvidia must not want money then.
It must cost Nvidia nothing to make bigger dies.
Why do you say that? They will make even more at $379+.
My point is that price has nothing to do with die sizes or cost. Price is a measure of perceived value.
I would not bet that the silicon cost is cheaper with the GloFo 14nm Polaris 10 process than with TSMC's 16nm for the 1070, even taking into account that the die is larger in Nvidia's case.
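A rough sketch of how die size feeds into silicon cost; the wafer prices and defect density below are invented placeholders, and the die areas are approximate public figures:

    import math

    # Rough die-cost sketch. Wafer prices and defect density are invented
    # placeholders; die areas are approximate public figures.
    def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
        r = wafer_diameter_mm / 2
        # usable dies ~ wafer area / die area, minus an edge-loss term
        return math.pi * r**2 / die_area_mm2 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)

    def cost_per_good_die(wafer_cost_usd, die_area_mm2, defects_per_cm2=0.2):
        yield_frac = math.exp(-defects_per_cm2 * die_area_mm2 / 100)  # simple Poisson yield model
        return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_frac)

    print(f"Polaris 10 (~232 mm^2): ${cost_per_good_die(7000, 232):.0f} per good die")
    print(f"GP104 (~314 mm^2):      ${cost_per_good_die(8000, 314):.0f} per good die")

Under these made-up inputs the bigger die costs more per chip, but not by anywhere near the retail price gap; change the wafer prices and the comparison can swing either way.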
GCN excluding Fiji seems to perform about equal to Nvidia cards with about 10-15% fewer FLOPS in the real world when boost clocks are in play.
Traditionally AMD has been a fair way behind NV in performance per FLOP, though. Fury X is an 8.6 TF card compared to Titan X, which is only 6.6 and faster. Polaris may change that, of course.
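Putting that Fury X / Titan X comparison into per-TFLOP terms; the TFLOPS figures are the ones quoted above, while the relative performance numbers are assumptions:

    # Perf-per-TFLOP sanity check; TFLOPS figures are the ones quoted above,
    # relative performance numbers are assumptions.
    fury_x_tflops, fury_x_perf = 8.6, 100.0     # assumed relative performance
    titan_x_tflops, titan_x_perf = 6.6, 105.0   # assumed: "only 6.6 TF and faster"

    print(f"Fury X:  {fury_x_perf / fury_x_tflops:.1f} perf per TFLOP")
    print(f"Titan X: {titan_x_perf / titan_x_tflops:.1f} perf per TFLOP")

With those assumptions Maxwell extracts roughly a third more real-world performance per theoretical FLOP, which is the gap Polaris would need to close.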
Lol, you want a source for a hypothetical scenario? The source is me.
A $199 1070 will generate a lot more sales and revenue than a $199 480 for obvious reasons - stronger brand and higher performance. The relative die sizes would be inconsequential.
Hasn't everything I've heard been that TSMC has a ton of demand? We will learn when the Apple A10 comes. If they have lots of demand, then Apple gets a cheap price and TSMC raises prices on other wafers to boost margins a bit?
Not quite that simple. Obviously the fab has the cost of building out the facility & capability, which must be paid off. The more business you do, the more that cost gets distributed, bringing costs per chip down.
GloFo probably has more excess supply, and so lower costs.
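A toy model of that amortization point; all figures are invented:

    # Toy amortization model: fixed build-out cost spread over wafer volume.
    # All figures are invented for illustration.
    fab_buildout_cost = 8_000_000_000   # hypothetical capex for the node
    processing_cost_per_wafer = 3000    # hypothetical variable cost

    for wafers in (1_000_000, 3_000_000):
        effective = fab_buildout_cost / wafers + processing_cost_per_wafer
        print(f"{wafers:,} wafers -> ${effective:,.0f} per wafer")

Double or triple the wafer volume and the effective per-wafer cost drops sharply, which is why an underutilized fab has an incentive to price aggressively.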
It's also entirely possible to have low revenue and operate at a profit; revenue and profit don't have to move together. For example, back during the GTX 280/260 versus Radeon 4870/4850 price wars, Nvidia had high revenue and operated at a loss (AMD was also operating at a loss, I believe).