Nvidia Pascal Announcement

Vega only tried that with the L2 cache and with the L1 data/instruction caches, which are now shared between 3 CUs instead of 4. But they said the latter, at least, was done not to increase cache capacity per CU but to improve signalling and help keep the clocks high. Registers and (most) other things inside the CUs stayed constant in this regard.
 
More 1070 Ti leaks:
 
The price bracket between the 1080 and the 1070 is almost non-existent here. The cheapest available models sit at 400 and 500 Euro respectively, with the more potent versions of the former filling the gap to the latter. It will be a mess if all three lines are continued.
 
Maybe they'll decrease the 1070 price and increase the 1080 price. It's not as though Vega is highly competitive, and it's priced high anyway.
 
The 1070 Ti is rumored to have 9 Gbps memory now.

https://www.overclock3d.net/news/gpu_displays/nvidia_s_gtx_1070_ti_is_expected_to_use_9gbps_memory/1

I think we all thought it was strange when earlier rumors suggested 8 Gbps memory.

Sticking with GDDR5 means Nvidia knows this will be eaten up by miners.


My bet is that the 1070 Ti functionally replaces the 1070 and basically allows Nvidia to get a bigger piece of the inflated 1070 prices.

Remember that the MSRPs for the 1070 and 1080 are $380 and $500, respectively, so there's room for a $440-450 card even if it would functionally cost about the same as the existing 1070s.
 
Possible, but I was talking about market prices, of course, which is what matters for end users, as opposed to Nvidia's direct customers. And there, mining drove the lower part of the stack (namely the ETH-friendly GDDR5 cards) up and pushed it very close, arguably too close, to the cheapest of the upper three cards.
 
You'd have a slight chance of seeing it in their quarterly results, if there's any truth to it. Do they sell at or near zero margin at their advertised launch prices? Possibly. Do they sell at a loss? Hard to say; I lean toward no.
 
There's literally zero evidence suggesting that to be the case.

In fact, the original source of this statement seems to have made it up out of thin air. Also, AMD has denied it.

How is profitability measured for the purposes of this exercise in conjecture? Current manufacturing run-rate cost? Is an R&D amortization schedule included? Component spot pricing vs. long-term supply contract pricing with volume de-escalators?
 
Not sure I understand what you're getting at.

He means there's a ton of ways to calculate the costs of a consumer electronic product for purposes of measuring profitability.

In plainer English:
  • Do you include R&D costs?
    • If so, how do you amortize those costs (you have one massive R&D dollar amount that must be allocated to each product produced in the future, but you don't know how many you'll make)?
  • How do you measure the individual component costs in your product's BoM?
    • You negotiate complex contracts for components where the unit costs might change as you ramp production up (and then down). Do you use current pricing or an "average" long-term price?

I don't have answers to those questions, but that's an interesting thought experiment.
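
To make the ambiguity concrete, here's a minimal Python sketch of that thought experiment. Every figure in it is hypothetical (the MSRP, BoM, R&D total, and volumes are all made up); the point is only how much the answer swings with the accounting choices:

    # Toy model of per-unit profitability under different accounting choices.
    # All figures below are hypothetical, chosen only to illustrate the spread.

    MSRP = 450.0  # assumed selling price per card, USD

    def unit_cost(bom_per_unit, rd_total, units_amortized):
        # Per-unit cost: component BoM plus R&D spread over an assumed volume.
        return bom_per_unit + rd_total / units_amortized

    # (BoM per unit, total R&D to allocate, units the R&D is amortized over)
    scenarios = {
        "spot BoM, optimistic volume":  (310.0, 500e6, 10_000_000),
        "spot BoM, pessimistic volume": (310.0, 500e6, 2_000_000),
        "contract BoM, R&D as sunk":    (280.0, 0.0, 1),  # ignore R&D entirely
    }

    for name, (bom, rd, units) in scenarios.items():
        cost = unit_cost(bom, rd, units)
        margin = (MSRP - cost) / MSRP
        print(f"{name:30s}  cost ${cost:7.2f}  gross margin {margin:6.1%}")

Under those made-up numbers, the very same card swings from roughly a 38% gross margin to a 24% loss depending purely on whether and how R&D is amortized, which is why "sells at a loss" claims are so hard to verify from the outside.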
 