NVIDIA Kepler speculation thread

The point is, if there are serious problems with 20nm and chips on that process would arrive later than expected, you would need to extend the 28nm lifetime somehow. Imagine they released something now and then had to wait until maybe Q3 2014. So why not push the refresh back a bit and slow down a little?
 
If the first 20 nm chips show up in late 2014 rather than early 2014, I would have thought there would be three 28 nm series of chips at a ~1 year cadence, even if the third is a hybrid like Northern Islands or some type of stopgap. If there really are only two 28 nm series, then I wonder what motivation (if any) they had for "skipping" the late 2012 / early 2013 series rather than a late 2013 / early 2014 one.

I think NVIDIA's 700 series (minus the Titan, although that may not be named under the "GTX 700" series) will end up similar to AMD's 8000 series (at least what we've seen so far), in the sense that there won't be significant performance increases. From what I've read, the 680 appears to have a memory bandwidth bottleneck. So if the 700 series GK104 parts stay at a 6 Gbps maximum memory speed, they may not have significant core clock speed increases over the 670/680. If they move up to 7 Gbps (probably less likely, maybe that'll happen with GK114/GK204, if at all), then that would open the door to similar core clock increases. Maybe the successor to the 660 Ti (and 650 Ti) won't have reduced bus widths.
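For reference, here's the back-of-the-envelope bandwidth math behind that argument (GK104's 256-bit bus is its actual width; the rest is just arithmetic on the quoted data rates):

```python
# Peak GDDR5 bandwidth on GK104's 256-bit bus: bus width in bytes
# times the effective per-pin data rate.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return (bus_width_bits / 8) * data_rate_gbps

print(bandwidth_gb_s(256, 6.0))  # 192.0 GB/s at the 680's stock 6 Gbps
print(bandwidth_gb_s(256, 7.0))  # 224.0 GB/s at 7 Gbps, ~17% more headroom
```

So a move to 7 Gbps would buy roughly 17% more bandwidth, which is about the scale of core clock increase it could feed.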

If the 680 successor has similar or identical specs to the 680, NVIDIA might reduce the TDP slightly (or maybe not; AMD didn't seem to with the 8970 etc.), which may allow a dual-chip 690 successor to be two full, non-clock-reduced 680 successors. However, if the Titan is as good as certain rumors claim, there might not be a direct 690 successor.

Also, I don't think a Kepler refresh in late 2013 means that there won't be 28 nm Maxwell in H2 2014. There's a fair amount of room above even a refreshed GK104, assuming they don't add SMXes to it or increase its bus width, for a Maxwell chip to slot in.
 
I'd think it would make sense to capitalize on their experience with 28nm rather than stretching out the current products.
 
I'll concur with you... especially if you know 20nm is delayed or not good... you release the 28nm parts and then go on to see what you can do with the next series...

How can AMD and Nvidia let their shipments drop so much for a complete year (Q4? So one complete year without a new lineup? In reality two years, if we count from when the 7970 launched)... Their shipments will fall dramatically outside OEM (and again, even there)... I can imagine what an earthquake this will produce among the people who own shares of those companies.

After reading this, I went looking to check whether TSMC hadn't exploded or disappeared during the night.
 
I wonder if AMD has stopped helping TSMC with development now. The 28nm lead was only a couple of months whereas AMD had the 4770 out how long before on 40nm - a year or so?

Shipments are going to collapse anyway because the desktop market is in freefall. AMD has no need for the high end enthusiast market, this can only harm Nvidia.
 
I wonder if AMD has stopped helping TSMC with development now. The 28nm lead was only a couple of months whereas AMD had the 4770 out how long before on 40nm - a year or so?

Shipments are going to collapse anyway because the desktop market is in freefall. AMD has no need for the high end enthusiast market, this can only harm Nvidia.

This doesn't just affect the high-end enthusiast market. It affects the whole line. AMD requires the development of discrete graphics cards because not only does the GPU division make money, the technology from their GPU series eventually makes its way into their APUs.

With Intel having the node advantage for their graphics parts, they can catch AMD as far as graphics performance goes if AMD slows down the development of its graphics parts too much.

The graphics part of AMD's APUs is the only thing that allows AMD to have some sort of margin on their chips.

If AMD loses the graphics advantage along with the long-lost CPU advantage (as well as having the loser AMD brand), they can forget about selling any chips at any sort of profit.
 
Development would continue to take place; however, not having any high-end chips to sell right away isn't going to be a disaster for AMD.

To be honest, AMD doesn't have an awful lot left to lose in discrete GPUs already. They appear to have cut production by nearly 25% or even more. GPU prices are going up. Being first to market over and over again hasn't gained them a lot. The long-term goal must surely be to have all GPUs made at Globalfoundries too, so a slowdown at TSMC can't hurt in that respect.

This is probably all about cost though, even for Nvidia. With Intel threatening 22nm Atoms, the ARM crowd needs to react quickly. I can imagine how much an early 20nm wafer is going to cost, and I can easily imagine why AMD doesn't want anything to do with it. Nvidia needs Tegra more now as well, rather than crap yields on large chips early on a new process. AMD and Nvidia have beaten themselves up with this price war that's gone on for too long.
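To put some illustrative numbers on that (purely a sketch: the wafer price and defect density below are made-up placeholders for an immature process, not actual TSMC figures), the standard Poisson yield model shows why large dies early on a node get so expensive:

```python
import math

# Toy cost-per-good-die model. Wafer price and defect density are
# hypothetical placeholders for an early, immature process.
WAFER_COST = 6000.0        # assumed $ per early 20nm wafer (illustrative)
DEFECT_DENSITY = 0.5       # assumed defects per cm^2 early in the ramp
WAFER_DIAMETER_MM = 300.0

def gross_dies(die_area_mm2):
    """Approximate die candidates per wafer: wafer area / die area,
    minus a standard edge-loss correction term."""
    r = WAFER_DIAMETER_MM / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2):
    # Poisson yield model: yield falls exponentially with die area (in cm^2).
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2 / 100)
    return WAFER_COST / (gross_dies(die_area_mm2) * yield_rate)

print(cost_per_good_die(80))   # ~$11 for a Tegra-sized SoC
print(cost_per_good_die(550))  # ~$940 for a GK110-sized monster
```

With those (invented) assumptions, the large die costs nearly two orders of magnitude more per good chip, since it gets hit twice: fewer candidates per wafer, and exponentially worse yield.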

We're witnessing the end of the high-end PC - there are just too many indicators to ignore. Maybe they are making it a self-fulfilling prophecy by their actions, but it sure looks to me like they can't get out of it fast enough.
 
The point is, if there are serious problems with 20nm and chips on that process would arrive later than expected, you would need to extend the 28nm lifetime somehow. Imagine they released something now and then had to wait until maybe Q3 2014. So why not push the refresh back a bit and slow down a little?

That's what I meant, excuse my English :cry:
 
We're witnessing the end of the high-end PC - there are just too many indicators to ignore. Maybe they are making it a self-fulfilling prophecy by their actions, but it sure looks to me like they can't get out of it fast enough.

The trouble is that despite NV's Tegra efforts, they're nowhere near having a strong foothold in that market yet. I don't doubt that they'll continue building up there, but it's inevitably a slow-moving process, while in contrast the winding down of the PC market is accelerating way too fast.

I won't dare make any far-fetched predictions, but let's hope that there won't be any serious time gap between the two aforementioned, otherwise it might get tough somewhere in between.

For this year, on the other hand, if both should decide to significantly lower prices on existing GPU SKUs, it might not be a eulogy after all, since it might breathe some air into PC gaming, with consumers responding with much higher sales volumes. Granted, so far both AMD and NVIDIA have sold their 28nm GPUs at relatively high margins, but I can't get rid of the feeling that volumes have decreased due to the higher prices.
 
The point is, if there are serious problems with 20nm and chips on that process would arrive later than expected, you would need to extend the 28nm lifetime somehow. Imagine they released something now and then had to wait until maybe Q3 2014. So why not push the refresh back a bit and slow down a little?

I think the main problem already is that most people are losing the interest and the need to upgrade. And slowing down, making new generations with negligible performance improvements, makes things even worse. OK, they will kill the PC market, but that means they will most probably kill themselves as well. I think they deserve it. :rolleyes:

And if they really do this slowdown, it won't be "a bit": from Q1 2013, when all normal people would expect it, to Q4 2013 is quite a significant delay...
 
I think the main problem already is that most people are losing the interest and the need to upgrade. And slowing down, making new generations with negligible performance improvements, makes things even worse. OK, they will kill the PC market, but that means they will most probably kill themselves as well. I think they deserve it. :rolleyes:

It's been that way for CPUs for several generations now. I'm pretty sure I can affordably upgrade my 5750 video card to something with enough extra guts to be noticeable. My 860 CPU, though? Not so much.

Dear industry -- I need another order of magnitude. And if I'm going to be editing 8k footage, I need two.

kthxbye
 
I think the main problem already is that most people are losing the interest and the need to upgrade. And slowing down, making new generations with negligible performance improvements, makes things even worse. OK, they will kill the PC market, but that means they will most probably kill themselves as well. I think they deserve it. :rolleyes:

No idea how others think or react, but in my neck of the woods I still need 380 Euros for a vanilla 7970 and over 470 for a GTX 680 today, when in the past generation, about a year after release, you would have paid that much for a high-end single-chip SKU, not for a 294mm2 die like GK104. Financials haven't gotten better on a worldwide scale, rather the exact opposite. When you know that everyone's budget is tighter than ever before and you raise prices on top of that, you'll obviously go for higher margins and lower volumes, but the real question is: for how long exactly?
 
I'm not even sure it's about higher margins and lower volumes any more. Look at AMD and the position they are in and where they appear to be going. Last year we thought it maybe made sense to hold off on Kabini until 28nm was mature, but they are probably at the stage now where a 20nm shrink of Kabini is their #1 target. Can we really expect new 20nm graphics cards at the start of a new process again? I just don't think so.

I get your point on Tegra for Nvidia - in many ways they are far more dependent on discrete still, but they can't hide from this. The top end is going away - hell it's not just the top end. Tablets will probably overtake laptops in sales this year. That is fucking insane. Nobody saw this coming :p

Nvidia needs Tegra to be as good as it can be, and AMD needs Kabini/Temash on 20nm to be the same. That's why I think we're looking at them going with their SoCs early on a new process, and the big discrete graphics chips can come later, when it's cheaper.

In the end, only AMD and Nvidia are competing for the discrete graphics market. If they both looked at it logically they'd come to the same conclusion - they can fight it out for buttons in the discrete market later, but now they need small chips in volume for the tablet takeover.
 
Tablets will probably overtake laptops in sales this year. That is fucking insane. Nobody saw this coming :p

I did, and I had neither a crystal ball nor anything else. Look at the following prototype: http://www.brightsideofnews.com/new...ertablets-cometh-panasonic-leads-the-way.aspx I don't want to know what it'll cost in its final form, and its battery life is still way too low, but doesn't it further prove which direction things might be moving in? Wait until higher-end SoCs start to find their home in future smart TVs, and we'll see how generic home entertainment will look in a couple of years.

Nvidia needs Tegra to be as good as it can be, and AMD needs Kabini/Temash on 20nm to be the same. That's why I think we're looking at them going with their SoCs early on a new process, and the big discrete graphics chips can come later, when it's cheaper.

No idea what each of those SoCs will be manufactured on, but there's quite a difference between, say, 20HP and 20LP or any other process variant that better suits low-power devices. On top of that, manufacturing processes are becoming more and more problematic as time goes by. Tegra 4 on 28nm is roughly 80mm2, which, irrespective of the process variant, is quite a difference from a chip monster like the ~550mm2 GK110. Even if either IHV goes for increasingly bigger SoCs, they're obviously not going to come anywhere near the maximum tolerable die area of each process in the foreseeable future.
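Just for a sense of scale, here's the standard dies-per-wafer approximation for those two die sizes on a 300mm wafer (exact counts depend on die aspect ratio and scribe lines, so treat these as ballpark figures):

```python
import math

# How many die candidates fit on a 300mm wafer at ~80mm2 (Tegra 4-class)
# versus ~550mm2 (GK110-class)? Wafer area / die area, minus an
# edge-loss correction term.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(80))   # ~809 candidates per wafer
print(dies_per_wafer(550))  # ~100 candidates per wafer
```

That's roughly an 8x difference in candidates per wafer, before yield differences even enter the picture.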

In the end, only AMD and Nvidia are competing for the discrete graphics market. If they both looked at it logically they'd come to the same conclusion - they can fight it out for buttons in the discrete market later, but now they need small chips in volume for the tablet takeover.

For NV, at least for the time being, it's their Tegra department that could be showing resource constraints, and not the other way around.
 
No idea what each of those SoCs will be manufactured on, but there's quite a difference between, say, 20HP and 20LP or any other process variant that better suits low-power devices.

Sure, but it's all wafer starts, whether LP or HP, right? The point I'm trying to make is that both Nvidia and AMD have probably had a large shift in emphasis over the past year. AMD obviously has; Nvidia still has a lot of eggs in discrete GPU, but they know it's going away. Can they afford to stick with early large GPUs on a new process knowing that they are harming their likely future, i.e. Tegra and even smaller phone chips?

On top of that, manufacturing processes are becoming more and more problematic as time goes by. Tegra 4 on 28nm is roughly 80mm2, which, irrespective of the process variant, is quite a difference from a chip monster like the ~550mm2 GK110. Even if either IHV goes for increasingly bigger SoCs, they're obviously not going to come anywhere near the maximum tolerable die area of each process in the foreseeable future.
I don't think we'll see increasingly bigger SoCs; what I think we'll see is Nvidia and AMD becoming increasingly needful of early wafers for their SoCs. Intel is going all out with Atom at 22nm soon, and then at 14nm. The ARM crowd all need to rise to this challenge, Nvidia included.

Discrete GPU isn't where the money is any more. It never really was for AMD, so their choice is easy. I really expect 20nm to be mature before we see any discrete graphics above, say, 150mm2. Then again, we don't know if AMD is planning a full switch to Globalfoundries for their SoCs, in which case they actually could still use TSMC for discrete only...

I dunno, but I get the feeling that some agreement has been made between Nvidia and AMD to put their GPU war aside for now. Nvidia finally woke up to the fact that Intel was their main enemy, perhaps? I guess we'll hear more about it soon.
 
Discrete GPU is, contrary to what you said, where AMD's Graphics Division aka ATI made money in almost every quarter over the last couple of years, whereas the CPU division, which includes APUs, turned in losses - even while being pimped with Radeon genetics.
 
Yes, but the CPU loss is too great to be offset by the GPU profit.

Somewhat similar to the situation they were in prior to selling the fabs: the losses on their manufacturing arm could not be compensated by any profit on the CPU side.
 