NVIDIA Kepler speculation thread

And that is a problem exactly why?

Intel is basically paying Nvidia a royalty stream for the right to use Nvidia's patents in the integrated GPUs in its processors and in Xeon Phi.

For someone who constantly posts about how wonderful the royalties AMD will get from the consoles will be: Intel's royalty payments to Nvidia are constant and pure profit to the bottom line.

AMD got a one-time $1.25 billion from Intel, and after the initial boost has been back to posting losses quarter after quarter.

Nvidia got $1.5 billion spread out over six years, and even if you back it out, the company still makes a profit.
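Back-of-the-envelope, assuming the $1.5 billion is recognized straight-line over the six years (the actual payment schedule isn't specified in the post):

```python
# Straight-line run-rate of the Intel licensing deal cited above.
total_payment = 1.5e9        # $1.5B licensing deal, per the post
years = 6                    # spread over six years
per_year = total_payment / years
per_quarter = per_year / 4
print(f"${per_year / 1e6:.0f}M per year, ${per_quarter / 1e6:.1f}M per quarter")
# → $250M per year, $62.5M per quarter
```

That roughly $60M+ per quarter is what "backing it out" removes from reported results.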

Oh, and when the cross-licensing agreement comes up for renewal, expect it to be renewed and the revenue stream to continue.

Actually, AMD was nicely profitable in 2009, 2010 and 2011. However 2012 was pretty terrible.
 
NVIDIA employs literally thousands of engineers. Considering that they are able to design and engineer some of the fastest and most complicated GPUs in the world, and considering that they have hired a lot of extra talent to work on CPUs, I don't think it would be beyond them to design a world-class CPU. We'll just have to wait and see. Note that the end goal here is not just to design a good CPU, but to integrate a fully custom NVIDIA CPU on the same chip as a fully custom NVIDIA GPU.

Intel poured more resources and hired more people for standalone GPU development than NV ever could for Denver, and yet what came out the other end was Larrabee. Your point being?

What's a world-class CPU exactly, given that it'll be based on the ARM ISA? Are you really THAT confident that it could take on a future low-power Haswell variant? Competitive with the rest of the ARM CPU crop, definitely, but that's not extremely impressive either, considering Qualcomm's and Apple's own custom CPUs.

I can't imagine that Denver won't be aiming at desktop/notebook SoCs for the entry-level markets. Going up against Intel there will be a mighty tough cookie, and neither magic wands nor secret sauces are going to help, I'm afraid.
 
Can we split the financial argument off into a new thread? This is Architecture and Chips after all.

I have to say, however, that it's pretty humorous reading people's comments that profits significantly larger than the entire market revenues for the market I compete in are somehow insignificant. lol.
 
Yes that's what I said. I said they had a one-off good quarter and that revenues would be down in Q4, and that's what happened.

A profit of $174M on revenue of $1.107B vs. $209M on $1.204B is, IMO, within the same ballpark. You can't make a big distinction between them by calling the one a one-off good quarter and then dismissing the other. They also had good quarters prior to Q3, and year-over-year shows great gains. Granted, since you never actually specified any ranges or figures, you have plenty of room to back-pedal in whichever direction you happen to choose.
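For what it's worth, the net margins implied by the figures in the post (assuming the pairing is $209M on $1.204B for Q3 and $174M on $1.107B for Q4) really are close:

```python
# Net margin comparison for the two quarters cited in the post.
q3_profit, q3_revenue = 209e6, 1.204e9   # the "one-off good" quarter
q4_profit, q4_revenue = 174e6, 1.107e9   # the following quarter
q3_margin = q3_profit / q3_revenue * 100
q4_margin = q4_profit / q4_revenue * 100
print(f"Q3: {q3_margin:.1f}% net margin, Q4: {q4_margin:.1f}% net margin")
# → Q3: 17.4% net margin, Q4: 15.7% net margin
```

Under two percentage points of margin between them, which is the "same ballpark" point being made.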

Sorry Mize. I'll try to restrain myself.
 
Can we split the financial argument off into a new thread? This is Architecture and Chips after all.

I have to say, however, that it's pretty humorous reading people's comments that profits significantly larger than the entire market revenues for the market I compete in are somehow insignificant. lol.

Agreed.
 
I'm just posting this here to finish my comments on what had been said before (plus I can't find a relevant thread)

http://www.techpowerup.com/180389/J...orts-Graphics-Market-Down-8.2-in-Q4-2012.html

Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia, announced estimated graphics chip shipments and suppliers' market share for Q4'12.

The news was disappointing for every one of the major players. AMD dropped 13.6%, Intel slipped the least, just 2.9%, and Nvidia declined the most with 16.7% quarter-to-quarter change, this coming on the heels of a spectacular third quarter. The overall PC market actually grew 2.8% quarter-to-quarter while the graphics market declined 8.2% reflecting a decline in double-attach. That may be attributed to Intel's improved embedded graphics, finally making "good enough" a true statement.
On a year-to-year basis we found that total graphics shipments during Q4'12 dropped 11.5% as compared to PCs which declined by 5.6% overall.
Nvidia's quarter-to-quarter desktop discrete shipments fell 15.1% from last quarter; and, the company's mobile discrete shipments dropped 18.4%. The company's overall PC graphics shipments declined 16.7%.
I feel it's quite likely that the new change in reporting was to reflect/hide this massive drop.
 
So now that Titan is out, the big question is if/when Nvidia will refresh GK104, 106, and 107, and whether there will be more cut-down GK110 parts making their way to the GeForce lineup... news on these has been fairly nonexistent thus far. :/
 
From Heise online: "First Haswell notebooks announced" (original).

Both Haswell notebooks use two new graphics chips about which Nvidia has not yet made a public statement: the GeForce GTX 770M (3 GB of GDDR5) is the successor to the GTX 670M, while the flagship GTX 780M (4 GB of GDDR5) replaces the GTX 680M. Further technical details about the two new high-end GPUs are not yet known, but the top model is said to be about 30 percent faster than its predecessor. Specifically, DevilTech quotes a 3DMark Vantage score of 29,458 points (Performance preset) for the DTX Fragbook with a Core i7-4900MQ, GeForce GTX 780M, and 8 GB of DDR3 memory.
30% above the 680M would probably put it around or slightly above the 680MX.
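As a rough sanity check on the "around the 680MX" guess, assuming the quoted 30 percent speedup and that Vantage scores scale linearly (both assumptions, not confirmed figures):

```python
# If the 780M scores 29,458 and is ~30% faster than its predecessor,
# the implied GTX 680M baseline score is:
gtx_780m_score = 29458     # 3DMark Vantage (Performance preset), per the quote
speedup = 1.30             # "about 30 percent faster than its predecessor"
implied_680m = gtx_780m_score / speedup
print(f"Implied GTX 680M score: ~{implied_680m:.0f} points")
# → Implied GTX 680M score: ~22660 points
```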
 
From Heise online: "First Haswell notebooks announced" (original).

30% above the 680M would probably put it around or slightly above the 680MX.

Well, hopefully the desktop parts will get refreshed too. GK110 shows that, with the lessons learned from building the rest of the Kepler family (and/or the process maturity since GK104 first came out), there is some room for perf/watt improvement (or straight-up performance) among GK104, 106, and 107. I'd still like and hope to see more cut-down GK110 SKUs end up in the GTX 700 lineup.
 
I thought I had carefully checked and the 680M was a Fermi; no, you're right, it's a GK104 :oops:. The Fermis are the 670M and 675M (GF114).

But still, that big difference in performance can be explained: the 680MX has a much faster memory clock, "2500 MHz" instead of "1800 MHz" (old-style doubled numbers). All the details about the fast mobile GeForce 6xx GPUs are here:

http://www.notebookcheck.net/NVIDIA-GeForce-GTX-680M.72679.0.html

Note the crazy-high TDPs for a laptop: the 680M is rated at 100 watts and the 680MX at 122 watts.
That doesn't rule out a respin; what if GK114 is just a silicon respin of the very same GPU, launched silently, so to speak?
 