NVIDIA Kepler speculation thread

Discrete sound cards are not dead. If you want the best quality, you won't go for something integrated.
There are laptops with discrete GPUs for professional purposes.
I don't think an APU will ever be able to deliver adequate performance for supercomputers, for instance.
And you can always use a dedicated graphics accelerator next to your APU. So, I don't see why discrete graphics should disappear.
 

Depends of course on what you mean by an "APU" exactly and how SoCs will evolve in the meantime. It's no secret that NV is developing custom CPU cores based on ARM instruction sets under the codename Denver, which according to all indications will be integrated into the GPU chip itself. And no, the result won't be aimed at mainstream markets only but at HPC too, for which I wouldn't necessarily think of high-end but rather low-end servers. However, it's a process that will evolve in that direction over time.

Current desktop/notebook SoCs have already replaced the IGPs of the past. In that regard NV had no other choice but to commit to Project Denver, otherwise we would see only Intel/AMD solutions in budget PCs/notebooks of the future. Now tell me why exactly SoCs couldn't, in due time, theoretically grow and replace low-end standalone GPUs as a next target? It's obviously not a process that will happen overnight, but there are tons of indications that things are moving gradually in that direction. And no, standalone GPUs won't necessarily disappear entirely in the long run; it's just that their sales volumes will probably be minuscule compared to SoCs.

And I frankly don't understand the "fear" of SoCs many seem to have, as if they'll represent something inferior in future technology development. Under the right conditions there are quite a few advantages that speak for an SoC, especially if one thinks of heterogeneous computing. Have a closer look at the last few pages of the PDF I linked above for the Exascale "experiment". It describes the prospects of Einstein (Maxwell's successor), and that's definitely not just a high-end GPU core.
 
Both CPUs and GPUs. Maybe one more than the other, but both are affected.

Which lasts how long in the case of the GPU? More than one generation? Upcoming consoles from Microsoft and Sony, according to supposedly leaked information, point at "7700"-class GPUs somewhere in the 1.2-1.5 TFLOPs FP32 range. A mainstream "8700" desktop standalone GPU should master any "low level API" advantage already. In any case it's not a novel phenomenon; it's typical that console hw at its launch is already outclassed by PC hw.
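For context, the arithmetic behind those TFLOP figures is just ALU count times 2 ops per clock (FMA) times clock rate; here's a quick sketch, where the second configuration is a made-up example and not any leaked console spec:

```python
# Peak FP32 throughput: ALUs * 2 ops/clock (fused multiply-add) * clock in GHz.
def peak_tflops_fp32(alus, clock_ghz):
    return alus * 2 * clock_ghz / 1000.0

print(peak_tflops_fp32(640, 1.0))   # HD 7770-class part: 1.28 TFLOPs
print(peak_tflops_fp32(768, 0.8))   # hypothetical wider/slower config: ~1.23 TFLOPs
```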
 
I think that "standalone" GPU may completely go away eventually.
The benefits of something like HSA may be too great to pass on, and selling GPUs that can't run APU code may be undesirable - go explain to the consumer that, sorry, algorithm A can be accelerated by either the APU or the GPU or both, but algorithm B can be accelerated by the APU only.

This doesn't necessarily mean the end of big cards on PCIe 16x: as Ailuros recalled, Maxwell is basically an APU on a stick, if they intend to sell it as a graphics card and not only as Tegra and Tesla, that is.
We might see a Radeon that does the same, integrating a future equivalent of Jaguar cores.
 
I think that "standalone" GPU may completely go away eventually.
The benefits of something like HSA may be too great to pass on, and selling GPUs that can't run APU code may be undesireable - go explain to the consumer that sorry, algorithm A can be accelerated by either the APU or GPU or both, but algorithm B can be accelerated by the APU only.

But what if you have an APU anyway that will handle all the algorithm A / GPGPU / HSA code, along with a discrete GPU that handles just the algorithm Bs, i.e. the graphics code that is not dependent on fast, low-latency communication with the CPU?

That may sound over-complicated, but in probably less than 2 years every "CPU" sold will actually be a fully or partially HSA-capable CPU + GPU SoC. So having that GPU which is local to the CPU will become a fairly standard thing and something to almost be taken for granted. When we get to that point, from a development point of view isn't it possible that SoCs can simply be treated as very SIMD-heavy CPUs where all "GPGPU" code can be run locally, and then we'd still rely on discrete GPUs to handle the graphics? In a way it's almost a reset back to where we were before GPGPU existed, since the CPUs/SoCs will have the resources available to them to make GPGPU operations on a discrete GPU unnecessary.

This seems to be what AMD are pushing for with HSA, i.e. treat the APU as one big processor. There's nothing making that philosophy incompatible with discrete GPUs IMO. Low-end users can run all code on the APU, while high-end users run CPU and GPGPU code on the APU and traditional graphics work on the discrete GPU.
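To make that split concrete, here's a toy Python sketch of the idea; the device names, numbers, and the whole "scheduler" are made up for illustration, not any vendor's actual runtime API:

```python
# Toy dispatcher: GPGPU/HSA-style work that shares data with the CPU stays on
# the APU's integrated GPU, while self-contained graphics work goes to the
# fastest available device (typically the discrete GPU).
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    shares_cpu_memory: bool   # True for the APU's integrated GPU
    peak_tflops: float

@dataclass
class Task:
    name: str
    needs_low_latency_cpu_access: bool   # the "algorithm A" style of work

def pick_device(task, devices):
    if task.needs_low_latency_cpu_access:
        # HSA-style work: prefer devices that share the CPU's address space.
        candidates = [d for d in devices if d.shares_cpu_memory]
    else:
        candidates = devices
    return max(candidates, key=lambda d: d.peak_tflops)

devices = [Device("APU integrated GPU", True, 0.7),
           Device("discrete GPU", False, 3.0)]

for t in [Task("physics step sharing data with game logic", True),
          Task("scene rendering", False)]:
    print(t.name, "->", pick_device(t, devices).name)
```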
 
Just wondering, what about Nvidia Maxwell and its integrated CPU? Will that fit into the HSA concept?

NVIDIA will probably have trouble caring less about HSA, especially given how it's shaping up nowadays. We'll see. If you just mean conceptually, as in whether an SoC with potentially interesting integration would fit into the paradigm they're (trying to?) propose, then yeah, it would.
 
JHH has said the break-even point for Tegra is about $1 billion in annual sales.

Do you recall where exactly he said that? I recall him saying a long time ago that it would only make sense to develop different product lines for Tegra if it became a $1 billion [revenue per year] business.
 

Tegra is accounted for in NVIDIA's Consumer Product Business, which also includes royalties for game consoles. Last quarter, according to their 10-Q filing, said CPB generated $244 million in revenue for an operating loss of $42 million.

These things tend to fluctuate from quarter to quarter, and I have no idea how much NVIDIA is getting in console royalties, but this suggests a break-even point somewhere in the neighborhood of $250~300 million. I guess on a yearly basis, that's about a billion.

Edit: Well, I've just checked their yearly 10-K, and there was $591 million in revenue for an operating loss of $208 million (page 100) over their last fiscal year. Previous years had much lower figures in both revenue and losses, suggesting that the business is facing increasing costs matching the increase in revenue, more or less. All in all, $1 billion sounds about right for this year. It might even be a low estimate for the future.
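For what it's worth, here's the back-of-envelope arithmetic behind that estimate as a quick sketch; the only assumption (mine) is treating the segment's cost base as roughly fixed, so break-even revenue is about current revenue plus current operating loss:

```python
# Quarterly figures from the 10-Q quoted above.
quarterly_revenue = 244e6
quarterly_op_loss = 42e6
quarterly_cost_base = quarterly_revenue + quarterly_op_loss
print(quarterly_cost_base / 1e6)       # ~286 -> quarterly break-even around $250-300M
print(4 * quarterly_cost_base / 1e9)   # ~1.14 -> roughly $1B per year

# Same exercise on the full-year 10-K figures from the edit above.
yearly_revenue = 591e6
yearly_op_loss = 208e6
print((yearly_revenue + yearly_op_loss) / 1e9)   # ~0.80 -> rising costs push this toward $1B
```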
 
Thanks for the link. For the Consumer Products Business (i.e. Tegra, Icera, licensing fees, etc.), it appears that the quarterly operating loss in Q3 FY 2013 is significantly smaller than in Q2 and Q1 FY 2013. This makes sense due in part to the additional revenue stream from Nexus 7, Surface RT, etc. The CPB should be close to profitability once revenue from the Icera 4G LTE baseband processor and Tegra 4 is realized, which is still some months away.
 

It's a lot more likely to be because Q3 is generally a much better quarter than Q1 and Q2 for most tech businesses. From the rumours, Tegra 4 is no more likely to improve Nvidia's fortunes than Tegras 1-3 did, so I wouldn't be betting on their CPB making any money for a while yet.
 
Well, rumors should always be taken with a grain of salt, especially for a product that just recently started sampling! For instance, there was one rumor that Tegra 4 would be inside Nexus 7.7, and then there was a rumor that said exactly the opposite :D. Keep in mind that there is currently zero revenue associated with Tegra 4 and Icera 4G LTE modems, even though costs have certainly been incurred. When Tegra 4 revenue does start to be realized, Tegra 3 revenue will still be realized too. A relatively new business such as Tegra will tend to grow in the beginning, not contract. Anyway, apologies for being off-topic, but the basic design of Tegra and Geforce will be more closely integrated in the future (even if revenues are reported separately).
 

I think Maxwell is when NVIDIA is supposed to start sharing the same graphics architecture from top to bottom, Tesla/Quadro/GeForce to Tegra.

As for the business side of things, Tegra 4 may start to generate revenue, but Tegra 3's revenue stream is likely to diminish about the same time, as "old" devices are replaced and eventually phased out. NVIDIA will release quarterly earnings on the 14th, so we'll have a new data point.
 
Well, Maxwell is the one where they're rumoured to bring ARM cores to their GPUs, so that would make sense, but I see little those ARM cores could do anywhere other than Tegra, since the Wintel world is still so bound to x86; even if they had full "HSA features" it would still be incompatible.
 
I should mention that the cost of developing new chips is likely to rise with time, and thus so is the price point at which it makes sense to develop an independent line of chips.
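A toy illustration of that point (the NRE figures and gross margin below are illustrative assumptions, not NVIDIA numbers): the revenue needed to justify an independent chip line scales directly with its development cost.

```python
# Revenue at which gross profit just covers the development (NRE) cost.
def revenue_needed(nre_usd, gross_margin=0.5):
    return nre_usd / gross_margin

for nre in (100e6, 200e6, 400e6):   # development cost rising per generation
    print(nre / 1e6, "->", revenue_needed(nre) / 1e6)   # required revenue rises in step
```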
 
GTX 650s are down to $100 now? Sweet, that's just the right price for them.

Anyway, if these things are more energy efficient than standard 650s, I don't really see the issue. They'll fill a hole in the marketplace.
 
The Asus has a 4-watt lower TDP than the usual one, so that's not much. But at least there's no 6-pin-to-Molex adapter to deal with.

You call that garbage... LOL
Isn't that around GTX 285 performance (roughly), and not too far off the Xbox 720 in theory? Please keep your smug elitism to yourself. I would very much like one of these to play at 1280-by-something, with anti-aliasing, in most games. It's pretty much the card to get to play Steam Linux games, and it's 2x/3x faster than the real "garbage" cards (GT 640, GT 430, 6670 DDR3, A10-5800K).

Lower wattage also means a lower power bill, a silent card, a PSU that stays quiet (a rock-solid old 350 W one), and I can get away with having no functioning case fan.
I almost bought a used 9800 GTX+ but would have had to buy Molex adapters, fuck around transplanting the PC into another case and wiring fans, and pay an even bigger bill. For now I run my older 7600 GT, which is low-watt by today's standards (I feel like upgrading to e.g. a GT 630 would not be very worthwhile; it's much better but has the same memory bandwidth). A 140+ watt card? Do not want.
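For the record, the power-bill side of that is easy to ballpark; the hours of use and electricity price below are my own assumptions, not anything from the card specs:

```python
# Extra yearly cost of a higher-wattage card, under assumed usage and pricing.
def yearly_cost_usd(extra_watts, hours_per_day=4, usd_per_kwh=0.15):
    extra_kwh = extra_watts * hours_per_day * 365 / 1000.0
    return extra_kwh * usd_per_kwh

# ~140 W card vs a ~65 W GTX 650-class card: roughly 75 W more under load.
print(yearly_cost_usd(75))   # ~16.4 -> on the order of $15-20 per year
```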
 
You could have gotten that card a year ago, except it would have been called the 7750.

AMD + Linux = bad times usually (though not always). Still, you are correct. The GTX 650 was way overpriced when it came out, and I honestly regret buying one instead of, say, a 7750 or 7770, simply due to the price I paid.
 