NVIDIA Kepler speculation thread

PCs aren't dying; they're just getting smaller, which is a natural evolution of many technologies. Tablets are still PCs, just handheld ones. No one ever said a PC has to run x86 or Windows (although increasingly more tablets will). And certainly no one ever said a PC has to be upgradeable (see laptops).
 

Granted, same as desktops were just evolutions of mainframes.

The point in this context is how the power shift is happening from desktop to tablet. Nvidia, Intel and AMD all have large presences in the market that appears to be losing out.

SoCs are the future, not discrete GPUs.
 

Nvidia, AMD and Intel have presences in the tablet market too. Not as large as in the desktop market, but it's not as though they've always been the only players on the desktop either; PowerVR, for example. I do expect all three companies to start gaining significant market share in tablets over the next few years, though. They all have excellent roadmaps for the job.

I also think discrete has a few more years left in it yet. I think discrete GPUs will be around at least as long as this console generation, although we might be 100% SoC by the next generation, if there ever is one. Enthusiast APUs are an interesting prospect, though; I think within a couple of years there'll be APUs comparable to Orbis in the PC space.
 

I doubt there will ever be anything comparable to an enthusiast-class GPU in an APU. I have a hard time envisioning a 500+ mm^2 CPU (excuse me, APU) consuming over 300W of power.

Of course, the argument then becomes whether there will be enough demand for enthusiast-level GPUs to warrant their existence. I'd guess the answer will probably be yes. Just like people bought Obsidian graphics cards with 2-4 3dfx Voodoo Graphics chips on them for obscene amounts of money back in the '90s to play GLQuake, there will always be someone willing to pay exorbitant amounts of money for the best experience their budget can afford. Yes, those were cards built for professional workstations, but how is that much different from the upcoming GeForce Titan, a GPU purpose-built for professional workstations that is being offered to the general public at extremely high prices?

Regards,
SB
 
Of course the discrete GPU will always be with us, but it's the relevancy that matters.

It all boils down to how much money can be got out versus how much is put in. For the past 5 years, neither company has been getting much out of discrete for what they put in. Nvidia is doing better due to having 10x the professional market that AMD has, but at the consumer level it's been break-even for both companies for a long time. Citing Nvidia's good Q3 2012 does not change the fact that they've made very little profit out of it in the previous 5 years.

Wafers get more expensive with every new node, and R&D costs go up too. Then there is the fact that these same GPU wafers are in direct competition with products within the same company.

Intel has made it clear that Atom takes precedence at 14nm; isn't that a clue to where the entire industry is going? Imagine Nvidia trying to compete with Tegra 4 against a 14nm Atom; how do you think that is going to work out for them? And all because they wanted to continue a GPU war with AMD that neither company can win?

The only logical way forward is to give precedence to Tegra, and AMD will do the same with Temash at 20nm. It's not the outright *end* of the GPU, but it's the start of GPUs taking a back seat for both companies. AMD has already made that decision, of that I am 100% certain.
 
AMD gave preference to CPUs the moment ATI was bought, but that doesn't mean discrete GPUs aren't valuable. Nvidia and ATI/AMD have had successful GPUs on old process technology too. As long as APUs and GPUs leverage the same technology, and there are gamers and professionals wanting good graphics, discrete GPUs will be sold.
 
I agree with you, but it's more about what level of existence we're going to be left with.

Up till now, both companies have had discrete GPUs at the forefront of their roadmaps. It hasn't really worked for either of them, because it's an expensive business with competition that is simply too harsh.

I cannot fathom any reason for why either company would continue with it. Nvidia stands to lose their entire SoC market - something they spent a lot of time and money creating - just to maintain their discrete GPU advantage in a decreasing market. Why would they do that?

AMD has literally nothing to lose in discrete GPU that they haven't already lost.

Both companies can continue to develop GPU (and indeed must), but the emphasis on discrete GPU production at the start of every new node has to change if both are to be relevant in the SoC market.
 
This is the problem. Nowhere did I say Nvidia wasn't doing pretty well; however, their last numbers in Q3 (the point of contention) were a one-off because of AMD's choice to cut back on production.

It seems hard for you to grasp that AMD cut production because their products weren't selling, not in order to stop selling. Cutting production didn't affect sales; it just lowered inventory from the supply side. Their products were still available.

Citing Nvidia's good Q3 2012 does not change the fact that they've made very little profit out of it in the previous 5 years.

They have 7, and it will be 8 (they report next week), profitable quarters in a row for the GPU segment. The trend line across those quarters is up. Constantly bringing up the bloody price-war years serves no useful purpose. Separating the professional line from the GPU segment serves no purpose either, as those businesses bring synergy to each other.

I keep close track of prices, and I've noticed a lot of cards I had been using have been out of stock, or the price has either stopped falling or started increasing.

No offense, but this sounds like weak evidence. The prices and availability seem quite stable to me.
 
My opinion is that discrete GPUs will always matter, for a single reason: consoles.

Consoles will drive graphics sophistication forward, requiring powerful discrete GPUs to match and exceed them so the PC keeps its edge (which GPU makers will make sure happens, to keep their business alive).

Not to mention that consoles will depend on the advancements made in the discrete GPU sector to build their graphics subsystems, just like what happened in this console cycle.
 
They have 7, and it will be 8 (they report next week), profitable quarters in a row for the GPU segment.

The evidence of their GPU segment being "profitable" is where?

The trend line across those quarters is up. Constantly bringing up the bloody price-war years serves no useful purpose. Separating the professional line from the GPU segment serves no purpose either, as those businesses bring synergy to each other.

You're changing your argument.
 
Did AMD drop GPU production as well?

I don't know. If there was no buyout clause like at GlobalFoundries then they may have, or they may simply have not taken extra wafers when given the opportunity.

After a huge scaling back of CPU production, it would have been prudent to do so with GPUs as well, even if only for the Trinity discrete attach rate.

If they didn't, I would expect to see 6670-and-below prices cratering or some special offers on those, but so far there is nothing and it's all been about the "Never Settle" high-end bundles.

Edit: having looked at Newegg, there is a very large selection of 6670s on rebate offer, so it's quite possible that AMD does have excess stock of these now.
 
My opinion is that discrete GPUs will always matter, for a single reason: consoles.

Consoles will drive graphics sophistication forward, requiring powerful discrete GPUs to match and exceed them so the PC keeps its edge (which GPU makers will make sure happens, to keep their business alive).

Not to mention that consoles will depend on the advancements made in the discrete GPU sector to build their graphics subsystems, just like what happened in this console cycle.

That argument doesn't hold terribly well in the face of rumors (assuming they are correct, of course) that both the next Xbox and PlayStation feature APUs.
 
This. Nvidia's GPU business is highly profitable: $193M in profit on $739M in revenue. I don't know how jimbo75 could have missed that.

Not only did Nvidia make $193 million in profit in the consumer GPU segment, but that profit is almost twice the profit from the professional GPU segment, which came in at $101 million.

So much for jimbo75's belief that Nvidia makes peanuts in the consumer GPU segment and only makes $$$ in the professional segment.
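For what it's worth, the arithmetic behind those claims checks out; a quick sketch using only the figures quoted in the post above:

```python
# Figures as quoted in the post (Nvidia's own segment breakdown,
# all in millions of dollars); this just sanity-checks the claims.
consumer_profit = 193  # consumer GPU segment profit
pro_profit = 101       # professional GPU segment profit
revenue = 739          # GPU segment revenue

margin = consumer_profit / revenue   # operating margin on revenue
ratio = consumer_profit / pro_profit # consumer vs. professional profit

print(f"margin: {margin:.1%}")       # prints "margin: 26.1%"
print(f"ratio:  {ratio:.2f}x")       # prints "ratio:  1.91x", i.e. "almost twice"
```

So "highly profitable" here means a roughly 26% margin, and the consumer segment's profit really is close to double the professional segment's.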
 
OK, I missed that. However, how long have they been disclosing these numbers? I'm sure they weren't up till now, or at least it's a recent development.
 
My opinion is that discrete GPUs will always matter, for a single reason: consoles.

Consoles will drive graphics sophistication forward, requiring powerful discrete GPUs to match and exceed them so the PC keeps its edge (which GPU makers will make sure happens, to keep their business alive).

Not to mention that consoles will depend on the advancements made in the discrete GPU sector to build their graphics subsystems, just like what happened in this console cycle.

Well, I'm not sure this console generation is a good example, but in reality, even if they are APU-based (as some rumors suggest), consoles are different: they are "based on" an APU, not a simple APU.
On PC you'll need a bit more power, and so effectively a discrete GPU, to match them, because the work they have to do is simpler (no need to run a full OS; they run with specific APIs and optimizations).

I agree with you: discrete cards will still exist, and not only in the professional space. Compute is taking on more and more importance, even in games, and let's be honest, it will be really hard to hit that performance level with just an APU.

Replace our CPU + discrete GPU with an SoC for gaming? Try the new 3DMark, look at the test that will be ported to tablets and smartphones, and compare it with the last benchmark of the suite, Fire Strike, to get an idea.

Even if it's not fully representative, you can see where the problem lies. Why would people go backwards on power and performance?
I was comparing the performance of my old GPUs after finding some old benchmark screenshots in a backup. Wow: even over the last 5 years, discrete GPUs have gained a massive amount of performance across the board. Let's not forget the average user has gone from 19-21" to 24" monitors in recent years; that has a cost in GPU performance, and looking at the trend of the last 10 years, it will continue to increase.

It's clear that, game after game, we may not see a massive increase in quality except in certain cases. Sometimes that's due to a lack of artistry from some developers, or they just don't implement graphics features correctly when porting games to PC (a massive drop in fps for a limited visual enhancement), or sometimes the change is just subtle (shadows, lighting effects or particles are good examples: if you don't play at max settings, you won't even see a real difference until someone points it out). But play some games made 5 years ago and compare them with what is made today (like an old Hitman vs. the new one), and you will clearly see a massive difference. And the closer you look at the details, the more differences you will see.
 
That argument doesn't hold terribly well in the face of rumors (assuming they are correct, of course) that both the next Xbox and PlayStation feature APUs.
These were weak and strange rumors to begin with. Now we have stronger rumors, maybe even strong leaks, of different specs altogether: more ordinary (familiar) specs.
 
Not only did Nvidia make $193 million in profit in the consumer GPU segment, but that profit is almost twice the profit from the professional GPU segment, which came in at $101 million.

So much for jimbo75's belief that Nvidia makes peanuts in the consumer GPU segment and only makes $$$ in the professional segment.

OK, so looking through the 10-Q, they are struggling to make $150M a quarter even with the help of AMD's market share cratering. That is peanuts. What was it last year, half that amount?

The point is proven: this is not a viable business going forward if it depends on AMD being this bad in order to make a measly profit.
 