Nvidia BigK GK110 Kepler Speculation Thread

But take pricing out of the equation and it makes the 7970 GHz Edition look unimpressive on many levels. Sure, it might be only 30 percent faster, but it does this with the same power usage (which is particularly impressive), while having significantly higher compute if it is any sort of improvement over the GF100.

That's an interesting angle and one I hope reviewers explore on Thursday. Actual power consumption is to be determined but a 30% perf/w advantage would be significant. When AMD had the perf/w advantage they shied away from big dies. Nvidia has no such reservations.
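As a rough sketch of the arithmetic (hypothetical numbers, since actual power figures weren't out yet), a 30% performance gain at equal power is by definition a 30% perf/W advantage:

```python
# Hypothetical figures for illustration only, not review data.
titan_perf, titan_watts = 130.0, 250.0  # relative performance, board power (W)
ghz_perf, ghz_watts = 100.0, 250.0      # 7970 GHz Edition as the baseline

# perf/W advantage: ratio of the two performance-per-watt figures, minus one
advantage = (titan_perf / titan_watts) / (ghz_perf / ghz_watts) - 1.0
print(f"perf/W advantage: {advantage:.1%}")  # 30.0%
```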
 

It's a lot more likely that AMD had no plans for big dies in the first place, is it not? They don't have the pro segment to cover the desktop losses on a chip this size.

That said, Nvidia obviously has fixed the perf/W disadvantage that had been plaguing them for the previous 4 years or so.
 

I would think it's the consumer segment that subsidizes professional and not the other way around. Absolute revenues are far higher for the former.

AMD has made big dies in the past but they didn't do so in recent times to push their perf/w advantage.
 

Come on, if that were the case then AMD would have had a big-die GPU for a long time. There hasn't been anything near Nvidia's top SKU in area since R600; they've generally been at a 40% disadvantage since then.

Titan can exist because the pro market allows it to exist. Think about the millions of dollars that go into developing each chip, and what AMD could realistically get back from it. With 1/10th of the pro market in units (and probably much less in revenue), they'd be spending a ton of money on development that they could never recoup on sales.
 

The RV770 story written by Anand suggests that when AMD started working on RV770, they didn't really know they'd end up with such a substantial perf/mm² and perf/W advantage.

Maybe they had some clue about it when they started working on Cypress, which did end up slightly bigger (334mm² vs. 255mm²). I think the small die strategy was mostly meant to get to market quickly, minimize risks and costs at a time when they were in trouble (post-R600).
 
What I should say is that even if AMD did create a monster 550mm² chip that demolished Nvidia's best, it would still underperform in sales. Nvidia has a strong grip on the pro market much like Intel has in servers. It would take literally years to change the mindset.

Same goes for the top end desktop. That's why some people will still buy this $1K Titan, but they didn't pay $550 for the 7970. The top end belongs to Nvidia in perception and AMD cannot change that no matter what they do.
 
I believe edge cases like you are going to make this card unavailable for many.

The 6GB alone is going to enable CUDA applications for 'cheap' that weren't possible before. Add in the DP and the other new CUDA goodies and it's an irresistible value for many.

I have to agree with this. I'm tempted, but I've already had (theoretically) this much power in my pocket for a year now, even for compute, and for the same price: 8 TFLOPS SP and 2 TFLOPS DP (outside CUDA, of course).

I'm only tempted by Titan for the benchmark craziness I could get out of it in the 3DMark suite. (So I need to restrain myself, because I would lose more with it than I would gain.)

Now for sure, if I can use a Titan for professional "home" compute use, the choice is made. I would never put $4000 into my home system for a K20, but a Titan at $1000 for my home system.. oh yes.
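For comparison, peak theoretical throughput numbers like these follow from shader count and clock (2 FLOPs per core per cycle via fused multiply-add). The sketch below uses Titan's announced 2688 cores at the 837 MHz base clock, with GK110 double precision running at 1/3 of the SP rate:

```python
def peak_gflops(cores, clock_mhz, flops_per_cycle=2):
    """Theoretical peak: cores x clock x FLOPs/cycle (FMA counts as 2)."""
    return cores * clock_mhz * flops_per_cycle / 1000.0

titan_sp = peak_gflops(2688, 837)  # ~4500 GFLOPS single precision
titan_dp = titan_sp / 3            # GK110 DP runs at 1/3 the SP rate
print(f"SP: {titan_sp:.0f} GFLOPS, DP: {titan_dp:.0f} GFLOPS")
```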
 

Power usage isn't the same according to...


It's 15.7% more or 9.7% more depending on which 7970 you compare it to in that one particular game.

Close, but definitely not the same. :) If we were using that for a definition of the same, then the Radeon 6970 had the same performance as the GTX 580. :)
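Those percentages are simple ratios; here is the calculation with made-up wattages, chosen only to reproduce the quoted deltas (the chart's actual values aren't preserved in this thread):

```python
def percent_more(a, b):
    """How much more a draws than b, as a percentage."""
    return (a / b - 1.0) * 100.0

# Hypothetical wattages, picked only to illustrate the arithmetic.
titan_w, hd7970_a_w, hd7970_b_w = 231.0, 199.7, 210.6
print(f"vs one 7970:   {percent_more(titan_w, hd7970_a_w):.1f}% more")  # 15.7%
print(f"vs the other:  {percent_more(titan_w, hd7970_b_w):.1f}% more")  # 9.7%
```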

One thing that's impressive is that video playback is very low power. That's something AMD should work on, IMO.

Regards,
SB
 

In addition, AMD still quotes board power, while Nvidia's number in this case is the power usage measured in BC2...

Guru3D reports 251W for Titan. BC2 is really intensive on the whole system; I don't know why they use that one. My northbridge temps go crazy with this game.
 

Nod, I'm not making any personal judgments until actual reviews are out. But I think it's probably safe to say that Titan will consume more power; the question is just how much more.

Regards,
SB
 

Note that I said GHz Edition, and thus that chart proves my point further. In that chart a Titan consumes 15.4 percent less power than a GHz Edition while performing better.
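One thing worth keeping in mind when comparing deltas across charts: "X% less" and "X% more" aren't symmetric. A quick sketch of the conversion:

```python
def percent_less_to_more(pct_less):
    """If A draws pct_less% less than B, return how much more B draws than A."""
    frac = pct_less / 100.0
    return frac / (1.0 - frac) * 100.0

# 15.4% less for Titan means the GHz Edition draws ~18.2% more than Titan.
print(f"{percent_less_to_more(15.4):.1f}% more")
```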
 

How much does the 7970 GHz consume in this chart? 250W? Not a watt more or less? I'm a bit dubious the meter landed exactly on 250W. What is the maximum board power of the 7970 and the 7970 GHz? (AMD quotes TDP in charts based on maximum board power, not average power.) If Nvidia's GPU Boost 2.0 works well, a 250W TDP should be the right spot for the best boost and performance, no?

Personally I won't speak on power consumption, because I use a cooling unit just for my CPU that dissipates more than 1200W all day, every day. I'm not really well placed to talk about it.
 
Market share and brand recognition aren't improved overnight.
 
Titan appears to be targeted very much toward the GPGPU crowd. Full-speed DP support and dynamic parallelism especially are killer features. Think of it as a (relatively) consumer-friendly version of Tesla or Quadro, lacking just the various esoteric features that are needed for supercomputers and render farms.

Perhaps Nvidia is moving toward something like Tesla -> supercomputing, Quadro -> render farms, and Titan -> workstation?

Come to think of it, Mac Pro is due for an update. Hmm...
 
Titan has more real power draw than the 7970 GHz in games?

[power consumption chart]
 

There are more reviews than that one showing strange power numbers as well. Check out http://www.techspot.com/review/603-best-graphics-cards/page11.html

That one has the 660 Ti using more power than the 7950 and the 650 Ti using the same as the 7870. The lack of proper power testing irks me somewhat; clearly one game is not good enough. AT will probably use Metro 2033 and declare power results based on that one game alone, and it's a game where the AMD cards seem to be pushed especially hard on performance (and likely power).
 
HardOCP measured peak power numbers. Very unreliable. Also you cannot compare those measurements across different systems, because there are way too many variables.
 
Guru3D reports 800€ + VAT. That means +23% in my country, so 984 euros. LOL!

7950 CFX at 500 euros (game bundle value deducted) here I come!
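The VAT arithmetic checks out:

```python
base_eur = 800.0
vat = 0.23  # the poster's local VAT rate
total = base_eur * (1.0 + vat)
print(f"{total:.0f} euros")  # 984 euros
```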
 