NVIDIA Kepler speculation thread

*snip*

Don't ask.. I have no idea where this is coming from.

It's the pcinlife chart in a different form..

[attached image]
 
I have this great idea for a website: write a whole bunch of nonsense breaking news about company XYZ. Then a bit later, I write more breaking news that my earlier breaking news is probably too hard for company XYZ because of whatever other nonsense reason. Then even later, I write that my first breaking news won't happen after all, because, wouldn't you know it, it was indeed too hard for company XYZ. And then I write "I told you so." And nobody will ever be able to prove me wrong.

Rinse, lather, repeat.

Seriously.. I read through that whole article (unfortunately..) and still couldn't figure out what he meant to say, what he allegedly said previously, or what "analysis" he did :???:

Indeed. Hence the alleged...

However, it does look as if it is small, at least smaller than we expected.

It is a fair bit smaller than expected. As the leaked die shots imply, it is smaller than G92, and a far cry from the earlier rumours of 320-350 mm².


What's 555?

Edit:

http://www.3dcenter.org/news/nvidia-gk104-354-milliarden-transistoren-auf-nur-294mm%C2%B2-chip-flaeche

Maybe someone who knows German can translate, but I can see a 294mm² die size, 3.54 billion transistors, a 195W TDP, and something about 185W.

I don't speak German either, but the 185W seems like typical power consumption.
 
http://www.3dcenter.org/news/nvidia-gk104-354-milliarden-transistoren-auf-nur-294mm²-chip-flaeche

Maybe someone who knows German can translate, but I can see a 294mm² die size, 3.54 billion transistors, a 195W TDP, and something about 185W.

I don't speak German either, but the 185W seems like typical power consumption.

The 185W is a guess at real-world consumption.

And the 3.54B and 294mm² are from Chiphell's Napoleon: http://www.chiphell.com/forum.php?mod=viewthread&tid=382049&page=2#pid11424899
 
The narrow Turbo range could simply be a proving ground like Intel's first attempt. Otherwise it's pretty useless as is.
 
The 185W is a guess at real-world consumption.

And the 3.54B and 294mm² are from Chiphell's Napoleon: http://www.chiphell.com/forum.php?mod=viewthread&tid=382049&page=2#pid11424899

Don't know about the transistor count, but the die size is spot on with what I had heard.

If true, the point to note is that GK104 beats Tahiti in density. Pitcairn is still ~10% higher, though.
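
For anyone who wants to sanity-check that, here's a quick back-of-the-envelope density comparison. The GK104 figures are the rumoured ones from this thread; the Tahiti and Pitcairn numbers are the commonly quoted ones, added by me:

```python
# Transistor density in million transistors per mm^2.
# GK104 figures are the rumoured ones from this thread; the Tahiti and
# Pitcairn figures are the commonly quoted ones and may be slightly off.
chips = {
    "GK104":    (3540, 294),   # (million transistors, die area in mm^2)
    "Tahiti":   (4310, 365),
    "Pitcairn": (2800, 212),
}

for name, (mtr, mm2) in chips.items():
    print(f"{name:9s} {mtr / mm2:5.2f} MTr/mm^2")

# GK104     12.04 MTr/mm^2
# Tahiti    11.81 MTr/mm^2
# Pitcairn  13.21 MTr/mm^2  -> ~10% denser than GK104
```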

My forum post count, sorry I disappointed you :/

Oh haha, didn't see that :smile: Thought you were referring to something on the slide and I was wondering what I missed
 
If true, the point to note is that GK104 beats Tahiti in density. Pitcairn is still ~10% higher, though.

Just a guess, but I've heard multiple times that IO doesn't shrink well, so this may just be down to having a 20% smaller die with 50% fewer pins to RAM.
 
The choice of a 384-bit bus on Tahiti devoted a measurable amount of die area at the perimeter to the additional interface, area that contributed little to the total transistor count.
The non-IO areas may be even closer in density than the overall die numbers would suggest, assuming the figures are accurate.

edit:
The pad area around Tahiti's perimeter takes up (eyeballed measurement) almost 20% of the die.
Lopping off a third of the memory interface would reduce the transistor count of Tahiti a little, but it would make the Nvidia and AMD chips closer in terms of density.
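
Putting rough numbers on that thought experiment (the ~20% perimeter share is the eyeballed figure above; assuming half of it is memory PHY, and that the pad ring carries almost no transistors, is my own guess):

```python
# What happens to Tahiti's density if a third of its 384-bit memory
# interface (384 -> 256 bit) and the pad area that goes with it is removed?
tahiti_mtr, tahiti_mm2 = 4310, 365    # commonly quoted figures (assumed)
perimeter_mm2 = 0.20 * tahiti_mm2     # ~73 mm^2 of perimeter (eyeballed above)
mem_phy_mm2 = 0.5 * perimeter_mm2     # guess: half of that is memory PHY
saved_mm2 = mem_phy_mm2 / 3           # lop off a third of the interface

# Treat the removed area as carrying a negligible transistor count:
new_density = tahiti_mtr / (tahiti_mm2 - saved_mm2)
print(f"{tahiti_mtr / tahiti_mm2:.2f} -> {new_density:.2f} MTr/mm^2")
# 11.81 -> 12.22 MTr/mm^2, i.e. roughly level with the rumoured GK104 figure
```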
 
Just a guess, but I've heard multiple times that IO doesn't shrink well, so this may just be down to having a 20% smaller die with 50% fewer pins to RAM.

Yep.. which is why I mentioned Pitcairn for comparison as well :smile: So while AMD is still ahead of NV in density, the gap has shrunk this time around.
 
Just a guess, but I've heard multiple times that IO doesn't shrink well, so this may just be down to having a 20% smaller die with 50% fewer pins to RAM.

No. Tahiti has a 50% wider bus than Pitcairn but is over 50% larger in die area. It makes no sense at all to blame the decrease in transistor density on the larger bus. Going by that logic, Tahiti should if anything be more dense.

Tahiti also has a significantly higher cache/SP ratio, which again should help make it more dense.

But it isn't.
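
A quick check of the numbers behind that objection (die sizes are the commonly quoted ones, my addition, not from the leaks):

```python
# Tahiti's bus is 1.5x Pitcairn's, but its die is ~1.72x larger, so the
# wider bus alone cannot account for Tahiti's lower transistor density.
tahiti_bus, pitcairn_bus = 384, 256   # bus width in bits
tahiti_mm2, pitcairn_mm2 = 365, 212   # commonly quoted die areas (assumed)

print(f"bus ratio: {tahiti_bus / pitcairn_bus:.2f}x")   # 1.50x
print(f"die ratio: {tahiti_mm2 / pitcairn_mm2:.2f}x")   # 1.72x
```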

There are three far more likely possibilities:

  • ECC and DP make the die less dense
  • Tahiti is intentionally less dense to increase performance
  • Or, AMD had greater knowledge of the process and architecture when they designed Pitcairn
 

Woo, thanks. It's about time.

I'm going to speculate the GTX680 will launch at $399, and the GTX670 at $299.
I don't believe NV will launch with price/performance parity with AMD, since
1) They are 3-4 months behind AMD, so equal price-performance will seem lame, and not win any kudos...

GTX 480 says hello.


2) The GK104 looks to be small & powerful, so when NV have a winner, they usually go after market...

At 550 USD they will still likely sell just about every single card they can deliver to the channel. But their hope is that it'll be just high enough that it doesn't deplete the channel.

Each 50 USD increment lower you go, the chances of depleting channel supply greatly increase. At 299 USD, there would almost never be a card in stock. Imagine stock levels like the 5870/5850 during their first 4-5 months, but significantly worse. And 58xx was already a missed opportunity. In fact, the stock situation was so dire that AMD increased the MSRP of the cards.

All of which makes the other points irrelevant, especially if GK110 isn't due out for another 6+ months. Similar to how GTX 480 had to hold the fort until GTX 580 was ready, despite being incredibly late (compared to the 5870) and only marginally faster: it won in general but was also sometimes beaten by the 5870. Somewhat similar to the performance claims we're hearing for GTX 680.

Die size certainly matters in terms of its impact on potential margins and operating profits. But as people on these forums love to say, it's performance that determines price positioning, not die size. And that remains true even if Nvidia now has a slightly smaller die, especially if Nvidia isn't expecting a faster chip for another 6 months or so.

http://fudzilla.com/home/item/26308-nvidia-gtx-680-pixellized-in-more-detail


That sounds pretty negative considering the source. Around the same speed as the 7970 in the Nvidia-selected benchmarks and drivers (IOW: slower), and Nvidia feels the need to bang the perf/watt drum instead. (I wonder if the 190W is a TDP or just an Nvidia-TDP.)

So basically we're back to NV40 versus R420 (similar price, similar performance), G70 versus R520 (again similar price, similar performance), and G71 versus R580 (again similar price/perf).

So the more things change, the more things stay the same. :D The only thing different now is that IQ is also basically similar between the two, rather than AMD having better IQ while Nvidia had slightly better perf.

I see what he says but... how exactly would that happen? Are the guys at Apple blind, not seeing that Intel's graphics products are next to tragic in image quality, drivers, etc.? :LOL:

If the low- and mid-range video card segments are gone forever, then what future exactly does Nvidia have? Considering they lost the consoles too. :oops: And with these uber-high high-end video card prices, next to no one will buy them, excluding the professional segments...

With regards to the first part, it doesn't matter what Apple thinks if Nvidia can't supply enough chips. What would you propose they do? Cut MacBook shipments in half (or whatever number) instead? It isn't as if those lower-priced MacBooks won't sell regardless of what graphics drives them. This, of course, presumes that the rumors of Nvidia not being able to provide enough Kepler chips are true.

With regards to the second part: well yes, Nvidia is certainly hoping that HPC and professional sales continue to grow. As well, they do have their Tegra initiative for low-end and potentially midrange graphics in the future.

Windows 8 might allow Tegra valuable inroads into the budget desktop PC market Nvidia has been squeezed out of. It becomes even more important if developers bother to port AAA-budget games to Metro (and hence make them capable of running on ARM-based computers).

And as long as PC gaming continues to exist, there will always be demand for enthusiast and performance discrete graphics cards.

Regards,
SB
 
So basically we're back to NV40 versus R420 (similar price, similar performance), G70 versus R520 (again similar price, similar performance)
Then I want a model that is $200 less expensive and can be flashed to be the same as the fastest card. Otherwise your analogy just makes pandas sad.
 
Then I want a model that is $200 less expensive and can be flashed to be the same as the fastest card. Otherwise your analogy just makes pandas sad.

Well, someone already said that they flashed the supposed GTX 670 Ti with a new BIOS and performance went up by 40%, so fingers crossed! :rolleyes:
 
There are three far more likely possibilities:

  • ECC and DP make the die less dense
  • Tahiti is intentionally less dense to increase performance
  • Or, AMD had greater knowledge of the process and architecture when they designed Pitcairn

What about cookie monsters? ;-)

God how I hate graphs with no zero. Utter horseshit through and through.
The GTX680 has 3x the bar length in BF3!!! WOW!!!

Of course, I'm calling the data baloney too.

At least the longest bar is still inside the chart. ;)
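
A toy illustration of how the axis trick works, with made-up scores rather than the leaked ones:

```python
# With a y-axis that starts at 0.925 instead of 0, a real 1.15x gap
# shows up as a 3x difference in bar length. Scores are invented.
baseline, contender = 1.00, 1.15   # hypothetical relative scores
axis_floor = 0.925                 # where the truncated axis starts

real = contender / baseline
apparent = (contender - axis_floor) / (baseline - axis_floor)
print(f"real gap: {real:.2f}x, bar-length gap: {apparent:.2f}x")
# real gap: 1.15x, bar-length gap: 3.00x
```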
 
God how I hate graphs with no zero. Utter horseshit through and through.
Only until you actually look at the values on the graph. Sure, it'll fool Joe Average, but anyone with half a brain shouldn't really be all that bothered by it.
 
Then I want a model that is $200 less expensive and can be flashed to be the same as the fastest card. Otherwise your analogy just makes pandas sad.

Eh? NV40 launched at 500 USD. G70 launched at 600 USD. Street price on G70 at launch was 650 USD+ due to retailer price gouging.

Where are you getting the 200 USD less? Oh, you mean the 6800 (non-Ultra)? Then blame Nvidia for not having something similar. Heck, they aren't even planning on launching the GTX 670 for a few months. Likely they're trying to get enough GTX 680 chips to satisfy demand at 550 USD, or the GTX 670 is close enough to the GTX 680 that they don't want to cannibalize sales in the first month or so.

Regards,
SB
 