Nvidia BigK GK110 Kepler Speculation Thread

Yeah, that whole presentation was pretty boring and JHH was not his usual charismatic self. I was also disappointed by the lack of info on Tegra 4's power consumption and GPU performance relative to the competition.

Well, that's a pretty straight answer in itself.

If they had groundbreaking 3D performance they would have showcased that instead of a useless webpage rendering test, which relies heavily on software.

AnandTech showed how a dual-core Cortex-A15 eats up to 8 W (Exynos 5250) if it isn't restricted to a lower TDP.

Bog standard Cortex-A15 just isn't well-suited for mobile devices, hence the reason for Qualcomm's Krait and Apple's Swift.
 
Yeah, that whole presentation was pretty boring and JHH was not his usual charismatic self. I was also disappointed by the lack of info on Tegra 4's power consumption and GPU performance relative to the competition.

Tegra 4 consumes up to 45% less power than its predecessor, Tegra 3, in common use cases. And it enables up to 14 hours of HD video playback on phones.
http://www.xbitlabs.com/news/mobile...on_Processor_for_Tablets_and_Smartphones.html
 
Bog standard Cortex-A15 just isn't well-suited for mobile devices, hence the reason for Qualcomm's Krait and Apple's Swift.
Qualcomm would have done their own core regardless of how "well suited" Cortex-A15 is, just as they've done for a while with Snapdragons.
 
[offtopic]

Well, that's a pretty straight answer in itself.

If they had groundbreaking 3D performance they would have showcased that instead of a useless webpage rendering test, which relies heavily on software.

AnandTech showed how a dual-core Cortex-A15 eats up to 8 W (Exynos 5250) if it isn't restricted to a lower TDP.

Where do you see 8W in that article?

Bog standard Cortex-A15 just isn't well-suited for mobile devices, hence the reason for Qualcomm's Krait and Apple's Swift.

Any basis for this claim?

[/offtopic]
 
Where do you see 8W in that article?

Go to the page where they discuss the TDP of Exynos 5 Dual.

The test itself is unlikely to ever happen in the real world, but it shows that the CPU is throttled anytime the GPU is ramped up in order to maintain a ~4W TDP.

BTW - that doesn't mean that in all apps the CPU is always throttled in favor of the GPU. It's just a byproduct of the test. One application is pushing the GPU while another is pushing the CPU. In the graph the CPU just happens to be the background task.

What's important is that it shows that it is possible to hit 8W with the Exynos 5 Dual, but that throttling kicks in fairly quickly to keep it at ~4W.

Regards,
SB
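
To illustrate the budget-sharing behaviour described above, here is a minimal sketch of a power-capped governor. The ~4W cap and ~8W unconstrained peak come from the AnandTech discussion; the per-state wattages and the policy of stepping the CPU down first are made-up numbers for illustration, not Samsung's actual DVFS logic.

[code]
# Toy model of a shared SoC power budget: when CPU + GPU demand exceeds the
# cap, the CPU is stepped down first, mirroring what the Exynos 5 Dual test
# showed. All per-state wattages below are illustrative, not measured.

POWER_CAP_W = 4.0  # approximate sustained SoC budget

CPU_STATES_W = [3.9, 2.5, 1.5, 0.8]  # hypothetical dual Cortex-A15 DVFS states
GPU_STATES_W = [4.0, 2.8, 1.6, 0.9]  # hypothetical GPU DVFS states

def throttle(cpu_idx, gpu_idx):
    """Step the CPU down (then the GPU, if needed) until the cap is met."""
    while CPU_STATES_W[cpu_idx] + GPU_STATES_W[gpu_idx] > POWER_CAP_W:
        if cpu_idx < len(CPU_STATES_W) - 1:
            cpu_idx += 1   # CPU loses the fight first
        elif gpu_idx < len(GPU_STATES_W) - 1:
            gpu_idx += 1
        else:
            break          # both already at their lowest state
    return cpu_idx, gpu_idx

# Both blocks ask for their top state (~7.9W combined, the "8W" case); the CPU
# ends up at its lowest state so the pair fits under the ~4W cap.
print(throttle(0, 0))  # -> (3, 1)
[/code]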
 
It's true, can't say more.

Multiple independent sources tell SweClockers that the GK110 will appear in the GeForce Titan, an upcoming high-end graphics card.

According to SweClockers' sources, the launch of the GeForce Titan will resemble that of the GeForce GTX 690: partner manufacturers must follow Nvidia's reference design to the letter and cannot even put their own stickers on the cards. Performance is estimated at about 85 percent of a GeForce GTX 690.

The same sources claim that the GeForce Titan will be released in late February with a suggested retail price of 899 USD.
http://www.sweclockers.com/nyhet/16402-nvidia-gor-geforce-titan-med-kepler-gk110
 

Hmm... GeForce Titan sounds sweet. But:

Even though the GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores, the top GeForce Titan variant is said to use no more than 14, amounting to 2,688. This is probably to help maximize yields of the GK110 silicon. The chip features a 384-bit wide GDDR5 memory interface; 6 GB will be the standard memory amount. The target TDP of the SKU is set around 235W.

is not very nice. Only 235 W?! :rolleyes:
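
For what it's worth, the core counts in that quote fall straight out of Kepler's 192 CUDA cores per SMX; a quick check (the clock is purely a placeholder of mine, since no Titan clocks have been announced):

[code]
# GK110 core-count arithmetic: SMX count x 192 cores per SMX (Kepler).
CORES_PER_SMX = 192

full_die  = 15 * CORES_PER_SMX  # 2880 cores physically on the die
titan_sku = 14 * CORES_PER_SMX  # 2688 cores in the rumoured Titan config

# FP32 estimate at an assumed clock (placeholder between K20X's 732 MHz and
# desktop Kepler clocks; the real figure is unknown).
assumed_clock_ghz = 0.85
fp32_tflops = titan_sku * 2 * assumed_clock_ghz / 1000  # 2 FLOPs/core/cycle (FMA)

print(full_die, titan_sku, round(fp32_tflops, 2))  # 2880 2688 4.57
[/code]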
 
I think they confused it with K20X.
Nvidia must have gotten at least some 15 SMX dies. Seeing the price and name, I would guess they use a full GK110.
 
No worries here - GTX480 was a top selection with a single SM disabled. GK110 is no exception here, with its big die and billions of transistors. You can't have it all in one shot. ;)
 
No worries here - GTX480 was a top selection with a single SM disabled. GK110 is no exception here, with its big die and billions of transistors. You can't have it all in one shot. ;)
Well, the lower-tier Fermi parts had units disabled as well.
 
BTW 6GB is really stupid unless for a pissing contest (or wanting a single SKU)
3GB is already an effective +50% over the GTX 690.

But with 6GB, this would handle big-dataset CUDA work, or last if you want to game on it for 5 years. :)
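
For context, the "+50%" works out like this (the 2 GB-per-GPU figure is the GTX 690's actual configuration; treating it as a 2GB effective pool assumes the usual AFR mirroring of the framebuffer):

[code]
# GTX 690: 2GB per GPU, mirrored in SLI/AFR, so effectively a 2GB pool.
gtx690_effective_gb = 2

print(3 / gtx690_effective_gb - 1)  # 0.5 -> 3GB is the "+50%" above
print(6 / gtx690_effective_gb - 1)  # 2.0 -> 6GB would be +200%
[/code]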
 
BTW 6GB is really stupid unless for a pissing contest (or wanting a single SKU)
3GB is already an effective +50% over the GTX 690.

But with 6GB, this would handle big-dataset CUDA work, or last if you want to game on it for 5 years. :)

Many 680s and 670s have 4GB; this has to have at least as much, and on a 384-bit bus that means 6GB. 3GB would sound low for this.
 
Many may, but most do not. 6GB is just silly.

It is a bit silly, but 3GB would be too low for this. This needs to have more. The price premium to go from 2 to 4GB is quite low on the 600-series, and this needs to sit above them. With the suggested price point it's a no-brainer to go 6GB imo. The 690 with 2GB was a bad choice, other than making people upgrade again.
 
Many may, but most do not. 6GB is just silly.

Unless someone decides to release a game that utilises such an amount of memory, it's a waste, and it will most probably prove to be one, given that within the lifecycle of such a product you will not see any games that use it.

As a comparison point, a list of TDPs:
GeForce GTX 680: 195W
GTX 580: 244W
GTX 480: 250W
GTX 560 Ti: 170W
So much performance left on the table... Imagine what a 270 W GTX 680 will be capable of... :D :oops:
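
To put rough numbers on the headroom being hinted at (a naive ratio of board TDPs only, not a performance prediction, since performance does not scale linearly with power; the 235W figure is the rumoured Titan TDP from earlier in the thread):

[code]
# Power-budget ratios relative to the 195W GTX 680. TDPs from the list above,
# plus the rumoured 235W Titan and the hypothetical 270W GTX 680.
GTX_680_TDP_W = 195

for name, tdp_w in [("rumoured Titan", 235), ("GTX 580", 244),
                    ("GTX 480", 250), ("270W GTX 680", 270)]:
    print(f"{name}: {tdp_w / GTX_680_TDP_W:.2f}x the GTX 680 power budget")
# rumoured Titan: 1.21x, GTX 580: 1.25x, GTX 480: 1.28x, 270W GTX 680: 1.38x
[/code]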
 
As a comparison point, a list of TDPs:
GeForce GTX 680: 195W
GTX 580: 244W
GTX 480: 250W
GTX 560 Ti: 170W

But are all of these determined the same way? I mean, at least the 580 & 480 had no issues going past their TDP in gaming, of all things, while in most cases video cards tend to have a healthy margin below their TDP while gaming.
 