Nvidia BigK GK110 Kepler Speculation Thread

Discussion in 'Architecture and Products' started by A1xLLcqAgt0qc2RyMz0y, Apr 21, 2012.

  1. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    Well, that's a pretty straight answer in itself.

    If they had groundbreaking 3D performance they would have showcased that instead of a useless webpage-rendering test, which relies heavily on software.

    AnandTech showed that a dual-core Cortex-A15 (Exynos 5250) can draw up to 8 W if it isn't restricted to a lower TDP.

    A bog-standard Cortex-A15 just isn't well suited to mobile devices, hence Qualcomm's Krait and Apple's Swift.
     
  2. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
    http://www.xbitlabs.com/news/mobile...on_Processor_for_Tablets_and_Smartphones.html
     
  3. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,464
    Location:
    Finland
    Qualcomm would have done their own core regardless of how "well suited" Cortex-A15 is, just like they've done for a while with Snapdragons.
     
  4. hkultala

    Regular

    Joined:
    May 22, 2002
    Messages:
    296
    Likes Received:
    38
    Location:
    Herwood, Tampere, Finland
    [offtopic]

    Where do you see 8W in that article?

    Any basis for this claim?

    [/offtopic]
     
  5. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    Go to the page where they discuss the TDP of Exynos 5 Dual.

    The test itself is unlikely to ever happen in the real world, but it shows that the CPU is throttled any time the GPU is ramped up, in order to maintain a ~4 W TDP.

    BTW - that doesn't mean the CPU is always throttled in favor of the GPU in all apps. It's just a byproduct of the test: one application is pushing the GPU while another is pushing the CPU, and in the graph the CPU just happens to be the background task.

    What's important is that it shows it is possible to hit 8 W with the Exynos 5 Dual, but that throttling kicks in fairly quickly to keep it at ~4 W.
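    The kind of power-budget arbitration being described could be sketched roughly as follows. This is a minimal illustration only; the ~4 W sustained budget comes from the discussion above, while the per-block maximum draws are made-up numbers, not measured values:

```python
# Hypothetical sketch of a fixed-power-budget governor: when the GPU
# ramps up, the CPU allowance shrinks so combined draw stays within
# the sustained package budget. All wattages except the ~4 W budget
# are assumptions for illustration.

TDP_W = 4.0       # sustained package budget (~4 W, per the test above)
CPU_MAX_W = 3.0   # assumed unthrottled dual-core CPU draw
GPU_MAX_W = 3.5   # assumed unthrottled GPU draw

def throttle_cpu(gpu_load: float) -> float:
    """Return the CPU power allowance (W) for a GPU load in [0, 1]."""
    gpu_w = GPU_MAX_W * gpu_load
    # The CPU gets whatever budget the GPU leaves over,
    # clamped between zero and its own maximum draw.
    return max(0.0, min(CPU_MAX_W, TDP_W - gpu_w))

if __name__ == "__main__":
    for load in (0.0, 0.5, 1.0):
        print(f"GPU load {load:.0%}: CPU budget {throttle_cpu(load):.2f} W")
```

    With the GPU idle the CPU can run unthrottled; at full GPU load the CPU is squeezed well below its maximum, which matches the behavior shown in the graph.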

    Regards,
    SB
     
  6. xDxD

    Regular

    Joined:
    Jun 7, 2010
    Messages:
    412
    Likes Received:
    1
  7. DSC

    DSC
    Banned

    Joined:
    Jul 12, 2003
    Messages:
    689
    Likes Received:
    3
  8. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    #488 lanek, Jan 21, 2013
    Last edited by a moderator: Jan 21, 2013
  9. boxleitnerb

    Regular

    Joined:
    Aug 27, 2004
    Messages:
    407
    Likes Received:
    0
    It's true, can't say more.

    Multiple independent sources tell SweClockers that GK110 will appear in the GeForce Titan, an upcoming high-end graphics card.

    According to SweClockers' sources, the GeForce Titan launch will resemble that of the GeForce GTX 690: partner manufacturers must follow Nvidia's reference design to the letter and cannot even put their own stickers on the cards. Performance is estimated at about 85 percent of a GeForce GTX 690.

    The same sources claim the GeForce Titan will be released in late February at a suggested retail price of 899 USD.
    http://www.sweclockers.com/nyhet/16402-nvidia-gor-geforce-titan-med-kepler-gk110
     
  10. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    Hmm... GeForce Titan sounds sweet. But the quoted TDP is not very nice. Only 235 W?! :roll:
     
  11. boxleitnerb

    Regular

    Joined:
    Aug 27, 2004
    Messages:
    407
    Likes Received:
    0
    I think they confused it with the K20X.
    Nvidia must have gotten at least some 15-SMX dies by now. Given the price and the name, I would guess they use a full GK110.
     
  12. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    No worries here - the GTX 480 was the top part and it still shipped with a single SM disabled. GK110 is no exception, with its big die and billions of transistors. You can't have it all in one shot. ;)
     
  13. Homeles

    Newcomer

    Joined:
    May 25, 2012
    Messages:
    234
    Likes Received:
    0
    Well, the lower-tier Fermi parts had units disabled as well.
     
  14. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    As a comparison point, a list of TDPs:
    GeForce GTX 680: 195 W
    GTX 580: 244 W
    GTX 480: 250 W
    GTX 560 Ti: 170 W
     
  15. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    BTW, 6GB is really stupid unless it's for a pissing contest (or for wanting a single SKU).
    3GB is already an effective +50% over the GTX 690.

    But with 6GB, this could run big CUDA datasets, or you could game on it for 5 years. :)
     
  16. DuckThor Evil

    Legend

    Joined:
    Jul 9, 2004
    Messages:
    5,995
    Likes Received:
    1,062
    Location:
    Finland
    Many 680s and 670s have 4GB; this has to have at least as much, and that means 6GB. 3GB would sound low for this.
     
  17. Homeles

    Newcomer

    Joined:
    May 25, 2012
    Messages:
    234
    Likes Received:
    0
    Many may, but most do not. 6GB is just silly.
     
  18. DuckThor Evil

    Legend

    Joined:
    Jul 9, 2004
    Messages:
    5,995
    Likes Received:
    1,062
    Location:
    Finland
    It is a bit silly, but 3GB would be too low for this; it needs to have more. The price premium to go from 2GB to 4GB is quite low on the 600-series, and this card needs to sit on top of them. At the suggested price point it's a no-brainer to go 6GB, imo. The 690 shipping with 2GB was a bad choice, other than making people upgrade again.
     
  19. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    Unless someone decides to release a game that utilises such an amount of memory. Otherwise it's a waste, and it will most probably prove to be one, given that within the lifecycle of such a product you won't see any games use it.

    So much performance left on the table... Imagine what a 270 W GTX 680 would be capable of... :grin: :shock:
     
  20. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,464
    Location:
    Finland
    But are all of these determined the same way? I mean, the 580 and 480 at least had no issues going past their TDP in gaming, of all things, while in most cases video cards tend to have a healthy margin below their TDP while gaming.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.