NVIDIA Kepler speculation thread

Discussion in 'Architecture and Products' started by Kaotik, Sep 21, 2010.

  1. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    960
    Likes Received:
    853
    http://www.overclock.net/t/1208362/q-and-a-session-with-nvidia-ceo/0_20#post_16343030
     
  2. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    so... end of April.
     
  3. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
Hmm, 28% faster sounds really bad. Granted, since we know nothing about which chip it was, whether it was running (near) final clocks, or even what the CUDA code was actually doing, it doesn't really tell us anything, but you'd think NVIDIA would show something with more of an improvement...
     
  4. DuckThor Evil

    Legend

    Joined:
    Jul 9, 2004
    Messages:
    5,995
    Likes Received:
    1,062
    Location:
    Finland
    If it's their midrange chip 28% sounds pretty good to me.
     
  5. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
GK106: ~768 SPs @ ~1.3 GHz? :lol:
     
  6. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    I get paid monthly. ;)
     
  7. DarthShader

    Regular

    Joined:
    Jul 18, 2010
    Messages:
    350
    Likes Received:
    0
    Location:
    Land of Mu
    But it was a "CUDA demo". Why not a game? Or at least Vantage?
     
  8. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    Yes, but I think if they were going to suggest nothing sooner than July, they'd have just kept their mouths shut.
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    But that's quite unlikely, IMO

    edit:
    To clarify - this is assuming CUDA performance translates more or less directly to gaming performance
     
    #1329 Kaotik, Feb 2, 2012
    Last edited by a moderator: Feb 2, 2012
  10. Malo

    Malo Yak Mechanicum
    Legend Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,929
    Likes Received:
    5,529
    Location:
    Pennsylvania
I guess they aren't planning on getting very competitive with pricing if I need to save my next 6 paychecks? :D
     
  11. iMacmatician

    Regular

    Joined:
    Jul 24, 2010
    Messages:
    797
    Likes Received:
    223
From Lenzfire; I'm not sure how reliable these alleged specs are (I have some doubts).

    Quick summary:
    • Performance: GTX 680 (GK110) @ HD 7970 + ~45%, GTX 660 (GK104) @ ~GTX 580, GTX 650 (GK106) @ ~GTX 560
    • 2:1 hot clock present
     
  12. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    GK110 performance is clearly way too high if GK104 is @ ~GTX580 level
     
  13. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    960
    Likes Received:
    853
I call it BS. It looks like a mixture of Chiphell, OBR and a bit of Charlie, and they filled in the empty parts with things like a 224-bit GTX 650 Ti :D How on earth does GK104 have more transistors than GF110? It's 256-bit with 32 ROPs; OK, there's the extra PCIe 3.0, but it still seems too much...
     
  14. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
I get a delta of 90% between GK104/GK110. For GF104/GF100 it was roughly 50%, and for GF114/GF110 it was even less, at 30%.

I don't know why GK104 would be only roughly GTX 580 level when its projected core and shader clocks from Lenzfire are both exactly 16.5% higher. And they still list hot clocks, so not accurate. What Atlantis said above.
     
  15. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
It's clear that it's bullshit; however, it does in fact state 3.4B transistors for GK104 and 6.4B for GK110.
     
  16. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
Lenzfire's 660 Ti has almost 50% of the chip disabled? I can't imagine that would be good for margins.
     
  17. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
    Since when is 25% = 50%?
     
  18. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    Are those correct?
     
  19. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
My bad, I was probably looking at the 660. I wonder: would you disable 25% of a 550mm² chip to hit performance 10% above your own 290mm² chip? That part makes no sense for a number of reasons (or the surrounding ones don't).
     
  20. NathansFortune

    Regular

    Joined:
    Mar 3, 2009
    Messages:
    559
    Likes Received:
    0
Looks stupid. Last time, GF104 was 384 shaders vs. 512; it would make more sense for NVIDIA to follow the same path and release GK110 with 1024 shaders, GK104 with 768, and GK106 coming in at 512. Going straight from 1024 to 512 in a single jump would leave NVIDIA nursing losses on the 660 Ti as they have outlined it. I don't see it happening.
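(The scaling argument in the post above is simple arithmetic; a quick sketch, using the rumored Kepler shader counts and the Fermi counts quoted in the thread, none of which are confirmed specs:)

```python
def step_down(full, cut):
    """Fraction of shaders disabled going from the full chip to the cut-down part."""
    return (full - cut) / full

# Fermi precedent quoted in the post: GF100 (512 SPs) -> GF104 (384 SPs)
fermi = step_down(512, 384)            # a 25% step

# Suggested Kepler path: GK110 (1024) -> GK104 (768) -> GK106 (512)
kepler_gradual = step_down(1024, 768)  # also a 25% step, mirroring Fermi
kepler_jump = step_down(1024, 512)     # the 50% single jump the post argues against

print(fermi, kepler_gradual, kepler_jump)
```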
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.