NVIDIA Kepler speculation thread

Discussion in 'Architecture and Products' started by Kaotik, Sep 21, 2010.

  1. Mize

    Mize 3dfx Fan
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,079
    Likes Received:
    1,149
    Location:
    Cincinnati, Ohio USA
    I have been much more impressed with SLI drivers than with CFX, but I'd rather wait for a single card that can match 2x 580 or 2x 6970.
     
  2. Sinistar

    Sinistar I LIVE
    Regular Subscriber

    Joined:
    Aug 11, 2004
    Messages:
    660
    Likes Received:
    74
    Location:
    Indiana
    I have been using CrossFire since the 48xx series and have not had any problems with any of the games I play. I will admit that I am probably not the typical gamer: I do not play FPS games; I like my RPGs and sandbox games.

    Edit: I am on the 69xx series now and will soon upgrade to a new motherboard and CPU.
     
  3. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    #3643 Shtal, Mar 25, 2012
    Last edited by a moderator: Mar 25, 2012
  4. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
    Finally figured out how the "average" boost clock works.

    Base: 1006 MHz
    Boost: 1058 MHz
    Difference: 52 MHz
    Max: 1006 + (52*2) = 1110 MHz

    The "average" number is just the halfway point in the default power profile. They apparently step the clock in eight 13 MHz intervals, so 1006, 1019, 1032, 1045, 1058, 1071, 1084, 1097, 1110 by default (depending on power consumption). Thus, every 680 will be allowed to clock up to 1110 MHz by default; it is just that how often it gets there will depend on the unique power consumption of each board/chip. The clock offset just changes the minimum from which the eight boost intervals start (which are still power limited).
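
    As a sanity check on that arithmetic, here is a minimal sketch (the 13 MHz step size and the eight bins are inferred from the numbers in this post, not from any NVIDIA documentation):

```python
# Sketch of the boost-clock ladder described above. Step size and bin
# count are inferred from this post, not from an official NVIDIA spec.
BASE_CLOCK = 1006    # MHz, GTX 680 base clock
BOOST_CLOCK = 1058   # MHz, advertised "average" boost clock
STEP_MHZ = 13        # apparent size of each boost bin
NUM_STEPS = 8        # number of bins above the base clock

def boost_ladder(base=BASE_CLOCK, step=STEP_MHZ, steps=NUM_STEPS):
    """Return every clock bin the GPU may occupy, lowest to highest."""
    return [base + i * step for i in range(steps + 1)]

ladder = boost_ladder()
print(ladder)  # [1006, 1019, 1032, 1045, 1058, 1071, 1084, 1097, 1110]

# The advertised boost clock sits at the midpoint of the ladder...
assert ladder[len(ladder) // 2] == BOOST_CLOCK
# ...so the maximum bin is base + 2 * (boost - base) = 1110 MHz.
assert ladder[-1] == BASE_CLOCK + 2 * (BOOST_CLOCK - BASE_CLOCK)
```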

    From what I have gathered, they have been fairly conservative in the profiling, especially regarding power consumption. I'd guess virtually every 680 will easily hit the same clocks as the review samples, though in some cases the user might have to raise the power limit by a small amount. Hopefully that will rarely be the case. It will certainly be interesting to see the results from actual users.

    Sorry if this has been posted before... haven't been keeping up.
     
  5. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    Hitting the same clocks != hitting those clocks all of the time. More stressful scenes may invoke the limits sooner. So while every card might be able to hit 1110 MHz under certain conditions, it certainly will not do so under every condition (and I believe the reviews that monitored those clocks did see variation in them).
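
    To make the scene dependence concrete, here is a toy model (the power limit, the per-MHz draw figures, and the selection rule are all invented for illustration; the real arbiter is NVIDIA-internal):

```python
# Toy model of power-limited boost-bin selection. All numbers are
# hypothetical; NVIDIA has not published the real algorithm.
LADDER = [1006 + 13 * i for i in range(9)]  # bins from the post above
POWER_LIMIT_W = 195                         # hypothetical board power limit

def clock_for_scene(watts_per_mhz):
    """Pick the highest bin whose estimated draw stays under the limit."""
    fitting = [c for c in LADDER if c * watts_per_mhz <= POWER_LIMIT_W]
    return fitting[-1] if fitting else LADDER[0]

# A light scene leaves headroom; a heavier scene forces a lower bin,
# so the same card reports different clocks in different workloads.
print(clock_for_scene(0.170))  # light scene -> 1110
print(clock_for_scene(0.180))  # heavy scene -> 1071
```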

    I look forward to a review that monitors the clocks of a number of GTX 680s in the same tests.
     
  6. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    It is kind of annoying that not a single reviewer appeared to even acknowledge the possibility of a "golden sample" skewing their results under this new boost tech. Dereliction of duty, frankly.
     
  7. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    That's a possibility with any review sample of any tech product; I'm not sure why you expect Kepler to get special treatment on that front.
     
  8. xDxD

    Regular

    Joined:
    Jun 7, 2010
    Messages:
    412
    Likes Received:
    1
  9. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
    GPU Boost Hypocrisy

    It sure is funny to see all the red-headed stepchildren crying and howling about Nvidia's GPU Boost technology when the same technology is built into the latest CPUs from Intel and AMD. I don't see any calls for websites benchmarking those Turbo-equipped CPUs to disable Turbo Mode when benchmarking.

    Yet you see posts here on B3D, and saturating the S|A forums, about how despicable Nvidia is and how all reviewers should disable GPU Boost when benchmarking.

    Hypocrisy at its finest.


    http://www.pcmag.com/article2/0,2817,2402021,00.asp

    http://www.amd.com/us/products/desktop/processors/phenom-ii/Pages/phenom-ii-key-architectural-features.aspx

    http://www.amd.com/us/products/desktop/processors/amdfx/Pages/amdfx.aspx

    http://www.techpowerup.com/144260/AMD-C-60-Gets-TurboCore-to-CPU-and-GPU.html

    http://en.wikipedia.org/wiki/Intel_Turbo_Boost
     
  10. PeterAce

    Regular

    Joined:
    Sep 15, 2003
    Messages:
    490
    Likes Received:
    10
    Location:
    UK, Bedfordshire
    Well, I installed my GK104 yesterday afternoon. In Battlefield 3, Alan Wake and Skyrim, the card was 'boosting' up to 1123 MHz (up from 1006 MHz stock).
     
  11. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
    You don't have to launch to do that; all you need to do is stockpile them during the qualification process.
     
  12. itsmydamnation

    Veteran

    Joined:
    Apr 29, 2007
    Messages:
    1,349
    Likes Received:
    470
    Location:
    Australia
    Hi pot, meet kettle :roll:

    Calling people hypocrites without actually addressing their point.

    :roll::roll::roll:

    Come on people, this forum is better than that lowest-common-denominator crap! :twisted:
     
  13. DarthShader

    Regular

    Joined:
    Jul 18, 2010
    Messages:
    350
    Likes Received:
    0
    Location:
    Land of Mu
    How about thinking twice before you elevate the discussion with such elaborate phrases?

    1. CPU Boost has guaranteed values for all chips and doesn't boost further; there is no lottery here.
    2. CPU Boost can be turned off.

    CPU Boost has no relevance to what nVidia is doing, other than the general idea, which is why the comparison was made.
     
  14. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    I think you've made up an imaginary person and substituted him in my place. Either that, or you're thinking of what was going on years ago, because I've cared a fair amount about power consumption for the last five years or so.

    It's lower-priced, performs better, and uses less power. What more could you possibly want?

    But personally I still lean a bit towards nVidia parts for the little things. I have an ATI GPU in my laptop, so I'm not talking completely out of my backside when I say that I generally prefer nVidia's drivers. They have more options and, for the most part, better Linux support (ATI does have better open-source drivers available, which allow KMS, which in turn allows for faster wake times from sleep mode). So even given very similar parts, I would still prefer nVidia. It just makes me happy that it's the better part all-around.
     
  15. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    It uses less power under full load, but more power when idle (displays off), due to the lack of anything like ZeroCore.

    So if your primary concern is maximum power draw, GK104 wins. If it's total energy consumption per day, Tahiti wins.
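
    As a hypothetical back-of-the-envelope illustration of that trade-off (every wattage and hour figure below is invented, not a measurement of either card):

```python
# Invented numbers only: this shows the shape of the trade-off, not
# actual GK104 or Tahiti power draw.
HOURS_LOAD = 3    # gaming hours per day
HOURS_IDLE = 21   # hours idle with displays off

def daily_wh(load_w, idle_w):
    """Total energy per day in watt-hours."""
    return HOURS_LOAD * load_w + HOURS_IDLE * idle_w

gk104_wh = daily_wh(load_w=180, idle_w=15)  # lower load draw, no deep idle
tahiti_wh = daily_wh(load_w=200, idle_w=3)  # higher load draw, ZeroCore idle

print(gk104_wh, tahiti_wh)  # 855 vs 663: the deep-idle card wins the day
```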
     
  16. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    For me it's not so much power draw or heat that's important; as long as they are reasonable, I don't really care which GPU wins. However, what those two metrics enable with regard to noise is very relevant to me. The quieter the GPU the better. I learned this the painful way with my current 4890. Whatever I get next, I want it to be a lot quieter than this!
     
  17. Mize

    Mize 3dfx Fan
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,079
    Likes Received:
    1,149
    Location:
    Cincinnati, Ohio USA
    Stock cooling is rarely the best for noise. I was able to quiet my 6970s down to a whisper with a couple of aftermarket Scythe heatsinks and PWM fans, and my current 580s are extremely quiet on water.
     
  18. Sinistar

    Sinistar I LIVE
    Regular Subscriber

    Joined:
    Aug 11, 2004
    Messages:
    660
    Likes Received:
    74
    Location:
    Indiana
    I have an SLI laptop here that's dead from bumpgate, so I hope you can understand why I prefer to go with AMD products.
     
  19. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    So either you were sold a sample meant for reviewers by accident, or 1100-ish MHz is not as far out of the ordinary as some people feared.
     
  20. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    I'm not sure that really helps for a single video card, provided I turn my computer off or put it to sleep when not in use (which I do). Tom's Hardware has some power consumption numbers that seem to back this up:
    http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-15.html

    Of course, being able to power down completely when the display is off is nice, but otherwise even the idle consumption is on par with or higher than what nVidia offers. And just putting the computer to sleep is even better.
     