NVIDIA Kepler speculation thread

Discussion in 'Architecture and Products' started by Kaotik, Sep 21, 2010.

  1. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
Because... GF104 and onwards also did it in a single cycle?
     
  2. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
  3. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    First people made fun that testing at 1080p is BS for a card of this caliber. Then they speculated that, for sure, it would trail at 25x16. And now with that out of the way too, they complain that a test was done at 25x16, the resolution of choice remember, when it should have been done at 1080p? Are we in the bargaining phase of the 5 stages of grief?
     
  4. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
    If there is anyone cherry picking here it is you.
     
  5. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
It should have been tested at 1080p when so much is being made of the 680's 1080p performance - not lauding the performance at 1080p and then benching the turbo at 1600p.

He should also have run the power draw, heat and noise benchmarks on one of the games Nvidia won at, instead of one of the very few that AMD won at (Metro).
     
  6. Vardant

    Newcomer

    Joined:
    Sep 1, 2009
    Messages:
    96
    Likes Received:
    1
    But that was all just you guys :lol:
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
I'm cherry picking by pointing out that it really seems to have fallen off a cliff in compute? Sorry, the benchmarks don't lie. In that test it gets creamed by a 580, never mind the 7970.

    It seems like when Trinibwoy reveled in AMD taking the compute hit, he was only half right. Nvidia did the reverse, they un-took the compute hit.

    Also, overclocking is weird on this thing

Hardcore overclockers are such a huge part of the enthusiast market; hell, I've sometimes wondered if it isn't best to artificially limit high-end part clocks a little bit so that the OC'ers can feel smug about their great OCing results.

This overclocking weirdness with Kepler - I'm not sure how it's going to play, but I suspect it might be a negative with that crowd. Regardless, their love for Nvidia will probably overcome it to some extent. They'll likely grin and bear it for Nvidia's sake, even if they don't like it.
     
  8. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    http://hexus.net/tech/reviews/graphics/36509-nvidia-geforce-gtx-680-2gb-graphics-card/?page=4

    Looks like Hexus got lucky with their card! I wonder how many other reviewers did. :razz:

    Average 1097 - http://translate.googleusercontent....e.html&usg=ALkJrhjWLuwQUkB2NNH7HwOmqGZsAYb_dA
     
  9. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    My mistake, for some reason my brain had GF with a slower rate.
     
  10. SimBy

    Regular

    Joined:
    Jun 21, 2008
    Messages:
    700
    Likes Received:
    391
GPU Boost obviously leaves the door open to benchmark shenanigans and weirdly inconsistent results from one review to another. It's basically an out-of-the-box overclocked card, except that the OC is range-based.

And it seems Nvidia gave up compute for gaming performance.
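To see why a range-based boost clock can produce inconsistent review numbers, here is a minimal sketch. The clock values and the linear headroom-to-clock mapping are illustrative assumptions only, not NVIDIA's actual boost algorithm or the GTX 680's real clock table:

```python
import random

# Hypothetical figures for illustration; the real boost behaviour
# depends on power, temperature and binning of the individual card.
BASE_MHZ = 1006   # guaranteed base clock
BOOST_MAX = 1110  # ceiling some review samples might reach

def boost_clock(power_headroom):
    """Pick a clock in the boost range from available power headroom.

    power_headroom: 0.0 (at the TDP limit) .. 1.0 (lots of headroom).
    """
    return BASE_MHZ + power_headroom * (BOOST_MAX - BASE_MHZ)

def benchmark_run(seed, samples=1000):
    """Average clock over a run where headroom varies scene to scene."""
    rng = random.Random(seed)
    clocks = [boost_clock(rng.uniform(0.3, 1.0)) for _ in range(samples)]
    return sum(clocks) / len(clocks)

# Two "identical" runs settle on different average clocks, so two
# review sites can legitimately report different performance.
run_a = benchmark_run(seed=1)
run_b = benchmark_run(seed=2)
print(f"run A: {run_a:.1f} MHz, run B: {run_b:.1f} MHz")
```

The point of the sketch is only that averaging a clock drawn from a range makes the effective frequency, and therefore the benchmark score, a property of the individual run and sample rather than of the SKU.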
     
  11. ECH

    ECH
    Regular

    Joined:
    May 24, 2007
    Messages:
    692
    Likes Received:
    30
    So I take it that power consumption/clocks can be controlled via profile/software.
     
  12. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
It's not necessarily cheating that the 680 fares worse in various compute scenarios, particularly since it still beats AMD in others.
What we do see is a de-emphasis on all areas of compute and less consistency, which can be a valid tradeoff for a target market that is not dominated by compute loads.
One of the few areas of interest for most consumers is now handled by a hardware encoder.

    Lesser capability needs to be weighed against the relative significance of the feature where the card is sold.
    That's not to say that this couldn't hurt in the future, since some games do make better use of compute shaders.
    However, the rest of the architecture on the graphics side is so strong that it can frequently more than compensate.
     
  13. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    nah
    http://www.hardwarecanucks.com/foru...616-nvidia-geforce-gtx-680-2gb-review-29.html

This again! :roll: Any clock increase AMD could manage, NVIDIA could just match.

HD 7970 clocks at 5500 MHz - this is what I am referring to.
     
  14. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    960
    Likes Received:
    853
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/4
     
  16. Jaaanosik

    Newcomer

    Joined:
    May 18, 2008
    Messages:
    146
    Likes Received:
    0
    Comments under the review.

    Edit: Comment from Ryan:
     
    #3416 Jaaanosik, Mar 22, 2012
    Last edited by a moderator: Mar 22, 2012
  17. steveOrino

    Regular

    Joined:
    Feb 11, 2010
    Messages:
    549
    Likes Received:
    242
HAHAHA, by that reasoning, thank goodness ATI did all that "cheating" in compute during previous generations or they would have been really screwed.
     
  18. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,987
    Likes Received:
    3,529
    Location:
    Winfield, IN USA
Looks like Kepler is living up to its reputation, nice job nVidia. :)
     
  19. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York
    Lol I'm glad somebody said it. The coo-coo train is really rolling now. The fact that it doesn't make toast is probably cheating too.

    Jawed was right about the greater compiler dependency though. He probably saw the white paper before starting that little diatribe :lol: In any case it's obvious nVidia's static scheduling needs some work. It's only dual issue dammit, how hard can that be. AMD had to deal with 2.5x that.
     
  20. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
The NVIDIA card appears to suck badly in the original Crysis and Crysis Warhead; it is hardly faster than a GTX 580! I suspect this could improve with future driver updates.

Performance in Crysis 2 DX11 seems superior to the competition though, mainly because of the tessellation I think; without it we are looking at a tie.
     
