NVIDIA Kepler speculation thread

Discussion in 'Architecture and Products' started by Kaotik, Sep 21, 2010.

  1. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,966
    Likes Received:
    4,561
    The origin of the discussion is that the GTX 780 Ti has a great GPU whose useful life will be cut short by the small amount of memory in nVidia's default design/specs, just as happened with the GTX 580.
     
  2. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,773
    Likes Received:
    2,560
    Not even close.

    Less than 5.5 GB is available to PS4 games, and the same goes for the XO; the rest is dedicated to the OS. So the GPU in these consoles could use maybe 3 GB for its video data.
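    For what it's worth, the arithmetic behind those figures, as a rough sketch (the OS reservation and the CPU/GPU split are ballpark assumptions from this discussion, not official numbers):

    [CODE]
    # Rough console memory budget sketch; every figure here is a ballpark
    # assumption from the discussion above, not an official spec.
    total_ram_gb   = 8.0                               # unified memory pool
    os_reserved_gb = 2.5                               # reported OS reservation
    game_budget_gb = total_ram_gb - os_reserved_gb     # ~5.5 GB for the game
    cpu_side_gb    = 2.5                               # code, sim state, audio, ...
    gpu_side_gb    = game_budget_gb - cpu_side_gb      # ~3 GB left for video data

    print(f"Game budget ~{game_budget_gb:.1f} GB, of which ~{gpu_side_gb:.1f} GB for GPU data")
    [/CODE]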
     
  3. boxleitnerb

    Regular

    Joined:
    Aug 27, 2004
    Messages:
    407
    Likes Received:
    0
    First of all, I disagree - it is. People spend money on all kinds of crap, be it the newest smartphone, fancy new clothes they don't need, driving their cars where they could walk or cycle, smoking, etc. $200 every 2-3 years for a midrange video card of the newest generation is perfectly affordable imo. It's about lifestyle preferences, nothing else. I had two 3GB 580s, but only because of SLI. For that it makes sense; otherwise, it's quite questionable. It's like the frequency game: people always think more is better, that they need more. And just because a game uses more on a card with larger VRAM doesn't mean a card with less would fail in that scenario. VRAM usage != VRAM requirement! People forget that far too easily.

    Secondly, a generic statement is better than cherry picking (Skyrim with mods). I would expect that most games play just fine on a 580 IF you lower settings accordingly - despite the 1.5 GB. For instance, 2xMSAA instead of 4xMSAA not only gives more performance in general but also requires less VRAM. These things often go hand in hand. A 2 GB GTX 650 Ti is a joke compared to the 580; in 99% of all cases it doesn't have the raw compute power to make use of the additional 512 MB.
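    To put rough numbers on the MSAA point, a minimal sketch of the render-target arithmetic at 1080p (ignoring driver-side compression and every other VRAM consumer, so purely illustrative):

    [CODE]
    # Multisampled color + depth/stencil targets store data per sample,
    # so halving the sample count roughly halves this part of the budget.
    def msaa_target_mib(width, height, samples, color_bytes=4, depth_bytes=4):
        return width * height * samples * (color_bytes + depth_bytes) / 2**20

    for s in (1, 2, 4):
        print(f"{s}x at 1920x1080: ~{msaa_target_mib(1920, 1080, s):.0f} MiB")
    # ~16, ~32, ~63 MiB -- small next to textures and geometry, but it all
    # adds up inside a 1.5 GB budget.
    [/CODE]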

    Third, you cannot compare console hardware to PC hardware. Closer-to-the-metal programming models extract much more performance from the consoles. And as DavidGraham correctly said, it's about 5 GB in total (unified VRAM+RAM) for a game on the PS4 or XB1.
     
    #6983 boxleitnerb, Jan 13, 2014
    Last edited by a moderator: Jan 13, 2014
  4. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,457
    Likes Received:
    580
    Location:
    WI, USA
    I wonder if they could have built 2GB 580s using mixed channel width like 550 Ti.
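    A quick sketch of the mixed-density arithmetic (the 550 Ti layout below is how the 1 GB card was reportedly wired; the 2 GB 580 configuration is hypothetical):

    [CODE]
    # Per-channel chip capacities in MiB; each 32-bit channel carries one chip.
    gtx550ti_1gb   = [128]*4 + [256]*2   # 6 channels on a 192-bit bus -> 1024 MiB
    gtx580_stock   = [128]*12            # 12 channels on a 384-bit bus -> 1536 MiB
    gtx580_hypo2gb = [128]*8 + [256]*4   # hypothetical mixed 384-bit config -> 2048 MiB

    for name, cfg in [("550 Ti 1GB", gtx550ti_1gb),
                      ("580 1.5GB", gtx580_stock),
                      ("hypothetical 580 2GB", gtx580_hypo2gb)]:
        print(f"{name}: {sum(cfg)} MiB across {len(cfg)} channels")
    [/CODE]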
     
  5. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,966
    Likes Received:
    4,561
    Using a 3-word sentence won't make your post right. It'll only make it trollish, which brings me to:

    Nitpicking for a difference of ~8% between what I said and what you said.
    Again, trollish.

    The GTX580 is comparable to the xbone's and ps4's iGPUs in compute performance, geometry processing, fillrate, memory bandwidth and featureset. Any theoretical comparison that you ever bother to read will tell you the same.
    Besides, what exactly was the point in your post?



    This is the most conceited and self-absorbed opinion I've ever seen about being able to make a purchase.
    If I can afford it, then all the others must be able to afford it too. Otherwise, they must just be wasting too much money on cigarettes and stuff.

    This conversation ends here for me. I choose not to maintain any kind of argument with someone carrying this attitude.



    Since nVidia introduced that ability with GF114, which came out half a year after the GTX 580, the GF110 may not have been prepared for it.
    But they sure could have done it for any Kepler chip. At least the GK104 and GK106 have that ability (660 Ti and 650 Ti Boost).
     
    #6985 ToTTenTranz, Jan 14, 2014
    Last edited by a moderator: Jan 14, 2014
  6. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,773
    Likes Received:
    2,560
    Comparing different architectures will only get you so far. The PS4's GPU is slightly better than an HD 7850, nothing more, nothing less. The XO is out of the equation because it's at the same level as an HD 7770, so it's far behind. If you mean that they could get close to a 580 through close-to-the-metal programming, then that is yet to be tested and proven. And the situation is not that rosy in the console space due to their lackluster CPUs, which will hold them back.
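    For reference, the paper numbers both sides are arguing over, as a small sketch (theoretical single-precision peaks only, which is exactly the kind of cross-architecture comparison being cautioned against):

    [CODE]
    # Peak SP throughput = ALUs x 2 ops (FMA) x clock, using commonly cited specs.
    parts = {
        "GTX 580": (512, 1544e6),    # hot-clocked Fermi shader domain
        "PS4 GPU": (1152, 800e6),    # 18 GCN CUs
        "XO GPU":  (768,  853e6),    # 12 GCN CUs
        "HD 7850": (1024, 860e6),
        "HD 7770": (640, 1000e6),
    }
    for name, (alus, clock_hz) in parts.items():
        print(f"{name}: {alus * 2 * clock_hz / 1e12:.2f} TFLOPS")
    # ~1.58, ~1.84, ~1.31, ~1.76, ~1.28 TFLOPS respectively.
    [/CODE]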

    Not nitpicking, just pointing out the falsehood of the supposed advantage of consoles in memory capacity. There are restrictions all over, and while they do have some advantage in the flexibility of memory allocation, it is nothing that current PCs can't handle, and certainly nothing to brag about.

    In fact, if you look at some current cross-platform titles, such as Call of Duty: Ghosts, you would see that its texture resolution is lower than the PC version's at max quality. BF4 and NFS Rivals also suffer from reduced world detail (though that could be attributed to reducing CPU load too), suggesting even deeper restrictions on memory allocation.
     
  7. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
    The interesting thing about the new Titan will be the clock speeds.
     
  8. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,153
    Likes Received:
    928
    Location:
    still camping with a mauler
    Hey guys, it's against the forum rules to accuse other users of trolling. That stuff must be reported if serious.

    Anywho, I'm way more concerned about the 2GB cards than the 3GB GTX 780. 3GB should be good well into the future, but 2GB may be limiting on the higher-end GK104 cards. By that time I'll probably be ready to upgrade from my GTX 670 anyway, but it would be a problem if I ever wanted to resell it (not that I would; all my old GPUs get passed down the "chain of command").
     
  9. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,928
    Likes Received:
    1,626
    http://www.guru3d.com/news_story/rumor_geforce_gtx_titan_black_edition_and_790.html
     
  10. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    I always wonder why journalists trot out these tired old CeBIT, CES, Computex, etc. release date speculations.

    Have Nvidia and AMD *ever* released a new desktop GPU at one of those shows? You'd think they'd start seeing a pattern of being wrong after a while, but no...
     
  11. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
    For each MHz a $? :razz:
     
  12. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
    I'd hope for less than that. To justify the Titan name I'd hope for a boost clock of no less than 1100MHz, preferably more.
     
  13. xDxD

    Regular

    Joined:
    Jun 7, 2010
    Messages:
    412
    Likes Received:
    1
  14. spworley

    Newcomer

    Joined:
    Apr 19, 2013
    Messages:
    146
    Likes Received:
    190
    Evidence of a new (likely Kepler) low-end GPU.
    The GTX 740 has only 1GB of GDDR5 with a 128-bit bus width.

    With no more information, we can only speculate on details, especially which GPU is inside.


    It could be a GTX 650 retread/rename using a GK107 or a cut-down GK106. It would be odd to use a two-year-old chip, though.

    It could be a super-cut-down GM107, though the mere existence of a 6-pin power option makes this unlikely.

    The most interesting possibility: it could be a new Kepler, a GK207 to complement GK208. Perhaps with the hinted upcoming sm_37 architecture? GK208 is sm_35, unlike the older GK10x series.
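    If such a card does turn up, the compute capability is the quickest tell from software. A minimal sketch using pycuda (assuming pycuda and an NVIDIA driver are installed); GK107 reports sm_30 and GK208 reports sm_35, so anything above that would point at a new chip:

    [CODE]
    import pycuda.driver as cuda

    cuda.init()
    for i in range(cuda.Device.count()):
        dev = cuda.Device(i)
        major, minor = dev.compute_capability()
        # e.g. "GeForce GT 740: sm_30, 1024 MiB" for a GK107-based board
        print(f"{dev.name()}: sm_{major}{minor}, {dev.total_memory() // 2**20} MiB")
    [/CODE]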
     
  15. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    647
    Likes Received:
    92
    Agreed, it could well be a GTX 650 rebranding, but it could also be a cut-down GM107. The 6-pin power plug appears to be on only one of the cards, indicating that it is not a requirement.

    GK107 would be cheaper as it has a smaller die, but if they have harvested GM107 chips, then I suppose it would be better to use them. Launch looks imminent, so we should find out shortly.
     
  16. A1xLLcqAgt0qc2RyMz0y

    Regular

    Joined:
    Feb 6, 2010
    Messages:
    987
    Likes Received:
    278
    This link shows the GT 740 GPU to be a GK117.

    http://gpuboss.com/graphics-card/GeForce-GT-740
     
    #6997 A1xLLcqAgt0qc2RyMz0y, May 25, 2014
    Last edited by a moderator: May 25, 2014
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,183
    Likes Received:
    1,840
    Location:
    Finland
    NVIDIA hits a new low: they launched 3 new GeForce GT 730 models

    GeForce GT 730 with DDR3 memory and 128-bit interface
    GeForce GT 730 with DDR3 memory and 64-bit interface
    GeForce GT 730 with GDDR5 memory and 64-bit interface

    Doesn't sound too bad, does it?
    Yeah, it doesn't - until you realize that the first of the 3 has only 96 CUDA cores while the other two have 384.
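    The memory configurations alone make them very different cards. A rough peak-bandwidth comparison (data rates assumed from typical board specs, so treat the numbers as approximate):

    [CODE]
    # Peak bandwidth = bus width in bytes x effective data rate per pin.
    configs = {
        "GT 730 DDR3 128-bit (96 cores)":  (128, 1.8),   # bus bits, Gbps per pin
        "GT 730 DDR3 64-bit (384 cores)":  (64,  1.8),
        "GT 730 GDDR5 64-bit (384 cores)": (64,  5.0),
    }
    for name, (bus_bits, gbps) in configs.items():
        print(f"{name}: ~{bus_bits / 8 * gbps:.1f} GB/s")
    # ~28.8, ~14.4 and ~40 GB/s -- three very different cards under one name.
    [/CODE]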
     
  18. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,298
    Likes Received:
    247
    It's probably the 4-year-old Fermi GF108 (40 nm): GeForce GT 430 -> GT 530 -> GT 620 -> GT 730 128-bit.
     
  19. dbz

    dbz
    Newcomer

    Joined:
    Mar 21, 2012
    Messages:
    98
    Likes Received:
    41
    Really? Seems rather hyperbolic.
    And this tops the four* GT 630 variants in what way?
    No, not really, given that the likely buyer in this market segment either wouldn't know a CUDA core from a VGA-out or is simply looking for a card for display output.

    *
    GT 630 (GF108/96 core/128-bit/DDR3)
    GT 630 (GF108/96 core/128-bit/GDDR5)
    GT 630 (GK107/192 core/128-bit/DDR3)
    GT 630 (GK208/384 core/64-bit/DDR3)
     