NVIDIA Kepler speculation thread

Discussion in 'Architecture and Products' started by Kaotik, Sep 21, 2010.

Tags:
  1. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
But they lose the market with the highest margins: the professionals and supercomputers. No one sane will "upgrade" from a 580 to a 680 in such clusters.
Why didn't they just shrink GF110 to get a smaller, faster-in-all-cases and cheaper end product? Instead they did GK104, which sucks. Of course it doesn't suck as a mainstream product, but for the high end it definitely does.
     
  2. boxleitnerb

    Regular

    Joined:
    Aug 27, 2004
    Messages:
    407
    Likes Received:
    0
Ask Nvidia. I suspect purchasing decisions in the professional market, especially HPC, are longer-term, so having nothing for a year or so isn't that bad. Also, the old Teslas may still sell well to people who don't have them yet.
     
  3. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
Then give me a reasonable explanation for why they supposedly dumped such an expensive hypothetical design like the "GK100" in order to go straight to the supposed GK110 "refresh". It was always NV's intention, afaik, to release a performance part FIRST for the Kepler family of products, and that performance part is GK104. It was never meant to be, and never will be, a high-end chip.

GK104 is not a high-end core, full stop. Not without any HPC capabilities and not with a 294mm2 die area. What marketing and its resulting pricing did with the core has nothing to do with the fact that GK104 is, was and will be a performance part of the Kepler family.

GK10x chips are named like that because they don't carry the additional HPC-related logic and capabilities of GK110. I know it's highly tempting to spin wild speculation out of a simple digit difference without even a single supporting indication, but if you think NVIDIA can afford to dump a multi-million-dollar project like a high-end core without it hurting, you're obviously confusing it with Intel.

A 7.1b-transistor, 550+mm2 chip simply wasn't manufacturable earlier than mid-2012, whichever way you turn it. Look at current manufacturing quantities, yields and whatnot for 28HP and the whole story is rather self-explanatory.
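    The die-size argument above can be made concrete with a back-of-the-envelope sketch. This is purely illustrative: the dies-per-wafer formula is a common first-order estimate, the Poisson yield model Y = exp(-A*D0) is a textbook approximation, and the defect density of 0.5/cm2 is an assumed stand-in for early 28nm, not a real TSMC figure.

    ```python
    import math

    def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
        # First-order estimate: wafer area / die area, minus an
        # edge-loss correction term for partial dies at the rim.
        r = wafer_diameter_mm / 2
        return int(math.pi * r**2 / die_area_mm2
                   - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

    def yield_poisson(die_area_mm2, defect_density_per_cm2):
        # Poisson yield model: Y = exp(-A * D0), with A in cm^2.
        return math.exp(-(die_area_mm2 / 100) * defect_density_per_cm2)

    for area in (294, 550):  # GK104-sized vs. GK110-sized die
        n = dies_per_wafer(area)
        y = yield_poisson(area, defect_density_per_cm2=0.5)  # assumed D0
        print(f"{area} mm^2: ~{n} candidates/wafer, ~{n * y:.0f} good (Y={y:.0%})")
    ```

    Under these assumptions a 294mm2 die yields roughly 46 good candidates per 300mm wafer versus only about 6 for a 550mm2 die, which is the kind of gap that makes a big chip uneconomical until the process matures.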
     
  4. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,183
    Likes Received:
    1,840
    Location:
    Finland
    They were perhaps one day, but codenames have become part of marketing too, for the semi-aware/informed people, as all the sites do a good job of mentioning them before and after the releases.
    GF110 is called GF110 because of marketing, too; it was originally GF100b (as proven by early BIOSes)
     
  5. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
    GF100b was renamed to GF110 because the original 32nm GF110 project had been cancelled, for obvious reasons.
     
  6. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    You ask the question and then answer it yourself. Did they dump it, or simply postpone it?
    It might be expensive, but it can in any case be considered an investment, and the technology will almost surely be used one day.
    I wonder why you bother about whether it is expensive or not. NV has lots of money to burn.

     
  7. Andrew

    Newcomer

    Joined:
    Jul 26, 2002
    Messages:
    58
    Likes Received:
    5
    Why would you spend money on a 7970 right now, when AMD has made it perfectly clear that the 1 GHz edition will hit stores in a few weeks?

    AMD has a very bad habit of killing its own sales by leaking or announcing upcoming products, seemingly without any concern for its third-party manufacturing partners and retailers.
     
  8. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    Isn't the GHz edition supposed to be at least a little more expensive?
     
  9. Andrew

    Newcomer

    Joined:
    Jul 26, 2002
    Messages:
    58
    Likes Received:
    5
    I guess so.

    But regardless... if it came down to waiting a few weeks and raising my budget by $50 to get the newer card that isn't slower than the GTX 680, I have zero doubt in my mind that anyone informed enough to know the GHz Edition is coming soon would NEVER buy one of the 7970s collecting dust at a retailer near you.

    The people you'd typically expect to have no clue about this upcoming faster 7970 are the uninformed consumers who buy their PCs at Best Buy, and to be honest... these people hardly ever upgrade their video card, and when they do it ends up being a mid-range one at the very most.

    So I still stand by my statement that AMD killed its own product sales by paper-launching a newer, faster and better card while retailers still had large quantities of the current product lines in stock.

    AMD would have been wise to wait a few weeks and make the GHz edition a hard launch, so that retailers could sell off what they had and not take a big hit if AMD does indeed plan a price cut.
     
  10. leoneazzurro

    Regular

    Joined:
    Nov 3, 2005
    Messages:
    518
    Likes Received:
    25
    Location:
    Rome, Italy
    Because the 7970 is actually quite a bit less expensive (at least in Europe) than a 680, and than the announced MSRP of the 7970 GE, while delivering maybe only a few percent less performance (not counting overclocking in the equation, which is also a point in the 7970's favour)?
    Also, by the same reasoning, the 670 could have been an "internal" 680 killer (but there's no sign of a 670 or lower on the list, so maybe the claim that all Kepler cards are detected as "680" by the Steam survey has some ground...)
     
  11. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,298
    Likes Received:
    247
    Better price/performance + lower power consumption?
     
  12. iMacmatician

    Regular

    Joined:
    Jul 24, 2010
    Messages:
    773
    Likes Received:
    200
    There have been many occasions since at least the G80/G90 series that a GXy1z or G9z chip was released with no previous sign (or at least none that I saw) of a corresponding GXy0z or G8z. G92, G98, most of the GT21x, GF119, and GF117 are the ones I can think of. (Also GT216 was nothing like GT206/GT200b.)

    However, if there really was no GK100, this would be the first time since at least G80 that a GXy10 chip was released with no GXy00 before it. Still, it doesn't seem unlikely to me.
     
  13. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,183
    Likes Received:
    1,840
    Location:
    Finland
    ...and if you get the reference model, the ability to just flash it into a full-blown GHz Edition
     
  14. Homeles

    Newcomer

    Joined:
    May 25, 2012
    Messages:
    234
    Likes Received:
    0
    If you're into flashing, chances are that you're into overclocking as well, making a flash irrelevant.
     
  15. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,079
    Likes Received:
    648
    Location:
    O Canada!
    That sentence could have gone a whole different direction! :shock:
     
  16. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,891
    Likes Received:
    2,307
    Speaking of voyeurism, what are you doing in an NV thread...
     
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,183
    Likes Received:
    1,840
    Location:
    Finland
    What happens in Ruisrock stays in Ruisrock! (and what would a music festival be without skinny-dipping in the ocean, too drunk to really stay afloat)
     
  18. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland

    Yes and no... for people who don't want to run overclocked cards with third-party software, which starts to be a real headache when you use ULPS, getting a 1050 MHz 7970 without having to use them is not so bad.

    I'm an overclocker and I push the limits of my hardware to get scores under H2O, LN2, etc., but I don't need to overclock my cards for 24/7 use (with 2x 7970 I really don't need it, whatever the resolution).

    As BIOS editing is not possible yet on Tahiti, having a 1050 MHz BIOS with turbo boost is not a bad thing. What is really interesting is not whether or not you flash your card; what is interesting is that any AMD card with the same system could get the turbo boost, including the midrange 7870/7850.
     
    #5098 lanek, Jul 13, 2012
    Last edited by a moderator: Jul 13, 2012
  19. xDxD

    Regular

    Joined:
    Jun 7, 2010
    Messages:
    412
    Likes Received:
    1
    A few months is not a tragedy... AMD has been lacking in that market for years, for example...
     
  20. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    Considering it's Nvidia's only decently profitable segment, it doesn't make a whole lot of sense to be so late.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.