NVIDIA Kepler speculation thread

Discussion in 'Architecture and Products' started by Kaotik, Sep 21, 2010.

Tags:
  1. sheepdogexpress

    Newcomer

    Joined:
    Mar 10, 2012
    Messages:
    86
    Likes Received:
    11
    What do you mean, "pushed to its limits"?

    GF114 was one of the best overclockers of the last generation. Considering that GF110 hit 772 MHz on a much larger die and still overclocked pretty well, it was easy for GF114 to hit 840 MHz.

    Half the GF114 cards were sold as overclocked editions, and some were sold at about 20% over stock clocks.
     
  2. tviceman

    Newcomer

    Joined:
    Mar 6, 2012
    Messages:
    191
    Likes Received:
    0
    You are right, sir; I stand corrected. I should have clarified that my argument was primarily based on add-in discrete desktop GPUs. And yes, I was encompassing all GF104/GF114-based cards in saying that GF104/GF114 was Nvidia's best-selling die (at the AIB level) on 40nm.
     
  3. jaredpace

    Newcomer

    Joined:
    Sep 28, 2009
    Messages:
    157
    Likes Received:
    0
    My understanding from reading the VR-Zone article is that there are two clocks, as in Fermi, but they aren't *always* bound to a 2:1 ratio. The GK104 core (uncore) clock ranges from 300 to 950 MHz, defaulting to 705 MHz, and the shader clock domain can supposedly reach as high as 1411 MHz. It can clock independently of the 2:1 shader:core ratio and "Turbo" from ~700 MHz up to 950 MHz if needed and if TDP headroom warrants it. GDDR5 is at 6000 MHz effective (QDR), higher-speed memory than on 7970 boards.
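
    The rumored scheme boils down to a power-capped clock governor. As a rough sketch only (the step size, TDP figure, and decision thresholds below are all my own assumptions for illustration; the clock range and default come from the rumored GK104 figures, and this is not a real driver algorithm):

```python
# Toy model of TDP-limited dynamic clocking as described in the rumors.
# BASE/MIN/MAX come from the rumored GK104 numbers; everything else
# (step size, TDP value, busy/idle thresholds) is an assumption.

BASE_MHZ = 705      # rumored default core clock
MIN_MHZ = 300       # rumored idle floor
MAX_MHZ = 950       # rumored turbo ceiling
TDP_WATTS = 195     # placeholder board power limit
STEP_MHZ = 13       # assumed clock step per adjustment interval

def next_core_clock(current_mhz, board_power_watts, gpu_busy_fraction):
    """Step the clock up when the GPU is busy and under the power cap,
    and step it down when over budget or idle."""
    if board_power_watts > TDP_WATTS:
        return max(MIN_MHZ, current_mhz - STEP_MHZ)   # back off under the cap
    if gpu_busy_fraction > 0.9:
        return min(MAX_MHZ, current_mhz + STEP_MHZ)   # turbo while headroom remains
    if gpu_busy_fraction < 0.1:
        return max(MIN_MHZ, current_mhz - STEP_MHZ)   # clock down when idle
    return current_mhz
```

    The point of the sketch is just that the turbo range (~700 to 950 MHz) is bounded by measured board power rather than by a fixed shader:core ratio.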

    A GT640m notebook Kepler clocks

    http://vr-zone.com/articles/nvidia-...r-dynamic-clocking-2-and-4gb-gddr5/15148.html
    http://www.forum-3dcenter.org/vbulletin/showpost.php?p=9202420&postcount=5723
     
  4. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Why don't you first explain what you mean by 'making it look a lot better than it was' and how 'that doesn't come cheap'? All while keeping in mind that GF114 is still produced on an extremely mature 40nm process, which means low wafer cost, very low process spread and low defect densities (no surprises, basically), and a very high yield for the top bin.

    You still haven't replied to my first question: what do you mean by 'loss'? I hope you're not saying that they are selling a die for less than what it costs to produce, because that would be more than ridiculous.

    I also challenge you to find a single instance in the Nvidia and AMD conference calls of something like "our gross margins in the consumer space dropped due to the product mix shifting towards higher-end GPUs." (No point challenging you to find the opposite statement: there are plenty of those.) News flash: it is much easier to make a good chunk of money on a $200 card than on a $50 card.

    Finally: can you explain to me the concept of a 'loss leader' in the GPU space? I'm really curious about that. Does Nvidia sell ink cartridges to print out Nvidia logos? Did I miss out on a thriving market of Nvidia-branded razor blades? Will the sale of a GTX 560 encourage the buyer to buy a companion GTX 550? The GTX 560 products are all high runners: exactly how do you figure Nvidia will recoup the losses made on the initial selling price?

    (There was once a Jon Peddie or Mercury report on Ars Technica that broke down the costs and volumes of Nvidia and AMD GPUs. Can anyone dig it up?)
     
  5. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
    Oh yeah... I wouldn't either :wink:

    True... the first batch of GK110 has apparently been booked for a supercomputer.

    The definition of high end is purely about price at the moment. Going by past convention (since RV770), AMD's top chip has been upper midrange; the high end has been made up of Nvidia's top dog and the dual-GPU cards from both parties. The current positioning of Tahiti is an aberration, which will be corrected in time. Besides, do you think AMD won't introduce a refresh of Tahiti to counter GK104?

    There's no question of "if": AMD is most definitely going to release a dual-GPU card based on Tahiti. And as indicated above, Nvidia is going to release a dual-GPU card based on GK104 as well.

    Past info always suggested it; you just had to know where to look :wink:
     
  6. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    Let it be 40-50% to equal it. ;)
     
  7. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    Looking on Newegg, half of them (13 of 26) have dual-fan coolers, and 9 of 26 are clocked over 900 MHz. That does not come cheap.

    "Making it look a lot better than it was" is easy enough: just check the reviews that pitted overclocked cards against AMD's stock cards. We even had Tom's Hardware increase the clocks on a stock card in a review against Barts "because so many of them on Newegg had overclocks."

    Why didn't they just go with higher clocks?

    Why would that be ridiculous? Check AMD's graphics-segment revenues and you'll see they've been struggling to break even for most of the year, and that's with generally smaller chips. Why would Nvidia be any different? There is not a lot of money in consumer graphics cards when there is an ongoing price war.

    If you consider that the 6950 is selling for a bit more than the 560, and that AMD has barely made any profit in graphics for most of the year, then by your reckoning they must be losing money on all of their bottom-end cards. I mean, if you really believe these $200+ cards are making them a fortune, there must be a loss elsewhere, right?

    How many people do you think buy 560s believing they are 560 Tis? How many people do you think buy 560 Tis with single-fan coolers and cheap components thinking they will all hit 950 MHz easily?
     
  8. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
  9. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
  10. sethk

    Newcomer

    Joined:
    May 1, 2004
    Messages:
    93
    Likes Received:
    1
    Do we actually know that the Samaritan (UE3) demo was running on GK104? I see the Kepler bit, but people seem to be extrapolating the GK104 part. If this single card replaced a tri-SLI GTX 580 setup, I don't find it very likely to be the 'upper-midrange' GPU; it seems more likely to be the top part, GK110.
     
  11. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    Exactly what I was thinking. Why are we getting what sounds like only a 30-50% performance improvement from something that has 3x the shaders?
     
  12. chavvdarrr

    Veteran

    Joined:
    Feb 25, 2003
    Messages:
    1,165
    Likes Received:
    34
    Location:
    Sofia, BG
    Because what dnavas wrote is not true? :???:
     
  13. tviceman

    Newcomer

    Joined:
    Mar 6, 2012
    Messages:
    191
    Likes Received:
    0
    I think that while it wasn't confirmed to be running on GK104, it probably was, but at a lower resolution than the GTX 580 setup and with FXAA instead of MSAA.
     
  14. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    Possibly. ExtremeTech seems pretty confident about those specs, though:

    More here:

    http://www.extremetech.com/computin...based-dynamic-turbo-boost-arriving-this-month
     
  15. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    They're just repeating rumors.
     
  16. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    2GHz at Q(uad)DR == 6 GHz? This is quite curious...
     
  17. iMacmatician

    Regular

    Joined:
    Jul 24, 2010
    Messages:
    797
    Likes Received:
    223
    The "2 GHz" QDR might be a typo, as all the other info in that report points to 1.5 GHz QDR: 6 GHz effective, 256-bit bus, 192 GB/s bandwidth. (I'm guessing there's no such thing as T(riple)DR memory?)

    EDIT: Or maybe Dynamic clock works with memory too?
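
    The numbers in the report do line up if "2 GHz" is taken as a typo for the 1.5 GHz command clock. As a quick check of the arithmetic (using only the figures quoted above):

```python
# Sanity check of the rumored memory spec: 1.5 GHz command clock at
# QDR (4 transfers per clock) over a 256-bit bus.

command_clock_ghz = 1.5
effective_rate_ghz = command_clock_ghz * 4        # QDR: 4 transfers/clock
bus_width_bits = 256

# Bandwidth = effective data rate x bus width in bytes.
bandwidth_gb_s = effective_rate_ghz * bus_width_bits / 8
print(effective_rate_ghz, bandwidth_gb_s)   # 6.0 192.0
```

    A literal 2 GHz command clock at QDR would instead give 8 GHz effective and 256 GB/s, which matches none of the other figures in the report.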
     
  18. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    The clocks are irrelevant. A fan is dirt cheap.

    I have no idea how this kind of 'reasoning' leads to a conclusion that a GF114 is sold with negative GMs.

    Why didn't the 7970 go with higher clocks?

    When people don't use right terminology, it's often an indication that they don't really know what they are talking about either, and it's a drag to deal with the imprecision and resulting confusion.

    I've worked for many, many years at fabless chip companies. Pretty much without exception, they never made a profit. Yet, without exception, gross margins were in the 40% range. Some of the CEOs may not have been brilliant, but they were not total idiots either: nobody deliberately sells silicon for a lower price than necessary.

    The reason companies (or, in this case, a division) don't make a profit is the huge NRE involved in bringing a chip into existence. But once the chip exists, it's pretty cheap to produce, and you basically hope you sell enough of them to recoup the NRE.
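
    The break-even arithmetic here is simple. A toy model (every figure below is invented purely for illustration; real NRE, ASPs, and unit costs are not public):

```python
# Toy break-even model: per-unit gross margin vs. the NRE that has to
# be recouped. All figures are invented for illustration only.

nre_dollars = 100_000_000        # assumed design/verification/mask cost
asp_dollars = 120                # assumed average selling price per chip
unit_cost_dollars = 60           # assumed production cost per chip

gross_margin_per_unit = asp_dollars - unit_cost_dollars   # 50% gross margin
units_to_break_even = nre_dollars / gross_margin_per_unit
print(round(units_to_break_even))   # ~1.67 million chips
```

    The point: healthy gross margins and an overall loss can coexist; whether the division is profitable depends on whether volume covers the NRE, not on whether each chip sells above its production cost.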

    It is funny that you're using GF114 as your example, because SemiAccurate, of all places, did your exercise for GF104. This was discussed at length on Beyond3D, which is why it's better to move this discussion to PM.

    First observation: even SemiAccurate admits that you can break even on a GF104-based card. GF114 is essentially the same die, so the same applies.

    But we are now 20 months later: yields on 40nm are excellent and wafer cost has gone down. Factor that in and, even by their numbers, a GF114 die can be sold very profitably.

    But here's the kicker: SemiAccurate got it wrong, as usual. I've been involved in the productization of consumer-electronics gadgets:
    - their cost for the PCB is a riot: a Chinese manufacturer can sell you a similar-size 10-layer PCB for $5, not $10, and Nvidia can probably get it quite a bit lower.
    - GDDR5 RAM does not cost $24; it's probably $15 or less.
    - a dual-fan heatsink? My guess is $7.
    - packaging and accessories for $10? Are you kidding me? How about $3?

    If you ever have the misfortune to go to Shenzhen, you should visit the SEG Electronics Market and the surrounding shops, all on the same street. I did. It's where thousands of Chinese manufacturers sell their wares: one will sell only HDMI cables ($0.50?), another only fans, etc.

    It's all dirt cheap, in volume.

    Do the exercise: add the numbers. See how misguided your premise is.
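
    Adding up the figures quoted above (all of them the poster's guesses, not verified costs) gives a sense of the board-level BOM before the GPU die itself:

```python
# "Add the numbers": rough bill-of-materials sum using the estimates
# from the post above. None of these are confirmed costs.

bom = {
    "10-layer PCB (volume pricing)": 5,
    "GDDR5 (est.)": 15,
    "dual-fan heatsink (est.)": 7,
    "packaging and accessories (est.)": 3,
}

board_cost = sum(bom.values())
print(board_cost)   # 30 dollars, excluding the die, VRMs, and assembly
```

    Even doubling every line item leaves plenty of room under a $200+ selling price, which is the argument being made.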

    The loss is in NRE, marketing, buildings, whatever. It's all in the open: just read their 10-K statement. What's beyond question is that gross margins on high-end silicon are much higher than on low-end silicon. This has been stated again and again in conference calls.

    Not many.

    How many people do you think overclock in the first place?
     
  19. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    If so. :???:
    Why do they hurry to EOL products that could be produced for a few more months at lower cost and lower prices (with discounts too), and thus in higher volume?
    For example: at the moment there is one R6970 at the ridiculous price of $410. :???:
     
  20. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    Fans, VRMs, PCBs: it all adds up. If it didn't, every card would have the best available.

    You seem to have forgotten that I was talking about profit (which I mentioned three times, btw) while you started talking about gross margin.

    How do you explain AMD making only a tiny profit on graphics all year despite their perf/mm² advantage, while Nvidia made ~$500m? Nvidia's professional sector is shoring up its losses in consumer and has been for years; it really is that simple. What else could it be? These price wars might be great for us, but they are not working for either company.
     