NVIDIA shows signs ... [2008 - 2017]

Discussion in 'Graphics and Semiconductor Industry' started by Geo, Jul 2, 2008.

Thread Status:
Not open for further replies.
  1. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    Chipsets build platforms which enable high margin multi-GPU sales. If there were no NF4, there would've been no SLI.
     
  2. FUDie

    Regular

    Joined:
    Sep 25, 2002
    Messages:
    581
    Likes Received:
    34
    So says nvidia marketing. If ATI could do CrossFire without NF4, why couldn't nvidia?

    -FUDie
     
  3. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    Funny, then, that Nvidia first demonstrated working SLI setups on a dual-Xeon motherboard sporting the Intel E7525 chipset, almost exactly 5 years ago... ;)
     
  4. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    Because that was a case of "we can do it too!" marketing.

    Technology demos are rarely given on final hardware. I'm sure that was more a case of necessity than an indication of plans for the future. Besides, a dedicated chipset allows for much more rigorous quality control. Nvidia wouldn't have wanted to provide support for SLI on every chipset under the sun, especially in the first iteration of the product.
     
  5. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    SLI never required an nForce chipset, not way back when and not now. It was a case of nVidia limiting SLI support to nForce chipsets (and I'm pretty sure the nVidia chipset department never had to pay the graphics department a "Sly bridge License")... Talk about unfair business practices.
     
  6. rjc

    rjc
    Regular

    Joined:
    Oct 27, 2008
    Messages:
    270
    Likes Received:
    0
    I think I can perhaps explain it.

    You need the economic idea of complementary goods. The chipset and the CPU have a high degree of complementarity, which economists measure and call the cross elasticity of demand. One of the best explanations I've seen of this effect, as it relates to how computer hardware and software interact, was written a long time ago here:

    Anyway, now that you have the idea, the chipsets and CPUs are battling against each other. If the battleground were level, it's possible they might wear each other down to a draw and roughly rake in half the customer's money each. Looking at the battlefield, though, the ground is nowhere near level: the chipset requires a host of licenses from third parties to exist, for all the I/O involved, and, most importantly, it also requires a license from the CPU vendor.

    The CPU itself is not so constrained, or not nearly as much; third parties just don't get much licensing money out of the CPU vendor. Economists have a term for the situation the chipset vendor is in: a complementary monopoly, where the consent of multiple agents must be obtained to provide a service. There is even maths (see the Wikipedia page linked) showing that the chipset vendor and the chipset customers will always end up losing.
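Cournot's complementary-monopoly result can be sketched numerically. A minimal illustration, with assumptions that are mine rather than the post's: a hypothetical linear demand curve and zero costs. When the "chipset" and "CPU" vendors price their complements independently, the combined price comes out higher, and total profit lower, than if a single firm priced the bundle.

```python
# Cournot's complementary-monopoly result, sketched numerically.
# Assumptions (hypothetical, for illustration): linear demand
# Q = a - (p1 + p2) and zero marginal cost for both complementors.

a = 12.0  # demand intercept (arbitrary choice)

# Two independent complementors: firm i maximizes p_i * (a - p1 - p2).
# The first-order conditions a - 2*p_i - p_j = 0 give the symmetric
# equilibrium p_i = a / 3.
p_i = a / 3
total_price_separate = 2 * p_i                        # 2a/3
q_separate = a - total_price_separate                 # a/3
profit_separate = total_price_separate * q_separate   # 2a^2/9

# A single integrated firm maximizes P * (a - P), so P = a / 2.
P_integrated = a / 2
q_integrated = a - P_integrated                       # a/2
profit_integrated = P_integrated * q_integrated       # a^2/4

print(total_price_separate, P_integrated)  # 8.0 6.0 -> buyers pay more when pricing is split
print(profit_separate, profit_integrated)  # 32.0 36.0 -> the firms jointly earn less, too
```

Both sides lose relative to integration, which is the "always end up losing" claim above.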

    CPU vendors have seen this and, realising their own profitability is at stake, have started manufacturing their own chipsets and selling them at cost or below, to isolate and minimise the licensing paid to third parties. As a side effect, it's goodbye to all third-party chipset vendors...

    Looking at Tegra in light of the above (and, I suppose, Atom to a lesser extent, because of the integration), it's not clear where exactly any profits much above commodity level are going to come from. Hence I said above that they must sell millions upon millions of units of these to make any money. In the Tegra case, the basic CPU is ARM-based, which they have to pay licenses for; only the GPU component appears unconstrained by third parties.

    (Hope the above is understandable)
     
  7. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    rjc: I think that's reasonable, but that's obviously all for a commodity market - which is now true of discrete chipsets, but not of integrated chipsets for Intel CPUs where performance and branding are still strong differentiators for NV.

    As for Tegra: I completely and utterly fail to understand how this applies to it in any way. It's a single chip, so how is it a complementary good? Or is that not what you meant? And the licensed CPU is about 5% of the die size; everything else is in-house (unlike some competitors, obviously).
     
  8. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    The rest of the quoted post (which you left off) addresses this.
     
  9. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    You make it sound like no other chipset had the quality to provide the bandwidth required by SLI, or at least lacked something that the nForce chipsets had. Up-selling is one thing; locking up a market with a ridiculous mechanic is another.
     
  10. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,166
    Likes Received:
    1,836
    Location:
    Finland
    If my memory serves, the very first SLI-supporting chipset was in fact an Intel chipset (not a basic consumer one, though), and support for it was later dropped from nV drivers.
     
  11. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    I think Nvidia saw the SLI/nForce lock-in as a way to leverage both products off each other. Instead of seeing it as an advantage, I think a lot of people see the SLI/nForce lock-in as a limitation, especially when the competition offers its version of the same product without such limitations.

    This sort of lock-in/leverage really only works when you have a dominant product or offer huge advantages (technical or price) over your competition. It always struck me as arrogant to insist that you buy a particular motherboard to use two SLI cards when it wasn't really necessary.
     
  12. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    It's just smart business. It has nothing to do with any actual lack of quality in competitive parts; NV can only control the quality of its own parts, so rather than release SLI into the wild and risk a poor reception because of potential conflicts with third-party boards, they decided to control the entire ecosystem and avoid that risk.
     
  13. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    I don't think anyone is contesting that it made sense to limit SLI to nForce4 initially - but in 2006/2007, it became pretty clear that both Intel and AMD could have supported it just fine on their own chipsets.

    In retrospect (aka hindsight is 100/20), probably the smartest thing NV could have done would have been to start talking with Intel/AMD in 1Q07 (after the G80/680i launch) about guaranteeing SLI support for a very reasonable royalty fee (with an actual contract forcing them to keep doing it for some years, so NV can't be left in the dust later), in 'exchange' for leaving the high-end chipset market (aka: we want to leave this crap that makes us lose money, and we want you to pay us to do it). But it's a tad late for that, to say the least.
     
  14. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    I'm sure NV is still enjoying chipset sales on the strength of SLI alone. Many enthusiasts with a brand preference for NV buy SLI motherboards even if they never use the feature.

    Although that's no doubt lessened of late, with the perceived quality-control issues of recent chipsets. Perhaps there's a bit of poetic justice there for NV, after artificially restricting SLI support.
     
  15. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,848
    Likes Received:
    2,267
    I did, Nv lost a board sale from me because of it


    So did I

    I am

    A royalty fee for what, exactly? As far as I am aware, SLI only requires two PCI-E graphics slots.
    What next, NV wanting a fee to enable a single card on a vendor's chipset?
    Intel could get in on the act too; they could require a fee from NV to make their cards work on their chipsets. Why not, what's good for the goose etc. :(
     
  16. JoshMST

    Regular

    Joined:
    Sep 2, 2002
    Messages:
    465
    Likes Received:
    18
    I honestly wonder if NV didn't make a big mistake initially by attempting their "all in one" chipset design. It seems to me that putting everything in one chip, especially in a motherboard solution, is a tough thing to design around. Plus, it seemingly allows a lower level of flexibility in their designs. This seems to be one case where breaking with tradition has proven troublesome.

    I honestly wonder if their profitability issues in the chipset market came from attempting to push too many features, thereby extending design time and expenses for the product. More stuff to throw in there, in a single-chip package, requires more engineers for any one product. While solutions like the 590 SLI, which used the 570 SLI as the "southbridge" and their PCI-E/HT bridge chip as the northbridge, worked out OK... it just seems a waste to have used a larger southie than they perhaps needed. Again, a lot of this is obviously speculation, since I don't get the chance to sit in on a lot of the high-level meetings. I'm guessing that in this case ATI's less advanced (read: feature-free!) chipsets, with more defined functionality per chip, were the way to go. Still, it's not like the chipset market has stopped innovating and expanding its reach...

    Now... if all three of the big chipset folks could finally put enough good silicon into their integrated chipsets to make for adequate co-processors, no matter the situation, then we might see some excitement and real value. But I think that is more wishful thinking on my part than anything else.
     
  17. rjc

    rjc
    Regular

    Joined:
    Oct 27, 2008
    Messages:
    270
    Likes Received:
    0
    Are you talking about Ion having superior performance and branding to the competition? I won't disagree with you on either count; it appears well ahead of Intel's offerings.

    I will disagree, though, that Nvidia can get any kind of premium over commodity prices. Assume the principal difference between the chipsets is the superiority of the embedded GPU (admittedly, I am putting aside for the moment the more integrated nature of the Nvidia chipset; I'll cover that in another post, perhaps, if it's a problem).

    The GPU component of the chipset has another complementary good: the LCD screen. If the attached display is of high enough resolution that the weaker Intel chipset cannot maintain the display, or overheats, or otherwise fails, then the Nvidia chip moves into a position with much more leverage, and profits can be extracted.

    Looking at typical devices sold, though, not many have resolutions high enough to cause problems for the Intel solution. The Nvidia advantage becomes quite a bit lessened.

    You could see the above anecdotally at Computex: the vendors of Ion/9400 chipsets didn't have many models, or were having trouble with availability. This suggests supply of product is short for some reason. I can think of three obvious explanations: bad behaviour by Intel, some sort of production difficulty, or supply being deliberately limited to preserve average margins.

    Oh, OK, if Nvidia is responsible for 95% of the IP on the chip, with few third parties, then it doesn't really apply. The complementary situations arise with license holders: if your SoC has a multitude of independent license holders, each with claims on your chip, then you can get into a complementary-monopoly situation.
     
  18. Tahir2

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,978
    Likes Received:
    86
    Location:
    Earth
    If we are talking about a 2D display of a higher resolution, I don't know if increases to the resolution would have much impact on the energy used by the GPU component of the Intel chipset (GMA950). It would be interesting to test that out, actually.

    What you are speculating on doesn't really make much sense to me, since the 2D component of the architecture is going to be able to crank all the way up to 2048x1536 at 75Hz if the display supports it (maybe badly, but netbooks definitely have a long, long way to go before they have 8 to 11" displays of that resolution!).
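For a rough sense of the load being discussed, here is a back-of-the-envelope sketch of the memory bandwidth that continuous scanout alone would consume at that worst-case mode (2048x1536 at 75 Hz), assuming 32-bit pixels. The figures are simple arithmetic, not measurements of the GMA950.

```python
# Back-of-the-envelope scanout bandwidth for the worst case mentioned
# above: 2048x1536 at 75 Hz, assuming 32-bit (4-byte) pixels.
width, height, refresh_hz, bytes_per_pixel = 2048, 1536, 75, 4

bytes_per_frame = width * height * bytes_per_pixel   # 12,582,912 bytes per refresh
bytes_per_second = bytes_per_frame * refresh_hz      # sustained framebuffer reads

print(f"{bytes_per_second / 1e6:.1f} MB/s of memory reads just for scanout")
# prints: 943.7 MB/s of memory reads just for scanout
```

Whether roughly 1 GB/s of steady framebuffer traffic measurably changes the chipset's power draw is exactly the kind of thing that would need the test proposed above.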
     
  19. Thorburn

    Regular

    Joined:
    Oct 8, 2006
    Messages:
    323
    Likes Received:
    19
    Location:
    UK
    Well, you have to pay a fee for a 'key' to insert into the BIOS on X58 boards. Originally I thought this was to cover some basic 'validation' costs and to ensure things like graphics cards not obscuring the SATA ports, but since DX58SO (Smackover) got an SLI-enabled BIOS, I guess that's gone out the window.

    The original SLI drivers didn't require an nForce chipset, as previously stated. I remember demoing it running on the D955XBK board to someone who was convinced it was a chipset-enabled feature, just by using some older drivers with a pair of 6800 GTs. Later drivers then locked it down to nForce chipsets.

    You COULD definitely make an argument for validation of the platform as a reason to lock it out, though: NVIDIA don't want ATI/Intel chipset problems making their glorious and imperial SLI technology look bad, nor do they want to spend money supporting and fixing those problems unless they are getting some money in return.
    Likewise, ATI never supported CrossFire on NVIDIA chipsets, except, if I recall, on the HP Blackbird systems, which would let you spec up a pair of NVIDIA or ATI cards in an NVIDIA 680i or 780i board (can't remember which, but I think ASUS made it), presumably enabled by a special driver or BIOS flag. But that leaves just a single extra board to validate, and presumably a monetary incentive to do so (a high-profile machine otherwise not available with their cards).
     
  20. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London