Business practices of PC hardware features *spawn

Well, NV does have a history of charging money for nothing (aka SLI).
One thing I don't understand is at what point in time it became acceptable to assume that technology should be given away for free by the evil for-profit corporations.

Take SLI: it provides higher frame rates than would otherwise be possible, which is important to some. It provides marketing benefits for GPU add-in card and motherboard vendors. And there is no question that it must have cost (and still costs) many man-years of engineering time to make it work.

Yet here is somebody who says all of that is just 'nothing'. Where does the idea come from that all good things must be given away for free? Is it because it's just software? If so, then why is it acceptable for software companies to charge for software but not for system companies that sell a combination of hardware and software?
 

I think he's referring to the SLI chipsets (southbridges), which were absolutely not needed technically but only enabled SLI, removing the possibility of using ATI GPUs in CrossFire. Basically, Nvidia wanted to put their southbridges on motherboards solely to bring the possibility of enabling SLI, not CFX. (I remember buying a lot of DFI SLI NF4 boards in the 6600 SLI / 6800 Ultra period, and then when I used 2x X1950XTX I had to buy a DFI CFX3200-DR, which was the most incredible motherboard ever made.) The funny thing is that at this time we mostly used AMD processors for gaming/overclocking. Intel closed the case by only allowing CrossFire from ATI on their motherboards, so the SLI southbridges/chipsets from Nvidia were suddenly only on some AMD-based boards. And with the coming of the Core 2 Duo, Intel refused the integration of the Nvidia southbridges on boards based on Intel chipsets... well, you know what happened then (including when AMD bought ATI). Nvidia had to come back to reality. They tried to control the dual-GPU market by locking in the consumer via the motherboard (a bit like G-Sync today). They completely failed.
 

NF4 was a unified north/south chip, so I'm not really sure what you're complaining about here. Should Nvidia have engineered the NF4 to support ATI's multi-GPU on AMD platform boards? Why would ATI have allowed that? Or maybe Nvidia should have undercut their own chipset division by letting ATI's chipsets be the only ones to support both SLI and xfire?

What complicated things was the eventual Conroe release and the AMD/ATI buyout - AMD ceased to be the premier platform overnight and the chipset marketplace changed as a result. In an Intel-dominated world it only made sense for AMD to allow open CrossFire support on Intel platform boards, while Nvidia were still actively competing and used SLI compatibility as a selling point (until Intel played hardball). The moral of the story being: no one gives stuff away for free until they're left with no other option for monetizing a feature/product/IP. If AMD weren't being substantially outspent in R&D I'd be a little more sympathetic to the complaints about Nvidia's general focus on proprietary technologies.
 
I was actually referring to NV saying to mboard makers "give us money or we will prevent SLI working on your board". They might as well say "and we want some cash or anti-aliasing won't work either".
What next, SSD makers saying to mboard makers "give us money or we will make sure our drives revert to SATA 1.0 mode"?
@Silent_guy, you think mboard makers should pay for the features of add-in boards?
Hell, why shouldn't board makers join the party and say to NV "give us money or SLI won't work on our boards... oh, and while you're here, it's another £2 if you want your cards to run in PCIe 3.0 mode, and to save our monitor division a bit of time they want £5 or it's no HDMI input or resolutions above SVGA for you"?
 
@Silent_guy, you think mboard makers should pay for the features of add-in boards?
If it's a feature that helps motherboard makers sell more of their wares, why shouldn't they pay for a small piece of the pie?

Hell, why shouldn't board makers join the party and say to NV "give us money or SLI won't work on our boards... oh, and while you're here, it's another £2 if you want your cards to run in PCIe 3.0 mode, and to save our monitor division a bit of time they want £5 or it's no HDMI input or resolutions above SVGA for you"?
They could do that. But there's only one Nvidia and many motherboard makers. Supply and demand.
 
silent_guy: If I jam your cell phone and say to you "give me $20 and I'll stop jamming you", I am not selling you a product, and I am not licensing you intellectual property.
I am blackmailing you. But then again, if you use your phone to make money (maybe you need it for work), why shouldn't I be entitled to "a small piece of the pie"?
And it also takes money to build a cell phone jammer; I should be allowed to recoup that cost ;)
 
What's the point of a discussion when you have to explain to a participant that jamming a cellphone after the fact isn't quite the same thing as the cellphone maker having to pay a license fee to activate a certain piece of technology?

How much code did Microsoft write for Android? How many billions do they make per year from Android?
 

That's because of software patents, which is not what the Nvidia SLI matter was about.
 
That's an irrelevant legal technicality. In both cases, the product maker pays to be able to use intellectual property.

Who they're paying is entirely different and a finer detail that cannot be overlooked. If you think that distinction doesn't matter then there is no point in discussing it with you.
 
In the case of Android, HTC & friends are paying Microsoft for a patent license. In the case of SLI, motherboard makers are paying Nvidia to activate the Nvidia driver IP. First difference here: there is actual code from Nvidia involved, which is not the case for Microsoft.

But also: Nvidia provides design collateral to the motherboard maker about what rules to follow. (This could be as simple as just stating how far PCIe slots need to be separated from each other.) And they certify the motherboard. I assume they keep samples so that bug reports from the field can be reproduced in-house.

So, yes, there is a difference. Microsoft didn't provide any code and just collects. Nvidia does actual work.
 
There is no NVidia IP in contemporary mobos to provide SLI support.
The IP on the motherboard itself is the 'SLI' logo on the box. It has value; otherwise motherboard makers wouldn't pay for it. Pencil box makers pay Disney to put Mickey Mouse on their wares because it makes them more attractive to customers. Same thing.

The technical IP is obviously in the driver.
 

Logo or not, SLI wouldn't work unless manufacturers agreed to pay. And that was the second, "better" situation that followed the original one where manufacturers had to buy and include an entirely pointless NVIDIA chip that made the board more expensive to design and manufacture, more complex, less power-efficient, and had no benefit whatsoever, apart from lining NVIDIA's pockets.

Letting MB manufacturers make SLI-compatible products would have been mutually beneficial, since the former could have sold more high-margin products, and NVIDIA would have had more potential customers to whom they could sell a second GPU. They decided to try to squeeze extra money from the situation, at the risk of alienating a lot of people.

All of that is (probably) perfectly legal, but it's sleazy as hell.
 
I don't think Nvidia is making big bank with these SLI license fees. It's probably peanuts. But it's not uncommon to have these kinds of fees to keep the riffraff out, ensure a minimum quality level, and protect your reputation as a premium product.

If there were no barrier to entry, any cheap motherboard maker could knock something out with no guarantees whatsoever. Maybe they use a second grade BIOS. Maybe no BIOS upgrades. Maybe the mechanicals aren't as they should be. Maybe the power distribution isn't up to snuff. The motherboard may work fine in regular conditions but become unstable in SLI. Guess who will be the first to receive blame?

Being the premium brand is something worth protecting. More so than the lost opportunity of selling a few more GPUs. It may be sleazy to some, but it's definitely smart.
 
Whatever the case, it was closed by Intel.

Intel did not allow manufacturers to integrate it on Intel-based motherboards... we then had maybe 1 motherboard in 100 that supported SLI (the only ones left were the EVGA ones). 99% of motherboards only supported CrossFire. During this short period Nvidia had almost no "SLI ready" motherboards on the market (the complete inverse of what they initially wanted).

On this closed case, AMD/ATI can really thank Intel.
 
Letting MB manufacturers make SLI-compatible products would have been mutually beneficial, since the former could have sold more high-margin products, and NVIDIA would have had more potential customers to whom they could sell a second GPU. They decided to try to squeeze extra money from the situation, at the risk of alienating a lot of people.

They were trying to keep their chipset division alive by tying a marquee IP to a product and keep some sort of foothold in the motherboard market. The fact that Nvidia now no longer have any presence in that market would seem to validate their concerns at the time. What people are arguing for here (I guess?) is that Nvidia should have promptly axed their chipset division in response to the threat of being forced out of Intel's and AMD's platforms, which is a silly thing to expect when you're talking about that large of a product division within a company. Was it messy for consumers for that couple-year period? Of course, but it's perfectly understandable given how much the market had changed within such a short period of time. Within a matter of 5 years we went from a half dozen 3rd-party chipset companies supporting 2 platforms, to just 2 companies and their own platforms. Nvidia just happened to be a 3rd party that was big enough for the messy battle to trickle down to consumers in a visible way. Nvidia didn't kick my dog; they were doing what they thought was best for the health of their company at the time.
 
There was nothing "to keep alive" when they sold PCI Express bridge chips just to enable SLI support, even though they weren't actually required.
 