NVIDIA shows signs ... [2008 - 2017]

Aren't the electrical designs of the interfaces subject to some kind of copyright protection? If so, it's a particular bus implementation's copyright being licensed, and there would be no FSB-to-QPI/DMI transference.

I'd seen it explained that the refusal to license upcoming bus interfaces in the Slot 1 era was what led to the creation of Slot A and the end of pin compatibility between Intel clones and Intel platforms.
 

Pretty much exactly what Intel did with the transition from Socket 7 to Slot 1. By doing so, AMD, VIA, and SiS (IIRC) no longer had the right to produce Intel-compatible chipsets without acquiring a new license. Up until then, AMD, VIA, and others were pretty much free to produce Intel-compatible platforms. I believe later on VIA entered into a cross-licensing deal to make Slot 1 motherboards, while AMD, VIA, SiS, and Cyrix gave further life to Socket 7 (Super Socket 7 added AGP and 66/100 MHz SDRAM).
 
Indeed, from what I gather these licensing deals are usually made for a given time period, along the lines of "we can do this for five years and decide whether to extend it during that time." If that's the type of deal Intel sold Nvidia on, then coming back three years in with "oh, we meant this interface, not the new one, so you can't build these anymore for the new chips" is rather sleazy on Intel's part.

It's usually limited both by length of term (though indefinite terms are also not uncommon) and by type of interface and/or product line. In the automotive industry, for example, contracts to produce parts can be and often are limited by product line in addition to length of time. Same with the electronics industry and just about every other industry. Rarely is a licensee given a blanket agreement covering ever-changing products. If a company is licensed to make drum brakes for a car line and the car company suddenly switches to disc brakes, that doesn't mean the drum brake manufacturer automatically gets to provide the disc brakes, unless the contract actually specified that it would.

Without viewing the contract itself, nobody on this forum can say what it is and who has the right of it.

And something some people (not you) might be losing sight of is that Intel is in no way obligated to offer another company licensing terms or access to anything it does unless it offers a clear benefit to them.

And money is going to rank rather low on that scale for a company as large and profitable as Intel. Thus an agreement that doesn't involve technology sharing would need to include a large monetary commitment from the other party.

The only thing we know for sure is that Intel and Nvidia were negotiating over whether Nvidia would be allowed to license the rights to produce chipsets for the new interconnect. That would indicate that Nvidia at least suspected or knew that its prior contract did not, or might not, apply. Negotiations hit a brick wall: Intel was not able to get anything it felt was beneficial to it as a company, and Nvidia felt that whatever Intel was asking for made any licensing agreement not beneficial for its company.

At which point the legal case becomes the primary focus, to try to find some way to force Intel to allow access. Perhaps there's something there that will give them leverage. Perhaps they are hoping it's enough to bring Intel back to the table. Perhaps they're hoping to get a non-tech-savvy judge who will rule in their favor regardless of what the contract actually says (not that this has never happened before between other companies).

But either way, unless that prior contract gives them explicit rights to a new interconnect, Intel is under no obligation to offer them a license if it doesn't feel doing so benefits the company.

Regards,
SB
 

According to your reasoning, Intel would be totally justified in making all x86 CPUs use a proprietary replacement for PCIe, and then only allowing Intel GPUs to talk to Intel CPUs. In one stroke, they could eliminate AMD's and Nvidia's GPU businesses, just by changing the interface to their chips. After all, there's no reason they should have to allow other companies to use their new interface.

Is that really the way you think the world should work?
 

Wow, talk about grasping at straws...
 
And I'm surprised people are so eager to defend Intel here - why are people happy that Intel has granted itself, through legal shenanigans, a monopoly on chipsets for its own processors?

Well, Intel is currently under no obligation to allow other companies to make chipsets for its processors. It will do so if it's advantageous, and in today's market it isn't. I wouldn't do it any differently.

The analogies you made don't really apply, because they are all based on explicit exclusions from otherwise open interfaces. Coding an application for Windows and gaining access to the required libraries and network access is an open affair; they can't selectively target browsers (how would they even know the application is a browser?). Same goes for the PCIe interface: it's a standard, so if you advertise PCIe compliance you've got to play nice with PCIe devices. There's no such standard for Intel's proprietary interfaces to its CPUs.
 
Actually, they have already been told that they aren't allowed to do that (lock competing GPUs out with a proprietary interface). Is it the DOJ that handles the antitrust stuff?
 

Evidently I didn't make my point clear enough: I don't think Intel should have the right to keep the interfaces to its processors proprietary. Otherwise, all it takes is for Intel to replace PCIe with a new proprietary interface (as it has done many times before), and then we'll be stuck trying to run our GPUs off of USB ports.

Obviously the FTC agrees with my reasoning, which is why their most recent settlement with Intel prohibits Intel from dropping the PCIe bus or downgrading its performance for the next few years, even though it might be "advantageous" to Intel's business interests to lock AMD and Nvidia GPUs out of the PC market, once Intel's graphics initiatives mature.
 

It was actually the FTC and Intel who recently settled, and Intel agreed not to drop PCIe or otherwise cripple its performance. The FTC agrees with me: there's a real danger that Intel would just deprecate PCIe and create some proprietary graphics interface, so it barred Intel from doing so. Proprietary interfaces are not good for any of us who like competition.

But like I said earlier, I don't think even Intel thinks it has a case with the Nvidia chipset history. When Intel filed the lawsuit that started all this, it was increasingly worried about Nvidia's chipset success; for example, all Mac laptops and desktops (except the Mac Pro) had just converted to the MCP79. Intel's chipsets were inferior: they were a three-chip solution instead of two, and their graphics performance was terrible. Intel needed to do something to stop Nvidia's encroachment on its territory, so it simply declared its new processors off limits to Nvidia and peremptorily sued Nvidia to stop Nehalem chipset development. Timing is everything: they knew that during the several years it would take their suit and Nvidia's countersuit to work through the courts, they could get Sandy Bridge off the ground, at which point Nvidia's chipset advantage would be much less attractive anyway. And it worked: Nvidia stopped chipset development, and Sandy Bridge does look promising.

However, going back to the original point of departure for this tangent - Intel has no one but themselves to blame for Apple's decision to use old Core 2 Duos in their new Macbook Air. It's certainly not Nvidia's fault for stopping chipset development, which only happened because of Intel.
 
You honestly believe that if AMD had high performance integrated graphics, Intel would let NV provide the parts that let Intel-based computers compete?

Intel, with their billions, would wait for Nvidia to start going broke, then buy them and have high-performance integrated graphics.
 

Maybe I'm out of the loop or something, but might Apple's decision have something to do with the SU9600 using less power than any mobile Core i3?

You know, an 80% difference in TDP goes a long way in these slim notebooks.

Apple just chose an (honestly, sucky) SU9600 LV CPU and bundled it with a product that was already in their notebook framework: lowest development costs, etc.
 
For the product, I don't see why the SU9600 LV is sucky myself. I think it's a reasonably well-judged component list that's obviously designed to be frugal on the battery.
 
I agree with Rys; the CPU seems quite adequate for the intended use. Someone who wants to buy a MacBook Air is looking to surf the web, send email, do some writing, watch videos, etc. on the move. A Core 2 Duo plus a GeForce 320M to accelerate HD video/Flash is more than competent for these tasks. Heck, you can even play most modern games on it (using Boot Camp with Windows, of course). The only real shortcoming would be video encoding performance, where a Core i-series processor would be vastly superior.

Ultimately the choice was based on thermal characteristics, package size, and cost. The LV Core 2 Duos have very low TDPs, which are currently not matched by the LV Core ix series. If Intel had decided to make one, a 32nm Core 2 Duo would have been the ultimate choice, I suppose. The Core 2 Duo CPUs have a package size of 22x22 mm (Nvidia GT320M unknown), while Arrandale LV is 34x28 mm (its southbridge is 22x20 mm). And Core 2 Duo chips are probably a lot cheaper than the LV Core ix parts, and the development costs are also probably a lot lower given the (re?)use of two-year-old hardware.
 
Well, a month or two before the launch of the HD 6800 series, Nvidia already knew what to expect and lowered the price of the GTX 460 to sell as much as possible before Barts became available.

Now they've discounted professional products. Isn't it possible that AMD is going to release some professional Cayman-based board?
 

Who would care? ATI is irrelevant in pro; they make something like 25 million per quarter, which likely comes from legacy FireGL clients. To become relevant they'd need to excel in areas where they've traditionally been bad (hint: hardware prowess does little good there beyond a certain point). For years they've been trumpeting design wins, and still they only take a tiny slice of a highly lucrative market (see how much NV earns there). It's more likely that NV isn't seeing the upgrade rate they'd have liked, and being squeezed in consumer, they're trying to rake in some cash from pro.
 

OK, but still, 50%?! That's huge!
 

I didn't say the move isn't surprising or unexpected. I just have doubts it's directly related to ATI doing something in the pro market, because they're certainly not a threat now, and given the inertia in that market space it'd take them quite a while to become one.
 

AMD's FireGL business is that bad off, only around 25 million a quarter? I thought it was much more successful than that (though not close to Quadro)...
 
Anyway, what are the theoretical reasons that could force a manufacturer to offer 50% discounts on a (reportedly) well-selling product?

1. A competitor is about to launch a better product (not likely, according to AlexV)
2. Old stock (it isn't, according to KitGuru)
3. A replacement product will be launched soon
4. Demand is weaker than supply (this could hardly explain a 50% discount)
5. A need to raise capital as soon as possible

Any better ideas? :???:
 