NVIDIA shows signs ... [2008 - 2017]

Is this just the first sign of the squeeze? I thought Nvidia still made the most popular AMD chipsets, though... but AMD's woes can't be helping in that department.
 
Nvidia chipsets started as a spin-off from the Xbox, and attracted a premium price for what was then a premium product. It's been a while now since the two main CPU manufacturers have caught up and have been making their own pretty good chipsets. Motherboard manufacturers can now make good motherboards without having to pay Nvidia what they considered to be a lot of money for chipset functionality that they can now get elsewhere for a better price.

Rock and a hard place for Nvidia.
 
This news item is inaccurate; NVIDIA does have the Macbook slot, and MCP7A will be released. What is likely being considered is leaving the market in the Nehalem timeframe.

Either way, there's a very simple reason why this would be a disastrous decision and it has nothing to do with the economics of keeping the business running: with all due respect, the MCP engineers would be hard to integrate into the GPU teams, and the attempt would effectively ruin the GPU business. Maybe once their DX11 chip is done and taped-out they could consider it, but before that they're just asking for trouble! :p

Given how badly fucked up their chipset roadmap has been in the past and still is, and how godawful their execution has been (did they ever deliver on ANYTHING except MCP61?), I can't help but wonder why they don't just hire me to run that shithole instead. Alternatively, I'd love to buy it... for five bucks or so. I refuse to pay six, that'd really be pushing it.
 
ROFLMAO! I can't wait to see Apple's infamous reliability go in the toilet. Oh sweet revenge... I've never liked working on those proprietary POSes, now I can point the finger and tell all the Mac Qaeda zealots their beloved cult is not infallible, and they shall soon suffer the consequences of ill-informed decision making.
 
In the famous words of Neo... "whoa".

I can't help but think back to my original predictions of the real threat to NV from the AMD/ATI merger --that galloping platformisation would begin to squeeze them at the bottom and work its way up. This would appear to be the first casualty of that dynamic.

Are IGPs included in what's being discontinued? If true, that's millions of units you can no longer spread your R&D over.

What happens to SLI? Is this going to lead to the glorious day where SLI and CrossFire become cross-compatible? Or is it going to lead Intel to toss CrossFire over the side and go with Nvidia SLI only which would be even worse than what we have now?
 
ROFLMAO! I can't wait to see Apple's infamous reliability go in the toilet. Oh sweet revenge... I've never liked working on those proprietary POSes, now I can point the finger and tell all the Mac Qaeda zealots their beloved cult is not infallible, and they shall soon suffer the consequences of ill-informed decision making.
I'm not trying to be mean, but you're just being absurd here. Point me to a single NVIDIA chipset except MCP55 that has ever had significant problems that are truly out of the ordinary. Just go ahead; you won't find it outside of maybe ActiveArmor on the nForce4.

It could be that MCP79 will be unreliable, nobody knows since nobody even previewed it yet AFAIK, but I don't think it's a good plan to start by generalizing so far :)
 
You're just being absurd. Point me to a single NVIDIA chipset except MCP55 that has ever had significant problems that are truly out of the ordinary. Just go ahead; you won't find it.

It could be that MCP79 will be unreliable, nobody knows since nobody even previewed it yet AFAIK, but I don't think it's a good plan to start by generalizing so far :)

Are you kidding? It's common knowledge NV's modern "enthusiast chipsets" are prone to corrupting data (particularly in RAID arrays). I've experienced it on an NF4U board, and I have a friend with 3x NV chipsets in his house (NF4 SLI, 590, 680) and every one of them has experienced stability issues due to excessive heat and/or data corruption of RAID arrays. My roommate has an NF4 SLI board that's on its last legs after needing its NB fan replaced one too many times.
 
Are you kidding? It's common knowledge NV's modern "enthusiast chipsets" are prone to corrupting data (particularly in RAID arrays). I've experienced it on an NF4U board, and I have a friend with 3x NV chipsets in his house (NF4 SLI, 590, 680) and every one of them has experienced stability issues due to excessive heat and/or data corruption of RAID arrays. My roommate has an NF4 SLI board that's on its last legs after needing its NB fan replaced one too many times.

Well, the fan dying has little to no connection to the chipset underneath it, don't you think ?
Most high-end quality boards used heatpipes, at least since the Asus NF4 SLI Premium, for instance.
The Asus NF4 SLI Deluxe model was especially prone to fan failures right from the start; they chose a very poor cooling solution, as evidenced by the fact that it was Asus, not Nvidia, who later acknowledged the mistake and corrected it in a new revision.

As for the data corruption, it's not really the northbridge that's at fault here, it's the use of an outdated NF570/590 SLI chip as a southbridge on recent chipsets.
That too is fading away, since MCP7A, like MCP78 before it, is going single-chip.
 
Well, the fan dying has little to no connection to the chipset underneath it, don't you think ?

I hoped someone would say this... :devilish:

If the chipset were not running at the ragged edge of thermal limits, it would not require a fan in the first place ;)

Most high-end quality boards used heatpipes, at least since the Asus NF4 SLI Premium, for instance.
The Asus NF4 SLI Deluxe model was especially prone to fan failures right from the start; they chose a very poor cooling solution, as evidenced by the fact that it was Asus, not Nvidia, who later acknowledged the mistake and corrected it in a new revision.

This is the board in question. Yes, the particular cooling implementation is a vendor-specific issue, but again, the necessity of the fan is the fault of the ridiculously hot NV chipset in the first place.

As for the data corruption, it's not really the northbridge that's at fault here, it's the use of an outdated NF570/590 SLI chip as a southbridge on recent chipsets.
That too is fading away, since MCP7A, like MCP78 before it, is going single-chip.

Well my buddy's 2x680 systems have been reloaded several times over the last 18 months due to array corruption (verified by both of us, experienced, multi-cert techs).
 
Well, it is worth pointing out that while Digitimes is better than Le Inq, they hardly have a flawless record of accuracy. So a few more data points on what's really going on here are definitely in order.
 
Well, it is worth pointing out that while Digitimes is better than Le Inq, they hardly have a flawless record of accuracy. So a few more data points on what's really going on here are definitely in order.

Good point. We're (I am) jumping to conclusions here. As much as I dislike the business practices and despicable marketing of both NV and Apple, this is a bad idea, and it would be better for consumers if it never came to fruition.
 
Shaidar & INKster --that's enuf on old battles in detail about specific products. That's ancient. If you want to keep talking about it, take it to another thread and entertain yourselves, because the rest of us are bored by it.

Even if Arun did make the mistake of inviting it. :)
 
I've experienced it on an NF4U board, and I have a friend with 3x NV chipsets in his house (NF4 SLI, 590, 680)
The latter two are MCP55 and therefore exactly what I said. As for the NF4 SLI, I gather from what you say that the problem is the fan failing; that's a motherboard problem, not a chipset problem. NF4 wasn't perfect, far from it, but its problems were nothing significantly out of the ordinary.

MCP55 was used in the following SKUs: 550, 570, 590, 680i, 780i, 790i... And yes, that means it was reused over and over during a roughly two-year period (although 780i/790i are based on a respin, but still...). I apologize for continuing this after Geo's warning; it wasn't posted when I began writing this. Either way, I'll probably delete this post once I'm sure it was seen, so as not to pollute the thread further.
 
ROFLMAO! I can't wait to see Apple's infamous reliability go in the toilet. Oh sweet revenge... I've never liked working on those proprietary POSes, now I can point the finger and tell all the Mac Qaeda zealots their beloved cult is not infallible, and they shall soon suffer the consequences of ill-informed decision making.
Hmm, haven't you noticed there's a lot of Apple laptops out there with NVidia GPUs just waiting to commit suicide? They're already suffering.

Jawed
 
Are IGPs included in what's being discontinued? If true, that's millions of units you can no longer spread your R&D over.
I'm sure it'll take Nehalem a fair while to take over from Core 2. But once it does, those mobos won't be offering NVidia a home unless NVidia gets a QPI licence. So there's still a fair while left for NVidia yet.

What happens to SLI? Is this going to lead to the glorious day where SLI and CrossFire become cross-compatible? Or is it going to lead Intel to toss CrossFire over the side and go with Nvidia SLI only which would be even worse than what we have now?
:LOL: at the prospect of Intel giving CrossFire and SLI alternating 6-month slots "in its mobos", until Larrabee is underway.

Jawed
 
Intel wouldn't be making any chipsets if it didn't help sell processors and keep old fabs running.

Right. Because Intel isn't the famously anal company that wants to control everything that has an impact on them and maximize their profits while keeping everyone else away. Remember the movie "Stir Crazy" with Richard Pryor and Gene Wilder? Grossburger protecting his meal? That's Intel with anything to do with their platform. It's not that the factors you point at aren't true, but they're nice byblows rather than the real strategy.

AMD would still have the excuse that its chipsets help sell processors and possibly video cards.

AMD was pathetically at the mercy of some really crappy chipsets for a long time. Then they had NV and that helped quite a lot, but still put them at the mercy of a third party continuing to love them. Which worked out kind of well when they had A64 to kick ass with, but not necessarily so well when market conditions shifted. I'd argue that AMD didn't become a real competitor to Intel until it had its own credible chipset division.

Nvidia just has the video card excuse, and apparently it looks like chipsets don't really sell video cards or procure QPI licensing.

The SLI thing is really quite puzzling tho in this regard. They've always loved SLI dearly. Have they made agreements to protect its future? I think it nearly certain that AMD would play ball for various reasons. This might force Intel to follow suit.
 
http://www.techreport.com/discussions.x/15240

Puzzled, we asked Nvidia Platform Products PR chief Bryan Del Rizzo to weigh in. Del Rizzo's response came swiftly and left little open for interpretation:

1. The story on Digitimes is completely groundless. We have no intention of getting out of the chipset business.
2. In fact, our MCP business is as strong as it ever has been for both AMD and Intel platforms:
   1. Mercury Research has reported that the NVIDIA market share of AMD platforms in Q2 08 was 60%. We have been steady in this range for over two years.
   2. SLI is still the preferred multi-GPU platform thanks to its stellar scaling, game compatibility and driver stability.
   3. nForce 790i SLI is the recommended choice by editors worldwide due to its compelling combination of memory performance, overclocking, and support for SLI...
3. We're looking forward to bring new and very exciting MCP products to the market for both AMD and Intel platforms.

Personally, I don't see any value in releasing chipsets for either of the two CPU makers, both of which are hostile towards Nvidia. Intel of course won't license QPI (I hope the EU nails Intel good), and AMD wants to be a platform company too and eat the chipset cake itself, leaving partners with nothing, just like Intel did with its own chipset business, which left hardly any scraps for SiS, ALi & VIA back then.
 
Right. Because Intel isn't the famously anal company that wants to control everything that has an impact on them and maximize their profits while keeping everyone else away. Remember the movie "Stir Crazy" with Richard Pryor and Gene Wilder? Grossburger protecting his meal? That's Intel with anything to do with their platform. It's not that the factors you point at aren't true, but they're nice byblows rather than the real strategy.
Chipsets are a low-margin business, and if they weren't tied to a high-margin business (CPUs) thanks to the platform, they would have been killed or spun off in one of the many retrenchments in Intel's history: see memory, flash, their TV chip foray, etc.
 
Chipsets are a low-margin business, and if they weren't tied to a high-margin business (CPUs) thanks to the platform, they would have been killed or spun off in one of the many retrenchments in Intel's history: see memory, flash, their TV chip foray, etc.

Well, sure, but that's like saying if my uncle were a woman he'd be my aunt. It's less that CPUs are high-margin than that CPUs are their main business. Control, control, control.
 