NVIDIA shows signs ... [2008 - 2017]

The GeForce 6150 motherboard IGP came out before the X1800 at the 90nm node.

The name on the check may have been Microsoft, but I think we can credit Xenos to the ATI account at 90nm in this context.
 
It's not as if they'd have realistically attached such a mediocre cooling solution if it had been a PC part.

But it wasn't. In fact, the PS3's RSX GPU had much more in common with the PC GPUs of the time (NV40/G70/G71) than Xenos did, with its dual-die, eDRAM, unified-shader design.
So, if ATI can be credited with production tweaking of the chip at TSMC (to a point, of course), then its part in choosing an adequate cooling solution for the GPU must be credited accordingly too, for better or for worse, wouldn't you agree?
Microsoft certainly did not know the GPU's power draw and heat output any better than the chip's own designers (and the process engineers at TSMC), while ATI's designers probably knew the general volume and component arrangement inside the console very well, and far in advance. This is especially true of the X360, since the GPU also acts as the northbridge/memory controller for the whole system, so it has a central part in it.

Most X360 failures happened because of the GPU, not the CPU, other faulty hardware components, or software bugs.
 
None of the new parts (RV770, GT200) have any bearing on the mobile space though. As far as I'm aware, nothing has happened on that front that changes the competitive landscape substantially. (My personal opinion is that neither AMD nor nVidia has managed to produce a particularly attractive DX10-generation GPU for mobile use. 40nm lithography may or may not change that.)

I agree that the new parts have no bearing on the mobile space, but remember that this thread is just about the stock price. Whether it's the battle between newer parts on the desktop or older-gen derivatives in laptops, the point is that any volume/margin squeeze is justification for a drop.

There's been a consensus (of sorts) that whatever the exact figure, the new GT200 chips are expensive to produce. I'm not sure what their margins were, but price cuts on the order of several hundred dollars surely come close to wrecking them. And it seems to me a move that would only be made, to that extent and degree, if sales up to now had been... well, bad.

At ~$12.50, NVidia's P/E ratio hangs at roughly 9.0x. If this were viewed in isolation, anyone would agree that the stock is now 'cheap.' But when the earnings get revised, that P/E ratio is going to jump way up. Whether the jump is to a level higher or lower than the P/E prior to the slide this year (inclusive of the one-day drop) is quite significant, IMO, in judging whether the potential for this stock weighs more heavily to the downside or the upside going forward. In the interim, investors will be looking to Intel's and AMD's quarterly results to glean information on where NVidia's market position may lie relative to its competitors.
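For what it's worth, here's a rough sketch of that arithmetic in Python. The $12.50 price and ~9x multiple are from above; the implied EPS and the size of the revision are hypothetical, chosen only to show how a downward earnings revision inflates the multiple at a fixed share price:

```python
# Back-of-the-envelope P/E arithmetic. The price and ~9x multiple are from
# the post; the 50% earnings cut is a hypothetical, for illustration only.
price = 12.50
pe_now = 9.0

eps_trailing = price / pe_now        # implied trailing EPS, ~ $1.39
eps_revised = eps_trailing * 0.5     # assume earnings get cut in half

pe_revised = price / eps_revised     # ~18x at the same $12.50 share price
print(f"implied EPS now: ${eps_trailing:.2f}")
print(f"P/E after a 50% earnings cut: {pe_revised:.1f}x")
```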
 
And the Beatles were the bestest music group ever. Which is just as relevant to the point under discussion as yours. rIngo sTAr.... ATI backwards, you know that's ZOMG!

I didn't bring a non-AMD, non-Nvidia product to this discussion in the first place, you and AlphaWolf did... ;)
 
Now you're just being intentionally difficult. How unworthy. You know you're using the details of a business relationship to obscure the truth. What's all that money that ATI got paid before release and still gets paid for then? Why was it that it was ATI reporting yields on Xenos in conference calls pre-release of XB360?

But enough of your thread-crapping. If you'd like to have a "does Xenos really count as an ATI part" thread go start one and get laughed at.
 
It seems I've struck a nerve somewhere with you... Sorry.
You win: Xenos is an ATI/NEC part made at TSMC on a 90nm process back in 2005 (paid for with Microsoft money). It was the first. Good enough?

Now let's get back to the subject of this thread, shall we?
 
Can they make it? Yes... but I don't think they're in a position to play another "Ultra" card here...

There's a "GT200-400" mentioned in the 177.40 driver .inf file.
As we know, GTX 260 is the "GT200-100", while the GTX 280 is the "GT200-300", so this is obviously a new card (i don't think they're the new Quadro's or Tesla's, because they usually use the sufix -8xx after the core codename).
I wouldn't be surprised if there was actually a "GX2" or another "Ultra" card to be played directly against the HD4870 X2...
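If you want to check for yourself, a quick scan like the sketch below pulls out every board string of that shape. The file name nv_disp.inf is an assumption on my part, not verified against the actual 177.40 package; point it at whatever .inf ships in the driver you unpacked:

```python
# Minimal sketch: scan a driver .inf for "GT200-xxx" board strings.
# The file name is an assumption; adjust the encoding too if the
# file turns out to be UTF-16 rather than plain ASCII/UTF-8.
import re

with open("nv_disp.inf", encoding="utf-8", errors="ignore") as f:
    boards = sorted(set(re.findall(r"GT200-\d{3}", f.read())))

print(boards)  # expected, per the post: ['GT200-100', 'GT200-300', 'GT200-400']
```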
 
If there is, hopefully it's more than the ~6% overclock the last "Ultra" was. There was a post at XS a few days ago hinting at a 736/1550 GT200 coming soon. Can't seem to find it now though.
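For scale, here's a quick sketch of what those rumoured clocks would amount to. The stock GTX 280 clocks of 602 MHz core / 1296 MHz shader are quoted from memory, so treat the percentages as approximate:

```python
# Rough overclock implied by the rumoured 736/1550 clocks, relative to a
# stock GTX 280 (602 MHz core / 1296 MHz shader, quoted from memory).
stock = {"core": 602, "shader": 1296}
rumour = {"core": 736, "shader": 1550}

for domain in stock:
    gain = (rumour[domain] / stock[domain] - 1) * 100
    print(f"{domain}: +{gain:.0f}%")   # core ~ +22%, shader ~ +20%
```

Either way, that would be a much bigger jump than the 8800 Ultra's ~6% bump over the GTX.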
 
I'd be very surprised if GT200-400 was anything other than a GT200b 'GTX 290'.
 
Saw an article this morning about an nvidia DX10.1 part coming out in Jan 2009, so that ties in well with the 40nm timescale too. My trusty G80 will soldier on that long, I think, until I see what AMD and nvidia do then.

As for the GT200-400: the worst-case scenario is an Ultra at the $650 price point, now that the 280 is at about $500. That's horrible to contemplate. Best case would be a rushed GT200b, as mentioned above, which is cheaper and overclocks like a mofo. Still not for me though, unless it were $300. I'm a bit strapped for cash at the mo :(
 
Charlie D thinks there's a lot more to come, and Nvidia is burning bridges:

The official line is: "While we have not been able to determine a root cause for these failures, testing suggests a weak material set of die/package combination, system thermal management designs, and customer use patterns are contributing factors". Parsing that, you see that they are blaming fabs and packaging suppliers first, OEMs second, and those damn users third, but they have no fault here, NV can do no wrong.

This is really dangerous for three reasons: they are annoying suppliers, annoying OEMs and annoying users. Last we checked, they need all three to remain in business.
There seem to be two currently-affected products, the low-end and the mid-range parts of the last generation. Depending on the failure rate, Nvidia could be looking to eat the majority of a generation's products plus the cost of things they were soldered to, and the tech school dropout used to screw new parts in.

This will be very ugly before it is done, very very ugly. Finger pointing early on and the blame game will only harden resolve on the other side, and add to costs. There go their cash reserves, we guess. It couldn't come at a worse time. Then again, doing everything wrong does have a cost.
 