GeForce NV50 Canceled [Inq]

Unknown Soldier said:
Is it possible that Nvidia found out some results from the R520 and realised that NV50 would not be up to scratch (à la NV30 and R300) and so decided to cancel the project?

I suppose so, but it is equally possible that Nvidia realized NV50 was far too advanced in comparison to R520 and that there was no reason to release such a powerful chip into the market when it can be left for later (to maximize profits). You know, like how ATI decided it didn't need to spend so much on R&D, but could compete by upping the frequency of the core with R420 and could leave the original R400 tech for a later date. Milking, I believe, is the term when viewed from the pointy end of the shaft. ;)

I find it amusing that people are speculating on a speculation about the cancellation of a product when they don't even know what it is. I like a good theory, but don't we need a little more to go on than we have? So far the best info from this thread has been the highlighting of the importance of the DX/WGF shift and how it may have impacted hardware roadmaps. Then again, I am not sure how to reconcile this with the fact that DX has been evolving and MS is notorious for delayed release dates. What's new?
 
wireframe said:
I suppose so, but it is equally possible that Nvidia realized NV50 was far too advanced in comparison to R520 and that there was no reason to release such a powerful chip into the market when it can be left for later (to maximize profits). You know, like how ATI decided it didn't need to spend so much on R&D, but could compete by upping the frequency of the core with R420 and could leave the original R400 tech for a later date. Milking, I believe, is the term when viewed from the pointy end of the shaft. ;)

I always imagined the successor of NV40, which was meant to fill the gap until WGF/Longhorn shows up, as just another SM3.0 architecture with whatever changes/improvements are possible within that timeframe. It simply doesn't make sense to me personally to go beyond SM3.0 without the corresponding API support.

I find it amusing that people are speculating on a speculation about the cancellation of a product when they don't even know what it is. I like a good theory, but don't we need a little more to go on than we have? So far the best info from this thread has been the highlighting of the importance of the DX/WGF shift and how it may have impacted hardware roadmaps. Then again, I am not sure how to reconcile this with the fact that DX has been evolving and MS is notorious for delayed release dates. What's new?

ROFL, tell me about it. What's new (and really isn't, in the end) is that the Inquirer is full of it :LOL:
 
The way I see it:

NV48 may be canned because it's not a good enough replacement for NV40/NV45 - just a small overclock, like R480.

NV47 is going to be the next high-end chip. Basically it's just NV40 with more pixel/vertex pipes and a native PCIe interface. I don't know anything about the process it's supposed to be built on. I doubt it's 0.13; most probably it's gonna be 0.11 or 0.09.

NV47 will come to market before R520 and will be the fastest chip till R520 comes. We'll see how well it competes with R520.

NV50 is just too far away to be cancelled. NV could cancel some form of NV50, but that means we'll see another, better (or worse - depending on why NV50 was canned) NV50 in the autumn of 2005.

If NV50 was canned the way R400 was canned, then we'll most probably get something like an 'NV49' at the end of 2005. It might even be called NV50, but the 'real' NV50 will already be NV60 by that point.

If NV50 was canned the way the original NV40 was canned, then we'll see something like NV55 in the autumn of 2005 with NV60 coming with WGF/DXNext/whatever.

I'm actually leaning towards the second variant for NV50. They could again decide to bring out the refresh instead of the original chip. That'd give them a big lead over R520 (which will be the same R3x0 architecture with SM3 support) till R600 comes out. NV60 would be ready by then.
 
wireframe said:
So far the best info from this thread has been the highlighting of the importance of the DX/WGF shift and how it may have impacted hardware roadmaps. Then again, I am not sure how to reconcile this with the fact that DX has been evolving and MS is notorious for delayed release dates. What's new?

Realistically, ever since DX8 was released, it's always been about getting as close to the DX release window as you can (assuming you are executing within the timeframe you expect), and this is getting even more critical as both performance and capability expectations increase. WGF is a little different in that it is not a standalone release (in fact there may never be an XP version) and that it is tied to Longhorn's release, which has had wide release date variations; none of this makes the 3D vendors' lives any easier.
 
wireframe said:
...

I suppose so, but it is equally possible that Nvidia realized NV50 was far too advanced in comparison to R520 and that there was no reason to release such a powerful chip into the market when it can be left for later (to maximize profits). You know, like how ATI decided it didn't need to spend so much on R&D, but could compete by upping the frequency of the core with R420 and could leave the original R400 tech for a later date. Milking, I believe, is the term when viewed from the pointy end of the shaft. ;)

Following that logic, ATi should certainly never have released R300 when it did, as it leapfrogged nV technically and was "far more advanced." But simply looking at R300, it's easy to see why the inverse of your logic holds: if you can leapfrog the competition, it behooves you to do so. That's the lesson behind R300 and AMD64, among many others, in a nutshell. Milking is a fine strategy for a non-competitive market that you own, but it's a disaster in a competitive atmosphere.

I find it amusing that people are speculating on a speculation about the cancellation of a product when they don't even know what it is. I like a good theory, but don't we need a little more to go on than we have? So far the best info from this thread has been the highlighting of the importance of the DX/WGF shift and how it may have impacted hardware roadmaps. Then again, I am not sure how to reconcile this with the fact that DX has been evolving and MS is notorious for delayed release dates. What's new?

It's no more amusing than your "too advanced" speculation, certainly...;) But in truth it would be interesting if this were more than yet another Inq rumor, since I predict that nVidia will indeed release *a product* at some future date called "nV50"...;) Whether the current iteration of the design has been shelved to be replaced by something better seems fairly irrelevant, as this is actually a common practice along the road to product development. What counts is always what a product turns out to be; what it "might have been" is irrelevant. In this case it's a double irrelevancy, as we have no clue as to what nV50 "might have been," let alone what it will turn out to be...;)
 
WaltC,

You seem to have developed some sort of twitch... ;) ;)

It's either that or you understood full well that my "too advanced" statement was put there in jest and to balance the scales with those who always assume the worst for certain companies/people and the best for others.

We know for a fact that ATI's R400 "was too advanced" and that they decided to use the superscalar nature of the R300 to "milk out" an R420. I don't see how this can be viewed otherwise. I hope I need not always qualify such statements with a "there is nothing wrong with that (it happens all the time)".

All I am saying is that we don't know exactly what NV50 is/was/will be, and we do not know the status of its development. There is really no reason to use this speculation to elevate a competitor's product which is also unknown. It just doesn't make sense. We could then go even further and speculate that Nvidia knows what R520 is, sees the chink in ATI's armor, and has realigned NV50 to go for the jugular... c'mon...

BTW, considering how different R420 and NV40 are, doesn't anyone else find it peculiar that they perform so similarly in many ways? ;)

Oh no! I seem to have developed a twitch too! ;) ;) :|
 
_xxx_ said:
If they really cancelled it, that can only mean they have an alternative which blew the management away completely. I'd really like to know what's going on.

Any other blurb?

Maybe the rumours of Nvidia designing the GPU for the PS3 are true, and now they have to shuffle around their engineering teams?
 
Some speculation:

Hmmm. Maybe Nvidia has seen ATI's R520 and/or the more advanced R500 (for Xbox 2?) and knows that ATI's next-next-gen graphics part, the R600, will be a killer. The NV50 would have to be facing the R600 in the early 2006 timeframe, so maybe Nvidia is changing their roadmap. Maybe Nvidia is accelerating the NV60 design, or will simply release what would have been NV55 as the NV50, or whatever. Remember, both Nvidia and ATI have at least 3 GPU design teams each and are always working on several generations of chips at the same time.

Nvidia has, so far, been shut out of the next-gen game consoles. ATI has 2 of the 3 locked up (MS, Nintendo), and Sony has, so far, done their own graphics in-house. Nvidia needs some breakout technology or a breakout deal (maybe Sony?) to follow up the successful NV4x / GeForce 6 series.

It'll be fun to see what Nvidia does; they're a survivor. Remember the disasters that were NV1 (the Diamond Edge 3D card) and NV2 (for Sega) in 1995-1996? They bounced back very nicely with Riva128 (NV3) and TNT (NV4), then TNT2 (NV5) and GeForce (NV10). And more recently, Nvidia bounced back very well from the NV3x / GeForceFX semi-disaster.
 
The question is, has nVidia been shut out of the next-gen consoles, or chosen not to design for one? After what happened with the NV30, it seems plausible that nVidia decided to refocus their efforts on their core market.
 
They finally decided they liked 3dfx's post-Voodoo5 lineup better than what they have to offer!

Or they've decided to go the way of 3dfx and sell their assets to ATI and close up shop, at which point Matrox or ImgTec (or perhaps Sony or Intel) will step in and compete with the 'new' ATI.

Or perhaps it's not the NV47; rather, they hired Agent 47 to assassinate all of ATI's engineers.
 
Whatever Nvidia's next major architecture is, in the late 2005 to mid 2006 timeframe, it will probably be approaching 500M transistors. It'll break through 400M, no doubt. And that's just thinking in terms of NV50 without cancellation.

Nvidia, more so than ATI now, has been doubling transistors with every true generational change.

From Riva128/Riva128 ZX (3-3.5M) to TNT/TNT2 (8-10M) to GeForce256/GeForce2 GTS (23-25M) to GeForce3/GeForce4 (57-63M) to GeForceFX (125M) to GeForce 6800 (222M).
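As a quick sanity check of that doubling claim, here's a minimal Python sketch using the base chip of each generation from the rounded counts above; the final number is a naive extrapolation from the average growth ratio, not any leaked NV50 spec:

```python
# Transistor counts (in millions) for the base chip of each Nvidia
# generation, as quoted in this thread.
counts = {
    "Riva128": 3,
    "TNT": 8,
    "GeForce256": 23,
    "GeForce3": 57,
    "GeForceFX": 125,
    "GeForce 6800": 222,
}

values = list(counts.values())
# Growth ratio from each generation to the next.
ratios = [b / a for a, b in zip(values, values[1:])]
avg = sum(ratios) / len(ratios)

print(f"growth ratios: {[round(r, 2) for r in ratios]}")
print(f"average ratio: {avg:.2f}x")                        # ~2.4x
print(f"naive next-gen extrapolation: {values[-1] * avg:.0f}M")  # ~532M
```

Interestingly, the naive average comes out closer to 2.4x than 2x, which is exactly how you land near the 500M figure.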


I predict that Nvidia will want to do at least one demo of 'Toy Story in realtime,' and if they do, it'll be heavily optimized and modified from the original film but look close enough to convince the masses. :LOL:
 
Chalnoth said:
The question is, has nVidia been shut out of the next-gen consoles, or chosen not to design for one? After what happened with the NV30, it seems plausible that nVidia decided to refocus their efforts on their core market.

I've heard the NV30 and Xbox NV2A connection mentioned before, but ATI pulled off the Gamecube and R300. Everything seemed to work out OK. ATI now has 2 console contracts (if they are indeed providing a GPU for the next Nintendo console). The revenue from the Xbox contract is going to be hard to replace without a PS3 contract, so nVidia should be hungry for some console action.
 
Whatever Nvidia's next major architecture is, in the late 2005 to mid 2006 timeframe, it will probably be approaching 500M transistors. It'll break through 400M, no doubt. And that's just thinking in terms of NV50 without cancellation.

Is such a jump even possible for late 2005 on, let's say, 90nm? (OK, a lot would be possible if it cost an arm and a leg, but there are increasing considerations as processes get smaller, and that's something all chip vendors have seen recently...)

Multi-core on a single die?
 
Brimstone said:
Chalnoth said:
The question is, has nVidia been shut out of the next-gen consoles, or chosen not to design for one? After what happened with the NV30, it seems plausible that nVidia decided to refocus their efforts on their core market.

I've heard the NV30 and Xbox NV2A connection mentioned before, but ATI pulled off the Gamecube and R300. Everything seemed to work out OK. ATI now has 2 console contracts (if they are indeed providing a GPU for the next Nintendo console). The revenue from the Xbox contract is going to be hard to replace without a PS3 contract, so nVidia should be hungry for some console action.

Unless they concentrate even more on other markets instead, where fewer resources are required and higher margins can be predicted.

I'd even dare to speculate that both IHVs might start to sell more and more IP in the foreseeable future in order to save time and resources (ATI selling IP for Xbox 2 and NVIDIA for AR10 being two recent examples).
 
but ATI pulled off the Gamecube and R300.

ATI didn't pull off Gamecube graphics. That job was done by ArtX in 1998-2000, before ATI acquired ArtX in mid-2000, by which point the Flipper GPU for the Gamecube was complete.


Is such a jump even possible for late 2005 on, let's say, 90nm? (OK, a lot would be possible if it cost an arm and a leg, but there are increasing considerations as processes get smaller, and that's something all chip vendors have seen recently...)

Multi-core on a single die?

Although I can't say yes for certain, think about this: hardly anyone thought 125M transistors was possible for NV30 in late summer 2002 (paper launch) to early 2003 (release), or 222M transistors for NV40 in early to mid 2004.

OK, I'll say it: a 400M to 450M transistor Nvidia GPU is possible in late 2005 to early 2006 on 90 nm.

Or maybe Nvidia will go nuts and pull out a 500M+ transistor GPU in late 2006 on 65 nm :devilish:
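For what it's worth, the 400M-450M guess is roughly what naive process scaling allows. A minimal back-of-the-envelope sketch, assuming ideal density scaling with the square of the feature size and an NV40-sized die (real designs scale worse than this, so treat it as an upper bound):

```python
# Back-of-the-envelope transistor budget for a 130 nm -> 90 nm shrink,
# assuming ideal density scaling with the square of the feature size.
nv40_transistors = 222e6          # GeForce 6800, built on 130 nm
old_node_nm, new_node_nm = 130, 90

density_gain = (old_node_nm / new_node_nm) ** 2   # ~2.09x transistors per unit area
budget = nv40_transistors * density_gain          # same die area at 90 nm

print(f"ideal density gain: {density_gain:.2f}x")
print(f"same-die-area budget at 90 nm: {budget / 1e6:.0f}M transistors")  # ~463M
```

By this crude measure an NV40-sized die at 90 nm tops out around 460M transistors, so 400M-450M is aggressive but not crazy.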
 
Megadrive1988 said:
Whatever Nvidia's next major architecture is, in the late 2005 to mid 2006 timeframe, it will probably be approaching 500M transistors. It'll break through 400M, no doubt. And that's just thinking in terms of NV50 without cancellation.

Nvidia, more so than ATI now, has been doubling transistors with every true generational change.

From Riva128/Riva128 ZX (3-3.5M) to TNT/TNT2 (8-10M) to GeForce256/GeForce2 GTS (23-25M) to GeForce3/GeForce4 (57-63M) to GeForceFX (125M) to GeForce 6800 (222M).


I predict that Nvidia will want to do at least one demo of 'Toy Story in realtime,' and if they do, it'll be heavily optimized and modified from the original film but look close enough to convince the masses. :LOL:

And look how great the yields are for their high-end chips now.

I think the Fast14 technology will allow ATI to produce high-end chips with 1/4 the transistors, so they can have a monstrous cache on the chip. Less heat and current leakage at 90nm will also be essential, and that's a bonus feature of dynamic logic over CMOS.
 
DaveBaumann said:
WGF is a little different in that it is not a standalone release (in fact there may never be an XP version) and that it is tied to Longhorn's release, which has had wide release date variations; none of this makes the 3D vendors' lives any easier.
It's gonna be a hell of a small market at the beginning, then. Its relevance on the market at the time of release is going to be quite limited, IMO.
 