Anandtech's "explanation" of NV3X's issues.

Snake's NV3x Post-Mortem:

Nvidia thought they could get away with a half-assed technology leap, since generally the game market doesn't really adapt to new standards until months and months after an API update.

Nvidia also miscalculated their competitor, figuring they (ATI) would do the same as they were.

That is why they put out what was basically a DX8.1 part (with a smattering of DX9 features). They just figured the industry wasn't moving ahead that quickly.

In other words, Nvidia fucked up. Their 100 million dollar baby was a stillborn. A bona fide piece of crap. A real hunk of... Well, you get the idea.

While the NV3x is a stellar DX8 card (when compared to a GF4), it sure is a real shitty DX9 one.
 
Nick Spolec said:
Snake's NV3x Post-Mortem:

Nvidia thought they could get away with a half-assed technology leap, since generally the game market doesn't really adapt to new standards until months and months after an API update.

Nvidia also miscalculated their competitor, figuring they (ATI) would do the same as they were.

That is why they put out what was basically a DX8.1 part (with a smattering of DX9 features). They just figured the industry wasn't moving ahead that quickly.

In other words, Nvidia fucked up. Their 100 million dollar baby was a stillborn. A bona fide piece of crap. A real hunk of... Well, you get the idea.

While the NV3x is a stellar DX8 card (when compared to a GF4), it sure is a real shitty DX9 one.

I agree, and would only add that I think nVidia was so entranced with the illusion that it was the de facto "market leader" in 3d that it believed it could keep milking versions of nV10 (indefinitely?), depending on 3rd-party advances in fab tech, RAM, and other things to provide the performance and feature differences from one iteration of nV10-20-25 to the next. So when ATi came along with R300, nVidia simply never saw it coming. That's what hubris will do for you--just my opinion, of course...:D nVidia got too comfortable after engulfing 3dfx, and began making gross over-assumptions about its "manifest destiny" in the 3d-chip sector. I think much the same thing happened at 3dfx.
 
WaltC said:
I agree, and would only add that I think nVidia was so entranced with the illusion that it was the de facto "market leader" in 3d that it believed it could keep milking versions of nV10 (indefinitely?), depending on 3rd-party advances in fab tech, RAM, and other things to provide the performance and feature differences from one iteration of nV10-20-25 to the next. So when ATi came along with R300, nVidia simply never saw it coming. That's what hubris will do for you--just my opinion, of course...:D nVidia got too comfortable after engulfing 3dfx, and began making gross over-assumptions about its "manifest destiny" in the 3d-chip sector. I think much the same thing happened at 3dfx.

But, hopefully after the entire NV3x debacle, Nvidia kicks itself in the ass and wakes up.

6800 looks like a result of that, but their actions in the coming months, when ATI reveals their chip, will really tell the tale of whether Nvidia has changed for the better.
 
Whether it is part of the DX9 baseline does not really matter all that much.

When talking about a half-assed tech leap, I think it is important to judge whether the industry is actually waiting for that tech leap.

I think that both users and developers were eagerly awaiting PS2/VS2 (or at least PS1.4, since that wasn't available to Nvidia users either).

But in the case of PS3 I doubt that is true. I have still not seen any examples of PS3 where it greatly improves performance or image quality, like we had with PS2.

Because of that PS3 is not high on my wish-list. And I think that I am not alone in that.

Now, can you speak of a half-assed tech leap if nobody is waiting for it to be implemented?
 
While the NV3x is a stellar DX8 card (when compared to a GF4), it sure is a real shitty DX9 one.

Seriously, though, now.....how many DX9 titles are out? Really? Can you count them on one hand?

Now, how many DX8 titles are out? Yeah.

Okay, now how many DX7 titles are out? Riiight. (And DX7, especially, benefits from the 4x2 architecture as DX7 games were generally HEAVILY multi-textured).
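
To put very rough numbers on that point, here's a back-of-the-envelope sketch in Python. The pipe/TMU layouts are the commonly cited 4x2 for NV3x and 8x1 for R300; the shared 400 MHz clock is a made-up figure purely so the two layouts can be compared, not a product spec:

```python
# Rough, illustrative fill-rate comparison: a 4x2 layout (4 pixel pipes with
# 2 TMUs each, as commonly attributed to NV3x) versus an 8x1 layout
# (8 pipes with 1 TMU each, as commonly cited for R300). The 400 MHz clock
# is a hypothetical common figure, not either chip's real spec.

def fill_rates(pipes, tmus_per_pipe, clock_mhz, textures_per_pixel):
    """Return peak (Mpixels/s, Mtexels/s) for a simple pipes-x-TMUs design."""
    texel_rate = pipes * tmus_per_pipe * clock_mhz
    # A pipe needs ceil(textures / TMUs) loops through its TMUs per pixel.
    loops = -(-textures_per_pixel // tmus_per_pipe)
    pixel_rate = pipes * clock_mhz / loops
    return pixel_rate, texel_rate

for label, pipes, tmus in (("4x2", 4, 2), ("8x1", 8, 1)):
    for textures in (1, 2):
        px, tx = fill_rates(pipes, tmus, 400, textures)
        print(f"{label}, {textures} texture(s) per pixel: "
              f"{px:.0f} Mpixels/s ({tx:.0f} Mtexels/s peak)")

# With 2 textures per pixel (typical DX7-era multitexturing) both layouts
# output the same number of textured pixels per clock; with only a single
# texture per pixel, the 8x1 layout outputs twice as many.
```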

Making a DX7/DX8 card with DX9 as a checkbox feature was not necessarily a bad idea. DX9 games STILL don't have a major market presence.

And, of course, there is the oft-commented-on notion that the nv30 was 'designed around Doom3': UltraShadow, improved stencil op processing, partial precision support, etc. - all things that will boost performance substantially in games coded using nVidia's OpenGL extensions, like Doom3 and its licensees.

Personally, I *don't* think this Anand article is late - I think it's actually jumping the gun. Until we see Doom3 retail, it's far too early to determine an 'across-the-board' loser from last year's hardware wars.
 
Xander said:
While the NV3x is a stellar DX8 card (when compared to a GF4), it sure is a real shitty DX9 one.

Seriously, though, now.....how many DX9 titles are out? Really? Can you count them on one hand?

Now, how many DX8 titles are out? Yeah.

Okay, now how many DX7 titles are out? Riiight. (And DX7, especially, benefits from the 4x2 architecture as DX7 games were generally HEAVILY multi-textured).

Making a DX7/DX8 card with DX9 as a checkbox feature was not necessarily a bad idea. DX9 games STILL don't have a major market presence.

And, of course, there is the oft-commented-on notion that the nv30 was 'designed around Doom3': UltraShadow, improved stencil op processing, partial precision support, etc. - all things that will boost performance substantially in games coded using nVidia's OpenGL extensions, like Doom3 and its licensees.

Personally, I *don't* think this Anand article is late - I think it's actually jumping the gun. Until we see Doom3 retail, it's far too early to determine an 'across-the-board' loser from last year's hardware wars.

:oops:

Uhm, what? :?
 
While I'd hesitate to call NV3x an absolute loser when it comes to the admittedly large majority of games on the market, I don't think across-the-board "optimizations" such as brilinear filtering and selective anisotropic filtering are the hallmarks of a winner either.

Nvidia forced those cheats on pretty much everything, including DX7 and DX8 class games. That, and the horrific MS flight sim ground detail - good lord, I felt embarrassed just looking at it. Texturing that blurry should be reserved for sufferers of glaucoma and XGI.
 
I can understand Xander's reasoning.

Indeed there are few DX9 titles and lots of DX7/DX8 titles.
Therefore the NV30 does indeed do quite well in the majority of titles. And since Doom III is essentially a DX8 title with UltraShadow, it does seem like their design decision could have worked. (Yes, I know Doom III is OpenGL, but you know what I mean.)

It COULD have worked. Only it didn't, because the R300 was just as fast in DX7/DX8 technology titles, and much faster in DX9 titles. (Actually, it was also faster in DX7/DX8 if you compare at similar image quality using AA.)

If the R300 had been slow in DX7/DX8 and fast in DX9, then the NV30 design could have been the best choice.
 
Well, I don't see how the nv3x could be anything but a loser. For a long time it cost more, it had inferior IQ, it would at best match the speed of its rival, often falling behind (even in DX8, and always in DX9), and it took two slots compared to its competition's one...
 
My problem with NV3x is simple.

It's not that it was a bad card; as has often been said, the NV3x derivatives can be described as good, or even stellar, DX8.1 cards with a DX9 checkbox.

My problem is that it was a DX8.1 card with a DX9 checkbox that was marketed as a DX9 card. It wasn't what it was marketed as, and that's what I have problems with.

If they'd marketed it as not being as fast in features you wouldn't see in real games for years, but faster and more stable in some of the games people are playing right now, we wouldn't have this problem.
 
They have probably considered that marketing option. The problem with that is that the NV30 isn't faster or more stable in DX8 games.
So they couldn't realistically market it that way either.
 
Nick Spolec said:
...
But, hopefully after the entire NV3x debacle, Nvidia kicks itself in the ass and wakes up.

6800 looks like a result of that, but their actions in the coming months, when ATI reveals their chip, will really tell the tale of whether Nvidia has changed for the better.

Yes, nV40 seems to be nothing if not a complete repudiation of the statements nVidia was making for all of '03 about the "future of 3d," without any question. My biggest curiosity with nV40, however, is in seeing whether they can manufacture it in sufficient quantities this year, as I believe this is likely a larger hurdle for them than was designing the chip. 222M transistors at .13 microns, at profitable yields, seems a tall order to me. I guess we'll see fairly soon, though.
 
digitalwanderer said:
The Dig briefly sticks his head into the thread and hisses his prophetic word:

"July..."

According to what Tamasi stated about it in his interview at TR, this is as likely a guess as any...;) If no real yield problems develop I would expect them to make that date for the 6800U (although Tamasi states that we should see 6800U's available by the end of May--I think that might be optimistic unless yields are OK), but if yields are problematic I'd expect to see delays past July, at least for the 6800U products hitting the retail channels. How's that for stating the obvious?... 8)
 
WaltC said:
if yields are problematic I'd expect to see delays past July, at least for the 6800U products hitting the retail channels. How's that for stating the obvious?... 8)
Well it's obvious to you and me, but when I started saying it I was called a trolling fanboy who was spreading FUD.... ;)
 
Xander said:
And, of course, there is the oft-commented-on notion that the nv30 was 'designed around Doom3': UltraShadow, improved stencil op processing, partial precision support, etc. - all things that will boost performance substantially in games coded using nVidia's OpenGL extensions, like Doom3 and its licensees.

NV30 didn't support UltraShadow; that was NV35 tech.
 
digitalwanderer said:
Well it's obvious to you and me, but when I started saying it I was called a trolling fanboy who was spreading FUD.... ;)

Heh--common sense, definitely not FUD. nVidia had well-documented yield problems with nV35, problems the company admitted to publicly last year (lest there be any doubt), on the same .13 micron production process used for nV40, the difference being that nV40 has nearly double the transistor count of nV35. So I wouldn't call such speculations FUD whatsoever, as they seem entirely reasonable to me. And of course, such speculation is easily proven false in the event that nVidia is able to ship marketable quantities of nV40 such that its AIB partners can make Tamasi's Memorial Day (last day of May) target for 6800U retail availability. If nVidia has no trouble manufacturing these 222M-transistor beasties I certainly won't be spreading FUD about that; I'll be congratulating them, and I'm sure you will be, too...;)
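
For anyone curious why nearly doubling the transistor count is such a big deal, a crude first-order yield model makes the point. The defect density and die areas below are purely made-up illustrative numbers (not anything nVidia or TSMC have published); the only assumption is that roughly twice the transistors on the same process means roughly twice the die area:

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Classic first-order (Poisson) yield model: yield = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * die_area_cm2)

# Hypothetical, illustrative numbers only.
defect_density = 0.5   # defects per cm^2 (made up)
nv35_area = 2.0        # cm^2 (made up)
nv40_area = 4.0        # cm^2 (made up, ~2x the transistors on the same process)

print(f"hypothetical NV35-sized die yield: {poisson_yield(nv35_area, defect_density):.0%}")
print(f"hypothetical NV40-sized die yield: {poisson_yield(nv40_area, defect_density):.0%}")

# Doubling the area squares the per-die yield (exp(-2DA) = exp(-DA)^2),
# so a process that was already marginal gets disproportionately worse.
```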
 