NV40: 6x2/12x1/8x2/16x1? Meh. Summary of what I believe

MuFu said:
There's a possibility both are wrong. ATi have certainly done a very good job of confusing people. :LOL:

I can come up with but two ways to make both speculations wrong:
A) R420 is an 8x2 / 16x0 part, much like the NV40 is supposed to be (not likely IMHO)
B) ATi is pulling off some black magic by cramming 16x1 into the given transistor budget (considering the alleged AA improvements, FP32, etc., even less likely)
 
digitalwanderer said:
MuFu said:
There's a possibility both are wrong. ATi have certainly done a very good job of confusing people. :LOL:
TELL ME ABOUT IT!!!! :oops:

I'm actually half-way convinced the R420 might just be a bloody FP32 card now. :|

yeah so it's compatible with dx 9.1 :devilish:
 
My bet for R420/423 right now is an 8x1 design with more FP ALUs per pipe to speed up shaders even further. There will be Shader 3.0 support, but I doubt there will be any improvements (besides speed, of course) in AA and AF.

As for NV40, it's still a mystery how many pixels per clock it can produce and in which modes. The main THREE possibilities are: 6x2/12x1, 8x2/16x0 and 8x2/16x1. But I wouldn't be surprised at all if it ends up as 4x2/8x1.
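For anyone following along, those AxB labels mean A pixel pipelines with B texture units each, and the theoretical fillrates they imply are simple arithmetic. A rough sketch of the candidate configs (the 500 MHz core clock is my own illustrative assumption, not a rumoured spec):

```python
# Rough sketch (my own arithmetic, not insider info): theoretical fillrates
# for the rumoured pipeline configs, assuming a hypothetical 500 MHz core clock.
CLOCK_MHZ = 500  # assumed clock, purely illustrative

# AxB config -> (pixel pipes, TMUs per pipe) in its colour mode
configs = {
    "4x2":  (4, 2),
    "6x2":  (6, 2),
    "8x2":  (8, 2),
    "12x1": (12, 1),
    "16x1": (16, 1),
}

for name, (pipes, tmus) in configs.items():
    pixel_fill = pipes * CLOCK_MHZ / 1000          # Gpixels/s
    texel_fill = pipes * tmus * CLOCK_MHZ / 1000   # Gtexels/s
    print(f"{name:>4}: {pixel_fill:.1f} Gpix/s, {texel_fill:.1f} Gtex/s")
```

Note that 8x2 and 16x1 give the same texel fillrate at the same clock; the difference shows up in single-texture pixel throughput, which is why the x2 vs x1 question matters.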
 
jimbob0i0 said:
digitalwanderer said:
MuFu said:
There's a possibility both are wrong. ATi have certainly done a very good job of confusing people. :LOL:
TELL ME ABOUT IT!!!! :oops:

I'm actually half-way convinced the R420 might just be a bloody FP32 card now. :|

yeah so it's compatible with dx 9.1 :devilish:


Actually, it'd be sort of nice to have it support Longhorn down the road; but I wasn't expecting it and still ain't sure about much of anything about the R420. :rolleyes:
 
The NV30 was supposed to be 6x2/12x0; when that failed they went the 4x2, Flow FX, exotic memory, cheating drivers route.
The 5700U is a fairly fast chip. Unless by some miracle they got 6x2 to work on .11, which I wouldn't bet the farm on, or $10 for that matter, I expect Nvidia to go 8x1, exotic cooling/RAM, and cheating drivers with the NV40.
I think people are giving Nvidia too much engineering credit, like people did with 3dfx towards the end. Face facts: they totally arsed up the NV30 and had years to get it to work. When a design fails that horribly for years, they simply had to drop it. I think the 5700U is evidence of this shift in direction, personally.
 
I think you are giving them too little credit. Even Intel and AMD have had missteps and arsed-up designs (Itanium and K6, anyone? Rambus system RAM?). Does Itanium prove that Intel engineers SUCK? No; sometimes people try something different, and market conditions, timing, and competitors' products control whether or not it is seen as a glorious victory or suckage. AMD went conservative, and now Intel is forced to follow suit. NV30 wasn't a bad design, it simply wasn't good enough. If ATI hadn't shipped the R300, people would be quite satisfied with the boost the NV30 delivered over the NV25. It is only because ATI did such an incredible job that we view the NV30 as suckage. ATI went for full FP; Nvidia thought long instruction limits, stencil, and fast integer was the right strategy. ATI guessed the market right.

If you've been on this board long enough, you know how quickly tables can turn. People once thought 3dfx was unstoppable. Then they thought NVidia was unstoppable. Now a new group of kids is once again proclaiming that a large faceless corporation has an unbeatable lead, and won't make any mistakes to give it up, nor will any new competitors arrive. How ironic it will be if PowerVR comes up with some magic VS/PS3.0 chipset that blows the doors off both R420 and NV40. Then we'll all be debating how ATI and nVidia both "arsed up" their 3.0 pipelines.

But see, you've already come to the conclusion that the NV40 is arsed up, without any real knowledge. And here you are in another thread instructing me to "wait and see how PCI Express performs", yet you've already passed judgement on anything NVidia can possibly do.
 
As an aside, Best Buy have changed their D3 shipping ETA to July 13th. Doesn't bode too well for a bundle. :-\

MuFu.
 
MuFu said:
As an aside, Best Buy have changed their D3 shipping ETA to July 13th. Doesn't bode too well for a bundle. :-\

MuFu.
Have you learned nothing from R360/RV360? Vouchers make everybody happy (with the exception of the consumer).

But maybe they'll bundle UT2004 instead. Fine by me.
 
The Baron said:
Have you learned nothing from R360/RV360? Vouchers make everybody happy (with the exception of the consumer).

Hehe, true that. The nV + D3 bundle is a little different though; what they were/are proposing is literally a bundled copy of D3, not a voucher.

MuFu.
 
MuFu said:
The Baron said:
Have you learned nothing from R360/RV360? Vouchers make everybody happy (with the exception of the consumer).

Hehe, true that. The nV + D3 bundle is a little different though; what they were/are proposing is literally a bundled copy of D3, not a voucher.

MuFu.
I know, but again, it's entirely possible that Best Buy is pushing it back so as not to piss off people who might preorder thinking it will come out very soon, in case it actually ends up coming out after this summer. (Read: they don't know any better than the leprechaun who sits on my shoulder and tells me to burn things.)

But... will Doom 3 have TWIMTBP logo at the beginning of it? It pains me to imagine such a travesty.
 
DemoCoder said:
you've already come to the conclusion that the NV40 is arsed up, without any real knowledge. And here you are in another thread instructing me to "wait and see how PCI Express performs", yet you've already passed judgement on anything NVidia can possibly do.
<sigh>

I can't believe I'm going to say this, but I agree with you. I've got a hunch that NV40 is going to fall flat on its face and be a better-but-still-a-bit-of-a-NV30-fiasco-ish part, but I have NOT counted nVidia out yet, nor would I be too shocked if they came out with a great card.

I've had too many dodgy-POS cards from ATi over the years to NOT know how quickly things can change; nothing is a given until it's a given. ATi shocked the hell out of me with the R3xx line-up and their whole corporate/public-relations turn-around over the last year and a half or so, so it ain't beyond the realm of possibility that nVidia pulls off something similar.

I've been playing around with my son's GF4 Ti4400 the last few days and I'd forgotten just how nice of a little card it actually is. nVidia has had their day in the sun and can come up with some excellent hardware... I actually hope they get back on track and start doing so again.

That being said, I think the R420 is going to be more of the same from ATi...but I mean that in a "more of the same level of excellence and technological advancement" kind of good way. ;)

I was actually just wondering today when 3D gaming became so damned pretty and how could it possibly become more so? I mean, 50+ fps averages easy with 4xAA 8xAF v-sync on and it looks/feels/plays smooth?!?!

I can't WAIT to find out how wrong I am with the next generation of cards! :LOL:
 
There is always room for improvement. I mean, yes, games look better and better, but even with all the fancy stuff enabled in today's games, you can see the limits.

Even without falling into the hyper-realism trend, you can see that by smoothing polygons or using more precise lighting models, you can represent things more naturally.

Just look at how CG has evolved in movies.

From Tron to Finding Nemo, there's been a huge evolution.

Look at it now: Tron 2.0 was convincing; current rendering power could render the game as if it were the movie.

Just wait till we have the same result with a Finding Nemo 2 game ;)
 
Magic-Sim said:
Look at it now: Tron 2.0 was convincing; current rendering power could render the game as if it were the movie.

Somehow I don't find that particularly impressive - I mean, Tron 2.0 has great graphics because it looks like a movie that looked kinda like contemporary ( read = ugly ) videogames...? WTF? Then again, maybe it's just me being heavily biased against the Lithtech engine... :rolleyes:

As for Doom3 - after the infamous Great Alpha Leak scenario and the Anti-Shader Day, I find it entirely possible that it will receive the TWIMTBP badge of dishonor...
 
I was actually just wondering today when 3D gaming became so damned pretty and how could it possibly become more so? I mean, 50+ fps averages easy with 4xAA 8xAF v-sync on and it looks/feels/plays smooth?!?!

I can't WAIT to find out how wrong I am with the next generation of cards!

Realtime graphics still have a massively long way to go, no matter how good today's graphics seem. We have to get several orders of magnitude better than now if we are to reach *current* prerendered/offline CGI. Of course, we will probably NEVER catch up to the CGI of the same era, because CGI is a moving target, always improving.
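The "several orders of magnitude" claim holds up to some back-of-the-envelope arithmetic. A quick sketch (the per-frame render times here are my own illustrative assumptions, not figures from the thread):

```python
import math

# Back-of-the-envelope check of the "several orders of magnitude" claim.
# Both per-frame times below are assumed, illustrative numbers.
OFFLINE_SECONDS_PER_FRAME = 2 * 3600   # assume ~2 hours/frame for film CGI
REALTIME_SECONDS_PER_FRAME = 1 / 60    # a 60 fps realtime target

# Ratio of per-frame compute budgets, and the same gap in orders of magnitude
gap = OFFLINE_SECONDS_PER_FRAME / REALTIME_SECONDS_PER_FRAME
orders = math.log10(gap)
print(f"gap: {gap:,.0f}x, roughly {orders:.1f} orders of magnitude")
```

Under these assumptions the gap in raw per-frame compute time is over five orders of magnitude, and that ignores the fact that offline renderers also use far more complex shading and geometry per frame.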
 
DemoCoder said:
I think you are giving them too little credit. Even Intel and AMD have had missteps and arsed-up designs (Itanium and K6, anyone? Rambus system RAM?). Does Itanium prove that Intel engineers SUCK? No; sometimes people try something different, and market conditions, timing, and competitors' products control whether or not it is seen as a glorious victory or suckage. AMD went conservative, and now Intel is forced to follow suit. NV30 wasn't a bad design, it simply wasn't good enough. If ATI hadn't shipped the R300, people would be quite satisfied with the boost the NV30 delivered over the NV25. It is only because ATI did such an incredible job that we view the NV30 as suckage. ATI went for full FP; Nvidia thought long instruction limits, stencil, and fast integer was the right strategy. ATI guessed the market right.

If you've been on this board long enough, you know how quickly tables can turn. People once thought 3dfx was unstoppable. Then they thought NVidia was unstoppable. Now a new group of kids is once again proclaiming that a large faceless corporation has an unbeatable lead, and won't make any mistakes to give it up, nor will any new competitors arrive. How ironic it will be if PowerVR comes up with some magic VS/PS3.0 chipset that blows the doors off both R420 and NV40. Then we'll all be debating how ATI and nVidia both "arsed up" their 3.0 pipelines.

But see, you've already come to the conclusion that the NV40 is arsed up, without any real knowledge. And here you are in another thread instructing me to "wait and see how PCI Express performs", yet you've already passed judgement on anything NVidia can possibly do.

If a part simply doesn't work, it's not an affront to call it a failure. We're not living in politically-correct-ville here. If I build a race engine and it doesn't work, I failed; the blueprints could give pretty women orgasms, but if it doesn't work, it doesn't work. That has to be called a failure.
 
Well, maybe you can tell those NV3x owners that their cards don't work, and that they are imagining things.

OR, you could drop the hyperbole and rabid bias.

If I own a Nissan 350Z, the fact that it can be beaten by a Skyline does not mean "it doesn't work". Why does rationality fly out the window and reduce people to speaking hyperbolic nonsense when it comes to graphics cards?
 
DemoCoder said:
I think you are giving them too little credit. Even Intel and AMD have had missteps and arsed-up designs (Itanium and K6, anyone? Rambus system RAM?).
The K6 was a fine chip. Maybe its floating-point performance could have been better, but it was great for other things.
NV30 wasn't a bad design, it simply wasn't good enough. If ATI hadn't shipped the R300, people would be quite satisfied with the boost the NV30 delivered over the NV25. It is only because ATI did such an incredible job that we view the NV30 as suckage. ATI went for full FP; Nvidia thought long instruction limits, stencil, and fast integer was the right strategy. ATI guessed the market right.
I don't understand how you think that NV30 is great at stencil. 8 stencil ops per clock? R300 has that too, and it can also do 8 colored pixels per clock as well.
 
Well, maybe you can tell those NV3x owners that their cards don't work, and that they are imagining things.

OR, you could drop the hyperbole and rabid bias

My, you're quick to toss around invectives. The original NV30 was a 6x2 design; Nvidia failed at making that a reality. They tried their hardest and failed. I respect that, as do most people; however, it was still a failure.
 