Graphics features rant

ET said:
Even the difference between the GeForce4 MX and the Radeon 9000, both of which sell in the same price range, is huge.

I see one problem with that thought, and it makes a lot of your argument redundant in certain contexts.

The Geforce4 MX is not in the same price range as the Radeon 9000. GF4 MXs are slightly (and in many cases very significantly) cheaper. There is a sweet spot, and that is the sub-50 UKPounds range (er, key for pounds temporarily disabled - sorry). This makes a new gfx card more of an impulse buy if people believe these cards have high performance like the GF4 Ti range (yes, we know it is a con). In older games a GF4 MX will be faster than a Radeon 9000 NP - the GF4 MX 440 has higher bandwidth and, I think, a higher core speed.
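The bandwidth comparison above comes down to simple arithmetic. A minimal sketch, where the clock and bus-width figures are assumptions for illustration rather than authoritative board specs:

```python
# Peak memory bandwidth = effective memory clock * bus width in bytes.
# Card figures below are assumptions for illustration; actual boards varied.
def peak_bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# e.g. a 128-bit bus with 400 MHz effective (DDR) memory, roughly a
# GF4 MX 440-class configuration:
print(peak_bandwidth_gb_s(400, 128))  # 6.4 (GB/s)
```

Which of the two cards wins on this metric depends entirely on the memory clock the board vendor actually shipped, which is why cheap boards of either family could land on either side.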

Interestingly, the Radeon 9100 is out and is very cheap (it kind of surprised me how cheap - 77 UKPounds), hopefully replacing the GF4 MX as the impulse buy, but I doubt it. I have noticed pretty high sales of the AIW 9000 Pro, however, and that is excellent news as it supports DX8.1 too.

Anyway, I haven't read the rest of the thread yet, so I apologise for taking it back to the level it was at a few days ago. :oops:
 
Simon F said:
ET said:
I really wish we had slow DX9 compatible cards instead of fast DX7 ones. It would have made things so much better.
I think that is called a CPU.

:)

Didn't mean "very very incredibly slow" you know. I mean, just stick one DX9 TMU in each integrated chipset, even without hardware vertex shaders; two TMUs with hardware VS for the low end, and we've got us a ballgame.
 
Another way to look at things is: if you want a grade-A title that pushes the envelope for the system it was designed for, then look no further than a console... hehe... sorry, totally irrelevant.

Reading the rest of the posts, I agree with everything else you said, ET, by the way. It is just sad to me how many GF4 MXs are still sold. Easily the best-selling current gfx card, by a big margin. DX7 is here to stay.
 
Tahir said:
The Geforce4 MX is not in the same price range as the Radeon 9000. GF4 MXs are slightly (and in many cases very significantly) cheaper. There is a sweet spot, and that is the sub-50 UKPounds range (er, key for pounds temporarily disabled - sorry). This makes a new gfx card more of an impulse buy if people believe these cards have high performance like the GF4 Ti range (yes, we know it is a con). In older games a GF4 MX will be faster than a Radeon 9000 NP - the GF4 MX 440 has higher bandwidth and, I think, a higher core speed.

Sadly you're right. I do hope that ATI can grab that spot (or that NVIDIA releases a good replacement).

Regarding the better performance of the GeForce4 MX: IMO, for many people who are not hardcore gamers, very high resolutions with very high frame rates are not that important. My brother-in-law plays a lot of games, and up until now was using a Celeron 333 with a Radeon VE. He hasn't been playing the latest titles, but he was fine playing Q3 and Unreal-engine games. I'm just moving him now to a Celeron 1700 with the same card, and I'm sure he'll be happy. I know other people who play quite a lot on GeForce2 MX-level cards.
 
Tahir said:
The Geforce4 MX is not in the same price range as the Radeon 9000. GF4 MXs are slightly (and in many cases very significantly) cheaper. There is a sweet spot, and that is the sub-50 UKPounds range.

Here in the States at Best Buy you can get both a BFG GeForce4 MX 64MB and an ATI Radeon 9000 64MB for $99 US. I think that's what ET was talking about.

Tommy McClain
 
Joe DeFuria said:
This logic is fatally flawed by the fact that Doom 3 runs on any GF2 or higher (GF2 = Dx7).

That's a valid point, but only partially true. Yes, IIRC, DX7 cards (those that support stencil and Dot3) do have all the "feature support" needed to run Doom3 with "everything enabled."

However, they lack the performance to do so. So the DX7 "paths" (ARB, NV10) do not utilize the same lighting model as the DX8+ paths do.

So, while it's possible in principle to "run" Doom3 fully featured on a GF2... it won't.

It depends how fast you clock the GF2 :) Run it at 320x200 with a fast enough GF2 and it will look roughly the same as an ATI 9700 Pro at the same res.

I agree with ET that having a single rendering path would make high-end games look better, because with finite resources you have to pick the biggest bang for the buck. JC picked the Dx7 level; sure, he will add a few tricks (like most games do) that use the latest and greatest features, but the majority is Dx7. Give him 4-5 years and you will see a Dx9 engine that will make a 9700 or GeForceFX really shine.

People should congratulate JC for making something that looks that good on Dx7 tech. A pure Dx9-level engine would have a totally different architecture and would support much more advanced features (but only a few people could use it).
 
DeanoC said:
But the majority is Dx7; give him 4-5 years and you will see a Dx9 engine that will make a 9700 or GeForceFX really shine.

I'd say if he starts as soon as Doom III is done, more like 2 1/2 to 3 years. Or at least I hope :)

Remember, he said DX9 was going to be the base platform for his next-gen engine.
 
But the majority is Dx7; give him 4-5 years and you will see a Dx9 engine that will make a 9700 or GeForceFX really shine.

I'd say Carmack's next engine will make the 9700/GFfx "really shine" in exactly the same way Doom3 makes a GeForce 1 "really shine".

It'll make DX9 really shine...but it'll probably take the first generation DX10 capable cards to run it really well--in the same way as it'll take 9700/GFfx to run Doom3 really well.
 
Dave H said:
I'd say Carmack's next engine will make the 9700/GFfx "really shine" in exactly the same way Doom3 makes a GeForce 1 "really shine".

It'll make DX9 really shine...but it'll probably take the first generation DX10 capable cards to run it really well--in the same way as it'll take 9700/GFfx to run Doom3 really well.

Agreed, a fully Dx9 engine would only just run on a 9700/GFfx. With a Dx9-only engine you would have soft shadows as standard, BRDFs, high-poly characters AND high texture resolution and many lights.

Doom3-style rendering is well over 3 years old, but that was purely research. Even Carmack's Reverse predates Carmack's usage by several years. What JC is doing is making it actually work outside of a small demo.
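As an aside, the counting rule behind depth-fail stencil shadows ("Carmack's Reverse") fits in a few lines. This is a toy 1-D model of the rule only, not id's implementation: along a ray from the eye, a shadow volume is treated as a depth interval, and stencil counting is done against faces that fail the depth test.

```python
# Toy 1-D model of depth-fail ("Carmack's Reverse") stencil shadow counting.
# A shadow volume along a ray is the interval [front_face_depth, back_face_depth].
def in_shadow_depth_fail(fragment_depth, volumes):
    """True if the fragment lies inside any shadow volume.

    Depth-fail rule: for shadow-volume faces that FAIL the depth test
    (i.e. lie behind the fragment), increment the stencil on back faces
    and decrement it on front faces. A non-zero count means shadowed.
    """
    stencil = 0
    for front, back in volumes:
        if back > fragment_depth:    # back face behind fragment: fails depth test
            stencil += 1
        if front > fragment_depth:   # front face behind fragment: fails depth test
            stencil -= 1
    return stencil != 0

# A fragment at depth 5 inside a volume spanning [3, 8] is shadowed;
# fragments in front of (depth 2) or behind (depth 10) the volume are lit.
print(in_shadow_depth_fail(5.0, [(3.0, 8.0)]))   # True
print(in_shadow_depth_fail(2.0, [(3.0, 8.0)]))   # False
print(in_shadow_depth_fail(10.0, [(3.0, 8.0)]))  # False
```

The appeal of counting from behind the fragment (rather than the classic depth-pass version) is that it stays correct even when the eye itself is inside a shadow volume.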

I think everybody is underestimating (which is rare :) ) Mr JC. I reckon it will run 'well' (for some value of well) on a GF2-type platform (i.e. my guess is at least 20fps on a GF4MX at 640x480).
 
ET said:
Unfortunately, I don't think this will change. I really wish we had slow DX9 compatible cards instead of fast DX7 ones. It would have made things so much better.

Are you forgetting that the NV34, with only 45M transistors according to rumors, is DX9 compliant? And even CineFX compliant, it seems. Sure, it uses a lot of software to attain such feature support, but programmers don't have to worry about it; it's transparent to the API.
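The "transparent to the API" idea can be illustrated with a toy shim. Every name here is made up for the sketch (this is not NVIDIA's driver interface): the application only ever sees the reported capability, while the driver silently chooses hardware or software execution.

```python
# Toy sketch of API-transparent feature fallback. All names are hypothetical.
class ToyDriver:
    def __init__(self, hw_supports_ps20: bool):
        self.hw_supports_ps20 = hw_supports_ps20

    def supports_pixel_shader_2_0(self) -> bool:
        # The application only ever sees this answer...
        return True

    def run_pixel_shader(self, shader: str) -> str:
        # ...while the driver quietly picks hardware or software execution.
        if self.hw_supports_ps20:
            return f"hw:{shader}"
        return f"sw:{shader}"

driver = ToyDriver(hw_supports_ps20=False)
print(driver.supports_pixel_shader_2_0())  # True either way
print(driver.run_pixel_shader("fog"))      # falls back to "sw:fog"
```

The catch, of course, is that "supported" and "usably fast" are different claims, which is exactly the debate in this thread.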


Uttar
 
Uttar said:
Are you forgetting that the NV34, with only 45M Transistors according to rumors, is DX9 compliant?

I wouldn't be too sure on NV34 being DX9 compliant...
 
BRiT said:
Uttar said:
Are you forgetting that the NV34, with only 45M Transistors according to rumors, is DX9 compliant?

I wouldn't be too sure on NV34 being DX9 compliant...

As I said, the NV34 *hardware* probably isn't DX9 compliant. But in practice, to the API, it's just as if it were.


Uttar
 
Here too the MX440s are about the same price - sometimes more, sometimes less than the Radeon 9000XTs. (Pros are ridiculous, often very near the price of an 8500LE/9100.)

Also, I'm pretty sure that Quake was indeed developed totally for software rendering originally. If I recall, 3dfx were the ones who convinced JC to develop GLQuake, either just after Quake was released or not long before. Definitely, during a lot of the Quake days no one, including me, had really heard of 3D cards or thought much of them. I was still using my good ole S3 Trio64 (no, it didn't actually have many problems; as I have said many times elsewhere, S3 2D cards weren't really that bad IMHO, and neither were Trident's). I remember downloading the Quake multiplayer test, one of the first games to have a public multiplayer test, and finding out from the internet how to get the hidden models, among other things. I also remember thinking it was way overhyped (especially in PCZone), and even more so when it was released... Well, that's me...
 
Just a note on the OT quake-related realm:

First there was Quake -- a software-only renderer, aided by various linear frame-buffer implementations. Second there was VQuake -- a hardware renderer for Verite chips. Lastly there was GLQuake -- a hardware renderer that used a very small subset of OpenGL (a la miniGL) suited for 3dfx Voodoo1 hardware.
 
I think everybody is underestimating (which is rare) Mr JC. I reckon it will run 'well' (for some value of well) on a GF2-type platform (i.e. my guess is at least 20fps on a GF4MX at 640x480).

I agree with that very much; I just had a different value of well in mind. When I say "it will take a 9700/GFfx to run Doom3 really well," I'm thinking on the order of 60fps, full details and high AF, at either 1024 with 4xAA, 1280 with 2xAA, or 1600. (Which evidently means I'm banking on a decent speedup between the alpha and final versions.) But obviously it's vaguer than that. :)

Basically I mean a level high enough that there is not a large qualitative difference between that and running it "perfectly". Now that I quantify it as I have above, though, I'm beginning to think we might need to "wait" for the R350/NV35 to run Doom3 really well!
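For what it's worth, the three settings above land in the same rough workload ballpark. A back-of-the-envelope count, assuming 4:3 resolutions and that n-x multisampling costs about n samples per pixel (a simplification of real AA cost):

```python
# Rough per-frame sample counts for the settings mentioned above.
# Assumes 4:3 resolutions; n-x AA modelled as ~n samples per pixel.
def samples_per_frame(width: int, height: int, aa: int) -> int:
    return width * height * aa

for name, (w, h, aa) in {
    "1024x768 4xAA": (1024, 768, 4),
    "1280x960 2xAA": (1280, 960, 2),
    "1600x1200 no AA": (1600, 1200, 1),
}.items():
    print(f"{name}: {samples_per_frame(w, h, aa) / 1e6:.2f}M samples/frame")
```

That works out to roughly 3.1M, 2.5M, and 1.9M samples per frame respectively, so the three options are within a factor of two of each other, which is why they read as interchangeable "really well" targets.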
 