GeForce FX = GeForce 4600 Ultra?

way

Newcomer
It would seem that the GeForce FX, on paper, is about where a GeForce 4600 Ultra would have been. I was looking for the GeForce FX to really be a brand new part, but it seems like the same tech. I know we have no benchmarks as of yet, but if the FX is any faster than the 9700 Pro I'll be very surprised. I think Nvidia is in trouble with this new part; they totally underestimated ATI. Even an underclocked FX that costs less will have a hard time competing. I hope I'm wrong about the FX and it really does bring something new and better to the table.
 
Don't know what you think this post of yours will accomplish, buddy.

The GFFX has about as much in common with the old GF4 as the Radeon 9700 has with the Radeon 8500, i.e., not much at all.

There has never been a "GF4 4600 Ultra" announced, nor even rumored.

And considering Nvidia has said the GFFX will be faster than the 9700, it would be mighty strange if it were not. I certainly expect it to be, though by how much is open to debate, of course.

*G*
 
The Geforce FX line are DX9(.?) products.
The Geforce 4 line are DX 8(.1) products.

I don't see their "Ultra" nomenclature ever being given to a higher DX revision product.
 
The GFFX has about as much in common with the old GF4 as the Radeon 9700 has with the Radeon 8500, i.e., not much at all.

Pixel pipeline-wise you can certainly see the architectural evolution. The vertex shader end appears to be all new, though.
 
Well, technically speaking, while there are quite a few huge differences, changes, additions, and so on, the GeForce4 is just a highly evolved TNT.

TNT + Clockspeed = TNT2

TNT2 + TNT2 + HW TCL = GeForce256

GeForce256 + Additional TCUs = GeForce2

GeForce2 + Pixel / Vertex Shaders = GeForce3

GeForce3 + ...um... LMA2? = GeForce4.

Of course it's more detailed than that, but you can technically reduce it to that. I guess what Way is speculating is that

GeForce FX = GeForce4 + DX9 Pixel Shaders + Additional pixel pipe per TCU - DX8 Vertex Shaders + Massive Vertex FP array.
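If it helps, here's that same lineage as a rough table of pixel pipes x TMUs per pipe (the commonly cited configs, from memory, so treat them as approximate):

/* Rough NVIDIA lineage: pixel pipelines x TMUs per pipe,
   plus the headline addition each generation. Figures are the
   commonly cited ones from memory; approximate, not official. */
struct gpu_gen {
    const char *name;
    int pipes;          /* pixel pipelines */
    int tmus_per_pipe;  /* texture units per pipeline */
    const char *headline;
};

static const struct gpu_gen lineage[] = {
    { "TNT",        2, 1, "twin-texel multitexturing" },
    { "TNT2",       2, 1, "clock speed bump" },
    { "GeForce256", 4, 1, "hardware T&L" },
    { "GeForce2",   4, 2, "second TMU per pipe" },
    { "GeForce3",   4, 2, "programmable pixel/vertex shaders" },
    { "GeForce4",   4, 2, "LMA2, second vertex shader" },
    /* GeForce FX: Nvidia claims 8 pixels per clock, but the
       exact pipe organisation hasn't been made public yet. */
};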

It IS a pretty big jump though. I doubt it's still an evolution of the GF4... but we'll see once the S3TC results are in. :LOL: Keep your eyes on the Q3A sky! :LOL:
 
Tagrineth said:
Well, technically speaking, while there are quite a few huge differences, changes, additions, and so on, the GeForce4 is just a highly evolved TNT.
I'm bored :rolleyes:
 
...but we'll see once the S3TC results are in. Keep your eyes on the Q3A sky!

Considering the age of the specific game, the rendering power of recent accelerators, and the amount of onboard RAM, give me one good reason why someone would opt for texture compression in Q3A.

Finally, not even the GF4's hardware DXT1 decompressor uses the old 16-bit colour interpolation of its predecessors.
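For anyone who missed the old S3TC saga: a DXT1 block packs two RGB565 endpoint colours plus sixteen 2-bit indices, and the two in-between palette entries are interpolated from the endpoints. A minimal decode in plain C, taken from the published block format rather than any particular chip's hardware, shows where the precision matters:

#include <stdint.h>

/* Expand an RGB565 endpoint to 8 bits per channel. */
static void rgb565_to_888(uint16_t c, int rgb[3])
{
    rgb[0] = ((c >> 11) & 0x1F) * 255 / 31;
    rgb[1] = ((c >>  5) & 0x3F) * 255 / 63;
    rgb[2] = ( c        & 0x1F) * 255 / 31;
}

/* Decode one 4x4 DXT1 block (8 bytes) into 16 RGB texels. */
void dxt1_decode_block(const uint8_t *block, int out[16][3])
{
    uint16_t c0 = block[0] | (block[1] << 8);
    uint16_t c1 = block[2] | (block[3] << 8);
    uint32_t bits = block[4] | (block[5] << 8) |
                    (block[6] << 16) | ((uint32_t)block[7] << 24);
    int pal[4][3];

    rgb565_to_888(c0, pal[0]);
    rgb565_to_888(c1, pal[1]);
    for (int ch = 0; ch < 3; ch++) {
        if (c0 > c1) {  /* four-colour mode */
            pal[2][ch] = (2 * pal[0][ch] + pal[1][ch]) / 3;
            pal[3][ch] = (pal[0][ch] + 2 * pal[1][ch]) / 3;
        } else {        /* three colours + transparent black */
            pal[2][ch] = (pal[0][ch] + pal[1][ch]) / 2;
            pal[3][ch] = 0;
        }
    }
    for (int i = 0; i < 16; i++)
        for (int ch = 0; ch < 3; ch++)
            out[i][ch] = pal[(bits >> (2 * i)) & 3][ch];
}

Do that blend at 16-bit precision instead of expanding to 8 bits per channel first and the interpolated colours get quantised, which is exactly the banding everyone saw in the Q3A sky.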
 
Guess what Way is getting at though is that there doesn't seem to be a whole lot of new stuff for the average gamer to care about, like improved AA and aniso etc.
 
Ailuros said:
Finally, not even the GF4's hardware DXT1 decompressor uses the old 16-bit colour interpolation of its predecessors.

It's rather that they do some screenspace dithering to hide the artefacts.
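Roughly an ordered dither applied at the quantisation step, i.e. something like this (my own sketch of the idea, not what the hardware actually does):

/* 4x4 Bayer matrix for ordered dithering. Sketch only. */
static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Quantise an 8-bit channel down to a 5-bit grid, nudged by a
   screen-position-dependent offset so banding becomes noise. */
int dither_quantise5(int value, int x, int y)
{
    int step = 256 / 32;                  /* one 5-bit step = 8 */
    int offset = bayer4[y & 3][x & 3] * step / 16;
    int v = value + offset;
    if (v > 255) v = 255;
    return (v >> 3) << 3;                 /* snap to the grid */
}

The banding energy is still there; it's just spread into high-frequency noise that the eye tolerates much better.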
 
Humus said:
Guess what Way is getting at though is that there doesn't seem to be a whole lot of new stuff for the average gamer to care about, like improved AA and aniso etc.

Initially the GFFX isn't for the average gamer. Hopefully by the time it is priced that way there will be some benefit to the DX9 features.
 
Humus said:
Guess what Way is getting at though is that there doesn't seem to be a whole lot of new stuff for the average gamer to care about, like improved AA and aniso etc.

I've complained about that aspect myself, especially the aniso algorithm. Even though they're mostly software-based modifications, from the user's side 6xS and 8xS are an improvement over the current 4xS and 4xOGMS methods.

My biggest gripe with the NV30's AA pipeline is the lack of 4xRGMS, unless it's present and I've gotten something wrong here.
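For reference, OG vs RG comes down to where the subpixel samples sit. The textbook positions (illustrative only, not vendor-confirmed) look like this:

/* Subpixel sample offsets, pixel centre at (0,0), range [-0.5, 0.5].
   Textbook positions for illustration, not any chip's actual grid. */
typedef struct { float x, y; } aa_sample;

static const aa_sample og4[4] = {   /* 4x ordered grid */
    { -0.25f, -0.25f }, {  0.25f, -0.25f },
    { -0.25f,  0.25f }, {  0.25f,  0.25f },
};

static const aa_sample rg4[4] = {   /* 4x rotated grid */
    { -0.375f, -0.125f }, {  0.125f, -0.375f },
    { -0.125f,  0.375f }, {  0.375f,  0.125f },
};

On a near-horizontal or near-vertical edge the rotated grid gives four distinct coverage steps per pixel where the ordered grid only gives two, which is why 4xRGMS looks so much better on exactly the edges that need it most.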
 
Humus said:
Ailuros said:
Finally, not even the GF4's hardware DXT1 decompressor uses the old 16-bit colour interpolation of its predecessors.

It's rather that they do some screenspace dithering to hide the artefacts.

Someone shoot Unwinder then ROFL ;)
 
3dcgi said:
Initially the GFFX isn't for the average gamer. Hopefully by the time it is priced that way there will be some benefit to the DX9 features.

Mainstream and performance parts should be out in April. There won't be any tangible benefit from DX9 features in games by then, except for performance improvements (esp. if DOOM3 is released by then...).
 
mr said:
The Geforce FX line are DX9(.?) products.
The Geforce 4 line are DX 8(.1) products.

I don't see their "Ultra" nomenclature ever being given to a higher DX revision product.

Just for the record: AFAIK the whole GF4 family is only DX8.0 due to its PS v1.3 support (Edit: v1.4 = DX8.1)
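That split is visible from the API side too; a DX8 app typically picks its shader path from the caps bits, something like this sketch (device is a hypothetical, already-created IDirect3DDevice8):

#include <d3d8.h>

/* Sketch: choose a pixel shader path from the reported caps. */
void pick_ps_path(IDirect3DDevice8 *device)
{
    D3DCAPS8 caps;
    IDirect3DDevice8_GetDeviceCaps(device, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4)) {
        /* PS 1.4 path (Radeon 8500 class; 1.4 arrived with DX8.1) */
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 3)) {
        /* PS 1.3 path (GF4 Ti class) */
    } else {
        /* PS 1.0/1.1 fallback (GF3 class) */
    }
}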
 
Doom 3 is more DX8-class hardware than DX9... Doom 3 doesn't use displacement mapping, Vertex Shader 2.0, or PS 2.0...
The only real gain a DX9-class card will get you may be overall speed due to the improved hardware (faster memory modules, 256-bit bus).
Doom 3 doesn't require a DX9 card... it wants a 256-bit bus and 800 MHz memory on DX8-class hardware like the Radeon 8500/GeForce 4.
 
T2k said:
Just for the record: AFAIK the whole GF4 family is only DX8.0 due to its PS v1.3 support (Edit: v1.4 = DX8.1)

When did DX8.0 support PS 1.3?
 