Radeon 9800 256meg reviews ......

elroy said:
Exactly Russ. Imagine how fast NV40 will be! (Assuming that it has 8 pixel pipes, of course.)

Nah, NV30/35 only has 4 "pixel pipes", but those are 4x2 pipes (in terms of writing pixels with color.) So it has more "texel" fill rate per clock than R350, which is 8x1.

If NV40 is 8x2, that would be the large leap you are thinking of over the NV3x.

In short, NV3x isn't "1/2" of the R3xx. In fact, given the apparent pixel shading tweaks in NV35 over NV30, in most game situations (multitexturing), the NV35 is roughly equal to the R350 on a clock for clock basis.

At this point in time, the NV35 has a clock speed advantage over the R350, which is why it generally holds a performance advantage over the R350.
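The pipe-configuration point above is just multiplication: pixel fill rate is pipes × clock, while texel fill rate also multiplies by texture units per pipe. A minimal sketch, using the commonly cited clocks of the day (5900 Ultra around 450 MHz core, 9800 Pro around 380 MHz) as assumptions rather than spec-sheet facts:

```python
# Hedged sketch: theoretical fill rates of a "4x2" vs an "8x1" design.
# Clock figures are assumed from contemporary reports, not official specs.

def fill_rates(pipes, textures_per_pipe, clock_mhz):
    """Return (pixel, texel) fill rates in Mpixels/s and Mtexels/s."""
    pixel = pipes * clock_mhz
    texel = pipes * textures_per_pipe * clock_mhz
    return pixel, texel

nv35 = fill_rates(pipes=4, textures_per_pipe=2, clock_mhz=450)  # "4x2"
r350 = fill_rates(pipes=8, textures_per_pipe=1, clock_mhz=380)  # "8x1"

print(nv35)  # (1800, 3600) - fewer pixels/clock, but more texels
print(r350)  # (3040, 3040) - equal pixel and texel throughput
```

This is why the two chips trade blows depending on workload: single-textured pixel output favors the 8x1 design, while multitextured passes let the 4x2 design use its second texture unit per pipe.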
 
Good point Joe. However, I assume that they will also increase the number of vertex engines/pixel shader units in NV40 as well. Actually, if the rumours of NV40 having unified PS/VS units are true, trying to compare competing architectures is only going to get more complicated.
 
Evildeus said:
MuFu said:
Evildeus said:
Not really, but i think (well from what i saw ;)) that the 5900U will be the best product available in june.

Depends on your moral standing I guess. :LOL:

MuFu.
Well, we will see in less than 1 hour where my/your moral standing is ;)

Well, Evildeus, I waited the hour, and just where is the "best product available in june"? Overall, I see a parity here, not the asswhipping many had predicted. I doubt many who own a radeon 9700P/9800P would get rid of those cards for a 5900Ultra.... 5800Ultra owners, on the other hand, have reason........ ;)
 
martrox said:
Well, Evildeus, I waited the hour, and just where is the "best product available in june"? Overall, I see a parity here, not the asswhipping many had predicted. I doubt many who own a radeon 9700P/9800P would get rid of those cards for a 5900Ultra.... 5800Ultra owners, on the other hand, have reason........ ;)
Really? Seems we didn't see the same thing. I didn't see any Det 50.** scores, but the CAT 3.4 drivers (the joker from ATI, not yet certified it seems ;)) are already out and are at best on par with the NV35 :)
 
elroy said:
Good point Joe. However, I assume that they will also increase the number of vertex engines/pixel shader units in NV40 as well. Actually, if the rumours of NV40 having unified PS/VS units are true, trying to compare competing architectures is only going to get more complicated.

Yeah, all indications are that with NV40 and the R500 (used to be R400?), most of the talk of discrete PS/VS units will go out the window. ;)
 
Look, the bottom line here is everyone should be pretty happy... nVidia users now have a card that's at least comparable to ATI's. Bottom line, the NV30 just plain sucked, period (even nVidia admits that!). ATI users should be happy that they still have a more than competitive card, with better IQ. Both cards are very fast, and each has games it runs faster in. So, everything considered, they are pretty much even. Go buy what you want, you can't go wrong either way. Only a true fanboi could say one is a great deal better than the other......
 
256MB of DDR-II. Uhm, isn't the point of DDR-II to run the memory a hell of a lot faster than DDR-I? I thought DDR-II was slower clock-for-clock than DDR-I because of the latency issues, not to mention much more expensive. So why would ATI not increase the memory clock into the 400+ range?

Seems rather disappointing.
 
Natoma said:
256MB of DDR-II. Uhm, isn't the point of DDR-II to run the memory a hell of a lot faster than DDR-I? I thought DDR-II was slower clock-for-clock than DDR-I because of the latency issues, not to mention much more expensive. So why would ATI not increase the memory clock into the 400+ range?

Seems rather disappointing.

DDR-II should allow faster effective bandwidth for the same amount of power draw.

The reason, presumably, why ATI went with DDR-II for the 256 MB version, is to lower the power requirements and keep them within some desired spec (probably per OEM request.)

With 256 MB of DDR-I, they might have needed something like a 2 slot cooler...see nVidia 5900.
 