NV35 Reviews

So far.... with the exception of the Doom results, I see little reason to buy one card over the other, except for the quality of the FSAA. I have to view this as a big win for ATI, as the 5900 is not doing the asswhipping that most fanATIs had feared and most nVidiots had predicted.

So, has nVidia reclaimed the "fastest" card available? Only partially so. ATI can still claim they are faster in as many benchmarks as nVidia can, plus they can say, without a doubt, they will look better doing it! So, is ATI smoking something hallucinogenic? No, but I’m sure, at the moment, ATI is a bit euphoric over these results.
 
For me, $299, much less $399 or $499, is way too expensive for something I could pick up for $100 next year. ;)
 
Hey, where's Hellbinder? He was asking me why I thought the 5900 Ultra wouldn't outperform the 9800 Pro. Because of the benchmarks of course, duh! :LOL:

MuFu.
 
MuFu said:
Hey where's Hellbinder? He was asking me why I thought the 5900 Ultra wouldn't outperform the 9800 Pro. Because of the benchmarks of course, duh! :LOL:

MuFu.

Boy, Mufu....when you are right, you are RIGHT!

Edit: although I think, for once, Hell will be happy to be wrong!
 
So far.... with the exception of the Doom results, I see little reason to buy one card over the other, except for the quality of the FSAA. I have to view this as a big win for ATI, as the 5900 is not doing the asswhipping that most fanATIs had feared and most nVidiots had predicted.

I agree. The only problem is that Doom3 is a rather big exception.

Although I agree with Joe that we need to wait a bit until we can verify the winner of that battle.

No, but I’m sure, at the moment, ATI is a bit euphoric over these results.

If the NV35 is running at full precision then I don't think ATI is too happy about the Doom3 benchmarks. The interesting question here will be: if it uses lower precision, how noticeable is it?
 
Bjorn said:
The interesting question here will be: if it uses lower precision, how noticeable is it?

I would trust [H] to have pointed out any noticeable differences in IQ between the two cards if they existed. Also, it is JC who determines the default paths and I'm pretty sure he wouldn't allow a large discrepancy, although of course there is always the possibility that the most preferable path for an NV3x product trades in a little IQ for *a lot* of performance by running half float or integer precision.
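As an illustrative aside (not something from the thread itself), the FP16-vs-FP32 trade-off being discussed can be quantified: IEEE 754 half floats keep only a 10-bit mantissa, so a round trip through FP16 loses precision that full float retains. A minimal Python sketch, using the standard `struct` module's half-precision `'e'` format; the example value is arbitrary:

```python
import struct

def roundtrip_fp16(x: float) -> float:
    """Quantize a Python float through IEEE 754 half precision and back."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A shader-style fractional value, e.g. a normalized color/light term.
full = 0.123456789
half = roundtrip_fp16(full)
error = abs(full - half)
# FP16 spacing near 0.125 is 2**-13, so the rounding error here stays
# below ~6e-5 -- small per operation, but it can accumulate over a long
# shader program, which is where visible banding would come from.
```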

C'mon HL2 benchmarks... :D

MuFu.
 
Bjorn said:
I agree. The only problem is that Doom3 is a rather big exception.

Although I agree with Joe that we need to wait a bit until we can verify the winner of that battle.

No, but I’m sure, at the moment, ATI is a bit euphoric over these results.

If the NV35 is running at full precision then I don't think ATI is too happy about the Doom3 benchmarks. The interesting question here will be: if it uses lower precision, how noticeable is it?

Don't you think that ATI can do a little work on their drivers for Doom3? If, in every other way, the 9800 is the equal of the FX5900, I'm sure there will be parity on this, too..... at least by the time Doom3 ships. In fact, I'd venture a guess that there will be a great deal of difference in this "benchmark" by the time the 5900 is available!
 
ATI would be well served to get those Doom3 scores up though. Regardless of one's opinion on the game, it is GUARANTEED to be a very high profile, very influential benchmark used extensively by web and print magazines.
 
jandar said:
"We're not exactly sure what's going on with the FX's antialiasing. It seems to completely blur the whole image. Sure, it gets rid of the stair effect at the bottom of the image, but at what cost? Looking at the whole picture, it seems as if there is a reasonable degree of blurring throughout. Quake 3, for example, doesn't show this kind of blurring with anti-aliasing used. We'll have to take a far closer look at the FX5900 Ultra's image quality over a variety of games. The limited time scope of this review didn't permit it. Anisotropic filtering appears good, antialiasing is something that needs looking at."


anyone else see this?

Yeah, HardOCP also found the same thing: http://www.hardocp.com/image.html?image=MTA1MjY1OTc3OGJRaTJVSURadEJfNl8yNl9sLmpwZw==

What's this Xabre-esque AA all about then? :-?
 
[q]If, in every other way, the 9800 is the equal of the FX5900, I'm sure there will be parity on this, too..... at least by the time Doom3 ships.[/q]

That would be assuming that the game was shader op limited. If the FX5900 is the equal of the R9800 in everything else, why doesn't it stand to reason that 'turning on' four extra pixel pipes is going to give a sizeable edge?
 
Hey, where's Hellbinder? He was asking me why I thought the 5900 Ultra wouldn't outperform the 9800 Pro. Because of the benchmarks of course, duh!

MuFu.
That's funny, I see the 5900 beating the 9800 Pro at almost every turn, especially when you look at all the reviews.

Also, I still don't understand your *moral* issue with the NV35/drivers. You must know something we don't. Is it their use of *special* application-specific Cg-compiled routines? Perhaps some shoddy driver hackery even with the details set to max??

Please.. why not just spill the beans? Is it really fair to let nVidia get away with shoddy junk if you know what it is??

Other than the above, I think that the NV35 looks really solid. Heck, I'd get one over a 9800 Pro in a second.
 
It is unlikely that the FP shader capabilities of the NV35 have significantly increased over the NV30. It would be nice, but it's just not likely.
 
BenSkywalker said:
[q]If, in every other way, the 9800 is the equal of the FX5900, I'm sure there will be parity on this, too..... at least by the time Doom3 ships.[/q]

That would be assuming that the game was shader op limited. If the FX5900 is the equal of the R9800 in everything else, why doesn't it stand to reason the 'turning on' four extra pixel pipes is going to give a sizeable edge?

Ben, can't you see by these results that there's something not right? Do you think ATI, or JC himself, would allow this to happen on a shipping product? I don't care how biased you are..... this won't happen.
 
Chalnoth said:
It is unlikely that the FP shader capabilities of the NV35 have significantly increased over the NV30. It would be nice, but it's just not likely.

The fact is, the benchmarks show the performance as significantly increased. This doesn't mean they changed the hardware drastically though, or that we're claiming the hardware was drastically changed....a few possibilities, for example:

1) For some reason, their shader implementation depends highly on memory bandwidth.
2) The NV30 implementation has a few critical paths that needed tweaking. Simply wasn't time to do it for NV30 launch. (Think about the GeForce1 pipelines, relative to the GeForce2 GTS pipelines...similar "architectural performance increase", but the hardware really isn't that different.)
3) Some driver hacks going on. (Similar to the ones that made 3DMark shader tests double on the NV30, these are making all shader tests virtually double on the NV35.)
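On possibility 1, the raw numbers at least make bandwidth a plausible factor. Using the commonly quoted board specs (NV30/5800 Ultra: 128-bit bus at 500 MHz DDR; NV35/5900 Ultra: 256-bit bus at 425 MHz DDR), a quick back-of-the-envelope sketch:

```python
def bandwidth_gb_s(bus_bits: int, mem_clock_mhz: float, ddr: bool = True) -> float:
    """Peak memory bandwidth in GB/s: bus width (bytes) x effective transfer rate."""
    effective_hz = mem_clock_mhz * 1e6 * (2 if ddr else 1)
    return (bus_bits / 8) * effective_hz / 1e9

nv30 = bandwidth_gb_s(128, 500)  # GeForce FX 5800 Ultra -> 16.0 GB/s
nv35 = bandwidth_gb_s(256, 425)  # GeForce FX 5900 Ultra -> 27.2 GB/s
# Roughly a 70% increase -- enough to matter if the shader pipeline is
# stalling on memory rather than on ALU throughput.
```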
 
The 3DMark03 PS improvements over NV30 are backed up by [H]'s ShaderMark figures.

MuFu.
 
Joe DeFuria said:
Chalnoth said:
It is unlikely that the FP shader capabilities of the NV35 have significantly increased over the NV30. It would be nice, but it's just not likely.

The fact is, the benchmarks show the performance as significantly increased. This doesn't mean they changed the hardware drastically though, or that we're claiming the hardware was drastically changed....a few possibilities, for example:

1) For some reason, their shader implementation depends highly on memory bandwidth.
2) The NV30 implementation has a few critical paths that needed tweaking. Simply wasn't time to do it for NV30 launch. (Think about the GeForce1 pipelines, relative to the GeForce2 GTS pipelines...similar "architectural performance increase", but the hardware really isn't that different.)
3) Some driver hacks going on. (Similar to the ones that made 3DMark shader tests double on the NV30, these are making all shader tests virtually double on the NV35.)

Ah - conspiracy theory, eh?
 
Ben, can't you see by these results, that there's something not right?

What I see is one core that looks damn near custom tailored for Doom3 and another that is far more general purpose. For your typical application the NV3X boards are only utilizing half of their pixel pipes, with the other half being useless. Running Doom3, those useless pixel pipes come into play. If a board is competitive, actually winning the majority of the time, under conditions where it can't use a decent segment of its hardware, why does it not stand to reason that the same board will distance itself considerably when it has all of its features enabled?

I've been expecting to see pretty much exactly what we are seeing in Doom3 in terms of relative performance.

Do you think ATI, or JC himself, would allow this to happen on a shipping product? I don't care how biased you are..... This won't happen.

Allow what to happen? The major 'bad choice' that nVidia made with their pixel pipe layout wasn't for absolutely nothing. They are not so stupid that they did it without knowing what the architecture's shortcomings and strengths were.

I'll ask you a question: do you think that developers should only allow graphics chips to use four pipes from now on to level the playing field between nV and ATi? Are they all biased because they don't do that? Doom3 is the poster child for the NV3X architecture from everything I've seen.
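The pipe-layout argument can be made concrete. As a rough sketch using the commonly cited per-clock figures (NV35 at 450 MHz doing 4 color pixels or 8 Z/stencil-only ops per clock; R350 at 380 MHz doing 8 of either), which board leads flips depending on which rate the workload stresses:

```python
# Hedged back-of-the-envelope: theoretical peak fill rates, not measured numbers.
def fill_rates(core_mhz: float, color_per_clk: int, z_per_clk: int):
    """Return (color pixels/s, Z/stencil ops/s) for a given core clock and layout."""
    clk = core_mhz * 1e6
    return clk * color_per_clk, clk * z_per_clk

nv35_color, nv35_z = fill_rates(450, 4, 8)   # GeForce FX 5900 Ultra
r350_color, r350_z = fill_rates(380, 8, 8)   # Radeon 9800 Pro

# Color fill (typical games): R350 leads, 3.04 vs 1.8 Gpix/s.
# Z/stencil fill (Doom3's shadow-volume passes): NV35 leads, 3.6 vs 3.04 Gops/s.
```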
 
I would be skeptical of the 3DMark03 ones, but the Tommti-Systems PS 2.0 benchmark shows a big improvement over NV30 (still behind R350 overall, though). If you wanted to be REALLY cynical, I suppose you could suggest that nVidia knew that [H] use that bench and 'optimized' for that one too, but that's probably getting a bit daft :)
 