Which card is the king of the hill (NV40 or R420)?

  • NV40 wins

    Votes: 0 0.0%
  • They are equally matched

    Votes: 0 0.0%

  • Total voters
    415
L233 said:
Ardrid said:
I was considering that, but then I figured most ppl on this board know that. Besides, my main reason for quoting it for Natoma was to indicate the type of rig that was used.

Most people know that the Enermax 350W delivers more amperage on the 12V rail than most quality 430W PSUs? I am not so sure about that.

My point really is that Nvidia's 480W recommendation seems pretty accurate for most PSUs. It's on the safe side for sure (there are people who run four 10k rpm HDs in some RAID configuration and stuff) but not too far off.

Well, most of the ppl on this board are rather knowledgeable, wouldn't you agree? :) I would still say that NVIDIA is more or less playing it safe. I think the 480W figure is in reference to generic PSUs. Granted, the Enermax is in a class of its own when it comes to the 12V rail, but I don't think any quality PSU would have any issues.
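A rough back-of-the-envelope sketch of why the rating on the box matters less than the 12V rail; the amperage figures below are invented for the example, not taken from any actual spec sheet:

```python
# Back-of-the-envelope sketch: what matters for these cards is the wattage
# available on the 12V rail, not the total rating printed on the box.
# The amperage figures are invented for illustration, not from a spec sheet.
def rail_watts(volts, amps):
    return volts * amps  # P = V * I

quality_350w_12v = rail_watts(12, 26)  # a strong 350W unit: 312W on the 12V rail
generic_480w_12v = rail_watts(12, 18)  # a generic 480W unit: 216W on the 12V rail
print(quality_350w_12v, generic_480w_12v)
```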
 
If you look at the performance of the extreme NV40, it matches the performance of the X800 XT. I think NVIDIA targeted the clock speed too low with the Ultra.
 
Tough call. If I were spending someone else's money I'd pick up the X800 XT PE. Spending my own money, I see myself taking a good long hard look at that 6800GT. Perhaps ATI will release a lower-clocked 16-pipe product to slide in between the Pro and XT PE. ;)
 
Ardrid said:
I would still say that NVIDIA is more or less playing it safe. I think the 480W figure is in reference to generic PSUs.

Best way is to try it - didn't the 9700 Pro say it needed a 350W PSU? I ran it on my 250W for nearly 2 years without an issue before I upgraded my case etc. :)
 
People really do make too much of the PSU as an issue; it's amazing how many people jump on it when performance issues occur - YOU DON'T HAVE ENOUGH VOLTS, etc. ;) Oh well...
 
Hrmm what's this ... farTcry thing? :LOL:

http://translate.google.com/transla...e=UTF-8&oe=UTF-8&prev=/language_tools

However, if you rename the file "FarCry.EXE" to, for example, "FartCry.EXE", the benchmark result at resolutions of 1,024 x 768, 1,280 x 1,024 and 1,600 x 1,200 pixels drops by around 10 frames per second in each case (!).

 
NV40 all the way; ATI can't compete in the price segment I am interested in. The X800 XT looks like the better card at $500, but I am not interested in $500 video cards. While $400 is still quite a stretch, I am willing to pay it this time (because of EQ2).

Nvidia's last minute announcement of the 6800GT has rendered the X800 Pro a failure.

The 6800GT offers more performance and a more advanced feature set. It's a no-brainer. Unless ATI's new shader compiler thingy increases the X800 Pro's performance considerably, I see exactly zero reason to pick that one over the 6800GT. Hell, the 6800GT might even be the better overclocker because, thanks to its 16 pixel pipelines, you get a bigger performance increase per MHz overclocked.

While the X800 Pro draws less power (10-30W less) it's not that big a deal. Also, I have full confidence that MSI will ship a 6800GT card with a quiet and efficient cooling solution.

I might get interested in the X800 Pro if ATI dropped the price by $50...
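To put a rough number on the "bigger performance increase per MHz" point above: pixel fillrate scales roughly as pipelines × core clock. A quick sketch, using the commonly quoted stock clocks purely for illustration:

```python
# Quick sketch: pixel fillrate (Mpix/s) ~ pipelines * core clock (MHz).
# Clock figures are the commonly quoted stock values, used only for illustration.
def fillrate_mpix(pipes, core_mhz):
    return pipes * core_mhz

gt_stock  = fillrate_mpix(16, 350)   # 6800GT:   5600 Mpix/s
pro_stock = fillrate_mpix(12, 475)   # X800 Pro: 5700 Mpix/s

# A +50 MHz overclock is worth more on the 16-pipe part:
gt_gain  = fillrate_mpix(16, 400) - gt_stock    # +800 Mpix/s
pro_gain = fillrate_mpix(12, 525) - pro_stock   # +600 Mpix/s
print(gt_gain, pro_gain)
```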
 
L233 said:
The 6800GT offers more performance and a more advanced feature set. It's a no-brainer.

Well, if this is accurate, wouldn't you say that the GT would be hard pressed to keep up with the Pro?

[benchmark chart]
 
Nvidia's last minute announcement of the 6800GT has rendered the X800 Pro a failure.

lol.... gonna keep that quote... some failure that beats or is on par with the 6800 Ultra in a fair few tests, don't ya think? :rolleyes:
 
Natoma said:
But 3Dc is just :oops: in terms of "wow" factor.
I don't get it. All they did was show you lower resolution normal maps vs. higher resolution maps.

If you pay attention to how games actually use compressed textures, they pretty much never bother to have special high-res textures only for video cards that support the compressed format (there was one that did this: UT).

This compression is a performance advantage, sure, but just how much of one depends on how much of the texture memory usage is normal maps.

3Dc really is no different, from a programming standpoint, from current texture compression techniques. In other words, it will be used in the exact same way by game developers: to increase performance. The performance increase will almost certainly be less than half of what we saw in the past from enabling DXTC, since DXTC compressed all textures at the time. I claim that this normal map compression technique will not produce nearly as much of a performance increase.

Still, I think it's definitely a good idea, and it should become standard by the time the next generation comes around. I just don't see a "wow" anywhere in relation to the technique.
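For anyone curious what 3Dc actually stores: roughly speaking, only the X and Y components of each unit normal go into the compressed texture, and Z is rebuilt per pixel. A minimal sketch of that reconstruction (my own illustration, not ATI's documentation or shader code):

```python
import math

# Sketch of the idea behind 3Dc (illustration only, not ATI's spec): store
# just the X and Y components of each unit normal in the compressed texture
# and rebuild Z in the pixel shader from x^2 + y^2 + z^2 = 1.
def reconstruct_z(x, y):
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

print(reconstruct_z(0.6, 0.0))  # -> 0.8

# The commonly quoted ratio is up to 4:1 versus an uncompressed 32-bit normal
# map, so the saving per normal map is large -- but, as argued above, the
# overall frame-rate gain depends on how much of the texture budget is
# normal maps in the first place.
```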
 
Ardrid said:
Well, if this is accurate, wouldn't you say that the GT would be hard pressed to keep up with the Pro?

I don't particularly care about the fact that ATI's AA and AF implementations seem to be more efficient - and that's what's skewing the picture here.

Since EQ2 won't run at highest settings even on the upcoming high end cards, I am not even expecting to use AA and for all other games the 6800 is fast enough at 4x AA. Raw fillrate/shader power is more important to me and I don't play at resolutions higher than 1280x1024.

Also, Brad McQuaid mentioned that Vanguard will support PS 3.0 and for me that's a big plus for the 6800GT.
 
Eronarn said:
L233 said:
I might get interested in the X800 Pro if ATI dropped the price by $50...

Remember, the X800 has a good two months to drop prices before the GT comes out. 8)

Good point. I'll see how prices are at the time EQ2 is released (or when I get into beta).
 
L233 said:
Good point. I'll see how prices are at the time EQ2 is released (or when I get into beta).

I'll be running it on a R9700 Pro personally... hooray, lowest res and settings possible for 40FPS! :D
 
TAA did it for me; the R420 was able to stay above the 60 fps threshold with 6xAA in almost every game benched. To me this means I get to enjoy 12xAA in most of my games. I have seen TAA in action and it really is that good; you just need high frame rates.

3Dc seemed pretty good too. I don't think PS 3.0 and VS 3.0 will make any huge difference, but I am excited about BW2.
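For anyone wondering why the 60 fps threshold matters for temporal AA, here is a conceptual sketch (my own illustration, not ATI's actual driver logic): the sample pattern alternates every frame, and below the threshold the driver falls back to a fixed pattern so the alternation doesn't show up as flicker.

```python
# Conceptual sketch of temporal AA (illustration only, not ATI's driver code):
# two different 6-sample patterns alternate frame by frame, so at a high and
# steady frame rate they blend perceptually toward 12x coverage; below the
# threshold the alternation would be visible as shimmer, so a single fixed
# pattern is used instead.
def choose_pattern(frame_index, fps, threshold=60):
    if fps < threshold:
        return "fixed 6x pattern (temporal AA effectively off)"
    return "6x pattern A" if frame_index % 2 == 0 else "6x pattern B"

for frame in range(4):
    print(frame, choose_pattern(frame, fps=85))
```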
 
From the various benchmarks it seems the X800 XT is on par with the 6800U (with the exception of Far Cry, of course).

In OpenGL games, the 6800 is superior; in DirectX 9 games with Pixel Shader 2.0, the X800 XT is superior.

Taking these into account, I reckon there is no clear winner.
Obviously, the X800 XT dissipates less heat than the 6800U, and the 6800U has the clear advantage of more features, but when it all comes down to fps, I would have to say both are equal in this round. Correct me if I'm wrong :p
 