Which card is the king of the hill (NV40 or R420)

Which card is the king of the hill (NV40 or R420)

  • NV40 wins
    Votes: 0 (0.0%)
  • They are equally matched
    Votes: 0 (0.0%)
  • Total voters: 415
Status
Not open for further replies.
I'd have to say that at this present time I feel the R420 edges it, but as time goes on I think we'll get a better idea of how the land really lies. Right now we're just entering the period of "this is invalid because..." and by the end we'll actually know what these cards can REALLY do :)
 
On the EQ2 - AF thingy: I'm surprised nobody here has pointed out yet that AF is cheap performance-wise nowadays.

This discussion might have taken place 3 yrs ago, but this is ridiculous. Suppose EQ2 is slow; then it will be only slightly slower with AF turned on. Solution: don't play it at all, especially if the gameplay's anything like the old EQ.

Back to the original discussion, unless Ati manages to significantly improve OGL performance I think the X800xt shouldn't be crowned king of the hill yet. The 6800 can still beat the X800 in a few popular benches and the 6800 does have some extra bells and whistles.

The Ati card still gets my preference because it's a normal single slot solution with a modest cooler and lower power consumption, but there's no reason why the King of the hill couldn't be fat and bloated.
 
Well I voted "even" but after thinking about it more I would side with ATI this round.

ATI is faster, smaller, quieter, cooler and has better IQ.

Yes, Nvidia has PS3.0, and I sometimes think I may end up missing something if I buy ATI and go without PS3.0, but then I start thinking about what I would be missing all the time because of IQ-lowering optimizations made to keep up appearances in benchmarks.

The effects of the many outweigh the effects of the few, or the one.
 
L233 said:
ChrisRay said:
Why will AF not be feasible in EQ2? Just curious, haven't heard anything about that.

They demonstrated it at FanFaire on 3 GHz, 1 GB RAM machines with 256MB GF 5950 video cards, and the best they could do was medium settings for graphics. It was also mentioned that a GF6800U won't be good enough to play it with all the eye-candy turned to max.

Doesn't sound like there was any room to waste fillrate on AF/AA.

But usable. Keep in mind that playable is completely subjective; it depends on the resolution you play at, etc. Personally I'd rather play at 800x600 with 4x AA than 1024x768 with no AA, but that's me ;) But I am in line to beta test the game.

So hopefully I'll experience it first hand.
 
Sandwich said:
Back to the original discussion, unless Ati manages to significantly improve OGL performance I think the X800xt shouldn't be crowned king of the hill yet. The 6800 can still beat the X800 in a few popular benches and the 6800 does have some extra bells and whistles.

uhm.. what benches are ogl again? q3 doesn't need more optimisation to beat a geforce; those several hundred fps are more than enough. any recent opengl game in the benches? today's opengl is quite powerful, and works great on ati hw. i'm not aware of any opengl game that uses any of these features.

i don't care if ati doesn't really bother about old games. they run well enough. and all the q3 games i know of run well enough.

but i could be wrong, so please enlighten me (but yes, today's opengl can run very fast.. this i know for a fact, programming with it)
 
davepermen said:
uhm.. what benches are ogl again? q3 doesn't need more optimisation to beat a geforce; those several hundred fps are more than enough. any recent opengl game in the benches? today's opengl is quite powerful, and works great on ati hw. i'm not aware of any opengl game that uses any of these features.

i don't care if ati doesn't really bother about old games. they run well enough. and all the q3 games i know of run well enough.

but i could be wrong, so please enlighten me (but yes, today's opengl can run very fast.. this i know for a fact, programming with it)

Off the top of my head: OGL games like CoD, NWN and JK run faster on a 6800.
 
Sandwich said:
davepermen said:
uhm.. what benches are ogl again? q3 doesn't need more optimisation to beat a geforce; those several hundred fps are more than enough. any recent opengl game in the benches? today's opengl is quite powerful, and works great on ati hw. i'm not aware of any opengl game that uses any of these features.

i don't care if ati doesn't really bother about old games. they run well enough. and all the q3 games i know of run well enough.

but i could be wrong, so please enlighten me (but yes, today's opengl can run very fast.. this i know for a fact, programming with it)

Off the top of my head: OGL games like CoD, NWN and JK run faster on a 6800.

All 3 of those games are based on engines that are at least a year and a half old. And it's like a difference from 200 to 300 fps. Who cares, once you're above 100...

Until opengl produces a game that is actually released and on store shelves (not doom 3 vaporware) that looks as good as far cry, I will not care about their benchmarks. Hell, any opengl engine nowadays is so old that Ati's LAST generation of video cards could run it maxed out...
 
and none of these look like they couldn't run on dx7 hw, without even needing dx8 hw? but definitely, none of these games use any modern technology, and, based on their age, none uses any of the modern gl1.4 or gl1.5 features, which give big performance and feature gains.
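
For readers wondering what those gl1.5 features buy in practice, the headline one is vertex buffer objects: geometry is uploaded into a buffer the driver can keep in video memory instead of being resubmitted from system memory every frame. A rough sketch follows (assuming a Mesa-style setup where GL_GLEXT_PROTOTYPES exposes the GL 1.5 entry points; on Windows you'd fetch them through wglGetProcAddress or a loader, and the triangle itself is just filler):

```c
/* Rough sketch of the GL 1.5 vertex buffer object path:
 * upload geometry once, then draw from the buffer each frame. */
#define GL_GLEXT_PROTOTYPES 1   /* expose GL 1.5 prototypes on Mesa-style headers */
#include <GL/glut.h>

static GLuint vbo;

static void init_vbo(void)
{
    const GLfloat verts[] = {
        -0.8f, -0.8f, 0.0f,
         0.8f, -0.8f, 0.0f,
         0.0f,  0.8f, 0.0f,
    };
    glGenBuffers(1, &vbo);                       /* core as of GL 1.5 */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    /* With a buffer bound, the "pointer" is a byte offset into the VBO. */
    glVertexPointer(3, GL_FLOAT, 0, (const void *)0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("vbo sketch");
    init_vbo();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```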
 
Oh very well, I can't really argue with that. Why are all the games with pretty new features all directx anyway? Are game devs going to abandon ogl?
 
opengl was quite a mess during the last few years, gf3 and after: tons of extensions, no real new standard that worked on cards in a good way, including in the nv30 days.. now nvidia, ati, and the rest sat together and started to work out all the things opengl missed which dx grasped. opengl caught up, the last things are getting worked out, and in some features it currently beats out dx9.

nowadays it's something to invest in again, but for quite a while after q3 it wasn't. microsoft was then, together with the xbox, able to get all developers to work with dx..

possibly opengl gets more attention again.. i hope so, at least. support in drivers for the new gl is good on all hw..
 
Sandwich said:
Oh very well, I can't really argue with that. Why are all the games with pretty new features all directx anyway? Are game devs going to abandon ogl?

I doubt it. If you want to use the full capabilities of the NV40 you'll need to use OpenGL, and the same applies to the R420. If you use vendor-specific extensions you can do things in an optimal way, things that may not even exist in Direct3D. :)

Direct3D is great if you want a very standardised target for your game, but you still have to query capabilities and write different paths for different hardware levels.

OpenGL is great if you want to use vendor-specific capabilities for your game, but you still have to query extensions and write different paths for different hardware levels.

FWIW I write OpenGL software using things like VBO, GLSL, Nvidia's Cg, and the accumulation buffer on the Radeon R3x0s. I don't feel like I'm missing out by using OpenGL. :)
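
A minimal sketch of what "query extensions and write different paths" can look like in practice. GLUT is used only to get a context up before calling glGetString, and the particular fallback choices here are illustrative rather than anything from the post above:

```c
/* Minimal sketch: create a GL context, then check which extensions the
 * driver exposes before picking a render path. */
#include <GL/glut.h>
#include <string.h>
#include <stdio.h>

/* Returns 1 if 'name' appears in the driver's extension string. */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && strstr(ext, name) != NULL;
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("extension check");   /* a context must exist before glGetString */

    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));

    if (has_extension("GL_ARB_vertex_buffer_object"))
        printf("VBO path available\n");            /* fast path on R3x0/NV40-class hardware */
    else
        printf("falling back to plain vertex arrays\n");

    if (has_extension("GL_ARB_fragment_shader"))
        printf("GLSL fragment shaders available\n");
    else if (has_extension("GL_ARB_fragment_program"))
        printf("ARB fragment programs only\n");
    else
        printf("fixed-function fragment path\n");

    return 0;
}
```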
 
420


I can't understand why, given the power/heat requirements, none of the reference cards has an Abit OTES-style cooler - particularly the 6800 Ultra Extreme Doohickey. I guess that's what the card manufacturers will include as a selling point for their product.
 
Pretty simple really, people who want to fiddle with programming etc. would be daft not to go with NV40 to access SM 3.0. I'd guess this is only a very small part of the market, however.

SFF fans are going to go with ATI and Linux fans will go with NV.

Doom 3 will (I expect) be faster on NV whereas Gabe Newell has already stated HL2 is currently much faster on X800.

DX9 games will probably be better on ATI with OpenGL games better on NV. It will be interesting to see how much ATI can manage to improve their performance in OpenGL and see if they can improve their support for Linux. It will also be interesting to see how much NV drivers can improve.

As usual the battle lines will really be drawn in the $200-$300 market so the main fight will be between RV4X0 and NV41/whatever.

Hopefully a third party will also release something to compete soon. ;)
 
Bah, I decided I'd vote for the R420. It is clearly a better design. In terms of performance both are pretty close, but NV is already stinking things up with drivers. I also think they are really going to have difficulties supplying these products en masse. I read somewhere that the GT is not a 12-pipeline design but rather a 16-pipeline design down-clocked from the Ultra variety, so I don't see a lot of these cards being produced in volume. NV will definitely have difficulties with their yields; 222 million transistors on a .13 micron process is going to be difficult. You can already buy the 12-pipe variety of the R420, which I think speaks volumes about ATi and their execution. Further, for the enthusiast crowd who use high res and AA/AF (the crowd these cards are focused on), the R420 outperforms. PS3.0 is a great feature, but it really isn't of any value today nor anytime soon; there is not even full API support for it yet AFAIK.
 
I voted that they were both equal, which I feel they are.

For me, I think I may go 420 this round. The main reason is that both cards seem to have some advantages and disadvantages when you compare them. Given that, I would have to base my choice on other factors, like price, availability, etc. What would throw me toward ATI is the fact that over the past 12 months ATI has been on higher moral ground. I know business and ethics don't belong together in the same sentence (even though my company requires 20 hrs of business ethics training every year), so while most companies don't seem to care about ethics, IMHO one of them was a bit better in this area.

Also, PS3.0 is an interesting point. Almost enough to sway me over. However, I remember buying the original GeForce card so I could get the 100 TnL games by Xmas. I remember getting the 8500/GF3 Ti500 so I could play all of those PS1.0 games that both ATI/NV said were coming. In both cases I got duped into thinking that these features would show up in games within the lifetime of the card. I do not fault either company for pushing this; it's just the way developers take up new tech (at a slower pace). Thus when the 9700pro came out I got it only because of AA/AF and framerate. I did not want to get into the whole DX9/PS2.0 game thing again; I looked at the 9700pro's support for this as nice but not a factor. So I am not counting on PS3.0 support to be all that right now. Sure it will be used and sure it's nice to have, but I'd rather bank on what I know the card can do now, with the future being a nice factor but not a major one.....



Chalnoth said:
Still, I think it's definitely a good idea, and it should become standard by the time the next generation comes around. I just don't see a "wow" anywhere in relation to the technique.

Maybe not for the technique itself, but from an end gamer's point of view it can make a difference. I remember the first time I was able to get UT to use those high-res textures. WOW, night and day difference!!!
 
Personally my vote goes to ATI in a big way. Why? First off, I think I have read most of the important sites' reviews and by far everyone pretty much agrees that this is the case. Yes, there are a few that like to play the "we are biased" game, but if you can read through the crap, I don't see how it can be seen any other way. Performance is there; playable framerates with eye candy at max are there.

PS 3.0 has a future and I'm glad Nvidia took the plunge, but my doubts are about implementation. Developers may go that route in the next 12 months, but the slowness of PS 2.0 coming to market leads me to believe otherwise. PS 3.0 is great for the future, but not useful for a while. Also, considering Nvidia's track record for PS 2.0 support has been less than stellar, I'm wondering how well the hardware will actually work under 3.0; we'll eventually see. I'm also certain ATI is very aware of what PS 3.0 will bring to the table. If they felt it was something developers really intended to support in the near future, don't you think ATI of all companies would have gone that route, given their current track record? ATI has pushed PS 2.0 to the forefront and its hardware runs it very well; can you say the same about Nvidia's current products?

Product to market, well, what can be said. Nvidia loves to flaunt paper while ATI puts a product on the shelf. This tells me that ATI is actually on schedule with their X800 production, meaning this is not a preview product like Nvidia's but a produced product for review. That alone tells me that ATI is not in the game to blow smoke up everyone's ass; they are serious. Nvidia has a lot to learn from ATI in this regard.

The funny thing: Nvidia sends out new drivers to all the review sites (61.11, even though broken) that score higher in some benchmarks, one in particular being Far Cry, a day or so before the ATI launch. HMMMM. They get favorable review sites to actually bench with them. Then the day ATI launches, they have a new version of an already-confirmed product (the 6800) that they sent out to all review sites not 3 weeks ago, stating those were the specs of their flagship card. But oh, we forgot to mention that we have an extreme ultra version dubbed the 6850. Amazing, right out of the blue! Why? Can we see your new card please? Can we review your new card please? How many will you be producing? When can I buy one? Now we have a new product once again, the 6800GT. Hmmm, the above applies once again.

Trust: unless you have been hibernating, what did we see from Nvidia this time last year? Cheats (now dubbed bugs) in benchmarks and flybys, sacrificing IQ for FPS, etc. Am I willing to turn the page? Yes, when I see they have changed. What am I seeing though? Fartcry? Driver enhancements (that break more than they fix) for a game that they knew would be benched and is part of the TWIMTBP program (again!). How much trust can you place in the product of a company with such a record? My gut feeling is they haven't changed; time will tell. As a consumer I don't want to hear that they need to resort to underhanded tactics to make their product look good. What did they say (or was it rumor), that they spent 1 billion dollars on this generation? Excuse me, that should have gotten them a damn good series of cards that can play clean; if not, they have a bigger problem than you want to know. Oh, and given their track record I'm all for dissecting their drivers for the next 12 months!

Let's face facts: Nvidia has been a reactionary company for the past couple of years. They don't like being second best (I can agree, no one does), but look at their track record; it speaks volumes.

I could go on about a lot of other things that I don't appreciate, like dual molex connectors, a bigger PSU, a huge 3-slot fan, etc., but my points are very clear as is.

Thanks for the time guys
 