My first full review - 9600SE vs 5600XT - Comments please

Hanners

Regular
I've finally published my first full video card review, and so I thought I should throw it open to the scrutiny of you guys, as I'm really keen to get an idea of what I could do better and what principles I should stick with in future.

You can take a look at the review, comparing a Hercules 9600SE and PNY GeForceFX 5600XT here.

So, now it's your turn - Throw me to the lions! :p
 
you ATI f@nboy! you didn't use the latest NVIDIA drivers for 3DMark--you know, the ones that re-enable their unified compiler!</sarcasm>

The problem--no IQ analysis. Would've liked to see AF, default texture quality, etc. compared on both cards. Also, no mention of brilinear, which is disappointing (although it's not paramount for a card with such a low price, a comparison would be nice).
 
How come there is absolutely no mention of the expected performance boost that dx9.1 is going to give to the whole FX line-up? :|
 
The Baron said:
The problem--no IQ analysis. Would've liked to see AF, default texture quality, etc. compared on both cards. Also, no mention of brilinear, which is disappointing (although it's not paramount for a card with such a low price, a comparison would be nice).

Fair point... But to be honest, I really just wanted to concentrate on how the cards play games given their nature - Hence giving comparison shots from each card at their highest playable settings, then leaving the reader to decide. I should probably have mentioned brilinear though, especially given that the 5600XT really has to run without AF to have any chance of performing okay.
 
All looked good to me :D Shame the FX 5600XT you had didn't provide any competition for the 9600SE in any benchmark though :p

Is the FX 5600XT really playable in Halo at 20fps average? :oops:

The methodology seemed sound enough; image quality is not so important for upper-budget cards, and there's not a huge difference between the two cards in terms of image quality.

I mean, you could go on about AF/FSAA quality, but you're not going to be using them on the FX 5600XT in most situations anyway :p

Not directly related to the review, but do your in-game experiences with Halo correlate well with the game's built-in benchmark?
 
I thought it was a well-rounded and well-written review. The realistic framerate discussion on page two was nice, something lacking in most reviews. I suppose I could do the legwork myself by analyzing a boatload of charts and finding which settings provide equivalent performance, but the average reader will and should prefer your method. Although it's pretty much impossible to compare IQ with two different-sized pics (unless I'm willing to display each full-screen on my monitor), props for including them anyway.

The 5600XT seems practically worthless for a gamer (in my snooty opinion), but I wonder how the 5500 will compare to a 9600SE, particularly if it comes in at the same price and with 128-bit memory? Heck, how do these cards compare to a cheap-o $75 Radeon 9100 128MB or $85 GF4 4200 64MB, both of which are available in the U.S. (albeit probably only online)? I look forward to a review. :)
 
digitalwanderer said:
How come there is absolutely no mention of the expected performance boost that dx9.1 is going to give to the whole FX line-up? :|
I can't believe you could even bring yourself to SAY that! :p
 
cthellis42 said:
digitalwanderer said:
How come there is absolutely no mention of the expected performance boost that dx9.1 is going to give to the whole FX line-up? :|
I can't believe you could even bring yourself to SAY that! :p
What I find even more telling is the fact that Hanners hasn't even addressed my question yet....what is he trying to hide?
 
Typo alert:

The Serious Sam graph is labelled The Grand Cathedral, but the text mentions it as The Last Cathedral. Other than that, nice work. :)
 
Is there something about NV's architecture vs ATI's that *requires* higher bandwidth per gpu core clock? Looking at these two cards, the NV is 235/266 while the ATI is 325/200. And ATI smokes 'em. So what's up with that? Either NV put higher clocked memory (read $$) on a low-end card than was necessary (which seems unlikely to me on a value card), or they need that extra bandwidth even at the lower core clock (compared to the ATI). Is there a third explanation? If the latter, can we generalize that up the product line? And what's causing it?
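For what it's worth, the raw numbers behind that question are easy to work out. Below is a back-of-the-envelope Python sketch; the 128-bit/64-bit bus widths and treating the quoted memory clocks as pre-DDR-doubling figures are my assumptions, not from the post, so check the actual boards before trusting the output:

```python
# Rough peak-bandwidth arithmetic for the two cards. The clocks are the
# core/memory MHz quoted in the post; the bus widths and the DDR doubling
# are assumptions (cut-down 64-bit variants of both cards exist).
def bandwidth_gbs(mem_clock_mhz, bus_bits, ddr=True):
    """Peak memory bandwidth in GB/s: clock x transfers/clock x bus bytes."""
    transfers = 2 if ddr else 1
    return mem_clock_mhz * 1e6 * transfers * (bus_bits // 8) / 1e9

for name, core_mhz, mem_mhz, bus_bits in [("FX 5600XT", 235, 266, 128),
                                          ("9600SE", 325, 200, 64)]:
    bw = bandwidth_gbs(mem_mhz, bus_bits)
    per_clock = bw * 1e9 / (core_mhz * 1e6)  # bytes available per core clock
    print(f"{name}: {bw:.2f} GB/s peak, {per_clock:.1f} bytes/core clock")
```

Under those assumed bus widths, the NV card actually has several times the bandwidth per core clock of the ATI one and still loses, which makes the question of whether NV3x simply needs more bandwidth per clock a fair one.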
 
That's actually a very nice review, so anything that follows should rather be considered as nitpicking.

I'd prefer to not see any 4xAA/1xAF comparisons at all, but rather 2xAA/2xAF scores for two reasons:

  • These are value-to-mainstream products, and despite the fact that there are higher modes than 2xAA/2xAF available, the real-time gameplay section shows that anything above 2xAA stresses performance way too much.
  • 2xAA/2xAF is the closest thing one can get for comparisons, while respecting the differences in implementation of both anti-aliasing and anisotropic filtering algorithms. Granted, there are still differences, but at the lowest possible level. In fact, I'd prefer reviewers to skip noAA/noAF modes from now on, even for higher-end offerings, and use 2xAA/2xAF as the minimum base level (with exceptions where AA, for example, doesn't work at all).

I'm also not so sure about the "Value for the money" graphs on the last page, despite Hanners' notes underneath. The comparison to higher-end solutions is out of place, IMHO. If you were to retest the same applications with all 4 cards at 1280*960*32 with 4xAA/8xAF enabled, for example, the picture would change immediately, and that's the real reason why someone would break their budget and opt for an R350 or NV35. Who buys high-end cards for 1024/noAA-AF anyway?

Other than that my congratulations for an excellent first attempt. Keep the good stuff coming and keep us informed whenever you have something new released. :)
 
dan2097 said:
Is the FX 5600XT really playable in Halo at 20fps average? :oops:

Of course, it's all pretty subjective, but it felt pretty playable to me. Not as playable as the 9600SE, but good enough to not be a total disaster.

dan2097 said:
Not directly related to the review, but do your in-game experiences with Halo correlate well with the game's built-in benchmark?

The timedemo and actual gameplay in Halo are two very different beasts, and shouldn't really be compared at all.
 
Ailuros said:
I'd prefer to not see any 4xAA/1xAF comparisons at all, but rather 2xAA/2xAF scores for two reasons:

  • These are value-to-mainstream products, and despite the fact that there are higher modes than 2xAA/2xAF available, the real-time gameplay section shows that anything above 2xAA stresses performance way too much.
  • 2xAA/2xAF is the closest thing one can get for comparisons, while respecting the differences in implementation of both anti-aliasing and anisotropic filtering algorithms. Granted, there are still differences, but at the lowest possible level. In fact, I'd prefer reviewers to skip noAA/noAF modes from now on, even for higher-end offerings, and use 2xAA/2xAF as the minimum base level (with exceptions where AA, for example, doesn't work at all).

I think that's an interesting point actually - In retrospect, I did wonder whether I should have gone through more possible AA and AF combinations. In the end, I decided to leave that to the real-world tests, where, as you can see, 2xAA was pretty much the best you could hope to use, and the performance hit for AF was low enough to allow 8x without too much trouble on the 9600SE in a lot of cases. Certainly an idea I'll take on board though.

Ailuros said:
I'm also not so sure about the "Value for the money" graphs on the last page, despite Hanners' notes underneath. The comparison to higher-end solutions is out of place, IMHO. If you were to retest the same applications with all 4 cards at 1280*960*32 with 4xAA/8xAF enabled, for example, the picture would change immediately, and that's the real reason why someone would break their budget and opt for an R350 or NV35. Who buys high-end cards for 1024/noAA-AF anyway?

To be honest - I'm not sure it works either in its current form. I was quite tempted to take it out altogether, but I kind of wanted to see how people reacted to it - i.e. is it completely useless, or would it be a useful facet of a review in a slightly different form?
 
"At this point, I should probably adhere to tradition and compare a huge long list of specifications and spout PR rubbish for the two cards in competition here, but let's be honest; nobody wants to hear about that stuff, especially seeing as we've seen it all a million times before anyway. So, without further ado, on to the benchmarks! "
Based on the products you were reviewing, which are aimed at the budget market, the specs/PR of each card are exactly what you should be mentioning. Even small differences in warranty etc. are really important. In order to make an informed purchase, the reader now has to visit two other sites. Maybe at the very least you should add a link on the page that goes directly to the specs.

"All tests have been run initially with AA and AF disabled, followed by runs with 8x AF, 4x AA, then both 8x AF and 4x AA together"
I'm a great believer that you should know the limits of the card you are testing... 8xAF/4xAA tests on these cards are completely pointless. No one will ever use them. 2xAA/2xAF would have been my choice right out of the box, then a little tinkering here and there to see if maybe one could be moved up to 4x in some apps/games.

Final scores for AquaMark and 3DMark would have been useful... someone looking for a card may not know how to, or want to, compare every single fps stat; instead they may just want to run an app and see how the score compares, then move on to the specific areas. (I see the 3DMark ones are in the OC section; however, some people may skip that, having no interest in OC'ing.)

Why no 3DMark01? It's a very useful indication of DX8 performance, useful on budget DX9 cards.

In the synthetic/timedemo section, if a mainstream/budget card is not "playable" at 1024x768 (e.g. Splinter Cell), then you really should show 800x600... if I were a reader, I'd like to know whether I could get any sort of playable framerate at lower settings or if it was a completely lost cause. It's pretty easy to edit the benchmark file in Splinter Cell to add resolutions.

Halo results: "and surprisingly ends in one of the closest contests between the two cards performance-wise"
I wouldn't say almost double the performance from the Radeon makes things close... it's all relative to the fastest performer, not the number of fps between them.

In the real-world tests, average fps is fine; however, min fps is also very useful in helping a reader determine what is acceptable for them. Take GTA:VC, for example. I now know that the two cards can happily get an average of 35+ fps... this could, however, mask the fact that the SE is dropping as low as 5-10fps in some areas, whereas the XT only drops to 25fps. I also feel that you should show the results for the two cards at the lowest setting (in VC's case 8xAF/0xAA, and then if one can handle 8xAF/2xAA, show that too). Again, it helps people compare and also shows the differences in performance more clearly.
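That masking effect is easy to demonstrate with made-up numbers (purely illustrative fps traces, not measurements from either card):

```python
# Two made-up per-second fps traces with near-identical averages:
# the "SE" trace stalls hard in places, the "XT" trace never drops far.
se = [55, 52, 8, 6, 50, 48, 54, 7, 51, 49]
xt = [38, 36, 35, 34, 37, 33, 36, 35, 34, 36]

for name, trace in (("SE", se), ("XT", xt)):
    print(f"{name}: avg {sum(trace) / len(trace):.1f} fps, min {min(trace)} fps")
```

Both traces average 35+ fps, so an average-only chart would call them equivalent; only the minimum reveals that one of them stutters badly.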

It would be useful to know in the review what patches were applied to the games/apps. Was 3DMark03 build 340? Is Halo 1.31 or 1.4? What about UT2003... patch 2xxx? Also, what nForce drivers were used? Did you reinstall Windows for each card's testing, or use an app like Driver Cleaner? Did you apply any further patches to Windows, or was it only SP1? Which DirectX version was used: 9.0, 9.0a, or 9.0b?

Personally, I really don't like the value-for-money section; I didn't like Tom's Hardware's similar system, and I don't like this either. I think showing the performance (which you do in the review) and listing the prices makes it easy for the user to see what value for money they are getting.

Final point for now: maybe chucking in an 8500 or Ti4xxx card would have been useful for comparison; however, I appreciate that time is always more of an issue than available cards.

Hope the above helps...
 
Veridian3 said:
Based on the products you were reviewing, which are aimed at the budget market, the specs/PR of each card are exactly what you should be mentioning. Even small differences in warranty etc. are really important. In order to make an informed purchase, the reader now has to visit two other sites. Maybe at the very least you should add a link on the page that goes directly to the specs.

Fair point - I would have simply pointed to the two companies' websites, except they both seem to be in denial that these products exist; neither PNY nor Hercules has any mention of them on their website. I should have mentioned warranty and the like though, you're right.

Veridian3 said:
I'm a great believer that you should know the limits of the card you are testing... 8xAF/4xAA tests on these cards are completely pointless. No one will ever use them. 2xAA/2xAF would have been my choice right out of the box, then a little tinkering here and there to see if maybe one could be moved up to 4x in some apps/games.

Again, as Ailuros mentioned, a fair point and one I'll take on board for future consideration.

Veridian3 said:
Final scores for AquaMark and 3DMark would have been useful... someone looking for a card may not know how to, or want to, compare every single fps stat; instead they may just want to run an app and see how the score compares, then move on to the specific areas. (I see the 3DMark ones are in the OC section; however, some people may skip that, having no interest in OC'ing.)

To be honest, I don't see the point of total scores in either 3DMark or AquaMark; at the end of the day, they don't really tell you anything about where a card's strengths and weaknesses lie.

Veridian3 said:
Why no 3DMark01? It's a very useful indication of DX8 performance, useful on budget DX9 cards.

I considered it, but given both IHVs' history of 'optimising' for this benchmark, I felt it might not be all that useful in the end.

Veridian3 said:
In the synthetic/timedemo section, if a mainstream/budget card is not "playable" at 1024x768 (e.g. Splinter Cell), then you really should show 800x600... if I were a reader, I'd like to know whether I could get any sort of playable framerate at lower settings or if it was a completely lost cause. It's pretty easy to edit the benchmark file in Splinter Cell to add resolutions.

I figured that would be somewhat pointless, seeing as the Splinter Cell timedemo isn't a good indicator of performance in that title anyway - it was designed to push a card's ability to handle shaders, among other things. The real-world gaming section was hopefully clear enough about how the two cards really perform in Splinter Cell.

Veridian3 said:
I wouldn't say almost double the performance from the Radeon makes things close... it's all relative to the fastest performer, not the number of fps between them.

True, I didn't word that one too well.

Veridian3 said:
In the real-world tests, average fps is fine; however, min fps is also very useful in helping a reader determine what is acceptable for them. Take GTA:VC, for example. I now know that the two cards can happily get an average of 35+ fps... this could, however, mask the fact that the SE is dropping as low as 5-10fps in some areas, whereas the XT only drops to 25fps. I also feel that you should show the results for the two cards at the lowest setting (in VC's case 8xAF/0xAA, and then if one can handle 8xAF/2xAA, show that too). Again, it helps people compare and also shows the differences in performance more clearly.

I did consider including minimum FPS, but that can be just as (if not more) misleading than an average FPS score. Taking your Vice City example, all it takes is one slightly long CD or hard drive access, and you have a minimum of 1 FPS, which is in no way indicative of how the video card was performing. It's something I would certainly consider doing in future, but when I was looking through the data for this review I didn't feel that including minimum FPS would give a helpful indicator of the actual performance and playability with the cards in general.

Veridian3 said:
It would be useful to know in the review what patches were applied to the games/apps. Was 3DMark03 build 340? Is Halo 1.31 or 1.4? What about UT2003... patch 2xxx? Also, what nForce drivers were used? Did you reinstall Windows for each card's testing, or use an app like Driver Cleaner? Did you apply any further patches to Windows, or was it only SP1? Which DirectX version was used: 9.0, 9.0a, or 9.0b?

Yep, I should have mentioned all that stuff. Duly noted. :)

Veridian3 said:
Personally, I really don't like the value-for-money section; I didn't like Tom's Hardware's similar system, and I don't like this either. I think showing the performance (which you do in the review) and listing the prices makes it easy for the user to see what value for money they are getting.

Again, noted, and I'm inclined to agree with you.

Veridian3 said:
Final point for now: maybe chucking in an 8500 or Ti4xxx card would have been useful for comparison; however, I appreciate that time is always more of an issue than available cards.

I would have loved to, but sadly didn't have any older cards available. I wish I hadn't got rid of my 8500 a while back now. :(

Veridian3 said:
Hope the above helps...

Definitely - Thanks for all the input! :)
 
Hanners said:
Is it completely useless, or would it be a useful facet of a review in a slightly different form?

No, I don't think it's useless at all; I just think you expanded too much into market segments (high end) that are out of place for price/performance evaluations, since the demands are many times higher there too.

You could have easily limited it to the two contenders and call it a day. All IMHO of course ;)
 
HiJack Alert.

Sorry Hanners for this but:

Veridian3 said:
"Why no 3Dmark01? Its a very useful indication of DX8 performance, useful on budget dx9 cards.

3DMark01 is a very POOR indicator of DX8 performance. It's way too CPU/system-bound to be a good indicator of a video card's DX8 performance. Then you have issues like how the K2 gets killed by a GF2mx in 3DMark2k1, yet in just about every DX8 game the K2 is much faster than a GF2mx, for example.

Ok back on topic
 
Sorry, OT again:
With cards this slow, the card will be the bottleneck long before the CPU, and I feel it would be a good indication of the maximum performance that could be achieved by the card when coupled with a decent CPU/mobo. It may be true that it doesn't mirror games as well as we would like; however, it's an interesting (although not essential) result to see, IMHO.
 
Hanners said:
I did consider including minimum FPS, but that can be just as (if not more) misleading than an average FPS score. Taking your Vice City example, all it takes is one slightly long CD or hard drive access, and you have a minimum of 1 FPS, which is in no way indicative of how the video card was performing. It's something I would certainly consider doing in future, but when I was looking through the data for this review I didn't feel that including minimum FPS would give a helpful indicator of the actual performance and playability with the cards in general.

That's why you make sure you have a nice, clean hard drive with not much on it, nicely defragged, with a large swap file. And you make sure to have plenty of RAM; I've got 1GB in my test system. Then you run the test at least 3 times back to back so it works out any loading issues in the level, and you can start to see if the minimum fps is evening out; you can take the average of the 3 runs, for example, or whatever you feel is needed. By doing it multiple times, you eliminate those loading pauses.

It's a lot of work and takes time (believe me, I know), but that's how I do it, and it's the only way to really get good, solid information: min, avg, max - all of which I think are important in evaluating game performance.
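The multi-run approach described above can be sketched in a few lines of Python. This is a hypothetical illustration: the fps samples are invented, and discarding the first run as a warm-up is just one simple way of handling load stalls, not necessarily the exact procedure meant here:

```python
# Hypothetical per-second fps samples from three back-to-back runs of the
# same section; the "4" in run 1 is a load-time stall, not the video card.
fps_runs = [
    [4, 31, 35, 33, 28, 36],   # run 1: level still loading off disc
    [29, 32, 34, 33, 27, 35],  # run 2: everything cached
    [30, 31, 35, 34, 28, 36],  # run 3
]

def summarize(runs, warmup=1):
    """Drop warm-up run(s) so one-off loading pauses don't dominate the
    reported minimum, then average min/avg/max over the remaining runs."""
    runs = runs[warmup:]
    n = len(runs)
    mins = sum(min(r) for r in runs) / n
    avgs = sum(sum(r) / len(r) for r in runs) / n
    maxs = sum(max(r) for r in runs) / n
    return mins, avgs, maxs

lo, avg, hi = summarize(fps_runs)
print(f"min {lo:.1f} / avg {avg:.1f} / max {hi:.1f}")
```

With the warm-up run discarded, the reported minimum settles in the high 20s instead of being dominated by the single load-time dip in the first run.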
 
Brent said:
Hanners said:
I did consider including minimum FPS, but that can be just as (if not more) misleading than an average FPS score. Taking your Vice City example, all it takes is one slightly long CD or hard drive access, and you have a minimum of 1 FPS, which is in no way indicative of how the video card was performing. It's something I would certainly consider doing in future, but when I was looking through the data for this review I didn't feel that including minimum FPS would give a helpful indicator of the actual performance and playability with the cards in general.

That's why you make sure you have a nice, clean hard drive with not much on it, nicely defragged, with a large swap file. And you make sure to have plenty of RAM; I've got 1GB in my test system. Then you run the test at least 3 times back to back so it works out any loading issues in the level, and you can start to see if the minimum fps is evening out; you can take the average of the 3 runs, for example, or whatever you feel is needed. By doing it multiple times, you eliminate those loading pauses.

It's a lot of work and takes time (believe me, I know), but that's how I do it, and it's the only way to really get good, solid information: min, avg, max - all of which I think are important in evaluating game performance.
I've been cheating: the best way I've found to avoid the "load-time low fps dip" while frapsing is to just find a good run in each game I'm benching that doesn't require a load or have the low dip, then bench that particular section.

The thing with fraps benching is it can be a great tool to reflect real-world gaming performance (OMG, did I really just write that? :rolleyes: ), but it can also be inaccurate/misleading as hell if you don't use it right.

I really did find myself trying to find sections of the games to bench that would reflect what I felt the gaming performance was.

Biased, yup....but effective too. :)
 