Albatron GeForceFX 5900PV 128MB Review

Discussion in 'Beyond3D News' started by Dave Baumann, Aug 26, 2003.

  1. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Good review, and good to see these updates. I think you should have updated this part as well:
    The last part really depends on the impact of the glow effect. If the performance hit is more or less the same on both cards, then you are right; but if it's -50% for NV as you show, and -5% or -95% on ATI, the relative comparison no longer stands.
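    Evildeus's point can be made concrete with a quick sketch. All the numbers below are hypothetical (assuming parity with glow disabled), not measured results; they just show why the size of the glow penalty on each card decides whether a relative comparison survives.

    ```python
    # Illustrative only: hypothetical frame rates showing how the per-card
    # glow penalty changes (or even flips) the relative standing.

    def with_glow(fps, penalty):
        """Apply a fractional glow penalty (0.50 = -50%) to a frame rate."""
        return fps * (1.0 - penalty)

    nv_base = ati_base = 60.0               # assume parity with glow disabled
    nv = with_glow(nv_base, 0.50)           # -50% on NV, as the review shows
    ati_small = with_glow(ati_base, 0.05)   # if ATI only loses ~5%...
    ati_large = with_glow(ati_base, 0.95)   # ...versus losing ~95%

    # Depending on the penalty, ATI ends up either far ahead or far behind NV.
    print(nv, ati_small, ati_large)
    ```

    With a -5% penalty ATI nearly doubles NV's glow-on frame rate; with -95% the ranking inverts entirely, even though the cards tied with glow off.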
     
  2. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    A great read overall. I am very impressed with the apparent close relationship B3D has with Core, and the attempts at getting clarifications / questions answered from both Core and nVidia when deemed necessary. (Even if you don't always get a response.) It's very refreshing compared to the "typical" reviews that might encounter an anomaly and then just blindly speculate (but state it as fact, of course) on what the problem is.

    The glow effect:

    I am a bit worried by how this is going. If it really is an issue of Core not being happy with the effect itself, that's one thing. If it has to do with performance, then I question the removal of it. A 9700 Pro can pull a few FPS shy of 60 at 800x600 with all effects, including Glow, and with 4X FSAA and full Aniso. That's not too shabby, and that's not the best card on the market...far from it after this fall's product announcements.

    I don't see any reason, performance-wise, to eliminate the Glow effect as an option. I can certainly understand disabling it by default, but a hard-wired disable?

    [conspiracy mode]Makes you wonder if any particular IHV "encouraged" the removal of the glow effect, because with it, NONE of its cards can get close to 60 FPS at any setting...more like 30 FPS at 800x600.[/conspiracy mode]

    As others have said, this is the way it should be. If you bought an FX based on the expectations nVidia planted...and those expectations aren't met, then you shouldn't be happy. And you should keep that in mind the next time nVidia sets certain expectations.

    If you have been frequenting this board since the dawn of the FX Architecture debate, synthetic tests, etc, you would have witnessed the following:

    1) General consensus that the FX shaders, from the very beginning with the first 3DMark03 tests, had issues...potentially big ones.

    2) nVidia basically saying "but we work closely with developers, so it won't be an issue."

    3) To which the relevant response is: "Having to rely on close developer relations to get performance is bad on two fronts: a) Do you have 'close' developer relations with everyone? Can you guarantee that all developers can or will get performance to what we expect? b) There's no guarantee that, even with a close relationship, they can work around whatever issues there are."

    4) Which leads us to: buying FX hardware based in part on an expectation that it will meet R300 levels of shader performance is a significant risk. It may happen, it may not. So far, it's not happening...and this is a case (TR) where nVidia does appear to have a close relationship with the development team.

    Does this make the FX cards horrible? No. But if one of the reasons why you are considering a new card is the ability to handle DX9 shaders, then FX is certainly a much higher risk than the R300 products.
     
  3. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    Thanks, I look forward to seeing the results. :)

    BTW-Finished the review, REALLY nice job! 8)
     
  4. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Actually, I wouldn't term my relationship with Core as "close" (and it really is "my" relationship... since it's kinda like one of my "job specs" here at B3D... I'm the B3D devrel, so to speak :) ). It's just that the guy at Core (in fact, the only guy that matters when it comes to the PC version of this game... without him, there'd be no PC TRAOD) seems to reply to my emails. The fact that I approached him (thanks to one of the B3D partners here) with a first email saying that a good relationship between B3D and Core, wrt what B3D wants (info, benchmarking stuff), can probably only help sell a few more copies of the game (not verbatim, but he got the message) can't but encourage him to talk to me. He's replied to almost every one of my emails since.

    You should stop with this conspiracy stuff -- Core has had plenty of flak from purchasers of the PC version of the game, from game bugs to "lousy" performance... when they don't even know what settings these people use. The "Glow" effect is kinda hard to notice anyway. I've asked Core to do away with another effect in a different level of the game (the "Graveyard" level) because it hurts performance badly, when the effect isn't that great looking to start with. From the very start, Core told me the "Glow" effect eats up performance while not looking that great. This was way before we debuted this game in our reviews (the first being the Triplex R9600PRO one). No conspiracy here -- Core can't afford any more trouble with this game.

    In any case, I have to say this: the game isn't great (but not a concern for B3D), the engine isn't great (slightly a concern for B3D), but the fact that it uses all these DX9 effects is great insofar as B3D is concerned. Core didn't code this game to cater to the GFFX even with NVIDIA proclaiming it a TWIMTBP game... this is important.
     
  5. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    No such logo.
     
  6. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    The free AA doesn't seem to have much to do with the glow effect - it was essentially free when glow was enabled (when using Cg) too.
    What does make me worry is that there is absolutely 0% performance difference between 2xAA and 4xAA in ALL modes using Cg (be it with/without glow, with/without DoF), while there is at least a slight performance difference between 0xAA and 2xAA in SOME modes. A couple of screenshots showing it actually does 4xAA would be nice. Then again, entirely free AA might be possible with this game: if it's completely float-shader limited, the 5900 should have lots of bandwidth to spare.
     
  7. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    martrox wrote:

    "Well, I don't think for one minute that the FX series of chips is the best nVidia is capable of. It's basicly the result of 2 things:
    1) nVidia's belief that they had no compitition(ie.they did'nt see the R300 coming )"

    That's simply disgusting... nVidia's policy was: add a new shader instruction to their card lineup ----- means ----- more money.....
    Six months later... another pipeline increase..... more money....
    and six months after that.... a clock increase..... more money......

    Just to keep their prices at the top, they were adding features to their chips at a snail's pace..... Yes, they were playing with us buyers..... but who was going to guess ATI would come into play?

    PS: the same happened with Intel before the Athlon came onto the scene.....
     
  8. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Right...like I said...a good relationship. ;)

    Fair enough...but again, I don't see why taking the option out is a necessity. Hell, make it only accessible by tweaking an INI file and remove it from the in-game menus...because apparently some cards do have the horsepower to handle it.

    I always err on the side of choice.

    I didn't say they coded it to "cater" to GFFX. I said they are coding with GFFX at least in consideration. (Offering Cg support, etc.)

    It's also important because we now know the TWIMTBP program is meaningless for anything other than marketing.

    We all know that nVidia "convinced" FutureMark to "allow for certain optimizations", presumably because nVidia "convinced" FutureMark that "in reality, that's what they do with game developers. "

    This (TR:AOD) appears to be a case where either

    1) nVidia did NOT work closely with Core...hence blowing a non-trivial hole in their argument for allowing "optimizations" in 3DMark.

    2) nVidia DID work closely with Core...but that didn't help bring shader performance up to parity with R3xx...bringing into question the type of optimizations that are being "allowed" by FutureMark.
     
  9. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    The link to comments at the end of the article just sends you back to the beginning, which would be quite frustrating if you were extremely literal-minded ;)

    I had been wondering about the use of Tenebrae as a possible benchmark, and I'm glad you tried it out. Have you done any (comparable) tests on any R3XX cards?

    Do you use a console command to enable stencils, or something? Because I don't see it in my launcher options (though I'm not sure which version I've got).

    There's an interesting wall (in one of the initial levels of the second part) which basically yields an interactive version of 3DMark03's AF filter pattern. I enjoyed approaching and retreating in order to watch the filtering pattern change with distance.
     
  10. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    The link to comments at the end of the article just sends you back to the beginning, which would be quite frustrating if you were extremely literal-minded ;)

    I had been wondering about the use of Tenebrae as a possible benchmark, and I'm glad you tried it out. Have you done any (comparable) tests on any R3XX cards?

    Do you use a console command to enable stencils, or something? Because I don't see it in my launcher options (though I'm not sure which version I've got).

    There's an interesting wall (in one of the initial levels of the second part) which basically yields an interactive version of 3DMark03's AF filter pattern. I enjoyed approaching and retreating in order to watch the filtering pattern change with distance.
     
  11. Archaeolept

    Newcomer

    Joined:
    Feb 26, 2003
    Messages:
    52
    Likes Received:
    0
    damn :oops:

    can't find any edit button to delete one of those posts :?

    oh - rooked by the accidental "guest" once again: guests can't edit???


    Anyways, interesting series of articles you guys put up. thanks.
     
  12. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Puzzling...sounds suspiciously like a cpu bottleneck of some sort. However, in testing some features it seems clear the performance is being capped by the vpu/gpu rather than the cpu. What happens with AA without Cg? I'm wondering if this might be down to something odd Cg is doing...
     
  13. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    So if I read this right - there are possibly two issues here surrounding the FX line of cards.

    1. PS 2.0 implementation
    2. Image quality

    If so - is the PS 2.0 functionality a bug? More and more it seems every new card based on the NV30 has this issue. So, truth be told, it honestly seems to be a design flaw carried forward. Not sure how many drivers have been released since the NV30's launch - but I'd wager that if it were a driver bug it would have been fixed by now...

    Image quality - can this be fixed by drivers?

    In a nutshell - if the shader feature set truly is a design issue - then there is no reason to buy an nVidia product today. Not a high-end card, anyway.

    Oh yes, one last thing. It seems that FX cards running the benchmarks using Cg look better and perform on par with the Raddy's. Is this an accurate statement? PS 2.0 disabled and Cg enabled = an ever so slightly faster FX...

    Anyway - nice review. Would love to hear what nVidia has to say. I'd package up every benchmark surrounding PS 2.0 performance and say:

    ANSWER THIS!!!

    In a sense, ATI should watch out though. If Nvidia does fix this - it appears they could have a much faster card...
     
  14. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    I had half the data (no-DoF... with-DoF to come tomorrow... it takes quite a while for the benchmarks, all-rez-all-AA, to finish, y'know) when I wrote that. Percentage-wise, the 9700PRO exhibits the same performance behaviour wrt the "Glow" enable/disable effect as the GFFX 5900.
     
  15. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Fixed.

    Not yet.

    I used an autoexec.cfg for enabling/disabling stencils. Like so:

    Code:
    sh_entityshadows 0
    sh_worldshadows 0
    sh_playershadow 0
    sh_noscissor 0
    Substitute 0 (stencil off) with 1 (stencil on).
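
    For reference, making that substitution gives the stencil-on version of the same autoexec.cfg:

    Code:
    sh_entityshadows 1
    sh_worldshadows 1
    sh_playershadow 1
    sh_noscissor 1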
     
  16. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Not a design bug... "more and more" it simply looks like a worse hardware/architecture design than ATI's.

    You mean the DX9 HLSL compiler thing? Likely.

    Cg looks as good as the "Raddy's", not necessarily better. And Cg performs worse than the "Raddy's", not on par.

    A certain NVIDIA employee is currently trading emails with me... nothing interesting so far.

    I'm a little confused by this... what exactly do you mean to say?
     
  17. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Ok thx :)
     
  18. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    All I really meant to say is to package everything up as a whole. All the tests that you have done which show the poor PS 2.0 results. Or, just package up the data and supporting graphs and ask what they plan to do about it.

    It seems to be the primary problem - one that, if fixed, would bring them up to speed with the other GPU manufacturers.

    Or so it appears to me anyway.
     
  19. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Curiouser and curiouser... These links http://www.ati.com/entertainment/po...//www.ati.com/products/gamesupport/index.html would seem to indicate that TR:AOD is also ATI's 'Get in the Game' title? These marketing deals are supposed to be mutually exclusive, right?

    So nVidia is claiming that this is a TWIMTBP title while ATI is bundling it with their cards and claiming it as a 'Get in the Game' title? In the middle of all this it's impossible not to notice that Core is being awfully quiet about _any_ hardware requirements whatsoever on their web site (at least I couldn't find even decent minimum requirements on the official Tomb Raider web site).

    What gives?
     
  20. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    Think the NSA is involved?


    sorry, just watching Mercury Rising........
     