forced FP16 on HL2 fixes FX performance problem???

Discussion in '3D Hardware, Software & Output Devices' started by ^eMpTy^, Nov 30, 2004.

Thread Status:
Not open for further replies.
  1. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    I was under the impression that partial precision hints only allowed the hardware to change how it internally works with a value, not how it is stored in memory.

    ATI's DX9 implementations may work with some 24-bit pixel precision internally, but they write and read from memory in 32 bit quantities.
     
  2. t0y

    t0y
    Newcomer

    Joined:
    Mar 28, 2004
    Messages:
    149
    Likes Received:
    6
    Location:
    Portugal
    I might be a bit off, but don't you mean they write 32 bits per pixel? I believe NV cards write 64 bits per pixel too (FP16*4), and I mean the frame buffer, not off-screen stuff.

    But yes, the pp hints only affect intermediate calculations, not the output format.
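To make the intermediate-versus-storage distinction above concrete, here is a small illustrative sketch (not from the thread; it uses Python's `struct` module, whose `'e'` format code is IEEE 754 binary16) showing how a value that survives FP32 storage collapses when rounded to FP16:

```python
import struct

def to_fp16(x):
    """Round-trip a Python float through IEEE 754 binary16 (FP16)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_fp32(x):
    """Round-trip a Python float through IEEE 754 binary32 (FP32)."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# A value that FP32 stores exactly but FP16's 10-bit mantissa cannot:
v = 2049.0 / 4096.0            # 0.500244140625
print(to_fp32(v))              # 0.500244140625 -- survives FP32
print(to_fp16(v))              # 0.5 -- rounded away in FP16
print(to_fp16(v) == to_fp16(2048.0 / 4096.0))  # True: the two inputs collapse
```

This is why a _pp hint that only changes intermediate math can still show up visually (banding, shimmering) even though the render target itself is written at full width.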
     
  3. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    I accidentally mixed up terms in my post. I was trying to make a distinction between how the ATI chips use 24-bit precision internally for pixel shader calculations, while things outside those shaders use a full 32 bits.
     
  4. Beren_PCE

    Newcomer

    Joined:
    Dec 2, 2004
    Messages:
    8
    Likes Received:
    0
    Location:
    Zagreb, Croatia
    I would like to know why Valve bragged about how much work they invested in optimizing the Source engine for FX cards, and then trashed it in the end.

    The most radical answer to this question would be that Valve made this decision purposefully, to encourage average FX users to buy ATI cards in the future. If that's the case, it's a real low kick.


    On the other hand, aren't you curious about the fact that nVidia didn't issue any press statements about this FX performance issue? Would they abandon their customers (FX card customers, that is) like that?

    I would just like to mention again that forcing DX9 shaders via the console in HL2 on FX cards results in no water rendering whatsoever. On the other hand, when you make your card say "hello" to the engine the ATI way (i.e. as a Radeon 9800 Pro), the DX9 path works like a charm regarding IQ. Maybe I don't know enough about graphics programming to draw such extreme conclusions, but when I look at the facts presented, I can't say that it doesn't look fishy.
     
  5. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    Or might it be that since the nV40 don't suck at Dx9 that they just decided why bother supporting something that nVidia themselves barely support anymore?

    Hmmmmm..... ;)
     
  6. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    Barely supported???

    The performance and IQ increase just from 60.72 thru 70.41 has been quite remarkable on a "barely supported" product. I'd guess that the FX line has improved its speed over this time more than the NV4x line has.

    Oh, and as for your comment on FP24 being the minimum for full precision, you will no doubt be even more amused to learn that under SM3.0 and above, FP24 does not count as full precision, but rather partial precision, which should lead some to wonder why it was ever allowed as full precision in the first place....
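For reference, the precision gap being argued about here can be quantified from each format's mantissa width. A sketch (the FP24 figure assumes ATI's R300-era s16e7 layout, which is not stated in the thread):

```python
# Spacing at 1.0 (machine epsilon) for the shader float formats under
# discussion; it follows directly from each format's mantissa width.
FORMATS = {
    "FP16": 10,  # IEEE binary16: 10 explicit mantissa bits
    "FP24": 16,  # R300-style 24-bit float (assumed s16e7 layout)
    "FP32": 23,  # IEEE binary32: 23 explicit mantissa bits
}

for name, mantissa_bits in FORMATS.items():
    eps = 2.0 ** -mantissa_bits
    # ~0.301 decimal digits per mantissa bit (log10 of 2)
    print(f"{name}: epsilon = {eps:.2e} (~{mantissa_bits * 0.301:.1f} decimal digits)")
```

Roughly three decimal digits for FP16 versus about five for FP24 and seven for FP32, which is the gap SM3.0's full-precision requirement (FP32) formalizes.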
     
  7. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    Yeah, barely supported. Go on over to the new nVidia forums and check out all the FX users bugging 'em for help/support and explain to them how wonderful nVidia has been about it. :)
     
  8. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    Once again, I point you to the recent drivers and say to you that FX users have had at least as much help from nVidia as NV4x users have, probably more.

    If you are referring to hardware issues, that is the responsibility of the card manufacturer, not nVidia.
     
  9. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    But what are the recent drivers doing? It seems to me they are dropping features in order to get better performance, like precision in DX9 shaders.
     
  10. hovz

    Regular

    Joined:
    May 10, 2004
    Messages:
    920
    Likes Received:
    0
    It's humorous how radar completely abandons his arguments when someone shuts them down, and just hops to a new argument.
     
  11. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    Again I point you to nVidia's own forums where there are people screaming for help and support since they don't seem to share your opinion. :)
     
  12. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    There may be bugs in the Nv3x hardware, but there are bugs in the Nv4x hardware, just as there are bugs in ATI hardware, SiS hardware, and XGI hardware. Nvidia's forums are naturally going to draw people with problems with their hardware. The Nv3x isn't an unsupported card. It may not be the fastest piece of hardware out there, but Nvidia isn't taking away its feature sets; they just aren't optimising the hell out of it anymore. Radar in this case is right: bug fixes are still occurring on the Nv3x as they are on the Nv4x.

    Seriously digi. I bet if you found any forum about a specific brand. You will find people posting bugs galore.

    Driver Forum.
    http://www.nvnews.net/vbulletin/forumdisplay.php?f=26

    Another Driver forum

    http://www.rage3d.com/board/forumdisplay.php?f=59

    Another Forum.

    http://www.volarigamers.com/viewforum.php?f=2

    It's pretty easy to draw a conclusion about what these types of forums attract: driver/bug-related feedback. And it's a problem across various platforms and vendors. I may not agree with a lot of what radar says, and some of it may be outright ridiculous. But ridiculing someone who says they are still getting decent driver support on year-old hardware is a bit far-fetched.
     
  13. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    In lieu of wasting my time ripping into Greg's ... what is it, madness? calculated obstinacy? just plain trolling? again, allow me the following light-hearted aside.

    Chris, I'll let the murderous rampage go, but I'm afraid your spelling has indeed inconvenienced the stickler in me. :p

    ;)
     
  14. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    Ha! Doh, thank you Pete :) You're right... it wasn't spelled right... "doh" ;)
     
  15. DukenukemX

    Newcomer

    Joined:
    Dec 26, 2003
    Messages:
    59
    Likes Received:
    0
    Same reason why you won't find many FX owners running HL2 in FP16 DX9 mode. Even with all the work that Valve did for FX owners, it would still be unplayable unless you're an FX 5900+ owner.

    I'm sure the FX 5200/5600/5700 owners won't be using FP-16 DX9 unless they wanna see what they're missing.

    So that's why Geforce 6800 Ultra owners do either better than or equal to the X800 XT in HL2? ATI doesn't care about sales of FX cards, and neither does Nvidia.

    In fact Nvidia doesn't even want to be associated with the FX name anymore. Plus why would ATI or Nvidia worry about the sales for FX cards anyway? It's all about the X600/X700 or Geforce 6200/6600 now.

    The FX cards died when the Geforce FX 5800 was released 6 months late and named the DUST BUSTER. Some people have a hard time seeing this.

    They got their money so why not? It's up to Nvidia if they support FX owners or not. Didn't you notice that "FX" was removed from the new Geforce 6XXX line of cards?

    Besides, what do you mean by performance issue? As long as FX owners run in DX8 mode, they will have a playable frame rate. It's been known for a while that DX9 performance was piss poor on FX cards. The only way to get FX cards to run fine in DX9 was to make sure that games were "suitable" for FX cards. That is why the "The way it's meant to be played" program was started.
     
  16. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    I haven't abandoned any of my arguments in this thread.

    People have tried to argue against _PP, I have pointed out the specs of the DX9 API and the reasons why they are wrong. When people realise they can't argue against the spec, they resort to personal attacks against me.

    It's quite flattering actually.
     
  17. hovz

    Regular

    Joined:
    May 10, 2004
    Messages:
    920
    Likes Received:
    0
    how about the fact that the r300 runs FP24 substantially faster than the nv30 runs fp16?
     
  18. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    So what? I own a nVidia card not an ATi card, the speed of ATi's offerings is of no consequence to me.
     
  19. hovz

    Regular

    Joined:
    May 10, 2004
    Messages:
    920
    Likes Received:
    0
    It applies in the context of you saying nVidia made the right design decisions, which they clearly did not, as for the second generation in a row they are second best.
     
  20. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    There is no minimum performance metric in DX9. There will be in future versions of DirectX (or whatever Microsoft decides to call it).

    Your argument is purely subjective and many people will not agree with it.

    My arguments deal with established fact.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.