FP16 and market support

Discussion in 'Architecture and Products' started by radar1200gs, Dec 19, 2003.

  1. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    Ah, yes. It's all MS's fault, of course. The fact that some IHV managed to implement (and be first to market with) a full FP part running at great speed should indeed tell you something (namely that Nvidia got badly burned by their old strategy of designing their next-gen parts as a faster previous-gen part, plus some next-gen features thrown in for OEM checkboxes).

    Of course, Nvidia would never decide on their own to hack image quality in order to gain some speed, would they? In Chalnoth's wonderful NV-colored world, they are forced to by bad, evil MS and their forward-looking standards, and by bad, nasty ATI, who release parts that follow this spec and offer great speed and image quality. Just like BB is forced to lie, bully and spew BS all day long to feed his family.

    "Everybody", in Chalnoth's NV-colored vision, means "Nvidia and the mindless Nvidia fanb0ys who bought what they knew was an inferior product and went into denial afterward".

    I'm not sure if your Nvidia driver is replacing every mention of it, but he said "0085 Ultra" (reversing the order of the digits so your driver won't catch it; hopefully you won't mind the dip in 2D rendering performance).


    Nvidia's excellent sales in the high end market indeed prove that Intel is Nvidia's main competitor... Looks like NV pretty much dropped out of high end and doesn't consider itself in competition with ATI. :)

    Denial at its finest...

    And it does full trilinear when you specify application-controlled filtering and the app has a setting for trilinear. How can you get full trilinear filtering with the Cheatonators? You can't.

    Well, they have many options, ranging from butchering filtering, to replacing shaders with hand-tuned versions (and calling it a generic compiler), to static clip planes. I trust your answer was ironic, or was it a case of the pot calling the kettle black at its finest?

    Which is a damn fine consoling thought for all the people having bought Nvidia's li(n)e and a 5600...
     
  2. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    Chalnoth,

    Funny that I said 5200, and the 5200 does NOT have FP32 units: FP16 yes, but NOT FP32.

    I guess that is why they only run as DX8 cards when detected by DX9 games.

    Funnily enough, I can still find the 9700 Pro on ATI's website.

    Why do you insist that ATI does not do trilinear, when in fact they do when the application requests it?

    I know that while most of us want the best performance and the best IQ for the $, you would much rather have nVidia try and split the industry.

    Tell me, is it in gamers' best interest to have nVidia doing what they do? It would be totally different if nVidia just admitted that this time they did not do the best that they could, and at least adhered to the spec and produced quality drivers that made the most out of their hardware instead of cheating their rumps off.

    S3 was exterminated for the Savage 2000's flawed hardware, but we are supposed to give nVidia a pass? (SONICblue, the company directly responsible for the Savage 2000, has filed for Chapter 11.) Now S3 Graphics is tainted by the Savage 2000 (rightfully so).

    You know what was funny: when 3DMark03 came out, my 9500 Pro performed better than the 5800 Ultra, and DX9 titles bear this out before nVidia gets a chance to hack them...

    Diamond MM may or may not suffer from this mess; I guess we'll find out in a few months when they ship their cards...
     
  3. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    It HAS FP32 units. No nVidia chip until now has had an FP16 unit.
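
    (Side note, for illustration only: DX9 does not let an application ask for FP16 outright; it can only hand the driver a partial-precision hint and let FP16-capable hardware take advantage of it. A minimal sketch, assuming the standard D3DX9 helpers, compiling the same ps_2_0 shader with and without that hint:)

    ```cpp
    // Illustrative sketch only: compile one ps_2_0 shader at full precision and
    // once more with the partial-precision flag. Hardware with FP16 units (e.g.
    // NV3x) may run the hinted version faster; FP24/FP32 parts just ignore it.
    #include <d3d9.h>
    #include <d3dx9.h>
    #include <string.h>

    // Trivial shader source, declared inline for the sake of the example.
    static const char* kSrc =
        "sampler samp : register(s0);\n"
        "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
        "{ return tex2D(samp, uv) * 0.5f; }\n";

    HRESULT CompileBothPrecisions(LPD3DXBUFFER* ppFull, LPD3DXBUFFER* ppPartial)
    {
        // Full precision: FP32 (or FP24 on R3x0-class hardware).
        HRESULT hr = D3DXCompileShader(kSrc, (UINT)strlen(kSrc), NULL, NULL,
                                       "main", "ps_2_0", 0, ppFull, NULL, NULL);
        if (FAILED(hr))
            return hr;

        // Same shader, but every instruction now carries the _pp modifier.
        return D3DXCompileShader(kSrc, (UINT)strlen(kSrc), NULL, NULL,
                                 "main", "ps_2_0", D3DXSHADER_PARTIALPRECISION,
                                 ppPartial, NULL, NULL);
    }
    ```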
     
  4. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
    You told me you never laid out blame. Oops. Why didn't you say "nVidia's refusal to have decent floating point performance, instead of including worthless legacy precisions"?

    Awww, poor nVidia, "forced" by their terrible performance to cheat. Doesn't sound like "forced" to me; it sounds like a choice - "nVidia chose to use integer precision anyway because anything else would result in crappy performance due to the inclusion of worthless legacy precisions in the chip design".

    It would have been vastly better for everyone involved if nVidia had designed a chip that didn't suck so very badly at floating point operations.

    Chalnoth, do you ever get the feeling that everyone is against you (except for radar1200gs, lol)?
    Do you think it could be because everyone is finally sick of your FUD, misinformation, and hugely blinding bias?
     
  5. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    I wouldn't be so quick to label a chip DX9 that is considered DX8 by developers. Let's not kid anyone here: a 5200 is unusable with DX9 shaders at full precision.
    Intelligent people look beyond marketing PR labels, and the 5200 is a classic example of misleading PR... it may support 'some' of the DX9 feature set, but it is far too slow to be usable, so the end user will have to disable that option in their game, which is no different than not having it at all.
     
  6. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    Very slowly, along with other usage of FP32 in NV3x?


    Ahh, so because Nvidia doesn't support the spec, the solution is to downgrade the spec back down to DX7/8 so that Nvidia can still claim to be "DX9 compliant"?

    Which is not the same product is it?

    Meh. Or maybe in the next drivers. Or maybe in NV50. Or maybe when monkeys fly out of Jen-Hsun's butt.

    Or what we like to call "living in denial". It just shows how Nvidia are worried by ATI when they won't even publicly speak about the competition we know exists between the two companies.

    Well, NV3x was far, far behind schedule because Nvidia couldn't get it to work properly. ATI's early adopter advantage is just another point in ATI's favour.


    That's because that mode is designed for forcing trilinear on old apps that don't otherwise support it. When the control panel is set to "application preference" and the application is set to ask for trilinear, it gets it on all the stages it requests it for. This is in stark contrast to Nvidia products, where the application never gets anything except brilinear at best.
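
    (For reference, "the application asks for trilinear" is nothing exotic. A minimal D3D9 sketch; the wrapper function is illustrative, the SetSamplerState calls are the real API:)

    ```cpp
    // Minimal sketch: how a D3D9 application requests trilinear filtering on a
    // texture stage. With the driver control panel on "application preference",
    // this is the setting the hardware is expected to honour for every stage
    // the app sets it on.
    #include <d3d9.h>

    void RequestTrilinear(IDirect3DDevice9* dev, DWORD stage)
    {
        dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        // The mip filter is what makes it trilinear: D3DTEXF_LINEAR blends
        // between mip levels, D3DTEXF_POINT snaps to the nearest level
        // (plain bilinear), and D3DTEXF_NONE disables mipmapping.
        dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }
    ```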

    Maybe they'll lie and cheat on the benchmarks too?

    Whoop-de-do. A year after R300 ships, Nvidia finally goes from "sucks badly" to "adequate in the midrange".
     
  7. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    Yes, the use of DX9 features is slow on an FX 5200, but it is not unusably slow. If you calculate most pixels with the DX8 feature set, there is some room for the use of DX9 features on small portions of the image.
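
    (A rough sketch of that split in D3D9 terms; the caps check is the standard one, while the two shader objects and where they get used are hypothetical:)

    ```cpp
    // Illustrative only: render the bulk of the scene with a DX8-class (ps_1_1)
    // shader and reserve a DX9-class (ps_2_0) shader for a small part of the
    // image, e.g. a water surface. Shader creation is assumed to happen elsewhere.
    #include <d3d9.h>

    void RenderFrame(IDirect3DDevice9* dev,
                     IDirect3DPixelShader9* ps11Scene,  // ps_1_1, most pixels
                     IDirect3DPixelShader9* ps20Water)  // ps_2_0, small area
    {
        D3DCAPS9 caps;
        dev->GetDeviceCaps(&caps);
        const bool hasPS20 = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);

        // Most pixels use the DX8 feature set, which keeps even a slow DX9
        // part (like an FX 5200) at playable speed.
        dev->SetPixelShader(ps11Scene);
        // ... draw the bulk of the scene ...

        // Spend the DX9 path only on the small portion of the image where it
        // matters, and only if the hardware exposes ps_2_0 at all.
        dev->SetPixelShader(hasPS20 ? ps20Water : ps11Scene);
        // ... draw the water surface ...
    }
    ```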
     
  8. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    The only thing I've seen the FX 5200 to be useful for is as a cheap feature-compatibility testing platform. Everyone I know who bought one for gaming has sold it and gotten a Radeon 9600 -- and they're much happier.
     
  9. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    Texture coords, maybe, but not texture lookups. What need does NV2x have for FP32 per component texture lookups? None, because the shader is far lower precision.
     
  10. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    Sure, a 9600 is a better gaming solution than a 5200. Hell, you have to pay nearly twice the price for a 9600 as for a 5200.

    IMHO a cheap (and slow) DX9 card is better than another GF4MX-style part. I do not need such a "technology brake" again.
     
  11. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    No, it isn't. The NV2X Texture shader works with FP32.
     
  12. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    That's it, resort to personal attacks when you can't refute someone else's argument. Anyway, back in the days when I was at S3, the OpenGL driver was quite decent. The few OpenGL games that existed played quite well given the limitations of the HW.

    But go ahead and insult me some more if it makes you feel good.
     
  13. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    I've seen the 5200 Ultra for the same price as a 9600 Pro in one of our local stores... so you never know.

    I do think the FX5200 is a step forward from the MX series, but I don't know that I'd call it a true DX9 card as far as performance goes. It has the check-box, but I don't know if it can actually use the features.
     
  14. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    Mate, I owned a Savage3D. The card should have run the 3dfx Banshee out of the market and provided the TNT-1 with stiff competition. It did neither, mainly thanks to the most abysmal drivers ever to "grace" a graphics card (and because of the drivers, the OEMs and system integrators who would have ensured the card's success would not touch it with a fifty-foot barge pole).

    I'm well aware of how the Savage3D did in OpenGL - I experienced it firsthand, which is what led me to buy my first nVidia card...
     
  15. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    The Savage3D and Savage4 were not bad, not at all... but there was no way the Savage3D was going to bury the TNT, as it lacked single-pass multitexturing support.

    The one thing that crippled the Savage3D/4 was its 64-bit memory interface...
     
  16. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    I know that - I said "provide stiff competition".

    At the time the TNT-1 was very expensive, and a Savage3D implemented right would have succeeded on price. The drivers never allowed it that chance.
     
  17. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,081
    Likes Received:
    651
    Location:
    O Canada!
    Exactly what experience do you have to make the call that drivers or hardware was the issue?
     
  18. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
    That hasn't stopped him from making calls in the rest of this thread, so why do you expect it to matter now?
     
  19. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    Which goes to show how much you know.
     
  20. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    :roll:
    By that same token I will assume then that you will offer full credit and congratulations to OpenGL Guy for the excellent DX drivers that he must solely be responsible for.
     