*spinoff* PhysX relevance

Discussion in '3D Hardware, Software & Output Devices' started by Scali, May 1, 2009.

Thread Status:
Not open for further replies.
  1. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Yea exactly.
You could make the same argument for PhysX. Some games have a lot more eye candy and/or run faster with an nVidia card and PhysX acceleration. And ATi doesn't offer support.

    At least with nVidia we know that their future DX11 hardware will support DX10.1... But there's no way of telling what's going to happen in the physics-world at this point. Will PhysX become ever more popular? If so, will PhysX remain Cuda-only, or will it be adapted to OpenCL or CS?
    Or will we get Havok, and if so, will it work on nVidia cards, and will it work well?
    Or perhaps another solution altogether?
     
  2. Jaysin

    Newcomer

    Joined:
    Feb 21, 2009
    Messages:
    2
    Likes Received:
    0
Is it just me, or does anyone else still yawn at this whole physics thing?
     
  3. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
PhysX is currently the "Pimp My Ride" of the GPU world. The huge spoiler will make it look cool, but it won't do anything except drag down performance.
     
  4. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Or so the envious ATi-owners want to make you believe...
     
  5. karlotta

    karlotta pifft
    Veteran

    Joined:
    Jun 7, 2003
    Messages:
    1,292
    Likes Received:
    10
    Location:
    oregon
Says the poster who made the first post about PhysX in a 4770 review thread...
     
  6. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,452
    Likes Received:
    10,357
I think the reason people knock PhysX is that it's the Glide API of this generation.

Just as people knocked Glide once OGL and D3D became capable, people will knock PhysX once OpenCL/Havok/DX11 become capable of GPU-accelerated physics.

Likewise, there is no game so far that shows an actual speed increase when enabling PhysX. Developers (prompted by Nvidia, maybe?) are making the PhysX-enabled features so IN YOUR FACE that, rather than enabling more features at the same speed or delivering a faster gaming experience, they overdo it and slow the games down.

Take Sacred 2, for example. They could easily have gotten by with fewer leaves blowing around constantly and (presumably) avoided the huge framerate hit. But rather than enhance the game while keeping the same speed, they had to make it impossible to miss the fact that it's there. And in the process the game takes a huge hit in performance.

    The conspiracy theorist might say they did this on purpose to make the CPU rendering of it even more horrendous than it needs to be. Or that it's purely driven by marketing, and as such it's more important to make it noticeable than to have an overall positive impact on the game.

But even with all that, just the fact that it is currently the Glide API of this generation is going to make it unpopular with some people. The difference is that Glide not only made things look better, it also made them run faster (until OGL and D3D caught up). So far PhysX generally makes things look better, but also makes them run slower.

    Regards,
    SB
     
  7. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Except they are already knocking PhysX, while there are no alternatives.
    Did people knock Glide when it was the only 3D acceleration API around?
     
  8. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,120
    Location:
    WI, USA
Glide and the awesome hardware behind it in the mid '90s were what made 3D gaming possible on the PC, for the most part. Few cared about full OpenGL for realtime rendering for most of the '90s: the drivers for it sucked and it wasn't really suited to the very limited accelerators of the day. Early MS D3D was a big bad joke. Glide had a lot going for it during its heyday.

PhysX is, so far, very much a gimmick. What's there to like, exactly? Hyped promises of fancy future stuff? The PhysX content I've personally tried in UT3 on my 8800GTX was enough to convince me that PhysX is not worth making a card judgment over. It was all so superficial and added little to the game other than "physics eye candy".

I think GPU-accelerated stuff like PhysX and GPGPU might become useful when all cards support it through DX11 or OpenCL. And when the cards are simply more powerful than they are now.
     
    #8 swaaye, May 5, 2009
    Last edited by a moderator: May 5, 2009
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,452
    Likes Received:
    10,357
That's true, but there was also this...

It gives people a reason to complain even when there isn't any competition, just as people complained when PhysX was owned and operated by Ageia. It has yet to either...

    1. Speed up performance with same level of eye candy
    2. Provide greater eye candy with same or similar level of performance as without the extra eye candy

So far, everything is about "look what we can do" to market PhysX. "Look, we can do this and the CPU can't." Or at least that's the way it feels.

At least with Glide, not only did you get better eye candy, it was also faster. So why would anyone knock it until the competition (OGL and D3D) caught up?

This aversion to PhysX isn't anything new. It was here before Nvidia acquired Ageia, and I'm not sure how they expected to change that perception without improving on the situation from when Ageia owned it.

    Regards,
    SB
     
  10. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
Well, that's personal opinion. I do think that PhysX has shown that it can speed things up and/or deliver a better gaming experience. It sure sped up 3DMark Vantage, anyway, enough to make people cry "cheat" and all that :)
    And Mirror's Edge gives you extra eye candy without dropping the performance level.
    So I don't agree with you there.

    The problem with Ageia was different: you had to buy new hardware, which was only supported in a handful of games. With nVidia you get it for free on 2+ year old hardware. I would probably never have bought an Ageia PPU card... But I did happen to have a GeForce 8800, and got to experience PhysX for free, and it was a nice extra.
    So that's the thing... the PPU card wasn't very good value for money... but when it's free, that logic doesn't apply.
     
  11. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
Mirror's Edge shows a 20-30% drop in FPS when enabling PhysX on a GTX260. In fact, PCGH's benchmark shows that a 4870 on an E8500 without PhysX is faster than a GTX260 on a QX9770 with PhysX. It also shows that using an inferior card (8400GS) for PhysX reduces your framerate by about 60%.
     
  12. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    That depends on a LOT of factors (resolution, detail settings, AA/AF etc).
    Besides, as long as you get 50-60 fps, it won't affect the perceived performance. Most monitors cannot refresh faster anyway (and the game is capped at 62 fps by default).
And if you're bothered, you can always turn it off. With an ATi card you cannot turn it on.
     
  13. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
So your argument went from PhysX not impacting performance at all in Mirror's Edge, to it does but it depends, and then to a sort of 'but you won't notice the performance impact.' That last argument is only valid for those with high-end GPUs, and anyone with a decent GPU is of course going to run the game with those 'factors' you mentioned (higher resolutions and detail settings, AA/AF enabled). Also worth noting: NV did all the heavy lifting on the PhysX code for Mirror's Edge, so it's not exactly a strong example of the kind of independent developer uptake that potential informed buyers should factor into their purchase decisions.

    Is Mirror's Edge a better game with the PhysX support? Most would probably say yes. Is the game indicative of impending widespread developer support? Kinda hard to say considering its proprietary nature right now, the performance impact it clearly has when running on even a high-end GPU, and the likely need for a game engine to have physics support factored into its design from the earliest stages of development to properly integrate it into gameplay in any truly significant fashion.
     
    #13 John Reynolds, May 6, 2009
    Last edited by a moderator: May 6, 2009
  14. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    I am serious. The PhysX workload is pretty much constant. By varying the graphics workload, you vary the percentage of impact the PhysX workload has on the overall performance.
    Since I still have a 1280x1024 screen, the PhysX hit for me is way lower than for people gaming at 1920x1080 or something like that, because we spend the same amount of time on PhysX, but I spend less time on graphics. QED.
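The frame-time arithmetic behind this claim, combined with the 62 fps cap mentioned earlier in the thread, can be sketched with hypothetical numbers (the millisecond costs below are illustrative assumptions, not measurements from any benchmark):

```python
# Illustrative sketch: a constant per-frame PhysX cost, together with the
# game's frame cap, can be invisible at a low resolution yet show up as a
# real fps drop at a high one. All millisecond figures are made up.

FPS_CAP = 62.0        # Mirror's Edge default frame cap (per the thread)
PHYSICS_MS = 4.0      # hypothetical constant PhysX time per frame

def capped_fps(frame_ms: float) -> float:
    """Frames per second, limited by the in-game cap."""
    return min(1000.0 / frame_ms, FPS_CAP)

# Hypothetical graphics-only frame times at two resolutions.
for label, gfx_ms in [("1280x1024", 12.0), ("1920x1200", 20.0)]:
    without = capped_fps(gfx_ms)
    with_px = capped_fps(gfx_ms + PHYSICS_MS)
    print(f"{label}: {without:.1f} fps -> {with_px:.1f} fps")
```

With these assumed numbers, the low-resolution case stays pinned at the 62 fps cap with or without the physics cost, while the high-resolution case drops from 50 fps to about 41.7 fps, which is one way to read the argument being made here.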

    I would seriously hope that most people on this forum don't run XP and don't have a DX9 level card.
     
  15. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    No, the actual wording was 'performance level'.
    I didn't go anywhere. I never do. I'm smart enough to only say what I mean, and I have the integrity to stick by what I said because I mean it.

    I'm not saying anyone should factor it into their purchase decisions.
    I'm just saying that I got PhysX for free on a product that I already bought previously, and I like it. The thing for me is: Do I want to give up PhysX?
    Which I don't unless there is something significantly more compelling... which currently there isn't.
     
  16. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,313
    Likes Received:
    1,109
    Location:
    35.1415,-90.056
    This forum is not a valid representation of the consumer market, which is where PhysX will live or die.

And given your continued backpedaling, combined with the overall marketshare of hardware capable of sustaining the performance you're having a hard time substantiating, I think the answer is becoming painfully obvious.
     
  17. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    That is exactly the reason why I go to forums like these, and not 'consumer forums'.

    I'm not backpedaling. What exactly do you want me to substantiate?
That 55 fps and 62 fps are the same 'performance level'?
     
  18. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    What I tried to be polite about is that we know this. Nobody gives a damn. Not in the marketplace as a whole, and most definitely not in a thread dedicated to the HD4770, a product that, as you have so helpfully pointed out, has no relation to PhysX.
     
  19. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
I was just giving a response to the DX10.1 argument, which the marketplace as a whole cares about exactly as much as it does about PhysX: not at all. Sadly, nobody picked up on the parallels between these technologies that I was trying to demonstrate. No, they just wanted to flame PhysX again. I can smell the envy from here.
Funnily enough, I actually wanted to discuss the DX10.1 part rather than PhysX, but people focused blindly on PhysX after my post. Clearly that was not my intention.
     
    #19 Scali, May 6, 2009
    Last edited by a moderator: May 6, 2009
  20. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    Is this situation more likely to change with the release of DX11 and the tangible performance benefits that will be available to all consumers once NV catches up with hardware support? I think so.

    I don't see anyone flaming PhysX. Hell, I own a GTX 285 and my arguments certainly aren't based on envy as you're suggesting. I think people are generally opposed to PhysX because it's not currently an open standard. And ask yourself how many APIs/libraries/standards proprietary to specific hardware have survived over the years in the graphics market.
     