Nvidia following with Physics Simulation on GPUs

Discussion in 'GPGPU Technology & Programming' started by Arty, Mar 11, 2006.

  1. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
The advantage due to batch size is comparable to the results you see in the graphics world. The question is how well that translates to physics (which is presumably less gfx-bound?).
     
  2. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    <geo starts humming "Anything you can do, I can do better" for the nth time.>

    I'd been more-or-less expecting a press release along those lines. At least we got it confirmed in public in this article.
     
  3. mashie

    Newcomer

    Joined:
    Apr 11, 2005
    Messages:
    42
    Likes Received:
    0
    Location:
    Peterborough, United Kingdom
Nice, use an X1900 for video and a much cheaper X1600 to do physics.
     
  4. mashie

    Newcomer

    Joined:
    Apr 11, 2005
    Messages:
    42
    Likes Received:
    0
    Location:
    Peterborough, United Kingdom
This will happen whether you want it or not. The ball is already in motion, sorry.
     
  5. Gump

    Newcomer

    Joined:
    Mar 12, 2002
    Messages:
    28
    Likes Received:
    0
    http://physx.ageia.com/footage.html

As a developer, quite frankly, while it'll be nice to offload certain effects to the GPU, certain gameplay scenarios we "would LIKE to do" are just not possible without physics acceleration (or at least they run far too slowly in Novodex's software mode). I haven't had a chance to play with Havok FX, so I really can't comment too much on its capabilities.
     
  6. aaronbond

    Newcomer

    Joined:
    Aug 11, 2004
    Messages:
    62
    Likes Received:
    0
Video of physics on the GPU is up over at Guru3D (by the way, it looks pretty lame compared to Ageia's implementation; hopefully this is a VERY early demonstration).

    http://downloads.guru3d.com/download.php?det=1357

Personally I'm all for a dedicated PPU, but only if the next-gen release cycle is at least every two years. I only hope the price would be around the $200 mark.
     
  7. Maintank

    Regular

    Joined:
    Apr 13, 2004
    Messages:
    463
    Likes Received:
    2

    And I have to ask why does _xxx_ think it is crap?
    Right now we have wonderful looking gameworlds that are almost devoid of any realistic interaction. I shoot a wall in Doom 3 and all I get is a graphic that looks like a generic bullet-hole.

    I am looking forward to this new technology to see what it can do in terms of realism. I get visions of a WWII sim where your enemy is in a building firing at you. You call up a tank and it lobs a shell into the building and we get realistic effects like a wall imploding and parts of the house collapsing.

btw, after reading a little more about this, it looks like Nvidia's idea is more a complement to a dedicated PPU than a replacement. Is that correct - the Nvidia part will be doing effects physics while a dedicated PPU does gameworld interaction?
     
  8. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,436
    Likes Received:
    443
    Location:
    New York
I don't think that's how it works. The PPU on its own should be able to handle both effects and gameplay physics with no problem. I think Nvidia's focus is on taking advantage of the fact that the PPU is not going to be mainstream anytime soon. I don't think any developer is going to code a game that "requires" a PPU for good performance of gameplay physics, but I could be wrong.
     
  9. jb

    jb
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,636
    Likes Received:
    7
I think effects-based physics is not a bad way to go (i.e. particle interaction, ragdolls on death, movement of background items like trees blowing in the wind, very minor interaction like glass breaking, etc.), but I would be worried about letting it handle any physics that has a direct effect on gameplay. The big problem we could have would be in online games, where the physics part of the driver is changed to give a user a clear advantage over others. We've already heard about "wall hack drivers" in the past... we don't need "physics-hacked" ones :)
     
    #49 jb, Mar 23, 2006
    Last edited: Mar 23, 2006
  10. mashie

    Newcomer

    Joined:
    Apr 11, 2005
    Messages:
    42
    Likes Received:
    0
    Location:
    Peterborough, United Kingdom
Hehe, imagine if the people with a PPU can break through the wall while those without have to run around it :D Now that would be a good incentive to get a PPU for sure.
     
  11. Sxotty

    Veteran

    Joined:
    Dec 11, 2002
    Messages:
    4,928
    Likes Received:
    355
    Location:
    PA USA
    I agree with this wholeheartedly. I want more eye candy still. I like most of what is now feasible, but it hasn't gotten where it needs to be yet.

As to online gaming, don't worry, this physics crap won't work well unless you are on a LAN, because there is too much data to send and lag would be horrible: the wall would fall and crush you, then bounce back up the other way when it was realized a grenade was already going off on the other side...
     
    micron likes this.
  12. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    There are usually two major problems with these sorts of comparisons.

The first is that they show a fixed scene running on different hardware rather than a fixed framerate on both systems. A developer is not going to make a world that runs at 6.2 fps. So the question becomes how much of a difference you see and feel from increased object counts. In physics, linear scaling is usually out of the question.

The second is you just don't know how they implemented the physics on the CPU-only system. It would definitely be unwise to use even remotely similar algorithms, since a CPU can traverse tree structures and use scattered writes much, much more effectively than a GPU. Inter-particle collision is the toughest thing for a GPU.

The framerates they're getting for 15,000 boulders on a CPU make me a bit suspicious, and that the GPU is so much faster makes me wonder if inter-boulder collision is absent or not robust, which makes the results almost useless. This is along the lines of Jawed's statements about how limited "effects physics" is.
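To put a rough number on that suspicion: a brute-force all-pairs collision check for N bodies costs N(N-1)/2 tests per step, so 15,000 boulders with full inter-boulder collision would be over a hundred million tests per frame. A quick sketch of the arithmetic (illustrative only, not a claim about what either implementation actually does):

```python
def pair_test_count(n):
    """Number of collision tests a brute-force all-pairs check performs."""
    return n * (n - 1) // 2

boulders = 15_000
print(pair_test_count(boulders))  # 112,492,500 tests per simulation step
```

At any plausible framerate, that volume of work on a CPU strongly suggests either a smarter broad phase or no inter-boulder collision at all.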
     
  13. Rur0ni

    Regular

    Joined:
    Dec 10, 2004
    Messages:
    320
    Likes Received:
    4
    Anyone see that Cell Factor demo by Ageia?

    http://www.fileshack.com/file.x?fid=8558

    Impressive. Enough to make me put the money down on one provided games really took advantage of the card.

    So the nVidia/Havok implementation is just fluff?
     
  14. IgnorancePersonified

    Regular

    Joined:
    Apr 12, 2004
    Messages:
    778
    Likes Received:
    18
    Location:
    Sunny Canberra
Well, aren't there three in that - Havok/Nvidia/ATI? Though the GPU makers seem to be differentiating the PR, if not the actual implementation. I'd still buy a PCIe x1 Ageia card over an X1900 plus an X1600 in an x16 slot (or whatever combo), unless the bandwidth of an x1 slot was an issue, which I doubt.

    ATI did seem to raise a good question - how much real processing power does the Ageia card have in relation to a mid-range card of theirs (and then Nvidia's)? Dunno if it would be an issue though.
     
    #54 IgnorancePersonified, Mar 24, 2006
    Last edited by a moderator: Mar 24, 2006
  15. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,079
    Likes Received:
    648
    Location:
    O Canada!
The flipside to that point is: how efficient is a processor designed for graphics commands going to be in relation to a processor designed for the task?
     
  16. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,921
    Likes Received:
    221
    Location:
    Seattle, WA
    Well, due to the lack of economies of scale for a physics card, a graphics card doesn't need to be nearly as efficient to perform as well.
     
  17. IgnorancePersonified

    Regular

    Joined:
    Apr 12, 2004
    Messages:
    778
    Likes Received:
    18
    Location:
    Sunny Canberra
Yeah, I agree, though they indicate (for what it's worth) that they believe there is at least a factor-of-ten difference in the floating-point power of an X1900 card versus their figure for an Ageia card. So even if it is only 50% efficient, it would still be a more powerful solution on paper.
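To spell out that arithmetic (hypothetical, normalized numbers only - the 10x peak and 50% efficiency figures are the assumptions from the claim above, not measurements):

```python
def effective_throughput(peak, efficiency):
    """Usable throughput after discounting peak FLOPs by an efficiency factor."""
    return peak * efficiency

ppu = effective_throughput(1.0, 1.0)   # PPU normalized to 1, assumed fully efficient
gpu = effective_throughput(10.0, 0.5)  # GPU at 10x peak but only 50% efficient
print(gpu / ppu)  # 5.0 - still 5x the PPU on paper
```

Of course, if physics workloads drive GPU efficiency well below 50%, the paper advantage shrinks fast.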
     
  18. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,172
    Likes Received:
    488
    Location:
    en.gb.uk
It's not just a question of FLOPpage though. Are GPUs sufficiently flexible to implement the types of data structures required to do the physics efficiently? I'm thinking linked lists and trees here, those sorts of things.

If you're doing simulations involving particle-particle interactions, for example (fluids being an obvious instance), then the brute-force approach scales very poorly with particle count (N^2), so things get ugly very quickly. Increase the number of particles by a factor of three and your much-vaunted factor-of-ten advantage in headline FLOPs just disappears in a puff of PR.

To do these sorts of things efficiently, *some* sort of spatial indexing scheme is essential, and due to the dynamic nature of the problem it's an index which needs to be updated semi-regularly. Can GPUs traverse such indexes? If not, it'll be left to the CPU, which means a round-trip for all the data every now and again.
     
  19. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    6,839
    Likes Received:
    481
Meh, for something like fluids, hacks are going to rule supreme for the foreseeable future anyway.

    I'd be more concerned about stability... did those boulders in that demo ever actually come to rest? Also, a huge part of the whole process is collision detection; was the GPU actually doing that?
     
  20. psurge

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    939
    Likes Received:
    35
    Location:
    LA, California
Yeah... not to mention that CGI water looks like utter crap in almost every film I've seen, and that's presumably despite a high number of PhDs and an even higher number of CPUs.
     