Physics Processor - AGEIA (Cool Demo)

Discussion in 'GPGPU Technology & Programming' started by rwolf, Mar 8, 2005.

  1. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
  2. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
    Neat.
     
  3. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
Did you try the dominoes? My system crawls running that one.

What is cool about this technology is that it should also work well with Hyper-Threading and dual cores.
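A minimal sketch of why that should hold: integrating independent rigid bodies is embarrassingly parallel, so a physics step can be chunked across threads or cores. Everything below (the body layout, the 60 Hz step, the semi-implicit Euler integrator) is my own made-up illustration, not anything from Ageia's SDK.

```python
from concurrent.futures import ThreadPoolExecutor

DT = 1.0 / 60.0   # 60 Hz simulation step (assumed)
GRAVITY = -9.81

def integrate(body):
    # Semi-implicit Euler for one body, represented as (position, velocity).
    pos, vel = body
    vel += GRAVITY * DT
    pos += vel * DT
    return (pos, vel)

def step_world(bodies, workers=2):
    # Each body's integration is independent of the others, so chunks of
    # the world can be handed to separate threads (and thus cores).
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(integrate, bodies))

# 1000 bodies dropped from height 10, stepped once on 2 worker threads.
bodies = step_world([(10.0, 0.0)] * 1000, workers=2)
```

Collision resolution is harder to parallelise than free integration, but the broad point stands: more cores, more bodies per frame.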
     
  4. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
Tom's Hardware coverage...

http://www.tomshardware.com/hardnews/20050308_214530.html

Wow, Nvidia should be a bit worried, because physics seems to me to be a much better investment than SLI. I would imagine, however, that physics would be much better suited to the video card itself with 512MB of memory.

    Also interesting is that TSMC is an investor.
     
  5. pat777

    Newcomer

    Joined:
    May 19, 2004
    Messages:
    230
    Likes Received:
    0
    Did anyone try creating walls multiple times in a row? It kills performance. When you do it, you'll see why.
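One likely reason (my guess, not anything from the demo's documentation): every wall adds a pile of rigid bodies, and a naive broadphase has to consider every pair of bodies, so candidate collision pairs grow roughly with the square of the body count. A toy count, with a made-up figure of 100 bricks per wall:

```python
# Illustrative only: candidate pairs in a naive all-pairs broadphase.
def naive_pair_count(n_bodies):
    # n choose 2: every body tested against every other body.
    return n_bodies * (n_bodies - 1) // 2

BRICKS_PER_WALL = 100  # made-up figure for illustration

for walls in (1, 2, 4):
    n = walls * BRICKS_PER_WALL
    print(f"{walls} wall(s): {n} bodies, {naive_pair_count(n)} candidate pairs")
```

Doubling the walls roughly quadruples the pair count, which matches the "each extra wall hurts more than the last" feel. Real engines prune with spatial structures, but contact resolution between touching bricks still piles up fast.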
     
  6. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
Nonsense. When the Voodoo1 appeared, ATI was making crappy 2D cards and nVidia wasn't even there yet, or had just started developing the NV1 (which sucked a**). Carmack coded GLQuake on 3Dfx because it was the only "real" 3D card back then. Yes, there was Rendition etc., but they were all not even half as fast or feature-rich.

Back then, there was no real 3D API available for gaming. That's why Glide was so successful - it was easy to code for, and there was hardware that could do a bunch of neat things no one else could.

Now, talking about Diablo 2 or Unreal, that was all a couple of years later. DirectX was still crap, OpenGL was only used by id so far, and you had a bunch of developers who had been programming Glide for the last couple of years. They had the choice to program for Glide, or to try getting these things to work with DirectX 6, nVidia's Riva 128 and ATI's still-2.5D RageBlahXXX crap. So, logically, they invested their time in coding for the proven, widespread platform, which was Glide, and didn't waste their time programming for those three geeks who whined about not being able to play Unreal on their ATI RageProSuperUltraMaXX0r with 2MB RAM...
     
  7. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    You continue to think, despite my admonition, that that's the dichotomy I seek.

    8)

So that's why I sold one of my Voodoo 2s and got a TNT2? It wasn't perhaps because it had 32-bit colour, an OpenGL ICD and DirectX support out of the box, right? And if what you say is true, then why did 3Dfx go under? Wait, you say below "they didn't have the hardware" - that's right. What I said.

1) You get a lot of like-minded people together and mass-mail Ageia with your "manifesto". Hello, lobby pressure.
2) On message boards such as this, instead of praising whatever new thing comes along regardless of consequences, and instead of resigning yourself to what's to come, put your concerns forward now.
3) If all else fails, don't buy something that will ultimately be a bad thing.

But I guess Digi's post exemplifies most people today: hey, that looks cool, I want it, I want it now, me, me, me, now, now, now; what do I care that 5 years down the road I'll be paying $500+ for each part and getting shitty availability?

Here's another example: even though I'm very happy with my R9800 Pro, a while ago I posted that Valve was doing the wrong thing and setting a bad precedent by not allowing GFFXs to use mixed precision and dropping them to DX8. "Oh, screw GFFX owners, they should have known better", "Valve probably saw their GFFX market share and decided not to bother", "it's their game, they decide". What do we see now? We see games (like Chaos Theory and STALKER) that could run perfectly alright in DX9 mode on R3xx/R4xx cards, but that will have to run in DX8 mode. I hate to say "I told you so", as I didn't even expect the comeback to arrive so soon. I guess "R4xx buyers should have known better" and "it's their game, they decide".

How does liking to play older games come into contradiction with liking to play newer games?

But it is up to Ageia to move quickly in an open direction, unlike 3Dfx, which did its best to delay that. Here's a little quote from Carmack, July 3, 1997:

    Wasn't that the truth?

    If they don't, then like _xxx_ said the "logical" (yeah, right) step is to continue to support the available proprietary API.

It did, and then it had problems getting through to consumers, because even in 2000 there were games (like Diablo 2) with a Glide mode that performed better, at lower quality settings, than <insert competitor here> that had to run in OGL/DX.
     
  8. Neeyik

    Neeyik Homo ergaster
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,231
    Likes Received:
    45
    Location:
    Cumbria, UK
Can people return to the actual topic of this thread and not rehash age-old arguments, please? Otherwise, the thread will just get some serious pruning.
     
  9. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,244
    Likes Received:
    3,408
Whoa! A Cell APU for PCI Express :lol:

    An interesting idea, but will probably hit the wall as soon as Intel and AMD decide to do some research in this area ;)
     
  10. a688

    Regular

    Joined:
    Apr 20, 2004
    Messages:
    359
    Likes Received:
    0
    Good thing it isn't a cell apu ;)
     
  11. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
Mine ("Bubbles: Athlon XP-M 2400+ (12x200), NF7-s rev 2, 1Gb PC3200, OEM Sapphire ATi Radeon 256MB X800 pro VIVO (flashed 16 pipes 500/540), Audigy2, Altec-Lansing ADA885, ATi TV-wonder VE, DVDR, 2x10/100NIC, 80GB, 200GB sATA") ran it like butter; the only one that really slowed mine down so far was the exploding building one.

    My daughter loved whipping the ragdolls around and smashing the block towers to pieces and such. :lol:
     
  12. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
Ok, OnT: I'm surprised by the amount of local memory these parts are supposed to have; isn't 128MB (GDDR3, at that) a little excessive? Anyone with hands-on knowledge of physics implementations want to chime in?

I can understand the concept of caching, especially on PCI parts, but when I read that interview with more info (I missed the first page), I expected a smaller and faster cache, particularly given PCI-e's full-duplex capability.

There are also some who believe a PPU would be of better use combined with video cards. I can certainly see that line of reasoning, considering that a PPU is only going to be used for games (at least initially), and that if you're looking to offload some work from the CPU, then you probably already have a capable 3D card that's waiting on the CPU to begin with.

If this is done, is there a potential problem of saturating the PCI-e bus? Not speed-wise, but where latency is concerned?
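On the 128MB question, a rough back-of-envelope helps. The per-body and per-contact sizes below are guesses of mine, not Ageia's actual data layout, but even generous figures suggest raw simulation state is a small fraction of 128MB:

```python
# Hypothetical back-of-envelope: how much state does rigid-body
# simulation actually need? Struct sizes are assumptions, not Ageia's.
BYTES_PER_BODY = 128     # pose, velocities, inertia tensor, flags (assumed)
BYTES_PER_CONTACT = 64   # contact point, normal, impulse cache (assumed)

def world_state_bytes(bodies, contacts_per_body=8):
    # Total state for all bodies plus their cached contacts.
    return bodies * (BYTES_PER_BODY + contacts_per_body * BYTES_PER_CONTACT)

mb = world_state_bytes(32_000) / (1024 * 1024)
print(f"32,000 bodies: ~{mb:.1f} MB")
```

Under these assumptions, even a very large world of 32,000 bodies needs on the order of 20MB, so the rest of the 128MB would presumably go to collision geometry (meshes), fluid/particle state, and buffering - or simply to marketing.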
     
  13. Dr. Ffreeze

    Regular

    Joined:
    Jun 6, 2003
    Messages:
    335
    Likes Received:
    0
    Cool news,


    http://www.tomshardware.com/hardnews/20050308_214530.html
     
  14. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    A dedicated physics coprocessor makes no sense at all for the PC market. If it has a use anywhere, it would be in the professional simulations market.

That is to say, there's just no way to smoothly make the transition from software physics to hardware-accelerated physics on the PC, not like there was with 3D graphics.
     
  15. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    You worry a bit too much, 5 years is a long time and the Digi ain't no fool. :roll:
     
  16. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    Why not? We've had the option to run graphics in hardware or software mode for a while, why not the same with physics? :|
     
  17. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
    You obviously didn't read any of the documentation.

The developers use an API. The API is multi-threaded. You can run it on one CPU or more, and if you have a PPU it will use that as well as, or instead of, the CPU.

I think the physics chip should be on the video card itself, however, sharing memory with the video chip.
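The dispatch described above can be sketched as follows. The class and method names here are made up for illustration; they are not the actual NovodeX/Ageia API, just the shape of "one API, runtime picks the backend":

```python
import os

class PhysicsRuntime:
    """Hypothetical runtime: games call one API; the runtime picks a
    dedicated PPU if one is present, otherwise it falls back to CPU
    worker threads (scaling with core count / Hyper-Threading)."""

    def __init__(self, ppu_present=False):
        self.backend = "ppu" if ppu_present else "cpu"
        self.cpu_threads = os.cpu_count() or 1

    def describe(self):
        if self.backend == "ppu":
            return "simulating on dedicated PPU"
        return f"simulating on {self.cpu_threads} CPU thread(s)"

print(PhysicsRuntime(ppu_present=True).describe())
print(PhysicsRuntime(ppu_present=False).describe())
```

The point of this design is that developers code against the abstraction once, and the same game scales from a single-core CPU up to a PPU without per-backend code paths.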
     
  18. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    Cool news indeed, that $400 figure just chilled me to the bone.
     
  19. RejZoR

    Regular

    Joined:
    May 9, 2004
    Messages:
    300
    Likes Received:
    3
    Location:
    Europe\Slovenia\Ljubljana
Try the Building Explode or Big Bang demo and you'll see why you need a dedicated chip for physics.
As a side note: the Big Bang demo totally killed my Athlon XP 3400+ (Tbred).
Building Explode ran smoothly for about 5 seconds, while there isn't much physics in play. After that, it just goes downhill.
     
  20. CMAN

    Regular

    Joined:
    Mar 23, 2004
    Messages:
    620
    Likes Received:
    15
    Location:
    St Louey
Not everyone will buy the $400 card. I'd settle for a $100-$200 level card. The same goes for graphics cards. An X850 XT PE is overpriced, but people buy it. Same goes for the 6800 Ultra, or two 6800 Ultras in SLI for $1000! It depends on the person. If you look for reasons not to want it, you will find them, and that can be said for ANY PRODUCT!
     