AGEIA bought!

Discussion in 'Graphics and Semiconductor Industry' started by INKster, Jan 22, 2008.

  1. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
I'm usually reluctant to post anything from Fudzilla due to their shady track record so far, but Fuad is reporting that AGEIA has been acquired by an unnamed suitor (he only says that it's not AMD):
    http://www.fudzilla.com/index.php?option=com_content&task=view&id=5260&Itemid=1
     
  2. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
  3. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    My bet is NVIDIA is most likely, followed by MS, then Intel and finally Sony. Why NV? SW & Engineering talent + if they want to add stuff to their DX11 GPU especially for physics (rather than just marketing buzzwords), now is the time to get started. And they don't know what to do with their money anyway...
     
  4. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
If it's NV, that could mean bad things for AMD in the future in terms of GPU physics: Intel has Havok, NV would have Ageia, and AMD has nothing.
     
  5. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,818
    Likes Received:
    5,922
    Location:
    ಠ_ಠ
    Wouldn't that just be fine and dandy for Epic Games. ;)
     
  6. ZioniX

    Newcomer

    Joined:
    Mar 13, 2005
    Messages:
    61
    Likes Received:
    5
    From X-bit labs
     
  7. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    Hasn't nV already bought some physics middleware company from Norway or such? Was it Meqon or so? I can't remember exactly.

    As for Ageia, whoever thought buying them is a good idea should be forced to clean the toilets for the rest of his career.
     
  8. Fox5

    Veteran

    Joined:
    Mar 22, 2002
    Messages:
    3,674
    Likes Received:
    5
That's a little harsh. Their card may not have amounted to much, but they've got decent physics middleware, and it's the only product left on the open market that could support GPUs. I think an acquisition would make Ageia significantly more interesting, as they'd have the muscle to push their software, and they would perhaps be forced to abandon their cards in favor of the GPU approach.
     
  9. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    I'm not aware of that. And regarding Meqon, you seem horribly confused: http://www.meqon.com/ (Ageia acquired both Novodex and Meqon...)

    I think you're not really thinking that through. It depends a lot on the price of course, but good SW, good SW engineers, and probably rather good HW engineers with "what went wrong?" experience never hurt.
     
  10. Rufus

    Newcomer

    Joined:
    Oct 25, 2006
    Messages:
    246
    Likes Received:
    60
    What on earth type of statement is that from a PR guy? Either your company has been bought or it hasn't; what do you mean you have no information!?
     
  11. Bludd

    Bludd Experiencing A Significant Gravitas Shortfall
    Veteran

    Joined:
    Oct 26, 2003
    Messages:
    3,247
    Likes Received:
    811
    Location:
    Funny, It Worked Last Time...
    Ageia bought Meqon in 2005. Meqon was Swedish.
     
  12. Fox5

    Veteran

    Joined:
    Mar 22, 2002
    Messages:
    3,674
    Likes Received:
    5
    I remember Meqon as having some neat physics demos that ran on multicore processors.
     
  13. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    OK, I confused that with nV buying Meqon.

    As for the comments above, physics on the GPU is just plain dumb and nothing substantial for the future. We'll have 8-core and bigger CPUs in a year or two, so why would anyone waste the precious GFX flops on a bit of physics? That's a complete no-go long term.

    And what is so special about their API to begin with?
     
  14. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    Oh boy, here we go again.

    GPU physics is not "dumb".

    Think about this: what is the average PC gamer more likely to have?
    1) a 4-8 core CPU
    2) a discrete GPU

    Clearly the answer is a discrete GPU. GPU physics has already been demonstrated as a viable technology, simply lacking software support (API, game dev inclusion).
     
  15. Nick

    Veteran

    Joined:
    Jan 7, 2003
    Messages:
    1,881
    Likes Received:
    17
    Location:
    Montreal, Quebec
    The average gamer is most likely going to have a dual-core CPU that isn't fully utilized and a GPU that would have a hard time doing physics work without noticeably dropping frame rates.

    Single-core CPUs are already extinct, while IGPs and low-end cards are actually gaining popularity as a viable option for people who don't game every day. So for game developers it's much more interesting to do physics on the CPU instead of on the GPU, whose performance can vary widely. Quad-cores are also getting affordable, and it won't take many years until you can't buy a dual-core anymore, while weaker GPUs will keep existing alongside the awesomely powerful cards.

    Of course, for low-interaction 'eye candy' physics the GPU is an excellent candidate. For less embarrassingly parallel work that requires sending data back and forth, it's better to just stick to the CPU.
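    Purely to illustrate that distinction (this is a made-up sketch, not anything from Ageia's or anyone else's actual middleware, and all names are hypothetical): a minimal CUDA kernel for the 'eye candy' case, where every particle is updated independently and nothing needs to be read back to the CPU each frame.

```cuda
// Minimal illustrative sketch: an embarrassingly parallel "eye candy"
// particle step. Each particle is updated on its own, and the results can
// stay in GPU memory for rendering, so there is no per-frame readback.
#include <cuda_runtime.h>

struct Particle {
    float3 pos;
    float3 vel;
};

__global__ void stepParticles(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Independent per-particle update: gravity plus explicit Euler integration.
    p[i].vel.y -= 9.81f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;

    // Cheap ground "bounce" to keep the effect visible; still no
    // particle-to-particle interaction and no data sent back to the CPU.
    if (p[i].pos.y < 0.0f) {
        p[i].pos.y = 0.0f;
        p[i].vel.y *= -0.5f;
    }
}

// Host side, one launch per frame, e.g.:
//   stepParticles<<<(n + 255) / 256, 256>>>(d_particles, n, dt);
```

    The moment the simulation has to feed back into gameplay (collisions the player reacts to, for instance), the positions need to be on the CPU every frame, and that PCIe round-trip plus synchronisation is exactly the "sending data back and forth" cost mentioned above.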
     
  16. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    I definitely don't disagree with the first statement in the current market situation (though I'd point out this kind of strategy would be long-term); however, 'dropping frames' makes absolutely zero sense unless you're thinking of raw performance requirements.

    That latter information is completely inaccurate. Double-attach rates are at all-time records, ffs! (That's the number of discrete GPUs being used in systems that also have an IGP, so each such system likely gets counted as one unit for Intel and one for NV or AMD in market-share reports, artificially inflating Intel's share.) And let's be honest with ourselves: $100 GPUs are completely inadequate for any AAA game released in the last 6 months.

    That's completely true today, yes. In the long term I think that's a flawed assumption, for perf/mm² and perf/watt reasons as well as market segmentation. The fact Intel wants to run Havok on Larrabee should make it obvious they think that way too (although you could argue it's even more desirable on Larrabee than on classic SIMD GPUs!).

    We must not be looking at the same roadmaps. Dual-cores will remain important throughout the 32nm era, which means them truly exiting the market is a good 5 years away! Of course, by then it won't be gamers buying those chips, but you get the point.

    Mostly agreed, yeah. That's certainly true on current GPUs, I'm skeptical it'll remain the case forever though... :)
     
  17. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    Oh boy, and imagine what that GPU will be doing, especially if the game is gfx-intensive. Then you force it to do two entirely different things and steal the valuable gfx flops for physics, halving your fps in the process. All that while at least one CPU core (or a few) sits idle.

    Uhm, yeah, very clever.
     
  18. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    You're NOT looking at it the right way. It's a long-term battle to make the IPC-optimized CPU a commodity (even in the gaming market) and steal its ASP to your own advantage. So obviously a 'balanced' system there would have a cheaper CPU and a slightly more expensive GPU.

    Whether that will work, whether it could even work, is a complex debate that also requires knowledge of DX11/DX12 GPU architectures, adoption rates, and marketing budgets. Personally, I think that it could happen if NV or AMD started really executing in the right direction there, but we'll see if they do.
     
  19. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    113
    Location:
    New Zealand

    I'm curious, where are these statistics? It would be interesting to find out the "market" size for certain levels of graphics performance.
     
  20. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    I don't have access to recent price-segment statistics sadly (they're only in paid reports). The only ones you *might* be able to find are channel-specific and won't consider OEMs, which are obviously a major part of the market. And you'd expect OEMs to be lower-end on average, too, which makes the numbers rather worthless unless you know by how much (which I don't).

    Regarding double-attach rates, NVIDIA's internal estimate (given at a non-quarterly analyst conference in November) is a mind-blowing 55%. Yes, that means less than half of the IGPs being sold are actually being used, according to them! Which would put Intel very significantly behind NVIDIA in practice too. I'd expect NV's claims to be on the high side, but still...
     