AGEIA bought!

You're NOT looking at it the right way. It's a long-term battle to make the IPC-optimized CPU a commodity (even in the gaming market) and capture its ASP for your own advantage.

Well then we'll agree to disagree :) For me the long-term goal is still minimizing power usage and heat, further integration of peripherals, and ultimately a single-chip computer.

All the other stuff is just avoiding the short-term obstacles and looking for ANY alternative working solutions until then.
 
Let us not forget the upcoming CPU + GPU initiatives @ both Intel & AMD. Gotta put those co-processors to work somehow, right?
 
I'm curious, where are these statistics from? It would be interesting to find out the "market" size for certain levels of graphics performance.
Top three PC game sales in 2007:

1. World of Warcraft: Burning Crusade - 2,250,000
2. World of Warcraft - 914,000
3. The Sims 2 Seasons Expansion Pack - 433,000

These games run absolutely great on the cheapest graphics cards sold to date. Only at rank four do we find a game that requires some GPU muscle: Call of Duty 4. I enjoyed the demo on my GeForce 8500 though, which is last generation but still low-end. Games like Crysis, which do need a mid- or high-end GPU, don't even appear in the top ten.

Anyway, people buy graphics cards that will last a few years. But the same is true for CPUs, which is why you won't easily find a single-core system being sold today. Laptops with IGPs and desktops with low-end graphics cards, on the other hand, are everywhere...

So any game developer aiming for a medal will wisely choose to run non-graphics workloads on the CPU.
 
But those numbers don't include sales from Steam or any other digital distribution. How accurate is that list?
 
I think the list is accurate for U.S. sales. Yes, it does not include Steam sales, but it does give an idea of what the market is really like. Also look at the popularity of the Wii. Oh, and the most anticipated games of 2008 are... Spore and WoW: Wrath of the Lich King.

Graphics demands go up slightly every year. Physics demands, meh, overrated. Fun games just need a couple of crates, parabola trajectories, and portals. ;) Cheap quad-cores are coming our way, and octa-cores will be unstoppable in 2010. By that time poppies grow over AGEIA...
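
To put a rough number on how cheap that kind of gameplay physics really is, here is a toy sketch of a crate on a parabolic trajectory; the struct, constants, and 60 Hz step are made up purely for illustration.

```cuda
// Toy sketch: the kind of "gameplay physics" a crate on a parabolic
// trajectory needs. Struct, constants, and 60 Hz step are illustrative.
#include <cstdio>

struct Crate {
    float px, py;   // position (m)
    float vx, vy;   // velocity (m/s)
};

// One explicit Euler step on the CPU: v += g*dt, then p += v*dt.
void stepCrate(Crate& c, float dt) {
    const float g = -9.81f;   // gravity (m/s^2)
    c.vy += g * dt;
    c.px += c.vx * dt;
    c.py += c.vy * dt;
}

int main() {
    Crate c = { 0.0f, 0.0f, 5.0f, 10.0f };   // thrown up and forward
    for (int i = 0; i < 120; ++i)            // ~2 s of flight at 60 Hz
        stepCrate(c, 1.0f / 60.0f);
    std::printf("crate is near x = %.2f m, y = %.2f m\n", c.px, c.py);
    return 0;
}
```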
 
Havok seems to think GPU physics is dead.

Quote from Wikipedia:
"To compete with the PhysX PPU, an edition known as Havok FX was to take advantage of multi-GPU technology from ATI (CrossFire) and NVIDIA (SLI) using existing cards to accelerate certain physics calculations.[2]

Havok's solution divides the physics simulation into effect and gameplay physics, with effect physics being offloaded (if possible) to the GPU as Shader Model 3.0 instructions and gameplay physics being processed on the CPU as normal. The important distinction between the two is that effect physics do not affect gameplay (dust or small debris from an explosion, for example); the vast majority of physics operations are still performed in software. This approach differs significantly from the PhysX SDK, which moves all calculations to the PhysX card if it is present.

However, Havok FX seems to have been cancelled. "
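
For what it's worth, a minimal sketch of the effect/gameplay split that quote describes might look like the following on the host side. The names and the dispatch policy are assumptions for illustration, not Havok FX's actual API.

```cuda
// Illustrative sketch of the split described above (not Havok's actual API):
// gameplay physics stays on the CPU because its results feed back into game
// state; effect physics (dust, small debris) can be offloaded when a GPU is
// available, since it only affects visuals.
#include <vector>

struct Body {
    float pos[3], vel[3];
    bool affectsGameplay;   // true: results are authoritative for game logic
};

// Minimal Euler integration standing in for a full rigid-body solver.
static void integrate(Body& b, float dt) {
    b.vel[1] += -9.81f * dt;                        // gravity on the y axis
    for (int i = 0; i < 3; ++i) b.pos[i] += b.vel[i] * dt;
}

static void stepOnCpu(const std::vector<Body*>& bodies, float dt) {
    for (Body* b : bodies) integrate(*b, dt);
}

// Placeholder for the GPU path: Shader Model 3.0 passes in Havok FX's case,
// a kernel launch in a CUDA-style engine. Falls back to the CPU here so the
// sketch stays self-contained.
static void stepEffectsOnGpu(const std::vector<Body*>& bodies, float dt) {
    stepOnCpu(bodies, dt);
}

void stepWorld(std::vector<Body>& world, float dt, bool gpuAvailable) {
    std::vector<Body*> gameplay, effects;
    for (Body& b : world)
        (b.affectsGameplay ? gameplay : effects).push_back(&b);

    stepOnCpu(gameplay, dt);                         // gameplay: always CPU
    if (gpuAvailable) stepEffectsOnGpu(effects, dt); // effects: offload if possible
    else              stepOnCpu(effects, dt);        // otherwise CPU as well
}

int main() {
    std::vector<Body> world(1000);
    for (size_t i = 0; i < world.size(); ++i)
        world[i] = Body{ {0, 10, 0}, {0, 0, 0}, i < 50 };  // 50 gameplay bodies
    stepWorld(world, 1.0f / 60.0f, /*gpuAvailable=*/true);
    return 0;
}
```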
 
Havok already announced the cancellation ... after they were purchased by Intel.
 
This link is more fun:

A source from NVIDIA confirms the acquisition of AGEIA. In other news, Kyle Bennett is wrong.

......

Kyle Bennett, Editor in Chief and Founder of HardOCP, should be enjoying a nice big helping of his own words. After initial reports came in last week that AGEIA was to be acquired by an unknown bidder, Kyle Bennett promptly and brazenly refuted these claims, as if he were somehow affiliated with one of the companies involved and knew what he was talking about. This ruffled quite a few feathers amongst editors of the sites that posted the news. Well, now it would appear that he was completely and utterly wrong.
Much better colour than that boring PR release of Perez's. :p
 
I can't help but believe this was the real goal for Ageia.

Was there ever any doubt ? :p

(Not to me at least. I've always thought that a separate physics processing card was just preposterous, when it would be more widely accepted as part of an IGP or a CPU instead).
 
Yeah, and the PPU is dead.

nVidia is just filling the gap Havok left when they were acquired by Intel. They lost their software partner on GPU physics, and now they've got another (who'll probably be leveraged on CUDA and parallel processing generally too).
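
As a rough guess at what "physics on CUDA" could look like at the lowest level (purely illustrative, not anything NVIDIA or AGEIA has published), effect physics maps naturally onto one thread per particle:

```cuda
// Hypothetical sketch: debris particles integrated on the GPU, one CUDA
// thread per particle. The kernel shape is a guess, not shipped PhysX code.
#include <cuda_runtime.h>

struct Particle { float3 pos; float3 vel; };

__global__ void stepParticles(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vel.y += -9.81f * dt;        // gravity
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

// Host-side launch: enough 256-thread blocks to cover every particle.
void stepEffectsOnGpu(Particle* devParticles, int n, float dt) {
    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    stepParticles<<<blocks, threads>>>(devParticles, n, dt);
}
```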

Will physics processing be relegated to an NVIDIA-only (and possibly SLI-only) thing, or will they let ATI in on it too?
 
I don't really see any sort of standard coming up to support it, at least not a non-Microsoft one. Also, with the way Nvidia seems to be branded on basically every major (and most minor) PC game, I really would not be shocked at all to see their method take off pretty well if it's included on your standard graphics card.
 
I suppose we'll have to wait until the upcoming CC to find out how much nVIDIA paid for them.:p I also wonder where this leaves Ageia's next-gen physics PPU, i.e. will nVIDIA keep Ageia's hardware roadmap in place or rather roll the engineers up into its own seemingly dormant 'GPU Physics' campaign?

In the excellent Larrabee architecture thread, ArchitectureProfessor mentioned that Ageia's next gen PPU is supposed to have 720 [simple] cores.:!:
 
See Beyond3D's news post. My personal interpretation of that is they'll finish their current next-gen PPU and sell it as a discrete part, then stop PPU development. There are two key reasons why that makes perfect sense:
- RTL designers are likely already done with that project, and they're the ones that need to be moved to work on NV's DX11 GPU ASAP.
- They need a better, cheaper, cooler PPU to drive adoption before they can run it all on the GPU.

The goal now should definitely be to drive adoption for the new discrete PPU through aggressive pricing and deliver API compatibility for future GPUs. With a cheap PCB and cheap GDDR3, it should be possible to sell a 100-150mm² 80nm solution for $99. After all, the 8500GT is selling for a fair bit less than that already! (yes, it's using DDR2, but you get the point)

And at that price, using PCI Express x1 and hopefully passive cooling (or at least a slightly more expensive SKU with passive cooling?), it should be able to get a bit of traction no matter what. Of course, this is all assuming Ageia's next-gen isn't a 400mm² chip (lol?) and that NVIDIA is smart enough to know what's good for them. We'll see what happens.
 
Anyhow, since I won my bet wrt the Ageia acquisition (now I'm just waiting to see if I won my bet wrt how much NV would be willing to pay too), I'll take a new one: NVIDIA will announce they will acquire VIA before May 2008.
 