Cell on NVIDIA Graphics Cards?

Exactly, they're becoming more and more like CPUs with each generation. So basically we'll end up with two CPUs in the machine, two CPUs sending large amounts of data to each other.
Note in my above post: 'raytracing was just an example'.
 
Not by that much. 3D rendering is still a very specialized task, and GPUs will still be orders of magnitude faster at it.

One big difference, for example, is that with graphics you quite literally have millions of completely independent objects to process. You just don't get that with nearly any program you would ever want to run on a CPU.
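To illustrate the point (a toy sketch of my own, not anything from the thread): per-pixel shading is "embarrassingly parallel" because every pixel's result depends only on its own coordinates, so nothing stops a GPU from computing all of them at once.

```python
# Toy 4x4 framebuffer: each pixel is shaded independently of every other
# pixel. We shade serially here, but each shade() call could run on its
# own GPU thread with no synchronization at all.

WIDTH, HEIGHT = 4, 4

def shade(x, y):
    """Hypothetical per-pixel shader: color depends only on position."""
    # No shared mutable state: the result is a pure function of (x, y).
    return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 128)

# Iteration order is irrelevant -- a GPU would launch all 16 at once.
framebuffer = [[shade(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]

print(framebuffer[0][0])  # top-left pixel: (0, 0, 128)
print(framebuffer[3][3])  # bottom-right pixel: (255, 255, 128)
```

A typical CPU workload, by contrast, has dependencies between its pieces of work, which is exactly what this kind of massive parallelism cannot exploit.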
 
I agree with Titanio that the next big step will come from some form of ray-tracing or ray-casting. I think the whole physics-chip idea is a stepping stone toward this, to make the transition a little smoother.

The thing I'm most curious about is how sudden the transition will be. Will it be abrupt, or smooth, with rendering slowly becoming more like ray-tracing over the next 10-15 years?
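For anyone unfamiliar with the term, a minimal ray-casting sketch (my own toy example, with a made-up sphere scene): fire one ray per pixel and test it against the geometry. Like rasterization's pixels, each ray is fully independent, which is why ray-casting parallelises so well.

```python
# One sphere in front of the camera; rays start at the origin and pass
# through an image plane at z = 1.
SPHERE_CENTER = (0.0, 0.0, 5.0)
SPHERE_RADIUS = 2.5

def hit_sphere(ray_dir):
    """Return True if a ray from the origin along ray_dir hits the sphere."""
    cx, cy, cz = SPHERE_CENTER
    # Solve |t*d - c|^2 = r^2 for t: a quadratic a*t^2 + b*t + c = 0.
    a = sum(d * d for d in ray_dir)
    b = -2.0 * (ray_dir[0] * cx + ray_dir[1] * cy + ray_dir[2] * cz)
    c = cx * cx + cy * cy + cz * cz - SPHERE_RADIUS ** 2
    return b * b - 4.0 * a * c >= 0.0  # real roots -> intersection

def render(width=5, height=5):
    """Cast one ray per pixel; '#' where the sphere is hit, '.' elsewhere."""
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel to [-1, 1] x [-1, 1] on the image plane.
            x = 2.0 * i / (width - 1) - 1.0
            y = 2.0 * j / (height - 1) - 1.0
            row += "#" if hit_sphere((x, y, 1.0)) else "."
        rows.append(row)
    return rows

for line in render():
    print(line)
```

At this tiny resolution the sphere shows up as a diamond of `#` characters in the middle of the image; full ray-tracing just adds recursive rays (reflection, refraction, shadows) on top of this same per-pixel loop.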

Another question is whether real 3D (i.e. through 3D glasses or holographic projection, etc.) will take off anytime soon. When the PS3 was announced with two HDMI outputs I thought it might happen a bit sooner, but now that's off ... so it's not going to be mainstream for a while yet, it seems. But things are happening in the PC market, with Nvidia having already released drivers for it last year.
 
Arwin said:
I agree with Titanio that the next big step will come from some form of ray-tracing or ray-casting.

I didn't say that..

I just think it's interesting to consider what the relationship will be between the GPU and the CPU in a few years. Arguably the former are looking more like CPUs, and the latter (in instances like Cell) are looking a little more like GPUs.
 
nAo said:
emh..no, there was no project to use one additional CELL or just one of them as GPU.

Not as a GPU, but getting rid of the GPU completely. You did see those city and canyon demos from the PS3 presentation rendered on Cell alone, I assume? They also explicitly underlined that.
 
What I want to see, once AMD opens up whichever HyperTransport link they haven't yet (I can't remember if it is sync or async), the one used for inter-CPU communication, is a Cell processor on the motherboard :p Then we could forget about all these physics processors and use the Cell chip as a dedicated PPU, or for whatever else the devs could do with large-scale parallelisation.
 
If physics chips are going to have any chance of success at all, then it is going to happen by using Cell chips. Cell chips will be dirt cheap because of mass production for the PS3. In addition, there will be millions of Cell chips with more than one faulty SPE which would otherwise be thrown away. Cell chips with 6 functional SPEs should therefore be dirt cheap. It would make sense to put these on nVidia cards, PC motherboards (as media accelerator chips) or separate physics accelerator cards rather than Ageia's PhysX chips. Another advantage of using Cell is code commonality with the PS3, something which would be welcome for games with physics or AI acceleration.

I predict Ageia's PhysX physics accelerator will be a commercial flop and Ageia will discontinue it. Ageia will then change into a successful software company, supporting Cell for processor-based physics and also integrating GPU-based (effects) physics on various GPU chipsets.
 
SPM said:
I predict Ageia's PhysX physics accelerator will be a commercial flop and Ageia will discontinue it. Ageia will then change into a successful software company, supporting Cell for processor-based physics and also integrating GPU-based (effects) physics on various GPU chipsets.
Ridiculous. It should be obvious, imo, that if their hardware-based solution fails, they'll be bought by either NV or ATI. The API developer support, and the really-not-as-bad-as-you-think engineers are more than worth a couple million dollars.

Uttar
 
Uttar said:
Ridiculous. It should be obvious, imo, that if their hardware-based solution fails, they'll be bought by either NV or ATI. The API developer support, and the really-not-as-bad-as-you-think engineers are more than worth a couple million dollars. Uttar

I don't think their hardware solution is worth buying - there are other DSP solutions similar to the one used on their PPU card. The problem with the Ageia hardware is not that the chip is bad, but that production volumes will be very low, which means high costs, which means it won't sell, which means games developers won't write for it.

Their PPU software engine, on the other hand, is worth buying. Having said that, it needs to be a universal API - something that can run on both nVidia and ATI products - in order to encourage games developers to write for it, and GPUs need to be utilised as well through an integrated API. This may be hampered if nVidia or ATI buy out Ageia. I am not saying this won't happen, but if nVidia or ATI do buy Ageia out, then Ageia will be worth less than it would be if it stayed independent.
 