Why did Sony abandon the idea of having Cell in the PS3 do the graphics?

Techno+

Regular
hi guys,

Can anybody answer that question? Sorry guys, but I tried googling it, answering it, and asking it, and I found nothing. I also heard that they were going to use a mini GPU next to the second Cell chip that was supposed to do the graphics. You don't need to explain anything if you don't want to, just give me a link.

Thanks
 

Because the specifications and the plan became outdated.

Remember that when we talk about specifications, it goes beyond raw performance; for example, they could have had a powerful GPU that still had a lot of problems producing image quality comparable to PC GPUs.
 
Well, so many tools and libraries are already available and supported for Nvidia and ATI technologies. They have evolved these for years, and devs have lots of experience with them. Starting from scratch on a new GPU would have added lots of unnecessary work, I think. The PS2 caused lots of pain because of this for some time. And Sony doesn't have the luxury of waiting for devs to waste time figuring things out from zero like the first time. They need results fast, since they already have a powerful competitor in the market.
RSX supports existing knowledge, features, and tools, right? I mean, isn't COLLADA a collection of existing OpenGL tools/libraries, but optimised for the PS3?
 

Because there never was a real plan to do so... Nonetheless it'll remain one of the many great myths of the PS3 for decades to come... :)
 
Simple: It would have sucked.
 
It's similar to going from one type of tech to a completely new type of tech.

Imagine a Babbage engine (a clockwork processor, Turing-complete) or an optical processor. Both can be programmed, but neither has libraries or people with experience in it.

The case here is similar: even if it's doable, there isn't always a point. If you gain nothing but the trouble of 100,000 coders who have to start from the stone age, hand-rolling everything with no idea what they're doing, like total noobs, what's the point?
Edit: I'm assuming it was one of the ideas discussed under what is known as "brainstorming", which means discussing all sorts of weird ideas anyone has in order to get the truly creative, usable ones out. I doubt they ever sat down and planned a cluster of CPUs as a GPU.
 
I have a dream that one day even the state of graphics, a state sweltering with the heat of graphics whores, sweltering with the heat of oppression, will be transformed into an oasis of graphical euphoria.

I have a dream! That my seven little SPEs will one day compute in a chip package where they will not be judged by the size of their cache but by the content of their calculations.



um.. sorry. I get these weird visions sometimes. :oops: But perhaps the idea of Cell as a GPU proliferated from the Cell-only demo in London, for starters.
 
 
Exactly. Where are people getting this idea from?

I think John Carmack mentioned the idea in one of his QuakeCon speeches. But even if a Cell GPU was never a real plan, I'm sure the RSX was a quite late addition to the project; maybe a GS2 was the first option.
 

Because it would have made life a living hell for all PS3 developers.

That, and the outcome probably wouldn't have been better than an RSX chip doing the graphics.
 
Graphics processing is as much about being smart through efficiency, multithreading, pipelining, latency hiding, compression, etc. -- highly specialized design areas where the graphics companies focus their research and often patent the implementations -- as it is about being fast.

Sony has now apparently closed shop on their graphics processor design team.
 
Nvidia and ATI are it now. Sony/Toshiba could not come close.

It would probably take a startup company of at least 1,000 employees ten years to get anywhere close to how refined ATI's and Nvidia's designs and drivers now are.
 
I'm kind of curious how a GS2 would have turned out. I imagine the software/drivers would have been crap compared to RSX, even if performance was better (maybe it was, maybe it wasn't; it was probably balanced very differently, better at some things, worse at others).

Cell for graphics was, obviously, never a realistic choice, and at most I'd say it was a passing suggestion within Sony (which was likely shot down rather quickly, if taken seriously at all).
 
Easy: Sony could never reach ATI's and Nvidia's level in GPUs within acceptable cost and time constraints, so they did the right thing and got one of them on board.
 
Well, we've had a request to reopen this thread, so we'll give that a try, as it hadn't degenerated too far yet as far as the actual terms of service are concerned.

Nevertheless, the quote below from upstream matches my own understanding of the situation, so my first question to the threadstarter is: what evidence do they wish to present that the below is not true? Because the plan had to exist before it could be abandoned.

Because there never was a real plan to do so... Nonetheless it'll remain one of the many great myths of the PS3 for decades to come... :)

And, oh yeah, I'd appreciate it if y'all didn't make me regret acceding to the request to reopen the thread by turning it into a flamewar. Of course it can be closed just as fast again, only by then I'll be feeling all wroth and stuff. So let's not, please? ;)
 