The definitive console specs thread.

The Cell CPU architecture makes a great front end for the graphics pipeline, the stages traditionally handled by geometry engines / T&L units and vertex shaders (or ordinary CPUs). But as far as rasterizing / rendering / drawing graphics goes, Cell was not designed for that type of work; otherwise there would be no need for an Nvidia GPU in the PS3.
Actually, Cell's not a bad design except in a few key areas. It can transform, light and draw triangles well, but it can't texture or shade as powerfully as a GPU. It's no replacement for a GPU, but it can do GPU work (rendering images) pretty well. So the statement 'CBEA can handle GPU work' is true - it just doesn't say how much GPU work, or how efficiently (and by that standard, an x86 can handle it too ;))
 
I never said Cell is a bad design. It's a great CPU for what it was designed for, but it's not a graphics processor.

All CPUs can do the first several stages of the graphics pipeline, like transforming and setting up geometry.


If one were to do pure software rendering on Cell, where Cell has to do everything graphics-wise, it would be down to GeForce / GeForce 2 levels of graphics complexity, as seen in the recent IBM video - and that was actually several Cell blades with probably a GB of RAM or more.
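To make the "front end" point concrete: the transform-and-setup stages mentioned above amount to multiplying vertices by a matrix and doing the perspective divide, which is exactly the kind of SIMD-friendly math Cell's SPEs (or any CPU) can run in software. A minimal, illustrative sketch - all names and the scalar layout here are my own, not anyone's actual engine code:

```python
# Hypothetical sketch of CPU-side vertex transform (the pipeline "front end"):
# multiply each vertex by a model-view-projection matrix, then divide by w.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major, nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def transform_vertex(mvp, vertex):
    """Transform an (x, y, z) vertex to normalized device coordinates."""
    x, y, z, w = mat_vec(mvp, [vertex[0], vertex[1], vertex[2], 1.0])
    return (x / w, y / w, z / w)   # perspective divide

# With an identity MVP the vertex passes through unchanged (w stays 1.0).
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(transform_vertex(identity, (1.0, 2.0, 3.0)))  # (1.0, 2.0, 3.0)
```

A real implementation would batch many vertices and vectorize the inner products, but the math per vertex is just this.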
 
I never said Cell is a bad design. It's a great CPU for what it was designed for, but it's not a graphics processor.

All CPUs can do the first several stages of the graphics pipeline, like transforming and setting up geometry.


If one were to do pure software rendering on Cell, where Cell has to do everything graphics-wise, it would be down to GeForce / GeForce 2 levels of graphics complexity, as seen in the recent IBM video - and that was actually several Cell blades with probably a GB of RAM or more.

Doing ray tracing at 1080p, not polygon pushing...
 
Actually it's ray casting.
I think you're wrong there.
1) AFAIK the shadows are being traced.
2) The project's name is iRT - interactive Ray Tracer
3) The same engine was demo'd across 3 PS3s tracing a car including glass (refraction secondary rays)
4) The paper describes the engine calculating AO. Even if that's a separate pass using a different engine, the car is rendered without AO but with reflections: 96 rays per pixel. That's a crazy number for a simple ray caster!
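For reference on the casting-vs-tracing distinction being argued here: a pure ray caster fires only primary rays from the camera, while a ray tracer spawns secondary rays (shadow, reflection, refraction) at hit points - which is what points 1 and 3 above are claiming iRT does. A minimal, made-up sketch of the shadow-ray step (all geometry and names are illustrative, not from iRT):

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return distance along the ray to a sphere hit, or None for a miss."""
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c          # direction assumed normalized (a == 1)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

sphere_center, sphere_radius = (0.0, 0.0, -5.0), 1.0   # occluder
light = (0.0, 0.0, -10.0)

def in_shadow(point):
    """Secondary (shadow) ray from a hit point: the step a ray caster skips."""
    to_light = [light[i] - point[i] for i in range(3)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    t = hit_sphere(point, direction, sphere_center, sphere_radius)
    return t is not None and t < dist   # blocked before reaching the light

print(in_shadow((0.0, 0.0, 0.0)))     # True: the sphere occludes the light
```

Reflection and refraction rays work the same way, just recursing with a new direction instead of testing occlusion.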
 
If you don't want to discuss it you don't have to but keep your trolling to yourself.
1) When linking to articles, explain what they're about so people can decide whether they're worth looking at.
2) When asking for discussion, lead with your own POV.

This trolling of mine is unfortunately state-sanctioned FAQ adherence. :(
 
"PC Benchmarks for the Wii Cpu under Linux compared to the X360 and PS3 Cpu's... Infact I find it kinda funny that the Wii CPU is almost Equal to a Single Core of the the X360... Also the Wii GPU has about 1/3 the Cache of the X360... See a Trend?!? 1/3 seems to be a Magic Number when comparing Wii and X360... Don't take this to mean the x360 is 3 Times the Power of the Wii, as it's Not... The X360 is roughly 2 - 2.5 the Speed of the Wii overall... The PS3 is roughly the same as the x360 Benchmark wise, however It gets Insane Scores at Floating point Math due to the SPE units... Overall to compare the X360 CPU and PS3 CPU to a Computer I'd say roughly a 3.2Ghz Intel P4 (Non-DualCore)... The Wii is some where in the 2.4 - 2.6Ghz Intel P4 (Non-DualCore) Range"

I call BS on that.
 
"PC Benchmarks for the Wii Cpu under Linux compared to the X360 and PS3 Cpu's... Infact I find it kinda funny that the Wii CPU is almost Equal to a Single Core of the the X360... Also the Wii GPU has about 1/3 the Cache of the X360... See a Trend?!? 1/3 seems to be a Magic Number when comparing Wii and X360... Don't take this to mean the x360 is 3 Times the Power of the Wii, as it's Not... The X360 is roughly 2 - 2.5 the Speed of the Wii overall... The PS3 is roughly the same as the x360 Benchmark wise, however It gets Insane Scores at Floating point Math due to the SPE units... Overall to compare the X360 CPU and PS3 CPU to a Computer I'd say roughly a 3.2Ghz Intel P4 (Non-DualCore)... The Wii is some where in the 2.4 - 2.6Ghz Intel P4 (Non-DualCore) Range"

I call BS on that.

It's PPC, not PC :p
Anyway, it's actually possible it could do that well in a benchmark for PPCs. The 750 line is, AFAIK, an out-of-order design, while Xenon's cores and Cell's PPE are both in-order execution cores (the only in-order PPCs I'm aware of).
 
"PC Benchmarks for the Wii Cpu under Linux compared to the X360 and PS3 Cpu's... Infact I find it kinda funny that the Wii CPU is almost Equal to a Single Core of the the X360... Also the Wii GPU has about 1/3 the Cache of the X360... See a Trend?!? 1/3 seems to be a Magic Number when comparing Wii and X360... Don't take this to mean the x360 is 3 Times the Power of the Wii, as it's Not... The X360 is roughly 2 - 2.5 the Speed of the Wii overall... The PS3 is roughly the same as the x360 Benchmark wise, however It gets Insane Scores at Floating point Math due to the SPE units... Overall to compare the X360 CPU and PS3 CPU to a Computer I'd say roughly a 3.2Ghz Intel P4 (Non-DualCore)... The Wii is some where in the 2.4 - 2.6Ghz Intel P4 (Non-DualCore) Range"

I call BS on that.

Well, that part was basically his opinion, not fact. He seems to overestimate the power of the Wii a little, IMO, but I haven't been involved in game development before or had hands-on time with a Wii dev kit, and he has, so I can't really say. I just report what I hear.
 
Well, that part was basically his opinion, not fact. He seems to overestimate the power of the Wii a little, IMO, but I haven't been involved in game development before or had hands-on time with a Wii dev kit, and he has, so I can't really say. I just report what I hear.

What about the 8 pixel and texture pipelines, which I believe an infamous fake Ubisoft interview also claimed? Perhaps someone else can verify that information.
 
"The Wii is some where in the 2.4 - 2.6Ghz Intel P4 (Non-DualCore) Range"

A PPC750 @ ~730MHz near a 2.5GHz Willamette? No way, or only in the kind of stupid and meaningless benchmark Apple was using back in the day. Even the 1.25GHz PPC7455 would lose against that kind of PC CPU.
 
2.5GHz indeed sounds a little too optimistic, to say the least. Though maybe he was talking about a Willamette paired with SDRAM? I've got one of those; I don't remember now whether it's 1.8 or 2.6GHz, as I only used it once, but let's say 1.8GHz. I asked around whether it was better to use the 1.8GHz instead of my 1GHz Thunderbird for my media center, and basically everybody said to stick with the Thunderbird, as Willamettes have crap performance with SDRAM. Now, I have no idea about PPU performance, but if it's better per clock than a Thunderbird, maybe it's more comparable to something like a 1.5GHz CPU? Or 2GHz, depending on what kind of model you're comparing with?

In the end I think I'll stick with assuming the CPU is 1.5x the GC one, so things can only turn out for the better ;)
 
So you think that much higher clocks and ~70% more transistors gave them just 50% more performance?
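For what it's worth, the commonly cited clock speeds make the "1.5x GC" guess line up with clock speed alone - i.e. it would imply zero per-clock improvement, which is the point of the question above. (Figures below are the widely reported numbers, not confirmed specs.)

```python
# Commonly cited clock speeds (widely reported, not officially confirmed):
gekko_mhz = 486.0      # GameCube CPU
broadway_mhz = 729.0   # Wii CPU

clock_ratio = broadway_mhz / gekko_mhz
print(clock_ratio)     # 1.5 - so "1.5x GC" credits the extra transistors
                       # with no per-clock gain at all
```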

As I know far less about CPU design than a lot of other members on this board, I'll stick with listening to them, and not getting too excited can only make things better in the end :)
 
What about the 8 pixel and texture pipelines, which I believe an infamous fake Ubisoft interview also claimed? Perhaps someone else can verify that information.

Actually, I asked him about that interview and whether that's where he got the info, and he said it says 8 pipelines in the SDK. He also said he thinks the interview was not 100% BS, because both Nintendo's and Ubisoft's lawyers had the story pulled, not the OP.

I dunno, I think it most likely really does have 8 pipelines, but I still think the EasyNintendo article was BS.

He also sent me the link to the SDK.

Mod: posting links to illegal torrents is not a peccadillo
 