Masayuki Chatani (SCEI CTO) interview

Shifty Geezer said:
Seems to me Cell would ideally have been paired up with Xenos, to switch that whole GPU over to pixel shading and have Cell process the vertices. Unified shaders would offer more flexibility than fixed pipes.
Why wouldn't nVidia then be able to make all the shaders in RSX just pixel, and let the Cell do the vertices?
Or maybe just put some vertex shaders in RSX, just in case Cell can't cope ;)
 
one said:
If unified shaders were better, I guess the Cell-based GPU might not have been discarded in the first place... just my 2 cents.
Presumably Cell's not so hot at pixel shading.

@ rabidrabbit : My guess is 32 pixel shader pipes is total overkill :p I want more physics and 'stuff' rather than have my Cell taken up with boring vertex transforms
 
SedentaryJourney said:
For example, we showed the demo that renders the city of London. It's not rendered by the GPU; the CELL does the lighting and texture processing, then outputs it to the frame buffer. Even without a GPU, the CELL alone can create good enough 3D graphics.
Now THAT's impressive.
I agree :oops:
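The interview claim above, that CELL does lighting and texture work in software and writes straight to the frame buffer, is essentially shading math run on a CPU. A minimal sketch of one such step, Lambert diffuse lighting, in Python (purely illustrative; the London demo's actual pipeline was never published):

```python
# Minimal CPU-side lighting: Lambert diffuse per pixel, written into a
# plain list standing in for the frame buffer. Illustrative only; the
# real demo would do this vectorized across SPEs, not per-pixel Python.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def lambert(normal, light_dir, albedo):
    """Diffuse shading: albedo * max(0, N . L)."""
    n = normalize(normal)
    l = normalize(light_dir)
    return tuple(c * max(0.0, dot(n, l)) for c in albedo)

# One "pixel": a surface facing the light head-on keeps its full albedo.
framebuffer = [lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), (0.8, 0.6, 0.4))]
print(framebuffer[0])  # (0.8, 0.6, 0.4)
```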
 
nAo said:
Why would you 'waste' CELL cycles to do some vertex shading when you have a GPU designed for it?
IMHO it's better to use CELL for other, even complementary, tasks.

Yep.

Shifty Geezer said:
Seems to me Cell would ideally have been paired up with Xenos, to switch that whole GPU over to pixel shading and have Cell process the vertices. Unified shaders would offer more flexibility than fixed pipes.

I have had the exact same thought! Xenos could be used as a traditional GPU or every shader ALU could be dedicated to PS and have CELL do VS.

one said:
If unified shaders were better, I guess the Cell-based GPU might not have been discarded in the first place... just my 2 cents.

What?

What does the success/failure of CELL-based GPUs have to do with unified shader architecture being good or bad?

It could very well be that STI's (or Toshiba's) design was flawed, or that the fundamental aspects of the CELL implementation are not as efficient as totally dedicated PS hardware, or it could be a lack of IP, or they just stunk it up (like XGI, Matrox, S3, 3DLabs, and every other company that has dropped out of the 3D race).

NV, once very negative toward unified shading, has admitted they are going to it. ATI already spent a lot of research getting the R400 up, now has a working commercial product in the R500, and will be going retail with the R600.

Sorry one, your 2 cents don't make much sense. Just because Sony did not go with a CELL-based GPU does not mean unified shaders are inferior. Fact is, both ATI and NV are going that direction. The only thing inferior was STI's attempt and the downplaying of Sony's competition.
 
one wrote:
If unified shaders were better, I guess the Cell-based GPU might not have been discarded in the first place... just my 2 cents.

Why ?

A GPU is more than unified shaders. The NV part could have had much better / faster filtering of pixels, much better texture compression and handling, a much better triangle setup engine, a much better pixel shading engine. The list can go on and on.


Nvidia makes graphics cards; they are constantly updating and tweaking their designs. Toshiba has almost no experience in this compared to Nvidia or ATI.
 
Wow, what a sudden bump. Indeed, I admit my logic on the Cell-GPU and unified shaders was a bit abrupt too :LOL:
Acert93 said:
NV, once very negative toward unified shading, has admitted they are going to it.
But they don't go to it in 2006. Also, Dave's Xenos article suggests that, other than Xenos, ATI's use of unified shaders will be limited to mobile GPUs and CAD apps for some time, not their flagship VPUs. My interpretation is that unified shaders are not inferior at all, but bring some trade-offs / restrictions / less performance per silicon area / etc., so I'm not really sure what the decisive factor was that made Microsoft adopt a unified-shader-architecture GPU.
 
But they don't go to it in 2006. Also, Dave's Xenos article suggests that, other than Xenos, ATI's use of unified shaders will be limited to mobile GPUs and CAD apps for some time, not their flagship VPUs
Because DX9 doesn't support it. WGF 2.0 will support it, though, and you will see ATI bring out a desktop version for it.



interpretation is that unified shaders are not inferior at all, but bring some trade-offs / restrictions / less performance per silicon area / etc., so I'm not really sure what the decisive factor was that made Microsoft adopt a unified-shader-architecture GPU.
I believe that when done right, unified shaders can only be better. In both designs you're going to have a limit on transistors, but with unified shaders you leave it up to the needs of the program to pick whether it needs more pixel or vertex power. With segmented shader setups you're going to have to make a choice and live with it.

As for MS, they picked it because it's most likely more efficient than a non-unified setup, and they are not constrained by the API like they are on the desktop. They built the API around Xenos, not Xenos around the API like under Windows.
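The transistor-budget argument in the last two posts can be made concrete with a toy utilization model: with a fixed split, the slower stage gates the frame, while a unified pool absorbs whatever mix the program throws at it. All unit counts and workload numbers below are hypothetical, not real hardware figures:

```python
# Toy model: 24 total shader ALUs, frame workload split between
# vertex work and pixel work in arbitrary "work units".

def split_throughput(vs_units, ps_units, vs_work, ps_work):
    """Fixed split: the frame finishes when the slower stage finishes."""
    vs_time = vs_work / vs_units
    ps_time = ps_work / ps_units
    return max(vs_time, ps_time)

def unified_throughput(total_units, vs_work, ps_work):
    """Unified pool: every unit shares the combined workload."""
    return (vs_work + ps_work) / total_units

# Pixel-heavy frame: 20 units of vertex work, 100 of pixel work.
split = split_throughput(8, 16, 20, 100)   # vertex ALUs sit half idle
unified = unified_throughput(24, 20, 100)  # no idle units
print(f"fixed split: {split:.2f}, unified pool: {unified:.2f}")
```

Ignoring scheduling overhead, the unified pool is never slower in this model, because the max of two per-stage times is always at least their work-weighted average; the real-world question the posters are debating is how much that overhead costs in silicon.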
 
I would tend to think it's all to do with performance.
When games start using more and more complex shaders, that could be the "break point" where we see the companies go unified, because I think both are "afraid" of losing benchmarks today.

In some senses I feel the Xenos chip has been overhyped, because we know nada about performance and just have fancy diagrams and numbers.
 
In some senses I feel the Xenos chip has been overhyped, because we know nada about performance and just have fancy diagrams and numbers.
Well, in some ways this is true.

However, we know it's capable of running the Unreal 3 engine pretty damn well at 720p, and most likely with 4x FSAA. A feat that is most likely out of reach even for the G70, especially with HDR applied.
 
jvd said:
However, we know it's capable of running the Unreal 3 engine pretty damn well at 720p, and most likely with 4x FSAA. A feat that is most likely out of reach even for the G70, especially with HDR applied.

Excuse me?! Says who?
 
First, I never once talked about the RSX, so why are you bringing it up?

Secondly, I said "A feat that is most likely out of reach even for the G70, especially with HDR applied."

Note the "most likely".


This is based on the performance levels of the 6800 and X800 XT series and what little I've heard. And, well, we know the G70 can't do 4x MSAA and HDR together, and we know that even on the G70 HDR takes a huge hit (about 30% or more), so with SSAA it will most likely be out of the question.
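As a sanity check on that "about 30% or more" figure: a 30% increase in per-frame rendering cost divides the frame rate by 1.3. A quick back-of-the-envelope in Python (the 60 fps baseline is a made-up figure for illustration, not a measured G70 number):

```python
# Back-of-the-envelope: what a ~30% rendering-cost increase does to
# frame rate. Baseline is hypothetical.
baseline_fps = 60.0
hdr_cost_increase = 0.30  # "about 30% or more" per the post above

# A 30% costlier frame takes 1.30x as long, so fps drops by that factor.
hdr_fps = baseline_fps / (1.0 + hdr_cost_increase)
print(f"{baseline_fps:.0f} fps -> {hdr_fps:.1f} fps with HDR enabled")
```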
 
jvd said:
First, I never once talked about the RSX, so why are you bringing it up?

Secondly, I said "A feat that is most likely out of reach even for the G70, especially with HDR applied."

Note the "most likely".


This is based on the performance levels of the 6800 and X800 XT series and what little I've heard. And, well, we know the G70 can't do 4x MSAA and HDR together, and we know that even on the G70 HDR takes a huge hit (about 30% or more), so with SSAA it will most likely be out of the question.
When Tim Sweeney demoed UE3 on the GeForce 7800 (at the GeForce 7800 unveiling), it looked like it ran smoothly, and it was running at 1280x1024 (a higher resolution than 720p)...

link to a clip with Tim talking in the background about the 1280x1024 res.: http://media.ps3.ign.com/articles/628/628835/vids_1.html
 