NVIDIA Sony Graphics Interview

...not necessarily. "Interfacing" and "being" can be different things. I think he brings up an interesting point that you may be missing.
 
randycat99 said:
...not necessarily. "Interfacing" and "being" can be different things. I think he brings up an interesting point that you may be missing.

Sure, I may have read it wrong, but it was this:

n00body said:
...
With that in mind, my theory on how the GPU will integrate is that it will interface directly with CELL and be treated as though it were just another PE (albeit a highly specialized one). That way, the GPU would be directly controlled by CELL and would reap the benefits of CELL's high-speed bus. It would also allow CELL's other PEs to work with the GPU if it became necessary.

i.e. the GPU to be treated as a PE (a CELL)... well, that sounds CELL-based to me... but I've discussed this many times already in the other thread... ;)
 
As far as accessing the hardware, it may "appear" as a PE to the CPU.
That does not make it certain that the GPU is "cell-based". Maybe it just complies with/conforms to/subsets a certain ISA, but the actual hardware underneath can be whatever. There is no strict cosmic rule that to "appear" as a PE, the hardware must be Cell-based.
 
randycat99 said:
As far as accessing the hardware, it may "appear" as a PE to the CPU.
That does not make it certain that the GPU is "cell-based". Maybe it just complies with/conforms to/subsets a certain ISA, but the actual hardware underneath can be whatever. There is no strict cosmic rule that to "appear" as a PE, the hardware must be Cell-based.

I'm agreeing with everything you've said here and you're repeating everything I've posted in the other thread! ;)
 
Jaws said:
i.e. the GPU to be treated as a PE (a CELL)... well, that sounds CELL-based to me... but I've discussed this many times already in the other thread... ;)

Sorry, I guess it didn't sound like you were agreeing, at all.
 
randycat99 said:
Jaws said:
i.e. the GPU to be treated as a PE (a CELL)... well, that sounds CELL-based to me... but I've discussed this many times already in the other thread... ;)

Sorry, I guess it didn't sound like you were agreeing, at all.

No probs. ;)

I've said my piece in the other thread and I'm not looking to repeat myself again! :)
 
IMO the GPU in a console cannot be a "bottleneck", as it is there in the first place to meet certain fill-rate requirements under certain circumstances; hence it wouldn't be there at all if it weren't so by design. The doom3 example jvd gave is a bad one, as doom3's visuals were not meant to run fully featured on a GF3, whereas the game's CPU requirements are more than met by a 5GHz CPU. In this regard, a game could just as well have been designed for a GF3 GPU + a 10GHz CPU, in which case a 5GHz CPU would be _the_ bottleneck.

Again IMO, a hypothetical system where an infinitely fast CPU was coupled to a finite-speed GPU would be quite useful. The reverse, otoh, would be a complete waste.
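
To make the bottleneck argument concrete, here's a toy model in C (the millisecond figures are invented purely for illustration): frame time is set by whichever unit finishes last, so a faster CPU buys nothing once the GPU is the slower of the two, and vice versa.

#include <stdio.h>

/* Toy frame-time model: whichever unit finishes last sets the frame
 * rate. The millisecond figures are invented for illustration only. */
int main(void)
{
    double cpu_ms = 4.0;   /* game logic, physics, command buffers */
    double gpu_ms = 16.0;  /* fill-rate / shading cost             */

    double frame_ms = (cpu_ms > gpu_ms) ? cpu_ms : gpu_ms;
    printf("frame time: %.1f ms (%.0f fps) -- %s-bound\n",
           frame_ms, 1000.0 / frame_ms,
           (cpu_ms > gpu_ms) ? "CPU" : "GPU");
    return 0;
}

Halving cpu_ms here changes nothing; only the slower unit matters, which is the sense in which a console GPU "by design" shouldn't be called a bottleneck.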
 
We all know that current CPUs can't fully exploit the FLOPs a GPU is capable of. From what I've heard, Cell's processing power makes a normal GPU's primary functions pointless. A GPU assists the CPU with its processing workload from 3D applications.

What GPU today could assist Cell with its processing? None. Cell doesn't need the assist when one Cell could potentially push over 200 GFlops. So I think Nvidia is primarily building a GPU that gives Sony's hardware a rich library of vastly scalable effects. I think Nvidia's ideas about future GPUs, up until their involvement with Sony on the PS3 project, have been solely PC-based. Even concepts for next-generation GPU applications were based on extending the life of static architectures with large caches. They were building GPUs to take advantage of dual-core static systems that try to perform like real streaming, dynamic, whatever-you-want-to-call-it architectures.
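
For what it's worth, the "over 200 GFlops" figure is easy to reproduce as back-of-envelope arithmetic, assuming the commonly cited Cell numbers (eight APUs, 4-wide single-precision FMA units, a ~3.2 GHz clock). Those are assumptions, not figures from this thread:

#include <stdio.h>

/* Back-of-envelope peak throughput for one Cell. All three numbers
 * are assumptions based on commonly cited specs, not confirmed here. */
int main(void)
{
    double clock_ghz      = 3.2;  /* assumed APU clock                      */
    int    apu_count      = 8;    /* APUs per Cell                          */
    int    flops_per_tick = 8;    /* 4-wide single-precision FMA = 8 FLOPs  */

    printf("peak: %.1f GFLOPS\n", clock_ghz * apu_count * flops_per_tick);
    return 0;  /* prints 204.8, i.e. "over 200 GFlops" */
}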

However, I can only imagine Nvidia's shock when Sony showed them what the true future of parallel processing would be with Cell. It had to force them to rethink how a GPU benefits a system and its primary role in future hardware. Kinda like when you stopped using cassette tapes for CDs.

I read most of you guys creating potential conceptual performance models for PS3's GPU without considering that fact. I think Nvidia is even saying this if you read between the lines. True parallel processing in a console started with PS2. Even today the EE puts many purely static chips to shame. Sony's weakness in the development community was a lack of understanding about parallel processing, and a GPU that was not rich with effects. From the looks of it, that has changed. One of the graphics card industry heavyweights has signed on to parallel processing. What company does more to spearhead the development of new effects?

:?:
 
SedentaryJourney said:
Something I've wondered about the GPU in the PS3 is if CELL is handling all the vertex shading, does that make all the vertex shaders on the GPU redundant or even disposable? And wouldn't that equal more headroom for pixel shading?
I also thought nV might just strip out GF7's vertex shaders to reduce costs, in keeping with (IIRC) Kirk's saying discrete pixel and vertex shaders are still preferable to unified ones.

I don't follow the console forum, tho, and I suspect this has been hashed out before. Anyone care to tell me why I'm wrong?
 
Pete said:
SedentaryJourney said:
Something I've wondered about the GPU in the PS3 is if CELL is handling all the vertex shading, does that make all the vertex shaders on the GPU redundant or even disposable? And wouldn't that equal more headroom for pixel shading?
I also thought nV might just strip out GF7's vertex shaders to reduce costs, in keeping with (IIRC) Kirk's saying discrete pixel and vertex shaders are still preferable to unified ones.

I don't follow the console forum, tho, and I suspect this has been hashed out before. Anyone care to tell me why I'm wrong?

The current trend is per-pixel lighting, which is mostly calculated in the pixel shaders. Vertex shaders are used for setup and animation.

Perhaps, in PS3, subdivision surfaces will be common: Cell will take over the job of animation and do the setup by tessellating to micropolygons, letting the vertex shaders in the NV GPU apply (displacement) mapping and calculate lighting, and the pixel shaders do fancy composition or other funky effects. Or perhaps not.

Still, I think it wouldn't hurt to have vertex shaders there, even if they aren't unified with the pixel shaders like Xenon's.
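
To make that proposed split a bit more concrete, here's a minimal sketch of the CPU-side half, using uniform tessellation of a simple bilinear patch as a stand-in for real subdivision surfaces (Catmull-Clark and friends are far more involved). The emitted vertices are what the GPU's vertex shaders would then displace and light; nothing here reflects any actual PS3 pipeline:

#include <stdio.h>

/* CPU-side tessellation sketch: split one patch into a grid of
 * micropolygon vertices. A bilinear patch stands in for a real
 * subdivision surface purely for illustration. */
typedef struct { float x, y, z; } vec3;

static vec3 lerp3(vec3 a, vec3 b, float t)
{
    vec3 r = { a.x + (b.x - a.x) * t,
               a.y + (b.y - a.y) * t,
               a.z + (b.z - a.z) * t };
    return r;
}

int main(void)
{
    vec3 corner[4] = { {0,0,0}, {1,0,0}, {0,0,1}, {1,0,1} }; /* patch corners   */
    int  n = 4;                                              /* grid resolution */

    for (int i = 0; i <= n; i++) {
        for (int j = 0; j <= n; j++) {
            float u = (float)i / n, v = (float)j / n;
            vec3 top = lerp3(corner[0], corner[1], u);
            vec3 bot = lerp3(corner[2], corner[3], u);
            vec3 p   = lerp3(top, bot, v);
            /* p would be streamed to the GPU for displacement/lighting */
            printf("v %.2f %.2f %.2f\n", p.x, p.y, p.z);
        }
    }
    return 0;
}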
 
I'm getting really tired of all this Cell-based GPU talk.

What is Cell? That's the main question. And now we have an official answer from Sony/Toshiba/IBM.

Cell = 1 PPC + 8 APU + XDRI. How (and why) in the world would a GPU be "based" on this? I see only the XDR interface here as something remotely usable for GPU construction.
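
Purely as an illustration of that "1 PPC + 8 APU + XDRI" composition (not any official definition), it can be sketched as a plain C struct; all field names and sizes below are made up:

#include <stdio.h>

/* Hypothetical sketch of the announced Cell composition. */
struct ppc_core { unsigned l2_cache_kb; };   /* control core            */
struct apu      { unsigned local_store_kb; };/* vector coprocessor      */
struct xdr_if   { unsigned channels; };      /* XDR memory interface    */

struct cell {
    struct ppc_core ppe;      /* one general-purpose PowerPC core */
    struct apu      apus[8];  /* eight APUs                       */
    struct xdr_if   mem;      /* XDR interface                    */
};

int main(void)
{
    struct cell c = { {512}, {{256}}, {2} };  /* sizes are made up */
    printf("APUs per Cell: %d\n", (int)(sizeof c.apus / sizeof c.apus[0]));
    return 0;
}

The point of the sketch is the same one made above: only the memory interface looks like something a GPU would borrow.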

Cell is a CPU, an extended version of a PowerPC core with very powerful APUs (powerful basically because of their number) and an XDR interface. What exactly would the PowerPC core do in a hypothetical Cell-based GPU? Why would they need one there if they already have one?

As I see it: the PS3 GPU is going to be a custom version of an NV5x GPU. Most of the customization will be done in the interface area - they'll probably connect the GPU directly to Cell, not to the chipset (no, this does NOT make the GPU Cell-based, the same way DDR DIMMs in Athlon 64 systems are not K8-based even though they connect directly to the K8 core; rough example, but it'll do). They may tweak cache sizes and ALU counts. They may lighten the VS part of the chip a bit. They will probably cut most legacy PC blocks. And so on. But at its core, on the ALU level, it'll still be a member of the NV5x family and not "based on Cell" in any way.
 
That's really already been settled. Everybody (at least everybody who's not completely out of the loop) already knows the GPU is to be a custom derivative of the next-gen GPU core from Nvidia.


Later
 
DegustatoR said:
I'm getting really tired of all this Cell-based GPU talk.

What is Cell? That's the main question. And now we have an official answer from Sony/Toshiba/IBM.

Cell = 1 PPC + 8 APU + XDRI. How (and why) in the world would a GPU be "based" on this? I see only the XDR interface here as something remotely usable for GPU construction.

Cell is a CPU, an extended version of a PowerPC core with very powerful APUs (powerful basically because of their number) and an XDR interface. What exactly would the PowerPC core do in a hypothetical Cell-based GPU? Why would they need one there if they already have one?

As I see it: the PS3 GPU is going to be a custom version of an NV5x GPU. Most of the customization will be done in the interface area - they'll probably connect the GPU directly to Cell, not to the chipset (no, this does NOT make the GPU Cell-based, the same way DDR DIMMs in Athlon 64 systems are not K8-based even though they connect directly to the K8 core; rough example, but it'll do). They may tweak cache sizes and ALU counts. They may lighten the VS part of the chip a bit. They will probably cut most legacy PC blocks. And so on. But at its core, on the ALU level, it'll still be a member of the NV5x family and not "based on Cell" in any way.

I agree with you. It looks extremely unlikely we will be seeing any Cell-based GPU; for now, Cell will be a CPU. The PS3 will have a fairly traditional GPU from Nvidia, of either the NV5x or NV6x generation. It will be customized in certain areas to work with Cell, not to be a Cell. I think the vertex shaders will be either gone completely or reduced in number and/or complexity. Most of the silicon of the GPU will be dedicated toward pixel shading / rasterizing, while the Cell CPU handles all or much of the geometry/lighting/vertex shading calculations.
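
As a rough illustration of what "the Cell CPU handles the vertex shading calculations" would mean in practice, here's a minimal sketch of the core of a software vertex shader, a 4x4 transform to clip space. Everything in it is generic illustration; no actual PS3 API or toolchain is implied:

#include <stdio.h>

/* Minimal software vertex transform -- the kind of work that would
 * move from GPU vertex shaders onto Cell under the split described
 * above. Generic illustration only. */
typedef struct { float v[4]; } vec4;
typedef struct { float m[4][4]; } mat4;

static vec4 transform(const mat4 *m, vec4 in)
{
    vec4 out = {{0, 0, 0, 0}};
    for (int row = 0; row < 4; row++)
        for (int col = 0; col < 4; col++)
            out.v[row] += m->m[row][col] * in.v[col];
    return out;
}

int main(void)
{
    mat4 mvp = {{ {1,0,0,0}, {0,1,0,0}, {0,0,1,-5}, {0,0,0,1} }}; /* toy matrix */
    vec4 pos = {{1, 2, 3, 1}};

    vec4 clip = transform(&mvp, pos); /* clip-space position handed to the GPU */
    printf("%.1f %.1f %.1f %.1f\n", clip.v[0], clip.v[1], clip.v[2], clip.v[3]);
    return 0;
}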
 
Why would you have a CELL-based GPU in the PS3 when Nvidia is able to supply a GPU with enough performance on its own? It is completely illogical. Why destroy something that already works just fine? The CELL CPU in the PS3 will be a powerhouse of a chip, and so will the Nvidia-supplied GPU. Seems to me that they complement each other very well.
 
Agreed, I'm actually pleased Sony chose Nvidia for the PS3's GPU. I suspect a Cell-based architecture for rendering would be a nightmare for programmers to optimize, e.g. having to write all your branching by hand, etc.

The price point on this machine seems to be spiraling out of control, though. A Cell CPU is already extremely expensive; if you couple it to Nvidia (who are known to not like taking a cut in profits), you end up with a situation where Sony could conceivably be losing more per machine than M$ did with the Xbox1. Which company has deeper pockets? Well, that much is obvious.

This is great for consumers of course; otoh it's also hit or miss. If PS3 doesn't succeed, it could very well put Sony out of the video game market.
 
Fred said:
The price point on this machine seems to be spiraling out of control, though. A Cell CPU is already extremely expensive; if you couple it to Nvidia (who are known to not like taking a cut in profits), you end up with a situation where Sony could conceivably be losing more per machine than M$ did with the Xbox1. Which company has deeper pockets? Well, that much is obvious.

Well, we'd have to see real statistical data to talk about that.
 
This is great for consumers of course; otoh it's also hit or miss. If PS3 doesn't succeed, it could very well put Sony out of the video game market.
Well, truth be told, they are probably not interested in that market if they are not leading/monopolizing it, so it makes sense they will try everything and anything to make it happen.
 
marconelly! said:
This is great for consumers of course; otoh it's also hit or miss. If PS3 doesn't succeed, it could very well put Sony out of the video game market.
Well, truth be told, they are probably not interested in that market if they are not leading/monopolizing it, so it makes sense they will try everything and anything to make it happen.

Like MS did? :devilish:
 
Sony probably saved money by going with Nvidia. For Sony to design a cutting-edge GPU themselves, as well as the development tools to go along with it, would have been much more expensive and time-consuming. I think Nvidia said it cost something like $350 million to develop the GF6 series. How much do you think Sony is paying to license a design from Nvidia? Who knows, but it's certainly much less than that.
 