Fillrate in the Next Gen Consoles

onetimeposter said:
50% for AI, physics and graphics is a lot lower than the expected 70%

Err..what has this got to do with what EA was saying there?

And more generally... what?

edit - bah, I see the stuff you seem to be referring to now, but that's not related to the question you were quoting. And I'm not sure what point you're making
 
Shifty Geezer said:
70% expected by who and for what reason?

Because Sony doesn't have any major software in the console, like Media Center capability or Xbox Live enabled with every game, so some processing power is saved there.

Also, it's interesting that 4 SPUs would give 50% of the processing power.
 
ERP said:
Bandwidth might be a limiting factor, more so on PS3 than Xenon.

GPU wise I agree with you 100%. CPU wise... well, let's say that my perfect console would have Xenos and the eDRAM daughter die in the place of RSX, with the CPU being the Broadband Engine and having separate VRAM and XDR RAM blocks, so that Xenos would have a direct connection to its own VRAM pool (in addition to the daughter die) and the CPU would have its own RAM pool, maximizing bandwidth for each chip.

Even though you take frame-buffer/z-buffer/stencil-buffer reads and writes away from the shared UMA pool, I still don't much like having 22.4 GB/s shared between such hungry beasts as Xenos and the XeCPU. I am sure developers will make it work well (trying to move data directly between the CPU and the GPU without hitting main RAM too often... which should also help the 1 MB L2 cache be more effective by being less stressed), but a man can dream ;).
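The worry about that shared pool can be put in numbers with some back-of-envelope arithmetic. A sketch in Python; the peak fillrate and per-pixel cost below are illustrative assumptions, not vendor figures:

```python
# Back-of-envelope: can framebuffer traffic alone saturate a shared 22.4 GB/s bus?
# All figures are illustrative assumptions, not measured numbers.
PEAK_FILLRATE_GPIX = 4.0   # assumed peak ROP rate, Gpixels/s
BYTES_PER_PIXEL = 8        # 32-bit colour write + 32-bit Z read, no blending
BUS_GBS = 22.4             # shared GDDR3 bandwidth

traffic = PEAK_FILLRATE_GPIX * BYTES_PER_PIXEL  # GB/s at full tilt
print(f"{traffic:.1f} GB/s of framebuffer traffic vs {BUS_GBS} GB/s available")
```

At anywhere near peak fillrate, colour and Z traffic alone would exceed the shared bus, which is exactly the traffic the eDRAM daughter die is there to absorb.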
 
onetimeposter said:
Because Sony doesn't have any major software in the console, like Media Center capability or Xbox Live enabled with every game, so some processing power is saved there.

You think PS3 won't have OS/non-game functionality overhead?

onetimeposter said:
Also, it's interesting that 4 SPUs would give 50% of the processing power.

He doesn't limit "processing power" to the CPU, by the way. He's saying that in the previous gen, 20% of your processing power could be used for non-rendering tasks; next gen it may be more like 50%. Is that across the CPU and the GPU, or just the CPU? The former seems likely to me. On PS2 your CPU was involved heavily in graphics work (because it had to be), but I think he may simply be referring to how that needn't be the case anymore, and you've now got a lot more power free on the CPU side for everything else. The alternative interpretation is that he's saying you'll be using 50% of your CPU for graphics.

Anyway, I'm wondering what the other 3 SPEs are doing..

The mocap stuff sounds very encouraging. Pity there were no photo-journalists there!
 
Is it normal to max out the fillrate so fast? What are the effects of a fillrate limitation? How fast is Cell's fillrate? How can it help to compensate for RSX's deficiency?

Cell and Xenos, one can dream..
 
wrongdoer said:
Is it normal to max out the fillrate so fast?

Well, there's nothing abnormal about it; every game will be different.

wrongdoer said:
What are the effects of a fillrate limitation?

It's only a limitation if it's causing unacceptable performance. But there's no indication of that here, simply that it's the bound in this case. Every game has a bound of one type or another.

wrongdoer said:
How fast is Cell's fillrate?

Unknown, but the possibility of moving some or all alpha-blending/transparencies to Cell has been discussed. It's possible to do that, as long as sufficient resources are free. If your game were GPU-bound, there may well be. But that's theoretical for now.
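For reference, the per-pixel work such a scheme would move onto Cell is the classic "over" blend. A minimal scalar Python sketch; a real SPE version would run vectorised over tiles held in local store:

```python
def blend_over(src, dst, alpha):
    """Classic src-over-dst alpha blend for one colour channel.

    src, dst: 0-255 channel values; alpha: 0.0-1.0 source opacity.
    """
    return round(src * alpha + dst * (1.0 - alpha))

# A 50%-opaque white particle over a black background gives mid grey.
print(blend_over(255, 0, 0.5))   # 128 (rounded)
```

The arithmetic itself is trivial; the hard part on Cell is DMA-ing framebuffer tiles in and out of local store fast enough, which is where the bandwidth questions in this thread come in.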
 
Titanio said:
Well, there's nothing abnormal about it; every game will be different.

It's only a limitation if it's causing unacceptable performance. But there's no indication of that here, simply that it's the bound in this case. Every game has a bound of one type or another.

Unknown, but the possibility of moving some or all alpha-blending/transparencies to Cell has been discussed. It's possible to do that, as long as sufficient resources are free. If your game were GPU-bound, there may well be. But that's theoretical for now.

Yes, but with EA's rep for low textures it's not looking as good as previously suspected, at least for in-game. Yes, they have been discussed, but they were on PS2 as well. There would be a massive bandwidth cost if Cell tried to take the GPU's speciality for itself; besides, nearly all developers won't do it, considering they'll have enough difficulty programming for PS3 in the first generation.
 
onetimeposter said:
Because Sony doesn't have any major software in the console, like Media Center capability or Xbox Live enabled with every game, so some processing power is saved there.

Also, it's interesting that 4 SPUs would give 50% of the processing power.
1) I doubt 50% was a scientific measurement, more a ball-park figure. 4 SPUs is half the cores in Cell, hence 'half the power'.
2) How do you come to a 70% figure after graphics, with OS work taking up 20%? The article sayeth 50% is left for non-graphics stuff after graphics, and you seem to think that 70% of processing power should be available for non-graphics stuff after graphics. I can't see how you get to that figure.

There's no structured division of resources. One game might use 20% for graphics and 80% for physics and other stuff. Another might throw most of the processor into graphics, with less complex AI and physics (eg. a flight sim), in which case 80% might be used on graphics and 20% on non-graphics. This EA statement is only giving an example of one game. In this case they have produced their graphics and can't do any more graphics work because the GPU has reached its fillrate limit. 4 cores are still unused at this point, so they can be used for other stuff. That's it.

There are no system-wide performance conclusions that can be rationally gleaned from this. We've no idea how efficiently they're using fillrate or how they're using Cell for graphics. It could well be that another game uses 80% of Cell for graphics without ever hitting the fillrate limit. And there's no reason to think that if OS work were running in the background it'd consume 20% of resources. And there's no evidence of what sort of OS/non-gaming functionality might be running either.
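To put "hitting the fillrate limit" in concrete terms, the budget works out like this. A sketch with illustrative numbers, not anything from the EA talk:

```python
# How many times can a GPU touch every screen pixel per frame before
# raw fillrate becomes the bound? Figures below are illustrative.
def max_overdraw(peak_gpixels_per_s, width, height, fps):
    pixels_per_frame = width * height
    pixel_budget = peak_gpixels_per_s * 1e9 / fps  # pixels writable per frame
    return pixel_budget / pixels_per_frame

# Note: effective rates with alpha blending, texturing and tiny particle
# quads are a small fraction of this theoretical peak.
print(f"{max_overdraw(4.0, 1280, 720, 60):.1f}x overdraw at 720p/60fps")
```

The theoretical number looks generous, which is the point: a game only hits the fillrate wall when layered transparencies, blending and small triangles eat that budget, and that is entirely per-game.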
 
Shifty Geezer said:
1) I doubt 50% was a scientific measurement, more a ball-park figure. 4 SPUs is half the cores in Cell, hence 'half the power'.
2) How do you come to a 70% figure after graphics, with OS work taking up 20%? The article sayeth 50% is left for non-graphics stuff after graphics, and you seem to think that 70% of processing power should be available for non-graphics stuff after graphics. I can't see how you get to that figure.

There's no structured division of resources. One game might use 20% for graphics and 80% for physics and other stuff. Another might throw most of the processor into graphics, with less complex AI and physics (eg. a flight sim), in which case 80% might be used on graphics and 20% on non-graphics. This EA statement is only giving an example of one game. In this case they have produced their graphics and can't do any more graphics work because the GPU has reached its fillrate limit. 4 cores are still unused at this point, so they can be used for other stuff. That's it. There are no system-wide performance conclusions that can be rationally gleaned from this. We've no idea how efficiently they're using fillrate or how they're using Cell for graphics. It could well be that another game uses 80% of Cell for graphics without ever hitting the fillrate limit. And there's no reason to think that if OS work were running in the background it'd consume 20% of resources. And there's no evidence of what sort of OS/non-gaming functionality might be running either.

I was referring to Xbox 360, where the OS, sound and Live use 25% of one of the cores, leaving two cores and 75% of the remaining core for the game's graphics, AI and physics. I talked to a developer at Gearbox, who are making the Brothers in Arms games for PS3, and he said don't be surprised if GPU usage in the first generation is around 20% of its potential, for both PS3 and Xbox 360. The CPU will always be used 100%, and if a developer can't use 100% of the CPU's potential then gameplay will stutter a lot, but the GPU is the solvent for next-gen gaming.
 
onetimeposter said:
yes but with EA's rep for low textures its not looking that good as previously suspected, atleast for ingame. yes they have been discussed but they were on PS2 as well.

I'm sorry, I'm not sure what you're referring to or what you're talking about (?)

onetimeposter said:
There will be massive bandwidth cost if Cell tries to take away the GPU speciality for itself

What I described would actually save a lot of bandwidth, both memory and cell-rsx bandwidth, if your game was using a lot of transparencies.
 
Once MSAA is turned on, compression comes into play, making bandwidth calculations tougher. Except on Xenos, where we know that 4x MSAA can be achieved without an increase in bandwidth consumption.

And this is nothing but M$ marketing blabla. In an interview the PGR3 devs state they will be using 2xAA at 720p, and 4xAA at 640x480, so it's far from "free", considering the game is hardly running at 30 fps right now.

@ EA: I guess they might be using the other 3 SPEs for post-processing or HDR rendering (maybe AA?) for "free" (for the GPU).
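For what it's worth, rough arithmetic shows why keeping the multisampled buffer on the daughter die matters. The sample counts are real; the fillrate and per-sample figures are illustrative assumptions:

```python
# If every MSAA sample had to hit external memory, 4x would quadruple
# framebuffer traffic. Keeping those samples in eDRAM sidesteps the bus.
def sample_traffic_gbs(gpixels_per_s, samples, bytes_per_sample=8):
    """Colour+Z sample traffic in GB/s, assuming no compression."""
    return gpixels_per_s * samples * bytes_per_sample

no_aa = sample_traffic_gbs(4.0, 1)   # all samples to external RAM
x4_aa = sample_traffic_gbs(4.0, 4)
print(no_aa, x4_aa)  # 32.0 128.0, and the 4x figure dwarfs a 22.4 GB/s bus
```

On Xenos only the resolved one-sample-per-pixel image crosses back to main memory, which is the basis of the "free" claim; whether a given game can afford the eDRAM tiling cost is a separate question.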
 
onetimeposter said:
Because Sony doesn't have any major software in the console, like Media Center capability or Xbox Live enabled with every game, so some processing power is saved there.
And exactly what makes you so sure of this?
 
ERP said:
Bandwidth might be a limiting factor, more so on PS3 than Xenon.

But where it will likely hurt is with low-complexity transparent polygons, mostly particles. FWIW, if there hadn't been this premature obsession with HD this time around, I'd have said it was probably a non-issue.

Are you out of your mind? Repeat after me: 'HD is revolutionary, HD is revolutionary, HD is revolutionary' </sarcasm>

Simon F said:
And are probably particularly nasty when it comes to page breaks thanks to a very likely lack of coherency.

Well, you can easily impose (spatial) coherency onto them. You know, software renderers can do anything hardware can: tiling, swizzling, etc. : )
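As a concrete instance of that kind of swizzling, Morton (Z-order) addressing interleaves the x and y bits so that 2D-adjacent pixels land at nearby memory addresses. A minimal sketch:

```python
def morton_encode(x, y, bits=16):
    """Interleave the bits of x and y into a Z-order (Morton) index,
    giving spatially nearby pixels nearby memory addresses."""
    index = 0
    for i in range(bits):
        index |= ((x >> i) & 1) << (2 * i)      # x bits go to even positions
        index |= ((y >> i) & 1) << (2 * i + 1)  # y bits go to odd positions
    return index

# A 2x2 pixel block maps to four consecutive addresses:
print([morton_encode(x, y) for y in range(2) for x in range(2)])  # [0, 1, 2, 3]
```

This is the same trick hardware texture swizzling uses; a software renderer laying its buffers out this way keeps particle overdraw from thrashing across DRAM pages.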
 
darkblu said:
Well, you can easily impose (spatial) coherency onto them. You know, software renderers can do anything hardware can: tiling, swizzling, etc. : )
Well, if software-rendering a particle volume, I'd want to raycast it, avoiding most transparent overdraw completely. :p
But Simon was referring to hardware rendering, I believe.
 
And this is nothing but M$ marketing blabla. In an interview the PGR3 devs state they will be using 2xAA at 720p, and 4xAA at 640x480, so it's far from "free", considering the game is hardly running at 30 fps right now.

If the game is hardly running at 30 fps, I doubt it's because of the AA, and more likely because of all the polygons they have on screen.
 
Do the polygons change going from 640x480 to 720p in a way that would force that sort of drop in AA? I don't understand how the two are dependent upon each other. I would imagine, since they offloaded the AA logic to the daughter die, it shouldn't really have an effect when scaling up the resolution. Moreover, I thought all 360 games were internally rendered at 720p and then scaled to various resolutions, so I'm REALLY confused.
 
Perhaps the 4xAA at 640x480 is due to downsampling? AFAIK all games render at 720p; that's what we've been told all this time.
 
onetimeposter said:
I was referring to Xbox 360, where the OS, sound and Live use 25% of one of the cores, leaving two cores and 75% of the remaining core for the game's graphics, AI and physics. I talked to a developer at Gearbox, who are making the Brothers in Arms games for PS3, and he said don't be surprised if GPU usage in the first generation is around 20% of its potential, for both PS3 and Xbox 360. The CPU will always be used 100%, and if a developer can't use 100% of the CPU's potential then gameplay will stutter a lot, but the GPU is the solvent for next-gen gaming.


I think I read that the 25% for sound was when they were pushing its absolute limits (using all 256 voices simultaneously). Anyone else remember that?

A very minor point, and insignificant to the topics being discussed here, but just wondering...

J
 
Nemo80 said:
And this is nothing but M$ marketing blabla. In an interview the PGR3 devs state they will be using 2xAA at 720p, and 4xAA at 640x480, so it's far from "free", considering the game is hardly running at 30 fps right now.

@ EA: I guess they might be using the other 3 SPEs for post-processing or HDR rendering (maybe AA?) for "free" (for the GPU).

Of course it is. Your 'M$' gave away your bias.
 