Cell/CPU architectures as a GPU (Frostbite Spinoff)

Are we sure we're speaking of the same culling(s)?
Like back-face culling, Hi-Z prepass/rejection, and non-hardware-based techniques such as the occlusion culling in the Frostbite 2 engine, which is essentially a lightweight geometry pass to define occluders and reject more polygons than the GPU can with its hardware techniques alone. They are not mutually exclusive, and Cell proved faster at the latter.
EDIT: posting is fast, I'm not sure who I was answering... :LOL:
 
Are we sure we're speaking of the same culling(s)?
Like back-face culling, Hi-Z prepass/rejection, and non-hardware-based techniques such as the occlusion culling in the Frostbite 2 engine, which is essentially a lightweight geometry pass to define occluders and reject more polygons than the GPU can with its hardware techniques alone. They are not mutually exclusive, and Cell proved faster at the latter.

I stand corrected, liolio. You're correct to point out that the SPU triangle culling and the occlusion culling are 2 different things, but they can be applied together.
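
To make the distinction concrete, here is a minimal sketch of the triangle-rejection half of it in plain C++ (back-facing and degenerate triangles dropped on the CPU side before the GPU ever sees them). The struct layout, winding convention and batch interface are illustrative assumptions, not DICE's or Sony's actual SPU code, which would operate on DMA'd batches in local store.

```cpp
#include <cstdint>
#include <vector>

// Illustrative types only; real SPU code would work on DMA'd batches in local store.
struct Vec4     { float x, y, z, w; };           // clip-space position
struct Triangle { std::uint32_t i0, i1, i2; };   // indices into post-transform vertices

// True if the triangle can be skipped (back-facing or zero area).
// Homogeneous screen-space area test; the winding/sign convention is an assumption.
static bool CullTriangle(const Vec4& a, const Vec4& b, const Vec4& c)
{
    const float area =
        (b.x * a.w - a.x * b.w) * (c.y * a.w - a.y * c.w) -
        (c.x * a.w - a.x * c.w) * (b.y * a.w - a.y * b.w);
    return area <= 0.0f;
}

// Builds a compacted index list containing only the surviving triangles;
// this shrunken index buffer is what would be handed to the GPU.
std::vector<std::uint32_t> CullBatch(const std::vector<Vec4>& verts,
                                     const std::vector<Triangle>& tris)
{
    std::vector<std::uint32_t> out;
    out.reserve(tris.size() * 3);
    for (const Triangle& t : tris)
    {
        if (!CullTriangle(verts[t.i0], verts[t.i1], verts[t.i2]))
        {
            out.push_back(t.i0);
            out.push_back(t.i1);
            out.push_back(t.i2);
        }
    }
    return out;
}
```

The occlusion-culling side would instead rasterize a handful of occluders into a small software depth buffer and test object bounds against it, which is the "lesser geometry pass" described above; the two can be chained, as noted.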
 
So if the PS3 version of BF3 doesn't end up looking and/or performing better than the 360 version, I guess there will be a lot of awkward silences in this thread.
 
Think shader aliasing

MLAA doesn't help shader aliasing in any way...

What about 360 games? Gears? Halo *? There is Alan Wake, if you can count it at that resolution.
edit: Oh yeah, I forgot about Crysis 2 not using MSAA either.

Shading per sample is expensive regardless of the hardware; that's why GG went with MLAA. But then there's a ton of shader aliasing, since they're not shading two samples per pixel anymore. You can still get away with MSAA in Unreal Engine 3 for the static lighting and such. It would also still help with thin geometry.

As a cheap solution, frame-blend AA has its merits in scenes set in large environments, where pixels change far less from frame to frame than they would in a corridor.

You may also want to consider other aspects to MSAA, like the particle buffer trick or the higher sampling rate for z-only passes.
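
For the frame-blend idea mentioned above, here is a minimal per-pixel sketch, assuming a fixed blend weight and no reprojection; a real implementation would reproject the history buffer with motion vectors and clamp against neighbours to limit ghosting.

```cpp
#include <cstddef>
#include <vector>

struct Color { float r, g, b; };

// Blends the current frame into a persistent history buffer (the previous blended
// output) and leaves the anti-aliased result in 'history' for presentation.
// 'alpha' is the weight of the new frame: lower values smooth more but ghost more.
void FrameBlendAA(const std::vector<Color>& current,
                  std::vector<Color>& history,
                  float alpha = 0.5f)
{
    for (std::size_t i = 0; i < current.size(); ++i)
    {
        history[i].r += alpha * (current[i].r - history[i].r);
        history[i].g += alpha * (current[i].g - history[i].g);
        history[i].b += alpha * (current[i].b - history[i].b);
    }
}
```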

So if the PS3 version of BF3 doesn't end up looking and/or performing better than the 360 version, I guess there will be a lot of awkward silences in this thread.

On this forum, I highly doubt that. :|
 
Yes, because the EDRAM is only enough for 2xAA with forward rendering AFAIK. Or for some seriously sub-HD resolution with 4x AA ;)

I thought the framebuffer became too large for the EDRAM any time you used AA, regardless of whether it was 2x or 4x.
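
Back-of-the-envelope numbers, assuming a 32-bit colour target plus a 32-bit depth/stencil buffer at 1280x720:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double edramMB       = 10.0;             // Xenos EDRAM capacity
    const int    width         = 1280, height = 720;
    const int    bytesPerPixel = 4 /*colour*/ + 4 /*depth + stencil*/;

    for (int samples : {1, 2, 4})
    {
        const double sizeMB = double(width) * height * bytesPerPixel * samples
                              / (1024.0 * 1024.0);
        const int tiles = int(std::ceil(sizeMB / edramMB));   // screen tiles needed
        std::printf("%dxAA: %.1f MB -> %d tile%s\n",
                    samples, sizeMB, tiles, tiles == 1 ? "" : "s");
    }
    return 0;
}
```

So the plain 720p buffers just squeeze into the 10 MB, while 2xAA needs two tiles and 4xAA three, which is where the predicated-tiling cost (or the temptation to drop below 720p instead) comes from.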

Of course. But say we take the (admittedly near-ideal) case of AA solutions. The only 'fixed hardware' AA solution that the RSX provides is 2xQAA, which has proven to be very unpopular. Now, I doubt that this fixed hardware wasted a lot of die space, but OK. The next option, MSAA, is already very expensive on the RSX, quoted as cutting performance by something like 25% per AA factor (e.g. 4xMSAA halves performance or worse).

IIRC QAA has the same performance hit as 2xMSAA. I thought it was basically 2xMSAA with an extra blur filter added which is what helps against edge aliasing but hurts texture clarity.

So if the PS3 version of BF3 doesn't end up looking and/or performing better than the 360 version, I guess there will be a lot of awkward silences in this thread.

No, there will be a lot of crow-eating to be had and excuses being made.

The SPUs would be underutilized otherwise. They are there to work on graphics too, and have been from day one.

Seeing as games rarely go without using the SPUs for graphical tasks, and the vast majority of SPU optimization has been for the benefit of graphics, I'm not so sure anyone can say this.

Who's to say how much better physics, AI, etc. would have been if the PS3 had had a better GPU? Maybe we'll see this happen with the PS4 if Sony uses Cell again.

As for post-processing, if you want to quote them, Naughty Dog mentioned that they can get better results on the SPUs compared to RSX _and_ the 360. The results are (more?) mathematically correct, and fast enough. Why not use the SPUs then?

I didn't know Naughty Dog made a game on the 360, what's it called? :p
 
MLAA doesn't help shader aliasing in any way...
Of course it can. MLAA is a fully screenspace solution; it doesn't really have any knowledge of polygon edges. As long as a pattern matches, it can help with shadow or shader aliasing. As I said, most importantly it can be improved over time.
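
For illustration, a minimal sketch of the kind of colour-discontinuity test MLAA starts from (a simple luminance delta against the left and top neighbours; the threshold value is an assumption). Because the test only ever sees final pixel colours, it flags shader and shadow edges just as readily as geometric ones; the actual pattern classification and blending steps are omitted.

```cpp
#include <cmath>
#include <vector>

struct Color { float r, g, b; };

static float Luma(const Color& c)            // Rec. 709 luminance weights
{
    return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
}

// Marks pixels whose luminance differs noticeably from the left or top neighbour.
// Real MLAA then classifies the edge shapes (L/Z/U patterns) and blends along them.
std::vector<bool> DetectEdges(const std::vector<Color>& image,
                              int width, int height, float threshold = 0.1f)
{
    std::vector<bool> edge(image.size(), false);
    for (int y = 1; y < height; ++y)
        for (int x = 1; x < width; ++x)
        {
            const float l  = Luma(image[y * width + x]);
            const float lL = Luma(image[y * width + (x - 1)]);
            const float lT = Luma(image[(y - 1) * width + x]);
            if (std::fabs(l - lL) > threshold || std::fabs(l - lT) > threshold)
                edge[y * width + x] = true;
        }
    return edge;
}
```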
Shading per sample is expensive regardless of the hardware; that's why GG went with MLAA.
... along with the memory and bandwidth cost, you mean.
But then there's a ton of shader aliasing, since they're not shading two samples per pixel anymore. You can still get away with MSAA in Unreal Engine 3 for the static lighting and such. It would also still help with thin geometry.

As a cheap solution, frame-blend AA has its merits in scenes set in large environments, where pixels change far less from frame to frame than they would in a corridor.
I'm not sure how those are relevant to joker's claim that MSAA is only feasible on the 360 and PC, or to the total lack of evidence for it.
You may also want to consider other aspects to MSAA, like the particle buffer trick or the higher sampling rate for z-only passes.
True, it has its benefits, but it's clearly a very inefficient technique, even though it's hardware-based.
 
MSAA is very efficient hardware-wise ... it's just that the claim of "we have enough bandwidth now" is and has always been a lie.

It doesn't play too nicely with deferred shading out of the box, but there are solutions for that too ... in my opinion, multiple samples should be the basis of any other anti-aliasing method; some things just can't be done well without multiple samples.
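
One common shape of those solutions, sketched loosely: shade per pixel everywhere, and only fall back to per-sample lighting where the G-buffer samples within a pixel actually diverge. The structures, thresholds and the stand-in lighting function below are illustrative assumptions, not any particular engine's implementation.

```cpp
#include <array>
#include <cmath>

constexpr int kSamples = 4;                                  // assume a 4xMSAA G-buffer

struct GSample { float nx, ny, nz, depth; };                 // one G-buffer sample
struct Pixel   { std::array<GSample, kSamples> samples; };   // one multisampled pixel
struct Color   { float r, g, b; };

// Trivial stand-in lighting (one directional light), purely for illustration.
static Color ShadeSample(const GSample& s)
{
    const float l = std::fmax(0.0f, 0.577f * (s.nx + s.ny + s.nz));
    return { l, l, l };
}

// True if the samples within a pixel disagree enough to need per-sample lighting.
static bool IsEdgePixel(const Pixel& p, float depthEps = 0.01f, float normalEps = 0.9f)
{
    const GSample& a = p.samples[0];
    for (int i = 1; i < kSamples; ++i)
    {
        const GSample& b = p.samples[i];
        const float nDot = a.nx * b.nx + a.ny * b.ny + a.nz * b.nz;
        if (std::fabs(a.depth - b.depth) > depthEps || nDot < normalEps)
            return true;
    }
    return false;
}

Color ShadePixel(const Pixel& p)
{
    if (!IsEdgePixel(p))
        return ShadeSample(p.samples[0]);        // interior pixel: light once
    Color sum{0, 0, 0};                          // edge pixel: light every sample
    for (const GSample& s : p.samples)
    {
        const Color c = ShadeSample(s);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    return { sum.r / kSamples, sum.g / kSamples, sum.b / kSamples };
}
```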
 
So if the PS3 version of BF3 doesn't end up looking and/or performing better than the 360 version, I guess there will be a lot of awkward silences in this thread.

Nah... it means we have more details to talk about other than hand waving arguments like 'lazy someone'.

EDIT: There are still open issues like memory, streaming system, content, workflow, schedule, etc.
 
Seeing as games rarely go without using the SPUs for graphical tasks, and the vast majority of SPU optimization has been for the benefit of graphics, I'm not so sure anyone can say this.

??? Cell was meant to perform graphics work as well as other important tasks. What's there to deny?

Who's to say how much better physics, AI, etc. would have been if the PS3 had had a better GPU? Maybe we'll see this happen with the PS4 if Sony uses Cell again.

There are always better physics, AI and graphics left to the imagination. There is a fixed budget, after all. The 360 has a better GPU; does/can it have better physics and AI? Can its graphics be better?

Not sure why Cell is special. Next gen, Sony will use whatever makes sense for them, Cell or no Cell. But by then developers will have more experience with CPU + GPU combo work.

I didn't know Naughty Dog made a game on the 360, what's it called? :p

Ask them. They made the comment in public. The two GPUs' cores and programming models may be more specialized or limited compared to a CPU's.
 
Of course it can. MLAA is a fully screenspace solution; it doesn't really have any knowledge of polygon edges. As long as a pattern matches, it can help with shadow or shader aliasing. As I said, most importantly it can be improved over time.

How does/can MLAA deal with shader aliasing? Or rather, what do you mean by "MLAA can help address shader aliasing"?
 
I think Joker is saying that, had Sony gone with a more traditional CPU + GPU combo like the 360's, the PS3 might have been consistently capable of 2x or 4x MSAA, not that the 360 can do it now. I think he's suggesting that with the huge investment made in Cell, the PS3 could have ended up with a setup more capable than the 360's. Could be wrong, but that's how I read it.
 
Taking into account the overall transistor count of PS3, plus the money and time spent on developing Cell, they could quite easily have made a console featuring Xenon, plus a more efficient version of Xenos sporting at least 64 shaders (as opposed to the 48 in Xenos).

Does anyone really think such a console wouldn't have performed better than the current PS3 across the board?
 
I would actually like to know what the combined performance of RSX and Cell shading in Frostbite 2 is so far.
 
Taking into account the overall transistor count of PS3, plus the money and time spent on developing Cell, they could quite easily have made a console featuring Xenon, plus a more efficient version of Xenos sporting at least 64 shaders (as opposed to the 48 in Xenos).

Does anyone really think such a console wouldn't have performed better than the current PS3 across the board?

Across the board? I don't think so. In graphics related tasks? All day, every day.
 
Across the board? I don't think so. In graphics related tasks? All day, every day.

I'd be curious to see how much of that SPU power is being used for anything other than supporting GPU functions. I'm sure there are a lot of things running on the SPUs; I just really doubt it's significant relative to the graphics work. The general argument here is: if the SPUs are for the most part being used to accelerate graphics-related tasks, why didn't they just go with a better GPU and a lighter CPU?
 
I'd be curious to see how much of that SPU power is being used for anything other than supporting GPU functions. I'm sure there are a lot of things running on the SPUs; I just really doubt it's significant relative to the graphics work. The general argument here is: if the SPUs are for the most part being used to accelerate graphics-related tasks, why didn't they just go with a better GPU and a lighter CPU?

I don't think the amount of SPU power being used for graphics actually matters. What matters is non-graphics related SPU time. It's about how much SPU time you need for advanced A.I., audio, etc. The KZ3 screencaps give insight into those things.

It seems they didn't just go with a better GPU because that would have been a less flexible rendering solution. I remember seeing something about Sony wanting to create the most flexible rendering system for that time. I think this was around 2006 or 2007. I think they succeeded in that regard.
 
Yet, pretty much all graphically acclaimed games on PS3 came with MSAA/QAA until GoW3.
What about 360 games? Gears? Halo *? There is Alan Wake, if you can count it at that resolution.
edit: Oh yeah, I forgot about Crysis 2 not using MSAA either.

And yes, MLAA can be better than MSAA (definitely better than 2x). Think shader aliasing, post-resolve tone mapping, etc.
It's also programmable, and thus improvable, unlike MSAA.
I think once again, you are just trying too hard.

Count how many games can't use MSAA due to render-engine incompatibility or whatever; one hand would probably have enough fingers to count them. Now count how many games can use MSAA; you'll need a lot of hands to count them. All of those dozens of games were unable to use MSAA due to the design choices made by Sony, specifically to use old GPU hardware that required lots of memory and provided slow MSAA performance, and instead go with a heavily customized CPU. Which was my point all along: their decisions have consequences, and their decision to ignore the GPU and focus on the SPUs meant years of games on that platform missed out on having any form of anti-aliasing, or had to settle for image-quality-destructive QAA. I don't see how anyone can view this as a positive design choice.


It depends on what you prefer. Some people prefer to have all the improvements at the beginning and see no real improvement later on.

If you buy a new sports car, would you prefer that it did 0 to 60 in 10 seconds for the first five years, and then finally did it in 5 seconds at year five? I mean seriously, who doesn't want their hardware to perform great from the get-go? Certainly not the people buying the product to begin with, especially at console launch prices. Certainly not the studios putting out the games, as they don't want to look bad. And certainly not shareholders, as they want to know that a competent product is out in the marketplace now, and not have to hear hollow promises. Plus, both platforms have shown consistent quality improvements over the years; look at standouts like RDR or Castlevania and compare them to launch games. PS3 appears to have more change in quality over the years, but that's only because its first years of titles looked and performed so badly.


Then, most like to have continued improvements shown throughout the console's lifecycle.

Both have shown improvements, but given that the most impressive titles out are multi-platform (to me), it shows that the CPU gamble didn't pay off at all.


There was an article about the three phases of coding on the PS3. I believe Mike Acton was speaking on this. The first phase was having everything, or almost everything, on the PPU, with nothing to very little on the SPUs. The second phase was moderate usage of the SPUs and some offloading of code from the PPU. The third phase was light to no PPU usage for game code and heavy use of the SPUs. Sony's 1st parties are on the third phase now. If you aren't on phase 3 or close to it at this point, it's legacy thinking to me.

That's totally true...except everyone hit phase three some time ago.


ND said the hardest part of taking full advantage of the Cell was to "keep all the plates spinning" (UC2 interview). Do you think any 3rd party devs have reached that point, yet? Until I see a Cell usage chart with associated jobs, I can't even say DICE is there. However, from their presentation, I applaud what they have done so far.

An SPU usage chart, as odd as it may sound, will give you precious little info as to whether or not the SPUs are being used effectively at all. I could show you a 50% usage chart that is far more impressive than a 100% usage chart. Additionally, a 100% usage chart doesn't mean 100% usage of the SPU, because they are dual-issue. So you could have a chart that shows 40% SPU usage and makes full use of dual issue, and hence is actually doing more work than a 70% usage chart that is making no use of dual issue. Finally, it's pretty darned easy to put out a chart that shows 100% SPU use, really. Percentage numbers mean almost nothing when comparing different games.
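
Rough numbers to illustrate the dual-issue point (purely made-up figures, not from any real capture):

```cpp
#include <cstdio>

// Effective SPU throughput in instructions per cycle, given the fraction of time the
// SPU is busy and how often it manages to dual-issue while busy (both 0..1).
static double EffectiveIPC(double occupancy, double dualIssueRate)
{
    return occupancy * (1.0 + dualIssueRate);    // 1 instr/cycle, 2 when dual-issuing
}

int main()
{
    // Hypothetical charts: 40% "usage" with full pairing vs 70% with none.
    std::printf("40%% busy, full dual-issue: %.2f IPC\n", EffectiveIPC(0.40, 1.00));
    std::printf("70%% busy, no dual-issue:   %.2f IPC\n", EffectiveIPC(0.70, 0.00));
    return 0;
}
```

The first case retires more work (0.80 vs 0.70 instructions per cycle) despite the lower-looking bar, which is the point about percentages meaning little on their own.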


Personally, I don't understand all the "PS3 not pulling ahead of the 360 graphically" talk. I and most others seem to believe this happened some time ago. Most seem to believe the PS3 has pulled ahead in a number of categories (graphics, audio, A.I., and scale). I understand that you and some others don't subscribe to that, though.

You and I know this is infinitely debatable. Any "most others believe" statement can be countered with someone else's "most others believe" statement that says the complete opposite, and it never gets anywhere.


So you don't care for the better physics and audio the Cell affords games like UC2, Killzone 2 and 3? You wouldn't care for the additional space, on Blu-ray, that makes these games easier and better for the end user to experience (fewer discs, no loading screens, better-quality audio, etc.)? Would you have taken the HDD out as a standard option, for more RAM, as well? Of course, that also takes away a company's incentive to subsidize the console as much. That probably means far less of a budget to work with for the design.

I definitely would have ditched Blu-ray, but possibly kept the HDD standard. Standard CPU, killer customized GPU, more RAM, awesome dev tools from day one. That formula would have won them this gen easily. Load times would have been faster, since if they went with DVD then full game installs would likely have been supported; games would have looked better thanks to more RAM; parity would have been achieved from day one, etc... It would have been a win all around, and we wouldn't even be having this conversation, because right about now some studios would be dropping 360 support due to it being a distant third, as its lack of memory would have prevented it from being able to visually keep pace, and day-one parity along with a < $599 launch price would have ensured another round of PlayStation dominance.

FYI, Cell doesn't afford better physics, AI, etc.; it hasn't been a standout in that regard. No games have been AI standouts really; even games like KZ3 and Reach constantly have AI that does dumb things. They mostly all use Havok for physics, which runs on every platform on this planet and the next, so there is no physics advantage either. Remember as well that those SPU charts you see with physics, AI, etc. on them don't mean it's doing more work. There are three cores in the 360 and one in the PS3, so there are two cores' worth of general CPU load that have to be moved to the SPUs no matter what, just to maintain parity. Seeing those types of loads on an SPU chart doesn't mean it has an edge in that category; every game *has* to do that stuff on SPU just to maintain CPU parity. I.e., they have had to be at phase three for a while now.


Flexibility means you can choose what you wish to put your resources into. It's just like some games choosing to render at a lower resolution than 720p. The whole "they have to, just to keep up with the graphics" part doesn't add up. It's highly unlikely they would try to improve these other areas. The proof is in 360 multiplatform games. Most of those have the 360 ahead in the graphics department, but there is zero improvement in any other areas. If your theory held water, there would be some advancement in "all other applications" on the 360. After all, they are supposed to be twiddling their thumbs while the PS3 version is struggling to meet parity, right? :)

They still aren't at parity just yet; standouts like RDR and Assassin's Creed show that quite well, although it has been getting better in this sixth year of the console gen. Which was my point all along: the CPU gamble failed in that it handicapped them for most of the gen, and provided no tangible improvement even years into the gen. Years later they are still figuring out clever ways to shift loads to the SPUs, and there still is no game that is head and shoulders above multi-platform games like RDR, Castlevania, etc. The data out there couldn't be more clear to me: even if most people choose to ignore multi-platform games, their design choice to focus on the CPU was a failure.
 
...and look how it all turned out. Quality content from day one, tool and dev support from day one...
You're not understanding the perspective of my 'argument' at all.

This type of comment always comes up, and I'll always ask the same thing: what makes people think that in year 6, games are still being done with legacy thinking on the PS3?
Because reinventing the wheel is a rare thing that happens once in a while, and there's a lot of legacy code and structure in place. UE is built on a framework, a way of thinking, and a set of tools, which means it can't be "tweaked" to completely change the renderer. That sort of new-wave thinking needs ground-up designs. Like the SaarCOR processor back in its day. "Could raytracing be accelerated?" You couldn't test that without building an actual processor, and you'd have to create a games engine to run on it to evaluate it.

This has nothing to do with lazy devs or "the magic of the Cell processor" or that crap, and I don't understand why you are bringing that up, as if you cannot discuss technology on its own merits. :???: Screw 360 and PS3 and PCs, and let's just look at how we get silicon to make pretty pictures (hence why this topic doesn't belong in the console tech thread, but as I said, I'm not entering into it now as it deserves its own proper thread in the processor architectures forum. And as this thread proves, the topic's not ready for a proper discussion).

If you want to discuss that just for academic reasons then sure, but it's totally unrealistic.

In my mind that is the easiest question in the world to answer, made even easier by the lab test known as the PS3. Put all your money into GPU, RAM and tools and you win in both cases, both in a console war and in graphical return.
Which highlights exactly that you aren't understanding the very idea I want to explore: what the hell should a GPU be made like?! Should it be thousands of fixed-function, dedicated pipelines? Or hundreds of flexible, programmable pipelines? Or dozens of CPU-like cores that have to use a different mindset to render far more efficiently? We don't often get the chance to evaluate the latter, and are left looking at titchy pieces of info and theorising. But sadly few people are capable of just having a theoretical discussion.

Does anyone really think such a console wouldn't have performed better than the current PS3 across the board?
"Should PS3 have launched with Cell, or a conventional CPU and better GPU?" is an old recurring topic that was never meant for this thread, but in this forum everything comes down to same old things by minds that think aligned to a specific brand or other.
 
Well, fixed function is obviously, slowly, on the way out to some extent. All GPUs are moving toward unified shaders and programmability. You have CUDA, OpenCL and compute shaders on the PC side. DX11 allows compute shader threads to be grouped and share information.

I'm not knowledgeable about GPU hardware, really, but I'd be interested in sort of a bird's-eye comparison of an SPE vs. a compute shader or other GPGPU implementation. I'm assuming the GPGPU APIs would lack some of the flexibility of the instruction set of a SIMD processor. That might be a good starting point for the conversation.

I guess something like AMD's Fusion is also very similar to a console implementation, where the CPU and GPU are closely tied and the line between the two is blurring. The point of the SPE and GPGPU is mostly the same: fast processing of parallel data. The solution ends up being, in theory, the same from a very high level. The PPU in Cell is more like a traditional CPU, the SPEs are SIMD, and RSX handles the GPU function. In DirectX 11 you have a CPU, and instead of general-purpose SIMD processors you get similar functionality from the GPU side. So, in my mind, the PS3 and DirectX 11 are both heading in the same direction from opposite ends of the spectrum, I suppose.
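
For the bird's-eye comparison asked for above, here's a loose sketch of the common SPE job shape, with comments noting rough DX11 compute-shader analogues. The DmaGet/DmaPut helpers are hypothetical stand-ins (plain memcpy here) for the real MFC DMA operations, so this is only a shape comparison, not actual SPU code.

```cpp
#include <cstddef>
#include <cstring>

// Hypothetical stand-ins for the real MFC DMA operations, named for illustration
// only; here they just memcpy so the sketch builds on any host compiler.
static void DmaGet(void* localStore, const void* mainMem, std::size_t bytes)
{
    std::memcpy(localStore, mainMem, bytes);
}
static void DmaPut(void* mainMem, const void* localStore, std::size_t bytes)
{
    std::memcpy(mainMem, localStore, bytes);
}

constexpr std::size_t kChunk = 4096;      // elements per job, sized to fit local store

// One SPE-style job: pull a chunk into "local store", crunch it, push results back.
void SpeJob(const float* src, float* dst)
{
    static float in[kChunk];              // local store buffer   ~ groupshared memory
    static float out[kChunk];

    DmaGet(in, src, sizeof(in));          // explicit fetch        ~ buffer loads
    for (std::size_t i = 0; i < kChunk; ++i)
        out[i] = in[i] * in[i];           // the actual math       ~ per-thread ALU work
    DmaPut(dst, out, sizeof(out));        // explicit write-back   ~ UAV stores
}
```

The practical differences are mostly about who manages the data movement (the programmer, explicitly, on the SPE; the hardware and driver on a GPU) and that the SPE can run arbitrary scalar, branchy code between the transfers, which is roughly the flexibility gap being asked about.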
 