Cost of MSAA on either the 360 or PS3 GPU is roughly the same; they just get there via different methods.
360 costs are in the tiling department.
PS3 costs are in the ROP unit.
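To put rough numbers on the 360 side: Xenos has 10 MB of eDRAM, which isn't enough for a full 720p 4xMSAA framebuffer, so the frame gets split into tiles and the geometry resubmitted per tile. A back-of-envelope sketch (the 4 bytes of colour and 4 bytes of depth/stencil per sample are my assumptions for a typical setup, not something from this thread):

[code]
#include <cstdio>

int main() {
    // Back-of-envelope: 720p with 4xMSAA against Xenos' 10 MB eDRAM.
    // Assumes 4 bytes colour + 4 bytes depth/stencil per sample.
    const long long width = 1280, height = 720, samples = 4;
    const long long bytesPerSample = 4 + 4;           // colour + depth/stencil
    const long long fbBytes = width * height * samples * bytesPerSample;
    const long long edramBytes = 10LL * 1024 * 1024;  // Xenos eDRAM

    // Each tile's slice of the framebuffer must fit in eDRAM, so the
    // geometry has to be resubmitted (or predicated) once per tile.
    const long long tiles = (fbBytes + edramBytes - 1) / edramBytes;
    std::printf("framebuffer: %.1f MB -> %lld tiles\n",
                fbBytes / (1024.0 * 1024.0), tiles);
}
[/code]

That works out to about 28 MB and three tiles, i.e. up to three passes over the scene geometry: the tiling cost. RSX skips the tiling but pays per pixel in ROP throughput and bandwidth instead.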
So if the PS3 version of BF3 doesn't end up looking and/or performing better than the 360 version, I guess there will be a lot of awkward silences in this thread.
On this forum, I highly doubt that. :|
"I don't think the amount of SPU power being used for graphics actually matters. What matters is non-graphics related SPU time. It's about how much SPU time you need for advanced A.I., audio, etc. The KZ3 screencaps give insight into those things."

I think you are wrong. I haven't seen anything special about A.I. in PS3 exclusive games. Yes, KZ2/3 have good A.I., but it's nothing that hasn't been done before. UC2, GOW III? Nope... like any other game.
It seems they didn't just go with a better GPU because that would have been a less flexible rendering solution. I remember seeing something about Sony wanting to create the most flexible rendering system for that time. I think this was around 2006 or 2007. I think they succeeded in that regard.
I agree PS3 exclusives have great audio, but there are also a couple of other games that excel at it, for example BFBC2: the best sound I have heard as far as this gen goes.
Off topic. I think you are wrong. I haven't seen anything special about A.I. in PS3 exclusive games. Yes, KZ2/3 have good A.I., but it's nothing that hasn't been done before. UC2, GOW III? Nope... like any other game.
Off topic. Not only great sound but also very good and advanced HDR sound effects.
Off topic. Lack of processing power is the last thing I'd blame for poor A.I. Seeing how single-player campaigns longer than five hours are relatively rare in newer FPSes, it's rather logical not to put too much effort into improving the A.I. there.
Except for texturing. (I heard it was really important!)
"Cost of MSAA on either the 360 or PS3 GPU is roughly the same; they just get there via different methods. 360 costs are in the tiling department. PS3 costs are in the ROP unit."

I mentioned in my other post that the different phases of code execution on Cell came from Mike Acton. It was actually you, though, back in 2006. I just wanted to give you your due credit. Sorry for the mix-up.
"If it's the opposite results, FXIII and NGS 2 drama will probably be replayed."

CE3 and Crysis 2 thread drama will be replayed as well. A deja-vu sensation will be present.
Well, I guess shaders are becoming more and more programmable, while texture units and ROPs are still fixed function?
"Not at this stage, with these reputable developers with state-of-the-art, cutting-edge tech and engines. Things have progressed enough, and the developers have public tech papers released to show how they are pushing it to the limit regarding the HW. They are using methods to keep Cell working hard to assist the GPU in every possible way while running non-graphics code too."

What developers have published public tech papers showing they are "pushing it to the limit" regarding the HW? I haven't seen these tech papers show the amount and types of jobs being run on Cell and the times associated with them. The closest I've seen to that has been from DICE, and that's still a far cry from what I mentioned a sentence ago. We are just getting thinly sliced pieces of cake (cake = information). I would also like to see more about devs taking advantage, or not taking advantage, of the benefits parallelism affords (DICE handled that part nicely, IMO).
Because reinventing the wheel is a rare thing that happens once in a while, and there's a lot of legacy code and structures in place.
Screw 360 and PS3 and PCs, and let's just look at how we get silicon to make pretty pictures
Which highlights exactly that you aren't understanding the very idea I was wanting to explore: what the hell should a GPU be made like?! Should it be thousands of fixed-function, dedicated pipelines? Or hundreds of flexible, programmable pipelines? Or dozens of CPU-like cores that have to use a different mindset to render far more efficiently?
"I think you are wrong. I haven't seen anything special about A.I. in PS3 exclusive games. Yes, KZ2/3 have good A.I., but it's nothing that hasn't been done before. UC2, GOW III? Nope... like any other game."

What games do you have in mind?
Count how many games can't use MSAA due to render-engine incompatibility or whatever; one hand would probably have enough fingers to count them. Now count how many games couldn't use MSAA for hardware reasons; you'll need a lot of hands to count them. All of those dozens of games were unable to use MSAA due to the design choices made by Sony, specifically to use old GPU hardware that required lots of memory and provided slow MSAA performance, and instead go with a heavily customized CPU. Which was my point all along: their decisions have consequences, and their decision to ignore the GPU and focus on the SPUs meant years of games on that platform missed out on having any form of anti-aliasing, or had to settle for image-quality-destructive QAA. I don't see how anyone can view this as a positive design choice.
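For anyone wondering why QAA gets called image-quality destructive: quincunx stores two samples per pixel but resolves with a five-tap kernel that borrows samples from neighbouring pixels, so the "anti-aliasing" is partly just a blur. A minimal sketch of that resolve on a grayscale buffer (the layout and names are mine; the 1/2 and 1/8 weights are the standard quincunx filter):

[code]
#include <algorithm>
#include <vector>

// Sketch of a quincunx (QAA) resolve. Each pixel stores two samples:
// one at its centre and one at its top-left corner; the corner samples
// are shared between four neighbouring pixels. The 5-tap resolve
// (1/2 centre, 1/8 per corner) is what smears detail across pixel
// boundaries: the QAA blur.
float resolveQuincunx(const std::vector<float>& centre,
                      const std::vector<float>& corner,
                      int w, int h, int x, int y) {
    auto c = [&](int cx, int cy) {
        cx = std::clamp(cx, 0, w - 1);  // clamp at the screen edge
        cy = std::clamp(cy, 0, h - 1);
        return corner[cy * w + cx];
    };
    return 0.5f   * centre[y * w + x]
         + 0.125f * (c(x, y) + c(x + 1, y) + c(x, y + 1) + c(x + 1, y + 1));
}
[/code]

Because a quarter of each pixel's weight comes from samples it shares with its neighbours, edges get smoothed but texture detail and text get blurred too, whereas a proper MSAA resolve only averages samples inside the pixel.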
"For the record, there is little legacy code out there now. Neither the 360 nor the PS3 allows it for optimal results; everything had to be re-thought and re-written to accommodate multi-core, VMX/SPU, and load-hit-store. If you just take legacy C code and run it on a console it will run like ass, and you would not be getting the games you see today."

Not legacy code but legacy thinking and tools. Intel had an idea for a new GPU, but they had to make it run DirectX for business reasons or it would get nowhere, and yet it wasn't ideally suited to DirectX and so would never compete with DX GPUs. Thus it has no business and no market. But it could power different engines, like... I dunno, a CSG raytracer, more effectively than a GPU, and if everyone had one to develop for, I'm sure the ways of thinking would reveal whole new paradigms.
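Since load-hit-store keeps coming up, it's worth a concrete example of the kind of pattern that had to be rewritten. This is a made-up minimal illustration, not from any shipped codebase:

[code]
// Minimal illustration of load-hit-store (LHS). On the in-order PPC
// cores in both consoles, a load that hits a store still in flight
// stalls the pipeline for tens of cycles.

// Bad: '*count' is stored every iteration and immediately re-loaded
// on the next one, eating an LHS stall per element.
void countPositivesBad(const float* v, int n, int* count) {
    for (int i = 0; i < n; ++i)
        if (v[i] > 0.0f) ++*count;
}

// Better: accumulate in a register-resident local; store once.
void countPositivesGood(const float* v, int n, int* count) {
    int local = 0;
    for (int i = 0; i < n; ++i)
        if (v[i] > 0.0f) ++local;
    *count = local;
}
[/code]

A PC-minded compiler-will-fix-it attitude is exactly the legacy thinking being talked about; on these cores the fix had to be made in the source.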
Exactly. You absolutely *need* to look at existing platforms because they are the real-world test cases for many people's questions... With that in mind, look at the PS3 again, but *not* as a console. Instead, treat the entire box as a GPU...
"The 'PS3 GPU' came out years ago and devs had years to play with it and do anything they wanted with it."

Years where the remit wasn't "go do whatever you can with this new toy" but "go make a commercially successful game that's going to work with the assets we can produce with our current toolchains." And most of those decisions would have to work on multiple platforms. Only a few developers, and MS/Sony with their own technology groups, had that luxury, and I imagine they always have a target in mind that means they aren't truly free to 'play'.
"Compute shaders have answered that: general purpose, but not 100% general like in a typical CPU, and still mixed with some fixed-function hardware. That's where it's all going."

And this is the interesting bit! Compute shaders are allowing rendering efficiencies. These compute shaders are new to PS3, right? The GPU can't do it, but DICE for one are using Cell as a compute shader engine. Now, unless when DICE did their GDC presentation most of the audience went, "So what? We've been doing this sort of thing for a couple of years now" (and maybe that happened, as I don't know the state of console development), this is a new way of thinking. This is a way of thinking born out of GPU evolution, which was born out of a gradual analysis of workloads and a development of progressive solutions. Split work into tasks and create shaders. Unify shaders for efficiency. Add compute shaders for flexibility. But no one back in 2000 was designing 2011-class GPUs, not because of a lack of funding, but because thought patterns hadn't got that far.
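To make the "Cell as compute shader engine" idea concrete, here's a rough portable analogy of that job model, with std::thread standing in for SPUs and a worker-local buffer standing in for the 256 KB Local Store. All names are illustrative; this is not DICE's actual code:

[code]
#include <algorithm>
#include <atomic>
#include <cstdint>
#include <thread>
#include <vector>

constexpr int kTile = 64;  // tile edge in pixels

// Placeholder "shader" work done entirely in local memory.
void shadeTile(std::vector<std::uint32_t>& local) {
    for (auto& px : local) px |= 0xFF000000u;
}

// Split the frame into tiles; a small worker pool pulls each tile into
// its own local buffer ("DMA in"), shades it, and writes it back
// ("DMA out"), the way SPU jobs stream through the Local Store.
void renderWithWorkers(std::vector<std::uint32_t>& frame, int w, int h,
                       int workers = 6) {  // PS3 games had six SPUs available
    const int tilesX = (w + kTile - 1) / kTile;
    const int tilesY = (h + kTile - 1) / kTile;
    std::atomic<int> next{0};

    auto job = [&] {
        for (int t; (t = next++) < tilesX * tilesY; ) {
            const int tx = (t % tilesX) * kTile, ty = (t / tilesX) * kTile;
            const int tw = std::min(kTile, w - tx), th = std::min(kTile, h - ty);
            std::vector<std::uint32_t> local(tw * th);  // the "Local Store"
            for (int y = 0; y < th; ++y)                // "DMA in"
                std::copy_n(&frame[(ty + y) * w + tx], tw, &local[y * tw]);
            shadeTile(local);
            for (int y = 0; y < th; ++y)                // "DMA out"
                std::copy_n(&local[y * tw], tw, &frame[(ty + y) * w + tx]);
        }
    };
    std::vector<std::thread> pool;
    for (int i = 0; i < workers; ++i) pool.emplace_back(job);
    for (auto& thr : pool) thr.join();
}
[/code]

The design point is the same one compute shaders arrived at: work is expressed as small data-parallel jobs over tiles, not as one big serial pass over the frame.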
The GPU? I don't know. There is a Super Companion Chip made by Toshiba for Cell. That one handles the AV interfaces.
As I recall, Kutaragi said eDRAM was out because they couldn't include enough of it for HD resolutions. Cell and RSX can work on multiple frames/tiles at the same time, and DMA to the Local Stores works well regardless of which source frame/tile it is.
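To expand on the DMA point: the standard SPU idiom is double buffering, where the transfer for chunk n+1 is kicked off before computing on chunk n, so DMA and compute overlap. A sketch of that loop, using the MFC intrinsics from spu_mfcio.h as I remember them (this is SPU-side code that only builds with the Cell SDK's SPU compiler, and process_chunk is a placeholder of mine):

[code]
#include <stdint.h>
#include <spu_mfcio.h>

// Double-buffered SPU streaming (sketch): start the DMA for the next
// chunk, then compute on the current one, so the Local Store never
// sits idle waiting on transfers.
#define CHUNK 16384  /* 16 KB: the largest single MFC transfer */

static char buf[2][CHUNK] __attribute__((aligned(128)));

void process_chunk(char* data, unsigned size);  /* placeholder */

void stream(uint64_t ea, uint64_t total) {
    unsigned cur = 0;
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);    /* prime buffer 0 */
    for (uint64_t off = 0; off < total; off += CHUNK) {
        unsigned nxt = cur ^ 1;
        if (off + CHUNK < total)                /* kick the next DMA early */
            mfc_get(buf[nxt], ea + off + CHUNK, CHUNK, nxt, 0, 0);
        mfc_write_tag_mask(1 << cur);           /* wait only on our tag */
        mfc_read_tag_status_all();
        process_chunk(buf[cur], CHUNK);         /* overlaps the other DMA */
        cur = nxt;
    }
}
[/code]

Because the Local Store is software-managed rather than a cache, this overlap is explicit and predictable, which is exactly why it doesn't matter which frame or tile the source data belongs to.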
I'd be curious to see how much of that SPU power is being used for anything other than supporting the GPU functions. I'm sure there are a lot of things running on the SPUs; I just really doubt they're significant relative to the graphics work. The general argument here is: if the SPUs are for the most part being used to accelerate graphics-related tasks, why didn't they just go with a better GPU and a lighter CPU?
There is one other significant thing the SPUs are doing on PS3, actually: video/audio decoding for Blu-ray playback. The PS3 doesn't have dedicated decode hardware, and it needed to handle MPEG-2, VC-1, and h.264 AVC at higher bitrates than HD DVD did, simultaneously with decoding all the audio codecs to 7.1 PCM: Dolby TrueHD, DTS-HD Master Audio, etc.
Nothing to do with games, but nonetheless something that Sony was aiming to support with their hardware solution.