Impact of eDRAM-free design on the PS3

I think someone at Sony must have been really disappointed when the PS3 was finalized without eDRAM. Not having it is a lose-lose situation because, honestly speaking, I for one can't see the advantages of not having eDRAM. [Anyone want to enlighten me on what advantages not having eDRAM brings to the table?]
 
I have to ask: is it really that hard to line up shots so they're virtually the same?

When they don't, I instantly suspect a hidden agenda.
 
I think someone at Sony must have been really disappointed when the PS3 was finalized without eDRAM. Not having it is a lose-lose situation because, honestly speaking, I for one can't see the advantages of not having eDRAM. [Anyone want to enlighten me on what advantages not having eDRAM brings to the table?]

The lack of eDRAM really hurt the PS3.

I have a genuine question: Having seen Uncharted 2 and some other titles, which are just utterly beautiful, and hearing how some people tout eDRAM as the saviour of graphics technology... Isn't it true that as long as the developers know what they're doing, having or not having it makes little difference?
Or, would Uncharted 2 look or run "better" if the PS3 had eDRAM? If so, in what way?
Please do explain.
 
I think someone at Sony must have been really disappointed when the PS3 was finalized without eDRAM. Not having it is a lose-lose situation because, honestly speaking, I for one can't see the advantages of not having eDRAM. [Anyone want to enlighten me on what advantages not having eDRAM brings to the table?]

eDRAM offers fewer options for future cost reductions. I think that's the most important reason why Sony didn't stay with the PS2 formula.
 
I bought SO4 International on PS3. Now I have both versions at home at full retail price. >_>

Here are my pics, though they are both over component. I couldn't get any signal over HDMI on my capture card with the PS3 version.


The lack of high-speed eDRAM hurts here for alpha blending. It appears they needed to change some effects to alpha-to-coverage to save bandwidth. I also suspect some of the alpha buffers are lower resolution as well. One thing to remember is that eDRAM has hurt some 360 titles too: it probably wouldn't happen as much if it weren't for the need to tile for MSAA, but some games are sub-HD because of it.
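
To put rough numbers on why alpha-to-coverage saves bandwidth (my own back-of-the-envelope, not anything from the developers): blending is a read-modify-write on the destination, while alpha-to-coverage just writes opaque samples through a coverage mask.

Code:
# Back-of-the-envelope fill bandwidth: alpha blending vs alpha-to-coverage.
# Assumptions (mine): 720p, 32-bit colour, 8 layers of particle overdraw,
# blending = destination read + write, A2C = write only. Depth traffic ignored.
W, H, BPP = 1280, 720, 4
overdraw = 8
fps = 60

frags = W * H * overdraw
blend_gbps = frags * BPP * 2 * fps / 1e9   # read-modify-write per fragment
a2c_gbps   = frags * BPP * 1 * fps / 1e9   # coverage-masked opaque write

print(f"alpha blending    : {blend_gbps:.2f} GB/s")
print(f"alpha-to-coverage : {a2c_gbps:.2f} GB/s")

That works out to roughly 3.5 GB/s versus 1.8 GB/s for this made-up workload; against RSX's shared memory budget, halving the traffic during heavy effects is nothing to sneeze at.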

It does appear that they did a much better job than Sega did with Bayonetta in this area, but they made some decisions that leave me wondering why they did what they did.
 
I have a genuine question: Having seen Uncharted 2 and some other titles, which are just utterly beautiful, and hearing how some people tout eDRAM as the saviour of graphics technology... Isn't it true that as long as the developers know what they're doing, having or not having it makes little difference?
Or, would Uncharted 2 look or run "better" if the PS3 had eDRAM? If so, in what way?
Please do explain.

I don't think the inclusion or lack of eDRAM is a major factor nearly as much as that each console has its strengths and weaknesses.

The problem is that most devs will develop on the X360 first for a variety of reasons, not least of which is that it's closest to the PC in ease of development, sharing many of the same strengths and weaknesses.

Then it's ported to the PS3, and you have to adjust things to reach either visual or performance parity, but usually not both.

That said, devs are getting better and better at designing games so that they exploit both consoles' strengths without letting their weaknesses impact the final product in noticeable ways. MW2, for example.

Still, it's going to remain a problem for smaller studios with smaller budgets. And obviously this doesn't affect first-party efforts (UC2, for example), where they can completely work around the weaknesses and maximize the strengths.

Regards,
SB
 
I have a genuine question: Having seen Uncharted 2 and some other titles, which are just utterly beautiful, and hearing how some people tout eDRAM as the saviour of graphics technology... Isn't it true that as long as the developers know what they're doing, having or not having it makes little difference?
Or, would Uncharted 2 look or run "better" if the PS3 had eDRAM? If so, in what way?
Please do explain.

Depends on the renderer type, I think. eDRAM probably wouldn't help deferred shading titles as much (still useful, since MRTs do suck up bandwidth, but less so in the context of particles) due to the need to perform a separate forward pass for alpha effects anyway, and they'd want to use lower-res alpha buffers anyway because of limited fillrate. What eDRAM essentially buys is maximum pixel throughput whilst blending and/or using MSAA. The paltry 21.6 GB/s on RSX serves framebuffers, geometry, and textures (though textures can be offloaded to XDR, IIRC). Things can get a little tight when blending on top of MSAA (or just having a ton of alpha-blended effects, as in Bayonetta's case).
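
To make that concrete, here's a rough per-frame budget sketch; the light and particle counts below are made up, purely to show the shape of the problem:

Code:
# Rough per-frame budget for a deferred renderer on a ~21.6 GB/s bus.
# All workload counts are hypothetical illustrations, not measured data.
BUS_GBPS, FPS = 21.6, 30
budget = BUS_GBPS * 1e9 / FPS            # bytes available per frame

W, H = 1280, 720
gbuf = W * H * (4 * 4 + 4)               # write 4 colour MRTs + depth once
light_reads = 10 * W * H * (4 * 4 + 4)   # say 10 full-screen lights re-read it
particles = W * H * 8 * 4 * 2            # 8 layers of blended alpha, RMW

used = gbuf + light_reads + particles
print(f"budget : {budget/1e6:6.1f} MB/frame")
print(f"used   : {used/1e6:6.1f} MB/frame ({100*used/budget:.0f}%)")

Even this toy case eats over a third of the frame budget before geometry and texture fetches (where not offloaded to XDR), which all share the same bus.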

Forward renderers can make use of the bandwidth very well (and I'm sure if you search around for sebbbi's posts on his work with Trials HD, you'll find better explanations :p :)).

The 360's eDRAM has also been a limitation of sorts: even fitting multiple render targets at 720p, or plain 720p with MSAA, forces tiling, and plenty of devs have been looking for ways to avoid that (lower-res framebuffers, or splitting the work into multiple passes). Going beyond two tiles is very rare, at least.
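
For reference, here's why tiling kicks in at all (simple arithmetic, assuming 32-bit colour plus 32-bit depth/stencil per sample against the 360's 10 MB of eDRAM):

Code:
# How a single 720p render target maps onto the 360's 10 MB of eDRAM,
# assuming 32-bit colour + 32-bit depth/stencil per sample.
import math

EDRAM = 10 * 1024 * 1024
W, H = 1280, 720
for msaa in (1, 2, 4):
    size = W * H * msaa * (4 + 4)
    print(f"{msaa}xMSAA: {size / 2**20:5.1f} MB -> {math.ceil(size / EDRAM)} tile(s)")

So 720p with no AA just fits (7.0 MB), 2xMSAA already forces two tiles, 4xMSAA forces three, and MRTs multiply the colour portion on top of that.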
 
I remember someone here mentioning that tiling has a relatively low performance hit. If that's the case, why are devs so hell-bent on trying to avoid it?

Also, when lacking eDRAM, is there really no way around cases like Bayonetta's [apart from cutting back the alpha effects]?
 
I remember someone here mentioning that tiling has a relatively low performance hit. If that's the case, why are devs so hell-bent on trying to avoid it?

Also, when lacking eDRAM, is there really no way around cases like Bayonetta's [apart from cutting back the alpha effects]?

Geometry cost was mentioned as roughly 1.2x on average with two tiles. I believe that also applies when setting up shadow maps... Things are trickier with multiple render targets, especially if you're aiming for HD and MSAA, since framebuffer memory can skyrocket. Consider for a moment that a full 720p 2xMSAA G-buffer (4 MRTs + depth) takes 36MB...
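
For anyone who wants to check that figure (assuming four 32-bit MRTs plus a 32-bit depth buffer at the 2xMSAA sample rate):

Code:
# Sanity check of the ~36 MB G-buffer figure quoted above.
W, H, msaa = 1280, 720, 2
bytes_per_sample = 4 * 4 + 4      # four 32-bit colour MRTs + 32-bit depth
print(f"{W * H * msaa * bytes_per_sample / 1e6:.1f} MB")   # ~36.9 MB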

Aside from just having more bandwidth? :LOL: Bayonetta may have had other issues on the CPU side too; it was a rather quick port, and code written for Xenon isn't necessarily well suited to running on the SPEs. Some of the framerate dips I saw don't make sense from a blending/bandwidth point of view.
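
On the "is there no way around it" question: the usual trick (mentioned above with the lower-res alpha buffers) is to render the particles into a half- or quarter-resolution offscreen buffer and composite the result, trading edge quality for a large cut in blend traffic. Roughly:

Code:
# Bandwidth saved by rendering blended particles at reduced resolution
# (generic technique and made-up numbers, not a claim about any specific title).
W, H, BPP, overdraw = 1280, 720, 4, 8
for scale in (1.0, 0.5, 0.25):
    w, h = int(W * scale), int(H * scale)
    rmw = w * h * overdraw * BPP * 2               # read-modify-write per layer
    composite = 0 if scale == 1.0 else W * H * BPP * 2   # final full-res blend
    print(f"{scale:4.2f}x res: {(rmw + composite) / 1e6:6.1f} MB/frame")

The downside is blocky edges around the particles, which is why this tends to get paired with depth-aware upsampling.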
 
I really never understood the problem with Sigma 2... I'm playing this game at the moment, and I feel they didn't really need to remove the blood and splatter completely. Have a look at KZ2: there's probably more blood and particles going on there. Yes, it's at a lower resolution and half the framerate, but couldn't Tecmo at least do something similar? Even "pints" of blood would be better than blue hazy gases. :) I guess they really needed all that bandwidth to sustain 60FPS.
 
I really never understood the problem with Sigma 2... I'm playing this game at the moment, and I feel they didn't really need to remove the blood and splatter completely. Have a look at KZ2: there's probably more blood and particles going on there. Yes, it's at a lower resolution and half the framerate, but couldn't Tecmo at least do something similar? Even "pints" of blood would be better than blue hazy gases. :) I guess they really needed all that bandwidth to sustain 60FPS.
The lack of blood is more about sales: a higher PEGI rating means a restricted audience. NGS2 even uses higher-res buffers for its special effects.
 
I really never understood the problem with Sigma 2... I'm playing this game at the moment, and I feel they didn't really need to remove the blood and splatter completely.

The original game got a Z rating in Japan, which is the equivalent of AO from the ESRB. They ran viral ads for months before NGS2's release hoping for a massive success, but the gamble never paid off. Overall it's a remixed game whose changes go beyond the visuals.

The irony is that the replacement effects (the purple haze) actually drop the frame rate noticeably when they're used en masse.
 
PC cards typically have quite a bit more bandwidth than RSX anyway.

Also, when lacking eDRAM, is there really no way around cases like Bayonetta's [apart from cutting back the alpha effects]?

Insomniac has the answer, since they used transparencies liberally in the last R&C, and the frame-rate drops weren't unlike those in games like Bayonetta (360). It held up well, actually.

I'd like to hear how they got around the proverbial bottleneck.

For the record, I remember an old Game Watch interview with SCEI's CTO where he was asked about the omission of eDRAM, and the response was that it wasn't possible alongside the dual HDMI output. I think it's fair to say the dual HDMI output was scrapped pretty late, so they couldn't revise anything. PS2 game emulation was certainly hurt the most.
 