360 eDRAM Utilization - 2010 Edition

Also, as joker454 and others have explained more than once, most people refrain from making the most out of the eDRAM, as that would jeopardize the sacred "parity".

Look, I know that you guys (for whatever reason) keep repeating your mantra in every second post or so... I am just a gamer with a pair of healthy eyes, but after seeing recent exclusives like Crackdown 2 (which parity are these devs targeting?), such comments are getting boring and sound more like a prayer than reality... just my honest opinion!

Honest question incoming (not sarcasm!!): Can you please explain to me what recent Xbox exclusives like AW and Crackdown 2 use the eDRAM for that MP devs don't do because of the sacred "parity"? Which special eDRAM tech are they using? It seems to me that those games have some serious issues due to the size limits of the eDRAM (the resolution in AW, and the Crackdown 2 dev interview at DF mentioning the vertex overhead) - so where do they gain from the eDRAM?
 
so where do they gain from the eDRAM?

Where does/did the PS3/Xbox 1 suffer due to very limited framebuffer bandwidth?

Transparencies, AA, and framerate spring to mind.

Look at all the crap a developer like Rockstar gets when they let stuff like this show. No wonder so many studios are content to play it safe.

If things like transparencies, AA and framerate don't bother you, then I understand why you can't see what the big deal is.
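A rough back-of-the-envelope sketch of why blending and MSAA chew through framebuffer bandwidth so quickly, using the commonly quoted Xenos ROP figures; the comparison numbers are the usual published bus specs, so treat all of them as approximations rather than gospel:

```cpp
// Back-of-the-envelope: worst-case colour/Z traffic generated by alpha blending
// with 4xMSAA, using the commonly quoted Xenos ROP figures. Illustration only.
#include <cstdio>

int main()
{
    const double rop_rate         = 8 * 500e6; // 8 ROPs at 500 MHz -> 4 Gpixels/s peak
    const int    samples          = 4;         // 4xMSAA
    const int    bytes_per_sample = 4 * 4;     // colour read + write (blend) + Z read + write

    const double peak_bw = rop_rate * samples * bytes_per_sample; // bytes/s
    std::printf("Peak blended 4xMSAA fill traffic: %.0f GB/s (hidden inside the eDRAM die)\n",
                peak_bw / 1e9);

    // The external buses that would otherwise have to carry that traffic:
    std::printf("360 GDDR3 bus:     22.4 GB/s (shared with CPU and textures)\n");
    std::printf("RSX GDDR3 bus:     20.8 GB/s (shared with textures)\n");
    std::printf("Original Xbox UMA:  6.4 GB/s (shared with everything)\n");
    return 0;
}
```

The point being that blending and AA multiply the per-pixel traffic, and on the 360 that traffic stays on the daughter die instead of competing with textures and the CPU for the external bus.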
 
Where does/did the PS3/Xbox 1 suffer due to very limited framebuffer bandwidth?

Transparencies, AA, and framerate spring to mind.

Look at all the crap a developer like Rockstar gets when they let stuff like this show. No wonder so many studios are content to play it safe.

If things like transparencies, AA and framerate don't bother you, then I understand why you can't see what the big deal is.

You got me wrong - I did not state in any way that the eDRAM is useless!

Here is another attempt to explain my question (sorry for my bad English):

Assen said that MP devs don't use the full potential of the eDRAM due to "parity" with the PS3... so I thought about exclusive devs who can use the full potential of the eDRAM!

My question was:
AW and Crackdown 2 were Xbox 360 exclusives, so no parity with the PS3 was needed or intended - the devs could use the full potential of the eDRAM and do their magic stuff!

=> What are these devs using the eDRAM for that other multiplatform devs cannot do because of the PS3??

I hope my question is clear now!?
 
My question was:
AW and Crackdown 2 were Xbox 360 exclusives, so no parity with the PS3 was needed or intended - the devs could use the full potential of the eDRAM and do their magic stuff!

What kind of magic are you expecting someone to point out?

How can someone point out improved performance or resolution or IQ compared to a version of a game that doesn't exist? How can someone point to a bit of Crackdown 2 and say something like "without AA being resolved on the framebuffer, that character model's arm wouldn't fit in memory!"?

Who's to say that first parties even see it as a priority to "use the full potential of the eDRAM"?

=> What are these devs using the eDRAM for that other multiplatform devs cannot do because of the PS3??

What can something with more bandwidth do compared to something with less? Would an answer such as "doing the things they do while maintaining the performance they do" satisfy you?
 
What kind of magic are you expecting someone to point out?

Huh? You guys are stating that MP devs don't use the eDRAM appropriately due to PS3 parity!! Now I ask about the stuff they use in exclusive games... and you ask me which stuff I am talking about?

Sorry, but there seems to be a weird line of argument going on here, so I will leave this topic...
 
CoD WaW: full-resolution fire/explosions on 360, low-res on PS3?

PS3: http://images.eurogamer.net/assets/articles//a/3/0/8/2/4/1/CoD5_PS3_021.jpg.jpg

360: http://images.eurogamer.net/assets/articles//a/3/0/8/2/4/1/CoD5_360_021.jpg.jpg

That's a result, isn't it?

How about Afterburner Climax, 4xAA on 360, no AA on PS3? That's probably an eDRAM-related result.

There are probably a lot of examples like this; I'm only naming what comes to mind.

How about countless multiplatform games running a little to a lot faster on 360? Sure, many third-party devs may not be pushing the hardware like they could, but in a way that's not even the point. The point is what happens in the real world, and it seems to consistently be easier for devs on a budget to get something running on 360 than on PS3. Some of that may be due to the CPU and memory layout, but I think some is due to the eDRAM just making quick-and-dirty programming a lot easier.

I'm actually sympathetic to your argument though, Billy Idol, and I've argued your side before. It's not that I don't think the eDRAM helps compared to it not being there - I've come to appreciate it a bit more. It's just that I wonder whether the cost spent on the eDRAM couldn't have been better used elsewhere. And I agree, sometimes it seems more trouble than it's worth. But I'm not a programmer, so I don't really know. I don't think blaming sub-HD resolutions on it makes sense, though, because we see just as many sub-HD games on PS3. If the eDRAM is why Alan Wake is 540p, then why is RDR 720p on 360 but lower on PS3? It seems to me that 360 at 720p with PS3 below 720p is at least as common in multiplatform games as the reverse, so you can't pin sub-HD on the eDRAM; obviously the PS3 is struggling with something too, given how many of its games are sub-720p while the 360 version is a full 720p.

In a sense, what saved MS's bacon is that ATI came through with a GPU whose non-eDRAM part, at a much smaller die size, seems to compete very well with RSX. But what if they hadn't? Or what if the eDRAM budget had been spent on making that traditional GPU bigger and trying another bandwidth solution? Would they be ahead of where they are now relative to the competition?

There's also the perennial issue of MS first party just not measuring up when judging "lacking" 360 exclusives. I think most agree that Sony's first party, at least on a few titles, has been a lot better at pushing the hardware. There isn't a whole lot of true MS first party left, and what remains, like Bungie, doesn't seem all that interested in graphical showcases - Reach looks good, but it's not incredible. And don't forget there are plenty of Sony first-party games that don't look amazing either, à la Resistance 1/2, Twisted Metal, MAG, SOCOM Confrontation/4... so I'm just saying, a first-party game not looking that good, like Crackdown 2, happens sometimes on both HD platforms.
 
I think there needs to be a logical break here from the original thread from 07, so I hereby christen the 2010 edition.

By the way, tone down the subjective stuff and focus on the technology, guys - the console tech section isn't the place for fan passions to run wild as to "PS3 iz the superiorz!"
 
^Check MW2, where the particle res is the same across both platforms.
Also, I don't think WaW or MW2 have full-res buffers on either platform.

What confuses me is why games on Xbox 360 like Forza 3 and Alan Wake (exclusives) or RDR and BFBC2 (multiplatform) use the much cheaper alpha-to-coverage for transparencies instead of regular alpha blending, even though a huge amount of bandwidth is available with the eDRAM. Another thing to note is that BFBC2 didn't use any AA either, so it's even weirder to see a game that doesn't spend any bandwidth on AA use alpha-to-coverage on a platform whose strength is being able to sustain a high fill rate thanks to the eDRAM.
 
^Check MW2, where the particle res is the same across both platforms.
Also, I don't think WaW or MW2 have full-res buffers on either platform.

What confuses me is why games on Xbox 360 like Forza 3 and Alan Wake (exclusives) or RDR and BFBC2 (multiplatform) use the much cheaper alpha-to-coverage for transparencies instead of regular alpha blending, even though a huge amount of bandwidth is available with the eDRAM. Another thing to note is that BFBC2 didn't use any AA either, so it's even weirder to see a game that doesn't spend any bandwidth on AA use alpha-to-coverage on a platform whose strength is being able to sustain a high fill rate thanks to the eDRAM.


According to repi's Twitter, the A2C in BC2 on 360 was a look that the artists liked.

It's curious to me why the 360's framebuffer bandwidth is literally 10x higher than the PS3's and yet, even with low-res effects, there are still framerate drops, e.g. MW2 :?: (not quite as big a hit as the PS3 version, but still tangible). Is there a point where bandwidth stops having an impact, or is it something else?
 
But even if you've got the bandwidth, you still need the GPU to render the particles.
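To put some rough numbers on that (all the workload figures below are assumptions for illustration, not measurements from MW2 or any other title): blending bandwidth is essentially free inside the eDRAM, but the particles still have to be shaded, and their textures still come over the shared external bus.

```cpp
// Rough sketch: even with "free" blending in eDRAM, heavy particle overdraw
// still costs pixel-shader work and texture bandwidth on the main GDDR3 bus.
// Buffer size, overdraw and texture-traffic figures are assumptions.
#include <cstdio>

int main()
{
    const double pixels   = 640.0 * 360.0; // assumed quarter-res particle buffer
    const double overdraw = 40.0;          // assumed average blended layers per pixel
    const double fps      = 60.0;

    const double shaded_per_sec = pixels * overdraw * fps;
    std::printf("Pixels shaded per second: %.2f G (ROP peak is ~4 G)\n",
                shaded_per_sec / 1e9);

    // Each shaded pixel still fetches its particle texture from main memory.
    const double tex_bytes_per_pixel = 8.0; // assumed, after texture-cache hits
    std::printf("Texture traffic: %.1f GB/s of the shared 22.4 GB/s GDDR3 bus\n",
                shaded_per_sec * tex_bytes_per_pixel / 1e9);
    return 0;
}
```

So the drops can show up in shader throughput or texture fetch long before the eDRAM's blend bandwidth becomes the limit.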
 
How about Afterburner Climax, 4xAA on 360, no AA on PS3? That's probably an eDRAM-related result.
I seriously doubt it. AB is a visually very simple game; if it lacks AA on PS3, I would expect it to be entirely due to a lack of development time and/or effort.
 
I thought those Sega Lindbergh games lack AA because of FP HDR. The board uses a GeForce 6-based graphics card, and the PS3 versions of VF5, Initial D, and AB are pretty much direct ports. The 360 versions of VF5 and AB would be FP10 + MSAA.
 
The eDRAM was worthwhile and still is.

From a manufacturing perspective it enabled them to go with two cheaper, easier-to-fabricate chips with better yields, especially in the early days, which kept overall costs lower, limited their early losses, and/or reduced the need to come in at an even higher price.

It reduces the memory consumption of MSAA (is that in the back or front buffer?), it enabled them to use a single 128-bit memory interface with one memory technology so they don't have to duplicate buses, and it meant that the current CGPU could be produced.

It also improves performance and the efficiency of their GPU.

What's not to like about it? It's quite likely the alternatives are much worse from a performance-vs-price perspective, especially given that the Xbox 360 was already too hot and noisy right from the start.
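On the MSAA memory point, a quick bit of arithmetic (assuming 32-bit colour and 32-bit Z per sample) shows both why the 10 MB forces tiling at 720p with AA and why main memory only ever sees the small resolved buffer:

```cpp
// Back-of-the-envelope sizes for 720p render targets in 10 MB of eDRAM,
// assuming 32-bit colour and 32-bit Z per sample.
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main()
{
    const double edram_mb = 10.0;
    const double w = 1280.0, h = 720.0;
    const double bytes_per_sample = 4 /*colour*/ + 4 /*Z*/;

    for (int msaa : {1, 2, 4})
    {
        double mb = w * h * bytes_per_sample * msaa / (1024.0 * 1024.0);
        int tiles = static_cast<int>(std::ceil(mb / edram_mb));
        std::printf("720p %dxMSAA: %5.1f MB in eDRAM -> %d tile(s)\n", msaa, mb, tiles);
    }

    // After the resolve, main memory only holds the plain 32-bit colour buffer.
    std::printf("Resolved 720p colour in GDDR3: %.1f MB\n", w * h * 4.0 / (1024.0 * 1024.0));
    return 0;
}
```

That's the usual 1/2/3-tile breakdown quoted for 720p, and it's why the multisampled working set never has to live in, or travel over, main memory.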
 
I thought those Sega Lindbergh games lack AA because of FP HDR.

6/7-series Nvidia GPUs don't lack the ability to use MSAA + HDR; they lack the ability to use MSAA with floating-point render targets. While the most straightforward way to implement HDR is to render to floating-point render targets, it's far from the only way. It's been standard PS3 practice for a few years now to use an encoded HDR format... it's not exactly the most difficult thing to implement.
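For what that looks like in practice, here's a minimal sketch of RGBM, one common encoded-HDR scheme (LogLuv/NAO32 is another); the range constant is an assumption, and real games do this in the pixel shader rather than on the CPU:

```cpp
// RGBM: pack an HDR colour into an ordinary 8-bit RGBA target so it can be
// multisampled/blended on hardware that can't do MSAA on FP16 targets.
// kRGBMRange is an illustrative value, not taken from any particular game.
#include <algorithm>
#include <cmath>

struct Vec3 { float r, g, b; };
struct Vec4 { float r, g, b, a; };

static const float kRGBMRange = 6.0f; // maximum representable HDR intensity

// RGB holds the colour divided by a shared multiplier; A holds that multiplier.
Vec4 EncodeRGBM(Vec3 hdr)
{
    float maxC = std::max({hdr.r, hdr.g, hdr.b, 1e-6f});
    float m = std::min(maxC / kRGBMRange, 1.0f);
    m = std::ceil(m * 255.0f) / 255.0f;        // quantise as an 8-bit alpha would
    float scale = 1.0f / (m * kRGBMRange);
    return { hdr.r * scale, hdr.g * scale, hdr.b * scale, m };
}

// Unpack when the buffer is sampled later, e.g. during tone mapping.
Vec3 DecodeRGBM(Vec4 enc)
{
    float scale = enc.a * kRGBMRange;
    return { enc.r * scale, enc.g * scale, enc.b * scale };
}
```

The trade-off is that blending and MSAA resolves operate on the encoded values, which isn't mathematically exact, but in practice the artefacts are usually considered acceptable.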
 
Yeah, that's what I meant: FP16 on PS3 and Lindbergh, so those versions of the games have no AA at all. I'm pretty sure the arcade versions I saw of HOTD4 and VF5 never had any MSAA, and the PS3 versions of those games are direct ports.
 
What confuses me is why games on Xbox 360 like Forza 3 and Alan Wake (exclusives) or RDR and BFBC2 (multiplatform) use the much cheaper alpha-to-coverage for transparencies instead of regular alpha blending, even though a huge amount of bandwidth is available with the eDRAM.

Alpha test and A2C don't require sorting objects. Sorting can be expensive or a headache if you're trying to get correct results.
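A minimal sketch of the difference in practice; the draw/state helpers below are hypothetical stand-ins rather than any real console API:

```cpp
// Why A2C saves work: classic alpha blending is order-dependent and needs a
// per-frame back-to-front sort, alpha-to-coverage doesn't care about order.
// SetAlphaBlend/SetAlphaToCoverage/DrawFoliage are hypothetical stand-ins.
#include <algorithm>
#include <vector>

struct Foliage { float viewDepth; int meshId; };

void SetAlphaBlend(bool)         {} // stub for a real render-state call
void SetAlphaToCoverage(bool)    {} // stub for a real render-state call
void DrawFoliage(const Foliage&) {} // stub for a real draw call

// Alpha blending: results depend on draw order, so sort farthest-first every frame
// (and split meshes if they interpenetrate, which is where the real headaches start).
void DrawTransparentsBlended(std::vector<Foliage>& items)
{
    std::sort(items.begin(), items.end(),
              [](const Foliage& a, const Foliage& b) { return a.viewDepth > b.viewDepth; });
    SetAlphaBlend(true);
    for (const Foliage& f : items) DrawFoliage(f);
    SetAlphaBlend(false);
}

// Alpha-to-coverage: the MSAA hardware turns alpha into a coverage mask, so the
// sort (and the mesh splitting) simply goes away; draw in whatever order is cheapest.
void DrawTransparentsA2C(const std::vector<Foliage>& items)
{
    SetAlphaToCoverage(true);
    for (const Foliage& f : items) DrawFoliage(f);
    SetAlphaToCoverage(false);
}
```

Both paths cost roughly the same ROP work per pixel; the saving is in the per-frame sorting and in not having to split geometry to get correct results, which is the point about sorting above.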
 
=> What are these devs using the eDRAM for that other multiplatform devs cannot do because of the PS3??

It's hard to say sometimes, because unless a dev is asked how they use the eDRAM, they usually don't say. And it's very, very rare that anyone interviewing a dev actually asks how they used the eDRAM. The only one I can think of is the Digital Foundry interview with Sebbi about Trials HD.

I doubt there are any devs that ignore the eDRAM, but it's hard to say to what extent they use it if they don't comment on it.

For comparison, it's easy to see things with regard to the SPUs, because Cell is such a different architecture that devs are almost constantly asked how they are using it. As well, I'm sure Sony is encouraging devs to talk about it in order to justify the development and deployment of Cell.

And at the end of the day the SPUs are just more interesting to talk about than the eDRAM module, even if there are some quite interesting things being done with it. So it's much harder to find information on how the eDRAM is used by each particular dev, especially if it's something as simple as it letting them implement more of a standard feature, faster.

Regards,
SB
 
If things like transparencies, AA and framerate don't bother you, then I understand why you can't see what the big deal is.

Frame rate relative to multiplatform games, right? Certainly not an issue with exclusives or games led on the PS3.

I think the best example of this has been Bayonetta. The game was tailored to the 360 hardware like a first-party game, and Nex simply didn't re-engineer it for the PS3. The game was transparency pron.

Of course, the original Xbox had a handicap with the CPU itself. That Bayonetta example does remind me of a similar one last gen with the MGS2 port on the Xbox. The lead programmer did state that they pushed the fill rate on the PS2, so when the game was ported, the Xbox version's frame rate took a massive dip. The tanker episode's exterior with the rainstorm essentially ran in slow motion.
 
I thought those Sega Lindbergh games lack AA because of FP HDR. The board uses a GeForce 6-based graphics card, and the PS3 versions of VF5, Initial D, and AB are pretty much direct ports. The 360 versions of VF5 and AB would be FP10 + MSAA.
Wasn't VF5 a timed exclusive on PS3, later ported to 360? :?:
 
Yeah, VF5 on PS3 and Lindbergh didn't have any AA, but it has AA on 360. Same with Afterburner: no AA on PS3 and Lindbergh, but AA on 360. Initial D was also a Lindbergh game that got ported to PS3, and neither version has any AA. I'm just saying it might be because FP16 rendering doesn't allow MSAA on G6/G7 (Lindbergh and RSX).
 