360 eDRAM Utilization - 2010 Edition

Maybe I misremember; I ask because I vaguely recall reading/hearing that the eDRAM daughter die has programmable processing as well.
There is processing logic in the eDRAM die, so it's not just vanilla eDRAM, which is why ATi called it "Smart eDRAM", but it's not programmable; it's the ROPs. It only does resolve operations, using the sample data in the eDRAM to create the final buffer written to RAM.
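For intuition, here is a minimal sketch in plain Python (purely illustrative; the function names and data layout are made up, not actual 360 code or any real API) of what a resolve operation does: the eDRAM holds several samples per pixel, and the resolve logic averages them into the single pixel value written out to RAM.

```python
# Illustrative only: model a resolve as averaging per-pixel MSAA samples.
# Names and data layout are hypothetical, not any real console API.

def resolve_pixel(samples):
    """Average one pixel's MSAA samples into a single output value."""
    return sum(samples) / len(samples)

def resolve_surface(msaa_surface):
    """msaa_surface: 2D grid where each entry is that pixel's sample list."""
    return [[resolve_pixel(pixel) for pixel in row] for row in msaa_surface]

# A 1x2 surface at 4xMSAA: the left pixel sits on a geometry edge
# (half its samples covered), the right pixel is fully covered.
surface = [[[1.0, 1.0, 0.0, 0.0], [1.0, 1.0, 1.0, 1.0]]]
print(resolve_surface(surface))  # [[0.5, 1.0]] - the edge comes out blended
```

That averaging step is all the "smart" part amounts to: it never runs arbitrary shader code, it just collapses sample data into the final buffer.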

Essential reading
 
Thank you

Simple ROPs. People were saying a lot of crazy things back then, I don't know if you remember. It was interesting how much speculation there was.

Since the eDRAM can't perform operations other than resolving and writing buffers out to RAM, what "utilization" is this thread about?

Below, someone says this:

So it is certainly possible to utilize the EDRAM for a bunch of things, including MSAA but other tricks too. What we haven't seen yet is a high profile 1st party MS game doing it. Maybe the upcoming Halo game(s) from 343 Industries will do so.

The link says post-processing but doesn't give many details beyond blurring/downsampling. So the AA hardware in the eDRAM is used on intermediate buffers to blur/downsample things like shadows and other intermediate buffers in deferred rendering?

This sounds very creative. I wonder how much speed-up this can provide. I think the person who can give a truly precise answer is on this forum.
 

It can't do post-processing itself, but it can be used to do some things that are traditionally done through post-processing.

Here's the relevant quote from the Sebbi interview linked in this thread above and many others.

The anti-aliasing hardware inside the eDRAM is one of the most important performance advantages of the platform. With the anti-aliasing hardware we could speed up our soft shadowing algorithm dramatically, and we could replace lots of usually pixel shader-heavy post-processing steps (blurring and downsampling) with cheaper alternatives. These hardware specific optimisations required down-to-the-metal code, but in the end I must say that the eDRAM hardware was a key feature in making our game run at constant 60FPS.
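A rough sketch of the kind of trick sebbi describes, in plain Python (purely illustrative; the 2x2-block mapping is an assumption about one way such a trick could work, not his actual implementation): if the high-res texels covered by each output pixel are treated as that pixel's MSAA samples, the hardware's averaging resolve performs a box-filter downsample for free, replacing a pixel-shader blur/downsample pass.

```python
# Hypothetical sketch, not real console code: an MSAA-style resolve
# standing in for a 2x2 box-filter downsample. If the four "samples" of
# each low-res pixel hold the four high-res texels it covers, the
# hardware's averaging resolve IS the downsample.

def box_downsample_via_resolve(image):
    """image: 2D list with even dimensions. Group each 2x2 block as the
    'samples' of one output pixel, then average, as a resolve would."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            samples = [image[y][x], image[y][x + 1],
                       image[y + 1][x], image[y + 1][x + 1]]
            row.append(sum(samples) / 4)  # the resolve: average the samples
        out.append(row)
    return out

hi_res = [[4, 0, 8, 8],
          [0, 0, 8, 8]]
print(box_downsample_via_resolve(hi_res))  # [[1.0, 8.0]]
```

The win is that the averaging happens in fixed-function hardware on the way out of eDRAM instead of costing a full-screen pixel-shader pass.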

Regards,
SB
 

Thank you for this clarification.

Probably they have to do a lot of planning for this to work. But a programmer is always planning! lol
 
At least we know for sure the 360 hardware is being pushed in certain games. I'm glad that can finally be put to rest. If they didn't absolutely need to code down to the metal to reach their goal, they wouldn't have done it.

I wonder how low the internal eDRAM bandwidth could be to accomplish the same tasks.
 

Yes, but don't expect coding down to the metal to be encouraged by MS. They were far more tolerant of coding to the metal on the Xbox 1, and that burned them quite heavily when it came time to implement BC on the X360: they had to pay royalties to Nvidia for emulation of any game that coded "to the metal" of the Nvidia GPU used in the Xbox 1.

For X360, MS is far less likely to blindly approve any application that attempts to bypass DirectX.

I have a feeling that titles for XBLA have the greatest flexibility. AAA titles are going to be more in the public eye if claims of backwards compatibility are made. Thus AAA titles will have to conform more strictly to DirectX and whatever other approved APIs are in use. By doing so, you make the task of BC much easier and much less reliant on gaining the approval of pertinent IP holders.

The flipside to that, of course, is that you'll always have a layer of abstraction from the HW which may limit some out of the box ways of doing things.

I'd expect the areas where MS is more tolerant of coding to the metal will be those areas where they hold the IP or have already negotiated usage rights for the IP.

Regards,
SB
 
Still curious as to what MS is doing with the UE3 license they acquired years ago, though I suppose it'd be for their ATG branch ("In the Trenches..." & "Still in the Trenches..." Gamefest audio).

They at least discuss some code optimizations there in terms of GPU quirks. The second presentation discusses a bit about the eDRAM MSAA trick and a couple of other things.
 

I agree with your reasoning about the situation, but isn't it the case that Microsoft owns all the IP in the Xbox 360? I seem to remember this being trumpeted quite loudly in 2005: that they would not have backward-compatibility problems on any future platform thanks to their control of all essential IP in the box.

Of course, this is also just more reason for them to have everybody stick to the game plan.
 
Here's a good interview:
http://www.eurogamer.net/articles/digitalfoundry-tech-interview-trials-hd

Also note how most X360 exclusive titles aren't doing anything like this... Devs either concentrate on something else (Bungie), use a mostly multiplatform engine (Epic), or don't have the budget (almost everyone else).
Or there's Forza: what kind of AA does it have on top of 60 fps and HDR, 2x or 4x? Then there's Fable, which is IMHO let down by the less inspired artwork...

Well yeah, if you haven't noticed, Microsoft doesn't really invest that much in their 1st-party IPs anymore. At least half their exclusives are from 3rd-party developers (Unreal Engine 3, other PC engines...). Hell, even Alan Wake started out as a PC title. Sony, meanwhile, invests a lot in their 1st-party games to bring out the best of the Cell processor they invested so heavily in.

Oh, and Forza 3 is full 720p at 60 FPS with HDR and 2xMSAA (4xMSAA during replays). And I believe it used A2C (alpha-to-coverage).
 

I know this is an older post, but I'm sorry, I don't really agree with this. AFAIK Microsoft had to pay royalties to Nvidia in order to emulate the GPU in the Xbox, which Nvidia still owned the rights to. It had nothing to do with how close to the metal any Xbox game was coded.

Also, I think downloaded content would be under the microscope as much as, if not more than, regular retail games when it comes to being BC on the next console. I don't know about you, but if I find out the digital content I purchased on my 360 (or PS3, etc.) doesn't work on the next console, I will surely be hesitant to spend more money on digital content next gen.

That said, I lurked around these forums for a good while before registering, and I too am curious as to what some of the devs (including Corrine) mean by the 360 not being pushed enough. I'm quite happy with how far the system has been pushed; how much more can it possibly go?
 

Had they been programmed directly against the DirectX API calls, or whatever was encouraged at the time, there would have been no IP issues, as MS owned the IP or had already negotiated usage rights to everything used.

Thus BC in this case would be a simple matter of translating those DirectX (or whatever) calls to whatever hardware was being used, similar to how games on PC don't give a rat's arse whether you have an Nvidia, ATI, Intel or whatever card, as long as it meets the DX specifications.

In the case of Xbox 1 and the Nvidia GPU, devs were coding to the metal using specific Nvidia instructions, whether for speed or to access features not available in DirectX. Any game with those calls would require them to be emulated on the ATI-designed GPU, and Nvidia controlled the IP related to that. Nvidia basically said: pay up or you can't touch those graphics instructions.

Fast forward to today, and you hear a lot about how MS isn't very tolerant of bypassing approved APIs. Unlike with the Xbox 1 GPU, MS is able to manufacture the GPU anywhere they wish. However, there is probably still IP owned by AMD (and likely licensed to MS) in the GPU, just as IBM certainly still retains control of key IP in the CPU.

No one other than the involved parties will ever know just how much IP has been licensed by MS, or even for how long. So the safest bet is to stay within the approved APIs, which MS does have full control over.

Looking over at the PS3, it is interesting to note that I don't think coding to the metal of RSX is encouraged either. Not when you have SPUs available in Cell that Sony has full rights to use however they wish, thus avoiding a situation where Nvidia might hold your ability to do BC hostage if you don't go with one of their chips for the next iteration of your console.

Regards,
SB
 
I thought MS owned full rights to Xenos, which would basically make any concerns over IP ownership pointless.

I don't think bypassing the API has as much to do with paying for rights as it does with making BC easier for the next generation.
 

No one knows exactly what rights MS has with regard to the IP involved in Xenos. The only thing we know for certain is that MS has negotiated rights to enough of it that they can manufacture the chip wherever they want.

It's almost a certainty that there is IP in the chip that ATI isn't willing to sell outright. But that doesn't preclude some form of licensing of usage rights.

Heck, ATI doesn't even own all the IP that goes into their own GPUs. For example, they have to license some IP from Rambus for their memory controllers (although it's unclear whether newer products have changed sufficiently that they may no longer have to).

Regards,
SB
 