I could not find the XDR latency topic in the search function, so

I'd like to ask (and I know this topic has been overkilled by many members in this forum):

How can RSX use XDR via Flex I/O in time for static textures per frame? Maybe I haven't worded the question correctly, but with regard to Ubisoft's (Rainbow Six: Vegas) comment about a possible downgrade in the PS3 version because the immediate RAM for the GPU is only 256 MB of GDDR3: is it highly improbable for developers to consider using XDR for textures, or do developers just not want to be hassled with it? Would it have been wiser to use a complete pool of 512 MB of GDDR3 RAM, or does this split-pool architecture reduce other bottlenecks? (yeah, but please answer my questions about textures)
 
This has been discussed before, but as I understand it..

How can RSX use XDR via Flex I/O in time for static textures per frame?

RSX can texture from XDR. There is a latency penalty relative to texturing from VRAM. Depending on where your rendering is bound, however, that extra latency may or may not be relevant (it can make little difference where you're texturing from, depending on that). There are also, I'm sure, some extra implementation issues to concern yourself with, versus doing all texturing from VRAM.
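To make the placement tradeoff concrete, here's a hypothetical sketch in C. The names and the threshold are invented for illustration (this is not the actual SDK API); it just encodes the heuristic above: keep hot, fetch-heavy textures in GDDR3 VRAM and let latency-tolerant ones live in XDR.

Code:
#define HOT_SAMPLES_PER_FRAME 100000L  /* arbitrary illustrative cutoff */

typedef enum { MEM_VRAM_GDDR3, MEM_MAIN_XDR } MemPool;

typedef struct {
    unsigned size_bytes;
    long     samples_per_frame;  /* rough measure of fetch traffic         */
    int      pass_is_alu_bound;  /* 1 if the pass using it is shader-bound */
} TextureDesc;

/* If the pass is ALU-bound, the extra XDR fetch latency is largely
 * hidden behind shader work, so XDR is an acceptable home. */
MemPool choose_pool(const TextureDesc *t)
{
    if (t->pass_is_alu_bound || t->samples_per_frame < HOT_SAMPLES_PER_FRAME)
        return MEM_MAIN_XDR;
    return MEM_VRAM_GDDR3;
}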

Would it have been wiser to use a complete pool of 512 MB of GDDR3 RAM, or does this split-pool architecture reduce other bottlenecks? (yeah, but please answer my questions about textures)

A split pool has advantages and disadvantages like any other setup. It gives both chips 'close' memory to work with, even if it's not all the memory in the system (whereas with a unified pool, one chip would likely be 'further away' from memory, in terms of latency), and it provides two buses to memory, which can reduce contention compared to one bus to one pool and has other benefits at a lower level in terms of memory access. It also allows you to use different types of memory, where that might be important. Those are some of the benefits; there are of course tradeoffs - it may be less easy to manage than a unified pool depending on your application's demands, and sharing memory from each pool between the chips takes more care, and so forth.
 
I think a lot of the RAM issues on PS3 are down to the 96 MB for the OS.

People seem to have forgotten about that, but it's the "easy" suspect if you ask me. Although I guess devs are not being clear about the issue (NDAs?).

Also, the EDRAM does result in some small RAM savings I believe, although it does not work out to 10 MB or anything 1:1 like that (could be more or less).

Still, these seem minor in importance compared to the fact that Xbox had 2x the PS2's RAM last generation.
 
I think a lot of the RAM issues on PS3 are down to the 96 MB for the OS.

People seem to have forgotten about that, but it's the "easy" suspect if you ask me.

People have forgotten about that because it was a crazy rumor that was never substantiated.

The real simple explanation for the R6:V problem is that Ubisoft is using the Unreal3 engine. An engine that is currently not great at taking advantage of the PS3 architecture.
 
RSX can texture from XDR. There is a latency penalty relative to texturing from VRAM. Depending on where your rendering is bound, however, that extra latency may or may not be relevant.
While there may be penalties related to pulling from XDR, depending on what you're trying to do you can actually end up with an overall increase in texture bandwidth when using both XDR *and* GDDR3. This can be the case when you're already saturating GDDR3 bandwidth with other texture fetch/blending/depth operations.
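To put rough numbers on that, here's a back-of-envelope in C. The peak figures are the commonly cited public specs, and the FlexIO read cap is an assumption on my part, so treat the output as illustrative rather than measured:

Code:
#include <stdio.h>

int main(void)
{
    double gddr3_gbs   = (128.0 / 8.0) * 1.3; /* 128-bit @ 1.3 GHz = 20.8 GB/s */
    double xdr_gbs     = 25.6;                /* XDR peak, shared with Cell    */
    double flexio_read = 15.0;                /* assumed RSX-reads-XDR cap     */

    /* If GDDR3 is already saturated, fetches served from XDR are extra
     * bandwidth on top, limited by the FlexIO read path. */
    double xdr_usable = xdr_gbs < flexio_read ? xdr_gbs : flexio_read;
    printf("GDDR3 alone:         %.1f GB/s\n", gddr3_gbs);
    printf("Best case with both: %.1f GB/s\n", gddr3_gbs + xdr_usable);
    return 0;
}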

Treating the machine as though it's only got 256MB of GPU addressable memory is a bad way of dealing with the available resources.

Dean
 
Also, the EDRAM does result in some small RAM savings I believe, although it does not work out to 10 MB or anything 1:1 like that (could be more or less).

The EDRAM is for the framebuffer, not for normal storage. The only reason it's there is to make up for the memory bandwidth limitations of the 128-bit GDDR3 RAM. It's not comparable at all in this case; no developer is going to store an extra 10 MB of textures in the EDRAM...

What it results in (if used with tiling) is very low-cost 4x AA, and the ability to run FP10 HDR + 4x AA without too much trouble.
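The arithmetic behind the tiling requirement, assuming the usual 4 bytes of colour plus 4 bytes of depth/stencil per sample at 720p:

Code:
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double MiB = 1024.0 * 1024.0;
    double pixels  = 1280.0 * 720.0; /* 720p                              */
    double bytes   = 4 + 4;          /* colour + depth/stencil per sample */
    double samples = 4;              /* 4x AA                             */

    double fb_mib = pixels * bytes * samples / MiB;
    printf("720p 4xAA framebuffer: %.1f MiB\n", fb_mib);          /* ~28.1 */
    printf("Tiles in 10 MiB EDRAM: %.0f\n", ceil(fb_mib / 10.0)); /* 3     */
    return 0;
}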
 
People have forgotten about that because it was a crazy rumor that was never substantiated.

The real simple explanation for the R6:V problem is that Ubisoft is using the Unreal3 engine. An engine that is currently not great at taking advantage of the PS3 architecture.

The Unreal 3 engine isn't great at taking advantage of ANY architecture, to be honest.

Edit: Well, maybe the PC architecture.
 
This has been discussed before, but as I understand it..



RSX can texture from XDR. There is a latency penalty relative to texturing from VRAM. Depending on where your rendering is bound, however, that extra latency may or may not be relevant (it can make little difference where you're texturing from, depending on that). There are also, I'm sure, some extra implementation issues to concern yourself with, versus doing all texturing from VRAM.

But doesn't the fact that XDR is at 3.2 GHz help incrementally? Oh, and the OS RAM hog issue, was it actually confirmed? Why would it cost that much RAM in the first place, anyway? How much does the 360 consume?
 
But doesn't the fact that XDR is at 3.2 GHz help incrementally? Oh, and the OS RAM hog issue, was it actually confirmed? Why would it cost that much RAM in the first place, anyway? How much does the 360 consume?
3.2 GHz is just the effective signaling rate. The DRAMs operate at 400 MHz. On the GDDR side, the 1.3 GHz is effective signaling rate, but the DRAMs operate at 325 MHz. Of course, XDR has a host of features to improve and/or hide latencies further, which is good news for the CPU. The GPU doesn't have direct access -- it has to go through a second bus (FlexIO) to get data from XDR, so it has two hops. Pretty similar to the latency difference between a CPU having to access memory through a Northbridge and having a memory controller on die, which more than cancels out any XDR latency advantage. ;)
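The clock relationships as plain arithmetic (the core clocks are from the post above; the bus widths are the commonly quoted specs, so take those as assumptions):

Code:
#include <stdio.h>

int main(void)
{
    /* XDR: 400 MHz DRAM core, octal data rate (8 transfers per clock) */
    double xdr_ghz = 0.400 * 8;               /* = 3.2 GHz effective     */
    double xdr_gbs = xdr_ghz * (64.0 / 8.0);  /* 64-bit bus -> 25.6 GB/s */

    /* GDDR3: 325 MHz DRAM core, 4 transfers per core clock */
    double gddr_ghz = 0.325 * 4;                /* = 1.3 GHz effective  */
    double gddr_gbs = gddr_ghz * (128.0 / 8.0); /* 128-bit -> 20.8 GB/s */

    printf("XDR:   %.1f GHz signaling, %.1f GB/s peak\n", xdr_ghz, xdr_gbs);
    printf("GDDR3: %.1f GHz signaling, %.1f GB/s peak\n", gddr_ghz, gddr_gbs);
    return 0;
}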

The OS RAM -- no, it was never confirmed, AFAIK. In fact, the only real developer mention I saw of an OS taking up 96 MB was a hypothetical postulation. Technically, if we were as badly off as the PSP, we'd be talking about an OS taking up 128 MB (1/4 of the physical RAM). For that matter, even all the talk about CPU time consumption has been confined to devkits, which are running a lot of ancillary tools all the time that will never be seen by consumer eyes.

360's OS, BTW, takes up 32 MB.
 
The EDRAM is for the framebuffer, not for normal storage. The only reason it's there is to make up for the memory bandwidth limitations of the 128-bit GDDR3 RAM. It's not comparable at all in this case; no developer is going to store an extra 10 MB of textures in the EDRAM...

What it results in (if used with tiling) is very low-cost 4x AA, and the ability to run FP10 HDR + 4x AA without too much trouble.


Yes, but I believe this has been batted around a few times, and it saves you some memory somehow, via a bunch of stuff that went over my head (various buffers and all that). It's just not treatable as 10 MB of extra RAM per se.

I thought the 96 MB PS3 OS thing had been essentially confirmed back at the time, with the caveat that it might be changed later. I never heard anything either way after that, so I assumed it was still on.
 
The last I heard, the PS3 OS only utilizes 48 MB of RAM. The 96 MB of RAM was in the devkits.
That, too, is speculation based on mention of the PSP OS's double state. Namely, in the PSP devkits you have twice the RAM of the retail unit, but the reserved memory space of the OS is taken up in both the lower and upper 32 MB, so you end up with twice the occupied RAM on the devkit (when running in 64 MB mode, anyway). And this isn't really new for Sony platforms with the PSP, AFAICR, but the OS never really ate up 1/4 of your physical RAM before.

In any case, that's not a guarantee that history will repeat itself yet again, but it's also safe to assume that there's more going on on the devkits in general than the retail PS3.
 
ShootMyMonkey said:
Technically, if we were as bad off as PSP, we'd be talking about an OS taking up 128 MB (1/4 of the physical RAM)
You could also add an extra ~16 MB (for eDRAM), and let's not forget at least 4 SPEs (half the CPU ;)) :p
 
My question is: why are Ubisoft and Starbreeze claiming they have more memory for textures on Xbox 360 than they do with PlayStation 3? Is the XMB hogging THAT MUCH memory? Or is it because Ubisoft can't take advantage of the separate system RAM design that Sony chose to go with, since they're porting from Xbox 360?
 
My question is: why are Ubisoft and Starbreeze claiming they have more memory for textures on Xbox 360 than they do with PlayStation 3? Is the XMB hogging THAT MUCH memory?
If the OS only takes 4 MB, that's still 'less memory for textures' ;)
or is it because Ubisoft can't take advantage of the separate system RAM design that Sony chose to go with
That's my guess. I think they're treating it like a 256 MB VRAM pool and not crossing over into XDR.
 
If the OS only takes 4 MB, that's still 'less memory for textures' ;)
That's my guess. I think they're treating it like a 256 MB VRAM pool and not crossing over into XDR.

Ubisoft not taking advantage of the XDR at all? :oops: A poor decision if true. I was really hoping developers would at least try to make their products as equal as possible across multiple platforms when it comes to "next gen."
 
I wonder how much of what MS paid UBI for Splinter Cell 5 exclusivity was to start a smear campaign against the PS3.
 