I could not find the XDR latency topic in the search function, so I'm asking here.

Ubisoft not taking advantage of the XDR at all? :oops: A poor decision if true.
Obviously they're using it for Cell and the game code. I just don't think they're using it for textures, though other graphics data may be finding its way into there.
 
Obviously they're using it for Cell and the game code. I just don't think they're using it for textures, though other graphics data may be finding its way into there.

I find it hard to believe that Ubisoft isn't going to take advantage of the XDR for textures and is only using it for game code. I mean, RSX does have access via a separate bus, so I just don't see why not. Oh well... texture downgrade for sure? It seems like 256MB (minus game code, OS) is a waste.
 
I find it hard to believe that Ubisoft isn't going to take advantage of the XDR for textures and is only using it for game code. I mean, RSX does have access via a separate bus, so I just don't see why not.

The implementation required likely isn't the most straightforward approach. As said before, the most straightforward approach is to localise the GPU's access to VRAM, and Cell's to XDR, and leave it at that. With time, as their middleware matures* and as they get a better handle on things, I'm sure you will see them take a more versatile approach to the memory where it is appropriate.

*UE3 reportedly isn't very mature on PS3 at all, yet. It's probably something of a miracle we're seeing any UE3-powered games this year.
 
Probably not important, but will we most likely see a pretty significant texture "downgrade" (for lack of a better word) for this particular PS3 game?
 
*UE3 reportedly isn't very mature on PS3 at all, yet. It's probably something of a miracle we're seeing any UE3-powered games this year.
Fatal Inertia uses it, I think; that is supposedly a launch "window" game.

I wouldn't class KOEI as miracle workers though ;)
 
Probably not important, but will we most likely see a pretty significant texture "downgrade" (for lack of a better word) for this particular PS3 game?

I'm not sure, I think we'll have to wait for the final product. I haven't been keeping up with the game or its status, but given what's been said to date by Epic et al, I'm just surprised we're seeing a UE3 game at all this year on PS3. But like Shifty says, the issues with that engine aren't necessarily the same as the issue raised in the OP (texturing from XDR).
 
I don't see what is so difficult in using XDR for textures. Heck, it would be pretty similar to a PC using system RAM for textures, only on PS3 there would not be the AGP/PCIe bottleneck.

Am I missing something?
 
Am I missing something?
There are only two theories that seem to fit the idea that Ubisoft have less RAM for textures on PS3 than XB360. Either the OS is consuming vast amounts of RAM, meaning less is available, or they're only using the GDDR for textures. Neither sounds entirely plausible, but the texturing one (no XDR-based textures) seems the more probable of the two! Because RSX has to fetch textures over the FlexIO, making it indirect access, perhaps that adds a problem for the existing texturing engine, either in setting up the fetch or managing the extra latencies or some such?
 
Well, the textures in this particular game are already bad, even in 360 terms, so I really think it's either laziness on the devs' part, or it will be an issue for both consoles but only mentioned for one (as some kind of "excuse").
 
Texturing from XDR performs differently, so you can't treat it as "equal performance". You suffer more latency from XDR, which means that you'll prefer to use smaller shaders (those that use fewer registers), because the overall register file size = fragments in flight * registers per fragment.

Since the register file is fixed in size, the increased number of fragments in flight (needed to hide the increased latency) reduces the number of registers you can use. Alternatively, you can elect to go "over budget" on registers, but that will reduce performance (because now you have fewer fragments in flight and can't hide the latency 100% of the time).
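To put rough numbers on that trade-off, here's a back-of-envelope sketch; every figure in it (register file size, ALU work between fetches, the two latencies) is invented purely for illustration, not a real RSX or XDR number:

```c
/* Back-of-envelope illustration of:
 *   register file size = fragments in flight * registers per fragment
 * All numbers are made up for illustration; none are real RSX figures.
 */
#include <stdio.h>

int main(void)
{
    const int register_file_size = 4096;   /* total registers available (made up) */
    const int alu_cycles_per_fetch = 4;    /* ALU work per fragment between texture fetches (made up) */
    const int latencies[] = { 400, 800 };  /* e.g. "local GDDR3" vs "XDR over FlexIO", in cycles (made up) */

    for (int i = 0; i < 2; i++) {
        /* To keep the ALUs busy while a fetch is outstanding you need roughly
         * latency / work-per-fragment fragments in flight... */
        int fragments_in_flight = latencies[i] / alu_cycles_per_fetch;
        /* ...and the fixed register file then caps registers per fragment. */
        int registers_per_fragment = register_file_size / fragments_in_flight;
        printf("latency %3d cycles -> %3d fragments in flight -> %2d registers per fragment\n",
               latencies[i], fragments_in_flight, registers_per_fragment);
    }
    return 0;
}
```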

If you're starting out with an abundance of shader performance anyway (e.g. you're ROP bound, or you're GDDR3 bandwidth bound) then there's no harm in reducing the theoretical performance of your shader (since it's "too high") and texturing from XDR. Which has the side effect of giving back some GDDR3 bandwidth, perhaps for things that will benefit more.
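And as a toy illustration of the "giving back some GDDR3 bandwidth" side of it: the 22.4 GB/s figure is just the usual 128-bit @ 700MHz GDDR3 number, while the framebuffer and texture demands below are made up purely for the sake of the example:

```c
/* Toy GDDR3 bandwidth budget. 22.4 GB/s is the usual 128-bit @ 700 MHz GDDR3
 * figure; the framebuffer/texture demands are invented for the example. */
#include <stdio.h>

int main(void)
{
    const double gddr3_bw       = 22.4;  /* GB/s available from local GDDR3 */
    const double framebuffer_bw = 14.0;  /* GB/s of colour/Z/blend traffic (made up) */
    const double texture_bw     = 10.0;  /* GB/s of texture reads (made up) */

    double all_local   = framebuffer_bw + texture_bw;        /* everything out of GDDR3 */
    double half_in_xdr = framebuffer_bw + 0.5 * texture_bw;  /* move half the texture reads to XDR */

    printf("all textures in GDDR3 : %.1f GB/s wanted vs %.1f GB/s available\n", all_local, gddr3_bw);
    printf("half textures in XDR  : %.1f GB/s wanted vs %.1f GB/s available\n", half_in_xdr, gddr3_bw);
    return 0;
}
```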

Additionally, texture caching will work differently between XDR and GDDR3. This could mean that some kinds of textures will actually be happier in XDR than in GDDR3, due to access patterns and contention for cache space between textures.

Clearly, you have to suck it and see to work out what's best. Because this experience is so different from PC, which is basically fucked when texturing out of system RAM, there isn't much carry-over between the two platforms.

Jawed
 
Hmm, perhaps this thread may have been started on false premises? I believe the comment the OP* is talking about comes from this month's GamesTM where they talk very briefly about the PS3 version of Rainbow Six. The actual quote is somewhat more ambiguous than the OP suggests.
JF Poirer said:
"We're developing with 360 as our main development platform and porting to the PS3 means that there's less memory available for us to use, but we're trying to minimise any drop in quality, that's for sure."
The quote also comes from the associate producer rather than a technically minded developer, so whatever problem they might be experiencing has likely been compressed to soundbite level for PR purposes.

Outside of the aforementioned quote, I think there are a lot of complications which come with porting from the 360 and developing with UE3 on a PS3 launch title - problems that would not necessarily be there if this situation were repeated 12 months down the line. Mark Rein himself has been quoted (1UP Yours Podcast - 8th August 2006) as saying his engine is "not really a launch title engine" (43:30) and "I don't know of anybody making a PS3 Unreal Engine 3 game for launch" (46:35)... which kinda begs the question - is R6:V really a UE3 title? Maybe the engine isn't the problem at all, and it's simply a complication from developing with 360 as the lead platform.

Without any further insight into their problems, we're going to need a qualitative comparison to answer our questions. If it's a significant difference, then they probably aren't relying on XDR, and if it's "negligible" like Starbreeze followed up with on their texture deficit comments, then it'll probably be because of the complications involved in using XDR and the architectural differences between the two consoles.

It would really help if there were any screenshots or footage of the PS3 version, but that may in itself be the answer to our questions :-?

*If it's not the quote you had in mind, then it would be nice if you could source it thanks ;)
 
You suffer more latency from XDR, which means that you'll prefer to use smaller shaders (those that use fewer registers), because the overall register file size = fragments in flight * registers per fragment.
A long shader does not necessarily use more registers than a short one, and vice versa.
 
Didn't NV double the texture caches in RSX to deal with the added latency? Or did I dream that?

Cheers
 
DeanA, can you tell us what the XDR latency from/to the Cell is? This should be public information, since there is a full system simulator from IBM.
 
DeanA, can you tell us what the XDR latency from/to the Cell is? This should be public information, since there is a full system simulator from IBM.
The simulator is not clock-cycle accurate and it does not model external memories AFAIK, so no, it's not public information.
 
DeanA, can you tell us what the XDR latency from/to the Cell is? This should be public information, since there is a full system simulator from IBM.
Probably not. If I'm unsure as to the public nature of something, I'll tend to not say. Besides, surely the XDR/CELL latency is a property of the system that CELL is in, rather than a property of CELL as such. So I'm expecting that there's no guarantee that, for example, a CELL reference system has the same latencies as a PS3 implementation.

Dean
 
Probably not. If I'm unsure as to the public nature of something, I'll tend to not say. Besides, surely the XDR/CELL latency is a property of the system that CELL is in, rather than a property of CELL as such. So I'm expecting that there's no guarantee that, for example, a CELL reference system has the same latencies as a PS3 implementation.

Dean

I thought the whole FlexIO thing and XDR/Rambus were integral to the Cell Broadband Engine architecture, and all open and documented? So then it's just a matter of what clocks you are working with, which we know to be 3.2GHz for Cell. I think this information may already be out there.

But how much latency RSX has vis-a-vis XDR memory is something else entirely. As soon as anything touches on RSX/Cell interaction, it becomes very PS3 specific. Therefore someone's comment on texturing from XDR, though pretty much widely known, was probably closer to NDA stuff than anything you could say about Cell/XDR latency. ;)

Then again, a lot of developers are now openly talking about the techniques they are using to make use of the PS3, and I think most stuff will become public now soon enough, as both the consoles and the games are shipping. Little you can change at this point in time! ;)

And of course people will start doing some benchmarking as soon as they get their hands on Linux. You might see people close to or from Sony themselves even mentioning details about the hardware to indicate when something in Linux prevents reliable tests. But since I understand you'll have pretty extensive access to the Cell, we should see some good tests anyway.
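For what it's worth, the usual way to estimate this kind of latency once Linux access is there is a dependent pointer chase over a buffer bigger than the caches. A completely generic, un-tuned sketch (the sizes and iteration count are just guesses, and timer/TLB overheads will muddy the result):

```c
/* Minimal dependent pointer chase to estimate main-memory read latency.
 * Not Cell-specific or tuned; array size is just chosen to exceed typical
 * cache sizes, and the result is only a rough per-load figure. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     (4 * 1024 * 1024)      /* 4M entries: assumed well past any cache */
#define ITERS (10 * 1000 * 1000)

int main(void)
{
    size_t *next = malloc((size_t)N * sizeof *next);
    if (!next) return 1;

    /* Sattolo's shuffle builds a single random cycle, so the chase below
     * visits all N slots in an order the prefetcher can't predict. */
    for (size_t i = 0; i < N; i++) next[i] = i;
    srand(12345);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;   /* j < i, never i itself */
        size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    size_t p = 0;
    for (long i = 0; i < ITERS; i++)
        p = next[p];                     /* every load depends on the previous one */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (double)(t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    printf("~%.1f ns per dependent load (ignore: %zu)\n", ns / ITERS, p);

    free(next);
    return 0;
}
```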
 
Looking at the documentation I found this:
The Full System Simulator for Cell BE Processor provides a cycle-accurate SPU core model that can be used for performance analysis of computationally-intense applications. However, this model can not be used for measuring or tracking memory access latencies.
So, sorry for asking about something you can't comment on.

Since the memory controller is integrated into the Cell, I don't think the latency should change much between implementations, but only those who have Cell blades are really in a position to talk about it.
 