CEDEC 08 Goings-on...



Red overlay: Failure...

Light, Ambience

TGS-2005 TRAILER Experiment

• 3-vector component lightmaps
• Shadow volumes
• High polygon, high resolution models
— 800,000 polygon count (background only)
— About 300MB texture memory (total)

But...

• 3-part lightmaps put pressure on VRAM; 10% of the resolution couldn't be obtained...
• The shadow volume technique was going extinct...
• Could not use all the memory we began with...
 
:/ Damn Sony and their castrated 7800 GTX chip. Even with 256-bit and 512MB VRAM, it couldn't really hold a candle to a PC at the time of launch, but 128-bit and 256MB VRAM just totally crushed some developers' targets.
 
:/ Damn Sony and their castrated 7800 GTX chip. Even with 256-bit and 512MB VRAM, it couldn't really hold a candle to a PC at the time of launch, but 128-bit and 256MB VRAM just totally crushed some developers' targets.
Yes, it would have been nice to have more and faster RAM, but the cost was prohibitive. The question here is what went missing and why? How much texture memory were they using in the final game?
 
I'm also curious to know if any PS3 games use the XDR main memory as additional VRAM for textures. Do we have any knowledge on that, or speculation? It is possible, after all, isn't it?
 
I'd say most games pretty much have to use XDR for textures... Otherwise the system couldn't keep up with X360 games at all. Even this way there's going to be a slight difference with some multiplatform games, so one can imagine what they'd look like with only the VRAM.
 
True, but then again, no game on consoles has beaten Uncharted texture-wise. So it might be a lack of dedication on multiplatform games. Why should they go one step beyond? ;)

Also, do you think the high bandwidth/speed of the XDR memory makes a difference at all, i.e. justifies the cost of such memory? Maybe Sony should have chosen a lower-end RAM and put that money/R&D into a better GPU chip.
 
The higher speed of XDR is consumed by framebuffer operations that XB360's RAM doesn't experience, thanks to its eDRAM. Sony couldn't have gone with lower BW without using an eDRAM-type solution.
 
There's the simple issue of two split memory pools, both with an amount reserved for the OS/interface.
I blame the Cell, or the console's design. Could they have gone with a unified 256-bit 512MB GDDR3 pool? That would be expensive for sure, but so is their XDR stuff or an eDRAM die.
 
Hrm. XDR is Cell's memory, although RSX can read it, and read it fast. This is not where you want to put your rendertargets, so its bandwidth is not consumed by the ROPs, Shifty.
While you can put textures there for RSX to consume, this will cut into the bandwidth available for Cell and may not be the best of ideas. On the 360, all bandwidth is shared between Xenon and Xenos, but you only have roughly half the total bandwidth. Thus the dreaded eDRAM.

I guess if you need more than 256MB of targets and textures, you'll have to use some XDR for textures, but that is not necessarily given. Or possible, depending on how much mem your game-code needs.
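
To make the pool-choice reasoning above a bit more concrete, here's a minimal C++ sketch of that kind of allocation policy. It's purely illustrative, not libgcm or any real PS3 SDK API; the types, sizes and function name are all made up. The point is just that render targets stay in GDDR, while textures prefer GDDR and only spill into XDR once local memory runs out.

Code:
// Purely illustrative sketch -- not libgcm or any real PS3 API.
#include <cstddef>
#include <cstdio>

enum class Pool { LocalGDDR, MainXDR };
enum class ResourceKind { RenderTarget, Texture };

struct PoolBudget {
    size_t gddrFree;   // local (RSX) memory remaining
    size_t xdrFree;    // main (Cell) memory the game is willing to lend to RSX
};

// Pick a memory pool for an RSX-visible allocation.
Pool choosePool(ResourceKind kind, size_t bytes, const PoolBudget& budget) {
    if (kind == ResourceKind::RenderTarget) {
        // Render targets are ROP-bandwidth heavy: keep them in GDDR.
        return Pool::LocalGDDR;
    }
    // Textures prefer GDDR too, but may spill to XDR once local memory is
    // exhausted -- accepting that the spill eats into Cell's bandwidth.
    return (bytes <= budget.gddrFree) ? Pool::LocalGDDR : Pool::MainXDR;
}

int main() {
    PoolBudget budget{ 16u << 20, 64u << 20 };  // e.g. 16MB of GDDR left, 64MB of XDR offered
    Pool p = choosePool(ResourceKind::Texture, 32u << 20, budget);
    std::printf("32MB texture goes to %s\n", p == Pool::LocalGDDR ? "GDDR" : "XDR");
    return 0;
}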

WRT Uncharted, there is a presentation here with the memory layout for that game. There is a 6MB "Video Memory" block in XDR which might be textures, but I doubt it.
 
Hrm. XDR is Cell's memory, although RSX can read it, and read it fast. This is not where you want to put your rendertargets, so its bandwidth is not consumed by the ROPs, Shifty.
Yes, that's an important correction to my explanation. I guess I should have said 'rendering functions' or something. I'm assuming a graphics engine that spans both memory pools, but what you say suggests that's actually a bit rare (?), and in essence the aggregate BW of XDR+GDDR isn't readily being targeted for rendering; instead, the XDR BW is being isolated somewhat to game code. If that's the case in the majority of titles, PS3's graphics capability will be limited to something like 256 MB of VRAM (excluding OS reservations, if any) and 20 GB/s BW, which isn't a great deal. I don't suppose there's any way to determine what games are using XDR for textures.
 
Yes, that's an important correction to my explanation. I guess I should have said 'rendering functions' or something. I'm assuming a graphics engine that spans both memory pools, but what you say suggests that's actually a bit rare (?).

I'm not really in a position to answer that, as I haven't really seen the code of many other people. It is certainly possible - and absolutely straight-forward - to use XDR for rendering stuff. I don't think it's really needed for bandwidth, though. More on that below. ;)

In essence the aggregate BW of XDR+GDDR isn't readily being targeted for rendering; instead, the XDR BW is being isolated somewhat to game code. If that's the case in the majority of titles, PS3's graphics capability will be limited to something like 256 MB of VRAM (excluding OS reservations, if any) and 20 GB/s BW, which isn't a great deal.

20GB/s is a whole lot of bandwidth, if you use it smart. Let's calculate! Assume 1080p and 30fps: 20GB/s works out to 666MB of bandwidth per frame, and a 32-bit 1080p target is about 8MB. If we do a deferred pass, we'll probably do a z-prepass. Assuming we hit every pixel 5 times (ouch!), reading and writing the depth target each time, we're at 80MB for this frame. Now for the g-pass, we'll probably reduce the number of hits on the z-buffer a lot, simply by early-z tests. But let's assume 2 reads per pixel, or 16MB. We'll also need textures here, so let's say diffuse, normal, specular and foo, all DXT5. With a 50% texture cache hit-rate (not very good), we're at 8 bytes per pixel read (bilinear) and probably 12 bytes written. So that's about 40MB.

The deferred pass itself will eat a lot. We have 4 targets to read, all 4 bytes per pixel, and maybe two to write (yes, I'm making this up). So we have 6 full 1080p targets moving around, for 48MB. Just for my deferred solid pass, then, I've consumed 80+16+40+48=184MB of bandwidth, which is about a quarter of the budget. There's still plenty for transparent passes, shadows, particles and whatnot.

(I've probably made a big terrible mistake in there somewhere, but hey, you get the point. If it's all off by a factor of 2, just assume I was talking 720p.)
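
For anyone who wants to poke at these figures, here's a small C++ back-of-the-envelope version of the same budget. Every per-pixel cost in it (5 depth hits in the prepass, 2 in the g-pass, 8 bytes of texture reads plus 12 written, four 4-byte g-buffer reads and two writes) is an assumption carried over from the post above, not a measured number, and MB means decimal megabytes to match the 666MB/frame figure.

Code:
// Back-of-the-envelope bandwidth budget, mirroring the assumptions above.
#include <cstdio>

int main() {
    const double MB        = 1e6;                     // decimal megabytes
    const double pixels    = 1920.0 * 1080.0;         // 1080p
    const double targetMB  = pixels * 4.0 / MB;       // one 32-bit target, ~8MB
    const double budgetMB  = 20e9 / 30.0 / MB;        // 20GB/s at 30fps, ~666MB/frame

    // Z-prepass: ~5 hits per pixel, reading and writing the depth target.
    const double prepassMB = 5.0 * 2.0 * targetMB;                  // ~80MB

    // G-pass: ~2 depth reads per pixel, plus textures
    // (8 bytes read at a 50% cache hit-rate, ~12 bytes written).
    const double gpassZMB   = 2.0 * targetMB;                       // ~16MB
    const double gpassTexMB = pixels * (8.0 + 12.0) / MB;           // ~40MB

    // Deferred lighting pass: read 4 g-buffer targets, write ~2.
    const double deferredMB = 6.0 * targetMB;                       // ~48MB

    const double totalMB = prepassMB + gpassZMB + gpassTexMB + deferredMB;
    std::printf("solid-pass traffic: %.0fMB of a %.0fMB/frame budget (%.0f%%)\n",
                totalMB, budgetMB, 100.0 * totalMB / budgetMB);
    return 0;
}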

To be honest with you, if you're using a not-so-smart forward renderer (as opposed to a smart forward renderer ;)), especially with a good number of light-passes, bandwidth will kill you. Add some AA, some aniso, render to FP16 targets and you will feel the bandwidth fast. On the other hand I would argue that this is not necessary, most of the time. If you pick your texture resolutions wisely, apply advanced filtering where you need it and have some good artists, you'll get great results without needing 100GB/s busses.
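
To put a rough number on that, here's the same kind of napkin math for a naive multi-pass forward renderer. The figures (720p, a 4xMSAA FP16 colour target, 8 additive light passes each reading and writing the full target) are made up for illustration and ignore framebuffer compression, so treat the result as a pessimistic upper bound rather than a real measurement.

Code:
// Napkin math: framebuffer traffic of a naive multi-pass forward renderer.
#include <cstdio>

int main() {
    const double MB          = 1e6;                  // decimal MB, as above
    const double pixels      = 1280.0 * 720.0;       // 720p
    const double targetMB    = pixels * 4 /*samples*/ * 8 /*bytes, RGBA16F*/ / MB;  // ~29MB
    const int    lightPasses = 8;

    // Each additive light pass reads and writes the whole colour target
    // (ignoring z traffic, texture fetches and any colour compression).
    const double trafficMB = lightPasses * 2.0 * targetMB;
    const double budgetMB  = 20e9 / 30.0 / MB;       // ~666MB/frame at 30fps

    std::printf("light passes alone: %.0fMB of %.0fMB/frame (%.0f%%)\n",
                trafficMB, budgetMB, 100.0 * trafficMB / budgetMB);
    return 0;
}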


I don't suppose there's any way to determine what games are using XDR for textures.

Nah, no chance. I can't profile other people's games on a devkit, sadly. Boy would that be fun. ;)
 
To be honest with you, if you're using a not-so-smart forward renderer (as opposed to a smart forward renderer ;)), especially with a good number of light-passes, bandwidth will kill you. Add some AA, some aniso, render to FP16 targets and you will feel the bandwidth fast. On the other hand I would argue that this is not necessary, most of the time. If you pick your texture resolutions wisely, apply advanced filtering where you need it and have some good artists, you'll get great results without needing 100GB/s busses.
Thanks for a very interesting response. At the beginning of this gen, with PC GPU VRAM speeds climbing, everything pointed to very high BW requirements. I'm very curious what the mindset of developers is like as they look at PS3, whether they see it as a minuscule working space or as ample given a smart solution.

Nah, no chance. I can't profile other people's games on a devkit, sadly. Boy would that be fun. ;)
Hardware companies should run performance competitions where games are openly profilable by other devs, and the highest performers win big prizes! :mrgreen:
 
Thanks for a very interesting response. At the beginning of this gen, with PC GPU VRAM speeds climbing, everything pointed to very high BW requirements. I'm very curious what the mindset of developers is like as they look at PS3, whether they see it as a minuscule working space or as ample given a smart solution.

From my POV, I'd say it's adequate. I don't see console games looking better than PC versions anymore, simply because the GPUs are a generation old. No amount of bandwidth will change that. What we have is enough for a lot of games to run at 720p 60fps, or 1080p 30fps, with decent details. There are many sub-720p games where I'd love to profile the game and see what the reason is.

The comparison with PCs is always a bit complicated. PCs usually render in higher resolutions and these days usually with 4xAA at least. That eats some bandwidth, but as resolutions get higher, the need for AA gets less and less. Unless we get struck by lightning, we'll ship with 1080p, no AA, and that's a very, very clear image. Nobody has complained about jaggies in 1080p so far, while 720p without AA would make you go blind.

It's the same for aniso. On the PC, I go to the control panel and set 16xAF for the entire scene. Do I need AF on the characters in an FPS? Not really, right? The ground in a racing game? Absolutely! But I can only apply it to everything. There's a certain lack of granularity, in my experience.
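
As a concrete illustration of that granularity point: at the API level, anisotropy can be set per texture rather than globally. A minimal OpenGL sketch, assuming a context exists and the EXT_texture_filter_anisotropic extension is supported; the texture handles and the function name are placeholders.

Code:
// Per-texture anisotropic filtering, as opposed to a global driver override.
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT 0x84FE
#endif

void configureFiltering(GLuint roadTexture, GLuint characterTexture) {
    // The road is seen at grazing angles nearly all the time: spend the samples here.
    glBindTexture(GL_TEXTURE_2D, roadTexture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);

    // Characters are mostly viewed head-on: plain trilinear is good enough.
    glBindTexture(GL_TEXTURE_2D, characterTexture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 1.0f);
}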

Hardware companies should run performance competitions where games are openly profilable by other devs, and the highest performers win big prizes! :mrgreen:

That would be awesome. I'm not really sure many people - especially engine companies - would appreciate that. Make one stupid mistake and everyone will point and laugh. I *am* sure the tech people would love it. It's basically what we do between the console teams, internally (minus the prizes). Great fun. :)
 
T.B said:
That would be awesome. I'm not really sure many people - especially engine companies - would appreciate that.
Well, on PS2 you could do it if you were one of the lucky ones with a PA-enabled DTL. You could see all kinds of dirty rendering secrets with it too: comparing actual rendered primitive counts to PR statements (or generally accepted internet 'facts') was always particularly fun.
 
Nobody has complained about jaggies in 1080p so far, while 720p without AA would make you go blind.

Stick around a while! We were complaining about lack of AA at 2MPixel resolutions before 2005 around these parts :p You will even catch quotes like, "4xMSAA should be sufficient in most cases at 1080p for next gen." Gotta love B3D!

It's the same for aniso. On the PC, I go to the control panel and set 16xAF for the entire scene. Do I need AF on the characters in an FPS? Not really, right? The ground in a racing game? Absolutely!

A number of developers need to be slammed over the head with that last bit. For any game with large amounts of flat terrain that the consumer looks at often (be it a racing game, football game, FPS with large open flat areas, etc.), AF should be an early budget concern. :devilish: I would take sub-720p resolution for a fair number of games just to enable the feature. I am quite irritated by games that invest in a slew of high quality textures that look horrible because they lack proper filtering. Why even bother if all I will see is a blurry, shimmering mess?
 
Stick around a while! We were complaining about lack of AA at 2MPixel resolutions before 2005 around these parts :p You will even catch quotes like, "4xMSAA should be sufficient in most cases at 1080p for next gen." Gotta love B3D!

:D

Yeah, there's that crowd as well. I was thinking more about our own people, who seem to be easier to please. :)

For any game with large amounts of flat terrain that the consumer looks at often (be it a racing game, football game, FPS with large open flat areas, etc.), AF should be an early budget concern.

It's a strange thing, to be honest. Even with a racing game, I wouldn't imagine the road pixels to be extremely expensive. Maybe relatively highly tessellated, but not exactly texture-bound. So I would imagine that some aniso shouldn't really be a problem. Maybe we have someone around who has actually done a racing game and can demonstrate why I obviously don't know what I'm talking about... ;)
 