Standard PS2 framebuffer size?

mech

I was always under the impression that (for PAL at least) the standard resolution used was 640x512, or 640x256 interlaced.

However, my roommate was at the AGDC (Australian Game Developers Conference) and he was told it would be far better for his game to use a 512x512 resolution - and render it with a skewed aspect, so that when it was stretched out onto a TV it would look correct.

This is news to me, but apparently MGS2 uses this resolution? Does anyone know anything about this?
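If I understand the trick right, the arithmetic is just a fixed horizontal pre-squash. A minimal sketch (the 640/512 figures are the ones from this thread, nothing from the talk itself):

/* Sketch of the aspect pre-skew idea: the game renders into a 512-wide
 * buffer, but the TV scanout stretches it across the same screen area a
 * 640-wide buffer would cover, so geometry has to be squashed
 * horizontally up front for circles to stay circular after the stretch. */
#include <stdio.h>

int main(void)
{
    const float display_width = 640.0f; /* "full" horizontal resolution */
    const float buffer_width  = 512.0f; /* actual framebuffer width     */

    /* Scale X in the projection by 512/640 so the TV's stretch back up
     * to full width cancels out. */
    float x_scale = buffer_width / display_width;
    printf("pre-skew X scale: %.3f\n", x_scale); /* prints 0.800 */
    return 0;
}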
 
4mb vram. :p
ps2 games always look slightly blurrier, darker, muddier and lower res compared to the other consoles, including the DC.
 
Because it is a well-known fact that 4MB of memory darkens, muddies and, when possible, reduces all colors. The moment you draw into a 4MB buffer, red turns to brown, white becomes gray with scratches of mud, and black becomes, well, blacker.

Btw Chap, the interlace flicker which earned the PS2 all that aliasing fame is a direct result of displaying "higher" res than the filtered games.

On topic,
there's no standard frame buffer resolution per se. It IS true some games render 512 horizontal, since it makes very little perceptual difference when displayed on a TV screen. The nature of interlaced TV screens makes changes to vertical resolution considerably more noticeable than changes to horizontal.
(also why 4x vertical AA will tend to look better than 2x2 hor/vert AA on a TV)
 
Sorry Faf, I forgot the PS2's crappy DAC video out too. :eek:
But you have to admit that 4MB is a bad limitation for the PS2 to shine. :oops:
 
You know, that's weird, but I've noticed that some games captured on Gamespot have a horizontal resolution of 512 and a squished horizontal aspect ratio. That goes for both PS2 and GC caps of some games. How does the TV scale that to 640, anyway?

MGS2, judging from the same video captures, runs at a 640x resolution. In the MGS2 document they say they used 2MB of VRAM for the video buffers (front+back). I don't know how the buffer calculation goes, but I'm sure someone could derive the resolution from that. I think the game runs at 16-bit color, as color banding is present in some places, like lens flares or close-ups of color gradients.

*edit*

512 * 512 * 4 (if it's 32-bit color) * 2 = exactly 2MB!

Maybe PAL MGS2 really does do this resolution stretch? I don't think it would be necessary for NTSC, as you could render roughly 600x440 in the 2MB, which should be OK without any stretching? Still, I'm not sure if the game uses 32-bit color in the first place.
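A quick sanity check of those numbers. A sketch assuming just front and back colour buffers, with no Z-buffer or texture memory counted:

#include <stdio.h>

/* bytes = width * height * bytes_per_pixel * number_of_buffers */
static unsigned long fb_bytes(unsigned w, unsigned h,
                              unsigned bytes_per_pixel, unsigned buffers)
{
    return (unsigned long)w * h * bytes_per_pixel * buffers;
}

int main(void)
{
    /* 512x512, 32-bit, double buffered: exactly 2MB (2097152 bytes) */
    printf("512x512x32 x2 = %lu\n", fb_bytes(512, 512, 4, 2));
    /* 600x440, 32-bit, double buffered: 2112000 bytes - actually just
     * OVER 2MB, so the 600x440 guess above is marginal */
    printf("600x440x32 x2 = %lu\n", fb_bytes(600, 440, 4, 2));
    /* 640x448, 16-bit, double buffered: 1146880 bytes, well under 2MB */
    printf("640x448x16 x2 = %lu\n", fb_bytes(640, 448, 2, 2));
    return 0;
}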
 
This is something I have never understood about the PS2: it has 4MB of VRAM, a whole megabyte more than the GameCube.
Then how come people keep citing the PS2's V-buffer size as the reason for its grainy textures and jaggy video output?
And how come the GameCube can (supposedly) do that much better with less RAM? The GC's video memory is slightly faster, but not by that much, and the PS2 has higher bandwidth for both frames and textures.
Now I know that the GC has built-in texture compression to compensate, but as has been pointed out in several other posts on this board, texture compression is not some magic box you just put pictures into and out come smaller pictures; the PS2 can compress textures too, just as well as the GC or Xbox, using proprietary compression methods tailored specially to the kind of textures you are using.

Is there something I have missed here, or is it just the old case of developers not bothering/having time to come up with special TC schemes just for the PS2? Because IMO the PS2 generally does have rather poor textures, except for a few shining examples such as Jak & Daxter, Ratchet and Clank and Burnout 2. But even in those games you can sense the various tricks used, for example clever use of tiling (J&D, R&C), texture streaming, and the fact that racing games don't have much overdraw (Burnout). Will we ever see good, honest, high-res, colourful textures on the PS2?
 
Well, the thing is, GCN's texture compression format (S3TC) can very well be one-click and you're done. Granted that won't get the best results - it NEVER will - but it's a lot easier to set up a texture for S3TC than it is for CLUT.

Also on PS2 the line for streaming textures from main RAM is a paltry 1.2GB/sec, and that's for vertex data too.
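For what it's worth, the raw footprints compare something like this. A rough sketch, where the 8-bytes-per-4x4-block figure is standard DXT1/S3TC and the 32-bit palette entries are just an assumption:

#include <stdio.h>

int main(void)
{
    unsigned w = 256, h = 256; /* example texture size */

    /* DXT1/S3TC: each 4x4 texel block compresses to 8 bytes */
    unsigned dxt1  = (w / 4) * (h / 4) * 8;

    /* 8-bit CLUT: one index byte per texel + a 256-entry palette
     * (assuming 32-bit palette entries) */
    unsigned clut8 = w * h + 256 * 4;

    /* 4-bit CLUT: half a byte per texel + a 16-entry palette */
    unsigned clut4 = w * h / 2 + 16 * 4;

    printf("256x256: DXT1=%u  CLUT8=%u  CLUT4=%u bytes\n",
           dxt1, clut8, clut4); /* 32768, 66560, 32832 */
    return 0;
}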
 
Ratchet and Clank is hardly worth mentioning as far as textures go. Silent Hill 2, Burnout 2 and Baldur's Gate, of the games I have for PS2, have the best textures.
 
marconelly! said:
Ratchet and Clank is hardly worth mentioning as far as textures go. Silent Hill 2, Burnout 2 and Baldur's Gate, of the games I have for PS2, have the best textures.

I agree totally. Single-color gradient textures are not very rich. A perfect example would be MGS2.
 
Except that in MGS2 it works much, much better, IMO. I never think about 'the lack of textures' when I play it, and I can't say the same for R&C.
 
Tagrineth said:
Also on PS2 the line for streaming textures from main RAM is a paltry 1.2GB/sec, and that's for vertex data too.

Why is 1.2GB/s "paltry" when you would be swapping at most 8MB of textures? Even an unrealistic 32MB of swapped textures per second would be less than 3% of the total theoretical bandwidth. Is it not also true that there are actually alternate lines to get textures into the GS (if you want to get technical about it), if need be - not just the main line?

I think it all comes down to this: is having 2GB/s of bandwidth to the GPU really "better" if it's only there to say you have it, rather than something you can actually use?
 
Also on PS2 the line for streaming textures from main RAM is a paltry 1.2GB/sec, and that's for vertex data too

Gamecube has 2.6GB/s of main memory bandwidth, but the Gekko uses half of that bandwidth, and from what I understand that's pretty much a fixed setup (you wouldn't be able to change it if you wanted more bandwidth for graphics), so you end up with main-mem-to-graphics-chip access similar to the PS2.
Likewise for Xbox: if you take away DDR latency, general latency (which is actually rather high on Xbox), the lack of a large on-die texture buffer, and frame and Z-buffer bandwidth, all that's left for pure geometry and textures is somewhere in the 1.0 to 1.5GB/s region.

Well, the thing is, GCN's texture compression format (S3TC) can very well be one-click and you're done. Granted that won't get the best results - it NEVER will - but it's a lot easier to set up a texture for S3TC than it is for CLUT.

I think the GC's S3TC feature is completely automatic, I don’t even think you could turn it off if you wanted to.
I can't imagine that someone hasn't written an auto CLUT-making program somewhere. It could involve some kind of pattern recognition similar to VQ compression...
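The trivial, fully automatic version would be something like this. A sketch only; a real tool would cluster the actual colours (median cut, k-means, or the VQ-style matching mentioned above) rather than use a fixed palette:

#include <stdint.h>
#include <stdio.h>

/* Build a fixed 256-entry palette: 3 bits red, 3 green, 2 blue. */
static void build_palette(uint8_t pal[256][3])
{
    for (int i = 0; i < 256; i++) {
        pal[i][0] = (uint8_t)(((i >> 5) & 7) * 255 / 7); /* R */
        pal[i][1] = (uint8_t)(((i >> 2) & 7) * 255 / 7); /* G */
        pal[i][2] = (uint8_t)(( i       & 3) * 255 / 3); /* B */
    }
}

/* Map one RGB888 texel to its palette index - no human input needed. */
static uint8_t to_index(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint8_t)(((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6));
}

int main(void)
{
    uint8_t pal[256][3];
    build_palette(pal);

    uint8_t idx = to_index(200, 120, 40); /* some orange texel */
    printf("index %u -> (%u,%u,%u)\n",
           idx, pal[idx][0], pal[idx][1], pal[idx][2]);
    return 0;
}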


Sorry Faf, I forgot the PS2's crappy DAC video out too.

I think maybe the PS2 has too sharp an output for its own good. I once tried a test DVD on my PS2 vs. an Xbox, and the PS2 won hands down in the detail department.
Actually most people (me included) prefer a slightly blurry picture to a sharp but pixelated one. Just look at the picture control menu on most people's TVs; the sharpness slider will in most cases be all the way down. This is why the Dreamcast had a low-pass filter: to filter out the noise that pixelation really is. Fortunately the PS2's sharpness can be "cured" with AA.
 
Not really knocking anything, just pointing this out.

Swapping 8MB of textures per frame @ 60fps requires 8*60 = 480MB/s of bandwidth, which is about 40% of the available bandwidth.

32MB per frame @ 60fps requires 32*60 = 1920MB/s, or about 160% of the available bandwidth.

And 1.2GB/s is it; everything goes over that bus.
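Spelled out, with the same assumption that the whole texture set crosses the bus exactly once per frame:

#include <stdio.h>

int main(void)
{
    const double bus_mb_s   = 1200.0;   /* ~1.2GB/s main-RAM-to-GS path */
    const int    fps        = 60;
    const int    sets_mb[]  = { 8, 32 }; /* MB of textures per frame */

    for (int i = 0; i < 2; i++) {
        double need = (double)sets_mb[i] * fps; /* MB/s required */
        printf("%2dMB/frame @ %dfps = %4.0fMB/s (%3.0f%% of the bus)\n",
               sets_mb[i], fps, need, 100.0 * need / bus_mb_s);
    }
    return 0;
}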

Personally I think the biggest issue with PS2 development is developer choices: developers are tied up in pushing lots of polys rather than pushing visual quality. The PS2 is capable of good quality output with colorful textures if the developers want to do it, I just don't think it's been a primary focus.
 
Gamecube has 2.6GB/s of main memory bandwidth, but the Gekko uses half of that bandwidth, and from what I understand that's pretty much a fixed setup (you wouldn't be able to change it if you wanted more bandwidth for graphics), so you end up with main-mem-to-graphics-chip access similar to the PS2.

No, it's not a fixed setup for GC, not for texture/geometry bandwidth anyway. However, it is for PS2.

Basically, as I understand it, the 3.2GB/s main mem bus (for PS2) goes direct to the PS2 CPU. Then there is a 1.2GB/s bus from the CPU to the GPU, so there's 2GB/s left over for the CPU. But because of the way it's set up, even if you don't use all of that 2GB/s of CPU bandwidth, you still only have that fixed 1.2GB/s bus for textures and geometry.

With GameCube it's the other way around. The 2.6GB/s main mem bus goes direct to the GPU, then the GPU leads to the CPU on a 1.3GB/s bus. So in a purely theoretical case, if you used no bandwidth for the CPU, you could use the whole 2.6GB/s for textures and geometry. On the other hand, for GC, even if you use less than 1.3GB/s for textures and geometry, you still only have a fixed 1.3GB/s of bandwidth for the CPU. But then I think 1.3GB/s is all GC's CPU could ever need really, so..
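Restating that topology as arithmetic, using the figures quoted in this thread (which may not match official specs):

#include <stdio.h>

int main(void)
{
    /* PS2: main mem feeds the CPU (EE); a fixed 1.2GB/s path goes on to
     * the GPU (GS), leaving the remainder for the CPU no matter what. */
    double ps2_main = 3.2, ps2_to_gs = 1.2;
    printf("PS2: textures/geometry capped at %.1fGB/s, CPU keeps %.1fGB/s\n",
           ps2_to_gs, ps2_main - ps2_to_gs);

    /* GC: main mem feeds the GPU (Flipper); the CPU (Gekko) hangs off
     * Flipper on a 1.3GB/s link, so texture/geometry traffic can in
     * theory take anything up to the full 2.6GB/s. */
    double gc_main = 2.6, gc_to_gekko = 1.3;
    printf("GC:  textures/geometry up to %.1fGB/s, CPU capped at %.1fGB/s\n",
           gc_main, gc_to_gekko);
    return 0;
}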
 
marconelly! said:
32MB per frame @ 60fps requires 32*60 = 1920MB/s, or about 160% of the available bandwidth
However, on PS2, it would be ridiculous to have 32MB of textures, as that would fill the entire RAM.

Too true! So what do you suppose would be more typical values for a texture heap in a PS2 game? 2MB normally, 4MB for a high-texture presentation?
 
However, on PS2, it would be ridiculous to have 32MB of textures, as that would fill the entire RAM.

There is nothing stopping you from storing compressed textures in main mem too. In fact there is an MPEG2 decoder on the EE die with a 3.2GB/s bus to main mem; although I have never actually heard of anyone using it to decompress textures, I can't see why it shouldn't be able to.
 
Don't forget that on the PS2, depending on how you use the textures and in what order, some textures will have to be sent across the bus several times... not just the entire 4 or 8MB set once per frame.

Also don't forget that while filling RAM with textures is silly, you can have a very high texture count because some textures could even (in theory) be dynamically loaded from the CD. Main RAM won't stay in one state for the whole second! :LOL:

Re: GameCube:

The 2.6GB/sec bus is obscenely low-latency (so it's incredibly efficient), and is pretty much dedicated to Flipper. The developers can allocate however much they want to Gekko through Flipper - up to 1.3GB/sec but not fixed at that amount.

Also vertex data doesn't specifically have to touch main RAM much, or at least doesn't have to use the original 2.6GB/sec much at all, and if Gekko is used for TCL, then the 1.3GB/sec Flipper-Gekko link can be used instead, not hitting main memory bandwidth at all. :)

Squeak said:
I think the GC's S3TC feature is completely automatic, I don’t even think you could turn it off if you wanted to.
I can't imagine that someone hasn't written an auto CLUT-making program somewhere. It could involve some kind of pattern recognition similar to VQ compression...

Yeah, GC's S3TC is automatic if the developer wants to use it. It CAN be turned off, but that is rarely necessary or very beneficial...

And automatic CLUT is possible, of course; however, it could easily look several times worse than automatic S3TC.
 