PS2 question

Wildstyle

Newcomer
Hey guys, I was just wondering: are the two 1024-bit ports to the embedded RAM for Z reads/writes, and can they transfer textures as well? Any info greatly appreciated.
 
Oh lol, I see. But honestly, why would you need such a big frame buffer? The Xbox and Gamecube get by just fine with way smaller frame buffers, and if they'd gone with a smaller frame buffer they could probably have increased the texture cache to 1024-bit. That would have been 8)
 
Wildstyle said:
Oh lol, I see. But honestly, why would you need such a big frame buffer? The Xbox and Gamecube get by just fine with way smaller frame buffers, and if they'd gone with a smaller frame buffer they could probably have increased the texture cache to 1024-bit. That would have been 8)

Uh... wha?

Actually most PS2 devs do use a smaller frame buffer to save on memory (4MB embedded, half of that would be eaten by a full/full 640x480 double buffer).

Z/frame reads and writes actually take up the majority of the memory bandwidth on a rasteriser - ESPECIALLY one with a ridiculous 16 pipelines. You'd be surprised how little bandwidth is really needed for texturing.
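Just to illustrate that split, here's a rough sketch of the per-pixel traffic at peak fill rate (assumed numbers, not from this thread: ~150MHz GS clock, 32-bit colour and Z, one 32-bit texel fetched per pixel; real traffic depends on Z mode, blending and filtering):

```python
# Rough sketch: frame/Z traffic vs. texture traffic at peak fill rate.
PIXEL_PIPES = 16
CLOCK_HZ = 150_000_000                      # ~150 MHz GS clock (assumed)
peak_fill = PIXEL_PIPES * CLOCK_HZ          # ~2.4 Gpixels/s

frame_z_bytes = 4 + 4 + 4                   # Z read + Z write + colour write, per pixel
texture_bytes = 4                           # one 32-bit texel lookup, per pixel

print(f"frame/Z traffic: {peak_fill * frame_z_bytes / 1e9:.1f} GB/s")  # ~28.8
print(f"texture traffic: {peak_fill * texture_bytes / 1e9:.1f} GB/s")  # ~9.6
```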

The main thing hindering texture quality on PS2 isn't bandwidth, it's space. As already mentioned, PS2 has less than 2MB available for textures if it's using a full/full 480-line frame (full/full HAS to be used for progressive, while full/half CAN be used for interlaced IF you have a high-quality filter AND can sustain 60fps), and while that doesn't sound bad on the surface (GCN's texture cache is fixed at 1MB), remember that PS2's texture compression support is very limited - 4- and 8-bit CLUT or JPEG only.
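For anyone who wants to check that memory budget, a quick back-of-the-envelope sketch (my own assumptions: 32-bit colour buffers, with Z shown separately since its format varies per game):

```python
# Back-of-the-envelope GS eDRAM budget for the "full/full" 640x480 case.
EDRAM = 4 * 1024 * 1024                      # 4MB of embedded GS memory

def buf(width, height, bpp):
    return width * height * bpp // 8         # bytes for one buffer

double_buffer = 2 * buf(640, 480, 32)        # front + back colour buffers
z_buffer = buf(640, 480, 32)                 # 32-bit depth buffer

print(f"double buffer: {double_buffer / 2**20:.2f} MB")                              # ~2.34 ("about half")
print(f"left for textures, no Z: {(EDRAM - double_buffer) / 2**20:.2f} MB")          # ~1.66 ("less than 2MB")
print(f"left for textures, 32-bit Z: {(EDRAM - double_buffer - z_buffer) / 2**20:.2f} MB")  # ~0.48
```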

Oddly enough, although JPEG compression would offer 24-bit quality at about 10:1 compression, not one single PS2 game uses it. I suppose as soon as that gets used, we'll see much better texture variation in games... but until then, it's still limited pretty handily. Shame, really.
 
Oddly enough, although JPEG compression would offer 24-bit quality at about 10:1 compression
I'm curious, what do you base your compression estimates on?
10:1 is something of a worst case scenario I'd expect, rather than average.
CG typically compresses well up to 20:1; photographic material fares better (30:1 is not uncommon while keeping lossiness at least comparable to S3TC, and still getting better results than either VQ or CLUT compression).

If there's one thing where you'd be looking at low compression it's with recompressing Clut images - anything but really high quality settings will tend to artifact a lot. But then recompressing Clut stuff into any other format will tend to degrade quality quite noticeably too.
 
Fafalada said:
Oddly enough, although JPEG compression would offer 24-bit quality at about 10:1 compression
I'm curious, what do you base your compression estimates on?
10:1 is something of a worst case scenario I'd expect, rather than average.
CG typically compresses well up to 20:1; photographic material fares better (30:1 is not uncommon while keeping lossiness at least comparable to S3TC, and still getting better results than either VQ or CLUT compression).

If there's one thing where you'd be looking at low compression it's with recompressing Clut images - anything but really high quality settings will tend to artifact a lot. But then recompressing Clut stuff into any other format will tend to degrade quality quite noticeably too.

All the stuff I read on PS2's architecture discussed the JPEG defaulting to 10:1.

Keep in mind that even if that's "worst case", it's still better than 6:1 S3TC... and at 10:1, JPEG artefacts will be minimal. Now at 20:1 or 30:1, OTOH, there would probably be noticeable JPEG noise on textures...
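Just to put the ratios being thrown around side by side, a hypothetical comparison (the 16MB figure is an arbitrary pile of 24-bit source texels, not a PS2 spec; all ratios are relative to 24-bit RGB):

```python
# Hypothetical storage footprint of the same 24-bit source data under the
# schemes mentioned in this thread.
source_mb = 16.0

ratios = {
    "uncompressed 24-bit": 1,
    "8-bit CLUT":          3,    # 24 -> 8 bits/texel, palette overhead ignored
    "4-bit CLUT":          6,
    "S3TC":                6,
    "JPEG @ 10:1":        10,
    "JPEG @ 20:1":        20,
    "JPEG @ 30:1":        30,
}

for name, ratio in ratios.items():
    print(f"{name:>22}: stored in ~{source_mb / ratio:4.1f} MB")
```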
 
All the stuff I read on PS2's architecture discussed the JPEG defaulting to 10:1.
Well - it's plain JPEG (or rather MPEG I-frames). The PS2 has nothing to do with how it's compressed - you can find fairly extensive JPEG quality & compression tests from other sources on the net, as well as test it yourself; it's not like the format is hard for anyone to get hold of.

Anyway, I specifically talked about quality relative to other methods. I don't see much logic in using a higher quality setting if I can get double the storage at quality comparable to VQ or S3TC.
Especially since the most obvious use of the IPU is for texture animations.
 
The main thing hindering texture quality on PS2 isn't bandwidth, it's space. As already mentioned, PS2 has less than 2MB available for textures if it's using a full/full 480-line frame

But it can stream about 14MB of textures per frame at 60fps from its main RAM (that's after taking into account up to 400MB/s for geometry transfer). So it's not as if that 2MB of spare on-chip RAM restricts it to 2MB of textures per frame or anything.
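A quick check of where that per-frame figure comes from (assuming the nominal 1.2GB/s EE-to-GS path; sustained throughput in practice will be lower):

```python
# Sanity check of the ~14MB-per-frame texture streaming figure.
gif_bus_mb_s  = 1200     # nominal EE->GS (GIF) bandwidth, MB/s
geometry_mb_s = 400      # reserved for vertex/display-list traffic, MB/s
fps           = 60

textures_per_frame_mb = (gif_bus_mb_s - geometry_mb_s) / fps
print(f"~{textures_per_frame_mb:.1f} MB of texture uploads per frame")   # ~13.3
```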
 
It is true that the PS2 must be doing render-to-texture at an insane speed, because there are many games that use it at a decent resolution and a full frame rate - Tekken 4 comes to mind.

Also, can anyone tell me whether the big screens in MGS2 (the ones in the tanker with loads of soldiers) are animated textures, or actual real-time render-to-texture video? The stuff they show is, after all, present in the rooms; it would save space if they just rendered, say, Metal Gear itself to a texture and put that on the big screen, instead of using a prerendered video...

You know what I mean? Not sure I was clear... :D
 
Teasy said:
The main thing hindering texture quality on PS2 isn't bandwidth, it's space. As already mentioned, PS2 has less than 2MB available for textures if it's using a full/full 480-line frame

But it can stream about 14MB of textures per frame at 60fps from its main RAM (that's after taking into account up to 400MB/s for geometry transfer). So it's not as if that 2MB of spare on-chip RAM restricts it to 2MB of textures per frame or anything.

Yeah, but programming that kind of streaming by hand is a PITA that wouldn't be (as) necessary if GS had say an additional 2MB, or S3TC support.

GCN's texture cache operates automatically, IIRC.

It's still a restriction that PS2 would be much, much more powerful without.

Side note: Actually now that you mention it, that is the bandwidth restriction in PS2 - the EE-GS link. It should by all rights be at least twice as wide...
 
Tagrineth said:
Teasy said:
The main thing hindering texture quality on PS2 isn't bandwidth, it's space. As already mentioned, PS2 has less than 2MB available for textures if it's using a full/full 480-line frame

But it can stream about 14MB of textures per frame at 60fps from its main RAM (that's after taking into account up to 400MB/s for geometry transfer). So it's not as if that 2MB of spare on-chip RAM restricts it to 2MB of textures per frame or anything.

Yeah, but programming that kind of streaming by hand is a PITA that wouldn't be (as) necessary if GS had say an additional 2MB, or S3TC support.

GCN's texture cache operates automatically, IIRC.

It's still a restriction that PS2 would be much, much more powerful without.

Side note: Actually now that you mention it, that is the bandwidth restriction in PS2 - the EE-GS link. It should by all rights be at least twice as wide...



Yeah, definitely - that's one of the things I could never understand about the hardware... when I saw it for the first time I just could not believe it. That, combined with simple S3TC, would have made PS2 look just as good as (at least) GC most of the time... I mean, it's doing pretty well as it is now, let alone with those two little modifications...
 
Tagrineth said:
Z/frame reads and writes actually take up the majority of the memory bandwidth on a rasteriser - ESPECIALLY one with a ridiculous 16 pipelines. You'd be surprised how little bandwidth is really needed for texturing.
The main thing hindering texture quality on PS2 isn't bandwidth, it's space.
Tagrineth said:
Side note: Actually now that you mention it, that is the bandwidth restriction in PS2 - the EE-GS link. It should by all rights be at least twice as wide...

Isn’t this a bit contradictory?
 
Squeak said:
Tagrineth said:
Z/frame reads and writes actually take up the majority of the memory bandwidth on a rasteriser - ESPECIALLY one with a ridiculous 16 pipelines. You'd be surprised how little bandwidth is really needed for texturing.
The main thing hindering texture quality on PS2 isn't bandwidth, it's space.
Tagrineth said:
Side note: Actually now that you mention it, that is the bandwidth restriction in PS2 - the EE-GS link. It should by all rights be at least twice as wide...

Isn’t this a bit contradictory?

Well, somewhat. Even with the current space allocation, widening the pipe from the EE to the GS wouldn't help as much as one would hope. Ideally both would be expanded... combine that with S3TC (primarily a space saving) and we'd have a winner.

I guess if anything the GS's embedded RAM has too much texture bandwidth...
 
I guess if anything the GS's embedded RAM has too much texture bandwidth...

Blasphemy!!! :devilish:

As far as the GIF-GS interface... The performance is what you make of it. I personally haven't hit performance bottlenecks with it myself (other than stupid things, which cause too much bus chatter and leave you hanging dry, but that applies to pretty much any bus and is a problem of the programmer not the hardware.)

Oh and adding S3TC isn't going to magically make games "look" better either...
 
archie4oz said:
I guess if anything the GS's embedded RAM has too much texture bandwidth...

Blasphemy!!! :devilish:

So explain what the core's going to do with that gargantuan pile of texture-only bandwidth? 9.6GB/sec if I'm not mistaken. I suppose actually that could be used for really hi-res rendered textures... but otherwise... :?:
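For reference, a rough sketch of where that figure (and the GS's headline 48GB/sec) comes from, assuming the commonly quoted port widths and a ~150MHz GS clock - which also ties back to the two 1024-bit ports asked about at the start of the thread:

```python
# GS eDRAM port bandwidths from width x clock (assumed nominal figures).
GS_CLOCK_HZ = 150_000_000

def port_gb_s(width_bits):
    return width_bits / 8 * GS_CLOCK_HZ / 1e9

print(f"1024-bit frame read port:  {port_gb_s(1024):.1f} GB/s")   # 19.2
print(f"1024-bit frame write port: {port_gb_s(1024):.1f} GB/s")   # 19.2
print(f"512-bit texture read port: {port_gb_s(512):.1f} GB/s")    #  9.6
print(f"all ports combined:        {port_gb_s(2560):.1f} GB/s")   # 48.0
```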

archie4oz said:
As far as the GIF-GS interface... The performance is what you make of it. I personally haven't hit performance bottlenecks with it myself (other than stupid things, which cause too much bus chatter and leave you hanging dry, but that applies to pretty much any bus and is a problem of the programmer not the hardware.)

Yeah, well, were you making a serious effort to push numerous large textures? Not to belittle you or anything - I deeply respect your opinion on such things - but you came from Square, correct? FFX isn't exactly the most incredibly detailed game texture-wise that I've seen, and it doesn't use MIP maps either, which saves on texture size too.

archie4oz said:
Oh and adding S3TC isn't going to magically make games "look" better either...

Nobody said it would... but having S3TC would mean up to 6x the texture space and total bandwidth... giving the artists and designers more freedom in slapping hi-res, colourful textures on everything.

In fact, just adding S3TC will make things look worse for most current PS2 games!
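For what it's worth, here's where the "up to 6x" figure comes from - standard DXT1-style block maths, not anything the GS actually supports:

```python
# S3TC (DXT1-style) compression ratio: a 4x4 block stored as two 16-bit
# endpoint colours plus sixteen 2-bit indices.
texels_per_block = 4 * 4
block_bytes = 2 * 2 + (16 * 2) // 8          # 4 bytes endpoints + 4 bytes indices = 8
bits_per_texel = block_bytes * 8 / texels_per_block

print(f"{bits_per_texel:.0f} bits per texel")               # 4
print(f"vs 24-bit RGB:  {24 / bits_per_texel:.0f}:1")       # 6:1
print(f"vs 32-bit RGBA: {32 / bits_per_texel:.0f}:1")       # 8:1
```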
 
Actually now that you mention it, that is the bandwidth restriction in PS2 - the EE-GS link. It should by all rights be at least twice as wide...

It seems that circuit board bandwidth was expensive in Japan in '97-'99, because both consoles from that period have less of it than you would expect. Curiously, inexpensive 3D cards for PCs from the same time had 2 or 4 GB/s.
Dreamcast seems to do miracles with its 800MB/s of bandwidth, though - granted, it doesn't have to draw 3-4 textures on top of each other in vain, but it does have to push buffer reads and writes, textures and polys over that bus. PS2, on the other hand, has 50% more bandwidth on the EE-GS link and "only" has to send vertex positions and textures "once in a while", when the GS has finished using the currently uploaded textures. Then why the big difference?

Regarding JPEG texture compression:
The IPU is on the EE die, and the decompressed result still has to be uploaded to GS memory before it can be used as a texture. Wouldn't JPEG decompression be a bit like building a ship in a bottle: a lot of work but not much gained?
 
So explain what the core's going to do with that gargantuan pile of texture-only bandwidth? 9.6GB/sec if I'm not mistaken. I suppose actually that could be used for really hi-res rendered textures... but otherwise...

Uhh, how can you say the PS2 has a gargantuan pile of texture-only bandwidth? If I'm not mistaken, doesn't the Gamecube have 10.4GB/sec of texture-only bandwidth? And what exactly would the GC's core use all that bandwidth for?
 