osvaldomartins
Newcomer
I was wondering about it and I wasn't able to find anything. If yes, how much RAM can we save on textures on the PS3?
Thanks a lot!!!
Uttar said:Well, CELL always seemed very appropriate for bzip2 compression to me, if it was optimized for it, which compresses 25-30% more than zlib from my experience. So if devs used and dedicated a fair bit of processing power to it, that would improve loading times from disk further, but I doubt that's what you were talking about, since this is unrelated to loaded textures.
And I'm sure a lot of devs won't bother and will keep using zlib - or will find out that it's better anyway, for other reasons.
I don't think bz2 is a particularly good choice (if you want to use the SPEs to decompress), as the BWT only becomes reasonably effective on large blocks of data.
bzip.org homepage said:bzip2 usually allocates several megabytes of memory to operate in, and then charges all over it in a fairly random fashion. This means that performance, both for compressing and decompressing, is largely determined by the speed at which your machine can service cache misses. Because of this, small changes to the code to reduce the miss rate have been observed to give disproportionately large performance improvements. I imagine bzip2 will perform best on machines with very large caches
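For what it's worth, the zlib-versus-bzip2 gap, and bzip2's sensitivity to block size (which is what the BWT point above is about), is easy to measure with Python's built-in modules. The payload below is synthetic and the exact numbers will vary with real asset data; the multi-megabyte working set the bzip.org quote mentions also matters for an SPE, whose 256 KB local store is far smaller than that.

import bz2
import zlib

# Synthetic, structured payload standing in for an asset file.
# Real ratios depend entirely on the data, so treat the output as illustrative.
payload = b"".join(
    b"vertex %d %d %d normal 0 1 0 uv %d %d\n"
    % (i % 97, i % 89, i % 83, i % 7, i % 11)
    for i in range(50000)
)

deflated = zlib.compress(payload, 9)
bwt_big = bz2.compress(payload, 9)    # compresslevel 9 -> 900 kB BWT blocks
bwt_small = bz2.compress(payload, 1)  # compresslevel 1 -> 100 kB BWT blocks

print(f"raw      : {len(payload):>8} bytes")
print(f"zlib -9  : {len(deflated):>8} bytes")
print(f"bzip2 -9 : {len(bwt_big):>8} bytes (large blocks)")
print(f"bzip2 -1 : {len(bwt_small):>8} bytes (small blocks)")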
!eVo!-X Ant UK said:I'm not sure, but if Cell can do graphically what's been shown, I think it would be stupid for Cell not to have compression.
version said:jpeg, wavelet with deferred rendering will be fine
Heinrich4 said:(Jpeg2000 is lossless compression almost 100:1) But what compression ratio is possible with this? 10:1? 20:1?
Shifty Geezer said:You need to read up on the subject. Take a gander at this for one...
http://kt.ijs.si/aleks/jpeg/artifacts.htm
There's no such thing as lossless compression at 100:1 in anything 'natural' without large areas of flat colour. It's scientifically impossible. Data is of absolute size, and the only way to compress it smaller is to express it in different ways taking advantage of patterns etc. A large variety of data means there's no scope to compress it (unless someone finds a mathematical formula for describing any image as the seed to a fractal process...)
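The "scientifically impossible" part comes down to a counting (pigeonhole) argument: there are far more possible images than there are bit strings one hundredth their size, so most images simply cannot have a 100:1 lossless encoding. A back-of-the-envelope sketch, using 256x256 at 24 bits purely as an example size:

# Pigeonhole argument: a lossless codec is a one-to-one mapping, so it cannot
# squeeze every 256x256 24-bit image into an output 1/100th of the size.
image_bits = 256 * 256 * 24          # 1,572,864 bits per uncompressed image
compressed_bits = image_bits // 100  # 15,728 bits available at 100:1

print(f"distinct images        : 2^{image_bits}")
print(f"distinct 100:1 outputs : 2^{compressed_bits}")
print("so almost every image has no 100:1 lossless representation at all")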
[maven] said:You're a funny one.
But back to the original topic, I expect next-gen games to use art assets that are compressed with more complicated algorithms than what's in use today (which essentially is DXTC) eventually.
I also think this will be constrained to better entropy coding of precompressed formats, as for bandwidth reasons all these assets should still remain compressed in memory but in a form directly usable by the GPU / audio processor.
I.e. someone should start thinking about efficiently encoding the DXTC formats (or whatever other texture formats are directly supported) both losslessly and lossily (maybe something that compresses the two palette color components in a more traditional form and the flags as side-band information), so that they can be directly decoded to a compressed (fixed-rate) texture.
I may actually have a go at this...
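For reference, this is roughly what that idea could look like. The sketch below assumes DXT1's standard 8-byte block layout (two RGB565 endpoint colours followed by sixteen 2-bit selectors) and just uses DEFLATE as a stand-in entropy coder; the function names and toy input are made up for illustration, not an actual implementation of the scheme.

import struct
import zlib

def split_dxt1_streams(dxt1_data: bytes):
    """Split raw DXT1 blocks into an endpoint stream and an index stream.

    Each 8-byte DXT1 block holds two 16-bit RGB565 endpoint colours
    followed by 32 bits of 2-bit per-texel selectors.
    """
    endpoints = bytearray()
    indices = bytearray()
    for off in range(0, len(dxt1_data), 8):
        block = dxt1_data[off:off + 8]
        endpoints += block[:4]   # colour0, colour1 (RGB565)
        indices += block[4:]     # 16 x 2-bit selectors
    return bytes(endpoints), bytes(indices)

def recompress(dxt1_data: bytes) -> bytes:
    """Entropy-code the two streams separately (plain DEFLATE here)."""
    endpoints, indices = split_dxt1_streams(dxt1_data)
    return zlib.compress(endpoints, 9) + zlib.compress(indices, 9)

# Toy input: 16 identical DXT1 blocks; real texture data would come from disk.
block = struct.pack("<HHI", 0xF800, 0x001F, 0xAAAAAAAA)  # red/blue endpoints
data = block * 16
packed = recompress(data)
print(len(data), "->", len(packed), "bytes")

At load time you would only have to inflate and re-interleave the two streams to get back the original fixed-rate DXT1 data the GPU can sample directly.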
Heinrich4 said:About 100:1 being possible or not, I based it on information from forums such as www.igda.org and Gamasutra and others (it's only hypothetical, ok?).
Before I believe 100:1 compression of images will be possible without severe losses (let alone lossless), I'd need to see some real evidence of the fact. The only things I know of that get high lossless compression are things with LOTS of redundant data; e.g. a 256x256 white-to-black gradient image is 192 KB of raw data, which compresses to 693 bytes as a .png. Your average photo will compress to about half size, maybe even 4:1, as a lossless PNG, and PNG is pretty good (though it can be compressed a little further). Methods like DXTC are lossy, which is how they manage better compression. The subject of compression came up elsewhere (the 'is DVD9 big enough for next-gen' type thread) a while back, and I found a website showing the differences between compression algorithms. The difference in compressed size was small; maybe 15% between the best and the merely good compression schemes.
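The gradient example is easy to reproduce in spirit with the stdlib: a smooth gradient is almost pure redundancy and collapses under DEFLATE (the same coder PNG uses, minus the filtering step), while noise barely shrinks. The sketch uses an 8-bit greyscale buffer rather than the 24-bit RGB of the figure above, so the absolute numbers differ, but the contrast is the point:

import os
import zlib

SIZE = 256

# 256x256 8-bit "image" fading from white at the top to black at the bottom.
gradient = b"".join(
    bytes([255 - (y * 255) // (SIZE - 1)]) * SIZE for y in range(SIZE)
)
# Same amount of data, but with no structure to exploit.
noise = os.urandom(SIZE * SIZE)

print(f"raw size : {SIZE * SIZE} bytes each")
print(f"gradient : {len(zlib.compress(gradient, 9))} bytes after DEFLATE")
print(f"noise    : {len(zlib.compress(noise, 9))} bytes after DEFLATE")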
Heinrich4 said:Thanks for the explanations, but as I said, 100:1 is maybe an extreme possibility for the future. Who can say whether it will or won't be possible to reach that level of compression in a tool optimized for the SPEs one day? There are many threads around the web talking about jpeg2000.
I think you're missing the point here. 100:1 image compression is already possible right now. 100:1 lossless compression is very possible. But only in very specific conditions. There are fundamentally only two ways 100:1 lossless compression will *ever* be possible -- 1) you have a recognizable pattern of some sort or size that repeats itself at least 100 times, or 2) you have patterns that may not necessarily repeat, but can be exactly reproduced procedurally, and therefore you only need to store information about the procedure to use and its parameters.
Heinrich4 said:And you believe that with the use of an SPE at 3.2GHz, which reaches 50:1 (50x more than a PowerPC at the same clock) in MJPEG at TRE, it isn't possible to obtain better ratios than the 4:1 to 8:1 of S3TC (or streaming, or something to obtain more compression)? I think we should give more credit to what future tools will be able to offer.
That still has nothing to do with it. It doesn't matter that CELL can compress MJPEG video streams very quickly, because the GPU still has to be able to *decompress* it quickly if you want to use it as a texture. The point of S3TC is that it's natively supported in hardware by the GPU, so there's no extra cost associated with it. Even if you store it on disc in a heavily compressed format, you will still have to decompress it (and, if you like, re-encode it to S3TC) in order for it to be usable as a texture, so in terms of how much memory is utilized, you've gained nothing.
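The arithmetic behind that point, for a hypothetical 1024x1024 texture with no mipmaps: whatever the on-disc format, the resident size in video memory is set by the format the GPU actually samples, so only the GPU-native block formats shrink it.

# Resident video-memory footprint of a 1024x1024 texture (no mipmaps).
# A JPEG2000/MJPEG stream must be expanded to one of these before sampling.
W = H = 1024

rgba8 = W * H * 32 // 8  # 32 bits per texel once fully decoded
dxt1  = W * H * 4 // 8   # 4 bits per texel, GPU-native (~8:1 vs RGBA8)
dxt5  = W * H * 8 // 8   # 8 bits per texel, GPU-native (~4:1 vs RGBA8)

for name, size in (("RGBA8", rgba8), ("DXT1", dxt1), ("DXT5", dxt5)):
    print(f"{name:>5}: {size // 1024:>5} KB resident")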
Shifty Geezer said:The only things I know that get high lossless compression are things with LOTS of redundant data.
dukmahsik said:"are"