Working backwards: Overcoming technical challenges for the next generation.

What I'd like to see is better support for compression at the hardware level, or flexibility that allows it.

All you really need to support compression is a good out-of-order CPU.

I could see Microsoft retrofitting the HD Photo (JPEG XR) format for texture use, given that it already supports mipmaps, HDR and as many channels as you want.

But unlike DXTn it's not very local: you need to touch tens, maybe hundreds of bytes to decompress a single pixel, and with geometry densities climbing ever higher, it's getting harder to count on good hit rates in the texture caches.
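To make the locality point concrete, here's a rough C++ sketch of a single-texel fetch from a DXT1 texture (just the textbook block layout, not any particular chip's implementation): everything needed to reconstruct one pixel sits inside its own 8-byte block, so the hardware never has to touch more than those 8 bytes.

#include <cstdint>

struct RGBA { uint8_t r, g, b, a; };

// Expand a 16-bit RGB565 endpoint to 8 bits per channel.
static RGBA expand565(uint16_t c) {
    RGBA out;
    out.r = ((c >> 11) & 0x1F) * 255 / 31;
    out.g = ((c >> 5)  & 0x3F) * 255 / 63;
    out.b = ( c        & 0x1F) * 255 / 31;
    out.a = 255;
    return out;
}

// 'blocks' is the compressed texture: one 8-byte block per 4x4 texel tile.
// texWidth is assumed to be a multiple of 4.
RGBA fetchDXT1(const uint8_t* blocks, int texWidth, int x, int y) {
    const uint8_t* b = blocks + ((y / 4) * (texWidth / 4) + (x / 4)) * 8;

    // Two RGB565 endpoints plus 32 bits of 2-bit palette indices.
    uint16_t c0 = b[0] | (b[1] << 8);
    uint16_t c1 = b[2] | (b[3] << 8);
    RGBA p[4];
    p[0] = expand565(c0);
    p[1] = expand565(c1);
    if (c0 > c1) {      // 4-colour mode: two interpolated colours
        p[2] = { uint8_t((2 * p[0].r + p[1].r) / 3), uint8_t((2 * p[0].g + p[1].g) / 3),
                 uint8_t((2 * p[0].b + p[1].b) / 3), 255 };
        p[3] = { uint8_t((p[0].r + 2 * p[1].r) / 3), uint8_t((p[0].g + 2 * p[1].g) / 3),
                 uint8_t((p[0].b + 2 * p[1].b) / 3), 255 };
    } else {            // 3-colour mode plus transparent black
        p[2] = { uint8_t((p[0].r + p[1].r) / 2), uint8_t((p[0].g + p[1].g) / 2),
                 uint8_t((p[0].b + p[1].b) / 2), 255 };
        p[3] = { 0, 0, 0, 0 };
    }

    uint32_t indices = b[4] | (b[5] << 8) | (b[6] << 16) | (uint32_t(b[7]) << 24);
    int texelInBlock = (y % 4) * 4 + (x % 4);
    return p[(indices >> (texelInBlock * 2)) & 0x3];
}

A JPEG XR / HD Photo style codec can't be sliced up like this; its transform and entropy-coding stages drag in a much larger neighbourhood per texel, which is exactly the locality problem above.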
 
assen said:
All you really need to support compression is a good out-of-order CPU.
That's a bit like shooing mosquitoes with cannons, isn't it?
The PS2 had a small piece of silicon almost 9 years ago that could do DCT macroblock decoding faster at 150 MHz than multi-GHz out-of-order CPUs at 100% utilization.
And even with the resources being abundant, how does CPU decompression fit into the texture fetch pipeline that you refer to in the second part of the post?

Mind you, I agree that we don't need more silicon dedicated to hw compression - but you need to accept compression schemes that work on temporal locality rather than on demand.

EasyRaider said:
The ability to put unique texture everywhere doesn't mean you have to.
Until the market starts expecting it.
 
I think the market is already looking that way, at least the hardcore RPG crowd. It would be really nice if procedural generation could step in here in a big way - not just for pure looks, but for randomly fleshing out objects in large cities. It would at least add more to the atmosphere.
 
That's a bit like shooing mosquitoes with cannons, isn't it?
The PS2 had a small piece of silicon almost 9 years ago that could do DCT macroblock decoding faster at 150 MHz than multi-GHz out-of-order CPUs at 100% utilization.

Yeah, you're right about "media"-type compression; I was thinking more about the LZ- and PPM-style lossless codecs.

And even with the resources being abundant, how does CPU decompression fit into the texture fetch pipeline that you refer to in the second part of the post?
If we believe the Larrabee team, it doesn't fit well :)

Mind you, I agree that we don't need more silicon dedicated to hw compression - but you need to accept compression schemes that work on temporal locality rather than on demand.

Sorry, I didn't understand that part.
 
assen said:
Yeah, you're right about "media"-type compression; I was thinking more about the LZ- and PPM-style lossless codecs.
Ah ok, those are a different story, but then again we've been using LZW derivatives on disc data since PS1 days, so by the very definition of lossless compression you can't expect any significant advances on that front, no matter how much processing power you throw at it.

Sorry, I didn't understand that part.
Well, I just meant that we'll be doing more and more fancy real-time decompression on programmable hardware/CPUs, but it will continue to be centered around temporally coherent data structures, plus maybe heuristics to help decide what data will be needed in the near future (e.g. 'megatexture' and similar), rather than improved 'DXTC' (instant decompression on demand).
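Something like this crude C++ sketch is what I have in mind (every name here - TileId, lzDecompressTile, residentTiles - is made up for illustration, not anyone's real API): a CPU-side streamer looks at which texture tiles were actually touched last frame and decompresses their neighbours ahead of time, so the heavy codec runs on temporally coherent data instead of sitting in the texel fetch path.

#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

struct TileId {
    int mip, x, y;
    bool operator==(const TileId& o) const { return mip == o.mip && x == o.x && y == o.y; }
};
struct TileHash {
    std::size_t operator()(const TileId& t) const {
        return (std::size_t(t.mip) * 73856093u) ^ (std::size_t(t.x) * 19349663u) ^ (std::size_t(t.y) * 83492791u);
    }
};

// Stand-in for the real decompressor: in practice an LZ/PPM-style codec
// chewing through the tile's compressed bits, far too slow for per-texel use.
static std::vector<uint8_t> lzDecompressTile(const TileId&) {
    return std::vector<uint8_t>(128 * 128 * 4);
}

static std::unordered_map<TileId, std::vector<uint8_t>, TileHash> residentTiles;

// Called once per frame with the tiles the renderer reported using last frame.
void streamTiles(const std::vector<TileId>& usedLastFrame) {
    for (const TileId& t : usedLastFrame) {
        // Heuristic: the camera rarely jumps far in one frame, so a tile's
        // neighbours and its parent mip are likely needed soon (bounds checks omitted).
        const TileId candidates[] = {
            t,
            { t.mip, t.x + 1, t.y }, { t.mip, t.x - 1, t.y },
            { t.mip, t.x, t.y + 1 }, { t.mip, t.x, t.y - 1 },
            { t.mip + 1, t.x / 2, t.y / 2 },
        };
        for (const TileId& c : candidates)
            if (!residentTiles.count(c))
                residentTiles[c] = lzDecompressTile(c);   // done well before any texel fetch needs it
    }
}

The point isn't this particular heuristic, just that the decompression cost gets amortised across frames instead of being paid inside the sampler.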
 
Just wondering, is there any chance that the PS3 will get a 4x Blu-ray drive? Or will it be forever stuck at 2x?

Why would they willingly introduce a more expensive, more complicated and noisier drive when none of the console's functions - movie playback and game playback - can use it? One day, when 2x drives aren't even produced anymore, they'll stick in a 4x or 8x or whatever the cheapest model is, but will limit it to 2x.
 
And in addition to assen's points, what historical precedent is there? How many PS1s came with 20x CD drives? Or PS2s with 16x DVD drives? They never got an upgrade, because it serves the machine no purpose.
 
And in addition to assen's points, what historical precedent is there? How many PS1s came with 20x CD drives? Or PS2s with 16x DVD drives? They never got an upgrade, because it serves the machine no purpose.

Isn't the real problem with the PS3's drive the access times, though? Perhaps they could improve those... and improve the streaming experience for some. Devs would have to allow for installs on older hardware, though. I guess we'll wait for the PS4 for a better drive - Blu-ray or something else.
 
Well, the PSP-2000 doubled the memory from 32 to 64 MB to allow for more caching and thus reduce load times (apart from Skype apparently needing the additional memory). So, while I don't expect Sony to do anything other than reduce costs for quite a while, it's not impossible that we might see improvements later on.
 
Unsurprisingly, memory will be an issue. Rather than going for the fastest, newest RAM tech I think the hardware companies should go for something cheaper but with significantly more capacity.
 
Why would they willingly introduce a more expensive, more complicated and noisier drive when none of the console's functions - movie playback and game playback - can use it? One day, when 2x drives aren't even produced anymore, they'll stick in a 4x or 8x or whatever the cheapest model is, but will limit it to 2x.

Surely when installing to the hard drive it's not a question of capping the transfer rate? That way, if one day a 4x drive is as cheap as a 2x drive (which may be a time approaching quickly), they could add it in there and install times would be reduced drastically.
 
How would you rate the reliability of a faster spinning drive?

That's a good question. It really all depends on the build quality. If the reliability is within the same range as the 2x drive's, I don't see the problem.

If there's a big enough speed difference between the two, many people may buy new systems - especially if the noise isn't increased when playing games, since the games are still programmed around the 2x limit.
 
And in addition to assen's points, what historical precedent is there? How many PS1's came with 20x CD drives? Or PS2's with 16x DVD drives? They never got an upgrade, because it serves the machine no purpose.

NeoGeo CD went from 1x to 2x if I remember correctly.
 
NeoGeo CD went from 1x to 2x if I remember correctly

Nope, the CDZ was still a 1x speed drive. All SNK did was add an extra megabyte of RAM used just for loading; when that 1 MB was filled up, the data was transferred into the 7 MB of video memory. Or something like that, as I'm sure they used it more efficiently in later games. Though of course, with such large amounts of graphics data being moved, we never really saw the benefit.
 
That would be possible, and you could use the RAM as a buffer to stream graphics data into video memory, just like on the Neo Geo CDZ. This would work better than streaming textures from the HDD, and you could load the next lot of data into the buffer RAM while you play the section already loaded into VRAM.

Not sure how fast the RAM (a very cheap option, like SD?) would need to be for streaming textures to be efficient, though for reducing load times it would be great.
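For what it's worth, here's a rough C++ sketch of the double-buffering pattern being described (readFromDisc, uploadToVram and playSection are hypothetical stand-ins, not any real console API): while the player is in the section whose assets already sit in VRAM, the next section's data is being read from disc into the spare RAM buffer in the background.

#include <cstdint>
#include <future>
#include <vector>

// Stubs for the sketch; the real things would talk to the drive, the GPU and the game loop.
static std::vector<uint8_t> readFromDisc(int /*section*/) { return std::vector<uint8_t>(1 << 20); }
static void uploadToVram(const std::vector<uint8_t>& /*data*/) {}
static void playSection(int /*section*/) {}

void streamLevel(int sectionCount) {
    std::vector<uint8_t> buffer = readFromDisc(0);        // prime the first buffer (initial load)
    for (int s = 0; s < sectionCount; ++s) {
        uploadToVram(buffer);                             // current section's assets into VRAM
        std::future<std::vector<uint8_t>> next;
        if (s + 1 < sectionCount)                         // start reading the next section now,
            next = std::async(std::launch::async,         // while this one is being played
                              readFromDisc, s + 1);
        playSection(s);
        if (next.valid())
            buffer = next.get();                          // ideally already done; otherwise a short stall
    }
}

As long as a section takes longer to play than to read off the disc, the drive speed stops mattering for anything except the very first load.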
 