Official Blu-ray and other large-format storage thread

Chap said:
You can pack them all tightly and unpack them during installation for PC games, while console games are played directly from the disc.
:rolleyes:
Sure. You forgot to mention that console games also save data to disc as uncompressed as possible, to further prolong the load times. Not only do they not compress, they use wide characters, save textures in 32-bit FP formats and geometry data as extended doubles. And if that isn't enough to bring the load time to a minute and fill the disc, they'll use movies saved as a sequence of raw 32-bit BMP files.

All the while, PC games will ZIP their files 60 times over (why do you think installs are so slow?) using one of those magic algorithms that can losslessly compress a random data sequence as many times as you want, each time making it 50% smaller.
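The "magic algorithm" is the joke it sounds like: by a simple counting (pigeonhole) argument, no lossless compressor can shrink every input, so re-compressing random or already-compressed data stops paying off immediately. A quick sketch of that using Python's zlib (the sizes are whatever zlib produces, nothing console-specific):

```python
# Demonstrate the pigeonhole limit: random data doesn't compress, and a
# second compression pass only adds container overhead.
import random
import zlib

random.seed(0)
data = bytes(random.getrandbits(8) for _ in range(100_000))  # incompressible input

once = zlib.compress(data, 9)
twice = zlib.compress(once, 9)   # "compressing it again" buys nothing

# Random data doesn't shrink, and the second pass is no smaller than the first:
assert len(once) >= len(data) * 0.98
assert len(twice) >= len(once)
```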

So I guess that is why we see certain games with almost as much dummy data as game data.
DVDs, like all other optical media, read faster on the outer tracks - which is why, in cases where you don't fill the entire disc, you might use dummy data as "padding" so the important data gets placed on the outermost possible tracks for optimal load times. For the same reason you'll normally place FMV and other streaming data on the inner tracks, since it only requires a fraction of the drive's minimum read speed.
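The padding trick described above is just arithmetic at mastering time: discs are written inside-out, so dummy data written first pushes the payload toward the outer edge. A hypothetical sketch (the capacity figure is the usual single-layer DVD size; the function name is my own, not any real mastering tool's):

```python
# Hypothetical mastering-time calculation: pad the front of a disc image
# with dummy data so the real payload lands on the outermost (fastest)
# tracks, as the post describes.

DVD_CAPACITY = 4_700_000_000      # bytes, single-layer DVD (approx.)

def padding_needed(payload_bytes, capacity=DVD_CAPACITY):
    """Dummy bytes to write *before* the payload so it ends at the outer
    edge of the data area (optical discs are written inside-out)."""
    if payload_bytes > capacity:
        raise ValueError("payload does not fit on the disc")
    return capacity - payload_bytes

# A 1.5 GB game on a 4.7 GB disc gets ~3.2 GB of dummy padding:
print(padding_needed(1_500_000_000))  # 3200000000
```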
 
Fafalada said:
Squeak said:
Anyway the idea of meagre cache for dynamic media applications have caught on in later architectures with the same purpose
Maybe it's just semantics here, but I thought that's exactly what a UMA is? (one big pool with a few meagre caches on top).

I think the reason for the PS2's, on average, disappointing graphics performance should be sought elsewhere.
Well, it's just the wrong cache people are bitching about. It's a fact that R5900 memory accesses are the bottleneck in the majority of PS2 apps - and an extra 8-16KB of higher-associativity D-Cache would help this out immensely.
I blame the MIPS and Toshiba engineers for this (because I need a scapegoat :LOL: ) but who knows what the real reason was.
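The associativity complaint can be made concrete with a toy simulation: two hot addresses that map to the same set thrash a direct-mapped cache forever, while a 2-way cache of the same total size holds both. The sizes below (8 KB, 64-byte lines) and the access trace are illustrative assumptions, not measurements of the EE's actual D-cache:

```python
# Toy LRU set-associative cache simulator, showing why a little more
# associativity helps with conflicting accesses.

LINE = 64  # bytes per cache line (assumed)

def miss_rate(addresses, num_sets, ways):
    """Simulate an LRU set-associative cache; return the miss ratio."""
    sets = [[] for _ in range(num_sets)]
    misses = 0
    for addr in addresses:
        line = addr // LINE
        s = sets[line % num_sets]
        if line in s:
            s.remove(line)           # hit: refresh LRU position
        else:
            misses += 1
            if len(s) == ways:
                s.pop(0)             # evict least-recently-used line
        s.append(line)
    return misses / len(addresses)

# Two addresses exactly one cache-size (8 KB) apart, alternating:
trace = [0x0000, 0x2000] * 1000

print(miss_rate(trace, 128, 1))  # direct-mapped 8 KB: 1.0 (ping-pongs forever)
print(miss_rate(trace, 64, 2))   # 2-way 8 KB: 0.001 (only the two cold misses)
```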

Maybe, without stretching it too far, you could even say that the unified memory pool could be regarded as a very large level 2 cache, although a pretty slow one? :)

This is probably a stupid question, but what about the scratchpad memory? If you include that as a sort of cache, then that brings the total memory size in the core up to 40KB. Not too shabby for an L1 cache? IIRC the largest L1 cache the R5k-series cores can handle is 64KB.
Maybe scratchpad mem isn't set-associative, like the rest of the caches? But isn't it right that you have total control of the scratchpad as a programmer?
 
Squeak said:
This is probably a stupid question, but what about the scratchpad memory? If you include that as a sort of cache, then that brings the total memory size in the core up to 40KB. Not too shabby for an L1 cache? IIRC the largest L1 cache the R5k-series cores can handle is 64KB. Maybe scratchpad mem isn't set-associative, like the rest of the caches? But isn't it right that you have total control of the scratchpad as a programmer?
ScratchPad is just a 16KB chunk of linear memory - it has no cache-like properties (except low latency :\ ). In other words, it won't be any help with code doing lots of random memory accesses - i.e. pretty much all general-purpose code.
It'll work well enough for working through large linear data sets, though.
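The access pattern that suits a scratchpad can be sketched in a few lines: stream a large linear data set through a small fast buffer under explicit program control, instead of relying on a cache. This is a hypothetical illustration of the pattern, not PS2 code (on real hardware each chunk would be DMA'd into the scratchpad while the previous one is processed):

```python
# Sketch of scratchpad-style streaming: walk a large linear data set one
# small-fast-buffer-sized chunk at a time.

SCRATCHPAD_BYTES = 16 * 1024     # the EE scratchpad's 16 KB

def process_linear(data: bytes, chunk=SCRATCHPAD_BYTES):
    """Process data one scratchpad-sized chunk at a time; summing bytes
    stands in for whatever per-element work the game would do."""
    total = 0
    for off in range(0, len(data), chunk):
        window = data[off:off + chunk]   # "upload" the next chunk to fast memory
        total += sum(window)             # work entirely within the chunk
    return total

data = bytes(range(256)) * 1024          # 256 KB linear data set
print(process_linear(data) == sum(data))  # True: chunking doesn't change the result
```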
 
I've always thought that it would be faster to read from the innermost tracks of a CD or DVD :? as there the laser would have to move less than on the outer tracks.

Isn't that why the smaller GameCube discs are faster at loading than Xbox and PS2 discs?

Anyway, I don't see loading times being much of an issue next gen (Blu-ray or otherwise), as it is not much of an issue on the PS2, GC or Xbox. At least I can't see any significant difference in average loading times between PS2 and Xbox games. The HD is not helping that much.
 
The further you get from the centre of the disk, the faster the track is moving past the laser head (at a constant rate of rotation).

Think about it this way; imagine drawing a circle around the innermost boundary of the area containing data on a CD. Then imagine drawing a circle around the outermost boundary of the area containing data on a CD (imagine the CD is full). The circumference of the outside circle will be over twice as great (if I can remember correctly) as the circumference of the inside circle.

Likewise, a CD or DVD should have a data transfer rate over twice as high at the outside edge of the disk as at the inside edge. This difference will be smaller on physically smaller disks (like GC disks).
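The "over twice as great" recollection checks out with rough numbers. The radii below are my own approximations of a 12 cm CD's data area, not spec-exact figures:

```python
# Circumference (and hence CAV transfer-rate) ratio between the outermost
# and innermost data tracks of a CD, using approximate data-area radii.
import math

CD_INNER_MM = 25.0   # approx. start of the data area
CD_OUTER_MM = 58.0   # approx. end of the data area

inner_c = 2 * math.pi * CD_INNER_MM   # circumference of innermost data track
outer_c = 2 * math.pi * CD_OUTER_MM   # circumference of outermost data track

# At constant rotation speed, the outer track passes the head this many
# times faster than the inner one:
print(round(outer_c / inner_c, 2))  # 2.32
```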

I'd guess that the reason for GC games loading faster is most likely that the machine has less fast RAM than the PS2 or Xbox, plus the 16 MB of very slow A-RAM, which seems designed to act as a sort of disc cache (or auxiliary store) to reduce mini-DVD loading times and access frequency.

Hope that made sense, I'm kind of in a rush right now! :)
 
rabid said:
I've always thought that it would be faster to read from the innermost tracks of a CD or DVD as there the laser would have to move less than on the outer tracks. Isn't that why the smaller GameCube discs are faster at loading than Xbox and PS2 discs?
Function already summed up the inner vs. outer tracks nicely.
Other than that, the GameCube drive is slower than the PS2's and Xbox's - by quite a bit, actually. Whether that raw speed advantage translates to actual applications is entirely up to the people responsible for loading stuff.
As a silly example, our load times are now over 300x faster than they were when we first put the game on a CD. All with exactly the same hw and all... :p

Chap said:
So PC games are as compressed as console games? I am not talking 60x, though..
I would imagine PC games must be more compressed on average to fit on smaller discs, but that comes at the cost of using (more) lossy methods. FMVs would be a typical example, although it applies to other data too.
 
Thanks for the explanations, Function and Fafalada.
I'd never really thought it through that far :) - of course, now it is obvious that streaming data is faster from the outermost tracks. But aren't seek times still slower there than on the inside?
Why did Nintendo go for the smaller discs if they give no speed advantage? Didn't they say the smaller disc was chosen precisely because it would be faster? :?
 
It was first and foremost an anti-piracy measure, AFAIK. Perhaps access could be faster in specific cases where the drive would have to read something on the innermost tracks, then go all the way to the outside for the next thing, and back again multiple times - but not all accesses are like this. So the benefit is largely dependent on the access pattern for a specific disc laid out in a specific way (as determined by the game developers). It could be anywhere from nonexistent/inconsequential to considerable (where the disc was laid out in the worst way possible).
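The seek part of that argument is just the width of the data band: the head can never have to travel further than from one edge of the band to the other. A back-of-the-envelope sketch, where the radii are rough assumptions (12 cm disc data band roughly 24-58 mm, 8 cm GameCube disc roughly 24-38 mm), not official figures:

```python
# Worst-case radial head travel for a full-size disc vs. a mini disc,
# under assumed data-area radii.

def worst_case_seek_mm(inner_mm, outer_mm):
    """Worst-case radial head travel: one edge of the data band to the other."""
    return outer_mm - inner_mm

full_size = worst_case_seek_mm(24, 58)   # 12 cm disc
mini_disc = worst_case_seek_mm(24, 38)   # 8 cm mini disc
print(full_size, mini_disc)  # 34 14 -- the narrower band caps seek distance
```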
 