Alternative distribution to optical disks: SSD, cards, and download

Although the thought of the extra speed from distributing games on SSDs sounds fantastic, I don't think the cost of manufacturing them is quite in line yet for game distribution.

I'm still baffled that no console has toyed with the idea of installing some really cheap RAM into the system just to use as a virtual drive. A 2GB stick of DDR3 by Crucial is going for about $20 on Newegg; a 2GB buffer for the console to populate from the HDD and optical drive would mean little to no loading required after the initial load. That would give us more speed than SSDs and would only need to be purchased once.
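To put rough numbers on that idea, here is a minimal back-of-envelope sketch comparing how long a 2GB working set takes to (re)load from each medium. The throughput figures are the ballpark peak rates discussed in this thread plus an assumed SSD figure, not measured values.

```python
# Rough load-time comparison for a 2 GB working set.
# Rates are ballpark peak figures from the thread, not measurements.
MEDIA_MBPS = {
    "2x Blu-ray": 9,
    "8x Blu-ray": 36,
    "7200rpm HDD": 80,
    "SATA SSD": 250,           # assumed ballpark for the era
    "DDR3 RAM buffer": 21000,  # dual-channel DDR3-1333 peak
}

def load_seconds(size_mb: float, mbps: float) -> float:
    """Seconds to transfer size_mb at a sustained rate of mbps."""
    return size_mb / mbps

for name, rate in MEDIA_MBPS.items():
    print(f"{name:>16}: {load_seconds(2048, rate):8.2f} s for 2 GB")
```

The point of the comparison is that once the buffer is populated, refilling working memory from it is effectively free next to any drive.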

We don't get the most out of our systems simply because we waste resources. Your HDD and optical drive are not always streaming data for the game. A lot of the time they sit idle, and that is wasting resources in my opinion. If we were able to offload large chunks of data into RAM that transfers at ridiculous speeds, we could do a lot more with textures and assets. Right now we credit the amount of video RAM and system RAM with the potential of the system, but that's only because fetching data from our storage media is so slow that we have to keep so much data in memory that isn't being used at that moment.

Maybe someone here with more knowledge would be able to explain why this is a bad idea and what crack I might be smoking. But it would seem we waste a lot of resources simply because our mediums for transferring data are too damn slow. If a developer could swap out textures faster, couldn't they do much more with LOD and such?
 
I'm still baffled that no console has toyed with the idea of installing some really cheap RAM into the system just to use as a virtual drive...
Just adding that to the overall RAM pool probably wouldn't make much of a price difference, and would give developers even more freedom than an extra layer of cache.
 
You'd still need to populate that RAM from the optical disc and HDD, so it's not really a net gain. You'd just get much lower access times and better streaming as long as you are still working with the data in that cache. The moment you need to load in fresh data, you are stuck with the drive read speeds. Given the added mobo complexity and bus needed to accommodate such RAM, it's clearly been measured as a net loss. Bear in mind the GC did this effectively with its A-RAM, although not as a drive larger than system RAM.

Personally I think the best compromise is optical distribution plus a decent chunk of flash for caching, say 16 GB for local data, and an HDD. This'll give low-latency random-access storage while the optical drive offers cheap distribution and the added value of movie playback, and HDD installations will solve the loading issues, while an HDD is pretty much a requirement to sell lots of DLC. The hardware cost of the HDD would be amortised in sales of DLC/DD content, so it could be sold at a loss to keep the unit cost of the console down (assuming they can lock down a decent network service like Qriocity, so people choose it over rivals). A couple of years later they could lose the HDD and put in 64 GB of flash for throwaway money for a lowest entry price.

That's assuming traditional consoles, and no-one goes with my Grand Vision.
 
I foresee an optional optical drive (DD is always an option) and full disk installs. The disc is just a medium of delivery, like the internet. Since the optical drive would be an add-on, included in the top-tier console, there would be no problems with profitability. $399 for hard disk only and $499 with optical should allow for top-tier performance. Don't forget the higher profit margins on DD games for the platform holder; assuming 20% of distribution moves to downloads, that's a nice boost in profits.

Rather than buy, say, one EA game, you could buy the whole year's collection and activate the ones you want. So as soon as you buy one game, they have the opportunity to sell you another one or two. It'd require a 3- or 4-layer Blu-ray, but it'd be a pretty good way to distribute them to people who can't download.

Oh, the interesting thing about an optional optical drive is that it lets them price-discriminate between hardcore users who absolutely must have backwards compatibility and regular users.
 
You'd still need to populate that RAM from the optical disc and HDD, so it's not really a net gain...

Shifty,

Yeah, I thought about that, and in my opinion that would just come down to the devs thinking about those issues ahead of time. Instead of using the RAM drive like a cache, I was thinking of it more like a buffer. I was looking at it almost like another piece of hardware that the devs control. Instead of just having the HDD and optical drives populate it as a middle stage between the content and the processors, it would be a designated space to store information. It would be controlled, and the devs would populate it with the data they believe is necessary.

I'm going to butcher this explanation, but here I go...

Let's take online COD for example:
Current method
Your game ends and you are given 3 choices: Previous Map, Next Map, and Random Map. Once the selection is locked in, it loads the map, and after 10-15 seconds the game is about ready to start. Once the game has started, it continuously streams assets from disc as needed while you play, because it doesn't have enough system memory to hold the entire map and still keep load times down.

Buffer Method
Your game ends and you are given the same 3 choices: Previous Map, Next Map, and Random Map. However, during your last game the system already started populating the buffer with the next map and the random map, and it still has the current/previous map in RAM. Since it no longer needs to worry about excessive wait times, there are no more assets being loaded from the HDD or optical drive during gameplay like there are now. The next game loads instantly, with both the next map and the random map ready to go; and since you just played this map twice, when you start your next match it begins populating a new "random" map into the buffer, replacing the old one. This also means the game itself really only needs to keep the stuff drawn on screen in its main memory, loading and unloading assets in ways it normally couldn't because the media were too slow to do so seamlessly.
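The prefetch logic described above can be sketched in a few lines: while the current match plays, likely next maps are speculatively loaded into a RAM-backed buffer, evicting the oldest entry when space runs out. All names and sizes here are illustrative assumptions, not from any real engine.

```python
# Minimal sketch of the "buffer method": a fixed-capacity RAM buffer
# that background-loads candidate maps while a match is in progress.
class MapBuffer:
    def __init__(self, capacity_mb: int):
        self.capacity_mb = capacity_mb
        self.loaded: dict = {}  # map name -> size in MB

    def used_mb(self) -> int:
        return sum(self.loaded.values())

    def prefetch(self, name: str, size_mb: int) -> bool:
        """Load a map into the buffer, evicting the oldest if needed."""
        if name in self.loaded:
            return True
        while self.used_mb() + size_mb > self.capacity_mb and self.loaded:
            oldest = next(iter(self.loaded))  # dicts keep insertion order
            del self.loaded[oldest]           # evict least-recently loaded
        if size_mb > self.capacity_mb:
            return False
        self.loaded[name] = size_mb  # stands in for the slow disc read
        return True

    def is_ready(self, name: str) -> bool:
        return name in self.loaded

buf = MapBuffer(capacity_mb=2048)
# During the current match, speculatively load both candidate maps:
buf.prefetch("next_map", 700)
buf.prefetch("random_map", 700)
print(buf.is_ready("next_map"), buf.is_ready("random_map"))  # True True
```

Whichever map the lobby picks is then already resident, so the "load" at match start is just a pointer handoff rather than a disc read.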

Now I know some might be thinking they could accomplish this with just more system memory... but why waste high-performance memory on a simple buffer when off-the-shelf RAM can give them orders of magnitude more transfer speed (around 21 GB/s, enough to offload a single-layer BD in about a second!) and a tiny fraction of the seek time of even the fastest hard drive? Having the ability to stream at that speed is almost as good as, but less expensive than, increasing the total memory of the system.

Hell, by the time the system is ready to launch they could probably throw in a 4GB buffer for less than $10 per console.
 
You'd still need to populate that RAM from the optical disc and HDD, so it's not really a net gain...

I think it will be a lot like it is today on the PS3: mandatory installs coupled with optional installs, except the hard drives will be faster and bigger. I think the gain in faster loading is just not worth the money for the console manufacturers. If they were to introduce a third-level hard drive cache, they might as well use hybrid drives and save the developers the trouble.
 
I think it will be a lot like it is today on the PS3: mandatory installs coupled with optional installs...

tkf-

But wouldn't you agree that the current speed limitations of our mediums limit the ability of the console? As the games get more complex they require more data, but the transfer speeds of our mediums haven't kept up with the file sizes at all. Considering the HDD is in this generation and will probably be in the next, that's a huge problem for me, as they won't be any faster. If we don't start working on our mediums this next generation, then when it comes to the xbox1080 or PS5 the loading times won't be the only travesty; by then it will be the sole bottleneck for system performance.

If the system has 2GB of system memory that it needs to populate constantly to load assets, a lot of the system's power (RAM) is going to be wasted using it as a cache, since the transfer speeds are ridiculously slow. It will have to waste so much system memory because the assets will be too big to stream to the system efficiently.

Just my opinion
 
Yeah I thought about that and in my opinion that would just come down to the Devs thinking about those issues ahead of time. Instead of using the Ram drive like a cache I was thinking more like a buffer...
Oh sure. 'Buffer' and 'cache' are two different names for the same thing, although the term 'cache' obviously has connotations of automated data management for some folks. You'd want full developer access to gain optimal control of what resides in there. I'd still rather see that being a 2/4GB partition of 16 GB of internal flash, though, as it wouldn't have to be repopulated between games, unlike DRAM. It could in essence give flash-speed loading after installation (the major advantage of flash-based distribution) with the capacity advantages of optical. As a consumer this seems the best deal: straight flash has storage limits, and straight optical has speed issues.
 
But wouldn't you agree that the current speed limitations of our mediums limits the ability of the console?...

Interesting point: the hardware and content are getting so good that the game storage medium's speed or size is the bottleneck (optical/flash).

But to some extent it has been like this forever, except that with this generation and hard drives we finally got a much faster medium than optical; so far it hasn't really been used to its fullest potential, imho.

Afaik GTA4, Red Dead Redemption, and Uncharted all made it through streaming from a slow Blu-ray drive. Have there been any games on the PS3/360 that actually required all data to be installed to the hard drive and then streamed from it?

Now, imagine that in the next gen we still have a hard drive; current 2.5" 7200 RPM drives deliver something like 80+ MB per second. Compared to the 9 MB/s of the 2x-speed drive in the PS3, I would say it's enough. It's not fantastic or earth-shattering, but it should be enough to keep up with demand.

And as Shifty pointed out, hard drives are a given with the onslaught of DD/DLC that is incoming, so sizes should also go up by a healthy amount.

When the PS5 is out I hope we are on flash in every conceivable way :)
 
So how would my previous idea of full disk installs work? Would that be a good idea? If you are going to ''force'' installs, you might as well take as much advantage of it as possible.
 
Yes, it'd make sense. It could be done cleverly, so you only install the first part to start the game and the rest installs while you're playing. As -tkf- says, it'd help with streaming too, with developers knowing exactly what they have to play with. The disc could also be required, with audio/video streamed from it: defending against piracy, providing a little extra storage BW, and avoiding the need to copy those large files across, saving a bit of install time.
 
Yes, it'd make sense. Could be done cleverly, so you only install the first part to start the game and the rest installs while you're playing...

I don't know if it's a smart move to force all games onto the hard drive. You have plenty of games that don't really need it and would just end up taking space because they have to.

I'd rather have a Sony/Microsoft/Nintendo-enforced quality requirement that they support with technology and software. There's a great example of what's possible (I think Uncharted did this to an extent?).

Provide the basics in software to support streaming (isn't this already a part of the tools?) and use all the resources the hardware can provide. If we take an 8x-speed Blu-ray drive (36 MB/s) and an 80+ MB/s hard drive, we have a theoretical max of over 110 MB per second. Of course this will be hampered by seek times etc.

But it should be possible for the big 3 to supply a software layer where developers can choose where to put assets and also know how fast they can read them back. This must be part of the software tools by now?
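The kind of software layer described above could look something like this sketch: the developer tags each asset with a preferred medium, and the layer reports the aggregate read bandwidth it can plan around. All class names, methods, and rate figures are illustrative assumptions, not any real SDK.

```python
# Hypothetical asset-placement layer: tag assets per medium, then
# query the theoretical combined streaming rate. Rates are the
# thread's ballpark figures (8x Blu-ray + 7200rpm laptop HDD).
READ_MBPS = {"optical": 36, "hdd": 80}

class AssetPlanner:
    def __init__(self):
        self.placement = {}  # asset name -> medium

    def place(self, asset: str, medium: str) -> None:
        if medium not in READ_MBPS:
            raise ValueError(f"unknown medium: {medium}")
        self.placement[asset] = medium

    def planned_bandwidth(self) -> int:
        """Theoretical combined MB/s if each medium in use streams in parallel."""
        media_in_use = set(self.placement.values())
        return sum(READ_MBPS[m] for m in media_in_use)

planner = AssetPlanner()
planner.place("level1_geometry", "hdd")     # latency-sensitive: hard drive
planner.place("cutscene_video", "optical")  # linear reads: disc is fine
print(planner.planned_bandwidth())  # 116
```

The useful property is that the budget is known at authoring time, so a developer can plan streaming around a guaranteed floor rather than hoping for the best.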
 
36 MB/s is about 30 seconds a gig: a minute of load time for a 2 GB console, and we're all hoping for 4 GB, which is almost a certainty if next-gen isn't until 2014-2015. So if installation is optional, every game is going to use it at least to store the main game, at which point you may as well just install all the data (bar maybe video and audio) to the HDD in a background copy while playing the game. The only reason not to do this is to support an HDD-less SKU, which doesn't seem at all sensible any more.
 
36 MBps is 30 seconds a gig, a minute of load time for a 2 GB console...

4096 / 36 = 113 secs

2048 / 36 = 57 secs

512 / 9 = 56 secs (If the 9 MB is true on the PS3)

But in order to utilize 8 times the RAM for better-quality graphics/audio, can we just say we need 8 times the data? That would really see the games grow.
 
36 MBps is 30 seconds a gig, a minute of load time for a 2 GB console...

I remember you wanting flash-based buffers in next-generation consoles; how about something like Seagate's Momentus XT, where you marry a 7200RPM laptop drive with 4-16GB of flash?

Over and above a certain amount of data installed to the HDD, aren't you getting to the point of diminishing returns? In most games there's a large quantity of data which is used more frequently, such as recurring textures and the like. The solution to not having an install already in place on a new game could be to run the optical drive at 12x (54 MB/s) for a few minutes to populate the HDD with the relevant data on first load, and then drop the speed down to 8x (36 MB/s) once that data has been moved into place. You can easily install 6GB in the first few minutes of loading a game for the first time, and surely that would be tolerable for end users?
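The staged-install arithmetic above is easy to check. A minimal sketch, using the drive speeds quoted in this thread (which are assumed figures, not measurements):

```python
# Seed the HDD with the "hot" data set at 12x, then drop to a
# quieter 8x for the rest of the session.
FAST_MBPS = 54   # 12x Blu-ray
QUIET_MBPS = 36  # 8x Blu-ray

def seed_time_minutes(install_mb: float, mbps: float = FAST_MBPS) -> float:
    """Minutes needed to copy the frequently used data to the HDD."""
    return install_mb / mbps / 60

# Installing 6 GB of frequently used data at 12x:
print(f"{seed_time_minutes(6 * 1024):.1f} min")  # ~1.9 min
```

So the 6GB figure works out to roughly two minutes of fast spinning on first launch, which does seem tolerable for a one-off cost.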
 
I remember your wanting flash based buffers in next generation consoles, how about something like Seagates Optimus where you marry a 7200RPM laptop drive with 4-16GB of flash?
That's an option. Of course, if you place 16 GB of flash on the mobo, the developer can cherry-pick the ideal data, or even use it for other purposes like a scratchpad. I don't know how efficiently HDD controllers can manage data sharing between flash and HDD such that there aren't unexpected latencies, whereas with developer-controlled flash, actually identified as a different storage medium, they can place latency-sensitive files there and know it's always available, with no occasions when there's a sudden 10x delay in fetching the data (although there's distinctly the possibility that development will become so high-level that games won't be designed down to that level).

Over and above how much data installed to a HDD are you actually getting to the point of diminishing returns?...
That's certainly a consideration, and you're right that there'll be a core of often-used files (the executables and constant assets like the protagonist) and then all the shorter-term data like level data. As such, just installing the core files and reading the supplementary data from optical is workable. Myself, though, I like quiet systems! I'd much rather have everything installed. I imagine developers would appreciate not having to worry about the mammoth seek times of optical drives. Full, compulsory HDD installation wasn't an option this gen because, for cost reasons, tiddly little HDDs were needed in some SKUs. Next gen will have ample HDD capacity, so dumping the whole disc over time and effectively eliminating the optical drive means a net win in every area: quieter consoles, faster data access, and better streaming. There's really no point, therefore, in keeping the optical drive active, except I think for things like video and music, which can be streamed quietly.
 
512 / 9 = 56 secs (If the 9 MB is true on the PS3)
Isn't this about right, though? You put in Uncharted 2 and wait a while. Then you start the game and wait a while more. Also, it's not necessarily filling all 512 MB before you get to do anything. Far from it, in fact: a fair bit of that 512 MB is reserved for framebuffers and executable workspace and such. You also have the issue of seek times, not just peak throughput, which can dramatically reduce a drive's read speed. Even if load times are no worse than this gen, an HDD install will be possible next-gen because we'll have the capacity by default, so why not go for it?

But in order to utilize 8 times the ram for better quality graphics/audio can we just say we need 8 times the data? That would really see the games grow then.
Indeed. Assuming everything gets a suitable increase in quality, capacity will need to increase accordingly, or else there'll have to be quality compromises. You could still get away with the same amount of data and the same assets as now, and throw in loads more shaders, higher IQ, and calculated effects like lighting to get a next-gen look, but that won't achieve as much as the next generation of hardware could.
 
Isn't this about right though? You put in Uncharted 2, and wait a while...

Indeed. Assuming everything gets a suitable increase in quality, capacity will need to increase accordingly...

The only reason I see for not making it mandatory on every game is the games where it just isn't needed; at the least it should be made a choice, since the only difference would be more loading. Games where there is a real need, for example a GTA5 built around hard drive speed coupled with "secondary" streaming from the disc, should be installed, and the developers shouldn't be forced to create a pure disc-based version. That only compromises the games.

But as this discussion goes on I am starting to think that the real bottleneck might end up being storage space after all. If we do some crazy math and say that 8 times the memory could in theory require 8 times the content, then a game that today took 5GB would suddenly require 40GB, already getting close to the limit of Blu-ray. If the average game size goes up by a large margin, the hard drives really have to be big to keep up :)
 
That's an option. Of course if you place 16 GB of flash on the mobo, the developer can cherry-pick the ideal data...

So the only difference there is that it ought to be developer-defined? Of all the issues with having it attached to the HDD, that ought to be the easiest to solve with some foresight. At least this way the console maker benefits from lower prices, since HDDs with that design are mass-produced, versus a custom self-designed part.

That's certainly a consideration, and you're right that there'll be a core of often-used files...

It would probably be slower to install everything to the HDD than to rely on both the optical drive and HDD simultaneously. I don't think they need to eliminate the optical drive so much as use both at the same time, without pushing the optical drive to speeds that create so much noise it disturbs the end user. With the optical drive adding 24-36-48 MB/s on top of the HDD's 80+, it would be possible to deliver >100 MB/s transfer speeds in 100% of cases, which ought to make loading times appreciably lower. It seems a waste to install an optical drive and not use it in a practical way, even if it nets only a 25% increase in throughput. At the very least, the developer would be able to seek two sets of data simultaneously: for instance, playing an FMV off the Blu-ray while dedicating the HDD entirely to streaming in the next set of assets. Both an HDD and an optical drive struggle if you try to seek two entirely different data sets off them at the same time, and you'd never get their full theoretical throughput under such a scenario!
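Putting numbers on that parallel-streaming claim: the sketch below computes the aggregate rate and relative gain from adding the optical drive's quiet-speed figures (the 24/36/48 MB/s quoted above, which are this thread's approximations) on top of an 80 MB/s HDD.

```python
# Gain from streaming off the optical drive in parallel with the HDD,
# assuming each device serves its own data set (no shared seeking).
def combined_mbps(optical_mbps: int, hdd_mbps: int = 80) -> int:
    """Aggregate rate if HDD and optical stream different data sets."""
    return hdd_mbps + optical_mbps

for opt in (24, 36, 48):
    total = combined_mbps(opt)
    print(f"{opt} MB/s optical: {total} MB/s total (+{opt / 80:.0%})")
```

Even the slowest, quietest optical speed quoted adds roughly a third on top of the HDD alone, so the "25% increase" above is, if anything, conservative.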
 
My concern with mandatory installs is that they are a waste of resources. Even if future games only store 25% of themselves on the console, we are still looking at around 6-9GB per game. We are also only doing that in the hopes of helping loading times, as that is technically the only thing it would help with besides game updates. So we are already looking at ways to speed up the loading process, but compared to this generation it isn't that big of a leap relative to the assets that will be created and needed to take advantage of the additional computing power. What has the limited load speed of game data done to graphics this generation? Does anyone even know? Have we been so concerned with the system RAM of our consoles that we have overlooked the medium that is typically considered the bottleneck every generation?

In my opinion you don't gain a whole lot by transferring data from an 8x BD at 36 MB/s to an 80 MB/s HDD just to get a 44 MB/s increase in transfer speed and 1/6th the seek times. Now say you install 1GB of game data onto the HDD and that unloads itself into a RAM buffer/cache (honestly, I agree, no difference), and then the BD drive continues to load 1GB of data into the buffer/cache at 36 MB/s until it is full.

Now you have basically gone from a 36 MB/s media system to a 21 GB/s media system and knocked seek times down to 1/100th. What do you really lose from this setup? You were still going to install game data onto the HDD, but now you wouldn't need to install as much, and that means smaller HDDs can be used. You also take advantage of the real upgrade to the BD drive, since HDD speed didn't change between generations. Also, wouldn't having a cache like that mean we could stream from 3 sources simultaneously: buffer, HDD, and BD? Heck, make it mandatory that the next system has a minimum 1GB page file, and it further increases the ability of the cache/buffer system, as data transferred from the BD drive into the cache could be offloaded back to the page file when needed.

In fact, transferring the data from the HDD to the system for processing wouldn't be any faster than transferring it to the RAM cache/buffer and then to the system for processing, but you would still have that data in an area for quick access. Now the 2GB or 4GB buffer is being filled by the BD drive, portions of the HDD data are being removed as they are no longer needed, and leftover data from the BD drive is being stored in the page file.

I just don't see an SD card or a larger HDD being a viable solution to increase performance and reduce load times next generation. They could get away with just a 100GB HDD if they threw in 2-4GB of RAM, and that has to be cheaper than adding an extra 100-300GB of storage over a 100GB drive, while providing a huge increase in performance.
 