Alternative distribution to optical disks: SSD, cards, and download

My concern with mandatory installs is that they are a waste of resources. Even if future games only store 25% of themselves on the console, we are still looking at around 6-9GB per game. We are also only doing that in the hope of helping loading times, as technically that is the only thing it would help with besides game updates. So we are already looking at ways to speed up the loading process, but compared to this generation it isn't as big a leap as the one needed for the assets that are going to be created to take advantage of the additional computing power. What has the limited load speed of game data done to the graphics this generation? Does anyone even know? Have we been so concerned with the system RAM of our consoles that we have overlooked what is typically considered the bottleneck every generation?

It isn't just load speed! The faster data can be loaded into main RAM and GPU RAM, the less of it you need for a given level of performance. So, in short, with fast loading you need less RAM and you make better use of the RAM you have. The reason no one talks about it is that once a generation starts, it's fixed.

You also take advantage of the real upgrade to the BD drive as the HDD speed didn't change between generations.

While the optical drive speed didn't change between DVD and Blu-ray for the same linear velocity, the HDD is not a fixed process: its speed increases as the linear density increases. So as we've gone from 20GB platters to 250GB platters, the speed of the HDD subsystem has increased substantially too. Whilst it may have been 30MB/s at the start of the generation, average speeds are closing in on 100MB/s.
 

In my opinion you don't gain a whole lot by transferring data from an 8x BD at 36MB/s to an 80MB/s HDD just so you can have an extra 44MB/s of transfer speed and 1/6th the seek times. Now say you install 1GB of game data onto the HDD and that unloads itself into a RAM buffer/cache (honestly, I agree, no difference), and then the BD drive continues to load 1GB of data into the buffer/cache at 36MB/s until it is full.

Now you have basically gone from a 36MB/s media system to a 21GB/s media system and knocked seek times down to 1/100th. What do you really lose from this setup? You were still going to install game data onto the HDD, but now you wouldn't need to install as much, and that means smaller HDDs can be used. You also take advantage of the real upgrade to the BD drive, as the HDD speed didn't change between generations. Also, wouldn't having a cache like that mean we could stream from three sources simultaneously: buffer, HDD and BD? Heck, make it mandatory that the next system has to have a minimum 1GB page file and it further increases the ability of the cache/buffer system, as it could offload data transferred from the BD drive into the cache back to the page file when needed.

In fact, transferring the data from the HDD to the system for processing wouldn't be any faster than transferring the data to the RAM cache/buffer and then to the system for processing, but you would still have that data in an area for quick access. Now the 2GB or 4GB buffer is being filled by the BD drive, portions of the HDD data are being removed as they are no longer needed, and leftover data from the BD drive is being stored in the page file.
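
To put some rough numbers on the idea, here is a back-of-envelope sketch (the 36MB/s, 80MB/s and 21GB/s figures are the ones assumed in this post; the 2GB buffer size is hypothetical):

Code:
# Back-of-envelope model of the proposed HDD + BD -> RAM buffer setup.
# All figures are the rough numbers assumed in the post, not measurements.
BD_MBPS = 36        # 8x Blu-ray read speed (MB/s)
HDD_MBPS = 80       # late-generation HDD sequential read (MB/s)
RAM_MBPS = 21_000   # ~21GB/s main-memory bandwidth
BUFFER_MB = 2048    # hypothetical 2GB RAM buffer/cache

def fill_time(size_mb, *rates_mbps):
    """Seconds to fill size_mb when the listed sources stream in parallel."""
    return size_mb / sum(rates_mbps)

print(f"BD alone:       {fill_time(BUFFER_MB, BD_MBPS):6.1f} s")
print(f"BD + HDD:       {fill_time(BUFFER_MB, BD_MBPS, HDD_MBPS):6.1f} s")
print(f"Buffer to game: {fill_time(BUFFER_MB, RAM_MBPS) * 1000:6.1f} ms")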

I just don't see an SD card or a larger HDD being a really viable solution to increase performance and reduce load times next generation. They could get away with just a 100GB HDD if they threw in 2-4GB of RAM, and that has to be cheaper than adding an extra 100-300GB of storage on top of a 100GB drive while providing a huge increase in performance.

One of the points Shifty was making was that you gain the transfer from the Blu-ray drive; at least it's possible to use it for secondary stuff like music etc., and you can combine it with the hard drive.

I found a random access time for a BD drive: it was 144ms, which is really slow compared to a hard drive.

Sure, add more buffer space, but it still has to be filled from a slow drive.
 

Agreed, and we will never have a way to get around the fact that we have slow media; it's just the nature of things. But I'm still going to argue my point: at no point in time can any available medium offload 2+GB of information in less than 1ms like a buffer could. I'm not too concerned with the loading of the data into the cache; that is a moot point, as it is going to be a holdup whether we have a buffer or not. The difference, in my opinion, is the speed we could now achieve and the texture and asset swapping that could now be accomplished if we used a buffer.
 
You mean start to offload in 1ms? Because 2GB in 1ms would be 2 terabytes a second, which is a fair ways off.

HA!!

Yeah, I was a lil goof on that one! Basically it was so fast it combined both seek times and transfer speeds :p

You are correct though; 2GB in 1 second, and it could seek that data extremely fast.

I wish I was smart enough to figure out a way to test this. I know you can get software that will partition your RAM and create a temporary RAM drive with it. I know I could then store the game data on the RAM drive to be read by the game... but just by doing so I never put more strain on the system to more aggressively swap out assets and such.

Would love to see someone who could do this make a simple test. Take a system with 4GB of RAM and a 1GB graphics card, partition half of the available RAM, and create a RAM disk. Code a set of objects to be drawn at varying quality and capture the performance gained and the quality level achieved with this setup. Doing multiple texture swaps with multiple objects that exceed the amount of space available in both VRAM (1GB) and system RAM (2GB) would be a start, I think.
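
As a very crude first step, you could at least measure the raw read-throughput difference. A minimal sketch, assuming you have copied the same asset file to a RAM-disk mount and to a normal HDD folder (the paths below are hypothetical):

Code:
# Time sequential reads of the same asset file from a RAM disk vs. an HDD.
import time

CHUNK = 4 * 1024 * 1024  # read in 4MB chunks

def read_throughput(path):
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            total += len(chunk)
    return total / (1024 * 1024) / (time.perf_counter() - start)  # MB/s

for label, path in [("RAM disk", "R:/assets.pak"), ("HDD", "D:/assets.pak")]:
    print(f"{label}: {read_throughput(path):.0f} MB/s")

It wouldn't capture the texture-swapping behaviour described above, but it would show the size of the gap the buffer is supposed to hide.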

Because we all know all I'm basically doing is talking out of my ass :D I'd rather be proved wrong than be naive enough to believe this for however long if it is not going to work.
 
Taken from the Frostbite thread

“We knew that people would think that this demo was running on PC, but the good thing is that it’s all based on streaming,” Bach tells us. “We have a super-powerful streaming pipeline, which makes it possible to stream high-end data through the game so every frame we look at will have fresh data. This means you don’t have to load everything at once; you don’t have to fill the level at the start. In BC2, you have 512 megs of memory; you load it, you play it, done. The objects you saw at the end needed to be loaded at the start, and you think, ‘It took me an hour to get to this point where I can see it, so what’s the point?’

“That’s the whole magic with this. We can have 512 megs every hundred metres if we wanted to, as we can just flush the data out [and replace it] as you move along. I can promise you that the console versions will still look amazing because of the core technology. If you have a 360, we want to use that machine to the maximum.”

Could this be my vindication?

Only needing to supply 512MB of new data isn't very difficult; current hard drives can transfer at speeds like 80MB/s. But the system isn't going to be able to stream small chunks at 80MB/s when it is using 1GB of RAM! It will need to flush 80MB a second out of RAM to make room for the next asset. If the game needs to load 320MB of new assets for a new building on the horizon, it needs to dump 4 seconds' worth of data first, if the drive is peaking at 80MB/s, so it can load that new asset. That means we went from having 1024MB of RAM to 704MB, because a portion of the RAM was unusable/partially empty while the new assets were being loaded.

An HDD or optical drive can only stream data that the system can store or use immediately. Therefore, if the system has no buffer and no immediate need for data, both of those media sit idle, providing no other benefit to the gaming experience. With a buffer, the HDD could churn out that 320MB while the system didn't yet need it, then offload it within a millisecond when it was needed, and you wouldn't lose that available resource.
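
For reference, the arithmetic behind those figures, using the numbers assumed above:

Code:
# Quick check of the 4-second / 704MB figures (assumed numbers from the post).
RAM_MB = 1024     # total system RAM in the example
ASSET_MB = 320    # new assets needed for the building on the horizon
HDD_MBPS = 80     # peak HDD transfer rate assumed above

print(f"Time to stream the assets: {ASSET_MB / HDD_MBPS:.0f} s")              # 4 s
print(f"RAM usable while waiting:  {RAM_MB - ASSET_MB} MB of {RAM_MB} MB")    # 704 MB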

A great analogy?
You're on a game show; you and your opponent are each given a bunch of 6" x 6" x 6" boxes.
A large pile of money is dropped on one end of the building, and your designated cache is on the other side of the building.
You're both instructed that each of the 4 rounds lasts 2 minutes, and only during that time can you dump the boxes full of money into your cache to count your winnings. You are given 6 minutes between rounds.

Round 1 starts
Your opponent fills up 2 boxes, runs to the other side of the building, dumps them into the cache, and then waits for his next round to begin; he was able to get $512.
You too were only able to get $512 after the first round, but... you found a loophole.
After the round ended you went back, started to fill those boxes back up again, and then moved them really close to the cache. The rules never said you couldn't do this, only that you couldn't dump them into your cache until the next round began.

Round 2
Your opponent once again was only able to get $512.
You now were able to get $2,048.
Round 3... round 4... done! Your opponent got $2,048 total; you got $6,656 total.

Both you and your opponent packed the boxes at the same speed, and you both had the same distance between the pile of money and the cache. You both had the same amount of time between rounds, but the difference was that you didn't wait for the round to begin to start filling your boxes, and you put your boxes in an area much closer to the cache for dumping.

A game works the same way, because it has a designated window when it allows data to be moved into it, and the rest of the time it sits on that data. Most games sit on the data for a very long time, but when they do need more data they are limited by the speed at which the data can be given to them.

Basically put, if next gen has 2GB of system memory (I hope), streaming with our current speed limitations is going to severely hinder the quality of the games. We will have to sacrifice system memory to hold new assets that stream in larger chunks because of that limitation, and that doesn't seem like a very efficient use of expensive memory.
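
To make the analogy concrete, here is a purely illustrative comparison of on-demand streaming versus prefetching into a RAM buffer during idle time (rates and sizes are hypothetical round numbers, and it assumes the idle time between requests is long enough for each prefetch to finish):

Code:
# On-demand streaming vs. prefetching into a RAM buffer during idle time.
DRIVE_MBPS = 36       # slow medium, e.g. an 8x BD
BUFFER_MBPS = 21_000  # RAM buffer hand-off speed (~21GB/s)
REQUEST_MB = 512      # assets the game asks for at each "round"

def on_demand(requests):
    """The game waits on the drive every time it asks for assets."""
    return sum(REQUEST_MB / DRIVE_MBPS for _ in range(requests))

def prefetched(requests):
    """The drive fills the buffer during idle time; only the first request waits."""
    return REQUEST_MB / DRIVE_MBPS + (requests - 1) * (REQUEST_MB / BUFFER_MBPS)

print(f"On demand:  {on_demand(4):5.1f} s of waiting over 4 requests")
print(f"Prefetched: {prefetched(4):5.1f} s of waiting over 4 requests")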
 
You're assuming the time it takes to traverse 100m is 4 seconds? Remember, the player is on foot and facing opposition. They can stream, say, 240MB on the Xbox 360 in 20s, which is about right if they don't want the system to break down when the player sprints through the level.
 
http://news.cnet.com/8301-13506_3-20041084-17.html?tag=cnetRiver

EA is really interested in the PC DD market. I wonder if the push for DD will come not from the consumers but from the publishers next gen.

While a DD-only console may lose out on a percentage of the user base, that might be more than made up for by the margins you can make on each sale.

If the breakdowns that we've seen are true, under the traditional retail model developers may have to sell double the games to make the same profit they would on a DD network.
 

But is 240MB going to be enough assets? That sounds great for our current system with only 512MB; you're basically giving the system 50% new content for creation. But how will that translate when assets are much larger or more assets are needed? If we do the 50% again, a 2GB system will then want 1GB of new data...

So, using your example ("they can stream, say, 240MB on the Xbox 360 in 20s, which is about right if they don't want the system to break down when the player sprints through the level"):
That is 12MB per second being offloaded into the system, so taking 1,024MB / 12MB per second means 85 seconds to load that data.
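
The arithmetic, with those assumed numbers:

Code:
# 240MB over 20 seconds, scaled up to a hypothetical 1GB next-gen refresh.
STREAMED_MB = 240      # data streamed per traversal
WINDOW_S = 20          # time the player takes to cross that stretch of level
NEXTGEN_RAM_MB = 1024  # hypothetical next-gen pool to refresh

rate = STREAMED_MB / WINDOW_S
print(f"Effective streaming rate: {rate:.0f} MB/s")                                       # 12 MB/s
print(f"Time to refresh {NEXTGEN_RAM_MB}MB at that rate: {NEXTGEN_RAM_MB / rate:.0f} s")   # ~85 s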

But during that time, you have to remember, they need to consistently remove previous assets so they can put new assets in. Unless they are swapping out 12MB (usable) chunks at a time, we are losing system memory that sits empty waiting for our media to fill it.

Obviously newer BD drives and HDDs could transfer much faster than 12MB/s, but even at 80MB/s, without knowing the size of future assets it's hard to say whether that is enough speed to stream as needed.

Maybe 60-80MB/s of streaming is enough for the next gen of consoles, maybe it isn't. Either way, unless the HDD and BD drive are constantly streaming data, those benefits are mainly going unused.
 
Like I said, systems which can use both HDD and ODD streaming will be ideal in the next generation. Still, at up to 2GB of RAM a standard 6-8x Blu-ray drive is serviceable. And with respect to streaming, not all assets in main memory need to be replaced.

Blu-ray 6x, 8x and 12x drives are available. Next-generation consoles won't be stuck at 2x speed.


Obviously future assets won't be larger than can fit into main memory, and it is unlikely that next-generation consoles will have more than 4GB of memory, a large portion of which may be graphics memory. So we won't be talking about 4GB of assets; probably at most 2GB for a console with 4GB of RAM.

As for streaming, it won't get any harder to accomplish than in the current generation, and it'll probably get easier as the tools and experience with streaming improve across the games industry.
 
Some more data caps coming, now from AT&T.

http://www.dailytech.com/article.aspx?newsid=21115

The interesting thing is how much the data transfer costs. This lets us factor the "price" of digital distribution into the comparison with physical media better than before.
Overage fees will be $10 for every 50GB that you go over the limit. However, AT&T will send notices to customers at the 65, 90, and 100 percent data cap thresholds, so there should be no excuse for customers to not know when they are…

So if we assume, say, a 20GB size for an average high-profile game, the price of DD (worst case) for that game would be $4 if the user goes over their quota. Of course, if the user stays under their quota, the price of DD is essentially $0.
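
For what it's worth, that $4 is a pro-rata figure; if the overage is actually billed in whole $10/50GB blocks, one extra 20GB download that tips you over would cost $10:

Code:
# Bandwidth cost of one 20GB download under the cap pricing above.
import math

GAME_GB = 20
BLOCK_GB, BLOCK_USD = 50, 10

pro_rata = GAME_GB * BLOCK_USD / BLOCK_GB
billed = math.ceil(GAME_GB / BLOCK_GB) * BLOCK_USD

print(f"Pro-rata estimate: ${pro_rata:.2f}")  # $4.00, as in the post
print(f"Billed in blocks:  ${billed:.2f}")    # $10.00 if it pushes you into a new block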
 
AHHH!!

I don't understand data caps one little bit. Data is not a perishable good: you transferring 300GB does not mean someone else can't transfer 300GB. They are treating bandwidth like a resource it really isn't. If AT&T and the other telecoms/cable companies want to capture the "streaming" service industry as they hope, putting data caps on it will only make it that much more difficult for them.

The company that doesn't put limits on its service in the coming years will be the one more people go to as bandwidth usage changes.
 
Well, the bandwidth of networks is not infinite. I wonder what would happen if everyone tried to stream their favorite TV show at the same time or, worse yet, torrent it at the same time :) If everything goes DD-only, it will become rather crucial how bandwidth is divided between users to avoid network blackouts/slowdowns. It will also become rather obvious that where the data resides is equally important to bandwidth distribution in avoiding clogging the network.

From that same article, the number of users affected is about 2%... What if the other 98% used the network to the same extent as those 2%?
 
What you're referring to is much different than bandwidth caps.

Limiting you or me to only 50GB of data per month is not going to reduce the possibility one bit that many people could be trying to stream their favorite TV shows at the same time. Limiting users' bandwidth only works if they can control when that bandwidth is used; anything else and it is just frivolous to even argue that point (them, not you).

Case in point: I do most of my heavy traffic late at night; in fact, when I'm using the majority of my data, most people are asleep. So in that instance I am not affecting anyone else's ability to stream their favorite shows, and they are not hindering my ability to stream Netflix either. Yet I am given a cap to reduce the chance that I'm going to consume too much bandwidth and encroach on their usage?

If they really want to limit the possibility of people "clashing" and clogging the network, then why don't they work harder on increasing the speed? 2x the speed means I download things 2x faster, which means I'm on the "net" half as much. I can download an episode of Justified in about 10 minutes; give me double the speed and it's now 5, so someone else can use the 5 minutes I'm no longer using.

I mean, we had bandwidth caps with dial-up, and during that time, with that infrastructure, I could understand. Today? With the pipes we are running and the tenfold increase in nodes, I don't get it. 50GB is child's play when it comes to DD; an HD copy of a movie is already 4-6GB (which I pay for through PSN). With those caps I could download 10 movies a month and ONLY 10 movies a month... nothing else (no gaming, email, music or even checking the weather) or I'd go over that limit.
 
An 8x BD is about 288Mbps (36MB/s); if we have 2GB of system RAM to fill (not VRAM), at peak outer-diameter speed it would take approximately 57 seconds to fill that RAM. Current systems have 256MB of system RAM and a 2x BD (9MB/s); filling 256MB takes 28.4 seconds. Even with a drive 4x slower, the fact that current systems have 1/8th the RAM new systems will have means what is barely sufficient now isn't anywhere near sufficient soon.

Still, a 12x BD drive transfers at 50MB/s, so it would take 41 seconds to populate that memory using only the BD drive.
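
The fill times quoted in the last two paragraphs, using those assumed drive speeds:

Code:
# RAM fill times at the assumed drive speeds (not measurements).
def fill_seconds(ram_mb, drive_mbps):
    return ram_mb / drive_mbps

print(f"Current gen, 256MB RAM @ 2x BD (9 MB/s):   {fill_seconds(256, 9):.1f} s")
print(f"Next gen,   2048MB RAM @ 8x BD (36 MB/s):  {fill_seconds(2048, 36):.1f} s")
print(f"Next gen,   2048MB RAM @ 12x BD (50 MB/s): {fill_seconds(2048, 50):.1f} s")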



I HOPE we don't have 4GB of usable assets to load; our current media would waste those resources. A 2GB system could get the same performance and graphics as a 4GB system today with our current transfer speeds.

This is where I think we have a severe problem coming: I think streaming is going to take a huge hit if something isn't done. Unless all games have a mandatory HDD install or come on some kind of flash medium, streaming assets from even a 12x BD drive is going to be problematic. The math doesn't look good at all. Newer systems with even a 12x BD drive are slower at populating memory than previous systems; it can't be denied. The only possible ways to keep parity with streaming are to:

1. Reduce RAM down to 2GB (1GB system) so that the 4x increase in usable resources matches the 4x increase in drive speed.
2. Require mandatory installs of all assets, which would give the system slightly (2-3 seconds) faster population times than what we have currently with 2x BD drives.
3. Have games come on SD cards, which again would give us slightly better population times than current-generation tech.

Again, my main problem with streaming is that in order to stream data into that usable RAM, something has to be taken out of it. Will assets be made in 36-50MB chunks so the system can easily swap them out? During the swap the system has less RAM; how much wasted RAM is going to sit there waiting for assets to populate it during streaming? Even if it's only 8 seconds' worth of RAM, that equates to either 288MB or 400MB of RAM that cannot be used.

A buffer, on the other hand, could hand 2GB of assets to the system and take 2GB of assets back so fast it would be almost instantaneous. At that point the system could discard assets it does not believe it will need anymore, store assets it might need to recall shortly in the buffer, and populate the buffer with more assets the game knows it will need in the immediate future.

I would just like to see a less wasteful system in the future when it comes to assets. The assets that are discarded during streaming, I would guess, need to be loaded back into the system quite frequently.
 
You didn't read the article; perhaps you should. The basic transfer limit was either 150GB (bytes, not bits) or 250GB. After that, each additional 50GB costs extra.
 
You have to remember that some, if not all, of the data can be compressed on the disc and decompressed when loading. For example, texture data could be stored as JPEG2000 on disc and converted to a less compressed, GPU-friendly format on the fly.

Are you really expecting binary sizes to grow in proportion to memory size? If so, you will have gigantically big game binaries. On the other hand, if binary sizes only double, then an 8x Blu-ray drive would be considerably faster at filling memory than the 2x drive in the PS3.

Say you assume a 10GB size for a PS3 game with 512MB of memory. If next gen had 4GB of memory, would you expect game sizes to be 80GB (8 times more memory, 8 times the assets)? Or would you rather expect the binary size to double while the Blu-ray read speed quadruples (2x to 8x), and what happens to load times? ;)
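
A rough comparison of those scenarios, with assumed sizes and drive speeds:

Code:
# Time to read an entire game once, under the scenarios sketched above.
def full_read_minutes(game_gb, drive_mbps):
    return game_gb * 1024 / drive_mbps / 60

print(f"This gen: 10GB game @ 2x BD (9 MB/s):  {full_read_minutes(10, 9):.0f} min")
print(f"Next gen: 20GB game @ 8x BD (36 MB/s): {full_read_minutes(20, 36):.0f} min  (binary doubles, speed x4)")
print(f"Next gen: 80GB game @ 8x BD (36 MB/s): {full_read_minutes(80, 36):.0f} min  (binary grows with memory)")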

I would still expect next-gen consoles to have considerably fast flash memory as a cache to alleviate the problems and differences in both hard-drive and physical-media speed (and maybe even games coming on those pesky, cheaper, slower read-only flash carts). If next gen starts in 2014, it's a no-brainer to put 32GB, 64GB, or even more fast flash memory inside the console. This would also give a scalable base SKU not needing a hard drive (flash prices drop in the future, so the base SKU becomes cheaper; similar scaling on hard drives is not possible).
 
That's true this gen. Unless devs are willingly allowing long load times because they can't be bothered to compress more effectively, there's nothing you can do next gen to speed things up that you can't do this gen, so the load times will remain proportional. There are also few assets that can be lossily compressed, and there aren't even potentially more ways to compress than we have now (even if JPEG isn't used now for textures, moving over to it next gen will only make savings in a fraction of the assets; model data, executables, etc. can't be compressed any more than they can now).
 
Not sure, but if every future generation stays true to previous generations, then I would expect larger worlds, higher-resolution textures and models, etc. I would expect Grand Theft Auto next generation to be a much larger game than this generation's.


I honestly don't know; I mean, I've heard some games are using quite a bit of the storage space provided by BD. I'm not sure if that is due to asset placement (multiple placements of the same assets), large amounts of FMV, or simply a large number of different assets. However, I would be naive to think that the size of the assets themselves wouldn't increase next generation. Basically, if UC3 or Halo: Reach were recreated in 3 years for the new consoles, wouldn't you expect the exact same game to have much larger assets due to the new capabilities of the machine? It's not like they can turn a low-res texture into a high-res texture without increasing its size, right? And as little as I know about textures, increasing the resolution has a tremendous effect on their size.

I guess we would need to know if the 360 is/was suffering from only having about 8GB of storage compared to the PS3 with BD. Yet if the 360 could just fit assets on an 8GB disc, wouldn't it be reasonable to think it would now need a larger storage medium for the 720 due to assets? One last thing: an 8GB DVD could fill the 256MB pool of RAM of existing consoles 32 times. A dual-layer BD (52GB) fills a 256MB pool of RAM of existing consoles 208 times. A dual-layer BD (52GB) can fill a 2GB pool of RAM of future consoles 26 times.

While flash would be a step in the right direction, I think the size is overkill and the speed is subpar. 64GB wouldn't be large enough to install future games to, yet would be excessively large for buffering (2-4GB of RAM in a future console). Currently the fastest I've seen are (600x) 90MB/s cards, and they are approaching $200 for 64GB. Granted, the price will drop and we might get a 64GB card for $30 in two years, but we still only get 90MB/s on a medium that will have its life slashed by using it as a cache.

We keep increasing the speed of the processors, then need to increase the speed of the RAM so it doesn't waste cycles, but all the while our storage media are still extremely slow. We basically use RAM just as I'm describing anyway: to temporarily store data from those media so the processor can access it quickly. Less than $20 per console will get them 2GB of DDR3 today. I'd take…
 
Compression itself is trivial to add to the pipeline. There is the issue of finite computing power: devs need to weigh the compromise between time spent on IO and time spent on (de)compressing data and find the right balance for each game separately. It might be that for the current gen some compression schemes would cut IO time but consume too many CPU cycles to be useful (e.g. texture assets as JPEG2000 needing decompression and recompression). As we get more computing power next gen, more memory to fill, and larger datasets, the sweet spot of compression shifts.
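
A toy model of that trade-off (all numbers are assumptions for illustration, and it assumes reading and decompressing are pipelined so the slower stage dominates):

Code:
# IO vs. decompression trade-off for loading one batch of assets.
ASSET_MB = 512       # uncompressed asset data to load
DRIVE_MBPS = 36      # 8x BD read speed
COMPRESSION = 0.5    # compressed size as a fraction of the original
DECOMP_MBPS = 200    # CPU decompression speed, in output MB/s

raw_time = ASSET_MB / DRIVE_MBPS

# With pipelined read + decompress, the slower of the two stages dominates.
compressed_time = max(ASSET_MB * COMPRESSION / DRIVE_MBPS,
                      ASSET_MB / DECOMP_MBPS)

print(f"Uncompressed load: {raw_time:.1f} s")
print(f"Compressed load:   {compressed_time:.1f} s")

Compression wins as long as the CPU can decompress faster than the drive delivers the saved bytes; if the decompression rate drops low enough, the compressed path becomes the slower one, which is exactly the sweet spot shifting.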

I don't believe binary sizes will stay the same, but by the same token I don't believe binary sizes will grow proportionally to memory size either. The sweet spot is somewhere in between. Blu-ray alone can give around a 4x improvement in data read speed (2x to 8x).

Some games like GT5 already have (possibly) next-gen assets partially included (photo-mode models). How much space the photo-mode assets account for and how much more is needed for next gen? I have no idea, but again I don't think the required growth in binary size from this gen to the next is proportional to available memory.

What I would prefer to see is really fast flash memory inside the console and seamless caching of game data. Let the flash cache the small files accessed in random patterns and leave the big files and the predictable-access-pattern stuff to the physical media.
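
A minimal sketch of that routing policy, with made-up thresholds (the file-size cutoff and the helper names are hypothetical):

Code:
# Route small, randomly accessed files to a flash cache; leave large,
# sequentially streamed files on the physical medium.
SMALL_FILE_MB = 4   # "small" cutoff, purely an assumption
flash_cache = {}    # path -> bytes, standing in for a real flash store

def read_asset(path, size_mb, sequential, read_from_disc):
    """Return asset data, caching small random-access files in flash."""
    if path in flash_cache:
        return flash_cache[path]        # hit: served at flash speed
    data = read_from_disc(path)         # miss: slow optical/HDD read
    if size_mb <= SMALL_FILE_MB and not sequential:
        flash_cache[path] = data        # keep only small, random-access files
    return data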
 