Taken from the Frostbite thread
“We knew that people would think that this demo was running on PC, but the good thing is that it’s all based on streaming,” Bach tells us. “We have a super-powerful streaming pipeline, which makes it possible to stream high-end data through the game so every frame we look at will have fresh data. This means you don’t have to load everything at once; you don’t have to fill the level at the start. In BC2, you have 512 megs of memory; you load it, you play it, done. The objects you saw at the end needed to be loaded at the start, and you think, ‘It took me an hour to get to this point where I can see it, so what’s the point?’
“That’s the whole magic with this. We can have 512 megs every hundred metres if we wanted to, as we can just flush the data out [and replace it] as you move along. I can promise you that the console versions will still look amazing because of the core technology. If you have a 360, we want to use that machine to the maximum.”
Could this be my vindication?
Only needing to supply 512 MB of new data isn't very difficult; current hard drives can transfer at something like 80 MB/s. But the system isn't going to be able to stream small chunks at 80 MB/s when it is already using 1 GB of RAM: it will need to flush roughly 80 MB per second out of RAM to make room for the next asset. If the game needs to load 320 MB of new assets for a building on the horizon, it first has to dump four seconds' worth of data (with the drive peaking at 80 MB/s) before it can load that asset. That means we went from having 1,024 MB of usable RAM to 704 MB, because a portion of the RAM is unusable or partially empty while the new assets are being loaded.
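Putting rough numbers on that (a quick sketch; the 80 MB/s and 320 MB figures are the illustrative ones from above, not measured values):

```python
# Illustrative numbers only: a hypothetical 80 MB/s drive feeding a 1 GB pool.
RAM_MB = 1024          # total system memory
ASSET_MB = 320         # new assets needed for the building on the horizon
DRIVE_MBPS = 80        # assumed sustained transfer rate of the drive

# Before the new data can land, an equal amount of old data must be evicted.
evict_seconds = ASSET_MB / DRIVE_MBPS     # 320 / 80 = 4.0 s of lead time
usable_during_load = RAM_MB - ASSET_MB    # 1024 - 320 = 704 MB

print(f"Eviction/load window: {evict_seconds:.1f} s")
print(f"Usable RAM while streaming: {usable_during_load} MB")
```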
A HDD or optical drive can only stream data that the system can store or use immediately. If the system has no buffer and no current need for data, both of those mediums sit idle, providing no other benefit to the gaming experience. But if the HDD could churn out that 320 MB while the system didn't yet need it, and then hand it over within a millisecond the moment it was needed, you wouldn't lose that available resource.
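In other words, a staging buffer would let the drive's idle time do useful work. A minimal sketch of the difference, assuming the same hypothetical 80 MB/s drive:

```python
# Minimal sketch: a hypothetical staging buffer lets the drive work during
# idle time instead of making the game stall at the moment of need.
DRIVE_MBPS = 80

def on_demand_wait(asset_mb):
    """Game asks for the asset only when it needs it: full transfer stall."""
    return asset_mb / DRIVE_MBPS  # seconds the game is starved for data

def prefetched_wait(asset_mb, buffered_mb):
    """Drive filled a buffer during idle time; only the shortfall stalls."""
    shortfall = max(asset_mb - buffered_mb, 0)
    return shortfall / DRIVE_MBPS

print(on_demand_wait(320))        # 4.0 s stall
print(prefetched_wait(320, 320))  # 0.0 s -- the handoff is effectively instant
```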
A great analogy?
You're on a game show, and you and your opponent are each given a bunch of 6" x 6" x 6" boxes.
A large pile of money is dropped at one end of the building; your designated cache is on the other side of the building.
You're both told that each of the 4 rounds lasts 2 minutes, and only during that time can you dump boxes full of money into your cache to count as winnings. You get 6 minutes between rounds.
Round 1 starts
Your opponent fills up 2 boxes, runs to the other side of the building, dumps them into the cache, and then waits for his next round to begin; he got $512.
After the first round you were also only able to get $512, but... you found a loophole.
After the round ended, you went back, started filling those boxes up again, and then moved them right next to the cache. The rules never said you couldn't do this, only that you couldn't dump them into your cache until the next round began.
Round 2
Your opponent once again only got $512.
You, on the other hand, got $2,048.
Round 3... Round 4... done! Your opponent got $2,048 total; you got $6,656.
You and your opponent packed the boxes at the same speed, you both had the same distance between the pile of money and the cache, and you both had the same amount of time between rounds. The difference was that you didn't wait for a round to begin before filling your boxes, and you staged your boxes in an area much closer to the cache for dumping.
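The totals fall straight out of the round structure. A quick check of the scores (the "four staged loads per break" figure is inferred from the $2,048 rounds, not stated outright):

```python
# Replaying the game show with the post's numbers: $512 per pair of boxes,
# 4 scoring rounds, and a 6-minute break long enough to stage 4 loads.
ROUNDS = 4
PER_TRIP = 512          # $ your opponent banks per 2-minute round
STAGED_PER_BREAK = 4    # loads you can pre-fill and stage during a break

opponent = PER_TRIP * ROUNDS                                  # 512 * 4 = 2048
you = PER_TRIP + PER_TRIP * STAGED_PER_BREAK * (ROUNDS - 1)
# round 1: $512 (no head start); rounds 2-4: $2,048 each = $6,656 total
print(opponent, you)    # 2048 6656
```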
A game works the same way: there is a designated window when it allows data to be moved in, and long stretches when it just sits on that data. Most games sit on their data for a very long time, but when they do need more, they are limited by the speed at which the data can be delivered.
Simply put, if next gen has 2 GB of system memory (I hope), streaming with our current speed limitations is going to severely hinder the quality of the games. We will have to sacrifice system memory to stage the new assets that stream in larger chunks because of that limitation, and that doesn't seem like a very efficient use of expensive memory.
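To put that worry in rough numbers (same hypothetical 80 MB/s drive; the 512 MB chunk size is purely illustrative):

```python
# Hypothetical next-gen scenario: same 80 MB/s drive, 2 GB of system memory.
RAM_MB = 2048
DRIVE_MBPS = 80
CHUNK_MB = 512          # size of a streamed asset chunk (illustrative)

refresh_all = RAM_MB / DRIVE_MBPS     # ~25.6 s to turn over the whole pool
staged_fraction = CHUNK_MB / RAM_MB   # 25% of RAM tied up staging one chunk

print(f"Full memory turnover: {refresh_all:.1f} s")
print(f"RAM lost to staging:  {staged_fraction:.0%}")
```

The more memory you add without raising transfer speed, the longer a full turnover takes and the bigger the staging chunks have to be, which is exactly the inefficiency described above.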