Alternative distribution to optical disks: SSD, cards, and download

Have a 4x or even 6x constant angular velocity Blu-ray drive (assuming it's not more expensive) for megatexturing / predictable streaming.

The problem with optical drives for megatexturing is less the transfer speed (though you will see more pressure there at 1080p once you have colour, specular, normal, etc. layers all streamed separately, so bandwidth could become a concern down the line as well) than the seek latency, which is the big issue. As sebbbi noted about virtual textures:

Seek latency however can be a big bottleneck on DVDs, since a worst case seek can take 100ms

He mentions some solutions, but the bottom line is that for a traditional game, populating memory via transfer (filling 2 GB+ of RAM from an optical drive) is painfully slow, while a RAM-friendly solution that leverages the nice big optical disc as the streaming source runs straight into the drive's really poor seek times. So in the traditional scenario you want the data on the HDD, because well-packaged sequential reads are roughly 5x faster there, and for megatextures the HDD's seek latency, while still not ideal, can be up to an order of magnitude better than the optical drive's.
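A rough back-of-the-envelope of that tradeoff. The only number taken from the thread is the 100 ms worst-case seek; the drive speeds, HDD seek time, and 128 KB tile size are my own assumptions for illustration:

```cpp
#include <cstdio>

int main() {
    // Assumed sustained read rates (MB/s) and seek times (ms); ballpark
    // figures for a 4x CAV Blu-ray drive and a desktop HDD, not measurements.
    const double optical_mb_s = 18.0;     // ~4x Blu-ray at the outer edge
    const double hdd_mb_s     = 90.0;     // roughly the "5x faster" sequential claim
    const double optical_seek_ms = 100.0; // worst case quoted by sebbbi
    const double hdd_seek_ms     = 12.0;  // typical desktop HDD, assumption

    // Filling 2 GB of RAM the traditional way.
    const double fill_mb = 2048.0;
    std::printf("fill 2 GB: optical ~%.0f s, HDD ~%.0f s\n",
                fill_mb / optical_mb_s, fill_mb / hdd_mb_s);

    // Megatexture-style random page fetches, where seek time dominates.
    const double page_mb = 0.125; // 128 KB tile
    const double optical_pages =
        1000.0 / (optical_seek_ms + page_mb / optical_mb_s * 1000.0);
    const double hdd_pages =
        1000.0 / (hdd_seek_ms + page_mb / hdd_mb_s * 1000.0);
    std::printf("random 128 KB tiles per second: optical ~%.0f, HDD ~%.0f\n",
                optical_pages, hdd_pages);
    return 0;
}
```

With those assumptions the HDD comes out roughly 8x faster on random tile fetches, which is where the "up to an order of magnitude" above comes from.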
 
You want a separate SKU with a hole where the drive would be, or two separate consoles differing a lot more than that?
Also, if you want to use that DD-only version for the triple-A titles, you're going to need a big HDD anyway, which is rather the opposite of the goal of making it very cheap. But people don't seem to think long-term, and may end up with this plus an HDD add-on almost from day one.
(I think the PS3's (perceived) entry price is inflated by not offering an HDD-less SKU, and it annoys anyone who wants a different HDD than the available SKUs ship with, since they're left with dead weight. It wouldn't practically change anything regarding costs, but the common comparison to the cheapest Xbox would look better.)
It's still a useless expense as soon as you plug an HDD in. Just sell the cheapest SKU with a 2.5" SSD you can rip out.
That is, unless you expect the flash to be super-speedy (and cheap?), but then I'd rather have more main RAM than another pool that needs to be managed.

I would probably want two completely different SKUs, with no option to add the optical drive later. Whether they would look the same or different would be a manufacturing-optimization question; perhaps having the same case but different cooling would be the cheapest option.

If we assume 64 GB of flash, DD-only would be quite OK. Maybe you fit one AAA title (20 GB), 2-3 regular titles (8 GB each), and then some DD-only titles plus save games and game caches. Once you fill the drive you either go and buy the HDD (which is premium-priced and vendor-locked) or you uninstall and end up in a (re)download loop.

I would probably want to put the flash on the mainboard to optimize cost (no casing). This would let the price scale nicely compared to having an HDD in the base model. I would guess that if the Blu-ray drive is fast enough for megatexturing, then something along the lines of 100 MB/s for the flash would be reasonable for random data, and a huge leap over the current gen.
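Working the 64 GB and 100 MB/s figures from the previous two paragraphs through; the title counts and frame rates are just the assumptions already stated above:

```cpp
#include <cstdio>

int main() {
    // Capacity split suggested above: one AAA title plus three regular ones.
    const double flash_gb = 64.0;
    const double left_gb  = flash_gb - 20.0 - 3 * 8.0;   // = 20 GB remaining
    std::printf("left for DD-only titles, saves and caches: %.0f GB\n", left_gb);

    // What 100 MB/s of random flash reads buys per frame.
    const double flash_mb_s = 100.0;
    std::printf("streamable per 30 fps frame: ~%.1f MB\n", flash_mb_s / 30.0);
    std::printf("streamable per 60 fps frame: ~%.1f MB\n", flash_mb_s / 60.0);
    return 0;
}
```

Roughly 3 MB of freshly streamed data every 30 fps frame is a lot of texture and geometry churn compared to an optical drive stalling 100 ms per seek.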

I don't think any game has been designed around an SSD with fairly fast random access and read speeds. I wonder, for example, what Madden could do if it could load extremely detailed models into memory on demand instead of caching all the models in RAM. PC games tend to just cache to RAM instead of streaming, which is not that optimal given the limited memory budgets of console hardware.
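A minimal sketch of that on-demand idea, assuming a fast random-access SSD; the class, the 4 MB model size, and the asset names are all made up for illustration:

```cpp
#include <cstdio>
#include <string>
#include <unordered_map>
#include <vector>

struct Model { std::vector<char> data; };

class ModelCache {
public:
    explicit ModelCache(size_t budgetBytes) : budget_(budgetBytes) {}

    // Returns the model, loading it from storage only the first time it is
    // actually needed instead of front-loading everything into RAM.
    const Model& get(const std::string& name) {
        auto it = cache_.find(name);
        if (it != cache_.end()) return it->second;        // already resident
        Model m = loadFromStorage(name);                  // cheap on SSD, painful on optical
        used_ += m.data.size();
        evictIfNeeded();
        return cache_.emplace(name, std::move(m)).first->second;
    }

private:
    Model loadFromStorage(const std::string& name) {
        std::printf("streaming %s from flash\n", name.c_str());
        return Model{std::vector<char>(4 * 1024 * 1024)}; // pretend every model is 4 MB
    }
    void evictIfNeeded() {
        if (used_ > budget_) { /* drop least-recently-used models here */ }
    }

    size_t budget_;
    size_t used_ = 0;
    std::unordered_map<std::string, Model> cache_;
};

int main() {
    ModelCache cache(256 * 1024 * 1024);  // 256 MB of RAM instead of "cache everything"
    cache.get("qb_home_team");            // streamed in the moment the player shows up
    cache.get("qb_home_team");            // second request is just a hash lookup
}
```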
 
Another thing is swapping. Usually a curse word, but with a proper OS and fast random-access media it would be a very, very reasonable solution for consoles on a limited budget. If the OS can load, let's say, 100 MB/s, it would let any task smaller than, let's say, 50 MB be swapped to the background and loaded back in over the game in less than 0.5 s. Split the background services into an engine part and a UI part, and the UI parts can easily be swapped out to disk while playing a game, freeing a lot of memory. An OS with an SSD and proper swapping could give a really insanely good feeling of multitasking without actually keeping that much in memory...
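The 0.5 s claim above, worked through for a few hypothetical background-service sizes over a ~100 MB/s flash device:

```cpp
#include <cstdio>

int main() {
    const double flash_mb_s = 100.0;                 // assumed random read rate
    const double service_mb[] = {10.0, 25.0, 50.0};  // hypothetical UI/service working sets
    for (double mb : service_mb)
        std::printf("%4.0f MB service swaps back in over the game in ~%.2f s\n",
                    mb, mb / flash_mb_s);
    return 0;
}
```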
 
Can't they do a better implementation of the Gamecube's A-RAM to mask load times: put in 2 to 8 GB of memory with around 1-2 GB/s of bandwidth as a buffer or something? I always thought the Gamecube was well done in that aspect.
 
The problem with RAM is you need to reload it on bootup. Flash/SSD/whatever retains its state.
 
That doesn't seem like a big obstacle if it opportunistically preloads and doesn't eject pages from RAM too aggressively, actually.
 
Can't they make the console go into a sleep mode or something, where the data inside that A-RAM is somehow maintained at low power?
 
How would it work exactly?

Imagine an Xbox 360 with 512 MB of UMA but an additional 2 GB, like the GCN "A-RAM" pool, connected at, say, 10 GB/s. Your game would load as it does now, but after that the disc/HDD keeps spinning and loading into the A-RAM. The game, instead of seeking the disc/HDD for access, now addresses the A-RAM, since the entire level (and maybe even the next one) is likely loaded into it. At 10 GB/s it isn't blazing fast, but it is orders of magnitude faster than the HDD or disc, and it has extremely low access times. This scenario isn't too different from the PC, where you can use system memory as a large VRAM swap, and, as noted above, in some games you could create a situation where you never see a significant load time other than the initial game load, because the A-RAM could stream data into system memory within seconds.

Obviously a system designed around such a concept, with software taking it into consideration, would function much better than what I laid out above. But if 8 GB of cheap system memory is $20, you could then invest in a much nicer 2 GB fast pool. This doesn't address the initial load-time issue (it puts that squarely on developers to design games where you can quickly begin to play while content streams into memory), but it does solve a lot of other problems by creating a large, fast, low-latency pool of memory. It is a fairly "cheap" solution in that it can cut down on the costly memory and blunt some of the impact of slower HDDs and slow optical formats.
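A sketch of the read path this describes, assuming a hypothetical A-RAM pool that a background streamer keeps filling; the types, sizes, and the readFromDisc stub are all illustrative:

```cpp
#include <cstdint>
#include <cstring>
#include <unordered_map>
#include <vector>

struct ARamPool {
    std::vector<uint8_t> storage;                   // e.g. 2 GB of cheap-but-wide DRAM
    std::unordered_map<uint64_t, size_t> resident;  // asset id -> offset in the pool
};

// Slow-path stand-in: a synchronous read from HDD or optical disc.
void readFromDisc(uint64_t /*assetId*/, uint8_t* dst, size_t size) {
    std::memset(dst, 0, size);  // tens of ms of seek plus transfer in reality
}

// The game asks for an asset; serve it from A-RAM if the background fill has
// already copied it there, otherwise fall back to the drive.
void readAsset(const ARamPool& pool, uint64_t assetId, uint8_t* dst, size_t size) {
    auto it = pool.resident.find(assetId);
    if (it != pool.resident.end())
        std::memcpy(dst, pool.storage.data() + it->second, size); // ~10 GB/s, microsecond latency
    else
        readFromDisc(assetId, dst, size);
}

int main() {
    ARamPool pool{std::vector<uint8_t>(1024), {{42u, 0u}}};  // tiny stand-in for the 2 GB pool
    uint8_t buffer[256];
    readAsset(pool, 42, buffer, sizeof buffer);  // A-RAM hit
    readAsset(pool, 7,  buffer, sizeof buffer);  // miss, goes to the drive
}
```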

But I can already hear the screams of "split memory pools", as it would only take the competitor going with 4 GB of fast UMA memory to toss this on its ear.
 
Anytime one starts a new game the cache is empty, which is the worst-case scenario, or what?
 
Sure, for the first level or area you play you'd be loading at optical-disc speed, but after that the game would be doing prefetch in the background. If you've got 8 gigs to play with, that's an entire 360 game right there. And no worries about limited write cycles on flash RAM.

As mentioned above, this RAM would have to be above and beyond the main fast RAM your competitor has, to make sure you don't lose functionality, but whenever you're not dealing with mass data from disc you'd be able to use it for fast paging of whatever apps might be on the system.

That is, after all, how PCs do it today.
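A sketch of that background prefetch idea: a queue of likely-next assets that gets drained into the big cache pool whenever the drive has nothing better to do. The structure and sizes are assumptions, not anyone's actual streaming system:

```cpp
#include <cstdint>
#include <cstdio>
#include <deque>

struct Prefetcher {
    std::deque<uint64_t> upcoming;  // asset ids the game expects to need soon
    size_t poolFreeBytes;           // space left in the big cache pool

    static constexpr size_t kAssetSize = 8 * 1024 * 1024;  // pretend assets are 8 MB each

    // Called from the streaming thread whenever no foreground read is pending.
    void onDriveIdle() {
        while (!upcoming.empty() && poolFreeBytes >= kAssetSize) {
            const uint64_t id = upcoming.front();
            upcoming.pop_front();
            std::printf("prefetching asset %llu into the cache pool\n",
                        static_cast<unsigned long long>(id));
            poolFreeBytes -= kAssetSize;  // the actual disc read would happen here
        }
    }
};

int main() {
    Prefetcher p{{1, 2, 3}, 64 * 1024 * 1024};  // 64 MB stand-in for "8 gigs to play with"
    p.onDriveIdle();  // in a real engine this runs between foreground requests
}
```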
 
I don't see them doing this for the simple reason that MS will want their next console to be a media hub that can do everything and not just play games - and that will mean it'll need to play DVDs and Blu-rays.
 
It's a bit strange that it's OK to wait potentially hours (or days?) for a digital purchase to download and then be installed, but out of the question to wait for a complete or partial install from an optical drive to a hard drive. A straight copy of 25 GB should take around 15 minutes with an 8x drive.
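For reference, assuming 1x BD-ROM is about 4.5 MB/s: 8x is roughly 36 MB/s, so 25 GB is about 25,600 MB ÷ 36 MB/s ≈ 710 s, or roughly 12 minutes at the outer edge; with the slower inner tracks of a CAV drive, the ~15 minute figure above looks about right.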

If MS/Sony make a real hard effort on the install part, provide developers with the needed tools, and force them to follow strict guidelines, it should not be such a big issue.
 
On Steam (and 360, I guess) there's no installation part...

But in any case... I always wondered why more games didn't use the Oblivion kind of "installation". It works pretty well for that game: it installs the game to the HDD while you're playing it, so there's no "installation" wait. At first the game has longer load times, but once the installation is finished it's the same as installing it the way other games do. Some Steam games (e.g. Half-Life 1/2, iirc) let you start playing while the game is still downloading. Most linear games could use such a system; open-world games, not so much.
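A rough sketch of that Oblivion-style "install while you play" scheme: a background task trickles files from the disc to the HDD, and the game's reads prefer the HDD copy once it exists. The mount points and manifest are made up; this isn't any real console SDK:

```cpp
#include <filesystem>
#include <fstream>
#include <string>
#include <vector>

namespace fs = std::filesystem;

const fs::path kDiscRoot = "/disc";            // hypothetical optical mount point
const fs::path kHddRoot  = "/hdd/gamecache";   // hypothetical install target

// Game-facing read: use the installed copy when available, the disc otherwise.
std::ifstream openGameFile(const std::string& relative) {
    const fs::path onHdd = kHddRoot / relative;
    if (fs::exists(onHdd))
        return std::ifstream(onHdd, std::ios::binary);             // fast path once installed
    return std::ifstream(kDiscRoot / relative, std::ios::binary);  // slow path early on
}

// Background task: copy one manifest entry across whenever the drive is idle.
void installStep(const std::vector<std::string>& manifest, size_t& nextIndex) {
    if (nextIndex >= manifest.size()) return;   // install finished
    const fs::path src = kDiscRoot / manifest[nextIndex];
    const fs::path dst = kHddRoot  / manifest[nextIndex];
    ++nextIndex;
    if (!fs::exists(src)) return;               // nothing mounted (e.g. running on a dev PC)
    fs::create_directories(dst.parent_path());
    fs::copy_file(src, dst, fs::copy_options::skip_existing);
}

int main() {
    const std::vector<std::string> manifest = {"levels/e1m1.pak", "levels/e1m2.pak"};
    size_t next = 0;
    installStep(manifest, next);                  // would copy the first file on real hardware
    auto file = openGameFile("levels/e1m1.pak");  // falls back to the disc until then
}
```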

The "load to RAM" stuff was used for Arcade machines. Some (Naomi with GDROM, afaik did it) had additional RAM which was just used as a storage medium for the game. This served two purposes. First Arcade machines run 24/7 and suffer heavy abuse, so mechanical drives are prone to failure. And by caching all game data to RAM, the drive only needs to spin up once per game load. And the other part was much faster load times than from HDD or CDROM.

You could do it "Windows hibernate" style and have the game cache written to the HDD if you want to load another game (maybe ask the user whether to dump it to HDD or reload it when they boot that other game again), and have this RAM keep its contents when switching off the console, or dump it to HDD when switching off anyway. Something like 2 to 4 GB per game should pretty much suffice. Copying that amount of data to an HDD doesn't take too long either: with my slower external HDD (writing sequential data at about 30 MB/s) that's about two minutes, which could be done in parallel with game load / system boot.

Might be a real bitch to set up and debug, though, from Sony's/MS's perspective as well as the game makers'.
 
Yeah, the big problem is how it's done currently (on PS3 at least): installs take way longer than they should. Similarly, updating is a pain, with PSN purchases installing the first version and then having to be updated multiple times.
It's unfair to assume that next-gen consoles will use a separate flash pool perfectly while not improving their regular disk caching and the rest.

Some better effort and stricter guidelines would go a long way towards improving load times.
 
There is NO installation with Steam (unless you count installing Steam itself or starting a download). The download IS the installation; there's no extra process that starts after the download has finished (some games need extra dependencies installed, but that's a different matter altogether and usually doesn't take longer than a minute anyways).

On PS3 it's a two-part process: first downloading a package, and then presumably unpacking/decrypting it for playback. This just isn't present in Steam or on the 360 (I guess; at least that's what I've heard... I don't have a 360).
 
Which is kinda what I said. Only that Steam does more than just download: files are compressed and possibly encrypted, and they get processed on the fly (and then often packed into GCF files) instead of waiting for the whole download to finish first.
Same thing as any installer really, just not a two-step process.
 