*spin off* RAM & Cache Streaming implications

Thanks, Sebbbi.

The knowledge of virtual texturing you bring here is impressive.

From what I understand, virtual texturing in theory makes life much easier for developers by saving memory space, and over the next two years, optical drives 2 to 4 times faster, combined with the fastest SSDs (100 MB/s today), will likely allow more than enough throughput for 1080p games.

Talking about 720p (media drive streaming requirement of 6.7 MB/s, and fast-turn situations): when Carmack and others at id said you could put 8k textures on the X360 (10-12 MB/s) but only 4k on the PS3 (6-9 MB/s), was that figure based on the maximum speed of each console's media drive, and not on memory space per se?
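To put those drive figures in perspective, here is a rough sketch of how many texture tiles per second each quoted bandwidth could deliver, assuming 128x128 DXT1 tiles (~8 KB each; the tile format is an assumption for illustration, not id's stated layout):

```python
# ~8 KB per 128x128 DXT1 tile (4 bits per texel) -- assumed format
TILE_BYTES = 128 * 128 // 2

quoted_mb_per_s = {
    "720p streaming figure": 6.7,
    "PS3 BD (low/high)": (6.0, 9.0),
    "X360 DVD (low/high)": (10.0, 12.0),
    "fastest SSD": 100.0,
}

for name, mb in quoted_mb_per_s.items():
    lo, hi = (mb, mb) if isinstance(mb, float) else mb
    to_tiles = lambda m: m * 1024 * 1024 / TILE_BYTES
    print(f"{name:22s} ~{to_tiles(lo):6.0f}-{to_tiles(hi):6.0f} tiles/s")
```

On these assumptions the DVD/BD gap is roughly a factor of 1.3-2x in tiles per second, which is consistent with a per-drive bandwidth limit rather than a memory limit.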

About 2GB RAM next gen... Shifty, please hold "our cause" ;) of up to 4GB (praying here for XDR2). As I said before, smartphones and tablets will grow a lot (in power, capacity and market share), and soon someone may get the idea of positioning them (tablets) as game machines, which could make them a very serious opponent for next-gen consoles. ;)
 
Er, in the case of Rage I'm 99% sure that Carmack talked about using every bit of available memory for caching. Probably not for the tile cache itself, but for the transcoding instead?
 
About 2GB RAM next gen... Shifty, please hold "our cause" ;) of up to 4GB (praying here for XDR2). As I said before, smartphones and tablets will grow a lot (in power, capacity and market share), and soon someone may get the idea of positioning them (tablets) as game machines, which could make them a very serious opponent for next-gen consoles. ;)
That's the wrong approach. Who cares how much memory is in another device, when it's what's realised on screen that matters? The only people who care are those who are guided by numbers alone. Truth is, a very low-spec GPU capable of 60 million verts a second and 60 million pixels a second, with 50 MBs RAM, would be capable of rendering photorealistic graphics if it could be used 100% efficiently with one texture sample per pixel and one vertex per pixel (excluding shader power). The need for gigabytes of RAM is to overcome stupid inefficiencies in the rendering pipeline, and to store high-res textures which are rarely ever seen. If a next-gen machine comes out with some amazing HOS TBDR rendering engine that produces better visuals with 512 MBs RAM due to far greater efficiency, I'll be all for it!

Because we can't do that though, 4 GBs still strikes me as the best compromise in cost and resources for devs to exploit. I would never request more RAM just because other, more inefficient hardware designs are including more. I don't use 3GBs on my laptop, so >1GB on a mobile phone makes absolutely no sense, and I'll never lament not having 4+ GBs in my laptop when handhelds have pointlessly overtaken my 3GBs. Just as I'll never lament not having a 4k TV, because one of suitable size to make a difference wouldn't fit in my home. Bigger numbers can be a complete waste. I've just talked someone out of buying a £600 laptop because all they did was browse the web, send emails, and do a little light photography. The 1TB HDD in the £600 laptop that appealed to them contrasts with the 150MBs of data they've generated in 6 years on their previous computer. The cheapest possible laptop I could find, with 2GBs and dual 2GHz cores, was way more computer than they'd ever use, and saved them over £300. Likewise with a console, throwing in more RAM than you can sanely use is pointless added cost, and if a study of the theoretical requirements of content streaming leads to designs that achieve the same or more with less hardware, we should embrace that enthusiastically!
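As a quick sanity check on the "perfectly efficient renderer" figures above (simple arithmetic on the post's own claims, nothing more):

```python
# One vertex and one texture sample per pixel at 720p60, with a
# fully unique texel working set.
w, h, fps = 1280, 720, 60
print(f"{w * h * fps / 1e6:.1f} M pixels/s")        # 55.3 M vs the 60 M quoted
print(f"{w * h * 4 / 2**20:.1f} MB unique texels")  # one RGBA8 screen: 3.5 MB
# Even with mip levels and a prefetch margin this fits well inside
# 50 MB -- the gigabytes in real systems go to inefficiency.
```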
 
I was a big fan of carts, add-in processors, that kind of stuff. One of my favourites was King of Fighters '95 on the Saturn, which came with a 16 Mbit ROM cart in addition to the CD - it used that to free up memory for animation and to have zero load times for things like character select.

I know it won't ever happen, but I'd like to see just one game come on DVD plus a USB pen. A 2GB pen drive preprogrammed with game data would cost about $3. At the next process shrink that could be 4GB.

For a game like Final Fantasy 13, any common data could be put on the pen drive, freeing up space across all discs and allowing fewer of them or more content. It would work perfectly on arcade units and speed up some loading too. The DD version wouldn't need the pen drive.

Economically not viable and MS would never allow it but I'd be all over it even if the game sucked.
 
That's how Vita's carts work, basically right? So economically viable it certainly is, I'd say.
 
That's the wrong approach. Who cares how much memory is in another device, when it's what's realised on screen that matters? The only people who care are those who are guided by numbers alone. Truth is, a very low-spec GPU capable of 60 million verts a second and 60 million pixels a second, with 50 MBs RAM, would be capable of rendering photorealistic graphics if it could be used 100% efficiently with one texture sample per pixel and one vertex per pixel (excluding shader power). The need for gigabytes of RAM is to overcome stupid inefficiencies in the rendering pipeline, and to store high-res textures which are rarely ever seen. If a next-gen machine comes out with some amazing HOS TBDR rendering engine that produces better visuals with 512 MBs RAM due to far greater efficiency, I'll be all for it!

Because we can't do that though, 4 GBs still strikes me as the best compromise in cost and resources for devs to exploit. I would never request more RAM just because other, more inefficient hardware designs are including more. I don't use 3GBs on my laptop, so >1GB on a mobile phone makes absolutely no sense, and I'll never lament not having 4+ GBs in my laptop when handhelds have pointlessly overtaken my 3GBs. Just as I'll never lament not having a 4k TV, because one of suitable size to make a difference wouldn't fit in my home. Bigger numbers can be a complete waste. I've just talked someone out of buying a £600 laptop because all they did was browse the web, send emails, and do a little light photography. The 1TB HDD in the £600 laptop that appealed to them contrasts with the 150MBs of data they've generated in 6 years on their previous computer. The cheapest possible laptop I could find, with 2GBs and dual 2GHz cores, was way more computer than they'd ever use, and saved them over £300. Likewise with a console, throwing in more RAM than you can sanely use is pointless added cost, and if a study of the theoretical requirements of content streaming leads to designs that achieve the same or more with less hardware, we should embrace that enthusiastically!

I don't think that's the wrong approach; there has to be balance, thinking of the future. Frankly speaking, unless the developers (Crytek and others) and manufacturers (AMD and Nvidia putting 2GB on 3D cards) are lying, everything that has been talked about and touted points to assumptions of, and needs for, growing RAM space for various uses on next-gen consoles and handhelds (smartphones coming in 2012 with PowerVR Series6 Rogue SoCs at 5-13 Gpixels/s and 350 million polygons/s, and tablets coming with even better specs, 3 to 5 times more power each year).

We're not talking about laptops underutilized by many consumers, or 4k TVs with virtually no sources to feed them, as something distant. But again I have to point out that the next consoles are being discussed and speculated to come with operations and application capabilities far beyond what we see today, converging as an "entertainment center": Skype (or MSN Messenger?), a browser, YouTube, more evolved cross-chat, a more interactive virtual keyboard, video and audio streaming, managing your friends list with profiles, and so on. All of this will require a constant and large amount of RAM, while still playing games with the maximum that the BOM allows.

About 60 million polygons, 60 million pixels, etc.... I sincerely hope the next generation has the capacity to scale that up 2.25 times for 1080p resolution, with the amount of information per pixel expected from a truly next-generation engine (like Epic's Samaritan video, Frostbite 2, id Tech 6). If it doesn't, many core gamers will probably not want to play at 720p even at 60Hz/fps and will move to mid-range PCs (a Radeon HD 6970/GTX 580 with 2GB VRAM and an i7/Phenom with 4GB RAM, especially once Kepler and the 7970 arrive in 2012), which are already more than enough to run higher resolutions and heavier shading.
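For reference, the 2.25x above is simply the pixel-count ratio between 1080p and 720p, and hence the extra fill and shading work per frame:

```python
print((1920 * 1080) / (1280 * 720))                      # 2.25
print(1920 * 1080 * 60 / 1e6, "M pixels/s at 1080p60")   # ~124.4
```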
 
That's how Vita's carts work, basically right? So economically viable it certainly is, I'd say.

I was thinking that publishers would see the $3 (for example) additional investment in "disk(s) + flash" as lost money rather than an opportunity to improve the quality of the game (potentially across all formats you publish on).

You could now put a 4GB game on a $5 cart - dirt cheap by 16-bit standards - but when a cheaper alternative is already built into the system I guess there's little justification.
 
That's amazing Sebbi, thanks for taking the time to write that all up!

I'm actually surprised that you only use 50MBs of GPU memory, when the common assumption seems to be that texture data should be the #1 consumer. Are you actually hard pressed for texture memory? In your full memory budget, is texture data a majority position?
Can you name your target platform? Your comments about flash and not having to worry about optical drives makes me think it's not the current HD consoles.
 
Can you name your target platform? Your comments about flash and not having to worry about optical drives makes me think it's not the current HD consoles.

Well, you might think of the 360 Arcade as being appropriate since it's 4GB flash and his XBLA titles would certainly fit there. :p
 
Well, you might think of the 360 Arcade as being appropriate since it's 4GB flash and his XBLA titles would certainly fit there. :p
Quite right. I was just thinking that 50MB for textures seems quite conservative for a platform with 512MB of RAM.
 
That just goes to show you how utterly inefficient traditional hardware texturing is at this time. Sebbi's game could theoretically use far more detailed textures in that 50MB of memory than a game using 1-2GB of RAM. All that's required for this is large background storage and lots of artist time...
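A rough sketch of what a 50MB tile cache actually holds, assuming 128x128 DXT1 tiles (the tile format is my assumption here, not sebbbi's stated layout):

```python
# 50 MB cache of 128x128 DXT1 tiles (assumed format, ~8 KB/tile)
TILE_BYTES = 128 * 128 // 2
tiles = 50 * 2**20 // TILE_BYTES
texels = tiles * 128 * 128
print(f"{tiles} resident tiles, {texels / 1e6:.0f} M texels")
# ~6400 tiles / ~105 M texels: over 100x the ~0.92 M pixels of a 720p
# frame, so a small cache can look hugely detailed when only the
# tiles actually visible on screen need to be resident.
```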
 
The solution, in the case of Rage, is: make it an on-rails shooter :D
Why not just go all Rebel Assault on the game and use pre-rendered photorealistic visuals played back as semi-interactive movies!!!! :D :D

ILM should start making video games.
 
Superb post! Thanks muchly for taking the time for that. A common request and expectation for next-gen is inbuilt flash as a drive cache, and it should enable best-case virtual texturing.

I would say flash is a given for the lowest-common-denominator console, since it would fit the majority of usage cases and would provide a nice backup against optical media streaming latency.
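A toy sketch of that flash-as-drive-cache idea: a two-level LRU tile hierarchy where RAM misses fall through to flash and then to the optical source. The sizes and the plain LRU policy are illustrative assumptions, not any real console's design:

```python
from collections import OrderedDict

class TileCache:
    """One level of a streaming cache; misses fall through to `backing`."""
    def __init__(self, capacity_tiles, backing=None, fetch=None):
        self.lru = OrderedDict()           # tile_id -> tile data
        self.capacity = capacity_tiles
        self.backing = backing             # next, slower level (or None)
        self.fetch = fetch                 # source read at the last level

    def get(self, tile_id):
        if tile_id in self.lru:
            self.lru.move_to_end(tile_id)  # refresh LRU position
            return self.lru[tile_id]
        data = self.backing.get(tile_id) if self.backing else self.fetch(tile_id)
        self.lru[tile_id] = data
        if len(self.lru) > self.capacity:
            self.lru.popitem(last=False)   # evict least recently used tile
        return data

# With ~8 KB tiles: 50 MB of RAM holds ~6400, 4 GB of flash ~500,000
optical = TileCache(10**9, fetch=lambda t: f"tile {t} from disc")
flash   = TileCache(500_000, backing=optical)
ram     = TileCache(6_400, backing=flash)
print(ram.get(42))   # slow path once; later hits stay in RAM or flash
```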

I have to say, having you spell the numbers out like that makes it very clear how grossly inefficient massive textures are! It even suggests a 2GB next-gen console won't be such a bad option if virtual texturing does as well as we could hope.

Well Joker would like to have a word with you about that. He couldn't implement an advanced form of AI without a large quantity of RAM. :p
 
All the talk of virtual texturing (a la megatextures) makes me pause: what kind of trade-offs are we getting in terms of geometric changes, even texture changes? There will always be trade-offs and bottlenecks (megatextures use a boatload of storage, so the pressure point switches to your HDD or optical drive), so would a game like Battlefield 3 work well with megatextures?

Let's say they have 12-15 large maps for MP (varying between 1km x 1km and 4km x 4km) plus 10-15 SP maps. These also need to support destruction (buildings, trees, vehicles) plus terrain deformation. Because of the open-world design you are also looking to lean a lot on dynamic shadowing and lighting. Maybe the pressure points will change next gen, but the above scenario doesn't look too good for current gen.
 
I'm not sure what your question is, but AFAIK Battlefield 3 is already using virtual texturing, just not completely unique virtual texturing.
 
I'm not sure what your question is, but AFAIK Battlefield 3 is already using virtual texturing, just not completely unique virtual texturing.
Our virtual texture mapping is not completely unique either. Our objects can share textures, we can map the same texture area multiple times to the same object, and we can even tile/repeat our object textures without limit (we have tangent-space normals, so nothing really prevents this). However, we made it so that our engine blends artist-placed decals into all our objects/terrain during virtual texture tile loading. This way we can store the base textures only once on the HDD, and we can map the same physical tiles to different places in the virtual texture with different sets of decals, basically creating several versions of the same texture for a very small cost (upper mips still get baked, usually causing the modified texture versions to take around 1.5% of the original size). So our HDD cost is just a few percent larger than in games that do not use unique per-surface texturing.

Decaling during tile loading is a good technique, since it costs performance only during tile loading, not every frame like the commonly used decaling techniques do. When the decal is burned into the virtual texture tile cache, it stays there as long as the tile does. Tiles are reused hundreds of times (saving a lot of performance), so this technique allows a lot more decals than systems that do not use virtual texturing.
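A minimal sketch of the decal-baking idea described above, in CPU-side Python for clarity; the names and data layout are hypothetical, and a real engine would do this on the GPU during transcoding:

```python
import numpy as np

TILE = 128  # tile width/height in texels

def bake_decals_into_tile(tile_rgba, tile_origin, decals):
    """Blend every artist-placed decal overlapping this tile into it.

    tile_rgba:   (TILE, TILE, 4) float array, the freshly loaded tile
    tile_origin: (x, y) of the tile in virtual-texture texel coords
    decals:      list of (x, y, rgba_image) in virtual-texture coords
    """
    tx, ty = tile_origin
    for dx, dy, img in decals:
        h, w, _ = img.shape
        # Overlap rectangle between decal and tile, in tile-local coords
        x0, y0 = max(dx - tx, 0), max(dy - ty, 0)
        x1, y1 = min(dx - tx + w, TILE), min(dy - ty + h, TILE)
        if x0 >= x1 or y0 >= y1:
            continue                       # decal misses this tile
        src = img[y0 - (dy - ty):y1 - (dy - ty),
                  x0 - (dx - tx):x1 - (dx - tx)]
        dst = tile_rgba[y0:y1, x0:x1]
        a = src[..., 3:4]                  # decal alpha
        dst[..., :3] = src[..., :3] * a + dst[..., :3] * (1.0 - a)
    # The blended result lives in the tile cache for the tile's whole
    # lifetime, so the blend cost is paid once per load, not per frame.
    return tile_rgba

# Usage: one decal straddling the tile that starts at (128, 0)
tile = np.zeros((TILE, TILE, 4))
decal = np.ones((32, 32, 4)) * 0.5
baked = bake_decals_into_tile(tile, (128, 0), [(120, 10, decal)])
```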

---

I noticed that in my previous post I forgot to mention that we actually do have more textures in memory than just the virtual texture. Of course we have menu/UI textures (actually, when the game is not running, we have more textures in memory :) ), fonts and a single particle atlas (containing all our particle textures). Particles are not virtual textured (the particle shader must be really simple as the overdraw can be really high). But half-resolution particle rendering (pretty much a common standard in all new console games) doesn't require that large textures. If I remember correctly, all our particle textures fit in a single 2k atlas (4 megabytes). We also have a separate decal atlas and some terrain generation textures. I will talk about our terrain texture generation later (when I have time to write a post mortem of our project). Obviously we could not store our 4km x 4km terrain pixel by pixel on the HDD. That would take several terabytes of storage.
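An order-of-magnitude check on those storage numbers; the terrain texel density below is an assumption chosen only to illustrate the scale, not sebbbi's actual figure:

```python
# 4 km x 4 km terrain stored "pixel by pixel"
side_m = 4000
texels_per_m = 512                  # ~2 mm per texel (assumed density)
texels = (side_m * texels_per_m) ** 2
for fmt, bpt in [("RGBA8", 4), ("DXT1", 0.5)]:
    print(f"{fmt}: {texels * bpt / 2**40:,.1f} TB")
# -> roughly 15 TB raw / ~2 TB block-compressed: "several terabytes"
#    indeed, hence generating the terrain texture instead of storing it.

# The particle atlas figure also checks out: a 2k DXT5 atlas
print(2048 * 2048 * 1 / 2**20, "MB")  # 1 byte/texel -> 4.0 MB
```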
 