*spin off* RAM & Cache Streaming implications

Heinrich4

Maybe this doesn't belong here, but the RAM requirements of this engine, which still targets the current generation (id Tech 6 seems to be aimed more at the next generation), are quite high (the GPU requirements not so much):

http://www.kotaku.com.au/2011/09/rage-system-requirements-are-fittingly-barren/

Minimum:

OS: Win XP SP3, Vista, Win 7
Processor: Intel Core 2 Duo or Equivalent AMD
Memory: 2GB
Hard Disk Space: 25GB
Video Card: GeForce 8800, Radeon HD 4200


Recommended:

OS: Win XP SP3, Vista, Win 7
Processor: Intel Core 2 Quad or Equivalent AMD
Memory: 4GB
Hard Disk Space: 25GB
Video Card: GeForce 9800 GTX, ATI Radeon HD 5550

If the engines coming in the next few years have the same or higher memory requirements for even more advanced virtual texturing than id Tech 5, 4GB of RAM may not be fully enough (think Skype, internet browsing, more advanced and interactive cross-chat, streaming sound and video as an entertainment-hub way of life, etc.).
 
I don't understand this mindset at all. More memory decreases the stress put on optical media.

With everything else the same, less memory will require more streaming of textures, smaller buffers for preloading and thus more texture pop-in.

I expect at least 4GB. Whoever launches first of MS and Sony risks the counterpart doubling up on RAM and having to sit through a whole generation of Lens of Truth and Eurogamer VS. reviews declaring the competitor the better choice.

We are talking about less than $20 worth of DRAM for 4 GB for a launch console. In 2005 DRAM was 10% of the $525 BOM of the premium 360.

Cheers
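As a rough sanity check, the cost arithmetic in the post above can be sketched in Python. The $20-for-4GB and 10%-of-$525 figures are the ones claimed in the post; the 512MB size used for the 360's RAM is an assumption of this sketch.

```python
# Cost figures quoted in the post above (2005 premium Xbox 360 vs. a
# hypothetical next-gen launch console). The 512 MB (0.5 GB) figure for
# the 360's RAM is an assumption, not stated in the post.
bom_2005 = 525 * 0.10            # ~$52.50 of DRAM in the 360's BOM
per_gb_2005 = bom_2005 / 0.5     # 0.5 GB -> roughly $105 per GB
per_gb_next = 20 / 4             # claimed: <$20 for 4 GB -> ~$5 per GB

print(per_gb_2005, per_gb_next)
```

On those numbers, DRAM cost per GB would have fallen by roughly 20x between the two launches, which is the core of the "4GB is cheap" argument.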
Actually, that's the best argument I've read in favor of 4GB of RAM.
Epic, meanwhile, seems to be looking ahead to the next generation:

http://www.vg247.com/2011/09/06/epics-capps-gears-of-war-3-and-the-next-generation/


“‘What would you like to see?’ I can’t tell you, because I know what’s going on.”

Despite the silence, though, Capps is willing to concede that developers Epic’s engaging with on UE4, the company’s next generation engine, are pumped.

“We’ve only talked to a small number, but the ones we have are very excited because we’re doing a lot of core work that’s going to make it easier for them to scale up, especially with lots of processors,” he says.
So 8 cores sounds like a good bet; assuming Charlie is right and MS uses a SoC, a straight PowerPC A2 or a derivative sounds like a decent possibility.
 
Interesting; for me the OS itself takes around 100-150MB at most, and the rest are background apps that I can easily close.

Don't know how you manage that. I'm running a very clean 32bit Win 7 with zero background tasks beyond security, and I have 1GB of 3GBs used up.
 
Don't know how you manage that. I'm running a very clean 32bit Win 7 with zero background tasks beyond security, and I have 1GB of 3GBs used up.

Same here, but I believe it caches parts of programs which it thinks you will use; it will let go of that memory if a real program needs it.
Even still: Win 7 is laggy as shit, even if you have the quad-core, dual-SLI thing going on.
 
Don't know how you manage that. I'm running a very clean 32bit Win 7 with zero background tasks beyond security, and I have 1GB of 3GBs used up.
Easy: I don't count cached memory as used, as the OS will flush it whenever anything needs that RAM. Otherwise I could say my Linux box has 99% of its memory used whenever I let it sit idle long enough for it to cache stuff :)

So basically you get the actually available memory as the sum of available+cached+free, not just available+free. The physical-memory percentage at the bottom of Task Manager is pretty meaningless as well.
 
Yes, here's the memory usage on the box I'm typing this on; half the memory used is disk cache.

Code:
             total       used       free     shared    buffers     cached
Mem:           496        351        144          0         32        139
-/+ buffers/cache:        179        317
Swap:          191         18        173

By the way, it's an ancient computer, but it's usable and running Firefox 6.
It has the original Xbox's CPU and the Xbox 360's memory amount :). It originally had maybe 128MB, which would still be usable but would limit it very severely.
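For anyone puzzled by the "-/+ buffers/cache" line, the arithmetic can be sketched from the numbers above. Note that free(1) works in KB internally, so its MB display rounds slightly differently than this back-of-the-envelope version:

```python
# Figures (in MB) from the `free` output quoted above.
total, used, free_mem = 496, 351, 144
buffers, cached = 32, 139

# Memory genuinely consumed by programs: subtract the reclaimable cache.
really_used = used - buffers - cached
really_free = free_mem + buffers + cached

print(really_used, really_free)  # 180 315 (free shows 179/317 from KB-level rounding)
```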
 
I don't understand this mindset at all. More memory decreases the stress put on optical media.

With everything else the same, less memory will require more streaming of textures, smaller buffers for preloading and thus more texture pop-in.
"Stress" on the drive per se is not an issue. Only when it becomes contention and thrashing. The drive will keep spinning all the time anyway. Might as well use it.

Suppose you have a streamed "corridor" environment where the camera can move at a maximum speed that is balanced against the drive's capacity to bring in new assets. You fill up your RAM by buffering ahead all the assets you need for the next 30 seconds of movement. If the camera moves constantly, you will prefetch at the drive's max speed all the time, but everything will be loaded when it's visible and you will render full quality versions of everything.

Now you double your RAM. You can now prefetch content for 60 seconds worth of constant movement, and you do. As long as the camera keeps moving, the drive is still busy prefetching at its top rate. What did you gain?

The desirable thing is to get the game running and have it be responsive as quickly as possible. An idling drive OTOH doesn't improve the experience. Even when you consider drive noise, you'd have to spin it down to really make an impact, but usually spin-down/spin-up cycles annoy people more than a constant noise floor.
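The prefetch-budget reasoning above can be put into numbers. The 9 MB/s drive rate and the buffer sizes below are made-up illustration figures, not any real console's specs:

```python
# How far ahead (in seconds) a prefetch buffer covers at the drive's
# sustained read rate. All figures are hypothetical.
def prefetch_window(buffer_mb, drive_mb_per_s):
    return buffer_mb / drive_mb_per_s

drive_speed = 9.0                           # MB/s, assumed sustained rate
print(prefetch_window(270, drive_speed))    # 30.0 s of look-ahead
print(prefetch_window(540, drive_speed))    # 60.0 s: doubling RAM doubles the
                                            # window, but the drive stays just
                                            # as busy while the camera moves
```

This is the post's point: with constant camera movement and no reuse, a bigger buffer only lengthens the look-ahead window without reducing drive load.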
Gubbi said:
I expect at least 4GB. Whoever launches first of MS and Sony risks the counterpart doubling up on RAM and having to sit through a whole generation of Lens of Truth and Eurogamer VS. reviews declaring the competitor the better choice.

We are talking about less than $20 worth of DRAM for 4 GB for a launch console. In 2005 DRAM was 10% of the $525 BOM of the premium 360.

Cheers
I'm not privy to high-performance DRAM prices, but these numbers seem to be in line with DDR3 DIMMs. I'm not sure how this translates to higher-end memory. If you recall, 700MHz GDDR was pretty much cutting edge when the current crop of consoles launched, and they all went for that speed grade anyway. Even Nintendo. I fully expect aggressive memory clocks again, i.e. the balance will tilt towards bandwidth, not size.

Consoles have never had aggressive amounts of memory. It just had to be fast. Starting with fewer chips makes it easier to revise and use even fewer chips later down the line. And corners have been cut for much smaller savings.

I agree that 4GB would be awesome and pleasant, but everything that goes into a closed box has to be justified. I feel 2GB can be good enough, extrapolating from what appears to be possible this generation.
 
"Stress" on the drive per se is not an issue. Only when it becomes contention and thrashing. The drive will keep spinning all the time anyway. Might as well use it.

Suppose you have a streamed "corridor" environment where the camera can move at a maximum speed that is balanced against the drive's capacity to bring in new assets. You fill up your RAM by buffering ahead all the assets you need for the next 30 seconds of movement. If the camera moves constantly, you will prefetch at the drive's max speed all the time, but everything will be loaded when it's visible and you will render full quality versions of everything.

Now you double your RAM. You can now prefetch content for 60 seconds worth of constant movement, and you do. As long as the camera keeps moving, the drive is still busy prefetching at its top rate. What did you gain?

I'm not talking about mechanical stress on the optical drive. I'm talking about the number of I/Os per second the drive needs to respond to.

A streamed corridor is perfectly prefetchable. There is also *zero* reuse, so it's a pretty poor example for a caching scenario. A 6x Blu-ray drive can chomp through 50GB of data in a bit over half an hour. Most games are longer than that.

Instead, imagine something like Rage, where they drop all the high-res mip levels once textures are outside your view frustum. All that texture data needs to be reloaded when you do a 180-degree turn. Do it quickly and you have *massive* texture pop-in. Do another 180 turn and the textures you were looking at 10 seconds ago need to be reloaded from optical or HDD. Now imagine running Rage with 1GB of memory instead: you'd be able to cache most of the texture data outside your view frustum, improving fidelity *and* improving prefetching when moving to a new area, because you've lowered the load on the storage system.

...Or, imagine a hub-like game-level structure. We are talking about *all* RPGs this gen. Here you're constantly returning to areas you've already visited; more RAM would allow more caching. All those saved I/Os could then be used for more prefetching of content.

Cheers
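The reuse argument can be illustrated with a toy LRU texture cache. This is a deliberately simplified model, not how Rage's actual streamer works: it just counts storage reads for a texture access trace under different cache sizes.

```python
from collections import OrderedDict

def disk_reads(access_pattern, cache_slots):
    """Count storage reads for a texture access trace under a simple LRU cache."""
    cache = OrderedDict()
    reads = 0
    for tex in access_pattern:
        if tex in cache:
            cache.move_to_end(tex)          # hit: just refresh recency
        else:
            reads += 1                      # miss: fetch from optical/HDD
            cache[tex] = None
            if len(cache) > cache_slots:
                cache.popitem(last=False)   # evict least recently used
    return reads

# A hub-like trace: the player keeps cycling through the same 8 areas.
hub = [t % 8 for t in range(100)]
print(disk_reads(hub, 4))   # 100: cache too small, every access re-reads
print(disk_reads(hub, 8))   # 8: cache holds the whole hub, only cold misses

# A pure corridor: every texture is new, so extra cache buys nothing.
corridor = list(range(100))
print(disk_reads(corridor, 4), disk_reads(corridor, 8))   # 100 100
```

With reuse (hub worlds, 180-degree turns) extra RAM eliminates re-reads outright; with a pure corridor it doesn't, which is why the two posters talk past each other.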
 
Now imagine running Rage with 1GB of memory instead: you'd be able to cache most of the texture data outside your view frustum, improving fidelity *and* improving prefetching when moving to a new area, because you've lowered the load on the storage system.
Or instead of that you could be using higher-resolution textures and still have the pop-in problems. Technically Rage could simply downsize all its textures by 2-3x and have nearly no pop-in when spinning around in one place, but it would look ugly. Similarly, with 1GB the textures would also look ugly if all that extra space is just used for keeping stuff that most of the time isn't needed.
 
Now imagine running Rage with 1GB of memory instead: you'd be able to cache most of the texture data outside your view frustum, improving fidelity *and* improving prefetching when moving to a new area, because you've lowered the load on the storage system.

That's sort of what they do already: the megatexture cache needs about 24-32MB of RAM, and whatever's left after sounds, animation, game data etc. is used for caching. It's just that it's apparently not enough on the base X360 systems; we'll see how well it works with a full HDD install once the game's out.
 
Your streaming system shouldn't throw out data you can still see simply by turning around. The frustum is too conservative a criterion. This actually reminds me of Ultima IX, ugh.

I thought Rage was supposed to perform fine (60fps) on current consoles. It might actually become the new poster child for streaming efficiency. I haven't heard any specific complaints about thrashing when you turn around. Has that happened?
 
There were some mentions of texture pop-in, but no further word about the circumstances, like whether it was a full HDD install on an X360 or something less capable, or maybe a fast PC. Which is why I'd prefer to hold off on any judgment at least until all versions are released and Digital Foundry can do a proper analysis.
 
The solution, in the case of Rage, is: make it an on-rails shooter :D
That way they could exactly control the camera and determine which high-resolution textures should be loaded, such that you would never have texture pop-in on low-memory systems.

It could benefit the gameplay if they went all out and produced a 10 hour on-rails shooter.
 
There were some mentions of texture pop-in, but no further word about the circumstances, like whether it was a full HDD install on an X360 or something less capable, or maybe a fast PC. Which is why I'd prefer to hold off on any judgment at least until all versions are released and Digital Foundry can do a proper analysis.

I think I read from Carmack himself that it was hard to eliminate pop-in on HDD-less systems, and then the journalist mentioned that HDD-equipped systems also showed some pop-in, though less. We'll see what the final version does.

I think 4GB is going to be the minimum, considering that 8x the previous generation is pretty much standard, and the time gap between this gen and the previous one is rather large. The only way we are going to see 2GB is if there is some incredibly efficient storage coupled with very fast RAM of which only 2GB is affordable, but the increase in bandwidth is so extreme that it is worth it.

I wouldn't even be surprised if we did get 8GB, considering that we're still 3 years off from the next-gen (!), and considering that it may be harder to get storage to increase in speed sufficiently, so that caching becomes more important. In the case of the PS4, I am curious whether they would go for split memory again. It all depends on where the bandwidth comes from, eventually.
 