Virtual Texture Issues and Limitations (Gen 9/PC)

Yes, of course. I just wanted to point out that expecting a virtual texturing system to provide textures for an entire scene in real time is unrealistic.

I assume you mean you can't expect a virtual texturing system to fetch/generate all needed textures for a single frame instantly. In that case, yes, that goes without saying, and nobody ever thought that to be the case.
 
That's my entire point from the start. I think I explained that in my first post.
 
And it's been rebutted from the start that it does not need to be instant, since the same textures remain visible for multiple frames, and fast movements or camera cuts can be accounted for with larger speculative buffers and ways to hint the system. You are complaining that VT can't do something it never set out to do.
 
Actually no. 'Large speculative buffers' don't work (I also explained that) because you can't expect them to smooth out everything without an intelligent caching system, which I don't think exists for now (or is even possible).
All caching systems face the same problem: you can't predict when you might suddenly need a lot of data that isn't in the cache. In most applications that's fine, because you can just wait for the data to load. In real-time applications you don't have that luxury, which is why hard real-time systems generally don't use caches. Games are not really hard real-time systems, of course, but if these "waits" (which manifest as stutters) happen too frequently, it's unacceptable.
That's why I said that even with VT you'll still want to stream textures based on how your scenes are arranged. A simple way is to do it in grids, streaming all neighboring grids at the same time in case the player enters the next one. With such an arrangement you don't really need VT for that streaming (you'll be loading the entire texture anyway), but VT helps when you need a few parts of textures outside those grids (e.g. faraway objects), where on-demand streaming can cope.
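To make the grid idea concrete, here's a rough C++ sketch of that arrangement (all names hypothetical, not from any engine): keep the player's cell and its 8 neighbors fully streamed, and leave everything beyond the ring to on-demand VT.

```cpp
#include <cmath>
#include <cstdio>
#include <set>
#include <utility>

// Hypothetical grid streamer: full textures stay resident for the player's
// cell and its 8 neighbors; anything further away is left to on-demand VT.
class GridStreamer {
public:
    explicit GridStreamer(float cellSize) : cellSize_(cellSize) {}

    void update(float playerX, float playerZ) {
        int cx = (int)std::floor(playerX / cellSize_);
        int cz = (int)std::floor(playerZ / cellSize_);

        std::set<std::pair<int, int>> wanted;
        for (int dx = -1; dx <= 1; ++dx)
            for (int dz = -1; dz <= 1; ++dz)
                wanted.insert({cx + dx, cz + dz});

        for (const auto& c : resident_)      // evict cells we moved away from
            if (!wanted.count(c)) unloadCellTextures(c);
        for (const auto& c : wanted)         // prefetch cells we might enter
            if (!resident_.count(c)) loadCellTextures(c);
        resident_ = wanted;
    }

private:
    // Stand-ins for the real streaming calls.
    void loadCellTextures(std::pair<int, int> c)   { std::printf("load (%d,%d)\n", c.first, c.second); }
    void unloadCellTextures(std::pair<int, int> c) { std::printf("unload (%d,%d)\n", c.first, c.second); }

    float cellSize_;
    std::set<std::pair<int, int>> resident_;
};

int main() {
    GridStreamer gs(64.0f);
    gs.update(10.0f, 10.0f); // loads (0,0) and its neighbors
    gs.update(70.0f, 10.0f); // crossed into (1,0): one column loads, one unloads
}
```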
 
Just play any VT game. When the cache fails, the game doesn't stutter. Some textures just look blurry for a dozen frames and then the proper mip fades in.
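For what it's worth, that behavior is by construction: on a page miss the lookup doesn't wait, it clamps to the finest mip that's already resident and queues the missing page for the streamer. A rough sketch of that decision (page-table layout assumed, not any particular engine's):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Assumed page table: residentMip[page] holds the finest mip level currently
// cached for that page (larger number = blurrier).
struct VtPageTable {
    std::vector<uint8_t> residentMip;
    uint8_t coarsestMip = 10; // a tiny, always-resident fallback mip
};

struct VtSample { uint8_t mipUsed; bool miss; };

// On a miss we never stall: sample the best mip we *do* have and queue the
// wanted page, which then fades in a few frames later.
VtSample sampleVirtualTexture(const VtPageTable& pt, uint32_t page,
                              uint8_t wantedMip,
                              std::vector<uint32_t>& requestQueue) {
    uint8_t have = pt.residentMip[page];
    if (have <= wantedMip)
        return { wantedMip, false };  // hit: the desired mip is resident
    requestQueue.push_back(page);     // miss: ask the streamer for the page
    return { have, true };            // meanwhile, render the blurrier mip
}

int main() {
    VtPageTable pt;
    pt.residentMip.assign(1024, pt.coarsestMip); // nothing streamed in yet
    std::vector<uint32_t> queue;
    VtSample s = sampleVirtualTexture(pt, 42, /*wantedMip=*/2, queue);
    std::printf("mip used: %d, miss: %d\n", s.mipUsed, s.miss); // blurry, queued
}
```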
 
I have seen that and I think that's even worse, IMHO.

[EDIT] To clarify a bit: if the resolution change is just one mip level, it's probably unnoticeable. Unfortunately, in many cases it's a huge blurry patch for a second or more (probably because the entire texture is not in VRAM). To me this is actually more unrealistic, because you don't tend to see that in real life (outside of putting on a pair of glasses :p ). I prefer games to be consistent, both in frame rate and in quality. That's why I think it's still better to keep most of the textures you know you'll need in VRAM, because that produces a better experience, and use VT to load whatever isn't in VRAM just in case.
 
Or you might have seen high-res textures in a traditional mip-streaming system not loading in quickly enough... 🙂

Could be, but even VT games can't handle the case where a large area (which is more noticeable) is missing the portion of textures required.
 
I don't think previous games are a good example of the limitations of VT, since their results could just as well be a limitation of the hardware. Looking at UE5 now, how well does the virtualisation work there? Not only textures, but also geometry with Nanite. Is it prone to pop-in and LOD issues?
 
VT cache misses took long to resolve in early VT games because they didn't have a conservative enough speculative buffer, and because pages were loaded from an HDD. Larger memory pools solve issue one, and SSDs solve issue two.

Well, that's the main issue. That's why in my first post I speculated that you probably can't handle a data set more than about twice the working size of your VRAM. I don't know the exact number, of course, as it depends on how your game works. At one extreme, a game like Diablo 4 can probably make a much larger data set work, because its fixed camera angle keeps the amount of new data potentially required relatively small. At the other end, in an open-world game with user-generated content (buildables, say; imagine something like Conan Exiles), it's much more difficult to estimate how much data you might need when a player is simply looking around. In that case you might need a much larger cache to avoid texture pop-in or stuttering happening too frequently.

So in a game like Conan Exiles you'll probably still want something like a heuristic caching system. For example, when a player is in the desert biome, you probably don't need jungle-biome textures and can tag them as lower priority. On the other hand, if the player is in the desert biome but close to the jungle biome, you might want to increase the priority of the jungle biome while decreasing the priority of another biome. You might even want to preload some jungle-biome textures when the player is near the zone boundary. These kinds of heuristics can't really be replaced by a completely automatic system.
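As a sketch (biome names, distances, and weights all made up), the heuristic could be as simple as a border-proximity score fed into the streamer's priority queue:

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical heuristic: a biome's streaming priority rises as the player
// approaches its border, so its textures get prefetched before the crossing.
struct Biome {
    const char* name;
    float borderDistance; // player's distance to the nearest border (meters)
};

float streamingPriority(const Biome& b, bool playerInside) {
    // 1.0 at the border, falling off to 0 over ~500m; tune per game.
    float proximity = std::clamp(1.0f - b.borderDistance / 500.0f, 0.0f, 1.0f);
    return proximity + (playerInside ? 1.0f : 0.0f);
}

int main() {
    Biome desert{"desert", 0.0f};    // the player is standing in it
    Biome jungle{"jungle", 80.0f};   // close: start preloading
    Biome tundra{"tundra", 2000.0f}; // far: safe to deprioritize/evict

    std::printf("%s: %.2f\n", desert.name, streamingPriority(desert, true));
    std::printf("%s: %.2f\n", jungle.name, streamingPriority(jungle, false));
    std::printf("%s: %.2f\n", tundra.name, streamingPriority(tundra, false));
}
```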
 
Immortals of Aveum devs had issues with the VT texture pool in UE5. VT is required if you're using Nanite.

"Nanite does a good job of using the memory it has available, but the exception to that is that virtual texture pools in UE cannot be resized - they have to be initialised at engine startup and cannot be touched again, [which provides] fully allocated contiguous memory which is wonderful from a performance standpoint but [you can have problems where, for example] there's a goblet way off in the distance, two pixels, and it needs one piece of one texture [from a 500MB pool allocation], and you don't have any of that back until the texture goes away. PC doesn't care [if you run out of memory]; worst case, it goes into virtual memory. Console goes "I don't have virtual memory, I'm done." And it won't crash, but it will cause substantial issues. This caused what was internally known as the infamous landscape bug, where you would just walk into certain parts of the game and it would [look] like someone painted an anime landscape on the ground, because it couldn't allocate for the virtual texture pool."

 
🤷‍♂️ Don't understand what they want here. If your environments can't use the whole pool, you should add content to your environments. If none of your environments need the whole pool, you should make the static pool smaller. This case where you sometimes want 500MB and sometimes want 1GB for textures is silly... You should budget all of your other systems so that you always have enough headroom for the maximum amount of textures you might need. /rant over.
 
🤷‍♂️ Don't understand what they want here.

I also found that very confusing. But what I think they meant is that sometimes the engine fills the entire pool with what is present in a scene, and once a lot of new content becomes visible it can't evict old stuff fast enough, or it is bad at prioritizing what to evict and what to clear more space for, so new assets might stay at low texture res for too long.

You should budget all of your other systems so that you always have enough headroom for the maximum amount of textures you might need. /rant over.

Perhaps that's exactly what they want to do, but they don't know how to. Either the engine itself does not provide enough tools to achieve those goals yet, or the devs haven't actually figured it out...

All guesswork though.
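To make that guess a bit more concrete: if the pool is full, everything comes down to how the victim for eviction is picked. A hypothetical scoring sketch, just to show where this can degenerate:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical metadata for tiles in a full, fixed-size pool.
struct Tile {
    uint32_t lastRequestedFrame; // from the GPU feedback buffer
    uint8_t  mip;                // 0 = finest; coarse mips cover more screen
};

// Pick an eviction victim: prefer tiles not requested recently, and among
// those, prefer fine mips (losing a coarse mip hurts a larger area).
int pickVictim(const std::vector<Tile>& tiles, uint32_t currentFrame) {
    int victim = -1;
    uint64_t best = 0;
    for (int i = 0; i < (int)tiles.size(); ++i) {
        uint64_t age = currentFrame - tiles[i].lastRequestedFrame;
        uint64_t score = age * 16 + (15 - tiles[i].mip); // age dominates
        if (score > best) { best = score; victim = i; }
    }
    // If everything was requested this frame, ages are all zero and the
    // choice degenerates to "evict the finest mips": exactly the kind of
    // churn that could leave new assets stuck at low res for too long.
    return victim;
}

int main() {
    std::vector<Tile> pool = { {100, 3}, {90, 0}, {100, 0} };
    std::printf("victim: %d\n", pickVictim(pool, 100)); // index 1: old and fine
}
```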
 
What they want to do is budget only what's needed, when it's needed. If 95% of the time you only need a 500MB pool, but 5% of the time you need 700MB, then that's 200MB wasted 95% of the time. If it were random, then who cares; you'd need that 200MB all the time. But if it's predictable, as it sounds like here, say from level to level, then most levels have 200MB wasted on an oversized texture pool. That just sucks; if you could re-assign texture pool size on demand you'd be able to fix this.

It'd be understandable if it couldn't be "runtime" runtime; maybe you'd need a load screen or something. It's a feature that some devs could obviously make use of.
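In other words, something like this at the load screen. Pure sketch with made-up types, since (per the quote above) UE's pools can't currently be touched after startup:

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Hypothetical: a VT pool that can only be (re)created while no rendering is
// in flight, e.g. behind a load screen, sized from per-level metadata.
class VirtualTexturePool {
public:
    explicit VirtualTexturePool(std::size_t bytes) : storage_(bytes) {
        std::printf("allocated %zu MB contiguous pool\n", bytes >> 20);
    }
private:
    std::vector<std::byte> storage_; // one contiguous, fixed-size allocation
};

struct LevelInfo { const char* name; std::size_t texturePoolBytes; };

int main() {
    LevelInfo levels[] = {
        {"hub",       std::size_t{500} << 20}, // 500MB is plenty here
        {"landscape", std::size_t{700} << 20}, // the 5% case that needs more
    };
    for (const auto& lvl : levels) {
        // The old pool is destroyed and a right-sized one created per level:
        // the "re-assign on demand at a load screen" being asked for.
        VirtualTexturePool pool(lvl.texturePoolBytes);
        std::printf("playing %s\n", lvl.name);
    }
}
```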
 
That just sucks; if you could re-assign texture pool size on demand you'd be able to fix this.
“I have too much texture memory for my art team to use” sounds like the best problem ever; ask them what they want to use it for. Nobody should waste dev cycles on this. Go save some CPU perf (or gameplay-code memory!) instead.

Edit: Maybe too blithe -- let me explain a few cases where you might(?) want(?) a variable texture pool and why I don't think they make sense.

First, a clarification: this is a pool of packed, uniform memory. It's allocated up front. It does not suffer from fragmentation. It is the total amount of texture data in memory. Textures need to be in memory to display them, so moving the pool size up or down changes your effective (max) texture resolution*.
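Concretely, "packed, uniform, allocated up front" means a slot-per-tile allocator, roughly like this sketch (not any engine's actual code):

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// Every slot holds one fixed-size tile (e.g. 128x128 texels), so allocation
// is popping an index off a free list: O(1), and fragmentation can't happen.
class TilePool {
public:
    explicit TilePool(uint32_t tileCount) {
        free_.reserve(tileCount);
        for (uint32_t i = 0; i < tileCount; ++i) free_.push_back(i);
    }
    // Fails only when the pool is genuinely full. There is no "enough total
    // bytes but too fragmented" failure mode, and no resizing after startup.
    std::optional<uint32_t> allocate() {
        if (free_.empty()) return std::nullopt;
        uint32_t slot = free_.back();
        free_.pop_back();
        return slot;
    }
    void release(uint32_t slot) { free_.push_back(slot); }
private:
    std::vector<uint32_t> free_;
};

int main() {
    TilePool pool(2);
    auto a = pool.allocate();
    auto b = pool.allocate();
    if (!pool.allocate()) { /* full: the only way allocation can fail */ }
    pool.release(*a); // the slot is immediately reusable, no compaction needed
    (void)b;
}
```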

Here are some scenarios:

1. I have too few textures in some scenes, but not others

Add more textures to the scenes that don't have as many!

2. My textures are too small in some scenes to fill up the bigger pool

Make them higher res!

3. It's not the number of textures that's the problem -- I have too much memory in some scenes and not others. In scene A, an 800MB texture pool will fit just fine, but in scene B, an 800MB texture pool will crash the game.

This sets off horrible alarm bells for me! For one: why is it OK for players to load into scene B and have all of the textures suddenly be lower res*? What on earth are you using the memory for if not art? Why is crunching the art budget (sometimes) your first choice instead of optimizing the memory usage of whatever is unique about this scene? For two, this sounds like a big footgun opportunity -- what if, later in development, whatever feature is pushing you over the memory budget gets added to other scenes, and you have to re-budget all the art around the smaller texture pool? Don't do this! Pick the budget for textures, Nanite meshes, etc., have your art team fill it (but not exceed it), and fit the rest of your game around it.


*Really this is experienced as more texture pop-in, more temporarily blurry textures as you turn the camera, etc.
 
Virtual texturing normally keeps a tile cache in VRAM, doesn't it? You don't always have to hit the PCIe bus. The cache can keep tiles, so if your view changes, the texels you need may already be in the cache.
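Yes; conceptually it's a residency check sitting in front of the bus. A minimal LRU sketch (all names hypothetical): hits are served from VRAM, and only misses trigger an upload over PCIe.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>

// Hypothetical VRAM-resident tile cache: hits never touch PCIe; only misses
// kick off an upload from system memory or disk.
class TileCache {
public:
    explicit TileCache(std::size_t capacity) : capacity_(capacity) {}

    bool isResident(uint64_t tileKey) {
        auto it = map_.find(tileKey);
        if (it == map_.end()) return false;          // miss: fetch over the bus
        lru_.splice(lru_.begin(), lru_, it->second); // hit: refresh LRU, no bus
        return true;
    }

    void insert(uint64_t tileKey) {                  // called after the upload
        if (map_.size() >= capacity_) {              // evict least recently used
            map_.erase(lru_.back());
            lru_.pop_back();
        }
        lru_.push_front(tileKey);
        map_[tileKey] = lru_.begin();
    }

private:
    std::size_t capacity_;
    std::list<uint64_t> lru_;
    std::unordered_map<uint64_t, std::list<uint64_t>::iterator> map_;
};

int main() {
    TileCache cache(2);
    cache.insert(1);
    cache.insert(2);
    std::printf("hit: %d\n", cache.isResident(1)); // view changed back: no fetch
    cache.insert(3);                               // evicts tile 2, not tile 1
}
```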
 
I don't know if anyone has caught on yet, but virtual texturing on PC effectively makes the driver toggles for texture filtering obsolete, since indirect texturing methods traditionally don't work with hardware filtering. Games that implement virtual texturing often only support a fixed window of texture filtering settings, so going to higher-quality filtering modes (8x/16x AF) than officially supported is never really an option, since drivers can't figure out a clever enough workaround to the potential issues. Hardware implementations of virtual texturing (tiled/sparse resources) are compatible with hardware filtering, and they could potentially even be used to implement virtual shadow maps, but most implementations are bad enough to avoid usage ...

Artists want graphics programmers to implement virtual texturing, but they now have to pick between losing hardware texture filtering/derivatives or the hot garbage that is hardware virtual texturing ...
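For context, the usual software-VT compromise is to pad each cached tile with a gutter of texels copied from its neighbors, which buys back bilinear (and a little aniso) from the hardware inside a tile. The gutter has to cover the filter footprint, and a 16x AF footprint is wide, which is presumably why those modes simply aren't offered. A toy sketch of the storage cost (tile and border sizes illustrative):

```cpp
#include <cstdio>

// Software VT pads each cached tile with border texels copied from its
// neighbors so the hardware filter never samples a wrong texel. The border
// must cover the filter's footprint radius.
constexpr int kTileSize = 128;

int paddedTileSize(int filterRadiusTexels) {
    return kTileSize + 2 * filterRadiusTexels;
}

int main() {
    std::printf("bilinear:  %d\n", paddedTileSize(1)); // 130x130 stored tile
    std::printf("trilinear: %d\n", paddedTileSize(2));
    // A 16x anisotropic footprint can span many texels along its major axis;
    // the gutter (and wasted storage) grows fast, so engines cap it small and
    // cap the exposed filtering modes with it.
    std::printf("wide AF:   %d\n", paddedTileSize(8)); // 144x144: ~27% overhead
}
```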
 
That's presumably the root of a lot of crappy texture filtering across PS5 titles?
 