I wouldn't think so, not in this context at least for most cases. In most games the texture setting is independent of the resolution setting.
DLSS (or any upscaling method) will alleviate the VRAM cost of resolution increases (although I'm not sure anyone's looked into how much VRAM DLSS itself uses?) - presumably if it's upscaling from 1440p to 4K, VRAM usage will land closer to 1440p than native 4K. But my understanding is that the resolution delta itself generally isn't the largest contributor to VRAM usage; game settings are (with the bulk this gen being textures). For instance, this Shadow of the Tomb Raider example (at least in terms of allocation) -
https://www.overclock3d.net/reviews/software/shadow_of_the_tomb_raider_pc_performance_review/13
Going from 1080p to 4K is roughly a 1GB difference in allocation, while going from lowest to highest settings is about a 3GB difference.
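As a rough back-of-the-envelope sketch of why the resolution side tends to be the smaller piece: assume something like 10 full-resolution render targets (G-buffer, depth, HDR and post-process buffers) at an average of 8 bytes per pixel. Those counts are my own illustrative assumptions, not measurements from Shadow of the Tomb Raider or any specific game.

```python
# Rough estimate of how much VRAM scales directly with render resolution.
# Assumes ~10 full-resolution render targets at ~8 bytes per pixel on
# average -- illustrative numbers, not measurements from any game.

def render_target_mb(width, height, targets=10, bytes_per_pixel=8):
    return width * height * targets * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB in resolution-dependent buffers")

# Prints roughly: 1080p ~158 MB, 1440p ~281 MB, 4K ~633 MB.
# Even a generous estimate puts the pure render-target delta from 1080p to 4K
# at a few hundred MB, while texture pools can swing by multiple GB between presets.
```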
In terms of the texture setting itself, I think some users will feel the need to max it regardless (or even go beyond with add-on downloads or mods). I don't think most users really weigh how noticeable higher texture settings actually are relative to the resolution they play at. If the odd game starts offering 8K texture assets, users will still try to max that setting even if they're playing at 1080p, and those users will then conclude that 10GB (which I think would struggle with native 8K texture assets) maybe isn't enough for 1080p.
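To put rough numbers on why high-res texture assets eat VRAM no matter what resolution you actually render at, here's a quick sketch. The compression and mipmap figures are standard (RGBA8 at 4 bytes/pixel, BC7 block compression at 1 byte/pixel, ~1.33x for the mip chain); the point about a material stacking several such maps is my own illustrative assumption.

```python
# Rough VRAM cost of a single texture at various asset resolutions,
# independent of the resolution you render the game at.

def texture_mb(size, bytes_per_pixel, mipmaps=True):
    base = size * size * bytes_per_pixel
    return base * (4 / 3 if mipmaps else 1) / (1024 ** 2)

# Uncompressed RGBA8 (4 bytes/pixel) vs. BC7 block compression (1 byte/pixel)
for size in (2048, 4096, 8192):
    print(f"{size}x{size}: "
          f"uncompressed ~{texture_mb(size, 4):.0f} MB, "
          f"BC7 ~{texture_mb(size, 1):.0f} MB")

# An 8192x8192 texture is ~341 MB uncompressed or ~85 MB with BC7 (mips included).
# A material typically stacks several such maps (albedo, normal, roughness, etc.),
# so even a handful of 8K materials resident at once adds up quickly on a 10GB card.
```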
Which of course comes back to it being difficult to answer generically whether 10GB will be enough, even if you could forecast perfectly. Users who are flexible, who don't think you need to max everything and/or can deal with the occasional stutter in the occasional game, will feel differently from those who can't accept those "compromises," with everyone else in between having their own opinion.