I'm sure it's because the game preloads; from what I remember of the performance reviews, it works fine on lower-VRAM GPUs.
Those performance reviews aren't always definitive. I feel people have oversimplified VRAM-related testing and might not realize how complex actually determining the requirement can be. It's not as simple as 10+ years ago (or maybe even longer now), when it was universally just a matter of looking for stutters.
A lot of modern engines now rely heavily on being very dynamic about what data gets streamed in and what LOD is used. This complicates VRAM testing: unless you have some way to directly profile, or can compare two setups side by side, you can't actually know for sure whether the engine is compensating for VRAM limitations by using lower LODs, streaming textures in later, or just plain using lower-resolution textures.
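To illustrate the kind of compensation I mean, here's a rough sketch of how a streamer might quietly demote texture mips to fit a VRAM budget instead of ever reporting a failure. This isn't any real engine's code; the struct, names, and budget-fitting heuristic are all invented for illustration:

```cpp
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <vector>

struct Texture {
    const char* name;
    uint64_t    bytesAtMip0;   // full-resolution footprint
    int         residentMip;   // 0 = full res; each +1 roughly quarters the footprint
    uint64_t    lastUsedFrame; // for LRU ordering
};

uint64_t footprint(const Texture& t) {
    return t.bytesAtMip0 >> (2 * t.residentMip); // each mip level is ~1/4 the size
}

uint64_t totalFootprint(const std::vector<Texture>& ts) {
    uint64_t sum = 0;
    for (const auto& t : ts) sum += footprint(t);
    return sum;
}

void fitToBudget(std::vector<Texture>& ts, uint64_t budget) {
    // Demote least-recently-used textures first until everything fits.
    std::sort(ts.begin(), ts.end(), [](const Texture& a, const Texture& b) {
        return a.lastUsedFrame < b.lastUsedFrame;
    });
    for (auto& t : ts) {
        while (totalFootprint(ts) > budget && t.residentMip < 4)
            ++t.residentMip; // silently trade quality for memory
        if (totalFootprint(ts) <= budget) break;
    }
}

int main() {
    std::vector<Texture> scene = {
        {"terrain",   512ull << 20, 0, 100},
        {"character", 256ull << 20, 0, 105},
        {"skybox",    128ull << 20, 0,  90},
    };
    fitToBudget(scene, 600ull << 20); // pretend the texture budget is only 600 MB
    for (const auto& t : scene)
        std::cout << t.name << ": mip " << t.residentMip
                  << " (" << (footprint(t) >> 20) << " MB)\n";
}
```

From the outside, all you'd see is slightly blurrier textures, which is exactly why fps-only comparisons can miss it.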
Some games will also let you select graphics settings in the menu but not actually apply them, enforcing certain limits in the background instead. It's a psychological thing: developers know users don't like feeling they can't "max" settings.
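A hypothetical sketch of that pattern; the enum, the 6GB threshold, and the clamp are all invented for illustration, and I'm not claiming any specific game does exactly this:

```cpp
#include <cstdint>
#include <iostream>

enum class TexQuality { Low, Medium, High, Ultra };

TexQuality effectiveQuality(TexQuality requested, uint64_t vramBytes) {
    // The menu happily accepts Ultra, but under 6 GB the renderer quietly caps at High.
    if (requested == TexQuality::Ultra && vramBytes < (6ull << 30))
        return TexQuality::High;
    return requested;
}

int main() {
    TexQuality menuSetting = TexQuality::Ultra; // what the user sees and believes
    uint64_t vram = 4ull << 30;                 // detected 4 GB card
    TexQuality applied = effectiveQuality(menuSetting, vram);
    std::cout << "menu says Ultra, renderer uses "
              << (applied == TexQuality::Ultra ? "Ultra" : "High (clamped)") << "\n";
}
```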
Another issue is that the way VRAM ends up being managed and used means testing for thoroughness is tricky, and testing for consistency may not reflect actual play. Memory usage can vary greatly depending on where you are in the game and what is happening, and on the management side, how long you've been playing is also a factor.
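So if you wanted to test this properly, you'd want something that samples usage across a whole session rather than taking one fresh-load snapshot. A minimal sketch, assuming some platform-specific query exists; the stand-in function below just fakes usage creeping up over time:

```cpp
#include <algorithm>
#include <chrono>
#include <cstdint>
#include <iostream>
#include <thread>
#include <vector>

// Hypothetical stand-in: a real harness would call a DXGI or NVML query here.
// This fake just pretends usage creeps up the longer you play.
uint64_t queryVramUsageBytes() {
    static uint64_t fake = 3000ull << 20;
    fake += 16ull << 20;
    return fake;
}

int main() {
    std::vector<uint64_t> samples;
    uint64_t peak = 0;
    for (int s = 0; s < 60; ++s) { // sample once per second over the session
        uint64_t used = queryVramUsageBytes();
        samples.push_back(used);
        peak = std::max(peak, used);
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    // The peak and the timeline matter more than any single snapshot.
    std::cout << "peak over session: " << (peak >> 20) << " MB across "
              << samples.size() << " samples\n";
}
```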
I recall, for example, at the onset of the last console generation, one of the very early games, Shadow of Mordor, recommended 6GB for max texture settings even at 1080p. User reports on this ended up very mixed, with some people reporting that 3GB was fine while others said it wasn't. Why the discrepancy? Depending on how you were playing, where you were playing, and what your threshold of tolerance was, you'd have a different answer.
So say with an open-world game, if only a small subset of areas stutter briefly due to VRAM during fast movement/pans, especially during environment transitions, is there enough VRAM? What if it's otherwise fine in 90%+ of gameplay scenarios? 95%? Does consistently having to wait a few extra seconds for textures to stream in count as not enough VRAM? There's no simple pass/fail answer here.
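One way to make that judgment at least measurable is to look at the frame-time distribution over a long capture instead of average fps. A rough sketch; the 25 ms spike threshold and the toy data are arbitrary assumptions on my part, not any accepted standard:

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

// Nearest-rank percentile over a copy of the samples.
double percentile(std::vector<double> ms, double p) {
    std::sort(ms.begin(), ms.end());
    size_t idx = static_cast<size_t>(p * (ms.size() - 1));
    return ms[idx];
}

int main() {
    // Pretend this came from a long capture spanning a few area transitions.
    std::vector<double> frameMs = {16.6, 16.8, 17.0, 16.5, 48.0, 16.7,
                                   16.6, 16.9, 52.3, 16.4, 16.7, 16.8};
    double p99 = percentile(frameMs, 0.99);
    size_t spikes = std::count_if(frameMs.begin(), frameMs.end(),
                                  [](double ms) { return ms > 25.0; });
    std::cout << "99th percentile: " << p99 << " ms; frames over 25 ms: "
              << (100.0 * spikes / frameMs.size()) << "%\n";
    // Whether that spike rate is acceptable is exactly the judgment call
    // described above: the numbers don't answer it for you.
}
```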
I'm not saying that's the case here, but actually deep-diving into this issue would be much more involved than running a single scene (or even a couple) on fresh loads and comparing fps numbers, which is what testing typically amounts to these days.
Then there's another factor here: background applications, and how newer WDDM versions on Win10+ manage video memory compared to the Win7-and-older era.
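For what it's worth, on Win10+ you can actually see the OS-managed budget WDDM 2.0 gives your process via DXGI, and watch it shrink as background applications take video memory. A minimal Windows-only sketch with error handling omitted for brevity:

```cpp
#include <dxgi1_4.h>
#include <iostream>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(__uuidof(IDXGIFactory4), reinterpret_cast<void**>(&factory));

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);

    IDXGIAdapter3* adapter3 = nullptr;
    adapter->QueryInterface(__uuidof(IDXGIAdapter3), reinterpret_cast<void**>(&adapter3));

    // Budget is what the OS currently lets this process use; it moves around
    // as other applications demand video memory.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::cout << "Budget: " << (info.Budget >> 20) << " MB, in use: "
              << (info.CurrentUsage >> 20) << " MB\n";

    adapter3->Release();
    adapter->Release();
    factory->Release();
}
```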