Hogwarts Legacy [PC Details]

So with Forspoken and this.... 8GB cards are basically doomed for new games?
I just don't understand why this should be the case. 8GB and 11/12GB cards should still be ample for this generation. It's insane to me that there can be this massive a discrepancy, assuming players use reasonable settings for their hardware.

We need DirectStorage with GPU decompression in games, stat.
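For anyone curious what that buys: with DirectStorage 1.1, a read request can carry GDeflate-compressed bytes straight to a D3D12 buffer and have the GPU inflate them on arrival, so the compressed copy never expands in system RAM. A minimal sketch (requires the DirectStorage SDK; all HRESULT checks omitted, and LoadCompressedAsset plus its size parameters are made up for illustration):

```cpp
// Minimal sketch of a DirectStorage 1.1 request with GPU decompression.
// Needs dstorage.h / dstorage.lib from the DirectStorage SDK.
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadCompressedAsset(ID3D12Device* device, const wchar_t* path,
                         ID3D12Resource* destBuffer,
                         UINT32 compressedSize, UINT32 uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(path, IID_PPV_ARGS(&file));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = compressedSize;    // bytes on disk
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;
    request.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();   // the GPU inflates the data on arrival; the CPU never sees it
}
```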
 
According to the TPU review, the Intel Arc A770 16GB is doing relatively well, faster than the 3080 10GB in RT without any image upscaling. It can be over 60 FPS with XeSS at 1440p.

Only with 16 GB are you really worry-free

Hogwarts Legacy requires a lot of graphics memory for maximum texture detail. Specifically, it is the first game in the editorial team's testing that only runs without problems in all situations with 16 GB of VRAM or more. Even with 12 GB you have to accept limitations. For example, the GeForce RTX 4070 Ti has no performance problems at Ultra HD with DLSS Quality and otherwise maximum graphics quality, but textures fail to load on some objects, and others briefly disappear entirely, as the engine is apparently trying to salvage the situation somehow. Without ray tracing, 12 GB is also fine. 10 GB is no longer enough for a sufficient frame rate at Ultra HD with DLSS, even without ray tracing. If the graphics card has only 8 GB, there are already clear signs of failure at WQHD with DLSS, even without ray tracing, and even at Full HD the odd texture goes missing during play. Ultimately, 8 GB and maximum textures are not a workable combination.
*Computerbase
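As a side note, an engine can read the OS-reported VRAM budget at startup and pick a texture preset from it rather than letting users overshoot. A minimal sketch using DXGI's QueryVideoMemoryInfo; the GB thresholds are illustrative only, loosely mirroring the 16/12/10/8 GB tiers Computerbase describes:

```cpp
// Minimal sketch: pick a texture preset from the OS-reported VRAM budget.
// Link against dxgi.lib; HRESULT checks omitted for brevity.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);   // adapter 0 = primary GPU

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);                 // IDXGIAdapter3 adds the memory queries

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Illustrative thresholds only; a real engine would also track CurrentUsage
    // and react to budget changes at runtime.
    const char* preset = info.Budget >= 15e9 ? "Ultra"
                       : info.Budget >= 11e9 ? "High"
                       : info.Budget >=  9e9 ? "Medium" : "Low";
    printf("VRAM budget %.1f GB -> texture preset: %s\n", info.Budget / 1e9, preset);
}
```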
 
Obviously a bit of a clickbait title there, but there should be no surprise that games built around PS5/XSX would see increased VRAM demands for the PC versions, especially without any proper DirectStorage implementation.

This is where the SSDs in the consoles become game changers. Their whole point was the ability to throw more at the immediate scene without needing a lot more RAM. Without that same ability on PC, you ARE gonna need a lot more RAM to keep up. DirectStorage was never gonna be just this optional thing in some games for quick load times. It needs to become the new paradigm around which games are built, and yes, that will mean making SSDs a minimum requirement for games.

Now I know this game is technically cross-gen, but the next-gen version releasing first (and with no footage of any other versions) suggests pretty strongly that PS5/XSX were the target machines.
 
Has reducing texture settings stopped being a thing now?

8GB may no longer be enough for max texture quality (and none of us should be upset about that), but if a game doesn't allow you to reduce texture settings in an elegant fashion to be completely playable on an 8GB GPU, then that's a serious issue with the game, not the GPU.
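For the record, one texture-quality notch typically just skips the top mip, and since each mip halves both dimensions, that shrinks a texture's footprint by roughly 4x. A toy sketch of the arithmetic, assuming BC7 compression (16 bytes per 4x4 block, so about 1 byte per texel); MipChainBytes is a made-up helper:

```cpp
// Toy calculation: VRAM cost of a BC7 mip chain. Real engines also account
// for alignment, padding and block rounding.
#include <algorithm>
#include <cstdint>
#include <cstdio>

uint64_t MipChainBytes(uint32_t width, uint32_t height, uint32_t skipMips)
{
    width  = std::max(1u, width  >> skipMips);  // drop the largest mips
    height = std::max(1u, height >> skipMips);
    uint64_t total = 0;
    while (width > 1 || height > 1) {
        total += uint64_t(width) * height;      // ~1 byte per texel for BC7
        width  = std::max(1u, width  >> 1);
        height = std::max(1u, height >> 1);
    }
    return total + 1;                           // the 1x1 tail mip
}

int main()
{
    // Skipping one mip of a 4096x4096 texture cuts its footprint ~4x:
    printf("full chain: %llu MiB\n",
           (unsigned long long)(MipChainBytes(4096, 4096, 0) >> 20)); // ~21 MiB
    printf("skip 1 mip: %llu MiB\n",
           (unsigned long long)(MipChainBytes(4096, 4096, 1) >> 20)); // ~5 MiB
}
```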
 
Has reducing texture settings stopped being a thing now?

8GB may no longer be enough for max texture quality (and none of us should be upset about that), but if a game doesn't allow you to reduce texture settings in an elegant fashion to be completely playable on an 8GB GPU, then that's a serious issue with the game, not the GPU.
I agree, but I see the most problems with the game falling to 20-30 fps for multiple seconds, even with the texture setting at low.
 
More Hogwarts performance analysis from Computerbase: at 4K, the 7900 XTX is 5% faster than the RTX 4080 without ray tracing, but with it the 4080 is 50% faster.

DLSS Q + Frame Generation has lower PC latency than both DLSS Q alone and plain native rendering. Frame Generation alone adds about 15 ms of latency on top of Reflex.

[Screenshot: PC latency comparison]
 
I have 32 GB of DDR5, a 12 GB 3080 and an i7-12700K, and I am glad I bought this on PS5. Mostly because I want to play from the sofa on the big TV and not at a desk, but the performance on some PC hardware is just weird.

Come on, let's not exaggerate. If you don't care to plug your PC into the TV then that's fair enough, but you are not going to get a better visual experience on the PS5 than on that system, by quite some margin.

Whenever I state high/medium settings, textures are also high/medium accordingly. Ray tracing is a no-no, I don't even want to try. Not that I feel entitled to it; I fully agree, or rather accept, that ray tracing would, should or could require more than 16 GB RAM.

However, the game recommends 16 GB RAM (not minimum, recommended) for 1080p/high. The game runs fairly stably on a 300-buck machine with only an 8 GB budget. Split pools be damned, I've never seen any PS4 Pro/One X port "hard" require 16 GB. Just the other day I was testing the likes of GoW and Spider-Man with 8 gigs of RAM, and they ran MUCH smoother than this game with that limited budget.

The fact that the game still uses 10+ GB of raw RAM at 810p with the low texture setting is concerning.

Comparing total system RAM + VRAM usage on PC to total memory usage on console is never going to make sense. PCs will always use much more total memory than consoles.

TPU's analysis of Hogwarts: the game needs 12GB of VRAM for rasterization, and 15GB for ray tracing at 4K. Rasterization runs reasonably well, but ray tracing is different: a 2080 Ti is faster than the 7900 XTX at every resolution, even 1080p!


[TPU charts: 4K performance, rasterization and ray tracing]

4K with RT and without upscaling is unplayable on any GPU, arguably even the 4090, so I don't think there's any concern there for the 12GB GPUs. They should absolutely be playing with upscaling engaged. Even without RT you can see 12GB having a negative impact there, but again, native 4K performance these days in games that support upscaling really isn't something people should be concerned with.
 
According to the TPU review, the Intel Arc A770 16GB is doing relatively well, faster than the 3080 10GB in RT without any image upscaling. It can be over 60 FPS with XeSS at 1440p.
Apparently this was a bug in their testing: they tested the A770 with low RT settings. When tested at max RT settings, it's slower than the 3070, but faster than all AMD GPUs.


[TPU chart: 1440p ray tracing performance]


 
Yeah, first it was Doom Eternal, then Godfall, then Far Cry 6, then Forspoken and now Hogwarts. 8GB cards are done for.
Doom Eternal was a non-issue, because texture quality is the same even at low.
Godfall and Far Cry 6 can also be tweaked accordingly without significant compromises.

Forspoken is the first game where 8 GB is actually not enough to load textures at all. That is an entirely different matter. And Hogwarts Legacy stutters like crazy. Even when reducing settings, you cannot have a good experience.

Forspoken and Hogwarts Legacy are the first games where VRAM is truly an issue.
 
Btw, DirectStorage should alleviate the VRAM issue, right?

But it didn't help Forspoken at all.

Does Hogwash use DirectStorage or not?
 
Btw, DirectStorage should alleviate the VRAM issue, right?

But it didn't help Forspoken at all.

Does Hogwash use DirectStorage or not?

By using DirectStorage you'll probably have a bit less pressure on PCIe bandwidth, because you are transferring compressed data rather than decompressed data.
However, you still need something like a streaming or semi-virtual memory system for it to actually reduce the VRAM size requirement. And I guess neither Forspoken nor Hogwash has one.
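On the "semi-virtual memory" point: D3D12 does expose the building block for this in reserved (tiled) resources, where a large texture exists only as address space and 64 KB physical tiles are mapped in from a heap on demand, so resident VRAM tracks what is actually visible rather than what exists. A minimal sketch; MapVisibleTile and its parameters are made up for illustration, and device/queue setup plus error handling are omitted:

```cpp
// Sketch: bind one physical 64 KB tile from a heap ("tile pool") to a spot in
// a reserved texture's virtual address space. Unmapping later frees the tile
// for reuse by other textures.
#include <d3d12.h>

void MapVisibleTile(ID3D12CommandQueue* queue, ID3D12Resource* reservedTex,
                    ID3D12Heap* tilePool, UINT tileX, UINT tileY, UINT mip,
                    UINT heapTileOffset)
{
    D3D12_TILED_RESOURCE_COORDINATE coord{};
    coord.X = tileX; coord.Y = tileY; coord.Subresource = mip;

    D3D12_TILE_REGION_SIZE region{};
    region.NumTiles = 1;

    D3D12_TILE_RANGE_FLAGS flags = D3D12_TILE_RANGE_FLAG_NONE;
    UINT rangeTileCount = 1;

    queue->UpdateTileMappings(reservedTex, 1, &coord, &region,
                              tilePool, 1, &flags, &heapTileOffset,
                              &rangeTileCount, D3D12_TILE_MAPPING_FLAG_NONE);
}
```

The texture itself would come from ID3D12Device::CreateReservedResource, which allocates no memory up front; a streamer then maps/unmaps tiles as the camera moves.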
 
No idea where everyone's getting their stutters. I've gotten a few entering a new area (yay, shader compiling), but they're brief and really not bad. There's definitely some visible loading though; I do wonder if the all-black sky I saw once while walking outside was loading or shader compilation.
 
If there's a lot of unique texturing compared to normal titles then it's going to use a lot more VRAM.
 