Hogwarts Legacy [PC Details]

Yeah... this game doesn't run too hot on 8 GB either. At native 1440p, my 3070 averages around 80 FPS at high settings, but I frequently run into weird VRAM-bound situations where the framerate tanks to 20-25. Restarting the game in the same area fixes it.

Why don't these devs provide a VRAM consumption bar? It's impossible to know which settings affect what. Do they not have a proper VRAM management system? How hard can it be, really?
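In the meantime you can fake that VRAM bar yourself and watch consumption while toggling settings. A minimal sketch using nvidia-smi's query interface (assumes an NVIDIA card with nvidia-smi on PATH; the parsing is split into its own function so it can be sanity-checked without a GPU):

```python
# Poll VRAM use while alt-tabbing through the game's settings menu.
# Assumes an NVIDIA GPU and nvidia-smi available on PATH.
import subprocess

def parse_vram_csv(csv_text: str) -> tuple[int, int]:
    """Parse 'memory.used, memory.total' CSV (noheader, nounits) into MiB."""
    used, total = (int(x.strip()) for x in csv_text.strip().split(","))
    return used, total

def query_vram() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram_csv(out.splitlines()[0])

# Usage (on a machine with an NVIDIA GPU):
#   used, total = query_vram()
#   print(f"VRAM: {used} / {total} MiB")
```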

8 GB was fine in Hogwarts, and then Hogsmeade comes along and destroys it. How the hell am I supposed to anticipate this?

Out of curiosity, what are your texture settings, is RT on, and how much system RAM do you have?
 
Whenever I say high/medium settings, textures are also high/medium accordingly. Ray tracing is a no-no; I don't even want to try. Not that I feel entitled to it: I fully agree, or rather accept, that "ray tracing" would, should, or could require more than 16 GB of RAM.

However, the game recommends 16 GB of RAM (not as a minimum, as the recommendation) for 1080p/high. The game runs fairly stably on a 300-buck machine with only an 8 GB budget. Split pools be damned, I never see any PS4 Pro/One X port "hard" requiring 16 GB. Just the other day I was testing the likes of GoW and Spider-Man with 8 gigs of RAM, and they ran MUCH smoother than this game on that limited budget.

The fact that the game still uses 10+ GB of raw RAM at an 810p/low texture setting is concerning.
 
I'm simply stuck:

Can't play at 1440p and above: VRAM overflows and causes extreme frametime instability.
Have to play at 1080p with MEDIUM settings to get proper frametime stability. Even using the "high" preset at 1080p causes huge frametime instability.

And with a 60 FPS lock at the medium preset, the 3070 hovers around 30-35% usage.
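For reference, back-of-envelope math on what that usage number implies (assuming, simplistically, that GPU busy % scales linearly with GPU frame time):

```python
# What does 30-35% GPU usage at a 60 FPS lock imply?
# Assumption (mine): GPU busy % maps roughly linearly to GPU frame time.
FRAME_BUDGET_MS = 1000 / 60           # ~16.7 ms per frame at 60 FPS

for usage in (0.30, 0.35):
    gpu_ms = FRAME_BUDGET_MS * usage  # approx. GPU time actually spent
    uncapped = 1000 / gpu_ms          # FPS if the GPU alone set the pace
    print(f"{usage:.0%} usage -> ~{gpu_ms:.1f} ms GPU time, "
          f"~{uncapped:.0f} FPS GPU-bound ceiling")
```

In other words, under that rough model the GPU is spending about 5-6 ms of a 16.7 ms budget and could in theory push 170-200 FPS on its own.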

Can't target 120 FPS because the CPU sucks; it only goes up to 70-75 FPS on average.

Practically, I have no way to put the 3070's core power to use if I'm simply content with 60 FPS. I'm open to suggestions. Which settings would add some juice to the GPU without affecting VRAM? I really feel like I have a glorified GTX 1070 on my hands.
You never should have purchased a 3070. 8 GB on that class of GPU was always going to run into real issues. Game optimization gets sloppier, not better, as time progresses.
 
This is strange. I saw a streamer run it on a GTX 1650 + 10400KF + 16 GB of RAM at medium settings, 1080p, and 30+ FPS without much trouble.
 
Were they in Hogwarts? The tutorial area is pretty light on the hardware. It fooled me and my friends initially too, sadly. We were so happy to see great performance in the tutorial, only to be greeted with 20 FPS cutscenes once we entered the sorting ceremony... :D
 
Yes, he got all the way to Hogwarts. There was a short sequence with the Sorting Hat where his FPS tanked, but it went back to normal.

The thing is, he seemed CPU-bound even with a 10400KF powering a paltry 1650. At times his GPU usage would drop into the 70s while his threads were at 60-70%. Even dropping to 900p wasn't much better, and dropping to 720p yielded an average in the 60-70 range, which is way too low.
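As a crude rule of thumb for reading those numbers (the thresholds here are my own assumptions, not from any profiler): GPU usage sagging while no single thread is pegged usually points at main-thread serialization or streaming stalls rather than raw core count. A hypothetical classifier sketch:

```python
# Crude bottleneck guess from monitoring samples. The 95% thresholds
# are arbitrary assumptions for illustration, not profiler-derived.
def guess_bottleneck(gpu_pct: float, max_thread_pct: float) -> str:
    if gpu_pct >= 95:
        return "gpu-bound"
    if max_thread_pct >= 95:
        return "cpu-bound (one thread pegged)"
    # GPU starved but no thread saturated: often main-thread
    # serialization, sync points, or asset streaming / IO stalls.
    return "likely main-thread, sync, or streaming limited"

print(guess_bottleneck(70, 65))
```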

He also states: "UPDATE: Tried it on an RTX 3090 + R9 5900X + 32 GB RAM + NVMe. Uses 21 GB of RAM and still stutters a lot when running, especially in the castle. This needs fixing."
 
The Day 1 patch already came out.
Then they must update the specs. They recommend 16 GB of RAM for 1080p/high.

The game calls for more than 16 GB at 720p/low.

The game constantly reads from the pagefile and keeps stuttering, most likely because of that, which leads me to believe that even in a theoretical 720p/low scenario you need 32 GB for pleasant frametimes, at least in this section... I'm sure that's not a reasonable recommendation.
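Rough numbers on why pagefile traffic wrecks frametimes (the transfer speeds below are assumed ballpark figures, not measurements):

```python
# Why pagefile reads murder frametimes: rough fault-servicing math.
# Transfer speeds are assumed order-of-magnitude figures only.
SPEEDS_GB_S = {"RAM": 25.0, "NVMe SSD": 3.0, "SATA SSD": 0.5}
ASSET_MB = 64  # hypothetical size of a streamed chunk

for medium, gb_s in SPEEDS_GB_S.items():
    ms = ASSET_MB / (gb_s * 1024) * 1000
    verdict = "fits" if ms < 16.7 else "blows"
    print(f"{ASSET_MB} MB from {medium:8s}: ~{ms:5.1f} ms "
          f"({verdict} a 60 FPS frame budget)")
```

Under those assumptions a 64 MB chunk is ~2.5 ms from RAM but ~21 ms even from a fast NVMe drive, so any frame that has to fault data back in from disk misses the 16.7 ms budget.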

This platform sucked the last drop of gaming joy I had.
 
Apparently it's coming out on the 10th, so my bad.
 
I think it's worth remembering that although PC GPU performance is massively ahead of consoles now, CPU and single-threaded performance isn't, so some games that depend on IPC won't see the same gains that we get with GPUs.

I have a 144 Hz monitor, but honestly I'm a graphics whore who only plays single-player games, and I would always choose 60 FPS at max settings over 144 FPS with reduced settings, so this game isn't a problem to be honest, as I just lock to 60 FPS.

And this game is making me feel glad that I upgraded from my 3060 Ti with its 8 GB VRAM buffer.
 
It seems to be performing very well on the A770.


@yamaci17 the PC has many advantages, like thousands of games you could never play on consoles, mods, etc., but yeah, PC ports have been horrible as of late, and there are developers expecting DLSS to solve their performance problems = less optimisation. My next PC is going to be something like a GPD Win 4; even on my desktop PC I just have a 2 TB partition with a tuned version of Windows 11 just for gaming, so...
 
The Hogwarts Legacy analysis from TPU: the game needs 12 GB of VRAM for rasterization and 15 GB for ray tracing at 4K. Rasterization runs reasonably well, but ray tracing is different: a 2080 Ti is faster than a 7900 XTX at every resolution, even 1080p!


[Charts: 4K average FPS, rasterization and ray tracing]



Don't believe any benchmarks that are run inside the castle. Here there isn't much to see and your view distance is low. We've done our benchmark runs in the open world areas and FPS are much lower here. I picked a test scene that's demanding, but not worst case.
 
3 GB for the BVH?? What!!! Are they keeping the whole damn game in there 😲
 
More Hogwarts performance analysis from ComputerBase: at 4K, the 7900 XTX is 5% faster than the RTX 4080 without ray tracing, but with it, the 4080 is 50% faster.

 
Why on earth does the VRAM go up so much with resolution?

That seems like a bug right there; buffers, even with virtualized texturing, aren't nearly big enough to account for 4 GB from that res bump. And what do they do with the system RAM? Is it just a total drop-in replacement for the SSD? Did the developers assume people had tons of GBs of RAM and not an NVMe drive? I can guess it was an early brute-force solution they never got around to replacing, but... well.
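Quick render-target math backs that up (the bytes-per-pixel figure is my own generous guess at a fat G-buffer plus depth):

```python
# How much *should* VRAM grow from 1440p -> 4K for render targets alone?
# 20 bytes/pixel of G-buffer + depth is an assumed, generous figure.
BYTES_PER_PIXEL = 20

def target_mb(w: int, h: int) -> float:
    return w * h * BYTES_PER_PIXEL / 2**20

growth = target_mb(3840, 2160) - target_mb(2560, 1440)
print(f"Extra render-target memory: ~{growth:.0f} MB")  # ~88 MB, nowhere near 4 GB
```

Even with a very fat G-buffer, resolution-dependent targets only explain tens to a few hundred MB, so the rest of that 4 GB jump must come from something else scaling with resolution.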
 