Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

It doesn't really matter if it's being caused by CPU limitations, as the statement is still true. From what I've seen in different reviews, it doesn't matter what CPU you have, you're still going to be CPU limited.

It kinda does, because to the uninformed (of which the internet is awash) statements like that suggest the 4090 itself is insufficient to run the game. Which it presumably isn't; it's merely limited by the CPU. That's an important distinction for anyone considering whether to get the game and wondering if their GPU is insufficient. i.e. they might be running a 7800X3D with a 4070 and assume that the game would be unplayable on their system because "a 4090 can't hit 60fps", whereas in actuality they would likely get better performance than a reviewer with a 4090 and 5900X, for example.
 

I can only assume it doesn't actually need that much VRAM (how could it?) and is just caching texture data as it goes along, rather than flushing and reloading from storage, because the memory is available. Actually, that's pretty good behavior which I think all games should follow, provided of course you prioritise the current scene's data requirements over older data.

That said though, the game obviously has huge issues, so who knows what's really going on.
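To illustrate the kind of "keep it while memory is free, evict only under pressure" policy I mean, here's a minimal sketch (my own, purely hypothetical, nothing from the game) of an LRU texture cache that only discards the oldest entries once a VRAM budget is exceeded:

```cpp
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>

// Hypothetical illustration only: keep already-loaded textures resident while
// VRAM is free, and evict the least recently used ones once a budget is hit.
class TextureCache {
public:
    explicit TextureCache(std::uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Mark a texture as used this frame; "load" it if it is not resident yet.
    void Touch(const std::string& id, std::uint64_t sizeBytes) {
        auto it = lookup_.find(id);
        if (it != lookup_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);  // move to front
            return;
        }
        used_ += sizeBytes;                               // pretend-load
        lru_.push_front({id, sizeBytes});
        lookup_[id] = lru_.begin();
        EvictIfOverBudget();
    }

private:
    struct Entry { std::string id; std::uint64_t size; };

    void EvictIfOverBudget() {
        // Only discard old data when we actually need the space.
        while (used_ > budget_ && !lru_.empty()) {
            const Entry& victim = lru_.back();
            used_ -= victim.size;
            lookup_.erase(victim.id);
            lru_.pop_back();
        }
    }

    std::uint64_t budget_;
    std::uint64_t used_ = 0;
    std::list<Entry> lru_;
    std::unordered_map<std::string, std::list<Entry>::iterator> lookup_;
};
```

The point being that caching aggressively is fine as long as there's an eviction path; the problems start when there isn't one.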
 
It kinda does, because to the uninformed (of which the internet is awash) statements like that suggest the 4090 itself is insufficient to run the game. Which it presumably isn't; it's merely limited by the CPU. That's an important distinction for anyone considering whether to get the game and wondering if their GPU is insufficient. i.e. they might be running a 7800X3D with a 4070 and assume that the game would be unplayable on their system because "a 4090 can't hit 60fps", whereas in actuality they would likely get better performance than a reviewer with a 4090 and 5900X, for example.
I think it's safe to say that an insignificant number of people own a 7800X3D, or Ryzen 7000 for that matter. It's an expensive platform with high upgrade costs and represents an insignificant minority in the PC space. A majority of PC users, based on the known average specs, will struggle to hit 60fps. The need to make that distinction to "protect" the 4090's image is unnecessary. The game is poorly optimized and does not adequately represent the capabilities of any GPU.
 

"Sign here and the house is yours! Oh, and don't worry about the leaky roof and electrical issues, the contractors have said they'll continue fixing them after you've settled in."

Absolutely ridiculous! smh
 
I think it's safe to say that an insignificant number of people own a 7800X3D, or Ryzen 7000 for that matter. It's an expensive platform with high upgrade costs and represents an insignificant minority in the PC space. A majority of PC users, based on the known average specs, will struggle to hit 60fps. The need to make that distinction to "protect" the 4090's image is unnecessary.

Stop talking rubbish. This has nothing to do with "protecting the image" of any GPU and everything to do with dispelling misinformation about any GPU's capabilities in situations where it is CPU limited. It doesn't matter one iota that the scenario I mentioned won't apply to many people; the fact that the scenario is even theoretically possible proves that the original statement, that the game is "unable to hit x fps on [GPU X]" when in fact it is simply CPU limited, is incorrect and shouldn't be made. If it bothers you less, swap out 4090 from my above posts with 7900XTX. The exact same principle applies.

The game is poorly optimized and does not adequately represent the capabilities of any GPU.

No one said otherwise. Only that a GPU's performance should not be judged from situations where it's entirely CPU limited. That's basic stuff, particularly for a forum as technical as this one, and I'm surprised to see anyone here arguing otherwise.
 
"Sign here and the house is yours! Oh, and don't worry about the leaky roof and electrical issues, the contractors have said they'll continue fixing them after you've settled in."

Absolutely ridiculous! smh
Not really the best analogy, given that, much like games, houses always have at least some issues. Sometimes the owner will fix certain ones to get it ready for sale, or maybe they don't and it comes down to negotiations with potential buyers. But people absolutely buy houses with issues all the time.
 
People really need to stop buying games on day 1 at full price. That's the only effective way to bring about possible change.
If everybody stopped buying games Day 1, the result would be a collapse of the game industry. I'm not saying that as encouragement to buy Day 1, just stating the reality of things. We need Day 1 buyers, whether it's ideal or not.

I'd say overall, it seems like most games outside of some golden goose examples do get punished with poor reception and lower sales when they release in a poor state. The incentive is still there to have the game in as good a condition as is feasible on release.

The actual source of the problem here is the ever increasing time and cost of game development, combined with the ever increasing complexity and challenges involved. Games already take longer than ever to make, and by quite a bit, and something like 75% of big games seem to get delayed at least once. This results in enormous development costs and forces the publisher to put their foot down on a release date. It's not that the games are being 'rushed', it's just that the longer the development takes, the harder it becomes to make the money back after release.

I honestly don't really see any solution in sight. Gamers are more demanding and have higher standards than ever, and games aren't gonna magically get cheaper or much easier to make. That said, I think a big source of issues at the moment on PC comes from DX12/low level APIs, and with more experience and some advancements in the APIs and tools and whatnot, there will be improvements in some of these areas. But big picture, games are going to continue to generally have issues at launch going forward, and this is just gonna be how it is.
 

Isn't it appropriate considering maximum details with Ultra HD textures by definition should be attempting to use at least as much memory as is available on the highest end cards?

I mean what's the point of the highest texture tier if it isn't to utilize the memory available on the highest tier of graphics cards? You have lower texture resolutions for cards with less memory.

What bloody idiot is trying to run max details with Ultra HD textures on a card with less than 24 GB of memory? That's not the game's fault, that's 100% user error.

Regards,
SB
 
Isn't it appropriate considering maximum details with Ultra HD textures by definition should be attempting to use at least as much memory as is available on the highest end cards?

I mean what's the point of the highest texture tier if it isn't to utilize the memory available on the highest tier of graphics cards? You have lower texture resolutions for cards with less memory.

What bloody idiot is trying to run max details with Ultra HD textures on a card with less than 24 GB of memory? That's not the game's fault, that's 100% user error.

Regards,
SB
Yes, but then you compare them to consoles with their 13.5GB of usable VRAM and realize they look the same.
 
Isn't it appropriate considering maximum details with Ultra HD textures by definition should be attempting to use at least as much memory as is available on the highest end cards?
The game accumulates VRAM as it loads more levels: it starts at a reasonable 9GB of VRAM, then as it loads more sections it climbs to as much as 22GB, and it never discards anything! The behavior is fine on a 4090/3090/7900XTX, but lesser cards start to stutter or fail to load textures, which results in muddy visuals!

It's like the developer hasn't done the minimum level of basic VRAM management needed for DX12.
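For reference, D3D12/DXGI do expose the tools for this. Here's one possible shape of it, as a rough sketch rather than how any shipping engine does it: query the OS-reported VRAM budget and page out data from levels the player has left once usage creeps toward that budget. The helper name TrimIfOverBudget and the idleLevelResources list are made up for illustration; QueryVideoMemoryInfo and Evict are real API calls.

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <vector>

// Sketch only: query the OS-provided VRAM budget and demote idle level data
// instead of letting residency grow unbounded as new sections stream in.
// `idleLevelResources` (resources from levels the player has left) is assumed.
void TrimIfOverBudget(IDXGIAdapter3* adapter,
                      ID3D12Device* device,
                      std::vector<ID3D12Pageable*>& idleLevelResources)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return;

    // Leave ~10% headroom below the budget the OS says we may use.
    const UINT64 target = info.Budget - info.Budget / 10;
    if (info.CurrentUsage <= target || idleLevelResources.empty())
        return;

    // Ask D3D12 to page these resources out of VRAM; they can be brought
    // back with MakeResident() if the player backtracks.
    device->Evict(static_cast<UINT>(idleLevelResources.size()),
                  idleLevelResources.data());
    idleLevelResources.clear();
}
```

None of this is exotic; it's the sort of baseline residency housekeeping a DX12 title is expected to do itself, which is exactly the complaint here.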
 
Stop talking rubbish. This has nothing to do with "protecting the image" of any GPU and everything to do with dispelling misinformation about any GPU's capabilities in situations where it is CPU limited. It doesn't matter one iota that the scenario I mentioned won't apply to many people; the fact that the scenario is even theoretically possible proves that the original statement, that the game is "unable to hit x fps on [GPU X]" when in fact it is simply CPU limited, is incorrect and shouldn't be made. If it bothers you less, swap out 4090 from my above posts with 7900XTX. The exact same principle applies.



No one said otherwise. Only that a GPU's performance should not be judged from situations where it's entirely CPU limited. That's basic stuff, particularly for a forum as technical as this one, and I'm surprised to see anyone here arguing otherwise.
GPU warring much? Thankfully, unlike you, I don't care which GPU you use. I would still protest just as much regardless of the GPU, because your whole argument is pedantic. Nobody is questioning any GPU's capabilities, and you're defending a fictional scenario that exists only in the imagination and not in reality. Nobody sees a 4090 which can run Cyberpunk Path Traced and then thinks the GPU must be the problem in Fallen Order.
 
Are we? I mean, are we supposed to sit here and expect requirements not to increase? The only thing shocking about this to me is that Jedi Survivor is the game to require this much VRAM, as it is thoroughly unimpressive. We went three generations without notable increases to VRAM across most of the stack, and in the case of Nvidia, four generations. A new GPU should always provide a notable increase in VRAM over its predecessor, but the GPU manufacturers pocketed the profit. I had to pay $2300 for a 4090 in my country, and the fact that it only has 24GB of VRAM is criminal for that amount of money.
 
For those who missed it, there's a new TLOU patch out and it now allows you to set the size of the streaming pool. Going from the highest to lowest shaves off just under 2GB of VRAM with a warning that pop-in might become visible.

Baffling that it wasn't an option before but there it is.
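Presumably the "streaming pool size" is just a cap on the texture streamer's working set: shrink the cap and fewer high-resolution mips stay resident, which saves VRAM at the cost of pop-in while they stream back in (hence the warning). A toy sketch of that relationship, with all names and sizes made up and nothing implied about TLOU's actual streamer:

```cpp
#include <cstdint>
#include <vector>

// Illustrative only: a user-facing "streaming pool size" acts as a budget on
// how much texture data the streamer keeps resident. Smaller pool -> lower
// mips resident -> less VRAM, more visible pop-in while mips stream back in.
struct StreamedTexture {
    std::uint64_t fullSizeBytes;   // all mips resident
    std::uint64_t residentBytes;   // what we currently keep in VRAM
};

std::uint64_t ApplyPoolBudget(std::vector<StreamedTexture>& textures,
                              std::uint64_t poolBudgetBytes)
{
    std::uint64_t used = 0;
    for (auto& tex : textures) {
        tex.residentBytes = tex.fullSizeBytes;
        used += tex.residentBytes;
    }
    // Dropping the top mip roughly quarters a texture's footprint; keep
    // dropping mips (never below a small floor) until the pool fits. A real
    // streamer would prioritise by screen size/distance; this trims greedily.
    for (auto& tex : textures) {
        while (used > poolBudgetBytes && tex.residentBytes > (64u << 10)) {
            const std::uint64_t smaller = tex.residentBytes / 4;
            used -= tex.residentBytes - smaller;
            tex.residentBytes = smaller;
        }
    }
    return used;  // actual VRAM the pool now occupies
}
```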
 
GPU warring much?

Again, stop talking rubbish. You're the one seemingly trying to stoke a vendor war with your original claim that I was trying to "protect the image of the 4090" with an entirely factual statement about CPU vs GPU limitations, and then again bringing it up above. The vendor has nothing to do with the technical observation I made, so stop trying to turn it into something it isn't.

Thankfully, unlike you, I don't care which GPU you use.

Good, because it has literally nothing to do with the statement I made. Why are you even bringing it up?

I would still protest just as much regardless of the GPU, because your whole argument is pedantic. Nobody is questioning any GPU's capabilities, and you're defending a fictional scenario that exists only in the imagination and not in reality.

That is literally exactly what some reviewers are doing when they make statements like "a 7900XTX can't even hold 60 fps in this game" when the game itself is clearly limited by the CPU rather than the GPU. That is, by definition, a statement about the GPU's capabilities. I'm not pointing at one specific instance here; you see this kind of thing regularly, and I'm simply saying reviewers should be more specific and not blame the GPU when the CPU is the actual issue. I don't understand why you're even debating this point. It seems like you're just arguing for the sake of it.

Nobody sees a 4090 which can run Cyberpunk Path Traced and then thinks the GPU must be the problem in Fallen Order. As a result, your whole argument is pedantic, nit-picking at the insignificant.

No, my argument is about technical accuracy. If you're on this forum, you should care about it. And as to your claim that no-one is blaming the GPU here, about 5 seconds of googling brought up plenty of such references; these were just the first couple I clicked on:

https://www.pcgamesn.com/star-wars-jedi-survivor/nvidia-rtx-4090-1440p-performance - that's an article specifically stating that the 4090, a GPU that can handle CP2077 path tracing, can't handle this game. That's literally the exact opposite of what you say above no-one will think, and it's not just some random forum post, it's an actual published article.

https://steamcommunity.com/app/1774580/discussions/0/3828665650624552269/?ctp=1 - post about the 4090 not being able to run this game properly: "you'll just have to turn some settings down"

https://steamcommunity.com/app/1774580/discussions/0/3828665650623892099/ - post about how the 4090 can't handle RT in this game, despite the fact that the evidential screenshots clearly show the GPU underutilised in both scenes, and that RT increases CPU load as well as GPU load.

The net is absolutely awash with uninformed views like this, fuelled by exactly the reviewer practices I'm arguing against above. So no, it's not pedantic or nitpicking, it's a very real spread of faulty information. If that doesn't concern you, then maybe you'd be better off frequenting the forums I linked above, where the bar for accurate information is obviously a lot lower.
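For anyone who wants to check this themselves rather than trust a headline: a crude way to tell CPU-limited from GPU-limited is to compare the GPU's busy time per frame (e.g. from a pair of D3D12 timestamp queries bracketing the frame's GPU work, with the tick rate from ID3D12CommandQueue::GetTimestampFrequency()) against the total frame time. A sketch of that heuristic, with the query plumbing omitted and the 80% threshold picked arbitrarily:

```cpp
#include <cstdint>

// Rough heuristic (my own, not any reviewer's methodology): if the GPU's busy
// time per frame is well below the total frame time, the frame rate is being
// set by the CPU (or something else), not the GPU.
enum class Bottleneck { Gpu, CpuOrOther };

Bottleneck ClassifyFrame(std::uint64_t gpuBusyTicks,
                         std::uint64_t gpuTickFrequency,
                         double frameTimeMs)
{
    const double gpuBusyMs =
        1000.0 * static_cast<double>(gpuBusyTicks) /
        static_cast<double>(gpuTickFrequency);

    // If the GPU sat idle for 20%+ of the frame, lowering GPU settings
    // (or buying a faster GPU) won't raise the frame rate much.
    return (gpuBusyMs < 0.8 * frameTimeMs) ? Bottleneck::CpuOrOther
                                           : Bottleneck::Gpu;
}
```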
 
Isn't it appropriate considering maximum details with Ultra HD textures by definition should be attempting to use at least as much memory as is available on the highest end cards?

I mean what's the point of the highest texture tier if it isn't to utilize the memory available on the highest tier of graphics cards? You have lower texture resolutions for cards with less memory.

What bloody idiot is trying to run max details with Ultra HD textures on a card with less than 24 GB of memory? That's not the game's fault, that's 100% user error.

Regards,
SB

Why should Ultra HD textures require 24GB of VRAM and not 16 or 12 in this particular game? Is it doing something special? We seem to be falling into the trap of accepting massive resource usage simply because those resources are available.
 