Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

I expect most of the non-framerate-related issues will be fixed, or at least improved, but I don't expect any improvement in rendering performance. Uncharted 4 never received any either.
Uncharted 4 runs super smooth for me. This game, on the other hand, needs at least a few CPU optimization passes.
 
The parts where I'm experiencing drops are within the first 15% of the game. Like I said, it's not just the GPU; hell, I'd say my 3060 is usually less responsible for the drops than my CPU is. My i5-12400 is often at 90%+, so there is no way the recommended 8700 will cut it for 60 fps either - my 12400 is significantly faster.
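For what it's worth, this is roughly how I sanity-check whether a drop is on the CPU side: sample per-core CPU load next to GPU load while playing and see which one is pegged. A rough sketch, assuming an NVIDIA card with nvidia-smi on the PATH and the psutil package installed (the sampling intervals are just my choices):

Code:
# Rough bottleneck check: log per-core CPU load and GPU load side by side.
# Assumes psutil is installed and nvidia-smi is available (NVIDIA cards only).
import subprocess
import time

import psutil

def gpu_utilization_percent():
    # Query GPU load via nvidia-smi; returns the first GPU's utilization.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        gpu = gpu_utilization_percent()
        busiest_core = max(per_core)
        print(f"GPU {gpu:3d}% | busiest CPU core {busiest_core:5.1f}% | "
              f"all cores {per_core}")
        # If the GPU sits well below ~95% while one or more cores are pegged,
        # the frame-rate drop is most likely CPU-bound.
        time.sleep(4.0)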

You need to order the motherboard I have; it cost me very little, since selling my old board covered 70% of the cost of the current one.
[attached image: CPU clock.png]
 
It was perfectly clear what your intent was, and your subsequent responses made it even clearer: you were pushing back on some supposed hysteria when people were merely expressing mild concern that these specs might be accurate. Your replies made no sense outside that context. A few posts said "Hmm, these seem really high, what's up with that?" and you responded with how 'bewildered' you were that people were 'taking them seriously'. The rest of your post was expressing incredulity at how exaggerated the specs were!

This, in light of the game's CPU demands, is particularly comical now:
Wow dude. Just wow. Talk about being stubborn and straight up distorting everything I said in order to justify your reactions at the time, when I was absolutely proven correct.

I stated my intentions as clearly as possible - JUST WAIT. Don't take the requirements seriously, because they are never accurate; just wait and see how the actual game turns out to learn what it really needs to run. I have said, and will keep saying, this for basically any game where people take these listings seriously. I have literally no skin in the game for TLOU Pt 1 whatsoever. I haven't bought the game and don't have any plans to anytime soon either (a stance I held even before we learned about its issues). I was even highly critical of this game's entire existence on PS5 to begin with, saying it was a disappointing use of Naughty Dog's time and resources.

I merely said people should stop taking these requirements seriously in general, since they are never accurate (and again, I specified that it didn't matter whether the specs were over- or understated; that wasn't relevant to anything I was saying, just that they'd be inaccurate in some way). Lo and behold, they weren't accurate, as usual, which you even now admit. That's it. That should be the end of the story, but YOU were the one who felt it necessary to push back against my comments and get defensive about them, and you're still trying to defend that even though I was proven correct. Not that it takes any special foresight to have made the claims I did; they should be obvious to literally every PC gamer in existence by now, since inaccurate requirements are basically guaranteed at this point.

EDIT: I mean, lord, I specifically called out how the CPU requirements were almost assuredly going to be wrong, just from how nonsensical they were. You don't even have to know anything about how the game actually runs to look at something like this and see that the developers are NOT actually testing these configs, and are just throwing out guesses, which is why they shouldn't be taken so seriously (as people were actually doing, and whom you were defending for doing so! lol).
 
If performance creep has happened that much already, it's worrying for the future. A 3060 Ti is 33% faster than a 5700 XT, and the generation has barely even started.

AC Unity came out one year into the PS4's life (Nov 2014), and the CPU gap was also way larger back then (each 4770K core was probably 4x the speed of a console Jaguar core, a gap we probably won't see on the PC side even when a PS6 arrives). For context, the GTX 980 was released in September 2014:


[attached image: 69436.png]
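A quick back-of-the-envelope on that "~4x per core" figure, to show it isn't outlandish. The clocks are the published numbers; the ~2x IPC factor is just my assumption for a wide Haswell core versus the small 2-wide Jaguar core:

Code:
# Rough per-core comparison: i7-4770K (Haswell) vs PS4 Jaguar core.
# Clocks are published figures; the IPC ratio is an assumption, not a measurement.
jaguar_clock_ghz = 1.6     # PS4 Jaguar core clock
haswell_clock_ghz = 3.5    # 4770K base clock (3.9 GHz turbo)
assumed_ipc_ratio = 2.0    # guess: wide desktop core vs low-power 2-wide Jaguar

per_core_gap = (haswell_clock_ghz / jaguar_clock_ghz) * assumed_ipc_ratio
print(f"Estimated per-core gap: ~{per_core_gap:.1f}x")   # ~4.4x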
 
And yet it is exactly what's needed on Social Media and perhaps too tame.
I guess if one is asking to be absolutely clowned by the internet? Sure. This is the kind of smoke most people don’t want. Also, unfortunately, no one here has the authority to police social media. Let’s not delude ourselves.
 
There’s calling out bad behaviour and there’s being condescending. That post was condescending.
He’s asking people to be above GPU vendor wars instead of using the current state to score points. From that perspective I don’t see an issue with it.
 
He’s asking people to be above GPU vendor wars instead of using the current state to score points. From that perspective I don’t see an issue with it.
There’s no issue with asking people not to use it in GPU vendor wars, but frankly, the suggestion is asinine. The game is part of a group of statistical outliers, but it’s still a game that a lot of people clearly want to play on their varying GPUs. That means people will naturally compare GPUs because they want to play it. It’s the same as when any big game comes out, for example Cyberpunk: some people based their GPU purchase solely on how a card performed in Cyberpunk because, at the time, that was what mattered most to them.

Honestly, for Alex’s sake, I hope he doesn’t make the mistake of using this as a point to defend the validity of 8 GB of VRAM. He’ll catch an unbelievable amount of flak if he does.
 
It was both condescending and justified. It’s a losing battle, though, asking Internet denizens not to behave like children.
Like I said, the content of his post is fine; it’s the last line that will piss people off. It is what it is… I’m simply pointing that out.
 
There’s no issue with asking people not to use it in GPU vendor wars, but frankly, the suggestion is asinine. The game is part of a group of statistical outliers, but it’s still a game that a lot of people clearly want to play on their varying GPUs. That means people will naturally compare GPUs because they want to play it. It’s the same as when any big game comes out, for example Cyberpunk: some people based their GPU purchase solely on how a card performed in Cyberpunk because, at the time, that was what mattered most to them.

Honestly, for Alex’s sake, I hope he doesn’t make the mistake of using this as a point to defend the validity of 8 GB of VRAM. He’ll catch an unbelievable amount of flak if he does.

The hate against 8 GB of VRAM has a lot more to do with recent price increases than with game requirements. It is absolutely ridiculous that $600 GPUs are still shipping with just 8 GB. However, it’s equally ridiculous to accept horribly optimized games that blow out VRAM requirements for no good reason. The former shouldn’t be used to excuse the latter.
 
The problem with VRAM is that most games don't need more of it, and putting more on a GPU only increases the cost. Why does a game need more than 8 GB at 1080p and 10 GB at 1440p when it runs just fine with 16 GB at 4K? Shouldn't it need at least 32 GB, since 4K has 4x the pixels of 1080p?
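Rough numbers on why resolution alone doesn't drive VRAM: only the render targets scale with pixel count, while textures, geometry and streaming pools stay the same size. The 64 bytes per pixel below is just an illustrative guess for a modern deferred renderer, not a figure from any particular game:

Code:
# Why VRAM doesn't scale with pixel count: only render targets do.
# 64 bytes/pixel is an illustrative guess for a deferred renderer
# (G-buffer + HDR buffers + depth + TAA history), not a measurement.
BYTES_PER_PIXEL_OF_RENDER_TARGETS = 64

def render_target_mb(width, height):
    return width * height * BYTES_PER_PIXEL_OF_RENDER_TARGETS / (1024 ** 2)

mb_1080p = render_target_mb(1920, 1080)
mb_4k = render_target_mb(3840, 2160)
print(f"1080p render targets: ~{mb_1080p:.0f} MB")            # ~127 MB
print(f"4K render targets:    ~{mb_4k:.0f} MB")               # ~506 MB
print(f"Extra VRAM going 1080p -> 4K: ~{mb_4k - mb_1080p:.0f} MB")  # ~380 MB
# Textures, geometry and streaming pools are resolution-independent,
# which is why totals go 8 -> 10 -> 16 GB rather than 8 -> 32 GB.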
 
The hate against 8 GB of VRAM has a lot more to do with recent price increases than with game requirements. It is absolutely ridiculous that $600 GPUs are still shipping with just 8 GB. However, it’s equally ridiculous to accept horribly optimized games that blow out VRAM requirements for no good reason. The former shouldn’t be used to excuse the latter.
If you use mods, 8 GB of VRAM has been useless since about 2019. There are also games that arbitrarily limit texture quality due to lack of VRAM (👀 Far Cry 6). I don’t think people are judging solely based on this game; with all the frustration around PC ports, I think this is just the straw that broke the camel’s back.
 
As long as games don't present me with Half-Life 1-level textures just to fit into 6-8 GB of VRAM, I won't have any problems with VRAM.

Will next-gen games not look next-gen on Series S? Just wondering and pondering. People act like next-gen visuals with decent textures aren't possible within 8 GB of VRAM. Why not? The PS4, with roughly 4 GB of its shared memory effectively usable as VRAM (about 1.5 GB going to the CPU side), pushed next-gen-looking visuals with RDR2, TLOU2 and the like.

So at 1080p, I'd expect 8 GB of VRAM to keep trucking; just a bit blurrier, and that's about it.
 
The problem with VRAM is that most games don't need more of it, and putting more on a GPU only increases the cost. Why does a game need more than 8 GB at 1080p and 10 GB at 1440p when it runs just fine with 16 GB at 4K? Shouldn't it need at least 32 GB, since 4K has 4x the pixels of 1080p?
I don’t agree with this take at all. 8 GB of VRAM should have been the baseline minimum a long time ago. DRAMeXchange lists 8 Gb GDDR6 chips at an average of $3.409, which works out to roughly $27 for 8 GB. That is completely negligible compared to the price increases of GPUs since 2018. Thankfully, some of us figured out that we could just buy Nvidia stock and let the share-price gains driven by their buybacks pay for our new GPUs.

As to why a game might need more VRAM at 1080p: texture variety, texture quality, RT, and so on. Is that why TLOU1 is using 8 GB? Nope. The memory subsystem on PC is very different from the PS5’s. If I were to hazard a guess, the cost of completely rewriting the system in place to better utilize PC hardware outweighed the potential revenue benefit for the port. Then again, if this were some nobody indie game, I doubt many would be talking about this at all.
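To illustrate what "better utilizing PC hardware" would even involve here (and this is purely a sketch, nothing to do with Naughty Dog's actual code): on PC the streamer has to manage a separate VRAM residency budget and evict assets over PCIe, something a unified-memory console engine never needs to model.

Code:
# Purely illustrative: a minimal LRU residency budget for streamed assets.
# A unified-memory console engine doesn't have to track a separate VRAM pool;
# a PC port does, evicting least-recently-used assets when the budget is hit.
from collections import OrderedDict

class VramBudget:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # asset_id -> size, ordered by last use

    def request(self, asset_id, size_bytes):
        # Touch the asset if it's already resident.
        if asset_id in self.resident:
            self.resident.move_to_end(asset_id)
            return
        # Evict least-recently-used assets until the new one fits.
        while self.used + size_bytes > self.budget and self.resident:
            evicted_id, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
            print(f"evict {evicted_id} ({evicted_size // 2**20} MB)")
        self.resident[asset_id] = size_bytes
        self.used += size_bytes
        print(f"upload {asset_id} ({size_bytes // 2**20} MB), "
              f"used {self.used // 2**20} MB")

pool = VramBudget(budget_bytes=6 * 2**30)   # pretend 6 GB is left for textures
pool.request("env_rocks_4k", 512 * 2**20)
pool.request("hero_character", 1024 * 2**20)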
 