And performance isn't even native 1440p. I'm guessing this runs like Dead Space at ~1080p with FSR Performance. The visible ghosting is disgusting.
Shocking how much better it performs on PS5. Lately many UE4 games have been performing a bit better on the Sony console, but here the XSX version clearly lacks optimization, with big frame-time issues.
Also FSR still sucks.
Hey, if you want to define "escalated" as me politely explaining why it matters that the performance issues were blamed on the GPU when it's actually a CPU bottleneck, with a real-world (albeit extreme, to clearly illustrate the point) example of how doing so could misinform the buying decisions of consumers, then have at it.
We'll just ignore the fact that in your response you dismissed that example as irrelevant on the basis that it wouldn't apply to many people (even though the example transfers to a much wider audience, which I'm happy to explain further if you need me to), and then accused me of trying to "protect the 4090's image". Perhaps we just define escalation differently?
And the truth finally comes out. You talk about me being triggered, yet you were the one triggered all along. All you needed to say was that I hurt your feelings and I would've apologized. Instead, you've built some fantastic strawmen in our discussion; I think this one is the best, though. People's buying decisions could be misinformed because a reviewer said the strongest GPU can't "run" this one game well? What are they going to buy, a weaker GPU? Frankly ridiculous. I want to laugh, but it's actually sad.
Now, to answer the first part of your question: it doesn't matter whether the reviewer blames it on the CPU or the GPU. What doesn't change is the outcome, which is that you can't sustain a solid 60fps. From what we know so far, there's no CPU available that overcomes the bottleneck. Maybe that changes when Zen 5 comes out, but by that point this game won't even be relevant anymore.
Finally, I implore you to look up the word pedantic in the dictionary. Perhaps it'll provide some clarity as to the uselessness of your distinction.
What do we blame this on?! Do devs need more time to optimize their games? Are publishers forcing games out the door because they want them out?
Bit of an interesting video that I thought was worth a share: the Ryzen 4700G is basically the Series X (and PS5) CPU, and this guy tests it.

I am speechless. This means the Series X and PS5 CPU is basically a glorified 8-core i7 from the 5th gen (the i7-5960X). The small L3 cache (8MB) and low clock speed (3.6GHz) really hurt these CPUs, though I suspect the much bigger memory bandwidth from the consoles' GDDR6 helps equalize things a bit. Still, these CPUs are weak compared to even mid-range desktop CPUs.
I loved the meme
On a serious note though, I think it depends on the definition of 'don't need any more'. In a linear game like TLOU, for example, you play an area, then move on, and will not revisit that area without starting a new game. In that case, absolutely release the VRAM. But in a game like Survivor, where I understand there is an element of open world and you can travel back to previously visited locations, I'd say it's better to keep that already-loaded data in VRAM, even if it's nowhere near the current play zone, provided there is spare capacity. It should just be flagged as low priority, to be replaced with more relevant data when the likelihood of needing it increases.
I don't think it's a good idea to free up VRAM of data that's not currently (or imminently) in use just for the sake of creating more free VRAM, if it could potentially be used at some point. You should just be able to copy over it (I assume?) when needed. Or is there a penalty to overwriting existing data versus writing to empty VRAM?
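The "keep it resident, flag it low priority" idea above can be sketched as a priority-aware cache: assets are never freed proactively, and eviction happens only when a new allocation won't fit, starting with the lowest-priority (then least-recently-used) entries. This is a toy sketch with made-up asset names and sizes; a real engine manages GPU heaps, not a Python dict.

```python
from collections import OrderedDict

class VramCache:
    """Toy model of 'keep low-priority data resident until space is needed'."""

    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        # name -> (size_mb, priority); OrderedDict order tracks recency of use
        self.entries = OrderedDict()

    def touch(self, name):
        """Mark an asset as recently used (e.g. player re-entered its zone)."""
        self.entries.move_to_end(name)

    def set_priority(self, name, priority):
        """Re-flag a resident asset (0 = first in line for replacement)."""
        size, _ = self.entries[name]
        self.entries[name] = (size, priority)

    def load(self, name, size_mb, priority=1):
        if size_mb > self.capacity:
            raise MemoryError(f"{name} can never fit")
        # Evict only as needed: lowest priority first, oldest first on ties.
        while self.used + size_mb > self.capacity:
            order = list(self.entries)
            victim = min(order, key=lambda n: (self.entries[n][1],
                                               order.index(n)))
            vsize, _ = self.entries.pop(victim)
            self.used -= vsize
        self.entries[name] = (size_mb, priority)
        self.used += size_mb

# Hypothetical usage: the cave stays resident until the boss arena needs room.
cache = VramCache(capacity_mb=10)
cache.load("town_assets", 6)
cache.load("cave_assets", 3)
cache.set_priority("cave_assets", 0)   # flagged low priority, still resident
cache.load("boss_arena", 4, priority=2)  # now eviction is forced
```

The point of the sketch is the eviction trigger: nothing is freed when `set_priority` demotes the cave, only when `load` actually runs out of room.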
Is it another case of DirectX12 just being shit?
Under 220 watts total power draw for the entire console system, versus 110+ watts for a PC CPU alone. Yeah, something has got to give to fit into that power envelope.

I would argue the CPUs are still strong enough to create most major game experiences, if major open-world games like Horizon can exist at a stable 60fps.
Yeah of course, this is the first DX12 title for the studio.
This is the previous game running on the supposedly "bad" DX11 API: the game is CPU-limited at 144fps for all GPUs, and minimum fps is 135 at 4K on a 4090.
Both games run on the Unreal 4 engine.

Just seems like an engine fps cap rather than a CPU performance issue.
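The "cap vs. CPU limit" reading above comes down to frame-time arithmetic: a 144fps cap pins frame times to a fixed floor of 1000/144 ≈ 6.94 ms on every GPU, while a genuine CPU bottleneck still varies with scene load. A minimal heuristic sketch, with entirely made-up frame-time captures:

```python
def looks_like_fps_cap(frame_times_ms, cap_fps=144, tolerance_ms=0.2):
    """Heuristic: if nearly every frame sits right at the cap's frame time
    (1000 / cap_fps), the limiter is a frame-rate cap, not CPU load."""
    floor = 1000.0 / cap_fps  # ~6.94 ms at 144fps
    at_floor = sum(abs(t - floor) < tolerance_ms for t in frame_times_ms)
    return at_floor / len(frame_times_ms) > 0.95

# Hypothetical captures: a capped run hugs 6.94 ms, a CPU-bound run wanders.
capped = [6.94, 6.95, 6.94, 6.93, 6.94] * 20
cpu_bound = [6.9, 8.1, 7.4, 9.0, 6.5] * 20
```

The same logic is why identical fps across different GPUs in the benchmark points away from the GPU: whatever sets the floor is upstream of rendering, whether that's the CPU or a hard cap in the engine.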