Thinking about it, why is this a DX12 title?
Surely, with their engine coming off the back of the PS4, DX11 would have been a better fit, both easier and likely to enable better performance?
The PS4 uses the low-level GNM API, so it was easier to go with DX12.
Didn't Nixxes experience the opposite? Their Spider-Man port ran easily on DX11 at first, but they needed DX12 for ray tracing, so they had to write for DX12 from scratch. They even described their first DX12 rendering experience as being a black screen!
Nah that performance is still atrocious.
You mean relative to the PS5? Not sure why that matters. A 3070 is hitting 60 fps at 1440p medium, which seems reasonable.
I don't remember people kicking up such a fuss when the consoles were punching well above their GPU weight in AC: Valhalla or COD.

It was later discovered that some effects in COD were being rendered at around a quarter resolution compared to the PC's full resolution. And the PS5 isn't performing like a 3080 here: Valhalla had the PS5 performing like a 2080, not slightly below a 3080. I don't recall ever seeing the PS5 punching up in performance comparisons for recent COD games.
Quite a dated engine and/or design philosophy (akin to FF16): wooden blending from animation A to B, weird-looking fabric.
The Last of Us Part I Benchmark Test & Performance Analysis Review
The Last of Us is finally available for PC. This was the reason many people bought a PlayStation; it's a masterpiece that you can't miss. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a selection of modern graphics cards.
www.techpowerup.com
The 2080 Ti fares pretty well vs Ampere. The commentary in these articles is kinda useless though, because the author's conclusions are based only on ultra settings.
Extremely high praise from CB:
The Last of Us Part I tech review: game critique and verdict
The Last of Us Part I (PC) review: game critique and verdict / How good is The Last of Us Part 1? / A post-apocalyptic road movie
www.computerbase.de
Check out our screenshots, these are excellent graphics, and they are paired with VERY well designed environments. The designers didn't just rely on the rendering tech, but they made smart use of their assets to create a believable, realistic experience. One highlight is without doubt the characters and their facial expressions which are among the best I've ever seen. What's also impressive is the quality of the textures, which are detailed, sharp and crisp, even if you walk right up to them, but this increases VRAM usage of course.
Putting that aside, The Last of Us Part I is a gorgeous, at times almost photorealistic, must-see game. Although ray tracing is missing, the high-quality pre-baked global illumination (GI) together with the believable material representation (physically-based shading) creates a very nice lighting atmosphere. The hand-built game world is bursting with details, such as overgrown street canyons, abandoned restaurants and many detailed assets. The textures are beyond reproach and mostly very high resolution.
Lol, I love how this has turned into a witch hunt for the culprit.
Even the 4070 Ti's 12GB is limiting it at 4K.
The 4070 Ti is right behind the 3090 in the TechPowerUp review at 4K. Also remember that while the 4070 Ti is on par with a 3090/3090 Ti in terms of raw power, it's massively down on memory bandwidth; see the rough numbers sketched below.
The minimum frame rate at 4K in the TechPowerUp article has the 4070 Ti right between the 6800 XT and 6900 XT, both of which are 16GB cards.
And TechPowerUp even highlighted in their review of the 4070 Ti that its performance at 4K dropped off more than they expected it to.
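For a ballpark on that bandwidth gap, here's a minimal sketch. The per-pin data rates and bus widths (21 Gbps on a 192-bit bus for the 4070 Ti, 19.5 Gbps on a 384-bit bus for the 3090) are quoted from the public spec sheets as I remember them, so treat the outputs as approximate:

```python
# Peak memory bandwidth in GB/s = (data rate per pin in Gbps * bus width in bits) / 8.
def bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

print(f"RTX 4070 Ti: {bandwidth_gbs(21.0, 192):.0f} GB/s")  # ~504 GB/s
print(f"RTX 3090:    {bandwidth_gbs(19.5, 384):.0f} GB/s")  # ~936 GB/s
```

So the 3090 has nearly double the bandwidth to feed roughly the same amount of compute, which matters most exactly where the 4070 Ti falls off: high resolutions.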
Yeah, it drops off a cliff at 4K because the higher resolution takes up additional VRAM to the point that the 4070 Ti's memory capacity is exceeded; see the rough scaling sketch below. We've seen in other reviews that the game can demand over 14GB at 4K Ultra.
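To illustrate why resolution alone can tip a 12GB card over the edge, here's a minimal back-of-envelope sketch. The buffer layout (four G-buffer targets at 8 bytes per pixel plus a 32-bit depth buffer) is a hypothetical deferred-renderer setup for illustration, not the game's actual configuration:

```python
# Render-target memory scales with pixel count; the texture pool on top of
# this is only partly resolution-dependent.
def render_targets_mb(width: int, height: int) -> float:
    gbuffer = width * height * 8 * 4  # 4 targets at 8 bytes/pixel (assumed)
    depth = width * height * 4        # 32-bit depth buffer (assumed)
    return (gbuffer + depth) / 2**20

for w, h, label in [(2560, 1440, "1440p"), (3840, 2160, "4K")]:
    print(f"{label}: ~{render_targets_mb(w, h):.0f} MB of render targets")
```

The fixed buffers alone more than double from 1440p to 4K, and in many engines the texture streamer also requests higher-resolution mips as the output resolution rises, which is where most of the extra VRAM goes.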