Digital Foundry Article Technical Discussion [2022]

The "up to" 1.8x increase is with more than double the transistors. On 5nm that means a significantly more expensive chip.

Probably, but I mean compared to a console without Infinity Cache, for example. I don't expect the full "up to" 1.8x over an RDNA 2 GPU, but I'd hope for that sort of difference compared to a PS5/Xbox Series X. If we get a mid-gen console, it won't be top-tier at RT. If we get 3060 Ti, or better, 3070-level performance, that would be great. I don't expect Nvidia 4000-series levels of performance, not even a 4060.

Maybe if they released something newer than 2 year old titles on PC they would see better sales?

Whatever they charge, the margin will always be lower per unit, even if they could sell games at the same price as on consoles, because they need to pay royalties to the PC storefront. And the goal is to sell consoles, which is where 99% of SIE's revenue and profit comes from. Like I said, from a business point of view, PC is a bonus.
 

Evil West tech review: a smooth, blurry 60fps or a clean but stuttering 30fps - it's your choice

A brace of compromised graphics modes for an otherwise excellent shooter.

Evil West is a remarkably direct take on third-person action, a linear and propulsive combat game with no open-world diversions, no loot, no crafting, and no side quests. The combat system at its core is fast-paced, responsive, and fair, and the B-movie style plot is well-told through a series of high-quality cinematics.

Unfortunately though, the game has garnered controversy prior to release thanks to a pretty unexciting set of resolution options as reported by the game's publisher - a 4K30 quality mode is par for the course, but 1080p60 performance modes for PS5 and Series X have concerned many. Meanwhile, Xbox Series S gets a basic 1080p30 mode. The question is, if we look past the raw pixel counts, is there an attractive game underneath - and is performance really as straightforward as a simple 30fps/60fps split?

Before we tackle those issues, it's worth mentioning that Evil West does some things well in terms of assets, lighting, environmental design and rendering quality, with beautiful environments that stretch from sandy deserts to snowed-in summits and damp catacombs. This is very much an Unreal Engine 4 game - expect a heavy reliance on baked lighting within fairly static environments - but I love the way the visuals often come together.

...
 

Why do devs even bother with those flawed "4K" modes?
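One way to frame it: in raw pixel throughput, the 4K30 mode is actually the heavier workload of the two. Quick arithmetic (this deliberately ignores fixed per-frame CPU and geometry costs, which don't scale with resolution):

```python
# Pixels shaded per second in each of Evil West's console modes.
quality = 3840 * 2160 * 30      # 4K30 quality mode
performance = 1920 * 1080 * 60  # 1080p60 performance mode

print(f"{quality / performance:.1f}x")  # 2.0x -> the 30fps mode pushes twice the pixels/s
```

So the split is anything but a symmetric trade.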
 
I think they would need to hit roughly 3070-level performance to double frame rates in RT-heavy titles. And if the 7900 XTX performs like a 3090 Ti in heavy RT titles, they would need about 60% of a 7900 XTX, which could require a full Navi 32 with reduced clocks. I can't see that being delivered in a console for $500.
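To make that 60% figure concrete: normalize heavy-RT performance to the 3070 target and plug in the premise that the 7900 XTX lands around a 3090 Ti. The index value below is an assumed placeholder for the sake of the example, not benchmark data:

```python
# Illustrative heavy-RT indices, normalized to the RTX 3070 target = 1.0.
# The 3090 Ti index is an assumption for this example, not a measured result.
rtx_3070 = 1.00
rtx_3090ti = 1.65           # assumed: ~65% faster than a 3070 in heavy RT
rx_7900xtx = rtx_3090ti     # premise from the post: 7900 XTX ~ 3090 Ti in heavy RT

print(f"{rtx_3070 / rx_7900xtx:.0%} of a 7900 XTX")  # ~61%, in line with the ~60% estimate
```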
 
Why do devs even bother with those flawed "4K" modes?
yup, and tbh after this I don't understand the recommended PC specs.

https://store.steampowered.com/app/1065310/Evil_West/

  • MINIMUM:
    • OS: Windows 10 (64-bit)
    • Processor: Intel Core i5-2500K (3.3 GHz) / AMD FX-6300 X6 (3.5 GHz)
    • Memory: 8 GB RAM
    • Graphics: 4 GB VRAM, GeForce GTX 750 Ti / Radeon RX 460
    • Storage: 40 GB available space
    • Additional Notes: 30 FPS at 1920x1080, medium preset
  • RECOMMENDED:
    • OS: Windows 10 (64-bit)
    • Processor: Intel Core i5-10505 (3.2 GHz) / AMD Ryzen 5 1600 (3.2 GHz)
    • Memory: 16 GB RAM
    • Graphics: GeForce GTX 1060 / Radeon RX 590
    • Storage: 40 GB available space
    • Additional Notes: 60 FPS at 1920x1080, epic preset
 
Recommended specs make sense for 1080p/30.
 

Warzone 2.0 tested on all current-gen consoles: 60fps is a given, but what about 120?

Crossplay gaming is used to deliver precise, like-for-like testing on PS5 and Xbox Series machines.

Warzone 2.0 has arrived and like its predecessor, this free-to-play game is big in almost every sense of the word, with an expansive map inhabited by a massive playerbase. The game's based on the IW9 Engine, the same core technology that serves as the foundation for Modern Warfare 2, but here it's pushed to its limits to draw in a single huge environment while networking up to 150 players.

To put this latest battle royale to the test, the whole DF team combined forces to get simultaneous crossplay capture of each console - starting with PS5, Xbox Series X and Series S. How does each of these platforms stack up visually, what's performance like in the standard modes, and how does the 120fps experience hold up for those with high refresh rate displays?

Before we get into the numbers, let's talk about what makes this game so challenging to play - and to run, even on powerful games consoles. As a battle royale, it's all about the giant new map. Al Mazrah is truly colossal, a joint production between multiple Activision studios that ranks as the biggest battlefield in the series' history. Tiny portions of Al Mazrah appeared in MW2's 6v6 and 12v12 modes already but the field of play also includes classic maps from 2007's Modern Warfare and 2009's Modern Warfare 2. It's a beautiful sight for long-time COD fans, and the map also adds new mechanics like boats and aquatic combat.

...
 
I don't mind performance modes and fidelity modes, but if you do either one you have to do both right! Why does the 30fps mode have bad framepacing?!
I can't offer you a technical explanation, but what I can offer you is wild noob speculation.
Based on casual observation over the past few years, poor frame pacing in a 30fps-capped game seems to occur when that cap is unusually strict. When the metrics for other modes and other platforms suggest a particular system would likely run uncapped in the 45-55fps range, that seems to be when poor frame pacing happens most frequently. As if the system finishes what it's doing far too soon and is forced to wait an eternity before it can begin working again. Perhaps other processes take priority in the interim and the engine doesn't resume work on the graphics in a timely fashion, so you end up with a constant tug of war over synchronization.
Like I said, wild speculation of the uneducated.
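For what it's worth, that intuition can be sketched in code. Below is a toy model (entirely my own construction, nothing from any actual engine) of a 30fps cap on a 60Hz display: frames reach the screen only on vsync boundaries, so if the cap timer drifts relative to the vsync grid, a frame that should flip after two intervals occasionally slips to three, producing a 50ms hitch in an otherwise even 33.3ms cadence:

```python
import math

VSYNC = 1000.0 / 60.0   # vsync interval, ~16.67ms
CAP = 1000.0 / 30.0     # 30fps frame-time cap, ~33.33ms

def simulate(cap_error_ms, frames=10, render_ms=12.0):
    """cap_error_ms > 0 models a cap timer that isn't aligned to vsync."""
    t, last_flip, durations = 0.0, 0.0, []
    for _ in range(frames):
        t += render_ms                                 # frame finishes early...
        t += max(0.0, CAP + cap_error_ms - render_ms)  # ...then sits on the cap timer
        flip = math.ceil(t / VSYNC - 1e-9) * VSYNC     # image flips at the next vsync
        durations.append(round(flip - last_flip, 1))
        last_flip = flip
    return durations

print("vsync-aligned cap:   ", simulate(0.0))  # steady [33.3, 33.3, ...]
print("cap running 2ms long:", simulate(2.0))  # periodic 50.0ms frames = judder
```

In this model the GPU finishes every frame with over 20ms to spare; the judder comes purely from the wait timer sliding across vsync windows, which lines up with the "finishes too soon, then misses its slot" picture above.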
 
I don't mind performance modes and fidelity modes, but if you do either one you have to do both right! Why does the 30fps mode have bad framepacing?!
Here they really should have added DRS for the 60fps mode. Considering how stable the game is at 1080p, it probably has some headroom for a higher resolution. It's a UE4 game; it's not like that's a crazy thing to do on that engine.
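For reference, the core of a dynamic resolution scaler is just a feedback loop nudging the render scale against a GPU frame-time budget. A minimal sketch of the general idea, with made-up constants (this is not Evil West's or UE4's actual implementation):

```python
BUDGET_MS = 16.6                  # 60fps frame-time budget
MIN_SCALE, MAX_SCALE = 0.67, 1.0  # clamp on the per-axis resolution fraction

def update_scale(scale: float, gpu_ms: float, gain: float = 0.05) -> float:
    """Proportional controller: headroom raises the scale, overshoot lowers it."""
    error = (BUDGET_MS - gpu_ms) / BUDGET_MS  # positive = headroom, negative = over budget
    return max(MIN_SCALE, min(MAX_SCALE, scale + gain * error))

# Example: steady ~13ms GPU frames at 1080p leave headroom, so the scale climbs.
scale = 0.75
for gpu_ms in [13.0, 13.2, 12.8, 15.9, 17.4, 16.0]:
    scale = update_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:4.1f}ms -> render scale {scale:.3f}")
```

The point being that with consistently short GPU frames at 1080p, a loop like this would drift the resolution upward until it met the budget instead of leaving the headroom on the table.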
 