Digital Foundry Article Technical Discussion [2020]

Written DF Article @ https://www.eurogamer.net/articles/digitalfoundry-2020-cod-black-ops-cold-war-series-x-s-ps5-tested

Call of Duty: Black Ops Cold War runs beautifully on PS5 and Series X
Ray traced shadows and 120fps support add some next-gen spice.

This is impressive stuff. Call of Duty: Black Ops Cold War manages the cross-gen transition gracefully, delivering excellent performance whether you're gaming on PS5 or Xbox Series consoles. The premium machines also benefit from two key features: 120Hz and ray traced shadows, both of which are transformative to the experience. Microsoft's pint-sized new console - the Series S - is also impressive enough, but appears to lack the signature next-gen features its more expensive counterparts deliver.

Let's kick off by digging into the features that separate Series X and PS5 from all of the other versions. That starts with ray traced shadows, which - as the video below demonstrates - replace the baked shadow maps employed in the other versions of the game. In terms of overall effectiveness, RT shadows are often overlooked, but the effect in Cold War is exceptional, almost as if the game's art designs are built around the technology. Scenes that look perfectly fine with traditional shadow maps look so much better with ray traced shadows enabled: soft and diffuse at distance, ultra-sharp close up, just as they should be. Series S does allow you to download an 11GB RT pack, but there is no option in-game to enable it - though we've asked Activision for clarification.

RT is also only available as an option with the game running at the standard 60fps, but Cold War's other key next-gen feature is 120Hz support, which sees PS5 and Series X both doing a pretty excellent job of targeting and indeed maintaining 120 frames per second. In fact, in the multiplayer mode, 120fps is a lock on both systems (game-changing in its own right, especially when combined with keyboard and mouse support) but the fact that the entire campaign can play out at 120fps (or very close to it) is an impressive feat. It's not perfect for solo play, it's certainly not locked, but it is running flat-out for much of the experience.

...
 
Unless the PS5 does indeed have a unified L3 cache on the CPU, which, in the case of Zen 3, is an important factor in why its gaming performance is so much better than Zen 2's.
There are a million reasons anything can be the way it is.

You can't base it off a YouTube video with its poor compression and loss of detail. You could have better dynamic resolution scaling on PS5 that keeps its framerate hitting the target 120fps better than XSX, for instance.
It's insufficient to view frame times from only four-ish titles and conclude that it's the CPU and some sort of hardware advantage on that front. But you didn't pixel count or do any settings checking to see what's happening on that end, which is the standard route we should take before jumping to those conclusions. Almost all 120fps modes use dynamic resolution. It's going to be really hard for you guys to tell anything without access to raw footage. The only data point you have is frame time. I think you're going to need a lot more to prove it's the CPU.
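For what it's worth, even the frame-time data point basically comes down to counting duplicated frames in the capture. A minimal sketch of the idea (not anyone's actual tooling; the OpenCV dependency, file path and duplicate threshold are placeholder assumptions, and it presumes clean, high-bitrate 120fps capture rather than compressed YouTube output):

```python
# Minimal sketch: estimate effective frame times from a 120fps capture by counting
# how many capture frames each unique game frame persists for.
import cv2              # pip install opencv-python
import numpy as np

CAPTURE_FPS = 120           # capture device refresh rate
DUPLICATE_THRESHOLD = 1.0   # mean absolute pixel difference below this = repeated frame (tune per source)

def estimate_frame_times(path):
    cap = cv2.VideoCapture(path)
    ok, prev = cap.read()
    if not ok:
        return []
    frame_times_ms, repeats = [], 1
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if np.mean(cv2.absdiff(frame, prev)) < DUPLICATE_THRESHOLD:
            repeats += 1            # same game frame shown again: a missed 8.33ms slot
        else:
            frame_times_ms.append(repeats * 1000.0 / CAPTURE_FPS)
            repeats = 1
        prev = frame
    cap.release()
    return frame_times_ms           # e.g. [8.3, 8.3, 16.7, 8.3, ...] - the 16.7 is a dropped 120Hz frame
```

With compressed footage that duplicate threshold turns into guesswork, which is exactly why raw captures matter here.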
 
Not really. The game comparisons and analysis are simply showing what developers and trusted sources have been saying for over a year now: that both of these systems will perform similarly, with no great divide in graphics (parity for the vast majority of third-party games). This generation, just like the prior one, will boil down to the game artistry and art assets put forth by first-party games.

Meh, at least I could dream :p
 
Well, it is an issue. But only if someone can issue an update that suddenly makes all TVs support VRR as well. Until that happens, it's an issue of totally, statistically insignificant proportions. Unless of course you just need something to faux rage over because it's a slow news day or something.
I don't know. I mean, we are at the start of a generation and you don't buy one of these consoles today to only use it today. New console purchases are about the promise of future experiences. And if you don't have a VRR display now, you may in the future, as I think most new displays will support the feature because it's been adopted into the HDMI standard. It's great that Sony has promised to include the feature in the future, but it's up to consumers to hold them to it.

So the performance is the same? Weird that checkpoints affect the performance; it must be some kind of bug in the code.

edit: maybe the checkpoint clears some memory that wouldn't be cleared otherwise? Maybe a garbage collection issue?
On PC, I've experienced something similar with Shadow of the Tomb Raider. Mostly when you come across a camp fire. Camp fires are RT lights (not every light in that game uses RT), and the frame rate can drop hard. You can leave, come back, and it's usually fine after that. I suspect that it has something to do with loading in/processing the BVH information. I don't think it's an IO issue, because I ran the game off an SSD. Looking at the videos, it looks like the PS5 is doing something like that.
 
Reminds me of GPU benchmarks where Titans are competing at 1080p for 240fps or something ridiculous, where the CPU is the bottleneck again.
120fps is a very difficult target to maintain, and I'm with DSoup in laughing at the idea that people will notice 10fps drops at 120fps. 120fps to 60fps, okay - that's going to feel like you went from a jet to landing in jello - but minor fluctuations will mean nothing for most people. VRR just makes it entirely a nothing burger.

It depends a lot on the gamer and what they expect out of their experience. I play at a locked 60 in pretty much everything, so a 5 FPS drop is massively noticeable to me, especially if it happens erratically and somewhat often. That's the equivalent of a 10 FPS drop at 120 Hz.

It's basically enough to throw off your aim if your reflexes expect a consistent input-feedback loop. The same would go for me with a 30 FPS title dropping 2-3 FPS from time to time.
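To put rough numbers on that equivalence (a quick sketch that ignores vsync quantisation): both drops lengthen frame times by the same proportion, even though the absolute change at 120 Hz is half as large.

```python
# Back-of-the-envelope frame-time maths for the "5fps drop at 60 vs 10fps drop at 120" comparison.
def frame_time_ms(fps):
    return 1000.0 / fps

for target, drop in [(60, 5), (120, 10)]:
    base = frame_time_ms(target)
    slowed = frame_time_ms(target - drop)
    print(f"{target}fps -> {target - drop}fps: {base:.2f}ms -> {slowed:.2f}ms "
          f"(+{slowed - base:.2f}ms, {100 * (slowed - base) / base:.1f}% longer frames)")
# 60fps -> 55fps: 16.67ms -> 18.18ms (+1.52ms, 9.1% longer frames)
# 120fps -> 110fps: 8.33ms -> 9.09ms (+0.76ms, 9.1% longer frames)
```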

It's also why I dislike VRR so much: the input-feedback loop is constantly being thrown off because of the constantly variable framerate. Hence why I consider variable resolution infinitely better than VRR. And I'll almost always prefer screen tearing with an unlocked framerate + no vsync over VRR.

But, as noted, most gamers, especially console gamers, are unlikely to notice, as they don't have the option to adjust a game's settings to ensure a locked framerate, so variable drops in framerate, as long as they aren't too big, mostly go unnoticed. Similar to how most console gamers dismissed the benefits of 60 FPS rendering with respect to gameplay and presentation ... until they'd spent a good amount of time playing at 60 in a larger variety of genres than just racing games and fighting games.

But, because of the inability to tweak settings, console gamers will always have to live with a game with variable framerate for current games if the developer doesn't prioritize a locked framerate.

Of course, if they do that, the screenshot console warriors will jump on it in a heartbeat and claim that performance is being left on the table that could have gone into graphics. For them, screenshots are far more important than good gameplay. :p (I kid... mostly).

Regards,
SB
 
So the performance is the same? Weird that checkpoints affect the performance; it must be some kind of bug in the code.

edit: maybe the checkpoint clears some memory that wouldn't be cleared otherwise? Maybe a garbage collection issue?

Random ass speculation. I wonder if maybe the cache scrubbers aren't operating correctly with some bit of code in the game? If it's not clearing out the cache like it's supposed to and something goes to use it, perhaps it engages a backup emergency form of clearing that leads to these unexpected and seemingly random dips?

Regards,
SB
 
Random ass speculation. I wonder if maybe the cache scrubbers aren't operating correctly with some bit of code in the game? If it's not clearing out the cache like it's supposed to and something goes to use it, perhaps it engages a backup emergency form of clearing that leads to these unexpected and seemingly random dips?

Regards,
SB

Seems like a typical memory leak. I've seen this in PC games like GTA V and Witcher 3, where some gamers reported sharp drops in certain scenes, then on the next play-through they didn't. Of course, they were patched out later.
 
They should have said VRR solves the XSX issue for users who own a compatible display, though that feature is also expected to appear in the Sony camp. The way it's worded makes the regular Joe think his $500 TV has it.
 
I didn't quite follow this, but to be clear, I'm not saying games shouldn't use VRR. I'm saying that developers should not engineer games on the basis that frame pacing isn't a problem that needs to be solved, because it will take years for 150+ million console users to upgrade their TVs to solve this problem.

In the case of COD, Valhalla and Dirt, these just seem to be bugs that I'm sure will be smoothed out in time.

Also, I wish you well in the coming weeks. The annual culling of your kind must be trying. :yep2:

Covid is an unexpected ally this season ;)

I agree; my point was about 120 specifically. People who can experience it are more likely to have VRR (and I believe that percentage will grow), so a locked frame rate seems less important there. Obviously a lock is better, but as it's niche and such a tight frame time, I don't see drops as negatively as I do dips at mainstream frame rates.

60 needs to be very, very stable, and when targeting 30 an unrelenting lock is basically required; frame drops there are harsh.

It's also why I dislike VRR so much: the input-feedback loop is constantly being thrown off because of the constantly variable framerate. Hence why I consider variable resolution infinitely better than VRR. And I'll almost always prefer screen tearing with an unlocked framerate + no vsync over VRR.


Regards,
SB

I'm not sure I follow the logic from dropped frames to VRR. I understand frame times will fluctuate as frames are presented to the user when ready, but my understanding is that you cannot compare this to vsync'd dropped frames, where a whole frame is missed.

Perhaps my understanding of VRR is flawed, and I have not experienced it myself.

Let's take 60fps, as it's the middle ground: 16.6ms of response delay is added as judder when a frame is dropped. Under VRR, that might only be a frame elongated to 17.6ms, which is a 1ms delay to response without an easily visible judder. The next frame may be back to 16.6ms. Assuming it works like this, it seems almost incomparable to regular frame drops in feel.
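Here's a rough way to model it under that same assumption - fixed 60Hz vsync holds a late frame until the next refresh, while VRR scans out as soon as the frame is ready (a simplified sketch that ignores render pipelining, panel refresh limits and LFC):

```python
# Simplified model of when frames reach the screen when one frame runs 1ms long
# in an otherwise steady 60fps stream.
VSYNC_INTERVAL_MS = 1000.0 / 60.0   # 16.67ms per 60Hz refresh

def present_times(render_times_ms, vrr):
    t, shown = 0.0, []
    for rt in render_times_ms:
        t += rt                                   # frame finishes rendering at time t
        if vrr:
            shown.append(t)                       # VRR: scan out as soon as it's ready
        else:
            slots = -(-t // VSYNC_INTERVAL_MS)    # fixed refresh: wait for next vsync (ceiling)
            shown.append(slots * VSYNC_INTERVAL_MS)
    return shown

frames = [VSYNC_INTERVAL_MS, VSYNC_INTERVAL_MS,
          VSYNC_INTERVAL_MS + 1.0,                # one frame takes 1ms too long
          VSYNC_INTERVAL_MS]

with_vsync = present_times(frames, vrr=False)
with_vrr = present_times(frames, vrr=True)
print([round(b - a, 2) for a, b in zip(with_vsync, with_vsync[1:])])
# -> [16.67, 33.33, 16.67]  the 1ms overrun costs a whole extra refresh (visible judder)
print([round(b - a, 2) for a, b in zip(with_vrr, with_vrr[1:])])
# -> [16.67, 17.67, 16.67]  the overrun just elongates that single frame
```

Under that model, the 1ms overrun costs a full extra 16.7ms refresh with fixed vsync, but only stretches that single frame under VRR.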

How does VRR feel to you? Can you explain the frame time feel (assuming a minimum 60fps game target)? I don't think I have seen much critical discussion on it, so it would be good to hear from a non-fan.
 
It's also why I dislike VRR so much: the input-feedback loop is constantly being thrown off because of the constantly variable framerate. Hence why I consider variable resolution infinitely better than VRR. And I'll almost always prefer screen tearing with an unlocked framerate + no vsync over VRR.
This is a symptom of the ancient approach of binding the master game loop to frame rate. There's no reason that user input needs to be tied to the frequency of the display, nor should it be. Games can internally run the core logic at a rate different from the rendering system; racing sims have done this for a while, and most games will have other constant-rate processes (like audio) which will not change because of frame rate changes, whether it's VRR, dropped frames, etc.
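The usual pattern is a fixed simulation timestep with rendering decoupled from it. A minimal sketch, where poll_input, simulate and render stand in for engine hooks rather than any particular API:

```python
# Minimal sketch of a loop where input/simulation tick at a fixed rate while rendering
# runs at whatever rate the display manages.
import time

TICK_RATE_HZ = 120          # input + game logic frequency, independent of the display
TICK_DT = 1.0 / TICK_RATE_HZ

def run(poll_input, simulate, render):
    previous = time.perf_counter()
    accumulator = 0.0
    state = {}
    while True:                                   # runs until externally stopped
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Consume elapsed real time in fixed-size steps: the input-feedback loop stays
        # constant even if rendering takes 8ms one frame and 12ms the next.
        while accumulator >= TICK_DT:
            simulate(state, poll_input(), TICK_DT)
            accumulator -= TICK_DT

        # Render whenever we get here; with VRR the display simply follows this rate.
        render(state, accumulator / TICK_DT)      # fractional progress for interpolation
```

Input and game logic tick at a constant rate regardless of whether the display is at 60, 120 or a variable VRR rate; only the render call follows the display.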
 
I have to say, in some scenes the difference is quite big. I wonder how the Demon's Souls remake would look with RT shadows.

The Demon's Souls remake would look better, but since it uses a mix of high-res shadow maps, capsule shadows and screen-space shadows like in Days Gone, the improvement would be a bit less dramatic. Everything casts shadows and every light casts shadows.

 