Digital Foundry Article Technical Discussion [2023]

Returnal's stutter was caused by raytracing (as confirmed by DF), which the PS5 version doesn't even have, so 🤷?

Returnal runs fine without RT on the two PCs I've tried it on (both relatively high-end, but they correspondingly run at much higher framerates and with noticeably better image quality than the PS5).
 
Wait, what review says Spiderman has stutters on sufficiently capable hardware? Alex didn't mention anything about it, and if anyone was going to pick up on it, it's him.

Momentary drops below an arbitrary target frame rate aren't stutters unless they're specifically caused by significant, noticeable frame-time spikes, and genuine stutters usually occur regardless of hardware performance. I've seen no evidence of that in Spiderman at all, let alone spikes that have been directly attributed to I/O issues.
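To make that distinction concrete, here's a minimal sketch of a spike test (the 2.5x factor and 60-frame window are arbitrary illustration values of mine, not anything DF uses):

```cpp
#include <cstddef>
#include <deque>
#include <numeric>

// A "stutter" is a frame that takes far longer than the recent baseline,
// not merely a frame below some target FPS: a steady 25ms (40fps) cadence
// trips nothing here, while one 50ms frame in a run of 16.6ms frames does.
bool IsStutter(std::deque<double>& historyMs, double frameMs,
               double spikeFactor = 2.5, std::size_t window = 60)
{
    const double avg = historyMs.empty()
        ? frameMs
        : std::accumulate(historyMs.begin(), historyMs.end(), 0.0)
              / static_cast<double>(historyMs.size());
    historyMs.push_back(frameMs);
    if (historyMs.size() > window)
        historyMs.pop_front();
    return frameMs > avg * spikeFactor; // spike relative to baseline
}
```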

Yeah, I'm not quite sure where the I/O 'stutters' angle is coming from either. The issues I've seen with Spiderman PC, from my own experience at the time and several reviews/benchmarks/videos:

1) A periodic texture-loading issue. This is about as "I/O-related" as it gets, but it was always most likely a bug. I don't think it was ever fixed, though I'm not sure. I'd hope at the very least those cutscenes with the oddly random low-res tiny textures, like the cop badges, were fixed, but I don't know.

2) High CPU load when RT is used, particularly in Times Square. This could drag an R5 3600 below 60fps when using the same RT settings as the PS5's performance mode, though lowering those settings slightly could compensate.

3) CPU culling cost when going from an obscured view to the full city, such as running up the side of a tall building and then leaping over it. This could cause frame-time spikes.

General performance, though, especially without RT, is pretty stable. There are definitely some significant CPU-related swings when traversing the city, but I/O stutter? I haven't really heard that. Granted, SM also benefits massively from DDR5, which is more commonplace now.
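On point 3, a rough sketch of why that cull cost spikes, assuming a simple brute-force frustum test (the types and loop are illustrative, certainly not Insomniac's actual code): the test runs over every object each frame, and the number of survivors that then need further CPU work jumps sharply the instant the whole city comes into view.

```cpp
#include <cstddef>
#include <vector>

struct AABB  { float min[3], max[3]; };
struct Plane { float nx, ny, nz, d; };   // inside where n.x + d >= 0

// Box is fully outside a plane if its farthest corner along the normal
// (the "p-vertex") is still on the negative side.
static bool Outside(const Plane& p, const AABB& b)
{
    const float x = p.nx >= 0 ? b.max[0] : b.min[0];
    const float y = p.ny >= 0 ? b.max[1] : b.min[1];
    const float z = p.nz >= 0 ? b.max[2] : b.min[2];
    return p.nx * x + p.ny * y + p.nz * z + p.d < 0;
}

// O(objects) every frame regardless of view; each survivor then costs
// additional CPU time (LOD selection, draw submission), which is what
// spikes when you crest a rooftop and the full skyline passes the test.
std::size_t CullAndCount(const std::vector<AABB>& objects,
                         const Plane frustum[6])
{
    std::size_t visible = 0;
    for (const AABB& box : objects) {
        bool culled = false;
        for (int i = 0; i < 6 && !culled; ++i)
            culled = Outside(frustum[i], box);
        visible += culled ? 0 : 1;
    }
    return visible;
}
```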
 
- You should ignore reviews if they say a game has stutters

So if a two-year-old review states the game had stutters back then, but it has since been patched and no longer stutters, you should still believe the review that the game has stutters?

That logic is laughable and it's why I can't take you seriously.
 
Yeah, I'm not quite sure where the I/O 'stutters' angle is coming from either. The issues I've seen with Spiderman PC, from my own experience at the time and several reviews/benchmarks/videos:

1) A periodic texture-loading issue. This is about as "I/O-related" as it gets, but it was always most likely a bug. I don't think it was ever fixed, though I'm not sure. I'd hope at the very least those cutscenes with the oddly random low-res tiny textures, like the cop badges, were fixed, but I don't know.

2) High CPU load when RT is used, particularly in Times Square. This could drag an R5 3600 below 60fps when using the same RT settings as the PS5's performance mode, though lowering those settings slightly could compensate.

3) CPU culling cost when going from an obscured view to the full city, such as running up the side of a tall building and then leaping over it. This could cause frame-time spikes.

General performance, though, especially without RT, is pretty stable. There are definitely some significant CPU-related swings when traversing the city, but I/O stutter? I haven't really heard that. Granted, SM also benefits massively from DDR5, which is more commonplace now.

Yeah, the game killed my old Ryzen 3600. Spiderman was one of the games where I got a huge performance uplift when I moved to a quad-core 12100f, and I got another boost when moving to a six-core 12400f.
 
I just wanted to point out that Andrew (sorry for bringing you back into this) mentioned both Vulkan and DirectX 12 when he said what he said.

At least to me, in my uninformed opinion, it was either a case of "Hey developers, be careful what you wish for", or the IHVs going, "Why do we have to keep optimizing drivers for every game?" Considering both DX12 and Vulkan try to give developers a lot of control over what happens on the GPU, it seems like some really high-end devs wanted more control, and Microsoft and the Khronos Group forgot about what most of the industry really needed. Maybe they could have found an in-between solution that would allow game-ready drivers while still giving developers more control when they want it.
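To make the "more control" point concrete, here's a minimal D3D12 sketch, assuming a valid command list and texture (the helper name is mine). In D3D11 the driver tracked resource state and inserted this hazard for you; in D3D12 the app records it explicitly, and mistakes cause exactly the kind of corruption and hangs the driver used to paper over:

```cpp
#include <d3d12.h>

// Illustrative helper: transition a texture from render target to shader
// resource before sampling it. D3D11 did this implicitly per bind;
// D3D12 makes the application state the before/after explicitly.
void TransitionForSampling(ID3D12GraphicsCommandList* cmdList,
                           ID3D12Resource* texture)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type                   = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = texture;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
}
```

Multiply that by every resource, queue and pipeline state in a frame and it's easy to see why the transition was rocky for teams used to DX11.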

If I'm mistaken, please enlighten me.

I imagine, at least from MS and Khronos' perspective, they had a pretty good idea that the transition would be a little rocky for devs, but that long term it would result in devs extracting more performance out of the hardware once they became acclimated to managing more of the rendering responsibilities and resource allocation that the driver once handled.

MS simultaneously released DirectX 11.3 as a high-level counterpart to 12, so MS had some inclination that devs wouldn't adopt it quickly en masse.

The best path forward isn't always the smoothest. This seems to be a scenario where the market has to break a few eggs to make an omelette.
 
So if a two-year-old review states the game had stutters back then, but it has since been patched and no longer stutters, you should still believe the review that the game has stutters?

That logic is laughable and it's why I can't take you seriously.

You can just go on Steam and look at the latest buyer reviews. I imagine any continued issues that haven't been patched will be readily mentioned.
 
It's 1080p on mine.

Maybe stop trying to watch it on a potato?

Off-screen shots attached.

As I said, the game is perfectly smooth.

Even the hardest section of the game (Times Square) is a perfectly locked 16.6ms and I've yet to see it drop or stutter.

I have a 12400f and the game is now in a completely different place in terms of frame consistency than it was at release.
Not trying to be argumentative, I was just wondering why you used Google Photos instead of YouTube. I know it can be time consuming, but create a new Gmail account if need be and set up a YouTube Studio account to upload video.
 
Not trying to be argumentative, I was just wondering why you used Google Photos instead of YouTube. I know it can be time consuming, but create a new Gmail account if need be and set up a YouTube Studio account to upload video.

It was already on Google Photos (at 1080p on mine), so why bother with a time-consuming YouTube upload and processing time?

I also don't use MSI Afterburner, as I'm playing around with HWiNFO64; its OSD has way more sensor options (like the option to show read/write rates for SSDs, which MSI Afterburner completely lacks), but I've not fully set it up yet.

If I get time tomorrow I may do Times Square, as I've just swung around it for 2 minutes with no issues.
 

Attachments

  • Spiderman, Spiderman, does whatever a Spider can!.png (7.5 MB)
It's fair when the person in question insists that the game runs smoothly on their end.

Was it fair that he nitpicked over my claims that my 2700 played the game at a locked (almost...) 60fps?

- He first demanded that I test Times Square and not some random location
- Then he demanded a frametime graph
- Then he nitpicked about how it dropped to 56 for a second

He battled me to extreme extents over the mere definition of "locked". Now he must defend his "ZERO stutters" and "perfectly fine" claims.

If he's so nitpicky, persistent and pervasive in his behaviour, I'd expect him to do a video with the frametime graph open in TIMES SQUARE...

Or should I have rejected his demands, saying "I don't have the stuff or time, what I say is true, bye"?

If one has a claim, then they should clearly prove it. At least that was his mindset. Back then. It is only fair I ask the same.

"I do have the ability to add a frame time graph but might take pictures rather than a video."

Nope. It's gotta be a video in Times Square with fast swinging and camera movement. If I see a single hitch, stutter or frametime spike, it is game over for your claims of "perfectly fine".

Just like how you were persistent that a single dropped frame nullified my claims of "locked 60", a single stutter will now nullify your answer of "it is perfect on my end" to the user's claims that I/O stutters exist in Spiderman.

Proof that he says his game is PERFECT (as in PERFECT: not a single stutter should occur; THAT is the definition of PERFECT). Like when I said locked 60, you battled me over the definition of locked 60 and how it should not drop even a single frame. Here ya go: now you must come to terms with the definition of being PERFECT.


Anthony Davis replied: "And yet perfectly fine on my PC."

Additional proof of claiming NO stutters;

"But I have recorded a video of Spiderman running on my PC at max settings at a rock solid 60fps with zero stutter on my $140 CPU."



Means that he does not get I/O stutters. Let's put that to the test with an actual high-fidelity video: frametime graph open + Times Square + swinging + camera movement.

The video should be at least 3 minutes long (as he persistently wanted even more back then) and should include repeated swings over Times Square.

As I said, the frametime graph has to be open.

I won't even care if you don't feel stutters. If I see a hitch, stall or stutter in your frametime graph during the 3-minute Times Square run of fast swinging + camera movement (important: if those two are missing and you only stand idle atop a perch or something, it won't count), I won't believe your claims.
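For what it's worth, claims like this can be checked objectively rather than by eyeballing an OSD. Here's a minimal sketch that counts frame-time spikes in a PresentMon-style CSV capture; it assumes a column named MsBetweenPresents (as PresentMon writes), and the 2.5x-of-median threshold is an arbitrary choice of mine:

```cpp
#include <algorithm>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Counts frame-time spikes in a PresentMon-style CSV capture.
int main(int argc, char** argv)
{
    if (argc < 2) { std::cerr << "usage: spikes <capture.csv>\n"; return 1; }
    std::ifstream file(argv[1]);

    // locate the frame-time column in the header row
    std::string header;
    std::getline(file, header);
    int col = -1, i = 0;
    std::stringstream hs(header);
    for (std::string name; std::getline(hs, name, ','); ++i)
        if (name == "MsBetweenPresents") col = i;
    if (col < 0) { std::cerr << "MsBetweenPresents column not found\n"; return 1; }

    // collect every frame time
    std::vector<double> ms;
    for (std::string line; std::getline(file, line); ) {
        std::stringstream ls(line);
        std::string cell;
        for (int c = 0; std::getline(ls, cell, ','); ++c)
            if (c == col) ms.push_back(std::stod(cell));
    }
    if (ms.empty()) return 0;

    std::vector<double> sorted = ms;
    std::sort(sorted.begin(), sorted.end());
    const double median = sorted[sorted.size() / 2];

    int spikes = 0;
    for (double t : ms)
        if (t > median * 2.5) ++spikes;   // hitch relative to baseline
    std::cout << ms.size() << " frames, median " << median << " ms, "
              << spikes << " spike(s)\n";
    return 0;
}
```

Run it as `spikes capture.csv`; zero reported spikes over a Times Square swing would back up a "perfectly smooth" claim far better than a compressed video.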

I understand why you might feel frustrated, but for me your video on your 2700 system as you had it configured was an eye opener, and I really appreciated you uploading it.

It made me think more seriously about user end configuration and ram timings, which I'd probably stopped thinking about so much in the 10+ years since I really stopped dicking around with game and OS and memory settings and all that jazz.

You did some cool work and showed the receipts, and I respect that.
 
Returnal's stutter was caused by raytracing (as confirmed by DF), which the PS5 version doesn't even have, so 🤷?

Returnal runs fine without RT on the two PCs I've tried it on (both relatively high-end, but they correspondingly run at much higher framerates and with noticeably better image quality than the PS5).
To be fair to the PS5 in one respect, it could probably have much better image quality if Housemarque went back and added FSR2 in place of their current upscaling solution, which is much inferior and causes the game to be scaled twice.
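A toy illustration of why that double scaling hurts, with 1-D linear interpolation standing in for bilinear scaling (the array sizes are arbitrary, not the game's actual resolutions): resampling a hard edge once keeps it steeper than resampling it in two hops through an intermediate resolution.

```cpp
#include <cstdio>
#include <vector>

// 1-D linear resample: stand-in for a bilinear image scale.
std::vector<float> Resample(const std::vector<float>& src, std::size_t dstN)
{
    std::vector<float> dst(dstN);
    for (std::size_t i = 0; i < dstN; ++i) {
        float pos = i * float(src.size() - 1) / float(dstN - 1);
        std::size_t i0 = std::size_t(pos);
        std::size_t i1 = i0 + 1 < src.size() ? i0 + 1 : i0;
        float t = pos - float(i0);
        dst[i] = src[i0] * (1 - t) + src[i1] * t;
    }
    return dst;
}

int main()
{
    std::vector<float> edge = {0, 0, 0, 1, 1, 1};   // a hard edge

    auto once  = Resample(edge, 12);                // one scale to output
    auto twice = Resample(Resample(edge, 9), 12);   // intermediate hop first

    std::puts("single scale vs double scale across the edge:");
    for (std::size_t i = 0; i < 12; ++i)
        std::printf("%.3f  %.3f\n", once[i], twice[i]);  // two passes soften
    return 0;                                            // the edge more
}
```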
 
No... there's a universal 'acceptable standard' for gaming.

  • No crashing
  • No game breaking bugs/glitches
  • A consistent frame rate of at least 30fps

You would be hard pressed to find anyone who doesn't expect those as the minimum acceptable standard for any game on any platform.

And yet there are big budget titles that release that fall well below those standards every month.
And yet, Elden Ring is GOTY according to almost 150 outlets. How can a universally unacceptable game win that many GOTY awards?
 
And yet, Elden Ring is GOTY according to almost 150 outlets. How can a universally unacceptable game win that many GOTY awards?

I'm not sure how bad Elden Ring's issues were at launch, or how long they took to be fixed, but it absolutely meets every one of those criteria now.

Elden Ring is perhaps one of the best arguments for delaying a game. It's universally accepted as a brilliant game, and yet whenever anyone brings it up it always seems to be in the context of what a technical mess it was on PC, despite being patched to a perfectly playable state post-launch. If they'd just resolved those issues pre-launch, it would only be remembered for what an amazing game it is. The same goes for Cyberpunk.
 
And yet, Elden Ring is GOTY according to almost 150 outlets. How can a universally unacceptable game win that many GOTY awards?

GOTY awards mean nothing to me as I never agree with them.

But are you saying gamers don't expect a stable game that offers at least a stable 30fps?
 
Which very good recent DX12 games are you talking about?

Cyberpunk hasn't been fine for ages; the latest take from DF showed there are still problems, and it took years to patch.

Which Crysis are you talking about, the remaster or the original PC version? Because I'm talking about the original PC version that created the phrase "Can it run Crysis?"

It offers good performance, but I/O stutters still exist unless you're playing on a high-end rig!

It's not a dev issue, because the game just doesn't target 24GB; blaming the dev because the game wasn't made to target your $3,000 GPU is irrelevant.

It doesn't matter what you expect; some problems are just unsolvable and can't be fixed simply because you expect them to be, and it's not going to stop devs making games for multiple devices.
Crysis does not have I/O stutters, what are you talking about? There are a lot of good DX12 games.
A very recent one off the top of my head: A Plague Tale: Requiem.
 
You are a statistical outlier, bitter and hard to satisfy. You probably hate chocolate, ice cream, puppies, sex and life itself :yep2:

If you got out more, you would see that a lot of people don't agree with the GOTY nominations or winners; many on social media thought Horizon: Forbidden West should have won GOTY last year. I wonder if they also hate chocolate, ice cream, puppies, sex and life itself?

But how dare I prefer different games to someone else.
 
If you got out more, you would see that a lot of people don't agree with the GOTY nominations or winners; many on social media thought Horizon: Forbidden West should have won GOTY last year. I wonder if they also hate chocolate, ice cream, puppies, sex and life itself?

But how dare I prefer different games to someone else.
I was bantering
 
To be fair to the PS5 in one respect, it could probably have much better image quality if Housemarque went back and added FSR2 in place of their current upscaling solution, which is much inferior and causes the game to be scaled twice.
Yes, it's hard to use that game as a fair comparison, as it uses a fixed 1080p resolution on PS5. I wonder why they didn't use DRS on UE4? Maybe to make development easier?

And by the way, I still think FSR2 is not good enough IQ-wise. I'd say TAAU combined with DRS has better IQ; for instance, it's really great in Kena.
 