Digital Foundry Article Technical Discussion [2023]

Status
Not open for further replies.
Are they using footage of an unreleased PS2 version for Switch?

I think you should start up your PS2 again, then you wouldn't write such things ;)
Even ps3 quality was most of the time worse.
This is a game on switch that showcases what is still possible on so little hardware. Well .. the performance is not really good, but mainly the game stays the same.
But yeah, a highly detailed environment is nothing the Switch can really do with that type of graphics.
 
I think you should start up your PS2 again, then you wouldn't write such things ;)
I play PS2 all the time still, fire up your PS2 and play Silent Hill 3 and remind yourself how good it still looks 😁
Even ps3 quality was most of the time worse.
Most of the time? I disagree.
This is a game on switch that showcases what is still possible on so little hardware.
It showcases a perfect example to other developers to not attempt such games on Switch.
Well .. the performance is not really good, but mainly the game stays the same.
Is it though?
But yeah, a highly detailed environment is nothing the Switch can really do with that type of graphics.
We can all see that from the video 😛
 
Doing some quick googling I'm seeing some benchmarks that seem to show the opposite (and also significantly more stuttering on DX11, ironically), for instance:
Sorry, I'm going to use your post to reiterate a point about a different subject because it's a great example. This isn't directed specifically at you or Epic, so please don't take it that way, but that first video (DX11) is the epitome of why I am the way I am regarding stuttering in games. It's been a thing for ages.. and I just can't fathom how something like that gets released by a studio and the player is just expected to accept it. That experience in that video is just unacceptable.. regardless of whether it's the first run or the 1000th run. There's no defending it. What good is 300+ FPS if that's what you're experiencing? And the gall of companies like Nvidia and AMD to toot their own horns about how amazing PC and their hardware is for gaming.. while they know their consumers are experiencing this stuff.. just never fails to piss me off.

Obviously, things are changing now, devs know it's an issue gamers aren't happy about and they're doing work to improve/change it.. I know more about it now than I did before and it's a super complex issue... but how long did it have to take before some action was taken? How much did we have to bitch about this stuff to get any attention to it? Meanwhile in my head I'm just thinking, how does that pass scrutiny that the studios should undoubtedly be putting on their own games?

This entire issue should have been as big of a focus years ago as it is now! I think industry priorities for a lot of this stuff are all wrong. The priority in gaming always should have been a good baseline of performance and stability first, then make it as pretty as you can while adhering to that standard. Engines should have evolved in this fashion as well... instead of pushing ridiculously complex shaders and features too quickly. I know UE is a tool which services multiple industries and that makes things complicated.. but once you let the cat out of the bag you can't put it back in. Expectations have changed.. and now players, particularly on certain platforms, are caught with an issue that should have been better considered before all of this.

I know PC gamers aren't the center of the universe, but I feel like this entire initiative should have started long ago and we'd have been in a much better place by now. It's kinda like being fat and saying you're going to start losing weight for months, then finally starting it and being disappointed you're still fat. If you'd have started when you first said you were going to, you'd already be much closer to your goal or have achieved it already.. lol.
 
It's been a thing for ages.. and I just can't fathom how something like that gets released by a studio and the player is just expected to accept it. That experience in that video is just unacceptable.. regardless of whether it's the first run or 1000th run. There's no defending it. What good is 300+FPS if that's what you're experiencing?
Eh, having just experienced the same thing when I booted up DX11 today I'm not gonna disagree with you that it's bad. I'll slightly disagree with the notion that it's irrelevant whether or not it gets better... if it's an issue that clears up after some time it's less serious than one that never resolves itself, but of course I agree we should not have it in the first place.

The irony is that a lot of the recent shader stutter rage has been directed at DX12/Vulkan and while a good chunk of that is deserved, people seem to forget that the motivation for some of the API changes was specifically to allow us to *reduce* this kind of issue. Of course in some cases it backfires and things get worse, but conversely people pretending that it was somehow not an issue in DX11 is clearly revisionist history. The IHVs might be spending less time shipping precompiled shader replacements for DX11 games now, but even that was an imperfect solution. Ultimately if you launch DX12 Fortnite today the experience is significantly less stuttery than DX11, and I think it's fair to call that progress.

And of course only the IHVs can fix shader compilation stutter issues in DX11... the application has no control over that.

Meanwhile in my head I'm just thinking, how does that pass scrutiny that the studios should undoubtedly be putting on their own games?
I mean you can probably imagine how it's easy to miss or deprioritize stuff that is only an issue if you are running on a fresh machine/install. It obviously does get tested, but it's not going to be as high priority as things that happen all the time, or crashes, or similar. The public focus on the issue has certainly shifted the needle in a positive direction, and in general as long as criticism continues to be made in good faith (rather than to be mean spirited or fuel some other soapbox/agenda), it's a good thing.

Engines should have evolved in this fashion as well... instead of pushing ridiculously complex shaders and features too quickly. I know UE is a tool which services multiple industries and that makes things complicated.. but once you let the cat out of the bag you can't put it back in. Expectations have changed.. and now players, particularly on certain platforms, are caught with an issue that should have been better considered before all of this.
🤷‍♂️ there's only so much hindsight speculation that is useful. That's easy to say now, but Fortnite's shader issues are in large part due to the myriad of cosmetics and wide array of hardware support, not the APIs per se. If you hadn't had all those fancy skins or supported a narrower range of platforms, would Fortnite have even become as popular as it did? Maybe, but that's certainly far from completely clear.

Also worth remembering that if it wasn't shader stutter it would just be something else. Quality is ultimately relative and at no point is there a shortage of work to do.
 
Eh, having just experienced the same thing when I booted up DX11 today I'm not gonna disagree with you that it's bad. I'll slightly disagree with the notion that it's irrelevant whether or not it gets better... if it's an issue that clears up after some time it's less serious than one that never resolves itself, but of course I agree we should not have it in the first place.
Well yea I mean I understand that view considering what type of game Fortnite is. I know not all gamers share my opinion about this stuff, and to this day even in some of the worst cases out there you have people who don't notice it, or aren't bothered by it.

I'm bothered by a lot of that stuff however and being me, hell.. I'd prefer that the game installs, I boot it up, and it gives me the option to have the game first just join a couple of matches as a spectator in the background, spectating random players until the end, to build a cache. Now I know people would be like.. "just play the game a bit and build the damn cache".. but I get annoyed when I'm expected to partake in that process and experience the game like that, regardless of whether I know it will get better or not. I feel like that "process" should be a seamless option for the player. That's why I greatly appreciate games which do these processes and get them out of the way first thing. Obviously, I understand Fortnite is a challenge because of the user-content nature of the game, so I get that it's not a full solution by any means.. but still, the first-time play experience should be far better than that!

The irony is that a lot of the recent shader stutter rage has been directed at DX12/Vulkan and while a good chunk of that is deserved, people seem to forget that the motivation for some of the API changes was specifically to allow us to *reduce* this kind of issue. Of course in some cases it backfires and things get worse, but conversely people pretending that it was somehow not an issue in DX11 is clearly revisionist history. The IHVs might be spending less time shipping precompiled shader replacements for DX11 games now, but even that was an imperfect solution. Ultimately if you launch DX12 Fortnite today the experience is significantly less stuttery than DX11, and I think it's fair to call that progress.

And of course only the IHVs can fix shader compilation stutter issues in DX11... the application has no control over that.
Yea, people do seem to act like DX11 was some perfect thing when it wasn't. PSOs make a lot of sense and it's understandable why they were created. Like I said, I'm down with waiting, and pre-compiling as much as possible to give the best experience possible, though I understand there's still cases where there's not much that can be done currently. Epic are making huge strides (as far as I can tell/hope) to reduce the number of unnecessary PSOs generated, which will certainly benefit games and developers, and it's greatly appreciated! I personally like DX12/Vulkan and I do think that it's better that developers have more control over this stuff! I'd rather experience the growing pains of all the added responsibility to end up in a better place on the other side!
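The pre-compile trade-off being described can be sketched with a toy cache model (plain Python, not any real graphics API; the `PipelineCache` class and the cosmetic keys are made up for illustration): compiling every known pipeline state up front turns mid-match compile hitches into a one-time longer load.

```python
class PipelineCache:
    """Toy model of an engine/driver pipeline-state cache (illustrative only)."""

    def __init__(self):
        self._cache = {}
        self.misses = 0  # each miss = a compile during gameplay = a hitch

    def compile(self, key):
        # Stand-in for an expensive shader/PSO compile.
        self._cache[key] = f"compiled:{key}"

    def get(self, key):
        if key not in self._cache:
            self.misses += 1  # first-run stutter happens right here
            self.compile(key)
        return self._cache[key]


def play_match(cache, pso_keys):
    """Simulate a match that references a set of pipeline states."""
    for key in pso_keys:
        cache.get(key)
    return cache.misses


needed = ["skin_a", "skin_b", "emote_fx"]  # hypothetical cosmetic PSOs

# Cold start: every new material/cosmetic combination compiles mid-match.
cold = PipelineCache()
print(play_match(cold, needed))  # 3 hitches on the first run

# Precompiled start: gather known PSOs at load time, then play smoothly.
warm = PipelineCache()
for key in needed:
    warm.compile(key)
print(play_match(warm, needed))  # 0 hitches
```

The catch the posts above describe is exactly the `needed` list: with user-generated cosmetics you can't always enumerate every pipeline state ahead of time, so precompilation reduces but can't fully eliminate the cold-start cost.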

I mean you can probably imagine how it's easy to miss or deprioritize stuff that is only an issue if you are running on a fresh machine/install. It obviously does get tested, but it's not going to be as high priority as things that happen all the time, or crashes, or similar. The public focus on the issue has certainly shifted the needle in a positive direction, and in general as long as criticism continues to be made in good faith (rather than to be mean spirited or fuel some other soapbox/agenda), it's a good thing.


🤷‍♂️ there's only so much hindsight speculation that is useful. That's easy to say now, but Fortnite's shader issues are in large part due to the myriad of cosmetics and wide array of hardware support, not the APIs per se. If you hadn't had all those fancy skins or supported a narrower range of platforms, would Fortnite have even become as popular as it did? Maybe, but that's certainly far from completely clear.

Also worth remembering that if it wasn't shader stutter it would just be something else. Quality is ultimately relative and at no point is there a shortage of work to do.
Yea I can understand that certain things take priority depending on the game and everything else. Clearly crashes and bugs will take priority, but I think over the past couple of years it's become quite apparent that this issue needs to be higher up the ladder for some. I know as the engine provider, you guys can only do so much, and yeah people can be unbelievably mean, and speaking to a couple of QA devs really clued me into some of the abuse they put up with.. and it's not right at all. It absolutely has to be done tactfully and from a place of respect. I understand a lot more now about how complex the situation is than I did years ago, and so when it comes to criticism you have to look at it from a realistic angle about what can and can't be done.

For sure you're right, it's easy to say this stuff and speculate after the issues have finally revealed themselves and products are out there and not while you're actually building the darn things.

And I agree.. if it wasn't shader stutter it would be something else, but for the record.. I can't wait for it to be something else! lol
 
🤷‍♂️ See my post. It doesn't seem to match the other results people are getting. I hesitate to suggest this since HUB is a pretty thorough and careful site, but the only note I see on quality settings in that video is "Epic Quality"... are we sure that he doesn't have Nanite/VSM/Lumen on (and/or other DX12-exclusive settings that default to on at Epic...) in DX12? [Edit] I guess he tests that separately in the next test in theory, but I still wonder if this is actually apples to apples. There is no real reason for performance results to differ so much if the settings were the same, *especially* when GPU bound.

Just checked super quickly on my machine in the current season and DX12 is a bit faster than DX11 on my machine (4090, 1440p). To the limits of my ability to A/B test I'd call them functionally similar again, although ironically the shader stutter in DX11 was really bad for the first minute or so. When you look at the ground performance is similar. When you look off into the distance performance on DX12 is a bit better (but we're talking in the range of 3.5ms vs 3.8ms per frame which look impressive in % graphs but aren't really a huge difference in reality). What do you get on your machine?
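The point about percentage graphs versus real frame-time differences can be checked with the same numbers quoted above (3.5 ms vs 3.8 ms per frame):

```python
def ms_to_fps(frame_time_ms):
    """Convert per-frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

dx12_ms, dx11_ms = 3.5, 3.8  # frame times from the post above

print(round(ms_to_fps(dx12_ms)))  # 286 fps
print(round(ms_to_fps(dx11_ms)))  # 263 fps

# Nearly an 8% gap on a bar chart, yet only 0.3 ms of actual frame time.
print(round((dx11_ms - dx12_ms) / dx11_ms * 100, 1))  # 7.9
```

A ~23 fps spread sounds dramatic at those framerates, but the per-frame saving is smaller than the noise between runs, which is why the poster calls the two paths functionally similar.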

So yeah I dunno, any number of variables could be different between my test, the youtube tests and the HUB test. I'm still leaning on something was probably a bit wonky in the HUB test, but as I said in my other post... there's a zillion variables and I'd take literally any single result with a huge grain of salt.
I went and installed it as I've never actually played this. Performance was similar but DX11 was a stutter fest and DX12 was surprisingly smooth. Framerates were much lower than I recall seeing in benchmarks over the years though. I recall my class of GPU getting close to 120 fps at 1440p. Has the game gotten more demanding in these newer seasons?
 
I went and installed it as I've never actually played this. Performance was similar but DX11 was a stutter fest and DX12 was surprisingly smooth. Framerates were much lower than I recall seeing in benchmarks over the years though. I recall my class of GPU getting close to 120 fps at 1440p. Has the game gotten more demanding in these newer seasons?

In 2021 they added new Epic settings to Fortnite and the old Epic settings became High. And then they added UE5, which brought Nanite, Lumen and Virtual Shadow Maps to DX12, and those settings are even more demanding.
 
In 2021 they added new Epic settings to Fortnite and the old Epic settings became High. And then they added UE5, which brought Nanite, Lumen and Virtual Shadow Maps to DX12, and those settings are even more demanding.
I tested with what I thought were the original epic settings prior to UE5. Didn't know they were increased.
 
I tested with what I thought were the original epic settings prior to UE5. Didn't know they were increased.
Yeah the downside of stuff like Fortnite for benchmarking is the whole thing is a moving target... not least of which is that the content changes all the time. The DX11 stutter stuff is sort of interesting as it now seems like a pattern (across the youtube folks and us here). Maybe it has always been that way, but maybe there has been some change in the way drivers handle it too, I dunno.
 
And I agree.. if it wasn't shader stutter it would be something else, but for the record.. I can't wait for it to be something else! lol
I think we're almost there to be honest. Nothing is ever going to be perfect but as the videos show the DX12 path in Fortnite at least is pretty good now, and we've had a few UE5 games with no real first run stutter. Certainly some amount of that getting some focus can be attributed to the additional priority of folks calling it out in previous releases.
 
We've finally got hold of a PlayStation Portal and we've got some initial impressions for you as we kick off this week's DF Direct Weekly, but beyond that we're talking about Steam Deck vs ROG Ally complete with some fresh benchmarks, we update you with our latest adventures with EA WRC on PC, while John spends plenty of time discussing the new RetroTink and its $750 price-point - along with his latest adventures with Meta Quest 3, Doom VR and the new Assassin's Creed VR experience.

0:00:00 Introduction
0:00:52 News 01: PlayStation Portal first impressions!
0:26:47 News 02: New Suicide Squad deep dive video lands
0:45:01 News 03: RetroTINK 4K to cost $750 USD
1:00:42 News 04: EA WRC PC update
1:08:59 News 05: Steam Deck OLED vs. ROG Ally!
1:25:12 News 06: John’s continuing Quest adventures
1:37:23 Supporter Q1: Will we ever see an arcade racing resurgence?
1:43:53 Supporter Q2: Did Starfield ship “unoptimized”? Why did it take so long to improve performance and add DLSS?
1:54:27 Supporter Q3: Is it possible Nintendo’s next console will be a traditional home system?
1:58:30 Supporter Q4: Is the Xbox One the only console to be obsoleted completely by its successor?
2:04:19 Supporter Q5: Will you ever do a retrospective on failed past consoles, like the Ouya?
 
Re: Q4

Yes. There is a reason to keep an Xbox One.
It's a tiny reason, highly specific. But it's fact.

The Xbox 360 game Risen, when played through backwards compatibility, runs too fast on One X and Series S/X. The camera speed is tied to framerate. A small tap on the stick will spin your view all the way around. Controlling the game becomes, if not impossible, then certainly extremely frustrating.
On Xbox One/S, the game runs slow enough to enable reasonable control of the game.
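That camera behavior is the classic symptom of movement coded per *frame* instead of per *second*. A hypothetical sketch (not Risen's actual code) of the bug and the usual delta-time fix:

```python
# Framerate-dependent camera: a fixed rotation is applied every frame,
# so the effective speed scales with framerate (the bug described above).
DEGREES_PER_FRAME = 2.0

def turn_rate_broken(fps):
    """Effective camera speed (deg/s) when rotation is applied per frame."""
    return DEGREES_PER_FRAME * fps

def turn_rate_fixed(fps, degrees_per_second=60.0):
    """Effective camera speed (deg/s) when rotation is scaled by delta time."""
    dt = 1.0 / fps                      # seconds elapsed per frame
    per_frame = degrees_per_second * dt  # rotation applied this frame
    return per_frame * fps               # ~constant at any framerate

print(turn_rate_broken(30))   # 60.0 deg/s at the framerate it was tuned for
print(turn_rate_broken(120))  # 240.0 deg/s: a small tap spins the view
print(turn_rate_fixed(120))   # ~60 deg/s regardless of framerate
```

This is why running the 360 version at a higher framerate on One X / Series consoles makes the camera uncontrollable, while the slower Xbox One/S stays close to the speed the game was tuned for.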
 
Re: Q4

Yes. There is a reason to keep an Xbox One.
It's a tiny reason, highly specific. But it's fact.

The Xbox 360 game Risen, when played through backwards compatibility, runs too fast on One X and Series S/X. The camera speed is tied to framerate. A small tap on the stick will spin your view all the way around. Controlling the game becomes, if not impossible, then certainly extremely frustrating.
On Xbox One/S, the game runs slow enough to enable reasonable control of the game.
What if you play it on a TV that's 4K but only accepts 4K30 via one or all of its HDMI ports? Would forcing the console to output 30 Hz solve the problem, or would you just have the same problem but with worse image quality?
 
What if you play it on a tv that's 4k but only accepts 4k30 via one or all of it's HDMI ports. Would forcing the console to output 30hz solve the problem, or would you just have the problem but with worse image quality.
I don't have a TV or monitor like that. No way to test it.

It should be mentioned, however, that the point I raised only really applies if you already owned the 360 version of the game, since they actually made a remastered version of it for newer consoles, which doesn't have the camera problem. So you'd have to want to play the original 360 release for some reason (which might be the jankiest game in history).
 
I wonder if the RetroTINK supports advanced/complicated algorithms like MadVR's, and whether those are even relevant to gaming content.
Looking at the price, I thought for sure the MadVR developers were behind it, since they were talking about launching a similar box, but only for video content.
 
Re: Q4

Yes. There is a reason to keep an Xbox One.
It's a tiny reason, highly specific. But it's fact.

The Xbox 360 game Risen, when played through backwards compatibility, runs too fast on One X and Series S/X. The camera speed is tied to framerate. A small tap on the stick will spin your view all the way around. Controlling the game becomes, if not impossible, then certainly extremely frustrating.
On Xbox One/S, the game runs slow enough to enable reasonable control of the game.
Also, the Kinect isn't supported at all on Series X|S, so an Xbox One is still necessary for Kinect 2.0 games such as Fru, Fantasia, Kinect Sports Rivals, and a small handful of others.
 


0:00:00 Introduction
0:00:52 News 01: PlayStation Portal first impressions!
0:26:47 News 02: New Suicide Squad deep dive video lands
0:45:01 News 03: RetroTINK 4K to cost $750 USD
1:00:42 News 04: EA WRC PC update
1:08:59 News 05: Steam Deck OLED vs. ROG Ally!
1:25:12 News 06: John’s continuing Quest adventures
1:37:23 Supporter Q1: Will we ever see an arcade racing resurgence?
1:43:53 Supporter Q2: Did Starfield ship “unoptimized”? Why did it take so long to improve performance and add DLSS?
1:54:27 Supporter Q3: Is it possible Nintendo’s next console will be a traditional home system?
1:58:30 Supporter Q4: Is the Xbox One the only console to be obsoleted completely by its successor?
2:04:19 Supporter Q5: Will you ever do a retrospective on failed past consoles, like the Ouya?
Turn off the Internet, take any XB1 disc game, and insert it into an XSX (one that has never played the game before); can you play the game via 'BC'?
 