Digital Foundry Article Technical Discussion [2024]

What's redeeming about the article is that it shows some investment by MS in generating the AI models to do the work. If those AI models (which are costly to develop) prove sufficient and satisfactory, they may make their way to consoles. It's not clear right now if MS is using a 3rd-party service for this model, or if they made their own.
Interesting... Back in the day, in the PS2/PS3 era, consoles were advanced devices that introduced new techs to the world that were later adopted globally. Magic SSD and Kraken aside, not much seems to be happening in the console world, and maybe some kind of chip like that would change things.

You'll see how RT along with high framerates (through frame-generation technologies) are going to be the lure of future consoles. And the resolution fight -which DF videos/articles have covered a lot- will die definitively with the imposition of upscaling.
 
I still think there's a lot more to come from the SSDs and the DirectStorage-esque capabilities. I think devs are only scratching the surface of what can be done with those in terms of using them as memory multipliers rather than just fast loading screens.

I don't think this is any minor thing they're bringing to the table, but ultimately a full-on paradigm shift in how games get made. But it's going to take a while for developers to fully embrace the changes they need to make on their own end to take advantage of it.
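The "memory multiplier" idea above can be sketched with a toy model: treat RAM as a small cache over a much larger pool of SSD-resident assets, evicting least-recently-used entries and streaming others in on demand. This is purely illustrative (all names and sizes are made up); real DirectStorage work involves async I/O queues and GPU decompression, not a Python dict.

```python
# Toy model of SSD-as-memory-multiplier: a fixed RAM budget backed by a
# much larger SSD asset pool, with least-recently-used (LRU) eviction.
from collections import OrderedDict

class StreamingCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.cache = OrderedDict()   # asset name -> size in MB

    def request(self, name, size_mb):
        if name in self.cache:                    # already resident in RAM
            self.cache.move_to_end(name)          # mark as recently used
            return "hit"
        while self.used + size_mb > self.budget:  # evict LRU assets to fit
            _, evicted_size = self.cache.popitem(last=False)
            self.used -= evicted_size
        self.cache[name] = size_mb                # "stream in" from SSD
        self.used += size_mb
        return "miss"

cache = StreamingCache(budget_mb=100)
print(cache.request("rocks", 60))   # miss - streamed from SSD
print(cache.request("trees", 60))   # miss - evicts "rocks" to fit
print(cache.request("rocks", 60))   # miss again - it was evicted
```

The faster the SSD path, the smaller the RAM budget can be for the same perceived asset variety, which is the paradigm shift the post describes.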
 

Finally we have Relink covered. And yes that camera stutter is super annoying and perceptible across the entire game.

Also, I believe they use the same engine as PlatinumGames, as the game's assets share the same format as in Nier Automata, with some improvements like using Granite to stream textures. Quite a lot of Platinum employees joined Cygames after the incident and continue working on this project.
 
When he turned the anti-aliasing off it was the perfect example of just how blurry modern games are.
While not a blurry game by any means, I just turn AA off in Elden Ring to save as much energy as possible and play at low wattage. I haven't yet paid for the mod that adds DLSS or XeSS to Elden Ring, and I'm not using Lossless Scaling -an app- either; just locked 30fps and Black Frame Insertion from the TV. I'm "condemned" to play native 3840x2160 at times for testing purposes, but I mostly play ultrawide 3840x1080, and even without AA the game looks okay.

I wonder how DF staff found out that the PS5 version of Helldivers 2 is using FSR1 for upscaling. NIS and FSR1 are okay, but on a console, where games are usually played at native 16:9 and you don't need to run a game at internal 1080p on a 1440p display -one of the strengths of FSR1 and NIS, which handle those odd resolution conversions fine- it's puzzling why they didn't use FSR2.
 
The reality is temporal solutions are only going to increase. Sampling over time is the future. You're just not going to see real gains in rendering quality without temporal data. Spatial upscaling is very limited because it will always be some kind of interpolation, whereas a temporal solution has access to real, good samples you've already generated. Downscaling (supersampling) is just a dead end because it requires generating more samples per frame, which is just brute force. I do think the real issue is pushing temporal upscalers past their capabilities: upscaling from 720p to 1440p and then applying an additional spatial upscale to 4K is never going to look sharp, at least with current solutions (and probably never will).
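The core idea behind "sampling over time" can be sketched in a few lines: blend each new jittered sample into a running history (an exponential moving average), so the effective sample count per pixel grows across frames instead of per frame. This is a minimal illustration of temporal accumulation, not any specific upscaler's algorithm; all names are made up.

```python
# Minimal sketch of temporal accumulation: an exponential moving average
# over noisy per-frame samples converges toward the true pixel value.
import random

def render_sample(pixel_truth, jitter_noise=0.5):
    # Stand-in for one noisy/aliased sample of the "true" pixel value.
    return pixel_truth + random.uniform(-jitter_noise, jitter_noise)

def temporal_accumulate(pixel_truth, frames, alpha=0.1):
    # Blend each new frame's sample into the history; the variance of the
    # history shrinks well below that of any single frame's sample.
    history = render_sample(pixel_truth)
    for _ in range(frames - 1):
        history = (1 - alpha) * history + alpha * render_sample(pixel_truth)
    return history

random.seed(0)
single_frame_error = abs(render_sample(0.7) - 0.7)
accumulated_error = abs(temporal_accumulate(0.7, frames=120) - 0.7)
print(single_frame_error, accumulated_error)  # accumulated is typically far smaller
```

This is also why spatial-only upscaling is limited, as the post argues: it can only interpolate the current frame's samples, while the history buffer holds genuinely new information from past frames.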

Unfortunately quite true. However, IMO and from my experience, I'm almost entirely unsatisfied with any form of temporal upscaling/reconstruction at the moment; too many artifacts and too much instability in the image.

This is likely due to me only gaming at a max of 120 Hz (my display's maximum). I imagine that for temporal upscaling/reconstruction, the point at which it goes from mostly annoying to mostly pleasing for me will be somewhere around 240 Hz rendering or higher. At that point you can have, say, 4x samples accumulated at 240 Hz to give roughly the quality of a non-temporal solution at 60 Hz, assuming of course that the artifacting isn't too egregious.
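The back-of-envelope figure above can be written out explicitly: at a 240 Hz render rate, the accumulator gathers four samples within a single 60 Hz display interval, and that whole 4-sample window still fits inside one 60 Hz frame time.

```python
# Back-of-envelope: how many temporal samples fit in one display interval.
display_hz = 60
render_hz = 240

samples_per_display_frame = render_hz / display_hz
print(samples_per_display_frame)  # 4.0

# Time span covered by those 4 samples, in milliseconds:
window_ms = samples_per_display_frame * (1000 / render_hz)
print(window_ms)  # ~16.7 ms, i.e. exactly one 60 Hz frame time
```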

The problem is that currently I can't justify a GPU that can consistently render at 240 Hz without the added artifacting of something like DLSS/FSR/XeSS (none of which are entirely satisfactory to me, although some can be better than developer-created temporal solutions).

I think this may be one (of a variety) of reasons that I just don't enjoy the vast majority of AAA games anymore. The reliance on image degrading (to me) temporal solutions in order to enable more advanced rendering features defeats the point of the more advanced rendering features in many cases. Again, for those reading this, I can't stress this enough ... for me. YMMV.

It's frustrating enough that when I cannot disable headache inducing attempts (motion blur, for example) to "hide" or mitigate temporal rendering artifacts I'll just stop playing the game. And even when I can disable them either through menu options or a config file, I then have to hope that the temporal rendering artifacts aren't so egregious that it completely ruins the experience of playing the game.

I totally understand that I'm likely in the minority on this, but it's frustrating nonetheless. /sigh.

Regards,
SB
 
@Silent_Buddha I've recently been tweaking my display a little bit using novideo_srgb. I'm always tinkering with calibration etc. In my reading I've come across people who are sensitive to temporal dithering and things like that. I guess one of the reasons nvidia disables dithering in their drivers is that some people actually get headaches from temporal dithering on some older nvidia products. There are people making claims about DSC and such as well. You're not supposed to be able to see temporal dithering or DSC, but apparently some people can. I'm not sensitive to those things at all; I imagine some people are just way more sensitive to particular visual artifacts.

Personally, I have a hard time with no AA because the edge crawling bothers me, and there's really no suitable way to deal with it other than TAA unless we kind of go back to having games look like they're 10+ years old. But for another person, TAA artifacts could be more bothersome than edge aliasing. Hopefully solutions just keep getting better to the point where more people are happy. Unity's STP sounded really interesting, but hopefully it's not dead now that they made the brain-dead decision to lay off Timothy Lottes.
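For readers unfamiliar with the term: temporal dithering (FRC) approximates an in-between brightness level that the panel can't display natively by alternating between its two nearest displayable levels across frames, so the time-average matches the target. The sketch below is a generic error-diffusion version of that idea, not any vendor's actual driver implementation; sensitive viewers perceive the frame-to-frame alternation as flicker.

```python
# Toy temporal dithering: show integer levels lo/hi over time so their
# average equals a fractional target the panel can't display directly.
def temporal_dither(target, frames):
    lo = int(target)          # nearest displayable level below the target
    hi = lo + 1               # nearest displayable level above it
    frac = target - lo        # fraction of frames that must show `hi`
    shown, err = [], 0.0
    for _ in range(frames):
        err += frac           # accumulate the error each frame...
        if err >= 1.0:        # ...and emit `hi` once it reaches a full step
            shown.append(hi)
            err -= 1.0
        else:
            shown.append(lo)
    return shown

seq = temporal_dither(100.25, 8)
print(seq)                  # [100, 100, 100, 101, 100, 100, 100, 101]
print(sum(seq) / len(seq))  # 100.25 - the time-average matches the target
```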
 
Switch 2 has many spec leaks; is it possible to calculate its relative performance?

8 nm SoC (probably from Samsung)
8” LCD
Probably two Joy-Cons with vibration
RTX 30 (Ampere) architecture

If Switch 2 draws 15W in handheld mode, can it beat the performance of the Steam Deck?
The GPU of Switch 2 may only use 6~8W.
 

Screamer was such an amazing game in that era. I also liked games like Speed Haste, but none like the original The Need for Speed. Microsoft Golf for the PC was my first game, and The Need for Speed was my second.
 
RTX HDR is far better than Windows AutoHDR.

It looks impressive. The fact that it works on all games -even super old games, DirectX 5 or 6 games, or Glide o_O games- makes it even better. Seeing HDR in games I played on my Monster 3D Voodoo card is totally unexpected.
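The basic idea behind these SDR-to-HDR features is inverse tone mapping: stretch the SDR luminance range (0..1) into a much wider nit range, expanding highlights far more than midtones and shadows. The curve below is a toy illustration of that principle only; it is not NVIDIA's actual RTX HDR algorithm (which is AI-based), and the nit values are arbitrary example parameters.

```python
# Toy inverse tone map: keep midtones near SDR brightness, push the top
# of the range toward the display's peak with a highlight-boost term.
def inverse_tonemap(sdr, peak_nits=1000.0, sdr_white_nits=200.0):
    base = sdr * sdr_white_nits               # linear SDR brightness
    boost = (sdr ** 4) * (peak_nits - sdr_white_nits)  # highlight expansion
    return base + boost

print(inverse_tonemap(0.5))   # midtone: 150 nits, close to its SDR level
print(inverse_tonemap(1.0))   # full white: mapped to the 1000-nit peak
```

The steep highlight term is why old games with clipped white skies or specular highlights suddenly pop: those regions get mapped toward the display's peak while the rest of the image stays close to its SDR appearance.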

I'd love to see it working on Gears of War, a game I liked graphically but, like Killzone 2, found too grey for my taste. At the time I thought HDR was like eye adaptation -as shown in the video- and I liked the games emulating that.

Hope MS takes note. It's quite sad to see an OS that works really well for gaming get certain features done better by other companies, when MS could do it themselves, and apply it not only to PCs but also to their consoles; it's like the ultimate backwards compatibility. But yeah, they've been sleeping all this time.

Auto HDR is better than nothing, but the nVidia solution is quite superior. Also, MS could force the swap chain to HDR in some games, like Special K does in games such as Alien Isolation.
 
Some people make great things, like this Virtual Display app, which creates a virtual display -even on, say, a 1080p 60Hz panel- emulating resolutions from 640x480 to 7680x4320 (8K), and refresh rates including 60Hz, 75Hz, 90Hz, 120Hz, 144Hz, 165Hz, 240Hz, 480Hz, and 500Hz.
 
Switch 2 has many spec leaks; is it possible to calculate its relative performance?

8 nm SoC (probably from Samsung)
8” LCD
Probably two Joy-Cons with vibration
RTX 30 (Ampere) architecture

If Switch 2 draws 15W in handheld mode, can it beat the performance of the Steam Deck?
The GPU of Switch 2 may only use 6~8W.
As usual, a warning: it's a Nintendo handheld. Realistically they're aiming for PS4 performance (with higher resolution in docked mode) with more RAM, to be able to run PS5/Series games at much lower resolutions/settings, the same way the Switch can run native PS4 games at 360p/480p.
 