Digital Foundry Article Technical Discussion [2024]

This is not true for the Series X version, where the image quality has been improved twice. When it was released there was a lot more shimmering in the picture; now it is surprisingly clean. It could be that they paid more attention to the console code.

I believe that was the case for Remnant 2. FSR 2 in that game on PC was atrocious, especially from equivalent starting resolutions, while the console implementation was superior. This is one of the reasons I don't believe the PS5 Pro's supposed "DLSS competitor" necessarily has to match DLSS in every aspect; the time and focus devoted to a reconstruction technique when it's the only implementation available can produce solid results on its own.


Finally, an interesting AMD graphics card. That's the way to go.

Good stuff, and kudos to Rich for focusing on the value argument - the 4070 and 7800 XT really should each be dropped by $50. I expect the 7800 XT will find itself forced into that position soon though, it just doesn't make much sense with the GRE now.

(edit: Hmm: AMD drops the Radeon RX 7700 XT price by $30)

I did find it a little surprising, though, considering their past arguments on this, that image quality did not get even a cursory mention, something along the lines of "on the downside, however, you still have FSR vs DLSS...".

Speaking of that though, perhaps some promising news on a more standardized approach:

Microsoft DirectSR (Super Resolution) To Be Revealed At GDC 2024


BTW, Techspot (HUB's site) actually had a more negative review of the 7900 GRE, as in their testing it was barely faster than the 7800 XT. Wildly different benchmark results from DF's in terms of performance relative to the 7800 XT/4070 - what's going on here?

 
Speaking of that though, perhaps some promising news on a more standardized approach:

Microsoft DirectSR (Super Resolution) To Be Revealed At GDC 2024

That sounds really interesting, because honestly I am tired of hunting for mods to inject XeSS into games that only ship with DLSS support. Thankfully, many games include XeSS nowadays, but for those that don't it can be a pain. This also might be very good if they create a console/PC hybrid, because that way it could enhance the vast majority of games running under Windows, not just a few select titles, even if it just becomes part of DX12.
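
To make the appeal concrete, here's a rough sketch of the kind of thin dispatch layer a standardized API could provide. To be clear, this is not the actual DirectSR interface (nothing has been published yet); the class and names below are purely hypothetical, just to show how one game-facing entry point could fan out to DLSS/FSR/XeSS depending on the GPU.

```python
# Hypothetical sketch, not the real DirectSR API: the point of a standardized
# layer is that the game feeds one set of inputs (color, depth, motion vectors,
# jitter) to a single interface and the runtime picks whichever backend the
# installed GPU supports.
from dataclasses import dataclass
from enum import Enum, auto


class Backend(Enum):
    DLSS = auto()
    FSR = auto()
    XESS = auto()


@dataclass
class UpscaleInputs:
    color: object           # low-resolution color buffer
    depth: object           # depth buffer
    motion_vectors: object  # per-pixel motion vectors
    jitter: tuple           # sub-pixel jitter offset for this frame
    output_size: tuple      # target resolution (width, height)


class SuperResolution:
    """One game-facing interface; the backend is chosen per GPU at startup."""

    def __init__(self, gpu_vendor: str):
        # Illustrative selection logic only - a real runtime would query
        # driver/hardware capabilities rather than switch on vendor strings.
        self.backend = {
            "nvidia": Backend.DLSS,
            "amd": Backend.FSR,
            "intel": Backend.XESS,
        }.get(gpu_vendor.lower(), Backend.FSR)

    def upscale(self, inputs: UpscaleInputs):
        # The game calls this once per frame regardless of backend.
        print(f"Upscaling to {inputs.output_size} with {self.backend.name}")


sr = SuperResolution("intel")
sr.upscale(UpscaleInputs(None, None, None, (0.25, -0.25), (3840, 2160)))
```

The win for modders and developers alike would be exactly that: one integration in the game, with the vendor-specific path resolved underneath instead of per-title.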
 
This is not true for the Series X version, where the image quality has been improved twice. When it was released there was a lot more shimmering in the picture; now it is surprisingly clean. It could be that they paid more attention to the console code.
I thought all the upscaling changes in the last patch were PC-only. There were some fixes for visual artifacts (not upscaler-related) which may have caused issues on consoles.

Starfield Update 1.9.67 Patch Notes

Graphics

  • Added support for AMD FidelityFX™ Super Resolution 3 (FSR 3). (PC)
  • Added support for Intel Xe Super Sampling (XeSS). (PC)
  • Fixed an issue that could make the clouds appear to vibrate when using DLSS performance mode. (PC)
  • Fixed minor visual artifact that could occur when aiming with a weapon or task swapping.

Stability

  • Changed how FormIDs are freed when loading saves. This should improve stability for saves that have visited many locations.
  • Fixed a crash that could happen when making changes to the ship that required all items to be moved to the cargo bay in the Ship Builder menu.

Miscellaneous

  • Reverted a change that caused the data menu to open when taking screenshots with F12 (PC)
  • Fixed an issue causing the resolution scale to reset to 1.0 when switching from Fullscreen to Windowed mode when using DLSS. (PC)
 
Yeah that is a great card for that price point. We definitely need more competition like this from AMD.
Yup, the GPU market has no real competition against Nvidia, and now that they're swimming in money because of AI, I don't expect them to care.

The part of the video I liked the most is when Rich mentions that A Plague Tale: Requiem is one of his favourite games.

I started the game recently and I couldn't agree more. The atmosphere is so real, and the attention to detail... you feel like you are in a medieval town. Those developers are truly good.

The woman plucking the hen and the man smoothing down that cane in the market reminded me of my grandmother and my grandfather (who used a piece of glass for that). But well, where I live the lifestyle was basically medieval (in a good way) until the 1950s-60s; then capitalism arrived, and now we are "modern" and in some ways better, in many ways more "stupid".
 
AMD GPUs honestly need to be $200 cheaper than the performance-equivalent Nvidia counterpart now that upscaling is assumed for nearly every game. Until they offer a solution that doesn't look completely terrible, this is their unfortunate reality.
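
To put rough numbers on the value argument, here's a quick cost-per-frame sketch. The prices and frame rates below are made up for illustration, not taken from any review.

```python
# Cost-per-frame comparison with illustrative numbers only
# (prices in USD, fps = averaged result across a game suite).
cards = {
    "RX 7900 GRE": {"price": 549, "fps": 100},
    "RX 7800 XT":  {"price": 499, "fps": 90},
    "RTX 4070":    {"price": 549, "fps": 92},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['fps']:.2f} per average frame")

# A flat price cut shifts the ratio directly, e.g. a hypothetical $50 off:
print(f"RTX 4070 at $499: ${499 / cards['RTX 4070']['fps']:.2f} per average frame")
```

The point being: if upscaling quality is part of the effective "performance" you get, the cost-per-frame gap has to be wide enough to compensate for it.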

BTW, Techspot (HUB's site) actually had a more negative review of the 7900 GRE, as in their testing it was barely faster than the 7800 XT. Wildly different benchmark results from DF's in terms of performance relative to the 7800 XT/4070 - what's going on here?

Do AMD GPUs have highly variable boosting like earlier Nvidia GPUs did? Remember the ASIC quality RMA days? Good times.
 
I believe that was the case for Remnant 2. FSR 2 in that game on PC was atrocious, especially from equivalent starting resolutions, while the console implementation was superior. This is one of the reasons I don't believe the PS5 Pro's supposed "DLSS competitor" necessarily has to match DLSS in every aspect; the time and focus devoted to a reconstruction technique when it's the only implementation available can produce solid results on its own.



Good stuff, and kudos to Rich for focusing on the value argument - the 4070 and 7800 XT really should each be dropped by $50. I expect the 7800 XT will find itself forced into that position soon though, it just doesn't make much sense with the GRE now.

(edit: Hmm: AMD drops the Radeon RX 7700 XT price by $30)

I did find it a little surprising, though, considering their past arguments on this, that image quality did not get even a cursory mention, something along the lines of "on the downside, however, you still have FSR vs DLSS...".

Speaking of that though, perhaps some promising news on a more standardized approach:

Microsoft DirectSR (Super Resolution) To Be Revealed At GDC 2024


BTW, Techspot (HUB's site) actually had a more negative review of the 7900 GRE, as in their testing it was barely faster than the 7800 XT. Wildly different benchmark results from DF's in terms of performance relative to the 7800 XT/4070 - what's going on here?

Yeah, Daniel Owen also has the 7900 GRE only 7% ahead at 1440p Ultra. Hardware Unboxed has it at 2%, and Richard at 20%. This makes no sense.
 
Different test scenarios? Different settings? IDK
I do not think anyone else uses the bench location we use; I think many use the in-game benchmark.

I think it's this.

Just like back in 2007 when reviewers benchmarked Crysis, they used the opening 2-3 maps of the game, only for people to reach the later levels and find out the performance was a lot worse.

Test location matters.
 
Different test scenarios? Different settings? IDK
I do not think anyone else uses the bench location we use; I think many use the in-game benchmark.
The settings are similar. HU uses High settings and Daniel Owen uses Ultra like Rich. HU drove around the city like Rich, but Daniel Owen used the built-in benchmark. All were GPU-limited, and that was also without RT, so I don't think this can explain the massively different results.

It doesn't stop there, though. Richard was impressed by the 7900 GRE, whereas HU barely saw a difference and were disappointed. They tested 12 games, and the results should have somewhat normalized, but they didn't.
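
On the "should have normalized" point: outlets typically aggregate per-game uplifts with a geometric mean, so a handful of GPU-heavy test scenes can still swing the headline number noticeably. A toy example with invented per-game ratios (not real data from any of these reviews):

```python
from math import prod

# Invented per-game ratios of 7900 GRE fps over 7800 XT fps - illustrative only.
per_game_uplift = [1.02, 1.01, 1.20, 1.03, 1.05, 1.01,
                   1.15, 1.02, 1.04, 1.03, 1.18, 1.02]

geomean = prod(per_game_uplift) ** (1 / len(per_game_uplift))
print(f"Geometric mean uplift: {(geomean - 1) * 100:.1f}%")

# Swap the test scene in the three outlier games for a less GPU-heavy area
# and the aggregate drops into the low single digits:
per_game_uplift[2] = per_game_uplift[6] = per_game_uplift[10] = 1.02
geomean = prod(per_game_uplift) ** (1 / len(per_game_uplift))
print(f"Geometric mean uplift: {(geomean - 1) * 100:.1f}%")
```

So even with 12 games in the suite, a few scene choices can be the difference between a "meh" 2% and an impressive-looking average.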
 

0:00:00 Introduction
0:01:28 News 01: Switch 2 reportedly delayed until 2025
0:13:08 News 02: Microsoft multi-plat games revealed
0:29:55 News 03: Nvidia debuts new all-in-one PC gaming app
0:47:43 News 04: PlayStation Portal hacked
0:54:31 News 05: PSVR2 support coming to PC?
1:05:57 News 06: Radeon VII re-tested in 2024!
1:20:19 Supporter Q1: In the age of ray tracing, why do so many games forego shiny mirrors?
1:25:44 Supporter Q2: Is cross-gen game scalability really that transformative?
1:32:50 Supporter Q3: Could Game Pass become a reality on PlayStation and Switch?
1:38:55 Supporter Q4: Could a future Xbox lower-spec console be a handheld?
1:50:25 Supporter Q5: Is the traditional console generation model breaking down?
 
Reviews seem to indicate not much has changed with the latest update.

TPU said:
Speaking of XeSS in Starfield, the XeSS 1.2 implementation in its DP4a mode looks quite impressive in this game compared to FSR 3 upscaling and the in-game TAA solution. The quality of built-in XeSS antialiasing is superior to both FSR 3 and TAA, and very close to DLSS in maintaining a stable image without shimmering or flickering of thin objects at 1440p and 4K resolutions. However, things are a bit different at the lower 1080p resolution: while XeSS is still able to produce an image without excessive shimmering at "Quality" mode, image artifacts become a problem, especially on weapon sights. There are very noticeable ghosting issues, to the point that we could clearly see four laser sights on the weapon when moving the mouse, which is very distracting for some people. Pixelation of particle effects is also an issue for XeSS in "Quality" mode at 1080p resolution, but to a lesser degree in comparison to FSR 3 upscaling.

Okay, so DP4a XeSS starts to break down at 1080p (though it still does some things better than FSR), so maybe this wouldn't be great for Series S. But blimey, it's sad that the Series X is perfectly capable of using a better upscaler than FSR, thanks to the hardware additions MS specifically asked for, yet there's still no progress in this area on Xbox.
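
For anyone wondering what the DP4a path actually is: it's an instruction that treats two 32-bit registers as four packed 8-bit integers each, multiplies the lanes pairwise and accumulates the sum, which is the sort of int8 dot-product support MS talked up for the Series consoles' GPUs. A plain-Python reference of the semantics (illustration only, not how XeSS actually invokes it):

```python
# Reference semantics of a DP4a-style operation: each 32-bit word holds four
# packed signed 8-bit lanes; the four products are summed into a 32-bit
# accumulator. GPUs expose this as a single instruction, which the non-XMX
# (DP4a) build of XeSS relies on for its int8 inference work.
import struct


def unpack_i8x4(word: int) -> list[int]:
    """Split a 32-bit word into four signed 8-bit lanes."""
    return list(struct.unpack("4b", word.to_bytes(4, "little")))


def dp4a(a: int, b: int, acc: int) -> int:
    """Return acc + dot(a_lanes, b_lanes) over the packed 8-bit lanes."""
    return acc + sum(x * y for x, y in zip(unpack_i8x4(a), unpack_i8x4(b)))


# Example: lanes (1, 2, 3, 4) . (10, 20, 30, 40) = 300
a = int.from_bytes(bytes([1, 2, 3, 4]), "little")
b = int.from_bytes(bytes([10, 20, 30, 40]), "little")
print(dp4a(a, b, 0))  # -> 300
```

Doing that in one instruction instead of four multiplies and adds is what makes int8 network inference cheap enough to run inside a frame budget.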
 
Okay, so DP4a XeSS starts to break down at 1080p (though it still does some things better than FSR), so maybe this wouldn't be great for Series S. But blimey, it's sad that the Series X is perfectly capable of using a better upscaler than FSR, thanks to the hardware additions MS specifically asked for, yet there's still no progress in this area on Xbox.

I'm guessing that's where this DirectX Super Resolution thing that's going to be announced is coming from. I wouldn't be surprised if it was designed for Xbox and PC; it's just coming later into the generation than would have been ideal.
 
Okay, so DP4a XeSS starts to break down at 1080p (though it still does some things better than FSR), so maybe this wouldn't be great for Series S. But blimey, it's sad that the Series X is perfectly capable of using a better upscaler than FSR, thanks to the hardware additions MS specifically asked for, yet there's still no progress in this area on Xbox.
If you look at the results of XeSS (DP4a) for console-equivalent GPUs such as the 6700 XT in comparison to native 1440p, only the Ultra Quality preset could offer unquestionably improved image quality, but that tradeoff comes with a performance hit. For the Quality preset, the results were a wash between XeSS and native resolution, especially if the latter is already treated with TAA. With the lower-quality presets there was undeniably improved performance, but the results were worse than native resolution ...

"Better" is a subjective description since both upscaling solutions come with a different set of compromises when it comes to console hardware ...
 
About the FF7 Rebirth blurry performance mode story: I wonder if there are some shenanigans where you somehow get a sharper image when switching the PS5 output to 1080p, the same way RDR2 could actually look sharper for some at 1080p versus the poor reconstruction used in the 4K Pro mode.
 
HUB explains why its review had lower performance.

AMD reference specs for this GPU have a lower power target and clocks alongside more throttling than partner cards.

Somewhat, but there were barely any changes in many games too - Cyberpunk had virtually no difference over the reference 7800 XT, whereas with DF it had one of the largest differences.

DF's video of their 7900 GRE review shows the Sapphire Pulse model; AFAIK the Pulse models are basically reference design versus the Nitro, but I have no idea what model they were using for their 7800 XT. Still, DF should tell us exactly what model they're testing. I can't imagine they would ignore any differences in reference clocks, but maybe the power target isn't that obvious.
 
Somewhat, but there were barely any changes in many games too - Cyberpunk had virtually no difference over the reference 7800 XT, whereas with DF it had one of the largest differences.

DF's video of their 7900 GRE review shows the Sapphire Pulse model; AFAIK the Pulse models are basically reference design versus the Nitro, but I have no idea what model they were using for their 7800 XT. Still, DF should tell us exactly what model they're testing. I can't imagine they would ignore any differences in reference clocks, but maybe the power target isn't that obvious.
Looking at other reviews like Computerbase, I suspect this GPU has a larger-than-typical variance in performance due to being so power-limited. The silicon lottery may be back in effect.
 
AMD reference specs for this GPU have a lower power target and clocks
Is this not completely typical for basically any graphics card release that has a reference model? One of the main points of buying a third-party option is generally the "better than stock" capabilities.
 