Digital Foundry Article Technical Discussion [2024]

I don’t think anybody is discussing


You don’t know what thermally constrained means, but anyway, you still proved my point.
The Xbox One uses 15 times the energy of an iPhone. Factoring in resolution, that would make Apple's graphics architecture and drivers 3 or 4 times as efficient as the Xbox's.

Compare the M3 MacBook Air to the M3 MacBook Pro if you want to see the effect of throttling.
I'm starting to think that you don't know what you are talking about in general. Do you have any idea what a process node is? If I say 28nm and 3nm, do you know what those mean? :unsure:

"You don't know what thermally constrained means"
Lmao
 
The Xbox One uses 15 times the energy of an iPhone. Factoring in resolution, that would make Apple's graphics architecture and drivers 3 or 4 times as efficient as the Xbox's.
Yes, who could have guessed that an architecture almost three years newer, with a two-process-node advantage, would be much more efficient.
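For what it's worth, the perf-per-watt arithmetic being argued over here is easy to sanity-check. A back-of-the-envelope sketch, where the wattage and pixel-throughput figures are illustrative assumptions rather than measured numbers:

```python
# Rough perf-per-watt comparison. All figures below are illustrative
# assumptions, not measurements.
xbox_one_power_w = 112.0                 # assumed Xbox One draw under load
iphone_power_w = xbox_one_power_w / 15   # the claimed 15x power gap

# Assume the console pushes roughly 4-5x the pixel throughput
# (e.g. 900p-1080p vs. a phone-class render resolution).
pixel_throughput_ratio = 4.5

power_ratio = xbox_one_power_w / iphone_power_w          # 15x
efficiency_ratio = power_ratio / pixel_throughput_ratio
print(f"~{efficiency_ratio:.1f}x perf-per-watt advantage")  # ~3.3x
```

That quotient is where the "3 or 4 times as efficient" figure comes from, though as the reply notes, a newer architecture on a newer process node accounts for much of it.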
 

DF Direct Weekly #171: Xbox Fire Stick Tested, Black Myth: Wukong Specs, First Descendant Frame-Gen!


0:00:00 Introduction
0:00:54 News 01: Microsoft says you don’t need an Xbox to play Xbox
0:24:31 News 02: Game Pass prices increased
0:38:36 News 03: New Black Myth: Wukong PC details revealed
0:53:26 News 04: Original Xbox gets 1.4 GHz CPU upgrade
1:01:34 News 05: The First Descendant frame gen tested!
1:13:21 News 06: Ace Combat 7 tested on Switch
1:19:41 Supporter Q1: Which Windows-based PC handheld would you recommend most?
1:25:36 Supporter Q2: Will you be doing more 3 person game playing videos?
1:29:44 Supporter Q3: What old unused Nintendo IP would you like to see again?
1:34:45 Supporter Q4: If Microsoft got Windows working seamlessly on ARM, what would that mean for their products?
1:43:27 Supporter Q5: Do you think Valve will give Steam Machines another shot?
1:49:21 Supporter Q6: Why is Intel’s XeSS so much better than AMD’s FSR?
1:59:01 Supporter Q7: Will Unreal Engine traversal stutter be fixed for the new Witcher game?
2:02:08 Supporter Q8: Could the PS Portal be updated or hacked to be a substantially more useful device?
2:06:25 Supporter Q9: John asks: how is the recording session going?
 
1.4GHz OG Xbox upgrades existed more than twenty years ago. I'm surprised that DF has never heard of the Friendtech DreamX Xbox. You could buy them online. Very expensive, though.
 
Friendtech sold different configurations. They also had a model with a CPU close to the default speed, but it was still a desktop CPU with twice the L2 cache. People who got those units said that in a lot of games, performance increased just as much as with the 1.4GHz models. If those reports were accurate, then the Xbox was seemingly cache-starved in certain instances.
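If the Xbox really was cache-starved, the benefit of doubling the L2 is easy to model with the standard average-memory-access-time formula. A minimal sketch; the hit times, miss rates, and the miss-rate drop for the bigger cache are illustrative assumptions, not measurements of the actual CPU:

```python
# AMAT = L1_hit + L1_miss_rate * (L2_hit + L2_miss_rate * mem_penalty)
# All numbers below are illustrative assumptions, not measured figures.

def amat(l1_hit=1.0, l1_miss_rate=0.10,
         l2_hit=8.0, l2_miss_rate=0.40, mem_penalty=100.0):
    return l1_hit + l1_miss_rate * (l2_hit + l2_miss_rate * mem_penalty)

stock = amat(l2_miss_rate=0.40)    # assumed miss rate with 128 KB L2
doubled = amat(l2_miss_rate=0.28)  # 256 KB L2: miss rate drops roughly
                                   # by sqrt(2) (classic rule of thumb)

print(f"stock:   {stock:.1f} cycles/access")           # 5.8
print(f"doubled: {doubled:.1f} cycles/access")         # 4.6
print(f"memory-bound speedup: {stock/doubled:.2f}x")   # ~1.26x
```

On code whose working set spills out of the smaller L2 but mostly fits in the larger one, the real-world gap could be bigger than this average-case model suggests.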
 
I remember various developers back in the day saying the Xbox was mainly bandwidth-starved and CPU-limited, so I imagine doubling the L2 would make a large difference in performance.

Man, I miss that generation: games were at their peak, hardware was interesting... good times.
 
I remember various developers back in the day saying the Xbox was mainly bandwidth-starved and CPU-limited, so I imagine doubling the L2 would make a large difference in performance.

Man, I miss that generation: games were at their peak, hardware was interesting... good times.
The biggest advantage of the OG Xbox's CPU was that it was an out-of-order design, something that was lost in the transition to the Xbox 360 generation; the 360 could be much slower than the OG Xbox in certain situations because its CPU was in-order.
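A toy cycle-count model shows why that matters: an in-order core serializes everything behind a cache miss, while an out-of-order core can keep executing independent work underneath it. The latencies here are arbitrary illustrative numbers and the model is idealized:

```python
# Idealized cost of one 50-cycle cache miss with some independent
# work available. All latencies are arbitrary illustrative numbers.

MISS = 50    # cycles for a load that misses cache
INDEP = 20   # independent 1-cycle ALU ops in the instruction window
DEP = 1      # one op that consumes the loaded value

# In-order: the miss stalls everything behind it.
in_order = MISS + INDEP + DEP            # 71 cycles

# Out-of-order (idealized): independent ops execute under the miss,
# so the critical path is just load -> dependent use.
out_of_order = max(MISS + DEP, INDEP)    # 51 cycles

print(f"in-order: {in_order} cycles, out-of-order: {out_of_order} "
      f"cycles ({in_order / out_of_order:.2f}x)")
```

Which is part of why code tuned for the out-of-order OG Xbox could run much worse when ported naively to the in-order 360.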
 

Alex takes a look at Forza Motorsport on PC with an RT mod enhancing the graphics.


Edit: Now that I'm watching the video, it's just baffling to me why some of these improvements aren't officially supported and enabled by the game. The grass distance, for one... why is that not the default Ultra setting? Why can't we have options that push fidelity out further? Yes, I know people would just complain about Ultra being too demanding, but that's a mindset we have to start changing in PC gaming, IMO. We should normalize "High" settings that target current top-end GPU hardware and "Ultra" settings that target future GPUs.
 

Alex takes a look at Forza Motorsport on PC with an RT mod enhancing the graphics.


Edit: Now that I'm watching the video, it's just baffling to me why some of these improvements aren't officially supported and enabled by the game. The grass distance, for one... why is that not the default Ultra setting? Why can't we have options that push fidelity out further? Yes, I know people would just complain about Ultra being too demanding, but that's a mindset we have to start changing in PC gaming, IMO. We should normalize "High" settings that target current top-end GPU hardware and "Ultra" settings that target future GPUs.
The unoptimized crowd has won that battle. Anything that isn't 60fps at 1080p on ultra settings on a 3060 is unoptimized garbage, no matter the visuals on screen.
 
Edit: Now that I'm watching the video, it's just baffling to me why some of these improvements aren't officially supported and enabled by the game. The grass distance, for one... why is that not the default Ultra setting? Why can't we have options that push fidelity out further? Yes, I know people would just complain about Ultra being too demanding, but that's a mindset we have to start changing in PC gaming, IMO. We should normalize "High" settings that target current top-end GPU hardware and "Ultra" settings that target future GPUs.
The unoptimized crowd has won that battle. Anything that isn't 60fps at 1080p on ultra settings on a 3060 is unoptimized garbage, no matter the visuals on screen.

Isn't it because it could make the Series X look "too weak" if the visual difference were noticeable?
I remember the game being used as a power showcase for the Series X, so it's inevitable that the studio would receive a lot of criticism if the proudly marketed visual features that "the Series X would enable" were only present on the PC version.
 
The XSX version is very well optimized; the graphical details have been improved several times since release, all at native 4K with a stable 60fps. Performance has been improved several times on PC as well, but the game isn't as well optimized there, or there's only so much you can get out of it, because FM8 is very resource-intensive at native 4K if you want a stable frame rate.

Now, the fact that someone adds a mod doesn't mean much, since the frame rate in this game is quite unstable on PC anyway (a stable one requires serious hardware); you can imagine how much the FPS drops with the mod... it makes no sense.
 
Isn't it because it could make the Series X look "too weak" if the visual difference were noticeable?
I remember the game being used as a power showcase for the Series X, so it's inevitable that the studio would receive a lot of criticism if the proudly marketed visual features that "the Series X would enable" were only present on the PC version.
You know what makes the Series X look weaker than anything? The downgrade the game got from the vertical-slice gameplay trailer they released.

I think they should be more interested in selling copies of the game than in making the Series X look weak. People broadly know what a Series X is capable of; an enhanced foliage draw distance on PC isn't going to make anyone question the console, and yet it's possible and they didn't bother doing it. Ray tracing is another area where people completely understand PC getting enhanced support over the console, and again they didn't do it.

I think in this particular case, it's more about their denoiser not being up to the task for their RTGI implementation.

But let's be honest: it likely has just as much or more to do with what Charlietus said, which is that people would complain about how demanding the "maxed" settings are and claim the game is unoptimized, and that could easily lead to mass review-bombing on Steam and hurt sales. As sad as it is, developers have to be cognizant of that possibility, so it's simply easier to ship with mostly equivalent settings, or settings enhanced only through increased resolution/sample counts of graphical features, which are simple to adjust and don't drastically alter the image. And of course there are the costs involved in having to test everything and make sure nothing breaks the game lol.
 
The unoptimized crowd has won that battle. Anything that isn't 60fps at 1080p on ultra settings on a 3060 is unoptimized garbage, no matter the visuals on screen.
An RTX 3060 is not an ultra-settings graphics card.

Ultra settings don't have to be optimised at all. In Avatar they often bring little to the table visually and cost a lot; but if you have an RTX 4090, you can still use them.

Cyberpunk 2077's path tracing is also effectively an ultra setting, and you can forget about an RTX 3060 running it at 1080p and 60fps. You're looking more at 720p at 30fps there, but that has nothing to do with the game being unoptimised.

You always have to consider the visual output. For example, you can't compare the power consumption of appliances and say that appliance A is worse than appliance B just because A consumes more; if A has better output, the higher consumption may be justified. Take a GTX 1080 Ti vs. an RTX 4090: the RTX 4090 consumes more in absolute terms but delivers much higher performance.
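Putting rough numbers on that last comparison (the performance index and board-power figures are ballpark assumptions, not benchmark results):

```python
# Perf-per-watt sketch: higher absolute draw can still mean higher
# efficiency. Performance index and wattages are ballpark assumptions.

perf_1080ti, watts_1080ti = 1.0, 250  # baseline index, assumed ~250 W
perf_4090, watts_4090 = 3.5, 450      # assumed ~3.5x faster at ~450 W

power_ratio = watts_4090 / watts_1080ti                        # 1.8x
eff_ratio = (perf_4090 / watts_4090) / (perf_1080ti / watts_1080ti)

print(f"RTX 4090 draws {power_ratio:.1f}x the power "
      f"but is ~{eff_ratio:.1f}x more efficient per watt")     # ~1.9x
```

So the card that "consumes more" is still the better appliance by the output-per-watt measure.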
 
An RTX 3060 is not an ultra-settings graphics card.

Ultra settings don't have to be optimised at all. In Avatar they often bring little to the table visually and cost a lot; but if you have an RTX 4090, you can still use them.

Cyberpunk 2077's path tracing is also effectively an ultra setting, and you can forget about an RTX 3060 running it at 1080p and 60fps. You're looking more at 720p at 30fps there, but that has nothing to do with the game being unoptimised.

You always have to consider the visual output. For example, you can't compare the power consumption of appliances and say that appliance A is worse than appliance B just because A consumes more; if A has better output, the higher consumption may be justified. Take a GTX 1080 Ti vs. an RTX 4090: the RTX 4090 consumes more in absolute terms but delivers much higher performance.
That is, in fact, what I was saying 😅
 