Digital Foundry Article Technical Discussion [2024]

NVIDIA could sell the 4070S for $300 and probably still make a decent profit. Those prices are just a ripoff after they realized people were willing to pay top dollar even for mid-tier products.

Unless you can actually back that up with a good source I wouldn't make such wild comments.
 
The days of affordable GPUs offering console crushing performance are never coming back. Not from Nvidia at least.
It's just that they don't seem to care anymore now that they're involved in AI, which seems to be where most of the money comes from. That's why we have the new RTX 4070 "SUPER", which isn't all that super when you consider it performs about 10% faster than a 3080, and the price is what it is.
 
The DF crew argues for the importance of standalone engines, amid the increasing reliance on UE5 among AAA and AA developers.

A critical point was made that most UE5 games shipped with stuttering problems, while most games built on standalone engines shipped with no stuttering whatsoever.

I wonder if there is a selection bias here resulting in more correlation than causation.

I'd suspect developers that use their own engines already lean (relatively) more towards an emphasis on (and capability in) the technical and software side, whereas developers who go with UE5 likely skew more towards an emphasis on the game design side.
 
To be fair, the PS5 is effectively almost $90 cheaper than it was when it released. $500 in Nov 2020 was about $590 in Dec 2023 dollars, while a $400 PS4 in Nov 2013 would be $415 in Dec 2016. In Dec 2016 the PS4 actually cost $350 AFAIK. I think this means the PS5 has actually come down in price slightly more than the PS4 did 3 years after release, just by staying at the same price :mrgreen:

Edit: as noted below, the PS4 Slim launched in Sept 2016 for $300, so the PS4 had actually come down in price a bit more than the PS5 has 3 years after release. It's pretty close in inflation-adjusted dollars.
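For what it's worth, the quoted math roughly checks out; here's a quick sketch using approximate US CPI-U index values (rounded, so the exact dollar figures may differ slightly):

```python
# Rescale a nominal price from one month's CPI level to another's.
def adjust(price, cpi_then, cpi_now):
    return price * cpi_now / cpi_then

# Approximate US CPI-U index values (rounded).
CPI = {"2013-11": 233.1, "2016-12": 241.4, "2020-11": 260.2, "2023-12": 306.7}

print(f"$500 PS5 (Nov 2020) in Dec 2023 dollars: ${adjust(500, CPI['2020-11'], CPI['2023-12']):.0f}")  # ~589
print(f"$400 PS4 (Nov 2013) in Dec 2016 dollars: ${adjust(400, CPI['2013-11'], CPI['2016-12']):.0f}")  # ~414
```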
But my wages didn't go up by the same amount as inflation, so I don't consider this a valid argument at all. Especially when much of the 'inflation' we've seen is just greed.
 
It's just that they don't seem to care anymore now that they're involved in AI, which seems to be where most of the money comes from. That's why we have the new RTX 4070 "SUPER", which isn't all that super when you consider it performs about 10% faster than a 3080, and the price is what it is.
They saw what people were willing to pay during the crypto craze. Customers have confirmed they are more than willing to be gouged. Gamers brought this on themselves.
 
When I look at other PC component or smartphone prices, I don't think GPU prices are that exaggerated.
Edit: Also, I think the general discourse tends to undersell ray tracing. Rich's comparisons mainly focused on rasterization, I believe (Cyberpunk in performance mode, Immortals of Aveum, TLOU Part II, A Plague Tale: Requiem, Alan Wake 2); I think only Frontiers of Pandora had RT in the suite of games he tested. If you enable it, the performance differential is quite a bit higher than a 100% advantage in favor of the 4070S. You're probably looking at a 120-150% uplift on average, and if the PS5 could handle path tracing, it'd be 200%. So on that front it's not all bad, but even on a 4070S ray tracing isn't that amazing, and current games are still largely built traditionally, so RT isn't always impressive outside of the usual suspects like Cyberpunk and Alan Wake 2.
Raster doesn't scale that well. Going by the 35 FP32 teraflops, the RTX 4070 Super has more than three times the compute throughput of the PlayStation 5's roughly 10.3 TF.
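A rough sketch of where that ratio comes from, using the standard peak FP32 formula (2 ops per shader per clock) and the commonly quoted shader counts and boost clocks:

```python
# Peak FP32 TFLOPs = 2 ops per shader per clock * shader count * clock (GHz) / 1000
def fp32_tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000

rtx_4070_super = fp32_tflops(7168, 2.475)  # ~35.5 TF at the rated boost clock
ps5            = fp32_tflops(2304, 2.23)   # ~10.3 TF (36 CUs x 64 lanes)

print(f"RTX 4070 Super: {rtx_4070_super:.1f} TF")
print(f"PS5:            {ps5:.1f} TF")
print(f"Ratio:          {rtx_4070_super / ps5:.2f}x")  # ~3.4x
```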

The difference between the PlayStation 5 and the RTX 4070 Super is even greater when it comes to ray tracing. In path-traced Cyberpunk 2077, the PlayStation 5's GPU would manage less than 8 fps, while an RTX 4070 Super gets around 30 fps.

If you're aiming for 60 fps, the RTX 4070 is good for playing RT games, with the exception of Alan Wake 2 and Cyberpunk 2077.


It would be nice if the RTX 4070 Super cost less, but $300 for a graphics card with RTX 4070 Super performance is unrealistic these days. Perhaps we should aim for something more realistic, like $500.


The RRP for an NVIDIA RTX 3080 was £649 in the UK.
In 2021 the RTX 3080 sometimes cost almost $2,000; in early 2022 it was $1,200, and now you can get an RTX 4070 for $600.
 
You can't use the TFLOPs of current GPUs as evidence that rasterization doesn't scale well. Rasterization is more than just FP32 math. If we're being honest, these TFLOP ratings are a farce anyway; these GPUs can't actually hit those numbers in practice. Nvidia has been doing the absolute minimum on the consumer side when it comes to hardware. Very few changes to the graphics architecture for 6 years now.
 
When I look at other PC component or smartphone prices, I don't think GPU prices are that exaggerated.

Other than power supplies and GPUs, what other PC component do you find is significantly more expensive compared to pre-2020 (really pre-Covid)?

Nvidia has been doing the absolute minimum on the consumer side when it comes to hardware. Very few changes to the graphics architecture for 6 years now.

I feel that's an overly broad statement/generalization unless by consumer you just mean "raster" gaming performance (in terms of FPS), and really price/performance.
 
Nvidia has been doing the absolute minimum on the consumer side when it comes to hardware. Very few changes to the graphics architecture for 6 years now.

I don't agree with this. The performance gap between the 4090 and the 3090 is as large as, or larger than (particularly where RT is involved), that between any two previous flagship GPUs going back a decade or more. So Ada itself is a spectacular architecture.

The problem is how NV chose to gouge the customer: taking advantage of that crazily good architecture to release massively cut-down parts at higher product tiers than they would previously have occupied (resulting in average or even lower-than-usual performance gains per tier), while simultaneously massively ramping up the price of each tier.
 
I don't agree with this. The performance gap between the 4090 and the 3090 is as large as, or larger than (particularly where RT is involved), that between any two previous flagship GPUs going back a decade or more. So Ada itself is a spectacular architecture.

The problem is how NV chose to gouge the customer: taking advantage of that crazily good architecture to release massively cut-down parts at higher product tiers than they would previously have occupied (resulting in average or even lower-than-usual performance gains per tier), while simultaneously massively ramping up the price of each tier.
The performance increase has come from a bump to the TDP and the increase in SMs allowed by the smaller node. The architecture has remained very similar since Turing. They have been beefing up the RT cores and increasing the cache but the graphics pipeline has seen minimal improvements. 2xFP32 is about it.
 
I see it differently. Ampere is already very different from Turing.

Other than power supplies and GPUs, what other PC component do you find is significantly more expensive compared to pre-2020 (really pre-Covid)?

In 2023 I paid more than twice as much for a standard motherboard (ASRock X670E PG Lightning) as I did before.
 
Nvidia has been doing the absolute minimum on the consumer side when it comes to hardware. Very few changes to the graphics architecture for 6 years now.

How many games use all of Turing’s features? Maybe they’re waiting for software to catch up. Adding more features that nobody is using isn’t going to help anyone. Better support for micro polys would be nice but that doesn’t seem to be a bottleneck today.

In the short term the best use of transistors is probably still more flops, more cache and faster RT.
 
You can't use the TFLOPs of current GPUs as evidence that rasterization doesn't scale well. Rasterization is more than just FP32 math. If we're being honest, these TFLOP ratings are a farce anyway; these GPUs can't actually hit those numbers in practice. Nvidia has been doing the absolute minimum on the consumer side when it comes to hardware. Very few changes to the graphics architecture for 6 years now.
With path tracing (a compute-heavy workload), the 4090 is 4x faster than the 6900 XT. That's basically a 1:1 increase with the TFLOPs number.
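As a rough sanity check on the 1:1 claim, comparing the commonly quoted peak FP32 figures for the two cards at boost clocks:

```python
# Spec-sheet peak FP32 throughput at boost clocks, in TFLOPs.
RTX_4090_TF  = 82.6   # 16384 shaders @ ~2.52 GHz
RX_6900XT_TF = 23.0   # 5120 shaders @ ~2.25 GHz

ratio = RTX_4090_TF / RX_6900XT_TF
print(f"FP32 ratio: {ratio:.1f}x")  # ~3.6x, roughly in line with the ~4x path tracing result
```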

Claiming that Nvidia has done the "absolute minimum on the consumer side when it comes to hardware" is so ridiculous. What has Sony done since the PS4? They are basically selling the same product again, just faster. A 4090 can do real-time path tracing at 4K with DLSS Quality, while a PS5 can be lucky to get 60 fps at 1080p in a simple UE5 rasterized game.
 
How many games use all of Turing’s features? Maybe they’re waiting for software to catch up. Adding more features that nobody is using isn’t going to help anyone. Better support for micro polys would be nice but that doesn’t seem to be a bottleneck today.

In the short term the best use of transistors is probably still more flops, more cache and faster RT.
I don't necessarily mean features, but rather architectural improvements that raise performance and remove scaling bottlenecks.

With path tracing (a compute-heavy workload), the 4090 is 4x faster than the 6900 XT. That's basically a 1:1 increase with the TFLOPs number.

Claiming that Nvidia has done the "absolute minimum on the consumer side when it comes to hardware" is so ridiculous. What has Sony done since the PS4? They are basically selling the same product again, just faster. A 4090 can do real-time path tracing at 4K with DLSS Quality, while a PS5 can be lucky to get 60 fps at 1080p in a simple UE5 rasterized game.
That path tracing improvement is not solely dependent on the TFLOPs though. The RT cores are doing a lot of the heavy lifting.

What does Sony have to do with this discussion? They don’t make GPUs. I don't understand where consoles factor into this at all.
 
To be fair, the PS5 is effectively almost $90 cheaper than it was when it released. $500 in Nov 2020 was about $590 in Dec 2023 dollars, while a $400 PS4 in Nov 2013 would be $415 in Dec 2016. In Dec 2016 the PS4 actually cost $350 AFAIK. I think this means the PS5 has actually come down in price slightly more than the PS4 did 3 years after release, just by staying at the same price :mrgreen:

Edit: as noted below, the PS4 Slim launched in Sept 2016 for $300, so the PS4 had actually come down in price a bit more than the PS5 has 3 years after release. It's pretty close in inflation-adjusted dollars.
Simply adjusting for inflation doesn't give a proper picture.
Real wages have declined over the past two years in the US, and in some countries that decline has lasted even longer.

In the past, consoles represented a technological breakthrough. You were paying more at launch for something far ahead of its time, and early adopters and wealthier individuals could afford them. Over time the prices dropped significantly and more people could afford them, making them accessible to lower-income groups. This was the case with the PS1 and PS2, where both price and form factor were hugely reduced.

The PS5 and Series X are a smaller jump, and consoles that have seen zero price reductions.

The effective price increase is absorbed into their lifecycle at the point where they were supposed to see price cuts; keeping them at the same price has a bigger negative impact on the less wealthy groups.

I also doubt that adjusting for inflation shows the proper picture of affordability. Inflation is calculated using a basket of goods as a yardstick, but more and more people are struggling to repay debts and meet the increasing demands of modern life, which comes with a lot of unexpected costs. A lot of products are more prone to faults and damage, have shorter lifespans, and carry higher maintenance costs than before. People have more recurring costs in their everyday lives now.
 
That path tracing improvement is not solely dependent on the TFLOPs though. The RT cores are doing a lot of the heavy lifting.

In Alan Wake 2 the 4090 is 2.6x faster at rasterization and 4.1x faster at path tracing than a 2080 Ti. FP32 throughput improved 6x (or about 4x if you credit Turing's separate INT32 pipes at 0.6 INT per FP op).
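Spelling out the 6x vs 4x figures, a rough sketch using the usual boost-clock numbers (the 0.6 INT-per-FP credit for Turing's dedicated INT32 pipes is an assumption, not a measured value):

```python
def fp32_tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000

tu102 = fp32_tflops(4352, 1.545)   # RTX 2080 Ti, ~13.4 TF
ad102 = fp32_tflops(16384, 2.52)   # RTX 4090,    ~82.6 TF

print(f"Raw FP32 ratio:       {ad102 / tu102:.1f}x")          # ~6.1x

# Turing issues INT32 on separate pipes; if you credit them with 0.6 INT ops
# of useful work per FP op, the effective compute gap shrinks to about 4x.
print(f"With 0.6x INT credit: {ad102 / (tu102 * 1.6):.1f}x")   # ~3.8x
```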

What does Sony have to do with this discussion? They don’t make GPUs. I don't understand where consoles factor into this at all.
It started with a comparison between a PS5 and a 4070 Super...
 
In Alan Wake 2 the 4090 is 2.6x faster at rasterization and 4.1x faster at path tracing than a 2080 Ti. FP32 throughput improved 6x (or about 4x if you credit Turing's separate INT32 pipes at 0.6 INT per FP op).


It started with a comparison between a PS5 and a 4070 Super...
The RT cores have had their throughput quadrupled since Turing. A 4090 also has almost twice as many.
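Putting those two factors together, assuming Nvidia's claimed 2x ray-triangle intersection rate per generation and the 68 vs 128 RT core counts:

```python
per_core_gain   = 2 * 2       # ~2x ray-triangle rate per generation: Turing -> Ampere -> Ada
core_count_gain = 128 / 68    # RT cores: RTX 4090 vs RTX 2080 Ti

print(f"Theoretical RT throughput gain (per clock): ~{per_core_gain * core_count_gain:.1f}x")  # ~7.5x
```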

But consoles have nothing to do with any of the statements I made.
 
With path tracing (a compute-heavy workload), the 4090 is 4x faster than the 6900 XT. That's basically a 1:1 increase with the TFLOPs number.

Claiming that Nvidia has done the "absolute minimum on the consumer side when it comes to hardware" is so ridiculous. What has Sony done since the PS4? They are basically selling the same product again, just faster. A 4090 can do real-time path tracing at 4K with DLSS Quality, while a PS5 can be lucky to get 60 fps at 1080p in a simple UE5 rasterized game.
So I think it would be a much stronger argument to compare a manufacturer's product families against its own previous families if you want to prove that they've done more.

So I would be looking at comparing the 4000 series up and down the product stack, and then comparing it against earlier architecture lines like the 3000 and 2000 series to see whether they've made any advancements.

Comparing against a competitor's product from a full architectural generation below doesn't actually prove your statement here. Older cards were not designed to take on ray and path tracing workloads to the extent newer generation cards are.

But even if it were the same generation, comparing within an IHV would still make more sense, as we want to know what Nvidia has accomplished since their last release, not since AMD's last release.
 

DF Direct Weekly #147: Palworld Mania vs Tech Jank, AMD AFMF Tested, Horizon Forbidden West PC!
0:00:00 Introduction
0:00:54 News 01: What’s up with Palworld?
0:25:45 News 02: Capcom adds Enigma DRM to older games
0:35:31 News 03: AMD releases Fluid Motion Frames tech
0:52:30 News 04: Horizon Forbidden West PC features detailed
1:05:22 News 05: Tekken 8: a superb fighter
1:15:53 News 06: Input lag deep dive
1:32:52 Supporter Q1: What would you recommend spec wise for an upper-midrange PC?
1:37:08 Supporter Q2: Should id Software commercialize their id Tech engine?
1:40:12 Supporter Q3: Is the time of 1080p monitors over?
1:43:55 Supporter Q4: Could Nintendo, Microsoft, or Sony develop their own Proton-like translation layers to run games from other platforms?
1:46:55 Supporter Q5: With John’s new Direct background, isn’t he worried about burn-in?
 