Digital Foundry Article Technical Discussion [2024]

AMD could easily compete and take market share from Nvidia if they focused on it.

The average gamer doesn't give a shit about the high-end GPUs and technology we talk about in here.

If AMD focused on making a GPU with the performance level of the 7900XTX but at half the price, it would give Nvidia a right shafting.

That wouldn't be possible on every process node but it would be a good target to aim for and would generate more sales than any Nvidia GPU.

Not to mention such performance at that price would do wonders for PC gaming.

They did it with the HD4870, HD5870 and HD7970.

So they can do it......
Nvidia is quite easily surpassable.

"Easily", yeah, so why haven’t they done it? Maybe because it’s not anywhere near as easy as you’re pretending? Otherwise, give Lisa Hsu a call, we’d all have to see you steer AMD in the right direction.

AMD has been trying for over a decade, but it turns out two beyond3d posters are the answer to their woes.
 
Ok, I'm an idiot and couldn't follow the discussion; nothing to see here, move along please...
No, it wasn't clear. Below2D followed a post about Sony boosting AMD's potential with a post about nVidia. The logical line of discussion would have been about what Sony can bring, if anything, to AMD's R&D.

Let's get this back on track...

This isn't a business nor marketing thread. The point was raised about Sony partnering with AMD to develop open rendering tech. The discussion should be a technical discussion about what Sony can contribute if anything, not trumpeting the market leader as unassailable.
 
It doesn't matter if Nvidia sets the tone or not. AMD could easily add RT cores if they wanted to, but they insist it's not an efficient use of die space
Guess what? The market has already reacted: Sony is following in NVIDIA's footsteps, Intel is doing the same, even Apple did the same, and Epic went all-in on RT with their MegaLights feature. AMD is just lagging behind the entire market.

adoption is very slow. It's usually a time lag of a generation or two
This is the case for any DirectX feature; it's not something specific to DXR. It has been like this as far back as hardware T&L.

Guess what? 19k games were released on Steam this year alone
99.9% of those are games that barely need a DX9 GPU.

If AMD released a GPU with RTX 4090 performance in raster and RTX 3080 performance in RT for $599, most people would not be buying RTX 5000. AMD needs to release products that make it hard to justify the purchase of Nvidia GPUs at any price point under $599
This would trigger NVIDIA into a price war AMD can't afford. NVIDIA can withstand years of a price war, AMD can't. NVIDIA's stock is priced based on their AI financial performance, not anything else. In the past, NVIDIA has shown they can get into price wars easily (especially now that they have other high-margin products that can compensate for any drops), and AMD has learned to avoid that.
 
While some see this as SONY assisting AMD to achieve "more", I see it as a sign that AMD failed to deliver this console generation and SONY trying to avoid AMD failing even more.
This kinda mimics what is going on in the AI space and the PC GPU space.
AMD is failing to do anything other than trying to play catch-up 🤷‍♂️
 
"Easily", yeah, so why haven’t they done it? Maybe because it’s not anywhere near as easy as you’re pretending? Otherwise, give Lisa Hsu a call, we’d all have to see you steer AMD in the right direction.

AMD has been trying for over a decade, but it turns out two beyond3d posters are the answer to their woes.

The same reason it took them so long to give Intel a good bashing.

And rumours of what they're doing with the 9000 series suggest I'm right about both the direction they need to go and where they're actually heading.

So try thinking before posting instead of resorting to mockery and sarcasm.
 
While some see this as SONY assisting AMD to achieve "more", I see it as a sign that AMD failed to deliver this console generation and SONY trying to avoid AMD failing even more.
This kinda mimics what is going on in the AI space and the PC GPU space.
AMD is failing to do anything other than trying to play catch-up 🤷‍♂️

How did AMD fail to deliver this console generation exactly?

PS5 and Series-X are extremely competitive gaming machines compared to PC.
 
How did AMD fail to deliver this console generation exactly?

PS5 and Series-X are extremely competitive gaming machines compared to PC.
Have the discussions passed you by entirely?
They launched a console APU with a lacklustre RT implementation and a lacklustre upscaling implementation.
Games offer either a 30 FPS (or below) "high" quality mode or a 60 FPS (or below) performance mode, and neither satisfies the target group.
(several DF videos also show this)
And the consoles are far from "competitive" as you claim.
We are talking 2070-ish performance... in 2024.
2025 will look even worse performance-wise when the 50 series from NVIDIA widens the image quality gap even more.
~40% of gamers on Steam already have better hardware/image quality, so how do you think 2025 will end?
In 2028, when the next console comes out, this will be even worse.

Add to that the fact that we are 4+ years into the current console generation and the PS5 still hasn't surpassed the PS4's sales numbers.

Fool yourself all you want, the numbers tell a different story.
 
Thread temporarily locked to give people time to re-evaluate. This is not an AMD sucks thread. It's a technical discussion thread following DF's content. People are starting to get post-Christmas emotional and it's not even on-topic emotions.
 
Since the DF thread is locked temporarily, I wanted to talk about their latest article:


I got SW:Outlaws for xmas and I have to move my own goalposts as to which game ticks the most rendering checkboxes. This game seems to (like CP2077) include RTXGI and RTXDI, which is a killer combination. Literally everything has occlusion on it and everything seems well grounded. I also like the area light rendering, even though we've only been able to analytically evaluate rectangles and circles for light shapes.
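A tangent on the area-light point, since it comes up a lot in these comparisons: the reason rectangles (and discs/spheres) are the shapes we can handle analytically is that the irradiance a diffuse surface point receives from a uniform polygonal light has a closed form. Below is a rough sketch of the classic Lambert/Arvo edge-sum formula; this is just my own illustration of the math, not anything pulled from Outlaws or the article.

Code:
import numpy as np

def polygon_irradiance(point, normal, vertices, radiance):
    # Closed-form irradiance at a diffuse surface point from a convex
    # polygonal light of uniform radiance L (Lambert/Arvo edge sum):
    #   E = (L / 2) * sum_i acos(u_i . u_{i+1}) * (n . normalize(u_i x u_{i+1}))
    # where u_i are unit vectors from the point to the light's vertices.
    n = normal / np.linalg.norm(normal)
    u = [(v - point) / np.linalg.norm(v - point) for v in vertices]
    total = 0.0
    for i in range(len(u)):
        a, b = u[i], u[(i + 1) % len(u)]
        gamma = np.cross(a, b)
        gamma /= np.linalg.norm(gamma)
        theta = np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))  # angle the edge subtends
        total += theta * np.dot(n, gamma)
    # Vertex winding sets the sign; clamp so a light behind the surface gives 0.
    return max(0.5 * radiance * total, 0.0)

# Example: a 2x2 rectangular light 1 unit above a point whose normal faces it.
rect = [np.array(v, dtype=float) for v in [(-1, -1, 1), (1, -1, 1), (1, 1, 1), (-1, 1, 1)]]
print(polygon_irradiance(np.zeros(3), np.array([0.0, 0.0, 1.0]), rect, radiance=1.0))  # ~1.74

Games typically wrap this kind of thing in LTCs or similar so the glossy term works too, but the diffuse part really is this small, which is why rectangular area lights stay cheap while arbitrary shapes end up stochastic.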

Hellblade 2 is definitely a looker in the character rendering (i.e. MetaHumans) as well as the excellent terrain detail (i.e. Nanite). I wish that game was more than just a walking sim and had better lighting (i.e. path-traced).

Indy is the king of best-looking RTXGI for me. The bounced diffuse is visible everywhere and isn't subtle. I'm constantly stopping and looking around at all the bounced light affecting nearby objects.
 
I saw this too. PRO is putting up some fairly good results. I'm hoping that Sony will revise the PSSR tech more before the generation is over. Image quality is a bigger priority than performance for me. I know some like 60FPS but I hate sacrificing the image quality and rendering quality for it. I'm OK with just 30FPS as long as it's constant, at least for 3rd person shooters. 1st person is another story.
 
I saw this too. PRO is putting up some fairly good results. I'm hoping that Sony will revise the PSSR tech more before the generation is over. Image quality is a bigger priority than performance for me. I know some like 60FPS but I hate sacrificing the image quality and rendering quality for it. I'm OK with just 30FPS as long as it's constant, at least for 3rd person shooters. 1st person is another story.
You can always use Lossless Scaling or something similar to get 60fps from a locked 30fps base. That's what I do when playing Indiana Jones at certain settings. I can play the game at a locked 60fps at 4K (downsampling, of course), but since I'd rather have a locked "60 fps" experience even if it runs at 30fps internally, I don't mind using high and above settings.
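For anyone unfamiliar with what these tools are doing conceptually: they synthesize an in-between frame for every rendered pair. The sketch below is a deliberately naive blend-based version just to show the 30fps-to-"60fps" idea; Lossless Scaling and DLSS/FSR frame generation use motion estimation rather than plain blending, so don't read this as how the tool actually works.

Code:
import numpy as np

def synth_inbetween(prev_frame, next_frame, t=0.5):
    # Naive "generated" frame: a plain blend of the two rendered frames.
    # Real frame generation estimates motion so moving objects don't ghost;
    # this only illustrates inserting a synthetic frame between real ones.
    a = prev_frame.astype(np.float32)
    b = next_frame.astype(np.float32)
    return ((1.0 - t) * a + t * b).astype(prev_frame.dtype)

def double_framerate(frames):
    # Insert one synthesized frame between every rendered pair: 30fps in, ~60fps out.
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        out.append(synth_inbetween(cur, nxt))
    out.append(frames[-1])
    return out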

On my 1440p 165Hz monitor I prefer to use lower settings to achieve higher framerates.
 
Still no performance mode without frame gen on Pro? How hard is it to make changes like that? Modders on consoles unlock frame rates and change resolutions, and it's not that complicated.

Makes me think: is enabling the 120Hz mode on PS5 for supported displays something that requires much work? So few games support it, but it has benefits even for 60 or 30 fps games.
 
You can always use Lossless Scaling or something similar to get 60fps from a locked 30fps base. That's what I do when playing Indiana Jones at certain settings. I can play the game at a locked 60fps at 4K (downsampling, of course), but since I'd rather have a locked "60 fps" experience even if it runs at 30fps internally, I don't mind using high and above settings.
I do not think you can use Lossless Scaling on the PS5?
 