Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
Compare that to a company like Sega, which has only really started to view PC as a key gaming cornerstone in the past few years for its internal non-western development teams. Prior to that, it was mostly token PC ports from its internal Japanese teams.
Sega had timely ports of many of its tentpole games throughout the years, though. Comix Zone only had a few months between its Genesis release and its PC port. Sonic CD has had three separate ports to PC. And many Saturn games got PC ports soon after their Saturn releases: Panzer Dragoon, Daytona CCE, House of the Dead, Virtua Fighter Remix... I'm sure I've forgotten some, but the trend continued through the Dreamcast and beyond. You can call these token ports if you want, but no other platform holder at the time was releasing quality PC ports like Sega was.
 
I can attest to that. I remember playing Sonic 3 -iirc- and Comix Zone back in the day. There was also a game I liked very much, Sega Worldwide Soccer, although my computer had some trouble running it. I also remember Virtua Fighter releasing on PC, and Daytona, and maybe Battle Arena Toshinden -though I'm not really sure that was a Sega game, it certainly looked like one visually. Most of their modern arcade machines are in fact PCs. As of recently, they added XeSS support to two of their games.
 

They, and Alex in particular, seem to really be underestimating the AMD GPUs: they compare the 7900 XTX to the 4090 when AMD have said it's a 4080-esque product not aimed at competing with Nvidia's halo card, and they zero in specifically on RT performance. It's not as if AMD doesn't care about RT; it has a great uplift.
 

They say that if the 4080 12GB still existed, it would get its ass kicked by the 7900 XTX in traditional rasterization and might have tied in ray tracing.

They also ask: if you can afford $1,200 for the 4080 16GB, can't you afford $1,600 for the 4090?


They do bring up a good point: they think the 7900 XT is more cut down relative to the XTX than the 6800 XT was compared to the 6900 XT, so the pricing does seem a bit skewed. I think AMD would have a much more compelling product line if the 7900 XT were actually $800. If they argue the $400 between the 4080 16GB and the 4090 may be worthwhile, then surely the $100 between the two 7900s is too.

Then they bring up the pricing of whatever the 4080 12GB becomes, which I have brought up before. I think the 4080 12GB will be in a seriously tough spot performance-wise. It is greatly cut down, and I don't think it will work at its $900 price point. You may see a situation where you get better performance in both ray tracing and traditional rasterization from the AMD side at that $800-$1k price point.
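Putting rough numbers on that price/performance argument helps. A minimal sketch: the MSRPs below are the announced launch prices, but the "raster" and "rt" performance indices are hypothetical placeholders, not benchmark results.

```python
# Hypothetical price/performance comparison. MSRPs are launch prices;
# the performance indices are made-up illustrations, NOT benchmarks.
cards = {
    "RX 7900 XT":  {"price": 900,  "raster": 100, "rt": 70},
    "RX 7900 XTX": {"price": 1000, "raster": 118, "rt": 82},
    "RTX 4080":    {"price": 1200, "raster": 112, "rt": 100},
    "RTX 4090":    {"price": 1600, "raster": 145, "rt": 135},
}

for name, c in cards.items():
    raster_per_dollar = c["raster"] / c["price"]
    rt_per_dollar = c["rt"] / c["price"]
    print(f"{name:12s} raster/$: {raster_per_dollar:.3f}  rt/$: {rt_per_dollar:.3f}")
```

With numbers like these, the halo card wins outright but loses on performance per dollar, which is the crux of the "can you afford the next step up" question.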
 
It's not odd; he states he prefers consoles. He assumes more people are like him. He's not stating things as facts but as opinions.
 

The RX 7000 series are the fastest GPUs in the world at rasterization, the fastest ever. Will we ever need more rasterization speed than that? In fact, they have so much power to spare that the CUs used for rasterization can also be used for ray tracing; while not specialized, they can balance the load and do the job, getting a roughly 50% increase in RT performance over the previous generation. Those resources could well be used for RT-specialized work too. The true test will come when the 7600/7700 and the 4060/4070 are out, and we see how they compare. Those are the GPUs that aren't out of most people's league.

1440p 960fps, 4K 480fps. o_O When will diminishing returns start? 'Cos maybe traditional rasterization could disappear as we know it.
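For what it's worth, those two display targets describe nearly the same raw pixel budget spent two different ways. A quick sanity check, using the quoted figures (these are DisplayPort 2.1 display targets, not measured game performance):

```python
# Pixel throughput implied by each display target.
rate_1440 = 2560 * 1440 * 960   # 1440p at 960 fps, pixels per second
rate_4k = 3840 * 2160 * 480     # 4K at 480 fps, pixels per second

print(f"1440p@960: {rate_1440 / 1e9:.2f} Gpixels/s")
print(f"4K@480:    {rate_4k / 1e9:.2f} Gpixels/s")
# The two land within ~12% of each other: roughly the same raw
# pixel budget, traded between resolution and refresh rate.
```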

Tbh, I am more interested in things like FSR 3 from the 7000 series than in 1440p 960fps.

After watching or playing games using DLSS and XeSS, I can't help but feel that native 4K is a waste of resources, when you can get better IQ -almost animated-movie levels of AA, better framerates, and more defined IQ where it matters- by using those technologies. They are now viable.

In fact, nothing annoys me more nowadays than playing games at 4K 60fps -which I can do aplenty- when they don't support FSR 3, XeSS, DLSS or what have you. I wish nVidia, Intel and AMD made these technologies available via drivers, so you could force games to run with them enabled.
 
If I had my way, FSR and DLSS would be in all games automatically, and FSR would be automatic on console as well. Those technologies have costs, obviously, but trading a lower rendering resolution for a net gain in image quality and performance is just too good.
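A rough sketch of the render-cost saving being described, assuming the commonly published per-axis scale factors for the "Quality" (~2/3) and "Performance" (1/2) upscaling modes; treating shading cost as proportional to pixel count is a simplification, not an exact model.

```python
# Render-cost savings from upscaling to 4K, assuming typical per-axis
# scale factors. Shading cost ~ pixel count is a simplification.
TARGET_W, TARGET_H = 3840, 2160  # 4K output

modes = {"Native": 1.0, "Quality": 2 / 3, "Performance": 0.5}

for name, scale in modes.items():
    rw, rh = round(TARGET_W * scale), round(TARGET_H * scale)
    saving = 1 - (rw * rh) / (TARGET_W * TARGET_H)
    print(f"{name:12s} renders {rw}x{rh} -> ~{saving:.0%} fewer pixels shaded")
```

Quality mode renders 1440p internally and shades roughly half the pixels of native 4K, which is where the headroom for higher framerates (or RT) comes from.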

And yes, FSR3 if it is platform agnostic should be on everything as well including console. Being able to increase fps in such a way is just too good a thing.

AMD proved they could do a good-enough DLSS equivalent without AI and machine learning, so maybe they can do the same with frame generation? They don't seem to prioritize specialized hardware the way Nvidia does, but that approach seems good enough to stay competitive anyway.
 
This can't be stated yet.
 
FSR has an edge because it's open source and can be improved over time. The same could be said about XeSS, though, which has two modes: one for cards with AI processors and one without. nVidia, on the other hand, imho has the superior technology of the three, and it's getting better too.

Digital Foundry staff are very technical people, and I kinda knew that if things didn't go as expected, there would be tears.

The point is that FSR 3 is the most interesting FSR update because of how they are going to implement certain features. Without those new features, FSR could be buried and forgotten, leaving AMD a follower yet again.

DF staff would ask, rightly so -and that's why I understand their point- how will FSR get better unless AMD adds ML features to it? Depending on how the future pans out, if they aren't using specialized processors (which also affects RT performance), FSR 3 must have one heck of an implementation to compete.

@techuse right. What I mentioned is an estimate. If all the CUs can be used for rasterization -a boon and a bane- in games that don't feature RT, I struggle to see how nVidia can beat that level of performance.

nVidia is going to reign in ray tracing again, though, and DLSS is just too good -FSR 3's frame generation solution could be okay, but DLSS overall...-. They are at the very top, as usual. Price, power connectors that don't melt, size, plug-and-play compatibility, chiplets (smaller dies and less expensive components, probably better durability too because of the die sizes), new media and display engines, DisplayPort 2.1 support: those are the real challenges for them now.
 
AMD naturally estimated that RT would need to wait until more performant cards could come out, ones that support high framerates and 4K resolution with RT enabled. Their attitude was always "RT is just not there yet."
They do have a point, as the implementations in games so far aren't that noticeable. But the RT marketing works.

NVIDIA must have been familiar with this original roadmap, and DLSS came as a solution they thought would let them release hardware-accelerated RT sooner: with DLSS you save a lot of performance, so you can have RT without too much of a hit on framerate and perceived image quality.
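That trade-off is easy to see with some back-of-the-envelope frame-time numbers. Every figure below is an illustrative placeholder, not a measurement from any real game:

```python
# Frame-time arithmetic for the DLSS + RT trade-off. All numbers are
# hypothetical placeholders chosen to illustrate the argument.
native_raster_ms = 14.0   # hypothetical native-4K raster cost
rt_ms = 8.0               # hypothetical added ray-tracing cost
upscale_ms = 1.5          # hypothetical cost of the upscaling pass

# Rendering at ~half the pixels roughly halves the raster cost.
upscaled_raster_ms = native_raster_ms * 0.5

native_with_rt = native_raster_ms + rt_ms
dlss_with_rt = upscaled_raster_ms + rt_ms + upscale_ms

print(f"native + RT: {native_with_rt:.1f} ms (~{1000 / native_with_rt:.0f} fps)")
print(f"DLSS + RT:   {dlss_with_rt:.1f} ms (~{1000 / dlss_with_rt:.0f} fps)")
```

Under these made-up numbers, the RT cost that would drop a native frame well below 60fps fits back inside the budget once the raster work is done at a lower internal resolution.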

AMD is a follower most of the time, and I am curious what keeps them behind the curve. Maybe NVIDIA's huge resources from owning more than 80% of the PC GPU market give them the means and connections to keep improving faster. Industries involved with CG are always starving for smart solutions, fast; I guess client feedback plus those funds keep them more versatile and afloat.
 
It's because they're not pushing RT hardware that it's not there yet, they're not helping the problem.
Nvidia is learning to build the best possible accelerators and matching software.
AMD is learning to scale the most amount of silicon and bandwidth for as cheap as possible.

Somewhere down the line the two competitors will need to do what the other has been focusing on; when software and accelerator innovations plateau you need to scale for more power, and when it’s no longer efficient to keep throwing power at the problem, they need better accelerators.
 
And you can be sure that IP patents and lawyers will make sure that consumers do not readily benefit from both approaches in one design for a long, long time. :no:
 