Digital Foundry Article Technical Discussion [2021]

Shame FSR doesn't seem to offer anything worthwhile over the in-game checkerboard solution, but it's great to see the in-game stuttering resolved. As Richard says, though, it's not really acceptable for a PC game like this to ship with a significantly worse TAA solution than the one found on the consoles, and it could really do with an FoV slider.
 
That's something you can already try in Quake RTX, for example, since primary visibility is done with RT there.
I tried cylindrical projection on a curved 32:9 monitor and it looked great at 27 FOV, lol (for an orthographic projection without distortion on the cylindrical surface).
Ha, cool :) That's exactly what I planned to do first once I have an RT GPU... :D

But I tried it with my toy path tracer. Mixing horizontal and vertical cylindrical projection, and trying to push the distortion toward the edges, gave me this:
[attached screenshot of the test render]
The scene has the same proportions as the Cornell Box, and the camera sits at the missing front wall. FOV is 180 degrees both horizontally and vertically, so we could see enemies beside us, and even our own feet, to do proper jump'n'run in first person :)
It needs some more work, like blending with a standard planar projection in the center area. But if I rotate the camera, it doesn't feel that weird as is.
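Roughly, the ray generation looks something like this (a simplified sketch in the spirit of what I described, not the exact code behind the screenshot; the equirectangular-style mapping and names are just for illustration, and the edge-distortion tweak is left out):
Code:
#include <cmath>

struct Vec3 { float x, y, z; };

constexpr float kPi = 3.14159265f;

// u, v are normalized screen coordinates in [-1, 1].
// 180 degree FOV on both axes: wrap the image around the camera
// instead of projecting onto a flat plane.
Vec3 PanoramicRayDir(float u, float v)
{
    float yaw   = u * (kPi * 0.5f);   // +/- 90 degrees horizontally
    float pitch = v * (kPi * 0.5f);   // +/- 90 degrees vertically

    Vec3 d;
    d.x = std::sin(yaw) * std::cos(pitch);
    d.y = std::sin(pitch);
    d.z = std::cos(yaw) * std::cos(pitch);
    return d;  // unit length by construction
}
A standard planar pinhole camera would instead shoot through (u * tan(fovX/2), v * tan(fovY/2), 1) normalized, which is why blending the two mappings in the center area is the obvious next step.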

Players would probably rather puke than be excited, but making some compromise (like less vertical FOV) would indeed make sense and maybe enable new experiences.

There are also some actual Quake ports supporting these kinds of projections, but I have not tried them yet.

Edit:
Hehe, it looks awful. But the question is: how does it feel to play?
 
You know, I want some RDNA2 hardware to estimate console performance.

I don't even know if you can go that low in the performance bracket, in the dGPU space at least. Yeah, an RX 6700 perhaps, but that has its own non-shared bandwidth, no power constraints etc. Just to estimate console performance, quite the expensive thing if it isn't for other purposes? Let DF figure that out, lol.

And I'm a red guy anyway.

Ah ok, well then aim for that 6800 XT or even a 6900 XT... perhaps wait for RDNA3, that's what I would do (and will do, but for NV's next, I think). I don't think it's worth it at all right now; the next GPUs are rumored to pack extreme performance and the prices now are crazy anyway.
Red or green doesn't matter btw, both are awesome in the PC space.

Even a Steam Deck would do for me, I'm considering it...

I have been thinking about it, but I'm going for a new laptop instead. If I needed the extra mobility then yes, but as things are now a laptop is mobile enough for me. The Steam Deck is very interesting though; we need something like that in the PC space.
 
Just to estimate console performance, quite the expensive thing
Oh yes. But that's not my problem, it's ours.
I really lost belief in high end. To keep gaming economical, APUs like the consoles use would be the best choice for the mainstream. I don't see a need for high end anymore. Stuff is fast enough, but too expensive.
Personally (admittedly), I would take the 6800 XT, yes. Because I'm a tech nerd too. And maybe we can get this after RDNA3 is out. Maybe extending chip lifetime to two generations would keep the ball rolling: older gen on an older process for the masses, expensive high end for rich enthusiasts.
That's the only option I see. Otherwise: either come up with a form factor right in the middle between mobile and PC (Series S really nails it) or bury PC gaming.
next GPUs are rumored to pack extreme performance
As always. And tbh, I really don't know what to do with those teraflops. For realtime path tracing it's still too slow. We could do volumetric stuff, including fluid simulations. Pretty cool, but not worth it, not affordable.
It's crazy. In the past, performance was always too low. Then, around the time multi-core CPUs came up, things changed. It was pretty fine for most things. But now? There is indeed too much of it. That's my experience.
Ok, I'm exaggerating a bit, but not much. Honestly, I would not want to upgrade if not for development needs. RTX? Threadrippers? What's next? IMO, the most promising and future-proof thing here is the Apple M1. Gaming for everyone, that's what I want. Not supercomputers.
(Again I feel like a traitor saying that in a forum like this :D )
I have been thinking about it, but I'm going for a new laptop instead.
Ha - you're right. I already thought about plugging K+M into the Steam boy so I can code in bed :) A laptop would be better, though it may take some time until a laptop APU with its low bandwidth can compete with the boy. We will see how Rembrandt does...
 
I really lost belief in high end.

Seems like many didn't :p GPUs sell like hot cakes no matter the price, even high-end ones. PC hardware, especially higher end, has historically always been expensive.

APUs like the consoles use would be the best choice for the mainstream.

Or APUs in laptops/gaming PCs, or APUs in something like the Steam Deck, 3060/6700-class hardware etc. For those that want low end, at least, it's there.

bury PC gaming.

It's going to exist at least as long as console gaming, probably even longer, and probably always in much larger numbers, too.

As always. And tbh, I really don't know what to do with those teraflops. For realtime path tracing it's still too slow. We could do volumetric stuff, including fluid simulations. Pretty cool, but not worth it, not affordable.
It's crazy. In the past, performance was always too low. Then, around the time multi-core CPUs came up, things changed. It was pretty fine for most things. But now? There is indeed too much of it. That's my experience.
Ok, I'm exaggerating a bit, but not much. Honestly, I would not want to upgrade if not for development needs. RTX? Threadrippers? What's next? IMO, the most promising and future-proof thing here is the Apple M1. Gaming for everyone, that's what I want. Not supercomputers.
(Again I feel like a traitor saying that in a forum like this :D )

Enthusiast hardware like that has always existed... I remember the days of Skulltrail (correct name?) and crazy things like quad SLI and dual GPU cards; a 3090 or 6900 XT isn't all that crazy, I think. A 16-core TR has its uses too, not that much for gaming atm, but we're gradually going for more and more (faster/more efficient) cores, it seems.
I guess something like a 6700 XT/3060 Ti and up class of system would suffice for most this generation.

Apple M1, dunno what to think about their chips; it's a departure from x86, but there's probably good reason Sony/MS didn't stuff anything like that in their boxes, which will run another 7 or more years. We will have to see where things lead from here. The M1 at least doesn't catch any interest for me as a gamer, since it's quite a poor performer in modern games due to its weak GPU. Its CPU does very well in benchmarks though.

Almost more interested in what Samsung and RDNA2 will do for mobile :p
 
Seems like many didn't :p GPUs sell like hot cakes no matter the price, even high-end ones. PC hardware, especially higher end, has historically always been expensive.
Good point. But over the years gaming has grown like crazy, making expensive high end much less practical than in the old days.

The problem with low spec / high end on one end, and increasing dev costs on the other, is pressure on developers. How do you deliver a worthy upgrade to the high-end niche?
The solution is to deliver barely any upgrade, which is what's happening. And people don't complain, because they don't know what they're missing, and they still buy high end.
But that won't work for long. Some devs will push high end, e.g. A4. This raises the standard, and we get the situation where scaling becomes an increasing problem, raising costs even further.
It will always be just a bad compromise, from the perspective of both devs and gamers. There's already frustration about the lack of innovation in games, and adding envy of high end will do more harm than good.

It's going to exist at least as long as console gaming, probably even longer, and probably always in much larger numbers, too.
Maybe, but piracy reduces those large numbers a lot. So it's not attractive from the devs' perspective.
Huge power-hungry boxes under the desk feel bloated and clumsy, not modern, from the gamers' perspective.
Come up with a new small box with enough power, an open OS so people can decide for themselves which software to run on it, maybe always-on to have working piracy protection, and with a bit of luck you'll wipe all those current platforms from the planet.
Not really a bad perspective, I think. Sooner or later any platform ends up in a graveyard. Nothing wrong with that. We need progress more than backwards compatibility - that's what emulation is there for.
Currently, progress stands for increasing performance. But games still look gamey. So maybe reducing power instead makes more sense at some point. I think we are at this point now. The crisis smashes this right into our faces.

crazy things like quad SLI and dual GPU cards
A good example of something underperforming, lacking dev support, and niche. Thus it died out.
The 3090 is crazy expensive. A 3060/70 would be ok, but is still crazy expensive in comparison to consoles. And it's too expensive to be a proper entry level.

Apple M1, dunno what to think about their chips; it's a departure from x86, but there's probably good reason Sony/MS didn't stuff anything like that in their boxes, which will run another 7 or more years.
The difference between the M1 and other SoCs / APUs is that it also includes main RAM in the package. So the whole computer is in one small package. It can compete with x86 performance at much lower power. It's still too expensive, but it's a reasonable future and will replace our big towers.
Notice x86 is an instruction set from the seventies. Internally the instructions are translated into another instruction set that the CPU actually executes. So we have a hardware emulation layer just to keep alive a standard which dates back to when rock music was at its greatest.
Then we have other standards preventing high-bandwidth memory on our mainboards, so the iGPU is capped by that.
All those ancient standards turn our platform into an outdated one, which has a hard time evolving. Huge industry committees make decisions slowly over many years. It remains outdated all the time. Somebody will take that Goliath down sooner or later...

Almost more interested in what Samsung and RDNA2 will do for mobile :p
... like those guys for example. ;)
 
Good point. But over the years gaming has grown like crazy, making expensive high end much less practical than in the old days.

The problem with low spec / high end on one end, and increasing dev costs on the other, is pressure on developers. How do you deliver a worthy upgrade to the high-end niche?
The solution is to deliver barely any upgrade, which is what's happening. And people don't complain, because they don't know what they're missing, and they still buy high end.
But that won't work for long. Some devs will push high end, e.g. A4. This raises the standard, and we get the situation where scaling becomes an increasing problem, raising costs even further.
It will always be just a bad compromise, from the perspective of both devs and gamers. There's already frustration about the lack of innovation in games, and adding envy of high end will do more harm than good.

Dunno about that, the high(est)-end gaming hardware has always been very expensive. Low spec / high end and up has always been there, too. Scaling has gotten much, much better though, and with consoles now also offering different specs (XSS to XSX and probably more later), I think it's a non-issue...
Barely any upgrade? I think we have had one of the largest leaps in hardware so far going from Pascal to Turing to Ampere. The next GPUs are rumored to offer over 6 times the consoles' abilities, or almost three times that of AMD's fastest GPU.
Then we have ray tracing and DLSS-like technologies that have lots of room to improve.

Indeed, I don't see PC gamers complaining, and they shouldn't have to either, I think (I don't). We'll know what we're missing once we go console and have to live with lower resolutions, settings and ray tracing, and trade frame rates for fidelity.
Again, scaling was already doing its thing very well in the PS4/XOne generation; Eternal is a good example of that. Someone with a 6900 XT does get the better experience vs someone with a mere 6700, for example.

Maybe, but piracy reduces those large numbers a lot. So it's not attractive from the devs' perspective.
Huge power-hungry boxes under the desk feel bloated and clumsy, not modern, from the gamers' perspective.
Come up with a new small box with enough power, an open OS so people can decide for themselves which software to run on it, maybe always-on to have working piracy protection, and with a bit of luck you'll wipe all those current platforms from the planet.
Not really a bad perspective, I think. Sooner or later any platform ends up in a graveyard. Nothing wrong with that. We need progress more than backwards compatibility - that's what emulation is there for.
Currently, progress stands for increasing performance. But games still look gamey. So maybe reducing power instead makes more sense at some point. I think we are at this point now. The crisis smashes this right into our faces.

Lol, it's been a long time since I've seen piracy mentioned in a PC vs console discussion :p Everything contradicts your claim that the PC isn't attractive to devs. It's doing better than ever, and piracy has become much less of an issue these days than, say, 10 or 15 years ago.
The comment on 'huge power-hungry boxes' seems a strange one too, I think. I mean, yeah they are, but looking at the PS5... consoles have grown a lot looking back, and they're drawing more power, or have the ability to, than before.
PCs, on the other hand, have shrunk quite a bit compared to the large grey towers from 20+ years ago. Also, we have laptops these days... they're more popular than before; probably more gamers are on gaming laptops than on stationary systems. Then we have small stationary builds too, aside from upcoming things like the Steam Deck. Everything is covered for everyone's needs.

A good example of something underperforming, lacking dev support, and niche. Thus it died out.
The 3090 is crazy expensive. A 3060/70 would be ok, but is still crazy expensive in comparison to consoles. And it's too expensive to be a proper entry level.

A 3060 would already suffice to match and exceed console performance; that it's expensive is due to the craze going on (whether that's due to mining, covid or other factors, no idea), but those problems arise on the PS5 as well, it's a rarity to be able to get one (here at least). Anyway, I'd rather have that 3060 Ti/3070 over a console; I'd pay more but get more as well.

The difference between the M1 and other SoCs / APUs is that it also includes main RAM in the package. So the whole computer is in one small package. It can compete with x86 performance at much lower power. It's still too expensive, but it's a reasonable future and will replace our big towers.
Notice x86 is an instruction set from the seventies. Internally the instructions are translated into another instruction set that the CPU actually executes. So we have a hardware emulation layer just to keep alive a standard which dates back to when rock music was at its greatest.
Then we have other standards preventing high-bandwidth memory on our mainboards, so the iGPU is capped by that.
All those ancient standards turn our platform into an outdated one, which has a hard time evolving. Huge industry committees make decisions slowly over many years. It remains outdated all the time. Somebody will take that Goliath down sooner or later...

Well, both Sony and MS have gone with x86, so you will have to live with that for the next 7 years at the least.
 
with consoles now also offering different specs (XSS to XSX and probably more later), I think it's a non-issue...
It's a must to solve scaling now, but that does not make it a non-issue. It adds costs at all ends, from programming up to content generation.
The really big problem is that dev costs are so high that cross-platform is essential. Otherwise we could just make different games for different specs. One would think gaming has grown so large this would just work, but because costs are so high, sadly that's not the case in a AAA context.
Also, we can't do magic with scaling. There's always a limit. If we want to support Steam Deck or even Switch, we either do a completely downgraded port later, or we make compromises on high end. Gfx scales fairly easily maybe, but other things don't and never will.
Conclusion: high end only becomes properly utilized years later, after min specs go up. It's thus more economical to invest in mid range and entry level, no matter whether you are an average gamer or a dev.
The largest RDNA3 for 2500 bucks, as speculated? Pointless. And no, we never had HW this expensive before. It's crazy. Meaningful for content creation and enterprise, but pointless for gamers and developers.
Everything contradicts your claim that the PC isn't attractive to devs. It's doing better than ever, and piracy has become much less of an issue these days than, say, 10 or 15 years ago.
It is attractive, agreed. But mostly because cross-platform development is cheap. The question is: can we still count on further growth with HW supply simply being unable to match demand? No. Solution: lower specs / a slower increase of specs. It's simple economics, no matter what our personal enthusiasm is.
With smaller boxes we also make gaming more accessible, to keep it growing. Grandma already has internet, so let's sell her some Quake too :) That's not my personal desire, but it's what the industry wants in order to justify its upward spiral of ever-increasing costs.
PCs, on the other hand, have shrunk quite a bit compared to the large grey towers from 20+ years ago.
Yep. Monster GPUs with chips becoming larger instead of smaller just don't fit into this picture. (Agreed, the PS5 is too big too, and ugly.)
A 3060 would already suffice to match and exceed console performance; that it's expensive is due to the craze going on (whether that's due to mining, covid or other factors, no idea), but those problems arise on the PS5 as well, it's a rarity to be able to get one (here at least).
A 3060 plus the rest you need to have a PC costs at least twice as much as a console with similar specs. So I want this form factor on PC too. Then the PC is a nice platform again, and it beats consoles by being open.
And I fully agree consoles have similar issues. Too big, specs higher than needed. I don't believe in a PS7, and maybe not a PS6 either. But we want some box, no? We agree streaming sucks? So we need to work on such a box before it's too late. Lower power, but still better gfx than current stuff. Totally possible.
Barely any upgrade? I think we have had one of the largest leaps in hardware so far going from Pascal to Turing to Ampere.
The visual upgrade requires exponential growth in power. Point of diminishing returns. The new RT feature obscures this a bit, but the visual win becomes smaller and smaller, even as power doubles and doubles.
We need to find a sweet spot for HW power, and we need to give ourselves more time to achieve improvements than we did in the 90s. Rushing forward and depending on increased HW power and costs is like shooting ourselves in the leg.
It's like climate change. To solve it, we need to tone down and work on efficiency in the long run. Not great, not popular, but sadly the reality.
Well, both Sony and MS have gone with x86, so you will have to live with that for the next 7 years at the least.
No problem. Aside from some SIMD optimizations, we don't deal with instruction sets at all anymore. If NV gives us this small box with a nice ARM SoC and a powerful GPU, and AMD does the same with x86, the extra work to support both is almost nothing.
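To illustrate that 'only the SIMD bits care' point, a minimal sketch of how small the ISA-specific surface can be, assuming GCC/Clang-style feature macros; Add4 is just a made-up example helper, not code from any real engine:
Code:
// Hypothetical example: the only ISA-specific code is this 4-wide float add.
// Everything else calls Add4() and never knows whether it runs on x86 or ARM.
#if defined(__ARM_NEON)
  #include <arm_neon.h>
  static inline void Add4(const float* a, const float* b, float* out) {
      vst1q_f32(out, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
  }
#elif defined(__SSE__)
  #include <xmmintrin.h>
  static inline void Add4(const float* a, const float* b, float* out) {
      _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
  }
#else
  static inline void Add4(const float* a, const float* b, float* out) {
      for (int i = 0; i < 4; ++i) out[i] = a[i] + b[i];  // portable fallback
  }
#endif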
 
It's a must to solve scaling now, but that does not make it a non-issue. It adds costs at all ends, from programming up to content generation.
The really big problem is that dev costs are so high that cross-platform is essential. Otherwise we could just make different games for different specs. One would think gaming has grown so large this would just work, but because costs are so high, sadly that's not the case in a AAA context.
Also, we can't do magic with scaling. There's always a limit. If we want to support Steam Deck or even Switch, we either do a completely downgraded port later, or we make compromises on high end. Gfx scales fairly easily maybe, but other things don't and never will.
Conclusion: high end only becomes properly utilized years later, after min specs go up. It's thus more economical to invest in mid range and entry level, no matter whether you are an average gamer or a dev.
The largest RDNA3 for 2500 bucks, as speculated? Pointless. And no, we never had HW this expensive before. It's crazy. Meaningful for content creation and enterprise, but pointless for gamers and developers.

Scaling has always been important, as well as baselines (mid-range hardware etc). While games aren't made purely for the highest-end hardware (that's always been a rarity), it depends on how you see it; someone with the highest-end hardware (6900 XT etc) does have that advantage in modern AA/AAA games.
What you're describing has largely always been the case, now with the advantage of modern-day scaling and consoles offering multiple SKUs in different spec ranges.
Just because there's a 100 TF RDNA3 doesn't mean everyone wants or needs one; a 6800-class GPU (or even lower) will play games too... It's just that GPUs like a 3090 will do all that at higher settings, with more ray tracing, resolution/framerates etc.

Prices and availability are crazy now, according to some due to mining, fabrication, covid etc etc.

It is attractive, agreed. But mostly because cross-platform development is cheap. The question is: can we still count on further growth with HW supply simply being unable to match demand? No. Solution: lower specs / a slower increase of specs. It's simple economics, no matter what our personal enthusiasm is.
With smaller boxes we also make gaming more accessible, to keep it growing. Grandma already has internet, so let's sell her some Quake too :) That's not my personal desire, but it's what the industry wants in order to justify its upward spiral of ever-increasing costs.

Yeah, well then Sony and MS have had the wrong insight and Apple/Steam Deck are doing the right thing... :p

A 3060 plus the rest you need to have a PC costs at least twice as much as a console with similar specs. So I want this form factor on PC too. Then the PC is a nice platform again, and it beats consoles by being open.
And I fully agree consoles have similar issues. Too big, specs higher than needed. I don't believe in a PS7, and maybe not a PS6 either. But we want some box, no? We agree streaming sucks? So we need to work on such a box before it's too late. Lower power, but still better gfx than current stuff. Totally possible.

As far as I can remember, I've always paid more than twice the money (around 1200 dollars) over a console for a PC that has roughly the same specs, perhaps somewhat better. Now prices are crazy, but that goes for the consoles too (scalpers, miners, whatever). Furthermore, I think it's entirely possible to have a system of roughly PS5 size if you want to. A 3060 Ti could be a very nice contender if prices weren't crazed up.
I certainly don't think the specs are higher than needed, but that's just me. The generation hasn't even really started and we're already back to choosing between 30 and 60 fps modes, aside from ray tracing being kinda limited and 4K still not really being there aside from upscaling.

But I get where you're coming from though; gaming has gotten more expensive (both on PC and console), hardware didn't really get cheaper, and consoles have seen price increases as well (damn, it's been a long time since we've had 500/600 dollar boxes and 80/90 dollar games, holy crap). It's the PS3 era all over again: large, expensive and quite the power draw.

The visual upgrade requires exponential growth in power. Point of diminishing returns. The new RT feature obscures this a bit, but the visual win becomes smaller and smaller, even as power doubles and doubles.
We need to find a sweet spot for HW power, and we need to give ourselves more time to achieve improvements than we did in the 90s. Rushing forward and depending on increased HW power and costs is like shooting ourselves in the leg.
It's like climate change. To solve it, we need to tone down and work on efficiency in the long run. Not great, not popular, but sadly the reality.

Yes, that I agree on; the leaps aren't there anymore, especially this generation, PS4 to PS5, where the gap has been the smallest ever in terms of 'true generational leaps in graphics'. So we have to come up with new things like faster loading, ray tracing, 3D audio, haptics and content (how we interact), probably VR as well.
But yes, again I get where you're coming from; the games really taking advantage of modern hardware are fewer and fewer, maybe what, 2 games a year perhaps that truly go for it? (Right now Rift Apart and CP2077 come to mind.) Creating games is more and more time-consuming as well as costlier.
I think what we need is faster development: making it easier and less costly to create those stunning AAA games like Rift Apart. Right now we're probably looking at years of development time for such a title.

No problem. Aside from some SIMD optimizations, we don't deal with instruction sets at all anymore. If NV gives us this small box with a nice ARM SoC and a powerful GPU, and AMD does the same with x86, the extra work to support both is almost nothing.

Yeah, I agree, it'd be interesting to see a departure from x86 CPUs, what it would bring etc. As long as it isn't coming from Apple, perhaps. AMD/Intel could come up with some ARM solutions, maybe even AMD/NV. I personally don't care; if an ARM chip is as capable as my current CPU in every possible way, and prices etc are right, it doesn't bother me what kind of architecture is driving it all :p
 

Loving the VRR mode, though I can definitely see those frametime hitches. Amazing that a 30 Hz standard mode can actually be >100 Hz in parts. Hope to see more of this over the course of the generation.
 
Prices and availability are crazy now, according to some due to mining, fabrication, covid etc etc.
Why are you so optimistic this will change? After those issues comes inflation. People won't be able to afford stuff easily even at MSRP, which goes up with smaller processes, not down.
Yeah, well then Sony and MS have had the wrong insight and Apple/Steam Deck are doing the right thing... :p
No. IMO the Steam Deck is not powerful enough with 1 TF, and the M1 is no gaming chip. The Series S is the right thing to me. I said so before already.
I certainly don't think the specs are higher than needed, but that's just me.
Don't think it's easy for me to give up on demanding moooaaar. There are just those reasons we can't ignore.
I think what we need is faster development: making it easier and less costly to create those stunning AAA games
Exactly. I see some options here:
Much better procedural content generation.
Replace motion capture with robotics simulation.
Replace fakery and tuning with 'do it just right'. (e.g. realtime GI as we saw)
But all of this is much harder than Pac-Man, Quake, and Fortnite. It's not just games anymore, but science. Guys capable of doing this work for Boston Dynamics for better money, not for a doomed games industry. So I don't expect quick progress. I rather expect a downfall and a reset after that.
I'm working on such things myself, but sadly, being no rocket scientist, I'll surely see many downfalls, platforms coming and going... before I finish any of this crap :D
 
Loving the VRR mode, though I can definitely see those frametime hitches. Amazing that a 30 Hz standard mode can actually be >100 Hz in parts. Hope to see more of this over the course of the generation.
That >100 Hz comes from frame doubling/tripling by the TV ;)
They mention that in the video.
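The doubling/tripling is basically low framerate compensation: the panel repeats each game frame so its refresh rate stays high even when few unique frames arrive. A rough sketch of the idea; the 120 Hz cap and the 'repeat as often as fits' strategy are assumptions for illustration, real TVs may behave differently:
Code:
// Illustrative only: repeat each game frame as many times as fits under the
// panel's maximum refresh, so the refresh stays high with few unique frames.
int PanelRefreshHz(double gameFps, double panelMaxHz = 120.0)
{
    int repeats = (int)(panelMaxHz / gameFps);   // frame doubling, tripling, ...
    if (repeats < 1) repeats = 1;                // at or above max: no repetition
    return (int)(gameFps * repeats + 0.5);
}
// PanelRefreshHz(30.0) == 120, PanelRefreshHz(35.0) == 105: ">100 Hz" on the
// TV's readout even though the game renders only 30-35 unique frames per second.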

Those hitches are more loading-related than GPU-related. The whole game stops for a moment and then it works again.
 
Why are you so optimistic this will change? After those issues comes inflation. People won't be able to afford stuff easily even at MSRP, which goes up with smaller processes, not down.

Well, otherwise no one will be able to get hold of a PS5, GPU or anything else. The rarity of them in stores/stock makes these things so expensive.

No. IMO the Steam Deck is not powerful enough with 1 TF, and the M1 is no gaming chip. The Series S is the right thing to me. I said so before already.

The Steam Deck is aiming for a much lower resolution though :p I think the Steam Deck might have a future for those that want to trade performance for mobility (I don't).

Exactly. I see some options here:
Much better procedural content generation.
Replace motion capture with robotics simulation.
Replace fakery and tuning with 'do it just right'. (e.g. realtime GI as we saw)
But all of this is much harder than Pac-Man, Quake, and Fortnite. It's not just games anymore, but science. Guys capable of doing this work for Boston Dynamics for better money, not for a doomed games industry. So I don't expect quick progress. I rather expect a downfall and a reset after that.
I'm working on such things myself, but sadly, being no rocket scientist, I'll surely see many downfalls, platforms coming and going... before I finish any of this crap :D

Yes, I really hope development costs/time will come down, since the amount of AAAs on PlayStation is getting quite dire if we're comparing to my favourite PS (the PS2), which had a huge amount of AAA games.
 
The Steam Deck is aiming for a much lower resolution though :p I think the Steam Deck might have a future for those that want to trade performance for mobility (I don't).
Yeah, Steam Deck has PS4 levels of GPU power and XBOne levels of resolution targets, but with the most limiting factor from that generation mitigated (the CPU). At least on paper. I can't wait for it to get tested so we can see where it really lands.
 