Digital Foundry Article Technical Discussion [2021]

I wouldn't consider those totally true -- shipped games with heavily raymarched lighting were first seen in (major) console games; hardware RT is 'just' an optimization to make it more viable. DLSS, sure, but it's fundamentally just a kind of temporal upscaling, and we've seen that for a long time on consoles. Obviously PC hardware is far beyond console hardware, everybody who makes games works on a PC, graphics research is done on PCs, etc. I'm not saying there's no place to point out how much more powerful PC hardware is.

Just the generation talk is goofy imo. There's a reason everyone uses console generations to refer to games.

Nah, ray tracing made its debut in 2018 on the PC, and so did the deep learning stuff. Looking back at the generations -- pixel shaders etc. -- those debuted on PC too. DLSS uses ML/AI reconstruction to achieve its results; it's not the spatial upscaling the PlayStation uses.
Generational talk... well, in past generations there were games utilizing PC hardware quite heavily, like Doom 3, Far Cry, Crysis, HL2 etc., a new generation about every year if you want to spin it that way.
Nowadays this doesn't matter all that much anymore anyway, since scaling has become very, very good. That's why there's talk about rolling generations and the lack of true generational shifts as opposed to previous generations.

What the consoles usually did was lift the baseline with each new console generation. But again, those days of 'true generational leaps' like we've had in the past are clearly over, and I'm totally okay with that.
 
Raymarched lighting? What, like screen-space lighting? Or do you mean something else?

I'm thinking of Claybook, The Tomorrow Children (maybe that one's screen space?), SVOGI, etc. RT against easier acceleration structures like voxels and SDFs has been around for a while in realtime.

No doubt hardware RT is very transformative (other examples of RT didn't trace against triangles, were lower resolution, or used raymarching or cone tracing resulting in much lower precision, etc.), but it's still just an (important) generalization and improvement of existing realtime techniques.
 
Wouldn't call either of them the 'start of ray tracing in consumer games'; that would be the likes of BFV.

That's mostly semantics imo. If you're tracing rays (or some optimized version of rays -- cones) against geometry outside of screen space and using that to shade a pixel, it's the same general technique.

This isn't a hill I wanna die on though, just to clarify: I think RT hardware is a big deal, I just don't buy that people are developing whole new techniques thanks to PC hardware -- I think PC hardware is mostly following what engineers are doing in big-budget productions and providing ways to make it way better, remove limitations, etc. When pixel shaders came to consoles we started seeing big advancements; when useful compute came to consoles we started seeing big advancements; etc. That's why "generations" make sense.
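To make the "tracing against easier structures" point concrete, here's a minimal sphere-tracing (raymarching) sketch against a signed distance field -- the same family of technique as the SDF/voxel tracing mentioned above. All names and constants are illustrative, not taken from any shipped engine.

```python
def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to a sphere (negative = inside)."""
    dx = p[0] - center[0]
    dy = p[1] - center[1]
    dz = p[2] - center[2]
    return (dx * dx + dy * dy + dz * dz) ** 0.5 - radius

def raymarch(origin, direction, max_steps=64, eps=1e-4, max_dist=100.0):
    """March along the ray, stepping by the SDF value each iteration.
    Returns the hit distance along the ray, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = (origin[0] + direction[0] * t,
             origin[1] + direction[1] * t,
             origin[2] + direction[2] * t)
        d = sphere_sdf(p)
        if d < eps:
            return t          # close enough to the surface: call it a hit
        t += d                # safe step: nothing can be closer than d
        if t > max_dist:
            break
    return None

hit = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))   # ray straight at the sphere
miss = raymarch((0.0, 0.0, 0.0), (0.0, 1.0, 0.0))  # ray straight up, hits nothing
```

Sphere tracing is "safe" because the SDF value is a lower bound on the distance to the nearest surface, so stepping by `d` can never tunnel through geometry -- which is exactly why SDFs were a practical structure for realtime tracing before triangle-level RT hardware.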
 
As John mentioned, 40fps is not really problem territory when VRR makes it look and feel good; for this fps territory, locking the fps at 40 on a 120Hz output is the way. Btw, a DF video about it is coming soon.

I'd strongly disagree. What happens if the frame rate occasionally dips slightly below 40fps?

With vsync you get judder; with VRR and a 40fps frame rate cap, it's invisible.

If it never dips below 40 then your average is likely much higher. With VRR you might be able to lock at 46fps 95% of the time (15% more framerate than 40fps) with only occasional invisible dips down to 40fps.

I play AC Odyssey at 36fps locked. That's 20% more frame rate than 30fps and it's smooth as butter (actual uncapped fps is in the 40s somewhere). To get that to a locked 40fps with vsync I'd need to lower the graphics settings.
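A back-of-envelope sketch of why a steady 40fps stream judders under plain 60Hz vsync but is even under VRR (numbers are illustrative only, not measurements from any particular display):

```python
import math

REFRESH_HZ = 60
SCAN_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms per refresh cycle
RENDER_MS = 1000.0 / 40         # 25 ms per rendered frame (steady 40 fps)

def vsync_display_times(n_frames):
    """With vsync on a fixed 60 Hz panel, a frame can only be swapped on a
    refresh boundary, so a 25 ms frame alternates between being held for
    2 cycles and 1 cycle -- that uneven cadence is the judder. With VRR the
    panel refreshes when the frame is ready, so every frame would simply be
    shown for 25 ms."""
    times, last_flip = [], 0.0
    for i in range(1, n_frames + 1):
        ready = i * RENDER_MS                        # when the frame finishes
        flip = math.ceil(ready / SCAN_MS) * SCAN_MS  # next refresh boundary
        times.append(flip - last_flip)
        last_flip = flip
    return times

print([round(t, 1) for t in vsync_display_times(6)])
```

The alternating long/short hold times are what a frame-time graph shows as a sawtooth; a VRR cap at the same 40fps would hold every frame for the same duration instead.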
 
If it never dips below 40 then your average is likely much higher. With VRR you might be able to lock at 46fps 95% of the time (15% more framerate than 40fps) with only occasional invisible dips down to 40fps.

I play AC Odyssey at 36fps locked. That's 20% more frame rate than 30fps and it's smooth as butter (actual uncapped fps is in the 40s somewhere). To get that to a locked 40fps with vsync I'd need to lower the graphics settings.

Not sure what was originally being referred to here, but FWIW my pretty high-end TV's (C9) VRR range is only 40fps and up AFAIK -- a drop to 36 gets no VRR. Not sure how common that is on other displays.
 
smh....


Ok. Ray Tracing in games didn't start until consoles said so.... happy? :rolleyes:

Back when we were playing games using ray tracing and watching DF analyses of how interesting the tech was, we weren't even sure the consoles would get any RT support at all. Also, there's a big difference between a title or two doing what The Tomorrow Children did and the launch of quite a few games shipping RT, as in BFV etc. I mean, we could start arguing that ray tracing began back in the 80s lol.
 
Not sure what was originally being referred to here, but FWIW my pretty high-end TV's (C9) VRR range is only 40fps and up AFAIK -- a drop to 36 gets no VRR. Not sure how common that is on other displays.

Fair point. I'm really only referring to VRR using G-Sync or FreeSync with LFC. I'm sure VRR that starts at 40fps has its uses, but it's much more limited in its potential application, especially when using a more modest GPU like mine.
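For what it's worth, the LFC idea can be sketched like this: the driver re-presents each frame enough times to keep the effective refresh inside the panel's VRR window. This is only an illustration of the principle, not any vendor's actual algorithm, and the 48Hz lower bound is just a typical assumed value.

```python
def lfc_refresh(fps, vrr_min=48):
    """Repeat each frame until the effective refresh rate is back inside
    the panel's VRR window. Returns (effective_hz, repeats_per_frame).
    Illustrative sketch only -- real LFC logic lives in the GPU driver."""
    repeats = 1
    while fps * repeats < vrr_min:
        repeats += 1
    return fps * repeats, repeats
```

So a 36fps cap on a 48-144Hz panel would show each frame twice for an effective 72Hz, and 20fps would be tripled to 60Hz -- which is why LFC-capable setups can stay smooth well below the panel's native VRR floor.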
 
How do you lock a game to 36fps?
Or is your refresh rate 72Hz (and the game can't hit 72fps) with vsync on?
 
How do you lock a game to 36fps?
Or is your refresh rate 72Hz (and the game can't hit 72fps) with vsync on?

It's just a frame rate cap, using the game's built-in cap as far as I remember, although it may be through NVCP; I'm away at the moment so unable to check. VRR takes care of the rest, i.e. my monitor sets itself to run at 36Hz. Or actually 72Hz, but it holds each frame for two refresh cycles.
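The arithmetic behind that, as a quick sketch (illustrative numbers only):

```python
# A 36 fps cap on a 72 Hz panel divides evenly, so every frame is held for
# exactly two refresh cycles and the cadence stays perfectly even.
refresh_hz = 72
fps_cap = 36
cycles_per_frame = refresh_hz // fps_cap  # 2 refresh cycles per frame
frame_time_ms = 1000 / fps_cap            # identical duration every frame
```

Any cap that divides the refresh rate evenly (36 into 72, 40 into 120, 30 into 60) gets this clean pacing; caps that don't are where VRR or vsync judder enters the picture.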
 
I'd strongly disagree. What happens if the frame rate occasionally dips slightly below 40fps?

With vsync you get judder; with VRR and a 40fps frame rate cap, it's invisible.

If it never dips below 40 then your average is likely much higher. With VRR you might be able to lock at 46fps 95% of the time (15% more framerate than 40fps) with only occasional invisible dips down to 40fps.

I play AC Odyssey at 36fps locked. That's 20% more frame rate than 30fps and it's smooth as butter (actual uncapped fps is in the 40s somewhere). To get that to a locked 40fps with vsync I'd need to lower the graphics settings.
According to John it feels worse. Also, TVs generally don't have VRR below 40Hz (maybe some monitors do, but not many people play console games on small monitors; that doesn't make much sense in the era of 55-inch OLEDs and above).
 
imo that's mostly semantics. If you're tracing rays (or some optimized version of rays -- cones) against geometry outside of screen space and using that to shade a pixel, it's the same general technique imo.
Don't some early "3D" games do this for mirrors? Like old DOS raycaster engines: they raycast to render the player view, and then raycast into off-screen space to get the reflection for the mirror?
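A toy sketch of that mirror trick, for illustration only -- the map, tile symbols, and crude fixed-step march are made up, not taken from any actual DOS engine:

```python
# March a ray through a 2D tile map; on hitting a mirror tile ('M'),
# reflect it once and keep marching to find what the mirror "sees".
MAP = [
    "#####",
    "#..M#",   # 'M' is a mirror; '#' is a solid wall; '.' is open floor
    "#...#",
    "#####",
]

def cast(x, y, dx, dy, step=0.05, max_iters=2000):
    """Coarse fixed-step ray march. Returns (tile_x, tile_y, reflected)
    for the first solid wall hit, or None if nothing is hit."""
    reflected = False
    for _ in range(max_iters):
        x += dx * step
        y += dy * step
        tile = MAP[int(y)][int(x)]
        if tile == 'M' and not reflected:
            dx, reflected = -dx, True   # bounce off the vertical mirror once
        elif tile == '#':
            return int(x), int(y), reflected
    return None
```

Casting east from the middle of the room hits the mirror and, after the bounce, ends up "seeing" the west wall -- the same ray-casting loop renders both the direct view and the reflection, which is the sense in which those engines were already "tracing into off-screen space".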
 
According to John it feels worse

I've yet to watch the video as I'm away on limited data, but I'm guessing John was not talking about a frame-capped and fully locked 40fps, which should feel identical to vsync 40fps, but rather a varying frame rate averaging 40fps or with 40fps as the minimum.

Also, TVs generally don't have VRR below 40Hz (maybe some monitors do, but not many people play console games on small monitors; that doesn't make much sense in the era of 55-inch OLEDs and above)

Lol, because every console gamer has a 55" OLED now do they?

As someone who actually does have a 55" OLED connected to their PC as well as a 38" ultrawide, I can say with confidence that the actual viewport of the monitor is far bigger at normal viewing distances than the TV's, and the aspect ratio is much more immersive too. That's why it's hands-down my first-choice screen even though the OLED has better HDR performance. And that's before considering the benefits of full-range VRR or dynamic Ambilight.
 
I've yet to watch the video as I'm away on limited data, but I'm guessing John was not talking about a frame-capped and fully locked 40fps, which should feel identical to vsync 40fps, but rather a varying frame rate averaging 40fps or with 40fps as the minimum.



Lol, because every console gamer has a 55" OLED now do they?

As someone who actually does have a 55" OLED connected to their PC as well as a 38" ultrawide, I can say with confidence that the actual viewport of the monitor is far bigger at normal viewing distances than the TV's, and the aspect ratio is much more immersive too. That's why it's hands-down my first-choice screen even though the OLED has better HDR performance. And that's before considering the benefits of full-range VRR or dynamic Ambilight.
Sure, not all, but John and I have :D -- I for sure will not go back to some LCD monitor hehe (especially ultrawide, which I also had in the past)
 
Sure, not all, but John and I have :D -- I for sure will not go back to some LCD monitor hehe (especially ultrawide, which I also had in the past)

Why would John use an ultrawide monitor with full-range VRR over a TV? The PS5 supports neither ultrawide nor VRR. And as a reviewer, I expect his seating arrangement is far from the typical console user's living room setup, so he's probably free to sit as close as he likes to the TV. Not much motive to switch from OLED when you're locked out of the main benefits of gaming monitors.

I do find your statement about ultrawide strange though. Are you suggesting it's somehow worse for gaming? Obviously that's a matter of preference, but it's not one I've heard expressed before. There's a reason movies are filmed in ultrawide, for example. Obviously you still have the option of playing in 16:9 on a 21:9 monitor with black bars at the sides if you wish. But lol, who would choose that?
 
Why would John use an ultrawide monitor with full-range VRR over a TV? The PS5 supports neither ultrawide nor VRR. And as a reviewer, I expect his seating arrangement is far from the typical console user's living room setup, so he's probably free to sit as close as he likes to the TV. Not much motive to switch from OLED when you're locked out of the main benefits of gaming monitors.

I do find your statement about ultrawide strange though. Are you suggesting it's somehow worse for gaming? Obviously that's a matter of preference, but it's not one I've heard expressed before. There's a reason movies are filmed in ultrawide, for example. Obviously you still have the option of playing in 16:9 on a 21:9 monitor with black bars at the sides if you wish. But lol, who would choose that?
I'm not sure I'd like to watch films with grey blacks... likewise, I think I'd sacrifice the wider view for the better blacks. I mean, there's only so much you can concentrate on in a game... but that's just me.
 