Digital Foundry Article Technical Discussion [2017]

Agreed, just as the PS4 version was not a significant improvement over the XBO version. Although at least with regard to the PS4-P and XBO-X, the XBO-X offers increased performance and a longer viewing distance before LOD kicks in.

Actually, there is no difference between the PS4/XB1 versions if both games are capped at 30fps.

So they didn't take any advantage of the PS4 hardware.

At least the game runs at native 4K on XBX with better settings.
 
Oh, so it's not available already? Huh, interesting to note.
I guess that explains why DF hasn't reported on its functionality yet.
I think MS said they would update the 1X to support VRR once it was ratified, with the expectation of that being around November.

So that would explain why no one has tested it hooked up to a monitor yet: FreeSync/VRR isn't enabled.
 
The X1X supports FreeSync, FreeSync 2, and VRR when ratified. The first two use a different protocol than the last one, which is part of HDMI 2.1, and there are no HDMI 2.1 devices available yet as the standard is not yet ratified.

My guess would be that DF are yet to publish an article and/or videos about how FreeSync works on the X1X, so it would be better to be ambiguous about how it performs until such an article is published.
 
Resolution, draw distance etc are nice bonuses for those with the X1X, but not something too noticeable. The framerate difference, however, matters IMO. I really hope this kind of thing sets a precedent and makes it clear that people do want 60fps, at least as an option.

You can't say this of the console market, where 30fps (and sub-30fps) have been the norm for a few generations. That is the precedent, because the larger segment of that market seems content with this; otherwise, explain the massive sales of 30fps console games.

Options are good, though. I like the option of putting extra performance towards better graphics, resolution or framerate.
 
I thought VRR is basically the open standard of FreeSync 2? Could be wrong.

Either way, that wouldn't stop DF doing vids and an article on it whilst mentioning that they will do a follow-up to see how it performs with a VRR display, if it currently supported FreeSync.
Also, other people would've tried it and reported on the benefits. That's why I suspect that it's not enabled at all.
I guess it is possible no one has bothered to try it, though.
 

VRR and FreeSync are different implementations of the same thing. FreeSync 2 is basically FreeSync 1 with the addition of HDR and a more closed standard, but the variable refresh portion of it is the same thing. VRR is the variable refresh implementation under HDMI 2.1.

As for why DF don't want to come out and state their findings? Well, it could be because they want to make a dedicated article/video about it, which should include a detailed analysis of how it works, and also for the clicks it will generate. Or it could be that it's not implemented yet, just like how 1440p has native support but hasn't been enabled.
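
Just to organize the claims in that post: a hypothetical capability table and selection policy, not a real console API; the mode names and the preference order are made up for illustration.

# Hypothetical capability table for the three sync flavours discussed above.
# Not a real API; just organizing the claims in the post.
SYNC_MODES = {
    "FreeSync":   {"link": "AMD extension over HDMI/DisplayPort", "hdr": "no"},
    "FreeSync 2": {"link": "AMD extension over HDMI/DisplayPort", "hdr": "yes"},
    "HDMI VRR":   {"link": "HDMI 2.1 (unratified as of 2017)",    "hdr": "separate"},
}

def pick_sync_mode(display_reports):
    """Prefer the newest mode the display claims to support (made-up policy)."""
    for mode in ("HDMI VRR", "FreeSync 2", "FreeSync"):
        if mode in display_reports:
            return mode
    return None  # fall back to fixed refresh

print(pick_sync_mode({"FreeSync"}))  # -> FreeSync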
 

Challenge accepted ;) 30fps is fine, and the success of titles with that framerate is testament to that, but the era of every game having just one mode of performance seems to be drawing to a close. The likes of Uncharted will most likely stick to 30fps so they can push the graphical fidelity as much as possible, and maintain a framerate similar to film, but we've seen an increase in the number of games offering a high framerate mode and I've yet to see these modes met with anything but positivity.

So I think we'll continue to see such modes in even more games. That is, unless the multiple modes we're seeing are just a quirk of mid-gen refreshes.
 
I think we are entering an era where some developers will increasingly start to prepare their console games for future generations of hardware. The X1X is a good piece of hardware, but that comes at a significant cost that will keep the Pro much cheaper for a long time, and then the question is whether the X1X will gain enough traction before the PS5 or whatever, which you would expect to be coming in, say, two years. But I think Microsoft’s work in getting software to work well with future hardware, including all their work on 360 titles, as well as their UWP and PC purchase overlap, gives them a valuable ecosystem and will be far more important than any hardware differences that come at a premium (were the Pro and X1X to cost about the same, it would be a different matter of course). And Sony isn’t even their only or perhaps even main rival here - both are increasingly competing with Steam, Google, and perhaps above all, Apple.

So yes, to stay on topic, we’ll see more modes that scale better with improved hardware for sure.
 
Overall not a significant improvement over the Pro game in the 1440p mode. Only around 25 to 50% more frames, and with more half-rate animations.

Apparently the Pro game could have a much better framerate if animation rates were set like in the XBX game.
Well, it is capped at 60, so we don't know how far it can go. The 4K mode is interesting: in the uncapped mode, it seems to be at almost the performance level of the Pro in 1440p mode.
But the game has really heavy CPU limits in some scenes, so in those it couldn't be that much faster than the PS4 Pro.
 
Hitman in Performance mode seems roughly in line with the difference in ALU when the game is ALU bound, and the difference in CPU when the game is CPU bound. Nothing about the results looks that shocking. The quality mode on the other hand is rendering more than 100% more pixels with performance roughly equal at 30fps. That's more than the difference to be expected. PS4 Pro is definitely hampered by memory capacity and/or bandwidth.
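
As a rough back-of-the-envelope check on that, using the paper specs (6.0 TF for the X1X GPU vs 4.2 TF for the Pro) and assuming the Pro's quality mode is the 1440p mode discussed above; real scaling is never this clean, so treat it as a sketch:

# Back-of-the-envelope for the Hitman comparison above, using paper specs.
x1x_tf, pro_tf = 6.0, 4.2
print(f"ALU ratio: {x1x_tf / pro_tf:.2f}x")      # ~1.43x, i.e. the 25-50% more
                                                 # frames seen when ALU bound
# Quality mode: native 4K vs 1440p pixel counts.
pix_4k, pix_1440 = 3840 * 2160, 2560 * 1440
print(f"Pixel ratio: {pix_4k / pix_1440:.2f}x")  # 2.25x, i.e. 125% more pixels
                                                 # at roughly the same 30fps

So the quality-mode gap (2.25x the pixels on ~1.43x the ALU) is indeed bigger than raw compute explains, which is what points at memory capacity and/or bandwidth.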
 
It would be a change in direction for DF; I've never known them to do in one article what they felt they could do in multiple :D. Worth noting, I'm not complaining about that either.

It'd be nice to have a definitive answer. I still believe that they grouped FreeSync and VRR support together when they spoke about it, but that may have been my interpretation of what was said. I'll try and see if I can google it at some point today.
As you said, it's the same way they plan to support 1440p but haven't yet. It would easily make sense for them to roll out FreeSync and VRR all at one time.
 
I agree with most of what you said.
Part of the reason I bring up VRR pretty often is that I could see it as part of the code for the future ecosystem that is being built.
Use VRR not just to smooth out the odd frame drop here and there, but target it so a 60fps game runs at, say, 45-55fps (picking a random fps range), so if you have a compatible display now, you get an even higher IQ experience. On future hardware it would run at a locked 60 (a difference you could probably only tell via DF) and at a higher resolution if it also used a dynamic one, downscaled to 4K.

I don't expect everyone to run out and swap their HDR TVs for new ones, but in a few years' time when a new Xbox comes out, all those games would look better. People who have a VRR display would get improved image quality now, though.
I wonder how good a game could look and play if it targeted that framerate, with dynamic res; all added together it's a big saving on GPU resources. Add in smart rendering tech and basically throw everything at it :LOL:
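
To make that concrete, here's a minimal sketch of the kind of controller that idea implies: a dynamic-resolution loop that tries to keep frame times inside the display's VRR window. The window bounds, scale limits, and frame times are all placeholder numbers, not anything a real console exposes.

# Minimal sketch: dynamic resolution driven by a (hypothetical) VRR window.
VRR_MIN_HZ, VRR_MAX_HZ = 48.0, 60.0  # placeholder window reported by display
WORST_MS = 1000.0 / VRR_MIN_HZ       # longest frame time VRR can still absorb

def next_render_scale(scale, last_frame_ms):
    """Nudge render scale so frame times stay inside the VRR window."""
    if last_frame_ms > WORST_MS:                # falling out of the window: drop res
        scale *= 0.9
    elif last_frame_ms < 1000.0 / VRR_MAX_HZ:   # headroom to spare: claw detail back
        scale *= 1.05
    return min(max(scale, 0.5), 1.0)            # clamp between 50% and native

scale = 1.0
for frame_ms in (15.0, 19.0, 22.5, 21.0, 17.0):  # made-up GPU frame times
    scale = next_render_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms -> render scale {scale:.2f}")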
 
The thing about variable refresh rate displays is they don't support variable refresh except within particular ranges of frequency. One TV might be 50-60Hz and another could be 45-60Hz. So, most likely it'll only help with 60fps games, and devs should probably still target 50-60fps. Variable refresh also isn't compatible with a lot of the motion enhancers like LightBoost or other flickering backlights, from what I understand. So it's a trade between motion clarity and torn frames, depending on the display.

Still, I think variable refresh is useful, and I'd hope 60fps becomes the norm so we can take advantage of it. Looking at Hitman, it's pretty obvious to me that the CPU is going to need a nice boost for next-gen, unless we want to be playing games with the same world density as Assassin's Creed and Hitman for the next ten years.
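
A quick sketch of why a narrow window mostly helps 60fps games: below the window's minimum, drivers can try low-framerate compensation (showing each frame two or more times), but that only works when the multiplied rate still lands inside the window. The 45-60Hz range here is just an example.

# Why a narrow VRR window (e.g. 45-60 Hz) mostly helps 60fps games:
# below the minimum, frame doubling only fits if the doubled rate
# still lands inside the window.
def vrr_handles(fps, vrr_min=45.0, vrr_max=60.0):
    if vrr_min <= fps <= vrr_max:
        return f"{fps:.0f} Hz (native VRR)"
    for m in (2, 3, 4):  # try showing each frame 2x, 3x, 4x
        if vrr_min <= fps * m <= vrr_max:
            return f"{fps * m:.0f} Hz (each frame shown {m}x)"
    return "out of range -> back to vsync judder or tearing"

for fps in (58, 50, 44, 30):
    print(f"{fps} fps -> {vrr_handles(fps)}")
# 44 fps can't be saved (2x = 88 Hz overshoots the window),
# but 30 fps can (2x = 60 Hz).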
 
With a high-frequency screen you won't have to wait long before being able to display a newly available frame...
(1/120 seconds or 1/144 seconds...)
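
Putting numbers on that (simple arithmetic, assuming a fixed refresh and a frame that just missed scanout):

# Worst-case wait before a finished frame can be shown, per refresh rate.
for hz in (60, 120, 144):
    print(f"{hz:3d} Hz -> up to {1000 / hz:.1f} ms")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms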
 
With consoles, the monitors would be forced to operate at 60Hz, no?

There's nothing that says they have to be forced. But considering that consoles are predominantly used with TVs that operate at 60 Hz, there isn't a huge reason to optimize beyond that.

But if we are entering an era of "rolling" generations, then there's no reason why an older "generation" game couldn't be allowed to run over 60 Hz on the newest console if it was connected to an appropriate high-refresh display.

Personally, I dislike Variable Refresh for the same reasons I dislike AFR rendering. It smooths out the display of the game but retains a variable control/input response (input combined with the feedback to your input), similar to how AFR provides the input feel of a much lower FPS.

I much prefer Variable Resolution with a fixed refresh, as I value control consistency and feel far more.

Regards,
SB
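
A toy illustration of that consistency argument, with made-up frame times (the point is the spread, not the absolute numbers):

# Input-to-display delay: VRR presents each frame as soon as it's done, so the
# delay tracks the (variable) GPU frame time. A fixed 60 Hz refresh whose
# budget is always met (e.g. by trimming resolution dynamically) presents at
# every vsync, so the delay is constant. Frame times are made up.
REFRESH_MS = 1000.0 / 60.0
frame_times = [17.0, 22.0, 18.5, 25.0, 19.0]   # ms, a game leaning on VRR

vrr_delays = frame_times                        # shown on completion
fixed_delays = [REFRESH_MS] * len(frame_times)  # dynamic res keeps each frame
                                                # under 16.7 ms, shown at vsync
spread = max(vrr_delays) - min(vrr_delays)
print(f"VRR delay varies over {spread:.1f} ms; fixed refresh holds {REFRESH_MS:.1f} ms")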
 
As I said, I picked the range out of the air; if all displays support down to 48Hz, then make that the minimum.
Better yet, interrogate the display and base the minimum on what it supports, and, as I said, if the game also supports dynamic resolution, then the resolution would drop further to support it.

With reasonable options, games could be made to scale well within this generation depending on setup, and for the next.

I think the options that some games are already giving are a nice start and bode well imo.
 
COD Infinite Warfare enhanced on XBX. Dynamic 4K (without the CBR seen on Pro): slightly sharper image than Pro CBR, slightly higher-quality effects (the difference is quite small, though). But there's screen tearing and a noticeably worse framerate than on Pro. Unable to sustain a solid 60fps like the reference Pro game.

We can also add this one to the list of games having trouble with performance on XBX compared to Pro, again when there are tons of alphas on screen. What kind of bug could cause this?

 
Is it accurate to say that COD Infinite Warfare (Nov 04, 2016) uses an older engine than COD WW2 (Nov 03, 2017)? If so, that could explain some things, since COD WW2 doesn't seem to suffer from the same issues.
 