Digital Foundry Article Technical Discussion Archive [2014]

How the hell is the Xbox One getting a 6fps swing (24fps low to 30fps high) while the PS4 is getting a 27fps swing (33fps low to 60fps high)?

BTW, where is the 45fps peak that was mentioned for Xbox One? :oops:

Looks like the X1 is capped at 30, with an average of 29.98fps.
The PS4 maxes out at 60, though its average is 50-53fps, and it easily clears the 30fps minimum.
 
The gravy is any frame rate above 30 FPS, which is what the PS4 version has.
(Sarcasm? The smiley tones down the implied satire.)
The 30 FPS floor is the common requirement for both platforms, but the developers for each version made different choices as to how they would handle performance spikes past it.

The Xbox One version favors consistency over frame rate above the minimum, while the PS4 version's performance cushion is such that its developers felt they could get away with a higher cap, at the apparent price of some judder.

The ratios of the minimum frame rates are actually pretty close to the CU difference between the GPUs, for what it's worth. The Xbox One version would probably have a closer average if the cap weren't there, but the frame rate appears not to have been consistently far enough above 30 to meet the developers' comfort criteria.
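For rough numbers: the minimums give 33/24 ≈ 1.38, while the CU counts give 18/12 = 1.5; the same ballpark, if not exact.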
 
Ya, if uncapped, the X1 version's average framerate would probably be a bit higher. But to be completely fair, the PS4 version is also capped, at 60fps, and there were quite a few moments where it held steady at 60fps and would probably have gone higher.
 
A fluctuation from 33-60fps... the PS4 version will feel like a PC game. Never mind the average 53fps; the min and max are also quite important to me (as are, of course, the duration and frequency of the minimum fps). I don't like that.

I wonder if this is a sign of a 'not so easy' port scenario with a limited time budget, i.e. not much time to optimize the code for the new console.

Are the low-fps scenes particularly demanding on the CPU? That seems to be the weak part of the PS4.
 
I fear that 'gravy' could soon be as annoying a term as 'secret sauce'.

And both make me hungry.
 
A fluctuation from 33-60fps... the PS4 version will feel like a PC game,

This makes no sense. A PC game running at that framerate with vsync off will feel like the PS4. The same PC game will feel like the XB1 with the flick of a vsync switch. And the same PC game will have a locked 60fps framerate with that same vsync switch and a few graphical settings lowered.

Applying fixed standards to PC gaming makes no sense. You can basically have whatever frame rate you want - locked or otherwise. The variables in that scenario are graphical settings and price (of the hardware you're willing to buy). You balance all 3 factors in whichever way you're most happy with.
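
To make that concrete, here's a minimal sketch (using SDL2 purely as an example API; no claim the game in question uses it) of how little the vsync "switch" involves on PC:

#include <SDL.h>

int main(int argc, char** argv) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* win = SDL_CreateWindow("vsync demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        1280, 720, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    SDL_GL_SetSwapInterval(1);     // vsync on: swaps wait for the display refresh
    // SDL_GL_SetSwapInterval(0);  // vsync off: swap immediately, tearing possible

    // ... render loop would draw and call SDL_GL_SwapWindow(win) each frame ...

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}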
 
A fluctuation from 33-60fps... the PS4 version will feel like a PC game. Never mind the average 53fps; the min and max are also quite important to me (as are, of course, the duration and frequency of the minimum fps). I don't like that.

I wonder if this is a sign of a 'not so easy' port scenario with a limited time budget, i.e. not much time to optimize the code for the new console.

Are the low-fps scenes particularly demanding on the CPU? That seems to be the weak part of the PS4.
It only dips to 33fps for a few seconds in one scene. It mostly runs at 40-50 during intense battles and around 60fps when not much is going on. The min/max is misleading, IMO.
 
This makes no sense. A PC game running at that framerate with vsync off will feel like the PS4. The same PC game will feel like the XB1 with the flick of a vsync switch. And the same PC game will have a locked 60fps framerate with that same vsync switch and a few graphical settings lowered.

Applying fixed standards to PC gaming makes no sense. You can basically have whatever frame rate you want - locked or otherwise. The variables in that scenario are graphical settings and price (of the hardware you're willing to buy). You balance all 3 factors in whichever way you're most happy with.

If I want a fixed 60fps on PC... I've always had to turn settings down extremely low... the micro-stuttering, or the minimum fps, seems really hard to control. Hence, lowering settings moderately is not an option, as the game may still average 53fps with occasional (but unpleasant) drops... so to really get every frame down to 16.67ms, the sacrifice is too big for me... in my personal experience.

The PS4 version seems to be the same... which is not good, IMO.
 
If I want a fixed 60fps on PC... I've always had to turn settings down extremely low... the micro-stuttering, or the minimum fps, seems really hard to control. Hence, lowering settings moderately is not an option, as the game may still average 53fps with occasional (but unpleasant) drops... so to really get every frame down to 16.67ms, the sacrifice is too big for me... in my personal experience.

The PS4 version seems to be the same... which is not good, IMO.

So on the PC (with this example framerate) what's stopping you from turning vsync on and locking the game to a solid 30fps?

My point is that with the PC you have choice. Lock the game to a solid 30fps like the X1? Done. Get the most out of your hardware with variable framerates between 40-60fps? Done. Lock the game at 60fps? Done (with graphical sacrifices). Lock the game to 60fps with no graphical sacrifices? Done (buy more hardware).
 
This makes no sense. A PC game running at that framerate with vsync off will feel like the PS4. The same PC game will feel like the XB1 with the flick of a vsync switch.

You're confusing vsync with an fps cap.
Being able to switch vsync on/off on PC doesn't make the engine automagically cap the frame rate at an arbitrary limit; they are completely different things.
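
The difference in one sketch (my own illustration, not any engine's actual code): a cap throttles how often you render, at any arbitrary target, while vsync only controls when a finished frame gets shown.

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double target_fps = 30.0;                    // arbitrary: 30, 45, whatever
    const std::chrono::duration<double> budget(1.0 / target_fps);
    auto deadline = clock::now() + budget;

    for (int frame = 0; frame < 300; ++frame) {        // ~10 seconds at 30fps
        // update_and_render();                        // hypothetical game work
        std::this_thread::sleep_until(deadline);       // sleep off the leftover budget
        deadline += budget;
    }
    return 0;
}

A 45fps cap, like the one mentioned earlier for the X1, would have to be this kind of engine-side limiter; a vsync toggle alone can't produce 45fps on a 60Hz display.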
 
You're confusing vsync with an fps cap.
Being able to switch vsync on/off on PC doesn't make the engine automagically cap the frame rate at an arbitrary limit; they are completely different things.

Doesn't double vsync mode cap the game at 30?

There are also tools on PC that let you set the frame rate cap to pretty much whatever you want.
 
I thought that just referred to double/triple buffering. With double buffering it would still allow 60fps (one unique, completed frame per refresh interval on a 60Hz monitor), but any slight increase in frame render time means the swap has to wait for the next refresh.
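
The arithmetic behind that, as a toy calculation (my own illustration, nothing measured from this game): with double-buffered vsync the presented interval rounds up to a whole number of refresh periods.

#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double refresh_ms = 1000.0 / 60.0;  // 16.67ms per refresh at 60Hz

    for (double render_ms : {15.0, 17.0, 34.0}) {
        // A frame that misses a refresh waits for the next one, so the
        // on-screen interval snaps up to a multiple of the refresh period.
        double shown_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;
        std::printf("render %.0fms -> shown every %.1fms (%.0ffps)\n",
                    render_ms, shown_ms, 1000.0 / shown_ms);
    }
    return 0;
}

So 15ms frames stay at 60fps, but 17ms frames, barely over budget, drop straight to 30fps; that cliff is what triple buffering softens.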
 
I'm not super fond of a fluctuating frame rate; it usually comes with some tearing.
It seems that Sony is keen to get PR wins and gave devs guidance that translates into PR wins but may not be the best choice for gamers.
This gen has just started, so I would say it is "fair".
 
It's not clear what pressure, if any, was placed on the developers of each version by their respective platform holders. The only known definite requirement was the 30 FPS minimum, and that wasn't a demand from Microsoft or Sony.

It could be that if the frame rate behaviors of the versions were switched, the decisions might have been different as well.
 
Not one review mentions a juddery framerate on the PS4 version... quite the opposite, actually. And DF's analysis doesn't show any tearing. Again, I think the min/max is a bit misleading. If you actually watch the video, the range is more like 10fps within each section of the game, and it rarely dips below 40. It's not like it constantly fluctuates between 33 and 60.
 

If we try to compare the two versions fairly (as they said in the article), I think we have to analyze the framerate gap only when neither version is pinned at its cap, whether 30fps or 60fps.

In that case there is roughly a 20fps average difference between the two. When the X1 runs at 28fps (quite often, in fact, with screen tearing), the PS4 is at 48fps.

48/28 ≈ 1.7, again suggesting from framerate alone that the PS4 performs about 70% better than the X1, on average of course. There was the same ~70% difference with BF4 (resolution + framerate differences combined), so no big surprises here.

What is surprising is the double-buffered setup on the X1 version (hence the screen tearing), while the PS4 version appears to use triple buffering (zero screen tearing).
 