Digital Foundry Article Technical Discussion Archive [2015]

If it's as simple as that, then run every game with an unlocked framerate. That won't go in the Xbox's favour either, you realise? The PlayStation version has a refresh rate lock (at least for some of it), which is why it's either 30 or (very rarely) 20.
The Xbox One version doesn't show any sign of tearing, if that's what you mean by "go in the Xbox's favour"; perhaps the lock would make it better. Maybe... Skyrim ran okay back in the day, and I spent a lot of hours playing it on the X360 at 30 fps.

But DF also says that Far Cry 4 runs at 30 fps and I can't stand playing it, I can see transitions and "jumps" between frames. Maybe I got used to 60 fps games.

The Witcher 3 is fine. I see those typical framerate transitions in 30 fps games, but it doesn't upset the brain.

This game reaffirms what I have always thought: the X1, despite being slightly less powerful hardware than the PS4 or PC, is able to run games in a very balanced and capable way, even if it requires some extra time, dedication and talent from the programmers. I think DX12 will continue along that path, by making the hardware easier to use for better results with less effort.

I am sorry that you don't completely understand the concepts of frame-time fluctuation or the dynamic 20fps cap in cutscenes (you don't spend enough time on this forum! That must be it!). And I really look forward to having the same discussion next year when you start studying computer science.

But I'm not sure they'll teach you those concepts in their program... ;-)
Well, I get the basics, if that's any consolation. If it's vsync'ed and *properly* capped, the game tries to keep the framerate at 30 all the time; if a frame can't be synced in time to avoid tearing, it has to wait for the next refresh, so framerate drops occur, and because the framerate is locked to multiples of the refresh rate, those drops are more prominent. Basically that's how I understand it. Is that right?
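To put rough numbers on that, here's a minimal sketch (my own illustration, not from any of the posts): assuming a 60 Hz display, a vsync'ed frame can only be swapped on a refresh boundary, so its time on screen rounds up to a whole number of ~16.7 ms refresh intervals, and a 30fps cap enforces a minimum of two.

```python
import math

# Minimal sketch (illustrative assumption: 60 Hz display, 30fps cap) of how
# vsync quantises the time a frame stays on screen.
REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms per display refresh


def displayed_time_ms(render_ms, cap_refreshes=2):
    """Time the frame is actually shown: rounded UP to whole refreshes,
    never fewer than the cap (2 refreshes = 30 fps)."""
    refreshes = max(cap_refreshes, math.ceil(render_ms / REFRESH_MS))
    return refreshes * REFRESH_MS


for render_ms in (25.0, 33.0, 34.0, 45.0):
    shown = displayed_time_ms(render_ms)
    print(f"rendered in {render_ms:4.1f} ms -> on screen {shown:4.1f} ms "
          f"({1000.0 / shown:.0f} fps)")
```

A frame that takes even slightly longer than 33.3 ms misses its slot and is held for three refreshes (~50 ms), which is why a capped 30fps game tends to fall straight to 20 rather than to 28 or 29.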
 
This game reaffirms what I have always thought: the X1, despite being slightly less powerful hardware than the PS4 or PC, is able to run games in a very balanced and capable way, even if it requires some extra time, dedication and talent from the programmers.

And 30% less pixels. The Witcher 3 devs didn't work any magic, they did what most everyone has done.
 
And 30% less pixels. The Witcher 3 devs didn't work any magic, they did what most everyone has done.
But a 5-10% better framerate, and in some cases like Resident Evil Revelations 2, a 100% better framerate.
 
You don't know the unlocked frame rate of the PS4 version of W3; all those times the PS4 version is at 30 fps, it could actually be rendering frames faster and throwing the excess away. And yes, devs can still screw up ports of budget games, but that doesn't support your claim.
 
I think Cyan's point is that PS4 drops lower than XB1, so probably runs at a lower framerate overall.
I think an argument could be made that the PS4 version might be running a few scenes in the comparison worse (at least the gameplay city scene), but the issue there would be that Cyan's argument is based on the framerate counter, which is extremely misleading as to the relative performance under the hood.
 
But DF also says that Far Cry 4 runs at 30 fps and I can't stand playing it, I can see transitions and "jumps" between frames. Maybe I got used to 60 fps games.

The Witcher 3 is fine. I see those typical framerate transitions in 30 fps games, but it doesn't upset the brain.

I would assume that the pixels have more movement per frame in FC4 vs Witcher 3 due to the first person perspective.
 
But a 5-10% better framerate, and in some cases like Resident Evil Revelations 2, a 100% better framerate.

You can safely discard all the frames that have less than a 33ms frame time, because you won't see them; they will be held to 33ms anyway by the screen refresh.
If, after that, the frame rate still shows a 5-10% advantage, you win (hint: it won't).
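A quick sketch of that argument with made-up numbers (the frame-time traces below are hypothetical, not measured data): clamp each frame to the 33.3 ms minimum a 30fps-capped, vsync'ed game displays at, then compare the frame rates you would actually see.

```python
# Hypothetical traces, purely to illustrate the clamping argument above.
CAP_MS = 1000.0 / 30.0  # ~33.3 ms: shortest time a capped frame is on screen


def seen_fps(frame_times_ms):
    """Average frame rate after holding every frame for at least CAP_MS."""
    clamped = [max(t, CAP_MS) for t in frame_times_ms]
    return 1000.0 * len(clamped) / sum(clamped)


version_a = [28, 30, 29, 31, 36, 27, 30]  # renders ~8% faster on raw numbers
version_b = [31, 32, 33, 32, 36, 31, 33]

for name, trace in (("A", version_a), ("B", version_b)):
    raw = 1000.0 * len(trace) / sum(trace)
    print(f"version {name}: raw {raw:.1f} fps, seen {seen_fps(trace):.1f} fps")
```

Both traces come out at roughly 30 fps once the sub-33 ms frames are held back by the refresh, so a raw 5-10% rendering advantage disappears from what the player actually sees; only the frames that blow past the budget still show up as drops.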
 
I've followed that conversation, but it's technical stuff that I don't fully understand.

There will be a time -I will start studying computer science next year, if everything goes as planned- when I will be able to talk geeky and help others, but for now I don't fully understand all the terms, which is why I ask about certain terms rather than talk about them.

To tell you the truth, the fluctuations on the Xbox One are in the range of 30 to 40 fps. The PS4 version doesn't fluctuate above 30, but falls to 20-something fps in some cases. That is a serious offence and more important than the difference in resolution, but DF treat it as a peccadillo.

If the Xbox One version is running at more than 30 fps 99% of the time, what makes you think that it would drop to less than 30 fps when it's obvious the console is more capable than that?

They didn't teach me these concepts at university.

What is important to remember regarding consoles is that they send a new frame to the display every 16.6 ms (60 times per second). So if you have a game that holds a perfect 30 fps, you get a new frame every 33.3 ms, i.e. every second refresh.

However, if you have a game that runs at 35 fps, you get uneven frame times, something like 33-33-33-16.6-33-33-33 ms. That looks a bit weird. I personally do not mind and prefer unlocked frame rates, but not everybody agrees.
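A small sketch of where that 33-33-33-16.6 pattern comes from (my illustration; the 35 fps figure is just an example): with vsync on a 60 Hz display, each finished frame waits for the next refresh boundary, so a steady ~28.6 ms render time turns into an uneven on-screen cadence.

```python
import math

# Illustrative only: frame pacing of an uncapped ~35 fps game on a 60 Hz
# display, where frames can only be flipped on refresh boundaries.
REFRESH_MS = 1000.0 / 60.0   # ~16.7 ms per refresh
RENDER_MS = 1000.0 / 35.0    # the game finishes a frame every ~28.6 ms

ready = 0.0      # when the next frame finishes rendering
last_flip = 0.0  # when the previous frame was put on screen
deltas = []
for _ in range(6):
    ready += RENDER_MS
    flip = math.ceil(ready / REFRESH_MS) * REFRESH_MS  # wait for next refresh
    deltas.append(round(flip - last_flip, 1))
    last_flip = flip

print(deltas)  # [33.3, 33.3, 33.3, 16.7, 33.3, 33.3]
```

That alternation between ~33 ms and ~16.7 ms presentation times is the judder people complain about with unlocked frame rates, even though the average frame rate is higher.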

Also, unless you have tried both versions, you should be very careful about making any judgements. People said that the PS3 and PS4 versions of The Last of Us were very similar, but when I actually tried them on the same display I experienced a vast difference.
 
Also, unless you have tried both versions, you should be very careful about making any judgements. People said that the PS3 and PS4 versions of The Last of Us were very similar, but when I actually tried them on the same display I experienced a vast difference.

This, and lol at the bolded! I don't get why Cyan didn't like FC4; it was pretty solid for me... maybe the first-person view was the issue.
 
It is worth noting that 18 months after launch, console performance doesn't hold much of an advantage. Even a GTX 750 Ti can beat the performance of a PS4. Can we still expect any benefit from console optimisation (as in previous generations)?
 
Face-Off: The Witcher 3: Wild Hunt

The Witcher 3 is a game of many firsts. Above all for CD Projekt Red, it has the distinction of launching on three platforms at once, pushing for PC, Xbox One and also its first Sony format - PlayStation 4. Also breaking new ground is a more open-world design than we've seen before in the series, widening the scope of Geralt's adventure as we enter a sprawling third act. We've had a cursory glance at how console versions hold up in performance terms, but factoring in a PC release with plenty of visual bonuses, how do the consoles compare?
 
It is worth noting that 18 months after launch, console performance doesn't hold much of an advantage. Even a GTX 750 Ti can beat the performance of a PS4. Can we still expect any benefit from console optimisation (as in previous generations)?

Can you point to anywhere in the history of consoles where console ports of PC games performed better than the PC original? Because I can't think of many (they will exist, but I bet most of them are from the last half of the console cycle, not the first). Conversely, games that were developed for consoles first rarely do well on PC. Of course, this generation the chance that games will keep being developed primarily on PC is bigger, and if the situation ever reverses, DirectX 12 will help keep the differences smaller. So things are looking better for PC gamers this gen than ever before.
 
Note that Eurogamer uses different settings, intended to match the console settings as closely as possible, so I'm guessing their conclusions are right.

Don't forget though that the PS4 version is a port of the PC version, and we don't know how well it was optimised.
 