Digital Foundry Article Technical Discussion Archive [2014]

The game performs poorly on both systems; it only hits 30fps in tunnels and other small, enclosed levels. In fact, DF says the X1 version actually runs smoother in places than the PS4 one, but the X1 version has a timing issue between the simulation and the graphics.
I am not happy with the situation, period. I know id Tech 5 is not the best choice for real-time lighting, but how in the hell does a dev take an engine known for being designed to hit 60fps and ignore all of its built-in features that would help with performance?
Since they are using dynamic lighting it is understandable to shoot for 30fps, but why in the hell would they letterbox the image to save on performance when the engine is built for dynamic resolution? I was really looking forward to The Evil Within, but after reading the DF article I am holding off to see whether they bother fixing the framerate and simulation bugs before I buy it.
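
For reference, the dynamic resolution scaling id Tech 5 shipped with in Rage boils down to a feedback loop on GPU frame time: render fewer pixels when the last frame ran over budget, creep back up when there is headroom. A minimal sketch of that idea in Python (the constants and names are made up for illustration; this is not id's actual code):

    # Minimal sketch of a dynamic-resolution controller in the spirit of id Tech 5:
    # scale the render resolution each frame so GPU time stays under budget,
    # instead of letterboxing or dropping frames. Illustrative values only.

    TARGET_FRAME_MS = 33.3           # e.g. a 30fps budget
    MIN_SCALE, MAX_SCALE = 0.7, 1.0  # clamp on the horizontal resolution scale
    ADJUST_RATE = 0.05               # how aggressively the scale reacts per frame

    def update_render_scale(scale, gpu_frame_ms):
        """Nudge the resolution scale up or down based on the last frame's GPU time."""
        if gpu_frame_ms > TARGET_FRAME_MS:
            scale -= ADJUST_RATE            # over budget: render fewer pixels
        elif gpu_frame_ms < TARGET_FRAME_MS * 0.85:
            scale += ADJUST_RATE * 0.5      # comfortably under budget: sharpen back up
        return max(MIN_SCALE, min(MAX_SCALE, scale))

    # Example: a spike to 40ms pulls the scale down, then it recovers.
    scale = 1.0
    for ms in [30, 40, 38, 33, 30, 28, 27]:
        scale = update_render_scale(scale, ms)
        print(f"gpu={ms}ms -> render at {int(1920 * scale)}x1080")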

See, even before you play it you're already scared! ;-)
 
Surprised at the higher-precision effects on PS4; those were real framerate killers in Wolfenstein, IIRC, if you set the effects to ultra. PS4 wins on graphics here, but I'm not sure it's worth the performance hit. They already got extra time by delaying the game, but I think five versions might have been too much.
 
Not a change as big as the mid-cycle resolution bumps we are seeing on consoles now, e.g. Killer Instinct 2, but PC games have definitely shipped with performance-aiding patches after release. Most commonly they address hardware configurations; some games have a hard time with specific vendors or cards.
 
Is it normal for developers to improve performance in post-release updates, or is that unlikely?

ZOE 2 HD on PS3 was a night-and-day difference before and after its patch: from the worst version to one of the best HD remasters ever. The question is whether Bethesda will bother. Probably not.
 
Digital Foundry: Hands-on with COD: Advanced Warfare multiplayer

Article up on Eurogamer.

Digital Foundry said:
The good news? After analysing over 23,000 frames of footage we record only three actual torn frames - essentially invisible during the run of play. The use of alpha appears to be a weakness here, with v-sync breaking when stood at the centre of a grenade's splash range with shader effects on the go. It's a tricky one to catch by eye, but as a worst-case scenario it allows the engine to uphold controller response on the rare occasions rendering spills over budget.
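
For anyone curious how three torn frames get found in 23,000: a torn frame is a composite of two buffers split at a scanline, so one crude way to flag candidates automatically is to look for a row where the image stops matching the previous captured frame and starts matching the next one. A rough sketch of that idea (not DF's actual tooling; frames are assumed to be same-sized greyscale numpy arrays and the threshold is arbitrary):

    import numpy as np

    def row_diff(a, b):
        """Mean absolute difference of each scanline between two greyscale frames."""
        return np.abs(a.astype(np.int16) - b.astype(np.int16)).mean(axis=1)

    def find_tear(prev, cur, nxt, threshold=2.0):
        """Return the scanline where `cur` switches from matching `prev` to matching
        `nxt`, or None if the frame doesn't look torn (needs motion to work at all)."""
        d_prev = row_diff(cur, prev)   # per-row distance to the previous frame
        d_next = row_diff(cur, nxt)    # per-row distance to the next frame
        for split in range(1, len(d_prev)):
            top_from_old = d_prev[:split].mean() < threshold < d_next[:split].mean()
            bottom_from_new = d_next[split:].mean() < threshold < d_prev[split:].mean()
            if top_from_old and bottom_from_new:
                return split
        return None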
 
Well, that's a better outlook than the grim one they gave the E3 single-player demo.
I am still surprised they had such doubts about the game reaching 60fps in its retail release.
I am really looking forward to AW like I haven't looked forward to a COD since 2.
Hopefully the game turns out great on both systems. It certainly looks like a fresh take on a stale franchise. Maybe it will turn out so good that no one even argues about the resolution difference between platforms, which in turn could lead to a sudden outbreak of world peace and magical cures for all contagious diseases. :)
 
XB1 Out of sync renderer: the-evil-within-performance-analysis

What's worse, on Xbox One, it almost feels as if the renderer is out of sync with the game simulation.

I have started analysing the constant judder present in the XB1 version of The Evil Within, using the short video DF provided in their first performance analysis.

Looking at the 60fps video frame by frame, it's clear that the renderer is out of sync with the game simulation by roughly a 1/7.5 ratio.

It's a series of 1/7 then 1/8 repeated, averaging 1/7.5.

So every 14th captured frame (then 16th, then 14th again, etc.), the displayed image ignores one animation step and jumps straight to the next one (as if the speed were doubled for those two frames only), creating an uneven, stuttering animation.

The video is 60fps, so for a 30fps-capped game every pair of captured frames is identical; dividing those 14- and 16-frame intervals by 2 gives the 1/7 then 1/8 ratio, averaging 1/7.5.

I am not sure how to draw a definitive conclusion, but note that 30 / 7.5 = 4 and 30 - 4 = 26.

Is it possible that the game runs internally at 26fps on XB1 (~13% slower than on PS4) and that this internal framerate is somehow turned into this out-of-sync "false" 30fps?

Either way, how? Do you know of a similar case?

EDIT: The period may be different from {1/7, 1/8}; I found some variations like {1/7, 1/7, 1/8}. Also, my first calculation of 26fps is probably wrong, though not by much: if the engine produces 7 frames where it should produce 8, it is 12.5% slower than normal, so when the renderer is capped at 30fps the engine really runs at 30 × 7/8 = 26.25fps, and when the renderer outputs 25fps it really runs at ~21.875fps, and so on.
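
One way this cadence could be checked automatically from the 60fps capture, assuming the frames have been dumped as greyscale numpy arrays (the duplicate threshold and the use of frame difference as a motion proxy are rough assumptions, not a validated tool):

    import numpy as np

    def frame_delta(a, b):
        """Mean absolute pixel difference between two greyscale frames."""
        return np.abs(a.astype(np.int16) - b.astype(np.int16)).mean()

    def skip_spacings(frames, dup_threshold=0.5):
        """Collapse the duplicated frames of the 30fps output, flag the updates whose
        motion is roughly double the typical step, and return the number of normal
        updates between consecutive skips (expected here: 7, 8, 7, 8, ...)."""
        # 1. Keep only frames that actually changed (drop the 60fps capture duplicates).
        unique = [frames[0]]
        for f in frames[1:]:
            if frame_delta(f, unique[-1]) > dup_threshold:
                unique.append(f)
        # 2. Use the inter-frame difference as a rough proxy for motion per update.
        motion = [frame_delta(unique[i], unique[i + 1]) for i in range(len(unique) - 1)]
        typical = np.median(motion)
        # 3. A "skip" is an update with roughly twice the typical motion.
        skips = [i for i, m in enumerate(motion) if m > 1.5 * typical]
        # 4. Spacing between skips, measured in 30fps updates.
        return [b - a for a, b in zip(skips, skips[1:])]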
 

Could this just be a frame-pacing issue that didn't get resolved in the XB1 edition? We've seen these issues before with a few PS4 games, later resolved with a patch.
 
Almost exactly the percentage that the Xbox CPU is faster than the PS4 CPU in those Ubisoft GDC slides.

Could the Xbox version be burning through hardwired per-frame workloads, filling some kind of render or simulation buffer, then just... chilling... until it's caught up with itself?

Reminds me of some games that tried to force a 60Hz game into 50Hz mode, or vice versa, and pulled a similar trick...

Might explain why the game is so messed up on PC too.
 

No, it's not a frame-pacing issue.

Every pair of captured frames is always identical (not the case with the frame-pacing issue still present in the XB1 version of Destiny, where the frames within a pair differ while the issue occurs).

And a frame-pacing issue occurs depending on engine load, or randomly; here it is perfectly periodic: 1/7, then 1/8, then 1/7, etc.

As DF said, the game simulation is definitely out of sync with the renderer. Taking the 1/7 ratio: for every 8 frames the engine should produce, the game simulation (the engine) only produces 7, so it has to cheat on the 7th and output the 8th frame's content instead, as if it ran constantly ~13% slower than the 30fps renderer (the frames sent to the TV).
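
To see why that would look exactly like the cadence observed in the capture, here is a toy model (purely illustrative, not the engine's actual scheduling) in which the simulation drops one animation step out of every eight and the 30fps renderer shows each produced step for one frame:

    def simulated_steps(n_rendered, drop_every=8):
        """Animation step index shown on each rendered frame when the simulation
        skips every `drop_every`-th step, as described above."""
        steps, current = [], 0
        for _ in range(n_rendered):
            current += 1
            if current % drop_every == 0:   # the "missing" step: jump over it
                current += 1
            steps.append(current)
        return steps

    shown = simulated_steps(30 * 7)                      # a 7-second clip at 30fps
    advances = [b - a for a, b in zip(shown, shown[1:])]
    print(advances[:16])        # [1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 2, 1, 1]
    print(advances.count(2))    # 29 double-steps in 7 seconds, about 4 per second

Every transition is a single animation step except for a double step roughly every seventh frame, which in a 60fps capture shows up as the jump every 14th-16th frame described above.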
 
It's actually very similar to the problem seen in pre-patch Diablo 3. The difference is that in Diablo 3 only the characters were out of sync (not the environment), whereas in the XB1 version of The Evil Within the whole game is out of sync.

But in this case DF didn't think it necessary to make a dedicated framerate video for the XB1 version (as they did, after a lot of work, for Diablo 3), even though for TEW the real framerate should always be ~13% lower than reported.

Where it reads 30fps the game really runs at ~26fps, and where it reads 25fps it really runs at ~22fps, etc. So the underlying narrative of the whole article would be rather different, IMO.
 
Going through the video frame by frame, I don't think I'm seeing the same thing you do.

It appears that the Bone version updates every two frames, but every few updates it advances by an accelerated amount. It's not a full frame's worth of skipping; it's more like one frame is 40-60% ahead of where it would be given the rate at which the other frames update. It's not an entire frame being skipped.

Try measuring the distance a point on the scrolling background moves each interval, such as the high-contrast wall edge.

This is a timing issue. I wonder if they're timing against clock cycles (CPU or GPU or NB, I don't know) instead of the built-in clock, then correcting with a hack when it gets too far out of sync? This feels hacky as fuck.
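
A rough sketch of that measurement, assuming consecutive de-duplicated frames are available as greyscale numpy arrays (the function name and search range are made up for illustration):

    import numpy as np

    def horizontal_shift(prev_row, cur_row, max_shift=20):
        """Estimate how far a horizontally scrolling background moved between two
        frames by sliding one scanline over the other and keeping the offset with
        the smallest error. Rows come from the same y position in both frames;
        the sign of the result just encodes direction."""
        prev = prev_row.astype(np.float32)
        cur = cur_row.astype(np.float32)
        best_offset, best_err = 0, np.inf
        for off in range(-max_shift, max_shift + 1):
            a = prev[max(0, off):len(prev) + min(0, off)]
            b = cur[max(0, -off):len(cur) + min(0, -off)]
            err = np.mean((a - b) ** 2)
            if err < best_err:
                best_offset, best_err = off, err
        return best_offset

Fed the unique frames from the clip, the per-interval shift should stay roughly constant, with a noticeably larger value on the judder frames: about double if a whole step is skipped, or 40-60% larger if it is the partial advance described above.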
 

Yes, it's tricky because the apparent speed of motion is not perfectly stable and fluctuates a bit. But watching the video at reduced speed, I can count roughly 31 uneven animation steps in the 7-second clip, which still gives a ratio between 1/7 and 1/8 (about 4.4 skips per second, i.e. one every ~7 rendered frames at 30fps).

But I admit the period may be more complex than a simple 1/7 followed by 1/8; there may be variations like 1/7, 1/7, then 1/8. It's hard to work out by hand without more advanced tools, like the ones DF used to find the real framerate of the Diablo 3 characters in the pre-patch game.

Ctrl+Shift+S plays at reduced speed in Windows Media Player (enable repeat in the options to re-count several times automatically) and will show you ~31 judders in the XB1 video. To get a better sense of what you are looking for, I'd suggest trying the PS4 video at full speed first, then at reduced speed, then the XB1 video at full speed and finally at reduced speed, where you can easily count the "stutters".

Anyway, it has nothing to do with CPU or GPU clocks; modern game engines time themselves independently of those. Maybe that was still the case back in the SNES era...
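
For what it's worth, the usual way engines keep the simulation and the renderer in step is a fixed-timestep loop driven by a monotonic wall clock rather than by counting CPU or GPU cycles; a generic sketch (nothing specific to id Tech 5):

    import time

    SIM_DT = 1.0 / 30.0   # fixed simulation timestep (30Hz, matching the render cap)

    def run(frames=60):
        """Generic fixed-timestep loop: the simulation consumes wall-clock time in
        fixed slices and the renderer draws whatever state exists each frame, so
        the two cannot drift apart by more than one timestep."""
        accumulator = 0.0
        previous = time.perf_counter()   # monotonic clock, independent of CPU/GPU clocks
        sim_steps = 0
        for _ in range(frames):
            now = time.perf_counter()
            accumulator += now - previous
            previous = now
            while accumulator >= SIM_DT:   # catch up in whole steps if rendering ran long
                sim_steps += 1             # advance_simulation(SIM_DT) would go here
                accumulator -= SIM_DT
            time.sleep(SIM_DT)             # stands in for rendering a frame's worth of work
        return sim_steps

    print(run())   # roughly 60 simulation steps for 60 rendered frames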
 

Looks really sharp. Probably the most impressive-looking X1 game I have seen yet. The YouTube comments are hilarious: people trashing it for the locked 30fps and 900p resolution and saying it looks like a PS2 game. These clowns are funny, but the sad part is I think they have convinced themselves their BS is actually true. Anyway, the rock-steady 30fps, no screen tearing, and crisp 900p all make for a technically sound game.
 
Wondering if they shouldn't have gone the Clustered Forward+ route instead of Light Linked Lists. Hmm...
 