Digital Foundry Article Technical Discussion Archive [2014]

I agree with you. But at the same time, just giving a minimum without any context is misleading, because you don't know how often or how long it stays there. Some people are looking at the worst of the DF videos and thinking the game plays like that all of the time, but it isn't true. To go the opposite way and say those videos are not representative at all is also not true.

Which is why fps segments should be shown in context, as in what it takes to make the fps drop to certain values. Having a distribution also helps. The way DF shows it tells you exactly what is happening on screen and what fps it correlates to, and that's doing as much justice to the game as I can imagine.
 
So about the same average as ShadowFall MP. 45fps = not bad. The overall average would be a good deal higher in real gameplay, when you factor in that DF vids are all demanding scenes.

Since when? ShadowFall MP averages closer to 50fps (maybe around 49fps) in this video. But how it achieves that average is even more important:

It hovers mainly between 44-54fps and is much more stable: when you compare the shape of the fps line it's plain obvious SF rarely dips under 40fps, whereas Titanfall may dip to sub-20fps levels that go purposefully unrecorded.

Even more importantly, SF hasn't got any screen tearing. Tearing is unacceptable on a next-gen console and not something that can be resolved by a GPU boost or draw call optimizations, which is my main concern with this game, more so than the fps average, sub-20fps dips or resolution.
 
It hovers mainly between 44-54fps and is much more stable: when you compare the shape of the fps line it's plain obvious SF rarely dips under 40fps, whereas Titanfall may dip to sub-20fps levels that go purposefully unrecorded.

Please tell me that this isn't some more DF conspiracy garbage ...?

And yes, KZ is more stable, but the load varies a lot less. So ... that means it's more stable. Taking things in isolation isn't helpful.

Even more importantly, SF hasn't got any screen tearing. Tearing is unacceptable on a next-gen console and not something that can be resolved by a GPU boost or draw call optimizations, which is my main concern with this game, more so than the fps average, sub-20fps dips or resolution.

Tearing sucks ass, but Killzone has its own issues, like low resolution and interlacing artifacts, and likely higher input latency.

Again, taking things in isolation isn't helpful, it's just about point scoring. These are games, they are entire products with many characteristics.
 
Tearing sucks ass, but Killzone has its own issues, like low resolution and interlacing artifacts, and likely higher input latency.

DF said input latency in Shadow Fall was 80ms (but not sure if SP or MP or both).

But with Titanfall's heavily fluctuating framerates leading to very inconsistent input lag, it is really likely that DF, if they measured the input lag in Titanfall, would prefer Killzone SF's consistent input latency.
 
DF said input latency in Shadow Fall was 80ms (but not sure if SP or MP or both).

But with Titanfall's heavily fluctuating framerates leading to very inconsistent input lag, it is really likely that DF, if they measured the input lag in Titanfall, would prefer Killzone SF's consistent input latency.

Huh? If vsync is off then the fps shouldn't have an impact on the input latency. Supposedly the notorious input latency has been fixed with KZSF, though I don't know the actual measurement.

Typical 60fps titles have input latency of 4 frames, or 67ms.
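As a quick sanity check on that 67ms figure, the arithmetic is just pipeline depth times frame interval (the 4-frame depth is the figure quoted here, not a measurement of any particular title):

```python
# Rough arithmetic behind the "4 frames at 60fps" figure (illustrative only).
frames_of_latency = 4            # assumed end-to-end pipeline depth, in frames
frame_interval_ms = 1000 / 60    # one frame at 60fps ~= 16.67 ms

print(frames_of_latency * frame_interval_ms)  # ~66.7 ms, usually rounded to 67 ms
```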
 
Huh? If vsync is off then the fps shouldn't have an impact on the input latency. Supposedly the notorious input latency has been fixed with KZSF, though I don't know the actual measurement.

Typical 60fps titles have input latency of 4 frames, or 67ms.

Like, not at all? So we can just vsync-off any game, and at 15fps, 20fps, 30fps or 40fps the input lag will always be a consistent 67ms, only at the cost of screen tearing?
 
Huh? If vsync is off then the fps shouldn't have an impact on the input latency
Turning vsync off eliminates the wait between a frame finishing rendering and that frame being displayed, but there are other performance-related timings: the duration between a button press and the game engine responding to it, and the duration between that response and the responding frame finishing rendering. Both of those factors can be fps-dependent, and the latter basically always is.
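A rough way to picture those components (a hypothetical breakdown with made-up numbers, not measurements of any specific game):

```python
# Hypothetical end-to-end latency breakdown (all numbers illustrative).
# Only the scanout wait disappears when vsync is turned off; simulating and
# rendering the responding frame still takes one frame time, so latency
# still grows as the framerate drops.

def input_latency_ms(fps, vsync, input_sample_ms=8.0, refresh_hz=60):
    frame_ms = 1000.0 / fps                                   # simulate + render the responding frame
    scanout_wait_ms = 1000.0 / refresh_hz if vsync else 0.0   # wait for the next buffer flip
    return input_sample_ms + frame_ms + scanout_wait_ms

for fps in (60, 40, 20):
    print(f"{fps} fps, vsync off: ~{input_latency_ms(fps, vsync=False):.0f} ms")
```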
 
GameSpot 360/XOne comparison video uploaded today


Definitely a major difference in sharpness and texture quality. In a few scenes (getting into a Titan) it almost looks like the 360 version is suffering UE3-style texture pop-in, with textures failing to load.

GameSpot did a much better job cutting their video than IGN's annoying "cut to a new scene every .0008 seconds before the viewer has time to process anything" technique on comparison vids.
 
Like, not at all? So we can just vsync-off any game, and at 15fps, 20fps, 30fps or 40fps the input lag will always be a consistent 67ms, only at the cost of screen tearing?

Huh? Why is this turning into a generalization that applies to all games all of a sudden?

So let me ask you this: why is a constant 150ms latency better than one fluctuating between 80ms and 100ms? (just throwing out random numbers)

Turning vsync off eliminates the wait between a frame finishing rendering and that frame being displayed, but there are other performance-related timings: the duration between a button press and the game engine responding to it, and the duration between that response and the responding frame finishing rendering. Both of those factors can be fps-dependent, and the latter basically always is.

I'm under the impression that with Titanfall, turning off vsync trades tearing for input latency, and when vsync is on, the input latency becomes inconsistent, no?

Of course there are many reasons why different games react differently, but then that's a given, no?
 
I'm under the impression that with Titanfall, turning off vsync trades tearing for input latency, and when vsync is on, the input latency becomes inconsistent, no?
Games using double-buffered vsync have extremely good response characteristics when they're hovering just above their target framerate, because the buffer flips work to time the initialization of frame rendering in a near-ideal way.
If performance drops below that target, flipping vsync off won't keep it at the input lag performance that it was at earlier, but it will keep it from tumbling off a cliff; input lag with vsync off at 59.99fps performance will be almost as good as input lag with double-buffered vsync on at 60fps performance, but input lag with vsync on at 59.99fps performance is generally going to suck by comparison (even if you switch to triple buffering).

Think of it this way. If you're maintaining 60fps with double-buffered vsync on, if your game has a classical rendering cycle with no high-level pipelining, there'll be 1/60th of a second between when a frame begins to render and when a frame begins to output. But no matter what your buffering scheme is, even if vsync is off, if you're only maintaining 40fps rendering performance, there's going to be a minimum of 1/40th of a second between when a frame begins to render and when it begins to output.

So yes, Titanfall trades vsync away for better input lag, but doing so does not mean your input lag doesn't change: it still gets worse, just not as badly as it would if you resorted to the vsync'd alternatives.
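Putting numbers on that floor, under the same simplifying assumption of a classical non-pipelined render loop:

```python
# The floor described above: the gap between a frame starting to render and
# starting to be output can never be less than the frame time, whatever the
# buffering scheme or vsync setting.

def min_render_to_output_ms(fps):
    return 1000.0 / fps

print(min_render_to_output_ms(60))  # ~16.7 ms when holding 60fps
print(min_render_to_output_ms(40))  # 25.0 ms when only managing 40fps
```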
 
This would be easier to explain with frametimes than fps. :)

---

We'll have to wait for more data on 360, but it does appear that the game seldom drops below 30fps, so enabling the 30Hz lock should result in a rather consistent experience - clearly not as ideal as shorter frametimes (lower latency), but consistent.
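For reference, the fps-to-frametime conversion being referred to throughout this thread:

```python
# fps <-> frame-time conversion used throughout this discussion.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (60, 45, 40, 30, 20):
    print(f"{fps:>2} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 60 -> 16.7, 45 -> 22.2, 40 -> 25.0, 30 -> 33.3, 20 -> 50.0
```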
 
GameSpot 360/XOne comparison video uploaded today


Definitely a major difference in sharpness and texture quality. In a few scenes (getting into a Titan) it almost looks like the 360 version is suffering UE3-style texture pop-in, with textures failing to load.

GameSpot did a much better job cutting their video than IGN's annoying "cut to a new scene every .0008 seconds before the viewer has time to process anything" technique on comparison vids.

Yes, there would be a fairly significant difference in clarity considering it's ~79% more pixels (one might also consider pixels per triangle or pixel:texel density).
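The ~79% figure checks out if you take the commonly reported resolutions (1408x792 on Xbox One, 1040x600 on Xbox 360; treat those numbers as assumptions here):

```python
# Pixel-count check; the resolutions below are the commonly reported figures,
# not something confirmed in this thread.
xbox_one_pixels = 1408 * 792   # 1,115,136
xbox_360_pixels = 1040 * 600   #   624,000

print(f"{(xbox_one_pixels / xbox_360_pixels - 1) * 100:.0f}% more pixels")  # ~79%
```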

Streaming is definitely new for the engine, as per the DF preview. One could infer that the prevalence of texture popping gives a clear reason for not supporting a single-source streaming option (disc-only, HDD-only), especially given how much of the environment can be seen and how quickly the scene can change (the alternative being to simply load ultra low-quality textures everywhere o_O).
 
So yes, Titanfall trades vsync away for better input lag, but doing so does not mean your input lag doesn't change: it still gets worse, just not as badly as it would if you resorted to the vsync'd alternatives.

So you are saying that the engine ticks of Titanfall actually change, such that some ticks are longer than 1/60th of a second?
 
Definitely a major difference in sharpness and texture quality.

There is also definitely 8 years between the release of those consoles. Think about it 8 years... I wonder if Halo 4 would hold up as well on a Dreamcast :)
Yeah, cross-gen titles are bound to leave a lot of performance on the table, but...
 
So you are saying that the engine ticks of Titanfall actually change, such that some ticks are longer than 1/60th of a second?
Maybe; obviously I don't have access to either input lag benchmarks or the source code.

I suppose, if the game logic runs in lockstep with the rendering (which would certainly imply that some ticks absolutely are longer than 1/60th of a second), it's possible that it runs with some latency-adding headroom that gets dropped when performance falls below target, which would result in a very nonlinear input characteristic relative to performance (and which would probably actually reduce input lag if performance is sitting just under 60fps, though input lag would again get worse as you dropped lower).

I'll say this much: if the input lag tends to be as low when the game is pushing 20fps as when it's pushing 60fps, I'd be quite astounded.
 
I'll say this much: if the input lag tends to be as low when the game is pushing 20fps as when it's pushing 60fps, I'd be quite astounded.

Obviously when you have 2 frames that are the same, at minimum the perceived latency at that moment is an extra 16.67ms. Though it gets more complicated, because game engines don't necessarily tick their internal simulation at the same frequency as the rendered frames, and the GPU can be rendering the previous frame while the CPU is computing the current one.
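A small sketch of that point, using a made-up frame delivery pattern at a 60Hz output:

```python
# Every refresh that shows the same frame again leaves image content on screen
# that is one more refresh interval (~16.67 ms) old. The delivery pattern below
# is illustrative, not captured data.
refresh_ms = 1000.0 / 60

deliveries = [1, 1, 0, 1, 0, 0, 1]   # 1 = new frame this refresh, 0 = previous frame repeated
age_ms = 0.0
for new_frame in deliveries:
    age_ms = 0.0 if new_frame else age_ms + refresh_ms
    print(f"extra staleness this refresh: {age_ms:5.2f} ms")
```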

I personally would take 67ms-84ms input latency over a constant 100ms any day, and going back to the earlier suggestion that DF would prefer a consistent response over a variable one with lower latency, here's what DF actually said:

Titanfall on Xbox 360 has similar levels of crazy screen-tear as its Xbox One sibling (unless you cap at 30fps) but the unlocked frame-rate does at least provide a higher level of controller response - essential for this particular game.
 
This would be easier to explain with frametimes than fps. :)

---

But Digital Foundry does provide a frame-time graph along with the fps graph; it's located at the top left of the fps graph.

And in the latest X360 face-off they did confirm it's basically an input lag reference:

Xbox 360 may be slower, but look at the frame-time graph. The game tears heavily, but the plus point is a much improved controller response compared to a 30fps shooter.

And from the performance videos I have seen, the advantage of running with vsync off is also linked to the fact that the game is allowed to fluctuate at 45fps or more, so you get better average input lag.

But in many cases where the game dips to near 30fps, the frame-time graph shows roughly 33ms frame-times, similar to a locked 30fps game.

And when it dips under 30fps, the frame-time graph also rises above the ~33ms of a typical locked 30fps game:

[Image: X360_Titanfall_faceoff_perf.jpg]
 
http://www.eurogamer.net/articles/digitalfoundry-2014-titanfall-xbox-360-tech-interview

Digital Foundry interview with Bluepoint about the 360 Titanfall port

That is quite interesting: 40 devs for a total of 15 months, or 50 man-years in total.

Assuming an average cost of $150k per person per year... the X360 version cost roughly 7.5 million dollars to develop, without factoring in any additional costs.

Is this about right? Does anybody have a more detailed idea about this?

This seems quite low: thinking about a PS3 version and even a PS4 version... the costs are quite low in comparison to the potential net gain... I really do wonder how much MS paid EA, and if EA is happy.
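Spelling out the back-of-the-envelope math above (the $150k/year per-person cost is an assumption, not a reported figure):

```python
# Back-of-the-envelope version of the estimate above; the per-person cost is
# the poster's assumption, and the next reply argues it is probably too high.
developers = 40
duration_years = 15 / 12                    # 15 months
man_years = developers * duration_years     # 50 man-years
cost_per_man_year = 150_000                 # assumed average, USD

print(man_years)                            # 50.0
print(man_years * cost_per_man_year)        # 7,500,000 -> ~$7.5M
```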
 
Assuming an average cost of $150k per person per year...
Unless salaries in gaming have escalated suddenly, you're way off base here. I couldn't afford to quit my current job coding and move into entertainment. Managers with a lot of responsibility? Perhaps. Maybe some really talented people in top-tier studios could get that too, but most? Nope.
 