Games & input lag.

The problem there was people being sloppy. The topic of lag was raised in the Framerate Analysis thread despite not being framerate analysis. People replied despite it being totally OT. Someone suggested starting a new thread but didn't have the gumption to actually start one, and then someone said there wasn't such a thread when, if they had actually gone about starting one, this thread would have come up as already existing. Finally, I didn't bother to check and just moved these posts out of the wrong thread.

I don't get why people are so averse to starting new threads for new topics, and would rather stick everything unrelated into one epic uberthread!

Anyway, here is where we can talk about input lag. Specifically, why is it different between platforms on the same game? What is causing the delay with SF?
 
Now, the real reason I bumped this thread: I came across some very interesting info in the Shoryuken thread. They said the biggest factor in reducing lag is to run the console at the TV's native res. The scaler in the console is said to be much faster than the one in the TV, and feeding the TV a non-native res is supposedly a sure way to invite lag.

Now, it appears the PS3 has problems outputting 1080P due to its scaler issues, and it depends on the game. So apparently the PS3 could have an endemic lag problem on native 1080P LCDs, if I'm understanding this right. On the Shoryuken forums, for example, it was discovered that the PS3 version of SF4 has an extra frame of "inherent" lag vs the 360 (I believe 4 frames on PS3 vs 3 on 360), which was thought to be possibly due to this 1080P scaling issue. I thought this was interesting, and Mr Leadbetter might be interested in such things given his past lag tests.

Am I right or way off on this? I don't own a PS3 anymore, so I'm not sure what output options it has. Does it have 1080i on all games? If so, would 1080i (PS3) to 1080P (TV) mean no TV scaler lag?

I guess another option touted on Shoryuken for all gamers was to simply stick to 720P TVs. Of course, though, the big/nice sets are all 1080P; I think 720P sets only go up to about 37". (Yes, I'm aware they're really 1366x768, but I think this would still reduce lag, I guess.)

Also, I recall a Sony TRC being discussed on B3D that would make all games include a render mode at 960x1080, with the 960 then scaled to 1920 via the working half of the scaler. Did that ever happen, and is it currently in place?

My TV is 1080p and I am running games at their native resolution, which in most cases is 720p and in very rare cases 1080p, like Wipeout HD.

When you say that the game should run at the TV's native res, does this mean I should force 1080p mode?

Also, if the PS3 scaler has problems outputting 1080p, what is the best solution for that?
This is very important for me to know, since I play Tekken competitively a lot. One of the things I have noticed is that when I play online with perfect connection quality (a 5, namely, with some very competitive friends), even the moves with the fewest frames don't seem fast enough to counter some relatively slower moves from the opponent, resulting in my character responding slower in general.
In very many cases my just-frame moves aren't fast enough to counter. Some might say this is most likely connection lag, but knowing that we both have identical connections and should have experienced similar issues, I suspect there might be some TV lag on my part, whereas the guy on the other side has less input lag.
 
I don't know anymore. The theory is that having your console do the scaling, rather than your TV, produces less lag; the console can supposedly scale faster than the TV.

So for example on Xbox, you want to set the console to 1080P output (if you have a 1080P TV) rather than 720P (in which case the TV will scale it). This can be problematic on PS3, though, since I understand many PS3 games can't output 1080P due to the scaler problems.

Honestly though, lag is such a finicky business, with such a ridiculous number of variables and so little hard testing, that I guess I wouldn't worry about it. For all we know, ten other unknown factors might influence the lag more than scaling does.
 
So I was snooping around ps360's blog and found this... about his next frame analysis regarding input latency, and all I have to say is: holy shit. Seriously, who is this guy? He puts paid journalists to shame... I'm simply astonished at his technical abilities.

http://blog.livedoor.jp/ps360/archives/51620353.html

It looks like a very accurate method to measure lag, and it seems less tiring on the fingers, using a fight stick and a camera that can do 60fps or over.

The only reason I got into SSFIV lag testing is that a lot of people claimed 1 frame, which wasn't even realistic in the first place for those who are sensitive to lag.
 
Hello, sorry for bumping an old topic, but looking through it, it seems to mostly be about the things that add lag.

I'm curious why in some games a lower frame-rate = more lag.
This happens in BFBC2, Lost Planet, The Witcher 2 (very noticeable in this game), etc.

But in some other games the frame rate can go very low with no input lag added (the lag stays the same).
This happens in GTA IV. My game goes as low as 10fps, but the input lag stays the same and is a lot faster than The Witcher 2 at 30fps.

What allows GTA IV to keep its input lag stable?

Thanks
 
Cool counting!

Maybe you could post more of your findings online; I, at least, would be really interested. Maybe some MW3 vs BF3 input lag measurements?

Sorry that I can't help you with your question though...
 

The most important thing is to separate the controller input from the screen refresh. You can have game logic running at, for instance, 60fps at all times (some games, like racing games, run it even higher), and then have a completely or nearly completely asynchronous rendering of the graphics that belong to that game state. There are more factors involved, but this is by far the most dramatic. If this separation isn't in the game's code, then allowing your screen to tear, and double vs triple buffering, can also affect the experience.
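To make that concrete, here is a minimal sketch of that kind of decoupled setup (not taken from any real game; the poll/step/render functions are empty placeholders and the timings are made up):

Code:
// Sketch: fixed-rate game logic decoupled from rendering.
#include <atomic>
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

struct GameState { int frame = 0; };

std::atomic<int> latest_frame{0};    // last simulated frame, published to the renderer
std::atomic<bool> running{true};

void PollInput(GameState&) {}        // read the pad: a few ms at most
void StepSimulation(GameState& s) { s.frame++; }
void RenderLatestState(int) {}       // draw whatever the sim last produced

// Logic thread: always ticks at 60Hz, no matter how slow rendering gets.
void LogicThread() {
    const auto tick = std::chrono::microseconds(16667);  // ~16.7ms per logic step
    GameState state;
    auto next = Clock::now();
    while (running) {
        PollInput(state);            // input is consumed here, every tick
        StepSimulation(state);
        latest_frame = state.frame;  // publish the newest state
        next += tick;
        std::this_thread::sleep_until(next);
    }
}

// Render thread: runs as fast (or as slow) as the GPU allows; a dropped
// render frame doesn't delay input handling.
void RenderThread() {
    while (running) {
        RenderLatestState(latest_frame);
        std::this_thread::sleep_for(std::chrono::milliseconds(33)); // pretend 30fps
    }
}

int main() {
    std::thread logic(LogicThread), render(RenderThread);
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    logic.join();
    render.join();
}

The point is just that the input poll in the logic thread keeps happening every 16.7ms even if rendering drops to 10fps, which is presumably roughly what GTA IV is doing and The Witcher 2 isn't.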
 
@tuna
^^ It seems interesting to check those games in their Xbox versions (30fps BF3, 60fps MW3). It will be hilarious if the result is that BF3 has lower input lag than MW3 :p

Btw, bonus frame count: The Witcher 2
http://www.gamexeon.com/forum/imagehosting/201107/44194e34d6a45f578.jpg
http://www.gamexeon.com/forum/imagehosting/201107/44194e34d6a47a70d.jpg

@arwin
Wow, thanks a lot. That clears up my curiosity :D

Btw, I also tried GTA IV on only 1 CPU core. The result is still the same 8 frames of delay. It seems the controller input is given very high priority (and is async).
 
Posted here to avoid going further off topic in the "predict next gen tech" thread...

I remember Carmack making comments that the current 66/100+ms situation is far from optimal. I was wondering: was that a general comment on the way the hardware works, or is there something about the way the current-gen hardware was designed that inflates that number? From my understanding, it takes 33ms to render a frame in a 30fps game, while the transmission of the signal from the controller to the console probably takes a few milliseconds tops. Yet the total lag can exceed 133ms. Looking at this thread there are reasons given for that, but... could that be addressed when designing the next-gen consoles, or is this pretty much how things work (and it can even get worse, as some are suggesting!)? Would it be possible to get lower than 100ms for a 30Hz game?
 

A lot of the input lag comes from the display. Most HDTVs sold have cheapish PVA/MVA panels that do a lot of signal preprocessing to hide their awful pixel response times. This adds several frames of lag to every image. This is not as much of a problem on the PC, because there most cheap displays use TN panels, where the image quality is crap, but at least they are fast.
 
Sure, but that's additional lag from the display. I was curious whether there's anything that can be done about the delay caused by the console hardware.
 
There is no inherent delay in the console hardware. None of the console scalers I'm aware of add frames.
Games generally add between 1 and 2 frames, depending on how things are split out.
The rest is the display.
 
I can believe those numbers.
In the simplest form, you have a worst case of 1 frame from the time a button is pressed to it being "read" at the start of a frame, plus the frame it's rendered in, then 16ms for the image to be drawn (assuming no additional display lag).
That's ~48ms, though on average you only lose 8ms waiting for the "read", so it's more like 40ms.
Double that for a 30fps game.

Practically, most games add 1 additional frame if they split gameplay from the rendering thread.

If they run the gameplay thread asynchronously and interpolate, all bets are off.
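Spelled out as a quick back-of-the-envelope calculation (just a sketch of the numbers above, using the same 16ms frame and ignoring the display entirely):

Code:
// Back-of-the-envelope version of the numbers above.
#include <cstdio>

int main() {
    const double frame_ms = 16.0;           // one frame at 60fps, rounded down

    // Worst case: wait a full frame to be "read", a frame to render,
    // then a frame for the image to be drawn.
    const double worst = 3 * frame_ms;      // ~48ms
    // On average the wait for the "read" is only half a frame (~8ms).
    const double average = 2.5 * frame_ms;  // ~40ms

    std::printf("60fps: worst %.0f ms, average %.0f ms\n", worst, average);
    std::printf("30fps: worst %.0f ms, average %.0f ms\n", 2 * worst, 2 * average);
    return 0;
}

Doubled for 30fps that's already 80-96ms before the display adds anything, which is roughly the 100ms ballpark from the Carmack comment above.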
 

Thanks, that explains a lot. I was just looking for those missing puzzle pieces that increase the total amount of lag to over 100ms, on top of the 33ms spent rendering the frame.
 

Yeah, I assume 4ms for the controller, which can cross the 16ms border like this:

Code:
0 --------- 16.6 -------- 33.3 ms
          x --- 4 ms
...and maybe it's even forced to, to prevent inconsistent responses?

Then you get the next 16.6ms for drawing the frame, the next 16.6ms for making that frame the framebuffer, and then it needs to be drawn by the display.

I vaguely recall reading that some Nintendo stuff, not sure if it was GameCube or handheld, is 2-3 frames generally.
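Putting those steps on one timeline (one possible worst-case reading of the above, 60fps game, no display processing counted):

Code:
 0.0 ms   button press (the ~4ms controller sample just misses this frame's read)
16.6 ms   input is read at the start of the next frame; gameplay runs
33.3 ms   the frame has been drawn
50.0 ms   at the next flip it becomes the framebuffer
66.6 ms   the display has finished drawing it

That's about four frames from press to photons in this worst case, before any lag the TV itself adds.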
 