30fps on PC feels worse than 30fps on consoles?

First thoughts would be:
-input lag
-display lag (likely severe on a TV, mild on a computer screen, and none at all for the best screens. I don't remember any TV managing to be close to the best computer screens...)
-irregular frame durations (engine/driver issues maybe; see the quick check after this list)
-lack of assisting code on the PC version (like auto aim, smoothing, and what not)
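If you want to sanity-check the irregular-frame-durations point yourself, logging frame-to-frame deltas is enough. A rough C++ sketch of the idea (the function name, 30 fps target and 25% threshold are just placeholders, not from any engine):

```cpp
// Minimal frame-time jitter check: call once per frame; it prints any frame
// that deviates noticeably from the ~33.3 ms target of 30 fps.
// (The 30 fps target and 25% threshold are arbitrary placeholders.)
#include <chrono>
#include <cstdio>

void log_frame_jitter()
{
    using clock = std::chrono::steady_clock;
    static auto last = clock::now();

    const auto now = clock::now();
    const double dt_ms =
        std::chrono::duration<double, std::milli>(now - last).count();
    last = now;

    const double target_ms = 1000.0 / 30.0;  // ~33.3 ms per frame at 30 fps
    if (dt_ms > target_ms * 1.25 || dt_ms < target_ms * 0.75)
        std::printf("irregular frame: %.2f ms (target %.2f ms)\n",
                    dt_ms, target_ms);
}
```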

Unless you coded the different versions of the game you can't really know how much it diverged between PC and console.
If your screen & TV were tested or if you have a display lag meter you could look into that.
I would also expect (but I never measured) that input latency is better on consoles, as it's critical and they are dedicated gaming machines. (That said, since Microsoft makes both Windows and the Xbox, improvements may be carried over... Or not, from what I heard of MS internal competition between teams a few years ago.)
 
(...). Theoretical is all well and good, but if you can't buy it yet, it is mostly irrelevant(...)

You can test this already: seek out a place with a CRT projector running a fast-retrace mode at 60 Hz, and bring a DLP with its blazing response but full-frame-length pixel persistence. The latter will look like OLED, the former resembles a perfect backlight. It's kind of shocking, rather than, say, irrelevant.
Regardless, consumer available LCD displays all have worse pixel response than current consumer available OLED displays.
Except pixel response is getting displaced by MPRT, "Motion Picture Response Time", which can express backlight performance and seems like a more reliable metric.

Likely because the cost is far too high to approach any of the theoretical benefits.

As usual with new technology, but then can you really call this window dressing of HDR new technology, with its bad MPRT (and high power consumption!)...

No doubt LCD improvements are coming, especially because of that atrocious glowing and bleeding.
 
Yup it's been a thing for like forever. Half-Life and Half-Life 2 had extremely bad mouse lag with Vsync on back when they came out. It wasn't the only game that did it (there were lots), but it's one that has always stuck out because it was particularly bad.

One of the reasons I generally played with Vsync off. Developers have gotten much better at doing whatever needs to be done to reduce the impact of Vsync on mouse lag in recent years though.

I've also only ever limited FPS within the game engine itself, so I can't recall having run into limiting frame rate causing input lag. Then again, I may just be lucky that the games I play have good programmers. :p
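For what it's worth, the basic idea behind an in-engine limiter is just pacing the frame yourself instead of letting the swap chain block on present. A crude C++ sketch of the concept (not how any specific engine actually does it):

```cpp
// Crude in-engine frame limiter: sleep most of the remaining frame time,
// then spin for the last bit so the pacing stays tighter than a plain sleep.
#include <chrono>
#include <thread>

void limit_frame_rate(double target_fps)
{
    using clock = std::chrono::steady_clock;
    static auto next_frame = clock::now();

    const auto frame_time = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));
    next_frame += frame_time;

    // Coarse sleep, leaving ~1 ms of slack for the OS scheduler.
    std::this_thread::sleep_until(next_frame - std::chrono::milliseconds(1));

    // Busy-wait the remainder for more accurate pacing.
    while (clock::now() < next_frame) { /* spin */ }
}
```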

Regards,
SB
One classic setting that helped/caused lag with Vsync was Maximum Pre-Rendered Frames in the NVIDIA Control Panel. (Not sure what it's called on AMD/Intel.)
It basically lets the CPU work a few frames ahead to reduce GPU stalls caused by the CPU, and thus increases latency.
At low framerates an additional ~3 frames of buffering is really painfully obvious.
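To put a rough number on that (my own back-of-the-envelope, not a measurement): each queued frame adds about one frame time of latency, so at 30 fps a 3-frame queue is roughly 100 ms on top of everything else:

```cpp
// Rough estimate: each pre-rendered (queued) frame adds roughly one frame
// time of latency on top of rendering and scanout.
#include <cstdio>

int main()
{
    const double fps           = 30.0;
    const double frame_ms      = 1000.0 / fps;  // ~33.3 ms per frame
    const int    queued_frames = 3;             // "Maximum Pre-Rendered Frames"

    const double queue_delay_ms = queued_frames * frame_ms;
    std::printf("extra latency from a %d-frame queue at %.0f fps: ~%.0f ms\n",
                queued_frames, fps, queue_delay_ms);  // ~100 ms
    return 0;
}
```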
 
I would also expect (but I never measured) that input latency is better on consoles, as it's critical and they are dedicated gaming machines. (That said, since Microsoft makes both Windows and the Xbox, improvements may be carried over... Or not, from what I heard of MS internal competition between teams a few years ago.)
I recall seeing this tested a few years ago and it was found that the Xbox 360 had significantly more input lag than a PC at the time. Wish I could find the source.
 
Do you suggest setting this to 1 or letting the application decide?
It's been quite a long time since I fiddled with the setting; it could be that they have changed the defaults at some point.
If a game felt laggy, I just changed it to 1 for that application.
 
Semi OT: 2019's attempt at OLED motion:
Moving down to 4K UHD, the company is debuting something called Crystal Motion OLED in a 65 inch display, which is said to have the world's fastest MPRT (Motion Picture Response Time) of 3.5 milliseconds
Explosive statement aside,
going by the old motion resolution numbers from former plasma producers, that's roughly a 1430-line equivalent, and 99.9999% certain it's the scrolling type (you can safely forget about strobing or BFI in the high end): a good 60 Hz image, flicker-free in a dim room, but luminance drops accordingly (16 ms / 3.5 ms). Variable frame rates aren't a good idea because of how close that is to the flicker fusion threshold.
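The rough arithmetic behind that, assuming a 16.7 ms frame at 60 Hz (my numbers; the 960 px/s pan speed is just an example figure, not from the announcement):

```cpp
// Back-of-the-envelope persistence math for a 3.5 ms MPRT panel at 60 Hz.
#include <cstdio>

int main()
{
    const double frame_ms    = 1000.0 / 60.0;  // ~16.7 ms per frame at 60 Hz
    const double mprt_ms     = 3.5;            // claimed MPRT
    const double scroll_px_s = 960.0;          // example pan speed, pixels/second

    // Eye-tracked motion blur is roughly persistence * scroll speed.
    const double blur_px = mprt_ms / 1000.0 * scroll_px_s;

    // If the low persistence comes from a rolling scan / strobe, brightness
    // scales with the duty cycle (the "16 ms / 3.5 ms" drop mentioned above).
    const double duty = mprt_ms / frame_ms;

    std::printf("blur at %.0f px/s: ~%.1f px smear\n", scroll_px_s, blur_px);
    std::printf("luminance duty cycle: ~%.0f%% of full persistence\n", duty * 100.0);
    return 0;
}
```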
 
Certain engines don't seem to handle frame limiting correctly for the PC version despite a console edition existing (different paths, I guess).

I had a pretty strange experience with Gears Ultimate where setting the in-game framerate to unlimited provided a better experience than 60 even with v-sync still on, as if the engine didn't decouple a bunch of things like streaming/loading from the frame cap; there was much more stuttering when I set it to my monitor's refresh rate, and a lot less with "unlimited".

 