One assumes they can copy-paste from the "Using AF" example in the online manual... If it's one additional step, how hard is it to take that step? This is getting ridiculous. Even if it's a bit convoluted to set AF flags...
If it's one additional step, how hard is it to take that step?
It's not hard, but it means that you need a "PS4 person" on your team, and no multiplatform studio seems to have one. Why bother, if it runs OK anyway?
Need a PS4 person? Multiplatform studios don't have people experienced with PS4 development?
Multiplatform studios don't have people experienced with PS4 development?
http://www.eurogamer.net/articles/d...am-knight-on-ps4-is-a-technical-tour-de-force
Batman is great on PS4, but it's a bad PC port.
Edit: It's not the same team; the PC version was outsourced. I hope the PC port will get better.
Digital Foundry: It gets worse - Batman: Arkham Knight on PC lacks console visual features
Frame-time graphs are where it's at. The frame-time-graph revolution that hit PC reviewing a couple of years back is something DF should look to get in on.
Their performance analysis videos effectively give you that in real time. Interestingly, the frame times (while unreasonably high for a 780 Ti) didn't look all that uneven in the recent Batman article. Presumably that wasn't giving us the whole story; otherwise, with the 30fps lock applied, the game should be pretty smooth on that level of hardware - and I hear that it isn't.
All rates have to be measured over a period of time, that period being arbitrary.
The single longest update interval is useless, though.
I agree. Which means the "fps" measure is useless nowadays. It was useful when frame times were not changing so wildly.
It's not. And it's obviously not useless if it is 401 ms: if you set your moving average to 500 ms, it will give you a minimum of around 3 fps (which everybody would understand is a problem).
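To make that concrete, here is a minimal Python sketch (not from the thread) of the moving-average idea: frame times are grouped into consecutive ~500 ms windows and an fps figure is reported per window, so a single long frame collapses its window's figure instead of vanishing into a one-second average. The 500 ms window comes from the post above; the frame-time numbers and the bucketing convention are just illustrative.

def fps_per_window(frame_times_ms, window_ms=500.0):
    # Close a window once at least window_ms of wall-clock time has
    # accumulated and report frames-per-second for that window.
    fps = []
    elapsed = 0.0
    count = 0
    for ft in frame_times_ms:
        elapsed += ft
        count += 1
        if elapsed >= window_ms:
            fps.append(1000.0 * count / elapsed)
            elapsed = 0.0
            count = 0
    return fps

# Steady 33 ms frames with a single 401 ms hitch: the steady window reads
# ~30 fps, while the window containing the hitch drops to 8 fps.
times = [33.0] * 16 + [401.0] + [33.0] * 16
print(fps_per_window(times))  # -> [30.3..., 8.0]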
The big question here is how the perception of frame pacing works. I would argue that the difference between the longest and shortest frame time within a sane visual reaction interval (120-150 ms for the average human) would provide us with the answer, but I have no hard evidence to prove it. And some people would argue that long frame times are worse (which I think is complete BS, unless it's above 50 ms). Probably somebody could conduct an experiment on a focus group (caveat: it cannot be done on any modern game, because these have uneven frame pacing by definition; probably using some old fast shooter like Quake 3 is the best bet).
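For what it's worth, here is a rough Python sketch of my reading of that metric (not from the post): slide a ~130 ms window over the frame-time sequence and score the run by the worst spread between the longest and shortest frame time seen in any window. The 130 ms window length and the "spread" definition are assumptions.

def worst_pacing_spread(frame_times_ms, window_ms=130.0):
    # Worst (max - min) frame time observed inside any ~window_ms stretch
    # of consecutive frames; 0 means perfectly even pacing.
    worst = 0.0
    start = 0
    span = 0.0
    for end, ft in enumerate(frame_times_ms):
        span += ft
        # Shrink the window until its wall-clock span fits (keep >= 1 frame).
        while span > window_ms and start < end:
            span -= frame_times_ms[start]
            start += 1
        window = frame_times_ms[start:end + 1]
        worst = max(worst, max(window) - min(window))
    return worst

# Both runs average ~30 fps, but only the first is evenly paced.
print(worst_pacing_spread([33.0] * 20))        # -> 0.0
print(worst_pacing_spread([16.0, 50.0] * 10))  # -> 34.0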
Standard deviation of frame-time values would be a good indicator of large frame-time variance.
It's kind of useless, because the frame-time distribution is not normal. I would say that using an FT (Fourier transform) to move it into the frequency domain can be beneficial (lower frequencies in the signal mean better frame pacing), but not the regular random-variable stuff.
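As a sketch of that frequency-domain idea (assuming numpy, and treating the frame-time sequence as an evenly sampled signal, which is a simplification): two runs with the same per-frame deviation separate cleanly once you look at where the spectral energy sits.

import numpy as np

def spectrum(frame_times_ms):
    # Magnitude spectrum of the frame-time sequence with the mean
    # (average frame time) removed, ordered from low to high frequency.
    x = np.asarray(frame_times_ms, dtype=float)
    return np.abs(np.fft.rfft(x - x.mean()))

n = 256
slow_drift = 33.0 + 8.0 * np.sin(np.linspace(0.0, 4.0 * np.pi, n))  # ~2 slow cycles over the run
fast_jitter = 33.0 + 8.0 * (-1.0) ** np.arange(n)                   # alternates every frame

# Both vary by +/-8 ms around 33 ms, but the drift's energy sits in a low
# bin while the frame-to-frame oscillation peaks at the highest bin.
print(np.argmax(spectrum(slow_drift)))   # -> low bin (~2)
print(np.argmax(spectrum(fast_jitter)))  # -> 128, the highest frequency bin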
They are taking an age to do the Batman face-off, what gives?
Maybe they have... TWO FACES!