Digital Foundry Article Technical Discussion Archive [2015]

They did a really good job with this remaster: from 720p at ~30-35fps (in the most hectic moments) on PS3 to 1080p with an almost rock-solid 60fps on PS4 (only 2 dropped frames during the whole ~7min stress test, and that moment has obviously been put near the beginning of the video).
 
They did a really good job with this remaster: from 720p at ~30-35fps (in the most hectic moments) on PS3 to 1080p with an almost rock-solid 60fps on PS4 (only 2 dropped frames during the whole ~7min stress test, and that moment has obviously been put near the beginning of the video).

Wonder if it's just a CPU hiccough. Otherwise, I'd be rather curious to know how much rendering overhead they have. Super-sampled dynamic res would be pretty cool for those moments where there isn't much going on - platforming/climbing etc.
 
1080p is a defined broadcast standard (BT.709). The fact that early plasma displays (I recall quite a few having a real resolution of 1024x768) couldn't fully replicate the full pixel frame information and had to scale/interpolate for the native display is neither here nor there.
IIRC BT.709 defines a colour space and resolution for mastering BD, but it is not a broadcast standard in the sense that there is no industry-standard method for transmitting 1080p signals. I believe 1080i is the highest standard, but there have been limited experiments using proprietary methods by satellite and cable operators.
 
IIRC BT.709 defines a colour space and resolution for mastering BD, but it is not a broadcast standard in the sense that there is no industry-standard method for transmitting 1080p signals. I believe 1080i is the highest standard, but there have been limited experiments using proprietary methods by satellite and cable operators.
You're thinking of RC.709. BT.709 is indeed a broadcast standard.
 
http://www.eurogamer.net/articles/digitalfoundry-2015-should-you-install-the-witcher-3-patch-107

Patch 1.07 means lower performance on Xbox One, and on PS4 sometimes lower and sometimes better performance.

This is bizarre; on PC I noticed improved performance and improvements in both lighting and HBAO+.

I mean, can't CDPR just dial back some visual effects to achieve better performance? Maybe use triple buffering instead of double buffering as well. Considering how well it scales on PC, it's baffling why it's struggling on current-gen consoles. Maybe a CPU bottleneck?
 
They now cap the game at 20fps in the XB1 version, but only in some specific locations like the swamps (and sometimes not everywhere in the swamps). We can see a relatively stable ~25fps, or a slowly fluctuating framerate, in many areas of the XB1 footage in the video, including in the cutscenes.

The PS4 version still has double buffering everywhere including cutscenes.
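
To illustrate why that double buffering matters (a minimal sketch, not CDPR's actual code): with double-buffered vsync on a 60Hz panel, a frame that misses its deadline has to wait for the next vblank, so the displayed frame rate snaps to 60/30/20/15fps instead of degrading smoothly, which is exactly how you end up pinned at 20fps in the heavy swamp areas.

```python
# Minimal sketch (not CDPR's code): why double-buffered vsync on a 60Hz panel
# snaps the displayed frame rate to 60/30/20/15fps instead of degrading smoothly.
import math

REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms between vblanks

def displayed_fps(frame_work_ms: float) -> float:
    """With double buffering, a frame that misses a vblank waits for the next
    one, so the effective frame time is the work time rounded up to a whole
    number of refresh intervals."""
    intervals = math.ceil(frame_work_ms / REFRESH_MS)
    return 1000.0 / (intervals * REFRESH_MS)

for ms in (16.0, 20.0, 34.0, 45.0):
    print(f"{ms:4.0f} ms of CPU/GPU work per frame -> {displayed_fps(ms):.0f} fps on screen")
# 16 ms -> 60 fps, 20 ms -> 30 fps, 34 ms -> 20 fps, 45 ms -> 20 fps
```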
 
They'll mean the physics/game engine. 90 fps output would be stupid as no TV supports it. The claim of smoother netplay is perhaps that the interval between faults is reduced? Maybe 90 UDP packets a second, so only 1/45th of a second lost on a dropped message?
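
Putting rough numbers on that guess (back-of-the-envelope only, assuming one state update per UDP packet and no retransmission):

```python
# Back-of-the-envelope for the dropped-packet guess above
# (assumes one state update per UDP packet and no retransmission).

def gap_after_one_drop(packets_per_second: int) -> float:
    """Gap between received updates when exactly one packet is lost:
    two send intervals elapse between the packet before and the packet after."""
    return 2.0 / packets_per_second

for rate in (30, 60, 90):
    print(f"{rate} packets/s -> {gap_after_one_drop(rate) * 1000:.1f} ms gap after a single drop")
# 30 -> 66.7 ms, 60 -> 33.3 ms, 90 -> 22.2 ms (i.e. 1/45th of a second)
```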
 
Now that you are talking about framerates... It has been discovered that Killer Instinct actually runs at 90 fps on the Xbox One.

It started at 720p/60fps (if measurements at the time were correct; perhaps the game was already 90 fps then), and is now 900p/90fps. :)

http://www.gamezone.com/news/killer...-3rd-season-and-pc-crossplay-revealed-3421721

You're misinterpreting this... :yep2:

More than likely, they're talking about headroom (+30fps) while still being locked/vsynced at 60fps. If the headroom is that much... why not bump the resolution up to full HD?!

Edit: As Shifty mentioned, most household TVs can't support that refresh rate.
 
They now cap the game at 20fps in the XB1 version, but only in some specific locations like the swamps (and sometimes not everywhere in the swamps). We can see a relatively stable ~25fps, or a slowly fluctuating framerate, in many areas of the XB1 footage in the video, including in the cutscenes.

The PS4 version still has double buffering everywhere including cutscenes.

As much as I love TW3... both XB1/PS4 versions (framerates) sound awful.
 
I think they always concentrate on worst cases with these tests. Most of the time the game is 30fps on both. They always have to qualify the numbers with "in the bog, at night, in the rain, with enemies...", which is a clear indication they are gaming the test videos for drops. I find the frame rate perfectly acceptable. So far only the bog has issues, but even then the combat is so bad that there are few issues with controller latency.
 
You're misinterpreting this...
No, I am not. I am not a native speaker, but I know perfectly well what I read, despite the typical patronising "that's not what they said" directed at me here. It's not physics, nor particles, nor volumes, nor SSAO, nor realistic ferns or anything like that. It's frames per second.

The developers are saying that the game runs internally at 90 fps, which is three times the framerate of many games (I am getting to the point of not playing 30 fps games anymore, and I deleted the ones at that framerate on my console; thankfully it was just one), and if they say that, you can believe it. The DF guys could find out, but I wonder how.

Additionally, a game like Forza running at 60 fps all the time means that the game could be running internally at 80 or 100 fps in a few given frames, or even 120 or 200 fps, but it is limited in the code to 60 fps.

@Globalisateur shared a link in the Halo 5 thread about input lag a while ago. Killer Instinct has the best input lag of any game!! And they use Beyond3D as their inspiration for their webpage.

http://www.displaylag.com/video-game-input-lag-database/

Game Title      | Platform | Resolution | Frame Rate | Lag (60Hz frames) | Lag (ms) | Rating
Killer Instinct | Xbox One | 900p       | 60 FPS     | 4.9 frames        | 81ms     | Excellent
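
For reference, the frames-to-milliseconds conversion in that table is just the lag in frames multiplied by the 60Hz frame period (a quick sanity check, not DisplayLag's actual methodology):

```python
# Quick sanity check on the table above: convert lag measured in 60Hz frames
# to milliseconds. Plain arithmetic, not DisplayLag's measurement methodology.

FRAME_MS_60HZ = 1000.0 / 60.0   # ~16.67 ms per frame at 60Hz

lag_frames = 4.9
print(f"{lag_frames} frames x {FRAME_MS_60HZ:.2f} ms = {lag_frames * FRAME_MS_60HZ:.1f} ms")
# 4.9 frames x 16.67 ms = 81.7 ms (the site lists this as 81ms)
```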

Goodbye
 
*snip* Goodbye

Dude, what the engine is capable of doing is still limited by the hardware it's running on. You're not going to get anything over 60fps on XB1 (or PS4), with current household TVs overwhelmingly supporting 59-60Hz refresh rates. What you are going to get is a smoother locked/vsynced 60fps experience, because of the additional headroom that the engine can provide. The 90fps figure is more than likely what PC gamers can expect with the proper hardware in place (a monitor that supports refresh rates over 60Hz).

There is a reason why KI is 900p/60fps on the XB1... and if the engine is providing additional headroom for higher fps, then it's to make sure that the current locked frame-rate isn't dipping under 60fps. Which makes sense...
 
Dude, what the engine is capable of doing is still limited by the hardware it's running on. You're not going to get anything over 60fps on XB1 (or PS4), with current household TVs overwhelmingly supporting 59-60Hz refresh rates. What you are going to get is a smoother locked/vsynced 60fps experience, because of the additional headroom that the engine can provide. The 90fps figure is more than likely what PC gamers can expect with the proper hardware in place (a monitor that supports refresh rates over 60Hz).

There is a reason why KI is 900p/60fps on the XB1... and if the engine is providing additional headroom for higher fps, then it's to make sure that the current locked frame-rate isn't dipping under 60fps. Which makes sense...
The developers of the game have said it's 900p and 90 fps; don't you think that's proof enough? Or are you hurt because SF V is 60 fps?

A DF test would most likely just say 60 fps, because I guess they can't read more than 60 fps on consoles. But you are admitting yourself it's 90 fps when you say "a smoother locked/Vsync 60fps experience", so you have me all confused.

Look, you seem to be an okay guy, and while I get tired of this "you didn't understand it well", I can live with it; but embittered smart alecks, that attitude I can't stand.

I am not talking about you, nor about the people who tell me that I didn't understand well, who are usually blokes I like here for the most part, and even admire (say Shifty, to name one; too many to list here). Certain attitudes are different, though, and I can't help but feel upset about them.
 
The developers of the game have said it's 900p and 90 fps; don't you think that's proof enough? Or are you hurt because SF V is 60 fps?

Why would I care about Street Fighter being 60fps??? :confused:

Anyhow, the developer has stated the engine itself is capable of 90fps... but the actual XB1 KI is locked at 60fps. The additional fps headroom is more than likely (as I've stated many times) about keeping the locked 60fps smooth (butter smooth) and gamepad/gameplay latency way down. Which Rare should be commended for... however, you know this is nothing new.

But feel free to believe whatever you want...
 
The reason the game operates at 90fps is rollback for their multiplayer services. So the update runs at 90Hz (net code and controller inputs), and the fixed update runs at 60Hz (animations, active frames, damage, etc.); at least that is the best way I can describe it. The additional 30 updates per second allow the game to roll back any mistakes it makes over online play, ultimately decreasing lag and increasing accuracy without impacting the game's 60fps. It's quite good online!
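
A minimal sketch of that split (my own illustration of the general rollback pattern, not Killer Instinct's actual code, and every name in it is made up): a 90Hz input/netcode tick feeding a 60Hz fixed simulation step, with a short snapshot history so a late remote input can rewind and resimulate from the mispredicted tick.

```python
# Minimal sketch of the 90Hz / 60Hz split described above, with a tiny
# rollback buffer. This is my own illustration of the general pattern,
# NOT Killer Instinct's actual code; every name here is made up.
from dataclasses import dataclass, field

INPUT_HZ = 90       # controller + net code are sampled at this rate
FIXED_HZ = 60       # animations, active frames, damage are stepped at this rate
HISTORY = 30        # how many input ticks we keep for rolling back (~1/3 s)

@dataclass
class State:
    tick: int = 0
    positions: dict = field(default_factory=dict)

    def copy(self) -> "State":
        return State(self.tick, dict(self.positions))

class RollbackSim:
    def __init__(self):
        self.state = State()
        self.inputs = {}      # tick -> {player: button}, local + predicted remote
        self.snapshots = {}   # tick -> State saved just before that tick was simulated

    def advance(self, tick: int, local_inputs: dict) -> None:
        """Run once per 90Hz input tick on the local machine."""
        self.inputs.setdefault(tick, {}).update(local_inputs)
        self.snapshots[tick] = self.state.copy()
        self.snapshots.pop(tick - HISTORY, None)       # bound the history
        self._step(tick)

    def remote_input(self, tick: int, player: str, button: str) -> None:
        """A late packet arrives: if it contradicts our prediction, rewind and replay.
        (A real implementation would also refresh the snapshots while replaying.)"""
        predicted = self.inputs.get(tick, {}).get(player)
        self.inputs.setdefault(tick, {})[player] = button
        if predicted != button and tick in self.snapshots:
            simulated_up_to = self.state.tick          # remember how far we had run
            self.state = self.snapshots[tick].copy()   # rewind...
            for t in range(tick, simulated_up_to):     # ...and replay with the fix
                self._step(t)

    def _step(self, tick: int) -> None:
        # 90 input ticks per second, but gameplay only advances on 2 of every
        # 3 ticks, i.e. at 60Hz (the "fixed update" in the post above).
        if tick % 3 != 2:
            for player, button in self.inputs.get(tick, {}).items():
                dx = {"right": 1, "left": -1}.get(button, 0)
                self.state.positions[player] = self.state.positions.get(player, 0) + dx
        self.state.tick = tick + 1

# Usage: predict that the remote player (p2) kept holding "right", then correct it.
sim = RollbackSim()
for t in range(6):
    sim.advance(t, {"p1": "right", "p2": "right"})     # p2's input is a prediction
sim.remote_input(3, "p2", "left")                      # the truth arrives late
print(sim.state.positions)                             # {'p1': 4, 'p2': 2}: p2 rolled back and replayed
```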
 
The reason the game operates at 90fps is rollback for their multiplayer services. So the update runs at 90Hz (net code and controller inputs), and the fixed update runs at 60Hz (animations, active frames, damage, etc.); at least that is the best way I can describe it. The additional 30 updates per second allow the game to roll back any mistakes it makes over online play, ultimately decreasing lag and increasing accuracy without impacting the game's 60fps. It's quite good online!
I wonder why they do not use a rate divisible by 60; 120 or 240Hz might also give an easy way to do delayed rendering.
 
Either they can't get the physics up to 120 Hz, or it just isn't reliable enough across the net with too little time to receive data. I imagine the latter. Ordinarily we talk about network data arriving at 30 fps or even less in some aspects, and just tween the visualisation based on projections and corrections.

Would be easy to evaluate with network snooping. That sounds like a decent DF investigation, actually.
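
The "tween from projections and corrections" part above, as a minimal sketch (generic dead reckoning, not specific to any of these games; all names are made up): the renderer extrapolates a remote object's position from its last low-rate snapshot and eases toward the corrected value when a new snapshot arrives.

```python
# Minimal sketch of tweening a remote entity from low-rate network snapshots
# (generic dead reckoning, not tied to any particular game's netcode).

class RemoteEntity:
    def __init__(self):
        self.shown_pos = 0.0       # what we draw every rendered frame
        self.net_pos = 0.0         # last position received from the network
        self.net_vel = 0.0         # velocity implied by the last two snapshots
        self.last_snapshot_t = 0.0

    def on_snapshot(self, t: float, pos: float) -> None:
        """Called at the network rate (e.g. 30 Hz): update the projection."""
        dt = t - self.last_snapshot_t
        if dt > 0:
            self.net_vel = (pos - self.net_pos) / dt
        self.net_pos = pos
        self.last_snapshot_t = t

    def render(self, t: float, correction_rate: float = 10.0, dt: float = 1 / 60) -> float:
        """Called every rendered frame (e.g. 60 Hz): project forward from the
        last snapshot, then ease the drawn position toward that projection so
        corrections don't snap visibly."""
        projected = self.net_pos + self.net_vel * (t - self.last_snapshot_t)
        blend = min(1.0, correction_rate * dt)
        self.shown_pos += (projected - self.shown_pos) * blend
        return self.shown_pos

# Usage: 30 Hz snapshots of an object moving at 3 units/s, rendered at 60 Hz.
e = RemoteEntity()
for frame in range(12):
    t = frame / 60
    if frame % 2 == 0:                  # a snapshot arrives every other rendered frame
        e.on_snapshot(t, 3.0 * t)
    print(f"t={t:.3f}s drawn at {e.render(t):.2f}")
```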
 
No, I am not. I am not a native speaker, but I know perfectly well what I read, despite the typical patronising "that's not what they said" directed at me here. It's not physics, nor particles, nor volumes, nor SSAO, nor realistic ferns or anything like that. It's frames per second.
"FPS" is (incorrectly) used to measure lots of different aspects of games though. eg. Racing games have physics running at hudnreds of fps (should be called something like Hz, but we often use fps). The source is a tweet, kept short and so missing details. 90 fps on rendering would result in uneven frame pacing with every third frame not being shown, or a constant tear alternating between top and bottom third of the screen as you see only two thirds of every frame. Ergo it doesn't make sense to render at 90 fps unless they're talking PC.
 