Digital Foundry Article Technical Discussion Archive [2015]

Anyway, in my experience with The Witcher 3, the most annoying aspect of the game is easily and by far the camera stutter, which can be much more prevalent, noticeable and damaging to gameplay than the usually very short framerate drops, IMO.

But apparently they are going to improve it in the next patch.
 
I'm reading that the PS4 is double buffered, which is why it's dropping to a 20 fps lock? Not a clue what that means.
He means that we don't know if PS4 is hovering in the high segment of the 20s as anything < 30fps will drop the framerate to 20fps with double-buffer V-sync, so it'd be hard to compare.
 
He means that we don't know if PS4 is hovering in the high segment of the 20s as anything < 30fps will drop the framerate to 20fps with double-buffer V-sync, so it'd be hard to compare.
Yea I get that part, but I'm confused on the purpose of double-buffer v-sync. What exactly are its pros/cons?
This should introduce additional input latency?
 
If you consider 33 ms -> 40 ms (+7 ms) "additional input latency", then yes; but most probably, no.
hmm, so when the frame time is greater than 33 ms, vsync will move it to 40 ms thereafter, as opposed to letting it run uncapped. Interesting.
 
I've seen your video and I don't see major framerate drops, that's true. Even so, you locked the framerate,

I specifically said the frame rate was not locked. Only vsync.

and most people can't do that in their GPU control panel. I know I can't for instance, but I can enable Vsync, at least.

You don't do it from the control panel, you use a third party application like NV Inspector or RivaTuner. The option is there for everyone.

Good examples of the ever-changing behaviour of the framerate can be seen in the most recent Digital Foundry video:


With respect, that's an absolutely terrible video to prove your point with. The frame rates in every one of those games are very stable when the onscreen action is consistent. It's only when the onscreen action changes significantly that the frame rate varies, which is exactly what would happen on any system, PC or console, when no lock is in place. For proof, look no further than your favorite subject of the moment: Witcher 3 performance on XBO with an unlocked frame rate.

At one point the frame rate goes from 23 to 40 to 22 fps in a few seconds. That's not a dig at the XBO; any system will behave that way when the onscreen action varies considerably (as common sense would suggest it would). Look at the comparable sections between the two videos (i.e. the horse ride through the city), though, and you see the frame rate is just as steady on the PC as it is on the XBO.

AC Unity is one of the worst offenders: the framerate can be at 100+ fps and then run at around 60 fps, and The Witcher 3 goes from 90+ fps in a scene to 60 fps a split second afterwards. CoD: AW runs at 210 fps in one scene and suddenly drops to 130 fps, and I mean it. Or from 68 fps on the R9 290X to 150 fps...

You mean just like at 56 seconds into the XBO performance video for The Witcher 3, where it goes from 23 to 40 to 22 in a few seconds? As I said, this isn't a PC-only phenomenon; it's a general rendering phenomenon when running without a frame cap/vsync.

If you don't want frame rates to vary like that, you simply apply vsync and/or a frame rate limiter, which in almost all of those PC games would have resulted in a perfect 60 fps lock every time.
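
For anyone wondering what a frame rate limiter actually does, here's a minimal Python sketch of the basic idea (purely illustrative; real limiters like RivaTuner hook in at the driver/present level, and render_frame here is just a placeholder for the game's per-frame work):

Code:
import time

TARGET_FRAME_TIME = 1.0 / 60.0  # cap at 60 fps

def run_capped(render_frame, num_frames=600):
    # render_frame is a hypothetical callable standing in for the game's
    # update + render work for one frame.
    next_deadline = time.perf_counter() + TARGET_FRAME_TIME
    for _ in range(num_frames):
        render_frame()
        # Sleep away whatever is left of the 16.67 ms budget...
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        # ...and schedule against fixed deadlines so the cap doesn't drift.
        next_deadline += TARGET_FRAME_TIME

A PC that could otherwise push 100+ fps simply sleeps for most of each interval, which is why the result is a flat 60 fps line instead of the swings you see in the uncapped videos.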
 
hmm, so when the frame time is greater than 33 ms, vsync will move it to 40 ms thereafter, as opposed to letting it run uncapped. Interesting.
It basically misses the vblank of the display (16.67ms for 60Hz display), so it waits for the next interval.

30 fps is already a missed interval for a 60 Hz display. So let's say you render a frame in under 16.67 ms, and that's your first unique frame. That frame is ready for display, so that's what you see. If the engine takes longer than 16.67 ms for the next frame, it waits for the next refresh at 33.33 ms, so the 1st frame is essentially sitting on-screen for two intervals, or 60 Hz / 2 -> "30 fps". If it misses that interval too, you wait for the following vblank, i.e. 50 ms (3/60 s), which means you're at "20 fps" as the frame has now been sitting on-screen for a 3rd interval.

So basically, the frame rates you see will be an integer divisor of the display's refresh rate, e.g. 60, 30, 20, 15, 12, etc., corresponding to the number of intervals the last frame has been sitting on screen.

With triple-buffering you add a second back buffer to the swap chain for the display to grab frames from, so at least one of those should have a completed frame by the time the display needs it, though you get into frame pacing issues for obvious (various? :p) reasons.

edit: Please excuse any brain farts in the explanation. :p
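
To put some numbers on the "integer divisor" point, here's a quick Python sketch (my own illustration, not from the article): under double-buffered v-sync a new frame can only be swapped in on a vblank, so the previous frame stays up for however many whole refresh intervals the render takes, and the apparent frame rate snaps to 60, 30, 20, 15 and so on.

Code:
import math

REFRESH_HZ = 60.0
REFRESH_PERIOD_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between vblanks

def effective_fps(render_time_ms):
    # Number of whole refresh intervals the old frame sits on screen
    # before the new one is ready at a vblank.
    intervals = max(1, math.ceil(render_time_ms / REFRESH_PERIOD_MS))
    return REFRESH_HZ / intervals

for t in (15.0, 20.0, 33.0, 34.0, 45.0, 55.0):
    print(f"{t:5.1f} ms render -> {effective_fps(t):.0f} fps")
# 15 ms -> 60 fps, 20-33 ms -> 30 fps, 34-45 ms -> 20 fps, 55 ms -> 15 fps

This is a simplification (fixed render time per frame, no triple buffering), but it's also why a game hovering in the high 20s would show up as a 20 fps lock rather than 28 or 29.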
 
Eh. 5400 rpm drive in both. I'm not seeing a reason for there to be such a difference between the two platforms if the HDD is to blame.

There was an additional bit of weirdness: the teardown showed that the PS4's HDD goes through a SATA-to-USB bridge, which wasn't mentioned for the Xbox One teardown.
There was also some talk about a bandwidth cap that was removed for developers pre-launch. It's unclear whether there was something similar on both consoles, or what became of that cap on the PS4.
 
Didn't most load-time comparisons and installation speed measurements in fact favor the PS4? Hmm, maybe not for actual in-game loading.
 
At least early on, there were notable cases where the PS4 did better, and the Xbox One did have some clunkiness, but I have not kept track of where things stand since then.
I was addressing one possible difference in the systems, even if the hard drives themselves seem to be generally comparable.
We also do not have a comparison for how the file systems are managed, so there might be a difference there as well. I can't remember which title on the PS4 had a noticeable improvement in performance if the user forced a rebuild of the console's database.
 
Didn't most load-time comparisons and installation speed measurements in fact favor the PS4? Hmm, maybe not for actual in-game loading.
The install times for PS4 are really, really short compared to the Xbox One. Maybe the PS4 is trading something there? Maybe it's only unpacking necessary stuff on the fly when needed, instead of fully unpacking all of the files in the install packages?
 
There can be some major differences just based on how full the drive is.
Outer edge at 120 MB/s
Inner edge at 60 MB/s

When a game is installed on a bare system it would be at the outer edge, but an installation on an almost full local drive would be half the speed... unless they cap it, I guess. Or spread it, or whatever. We know so little about the file system behaviour.
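
Back-of-the-envelope, assuming those outer/inner-edge figures and a purely sequential read (a hypothetical 500 MB chunk of level data; real streaming with seeks mixed in would look worse):

Code:
# Hypothetical streaming load: 500 MB of level data, sequential read only.
CHUNK_MB = 500

for label, mb_per_s in (("outer edge", 120), ("inner edge", 60)):
    print(f"{label}: {CHUNK_MB / mb_per_s:.1f} s for {CHUNK_MB} MB")
# outer edge: 4.2 s, inner edge: 8.3 s - same drive, same data, roughly
# double the read time just from where the data physically sits on the platter.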
 
The install times for PS4 are really, really short compared to the Xbox One. Maybe the PS4 is trading something there? Maybe it's only unpacking necessary stuff on the fly when needed, instead of fully unpacking all of the files in the install packages?
The PS4 did more directly emphasize the ability to play while background installation continued, past a certain point. It does seem to be the case that the Xbox One does not have the same thresholds, so I'm not sure which comparisons take that into account.

I think it was Assassin's Creed Unity where I first saw suggestions about rebuilding the database.
 
I'm wondering if the vsync on the PS4 could have something to do with async compute? If the frames are capped at 30 there is more compute time for the async stuff. At least if it's capped in such a way that only 30 frames per second are actually processed.
 
The install times for PS4 are really, really short compared to the Xbox One. Maybe the PS4 is trading something there? Maybe it's only unpacking necessary stuff on the fly when needed, instead of fully unpacking all of the files in the install packages?

Once the game is installed it is the full size on disk. Playing as you install is another matter and usually limits the gamer to a portion of the game.
 
While playing from a partial install, there'll be installing going on in the background, right? That'll affect HDD performance. A fair test would need both consoles to be played from zero games installed, save the one being tested which should be installed to completion. There's certainly the possibility that the OS or something else is reducing HDD performance in a machine. In fact the ideal test would be to grab the same model HDD from an XB1 and plug that into a PS4.

I'll be happy to perform these tests if anyone wants to fund me the hardware acquisition! :mrgreen:
 
If you're being that specific on HDDs, then the default drives should be out: SSD internally on the PS4 and an external one on the Xbox One.

Take the HDD out of the equation, or at least the spinning part. This is a supported hardware scenario.
 
While playing from a partial install, there'll be installing going on in the background, right? That'll affect HDD performance. A fair test would need both consoles to be played from zero games installed, save the one being tested which should be installed to completion. There's certainly the possibility that the OS or something else is reducing HDD performance in a machine. In fact the ideal test would be to grab the same model HDD from an XB1 and plug that into a PS4.

I'll be happy to perform these tests if anyone wants to fund me the hardware acquisition! :mrgreen:

This is actually what I'm getting at. I remember from a Mark Cerny interview that the "play as you download" system is also utilized for game installs from disc; the idea is exactly the same if you think of the Blu-ray disc as a file server. There are many examples of this on YouTube, but here's one for Call of Duty where the PS4 is massively faster at installing the game:

If I recall correctly the game may take less than an hour to fully install on PS4, as the install happens in the background during idle time. So this may be adversely affecting HDD performance elsewhere?

OK, here's the piece talking about background installs: http://kotaku.com/how-mandatory-game-installations-will-work-on-ps4-1462283797 . But since it shouldn't take more than an hour, it shouldn't be affecting the benchmarks, as I'm guessing they would have started the game several hours prior to benchmarking. Still, it may explain some irregular performance during the early hours of play.
 
If you're being that specific on HDDs, then the default drives should be out: SSD internally on the PS4 and an external one on the Xbox One.

Take the HDD out of the equation, or at least the spinning part. This is a supported hardware scenario.
That would tell you how well the systems handle SSDs, not HDDs. If seek time is a significant reason for the slowdown (the OS moving the head around a lot between game reads?), the SSD would eliminate that and you'd get an inaccurate result.
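
As a rough sketch of why that matters (the figures here are guesses: ~10 ms per seek on a 5400 rpm drive, ~100 MB/s sequential), compare reading the same 200 MB as one sequential run versus as scattered 1 MB reads with the OS bouncing the head around in between:

Code:
# Guessed figures for a 5400 rpm laptop drive; purely illustrative.
SEQ_MB_PER_S = 100.0   # sequential throughput
AVG_SEEK_S   = 0.010   # ~10 ms average seek + rotational latency
TOTAL_MB     = 200.0

transfer   = TOTAL_MB / SEQ_MB_PER_S                    # 2.0 s of pure transfer
sequential = transfer + AVG_SEEK_S                      # one seek, then stream
scattered  = transfer + (TOTAL_MB / 1.0) * AVG_SEEK_S   # a seek before every 1 MB read

print(f"sequential: {sequential:.1f} s, scattered: {scattered:.1f} s")
# ~2.0 s vs ~4.0 s. An SSD makes the seek term vanish, which is exactly why
# it would mask this kind of difference between the two consoles.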
 