Digital Foundry Article Technical Discussion Archive [2014]

You are confusing vsync with an fps cap.
Just because you can toggle vsync on/off on PC does not mean the engine can automagically cap the frame rate at an arbitrary limit; they are completely different things.

I know how vsync works. My point is that if you have a frame rate that varies between 33 and 60fps (as in the example given), then with vsync on your frame rate is going to lock at 30fps most of the time. Granted, it will occasionally jump to 60 when your frame rate strays that high, but aside from those occasions you've effectively eliminated the constant frame rate variance. If you find yourself straying up to 60fps too often, increase your graphics settings a little to keep yourself within the preferred frame rate window.

Of course, you can also limit frame rates to an arbitrary level using third-party apps (a minimal sketch of the idea follows below).
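For anyone wondering how an arbitrary cap differs from vsync in practice, here is a minimal frame-limiter sketch in Python. This is hypothetical illustration only: render_frame and cap_fps are placeholder names, and real third-party limiters work differently under the hood.

[code]
import time

def run_with_frame_cap(render_frame, cap_fps=33):
    """Hypothetical frame limiter: never present faster than cap_fps.
    Unlike vsync, the cap can be any arbitrary value and is independent
    of the display's refresh rate."""
    frame_budget = 1.0 / cap_fps                  # ~30.3 ms per frame at 33fps
    while True:
        start = time.perf_counter()
        render_frame()                            # game update + draw
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)    # idle out the rest of the budget
[/code]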
 
Are the scenes with low fps particularly demanding on the CPU? That seems to be the weak point of the PS4.

Probably TressFX or CPU related; these were the main problems with the PC version when using weak CPUs. Maps like Shanty Town (on Ultra settings only; High had a much lower CPU load because the level-of-detail setting goes down) were really bad for slow CPUs.
 
DF doesn't seem to have bothered with measuring tearing in this face-off.

With regard to frame rate, going back on v-sync, or to the soft v-sync we often had last gen, is not progress in my view. The v-sync approach actually has merit, as it has even made it into the PC realm.

IMO, if it comes down to showing the difference in power between the two systems, I would have hoped it would have materialized in another fashion.
Doesn't red indicate tearing on their framerate display? The X1 version had some, but it might have been out of the screen area. While they didn't report any tearing numbers, I think their analysis tool always measures it.
 
Not one review mentions a juddery framerate on the PS4 version... quite the opposite, actually. And DF's analysis doesn't show any tearing. Again, I think the min/max is a bit misleading. If you actually watch the video, the range is more like 10fps within each section of the game, and it rarely dips below 40. It's not like it constantly fluctuates between 33 and 60.
DF doesn't seem to have bothered with measuring tearing in this face-off.

With regard to frame rate, going back on v-sync, or to the soft v-sync we often had last gen, is not progress in my view. The v-sync approach actually has merit, as it has even made it into the PC realm. Anything that is not v-synced and in line with the monitor's refresh rate is by nature more or less juddery; that is the whole point of v-sync and of newer approaches like G-Sync and AMD FreeSync.

IMO, if it comes down to showing the difference in power between the two systems, I would have hoped it would have materialized in another fashion.
 
Doesn't red indicate tearing on their framerate display? The X1 version had some, but it might have been out of the screen area. While they didn't report any tearing numbers, I think their analysis tool always measures it.

Yes, red indicates screen tearing. They must have forgotten to report it. Here are a few excerpts from the DF performance video:

[Excerpts from the DF performance video showing screen tearing: Tomb_Raider_PS4_X1_Perf_screen_tear_1_cropped.jpg, Tomb_Raider_PS4_X1_Perf_screen_tear_3_cropped.jpg, Tomb_Raider_PS4_X1_Perf_screen_tear_4_cropped.jpg, Tomb_Raider_PS4_X1_Perf_screen_tear_5_cropped.jpg]

Notice the roughly 20fps framerate gap between the two versions when the X1 dips under 30fps.
 
Doesn't red indicate tearing on their framerate display? The X1 version had some, but it might have been out of the screen area. While they didn't report any tearing numbers, I think their analysis tool always measures it.
They have a note about their "consistency graphs" not being ready; not sure what those graphs are.

Actually, I have not followed their work in quite a while, as I have no interest in the results, though I would hope they have ramped up their game. The tool Nvidia provides did a great job of exposing the shortcomings in how frames actually reach the display. It gives me a wry smile when I read people speaking of the smoothness of an extra couple of fps; it may affect input lag, but it does not qualify as smooth.
 
Probably TressFX or CPU related; these were the main problems with the PC version when using weak CPUs. Maps like Shanty Town (on Ultra settings only; High had a much lower CPU load because the level-of-detail setting goes down) were really bad for slow CPUs.

I forget, but does TressFX have different detail levels? If so, I wonder if they could set those on the fly and downgrade the hair simulation when loads get high. It would lead to hair pop, but maybe then they could hold the frame rate better (rough sketch below).
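Nothing confirmed from Crystal Dynamics, but the idea described above could look roughly like this sketch; all names, levels, and thresholds here are made up for illustration.

[code]
def choose_hair_quality(recent_frame_times_ms, current_level, budget_ms=33.3, max_level=2):
    """Hypothetical on-the-fly LOD control: drop the hair simulation level when
    recent frames blow the frame-time budget, and raise it again once there is
    comfortable headroom (the switch back up is where the 'hair pop' would come from)."""
    avg = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    if avg > budget_ms and current_level > 0:
        return current_level - 1          # over budget: cheaper hair next frame
    if avg < 0.8 * budget_ms and current_level < max_level:
        return current_level + 1          # plenty of headroom: restore detail
    return current_level
[/code]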
 
If we try to compare the two versions fairly (like they said in the article), I think we have to analyze the framerate gap only when neither is locked at 30fps or capped at 60fps.

In this case there is roughly a 20fps difference on average between the two. When the X1 runs at 28fps (quite often, in fact, with screen tearing), the PS4 is at 48fps.

28fps * 1.7 ≈ 48fps, again suggesting from framerate alone that the PS4 performs about 70% better than the X1, on average of course. There was the same ~70% difference with BF4 (resolution plus framerate differences), so no big surprises here.

What is surprising is the double-buffered framebuffer on the X1 version (hence the screen tearing), because the PS4 version appears to use a triple buffer (zero screen tearing).

Well, the back buffer is going to be stored in the ESRAM, and at 1080p a triple buffer would use a fair bit of space, possibly doubling the ESRAM used. That leaves less room in the ESRAM for other heavily reused data, which would mean the ESRAM does less to alleviate bandwidth demands on the main DDR3 memory (rough numbers below).

Only being able to afford a double buffer is probably why they chose to lock the framerate at 30 on Xbox One while leaving it unlocked on PS4.
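For a rough sense of the numbers behind that argument, here is the back-of-the-envelope arithmetic. These are my own assumptions: plain 1920x1080 RGBA8 colour targets with every target in the swap chain resident in ESRAM, which is not necessarily how the game actually allocates it.

[code]
# Rough buffer sizes versus the Xbox One's 32 MB of ESRAM (assumed layout).
width, height, bytes_per_pixel = 1920, 1080, 4
target_mb = width * height * bytes_per_pixel / (1024 ** 2)

print(f"one 1080p colour target: {target_mb:.1f} MB")     # ~7.9 MB
print(f"double buffered:         {2 * target_mb:.1f} MB")  # ~15.8 MB
print(f"triple buffered:         {3 * target_mb:.1f} MB")  # ~23.7 MB of 32 MB
[/code]

On those assumptions a third buffer costs roughly another 8 MB, which is the "less room for heavily reused data" trade-off described above.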
 
I know how vsync works. My point is that if you have a frame rate that varies between 33 and 60fps (as in the example given), then with vsync on your frame rate is going to lock at 30fps most of the time.
Perhaps I misunderstand but I don't believe this to be the case.

In any given second of play, assuming the hardware is capable of outputting 60fps, it only takes around five frames missing the 16.7ms window (while still completing within 33.3ms) for the displayed frame rate to drop to 55fps, and around twenty such frames for it to drop to 40fps.

The DF article demonstrates this, given that the average framerate on PS4 is in the 50s. Personally, I'd prefer to see the median reported, as it's less influenced by a few extreme highs or lows.
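To put rough numbers on that (my own arithmetic, not DF's methodology): with vsync at 60Hz, a frame that misses the 16.7ms window is held for a second refresh, so each such miss costs one displayed frame in that second.

[code]
def vsynced_fps(missed_frames, refresh_hz=60):
    """Displayed frames per second when `missed_frames` of them each take two
    refresh intervals (i.e. finish in 16.7-33.3 ms) and the rest fit in one.
    Illustrative helper only."""
    return refresh_hz - missed_frames

print(vsynced_fps(5))     # 55 fps
print(vsynced_fps(20))    # 40 fps
[/code]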
 
Probably TressFX or CPU related; these were the main problems with the PC version when using weak CPUs. Maps like Shanty Town (on Ultra settings only; High had a much lower CPU load because the level-of-detail setting goes down) were really bad for slow CPUs.

So the benchmark they have in the game isn't really useful? When I ran the benchmarks I saw a CPU usage of 10%.
 
Probably TressFX or CPU related; these were the main problems with the PC version when using weak CPUs. Maps like Shanty Town (on Ultra settings only; High had a much lower CPU load because the level-of-detail setting goes down) were really bad for slow CPUs.

This game is 99% likely to be using TressFX 2.0, which is a lot lighter on the system.

TR also runs flat out on pretty much any modern quad core, with dual-core CPUs not far behind.
 
Why would TressFX tax the CPU? Isn't that AMD technology that runs almost exclusively on the GPU side?
And what's the point of looking at CPU usage on PC, when we know that a lot of the PC's CPU usage is due to DirectX not being able to benefit from multi-threading? I would sooner expect that the game does some physics/vertex pre-processing on the CPU side on PC that is not so much related to TressFX.

Or have there been benchmarks of TressFX performance suggesting it is heavily CPU-bound?
 
Why would TressFX tax the CPU? Isn't that AMD technology that runs almost exclusively on the GPU side?
And what's the point of looking at CPU usage on PC, when we know that a lot of the PC's CPU usage is due to DirectX not being able to benefit from multi-threading? I would sooner expect that the game does some physics/vertex pre-processing on the CPU side on PC that is not so much related to TressFX.

Or have there been benchmarks of TressFX performance suggesting it is heavily CPU-bound?

The built-in benchmark didn't really show heavy use of the CPU when I enabled TressFX, aka weird hair. :)

At most it was 1-2%, but since the PC runs all kinds of shit in the background I wouldn't bet a sandwich on it. I saw around 10% CPU usage during normal benchmarking with a few peaks to 12-14%, and with TressFX enabled I saw 11-12% with peaks into 14%. Note: totally based on Task Manager in Windows 7, on an old i7 920 overclocked to 3.2GHz for the glory of Nehalem.
 
Perhaps I misunderstand but I don't believe this to be the case.

In any given second of play, assuming the hardware is capable of outputting 60fps, it only takes around five frames missing the 16.7ms window (while still completing within 33.3ms) for the displayed frame rate to drop to 55fps, and around twenty such frames for it to drop to 40fps.

The DF article demonstrates this, given that the average framerate on PS4 is in the 50s. Personally, I'd prefer to see the median reported, as it's less influenced by a few extreme highs or lows.

But if the majority of your frames take longer than 16.7ms to complete (as I was taking the example to mean), then you will spend the majority of your time locked at 30fps.

It's academic anyway, since if you can't get the lock you want out of vsync, you can just use something like Nvidia Inspector to lock it to whatever you want. That's why I don't like all this talk of variable framerates = PC gaming. It's basically the exact opposite. I can lock the framerate in Tomb Raider to literally anything I want (within the performance limits of my machine). Say at my chosen settings the lowest my framerate ever falls is 33fps and the highest it reaches is 85fps; I can just lock the framerate to 33fps and maintain that rate 100% of the time, if I wish.

In TR, for example, I can get a solid lock (well, 99% of the time anyway) at vsynced 60fps by selecting "High" graphics settings plus TressFX. If you desperately want a locked 60fps, that's an option. Personally I don't mind the variance between 35 and 60, so I play at max settings and live with that.

Incidentally, it appears that on PC the game automatically implements adaptive vsync, so if your frame rate does fall below the vsync cut-off it becomes variable, just like on PS4 (the general idea is sketched below).
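If it really is adaptive vsync, the per-frame decision boils down to something like this. This is an assumption about how the technique works in general, not something taken from the game's code.

[code]
def present_mode(frame_time_ms, refresh_hz=60.0):
    """Adaptive vsync in a nutshell (assumed behaviour): wait for vblank when the
    frame fits inside a refresh interval, present immediately when it does not,
    instead of stalling down to the next divisor of the refresh rate."""
    refresh_ms = 1000.0 / refresh_hz
    if frame_time_ms <= refresh_ms:
        return "wait for vblank (synced, no tearing)"
    return "present immediately (variable rate, possible tearing)"
[/code]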
 
What the hell's going on here? A blurrier post-AA implementation on the Xbone? Also, details like the rivets on the plane's fin are absent on the Xbone (TR comparison):

http://i.imgur.com/cQCWacC.gif

I wonder if that comes from increased compression or if they somehow messed up the capture for the Xbox version. Though I guess, if you think about it, if they pushed each console's memory system to the max, the PS4 should in theory have more memory bandwidth available for textures than the Xbox One in some circumstances.
 