Digital Foundry Article Technical Discussion Archive [2014]

Feel free to correct me.

EDIT: Actually let me correct myself first as I've just re-read my post and noticed a couple of basic errors.

1 - obviously triple buffering would eliminate tearing since it's a form of vsync, duh. Had a brain fart on that one.
2 - I did think triple buffering drops the framerate from 60 to 40 and then 20, as opposed to 60 then 30 then 15 with double buffering, but perhaps I'm wrong there. Thinking about it, I'm not even sure myself how that could work since the monitor's refresh rate can't change to 40Hz!
3 - Just to clarify what I said about the PC version: I mean that with vsync on, the framerate will cap at 60fps but vary greatly below that. Obviously with vsync off it will vary above 60fps as well. The thing is that with vsync on (double or triple, it doesn't matter), when the framerate falls below 60fps it doesn't go straight to 30fps, it varies just like we're seeing in the PS4 version, hence why I assumed an adaptive vsync is being used.

I hate to point this out again, but you are completely lacking any technical knowledge on the subject that you are talking about.

Double/triple buffering has nothing to do with the specific frame rate at which the game is rendering. They are meant to prevent tearing, by rendering a full frame in the back buffer and then flipping it to the front for display output. You don't need the monitor to refresh at 40Hz to render a game at 40fps: you just render some frames every 33.3ms and some every 16.67ms, adding up to 40 of them in a second. Again, that has nothing to do with double/triple buffering, and it's not what adaptive vsync is either.
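
To put numbers on it, here's a quick Python sketch (just an illustration; the 20/20 split below is an assumed mix, not anything measured):

Code:
# 40fps on a 60Hz vsynced display: a mix of frames held for one refresh
# (16.67ms) and frames held for two refreshes (33.33ms). Illustrative only.
REFRESH = 1.0 / 60                 # one refresh interval, in seconds

fast = 20                          # frames displayed for one refresh
slow = 20                          # frames displayed for two refreshes

total_time = fast * REFRESH + slow * 2 * REFRESH
total_frames = fast + slow
print(f"{total_frames} frames in {total_time:.2f}s = "
      f"{total_frames / total_time:.0f}fps average")
# -> 40 frames in 1.00s = 40fps average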
 
http://www.eurogamer.net/articles/digitalfoundry-2014-ps1-ps2-games-heading-to-ps4

A well-placed source working with Sony's streaming service reveals that only PlayStation 3 titles are currently scheduled to use the "gameplay over IP" cloud service. PS1 and PS2 titles are set to follow the more conventional route of running locally under emulation on Sony's latest console - but with the possibility of HD visual enhancements.

While Sony effectively phased out hardware emulation of previous platforms during the early stages of the PS3 era, it has worked consistently on opening up the PlayStation back catalogue to publishers via software emulation and PSN downloads. Internally, PlayStation 3's firmware contains emulators for PS1, PS2 and even the handheld PSP, while Vita is capable of running PS1 and Vita titles virtually flawlessly using the same technology. In short, there's a proven codebase that can only get better for the vastly more powerful PS4.

Our information suggests that the same internal emulator strategy is planned for PlayStation 4, and we understand that Sony is actively pursuing the ability for older titles to run without the blurry upscaling seen on PS3, suggesting that native HD resolutions are being targeted. Assuming this intention carries through to final code, we'll be seeing an effect similar to the resolution scaling seen on unofficial PC emulation of Sony's consoles, as well as a great many of the "HD remasters" we saw on PS3 - where original assets were rendered at a higher resolution, often without any actual remastering at all.
 
eurogamer said:
Internally, PlayStation 3's firmware contains emulators for PS1, PS2 and even the handheld PSP, while Vita is capable of running PS1 and Vita titles virtually flawlessly using the same technology. In short, there's a proven codebase that can only get better for the vastly more powerful PS4.

You don't say :D

FWIW, I think they meant that Vita emulates PSP titles, which, given how Vita plays all digital PSP titles, should be correct.
 
I hate to point this out again, but you are completely lacking any technical knowledge on the subject that you are talking about.

Double/triple buffering has nothing to do with the specific frame rate at which the game is rendering. They are meant to prevent tearing, by rendering a full frame in the back buffer and then flipping it to the front for display output.

You don't need the monitor to refresh at 40Hz to render a game at 40fps: you just render some frames every 33.3ms and some every 16.67ms, adding up to 40 of them in a second. Again, that has nothing to do with double/triple buffering, and it's not what adaptive vsync is either.

So if every frame you render takes 17ms to complete, with vsync off you'd run at something like 58fps, and with vsync on (double buffered) you'd run at 30fps, because you can't swap the front and back buffers until the next refresh cycle, and you can't render a new frame to the back buffer until it's been wiped - which would mean each frame gets displayed for two refresh cycles. That's how I've always understood it anyway.
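
That can be sanity-checked with a rough simulation (my own sketch, assuming a fixed 17ms render time and that swaps only happen on 60Hz vblanks):

Code:
import math

REFRESH_MS = 1000.0 / 60   # 16.67ms per refresh at 60Hz
RENDER_MS = 17.0           # assumed fixed render time per frame

t = 0.0                    # current time in ms
frames = 0
while t < 1000.0:
    done = t + RENDER_MS                            # frame finishes here
    t = math.ceil(done / REFRESH_MS) * REFRESH_MS   # swap waits for vblank
    frames += 1
print(frames, "frames in the first second")         # -> 30

Each 17ms frame just misses one vblank and catches the next, so every frame occupies two refresh cycles: 30fps.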

I see what you're saying about frame times varying under and over 16.67ms within a given second, producing an actual framerate somewhere in between, but surely such a variance in framerate every second is going to be rare, and thus the more common scenario is that you'll be rendering below 16.67ms for a whole second, or above it? Hence the more common jumping between 60fps and 30fps that we see with double buffered games.

And if the above is the case, then my understanding of adaptive vsync is simply that when you're rendering frames below 16.67ms, vsync is on and every buffer swap syncs to the refresh rate, but when you're rendering above 16.67ms, vsync is turned off and the swap occurs as soon as the frame is complete - thus resulting in tearing. Is that not correct? I suppose that means that even within a given second your actual framerate could be, say, 40fps, with vsync being on for part of that second for those frames that come in under 16.67ms, and vsync being off (and tearing) for the rest of that second for anything over 16.67ms.
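
The swap logic I'm describing would look something like this (my interpretation of adaptive vsync, not actual driver code - the function and parameter names are made up):

Code:
def present(frame_done_ms, next_vblank_ms):
    # Adaptive vsync sketch: sync the swap to vblank only if the frame
    # was ready in time, otherwise swap immediately and accept a tear.
    if frame_done_ms <= next_vblank_ms:
        return next_vblank_ms, False   # fast frame: wait for vblank, no tear
    return frame_done_ms, True         # slow frame: swap now, tearing

print(present(15.0, 16.67))   # (16.67, False) - synced
print(present(20.0, 16.67))   # (20.0, True)   - torn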

I admit my understanding of triple buffering was way off though. If I understand it correctly now, because the GPU can continue to render frames between the two back buffers as fast as possible, you're never wasting any rendering time, and thus you actually can display, say, 58 unique frames in a single second if every one of them renders in 17ms. At the same time each one would still be synced to a refresh cycle with vsync on, with say 56 frames being shown for exactly 16.67ms and 2 frames being shown for 33.33ms when neither of the back buffers is complete.
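
Quick sanity check of those numbers (same assumptions as the earlier sketch, with the display showing the newest completed frame at each vblank):

Code:
REFRESH_MS = 1000.0 / 60   # 60Hz refresh
RENDER_MS = 17.0           # steady render time; the GPU never stalls

shown = set()
for n in range(1, 61):                         # 60 vblanks in one second
    vblank = n * REFRESH_MS
    newest = int(vblank // RENDER_MS)          # latest frame finished by now
    if newest > 0:
        shown.add(newest)
print(len(shown), "unique frames displayed")   # -> 58

58 unique frames across 60 refreshes means a couple of vblanks have to repeat a frame - the 56 x 16.67ms + 2 x 33.33ms split, give or take exactly where the repeats land.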
 
But if the majority of your frames take longer to complete than 16ms (as I was taking the example to be) then you will spend the majority of your time locked at 30fps.
No, you're only [soft] locked at 30fps if every single frame takes more than 16.67ms but less than 33.33ms and you're sync locked. Vsync will prevent the game updating the screen while it's being scanned out (which is what generates tearing).

It's academic anyway, since if you can't get the lock you want out of vsync you just use something like Nvidia Inspector to lock it to whatever you want.
We're either at cross purposes or you're confused. Vsync isn't there to lock the framerate; vsync is there to eliminate tearing, by not updating the frame buffer, or switching to another frame buffer, while an existing frame buffer is being drawn to the screen. They are different things.

I'm not sure why triple buffering would prevent tearing, and the framerates in DF analyses don't support triple buffering anyway - i.e. they don't jump between 60, 40 and 20, they vary all over the place.
I don't understand why you think the framerate should jump between 60/40/20 fps but I think it's to do with your [mis]understanding of vsync and locked frame rates.
 
No, you're only [soft] locked at 30fps if every single frame takes more than 16.67ms but less than 33.33ms and you're sync locked. Vsync will prevent the game updating the screen while it's being scanned out (which is what generates tearing).

Isn't that what I just said? I.e. if most of your frames take longer than 16.67ms to render (but less than 33.33ms), then your GPU will be rendering - and displaying - only 30 frames per second the majority of the time, for exactly the reason you state: because vsync prevents the screen update, which prevents the GPU from rendering the next frame until the back buffer has been cleared (with double buffering)?

We're either at cross purposes or you're confused. Vsync isn't there to lock the framerate; vsync is there to eliminate tearing, by not updating the frame buffer, or switching to another frame buffer, while an existing frame buffer is being drawn to the screen. They are different things.

Yes, but it can have the effect of locking your framerate, which is what I was initially driving at. You can alter your game settings to change your frame render time so that every frame completes in less than 16.67ms, and thus you get a 60fps lock; or every frame (realistically not every, but the vast majority) completes between 16.67 and 33.33ms, in which case you get a locked - or mostly locked - 30fps with vsync (double buffered) engaged. This is exactly what I was doing in AC4 for a while, until I decided I'd rather live with the tearing and average frame rates around 40-55.

I don't understand why you think the framerate should jump between 60/40/20 fps but I think it's to do with your [mis]understanding of vsync and locked frame rates.

Yeah just ignore that, my understanding of triple buffering was completely off. Or rather, I'd read that somewhere and never really tried to work out the mechanics behind it.
 
Yes, but it can have the effect of locking your framerate, which is what I was initially driving at. You can alter your game settings to change your frame render time so that every frame completes in less than 16.67ms, and thus you get a 60fps lock; or every frame (realistically not every, but the vast majority) completes between 16.67 and 33.33ms, in which case you get a locked - or mostly locked - 30fps with vsync (double buffered) engaged. This is exactly what I was doing in AC4 for a while, until I decided I'd rather live with the tearing and average frame rates around 40-55.

No, you are still not getting it.

AC4 has a rendering bug where it caps the frame rate at 30 with vsync when it shouldn't be doing that. Vsync can produce a variable framerate; it does [should] not lock the frame rate when the code is written correctly. That's completely unrelated.
 
Vsync can produce a variable framerate...
An important clarification that may be affecting the discussion (not reading it fully) - framerate requires a sampling period, which no-one is specifying, so people may be at cross purposes. The time interval between any two frames with vsync enabled is either 1/60th of a second or 1/30th of a second. It's not possible to have a variable framerate between two frames. Within one second, there can be frames of 1/30th second duration and frames of 1/60th second duration, resulting in a mean framerate over that second of somewhere between 30fps and 60fps.

Without vsync, the time interval between two frames can be any amount (here, anything between 1/60th and 1/30th of a second), as the frame gets thrown to the screen as soon as it's ready, regardless of where the current frame is in output.
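
To put numbers on that (my own arithmetic, assuming a 60Hz display): any whole-second average between 30 and 60fps is just some mix of 1/60s and 1/30s intervals, never an interval in between.

Code:
def interval_mix(fps):
    # For an average of `fps` over one vsynced second on a 60Hz display,
    # return (# frames held for one refresh, # frames held for two).
    assert 30 <= fps <= 60
    return 2 * fps - 60, 60 - fps

for fps in (60, 45, 40, 30):
    short, long = interval_mix(fps)
    print(f"{fps}fps = {short} x 16.67ms + {long} x 33.33ms")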

If this clarification is of no use in understanding what everyone's talking about, I apologise for butting in. ;)
 
No, you are still not getting it.

Hmm, I'm pretty sure I do get it now. Let me make my example more extreme for the sake of clarification: if 100% of your frames are rendered between 16.7 and 33.2ms, what will your frame rate be with double buffered vsync enabled? I say it will be locked at 30fps, which is what I do in AC4 - for a lot of the time. The frame rate still varies from that, but not regularly. I.e. it will still jump up to 60 if I look at the floor and start spinning around, so there's no lock going on.
 
Don't disagree with your particular example, but it doesn't prove anything; the problem is tying vsync up with a framerate cap.
Let me pose this question to you then: if 100% of your frames are rendered at 35ms with vsync on, how many fps will you get?
 
So the benchmark they have in the game isn't really useful? When I did the benchmarks I saw a CPU usage of 10%.

Completely useless for CPU performance. You have to play the game and find some of the bad places; it's pretty common for review sites to use a tool like this and give you the false idea that the game is never CPU bound.

This game is 99% likely to be using TressFX 2.0, which is a lot lighter on the system.

TR also runs flat out on pretty much any modern quad core, with dual-core CPUs not far behind.

It's still a significant load for the GPU, I would think. The old TressFX could have a HUGE hit on performance in some scenes, when the camera was really close to the hair.

My Sandy Bridge i3 (dual core with HT at 3.1GHz) is far from acceptable in this game; it easily goes under 30fps in some places (with level of detail on ultra), while the built-in benchmark gives unrepresentative, inflated numbers.

Obviously the consoles are not running the same thing, so I don't know how relevant it would be, but I would like to see a framerate analysis of the shanty town big fight, and also a comparison of the objects on PC with level of detail on ultra versus high.
 
I'm gonna say you can only display a new frame every third refresh, so 20fps?

I calculated 28fps, with a new frame presented every other refresh except for 3 cases per second where it takes three refreshes before a new frame is presented. That's only calculated for the first second; it may alternate between 28 and 29 every other second.
 
I calculated 28fps, with a new frame presented every other refresh except for 3 cases per second where it takes three refreshes before a new frame is presented. That's only calculated for the first second; it may alternate between 28 and 29 every other second.

Wouldn't that only apply with triple buffering? With double buffering you can't start rendering a new frame until the back buffer's been switched, and since it can only be switched on a refresh cycle, and it takes two-and-a-bit refresh cycles to render each frame, you're effectively waiting 3 refresh cycles before you can start any new frame - hence 20fps.
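
Both cases can be checked with a quick simulation (my sketch, same assumptions as before: 35ms per frame, swaps only on 60Hz vblanks):

Code:
import math

REFRESH = 1000.0 / 60      # ms per refresh at 60Hz
RENDER = 35.0              # ms per frame

# Double buffered: rendering can't restart until the buffers swap.
t, double = 0.0, 0
while t < 1000.0:
    t = math.ceil((t + RENDER) / REFRESH) * REFRESH   # swap at next vblank
    double += 1

# Triple buffered: the GPU renders back to back; each finished frame is
# shown at the first vblank after it completes.
triple = sum(1 for k in range(1, 60)
             if math.ceil(k * RENDER / REFRESH) * REFRESH <= 1000.0)

print(double, "fps double buffered")   # -> 20
print(triple, "fps triple buffered")   # -> 28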
 
Wouldn't that only apply with triple buffering? With double buffering you can't start rendering a new frame until the back buffer's been switched, and since it can only be switched on a refresh cycle, and it takes two-and-a-bit refresh cycles to render each frame, you're effectively waiting 3 refresh cycles before you can start any new frame - hence 20fps.

Oops, forgot that caveat of your example.
 
Funny choice of dynamic res in the main cut-scenes on XO. Guess it's about TressFX taking up a larger % of the screen (amongst other effects).
 