How should devs handle ports between consoles? *spawn

I meant to add that also (by all accounts) Sony went to devs and asked what they wanted - so again I'm struggling to see why the CPU is already (apparently) limiting games.
Because it's a weak source. Sony may have asked devs what they wanted, but that doesn't mean the devs got everything they asked for. And at the time of the asking, Jaguar didn't even exist, so devs requesting x86 (x64) would probably have been thinking in terms of PC desktop parts rather than a laptop-class processor.
 
Here, Digital Foundry says PS4 Tomb Raider runs at 40-60 fps without any tearing.

http://www.eurogamer.net/articles/d...b-raider-definitive-edition-next-gen-face-off

Are you claiming that DF is wrong here?

A) Nowhere in the article could I Ctrl+F 'screen tearing/tearing/tear', so you're putting words in their mouths by saying they ran TR at 40-60 fps without tearing. Having said that, tearing is less frequent when the frame rate is below the refresh rate, but it still occurs, just less often than when the frame rate exceeds the refresh rate.

B) I mentioned you can remove screen tearing entirely if you enable v-sync, but that will result in judder and added latency in the controls. Please read the following:

However, on the PS4 we see frequent dips and spikes in the render time between frames, translating into on-screen judder and a variance in controller response that does feel a little odd during fast-moving action.

That said, when exploring more complex locations filled with heavier effects work, the more consistent frame-rate provided by Xbox One has some advantages - motion has less judder during fast camera pans and more hectic moments, while the controls feel more stable. This is most obvious when the PS4 hovers between the 40-45fps mark, but once we reach metrics closer to 60fps, the inconsistent frame-rate is no longer a problem.

And last but not least, DF mentions v-sync in the article itself. They make no mention of whether it's adaptive or full v-sync, but I know that if it's the latter they will never experience tearing, and if it's the former they will eventually catch footage of tearing occurring.
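A rough way to picture the difference, as I understand it (just a sketch in Python; the 60 Hz display and the frame times are my own assumptions for illustration):

# Sketch of how full vs adaptive v-sync treat a frame that misses the ~16.7 ms deadline.
# Assumes a 60 Hz display; the frame times below are invented for illustration.
REFRESH_MS = 1000.0 / 60  # one refresh interval at 60 Hz

def present(render_ms, adaptive):
    if render_ms <= REFRESH_MS:
        return "flipped on the next refresh: no tear, no judder"
    if adaptive:
        # Adaptive v-sync stops waiting when the frame is late, so the buffer swap
        # lands mid-scan-out and the screen shows parts of two frames (a tear).
        return "flipped immediately: visible tear line"
    # Full v-sync always waits for a refresh boundary, so the previous frame is
    # repeated for one more interval instead (judder plus extra latency, never a tear).
    return "held to the following refresh: previous frame repeated"

for render_ms in (14.0, 20.0):
    print(f"{render_ms} ms frame -> adaptive: {present(render_ms, True)} | full: {present(render_ms, False)}")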

You need to do some research on the topic yourself. V-sync and screen tearing have been around forever; it's why they're developing technologies to combat it: http://www.anandtech.com/show/7582/nvidia-gsync-review
 
A) Nowhere in the article could I Ctrl+F 'screen tearing/tearing/tear', so you're putting words in their mouths by saying they ran TR at 40-60 fps without tearing.
It's vsync locked. Search the article for "sync" and you'll find it.
 
Yeah, my assumption is that it is, though I've always assumed it's usually adaptive and seldom full v-sync.

You don't need to assume it; Digital Foundry confirm that the game is v-sync locked. Where games output torn frames, past Digital Foundry analyses have specifically highlighted the frequency and location. For example, look at their analyses of Soul Suspect and Outlast.

As someone who is tolerant of fluctuating frame rates but intolerant of torn frames, this is something I pay close attention to. I gave up on the original Assassin's Creed because it was a mess of torn frames on PlayStation 3.
 
Ah, ok, thanks DSoup! Yeah, I was unsure because I didn't really get into consoles until this gen, so I wasn't sure if DF had to be explicit about every detail. Makes sense though. In an action game you are going to run locked v-sync; in an FPS it's going to be adaptive to keep your controls and latency as tight as possible.
 
Consoles are locked to 30 and 60 fps, where no such limitation exists on PC. The game needs to ship with enough frame budget for the worst-case scenario. So with a 30 fps target, they really need the game to be running well above that most of the time, so that in the worst possible scenario it only drops to 30.

If the feature you've enabled puts you below the target, you've got to optimize to get back above it. So it's not that you can't just enable these features; it's that when you do, and there's no headroom left for them, you have to make space. And making space takes time and money.
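As a back-of-the-envelope sketch of what "making space" means (all the millisecond costs here are invented purely for illustration):

# Rough frame-budget check for a 30 fps target. Every cost below is a made-up example.
TARGET_FPS = 30
BUDGET_MS = 1000.0 / TARGET_FPS          # ~33.3 ms available per frame

worst_case_scene_ms = 31.0               # hypothetical cost of the heaviest scene in the game
extra_feature_ms = 4.0                   # hypothetical cost of the PC feature you want to "just turn on"

total_ms = worst_case_scene_ms + extra_feature_ms
if total_ms <= BUDGET_MS:
    print("Still inside the budget: the feature really is a free flip of a switch")
else:
    print(f"Over budget by {total_ms - BUDGET_MS:.1f} ms: something has to be optimized or cut first")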

Consoles are not locked this way. For example, when I played Second Son, I had the option to enable the 30 fps lock or to play with an unlocked frame rate. The latter provided, for me, a better experience.

What you're suggesting is ridiculous: target the lowest common denominator (Xbox One), then port to PS4; the PS4 version will run at a higher frame rate, but you lock it down to create parity with MS's console. Is that it? Are you asking devs not to make optimizations for one of the consoles? They have always done so in every previous generation. That's the marvel of console gaming.
 
Is that it? Are you asking devs not to make optimizations for one of the consoles? They have always done so in every previous generation. That's the marvel of console gaming.

Well ... no. At least, not always, and not always remotely as much as they could.
 
Consoles are not locked this way. For example, when I played Second Son, I had the option to enable the 30 fps lock or to play with an unlocked frame rate. The latter provided, for me, a better experience.



What you're suggesting is ridiculous: target the lowest common denominator (Xbox One), then port to PS4; the PS4 version will run at a higher frame rate, but you lock it down to create parity with MS's console. Is that it? Are you asking devs not to make optimizations for one of the consoles? They have always done so in every previous generation. That's the marvel of console gaming.


Developers are welcome to ship their game however they want. If you are happy with the judder and latency that come with v-sync, that's fine; some developers may not be. The fact is that a true locked 30 and a locked 60 are the ideal frame rates to ensure judder-free and tear-free gameplay.

I have no clue what you are accusing me of. I'm answering the question that was asked: why can't you just flip on a bunch of features, since the PS4 has more power? Because it's not always trivial to turn them on. If you turn a feature on and the worst-case scenario in the game (say, 100 NPCs all firing off their effects while buildings explode) makes the frame rate dip well below the threshold, because you wanted some extra AA that is normally fine in a simple scene, then you have to do something about that complex scene.

This has nothing to do with the other platform.
 
The display's refresh rate has to be a multiple of 60 Hz, so 120 and 240 would also be fine. Unless you have Nvidia G-Sync.

It comes down to your display's refresh rate: if a new frame is swapped in while the monitor is mid-refresh, the scan-out ends up showing parts of two frames, and that seam is the screen tear.

It's a problem that has really only come with flat panels.
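A quick sketch of why the frame rate needs to divide evenly into the refresh rate (assuming a 60 Hz panel; this is just my own illustration):

# With v-sync, each game frame is shown for a whole number of ~16.7 ms refreshes.
# Frame rates that don't divide 60 evenly end up alternating hold times (judder),
# or tearing if v-sync is off. Assumes a 60 Hz display.
import math

REFRESH_HZ = 60
for fps in (60, 30, 40):
    refreshes_per_frame = REFRESH_HZ / fps
    if refreshes_per_frame.is_integer():
        print(f"{fps} fps: every frame held for exactly {int(refreshes_per_frame)} refresh(es) -> even pacing")
    else:
        lo, hi = math.floor(refreshes_per_frame), math.ceil(refreshes_per_frame)
        print(f"{fps} fps: frames alternate between {lo} and {hi} refreshes -> judder, or tearing without v-sync")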

I remember PS2 God of War tearing quite a bit on my old JVC CRT.
 
A) Nowhere in the article could I Ctrl+F 'screen tearing/tearing/tear', so you're putting words in their mouths by saying they ran TR at 40-60 fps without tearing. Having said that, tearing is less frequent when the frame rate is below the refresh rate, but it still occurs, just less often than when the frame rate exceeds the refresh rate.

B) I mentioned you can remove screen tearing entirely if you enable v-sync, but that will result in judder and added latency in the controls. Please read the following:


And last but not least, DF mentions v-sync in the article itself. They make no mention of whether it's adaptive or full v-sync, but I know that if it's the latter they will never experience tearing, and if it's the former they will eventually catch footage of tearing occurring.

You need to do some research on the topic yourself. V-sync and screen tearing have been around forever; it's why they're developing technologies to combat it: http://www.anandtech.com/show/7582/nvidia-gsync-review


"latency in controls" :)
You seem to have picked up a lot of technical terms without learning their actual meaning. If you miss your v-sync target you will keep the previous frame up for another 16.7 ms. That is the judder and the extra input lag.

G-sync can only help with this judder and maybe reduce latency. It can't combat V-sync :)
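To put rough numbers on it (60 Hz assumed; the render times are just example values):

# Worked example of the missed v-sync penalty at 60 Hz with full v-sync.
# A late frame can only be flipped on the next refresh boundary, so the old frame
# stays on screen for a whole extra interval. Render times here are just examples.
import math

REFRESH_MS = 1000.0 / 60   # ~16.7 ms

for render_ms in (15.0, 17.0, 25.0):
    shown_after_ms = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
    wait_ms = shown_after_ms - render_ms
    print(f"render {render_ms:.1f} ms -> on screen after {shown_after_ms:.1f} ms ({wait_ms:.1f} ms waiting for the refresh)")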
 
Ah, ok, thanks DSoup! Yeah, I was unsure because I didn't really get into consoles until this gen, so I wasn't sure if DF had to be explicit about every detail. Makes sense though. In an action game you are going to run locked v-sync; in an FPS it's going to be adaptive to keep your controls and latency as tight as possible.

What exactly is an "action game" here?
 
"latency in controls" :)
You seem to have picked up a lot of technical terms without learning their actual meaning. If you miss your v-sync target you will keep the previous frame up for another 16.7 ms. That is the judder and the extra input lag.

G-sync can only help with this judder and maybe reduce latency. It can't combat V-sync :)

Wow Tuna I must have kicked your dog or something because you are seriously out to ride me.

CPU frames update every 16 ms in a 60 fps game. If I suddenly drop to 40, I'm visually seeing fewer of my inputs reflected in the game than before. If you don't interpret that as increasing the latency of your controls, I don't know how else to explain it. If your frame rate drops to 12 fps, it becomes nearly unplayable because your controls and what you see are so far apart that your mind has difficulty linking the two together. You could very well argue that your controls are actually way ahead of your screen, and I imagine you plan to. I would still call that completely crappy controls; one might even call it laggy, although strictly speaking 'laggy controls' is more synonymous with long display processing, but whatever. The point is, if you tap left and you're used to seeing the character move left, and the next time you tap left nothing appears to move, then by the end of that latency you don't know how much you undershot or overshot your input, because the screen hasn't refreshed as often.
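To put the frame-time side of that into numbers (just the straightforward arithmetic for the frame rates mentioned above):

# Frame interval at the frame rates mentioned above: the gap between consecutive
# images the player gets as visual feedback for their inputs.
for fps in (60, 40, 12):
    print(f"{fps} fps -> a new frame every {1000.0 / fps:.1f} ms")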

I call it combating v-sync because v-sync has been a crappy solution. Nvidia G-Sync is doing what v-sync is incapable of doing, which is keeping the video card and the monitor in sync without being detrimental to the experience.

It is a discussion; I'm not here to get anyone. You've known v-sync has been around for ages, so just move forward with advancing the discussion; you haven't added anything by riding my back about it. He wanted an honest answer as to why the PS4 can't just flip on features that are already in the PC version. Because it's not trivial to. If it were, they'd be on. If they're not on, it was never trivial to begin with, not because of some magical parity clause that MS has been paying for repeatedly on every single multiplatform title coming out from now until next generation.

If you really want to nitpick my words, I seriously give up. The point is, I've known for some time that you have fully understood what I've been trying to say all along. You've been posting here forever; I've seen your posts come up in the most obscure and ancient technical threads. But you just kept leaving one-liners in this thread when it could have been an easy 'here, I fixed this for you, you meant this when you wrote that' and the discussion would have just moved forward.
 
What exactly is an "action game" here?

Action-adventure, i.e. Tomb Raider and Watch Dogs, rather than a twitch-based game like COD or Titanfall.


edit:.. Annnnddd you're still doing it.

edit2: You know what, never mind Tuna, there's no point in saying any more. I'll take my licks. I won't respond unless it's super friggen clear and concise; otherwise I'm just adding noise and causing confusion or derailment.
 
I get a kick out of people who are wrong on the internet. Don't take it too personally.


Lol, I won't. But thank you for this response. I'll take it as a lesson learned: don't press send until it's bulletproof.
 
CPU frames update every 16 ms in a 60 fps game. If I suddenly drop to 40, I'm visually seeing fewer of my inputs reflected in the game than before. If you don't interpret that as increasing the latency of your controls, I don't know how else to explain it. If your frame rate drops to 12 fps, it becomes nearly unplayable because your controls and what you see are so far apart that your mind has difficulty linking the two together.

So at an average 40 fps we have an average latency of 75 ms (assuming 3 frames of input lag). That is about 25 ms worse than the perfect 60 fps input lag.

If we assume a display lag of 33 ms on top of that, we have 108 vs 83 ms. But I think the judder and loss of motion resolution will be a larger problem...
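Roughly how those numbers come out (the 3-frame pipeline and the 33 ms display figure are the assumptions stated above, not measurements):

# Input lag under a simple "3 frames in flight" model plus a fixed display lag.
# Both figures are the assumptions stated above, not measurements.
FRAMES_IN_FLIGHT = 3
DISPLAY_LAG_MS = 33.0

for fps in (60, 40):
    frame_ms = 1000.0 / fps
    pipeline_ms = FRAMES_IN_FLIGHT * frame_ms
    print(f"{fps} fps: {pipeline_ms:.0f} ms of pipeline lag, {pipeline_ms + DISPLAY_LAG_MS:.0f} ms including the display")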

BTW, maybe you could calculate how much input lag we would get at 12 fps?
 
I call it combating v-sync because v-sync has been a crappy solution. Nvidia G-Sync is doing what v-sync is incapable of doing, which is keeping the video card and the monitor in sync without being detrimental to the experience.

My opinion is that tearing is preferable to the added lag and judder that come from v-sync at high frame rates. But with today's game engines I am not really sure you can do half-frames unless you render into tiles or something. Or maybe I am wrong?
 
If you really want to nitpick my words, I seriously give up. The point is, I've known for some time that you have fully understood what I've been trying to say all along. You've been posting here forever; I've seen your posts come up in the most obscure and ancient technical threads. But you just kept leaving one-liners in this thread when it could have been an easy 'here, I fixed this for you, you meant this when you wrote that' and the discussion would have just moved forward.

If you are not sure, you can just say that. You can just write "Is this correct?" And nobody can know that you mean one thing when you write something else. We should all try to be as correct and precise as possible in our wording, but there are some bad influences on this forum (see the "arcade racers" naming discussion :) )
 
If you are not sure, you can just say that. You can just write "Is this correct?" And nobody can know that you mean one thing when you write something else. We should all try to be as correct and precise as possible in our wording, but there are some bad influences on this forum (see the "arcade racers" naming discussion :) )


Yes, this was bad etiquette on my part. In response to the last three replies: firstly, I also prefer screen tearing to judder, depending on the type of game, but not always. In the end I guess it comes down to whether the game demands instant reaction or a focus on image quality.

I'm not sure you can render in tiles to combat screen tearing, because the engine would need to know precisely where the tear would occur, if that's what you are referring to. Truthfully, as you mentioned earlier, the near-removal of latency is about the best solution G-Sync can offer: not perfect, but the best without impacting performance on the video card.

As for the last reply: I'm not sure I fully understand what we are attempting to calculate. 40 fps works out to 25 ms per frame, and the monitor refreshes every ~16.7 ms, so wouldn't that mean roughly 8 ms of waiting on one frame, a perfect landing on the following frame, and then back to an ~8 ms wait on the next? Something similar would presumably occur at 12 fps, just starting from an 83 ms frame time?
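Here's roughly how I'm picturing the 40 fps case (my own sketch, assuming a 60 Hz display and perfectly even 25 ms frames):

# My rough attempt at the arithmetic: frames are ready every 25 ms (40 fps) but can only
# be shown on ~16.7 ms refresh boundaries. Assumes a 60 Hz display and perfectly even frames.
import math

REFRESH_MS = 1000.0 / 60
FRAME_MS = 25.0            # 40 fps

for i in range(1, 5):
    ready_ms = i * FRAME_MS
    shown_ms = math.ceil(ready_ms / REFRESH_MS) * REFRESH_MS
    print(f"frame {i}: ready at {ready_ms:.1f} ms, shown at {shown_ms:.1f} ms, waited {shown_ms - ready_ms:.1f} ms")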
 