The Game Technology discussion thread *Read first post before posting*

V-sync is nothing special? :???: What are you talking about? It cuts the fps in half, as far as I knew. So why not use it, if it isn't so special? :???: And RE5, yes, has a large amount of tearing too at 720p in some moments.
Hum... Looks like you're right
digital foundry said:
Capcom has taken two entirely different approaches to the frame rate on both platforms with Resident Evil 5. Contrary to early interviews with Capcom, the game isn't running at 60fps, instead operating at the more usual 30fps, albeit with some frame-blending in effect to make it look perceptually smoother. Overall, this effect works well.
The major difference is that Xbox 360 runs with v-lock disengaged, while the PlayStation 3 code has absolutely no tearing whatsoever. However, similar to Grand Theft Auto IV - which operates in the same way - the Xbox 360 version has a tangible advantage here on two fronts. Firstly, it drops far fewer frames than the PS3 code, and secondly, the response from the controls is significantly crisper, particularly when the environments are chockfull of opponents. And again, similar to GTAIV, while the tearing is there, it's pretty much unnoticeable in gameplay (cut-scenes are another matter).
To give some idea of the difference in performance, here's a frame rate analysis video comparing two scenes from the game from the first and second chapters respectively, followed by a more lengthy QTE section (essentially a cut-scene with the occasional button-press). The numbers at the top are fairly self-explanatory, as are the graphs - both being averages of a set number of frames. Where you see a green vertical line, that's an indication of a torn frame on the 360 version. A blue line would indicate the same thing on the PS3 build, but you won't see any of those here due to the v-lock.
Basically v-sync is nothing special: instead of displaying a torn frame you drop a frame (which is what happens in RE5). Why didn't Capcom implement it on the 360? It's a bit of a mystery if you ask me.
They have the 360 doing more, as described in this article, with an overall better frame rate at the cost of tearing.
It's not the only case of weird choices (see Bionic Commando). Developers are in search of parity, but at the same time they make pretty significant decisions in the way they implement the game on each platform. I wonder if it could be an MS/Sony issue, say at validation: MS asks for some things, Sony for others, etc.
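To picture the trade-off, here's a toy model with invented frame times (a sketch only, not how Capcom's engine actually schedules anything): with v-sync off a late frame flips mid-scanout and tears somewhere down the screen, while with double-buffered v-sync the flip waits for the next refresh and the old image is simply held longer.

```python
import math

# Toy model: the same made-up frame times, either flipped as soon as they
# finish (v-sync off, torn frames) or held to the next refresh (double-buffered
# v-sync, repeated frames). All numbers are hypothetical.
REFRESH_MS = 1000.0 / 60.0                        # one 60 Hz refresh period

frame_times_ms = [15.0, 18.0, 16.0, 21.0, 15.5]   # hypothetical GPU times

def no_vsync(frame_times):
    """Flip immediately; report roughly where the tear line lands on screen."""
    t, out = 0.0, []
    for ms in frame_times:
        t += ms
        out.append(f"tear ~{(t % REFRESH_MS) / REFRESH_MS:.0%} down the screen")
    return out

def double_buffer_vsync(frame_times):
    """Flip only on a refresh boundary; the GPU stalls until the flip happens."""
    t, prev_flip, out = 0.0, 0.0, []
    for ms in frame_times:
        t += ms                                        # frame finished rendering
        flip = math.ceil(t / REFRESH_MS) * REFRESH_MS  # next refresh boundary
        out.append(f"image held {round((flip - prev_flip) / REFRESH_MS)} refresh(es)")
        prev_flip = flip
        t = flip                                       # wait for the freed buffer
    return out

print("v-sync off:", no_vsync(frame_times_ms))
print("v-sync on :", double_buffer_vsync(frame_times_ms))
```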
 

I don't think it's a mystery. Capcom and other developers simply chose the better approach on each platform (50 Cent, GRIN's games, even Wolfenstein and Brothers in Arms are further examples of triple buffering on the PS3 vs double buffering on the 360). If triple buffering were so easy on the 360 we would have seen the 360 fully v-synced even where the PS3 isn't, not the reverse. I don't find it right to say the PS3 had no choice while the 360 could have done it, but the developers were so smart they preferred the tearing :???:
I mean, huh?! Who said that?
 
It's kind of the same for AF: it's not treated as an important checkpoint to hit.

For RE5 it looks like v-sync has a consistent impact on the PS3 version in terms of fps and input lag; maybe Capcom would have had to introduce triple buffering, and Joker has a point (we will most likely never know for sure). On the 360 they may have been happy with the overall performance and look.
Clearly I see where you are coming from, but triple buffering is nothing new, nor is there anything in the 360 that would prevent its use. Overall it consumes more RAM, and the 360 is not lacking in that regard.
 
I don't think it's a mystery.

I think we are going in circles. True, it's not a mystery, the answer (as I've also been saying) is right in that Digital Foundry quote:

"...the Xbox 360 version has a tangible advantage here on two fronts. Firstly, it drops far fewer frames than the PS3 code, and secondly, the response from the controls is significantly crisper, particularly when the environments are chockfull of opponents. And again, similar to GTAIV, while the tearing is there, it's pretty much unnoticeable in gameplay (cut-scenes are another matter)."

The first thing I noticed the instant that I played both versions of RE5 is that the PS3 version ran and responded worse. It was patently obvious. The 360 version on the other hand ran smooth, responded fast, and I didn't even notice any tearing while playing. What I really don't get is why you are using PS3 RE5 as the example when the PS3 version was clearly worse. They have the 360 version running where tears aren't really noticeable and response is good, why in heck would they want to ruin that and make it perform bad like the PS3 version by introducing missed frames with vsync, or introducing lag with triple buffer? That would be madness. Clearly they didn't think it was a good idea either, which is fortunate since at least there remains one version of RE5 that actually plays and responds well.

Tearing is nowhere near as noticeable as you guys are making it out to be (especially if it happens near the top of the screen), whereas frame rate drops and input response are. Look at Dirt 2 for example, using tear measurements taken right from this forum:

360 AVG(29.942) Tearing(1.729%) Low(29.0)
PS3 AVG(29.833) Tearing(20.261%) Low(28.0)

The measurements show significant amounts of tearing on the PS3 version. But then look at the Dirt 2 thread here and notice people posting that they didn't even notice it. And that's with ~20% tearing! Now, slap on vsync-induced frame rate stutters or some input lag, and do you really think typical people would prefer that?

If I had the choice I would have also gone the Capcom route on the 360 version of RE5. It makes the game better, because the tearing is not that big of a deal whereas the nicer frame rate and control response are very welcome.
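For reference, stats like those come from frame-by-frame analysis of captured video. Here's a rough sketch of the kind of reduction involved, with an invented capture format and made-up data (note that whether tearing is quoted per rendered frame or per refresh varies between analyses); this is not the actual tool behind the Dirt 2 numbers.

```python
# One record per captured 60 Hz refresh: (shows_a_new_frame, new_frame_is_torn)
# The data below is fabricated purely to exercise the arithmetic.
capture = [(True, False), (False, False), (True, True), (True, False)] * 450

new_frames  = sum(1 for new, _ in capture if new)
torn_frames = sum(1 for new, torn in capture if new and torn)
seconds     = len(capture) / 60.0

# Lowest frame rate over any one-second window of the capture
per_second = [sum(1 for new, _ in capture[i:i + 60] if new)
              for i in range(0, len(capture) - 59, 60)]

print(f"AVG({new_frames / seconds:.3f}) "
      f"Tearing({100.0 * torn_frames / new_frames:.3f}%) "
      f"Low({min(per_second):.1f})")
```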
 
Joker, on another topic, may I ask whether you've heard any comments about the new Xbox development kit?
I wonder if it will have an impact on the amount of optimization done on the 360, or if it will only come down to doing the same job in less time (as opposed to doing more work in the same time)?
 


The screen tearing is clearly visible in the cut scenes of RE5 360 (no, the tears don't stay at the top), so why not use triple buffering in the cut scenes at least? You don't control anything there. And IMO the input lag issue with RE5 PS3 has more to do with its piss poor frame rate. We should look at games with a steady frame rate, like Resistance 2 or Uncharted 2, if we're to discuss the input lag of triple buffering. In fact, when you look at the 30fps games out there, input latency is already well over 100ms. Even a supposedly responsive game like Halo 3 (double buffered, v-locked) exhibits 130ms~160ms of input latency, and it is amplified even further depending on what kind of display you use (a plasma, for example, can add another 30~50ms). I seriously doubt the extra 17ms of latency from triple buffering would make any tangible difference, nor does it in practice.
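To put that 17ms into perspective, here's a back-of-the-envelope latency budget. Every component below is an assumption picked to land in the range quoted above, not a measurement of Halo 3 or any other game; the point is only the relative size of one extra refresh of buffering.

```python
# All figures are assumptions for illustration, in milliseconds.
components_ms = {
    "controller polling / OS":        8,   # assumed
    "game simulation tick (30 Hz)":  33,   # input is consumed on the next tick
    "rendering the frame (30 Hz)":   33,   # one frame of GPU work
    "waiting for vblank + scanout":  25,   # assumed average
    "display processing (TV)":       40,   # the kind of figure quoted for plasmas
}
baseline = sum(components_ms.values())         # ~139 ms end to end
extra_buffer = 1000.0 / 60.0                   # one additional buffered refresh

print(f"baseline chain : {baseline} ms")
print(f"+ third buffer : {baseline + extra_buffer:.0f} ms "
      f"(+{extra_buffer / baseline:.0%})")
```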

Now if we look at some other examples, such as 50 Cent and Bionic Commando: these games are completely missing v-sync on the 360, so you get well over 50% torn frames seriously affecting IQ, while the PS3 versions are tear-free at a similar frame rate with triple buffering. These games do provide slightly more responsive controls on the 360 (in this case a 33ms latency advantage over triple buffering), but if that was the reason not to use triple buffering, why not do the same on the PS3?

If there's really nothing holding the 360 back from using triple buffering, it should be even more capable of it than the PS3, given its more readily available RAM. But in reality there are plenty of triple-buffered games on the PS3, while I've yet to see one on the 360.

Screen tearing is a serious issue, and having less of it on the 360 can't be a valid reason for the complete absence of triple buffering there, because there are always games with a poor frame rate and tons of screen tearing, like Mass Effect, that could clearly benefit from it. (If you haven't noticed the screen tearing in ME, I envy you. I'm extremely sensitive to screen tearing, and it's the single deciding factor when I purchase a game. I did get the 360 version of RE5 though; dynamic QAA is f**king stupid :LOL:)
 
And again, that doesn't explain why there is a series of games that run with 30% or more tearing on the 360 as well, yet triple buffering is still missing there despite that, while it isn't on the PS3.
 
A few notes about the v-sync situation. First of all it is important to note that there can be a HUGE difference in human perception vs the mathematical fact of whether a frame is torn or not.

Frame rate analysis will note v-sync tearing where the human eye cannot see it. The human eye is far less accurate an instrument. Some observations based on my own perception of tearing:

1. The further away from the centre of the screen the tear is, typically the less obvious it will become during gameplay.
2. If the tear is within the overscan areas of the typical HDTV, it will not be noticed. A pure v-sync calculation on Batman: Arkham Asylum or Mirror's Edge on PS3, for example, will show that around 40% of frames are torn. However, as far as the human eye is concerned, there isn't that much tearing. In these cases, I tend to clip out the overscan area from analysis (a rough sketch of that clipping follows below). But I am making a subjective call here on where I would consider the overscan boundaries to be.
3. If the game scene is moving in a lateral, left to right motion, the tearing will be far more obvious to the human eye than movement "into" the screen.

In this sense saying that Game X has 10% screen tear or whatever is an interesting measurement, but it is a measurement only and its relation to the gameplay experience will vary dramatically from game to game.
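As an illustration of point 2, here's a trivial sketch of what clipping the overscan band out of a tear count looks like. The tear positions are invented and the 5% band is an arbitrary stand-in for the subjective overscan boundary mentioned above.

```python
# Tear positions as fractions of screen height (0 = top); invented data.
tear_positions = [0.02, 0.03, 0.01, 0.47, 0.04, 0.92, 0.02]
OVERSCAN = 0.05                     # assumed overscan band at top and bottom

counted = len(tear_positions)
visible = sum(1 for y in tear_positions if OVERSCAN <= y <= 1.0 - OVERSCAN)

print(f"torn frames counted: {counted}")
print(f"likely visible after clipping the overscan band: {visible}")
```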

To illustrate, up until I did in-depth analysis of Halo 3, I could've sworn it was v-locked at 30fps, but it isn't - that's an example of Bungie making the right call on performance and response vs IQ.
 
you're right grandmaster, the human eye is a terrible judge of whether or not it's happening; give me the machine's results any day.
also, what scene is being rendered can make it more apparent (e.g. colours, contrast); this is related to your point 3
 
I'd like to make a little correction here: triple buffering does NOT add input latency over double buffering with v-sync, and it actually has an input latency advantage over double-buffered games with a hard v-lock (rarely seen; the most recent example would be Infamous), simply because triple buffering can display more frames (double buffering with a hard v-lock can only display at an even divisor of the refresh rate: 60, 30, 20, 15, etc.). So brain_stew was right after all. Most games use double buffering with dynamic v-sync (turning v-sync off when the frame rate dips below 30). However, it does add a frame (16.67ms) of input latency over games with no v-sync at all, and I believe this was the case with joker's game.
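A quick sketch of that arithmetic, using hypothetical render times (it models nothing except the presentation step): a double-buffered game under a hard v-lock can only present at 60/n fps, while triple buffering lets the GPU keep rendering, so the displayed rate tracks the render time up to the refresh rate.

```python
import math

REFRESH_HZ = 60.0

def hard_vlock_double_buffer_fps(render_ms):
    # The GPU stalls on the flip, so each frame occupies a whole number of
    # refreshes: 60, 30, 20, 15, ... fps.
    refreshes_per_frame = math.ceil(render_ms / (1000.0 / REFRESH_HZ))
    return REFRESH_HZ / refreshes_per_frame

def triple_buffer_fps(render_ms):
    # No stall on the flip; display rate follows render time, capped at 60.
    return min(REFRESH_HZ, 1000.0 / render_ms)

for render_ms in (16.0, 18.0, 25.0, 40.0):   # hypothetical per-frame GPU times
    print(f"{render_ms:4.1f} ms/frame -> "
          f"v-locked double buffer {hard_vlock_double_buffer_fps(render_ms):4.1f} fps, "
          f"triple buffer {triple_buffer_fps(render_ms):4.1f} fps")
```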
 
you're right grandmaster, the human eye is a terrible judge of whether or not it's happening; give me the machine's results any day.
I don't think that was quite his point. ;) I think what grandmaster is saying is that you may have a numerical advantage, but in the real world it makes no odds, so saying 'this game looks better on this platform than that one because it has 67% less tearing' is misleading. Thus triple buffering to remove tearing may be pointless if the tearing, even if there's lots of it, is mostly imperceptible. e.g. 100% torn frames in the top three lines looks bad on paper but won't affect gameplay.
 

But then Shifty, what I gather from what folks like assurdum and a few others are saying is that for the many games on 360 that do tear badly, the tearing is NOT imperceptible. So the question then is: in these cases, why would the developer choose not to implement triple buffering on the 360 (which observably tears rather heavily) and yet implement it on the PS3?

Personally I'm generally not sensitive to some tearing and things like minor framerate drops, although there have been a few 360 games I've played where the tearing was obvious enough that it became jarring :-S
 
Because it is clear that human perception of tearing is different from person to person? At the end of the day, someone somewhere with human eyes is making the decision about when to engage v-sync and when not to.

We can only make a guess as to what those people's reasons are.
 
I suppose the question isn't really about tearing, but why different versions of a multiplatform game have variations that seem nontechnical, like brightness and contrast settings. In the case of tearing, I guess the XB360 team decided less lag was better than less tearing, whereas the PS3 team chose otherwise.

I think Joker has given enough info for us to know it's not because triple buffering is easier or more effective on PS3, so people shouldn't be looking for hardware explanations.
 
I'm not trying to blame joker, he surely has more knowledge than me, big respect, but I don't think saying 'the 360 has no need of that' and 'theoretically it isn't a problem' proves the opposite. The games show a different situation. An off-topic example: Fallout 3 had better textures on the PS3 than on the 360, when theoretically it should be the reverse; and the first time someone noticed that, I read a lot of explanations about how it wasn't possible, placebo effect, etc... after the screenshots nobody said anything more. So when there is a bunch of games showing triple buffering isn't a problem on the PS3 while it's missing on the 360, with evident tearing after all, where is the logic in saying the 360 has no need of it, tearing is subjective, etc.? I'm the first not to care much about a 2% vs 30% tearing difference between the two systems, but reading the Eurogamer face-offs, the 360 version sometimes beats the PS3 version only on that point, or am I wrong? :???: Don't get me wrong, a console war isn't my purpose. I have just said v-sync seems more workable on the PS3, no more, and I don't believe that's so absurd looking at the results.
 
Because it is clear that human perception of tearing is different from person to person? At the end of the day, someone somewhere with human eyes is making the decision about when to engage v-sync and when not to.

We can only make a guess as to what those people's reasons are.

Well then if triple buffering on 360 is as easy as joker claims then I guess the only conclusion we can draw is that most 360 developers have terrible eyesight! :p

Any tearing is too much tearing as far as I'm concerned and it absolutely does impact my gaming purchases. Even tearing in the overscan area isn't a great solution in my eyes, one of the specific reasons I chose the HDTV I bought is because it has a "just scan" mode, and now my quest for perfect IQ is having the exact opposite effect! :devilish:

Tearing is just a real pet peeve of mine, and when there's such a simple solution out there I always find it baffling that it is used so little. I'm talking about the PC side here as well, but at least D3DOverrider sorts that out for me; it should be an in-game option in more games, and more people should know the benefits.
 


http://forums.xna.com/forums/p/31951/182981.aspx#182981


The linked forum posts above seem to indicate that triple buffering on the 360 is not as easy to incorporate as joker claims. Even the most hardcore tech guys like Crytek couldn't get it running in the 360 version of their engine, so I guess there's nothing more to discuss about whether the 360 is capable of triple buffering or not.

IMO, the real benefit of triple buffering is for games that are just shy of 60fps, where you'd otherwise have to choose between screen tearing and a 30fps cap with double buffering. It would also save time on any further optimization needed to get the game running at 60. For an extra ~15MB of VRAM for the third buffer, I think that's a steal. I'm pretty sure a lot more people would prefer tear-free IQ with a possibly higher frame rate and more responsive controls over the slightly better texture fidelity that 15MB could provide, which probably no one would notice.
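For what it's worth, here's the rough arithmetic on that extra buffer. The exact cost depends on resolution, pixel format and any MSAA, so the combinations below are only illustrative assumptions, not RE5's actual render-target setup.

```python
def buffer_mb(width, height, bytes_per_pixel):
    # Size of one colour buffer in MB; triple buffering needs one more of these.
    return width * height * bytes_per_pixel / (1024 * 1024)

for label, w, h, bpp in [("1280x720  RGBA8", 1280, 720, 4),
                         ("1280x720  FP16 ", 1280, 720, 8),
                         ("1920x1080 FP16 ", 1920, 1080, 8)]:
    print(f"{label}: {buffer_mb(w, h, bpp):5.1f} MB")
```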
 
Thank you very much for the link. Finally we have found an explanation. ;)
 