Image Quality and Framebuffer Analysis for Available/Release Build Games *Read the first post*

Plainly wrong.
Of those points I believe 3 out of 4 are wrong; see the top of this page. Like I said, IIRC the AnandTech article had some points that were bollocks.

I've seen the evidence plain as day in dozens of PC games I've used it with.
How do you know you're using triple buffering vs. double? What games and what video card?
 
How do you know you're using triple buffering vs. double? What games and what video card?

Perhaps because he uses D3DOverrider + FRAPS to see the framerate. In games that don't enable triple buffering automatically, turning on in-game vsync on a 60 Hz monitor gives you 60 fps, 30 fps, 20 fps, etc., and nothing in between. Forcing triple buffering, however, gives you anywhere between 20 and 60 fps, and most likely it will never bottom out at 20 fps.
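To make the quantization concrete, here's a rough sketch (my own back-of-envelope model, not anything from D3DOverrider) of why double-buffered vsync snaps to 60/30/20 fps on a 60 Hz display while triple buffering can sit anywhere in between:

```python
import math

# With vsync and only two buffers, a frame that misses a refresh must wait
# for the next vblank, so the effective rate snaps to refresh/n for whole n.
def double_buffered_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    interval_ms = 1000.0 / refresh_hz
    intervals = math.ceil(render_ms / interval_ms)  # whole refreshes per frame
    return refresh_hz / intervals

# With a third buffer the GPU keeps rendering, so (up to the refresh cap)
# the average rate tracks the actual render time.
def triple_buffered_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    return min(refresh_hz, 1000.0 / render_ms)

# A 20 ms frame (50 fps raw) collapses to 30 fps double buffered,
# but keeps its 50 fps average with triple buffering.
print(double_buffered_fps(20.0))  # 30.0
print(triple_buffered_fps(20.0))  # 50.0
```

This also shows why the 20 fps floor is rarely hit in practice: only frames slower than 50 ms would land there.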

About the framerate: it is visible with FRAPS or a debug info overlay. And despite whatever lag might be introduced by triple buffering, it is still more responsive to have 40-50 fps than 30 fps, or 28 fps than 20 fps.

For me it is plainly obvious when triple buffering is not enabled, as you really notice the framerate dips. Dropping from 60 to 30 fps for a second or two is blatantly obvious, as is 30 to 20 fps. In The Witcher, going from 60 fps directly to 30 fps is like a roller coaster abruptly braking and throwing you slightly off balance, while with triple buffering it is a steady 58-60 fps. Go figure which one is more pleasant and responsive to the player.

Of course one could just disable vsync and be happy ignoring the plague of tearing frames (IMO worse than a lack of AF). However, some games just lose their smoothness with vsync off; they feel stuttery and not fluid despite high framerates. A good example is BF2: back in the day I had around 60 fps, and without vsync the visual output was twitchy and incoherent (in singleplayer). Because of the tearing and skipped frames/partial frames, the transition between frames was not good enough to give a smooth, even motion sequence. Enabling vsync and triple buffering gave a butter-smooth experience. Heavens bless triple buffering and vsync.
 
The difference between triple buffering with vsync and double buffering without vsync is that you inject, on average, half a frame of latency into the user-input -> update -> display chain.

For 60 Hz DVI/HDMI-connected displays this amounts to ~8 ms, which is noticeable enough in twitch first-person shooters.
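The 8 ms figure is just half of one refresh interval; a quick arithmetic check:

```python
# On average a completed frame waits half a refresh interval for the
# next vblank before it can be scanned out.
refresh_hz = 60.0
avg_added_latency_ms = (1000.0 / refresh_hz) / 2.0  # 16.67 ms / 2
print(round(avg_added_latency_ms, 1))  # 8.3
```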

Cheers

"Twitch" shooters that already have over 100ms of lag? When your average console game is releasing with over 100ms of input lag (and a lot more once you take into account the extra lag added by the dislay) an extra 8ms of input lag is going to make literally no difference at all. The input latency argument doesn't hold up at all, in fact one of the key benefits of triple buffering is more responsive controls compared to double buffer vsync. Its not always the correct solution, but I'll be damned if 9 times out of 10 it isn't the best solution.


How do you know you're using triple buffering vs. double? What games and what video card?

I use D3DOverrider, and a sound notification lets me know when it has been enabled. Performance compared to standard double-buffered vsync in most games is literally night and day. Try it yourself if you don't believe me; for any PC gamer who isn't a fan of tearing (which I assume is the vast majority), it's the same as a free and very significant GPU upgrade.

You're talking about as much as a 50% improvement in average framerate in the best case, but more than that, the perceptual improvement cannot be overstated, as it becomes much harder to spot dropped frames. Tearing is just never a solution as far as I'm concerned; it totally ruins image consistency and in the worst circumstances will make a game unplayable. Would I be happy to slightly lower the resolution of a few dozen textures to get 100% rid of that artefact in a console game? Hell yes I would. Heck, considering the complete lack of anisotropic filtering in most console games, I'd never notice the difference anyway.
 
Plainly wrong. With triple buffering the graphics hardware sits on the completed frame until the vertical blank. This means scan-out of that fresh frame is, on average, delayed by the time it takes to scan out half a frame (~8 ms @ 60 Hz).

Cheers
With correctly implemented triple buffering (always using the most recently completed frame at vblank), the only case where you get more latency than without vsync is when you would otherwise get tearing -- and the mental processing of that artifact is more distracting than a few ms of additional wait time.
 
With correctly implemented triple buffering (always using the most recently completed frame at vblank), the only case where you get more latency than without vsync is when you would otherwise get tearing -- and the mental processing of that artifact is more distracting than a few ms of additional wait time.

Which is the case: most people here are complaining that the 360 is tearing. It's not doing double buffering with vsync and it's not doing triple buffering with vsync, because developers (as joker and others have already pointed out) would rather have some screen tear (using double buffering without vsync, or a pseudo double-buffered vsync) than the higher latency of triple buffering with vsync.

Pseudo double-buffered vsync keeps vsync on while the framerate is above 30; if it drops below, it falls back to double buffering without vsync.
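As a sketch (names are my own; the actual implementation is engine-specific), the policy is just a per-frame threshold check:

```python
# "Pseudo double-buffered vsync": vsync on while the game holds 30+ fps,
# off below that, trading some tearing for avoiding the framerate halving.
def use_vsync(current_fps: float, threshold_fps: float = 30.0) -> bool:
    return current_fps >= threshold_fps

print(use_vsync(60.0))  # True  -> sync to vblank, no tearing
print(use_vsync(24.0))  # False -> present immediately, tearing possible
```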
 
About the Batman AA PS3/360 Lens of Truth comparison: has anyone noticed the differences in textures or drawing distance on the ground between the two versions?

http://i30.tinypic.com/egtkx2.jpg

http://i27.tinypic.com/a5ktw9.jpg

The first shot shows an added shader detail mapping on the 360's surface. The guard's face and helmet might have a similar effect applied too, but the JPEG is rather compressed; there is definitely a higher definition to the normal map creases (like the part just above his lip). The second pic looks like a different texture altogether. The ground geometry looks different too (notice the part jutting out at the bottom of the red circle on PS3).
 
The first shot shows an added shader detail mapping on the 360's surface. The guard's face and helmet might have a similar effect applied too, but the JPEG is rather compressed; there is definitely a higher definition to the normal map creases (like the part just above his lip). The second pic looks like a different texture altogether. The ground geometry looks different too (notice the part jutting out at the bottom of the red circle on PS3).

Yeah, all of the Lens of Truth pics seem to be JPEG shots, so it's hard to see much detail :???: although in a few of them you can see some differences. In the first one I showed, I noticed more glimmering on the ground pavement in the 360 version.

http://www.lensoftruth.com/wp-content/gallery/h2h_batman_aa/h2h_batman_aa_00112.jpg

And even in the second rollover pic you can see the 360 version having that same glimmer effect, just shrunken down. :cry: http://www.lensoftruth.com/?p=14110

For the second pic, I too think the texture is different, but it could all be due to the sloppiness of the texture that's showing such distortion. It seems to be the same ground texture, just distorted in the PS3 version. :???: http://i25.tinypic.com/iep93l.jpg

More 1080p uncompressed shots of both versions would help. ;)

Hmmmmm, in IGN's review of Batman AA they said: "the graphics tend to be sharper on the Xbox 360 when compared to the PS3, which packs a 1.2 GB install that lasts for three minutes. Still, it's not enough to matter in terms of how great this game truly is so don't expect a score difference in terms of reviews. Of course, the PS3 also has the exclusive ability to take on challenges as the Joker, but I'm not including that in this review because the Joker stuff isn't on the game disc -- it's free DLC you get from the PlayStation Store."

http://xbox360.ign.com/articles/101/1016701p3.html

Now, they would only say something like that if they had already done a comparison, so it seems they did. What I liked about IGN's written review is that they had a sense of compassion and respect for both versions and left out what each had over the other. (Whereas Lens of Truth did a finer comparison, which is what they do, but left out more specific detail of what the 360 version had as depicted in their shots, and left out that the PS3 version comes with a mandatory installation.)

I think if you're going to take the two steps to make a nitpick comparison, you should take the extra step to bring forth all the nitpicking details... after all, their motto states "just telling the truth" :)
 
In the first screenshot, the Xbox version looks like it's higher res (but I'm no expert like Quaz).
The second one is impossible to judge; the camera is further back, so a lower mipmap level is used on the ground, which always leads to blurring.

BTW Nebula/brain_stew, about D3DOverrider: I'd never heard of this application before.

Though my point still stands. Here are some facts: 3 of the 4 are wrong, e.g.
>>1. triple buffering does not affect frame rate performance.
Plainly false, as the whole reason for it is to lock the frame rate to the vsync refresh. #2 is correct, but 3 and 4 are both false.
 
Let me sum up something obvious: vsync = for a weak system.

Vsync only adds control delay, no matter how much you like it, whereas a system that can always deliver above the target framerate doesn't need vsync because it is more powerful. Which means there's no screen tearing (or no obvious tearing) and more direct control, because the more powerful system rarely or never falls under 30 or 60 fps.

You have two choices: a Samsung + vsynced game (200 ms of input delay), or a Panasonic + non-vsynced game (30 ms of input delay).

Well, the difference is that the guy with the Panny will probably enjoy his game more than the guy with the Samsung.
 
About Batman AA, IGN stated: "the graphics tend to be sharper on the Xbox 360 when compared to the PS3, which packs a 1.2 GB install that lasts for three minutes."

http://xbox360.ign.com/articles/101/1016701p3.html

I'm guessing they too already did a comparison, and judging by the outcome they already came to terms with it.

Funny how Lens of Truth never mentioned anything about a mandatory installation for the PS3 version; isn't that something the 360 version doesn't have? ;)

A comparison of the two Batman console versions just turned up at GameTrailers.

http://www.gametrailers.com/video/console-comparison-batman-arkham/55191

The first 40 seconds are BS because those movie scenes are all pre-rendered. (They should sit closer to their TVs :rolleyes:)

In their comparison I noticed a few things. At around 1:13, in a close-up of a character, there appears to be some lack of self-shadowing on the PS3 side (under the nose and eyes).

http://i29.tinypic.com/2zqzj2o.jpg

It's better seeing it in motion.

At 1:46 you see some differences in ground detail between the two versions.

http://i28.tinypic.com/e1b402.jpg

Even in this blurry pic I print-screened, you can see some differences in the 360 version (in the ground highlights).

And at around 2:13 you see what looks like a very good reason not to make a comparison video :oops: (very, very low-res :???:). Anyway, you see a difference there (though it could be because of the poor angle they took of the PS3 version). http://i31.tinypic.com/2egfouu.jpg

An uncompressed 720p (or 1080p-upscaled) comparison would be very much appreciated :smile:
 
Vsync off = "poor man's choice"
No triple-buffering solution with vsync = dumb, unless system limitations prohibit its use.

For some of us, vsync off = no annoying extra bit of lag to throw us off in fast-moving FPS games. Especially if you're playing in professional competition, as I used to.

Vsync on with triple buffering is fine for casual play, but the small bit of lag is just too infuriating for twitch play for some of us.

I keep revisiting the issue with each new video card I get, but the situation remains much the same: vsync off for twitch games; vsync on and triple buffering on for slower-paced, more casual games (RTS, RPG, adventure games, etc.).

Sorry for going off topic, but it's a pet peeve when people try to blow off others who prefer vsync off for twitch games.

Regards,
SB
 
The first 40 seconds are BS because those movie scenes are all pre-rendered.

Indeed... There are numerous cut-scenes that have been pre-recorded as videos to hide loading.

In their comparison I noticed a few things. At around 1:13, in a close-up of a character, there appears to be some lack of self-shadowing on the PS3 side (under the nose and eyes).

http://i29.tinypic.com/2zqzj2o.jpg

It's better seeing it in motion.

This is actually one example of the 360 edition's heavy reliance on SSAO.
 
About Batman AA, IGN stated: "the graphics tend to be sharper on the Xbox 360 when compared to the PS3, which packs a 1.2 GB install that lasts for three minutes."...
Hi, just a quick note could you please add capital letters in all the right places. It aids reading and gives a visual cue to the standards we hold here. Thanks.
(I'll remove this OT post once read)
 
BTW Nebula/brain_stew, about D3DOverrider: I'd never heard of this application before.

It's part of the RivaTuner install package, and also available as a separate standalone application.

Let me sum up something obvious: vsync = for a weak system.

Vsync only adds control delay, no matter how much you like it, whereas a system that can always deliver above the target framerate doesn't need vsync because it is more powerful. Which means there's no screen tearing (or no obvious tearing) and more direct control, because the more powerful system rarely or never falls under 30 or 60 fps.

What was it, 8 ms of lag for triple buffering?

You either get ~8 ms of lag, or unfinished frames making it harder to see what's going on due to interrupted fluidity in the visual output.

Also, screen tearing is still present above 30 fps or 60 fps or anywhere in between, so you still need vsync. Turning off vsync in any PC game that runs beyond 30 fps should be proof enough (a locked 30 fps still tears; a locked 60 fps still tears). Call of Juarez: Bound in Blood should be another title to check, as it runs above 30 fps and tears noticeably on the 360, as observed in this thread.

EDIT: I assume controller response is also something to factor in -- wireless versus directly connected, etc. I myself use a wired 500 Hz (current setting)/1000 Hz polling rate, 2000 DPI mouse.
 
Whatever differences you're seeing in Batman AA's texture resolution or quality are due to the slightly different angles (the guard shot), the capture quality, AA, and the lack of SSAO on the PS3. Having run both versions side by side, there aren't any differences other than what DF concluded with the demo.

On that note, 1080p comparisons for a 720p-native game are pointless.
 
"Twitch" shooters that already have over 100ms of lag? When your average console game is releasing with over 100ms of input lag (and a lot more once you take into account the extra lag added by the dislay) an extra 8ms of input lag is going to make literally no difference at all.

I'm talking about visual feedback latency, not network latency. They're not the same thing at all. If you turn left, your PC/console doesn't wait 100 ms for a response from the server before it turns your view. All network games today have local updates of your own position, and prediction for other clients.

If you really want to explore the difference, go play the original Quake. It had a strict client/server architecture where the client (your PC) would only draw once it got the update from the server. Fairly spooky to press 'W' to move forward and then wait 100 ms before you actually moved; timing jumps over lava was tricky. Compare that to the internet-optimized QuakeWorld, which had local update and prediction.

Most games have a fundamental loop that goes:
Read input -> drive game engine (interaction, motion + collision) -> render

Lather, rinse, repeat.
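As a toy sketch (all names are mine, not from any actual engine), that loop might look like the following. The point is that input sampled at the top of an iteration isn't visible until that iteration's frame reaches the screen, which is exactly where buffering strategy adds its latency:

```python
# Minimal read-input -> update -> render loop over a scripted input stream.
def run_frames(inputs):
    position = 0
    rendered = []
    for move in inputs:            # read input (controller/mouse sample)
        position += move           # drive game engine (motion + collision)
        rendered.append(position)  # render the freshly updated state
    return rendered

print(run_frames([1, 1, -1]))  # [1, 2, 1]
```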

The relative penalty for enabling triple buffering and vsync varies with the game's update rate (frame rate) and the refresh rate of the screen.

For a souped-up PC running an FPS at 200 Hz (5 ms internal latency) on a screen refreshing at 60 Hz, it makes a big difference.

For a console game running at 60 Hz with a screen refreshing at the same frequency, triple buffering adds 50% latency on average.

For a console game running at 30 Hz, vsync adds little penalty.
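Those ratios can be checked with quick arithmetic (my own back-of-envelope, using the half-refresh-interval average wait):

```python
# Average added wait for a completed frame = half a refresh interval.
def added_wait_ms(refresh_hz: float) -> float:
    return (1000.0 / refresh_hz) / 2.0

# Relative penalty = added wait vs. the game's own internal latency.
def relative_penalty(internal_ms: float, refresh_hz: float = 60.0) -> float:
    return added_wait_ms(refresh_hz) / internal_ms

# 200 Hz game (5 ms internal) on a 60 Hz screen: ~8.3 ms extra, a ~167% hit.
print(round(relative_penalty(5.0), 2))            # 1.67
# 60 Hz game (16.7 ms internal): the ~50% average penalty quoted above.
print(round(relative_penalty(1000.0 / 60.0), 2))  # 0.5
```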

Cheers
 
I'm talking about visual feedback latency, not network latency.

Um, so am I. Your average console game has over 100 ms; Gamasutra ran a very nice article on the subject. Once you factor in the horrible amounts of latency inherent in modern flat-panel displays, you realise most people are playing games with 200 ms+ of visual latency without a care in the world.

Multithreading and synchronisation are the big causes of input latency these days, not vertical sync.

Edit: here's the article for you:

http://cowboyprogramming.com/2008/05/30/measuring-responsiveness-in-video-games/

These measurements leave display latency out of account (since it varies between setups), but you're looking at an extra 50 ms+ of latency in many cases.

Games that run at 60 fps:

* PS3 system menus: 3/60ths
* Guitar Hero 3 (Xbox 360): 3/60ths
* Ridge Racer 7: 4/60ths
* Virtua Tennis: 4/60ths
* Ninja Gaiden Sigma: 4/60ths
* PixelJunk Racers: 4/60ths

Games that run at 30 fps:

* Genji: Days of the Blade: 6/60ths
* Tony Hawk's Proving Ground: 8/60ths
* Blacksite: Area 51: 8/60ths
* Halo 3 (Xbox 360): 8-10/60ths
* EA's "Skate": 10/60ths
* GTA-IV: 10/60ths
* Harry Potter: 10-14/60ths
* Heavenly Sword: 7-18/60ths
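Converting those figures from 60ths of a second to milliseconds makes the 100 ms+ claim concrete:

```python
# The article reports response times in 60ths of a second; convert to ms.
def sixtieths_to_ms(n: float) -> float:
    return n * 1000.0 / 60.0

print(round(sixtieths_to_ms(3)))   # 50  -- best case measured (PS3 menus)
print(round(sixtieths_to_ms(10)))  # 167 -- e.g. GTA-IV, Skate
```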
 
Hi, just a quick note could you please add capital letters in all the right places. It aids reading and gives a visual cue to the standards we hold here. Thanks.
(I'll remove this OT post once read)

Hey, yeah, I got the memo. Is there any way I can re-edit my message?

Whatever differences you're seeing in Batman AA's texture resolution or quality is because of the slightly different angles (the guard shot), the capturing quality, AA, and the lack of SSAO on the PS3. Having both versions run side by side, there aren't any differences other than what the DF concluded with the demo.

On that note 1080p comparisons for a 720p native game is pointless.

Hmmmm, I see. My conclusion about the ground textures looking that way is that the camera is closer in the 360 version; since the camera is closer, the mipmap transition is pushed back further than in the PS3 version. (Unless I'm wrong :smile:)

As for the inquiry about 1080p shots: true, there wouldn't be any change, though I would still like to see some nicer 720p shots. Batman's one of my favorite superheroes, and I've got both a PS3 and a 360, so it's hard to make up my mind about which one to get. :cry: I really like the AA and SSAO the 360 version has, not to mention the fewer torn frames it offers, but I also like the Joker DLC the PS3 version's got.

Something in my head always tells me to get UT3 games on the 360, because they look cleaner and perform slightly better. What's with the 360 having AA present in all of its UT3 games? Is that Epic's doing, or the 360's?

The PS3 can handle anti-aliasing too. :???:
 
Hey, yeah, I got the memo. Is there any way I can re-edit my message?



Hmmmm, I see. My conclusion about the ground textures looking that way is that the camera is closer in the 360 version; since the camera is closer, the mipmap transition is pushed back further than in the PS3 version. (Unless I'm wrong :smile:)

As for the inquiry about 1080p shots: true, there wouldn't be any change, though I would still like to see some nicer 720p shots. Batman's one of my favorite superheroes, and I've got both a PS3 and a 360, so it's hard to make up my mind about which one to get. :cry: I really like the AA and SSAO the 360 version has, not to mention the fewer torn frames it offers, but I also like the Joker DLC the PS3 version's got.

Something in my head always tells me to get UT3 games on the 360, because they look cleaner and perform slightly better. What's with the 360 having AA present in all of its UT3 games? Is that Epic's doing, or the 360's?

The PS3 can handle anti-aliasing too. :???:

Because AA isn't well supported in UE3. In fact the 360 version has a sort of edge-only 2xAA, but not "full" 2xAA.
 