GRAW review XBOX 360 by IGN

Why have I never noticed this before?

I know what Vsync is, vaguely. What I don't know is what these people are talking about in regard to a console game.

Screenshots of this in GRAW? Examples of other games that have this failure?

Explanations why videogame companies would put out product with an obvious easily fixed technical flaw?

Proof that this flaw exists in other console games? No personal claims about what *you* see, please.
 
Do I care about this alleged flaw? Will I not buy GRAW? Do I think I will notice this flaw?

No.
I have never noticed it before. I have never heard of it before.

Why did the IGN review not mention this? Is it mentioned in any review? Why would reviewers miss a glaring visual flaw in a game?

Examples of this in other console games?
 
Xbot360 said:
Sometimes it seems like people pick apart every little flaw in Xbox games.

Sorry, I forgot to mention the game is excellent, easily the best so far on the X360 IMO, and the tearing really doesn't bother me. I'm pretty sure a lot of gamers will not even notice it. My only gripe is the lack of anisotropic filtering, which is noticeable, especially if you are a PC gamer used to applying 8x-16x. From what I understand, AF is not as big a performance hit as AA, so why not apply it?
 
Xbot360 said:
Let me ask this: What are some other games with this "issue"? Whatever it is.

:idea:

You do have an Xbox, right? Do you have a copy of Metal Arms within reach? Play it and witness what they mean by screen tearing. It is pretty severe in Metal Arms, a great introduction to the phenomenon we call screen tearing.
 
The vast majority of console games use vsync. I'm not sure whether the ones that don't skip it because they don't want to spend the extra framebuffer's worth of memory needed to make vsync run smoothly, or because they just don't know any better; probably a combination of the two.

Other examples of games that lack vsync are PDZ and Madden on the 360, all the Splinter Cell games (the Xbox versions anyway, I never played the other console versions), and God of War on the PS2.
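
For a sense of scale, that extra buffer at 720p only works out to a few megabytes (rough arithmetic on my part, assuming a plain 4-bytes-per-pixel colour buffer, nothing 360-specific):

```python
# Rough size of one extra colour buffer at 720p (illustrative assumption:
# 4 bytes per pixel; real formats and padding may differ).
width, height, bytes_per_pixel = 1280, 720, 4
extra_buffer_mb = width * height * bytes_per_pixel / (1024 * 1024)
print(f"One extra framebuffer: about {extra_buffer_mb:.1f} MB")   # ~3.5 MB
```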
 
kyleb said:
The vast majority of console games use vsync. I'm not sure whether the ones that don't skip it because they don't want to spend the extra framebuffer's worth of memory needed to make vsync run smoothly, or because they just don't know any better; probably a combination of the two.

Other examples of games that lack vsync are PDZ and Madden on the 360, all the Splinter Cell games (the Xbox versions anyway, I never played the other console versions), and God of War on the PS2.
PDZ has pretty nasty tearing. FEAR for the PC lets me turn it off, and I get a fairly good jump in framerate, so I run with it off.

I doubt it's a "don't know any better" issue; more likely a design tradeoff they didn't expect to have to make. It sounds like vsync is trivial to incorporate but comes with a small hit in perf, so it's a prime candidate to drop when trying to hit their perf numbers.
 
Xbot360 said:
Sometimes it seems like people pick apart every little flaw in Xbox games.

I have never in my life heard of this so-called vsync issue before. But all of a sudden it's this big thing, right?

I do not care 1%. I've never, ever heard of it, let alone noticed it.

Well, I guess I'm a grumpy, crusty old gamer, because tearing from the lack of vsync has been really distracting for me since I first fired up GLQuake on my PC many moons ago! :)

Vsync is one of those things that some people care about and some don't, similar to AA. Some people's eyes bleed without it and some don't even notice when it's gone.

For me, the tearing caused by vsync being 'off' really hurts the suspension of disbelief. When it is off, the effect can range from minimal to awful.

What I am wondering is whether the use of tiling, or not, has any impact on the ability to vsync. I think it's odd/unsettling that a few of the marquee titles have chosen to leave it off.
 
kyleb said:
The vast majority of console games use vsync. I'm not sure whether the ones that don't skip it because they don't want to spend the extra framebuffer's worth of memory needed to make vsync run smoothly, or because they just don't know any better; probably a combination of the two.

Even games without vsync use double buffering, so enabling vsync won't require an additional buffer. Or are you referring to triple buffering to improve overall frame rates (when used with vsync), but at the cost of added latency?

Keeping vsync on until you drop below the target frame rate seems like the best way to use resources: a much less noticeable drop in frame rate, but at the cost of tearing.
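
Something like this per-frame decision is what I have in mind (just a hypothetical sketch with made-up frame times; a real engine would hook the choice into its swap/present call):

```python
# Hypothetical "vsync until you miss the target" decision, per frame
# (sketch only; the frame times below are invented, not measured from any game).
REFRESH_HZ = 60
VBLANK_INTERVAL_MS = 1000.0 / REFRESH_HZ     # ~16.7 ms between refreshes

def present_mode(frame_time_ms):
    """Pick how to present a frame that took frame_time_ms to render."""
    if frame_time_ms <= VBLANK_INTERVAL_MS:
        # Made the refresh window: swap on the vblank, no tearing.
        return "wait for vblank (no tear)"
    # Missed the window: swap immediately rather than stall until the next
    # vblank, accepting a tear instead of a bigger frame-rate drop.
    return "swap immediately (tear)"

for t in (14.2, 16.0, 18.5, 25.0):
    print(f"{t:5.1f} ms -> {present_mode(t)}")
```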
 
The console games that don't use vsync don't show a lack of knowledge on the devs' part. If anything, the opposite.
These games don't have vsync enabled because the devs couldn't get the framerate stable enough. In those circumstances, they would rather have the screen tear than have the game's framerate go all the way down to 15fps (with vsync, it always has to be a multiple of 30). That way, the game slows down less noticeably than with vsync enabled, as it can go down to 29 or 28 or 27 or whatever fps the hardware can handle at the time; the screen tears, however.
 
From my perspective, if it's dropping down to 29 or 28 and that's why they lose the VSync, they should cut back on the engine so it never takes more than 33ms to process a frame. I hate that the first thing to go in any endeavour is quality. If something is to be cut back, for cost reasons, performance, or anything else, the creators will tend to keep all the features and keep them rough, rather than drop a few and keep everything classy. Disabling VSync is a 'quick and dirty' way to minimise nasty jitters that should otherwise have been designed out, IMO.
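
(For reference, the 33ms is just the per-frame budget at a 30 fps target; quick arithmetic:)

```python
# Where the 33 ms comes from: the per-frame budget at a given target frame rate.
for target_fps in (60, 30, 20):
    print(f"{target_fps} fps target -> {1000.0 / target_fps:.1f} ms per frame")
```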
 
london-boy said:
The console games that don't use vsync don't show a lack of knowledge on the devs' part. If anything, the opposite.
These games don't have vsync enabled because the devs couldn't get the framerate stable enough. In those circumstances, they would rather have the screen tear than have the game's framerate go all the way down to 15fps (with vsync, it always has to be a multiple of 30). That way, the game slows down less noticeably than with vsync enabled, as it can go down to 29 or 28 or 27 or whatever fps the hardware can handle at the time; the screen tears, however.

Probably correct here. I was just hoping, out loud, that tiling doesn't make vsync prohibitive in any way. I'm pulling that theory out of very thin air, of course. :)

Shifty Geezer said:
From my perspective, if it's dropping down to 29 or 28 and that's why they lose the VSync, they should cut back on the engine so it never takes more than 33ms to process a frame. I hate that the first thing to go in any endeavour is quality. If something is to be cut back, for cost reasons, performance, or anything else, the creators will tend to keep all the features and keep them rough, rather than drop a few and keep everything classy. Disabling VSync is a 'quick and dirty' way to minimise nasty jitters that should otherwise have been designed out, IMO.

Or put an option in the menu which disables those features to provide vsync for those who want it. Probably opens up a big can of worms though.
 
expletive said:
Or put an option in the menu which disables those features to provide vsync for those who want it. Probably opens up a big can of worms though.
If the game just keeps dipping under 30 fps, VSync is going to really hurt with constant jitters. But disabling VSync brings the dreaded tearing. Providing it as an option is only going to let users choose between bad graphics and really bad graphics :p

I can't see a fair way to explain disabling VSync when you have the possibility to design for and target a specific framerate and engineer everything to fit that. Unless it's near the deadline and you find problems with slight drops and quickly fix them by disabling VSync, which I guess could be the reason in a lot of cases. But personally I'd rather devs target 90% hardware performance and be sure of hitting the mark than try to push the envelope for that extra bit and break the smooth, consistent framerate.

On PCs it's a different matter because of the wide variety of machines. When playing NWN for the first time on my (then new) GF4200, I tried without VSync and got terrible tearing, something like 4 tear lines per update, and then switched on VSync and got 5 jitters a second on an otherwise 60 fps update. Ouch! That's why I don't like PC gaming, and that's what I don't want from closed-box consoles.

I remember R-Type on the Sega Master System would have shimmering on sprites when it was drawing too much on screen, with some lines not being drawn. It kept the frame rate up and smooth, which was much preferable IMO to slowdown. Of course I may be very much in the minority :(
 
function said:
Even games without vsync use double buffering, so enabling vsync won't require an additional buffer. Or are you referring to triple buffering to improve overall frame rates (when used with vsync), but at the cost of added latency?
Triple buffering doesn't add any latency over double buffering with vsync; it just requires an extra framebuffer's worth of memory to store the finished frame while moving on to the next one, instead of sitting idle waiting for the buffer swap, which is what has to happen with only double buffering and vsync.
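
Here's a toy comparison to illustrate (my own back-of-the-envelope model with made-up render times and an idealised 60Hz display, not timings from any actual game):

```python
# Toy model: 60Hz display, vsync on, constant render time per frame.
import math

REFRESH_MS = 1000.0 / 60.0      # ~16.7 ms between vblanks

def double_buffered_fps(render_ms):
    # Only one back buffer: once a frame is finished the GPU sits idle
    # until the swap at the next vblank, so each frame occupies a whole
    # number of refreshes and the rate snaps to 60, 30, 20, 15, ...
    return 60.0 / math.ceil(render_ms / REFRESH_MS)

def triple_buffered_fps(render_ms):
    # A spare back buffer lets the GPU start the next frame immediately,
    # so it delivers frames at its natural rate, capped by the refresh rate.
    return min(60.0, 1000.0 / render_ms)

for render_ms in (15.0, 20.0, 25.0):
    print(f"{render_ms:4.1f} ms/frame: "
          f"double+vsync {double_buffered_fps(render_ms):.0f} fps, "
          f"triple+vsync {triple_buffered_fps(render_ms):.0f} fps")
```

The point being that with a constant 20ms render time you get roughly 30 fps double-buffered but around 50 fps triple-buffered, both with vsync on.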

function said:
Keeping vsync on until you drop below the target frame rate seems like the best way to use resources: a much less noticeable drop in frame rate, but at the cost of tearing.
I'd rather just see, say, 30 unique textures in a scene with vsync always on and triple buffering, than 34 textures with a big tear across the screen every time the framerate drops.
 
london-boy said:
The console games that don't use vsync don't show a lack of knowledge on the devs' part. If anything, the opposite.
These games don't have vsync enabled because the devs couldn't get the framerate stable enough. In those circumstances, they would rather have the screen tear than have the game's framerate go all the way down to 15fps (with vsync, it always has to be a multiple of 30). That way, the game slows down less noticeably than with vsync enabled, as it can go down to 29 or 28 or 27 or whatever fps the hardware can handle at the time; the screen tears, however.
See, I've seen that argument from developers as well, which shows a lack of knowledge does have something to do with it in at least some cases, as triple buffering solves the issue you mention.

Oh, and they are vsyncing to the refresh rate, which is 60Hz on TVs; so double buffering with vsync gives you 60, 30, 20, 15 and so on, rather than dropping straight from 30 to 15 fps as it would if TVs refreshed at 30Hz.
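
To put rough numbers on that (assuming a 60Hz display, plain double buffering, and constant render times picked purely for illustration):

```python
# Back-of-the-envelope numbers for a 60Hz display with double buffering.
import math

REFRESH_MS = 1000.0 / 60.0                                    # ~16.7 ms per vblank

for render_ms in (16.0, 17.0, 25.0, 34.0, 50.0):
    vblanks = math.ceil(render_ms / REFRESH_MS)               # refreshes each frame occupies
    vsync_fps = 60.0 / vblanks                                # snaps to 60, 30, 20, 15, ...
    tear_fps = 1000.0 / render_ms                             # no vsync: tracks raw render time
    print(f"{render_ms:4.1f} ms/frame -> vsync {vsync_fps:4.1f} fps, no vsync {tear_fps:4.1f} fps")
```

Which is the crux of it: a game rendering in 17ms shows nearly 59 fps with tearing, but snaps down to 30 fps once it's vsync'd with only two buffers.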
 
The "if you don't hit 30 fps, it'll be 15" or less with vsync part is just not true. I am to bored typing two paragraphs to point out the obvious, just search for it in the pc forums...
 