The Technology of GTA IV/RDR *Rage Engine*

Why? There's no option to turn off vsync in the PS3 version of GTAIV.

Because if you're conducting a test where you want to compare two things, in the interests of fairness you'd want to account for the effects of vsync.

Typing "vsync" into Google came up with a couple of articles which could act as a starting point:
http://www.hardforum.com/showthread.php?t=928593
www.tweakguides.com/Graphics_9.html

So how does this affect the reported framerates? That's a question that article should be asking IMO.
 
Though those are some low average framerates on the PS3.
True, but the max it could have achieved was 30 fps.

You can't compare vsync enabled vs. not enabled.
It's very possible that the one with vsync turned on has a much higher minimum fps but a lower average fps.
E.g., in fps:
A/ 30, 30, 30, 29, 26, 25, 30 = ~28.6 fps average
B/ 80, 75, 50, 25, 20, 15, 50 = ~45 fps average
 
You can update the Hi-Z when switching tiles. There's a .pptx somewhere talking about loading a 720p screen of Z values from memory and updating the Hi-Z in 0.17ms. About 1/100th of GTA4's render time, and the "lazy" method is only twice that.

Hm... if memory serves me properly, I believe you're referring to these slides. :)

[Attached: 360-PerfSlide-1.png]

[Attached: 360-PerfSlide-2.png]

Xbox 360 GPU Performance Update
http://download.microsoft.com/downl...324bad9b1/Xbox 360 GPU Performance Update.zip

Personally, I would be happier if they did 640p with 4xAA and better shadow filtering (and of course no texturing bug/whatever). It would probably have a higher framerate, too.


Indeed!
 
True, but the max it could have achieved was 30 fps.

You can't compare vsync enabled vs. not enabled.
From what grandmaster is saying, very few frames on 360 show tearing. There's a good chance that V-sync is enabled with some conditions disabling it.

You can't have 26 fps average in two tests without v-sync and not notice tearing.

grandmaster, do you have the source code for the frame counter? As long as the hardware is drawing slower than 60fps, it wouldn't be hard to modify the code so that it took lack of v-sync into account.
 
I did love the bit where "obvious pop-in on 360 that the PS3 version coped better with" concludes that any pop-in whatsoever is "something you wouldn't expect from the PS3 code".

Yeah, he did skip that part very fast, didn't he? :)

This bit:
What is curious to me is that I can see no technical reason why the 360 game shouldn't just be a more detailed, smoother version of the PS3 version. Indeed, if the texture-dither filter could be turned off with a selectable option in a forthcoming patch, I'm almost certain that it would be the superior-looking game simply by virtue of the extra resolution and edge-filtering. But as it is, right now, there's not much in it.

Wouldn't the texture filter stuff have to be replaced by something else to provide for the DOF effect?
 
From what grandmaster is saying, very few frames on 360 show tearing. There's a good chance that V-sync is enabled with some conditions disabling it.

You can't have 26 fps average in two tests without v-sync and not notice tearing.

grandmaster, do you have the source code for the frame counter? As long as the hardware is drawing slower than 60fps, it wouldn't be hard to modify the code so that it took lack of v-sync into account.

Do you agree with the following statement?

platform A vsync on, B vsync off

If A has a higher fps than B, then A is definitely faster.
If B has a higher fps than A, it is perhaps faster (maybe some readers think that by "perhaps" I mean 90%, but no, I mean there's anywhere from a 0-100% chance; we just can't tell).

You can't compare something with vsync off to something with vsync on.
The Eurogamer article glosses over this; in fact, with platform A vsync on and B off, the only thing you can logically deduce is that B is slower (when A's fps is higher).

Look at the vast majority of PC games which they benchmark: the framerate varies greatly depending on what's happening on/off screen.
E.g. the average might be 40 fps but in-game it's going between 20-100 fps; btw, in this case the median framerate is probably ~30 fps, which is a far more accurate measurement than average fps (*).

(*) Aside: why do most (all?) review sites use average fps in benchmarks?!
(For those who don't know statistics, look up "median" on Wikipedia.)
They should be using median or lowest fps, as this is far more useful.
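To illustrate the point, here's a minimal sketch (the fps samples are invented, not from any real benchmark) showing how a single framerate spike inflates the average while the median stays near the typical value:

```python
from statistics import mean, median

# Hypothetical per-second fps readings: mostly ~30 fps, one brief spike.
fps_samples = [20, 25, 30, 30, 35, 40, 100]

print(f"mean:   {mean(fps_samples):.1f} fps")    # 40.0 - inflated by the spike
print(f"median: {median(fps_samples):.1f} fps")  # 30.0 - closer to what you actually experience
```

A review quoting "40 fps average" here would be reporting a number the player almost never sees on screen.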
 
joker454 said:
I wonder if there is some kind of synchronization limitation with their PS3 engine which prevents them from disabling v-sync on that box.
Possibly, but it strikes me as rather odd given how many sync options there are available for the PS3 innards.
I mean, I'm coming from the perspective of machines where we did self-synchronizing, self-looping DMA lists with real-time list modification, and the hw-supported sync mechanisms were FAR more primitive.
That said, it's always a matter of software design more than anything - so yea, who knows.

Mintmaster said:
You can't have 26 fps average in two tests without v-sync and not notice tearing.
Sure you can - I've done loads of testing in that respect. But it's one thing what you see on screen when playing, and another counting captured frames one by one.

As long as the hardware is drawing slower than 60fps, it wouldn't be hard to modify the code so that it took lack of v-sync into account.
Actually yea - it should be measured in time units rather than frame counts.
Eg:
1 solid frame ~ 16.7ms.
2 solid frames ~ 33.3ms.
1 solid frame + 33% of a frame ~ 22.2ms.
... well you get the idea. All you really need to add is logic to measure the pixel height of torn frames.

Should be possible to construct a min/max fps graph from this too.
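As a rough sketch of that idea (the per-refresh data is invented, and treating a torn frame as a fraction of a frame by its pixel height is an assumed model, not anyone's actual counter code):

```python
# Count frames in time units from a hypothetical 60 Hz capture.
# Each entry is the fraction of the screen covered by NEW pixels in that
# refresh: 0.0 = duplicate frame, 1.0 = full new frame, in between = torn.
REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms per scan-out

new_pixel_fractions = [1.0, 0.0, 1.0, 0.33, 0.67, 1.0, 0.0, 0.0, 1.0]

frames_shown = sum(new_pixel_fractions)            # fractional frame count
capture_ms = len(new_pixel_fractions) * REFRESH_MS
avg_frame_ms = capture_ms / frames_shown
print(f"{frames_shown:.2f} frames over {capture_ms:.1f} ms "
      f"-> {avg_frame_ms:.1f} ms/frame ({1000.0 / avg_frame_ms:.1f} fps)")
# ~30 ms/frame, i.e. ~33.3 fps, even though only 5 "whole" frames appeared
```

A min/max graph then just needs the same sum computed over a sliding window instead of the whole capture.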
 
Lead platform always benefits, no question about that. We don't really know though if they lead on 360 or did parallel development.
What we know for sure is 360 version was always the lead, at least in terms of progress.
It also looks like assets ended up being same in both versions.
I don't know what other conclusion can be there.
Things like post processing differences, or a couple of parameter tweaks don't really make a game "parallel developed", me thinks.
If 100 million bucks isn't enough to hire enough programmers, then how much is?
Good question, but I'm not convinced ps3 development scales linearly with money. ;)
 
Cool, so I'm not insane :) Now I have to wonder wtf is wrong with these other online mags. Are they playing a different game? Very strange. This comment was interesting regarding the PS3 blur:

"The upscale and resultant blur helps to make the game look a touch more movie-like; less rendered and less 'gamey' if you will - a good combination for a mainstream audience."

That's kind of what I was thinking: that perhaps some people viewed it as more cinematic. If true, though, then it does not bode well for Blu-ray movies when viewed against regular upscaled DVDs.

Well, that's kinda silly. Movies on Blu-ray still look filmic by virtue of having been shot on film, or if shot digitally, by virtue of the fact that such cameras use film as their reference point. There may be something to what you're saying as far as CGI is concerned. Special effects can look more obviously artificial in HD (although I'm not sure if this isn't the result of compression technology on the movies I've seen).

At any rate, HD movies generally reveal more detail. What you're seeing in GTA4 is an effect that obfuscates the lack of detail in the environment.
 
"The upscale and resultant blur helps to make the game look a touch more movie-like; less rendered and less 'gamey' if you will - a good combination for a mainstream audience."

That's kind of what I was thinking, that perhaps some people viewed it as more cinematic.

How does upscaling cause blur? If I take a PS2 game like ridge racer and upscale it will still look the same. Jaggies will not go away. The blur must be some extra post processing effect.
 
How does upscaling cause blur? If I take a PS2 game like ridge racer and upscale it will still look the same. Jaggies will not go away. The blur must be some extra post processing effect.

For the same reason an upscaled photo looks blurrier and thus loses detail.
 
How does upscaling cause blur? If I take a PS2 game like ridge racer and upscale it will still look the same. Jaggies will not go away. The blur must be some extra post processing effect.
A pixel that is an average between two or more pixels that are further apart than adjacent can be considered a blur. In the case of upscaling, the original pixel values are moved apart and averaged fill-ins space them out. As Nebula says, grab a picture and enlarge it in a photo program. The more you enlarge it, the more blurred it becomes - the amount of information contained in the image is being spread over a wider area.
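A minimal 1-D sketch of that averaging (the `lerp_upscale` helper is just illustrative, not any real scaler): enlarging a hard black/white edge with linear interpolation manufactures in-between grey values, which is exactly the blur being described:

```python
def lerp_upscale(row, factor):
    """Linearly resample a 1-D row of pixel values to factor * len(row) samples."""
    n_out = len(row) * factor
    out = []
    for i in range(n_out):
        # Map the output sample back to a fractional source coordinate.
        src = i / factor
        lo = min(int(src), len(row) - 1)
        hi = min(lo + 1, len(row) - 1)
        t = src - lo
        # Weighted average of the two nearest source pixels.
        out.append(round(row[lo] * (1 - t) + row[hi] * t))
    return out

edge = [0, 0, 255, 255]  # sharp black-to-white edge
print(lerp_upscale(edge, 3))
# [0, 0, 0, 0, 85, 170, 255, 255, 255, 255, 255, 255]
```

The 85 and 170 values exist in no source pixel; they're the grey "transition" pixels that make the enlarged edge look soft.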
 
How does upscaling cause blur? If I take a PS2 game like ridge racer and upscale it will still look the same. Jaggies will not go away. The blur must be some extra post processing effect.

You are right; though upscaling to a similar resolution may cause blurring on parts of the image, that's not why the PS3 version looks non-game-like.
 
If you're upscaling from 640x480 to 1280x960, then the 1280 image can look exactly the same as the 640 image, with four pixels making up the exact same shape as the original one pixel.

However, if you have to do 640 to 720, then you have to figure out something a little more intelligent to decide how to distribute the information over 720 pixels. That would be your post-processing effect right there. Typically it is done really basically with a certain fixed pixel being added or whatever, which is why Quaz can fairly easily pick up the upscaling.

However, this game also has depth of field effects, and it may be that there is a blur effect going on even on the closest pixels. I'm not sure it does, though (haven't looked at it), and the depth of field effect may make it look a lot less bad, because there is simply a significant amount of pixels in the distance that are blurred for depth of field anyway.
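A small sketch of the coordinate mapping being described (the `src_coord` helper is just illustrative): at an exact 2x scale, destination pixels land on clean half-steps through the source, so the scaler can simply duplicate pixels and keep shapes intact, while 640 -> 720 lands at awkward fractions that force blending:

```python
def src_coord(x, src_w, dst_w):
    """Source x-coordinate that destination pixel x samples from."""
    return x * src_w / dst_w

# 640 -> 1280: coordinates step by exactly 0.5, so each source pixel cleanly
# covers two destination pixels and the image can stay pixel-identical.
print([src_coord(x, 640, 1280) for x in range(6)])
# [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]

# 640 -> 720: coordinates step by 8/9, landing between source pixels almost
# every time, so the scaler has to blend (or unevenly duplicate) pixels.
print([round(src_coord(x, 640, 720), 3) for x in range(6)])
# [0.0, 0.889, 1.778, 2.667, 3.556, 4.444]
```

Those fractional positions are exactly the slightly uneven pattern that lets someone like Quaz pick out an upscale from a screenshot.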
 
Yet the game's interpretation of Manhattan looks amazing even with aliasing. Such an incredible piece of work, although some of the missions can get pretty annoying at times ;)
 
I've also noticed that during the night there's no blur...

Btw, the Xbox version is in desperate need of an artifact-removing device ;-)
Both versions have their flaws; depth of field in the PS3 version is not one of them.
 
Yer, I've played both versions now (15 hrs PS3, 2 hrs 360); the 360 does have a better framerate and is crisper...

But somehow I still like the look of the PS3 version (bias, maybe); it just looks more gritty and realistic (maybe the blurring helps :p ). This is after doing a side-by-side comparison...
The 360 version has a plasticky look to it.


If only the PS3 version had a better framerate :(

We need an interview with the devs: what are they using the SPUs for besides Euphoria?
 
Blur

A pixel that is an average between two or more pixels that are further apart than adjacent can be considered a blur. In the case of upscaling, the original pixel values are moved apart and averaged fill-ins space them out. As Nebula says, grab a picture and enlarge it in a photo program. The more you enlarge it, the more blurred it becomes - the amount of information contained in the image is being spread over a wider area.

If I make a 640x480 picture that is half white and half black, then upscale to 1600x1200, it will have no blur.
 
Ehh, I was moderated with my picture, telling the truth too straight . . . so again: the PS3 version during the day is blurry because it is artistic intent. At night everything is clear and sharp; look at the picture below, a photo of a 40" screen from 1 meter. And to be politically correct, all this is my opinion of course.
[Attached: gta4.jpg]

[Attached: gta44.jpg]
 