The Game Technology discussion thread *Read first post before posting*

QAA is my major bugbear with the PS3. When a game uses this technique it looks so blurry that it's nigh on impossible for me to play. I may be over-sensitive to it, I don't know, but I find it so distracting. I know some games use QAA and look pretty clean, but from those shots the PS3 version just looks plain blurry to me, especially in the background areas.
 

You can't tell when you play it. I've played both and they looked almost identical. QAA is a positive IMO - the blurring is very slight. I mean, it's used for TRU.
 
QAA is not a positive for me; many games that use it look blurry to me, to the point where I want to adjust the focus on the screen. It's that irritating. I know there are QAA games that look fine, but if RE5 does look like those screens I will notice in-game for sure.
 
QAA is quite evident in all the PS3 games I've played that had it, like Resistance and Ratchet and Clank. I dislike the texture blurring; it kind of defeats the purpose of having high-res textures when you're going to blur them anyway.

But it's obviously better than having no AA (or other aliasing reduction techniques).
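For reference, the softening people are describing falls out of the quincunx resolve itself. Below is a minimal sketch, using the commonly cited weighting (1/2 for the pixel's own sample plus 1/8 for each of the four corner samples it shares with its neighbours); the single-stored-sample-per-pixel simplification is mine, not how the hardware actually stores samples:

```python
def quincunx_resolve(img):
    # img: 2D list of floats, one sample per pixel (simplified).
    # Quincunx weighting: 1/2 for the pixel's own sample plus
    # 1/8 for each of the four diagonally shared corner samples.
    h, w = len(img), len(img[0])

    def at(y, x):  # clamp-to-edge addressing
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            corners = (at(y - 1, x - 1) + at(y - 1, x + 1) +
                       at(y + 1, x - 1) + at(y + 1, x + 1))
            out[y][x] = 0.5 * at(y, x) + 0.125 * corners
    return out

# A one-pixel-wide bright line stands in for sharp texture detail:
img = [[0.0] * 5 for _ in range(5)]
img[2] = [1.0] * 5
out = quincunx_resolve(img)
# The line's own row drops to 0.5 and bleeds 0.25 into its neighbours.
```

The point is that the weighting is applied everywhere, not just on edges: even a fully covered surface gets blended with its diagonal neighbours, which is exactly the texture softening being complained about.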
 
There is some noticeable difference in some of the shots. I am not talking about aliasing; I am talking about texture detail. There is one specifically that shows a significant change: what appears as a sharp, detailed texture on the 360 is blurrier, with less defined detail, in the PS3 version.

I wasn't expecting that difference.
Me neither; there appear to be some peculiarities with the lighting and colouring too.

I remember watching a video comparison, on GT I think, and the two versions looked nearly identical aside from a few missing shadows (only in the helicopter scene) in the PS3 version. I need to make a closer assessment really.
 
Maybe this is a stupid question, but why on earth would you use AA while you're doing motion blur? To me it makes perfect sense to only do AA when you need it, especially as I can imagine motion blur and AA competing for resources. And this game does seem to make very specific use of motion blur, in that it seems to activate only when you are doing certain actions.
Just thinking, wouldn't this cause peculiarities in IQ or performance, if not both?

I would think it would be better to decide from the outset the anti-aliasing that will be applied to the scene; while motion blur can help hide aliasing, it's a process that could be used at any moment rather than constantly.

I'm not sure if trying to toggle AA and motion blur would really help. Furthermore, the resource commitment for every effect that will be used should be decided in advance, so that there are no issues with two things competing for resources.
 
Insomniac Games updated their R&D page with some new articles. I did a quick scan and didn't see that it was posted here before.



http://www.insomniacgames.com/tech/articles/1108/files/Ratchet_and_Clank_WWS_Debrief_Feb_08.pdf

and this http://area.autodesk.com/index.php/ext_stories/insomniac_interview/ an interview with a senior character artist. It mentions the number of tris used for the models, etc.: 60K for the big bosses and on average 5K to 10K for the models.

I was looking at that interesting article and added up all the memory allocations on the memory map; the total only comes to 390 MB, quite a bit short of the 512 available. Is this just the minimum RAM usage of the engine, which will grow depending on the workload?

It also mentions they are using 20 MB for the framebuffer, stored in VRAM. With regards to the 360, is it possible to store the framebuffer in main memory, or does it have to be stored in the eDRAM?

Is this why TRUW is running at a much higher resolution on the PS3? Seeing as the engine doesn't support tiling, they can't match the PS3's larger framebuffer, and are forced to go with a lower resolution or no AA (seeing as the 360 is already doing motion blur, couldn't 720p with temporal AA be another option?).

PS: Any reason they aren't using 1024x600 2xAA like COD4/5, Oblivion etc?

And by extension, could you say that games like COD4/5 which run at 600p on both consoles, are limited by the 360? (due to the edram size).

eg. could COD4 have run at a higher resolution on the PS3, but to keep all parties happy and ensure crossplatform parity, they chose to have the same resolution on both consoles?

And so is TRUW the first MP game that has a clear advantage in both resolution and AA in favour of the PS3?
Will more devs (eg. IW) go down this asymmetric route in the future?
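For what it's worth, the resolution/AA combinations being discussed can be sanity-checked against the 10 MB of eDRAM with some back-of-envelope arithmetic. This assumes 4 bytes per colour sample and 4 bytes per depth/stencil sample, which is the usual ballpark for these games, not a figure confirmed by any of the developers mentioned:

```python
EDRAM = 10 * 1024 * 1024  # Xenos eDRAM capacity in bytes

def fb_bytes(width, height, msaa_samples):
    # 4 bytes colour + 4 bytes depth/stencil, per MSAA sample
    return width * height * (4 + 4) * msaa_samples

# 1024x600 2xAA squeezes in; 720p fits only without MSAA
for w, h, msaa in [(1024, 600, 2), (1280, 720, 1), (1280, 720, 2)]:
    size = fb_bytes(w, h, msaa)
    verdict = "fits" if size <= EDRAM else "needs tiling"
    print(f"{w}x{h} {msaa}xAA: {size / 2**20:.2f} MB -> {verdict}")
```

Under those assumptions, 1024x600 with 2xAA comes to about 9.4 MB, just under the limit, while 720p with 2xAA needs about 14 MB, which is presumably why 600p 2xAA keeps coming up as the no-tiling sweet spot.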
 
If CoD4/CoD5 were at full 720p on PS3, they would be running at a lower frame rate. As both games already run around 10%-20% slower than the 360 versions on average, I'm not sure this would be a particularly good idea. Realistically you would be moving from the perceived 60fps the games currently have on PS3 to a locked 30fps.

With regards to Tomb Raider, on like-for-like gameplay, the 360 version is locked at 30fps - maybe just the odd dropped frame in 12,000 captured. The PS3 version on the other hand is *mostly* 30fps, but can drop hard at any given point. The same gameplay run-through sees the PS3 version drop to a low of 18fps at one point.

In short, both versions are compromised, or to look at it more positively, the 360 version is optimised for speed and perceived response from the controls, whereas the PS3 version is optimised for a significantly higher IQ.
 

I do realise that both COD games run worse on the PS3.
But with all the dev tools (Edge) being developed by Sony, and as devs get more used to the system, they may find that they can match or exceed the framerate of the 360 game.

As you said yourself, Treyarch didn't really add many improvements to the IW engine for COD5; perhaps this year we may see a far more optimised engine from IW.
Then it could be the case that COD6 is restricted to 600p for the sake of the 360, as it's highly unlikely that they can incorporate tiling into the 360 code and have both versions run at a higher resolution.

And with regards to the inferior frame rate of the PS3 code, you have to give the devs some credit: it's pushing 50% more pixels than the 360 while maintaining parity in other areas like textures, lighting and AA (unlike Oblivion for PS3).
 
Anything is possible, but your premise essentially suggests that development on 360 has reached a standstill. It's the old 'untapped power of the PS3' argument. My take on that is that you're looking at an engine specifically written for the hardware to see the benefit. Cross-platform developers do not generally have that luxury.

I recently spoke with a very prominent, and technically successful cross-platform developer and the performance analysis of their previous game on both 360 and PS3 showed equal room for improvement going forward.
 

Don't TR:UW and COD4/5 both run on multiplatform engines, and so are not 'specifically written for the hardware to see the benefit' (or are you talking about KZ2?)?

I realise that the 360 has untapped potential (in fact I'm one of the proponents of that particular argument).

In this case I was talking about a specific limitation of the 360 hardware that's going to be very hard to overcome by improvements to the 360 code.

From what I can garner, it is quite hard to introduce tiling into an MP engine that wasn't designed for it (AFAIK the only MP engines that do support tiling are Capcom's MT Framework and Bethesda's Fallout 3 engine).

So while 360 graphics will definitely improve, it may be hard for MP devs to compensate for the limited edram size, and so PS3 games could have a resolution or AA advantage in the future. TR:UW could be the first indication of this.

Unless devs choose not to use the PS3's advantage in this area and go for parity instead, keeping the resolution the same for both versions (i.e. in COD's case, 600p).
 
So while 360 graphics will definitely improve, it may be hard for MP devs to compensate for the limited edram size, and so PS3 games could have a resolution or AA advantage in the future
What? PS3 now has more available RAM than x360?
 

Well, for the framebuffer, yes, I suppose one could say that, given that the 360 framebuffer is held within eDRAM. Then we get back to notions of "tiling" and the necessary bandwidth, throughput limitations, etc.


yadda yadda yadda :D
 
Is this an x360 UMA limitation?

Well, issues with resolution would be eDRAM size limitations. On the 360 the framebuffer is held within eDRAM. In order to achieve higher resolutions and AA levels, "tiling" is often needed given the limited amount of eDRAM. There have been numerous threads on tiling implementations in the past.
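To make the tiling point concrete, here's a rough estimate of how many tiles a given render target would need. It uses the same assumed 8 bytes per sample (colour plus depth/stencil); actual tile counts depend on hardware tile-dimension constraints, so treat this as an approximation rather than what any particular engine does:

```python
import math

EDRAM = 10 * 1024 * 1024  # bytes of eDRAM

def tiles_needed(width, height, msaa_samples, bytes_per_sample=8):
    # Each tile's slice of the render target must fit in eDRAM;
    # geometry overlapping multiple tiles has to be resubmitted,
    # which is why engines need to be designed for tiling up front.
    size = width * height * bytes_per_sample * msaa_samples
    return max(1, math.ceil(size / EDRAM))

print(tiles_needed(1024, 600, 2))  # 1 tile: no tiling needed
print(tiles_needed(1280, 720, 2))  # 2 tiles
print(tiles_needed(1280, 720, 4))  # 3 tiles
```

The resubmission cost grows with the tile count, which is the trade-off behind the 600p-versus-720p decisions being discussed above.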
 
Thanks.
But I mean, overall performance-wise (bandwidth, throughput limitations, etc.), is this an x360 UMA limitation, as opposed to the PS3's NUMA?
 

What? Do you mean Unified Memory Architecture or Unified Shader Architecture?

Most devs prefer the 360's UMA over the PS3's NUMA, as it's more flexible.
And the 360 has more available RAM than the PS3; but we are talking about the limited size of the eDRAM used to hold the framebuffer here.

As I understand it, the 360 has to use it, i.e. it can't store the framebuffer in main memory, unlike the PS3.

Which could possibly lead to the situation in my above post.
 
If CoD4/CoD5 were at full 720p on PS3, they would be running at a lower frame rate. As both games already run around 10%-20% slower than the 360 versions on average, I'm not sure this would be a particularly good idea. Realistically you would be moving from the perceived 60fps the games currently have on PS3 to a locked 30fps.

With regards to Tomb Raider, on like-for-like gameplay, the 360 version is locked at 30fps - maybe just the odd dropped frame in 12,000 captured. The PS3 version on the other hand is *mostly* 30fps, but can drop hard at any given point. The same gameplay run-through sees the PS3 version drop to a low of 18fps at one point.

In short, both versions are compromised, or to look at it more positively, the 360 version is optimised for speed and perceived response from the controls, whereas the PS3 version is optimised for a significantly higher IQ.

Yes, and on the 360 too, 30fps would be the only way to get 720p with a stable frame rate.
 
I recently spoke with a very prominent, and technically successful cross-platform developer and the performance analysis of their previous game on both 360 and PS3 showed equal room for improvement going forward.

Ooooh so you talked to Criterion? What else did they say?

/bait ;)
 
What? Do you mean Unified Memory Architecture or Unified Shader Architecture?
Memory Architecture of course :)
Many thanks for explanation
As I understand the 360 has to use it, ie it can't store the framebuffer in main memory, unlike the PS3
Strange, I read that the final framebuffer is stored in main memory and the eDRAM holds only part of it (the backbuffer), but I'm not a specialist.
Is it really impossible not to use the eDRAM, and instead use AA tricks similar to the PS3's on the Xbox?
 