The Game Technology discussion thread *Read first post before posting*

I wouldn't get too excited about an Unreal release. They put out a new release every month on a completely unpredictable day.
 
But I noticed it only on the player's car, and it was quite blurred.
Don't know about Shift, but Dirt 2 has some awesome reflections [though the environment has really heavy pop-up]:
http://www.youtube.com/watch?v=32SbHdtxSrA

Having that on one car is enough; it ain't a cheap effect. The blurriness certainly comes from lower-res reflections, but that's true for GT5 too. Shift does it nicely, though only in the bonnet-cam view, where it renders correct reflections of the cars surrounding you, with the reflected cars using the same LOD and transparency effects as the ones on the track. Consider that if you've got 15 cars in a race, you get 15 cars reflected, along with the effects applied to them. If all cars reflected all the other cars on their bodies, the framerate would take a severe hit.

Dirt 2 and GRID have nice reflections, but the reflections don't take into account all track objects, IIRC.
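To make the cost argument above concrete, here's a hypothetical back-of-the-envelope sketch. The function and its numbers are purely illustrative (not from any real engine): it just counts how many extra objects get re-rendered into reflection maps when only the player's car reflects versus when every car does.

```python
# Hypothetical sketch of why per-car reflections get expensive.
# Numbers are illustrative assumptions, not from any real engine.

def reflection_draws(num_cars: int, reflective_cars: int) -> int:
    """Each reflective car re-renders every *other* car into its
    reflection map, so cost grows with reflective_cars * (num_cars - 1)."""
    return reflective_cars * (num_cars - 1)

cars = 15
player_only = reflection_draws(cars, reflective_cars=1)     # 14 extra draws
everyone = reflection_draws(cars, reflective_cars=cars)     # 210 extra draws

print(player_only, everyone)  # 14 vs 210 -- a 15x jump in reflection work
```

The quadratic blow-up is why racers typically reflect only on the player's car, or use cheaper cube-map approximations for the rest of the field.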
 
Welp, based on that shot, the PS3 edge indicates 540 vertical. The 360 is at a higher resolution than that (different, but not too far from 1024x600).

I'm still looking for longer stretches before I give anything more exact than that, but that's the gist of it.
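For a rough sense of what those numbers mean, here's a small pixel-count comparison. The 960 width is an assumption on my part (the post only establishes 540 vertical); it's just a common sub-HD pairing used to illustrate the gap to full 720p.

```python
# Rough pixel-count comparison for the resolutions mentioned above.
# 960x540 is an assumed width; only the "540 vertical" is stated.

def pixels(w: int, h: int) -> int:
    return w * h

full_720p = pixels(1280, 720)
ps3_guess = pixels(960, 540)    # assumed framebuffer for "540 vertical"
x360_guess = pixels(1024, 600)  # figure quoted in the post

print(ps3_guess / full_720p)   # ~0.56 of a full 720p buffer
print(x360_guess / full_720p)  # ~0.67 of a full 720p buffer
```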

Yikes! What happened?!
 
Yikes! What happened?!

Hard to say without a framerate analysis. It's hard to compare what in particular is more taxing between this game and earlier ones, considering the different settings and whatever else they may be doing (lighting, shaders, effects, etc.). I do get the impression that it's got higher-quality lighting than before.
 
I hope the forthcoming analysis includes the Theater mode. I had high hopes that maybe they'd cut replays down to 30fps and thus increase resolution and effects, but now...
 
I've read that unlike MW2, this game doesn't scale to 1080p on the PS3; it only outputs 720p and then it's up to the TV.

With all my options checked in video settings, the game boots in 1080p by default.

BTW, does forcing the game to run in 720p help the framerate at all?
 
No bug that I can see, just different positions.
From that shot there are no major differences between PS3 and Xbox 360 (though there's a big difference to the PC, which will get 2% of the sales [smile]); perhaps higher-quality textures in the Xbox version, see the cigarette packet (the guy in the back has a different colour shirt, I see).

PC gamers have high quality standards. Imagine if everyone swallowed things hook, line and sinker on first contact with each sequel and series... oh wait! :LOL:

Anyway, I don't know if it's a LOD bug, but it looks like the window's wood panels are sprites on PS3 and geometry on PC/360. The lighting seems more natural in the PC version, unless the console versions have contrast upped to ridiculous levels, as the whites are way off.
 
Something I noticed in that picture: the map on the table seems to have a higher-res texture in the PS3 shot, almost as good as the PC shot (minus the added crispness the PC version gets from its higher native res).

EDIT: Never mind, got the answer in the DF article; it seems to be a mipmap issue, same for the cigarette pack.
 
It really is a shame that some devs can't release their SPU usage information (charts of what systems are running within a frame, etc.). I would love to see the free SPU time left over, or how low the number of jobs on the SPUs is for CoD:BO! I also wonder about the streaming system in the PS3 version (whether it's basically the same as the 360's).

It seems as if this game is built to maximize all the 360's strengths and almost none of the PS3's strengths. Of course, that would yield results similar to what we see here.
 
Would definitely be interesting. I'm surprised, by the way, about Barbarian's comment on main memory generally being harder to spare, after all the past complaints about the split memory pool on the PS3 not leaving enough room for textures in graphics memory.

As is typical, it depends on the game. The simplest example is AI: what if you need a humongous data set resident in RAM at all times? What do you do, compromise AI to try and implement MLAA? The simplest solution is to lead on PS3; that way you can restrict all systems to its parameters, and everything will automatically port perfectly to the other machine. For example, say your AI coder one day is all giddy about a new system he developed and wants to demo it to everyone. One of two things happens, depending on which platform he is working on:

1) He demos it on 360. It looks great, but then he reveals that he needs 110 MB reserved for his system at all times. Crap, now what do we do? It looks legit, but it simply won't work on PS3. Who in heck has 110 MB of DRAM to spare on PS3? Sorry, but it needs to be pared back or scrapped.

2) He demos it on PS3. It looks good. More memory would improve it, make it more human-like, but hey, we don't have more memory, and this version will automatically work on the 360. So we go with it, because it's ready to go on all platforms.

Note... that's not a fictional example, but I won't give any more details than that :) Suffice to say that you can run out of either memory pool *very* easily. Another example: look at all the post-processing that is done in today's games. You simply can't do it all in PS3 VRAM; the 360 version will absolutely kill it in performance. This was a major problem with very early PS3 ports: their performance was ass, and a quick look showed it was because the split buses were not being used effectively. Note, they didn't have any choice; they were totally out of the 256 MB of DRAM on PS3, so they had to shove all the graphics work into VRAM. Fast forward a few years, cut back DRAM use, move the render targets around between the various pools of RAM, and look at that, an extra 9 fps! For performance reasons you have no choice but to use a chunk of that 256 MB of DRAM as graphics RAM at all times. Note... that is also not a fictional example.

Generally speaking, you want to keep all code-related DRAM use under 150 MB for it to work right on PS3. If you don't, you will hate life when you try to make it work. That is limiting for things like AI, which loves large data sets, but if your game has primitive AI then you're in luck, because you'll have more DRAM to play with for graphics.
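The budgeting rule above can be sketched as a toy check. The pool sizes match the PS3 (256 MB XDR main RAM plus 256 MB GDDR3 VRAM), and the under-150 MB code/data guideline comes from the post; the function itself and the sample allocations are purely illustrative assumptions, not figures from any shipped game.

```python
# Minimal sketch of the split-pool budgeting described above.
# Pool sizes match the PS3; the sample allocations are made up.

MAIN_RAM_MB = 256          # XDR main memory
VRAM_MB = 256              # GDDR3 video memory
CODE_DATA_BUDGET_MB = 150  # rule of thumb quoted in the post

def fits(allocs_main: dict, allocs_vram: dict) -> bool:
    """Check both pools, and keep non-graphics main-RAM use under budget
    so render targets can still be spilled into main memory for speed."""
    code_data = sum(sz for name, sz in allocs_main.items()
                    if not name.startswith("gfx_"))
    return (sum(allocs_main.values()) <= MAIN_RAM_MB
            and sum(allocs_vram.values()) <= VRAM_MB
            and code_data <= CODE_DATA_BUDGET_MB)

# A 110 MB AI data set plus everything else blows the code/data budget,
# even though neither pool is actually full:
main = {"engine": 40, "ai": 110, "audio": 20, "gfx_targets": 60}
vram = {"textures": 180, "framebuffers": 60}
print(fits(main, vram))  # False: 170 MB of code/data > 150 MB guideline
```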
 
It really is a shame that some devs can't release their SPU usage information (charts of what systems are running within a frame, etc.).

Did you read the thread?
SPU utilization levels on their own don't tell much of the story, they could be busy 100% of the time running inefficient code...

I would love to see the free SPU time left over, or how low the number of jobs on the SPUs is for CoD:BO! ... It seems as if this game is built to maximize all the 360's strengths and almost none of the PS3's strengths. Of course, that would yield results similar to what we see here.

Again, did you read the thread?? There's been several warnings about this kind of talk.
 
As is typical, it depends on the game. The simplest example is AI: what if you need a humongous data set resident in RAM at all times? What do you do, compromise AI to try and implement MLAA? The simplest solution is to lead on PS3; that way you can restrict all systems to its parameters, and everything will automatically port perfectly to the other machine. For example, say your AI coder one day is all giddy about a new system he developed and wants to demo it to everyone. One of two things happens, depending on which platform he is working on:

1) He demos it on 360. It looks great, but then he reveals that he needs 110 MB reserved for his system at all times. Crap, now what do we do? It looks legit, but it simply won't work on PS3. Who in heck has 110 MB of DRAM to spare on PS3? Sorry, but it needs to be pared back or scrapped.

2) He demos it on PS3. It looks good. More memory would improve it, make it more human-like, but hey, we don't have more memory, and this version will automatically work on the 360. So we go with it, because it's ready to go on all platforms.

Note... that's not a fictional example, but I won't give any more details than that :) Suffice to say that you can run out of either memory pool *very* easily. Another example: look at all the post-processing that is done in today's games. You simply can't do it all in PS3 VRAM; the 360 version will absolutely kill it in performance. This was a major problem with very early PS3 ports: their performance was ass, and a quick look showed it was because the split buses were not being used effectively. Note, they didn't have any choice; they were totally out of the 256 MB of DRAM on PS3, so they had to shove all the graphics work into VRAM. Fast forward a few years, cut back DRAM use, move the render targets around between the various pools of RAM, and look at that, an extra 9 fps! For performance reasons you have no choice but to use a chunk of that 256 MB of DRAM as graphics RAM at all times. Note... that is also not a fictional example.

Generally speaking, you want to keep all code-related DRAM use under 150 MB for it to work right on PS3. If you don't, you will hate life when you try to make it work. That is limiting for things like AI, which loves large data sets, but if your game has primitive AI then you're in luck, because you'll have more DRAM to play with for graphics.
I've actually had this conversation a while back with co-workers. The consensus is that we made it too easy for devs to get stuff working on our console, and now it's backfiring on us by making developers lead on PS3.
 