The Game Technology discussion thread *Read first post before posting*

You can produce specular highlights in LDR with a highlights pass. I'd guess it was HDR though, as it looks like UE3, where HDR and no AA are the standard. No one can say for sure.
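
For what it's worth, the "highlights pass" is basically: threshold the LDR frame for its brightest pixels, blur that, and add it back over the image. A minimal sketch of the threshold step below, with made-up names and values, nothing engine-specific:

Code:
// Minimal sketch of an LDR "highlights pass" threshold step.
#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };

// Relative luminance of an LDR colour (components in [0,1]).
static float Luminance(const Color& c)
{
    return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
}

// Bright-pass: keep only what exceeds the threshold, rescaled so the
// brightest LDR values still produce a visible bloom contribution.
static Color BrightPass(const Color& c, float threshold)
{
    float l = Luminance(c);
    float k = std::max(l - threshold, 0.0f) / std::max(l, 1e-4f);
    return { c.r * k, c.g * k, c.b * k };
}

int main()
{
    Color specular = { 0.95f, 0.92f, 0.85f };   // a bright LDR specular hit
    Color bloom = BrightPass(specular, 0.8f);   // would then be blurred and added back
    std::printf("bloom contribution: %.2f %.2f %.2f\n", bloom.r, bloom.g, bloom.b);
    return 0;
}

In a real renderer the bright-pass output would go through a downsample/blur chain before being added back; the thresholding is the part that lets LDR fake a glow around specular hits.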
 
Jeez, the tearing difference is absurd in the PS3 demo... :cry: What the hell is going on, Codemasters? In theory the PS3 is more suited to vsync...

I think the engine was tailored for Cell and SPU. IIRC they mentioned this in old DIRT interviews. Maybe just the unified shaders doing their magic to help framerate.
 
Before, when the PS3 version was the last to get some developer love, the demo was far worse than the actual retail release was.

Not sure if that is the case this time, but it could be.

Maybe Codemasters altered the code so much for their Neon engine that simply updating to the new PhyreEngine (which incorporates Cell, unlike PSSG) was not feasible.
 
I think the engine was tailored for Cell and SPU. IIRC they mentioned this in old DIRT interviews. Maybe just the unified shaders doing their magic to help framerate.

... DIRT interviews that came after the 360 release and before the PS3 release.
That tells you more about how tailored their engine really was.
 
Ummm...why?
Why not?! Isn't triple buffering more common on PS3 hardware than on the 360 because the 360 aims for steady framerates in everything, while the PS3 aims for a uniform, tear-free picture? :???: I probably haven't used the correct terms, but the substance is the same.
 
Why not?! Isn't triple buffering more common on PS3 hardware than on the 360 because the 360 aims for steady framerates in everything, while the PS3 aims for a uniform, tear-free picture? :???: I probably haven't used the correct terms, but the substance is the same.

It's more of a hardware issue, as the eDRAM makes triple buffering pretty much impossible to use.
 
Why not?! Isn't triple buffering more common on PS3 hardware than on the 360 because the 360 aims for steady framerates in everything, while the PS3 aims for a uniform, tear-free picture? :???: I probably haven't used the correct terms, but the substance is the same.

Both machines can use vsync, or not use vsync. On PS3 because of the tight integration of the cpu in gpu tasks, sometimes it's easier to just vsync to help deal with synchronization issues.


MazingerDUDE said:
It's more of a hardware issue, as the eDRAM makes triple buffering pretty much impossible to use.

You can triple buffer on 360, edram doesn't stop you from doing it. Typically you try to avoid it in some games because it adds input lag and it munches up more memory.
 
You can triple buffer on 360, edram doesn't stop you from doing it. Typically you try to avoid it in some games because it adds input lag and it munches up more memory.

Is that so? Since the eDRAM can barely fit a single buffer, I thought processing two front buffers would have been problematic. Well, I haven't seen a single game using triple buffering on the 360, and there are quite a few multi-platform games that support triple buffering only on the PS3 (e.g. MT Framework). I wonder what the hold-up is?
 
Is that so? Since the eDRAM can barely fit a single buffer, I thought processing two front buffers would have been problematic. Well, I haven't seen a single game using triple buffering on the 360, and there are quite a few multi-platform games that support triple buffering only on the PS3 (e.g. MT Framework). I wonder what the hold-up is?

Edram is not the frame buffer, frame buffers reside in main memory. Edram is the rendering scratch pad that gets resolved out to your frame buffers when you are done with it. You can have two or three, it's the devs choice. One of the MLB 2K games on 360 that I worked on shipped as triple buffered, which was slightly controversial at the time because a small percentage of people claimed they noticed input lag during batting. I believe the later versions went back to double buffered but I wasn't there for those.
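
To put the same idea in sketch form: the render target is a fixed scratch area, and each finished frame gets resolved into one of N buffers in main memory, where N is up to the developer. This is just an illustration with hypothetical names (ResolveFromEdram/FlipToDisplay are not real API calls), not actual 360 code:

Code:
#include <cstdint>
#include <cstdio>
#include <vector>

constexpr int kWidth = 1280, kHeight = 720;
constexpr size_t kFrameBufferBytes = size_t(kWidth) * kHeight * 4;  // ~3.5 MB per 8888 buffer

struct FrameBuffer {
    std::vector<uint8_t> pixels = std::vector<uint8_t>(kFrameBufferBytes);  // lives in main memory
};

class SwapChain {
public:
    explicit SwapChain(int bufferCount) : buffers_(bufferCount) {}

    void EndFrame()
    {
        // The frame was rendered into the fixed eDRAM scratch area; copy the
        // finished image out into the buffer we will present next.
        ResolveFromEdram(buffers_[next_]);
        FlipToDisplay(next_);
        next_ = (next_ + 1) % int(buffers_.size());
    }

    double MainMemoryMB() const
    {
        return double(buffers_.size() * kFrameBufferBytes) / (1024.0 * 1024.0);
    }

private:
    void ResolveFromEdram(FrameBuffer&) { /* placeholder for the resolve copy */ }
    void FlipToDisplay(int index) { std::printf("present buffer %d\n", index); }

    std::vector<FrameBuffer> buffers_;
    int next_ = 0;
};

int main()
{
    SwapChain doubleBuffered(2), tripleBuffered(3);
    for (int frame = 0; frame < 3; ++frame)
        tripleBuffered.EndFrame();                 // cycles through buffers 0, 1, 2
    std::printf("main memory: %.1f MB double buffered, %.1f MB triple buffered\n",
                doubleBuffered.MainMemoryMB(), tripleBuffered.MainMemoryMB());
    return 0;
}

At 1280x720 with a 32-bit format each buffer is roughly 3.5MB, which is the kind of memory you get back by staying double buffered.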

You see more triple buffered games on PS3 because the PS3 render hardware is slower, so you needed three buffers to try to minimize tear. No I don't want to argue that point anymore, you either believe it or you don't and we'll leave it at that :) A triple buffered game on PS3 will run double buffered just fine on the 360, so on 360 it's better to spend that 3rd buffer memory elsewhere, like on better textures.
 
You see more triple buffered games on PS3 because the PS3 render hardware is slower, so you needed three buffers to try to minimize tear. No I don't want to argue that point anymore, you either believe it or you don't and we'll leave it at that :) A triple buffered game on PS3 will run double buffered just fine on the 360, so on 360 it's better to spend that 3rd buffer memory elsewhere, like on better textures.

Maybe that's the case for some, but not for all. Games like Burnout Paradise do double buffering with v-lock on the 360, so when the frame rate dips below 60 it will go to 30, making it much more noticeable than on the PS3. Also, if you look at the CryEngine 3 tech demo, it is running well below 30 (20~25 most of the time), so it tears constantly on the 360. I can trust you that triple buffering is not impossible on the 360 after all, but wouldn't there be a reason for it being not so popular on the 360?
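
Quick arithmetic behind the 60-to-30 drop, under the usual model: double buffered and locked to vsync on a 60Hz display, a frame that misses the ~16.7ms deadline has to wait for the next refresh, so effective frame times snap to whole multiples of the refresh interval. Hypothetical render times only:

Code:
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 60.0;                  // ~16.7 ms at 60 Hz
    const double renderTimes[] = { 15.0, 17.0, 25.0, 34.0 }; // hypothetical GPU times (ms)

    for (double t : renderTimes) {
        // With vsync, a finished frame is held until the next refresh boundary,
        // so its effective display time rounds up to a whole refresh interval.
        double displayedMs = std::ceil(t / refreshMs) * refreshMs;
        std::printf("render %.0f ms -> shown for %.1f ms (%.0f fps)\n",
                    t, displayedMs, 1000.0 / displayedMs);
    }
    return 0;
}

A 17ms frame therefore displays at 30fps, and a 34ms frame at 20fps; there is nothing in between without either tearing or a third buffer.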

BTW, Uncharted 2 uses triple buffering, and IMHO it's got better looking textures than most other games ;)
 
Maybe that's the case for some, but not for all. Games like Burnout Paradise do double buffering with v-lock on the 360, so when the frame rate dips below 60 it will go to 30, making it much more noticeable than on the PS3.

They are PS3 developers, maybe they also didn't know that you can triple buffer on 360 :) Just kidding, I don't know why they chose what they chose, you'd have to ask them.


Also, if you look at the CryEngine 3 tech demo, it is running well below 30 (20~25 most of the time), so it tears constantly on the 360. I can trust you that triple buffering is not impossible on the 360 after all, but wouldn't there be a reason for it being not so popular on the 360?

It's mostly not needed on 360 versions of multi plat games since you can hit parity to the PS3 build without it. Why take the memory hit and input lag hit if you don't need to? CryEngine 3 is still a work in progress so performance talk about it is probably premature at this point. But input lag on a triple buffered 30fps game can be measurable, so often it's avoided unless your game is input lag tolerant due to auto aiming or some other mechanic.
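
For a rough sense of the numbers, here is a back-of-the-envelope model (not a measurement): assume input is sampled when the CPU starts a frame, the frame takes one 33ms interval to build, and every completed frame queued ahead of it adds one more vsync interval before it reaches the screen.

Code:
#include <cstdio>

int main()
{
    const double frameMs = 1000.0 / 30.0;   // ~33.3 ms per frame at 30 fps

    // queueDepth = completed frames that can be waiting for the display:
    // 1 with double buffering, 2 with a triple-buffered FIFO.
    for (int queueDepth = 1; queueDepth <= 2; ++queueDepth) {
        double worstCaseMs = frameMs               // building the frame itself
                           + queueDepth * frameMs; // waiting behind queued frames
        std::printf("%s buffered at 30 fps: ~%.0f ms input-to-display\n",
                    queueDepth == 1 ? "double" : "triple", worstCaseMs);
    }
    return 0;
}

Under that model the third buffer adds roughly one more 33ms interval on top of everything else (display latency included), which lines up with the point that at 30fps the cost becomes measurable.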


BTW, Uncharted 2 uses triple buffering, and IMHO it's got better looking textures than most other games ;)

ND are l33t :) Amazing artists can always do more with less.
 
ND are l33t :) Amazing artists can always do more with less.

I think that's because they need less memory, thanks to a good streaming system that can rely on a combination of BD and HDD. That in turn may be what holds the 360 back a little in some multi-platform titles that actually use this aspect of the PS3, since you may need the extra RAM for caching. Just guessing, but at any rate the HDD/BD streaming system is one thing that makes the textures so much better in Uncharted, and multi-platform titles tend not to use this feature.
 
They are PS3 developers, maybe they also didn't know that you can triple buffer on 360 :) Just kidding, I don't know why they chose what they chose, you'd have to ask them.

It's mostly not needed on 360 versions of multi plat games since you can hit parity to the PS3 build without it. Why take the memory hit and input lag hit if you don't need to? CryEngine 3 is still a work in progress so performance talk about it is probably premature at this point. But input lag on a triple buffered 30fps game can be measurable, so often it's avoided unless your game is input lag tolerant due to auto aiming or some other mechanic.

So there's really no reason triple buffering can't be used on the 360, eh? I do get your point; however, it's not like the 360 has an infinite amount of power. What puzzles me is that there are games that would clearly benefit from triple buffering yet don't use it on the 360, for example Fallout 3. Sure, it runs smoother than the PS3 version, but it still tears considerably, especially in the close-up slow-mo scenes, which in turn significantly affects the overall IQ. It's not that the game uses better quality textures or anything (in fact it's got worse textures). There's really no logical reason not to use triple buffering here.

As for the input lag, I'm not sure about practice (since I'm no dev ;)), but in theory a dynamic triple buffer would keep input lag down (it would lag just as much as double buffering with v-sync) if you only make use of the third buffer when the frame rate dips below 30 (in that case it will lag only during frame dips). I participated in the Uncharted 2 beta and didn't really feel any more input lag than in the first game (double buffered with dynamic v-sync), nor have I heard of it becoming an issue. Same goes for Resistance 2.
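
One way the "dynamic" scheme described above could look in code (a sketch of the idea only, not any shipped engine's logic): stay effectively double buffered while frames make the vsync deadline, and only let the renderer run a third buffer ahead when a frame runs long, so the extra frame of latency exists only during dips.

Code:
#include <cstdio>

struct PacingPolicy {
    int buffersInFlight = 2;                // start double buffered
    const double vsyncMs = 1000.0 / 30.0;   // 30 fps deadline

    void OnFrameFinished(double frameTimeMs)
    {
        if (frameTimeMs > vsyncMs)
            buffersInFlight = 3;            // missed the deadline: allow a third buffer
        else
            buffersInFlight = 2;            // back on budget: drop to double buffering
    }
};

int main()
{
    PacingPolicy policy;
    const double frames[] = { 30.0, 31.0, 40.0, 38.0, 32.0 };  // hypothetical frame times (ms)
    for (double t : frames) {
        policy.OnFrameFinished(t);
        std::printf("frame %.0f ms -> render up to %d buffers ahead\n",
                    t, policy.buffersInFlight);
    }
    return 0;
}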

I wonder if any other dev here can provide a clear answer; I don't think it's purely a coincidence that you don't see triple buffering on the 360 in either multi-platform or exclusive titles.

BTW, CryEngine 3 should be nearly complete, as they'll be shipping out SDKs in the coming month.
 
Both machines can use vsync, or not use vsync. On PS3 because of the tight integration of the cpu in gpu tasks, sometimes it's easier to just vsync to help deal with synchronization issues.




You can triple buffer on 360, edram doesn't stop you from doing it. Typically you try to avoid it in some games because it adds input lag and it munches up more memory.

But triple buffering produces less input lag than standard double-buffered vsync; it's really not a valid reason to avoid it IMO, especially when non-triple-buffered games are shipping with 100ms+ input lag anyway, without consumer complaints.

The eDRAM size limit is absolutely a reason I could understand for avoiding triple buffering, but if it's not as much of an issue to implement as you claim, then the lack of any widespread use is simply baffling to me. Tearing is never a solution, and finding out it could so easily be avoided is really quite frustrating. Perhaps more developers should start reading Derek's article, just like he suggests; there's clearly a lot of confusion about the inherent benefits of triple buffering.
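
Part of the disagreement may be terminology, since "triple buffering" gets used for two different things: a 3-deep FIFO of finished frames (which adds up to a frame of latency, which sounds like what joker is describing) and the "always flip to the newest completed frame, discard the stale one" scheme that the article argues for (which doesn't). A toy sketch of the two present policies, with made-up types and no particular API:

Code:
#include <cstdio>
#include <deque>

struct Frame { int id; };

// 3-deep FIFO: the oldest completed frame is shown first, so frames can sit
// in the queue for an extra vsync interval before reaching the screen.
Frame PresentFifo(std::deque<Frame>& completed)
{
    Frame f = completed.front();
    completed.pop_front();
    return f;
}

// Discard-oldest: always show the most recently completed frame and throw the
// rest away, so what reaches the screen is as fresh as possible.
Frame PresentNewest(std::deque<Frame>& completed)
{
    Frame f = completed.back();
    completed.clear();
    return f;
}

int main()
{
    std::deque<Frame> a = { {1}, {2} }, b = { {1}, {2} };   // frames finished since last vsync
    std::printf("FIFO shows frame %d, discard-oldest shows frame %d\n",
                PresentFifo(a).id, PresentNewest(b).id);
    return 0;
}

If the console flip queues behave FIFO-style, as joker's description suggests, that would explain why the two camps keep talking past each other.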

Being "faster" than the PS3 version is really a terrible reason if you ask me, sure performance may be closer to 30fps but if its not a damn near locked 30fps, then your 360 build is going to see exactly the same benefits as your PS3 build, it doesn't matter if those benefits are slightly less pronounced. Games like Bionic Commando and Red faction are shipping with nearly 50% torn frames on 360! The sentiment that you're quicker than the PS3 build counts for nought when you're shipping a frankly broken and unplayable 360 build like that.

Edit: I hadn't read MazingerDUDE's reply before I made this post, and it seems we're on the same wavelength; the logic behind it is just baffling if you ask me. Naughty Dog are the best in the business as far as I'm concerned; if they believe triple-buffered v-sync is the best solution, then that's all the stamp of authority it needs, and all the theory backs up that opinion as well.
 
Hum, I don't think that invalidates Joker's point; it's not like developers never make "weird choices".
Bionic Commando is a good example: we know the developers were shooting for parity, but at the same time they went for different solutions on the two systems.
To speak about Bionic Commando:
FP10 on the 360, LogLuv16 on the PS3
SSAO on the 360, none on the PS3
tearing problems on the 360 vs. minor ones on the PS3
All while rendering at a low 1122*640.
It's not the seventh wonder of the world; basically it looks more like weird decisions than anything else.
If anything the 360 should have run almost the same code as the PS3, and most likely the difference would have been unnoticeable.
SSAO consumes GPU cycles and memory, nothing prevents Xenos from using the LogLuv algorithm, etc.
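
For reference, the LogLuv trick boils down to storing log2 luminance across two 8-bit channels and chromaticity in the other two, so a plain 8:8:8:8 integer target can hold HDR data on any GPU. Below is a simplified sketch of that idea (sRGB/D65 matrices, CIE u'v' chromaticity, arbitrarily chosen range constants), not the exact NAO32/Bionic Commando format:

Code:
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

struct RGB    { float r, g, b; };
struct Packed { uint8_t leHi, leLo, u, v; };   // fits one 8:8:8:8 texel

static Packed Encode(const RGB& c)
{
    // Linear sRGB -> CIE XYZ (D65 white point).
    float X = 0.4124f * c.r + 0.3576f * c.g + 0.1805f * c.b;
    float Y = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
    float Z = 0.0193f * c.r + 0.1192f * c.g + 0.9505f * c.b;

    float d  = std::max(X + 15.0f * Y + 3.0f * Z, 1e-6f);
    float up = 4.0f * X / d;                   // u' chromaticity
    float vp = 9.0f * Y / d;                   // v' chromaticity

    // 16 bits of log2 luminance covering Y in [2^-16, 2^16] (range picked arbitrarily).
    float le = std::clamp((std::log2(std::max(Y, 1e-6f)) + 16.0f) / 32.0f, 0.0f, 1.0f);
    uint16_t le16 = uint16_t(le * 65535.0f + 0.5f);

    return { uint8_t(le16 >> 8), uint8_t(le16 & 0xFF),
             uint8_t(std::clamp(up, 0.0f, 1.0f) * 255.0f + 0.5f),
             uint8_t(std::clamp(vp, 0.0f, 1.0f) * 255.0f + 0.5f) };
}

static RGB Decode(const Packed& p)
{
    float le = float((p.leHi << 8) | p.leLo) / 65535.0f;
    float Y  = std::exp2(le * 32.0f - 16.0f);
    float up = std::max(p.u / 255.0f, 1e-4f);
    float vp = std::max(p.v / 255.0f, 1e-4f);

    // Invert the u'v' relations, then XYZ -> linear sRGB.
    float X = Y * 9.0f * up / (4.0f * vp);
    float Z = Y * (12.0f - 3.0f * up - 20.0f * vp) / (4.0f * vp);
    return {  3.2406f * X - 1.5372f * Y - 0.4986f * Z,
             -0.9689f * X + 1.8758f * Y + 0.0415f * Z,
              0.0557f * X - 0.2040f * Y + 1.0570f * Z };
}

int main()
{
    RGB hdr = { 12.0f, 6.0f, 1.5f };           // an HDR value well above 1.0
    RGB out = Decode(Encode(hdr));
    std::printf("in  %.2f %.2f %.2f\nout %.2f %.2f %.2f\n",
                hdr.r, hdr.g, hdr.b, out.r, out.g, out.b);
    return 0;
}

The shader version is the same per-pixel math; the usual tradeoffs mentioned for it are a few extra ALU ops and awkward alpha blending in the packed format.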

For the other points, Joker made it pretty clear where the memory impact of buffering (double, triple, whatever) is: in main RAM (back and front buffers); on the 360 the render target itself has no impact on system memory (whether you tile or not), since it always sits in the eDRAM.

As for Naughty Dog, I trust them to have made the best decision for their game and their tech, but it's their game and their tech. Quite a few games ship without noticeable tearing without having to rely on vsync and triple buffering.

(A bit speculative given my knowledge; further explanations would be welcome.)
Maybe their solution will see growing adoption (I should reread how triple/double buffering works) and the pros will overcome the cons. From my understanding it has a cost in memory (three back buffers in memory, so 21MB instead of 7MB) but it may ease synchronization issues. Say you aim at 30fps, so you have 33ms per frame; little variations will have little impact (then again, I guess that if you almost always render in 33ms, triple buffering may be useless).
Overall at 30fps (33ms a frame), if you manage to do your rendering in 33ms, an input lag of ~100ms will not be perceived. So I guess that most problems are induced by other factors/issues (a frame taking way too long to render, the CPU waiting too long for something, etc.)
 