Digital Foundry Article Technical Discussion Archive [2010]

Hey Barbarian, since you are here, can you please tell me what the cost in ms of depth of field is in Black Ops? It looked great; it had depth, very 3D-ish.
 
It'd be interesting to know what exactly in BO is causing the issue on PS3 that didn't already cause it in W@W.
It might then mean that Cell is the bottleneck in that situation. Or maybe it's the GPU's MSAA logic and bandwidth not being taxed much at 544p res. Just speculation though.

@Barbarian

Is the PC version (on max settings) rendering alphas at the same res as the framebuffer, or do they scale with screen resolution while being lower res, or is it a combo? Also, if they scale, is there a limit (thinking about how triple-screen gaming would be affected)?

Some thoughts about games I've had since observing how some games have a fixed PP/alpha res, while other games have it scale in relation to screen resolution, and some games do PP/alphas at framebuffer res. Metro 2033, for example, on the Very High setting does alphas/post-processing at the same res as the framebuffer, and AFAIK at least up to 2560x1600.
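Purely to illustrate those three schemes, here's a minimal sketch; the function name, the 1/2 scale factor, and the cap are all invented for the example, not taken from any shipped game:

```python
# Hypothetical illustration of the three alpha/post-process buffer schemes
# described above; all names and numbers are made up for the example.

def alpha_buffer_res(fb_width, fb_height, scheme,
                     scale=0.5, fixed=(640, 360), cap=(1920, 1080)):
    """Return the resolution used for alpha/post-process rendering."""
    if scheme == "full":      # same res as the framebuffer (e.g. Metro 2033 Very High)
        return fb_width, fb_height
    if scheme == "fixed":     # one resolution regardless of framebuffer size
        return fixed
    if scheme == "scaled":    # a fraction of the framebuffer, optionally capped
        w, h = int(fb_width * scale), int(fb_height * scale)
        return min(w, cap[0]), min(h, cap[1])
    raise ValueError(scheme)

# A triple-screen framebuffer shows where a cap would bite:
print(alpha_buffer_res(5760, 1080, "scaled"))  # (1920, 540): the width hits the cap
```

If a game used the "scaled" scheme with a cap like this, triple-screen setups would get proportionally blurrier alphas, which is exactly what the question above is probing.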

I am pretty sure the PC version uses lower-res alphas as well. I'm not at home, otherwise I'd have posted some captures.
 
@Barbarian

Thanks for the response; I am very appreciative that you are taking the time to come here and answer some questions and offer commentary...

I did forget that having very fast controller response is another of the major things the CoD series strives for... and with that said, the more graphics duties you hand off to Cell, the more latency is involved... sound about right?
 
If you are using SPUs for lots of different stuff, how difficult is it to sync everything? And then, to sync it with RSX as well when the SPUs are doing rendering work?

Is it rather easy in general and no big deal at all, or something a lot of effort has to be spent on?
 
There's very little publicly known about the IW/Treyarch engines, which is a shame, since there are some rather brilliant things about them (and massive amounts of legacy too).

I'm really, really interested - I've always thought that the games made the right compromises and concentrated on the features that mattered most for creating a somewhat stylized, cinematic kind of reality. But I'm not sure what those features are ;)


Also, are the games really using triple buffering? I have a hard time believing that, knowing that low input latency was an important driving force.
 
No disrespect, but perhaps you should save lecturing PS3 developers on the finer points of SPU usage until after you can do something like calculate frame/Z buffer size.

Yes, the memory saving from turning MSAA off is a lot less than 18MB.
Plus, if you want to do MLAA you'll have to move your frame buffer to main memory, which is usually more scarce than video memory.
And don't forget that the cost of MLAA was quoted at 20ms for 1 SPU (for 720p I believe; for COD it will be less). At any rate, that's quite a lot of SPU time for a 60fps game that needs to get everything done in 16ms.
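(A back-of-envelope version of that constraint, my arithmetic rather than Barbarian's: a 60fps frame is roughly 16.7ms, so a 20ms single-SPU pass can only fit inside a frame if it is split across SPUs.)

```latex
t_{\text{frame}} = \tfrac{1000}{60} \approx 16.7\,\text{ms}, \qquad
t_{\text{wall}} = \tfrac{20\,\text{ms}}{n_{\text{SPU}}} \le t_{\text{frame}}
\;\Rightarrow\; n_{\text{SPU}} \ge 2 \quad (\text{e.g. } 5~\text{SPUs} \Rightarrow 4\,\text{ms})
```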
 
Guys, I'll try to answer questions, but please do understand I can't go into certain specifics. I hate to be vague, but anything I say might be pulled out of context and tweeted across the globe, and have all kinds of unpleasant consequences for me.

There's very little publicly known about the IW/Treyarch engines, which is a shame, since there are some rather brilliant things about them (and massive amounts of legacy too).
I'll try to get permission on what I can and can't talk about.
Until then, it's fair to say that there's very little that hasn't been squeezed out. No low-hanging fruit, as they say. It's one of very few engines that targets 60fps, and that puts incredible restrictions on the kinds of methods/techniques one can use. A lot of the deferred lighting, motion blur, and other high-end sexy effects that KZ and Uncharted do are only possible because they have double the GPU budget to play with (not to mention tons of SPU time to help out as well, which adds a 3rd frame of latency!!!).
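(A quick sanity check on that latency point, using standard pipelining arithmetic rather than any figure from the post: each frame in flight costs one frame time, so three frames at 30fps is already a lot of controller lag.)

```latex
\text{latency} \approx \frac{n_{\text{frames in flight}}}{\text{fps}}: \qquad
\frac{3}{30\,\text{fps}} = 100\,\text{ms} \quad \text{vs.} \quad \frac{2}{60\,\text{fps}} \approx 33\,\text{ms}
```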

Did I highlight your key problems correctly? ^_^

The Insomniac people rewrote their RFOM code and ran out of time polishing R2. You probably did the right thing by keeping the gameplay and legacy code, at least for this iteration. Sales are good too. :cool:
 
Well, as pertains to there not being enough cycles left over on the SPUs, I have a hard time believing that BO uses them more thoroughly than some of Sony's standout first-party efforts like Killzone 2 (which theoretically could have easily used MLAA had the technique been available to them at that point in time)...

As pertains to the BO team not saving much performance by turning MSAA off, doesn't MSAA require 6ms of GPU time as well, while making a dent in RAM usage and fill rate? Sorry... but that and not saving much performance doesn't add up.

6ms for MSAA on the PS3? Have a link for that?

I really think a bigger deal is being made out of nothing. Different engines push the systems differently. On top of that, when developing exclusively for one system, you can cater better to its strengths and design around its weaknesses.

KZ2 is pretty and all, but it too uses its share of shortcuts.

I'm sorry, but I don't think it's ever a good idea to reference "game/engine X" and think the same rules should apply to "game/engine Y".
 
Did I highlight your key problems correctly? ^_^

The Insomniac people rewrote their RFOM code and ran out of time polishing R2. You probably did the right thing by keeping the gameplay and legacy code, at least for this iteration. Sales are good too. :cool:

Something like that makes Insomniac's decision to abandon 60fps gaming in favor of 30fps gaming much more understandable. Especially when everyone seems to insist on comparing the graphics quality of 30fps games to that of 60fps games.

Regards,
SB
 
According to their in-house studies, they think it's not worth the effort. A Crack in Time may be their last 60fps game.

They had trouble getting some old monsters to work in R2. So I doubt they had time at all to do 60fps for that game.
 
While gamers like us may recognize and appreciate higher frame rates, Regular Joe doesn't know the difference between 60fps and 30fps. It's for this reason that I don't understand why IW or Treyarch doesn't lock the framerate down to 30fps and add in some new effects.
 
@Kagemaru

Yep... I have a link, but I was wrong; it takes RSX 5ms to do 2xMSAA, not 6ms:

http://www.eurogamer.net/articles/the-making-of-god-of-war-iii?page4


@Barbarian

Wouldn't it be possible to do the MLAA in parallel across 5 SPUs like in God of War? It takes only 4ms of CPU time done that way...

As pertains to the memory saving, you are right; I should have stated "up to 18MB," but at any rate wouldn't MLAA be less memory-intensive than MSAA and save fill rate as well?
 

Joker (an ex-programmer) and Barbarian both said that how much time RSX needs for MSAA varies from game to game.

4ms across 5 SPUs is exactly what he said; it equals 20ms of Cell time. 4ms per SPU means you lose about 13% of SPU time solely to MLAA if you are going for a 30fps game. In Black Ops you lose twice as much, since the game runs at 60fps, meaning you have only a 16ms rendering "budget". That means 25% of SPU time goes to MLAA. Add to that the other rendering stuff done on SPUs, vertex processing etc., and you will see that there isn't much time left for it.
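(Spelling that arithmetic out, where 33.3ms and 16.7ms are just the standard 30fps and 60fps frame budgets:)

```latex
\frac{4\,\text{ms}}{33.3\,\text{ms}} \approx 12\text{--}13\% \text{ of each SPU at 30fps}, \qquad
\frac{4\,\text{ms}}{16.7\,\text{ms}} \approx 24\text{--}25\% \text{ at 60fps}
```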
 
Yes, the memory saving from turning MSAA off is a lot less than 18MB.
Plus, if you want to do MLAA you'll have to move your frame buffer to main memory, which is usually more scarce than video memory.
And don't forget that the cost of MLAA was quoted at 20ms for 1 SPU (for 720p I believe; for COD it will be less). At any rate, that's quite a lot of SPU time for a 60fps game that needs to get everything done in 16ms.

I'd actually forgotten about the split memory pools thing regarding MLAA, and that demand on one pool might be tighter than on the other.

And yeah, 20ms is over 20% of your SPU time at 60fps. And even if you had 20ms of free time in total, I'm guessing there could be potential scheduling issues.
 
From what I've seen, it's simply that the shadow technique wasn't changed from MW2 on PS3 (maybe self-shadows did change?). On 360 BO, it seems they were able to afford an increase in shadowmap resolution (consider memory, z-fill, etc.). With regards to filtering, they'd still have to do the four texture fetches (4 samples), so adding a few math ops to smooth them out shouldn't be too bad (in MW2, it looked like just different offsets with no interpolation between the samples).
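To make that filtering difference concrete, here's a minimal sketch of the two approaches being described: four shadowmap taps averaged with equal weights, versus the same four taps blended with bilinear weights. All names are illustrative; this is not code from either game.

```python
# Illustrative 4-tap shadow filtering (percentage-closer style).
# fetch() stands in for a shadowmap depth read; everything here is hypothetical.

def fetch(shadowmap, x, y):
    h, w = len(shadowmap), len(shadowmap[0])
    x = min(max(int(x), 0), w - 1)
    y = min(max(int(y), 0), h - 1)
    return shadowmap[y][x]

def pcf4_unweighted(shadowmap, x, y, depth):
    """Four offset taps compared against the receiver depth, averaged with
    equal weights (roughly the 'different offsets, no interpolation' case)."""
    taps = [(0, 0), (1, 0), (0, 1), (1, 1)]
    lit = [1.0 if fetch(shadowmap, x + dx, y + dy) >= depth else 0.0
           for dx, dy in taps]
    return sum(lit) / 4.0

def pcf4_bilinear(shadowmap, x, y, depth):
    """Same four taps, blended with bilinear weights from the sub-texel
    position (the 'few math ops to smooth them out')."""
    fx, fy = x - int(x), y - int(y)
    weights = [(1 - fx) * (1 - fy), fx * (1 - fy), (1 - fx) * fy, fx * fy]
    taps = [(0, 0), (1, 0), (0, 1), (1, 1)]
    lit = [1.0 if fetch(shadowmap, int(x) + dx, int(y) + dy) >= depth else 0.0
           for dx, dy in taps]
    return sum(w * l for w, l in zip(weights, lit))

sm = [[0.5, 1.0], [1.0, 1.0]]  # tiny shadowmap: one occluder texel
print(pcf4_unweighted(sm, 0.25, 0.25, 0.7))  # 0.75   (blocky equal-weight result)
print(pcf4_bilinear(sm, 0.25, 0.25, 0.7))    # 0.4375 (weighted toward nearest texel)
```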

Oh, I see - any screens of the differences in shadow filtering between the two, like those MW2 screens I posted earlier?
 
@Kagemaru

Yep... I have a link, but I was wrong; it takes RSX 5ms to do 2xMSAA, not 6ms:

There's no fixed period of time for RSX "to do" MSAA.

As pertains to the memory saving, you are right; I should have stated "up to 18MB," but at any rate wouldn't MLAA be less memory-intensive than MSAA and save fill rate as well?

2xMSAA on a 960x544 back and z buffer won't take "up to 18MB" extra memory.

At 960x544, the sub-pixel accuracy that comes from MSAA is going to be even more valuable than at 1280x720. MLAA is not a replacement for MSAA.
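(Rough arithmetic behind that memory claim, assuming 32-bit color and 32-bit depth, which the post doesn't state explicitly:)

```latex
960 \times 544 \times 4\,\text{B} \approx 2.1\,\text{MB per surface}
\;\Rightarrow\; \text{extra for 2xMSAA} \approx (2 - 1)\times(2.1_{\text{color}} + 2.1_{\text{z}})\,\text{MB}
\approx 4.2\,\text{MB} \ll 18\,\text{MB}
```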
 