Digital Foundry Article Technical Discussion Archive [2011]

Not necessarily, the responsiveness of the controls depends on how variable the fps is... by the way, GOW 3 has a really annoying judder, but only during the corridor/area walkthrough sequences; during combat I honestly didn't perceive any loss in control responsiveness (certainly not enough to affect the gameplay, and I've finished it on every difficulty level, one of the best games of this generation). You probably have to try Castlevania to see what a noticeable loss in responsiveness really means. OT: it's a real pity there's no fps analysis of Infamous 2 yet, because I found the average really interesting and impressive for a free-roaming game. I'm very curious to see whether my perception is right on that one.

I actually never found Castlevania to be any less than exceptionally responsive despite the crappy framerate. Better than God of War III in fact, and only slightly behind Dante's Inferno.
 
Nothing personal, but technically GOW 3 is in another world compared to Castlevania's lag (if we even want to talk about a lag problem in GOW 3...). I'd bet anything on it without needing analysis tools... perhaps your PS3 or your copy of the game has a problem, because I found the responsiveness of the two simply incomparable. It's like comparing KZ2 with Halo Reach...
 
When are we getting to see that high-quality BF3 PS3 footage they tweeted about two weeks ago or so?

I was looking forward to it too, but I assume we're not.

Oh well, not that big a deal, as much more console footage should be shown in the coming months. The console versions do have to actually release on Oct 25 whether EA likes it or not :LOL:
 
Beta/demo: a September beta is really just a demo if it arrives at the end of September.
I think Cliffy B is right on that point, a beta four weeks before launch is just a demo.
We should know more when Gamescom and SIGGRAPH (the 360 version slides, right?) begin.
 
Uncharted 3 beta analysis

http://www.eurogamer.net/digitalfoundry/

I disagree with some of the conclusions. For one, the loss of 2x MSAA is obvious and the IQ really suffers.

Also, and this is playing in 2D, the framerate still seems to have a lot of stutters and the controls feel less responsive than in Uncharted 2.

I don't have a 3D TV, but just from some of the pics I noticed missing grass tufts and other details. I know it's hard to represent 3D in 2D, but those screenshots really look bad right next to normal full 720p images.
 
My question is: could FXAA work on the SPU? Considering how much better MLAA works on the SPU than on the GPU, I'm really curious how well it could work (assuming it's technically possible).
 
I wouldn't say MLAA works better on the SPU than the GPU... MLAA definitely works well on the SPUs, but I still think the GPU is better suited to that type of calculation. It's really just that GPU cycles are in higher demand in PS3 titles than SPU cycles, so it makes sense to use the SPUs to give yourself more room on the GPU; that was much of the reasoning behind making MLAA in the first place. It isn't that the process is faster on the CPU than the GPU, but rather that the GPU is the major limiting factor in the performance of most games, so anything you can offload to the CPU can potentially improve overall performance.

In the case of FXAA, I don't really think it's possible, because the reference implementation is written in HLSL and designed to run as a pixel shader... I may be wrong, though, and perhaps there is some way to do it on the SPUs.
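Just to illustrate why the pixel-shader dependence isn't fundamental: stripped down, an FXAA-style pass is a luma contrast test on a pixel's neighbourhood followed by a blend, which is ordinary arithmetic that any processor could run. A very rough sketch of that idea in plain C follows (not the actual FXAA source; the RGB float buffer layout, the 0.0833 threshold and the blend weights are all made-up assumptions for illustration):

Code:
#include <math.h>

/* Rec. 601 luma from an {r, g, b} float pixel. */
static float luma(const float *px)
{
    return px[0] * 0.299f + px[1] * 0.587f + px[2] * 0.114f;
}

/* Filter one interior pixel of a width*height*3 float image: if the local
 * luma contrast exceeds a threshold, blend the pixel with its neighbours. */
static void fxaa_like_pixel(const float *src, float *dst,
                            int x, int y, int width)
{
    const float *c = &src[(y * width + x) * 3];
    const float *n = &src[((y - 1) * width + x) * 3];
    const float *s = &src[((y + 1) * width + x) * 3];
    const float *w = &src[(y * width + (x - 1)) * 3];
    const float *e = &src[(y * width + (x + 1)) * 3];

    float lC = luma(c), lN = luma(n), lS = luma(s), lW = luma(w), lE = luma(e);
    float lMin = fminf(lC, fminf(fminf(lN, lS), fminf(lW, lE)));
    float lMax = fmaxf(lC, fmaxf(fmaxf(lN, lS), fmaxf(lW, lE)));
    float *out = &dst[(y * width + x) * 3];

    if (lMax - lMin < 0.0833f) {              /* below threshold: copy through */
        out[0] = c[0]; out[1] = c[1]; out[2] = c[2];
        return;
    }
    /* Crude blend; the real FXAA walks along the edge and weights the blend
     * by how far the edge end lies from the pixel. */
    for (int i = 0; i < 3; ++i)
        out[i] = 0.5f * c[i] + 0.125f * (n[i] + s[i] + w[i] + e[i]);
}

The real algorithm does far more work per pixel than this, of course, but none of it needs pixel-shader hardware as such.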
 
Pretty sure DF said somewhere that the SPU does MLAA/edge detection much better than any GPU. It does it with much better accuracy than the GPU. Same deal with DLAA, where the PS3 version of SWFU2 offloads the DLAA to the SPU. This is solely about quality, not speed; of course the GPU can do graphical tasks faster. I could be wrong, or remembering wrong.
 
That might be true. But I can't imagine the difference is that large, and honestly speed is a bigger factor with these things. The DF article was primarily about the AMD driver version of MLAA, which has some obvious disadvantages given that it's just a driver-based implementation... Since it isn't actually run within the game and is simply a post-process on the output frame, it doesn't necessarily have access to the same data or buffers that an in-engine implementation would. I have to imagine that a true in-game implementation of MLAA would always have an advantage over the AMD driver version, and in that case it's difficult to say whether the SPU would do significantly better than the GPU... I doubt it's a huge difference, if there's any at all.
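To make the buffer-access point concrete, here is a tiny sketch (plain C, invented thresholds, purely illustrative, not how any actual driver or engine does it) of the kind of edge test a driver-level filter is limited to versus what an in-engine pass could do with the depth buffer in hand:

Code:
#include <math.h>
#include <stdbool.h>

/* Colour-only test, roughly all a driver-level post-process can do: it flags
 * any strong luma step, including texture detail that isn't a geometry edge. */
static bool edge_from_luma(float luma_a, float luma_b)
{
    return fabsf(luma_a - luma_b) > 0.10f;    /* illustrative threshold */
}

/* Depth-aware test, only possible with access to the engine's depth buffer:
 * a depth discontinuity marks a real silhouette edge. */
static bool edge_from_depth(float depth_a, float depth_b)
{
    return fabsf(depth_a - depth_b) > 0.01f;  /* illustrative threshold */
}

/* An in-engine pass can combine both, smoothing polygon edges while leaving
 * high-contrast texturing alone; a driver-level pass only has the first. */
static bool should_blend(float luma_a, float luma_b,
                         float depth_a, float depth_b)
{
    return edge_from_luma(luma_a, luma_b) && edge_from_depth(depth_a, depth_b);
}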
 
I would say their DLAA article is a good enough indication of post-process filter AA on SPU vs GPU. I don't think anyone said there's a HUGE difference, but it's there. I thought better accuracy on the CPU is just how these things work. Even in the older Sony PhyreEngine, with SPU SSAO and motion blur, they said the SPU versions were slower but the quality was nicer. A good example would be DF's comparison of Test Drive Unlimited 2's motion blur: they suspected it was done on the SPU, which is why it was higher quality than the 360 version's.
 
There's really no reason you couldn't do FXAA on SPUs; it would just take some work to port the original shader implementation to something more performance-friendly for the SPU way of doing things.
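In rough outline I'd expect such a port to look something like the sketch below: split the frame into strips, pull each strip into a small local buffer (standing in for SPU local store; a real port would use the Cell DMA facilities and SPU SIMD intrinsics, which aren't shown here), run the filter, then write it back. The names, the strip size and the extern filter_strip kernel are all illustrative assumptions, not anyone's actual implementation:

Code:
#include <stdlib.h>
#include <string.h>

#define STRIP_HEIGHT 8   /* rows per work unit; real code sizes this to fit local store */

/* The AA kernel itself (e.g. an FXAA-style filter rewritten for the SPU ISA). */
extern void filter_strip(float *strip, int width, int rows);

void process_frame(float *frame, int width, int height)
{
    /* Stand-in for SPU local store; kept small because an SPU budgets roughly
     * 256 KB for code and data, and strips would be double-buffered in a real
     * port to hide DMA latency. */
    float *local = malloc((size_t)STRIP_HEIGHT * width * 3 * sizeof(float));
    if (!local)
        return;

    for (int y0 = 0; y0 < height; y0 += STRIP_HEIGHT) {
        int rows = (height - y0 < STRIP_HEIGHT) ? (height - y0) : STRIP_HEIGHT;
        size_t bytes = (size_t)rows * width * 3 * sizeof(float);

        /* "DMA in": copy the strip from main memory to the local buffer. */
        memcpy(local, &frame[(size_t)y0 * width * 3], bytes);

        filter_strip(local, width, rows);

        /* "DMA out": copy the filtered strip back. */
        memcpy(&frame[(size_t)y0 * width * 3], local, bytes);
    }
    free(local);
}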
 
Yeah, I can see that being true. If you have extra SPU cycles, sure, but I can't help thinking it's a bit expensive and there are probably better ways to use the SPUs (especially if there is something significantly faster like FXAA3 around). But I guess I should leave that to the developers to decide.
 
MLAA definitely works better on the SPU from what I've read so far. I don't remember the exact technical reason (probably because it's a post-process, or because it's better suited to SPU/CPU cycles; AI said something about that, if I remember correctly), but the effect on the GPU is definitely less pronounced and more blurred. It would be really interesting to see whether Sony's first parties, or the geniuses at Santa Monica Studio, can achieve an even better result by adding an FXAA solution on the SPU, but I've read that FXAA is more GPU-oriented, hence my doubt.
 