Digital Foundry Article Technical Discussion Archive [2011]

Yeah, there was a little tool or something that you can run on the PC. The quality seems great but you lose too much detail. I hope they will improve it; if not, then I definitely hope this won't be the future...
 
I thought only the PC version has FXAA.

Edit: You're right, and I may have misunderstood, but the quote below * talks about consoles, X360 and PS3 (so it's not only the PS3 that could have FXAA, as I posted before).


* "However, our shots and footage were taken using FXAA mode - a new post-process form of anti-aliasing designed by NVIDIA's Timothy Lottes that seeks to provide the edge-smoothing of MLAA but with reduced sub-pixel edge issues. PC and console iterations of the technology exist, where the strengths of the respective GPUs provide two settings: performance (console) and quality (PC). The smoothing effect on PC is perhaps not best suited to lower resolutions like 720p, though, and the FXAA seems to take away a little too much detail.

FXAA is an important technology for the future because many game engines - like Unreal, plus a whole host of new deferred renderers - don't get along nicely with traditional multi-sampling AA. Duke Nukem Forever is interesting in that it is one of the first games where we've seen it implemented. Expect to see it being utilised on a great many console projects in the future, especially on Xbox 360 where GPU resources are typically more plentiful than they are on PlayStation 3. The console version of FXAA can process a 720p image in around 1ms, providing quite remarkable quality with only a very small performance hit."

Interesting notes about FXAA:

7900GT used as an optimization proxy for the PS3; on X360, only 1.3 and 1.5ms with FXAA2.

http://timothylottes.blogspot.com/2011/05/fxaa-console-status.html

http://timothylottes.blogspot.com/2011/03/nvidia-fxaa.html

Latest stage of FXAA3

http://timothylottes.blogspot.com/2011/06/fxaa3-console-status-update-2.html
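
For anyone curious what a "post-process form of anti-aliasing" actually does, here is a very rough Python/NumPy sketch of the general idea: compute luma, flag pixels whose local contrast exceeds a threshold, and blend those pixels with their neighbours. This is only an illustration of the technique family, not Lottes' actual FXAA shader; the luma weights, the 0.125 contrast threshold and the box blend are my own simplifications (real FXAA filters along the detected edge direction in a single GPU pass, which is why it costs around a millisecond).

Code:
import numpy as np

def post_aa(rgb, contrast_threshold=0.125):
    """Very simplified FXAA-style pass: flag high-contrast pixels via luma,
    then blend them with their neighbours to soften jagged edges.
    rgb is a float array of shape (H, W, 3) with values in [0, 1]."""
    # Approximate perceptual luma, as post-process AA filters typically do.
    luma = rgb @ np.array([0.299, 0.587, 0.114])

    # Luma of the four axis-aligned neighbours (image borders clamped).
    up    = np.pad(luma, ((1, 0), (0, 0)), mode='edge')[:-1, :]
    down  = np.pad(luma, ((0, 1), (0, 0)), mode='edge')[1:, :]
    left  = np.pad(luma, ((0, 0), (1, 0)), mode='edge')[:, :-1]
    right = np.pad(luma, ((0, 0), (0, 1)), mode='edge')[:, 1:]

    # Local contrast: a large min/max luma range marks an aliased edge.
    lo = np.minimum.reduce([luma, up, down, left, right])
    hi = np.maximum.reduce([luma, up, down, left, right])
    edge = (hi - lo) > contrast_threshold

    # Blend edge pixels with a small neighbourhood average. The real shader
    # instead searches and blends along the edge direction, in one GPU pass.
    blurred = (rgb
               + np.roll(rgb, 1, axis=0) + np.roll(rgb, -1, axis=0)
               + np.roll(rgb, 1, axis=1) + np.roll(rgb, -1, axis=1)) / 5.0
    return np.where(edge[..., None], blurred, rgb)

# Example: soften the hard diagonal edge in a tiny black/white test image.
img = np.zeros((8, 8, 3))
img[np.triu_indices(8)] = 1.0
smoothed = post_aa(img)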
 
Interesting bit from the article:

"FXAA3 Console on PS3 does not have the same long edge quality as Sony's MLAA, but overall FXAA3 costs significantly less, freeing up SPU time for things like tile deferred lighting, etc! "


Sony didn't say how much it costs now to do MLAA on PS3; their new PhyreEngine says edge finding is done on the SPUs but the MLAA blending is done on RSX, or something like that.
 
Sony didn't say how much it costs now to do MLAA on PS3; their new PhyreEngine says edge finding is done on the SPUs but the MLAA blending is done on RSX, or something like that.

The comparison is based on MLAA being a few ms on 5 SPEs.

The new implementation of MLAA in Phyre has the edge detection performed on Cell with the rest of the blending done on RSX.
 
The comparison is based on MLAA being a few ms on 5 SPEs.

The new implementation of MLAA in Phyre has the edge detection performed on Cell with the rest of the blending done on RSX.

Yeah, that's what I thought; most if not all research still uses the old GOW3 figure of 20ms across 5 SPUs for comparison. Curious how much they have improved it by now.
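
For a rough sense of scale, here is the back-of-the-envelope arithmetic in Python, using only the numbers quoted in this thread. The assumptions (the 20ms being single-SPU time, perfect scaling across 5 SPUs) are mine; it just shows why the "a few ms on 5 SPEs" figure and the ~1ms FXAA figure end up in the same ballpark, with FXAA still cheaper and leaving the SPUs free.

Code:
# Back-of-the-envelope comparison of the AA costs quoted in this thread.
# Assumes the GOW3 MLAA figure of 20ms is single-SPU time and that the
# work splits cleanly across 5 SPUs, neither of which is guaranteed.
GOW3_MLAA_SINGLE_SPU_MS = 20.0   # figure quoted above
SPU_COUNT = 5
FXAA2_X360_MS = 1.3              # Lottes' X360 figure (1.3 to 1.5ms)

mlaa_wall_ms = GOW3_MLAA_SINGLE_SPU_MS / SPU_COUNT   # roughly 4ms
print(f"MLAA (GOW3-era): ~{mlaa_wall_ms:.1f}ms wall clock on {SPU_COUNT} SPUs")
print(f"FXAA console:    ~{FXAA2_X360_MS:.1f}ms on the GPU")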
 
Disappointed in DNF's performance on the 360, hate uncapped frame rates, and on top of that it's still lower than the PS3 version.

Switching my purchase to the PS3.
 
Sorry, I've got a noobish question. DF often mentions a game that has vsync but a variable framerate like 30-35fps. How can that be?

On PC, if I enable vsync the fps jumps up or down at exactly 30/45/60fps.

Is vsync on consoles different from on PC? Or is it my PC that's weird...
 
Sorry, I've got a noobish question. DF often mentions a game that has vsync but a variable framerate like 30-35fps. How can that be?

On PC, if I enable vsync the fps jumps up or down at exactly 30/45/60fps.

Is vsync on consoles different from on PC? Or is it my PC that's weird...
Vsync doesn't imply any particular framerate; it just means a frame is only displayed once it has been fully drawn, so there's no screen-tearing. The actual framerate will be whatever the hardware can push at that particular moment.
 
It does if it's double buffered, which is when the framerate will start to fluctuate between 60, 30 and 20fps. That's why many games use triple buffering: to avoid this.
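
To make that concrete, here's a rough Python sketch (my own illustration, assuming a 60Hz display and a perfectly steady render time, which real games never have) of why double-buffered vsync snaps to 60/30/20fps while triple buffering lets the average land anywhere below the cap, which is where figures like 30-35fps come from.

Code:
import math

REFRESH_HZ = 60.0
VBLANK_MS = 1000.0 / REFRESH_HZ   # ~16.7ms between refreshes

def double_buffered_fps(render_ms):
    # With double buffering + vsync the buffer flip waits for the next
    # vblank, so every frame occupies a whole number of refresh intervals:
    # <=16.7ms -> 60fps, <=33.3ms -> 30fps, <=50ms -> 20fps, and so on.
    intervals = max(1, math.ceil(render_ms / VBLANK_MS))
    return REFRESH_HZ / intervals

def triple_buffered_fps(render_ms):
    # With a third buffer the GPU keeps rendering while a finished frame
    # waits for vblank, so the average throughput approaches 1000/render_ms
    # (capped at the refresh rate), even though each individual frame is
    # still shown for a whole number of refresh intervals.
    return min(REFRESH_HZ, 1000.0 / render_ms)

for ms in (15.0, 17.0, 25.0, 34.0):
    print(f"{ms:>4.0f}ms render: double-buffered {double_buffered_fps(ms):.0f}fps, "
          f"triple-buffered ~{triple_buffered_fps(ms):.0f}fps")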
 
I only saw the 360 version CoD MW2 tests, and they ran fairly consistently above 50fps, usually above 55fps. Everyone claimed both versions of the game ran identically, so I presume the PS3 version's framerate is identical.

Especially on PS3, COD:BO is more like 30fps.

I dunno, since this is the DF thread maybe this is relevant:

http://www.eurogamer.net/articles/digitalfoundry-call-of-duty-black-ops-faceoff

Their tests and video seem to put it at 10-15fps behind Xbox most of the time it is running behind. A difference of "as much as 20fps" is mentioned, but this appeared to be only for a short time. Subjectively I saw framerates approaching the 30s on both the Xbox and PS3 sides but these were rare and were spikes rather than consistent drops in both cases.

http://www.eurogamer.net/articles/digitalfoundry-modern-warfare-2-face-off?page=3

This one references "a consistent 12fps" difference and missing post effects on PS3.

So I'm not sure what points either of you were trying to prove, but there's some actual, measured data with which to work.

Oh and just for fun here's GOW3:

http://www.eurogamer.net/articles/digitalfoundry-godofwar3-performance-blog-post

Seems to fluctuate between 50 and 30, but hangs out around 30 during action sequences.

Just from the analysis I'd be hard pressed to call GOW3 a "60 fps game" as that was obviously not the target framerate. It's a variable framerate game with a target of "at least 30". While the COD games definitely target 60 and do indeed spend some amount of their time running at that framerate it is less consistent than would be nice, but also far from being "a 30 fps game" on Xbox or PS3.
 
Oh and just for fun here's GOW3:

http://www.eurogamer.net/articles/digitalfoundry-godofwar3-performance-blog-post

Seems to fluctuate between 50 and 30, but hangs out around 30 during action sequences.

Just from the analysis I'd be hard pressed to call GOW3 a "60 fps game" as that was obviously not the target framerate. It's a variable framerate game with a target of "at least 30".

Why did you post the year-old demo analysis? The final game is much smoother; it's 40+, not 30+, and quite often the average fps is about 45.
http://www.eurogamer.net/articles/digitalfoundry-godofwariii-demo-vs-retail-blog-entry
 
Why did you post the year-old demo analysis? The final game is much smoother; it's 40+, not 30+, and quite often the average fps is about 45.
http://www.eurogamer.net/articles/digitalfoundry-godofwariii-demo-vs-retail-blog-entry

The retail version runs at 40+fps and got a boost of almost 10fps compared to the demo, but the action sequences are still in the 30-40fps range in most fights. Going from a small area/corridor fight that runs at a smooth 50-60fps to a bigger area with more enemies that runs at 30-40fps was really distracting; the judder was annoying and the loss in control responsiveness was quite noticeable IMO.

Personally I'm not a fan of unlocked frame-rates with such big swings; locking the game at 30fps would've been a better choice IMO.
 
The retail version runs at 40+fps and got a boost of almost 10fps compared to the demo, but the action sequences are still in the 30-40fps range in most fights. Going from a small area/corridor fight that runs at a smooth 50-60fps to a bigger area with more enemies that runs at 30-40fps was really distracting; the judder was annoying and the loss in control responsiveness was quite noticeable IMO.

Personally I'm not a fan of unlocked frame-rates with such big swings; locking the game at 30fps would've been a better choice IMO.

The responsiveness of the controls doesn't necessarily depend on variable fps... By the way, GOW3 has a really annoying judder only during the corridor/area walkthrough sequences; during combat I honestly didn't perceive any loss in control responsiveness (not enough to influence the gameplay, and I've finished it at every difficulty level; one of the best games of this generation). You should probably try Castlevania to see what a noticeable loss in responsiveness really means. OT: it's really a pity there isn't an fps analysis of inFamous 2 yet, because I found the average really interesting and impressive for a free-roaming game. I'm very curious to see whether my perception is right here.
 
The responsiveness of the controls doesn't necessarily depend on variable fps... By the way, GOW3 has a really annoying judder only during the corridor/area walkthrough sequences; during combat I honestly didn't perceive any loss in control responsiveness (not enough to influence the gameplay, and I've finished it at every difficulty level; one of the best games of this generation). You should probably try Castlevania to see what a noticeable loss in responsiveness really means. OT: it's really a pity there isn't an fps analysis of inFamous 2 yet, because I found the average really interesting and impressive for a free-roaming game. I'm very curious to see whether my perception is right here.

This was also my experience with GOWIII. I agree that the variable framerate was noticeable at certain points visually, but there was never a point during the whole game where I felt that the responsiveness of the controls was an issue (or even inconsistent).
 
Sad that DF hasn't released an analysis article on the BF3 console footage yet. A buddy told me they have some good-quality footage from the show, not sure who provided it, so I'm eager to see what they say about it (if they say anything at all, that is).
 