Digital Foundry Article Technical Discussion Archive [2014]

The thing is, the only 1080p/60fps games on the XB1 aren't using a deferred renderer; any game that does seems to have to drop resolution.

I agree, deferred rendering + 1080p and 16 ROPs are not sufficient

Devs not having time to mess around with buffer management within 32MB is a bigger issue.

ROP-limited scenarios are more likely to be heavy blended effects (transparencies) that aren't pixel shader limited.
 
Can you or anyone else explain what the difference is, for people without a PhD? What are the pros and cons of each method?

From what I read about it, first off, it requires much less memory because it's not using the massive G-buffer approach that many deferred renderers have these days. Second, you maintain MSAA support. Third, transparencies require no workarounds or hacks to fit into your materials chain.

But the downside is more costly dynamic lights, I believe. There was a blog post somewhere about RAD's reasoning for choosing Forward+ after evaluating several approaches.
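To make the trade-off concrete, here's a minimal CPU-side sketch (Python, purely illustrative; a real Forward+ renderer does this in a compute shader with depth-aware culling) of the tile-based light binning that makes Forward+ viable: lights are assigned to screen tiles up front, so the forward shading pass only evaluates each pixel against its own tile's light list instead of every light in the scene. The tile size and light bounds below are my own assumptions.

```python
# Illustrative Forward+ light binning sketch (not a real renderer).
# Screen-space circles stand in for projected light bounds.

TILE = 16  # tile size in pixels, a common Forward+ choice

def bin_lights(width, height, lights):
    """Build a per-tile list of light indices.

    lights: list of (x, y, radius) in screen space.
    Returns a dict mapping (tile_x, tile_y) -> [light indices].
    """
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for tx in range(tiles_x) for ty in range(tiles_y)}
    for i, (lx, ly, r) in enumerate(lights):
        # Conservative AABB overlap: every tile the light's screen-space
        # bounding box may touch gets this light's index.
        x0 = max(0, int((lx - r) // TILE))
        x1 = min(tiles_x - 1, int((lx + r) // TILE))
        y0 = max(0, int((ly - r) // TILE))
        y1 = min(tiles_y - 1, int((ly + r) // TILE))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[(tx, ty)].append(i)
    return bins

bins = bin_lights(640, 480, [(100, 100, 40), (500, 300, 80)])
# light 0 at (100, 100) touches tile (6, 6); light 1 does not
```

The forward shading pass then loops over `bins[(tile_x, tile_y)]` per pixel, which is why the light count per tile, not the total light count, drives the shading cost.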

I wonder, now that everyone has jumped on the deferred rendering boat, whether the smart move may actually be going back to forward... Forward+, that is. Especially now that GCN GPUs are the standard across both consoles.
 
I agree, deferred rendering + 1080p and 16 ROPs are not sufficient
Not true at all.

If you manage to pack all your stuff into one 64 bit (4x16 bit integer) render target, you will reach full fill rate on any AMD GCN card. This way deferred g-buffer rendering doesn't require any more fill rate than forward rendering. Also compute shaders do not use ROPs at all, so compute shader based lighting sidesteps the GPU fill rate limit completely.
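To illustrate the packing idea (Python, just to show the bit layout; on the GPU this would be a single write to a 4x16-bit integer render target), four 16-bit channels fit into one 64-bit value. The channel assignments in the comment are my own example of a typical layout, not a specific engine's:

```python
# Pack/unpack four 16-bit unsigned channels into one 64-bit value,
# mirroring a 4x16-bit integer render target layout.

def pack_4x16(r, g, b, a):
    for c in (r, g, b, a):
        assert 0 <= c <= 0xFFFF
    return r | (g << 16) | (b << 32) | (a << 48)

def unpack_4x16(v):
    return tuple((v >> s) & 0xFFFF for s in (0, 16, 32, 48))

# e.g. an octahedron-encoded normal in two channels, with packed
# albedo and roughness/metalness in the others (one plausible layout)
packed = pack_4x16(0x1234, 0xABCD, 0x00FF, 0x8000)
assert unpack_4x16(packed) == (0x1234, 0xABCD, 0x00FF, 0x8000)
```

The point of squeezing everything into one 64-bit export is exactly what the post says: one full-rate target write per pixel, so the G-buffer pass isn't paying more fill rate than a forward pass would.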
 
How would the size of the light index buffer compare to typical g-buffer usage? Wouldn't you need a rather large tile size to still fit everything into ESRAM? Or are you suggesting that the bandwidth requirements to and from the light index buffer are such that it could be stored in DRAM? I wouldn't think that would be sufficient given a 16ms per frame target in common AAA scenarios but I have not tested.
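Some back-of-the-envelope numbers for that question (all parameters are my own assumptions: 1080p, 16x16 tiles, a cap of 64 16-bit light indices per tile, and the single 64-bit G-buffer target from the post above):

```python
# Rough size comparison at 1080p: per-tile light index buffer vs a
# G-buffer. Every parameter here is an assumption for illustration.

W, H = 1920, 1080
TILE = 16
MAX_LIGHTS_PER_TILE = 64   # assumed per-tile cap
INDEX_BYTES = 2            # 16-bit light indices

tiles = ((W + TILE - 1) // TILE) * ((H + TILE - 1) // TILE)
light_index_bytes = tiles * MAX_LIGHTS_PER_TILE * INDEX_BYTES

gbuffer_bytes = W * H * 8  # one 64-bit (4x16-bit) target

print(tiles)                        # 8160 tiles
print(light_index_bytes / 2**20)    # ~1.0 MiB
print(gbuffer_bytes / 2**20)        # ~15.8 MiB
```

Under these assumptions the light index buffer is roughly an order of magnitude smaller than even a single-target G-buffer, which suggests the ESRAM pressure comes from the color/depth targets rather than the light lists.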
 
Digital Foundry vs. inFamous: Second Son

DF have posted their final analysis on Infamous Second Son.

DF said:
While Second Son doesn't hit every mark, we can't ignore the fact that Sucker Punch was able to deliver such a polished, technically accomplished title so early in the life cycle of PlayStation 4. Even with its faults, it feels like the type of game one might expect only after a solid year or two as opposed to just a few months into the system's lifecycle. Such an effort certainly reserves a spot for Sucker Punch around the table of upper tier of internal studios and with the bar set so high so early, we can't wait to see what Sony's Worldwide Studios come up with next.
 
http://www.eurogamer.net/articles/digitalfoundry-2014-vs-infamous-second-son

Highlights:
- full 1080p
- Unlocked framerate w/Vsync (avg ~35fps never < 30)
- AF (no indication of how many levels)
- Variant on SMAA T2x described as "the most impressive utilisation of post-process anti-aliasing we've seen on any platform"
- Delsin is comprised of 60,000 triangles, his beanie 7,500 (goddammit why can't we just stick to vertices for this stuff)

Edit: Beaten!
 
http://www.eurogamer.net/articles/digitalfoundry-2014-vs-infamous-second-son

- Unlocked framerate w/Vsync (avg ~35fps never < 30)

never?

We've produced a stress test designed to push the engine to its limits. While the frame-rate typically stays north of 30 fps there are certainly plenty of circumstances which can knock performance into the 20s.

http://www.eurogamer.net/articles/digitalfoundry-2014-vs-infamous-second-son
 
Last I checked, this was a thread for discussing the technical bits brought up in the articles too, but you guys are still falling for the editorial instead. Yes, it's not the first and not likely to be the last time, but that's not for this thread or these parts of the forums.
 
The v-synced frame rate on I:SS will swing between 20 and 60 fps.

A trailing average will mean this is represented as higher than 20 fps, and that's fine as long as DF are consistent in the way they represent frame rate (which I'm sure they are), and it does give a better representation of how the game will "feel".
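A quick sketch of that effect with made-up numbers (not DF's actual method or data): averaging frame times over a trailing window means a brief dip to 20 fps inside a mostly-35 fps run barely moves the reported figure.

```python
# Trailing average over frame times: a short 20 fps dip inside a
# mostly-35 fps run reads far higher than the worst frame rate.

def trailing_avg_fps(frame_times_ms, window=60):
    recent = frame_times_ms[-window:]
    return 1000.0 * len(recent) / sum(recent)

# 50 frames at ~28.6 ms (35 fps), then 10 frames at 50 ms (20 fps)
times = [1000 / 35.0] * 50 + [50.0] * 10
print(round(trailing_avg_fps(times), 1))  # ~31 fps, despite 20 fps frames
```

So an "average ~35 fps" figure and noticeable 20 fps moments can both be true at once, which is why consistency in how the metric is computed matters more than the headline number.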

In my day, back when this was all just fields, a ~35 fps game would have been capped at 30 and been better for it. Frame rates this generation suck ass, swinging all over the place like a janky mess. DF are valuable for showing just how bad this really is.

This gen is not a great one in terms of frame rates. :(

Edit: Isn't 1Tx supposed to be better than 2Tx?
 
Can it? Sure. Will it? No. I'm expecting 1080p/30.
I'm not. Tomb Raider has set the bar for remastered PS3 games and I doubt that Naughty Dog, or whoever they would trust to handle a port, would be aiming for less than 60fps.
 
I'm not. Tomb Raider has set the bar for remastered PS3 games and I doubt that Naughty Dog, or whoever they would trust to handle a port, would be aiming for less than 60fps.

And don't forget that TR included TressFX over the previous console version, which sucks down a massive amount of performance. Presumably TLoU won't have a feature like that on PS4, so 60fps should be easily achievable.
 