DICE's Frostbite 2

Compute shaders are the big one in this context. They're doing their entire tile-based deferred shading step in a small number of big compute shaders. This isn't possible in DX10, so either you take several additional passes through main memory to store the per-tile light lists (probably impractical) or you fall back on conventional non-tiled deferred shading, which will be significantly slower. Deferred MSAA is also far more efficient in DX11 because of the tiling.
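To make the tiling point concrete, here's a minimal sketch of that single-pass structure in CUDA (chosen only because its block/shared-memory model mirrors a DX11 compute shader). This is not DICE's code; the struct layout, the screen-space culling test, and the toy attenuation are all invented for illustration.

```cuda
// Sketch of tiled deferred shading as one kernel: each 16x16 thread block
// first culls the global light list into on-chip shared memory for its
// screen tile, then shades its pixels against only that per-tile list.
#include <cuda_runtime.h>

#define TILE 16
#define MAX_LIGHTS_PER_TILE 256

// Hypothetical types: lights in screen space for simplicity, and a
// single-channel stand-in for a full G-buffer.
struct Light { float3 pos; float radius; float3 color; };
struct Texel { float3 albedo; };

// Toy culling test: clamp the light's screen position to the tile rectangle
// and compare against its radius. A real engine would test the light volume
// against the tile's view-space sub-frustum and min/max depth instead.
__device__ bool lightHitsTile(Light l, int tx, int ty)
{
    float cx = fminf(fmaxf(l.pos.x, tx * (float)TILE), (tx + 1) * (float)TILE);
    float cy = fminf(fmaxf(l.pos.y, ty * (float)TILE), (ty + 1) * (float)TILE);
    float dx = l.pos.x - cx, dy = l.pos.y - cy;
    return dx * dx + dy * dy <= l.radius * l.radius;
}

__global__ void tiledDeferredShade(const Texel* gbuf, const Light* lights,
                                   int numLights, float3* out, int w, int h)
{
    __shared__ int tileCount;
    __shared__ int tileList[MAX_LIGHTS_PER_TILE];

    int x = blockIdx.x * TILE + threadIdx.x;
    int y = blockIdx.y * TILE + threadIdx.y;
    int tid = threadIdx.y * TILE + threadIdx.x;

    if (tid == 0) tileCount = 0;
    __syncthreads();

    // Phase 1: all 256 threads cooperatively cull the global light list.
    for (int i = tid; i < numLights; i += TILE * TILE) {
        if (lightHitsTile(lights[i], blockIdx.x, blockIdx.y)) {
            int slot = atomicAdd(&tileCount, 1);
            if (slot < MAX_LIGHTS_PER_TILE) tileList[slot] = i;
        }
    }
    __syncthreads();

    // Phase 2: shade each pixel, looping only over the surviving lights.
    if (x >= w || y >= h) return;
    Texel g = gbuf[y * w + x];
    float3 result = make_float3(0.f, 0.f, 0.f);
    int n = min(tileCount, MAX_LIGHTS_PER_TILE);
    for (int i = 0; i < n; ++i) {
        Light l = lights[tileList[i]];
        float dx = l.pos.x - x, dy = l.pos.y - y;
        // Toy linear falloff; stand-in for real BRDF evaluation.
        float att = fmaxf(0.f, 1.f - sqrtf(dx * dx + dy * dy) / l.radius);
        result.x += g.albedo.x * l.color.x * att;
        result.y += g.albedo.y * l.color.y * att;
        result.z += g.albedo.z * l.color.z * att;
    }
    out[y * w + x] = result;
}
```

The key point is that the per-tile light list lives in on-chip shared memory and never touches main memory, which is exactly the structure DX10 can't express.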

They have a similar issue on the Xbox 360, so I'm curious to see what sort of fall-back path they've implemented, but I don't think there's any question that if you're planning to play BF3 on the PC, you should grab a DX11 card this fall. It's likely to be significantly faster, and likely higher quality as well.

The interesting thing, then, will be: if the PS3 is following the more efficient DX11-style path, how much power will a DX10-level PC need to overcome those inefficiencies and produce equal or better results?
 
Indeed, it'll be interesting to see. Note that the PS3 pays the G-buffer readback overhead, though, which should normalize things a bit.
 

Interesting. It appears that with the "pay for extras" business model for B-P4F, network complexity is starting to approach low-end MMO levels, with more objects that need to be tracked. Still nowhere close to what an MMO needs to do over the network, but a fair bump up from your standard online FPS. Of course, that's countered a bit by potentially requiring more "instances" of play areas.

Regards,
SB
 
Mm... looks like the GDC PowerPoint. A shame the audio is bad. Why does the camera guy zoom OUT during the videos? -_-
 