DICE's Frostbite 2

I'm curious if you can give some insight into my question earlier in the thread: why was shading offloaded onto the SPUs, rather than the kinds of post-process effects and other work that other games seem to be offloading instead?

Of course I'm just an outsider, but my impression is that other games that don't do fully deferred rendering would have trouble with the texture access part if they tried to shade on SPUs. Far too much data with totally random access - but if you go deferred, you have everything already stored in the G-buffer, nice and tidy, so it's a far simpler and more predictable case. Still, BF3 has several layers of task splitting and optimization, so it's not that easy even this way.

The only reasonable comparison is with the Killzone 2/3 engine IMHO, but it appears that Guerrilla has far too many other jobs on the SPUs to leave enough processing time for shading. It would nevertheless be an interesting in-depth discussion, but I doubt anyone knowledgeable enough is up for it. Maybe Repi and some others had it over some beers at GDC ;)
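
To make the G-buffer point above concrete, here is a minimal CPU-side sketch of the access pattern. This is not Frostbite or actual SPU code - the tile size, struct layout and function names are invented for illustration. The idea is simply that a deferred shader walks the G-buffer in fixed-size tiles whose addresses are known up front, so each tile can be streamed into a small local buffer (standing in for the 256 KB SPU local store), shaded, and streamed back out - exactly the kind of predictable workload that SPU DMA handles well, and the opposite of the random texture fetches a forward shader would need.

```cpp
// Illustrative only -- not Frostbite/DICE code. Struct layout, tile size and names
// are assumptions made up for this sketch.
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <vector>

struct GBufferTexel {              // per-pixel attributes written by the geometry pass
    float   normal[3];
    float   depth;
    uint8_t albedo[4];
    uint8_t specSmoothness[4];
};

constexpr int kTileSize = 32;      // a 32x32 tile fits comfortably in a 256 KB local store

// Stand-in for the per-tile shading kernel that would run on an SPU:
// everything it reads is already resident in the local buffer.
void shadeTile(const GBufferTexel* in, uint32_t* out, int count)
{
    for (int i = 0; i < count; ++i) {
        // ... evaluate lighting from in[i] here; this just writes a dummy packed colour ...
        out[i] = 0xFF000000u | in[i].albedo[0];
    }
}

void shadeFrame(const std::vector<GBufferTexel>& gbuffer,
                std::vector<uint32_t>& frame, int width, int height)
{
    std::vector<GBufferTexel> localIn(kTileSize * kTileSize);   // "local store" input
    std::vector<uint32_t>     localOut(kTileSize * kTileSize);  // "local store" output

    for (int ty = 0; ty < height; ty += kTileSize)
        for (int tx = 0; tx < width; tx += kTileSize) {
            int rows = std::min(kTileSize, height - ty);
            int cols = std::min(kTileSize, width - tx);

            // Gather the tile: on a real SPU this is a handful of list-DMA transfers,
            // one per row, at addresses known up front -- no random texture fetches.
            for (int r = 0; r < rows; ++r)
                std::memcpy(&localIn[r * cols],
                            &gbuffer[(ty + r) * width + tx],
                            cols * sizeof(GBufferTexel));

            shadeTile(localIn.data(), localOut.data(), rows * cols);

            // Scatter the shaded tile back out (again a predictable DMA pattern).
            for (int r = 0; r < rows; ++r)
                std::memcpy(&frame[(ty + r) * width + tx],
                            &localOut[r * cols],
                            cols * sizeof(uint32_t));
        }
}
```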
 

I guess a question I can branch off from that discussion is whether all the work they did to build a tile-based deferred renderer made SPU shading a logical conclusion.
 
Oh, I missed it... then that makes 3 of us :smile:

I remember GG and ND mentioning they couldn't get the desired quality of post-processing on RSX, so they moved it onto the SPUs.

They said SPU post-processing costs slightly more with slightly better quality.
However, neither Uncharted nor KZ is big on transparency.
So if I had to guess, DICE simply wanted to do post-processing after forward rendering.
 
Well, Killzone is pretty big on transparency; the environments in both games use loads of transparent effects.
They are rendered at quarter resolution, though - but I don't think we are talking about that right now (are we?)
 

They said some of the post-processed effects come out better than what either console's GPU can manage (mathematically more correct, yada yada). You should be able to find the quotes by ND on the Depth of Field and SSAO effects. The GPUs should be able to beat the SPUs in some other stuff (handily).
 
So PC will sport fully dynamic lighting.
[image: bf3lightingbqyr.png]
 
Of course I'm just an outsider, but my impression is that other games that don't do fully deferred rendering would have trouble with the texture access part if they tried to shade on SPUs. Far too much data with totally random access - but if you go deferred, you have everything already stored in the G-buffer, nice and tidy, so it's a far simpler and more predictable case.
Which is an interesting parallel with the Cell GPU patents. Cell combined with some hardware texturing is theoretically capable of respectable rendering performance, and here we see the GPU working as a texturing unit for Cell. I wonder how far explorations into the Cell GPU architecture went in terms of evaluating its performance?
 
Yes, but would that have been possible with the hardware available to PS3's designers? If PS3's silicon budget had gone on an x86 CPU and an nVidia or ATi GPU, would that same transistor budget have been able to match or exceed the Frostbite 2 results that PS3 is getting with Cell+RSX?

The game is being done on the 360, presumably to the same quality, and that's much closer to the "regular CPU and good GPU" model of the PC. And the 360 is a year older.

So I don't see why it wouldn't have been possible with PC hardware of the time given console optimisations.
 
I almost spilled my drink upon seeing the first terrain screenshot ;)
Very impressive, especially considering how much trouble we have building any kind of terrain - we usually use per-shot matte paintings instead because it's so complicated to get good results...

Same here! I think this may be the first title where tessellation makes a truly impressive difference in the game, rather than something either just thrown in or experimented with. Well, that's assuming Crytek isn't making extensive use of terrain tessellation.

Thanks! We have a _long_ history of working with large-scale terrains in our games and it is definitely not easy - it's very difficult to scale things properly while keeping detail high and memory usage low.

We already had a quite competent system in Frostbite 1, with procedural shader splatting and dynamic heightfields, that I talked about at SIGGRAPH'07 (http://www.slideshare.net/repii/ter...l-shader-splatting-presentation?from=ss_embed), and we have since improved and revamped it quite significantly for Frostbite 2.

Hopefully something we will be able to describe in more detail at SIGGRAPH'11 :)
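
For readers unfamiliar with the term, here is a tiny sketch of the general idea behind procedural shader splatting. It is not DICE's implementation (see the SIGGRAPH'07 slides linked above for that); the material set, thresholds and falloffs are made up. The point is simply that instead of storing painted blend maps for every terrain texel, the per-material blend weights are evaluated procedurally from cheap inputs such as slope and height, and the material layers are then blended with those weights.

```cpp
// Illustrative only -- a generic shader-splatting weight function, not Frostbite code.
// Materials, thresholds and falloffs are invented for the example.
#include <algorithm>

struct SplatWeights { float grass, rock, snow; };

SplatWeights computeSplatWeights(float height, float slope /* 0 = flat, 1 = vertical */)
{
    // Rock wins on steep slopes, snow appears above a height threshold, grass fills the rest.
    float rock  = std::clamp((slope - 0.4f) / 0.2f, 0.0f, 1.0f);
    float snow  = std::clamp((height - 800.0f) / 100.0f, 0.0f, 1.0f) * (1.0f - rock);
    float grass = std::max(0.0f, 1.0f - rock - snow);

    // Normalise so the weights sum to one before blending the material layers.
    float sum = grass + rock + snow;
    return { grass / sum, rock / sum, snow / sum };
}
```

In a real engine these weights would be computed per pixel in the terrain shader (hence "shader splatting"), with extra mask textures and noise layered on top.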

Kudos. I have to admit I'd been growing distant from DICE games after BF: Vietnam due to all the game-breaking bugs in BF2 and BF2142. And BF:BC just seemed too "consoley." But with BF:BC2 and now BF3, you're definitely drawing me back in, in a big way. :)

Have to say I'm extremely impressed with what's been shown of Frostbite 2 so far.

Regards,
SB
 
So I don't see why it wouldn't have been possible with PC hardware of the time given console optimisations.
It's not a case of "can't be done", but of what features end up enabled or disabled on each system. They may turn out identical. It may be that one has significantly better IQ thanks to fancy AA and another has a better framerate. It may be that a standard PC design from 2005 will miss a lot of features enabled on PS3, or maybe not.

At least we have here a developer that is publicly showing their approach and demonstrating that they are trying to do the very best, no-compromise engine implementation on each system, including novel ways of thinking. So we'll get a more interesting comparison of systems than with other engines that haven't tried such things as shading on SPUs. And we'll get a game that we can play on PS3, XB360, a DX11 PC and a 2005 PC and compare the results in a way that better reflects what's possible on the machines rather than what is achieved through more traditional engines like UE3.
 
Comparisons to a 2005 PC won't be possible since the game requires at least DX10. So late 2006 is the best comparison point you will get (which is when the PS3 launched anyway). Comparing on a transistor basis, the PC wouldn't stand a chance, but that's arguably because DX10 capability didn't make a huge amount of sense at that transistor budget from a performance point of view. And obviously the level of optimisation would be worlds apart.

The best comparison from an architecture point of view is still the 360 IMO. That's a perfect example of what could be done in 2005/6 at that transistor budget with a normal(ish) CPU and a great GPU, which had a more appropriate balance between features and performance than the lower-end variants of G8x.

It will be interesting to see how BF3 performs on higher-end PCs from 2006, though. But not as a fair point of comparison to either console.
 
Indeed it sounds like the DX11 and PS3 paths will be roughly comparable (although from the Lighting talk it sounds like stuff is way more dynamic on PC, so maybe not), but DX10 or anything previous isn't going to be the same path and thus isn't really directly comparable. Time to upgrade your GPUs folks :)
 
It's DX10/11 only, so yes, the comparison to the DX10 path will certainly be interesting.
 
What are the tangible differences between the DX10 and 11 paths, aside from tessellation?
 
The compute shader is the big one in this context. They're doing their entire tile-based deferred shading step in a small number of big compute shaders. This isn't possible in DX10, so either you take several additional passes through main memory to store light lists (probably impractical) or you fall back on conventional non-tiled deferred shading, which will be significantly slower. Deferred MSAA is also far more efficient in DX11 because of the tiling.

They have a similar issue on the Xbox 360, so I'm curious to see what sort of fall-back path they've implemented, but I don't think there's any question that if you're planning to play BF3 on the PC, you should grab a DX11 card this fall. It's likely to be significantly faster and higher quality as well.
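
For anyone wondering what that compute-shader step actually does, here is a minimal CPU-side C++ analogue of tiled deferred shading. It is purely illustrative - invented names, sizes and attenuation model, and obviously not the BF3 shader itself. The shape of the algorithm is: per screen tile, find the tile's depth bounds, cull the lights against that range, then shade every pixel in the tile against the resulting short light list, without ever writing per-tile light lists out to main memory.

```cpp
// Illustrative only -- a CPU-side analogue of the tiled deferred shading step that the
// DX11 path performs in a compute shader. All names, sizes and the attenuation model
// are assumptions for the example.
#include <algorithm>
#include <cmath>
#include <vector>

struct PointLight { float x, y, z, radius; };
struct ViewSample { float x, y, z; };          // view-space position reconstructed from depth

constexpr int kTile = 16;                      // 16x16 pixel tiles, a typical choice

void shadeTiled(const std::vector<ViewSample>& position,   // per-pixel view-space positions
                const std::vector<PointLight>& lights,
                std::vector<float>& output, int width, int height)
{
    std::vector<int> tileLights;               // light list for the current tile
    for (int ty = 0; ty < height; ty += kTile)
        for (int tx = 0; tx < width; tx += kTile) {
            int rows = std::min(kTile, height - ty);
            int cols = std::min(kTile, width - tx);

            // 1) Tile depth bounds (the compute shader does this with a parallel reduction).
            float zMin = 1e30f, zMax = -1e30f;
            for (int r = 0; r < rows; ++r)
                for (int c = 0; c < cols; ++c) {
                    float z = position[(ty + r) * width + tx + c].z;
                    zMin = std::min(zMin, z);
                    zMax = std::max(zMax, z);
                }

            // 2) Cull lights against the tile's depth range (a real implementation also
            //    tests against the tile's screen-space frustum planes).
            tileLights.clear();
            for (size_t i = 0; i < lights.size(); ++i)
                if (lights[i].z + lights[i].radius >= zMin &&
                    lights[i].z - lights[i].radius <= zMax)
                    tileLights.push_back(static_cast<int>(i));

            // 3) Shade every pixel in the tile using only the short per-tile light list.
            for (int r = 0; r < rows; ++r)
                for (int c = 0; c < cols; ++c) {
                    const ViewSample& p = position[(ty + r) * width + tx + c];
                    float lit = 0.0f;
                    for (int li : tileLights) {
                        float dx = lights[li].x - p.x;
                        float dy = lights[li].y - p.y;
                        float dz = lights[li].z - p.z;
                        float d  = std::sqrt(dx * dx + dy * dy + dz * dz);
                        lit += std::max(0.0f, 1.0f - d / lights[li].radius);
                    }
                    output[(ty + r) * width + tx + c] = lit;
                }
        }
}
```

In DX11 all three steps live in a single compute shader dispatch, with the light list kept in on-chip shared memory, which is why a DX10 fallback either needs extra passes through main memory or reverts to conventional deferred shading.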
 