Situations Where Next Gen Console Will Struggle/Shine?

Kabbage

Newcomer
I've been reading about how the PS3 will be bandwidth limited because of its lack of EDRAM and such. That got me wondering something that may just rehash what you guys have been over several times now: in what situations will each console shine, and in which situations will each console struggle?

(This isn't meant to be used as fodder on any other forum)

P.S. Try to keep it in "average Joe" terms, since I'm not as fluent in the technical terms as most of you here.
 
Shine? Lots and lots of geometry, lots and lots of characters on screen. These machines can process more polygons than we'll ever need, so it will be fairly trivial to keep adding objects without a performance hit. Obviously the limit will be reached eventually, but slowdowns will occur for shading and fillrate reasons, as all those objects will have to be shaded and lit properly.
I fully expect sound to be absolutely gorgeous on next-gen systems, with full high-quality surround sound. I just LOVE surround sound. I've had my surround sound system for years, long before this whole craze about HDTVs.

Struggle? Hard to say. I guess a developer can always find something to make the machine slow down, but I think it will be easy to kill performance with decent lighting on many objects on screen. It also all depends on the shaders; fillrate will run out relatively quickly even on next-gen machines.

Also, let's not forget that the artists will have to work hard too. You can have the perfect machine with unlimited performance, but if the sound and the design are crap, the game will look and sound crap. So the artists can be the limit sometimes, as we've seen in this generation too.
 
Kabbage said:
I've been reading about how the PS3 will be bandwidth limited because of its lack of EDRAM and such.

That's a fairly absolutist statement - of course it's limited, if it weren't you'd have infinite bandwidth ;) If you mean relative to X360, it's only relatively limited when it comes to framebuffer bandwidth - in terms of general bandwidth, it's the other way around. So I certainly don't think you can say it's generally bandwidth limited relative to another system.
 
Titanio said:
That's a fairly absolutist statement - of course it's limited, if it weren't you'd have infinite bandwidth ;) If you mean relative to X360, it's only relatively limited when it comes to framebuffer bandwidth - in terms of general bandwidth, it's the other way around. So I certainly don't think you can say it's generally bandwidth limited relative to another system.

And the framebuffer would be used for things like AA and particle effects, right?

Oh and thanks for the responses.
 
Kabbage said:
And the framebuffer would be used for things like AA and particle effects, right?

AA yes, it's one of the biggest bw consumers for the framebuffer, if not the biggest.

Particle effects - depends what you're doing, but in the "traditional" alpha-blended quad way, framebuffer bw becomes one factor.
 
If PS3 developers choose not to use AA, like they seem to be doing with MGS4 and Motorstorm, would they have more bandwidth than the 360? I mean, if 360 developers chose not to have it, would they gain anything? Could the 360 use the bandwidth in the eDRAM for something else?
Also, MS has made AA a standard - was that really a good idea?
 
weaksauce said:
If PS3 developers choose not to use AA, like they seem to be doing with MGS4 and Motorstorm, would they have more bandwidth than the 360? I mean, if 360 developers chose not to have it, would they gain anything? Could the 360 use the bandwidth in the eDRAM for something else?
Also, MS has made AA a standard - was that really a good idea?

One question at a time!! ;)

Right, IF PS3 games don't have AA, that's a whole lot of bandwidth saved, yes.
Any resource can be used for "something". If X360 games don't have AA, the resource saved from that could be used for something else. However, the savings apparently won't be that big, since AA has an estimated 5% hit on X360, which is really nothing...
 
Pure (wild) speculation...

Shine: Network connectivity... Xbox Live or not, 360, PS3 and Revo connectivity will impress many console gamers. One of the biggest changes we'll see in the coming years will be more MMOs on consoles. Tighter integration with other devices (media center, media hub, whatever you call it) will have the same effect on the way people use their consoles.

The consoles (notably the PS3) will let people do more with their console than before. I can see a ShutterFly-like service for the PS3. Businesses will not miss the opportunity to jump into console services, since they will reach an astounding number of homes. From game/movie rental to on-demand pizza, this is the sort of stuff that cable and telephone companies have been trying for years. I'm not saying it will be in every household, but it will prove a viable market. And the following generation will nail it.

Struggle: Graphics... Like CPU speed, if it's only marginally better, most people will not notice. And especially with today's PC graphics, many will be jaded about 360, PS3 and Revo graphics. Secondly, making a game truly jaw-droppingly gorgeous will take a budget that rivals most Hollywood movies. And frankly, I think a lot of low-budget games can be really fun... even without dazzling graphics.

I think overall Nintendo is on the right track, trying to appeal to the masses first and putting hardcore gamers second. However, as a convergence device, Revo will lag behind the 360 and PS3.
 
Hopefully, AI. With all the additional floating-point power, it would be exceedingly cool to see what good AI programmers can do. To me, this means stepping away from the current scripted, case-based AI (simple rules picked randomly based on conditions) towards a more uncertain and natural approach. Primarily I could see things such as Bayesian probabilities used to classify your tactics: are you aggressive or passive, etc.
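
Just to make that concrete, here's a toy sketch of what such a Bayesian tactic classifier might look like. Every prior, likelihood and event name below is invented purely for illustration; a real game would learn these numbers from play data:

```python
# Toy naive-Bayes classifier that labels a player's style as
# "aggressive" or "passive" from simple per-round observations.
# All probabilities here are made up for illustration.

PRIORS = {"aggressive": 0.5, "passive": 0.5}

# P(observed event | style): how likely each event is under each style
LIKELIHOODS = {
    "aggressive": {"rushed": 0.7, "camped": 0.1, "flanked": 0.2},
    "passive":    {"rushed": 0.1, "camped": 0.7, "flanked": 0.2},
}

def classify(observations):
    """Return P(style | observations) via Bayes' rule, naively
    assuming the observations are independent of each other."""
    posterior = dict(PRIORS)
    for obs in observations:
        for style in posterior:
            posterior[style] *= LIKELIHOODS[style][obs]
    total = sum(posterior.values())
    return {style: p / total for style, p in posterior.items()}

probs = classify(["rushed", "rushed", "flanked"])
print(probs)  # after two rushes, "aggressive" dominates
```

The point is that the AI's belief shifts gradually with evidence instead of flipping a hard-coded switch, which is exactly the "uncertain and natural" behaviour described above.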

Graphics are potentially the downfall of all the next-generation consoles, simply because you can put in as much eye-candy (which costs thousands to generate) as you like, but if the idea sucks it won't ever matter. This may turn out to be advantageous: hopefully some of the companies that lack originality but cram in as many flashy graphics as possible will be hit by the Darwin bus and go bankrupt, so that the more original studios survive.
 
weaksauce said:
If PS3 developers choose not to use AA, like they seem to be doing with MGS4 and Motorstorm, would they have more bandwidth than the 360?

Even if PS3 devs go up to the limit of what's available for the framebuffer (22.4GB/s, or 22.4GB/s multiplied by the average compression factor for what you're doing) - be that with AA, without AA, whatever - there's still more left over than on X360. But they probably won't, which will leave even more still.
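
To put very rough numbers on that kind of budget talk, here's a back-of-envelope sketch. The per-pixel byte costs and the overdraw factor are invented for illustration, and it ignores the colour/Z compression real GPUs do, so treat it as arithmetic, not a measurement:

```python
# Naive framebuffer-traffic estimate for a 720p/60 game.
# bytes_color/bytes_z and the overdraw factor are assumptions,
# and hardware compression (ignored here) reduces the real figures.

def framebuffer_gb_per_s(width, height, fps, msaa=1,
                         bytes_color=4, bytes_z=4, overdraw=2.5):
    samples = width * height * msaa
    # per sample: one colour write plus a Z read and a Z write,
    # scaled by how many times each pixel is touched (overdraw)
    bytes_per_frame = samples * (bytes_color + 2 * bytes_z) * overdraw
    return bytes_per_frame * fps / 1e9

for msaa in (1, 2, 4):
    bw = framebuffer_gb_per_s(1280, 720, 60, msaa)
    print(f"{msaa}xAA: ~{bw:.1f} GB/s against a 22.4 GB/s pool")
```

Even with these made-up inputs the shape of the argument is visible: in this naive model 4xAA quadruples raw framebuffer traffic, which is the multiplier the X360's eDRAM absorbs and the PS3 has to pay out of its shared GDDR3 bandwidth.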

weaksauce said:
I mean, if 360 developers chose not to have it, would they gain anything? Could the 360 use the bandwidth in the eDRAM for something else?

london-boy said:
If X360 games don't have AA, the resource saved from that could be used for something else.

You can only trade off against other framebuffer operations, or more specifically other things the ROPs can do - like alpha blending, perhaps. Things like DOF and motion blur, if done in shaders, are trading against the 32GB/s between the parent and daughter die.

You can't cash in that bandwidth to use for "the rest"* - texturing/vertex fetch, CPU usage etc - as you can on PS3.

*well I don't think you can. Dave's article talks about the result of RTT ops having to be copied out into main memory, for example, before being used as a texture, so it doesn't seem you can.
 
Things will shine when Cell starts to be used properly. It's actually quite strange how much flak Cell has gotten on this forum; it feels like a defence of the "what I bought" mentality.
In the end, I think this has lowered expectations of what Cell can do and raised pressure on the XCPU to look just as good. But many of those who downplayed Cell, for whatever reason, may realise that it is that good after all, and that the XCPU maybe isn't as good as many claim or claimed.
 
Right, IF PS3 games don't have AA, that's a whole lot of bandwidth saved, yes.
Any resource can be used for "something". If X360 games don't have AA, the resource saved from that could be used for something else. However, the savings apparently won't be that big, since AA has an estimated 5% hit on X360, which is really nothing...

I find it frankly astounding to see people trying to spin running without AA as something positive or desirable, treating it as just a matter of saving bandwidth without taking the IQ loss into account. Why not bring back 16-bit rendering and 8-bit textures while we're at it? I mean, all that 32-bit and FP stuff costs quite a lot of bandwidth, right? :rolleyes:

It takes some clever wording to turn the proposition "AA costs very little on the 360 which makes it possible to always implement it with proper programming" into "the bandwidth saved by disabling AA on the 360 is not that big". :oops:
 
Corwin_B said:
I find it frankly astounding to see people trying to spin running without AA as something positive or desirable, treating it as just a matter of saving bandwidth without taking the IQ loss into account. Why not bring back 16-bit rendering and 8-bit textures while we're at it? I mean, all that 32-bit and FP stuff costs quite a lot of bandwidth, right? :rolleyes:

It takes some clever wording to turn the proposition "AA costs very little on the 360 which makes it possible to always implement it with proper programming" into "the bandwidth saved by disabling AA on the 360 is not that big". :oops:


:???: I'm really not sure where all that is coming from... If anything, my statements were leaning towards the fact that AA has a very small hit on X360 (which is a Good Thing) and the fact that on PS3 you either have AA or HDR (which is a Bad Thing)...

I think you should read my post again, this time without assuming I'm some kind of fanboi.

The "spin" was all in your head. Or maybe you just quoted the wrong guy: I think you should have quoted Titanio. He's the one trying to make it look like a good thing that AA might not be used in PS3 games, certainly not me. :smile:
 
I'm not saying it's a good or bad thing. It's simply a choice the developer can make.

Corwin_B protests that "it takes some clever wording to turn the proposition "AA costs very little on the 360 which makes it possible to always implement it with proper programming" into "the bandwidth saved by disabling AA on the 360 is not that big"", but the latter part of that statement is true, in that any daughter-die bandwidth savings are of only limited use: they can only go to things the daughter die can do. You can't use that saved bandwidth in a general manner, so you may as well use it for what the daughter die is meant to do, and a big part of that is AA. Of course, tiling issues may present other tradeoffs, but I'm looking purely at the bandwidth here.

london-boy - re. HDR and AA, on PS3 it's a choice of having both via shaders, or FP16 HDR via hardware without AA (although I can't see what would stop you using an FP16 buffer with AA and HDR done via shaders too, except increased bandwidth consumption, and the fact that it might look little different ;)).
 
As Xbox360 games have shown, even a 720p image without AA is painfully jaggy on new LCDs. And it's not just jaggies, which can often be hidden with clever colour choices; it's the moiré and all the other aliasing errors that really shouldn't be there in 2006.

I don't think it's a matter of "the developer has the choice to either use AA or use the resources for something else". It's that AA is needed most of the time.
 
london-boy said:
As Xbox360 games have shown, even a 720p image without AA is painfully jaggy on new LCDs. And it's not just jaggies, which can often be hidden with clever colour choices; it's the moiré and all the other aliasing errors that really shouldn't be there in 2006.

I think it's not a matter of "the developer has the choice to either use AA or use the resources for something else". It's a matter of AA is needed most of the time.

Which Xbox360 games are you using for reference, though? PGR3, at least, seems to be rendering at slightly less than 720p and then upscaling, which will result in more aliasing. Another thing is viewing distance from the TV: does a non-anti-aliased 720p game look better than a non-anti-aliased 480p game from the same distance? What if PS3 games render for the most part at 1080p without any AA and then downscale to 720p? What would the results be? Without knowing the answers to those questions, I don't see "AA is a requirement" as such a clear point. And what about games that don't look that bad without AA?

As far as I have seen from high-definition screens of some footage, I really had nothing to complain about, even with some jaggies here and there. I'm not sure how representative that is, though.

Also, the COD footage I've seen had no anti-aliasing at all and to be honest, it didn't look that bad.
 
Phil said:
Which Xbox360 games are you using for reference, though? PGR3, at least, seems to be rendering at slightly less than 720p and then upscaling, which will result in more aliasing. Another thing is viewing distance from the TV: does a non-anti-aliased 720p game look better than a non-anti-aliased 480p game from the same distance? What if PS3 games render for the most part at 1080p without any AA and then downscale to 720p? What would the results be? Without knowing the answers to those questions, I don't see "AA is a requirement" as such a clear point. And what about games that don't look that bad without AA?

As far as I have seen from high-definition screens of some footage, I really had nothing to complain about, even with some jaggies here and there. I'm not sure how representative that is, though.

Also, the COD footage I've seen had no anti-aliasing at all and to be honest, it didn't look that bad.


Obviously some people are more susceptible to a crappy image than others. And obviously things get better "from a distance", but that all depends on the distance, on the eyes, on the type of game... It's all so variable that giving one opinion and one solution is useless, but I did say that in my post :smile:

If the PS3 renders each and every game at 1080p and outputs at 720p, it will look OK, better than a non-AA 720p image, though not great. And 1080p without AA will look gorgeous; the pixels will be small enough that jaggies are mostly hidden, but that doesn't take care of texture aliasing and moiré.
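
For what it's worth, rendering high and downscaling is just ordered-grid supersampling. A toy sketch of the idea, using a 1D black/white "edge" instead of a real frame (the resolutions and edge position are arbitrary illustration values):

```python
# Supersampling in one dimension: render a hard edge at 3x the target
# resolution, then box-filter down. The edge pixel picks up an
# intermediate grey instead of a hard 0/1 step.

def render(width):
    """'Render' a vertical edge at 45% of the screen: 0=black, 1=white."""
    return [0.0 if x / width < 0.45 else 1.0 for x in range(width)]

def downscale(samples, factor):
    """Box filter: average each group of `factor` samples into one pixel."""
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples), factor)]

direct = render(10)                 # at target resolution: hard step
scaled = downscale(render(30), 3)   # at 3x resolution, then downscaled

print(direct)  # only 0.0s and 1.0s, a fully aliased edge
print(scaled)  # the pixel straddling the edge becomes a grey blend
```

That grey blend is the smoothing effect; it also shows why this only helps edges that actually straddle pixel boundaries, while texture shimmer and moiré in motion need filtering of their own.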

Screenshots only tell you one story. AA affects how all those pixels blend together, and it's not only about jaggies; as I said, it's about how the image looks in motion (moiré is one of the most hated aliasing effects, at least for me).

The footage of COD I've seen on X360 looked painfully aliased to my eyes. It might be just me, my eyes might be bitchier than most, but at times it made me go ewww. Same for PGR3, only much more so than COD.
 
london-boy said:
As Xbox360 games have shown, even a 720p image without AA is painfully jaggy on new LCDs. And it's not just jaggies, which can often be hidden with clever colour choices; it's the moiré and all the other aliasing errors that really shouldn't be there in 2006.

To reiterate what was said earlier, it's a matter of choosing between more, less or no AA, not simply all or none. Maybe we should be asking ourselves how typically reasonable 2xAA or 4xAA would be with up to 22.4GB/s before writing either off.

But I do think it is very simply a matter of the developer choosing what they want. It's up to them to decide whether the tradeoffs are worth it. It's not like the gains wouldn't be pretty significant.
 
Titanio said:
To reiterate what was said earlier, it's a matter of choosing between more, less or no AA, not simply all or none. Maybe we should be asking ourselves how typically reasonable 2xAA or 4xAA would be with up to 22.4GB/s before writing either off.

But I do think it is very simply a matter of the developer choosing what they want. It's up to them to decide whether the tradeoffs are worth it. It's not like the gains wouldn't be pretty significant.


Of course, but you'd think that in 2006 AA would come free with a coffee and a croissant with every game.

Really, not only jaggies, but moiré patterns like this http://web.onetel.net.uk/~simonnihal/assorted3d/samppat.html should not exist today, and they WILL show on nice HDTVs. (That's just a simple example.) Obviously "clever artistic design" can work around it by using low-contrast colours in the game, but really...
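
For anyone wondering why moiré happens at all: it's what you get when you sample a pattern finer than the pixel grid can represent, so it folds back into a false, coarser pattern. A toy 1D sketch (the stripe frequency and pixel count are arbitrary illustration values):

```python
import math

# Aliasing in one dimension: a fine stripe pattern sampled by too few
# pixels shows up as a false, much coarser pattern (1D "moiré").

def stripes(x, cycles):
    """1 if position x (in screen widths) falls in a bright stripe."""
    return 1 if math.sin(2 * math.pi * cycles * x) >= 0 else 0

# 9 stripe cycles across the screen, sampled at 10 pixel centres.
# Nyquist says we'd need more than 18 samples to capture 9 cycles,
# so the result collapses into a single slow false cycle.
pixels = [stripes((i + 0.5) / 10, 9) for i in range(10)]
print(pixels)  # [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]: one big stripe, not nine
```

That false "one big stripe" is the same mechanism behind the shimmering bands in the linked page, and supersampling or texture filtering fights it by averaging several samples per pixel before the frequency folds over.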

I'm not bitching (much...), I'll be happy whatever happens. I don't think I'll be buying any next-gen console any time soon anyway.
 