Next Gen Graphic Effects Are Amazing (Xbox 360, PS3)

Titanio said:
Thanks for clarifying, that's pretty much what I thought.

I wouldn't expect them to advertise it, but I guess that just highlights the problem with all of our information regarding Xenos. It's basically all ATi advertising, as all our info has been filtered through them. You basically have to pay as much attention to what they don't say as what they do.

Anyway, I'm not a HDR expert, but I don't think it was a wise decision not to have full support for FP16, but it's not surprising given ATi's attitude on that up until the R520 series. The dynamic range offered by FP10 is a tiny fraction of that of FP16 (1024 possible values per RGB component vs 65,536 possible values per component..?), and would require careful management of your data/art to try and hide shortcomings, if it could be done at all. It could potentially also hand PS3 games a default and immediate graphical advantage - probably the most commonly identified distinguishing feature of PS3 stuff to date has been "better lighting". And maybe this is why (?)
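Rough numbers, for illustration (assuming FP10 is the 3-bit exponent / 7-bit mantissa, -32 to 32 format described later in the thread, and treating FP16 as a standard IEEE-style half float - both assumptions, not confirmed specs):

```python
# Comparing the two formats numerically (layouts assumed, see above).
fp10_encodings = 2 ** 10      # 1024 distinct codes per component
fp16_encodings = 2 ** 16      # 65536 distinct codes per component

fp10_max = 32.0               # stated representable magnitude for FP10
fp16_max = 65504.0            # largest finite IEEE half-float value

print(fp16_encodings // fp10_encodings)  # 64x as many codes per component
print(fp16_max / fp10_max)               # ~2047x the representable magnitude
```

So the gap is not just in the number of code points but, more importantly for HDR, in the representable magnitude.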


I very highly doubt any normal person would be able to tell the difference between FP10 and FP16. Where has it been stated that the majority of PS3 games have better lighting? If someone likes the lighting in certain PS3 games, it comes down to the software implementation and personal preference more than the hardware anyway.
 
Last edited by a moderator:
Hardknock said:
I very highly doubt any normal person would be able to tell the difference between FP10 and FP16.

You may be right or wrong, but you base this on..?

I think it's more a case of people seeing a difference, but not being able to point specifically at a particular technical quality - and could you blame them? This is complex stuff relatively speaking, it's not like telling the difference between a low and high poly model ;) That doesn't mean a difference wouldn't be there though, or perceptible. You just can't expect people to immediately say "oh, that's X" or "oh, that's a lack of Y" anymore.

Again, I'm not an expert, but from what I've read, even with FP16 there'd still be cases where you might see the lack of precision. But it's a lot lot closer than FP10. It comes down to how general and free you are to throw whatever you want at the renderer without artefacting, from what I have read. FP16 allows you to do more without having to change things to hide limitations. This could maybe be significant when it comes to multiplatform games, for example (?) Some scenes, relatively low contrast scenes, may look fine with either level of precision, but others..?

Pity Nvidia didn't include a 10-bit per component image in their HDR comparisons at E3 ;)

Hardknock said:
Where has it been stated that the majority of PS3 games have better lighting?

It's been said in this very thread, for one. Since the system has been shown, many have tended to non-specifically say "they just look better" or have more of a prerendered look (and no, not because some actually were prerendered :p) and some have been a little more specific, pointing at lighting. I would tend to agree from what I've seen so far.
 
I'd imagine that it's too early to tell anything about the lighting of actual games, relative to one another, since developers haven't got final PS3 hardware and most current Xbox 360 games were developed using R420s, which had no HDR blending support.
 
Titanio said:
You may be right or wrong, but you base this on..?

[...]

It's been said in this very thread, for one. [...]

Um, unless they're the same game on each system, how could any reasonable person ascertain that PS3 has better lighting overall than Xbox 360? A lot of that has to do with personal preference and art direction. Not to mention (as Dave said) these are unfinished games that weren't developed on final hardware.
 
Hardknock said:
Um, unless they're the same game on each system how could any reasonable person ascertain that PS3 has better lighting overall than Xbox 360?

If the trend with PS3 games was nice(r) lighting so far?

Hardknock said:
A lot of that has to do with personal preference and art direction.

True, but all else being equal, better tools may well give better results (in this case the "tool" being much greater dynamic range).

Hardknock said:
Not to mention(as Dave said) these are unfinished games that weren't developed on final hardware.

But we know some X360 games are using HDR despite the lack of it until beta hardware, e.g. PGR3. It's not like we haven't seen that yet.
 
Titanio said:
If the trend with PS3 games was nice(r) lighting so far?

What "trend"? We've seen what, 2 or 3 real-time PS3 demos? :LOL:


And in the end, unless they are the exact same game the comparison is flawed.
 
Titanio said:
But we know some X360 games are using HDR despite the lack of it till beta hardware e.g. PGR3. It's not like we haven't seen that yet.
Just like skinning a cat, there's more than one way of implementing "HDR" within a title - frame buffer blending of a high dynamic range format is not the only way of doing it. Unless you know what's being used and how, you can't really come to many conclusions.
 
Hardknock said:
What "trend"? We've seen what, 2 or 3 real-time PS3 demos? :LOL:

:rolleyes:


There's plenty to choose from. Mobile Suit Gundam, The Getaway demo, Heavenly Sword, MGS4 (in particular - outstanding, almost photographic quality and contrast in some scenes), Vision Gran Turismo, amongst many others. Good to great lighting has been a hallmark of PS3 stuff so far, IMO.

Hardknock said:
And in the end, unless they are the exact same game the comparision is flawed.

Not really. If games on one system often or generally exhibit a quality not quite matched by games on another, one could readily point at the hardware. To take it to an extreme, I don't need Gran Turismo on my SNES to tell it has better lighting than SNES games, for example.

I'm sure you'll have your direct comparisons though.

Dave Baumann said:
Just like skinning a cat, there's more than one way of implementing "HDR" within a title - frame buffer blending of a high dynamic range format is not the only way of doing it. Unless you know what's being used and how, you can't really come to many conclusions.

I can simply conclude that based on what I've seen so far, PS3 titles seem to have better lighting. I'm not alone in thinking that. It remains true that the more constrained dynamic range on Xenos might well be a factor in that - not necessarily the only one, but one we can identify as a potential for now. I have seen devs express scepticism over the usefulness of FP10 (albeit with the caveat of no firsthand exposure at the time). There is a potential hardware explanation there. I'm sure you'd agree that the burden of proof lies with Xenos, and eventually these "30% power!" arguments will start ringing hollow (if they haven't already for many!). I'm sure you'd also agree more generally that all else being equal, more dynamic range is more desirable, and that there are reasons for that. There are some devs out there clamouring for FP32!
 
I'm not alone in thinking that. It remains true that the more constrained dynamic range on Xenos might well be a factor in that - not necessarily the only one, but one we can identify as a potential for now.
In what way is it constrained?

I'm sure you'd agree that the burden of proof lies with Xenos, and eventually these "30% power!" arguments will start ringing hollow (if they haven't already for many!).
Sorry, I'm not sure what you are talking about here - I assume that you are suggesting that Xenos is already close to being used to its potential (though what this has to do with HDR is a question). If this is the case, I would be of the opinion that current titles are not even scratching the surface of Xenos yet: none of the games will be using dynamic branching yet, none of the launch titles will have experimented with the high levels of geometry the unified architecture can allow, and the shader to pixel/texture ratio will be far lower than what can be achieved, given that the games were designed on hardware with a nigh-on 1:1:1 ratio. Developers have to get to grips with the tiling and its penalties before they really delve into those areas, I would imagine.

I'm sure you'd also agree more generally that all else being equal, more dynamic range is more desireable, and that there are reasons for that. There are some devs out there clamouring for FP32!
In a perfect world we'd have more of everything, but as we know there is never a perfect world with hardware, so you always have to balance things. Inevitably "more HDR" will also mean less of something else; with FP16 blending on PS3 you are looking at twice the framebuffer size and twice the bandwidth of normal rendering, which is inevitably going to come at the detriment of something else (then factor in the demands of AA on top of that, should it be desired). Titles that have been developed so far have been done so using G70s, which have different bandwidth properties than RSX will have, so it's going to be interesting to see how titles in development will be tuned when they have more representative dev kits.
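The doubling is easy to see with back-of-the-envelope colour-buffer sizes at 720p with no AA (buffer layouts assumed for illustration):

```python
# Colour buffer size at 720p: 8-8-8-8 integer vs FP16 per component.
width, height = 1280, 720
int8_bpp = 4    # 8-8-8-8 integer: 4 bytes per pixel
fp16_bpp = 8    # 16-16-16-16 float: 8 bytes per pixel

mb = lambda b: b / (1024 * 1024)
print(mb(width * height * int8_bpp))  # ~3.5 MB colour buffer
print(mb(width * height * fp16_bpp))  # ~7.0 MB - exactly twice
```

Every read or write of that buffer costs twice the bandwidth as well, which is where the blending cost comes from.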
 
Titanio said:
:rolleyes:

There's plenty to choose from. Mobile Suit Gundam, The Getaway demo, Heavenly Sword, MGS4 (in particular - outstanding, almost photographic quality and contrast in some scenes), Vision Gran Turismo, amongst many others. Good to great lighting has been a hallmark of PS3 stuff so far, IMO.

You roll eyes and then THAT is your so-called proof?

The only thing worthy of eye-rolling is your weak examples. 3 of the 5 are tech demos, and Heavenly Sword and MGS4 have no X360 version, so it speaks only to the skills of the developers, not the power of the hardware.

Also, Gears of War, Too Human and Mass Effect all seem to have excellent lighting, and the new GR: AW is also extremely good.

If you want a fair comparison stop comparing games due out in 2007 (MGS4) to games launched in 2005.

As for Xenos not performing true HDR, I don't know where you're getting that from; it's been confirmed it can do FP16 HDR, and I don't see how the eDRAM is a real issue here. RSX can't even do HDR+4xAA, so why couldn't Xenos use the same approach if it needed to? In the worst case it suffers from the same limitations as RSX, in the best case it can do more...
 
Dave Baumann said:
In what way is it constrained?

FP10 is 10bits per component versus 16 with FP16. It's a fraction of the available range. Maybe constrained isn't the right word - restricted?

Dave Baumann said:
Sorry, I'm not sure what you are talking about here - I assume that you are suggesting that Xenos is already close to being used to it potential

No, that's not what I'm suggesting, but if that's your point, it applies equally to PS3. I'm not sure what this has to do with HDR either, as it's not going to take multiple generations to be able to use it properly; we're already seeing it used in some games. If you see better implementations in the future, more power to you, but one could see better still in PS3's, and you equally couldn't contradict them.

Dave Baumann said:
In a perfect world we'd have more of everything, but as we know there is never a perfect world with hardware, so you always have to balance things.

True, but IMO FP16 is a better compromise if you have it.

Dave Baumann said:
Inevitably "more HDR" will also mean less something else; with FP16 blending on PS3 you are looking at twice the framebuffer size and twice the bandwidth of normal rendering, which is ineviably going to come at the detriment of something else (then factor in the demands of AA on top of that, should it be desired).

PS3 gets something of a free pass on this as long as it can't do MSAA with HDR ;) Reduction in available bandwidth for "other things" would also be relative to itself of course, not necessarily X360. It's obviously a variable, and it's difficult to gauge the impact of things like color and z compression, but I'd be somewhat surprised if after using a FP16 framebuffer, PS3 had less general bandwidth available than X360.

Dave Baumann said:
Titles that have been developed so far have been done so using G70's, which have different bandwidth properties than RSX will have

I may be mistaken, but I believe PC Watch Impress reported that the G70s in the PS3 dev kits had their bandwidth cut back to match the announced spec. Either way, on the other side, CPU-GPU bandwidth will only go up (a lot). I appreciate the point that regardless of all this, the bandwidth situation will be different, but then I guess that's what getting final hardware out to devs relatively early is all about.

Anyway, at least ATi's inclusion of FP10 in the R520 series means we should see more open discussion about it from devs than we might have otherwise, so that'll be interesting to watch with regard to this debate..

scooby_dooby said:
The only thing worthy of eye-rolling is your weak examples. 3 of the 5 are tech demos, and Heavenly Sword and MGS4 have no X360 version, so it speaks only to the skills of the developers, not the power of the hardware.

3 of the 5 are not tech demos. MGS4, HS and Mobile Suit Gundam are all games in development. Vision Gran Turismo is an experiment as is The Getaway - but neither is using HDR or lighting above and beyond what we've seen in the other mentioned games. The only way in which they are an unfair display of PS3's power is that they're using PS2-era assets - if anything, they understate the potential.

And again, see my previous point about cross-platform comparison.

scooby_dooby said:
ALso, Gears of War, Too Human and Mass Effect all seem to have excellent lighting, the new GR: AW also is extremely good.

Let's not debate "good" and "better". I'm not saying X360 games don't have good lighting. I'm saying I've seen better in PS3 games.

scooby_dooby said:
As for Xenos not performing true HDR I don't know where you're getting that from, it's been confirmed it can do FP16 HDR

Please read the thread.

If you want HDR+AA on RSX, Valve have a solution for you. It's less precision, but then so is FP10.
 
Titanio said:
Please read the thread.

I've read the thread; it's you grasping at straws to assume that since ATI didn't mention FP16 HDR in one slide, it therefore doesn't support it.

From Dave's article: "The ROP's can handle several different formats, including a special FP10 mode. FP10 is a floating point precision mode in the format of 10-10-10-2 (bits for Red, Green, Blue, Alpha). The 10 bit colour storage has a 3 bit exponent and 7 bit mantissa, with an available range of -32.0 to 32.0. Whilst this mode does have some limitations it can offer HDR effects but at the same cost in performance and size as standard 32-bit (8-8-8-8 ) integer formats which will probably result in this format being used quite frequently on XBOX 360 titles. Other formats such as INT16 and FP16 are also available, but they obviously have space implications."
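The quoted format can be sanity-checked with a quick sketch. The exact sign/bias handling in Xenos's FP10 isn't public, so the decoder below is purely a hypothetical reading of "3-bit exponent, 7-bit mantissa" that happens to reproduce the stated -32.0 to 32.0 range:

```python
# Hypothetical decode of one 10-bit FP10 component. The offset-signed
# mantissa interpretation is an assumption, not the documented encoding.
def fp10_component(mantissa, exponent):
    assert 0 <= mantissa < 2 ** 7 and 0 <= exponent < 2 ** 3
    frac = (mantissa - 64) / 64.0     # assumed: mantissa as signed fraction
    return frac * (2 ** exponent)     # scale by the 3-bit exponent

print(fp10_component(0, 5))      # -32.0, the stated lower bound
print(fp10_component(127, 5))    # 31.5, just under the stated upper bound
```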

Funny how you choose to ignore that obvious line stating FP16 support, instead choosing to "go with" a hunch about some ATI slide which IYO is very "telling". Pleeease...

Dave has already said there are other ways to implement HDR even without using the eDRAM, and presumably devs always have the option to use either AA or HDR (as they will have to on PS3 anyway, most likely)

The bottom line is there is no technical reason why PS3 games should have better lighting; you're smart enough to know this. There is no "trend" showing. Sony just manages their PR and video clips much better than MS; MS shows everything, Sony shows only what they think will impress you. BTW, I thought the general opinion on Mobile Suit Gundam was that the 360 version looked better; personally I think they look pretty much the same.

If the PS3 "GAMES" are looking so much better, why couldn't they even show one at TGS? What's telling to me is that, almost into 2006, Sony still hasn't even shown a real in-game video.

Let's wait until real games before we start identifying 'trends', because right now you're comparing real X360 games to pre-selected PS3 tech demos and trailers to try and make some argument about an overall superiority in PS3 lighting. It's extremely weak.
 
scooby_dooby said:
I've read the thread; it's you grasping at straws to assume that since ATI didn't mention FP16 HDR in one slide, it therefore doesn't support it.

[...]

Funny how you choose to ignore that obvious line stating FP16 support [...]

We're going round in circles now. Simple support for the framebuffer format is not in question. It's whether you can effectively do HDR on it or not. Simply having the framebuffer format is not enough. You need to support blending on that format, for example (which is where I guess the problem is on Xenos).

Besides, a dev here has already confirmed the matter (Faf). And you really think the likes of Dave Baumann would enter this discussion and let it continue if it were not the case?

Assuming it's there when ATi haven't said it is there, and others who should know say it's not there, is what's grasping at straws.

scooby_dooby said:
Dave has already said there are other ways to implement HDR even without using the eDRAM, and presumably devs always have the option to use either AA or HDR (as they will have to on PS3 anyway, most likely)

No, he didn't. This is not about the eDRAM issues we've seen elsewhere. It's simply about some framebuffer formats not having full functionality to support HDR. Your other option is FX16. More precision than FP10, I think (?), but it's "half-speed".

scooby_dooby said:
The bottom line is there is no technical reason why PS3 games should have better lighting

FP16 as opposed to FP10 may be that reason. Welcome to the debate.
 
scooby_dooby said:
The bottom line is there is no technical reason why PS3 games should have better lighting,

If the PS3 "GAMES" are looking so much better, why couldn't they even show one at TGS? What's telling to me is that, almost into 2006, Sony still hasn't even shown a real in-game video.

1. A technical reason might be something called Cell. A little bit of lighting trickery done on Cell could make a difference, or Cell post-processing could have an effect also.
2. You really are a fanboy. MGS4 was real-time; granted, it might not have been gameplay, but it was 100% in-engine. Add to that that Hideo stated it would look BETTER as the months progress.

Honestly, I'm getting really sick and tired of x-bots using the no-gameplay excuse as a way to downplay Sony and the PS3.
 
Titanio,
No, that's not what I'm suggesting, but if that's your point, it applies equally to PS3
I would suggest, probably not to the same degree. The hardware that developers are using is likely already much closer in terms of capabilities and configuration to RSX than Xenos is to anything that developers could have used before; while developers will get more interface bandwidth and about a 25% overall performance improvement (look at the GTX 512MB), they will be dealing with lower bandwidths.

It's obviously a variable, and it's difficult to gauge the impact of things like color and z compression
Z depths don't change. The level of colour compression probably won't change either; in fact, because of the increased range it's likely to be reduced. FP16 pixels are twice as costly.

but I'd be somewhat surprised if after using a FP16 framebuffer, PS3 had less general bandwidth available than X360.
I wouldn't. And it will almost certainly have less available RAM for other assets given equal resolutions - this is the alternative benefit of the eDRAM; the available RAM space isn't going to be severely impacted by what else you do to the pixels, as system RAM will only ever store portions of the displayable image.

I may be mistaken, but I believe PC Watch Impress reported that the G70s in the PS3 dev kits had their bandwidth cut back to match the announced spec.
I don't recall. I thought I remembered seeing a lower PCI Express bandwidth, which, if it were the case, would actually be in totally the opposite direction for final hardware.

Scooby,
Funny how you choose to ignore that obvious line stating FP16 support, instead choosing to "go with" a hunch about some ATI slide which IYO are very "telling" Pleeease...
What that doesn't make note of is whether there are blending capabilities or not, which I don't actually recall whether I asked about specifically.

Dave has already said there are other ways to implement HDR even without using the EDRAM, and presumably dev's always have the option to use either AA or HDR (as they wll have to on PS3 anyways most likely)
Radeon 9700 demonstrated an HDR demo using FP textures when it was first released, of course done without blending. Valve's Lost Coast demo also uses a mechanism that effectively uses a Photoshop filter post process on the frame, which I would guess is actually less costly than FP16 blending.
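Not Valve's actual implementation - just the general shape of that kind of post-process: take an ordinary LDR pixel value, scale the exposure, and compress it back into displayable range with a Reinhard-style curve:

```python
# Sketch of an exposure + tone-mapping post-process on a [0, 1] pixel value.
# The exposure value and the Reinhard curve are illustrative choices only.
def tonemap(value, exposure=2.0):
    v = value * exposure      # brighten to simulate a wider range
    return v / (1.0 + v)      # Reinhard-style compression back into [0, 1)

print(tonemap(0.5))   # 0.5
print(tonemap(1.0))   # ~0.667 - bright values are compressed, not clipped
```

Because it operates on an ordinary integer frame, nothing about it requires FP16 blending, which is why it runs on much wider hardware.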
 
Titanio said:
No he didn't. This is not about the eDram issues we've seen elsewhere. It's simply about some framebuffer formats not having full functionality to support HDR. Your other option is FX16. More precision than FP10, I think (?), but it's "half-speed".
Well, as we've already said, it's not "the other option", but another option. Half speed refers to the fact that it will blend at 4 pixels (or up to 16 subsamples) per cycle as opposed to 8 (32 sub), because it's probably bit-packing over multiple cycles. IIRC NV40 had a blend rate of half its pixel rate, and an FP16 blend rate of half that - I would guess that G70's FP16 blend rate is at most half its pixel fill-rate.

[Edit] Here we go: the FP16 blend rate of G70 is 8 pixels per clock.
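For a feel of what that blend rate implies for bandwidth (a blend reads and writes the destination; the clock figure below is a placeholder, not a confirmed spec):

```python
# Peak memory traffic of FP16 blending at 8 pixels per clock.
pixels_per_clock = 8
bytes_per_fp16_pixel = 8
clock_hz = 550e6                  # assumed RSX-class clock, illustration only

# read + write per blended pixel
traffic_gb_s = pixels_per_clock * bytes_per_fp16_pixel * 2 * clock_hz / 1e9
print(traffic_gb_s)   # 70.4 GB/s of raw blend traffic at theoretical peak
```

Peak numbers like this are never reached in practice, but they show why FP16 blending is bandwidth-bound long before it is fill-bound.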
 
Dave Baumann said:
I would suggest, probably not to the same degree. The hardware that developers are using is likely already much closer in terms of capabilities and configuration to RSX than Xenos is to anything that developers could have used before.

True, but there's much headroom on both for "creative solutions". I'm not sure how far you can go to escape a lack of FP16 functionality, but either way, workarounds are rarely standard. FP10 usage will be that standard I think.

Dave Baumann said:
Z depths don't change. The level of colour compression probably won't change either; in fact, because of the increased range it's likely to be reduced. FP16 pixels are twice as costly.

It'll always be twice as costly, but twice what is the question..

Dave Baumann said:
I wouldn't.

Wanna suggest a typical # of accesses to each pixel/z value per frame? If such a thing exists? That's really the only variable. This is kinda getting OT (though it's something I've wondered about for a while..).

Dave Baumann said:
And it will almost certainly have less available RAM for other assets given equal resolutions - this is the alternative benefit of the eDRAM; the available RAM space isn't going to be severely impacted by what else you do to the pixels, as system RAM will only ever store portions of the displayable image.

You mean just storing the framebuffer in RAM vs storing some or all of it in eDRAM. So you'll save a few MBs, maybe? It's not a one-way street, though; on the flipside you have to consider the memory requirements of the display list when using tiling..

Dave Baumann said:
Radeon 9700 demonstrated an HDR demo using FP textures when it was first released, of course done without blending. Valve's Lost Coast demo also uses a mechanism that effectively uses a Photoshop filter post process on the frame, which I would guess is actually less costly than FP16 blending.

The question is whether the results are as good. From what I've seen of Lost Coast, I'd say no (although I'll admit I have not taken a very close look). There's a tradeoff there; on the plus side they can run on much wider hardware and get AA with it. In fairness, though, there are AFAIK two Valve implementations to talk about.

As for FP textures, do they not require twice the memory space and bandwidth? Of the little discussion I've seen regarding them, they don't seem to be really feasible for general use in games yet.
 
scooby_dooby said:
You roll eyes and then THAT is your so-called proof?

[...]

If you want a fair comparison stop comparing games due out in 2007 (MGS4) to games launched in 2005.

[...]

I think it's better to compare exclusive games, since cross-platform releases aren't always exploiting each console's architectural abilities
 
I'm not sure how far you can go to escape a lack of FP16 functionality
Again, do you know what it's "lacking"?

It'll always be twice as costly, but twice what is the question..
On G70, FP16 blending costs twice the bandwidth and twice the memory footprint, and runs at half the rate, relative to an 8-bit per component integer pixel.

Wanna suggest a typical # of accesses to each pixel/z value per frame? If such a thing exists? That's really the only variable.
No, bandwidth is a major variable.

You mean just storing the framebuffer in RAM vs storing some or all of it in eDram. So you'll save a few MBs, maybe?
If you had a 720p display running with 4x FSAA and a 16-bit per component frame buffer, you'd be saving ~50MB.
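The arithmetic roughly works out, if you assume a 32-bit Z buffer and a resolved FP16 front buffer alongside the multisampled colour (those two assumptions are mine, not stated above):

```python
# Buffer sizes for 720p, 4x FSAA, FP16 colour (layouts assumed, see above).
width, height, aa = 1280, 720, 4
fp16_bpp, z_bpp = 8, 4

back = width * height * aa * fp16_bpp   # multisampled FP16 colour buffer
depth = width * height * aa * z_bpp     # multisampled 32-bit Z (assumed)
front = width * height * fp16_bpp       # resolved display buffer (assumed)

mb = lambda b: b / (1024 * 1024)
print(mb(back + depth + front))   # ~49 MB, in the region of the ~50MB figure
```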

It's not a one way street, though, on the flipside you have to consider the memory requirements of the display list when using tiling..
That's negligible; this isn't a PowerVR solution sorting all the geometry, it's merely memory for the commands that create the geometry.

As for FP textures, do they not require twice the memory space and bw?
Yes. Potentially more so as there’s no compression yet.
 
Dave Baumann said:
Again, do you know what it's "lacking"?

I can only guess it is blending, since everyone is presumably so gagged they can't discuss it publicly. I'm quite surprised you didn't ask for the article, since this was even such a point of speculation regarding their then-upcoming cards (R520), and everyone was wondering when ATi would start supporting it. Whatever the lack is, though, it's enough for ATi not to present it as an option for HDR alongside FP10 and FX16, and again, as a dev here put it, enough that "FP10 is the format you're supposed to use".

Dave Baumann said:
On G70, FP16 blending costs twice the bandwidth and twice the memory footprint, and runs at half the rate, relative to an 8-bit per component integer pixel.

Yes, my question is what that cost is with a 32-bit buffer, once whatever impact compression has is factored in (be it little or large). Then the other variable is how many times on average you're accessing values in the colour and z buffers, which, as you point out, can't really be pinned down.

Dave Baumann said:
If you had a 720p display running with 4x FSAA and a 16-bit per component frame buffer, you'd be saving ~50MB.

Again, though, unless Nvidia made changes for their 90nm parts and RSX, this isn't even an option, so why consider it? You'll typically be talking about a 720p 64-bit frame without MSAA. That, with a 32-bit framebuffer, would be ~10MB?
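For the no-MSAA case, the ~10MB figure checks out if you assume a 32-bit Z buffer alongside the FP16 colour buffer (the Z assumption is mine):

```python
# 720p, no MSAA: FP16 colour plus an assumed 32-bit Z buffer.
width, height = 1280, 720
colour = width * height * 8   # "64-bit" FP16 colour buffer
depth = width * height * 4    # 32-bit Z buffer (assumed)

print((colour + depth) / (1024 * 1024))   # ~10.5 MB
```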

Dave Baumann said:
That's negligible; this isn't a PowerVR solution sorting all the geometry, it's merely memory for the commands that create the geometry.

All I know is that when it was discussed previously, it was referred to as "potentially significant" by one dev here in terms of memory usage; that's what I was thinking back to, anyway.
 