More RSX tidbits:

People should go check the "capabilities" of the 6800 Ultra ;)

Jawed

Well, that's the strange part on NVidia's end, to tell you the truth... Like, G70 can do FP32, but with G80 they're sort of 're-announcing' that feature. I guess because it's only actually becoming viable in G80...
 
If I'm not mistaken, an EGM article stated that the PS3 version of COD 3 will feature better lighting, light sources, etc. over its other console brethren. If this site permits, I can post a scan of part of that article tomorrow.

Summary is ok, scans aren't allowed.
 
The problem is that we do not have the final specification sheet yet. We really need Sony and Nvidia to jointly release an updated version.
 
128-bit HDR is a waste of resources (assuming that's 128 bits per channel), and even if PS3 can use it, it's still a waste.

Naturally it means 32 bits per channel. I think nVidia and ATI use the sum of the per-channel bit counts for marketing reasons, since a bigger number is better.
 
Is someone here actually believing the OP's 128 bits means single channels of 128 bits of colour information? :oops:
I don't even want to try counting how many bits of info would be needed for that.
 
That's what I was asking, because I thought FP32 was already supported on several cards, so 128 bits per pixel wouldn't actually be anything new.
 
The way I understand the whole thing has always been:

FP32 is what is also called 128-bit rendering, as in 32x4 (channels) = 128.

FP16 is what is also called 64-bit rendering, as in 16x4 = 64.

Even FP16 should give much better results than the 32-bit rendering we got in the last few years, which in turn gave better results than good ol' 16-bit rendering.

But that also shows you how overkill FP32 is in the vast majority of cases, if not always.

And that's why I was a bit surprised at how people found it amazing that RSX supports "128bit rendering", when FP32 has been around for quite a while on NVIDIA GPUs...
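
The arithmetic is trivial to sanity-check; a throwaway C snippet (nothing GPU-specific, just the multiplication above):

#include <stdio.h>

/* Bits per pixel = bits per channel x number of channels (R, G, B, A). */
int main(void) {
    const int channels = 4;
    const int per_channel[] = { 8, 16, 32 };   /* int8, FP16, FP32 */
    for (int i = 0; i < 3; i++)
        printf("%2d bits/channel x %d channels = %3d-bit rendering\n",
               per_channel[i], channels, per_channel[i] * channels);
    return 0;
}

which prints the familiar 32 / 64 / 128-bit figures.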
 
It's worth noting that there was a large increase in quality when CG renderers moved from 32 bits per channel to 64 bits. That'd make 256-bit the 'optimum' for quality, I think. I can at least see that 128-bit should make a small but noticeable quality difference in some situations, so it won't be totally pointless. But I also doubt it'd be used in any but the most limited of situations, due to the system demands.
 
We have all known that RSX supports 128-bit HDR since the very first announcement at E3 last year. You'll never see it in games, and if you do, it means that the developers really had resources to spare, which should have been put to much better use.
There is no point whatsoever in using 128-bit today, when you need to worry about so many other things before you slow down your game with silly rendering formats just so you can tick a checkbox.

If Factor5 (and lots more) came out today and said they were using 128-bit HDR, I'd laugh at them. They need to sort out a lot of other things that are wrong with the look of the game before they go all out on the PR front.

128-bit HDR is more of a PR format than it is a real rendering format... 1080p is bad enough...


I agree with everything you said. Well except the little 1080p part. 1080p is cool. :devilish:

But it really does look like 128bit HDR is a complete waste. How good is 32bit HDR compared to nAo's Heavenly Sword HDR lighting?
 
32bit HDR = fp8 (8 bits per channel)
40bit HDR = fp10 <- special case for Xenos (packed 10-10-10-2, so it actually fits in 32 bits per pixel)
64bit HDR = fp16
128bit HDR = fp32
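
To put rough numbers on those formats, here's a back-of-the-envelope colour-buffer cost for a single 1080p frame (pure arithmetic, ignoring Z, AA and tiling):

#include <stdio.h>

/* Colour-buffer size of one 1920x1080 frame in each format above. */
int main(void) {
    const long pixels = 1920L * 1080L;
    const struct { const char *name; int bpp; } fmt[] = {
        { "int8/fp8 (32bpp)",        32 },
        { "fp10 10-10-10-2 (32bpp)", 32 },
        { "fp16 (64bpp)",            64 },
        { "fp32 (128bpp)",          128 },
    };
    for (int i = 0; i < 4; i++)
        printf("%-24s %5.1f MB\n", fmt[i].name,
               pixels * (fmt[i].bpp / 8.0) / (1024.0 * 1024.0));
    return 0;
}

fp32 comes out around 31.6 MB for the colour buffer alone, twice the fp16 cost and four times the 32bpp formats, before you even count the bandwidth to read and write it every frame.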

LDR is normal lighting... this is the light you tend to live in, especially at night. HDR is overrated
 
What do you mean, "how good is it"?
Look at it. Does it look good to you? That's how good it is.


Yeah, my question sucked. I meant: would it even be worth the time, energy, and system resources to program a game with FP32 in mind, if RSX could do it, compared to what NAO32 can do?

Of course, that's if this news is true. Which it probably isn't.
 
Mmm, not really... Reality's dynamic range is higher than what our silly computers can do now. But this is the point where I get that you were being sarcastic and feel slightly embarrassed...
 
Like with everything else, it's all about speed. FP32 is slow. NAO32 isn't as slow. When GPUs can render at FP32 as fast as they can render FP10 (or whatever), then we'll all move forward.
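
For the curious: as far as has been discussed publicly, NAO32 is a LogLuv-style encoding, squeezing HDR colour into an ordinary 8-bit RGBA target by storing chromaticity in two channels and log-luminance split across the other two. A rough C sketch of the idea follows; the matrix and the scale/bias constants are illustrative, not the actual Heavenly Sword shader:

#include <math.h>
#include <stdint.h>
#include <stdio.h>

typedef struct { uint8_t r, g, b, a; } RGBA8;

/* Sketch of a LogLuv-style encoding (the idea behind NAO32). */
static RGBA8 encode_logluv(float r, float g, float b) {
    /* linear sRGB -> CIE XYZ */
    float X = 0.4124f * r + 0.3576f * g + 0.1805f * b;
    float Y = 0.2126f * r + 0.7152f * g + 0.0722f * b;
    float Z = 0.0193f * r + 0.1192f * g + 0.9505f * b;
    float d = X + 15.0f * Y + 3.0f * Z + 1e-6f;
    float u = 4.0f * X / d;                        /* u' chromaticity */
    float v = 9.0f * Y / d;                        /* v' chromaticity */
    /* log-encode luminance, split into integer + fraction bytes */
    float Le = log2f(Y + 1e-6f) * 16.0f + 128.0f;  /* illustrative scale/bias */
    Le = fminf(fmaxf(Le, 0.0f), 255.0f);
    RGBA8 o;
    o.r = (uint8_t)Le;                             /* high bits of log-luma */
    o.g = (uint8_t)(255.0f * (Le - floorf(Le)));   /* low bits of log-luma */
    o.b = (uint8_t)(255.0f * u);
    o.a = (uint8_t)(255.0f * v);
    return o;
}

int main(void) {
    RGBA8 p = encode_logluv(5.0f, 3.0f, 1.0f);     /* an HDR value > 1.0 */
    printf("%d %d %d %d\n", p.r, p.g, p.b, p.a);
    return 0;
}

The point being that the render target stays a plain 32bpp buffer, so you keep the memory footprint and bandwidth of LDR rendering while still covering a wide luminance range.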
 
The real range of sight is very wide and dynamic, and much more sensitive to shadow and particulates than is represented in video games. Your iris is resizing constantly to compensate for lighting conditions... all displays are merely representations of lighting conditions... I think global illumination techniques, atmospheric and surface scattering, and reflections would produce better and more realistic interpretations of real-world lighting conditions than the craze over HDR.

You can use "lesser" forms of dynamic range lighting than the fp32/fp16 formats and mix them with LDR to get fairly realistic representations of real-world lighting situations...
 
And can I just remind everyone of the very interesting part of this whole 128-bit colour and FP32 mumbo jumbo: for all of the people using LCD displays to play these games, in the end the physical display is still 8-bit... Just to put things into perspective... More precision helps with approximation errors, but after a certain point (well before 128-bit!) there will be no discernible difference, as the physical display will only be able to show so many colours. In LCDs' case it's still 8-bit, last time I checked obviously. Sometimes even 6-bit!
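
That ceiling is easy to demonstrate: two shades a renderer can distinguish at FP32 collapse onto the same code once quantized for an 8-bit panel (the values below are arbitrary examples):

#include <stdio.h>
#include <math.h>

/* Distinct FP32 shades collapse to one 8-bit display code. */
int main(void) {
    float a = 0.500000f, b = 0.501000f;           /* easily distinct at FP32 */
    printf("a -> %ld, b -> %ld\n",
           lroundf(a * 255.0f), lroundf(b * 255.0f));  /* both print 128 */
    return 0;
}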
 
What the hell? I hope you are wrong hombre.
 
Uhm, nope. The physical screen in LCDs is 8-bit per channel, as in the physical liquid crystals are only capable of that much. Last time I checked, of course. It might have gone up a bit, I heard it would eventually, to 12-bit, but I'm not sure that has happened yet?
Numerous times, in order to get the response times lower (to avoid the motion blur LCDs have problems with), they use 6-bit panels, which is silly because the sets still have motion blur, but this time with even fewer colours displayed.
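
For what it's worth, those 6-bit panels usually fake the missing levels with frame-rate control (FRC), flickering between the two nearest displayable levels over successive frames. A toy model of the idea, my own illustration rather than any vendor's actual algorithm:

#include <stdio.h>

/* Toy frame-rate control: alternate between the two nearest 6-bit levels
   so the temporal average approximates the requested 8-bit value. */
int main(void) {
    int target8 = 130;                 /* requested 8-bit level (0..255) */
    double want6 = target8 / 4.0;      /* ideal fractional 6-bit level: 32.5 */
    double err = 0.0, sum = 0.0;
    int frames = 60;
    for (int f = 0; f < frames; f++) {
        int shown = (int)(want6 + err + 0.5);  /* nearest level this frame */
        err += want6 - shown;                  /* carry rounding error forward */
        sum += shown;
    }
    printf("average over %d frames: %.2f (ideal %.2f)\n",
           frames, sum / frames, want6);
    return 0;
}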
 