PlayStation 3 E3 thread 4

Getaway used RSX... I remember during the presentation he was highlighting the use of HDR in RSX... HDR is not a Cell feature
 
blakjedi said:
Getaway used RSX... I remember during the presentation he was highlighting the use of HDR in RSX... HDR is not a Cell feature

He didn't use this demo to highlight that. There were screenshots of the demo and you could clearly see that especially the bright areas had their problems, something the RSX would be really good at.

Fredi
 
This demo fell within the Cell demos part of the programme. RSX came later with its own demos, including shaders. The HDR illumination of Doc Ock was Cell. The Getaway renders were described as 'mostly Cell'. The Getaway demo was more to show a living-world simulation than amazing graphics (though the graphics weren't a problem IMO! The contrast changes from viewing light and dark areas were very realistic).
 
McFly said:
blakjedi said:
Getaway used RSX... I remember during the presentation he was highlighting the use of HDR in RSX... HDR is not a Cell feature

He didn't use this demo to highlight that. There were screenshots of the demo and you could clearly see that especially the bright areas had their problems, something the RSX would be really good at.

Fredi

Uh... I need to watch the video of the Getaway demo from the presentation again to be sure, but that is what I remember, and I commented on it in another thread...
 
Hey, seriously Shifty, it was the best contrast from light to dark that I have ever seen. I was like this -> :oops:. For some reason the Getaway demo is not being talked about as much as it should be. But they will notice when the games come out. ;)
 
McFly said:
blakjedi said:
Getaway used RSX... I remember during the presentation he was highlighting the use of HDR in RSX... HDR is not a Cell feature

He didn't use this demo to highlight that. There were screenshots of the demo and you could clearly see that especially the bright areas had their problems, something the RSX would be really good at.

Fredi

I thought Harrison mentioned HDR as they scanned around the Getaway demo.

In one of the interviews, he said SCEE had had dev kits for 5 months, which makes sense; they would have had them longer than even key third parties like Epic.

Anyways, Getaway has a checkered history. Early screens from Getaway were used to suggest what was possible on the PS2. They were almost photorealistic, and they talked about how much detail they put into a particular section of London.

Well, the game arrived about two years later and it was run-of-the-mill at best.
 
About the 3 Ethernet ports, from the interview:
In terms of the PS3 console itself... Why does it have three network ports on the back?

Because it can be a hub, rather than just being a terminal at the end of a network. Also, we want to be able to have a Gigabit port for an IP camera. So one of the ports is an in, and two of them are through. It can be a server as well as a terminal.
So the new EyeToy is not connected via USB, but via Ethernet? :?
 
Hey wco, they did use the Getaway demo to show HDR, plus other things. You remembered right, so don't worry about it. Some people here just don't want to believe anything that Sony says.

*shakes head*
 
Is 128-bit High Dynamic Range lighting the main differentiator between PS3 and XB360 that we have seen so far (excepting the CGI of course ;-) )? And if so, is that because of the state of the 360 dev kits, or because the R500 will be lacking that feature?

I thought the ducks and London tech demos both had a CGI quality to them: a combination of resolution, IQ and the color range.
 
gmoran said:
Is 128-bit High Dynamic Range lighting the main differentiator between PS3 and XB360 that we have seen so far (excepting the CGI of course ;-) )? And if so, is that because of the state of the 360 dev kits, or because the R500 will be lacking that feature?

The X360 will be more than able to handle HDR. If not 128-bit, at least 96-bit.
 
london-boy said:
gmoran said:
Is 128-bit High Dynamic Range lighting the main differentiator between PS3 and XB360 that we have seen so far (excepting the CGI of course ;-) )? And if so, is that because of the state of the 360 dev kits, or because the R500 will be lacking that feature?

The X360 will be more than able to handle HDR. If not 128-bit, at least 96-bit.

It might very well be limited to FP16 blending (64-bit). All the realtime HDR you have seen from Sony is FP16 (FP32 blending is simply not supported on the current hardware), and FP32 has twice the performance hit and only marginally higher quality than FP16 blending.
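
To put rough numbers on that performance hit: a framebuffer blend has to read the destination and write the result, so the bandwidth cost scales directly with pixel size, and FP32 pixels are twice the size of FP16 ones. A back-of-the-envelope sketch in C (the 720p/60fps target and single blended layer are illustrative assumptions, not measured figures):

#include <stdio.h>

int main(void)
{
    /* Illustrative assumptions: 1280x720 target, 60fps, one blended
       layer per pixel. RGBA16F is 8 bytes per pixel, RGBA32F is 16. */
    const double pixels = 1280.0 * 720.0;
    const double fps = 60.0;
    const double fp16_bytes = 8.0, fp32_bytes = 16.0;

    /* A blend reads the destination and writes the result, so each
       blended pixel touches twice its storage size. */
    double fp16_gbps = 2.0 * pixels * fp16_bytes * fps / 1e9;
    double fp32_gbps = 2.0 * pixels * fp32_bytes * fps / 1e9;

    printf("FP16 blending: %.2f GB/s\n", fp16_gbps); /* ~0.88 GB/s */
    printf("FP32 blending: %.2f GB/s\n", fp32_gbps); /* ~1.77 GB/s */
    return 0;
}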
 
london-boy said:
The X360 will be more than able to handle HDR. If not 128-bit, at least 96-bit.

Well, I'd assumed R500 would be at least 64-bit HDR, as that's what current chips are. But as they haven't mentioned 128-bit HDR or a 128-bit frame buffer, I'm not sure.

I suppose it's similar to how some are wondering about RSX AA, as it wasn't mentioned: I'm assuming RSX will have Quincunx 2x AA for free (if it's worth it? It might be at high res; has it improved since GF3?), and NV's RGAA, which will be relatively (to R500) expensive.
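
For reference, Quincunx as I understand it is 2x multisampling resolved with a five-tap filter: the pixel's own sample weighted 1/2 and four diagonal taps at 1/8 each, which is where both the cheap smoothing and the infamous texture blur come from. A minimal sketch of that resolve weighting in C (approximated on a one-sample-per-pixel grid rather than the real MSAA sample positions):

#include <stdio.h>

/* Clamp-to-edge sampling helper for border pixels. */
static float at(const float *img, int w, int h, int x, int y)
{
    if (x < 0) x = 0; else if (x >= w) x = w - 1;
    if (y < 0) y = 0; else if (y >= h) y = h - 1;
    return img[y * w + x];
}

/* Quincunx resolve: the centre sample weighted 1/2 plus the four
   diagonal taps at 1/8 each. Cheap edge smoothing, but the wide
   filter is also what blurs textures. */
static float quincunx_resolve(const float *img, int w, int h, int x, int y)
{
    float corners = at(img, w, h, x - 1, y - 1) + at(img, w, h, x + 1, y - 1)
                  + at(img, w, h, x - 1, y + 1) + at(img, w, h, x + 1, y + 1);
    return 0.5f * at(img, w, h, x, y) + 0.125f * corners;
}

int main(void)
{
    /* A 3x3 single-channel test image with one bright pixel. */
    float img[9] = { 0, 0, 0,  0, 1, 0,  0, 0, 0 };
    printf("resolved centre: %.3f\n", quincunx_resolve(img, 3, 3, 1, 1)); /* 0.500 */
    return 0;
}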
 
gmoran said:
Well, I'd assumed R500 would be at least 64-bit HDR, as that's what current chips are. But as they haven't mentioned 128-bit HDR or a 128-bit frame buffer, I'm not sure.

No current ATI chip has FP16 blending...
 
DeanoC said:
gmoran said:
Well, I'd assumed R500 would be at least 64-bit HDR, as that's what current chips are. But as they haven't mentioned 128-bit HDR or a 128-bit frame buffer, I'm not sure.

No current ATI chip has FP16 blending...

Thanks Deano.

I said what I said because current ATI GPUs have FP24 throughout the pipeline. Not sure why Xenus (or whatever it's called) should be any different, or worse. It's either going to be FP24 or FP32, which is 96-bit and 128-bit respectively.
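
The totals are just four RGBA channels times the per-component width, for what it's worth:

#include <stdio.h>

int main(void)
{
    const int channels = 4;               /* RGBA */
    const int widths[] = { 16, 24, 32 };  /* FP16, FP24, FP32 */
    for (int i = 0; i < 3; i++)
        printf("FP%d x RGBA = %d-bit\n", widths[i], channels * widths[i]);
    return 0;  /* prints 64-bit, 96-bit, 128-bit */
}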
 
london-boy said:
I said what I said because current ATI GPUs have FP24 throughout the pipeline. Not sure why Xenus (or whatever it's called) should be any different, or worse. It's either going to be FP24 or FP32, which is 96-bit and 128-bit respectively.

I'm pretty sure I read FP32... which really only makes sense given the unified nature of the pipelines.

However, this doesn't guarantee that FP blends are at FP32 precision.
 
london-boy said:
I said what I said because current ATI GPUs have FP24 throughout the pipeline. Not sure why Xenus (or whatever it's called) should be any different, or worse. It's either going to be FP24 or FP32, which is 96-bit and 128-bit respectively.

Xenus is 32-bit throughout the pipeline; that is required, as 24-bit simply is not enough for vertex shading. Vertex shaders are 32-bit on current ATI chips too.
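
A quick way to see why: FP24 carries a 16-bit mantissa against FP32's 23-bit, so the smallest representable step near large world coordinates gets coarse fast. A rough illustration in C (the s1e7m16 FP24 layout and the 10,000-unit coordinate are assumptions for the example):

#include <stdio.h>
#include <math.h>

/* Smallest representable step near x for a float with the given number
   of explicit mantissa bits (one unit in the last place). */
static double ulp_at(double x, int mantissa_bits)
{
    int e;
    frexp(x, &e);  /* x = m * 2^e with 0.5 <= m < 1 */
    return ldexp(1.0, e - 1 - mantissa_bits);
}

int main(void)
{
    double coord = 10000.0;  /* an assumed far-off world-space coordinate */
    printf("FP24 (16-bit mantissa) step: %g\n", ulp_at(coord, 16)); /* 0.125 */
    printf("FP32 (23-bit mantissa) step: %g\n", ulp_at(coord, 23)); /* ~0.001 */
    return 0;
}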
 
Joe DeFuria said:
I'm pretty sure I read FP32... which really only makes sense given the unified nature of the pipelines.

However, this doesn't guarantee that FP blends are at FP32 precision.

Given that the shader units and framebuffer units are physically detached... XeGPU has FP32 shaders, but making any assumption about the framebuffer blending would be silly.
 
Tim said:
london-boy said:
I said what I said because current ATI GPUs have FP24 throughout the pipeline. Not sure why Xenus (or whatever it's called) should be any different, or worse. It's either going to be FP24 or FP32, which is 96-bit and 128-bit respectively.

Xenus is 32-bit throughout the pipeline; that is required, as 24-bit simply is not enough for vertex shading. Vertex shaders are 32-bit on current ATI chips too.

You mean FP32 and FP24?
'Cause that would mean it's 128-bit precision throughout the pipeline.
Unless I'm missing something.
 
london-boy said:
You mean FP32 and FP24?
'Cause that would mean it's 128-bit precision throughout the pipeline.
Unless I'm missing something.

You are missing something. The precision "throughout the shading pipeline" has little bearing on the precision of frame buffer blends.

It can be "FP32" shaders and FP16 blending... or no blending at all, for that matter. ;)
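
Put differently: the shader can compute at FP32 all day, but the result gets converted to the render target's format before the blender touches it, so blend precision is set by the target format, not the shader. A toy illustration in C (the converter is a deliberately crude stand-in, ignoring denormals, infinities and NaN):

#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Crude FP32 -> FP16 -> FP32 round trip: truncate the mantissa to 10
   bits and clamp the exponent to half range. Ignores denormals,
   infinities and NaN; enough to show the precision loss, nothing more. */
static float to_fp16_and_back(float f)
{
    uint32_t u;
    memcpy(&u, &f, sizeof u);
    uint32_t sign = u & 0x80000000u;
    int32_t  e    = (int32_t)((u >> 23) & 0xFFu) - 127;  /* unbiased exp */
    uint32_t man  = (u >> 13) & 0x3FFu;                  /* top 10 bits  */
    if (e < -14) return 0.0f;  /* flush below FP16 normal range */
    if (e >  15) e = 15;       /* clamp instead of overflowing to inf */
    u = sign | ((uint32_t)(e + 127) << 23) | (man << 13);
    memcpy(&f, &u, sizeof u);
    return f;
}

int main(void)
{
    /* The shader may compute src and dst at FP32... */
    float src = 1.0f / 3.0f, dst = 2.7182818f;

    /* ...but with an FP16 render target the values the blender sees
       are already quantised, so the blend happens at FP16 precision. */
    printf("FP32 blend: %.7f\n", src + dst);
    printf("FP16 blend: %.7f\n", to_fp16_and_back(src) + to_fp16_and_back(dst));
    return 0;
}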
 