blakjedi said:Getaway used RSX... I remember during the presentation he was highlighting the use of HDR in RSX... HDR is not a Cell feature
McFly said:blakjedi said:Getaway used RSX... I remember during the presentation he was highlighting the use of HDR in RSX... HDR is not a Cell feature
He didn't use this demo to highlight that. There were screenshots of the demo, and you could clearly see that especially the bright areas had their problems, something the RSX would be really good at.
Fredi
So the new EyeToy is not connected to USB, but Ethernet :? In terms of the PS3 console itself... why does it have three network ports on the back?
Because it can be a hub, rather than just being a terminal at the end of a network. Also, we want to be able to have a Gigabit port for an IP camera. So one of the ports is an in, and two of them are through. It can be a server as well as a terminal.
gmoran said:Is 128-bit High Dynamic Range lighting the main differentiator between PS3 and XB360 that we have seen so far (excepting the CGI of course ;-) )? And if so, is that because of the state of 360 dev kits, or because the R500 will be lacking that feature?
london-boy said:gmoran said:Is 128-bit High Dynamic Range lighting the main differentiator between PS3 and XB360 that we have seen so far (excepting the CGI of course ;-) )? And if so, is that because of the state of 360 dev kits, or because the R500 will be lacking that feature?
The X360 will be more than able to handle HDR. If not 128-bit, then at least 96-bit.
london-boy said:The X360 will be more than able to handle HDR. If not 128-bit, then at least 96-bit.
gmoran said:Well I'd assumed R500 would be at least 64-bit HDR, as that's what current chips are. But as they haven't mentioned 128-bit HDR or 128-bit frame buffer I'm not sure.
DeanoC said:gmoran said:Well I'd assumed R500 would be at least 64-bit HDR, as that's what current chips are. But as they haven't mentioned 128-bit HDR or 128-bit frame buffer I'm not sure.
No current ATI chip has FP16 blending...
london-boy said:I said what I said because current ATI GPUs have FP24 throughout the pipeline. Not sure why Xenus (or whatever it's called) should be any different, or worse. It's either going to be FP24 or FP32, which is 96-bit or 128-bit respectively.
london-boy said:I said what I said because current ATI GPUs have FP24 throughout the pipeline. Not sure why Xenus (or whatever it's called) should be any different, or worse. It's either going to be FP24 or FP32, which is 96-bit or 128-bit respectively.
Joe DeFuria said:I'm pretty sure I read FP32... which really only makes sense given the unified nature of the pipelines.
However, this doesn't guarantee that FP blends are at FP32 precision.
Tim said:london-boy said:I said what I said because current ATI GPUs have FP24 throughout the pipeline. Not sure why Xenus (or whatever it's called) should be any different, or worse. It's either going to be FP24 or FP32, which is 96-bit or 128-bit respectively.
Xenus is FP32 throughout the pipeline; that is required, as 24-bit simply is not enough for vertex shading. Vertex shaders are 32-bit on current ATI chips too.
london-boy said:You mean FP32 and FP24?
'Cause that would mean it's 128-bit precision throughout the pipeline.
Unless I'm missing something.