What rendering tricks might RSX employ?

marconelly! said:
It was strongly hinted at for a long time, and the lack of a hard drive is what killed it. From how I understand it, they were waiting on Sony to confirm the hard drive, the word they got was no, and they scrapped it before E3.
You only answered my question with more hearsay. Who said, and where, that the game was being planned for PS3 and then cancelled due to the lack of a standard hard drive? That, if true, would invalidate the rumor that Xbox 360 devs were told to develop games as if the hard drive was not packed in standard with the console (which would basically mean the game would have to work even if the user detached the HDD and lent it to someone, etc.).
I can't say who said what and where, I can only say that it was said :)

Anyway, from what I understand, that no-hard-drive rumor was very old and was just restated by certain people, even though it had been finalized many months earlier that the X360 would have a hard drive.

As for detaching the hard drive and letting someone borrow it, that wouldn't really work, as all your Live info and other things will be stored on the hard drive. You can send any files you want to your friend by streaming them to another X360 (time consuming, of course), by sending them to your PC and burning them to a CD, or, if small enough, on a memory card.
 
Prior to PS3, real-time rendered 3D graphics might have looked real, but they weren't actually calculated in a fully 3D environment. But that was OK for screen resolutions up until now. Even as of the current time, most of the games for the Xbox 360 use that kind of 3D. However, we want to realize fully calculated 3D graphics in fully 3D environments. In order to do that, we need to share the data between the CPU and GPU as much as possible. That's why we adopted this architecture. We want to make all the floating-point calculations including their rounded numbers the same, and we've been able to make it almost identical. So as a result, the CPU and GPU can use their calculated figures bidirectionally.

All he's really saying here is that for the most part the floats in use by the GPU and floats in use by the CPU are compatible at more than the bitwise representation level.

i.e.

F(x) on the CPU will give exactly the same answer as F(x) on the GPU.

This is a bonus if you need consistency between the units. An example might be transforming half the verts on the CPU and half on the GPU; there should theoretically not be any seams.
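Very roughly, in plain C (the matrix and vertex values are made up, and both calls below are the same C function standing in for the CPU and GPU code paths; this is only a sketch of the consistency property, not actual Cell/RSX code):

Code:
#include <stdio.h>
#include <string.h>

/* Hypothetical stand-in for the maths both units would run: a 4x4
 * matrix * vector transform. In reality one copy would run on Cell and
 * one on RSX; the property being described is that, with matching
 * rounding behaviour, F(x) comes out bit-identical on either unit. */
static void transform(const float m[16], const float v[4], float out[4])
{
    for (int r = 0; r < 4; ++r)
        out[r] = m[4*r+0]*v[0] + m[4*r+1]*v[1]
               + m[4*r+2]*v[2] + m[4*r+3]*v[3];
}

int main(void)
{
    const float mvp[16] = { 1.5f, 0, 0, 0.25f,   0, 2.0f, 0, -1.0f,
                            0, 0, 0.5f, 3.0f,    0, 0, 0,  1.0f };
    const float vert[4] = { 0.3f, -1.2f, 4.0f, 1.0f };
    float cpu_result[4], gpu_result[4];

    transform(mvp, vert, cpu_result); /* imagine this half ran on the CPU */
    transform(mvp, vert, gpu_result); /* ...and this half on the GPU      */

    /* A shared vertex transformed on both units lands on exactly the same
     * position, so a mesh split half-and-half shows no cracks or seams. */
    printf("bit-identical: %s\n",
           memcmp(cpu_result, gpu_result, sizeof cpu_result) == 0 ? "yes" : "no");
    return 0;
}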
 
zidane1strife said:
If the Heavenly Sword / FFVII / nvguy / etc. realtime (or nigh-realtime, as in Deano's) demos are any indication of final product IQ, then any improvement on that will, IMHO, be negligible.

Interesting you mention Heavenly Sword, because as beautiful as it looked, some screens do exhibit some aliasing (check out the flags and flag poles in this pic, for example: http://ps3media.ign.com/ps3/image/article/614/614802/heavenly-sword-20050516074026605.jpg ). It looks really good regardless, I could live with that ;) It's harder to notice in motion too (not sure if that's due to video compression, the motion of the game, all the effects going on, or what).

It'd be interesting if DeanoC could clarify what AA was being used, and/or what AA they might expect to use on final hardware (in a non-committal fashion, obviously!).

ERP said:
All he's really saying here is that for the most part the floats in use by the GPU and floats in use by the CPU are compatible at more than the bitwise representation level.

i.e.

F(x) on the CPU will give exactly the same answer as F(x) on the GPU.

This is a bonus if you need consistency between the units. An example might be transforming half the verts on the CPU and half on the GPU; there should theoretically not be any seams.

Question: what has to happen to get transformed vertex data, for example, from Cell to the pixel shaders? Will it go through a vertex shader on the GPU to get there (an identity matrix transform?)? In other words, does any vertex input to a pixel shader have to come from a vertex shader, or can you set the inputs up to be whatever you want? Possibly a very silly question that reveals my lack of shader experience ;)
 
Titanio said:
ralexand said:
Titanio said:
Hard to mimic RSX. Whether he just means in terms of raw power or specific features or otherwise I don't know.

My question, though, is how would he know unless he has an RSX chip? He's giving a very practical, real-world example here about his development environment. He's not talking about theoretical performance.

How does anyone know final hardware will be different without actually having it? ;) They've a good idea on paper of what's coming. Same as MS knows their final hardware will be X times more powerful or have certain new features. It's all theoretical until they actually have the hardware, of course, but that won't stop them making public statements or suggestions about the relative merits of final hardware vs what they currently have.
But that's exactly my point. The guy isn't talking about theory, he's talking about the hardware he's working with now in development, and if he's saying that his development PC's GPU can't do the same things, then he obviously can't be talking about a PC card that's currently available. So he must be talking about RSX hardware, but AFAIK that is still being designed.
 
ralexand said:
But that's exactly my point. The guy isn't talking about theory, he's talking about the hardware he's working with now in development, and if he's saying that his development PC's GPU can't do the same things, then he obviously can't be talking about a PC card that's currently available. So he must be talking about RSX hardware, but AFAIK that is still being designed.

He's saying that the RSX that has been outlined and detailed to them is in some way, shape or form more capable than what they've currently got, and they can't mimic it. I'm not sure what's not to understand. It's pretty much a given that the final hardware will be more capable than what they've got, no?

edit - Actually, I misread the quote completely. My apologies. It does seem he's talking in far less theoretical terms. I've no idea if any devs actually have RSX or not. According to recent comments from NVidia, no silicon exists yet. Maybe when he's referring to "RSX" he means the GPU that's in the kits, but I'm not sure how it wouldn't be possible to mimic that on a PC kit (unless they just don't have SLI configs in their PCs, or many of them, which I suppose is possible ;)).
 
Titanio said:
He's saying that the RSX that has been outlined and detailed to them is in some way, shape or form more capable than what they've currently got, and they can't mimic it. I'm not sure what's not to understand. It's pretty much a given that the final hardware will be more capable than what they've got, no?

edit - Actually, I misread the quote completely. My apologies. It does seem he's talking in far less theoretical terms. I've no idea if any devs actually have RSX or not. According to recent comments from NVidia, no silicon exists yet. Maybe when he's referring to "RSX" he means the GPU that's in the kits, but I'm not sure how it wouldn't be possible to mimic that on a PC kit (unless they just don't have SLI configs in their PCs, or many of them, which I suppose is possible ;)).
Thanks, I was beginning to think that I was reading that completely wrong. It could just be bad grammar, and he means that his development PC is no match for the Cell CPU in the dev kits.
 
Titanio said:
It'd be interesting if DeanoC could clarify what AA was being used, and/or what AA they might expect to use on final hardware (in a non-committal fashion, obviously!).
The shots you see have no FSAA at all; that is a limitation of all currently available graphics chips (due to HDR). We did use anisotropic filtering on all textures, though.
 
DeanoC said:
The shots you see have no FSAA at all; that is a limitation of all currently available graphics chips (due to HDR). We did use anisotropic filtering on all textures, though.

Cool, thank you! I must say it looks very good indeed regardless; I certainly didn't think there was NO AA there. I guess the apparently reduced impact of aliasing may in part be due to the higher resolution? Do you know if that limitation on doing HDR and AA at the same time will still be present with later hardware?
 
Titanio said:
DeanoC said:
The shots you see have no FSAA at all; that is a limitation of all currently available graphics chips (due to HDR). We did use anisotropic filtering on all textures, though.

Cool, thank you! I must say it looks very good indeed regardless; I certainly didn't think there was NO AA there. I guess the apparently reduced impact of aliasing may in part be due to the higher resolution? Do you know if that limitation on doing HDR and AA at the same time will still be present with later hardware?
Yeah, that's what I'm wondering. Will Xenos and RSX be able to do HDR and AA simultaneously?
 
Xenos can, through FP10 HDR. We've no idea for RSX, and I'd be surprised if Deano were allowed to say anything.
 
Shifty Geezer said:
Xenos can, through FP10 HDR. We've no idea for RSX, and I'd be surprised if Deano were allowed to say anything.
Xenos can do it even with FP16 HDR. The main point of FP10 is that it will be much faster than both FP16 and FP32, I assume because of both the bandwidth and the processing power needed. It was a good and smart move. I dunno about the RSX, though. I guess we will find out more, regardless of whether it's a modified G70 or a modified G80, when the G70 comes out at the end of the month.
 
Shifty Geezer said:
Xenos can, through FP10 HDR. We've no idea for RSX, and I'd be surprised if Deano were allowed to say anything.
What's the difference in quality between FP10 and FP16/FP32?
 
DeanoC said:
Titanio said:
It'd be interesting if DeanoC could clarify what AA was being used, and/or what AA they might expect to use on final hardware (in a non-commital fashion, obviously!).
The shots you see have no FSAA at all that is a limitation of all currently available graphics chips (due to HDR). We did use anisotropic filtering on all textures though.

Good choice. Personally, I don't get this anti-jaggy craze. I think I read somewhere that MS wants to force 4x FSAA on all 360 games (I know yours is PS3, btw) because with eDRAM it will have no penalty. But in the back of my mind I'm thinking: heck with jaggies, give me some more eye candy like HDR! (Please correct me if I'm wrong about this.)

Sure, jaggies this gen sucked because we were at 720x480, but next gen at 1280x720 or 1920x1080 I'm not gonna care that much. Heck, on a PC at 1024x768 I'd rather just bump up the anisotropic filtering than enable AA.
 
ralexand said:
Shifty Geezer said:
Xenos can, through FP10 HDR. We've no idea for RSX, and I'd be surprised if Deano were allowed to say anything.
What's the difference in quality between FP10 and FP16/FP32?


The ROPs can handle several different formats, including a special FP10 mode. FP10 is a floating point precision mode in the format of 10-10-10-2 (bits for Red, Green, Blue, Alpha). The 10-bit colour storage has a 3-bit exponent and 7-bit mantissa, with an available range of -32.0 to 32.0. Whilst this mode does have some limitations, it can offer HDR effects at the same cost in performance and size as standard 32-bit (8-8-8-8) integer formats, which will probably result in this format being used quite frequently on XBOX 360 titles. Other formats such as INT16 and FP16 are also available, but they obviously have space implications. Like the resolution of the MSAA samples, there is a conversion step to change the front buffer format to a displayable 8-8-8-8 format when moving the completed frame buffer portion from the eDRAM memory out to system RAM.

The ROPs are fully orthogonal, so multisampling can operate with all supported pixel formats.

from here http://www.beyond3d.com/articles/xenos/index.php?p=04
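To put rough numbers on the quality/size trade-off being asked about, here is a small C sketch. The FP10 figures come from the excerpt above, the FP16/FP32 mantissa widths are the standard IEEE ones, the struct and field names are just for illustration, and the "decimal digits" column is only a rough approximation:

Code:
#include <stdio.h>

/* Rough comparison of the render-target formats being discussed. */
struct fmt { const char *name; int bits_per_pixel; int precision_bits; };

int main(void)
{
    const struct fmt fmts[] = {
        { "INT8 (8-8-8-8)",     32,  8 }, /* 8 value bits, no exponent, fixed 0..1 range   */
        { "FP10 (10-10-10-2)",  32,  7 }, /* HDR range, same 32-bit footprint as INT8      */
        { "FP16 (16-16-16-16)", 64, 10 }, /* IEEE half: 10-bit mantissa                    */
        { "FP32 (32-32-32-32)", 128, 23 } /* IEEE single: 23-bit mantissa                  */
    };

    /* Bytes needed for a single 1280x720 colour buffer, no MSAA, plus an
     * approximate count of decimal digits of precision per channel. */
    for (unsigned i = 0; i < sizeof fmts / sizeof fmts[0]; ++i) {
        double mbytes = 1280.0 * 720.0 * fmts[i].bits_per_pixel / 8.0 / (1024.0 * 1024.0);
        printf("%-18s ~%5.1f MB per 720p colour buffer, %2d precision bits (~%.1f decimal digits)\n",
               fmts[i].name, mbytes, fmts[i].precision_bits,
               fmts[i].precision_bits * 0.30103);
    }
    return 0;
}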
 
Thanks, Jawed.

I'm the opposite; jaggies bother me more than anything.
Pozer said:
DeanoC said:
Titanio said:
It'd be interesting if DeanoC could clarify what AA was being used, and/or what AA they might expect to use on final hardware (in a non-committal fashion, obviously!).
The shots you see have no FSAA at all; that is a limitation of all currently available graphics chips (due to HDR). We did use anisotropic filtering on all textures, though.

Good choice. Personally, I don't get this anti-jaggy craze. I think I read somewhere that MS wants to force 4x FSAA on all 360 games (I know yours is PS3, btw) because with eDRAM it will have no penalty. But in the back of my mind I'm thinking: heck with jaggies, give me some more eye candy like HDR! (Please correct me if I'm wrong about this.)

Sure, jaggies this gen sucked because we were at 720x480, but next gen at 1280x720 or 1920x1080 I'm not gonna care that much. Heck, on a PC at 1024x768 I'd rather just bump up the anisotropic filtering than enable AA.
 
I dunno, glare from your screen does a pretty good job... Loud neighbors/nagging GF... And my personal favorite, dropped network connections... :p
 
Pozer said:
Good choice. Personally, I don't get this anti-jaggy craze. I think I read somewhere that MS wants to force 4x FSAA on all 360 games (I know yours is PS3, btw) because with eDRAM it will have no penalty. But in the back of my mind I'm thinking: heck with jaggies, give me some more eye candy like HDR! (Please correct me if I'm wrong about this.)
Well, why not both? The eDRAM has enough bandwidth to make HDR + 4x MSAA very feasible, provided the fillrate holds out.

Sure, jaggies this gen sucked because we were at 720x480, but next gen at 1280x720 or 1920x1080 I'm not gonna care that much. Heck, on a PC at 1024x768 I'd rather just bump up the anisotropic filtering than enable AA.
AA is a big thing, IMO. Even at 1280x1024 on my PC, I notice a huge difference between no AA and 4x AA. The stability of the picture increases dramatically as slight camera moves don't make weird patterns crawl across surfaces as much and objects don't get magically lost between pixels as often.

Of course, anisotropic filtering is a big win for picture quality, too.
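For a rough sense of the eDRAM point above, here is a back-of-the-envelope footprint sketch in C. buffer_mb is just an illustrative helper, colour and depth are counted per MSAA sample (as multisampled buffers normally are), and the 10 MB eDRAM figure is the widely reported Xenos spec. None of the HDR + 4x MSAA cases fit in eDRAM whole, which is presumably why the Beyond3D excerpt earlier talks about moving completed frame buffer portions out to system RAM:

Code:
#include <stdio.h>

/* Approximate size of a multisampled colour+depth buffer, counting
 * colour and depth storage once per MSAA sample. */
static double buffer_mb(int w, int h, int samples, int colour_bytes, int depth_bytes)
{
    return (double)w * h * samples * (colour_bytes + depth_bytes) / (1024.0 * 1024.0);
}

int main(void)
{
    const double edram_mb = 10.0; /* widely reported Xenos eDRAM size */

    printf("720p, 4x MSAA, FP10 colour + 32-bit Z : %5.1f MB (eDRAM: %.0f MB)\n",
           buffer_mb(1280, 720, 4, 4, 4), edram_mb);
    printf("720p, 4x MSAA, FP16 colour + 32-bit Z : %5.1f MB\n",
           buffer_mb(1280, 720, 4, 8, 4));
    printf("720p, no MSAA, FP16 colour + 32-bit Z : %5.1f MB\n",
           buffer_mb(1280, 720, 1, 8, 4));
    return 0;
}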
 
HD on a small screen will limit jaggie pains. 1024x768 on a 15" monitor is better for me than 800x600 @ 4x AA for example. But when stretched over a large screen those jaggies will be painfully apparent once again.

Ideally they should go with the best of both worlds, but for me, AA is a more important next step than HDR, as a good amount of HDR's optical effects can be simulated without the need for HDR images, whereas the only cure for jaggies is AA.

That said, it needn't be full-screen AA. If they could develop new AA techniques just to smooth out edges, that'd be most of the pain gone.
 
Shifty Geezer said:
HD on a small screen will limit jaggie pains. 1024x768 on a 15" monitor is better for me than 800x600 @ 4x AA for example. But when stretched over a large screen those jaggies will be painfully apparent once again.

Ideally they should go with the best of both worlds, but for me, AA is a more important next step than HDR, as a good amount of HDR's optical effects can be simulated without the need for HDR images, whereas the only cure for jaggies is AA.

That said, it needn't be full-screen AA. If they could develop new AA techniques just to smooth out edges, that'd be most of the pain gone.


I'm not sure. HDR makes things look very realistic, and jaggies can be gotten rid of without AA to a certain extent.

To me, non-HDR graphics look kinda flat now, and no amount of AA will make them look more realistic.

If a console is going to render at 1080p internally, the final output at lower resolutions will be pretty smooth as you get free supersampling. I'd take that and HDR over "real" AA and no HDR any time of the day, but everyone has different tastes.
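A minimal sketch of why that works, in plain C (greyscale, an exact 2x2 box average; downsample_2x is just an illustrative helper, and a real 1080p-to-720p scale uses a non-integer ratio and a nicer filter):

Code:
#include <stdio.h>

/* Minimal illustration of "free supersampling": averaging each 2x2 block
 * of a higher-resolution render down to one output pixel. */
static void downsample_2x(const unsigned char *src, int sw, int sh, unsigned char *dst)
{
    int dw = sw / 2, dh = sh / 2;
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x) {
            int sum = src[(2*y)   * sw + 2*x] + src[(2*y)   * sw + 2*x + 1]
                    + src[(2*y+1) * sw + 2*x] + src[(2*y+1) * sw + 2*x + 1];
            dst[y * dw + x] = (unsigned char)(sum / 4);
        }
}

int main(void)
{
    /* Toy 4x4 greyscale "render" containing a hard diagonal edge. */
    const unsigned char src[16] = { 255,   0,   0,   0,
                                    255, 255,   0,   0,
                                    255, 255, 255,   0,
                                    255, 255, 255, 255 };
    unsigned char dst[4];

    downsample_2x(src, 4, 4, dst);

    /* The downsampled edge picks up intermediate grey levels (191 here),
     * which is the softened staircase you get from supersampling. */
    printf("%3d %3d\n%3d %3d\n", dst[0], dst[1], dst[2], dst[3]);
    return 0;
}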
 