What rendering tricks might RSX employ?

Titanio said:
No, what I'm saying is that if you had a SDTV, you'd essentially be getting AA for free - the image is being scaled down from a high resolution (720p or better yet 1080p) and that's effectively anti-aliasing. Unless I'm misunderstanding something along the way..
No, that's perfectly correct. As a matter of fact everyone on SDTV can expect quite insane image quality. The effective AA-level you'll get on 480i, downscaled from a 720p framebuffer rendered with 4xMSAA, is 4 shader samples and 16 geometry samples per pixel.
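For anyone who wants to see where those figures come from, here's a quick back-of-the-envelope sketch (my own illustration, not official numbers). It assumes a clean 4:1 pixel-count reduction from the 720p buffer; real 480-line output gives a slightly different, non-integer factor.

```python
# Back-of-the-envelope sketch of the "free AA" claim above.
# Assumptions (not official figures): a 1280x720 framebuffer with 4xMSAA,
# downscaled to a hypothetical 640x360 SD-ish output for a clean 4:1 ratio;
# real 480i/480p output gives a slightly different, non-integer factor.

hd_w, hd_h = 1280, 720
sd_w, sd_h = 640, 360            # illustrative; 720x480 or 640x480 changes the ratio
msaa_geometry_samples = 4         # 4xMSAA: 4 coverage/depth samples per rendered pixel
msaa_shader_samples = 1           # MSAA shades once per rendered pixel

pixels_per_output_pixel = (hd_w * hd_h) / (sd_w * sd_h)              # 4.0 here

shader_samples = pixels_per_output_pixel * msaa_shader_samples       # ~4
geometry_samples = pixels_per_output_pixel * msaa_geometry_samples   # ~16

print(f"{pixels_per_output_pixel:.1f} rendered pixels per output pixel")
print(f"~{shader_samples:.0f} shader samples, ~{geometry_samples:.0f} geometry samples per output pixel")
```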
 
Inane_Dork said:
zidane1strife said:
Why all this talk of AA? Maybe it's just me, but have ye all watched high-rez PS3 vids and still come away thinking AA was needed?
You watched what, though? A 720p, lossily compressed version of the videos? That's not exactly a fair basis for a conclusion. And jaggies aren't the only thing AA helps eliminate.

Really, if AA were such a non-issue, why do you think Pixar spends so much effort on massively demanding AA algorithms?

Well, I think it's because they need to be able to show their worlds on extremely large screens, move the camera around, and get close-ups and such without worrying about IQ. Besides, they've got the money to burn, and they have to achieve as good IQ as their money/time/resources can get, even if the improvements are minor ones. (They've got a reputation on the line, this is professional CGI, and they're Pixar after all.)

BG/CoN have extremely good IQ, and they're running with basic IQ-enhancing features, IIRC. As for next gen, look at luna's pic in this very thread: if that's an example of realtime IQ, then you're seeing my point. (Note that geometry on PS3, I think, is likely to be higher, so the slightly noticeable polys on the background monster would likely disappear were this designed for final PS3 hardware.)

ed
 
I don't think anyone's disputing that AA is less important at higher resolutions. Certainly I'm not. Decreased importance doesn't imply minimal importance, though. It can matter less and still be worth spending performance on.
 
If I may run an idle thought or two by you guys... what's the chance that in adapting the RSX (I'm kinda making the assumption the RSX is a modified G70) to the PS3 we might see something like Rambus' bus system (the name escapes me for the moment) implemented in the RSX along with the CELL?

Moreover, do you think they might even change the memory controller as well? I know it's probably off the wall, but if Sony-NVIDIA were to convert the memory pool for the RSX over to XDR from GDDR3, I think there might be some pricing benefits... performance as well, I would venture. I'm thinking that the lower pin count on XDR would lead to a faster shrink and thus make the PS3 cheaper in the long run.
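To put some rough numbers on the pin-count idea, here's a quick sketch using the memory figures that were being quoted around E3 2005; treat the bus widths and clocks as assumptions rather than confirmed RSX specs. The point is XDR's much higher per-pin data rate, which is what would make a narrower (and potentially cheaper-to-shrink) interface plausible.

```python
# Rough bandwidth-per-data-pin comparison, using the figures floated around
# E3 2005 as assumptions rather than confirmed RSX specs.

def bandwidth_gbps(bus_width_bits, effective_data_rate_ghz):
    """Peak bandwidth in GB/s = (bus width in bytes) * transfers per second (billions)."""
    return (bus_width_bits / 8) * effective_data_rate_ghz

# GDDR3 as announced for RSX: 128-bit bus, 700 MHz clock, double data rate.
gddr3_width = 128
gddr3_bw = bandwidth_gbps(gddr3_width, 0.7 * 2)       # ~22.4 GB/s

# XDR as used on the Cell side: 3.2 GHz effective rate over a much narrower bus.
xdr_width = 64                                         # two 32-bit channels
xdr_bw = bandwidth_gbps(xdr_width, 3.2)                # ~25.6 GB/s

print(f"GDDR3: {gddr3_bw:.1f} GB/s over {gddr3_width} data pins "
      f"-> {gddr3_bw / gddr3_width * 1000:.0f} MB/s per pin")
print(f"XDR:   {xdr_bw:.1f} GB/s over {xdr_width} data pins "
      f"-> {xdr_bw / xdr_width * 1000:.0f} MB/s per pin")
```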

Also, I'm thinking we might see a boost coming from the possible drop of the video compression hardware on NVIDIA GPUs (the Pixel perfect stuff or something... I'm horrible with names, sorry). Anyway, the reclaimed transistors would probably be better put to use as shaders, I would assume... or maybe the "chaperone" rumor.

I bring up the hardware issues because I get the feeling there really isn't going to be anything particularly special implemented in the RSX that isn't going to be available in the PC space... things will be fine-tuned along console lines, but no fancy magic to make things stand out.
 
we might see something like Rambus' bus system (the name escapes me for the moment) implemented in the RSX along with the CELL?
Are you referring to the point-to-point interconnect bus (FlexIO AKA Redwood)? That's already in there as it happens to be the bus between CELL and RSX.
 
AH!! Thank you!! Yes, FlexIO. I'm talking about implementing that within the RSX, not just the CELL CPU. It seems like a logical way to connect the two processors. Or are you saying it's already implemented within the RSX?
 
Mefisutoferesu said:
AH!! Thank you!! Yes, FlexIO. I'm talking about implementing that within the RSX, not just the CELL CPU. It seems like a logical way to connect the two processors. Or are you saying it's already implemented within the RSX?

Cell and RSX are connected by FlexIO: 20 GB/s (write) / 15 GB/s (read).
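For context, here's a rough sketch comparing those quoted FlexIO figures against a first-generation PCI Express x16 link (the bus a PC G70 sits on); the frame size is just illustrative.

```python
# Quick comparison of the quoted Cell<->RSX FlexIO figures against a
# first-generation PCI Express x16 link. Frame size is illustrative.

flexio_write_gbps = 20.0   # Cell -> RSX, GB/s (as quoted)
flexio_read_gbps  = 15.0   # RSX -> Cell, GB/s (as quoted)
pcie_x16_gbps     = 4.0    # PCIe 1.x x16, per direction

frame_bytes = 1280 * 720 * 4          # one 32-bit 720p buffer, ~3.5 MB

def transfer_ms(num_bytes, gbps):
    """Time in milliseconds to move num_bytes at the given GB/s."""
    return num_bytes / (gbps * 1e9) * 1000

for name, bw in [("FlexIO write", flexio_write_gbps),
                 ("FlexIO read", flexio_read_gbps),
                 ("PCIe x16", pcie_x16_gbps)]:
    print(f"{name:13s}: {transfer_ms(frame_bytes, bw):.3f} ms per 720p RGBA frame")
```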
 
Mefisutoferesu, were you maybe wondering if it would be possible to tweak the RSX's own native memory support pre-tapeout so that it functions with XDR directly, rather than GDDR3? That's what would achieve your XDR cross-over; as it stands now, though, the RSX IS on the FlexIO.
 
Yes, that is part of what I was wondering, the 2nd part of my original ramblings... sorry for it being so incoherent.

I'm surprised RSX IS running on the FlexIO... how closely does its configuration come to the one implemented in the CPU? That's probably an unknown as of now though... thought I'd ask anyway.

Also, another thought came to mind... assuming the RSX is indeed a modified G70, what does the FlexIO bring to the table? A whole new bus system must bring some distinctions to the GPU.
 
Not much. The FlexIO is so it can talk to the Cell. On the PC version it talks to the PCI Express bus that goes to the CPU. So it will just be talking to the Cell on a faster bus, which it will need to access the RAM behind the Cell.

The reason they are doing GDDR vs. Rambus RAM is because it's cheaper, and the GPU doesn't really care about latency, it just needs speed.
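That bandwidth-over-latency point can be made concrete with Little's law: you only need roughly (bandwidth x latency) bytes in flight to hide the latency completely. A rough sketch with illustrative (not RSX-specific) numbers:

```python
# Sketch of why a GPU can shrug off memory latency as long as bandwidth is
# there: Little's law says you need about (bandwidth x latency) bytes in
# flight to hide the latency completely. Numbers are illustrative, not specs.

bandwidth_gbs = 22.4        # hypothetical GDDR3 peak, GB/s
latency_ns = 500            # hypothetical DRAM round-trip latency

bytes_in_flight = bandwidth_gbs * 1e9 * latency_ns * 1e-9   # GB/s * seconds
requests_in_flight = bytes_in_flight / 64                   # assuming 64-byte bursts

print(f"Need ~{bytes_in_flight/1024:.1f} KB ({requests_in_flight:.0f} x 64B requests) "
      f"in flight to hide {latency_ns} ns at {bandwidth_gbs} GB/s")
# A GPU with thousands of pixels and texture fetches in flight reaches that
# easily; a latency-sensitive CPU workload usually can't, which is where
# XDR's lower latency would matter more.
```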
 
jvd said:
Not much. The FlexIO is so it can talk to the Cell. On the PC version it talks to the PCI Express bus that goes to the CPU. So it will just be talking to the Cell on a faster bus, which it will need to access the RAM behind the Cell.

I wouldn't dismiss greater bandwidth between CPU and GPU as being "not much". The relationship between Cell and RSX seems to be a key point raised in Sony/NVIDIA interviews (and the allowances made for that go beyond bandwidth too, though obviously a fat pipe is needed, and FlexIO provides one).
 
In relation to RSX vs. G70, adding FlexIO doesn't add much. Just faster comms with the CPU. It won't add, say, free 4x AA or Reyes capabilities to the GPU. Technically, if RSX is G70 with FlexIO, it'll function exactly as G70 and could fairly be called as such (G71 :p )
 
REYES!! Oops, sorry I shouldn't have said Reyes. Oop, I said Reyes again! Oh darn, I said Reyes again. What is wrong with me. :p

Do you think it's possible to see something like the ring bus implemented in the CELL design to make its way into the RSX? I dunno if there'd even be a practical use for such a design in the RSX, but it is a rather novel design and novelty can breed some very interesting effects.

I know I'm probably grasping at straws here, but I simply can't imagine that Sony would pay millions for an off-the-shelf GPU (even if it's the next-gen off-the-shelf GPU). There must be something different, no?

Or perhaps it's more favorable for Sony to utilize a more PC-style GPU? I can't see why, though... the point of the CELL design was to drop legacy with the "workstation" design; I would imagine they'd want to do the same for the GPU as well.
 
The funny thing is... 2 years on the PS3 GPU and all they can come up with is a barely modified G70? If that were the case, the GPU should have been finished by now... and developers should have more than just 100 PS3s at hand. I think it's a little more than just a modified G70... might be something between the G80 and G70 with extra stuff on it... but leaning more toward the G80...

-Josh378
 
Mefisutoferesu said:
Do you think it's possible to see something like the ring bus implemented in the CELL design to make its way into the RSX? I dunno if there'd even be a practical use for such a design in the RSX, but it is a rather novel design and novelty can breed some very interesting effects.
No point in it unless there's reasonably sizeable local storage for each shader or something.

I know I'm probably grasping at straws here, but I simply can't imagine that Sony would pay millions for an off-the-shelf GPU (even if it's the next-gen off-the-shelf GPU). There must be something different, no?
I can imagine some particular new features to work in association with Cell, seeing as direct communication seems such a plus point. And maybe they put the GS on there too (how many transistors is the GS? I recently read 17 million on these boards) for BC? I'm not expecting magic new features that don't appear on the PC part, though, like an inbuilt Ambient Occlusion processor or something.
 
VNZ said:
Titanio said:
No, what I'm saying is that if you had a SDTV, you'd essentially be getting AA for free - the image is being scaled down from a high resolution (720p or better yet 1080p) and that's effectively anti-aliasing. Unless I'm misunderstanding something along the way..
No, that's perfectly correct. As a matter of fact everyone on SDTV can expect quite insane image quality. The effective AA-level you'll get on 480i, downscaled from a 720p framebuffer rendered with 4xMSAA, is 4 shader samples and 16 geometry samples per pixel.

Wouldn't it make it blurry? Whenever I play games on my laptop and the resolution doesn't match my LCD's native resolution, the image looks really bad if the scaling isn't by a whole number. I.e. my LCD is 1600x1200, but games at 640x480 and 1024x768 scaled up look really bad, while at 800x600 it looks okay.

So when these consoles aren't using an HDTV, would they be generating HDTV resolution? Or would they be intelligent enough to generate SDTV with AA to the max? I'm assuming that this will be the case. I surely don't want any scaling to be applied, especially considering 720 -> 480.
 
TrungGap said:
Wouldn't it make it blurry? Whenever I play games on my laptop and the resolution doesn't match my LCD's native resolution, the image looks really bad if the scaling isn't by a whole number. I.e. my LCD is 1600x1200, but games at 640x480 and 1024x768 scaled up look really bad, while at 800x600 it looks okay.

So when these consoles aren't using an HDTV, would they be generating HDTV resolution? Or would they be intelligent enough to generate SDTV with AA to the max? I'm assuming that this will be the case. I surely don't want any scaling to be applied, especially considering 720 -> 480.

That has a lot to do with the nature of LCDs. They actually have a fixed number of pixels. When they go lower than their native resolution and try to fill the screen they end up doing strange things. The reason 800x600 probably looks okay is because it can use 4 of the screen pixels to represent one of the game's pixels. In 1024x768 and 640x480 there is no nice way of symmetrically distributing the pixels to make it look okay.
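A toy sketch of that point (my own illustration, using the resolutions from the post): with nearest-neighbour scaling, an integer factor maps every source column to the same number of panel columns, while the non-integer factors produce columns of uneven width, which is what looks ugly.

```python
# Tiny illustration of why integer scale factors look clean on a fixed-pixel
# panel and non-integer factors don't: map each panel column back to a source
# column (nearest-neighbour) and count how many panel columns each source
# column receives. Resolutions are the ones from the post above.

def panel_columns_per_source_column(src_w, panel_w):
    """Distinct widths (in panel columns) that source columns end up with."""
    counts = {}
    for x in range(panel_w):
        src_x = x * src_w // panel_w
        counts[src_x] = counts.get(src_x, 0) + 1
    return sorted(set(counts.values()))

panel_w = 1600
for src_w in (800, 1024, 640):
    widths = panel_columns_per_source_column(src_w, panel_w)
    print(f"{src_w:4d} -> {panel_w}: each source column becomes {widths} panel columns")
```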
 
Bobbler said:
That has a lot to do with the nature of LCDs. They actually have a fixed number of pixels. When they go lower than their native resolution and try to fill the screen they end up doing strange things. The reason 800x600 probably looks okay is because it can use 4 of the screen pixels to represent one of the game's pixels. In 1024x768 and 640x480 there is no nice way of symmetrically distributing the pixels to make it look okay.

Okay, I get you. But boy, you can explain it better than I could. :) But doesn't that somewhat apply to this also? If the console is locked to rendering 720p, it would scale to whatever the output can handle, so wouldn't it pose the same problem? Even HDTVs that up-convert or down-convert video not only introduce artifacts but also *reduce* the image quality.

So basically, what I'm saying is that IMHO the next console should render at the resolution of the output and not rely on video scaling for free AA.
 
TrungGap said:
Okay, I get you. But boy, you can explain it better than I could. :) But doesn't that somewhat apply to this also? If the console is locked to rendering 720p, it would scale to whatever the output can handle, so wouldn't it pose the same problem? Even HDTVs that up-convert or down-convert video not only introduce artifacts but also *reduce* the image quality.

So basically, what I'm saying is that IMHO the next console should render at the resolution of the output and not rely on video scaling for free AA.

There is a small difference, though. In your case with the LCD, your computer wasn't scaling any resolution; it was just outputting at the selected resolution and your monitor was making it look goofy (as all LCDs do). It's not exactly the same thing. There isn't really any resolution scaling going on (at least not in the same way as the scaling being talked about); it is the monitor trying to stretch an image with fewer pixels to the size of something with more pixels (take an image file, jpg/bmp/whatever, and zoom in -- it looks ugly).

In the case of the consoles, the picture information will/can be sent at a high resolution and the TV will scale it down (shrink it to the point that it fits on the screen). It essentially does what supersampling AA does. Find a normal-sized jpg/bmp/whatever and zoom out so it gets smaller -- the picture gets the advantage of having, essentially, all the pixels of its original size while not taking up the space.

If the consoles can output at 1080i(p) full time, then I don't see a reason why they shouldn't. Scaling an image down is good, scaling it up is bad. Taking a 1600x1200 image and making it fit into an 800x600 space will lose some information, but that 800x600 image will be really sharp (zooming out on a picture). Taking an 800x600 image and making it fit into a 1600x1200 image will keep the quality of the 800x600 image but make it really big (zooming in on a picture).
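To illustrate the downscale-as-supersampling point, here's a minimal box-filter sketch (purely illustrative, nothing PS3-specific): a hard diagonal edge rendered at 2x resolution comes out of the 2:1 downscale with fractional coverage values along the edge, i.e. an antialiased transition.

```python
# Minimal box-filter downscale to show why shrinking an image behaves like
# supersampling AA: a hard diagonal edge rendered at 2x resolution picks up
# fractional (smoothed) values after the 2:1 downscale. Pure illustration.

HI_W = HI_H = 8    # "high res" toy image
SCALE = 2          # 2:1 downscale in each direction

# Render a hard-edged diagonal at high resolution: 1 below the edge, 0 above.
hi = [[1.0 if y > x else 0.0 for x in range(HI_W)] for y in range(HI_H)]

# Box-filter downscale: each output pixel averages a SCALE x SCALE block.
lo = [[sum(hi[y * SCALE + dy][x * SCALE + dx]
           for dy in range(SCALE) for dx in range(SCALE)) / SCALE**2
       for x in range(HI_W // SCALE)]
      for y in range(HI_H // SCALE)]

for row in lo:
    print(" ".join(f"{v:.2f}" for v in row))   # 0.25 values along the diagonal are the softened edge
```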
 
Josh, you know more about the timing of releases of new parts than I do. Do you get the impression that NVidia have invested 2 years of development in the console-specific implementation? The transition from AGP to PCI-E seemed to represent a significant investment, and I would guess that getting an existing chip re-worked and tuned for FlexIO and its bandwidth potential would be non-trivial. This is in addition to a process shrink from .11 to .09 micron. If the design for the G70 was complete last Nov., how long would it take to make it suitable for use in the PS3 platform? Assuming a spring '06 launch, you would need final spins in mass production by Dec?/Jan?/Feb? Waiting for a modified G80 would only serve to push the timeline out further, no?
 