Anti-aliasing Without EDRAM in NVidia's PS3 GPU...

Jawed

...will PS3 look anywhere near as good as Xbox 360? I'm thinking of anti-aliasing specifically, ignoring geometrical complexity, shader effects, etc.

Whenever I see games on consoles, the aliasing is the most obnoxious artefact of all.

If the EDRAM in (or working with) R500 is a crucial part of R500's ability to do anti-aliasing, will NVidia be forced into designing EDRAM into the ROP of their GPU for PS3? If not, what alternatives are there?

Will XDR's bandwidth somehow save the day?

Jawed
 
Is this a joke thread or something? :)

Just ask yourself how today's graphics chips, such as the R420/NV40, or even the older R300, can output an image with MSAA without any eDRAM onboard. ;)
 
Most of the aliasing I notice while playing PS2 games comes from texture shimmering, which can be corrected with just anisotropic filtering, which will be no problem at all for any modern Nvidia GPU. As far as true anti-aliasing, any modern PC GPU is capable of good AA without any embedded RAM at all, so I'm sure PS3 will be (I can't imagine Nvidia would design a next-gen GPU that can't even do what their old PC GPUs can).
 
I wonder instead whether heavy AA techniques are really needed when next-gen games support HD resolutions.
We already have perfectly polished games on the current consoles at standard TV resolutions.
 
eDRAM is not required for AA, or the AA the R500 will be doing. It simply makes it faster/easier, which may be a moot point anyway depending on any performance gaps that might exist between it and PS3's GPU. I wouldn't worry about AA in any of the next gen systems - it won't be a problem for any of them.
 
I'm not suggesting that the PS3 GPU will be unable to do AA. I'm asking how it will do so without taking a huge performance hit, in comparison with Xbox 360, which uses eDRAM to speed up, amongst other things, AA.

If XDR can be architected to provide, say, 200GB/s of bandwidth, then maybe the PS3 GPU will be able to run as fast as Xbox 360's GPU.

That's the kind of thing I'm wondering about... Anyone got any ideas?
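To put a very rough number on what AA costs in bandwidth, here's a back-of-envelope calc - every constant in it is an assumption for illustration, not a known spec:

Code:
/* Back-of-envelope framebuffer traffic for 720p with 4x MSAA at 60fps.
   Every constant here is an assumption for illustration, not a spec. */
#include <stdio.h>

int main(void)
{
    const double pixels           = 1280.0 * 720.0; /* 720p */
    const double samples          = 4.0;            /* 4x MSAA */
    const double bytes_per_sample = 8.0;            /* 32-bit colour + 32-bit Z/stencil */
    const double overdraw         = 3.0;            /* assumed average overdraw */
    const double rw_factor        = 2.0;            /* read-modify-write for Z test/blend */
    const double fps              = 60.0;

    const double gb_per_s = pixels * samples * bytes_per_sample
                          * overdraw * rw_factor * fps / 1e9;

    printf("~%.1f GB/s of framebuffer traffic, before textures\n", gb_per_s);
    return 0; /* prints ~10.6 GB/s with these assumptions */
}

With those guesses you get ~10.6 GB/s of framebuffer traffic alone, before a single texture fetch touches the bus - exactly the kind of load eDRAM takes off the main memory pool.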

Jawed
 
1) I don't know if the CELL CPU will share its 25.6 GB/s (or more..) with the NVIDIA GPU; even if this doesn't happen it wouldn't be a big loss imho, as in many fp-intensive tasks the CELL CPU will need tons of bandwidth.
2) If the NVIDIA GPU doesn't have a big pool of embedded memory, then performance-wise R500 could be faster when AA is on.
I believe it's unlikely the NVIDIA GPU will have more than 40-50 GB/s of bandwidth (to be shared between back buffer, z-buffer, front buffer, other render targets and textures), while R500 should have 48 GB/s just for everything but textures.
If NVIDIA uses the same MSAA scheme it's currently using on NV40, R500 will have an edge over the PS3 GPU, imho.
Are we going to see something new about AA? I don't think so.. :?
 
Embedded RAM could certainly be a huge differentiator between the respective GPUs in X360 and PS3. In the past Nvidia GPUs have always been much heavier in transistor count than ATI GPUs, but there have been rumours that the trend will switch with the next-generation designs (I've read G70 will be ~300 million transistors and R520 will be 300-350 million transistors). As was the case with PS2, and most high end PCs, it wouldn't surprise me if the GPU in PS3 has more transistors than the CPU.
 
If we assume the PS3 is backwards compatible, wouldn't that mean there should be at least some sort of memory on board the GPU capable of at least 48 GB/s, to match the PS2 GS's eDRAM?
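For reference, that 48 GB/s figure falls out of the GS's eDRAM bus width, if I have the figures right - 2560 bits total at roughly 150 MHz:

Code:
/* Where PS2's 48 GB/s comes from (if I have the figures right): the GS
   eDRAM bus is 2560 bits wide - 1024-bit read + 1024-bit write +
   512-bit texture - clocked at roughly 150 MHz. */
#include <stdio.h>

int main(void)
{
    const double bus_bits = 1024.0 + 1024.0 + 512.0;
    const double clock_hz = 150e6;
    printf("GS eDRAM bandwidth: %.0f GB/s\n",
           bus_bits / 8.0 * clock_hz / 1e9); /* 48 GB/s */
    return 0;
}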

Unless of course CELL would be handling emulation and the rasterizer (like a cut down "Visualiser" from the patent) and sending the buffer through to the GIF via the GPU.....

*mumbling*
 
If the rumors (strong in this period) of Sony and Nvidia jointly developing the PS3 GPU are true, with Nvidia also implementing Sony's technologies, embedded RAM for PS3 doesn't seem unrealistic either.
It would be amazing if both Cell and the GPU had their own XDR RAM pool, with the GPU also having some embedded RAM of its own.
 
BOOMEXPLODE said:
Embedded RAM could certainly be a huge differentiator between the respective GPUs in X360 and PS3. In the past Nvidia GPUs have always been much heavier in transistor count than ATI GPUs, but there have been rumours that the trend will switch with the next-generation designs (I've read G70 will be ~300 million transistors and R520 will be 300-350 million transistors). As was the case with PS2, and most high end PCs, it wouldn't surprise me if the GPU in PS3 has more transistors than the CPU.

You know, I want to put everything else we've been talking about aside and focus on the N70 for a second - did anyone else read recently how NVidia wants to bring up to 225 watts to bear on powering the next gen of video cards alone? I mean - wow. This makes me think that in some serious ways, the GPU inside the PS3 must deviate from current trends over at NVidia, because that sort of madness is just not sustainable within a console. Any thoughts?
 
If the geometry is handled by the CPU, then the die space that would have been taken up by the geometry shaders on the GPU can be used for embedded RAM or more pixel shader logic. Either way, I don't think the GPU is going to have a problem using high quality AA, as long as the capability is built into the chip.

As it stands on the current PC front, ATi's AA is more efficient and provides better IQ at the same settings. ATi could carry over their advantage of gamma-corrected AA, and a fixed platform such as a console would be ideal for using TAA (temporal AA). So NV has a lot of improvements to make to catch up to ATi's AA, but I don't think bandwidth will be an issue. They'll probably build a huge GPU, since 65nm isn't all that far away and they can get a process shrink relatively early in the console's life.
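For anyone wondering what gamma-corrected AA actually buys you, here's a minimal sketch - the 2.2 exponent is my stand-in for the exact sRGB curve:

Code:
/* Sketch: why gamma-correct AA resolve looks better. Averaging the
   stored (gamma-encoded) values darkens edges; converting to linear
   light first, averaging, then re-encoding gives the right result. */
#include <math.h>
#include <stdio.h>

double to_linear(double c) { return pow(c, 2.2); }
double to_gamma(double c)  { return pow(c, 1.0 / 2.2); }

int main(void)
{
    /* Edge pixel: two samples white (1.0), two samples black (0.0). */
    double samples[4] = { 1.0, 1.0, 0.0, 0.0 };

    double naive = 0.0, linear = 0.0;
    for (int i = 0; i < 4; i++) {
        naive  += samples[i] / 4.0;            /* average encoded values */
        linear += to_linear(samples[i]) / 4.0; /* average linear light   */
    }

    printf("naive resolve:         %.3f\n", naive);            /* 0.500 */
    printf("gamma-correct resolve: %.3f\n", to_gamma(linear)); /* 0.730 */
    return 0;
}

That 0.500-vs-0.730 gap on a half-covered edge pixel is why uncorrected resolves make edges look too dark.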
 
Well the high end "ultra" cards that use so much power are not mainstream parts. Those are very expensive (the 512MB 6800 is $900!) enthusiast parts made in limited quantities, with absolutely huge cooling solutions. Don't expect something like that in PS3! Remember Nvidia also makes "mainstream" GPUs like the 6600 with small coolers and reasonable power consumption. I expect something like that.
 
BOOMEXPLODE said:
Remember Nvidia also makes "mainstream" GPUs like the 6600 with small coolers and reasonable power consumption. I expect something like that.

Well I have to say I do too, but I have to wonder what sort of pipe configurations we're going to be seeing nonetheless. These days it's the norm for consoles to exceed PC graphics for a short sliver of time, but when you look at what it would take to do that nowadays, what you're left with is the impression that unless something radical is done in design it's going to be a power and heat monster. So, I'm hoping for something radical. ;)
 
System and method for filtering graphics data on scanout to a monitor

Abstract

A graphics processing system performs filtering of oversampled data during a scanout operation. Sample values are read from an oversampled frame buffer and filtered during scanout; the filtered color values (one per pixel) are provided to a display device without an intervening step of storing the filtered data in a frame buffer. In one embodiment, the filtering circuit includes a memory interface configured to read data values corresponding to sample points from a frame buffer containing the oversampled data; and a filter configured to receive the data values provided by the memory interface, to compute a pixel value from the data values, and to transmit the pixel value for displaying by a display device, wherein the filter computes the pixel value during a scanout operation.


Above is an interesting nVidia patent for oversampling/downsampling and compressing/decompressing pixels for anti-aliasing...
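In rough software terms, the claim boils down to something like this - names, sizes and the box filter are all my assumptions:

Code:
/* Rough software analogue of the patent: MSAA samples are filtered into
   final pixels *during* scanout, so the resolved image is never written
   back to a frame buffer. Names, sizes and the box filter are assumed. */
#include <stdint.h>

#define WIDTH   1280
#define SAMPLES 4

/* Called once per scanline; 'samples' holds SAMPLES colour values per
   pixel, and filtered pixels go straight to the display interface. */
void scanout_line(const uint32_t samples[WIDTH][SAMPLES],
                  void (*emit_to_display)(uint32_t pixel))
{
    for (int x = 0; x < WIDTH; x++) {
        uint32_t r = 0, g = 0, b = 0;
        for (int s = 0; s < SAMPLES; s++) {
            const uint32_t c = samples[x][s];
            r += (c >> 16) & 0xFF;
            g += (c >>  8) & 0xFF;
            b +=  c        & 0xFF;
        }
        /* Box filter: average the samples, emit one pixel, store nothing. */
        emit_to_display(((r / SAMPLES) << 16) | ((g / SAMPLES) << 8) | (b / SAMPLES));
    }
}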

Add that patent to the back-end of a TBDR/Gigapixel-upgraded GPU that shades oversampled fragments/micro-polygons... and with its inherent bandwidth/memory-saving architecture, voila, the custom PS3 GPU! :p

*Warning, that was pure speculation*

;)
 
Interesting question.

I would say if the R500 has eDRAM and the PS3 GPU does not, then yes, the X360 will have an advantage in FSAA. How much is hard to say. Last I heard, the X360 will have just south of 50 GB/s to its memory pool, and then fast eDRAM to boot. If true, I don't see the PS3 GPU, even with eDRAM, ending up much faster, especially when you factor in the edge ATI currently has in FSAA benchmarks on the PC.

However, I still believe the PS3 GPU will be the more feature-rich GPU. Whether all those features will be usable, or a big deal, remains to be seen.
 
What about if PS3 is a tile renderer? Surely then there wouldn't be a need for lots of RAM bandwidth for framebuffers etc., so eDRAM wouldn't be needed? :?
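For the unfamiliar, the bandwidth argument for a tiler in a nutshell - all names and sizes below are assumptions - is that a tile's colour/Z stay on chip for every triangle that touches it, and external memory is written exactly once per tile:

Code:
/* Sketch of why a tile renderer is stingy with external bandwidth.
   Names and sizes are assumptions, not any real design. */
#include <stdint.h>
#include <string.h>

#define TILE 32                       /* assumed 32x32-pixel tile */

typedef struct {
    uint32_t color[TILE][TILE];
    float    depth[TILE][TILE];
} TileBuffer;                         /* lives in on-chip RAM, not DRAM */

struct Triangle;                      /* opaque; binned per tile upstream */
void rasterize_into_tile(TileBuffer *t, const struct Triangle *tri);

void render_tile(TileBuffer *t, const struct Triangle *const *bin, int count,
                 uint32_t *dram_fb, int fb_pitch, int tx, int ty)
{
    memset(t, 0, sizeof(*t));         /* clear costs no DRAM traffic */

    for (int i = 0; i < count; i++)   /* all Z tests and blends on chip */
        rasterize_into_tile(t, bin[i]);

    for (int y = 0; y < TILE; y++)    /* the only DRAM write: once per tile */
        memcpy(&dram_fb[(ty * TILE + y) * fb_pitch + tx * TILE],
               t->color[y], sizeof t->color[y]);
}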
 
System and method for real-time compression of pixel colors

Abstract

A system and method are provided for the compression of pixel data for communicating the same with a frame buffer. Initially, a plurality of samples is received. It is first determined whether the samples are reducible, in that a single sample value can take the place of a plurality of sample values. If it is determined that the samples are capable of being reduced, the samples are reduced. Reduction is a first stage of compression. It is then determined whether the samples are capable of being compacted. The samples are then compacted if it is determined that the samples are capable of being compacted. Compaction is a second stage of compression. The samples are then communicated with a frame buffer, in compressed form if possible, in uncompressed form if not. Subsequent reading of frame buffer data takes advantage of the smaller transfer size of compressed data. Compressed data is uncompacted and expanded as necessary for further processing or display. Where possible, data is processed in reduced form, rather than expanded, to minimize both the processing and the transfer bandwidth required. This system and method accelerate the rendering and display of computer-generated images.


This is the patent cross-referenced by the filtering patent above. Interestingly, Stephen Morein of ATI is referenced in this patent, as is Microsoft's Talisman (which, btw, is also similar to this Sony patent)...

Maybe it's me, but lots of things point to the PS3 GPU being a TBDR! :p

At the very least, both nVidia and Sony will have options to utilize their investments in IP to get bandwidth-saving and AA technologies into the custom PS3 GPU...
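For what it's worth, the two-stage scheme in that compression abstract sketches out something like this - the names are mine, not the patent's:

Code:
/* Sketch of the patent's two-stage compression, with invented names.
   Stage 1 (reduction): if all MSAA samples of a pixel are identical
   (pixel fully covered by one triangle), store one value instead of four.
   Stage 2 (compaction): pack the surviving values into a smaller burst. */
#include <stdbool.h>
#include <stdint.h>

#define SAMPLES 4

/* Stage 1: reducible iff every sample holds the same colour. */
bool reduce(const uint32_t in[SAMPLES], uint32_t *out)
{
    for (int s = 1; s < SAMPLES; s++)
        if (in[s] != in[0])
            return false;   /* edge pixel: keep all samples */
    *out = in[0];
    return true;
}

/* Stage 2: write either 1 or SAMPLES words to the frame-buffer burst,
   returning the word count so the memory controller can issue a
   correspondingly smaller transfer. */
int compact_pixel(const uint32_t in[SAMPLES], uint32_t *burst)
{
    uint32_t one;
    if (reduce(in, &one)) {
        burst[0] = one;
        return 1;           /* 4:1 bandwidth saving for this pixel */
    }
    for (int s = 0; s < SAMPLES; s++)
        burst[s] = in[s];
    return SAMPLES;
}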
 