FS ATI interview on R500 + Block Diagram

Acert93

[Attachment: 01.PNG - R500 block diagram]


http://www.firingsquad.com/features/xbox_360_interview/default.asp

FiringSquad: You said earlier that EDRAM gives you AA for free. Is that 2xAA or 4x?

ATI: Both, and I would encourage all developers to use 4x FSAA. Well I should say there’s a slight penalty, but it’s not what you’d normally associate with 4x multisample AA. We’re at 95-99% efficiency, so it doesn’t degrade it much is what I should say, so I would encourage developers to use it. You’d be crazy not to do it.

5% hit for a bump from 2X AA to 4X AA? Well worth it!
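A quick back-of-the-envelope sketch (my own figures, assuming 32-bit colour plus 32-bit Z/stencil per sample at 720p, none of which comes from the interview) shows where that slight penalty likely comes from: a full 4x multisampled framebuffer overflows the 10 MB of EDRAM, so the frame has to be rendered in a few tiles:

```python
# Rough framebuffer-size arithmetic for 4x MSAA at 720p.
# Assumes 4 bytes colour + 4 bytes depth/stencil per sample;
# illustrative figures, not from the interview.
WIDTH, HEIGHT = 1280, 720
SAMPLES = 4
BYTES_PER_SAMPLE = 4 + 4          # colour + Z/stencil
EDRAM_BYTES = 10 * 1024 * 1024    # 10 MB of EDRAM

fb_bytes = WIDTH * HEIGHT * SAMPLES * BYTES_PER_SAMPLE
tiles = -(-fb_bytes // EDRAM_BYTES)   # ceiling division

print(f"4x MSAA framebuffer: {fb_bytes / 2**20:.1f} MiB")  # 28.1 MiB
print(f"Tiles needed to fit in EDRAM: {tiles}")            # 3
```

A few percent of overhead for re-submitting geometry that straddles tile boundaries would line up with the 95-99% efficiency quoted above.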


FiringSquad: Onto the video processor, is it an on-die TV encoder or something like a Rage Theater-type chip? Would that be a third chip?

ATI: It is a third tiny chip and actually Microsoft did that. Microsoft if you recall acquired, about five or six years ago, acquired WebTV. So the people in Mountain View, CA that were a part of that group, and of course, it’s not just those people anymore, but they did that chip, and they’ve done a good job.

You know it’s a good choice because it’s a lot cheaper silicon, they’re using 90nm.
Interesting.

FiringSquad: How does Xbox 360 GPU compare in size to the RSX?

ATI: In terms of size, we’re a bit smaller. Of course, I’m not sure if that’s a good way to compare things, and to be honest I can’t talk about the number of transistors for this design. Microsoft owns the IP and that has a lot to do with their cost model and all that sort of stuff. But we’re a very efficient engine and we feel very good about our design. You know, the bang for the buck is awesome. The power of the platform [pauses] we’re going to be the most powerful platform out there, we’ve got a lot of innovation in there, we’re not just a PC chip.

I think the Sony chip is going to be more expensive and awkward. We make efficient use of our shaders, we have 64 threads that we can have on the processor at once. We have a thread buffer inside to keep the [inaudible]. The threads consist of 64 vertices or 64 pixels. We have a lot of work that we can do, a lot of sophistication that the developer never has to see.

FiringSquad: Do you know if it supports dual HD displays?

ATI: No it doesn’t. I know the NVIDIA chip does, but that’s only because PC products do. It doesn’t seem to have a real use inside the living room, but maybe you differ with me on that.

FiringSquad: Well, on the Sony console, I think they’re looking at applications that go beyond just a console in the living room don’t you think?

ATI: Yeah I really think it’s just an accident because, well you know, last summer they had to change their plans. They found out that Cell didn’t work as well as they wanted to for graphics. Remember originally you had two or three Cell processors doing everything and then in August last year they had to take an NVIDIA PC chip. And as you know, all PC chips do this, and so it [dual HD display outputs] just came for free.

Well if it looks like a duck, and your competition is calling it a duck... is there much doubt that the NV deal was later and that the PS3 has a PC part?

I am really interested in knowing more about the G70 now, especially its 128-bit pixel precision. It will be pretty neat to compare the chips' feature sets. More info/interviews at:


HardOCP
http://www.hardocp.com/article.html?art=NzcxLDM=

AnandTech
http://www.anandtech.com/tradeshows/showdoc.aspx?i=2423
 
That diagram is very high level. I was shown another that they use internally (unfortunately I wasn't allowed to take it myself) and that actually showed effectively 8 groupings of six shaders.
 
They found out that Cell didn’t work as well as they wanted to for graphics. Remember originally you had two or three Cell processors doing everything
We're supposed to remember that? It was never disclosed that two or three Cell processors would do everything. Nothing had been disclosed until recently.

This article was pretty useless, classic marketing-driven BS, if you ask me.
 
DaveBaumann said:
That diagram is very high level. I was shown another that they use internally (unfortunately I wasn't allowed to take it myself) and that actually showed effectively 8 groupings of six shaders.

Is it true that they only have 4 texture units? I was a little surprised, to say the least.
 
jvd said:
there was the leak of the Cell GPU that was just a Cell chip doing the graphics.
That was not a leak, nor was it said that it was set in stone to be the PS3's final design. It was simply a patent.

They had two choices: a Cell-based GPU, or a classical GPU.
I don't remember hearing a rumour about a 100% software rendering solution using two or three Cells.
 
pc999 said:
I need to ask, will B3D make an article to next gen consoles?

For the same reasons Anand mentioned, there is very little I can say on RSX, and although NVIDIA's David Kirk would suggest that they "are different parts", when we look at G70 we'll also discover much about the PS3's graphics. I am here trying to get as many details on the Xbox graphics as possible because a) it has been announced and people are talking about it, b) it's a fairly different part in a number of respects, so it's quite interesting, and c) this will have ramifications beyond Xbox 360 and even the PC.

I've just had a quick chat with Bob, but I asked a few questions that were a little too detailed for him so I should be getting a CC with the two senior architects next week.
 
jvd said:
there was the leak of the Cell GPU that was just a Cell chip doing the graphics.

Back when Cell started development that would have been a sound theory. Everything was polygons and textures. Didn't Tim Sweeney say back in, like, '99 that GPUs were just a fad and he expected software renderers to make a big comeback in 5-10 years? Nobody could have foreseen the revolutionary GeForce 3 and the pixel shader.
 
tEd said:
DaveBaumann said:
That diagram is very high level. I was shown another that they use internally (unfortunately I wasn't allowed to take it myself) and that actually showed effectively 8 groupings of six shaders.

Is it true that they only have 4 texture units? I was a little surprised, to say the least.

No, it's 4 groups of 4. They are grouped in fours as these are the most common sampling requirements.
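As an illustration of why samplers naturally come in fours (my own sketch, not from Dave's post): rasterizers walk triangles in 2x2 pixel quads so texture LOD can be computed from neighbouring-pixel differences, and a 4-wide sampler group maps onto exactly one quad:

```python
# Minimal sketch: split a pixel rectangle into the 2x2 quads a
# rasterizer would hand to a 4-wide texture-sampling group.
def quads(x0, y0, x1, y1):
    """Yield 2x2 pixel quads covering [x0, x1) x [y0, y1)."""
    for y in range(y0, y1, 2):
        for x in range(x0, x1, 2):
            yield [(x, y), (x + 1, y), (x, y + 1), (x + 1, y + 1)]

qs = list(quads(0, 0, 4, 4))
print(len(qs))    # 4 quads cover a 4x4 region
print(qs[0])      # [(0, 0), (1, 0), (0, 1), (1, 1)]
```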
 
These shaders execute a new unified instruction set that incorporates instructions for both vertex and pixel operations. In fact, Feldstein called it a "very general purpose instruction set" with some of the same roots as the DirectX instruction set. Necessarily, the shader language that developers will use to program these shader units will be distinct from the shader models currently used in DirectX 9, including Shader Model 3.0. Feldstein described it as "beyond 3.0." This new shader language allows for programs to contain an "infinite" number of instructions with features such as branching, looping, indirect branching, and predicated indirect. He said developers are already using shader programs with hundreds of instructions in them.

Cool info from the link Jaws posted. http://www.techreport.com/etc/2005q2/xbox360-gpu/index.x?pg=1

In traditional pixel shaders, he noted, any shader output is generally treated as a pixel, and it's fed through the rest of the graphics pipeline after being operated on by the shader. By contrast, the Xbox 360 GPU can take data output by the shaders, unaltered by the rest of the graphics pipeline, and reprocess it. This more efficient flow of data, combined with a unified instruction set for vertex and pixel manipulation, allows easier implementation of some important graphics algorithms in real time, including higher-order surfaces and global illumination.

From one of the other interviews it seems there are some LOD refinements and higher-order surface support. These kinds of little things are pretty cool.

he claimed the relative abundance of vertex processing power in this GPU should allow objects like fur, feathers, hair, and cloth to look much better than past technology had allowed. Feldstein also said that character skin should look great, and he confirmed to me that real-time subsurface scattering effects should be possible on the Xbox 360.

Now let's see if it can actually perform these features at a respectable level so they can be used in games.
 
DaveBaumann said:
They are grouped in four as these are the most common sampling requirements.
Is that because texturing is still done in quads?

Edit: Umm, my maths is right here, yes?
ATI: So we have 48 shaders, each one of them does 4 floating-point ops per cycle, so 196 floating ops per clock.
48 x 4 = 192 not 196?
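Checking the arithmetic (the 500 MHz clock is the announced figure, used here just for illustration):

```python
# Verify the shader-op arithmetic from the interview.
SHADERS = 48
OPS_PER_SHADER_PER_CLOCK = 4
CLOCK_HZ = 500e6   # announced GPU clock, for illustration

ops_per_clock = SHADERS * OPS_PER_SHADER_PER_CLOCK
print(ops_per_clock)                    # 192, not 196
print(ops_per_clock * CLOCK_HZ / 1e9)   # 96.0 billion shader ops/s
```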
 
Vysez said:
It was simply a patent.

It would appear, by their late inclusion of the NV GPU, that they were at least seriously exploring the direction of the patent. As rumoured here by others it would seem that Toshiba's effort, possibly based on CELL, just was not working out for some reason (performance, features, cost, who knows).

Obviously ATI knows more about what is going on at NV than we do. I would think it is safe to say, in the least, that even if ATI is conjecturing based on the patent that we can safely conclude Sony was trying something that did not involve NV until fairly late in the process.

Not that this is a bad thing. The RSX is basically a high-end desktop GPU. 300M transistors :oops: A ton of bandwidth and an amazing CELL behind it to bring the game worlds to life with animation, physics, and stuff moving around.

Most importantly, the RSX gives Sony access to some great NV tools and a "known" quantity that developers can immediately begin exploring and exposing its potential. Yet for whatever reasons NV was not Sony's first choice.

What I am really interested in is getting to the nuts and bolts and seeing how the RSX performs compared to the R500. It will be an interesting tradeoff if the RSX has more power but the R500 has better IQ because of the free AA.
 
Now I am thoroughly confused of what the R500 is. So, is it now etched in stone that the R500 can do 192 shader ops per clock? 48 parallel processing units, each capable of 4 ops?

So, this turns everything on its head in respect to shading capability. It was first reported that the RSX had higher shading performance than the R500, with 136 ops to the R500's 96. Seeing the 'typo' of 196, as pointed out by Neeyik, it makes it look like someone just slipped a 1 in front of the 'original' 96.

Acert93 said:
It will be an interesting tradeoff if the RSX has more power but the R500 has better IQ because of the free AA.

So, shading power alone, it now looks like the R500 has a definite edge in power here as well as having "AA for free". *gulp* How quickly things change in chaos. I was under the impression that the PS3 was the potentially less efficient powerhouse, but now it is starting to look like Xbox 360 "has it all".
 
FiringSquad: Do you know if it supports dual HD displays?

ATI: No it doesn’t. I know the NVIDIA chip does, but that’s only because PC products do. It doesn’t seem to have a real use inside the living room, but maybe you differ with me on that.

FiringSquad: Well, on the Sony console, I think they’re looking at applications that go beyond just a console in the living room don’t you think?

ATI: Yeah I really think it’s just an accident because, well you know, last summer they had to change their plans. They found out that Cell didn’t work as well as they wanted to for graphics. Remember originally you had two or three Cell processors doing everything and then in August last year they had to take an NVIDIA PC chip. And as you know, all PC chips do this, and so it [dual HD display outputs] just came for free.

Ahhh...found the source...just read a few threads and there's a sudden conspiracy theory that has nothing to do with anything...and it's from here! :p

That's so LAME! :p

You'd think that multi-billion dollar projects do not operate with any contingencies? Sounds like sour grapes to me and clutching at straws! :p

You'd think the PS3, CELL+RSX did badly at E3 or something... :rolleyes:
 
Jaws said:
You'd think that multi-billion dollar projects do not operate with any contingencies? Sounds like sour grapes to me and clutching at straws! :p

Actually, I'd expect that ATI would have been in some competition with nVidia for the PS3 GPU, and would thus know something about Sony's PS3 GPU strategy.
 
wireframe said:
Now I am thoroughly confused of what the R500 is. So, is it now etched in stone that the R500 can do 192 shader ops per clock? 48 parallel processing units, each capable of 4 ops?

So, this turns everything on its head in respect to shading capability. It was first reported that the RSX had higher shading performance than the R500, with 136 ops to the R500's 96. Seeing the 'typo' of 196, as pointed out by Neeyik, it makes it look like someone just slipped a 1 in front of the 'original' 96.

Acert93 said:
It will be an interesting tradeoff if the RSX has more power but the R500 has better IQ because of the free AA.

So, shading power alone, it now looks like the R500 has a definite edge in power here as well as having "AA for free". *gulp* How quickly things change in chaos. I was under the impression that the PS3 was the potentially less efficient powerhouse, but now it is starting to look like Xbox 360 "has it all".

I would love for Dave or someone to break this down into layman's terms. I don't think you can compare shader instructions per clock because the ATI GPU has to use a percentage of theirs for VS ops. From the sounds of it, this GPU is trading raw fill rate for ALUs and EDRAM. Would I be right in saying that since this GPU does not have to worry about legacy apps like DX7 and older, raw fill rate does not matter as much as it would on a PC GPU? I could be wrong, but this thing sounds like a shading beast.
 
wireframe said:
Now I am thoroughly confused of what the R500 is. So, is it now etched in stone that the R500 can do 192 shader ops per clock? 48 parallel processing units, each capable of 4 ops?
This method of counting shader ops is the same as NVIDIA's. At least superficially.

Still, there's a question mark over whether R500's ALUs are 4D+1D or 3D+1D.
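For what it's worth, here's what the two layouts would mean for the per-clock count (my own arithmetic, assuming each ALU co-issues one vector op and one scalar op):

```python
# Ops/clock under the two rumoured ALU layouts for the 48 units.
ALUS = 48
layouts = {}
for vec, scalar in [(4, 1), (3, 1)]:
    layouts[f"{vec}D+{scalar}D"] = ALUS * (vec + scalar)

print(layouts)   # {'4D+1D': 240, '3D+1D': 192}
```

Note that the 3D+1D reading is the one that matches the 48 x 4 = 192 figure discussed above.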

Fun, ain't it?... :p

Jawed
 