New Technical Information About RSX???...But In Japanese

My translation:



Latest Change in PS3 Graphics. RSX was downgraded.

Before getting to the main question, we’re reporting the latest change concerning PS3 graphics.

It has been widely reported that the PS3 graphics chip is something like an Nvidia GeForce 7800 GTX based chip, but according to the collected data, as development has advanced at every studio, some PS3 developers have begun to complain about a lack of fill rate performance.

First, according to the development studios which created graphics prototypes {i.e. prototype engines} for PS3 titles on a PC base, it is extremely troublesome that performance drops (literal: does not come forth) when carrying [their software] over to a real PS3. Each studio apparently proceeds with a "detuning" plan {Comment: this is typical for the Japanese language: they create their own Engrish words. Detuning means making worse, scaling back; BTW this was a real bitch to find} for the technical specifications, in which the rendering resolution is lowered and shaders are removed, before carrying it over to a real PS3.

So what seems to have happened to a GPU sharing the GeForce 7800's base? Mainly three reasons are given.

Concerning PS3’s RSX, although the design is said to be based on the GeForce 7800 GTX, the video memory setup is closer (literal: suitable) to the lower-rank 7600 model due to its 128-bit bus connection, and the video memory bandwidth is narrower (in comparison to the 7800 line). That is considered the first reason.

Secondly, the RSX specs that were presented to the public in 2005 have since been downgraded. In 2005 it was announced that PS3’s RSX had a core clock of 550 MHz and a video memory clock of 700 MHz, but at E3 2006 that suddenly changed, and behind closed doors {the literal version here sounds like ass, thus slightly adapted} it was said that in the final retail model the core clock would be reduced to 500 MHz and the video memory to 650 MHz.
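To put the bus width and memory clock figures in perspective, here is a quick back-of-the-envelope check (a minimal C sketch; the RSX numbers are the ones quoted above, while the 256-bit / 600 MHz GDDR3 figures used for the 7800 GTX comparison are my own addition from memory):

```c
#include <stdio.h>

/* Peak theoretical bandwidth of a double-data-rate memory interface:
 * (bus width in bytes) * (memory clock) * 2 transfers per clock. */
static double peak_bandwidth_gb(double bus_width_bits, double mem_clock_mhz)
{
    return bus_width_bits / 8.0 * mem_clock_mhz * 2.0 / 1000.0; /* GB/s */
}

int main(void)
{
    /* RSX figures as quoted in the article: 128-bit bus, 650 MHz GDDR3. */
    printf("RSX (128-bit @ 650 MHz):      %.1f GB/s\n", peak_bandwidth_gb(128, 650));

    /* GeForce 7800 GTX for comparison: 256-bit bus, 600 MHz GDDR3 (assumed). */
    printf("7800 GTX (256-bit @ 600 MHz): %.1f GB/s\n", peak_bandwidth_gb(256, 600));
    return 0;
}
/* Prints roughly 20.8 GB/s vs 38.4 GB/s of local video memory bandwidth. */
```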

Thirdly, the number of ROP units is in fact only half that of the GeForce 7800 GTX.

RSX has 24 pixel shader units just like the GeForce 7800 line, but the ROPs (Raster Operation Units, also called the rendering back-end), which write (literal: fill) the shader output into video memory as data (basically, drawing the pixels), are different: instead of the 16 units of the GeForce 7800 line, RSX has only half of that, 8 units. That again corresponds exactly to the lower-rank GeForce 7600.
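The same sort of napkin math applies to the ROP difference. A sketch in C; the RSX figures (8 ROPs, 500 MHz) are from the article, while the 16 ROPs at 430 MHz for the 7800 GTX are my assumption based on the standard retail clocks:

```c
#include <stdio.h>

/* Theoretical peak pixel fill rate = ROP count * core clock.
 * This ignores blending, MSAA and memory stalls, so it is only an upper bound. */
static double fillrate_gpixels(int rops, double core_clock_mhz)
{
    return rops * core_clock_mhz / 1000.0; /* Gpixels/s */
}

int main(void)
{
    printf("RSX (8 ROPs @ 500 MHz):       %.2f Gpixels/s\n", fillrate_gpixels(8, 500));
    printf("7800 GTX (16 ROPs @ 430 MHz): %.2f Gpixels/s\n", fillrate_gpixels(16, 430));
    return 0;
}
/* Roughly 4.0 vs 6.9 Gpixels/s on paper. */
```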

[Rest is blabla]
 
This part is news to me, since what was discussed before was that the triangle fill rate would be a lot less than the X360's...
But as has been addressed, those are a different class of triangles. PS3 provides better sorted triangles to RSX as it were, so it only gets what needs to be rendered and is in that regard more efficient.

None of these figures in isolation mean anything at all. Except maybe clockspeeds ;)
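For what "only gets what needs to be rendered" could mean in practice, here is a deliberately generic back-face culling sketch in C. To be clear, this is not the actual PS3/RSX pipeline or any SDK API, just an illustration of the idea that triangles rejected before submission never cost the GPU anything; the types and the counter-clockwise front-face convention are assumptions.

```c
#include <stddef.h>

/* Hypothetical types for illustration only -- not a real PS3/RSX API. */
typedef struct { float x, y, z, w; } vec4;   /* clip-space position      */
typedef struct { vec4 a, b, c; } triangle;   /* one transformed triangle */

/* Signed area of the projected triangle: negative means it faces away
 * from the camera, assuming counter-clockwise front faces and triangles
 * already clipped so that w > 0. */
static float signed_area_2d(const triangle *t)
{
    float ax = t->a.x / t->a.w, ay = t->a.y / t->a.w;
    float bx = t->b.x / t->b.w, by = t->b.y / t->b.w;
    float cx = t->c.x / t->c.w, cy = t->c.y / t->c.w;
    return (bx - ax) * (cy - ay) - (cx - ax) * (by - ay);
}

/* Copy only front-facing triangles into the output list.  A real culling
 * pass would also do frustum and tiny-triangle rejection, but the point
 * is the same: the GPU never sees the discarded triangles at all. */
size_t cull_backfaces(const triangle *in, size_t count, triangle *out)
{
    size_t kept = 0;
    for (size_t i = 0; i < count; ++i)
        if (signed_area_2d(&in[i]) > 0.0f)
            out[kept++] = in[i];
    return kept;
}
```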
 
But as has been addressed, those are a different class of triangles. PS3 provides better sorted triangles to RSX as it were, so it only gets what needs to be rendered and is in that regard more efficient.

None of these figures in isolation mean anything at all. Except maybe clockspeeds ;)

Can you show me where this was mentioned? I'm intrigued as to how it is able to do that without relying upon Cell (perhaps this is why the SPU is allocated to the OS?).
 
Well, I feel that to us, well-informed B3Ders, this article doesn't really bring anything to the table. The way I understood it, we knew (please do correct me if I misunderstood something!)

- that the clock downgrade had taken place. Also, that in terms of performance, it's fairly insignificant.
- that having 8 instead of 16 ROPs makes sense in the context of the PS3, as the 8 extra ROPs wouldn't have contributed much
- that the 128-bit bus doesn't hamper the RSX in the PS3 much either, partly because the chip has been integrated much more tightly with the rest of the system, working directly with Cell and the main XDR memory.

Indeed, by far the most important and interesting part of this discussion, imho, is not so much the exact specs of the RSX in isolation, but the details of its integration with the rest of the system. People are far too PC centric in their discussions of graphics hardware sometimes. A console like the PS3 is a well-tuned device in which all the components are tuned to each other's performance. The most exciting part of the machine is its whole, not its parts. Sounds obvious, but really isn't. ;) Even if Cell is an interesting part, its relevance is how it can contribute to the system. Can it calculate all sorts of wonderful stuff? Sure, but if there's no efficient way to integrate any graphics related calculations with the RSX output to the video memory, then there's no point.

Also, Barbarian seems to spill some new details that I hadn't seen before, even if it's just a small one about the size of the texture cache being raised from 48k to 96k:
http://www.beyond3d.com/forum/showpost.php?p=838753&postcount=45
 
Can you show me where this was mentioned? I'm intrigued as to how it is able to do that without relying upon Cell (perhaps this is why the SPU is allocated to the OS?).
I haven't a link I'm afraid but IIRC it was nAo that explained this. Neither can I recall whether Cell was involved or not, nor to what degree. I faintly remember the idea that sorting of a sort was going on in RSX prior to the transform part where RSX was inferior (this was 'the RSX can only do 250 M Tris a second whereas Xenos can do 500 million' story.)

If you search for this info along with the Inq story about 250 and 500 million triangles, you'll probably find it. I would but I've really got to do some work!
 
Interesting that the article says that the devs that were complaining of lack of fillrate performance had to remove shaders in addition to reducing the rendering resolution. Why would they need to remove shaders?

Reduction in the number of pixel shaders?
 
I haven't a link I'm afraid but IIRC it was nAo that explained this. Neither can I recall whether Cell was involved or not, nor to what degree. I faintly remember the idea that sorting of a sort was going on in RSX prior to the transform part where RSX was inferior (this was 'the RSX can only do 250 M Tris a second whereas Xenos can do 500 million' story.)

If you search for this info along with the Inq story about 250 and 500 million triangles, you'll probably find it. I would but I've really got to do some work!

I haven't found it yet, but while searching I came across a wonderful discussion between Titianio and Barbarian, and I'd love to get a 'one year later' perspective on that discussion:

http://www.beyond3d.com/forum/showthread.php?t=22127&page=3&highlight=xenos+rsx+vertices+sorting
 
I haven't a link I'm afraid but IIRC it was nAo that explained this. Neither can I recall whether Cell was involved or not, nor to what degree. I faintly remember the idea that sorting of a sort was going on in RSX prior to the transform part where RSX was inferior (this was 'the RSX can only do 250 M Tris a second whereas Xenos can do 500 million' story.)

If you search for this info along with the Inq story about 250 and 500 million triangles, you'll probably find it. I would but I've really got to do some work!
Maybe you were thinking about this discussion?

http://www.beyond3d.com/forum/showthread.php?t=31255&page=6

I think page seven and forward contained some juicy details, but the whole thread is pretty informative.
 
I think I've seen a video about Lair... or something. Anyway, the guy in the video couldn't understand why people were saying that there was a downgrade. I'll try to find it.
 
Um, it's meant to be a launch game is it not? I kind of doubt it's only 35% done if that's the case, it'd be pretty impossible to finish it in time. If it's REALLY only 35% done, it'll be a 2008 title for sure, and that might be optimistic considering how long they've been working on it! :p
Let's just hope that Lair is suffering from Ugly Duckling Syndrome (UDS). Hopefully F5 puts the game under wraps, shoots for 720p and shows us the beautiful swan that they promised.
 
Let's just hope that Lair is suffering from Ugly Duckling Syndrome (UDS). Hopefully F5 puts the game under wraps, shoots for 720p and shows us the beautiful swan that they promised.
It depends how long 'til it launches. We've seen a few titles that have looked bad on first showing, but got much nearer their targets as they've progressed. Plus a penciled date might get postponed. That's not uncommon. So even if Lair is due in March, they might work at it until September to get it up to expectations.

Like all these games, comment on what we see, but don't judge the final game until it's out.
 
That's usually my mantra. But I've been hearing folks gushing about this game as if the graphics were indicative of a "done" game. It simply isn't. I don't judge unfinished games, but I don't praise them either. Gameplay is the only thing you can judge and so far it looks like Warhawk with dragons. Whether that's good or bad depends on your taste.
 
Hey guys, noticed a similar thread over at NeoGaf. From http://pc.watch.impress.co.jp/docs/2006/0926/kaigai303.htm .

Slow updates on the PS3 SDK

The article goes into some hurdles PS3 developers have had to go through. SDK 1.00 was supposed to be released by TGS, but the latest release before TGS was 0.93. What is being shown at TGS was still developed with non-final SDKs.

This situation is similar to Sony's PSP launch with its slow SDK updates; however, this time it may be worse. According to Sony's spring schedule, SDK 1.0 was planned to be released in June.

The SDK version number reflects hardware maturity, and the 1.0 release may take a while, so developers will have to debug using a non-final SDK. This may indicate that SCE's software development has not caught up to schedule and that the original plan was over-optimistic. Several PS3 “DebugStations” are only now starting to reach the hands of developers, and with about two months left until launch there were already concerns about the schedule before TGS.

Hurdles involving the Performance of the Cell:

There are some challenges involving the architecture of Cell, which consists of a PPE and SPE cores. A developer states, "It is impossible to extract the full performance of Cell on launch titles; it will take time to get familiar with it". Another developer says they are having difficulties with the 256KB of local memory on each SPE core: the actually usable area of the 256KB is closer to 128KB once buffering for accesses to external DRAM is taken into account. "It would have been a much different situation if there was 1MB of local memory". There is, however, a benefit to this restricted local memory, since latency is low and latency cycles are easy to predict. This is an advantage for real-time gaming applications.
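To illustrate why roughly half of the 256KB effectively disappears, here is a minimal double-buffered streaming sketch: one buffer is being processed while the MFC fetches the next one, so two buffers live in local store at once. I'm assuming the mfc_* DMA calls from the IBM Cell SDK's spu_mfcio.h as I remember them; the chunk size and the process_chunk() hook are placeholders, so treat this as a sketch rather than production code.

```c
#include <spu_mfcio.h>
#include <stdint.h>

#define CHUNK_SIZE (16 * 1024)           /* placeholder per-buffer size */

/* Two DMA buffers resident in local store at the same time. */
static char buf[2][CHUNK_SIZE] __attribute__((aligned(128)));

extern void process_chunk(char *data, unsigned size);  /* placeholder hook */

void stream_from_main_memory(uint64_t ea, unsigned num_chunks)
{
    unsigned cur = 0;

    /* Prime the pipeline: start fetching chunk 0 on tag 0. */
    mfc_get(buf[0], ea, CHUNK_SIZE, 0, 0, 0);

    for (unsigned i = 0; i < num_chunks; ++i) {
        unsigned next = cur ^ 1;

        /* Kick off the DMA for the following chunk on the other tag... */
        if (i + 1 < num_chunks)
            mfc_get(buf[next], ea + (uint64_t)(i + 1) * CHUNK_SIZE,
                    CHUNK_SIZE, next, 0, 0);

        /* ...then wait only for the current chunk and work on it while
         * the next transfer is still in flight, hiding the DMA latency. */
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();
        process_chunk(buf[cur], CHUNK_SIZE);

        cur = next;
    }
}
```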

Until now, developers have had to be stingy with code and memory usage, and this will not change with the PS3; in fact, Cell will reward developers who put more effort into their programming. While that may not be a negative, it is a hurdle that will demand more developer resources and time. With Cell the emphasis has shifted from simply extracting performance from the hardware towards multi-threaded performance, and that takes a different skill set than before.

RSX Memory Bottleneck

Developers were using the 7800 GTX for development. The RSX is based on Nvidia's G70 and its programmable shader performance is very high, but the memory interface is 128-bit and it has only 8 ROPs (Raster Operation units). It can be said that the RSX pairs the shaders of a high-end PC part with mid-range memory bandwidth, so the GPU's high shader performance runs into a bottleneck at the ROPs and memory. “For lower resolutions it is a fantastic GPU, but it gets difficult at high-end HDTV resolutions,” says a developer.

The biggest impact is on HDR and FSAA: the memory bottleneck makes PC levels of HDR and FSAA hard to reach. To overcome this hurdle, developers are placing textures in Cell's memory and reading them over FlexIO to reduce the pressure on the GPU's own bandwidth.
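Rough numbers for why that workaround helps: the 20.8 GB/s follows from the 128-bit / 650 MHz figures earlier in the thread, while the 15 GB/s FlexIO read figure is the commonly quoted one from Sony's 2005 material and should be taken as an assumption here.

```c
#include <stdio.h>

int main(void)
{
    /* All figures in GB/s.  gddr3_local follows from 128-bit @ 650 MHz;
     * flexio_read is the commonly quoted RSX-reads-from-XDR figure and
     * is an assumption, not something confirmed in the article. */
    const double gddr3_local = 20.8;  /* RSX's own video memory        */
    const double flexio_read = 15.0;  /* RSX reading from XDR via Cell */

    printf("Local video memory only:  %.1f GB/s\n", gddr3_local);
    printf("With textures in XDR too: %.1f GB/s\n", gddr3_local + flexio_read);
    return 0;
}
/* Moving texture reads to XDR leaves more of the GDDR3 budget for the
 * ROPs (HDR targets, FSAA), at the cost of extra traffic over FlexIO. */
```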

Developers are having exactly opposite problems on the PS3 and the 360 as far as GPU performance is concerned. While the PS3 has memory bandwidth issues, it has great shader performance. On the 360, the 10MB of eDRAM means few memory bandwidth issues and fewer FSAA issues, but some developers are struggling with a lack of shader ALU performance and threading resources; however, performance will increase as developers get more familiar with the unified shader architecture.
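As a quick sanity check on the eDRAM point, here is the render-target size math at 720p, assuming 32-bit colour plus 32-bit depth/stencil per sample (my assumption); it shows why 10MB covers the basic case but FSAA beyond a point still means tiling on the 360:

```c
#include <stdio.h>

int main(void)
{
    /* Back-of-the-envelope render-target sizes at 720p. */
    const int width = 1280, height = 720;
    const double bytes_per_sample = 4 + 4;            /* colour + depth */

    for (int samples = 1; samples <= 4; samples *= 2) {
        double mb = (double)width * height * bytes_per_sample * samples
                    / (1024.0 * 1024.0);
        printf("720p with %dx MSAA: %.1f MB %s\n", samples, mb,
               mb <= 10.0 ? "(fits in 10 MB eDRAM)" : "(needs tiling)");
    }
    return 0;
}
/* Roughly 7 MB at 1x, 14 MB at 2x, 28 MB at 4x. */
```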

However, all hardware has its limitations, and the real difference should become apparent as the two systems mature. For the PS3, developers are concerned whether they can demonstrate a difference in power between the PS3 and the Xbox 360, given these technical hurdles.
 
Most of this stuff is a compilation of thoughts and speculations that have already been discussed to death here. Nice to have them summed up, though.
 
Interesting clarification on what the 360's GPU troubles are exactly:

"Saying they (360 developers) are lacking in Shader ALU performance and having stalls due to not having enough "threading resources"
 
Interesting clarification on what the 360's GPU troubles are exactly:

"Saying they (360 developers) are lacking in Shader ALU performance and having stalls due to not having enough "threading resources"

Just to finish that sentence...
however performance will increase as developers get more familiar with unified shader architecture.
;)
 