PS3 internals

I think the fact that Sony have declined to release RSX's clock speed is official confirmation enough ;)

Who knows, but as you said, if they haven't announced it yet, it's because RSX doesn't reach 550MHz. We all know how Nvidia works; remember the GPU in the Xbox.

And I have a question about raytracing. The thing is, I heard that it could generate any image without textures, so if this is true, and a CPU could generate a 3D image, is this the beginning of the end of GPUs?
 
[Raytracing could] ... generate any image without textures ... is this the beginning of the end of GPUs?

It has been possible for a CPU to generate raytraced images since before the time of the Atari ST and Amiga 500, on Motorola 68000 CPUs running at a blistering 8MHz (slightly less for the Amiga). Since then CPU clock speeds have increased in megahertz by more than 400x, and still we don't see developers en masse using the CPU to render an entire image in a game.
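For reference, that ~400x figure roughly checks out against clocks of the era (assuming a ~3.2 GHz CPU such as Cell's PPE as the modern comparison point):

```python
# Rough check of the "more than 400x" clock-speed claim.
m68k_mhz = 8        # Motorola 68000 in the Atari ST
modern_mhz = 3200   # a ~3.2 GHz CPU of the PS3 era (e.g. Cell's PPE)

speedup = modern_mhz / m68k_mhz
print(speedup)  # 400.0
```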

Someone else will give you a reason why developers are not converting to raytracing and why id, Epic or Crytek are not extolling the virtues of raytracing.
 

You'd need a supercomputer to render whole games with the CPU alone, no? Say no to 0.000003 FPS games! :)
 


LOL, but you could use it for some parts of the game, like the clouds in Warhawk. I think that's interesting; the clouds in Warhawk looked pretty good.
 
You know, I was pondering the whole scaling thing and something occurred to me: just why would a fancy scaler chip be needed at all? RSX should be just as capable of running at other resolutions as a PC GPU, and there's no real framebuffer size problem like on Cube/Wii. So why not just natively render at each needed resolution? Surely devs should be capable of planning for that.
 

That type of stuff is why PC gaming is messier. It still wouldn't be as bad here, but dealing with a single resolution is much easier (debugging, assets like text/fonts/etc., and performance concerns at given resolutions).

It'd be possible, but a scaler is probably the optimal way (especially from a dev standpoint, I imagine). Apparently there is a scaler? It's just off limits or something? I wonder if that's part of what the extra size in RSX is -- if that's the case, then I imagine a software update should allow for it (hopefully there isn't some hardware fault in it, making it useless!).

Scaling is my biggest concern for PS3 at the moment -- the rest of the stuff (background downloading, for example) is quite fixable in updates. I was dumb/lucky enough to get a 1080p TV, so there aren't any issues for me really (it accepts any res), but it'd be nice if DVDs and PS1/2 games got upscaled (although I'd prefer if PS1/2 games were actually just rendered at higher res).

Guess we wait and see at this point.
 
Just why would a fancy scaler chip be needed at all? ... So, why not just natively render at each needed resolution?
Fill rate.

RSX could perform some scaling duties though: use the current framebuffer as a texture, then render to a differently sized display buffer using bilinear filtering or something better (a custom filter in a shader).
The problem though is that you can't tack such things on as an afterthought. If a game keeps RSX busy all the time to barely attain the framerate it is locked to, you can't pile extra duties on RSX later without breaking stuff.
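A minimal sketch of that framebuffer-as-texture idea, written as plain Python rather than an actual RSX shader, just to show the per-pixel work a bilinear fullscreen pass implies:

```python
def bilinear_scale(src, dst_w, dst_h):
    """Scale a 2D grayscale 'framebuffer' (list of rows of floats) to
    dst_w x dst_h with bilinear filtering -- roughly what a fullscreen
    texture-sampling shader pass would do for each output pixel."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(dst_h):
        # Map the destination pixel centre back into source space, clamped.
        fy = max(0.0, (y + 0.5) * src_h / dst_h - 0.5)
        y0 = min(src_h - 1, int(fy))
        y1 = min(src_h - 1, y0 + 1)
        wy = fy - y0
        row = []
        for x in range(dst_w):
            fx = max(0.0, (x + 0.5) * src_w / dst_w - 0.5)
            x0 = min(src_w - 1, int(fx))
            x1 = min(src_w - 1, x0 + 1)
            wx = fx - x0
            # Blend the four nearest source texels.
            top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx
            bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out
```

On the GPU this whole double loop collapses into one textured quad draw, but the point stands: it is extra per-frame work on top of whatever the game already does.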

Similar with Cell. Now, the good thing about the Cell approach to scaling is that one of the SPEs is, as far as we know, reserved for OS duties anyway. It does consume memory bandwidth though, and by extension may interfere with RSX duties again (GDDR requests have to go through RSX).
 
Someone else will give you a reason why developers are not converting to raytracing

Simply put, the workload is colossal. You'd need a whole row of Cell processors to generate a game-grade image at even 30FPS.

There are ray tracers running on Cell, but at the moment you get in the region of 1.5 FPS at 512x512 resolution (roughly 480p), using 8 SPEs*.
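As a rough back-of-envelope check on that figure, assuming performance scales linearly with pixel count (which is optimistic) and picking 720p at 30 FPS as a hypothetical "game grade" target:

```python
# Measured: ~1.5 FPS at 512x512 on 8 SPEs.
base_fps = 1.5
base_pixels = 512 * 512

# Hypothetical target: 720p at 30 FPS.
target_fps = 30.0
target_pixels = 1280 * 720

# Pixels-per-second achieved vs. needed.
base_rate = base_fps * base_pixels
needed_rate = target_fps * target_pixels

speedup_needed = needed_rate / base_rate
print(round(speedup_needed, 1))  # ~70x, i.e. dozens of Cells
```

That ~70x gap is where the "whole row of Cell processors" estimate comes from.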

There are many programmatic ways of speeding things up, and you could split the workload across Cell and RSX, which should give a nice healthy boost.

Even if you could speed things up by several hundred percent, the Wii would still be kicking its arse - and it'd be a really, really, really boring game, as the image generation alone uses 100% of the CPU's power.

That said, I'd love to see someone do the old Amiga "juggler" demo in real time; Cell should be capable of that.

*See this. There's also a video floating around somewhere.
 
Definitely not. It only works in the 60Hz range.
Did you look at the image in the post you were quoting?
Did you notice some forum members here use Japanese PS3s in European households without step-down converters or anything similar?

It is a universal PSU, and it definitely works just fine with 50Hz AC.
 
I know this is stupid, but the "Higher Definition Video" on playb3yond now says that RSX can do 2 trillion operations per second. The thing is, they changed it, because the same video used to say that RSX could do 100 billion operations per second.

Before

After

I mean, the difference between 100 billion and 2 trillion (that's 2,000 billion, am I right?) is HUGE.
 

Uhm, as figures they're both as useless as each other... so I wouldn't worry too much about it...
 

In one video they're referring to only RSX (the 100 billion comment, maybe talking shader ops?), while in the other video they're talking about the PS3 (the 2 trillion comment) as a whole system.
 


Oh, I missed that.
 
2 Trillion what?
You'd have to count a lot of things you really shouldn't be counting to reach that kind of number. Not relevant, just PR (see sig). It's simply a more memorable form of "it's really fast!".
 
Sorry for digging up an old thread, but reading some comments in the HS thread about PS3 scaling prompted me to Google to see if any more info had been discovered about that big I/O chip (CXD2973GB) in the PS3 that I have always reckoned to be a Toshiba Super Companion Chip (based on size, iSuppli cost estimates and wishful thinking).

A quick dig turned up the following email from a Linux mailing list, which I reckon to be fairly close to a confirmation that there is indeed an SCC in there, from a guy called Geoff Levand who appears to work for Sony.

usb: fix ohci-hcd quirk in cell south bridge

There are currently three platforms that use this chip, the IBM
Cell Blade, the Toshiba Cell Reference Set (Celleb), and the
Sony PS3. The chip itself is on a system bus (IOIF), but each
platform represents it somewhat differently.

From reading some of the responses to that email, it looks like the Hypervisor adds a degree of complication to programming for the SCC, so I wondered if this could explain the delay in getting the functionality out the door (if it's there at all, of course).
 