8 ROPs on the RSX. Bad engineering decision?

The fact that studios with infinite budgets, unlimited time and/or preferential support are able to eke performance out of RSX+Cell equivalent to what can be achieved by more typical studios with far less time and far less money on Xenos alone hardly indicates that Sony devs are aiming higher.

Heck, if you think there are a lot of rsx sucks posts here, you should check the official Sony dev PS3 support forums. You'll be overwhelmed by the myriad of posts of "...why is this slower than the other console..." or "...but the other console can...", etc, etc, etc. Of course, I'm sure all those developers must be totally wrong, and rsx clearly represents The Pinnacle Of Rendering Performance (tm).

Hey, I don't believe that Insomniac has had unlimited budgets or time, and by November of this year they will have produced three of the best-looking titles available this gen (after only two years of the PS3 being out). I think Insomniac did the right R&D up front and did not fight the changes required to maximize what's available in the PS3. Also, I never claimed that RSX was the pinnacle of anything and am very well aware of how ordinary it is... but how can a dev as smart as yourself not agree that Sony was much wiser not to blow their R&D + production costs on a GPU when the solution they have is giving them equal and in many cases better results (1st party games) than Microsoft's most likely UBER EXPENSIVE investment in Xenos (which ended up causing ROD problems)? Cell, as compared to Xenos, was a good investment that will bring many future returns, and don't get me started on Blu-ray, as it is bound to bring some money to Sony's pocket.

Sony is also improving many of its libraries and tools, which I think will greatly help multi-platform titles be on par with the 360.

Also, if history serves me right, the PlayStation 1 and 2 were always behind their competition big-time in terms of their GPUs. If anything, the PS3 seems to put Sony in the best position they have ever been in (in terms of graphics).
 

The problem is you're making the assumption that guys like KP, ND & Insomniac couldn't have produced results equal to or even better than they did on the PS3 had they dedicated their talent to the Xbox 360 platform with the same resources & schedule..

I agree with Joker in that IMO it's likely that parity could have been achieved in this scenario ('better' is a separate discussion & much less easy to predict..) & maybe even with less time/money spent on R&D due to the lower degree of complexity in the general hardware use case..
 

Who cares if they can do it... these titles will never come out on 360.
 


I see many third-party developers saying the same thing. If you're a third party and your game has "only" US$15 million (compared to the US$20-60 million over three years spent on each first-party game), and you put 80% of your resources/codebase (SDKs etc.) into the PC and X360 versions and 20% into the PS3, the results could hardly be different.

(I understand the third-party point of view: if the PC and X360 have many more tools for reaching better results in less time, how much more money would you spend or invest? The old days of the PS2 standing alone for 20 months with a 20-million install base don't exist anymore.)
 
If you mean only on PS3, then I agree. If you mean on PS3 and 360, then I don't. I finished both Mass Effect and Uncharted and if you ask me, Mass Effect looks better.

Well, you're ALONE on this one.
 

I remember nAo mentioned that the traditional way of programming barely works on the 360 CPU, and certainly does not work on the PS3. For their first games, wary developers could be targeting lesser visuals to fit within the 360 CPU + GPU working envelope (without over-designing). But if they want breakthrough performance, they may need to rewrite a large part of their code to fully optimize for the 360. On the PS3, they had to start everything anew right away. So those devs may be able to achieve higher targets because of the extra design work from the get-go.

Now that devs have some real experience with Cell and RSX, do they still need to waste a lot of time on extra development? A large part of the difficulty is due to the radical departure from traditional thinking/practices, and to trying to retrofit legacy code to the PS3 architecture (UE3?).

I think it's too early to judge the performance of current gen consoles. If we wait for 1-2 more years, everyone should have a better idea of the performance level and expected resources/ease-of-development. Oh yes, the developer tools count too.
 
Not only those; you've also forgotten Dinosoft's OS kernel processes handled by the Windows OS, and the many sub-programs running.

The OS, API and background processes all take up system memory. And to some extent they sap CPU power.

That's why it takes a PC with 1GB of RAM to match a console with 256MB of RAM. Similarly, it takes quite a bit of CPU oomph to overcome the API overhead, although that's supposed to be largely rectified with DX10.

The point I'm making is that these things don't have much impact on the GPU, and so a 7800GTX in a PC is pretty much equivalent to a 7800GTX in a console as long as it's backed up by sufficient CPU power and system memory.

The advantage in consoles will come from balancing your game to that GPU's resources to fully take advantage of every aspect of it. I'm not sure that happens an awful lot though, especially not in cross-platform games.
 
CoD1 to CoD2 is a huge improvement in graphics, and to CoD4 an even bigger one.

I did say graphics and gameplay btw; although shaders were added to CoD2, it is still pretty much CoD1 + shaders, as all of the graphical geometry issues are still there.

Otherwise, standing next to a bursting grenade has the same effect across all three IW-developed games, throwing a grenade also has the same effect, physics behaviour has hardly changed, and vehicle interaction has hardly changed. I could go on, but in the end CoD is best enjoyed on a PC, because that is the targeted software environment, and by hardly changed I do mean that a new coat of paint being slapped on a 3d engine does not mean a new revolutionary or even evolutionary step.

How is Mass Effect a failure? I mean, it has received good reviews and has good graphics aside from framerate problems (which so many console games are plagued with). It seems obvious to me that for Halo 3 they sacrificed too much for top-notch lighting. I don't say it was worth it though.

I already explained why: I played KOTOR 1, 2 and JE heavily and repeatedly back in the day, enough to know that ME did not bring anything new graphically that was not already there. What it did bring was more glitches and framerate problems, two years after the console's launch, hence the failure.

Critical acclaim these days is so inundated with reviewers who forget how we got here in terms of graphics and gameplay; just because they notice a feature in a new game, they imply it wasn't there in the beginning. KOTOR 1, 2, JE and ME are comparable to the CoD series in basically just applying a new coat of paint every few years, despite hardware changes raising the spec and all the time in the world to execute. Then again, some do feel that if it's not broken, don't fix it.

I think Yakuza 3 can't beat Uncharted in technical terms.

There are many pre-rendered elements around the stages. It cannot compare with Uncharted anymore.

Yakuza 3 also has to render NPCs that you are not supposed to kill but that have a role as part of the world, as opposed to the nature of Uncharted. Basically, for the world that Uncharted is rendering it does an amazing job; however, for the world that Yakuza 3 is rendering, it does a truly amazing job.
 
...and by hardly changed I do mean that a new coat of paint being slapped on a 3d engine does not mean a new revolutionary or even evolutionary step.

Well, engines are improved and new rendering effects are introduced, as happens with all games. Why does it have to be revolutionary? What game has a revolutionary new engine doing things other engines don't?

I mean, you used some other game examples for another topic, but take Uncharted or Ratchet and Clank: neither one is doing anything revolutionary tech-wise that puts it above all other games, since the same is done by other engines and some do far more. So = fail? No, of course not.

Or take Ratchet and Clank 3 vs previous versions: same type of cosmetic upgrade as CoD1 to CoD4, etc.


So it is because of the framerate problems and glitches two years after launch. That still doesn't make the game a failure; it is not unplayable and it has its gameplay. When the PC version gets out, it will then be a success for you, since it won't have framerate problems and the glitches will be taken care of, yes?


You don't mean graphically, since what is not shown can be culled for more performance, and it would be a stupid move not to cull/LOD it out. I assume you mean the AI, that is, active "offline"/OOS AI. If so, that is very hard to use when talking about the total technical "shine" level of a game; it depends on updates per minute, complexity, etc. For example, Oblivion has OOS AI for most NPCs.

Well sure in that respect yes but that is only one part of the puzzle.
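The "offline"/OOS AI idea discussed above (keep simulating NPCs the player can't see, but at a lower update rate, since the budget depends on updates per minute) could be sketched roughly like this. All names, tick rates, and structure here are hypothetical illustrations, not taken from Oblivion or any real engine:

```python
# Sketch: off-screen NPCs still simulate, but on a reduced tick interval.
from dataclasses import dataclass

ONSCREEN_INTERVAL = 1    # visible NPCs update every frame
OFFSCREEN_INTERVAL = 30  # off-screen NPCs update once per 30 frames

@dataclass
class NPC:
    name: str
    visible: bool
    ticks_simulated: int = 0

    def update(self) -> None:
        self.ticks_simulated += 1  # stand-in for real AI work

def simulate(npcs, frames: int) -> None:
    for frame in range(frames):
        for npc in npcs:
            interval = ONSCREEN_INTERVAL if npc.visible else OFFSCREEN_INTERVAL
            if frame % interval == 0:
                npc.update()

npcs = [NPC("guard", visible=True), NPC("merchant", visible=False)]
simulate(npcs, 300)
print(npcs[0].ticks_simulated, npcs[1].ticks_simulated)  # 300 10
```

The off-screen NPC costs 1/30th of the AI budget of a visible one while still "living" in the world, which is why the per-minute update rate and per-update complexity, not the raw NPC count, determine how impressive such a system really is.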
 

It may be possible to say that Ratchet and Clank Future: ToD is nothing more than a new coat of paint; the problem arises when you take into account the technological leap from the PS2 and the GS's capabilities.


It fails because, two whole years after launch, the "easier and better dev tools" touted by so many game developers should have been revised enough to avoid ANY type of framerate problem, graphical glitch or geometry pop-in.

Especially considering that BioWare only had to worry about making the Xbox 360 version, not the PC one, as that is still a year away. But yes, that one had better not fail, or many PC gamers will be miffed.


Ahh yes, Oblivion. You could say something like that, though you are referring to AI behaviour while I am referring to rendering geometry becoming more complex. Still, the games (both Oblivion and Yakuza 3) are nowhere near an indication of the power of the home consoles; that is still years away. However, Yakuza 3 is aiming high.
 

It doesn't matter what hardware; it's still a new "coat of paint". CoD1 on a P3 and a TNT2 versus CoD4 on a Core 2 Duo with an 8800GTX? ;)


Time will tell but a budget PC is already running it better (lol)!

Ahh yes, Oblivion. You could say something like that, though you are referring to AI behaviour while I am referring to rendering geometry becoming more complex...

I see, but it doesn't matter if the NPC is an essential quest NPC or just a random "fill-in" actor in the game world. They will all be LOD'ed/culled out when not in sight. It saves performance, it saves memory, and all games do it.
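As a rough illustration of the cull/LOD-out point above: per object, per frame, a renderer typically either skips the draw entirely (culled) or picks a cheaper mesh as distance grows. The thresholds and LOD names below are made up for the sketch, not from any specific engine:

```python
# Sketch: per-object visibility cull plus distance-based LOD selection.
def select_lod(distance: float, in_frustum: bool):
    """Return the mesh LOD to draw, or None if the object is culled."""
    if not in_frustum:
        return None      # out of sight: never submitted to the GPU
    if distance < 25.0:
        return "lod0"    # full-detail mesh
    if distance < 100.0:
        return "lod1"    # reduced geometry
    if distance < 400.0:
        return "lod2"    # lowest detail
    return None          # beyond the far threshold: culled

print(select_lod(10.0, True))   # lod0
print(select_lod(50.0, True))   # lod1
print(select_lod(50.0, False))  # None (off-screen, culled)
```

Whether the NPC is a quest-critical character or a crowd filler makes no difference here; the renderer only sees a transform and a mesh, which is the point being made above.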
 
I guess we'll just have to see how Mass Effect runs on my 7800 GTX. :) Will that be some kind of twisted irony?

Really, I've been mostly disappointed with the consoles. 360 games always have some sort of issue, be it bilinear filtering, super noisy framerate-killing DVD-ROM swapping that reminds me of HDD thrashing, or texture pop-in. I haven't even personally used a PS3 because I know of no one willing to stomach the $600 for its few worthwhile games.

But I do have a hard time seeing how any GPU based on G70 could really be superior to what ATI built for the 360. I just sit and reminisce over the crappy texture filtering that my 6800 and 7800 put out because of their performance issues with filtering. I'm sure RSX has the same problem, because it was a fundamental part of the texture units. And then you have the obvious inefficiencies of a non-unified design. For example, my 8600GTS at home, a very disappointing chip according to web folks, easily rivals my 7800GTX and looks better doing it. Of course, after seeing bilinear mip-map filtering in Forza 2, and the touted 95%-free 4X AA go out the window, my opinion of ATI's C1 has become rather tarnished, to say the least.
 
Pfff, no poor filtering with a 7900GT at the highest driver setting! :p

Though the X1800XT's AF is better at certain angles, IIRC. Though how many console games use AF to begin with... :mad:
 
I haven't even personally used a PS3 because I know of no one willing to stomach the $600 for its few worthwhile games.

That would explain why. :)
The PS3 80GB is now US$499 (with goodies if you look hard enough), just like the launch-day Xbox 360. $399 if you're willing to forgo b/c.
 

The problem with that line of thinking is that I clearly remember CoD1 when it was released, and it would NEVER run as intended on a TNT2 card. You needed a minimum of a GeForce2 with 32MB of RAM, and even that was a joke once you turned all the features on; only a GeForce4 Ti with 128MB of RAM made sense, especially for the opening Russian campaign.

A Core 2 Duo is really nothing more than raw beast power for CoD4, as the game runs on my lowly single-core Athlon XP 3200 and even lower GeForce 6800GT at 1024x768 with high-quality settings that look better than the PS3/X360 versions. And what's funny is that as Nvidia releases newer drivers, performance improves even more.


I will fall off my chair if ME runs better on my lowly toaster of a PC, as CoD2, Oblivion, CoD4 and others have.


I don't think I even contested saving performance, but whatever; at least so far the dev has done an amazing job.
 

The PS3 80GB that includes a free copy of Motorstorm came down to $499.99 since around September or August of 2007.

The 40GB PS3 is $399.99

And as for the 6800 and 7800 running in a PC: I am the type of person who does annual reformats, especially since I have multiple drives and use my main drive for things that can be replaced (especially when a virus or malware strikes hard). The last time I reformatted, back when Nvidia released their Dec 2007 driver for Win XP, I found I gained a lot of much smoother performance, and I have a 6800GT with a single-core Athlon XP.
 

I still don't think Xenos is all it's made out to be. Sure, it looks great on paper, but does that design translate into superior performance in the real world? Based on R6xx's performance with an "evolved" design and more resources, I'm thinking not.

For example, look at the 2600XT. On paper it packs around 80% of the power of Xenos, as well as having a more advanced/efficient architecture. But if you look at how it performs, it doesn't come close to 80% of the speed of a 7900GTX.
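For what it's worth, the "around 80% on paper" figure checks out as a back-of-envelope programmable-shader calculation using the commonly cited numbers (Xenos: 48 unified vec4+scalar ALUs doing a MADD per lane at 500 MHz; HD 2600 XT: 120 stream processors doing a MADD at 800 MHz). Treat these as rough public specs for illustration, not authoritative figures:

```python
# Back-of-envelope peak shader throughput comparison.
def gflops(alus: int, flops_per_alu_per_clock: int, mhz: int) -> float:
    return alus * flops_per_alu_per_clock * mhz / 1000.0

# Xenos: 48 ALUs, each vec4 + scalar with a MADD (2 FLOPs per lane),
# so 10 FLOPs/clock per ALU, at 500 MHz.
xenos = gflops(48, 10, 500)     # 240.0 GFLOPS

# HD 2600 XT: 120 stream processors, MADD = 2 FLOPs/clock, at 800 MHz.
hd2600xt = gflops(120, 2, 800)  # 192.0 GFLOPS

print(xenos, hd2600xt, hd2600xt / xenos)  # 240.0 192.0 0.8
```

Of course, peak ALU throughput is exactly the "on paper" number being questioned here: it says nothing about texturing, ROP, or bandwidth limits, which is the poster's point about real-world performance.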
 

Yeah, it is worth questioning. The major catch in the comparison, though, is that Xenos is a lot different from R600; the shader core is fundamentally different. It would appear, though, from the use of bilinear filtering in games and the general lack of anisotropic filtering, that its texturing performance may leave a lot to be desired.
 