Predict: The Next Generation Console Tech

Nvidia totally screwed Sony. They took a part they had long since shelved because of both defects and performance issues and proceeded to sell it to Sony. RSX was never meant to be sold to anyone - it was a long-shelved piece of hardware - but they realized they could make a pile of money this way and protect their PC business at the same time, since the video hardware they sold Sony was obsolete from day one. Sony was desperate at that point, both out of time and out of money after trying to go with their own solution, so Nvidia took advantage of the situation and profited big time.

Well, who was it that got them into such a situation? Surely that's where the blame lies, not with Nvidia. While I'm never going to argue that RSX is anywhere near as good as it should have been, was the alternative any better? I fear not. If Sony hadn't had Nvidia to fall back on, where would they have been? The question still stands: who else could have produced a somewhat competitive solution in the same time frame? If RSX was bad, then the alternatives were surely a spectacular disaster waiting to happen.
 
The R520 isn't any better than RSX, and R600 came out in 2007, so Sony would either have had to delay the launch or rush to market with its own RROD by getting a tweener design between R520 and R600.
 

R580 seems to have survived a lot better than any G7x, though.
 
It's not like the PS3 was ever going to be running Windows or Linux with an OpenGL driver and using that for games though.

PC benchmarks show us that even the single-slot, mainstream X1950 Pro outperforms my big old 7900 GTX in modern "shader heavy" games - and it was out before the PS3. Presumably this wasn't an option though (not enough time? ATI overstretched?).
 
In OpenGL?

Not so sure about that, since there ain't really that many new OGL games anymore, but how is that relevant?
I've gotten the impression from here that devs favor libGCM rather than PSGL or OpenGL ES or whatever on the PS3?
 
Considering how late R600 was, I really don't think ATI was ever a realistic option. They didn't seem to have enough resources at the time to develop one new console GPU as well as a new generation of PC hardware, never mind two. If ATI weren't an option, then I stand by my statement: no matter how bad RSX was, the alternative would have been significantly worse.
 
You can see why MS turned it down for the 360, quite apart from them hating Nvidia after the first Xbox. They would have to have been insane to choose it over what they got. I spent a lot of money (by my standards!) on a 7900 GTX in 2006, and by 2007 I largely felt I'd wasted my money.
The GPU was made to run the games that were out at the time really fast; it wasn't meant to be forward-looking hardware, and why should it be? Nvidia already knew what their forward-looking architecture was going to be, and it wasn't that far away from being released.
 

I wonder if G80 would have been possible. Extra costs on top of already huge costs, but it would have helped the PS3 secure a clear technical edge over the 360.
 
Yeah, it would have likely had geometry shaders - you know, something devs would actually use. It would have likely been gimped to keep temperature down and deal with yields (disabled ALUs, like Xenos), but many people think the GPU would still have been too big for a console. I would say that if the PS3 had a G80 derivative along with the Cell, I would probably be seated firmly in the Sony camp. eDRAM or not, it would surely be hard to compete with something like that.
 

It also would have blown the already quite high heat output into new dimensions.
 
Simple question from a non-tech guy coming :oops:

Even with the shortcomings of the RSX, how much help would come from just increasing the RAM, if any, on the PS3?
 

From what I've understood, the PS3's RAM shortcoming isn't so much the amount as the fact that there are two separate 256MB pools; the XB360, with one shared 512MB pool, does just fine.
 
More RAM would help almost any console that has hit or will hit the market. I think adding another 128 or 256MB would give the PS3 a clear advantage visually, especially if it were added to the RSX side of the non-unified memory. Most of the RAM-related problems devs have had with it in comparison with other consoles would be gone. On the flip side, adding that same amount to the competitor would just lead to a situation similar to what we have now.
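The split-pool problem described above can be sketched in a few lines. This is an illustrative toy, not real console allocation code: the only numbers taken from the thread are the headline 256MB + 256MB vs 512MB figures, and it ignores that RSX can in practice read from the XDR pool (at a bandwidth penalty). The point is simply that a single large allocation can fail on a split layout even though the same total memory is free.

```python
# Toy sketch: why two 256MB pools can refuse an allocation
# that one unified 512MB pool of the same total size accepts.

def try_allocate(pools, request_mb):
    """Take request_mb from the first pool with enough free space.
    Returns the pool index used, or None if no single pool can hold it."""
    for i, free in enumerate(pools):
        if free >= request_mb:
            pools[i] -= request_mb
            return i
    return None

split = [256, 256]    # PS3-style: 256MB XDR (CPU) + 256MB GDDR3 (RSX)
unified = [512]       # 360-style: one shared 512MB pool

# A hypothetical 300MB streaming buffer fits in the unified pool,
# but in neither half of the split layout.
assert try_allocate(unified[:], 300) is not None
assert try_allocate(split[:], 300) is None
```

Under this simplification, adding memory to the RSX side (as the post suggests) helps precisely because it raises the ceiling on the pool where the big graphics allocations have to live.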
 
What was the date of Sony's contracting Nvidia for RSX?

G80 or a derivative of it seems iffy from a timing standpoint. It would have needed to have been ready or nearly so when Sony released details in mid-2005, and it would have needed to be in final production months before the initial mid-2006 launch date.

Even if G80 were used, it would still have had problems with backwards compatibility, for those that consider that a problem.
 
The biggest problem IMO would have been mass-producing enough of them for a console launch.
The G80 launched in the same month as the PS3. A gimped version with only 64 stream processors working probably could have been ready before that, so maybe Sony really did intend to launch in spring that year. Most devs didn't have a Xenos until a few months before the system launched, and from what I remember those weren't running at full speed. Sony could have done something similar and put all arguments to rest.
 

Yes, but when GPUs launch, they don't launch in the millions. I doubt Nvidia would have been able to make millions of functioning G80s by the PS3's launch date.
 

I don't think anyone intends to make a GPU that ages as badly as G71 has, even if they have a new line of cards coming. Maybe it wouldn't look so bad if you didn't compare it to the X19xx series, or the (older) Xenos in the 360.

Perhaps GPU designers should make their GPUs "forward looking" - it's paid off for people who bought ATI cards (like my housemate, who bought an X1900 XT when I bought my 7900 GTX and still uses it).

If at all possible, though, I think a console vendor should definitely try to select hardware that's "forward looking". That's why I think Sony had no realistic alternative other than to go with RSX - if they'd had one, they'd presumably have gone with it.
 

I guess to be more specific: could Sony just add, say, 2 gigs of shared (or 1 gig split) RAM and call it a day? Since I don't really understand the ins and outs of the whole picture, I'm asking: if the next Xbox is fully upgraded, what other shortcomings could there be if Sony only upped the RAM?
 
I find this interesting, because when Sony first showed off what were basically G70 slides at E3 2005 - some of them containing benchmarks of essentially "old games" - to demonstrate what RSX would be capable of, everyone seemed to be blown away by it. Now people are like, "Is this the best Sony could have done?"
 