What will Sony add to their RSX GPU development?

Shinjisan said:
I totally agree that consoles are all about compromise.
But if they have to buy a 90nm GPU and they plan to have the product out in late spring 2006, they don't buy a GPU developed around a 110nm process.
That's not compromise, it's dumb.
A good compromise is to keep the frequency at 550MHz, which for a 90nm GPU is definitely low; PC counterparts will reach 650-700MHz.

The G70 has a die size of 334 mm^2 in a 110 nm process. A direct shrink will have a die size of ~240 mm^2, which is fairly large. The G70 runs nominally at 430 MHz and will likely run in the range of 530-580 MHz at nominal bin in 90 nm.
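For reference, an ideal optical shrink scales die area with the square of the linear feature size, which for these numbers works out to roughly 224 mm^2; direct shrinks never quite reach the ideal (pads, analog blocks, and wiring don't scale perfectly), hence the ~240 mm^2 estimate. A quick sketch of the math (illustrative only, the function name is mine):

```python
# Ideal-scaling estimate for a process shrink: die area scales with
# the square of the linear feature size. Real direct shrinks land
# somewhat above this ideal figure.
def ideal_shrink_area(area_mm2, old_node_nm, new_node_nm):
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

# G70: 334 mm^2 at 110 nm, shrunk to 90 nm
g70_shrunk = ideal_shrink_area(334, 110, 90)
print(f"{g70_shrunk:.0f} mm^2")  # ideal: ~224 mm^2, vs. ~240 mm^2 in practice
```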

People seem to forget that the intense competition at the high end of the PC graphics market has significantly pushed the boundaries of both die size and speed bins. You mention the 550 MHz G70 part, but that part has significant availability constraints and is running off a pushed VDD as well.

Sony needs a significant number of these chips running at a reasonable power limit. As such, a 110 nm part that is pushing the boundaries of die size makes a reasonably attractive 90 nm console graphics chip that can be produced at reasonable volume in a reasonable power envelope.

This is why I believe that a shrunk G70 (with the possibility of some *feature* enhancement) is the most likely path that Sony will follow. If not, god help Sony's bottom line, because that PS3 is going to cost a bloody mint.

In addition, I fully expect the high end desktop chips from both ATI and Nvidia to be higher performance than the graphics chips in either the PS3 or Xbox360 at the time of the PS3 release.

Aaron Spink
speaking for myself inc.
 
pc999 said:
Not an exact number like 10, but just longer than the PS3's. Anyway, if you don't like Sony's figures (as they will use the money for other things too, and we don't know what amount is directly related to the PS3), take MS, who spent 4B in R&D (from memory, sorry, no link) for the XB360; plus they lost ?M selling their first XB360s, and we don't know until when they will sell below cost (many of them with problems that will cost even more), plus ads/PR. They have lost a lot of money and will only do this again if they need to, and the rest of the industry doesn't much like it either (15Mx? in game costs, new tools, new programs, etc...).

There is no way in hell MS spent 4B in R&D; at most it is around 1B, and even that is very high. A design team for a from-scratch, high-end, enterprise-level processor only costs on the order of 90-100 million a year, and that is for a ~400-person team that is making a significant amount of money per person. Reasonably, I would put the X360 R&D in the range of ~500-600 million.

Aaron Spink
speaking for myself inc.
 
leechan25 said:
In regard to Microsoft trying to push this gen harder: they're going to fail for reasons that have nothing to do with hardware. This 10-year lifecycle has more to do with industry development than hardware. So what if you've got a console that really does 2 TFLOPS out in three years; if developers are still learning the Cell architecture, who's going to push their console? Remember the Jaguar? It came out as a 64-bit whatever-ma-do with low support and weird hardware nobody wanted to work on. 3DO did the same thing, only to bomb right out the gate!

The industry will develop and expand and create new development tools, etc.

Jaguar was a console pushed by a failing company, with a list of bugs in production hardware so long it could fill a bookshelf.

3DO had a bad business model based on the hardware manufacturers actually making a profit on the boxes and paying 3DO for the favor.

The next-gen consoles will likely come out in another ~5 years, initially on either 35 nm or 28 nm processes. That's about the time they will be able to deliver a ~10x performance improvement.

Aaron Spink
speaking for myself inc.
 
xbdestroya said:
Yeah but Allard's just putting up a smokescreen on that. Going by their Home Entertainment division quarterly performance over the life of the console, and knowing that in fact some of XBox's losses are buffered by other profitable operations within that division, a rough estimate of losses can be compiled by anyone willing to do the research. There were some threads back in the day that took a look at exactly that, and the losses were definitely substantial (~$6 billion or so from what I recall).

Um, you assume that the Live back end is actually run by the Home Entertainment division (AFAIK, it's run by the MSN group). Live sits over in the Mountain View MSN campus, and I doubt that they spent more on Live than they have on any of the other properties at that campus. Hotmail alone probably has higher CapEx than Live.

Aaron Spink
speaking for myself inc.
 
aaronspink said:
Um, you assume that the Live back end is actually run by the Home Entertainment division (AFAIK, it's run by the MSN group). Live sits over in the Mountain View MSN campus, and I doubt that they spent more on Live than they have on any of the other properties at that campus. Hotmail alone probably has higher CapEx than Live.

Aaron Spink
speaking for myself inc.


Wait a minute, what do you mean to imply by 'Hotmail alone'?

I'd be *shocked* if Hotmail didn't have a higher CapEx than Live, so please don't think I'm comparing the likely expenditure between the two - or comparable services to Hotmail - at all; I was viewing Live in isolation with respect to Xbox.

But ok then, if Live is run by MSN, that explains where some of these XBox expenditures/losses might be further concealed/obscured outside of H&E.
 
xbdestroya said:
Wait a minute, what do you mean, Hotmail alone? I'd be *shocked* if Hotmail didn't have a higher CapEx than Live.

Yeah, that's kinda the point. We can do the math:

say 32 people per server (which is probably very low)
1-2 million people online at a time
so say 70K servers at ~2K a pop
125 million CapEx, which is probably 1/4th that or lower in reality
another say 10 million a year for network bandwidth
another 10 million a year in power, cooling, etc.
another say 100 million for network hardware

so around 350 million in non-human-related costs over the lifetime of Live, as a realistic upper bound.
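Spelling that total out (the ~5-year span for the recurring costs is an assumption on my part; the post only gives per-year figures, and I've used the 70K x ~2K server figures directly):

```python
# Back-of-envelope for Live's non-staff costs, using the post's numbers.
# Assumption (mine): the recurring costs run over ~5 years of Live's life.
servers = 70_000                 # "say 70K servers"
server_cost = 2_000              # "~2K a pop"
network_hw = 100_000_000         # one-time network hardware
bandwidth_per_year = 10_000_000  # network bandwidth
power_per_year = 10_000_000      # power, cooling, etc.
years = 5                        # assumed service lifetime

total = (servers * server_cost + network_hw
         + (bandwidth_per_year + power_per_year) * years)
print(f"${total / 1e6:.0f}M")  # ~$340M, close to the ~$350M upper bound
```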

But ok then, if Live is run by MSN, that explains where some of these XBox expenditures/losses might be further concealed/obscured outside of H&E.

I don't know for sure that Live is run by MSN, but I'd likely bet on it, since they run pretty much all internet-related stuff for MS and it wouldn't make much sense to have a parallel group for operations, etc.

Aaron Spink
speaking for myself inc.
 
Why would people think that the G70 is an inferior GPU? Feature-wise, the only thing it does not have is AA with FP16 buffers, but with a 128-bit bus, using AA with FP16 buffers would be total overkill from the beginning. The other problem might be the performance of dynamic branching, but I am sure that developers will find a way to get around that one as well. Other than those two, it is a very capable GPU. Add that it will be used in a closed system, unlike PCs, and developers will find more ways to harness its power.

In the end, I think both Xenon and RSX will be in the same ballpark in terms of features and performance (though I might still give Xenon an edge). That's why MS promotes Live while Sony brings the Cell to the front, instead of talking about how their graphics capability beats the other's.
 
Well, since the MSN group is profitable... I guess you can consider Xbox to have lost even less :LOL: Maybe it is adding to MSN profits.
 
aaronspink said:
say 32 people per server (which is probably very low)
1-2 million people online at a time
so say 70K servers at ~2K a pop
125 million CapEx, which is probably 1/4th that or lower in reality
another say 10 million a year for network bandwidth
another 10 million a year in power, cooling, etc.
another say 100 million for network hardware

so around 350 million in non-human-related costs over the lifetime of Live, as a realistic upper bound.

Sure, I can subscribe to those estimates; they seem reasonable enough.

@Bill: No, it's the other way around. A profitable MSN would obscure further losses on the part of Live, *if* indeed they are the division in charge of running and maintaining Live. Live may already have reached the critical mass it requires to be profitable going forward, but it will certainly be some time yet before it becomes net profitable over its life.
 
silhouette said:
Why would people think that the G70 is an inferior GPU? Feature-wise, the only thing it does not have is AA with FP16 buffers, but with a 128-bit bus, using AA with FP16 buffers would be total overkill from the beginning. The other problem might be the performance of dynamic branching, but I am sure that developers will find a way to get around that one as well. Other than those two, it is a very capable GPU. Add that it will be used in a closed system, unlike PCs, and developers will find more ways to harness its power.

In the end, I think both Xenon and RSX will be in the same ballpark in terms of features and performance (though I might still give Xenon an edge). That's why MS promotes Live while Sony brings the Cell to the front, instead of talking about how their graphics capability beats the other's.


I think odds are it's superior. I'm hoping not, though.

And I don't think Cell matters much. It comes down to the GPUs.

Just like on PCs, CPUs don't matter much anymore. My old A64 3000+ has no need to be upgraded. Few games are CPU-bound. The best pairing is an "average" CPU and a great GPU.

I expect that to change, unfortunately, as they figure out how to use dual core... oh well, it's been a nice respite.
 
Xbox 360 probably cost between $500M and $900M in total R&D

PlayStation 3 probably a couple billion

Dreamcast was $500M-$600M

PS3 will be around until the middle of the next decade, 10 years or so, but PS4 will arrive sometime after 2010 and before 2015, in the early part of the next decade, with at least 20x the power of PS3, probably pushing some Sony Holographic Video Disc standard, a SuperCell processor with dozens of cores/SPEs, and a custom joint Nvidia-Sony GPU that can produce visuals that mimic raytracing and all the impossible-to-do rendering techniques that send graphics whores into uncontrolled fits of drooling, and some new controller that is an evolution of the Revolution controller :)


To keep this on topic, btw: I am hoping against hope that Nvidia have switched gears and that RSX is based on NV5x / G80, rather than NV47 / G70.
 
Megadrive1988 said:
I am hoping against hope that Nvidia have switched gears and that RSX is based on NV5x / G80, rather than NV47 / G70.
Well, it's not. Might have one or two NV5X qualities though.
 
Next step=PPU

Bill said:
I think odds are it's superior. I'm hoping not, though.

And I don't think Cell matters much. It comes down to the GPUs.

Just like on PCs, CPUs don't matter much anymore. My old A64 3000+ has no need to be upgraded. Few games are CPU-bound. The best pairing is an "average" CPU and a great GPU.

I expect that to change, unfortunately, as they figure out how to use dual core... oh well, it's been a nice respite.

Games are designed for existing hardware, no? So maybe because the CPU performance of PCs is not so good for physics, PC games do not seem CPU-bound on fast processors like the A64 3000+.

But now consoles have good floating-point capability and will have much better physics than PCs, and instead of upgrading your A64 3000+, you can get a PPU. But between Xbox 360 and Cell the difference in capability is quite large, so this will make a difference if the software is good.

So Cell makes no difference for games of today, but Cell matters for the games of the future.
 
Closed box.

silhouette said:
Why would ppl think that G70 is an inferior GPU? Featurewise, the only thing it does not have is AA with FP16 buffers. But with a 128 bit bus, using AA with Fp16 buffers should be totally overkill from the beginning. The other problem might be the performance of dynamic branching, but I am sure that developers will find a way to get around that one as well. Other than those two, it is a very capable GPU. Add that it will be used in a closed system not like PCs, developers will find more ways to harness its power.

At the end, I think both Xenon and RSX will be in the same ballpark in terms of features and performance (though I might still give Xenon an edge).. That's why MS promotes live while Sony brings the Cell to the front instead of talking how their graphics capability beats the other.

You are correct, my friend; a closed box makes a big difference to the performance of a GPU. Look at VF5 on Sega's Lindbergh hardware, using a 256MB Nvidia card.

http://images.17173.com/news/20050901/0909sega02.jpg
http://www.geocities.com/arcade_store/after_burner_3.jpg
http://www.geocities.com/arcade_store/unbenannt.jpg
 
ihamoitc2005 said:
You are correct, my friend; a closed box makes a big difference to the performance of a GPU. Look at VF5 on Sega's Lindbergh hardware, using a 256MB Nvidia card.

http://images.17173.com/news/20050901/0909sega02.jpg
http://www.geocities.com/arcade_store/after_burner_3.jpg
http://www.geocities.com/arcade_store/unbenannt.jpg

Sure, a closed box makes a big difference, and Sony doesn't need the most powerful GPU to produce beautiful games, but knowing Sony, I forecast that the RSX will be the best GPU on the market when the PS3 is released... a GPU sharing a lot of features with the next G80...
 
Bliss said:
Sure, a closed box makes a big difference, and Sony doesn't need the most powerful GPU to produce beautiful games, but knowing Sony, I forecast that the RSX will be the best GPU on the market when the PS3 is released... a GPU sharing a lot of features with the next G80...

And I predict that if it's the best GPU on the market when the PS3 is released, you won't be able to buy a PS3, because they won't be able to produce enough parts to satisfy 1/10th of the Xbox 360 launch volume. You've got to understand that ATI and Nvidia have really gone to the edge, and in some respects over it, with their high-end PC GPUs. They are monsters: they use significant power (and will for the next release too, since the high-end buyers don't seem to mind), significant silicon real estate (300+ mm^2), push the process voltages to the max, and are willing to sell parts that are at the very upper end of the yield curve. If Sony wants that, they:

A) need a bigger box design and power supply to cool and power it
B) need an insanely high price so they don't lose their shirt
C) need at least 1 more fab for volume
D) need their heads checked :)

Aaron Spink
speaking for myself inc.
 
aaronspink said:
And I predict that if it's the best GPU on the market when the PS3 is released, you won't be able to buy a PS3, because they won't be able to produce enough parts to satisfy 1/10th of the Xbox 360 launch volume. You've got to understand that ATI and Nvidia have really gone to the edge, and in some respects over it, with their high-end PC GPUs. They are monsters: they use significant power (and will for the next release too, since the high-end buyers don't seem to mind), significant silicon real estate (300+ mm^2), push the process voltages to the max, and are willing to sell parts that are at the very upper end of the yield curve. If Sony wants that, they:

A) need a bigger box design and power supply to cool and power it
B) need an insanely high price so they don't lose their shirt
C) need at least 1 more fab for volume
D) need their heads checked :)

Aaron Spink
speaking for myself inc.

Let's wait and see... we may place a bet :)
 
Bliss said:
Sure, a closed box makes a big difference, and Sony doesn't need the most powerful GPU to produce beautiful games, but knowing Sony, I forecast that the RSX will be the best GPU on the market when the PS3 is released... a GPU sharing a lot of features with the next G80...

According to Nvidia, when the PS3 is released the RSX will already be surpassed by their PC line of GPUs, which is kind of reasonable. And having lots of features doesn't really mean much a lot of the time: most of the time, new GPUs have features that are rarely used in games, since they don't have the power to run them in real life, and a refresh or even a new gen is needed to make those features realistic...
 
Platon said:
According to Nvidia, when the PS3 is released the RSX will already be surpassed by their PC line of GPUs, which is kind of reasonable. And having lots of features doesn't really mean much a lot of the time: most of the time, new GPUs have features that are rarely used in games, since they don't have the power to run them in real life, and a refresh or even a new gen is needed to make those features realistic...

Yes, Nvidia said so 6 or more months ago; give me a break if I'm wrong... maybe something has changed... and... wasn't the NV2A of the Xbox the best GPU when the Xbox was released? It shared a lot with the GF4, which would be released 2 months later...
 
Bliss said:
wasn't the NV2A of the Xbox the best GPU when the Xbox was released? It shared a lot with the GF4, which would be released 2 months later...

Feature-wise it might have been advanced (second vertex shader + other specific instructions built in), but definitely not in terms of speed. The NV2A was/is quite bandwidth-limited.
 