RSX pipeline speculation?

RSX pipeline configuration?

  • Option A: 8 VS pipes, 2-issue; 24 PS pipes, 5-issue
    Votes: 39 (59.1%)
  • Option B: 12 VS pipes, 3-issue; 20 PS pipes, 5-issue
    Votes: 5 (7.6%)
  • Option C: 4 VS pipes, 4-issue; 24 PS pipes, 5-issue
    Votes: 10 (15.2%)
  • Option D: 20 VS pipes, 2-issue; 16 PS pipes, 6-issue
    Votes: 3 (4.5%)
  • Option E: Other! Please specify...
    Votes: 9 (13.6%)

  Total voters: 66
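A back-of-the-envelope way to compare the polled options is to multiply pipes by issue width to get vec4 ALU capacity per clock. The sketch below does exactly that, assuming (my assumption, nothing confirmed) that each "issue" is one vec4 op per pipe per clock, at the rumoured 550 MHz RSX clock:

```c
/* Rough ALU-capacity comparison of the polled RSX configurations.
 * Assumption (mine): each "issue" = one vec4 ALU op per pipe per
 * clock; clock speed is the rumoured 550 MHz. Illustrative only. */
#include <stdio.h>

int main(void) {
    const double clock_ghz = 0.55; /* rumoured RSX clock */
    const struct {
        const char *name;
        int vs_pipes, vs_issue, ps_pipes, ps_issue;
    } opts[] = {
        { "A",  8, 2, 24, 5 },
        { "B", 12, 3, 20, 5 },
        { "C",  4, 4, 24, 5 },
        { "D", 20, 2, 16, 6 },
    };
    for (int i = 0; i < 4; i++) {
        int vs_units = opts[i].vs_pipes * opts[i].vs_issue;
        int ps_units = opts[i].ps_pipes * opts[i].ps_issue;
        printf("Option %s: %3d VS + %3d PS vec4 units = %d total "
               "(%.1f G vec4 ops/s)\n",
               opts[i].name, vs_units, ps_units, vs_units + ps_units,
               (vs_units + ps_units) * clock_ghz);
    }
    return 0;
}
```

Interestingly, all four options work out to the same 136 vec4 units (74.8 G vec4 ops/s under these assumptions), so the poll is really asking how a fixed ALU budget should be split between vertex and pixel work.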
expletive said:
Could the RSX have zero VS pipes and use Cell for all of those functions?

J
I hope not... Please nooooo!

I want to see what devs can do with the SPEs. Forcing devs to dedicate 2, 3, or 4 SPEs to vertex shading would be... dumb! GPU VS units are well designed for the task. I am not sure how many more PS units you could fit on RSX, considering PS units are more complex than VS units, but this type of tradeoff is a poor one IMO.

Let devs devote the SPEs to physics, AI, particles, animation, terrain deformation, procedural synthesis of geometry and textures, lighting and shadowing, and other REALLY cool stuff that advances gameplay.

Using CELL to do tasks the GPU is quite capable of doing is a waste of silicon IMO. Obviously aiding graphics is a good thing; but offloading work that could easily be done on the GPU? Nuts I tell you!
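To make concrete what "dedicating SPEs to vertex shading" would mean, here is a minimal sketch (my illustration, not from the thread) of the core per-vertex work such an SPE job would spend its cycles on: a 4x4 model-view-projection transform over a batch of vertices. A real SPE version would use the 128-bit SIMD registers and DMA double-buffering between main memory and local store; plain scalar C is shown only to make the workload visible:

```c
/* Hypothetical sketch of the inner loop of SPE-based vertex shading:
 * transform each vertex by a 4x4 MVP matrix (column-major here).
 * A real implementation would be SIMD-vectorised and double-buffered. */
typedef struct { float x, y, z, w; } vec4;

static vec4 transform(const float m[16], vec4 v) {
    vec4 r;
    r.x = m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w;
    r.y = m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w;
    r.z = m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w;
    r.w = m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w;
    return r;
}

/* Shade a batch of n vertices in place. */
void shade_vertices(const float mvp[16], vec4 *verts, int n) {
    for (int i = 0; i < n; i++)
        verts[i] = transform(mvp, verts[i]);
}
```

And that is only the position transform; add skinning, lighting, and attribute setup and the per-vertex cost grows quickly, which is the point being made above.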
 
I'm sure everyone here would rather have better physics/AI simulation in gameplay than better pixel shading. Turning Cell into a semi-GPU would be a poor idea, and one I'm certain Sony aren't taking.
 
Acert93 said:
I hope not... Please nooooo!

I want to see what devs can do with the SPEs. Forcing devs to dedicate 2, 3, or 4 SPEs to vertex shading would be... dumb! GPU VS units are well designed for the task. I am not sure how many more PS units you could fit on RSX, considering PS units are more complex than VS units, but this type of tradeoff is a poor one IMO.

Let devs devote the SPEs to physics, AI, particles, animation, terrain deformation, procedural synthesis of geometry and textures, lighting and shadowing, and other REALLY cool stuff that advances gameplay.

Using CELL to do tasks the GPU is quite capable of doing is a waste of silicon IMO. Obviously aiding graphics is a good thing; but offloading work that could easily be done on the GPU? Nuts I tell you!

Well, you read my mind. I'm just trying to piece together what could be taking so long and wondering if this was a possibility. I was thinking more along the lines of no PS increase, so they could possibly reduce costs on the chip itself, since they've been coming under a lot of fire recently about the possible cost of the console.

J
 
_phil_ said:
The NV2A had an extra vertex shader; at that time, that's not really a small tweak.

The GeForce 4 also had an extra vertex shader and launched only 3 months after the NV2A. The GeForce 4 Ti 4200 had the exact same clock speed and bandwidth as the NV2A.

Coincidence?
 
Shifty Geezer said:
Because Sony have contracted them to make a GPU part and are paying them to do this. They'll put in whatever Sony asks, and if Sony want a trichromatic-inverse-tachyon-pulse rendering engine, nVidia will put one in.

:LOL:

No.

Before Nvidia signed a contract with Sony, they sat down and figured out what they could provide. Sony then looked at it and told Nvidia what they wanted within those limitations. Nvidia listened, and negotiated a price.

So, the price Sony is paying reflects the amount of work that Nvidia put into the part. The more expensive the part, the more work went into it.

And call me crazy, but I don't think that Nvidia is exactly charging Sony an arm and a leg.
 
Is it possible that the G70-like RSX has "some kind of SIMD SPE" in its vertex shaders?

(Or more: is it possible that the G70-like RSX inherits anything from Cell beyond FlexIO?)
 
If you do a search, aaronspink had a fairly detailed post on the switch from TSMC's 110nm process to Sony's 90nm process. It's not just a "flip the switch" type of transition. If Xenos, which had working silicon last November, is any indication, you need time. You do NOT want to run into an X1800/R520 issue. You have to ramp up MONTHS beforehand and have millions of chips ready to go by launch date. So if they want to ship in March of 2006, they had better start chip production... NEXT MONTH.

Also, take the PR statements with a grain of salt. KK said it was not related to a "PC GPU architecture" at all, while the original press release noted it was an adaptation of the "next generation NV GPU (=G70)". Similarly, there was a good mass of confusion between the two when the deal was announced... some indicated the G70 had been in development for the last couple of years; later, some Sony reps claimed it was a co-development (which, IMO, is very unlikely given that they are public companies). If RSX had been in development as long as some indicate, there is no reason it could not be a COMPLETE clean slate like Xenos, with eDRAM and other features. But as it is, NV dedicated only 50 people (plus however many Sony did) to the project. If it were co-developed you would expect teams within teams, not that kind of partitioning of engineers. And a clean-slate design takes 300-400 people.

It seems more likely, to me at least, that RSX is what the original press announcements stated: an adaptation of the new product line. A LARGE speed bump, a transition to a smaller process, a new interface, probably some new tricks. But G70 is a big chip. The goal is to get a LOT of these chips into systems--so good yields and controllable heat are important.

It is amazing that Sony is getting an adaptation of NV's flagship chip. Plus Cg, great OpenGL knowledge, and strong ties to IMPORTANT dev houses (e.g. Epic, id).

There is always a point of diminishing returns on cutting edge technology. Obviously if NV could have done more with G70 they would have. Obviously they will do as much with RSX as they can within the confines Sony gives them but I am not expecting anything radically different based on the press clippings. Not that that is a bad thing. G70 is quite a nice chip in many ways.
 
Powderkeg said:
And call me crazy, but I don't think that Nvidia is exactly charging Sony an arm and a leg.
Not that they had a choice.

The absolute worst thing that could have happened to NV was Sony going with an ATI product (like R520). They would have been absolutely shut out this gen in regards to hardware. That would have allowed ATI proprietary formats, features, and quirks to totally dominate the marketplace.

MS and ATI are already close, if ATI had gotten Sony I think NV would have been impacted significantly in the long run.

And thus the deal with Sony is long term--as in future products as well. The battle lines, MS/ATI and Sony/NV have been drawn. MS has not made any confirmation of sticking with ATI, but ATI wants it, DX10 shows it, and most of all NV siding with Sony is a pretty good reason to do so.
 
Acert93 said:
Not that they had a choice.

The absolute worst thing that could have happened to NV was Sony going with an ATI product (like R520). They would have been absolutely shut out this gen in regards to hardware. That would have allowed ATI proprietary formats, features, and quirks to totally dominate the marketplace.

MS and ATI are already close, if ATI had gotten Sony I think NV would have been impacted significantly in the long run.

And thus the deal with Sony is long term--as in future products as well. The battle lines, MS/ATI and Sony/NV have been drawn. MS has not made any confirmation of sticking with ATI, but ATI wants it, DX10 shows it, and most of all NV siding with Sony is a pretty good reason to do so.


Ahh, but from Nvidia's point of view, they have to make a profit on RSX regardless of what the future may hold. They aren't wallowing in surplus cash; they still need to make money. It would be better for them not to compete in the console arena at all than to lose millions on a proprietary chip for Sony.

They aren't gouging Sony like they did MS, but they aren't taking a loss either. Nvidia is charging a fair price for the work they did. The price is cheap; I suspect the amount of extra work it required from Nvidia was as well.
 
Wait: last time, Nvidia had to dedicate 300-400 people because they had to design most of the Xbox by themselves. This time it's different. They are only designing the GPU itself, with Sony engineers involved.
 
There's no way NVidia won't make a profit off RSX, though. The indication seems to be that Sony is co-shouldering the development costs, and remember that Sony is handling all the fabbing itself. Even if this deal is potentially less lucrative than the Xbox was for them, the nominal license-fee model should do well for NVidia this gen.
 
Powderkeg said:
Why should Nvidia put in any more effort than they have to? They won't make any more money if they do, so what's the incentive to put in more work and expense into RSX development than is absolutely necessary?
So that they can continue to work with Sony? My view: if I hired someone (Nvidia) to do the job of making the best thing on the market, and my competitor asked that person's rival (ATI) to do the same and the rival did it better, then next time the person I hired would not be my first choice. If Sony is not happy with what Nvidia provides, they could easily go to ATI for the PS4, which would leave Nvidia out of the gaming market (assuming MS and Nintendo go with ATI as well). Not to mention all the other areas where Nvidia could work with Sony (their electronics). But my thinking could be wrong.
 
Powderkeg said:
:LOL:

No.

Before Nvidia signed a contract with Sony, they sat down and figured out what they could provide. Sony then looked at it and told Nvidia what they wanted within those limitations. Nvidia listened, and negotiated a price.

So, the price Sony is paying reflects the amount of work that Nvidia put into the part. The more expensive the part, the more work went into it.

And call me crazy, but I don't think that Nvidia is exactly charging Sony an arm and a leg.

KK is not the kind of guy who just takes whatever is available on the table. If that were so, Sony might well have gone with ATi already. You can't rule out the possibility that Sony talked to various parties before going with Nvidia. It's not just pricing that made them choose Nvidia. I believe the deal gave them the best product that fits their goal of producing the most powerful console this coming gen.
 
EpicZero said:
So that they can continue to work with Sony? My view: if I hired someone (Nvidia) to do the job of making the best thing on the market, and my competitor asked that person's rival (ATI) to do the same and the rival did it better, then next time the person I hired would not be my first choice. If Sony is not happy with what Nvidia provides, they could easily go to ATI for the PS4, which would leave Nvidia out of the gaming market (assuming MS and Nintendo go with ATI as well). Not to mention all the other areas where Nvidia could work with Sony (their electronics). But my thinking could be wrong.


If they don't make money, they won't be a solvent business by the time the PS4 arrives.

It's a business. Like I said, it's better for Nvidia not to even compete in the market than it is to take a loss in it.
 
hugo said:
KK is not the kind of guy who just takes whatever is available on the table. If that were so, Sony might well have gone with ATi already. You can't rule out the possibility that Sony talked to various parties before going with Nvidia. It's not just pricing that made them choose Nvidia. I believe the deal gave them the best product that fits their goal of producing the most powerful console this coming gen.

Did they go to ATI? Did ATI turn them down perhaps? I seem to recall ATI having a limited number of development teams, and none of them were sitting around idle that I can recall.

Sony had 2 choices. Make their own GPU or go to Nvidia. They tried the first, and Nvidia was the only realistic option they had left. With one ATI team on the 360, another on the Revolution, and a third on the R520, they didn't have anyone who could have worked on a Sony chip, and there realistically is no one else to go to.
 
Just think of how many transistors have been stripped from G70 because their features are useless for PS3 (PureVideo, encoding, decoding, etc.), then think of what the LOGICAL way for Nvidia to use them would be, and there's your answer.
 
Acert93 said:
If RSX was in development as long as some indicate there is no reason RSX could not be a COMPLETE clean slate like Xenos and have eDRAM and other features.
Here's a thing to wonder about. If RSX were a clean-sheet design, what extras/differences would it have over G70, and why? E.g. eDRAM. I think I've heard it cited a couple of times that Sony would have preferred eDRAM, but what if, as KK says, they skipped it because you can't fit an entire buffer in without stupid amounts of it (see the rough numbers sketched after this post)? Not without tile rendering, and nVidia aren't keen on that. Lack of eDRAM might not be indicative of a lack of development time, but an architectural choice.

The main functions of a GPU are vertex and pixel 'shading', and G70 does that better than any other GPU at the moment. And with excellent OpenGL support, which is something Sony wants. Even if Sony were to ask for a bespoke solution, why would it not be a G70-like solution? Why do people expect it to be some eclectic blend of features with its own peculiar quirks?

If there's no clear answer to that (and I don't know, hence I ask: what is so inherently bad about using a PC-type GPU that a bespoke console solution would be better?), is it not just as feasible that Sony approached nVidia and said 'we want something just like your top-end PC part with a bit extra' two-plus years ago, as it is feasible that Sony thought a year ago 'we really want an eDRAM GPU with imaginary-number support and odd-bit colour depths, but we've run out of time so will just have to buy whatever ATi or nVidia are offering'?

Despite its similarities with G70, RSX could still be a bespoke solution, could it not? And if not, what's so bad about its (G70-derivative) design that it's not suited to a console?
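For reference, the rough numbers behind that eDRAM point (my arithmetic, under my own assumptions: a 1280x720 target with 32-bit colour and 32-bit Z/stencil per sample):

```c
/* Framebuffer sizes vs. a Xenos-style 10 MB eDRAM pool.
 * Assumptions (mine): 1280x720, 4 bytes colour + 4 bytes Z/stencil
 * per sample, 4x multisampling in the MSAA case. */
#include <stdio.h>

int main(void) {
    const long pixels = 1280L * 720L;
    const long bytes_per_sample = 4 + 4;   /* colour + depth/stencil */
    long no_aa = pixels * bytes_per_sample;
    long msaa4 = no_aa * 4;                /* 4 samples per pixel */
    printf("720p, no AA : %.1f MB\n", no_aa / (1024.0 * 1024.0));
    printf("720p, 4xMSAA: %.1f MB\n", msaa4 / (1024.0 * 1024.0));
    return 0;
}
```

That works out to about 7 MB without AA and about 28 MB with 4x MSAA, which is why a whole-frame eDRAM buffer either needs "stupid amounts" of it or a tiling scheme like the one Xenos uses with its 10 MB.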
 
Powderkeg said:
Did they go to ATI? Did ATI turn them down perhaps? I seem to recall ATI having a limited number of development teams, and none of them were sitting around idle that I can recall.

Sony had 2 choices. Make their own GPU or go to Nvidia. They tried the first, and Nvidia was the only realistic option they had left. With one ATI team on the 360, another on the Revolution, and a third on the R520, they didn't have anyone who could have worked on a Sony chip, and there realistically is no one else to go to.

Or maybe they went to Nvidia because Nvidia were the only ones willing to spend the time and effort to make a chip that worked with Cell.
 
Shifty Geezer said:
Even if Sony were to ask for a bespoke solution, why would it not be a G70-like solution? Why do people expect it to be some eclectic blend of features with its own peculiar quirks?

Maybe the EE... VU0 & VU1 have something to do with people's expectations! :)
 
Powderkeg said:
Did they go to ATI? Did ATI turn them down perhaps? I seem to recall ATI having a limited number of development teams, and none of them were sitting around idle that I can recall.

Sony had 2 choices. Make their own GPU or go to Nvidia. They tried the first, and Nvidia was the only realistic option they had left. With one ATI team on the 360, another on the Revolution, and a third on the R520, they didn't have anyone who could have worked on a Sony chip, and there realistically is no one else to go to.

Well, you said it yourself: Nvidia presented Sony a ready, off-the-shelf solution. It was like a "take it or leave it" deal. It wouldn't have been hard for ATi to offer such a deal too, right? If so, could it be that ATi presented a "not so good" solution compared to Nvidia's at that time?
 