RSX pipeline speculation?

RSX pipeline configuration?

  • Option A: 8 VS pipes, 2-issue; 24 PS pipes, 5-issue
    Votes: 39 (59.1%)
  • Option B: 12 VS pipes, 3-issue; 20 PS pipes, 5-issue
    Votes: 5 (7.6%)
  • Option C: 4 VS pipes, 4-issue; 24 PS pipes, 5-issue
    Votes: 10 (15.2%)
  • Option D: 20 VS pipes, 2-issue; 16 PS pipes, 6-issue
    Votes: 3 (4.5%)
  • Option E: Other! Please specify...
    Votes: 9 (13.6%)
  • Total voters: 66
scatteh316 said:
Or maybe they went to Nvidia because they were the only ones willing to spend time and effort to make a chip that worked with Cell.

I could have sworn that's what I said.
 
scatteh316 said:
Or maybe they went to Nvidia because they were the only ones willing to spend time and effort to make a chip that worked with Cell.

Not just make it work with the Cell, but I believe they are customising it with new stuff not found in the G70.
 
hugo said:
Well, you said it yourself that Nvidia presented them an off-the-shelf, ready solution, presented right in front of Sony. It was like a "take it or leave it" deal. That wasn't hard for ATi to offer such a deal too, right? If so, could it be that ATi presented a "not so good" solution compared to Nvidia at that time?


Like I said. ATI simply didn't have a development team to work on the chip in the first place.

Nvidia had a development team to spare since MS dropped them as a chip supplier.

I doubt the cost to Sony, or the results would have been different regardless of who they went to.

I really don't see what the big deal is. Is it really such an insult to say "Sony got what they paid for"? Why is this a "common sense" statement when used with anything except a game console?
 
scatteh316 said:
To me, your statement sounded like they went to Nvidia because everyone else turned Sony down and they had no choice.

There are only 2 companies fit to do the job. ATI or Nvidia. ATI was busy. Who is left?
 
Powderkeg said:
There are only 2 companies fit to do the job. ATI or Nvidia. ATI was busy. Who is left?

That means nothing. Do you really think that if Sony asked ATI to make a GPU for the PS3, ATI would say "we're too busy"?

Not one GPU manufacturer would turn down the chance to make a GPU for what could potentially be the best-selling console EVER.

There is more to this than is being let on. I think Nvidia and Sony are up to something.
 
scatteh316 said:
That means nothing. Do you really think that if Sony asked ATI to make a GPU for the PS3, ATI would say "we're too busy"?

Since ATI had already signed contracts with Nintendo and MS, what choice did they have?

Give up their entire PC line of products to work for Sony? Get their asses sued by the likes of Nintendo, or worse, MS, for breach of contract? Maybe hire in a bunch of new people who don't know a damn thing about their current products and technologies, and pray they can pull a GPU out of their butts on schedule?

Not one GPU manufacturer would turn down the chance to make a GPU for what could potentially be the best-selling console EVER.

Not as long as there is money to be made, which brings us back to Nvidia making a profit.

There is more to this than is being let on. I think Nvidia and Sony are up to something.

They are up to business. Each trying to make their company money, and their investors happy. No more, and no less.
 
Powderkeg said:
Like I said. ATI simply didn't have a development team to work on the chip in the first place.

Nvidia had a development team to spare since MS dropped them as a chip supplier.

I doubt the cost to Sony, or the results would have been different regardless of who they went to.

I really don't see what the big deal is. Is it really such an insult to say "Sony got what they paid for"? Why is this a "common sense" statement when used with anything except a game console?

Look back two years, when Sony went out looking for a graphics company that could help them design a GPU for their next console. Their aim was to come out with the most powerful next-gen console, and ATi was pwning Nvidia's 5th series at that time. The R520 was already on the drawing board, or further down the development line.

If Sony only wanted a desktop GPU to modify for their console, they could have gone straight to ATi instead. I am very sure Sony is one customer ATi would never ignore, since the PlayStation line is the best-selling console of all time. Customizing the GPU is never a problem for Sony because they have all the resources and engineers. If ATi had turned down Sony because they were obliged to MS, why are they offering the R520 to Nintendo today? Why is IBM brave enough to work with all three companies this gen?

It just doesn't make sense. If Nvidia offered something interesting on paper, and it was one of their future desktop parts with a deadline, how come the RSX isn't ready even now? The G70 has already been out for more than six months, or maybe longer counting from when it was taped out.
 
Powderkeg said:
Not as long as there is money to be made, which brings us back to Nvidia making a profit.

Don't tell me you think Nvidia will have trouble making a profit. They are taking a one-time expense for developing the chip. They do not have to do anything but provide Cg and OpenGL support after that. This means they will quite easily turn a profit, though it may not be as big per unit as what they got on the Xbox.
 
MechanizedDeath said:
I chose Option A. How's it work with the DOT product number? 6 SPEs. :D One is tied up for the OS. That brings me to 50M DOTS if you use 56 DOT/cycle for RSX then. 1M DOTs missing. I chalk it up to a rounding error. :lol Come on, 2% error is acceptable in polls. It's acceptable in forum speculation. AMIRITE? PEACE.

Nice try! Nitpick, but that would be 50 'billion', not 'million'. And it's exactly 50B, so rounding to 51B would be even stranger! Here's a breakdown:

CELL+RSX = 51 GigaDOTS/sec

Case A: 6 SPUs

-> 6 DOTS/cycle x 3.2 GHz = 19.2 GigaDOTS/sec

RSX -> 51 - 19.2 = 31.8 GigaDOTS/sec

-> 31.8 / 0.55 GHz ~ 57.82 DOTS/cycle

FAIL, NOT an integer


Case B: 7 SPUs OR 6 SPUs + 1 VMX (PPE)

-> 7 DOTS/cycle x 3.2 GHz = 22.4 GigaDOTS/sec

RSX -> 51 - 22.4 = 28.6 GigaDOTS/sec

-> 28.6 / 0.55 GHz = 52 DOTS/cycle

PASS, integer -> 52 VEC4 units


Case C: 7 SPUs + 1 VMX (PPE)

-> 8 DOTS/cycle x 3.2 GHz = 25.6 GigaDOTS/sec

RSX -> 51 - 25.6 = 25.4 GigaDOTS/sec

-> 25.4 / 0.55 GHz ~ 46.18 DOTS/cycle

FAIL, NOT an integer
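The back-of-the-envelope check above can be sketched as a quick script (assuming, as the post does, a 3.2 GHz Cell, one vec4 DOT per unit per cycle, an assumed 550 MHz RSX clock, and the quoted 51 GigaDOTS/sec combined figure):

```python
# Which count of CPU-side vec4 units leaves the RSX with a whole
# number of DOTs per cycle? Constants are the thread's assumptions.

TOTAL_GDOTS = 51.0   # quoted combined CELL+RSX throughput, GigaDOTS/sec
CELL_CLOCK = 3.2     # Cell clock, GHz
RSX_CLOCK = 0.55     # assumed RSX clock, GHz (550 MHz)

cases = {
    "Case A: 6 SPUs": 6,
    "Case B: 7 SPUs (or 6 SPUs + 1 VMX)": 7,
    "Case C: 7 SPUs + 1 VMX": 8,
}

for label, cpu_dots in cases.items():
    rsx_gdots = TOTAL_GDOTS - cpu_dots * CELL_CLOCK   # RSX's share
    rsx_dots = rsx_gdots / RSX_CLOCK                  # DOTs/cycle
    verdict = "PASS" if abs(rsx_dots - round(rsx_dots)) < 1e-6 else "FAIL"
    print(f"{label}: RSX ~{rsx_dots:.2f} DOTs/cycle -> {verdict}")
```

Only the 7-DOT case lands on an integer (52 vec4 units), matching the breakdown above.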
 
xbdestroya said:
...
Well, I'll be a little disappointed if it's just a tweaked GTX. Not that I'll be complaining loudly though, just more along the lines of 'get it over with already if that's what it is.'
...

That would still suffice. However TurboCache has been strongly rumored as an alternative to eDRAM. I'd expect some ROP tweaks to work with XDR, GDDR and SPU LS etc...and perhaps VS units customised to work with CELL. That only leaves the PS units pretty much unchanged!
 
rabidrabbit said:
I voted "Other" so that I could see what others have voted, and because I have no idea what's the difference between "2-issue" and "5-issue" :)

Well, 2-issue, 5-issue etc. are the number of 'instructions' each VS or PS unit can handle per cycle...
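As a worked example (assuming "n-issue" means n shader instructions per pipe per cycle), multiplying pipes by issue width and summing gives a total instructions-per-cycle figure, which can be checked against the 136 inst/cycle number quoted elsewhere in the thread:

```python
# Total shader inst/cycle = VS pipes x VS issue + PS pipes x PS issue,
# for each poll option: (VS pipes, VS issue, PS pipes, PS issue).

options = {
    "A": (8, 2, 24, 5),
    "B": (12, 3, 20, 5),
    "C": (4, 4, 24, 5),
    "D": (20, 2, 16, 6),
}

for name, (vs, vs_issue, ps, ps_issue) in options.items():
    total = vs * vs_issue + ps * ps_issue
    print(f"Option {name}: {vs}x{vs_issue} + {ps}x{ps_issue} = {total} inst/cycle")
```

Notably, every poll option works out to exactly 136 inst/cycle.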
 
Damn, suddenly speculating is not funny anymore ;)
I still think the Sony-Nvidia deal was a last minute thing...
 
nAo said:
Damn, suddenly speculating is not funny anymore ;)
I still think the Sony-Nvidia deal was a last minute thing...

Are you just speculating or do you know something? Because if you know something, I might have to get out the torches and pitchforks and rustle up an angry mob to hunt you down until you spill what you know. :p
 
Urian said:
Perhaps we are wrong assuming that the RSX is a derivative of an existing Nvidia architecture.

The facts that we know are:

-No Embedded DRAM.
-No Unified Shader.
-51M Dot/Products combined with the Cell.
-136M Shader Instructions.

Nitpick, but it's 51 'B' (billion) per second, and 136 instructions per cycle, not 136M.

In other words the config could be:

4 Vertex Shaders.
20 Pixel Shaders.

They don't add up to 136 inst/cycle

The problem with this scenario is the transistor count. But since the MGS4 demo ran in real time, and since I earn my living editing video, I can see that the demo uses some effects from video and photo editing rather than from 3D rendering. I believe that the element under the NDA is a module that can apply effects like motion blur, HDR and others in real time, making the work easier for the rest of the RSX; this module cannot be combined with the Cell because it isn't programmable.

Something like an advanced 3DFX T-Buffer?
 
Xenus said:
Are you just speculating or do you know something? Because if you know something, I might have to get out the torches and pitchforks and rustle up an angry mob to hunt you down until you spill what you know. :p
Prior to RSX's announcement etc., would Ninja Theory have had an insight into what Sony were planning? I can't imagine they would have. I guess they'd just be targeting a high-end PC type rig and trusting whatever hardware Sony and MS put out there would fit the bill, plus of course they'd know Cell would be coming. I can't see that anyone outside of Sony/nVidia would know if it was a last minute fix or not, as I can't see why Sony would care to divulge that information. Except for developers. e.g. 'Hi Ninja Theory. For our PS3 you'll have to write your shader code on Cell. That'll be our GPU, only with some texture units.' And then a year later 'Um, you know what we said about PS3's graphics. Scratch that, we're going with nVidia.' Do devs get that much info on the hardware that early on? I wouldn't have thought so.
 
[Attached image: RSXspecs.jpg]


expletive said:
Could the RSX have zero VS pipes and use the cell for all those functions?

J

That was a popular theory before E3. However, they already specified 'independent' pixel/vertex shaders, though that term was never really clarified...
 