ATI - PS3 is Unrefined

Alpha_Spartan said:
To add to the notion that RSX isn't a completely new part, Nvidia is making chump change off of the PS3 (compared to the Xbox). Sony basically paid them to use their current design and to take over the fabrication. Nvidia still gets a small royalty. So apparently, Nvidia didn't spend anything on RSX development since they basically sold Sony their previous work (G70).

This idea can be reconciled with all current Sony and Nvidia press comments.

Small royalty... it is $5+ per chip, and when you do not even fab it, that is not a small amount of money. Plus, IIRC SCE agreed to give nVIDIA money for RSX's R&D aside from the royalties for each chip made, or maybe I am getting confused about when nVIDIA's CEO was talking about R&D costs (edit: no confusion, they have been receiving NRE/R&D funds from SCE), royalties, and SCE taking over fabrication... certainly there are hardware engineers from nVIDIA working on RSX together with SCE's own engineers (on customizing the 90 nm die shrink of G70 so that it becomes RSX).

No, I was not confused... thanks Titanio.
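
For a rough sense of scale, here is a back-of-envelope sketch of what a per-chip royalty adds up to over a console's lifetime. Only the ~$5/chip figure comes from the post above; the unit volumes are purely illustrative assumptions.

Code:
# Back-of-envelope royalty estimate. Only the ~$5/chip figure is from the
# thread; the lifetime console volumes below are illustrative assumptions.
royalty_per_chip = 5.00  # USD, lower bound quoted above

for units_shipped in (10e6, 30e6, 100e6):  # hypothetical lifetime PS3 volumes
    royalty_total = royalty_per_chip * units_shipped
    print(f"{units_shipped / 1e6:>5.0f}M consoles -> ${royalty_total / 1e6:,.0f}M in royalties")

Even at the low end that is tens of millions of dollars over a generation, before any NRE payments are counted.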
 
geo said:
Agreed. Do you have anything to point at to support that conclusion? Preferably something unambiguous --or at least less ambiguous than "aspects"? :smile:

I've always felt that the 2 HDMI ports are a dead giveaway that RSX is a PC part (i.e. G70). I can't imagine that Sony really put dual video outputs on the list of key requirements for the PS3, but once they had the *free* 'off the shelf' capability they found a great way to market it as a revolutionary element in gameplay (as they always do).
 
Alpha_Spartan said:
To add to the notion that RSX isn't a completely new part, Nvidia is making chump change off of the PS3 (compared to the Xbox). Sony basically paid them to use their current design and to take over the fabrication. Nvidia still gets a small royalty. So apparently, Nvidia didn't spend anything on RSX development since they basically sold Sony their previous work (G70).

nVidia has received NRE revenue from Sony, and continued to until very recently (if it has stopped at all). I'm not sure how much it was, though.

Alpha_Spartan said:
So RSX isn't a "bolted on PC part" but rather a G70 heritage GPU for the PS3.

The latter is exactly what it is, as per everything that's been presented to us. I think it's more than possible to craft a worthy chip for PS3 - the most powerful in a console, even - out of G70 parent tech. It's really good stuff.

expletive said:
once they had the *free* 'off the shelf' capability they found a great way to market it as a revolutionary element in gameplay (as they always do).

I felt gameplay considerations were rather secondary in the presented uses of dual video-out at E3...
 
mckmas8808 said:
What exactly do you consider late in the day? Like what month and year?

Whatever Wavey meant by:

but from what I’ve heard and from what I’ve been told about it from quarters within NVIDIA I actually do believe the decision was fairly late in the day and that the design isn’t significantly different from what they already have “on the shelf”, so to speak.

upstream. If you are a practised deconstructor of Wavey-ese, at least.

Month and year? No idea. I'm not that precise in these things. Summer 2004 wouldn't surprise me. Even if the decision itself was made later than that, there must have been some mutual exploration of how it would work, what each side could bring to the table, and how those parts 'n pieces would fit together. Summer 2003 would surprise me. Summer 2002 and now we're talking amazed. At least regarding RSX. Not in the least surprised if they were talking about licensing Cg that early --one of the few weak points I recall everyone sticking the needle into Sony over last time was weak dev tools.
 
Panajev2001a said:
Small royalty... it is $5+ per chip, and when you do not even fab it, that is not a small amount of money. Plus, IIRC SCE agreed to give nVIDIA money for RSX's R&D aside from the royalties for each chip made, or maybe I am getting confused about when nVIDIA's CEO was talking about R&D costs, royalties, and SCE taking over fabrication... certainly there are hardware engineers from nVIDIA working on RSX together with SCE's own engineers (on customizing the 90 nm die shrink of G70 so that it becomes RSX).
I'm guessing that Sony didn't pay for JUST the design. The Nvidia engineers were part of the deal. My belief is that Sony approached Nvidia about a possible graphics processor while Nvidia was developing the G70. An agreement was made and the RSX was born out of the G70 line of processors.

I mean, some people want it both ways. They want to believe that Sony and Nvidia were working together for three years AND they want to believe that somehow the RSX is a completely new design based on G80+ technology. That would make more sense if Sony and Nvidia had just recently started their partnership and if the PS3 were going to ship in 2007, but alas that isn't the case. Sony has never fabbed a complex GPU in its life, so it makes sense that they'll get their start with a mature design, not something new that even TSMC would have a hard time with.
 
expletive said:
What were their other notions of dual HDMI out besides gameplay elements?
I've seen expanded video displays in some presentations, where two 1080p displays are combined into one large video surface.
 
It will be funny if an unrefined console is able to clearly outdo a refined console, very funny... :)
 
Alpha_Spartan said:
I'm guessing that Sony didn't pay for JUST the design. The Nvidia engineers were part of the deal. My belief is that Sony approached Nvidia about a possible graphics processor while Nvidia was developing the G70.

IMHO it was the other way around, with nVIDIA courting SCE, but still your analysis might be right if you date the end of G70's development to mid-to-late 2004 ;).
 
nAo said:
It will be funny if an unrefined console is able to clearly outdo a refined console, very funny... :)

It would only happen if a tree-hitting simulator programmer, an ex-bad dog, or an Italian stallion from Emilia Romagna were programming for it, and I am sorry, that cannot happen ;).
 
expletive said:
What were their other notions of dual HDMI out besides gameplay elements?

Using a secondary screen for non-gameplay functionality. Put your video chat window on it, so it doesn't interrupt your playing. Access the rest of the functionality using it too, while you or someone else plays on the "primary" screen. These new systems are about on-demand functionality and essentially multitasking, so being able to present to two screens could be quite convenient.
 
Titanio said:
Using a secondary screen for non-gameplay functionality. Put your video chat window on it, so it doesn't interrupt your playing. Access the rest of the functionality using it too, while you or someone else plays on the "primary" screen. These new systems are about on-demand functionality and essentially multitasking, so being able to present to two screens could be quite convenient.

Not to distract from the thread, but dual HDMI and the new Sony SXRD HDTVs will go hand in hand, especially for multi-tasking or online gaming if Sony does things right. These TVs have nice split-screen functionality and a micro-screen mode (3/4 screen with a letterboxed window in the corner).

Playing Socom with your EyeToy set up for video conferencing with the rest of your team, or Madden just to see their reaction when you take 'em out or throw a touchdown, will be priceless.

Speng.
 
Additionally, the architecture of XB360 does not provide for an explicit data flow from Xenos->Xenon. All such data flows are routed via memory - i.e. Xenos writes to memory, and Xenon then reads it. So while RSX->Cell is supported by a specific 15GB/s link, Xenos's data is forced to go via memory, and capped at 10GB/s.

Well Xenos does support locking parts of its 2nd level cache for exactly this reason, so you don't have to go through memory if you don't want to. Or do you mean Xenos not having dedicated ports to access the GPU and main memory?
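
To put the quoted bandwidth figures in context, here is a quick worked example of what streaming a full render target from GPU to CPU every frame would actually cost. The 10 GB/s and 15 GB/s caps come from the quoted post; the resolution, pixel format, and frame rate are assumptions chosen purely for illustration.

Code:
# Rough cost of reading back a full render target from the GPU each frame.
# The 10 GB/s (via memory) and 15 GB/s (dedicated link) caps are quoted in
# the thread; resolution, bytes per pixel and frame rate are assumptions.
width, height = 1280, 720   # assumed 720p render target
bytes_per_pixel = 4         # assumed RGBA8
fps = 60

gb_per_second = width * height * bytes_per_pixel * fps / 1e9

for label, cap_gbps in (("via memory, 10 GB/s cap", 10.0),
                        ("dedicated link, 15 GB/s cap", 15.0)):
    share = gb_per_second / cap_gbps
    print(f"{label}: readback needs {gb_per_second:.2f} GB/s ({share:.1%} of the cap)")

A single 720p readback fits comfortably under either cap, so the raw numbers only start to bite when many buffers or much higher resolutions move every frame; the difference the quoted post points at is that on the 360 the data has to round-trip through memory at all.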
 
Panajev2001a said:
IMHO it was the other way around, with nVIDIA courting SCE, but still your analysis might be right if you date the end of G70's development to mid-to-late 2004 ;).
After all the shit Nvidia went through with MS, I don't think they wanted to court another console maker. IIRC, Nvidia made statements about how they needed to focus more on the PC side of their business and that consoles were a distraction.

Secondly, looking at Nvidia's role in this "partnership", they seem more like a vendor than a partner. If you look at the MS/ATi partnership on Xenos, that's no mere vendor/buyer relationship. MS designed part of the console (the eDRAM), wrote the compilers, and is handling the manufacturing. Xenos can't even be attributed to a family of ATi GPUs. Is it an R500 or an R600? It's neither. It's completely different.

The best way I can illustrate it is that MS's relationship with ATi on the Xenos design was as close as Sony's relationship with IBM on the Cell design. Sony's relationship with Nvidia on RSX is about the same as MS's relationship with IBM on Xenon's design. It's basically, "What you got? We'll buy it!" rather than "Let's get together and design something from scratch."
 
Alpha_Spartan said:
After all the shit Nvidia went through with MS, I don't think they wanted to court another console maker. IIRC, Nvidia made statements about how they needed to focus more on the PC side of their business and that consoles were a distraction.

Well, I do believe they still did not want ATI to get two console makers, and possibly all three with nVIDIA getting out of the picture altogether (if only for the PR alone, but that would not be telling the whole story IMHO).

They knew they would not want to repeat the issues they had with MS, but in the end they did not lose money overall on the Xbox 1 venture. IMHO they wanted to be in a profitable venture, they wanted to be inside PLAYSTATION 3, and they got two birds with one GPU: RSX :).
 
Alpha_Spartan said:
I'm guessing that Sony didn't pay for JUST the design. The Nvidia engineers were part of the deal. My belief is that Sony approached Nvidia about a possible graphics processor while Nvidia was developing the G70. An agreement was made and the RSX was born out of the G70 line of processors.

I mean, some people want it both ways. They want to believe that Sony and Nvidia were working together for three years AND they want to believe that somehow the RSX is a completely new design based on G80+ technology. That would make more sense if Sony and Nvidia had just recently started their partnership and if the PS3 were going to ship in 2007, but alas that isn't the case. Sony has never fabbed a complex GPU in its life, so it makes sense that they'll get their start with a mature design, not something new that even TSMC would have a hard time with.

Very well said!

Nvidia engineers will still be helping out on RSX process shrinks over the years. As it did with the PS2, Sony is very concerned about cost, and releases many smaller revisions of its console chips in order to save money and help contribute to greater manufacturing numbers.
 
speng said:
Playing Socom and having your eye-toy set video conferencing with the rest of your team, or Madden just to see their reaction when you take 'em out or throw a touch down, will be priceless.

Speng.
As badass as that sounds, using another HDTV for that seems kinda wasteful. You can use a 20" SDTV for that stuff.

One idea I've seen thrown around for MGS4 is where one player controls Snake on the main TV while a second player (Otacon?) controls the little robot thingy. It would be cool to use the capabilities of the little robot to scope out areas, distract targets, stun targets, etc., while Snake does his thing. In fact, that would fucking rock.
 
PiNkY said:
Well Xenos does support locking parts of its 2nd level cache for exactly this reason, so you don't have to go through memory if you don't want to. Or do you refer to xenos not having dedicated ports to access the gpu and main memory?

I thought he was saying that Xenos can read directly from Xenon's L2 cache, but cannot write directly into it and has to pass through memory instead... I cannot remember it myself now, so I would have to go check...
 