CONFIRMED: PS3 to use "Nvidia-based graphics processor"

Status
Not open for further replies.
Are you really suggesting that the Cell processor that will be in the PS3 could have a hard time doing sound processing?
The same sound processing that today's PC CPUs can do in software with little to no hit at all? You're not the only one having a hard time believing this.

Eh? Some crappy software-based solutions have up to a 50% performance hit on modern CPUs, and they're not even doing Dolby Digital encoding. I don't even know if current PC CPUs are powerful enough to do completely unassisted Dolby Digital encoding on top of all the EAX HD effects.

Even just stereo sound on quality sound cards typically has a 10% performance hit (5% on SoundStorm) over no sound.
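For scale, here's a rough back-of-envelope of what plain software mixing costs a CPU. All the inputs are hypothetical placeholders (voice count, per-sample op cost, clock speed), not measurements:

```python
def mix_cpu_fraction(sample_rate_hz, voices, ops_per_sample, cpu_hz):
    """Fraction of a CPU consumed by mixing `voices` audio streams in software."""
    return sample_rate_hz * voices * ops_per_sample / cpu_hz

# 64 voices at 48 kHz, ~20 ops per sample (resample + volume + pan + mix)
# on a 3 GHz CPU: about 2% of the machine.
fraction = mix_cpu_fraction(48_000, 64, 20, 3_000_000_000)
```

If those assumptions are anywhere near right, plain mixing is cheap; the big performance hits being argued about would have to come from the effects processing and encoding stacked on top, not from mixing itself.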

BTW, all 3 consoles seem too similar. Nintendo should go with a K9 (fabbed by IBM, of course) and a PowerVR GPU (or at least a Flipper 2). Nintendo seems to least like having their console resemble the others (they're the most proprietary, and have been stubborn just to avoid being like the competition), whereas I think Sony just tries to do whatever they think will work out best. In the case of the PSX I believe they looked to Nintendo for guidance, and for the PS2 they tried to defy the way of the PC.
 
and the PC people aren't making it easier with their "thoughts"

Lol, that comment was pretty funny. I've seen you say something similar twice now; just what do you think you are, if not a "PC guy"? When was the last time you argued about ATI or Nvidia is the real question.
 
Fox5 said:
Eh? Some crappy software based solutions have up to a 50% performance hit on modern cpus, and they're not even doing dolby digital encoding.
So you're actually saying that a software sound solution, one that doesn't do AC-3 encoding or the like, could eat up 50% of a modern CPU's cycles?
I disagree, Fox5, I really do. :D

BTW read Dave's reply, if you don't believe me.
 
I'd thought Cell is mainly meant for 'World Simulation', as SCEI has repeatedly stated, rather than graphics processing (we've finally reached the stage of perceiving what the true meaning of 'Emotion Engine'/'Emotion Synthesis' is). The patent which describes a Cell-based Visualizer is the only source for a Cell-based GPU.

Hiroshige Goto @ Impress PC Watch wrote in his 2002/03/28 article that the PS3's GPU could be dedicated graphics hardware rather than a Cell-based one, because that's cheaper in silicon cost. But in his 2003/06/02 article, after the patent surfaced, he changed his view in favor of a Cell-based GPU, and explained that he'd thought it wouldn't be Cell-based because in 2002 he had heard from an industry source that SCEI was reviewing GPU vendor technologies. However, he questioned how a Cell-based GPU could efficiently implement 3D-specific algorithms, like early-stage HSR or the texture caching still present as fixed hardware even in today's programmable-shader-centric GPUs, citing the massive efforts of GPU vendors such as ATI Research and David Kirk of nVIDIA.

Now, with the announcement of the nVIDIA partnership, in the 2004/12/09 article Goto infers from the nVIDIA CEO's remarks that the graphics hardware in the PS3 is in the 'Media Processor' category, which includes GoForce and covers non-graphics features such as audio/networking, rather than being a separate GPU/MCP. It's said that Kutaragi is also calling the chip in the PS3 a 'Media Engine' rather than a 'Graphics Engine' (is the PSP's Media Engine indicative?).

BTW, it's been known since 2003 that Toshiba is developing graphics hardware, though it may be only the physical design of the IC, or a non-PS3 thing, as you can see in those recruitment ads (which are apparently still running) on Japanese web sites. In the ads, they want engineers who will engage in Cell and 'super-high-performance graphics LSI' development.
 
BTW, all 3 consoles seem too similar. Nintendo should go with a K9 (fabbed by IBM, of course) and a PowerVR GPU (or at least a Flipper 2). Nintendo seems to least like having their console resemble the others (they're the most proprietary, and have been stubborn just to avoid being like the competition),

Yes, Nintendo will be the one that concentrates the most on the gaming side and the "uniqueness" of their hardware.

whereas I think Sony just tries to do whatever they think will work out best. In the case of the PSX I believe they looked to nintendo for guidance, and for the ps2 they tried to defy the way of the pc.

And this time Sony is trying to cover all consumer electronics as well as the PS3 in one R&D effort. I think they did well to think of that.
 
BTW, it's been known since 2003 that Toshiba is developing graphics hardware, though it may be only the physical design of the IC, or a non-PS3 thing, as you can see in those recruitment ads (which are apparently still running) on Japanese web sites. In the ads, they want engineers who will engage in Cell and 'super-high-performance graphics LSI' development.

From the beginning, Toshiba has most likely intended those for their own products rather than the PS3.
 
Vince said:
Exactly. This graphics solution has been stated to be utilized in Sony's and Toshiba's CE-based products. We know that STI is going to use Cell, which was designed to be inherently highly area- and power-efficient, in CE, and I'm failing to see how a separate or monolithic NG-GPU block of logic is helpful. Actually, it's logically inconsistent with every indication seen thus far. The people around here who think this is going to be a PC-GPU derivative akin to the XGPU, IMHO, are going to be proven wrong. I'm much more partial to an advanced ROP/nVidia Pixel Engine integration with an IP usage scenario. Those who think this is indicative of a failure of Cell will be proven wrong (which is an absolutely asinine stance to take a week after the ISSCC publishings).

I hardly call >200 mm² for the CPU at 90 nm area-efficient. :rolleyes:
 
SiBoy said:
Vince said:
Exactly. This graphics solution has been stated to be utilized in Sony's and Toshiba's CE-based products. We know that STI is going to use Cell, which was designed to be inherently highly area- and power-efficient, in CE, and I'm failing to see how a separate or monolithic NG-GPU block of logic is helpful. Actually, it's logically inconsistent with every indication seen thus far. The people around here who think this is going to be a PC-GPU derivative akin to the XGPU, IMHO, are going to be proven wrong. I'm much more partial to an advanced ROP/nVidia Pixel Engine integration with an IP usage scenario. Those who think this is indicative of a failure of Cell will be proven wrong (which is an absolutely asinine stance to take a week after the ISSCC publishings).

I hardly call >200 mm² for the CPU at 90 nm area-efficient. :rolleyes:

AFAIK, no die areas have been revealed for CELL. Just curious where you got that figure from, and for what configuration of CELL, i.e. numbers of PUs and S|APUs per PU? :)
 
Megadrive1988 said:
would somebody who knows Japanese be so kind as to translate this article:
Well a lot of it is already known stuff, so I'll pick out the interesting bits.

Jen-Hsun Huang: "I cannot elaborate now, but the next PS will be a very powerful platform". (OK this is PR stuff)

Article noted how all three console companies are bringing in technology vendors which are normally associated with the PC space.

The first news of a relationship was heard in 2002, and Ken Kutaragi remarked that Nvidia Chief Scientist David B. Kirk "is a very intelligent person". There was also some news that Nvidia was "very interested in the XDR Dram said to be used in PS3."

JHH: "We, SCEI and the other 2 partners will collaborate to create the next-gen PS." (PR again, but this is pretty good PR.)

Article noted that sometime ago, KK publicly stated that PS3 will not be using CELL alone, but will work with a "media processor". (I definitely remember that one.)

JHH: "Graphic technology is of course important. But software and tools are also vital. This collaboration includes many levels. One of the reasons SCEI chose us is because of our excellent set of software tools."

The collaboration started around 2nd half of 2002. JHH: "Creating GPU technology takes time."

Industry rumour/news: SCEI had originally intended to have an internal team design the PS3 GPU. But when the Xbox launched with a GPU from Nvidia, the final product caused great discomfort to SCEI. SCEI began re-evaluating their plans and chose a parallel approach: internal development continued, while SCEI started liaising with PC graphics vendors as well. The final decision was not made until recently.

With regards to Nvidia manufacturing Xbox components directly and making good profits, as opposed to "merely licensing tech to SCEI": JHH: "The agreement is great. The licensing is great. There is no concern of whether we manufacture the product directly or license the tech. Our best engineers are working with this new PS product. This is a business of mind and knowledge. We will see returns by investing our best in this." (PR again. But what a great way to turn a question into great PR!)
 
AFAIK, no die areas have been revealed for CELL. Just curious where you got that figure from, and for what configuration of CELL, i.e. numbers of PUs and S|APUs per PU?

That's not a bad estimate.
 
V3 said:
That's not a bad estimate.

I'd say it's most definitely a bad estimate, at least as stated, with no figures whatsoever backing it up, and no statement of which flavor of Cell chip it applies to.

Single-PE-type Cell, or the preferred-embodiment Broadband-Engine-type Cell with 4 PEs? 200 mm² seems excessive for the former, considering modern x86 MPUs are considerably smaller even with more cache included, and too small for the latter.
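The single-PE vs. 4-PE question can be sanity-checked with ideal shrink scaling. The assumption here, which real designs only approximate, is that die area scales with the square of the feature size:

```python
def scaled_area(area_mm2, from_nm, to_nm):
    """Ideal die-area scaling between process nodes (area ~ feature size squared)."""
    return area_mm2 * (to_nm / from_nm) ** 2

# A 280 mm^2 4-PE die at 65 nm would ideally be about 537 mm^2 at 90 nm,
# far above 200 mm^2 -- so a >200 mm^2 figure at 90 nm would suggest fewer PEs.
at_90nm = scaled_area(280, 65, 90)
```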
 
On PSINext, CP and I calculated that a 4-PE die on a 65 nm process would be about 280 mm². The site is down so I can't add another link, but it would potentially cost $12 per die, though initial production would probably be 5 to 6 times that cost.
 
Xenus said:
On PSINext, CP and I calculated that a 4-PE die on a 65 nm process would be about 280 mm². The site is down so I can't add another link, but it would potentially cost $12 per die, though initial production would probably be 5 to 6 times that cost.

Both my estimates were roughly the same... that's freaky! :oops:

Because I got 281 mm² for the 4-PE Broadband Engine (BE) on 65 nm.

http://www.beyond3d.com/forum/viewtopic.php?p=353155#353155

And a cost of $70 per BE die.

http://www.beyond3d.com/forum/viewtopic.php?p=358184#358184
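Estimates like these can be reproduced with a standard wafer-cost sketch: gross dies per wafer, a Poisson yield model, then cost per good die. All inputs below (wafer price, defect densities, the edge-loss formula) are hypothetical placeholders, not the figures the posters actually used:

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order die count: wafer area over die area, minus a
    circumference edge-loss correction."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Poisson yield model: probability a die catches zero killer defects."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, defects_per_cm2):
    good = (gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2)
            * poisson_yield(die_area_mm2, defects_per_cm2))
    return wafer_cost / good

# A 281 mm^2 die on a $5000, 300 mm wafer: cost falls several-fold as the
# process matures from ~1.0 defects/cm^2 early on down to ~0.3.
early = cost_per_good_die(5000, 300, 281, 1.0)
mature = cost_per_good_die(5000, 300, 281, 0.3)
```

With these placeholder inputs the ratio between early and mature production comes out in the 5-10x range, in the same ballpark as the "5 to 6 times" intuition above; the absolute dollar figures depend entirely on the assumed wafer price and defect density.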
 
I must say I'm a bit confused as to what exactly nVidia is making for the PS3. Is it a GPU? If it is, why is Kutaragi referring to it as a media processor (which implies a chip that does things like decoding video or encoding DD5.1 audio, something like the MCP in the Xbox or the Media Engine in the PSP)? If it's just a media processor, I find that a bit odd. I think Sony has much better expertise with media processors than with 3D graphics processors, so why would they outsource such a thing to nVidia?
 
nVIDIA is making the GPU with SCE.

A media processor would still be a fitting description if the GPU, the sound DSP, and the Ethernet link were embedded on the same SoC (think of Flipper with the northbridge and the sound DSP integrated).
 
ATI Rebuts Nvidia With Counter-Claims

Responding strongly, Suki Matharu, country manager, ATI India, said, "We can point out 100 flaws with Nvidia drivers and its cards. They cheat the consumers all the time. We can actually kill them as our products are far better."

According to ATI, it enjoys a worldwide market share of 51 percent as compared to Nvidia's share of less than 45 percent.

Matharu further added, “We take the interest of the consumers at large, unlike Nvidia. The users of Nvidia 6800 GT will realize that its technology is not good enough to support the next generation graphics that Longhorn or 64-bit needs."

http://www.channeltimes.com/channel...ection_code=16&storyid=904819&sp=h_bn

This may be irrelevant to this thread, but judging from that last quote, it may be an indication that Microsoft will lock Nvidia out of its future plans and favour ATI instead.
 