CONFIRMED: PS3 to use "Nvidia-based Graphics processor"

Status
Not open for further replies.
DaveBaumann said:
That particular option, being the Toshiba part I assume you are referencing, is likely another of the backup options. I'd say that a fully Cell route was the way they wanted to go, but the performance targets they had in mind for PS3 Cell processing are off, so they had to go with more traditional, dedicated hardware for graphics processing – yes, it's very easy to see that NVIDIA would be a far better option than the Toshiba route (that's a total no-brainer IMO!), but neither the Toshiba nor the NVIDIA route was the way Sony wanted to go (at least, judging from the patents and the conversations I've had today).

I think perhaps there may be other explanations than performance being off the mark. Perhaps it ended up being cheaper to go with the nVidia graphics/sound option and a smaller-scale Cell design rather than using a larger-scale Cell design for everything. Given the recent PSP/DS price war, Sony may have realized that getting into a price war with Microsoft using a 100% Cell design was going to leave them too far in the red.

I'm also very curious to see if the graphics chip may end up handling the sound processing too. A lot of the same kinds of operations come into play for 3D positional sound. That might justify the $5 per chip.

Nite_Hawk
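On the graphics/sound overlap Nite_Hawk raises: a minimal Python sketch (entirely hypothetical, not from any actual console SDK) of a positional mix, where each source's gain comes from distance attenuation and its pan from a dot product with the listener's right vector – the same normalize/dot-product arithmetic a GPU already runs for lighting.

```python
import math

def positional_gains(source, listener, listener_right):
    """Left/right gains for one 3D sound source.

    Uses the same vector primitives as GPU lighting: a difference
    vector, a normalization, and a dot product. All constants here
    are illustrative, not from any real audio pipeline.
    """
    delta = [s - l for s, l in zip(source, listener)]
    dist = math.sqrt(sum(d * d for d in delta))
    atten = 1.0 / max(dist, 1.0)            # inverse-distance falloff, clamped
    if dist == 0.0:
        return atten * 0.5, atten * 0.5     # source on top of the listener: centered
    direction = [d / dist for d in delta]
    pan = sum(d * r for d, r in zip(direction, listener_right))  # -1 hard left .. +1 hard right
    return atten * (1.0 - pan) * 0.5, atten * (1.0 + pan) * 0.5
```

A source two units to the listener's right comes out as `(0.0, 0.5)` – silent on the left, attenuated on the right.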
 
Maybe raytracing could be used exclusively for solving the shadow problem, and then vertex lighting, light maps and cube maps for the rest? (Shadows are, after all, what the technique was invented for to begin with. :))
 
Nite_Hawk said:
DaveBaumann said:
I think perhaps there may be other explanations than performance being off the mark. Perhaps it ended up being cheaper to go with the nVidia graphics/sound option and a smaller-scale Cell design rather than using a larger-scale Cell design for everything. Given the recent PSP/DS price war, Sony may have realized that getting into a price war with Microsoft using a 100% Cell design was going to leave them too far in the red.

I agree. Aside from the fact that we don't know for sure about the architecture of PS3's GPU (i.e. whether it's Cell-based or not), even if it isn't, I don't think it's necessarily a reflection on the performance they're achieving with Cell. Hardware aside, the choice to partner with Nvidia makes sense from many other points of view, not least their software/tools development expertise etc.
 
Squeak said:
Maybe raytracing could be used exclusively for solving the shadow problem, and then vertex lighting, light maps and cube maps for the rest? (Shadows are, after all, what the technique was invented for to begin with. :))

Yeah, that would be a pretty neat direction for GPUs to go in.
I think the basic problem with shadows on current-generation GPUs is that rasterization is an inherently local operation, whereas shadow generation asks for a global operation. It just doesn't fit the assumptions of the model, so you have to go back, re-render, and compare...
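That "go back, re-render, compare" loop is essentially shadow mapping. A toy Python sketch of the two passes, flattened to one light-space axis with made-up texel counts (purely illustrative, not how any real GPU pipeline is coded):

```python
def build_depth_map(occluders, n_texels, extent):
    """Pass 1 ('re-render'): rasterize from the light's point of view,
    keeping only the nearest depth per light-space texel."""
    depth = [float('inf')] * n_texels
    for x, z in occluders:                        # x: light-space position, z: depth from light
        t = min(int(x / extent * n_texels), n_texels - 1)
        depth[t] = min(depth[t], z)
    return depth

def in_shadow(x, z, depth_map, extent, bias=1e-3):
    """Pass 2 ('compare'): a point is shadowed if something in the
    depth map sits nearer to the light along the same direction."""
    n_texels = len(depth_map)
    t = min(int(x / extent * n_texels), n_texels - 1)
    return z > depth_map[t] + bias
```

An occluder at depth 1.0 shadows a point behind it at depth 2.0 but not one in front at depth 0.5. The global nature shows up in pass 1: every occluder in the scene can affect every shaded point, which is exactly what a single local rasterization pass cannot express.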

Also, I think people forget that IBM was looking to build a ray tracing machine back in 1995. However, the machine turned into Blue Gene:

"Blue Gene was hatched in 1995 when Denneau and IBM designer Peter Hochschild were kicking around the idea of building a machine for photo-quality computer graphics using a computationally intensive technique called ray tracing. Farms of workstations used ray tracing to produce the computer-animated film Toy Story, in which each frame took many hours of computing time. With "aggressive simplicity" as his mantra, Denneau began by designing a ray-tracing chip while colleague Hank Warren worked on a C compiler for generating executable code."
 
I don't see how licensing tech will be cheaper than not licensing tech. Certainly cheaper than what MS did with Xbox (buying the chips through nVidia after TSMC manufacturing), but not cheaper than doing it in house.
 
I think it has more to do with the performance of Sony's in-house GPU in the rasterization stage, along with key features like filtering and AA. I do think CELL would be an optimal solution for things like shading, and very good at it too. We already know the thing will pump out huge amounts of geometry, and it would be ideal if there were a way to incorporate that into the pipeline.

IBM is fairly confident of CELL's performance numbers, but of course we still need more information about the specific configuration that will be in the PS3. Until we know that, we have no clue. Speaking to a lot of developers can only get you so far. So far, in speaking with them, it seems PS3 will indeed be the most powerful in terms of raw performance. We have yet to have any clue on efficiency, but time will definitely tell.
 
Qroach said:
I don't see how licensing tech will be cheaper than not licensing tech. Certainly cheaper than what MS did with Xbox (buying the chips through nVidia after TSMC manufacturing), but not cheaper than doing it in house.

Well, I think a lot of it might come from software development costs. Considering that nVidia was going to write drivers and promote developer relations for their new GPUs anyway, and Sony would have had to do the same for the PS3, this eases the burden on both of them by only needing to write one set of development software. Assuming they get a lot of developers on board, it means that Sony/nVidia can probably get some of them to start using their tools for developing PC games too. We might see a shift away from DirectX for certain PC titles like the GTA games and others that Sony has a strong hand in.

From a purely hardware perspective, there are lots of things that could affect the pricing. Perhaps yields on Cell aren't going to be nearly as good as expected. Maybe the chips run hotter than IBM thought, and trying to put too many of them in an enclosed space would require an extremely fancy and expensive cooling setup. Perhaps the wafer size and expected yields just ended up making it too expensive compared to whatever solution nVidia is coming up with.

There are many different possible scenarios. We'll probably never know the truth.

Nite_Hawk
 
nAo said:
This doesn't make any sense at all. If CELL is off their performance expectations (and I have data to support that...), the first solution is to scale back your internal GPU design (reducing target clock, number of pipes, and so on), not to hurry up and find a new GPU supplier.
Sony switched from the internal solution to NVIDIA because their internal GPU lacked features (just 2.0-like PS... emh...) and because they need tools and shader compiler expertise.
Now Sony has an agreement even for future products, not just PS3; that tells a lot about this story.

Sounds like you know something!! Tell us more!!!
 
DeanoC said:
Guden Oden:
To me you're focusing on the problems at the moment, not the gains from solving these problems; virtually all those issues are fairly easily solvable.

Why bother solving them? Because (just as in Cell) these types of operations are suited to particular special hardware. The simple fact is (as both Cell APUs and GPUs show) that it's easier to scale n special CPUs (SPU or GPU ALU) than general-purpose CPUs. We are looking at systems in a couple of years capable of 1000 (real) FLOPs per cycle! That's worth trying to adapt certain algorithms to use that power.

GPU and Cell are solutions to the computation problem; graphics were the first things that needed it, sound and physics second, and AI will be third. Being able to switch where we compute things allows us to overcome bottlenecks and produce higher-quality games.

So next gen can win by having the best multi-accelerators?
Then the real question is which IHV is more advanced in these things.

BTW
1000 (real) FLOPs per cycle

:oops: :oops: :oops: that would give a real beast in today's 500MHz, 16-pipe parts
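For scale, the arithmetic behind that reaction, using the clock and pipe count mentioned above (back-of-envelope only, not a spec of any real part):

```python
# Back-of-envelope: what "1000 (real) FLOPs per cycle" implies
# at the 500MHz / 16-pipe figures quoted in the thread.
clock_hz = 500e6
flops_per_cycle = 1000
pipes = 16

total_gflops = clock_hz * flops_per_cycle / 1e9   # 500.0 GFLOPS sustained
per_pipe = flops_per_cycle / pipes                # 62.5 FLOPs per cycle per pipe
print(total_gflops, per_pipe)
```

That is, roughly half a teraflop sustained, with each pipe retiring 62.5 floating-point operations every cycle.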
 
The press release says it all...

"In parallel, we have been designing our next-generation GeForce GPU. The combination of the revolutionary Cell processor and Nvidia's graphics technologies will enable the creation of breathtaking imagery that will surprise and captivate consumers."

Sounds like it is a joint effort in developing GPU tech for both Nvidia's portfolio and the PS3 (each benefiting the other). Also, with recent comments from Peter Hofstee and Ken Kutaragi, there seems to be a departure from traditional PC GPUs' graphical functions...

"Current general-purpose microprocessors have not been able to fully support high-resolution image data. On the other hand, using a proprietary LSI for image data would have a specific logic so that it has no flexibility in supporting new formats or new services distributed through a network. But the CELL microprocessor will handle high-resolution image data and at the same time meet requirements of a device that can be used for new services and formats."
 
Wow, judging from the comments from Dave, Qroach and JVD, this partnership has failure written all over it. Sony missed their targets and will now be forced to use just another GPU (JVD says even slower than what Ati will make for Xbox2, despite releasing sooner). All their money and efforts went to waste. End of the Cell dream? :(
 
Qroach said:
Failure? Nobody said that. Why do people in here jump from one extreme to the next??

Well, you wrote that Sony missed their ambitions. So basically they overhyped a machine they won't be able to deliver. To me that means failing on one's intentions and lying to their customers. Sorry if I read that wrong.
 
DaveBaumann said:
Curiously, there is further speculation that NVIDIA's Soundstorm will form the audio core for PS3, this time coming from Dave Ross at Hexus, who is currently over in Santa Clara for NVIDIA's Editors' Day – I'd say that he'd only post this if there are links to suggest that to be the case. Personally, I have a hard time believing that one myself, but if it did turn out to be the case, I'd say they are hitting quite wide of the mark of their initial performance estimates for Cell (in the PS3 implementation).
Are you really suggesting that the Cell processor that will be in the PS3 could have a hard time doing sound processing?
The same sound processing that today's PC CPUs can do, in software, with little to no hit at all? You're not the only one having a hard time believing this.

If there's an Nvidia Soundstorm in PS3, that would have more to do with the Sony/Nvidia business relationship than with the inability of PS3's Cell CPU to deal with such a trivial task (I remember a 3D sound software vendor claiming a ridiculously low performance hit on a CPU, and that was a few years ago... I'd try to find a link, but I completely forget the name of this software/vendor).

BTW, the PS3 is very likely to include an EE (which may be an EE+DS, a.k.a. Dragon, or PSX2OAC...) for PS2 backward compatibility, which would also serve as the I/O chip... And judging by the raw computational power of the EE, it could also be a sound chip for that matter.

jvd said:
With the R420 and NV40, the NV40 was a new chip that turned out slower, albeit slightly more feature-rich, than a modified R3x0 core, which leads me to believe that ATI will once again have the lead in performance with next-gen chips.
Was it slower in OpenGL benchmarks? Was it significantly slower at 640x480 and 1280x720?
 
Panajev2001a said:
Birds, reliable ones, mentioned that this race between nVIDIA and the other "sort-of-in-house" challenger design had been going on for a while.

If they are birds, then they must be pink flamingos.
 
Devourer said:
Wow, judging from the comments from Dave, Qroach and JVD, this partnership has failure written all over it. Sony missed their targets and will now be forced to use just another GPU (JVD says even slower than what Ati will make for Xbox2, despite releasing sooner). All their money and efforts went to waste. End of the Cell dream? :(

Nobody here said anything about failure. And if they hinted at it, it was purely guessing. The same goes for people who think PS3 will be a graphical supercomputer that can render CGI-quality images in realtime. It's all guessing. Unless someone here has inside information as to the real specs of the CPU/GPU setup of the PS3, which I very much doubt, we're all most likely pissing in the wind.

Personally, I don't know what to expect. But if we take Sony's hype comments and divide them by 5, we will still have a kick-ass machine.
 
Well, you wrote that Sony missed their ambitions. So basically they overhyped a machine they won't be able to deliver. To me that means failing on one's intentions and lying to their customers. Sorry if I read that wrong.

I never said anything like that. I said "sometimes things don't work out the way you want"; how that turns into "missed ambitions" or "failure" is purely the way you want to look at it. It can't be a success or a failure until it has launched, IMO.
 
Devourer said:
Wow, judging from the comments from Dave, Qroach and JVD, this partnership has failure written all over it.

My comments don’t relate to this partnership, as we don’t yet know what that partnership entails. IMO, if what I believe will happen actually happens, then I think this will be a safer route than the one they initially had in mind.

Vysez said:
Are you really suggesting that the Cell processor that will be in the PS3 could have a hard time doing sound processing?
The same sound processing that today's PC CPUs can do, in software, with little to no hit at all? You're not the only one having a hard time believing this.

If there's an Nvidia Soundstorm in PS3, that would have more to do with the Sony/Nvidia business relationship than with the inability of PS3's Cell CPU to deal with such a trivial task

Well, it's not a case of little to no hit – we're talking about DD encoding here, which requires mixing of multiple high-bit-rate voices and compression on the fly; Dolby encoding can take a fair performance hit. I've heard that the DD solution with Intel's Azalia implementation can eat up as much as 20% CPU usage (P4EE) – and PC CPUs aren't necessarily going to be tasked with vertex processing, which I'm currently assuming the Cell CPU in PS3 will be, so if performance is marginal in relation to the targets, then there are some performance gains to be had, potentially.
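A rough sanity check on that 20% figure – every number below is an invented round figure chosen to show the shape of the estimate, not a measurement of any real encoder:

```python
# Hypothetical cost model for real-time 5.1 Dolby Digital encoding.
sample_rate = 48_000        # Hz per channel
channels = 6                # 5.1 mix
cycles_per_sample = 2_000   # invented: windowing + MDCT + psychoacoustics + bit packing
cpu_hz = 3.4e9              # a high-end desktop CPU of the era

load = channels * sample_rate * cycles_per_sample / cpu_hz
print(f"{load:.1%} of one CPU")   # about 17%, the same ballpark as the 20% quoted
```

Even with generous assumptions, a multi-channel encode chews through hundreds of millions of cycles per second, which is why a fixed-function solution like Soundstorm was attractive on PCs.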

Given the number of DSP processors Sony have in consumer land, I have a tough time believing they wouldn’t be able to provide their own solution anyway, even if they were looking for a dedicated solution.

But like I said, I really would be surprised if they go for a hardware DD encode solution.
 
bbot said:
I agree that Sony never intended to use Nvidia technology. And here is more proof:

http://www.gamesindustry.biz/content_page.php?section_name=dev&aid=2161
Anonymous senior-industry gibberish, overconfident claims, and PR refuting rumors have never constituted proof of anything.
bbot said:
So it seems those slides that showed 1 Cell (APU) = 1 GFLOP were right. Sony fell far short of its performance goal for Cell.
So you're expecting an 8 GFLOPS CPU for the PS3, five years after the 6.2 GFLOPS PS2 CPU...
Devourer said:
Well, you wrote that Sony missed their ambitions.
Sony patents always showed a Visualizer/Realizer coupled to the CPU. The "full software route" was (and still is) an ambition coming from enthusiasts, mainly people from this forum, I must add.
Devourer said:
So basically they overhyped a machine they won't be able to deliver. To me that means failing on one's intentions and lying to their customers.
You are really jumping the gun here. Sony never promised a 1 TFLOPS CPU, nor an in-house GPU...
 