From PC to Next-Gen Consoles: Largest Performance Gap...

But more importantly... it will be the viability of Cell itself. If STI wishes to promote the chip for other uses, then having some advantage is needed; otherwise, what's the incentive to continue down the Cell path? (Maybe not teraflop performance, but even half that would be impressive.)
 
Panajev2001a said:
It still is not, as you exclude the possibility that the GPU could help in that area as well: think about a CELL-based GPU, maybe.

Xenon blurs the lines between CPU and GPU. Some graphics ops will be done on the CPU and some general processing will be done on the GPU.
If anything it's fairer to compare the XeGPU to Cell; the PPC CPU is just a bonus ;-)

Processors turn up in strange places these days. What exactly makes an APU a CPU thread (apart from its location on a chip), but a thread in a GPU not? Serious question.

Cell seems to have one thread per APU. How exactly are you going to hide the data dependencies that a graphics pipeline has?
Unless Cell is very different from the model currently thrown about, it would make a terrible GPU... or a bloody hard one to program (manually hiding latency on the scale of a GHz processor is :devilish:).
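For what it's worth, the "manual" latency hiding being alluded to usually comes down to double buffering: keep the next chunk of data in flight while the current one is being worked on. Here is a rough, hypothetical C++ sketch of the structure; plain memcpy stands in for what would really be an asynchronous DMA into an APU's local store, and all names are made up for illustration:

```cpp
// Hypothetical sketch of latency hiding via double buffering.
// On real hardware the "fetch" would be an async DMA started early;
// std::memcpy here only shows where the overlap would happen.
#include <cstddef>
#include <cstring>
#include <vector>

constexpr std::size_t kChunk = 1024;           // elements per "local store" buffer

void process(float* data, std::size_t n) {     // stand-in for the real per-chunk work
    for (std::size_t i = 0; i < n; ++i)
        data[i] *= 2.0f;
}

// Assumes out.size() >= in.size(); any tail smaller than kChunk is ignored.
void transform(const std::vector<float>& in, std::vector<float>& out) {
    float local[2][kChunk];                    // two local buffers: work on one, fill the other
    const std::size_t chunks = in.size() / kChunk;
    if (chunks == 0) return;

    std::memcpy(local[0], in.data(), kChunk * sizeof(float));   // prime buffer 0
    for (std::size_t c = 0; c < chunks; ++c) {
        const std::size_t cur = c & 1, nxt = cur ^ 1;
        if (c + 1 < chunks)                                     // start "fetching" the next chunk
            std::memcpy(local[nxt], in.data() + (c + 1) * kChunk, kChunk * sizeof(float));
        process(local[cur], kChunk);                            // work on the current chunk
        std::memcpy(out.data() + c * kChunk, local[cur], kChunk * sizeof(float));
    }
}
```

With a blocking copy there is no real overlap, of course; the point is only the shape of the loop a programmer has to write by hand when the hardware won't hide the latency for them.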
 
Polarbear53 said:
If Cell was really that powerful, I think IBM would be crapping themselves... :?

Cell doesn't compete with the Power5. Comparing the two is like comparing a sports car to a freight-train. Powerful means two completely different things for each.
 
MfA said:
I would be surprised if they have conceived of anything revolutionary. No revolutionary programming model for the local parallel programming, let alone for the distributed case.

I think we won't have a revolution: local parallel programming will remain as hard as it has ever been... and distributed programming over comparatively high-latency, low-bandwidth networks will rely on batch processing like it always has. Cell won't be a revolution either, but it will represent the culmination of the dreams of those who have seen this coming for decades. Not a revolution, but the first true processor of its kind (a general-purpose, programmable, massively parallel processor).

What the engineers lack is purity

--A year has passed since the release of PlayStation2. Is everything under full sail?

Kutaragi: Partly yes, and partly no. However, I think one needs to take a broader view when looking at PlayStation2. Keeping track of the moves of the console itself is not an issue anymore, because the box itself has no significant meaning. The network comes first, and the box hangs off the network. I intend to change this equation.


--I assume you are referring to "CELL," the new microprocessor chip that you will be jointly developing with IBM Corp. of the U.S. and Toshiba. Won't the chip be embedded into the next-generation game console?

Kutaragi: Whether CELL is built into the game console or not is not an essential matter. Should the era of packaging continue, I guess PlayStation3 and PlayStation4 would be topics worth discussing, but what I would like to stress is that the concept of packaging, or the box, will disappear in the broadband era. The same can be said of the concept of servers and clients. A band of CELLs will assume the role of the existing computer system and establish a living organism, like a real cell. The world's broadband will consist of an aggregation of CELLs. One CELL has the capacity for 1 TFLOPS of performance, and an aggregation of 1,000 CELLs would reach 1 PFLOPS (peta-FLOPS). A capacity of 1P is equivalent to the information-processing ability of one human being. Thus the creation of another world becomes possible if we were able to collect CELLs equal to the capacity of 5 billion people.


-- I heard that you took the leading role in introducing the idea to develop CELL

Kutaragi: Yes, I had been imagining it in my mind for years before. It was also my idea to dub it "CELL." Although at the initial stages I had been calling it "Saibo" (meaning cell in Japanese), I christened it with the English name "CELL" in spring 2000, when I confided my thoughts to IBM Corp. I believe that the development of CELL will bring a renovation, the first in 50 years of computer history. Nothing has fundamentally changed from the time ENIAC appeared until now, where we have the Itanium. To date, network-linked computers have existed as stand-alone islands. That was not much of a problem, because operating systems were distributed across the individual islands and the islands merely exchanged data among themselves.


--So you are saying that exchanging data among stand-alone computers is not enough?

Kutaragi: What would happen if things became even more broadband and there were no ceiling limiting the bandwidth of broadband? To be sure, there are restrictions under our current wires, but shifting to fiber optics would dramatically boost the speed of communication. We are now witnessing further development of an optical switch that has the capacity to input/output data in the form of a light signal. People will start to review the current computer architecture once such a networking environment of optical communication is completed.

I am not denying the high processing capability of the computers that make up our current networks. Microprocessors in personal computers have reached an operating frequency of 1 GHz, and high-powered microprocessors are embedded in PlayStation2. Why then can't such highly capable computers interact with each other once they are connected to the Internet? The reason is attributed neither to fiber optics nor to the "Last One Mile" task of connecting high-speed lines to households. The fact that servers and personal computers use the same LSI is the greatest bottleneck hobbling the realization of interaction among computers.

Merely connecting one personal computer to another directly by fiber optics is easy. However, if we were to connect one personal computer to ten, what would happen to the server positioned at the center of the network? In a case where the server is also required to function as a switchboard, we must lay out legions of clusters even with a centralized networking topology. Furthermore, the server would collapse should we try to shape it into the form of a complete network. The idea is the same as in the case of the server breakdown of the e-mail service at NTT DoCoMo. Not every single person will be able to enjoy that bandwidth even if fiber optics were to spread to all households around the world.


Topology to change

--Will CELL be the solution to the bottleneck you mentioned?

Kutaragi: Exactly. CELL will transform the fundamentals of network topology. The old mechanism functioned by reading data from memory into registers and writing the results of the arithmetic back into memory. In short, it was just a repetition of loading and storing. Because each cache memory differs in access time and capacity, this worked out into a hierarchical structure: primary cache, secondary cache, and so on. CELL, on the other hand, might completely transform the concept of cache, as it would drastically accelerate the speed of networking. What becomes reality is that each of the astronomical number of computers around the globe could unite to form CELLs and operate under one operating system. Each CELL would be the broadband network itself. Just to give you a picture, it is like 1,000 computers at one company functioning as one server. In such a networked world, one would only see the overall processing power decline slightly when one computer drops out, and rise when one joins. It sounds like a human society.
[source: http://ne.nikkeibp.co.jp/english/2001/30aniv/int5_1.html ]
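(As an aside on the "loading and storing" description above: in conventional terms that cycle is just the following kind of loop, where every element makes a round trip memory -> register -> memory through whatever cache hierarchy sits in between. A trivial illustrative C++ fragment, nothing Cell-specific.)

```cpp
// The load/compute/store cycle made explicit. The compiler turns each
// iteration into a load from memory (via the cache hierarchy), arithmetic
// in registers, and a store back to memory.
#include <vector>

void scale_and_bias(std::vector<double>& v, double a, double b) {
    for (double& x : v) {
        double r = x;      // load: memory (through the caches) into a register
        r = a * r + b;     // compute: entirely in registers
        x = r;             // store: register back to memory
    }
}
```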


(I first found out about that article from this thread: http://www.beyond3d.com/forum/viewtopic.php?t=8856 )
 
DSN2K said:
I see the PS3 hype certainly got to everyone :?

Tell me about it... I really didn't intend for this thread to be centered almost solely around PS3. I guess it can't be helped. But hey, I'm thankful I got more responses than I hoped for, although I'd like it if we got back on topic. :?
 
Killer-Kris said:
I didn't read the whole thread, but if the diagram several posts in is at all accurate, PS3 is going to be very hard to get decent performance out of. That is, unless Sony has created a very good threading library and compiler for developers.

Though for the most part, a CPU setup like that is going to need a lot of work on the part of game developers to make good use of. And frankly, three CPUs that are very decent at single-threaded processing will be a lot easier to use than four pseudo-cores with many functional units (I'm assuming there's multithreading going on there, or else you'll never make use of all those functional units).

How did you get to these conclusions?
Dean Calver, who is developing for Xenon (under NDA, so let's not ask him to be precise or anything :D ), asked this question in another thread:
Deano Jedi Master C said:
I'd advise checking your preconceived notions at the door, because there was me thinking Xenon had more hardware threads than the Cell preferred embodiment :rolleyes:

Xenon may well be much more 'alien' than Cell.

Ask yourself what makes a CPU a CPU and a GPU a GPU. The real answer may surprise you.

Nothing is written in stone; Xenon's internal architecture could deliver more threads than Cell... (It appears that Xenon is using those threads quite well, but that doesn't rule out the possibility that Sony can also address the issues; in fact, it shows that those issues are not impossible to overcome.)

Killer-Kris said:
I don't doubt one bit that the theoretical performance is 1 TFLOP, but odds are that will never be realized in practice.

There's no such thing as sustained "theoretical" performance, and that also goes for single-threaded processors. :D
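To make that concrete: a "theoretical" peak figure is just arithmetic (units x flops per cycle x clock), while what you actually sustain is capped by whatever feeds those units. Here is a hypothetical back-of-the-envelope in C++; every number below is an assumption for illustration, not a PS3 or Xenon spec:

```cpp
// Peak vs. sustained throughput, back-of-the-envelope style.
// All figures are assumed purely for illustration.
#include <cstdio>

int main() {
    const double units          = 32;       // assumed number of vector units
    const double flops_per_unit = 8;        // assumed flops per unit per cycle (e.g. 4-wide FMADD)
    const double clock_hz       = 4.0e9;    // assumed clock

    const double peak = units * flops_per_unit * clock_hz;
    std::printf("theoretical peak: %.3f TFLOPS\n", peak / 1e12);   // ~1.024 TFLOPS

    // If the workload needs 1 byte of memory traffic per flop, sustained
    // throughput is bounded by memory bandwidth, not by the ALUs.
    const double mem_bw_bytes   = 25.6e9;   // assumed bytes/s
    const double flops_per_byte = 1.0;      // assumed arithmetic intensity
    std::printf("bandwidth-bound: %.4f TFLOPS\n", mem_bw_bytes * flops_per_byte / 1e12);
    return 0;
}
```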

Killer-Kris said:
Here's the best thread here on B3D I've found discussing the architecture of PS3 http://www.beyond3d.com/forum/viewtopic.php?t=14418&highlight=playstation

On a side note, the only Cell thread you could find on this forum is one that's two pages long?
There must be tons of 20-page Cell threads with diagrams and conjectures, backed up with complex examples.
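(Back on Killer-Kris's threading-library point, the baseline job any such library has to make painless is splitting one big data-parallel task across all the hardware threads. Below is a minimal illustrative sketch using std::thread; nothing here comes from an actual console SDK, and real game code rarely decomposes this cleanly.)

```cpp
// Hypothetical minimal "threading library" usage: carve one array into
// slices and hand each slice to a hardware thread.
#include <cstddef>
#include <thread>
#include <vector>

void scale_range(float* data, std::size_t begin, std::size_t end, float k) {
    for (std::size_t i = begin; i < end; ++i)
        data[i] *= k;                        // per-element work
}

void scale_parallel(std::vector<float>& data, float k, unsigned num_threads) {
    if (num_threads == 0) num_threads = 1;
    std::vector<std::thread> workers;
    const std::size_t chunk = data.size() / num_threads;
    for (unsigned t = 0; t < num_threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end   = (t + 1 == num_threads) ? data.size() : begin + chunk;
        workers.emplace_back(scale_range, data.data(), begin, end, k);
    }
    for (auto& w : workers) w.join();        // wait for every slice to finish
}
```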
 
OryoN said:
DSN2K said:
I see the PS3 hype certainly got to everyone :?

Tell me about it... I really didn't intend for this thread to be centered almost solely around PS3. I guess it can't be helped. But hey, I'm thankful I got more responses than I hoped for, although I'd like it if we got back on topic. :?

PS3 has the more exotic, and therefore the more debatable, architecture, which explains why people are so vocal about it. Read the opinion of Peter Glaskowsky of the Microprocessor Report, for instance; everybody has something to say about it. (NB: it's the same Microprocessor Report that gave its award for best graphics processor of the year to the NV30, and not to the R300... :LOL: )

OTOH, Xenon looks more traditional in its form. And the Revolution... err... all we know is that it will have, if everything goes right, an IBM CPU and an ATI VPU, so... we have nothing to discuss about it right now.
 
Deano Jedi Master C said:
I'd advise checking your preconceived notions at the door, because there was me thinking Xenon had more hardware threads than the Cell preferred embodiment :rolleyes:

Xenon may well be much more 'alien' than Cell.

Ask yourself what makes a CPU a CPU and a GPU a GPU. The real answer may surprise you.

I can't wait to hear what they are going to say about it in '05.


Cell doesn't compete with the Power5. Comparing the two is like comparing a sports car to a freight-train. Powerful means two completely different things for each.

So if Xenon does have the three Power5 cores, what will it excel at?
 
ERP said:
But this is pure speculation.

Actually it's all pure speculation, since we really know nothing about the exact configuration of the PS3's CPU/GPU.

I'll second that, and third and fourth it if I could - we know nuthin'. :p

Even the 1 TFLOP claim is speculation. :oops: I remember how it happened.

First, KK said in '01/'02 that "PS3 will be 1000 times more powerful!"
Then some obscure source claimed that it would deliver 1 TF. I was never able to confirm its validity, but everyone grabbed on to that and believed it.
Then, as the years passed, KK made statements about CELL tech delivering 'teraflops performance'.
When the first patents were found, they detailed configurations ranging from 128 GF to 1.5 TF.

Looking back at what took place, we never had an official 'PS3=1TF claim'.

So there. :D

As for whether a CELL architecture (OK, the little we can speculate about) is practical, I used to think that this kind of design was not suitable for real-time gaming, for the various reasons mentioned.

Then the GSCube happened.

Unless I am mistaken, there are quite a lot of conceptual similarities between CELL and GSCube. All the doubts about latencies, memory, and efficiency should apply. But the GSCube worked. I dunno how, but it did:
For the SIGGRAPH debut, the high-end computer animation shops Manex Visual Effects and Softimage built a triple-buffered game demo that rivaled Manex's breathtaking work for The Matrix. F/x like this once took an hour per frame to render, but the GScube does it on the fly, delivering 60 frames per second to an HDTV screen.
"We showed a sequence from the Antz movie from PDI/Dreamworks, running in real-time, actual models and animation from the movie," he said. "It looked pretty damned cool: it was the bar fight scene from the movie, and you could navigate fully in 3D."

And GSCube is only credited with a 'humble' ~100GF theoretical peak.

Obviously we can't expect the amount of resources that went into building a GSCube to be channeled into each PS3 (16 GSs with 32 MB of eDRAM each :oops: ). But it does demonstrate that there is probably some way such a design can work. Has anyone had experience with the thing?
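For what it's worth, one common way a box like that can be made to work is plain screen-space partitioning: give each rendering unit its own slice of the frame and stitch the slices back together. The sketch below only illustrates that general idea in C++ (threads standing in for rendering units); it is not a description of how GSCube actually distributed its work.

```cpp
// Illustrative screen-space partitioning: each "unit" fills its own band
// of the framebuffer, so all bands end up in one final image.
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

constexpr int kWidth  = 640;
constexpr int kHeight = 480;
constexpr int kUnits  = 16;                  // assumed number of rendering units

void render_band(std::vector<std::uint32_t>& frame, int y0, int y1, std::uint32_t color) {
    for (int y = y0; y < y1; ++y)
        for (int x = 0; x < kWidth; ++x)
            frame[y * kWidth + x] = color;   // real hardware would rasterize geometry here
}

int main() {
    std::vector<std::uint32_t> frame(kWidth * kHeight, 0);
    std::vector<std::thread> units;
    const int band = kHeight / kUnits;
    for (int u = 0; u < kUnits; ++u) {
        const int y0 = u * band;
        const int y1 = (u + 1 == kUnits) ? kHeight : y0 + band;
        units.emplace_back(render_band, std::ref(frame), y0, y1,
                           0xFF000000u | static_cast<std::uint32_t>(u * 0x101010));
    }
    for (auto& t : units) t.join();          // the "merge" step: all bands share one buffer
    return 0;
}
```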
 
passerby said:
Then the GSCube happened.

Unless I am mistaken, there are quite a lot of conceptual similarities between CELL and GSCube. All the doubts about latencies, memory, and efficiency should apply. But the GSCube worked. I dunno how, but it did:
For the SIGGRAPH debut, the high-end computer animation shops Manex Visual Effects and Softimage built a triple-buffered game demo that rivaled Manex's breathtaking work for The Matrix. F/x like this once took an hour per frame to render, but the GScube does it on the fly, delivering 60 frames per second to an HDTV screen.
"We showed a sequence from the Antz movie from PDI/Dreamworks, running in real-time, actual models and animation from the movie," he said. "It looked pretty damned cool: it was the bar fight scene from the movie, and you could navigate fully in 3D."

And GSCube is only credited with a 'humble' ~100GF theoretical peak.

Obviously we can't expect the amount of resources that went into building a GSCube to be channeled into each PS3 (16 GSs with 32 MB of eDRAM each :oops: ). But it does demonstrate that there is probably some way such a design can work. Has anyone had experience with the thing?

GScube, which is essentially 16 PS2s with 8x the graphics memory each, debuted at SIGGRAPH 2000. If you only take processor speed into account, five years on it's not too wild an assumption that the PlayStation of 2005 has 32x the performance, by Moore's law (in the twilight of that 'law' itself, luckily). With higher-bandwidth interconnects and a smaller process technology, I don't think it's unreasonable to expect the equivalent of a GSCube's resources to be channeled into each PS3.
 
Let me summarize:

Collada is a simple hack done in a week after stealing the RenderWare file format.

OpenMAX, OpenGL ES, and the whole Khronos initiative are overhyped mistakes which are not useful for an embedded custom platform like PlayStation 3 and cannot compete with DirectX anyway.

God himself designed ATI's GPU, and Sony will likely ship 8 months later with a less powerful, less efficient, and less flexible platform which will ride on the PlayStation name alone.

Last but not least, it seems that IBM's geniuses (Kahle, Hofstee, Gschwind, Moore, etc.) designed a shitty architecture with one customer (SCE) and a good one with the other (MS).

One camp is galvanized and the other is seriously demoralized and has little hope.

Have I summarized well enough what we know so far?

Sorry Deano, I am not angry at you (sorry for having lashed out), just that on one side we have developers with SDKs and on the other no one knows shit.

This always happens when a console launches 6-8 months after another: you are 6-8 months behind. Or so I tell myself.


You know what though ?

F*ck it, royally f*ck it... I've heard this s*&t for the past 10+ years regarding Sony's efforts, regarding Sony as a company, regarding SCE and the PlayStation business.

They have not held the market by the balls for the past 10+ years by being idiots, using extremely strong-arm tactics to make up for shitty management and poor technology.

"People love heros, but they love even more to see a hero fall"... I do sense it, I do sense it in the hearts of many people.

There is no malice or evil in it; it is embedded in our nature: it is a sublime spectacle when a titan dies, falls to the ground... the shockwave is so pure and strong.

I am not sure it will be a cakewalk for Microsoft: you are underestimating their opponents.
 
Cool, it seems that IBM designed a shitty architecture with one customer (SCE) and a good one with the other (MS).

Heh. I think IBM designed a good one for both SCE and MS. The difference is ATI designed a great VPU for MS. And no one designed shit for SCE. Except SCE.
 
On topic: I think the next-generation consoles will have their work cut out for them if they are to present the kind of leap in performance over PC graphics that was seen with certain machines in the past.

The year 1996 was the era of the Voodoo 1 and the Nintendo 64, and accomplishments like moving beyond point sampling and using perspective correction were still pretty recent. However, Model 3 was pushing into the millions of polygons per second and using effects like mipmapping with trilinear texture filtering and edge anti-aliasing (multi-layered) in its games at that same time, with polished releases like Scud Race. And its Model 1 and Model 2 predecessors, which dated back years further, outpaced the PC even more, though that was before the boom of consumer 3D accelerators.

At the consumer level, the inexpensive DC came out a short couple of years after Model 3 and went on to lead PC graphics for a very long time too. I wouldn't bet that PC graphics will lag that long behind the next-generation consoles in the upcoming cycle.
 
The year 1996 was the era of the Voodoo 1 and the Nintendo 64, and accomplishments like moving beyond point sampling and using perspective correction were still pretty recent. However, Model 3 was pushing into the millions of polygons per second and using effects like mipmapping with trilinear texture filtering and edge anti-aliasing (multi-layered) in its games at that same time, with polished releases like Scud Race. And its Model 1 and Model 2 predecessors, which dated back years further, outpaced the PC even more, though that was before the boom of consumer 3D accelerators.

This is the only generation where I would say the hardware for the console was more powerful at launch than the PC.

However, it's not very fair to add in the Model 1 and Model 2 systems, nor Model 3, as they were in the hundreds of thousands of dollars. That is like adding in a supercomputer, or saying that in 2002 they had 32-64 9700 chips running together for simulators.



At the consumer level, the inexpensive DC came out a short couple of years after Model 3 and went on to lead PC graphics for a very long time too. I wouldn't bet that PC graphics will lag that long behind the next-generation consoles in the upcoming cycle.

I dunno about that. The DC came out and the GeForce 1 was already out. Judging by what the GeForce finally accomplished, I would say the graphics capabilities were pretty much equal, giving a slight edge to the DC as it was a closed system.

Similar to the PS2 and the GeForce 2s.

A GeForce 2 Ultra, which came out the same year as the PS2, is able to run Doom 3 at low quality at 640x480 at 30 fps, and it looks just as good as any PS2 game.


I think the same will happen this time around, if not more in the PC's favor.


The systems are all going to come out in very late 2005 or 2006. DX Next, or the Windows graphics API, or whatever it's going to be called, should be out by the end of 2006 or early 2007, with DX9 as its base for graphics.

This will make the GeForce FX and R3x0 (perhaps just the R3x0, as the FX is slow as heck) the minimum cards developed for Longhorn games.

So I think we will finally see graphics take a huge step forward on the PC. I would bet on it. Just look at the Unreal 3 engine, which should have its first game in 2006.
 
Ah ah ah, again Sony PlayStation hype!
And they are all for it again! I guess PS3 will be a great success.
Can't you people remember PS2?

Oh boy, and is that Japanese guy crazy or what?
 
Panajev2001a said:
Sorry Deano, I am not angry at you (sorry for having lashed out), just that on one side we have developers with SDKs and on the other no one knows shit.
I'm not saying that Cell won't be revolutionary, and I'm not saying PS3 won't be powerful. What I'm saying is: A) Xenon is revolutionary and powerful, and B) lots is unknown about PS3.

I give Sony more credit than most Cell fans; they WON'T make a crap architecture just to fit some higher vision. I'm doubtful of a GPU heavily based on Cell having good real-world performance.
I expect both Xenon and Cell will be able to make up for any major shortfalls. Due to the flexible architectures of both, they will be able to move processing resources around. Both will have real-world performance in the hundreds of GFLOPS, both will be relatively hard to program, and both are revolutions.

Which will win? Ask in five years time.

I like both, Cell looks to be an awesome CPU, XeGPU looks to be an awesome stream processor. Can I have both in one machine please? :)
 
Lazy8s said:
On topic: I think the next-generation consoles will have their work cut out for them if they are to present the kind of leap in performance over PC graphics that was seen with certain machines in the past.

Definitely plausible.

Lazy8s said:
At the consumer level, the inexpensive DC came out a short couple of years after Model 3

Did it launch at $199? If so, that wasn't a bad launch price for a console, if I recall correctly.

Lazy8s said:
and went on to lead PC graphics for a very long time too.

I have to disagree here since, among other things, you're disregarding resolution (PCs were running at least 800x600 for the average game) and the DC didn't need to even come close to that.
 