nVidia building the PS3 GPU in its "entirety"

Status
Not open for further replies.
pahcman said:
So I ask the Cell lobbyists: do you think, e.g., the Cell PS3, because of its revolutionary design, will absolutely destroy the competition time for time, price for price, until the next big Cell hardware? If not, I've got no disagreement.

Of course it won't, for the vast majority of codes.
It's not even certain that it will be vastly superior in the limited console niche.
However.
It would seem to be a much better thought through platform for high performance media centric computing than x86, particularly when integration levels go 4 times or more beyond the limits of this first implementation. Small wonder, not being the result of decades of evolution of an architecture meant to manipulate bytes in a clerical environment.

Success in the marketplace has little to do with technical merit though, being more related to economic (software) inertia and market manipulation. The cell concept could very well be a success for the PS3 and PS4, but be irrelevant everywhere else. I certainly don't see the x86 market as a whole clamoring for higher media performance, so even if this first attempt turns out wildly successful, it doesn't really give it much leverage in the x86 market as a whole. If, as seems more prudent to assume, the first cell processor is a mixed success with some strengths and some weaknesses, computing life is likely to go on much as we know it for at least another half decade or more.

Realistically though, going forward, who really gives a damn what ISA Word runs on? Clerical computing is not performance critical anymore, and can be emulated just fine if need be. This creates a small opening where an architecture that is strong in those particular areas where performance still matters can rock the boat.

I doubt the world is interested in having its boat rocked though - x86 forever.
 
a bit off topic but Gigabyte is going to launch a pc videocard with dual nvidia gpus on one card

----
The card integrates two Nvidia GeForce 6600 GT graphics processors and is the first 6600 GT card on the market to offer a total of 256 MByte of DDR3 memory and a 256-bit memory interface, according to the manufacturer. The card is cooled by two on-board fans.

The 3D1's two processors communicate through Nvidia's SLI interface and achieved 14,293 points in 3DMark03, sources at Gigabyte said. This would not only be almost twice the performance of a regular 6600 GT card, but also more than ATI's Radeon X850 XT Platinum Edition, which achieved 13,271 points in Gigabyte's test environment, and Nvidia's GeForce 6800 Ultra, which posted 12,680 points.
----

you guys think we will see this kind of customisation on the PS3? multicore GPU
 
Hi, I'm reading B3D's forums now and then and decided to push some more speculations into this thread :D

There are a few reasons Cell could be more than just a "good CPU". Anyway, like most others I'm basing these on assumptions, so don't think I'm talking about anything other than my own interpretations.

The first time I saw the network diagram, I looked over at my rack: holding hi-fi components, a TV, a couple of consoles and other devices, all strung together in a mess of cables, SCART switchers and obscure adapters.
Now think Cell: there would be a single (optical) wire for each device; the only additional thing you may need is a hub or switch. Future devices like pseudo-3D TVs won't need yet another type of cable, there's no restriction on where you route your music/video/whatever, and you could even decide via a TV remote whether you want to play against a pal in splitscreen or on two TVs in different rooms. Let alone the fact that everything is transferred digitally and is therefore free of any noise.
That's my guess at how Sony (and Toshiba) are going to use Cell - it makes perfect sense, as Cell is network-ready and devices could communicate with each other after exchanging "software cells", which tell them how to de/encode the data.

There will of course be "legacy" ports for quite a while, but given good marketing this could certainly be a huge success (and a very good thing IMHO). The only way to get a digital connection to a PS3 would be to get a Cell-ready TV... and then you've already got two of these chips ;). In the end Sony could have its IP in all new media devices.

Given those odds, Cell would certainly be a big thing and a big platform that's here to stay for years - a Cell desktop doubling as a media server for the whole family wouldn't be that unrealistic. OSes are going to be needed anyway, be it for embedded systems or IBM's servers.
Additionally, if Cell has the ability to run multiple OSes at once, the existence of a few Wintel emulators for Unix (good enough for apps at least) could ease the transition.



hey69 said:
you guys think we will see this kind of customisation on the PS3? multicore GPU
Multicore (on one chip) - surely not; instead they would simply double the pipes, which is more efficient than having two GPUs to "talk to".
Two physically separate chips - possible, but unlikely.
 
one said:
Sony doesn't lose money. They only change who they pay for in-house semiconductor needs.

And they will end up paying more for a cell based solution.

Sony will sell HDTV with Cell, Blu-ray HD recorder with Cell, PDA with Cell, HD digicam with Cell, robot with Cell, and on and on.

PDA? we'll see. HDTV? again we'll see. HD digicam? Again we'll see. I'll predict that the vast majority (95%+) of all "cell" devices will be in PS3.

Aaron Spink
speaking for myself inc.
 
Tuttle said:
Wow.

Perhaps it's time for you to actually do some research on Sony, IBM, and Toshiba's plans for the Cell architecture.

Sony = PS3. IBM = licensing and semi revenue. Toshiba = still maintains some relevancy.

Aaron Spink
speaking for myself inc.
 
one said:
:rolleyes:
You should note that it's easier and cheaper to maintain Cell and software that run on it than maintaining several thousands of different chips. Also, note that you can adjust the number of cores and clockspeed freely in Cell.

Not really. See, the other software already exists. Hmm, imagine that, the world gets on just fine without cell...

As far as changing the number of cores freely... If free is several tens of millions of dollars minimum, then yeah, you can do it freely.

Aaron Spink
speaking for myself inc.
 
one said:
:rolleyes:
You should note that it's easier and cheaper to maintain Cell and software that run on it than maintaining several thousands of different chips.

Let's wait for a product to be brought to market based on Cell before we start noting how easy and cheap it is. You're very idealistic.
 
This idea of cell everywhere is a bit overblown. I would have to agree with Aaron that it is well suited for a console but overkill in other applications. The consumer electronics market is very cost sensitive. Unlike consoles the cost (plus profit margin) must be paid for up front. There is no continuing revenue stream which could be used to subsidize the COM.
 
I take issue with a lot of what has been said in this thread.

Correct me if I'm wrong, but I don't recall anyone saying that PS3 is going to replace the PC! That is utterly ridiculous. For Cell to replace the PC, it will take many years. I don't think anyone is claiming otherwise.

And no, Cell is not just a "kick ass console". Sony corporation is investing the billions that it is because they are R&Ding for the future of their entire consumer electronics business. Same goes for Toshiba, who I heard are even supplying GM with cell based processors for use in motor vehicles.

What's the total STI investment? More than $10 billion probably. What kind of fuckwit company spends that kind of money on the CPU of a stupid games console? I cannot believe the relentless stupidity that festers around these boards! Cell probably isn't suitable for general purpose computing, but the scalable architecture sure is suitable for the kinds of applications that Sony and Toshiba are best known for!

Also, who are the morons in this thread who somehow think that PCs aren't moving towards media and entertainment applications? I hate to break it to you, but the "media revolution" already began years ago. Today, even companies like Microsoft, Intel, etc. openly admit that media and entertainment will soon become the key application for PCs. Open your eyes, people. Why do you think the PC industry relentlessly pursues more power? If you try to tell me that it is to speed up your Microsoft Word, I will laugh in your face! Why do you think things like SSE, 3DNow!, MMX and AltiVec have appeared? Do you understand?

It wasn't that long ago that PCs didn't even have FPUs. Now, not only are FPU execution units an important part of the CPU, we see endless optimisations and extensions to improve media performance, even in spite of rapidly improving GPU technology. PCs are constantly required for CGI modeling, gaming, photo viewing, music playing, video/TV watching. Everyone wants entertainment and recreation. This is the media revolution. Deal with it.

Media/entertainment and the PC industry are converging. Microsoft wants to put the PC at the centre of your life. Consumer electronics companies want to do the same with their equipment. The PC will become the "home server" if Microsoft gets its way. If Sony gets its way, your whole house will essentially be your home server. Which will win out? Don't ask me. But the PC and consumer electronics industries are converging, simple as that. Where have you people been?

Microsoft wants everything to come from a PC. They have said this already. They want your videos, photos, games and everything else to be stored on your PC, which then serves the user with "entertainment". They want appliances to feature Windows CE-type operating systems. I thought all of this was common knowledge by now. But consumer electronics companies do not share this view. They want "ubiquitous computing". They want to conserve their own influence on the market that concerns them the most: the home. Companies such as Sony envisage that media and entertainment, along with everything else, will come not from PCs but from TVs, stereos and games consoles. At the very least, companies such as Sony do not believe that a single home server should be PC-based. Why do you think Cell is so important?

Now, I also read people defending the PC paradigm. In the face of the media revolution, the lack of industry unity and the requirement for legacy software support have held back progress in the PC industry. The PC industry is not as fast moving and flexible as some people *cough*DaveBaumann*cough* seem to think. If it was so quick to react, then why are we stuck in some kind of archaic x86 dark ages? This is the media/entertainment era, but x86 is far from multimedia friendly. A common argument I see is "but teh GPU cans do all da work!" And while yes, we will see everything move onto the GPU, why then would I want to fork out multiple hundreds of dollars for a useless x86 processor to sit on my motherboard? I don't need 500 million transistors of Pentium what-the-hell-ever to run Microsoft Word!

We'll be buying x86 and its derivatives for years to come, just so we can subsidise Intel's survival and backwards compatibility with Microsoft software! You have to look at the bigger picture. How should we spend a total overall system budget of 2 billion transistors? On retaining the x86 architecture and excessive scalar capability? Think about this a little harder, people! The PC "paradigm" is neither cheap nor efficient, especially in light of the media revolution. How is that so hard to grasp?
 
x86 with its SIMD extensions is about as good at multimedia as any of the other mainstream instruction sets with their SIMD extensions. IMO x86 would even be a reasonable ISA for a massively parallel processor with simple cores.

I'd much rather have a massively parallel x86 with lots of simple x86+SIMD cores and some vertical multithreading than Cell, personally.
 
ultimate_end said:
And no, Cell is not just a "kick ass console". Sony corporation is investing the billions that it is because they are R&Ding for the future of their entire consumer electronics business. Same goes for Toshiba, who I heard are even supplying GM with cell based processors for use in motor vehicles.

Apparently Sony then wants a short future in the consumer electronics business.

What's the total STI investment? More than $10 billion probably. What kind of fuckwit company spends that kind of money on the CPU of a stupid games console? I cannot believe the relentless stupidity that festers around these boards! Cell probably isn't suitable for general purpose computing, but the scalable architecture sure is suitable for the kinds of applications that Sony and Toshiba are best known for!

10 billion? Likely not. Probably not much more than 1.2 billion at the most. If they have spent more, then it is already a failure.

And if it isn't suitable for the general purpose market, then it will have a hard time paying for itself for use in anything except PS3.

Also, who are the morons in this thread who somehow think that PC's aren't moving towards media and entertainment applications?
No one. The PC already has a large portion of the media and entertainment applications.


Today, even companies like Microsoft, Intel etc openly admit that media and entertainment will soon become the key application for PCs.

No, they believe it is a key *growth* application for PCs. As in PCs will end up taking over most of the market that currently exists.


It wasn't that long ago that PCs didn't even have FPUs

Actually, it WAS quite a long time ago. PCs had FPUs before consoles did.


...from TVs, stereos and games consoles. At the very least, companies such as Sony do not believe that a single home server should be PC-based. Why do you think Cell is so important?

It isn't that important. Just another also-ran on the road of computing progress. Nothing about Cell is any different from the fanism and rah-rah comments about PPC during the AIM era.


Now, I also read people defending the PC paradigm. In the face of the media revolution, the lack of industry unity and the requirement for legacy software support has held back progress in the PC industry.

It's easy to pontificate. It is much harder to give examples. So let's hear them.

The PC industry is not as fast moving and flexible as some people *cough*DaveBaumann*cough* seem to think. If it was so quick to react, then why are we stuck in some kind of archaic x86 dark ages.

What x86 dark ages? There are over a half BILLION x86 users. And all of those users paid real prices, that actually covered the costs of design and development. There are at most 75 million PS2 users and the vast majority didn't pay enough to cover the R&D costs of the hardware.

Maybe, just maybe, x86 is king because people want x86. It certainly isn't from lack of alternatives. The alternatives come and go, but the PC remains. It adapts. It is because it is fast moving and dynamic that it still exists. The last threat, from the "media processors will take over the world" crowd, resulted in significant consumer electronics companies writing off hundreds of millions in development costs when the PC space made a correction and added new functionality. Boom, media processors dead. The same thing happened with the DSP rage era, when people were predicting multiple DSPs in each PC doing the bulk of the work. Now DSPs are a dime a dozen.

Aaron Spink
speaking for myself inc.
 
one said:
ultimate_end said:
Sony behind 3D labs? Judging from patent applications, Sony has been looking very hard at antialiasing technology, and Sony does understand pixel shading (PSP), so it isn't so ridiculous to assume that with enough money and work (they have that research department, remember) the PS3 implementation would have been acceptable. Then there is the SALCS/SALPS theory that would have been interesting in this regard too.

IMO Sony thought it was enough to beat competitors in the handheld space with the Sony in-house PSP GPU. Plus, cost-wise, handheld is tighter.

In the console space, they had to secure a sure win over competitors with the capable graphics technology nVIDIA and ATI demonstrated this gen, especially when in so-called 'this generation' (though with an unfair 2-year interval) PS2 is behind in hardware power. Consumers may wink it away once, but won't twice.

Will it really matter if the PS3's pixel shaders are only PS2.0 equivalent? (Unified Shader Model notwithstanding.) What if the proprietary pixel shading technology actually turned out superior to Xenon's, much like how PS2's VU1 is functionally superior to Xbox's vertex shaders? In any case, why does everyone assume that Sony is so far behind nVidia/ATI? PS2 was designed in an older graphics technology era than Xbox, for ****'s sake!

one said:
Sony might choose nVIDIA's programmable shaders only to lure developers onto the PS3, as a bridge to more avant-garde things available in the PS3. If silicon space permits, they can even cram different graphics paradigms into one die. To save silicon space, Sony needs the optimized implementations of 3D algorithms nVIDIA put into their GPUs. Though they already gambled big money on Cell R&D, as they are confident of winning, at the same time they have to be realistic, without complacence, to continue to be the market leader.

I'm more curious how Xbox 2 turns out in the next month than PS3, as so much PS3 info is being disclosed at this timing. Can those Xbox executives and Nintendo executives sleep well these days, or are they sh*tting their pants? If I were Steve Ballmer, I'd push the Xbox 2 release back to 2006, like the Xbox 1 release was delayed several times to complete its spec, rather than taking a head start with only a half-year window advantage.

Yes, I do think that PS3 will still be a beast. I do not see eye to eye with those who somehow think that nVidia/Sony cannot make a graphics chip to compete with ATI's Xenon chip. That seems ridiculous to me. I agree that Microsoft/ATI are probably concerned by the potential of PS3.

As for what this chip turns out to be, it's just that I keep getting the impression that it won't be anything truly special. Yes I know it doesn't actually matter, but what can I do, I'm a tech freak :oops: .

I predict a chip that is close to 1 billion transistors and is fabbed initially at 90nm and then moved immediately to 65nm (a repeat of the GS, basically). If I hypothetically place an NV50, for example, a sound processor and 64MB of eDRAM on a die, then that would require the chip to have been designed strictly for 65nm, as a 90nm chip would be horrifyingly big. Maybe I am under the wrong impression, but from what has been said through all of this, I still keep thinking that the final chip won't stray too radically from this. But according to my own advice, I will attempt to assume nothing. At least I may be pleasantly surprised when the final chip features considerable SCEI input. I just feel that there is still going to be a lot of Sony ideas/research going to waste.

Lazy8s said:
The design of the PS2 was finished only shortly before launch, like most systems -- even though Sony had originally planned to launch in 1999, some of the chips hadn't been ready and had required a layout to a new process size.

Designing a system is more than locking down the specs; there's also the whole fabrication side that determines what's possible. There's nothing hard about drawing up a system that can't be effectively produced until many years later, and that's really a sign of bad R&D scheduling -- not forward thinking -- since the goal is to have readiness from the target fabrication process and from the feature design of the chip to coincide as closely as possible so that neither part is left waiting around for the other and becoming outdated.

Yes, I in fact know all of this. My original comment was very loose and intended to support an argument that I am making. I should have been more specific about what exactly I was referring to. Now I end up going off on a tangent, questioning when the GS was implemented in final silicon. But as I said before, the final hardware date is irrelevant considering my argument. No matter how long it took for Sony to fine-tune their manufacturing process, how much work it took to implement edge antialiasing, or what compatibility problems they had, the point is that, like the Dreamcast, the basic architecture of PS2 was conceived in a time before pixel shaders and before graphics-chip-integrated T&L. For Sony to suddenly drop all their work and throw an NV15 into the PS2 at the last minute would have been impossible IMO, and it still would not have had any pixel shaders!

I should have brought all this up months ago, but now it surfaces here and I end up involving innocent people such as Megadirive1988, One and yourself.

My point is that Sony is not as retarded as a lot of people like to believe. My point is that Sony has been carrying out graphics research since before companies such as nVidia practically even existed. Why is it so hard for people to believe that Sony might actually be able to create an incredible GPU for PS3 on their own, or that the reasons why Sony went with nVidia might have nothing at all to do with a technical failure on Sony's part? That is my argument to everyone here.

And now it looks as though I have some other posts to write *looks at two previous posts*...
 
ok one quick ramble from me before im off to bed....

I for one believe that Sony-Nvidia together have tremendous potential to design a graphics processor that is better than what ATI-MS can come up with in many ways, and on par in other ways. I don't expect Sony-Nvidia to come out with a graphics processor that is noticeably worse in any way than what ATI-MS have made for Xenon. As far as image quality goes, both graphics processors might end up being about equal.

I expect the Nvidia-Sony chip to have significantly higher internal memory bandwidth. The ATI-MS Xenon graphics processor (R500) is reported to have only 32 GB/sec of internal bandwidth from eDRAM, which is lower than the far, far older GS in PS2, which has 48 GB/sec of eDRAM bandwidth. Of course, the bandwidth for R500 might be much better than what is reported in the supposed leaked document. We'll see.

Given that Xenon's R500 is slated to be on 90nm and the PS3 graphics processor has a good chance of being on 65nm, the potential to have more transistors, and more performance, is very real. If we can agree that Nvidia and ATI are more or less equal, with a few advantages on Nvidia's side and a few advantages on ATI's side, then at the very worst the PS3 and Xenon will have similar quality graphics. But if Sony-Nvidia get to use the 65nm process, and if Sony adds to the PS3 graphics processor all the good things that GS had, only on a next-generation scale, and minus all the bad things about GS (thanks to Sony gaining experience and especially because of what Nvidia can bring to the table with its vast amount of IP and expertise), I think the PS3 graphics processor has a good chance of significantly outperforming Xenon graphics. Maybe not to the same degree that PS2's GS outperformed Dreamcast graphics (remember we're talking raw performance now, not quality, as I already covered quality)... but still enough to notice. I don't claim to know much more than the average joe; I'm just speculating from my own fairly limited tech knowledge.
Well, we're only a few months from knowing much more about PS3, and probably just a couple of weeks from knowing more about Xenon. I predict... this board is gonna go wild :oops:
 
Megadrive1988 said:
...graphics processors might end up being about equal. I expect the Nvidia-Sony chip to have significantly higher internal memory bandwidth. The ATI-MS Xenon graphics processor (R500) is reported to have only 32 GB/sec of internal bandwidth from eDRAM, which is lower than the far, far older GS in PS2, which has 48 GB/sec of eDRAM bandwidth. Of course, the bandwidth for R500 might be much better than what is reported in the supposed leaked document. We'll see.

Ever stop to think that the quoted number for the GS was only imaginary?

An example... According to rumor, the Xenon GPU can complete up to 8 pixels per cycle, with each pixel being 4B of color and 4B of Z for a total of 8B. 8 x 8 = 64. Clock speed is rumored at 500 MHz. 64 x 0.5 = 32 GB/s of required memory bandwidth, which is what is quoted in the purported leaked diagrams. Marketing could easily quote a ridiculous 200 GB/s number and be truthful but deceitful.

It doesn't matter how much memory bandwidth you have as long as it is sufficient. Anything else is waste.
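The arithmetic above is easy to sanity-check. A minimal sketch of the calculation (the 8 pixels/clock, 4 B color + 4 B Z, and 500 MHz figures are the rumored Xenon numbers from the post, not confirmed specs):

```python
def required_bandwidth_gb_s(pixels_per_clock, bytes_per_pixel, clock_mhz):
    """Peak bytes the pixel pipes can write per second, in GB/s (1 GB = 10**9 B)."""
    return pixels_per_clock * bytes_per_pixel * clock_mhz * 1e6 / 1e9

# Rumored Xenon figures: 8 pixels/clock x (4 B color + 4 B Z) x 500 MHz
print(required_bandwidth_gb_s(8, 4 + 4, 500))  # -> 32.0
```

Which is exactly the 32 GB/s in the purported leaked diagrams: the eDRAM bandwidth was sized to match what the pipes can consume, no more.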

Aaron Spink
speaking for myself inc.
 
In any case, why does everyone assume that Sony is so far behind nVidia/ATI? PS2 was designed in an older graphics technology era than Xbox for ****s sake!



The logical answer would be that if they had something better, they would use it in the PS3 and not go running to nVidia at the last second for a GPU. Also consider that the GS of the PS2 has about as many features as a Voodoo 1.
 
quest55720 said:
In any case, why does everyone assume that Sony is so far behind nVidia/ATI? PS2 was designed in an older graphics technology era than Xbox for ****s sake!



The logical answer would be that if they had something better, they would use it in the PS3 and not go running to nVidia at the last second for a GPU. Also consider that the GS of the PS2 has about as many features as a Voodoo 1.
Right, it had the features of a Voodoo 1 when GeForce 2s were hitting the market.

The only advantage the GS had was its fillrate.
 
MfA said:
x86 with its SIMD extensions is about as good at multimedia as any of the other mainstream instruction sets with their SIMD extensions. IMO x86 would even be a reasonable ISA for a massively parallel processor with simple cores.

I'd much rather have a massively parallel x86 with lots of simple x86+SIMD cores and some vertical multithreading than Cell, personally.

Are you suggesting that such a thing would be economically viable? I think you are. What are your reasons for preferring this to Cell?

aaronspink said:
ultimate_end said:
And no, Cell is not just a "kick ass console". Sony corporation is investing the billions that it is because they are R&Ding for the future of their entire consumer electronics business. Same goes for Toshiba, who I heard are even supplying GM with cell based processors for use in motor vehicles.

Apparently Sony then wants a short future in the consumer electronics business.

I didn't mean their entire future. If I meant that I would have said that.

aaronspink said:
ultimate_end said:
What's the total STI investment? More than $10 billion probably. What kind of fuckwit company spends that kind of money on the CPU of a stupid games console? I cannot believe the relentless stupidity that festers around these boards! Cell probably isn't suitable for general purpose computing, but the scalable architecture sure is suitable for the kinds of applications that Sony and Toshiba are best known for.

10 billion? Likely not. Probably not much more than 1.2 billion at the most. If they have spent more, then it is already a failure.

And if it isn't suitable for the general purpose market, then it will have a hard time paying for itself for use in anything except PS3.

I am including Sony's investment in fabs as well, because Sony has stated that these 65nm fabs will be used only for Cell, IIRC. I am also referring to the investment that all the STI partners have made, as everyone appears to assume that somehow the total spent by all three companies is all for PS3, and that somehow Sony has orchestrated this entire project because they want a "kick ass console".

By "general purpose" I am referring to PC-style general purpose computing, and I speak relatively, because according to the patents, Cell is intended to have integer/floating-point and scalar/SIMD capability, which means that at high clock speeds Cell will actually perform extremely well in such applications as word processing etc. I should have been more careful in my wording.

aaronspink said:
Also, who are the morons in this thread who somehow think that PC's aren't moving towards media and entertainment applications?

No one. The PC already has a large portion of the media and entertainment applications.

And that means exactly what?

As for my comments, again, my phrasing was sub-par. That's what you get when you need sleep, which also explains the ranting nature of my posts. But I am referring to the couple of comments such as this one by Entropy:

"I certainly don't see the x86 market as a whole clamoring for higher media performance"

Which is completely untrue.

aaronspink said:
It wasn't that long ago that PCs didn't even have FPUs

Actually, it WAS quite a long time ago. PCs had FPUs before consoles did.

What have Consoles got to do with this?

As for the PC's FPU, well, in the overall history of the personal computer, I don't see it as such a long time. More importantly, if we think of media/entertainment as being the current "era", then the fact that the PC FPU also belongs in this timeframe suggests "recentness". I am merely trying to highlight the rise of entertainment/media importance in the PC. Frankly, I find your comment to be nit-picking.

aaronspink said:
t from TVs, sterios and games consoles. At the very least, companies such as Sony do not believe that a single home server should be PC-based. Why do think Cell is so important?

It isn't that important. Just another also-ran on the road of computing progress. Nothing about Cell is any different from the fanism and rah-rah comments about PPC during the AIM era.

You need to improve your reading comprehension buddy. I meant important to "companies such as Sony".

I agree that Cell is just a step on the road of computing progress; I never claimed otherwise. I fail to see, however, how you can compare Cell to PPC, which was effectively nothing more than a simple x86 alternative for desktop computing.

aaronspink said:
Now, I also read people defending the PC paradigm. In the face of the media revolution, the lack of industry unity and the requirement for legacy software support has held back progress in the PC industry.

Its easy to pontificate. It is much harder to give examples. So lets hear them.

LOL. You do realise that quite a lot of CPU transistors are dedicated to x86 legacy support, don't you? In the past, when overall transistor budgets were small, this was a very big hindrance to performance. Think of all those transistors that could have been used to increase performance. That's just the beginning, my friend. Bottom line? Microsoft dictates the PC industry. The majority of consumers use Microsoft applications and Microsoft operating systems. Because of Microsoft, each iteration of CPU has to be backwards compatible and therefore very similar to the previous one. This has prevented innovation in moving away from the tired old x86 architecture. The same thing also applies to the overall layout and architecture of the PC. If you can't see it, then you are blind.

As for industry unity, I was referring mainly to the graphics card industry (even though, in the PC industry, most things take forever to gain approval, which slows progress). According to Dave, it is the graphics card companies who dictate to Microsoft what direction to move the industry in. It is then up to Microsoft to balance the opposing forces and introduce the next DirectX. Any mediation process takes no inconsiderable time, and this is where progress is limited. Then there are industry standards to fight over, such as PCI Express for example. That took quite a while, IIRC. These things take the time that they do purely because there are so many opposing forces in the industry. People like you label anyone with a different perspective as "idealistic", but so are the people who seem to have delusions that the PC industry is some kind of harmonious utopia, which could not actually be possible for anything that consists entirely of countless self-concerned parties.

aaronspink said:
What x86 dark ages? There are over a half BILLION x86 users. And all of those users paid real prices that actually covered the costs of design and development. There are at most 75 million PS2 users, and the vast majority didn't pay enough to cover the R&D costs of the hardware.

You have made good comments so far, but now you just sound ridiculous.

Over a half billion users? If you read my comments and those of others on this thread, then this is actually irrelevant. I would also like to add that just because something is common does not mean that it is good. This is something that everyone should learn at an early age, but maybe you haven't reached that point yet.

Yes, we all paid real prices. Real prices that are inflated so that every single one of a thousand bickering companies can make healthy profits. You are quite comical, as you seem to think that you are actually seeing some kind of value in your PC! A PC in which you are paying for a decade and a half of x86 backwards compatibility. A PC that you will have to upgrade in 6 months because of the next incremental rise in CPU performance that Intel, AMD and whoever else decides to release, because they need to sustain their income in a market that will remain essentially unchanged for years to come.

aaronspink said:
Maybe, just maybe, x86 is king because people want x86. It certainly isn't from lack of alternatives. The alternatives come and go, but the PC remains. It adapts. It is because it is fast moving and dynamic that it still exists. The last threat, from the "media processors will take over the world" crowd, resulted in significant consumer electronics companies writing off hundreds of millions in development costs when the PC space made a correction and added new functionality. Boom, media processors dead. The same thing happened with the DSP rage era, when people were predicting multiple DSPs in each PC doing the bulk of the work. Now DSPs are a dime a dozen.

Yeah, like consumers actually care whether x86 powers their computer or not. Consumers are brand loyal to Intel, AMD, NVIDIA, ATI and Microsoft. They are not brand loyal to x86! Again, read the comments made and you will realise that the iron grip x86 has on the market is not attributable to some kind of imaginary superior technology or mass appeal. The PC industry is a self-perpetuating behemoth. That, my friend, is what will prevent Cell from taking over your daily computing. You are completely deluded if you believe that the few challenges to the PC industry in the past failed because the PC is the only realistic alternative.

In any case, the PC era as we know it is almost at an end. The question is, who will win out as the home server king and evolve from there? Will it be the PC? Don't be surprised if Microsoft loses the battle to take over your life. If Microsoft loses, it will be the beginning of the end for your beloved PC. Everything dies, aaronspink... everything dies.
 
CELL living room? :LOL:

Get back to me when Matsushita, Pioneer, Mitsubishi, Sanyo, Yamaha, Denon, JVC, Hitachi, Marantz, Runco, etc. support CELL...
 
quest55720 said:
In any case, why does everyone assume that Sony is so far behind nVidia/ATI? PS2 was designed in an older graphics technology era than Xbox for ****s sake!



The logical answer would be that if they had something better, they would use it in the PS3 and not go running to nVidia at the last second for a GPU. Also consider that the GS of the PS2 has about as many features as a Voodoo 1.

WTF? Go back and read everything that I have posted in this thread. Maybe then you might be in position to comment. :rolleyes:
 