Look at this Google-cached (pulled down) PlayStation 3 page

Crazyace, would you have liked to see the SCU's DSP run at full clock speed (not ~half the clock of the SH-2s, as it is now)?

I think it would have provided an incentive to push it and coordinate it with the twin SH-2s...

If I had my way, I would have had the SCU's DSP as a coprocessor of the slave SH-2, sort of like the GTE is for the PSOne's R3000A...
 
Not really,

It didn't have a divide instruction - which complicates things if you want to use it for geometry transformation...
I tended to find it easier to use the SH2 for the full geometry transform and culling - the DSP could have been used for lighting, but I wasn't implementing a lit renderer, just a prelit 3D world...
 
If it were a full-speed coprocessor for the SH-2, wouldn't it have been better to do matrix addition, multiplication, and subtraction on the DSP and division on the SH2?
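For what it's worth, here is roughly the split being suggested, sketched in plain C purely for illustration (none of this is real SCU DSP or SH-2 code, and all the names are made up). The 3x4 transform is nothing but multiplies and adds, which a divide-less DSP could chew through; the perspective divide is the one step that genuinely needs a divide, so it would stay on the SH-2:

#include <stdio.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float m[3][4]; } Mat34;   /* rotation + translation */

/* The part a divide-less DSP could do: multiply/accumulate only. */
static Vec3 transform_point(const Mat34 *m, Vec3 v)
{
    Vec3 r;
    r.x = m->m[0][0]*v.x + m->m[0][1]*v.y + m->m[0][2]*v.z + m->m[0][3];
    r.y = m->m[1][0]*v.x + m->m[1][1]*v.y + m->m[1][2]*v.z + m->m[1][3];
    r.z = m->m[2][0]*v.x + m->m[2][1]*v.y + m->m[2][2]*v.z + m->m[2][3];
    return r;
}

/* The part that needs a divide, so it stays on the SH-2. */
static void project_point(Vec3 cam, float focal, float *sx, float *sy)
{
    float s = focal / cam.z;    /* the one unavoidable divide */
    *sx = cam.x * s;
    *sy = cam.y * s;
}

int main(void)
{
    Mat34 world = {{ {1,0,0,0}, {0,1,0,0}, {0,0,1,5} }};  /* push 5 units into the screen */
    Vec3  v = { 1.0f, 2.0f, 3.0f };
    float sx, sy;

    Vec3 cam = transform_point(&world, v);
    project_point(cam, 256.0f, &sx, &sy);
    printf("screen: %.1f, %.1f\n", sx, sy);
    return 0;
}

Presumably you would batch whole vertex lists through the multiply/accumulate half and then let the SH-2 sweep through the divides afterwards.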
 
...

The incentive is a scalable architecture that can be used across all of a company's home electronics.
CELL in $99 DVD players and $200 TV sets? Give me a break.

Because Sony's intention is to make the Sony Group into a broadband-connected commodity that is self-feeding and self-sustaining. Sony Pictures is the hottest film producer in Hollywood and holds enormous resources of digitized TV shows, movies, etc. Sony Music is huge in itself....
Other manufacturers don't own Hollywood studios and music labels, and they don't benefit from such integration. They simply sell cheap TV sets, DVD players, and stereos for a living. What consumers do with their product after the sale is none of their business.

They, unlike any other company in existence, not only produce the digital media at the front end but also sell the back-end products.
Exactly, other companies have no incentive to support CELL for the above reasons. Why empower your competitor?

This is unprecedented if it's pulled off.
Can't be pulled off.

No other company is in this position to control the entire spectrum of media transmission, nor has one been since the invention of the printing press. Not even AOL-TimeWarner had this potential in its heyday.
Sony doesn't own Hollywood. Sony is just another player in the consumer electronics market. Sony is not Microsoft.

People buy commodities. Sony is a commodity, and it's about to increase this many-fold.
Are you a Sony employee or a shareholder??? I have not met a fanboy as passionate as you in years.

Well, I'll tell you that you're quick to judge and ignorant of the industry's desire for cross-platform standards for electronics. Sony is already doing great with their Linux derivative being designed by Sony, Samsung, Philips, LG Electronics, etc.
Linux is open and free, CELL is not.

The industry wants to sell their product.
Yap. Cheap products. Every penny counts in cost-cutting, and sticking a $200 processor in a $99 DVD player is counterproductive.

People want their products to be simpler to use together (e.g., anti-Microsoft).
What's wrong with today's interfaces? How can they be made any simpler, until TV sets start taking your voice commands?

and they want pervasive computing and always-connected devices.
People don't even know what pervasive computing is.

In fact, the patent shows examples of chips based on the Cell architecture (which is what it is - an architecture, not a specific chip) that are under 1/5th the size of the BE.
Still too large. ARM measures 3~30 mm2.

And they can scale lower if necessary, although for the 2005 window this is great.
Can never beat ARM.

You don't know this. You don't know anything about the architecture and especially the software programming of it.
Of course I know what Kutaragi is up to; I was there myself 3 years ago. Eventually I realized it was not feasible due to programming complexity and moved on to something better. Why do you think I keep repeating the term "Auto parallelism"? Because it is possible.
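To make the distinction concrete, here is a minimal sketch of what I mean by "auto parallelism", with OpenMP standing in purely as an illustration (I am not claiming this is what CELL's tools will look like): the programmer writes one ordinary serial loop plus one hint, and the compiler/runtime spreads the iterations over however many cores exist.

/* Build with any OpenMP-capable C compiler; without OpenMP the pragma is
   simply ignored and the loop runs serially, which is exactly the point. */
#include <stdio.h>

#define N 1000000

int main(void)
{
    static float a[N], b[N], c[N];

    for (int i = 0; i < N; i++) {          /* dummy data */
        a[i] = (float)i;
        b[i] = (float)(N - i);
    }

    /* The "auto" way: one hint, no explicit threads, no hand-partitioning. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        c[i] = a[i] * 2.0f + b[i];

    /* The "code slaves figure it out" way would instead mean splitting this
       loop into per-processor chunks yourself, feeding each chunk to its own
       processing element, and synchronizing everything at the end. */

    printf("c[123] = %f\n", c[123]);
    return 0;
}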

I don't know the details, so I can be sure as hell that you don't either.
Ha Ha Ha....

I feel that the XBox Next and Nintendo 5 must be so much harder to program, because their performance is so much greater and the effort to extract this performance must scale linearly.
Nintendo is out of the console hardware race. Xbox2 will have a 6 Ghz X86 with some kind of nVIDIA or ATI GPU with 8~16 shaders, and this hardware will outperform PSX3; this much isn't very hard to figure out.

To Fafalada

But VU1 IS the geometry/T&L processor in PS2.
It's both easier to use for that purpose and faster at it than VU0 (which should be obvious, since VU0 wasn't designed to function as a T&L unit to begin with).
When Sega ran out of horsepower with SH-2, it stuck in another.
When Matsushita ran out of horsepower with PPC602, it stuck in another.
When Kutaragi ran out of horsepower with VU0, he stuck in another.

The EE architecture speaks for itself: VU1's functionality is duplicated by VU0, and VU1 is not particularly tied to the CPU, living as just another device on the bus.

To anyone with ANY real-world usage experience it's painfully clear VU0 was the worse-thought-out part of the two; however, architecturally VU0 is even less likely to be an afterthought, since it's actually coupled as a coprocessor to the R59k and extends the CPU instruction set, as well as working as a standalone unit.
This is why I know VU1 was an add-on at a later date.

However, the most efficient use tended to be with locked D$, so the second SH2 could run geometry and animation calculations while the first worked on game logic.
That was the very purpose of its addition.
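For anyone who hasn't done that kind of split, here is a minimal sketch of the pattern, with pthreads standing in for the two SH-2s purely for illustration (on the real hardware you would start the slave SH-2 through the BIOS/SMPC and hand it a buffer in locked cache or work RAM, not spawn threads): the master runs game logic one frame ahead, the slave chews through geometry/animation for the previous frame's object list, and a double buffer keeps them out of each other's way.

/* Link with your platform's pthread library. */
#include <pthread.h>
#include <stdio.h>

#define NUM_OBJECTS 64
#define NUM_FRAMES  100

typedef struct { float x, y, z; } Object;

static Object frame_buf[2][NUM_OBJECTS];   /* double-buffered object list */

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  cond = PTHREAD_COND_INITIALIZER;
static int ready_frame = -1;   /* latest frame handed to the "slave" */
static int done_frame  = -1;   /* latest frame the "slave" has finished */

/* "Slave SH-2": geometry/animation work on the frame the master handed over. */
static void *slave_cpu(void *arg)
{
    (void)arg;
    for (int f = 0; f < NUM_FRAMES; f++) {
        pthread_mutex_lock(&lock);
        while (ready_frame < f)
            pthread_cond_wait(&cond, &lock);
        pthread_mutex_unlock(&lock);

        Object *objs = frame_buf[f & 1];
        for (int i = 0; i < NUM_OBJECTS; i++)   /* stand-in for transform/animation */
            objs[i].z += 1.0f;

        pthread_mutex_lock(&lock);
        done_frame = f;
        pthread_cond_broadcast(&cond);
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t slave;
    pthread_create(&slave, NULL, slave_cpu, NULL);

    /* "Master SH-2": game logic, running one frame ahead of the slave. */
    for (int f = 0; f < NUM_FRAMES; f++) {
        Object *objs = frame_buf[f & 1];
        for (int i = 0; i < NUM_OBJECTS; i++) {   /* stand-in for game logic */
            objs[i].x = (float)i;
            objs[i].y = (float)f;
            objs[i].z = 0.0f;
        }

        pthread_mutex_lock(&lock);
        while (done_frame < f - 1)     /* let the slave stay at most one frame behind */
            pthread_cond_wait(&cond, &lock);
        ready_frame = f;
        pthread_cond_broadcast(&cond);
        pthread_mutex_unlock(&lock);
    }

    pthread_join(slave, NULL);
    printf("ran %d frames across both \"CPUs\"\n", NUM_FRAMES);
    return 0;
}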
 
This might not make a whole lot of sense for this discussion, but when Lockheed Martin was making Model 3, they probably started out with only one Real3D-Pro/1000.

That produced 750,000 textured pps with all effects on, so they added a second one to give Model 3 its 1.5 Mpps peak / 1 Mpps sustained performance :p
 
Nintendo is out of the console hardware race. Xbox2 will have a 6 Ghz X86 with some kind of nVIDIA or ATI GPU with 8~16 shaders, and this hardware will outperform PSX3; this much isn't very hard to figure out.

A couple hundred people at SCEI, IBM and Toshiba intend to have the last laugh.
 
Deadmeat, about Xbox 2 having a 6Ghz chip... bullshit.

MS will undoubtedly be going with Intel, right?

The only 6Ghz chip around during the time frame of 2005 is the Intel Tejas, which starts off at around 4.5 Ghz and gets the axe at 9.

Just like its brother, Xbox 2 will not be using a top-of-the-line CPU.

Look at all the cash MS is losing on Xbox; they aren't ever going to make money on it. They didn't even use a top-of-the-line CPU for the Xbox either - are you saying that MS will take an even greater loss with Xbox 2?

When does MS make money in all this? It will be 10 years in a row of losses, since they don't have any control over the chips, and Intel and Nvidia can make them pay through the nose. I am afraid this is where Sony has MS beat: they have control over their chips; MS does not.

Top of the line 6Ghz Tejas chips aren't going to come cheap.

Oh, and please - Nintendo out of the hardware market? Unlike their competitor, they are making money; as long as there are little kids playing video games, Nintendo will have a market to sell their hardware and software in.
 
Nintendo would not be vigorously preparing software for a machine they never intend to release. I would expect Nintendo to drastically reposition themselves in the market only if their next machine is less successful than the GC.
 
Re: ...

DeadmeatGA said:
You end up with the worst looking, blurry 1600x1200 image evah
1600x1200 is always better than 800x600. You see more detail than 800x600, even if 3/4 of the pixels were artificially created.

OK, I'm game. SHOW ME ONE. Post it! Show me how a compressed 800x600 image can be upscanned to 1600x1200 and still look presentable, let alone better (not just different ) than the original 800x600. POST IT RIGHT FRICKEN HERE! ...and why just 1600x1200? Why not prove your point more directly (and proportionally) by citing that such improvement can be demonstrated by upscanning all the way to 3200x2400? Yes, I truly look forward to seeing a heavily jpeg'd 800x600 image blown up to 3200x2400 and then doctored to actually look better than the original...

ARM is also a general purpose processor.

Your penchant for misrepresentation is admirable (as far as misrepresentation can be admired, that is). Nowhere did I say that ARM was not a general purpose processor. What was even your point with that statement??? Just to post something? Cell can be scaled to serve as a general purpose processor, a DTV decoder, a full-bore gaming console - whatever you need, whatever price point you want to hit. Normal people would see that as a valuable quality. What can your ARM do? (not saying it isn't a versatile piece, as well, but just what can it do that Cell cannot?) Who would turn down more power at comparable cost? They'll just think of ways to use the extra power. Duh!

ARM has been doing that just fine all these years.

Nowhere has it been said that it hasn't. What's your point? Nowhere has it been said that Cell needs to beat ARM, either. It only needs to possess the same functionality at a reasonable price point, and that will make it a plausible candidate to serve in the same applications. Why is this so hard to compute in your head?

Why use CELL when ARM costs $1 a piece and is proven to work? Do you need 1 Telaflop to change channels? Why pay $200, and even more expensive software suites??? It is not logical.

You assume that Cell can only exist as is predicted for PS3, in its 1 TeraFLOP glory? ("Telaflop"? Do you write for the Onion, or something? Is this some sort of metric for Telatubbies, maybe?) If it is to be used in an ARM-esque application, one would naturally assume it would be scaled down to comparable performance parity so as to achieve the best cost benefit. ...course if you are making millions of them at smaller and smaller process sizes, maybe you won't have to scale down the architecture much at all to compete at a certain price point.

I welcome competition and a new paradigm. But the new paradigm must be auto-parallelism, not the "We throw a bunch of processors on a die for massive marketing hype; now you code slaves figure out how to work this thing" kind of paradigm.

Don't be offended if we just take your assessment of the future as an opinion and not the infallible judgement of God, himself. Perhaps you end up wrong, and your dire prediction of the future turns out to be utterly inconsequential? I fully admit, Cell could go either way. I'm certainly interested to see how it turns out, instead of wallowing in a cloud of FUD to placate my insecurities.

Was the ARM developed for distributed/networked scenarios?
Is CELL, then??? CELL does not do networking automatically; developers have to code the networking in themselves. You can do networking more cheaply and more reliably on a proven ARM platform than on an unproven and beastly complex CELL.

Can you just answer the question, instead of just introducing another question as a form of diversion? Maybe you should let Intel know that they should just stick with ARM for all future CPU's, instead of wasting time coming up with a P5?

I can fully see you are just trolling, judging by the quality of the counter-arguments you have presented here. I'm surprised you haven't been erased from these boards yet, but maybe there's hope if the powers-that-be would just get on the ball.
 
Microsoft doesn't need a super-high clock speed (like 6 Ghz) for XBox2's CPU. What they need is parallelism, like PS3's version of CELL.

I'm hoping for a multicore AMD or Intel CPU, but even so, a 2-4 core AMD or Intel CPU won't come even close to 1 TFLOP. That's why the lion's share of XBox2's computational power will come from its nVidia or ATi GPU(s) or VPU(s).
 
You have touched upon a very familiar argument utilized by those enslaved in the Xbox camp. What motivation is there to say Xbox will have a "6 Ghz CPU" other than to perpetuate the "my specs are better than yours" rivalry? In the same breath they will swear up and down that the "PC way" is the "correct way" with lots of hardware features and power made possible in the GPU. So which is it? They can't pick a way. Either it will benefit from a monster CPU or it will rely heavily on the GPU and use a more moderate CPU to drive the game. If it is the latter, there simply is no need for an "uber 6 Ghz CPU". ...but they could not be happy with their MS next gen console if it isn't sporting the baddest CPU on the block, right? Even then, what good is it to attach value to the term "6 Ghz"? If it is only good for 40 or 50 GFLOPs, that certainly doesn't look impressive against 1000 GFLOPs. What does that say? Is it about the games? No. Is it about bragging about dick sizes? Yes. There- I said it. ...and they say the Sony breed is an unusual mindset...
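To put a rough number on that claim (back-of-the-envelope only; the 8 FLOPs/cycle figure is just an assumption about what an SSE-class x86 core might retire per clock):

6 GHz x 8 FLOPs/cycle = 48 GFLOPs peak

...which is where a "40 or 50 GFLOPs" ballpark comes from, set against the 1000 GFLOPs figure being thrown around for Cell.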
 
In a console environment, where things can go 'to the metal' with little overhead, would HyperThreading (or Intel's upcoming 'improved' HT) be a Good Thing (tm)?

I know HT currently sort of sucks because of resource conflicts and Windows overhead, but in a console that wouldn't be a factor.
 
...

A couple hundred people at SCEI, IBM and Toshiba intend to have the last laugh.
There has never been a case of a new console unable to outperform the ones predating it. Whatever comes out after PSX3 will outperform it; it is a fact.

To Paul

MS will undoubtedly be going with Intel, right?

The only 6Ghz chip around during the time frame of 2005 is the Intel Tejas, which starts off at around 4.5 Ghz and gets the axe at 9.
If PSX3 comes out in 2005 in Japan, it will hit the US the next year to launch alongside Xbox2, right? Is 6 Ghz that far-fetched for the 2006 holiday season???

Just like its brother, Xbox 2 will not be using a top-of-the-line CPU.
Of course not. The top end should be hitting 9~10 Ghz.

Look at all the cash MS is losing on Xbox; they aren't ever going to make money on it.
Maybe that is not the primary goal of Xbox, at least not right away. Just like the goal of Internet Explorer was not to make money.

Top of the line 6Ghz Tejas chips aren't going to come cheap.
6 Ghz is not the top of the line in 2006.

Unlike their competitor, they are making money; as long as there are little kids playing video games, Nintendo will have a market to sell their hardware and software in.
Nintendo doesn't have the resources to compete in the console hardware biz anymore; they have to defend their GBA business from the PSP soon.

To randycat99

OK, I'm game. SHOW ME ONE. Post it! Show me how a compressed 800x600 image can be upscanned to 1600x1200 and still look presentable, let alone better (not just different ) than the original 800x600. POST IT RIGHT FRICKEN HERE!

fl_orig.png

Original

fl_nedi.png

Triple resolution and interpolated.

Cell can be scaled to serve as a general purpose processor, a DTV decoder, a full-bore gaming console - whatever you need, whatever price point you want to hit.
You fail to answer my question. Why would other manufacturers use CELL when dedicated hardware solutions are cheaper, run cooler, and are easier to code software for??? CELL was designed for only one purpose: to render graphics. It is just not a competitive solution for anything else.

What can your ARM do? (not saying it isn't a versatile piece, as well, but just what can it do that Cell cannot?)
It meets the developer requirements at the right price point and packaging. Why do you think billions of 16-bit processors are still in production annually??? Because 16-bit processors are all you need for certain applications, and even an ARM would be considered overkill. Try to see the big overall picture.

Who would turn down more power at comparable cost?
When $5~10 solutions deliver sufficient performance and power consumption.

Nowhere has it been said that Cell needs to beat ARM, either.
ARM rules in consumer electronic applications and CELL needs to kick ARM out to make your fantasy world of CELLtized TV sets and DVD players a reality.

If it is to be used in an ARM-esque application, one would naturally assume it would be scaled down to comparable performance parity so as to achieve the best cost benefit. ...
It is easy to develop for ARM. (Many schools use ARM to teach ASM to their students.) Even the most scaled-down CELL will still dwarf the Emotion Engine in die size, suck up tons of power, and still be the same beast to code for.

course if you are making millions of them at smaller and smaller process sizes, maybe you won't have to scale down the architecture much at all to compete at a certain price point.
That will only happen if Sony opens up CELL to everyone, has dozens of second sources paying a royalty of only 20 cents per chip, allows 3rd-party architectural modifications, and finds some guru to make "auto parallelization" work.

Of course this will never happen.

What motivation is there to say Xbox will have a "6 Ghz CPU" other than to perpetuate the "my specs are better than yours" rivalry?
A 6 Ghz P5 can actually be programmed by average coders, whereas the same is not certain for CELL.

If it is only good for 40 or 50 GFLOPs, that certainly doesn't look impressive against 1000 GFLOPs.
What makes you think CELL will do a teraflop in the real world???

To Vince

We already stated that Cell is an architecture, not a specific chip. Just as MIPS cores scale from the absolute bottom end of the market to the top - so can Cell. Just look at its design; its modularity is apparent.
Call me when CELL reaches a $1 price point and is made available as licensable source code to everyone.

But you'd have to be clueless to assume that there isn't a large market in industrialized nations that buys high-end equipment and enjoys these services.
If the US market is any indication, there isn't.

It will work its way down eventually; in what incarnation, I don't know or care.
You should.

Why license Trinitron out?
The vast majority of screens are not Trinitron-based.

Because acceptance is worth it.
So what do other vendors gain by accepting CELL?

Because it creates demand for your licensed product.
There has never been a market for "licensed" products, since the "licensed" product always ends up costing more than the original: the original vendor is willing to lose money to increase market share, whereas the licensee cannot afford to lose money. The playing field is not even.

This is why 3DO, Saturn clones, and Panasonic Q failed.

If you can either lose the sale of, say, a TV with a competitor's technology and keep yours proprietary, or sell the high-profit technology and forgo selling what will ultimately be a lower-profit device - why not?
Because the bread and butter of the consumer electronics industry is low-priced, mass-market stuff.

Sony is probably close to being the hottest film studio in Hollywood.
Does Sony have a 60% film market share needed to influence the consumer purchase decision?

I just googled and they own: Columbia Tri-Star, Columbia Pictures, Tri-Star Pictures, Jim Henson Productions (partial interest), Mandalay Entertainment (partial interest), Phoenix Pictures (partial interest), Sony Pictures Classics, Sony Pictures Entertainment, Columbia-Tri Star Home Video.
Yea, Sony does Charlie's Angels while AOL-Time Warner does The Matrix Reloaded and Terminator 3.

And of course they're not Microsoft; Microsoft is one in a billion.
Only Microsoft can enforce its proprietary technology as a market standard. Sony cannot.

Um, point?
CELL is not free and actually very expensive.

Ask Apple. The idea is that once you plug it in you should never have to mess with it again. Kind of like using a Sony Camera (iLINK)/Mac or a Memory Stick in a Sony TV/VAIO.
The TV and DVD user interfaces I see are text- and simple-graphics-based and work just fine. No need to fix what is not broken.

No, but they know how to get the Internet on their Cellphone.
That cellphone is ARM powered.
 
Deadmeat, you posted two pics of similar resolution, and not what was requested, i.e. one 800*600 image versus the blown-up, interpolated 1600*1200.

In addition, you linearly scaled the first pic, producing an unfair comparison.

Either you are unaware of this, or we've crossed wires and are arguing about dissimilar things. Care to clarify a little?
 
Re: ...

DeadmeatGA said:
The vast majority of screens are not Trinitron-based.

Get out much? Gone shopping lately? :rolleyes:

This is why 3DO, Saturn clones, and Panasonic Q failed.

What does this have to do with home electronics?

Because the bread and butter of the consumer electronics industry is low-priced, mass-market stuff.

Um, no. Well, perhaps if you're based in China and utilizing cheap labor and parts.

The "bread-and-butter" is high-end products that have commodity value.

Does Sony have a 60% film market share needed to influence the consumer purchase decision?

I don't know what it is, but you'd be insane to not see the large media empire they control.

Yea, Sony does Charlie's Angels while AOL-Time Warner does The Matrix Reloaded and Terminator 3.

Yeah, they've had such a bad last few years.... Spider-Man, Charlie's Angels, Men in Black, Crouching Tiger Hidden Dragon, Bad Boys 2, Ali, XXX, Mr. Deeds, SWAT.

Oh yes, they're insignificant. :rolleyes:


The TV and DVD user interfaces I see are text- and simple-graphics-based and work just fine. No need to fix what is not broken.

Not an excuse. Any company with your anti-advancement attitude will find itself outdated and out of the game soon enough.

That cellphone is ARM powered.

This proves what? That people desire the little bit of pervasive computing they've seen thus far on a little ARM processor? What do you think is going to happen when you allow people to surf or play streaming digital music or video onto their cellphone? Or get pictures and data off their home PC or electronics device seamlessly? Or control their recordable media at home to record a movie on TV from the phone? Or any of the thousands of possibilities that easily open up when you have an architecture like this that scales into many electronic devices cost-effectively.

If you can't even admit that this is the future - a connected and seamless future, regardless of whether it's Sony and partners leading it - then my conversation with you is futile (which I already know it to be).
 
...

What does this have to do with home electronics?
I am trying to show how other manufacturers are at a cost disadvantage. With ARM, you only need to pay ARM cents per chip. With CELL, you have to pay IBM (PPC), Toshiba (VU), and Kutaragi (he just wants money), not to mention expensive fabrication (ARM is happy with a 0.25-micron process), a 10X jump in software development cost, etc.

The "bread-and-butter" is high-end products that have commodity value.
Well, go check what price range of product sells the most. The consumer electronics industry is known for a 5% margin, and manufacturers count pennies to save.

I don't know what it is, but you'd be insane to not see the large media empire they control.
Yap, Sony's films could not prevent the industry from dumping the Sony-Philips HDCD standard in favor of what is largely Toshiba's proposal, intact... Sony is a small fry in the media industry and plays no role in standard-setting.

Not an excuse. Any company with your anti-advancement attitude will find itself outdated and out of the game soon enough.
I am not anti-advancement; I actually look at the bills and balance sheet.

This proves what?
ARM is everywhere because it is everything CELL isn't.

1. Simple.
2. Inexpensive.
3. Easy to develop software for.
4. Low power consumption.
5. A dozen 2nd sources competing on price.
6. Excellent tools.

What do you think is going to happen when you allow people to surf or play streaming digital music or video onto their cellphone?
Use faster ARM processors that still sip power in milliwatts.

Or get pictures and data off their home PC or electronics device seamlessly?
That is a software functionality and has nothing to do with hardware.

Or control their recordable media at home to record a movie on TV from the phone?
A software functionality once again.

open up when you have an architecture like this that scales into many electronic devices cost-effectively.
ARM is so open that it even has clones (other vendors selling ARM-compatible processor source code). I don't think anyone will even attempt to clone CELL.
 
That is not the original. This is:

fl_orig.png


and it looks much better than that blurry-ass upscaled version.

You see more detail than 800x600, even if 3/4 of the pixels were artificially created.
Yes, true... you see more blur.


If PSX3 comes out in 2005 in Japan, it will hit the US the next year to launch alongside Xbox2, right?
From Microsoft's latest show in Japan (held yesterday):

"We will not let them (Sony) beat us to the market again"

So, if they stay true to their word, they will have to launch at the same time, and in Japan - that is, if Sony decides to launch there first (since they are the ones that control what goes on and when). Consider also that at one point MS wanted to launch the Xbox at the same time as the PS2, and the projected hardware was something as humble as a 500MHz Celeron with a GF2. In many respects such hardware wouldn't exactly stack up favorably against the PS2. I'm saying this to show - and it's evidenced by the present Xbox too - that money was an issue for MS when they designed the Xbox. They had to cut corners a lot, and they moved production all over the world to make the units as cheap as possible. The reason is that the Xbox division operates with only a small fragment of Microsoft's wealth, and I don't think that will change much in the next gen.

At the end of the day, it won't matter. If they want to launch at the same time, MS will not be able to concoct hardware that has a distinct technological edge over PS3, and even if there is some kind of edge, games will not show it, as the complexity of both consoles' graphics will be so high that the whole weight of impressive-looking visuals will be in the hands of the people who make the art assets.

Sony is just another player in the consumer electronics market. Sony is not Microsoft.
You mean just like Microsoft is yet another player (wannabe) in the media/gaming market?

I was there myself 3 years ago. Eventually I realized it was not feasible due to programming complexity and moved on to something better. Why do you think I keep repeating the term "Auto parallelism"? Because it is possible... ...I welcome competition and a new paradigm. But the new paradigm must be auto-parallelism,
I could be an ass and tell you to go Here and share your ideas with people who will understand and support you, but this could be more serious.

I think I have told you this before, but I'm telling you again, as a friend: delusions of grandeur, especially of the kind where you go around and scream that you are right and that people who have proven work behind them are wrong (as of yet, I've never seen any proven work from you), can be a first sign of chemical imbalance or worse, possible schizophrenia. I've witnessed the same sad development of this kind firsthand with a good friend of mine. Consider consulting your real-life friends about this, and if they agree, seek some professional help. The saddest thing is, if you have it, you are completely unaware of it, while for the rest of the world it's painfully obvious.
 
Re: ...

DeadmeatGA said:
OK, I'm game. SHOW ME ONE. Post it! Show me how a compressed 800x600 image can be upscanned to 1600x1200 and still look presentable, let alone better (not just different ) than the original 800x600. POST IT RIGHT FRICKEN HERE!

Triple resolution and interpolated.

As noted above by notAfanB, your comparison is flawed. You need to compare the upscanned image to the original (at its original resolution). The upscanned + filtered image will naturally be more impressive in that it is larger, but ultimately it will just look like a softened version of the original. No extra detail. No real improvement in clarity. To be absolutely consistent with your 16x AA DTV claim, you then need to return the image back to 800x600 as it would be shown on said hypothetical DTV. Compare to the original 800x600. Not that big of a difference? Worth all the processing of upscanning 16x, filtering, and then downscanning for presentation? Dubious. What you are suggesting is akin to a perpetual motion machine.
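To make the "no new detail" point concrete, here is a minimal sketch of what a plain 2x bilinear upscale actually does (grayscale, illustrative C only; fancier filters - presumably the "nedi" one in the posted image - are smarter about edges, but every output pixel is still computed purely from pixels that were already there):

#include <stdio.h>
#include <stdlib.h>

/* Upscale a w x h grayscale image to 2w x 2h with bilinear interpolation. */
static unsigned char *upscale_2x_bilinear(const unsigned char *src, int w, int h)
{
    int W = w * 2, H = h * 2;
    unsigned char *dst = malloc((size_t)W * H);
    if (!dst) return NULL;

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* Map the output pixel back onto the source grid. */
            float sx = x * 0.5f, sy = y * 0.5f;
            int x0 = (int)sx, y0 = (int)sy;
            int x1 = x0 + 1 < w ? x0 + 1 : x0;
            int y1 = y0 + 1 < h ? y0 + 1 : y0;
            float fx = sx - x0, fy = sy - y0;

            /* Each "new" pixel is only a weighted average of four old ones:
               interpolation smooths, it cannot invent detail that was never
               captured at the original resolution. */
            float v = (1 - fx) * (1 - fy) * src[y0 * w + x0]
                    + fx       * (1 - fy) * src[y0 * w + x1]
                    + (1 - fx) * fy       * src[y1 * w + x0]
                    + fx       * fy       * src[y1 * w + x1];
            dst[y * W + x] = (unsigned char)(v + 0.5f);
        }
    }
    return dst;
}

int main(void)
{
    unsigned char tiny[4] = { 0, 64, 128, 255 };     /* a 2x2 test "image" */
    unsigned char *big = upscale_2x_bilinear(tiny, 2, 2);
    if (big) {
        printf("big[0]=%d big[1]=%d big[5]=%d\n", big[0], big[1], big[5]);
        free(big);
    }
    return 0;
}

Call it as upscale_2x_bilinear(image, 800, 600) on an 800x600 grayscale buffer and you get a smoother-looking 1600x1200 buffer, but every one of the new pixels is a weighted average of the old ones - which is exactly why it reads as softness rather than extra detail.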

You fail to answer my question. Why would other manufacturers use CELL when dedicated hardware solutions are cheaper, run cooler, and are easier to code software for???

You have contrived a scenario where it can only fail. Unless you have a running Cell machine you are tinkering with right now, I fail to see how you can be so certain things will be exactly how you say.

CELL was designed for only one purpose: to render graphics. It is just not a competitive solution for anything else.

Is it or is it not capable of general processor duties?

It meets the developer requirements at the right price point and packaging. Why do you think billions of 16-bit processors are still in production annually??? Because 16-bit processors are all you need for certain applications, and even an ARM would be considered overkill. Try to see the big overall picture.

So a single quantum level Cell unit should be quite adequate for the job. It will be small and the price will be right (using my DMGA-esque crystal ball- see how it works both ways?). Developer requirements will be nothing unusual, as you would be programming for only a single unit, anyway.

When $5~10 solutions deliver sufficient performance and power consumption.

No reason a baseline Cell couldn't fulfill that.

ARM rules in consumer electronic applications and CELL needs to kick ARM out to make your fantasy world of CELLtized TV sets and DVD players a reality.

Right, and there has never been a market shared by 3 competitors (wink)... Cell will find its specialties and proliferate from there. ARM will probably always be around, as well.

It is easy to develop for ARM. (Many schools use ARM to teach ASM to their students.) Even the most scaled-down CELL will still dwarf the Emotion Engine in die size, suck up tons of power, and still be the same beast to code for.

...again with the DMGA crystal ball of infinite vision?

That will only happen if Sony opens up CELL to everyone, has dozens of second sources paying a royalty of only 20 cents per chip, allows 3rd-party architectural modifications, and finds some guru to make "auto parallelization" work.

It appears you find it a conceivable scenario then. Thanks.

What motivation is there to say Xbox will have a "6 Ghz CPU" other than to perpetuate the "my specs are better than yours" rivalry?
A 6 Ghz P5 can actually be programmed by average coders, whereas the same is not certain for CELL.

The point is not whether or not you think they can be programmed. The point is what is the great importance of having a 6 GHz CPU? If it is just 4 Ghz, will that make Xbox2 a failure? Why does the clockrate have any significance at all, if all the power is to come from the GPU in this architecture? You avoided that point altogether and instead diverted to FUD-related issues of programmer friendliness.

If it is only good for 40 or 50 GFLOPs, that certainly doesn't look impressive against 1000 GFLOPs.
What makes you think CELL will do a teraflop in the real world???

Who really knows? It won't matter anyway for those resigned to buying consoles based solely on marketing specs like # of GHz or GFLOPs. PS3 buyers will happily scoop up their "personal 1 TeraFLOP machine", and XBox2 buyers will placate themselves with the notion that they have a whopping 6 Ghz CPU inside.
 
...

To notAFanB

Deadmeat, you posted two pics of similar resolution, and not what was requested, i.e. one 800*600 image versus the blown-up, interpolated 1600*1200.
This is what was requested: how the 320x240 original and the 1280x960 interpolated image would look at IDENTICAL SCREEN SIZE. The 320x240 would look that pixelated when the monitor resolution is set to 320x240, while interpolating the same image to 1280x960 and showing it at 1280x960 resolution would make it look as smooth and detailed as in the example.

To marconelly!

Yes, true... you see more blur.
You need better glasses.

Consider also that at one point MS wanted to launch the Xbox at the same time as the PS2, and the projected hardware was something as humble as a 500MHz Celeron with a GF2.
That Xbox was not a game console, but some kind of TVPC. Microsoft was smart enough to realize this wouldn't work and redesigned the thing to outrun PSX2.

You mean just like Microsoft is yet another player (wannabe) in the media/gaming market?
Yap. No single company dominates the entertainment media market.

that people who have proven work behind them are wrong
Do you mean to suggest that this Vince guy designs systems for a living?

can be a first sign of chemical imbalance or worse, possible schizophrenia.
The same could be said about you too.

I've witnessed the same sad development of this kind firsthand with a good friend of mine.
Too bad I am not your good friend. Like attracts like. You have troubled friends because you yourself are most likely troubled with similar problems.

Consider consulting your real-life friends about this, and if they agree, seek some professional help.
I ask the same from you too. You do need help.

To randycat99

As noted above by notAfanB, your comparison is flawed.
It is legit; this is what you would see if you set your monitor to 320x240. If this is how far your thoughts will reach on any particular subject, I am very disappointed.

You have contrived a scenario where it can only fail. Unless you have a running Cell machine you are tinkering with right now, I fail to see how you can be so certain things will be exactly how you say.
Because Sony has not filed a patent for the very technique that would enable something like CELL to work for real-world applications. Kutaragi doesn't get it.

Is it or is it not capable of general processor duties?
At great expense, of course.

So a single quantum level Cell unit should be quite adequate for the job.
What is the transistor count of a single PE? 100 million? 200 million? Can you fabricate it on a mainstream 0.18-micron fab at a reasonable cost of, say, $10~20 per chip, so that other manufacturers could use it in their $300 products??

Developer requirements will be nothing unusual, as you would be programming for only a single unit, anyway.
Then the whole CELL thing is pointless; you could achieve something similar with off-the-shelf PPC G4 or MIPS3D cores. And such code will not scale on a multicore CELL either.

The point is what is the great importance of having a 6 GHz CPU? If it is just 4 Ghz, will that make Xbox2 a failure?
Nope. Intel chips just happen to clock a lot faster than contemporary RISC processors within a similar price range. So it will clock faster than PSX3 even if launched at the same time.
 