Details trickle out on CELL processor...

What does Xbox have to do with this discussion?

My God, you are thick-headed. To judge the Emotion Engine's performance in handling the tasks of the PS2, you would have to compare it to another chip competing in that very same area. Seeing how the Xbox is the most powerful console, I compared its performance to the PS2's. Get it now?

And that's why you only see the EE/GS being used in niche or high-priced products, i.e. it isn't fit for general usage in many devices. If CELL were to be used in many devices, then it would have to be compromised, as Inane Dork described.

Just because you haven't witnessed it being used in other consumer products, don't assume it can't be. I believe this was one of the reasons for the missile-guidance statement. And who said Cell will be used in every consumer product? We don't know this yet. So I would really suggest you not jump to such conclusions.
 
PC-Engine said:
And I didn't say they won't find uses for it. The point is that specialized ASICs can do the same job better and cheaper. Why don't we have Pentiums in DVD players to decode MPEG2? ;)

Cheaper in what way? I think you're right, but it's situational. In other words, there are times when having smaller dedicated chips could be "cheaper" (depending on how we calculate the costs) and times when it's not.

*Edit - you seem to agree that it could be cheaper in your later post.
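The "cheaper depending on how we calculate the costs" point can be made concrete with a rough amortization sketch. Everything here is invented for illustration (the dollar figures and the `total_cost` helper are not from the thread): a new dedicated ASIC carries a large one-time design/NRE cost but a low per-unit cost, while reusing an already-designed chip like the EE/GS has no new design cost but a higher per-unit cost, so which one is "cheaper" flips with volume.

```python
# Hypothetical break-even sketch: designing a new ASIC vs. reusing an
# existing chip. All dollar figures are made up for illustration only.

def total_cost(nre, unit_cost, volume):
    """Total cost = one-time engineering (NRE) cost + per-unit cost * volume."""
    return nre + unit_cost * volume

asic_nre, asic_unit = 2_000_000, 3.00   # new ASIC: big design cost, cheap per unit
reuse_nre, reuse_unit = 0, 12.00        # existing chip: design already paid for

for volume in (10_000, 100_000, 1_000_000):
    asic = total_cost(asic_nre, asic_unit, volume)
    reuse = total_cost(reuse_nre, reuse_unit, volume)
    winner = "new ASIC" if asic < reuse else "reuse existing chip"
    print(f"{volume:>9} units: ASIC ${asic:,.0f} vs reuse ${reuse:,.0f} -> {winner}")
```

At 10,000 units reuse wins easily; at 1,000,000 units the new ASIC wins, which is exactly the "it's situational" argument above.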
 
To judge the Emotion Engine's performance in handling the tasks of the PS2, you would have to compare it to another chip competing in that very same area. Seeing how the Xbox is the most powerful console, I compared its performance to the PS2's. Get it now?

But why would you want to compare it to Xcpu?

Just because you haven't witnessed it being used in other consumer products, don't assume it can't be. I believe this was one of the reasons for the missile-guidance statement. And who said Cell will be used in every consumer product? We don't know this yet. So I would really suggest you not jump to such conclusions.

Sure it CAN be used in more products, but I have my doubts it'll go beyond niche products. And if it does, then it would have to be compromised.
 
PC-Engine said:
To judge the Emotion Engine's performance in handling the tasks of the PS2, you would have to compare it to another chip competing in that very same area. Seeing how the Xbox is the most powerful console, I compared its performance to the PS2's. Get it now?

But why would you want to compare it to Xcpu?

Just forget it. I'm a little tired of explaining myself over something this simple, yet so difficult to grasp.

PC-Engine said:
Just because you haven't witnessed it being used in other consumer products, don't assume it can't be. I believe this was one of the reasons for the missile-guidance statement. And who said Cell will be used in every consumer product? We don't know this yet. So I would really suggest you not jump to such conclusions.

Sure it CAN be used in more products, but I have my doubts it'll go beyond niche products. And if it does, then it would have to be compromised.

What don't you have doubts on, PC? That's your problem. Because you haven't seen it done before, you automatically assume it can't be.
 
Because the EE/GS has been out how long? Yet it's only starting to be used in other niche products now. If it takes 5 years to get cheap enough or cool enough to use in other products, then what's the point, really? Dedicated ASICs can be used right out of the gate, run cool, and are cheap. After 5 years those same ASICs would be dirt cheap. It's like Inane Dork was saying: why go through all the trouble to get a Pentium or CELL to work in a DVD player when other solutions can do the job cheaply right off the bat? Maybe SONY can't cheaply design a lot of the chips they're using now in-house, so they have to resort to buying someone else's chips, or using another chip like EE/GS/CELL that wasn't designed for CE equipment other than games machines. The only reason they used the EE/GS in the PSX was because that thing could also play PS2 games (not very well, I might add).
 
PC-Engine said:
Because the EE/GS has been out how long? Yet it's only starting to be used in other niche products now. If it takes 5 years to get cheap enough or cool enough to use in other products, then what's the point, really? Dedicated ASICs can be used right out of the gate, run cool, and are cheap. After 5 years those same ASICs would be dirt cheap. It's like Inane Dork was saying: why go through all the trouble to get a Pentium or CELL to work in a DVD player or a toaster?

What does this have to do with how it should hold up as a dedicated processor for both a console and a consumer device? :? Please, Engine, don't make me laugh.
 
I think the point PCE is missing here is that when you get out to the sort of processing that goes into scaling, filtering, and enhancement at internal resolutions, the ASICs required to do it properly are no longer simple or cheap. This is where the EE/GS seems to come into its own for computational power, low cost, and versatility. We're not talking about simple MPEG2 decoding anymore (of which the XCPU would have been barely capable, and for which the P4 would be way overkill in computation, heat, and especially cost). The EE/GS was born to excel in image-processing applications, and now it is a compact, monolithic device, with the versatility of being limited in task only by the software you make it run. So in a perverse sense, it seems Sony will be profiting from "PS2 hardware" long after it discontinues selling PS2s. It has reached the level of ubiquity of ASICs, yet is able to own the level above, where ASICs cannot go. Sure, you can find an ASIC that does go to that higher level, but it won't be cheap, and it will be useless once your task changes or you desire to use a different algorithm. The EE/GS appears to have reached "best of both worlds" status. That said, I now step away from this here soapbox. :)
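A rough back-of-envelope suggests why scaling/filtering at internal resolutions gets computationally expensive fast. The filter tap count and the separable-scaler model here are illustrative assumptions, not measured figures for any real TV chip:

```python
# Back-of-envelope: multiply-accumulate (MAC) operations per second for a
# separable polyphase video scaler. The tap count is an assumed value.

def scaler_macs_per_sec(out_w, out_h, fps, taps=8):
    """Separable scaling: ~taps MACs horizontally plus ~taps vertically
    per output pixel, so 2 * taps MACs per pixel per frame."""
    macs_per_pixel = 2 * taps
    return out_w * out_h * fps * macs_per_pixel

sd = scaler_macs_per_sec(720, 480, 60)     # scaling to 480p
hd = scaler_macs_per_sec(1920, 1080, 60)   # scaling to 1080p

print(f"480p @ 60:  {sd / 1e9:.2f} GMAC/s")
print(f"1080p @ 60: {hd / 1e9:.2f} GMAC/s")
```

Under these assumptions, scaling alone at 1080p60 is around 2 GMAC/s, roughly six times the 480p figure, and that is before any noise reduction or enhancement passes. This is the regime where a cheap fixed-function chip stops being simple and a programmable vector engine starts looking attractive.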
 
That's fine and dandy if it is indeed doing all the work of many chips at the same time in a single chip, but as of right now it's just wishful thinking with nothing to back it up. :LOL:

Heck, the EE/GS is supposed to provide simple PS2 game compatibility, yet its implementation in the PSX says otherwise. :LOL:

We're not talking about simple MPEG2 decoding anymore (of which the XCPU would have been barely capable, and for which the P4 would be way overkill in computation, heat, and especially cost).

Do some research dude. PowerDVD only needs a 450MHz Celeron. :LOL:

The EE/GS was born to excel in image-processing applications, and now it is a compact, monolithic device, with the versatility of being limited in task only by the software you make it run.

And it still runs hot needing a HSF even at 90nm. :LOL:

It has reached the level of ubiquity of ASICs, yet is able to own the level above, where ASICs cannot go. Sure, you can find an ASIC that does go to that higher level, but it won't be cheap, and it will be useless once your task changes or you desire to use a different algorithm.

That's why there are DSPs that are more generalized if algorithm flexibility is required. ;)
 
Spidermate said:
I'm not understanding you at all. The Emotion Engine was originally designed for the PS2, no different from Cell. In spite of this, however, it was still capable of delivering acceptable performance in other devices and electronics as well as the console after two years. Do you or do you not agree?
Of course I agree. How can I disagree with pure fact?

The point, which you aren't really addressing, is that hardware built for general processing and hardware built for graphics processing are quite different. Wherever Cell lands on that spectrum tells how well it will handle different loads.
 
PC-Engine said:
That's fine and dandy if it is indeed doing all the work of many chips at the same time in a single chip, but as of right now it's just wishful thinking with nothing to back it up. :LOL:

Yes, cuz that article on the LCD HDTV is just more hype-ware... :rolleyes: You really are stuck in denial.

Heck, the EE/GS is supposed to provide simple PS2 game compatibility, yet its implementation in the PSX says otherwise. :LOL:

Haven't the slightest idea what you are ranting about or misrepresenting now, but it is certain that you should save it for a separate topic (as usual).
 
Yes, cuz that article on the LCD HDTV is just more hype-ware... You really are stuck in denial.

Here read it again since you missed it the first time. ;)

Sony managed to cram in their Emotion Engine and Graphics Synthesizer (of PS2 fame) for better image rendering.

Now where does it say it's doing the job of multiple ASICs??? Where does it say it does the scaling??? :LOL:

Haven't the slightest idea what you are ranting about or misrepresenting now, but it is certain that you should save it for a separate topic (as usual).

That's right, you have no clue... oh, and uh, I've added points to the previous post above for your reading enjoyment.

Please explain what they are referring to by "better image rendering" with respect to the EE/GS chip in that television. :LOL: ;)
 
PC-Engine said:
Yes, cuz that article on the LCD HDTV is just more hype-ware... You really are stuck in denial.

Here read it again since you missed it the first time. ;)

Sony managed to cram in their Emotion Engine and Graphics Synthesizer (of PS2 fame) for better image rendering.

Now where does it say it's doing the job of multiple ASICs??? Where does it say it does the scaling??? :LOL:

So you were expecting there to be 1 single chip in that whole TV? BFD! Were you expecting them to document in a teaser ad how many ASICs were replaced by using an EE/GS chip, so that PCE could feel good about the product? BFD! The only term you need be concerned about from that line is "image rendering". Oh, that's not descriptive enough for you? BFD! :rolleyes:

Haven't the slightest idea what you are ranting about or misrepresenting now, but it is certain that you should save it for a separate topic (as usual).

That's right, you have no clue... oh, and uh, I've added points to the previous post above for your reading enjoyment.

No, just not entertaining your continued efforts to derail and obfuscate. See the discussion topic? It's about Cell. Fancy that!

Please explain what they mean by image rendering in that television. :LOL:

You seem to have something burning to reveal here. What do you think they meant?
 
Read the post above; I've clarified it for you. I don't think you have an answer, though... better image rendering... um... riiiiiight.

You DO know why that TV has a nice image, right? Hate to break it to you, but it's not the EE/GS that's responsible. Heh, multiple chips, uh huh... each with their own HSF... sounds pretty cool. :LOL:

It's not the imaginary 5 GFLOPS from the GS either. :p
 
You could tell all that from a few lines in an ad? Well, that would explain your cereal-box approach to understanding technology. Suit yourself. I'm more willing to admit there is not enough information there to say definitively exactly how it works or in what way it is being implemented. What is more certain is that all of that is working together as a system. You are hellbent on believing it is a superficial part. BFD! It won't change reality for the rest of us (fortunately).
 
Just going by YOUR logic Sherlock.

Multiple EE/GS for better image rendering....um yeah ok.

Maybe it's just better image rendering for the GUI? ;) :LOL:

I'm pretty sure the multiple-color LED backlighting and the separate DRC scaling chip have nothing to do with the better image rendering. In your dream world, that TV doesn't need the dedicated DRC chip or any other chips, because the multiple superduper cheap, efficient, simple EE/GS chips do all the work. :LOL:

Must be that imaginary 5 GFLOPS of computing power from the GS that makes the image teh ROX!!
 
You misrepresented my words and then came up with a ridiculous scenario (a 1-chip TV). This is not my logic. It is simply your typical trolling tactics. Apparently, that is not enough to have you banned by now, but hey, BFD. :rolleyes:
 
McFly said:
But a smart noise-reduction algorithm could use a fast chip, or Sony's picture resolution enhancement technology, just to name two. If there is more power, then there will be a smart mind with an idea of how to use it. Just because you don't see a use does not mean that there is no use. ;)

Fredi

I believe Philips claimed they were using a chip with as much power as a 500 MHz Pentium 3 in some of their TVs for interpolation or something like that.

Why don't we have Pentiums in DVD players to decode MPEG2?

Yeah, why don't we? I've seen some Pentium 2s for as low as $15 new, though I don't know if Intel sells them for that price.
 