Cellebrate Car AcCelleration *spin-off*

Status
Not open for further replies.
Wow. You sure like to drop names, don't you? Imagine what those "greats" could have done with a decent architecture?

It must make you really sad that cell is dead.
That's what names are for (to be used). "Decent architecture" like "Xenon"? :) And, I'm fine with it. They have reworked some of the Cell architectural elements in Intel processors, and GPUs are even picking up the slack now.

The interesting thing will be to see who will generally put out the best technical games, next-gen. My guess is that it will be the same people that are excelling this gen. Time will tell. Then, we can hear the same ole arguments about bad architecture for that gen. "If this was made a little bit easier, we could have done a lot more with it."

Uhhh,
1. Intel invented MLAA, not Sony.
2. Sony is unlikely to give out their version of MLAA to non-Sony devs.
3. Even if they did give it out, it would be programmed to run on Cell, not what everyone else uses for post AA (the GPU).
4. Most devs seem to like FXAA more.
5. A few 360 games do use MLAA.

1. I said "Sony's MLAA" because their version goes beyond the original paper written by Intel. Read the "making of God of War III" article from Eurogamer. And, to think the initial port over was 120ms. I guess making use of the Cell architecture couldn't have helped to reduce that time to 20ms with additions, right? ;) That's the version I'm talking about.

4. I wonder why? It's universally applicable. ;)
So every GPU thread processes a single pixel in this approach. In the MLAA algorithm, however, pixels are not independent but have a rather strict order in which they need to be processed. In other words, MLAA is not embarrassingly parallel and is thus hard to implement on a GPU. Edge detection is not the issue.

http://forum.beyond3d.com/showpost.php?p=1433406&postcount=443

I believe that's without the "beyond" part of Sony's MLAA implementation being in effect.
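As a rough illustration of that parallelism argument, here is a minimal sketch (hypothetical code, not Intel's or Sony's implementation) contrasting the per-pixel edge-detection step, which maps cleanly to one GPU thread per pixel, with the run-length step, whose sequential dependency is what makes a naive GPU port awkward:

```python
# Sketch of why MLAA's pattern search is not embarrassingly parallel.
# Step 1 (edge detection) is independent per pixel; step 2 (measuring
# edge runs) walks the scanline, so pixel i depends on pixel i-1.

def detect_edges(row, threshold=0.1):
    # Per-pixel, order-independent: each comparison could be its own GPU thread.
    return [abs(row[i] - row[i + 1]) > threshold for i in range(len(row) - 1)]

def edge_run_lengths(edges):
    # Sequential scan: the run length at each position depends on the
    # previous position's result, a serial dependency chain.
    runs, length = [], 0
    for e in edges:
        length = length + 1 if e else 0
        runs.append(length)
    return runs

row = [0.0, 0.0, 1.0, 1.0, 1.0, 0.0]
edges = detect_edges(row)
print(edges)
print(edge_run_lengths(edges))
```

Real MLAA classifies L/Z/U-shaped edge patterns in 2D, but the serial run-measuring dependency is the same in spirit.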

EDIT: I wanted to add some more information about Sony's MLAA implementation.

http://forum.beyond3d.com/showpost.php?p=1435976&postcount=248

It was extremely expensive at first. The first not so naive SPU version, which was considered decent, was taking more than 120 ms, at which point, we had decided to pass on the technique. It quickly went down to 80 and then 60 ms when some kind of bottleneck was reached. Our worst scene remained at 60ms for a very long time, but simpler scenes got cheaper and cheaper. Finally, and after many breakthroughs and long hours from our technology teams, especially our technology team in Europe, we shipped with the cheapest scenes around 7 ms, the average Gow3 scene at 12 ms, and the most expensive scene at 20 ms.

In terms of quality, the latest version is also significantly better than the initial 120+ ms version. It started with a quality way lower than your typical MSAA2x on more than half of the screen. It was equivalent on a good 25% and was already nicer on the rest. At that point we were only after speed; there could be a long post mortem, but it wasn't immediately obvious that it would save us a lot of RSX time, if any, so it would have been a no-go if it hadn't been optimized on the SPU. When it was clear that we were getting a nice RSX boost (2 to 3 ms at first, 6 or 7 ms in the shipped version), we actually focused on evaluating whether it was a valid option visually. Despite any great performance gain, the team couldn't compromise on quality; there was a pretty high bar to reach to even consider the option. And, as with the speed, the improvements on the quality front were dramatic. A few months before shipping, we finally reached a quality similar to MSAA2x on almost the entire screen, and a few weeks later, all the pixelated edges disappeared and the quality became significantly higher than MSAA2x or even MSAA4x on all our still shots, without any exception. In motion it became globally better too; a few minor issues remained which just can't be solved without sub-pixel sampling.
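For context, those timings can be set against a frame budget with back-of-envelope arithmetic (a sketch; the 30 fps target is an assumption, while the millisecond figures come from the quote above):

```python
# Rough frame-budget arithmetic for the MLAA timings quoted above,
# assuming a 30 fps target (the frame rate is an assumption, not from the quote).
frame_budget_ms = 1000.0 / 30.0   # ~33.3 ms per frame at 30 fps

initial_spu_cost = 120.0          # first "not so naive" SPU version (quoted)
shipped_worst = 20.0              # most expensive shipped scene (quoted)

print(f"frame budget: {frame_budget_ms:.1f} ms")
print(f"initial version: {initial_spu_cost / frame_budget_ms:.1f}x a whole frame")
print(f"shipped worst case: {shipped_worst / frame_budget_ms:.0%} of a frame")
```

So the first SPU version cost several whole frames of time, while the shipped worst case fit inside one frame alongside everything else, which is why the quoted 6-7 ms of RSX time saved made it a net win.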
 
Last edited by a moderator:
Sure. What else are we non-sports-loving, non-political geeks supposed to get all tribal about?
Blond or brunette / other ethnicity?
:LOL:
Anyway, I guess I won't participate further in the debate, as I have to tolerate others' opinions even when they're against the majority/consensus; I hold that position on quite some other topics (not tech related, though).

To Lucid dreamer, I have not forgotten the "formerly known as a charlatan" thread and plenty of others.
If plenty of cleverer people did not change your opinion, I can't see how my less educated one (/recollection of opinions) could. Anyway, as a side note, pick your examples better: if you want something pretty impressive done on SPUs, you should not look too far into the past; look at what DICE pulled off with BF3.
Farewell :)
 
Actually, I consider the ones doing great things with Cell/PS3 as the cleverest group. Those are the ones I'm listening to. :) If they can't change your mind, I guess no one can. Also, the funny thing is that I believe I mentioned BF3, first. No worries. It's easy to miss, when one is not trying to pay attention to the references.
 
Please don't go, you were one of the signals among the noise.
Nice, but I'm not any kind of reference; you may want to re-read the "predict the generation system" thread starting around page 40 ;)
A lot of the reasons why Cell was abandoned are highlighted there by bright people (as are its successes).
The pretty obvious thing to me, re-reading it, is that a Cell 2 as described by (among others) nAo would have looked a lot like either Larrabee or the Power A2 (though IBM did not shoot for crazy-high SP performance with the latter; I believe the SIMD units are 4-wide but can process DP at close to full speed, and it should have been doable for a pretty chip to make those units 8-wide for SP calculation, thus doubling the throughput).
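The "doubling the throughput" claim follows from simple peak-FLOPS arithmetic; a sketch with illustrative numbers (the core count and clock here are assumptions for illustration, not actual A2 specs):

```python
# Back-of-envelope peak single-precision throughput for a SIMD design:
#   peak GFLOPS = cores * simd_width * ops_per_lane_per_cycle * clock_ghz
# Widening the SIMD from 4 to 8 lanes doubles SP throughput, all else equal.

def peak_sp_gflops(cores, simd_width, clock_ghz, ops_per_lane=2):
    # ops_per_lane=2 assumes a fused multiply-add per lane per cycle.
    return cores * simd_width * ops_per_lane * clock_ghz

# Illustrative numbers only (not actual Power A2 specs):
four_wide = peak_sp_gflops(cores=16, simd_width=4, clock_ghz=2.0)
eight_wide = peak_sp_gflops(cores=16, simd_width=8, clock_ghz=2.0)
print(four_wide, eight_wide)  # the 8-wide figure is exactly 2x the 4-wide one
```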

EDIT:
As a side note, do not dismiss, say, Aaron Pink because he is not a PS3 dev (he is not a dev at all, more of a CPU architect); he is as insightful as can be ;) .
You can start at page 39; more precisely, that's where the discussion starts to move toward what a Cell 2 could be, and Larrabee.
I've failed to find the other interesting discussions for now.
EDIT 2
By the way, re-reading through this makes the talk about next gen even more boring; Aaron Pink was right in early 2008 (and possibly before) that the systems were set to be pretty much closed-box PCs.
 
Okay, thanks for the pointers.
I know most Beyond3D residents discussed it ad nauseam, but I just arrived here. :LOL:
 
I noticed over the years that it's not that many people/devs in this forum think Cell was worthless. It's just the same people saying it over and over until they were the loudest "voices". Here are more posts, from people who didn't think that, to back that up.

http://forum.beyond3d.com/showpost.php?p=1658961&postcount=209

http://forum.beyond3d.com/showpost.php?p=1659004&postcount=218

http://forum.beyond3d.com/showpost.php?p=1595533&postcount=126

http://forum.beyond3d.com/showpost.php?p=1596056&postcount=127

http://forum.beyond3d.com/showpost.php?p=1596708&postcount=136

EDIT: added an important link from some time ago.

http://forum.beyond3d.com/showpost.php?p=1589342&postcount=1
 
A post from a "liolio" that seemed to have disappeared.

It's pretty obvious that in the PS3 (and in some other specific cases) the Cell was indeed good.
Whether Sony (and MS, by the way) could have done better is another topic.
Say, for the same silicon budget (and a worse power draw), Sony went with a 4-core CPU (using 4 PPUs): the PS3 would never have kept up with (let alone, in some cases, exceeded) the 360's performance, as the extra cores would never make up for the lost throughput of 5/6 SPUs. This topic is kind of a "non-question"; the facts proved that Cell was good, and either way, even if RSX was the best thing money could buy for Sony, the system would have put on an overall worse show.

http://forum.beyond3d.com/showpost.php?p=1589700&postcount=18

Interesting.
 
Obviously it's a good CPU; the question is whether it would have been smarter to sink that $400 million into the GPU and go for 4 PPUs rather than this setup. For Sony, with the PS2 and PS3, it seems it was CPU first, GPU later, while everyone else went exactly the opposite way (now Sony too, with the PS4).

I guess that tells you everything you need to know about the architecture.

Hypothetically speaking, what if Sony had gone with this setup: 3 PPUs, GPU, no eDRAM, the same silicon budget as the PS3, but focused on the GPU like MS did with Xenos? Imagine the XGPU with twice the number of ALUs, and the performance advantage that would have given them. It would be pretty unfair, I would say. Not only would they get superior 3rd-party games; developers wouldn't have to sweat blood and waste time and money making the CPU do what the GPU should have done in the first place.
 
This thread is a goldmine of awesomeness, with a pure seam of Lucid_Dreamer running through it!

Before I found my way to this forum, several years ago, I used to think ALL devs were extremely adaptable/flexible. I used to think they loved to be challenged with new and exotic high-performance hardware. Hardware designers would come up with fast hardware, and software devs would find new and interesting ways to make it sing. In my mind, it was some sort of man vs. machine paradigm. It appealed to me.
I'm not going to say what I think it has largely become, now. No disrespect intended, but I'll just say I've been extremely let down.

You're not going to say what you think it's become, but at the same time, you're going to say what you think it's become!

+ bonus appeal to emotion!

Then, titles like Battlefield 3, Uncharted 2 and 3, Gran Turismo 5, God of War 3, Killzone 3, etc. shouldn't exist. Last of Us and God of War: Ascension look to be even more of a case against that statement. This is a 6-year-old product. The hardware was capable of at least this from day one. Only the mindsets/skillsets needed to catch up. Some have decided to catch up to the hardware, and that hard work will serve them well in the coming years.

Lazy devs!

Because "crappy hardware" can't yield such results. Show me these games on the Wii.

Erroneous! Utterly meaningless challenge!

Every device is limited by something. There were ways planned around things like that, when the hardware was being made. Others are using those ways, apparently.

Vague! Bereft of any useful information!

And, it's not that they only look good. They are great-playing games as well (from a technical perspective).

Subjectivity ... from an objective perspective.

Fucking lol.

Does it seem equally as silly to ignore the devs creating masterpieces and NOT complaining about the hardware? That's what's happening here. There have been devs that said a lot of devs are leaving a great deal of Cell performance on the table by not utilizing certain skills/tools. It goes ignored.

You had nAo and Deano C, who created great techniques on the PS3, yet their words go ignored. Those are the devs you should be listening to. They have actually created groundbreaking games, from a technical standpoint. Yet, as you said, "Kool Aid is still the drink of choice 7 years later."

No actual point made, just vague noises made and names dropped!

How can games be good/great, technically, in spite of something? That's like saying I can buy cars, buildings, etc, in spite of being broke. It doesn't make sense.

Stupid question asked, answered with your own nonsense analogy!

When you judge a runner, you judge him/her by his/her best times. When you want a car's 0-60 time, you take the best time. Why do we do that for hardware? It's because of the human factor. Some drivers are better than others, and the best driver can still do even more. You can't get more out of a machine than it can do. HOWEVER, you can get more out of a driver/operator who hasn't been able to exploit everything the machine has to offer. It seems that's where we are with the Cell/PS3.

Some people will never be able to bring the full potential of hardware to bear (drivers, developers, etc).

Appeal to emotion!

Of course, you would think my argument is BS. So, your reasoning is that Cell got discarded for the future, and that's why you won't accept the logic I put forth?

Incorrect assumption!

Let's forget that 400 engineers designed this "crappy hardware".

Must be great then!

Let's forget the budget used to design "crappy hardware". I guess that was a part of the design goal and was signed off on because of it.

Reverse logic --> Must be great then!

Let's forget all the real-world tests and real-world examples of top performance using this "crappy" Cell. Let's forget the devs that use the Cell processor to create beautiful and breathing game worlds. After all, we don't judge ability by what can best be done on something. We judge ability by what can be done poorly on something, right? I mean, that makes sense, right?

Must be great then!

And all those living, breathing rail shooters and rail beat em ups show that Cell was the God of CPUs! Just like in Skyrim!

And, nAo's HDR implementation is more about the spirit of Cell programming.

Hahaha flawless comeback!

This is the point where I lost it for real. Seriously wiping gin and tears off the keyboard at this point.

How the fuck can you still post here but fellow visionary Jeff Rigby is banned?

Of course, this is all forgotten/ignored. Then, the excuses appear.

You're talking about Digital Foundry, right?
 
A post from a "liolio" that seemed to have disappeared.



http://forum.beyond3d.com/showpost.php?p=1589700&postcount=18

Interesting.

I don't get why it would have disappeared?
Anyway, I stand by it: the PS3 without the Cell would have had trouble keeping up with the 360.
Actually, I like how A. Richard qualified the Cell: "unfailed" (cf. Charlie's interview of T. Sweeney and A. Richard at E3, I think).
If you have more time to lose on my posts, you may also find that I defended at some point the POV that for Nintendo, on a tight budget (both silicon and power), a Cell-like solution would have been difficult to beat (perf per watt and per mm^2).
Thing is, it is not that relevant to why the Cell failed, and why ultimately some strong critics à la A. Pink (and many others, I guess, outside this forum or the gaming/3D realm) were right from scratch about its fate; and it is not because they can't handle the greatness.

They may sound to you like they are exaggerating, or partial, but that is not how I read it. They have a strong distaste because, in their view, the concept the chip is built upon was flawed. I've come to agree with them (which is irrelevant; I'm not a dev or an engineer, so my POV is pretty much worthless anyway), but the whole market, including designers, gave up on it.
 
Thanks for your input and backed up by so many different devs/people! :rolleyes:

Like I said before,...no...I'll let a quote from another dev say it.

tunafish said:
If you've ever had a look at some cool demo compos, you know that there are truly insane and gifted people out there who wrest the very last of power from any architecture, whether it's cell or C64.

http://forum.beyond3d.com/showpost.php?p=1589371&postcount=2

What does that sound like to you?

Those are the people they should be willing to learn from instead of complaining (many never even touched the hardware) and refusing to learn. Those people should be looking to those best and brightest devs for tips, instead of basically calling them liars and/or ignoring them. I think it's shameful.

Again, we will see what next-gen holds in the way of excuses. GPGPU programming is too hard? If they did this or that to the architecture, it would make it easier for me? Time will tell.
 
I don't get why it would have disappeared?
Anyway, I stand by it: the PS3 without the Cell would have had trouble keeping up with the 360.
Actually, I like what A. Richard said about the Cell, qualifying it as "unfailed".
If you have more time to lose on my posts, you may also find that I defended the POV that for Nintendo, on a tight budget (both silicon and power), a Cell-like solution would have been difficult to beat.
Thing is, it is not that relevant to why the Cell failed, and why ultimately some critics à la A. Pink were right from scratch about its fate; and it is not because they can't handle the greatness.
They may sound to you like they are exaggerating, or partial, but that is not how I read it. They have a strong distaste because, in their view, the concept the chip is designed around was flawed. I've come to agree with them (which is irrelevant), but the whole market, including designers, gave up on it.

The person that said that seems to have disappeared.
 
Obviously it's a good CPU; the question is whether it would have been smarter to sink that $400 million into the GPU and go for 4 PPUs rather than this setup. For Sony, with the PS2 and PS3, it seems it was CPU first, GPU later, while everyone else went exactly the opposite way (now Sony too, with the PS4).

I guess that tells you everything you need to know about the architecture.

Hypothetically speaking, what if Sony had gone with this setup: 3 PPUs, GPU, no eDRAM, the same silicon budget as the PS3, but focused on the GPU like MS did with Xenos? Imagine the XGPU with twice the number of ALUs, and the performance advantage that would have given them. It would be pretty unfair, I would say. Not only would they get superior 3rd-party games; developers wouldn't have to sweat blood and waste time and money making the CPU do what the GPU should have done in the first place.
Tunafish seemed to disagree rather harshly with the notion that the Cell was a good CPU. :) That's one of the people who seems to have won over liolio. His first-page post in this thread seems to continue that perspective. Then, on the third page, he says he stands by the older quote I posted. It's very confusing.

It seems some people would say that setup didn't exist at the time. 4 PPUs would have been too hot, if I recall. These points were covered in the first and second page of the "Was Cell Any Good?" thread.
 
Thanks for your input and backed up by so many different devs/people! :rolleyes:

Like I said before,...no...I'll let a quote from another dev say it.

That doesn't mean that all architectures are as good or as suited for a specific purpose as each other. And it doesn't mean that all processors are as good or as bad as each other.

You're (predictably) attempting to use tunafish's words "against" the comments he made at the start of this thread, but it doesn't work because there is no contradiction. It's just another clumsy attempt to use words that you hope will yield an emotional response against a viewpoint that you don't like because you have an emotional attachment to a processor that you didn't help build.

What does that sound like to you?

It doesn't sound like vindication for an architecture that everyone - including IBM and even Sony - have taken a big fat shit on.

Those are the people they should be willing to learn from instead of complaining (many never even touched the hardware) and refusing to learn. Those people should be looking to those best and brightest devs for tips, instead of basically calling them liars and/or ignoring them. I think it's shameful.

Once again, emotive language instead of anything wrapped around any supporting evidence or personal experience.

Again, we will see what next-gen holds in the way of excuses. GPGPU programming is too hard? If they did this or that to the architecture, it would make it easier for me? Time will tell.

And again, emotive language and nothing else. Between your twisted quotes, appeals to emotion, and inference based argumentation you're bringing what is basically noise to the forum.

If you think Cell is an awesome games console CPU then present some actual evidence instead of this kind of shit.
 
Tunafish seemed to disagree rather harshly with the notion that the Cell was a good CPU. :) That's one of the people who seems to have won over liolio. His first-page post in this thread seems to continue that perspective. Then, on the third page, he says he stands by the older quote I posted. It's very confusing.

It seems some people would say that setup didn't exist at the time. 4 PPUs would have been too hot, if I recall. These points were covered in the first and second page of the "Was Cell Any Good?" thread.
Good CPU for being versatile and doing stuff RSX needed help with; not good at all as a standard/retail CPU. Scratch that; I thought 3 PPUs, like in the 360.

I know, but I don't think Cell can be looked at in isolation. It was good for covering up whatever RSX needed help with, but it wasn't a necessity in the first place. With the same silicon budget and the half a billion dollars they poured into developing it, they sure could have gotten a much better deal.
 