End of Cell for IBM

Discussion in 'Console Industry' started by Butta, Nov 20, 2009.

  1. deepbrown

    Veteran

    Joined:
    Apr 15, 2007
    Messages:
    3,056
    Likes Received:
    1
Look at the highest voted comment here. I hate reading things from 2007 - and then a bunch of supposedly reasonable people agreeing with it. You can see my little battle there - and I'm sure I probably come off as a "fanboy". But this is 2009 and developers have made great strides on Cell.

    Here's the comment:

    http://www.reddit.com/r/gaming/comments/a6szy/ibm_cancels_cell_processor_development/

     
  2. assen

    Veteran

    Joined:
    May 21, 2003
    Messages:
    1,377
    Likes Received:
    19
    Location:
    Skirts of Vitosha
    "Made great strides" is relative; his assessment of the overall worth of Cell is absolute.

    Also, do we look at 3-5 studios that have done great things with Cell, or at 300-500 that have failed, when we want to evaluate the architecture?
     
  3. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,741
    Likes Received:
    11,222
    Location:
    Under my bridge
    At the architecture, you evaluate the best possible. At the choice of Cell in the context of a games console, you can cast the net wider and come up with all sorts of arguments.

    Anyone wanting to criticise Cell as a fast processor architecture needs to present some decent alternatives that were around or even on paper in 2001-2005. There weren't any. Cell may be a dead end going forwards, but only because the rest of the world is catching up with the spearhead of STI. They took the leap when no-one else was brave enough to, took the flak for trying to get developers to change their patterns of behaviour to fit the necessary requirements of the future of high-throughput processing, and are now being brushed off as a bunch of idiots with a half-baked idea. Whether one likes Cell or not, it's disingenuous to belittle the work of STI and the results that their architecture is capable of.
     
  4. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
Classic CPUs will never achieve the perf/W or perf/area of Cell, GPUs, etc. Cell was great at the time of its unveiling. It is being criticized today because there have been no follow-ups so far which build on its strengths and try to mitigate its weaknesses.
     
  5. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    17,682
    Likes Received:
    1,200
    Location:
    Maastricht, The Netherlands
I'm thinking in a completely different direction. The PowerXCell 32i would have been a double precision device like the PowerXCell 8i, while the PS3's Cell is a single precision optimised device. So my question is: do we need double precision for the PS4, or is single precision sufficient again? Because if so, something like the PowerXCell 32i would only have been useful for supercomputing.
     
  6. Weaste

    Newcomer

    Joined:
    Nov 13, 2007
    Messages:
    175
    Likes Received:
    0
    Location:
    Castellon de la Plana
Didn't I see a lecture somewhere from one of the head software guys at IBM developing for Cell, where he said that they only needed to add double precision for financial institutions? Not that they needed double precision as such, but rather that they were dealing in trillions of dollars.
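The magnitude argument is easy to demonstrate: IEEE 754 single precision carries a 24-bit significand (roughly 7 decimal digits), so near a trillion dollars the gap between representable values is tens of thousands of dollars and an individual cent simply vanishes. A quick sketch in Python, round-tripping through binary32 via the struct module:

```python
import struct

def f32(x):
    """Round a Python float (binary64) to the nearest IEEE 754 binary32 value."""
    return struct.unpack('f', struct.pack('f', x))[0]

balance = f32(1.0e12)   # a trillion dollars in single precision
cent = f32(0.01)

# The spacing (ULP) between adjacent binary32 values near 1e12 is 2**16 = 65536,
# so adding one cent leaves the balance unchanged.
assert f32(balance + cent) == balance

# Python's native float is binary64, where the spacing near 1e12 is ~1.2e-4,
# so the cent survives.
assert 1.0e12 + 0.01 != 1.0e12
```

Legacy financial code written around doubles avoids this entirely, which fits his point that it's a legacy-code requirement rather than a fundamental one.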
     
  7. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    17,682
    Likes Received:
    1,200
    Location:
    Maastricht, The Netherlands
Scientific calculations definitely benefit from double precision, and I believe the double precision emulated on the single precision registers in Cell wasn't 100% up to spec either (or was it even the single precision by itself?). If you wanted full IEEE 754 compliance, you could enforce it, but this would reduce speed even more.

But apparently none of that is needed in current graphics programming. Perhaps it is needed in some modern physics calculations, but I wouldn't be surprised if it wasn't.
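As a small illustration of why scientific code reaches for double precision even when single precision nominally "works": repeatedly accumulating a value that binary32 cannot represent exactly drifts badly. A sketch in Python, emulating binary32 via struct round-trips (the exact drift depends on the rounding behaviour, so treat the magnitudes as indicative):

```python
import struct

def f32(x):
    """Round a Python float (binary64) to the nearest IEEE 754 binary32 value."""
    return struct.unpack('f', struct.pack('f', x))[0]

n = 1_000_000
step32 = f32(0.1)   # 0.1 is not exactly representable in binary

single = 0.0
double = 0.0
for _ in range(n):
    single = f32(single + step32)   # every addition rounds to binary32
    double = double + 0.1           # binary64 accumulation

# Single precision drifts by hundreds after a million additions of 0.1...
assert abs(single - 100_000.0) > 100
# ...while double precision stays within a tiny fraction of the true sum.
assert abs(double - 100_000.0) < 1e-3
```

Long-running simulations do exactly this kind of accumulation, which is why HPC codes usually insist on doubles even where graphics gets away with floats.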
     
  8. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
His assessment is absolute, but it is an opinion at the end of the day. If you read abelsson's post below it, you get a different opinion.

Not that their opinions (either of them) are new or different or interesting ultimately; as we all know, we've had these discussions around here a hundred times over every year since launch. Cell is not as easy to program for as some others, but at the same time many programmers have unknowingly made it harder on themselves than need be; the oft-discussed PS3 --> 360 porting vs the other way around scenario comes to mind. And even the guy comparing Cell to NetBurst (!) gives Cell a nod when code is optimized.

Again, I don't want us to put too much emphasis on some remote conversation on Reddit in this thread; isn't having that discussion what we hang around here to do anyway? :)
     
  9. Weaste

    Newcomer

    Joined:
    Nov 13, 2007
    Messages:
    175
    Likes Received:
    0
    Location:
    Castellon de la Plana
    I agree that there are applications that need it. I was referring to what he was saying from around 38:30 onwards.

    http://ocw.mit.edu/OcwWeb/Electrica...-2007/LectureNotesandVideo/detail/embed03.htm
     
  10. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
  11. Weaste

    Newcomer

    Joined:
    Nov 13, 2007
    Messages:
    175
    Likes Received:
    0
    Location:
    Castellon de la Plana
    He's explaining in that video why they placed double precision into PowerXCell 8i, and according to him it was to do with their target market of finance requiring it, due to the sheer amount of legacy code using double precision, but states clearly that it could easily be done in single precision.
     
  12. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
The fact that they did not do a redesign to let the IO take advantage of all sides of the chip was probably a trade-off decision; the costs and the gains were not worth it with regard to the volumes the chip would be produced in before it would be replaced.

A redesign of the IO layout will be inevitable at the next shrink anyway, so there is no need to worry that the IO doesn't scale; there is still plenty of space.
Maybe even for two more shrinks.
     
  13. assen

    Veteran

    Joined:
    May 21, 2003
    Messages:
    1,377
    Likes Received:
    19
    Location:
    Skirts of Vitosha
    HPC applications are peculiar in that when a particular multi-million, multi-TFlops machine is set up, often the software that will be run is custom-written or at least heavily customized for it. This particular market can live with "hard to program", because the benefits of, say, 5x the programming effort, will be directly returned in the form of fewer nodes / lower electricity bill down the line.

    This explains both their willingness to look at Cell several years ago, and their eagerness to move on to GPUs now.
     
  14. SEGA R&D

    Banned

    Joined:
    Nov 22, 2009
    Messages:
    2
    Likes Received:
    0
    The reason CELL has been dumped is because IBM is advising its programmers to move to the OpenCL and GPU processing model instead.
     
  15. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
OK, then it was basically for backward compatibility reasons in the financial applications. Scientific calculations do benefit from double precision; it was probably a requirement to get HPC deals like Roadrunner.

I think the original Cell did something like one double precision op every 6 cycles, two calculations (SIMD) in parallel per SPU, compared to one single precision op per cycle, four calculations (SIMD) in parallel per SPU. If you count a MADD as two ops, double those numbers. The PowerXCell 8i did the double precision ops in one cycle.

I think IBM never got their smart compiler (the one that would abstract away the SPUs) working efficiently, and that is one reason (among others) why Cell didn't become more successful for HPC purposes.
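Taking the issue rates above at face value, those per-SPU numbers turn into the commonly quoted peak figures. A back-of-envelope sketch (assumptions: 3.2 GHz clock, an FMA counted as two flops, and roughly one DP issue every 7 cycles on the original SPU - the exact stall count varies by source):

```python
CLOCK_GHZ = 3.2
FLOPS_PER_FMA = 2  # a fused multiply-add counts as two ops

# Single precision, original Cell SPU: 4-wide SIMD FMA, one issue per cycle.
sp_per_spu = 4 * FLOPS_PER_FMA * CLOCK_GHZ        # 25.6 GFLOPS
# Double precision, original Cell SPU: 2-wide SIMD FMA, roughly one issue
# every 7 cycles (a ~6-cycle stall after each DP instruction).
dp_per_spu = 2 * FLOPS_PER_FMA * CLOCK_GHZ / 7    # ~1.83 GFLOPS
# PowerXCell 8i: fully pipelined double precision, one issue per cycle.
dp_per_spu_8i = 2 * FLOPS_PER_FMA * CLOCK_GHZ     # 12.8 GFLOPS

print(sp_per_spu, round(dp_per_spu, 2), dp_per_spu_8i)
```

With 8 SPUs that gives about 204.8 GFLOPS single precision and about 14.6 GFLOPS double precision for the original Cell, versus about 102.4 GFLOPS double precision for the PowerXCell 8i - in line with the figures usually cited for the two chips.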
     
  16. AzBat

    AzBat Agent of the Bat
    Legend Veteran

    Joined:
    Apr 1, 2002
    Messages:
    5,952
    Likes Received:
    1,746
    Location:
    Alma, AR
Going to switch gears a sec... Most of this talk has been focusing on Sony and the Cell, but considering that the PPE in the Cell is pretty close to the cores in the Xenon CPU, wouldn't IBM's abandoning of the Cell have some effect on a possible derivative of the Xenon for the next Xbox too?

    Tommy McClain
     
  17. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
Right. Programmers getting used to GPUs and the OpenCL programming model will help IBM make more money. Bingo. :lol:
     
  18. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
IBM said they were abandoning Cell, not Xenon. I am sure MS sees Xenon as a future scalable architecture.
     
  19. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,741
    Likes Received:
    11,222
    Location:
    Under my bridge
Again, they didn't say that. We're awaiting clarification, but at this point all we know is that the PowerXCell 32i is shelved. A future Cell 2 or some other derivative may still be on the cards.

From IBM's POV, what else have they got to compete with the upcoming techs in the supercomputer space? Wouldn't they make more from selling systems with their own processors than with Intel's or Nvidia's?
     
  20. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    Agree with Shifty above - essentially, IBM hasn't said anything as yet. Though ironically this whole rumor firestorm may be the catalyst we need to finally get something tangible out of either Sony or IBM in terms of the future of the architecture.

    And on some of the above... Cell is supported under OpenCL as well. If anything, Cell *is* IBM's present entry into the many-core, high parallelism area, even if it's extremely niche right now.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.