John Carmack on PS3 Video

Discussion in 'Console Industry' started by Ben-Nice, May 12, 2006.

  1. Legend

    Regular

    Joined:
    Mar 7, 2005
    Messages:
    275
    Likes Received:
    2
    For us slowpokes (dial-up users), can someone post the key points mentioned, please?

    Thanks.
     
  2. Kryton

    Regular

    Joined:
    Oct 26, 2005
    Messages:
    273
    Likes Received:
    8
    It's just the usual:
    * Sony picked an asymmetric design for Cell
    * MS picked symmetric
    * Better peak theoretical performance on Cell
    * Nothing extremely bad with it, but it will take a little more work, which studios closer to Sony will do but id won't
     
  3. cyberheater

    Newcomer

    Joined:
    Jul 14, 2005
    Messages:
    27
    Likes Received:
    2
    You missed out that Carmack thinks Sony chose the wrong design.
     
  4. Gholbine

    Regular

    Joined:
    Jun 19, 2005
    Messages:
    294
    Likes Received:
    1
    Which can be reduced to: "Cell means more work for me therefore it was the wrong decision."
     
  5. hey69

    hey69 i have a monster
    Veteran

    Joined:
    May 13, 2003
    Messages:
    2,930
    Likes Received:
    33
    Location:
    Belcika
    But he said the PS3 has more peak performance...

    And how are 3 cores symmetrical, actually?
     
  6. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,666
    Likes Received:
    5,759
    Location:
    ಠ_ಠ
    "identical" or "triplets" is probably what he meant. :p

    Anyone know where I can get the full interview?
     
  7. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,606
    Likes Received:
    11,034
    Location:
    Under my bridge
    Symmetrical in this case means the same. Cell is asymmetric because it mixes different types of processors, though to be honest it's only two; it's not 7 different processors. Devs already know how to write for the PPE. The problem Carmack really faces is learning how to use the SPEs effectively, as they are different. If Cell consisted of one PPC core and 2 Intel cores, it'd still be asymmetric, but he wouldn't complain so much ;)

    What he did say, which I don't think is right, was that writing for the SPEs needs different tools and a different compiler from writing for the PPE, and he even called the PPEs Cells.
     
  8. ERP

    ERP Moderator
    Moderator Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    No, he's right; the compiler is obviously different.
    The issue with the SPUs is their limited access to main memory; it's hard to work around for a lot of obviously parallel tasks.
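The constraint ERP describes forces a particular code shape. A rough sketch in Python (all names invented for illustration; real SPU code would use the Cell SDK's DMA facilities, not ordinary loads): instead of chasing pointers in main memory, you stream data through a small fixed-size local buffer, chunk by chunk.

```python
# Conceptual sketch, not real Cell code: an SPU cannot dereference main
# memory directly. It DMAs fixed-size chunks into its small local store,
# processes them there, and DMAs results back out.

LOCAL_STORE_BUDGET = 16  # elements per chunk; tiny, just for demonstration

def process_chunk(chunk):
    """Stand-in for SPU-side work on data resident in the local store."""
    return [x * 2 for x in chunk]

def spu_style_transform(main_memory, chunk_size=LOCAL_STORE_BUDGET):
    """Stream an array through a bounded 'local store' chunk by chunk,
    the way SPU code stages data with DMA instead of random access."""
    out = []
    for base in range(0, len(main_memory), chunk_size):
        local_store = main_memory[base:base + chunk_size]  # 'DMA in'
        out.extend(process_chunk(local_store))             # compute locally
    return out                                             # 'DMA out'
```

In practice SPU code also overlaps the transfer of chunk N+1 with the computation on chunk N (double buffering), which is exactly the restructuring work the thread is arguing about.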
     
  9. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,606
    Likes Received:
    11,034
    Location:
    Under my bridge
    Ooo. So SPE apulets are compiled separately and built into the main program? Doesn't the IDE provide SPE and PPE development within the same program?
     
  10. archie4oz

    archie4oz ea_spouse is H4WT!
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,608
    Likes Received:
    30
    Location:
    53:4F:4E:59
    Bah, that's no big deal... At least you have a compiler for the SPEs... It's not like the PS2, where you've got 2 different compiler targets, and none for the VUs (actually there were a couple of compilers, but nothing great), along with a build chain having to deal with 3 different sets of vector opcodes... In this regard the PS3 is FAR simpler and cleaner...
     
  11. pegisys

    Regular

    Joined:
    Mar 2, 2005
    Messages:
    593
    Likes Received:
    4
    I think that's the point; id didn't make any PS2 games either. I think the thing about Carmack is that he likes to try new stuff and not have to rethink ways to get simple things done.
     
  12. Farid

    Farid Artist formely known as Vysez
    Veteran Subscriber

    Joined:
    Mar 22, 2004
    Messages:
    3,844
    Likes Received:
    108
    Location:
    Paris, France
    id themselves didn't develop anything for today's consoles, be it the PS2, the GC, the Xbox or even the DC.

    Quake III was ported to the PS2 by EA, and it was ported to the DC by Raster.
    Doom III on Xbox is the work of Activision's Vicarious Visions.

    The last game id developed on a console was Doom for the Atari Jaguar...
     
  13. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    Yes, but for obvious, embarrassingly parallel tasks, one doesn't want a huge shared memory pool. One wants local memory to avoid concurrency issues and the need to use concurrent data structures or locks.

    Yes, a UMA and SMP architecture presents an easier model to code for, but it does not necessarily provide optimal performance. To get optimal performance, one might need to implement local memory areas anyway, to get rid of lock contention and to allow the compiler to make optimizations it may not be able to do if the default assumption is conservative (aliasing, concurrency).

    I think when it comes down to it, if you want high performance on tasks which can be parallelized, you need to write your code in a way that takes advantage of the inherent data-parallel nature of your problem, rather than thinking you can get away with a serial algorithm with a couple of OpenMP annotations launched in multiple threads.
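The local-memory-over-locks point can be sketched concretely. A minimal Python illustration (the helper names are invented): each parallel task builds its own private result, and the only shared step is a single merge at the end, so no lock ever guards the hot loop.

```python
# Illustrative sketch of the point above: give each parallel task its own
# local accumulator and merge afterwards, instead of having every task
# contend for one shared structure behind a lock.
from concurrent.futures import ThreadPoolExecutor

def local_histogram(chunk):
    """Each worker fills a private histogram -- no sharing, no locking."""
    h = {}
    for v in chunk:
        h[v] = h.get(v, 0) + 1
    return h

def parallel_histogram(data, workers=4):
    """Split the input, map over independent chunks, then reduce once."""
    step = max(1, len(data) // workers)
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    merged = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(local_histogram, chunks):
            for key, count in partial.items():   # the only 'shared' step
                merged[key] = merged.get(key, 0) + count
    return merged
```

The serial-plus-annotations approach DemoCoder warns against would instead have every thread incrementing one shared dictionary under a mutex, serializing the very work you were trying to parallelize.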
     
    Grall likes this.
  14. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,606
    Likes Received:
    11,034
    Location:
    Under my bridge
    Which appears to be Carmack's point - he doesn't think the extra performance is worth the extra effort, it seems. It's interesting to see the different takes on these machines. You also get some devs saying PS3 is a challenge, but an exciting one with lots of potential. Everyone has their tastes. I guess for games it depends where the average lies as to whether exotic hardware gets used properly or not.
     
  15. DudeMiester

    Regular

    Joined:
    Aug 10, 2004
    Messages:
    636
    Likes Received:
    10
    Location:
    San Francisco, CA
    If you want a friendly coding environment that badly, I'm sure it can be provided via a software layer. In fact, I wouldn't doubt that Sony will make such a library available.
     
  16. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    That depends entirely on what you do with the extra performance. By his definition, he doesn't think it's worth it, and maybe it isn't for the kinds of problems he is trying to solve (MegaTexture). Carmack seems to desire a high tex:alu ratio right now, for example, which is the opposite of what other devs are pushing (low tex:alu).

    But other devs might have novel uses for SPUs on algorithms that Carmack has no interest in. I do think Carmack is one of the best game programmers out there, but Doom, for example, represents one particular vision of where games can go. Each engine imparts its own special set of restrictions depending on what it is optimized for.

    I think one can definitely say that Xenos is probably easier to code for, and easier to avoid penalties on and achieve optimality with. But XGPU vs CELL is a different story, and I don't think one can conclude that ease of use for a fixed platform target that may lead to lower peak throughput is necessarily ideal.

    Ease of use is a factor if you want massive adoption at all levels. For a console, I may not want shovelware developers. I may want top-tier developers who can wrench maximal performance from my system and come up with innovative designs. I want my top-tier developers to have ease of use and good tools, but one shouldn't design a system where the hands of those developers will be tied.

    For example, one could abstract away the details of everything and force devs to go through a virtual machine layer like .NET, with a built-in scenegraph language, and that may make for rapid development, fewer bugs, and ease of use, but it will lower the maximum achievable performance.

    Rapid Application Development and maximum-performance, high-quality games may not go together. RAD means development productivity. But you have to be careful that the extra productivity doesn't give up too much in trade.
     
  17. ERP

    ERP Moderator
    Moderator Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Sure, but how many embarrassingly parallel tasks are there in your average game?
    And how many can you add that actually add value to the game?

    Here's my current thought on this, having not gone through an entire dev cycle, so I'm likely to change my mind. I am pretty positive the limiting factor will be the speed of the PPU; we'll offload all the easy stuff, and probably add stuff that can be easily offloaded. But I don't believe that at their core most game engines will be particularly parallel.

    I think there is a real desire to scale gameplay. I want to put 100 people in the streets of a city to populate it, instead of the 20 or so in GTA. I don't want them to disappear when I turn around. I'd like them to exhibit reasonable behavior in response to what's happening, and I'd rather they didn't run into each other all the time and look stupid. All this stuff will likely increase the load on the main game thread.

    Most of the game engines I've seen don't even do things like attempt to batch physics queries, or deal with the fact that they might be asynchronous. It's very much "do a raycast here and change state in response to it".

    Parallelism isn't trivial in general, and it's easy to actually reduce performance if you're not careful; even something as seemingly trivial as creating an object asynchronously has subtle gotchas that will screw you if you're not careful. And unfortunately a lot of things appear to work until they don't.

    IMO it's going to be a long slow transition to effective parallelism in game engines.
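The query-batching idea mentioned above can be made concrete with a small sketch (all class and parameter names here are invented, not from any real engine): game code queues raycasts with callbacks instead of demanding an immediate answer, and the engine resolves the whole batch at a sync point, where it could fan the work out to worker threads or SPUs.

```python
# Hedged sketch of batched, asynchronous physics queries: instead of
# "do a raycast here and change state in response", callers queue queries
# and the state change happens later, in a callback.

class RaycastBatcher:
    def __init__(self, scene_hit_test):
        self._hit_test = scene_hit_test   # the physics engine's actual query
        self._pending = []                # queued (ray, callback) pairs

    def request(self, ray, callback):
        """Queue a raycast; no result is available until the next flush."""
        self._pending.append((ray, callback))

    def flush(self):
        """Resolve every queued query in one batch. A real engine could
        hand this whole list to worker threads or SPUs at once."""
        batch, self._pending = self._pending, []
        for ray, callback in batch:
            callback(self._hit_test(ray))

# Usage: AI agents ask "can I see the player?" without blocking the frame.
seen = []
batcher = RaycastBatcher(scene_hit_test=lambda ray: ray < 10)  # toy 'scene'
batcher.request(ray=5, callback=seen.append)
batcher.request(ray=50, callback=seen.append)
batcher.flush()   # seen is now [True, False]
```

The restructuring cost is exactly what the post describes: every caller has to tolerate getting its answer a frame (or a sync point) later than it asked.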
     
  18. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    Wow, the 3rd time around with this info was the charm :D

    I think it is interesting to see how different developers react to the systems, and as Demo said, that may be due to the different problems they want to solve.

    Carmack has said he wished the consoles were single-core this generation and had moved to multi-core next gen. While this recent interview is more "PS3 v 360", in the past he has stated he was not very happy with either CPU, just more unhappy with Cell. Then again, what solution was there? CPU makers hit the wall 3 years ago; there were not many options. It seems to be more an industry problem, and we should have been looking at parallelization in hardware, tools, and software years ago. Then again, he is pretty up front that he thinks both consoles are not bad (from a dev perspective), unlike last gen. So not all his comments were negative.

    John and id Software seem to focus on "smaller" developer teams, and as he has noted in the past, the issue is not necessarily him but the other developers he is concerned about. They seem to be pretty close with Raven, Splash Damage, etc.

    If I understand him correctly, I think his general philosophy is that the platforms should enable him to spend as much time as possible working on the game and not fighting hardware. Dev time and budget are increasing, and game complexity is increasing. This frequently means your team size increases, and with it the risk of more problems. Multi-core processors just increase the stress of going from one generation to the next. Further adding to the complexity with asymmetric cores is just another thing to worry about.

    Of course Carmack does not speak for everyone in the industry (PS2 devs are probably shaking their heads!), but his concerns seem pretty reasonable from a PC developer's perspective and as the owner of an independent game studio. This is one reason I don't see MS and Sony getting together on a console. As similar as they are, they also have different philosophies, approaches, and vested interests.

    What I would like to ask John is: "What would have been your preferred multi-core solution?" My guess would have been a dual-core OOOe x86. Of course size, heat, and cost issues are there, as well as IP ownership and floating-point performance.
     
    Frank likes this.
  19. Frank

    Frank Certified not a majority
    Veteran

    Joined:
    Sep 21, 2003
    Messages:
    3,187
    Likes Received:
    59
    Location:
    Sittard, the Netherlands
    Simple breakdown: John Carmack isn't happy about changing his habits. While there is much to gain by doing it the PS3 way, it's a lot of hard work. And he was doing very well doing what he did, thank you very much. And the Xbox 360 is very much how he is used to things: Visual Studio and all the common classes and objects.

    Cynical? Yes. True? Yes. The higher people are elevated, the less they like having to start over and prove themselves once again. When you're considered a demi-god, you're not scrambling to change your territory and run the risk of getting your ass kicked by the inhabitants.
     
    Nite_Hawk likes this.