*Game Development Issues*

Discussion in 'Console Industry' started by valioso, Oct 29, 2007.

Thread Status:
Not open for further replies.
  1. ERP

    ERP Moderator
    Moderator Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    The reason that they are fundamentally similar is that performance pretty much directly correlates with the way you access memory.

    The reason they still differ is that the "Cell" model isn't the only good way to keep memory accesses controlled, and for some workloads with potentially large datasets it isn't the best model on a machine like the 360.

    Also, it's not just about performance anymore; ease of implementation and maintenance figure into the equation as well.

    You'd be stunned how much engineering time can be lost when someone writes a function with a non-obvious bug, and someone else uses it six months after the original author left the company.

    But most of these types of operations can be abstracted from the implementation anyway, so it should be largely irrelevant unless you're designing interfaces that only support one implementation.
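
    To make that last point concrete, here is a minimal sketch (names entirely made up, not anyone's actual engine code) of the kind of narrow interface that can hide whether a kernel runs as an SPU job, on a 360 worker thread, or as a plain loop:

    Code:
        // Hypothetical, platform-neutral "process a chunk" interface (all names invented).
        // The caller describes *what* data to transform; each platform decides *how*
        // (an SPU job, a 360 worker thread, or just a plain loop) behind this interface.
        #include <cstddef>

        struct StreamKernel {
            // Transform 'count' elements from 'in' to 'out'. Buffers are assumed small
            // enough for whatever "local" memory the backend uses (SPU local store,
            // a locked cache region, or nothing special at all).
            virtual void run(const float* in, float* out, std::size_t count) = 0;
            virtual ~StreamKernel() = default;
        };

        // Game code only ever calls this; the kernel implementation is chosen per platform.
        void processInChunks(StreamKernel& kernel,
                             const float* src, float* dst,
                             std::size_t total, std::size_t chunk)
        {
            for (std::size_t i = 0; i < total; i += chunk) {
                const std::size_t n = (total - i < chunk) ? (total - i) : chunk;
                kernel.run(src + i, dst + i, n);   // backend may DMA, prefetch, or just loop
            }
        }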
     
  2. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    I can believe that!

    It (always) takes a special team to deliver the extra oomph.


    That's right. There is usually more than one way to tackle a problem.

    I understand. I have maintained someone else's 15-year-old code. It's even tougher if you are under strict time pressure and have non-technical people to service.
     
  3. Pete

    Pete Moderate Nuisance
    Moderator Legend Veteran

    Joined:
    Feb 7, 2002
    Messages:
    5,187
    Likes Received:
    735
    I think the 'budget' part of Mint's statement is his point: that any gain from that "more robust architecture" is outweighed by the effort to attain it (that's why Carmack said the 360 matches how games are developed now, and why Valve bitched so much about Cell). So while your neat new system may take less CPU time, it wasn't worth the manpower (while joker's saying it isn't time-consuming to implement Cell best practices on 360, that's separate from the argument that those practices might be limiting on 360 [though less so than vice versa]). In that sense, it's not optimal for the 360 from a holistic POV.

    nAo is saying all this is well and good, but PS3 is the reality, so hypotheticals to the contrary are as useful as designing with blinders on. To add to nAo's point, it's almost certain PS4 will be packing even more Cell, so railing against the inevitable would be time better spent optimizing for it.

    The argument (not yours) that 360 owners aren't getting the most for their money as a result of time spent 'accommodating' PS3 seems a bit short-sighted. Even imperfect competition is usually better from a consumer's POV.
     
  4. obonicus

    Veteran

    Joined:
    May 1, 2008
    Messages:
    4,939
    Likes Received:
    0
    Actually, though, what I'm referring to isn't CPU performance, but dev time. From what I've read and even accounts here, an 'SPE-centered' approach enforces certain behavior that may help prevent race conditions, as it forces you to be far more aware of your thread's data (I'm going by Insomniac's SPE Shader model, which seems like a good baseline). Naturally, it's not going to be a panacea; there's the old adage that you can write FORTRAN in any language.

    Maybe Carmack hasn't seen much of an advantage because he's already extremely careful regarding threads: the extra overhead could just be a PITA. (But then, as the legend goes, Von Neumann thought assemblers were a waste of time.)
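
    As a rough illustration of why that style helps (a hedged sketch only, not Insomniac's actual API; every name below is hypothetical), the key idea is that a job declares up front exactly which memory it reads and writes, so the job body never touches shared state implicitly:

    Code:
        // Hedged sketch, NOT Insomniac's actual API: a job declares up front exactly
        // which memory it reads and which it writes. Because the job body only ever
        // touches what is in its descriptor, two jobs cannot silently race on shared state.
        #include <cstddef>
        #include <cstring>

        struct JobDesc {
            const void* input;       // read-only source block
            std::size_t inputSize;
            void*       output;      // private destination block
            std::size_t outputSize;
        };

        // The "shader" part: a free function that sees only its descriptor, nothing global.
        typedef void (*JobFn)(const JobDesc& job);

        void exampleCopyJob(const JobDesc& job)   // hypothetical job body
        {
            // Reads job.input, writes job.output, and touches no other memory.
            const std::size_t n = (job.inputSize < job.outputSize) ? job.inputSize
                                                                   : job.outputSize;
            std::memcpy(job.output, job.input, n);
        }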
     
    #764 obonicus, Aug 7, 2008
    Last edited by a moderator: Aug 7, 2008
  5. scently

    Regular

    Joined:
    Jun 12, 2008
    Messages:
    979
    Likes Received:
    169
    Exactly. I really want to understand the statement that the Xenon is not a good CPU. It is not good with regard to what... the Cell? Because of theoretical performance or actual performance? Or are we now saying that, given the choice, you would choose the Cell over Xenon? By no means am I a developer; it's just that since the beginning of this gen all I hear is Cell this and Cell that, without any proof.

    I would think that the best way to analyse these systems is as a system overall.
    I would like to hear what multiplat devs have to say about Xenon being "bad".

    Mod : I've added the missing capitals for you. Maybe this got moved before you saw it, but please read and adapt. : http://forum.beyond3d.com/showthread.php?t=49336
     
  6. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    12,480
    Likes Received:
    2,781
    It looks to me that Xenon IS a very good CPU. I don't think there is a single developer that says otherwise.

    According to some insightful replies here when I asked in the past, it seems that Cell is probably a bit more powerful, albeit harder to work with.
     
  7. jandlecack

    Banned

    Joined:
    Jul 19, 2008
    Messages:
    298
    Likes Received:
    0
    You know, stuff like this has been said for the past 3 or so years. Doesn't make much of a difference if the most talented technology coders aren't aboard your ship. Looking at 360 hardware as a whole, I am starting to believe the first-party developers are too rooted in PC development to even begin maximizing and coding effectively for the one configuration. The 360 CPU is actually unremarkable; it's the GPU that, if maxed out, should still give a game the edge over PS3. Isn't happening.

    Whether or not teams that focus solely on PS3 can really use the Cell enough to improve "graphics", or trick out the RSX to a good extent, we are still seeing the most impressive-looking games exclusively on PS3 (Uncharted being number one; those textures + colours are unrivalled).

    Yeah, Mass Effect looked great, but did it run great? Far from it. It felt like it was chucked onto the system, even though the PC version was the port (and a much better build).
     
  8. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    12,480
    Likes Received:
    2,781
    Can't disagree.

    Doesn't a similar thing also apply to the PS3, though, due to the difficulty of programming the Cell? It will all depend on the tools of course, and I wonder sometimes if the Cell is able to "branch" all game-related tasks efficiently to the SPUs.

    It is still an exotic architecture and probably lacks some flexibility when it comes to what you can throw at the SPUs. Any clarifications from the experts in here? ;)
     
  9. joker454

    Veteran

    Joined:
    Dec 28, 2006
    Messages:
    3,819
    Likes Received:
    139
    Location:
    So. Cal.
    If you code for it with older programming methodologies then it will run kinda slow. However, there are still three cores in there. So, short term, there was still enough power to use non-optimal existing methods and/or code bases and make framerate, so long as you at least made some basic use of threading. As with the rest of the 360, I think they struck a very good balance in the CPU design, because you could get away with non-optimal code in the short term, yet performance gets much better over time as you slowly switch to the new techniques it prefers. This was *critical* in the short term when everyone was scrambling to ship product, and it gives it good legs long term because code keeps getting faster as we re-write more of it to suit its liking.
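
    For what it's worth, the "basic use of threading" above can be as simple as fanning a frame's independent systems out across the hardware threads. A toy sketch (std::thread used for brevity; a real 360 title would use the platform's own thread APIs and pin each thread to a specific core):

    Code:
        // Toy illustration of "basic use of threading": run a frame's independent
        // systems on separate hardware threads instead of one serial loop.
        #include <thread>

        // Assumed to share no mutable state (stub bodies for illustration only).
        void updateAI()        { /* ... */ }
        void updatePhysics()   { /* ... */ }
        void updateParticles() { /* ... */ }

        void updateFrame()
        {
            std::thread ai(updateAI);
            std::thread physics(updatePhysics);
            updateParticles();   // keep the calling thread busy as well
            ai.join();
            physics.join();
        }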
     
  10. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
    The issue is not about quality of individuals as much as the fact that it's nigh impossible to have good codebase quality control as the team size increases.
    As ERP said, you get a development process where it's "quantity over quality" as far as code goes, and optimization becomes a patching process, which is not a great way to approach things if you want true efficiency, regardless of the underlying hardware.

    Ehm... It's OK conceptually, but I don't think I know anyone who thinks IBM did a good job with the console PPC cores (Xenon or Cell alike).

    It's his point of view because that's one of the realities of the industry. And it's not exclusive, or new to this generation of consoles either.

    On the same line of reasoning, C++ is no guarantee of better games either, but we still code games that way instead of, say, Java, even though it means we spend more time coding.
    Or more likely, just throw more people at the problem.
     
  11. nAo

    nAo Nutella Nutellae
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,333
    Likes Received:
    129
    Location:
    San Francisco
    You got it wrong. What really upsets me is a half-trolling comment based on my passions, not on my arguments.
    Who are you referring to? My preferences are crystal clear.

    Then avoid inflammatory comments not based on any factual statement.

    Multiplatform games are all about the lowest common denominator and always will be. Developing on 360 and PS3 using PS3 as a 'lead platform' won't generally hurt 360; in many cases PS3 coding practices will actually benefit 360.

    The proof that you are wrong is in the mythological pudding: developers already went down the so-called lazy route; when you don't know what to do, you take the simplest and apparently most logical decision. Too bad they/we were wrong, and this is why the tune is changing. Change is always painful; almost no one changes just for the sake of it. It's evolution, baby: you change or you die!
    And for the 1000th time, it's not about personal preferences, it's about shipping the best multiplatform game possible while trying not to die in the process.
    If you are looking for games that push your 360 then you have to search elsewhere, try with 360-only titles :)
     
  12. whome0

    Newcomer

    Joined:
    Feb 21, 2008
    Messages:
    235
    Likes Received:
    7
    Regarding the older programming methods, are you referring to a monolithic single-threaded game engine, the way old-school PC game engines were built?

    It's actually surprising that early multi-platform games "made by lazy devs" even ran on PS3 and delivered some sort of (playable) game experience. I mean, as we have understood it, an Xbox360-copy-paste-to-PS3 port should be a failure from the start. Not to mention the time constraints given to the multiplat devs; those must have been miserably long and stressful days.

    About the many-core game engines: what are the tasks PS3 developers run on the SPU cores nowadays? AI, physics simulation, audio mixing, animation transforms, etc.?

    SPU cores have local RAM, and both data and program must fit in a core. An SPU does not see the whole 256MB+256MB memory area in one go, but must manually fetch/stream the active data into its local SPU RAM.

    Are Xbox 360 engines (the new way) built to simulate the same very small independent tasks, with a TaskManager thread controlling the workflow? Are tasks implemented to fetch the active data into a "local task RAM" and write it back before fetching a new chunk?
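
    In pseudo-form, what I mean is something like this (purely illustrative, not from any shipped engine; plain memcpy stands in for the DMA an SPU would do): the task only ever works on a small fixed local buffer, pulling a chunk in, processing it, and writing it back before fetching the next one.

    Code:
        // Purely illustrative "local task RAM" loop, not from any shipped engine.
        // The task only ever touches a small fixed buffer; data is explicitly pulled
        // in, processed, and pushed back out, chunk by chunk. On an SPU the two
        // copies below would be MFC DMA transfers; here memcpy stands in for them.
        #include <cstddef>
        #include <cstring>

        const std::size_t kLocalBytes = 16 * 1024;   // pretend "local store"

        void runStreamingTask(const unsigned char* src, unsigned char* dst, std::size_t total)
        {
            unsigned char local[kLocalBytes];

            for (std::size_t offset = 0; offset < total; offset += kLocalBytes) {
                const std::size_t n = (total - offset < kLocalBytes) ? (total - offset)
                                                                     : kLocalBytes;
                std::memcpy(local, src + offset, n);          // "DMA in"
                for (std::size_t i = 0; i < n; ++i)           // work entirely in local memory
                    local[i] = static_cast<unsigned char>(~local[i]);
                std::memcpy(dst + offset, local, n);          // "DMA out"
            }
        }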
     
  13. T.B.

    Newcomer

    Joined:
    Mar 11, 2008
    Messages:
    156
    Likes Received:
    0
    Well, you can't really release the same game twice, once with and once without issues. But at the end of the day, this is more of a marketing thing and we're engineers. We try to deliver what marketing wants, and they don't want inferior versions. ;) So the pressure may not come from the market itself, but it's still there.

    On a less hearsay basis, I promise you that you will see a lot less cross-marketing opportunities from the manufacturer of the inferior version. This may or may not impact sales, but it will impact your own marketing budget/clout.

    I have to wave the NDA-flag here. But think about a limited HDD and unlimited numbers of games and you'll get an idea. Sorry.

    Well, inside an SPE, the SPU can only communicate through channels. Those are very minimalistic and basically lead to the MFC. The MFC is mostly an overglorified DMA processor, so it moves data to and from local store. If you want to access OS services, you can of course use a background thread that waits for commands from the SPEs and then processes them. But what do you use the SPU for then? (Disclaimer: I'm not an expert in writing operating systems on Cell. I may be missing something here.)
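
    To show what I mean by "overglorified DMA processor", the SPU-side pattern is basically get / wait / work / put, roughly like this (written from memory of the Cell SDK's spu_mfcio.h intrinsics, so treat the exact names and argument order as approximate):

    Code:
        /* SPU-side sketch of the get / wait / work / put pattern. Written from memory
         * of the Cell SDK's spu_mfcio.h intrinsics; treat exact names and argument
         * order as approximate. Assumes 'ea' and 'size' meet the usual DMA alignment
         * rules and that 'size' is no larger than the buffer (16 KB here). */
        #include <spu_mfcio.h>

        #define DMA_TAG 3

        static char ls_buffer[16 * 1024] __attribute__((aligned(128)));

        void process_block(unsigned long long ea, unsigned int size)
        {
            mfc_get(ls_buffer, ea, size, DMA_TAG, 0, 0);   /* main memory -> local store */
            mfc_write_tag_mask(1 << DMA_TAG);
            mfc_read_tag_status_all();                     /* block until the transfer lands */

            /* ... work on ls_buffer; nothing outside local store is visible ... */

            mfc_put(ls_buffer, ea, size, DMA_TAG, 0, 0);   /* local store -> main memory */
            mfc_write_tag_mask(1 << DMA_TAG);
            mfc_read_tag_status_all();
        }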

    It's not misallocating, it's just sub-optimal. Basically, people who have to decide methodology for large shops *should* be risk-averse. It's a good thing, I guess, most of the time. People need time to switch methodologies, and some take longer than others. Now if you have a 30-programmer team and your hot-shot engine programmers go fully job-driven, this will cause trouble for the less technically inclined people. And on the other hand, your hot-shot tech dudes don't usually make these decisions. ;)

    I'm not totally sure if I'm on the same page as nAo (sorry for inserting myself into your discussion, BTW), but I don't see it as Cell vs. Xenon. Yes, writing the SPU-code takes time and writing the VMX128 code takes time as well. These are the platform specifics. The rest is architecture you would even want on the PC. Interestingly, one of the first instances of "we have to do it like that" in our project came from the PC tech-god, btw.

    Point taken. It all depends on where you want to compete. I can make awesome XBLA titles that run on one core. If I want to compete with the big boys, I'll have to get as much out of the machine as possible. But sure, if I compete on gameplay and not on tech, PS3-like coding is overkill, no doubt. Even on the PS3.

    Of course! There are more models than the two. Hell, computer science is so full of models, you'd think it's fashion week.
     
  14. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    194
    Location:
    Stateless
    In regard to this, there was a presentation during Gamefest 2008.
    I tried to gather information in the related thread (tech section); nobody answered, and as hinted by some members, it must be under NDA.
     
  15. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,071
    Likes Received:
    1,664
    Location:
    Maastricht, The Netherlands
    Yeah, this doesn't sound completely right. Maybe someone can chime in ... I do have the full 10MB IBM Cell Programmer's Manual lying around here as a PDF but no time right now. I have never seen anyone talk about SPU and SPE like this before anyway, so that's interesting in itself. I think the point is that an SPE is quite fully featured - I heard Insomniac talk about running the game main loop from an SPE and that this may even be the best way to do it.

    Actually an answer to this question probably helps more - why are you talking about the SPU's capabilities in isolation of the SPE here?

    That does sound like you're on the same page as nAo.

    Ah but that I disagree with. Think about something simple like Super Stardust (or in a different way, Geometry Wars), a game with some really excellent gameplay. It's totally reinvigorated by using the Cell to allow for such an insane amount of objects flying around, colliding with each other, chasing you, and so on. There are a lot of new styles of gameplay involving physics (think from MotorStorm, through Pain, to LittleBigPlanet) that can offer something new in terms of gameplay. I still believe that in this respect, things haven't changed that much from the very first days of computing - gameplay and tech are still very strongly linked. However, you can still also do something like Fl0w or Braid of course, that doesn't need a lot of tech and still oozes gameplay, although then again the former does benefit a lot from motion sensing ... which brings me to the Wii and gameplay, and tech. I think you see where I'm going with this. ;)
     
  16. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,071
    Likes Received:
    1,664
    Location:
    Maastricht, The Netherlands
    I remember that there have been some discussions here about locking parts of the Xenon cache to work in a similar way to the SPUs' local store, and something similar with the VMX unit in particular.
     
  17. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    194
    Location:
    Stateless
    I also read something like that, but based on the comments it looks pretty much useless, and it's pretty much standard across PPC implementations.

    I was in fact thinking about these two presentations:

    Microsoft Directions in Parallel Computing and Some Short Term Help:
    This talk focuses on the native task scheduler being announced by the Parallel Computing Platform group in Microsoft this spring and offerings that are available in the XDK. The scheduling of tasks within games can improve resource utilization, load balancing, and performance. For games targeting the current generation of PCs and the Xbox 360 console, we discuss an interim solution. Previous talks given on this topic laid the foundation for using tasks to move work required by the engine from an over-utilized hardware core to an underutilized core. A progression of task and scheduler designs is presented that start with simple implementations and move to more complex designs that have low-overhead. The simple implementations are satisfactory for a small number of tasks but impose a prohibitive amount of overhead when the number of tasks grows. Finally, we present the work-stealing algorithm that pulls work from one core to another in the low-overhead scheduler.

    And

    I also think that this gives us another hint:
    Xbox 360 Compiler and PgoLite Update:
    The Xbox 360 compiler has changed dramatically in the last year, which changes the rules for how to write efficient Xbox 360 code. Many of the improvements automatically make your code faster, but others require you to change your code in order to reap the benefits. PgoLite has also improved and should be used differently, to get even better results. This talk summarizes the past year's developments, and gives simple rules for how to get maximum benefit from the changes.

    the proper thread is here:
    http://forum.beyond3d.com/showthread.php?t=49183

    But sadly most of this seems under NDA.
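
    To give an idea of what the work-stealing part of that first abstract is describing (this is only a toy illustration, nothing to do with Microsoft's actual scheduler, and deliberately simplified with mutex-guarded deques instead of lock-free ones): each worker pops tasks from the back of its own queue and, when it runs dry, steals from the front of another worker's queue.

    Code:
        // Toy work-stealing sketch; not Microsoft's scheduler, deliberately simplified.
        #include <cstddef>
        #include <deque>
        #include <functional>
        #include <mutex>
        #include <vector>

        typedef std::function<void()> Task;

        struct WorkerQueue {
            std::deque<Task> tasks;
            std::mutex       lock;
        };

        // A worker pops from the back of its own queue; when that runs dry it steals
        // from the front of another worker's queue, so the two ends rarely collide.
        bool runOneTask(std::vector<WorkerQueue>& queues, std::size_t self)
        {
            Task task;
            {
                std::lock_guard<std::mutex> g(queues[self].lock);
                if (!queues[self].tasks.empty()) {
                    task = queues[self].tasks.back();
                    queues[self].tasks.pop_back();
                }
            }
            for (std::size_t v = 0; !task && v < queues.size(); ++v) {   // try to steal
                if (v == self) continue;
                std::lock_guard<std::mutex> g(queues[v].lock);
                if (!queues[v].tasks.empty()) {
                    task = queues[v].tasks.front();
                    queues[v].tasks.pop_front();
                }
            }
            if (!task) return false;   // nothing to run anywhere right now
            task();
            return true;
        }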
     
  18. T.B.

    Newcomer

    Joined:
    Mar 11, 2008
    Messages:
    156
    Likes Received:
    0
    Well, it's IBM nomenclature. SPE=SPU+MFC. My reason for breaking it up is to show that you'll need someone on the same *PE as the OS to make calls to the driver. I guess you can run an OS on the SPEs instead of the PPE, as all IO is memory-mapped, IIRC. But this is academic, as nobody does that. :)
     
  19. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    194
    Location:
    Stateless
    I remember from early discussions about this that a Cell without a PPU couldn't run/boot an OS.
    EDIT
    I don't remember the reason, maybe some IRQs not being supported, but it's been a while (likely to be wrong).
     
  20. chachi

    Newcomer

    Joined:
    Sep 15, 2004
    Messages:
    120
    Likes Received:
    3
    Which is why I said it the way I did.

    Coding practices are agnostic; they don't generally care what platform you implement them on. How well they run on a given platform is a different story, but that's not really the point. Nothing I've seen you or anyone supporting your position say suggests how your prescribed solution (leading on the PS3) will lead to better games, just less controversy.

    The "mythological" pudding says no such thing. Your so-called "lazy" way led to some of the best-looking games among both 1st and 3rd parties. Assassin's Creed, for example. It took them a while to get the PS3 version running as well as the 360 one, but I'd much rather see that than both systems running what is easily done on the weakest platform with the current tools. For sure it'd be easier for Ubi to do what you suggest. They'd have saved money, and since all anybody ever saw was the lowest common denominator, nobody could complain much, but it wouldn't have made a better game.

    What makes it about personal preference is your suggestion that leading on the PS3 is the only or best way to implement this, and that if the games don't push the platforms quite as hard then it's still all to the good because it's only multiplatform after all. I don't see it that way, and I don't think it's healthy for the industry to cement the 3rd parties in such a subservient role. That way lies Nintendodom.

    I think especially for multiplatform games the competition is what improves the platform. If the 360 wasn't around to show up lower-resolution textures on the PS3, would Sony bother to shrink their system memory usage? If the 360 didn't have a better vertex rate, would they have worked so hard to develop the Edge tools? How about online: if the 360 wasn't pushing Sony, would we have free online and Sony spending a mint to improve their services?

    Well, I'd have to get one first. :)
     