*Game Development Issues*

Discussion in 'Console Industry' started by valioso, Oct 29, 2007.

Thread Status:
Not open for further replies.
  1. FutureCTO

    Banned

    Joined:
    Oct 21, 2006
    Messages:
    103
    Likes Received:
    0
    Surprisingly true.
    The problem with the Xbox is CPU latency and core usage.
    The problem with the PS3 is that Cell can't work directly out of GDDR3; data has to be moved into XDR first.
    Otherwise it is limited to 256MB minus the OS reserve.

    Both systems use 2006-era graphics chips that would normally carry 256MB of graphics RAM each,
    yet only 256MB was planned for general processing. 512MB would have been a nice fit.
    768MB would have been nearly perfect this generation.

    Otherwise everybody has to go to a lower resolution to free up memory for general processing.
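    Rough numbers for what a lower resolution actually frees up, assuming a plain double-buffered 32-bit color target plus a 32-bit depth buffer, no MSAA, and ignoring eDRAM tricks entirely (so this is a ballpark sketch, not either console's real layout):

    ```c
    #include <stdio.h>

    /* Bytes for a simple double-buffered render target:
     * two 32-bit color buffers plus one 32-bit depth/stencil buffer.
     * Ignores MSAA, tiling and eDRAM, so it is only a ballpark. */
    static unsigned long framebuffer_bytes(unsigned w, unsigned h)
    {
        const unsigned long pixels = (unsigned long)w * h;
        return pixels * (2 * 4 + 4);   /* 2 color buffers + 1 depth, 4 bytes each */
    }

    int main(void)
    {
        unsigned long hd  = framebuffer_bytes(1280, 720);  /* 720p */
        unsigned long sub = framebuffer_bytes(1152, 640);  /* 640p, as in Halo 3 */
        printf("720p: %lu bytes, 640p: %lu bytes, freed: %lu bytes\n",
               hd, sub, hd - sub);
        return 0;
    }
    ```

    Under those assumptions the drop from 720p to 640p only frees about 2.1MB; MSAA and extra render targets multiply the difference.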
     
    #521 FutureCTO, Feb 28, 2008
    Last edited by a moderator: Feb 28, 2008
  2. chachi

    Newcomer

    Joined:
    Sep 15, 2004
    Messages:
    120
    Likes Received:
    3
    Like I said before, you'll never know that a game has been downgraded if you never see a superior version of it. That's too bad, gamers are ultimately the ones who lose in this scenario. Hopefully LucasArts will lose enough sales that just dropping the baseline doesn't become the easy way to avoid the "lazy developers" tag.
     
  3. woundingchaney

    Regular

    Joined:
    Jan 29, 2006
    Messages:
    799
    Likes Received:
    1
    Location:
    Terre Haute, IN


    Why put 768MB of video memory on a 128-bit bus? Given that the buses in these architectures are not traditional, one has to question what would be overkill in an amount-vs-usability scenario.

    There are various aspects of both architectures that would need to be addressed in order to "raise" limitations beyond their "low" memory allocations. I do agree that raising the amount of memory would create some headroom, but it wouldn't be the end-all solution (AFAIK).
     
    #523 woundingchaney, Feb 28, 2008
    Last edited by a moderator: Feb 28, 2008
  4. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,683
    Likes Received:
    259
    It would help with games that use insane amounts of memory at once: Battlefield 2 and 2142, Crysis, FEAR, for instance. A game like Crysis needs lots and lots of memory, upwards of a full gigabyte just for itself if you want it to look good and run smoothly, hence why I'm glad to have 4GB of headroom on my computer. Sure, a 128-bit bus is a bit of a handicap on GPUs that should have a 256-bit one, but the memory amount is a limitation too.
     
  5. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,081
    Likes Received:
    651
    Location:
    O Canada!
    The question is more like how do you... not why!

    It's not possible; you'd need to go to a 192-bit bus, which has ramifications for the architectures, the die sizes, the motherboard traces, etc., etc. In other words, it's too costly.
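    For reference, the 192-bit figure falls straight out of the chip arithmetic if you assume 512Mbit (64MB) GDDR3 parts on x16 interfaces; the part size and interface width here are assumptions for illustration, but they are one plausible configuration of the era:

    ```c
    #include <stdio.h>

    /* With fixed-size parts on fixed-width interfaces, bus width follows
     * directly from the number of chips needed to reach a capacity.
     * Assumed: 512Mbit (64MB) GDDR3 chips, each on a x16 interface. */
    static unsigned bus_width_bits(unsigned total_mb)
    {
        const unsigned chip_mb = 64;   /* 512Mbit part */
        const unsigned chip_io = 16;   /* x16 interface */
        return (total_mb / chip_mb) * chip_io;
    }

    int main(void)
    {
        printf("512MB -> %u-bit bus\n", bus_width_bits(512));  /* 128 */
        printf("768MB -> %u-bit bus\n", bus_width_bits(768));  /* 192 */
        return 0;
    }
    ```

    Eight chips give you 512MB on 128 bits; getting to 768MB means twelve chips, hence a 192-bit bus and all the trace/die-size ramifications that come with it.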
     
  6. FutureCTO

    Banned

    Joined:
    Oct 21, 2006
    Messages:
    103
    Likes Received:
    0
    I did not mention the bitness of the bussing involved.
    And yes it would have been too costly.

    But we are rarely satisfied by limits. Even when they are relatively generous to us.

    America, land of obesity, where people still gripe about buffet prices when they can't eat more snow crab than they pay for.
     
  7. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,081
    Likes Received:
    651
    Location:
    O Canada!
    Yeah, you did, by suggesting such a memory quantity.
     
  8. FutureCTO

    Banned

    Joined:
    Oct 21, 2006
    Messages:
    103
    Likes Received:
    0
    Alrighty then, I will mention bitness.

    Dave, are these the same changes you seem to think I had mentioned?
     
  9. FutureCTO

    Banned

    Joined:
    Oct 21, 2006
    Messages:
    103
    Likes Received:
    0
    I apologize for the curtness. Please understand, very few people enjoy being told what they were thinking.
    Earlier was an attempt at being brief on the convenience of memory size. But a longer post is needed.

    The motivation for my mentioning increasing memory space has to do with developer convenience. (John Carmack interviews.)
    However that is a greedy request, adding more cost to a system that already offers more bandwidth and storage than we are using.
    The same as crying for lower priced all-you-can-eat buffets, when all you are eating is the snow crab, and not trying something new.
    On the Xbox 360, if you reduce the native resolution you lower the processing demand placed on Xenon/Xenos and free up more space for logic.
    It is honestly that simple. And because it is easy to do, many developers have led with easy.

    On the PS3, if you lower the resolution you free up more GDDR3 memory.
    Yet Cell can handle a lot more geometry and shader processing than Xenon. So why focus on reducing processing load with smaller GDDR3 resolutions?
    Additionally, the PS3 was not designed for Cell's cores to make direct requests from GDDR3 memory.
    So to do this, you might create a very small buffer pool in the XDR space to stream from GDDR3 through XDR to the Cell cores.
    And once you have done this, it is easy to do again; Cell shader calls could be set up to initiate the process automatically.
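    A minimal sketch of that staging-pool idea in plain C, with memcpy standing in for the GDDR3-to-XDR transfer and every name invented for illustration; a real Cell version would use asynchronous MFC DMA with tags/fences so the copy of the next chunk overlaps work on the current one:

    ```c
    #include <string.h>
    #include <stdio.h>

    #define STAGE_SIZE 4096  /* small staging buffer, standing in for an XDR pool */

    /* Double-buffered staging: while one buffer is being processed, the
     * next chunk is copied into the other.  memcpy plays the role of the
     * GDDR3->XDR DMA here; the overlap is only real with async DMA. */
    static long sum_through_staging(const unsigned char *gddr, size_t len)
    {
        unsigned char stage[2][STAGE_SIZE];
        long total = 0;
        size_t off = 0;
        int cur = 0;

        size_t n = len < STAGE_SIZE ? len : STAGE_SIZE;
        memcpy(stage[cur], gddr, n);             /* prime the first buffer */

        while (off < len) {
            size_t next_off = off + n;
            size_t next_n = 0;
            if (next_off < len) {                /* start the "DMA" for the next chunk */
                next_n = len - next_off < STAGE_SIZE ? len - next_off : STAGE_SIZE;
                memcpy(stage[cur ^ 1], gddr + next_off, next_n);
            }
            for (size_t i = 0; i < n; i++)       /* work on the current chunk */
                total += stage[cur][i];
            off = next_off;
            n = next_n;
            cur ^= 1;                            /* swap buffers */
        }
        return total;
    }

    int main(void)
    {
        unsigned char data[10000];
        for (size_t i = 0; i < sizeof data; i++)
            data[i] = (unsigned char)(i & 0xff);
        printf("%ld\n", sum_through_staging(data, sizeof data));
        return 0;
    }
    ```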

    With dedicated/pre-partitioned memory you have the advantages of both higher bandwidth and lower memory latency.
    But you have the disadvantage of managing two separate pools and one storage-space limit (XDR) to make calls from.
    {256MB of GDDR3 is more than generous enough for the graphical needs of all in-game action on the Xbox 360. Its target resolution promised no games below 720p; that was even part of its official FAQ. Yet, as mentioned, a smaller resolution makes processing easier, so games like Halo 3 (640p) wind up below 720p.}

    A better and smarter approach than wishing for additional XDR memory:
    When developing cross-platform games leading on the PS3, it is easier to work around slowness on the Xbox 360 by making games smaller.
    Because when you come over from the PS3 you have more speed and more storage, though the DRAM is divided in its physical space.
    So why not use the unified space on the Xbox 360 as a single collecting pool for PS3-led cross-platform development?
    It would be a lot easier to reduce texture sizes, frame rate, or size while merging two separate memory pools into one.

    The decision to lead with the PlayStation 3 makes a lot of sense when you reverse the picture.
    If you start with the PS3 then you don't have to get creative to overcome divided memory or programming.
    Because all the divisions you make can be combined into a single pool on the Xbox 360.
    Then IF your game is limited by reduced memory and processing speed, you can simply shrink it down on the Xbox 360.
    Problem solved. It is easier to downsample a few factors (Xbox 360) than it is to grow into a separated parallel shape (PS3).
     
    #529 FutureCTO, Mar 1, 2008
    Last edited by a moderator: Mar 1, 2008
  10. chachi

    Newcomer

    Joined:
    Sep 15, 2004
    Messages:
    120
    Likes Received:
    3
    You make an excellent point, except for the fact that the GPU is generally what handles geometry and shading. It's because the RSX isn't capable of keeping up with the 360's GPU that developers have used Cell to offload some of the burden.

    Exactly, except for, you know, all those games that use more than 256MB. Hint: it's probably pretty much all of them.

    Bungie made a bad choice, IMO. They apparently chose a lighting method that required them to render twice the number of frames, and to do that and meet their other targets they chopped the resolution. That doesn't mean a less obtuse developer would have a problem running at 720p, as most of them do.

    Yes, because the 360 is the one with the more limited RAM, exactly. Make sure to cut the texture budget too, otherwise the 360 would really be in trouble!

    Interesting perspective. Here's how it's always worked in the past: a system with superior hardware can more easily handle the port from more limited systems because it has, well, superior hardware. This doesn't seem to be the case with the PS3, or at least it hasn't often been the case without a lot of blood, money, sweat, money, tears and some more money again. This is what all the fuss has been about; not that the PS3 has been held back by sub-par 360 led games, but that it couldn't match the 360 running those same games. Fanboys got riled up, Sony wasn't happy, developers were called lazy, etc.

    Fortunately for gamers Sony and some developers have a solution to all this: lead on the PS3. Hey presto - no more inferior PS3 ports. It's important that you actually lead on the PS3 rather than employ these same programming methodologies which are needed for PS3 performance (and supposedly work better on the 360 as well) because... um, yeah, what was that reason again?

    I know I'm pretty much one lone voice crying in the wilderness against this here but it'd be nice to see developers actually work to exploit each platform to their limit, whatever that happens to be. The least common denominator sucks.
     
  11. Cheezdoodles

    Veteran

    Joined:
    May 24, 2006
    Messages:
    3,930
    Likes Received:
    24
    768MB of RAM would also cost a minimum of $1 billion more in production costs.

    Not to mention the bus-width implications.
     
  12. Butta

    Regular

    Joined:
    Jan 18, 2007
    Messages:
    361
    Likes Received:
    2
    I always wondered: could they have done a 384MB XDR / 384MB GDDR3 split on the PS3, given the current 128-bit buses?
     
  13. nAo

    nAo Nutella Nutellae
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,333
    Likes Received:
    128
    Location:
    San Francisco
    What you just wrote here doesn't make much sense.
    What does "superior hardware" mean?

    The PS3 (and the 360 as well) is mostly held back by poor engineering practices and a lack of proper planning.
    In 2005, without ever having touched or seen a 360 or PS3 SDK, other people and I were writing on this very forum what we needed to do in order to push these new CPUs (and we were mostly spot on).
    Three years and millions of dollars later, a lot of studios with their senior engineers and managers still haven't got a clue.
    They relied on the old assumptions (next gen = the same old stuff, just faster) they had used in the past, and they were.. ehm.. simply wrong, even though they had all the information they needed to make better decisions.
    This barely works on the 360 and it doesn't work at all on the PS3. I can't wait to see what happens in 2-3 years when we have 32 cores per CPU.. :)
    The best way to enforce these programming methodologies is to design and test them around the PS3, but this is not necessary at all. "Lead platform" doesn't mean that everything has to be done FIRST on the PS3; that is another nonsense.
    Having the PS3 as a lead platform means that every time you design some new subsystem in your engine you have to sit down for more than 5 seconds and carefully think about your data design/flow, dataset size, etc., and make sure that what comes out of your mind fits the PS3/360 architectures. Unfortunately, in order to do that you have to study and learn new things.. do you see where the barrier is?
    In 10 years we will maybe have programming languages and tools that automatically take care of this (though I'm kind of skeptical) and we won't need to spend a single minute thinking about these issues.
    Multiplatform games will always be about the lowest common denominator, but this LCD is a moving target and can set the bar quite high, as some AAA multiplatform titles already show (COD4 and Burnout first and foremost).
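    The "dataset size has to fit the architecture" point can be made concrete with a back-of-the-envelope budget for an SPU's 256KB local store, which holds code and data together. The code/stack budget and per-item size below are assumptions for illustration, not real figures from any engine:

    ```c
    #include <stdio.h>

    /* An SPU has 256KB of local store for code *and* data, so any
     * subsystem has to be designed around a chunk size that fits.
     * CODE_BUDGET and BYTES_PER_ITEM are illustrative assumptions. */
    #define LOCAL_STORE    (256 * 1024)
    #define CODE_BUDGET    (64 * 1024)   /* assumed code + stack footprint */
    #define BYTES_PER_ITEM 48            /* e.g. one particle: pos/vel/misc */

    int main(void)
    {
        unsigned data_space = LOCAL_STORE - CODE_BUDGET;
        /* double-buffered DMA, so only half the data space holds live items */
        unsigned items_per_chunk = (data_space / 2) / BYTES_PER_ITEM;
        unsigned total_items = 100000;
        unsigned chunks = (total_items + items_per_chunk - 1) / items_per_chunk;
        printf("%u items per chunk, %u chunks for %u items\n",
               items_per_chunk, chunks, total_items);
        return 0;
    }
    ```

    The takeaway is nAo's: the chunk size is a design input, not something you discover after the fact, and a subsystem sized this way tends to behave well on the 360's caches too.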
     
  14. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    21,738
    Likes Received:
    7,406
    Location:
    ಠ_ಠ
    Well, they're not so much rendering the scene twice as they are outputting a given frame twice at two different exposure levels. Doing so uses up most if not all of the eDRAM, and they didn't want to do tiling (for who knows what reason; the caveat of tiling is the increased geometry processing cost, but other devs here have considered it a "solved" problem).

    The reason they chose the lower rendering resolution is that the lighting algorithms are very taxing on the GPU's computational power; even if there were enough eDRAM for 720p and their double framebuffer, they might very well have chosen the same lower resolution, since there are already times when the frame rate drops during the single-player campaign. I.e., if the frame rate is mostly stable and we're seeing a few drops, just imagine how much worse it would be with 25% more rendering work (1280x720 is 25% more pixels than 1152x640).
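    The pixel math is easy to verify:

    ```c
    #include <stdio.h>

    int main(void)
    {
        unsigned hd  = 1280 * 720;   /* 921600 pixels */
        unsigned sub = 1152 * 640;   /* 737280 pixels */
        printf("720p has %.0f%% more pixels than 640p\n",
               100.0 * (hd - sub) / sub);   /* exactly 25% */
        return 0;
    }
    ```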
     
  15. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,683
    Likes Received:
    259
    Giving up so much for their goddamn HDR was such a bad decision. Of course, millions of screaming fans I'm sure beg to differ, and even then, Halo 3 could've been rendered at 480p EDTV and it would still have been the smash hit it is.
     
  16. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil
  17. _phil_

    Veteran

    Joined:
    Jan 3, 2003
    Messages:
    1,659
    Likes Received:
    13
    Even after the UT3 tweaks, UE3 is still very far from stressing the Cell SPUs...
     
  18. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,683
    Likes Received:
    259
    To hell with UT3, it's not that great anyway. I want to see the PS3 build of CryEngine 2 instead :D. But on the difficulty of development, I honestly think it's best that the PS3 ends up being the lead platform for multi-platform titles; that way a game is completed on the "harder" platform first, then it can be easily moved to the 360 or PC, and we still get a quality PS3 game.
     
  19. TimothyFarrar

    Regular

    Joined:
    Nov 7, 2007
    Messages:
    427
    Likes Received:
    0
    Location:
    Santa Clara, CA
    Exactly!

    The console hardware (PS3 and 360) is amazingly fast when programmed correctly. Multi-platform devs are having problems not just with the PS3 but with the 360 as well. A good part of the problem is that you can almost just port legacy single-threaded "engineered for the PC" code to the 360 and have it run almost well enough to ship a game. Split your "PC engineered" game into two threads, game on the first core and rendering on the second (with other lesser things elsewhere), and you can probably get enough performance to finish the project without doing any new thinking. For those types, the PS3 simply looks like a single-core 360 (instead of a 3-core 360). So naturally complaints arise that the PS3 sucks or that the PS3 is hard to develop for, because those devs are frustrated trying to optimize their code to run on only the PPE (without using the SPUs for anything important).

    The real problem is just as nAo said: threading + going SIMD, and to a lesser extent dealing with less cache, in-order execution, expensive branches, etc. How many devs do you think even get close to extracting the performance available on the 360? How many are even doing major SOA SIMD code?
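    A small illustration of the AoS-vs-SoA point in plain C. Nothing console-specific here; the point is just that the SoA loop has the shape a vectorizer (or hand-written SSE/VMX/SPU code) wants, because each field is contiguous in memory:

    ```c
    #include <stdio.h>

    #define N 1024

    /* Array-of-structures: each particle's fields are interleaved in
     * memory, which forces gathers/scatters when vectorizing. */
    struct particle_aos { float x, y, z, pad; };

    /* Structure-of-arrays: each field is contiguous, so the same loop
     * maps straight onto wide SIMD loads and stores. */
    struct particles_soa { float x[N], y[N], z[N]; };

    static void integrate_soa(struct particles_soa *p,
                              const struct particles_soa *v, float dt)
    {
        /* a vectorizing compiler can turn this into 4-wide (or wider) SIMD */
        for (int i = 0; i < N; i++) {
            p->x[i] += v->x[i] * dt;
            p->y[i] += v->y[i] * dt;
            p->z[i] += v->z[i] * dt;
        }
    }

    int main(void)
    {
        static struct particles_soa pos, vel;   /* zero-initialized */
        for (int i = 0; i < N; i++) { pos.x[i] = (float)i; vel.x[i] = 1.0f; }
        integrate_soa(&pos, &vel, 0.5f);
        printf("pos.x[10] = %.1f\n", pos.x[10]);   /* 10 + 1*0.5 = 10.5 */
        return 0;
    }
    ```

    The AoS struct is declared only for contrast; the same integration over it would touch four floats per particle to update three, and the stride defeats straight-line SIMD.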

    Look at the PC side of things. How many cross-platform devs still prototype or develop mostly on PCs? Many PC devs are perhaps stuck on MSVC 2005 SP1 and have to support non-SSE2-class PCs. MSVC 2005 SP1, even with /arch:SSE (meaning compile for SSE-class machines), still uses the standard x87 float stack most of the time when compiling. MSVC doesn't have GCC's easy assembly templates (where registers are picked by the compiler), and SSE intrinsics compile very poorly with MSVC 2005. The point here being that even SIMD on PCs is a mess and way under-utilized. Not to mention, C/C++ isn't designed to take advantage of the hardware.

    The hardware trends are obvious: more cores, larger SIMD, in-order execution, smaller (i.e. shared) caches, etc. If devs are bitching now about the PS3, I cannot wait to see the reaction if they find Larrabee as the CPU+GPU in the next Xbox... how do you think they are going to manage to provide enough work to feed 64-128 threads, each requiring 16-wide SIMD to get anything close to good floating-point utilization, and all with similar data locality to get good cache utilization?

    The PS3 is an opportunity for devs to redesign and engineer algorithms + data flow that will work extremely well on future hardware. Those who are making good use of the SPUs on the PS3 now are going to be leagues ahead on future hardware; everyone else, well, will really be complaining!
     
  20. pipo

    Veteran

    Joined:
    Jun 8, 2005
    Messages:
    2,624
    Likes Received:
    28
    Let it roll...

    http://www.gamesindustry.biz/content_page.php?aid=34418
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.