Predict: The Next Generation Console Tech

Discussion in 'Console Technology' started by Acert93, Jun 12, 2006.

Thread Status:
Not open for further replies.
  1. anexanhume

    Veteran

    Joined:
    Dec 5, 2011
    Messages:
    2,078
    Likes Received:
    1,535
    8GB along with Cape Verde seems so odd. They tend to skimp on RAM, yet 8GB is a hefty amount even by today's standards. Perhaps there is some truth to the dual-but-not-CrossFire GPUs rumor from back in April?
     
  2. ERP

    ERP
    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Who said he was in a high position? Almost any senior PM at MS could have created/given that presentation. Presentations like this are a dime a dozen at MS; unless you know the context of why it was given, it's hard to weigh the value of the content.

    FWIW, I also don't like to equate individual competence with project success; they are rarely related on large teams. There were probably a lot of smart people on the Zune project, and I know there were a LOT of smart people on WinFS, which more or less managed to kill Vista as it was back then.
     
  3. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    There are rather major limitations, though. The shared data has to be very small in order to fit into core-local memory. The AI characters can do independent decision making only as long as their decisions do not depend on what any other AI is doing. Not to mention that AI is fundamentally about decision making, i.e. branching, which generally wreaks havoc with highly parallel architectures, with their small local memories, long pipelines, and typically lightweight branch prediction/handling hardware.

    So AI parallelizes nicely as long as you want to do comparatively trivial stuff on small data sets. As usual.
    That's not to say it's useless. I'm just pointing out some limitations for the benefit of those who do not have much personal experience with parallel codes.
     
    #12543 Entropy, Jun 19, 2012
    Last edited by a moderator: Jun 19, 2012
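The pattern Entropy is describing can be sketched in a few lines: per-agent decisions parallelize cleanly only when each agent reads a small, frozen snapshot of the world and never looks at another agent's in-flight output. The agent fields and the trivial "advance/hold" policy below are made up for illustration.

```python
# Hypothetical sketch: each AI agent is updated in parallel because its
# decision reads only a small, read-only snapshot of last frame's world
# state and never touches another agent's data.

from concurrent.futures import ThreadPoolExecutor

def decide(agent, snapshot):
    """Pure per-agent decision: reads the frozen snapshot, mutates nothing."""
    # Toy policy: advance on the flag unless we are already next to it.
    dist = abs(agent["pos"] - snapshot["flag_pos"])
    return {"id": agent["id"], "action": "advance" if dist > 1 else "hold"}

def step(agents, snapshot):
    # Embarrassingly parallel: no agent's decision depends on another's output,
    # so the map can run on as many workers as the hardware offers.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda a: decide(a, snapshot), agents))

agents = [{"id": i, "pos": i * 2} for i in range(4)]
decisions = step(agents, {"flag_pos": 3})
```

The moment `decide` needs to see what another agent chose *this* frame, the independence breaks and the branching/serialization problems Entropy mentions appear.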
  4. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    AI cooperating at 60 Hz time steps is plenty fast enough (which, from a simulation point of view, means they are completely independent), unless you are trying to simulate nearly instantaneously communicating hunter-killer robots. The edge-case instabilities of timestep simulation in things like routing are an opportunity to improve your AI model and get more realistic behaviour (anyone who has been in traffic knows that humans have edge-case instabilities in routing as well).
     
  5. upnorthsox

    Veteran

    Joined:
    May 7, 2008
    Messages:
    2,106
    Likes Received:
    380

    Pick a lane already dammit!!! :razz::lol:
     
  6. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    Can you give me a real-life example where you get stuck? Large shared data is similar to a framebuffer, but that doesn't stop it being processed in parallel. You can have one step in the process isolate which AIs are close enough to which other AIs to group them together for chunking; you can bit-flag lots of state and decisions on AIs, run them past decision logic, and distribute the data according to the key flags that were set. There's not a lot you cannot do; most of us just aren't used to it.
     
  7. anexanhume

    Veteran

    Joined:
    Dec 5, 2011
    Messages:
    2,078
    Likes Received:
    1,535
    Wouldn't the quadratic, all-pairs nature of round-robin communication between every pair of AIs be the big showstopper, rather than how often they need to communicate?
     
  8. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    See, this is where I am very disturbed by the GPU rumors. It isn't so much the graphics alone, but it is the major computational gains GPUs have made and the breadth of problems they can solve.

    The workloads a GCN-like GPU can handle compared to Xenos are night and day; by contrast, what more do you get from a ballooning CPU budget? Definitely not as much as from the GPU side!

    The benefit of a GPU-centric design is that, up front, the next gen will have a "next gen look," and over the product's life span there will be substantial performance left in the reservoir for exploring and exploiting new techniques. A CPU-centric design throws a whole lot of transistors at very modest computational gains.

    I thought MS was at the forefront of investing in GPGPU and saw it as a disruptive force for the middling CPU market. If they throw an 800 GFLOPs to 1 TFLOPs GPU into a market full of True HD 720p and Full HD 1080p displays, with people looking for next-gen differentiation, I really wonder where the excess overhead for GPU Compute / AMP will be, as that GPU is going to be taxed.

    Of course, MS and Sony also have CPU makers in their ears trying to get silicon budgets shifted their way, so I am sure there is some politicking behind closed doors.
     
  9. ERP

    ERP
    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    I'll make an observation here: I would guess that even if you were 100% utilizing the output of your GPU to render a scene, something like 20 to 60% of the ALU flops would be going unused.
    ALUs aren't used at all when you're rendering shadows, and they are grossly underutilized when rendering post effects. If you do a deferred renderer, they are underutilized when you lay down the initial pass.

    It's hard to say how much compute you could do without impacting rendering, and a lot depends on the mix of texture units, ROPs, etc. to ALUs, but I would guess it's a non-trivial amount.
     
  10. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    ERP, as a developer, what is your view of GPU Compute, especially as we transition away from DX9.x hardware like Xenos and RSX?

    Do you see compute, in game development, taking on more and more tasks that were traditionally the domain of the CPU only; is it too limited to be worth the investment; or is it too early to say?

    As a developer, would you like to see a shift in silicon resources toward compute, or do you think there are still a lot of things a proper many-core CPU could do if it were a development baseline (assuming we have seen the industry stall due to the low core count of the 360 and most PCs)?

    What should we as fans, enthusiasts, and consumers be looking forward to, what should excite us, and what marketing pitfalls should we be wary of (e.g. macho flops)? I am sure in the next 12 months we will see the emergence of the "next PR war," so give us an educated heads up on the 'next big number' (raw peak MIPS, MHz, polygons, flops, cores, etc.) that we need to avoid being enchanted by.

    As a developer, what are the two biggest issues you want new platforms to resource/aid?
     
  11. ERP

    ERP
    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Well, I don't work on a game team anymore, so you should take my opinion with a huge grain of salt.

    Compute certainly has its place, but it's actually hard to write performant, non-trivial compute jobs.
    Part of that today is tools: without enough information to determine why your compute job is running slowly, there is a lot of guesswork involved.

    I think next-gen challenges are going to be a lot like this gen's were; it's as much about dealing with team-size growth and production issues as it is about technology. What's the right balance for geometry, how complex should your shaders be, etc.
    3D graphics algorithms will continue to move forwards, and I think you'll see a big step forwards as the generation progresses as a result. I think compute will be a big part of that.
     
  12. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    Obviously we are still working from early, vague details, but based on those details my non-fictional take looks like this.

    Developers: "Give us more power."

    MS: "We'll double the memory like last time."

    Developers: "Not good enough."

    MS: "Deal with it."
     
  13. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    Devs ask for more RAM; DICE wants 8GB, for example.
     
  14. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    Consumer: MS, give me morz GPUz!

    MS: We gave you morz RAMz.

    Consumer: We needz more GPUz!

    MS: Deal with it.

    Consumer: Hmmm I wonderz what Sony has under the hood... morz GPUz!

     

  15. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0


    :smile:
     
  16. Ruskie

    Veteran

    Joined:
    Mar 7, 2010
    Messages:
    1,291
    Likes Received:
    1
    I'll take fancy shaders and lighting over texture resolution and loading times any day of the week. Someone should contact that GAF guy who knew the Wii U specs and tell him to take a closer look, because a meltdown is about to happen in the near future :lol:

    In the end it could end up like this...

    Developers: Give us more power, MS!

    MS: Here is 8 gigs of the cheapest RAM available.

    Developers: Huh... it's cool, I guess. Give us more graphics processing power.

    MS: NO! Here is 1 TFLOP; deal with it.

    Developers: Sigh... OK. Here is a screen-tearing, frame-dropping, sub-HD multiplatform game for you. Deal with it!
     
  17. Mianca

    Regular

    Joined:
    Aug 7, 2010
    Messages:
    333
    Likes Received:
    19
    Interesting that the consumer doesn't seem to take part in those dialogues :)

    That being said, Sony won't need more than 2GB of RAM if they're really going for cloud gaming in the long run.

    All they need is a well-priced system that has enough power to sustain "traditional" console gaming for another few years, and that is ready to be gradually integrated into next-gen cloud gaming stuff.

    As far as I'm concerned, it's also time for more varied SKUs, like a "base model" without a Blu-ray drive (but with a few USB3 ports for later upgrades).
     
  18. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    9
    Location:
    Leicestershire - England
    This is along the lines of my thinking, and something almost no one ever considers when totting up the specs: you don't need 10 times the units to get 10 times the power of an 8-year-old console.
     
  19. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    Interesting how people are so confident that they already "know the specs" ;)
     
  20. HEYDOL

    Newcomer

    Joined:
    Jun 12, 2012
    Messages:
    4
    Likes Received:
    0
    If the HD 4770 is the GPU inside the Wii U, why does Nintendo talk about only 1.5 times the raw power of current-gen consoles?
     
