Predict: The Next Generation Console Tech

Discussion in 'Console Technology' started by Acert93, Jun 12, 2006.

Thread Status:
Not open for further replies.
  1. ADEX

    Newcomer

    Joined:
    Sep 11, 2005
    Messages:
    231
    Likes Received:
    10
    Location:
    Here
    Well there are some snippets of truth in there, even if they are just making things up.

    1) Larrabee is rather less than likely to be the PS4 GPU.
    2) POWER7 cores - the PowerXCell 32iv* was said to have 4 PPEs based on POWER7 technology.


    *Interestingly, IBM only said the chip with 2 PPEs was cancelled; they never said anything about the 4-PPE version.


    So with that (and the news that Larrabee has been canned), what will be in the PS4 now?
     
  2. SPM

    SPM
    Regular

    Joined:
    Dec 18, 2005
    Messages:
    639
    Likes Received:
    16
    The cancelled chip was the successor to the PowerXCell 8i, which is used in IBM's Roadrunner supercomputer. Roadrunner is made up of Opteron and Cell chips for compatibility with existing Opteron code that some of IBM's customers have. What IBM will probably do is combine Opteron chips with pure SPE chips, because the Power core on the Cell is redundant when running Opteron-based applications with the SPEs acting as number-crunching accelerators.
     
  3. grandmaster

    Veteran

    Joined:
    Feb 6, 2007
    Messages:
    1,159
    Likes Received:
    0
    They tend to be lifted wholesale from this forum. And then embellished with pure fantasy.
     
  4. sunscar

    Regular

    Joined:
    Aug 21, 2004
    Messages:
    343
    Likes Received:
    1
    1.) Not sure anybody really ever believed it would be (I always thought the idea was a joke).
    2.) PowerXCELL was never any more related to POWER7 than it was to a PPC603C. POWER7 is a massive, hyper-expensive server MCM product with no business even being in a console. We keep getting "Oooo, but it's got DPFP"... and this is going to be *the* defining feature next gen, how?


    The reality is, honestly, we don't know anything better now than we did six months ago, other than that IBM has lost interest in future revs of CELL (which doesn't rule out a custom CELL regardless), and that it looks like we're realistically back to a toss-up between Nvidia and ATI.
     
  5. assen

    Veteran

    Joined:
    May 21, 2003
    Messages:
    1,377
    Likes Received:
    19
    Location:
    Skirts of Vitosha
    I think yesterday's news that Larrabee will initially come out only as an SDK is conclusive proof that the next Cell, used in the PS4, will be so powerful it will have no problem emulating Larrabee in software!
     
  6. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    AMD
     
  7. V3

    V3
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    3,304
    Likes Received:
    5
    Breaking: Intel canned Larrabee to work on PS4.

    PS4 will most likely just use the current Cell for the CPU, but with all 8 SPUs enabled and a higher clock. Plus whatever GPU Sony decides to go with. Likely candidates are NV, AMD, PowerVR and maybe Toshiba.

    I hope they go with Toshiba; it'll most likely be underpowered, but at least it'll be some crazy solution that will be interesting to discuss on this board.

    If Kutaragi were still with Sony, the PS4 would be easier to predict because he would hype it like there's no tomorrow. Without him, Sony lacks vision.
     
  8. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    Well, Larrabee is canned and the SCC will be the only many-core project Intel will push. Too bad it's an even more threatening chip for IBM: many x86 cores, scalable, no extra headache coming from vectorization.
    I'm not confident in IBM trying to push a heterogeneous chip. Tilera, Intel, IBM? (even though those products aim at different applications/workloads/markets).
    "One size fits all" sounds like tomorrow's maxim. By the way, I have a question :) (should add this to my signature...) in regard to texture units.
    Say you have a CPU with a VPU (no matter its width) that could work on 64-bit data, or twice as much 32-bit data, or four times as much 16-bit data: would the lack of proper texture units hurt that much? I mean, most texture operations are done at 8-bit precision, and 16-bit precision may be a bit of a waste, but it may make things easier on both the software and hardware side.
    What do you think in this regard?
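    To illustrate, here's a minimal scalar C sketch of what bilinear filtering an 8-bit texture looks like with no texture units at all (the 8.8 fixed-point coordinate convention and the function name are just assumptions for the example). The arithmetic itself never needs more than 16-bit intermediates, which is why 16-bit lanes look like enough precision; what it doesn't show is the addressing, format decoding and caching that dedicated texture units give you for free.
    [code]
    /* Hedged sketch: software bilinear filtering of an 8-bit, single-channel
     * texture using only integer math. A real VPU version would process many
     * texels per instruction, but the precision requirements are the same. */
    #include <stdint.h>

    /* tex: width*height bytes, row-major. u, v: 8.8 fixed-point texel
     * coordinates (hypothetical convention for this sketch), wrap addressing. */
    static uint8_t bilinear_8bit(const uint8_t *tex, int width, int height,
                                 uint32_t u, uint32_t v)
    {
        int x0 = (u >> 8) % width,  x1 = (x0 + 1) % width;
        int y0 = (v >> 8) % height, y1 = (y0 + 1) % height;
        uint32_t fx = u & 0xff, fy = v & 0xff;     /* 8-bit blend fractions */

        uint32_t t00 = tex[y0 * width + x0], t10 = tex[y0 * width + x1];
        uint32_t t01 = tex[y1 * width + x0], t11 = tex[y1 * width + x1];

        /* Horizontal then vertical lerp; every intermediate fits in 16 bits. */
        uint32_t top = (t00 * (256 - fx) + t10 * fx) >> 8;
        uint32_t bot = (t01 * (256 - fx) + t11 * fx) >> 8;
        return (uint8_t)((top * (256 - fy) + bot * fy) >> 8);
    }
    [/code]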
     
  9. SedentaryJourney

    Regular

    Joined:
    Mar 13, 2003
    Messages:
    476
    Likes Received:
    27
    I'm not particularly knowledgeable when it comes to CPU architectures or programming, so I can't give a specific task or what-not, but basically any kind of task that is amenable to fast vector math. The kinds of jobs could be physics simulation, AI, audio, some rendering tasks, and other CPU jobs for handling data. As for whether or not Larrabee or GPGPU can run these tasks better, I don't know. I assume Cell would perform better than your typical Wintel CPU, but that's as far as I go.

    I understand that OoOE improves utilization by avoiding stalls due to data latency. An OoOE PPE sounds like a good idea to me, but I heard that IBM was looking to pair SPEs with a POWER6 core, which led me to ask the question. I'm also wondering, given an SPE-like memory model, whether there's a way to design your jobs so they won't stall. SPEs are in-order and that's not likely to change for the next iteration of Cell.
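    The usual answer for an SPE-like local store is to double-buffer the DMA: start fetching chunk N+1 while computing on chunk N, so the in-order core rarely waits on memory. A rough C sketch of the idea, where dma_get()/dma_wait() are hypothetical placeholders for whatever the real asynchronous DMA primitives are, and process() is the per-chunk work:
    [code]
    #include <stddef.h>

    #define CHUNK 4096

    /* Placeholders, not a real API: an async transfer tagged with an id,
     * and a wait on that tag. */
    extern void dma_get(void *local, const char *remote, size_t bytes, int tag);
    extern void dma_wait(int tag);
    extern void process(char *data, size_t bytes);

    void stream_jobs(const char *remote, size_t total)
    {
        static char buf[2][CHUNK];              /* two local-store buffers */
        size_t nchunks = total / CHUNK;

        if (nchunks == 0)
            return;

        dma_get(buf[0], remote, CHUNK, 0);      /* prime the pipeline */

        for (size_t i = 0; i < nchunks; i++) {
            int cur = (int)(i & 1), nxt = (int)((i + 1) & 1);

            if (i + 1 < nchunks)                /* kick off the next transfer early */
                dma_get(buf[nxt], remote + (i + 1) * CHUNK, CHUNK, nxt);

            dma_wait(cur);                      /* wait only for the data needed now */
            process(buf[cur], CHUNK);           /* compute overlaps the in-flight DMA */
        }
    }
    [/code]
    The compute on one buffer hides the latency of the transfer into the other, which is how in-order SPEs avoid stalling without any OoOE.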
     
  10. ADEX

    Newcomer

    Joined:
    Sep 11, 2005
    Messages:
    231
    Likes Received:
    10
    Location:
    Here
    I found it somewhat unbelievable given the source.

    POWER7 will be in a big MCM in some configurations but in others it'll just be a normal chip in a normal package.

    However, the cores themselves don't appear to be very big or power hungry, and given they're likely considerably faster than the PPE, they could be used as a new PPE.

    They've done this before - The PowerPC G5 used a core from POWER4.
     
  11. sunscar

    Regular

    Joined:
    Aug 21, 2004
    Messages:
    343
    Likes Received:
    1
    See, I could see something closer to this: a few (two) smaller, stripped-down POWER7 cores as the PPE, with maybe 12-16 SPEs in tow, but I wonder if it'd even be relevant. I.e., would POWER7 cores have something more substantial to offer than, say, something from AMD, or another design from IBM? We still have to grapple with how lopsided the hardware might end up (things shifting toward powerful GPUs that largely ignore the CPU). That could figure greatly into what type of CPU we actually end up staring at. We could just end up with a ridiculously powerful GPU paired up with some fairly spindly quad-core AMD of the day, and still have the strongest beast freak on the block (ginormous arms, dangly little legs).
     
  12. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    113
    Location:
    New Zealand
    So are next-generation console chips likely to be fabbed on SOI, bulk high-k, or just regular bulk?

    What are the design tradeoffs between targeting SOI, perhaps for lower power usage, and your regular high-k processes? Is it simply a tradeoff between power + cooling and power-regulation circuitry vs. increased cost per viable die?

    Lastly, if the 32nm SOI process at Global Foundries is ramping up now, how long would it take to build up a stockpile of 5-8M units, assuming the chip tapes out late Q1/early Q2? I'm not assuming that it's happening, but I just want to know if it is possible.
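    Nobody outside the fab knows the real numbers, but the shape of the back-of-the-envelope calculation is simple; a sketch where every input (wafer starts, gross die per wafer, yield) is a made-up placeholder rather than a real Global Foundries figure:
    [code]
    #include <stdio.h>

    int main(void)
    {
        /* All hypothetical placeholders, purely to show the arithmetic. */
        double wafers_per_month    = 20000.0; /* 300mm wafer starts devoted to the chip */
        double gross_die_per_wafer = 250.0;   /* depends entirely on die size */
        double yield               = 0.5;     /* early-ramp yield guess */
        double target_units        = 6.5e6;   /* middle of the 5-8M range */

        double good_die_per_month = wafers_per_month * gross_die_per_wafer * yield;
        printf("~%.1f months to stockpile %.1fM units\n",
               target_units / good_die_per_month, target_units / 1e6);
        return 0;
    }
    [/code]
    With those made-up inputs it comes out around 2.6 months; the point is only that the answer swings wildly with allocated wafer starts and yield, which is exactly the information we don't have.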
     
  13. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    I would love to see a 1080p 30fps standard next gen. No exceptions. I don't like how some games go below 720p, or how some games go below 30fps.
     
  14. Prophecy2k

    Veteran

    Joined:
    Dec 17, 2007
    Messages:
    2,467
    Likes Received:
    377
    Location:
    The land that time forgot
    Heck... why not go all the way and have a 1080p 60fps standard ;-)
     
  15. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I want to settle on 720p (well, freedom to the developers; I don't doubt we'll see a lot of 1080p titles).
    Next gen, consoles will be limited by watts (and then bandwidth); you won't see a custom version of a high-end chip again, as on PS3. You can safely rule out a full-size Fermi or a similar AMD offering.

    I favor a solid 720p, well anti-aliased (MSAA + transparency supersampling + selective supersampling built into the code of noisy shaders), which is more efficient on a 720p display than downscaled 1080p. (Perhaps by then most people will own 720p TVs, followed by 1080p and SD sets in no particular order.)

    Downscaled 1080p is just crappy, expensive 2.25x ordered-grid supersampling ;)
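    For reference, the arithmetic behind that 2.25x: 1920 × 1080 = 2,073,600 pixels versus 1280 × 720 = 921,600 pixels, and 2,073,600 / 921,600 = 2.25. The scale factor is 1.5x on each axis, so the extra samples sit on a regular (ordered) grid rather than a rotated one, which gives the least edge-quality benefit per sample.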
     
  16. brain_stew

    Regular

    Joined:
    Jun 4, 2006
    Messages:
    556
    Likes Received:
    0
    Something like the 1280x1080 used in GT: Prologue is a nice compromise if you ask me. It delivers really pristine image quality on 1080p displays with simple horizontal scaling and doesn't come with the 2x perf. cost that standard 1080p does.

    A mandate that not a single title can ship without 8x AF is needed as well. Lack of proper texture filtering is the easiest way to make any game look like an ugly mess no matter how many effects it pushes, even more so than aliasing (which a nice resolution like 1280x1080 will go a decent way toward solving anyway), imo.
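    To put numbers on the compromise: 1280 × 1080 = 1,382,400 pixels, which is 1.5x the 921,600 pixels of 1280 × 720, versus 2.25x for full 1920 × 1080; and since only the horizontal axis is scaled (1280 → 1920, an exact 1.5x), the upscale to a 1080p panel is a simple one-dimensional stretch.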
     
  17. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    60 would be nice for sports, fighting, and racing games... but it could be too restrictive for FPS, TPS, or RPGs that set out to market their games on advanced graphics. I can tell a difference between 30 and 60, I think anyone can, but I can tell more of a difference between 25 and 30, or 20 and 30. That's just me, but it tends to make me spin.

    I read somewhere that 1080p sets have lately been outselling 720p, mainly because the cost of TVs has come down so much that 1080p is a lot more viable. 720p made sense this gen (2005), but if we're talking 2011 or 2012 for next gen, and most likely a 7-year life cycle for those consoles (2018-2019), it makes a lot more sense to go 1080p, regardless of whether part of the market base is still on tubes or a huge chunk is on 720p.

    Gears of War looked fantastic on my 480p set; the downscale acted as supersampling and did some nice FSAA. In fact, the game looked somewhat worse to me when I played it on my 720p set.

    I personally can't stand the look of 1280x720 stretched past 32 inches, and I won't be able to bear another generation of it.
     
  18. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,983
    Likes Received:
    1,494
    Aren't you going to be rendering almost 3x the pixels again? That would require a 3x leap in hardware power just to stay where we are right now.

    No, I think 720p will be the standard, with loads of FSAA.
     
  19. corduroygt

    Banned

    Joined:
    Nov 26, 2008
    Messages:
    1,390
    Likes Received:
    0
    How about some different standards? I say 720p minimum; no tearing is a must, with a max of 5% dropped frames in 30Hz games and a max of 10% dropped frames in 60Hz games. As long as these conditions are met, just optimize for the highest resolution at your target refresh rate, with 720p 30Hz as the minimum.
     
  20. msia2k75

    Regular Newcomer

    Joined:
    Jul 26, 2005
    Messages:
    326
    Likes Received:
    29
    It's 2.25x, not 3x.
     