Predict: The Next Generation Console Tech

Discussion in 'Console Technology' started by Acert93, Jun 12, 2006.

Thread Status:
Not open for further replies.
  1. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    Exactly, people underestimate the power of the Cell processor, especially its SPUs, simply because they are doing GPU jobs. It was very disappointing for the PS3 to have a relatively weak GPU. Imagine Cell + Xenos :shock: That would have allowed developers to do crazy physics, AI and animations... maybe a next gen wouldn't even be needed for 2013...

    I mean, it is crazy, but Cell is single-handedly rendering every post-process effect in the majority of Sony's first-party games (motion blur, depth of field, color grading, anti-aliasing...), plus doing the lighting, the particle effects, vertex shading and basic polygons, plus the physics and animations, all of this running on SPUs :shock:

    For those saying Llano is better than Cell: try programming and running all of these things on Llano, good luck with that :wink:

    But just imagine if Cell had been free from any GPU work :shock: What could have happened to game physics, AI, animations, interaction with the environment? I am very excited for next gen...
     
  2. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    Yes, Cell did need more "handiwork" to squeeze the performance out of, but if someone did it, then it delivered. On x86 you can have random spaghetti code run significantly faster than on the SPEs, sure, but peak performance isn't all that good compared to a 6-SPE Cell. It might be somewhat better, but nothing stellar.

    Also, unless AMD fixes their cache architecture to not be horrible and attaches the APU to decent memory bandwidth, that will be more wasted potential.
    That's mostly because there haven't really been any half-decent AAA titles targeted at PC and actually using its power in years, so there really isn't anything good to compare against.
     
  3. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    I don't know if this has or hasn't been posted before, but here are some Trinity vs. Bulldozer figures: :grin:

    [image: Trinity vs. Bulldozer benchmark chart]


    What would this mean for the PS4?
     
  4. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    Unless things have changed, that's not what they are going for.
     
  5. IllusionistK

    Newcomer

    Joined:
    Nov 8, 2011
    Messages:
    54
    Likes Received:
    0
    Do you have any examples of code optimized for x86 that would suggest this? Not saying you're wrong, I would just like proof.
     
  6. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
    You do know there is a GPU in that APU, right?
     
  7. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    Wasn't there some FXAA implementation that ran significantly faster on Cell than on an i7? Unfortunately, I couldn't find the source for it.

    Also, having 128 registers with fast local memory versus 16 (plus how many rename registers?) and the absolutely horrible L1 in BD isn't too good either. Yes, manual memory management isn't easy to do, but when you spend enough resources on it you can squeeze out quite a bit of performance.
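    As a rough illustration of the manual memory management being described, here is a minimal, portable C sketch of the double-buffered streaming pattern commonly used on the SPEs. Plain memcpy stands in for the asynchronous mfc_get/mfc_put DMA transfers, so the compute/transfer overlap is only structural here; the buffer size and the scale-by-two kernel are made up for the example.

```c
#include <assert.h>
#include <string.h>

#define CHUNK 64  /* elements per "local store" buffer (illustrative size) */

/* Stand-in for an async DMA fetch into local store; on a real SPE this
 * would be an mfc_get() into one of two local-store buffers. */
static void fetch(float *local, const float *main_mem, int n) {
    memcpy(local, main_mem, n * sizeof *local);
}

/* Process a chunk entirely out of "local store": scale by 2. */
static void process(float *local, int n) {
    for (int i = 0; i < n; i++) local[i] *= 2.0f;
}

/* Double-buffered stream: while chunk k is being processed, chunk k+1 is
 * (conceptually) already in flight into the other buffer. */
void stream_scale(float *data, int total) {
    float buf[2][CHUNK];
    int nchunks = (total + CHUNK - 1) / CHUNK;
    if (nchunks == 0) return;
    int first = total < CHUNK ? total : CHUNK;
    fetch(buf[0], data, first);
    for (int k = 0; k < nchunks; k++) {
        int off = k * CHUNK;
        int n = total - off < CHUNK ? total - off : CHUNK;
        /* kick off the next fetch before touching the current chunk */
        if (k + 1 < nchunks) {
            int noff = (k + 1) * CHUNK;
            int nn = total - noff < CHUNK ? total - noff : CHUNK;
            fetch(buf[(k + 1) & 1], data + noff, nn);
        }
        process(buf[k & 1], n);
        memcpy(data + off, buf[k & 1], n * sizeof *data);  /* "DMA put" */
    }
}
```

    The point of the pattern is that the programmer, not a cache, decides exactly what lives in the 256 KB local store at any moment, which is where the extra effort (and the extra performance) comes from.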
     
  8. RDGoodla

    Regular

    Joined:
    Aug 21, 2010
    Messages:
    609
    Likes Received:
    172
    Why an APU & HD 6670? A quad-core Athlon II and HD 7850 have similar power dissipation and die size, with much better graphical performance.
     
    #11268 RDGoodla, Apr 11, 2012
    Last edited by a moderator: Apr 11, 2012
  9. McHuj

    Veteran Subscriber

    Joined:
    Jul 1, 2005
    Messages:
    1,613
    Likes Received:
    869
    Location:
    Texas
    If the rumors of the PS4 devkit are true, that it's a 3850 Llano and a 6670, I think it's only to approximate the final performance of the APU. I think using a current gen APU in the dev kit will give the devs early exposure to the APU architecture and the shared memory between the CPU and GPU. Unless there's some weird multiple-SKU setup with a low-budget set-top box that only utilizes the APU, I don't think we'll have an APU + discrete GPU.


    I agree that even an Athlon + 7850 would probably offer much better graphics performance. But I don't think all decisions are based on what delivers the best performance.

    I could be wrong and this is just my opinion, but I think that using an APU with one shared memory pool really simplifies your design and manufacturing process. Sure, yields might be worse, but on a mature process the difference might not be that great.

    With APU, you have only one chip to test, package, and solder on to the PCB versus two for a CPU and GPU. The PCB design is simpler especially if there is only one memory pool. If you have dedicated system and video memory chances are good that they are different types so you have more components (and probably suppliers) to manage. The cooling design is probably simpler. With an overall simpler design, manufacturing is likely more reliable, quicker and thus cheaper.

    Going forward, it's probably cheaper and easier to shrink one chip instead of two, especially if they're on different processes to begin with. MS had to work pretty hard to combine their CPU and GPU into the XCGPU, so it's easier to start that way.

    This thread is about predictions, so I'm predicting an APU for both consoles in the future. I would love to see a dedicated CPU with 2500K-level performance and a 7850-level GPU, and I don't think it's out of the realm of possibility; I just don't think it will happen. I hope I'm wrong.
     
  10. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    If true, and it's a big if with these rumors, the answer would probably be something like "because Sony likes to be goofy and not do things the normal way" :lol:
     
  11. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    4,024
    Likes Received:
    2,851
    In this comparison, do you recall if the bandwidth and latency differences between Cell<->RSX and i7<->PCIe<->GPU were factored in?
     
  12. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I believe two modules are enough:
    two Bulldozer 3.0 modules, i.e. Steamroller, at a rather high frequency.

    Suitable, embarrassingly parallel calculations can be off-loaded to the GCN GPU.
    An APU even solves the link between GPU and CPU; being on-die, it can be ridiculously fast.
     
  13. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    Indeed, but FXAA and other GPU-type operations that Cell is relatively good at wouldn't be among the CPU's workload in a console with a modern GPU, which could handle such tasks far more efficiently than either Cell or the best x86s.
     
  14. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    Just to throw it out there, a next gen Trinity APU is 100W, plus 63W for a 7670...
     
  15. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    Wasn't that same FXAA implementation also drastically faster on Cell than on GPUs? I mean, why did they even bother running it on an i7 when PCs are generally equipped with far faster GPUs than consoles?

    On the whole I agree that the GPU should be left for the stuff a GPU is good at, but it still does show that Cell has the potential to pull off some insane stuff. Its only problem is that it takes a ton of effort to implement. Though I kind of wonder how much easier it is to implement this stuff on a GPU versus Cell. I would imagine it might actually be quite hard to do on a GPU, but a GPU simply has a metric ton of raw power, so it won't matter much if you waste some of it and implement things somewhat less efficiently than on Cell.
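    For context on why this workload ports to both SPUs and GPUs: the core of an FXAA-style filter is a cheap per-pixel luma contrast test that decides whether any blending happens at all. A minimal C sketch of that early-out test follows; the luma weights and threshold constants are illustrative, not taken from any shipped implementation.

```c
#include <assert.h>

static float max2(float a, float b) { return a > b ? a : b; }
static float min2(float a, float b) { return a < b ? a : b; }

/* Approximate perceptual luma from RGB in [0,1]; green-weighted, as
 * FXAA-style filters typically do (exact weights vary). */
float luma(float r, float g, float b) {
    return 0.299f * r + 0.587f * g + 0.114f * b;
}

/* FXAA-style early-out: only anti-alias a pixel when the luma contrast
 * across its 4-neighbourhood exceeds a threshold. The relative (0.125)
 * and absolute (0.0312) thresholds are illustrative constants. */
int needs_aa(float c, float n, float s, float e, float w) {
    float lmax = max2(c, max2(max2(n, s), max2(e, w)));
    float lmin = min2(c, min2(min2(n, s), min2(e, w)));
    float contrast = lmax - lmin;
    return contrast >= max2(0.0312f, lmax * 0.125f);
}
```

    Flat regions fail the test and are skipped, which is why the filter is dominated by simple per-pixel arithmetic, a good fit for SPU SIMD code streaming over scanlines as well as for a pixel shader.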
     
  16. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    The 7670 is just a rebranded 6670.
     
  17. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil
    Fully agree, but maybe a quad-core Athlon II and an HD 5850 (thinking of a shrink to 28nm) could be even more interesting.
     
  18. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    195
    Location:
    Stateless
    Let's see how those Bulldozer v2 / Piledriver cores fare before discarding them :)
    BD had good performance in highly threaded workloads, better than Athlon II if memory serves right.
    BD also has better SIMD than Athlon II or the cores in Llano.

    If Sony goes with two pretty much off-the-shelf products, they may indeed be better off with a standard CPU+GPU set-up.

    I was thinking of an APU + GPU a while ago for the Xbox (based on the double-GPU rumors), but I was expecting a pretty "tiny" SoC (max 170 sq. mm).

    Llano and Trinity are both ~230 sq. mm, neither tiny nor cheap.
    The high-end SKU also consumes a lot (100 watts of TDP, and it could be more in the real world).


    Overall, if I put the schedule issue aside (more a risk than a fact, but a significant one), a reworked Kaveri makes more sense than a Llano + Turks set-up. The weak part in Kaveri is the DDR3 memory controller, but I guess it would be pretty straightforward for AMD to replace it.

    The issue is AMD delivering on schedule. They have experience with TSMC, so it's possible, but say they have a new bug in the CPU somewhere or something like that... they won't have time for a workaround, or there will be a delay.

    WRT the GPU, in the case of a standard CPU+GPU set-up I would discard anything with a 256-bit bus. So a reasonable bet would be a part akin to Cape Verde or a bit better. Cape Verde may top out at 12 CUs; a replacement may go up to 16. I would expect a CU count between 10 and 16.

    EDIT

    The more time passes, the more I believe KB-smoker may be right on that; the 360 came together only a few months before release.
    If there is truth to the rumor, Sony may have added this 6670 GPU to the dev kits to make up for Llano's lack of raw power and bandwidth.
    Kaveri with 2GB of GDDR5 (as more is problematic) should match or outperform that solution.
    FLOPS comparisons are apples to oranges between the previous VLIW5 architecture and the new GCN scalar architecture; you may want to knock 20% off the peak FLOPS figure.
    A8 + 6670 is 480 + 768, so 1248 GFLOPS; minus 20%, that's about 998 GFLOPS. Even if Kaveri ends up south of that by a handful of GFLOPS, or even a hundred, it should do the trick.

    I've got to research those Steamroller cores more; I don't know whether they're BD-based like Piledriver or based on an older architecture.
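    The back-of-the-envelope comparison in the post works out as follows; note that the 480 and 768 GFLOPS inputs and the flat 20% VLIW5-to-GCN discount are the post's own rough assumptions, not official specifications.

```c
#include <assert.h>

/* Combine two VLIW5 peak-FLOPS figures and apply the post's flat ~20%
 * discount to get a rough GCN-equivalent number. Inputs are in GFLOPS. */
float gcn_equiv_gflops(float apu_gflops, float discrete_gflops) {
    float vliw5_peak = apu_gflops + discrete_gflops;  /* 480 + 768 = 1248 */
    return vliw5_peak * 0.8f;  /* knock ~20% off VLIW5 peak vs. GCN */
}
```

    With the post's figures this gives roughly 998 GFLOPS, which is the bar a GCN-based Kaveri would have to approach to replace the rumored Llano + 6670 dev kit.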
     
    #11278 liolio, Apr 12, 2012
    Last edited by a moderator: Apr 12, 2012
  19. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    7670 sounds better though :razz:
     
  20. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil

    Your viewpoints are spot-on and I fully agree with them, but sometimes I wonder whether the APU will in fact succeed. The idea of marrying CPU + GPU + memory controller etc. on the same die is excellent (and it's much more than that, I know...), and perhaps I'm wrong here, but I have the impression that today, and even next year, APUs are still very incipient, and it's too ambitious to expect them to become efficient enough to displace the current paradigm of separate CPUs and GPUs.


    Another interesting point you touched on was the chance of bugs (similar to the Intel Pentium FDIV bug, or even worse, damaging memory accesses etc.) in these new processors coming from AMD... and imagine if something similar occurred in next gen consoles on the production line?

    I personally would prefer they used something that had already been tested and approved, and then customized it (put something extra in the SIMD, die-shrink to 28nm, disable or remove PC-only features, etc.). Such CPUs and GPUs for next gen consoles could be very interesting... my "dream console" is something like a quad-core Athlon II + Radeon HD 5850 (on paper, almost 2.1 TFLOPS).
     
    #11280 Heinrich4, Apr 12, 2012
    Last edited by a moderator: Apr 12, 2012