The future of consoles

Discussion in 'Console Industry' started by invictis, Jan 21, 2022.

  1. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    Many billions of dollars went into the Cell. It was quite interesting at the time; even the Dutch were involved in the Cell's design.
     
    Nesh likes this.
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    One problem with the Cell concept was external device interconnects. It's still an unsolved problem. We can look at all sorts of distributed computing concepts like XB1's 'cloud enhancement' and see it doesn't really work. Processing needs to be centralised to a system - either local console or cloud server - meaning we're stuck with Von Neumann architecture and the idea of a sort of organic cellular network of processors that just meet up and share workloads isn't likely to ever happen (in silicon).

    I wonder what the expectation was for Cell? Did the designers just think the software would magically solve the problem?
     
    DSoup, Nesh and PSman1700 like this.
  3. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    13,999
    Likes Received:
    3,720
    Yeah, it gave the impression that they would just put it out there and the software providers would deal with it. But what was the direct value proposition to even bother?
     
  4. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend

    Joined:
    Oct 14, 2008
    Messages:
    10,467
    Likes Received:
    3,190
    And it was ludicrously fast for certain things. Unfortunately, those certain things were not games, but protein folding and supercomputer work.
     
    PSman1700 likes this.
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    It'd be interesting to remap modern trends onto Cell. The move towards compute workloads and data driven game design would fit Cell better now than the conventional paradigms of PS3's era.
     
    Nesh and PSman1700 like this.
  6. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    Maybe the Emotion Engine as well; it was flexible for its time, maybe even by today's standards.
     
  7. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    4,024
    Likes Received:
    2,851
    What workloads don't map well enough to either of the CPU or GPU compute paradigms such that it would be worth allocating transistors, bandwidth and development resources to support a third?
     
    PSman1700 likes this.
  8. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    I wasn't advocating including Cell (this time around!). Just that it'd be interesting to see what PS3 could do after 15 years of software evolution.
     
    DSoup, mrcorbo and PSman1700 like this.
  9. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    Probably a whole lot more than it did in its generational life span, as with any hardware. At the cost and pain of developers, though ;) The PS2 was probably the most exploited console out there; some devs even liked that system because finding new tricks squeezed out just that little bit more performance or an extra visual feature.
     
  10. McHuj

    Veteran Subscriber

    Joined:
    Jul 1, 2005
    Messages:
    1,613
    Likes Received:
    869
    Location:
    Texas
    Depends on what you mean by "map well". From a strictly programming view, not many; it's easiest to program on a modern CPU.

    From a perf/area or perf/watt perspective: AI inferencing. Neither GPUs nor CPUs are very efficient in terms of power consumption and silicon area for running neural-network inference that relies on matrix multiplications. That's why AI inference is a very hot area in silicon design among all the players in the space (Apple, Qualcomm, Google, Samsung, etc.). They have noticed what NVIDIA is doing with DLSS and its tensor cores, and solutions like that are going to start popping up all over the embedded world. It's looking like an area and power savings win to have a smaller GPU plus an AI engine for the upscaling versus just building a bigger GPU for the same output resolution.

    I fully expect consoles to take this approach in the future as well; I just wonder whether it would be a black-box solution or open to developers too.
     
    mrcorbo and PSman1700 like this.
  11. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    4,024
    Likes Received:
    2,851
    Playing Devil's Advocate a bit with myself here, but what about adding programmable logic blocks? Console makers could provide pre-built optimized functions in their SDK for developers to choose from. Engine developers could roll their own.
     
  12. vjPiedPiper

    Newcomer

    Joined:
    Nov 23, 2005
    Messages:
    136
    Likes Received:
    88
    Location:
    Melbourne Aus.
    I've always liked this idea: essentially placing a chunk of FPGA-style compute inside a console and allowing each game/app to program the FPGA as it wants.
    But sadly the numbers and possible perf improvements just don't stack up.
    Most FPGAs run in the 500 MHz–1 GHz range, and are physically pretty big. So I'm guessing just chucking an extra few CUs at the problem is going to result in a simpler and faster GPU that is much easier to use.
    The limiting factors in games these days aren't about making 100 specific highly complex calculations go faster; they're about making a billion-plus medium-to-highly complex calculations go faster.

    Also, as a more recent example, RDNA 2 GPUs sort of do this by sharing the ray-tracing and texturing capabilities, so devs can choose where to allocate the work. Whereas on NV, the ray-tracing HW is dedicated and goes unused if you're not doing RT.
     
    mrcorbo likes this.
  13. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    On the surface, that sounds like N64's microcode to me.
     
  14. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    4,024
    Likes Received:
    2,851
    FWIW, I was viewing this in the context of AMD's Xilinx acquisition. I just wondered if there was any use for that technology in a console SoC.
     