Next Generation Hardware Speculation with a Technical Spin [2018]

Discussion in 'Console Technology' started by Tkumpathenurpahl, Jan 19, 2018.

  1. phoenix_chipset

    Regular Newcomer

    Joined:
    Aug 26, 2016
    Messages:
    546
    Likes Received:
    246
    Currently... Power consumption. But the higher price would more than negate that advantage.

    Also, considering we're looking at a 2021 launch for Sony, we may see Navi's successor, which looks to be designed with Ryzen's architecture in mind, i.e. a more Nvidia-like approach.
     
  2. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,733
    Likes Received:
    1,995
    Not really. It makes sense that MS would work with Nvidia, since its GPUs dominate the PC space. Both AMD and Nvidia are important partners.

    MS first showed off DX12 on Nvidia hardware.
     
  3. Wynix

    Veteran Regular

    Joined:
    Feb 23, 2013
    Messages:
    1,052
    Likes Received:
    57
    I doubt Nvidia can convince either console manufacturer to choose them unless they offer something truly remarkable in $/perf.
    Both have been burnt by Nvidia in the past, and so far AMD has been a solid partner.

    Edit: many mistakes.
     
    #1363 Wynix, May 31, 2018
    Last edited: May 31, 2018
    Lightman and Silenti like this.
  4. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    12,711
    Likes Received:
    3,632
    I'm pretty sure Nintendo is kicking themselves in the ass right now, considering the exploit in the Tegra that has blown their console wide open.
     
  5. snarfbot

    Regular Newcomer

    Joined:
    Apr 23, 2007
    Messages:
    651
    Likes Received:
    225
    Lol, I dunno, it just seems like something Microsoft would do.
     
  6. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    I was under the impression that their current devkits go for the full 16GB now (using the same mobo design), so this sort of packaging might not be desirable until 32Gbit density chips arrive for any hypothetical switch to GDDR6 - lest they ask devs to go back to an 8GB kit, or they design an entirely different motherboard altogether.

    edit:

    Durango might be a neat case for switching due to the twin 16-bit channels.
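
    To put rough numbers on the density point - a back-of-the-envelope sketch assuming a 256-bit bus and 32-bit-wide chips, my own figures rather than anything from an official spec:

    ```python
    # Back-of-the-envelope sketch (assumed figures, not official specs):
    # how per-chip density drives capacity on a fixed-width bus, no clamshell.
    BUS_WIDTH = 256                   # bits; assumed PS4-class bus
    CHIP_IO = 32                      # bits per GDDR5/GDDR6 chip
    chips = BUS_WIDTH // CHIP_IO      # 8 chips, single-sided

    for density_gbit in (8, 16, 32):
        print(f"{density_gbit:>2} Gbit chips -> {chips * density_gbit // 8} GB")
    # 8 Gbit -> 8 GB, 16 Gbit -> 16 GB, 32 Gbit -> 32 GB
    ```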
     
    #1366 TheAlSpark, May 31, 2018
    Last edited: May 31, 2018
  7. anexanhume

    Veteran Regular

    Joined:
    Dec 5, 2011
    Messages:
    2,078
    Likes Received:
    1,535
    Clamshell?
     
  8. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    The RSX MCM packaging doesn't quite lend itself to clamshell on the motherboard PCB. :p
     
  9. anexanhume

    Veteran Regular

    Joined:
    Dec 5, 2011
    Messages:
    2,078
    Likes Received:
    1,535
    You said 16GB, so I assumed you were talking PS4?
     
  10. ToTTenTranz

    Legend Veteran

    Joined:
    Jul 7, 2008
    Messages:
    12,272
    Likes Received:
    7,226
    Was Sony still selling PS3 devkits in 2013, when they transitioned to 64-bit GDDR5?

    Regardless, devkits don't ship in the same volume as the consumer versions, so they could just use an older PCB with 16x GDDR5 memory chips for the few devkits they're still selling.
     
  11. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    Yes, but I'm talking about the physical design needed to enable clamshell, which means placing the RAM on directly opposing sides of the PCB (because of the wiring), whereas the MCM is a separate module that sits on top of the motherboard PCB.
    I.e. where does the clamshell go?

    Well, 2Gbit GDDR5 also existed...

    But sure, I suppose it shouldn't be a huge deal, since it ought to be fully compatible.
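
    For anyone following along, a quick sketch of what clamshell buys you (generic figures, not any particular console): two chips on opposite faces of the PCB share one 32-bit channel, each running in x16 mode, so capacity doubles while bus width and bandwidth stay put.

    ```python
    # Hedged sketch of clamshell vs. single-sided population (generic numbers).
    # In clamshell, two chips share one 32-bit channel, each in x16 mode, so
    # chip count and capacity double while bus width (and bandwidth) do not.
    def memory_config(bus_bits, chip_io_bits, density_gbit, clamshell=False):
        chips = (bus_bits // chip_io_bits) * (2 if clamshell else 1)
        return chips, chips * density_gbit / 8   # (chip count, capacity in GB)

    print(memory_config(256, 32, 8))                  # (8, 8.0)   retail-style
    print(memory_config(256, 32, 8, clamshell=True))  # (16, 16.0) devkit-style
    ```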
     
  12. anexanhume

    Veteran Regular

    Joined:
    Dec 5, 2011
    Messages:
    2,078
    Likes Received:
    1,535
    Ok. I just don't see it as a showstopper given the low volume, as TTT points out.

    FWIW, AMD just announced the 2700E Zen+: 8C/16T, 2.8 GHz, 45W. Seems like a good baseline expectation for next gen. Clock speeds should only improve with the jump to 7nm, unless Zen 2 is a way bigger uarch than Zen/Zen+.
     
    Lightman likes this.
  13. turkey

    Veteran Newcomer

    Joined:
    Oct 21, 2014
    Messages:
    1,099
    Likes Received:
    878
    Location:
    London
    Given that they patched the security exploit in the Wii DVD drive's access password for the Wii U simply by making it the same password in upper case, they probably don't care all that much - and they would have failed far harder had they developed the silicon themselves.
     
    function and DSoup like this.
  14. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    Just a possible consideration. I'm not familiar with Sony's production/policies on updating retail & dev kits, especially where a change in memory type is involved.
     
  15. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    12,711
    Likes Received:
    3,632
    A 2.8GHz 8-core Zen+ would run laps around the current Jaguar cores. Even if they somehow got Jaguar up to 3.2GHz with 16 physical cores, the Zen+ would still run circles around it. I would expect Zen in a 2020 console.

    It seems that the Tegra exploit is an older one that was known before the Switch launched. I would think Nintendo would be pissed that Nvidia sold them a flawed chip like that.
     
  16. turkey

    Veteran Newcomer

    Joined:
    Oct 21, 2014
    Messages:
    1,099
    Likes Received:
    878
    Location:
    London
    So Nintendo should be mad at Nvidia when they themselves did not do any due diligence?
    It's not like they didn't know they were getting a vanilla chip, warts and all.

    I don't think security is Nintendo's priority; just enough to make bypassing it non-trivial is sufficient. The Switch has held out pretty well against attack, really.
     
    DSoup likes this.
  17. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,727
    Likes Received:
    4,003
    Location:
    Wrong thread
    In principle, GDDR6 should allow some or all of the following:

    - lower energy expenditure per bit accessed/stored;
    - a smaller external interface on the chip (smaller die) per unit of BW;
    - a simpler package (fewer micro bumps and package pins/bumps) for a given BW;
    - fewer traces on the mobo, so a smaller or simpler mobo (fewer metal layers);
    - lower total cost of memory due to fewer chips (eventually, though certainly not at first).

    So ideally: cost, size, power. Or a better tradeoff of the three.

    Yeah, GDDR5X does go into the first segment, but like you say it's real thin - going from tiny to nothing - so you have to squint!
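
    To hang rough numbers on the "fewer chips and traces for a given BW" point - ballpark per-pin rates of roughly 8 Gbps for mature GDDR5 and 14-16 Gbps for early GDDR6, my assumptions rather than datasheet quotes:

    ```python
    import math

    # Rough sketch: how many 32-bit chips are needed for a target bandwidth
    # at a given per-pin data rate. Rates are ballpark, not datasheet quotes.
    def chips_needed(target_gbs, gbps_per_pin, chip_io_bits=32):
        per_chip_gbs = gbps_per_pin * chip_io_bits / 8   # GB/s from one chip
        return math.ceil(target_gbs / per_chip_gbs), per_chip_gbs

    for name, rate in (("GDDR5 @ 8", 8), ("GDDR6 @ 14", 14), ("GDDR6 @ 16", 16)):
        n, per_chip = chips_needed(448, rate)
        print(f"{name} Gbps: {per_chip:.0f} GB/s/chip -> {n} chips for ~448 GB/s")
    # Fewer chips means fewer package balls and mobo traces for the same BW.
    ```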
     
  18. Xbat

    Veteran Newcomer

    Joined:
    Jan 31, 2013
    Messages:
    1,641
    Likes Received:
    1,311
    Location:
    A farm in the middle of nowhere
    This is why I think it would be foolish for the Xbox One X to still play next-gen games, as some people want.
     
  19. Allandor

    Regular Newcomer

    Joined:
    Oct 6, 2013
    Messages:
    665
    Likes Received:
    619
    Well, even at 7nm it would still need at least 20W, which is still more than Jaguar. I really don't expect them to build in a CPU that uses more than 10W; anything beyond that would cut into the power budget of the GPU, which is still the main part of a console. Maybe if they reduce frequencies a bit more (~2.5GHz) it can reach the 10W "border".
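
    As a crude sanity check on that 10W figure - a simplistic P ~ C*V^2*f model where the voltages and the 7nm capacitance scaling factor are guesses on my part, so treat the outputs as hand-waving:

    ```python
    # Crude dynamic-power scaling sketch: P ~ C * V^2 * f. Ignores leakage and
    # uncore power; every number below the baseline is an assumption.
    P0, F0, V0 = 45.0, 2.8, 1.0   # 2700E baseline: watts, GHz, assumed volts
    C_7NM = 0.6                   # assumed effective-capacitance scaling vs. 12nm

    def scaled_watts(f_ghz, volts):
        return P0 * C_7NM * (volts / V0) ** 2 * (f_ghz / F0)

    print(f"{scaled_watts(2.5, 0.85):.0f} W @ 2.5 GHz")  # ~17 W, above 10 W
    print(f"{scaled_watts(2.0, 0.80):.0f} W @ 2.0 GHz")  # ~12 W
    ```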
     
  20. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    6,333
    Likes Received:
    2,337
    The quad-core 28nm Jaguar module [without the GPU] was built by AMD to target 15W. Gen8 consoles had a custom APU with two of those modules.

    I think that 30-40W is a good target for a gen9 console, and the 2700E fits quite well into that.
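
    Putting the thread's figures side by side - trivial arithmetic, and the gen9 range here is the 30-40W suggestion above rather than anything official:

    ```python
    # Trivial CPU power-budget arithmetic from the figures quoted in this thread.
    JAGUAR_MODULE_W = 15               # 28nm quad-core Jaguar module target
    gen8_cpu_w = 2 * JAGUAR_MODULE_W   # two modules per gen8 APU -> 30 W

    GEN9_TARGET_W = (30, 40)           # suggested range, not official
    R2700E_W = 45                      # 8C/16T Zen+ @ 2.8 GHz on 12nm
    print(gen8_cpu_w, GEN9_TARGET_W, R2700E_W)
    # A 7nm shrink and/or a small clock trim could plausibly pull 45 W into range.
    ```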
     
    Lightman likes this.