Next Generation Hardware Speculation with a Technical Spin [2018]

Why? What does nVidia bring?
Currently... Power consumption. But the higher price would more than negate that advantage.

Also, considering we're looking at a 2021 launch for Sony, we may see Navi's successor, which looks to be designed with Ryzen's architecture in mind, i.e. a more Nvidia-like approach.
 
Considering that Microsoft has been working more closely with Nvidia lately on DXR and such, is it possible that they would consider going with Nvidia for their next-generation console?

This last gen, I think one of the reasons Sony was able to dominate was the overall similarity between the two companies' products; if Microsoft went with Nvidia this time, it would really shake things up.

Not really. It makes sense that MS would work with Nvidia since its GPUs dominate the PC space. Both AMD and Nvidia are important partners.

MS first showed off DX12 with Nvidia hardware.
 
I doubt Nvidia can convince either console manufacturer to choose them, unless they offer something truly remarkable for the $/perf.
Both have been burnt by Nvidia in the past, and so far AMD has been a solid partner.

Edit: many mistakes.
 
I'm pretty sure Nintendo is kicking themselves in the ass right now, considering the exploit in the Tegra that has blown their console wide open.
 
As pointed out yesterday, back in 2013 Sony actually switched the RSX's memory pool from GDDR3 to GDDR5 and halved the bus to 64-bit in the process.
By transitioning to 4× 16Gbit GDDR6 chips, the PS4's and PS4 Pro's SoC+RAM could look like this:

[image: mock-up of the SoC plus 4× GDDR6 chips on a single package]


They could make a tiny motherboard out of it.
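
A quick back-of-the-envelope check on that layout (the per-pin rates below are my own assumptions, not anything Sony has announced): four 16Gbit chips give 8GB on a 128-bit bus, so the GDDR6 would need roughly 11 Gbps pins to match the PS4's ~176 GB/s and roughly 14 Gbps to cover the Pro's ~218 GB/s.

```python
# Back-of-the-envelope sketch: could 4x 16Gbit GDDR6 chips replace the current
# GDDR5 pool at the same capacity and bandwidth? Pin speeds here are assumptions.

def capacity_gb(chips, density_gbit):
    return chips * density_gbit / 8            # Gbit -> GB

def bandwidth_gbps(bus_width_bits, pin_rate_gbps):
    return bus_width_bits * pin_rate_gbps / 8  # Gbit/s per pin -> GB/s total

# Current retail designs: 256-bit GDDR5 bus.
print(bandwidth_gbps(256, 5.5))   # ~176 GB/s (PS4)
print(bandwidth_gbps(256, 6.8))   # ~218 GB/s (PS4 Pro)

# Hypothetical: 4x 16Gbit GDDR6 chips, 32 data bits per chip = 128-bit bus.
print(capacity_gb(4, 16))         # 8.0 GB
print(bandwidth_gbps(128, 11.0))  # ~176 GB/s -> ~11 Gbps pins for PS4 parity
print(bandwidth_gbps(128, 14.0))  # ~224 GB/s -> ~14 Gbps pins covers the Pro
```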

I was under the impression that their current devkits go for the full 16GB now (using the same mobo design), so this sort of packaging might not be desirable until 32Gbit-density chips arrive for any hypothetical switch to GDDR6, lest they ask devs to go back to an 8GB kit or design an entirely different motherboard altogether.

edit:

Durango might be a neat case for switching due to the twin 16-bit channels.
 
I was under the impression that their current devkits go for the full 16GB now (using the same mobo design), so this sort of packaging might not be desirable until 32Gbit-density chips arrive for any hypothetical switch to GDDR6, lest they ask devs to go back to an 8GB kit or design an entirely different motherboard altogether.
Clamshell?
 
I was under the impression that their current devkits go for the full 16GB now (using the same mobo design), so this sort of packaging might not be desirable until 32Gbit-density chips arrive for any hypothetical switch to GDDR6, lest they ask devs to go back to an 8GB kit or design an entirely different motherboard altogether.

Was Sony still selling PS3 devkits in 2013 when they transitioned to 64-bit GDDR5?

Regardless, devkits don't have the same volume as the consumer versions, so they could just use an older PCB with 16x GDDR5 memory chips for the few devkits they're still selling.
 
You said 16GB, so I assumed you were talking PS4?
Yes, but I'm talking about the physical design to enable clamshell, which means placing the RAM on directly opposing sides of the PCB (because wiring), whereas the MCM is a separate module that goes on top of the motherboard PCB.
i.e. where does the clamshell go?

Was Sony still selling PS3 devkits in 2013 when they transitioned to 64-bit GDDR5?

Well, 2Gbit GDDR5 also existed...

But sure, I suppose it shouldn't be a huge deal since it ought to be fully compatible.
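
For reference, here's a tiny sketch of why clamshell gets a devkit to 16GB on the same bus; the chip counts and densities are illustrative assumptions, not a confirmed Sony BOM.

```python
# Clamshell, roughly: two DRAM chips share one 32-bit channel, each driven in
# x16 mode, so capacity doubles while bus width (and peak bandwidth) stays put.

def pool_gb(channels_32bit, chips_per_channel, density_gbit):
    return channels_32bit * chips_per_channel * density_gbit / 8

print(pool_gb(8, 1, 8))   #  8 GB: 8x 8Gbit GDDR5 on a 256-bit bus (retail-style)
print(pool_gb(8, 2, 8))   # 16 GB: same 256-bit bus in clamshell, 16 chips (devkit-style)
```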
 
Yes, but I'm talking about the physical design to enable clamshell, which means placing the RAM on directly opposing sides of the PCB (because wiring), whereas the MCM is a separate module that goes on top of the motherboard PCB.
i.e. where does the clamshell go?



Well, 2Gbit GDDR5 also existed...

But sure, I suppose it shouldn't be a huge deal since it ought to be fully compatible.

Ok. I just don’t see it as a show stopper given the low volume, as TTT points out.

FWIW, AMD just announced the 2700E Zen+. 8C/16T, 2.8 GHz, 45W. Seems like a good baseline expectation for next gen. Clock speeds should only improve with the jump to 7nm, unless Zen 2 is a way bigger uarch than Zen/Zen+.
 
I'm pretty sure Nintendo is kicking themselves in the ass right now, considering the exploit in the Tegra that has blown their console wide open.

Given that they patched the security exploit in the Wii DVD drive's access password for the WiiU simply by making it the same password in upper case, they probably don't care all that much, and they would have failed far harder had they developed the silicon themselves.
 
Ok. I just don’t see it as a show stopper given the low volume, as TTT points out.
Just a possible consideration. I'm not familiar with Sony's production/policies on updating retail & dev kits, especially where a change in memory type is done.
 
Ok. I just don’t see it as a show stopper given the low volume, as TTT points out.

FWIW, AMD just announced the 2700E Zen+. 8C/16T, 2.8 GHz, 45W. Seems like a good baseline expectation for next gen. Clock speeds should only improve with the jump to 7nm, unless Zen 2 is a way bigger uarch than Zen/Zen+.
A 2.8GHz 8-core Zen+ would run laps around the current Jaguar cores. Even if they somehow got Jaguar up to 3.2GHz with 16 physical cores, it would still run circles around it. I would expect Zen in a 2020 console.

Given that they patched the security exploit in the Wii DVD drive's access password for the WiiU simply by making it the same password in upper case, they probably don't care all that much, and they would have failed far harder had they developed the silicon themselves.
It seems that the Tegra exploit is an older exploit that was known before the Switch launched. I would think Nintendo would be pissed that Nvidia sold them a flawed chip like that.
 
A 2.8GHz 8-core Zen+ would run laps around the current Jaguar cores. Even if they somehow got Jaguar up to 3.2GHz with 16 physical cores, it would still run circles around it. I would expect Zen in a 2020 console.

It seems that the Tegra exploit is an older exploit that was known before the Switch launched. I would think Nintendo would be pissed that Nvidia sold them a flawed chip like that.

So Nintendo should be mad with Nvidia when they themselves did not do any due diligence?
It's not like they wouldn't have known it was a vanilla chip, warts and all.

I don't think security is Nintendo's priority; just enough to make it non-trivial to bypass is sufficient. The Switch has held up pretty well to attack, really.
 
Agree with your conclusions. Question is: what is the benefit of a console shrink with GDDR6? Is saving power worth the extra cost? Will they help out their supplier by taking the really low-binned modules?

In principle, GDDR6 should allow some or all of the following: lower energy expenditure per bit accessed/stored; smaller external interface on the chip (smaller die) per unit of BW; simpler package (fewer micro bumps and package pins / bumps) for a given BW; fewer traces on the mobo so smaller or simpler mobo (fewer metal layers); lower total cost of memory due to fewer chips (eventually though certainly not at first).

So ideally, cost, size, power. Or a better tradeoff of the three.
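
To put rough numbers on the fewer-chips/fewer-traces point, here's a toy calculation; the bandwidth target and per-pin rates are just assumptions for illustration, not leaks or specs.

```python
# Illustrative only: how many 32-bit memory chips a given bandwidth target needs,
# at assumed per-pin rates for GDDR5 vs GDDR6.

def bus_and_chips(target_gb_per_s, pin_rate_gbps, bits_per_chip=32):
    bus_bits = target_gb_per_s * 8 / pin_rate_gbps
    return round(bus_bits), round(bus_bits / bits_per_chip, 1)

# Suppose a hypothetical next-gen console targets ~450 GB/s.
print(bus_and_chips(450, 8.0))    # GDDR5  @  8 Gbps -> ~450-bit bus, ~14 chips (impractical)
print(bus_and_chips(450, 14.0))   # GDDR6  @ 14 Gbps -> ~257-bit bus, ~8 chips
print(bus_and_chips(450, 16.0))   # GDDR6  @ 16 Gbps -> ~225-bit bus, ~7 chips
```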

That doesn't make sense if you look at the PC graph. The 2016 notch on that graph matches up with NVidia launching GDDR5x-equipped GPUs (May 2016). So each year label marks the start of that year, not the end.

I believe those were also the first GDDR5x (standard ratified in Jan. 2016) products on the market.

[edit] Then again if you zoom way in on the graph, it looks like the GDDR5x/6 section may go into the first segment. So, not so sure now.

Regards,
SB

Yeah, the GDDR5x does go into the first segment, but like you say it's real thin - going from tiny to nothing - so you have to squint!
 
A 2.8GHz 8-core Zen+ would run laps around the current Jaguar cores. Even if they somehow got Jaguar up to 3.2GHz with 16 physical cores, it would still run circles around it. I would expect Zen in a 2020 console.

This is why I think it would be foolish for the Xbox One X to still play next-gen games like some people want.
 
Ok. I just don’t see it as a show stopper given the low volume, as TTT points out.

FWIW, AMD just announced the 2700E Zen+. 8C/16T, 2.8 GHz, 45W. Seems like a good baseline expectation for next gen. Clock speeds should only improve with the jump to 7nm, unless Zen 2 is a way bigger uarch than Zen/Zen+.
Well, even at 7nm it would still need at least 20W, which is still more than Jaguar. I really don't expect them to build in a CPU that uses more than 10W; anything above that would cut into the power budget of the GPU, which is still the main part of a console. Maybe if they reduce frequencies a bit more (~2.5GHz) it can reach the 10W "border".
 
Well, even at 7nm it would still need at least 20W, which is still more than Jaguar.
The quad-core 28nm Jaguar module [without the GPU] was built by AMD to target 15W. Gen8 consoles had a custom APU with two of those modules.

I think that 30-40W is a good target for a gen9 console, and this 2700E fits quite well into that.
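
As a toy illustration of why the CPU budget matters (every number below is a guess, not a spec): whatever the CPU cluster draws comes straight out of what's left for the GPU.

```python
# Toy power-budget split for a hypothetical gen9 box. All figures are guesses
# chosen only to illustrate the CPU-vs-GPU trade-off discussed above.

wall_power     = 180.0   # W at the wall, roughly launch-gen8 territory
psu_efficiency = 0.85    # fraction of wall power delivered to the board
other_parts    = 25.0    # W for RAM, storage, fan, I/O (guess)

board_budget = wall_power * psu_efficiency - other_parts

for cpu_watts in (10, 20, 30, 40):
    gpu_watts = board_budget - cpu_watts
    print(f"CPU {cpu_watts:>2} W -> ~{gpu_watts:.0f} W left for the GPU")
```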
 