Predict: Next gen console tech (9th iteration and 10th iteration edition) [2014 - 2017]

Yup, 8 TFLOPs for 2020 would be pretty bad.
Basically, it would mean the very same graphics as the PS4 but at 4K.
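(For a rough sense of where that figure comes from: 4K is four times the pixels of 1080p, so scaling the PS4's 1.84 TFLOPs purely by resolution lands right around 8 TFLOPs. A quick back-of-the-envelope check in Python, purely illustrative:)

# Back-of-the-envelope: "PS4 graphics, but at 4K"
ps4_tflops = 1.84                        # PS4 GPU: 1152 ALUs x 2 ops/clock x 800 MHz
scale = (3840 * 2160) / (1920 * 1080)    # 4K has 4x the pixels of 1080p
print(ps4_tflops * scale)                # ~7.4 TFLOPs, roughly the 8 TFLOPs figure above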


You are not going to see a 15-16 TFLOPs console in 2020.

Why not?
12-13 TFLOPs will be pretty much expected of early 2017's flagships, and 2018, with widespread 10nm, may see that kind of processing power going into mid-to-high-end graphics cards.
By 2020, we'll have either a very mature 10nm process or a relatively recent 7nm.
Console APUs had better come with performance equivalent to a mid-to-high-end graphics card from 2018.
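(For anyone wanting to sanity-check these projections: peak FP32 throughput is just ALU count x 2 ops per clock x clock speed. The PS4 and Fury X figures below are real; the last line is a purely hypothetical 2020-ish part.)

# Peak FP32 TFLOPs = ALUs x 2 (fused multiply-add) x clock in GHz / 1000
def tflops(alus, clock_ghz):
    return alus * 2 * clock_ghz / 1000

print(tflops(1152, 0.800))   # PS4 (18 CUs):    ~1.84 TFLOPs
print(tflops(4096, 1.050))   # Fury X (64 CUs): ~8.6 TFLOPs
print(tflops(4096, 1.500))   # hypothetical wide, fast 2020 part: ~12.3 TFLOPs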
 

I expect 14nm will get mid-range graphics cards to around 8 teraflops by 2020. That's what consoles will likely use.
 
If IHVs are still stuck on 14nm by 2020, then something will have gone really badly wrong at Samsung, GlobalFoundries and TSMC.
14nm has been in mass production since mid-2014 at Intel and mid-2015 at TSMC and Samsung. By 2020 the node will be over 5 years old. The only way that would happen is if 10nm is skipped and 7nm is late.
 
Well, it all comes down to cost effectiveness on newer nodes. It's hard to predict four more years down the road, but it doesn't seem too far-fetched to think 10nm might not be that cost effective for consoles.
 

Then price them higher...

Consumers want to see an appreciable power increase from one gen to the next. If Sony/MS can't achieve that on the latest semiconductor fab process with a $400-500 console, then do so with a $500-600 console.

If gamers don't have a choice in the matter, then they'll bite the bullet and jump in at the higher price point.
 
The cost per transistor is barely dropping on recent nodes. ICs at a given performance level get smaller, but also more expensive per unit area, meaning you need to raise prices to extract significantly more performance. And consoles are highly cost sensitive.
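(A toy illustration of that squeeze, with completely made-up cost figures rather than real foundry pricing: a shrink that nearly halves the die barely lowers its cost once the higher price per mm2 is factored in.)

# Hypothetical numbers purely to illustrate the cost-scaling argument
old_area_mm2 = 350            # die on the older node
old_cost_per_mm2 = 0.10       # $/mm^2 (made up)
new_area_mm2 = 350 * 0.55     # same design shrunk to the newer node
new_cost_per_mm2 = 0.17       # $/mm^2 (made up) - newer wafers cost more per area

print(old_area_mm2 * old_cost_per_mm2)   # $35.00 on the old node
print(new_area_mm2 * new_cost_per_mm2)   # ~$32.73 on the new node - barely cheaper
# Growing the die back out to gain performance pulls the cost right back up,
# which is exactly the bind a cost-sensitive console is in.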
 

Then isn't the answer to wait longer? Surely process improvements at each node will mean that you'll eventually get to a place where the cost of transistors per mm² is at an acceptable level?

Plus, we can only really project from what we can see. Cutting edge technological breakthroughs in the near future could get Moore's Law back on track and see process improvements come quicker. Yeah, it's a little pie in the sky, but we're talking about 2020, which is a good near-half-decade from now. That's an eon in the tech industry.

Question:
Out of curiosity, are stacked ASICs still a thing that looks likely to become a reality any time soon? Or is the cooling issue still their biggest stumbling block?
 
Think about it. The PS4K - if real - will be around, what, 3-4 TFLOPs in January 2017? Give or take a TFLOP and a quarter of a year? That's coming from 1.8 TFLOPs in 2013.

How is the PS5 going to be 17 TFLOPs over the same span of time? 10 would already be quite optimistic.
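(Putting rough numbers on that: the rumoured PS4K figure would be about a 2.2x jump over the 2013 PS4, whereas 17 TFLOPs by 2020 would need more than a 4x jump over a similar stretch. All figures here are the rumoured/claimed ones, not confirmed specs.)

# Implied scaling between the rumoured data points
ps4_2013 = 1.84          # TFLOPs, launch PS4
ps4k_2017 = 4.0          # TFLOPs, rumoured PS4K ballpark
ps5_2020_claim = 17.0    # TFLOPs, the figure being questioned

print(ps4k_2017 / ps4_2013)        # ~2.2x over roughly 3.5 years
print(ps5_2020_claim / ps4k_2017)  # ~4.25x needed over a similar span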
 
I can see 16 TF in 2020 if the 10nm node is mature by then. The next flagships at 14nm in 2017 will probably be in that range of performance, and a drop down to 10nm would make them small enough for consoles.
 
Then isn't the answer to wait longer? Surely process improvements at each node will mean that you'll eventually get to a place where the cost of transistors per mm² is at an acceptable level?
How long would we wait, though? We're already up to at least three years per node - it may be longer still. The ability to shrink electronics is levelling off severely - we'll hit the wall well within our lifetimes, where a chip can't be shrunk any further at all.

Cutting edge technological breakthroughs in the near future could get Moore's Law back on track and see process improvements come quicker.
Assuming that is true, it would only stay true for a few years longer, and then quantum effects would destroy any chance of having functional circuits - the transistors would simply be too small to reliably hold a state of either on or off, and would randomly switch back and forth between the two...

Out of curiosity, are stacked ASICs still a thing that looks likely to become a reality any time soon?
It could only work for very low-power devices, because of cooling, as mentioned. IBM, for example, has explored microchannels carrying fluid, but that'd be another layer of complication, added expense, and probably still fairly ineffective, as the flow rate through such tiny channels would be minuscule.
 

Thanks Grall. This has always been an interesting area for me. I've never personally seen micro-channels as a viable solution, since the minuscule channel diameters would see viscous forces dominating over the fluid's inertial forces, resulting in a low Reynolds number. The resulting issues are low fluid flow rates as well as poor fluid mixing within the channels. This hurts heat transfer, as the system has to rely on conductive rather than convective heat transfer, meaning it becomes more reliant on the fluid's thermal conductivity than on any extensive system property you actually have control over (e.g. fluid flow rate).
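(To put a rough number on that Reynolds point, assuming a water-like coolant in a ~100 micron channel at a fairly brisk 0.5 m/s; the dimensions here are purely illustrative, not from any real microchannel design:)

# Re = rho * v * D / mu  -- illustrative values only
rho = 1000.0    # kg/m^3, water-like coolant density
mu = 1.0e-3     # Pa.s, dynamic viscosity of water at ~20 C
d = 100e-6      # m, assumed channel diameter (~100 microns)
v = 0.5         # m/s, assumed flow velocity

print(rho * v * d / mu)   # Re ~ 50: deep in the laminar regime (turbulence needs ~2300+),
                          # so mixing is poor and heat transfer leans on conduction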

Depending on the diameter and length of the channels (likely the chip length), you'd be more likely to see fluid flow through the channel by capillary action rather than forced fluid flow, as the pressure drop through the channel (i.e. the resistance to fluid flow) will be really high. I suppose the channel density and the heat generation per unit volume of silicon may mean that you can get by with a system that works even with such low fluid flows, provided the fluid inlet temps are low enough and the fluid density is high enough to provide sufficient heat capacity. Without any numbers it's impossible for me to know. At face value though, I'd assume it would be very difficult to engineer a system that works.

They're probably better off submerging the stacked ASIC in a high-heat-capacity, low-temperature fluid (possibly even at cryogenic temps) and letting conduction through the chip do its thing. After all, a stacked ASIC will only be in the region of a few mm in thickness, so I think that's probably a much more viable solution than micro-channels. The problem with this, however, lies in how to achieve such cold temps cost-effectively. Essentially you'd need a refrigeration circuit, implying the need for fluid compression, which increases power consumption by quite a bit.

(Not sure if this is too OT or not - MODs may feel free to move this to the cooling solution spin off thread)
 
@Prophecy2k
Interesting. I'm not sure if you're aware, but IIRC, Cray T90 systems allegedly used Fluorinert coolant sprayed from nozzles onto bare processor chips, then sucked the resulting vapors off to a condenser unit. Complicated setup for a consumer system (and the chassis for a 32-processor machine weighed a literal shit-ton too, to cope with the thousands of liters of coolant - which in turn added significant additional weight, omg...), but phase-change cooling is enormously powerful, as we all know, so in this case it must have been worth it. :)

Perhaps one day something like that will become necessary to continue growing computer performance. Develop a fluid with a low evaporation point, but still able to carry significant heat... Such a setup could potentially cool chips below ambient temps.
 
I think some people are highly optimistic about the industry over the next few years. I think 10 TFLOPs is the max we can get into a console in 2020, but more likely 8 TFLOPs is what we are gonna see.

Unless you want an $800 console, which ain't gonna happen.

People are trying to apply the conventional wisdom of the console industry from the last 20 years to the PC hardware industry as it is today. PS1 FLOPs -> PS2 FLOPs -> PS3 FLOPs etc. does not directly apply here.
 
If I remember well, the first 28nm GPU was the 7970 in December 2011, and we are still using the same process node nearly 5 years later...
 
This might please some of you:

http://www.gameinformer.com/b/news/...e-and-a-half-says-xbox-boss-phil-spencer.aspx

"I’m not a big fan of Xbox One and a half. If we’re going to move forward, I want to move forward in big numbers," Spencer said. "I don’t know anything about any of the rumors that are out there, but I can understand other teams’ motivations to do that. For us, our box is doing well. It performs, it’s reliable, the servers are doing well. If we’re going to go forward with anything, like I said, I want it to be a really substantial change for people – an upgrade." - Phil Spencer
 
Well, even if they don't try much, it will not be hard to provide a substantial upgrade over the Xbox One. :)

The PS4 rumours centre on going from what are mostly 1080p games to 4K, twice the GPU/CPU performance and a significant boost to the VR experience, but Phil only wants to do an upgrade with "big numbers" and "substantial change"?

OK, Phil. Whatever you say. :yep2: Certainly substantial change happening here: I'm finally upgrading from Windows 7 to Windows 10.
 