Next Generation Hardware Speculation with a Technical Spin [2018]

Status
Not open for further replies.
20 billion transistors would really give a boosted, not-forward-compatible PS5 in 2023... 10 billion maybe a kind of ultra PS4, fully backward and PS4-forward compatible (this is what I feel is coming in 2019)...
 
Quick bit of Googling suggests 100M transistors per mm^2 at 7nm. That's 35 billion in 350 mm^2 of die area. 10 billion would be a 100 mm^2 die, which is a tiny chip for a console.
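The density arithmetic above is easy to sanity-check. A minimal sketch, assuming the post's ~100M transistors/mm^2 ballpark for 7nm (not a datasheet figure):

```python
# Density figure is the post's 7nm ballpark, not a foundry datasheet number.
DENSITY_7NM = 100e6  # transistors per mm^2 (assumed)

def transistors(area_mm2, density=DENSITY_7NM):
    """Transistor budget for a given die area."""
    return area_mm2 * density

def die_area(transistor_count, density=DENSITY_7NM):
    """Die area needed for a given transistor count."""
    return transistor_count / density

print(transistors(350) / 1e9)  # 35.0 billion for a 350 mm^2 die
print(die_area(10e9))          # 100.0 mm^2 for 10 billion transistors
```

Both numbers match the post: 35 billion at 350 mm^2, and 10 billion fitting in a tiny 100 mm^2 die.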

Looking at Xbox, OXB > 360 was a 5.6x transistor count increase over 4 years. 360 > XB1 was 10x over 8 years. That's roughly a linear 1.25x transistor increase per year. 2019 would be 6 years x 1.25 = 7.5x as many transistors as the Xbox One in an "Xbox 8K", which is 37.5 billion transistors (a bit conservative based on the numbers), which would be 375 mm^2.
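The extrapolation above, spelled out. This follows the post's rule of thumb literally (a linear multiplier of years x 1.25, matching the 6 x 1.25 = 7.5x figure), with the Xbox One's roughly 5 billion transistors as the assumed baseline:

```python
# Assumptions: ~5 billion transistors for Xbox One (2013) and the post's
# linear "1.25x per year" rule; the 7nm density is the same assumed ballpark.
XB1_TRANSISTORS = 5e9    # approximate
GROWTH_PER_YEAR = 1.25   # linear multiplier per year, per the post
DENSITY_7NM = 100e6      # transistors per mm^2 (assumed)

years = 2019 - 2013
multiplier = years * GROWTH_PER_YEAR        # 7.5x
transistors = XB1_TRANSISTORS * multiplier  # 37.5 billion
area = transistors / DENSITY_7NM            # 375 mm^2

print(multiplier, transistors / 1e9, area)  # 7.5 37.5 375.0
```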

So like I say, it all depends on what the processes can achieve regards density, although cost will also be an issue if each mm^2 starts costing more and more. Looking at actual data instead of plucking ideas out of thin air definitely helps with estimates, though.
 
TSMC 7nm looks set to increase density by a factor of three (or better) over 16nm FF, going by a couple of different metrics.
So HBRU's ballpark figure of 20 billion transistors is probably not too far off the mark if the ASIC is produced on TSMC 7nm and the die size is in the vicinity of 300mm2, using the XBoxX data as representative of a console ASIC. (Actually, given the nebulous nature of the figures, it's probably pretty valid for GlobalFoundries as well.)
Note though that power considerations may affect the die size decision as well, since power (even if it doesn't seem too far off) doesn't necessarily scale quite as well as density does, and the interested parties may be wary of yields/cost.
 
20 billion probably is reasonable, but more like 2019/2020 and not 2023. Where'd that date and transistor count combo come from?
 
2023 for cost affordability of 7nm given the market (the Bitcoin boom and others)... But maybe sooner, who knows?!? It's 5 years from now, so maybe in between, in 2019, there is room for an advanced PS4 at 10 billion transistors... Also, the market switching to online selling may suggest old hardware will live longer... The PS4 in 2020 will be at 100 million units...
 
Yes... that's why I think about a 2019 @ 7nm extra-beefed PS4 (10 billion transistors)... joined with a shrunk PS4 Pro and an ultra-slim, extra-cheap PS4 (maybe without Blu-ray, targeting the online gaming and TV market... with 4K video capabilities only)... Maybe 50 US$ for this last one
 
Maybe Sony can accept losing some money on such a device... OK, maybe 100 $... I forgot the controller...
 
With an increased system cadence, I'm thinking nothing above 10 to 12 TFLOPS in the GPU compute range if we're looking at a late 2019 PS5 at the earliest. Basically Vega 64 performance shrunk down, though probably on Navi and w/e is the current Zen processor generation available at that point, and with HBM of course unless the memory industry really has gone to hell by then.
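The "Vega 64 shrunk down" ballpark can be checked with the standard FP32 throughput formula (2 FMA ops per shader per clock). Vega 64's 4096 shaders are a public figure; the ~1.5 GHz boost clock here is rounded:

```python
# FP32 FLOPS = 2 (FMA) * shader_count * clock.
def tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz * 1e9 / 1e12

# Vega 64: 4096 shaders at ~1.5 GHz boost (rounded) lands at the top
# of the 10-12 TFLOPS range mentioned above.
print(tflops(4096, 1.5))  # 12.288
```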

Should be true 4K out of the box, with games supporting the PS4 Pro at 1080p. Support the previous machine, in a cell-phone-like model. Take advantage of that x86 architecture! And perhaps Navi won't be so different that the PS4 Pro's Vega-ish graphics can still run its code in a reduced manner.
 
Samsung has invested very heavily to increase manufacturing capabilities for memory. Availability should generally be good (although Samsung probably means to eliminate competitors).
A customer like Sony isn’t buying RAM for the PS4 from the spot market, they contract the supplier directly.

It has been argued by some knowledgeable folk that ”True 4K” might not be a good trade off IQ wise for the next generation consoles. We’ll see.
 
We are all taking the XBox One X as a cost/price reference, but today I noticed that after just 3 months it's at 460€ on Amazon and about 420€ at other retailers.
Are we really sure that it is *that* expensive to produce?
 
Well I'm not taking XB1X as a reference. No-one should, as retail price isn't indicative of BOM or what profit margins (+ve and -ve) a company might aim for at launch of a new console. People should be looking at component costs - that's always been the tradition for B3D technical forecasts.
 
The cost of 7nm production will be higher than 16nm; expect chips to be smaller to compensate.
Adopting EUV in 2019 should bring significant yield improvements to the foundries, which in turn brings the cost-per-healthy-chip down.
Sure, the price per wafer will increase, but the actual price per usable mm^2 could be very similar.
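The wafer-cost vs. cost-per-healthy-chip distinction can be sketched with the simple Poisson yield model, Y = exp(-D*A). The wafer cost, usable wafer area, and defect densities below are illustrative assumptions, not foundry figures:

```python
import math

# Roughly the usable area of a 300 mm wafer, in mm^2 (assumed).
WAFER_AREA = 70_000

def cost_per_good_die(die_mm2, wafer_cost, defects_per_mm2):
    """Cost of one working die under the Poisson yield model Y = exp(-D*A)."""
    yield_frac = math.exp(-defects_per_mm2 * die_mm2)
    dies_per_wafer = WAFER_AREA // die_mm2
    return wafer_cost / (dies_per_wafer * yield_frac)

# Hypothetical numbers: a pricier wafer (think 7nm + EUV) with a better
# defect density can land near a cheaper, leakier process on the metric
# that matters, cost per usable chip.
print(cost_per_good_die(350, wafer_cost=6000, defects_per_mm2=0.002))
print(cost_per_good_die(350, wafer_cost=9000, defects_per_mm2=0.001))
```

With these made-up inputs the two cases come out within a few dollars of each other, which is the point being made above.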
 
It was stated that it's sold at production cost.
I don't understand your argument then. You're asking if the XB1X costs that much to produce based on current retail price of €420, but you've been told it costs $500 to make. If you accept MS's statement about cost to make (I can't find anything quite that definitive, only a mention that it's not making a profit, but we don't know how MS factors in software sales into their calculations), then the lower price is likely retailers discounting to shift stock. Retail price doesn't instantly indicate BOM. A lower retail price doesn't mean the BOM has dropped.

If there's a decent proof that XB1X is sold at cost, we can actually use it as a $500 reference point. ;)
 
I hope the next-gen home consoles come after the "Navi" GPU generation, as it is only after that arch that AMD recognises its GPUs as next-gen (according to some slides I saw recently). I suspect there is truth to that, and that until then their GPUs will ride on the same "backbones" as today's GPUs.

What I really want to see is a good, affordable handheld which picks up the ball where Nintendo dropped it with the Switch. There are definitely savings to be made on the Nvidia design, as technology has not stood still since the Tegra X1 was released.
For example, the Tegra X1 has 4x A57 cores and 4x A53 ones, as you know; each cluster has its local share of L2. In the Switch, one cluster and its local share of the L2 is disabled. Now ARM has DynamIQ IPs (A75/A55) available; it should be able to offer much better CPU performance per watt and per mm2 (even on the same process) at the cost of some scheduling optimisation. In the same footprint as four cores you could fit 2 big + 4 tiny cores, for example.
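The footprint trade described here can be made concrete with an area-budget toy model. All per-core areas below are made-up placeholders on a common process, purely to illustrate the shape of the trade:

```python
# Per-core areas in mm^2 are illustrative assumptions, not measured figures.
AREA = {"big": 2.0, "tiny": 0.5}

def cluster_area(big, tiny):
    """Total CPU area for a mix of big and tiny cores."""
    return big * AREA["big"] + tiny * AREA["tiny"]

old = cluster_area(big=4, tiny=4)  # 4x A57 + 4x A53 style layout
new = cluster_area(big=2, tiny=4)  # 2 big + 4 tiny, as suggested
print(old, new)                    # 10.0 6.0
```

With these placeholder areas, dropping two big cores frees 40% of the CPU footprint, which is the kind of saving the post is gesturing at.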
Then there is the GPU. Nvidia has made progress, but so has the competition; PowerVR is definitely competitive, and while still lagging, Mali is making strides (the G71's underperformance notwithstanding).
The Tegra X1 uses a 20nm process; better is available now. I wonder how much it would cost to use Intel's new 22nm process and how its cost relates to the competition.
Then there is image processing, post-processing and specialised hardware that could help alleviate the severe resolution/bandwidth limitations any power-constrained and cheap system is bound to struggle with.

Anyway, the point is that I suspect manufacturers can do significantly better than the Switch.
 
I don't understand your argument then.
Following Microsoft's statement, we (all of us, but not you) fixed the BOM for a 6TF console at about 500 €/$. Having no other insight on anything, we used it as a starting point for projecting the possible hardware in a 2019/20 console with a BOM of about 400 €/$.
Now the X has significantly lowered in price in 3 months.
So:
- all the projections are based on wrong data
- it's just Microsoft trying to sell more consoles at a loss, knowing that the target audience will spend a lot of money on content
- blame Obama
?
 
This talk of a GCN successor launching in 2020 is making me rethink my 2019 prediction quite considerably.

A GCN successor with performance per watt improvements in excess of a GCN iteration must be an enticing prospect to Sony/MS. So we can assume they would want to utilise it one way or another - base console or mid-gen.

But, to implement it in a mid-gen console seems like a surefire way of creating compatibility problems, pissing away some performance or silicon, or, at least, costing resources by way of backwards compatibility R&D. So, it seems safe to say that a GCN successor is a good fit for the launch of a new generation.

With that in mind, 2020 would be the very earliest point at which a GCN successor based console could launch. But would that be the best course of action?

Here are the reasons why I'm unsure:
- 7nm should be fairly mature by 2020, but would a new macro architecture face problems with yield?
- Could a new macro architecture be manufactured at sufficient scale quickly enough?
- Is a new macro architecture likely to have teething problems that would make it worth waiting a revision or two?

I wasn't really paying attention to AMD when they first launched GCN, so I'm interested in the perspective of people here who have followed its launch and development.
 