Sony PS6, Microsoft neXt Series - 10th gen console speculation [2020]

2027 PS6.

12-core/24-thread Zen 6 based, at 4-5 GHz clocks.
GPU is RDNA 5/6 based, with approx 1.5x 7900 XTX performance in raster and 2.5x in RT.
Dedicated upscaling hardware, be it ML, matrix-math cores, or some special temporal upscaling support.
Likely sitting on top of 1-2 GB cache/eDRAM, and then 20-24 GB GDDR, giving ~700 GB/s.
Big old SSD, via PCIe 4/5.

Anyway, that's my guess! Let's see how close I am in 4 years' time!


As an aside.....
I wouldn't be surprised if we start seeing game engines outputting "mid-frames",
which are just motion data and other low-level info - perhaps some raw texture data -
and then a temporal upscaling system using this as input to generate intermediate frames.
Basically, instead of having the GPU derive all your motion vectors, optical flow, and depth data from 2 different frames,
the engine creates that data for you, giving a much better quality temporal upscale as the result.
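Purely as an illustrative sketch (not any shipping engine's API), generating an intermediate frame from engine-supplied per-pixel motion vectors could look something like this: backward-warp the previous frame half a step along each pixel's motion vector, with NumPy standing in for the GPU.

```python
import numpy as np

def generate_mid_frame(prev_frame, motion_vectors):
    """Backward-warp prev_frame half a step along per-pixel motion
    vectors (dx, dy in pixels/frame) to synthesize an intermediate frame."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Each output pixel is fetched from where it was half a frame ago.
    src_x = np.clip(np.round(xs - 0.5 * motion_vectors[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - 0.5 * motion_vectors[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]

# Toy example: a 4x4 "image" with a bright column moving right at 2 px/frame.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[:, 0] = 255          # bright column at x=0
mv = np.zeros((4, 4, 2))
mv[..., 0] = 2             # uniform +2 px/frame motion in x
mid = generate_mid_frame(frame, mv)  # bright column lands at x=1 (half-step)
```

The point is that the engine already knows `mv` exactly, whereas an optical-flow pass reconstructing it from two rendered frames is an estimate; feeding the exact data to the upscaler avoids that whole error source.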
 
A 2027 PS6 only has 50 percent more gpu power than a GPU that will be out within a month? What kind of stagnation do you think GPUs will have for 5 years 😂

Also, I kind of expect much higher bandwidth for RAM. PS4 to PS5 was over double, and you can get 1 TB/s in certain cards already.
 
A 2027 PS6 only has 50 percent more gpu power than a GPU that will be out within a month? What kind of stagnation do you think GPUs will have for 5 years
Power consumption. It's a hard wall. Limited to under 225 watts for the entire system (CPU, RAM, GPU, NVMe, USB, Wi-Fi, NICs) versus 450 watts for just the GPU.
 
Power consumption. It's a hard wall. Limited to under 225 watts for the entire system (CPU, RAM, GPU, NVMe, USB, Wi-Fi, NICs) versus 450 watts for just the GPU.
It's 5 years. Are efficiency gains really so dead gen over gen that the multiple generations between now and then will be so limiting?

And graphene seems to be just a dream so far
 
A 2027 PS6 only has 50 percent more gpu power than a GPU that will be out within a month? What kind of stagnation do you think GPUs will have for 5 years 😂

Also, I kind of expect much higher bandwidth for RAM. PS4 to PS5 was over double, and you can get 1 TB/s in certain cards already.
Only 2 more node shrinks to 3nm and 1nm. After that we have no more performance increases without big power jumps. 3nm will bring 50-70% over RDNA 3, and 1nm a further 50-70% over RDNA 4. We end up with almost 3x the performance of RDNA 3 in the best-case scenario of performance scaling. 50% faster than RDNA 3 puts the PS6 at roughly half the power of the fastest thing money can buy on PC. Makes perfect sense considering consoles can't supply 300+ watts to the GPU alone.
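For what it's worth, the "almost 3x" figure follows from compounding the two per-shrink uplifts (the 50-70% numbers are the poster's guesses, not official figures):

```python
# Compounding two node shrinks at +50% and +70% each.
low_case = 1.5 * 1.5    # pessimistic: 2.25x over RDNA 3
high_case = 1.7 * 1.7   # best case: ~2.89x over RDNA 3, i.e. "almost 3x"
print(f"{low_case:.2f}x to {high_case:.2f}x over RDNA 3")
```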
 
Only 2 more node shrinks to 3nm and 1nm. After that we have no more performance increases without big power jumps. 3nm will bring 50-70% over RDNA 3, and 1nm a further 50-70% over RDNA 4. We end up with almost 3x the performance of RDNA 3 in the best-case scenario of performance scaling. 50% faster than RDNA 3 puts the PS6 at roughly half the power of the fastest thing money can buy on PC. Makes perfect sense considering consoles can't supply 300+ watts to the GPU alone.
I wonder what comes after 1nm then. Does technology across the board just stagnate? Surely people have been thinking about this inevitability for many years.

Graphene chips, quantum computing all seem like concepts we will never see in our lifetimes
 
I wonder what comes after 1nm then. Does technology across the board just stagnate? Surely people have been thinking about this inevitability for many years.

Graphene chips, quantum computing all seem like concepts we will never see in our lifetimes
@Shifty Geezer's favourite discussion topic. New exotic architectures, perhaps more accelerators, ML/AI technologies. If you can't shrink further, you're going to have to work smarter.

Or.

Cloud gaming. Which moves power requirements to the cloud, and then it's just a question of whether we can solve the latency issues by 2037.
 
@Shifty Geezer's favourite discussion topic. New exotic architectures, perhaps more accelerators, ML/AI technologies. If you can't shrink further, you're going to have to work smarter.
Question: is AMD's chiplet design something that will help stave off that inevitable problem? Or is it all hot air?

I would assume chiplets would be easier to work with than trying to continue to make bigger and bigger single chips and such
 
Next generation is the last one.
That's only if people just suddenly stop buying consoles en masse and people on PC just stop buying hardware in favor of streaming. For Nintendo it certainly won't be the last one, so I doubt it will be for any of them unless something extreme happens.
 
Cloud gaming. Which moves power requirements to the cloud, and then it's just a question of whether we can solve the latency issues by 2037.
That's only if people just suddenly stop buying consoles en masse and people on PC just stop buying hardware in favor of streaming. For Nintendo it certainly won't be the last one, so I doubt it will be for any of them unless something extreme happens.
Conceptually, if hardware can't improve then the streaming services will be able to attract people with the better graphics. That could end up a driver towards centralised processing and thin clients. Put the 800W gaming hardware in the cloud, you time-share it 2 hours in an evening, other people use it throughout the day for other workloads, and the heat waste is managed responsibly for heating or energy reclamation of some sort. The limiting factor then would become infrastructure to support low latency communications.

In an ideal case, assuming human beings can get their act together!
 
Conceptually, if hardware can't improve then the streaming services will be able to attract people with the better graphics. That could end up a driver towards centralised processing and thin clients. Put the 800W gaming hardware in the cloud, you time-share it 2 hours in an evening, other people use it throughout the day for other workloads, and the heat waste is managed responsibly for heating or energy reclamation of some sort. The limiting factor then would become infrastructure to support low latency communications.

In an ideal case, assuming human beings can get their act together!
That's only if the internet doesn't improve beyond what it's been for years, which seems more likely than anything.
 
The limiting factor for the internet is largely infrastructure, which can be worked around, versus the limiting factor for local hardware improvement, which is the laws of physics.
 
I wonder what comes after 1nm then. Does technology across the board just stagnate? Surely people have been thinking about this inevitability for many years.

Graphene chips, quantum computing all seem like concepts we will never see in our lifetimes
I think stagnation is what will happen. I think lots of hardware companies will go bankrupt if they can't find other markets to transition into.
 
I wonder what comes after 1nm then. Does technology across the board just stagnate? Surely people have been thinking about this inevitability for many years.
900 pm (picometres) would be the next logical reduction after 1nm, but it likely requires a new material. Bear in mind node names are mostly marketing now; silicon's actual lattice spacing is roughly 0.5 nm, so you hit atomic limits well before the name suggests, which is why new materials like graphene (with its even tighter carbon lattice, C-C bonds around 0.14 nm) have been researched for electronics since around 2010.

However, there is a real rethink about pushing smaller, because new materials like hexagonal boron nitride (h-BN) show exceptional thermal performance, meaning clocks could rise significantly instead. Clock frequencies exceeding 10 GHz might become realistic.
 