Predict: Next gen console tech (10th generation edition) [2028+]

At some point consoles will become unaffordable, like what happened with top smartphones. But I think there is a solution. 25 TFLOPS in 2026 sounds good for a Series S replacement at $399, but MS could also make a Series X replacement with 50 TFLOPS at $599 or even $649. That would be fine with me; I'd wait a few years for a premium console, but it should be there at launch. Otherwise MS leaves everyone with only a low-end choice. Some people want at least a mid-range option, and not all of them want to buy a gaming PC, which will cost a lot more than even a mid-range console.
If, let's say, Sony can't make a PS6 for less than $600, then you will get a PS6s at a lower price. The PS5 already taught them that the price reductions of the past aren't possible at the same pace anymore, so lower tiers are the way to go.

Also, you are not getting 25 TF at $399 in 2026. They are struggling to get the Series X below $450, so there is no way.
 
Also, you are not getting 25 TF at $399 in 2026. They are struggling to get the Series X below $450, so there is no way.
Maybe they just want to make more money on each console they sell, like Nintendo has done for years starting with the DS and Wii. I remember when, at the video game store where I work, the Wii was priced a little higher than the cheap Xbox 360 SKU. :)
BTW, we've had a video game museum in our store for some years now, so if anyone is ever in Riga, Latvia, please contact me. :)
 
I got really confused here. So that would be 30-32 and 50-60 in terms of PS5 and XSX or what? :)
I wrote in terms of PS5 and XSX. So 5pro is about 15-16 TF when compared to XSX and PS5. However, 5Pro will be advertised as a 30TF+ system. And next gen will be advertised as a 50TF+ system.

As stated by Sony, 5Pro is only 45% faster than PS5, so this makes sense. If PS5 is about 10TF on average, then 14-15TF performance would be 45% more.

But I want to provide a whole perspective here: useful reading

If for some reason developers can actually optimize for dual issue fp32, then the performance profile will rise closer to the theoretical 30TF. The thing is, I don’t know if that’s something that can be done. We haven’t seen (or read about theoretical ways to drive any major extraction of) it yet, best to ask Andrew since UE5 continues their transition into a massive compute pipeline with Nanite and Lumen.

Having said that, let's assume the best case scenario that it can be optimized for; I suspect it will take until next generation for that to really get started, and possibly another few years into next gen for it to become mainstream.
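For reference, a rough back-of-the-envelope of where those figures come from, assuming the commonly quoted PS5 configuration (36 CUs at ~2.23 GHz) and treating the advertised number as simply the dual-issue paper doubling of the effective one (the real Pro configuration differs; this is only to illustrate the reasoning above):

```python
# Back-of-the-envelope TFLOPS arithmetic (illustrative, not official specs).
def fp32_tflops(cus, lanes_per_cu, ops_per_lane_per_clock, clock_ghz):
    """Theoretical FP32 throughput: CUs * lanes * ops per lane per clock * clock (GHz) / 1000."""
    return cus * lanes_per_cu * ops_per_lane_per_clock * clock_ghz / 1000.0

ps5_base = fp32_tflops(36, 64, 2, 2.23)   # ~10.3 TF, the commonly quoted PS5 figure
pro_effective = ps5_base * 1.45           # Sony's "45% faster" claim -> ~14.9 TF
pro_advertised = pro_effective * 2        # dual-issue doubles the paper number -> ~30 TF

print(f"PS5 base:              {ps5_base:.1f} TF")
print(f"5Pro effective (+45%): {pro_effective:.1f} TF")
print(f"5Pro advertised (dual-issue paper figure): {pro_advertised:.1f} TF")
```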
 
Protected? What is this, some kind of good vs evil fairy tale? AMD's graphics business would collapse with no console funding; they won't even survive given their lack of software features or hardware performance. They won't carve anything out. If they go the exotic route (away from DirectX and API standards), they would die in a heartbeat, as they don't have the market share or the mindshare for anyone to care.
What exactly do you think PC graphics is worth to AMD anyway, when their biggest remaining customers are Microsoft and Sony? Eventually, Sony is going to be the only voice they can hear, and their designs will change to accommodate them exactly, while Microsoft will be content with whatever ...
Nobody cares; PC is scalable. You can run the game on low and be done with it. If you have a 3050 you can have a 120 fps experience, which you can't do on consoles. You can always buy cheap hardware that runs all the games on PC, and there are iGPUs that can run every PC game in the wild. That point is irrelevant.

And we already have technological jumps on PC: it's called path tracing now, and it's already bigger than any rasterization jump of the past decade.
If PC doesn't care about price accessibility anymore, then I'm sure you won't mind the idea of Sony attempting to follow through on their recent statement to convert PC gamers to their own console platforms, and what better way to do that than to introduce exotic hardware so that PCs are left with inefficient brute-force solutions? They're perfectly content with the idea of AAA PC gaming being gatekept behind RTX 4090s in the future if it means that they can now show a justification to their shareholders to never release their games on PC again if it'll be met with poor sales ...
PC never experienced missing graphical effects in the past, no matter what exotic hardware was out there.
Deus Ex on PS2 had more detailed polygon meshes than its PC counterpart, for starters. There's also tons of instances like older Disney based games or FIFA 21 where the PC versions were clearly based on last generation consoles. PC also never got subsurface scattering for the Tomb Raider reboot in comparison to the next gen ports. Only the GC version of Tales of Symphonia supports 60Hz. MGS2/3 has missing or reduced rain and DoF on PC in comparison to PS2. I'm not going to go any further since you can take a look at a more detailed list here ...
As for high hardware demands, this comes at the expense of consoles nuking themselves out, getting rid of backward and forward compatibility (which you seem to value so much, given your arguments for Switch 2 to ditch NVIDIA), and raising the costs of game development/porting/remastering ... while also raising hardware costs. Engine costs will also explode, since console makers will have to fork special builds of popular engines to be able to use them (which you don't like).
Never have I claimed that AMD/console vendors can't iterate an older standardized architecture into a new exotic HW design (they can keep both BC and features incompatible with PC), but for sure Nvidia has never really done this (exotic or not), and their current partner is well aware of that fact either way ...
You are basically all over the map on this one, your whole basis of arguments is anything opposite to what NVIDIA does, if some vendor goes to NVIDIA then oh it's bad, backward compatibility is lost. If some other vendor is going AMD, but NVIDIA is still sweeping the market, then ditch backward compatibility, ditch AMD, and go do something else! If new APIs are coming out with new features, then hail GPU driven rendering and hail AMD for sticking with the "good APIs", but if that is not enough and NVIDIA is acing the game with strong ray tracing and machine learning features, then ditch the APIs, ditch everything and go exotic! If NVIDIA is making a special UE5 build with the upgraded features then it's bad and no one will use it, but if we go exotic and fork out specific engine builds then that's fine and dandy, as long as they are not NVIDIA builds! Be consistent please.
And console vendors are supposed to believe that PC apologists somehow represent their best interests, when the likes of you keep championing the very same HW vendor who keeps losing out on these very contracts?
GPU driven rendering is nice but is currently vaporware. I’m sure we will see multiple RT improvements from all vendors before GPU driven rendering takes off. Not sure why you’re framing them as competing technologies.
@Bold Explain how a yet-to-be-standardized theoretical/hypothetical feature will beat an already API-exposed feature to the punch in terms of app integration?
Custom console hardware would be a lot more exciting than the current “mini PC” configs. I’m not sure AMD is the one to deliver that though. How will they pay for it?
I guarantee you that this 'custom hardware' would be easier/cheaper to implement, within whatever constraints their die area budget entails, on top of current console architectures than making HW RT as 'robust' as an RTX 4090 (well on its way to the reticle limit) ...
Texture caches are tiny, especially on some architectures, so they are not a panacea. They won't help much if you need to trace through all screen pixels. Register spilling with RT is not a problem when the software is properly designed, except maybe for some architectures that do traversal in software and require more registers for the additional traversal kernels.
Spilling variables/arguments from registers just means you don't care about performance anymore. Shading languages were designed to EXACTLY prevent that case as much as possible from happening, and there are many GPU algorithms out there that are fundamentally memory bandwidth limited ...
BVH is comparable in cost to a depth prepass, G-buffer fill, shadow fill, or other raster passes, and it may cost even less. Many would rather get rid of some raster passes rather than BVH. And you're suggesting adding even more raster passes for planar reflections on steroids, etc., which is ridiculous when you mention the BVH cost. At least BVH is unified, and you can replace a lot of raster passes with it, preferably all of them in the near future, I hope.

Unless console holders have gone crazy, I have a hard time imagining them wanting to make games even more technically complex and difficult to develop with even more different raster subsystems to keep in sync with each other. Sony sleeps and dreams of developing games in half the time and cost, not the other way around. So, all those fantasies about creating custom raster hardware for all cases in life (reflections, shadows, and whatever else) are unrealistic not just from a technical standpoint but also have zero practical viability.
Tracing against a static pre-baked BVH is comparable to those passes. Not so much with geometrically full-featured, dynamic, runtime-generated BVHs ...

Console vendors also dream of developing on RTX 4090s, but there's no market for $1000+ consoles, so paying a few graphics programmers to craft more complex solutions is more sustainable than either having a dead platform or racking up billions in hardware losses ...
There have always been restrictions and rules for rendering correctness. What do they have to do with pipeline states and their hardware costs?
Restrictions = less programmability/more fixed function HW states

Restrictive interfaces in gfx APIs often correlate with HW designs having these "more optimized" states, with fixed-function units enabled to speed up runtime execution. Nvidia is a well-documented poster child for implementing HW for just about everything ...
Well, maybe because this is how it fits best for the SER realisation. You're really just guessing and grasping at straws here. It says nothing about the hardware states or anything else.
SER exists because their HW can't get full/optimum speed with either callable shaders or real function calls which implies that there exists some special HW state to enable a faster spilling mechanism on ray generation shaders than those other methods ...
These mostly come from past-gen consoles, where PC has always lagged in this regard due to its more powerful hardware (CPU/GPU), standardization hell, and fewer requirements for console-style optimizations, which typically come at the expense of GPU performance and stability on PC. It's no secret that RT was late to the party on current gen consoles, so consoles didn't push the development of RT much, for many reasons; their RT HW is too weak to change anything on PC. Once more performant RT solutions are available on consoles, which are expected by the end of the year, people will start to care more about supporting new advancements on PC as well.
Using last generation consoles as an example is just a scapegoat when nearly all (?) of the graphically high-end AAA UE5 games are only released with software lumen ...

RT integration in current games only went as far as it did because many games, or their technology, were tied to last generation consoles. RT wouldn't be proliferating anywhere near as much as it does now if so many games weren't still being released on last generation consoles, or based on their technology, which often has lower scene complexity. There wouldn't be a lot of RT modes featured in games if they were built to maximize visual fidelity out of current consoles. "More performant" RT implementations aren't going to do anything for consoles in the future when they can't use much of the said overly contrived HW features ...
 
@Lurkmass: just curious about your thoughts on what would be considered "exotic" hardware as we move into the future? I see us moving toward more accelerators, but with the technology still grounded in what we have in modern GPUs today.

Is there really a new architecture that is exotic enough to give more compute with less power or silicon? Theoretically there should be; in practice, the cost to execute may be too high to visit these types of ideas.

With so much money being poured into AI research, it's hard to think anything else will come in the next decade other than more innovation in the AI space.
 
Maybe they just want to make more money on each console they sell, like Nintendo has done for years starting with the DS and Wii. I remember when, at the video game store where I work, the Wii was priced a little higher than the cheap Xbox 360 SKU. :)
Unless they were lying to a court, they said that they were losing $100 to $200 on every Xbox sold, which is insane to me, but anyway... Sony even said that costs had gone up compared to before, which had probably never happened in console history until now. Unless things change, we aren't getting big perf/$ advancements in the near future.
 
@Lurkmass: just curious about your thoughts on what would be considered "exotic" hardware as we move into the future? I see us moving toward more accelerators, but with the technology still grounded in what we have in modern GPUs today.

Is there really a new architecture that is exotic enough to give more compute with less power or silicon? Theoretically there should be; in practice, the cost to execute may be too high to visit these types of ideas.

With so much money being poured into AI research, it's hard to think anything else will come in the next decade other than more innovation in the AI space.
His post that kind of started most of this conversation is here, answering basically the same question you are asking:

 
Unless they were lying to a court, they said that they were losing $100 to $200 on every Xbox sold, which is insane to me, but anyway... Sony even said that costs had gone up compared to before, which had probably never happened in console history until now. Unless things change, we aren't getting big perf/$ advancements in the near future.
I don't think they were 'lying' as much as using creative accounting to make the claim. Kind of like how they say that Gamepass is profitable.
 
I don't think they were 'lying' as much as using creative accounting to make the claim. Kind of like how they say that Gamepass is profitable.
I would not be surprised that GP is profitable.

30M subscribers at an ASP of $8 per month is still 240M per month or 2.88B per year. I have massive doubts they spend more than 2.88B per year on gamepass.

Spider-Man 2 with 10M units sold with an ASP of $60 is only 600M. The margins on SM were very low but still profitable.
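Laying the same arithmetic out in one place (the subscriber count and ASPs are the rough figures above, not reported numbers):

```python
# Rough revenue comparison using the figures quoted above (illustrative, not reported numbers).
gp_subs = 30_000_000
gp_asp_per_month = 8                                  # assumed blended $/month across tiers and promos
gp_annual_revenue = gp_subs * gp_asp_per_month * 12   # = $2.88B per year

sm2_units = 10_000_000
sm2_asp = 60
sm2_revenue = sm2_units * sm2_asp                     # = $600M lifetime

print(f"Game Pass, annual:      ${gp_annual_revenue / 1e9:.2f}B")
print(f"Spider-Man 2, lifetime: ${sm2_revenue / 1e6:.0f}M")
```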
 
Spilling variables/arguments from registers just means you don't care about performance anymore.
Where did you see it in RT games? There might be a game or two with path tracing where long compute shaders with inlined RT may cause spilling on AMD, as they have to keep the traversal state, variables, and constants in registers alongside the inlined uber shader's stuff. However, there are no multiplatform RT games or other RT games with the same issue.

If you try hard enough, I have no doubt that one can achieve spilling even with rasterization. The issue with this theory is that the programmer should have the opposite goal.

Shading languages were designed to EXACTLY prevent that case as much as possible from happening
How is inlining supposed to prevent spilling from happening? Does inlining multiple shaders into one uber-shader mean you need fewer registers?
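For what it's worth, the reason spilling and inlining get argued about at all is occupancy: the more live registers a thread needs, the fewer waves stay resident, and past the hardware limit the compiler spills to memory. A toy model with made-up register-file numbers (not any specific GPU's):

```python
# Toy occupancy model: more live registers per thread -> fewer resident waves,
# and past the hardware limit the compiler has to spill to scratch memory.
# The numbers below are illustrative, not any specific GPU's.
REGISTER_FILE_PER_SIMD = 1024   # registers available per lane (assumed)
MAX_REGS_PER_THREAD = 256       # compiler/ISA limit before spilling (assumed)
MAX_WAVES_PER_SIMD = 16         # scheduler limit (assumed)

def resident_waves(regs_per_thread):
    if regs_per_thread > MAX_REGS_PER_THREAD:
        # Beyond the limit, the extra state spills to memory,
        # trading register pressure for bandwidth and latency.
        regs_per_thread = MAX_REGS_PER_THREAD
    return min(MAX_WAVES_PER_SIMD, REGISTER_FILE_PER_SIMD // regs_per_thread)

for regs in (64, 128, 200, 256, 320):
    spills = " (spilling)" if regs > MAX_REGS_PER_THREAD else ""
    print(f"{regs:>3} regs/thread -> {resident_waves(regs)} waves resident{spills}")
```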

Tracing against a static pre-baked BVH is comparable to those passes. Not so much with geometrically full-featured, dynamic, runtime-generated BVHs ...
Alan Wake 2 is full of animated foliage, and I don't see much of a problem with RT there. Besides, if console makers want to innovate and differentiate, here is a good chance for them to get ahead of PC by supporting hardware for BVH building. It had been done before by Imagination, so it wouldn't be completely new. And don't pretend you need a 4090 for BVH in games with modern geometry complexity, because you don't. Neither do you need it for consoles, as dedicated hardware can always help to catch up.
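On the dynamic-BVH cost point, one reason animated geometry like foliage isn't automatically a dealbreaker is that engines can refit an existing BVH (recompute bounds bottom-up, keep the topology) instead of rebuilding it every frame; that's much cheaper, at the price of slowly degrading tree quality. A toy sketch of such a refit pass (illustrative, not any particular runtime's implementation):

```python
# Toy bottom-up BVH refit (illustrative): keep the tree topology from the original
# build and only recompute bounding boxes after vertices animate.
from dataclasses import dataclass, field

@dataclass
class Node:
    bounds: tuple = ((0.0,) * 3, (0.0,) * 3)        # (min_xyz, max_xyz)
    children: list = field(default_factory=list)
    primitives: list = field(default_factory=list)  # leaf: list of primitive AABBs

def union(a, b):
    """Union of two AABBs: component-wise min of mins, max of maxs."""
    return (tuple(map(min, a[0], b[0])), tuple(map(max, a[1], b[1])))

def refit(node):
    if node.primitives:                              # leaf: bound the (moved) primitives
        boxes = node.primitives
    else:                                            # inner node: bound the refitted children
        boxes = [refit(c) for c in node.children]
    b = boxes[0]
    for box in boxes[1:]:
        b = union(b, box)
    node.bounds = b
    return b

# Two leaves whose primitives have just been animated; refit the parent in place.
leaf_a = Node(primitives=[((0, 0, 0), (1, 1, 1))])
leaf_b = Node(primitives=[((2, 0, 0), (3, 2, 1))])
root = Node(children=[leaf_a, leaf_b])
print(refit(root))   # ((0, 0, 0), (3, 2, 1))
```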

SER exists because their HW can't get full/optimum speed with either callable shaders or real function calls
What do function calls have to do with SER? SER is there to improve coherence by reshuffling threads so that threads with the same shaders are executed together. Function calls have been supported for years in CUDA without any shader tables, which is certainly not a limitation of NVIDIA's hardware.
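A toy CPU-side illustration of the coherence problem SER targets: threads in a wave that are about to run different hit/material shaders pay for every divergent path, and regrouping them by shader before packing waves reduces that. This is only a conceptual sketch, not the actual NVAPI/DXR interface:

```python
# Toy illustration of the coherence problem SER addresses (not the real API).
import random

random.seed(0)
WAVE_SIZE = 32
rays = [{"id": i, "material": random.choice(["metal", "skin", "glass", "foliage"])}
        for i in range(128)]

def divergent_paths(batch):
    """Number of distinct shader paths a wave of this batch would have to execute."""
    return len({r["material"] for r in batch})

def waves(ray_list):
    return [ray_list[i:i + WAVE_SIZE] for i in range(0, len(ray_list), WAVE_SIZE)]

unsorted_cost = sum(divergent_paths(w) for w in waves(rays))
# "Reordering": group rays by the shader they are about to invoke before packing waves.
reordered = sorted(rays, key=lambda r: r["material"])
reordered_cost = sum(divergent_paths(w) for w in waves(reordered))

print(f"shader paths executed, unsorted:  {unsorted_cost}")
print(f"shader paths executed, reordered: {reordered_cost}")
```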

of the graphically high-end AAA UE5 games are only released with software lumen ...
SW Lumen is still RT. It's slower, supports only static geometry, and uses multiple lower resolution, ugly SDFs (because you can't approximate thin objects with it, so games using it are pretty much doomed to have light leaking) without any parametrization. This results in the ad-hoc requirement of even uglier constructs such as the shading cards, which introduce a ton of other problems. Software Lumen is just a poor man's RT because current gen consoles are bad at RT, nothing more.
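To make the "SW Lumen is still RT" point concrete, here is a minimal sphere-tracing loop against an analytic SDF. Lumen's software path traces mesh and global distance fields in a similar spirit, which is exactly where the thin-geometry and leaking limitations come from. This is hypothetical toy code, not Lumen's actual implementation:

```python
# Minimal sphere tracing against a signed distance field (illustrative).
# Anything thinner than the SDF's effective resolution vanishes, which is the source of leaking.
import math

def sdf_sphere(p, center=(0.0, 0.0, 5.0), radius=1.0):
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < hit_eps:
            return t            # hit: marched within the surface tolerance
        t += d                  # safe step: nearest surface is at least d away
        if t > max_dist:
            break
    return None                 # miss

hit_t = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sdf_sphere)
print(f"hit at t = {hit_t:.3f}" if hit_t else "miss")
```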

RT integration in current games only went as far as it did because many games, or their technology, were tied to last generation consoles. RT wouldn't be proliferating anywhere near as much as it does now if so many games weren't still being released on last generation consoles, or based on their technology, which often has lower scene complexity.
There are plenty of current gen examples of complex games with RT, even on consoles. Your claims don't hold true even in the case of consoles, let alone for PC.
 
I would not be surprised that GP is profitable.

30M subscribers at an ASP of $8 per month is still 240M per month or 2.88B per year. I have massive doubts they spend more than 2.88B per year on gamepass.

Spider-Man 2 with 10M units sold with an ASP of $60 is only 600M. The margins on SM were very low but still profitable.
In a normal situation, you can directly compare the cost of development/marketing versus sales revenue. You cannot do this with Gamepass. What they spend specifically on Gamepass isn't the point, because they are still spending all the money on game development too, which doesn't get included in any cost analysis. That makes Gamepass look profitable on the surface, but that's quite convenient when you don't have to include literally your biggest costs, right?
 
I would not be surprised that GP is profitable.

30M subscribers at an ASP of $8 per month is still 240M per month or 2.88B per year. I have massive doubts they spend more than 2.88B per year on gamepass.

Spider-Man 2 with 10M units sold with an ASP of $60 is only 600M. The margins on SM were very low but still profitable.
Gamepass is profitable; it's just that the conventional model would probably make them more money. Conditioning your userbase to wait for games on a subscription has obvious negative consequences, and it also means users aren't building a digital library on the ecosystem, which makes switching to other ecosystems easier. The calculation to understand the economics of Game Pass is probably kind of hard o_O

Welp, as I was typing my message @Shifty Geezer 😅
 
it means that they can now show a justification to their shareholders to never release their games on PC again if it'll be met with poor sales ...
Sony is the one who needed PC, not the other way around. Most Sony ports didn't smash sales records or anything even close to that. They recoup some of their investment by releasing games on PC and other platforms. The whole industry is moving in that direction, including Microsoft and Sony; pure console exclusive titles are dead, and you can't design an entire hardware philosophy around a dead trend. It's over.

attempting to follow through on their recent statement to convert PC gamers to their own console platforms
They can certainly try, but they will fail. PC gamers are patient folks; they don't buy games on the Epic Games Store and instead wait for them to be released on Steam, and they simply don't care whether Sony releases their titles on PC or not. They didn't care during the PS3 era, nor the PS4 era, and they won't care now.

Never have I claimed that AMD/console vendors can't iterate an older standardized architecture into a new exotic HW design
You claimed multiple times that consoles moving away from AMD would result in the loss of backward compatibility, which is a no-go for console makers. But now you are advocating for it. You also didn't answer the other points about ditching standard APIs and common engine builds after advocating for them so hard.

And console vendors are supposed to believe that PC apologists somehow represent their best interests, when the likes of you keep championing the very same HW vendor who keeps losing out on these very contracts?
No, PC apologists care only about progress: making games visually impressive with great graphics and performance. You, on the other hand, only care about the politics behind graphics, even if it means stifling progress, game development, costs, APIs and everything else. You seem willing to accept last gen graphics and technology for the sake of some politics that benefit no gamer, developer or user, and now you have gone into overdrive mode, wishing for exotic hardware that benefits literally no one! All while ignoring the reality of the situation, where console vendors are doubling down on ray tracing and machine learning for their next console updates.

There's also tons of instances like older Disney based games or FIFA 21 where the PC versions were clearly based on last generation consoles
The choice to release games with last gen graphics is an economic choice (to reduce costs and cater to wider hardware); it happens with FIFA in pretty much every generation, even during the PS4 and PS5 eras (with their PC-like architectures). This has nothing to do with exotic hardware.
Deus Ex on PS2 had more detailed polygon meshes than its PC counterpart, for starters
Hardly anything significant; stuff like that happens with other console ports as well. Besides, Deus Ex was released for PS2 two years after the PC version, and the developer took the chance to upgrade the graphics when doing the new port. This has nothing to do with the exotic hardware of the PS2.
MGS2/3 has missing or reduced rain and DoF on PC in comparison to PS2
MGS3 never had a PC version until recently, with the Metal Gear Solid: Master Collection, which was released for all consoles. So if those effects are bugged, they are bugged on all platforms.
PC also never got subsurface scattering for the Tomb Raider reboot in comparison to the next gen ports
PC got them in the GamePass version. And you are mixing things again, the Tomb Raider reboot version is not based on exotic hardware.
 
Explain how a yet-to-be-standardized theoretical/hypothetical feature will beat an already API-exposed feature to the punch in terms of app integration?

I was referring to hardware improvements for existing RT APIs and software. See RDNA 4 rumors. Intel is clearly beefing up their RT hardware for Battlemage and Nvidia is likely to do the same. PS5 Pro is supposedly heading in the same direction. Yet we are to believe that RT is dead in next gen consoles?

I guarantee you that this 'custom hardware' would be easier/cheaper to implement, within whatever constraints their die area budget entails, on top of current console architectures than making HW RT as 'robust' as an RTX 4090 (well on its way to the reticle limit) ...

As much fun as it would be to see the consoles try something different, no one can guarantee anything about hypothetical exotic hardware. It's all fairy-tale stuff at this point. Is there even any research in that space that you can reference to support your position?
 
Sony have had a few interesting patents in recent years, none of which have materialised in hardware. There's really no point speculating that that's something they'll try IMO.

Lurkmass is predicting something disruptive. We can log that and move on with some actual hardware debates and come back when there's evidence of this novel hardware.
 
I think the next console gen will have at least 32 GB of unified memory. Considering Nvidia still hasn't changed their stance on VRAM with the RTX 50 series and still skimps on it, that's very concerning for the PC platform.
 
Why would the next consoles have 32 GB of RAM if PC gets away without it? That's a lot of additional cost. If there's a RAM bump, I'd guess only to 24 GB. I think it's more important to spend on BW, so I'd like to see stacked RAM. I would definitely take a 16 GB console with HBM over a 32 GB console.
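Rough bandwidth arithmetic for why stacked memory is tempting (illustrative configurations, not leaked specs):

```python
# Peak bandwidth = (bus width in bits / 8) * data rate in GT/s. Illustrative configurations.
def bandwidth_gbs(bus_bits, gtps):
    return bus_bits / 8 * gtps

configs = {
    "256-bit GDDR6 @ 14 GT/s (PS5-class)": bandwidth_gbs(256, 14),    # ~448 GB/s
    "256-bit GDDR7 @ 28 GT/s":             bandwidth_gbs(256, 28),    # ~896 GB/s
    "2 stacks HBM3 (2048-bit) @ 6.4 GT/s": bandwidth_gbs(2048, 6.4),  # ~1638 GB/s
}
for name, bw in configs.items():
    print(f"{name}: {bw:.0f} GB/s")
```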

@Dictator : Just had an idea for another DF investigation. How about gimping a PC in various ways and seeing what has the biggest impact on games? So take a monster PC and then put in a poky CPU. Then revert to good CPU and try limited RAM. Then weaker GPU, then slow storage. Benchmark a bunch of games and see what the impacts of each bottleneck are and predict the importance of each part on next-gen consoles?
 