Predict: Next gen console tech (10th generation edition) [2028+]

Mesh Nodes are the significant part of Work Graphs. The whole thing is still in beta.
Just because the Mesh Nodes extension is incomplete doesn't mean that Work Graphs by themselves are incomplete. You can already download existing samples, and the supported drivers will run them just fine ...
Hellblade 2 is not AAA, it's AA, and a corridor game at that, with limited environment/physics complexity.
You classify a game with a multi-year development cycle and OVER 400 CREDITS as just AA?!
It's not very rare; several devs are already shipping games with it.
Do you have any names besides some indie puzzle game to back up your statement?
We are not talking average here; this is a high-end forum for cutting-edge graphics and next-gen technologies, which is what we are discussing. Average is right across the street, and it's called the YouTube comment section.
I wonder how your world will come crashing down when reviewers are forced to include UE5-based killer apps w/o any HW ray tracing like STALKER 2, the next Tomb Raider/CD Projekt RED games ...
Can you really have a useful discussion without saying how they will do it? Otherwise it’s just wishful thinking.
We CAN have a useful discussion if you just stop ignoring data points that don't fit your narrative. End of story ...
Can’t wait. You seem to be pinning all your hopes on UE5 proving that RT is a waste of time. That’s a weird take but good luck with that. The irony is that Epic is continuously improving RT support and performance in their engine. I don’t see how that gels with your view that Epic is leading the charge to undermine RT adoption in the industry.

Besides, consoles aren’t targeting only UE5 games. What about all the other game engines that are adopting RT? Will they also follow AMD into this brave new world? Would be nice to see a demo or two before getting too excited about that.
@Bold I heard that they've temporarily halted the development of their new feature for hardware Lumen ...

It's like Andrew stated: if deformable geometry doesn't play well with Nanite, then it is even more so the case for RT. AMD are looking to completely cut the cord for RT in UE by looking to crowd it out with other graphical features in development like skinned virtual geometry and Substrate. I don't know why so many here are refusing to look at the facts when our resident rendering expert here has espoused that *certain feature combinations should not exist* lest you want to destroy your performance ... (even on the highest-end systems too)

As for the other engines, it's hard to say when many of their users keep intentionally supporting and releasing content for last generation systems ...
 
I wonder how your world will come crashing down when reviewers are forced to include UE5-based killer apps w/o any HW ray tracing like STALKER 2, the next Tomb Raider/CD Projekt RED games
It won't make a single difference. Reviewers are already including (SW Lumen) UE5 titles in their benchmarks, and the picture is still as ugly as ever for AMD, with NVIDIA scoring massive wins in all areas (performance/power/features/upscaling, etc.).

By your logic your world must be crashing down already from all path traced and ray traced Xbox/PlayStation games being tested and included in the benchmark set, not to mention the upcoming UE5 titles with path tracing releasing this month (Black Myth) and beyond.

And you simply don't know that the next Tomb Raider/Cyberpunk/Witcher UE5 game won't have path tracing at its core, especially if NVIDIA is involved.

Just because the Mesh Nodes extension is incomplete doesn't mean that Work Graphs by themselves are incomplete. You can already download existing samples, and the supported drivers will run them just fine ...
If you want Work Graphs to make a tangible difference in performance in dense geometry scenes, you have to wait for Mesh Nodes to be released, which is why the whole API is still under development and will stay so for several years.

You are all over the place with this one. You speak of a great plan that has not even left the PowerPoint slide stage, yet you speak of it as if it already happened, despite facts on the ground saying otherwise.
 
Sounds like y’all are just trying to figure out who is more right; no one is taking the time to really understand what the other is saying.

No one here can predict the future, and even if you could, it’s just pure dumb luck. We can talk about trends, however, and the trend right now supports both arguments: we are seeing a lot of growth in RT and a lot of growth in virtual geometry. One won’t stop the other from progressing, though the conflict between the two will force developers to choose one path or the other.

IHVs won’t be the ones to determine what games will be made, developers will make that choice. If they feel lighting is more important, perhaps they don’t do virtualized geometry, or vice versa.

Please stop fighting over who will be or is right. You won’t know. Spend some time trying to understand, however; there’s a lot of great information here being ignored for the sake of winning the argument.
 
Path Tracing: all the benefits of RTXDI, plus vastly improved, near-perfect indirect lighting; reflections are perfect, and caustics are added.
An important point not to overlook for UE is their non-gaming visualisation stuff. Epic have as much interest in producing production-ready visuals for TV and movies as they have in realtime scalable rendering. As such (not to take away from your overall argument), the existence of a rendering pipeline in UE may not represent anything game related or of interest to next-gen console hardware. Even if gaming ditches RT hardware completely next gen, there'll be a fast-as-possible path-tracing solution in UE for server farms to realtime render the next Star Wars something-or-other.
 
We CAN have a useful discussion if you just stop ignoring data points that don't fit your narrative. End of story ...

I’ll be happy to consider data points when they exist. Thus far you haven’t shared any data indicating RT is a dead end.

AMD are looking to completely cut the cord for RT in UE by looking to crowd it out with other graphical features in development like skinned virtual geometry and Substrate.

Again something that only exists on paper. Can we reserve the hype for when it actually materializes and shows evidence of crowding out RT in shipping games and engines?
 
Again something that only exists on paper. Can we reserve the hype for when it actually materializes and shows evidence of crowding out RT in shipping games and engines?
In the RTRT hardware discussion preceding this generation, there was a lot of discussion about software solutions and just using generic compute. There were a lot of promising technologies. They didn't really get anywhere and RTRT hardware has improved in-game RT performance better than those solutions.

Maybe things will move back towards more software-based solutions, but even if we had good examples of promising tech now, that wouldn't prove it's the future. You'd need to see games choosing and implementing software solutions.

Of notable poignancy to next-gen hardware: the IHVs have to make choices now for hardware coming in a couple of years. Without clear evidence that RTRT is a bad direction, I can't imagine any of them taking a punt on software solutions without good evidence they are the better option. So we really ought to be looking at content made now, or in the next year (depending on when 'next gen' releases), that proves the value of non-RT hardware.
 
the existence of a rendering pipeline in UE may not represent anything game related or of interest to next-gen console hardware
The levels I talked about are all (real-time) game production related stuff, already incorporated into some games. There exists a 6th level, called Path Tracer, as well, but it's professional grade and renders with sample counts that exceed the capabilities of real-time hardware, taking hours to render a single scene.
 
Path tracing doesn't inherently fix light leaks; hash-based coordinate schemes are inherently leaky, for instance.
 
If you want Work Graphs to make a tangible difference in performance in dense geometry scenes, you have to wait for Mesh Nodes to be released, which is why the whole API is still under development and will stay so for several years.

You are all over the place with this one. You speak of a great plan that has not even left the PowerPoint slide stage, yet you speak of it as if it already happened, despite facts on the ground saying otherwise.
You can already realize performance gains with Work Graphs! You can implement a barrierless persistent global work queue, as seen in Nanite's persistent hierarchical cluster culling algorithm, which is faster than the current PC implementation of issuing barriers between compute dispatches to process every DAG level. GPU-driven PSO state changes w/ Mesh Nodes for procedural rendering is purely a bonus on top of that ...
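To make the difference concrete, here is a CPU-side Python analogy of the two approaches being contrasted (the DAG, function names, and "dispatch" counting are illustrative assumptions, not actual Nanite or Work Graphs code; on a real GPU the queue pops/pushes would be atomic operations in a persistent-threads kernel):

```python
from collections import deque

# Hypothetical cluster DAG: node -> children. In Nanite-style culling,
# children are only visited after their parent has been processed.
DAG = {0: [1, 2], 1: [3, 4], 2: [5], 3: [], 4: [], 5: []}

def cull_with_barriers(dag, root=0):
    """Level-synchronous: one 'dispatch' per DAG level, with an implicit
    barrier between levels (the current PC compute-shader approach)."""
    visited, level, dispatches = [], [root], 0
    while level:
        dispatches += 1            # one dispatch + barrier per level
        next_level = []
        for node in level:
            visited.append(node)
            next_level.extend(dag[node])
        level = next_level
    return visited, dispatches

def cull_with_work_queue(dag, root=0):
    """Persistent global work queue: newly discovered work is pushed onto
    the same queue the workers drain, so no inter-level barrier is needed
    (the Work Graphs / persistent-threads style)."""
    visited, queue = [], deque([root])
    while queue:
        node = queue.popleft()     # on a GPU: an atomic pop by any worker
        visited.append(node)
        queue.extend(dag[node])    # children enqueued without a barrier
    return visited, 1              # one dispatch covers the whole DAG
```

Both versions visit the same clusters; the queue version just never has to drain the GPU between levels, which is where the win comes from on deep DAGs.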
I’ll be happy to consider data points when they exist. Thus far you haven’t shared any data indicating RT is a dead end.
Like the STRONG correlation between the absence of hardware Lumen and games featuring virtual geometry? Again, you're just going to deny making this connection ...
Again something that only exists on paper. Can we reserve the hype for when it actually materializes and shows evidence of crowding out RT in shipping games and engines?
It doesn't only exist on paper or in slide decks. There's already an implementation of these features, even if they aren't complete yet! While the code for Nanite skinned meshes came in hot recently (it builds on their displacement-mapping reimplementation), do you really think Epic Games are going to drop a publicly advertised upcoming feature that they've been working on for some years now, like Substrate, with its promise of giving artists the ability to create more expressive materials?
 
Like the STRONG correlation between the absence of hardware Lumen and games featuring virtual geometry? Again, you're just going to deny making this connection ...

Yes we know virtual geometry in UE doesn’t play well with RT. That’s not evidence of your purported industry wide move away from RT. Especially when that same engine supports RT just fine.

It doesn't only exist on paper or in slide decks. There's already an implementation of these features, even if they aren't complete yet! While the code for Nanite skinned meshes came in hot recently (it builds on their displacement-mapping reimplementation), do you really think Epic Games are going to drop a publicly advertised upcoming feature that they've been working on for some years now, like Substrate, with its promise of giving artists the ability to create more expressive materials?

Can’t wait to see it in action.
 
What are the chances of Intel or Samsung getting those console contracts?

They both have somewhat of a bad reputation in the semiconductor business, so a constant revenue stream from consoles would be beneficial to them.

TSMC doesn't need console contracts, and their prices are through the roof right now.
Meanwhile, Samsung and Intel would gladly agree to build those chips, even on very slim margins. They wouldn't be cutting edge compared to TSMC, but I'm sure both of them can get close to TSMC's 3nm chips, at least by 2028, at reasonable prices and yields.
 
One can imagine a future where these two features will intersect and play well together. It's a matter of when, not if.
When devs abandon pencil ray tracing and use (differential) cone tracing instead; otherwise they will stay stuck in hybrid rendering. You can't just let RTX-type ray traversal go on its merry way when sampling something with LoD: at each level of the BVH you need to check whether you are at the right one, which invalidates the design of the fixed-function hardware.

Some of the hardware could perhaps still be leveraged, but it's awkward.

PS. differential cone tracing can also be termed ray differentials, if you are in the ray tracing camp and a sore loser.
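The per-level check described above amounts to comparing the cone's footprint at the current distance against the geometric detail a BVH/LoD level represents. A toy Python sketch of that selection (the function names, the base feature size, and the log2 level mapping are illustrative assumptions, not any engine's actual scheme):

```python
import math

def cone_radius(t, r0, half_angle):
    """Footprint radius of a cone at distance t along its axis:
    the starting radius plus linear spread from the cone angle."""
    return r0 + t * math.tan(half_angle)

def lod_level(footprint, base_feature_size):
    """Pick the coarsest LoD whose detail still fits under the footprint.
    Each level doubles the feature size, so: level = log2(footprint / base),
    clamped so a footprint finer than the base maps to level 0."""
    return max(0, int(math.log2(max(footprint / base_feature_size, 1.0))))
```

During traversal you would evaluate `lod_level` at each BVH node's hit distance and stop descending once the node's level matches, which is exactly the data-dependent decision that fixed-function traversal units don't expose today.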
 
Intel 18A is 'on track' for the first external customers taping out in H1 2025, so that's not implausible for Xbox's next SOC(s), even if it's an AMD design.
Wouldn't that be too costly for a 2026 console-hybrid PC-whatever it is? If the target is 2x the power of current gen, Intel 18A is probably overkill.

I wouldn't exclude some TSMC 3nm equivalent for the PS6 in 2028; costs have to be contained in some way.
 
When devs abandon pencil ray tracing and use (differential) cone tracing instead; otherwise they will stay stuck in hybrid rendering. You can't just let RTX-type ray traversal go on its merry way when sampling something with LoD: at each level of the BVH you need to check whether you are at the right one, which invalidates the design of the fixed-function hardware.

Some of the hardware could perhaps still be leveraged, but it's awkward.

PS. differential cone tracing can also be termed ray differentials, if you are in the ray tracing camp and a sore loser.

The hardware can also evolve.
 