Predict: Next gen console tech (10th generation edition) [2028+]

How convenient of you to keep ignoring the only real-world data point we have at our disposal when it doesn't fit your narrative ...

It’s not convenient. It’s irrelevant as AMD and Intel aren’t in the same boat.

@Bold If you can't see that undercutting technology is a valid move, then there's nothing to be gained for you from continuing this debate. How hard is it for you to understand that AMD would *prefer* that virtual geometry/GPU-driven rendering be the dominant future, when it benefits them by making it much harder to apply HW ray tracing to content like we see in UE5?

So in your estimation AMD is going to blaze this new path and the entire industry will follow leaving Nvidia with a bunch of unused hardware in their chips? What exactly makes you so optimistic that this is a valid strategy? Everyone and their dog seems to be investing in HWRT including the console guys (and Epic too!)

Is this just wishful thinking for an AMD win or is there an actual plan here?
 
Surface-based representations like SDFs are inherently superior to volume-based representations like voxels for approximating the area of the objects ...
The voxels can contain anything you want, like SDFs or neural approximations. AMD uses AABBs on top of the SDF; it could be n-tree voxels too.
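To make the SDF part of this concrete, here is a minimal sphere tracing sketch in Python. This is purely illustrative of how a signed distance field is queried along a ray; it is not AMD's actual scheme, and `sphere_sdf`/`sphere_trace` are made-up names, not any vendor API:

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 0.0), radius=1.0):
    """Signed distance to a sphere: negative inside, positive outside."""
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-4, max_t=100.0):
    """March a ray until the SDF says the surface is within eps of the point."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        dist = sdf(p)
        if dist < eps:
            return t   # hit: surface reached within tolerance
        t += dist      # safe step: the SDF bounds the distance to any surface
        if t > max_t:
            break
    return None        # miss
```

A ray fired from (0, 0, -3) along +z hits the unit sphere at t ≈ 2, while a ray fired sideways diverges and misses. The key property being exploited is the one surface representations give you here: the field value at a point is a conservative step size, which is also why a coarse AABB or voxel grid over the SDF works as an acceleration structure.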
 
It’s not convenient. It’s irrelevant as AMD and Intel aren’t in the same boat.



So in your estimation AMD is going to blaze this new path and the entire industry will follow leaving Nvidia with a bunch of unused hardware in their chips? What exactly makes you so optimistic that this is a valid strategy? Everyone and their dog seems to be investing in HWRT including the console guys (and Epic too!)

Is this just wishful thinking for an AMD win or is there an actual plan here?
I think after all these messages the intention has been made pretty clear, but, *if I speak I'm in big trouble meme*
 
That driver doesn't support Work Graphs Mesh Nodes, which is the significant new feature of Work Graphs; ergo it's still beta.
Just because it doesn't support the mesh nodes extension yet doesn't mean you absolutely CAN'T use Work Graphs. You can very much use Work Graphs today if your driver returns success for the ID3D12GraphicsCommandList10::DispatchGraph API call!
None of these is a showcase for the engine: one is made by an AA studio with an AA production budget, and the other is a multiplayer game. You should have mentioned Black Myth: Wukong instead, which supports Path Tracing on UE5. But that's what you always do: pick the wrong data point and build a wrong theory around it.
If you want a single player AAA example, there's Hellblade II. But just because Marvel Rivals (a current gen exclusive) is a multiplayer game doesn't mean it's not a better benchmark than something like The First Descendant (a cross gen game with shoddy assets), and Black Myth: Wukong uses the RTX branch of UE5, which is very rarely featured in actual shipping titles ...

The samples that you bring up are far less representative of the average graphically high-end UE5 game ...
It’s not convenient. It’s irrelevant as AMD and Intel aren’t in the same boat.
@Bold They very well might soon be, with Intel's fabs being in turmoil ...

Two very similar businesses (multi-domain IC designers, with one soon to be fabless) that could hardly be more comparable to each other, yet both decided to pursue different strategies in PC graphics ...
So in your estimation AMD is going to blaze this new path and the entire industry will follow leaving Nvidia with a bunch of unused hardware in their chips? What exactly makes you so optimistic that this is a valid strategy? Everyone and their dog seems to be investing in HWRT including the console guys (and Epic too!)

Is this just wishful thinking for an AMD win or is there an actual plan here?
Hey, you don't have to like their scorched earth approach to realize that it's the scenario that MOST benefits them. They don't have to spend much effort to radically redesign their architecture or blow up their die sizes when they can just as easily poison (virtual geometry) the well (RT) to make it unusable ...
 
On-the-fly feeding of compute shader diced geometry to ray tracing doesn't need to be impossible; Intel did it. Though premature choices in how to design the fixed function hardware can make it hard.
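For a sense of what "compute shader diced geometry" means, here is a minimal CPU-side sketch in Python. It is purely illustrative: on real hardware the dicing runs in a compute shader, and `dice_triangle`/`micro_tri_aabbs` are hypothetical names, not any vendor's API. A triangle is uniformly subdivided into micro-triangles, and each gets an AABB that a BVH builder could consume as a leaf:

```python
def dice_triangle(v0, v1, v2, levels=2):
    """Uniformly subdivide a triangle into 4**levels micro-triangles
    (midpoint split, positions only)."""
    def midpoint(a, b):
        return tuple((x + y) / 2 for x, y in zip(a, b))
    tris = [(v0, v1, v2)]
    for _ in range(levels):
        nxt = []
        for a, b, c in tris:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            nxt += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        tris = nxt
    return tris

def micro_tri_aabbs(tris):
    """Per micro-triangle AABBs: the leaves a BVH builder would consume."""
    aabbs = []
    for tri in tris:
        lo = tuple(min(v[i] for v in tri) for i in range(3))
        hi = tuple(max(v[i] for v in tri) for i in range(3))
        aabbs.append((lo, hi))
    return aabbs
```

The hard part the post alludes to isn't the dicing itself but getting the resulting primitives into the fixed function traversal hardware every frame without a full BVH rebuild becoming the bottleneck.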
 
Hey, you don't have to like their scorched earth approach to realize that it's the scenario that MOST benefits them. They don't have to spend much effort to radically redesign their architecture or blow up their die sizes when they can just as easily poison (virtual geometry) the well (RT) to make it unusable ...

If this is their most attractive option then they are in pretty bad shape. It’s not clear to me how this strategy moves more AMD cards off shelves. You need leadership in application support and performance for that, and in AMD's case they need a ton of it to counter the brand disadvantage. Which applications will carry the flag for AMD? Or is that all hypothetical too?
 
If this is their most attractive option then they are in pretty bad shape. It’s not clear to me how this strategy moves more AMD cards off shelves. You need leadership in application support and performance for that, and in AMD's case they need a ton of it to counter the brand disadvantage. Which applications will carry the flag for AMD? Or is that all hypothetical too?
@Bold Isn't it obvious to you by now how exactly it helps them there? By nullifying their competitor's biggest advantage, their benchmark results will improve dramatically, all while bleeding their adversary out into paying for unused hardware ...

If you don't like the way AMD is undercutting your beloved ray tracing technology, then maybe you ought to lobby your favourite IHV Nvidia to outright bribe AMD and/or Epic Games to incentivize them; otherwise they'll continue their desecration campaign against RT by rocking the Unreal Engine boat some more ...
 
@Bold Isn't it obvious to you by now how exactly it helps them there? By nullifying their competitor's biggest advantage, their benchmark results will improve dramatically, all while bleeding their adversary out into paying for unused hardware ...

I know that you want that to happen but you’re not providing any details on why you (or AMD) think it will happen.

If you don't like the way AMD is undercutting your beloved ray tracing technology, then maybe you ought to lobby your favourite IHV Nvidia to outright bribe AMD and/or Epic Games to incentivize them; otherwise they'll continue their desecration campaign against RT by rocking the Unreal Engine boat some more ...

Huh, AMD isn’t undercutting anything. They are currently fighting for relevance. It would really help if you wouldn’t discuss slide decks as if they reflected reality. What desecration campaign? By whom?
 
I know that you want that to happen but you’re not providing any details on why you (or AMD) think it will happen.
How quick of you to deflect from the question of why to how/will ...
Huh, AMD isn’t undercutting anything. They are currently fighting for relevance. It would really help if you wouldn’t discuss slide decks as if they reflected reality. What desecration campaign? By whom?
There'll be more upcoming examples for you to grasp how strongly UE5 works against RT integration, in case you refuse to realize it now ...
 
Just because it doesn't support the mesh nodes extension
Mesh Nodes is the significant part of Work Graphs. It's still beta, the whole thing is.

maybe you ought to lobby your favourite IHV Nvidia to outright bribe AMD and/or Epic Games to incentive them elsewise they'll continue their desecration campaign against RT by rocking the Unreal Engine boat some more ...
What the heck is this? Is this some kind of religious crusade? You've really pushed this into insane, out-of-whack territory.

If you want single player and AAA example, there's Hellblade II
Hellblade 2 is not AAA; it's AA, and a corridor game as well, with limited environment/physics complexity.

and Black Myth Wukong uses the RTX branch of UE5 which is very rarely featured in actual shipping titles ...
It's not very rare; several devs are already shipping games with it.

The samples that you bring up are far less representative of the average graphically high-end UE5 game ...
We are not talking average here. This is a high end forum for cutting edge graphics and next gen technologies, which is what we are discussing; average is right across the street, and it's called the YouTube comment section.
 
Ok, there is more and more talk about the new Xboxes from credible and seemingly credible sources. I think they will only talk this much if the hardware is coming soon. If this was already such a topic in 2024, even officially from MS (see the spring podcast), then new Xboxes could easily be released in 2025.

It is a handheld and a separate desktop console. Comparing all this with MS's statement that this will be the biggest technological leap in the history of consoles, it would be worth looking at what the possibilities are for a November 2025 release.

It might be a $600 desktop console, or
it might be a $1000 desktop console.

What hardware can be packed into the pricing framework of the two options in 2025? Opinions?
 
Hellblade 2 is not AAA; it's AA, and a corridor game as well, with limited environment/physics complexity.
I think calling Hellblade 2 AA is a stretch; team size, maybe, but visually it's as AAA as any of Sony's first party games.
Yay, another definitions debate. Nothing's going to drive a technical discussion on next gen gaming hardware quite like sorting out whether Hellblade II is 'AA' or 'AAA'. Five or six pages of what defines a AA or AAA game and we'll be well on our way to pinning down the bus width and the speed of the fan.
 
How quick of you to deflect from the question of why to how/will ...

Can you really have a useful discussion without saying how they will do it? Otherwise it’s just wishful thinking.

There'll be more upcoming examples for you to grasp how strongly UE5 works against RT integration, in case you refuse to realize it now ...

Can’t wait. You seem to be pinning all your hopes on UE5 proving that RT is a waste of time. That’s a weird take but good luck with that. The irony is that Epic is continuously improving RT support and performance in their engine. I don’t see how that gels with your view that Epic is leading the charge to undermine RT adoption in the industry.

Besides consoles aren’t targeting only UE5 games. What about all the other game engines that are adopting RT? Will they also follow AMD into this brave new world? Would be nice to see a demo or two before getting too excited about that.
 
Isn’t AMD adding its own fixed function RT cores in RDNA 4 or 5? Seems it’s going the same route as Nvidia and Intel.
 
Isn’t AMD adding its own fixed function RT cores in RDNA 4 or 5? Seems it’s going the same route as Nvidia and Intel.
Even just the leaked improvements in the PS5 Pro documents and the general RDNA 4 improvements for PC and PS5 Pro should shut down this discussion:

  • Double Ray Tracing Intersect Engine
  • RT Instance Node Transform
  • 64B RT Node
  • Ray Tracing Tri-Pair Optimization
  • Change flags encoded in barycentrics to simplify detection of procedural nodes
  • Improvements to the BVH (Bounding Volume Hierarchy) Footprint
  • RT support for OBB (Oriented Bounding Box) and Instance Node Intersection
But what do you know.
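For context on what the intersect engine and OBB items in that list operate on, here is the classic ray/AABB slab test in Python. This is a conceptual sketch, not RDNA microcode, and `ray_aabb` is a hypothetical name. OBB and instance node support can be understood as transforming the ray by the inverse of the node's transform and then running this same axis-aligned test in the node's local space:

```python
def ray_aabb(origin, inv_dir, lo, hi):
    """Slab test: return the ray's entry distance into the box, or None on miss.
    inv_dir is the precomputed component-wise reciprocal of the ray direction."""
    tmin, tmax = 0.0, float("inf")
    for o, inv, l, h in zip(origin, inv_dir, lo, hi):
        t0, t1 = (l - o) * inv, (h - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0          # handle rays travelling in -axis direction
        tmin, tmax = max(tmin, t0), min(tmax, t1)
        if tmax < tmin:
            return None              # slabs don't overlap: the ray misses the box
    return tmin
```

Features like "Double Ray Tracing Intersect Engine" boil down to how many of these box (and triangle) tests the hardware can evaluate per cycle during BVH traversal, which is why they matter for RT throughput.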
 
Dense geometry alone can't make a game look next gen, as is obvious from all the UE5 games released so far; their massive flaws in indirect lighting, shadowing and reflections destroy the next gen appearance (as was obvious in games such as Immortals of Aveum, Lords of the Fallen, Talos Principle 2, Layers of Fear and many others).

These games are full of light leaks, screen space reflections, screen space bounce lighting, light boiling and a lack of indirect lighting altogether. This doesn't mesh well with the expected next gen look.

The irony is that Epic is continuously improving RT support and performance in their engine
Currently UE5 offers 5 levels of ray tracing.

Software Lumen: basic RT with all the above issues.

Hardware Lumen: more accurate non screen space reflections (though aliased and poorly shaded, with less detailed materials and shadows). Emissive lights become non screen space (with far better light coverage).

Hardware Lumen with Hit Lighting: more precise, non aliased reflections that are properly shaded and use proper materials and shadows.

RTXDI: direct lights and emissives are done with ray tracing for more accurate and stable light coverage, every light source also casts shadows.

Path Tracing: all the benefits of RTXDI, plus vastly improved indirect lighting that is near perfect; reflections are perfect and caustics are added.

Besides consoles aren’t targeting only UE5 games
Even within UE5, games vary their approach. Many UE5 games don't use the full set: some don't use Nanite, some don't use Lumen or Virtual Shadow Maps, and some don't use any of the new features at all. There is great inconsistency among UE5 titles; the spectrum is large and no one can target a specific variable.

Also, the best possible performance is often achieved with a combination of the three big features (Nanite, Lumen, Virtual Shadow Maps), so sometimes developers will sacrifice some or all of them to obtain a certain visuals/performance ratio.
 
Most UE5 games are the same: wastelands, deserts, forests... Every game trying to break out of these settings is failing. Robocop is a great example: broken lighting, broken reflections, missing materials, etc.
 