Predict: Next gen console tech (10th generation edition) [2028+]

Do you care what I believe? The circumstances (slow transistor tech advances/rising inflation) make it hard for them to be symmetrically competitive in any near-future scenario ...

AMD is subject to the same wafer economics as everyone else. You're suggesting that their strategy is to intentionally swim against the tide and bet the farm on half-cocked tech that only exists in slide decks. It makes zero sense. How is that a winning strategy?
 
To get people to buy the next generation of hardware, they have to sell people on what the next generation of games brings. And this is a tough value proposition, as the aging group of players who play Roblox are absolutely OK with playing with block graphics.
I know quite a few people who grew up playing block graphics, 8 colours max, maybe 320x200 resolution, 20 fps, who now want 120 fps raytraced photorealism at 4K or reconstructed to that quality.

I don't think the tastes of 7-11 year olds will define them for the entirety of their lives. ;)
 
AMD is subject to the same wafer economics as everyone else. You're suggesting that their strategy is to intentionally swim against the tide and bet the farm on half-cocked tech that only exists in slide decks. It makes zero sense. How is that a winning strategy?
I must be forgetting something, but what has embracing these technologies done for Intel's fortunes, and what do they have to show for it again? What makes you think that AMD will somehow succeed where Intel has failed with their attempts to follow Nvidia in the very same direction?

It's precisely because the dire supply economics are similar for every player, and because they now have strong evidence that symmetric competition (Intel) isn't working, that AMD are all the more motivated to find asymmetric means, using what they already have on hand (virtual geometry/mesh shading/work graphs are all very much real and beyond just slide decks), to counter the further spread of RT ...

Does AMD have any reason left to pursue HW-accelerated SIMD matrix operations now that Microsoft's official stance is to NOT expose an explicit gfx API for them?
 
I must be forgetting something, but what has embracing these technologies done for Intel's fortunes, and what do they have to show for it again? What makes you think that AMD will somehow succeed where Intel has failed with their attempts to follow Nvidia in the very same direction?

It's precisely because the dire supply economics are similar for every player, and because they now have strong evidence that symmetric competition (Intel) isn't working, that AMD are all the more motivated to find asymmetric means, using what they already have on hand (virtual geometry/mesh shading/work graphs are all very much real and beyond just slide decks), to counter the further spread of RT ...

Does AMD have any reason left to pursue HW-accelerated SIMD matrix operations now that Microsoft's official stance is to NOT expose an explicit gfx API for them?
Intel is at its first gen in a very competitive market. And what do they have to show? Much better upscaling technology and better ray tracing performance compared to AMD.
They are better positioned technologically than AMD on their first try.

I don't understand why it's so absurd for you to think that maybe they will sacrifice 10-15% of the gen-on-gen raster performance upgrade for AI and ray tracing. Instead of making a die with 36 CUs without AI and ray tracing, they'll make a 30-CU die with those features in (of course, those are made-up numbers).

Instead of 800 frames in CS:GO, they'll get 700, but in Cyberpunk maybe they will almost double the framerate. Today, one of them is more important than the other.
 
I know quite a few people who grew up playing block graphics, 8 colours max, maybe 320x200 resolution, 20 fps, who now want 120 fps raytraced photorealism at 4K or reconstructed to that quality.

I don't think the tastes of 7-11 year olds will define them for the entirety of their lives. ;)
lol. Well met.

But imo there's a big difference between us and them. They grew up with fancy graphics and sound, and they choose to play Minecraft and Roblox. By the numbers, the platform of choice is mobile.

Better graphics were the reason we kept pushing new generations of hardware and games, and that just no longer seems as important for the ultra young next generation of gamers.
 
Intel is at its first gen in a very competitive market. And what do they have to show? Much better upscaling technology and better ray tracing performance compared to AMD.
They are better positioned technologically than AMD on their first try.
The most important question is going unaddressed: is anyone buying their discrete graphics products yet? At least AMD has Sony and Microsoft as corporate customers ... (they've even gone as far as modern mobile graphics hardware)
I don't understand why it's so absurd for you to think that maybe they will sacrifice 10-15% of the gen-on-gen raster performance upgrade for AI and ray tracing. Instead of making a die with 36 CUs without AI and ray tracing, they'll make a 30-CU die with those features in (of course, those are made-up numbers).
Considering how most developers only ever test their RT implementations against Nvidia, how do you figure that they won't ship suboptimal code on AMD like they already do for Intel? Making up random numbers doesn't help your case when the vendors have very different architectures with varying performance scaling characteristics. For Nvidia, it might only involve giving up a small amount of die area to reach relatively high RT perf, but when you take a look at Intel, their highest-end dGPU die is twice the size of AMD's N33, yet they perform very similarly across different benchmarks on average ...
Instead of 800 frames in CS:GO, they'll get 700, but in Cyberpunk maybe they will almost double the framerate. Today, one of them is more important than the other.
Well, unfortunately for you, RT implementations in games aren't the only modern benchmarks in town for high-end graphics tech. Ironic to list Cyberpunk when its own developer (CD Projekt RED) is moving to UE5 and will probably use its Nanite virtualized geometry system, which will throw a spanner in the works for RT integration ...
 
lol. Well met.

But imo there's a big difference between us and them. They grew up with fancy graphics and sound, and they choose to play Minecraft and Roblox. By the numbers, the platform of choice is mobile.
Ish. NSW continues to sell as well as any console from yesteryear. I think the huge limiting factor here is the cost of console hardware. It just hasn't price-dropped to mainstream. People who are going to sink 500+ bucks into a device are more likely to do it on their phone than on a console as a priority. Families are more likely to give a cheap mobile device to each child for $500 rather than have them fight over the TV on one $500 console.

If these consoles were $200, I think big-screen gaming would be back to being a priority. It's just better. Even MC and Roblox are better on a controller. But now the cost of better is too high for half the market's potential. This doesn't much change your argument, as the consoles aren't ever going to get cheap again, but I do think tastes and trends aren't going to be any different across people; we're simply seeing a different context affecting user choices.

I guess to summarise my argument, if a console appeared at $200 that produced 120 Hz realtime RT Roblox and MC with mega eye-candy, the kids would be all over it. But as that's not going to happen, graphical power won't swing these potential customers.
 
The most important question is going unaddressed: is anyone buying their discrete graphics products yet? At least AMD has Sony and Microsoft as corporate customers ... (they've even gone as far as modern mobile graphics hardware)

Considering how most developers only ever test their RT implementations against Nvidia, how do you figure that they won't ship suboptimal code on AMD like they already do for Intel? Making up random numbers doesn't help your case when the vendors have very different architectures with varying performance scaling characteristics. For Nvidia, it might only involve giving up a small amount of die area to reach relatively high RT perf, but when you take a look at Intel, their highest-end dGPU die is twice the size of AMD's N33, yet they perform very similarly across different benchmarks on average ...

Well, unfortunately for you, RT implementations in games aren't the only modern benchmarks in town for high-end graphics tech. Ironic to list Cyberpunk when its own developer (CD Projekt RED) is moving to UE5 and will probably use its Nanite virtualized geometry system, which will throw a spanner in the works for RT integration ...
Alright, I give up. The whole industry is going in the direction of dedicated hardware for ray tracing and AI, and AMD will too. What are you going to think when that happens, I wonder :whistle:

AMD will valiantly fight against hardware implementations while 80% of PC users, the PS5 Pro, the Switch 2, and future next-gen consoles will have this hardware. In this fan fiction, the sheep kills the lion, somehow.
 
Alright, I give up. The whole industry is going in the direction of dedicated hardware for ray tracing and AI, and AMD will too. What are you going to think when that happens, I wonder :whistle:
They can probably still count on getting console contracts even if they don't subscribe to RT/AI ...
AMD will valiantly fight against hardware implementations while 80% of PC users, the PS5 Pro, the Switch 2, and future next-gen consoles will have this hardware. In this fan fiction, the sheep kills the lion, somehow.
The last UE5 graphical showcase just shipped without hardware ray tracing, so does this 'future' that you speak of involve some fairy tale where Unreal Engine somehow doesn't exist?
 
Ish. NSW continues to sell as well as any console from yesteryear. I think the huge limiting factor here is the cost of console hardware. It just hasn't price-dropped to mainstream. People who are going to sink 500+ bucks into a device are more likely to do it on their phone than on a console as a priority. Families are more likely to give a cheap mobile device to each child for $500 rather than have them fight over the TV on one $500 console.

If these consoles were $200, I think big-screen gaming would be back to being a priority. It's just better. Even MC and Roblox are better on a controller. But now the cost of better is too high for half the market's potential. This doesn't much change your argument, as the consoles aren't ever going to get cheap again, but I do think tastes and trends aren't going to be any different across people; we're simply seeing a different context affecting user choices.

I guess to summarise my argument, if a console appeared at $200 that produced 120 Hz realtime RT Roblox and MC with mega eye-candy, the kids would be all over it. But as that's not going to happen, graphical power won't swing these potential customers.
Following this idea, the Series S should have sold more than the PS5... $300 is quite close to $200, and the price of the S was often discounted. Do we see breakout numbers? No. Cheapness alone is not enough.

The S was just enough to help the X in the total number of Xboxes sold. In this generation, MS could have achieved greater success only with many unique, let's say exclusive, games. 2-3, or even 5-6 per year, is not enough, NOT enough. At the time of the X360, there were about 10-15 exclusive titles every year! I've been saying this for years: the secret to success would have been many good exclusive games. But it turned out differently; now they have to rebuild the Xbox brand.
 
It's NOT in any pre-release stage. You just refuse to see that it's already in a release state ... (ever since March this year)
It's still in a preview phase, with experimental drivers from AMD and dev-only drivers from NVIDIA and Intel, so no ... it's still in beta, and has been in beta for 3 years; who knows how many more years before it's out of beta?

Consoles are always there for them. Working strategy for PC =/= Working strategy for console
If they fall too far behind, consoles will abandon them.

Alright, I give up. The whole industry is going in the direction of dedicated hardware for ray tracing and AI, and AMD will too. What are you going to think when that happens, I wonder :whistle:
We have the most anti-progress, anti-industry sentiment here from Lurkmass, all because AMD isn't currently on the progress side. So save your breath; he will continue fighting against AI/RT "even though the PS5 Pro has them" until he hears from AMD themselves that they are on that side.
The last UE5 graphical showcase just shipped without hardware ray tracing so does this 'future' that you speak of involve some fairy tale where Unreal Engine somehow doesn't exist ?
Which showcase was that, I wonder?
 
If they fall too far behind, consoles will abandon them.
The business aspect of the GPU choice probably means tech and features won't matter one jot. A console company picking nVidia for better game visuals, say, will probably end up being charged a lot more than one picking AMD, skewing the hardware price dramatically. (Assuming the tech delta lies that way - this is NOT an nV versus AMD discussion!!)

It might be the case that AMD (or whoever) is the only horse in the race. Maybe Intel will take a hammering over the next few years and be desperate for some console action? But realistically, the console market is a piddling little waste of time versus servers etc., where these chip manufacturers can make real money. Console companies might simply have to make do with whoever has failed to secure business contracts and is in dire need of any customers at all, including a laughably tiny console business of 20 million economy-priced units a year.
 
I must be forgetting something, but what has embracing these technologies done for Intel's fortunes, and what do they have to show for it again? What makes you think that AMD will somehow succeed where Intel has failed with their attempts to follow Nvidia in the very same direction?

AMD isn't starting from zero. Intel has bigger problems than cutting-edge feature support and performance. The comparison is irrelevant.

It's precisely because the dire supply economics are similar for every player, and because they now have strong evidence that symmetric competition (Intel) isn't working, that AMD are all the more motivated to find asymmetric means, using what they already have on hand (virtual geometry/mesh shading/work graphs are all very much real and beyond just slide decks), to counter the further spread of RT ...

There is no evidence that “symmetric competition isn’t working”, since there is no symmetric competition. There's an important piece missing from your proposal: what, in your view, will motivate people to buy AMD? Being slower at cutting-edge tech surely isn't it.
 
Ish. NSW continues to sell as well as any console from yesteryear. I think the huge limiting factor here is the cost of console hardware. It just hasn't price-dropped to mainstream. People who are going to sink 500+ bucks into a device are more likely to do it on their phone than on a console as a priority. Families are more likely to give a cheap mobile device to each child for $500 rather than have them fight over the TV on one $500 console.

If these consoles were $200, I think big-screen gaming would be back to being a priority. It's just better. Even MC and Roblox are better on a controller. But now the cost of better is too high for half the market's potential. This doesn't much change your argument, as the consoles aren't ever going to get cheap again, but I do think tastes and trends aren't going to be any different across people; we're simply seeing a different context affecting user choices.

I guess to summarise my argument, if a console appeared at $200 that produced 120 Hz realtime RT Roblox and MC with mega eye-candy, the kids would be all over it. But as that's not going to happen, graphical power won't swing these potential customers.
It’s a factor if there is a reason for it. A lot of console generations are linked to games that couldn’t be played in the last generation.

I think Xbox is a good example of a cheaper next-generation system with better value per price point, and the world will largely ignore it except in North America. It's a situation where, despite being cheaper, if no one in your region is playing on it, it's a dead ecosystem. Social gaming is too powerful to be caught on an ecosystem where you cannot play with your friends (though this excuse seems weaker with cross-platform play, the lack of cross-platform chat is still an issue).

The only way for Xbox to dig its way out is through system sellers. Then value and platform play a larger factor for them.
 
Why else would they actually materialize a public GI library like Brixelizer, which doesn't have any RT HW requirements, while the others here continue to hype RT research papers that don't go beyond internal experiments?
There is a fundamental mismatch between current hardware ray tracing and prefiltering, which AMD is more likely to acknowledge than NVIDIA. I wouldn't term it sabotage, any more than NVIDIA having nothing better than coordinate hashing for a world-space surface radiance cache and pushing that into games is sabotage; it's just the laziness that comes from being on top.

Ray marching an SDF is still a form of ray tracing even when it doesn't use the current ray tracing hardware. The SDF does have level of detail, and presumably the associated radiosity cache does too.

Traditional ray tracing is joined at the hip with Monte Carlo, and by the time you separate the two, not much of the hardware support is useful any more.

If RTX didn't exist, I suspect devs would have been trying more advanced voxel cone tracing by now. Maybe for better results.
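To make the point concrete, here's a minimal sphere-tracing sketch (C++; the sphere SDF, step count, and tolerances are illustrative only, not any shipping implementation):

```cpp
// Sphere tracing: march along the ray, stepping by the scene SDF's
// value at the current point, which is always a safe step size
// because no surface can be nearer than the signed distance.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 v)          { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// Illustrative scene: a unit sphere at the origin.
static float sceneSDF(Vec3 p) { return len(p) - 1.0f; }

// Returns the hit distance along a normalized ray, or -1 on a miss.
float sphereTrace(Vec3 origin, Vec3 dir)
{
    float t = 0.0f;
    for (int i = 0; i < 128; ++i)          // max march steps
    {
        float d = sceneSDF(add(origin, mul(dir, t)));
        if (d < 1e-4f) return t;           // within tolerance: hit
        t += d;                            // conservative step forward
        if (t > 100.0f) break;             // past the far plane
    }
    return -1.0f;
}
```

The only 'acceleration structure' involved is the distance field itself, which is why none of the BVH hardware gets exercised.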
 
It's still in a preview phase, with experimental drivers from AMD and dev-only drivers from NVIDIA and Intel, so no ... it's still in beta, and has been in beta for 3 years; who knows how many more years before it's out of beta?
It is not in a preview phase as you keep incorrectly insisting. AMD has officially supported the API for Agility SDK 1.613 since the end of June, per their driver release notes ...
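For what it's worth, whether a given driver exposes the retail Work Graphs feature is a one-call runtime check; a minimal sketch (C++, assuming the Agility SDK 1.613+ headers and an already-created ID3D12Device, error handling elided):

```cpp
#include <windows.h>
#include <d3d12.h>

// Returns true if the installed driver reports Work Graphs tier 1.0,
// i.e. the retail (non-preview) feature introduced with Agility SDK 1.613.
bool SupportsWorkGraphs(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS21 options = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS21, &options, sizeof(options))))
        return false;                      // older runtime: OPTIONS21 unknown
    return options.WorkGraphsTier >= D3D12_WORK_GRAPHS_TIER_1_0;
}
```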
If they fall too far behind, consoles will abandon them.
Abandon them for whom, LOL? For other mobile hardware vendors? Maybe ...

Intel and Nvidia? That's a good joke if you think so, especially since the latter is leaving their only console customer hanging by a thread with no modern mobile SoC design in visible sight ...
Which showcase was that, I wonder?
We had Nobody Wants to Die last month and the Marvel's Rivals closed beta, and in both cases it was too hard to enable hardware ray tracing ...
AMD isn't starting from zero. Intel has bigger problems than cutting-edge feature support and performance. The comparison is irrelevant.
Generally, AMD are the ones most willing to differentiate their product stacks for each specialized market, even if it means fractured feature segmentation between their varying product lines, like we see with their CPUs vs NPUs vs compute accelerators vs gaming GPUs vs FPGAs ...

There's barely enough demand to sustain the development of their console business, let alone their discrete graphics products. Your implication that AMD should create a new product line/architecture specifically for PC graphics would, ironically, entail them "starting from zero" again, since they couldn't reuse one of their most useful results (Work Graphs/misc optimizations), which came out of their work on consoles ...
There is no evidence that “symmetric competition isn’t working”, since there is no symmetric competition. There's an important piece missing from your proposal: what, in your view, will motivate people to buy AMD? Being slower at cutting-edge tech surely isn't it.
How convenient of you to keep ignoring the only real-world data point we have at our disposal when it doesn't fit your narrative ...

You can stay in denial all you want, but the data shows that AMD reusing their work from consoles can at least maintain double-digit discrete graphics market share, while Intel, who are virtually free from any legacy cruft, haven't even made a dent by shadowing the very leader of the industry ...

@Bold If you can't see that undercutting technology is a valid move, then there's nothing to be gained for you from continuing this debate. How hard is it for you to understand that AMD would *prefer* virtual geometry/GPU-driven rendering to be the dominant future, when it benefits them by making it much harder to apply HW ray tracing to content, like we see in UE5?
 
There is a fundamental mismatch between current hardware ray tracing and prefiltering, which AMD is more likely to acknowledge than NVIDIA. I wouldn't term it sabotage, any more than NVIDIA having nothing better than coordinate hashing for a world-space surface radiance cache and pushing that into games is sabotage; it's just the laziness that comes from being on top.

Ray marching an SDF is still a form of ray tracing even when it doesn't use the current ray tracing hardware. The SDF does have level of detail, and presumably the associated radiosity cache does too.

Traditional ray tracing is joined at the hip with Monte Carlo, and by the time you separate the two, not much of the hardware support is useful any more.

If RTX didn't exist, I suspect devs would have been trying more advanced voxel cone tracing by now. Maybe for better results.
I think we would very much still see SDFs, since they take up far less memory (lower resolution) to prevent/reduce light leaking, whereas voxels need very high resolutions (higher memory consumption) to do the same ...

Surface-based representations like SDFs are inherently superior to volume-based representations like voxels for approximating the surface area of objects ...
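As a back-of-envelope illustration of that memory argument (the resolutions and texel formats below are made up purely for scale):

```cpp
#include <cstddef>
#include <cstdio>

int main()
{
    // Hypothetical numbers: a dense radiance voxel grid has to be fine
    // enough that thin walls span several cells, or light leaks through.
    constexpr std::size_t voxelRes   = 512;
    constexpr std::size_t voxelBytes = voxelRes * voxelRes * voxelRes * 4; // RGBA8

    // An SDF stores distance-to-surface, which interpolates smoothly,
    // so a much coarser grid can still resolve thin geometry.
    constexpr std::size_t sdfRes   = 128;
    constexpr std::size_t sdfBytes = sdfRes * sdfRes * sdfRes * 2;         // 16-bit distance

    std::printf("voxel grid: %zu MiB\n", voxelBytes >> 20);  // 512 MiB
    std::printf("SDF grid:   %zu MiB\n", sdfBytes >> 20);    // 4 MiB
}
```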
 
It is not in a preview phase as you keep incorrectly insisting. AMD has officially supported the API for Agility SDK 1.613 since the end of June, per their driver release notes ...
That driver doesn't support Work Graphs Mesh Nodes, which is the significant new feature of Work Graphs; ergo, it's still beta.

Abandon them for whom, LOL? For other mobile hardware vendors? Maybe ...
Yep maybe. Maybe even Intel or NVIDIA, you never know.

We had Nobody Wants to Die last month and the Marvel's Rivals closed beta, and in both cases it was too hard to enable hardware ray tracing ...
Neither of these is a showcase for the engine: one is made by an AA studio with AA production values, and the other is a multiplayer game. You should have mentioned Black Myth: Wukong instead, which supports path tracing on UE5. But that's what you always do: pick the wrong data point and build a wrong theory around it.
 