Could X1X et al run this/that game/effect? *spawn

Even the Series S would outperform an X1X despite being the weaker console on paper; this game requires a modern feature set, like mesh shaders, that would not allow the X1X to run this title.

Someone didn't look at the minimum GPU requirements on PC before making such a ridiculous comment.
 
Someone didn't look at the minimum GPU requirements on PC before making such a ridiculous comment.
You think the X1X has the compute power to push all this out without mesh shaders while still being able to keep the rest of the game running? Are you serious? You think it will exceed Series S performance just by brute forcing? On a Jaguar CPU? How much bandwidth is going to be eaten here by the CPU because they can't get a fully GPU-driven pipeline due to a lack of features?

I don't know if we saw the same video; everything above 16px triangles was rendered with mesh shaders, and anything smaller than 16px was rendered with compute. The 1070 benchmarks look horrid.
Look here, the example in question:

The CPU-side memory usage is already above 12GB. How are you going to fit that? There's no SSD streaming pipeline on the X1X. You're on super old Jaguar cores, compared to a 5800X. There's a complete lack of a GPU-driven feature set on the X1X compared to the Series S. You can't stream the geometry and the lighting in and out of a super tiny pool of memory.
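As a rough sketch of the hybrid pipeline described above (the 16px threshold comes from the video; the function and names are illustrative, not the game's actual code):

```python
# Illustrative only: pick a rasterisation path from the estimated
# on-screen size of a triangle, as described in the video.
PIXEL_THRESHOLD = 16.0  # triangles larger than ~16px use the hardware path

def choose_raster_path(projected_triangle_px: float) -> str:
    """Return which rasteriser a triangle of the given screen size would use."""
    if projected_triangle_px > PIXEL_THRESHOLD:
        return "mesh_shader"  # hardware rasteriser; needs mesh shader support
    return "compute"          # software rasterisation in a compute shader

# Larger triangles take the mesh shader path, tiny ones go to compute.
paths = [choose_raster_path(px) for px in (64.0, 20.0, 8.0, 1.5)]
```

On hardware without mesh shaders, a fallback would have to route the first bucket through the traditional vertex pipeline or compute as well, which is where the extra cost would come in.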
 
You think the X1X has the compute power to push all this out without mesh shaders while still being able to keep the rest of the game running? Are you serious? You think it will exceed Series S performance just by brute forcing? On a Jaguar CPU? How much bandwidth is going to be eaten here by the CPU because they can't get a fully GPU-driven pipeline due to a lack of features?

I don't know if we saw the same video; everything above 16px triangles was rendered with mesh shaders, and anything smaller than 16px was rendered with compute. The 1070 benchmarks look horrid.

The game runs on a GTX 1070, a GPU with no mesh shader support.

So your previous comment about the lack of mesh shaders not allowing the X1X to run the game is completely and utterly wrong.

Accept it, and move on without typing multiple paragraphs of utter nonsense that still don't stop what you said from being wrong.
 
The game runs on a GTX 1070, a GPU with no mesh shader support.

So your previous comment about the lack of mesh shaders not allowing the X1X to run the game is completely and utterly wrong.

Accept it, and move on without typing multiple paragraphs of utter nonsense that still don't stop what you said from being wrong.
You're right on that point: there is a fallback without mesh shaders, and in that context I am wrong. Though to be fair, it does feel like you chose a PC part here to make your point work, which is not the comparison I made (X1X).
You reframed my argument and attacked that, and now you're asking me to back down from my original statement, which you clearly sidestepped. The implication of my statement is that without these features, sure, you can run it, but can you run it with reasonable performance on an X1X, against which the Series S is down on both compute and bandwidth? The answer is no. Unless you want to claim that an X1X can run AC Shadows better than a Series S.

And to quote what I wrote:
Even the Series S would outperform an X1X despite being the weaker console on paper; this game requires a modern feature set, like mesh shaders, that would not allow the X1X to run this title.

You were definitely more ready to type than to read.
 
You're right on that point: there is a fallback without mesh shaders, and in that context I am wrong. Though to be fair, it does feel like you chose a PC part here to make your point work, which is not the comparison I made (X1X).
You reframed my argument and attacked that, and now you're asking me to back down from my original statement, which you clearly sidestepped. The implication of my statement is that without these features, sure, you can run it, but can you run it with reasonable performance on an X1X, against which the Series S is down on both compute and bandwidth? The answer is no. Unless you want to claim that an X1X can run AC Shadows better than a Series S.

There's nothing stopping them from having a traditional geometry pipeline on the X1X and having visible LOD compared to the XSS.

And as the game supports GPUs with no mesh shader support, it may very well have such a setup.

That would allow the X1X to stretch its legs in areas where it has traditionally always beaten the XSS (resolution, textures, etc.).
 
There's nothing stopping them from having a traditional geometry pipeline on the X1X and having visible LOD compared to the XSS.

And as the game supports GPUs with no mesh shader support, it may very well have such a setup.

That would allow the X1X to stretch its legs in areas where it has traditionally always beaten the XSS (resolution, textures, etc.).
I would disagree; even if we ignore memory limitations, it's quite plainly too old to perform.

A 1070, which is more powerful and has a more modern feature set than an X1X, still underperforms compared to a Series S.
Using the video provided above, it can barely render 1080p with TAA at 30fps with absolutely every setting on low.
The Series S is running at 1620p with TAA and is largely locked to 30fps, with some dips below.

Comparing the videos of Shadows on a Series S vs a 1070 at 1080p with TAA, you can tell the Series S is running significantly higher settings.
 
On mobile, but this is one.


Performance isn't great, but it's not from a massive studio either, so with more resources it could run better.

But as can be seen, from a CPU perspective, it can be done.

It's worth pointing out that just because a UE5 game can be deployed to last-gen machines, it doesn't mean that all UE5 games, and all UE5 features, will be possible on last-gen hardware. (UE5 has a range of performance-demanding tools like Lumen and Nanite that can be used or not, or that are unavailable on certain hardware.)

The busier a scene is, the more work there is for the render thread (for example), where the Series S would have 3+ times the performance. For multithreaded workloads the difference would be even bigger.

On the GPU side, RDNA added Wave32, which seems to be better for compute and vertex shaders, both of which are obviously very important to modern games. For AC:Shadows on PC, all pre-RDNA AMD cards are below minimum spec.

Modern games are also very much built around fast, efficient streaming of assets, requiring an SSD. Trying to pre-cache your way out of trouble on the One X would eat into available memory and still deliver results worse than an NVMe drive with a nice hardware decompression block (RIP Jaguar cores).
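A back-of-the-envelope illustration of why pre-caching is a poor substitute for fast streaming (every figure here is an assumption chosen for the arithmetic, not a measured value):

```python
# Assumed throughputs: a typical 2.5" HDD vs the demand of a modern
# streaming-heavy game. None of these are measured figures.
hdd_mb_s = 100       # sequential read of a laptop-class HDD (assumption)
demand_mb_s = 500    # hypothetical streaming demand of a current game
lookahead_s = 60     # pre-cache one minute of traversal ahead (assumption)

# The HDD can't keep up, so the shortfall has to sit in RAM instead.
shortfall_mb_s = demand_mb_s - hdd_mb_s
extra_ram_mb = shortfall_mb_s * lookahead_s  # 24,000 MB of pre-cache
```

Even with generous assumptions, the pre-cache swallows far more RAM than a last-gen console has, which is the point about eating into available memory.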
 
I would disagree; even if we ignore memory limitations, it's quite plainly too old to perform.

A 1070, which is more powerful and has a more modern feature set than an X1X, still underperforms compared to a Series S.
Using the video provided above, it can barely render 1080p with TAA at 30fps with absolutely every setting on low.
The Series S is running at 1620p with TAA and is largely locked to 30fps, with some dips below.

Comparing the videos of Shadows on a Series S vs a 1070 at 1080p with TAA, you can tell the Series S is running significantly higher settings.

Yeah, whatever it's doing to get around not having mesh shaders probably isn't helping. IIRC Alan Wake 2's performance on pre-RDNA AMD cards took a massive hit.
 
I would disagree; even if we ignore memory limitations, it's quite plainly too old to perform.

A 1070, which is more powerful and has a more modern feature set than an X1X, still underperforms compared to a Series S.
Using the video provided above, it can barely render 1080p with TAA at 30fps with absolutely every setting on low.
The Series S is running at 1620p with TAA and is largely locked to 30fps, with some dips below.

Comparing the videos of Shadows on a Series S vs a 1070 at 1080p with TAA, you can tell the Series S is running significantly higher settings.

XSS is 720p to 1080p according to DF's review.
 
XSS is 720p to 1080p according to DF's review.
Doesn't change that it's running much lower settings. The 1070 at native is also struggling to maintain 30fps. The Series S is capped at 30fps, so it even has headroom. Wherever the Series S is dipping to 720p to maintain framerate, I assure you, so would the 1070.

The XSS is 720p-1080p internally, TAAU'd to 1620p. And it's capped at 30fps, so it still has headroom, with higher graphical settings.
The 1070 is 1080p native, peaks around 30fps, is running lower settings, and doesn't have the extra burden of also upscaling to 1620p.

And it has, what, 50% more GPU compute, 20% more VRAM, and 45% more memory bandwidth on a wider bus.
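For scale, here are the per-frame pixel counts behind the resolution claims above (straightforward arithmetic on the quoted resolutions):

```python
def megapixels(width: int, height: int) -> float:
    """Frame size in megapixels, for comparing raw render loads."""
    return width * height / 1e6

gtx1070_native = megapixels(1920, 1080)     # 1080p native: ~2.07 MP rendered
xss_internal_low = megapixels(1280, 720)    # XSS internal floor: ~0.92 MP
xss_internal_high = megapixels(1920, 1080)  # XSS internal ceiling: ~2.07 MP
xss_output = megapixels(2880, 1620)         # XSS 1620p TAAU output: ~4.67 MP
```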
 
Doesn't change that it's running much lower settings. The 1070 at native is also struggling to maintain 30fps. The Series S is capped at 30fps, so it even has headroom. Wherever the Series S is dipping to 720p to maintain framerate, I assure you, so would the 1070.

The 1070 isn't an X1X, and doesn't enjoy the level of optimisation that the X1X gets.

And the XSS is capped at 30fps? You said it "largely is locked 30fps"... that's not 'capped', so make your mind up and be consistent, will you.

You've gone from 'the X1X can't do it because of no mesh shader support', then when you were corrected you switched to claiming the XSS runs at a higher resolution, then you were corrected again and now you've moved on to settings, without evidencing what settings the XSS is even using.

Go and do some due diligence and then come back, because so far you've failed to check whether mesh shader support was even a minimum requirement, and then failed to check what resolution the XSS was even running at.

To quote Digital Foundry:

The Series S version presents the least stable 30fps mode. Despite its many cutbacks to RTGI, hair physics and resolution, it's not quite as watertight as you might hope. Frame-rate dips are noticeably more common than on Series X or PS5 in their equivalent 30fps options, with battles and cutscenes being the most frequent cause of disruption as resolution bottoms out at 720p.
 
The 1070 isn't an X1X, and doesn't enjoy the level of optimisation that the X1X gets.

And the XSS is capped at 30fps? You said it "largely is locked 30fps"... that's not 'capped', so make your mind up and be consistent, will you.

You've gone from 'the X1X can't do it because of no mesh shader support', then when you were corrected you switched to claiming the XSS runs at a higher resolution, then you were corrected again and now you've moved on to settings, without evidencing what settings the XSS is even using.

Go and do some due diligence and then come back, because so far you've failed to check whether mesh shader support was even a minimum requirement, and then failed to check what resolution the XSS was even running at.

To quote Digital Foundry:
Capped means we can't see performance above 30fps. "Largely locked at 30fps" and "capped at 30fps" are both absolutely true.
I never brought the 1070 into this discussion as a proxy. And I never once said that mesh shader support was the only reason the X1X cannot run this game. Go back and look at what I wrote: I said a feature set _LIKE_ mesh shaders. You've been pushing my words into something else to solidify your position. When I'm talking about a next-gen feature set, that includes the CPU, the SSD, and SM6.8 vs SM6.0.
 

This video is AC:Shadows on an RX 580, paired with a Ryzen 5 8400F and no doubt an SSD.* I've timestamped it to start at the all-low settings, 720p native chapter. 720p is the minimum that the Series S lowers its resolution to, but it would normally be higher.

As you can see, it runs at around 30fps but frequently a bit below, compared to a capped 30fps on Series S with occasional dips below. The big takeaway from this video, though, is that the game looks like shit compared to the Series S version due to the all-low settings. Glitches due to unsupported drivers don't help either.

On the X1X, GPU performance would probably be more stable, and perhaps a little higher (than the RX 580) due to more optimisation. Some settings might also be raised above low with a minimal hit to performance. Of course, the CPU and HDD would not be up to the job for AC:Shadows on X1X.

It's quite clear that the Series S is delivering well above what older GCN GPUs do, even with a deficit in terms of "paper flops". The weakest part of the Series S, the GPU, is far better suited to current games than even the most powerful top-end mid-gen console from 2017.

(*BTW, the status bar and description shortcuts in the embedded video are a bit misleading; the first two chapters are 1080p and 900p with FSR Performance and frame gen.)

And the XSS is capped at 30fps? You said it "largely is locked 30fps"... that's not 'capped', so make your mind up and be consistent, will you.

The game is both "capped at 30fps" and "largely locked to 30fps". It's "capped" at 30fps because the game limits itself to running no higher, and it's only "largely" locked to 30fps because performance sometimes drops below the level of the cap.

Both statements are true and in no way contradictory.
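The distinction can be shown with a toy frame limiter (an illustrative sketch, not any engine's actual code):

```python
CAP_FPS = 30
CAP_MS = 1000.0 / CAP_FPS  # minimum frame time under a 30fps cap

def apply_cap(frame_times_ms):
    """A cap only prevents frames from being FASTER than the target."""
    return [max(t, CAP_MS) for t in frame_times_ms]

def is_locked(frame_times_ms):
    """'Locked' means every frame actually hits the target."""
    return all(t <= CAP_MS for t in frame_times_ms)

# One heavy 40ms frame: the run is capped at 30fps, yet not locked to it.
capped = apply_cap([20.0, 25.0, 40.0])
locked = is_locked(capped)
```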
 

This video is AC:Shadows on an RX 580, paired with a Ryzen 5 8400F and no doubt an SSD.* I've timestamped it to start at the all-low settings, 720p native chapter. 720p is the minimum that the Series S lowers its resolution to, but it would normally be higher.

As you can see, it runs at around 30fps but frequently a bit below, compared to a capped 30fps on Series S with occasional dips below. The big takeaway from this video, though, is that the game looks like shit compared to the Series S version due to the all-low settings. Glitches due to unsupported drivers don't help either.

It's insane they've even got the game to work, as those GPUs are DX12 and the game requires DX12.1.

The internet is full of RX 580 owners posting DX12 error messages when trying to launch the game.

So I would take the performance it displays in the video with a grain of salt, and as no reflection of what it would do if it were fully supported.

It's quite clear that the Series S is delivering well above what older GCN GPUs do, even with a deficit in terms of "paper flops".

I still doubt that, as we know from all the cross-generation games at the start of the generation that it had a hard time even beating what the X1X was delivering in terms of visuals.
 
It's insane they've even got the game to work, as those GPUs are DX12 and the game requires DX12.1.

The internet is full of RX 580 owners posting DX12 error messages when trying to launch the game.

So I would take the performance it displays in the video with a grain of salt, and as no reflection of what it would do if it were fully supported.

I think it's the best reflection we have of how an older GCN card (the best GCN GPU, as it ended up) would do under the circumstances. I do accept the driver situation is not ideal, though.

I still doubt that, as we know from all the cross-generation games at the start of the generation that it had a hard time even beating what the X1X was delivering in terms of visuals.

I went on to try to clarify that I'm thinking in terms of modern games. By that I mean huge amounts of geometry, a massively increased emphasis on compute and GPU-driven work, even core rendering that benefits from RT hardware at a base level. All the kind of stuff that RDNA 2 was designed to enable in future software.

Cross-gen games were really bad at leveraging any of the strengths of the Series S, but now that they are being leveraged, it's doing things that even the fastest last-gen console wouldn't have a hope of doing. And I suppose that's how things should be. It took a long time for this gen to really start delivering, probably due to a long cross-gen period necessitated by the ever-growing cost of game development.
 
It's insane they've even got the game to work, as those GPUs are DX12 and the game requires DX12.1.

The internet is full of RX 580 owners posting DX12 error messages when trying to launch the game.

So I would take the performance it displays in the video with a grain of salt, and as no reflection of what it would do if it were fully supported.
Also important to note that the rendering is broken in that video.
 
Videos on YouTube of GPUs comparable to the X1X's show it would be no worse than the XSS version.
He didn't say the GPU in the X1X. He said the X1X. And he's right.

In both examples system RAM usage exceeds 12GB, there's not enough CPU, and there isn't enough streaming speed.

Secondly, the X1X does not fully support DX12.1 either. Scorpio is incapable of running AC Shadows. Your position requires you to take massive liberties with "optimization".
 
He didn’t say the GPU in the X1X. He said the X1X. And he’s right.

In both examples system RAM usage exceeds 12GB, there's not enough CPU, and there isn't enough streaming speed.

Secondly, the X1X does not fully support DX12.1 either. Scorpio is incapable of running AC Shadows. Your position requires you to take massive liberties with "optimization".

Jesus, here we go again. You would think after the last few pages you would have learned to check before posting.

1. There are videos on YouTube of the Matrix demo working from an HDD.
2. There are videos on YouTube of it running on low amounts of RAM.
3. There are videos on YouTube of UE5 running on a Core 2 Quad.
4. Not only would the lack of DX12.1 not be a problem on a FIXED PLATFORM like a console, but UE5 is even certified for X1X and has UE5 games released on it, showing that the lack of DX12.1 is not an issue.

And your argument for Scorpio not being able to run AC Shadows is beyond weak. Tell me, what version of the DX API does the PS5 use? 👀
 