this is a very sad and pathetic console generation. Mediocrity galore. Pitiful. I can't find the words. At least Tom had fun tweaking the settings on the Series S.
yup, likely lower end textures. Kraken (better lossless compression) or memory footprint advantage seems likely.
I'm leaning towards texture compression differentials, only because I'm expecting the UE5 virtual texturing system to be more than sufficient here in keeping texture pools small. I wonder if trying to keep certain assets within the GPU-optimal 10 GB of memory might lead to a big texture like the ground texture being reduced in size.
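The memory argument can be made concrete with rough numbers. This is only a back-of-the-envelope sketch with assumed sizes (an 8K ground texture, RGBA8 at 4 bytes/texel vs. BC7 at 1 byte/texel), not Ark's actual assets:

```python
def mip_chain_bytes(width, height, bytes_per_texel):
    """Total size of a texture with a full mip chain (~4/3 of the base level)."""
    total = 0
    w, h = width, height
    while True:
        total += max(w, 1) * max(h, 1) * bytes_per_texel
        if w <= 1 and h <= 1:
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

MB = 1024 * 1024
rgba8 = mip_chain_bytes(8192, 8192, 4)        # uncompressed 8K
bc7 = mip_chain_bytes(8192, 8192, 1)          # block-compressed 8K
bc7_4k = mip_chain_bytes(4096, 4096, 1)       # drop the top mip: ship at 4K
print(f"RGBA8 8K: {rgba8 / MB:.0f} MiB")      # ~341 MiB
print(f"BC7 8K:   {bc7 / MB:.0f} MiB")        # ~85 MiB
print(f"BC7 4K:   {bc7_4k / MB:.0f} MiB")     # ~21 MiB
```

Dropping one mip level cuts the resident footprint by 4x, which is why a single huge terrain texture is an attractive target when squeezing a budget.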
Series X is using a texture setting that is clearly much better than what Series S is using, so I don't think that's it. Series consoles have Kraken and XSX has more usable memory than PS5.
I think it's just tied to XSS settings being used as base on XSX.
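For the compression side of the argument: Kraken (RAD's Oodle codec) is proprietary, so zlib stands in here purely to illustrate the general point that lossless ratios depend entirely on the data, which is why the same asset set can end up with a different footprint per platform and codec. The two buffers are made-up examples, not real texture data:

```python
import random
import zlib

random.seed(0)
# A repeating 256-byte pattern: highly redundant, compresses very well.
repetitive = bytes(range(256)) * 512
# Pseudo-random bytes: effectively incompressible for any lossless codec.
noisy = bytes(random.randrange(256) for _ in range(1 << 17))

for name, data in (("repetitive", repetitive), ("noisy", noisy)):
    compressed = zlib.compress(data, 9)
    print(f"{name}: {len(data) / len(compressed):.1f}x")
```

A better codec like Kraken shifts these ratios up and decodes much faster, but the data-dependence is the same: no lossless scheme can guarantee a fixed saving.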
On my 6 GB VRAM GPU, the textures also look like that; they never load, regardless of what setting or resolution.

Nice, glad to see this get some coverage! It's definitely rough on console. IMO rather than chase 60 it would be a much better experience for them just to put in a properly paced vsync/2 30fps cap by default. Yes, it can still fall below that occasionally, but it would still make the majority experience a lot better. It's a shame that they decided to use volumetric clouds *as* fog, as the performance cost is just ridiculous, but turning them off you lose a lot of visual appeal.
Having played many dozens of hours at this point though I'm sticking with my take that the game looks amazing, but it is definitely super heavy and doesn't scale down as well as one would like. You basically need a 4080+ class card to enjoy it at anywhere near max settings, although of course "max settings" are always kind of an arbitrary thing. I don't think the console versions necessarily strike a great balance out of the box, although having a smoother 30fps capped experience to start would go a long way IMO. The compromises for 60fps are just too much.
I also have no idea why the textures were slashed so much. Maybe on Series S there's always a memory argument but in general virtual texturing works very well...
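On the "properly paced" part of that 30fps-cap suggestion: the key is pacing against absolute deadlines rather than sleeping a fixed amount after each frame, so one slow frame doesn't shift every later frame. This is a CPU-side sketch of that idea only; a real console cap would sync to the display's present interval rather than the CPU clock:

```python
import time

FRAME_BUDGET = 1 / 30  # ~33.3 ms: a vsync/2 cap on a 60 Hz display

def run_capped(render_frame, frames):
    """Pace frames against fixed absolute deadlines.

    A naive sleep(FRAME_BUDGET) after each frame accumulates drift when a
    frame overruns; anchoring deadlines keeps the cadence even.
    """
    deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(frames):
        render_frame()
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # hold until this frame's deadline
        deadline += FRAME_BUDGET    # next deadline is fixed, not relative
```

Frames that overrun their budget simply present late and the loop catches back up, which is exactly the "falls below occasionally but the majority experience is smooth" behavior described above.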
I played some Fortnite yesterday on PS5 and I can confirm some amount of artifacting there as well.
By the way, something appears to be wrong with UE5's volumetric cloud system. I've had buggy, flickery, artifacting clouds in Hellblade 2 and I have them again in Ark. They flicker and artifact like crazy and it's very, very unpleasant to the eye. It's not clear what upscaler Ark uses, since there's no setting for it, but it might be an issue between upscaling and the volumetric cloud system in UE5. This is medium settings, 1440p and a render resolution of 50% with the "innate" upscaler (probably a rough translation from German; it might refer to TSR?). I really, really don't think the clouds should look like that.
[Attachment 11464: screenshot of the artifacting clouds in Ark]
And now imagine the most horrible flickering you can ever imagine.
Here's the same issue in Hellblade 2 using DLSS at the same render resolution. (Low-Medium settings)
[Attachment 11465: the same cloud artifacts in Hellblade 2]
I don't have much time to download 20 GB for Fortnite right now, but if my memory serves correctly, I've noticed buggy, artifacting clouds there as well. (Although take this with a grain of salt; I might verify it at a later date.) But having observed the exact same issue in two recent UE5 games already, I think it's a clue that points toward this being engine related.
There are definitely upscaler settings on PC; they're just spread between two tabs in the settings IIRC. On console I assume it's TSR.
Ha, this is a pretty great practical graph of the bandwidth differences in the VRAM/PCIe test. Could not have planned those squares/volumes out more perfectly.
I actually was gonna look at it soon.

@Dictator: You guys looked at this?
Lossless Scaling. Machine learning Frame Generation and upscaling on ANY GPU!
This is something I mentioned in the FSR3 thread, but according to the Lossless Scaling author, they use Machine Learning and not FSR3 for this tech, so I decided to add a new thread. With Lossless Scaling you can have Frame Generation in any game of your collection, just like when this app...
forum.beyond3d.com
Independent frame interpolation utility on Steam. Cyan's the only user reporting but it sounds really good.
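For context on what "frame generation" means here: Lossless Scaling reportedly uses machine learning, so this is emphatically not its method, just the crudest possible baseline (a 50% linear blend of two frames) to show what synthesizing an in-between frame is, and why fast motion and occlusions are where artifacts come from:

```python
# Naive frame interpolation baseline: per-pixel average of two frames.
# Real interpolators estimate motion instead of blending in place; a plain
# blend ghosts anything that moved between the two source frames.

def blend_midframe(frame_a, frame_b):
    """Return the per-pixel average of two frames (flat lists of 0-255 values)."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

prev_frame = [0, 100, 200, 255]
next_frame = [50, 100, 0, 255]
print(blend_midframe(prev_frame, next_frame))  # [25, 100, 100, 255]
```

A pixel that went 200 → 0 lands at 100 in the generated frame, i.e. a ghost of both states, which is exactly the failure mode motion-aware (and ML-based) approaches exist to avoid.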
Damn, maybe you are right… You could send an email to the developers explaining that XSX has more usable memory and that they can turn up the textures?
Using ARK as the basis for your judgement of the console's capabilities is more pitiful than anything and you know it.