Forspoken (Project Athia) [PS5, PC]

  • Thread starter Deleted member 7537
  • Start date
Sigh!

Forspoken doesn't work on RX 400 and RX 500 GPUs, or any earlier AMD GPUs (RX 200, RX 300), as they lack DX12_1 feature-level support, mainly Conservative Rasterization and Rasterizer Ordered Views! This also locks all NVIDIA Kepler and Fermi GPUs out of the game, but GTX 900 and GTX 1000 cards can run it, as they support these features through DX12_1!
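To illustrate the gate being described, here is a hypothetical sketch of such a launch check. The cap names mirror D3D12's `ConservativeRasterizationTier` and `ROVsSupported` options, but this is purely illustrative, not Forspoken's actual code:

```python
# Hypothetical sketch of a DX12_1 launch gate like the one Forspoken
# appears to enforce. The dicts below mirror the D3D12 caps
# ConservativeRasterizationTier and ROVsSupported; the values reflect
# what each GPU family actually reports.

def meets_dx12_1_gate(caps: dict) -> bool:
    """Require Conservative Rasterization (tier >= 1) and ROVs."""
    return caps.get("conservative_rasterization_tier", 0) >= 1 \
        and caps.get("rasterizer_ordered_views", False)

# AMD Polaris (RX 400/500): feature level 12_0, lacks both caps.
polaris = {"conservative_rasterization_tier": 0,
           "rasterizer_ordered_views": False}

# NVIDIA Maxwell 2 (GTX 900): older architecture, but exposes both.
maxwell2 = {"conservative_rasterization_tier": 1,
            "rasterizer_ordered_views": True}

print(meets_dx12_1_gate(polaris))   # False -> game refuses to launch
print(meets_dx12_1_gate(maxwell2))  # True  -> game runs
```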


Someone on reddit says they managed to run Forspoken on a Radeon RX 480 on Linux, at 30fps.

Can't remember the resolution.

Maybe the DX12_1 requirement was arbitrary? Or Linux (Proton?) somehow emulated the 12_1 features?
 
Sigh!

Forspoken doesn't work on RX 400 and RX 500 GPUs, or any earlier AMD GPUs (RX 200, RX 300), as they lack DX12_1 feature-level support, mainly Conservative Rasterization and Rasterizer Ordered Views! This also locks all NVIDIA Kepler and Fermi GPUs out of the game, but GTX 900 and GTX 1000 cards can run it, as they support these features through DX12_1!


Didn't know that. Polaris (the RX 400 and 500 series) are the newer GPUs, yet Maxwell, despite being a two-years-older architecture, actually has the newer DX12 feature support…
 
Last edited:
The first 2-3 hours of "gameplay" looks absolutely horrible with how often it takes control away from you and how little you're actually doing anything.
This is my pet peeve. The worst offender is GTA V. The game opens with the bank job with Trevor; as Michael you take two f***ing steps and BAM! The game takes control away from you and fires off tutorial boxes to read through.
 
Performance seems to slightly favor Nvidia cards but nothing crazy.

 

Square Enix gave me a free copy of this game.

Forspoken has a very strange ambient occlusion implementation and lighting that make the visuals look flat. This ReShade mod subtly changes Forspoken's visuals by making the AO more pronounced with a larger radius (so it's less like an outline) and by adding fake GI, while keeping the art style as close as possible to the original, just less flat.
 
Whoa! After the 1st boss, Forspoken got much better.

There's less getting stopped in dialogue, more freedom of engagement, and Frey's mindset/psychology is explained better.

Later on, the game also pulls off some cheap but super-effective environmental-storytelling tricks.

IMO they should have started the game with a cold open, so we'd get a taste of the potential that's locked behind the boring slog of the first few hours.
 
I went to extremes: I installed a GT 1030 in my rig and hooked the monitor up to it. That way, the main card got a fully uninterrupted 8192 MB buffer, with all Windows composition work and its associated VRAM load landing on the 1030.
The game still refuses to use anything more than 6.7 GB, and still caps at 7264 MB on the VRAM usage bar.



AC Valhalla, for example, has no problem using 7.7 GB of the full buffer.





I had to run the game at 4K with 130% scaling to get this VRAM consumption, btw.
 
Maybe there's indeed a bug in the memory management.

I can get to 16 GB used out of 6.7 GB available on an RTX 3070 LHR 8 GB.

Assuming the memory usage readout means

currently in use / available
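If the readout really does mean "currently in use / available", those numbers imply heavy oversubscription, with the overflow presumably spilling into system RAM over PCIe. A quick sketch of that arithmetic, using the figures from this post:

```python
# Overcommit arithmetic for the "16 GB used out of 6.7 GB available"
# readout on an 8 GB RTX 3070. Anything beyond the reported VRAM
# budget would have to spill into system RAM (paged over PCIe).

def vram_spill_gb(used_gb: float, available_gb: float) -> float:
    """GB of allocations that cannot fit in the reported VRAM budget."""
    return max(0.0, used_gb - available_gb)

print(round(vram_spill_gb(16.0, 6.7), 1))  # 9.3 GB oversubscribed
```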
 
Maybe there's indeed a bug in the memory management.

I can get to 16 GB used out of 6.7 GB available on an RTX 3070 LHR 8 GB.

Assuming the memory usage readout means

currently in use / available
Funnier still, some people on another platform were convinced it was caused by my 16 GB of RAM and low-end CPU (a 2700). They claimed the game would load textures fine with 32 GB of RAM and a modern Zen 3 CPU, since that's what it recommends.

It didn't take long for me to find a video with a 3070, a 5600X and 32 GB of RAM where textures fail to load in exactly the same fashion.


0:23 — textures around Frey do not load
4:17 — ground textures do not load

Overall in the video, app VRAM usage never goes past 6.4 GB, yet you will see tons of N64-quality textures, despite the user having 32 GB of RAM and a competent CPU. So it clearly has to do with VRAM.

Why do I make a fuss over this? Yes, it is one example. Yes, it is a bad example. BUT IT IS AN example. This is the fifth time I've seen a video game kneecap an 8 GB VRAM budget to 6.4 GB or so. NVIDIA still plans to release a "1440p-capable" 8 GB 4060 Ti, or so I've heard. I feel I must discourage people from buying such products, and I have to back that up with examples. I can't hope to beg like I did with Spider-Man; at least Nixxes were merciful enough to raise the cap from 6.4 GB to 6.9 GB. But even that is comical compared to Valhalla's and Cyberpunk's 7.7 GB cap.

Clearly 7.7 GB can be used without creating problems for the game or the system, but just as clearly, newer games have their own ideas. So in that respect it's pointless to pester developers; just get a card with lots of VRAM instead. And honestly, the 8 GB 4060 Ti rumors startle me at this point. I just want a cheap midrange NVIDIA card with more than 12 GB of VRAM. This can't be hard. I don't want to pay 800 bucks for a 4070 Ti, a properly 4K-capable GPU, knowing it will be kneecapped back to 1440p in a year or two.
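To put those ceilings in perspective, here is the fraction of an 8 GB frame buffer each of the caps mentioned above actually uses (figures taken from this post):

```python
# Fraction of an 8 GB frame buffer used by each observed VRAM cap:
# Forspoken ~6.4 GB, Spider-Man after the patch 6.9 GB,
# AC Valhalla / Cyberpunk 7.7 GB.

CARD_GB = 8.0
caps_gb = {"Forspoken": 6.4,
           "Spider-Man (patched)": 6.9,
           "Valhalla / Cyberpunk": 7.7}

for game, cap in caps_gb.items():
    print(f"{game}: {cap / CARD_GB:.0%} of 8 GB")
# Forspoken: 80%, Spider-Man (patched): 86%, Valhalla / Cyberpunk: 96%
```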
 
Last edited:
I think it's not due to not having enough memory, but due to bugged memory management.

Even when my Forspoken memory usage is higher than the available VRAM, textures still won't load.

It seems the textures are given the wrong priority or something.

Large VRAM simply alleviates the symptoms, but I highly suspect it doesn't actually fix the cause.
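The "wrong priority" theory above can be illustrated with a toy residency manager. This is purely hypothetical, not Forspoken's actual streaming code; the allocation names, sizes, and priorities are made up for illustration:

```python
# Toy sketch of priority-based VRAM residency, illustrating how
# mis-prioritized allocations could starve hi-res texture mips even
# when the physical card has room. Purely hypothetical numbers.

BUDGET_GB = 6.7  # engine's self-imposed cap on an 8 GB card

def resident_set(allocations, budget_gb=BUDGET_GB):
    """Admit allocations highest-priority-first until budget is spent."""
    resident, used = [], 0.0
    for name, size_gb, priority in sorted(allocations,
                                          key=lambda a: -a[2]):
        if used + size_gb <= budget_gb:
            resident.append(name)
            used += size_gb
    return resident

allocs = [
    ("render targets", 2.0, 10),
    ("geometry",       2.5,  9),
    ("low-res mips",   1.5,  8),
    ("hi-res mips",    2.0,  1),  # wrongly given the lowest priority
]

print(resident_set(allocs))
# hi-res mips never fit under the 6.7 GB cap -> stuck on low-res
print(resident_set(allocs, budget_gb=8.0))
# a bigger budget masks the bug without fixing the priorities
```

This matches the symptom described: more VRAM makes the problem disappear, but only because the mis-prioritized allocations happen to fit.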
 
Btw, Forspoken ignores the ini file tweaks that work in FFXV.

So unless someone manages to decrypt the save file or makes a trainer / Cheat Engine table / injector, fixing things via an advanced config or console commands will remain just a dream.
 