Forspoken (Project Athia) [PS5, PC]

Interesting that they sponsored me with a Steam license key, even though in the "reason" field I wrote something along the lines of: I fix broken stuff.

Hopefully this means they actually do care about the PC version and want the best for it, and are okay with people modding/fixing it themselves.
 
If we believe PCGamingWiki:

Source: https://www.pcgamingwiki.com/wiki/Forspoken

The relatively small improvement in loading times with DirectStorage turned off (if that flag is even doing anything) isn't the whole story IMO.

DS is primarily designed to reduce CPU overhead, so if you're testing this on an already very fast CPU at load time, particularly with a slower (PCIe 3.0) NVMe drive, there may be no CPU bottleneck to reduce.

This really needs testing on a range of CPUs, starting with an old quad core. Let's see what difference it makes on a Ryzen 3 3100 with a fast PCIe 4.0 NVMe drive, for example. And crucially, what is the fps difference in gameplay with such a CPU coupled with a high-end GPU, where we know significant CPU limitations exist but the game is also constantly streaming large amounts of data for as yet unknown reasons.
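
For anyone wondering where that CPU overhead actually lives, here is a rough sketch of what an app-side DirectStorage read with GPU decompression looks like. This is illustrative only, not Forspoken's loading code; the file path, the sizes, and the destination buffer are placeholders, and error handling and fence synchronisation are omitted.

```cpp
// Illustrative DirectStorage 1.1 request: the GDEFLATE-compressed blob is read by the
// OS/driver stack and decompressed on the GPU, so the CPU never touches the payload.
// Generic sketch of the public API, not Forspoken's code. Error handling omitted.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
#pragma comment(lib, "dstorage.lib") // import lib from the DirectStorage SDK package
using Microsoft::WRL::ComPtr;

void LoadAssetViaDirectStorage(ID3D12Device* device, ID3D12Resource* destBuffer,
                               UINT32 compressedSize, UINT32 uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/chunk_0001.bin", IID_PPV_ARGS(&file)); // placeholder path

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // GPU decompression
    request.Source.File.Source        = file.Get();
    request.Source.File.Offset        = 0;
    request.Source.File.Size          = compressedSize;
    request.UncompressedSize          = uncompressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit(); // completion would normally be tracked with an ID3D12Fence via EnqueueSignal
}
```

With the GDEFLATE path the decompression cost moves off the CPU, so the win only shows up when the CPU was the bottleneck in the first place, which is exactly why a 3100-class quad core is the interesting test case.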
 
The presentation and world building in this game are its biggest downfall. I haven't seen a game that uses "cut-and-paste" elements from your typical open-world game as redundantly as this one. It makes everything look so mediocre and bland. I'm just going to speculate that the designers are rookies, or it's just plain laziness. Don't even get me started on the story.
Probably nothing to do with lazy devs. The staff at Luminous were probably not equipped to properly create a game of this scope. I would expect a lot more from the characters and story though. Amy freaking Hennig is a big part of the staff on this game. Wtf was she doing?
 
The rumors some people have shared here, that Forspoken's development process was basically Destiny or FFXV all over again, are probably spot on.

It's full of so many things that start to make sense once I think of it as something that went through multiple development reboots, after which the devs simply tried their best to cobble together a viable enough product; they didn't have enough time to polish everything for the final game we got.

So it's actually quite an achievement for the dev team to be able to ship Forspoken at this level of quality.
 
If I were Square Enix, rather than shut down Luminous Productions, I would basically mandate them to create a smaller-scoped game. Yes, Luminous was built for open worlds, but an open world is also something that probably takes a lot of time and a lot of staff to create, even with tons of procedural generation going on.
 
The relatively small improvement in loading times with DirectStorage turned off (if that flag is even doing anything) isn't the whole story IMO.

DS is primarily designed to reduce CPU overhead, so if you're testing this on an already very fast CPU at load time, particularly with a slower (PCIe 3.0) NVMe drive, there may be no CPU bottleneck to reduce.

This really needs testing on a range of CPUs, starting with an old quad core. Let's see what difference it makes on a Ryzen 3 3100 with a fast PCIe 4.0 NVMe drive, for example. And crucially, what is the fps difference in gameplay with such a CPU coupled with a high-end GPU, where we know significant CPU limitations exist but the game is also constantly streaming large amounts of data for as yet unknown reasons.
I can confirm you can't get to the main menu with the GPU decompression command-line argument added.

I second that, we need a lot of data... and also other implementations.
If the texture quality and transition oddities I see in this game with less than 10 GB of RAM become standard in all games, I will need to upgrade sooner than I expected...
Maybe there is a technical justification and the current state of the game is normal, but I'm a bit skeptical of it.
 
Yes, but textures literally turn into N64 ones.

This reminds me of the GTX 770 paradox.

Having only 2 GB of VRAM made the 770 super obsolete in last-gen games, often forcing people to play with N64-level textures.

I really have a hard time understanding why devs never cared to create scalable texture presets for lower amounts of VRAM back then. N64-level textures simply don't make sense, but that's the way it goes.

I'm sure if devs wanted, they could've provided decent-looking textures for 2 GB owners in games such as RDR2. But they didn't care. I wonder if they will care for 8 GB and the like going forward? If not, we're due for an upgrade indeed.
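
As a back-of-the-envelope illustration of why scalable texture presets are cheap to offer: a block-compressed texture's footprint drops roughly 4x for every top mip left out of the package. The sketch below uses generic numbers (BC7 at 1 byte per texel), not any particular game's budgets.

```cpp
// Back-of-the-envelope VRAM cost of a block-compressed texture (1 byte per texel,
// e.g. BC7) with a full mip chain, and what a "drop the top N mips" preset saves.
// Generic numbers, not any particular game's streaming budget.
#include <cstdio>
#include <cstdint>

// Approximate size in bytes of a square BC7 texture including its full mip chain.
uint64_t bc7MipChainBytes(uint32_t topDim)
{
    uint64_t total = 0;
    for (uint32_t dim = topDim; dim >= 4; dim /= 2)   // BC blocks cover 4x4 texels
        total += uint64_t(dim) * dim;                 // 1 byte per texel for BC7
    return total;
}

int main()
{
    const uint32_t presets[] = { 4096, 2048, 1024 };  // hypothetical Ultra/High/Medium top mips
    for (uint32_t dim : presets)
        std::printf("%ux%u top mip: %.1f MB per texture\n",
                    dim, dim, bc7MipChainBytes(dim) / (1024.0 * 1024.0));
    // Roughly 21.3 MB, 5.3 MB and 1.3 MB: each dropped top mip cuts the cost ~4x,
    // which is why a 2 GB-friendly preset is mostly a data-packaging decision.
    return 0;
}
```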
 
Rewatched the intro scenes; a few interesting things:
  • The "city" shown in the portal Frey used to jump into Athia did not show the previous location before the screen faded to black.
  • Cuff says something along the lines of "I'm the thing you've been trying furiously to take off," even though Frey hadn't even noticed she was wearing a bracelet.
Maybe originally, after Frey got the bracelet for the first time, she would do some things on Earth ("I'm gonna take care of everything"), including trying all kinds of ways to remove the cuff.

Then something happened, maybe Frey was cornered or her life was threatened, and the bracelet cast a portal to Athia, hence the cutscenes and dialogue we got when we first land in Athia.
 
Yes, but textures literally turn into N64 ones.

This reminds me of the GTX 770 paradox.

Having only 2 GB of VRAM made the 770 super obsolete in last-gen games, often forcing people to play with N64-level textures.

I really have a hard time understanding why devs never cared to create scalable texture presets for lower amounts of VRAM back then. N64-level textures simply don't make sense, but that's the way it goes.

I'm sure if devs wanted, they could've provided decent-looking textures for 2 GB owners in games such as RDR2. But they didn't care. I wonder if they will care for 8 GB and the like going forward? If not, we're due for an upgrade indeed.
As long as the Series S exists, devs will provide scalable texture settings that actually look good throughout the generation.

But as for PS5-exclusive games, you're right that there's a chance developers won't care about users with 8 GB or less. Hopefully Forspoken is the exception and not the rule. We'll see how Returnal runs; I'm quite hopeful about that one.

If the Forspoken approach becomes a trend, however, I surely hope it gets called out. Most people have 8 GB and 6 GB cards. It'd be awful if a game looked much worse on a 3070 than on the PS5.
 
Here's another video of the weird ray tracing / texture problem. This time it is more apparent and clear.

I just wish a "VRAM consumption bar" were standard in these games. How am I supposed to know my limits from the engine's perspective? Yes, turning off ray tracing in this instance brings the textures back. But what guarantee is there that it won't happen in another scene? Because even without ray tracing, I still get unloaded textures, even at native 1080p. It's just weird to me.
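
On the "VRAM consumption bar" wish: the number such an overlay would need is already exposed to any application through DXGI. A minimal sketch of that query (generic Windows API usage, nothing Forspoken-specific):

```cpp
// Minimal sketch of the DXGI query a "VRAM consumption bar" would be built on.
// It reads what the OS has budgeted for the process vs. what it currently uses.
// Generic Windows API usage, not anything Forspoken exposes.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter1;
    factory->EnumAdapters1(0, &adapter1);        // first adapter; a game would pick its render adapter

    ComPtr<IDXGIAdapter3> adapter3;
    adapter1.As(&adapter3);                      // IDXGIAdapter3 exposes the memory-budget query

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("VRAM: %.2f GB used of a %.2f GB budget\n",
                info.CurrentUsage / (1024.0 * 1024.0 * 1024.0),
                info.Budget       / (1024.0 * 1024.0 * 1024.0));
    return 0;
}
```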



As long as the Series S exists, devs will provide scalable texture settings that actually look good throughout the generation.

But as for PS5-exclusive games, you're right that there's a chance developers won't care about users with 8 GB or less. Hopefully Forspoken is the exception and not the rule. We'll see how Returnal runs; I'm quite hopeful about that one.

If the Forspoken approach becomes a trend, however, I surely hope it gets called out. Most people have 8 GB and 6 GB cards. It'd be awful if a game looked much worse on a 3070 than on the PS5.
What if the Forspoken way ends up being the norm for the Series S too? That would be funny. The Series S userbase seems to adapt to everything; they have no trouble playing games upscaled to 1080p via regular temporal upscaling from 700 to 900p. Meanwhile on PC, a majority of folks find FSR 2 unusable at 1080p and even DLSS questionable at 1080p. Maybe it has to do with standards, but clearly Series S people have lower standards (nothing wrong with that). I'm talking about the usual regular temporal upscaling.

I wonder how Forspoken will end up running on the Series S then. The game has a 2-year exclusivity. Maybe within that time span they will provide some extra texture settings or something. I mean, they should've.

I agree with you. In this case, the problem extends to the 3080 too, or the 3080 Ti and 4070 Ti at 1440p and above. I'm sure with a 4070 Ti at 1440p with ray tracing, similar problems would ensue. So it is simply not a good look for the game and hardware in general.

What actually angers me is that this weird behaviour happens when the game maxes out at a weird 6.4-6.6 GB of VRAM usage. I really feel like 8 GB is getting gimped and borked big time here. Not only was it an amount that barely did the job, now they also put artificial caps in place to kneecap it further.

Somehow Cyberpunk had the ability to use all available VRAM (conveniently) when the highest-end NVIDIA cards at the time were mostly around 8-10 GB (aside from the 3090). I remember most 2020-2021 ray tracing games using the whole available VRAM budget to fit ray tracing and good-quality textures together.

All of a sudden in 2022 and 2023, we get this behaviour with Spider-Man and Forspoken. It just seems a bit fishy too. I don't know what to say. It just... feels wrong. The resource is there. Why not use it? Do they expect me to stream, open 10 tabs of Chrome and run a second screen with Twitch on it all the time or something?
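
For what it's worth, 6.4-6.6 GB on an 8 GB card is suspiciously close to what a fixed-fraction streaming budget would produce. A hedged guess at that kind of heuristic follows; the 80% figure is an assumption for illustration, not a documented Forspoken value.

```cpp
// Hypothetical streaming-pool heuristic: cap the texture pool at a fixed fraction of
// dedicated VRAM. The 0.80 factor is an assumption for illustration, not a documented
// Forspoken value; it just happens to land on ~6.4 GB for an 8 GB card.
#include <cstdio>
#include <cstdint>

uint64_t streamingPoolBytes(uint64_t dedicatedVramBytes, double fraction = 0.80)
{
    return static_cast<uint64_t>(dedicatedVramBytes * fraction);
}

int main()
{
    const uint64_t GiB = 1ull << 30;
    for (uint64_t vram : { 8ull * GiB, 10ull * GiB, 12ull * GiB, 16ull * GiB })
        std::printf("%2llu GB card -> ~%.1f GB texture pool\n",
                    (unsigned long long)(vram / GiB),
                    streamingPoolBytes(vram) / double(GiB));
    // 8 GB -> 6.4 GB, 10 -> 8.0, 12 -> 9.6, 16 -> 12.8: a budget like this would explain
    // a hard ceiling well below the physical VRAM, regardless of what else is free.
    return 0;
}
```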
 
Sigh!

Forspoken doesn't work on RX 400 and RX 500 GPUs, or any AMD GPUs before them (RX 200, RX 300), as they lack the DX12_1 feature level, mainly Conservative Rasterization and Rasterizer Ordered Views. This also locks out all NVIDIA Kepler and Fermi GPUs, but GTX 900 and GTX 1000 series cards can run it, as they support these features through DX12_1.
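
For reference, the requirement boils down to a capability check like the one below at startup; a GPU that can't create a feature level 12_1 device never gets past it. This is a generic D3D12 sketch, not Forspoken's actual code.

```cpp
// Roughly the capability check behind a feature level 12_1 requirement: create a
// device at FL 12_1 (or create at a lower level and query the individual features).
// Generic D3D12 sketch, not Forspoken's startup code.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_1, IID_PPV_ARGS(&device))))
    {
        std::printf("No feature level 12_1 device; this is where such a game bails out.\n");
        return 1;
    }

    // The two features the requirement mostly comes down to:
    D3D12_FEATURE_DATA_D3D12_OPTIONS options{};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options));
    std::printf("Conservative rasterization tier: %d\n", options.ConservativeRasterizationTier);
    std::printf("Rasterizer ordered views:        %s\n", options.ROVsSupported ? "yes" : "no");
    return 0;
}
```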

 
That's bound to happen, nothing unusual or problematic. Eventually games will start requiring DX12_2 too, and I believe when they do, PC ports will get a lot better, using techniques like Sampler Feedback Streaming (SFS) for efficient texture streaming and mesh shaders for better, faster geometry.
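
If and when that shift happens, the equivalent check for the DirectX 12 Ultimate features mentioned above would look something like this (again a generic sketch, not tied to any particular game):

```cpp
// Generic query for the DirectX 12 Ultimate features mentioned above (mesh shaders,
// sampler feedback); not tied to any particular game.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7{};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7))))
    {
        std::printf("Mesh shader tier:      %d\n", options7.MeshShaderTier);
        std::printf("Sampler feedback tier: %d\n", options7.SamplerFeedbackTier);
    }
    return 0;
}
```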
 
Which should make the people (YouTubers/reviewers) who recommend GPUs with lower DX feature levels wary, because as we approach the era of pure current-gen games, those GPUs will be left out.
 
Does Forspoken even use 12_1 features, or is this an arbitrary requirement, similar to games being compiled to require AVX despite not even using the instructions?
 
What if the Forspoken way ends up being the norm for the Series S too? That would be funny. The Series S userbase seems to adapt to everything; they have no trouble playing games upscaled to 1080p via regular temporal upscaling from 700 to 900p. Meanwhile on PC, a majority of folks find FSR 2 unusable at 1080p and even DLSS questionable at 1080p. Maybe it has to do with standards, but clearly Series S people have lower standards (nothing wrong with that). I'm talking about the usual regular temporal upscaling.
PC gamers usually sit closer to the display than console users.
Console users, even on PS5 & XSX, will be OK with lower image quality compared to PC.
So it's not just the XSS. But given the market it's aimed at, I'm sure they're more than happy in general with what they're getting.
 
Sigh!

Forspoken doesn't work on RX 400 and RX 500 GPUs, or any AMD GPUs before them (RX 200, RX 300), as they lack the DX12_1 feature level, mainly Conservative Rasterization and Rasterizer Ordered Views. This also locks out all NVIDIA Kepler and Fermi GPUs, but GTX 900 and GTX 1000 series cards can run it, as they support these features through DX12_1.


Meh, those GPUs are too old to run shit anyway!
 
I'm sure with a 4070 Ti at 1440p with ray tracing, similar problems would ensue.
Nope, no problems here.

Somehow Cyberpunk had the ability to use all available VRAM (conveniently) when the highest-end NVIDIA cards at the time were mostly around 8-10 GB (aside from the 3090). I remember most 2020-2021 ray tracing games using the whole available VRAM budget to fit ray tracing and good-quality textures together.

No more than 7 GB of usage on my 4070 Ti at ultra RT settings.
 