Forspoken (Project Athia) [PS5, PC]

Yes, but can we really assume it's purely VRAM operations? Surely there has to be some physics, sound and similar data that doesn't use VRAM on PC but has to live in the unified memory on consoles? Am I wrong on this? There has to be some kind of data that can safely live in system RAM alone and doesn't need to exist in VRAM on PC, but has to come out of the total memory budget on consoles. That's how I assumed the 3080's 10 GB would be fine for PS5 ports. If not, this is too problematic for the future of PC gaming, since you-know-who is adamant about not letting a 16 GB budget reach the midrange market. I always assumed the 4070 would be the point where he finally gives in and grants it 16 GB, but with the 12 GB 4070 Ti, I've lost that hope.

There is certainly other stuff in the PS5's RAM besides VRAM-type data.
 
Absolutely not. I would think the game code uses a couple of gigs, and then you have general heap memory for objects and runtime world-state data. So that 12.5 GB is down to maybe 10.5 GB of GPU use, or even less. I would assume it should fit within 10 GB of VRAM on PC for GPU use.

Yes, a lot of assumptions, but realistic ones.
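To put rough numbers on that reasoning, here is a minimal C++ sketch; every figure in it is an assumption taken from this thread, not an official or measured number.

#include <cstdio>

// Assumed figures from the posts above, not official numbers.
constexpr double usable_unified_gb    = 12.5; // PS5 unified memory left for the game after the OS share
constexpr double code_and_cpu_heap_gb = 2.0;  // game code plus world-state / audio / physics heaps
constexpr double gpu_style_budget_gb  =
    usable_unified_gb - code_and_cpu_heap_gb; // ~10.5 GB effectively acting as "VRAM"

int main() {
    std::printf("Estimated GPU-style budget on PS5: %.1f GB\n", gpu_style_budget_gb);
    std::printf("Headroom vs. a 10 GB RTX 3080:     %.1f GB\n", gpu_style_budget_gb - 10.0);
    return 0;
}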
 
The "Inn" spread all across the world feels so haphazardly placed. It also look as if it was simply teleported there instead of something that was built there.

The small villages also got the same issue

Does the break randomly move things or something? Does the game explains why the world is like that?
 
The "Inn" spread all across the world feels so haphazardly placed. It also look as if it was simply teleported there instead of something that was built there.

The small villages also got the same issue

Does the break randomly move things or something? Does the game explains why the world is like that?

It's the first time they've used procedural tools in the Luminous Engine. I suppose they went too far with procedural placement.
 
Played the demo.
The music is good and the combat is kind of fun.
Everything else feels unfinished, almost to the point of a proof-of-concept world map...
 
Here's some weird behaviour: 1080p DLSS Quality, clean background, standard texture streaming, ray tracing on. 6 GB of VRAM usage and the textures keep dancing. With ray tracing off, they load as usual (though some of them will still dance between PS2- and PS5-quality textures; it's inevitable).

It seems impossible to enable ray tracing on an 8 GB budget in this game, even at a 720p internal resolution upscaled to 1080p, despite the GPU being MORE than capable of it.


We have Jensen to thank for that. Sure, go ahead: do the 8 GB 4060 with frame generation, then the 8 GB 4060 Ti on a 128-bit bus. Yeah, 10 GB for the 4070 is totally viable and justifiable, why not, with texture streaming techniques like this.
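For what it's worth, a PC engine can query exactly how much VRAM the OS is actually granting it, which is the number a texture streamer has to respect rather than the 8 GB printed on the box. A minimal sketch using the standard DXGI budget query (generic code, not anything taken from Forspoken):

#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // "Budget" is what the OS currently lets this process use of local VRAM;
    // on an 8 GB card it is typically well below 8 GB once the desktop, other
    // applications and the driver have taken their share.
    std::printf("VRAM budget: %.2f GB, current usage: %.2f GB\n",
                info.Budget / (1024.0 * 1024.0 * 1024.0),
                info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
    return 0;
}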
 
I don't know how or why it happens, but I can't get normal-quality textures in any way, shape or form with an 8 GB budget. I tried the low, medium, high and ultra memory-budget settings; none of them truly works. I cannot get any decent textures out of the game. For starters, I know the game has some really detailed textures from the PS5 version. I'M AT 1080P, MIND YOU! Not even 1440p.

There should've been a proper texture-quality sub-preset geared towards 8 GB and lower cards. You cannot simply ignore the majority of the PC userbase and hope that no one complains. Half of the complaints on the Steam forums are about textures not loading or looking weird. That also creates the impression that the game simply looks like that, since most people don't know it uses a streaming technique.

I know the PS5 has a higher budget, but it has to be around 10 GB, no? Can it really use the entire usable 13.5 GB purely for GPU-bound operations? Does it really have no regular RAM-like workloads to fit into that buffer? I always thought that, the way the Xbox Series X has its memory set up, 10 GB was meant for GPU operations (so normal VRAM) and 3.5 GB was meant for CPU operations (so normal RAM on PC). If that is the case, how does being only 2 GB short of the PS5's intended budget result in worse-than-PS2 textures? How is this a solution? Is this practically what 8 GB users have to endure this gen? If so, RIP. PC gaming might as well end. I'm not sure devs can recoup their investments, let alone earn money, from only the 16+ GB card userbase on PC, because even 12 GB can't guarantee high-quality textures in this game, from the looks of it.

What angers me further is that THIS happens when the game barely uses 6.5 GB of an 8 GB budget (6.2 GB internally). This is exactly how Spider-Man behaved at launch: it uses 6.4-6.5 GB at first, then nerfs itself down to 6.2 GB and stays there. Imagine this behaviour on 6 GB cards; further chaos would ensue.


View attachment 8176


I just feel like these devs are disconnected from the reality of what hardware people actually have and don't have.
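As an illustration of what such a sub-preset could do, here is a hypothetical C++ sketch of a fallback that drops whole mip levels until the texture pool fits the budget, instead of leaving textures unloaded. This is not how Forspoken's streamer works, and all the numbers in it are assumptions.

#include <cstdint>
#include <cstdio>

constexpr uint64_t GiB = 1024ull * 1024ull * 1024ull;

// Each dropped mip level roughly quarters the memory footprint of 2D textures.
uint64_t PoolSizeAtMipDrop(uint64_t full_pool_bytes, int mips_dropped) {
    uint64_t size = full_pool_bytes;
    for (int i = 0; i < mips_dropped; ++i) size /= 4;
    return size;
}

// Pick the smallest global mip bias whose pool still fits the VRAM budget.
int ChooseMipBias(uint64_t vram_budget_bytes, uint64_t full_pool_bytes) {
    for (int bias = 0; bias < 4; ++bias) {
        if (PoolSizeAtMipDrop(full_pool_bytes, bias) <= vram_budget_bytes)
            return bias;
    }
    return 4; // give up and use the smallest mips
}

int main() {
    // e.g. a 9 GB "PS5-quality" texture pool vs. the ~6.5 GB the game seems
    // willing to touch on an 8 GB card (both figures assumed from the post above).
    const uint64_t budget = static_cast<uint64_t>(6.5 * GiB);
    const uint64_t pool   = 9 * GiB;
    std::printf("Suggested global mip bias: %d\n", ChooseMipBias(budget, pool));
    return 0;
}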
I agree, the texture quality is horrendous on 8 GB and 6 GB cards. It looks like something straight out of the N64 era; this really shouldn't be happening.
 
That's quite ugly. Are those low-res textures eventually replaced by higher-res ones?
 
DSO tried the game's RT implementation, and they say it's the worst they've seen so far.

However, this is the worst Ray Tracing implementation we’ve seen so far. Below you can find some comparison screenshots. The RT screenshots are on the left and the non-RT screenshots are on the right. And, we’re not exaggerating here, they look exactly the same. The game might have some areas in later environments that look better. However, and during our tests, these RT effects are as underwhelming as they can get. I mean, even the RT AO does not make any difference at all in the New York area. What a disappointment.

So: bad baked GI, bad SSAO, bad ray tracing and bad texture streaming. What else?
 
Up to 10% on a 4090 without GPU decompression? Sure, I predicted a high performance cost, but that was with GPU decompression. What about the much weaker GPUs, then (the ones most people actually own)?
Indeed. I wish they had benchmarked a range of cards instead of just a 4090; that way you'd actually get a trend rather than a single data point before people jump to conclusions about the impact of DirectStorage on GPU availability.
 
It's being talked about in the next-gen NVMe thread. It has nothing to do with GPU decompression (the game doesn't seem to use it); the benchmark is just counting the very high frame rate "fade to black" cuts in between scenes (which last longer on SATA SSDs) towards the overall average.
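To illustrate how much those cuts can skew an average, here is a small sketch; all the frame times in it are invented for the example and are not measurements from the benchmark.

#include <cstdio>
#include <vector>

double AverageFps(const std::vector<double>& frame_times_ms) {
    double total_ms = 0.0;
    for (double t : frame_times_ms) total_ms += t;
    return frame_times_ms.size() / (total_ms / 1000.0); // frames per second
}

int main() {
    // 10 s of "gameplay" at ~60 fps (16.7 ms frames)...
    std::vector<double> gameplay(600, 16.7);

    // ...plus 2 s of fade-to-black at ~500 fps (2 ms frames) while the next
    // area streams in. A SATA SSD spends longer in this state, so it feeds
    // even more of these cheap frames into the average.
    std::vector<double> with_fade = gameplay;
    with_fade.insert(with_fade.end(), 1000, 2.0);

    std::printf("gameplay-only average: %.1f fps\n", AverageFps(gameplay));  // ~59.9
    std::printf("average incl. fades:   %.1f fps\n", AverageFps(with_fade)); // ~133
    return 0;
}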
 
If we believe PCGamingWiki:
It is dubious whether the game uses GPU decompression, as it exhibits no difference from the default behaviour when using the -noGPUDecompression command line argument, and outright crashes when forced to use GPU decompression with -useGPUDecompression. In most situations the game barely sees a difference in load times when disabling DirectStorage with -noDirectStorage or outright removing the DLL files.[2]
source : https://www.pcgamingwiki.com/wiki/Forspoken
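For context, a switch like -noGPUDecompression would plausibly map to the DirectStorage runtime configuration, which has to be set before the factory is created. A rough sketch of that generic API usage (an assumption about how such a flag is wired up, not Forspoken's actual code):

#include <dstorage.h>
#include <wrl/client.h>
#pragma comment(lib, "dstorage.lib")

using Microsoft::WRL::ComPtr;

HRESULT InitDirectStorage(bool disableGpuDecompression,
                          ComPtr<IDStorageFactory>& factory) {
    DSTORAGE_CONFIGURATION config{};
    // Fall back to the CPU decompression path when asked to.
    config.DisableGpuDecompression = disableGpuDecompression;

    HRESULT hr = DStorageSetConfiguration(&config);
    if (FAILED(hr)) return hr;

    return DStorageGetFactory(IID_PPV_ARGS(&factory));
}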
 
The presentation and world building in this game are its biggest downfall. I haven't seen a game copy-paste from the typical open-world template as redundantly as this one does. It makes everything look so mediocre and bland. I'm just going to speculate that the designers are either rookies or just plain lazy. Don't even get me started on the story.
 
Forspoken uses GPU decompression, and the GPU usage shows why that was not an option on PS5 and Xbox Series: they needed a hardware decompressor.

Maybe with low-end hardware like that, yes. However, I think DirectStorage would still show its advantages on that class of GPU.
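For reference, this is roughly what a GPU-decompressed DirectStorage read looks like on PC (a generic sketch of the public API, not code taken from Forspoken): the GPU performs the GDeflate work that a console would hand to its hardware decompressor.

#include <dstorage.h>
#include <d3d12.h>
#include <cstdint>

void EnqueueCompressedRead(IDStorageQueue* queue,
                           IDStorageFile* file,
                           ID3D12Resource* destBuffer,
                           uint32_t compressedSize,
                           uint32_t uncompressedSize) {
    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;

    request.Source.File.Source = file;
    request.Source.File.Offset = 0;
    request.Source.File.Size   = compressedSize;   // bytes on disk
    request.UncompressedSize   = uncompressedSize; // bytes after GPU decompression

    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;

    queue->EnqueueRequest(&request);
    // The caller still has to Submit() the queue and wait on a fence or status array.
}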
 
Watching the original GDC presentation on the game, it appears to have gone through some big changes.

The presentation talks about using FSR 1.0, but the game uses FSR 2.0.

The presentation talks about GPU decompression being added in the future, which they've since done.

The presentation also says they use VRS Tier 1.
 