Forspoken (Project Athia) [PS5, PC]

  • Thread starter Deleted member 7537
Played the demo some more, with horrible performance (seriously, why? The game was running smoothly before I got to the bridge)...

And the baffling world design is becoming more and more apparent.

It's not just that buildings look randomly plopped down with no design considerations; the areas around landmarks also weren't properly designed, and enemy placements feel like they were simply dropped in wherever, with no rhyme or reason and no design whatsoever.

It felt like the devs simply looked at an overview map and plopped down enemies until the map looked filled with enough of them, without any design considerations beyond the most simplistic decisions...


"Oooh a bridge! Must plop enemies!",


"Humm a fort! Plop plop plop!"


"Uh, now the bridge and fort look empty, better plop some more enemies!"

Plop plop plop

Edit:

Basically, the world design feels very similar to Genshin Impact.

And Genshin Impact tried to copy Zelda: Breath of the Wild but failed miserably; it's completely missing the design.
 
Btw, the music is totally not to my taste. The battle music didn't get my heart pumping.

Better with music disabled, I imagine. Haven't tried it.
 
lol, it most definitely is not a 1728p average. Just do a run and check how the main character looks; most definitely not 1728p.
Presumably at 30Hz, and dropping into the low-20fps range at points. John at DF said the game generally runs at the bottom end of its DRS window (1080p) in the 40Hz mode.
VGTech corroborates John's findings:

PS5 in the 40fps quality mode seems to have the same resolution setup as the 30fps quality mode but the average rendering resolution is lower as the DRS system is targeting 40fps. The 40fps quality mode seems to render at 1920x1080 much more often than the 30fps quality mode does.

I really doubt the average is 1728p. The DRS is apparently very aggressive though, so getting an average pixel count seems like a tall task.
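For some rough context on why that claim seems implausible, here's a quick pixel-count comparison. This is a throwaway sketch; the 16:9 render targets and the 4K DRS ceiling are my assumptions, not confirmed figures:

```cpp
#include <cstdio>

int main()
{
    // Assuming standard 16:9 render targets for the resolutions discussed above.
    const double p1080 = 1920.0 * 1080.0; // DRS floor the game reportedly sits at
    const double p1728 = 3072.0 * 1728.0; // the claimed "1728p average"
    const double p2160 = 3840.0 * 2160.0; // assumed DRS ceiling (4K output)

    printf("1728p has %.2fx the pixels of 1080p\n", p1728 / p1080);  // ~2.56x
    printf("1728p is %.0f%% of native 4K\n", 100.0 * p1728 / p2160); // ~64%
    return 0;
}
```

An average of 1728p would mean the game spends most of its time far above the 1080p floor it reportedly hugs, which doesn't square with the DF and VGTech observations.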
 

Damn, seems like they had a rough time. Could barely get above 40fps without DLSS.

Just attempted a like-for-like comparison on my 4070 Ti, as the second area where they compare performance is available in the demo. It's not straightforward though, unfortunately, as my CPU limitations come into play.

With everything set to max except RTAO (which I assume is a rough match for the PS5 settings in the 40Hz quality mode), I'm generally in the high 60s to low 70s at either 1080p (where I'm clearly CPU limited) or 1440p (where my GPU load is generally in the high-90% utilisation range). So assuming the PS5 is running at 1080p in that scene (and it's below the 40fps target for much of it, so that stands to reason), I very (very) roughly seem to be getting around 70% more fps but also with around 80% more resolution.

That's without the game ready drivers installed. I'll try those out later.
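To put rough numbers on the "more fps at more resolution" comparison, here's a back-of-the-envelope calculation. The figures are hypothetical round numbers based on the assumptions above (PS5 at ~1080p and a 40fps target in that scene, the 4070 Ti at 1440p and ~70fps), not measured data:

```cpp
#include <cstdio>

int main()
{
    // Hypothetical round numbers, roughly matching the comparison above.
    const double ps5_pixels = 1920.0 * 1080.0; // assumed PS5 resolution in that scene
    const double pc_pixels  = 2560.0 * 1440.0; // 4070 Ti at 1440p
    const double ps5_fps    = 40.0;            // 40Hz-mode target (often missed)
    const double pc_fps     = 70.0;            // midpoint of "high 60s to low 70s"

    printf("Resolution advantage: %.0f%% more pixels\n",
           100.0 * (pc_pixels / ps5_pixels - 1.0));        // ~78%
    printf("Frame-rate advantage: %.0f%% more fps\n",
           100.0 * (pc_fps / ps5_fps - 1.0));              // ~75%
    printf("Rough pixel-throughput ratio: %.2fx\n",
           (pc_pixels * pc_fps) / (ps5_pixels * ps5_fps)); // ~3.1x
    return 0;
}
```

Obviously that ignores CPU limits, DRS and settings differences, so treat it as nothing more than a ballpark.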
 
A meta score around 70 may not be a blockbuster rating, but that range always seems to fall into "you may or may not enjoy the game" territory.
Trouble is, nowadays people will see that as not worth giving it a chance.

I assume that the only difference between DS1.0 and DS1.1, implementation-wise, is having to compress with GDeflate.
Being the first DS title, they may have had early access to it. Still, the work may not have been worth taking on so close to release.
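For what it's worth, the API-side change really does look that small. Here's a minimal sketch of a DirectStorage 1.1 read request, assuming the asset was packaged with GDeflate; this is illustrative only, not Forspoken's actual loading code, and the D3D12 device, queue and destination buffer setup is taken as given:

```cpp
#include <dstorage.h>    // DirectStorage 1.1 headers
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Enqueue one GDeflate-compressed file into a GPU buffer. Apart from
// Options.CompressionFormat, the request looks the same as a DS 1.0 one.
void LoadGDeflateAsset(IDStorageFactory* factory,
                       IDStorageQueue*   queue,
                       ID3D12Resource*   destBuffer,
                       const wchar_t*    path,
                       UINT32            compressedSize,
                       UINT32            uncompressedSize)
{
    ComPtr<IDStorageFile> file;
    factory->OpenFile(path, IID_PPV_ARGS(&file));

    DSTORAGE_REQUEST request = {};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // the DS 1.1 part

    request.Source.File.Source = file.Get();
    request.Source.File.Offset = 0;
    request.Source.File.Size   = compressedSize;

    request.UncompressedSize            = uncompressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit(); // decompression runs on the GPU when the request is processed
}
```

The bigger lift is on the content pipeline side, where assets have to be (re)compressed with GDeflate, which is presumably why retrofitting it this close to release isn't appealing.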
 

Their loss. I play and enjoy lots of 70-score games. I dip into the 60s sometimes too.
 
The first 2-3 hours of "gameplay" look absolutely horrible with how often the game takes control away from you and how little you're actually doing. Even past that part, the gameplay doesn't seem worth the cost. It doesn't seem to be anywhere near on par with other 60-70 score games. But that's just me.
 
Another game that artificially caps VRAM usage at around 80-85% of the total budget, just like Spider-Man (which I was eventually proven right about, and Nixxes increased the cap).


VRAM usage on an 8 GB card practically never goes above 6.8-6.9 GB, despite the background being completely free of other applications. I see similar behaviour on 12 GB cards (with an uncluttered desktop, per-game VRAM usage never breaches 9.6 GB, and total VRAM usage caps out around 10.5 GB at best).

This behaviour seems to be more frequent in newer titles, and it spells doom for already-VRAM-limited cards. I wish we could somehow spoof the VRAM amount the card reports to overcome such artificial caps and limitations. Otherwise it's sad: by this logic, all 12 GB cards have only a 9.6-10 GB usable budget, 10 GB cards have 8.7-8.9 GB usable, and so on for such games.

I HAVE FREE VRAM, DAMMIT. Use it.
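Just to spell out what an ~80-85% cap would mean across common VRAM sizes, here's a trivial sketch; the flat-percentage cap is my reading of the figures above, not something confirmed by the developers:

```cpp
#include <cstdio>

int main()
{
    // Usable budget if a game caps itself at ~80-85% of physical VRAM,
    // which is what the numbers above suggest. Purely illustrative.
    const double vramSizesGB[] = { 8.0, 10.0, 12.0, 16.0 };

    printf("%-10s %-10s %-10s\n", "VRAM (GB)", "80% cap", "85% cap");
    for (double vram : vramSizesGB)
        printf("%-10.0f %-10.1f %-10.1f\n", vram, vram * 0.80, vram * 0.85);
    return 0;
}
```

The 6.8 GB ceiling seen on 8 GB cards and the ~9.6 GB seen on 12 GB cards both land in roughly that 80-85% band.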
 

Attachments

  • zpa.png
Maybe a workaround for the VRAM-full bug?

Like in Cyberpunk 2077 and The Witcher 3 next-gen, where performance drops by around 50% after the memory has been running at full tilt for a while with RT enabled.

Always fixable by simply going to the title screen and reloading the save game.
 

Yes, I noticed this myself. The game reports using about 9.6 GB out of 10.7 GB, but I have 12 GB.
 
My understanding is that Win10 (or WDDM after some version) has two behaviors: it reserves an amount of VRAM (fixed? some variability) that is not accessible to any application/process, and it also limits the maximum amount of VRAM any single process can access (I'm not sure if this is fixed or a percentage of total VRAM). This applies to the primary graphics adapter. It also isn't strictly related to gaming; any application that uses the GPU for compute will run into this, which is why for those use cases using another graphics card as your display adapter and/or switching to Linux is sometimes employed.

The other part is that DX12, from my understanding, shifts most of the memory management onus onto the application and therefore onto developers. Improper management can also result in more critical faults (including crashes) compared to how things worked prior to DX12, when there was a greater amount of driver-side management from the IHV. This has implications for how developers choose to approach VRAM management.

There's also an issue with how eviction is handled, which matters when people are testing, as the typical test is shorter (much shorter) than typical actual playtimes. That would explain why there are more anecdotal reports of possible VRAM-related issues from players than game tests show.

As another aside, modern games (and I use this term loosely) often don't rely on a fixed load profile and strict adherence to settings, which means that even if monitoring software indicates VRAM limits are being respected, the game might be achieving that by silently managing what content is actually loaded compared to another GPU with more VRAM.
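For anyone curious about the first of those behaviors: the OS exposes a per-process "budget" through DXGI that is already below the card's physical VRAM, and well-behaved DX12 titles are supposed to stay under it. A minimal query looks something like this (standard DXGI calls, nothing Forspoken-specific; the "budget vs physical VRAM" interpretation is the point being illustrated, not a claim about this game's code):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter); // primary display adapter

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    // Budget = how much local (on-card) memory the OS currently lets this
    // process use; it is typically noticeably less than physical VRAM.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    const double GiB = 1024.0 * 1024.0 * 1024.0;
    printf("OS-granted budget: %.2f GiB\n", info.Budget / GiB);
    printf("Current usage:     %.2f GiB\n", info.CurrentUsage / GiB);
    return 0;
}
```

A game that sizes its streaming pools from this Budget value, rather than from the physical VRAM amount, would show exactly the "never touches the last 1-2 GB" behavior described above.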
 

Up to 10% on a 4090 without GPU decompression? Sure, I predicted a high performance cost, but that was with GPU decompression. What about the much weaker GPUs, then (the ones most people actually own)?
 


I'm not understanding the sense behind this. It implies that IN-GAME streaming exceeds the bandwidth limits of a SATA SSD. That in itself would be very strange; however, if that were the case, it should also show up as some kind of (presumably significantly) degraded experience when running the game from a SATA SSD. But we've heard no reports of that?

Up to 10% on a 4090 without GPU decompression? Sure, I predicted a high performance cost, but that was with GPU decompression. What about the much weaker GPUs, then (the ones most people actually own)?

If it's not using GPU-based decompression, then there should not be any increased overhead on the GPU. The reduced fps in that benchmark could just as easily be additional CPU overhead due to an increased CPU-based decompression burden. If that were the case, then GPU-based decompression would actually alleviate it.
 
Forspoken uses GPU decompression, and the GPU usage shows why it was not an option on PS5 and Xbox Series. They needed to use a hardware decompressor.
 
I don't know how or why it happens, but I can't get normal-quality textures in any way, shape or form with an 8 GB budget. I tried the low, medium, high and ultra memory budget settings; none of them truly works. I cannot get any decent textures out of the game. For starters, I know the game has some really detailed textures from the PS5 version. I'M AT 1080P, MIND YOU! Not even 1440p.

There should've been a proper texture quality sub-preset geared towards 8 GB and lower cards. You cannot simply ignore the majority of the PC userbase and hope that no one complains. Half of the complaints on the Steam forums are about textures not loading or looking weird. That also gives the impression that the game just looks like that, since most people do not know the game uses a streaming technique.

I know the PS5 has a higher budget, but it has to be around 10 GB, no? Can it really use the whole usable 13.5 GB for purely GPU-bound operations? Does it really have no regular RAM-like data to fit into that buffer? I always thought that, the way the Xbox Series X has its memory set up, 10 GB was meant for GPU operations (so, normal VRAM) and 3.5 GB was meant for CPU operations (so, normal RAM on PC). If that is the case, how come being only 2 GB short of the PS5's intended budget results in worse-than-PS2 textures? How is this a solution? Is this practically what 8 GB users have to endure this gen? If so, RIP; PC gaming might as well end. I'm not sure devs can recoup their investments, let alone earn money, with only the 16+ GB card userbase on PC, because even 12 GB cannot guarantee high-quality textures in this game, from the looks of it.

What angers me further is that THIS happens when the game barely uses 6.5 GB of an 8 GB budget (internally 6.2 GB). This is exactly how Spider-Man behaved at launch: it uses 6.4-6.5 GB at first, then nerfs itself to 6.2 GB and stays there. Imagine this behaviour on 6 GB cards; further chaos would ensue.


weird1.png


I just feel like these devs are disconnected from the reality of what hardware people do and don't have.
 
I know the PS5 has a higher budget, but it has to be around 10 GB, no? Can it really use the whole usable 13.5 GB for purely GPU-bound operations?

Minor nitpick: the PS5 only provides 12.5 GB of total memory to games.
 
Minor nitpick: the PS5 only provides 12.5 GB of total memory to games.
Yes, but can we really assume it's all purely VRAM operations? Surely there has to be some physics, sound and similar data that doesn't use VRAM on PC but has to live in that pool on consoles? Am I wrong about this? There has to be some kind of data that can safely live only in RAM and doesn't have to exist in VRAM on PC, but has to come out of the total memory budget on consoles. That's how I assumed the 10 GB of the 3080 would be fine for PS5 ports. If not, this is very problematic for the future of PC gaming, since you-know-who is adamant about not letting a 16 GB budget hit the midrange market. I always assumed the 4070 would be the point where they give up and give it 16 GB, but now, with the 12 GB 4070 Ti, I've lost that hope.
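As a purely speculative back-of-the-envelope version of that reasoning: the 12.5 GB total comes from the post above, while the CPU-side share below is a guess for illustration, not a known figure.

```cpp
#include <cstdio>

int main()
{
    // Speculative split of a console's game-visible memory into "VRAM-like"
    // and "RAM-like" data. Only the 12.5 GB total comes from the thread.
    const double gameVisibleGB     = 12.5;
    const double assumedCpuShareGB = 3.0; // pure assumption for illustration

    printf("Implied GPU-side budget on console: %.1f GB\n",
           gameVisibleGB - assumedCpuShareGB); // ~9.5 GB
    printf("Capped budget on a 12 GB PC card (80%%): %.1f GB\n",
           12.0 * 0.80);                       // 9.6 GB
    return 0;
}
```

If a split anywhere near that is realistic, a 10-12 GB card shouldn't be hopeless for PS5 ports, which would make the texture situation here look more like a streaming/budgeting issue than a hard VRAM wall.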
 