Final Fantasy XV [XO, PS4, PC]

Are we trying to argue that hairworks in Witcher 3 was well received? As far as I remember, it was critically panned and to this day pretty much all reviews turn it off when doing comparisons.

Just like in Witcher 3, the game automatically sets the tessellation level to x64, which tanks performance on both nvidia and AMD GPUs for no visual benefit over x32 or even x16, because it forces the GPUs to process sub-pixel triangles.
It's just that it breaks AMD performance more, making nvidia GPUs look better in comparison.
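To put rough numbers on the sub-pixel point, here's a back-of-the-envelope sketch (the 40x40 px on-screen patch size is an illustrative assumption, not a measured value from either game):

```cpp
#include <cstdio>

// Screen area per triangle for an NxN-tessellated quad patch.
// GPUs rasterize in 2x2-pixel quads, so triangles much smaller than
// ~1 px^2 throw away most of their shading work.
int main() {
    const double patch_area_px = 40.0 * 40.0;  // assumed on-screen patch size
    for (int factor : {16, 32, 64}) {
        double tris = 2.0 * factor * factor;   // ~2*N*N triangles per patch
        printf("x%-2d -> ~%5.0f tris, ~%.3f px^2 each\n",
               factor, tris, patch_area_px / tris);
    }
    return 0;
}
```

That prints roughly 3.1 px² per triangle at x16, 0.8 at x32 and 0.2 at x64 - i.e. everything past x16 is already around or below a pixel, which is why x64 buys nothing visually.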

The ultra tessellated invisible cows are just another example to join the legendary ultra tessellated concrete slab and ultra tessellated invisible ocean in Crysis 2.


It's mind-boggling that nvidia somehow still feels the need to resort to these tactics in their proprietary frameworks, when they clearly have the upper hand in hardware performance and performance/watt.
I wonder if the negative press they get from these things is worth it. They keep doing it, so I guess it is.


Not that it matters right now, though. Gamers aren't buying graphics cards.
No, my point, as context for the post I was responding to, is that one can buy Witcher 3 and not be put off by Hairworks.
Regarding tessellation, the benchmark options (such as Terrain) only cost GPUs 1-2 fps according to GamersNexus, but yeah, devs need to change that from the x64 default if it is also in this demo and the game; publications really need to keep pressing the dev studio on the background and on why it is left at x64 when it can easily be changed.

Anyway, like I said, the issue with Hairworks in Witcher 3 was that close-ups of Geralt's hair caused massive calculations. Many top-tier Nvidia owners wanted an option where animals still had Hairworks but it was always disabled for Geralt; they eventually got a mod solution of sorts, albeit not a perfect one, and it can still trigger the issue.
But many bought the game and turned it off without any issues, and the game does not penalise AMD generally, as the results for Vega and Polaris GPUs show when Hairworks is disabled and compared to Nvidia (the context being that Gameworks as a "black box" is not impeding AMD; what will impede AMD is having to play catch-up optimising for FFXV in the same way Nvidia has recently had to for a few titles).

The FFXV demo differs from Witcher 3 in that Hairworks is off on the characters but on for the animals; yes, it is bugged, but the point is they may actually have a better implementation in FFXV if they overcome the bug, which may just come down to settings that are limited in the benchmark. Sure, this option will probably only be used by those with top-tier Nvidia GPUs, and I doubt it will be at 4K unless capping fps, but it will be interesting to see it tested in the game.
In an update, GamersNexus said this regarding the LOD bug:
For now, it appears as if the issue of LOD scaling will be addressed for the final launch. These issues included rendering unseen objects (without GameWorks) and rendering incorrect LOD for HairWorks effects. As we stated in the original piece, we believed this to be an oversight, rather than malice, and attributed the issue to tight development timelines at Square Enix.
 
No, my point, as context for the post I was responding to, is that one can buy Witcher 3 and not be put off by Hairworks.

Of course you can. The largest difference here is that Witcher 3 was Hairworks' first offense, so the devs were largely excused. Before that, only CoD: Ghosts had used it, on the dog in the single-player campaign, and few people in the PC arena really cared about that game's performance.

I'll be surprised if CD Projekt Red uses gameworks again for Cyberpunk, as its use in Witcher 3 was perhaps the largest criticism levelled at the otherwise universally acclaimed game.

With FFXV, Square Enix definitely knew what they were getting into, as they participated with nvidia in a lot of co-promotion, and the Witcher 3 debacle (spurred by AMD themselves) had already grown to sizeable proportions.

That said, I understand if someone let it pass in Witcher 3 but decides to skip FFXV on principle.
 
After trying the benchmark, even with its hiccups, I found it ran decently in Standard mode on my PC. It also looks gorgeous, hence I'm on the fence about getting it for PC as well (I have it for PS4). I'll be getting a 4K HDR TV in the next 3 or 4 months and I'm curious about how it looks with 4K checkerboarding and HDR on the PS4 Pro. If it does not look like a decent visual upgrade over the vanilla PS4, I might be tempted to get it for PC, even without HDR.
 
I played the game on the standard PS4 with a 1080p plasma TV and then on the PS4 Pro with a 4K HDR TV.
The difference is pretty big. None of what I've seen in press pictures for the PC version seems to stand out in comparison to the PS4 Pro version, IMO.

Then again, the Pro runs it at 30 FPS, so I guess a PC with a GTX 1080 can run it at 60 FPS or more.
 
I played the game on the standard PS4 with a 1080p plasma TV and then on the PS4 Pro with a 4K HDR TV.
The difference is pretty big. None of what I've seen in press pictures for the PC version seems to stand out in comparison to the PS4 Pro version, IMO.

Then again, the Pro runs it at 30 FPS, so I guess a PC with a GTX 1080 can run it at 60 FPS or more.

Well, if the PC Standard setting does not look better than the PS4 Pro then it's pointless for me; my PC is a bit weak for High.
 
I played the game on the standard PS4 with a 1080p plasma TV and then on the PS4 Pro with a 4K HDR TV.
The difference is pretty big. None of what I've seen in press pictures for the PC version seems to stand out in comparison to the PS4 Pro version, IMO.

Then again, the Pro runs it at 30 FPS, so I guess a PC with a GTX 1080 can run it at 60 FPS or more.
Or run it at 30 fps with the various Gameworks libraries enabled *shrugs* - although top-tier cards may manage quite a few options at 60 fps depending upon resolution, considering GamersNexus tested with a 1070 while the benchmark was bugged.
Like I said, this will still appeal to quite a few Nvidia owners, depending upon the flexibility of the configuration and whether a player wants the utmost fps or some of those visuals.
 
Of course you can. The largest difference here is that Witcher 3 was Hairworks' first offense, so the devs were largely excused. Before that, only CoD: Ghosts had used it, on the dog in the single-player campaign, and few people in the PC arena really cared about that game's performance.

I'll be surprised if CD Projekt Red uses gameworks again for Cyberpunk, as its use in Witcher 3 was perhaps the largest criticism levelled at the otherwise universally acclaimed game.

With FFXV, Square Enix definitely knew what they were getting into, as they participated with nvidia in a lot of co-promotion, and the Witcher 3 debacle (spurred by AMD themselves) had already grown to sizeable proportions.

That said, I understand if someone let it pass in Witcher 3 but decides to skip FFXV on principle.
But it is optional, and Nvidia does have a dominant position in dGPUs and in the enthusiast tiers, so why not provide more visually/game-immersive options (my complaint is that Gameworks is too inefficient).
Not sure how one can accept it in Witcher 3, where it can be disabled, but then say it is OK to boycott FFXV on principle, where it is also optional and can be disabled (if this bug only happens in the benchmark).
Especially if FFXV actually implements more Hairworks options, such as animals only, which is what most gamers wanted in Witcher 3 - those that own Nvidia cards and played around with the Hairworks options, anyway.

The backlash for Witcher 3 was caused by AMD complaining that Gameworks is a "black box" that impedes their performance generally, rather than by Hairworks itself, which could be disabled; the same reasoning was used for another game that did not use Hairworks but was associated with Nvidia Gameworks - it may have been PhysX running on the CPU.
However, as we know, the complaint from AMD is not entirely fair: with those options disabled, and once AMD had time to optimise for the game, both Polaris and Vega actually perform very well - equal to, if not a little better than, comparable reference Nvidia cards.
As I said, a lot of Nvidia owners wanted to be able to use Hairworks in Witcher 3 BUT with it truly disabled on Geralt, as it is his close-ups that end up causing massive calculations and make it unusable even for Nvidia gamers.
The backlash should really have been about the default tessellation setting, along with why more options for managing Hairworks were not implemented - or, critically, about the libraries' inefficiencies and what Nvidia can do to optimise them, but then it is a kind of 'middleware' solution.

We could shift the argument to Gameworks vs what AMD is doing with its own optimised libraries (they may be fully open source, but they are impractical to integrate due to the level of development required to make them usable on anything but AMD hardware), but that should be another thread and has been debated in the past with split perspectives.
Lara Croft's hair at the highest settings was better visual quality on AMD hardware than on Nvidia, including snow textures that did not exist on Nvidia hardware; this was shown by PCGamesHardware. Just one example, and we should take it to a different thread, but it is worth noting that the dev studio, Crystal Dynamics, was also involved in developing the AMD hair library - probably why it has only appeared in their games and engine to date.
Just mentioning it because, while I am critical of Gameworks - it is bloated in terms of performance and could be much better if Nvidia spent more resources on efficiency - that is not a reason to castigate its use if it can be disabled on hardware that does not support it as well; of course, this comes down to seeing the actual game rather than the "benchmark".
 
(my complaint is that Gameworks is too inefficient).

And this is the main issue with Gameworks. Instead of doing something pretty and at the same time trying to make it as cheap as possible to run, they seem to be focused more on brute forcing expensive effects that can run better on Nvidia hardware. The problem is, even Nvidia hardware has a hard time running Nvidia Gameworks effects, and most of the time people just end up disabling them. With that said, I'm a big fan of VXAO and the PCSS Ultra+ used in Assassin's Creed Syndicate (which pushed shadow LOD extensively). When/if they finally decide to create something that's better for games rather than something that's better for Nvidia PR, Gameworks could be something great for PC games.
 
For the last year or two, HBAO+ has sometimes been even cheaper than SSAO, and it looks better most of the time. Video games should get better AO in general. Consoles almost exclusively use SSAO even in 2018; thanks to HBAO+ there are more choices on PC.
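For anyone wondering what "horizon-based" actually means, here's a toy CPU sketch of the core idea on a 1D heightfield (purely illustrative, not NVIDIA's actual HBAO+ shader): instead of testing scattered depth samples like classic SSAO, you march outward and keep the highest horizon angle you find.

```cpp
#include <cmath>
#include <cstdio>

// Toy illustration of the horizon-based idea behind HBAO: march outward
// in a few directions and track the highest horizon angle seen; the
// higher the horizon, the more occluded the point.
int main() {
    // Tiny 1D "depth slice": heights around a point sitting in a trench.
    const float h[9] = {3, 2, 1, 0, 0, 0, 1, 2, 3};
    const int center = 4;
    float occlusion = 0.0f;
    for (int dir : {-1, +1}) {                    // two march directions
        float maxSin = 0.0f;
        for (int step = 1; step <= 4; ++step) {   // short horizon march
            int i = center + dir * step;
            float rise = h[i] - h[center];
            float sinH = rise / std::sqrt(rise * rise + step * step);
            if (sinH > maxSin) maxSin = sinH;     // keep highest horizon
        }
        occlusion += maxSin * 0.5f;               // average both horizons
    }
    printf("AO term: %.2f (0 = open, 1 = fully occluded)\n", occlusion);
    return 0;
}
```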

Nvidia's godrays in Wildlands improve the whole look of the game at minimal cost. Turf FX also looks very good.
 
Nvidia's godrays in Wildlands improve the whole look of the game at minimal cost. Turf FX also looks very good.
Turf FX doesn't look that much better than High with Turf FX disabled, but it costs my Vega 56 massively on the performance front and introduces stuttering. High default = 5k score, High without Gameworks = 8k score, and the IQ difference isn't all that noticeable.
 
But it is optional, and Nvidia does have a dominant position in dGPUs and in the enthusiast tiers, so why not provide more visually/game-immersive options (my complaint is that Gameworks is too inefficient).
Not sure how one can accept it in Witcher 3, where it can be disabled, but then say it is OK to boycott FFXV on principle, where it is also optional and can be disabled (if this bug only happens in the benchmark).
Especially if FFXV actually implements more Hairworks options, such as animals only, which is what most gamers wanted in Witcher 3 - those that own Nvidia cards and played around with the Hairworks options, anyway.

The backlash for Witcher 3 was caused by AMD complaining that Gameworks is a "black box" that impedes their performance generally, rather than by Hairworks itself, which could be disabled; the same reasoning was used for another game that did not use Hairworks but was associated with Nvidia Gameworks - it may have been PhysX running on the CPU.
However, as we know, the complaint from AMD is not entirely fair: with those options disabled, and once AMD had time to optimise for the game, both Polaris and Vega actually perform very well - equal to, if not a little better than, comparable reference Nvidia cards.
As I said, a lot of Nvidia owners wanted to be able to use Hairworks in Witcher 3 BUT with it truly disabled on Geralt, as it is his close-ups that end up causing massive calculations and make it unusable even for Nvidia gamers.
The backlash should really have been about the default tessellation setting, along with why more options for managing Hairworks were not implemented - or, critically, about the libraries' inefficiencies and what Nvidia can do to optimise them, but then it is a kind of 'middleware' solution.

We could shift the argument to Gameworks vs what AMD is doing with its own optimised libraries (they may be fully open source, but they are impractical to integrate due to the level of development required to make them usable on anything but AMD hardware), but that should be another thread and has been debated in the past with split perspectives.
Lara Croft's hair at the highest settings was better visual quality on AMD hardware than on Nvidia, including snow textures that did not exist on Nvidia hardware; this was shown by PCGamesHardware. Just one example, and we should take it to a different thread, but it is worth noting that the dev studio, Crystal Dynamics, was also involved in developing the AMD hair library - probably why it has only appeared in their games and engine to date.
Just mentioning it because, while I am critical of Gameworks - it is bloated in terms of performance and could be much better if Nvidia spent more resources on efficiency - that is not a reason to castigate its use if it can be disabled on hardware that does not support it as well; of course, this comes down to seeing the actual game rather than the "benchmark".


The main criticism around Gameworks is not about it being more optimized for nvidia architectures, or bringing nvidia-only features to some games.
It has nothing to do with the features being optional or not.

The problem with Gameworks (or TWIMTBP before that) is that its "features" have regularly been outed as exceedingly detrimental to performance on all architectures with no IQ benefits. This is seen with the sub-pixel and invisible (off-POV) triangles that are forcefully processed in many gameworks titles.
And they're only implemented like this because, although this also hurts nvidia chips, it hurts AMD chips more.
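And the fix for the invisible-geometry half of that has been standard practice forever: cull patches before tessellating them. A minimal sketch of the kind of bounding-sphere vs. frustum-plane test that would have skipped Crysis 2's invisible ocean (illustrative code, not from any shipped engine):

```cpp
#include <cstdio>

// A cheap bounding-sphere vs. frustum-plane test per patch, done before
// tessellation, skips geometry the camera can never see instead of
// tessellating it anyway.
struct Plane  { float nx, ny, nz, d; };  // inward-facing: n.p + d >= 0
struct Sphere { float x, y, z, r;  };

bool patchVisible(const Sphere& s, const Plane* frustum, int numPlanes) {
    for (int i = 0; i < numPlanes; ++i) {
        const Plane& p = frustum[i];
        float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
        if (dist < -s.r) return false;   // fully outside one plane: cull
    }
    return true;                         // worth tessellating
}

int main() {
    Plane nearPlane = {0, 0, 1, 0};      // camera looks down +z
    Sphere behindCamera = {0, 0, -10, 2};  // e.g. an ocean tile behind us
    Sphere inFront      = {0, 0,  10, 2};
    printf("behind camera: %s\n",
           patchVisible(&behindCamera, &nearPlane, 1) ? "tessellate" : "cull");
    printf("in front:      %s\n",
           patchVisible(&inFront, &nearPlane, 1) ? "tessellate" : "cull");
    return 0;
}
```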


TressFX isn't comparable to hairworks, because AFAIK TressFX doesn't force the chips to process sub-pixel triangles.

An AMD equivalent to hairworks would be if, e.g., "TressPhX" called a bunch of demanding compute functions in the background whose results served no purpose - making sure to occupy a lot of ALUs, because that would hurt nvidia architectures more.
 
And this is the main issue with Gameworks. Instead of doing something pretty and at the same time trying to make it as cheap as possible to run, they seem to be focused more on brute forcing expensive effects that can run better on Nvidia hardware. The problem is, even Nvidia hardware has a hard time running Nvidia Gameworks effects, and most of the time people just end up disabling them. With that said, I'm a big fan of VXAO and the PCSS Ultra+ used in Assassin's Creed Syndicate (which pushed shadow LOD extensively). When/if they finally decide to create something that's better for games rather than something that's better for Nvidia PR, Gameworks could be something great for PC games.
Oh, I agree, but it is a double-edged sword, because Gameworks is a middleware solution and not truly integral to the optimised game-rendering engine. That said, I do think there is room for improvement, as it seems bloated and needs to be redesigned.

Look at AMD's PureHair (an evolution of TressFX): only one dev studio has managed to use it, they were heavily involved in its development, and it is part of their engine, so it does seem such improvements from outside the game-rendering engine have a higher cost in terms of resources.
However, as I said, that is not a reason to boycott a game that has Gameworks, because it can be disabled - especially if FFXV has improved Hairworks options over, say, Witcher 3. It will also be interesting to see how good the Turf effects are.
Hairworks does provide a noticeable difference and is far from subtle compared to, say, VXAO, but I would like to see Gameworks made more efficient at the library level, which they may have done with their DX12 version; a shame, from a technical-interest standpoint, that FFXV is not using it.
But let's see how Hairworks pans out in the game, with the lower options (if possible) on the higher-tier Nvidia GPUs and without the bugs of the inadequate benchmark.
 
Ultra-taxing effects are no strangers to the PC world; it's not like GameWorks effects are the only culprits there. We have incredibly taxing stuff like:

-Deus Ex Mankind Divided's Contact Hardening Shadows and MSAA
-Quantum Break's non upscaled resolution
-STALKER Lost Alpha's Max Sun Shadows
-Gears Of War 4's Insane Reflections and Insane DoF
-Flight Simulator X's Max View Distance
-ARMA 3's Max View Distance
-ARK Survival Evolved's Epic View Distance
-Watch_Dogs 2's Max Draw Distance

Hell, recently even Final Fantasy 12 (a PS2 game) got a "Full Res Ambient Occlusion" option that cripples all GPUs and runs far worse on AMD GPUs than on NV GPUs (basically a Vega 64 = GTX 1070 here).
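The "Full Res" part alone explains a lot of that cost. Quick arithmetic (the 16-sample AO kernel is an assumed figure, not the actual FFXII setting):

```cpp
#include <cstdio>

// Rough cost arithmetic for running AO at full vs. half resolution:
// halving resolution on both axes cuts the shaded-pixel count, and hence
// the AO sampling work, to one quarter.
int main() {
    const long long w = 3840, h = 2160;  // 4K target
    const int samplesPerPixel = 16;      // assumed AO kernel size
    long long fullRes = w * h * samplesPerPixel;
    long long halfRes = (w / 2) * (h / 2) * samplesPerPixel;
    printf("full-res AO: %lld samples/frame\n", fullRes);  // ~132.7M
    printf("half-res AO: %lld samples/frame\n", halfRes);  // ~33.2M
    return 0;
}
```

That quarter-cost trade-off is why so many games (and most console SSAO implementations) run the AO pass at reduced resolution.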

These effects are all vendor agnostic, and they all tax the hardware for small IQ improvements just as much as, or worse than, GameWorks effects. Though in many respects I find effects like HFTS, VXAO, PhysX, TXAA and Turf useful, adding much-needed IQ improvements. For example: the visual difference between HFTS shadows and regular shadows is night and day IMO, despite the cost to performance.
 
And this is the main issue with Gameworks. Instead of doing something pretty and at the same time trying to make it as cheap as possible to run, they seem to be focused more on brute forcing expensive effects that can run better on Nvidia hardware. The problem is, even Nvidia hardware has a hard time running Nvidia Gameworks effects, and most of the time people just end up disabling them. With that said, I'm a big fan of VXAO and the PCSS Ultra+ used in Assassin's Creed Syndicate (which pushed shadow LOD extensively). When/if they finally decide to create something that's better for games rather than something that's better for Nvidia PR, Gameworks could be something great for PC games.

Hairworks in the Witcher 3 is the only real-time hair and fur simulation which allows more than one character on screen to have "real" hair/fur. It is quite ironic that people complain about it when nothing else exists.
 
Hairworks in the Witcher 3 is the only real-time hair and fur simulation which allows more than one character on screen to have "real" hair/fur
Well, technically Far Cry 4 enabled this for multiple animals on screen: bears, tigers, wolves, etc. Call of Duty: Ghosts enabled it on multiple dogs during multiplayer matches.
 
Hairworks in the Witcher 3

This is what hairworks does to a 1080 Ti at 4K:

[two pairs of screenshots: Hairworks off vs. on]


I'm not buying into that Nvidia marketing spiel. Hairworks is bad, and whoever wrote that crap should feel bad about it (or good, depending on how much Nvidia is paying for it :p).

Edit: But wait! There's more:

[another pair of screenshots: Hairworks off vs. on]

It's legitimately hilarious how bad this is.
 
It's not just how it looks, but also how it moves. And this isn't only about Geralt's hair; there is extensive animal fur simulation as well.

I couldn't care less if it ends up eating half the GPU processing power to do what it does (and it's nothing exceptional either).
 
I couldn't care less if it ends up eating half the GPU processing power to do what it does (and it's nothing exceptional either).
That's why it's a choice: you can do without it, but if you have the horsepower, you get the extra visual benefit. It's not forced on you. For me, I play Witcher 3 fine at 30 fps; I don't care much for 60 fps in this game, as 30 feels very smooth, so I don't mind activating it and getting even a fraction of extra IQ and simulation.
 