The Witcher 3: Wild Hunt revealed

Geometry Shaders are known to be abysmally slow on AMD hardware.
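For context on why that matters here: the hair path expands every tessellated line segment into camera-facing triangles, which is exactly the kind of per-primitive amplification geometry shaders get used for. A rough CPU sketch of that expansion (all names and numbers below are illustrative, not HairWorks code):

# Illustrative CPU sketch of what a geometry shader does per hair
# segment: expand a line segment into a camera-facing quad (two
# triangles). Real hair renderers do this per segment, per strand,
# per frame -- the amplification is the expensive part.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def normalize(v):
    l = (v[0]**2 + v[1]**2 + v[2]**2) ** 0.5
    return (v[0]/l, v[1]/l, v[2]/l)

def expand_segment(p0, p1, eye, half_width):
    """Turn one line segment into 4 quad corners facing the eye."""
    tangent = normalize(sub(p1, p0))
    to_eye = normalize(sub(eye, p0))
    side = normalize(cross(tangent, to_eye))  # screen-space "right"
    offset = (side[0]*half_width, side[1]*half_width, side[2]*half_width)
    add = lambda a, b: (a[0]+b[0], a[1]+b[1], a[2]+b[2])
    return [sub(p0, offset), add(p0, offset),
            sub(p1, offset), add(p1, offset)]

# One strand of N control points -> N-1 segments -> 2*(N-1) triangles.
strand = [(0.0, float(i) * 0.1, 0.0) for i in range(8)]
quads = [expand_segment(strand[i], strand[i+1], (0.0, 0.0, 5.0), 0.002)
         for i in range(len(strand) - 1)]
print(f"{len(strand)} points -> {len(quads)} quads, {2*len(quads)} triangles")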

Anyway, why doesn't HairWorks have some sort of OIT implementation?
 
It's more intentional carelessness than anything else: geometry shaders and tessellation are a convenient way to program this, and it works okay on NVIDIA hardware. AMD kinda did the same thing with TressFX by not offering an alternative to OIT ... coverage-based AA is an appropriate way to handle this kind of dense geometry, but presently it doesn't suit NVIDIA hardware. Of course the difference is source code ... and let's forget the FUD about hardware manufacturers working with developers at the source-code level being unusual; it's not, for the really big titles.
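To make the OIT point concrete: the appeal of something like weighted blended OIT is that fragments can be composited in any order, which is what you want for thousands of overlapping strands. A back-of-the-envelope CPU sketch of that style of compositing (the weight function and constants here are my own illustrative choices, not anyone's shipping code):

# Rough CPU sketch of weighted blended OIT compositing for one pixel.
# Fragments can arrive in ANY order -- that's the whole point.

def weight(depth, alpha):
    # Depth-based weight: nearer fragments (smaller depth) count more.
    # One of many published choices; the exact curve is a tuning call.
    return alpha * max(1e-2, 3e3 * (1.0 - depth) ** 3)

def composite_oit(fragments, background):
    """fragments: list of (r, g, b, alpha, depth), depth in [0, 1]."""
    accum = [0.0, 0.0, 0.0]   # weighted premultiplied color sum
    accum_w = 0.0             # weight sum
    revealage = 1.0           # how much background survives
    for r, g, b, a, d in fragments:
        w = weight(d, a)
        accum[0] += r * a * w
        accum[1] += g * a * w
        accum[2] += b * a * w
        accum_w += a * w
        revealage *= (1.0 - a)
    if accum_w == 0.0:
        return background
    return tuple(accum[i] / accum_w * (1.0 - revealage)
                 + background[i] * revealage for i in range(3))

# Order independence: reversing the fragment list gives the same pixel.
frags = [(0.9, 0.8, 0.6, 0.4, 0.30),   # a few overlapping hair fragments
         (0.8, 0.7, 0.5, 0.5, 0.35),
         (0.7, 0.6, 0.4, 0.3, 0.50)]
print(composite_oit(frags, (0.1, 0.1, 0.1)))
print(composite_oit(list(reversed(frags)), (0.1, 0.1, 0.1)))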

I wish CDPR hadn't toed the company line by calling it unfixable; it's such a disingenuous thing to say ... accepting sponsorship with a few strings attached is one thing, but being a PR tool is another.
 
Tae-Yong Kim, physics programmer at NVIDIA, answers some of our questions about HairWorks

NVIDIA HairWorks was first showcased at The Witcher 3 presentation half a year ago and was recently used in an actual game title – Call of Duty: Ghosts – to provide “Dynamic Fur” simulation for animal characters. Unlike other GPU-accelerated physics features, Dynamic Fur was implemented through DirectCompute, which opens it up to AMD users as well.

Tae-Yong Kim, physics programmer at NVIDIA, has agreed to answer some of our questions about the HairWorks solution in general, and the Call of Duty: Ghosts integration in particular.

PhysXInfo.com: It was really a big surprise to see NVIDIA HairWorks utilizing the DirectCompute API, while other NVIDIA hardware physics effects are typically exclusive to CUDA-capable GPUs. What was the reasoning behind this decision?

Tae-Yong Kim: One of the goals of NVIDIA GameWorks is to solve hard visual computing problems in a way that balances implementation efficiency and time-to-market, runtime performance, and ease of integration. This requires choosing the right technologies, and sometimes that will lead to CUDA solutions, other times to DirectCompute, and other times to solutions using completely different approaches.

With NVIDIA HairWorks, the balance landed in favor of DirectCompute, partly because the simulation portion of the algorithm is a small part of overall runtime cost, which is dominated by rendering.

PhysXInfo.com: Can one hope for the HairWorks solution to ever be made available for next-gen consoles – PS4 and Xbox One?

Tae-Yong Kim: We are looking into the feasibility of using NVIDIA HairWorks on next-gen consoles, so stay tuned.
http://physxinfo.com/news/12197/introducing-nvidia-hairworks-fur-and-hair-simulation-solution/
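Worth unpacking Kim's point that simulation is cheap next to rendering: per strand, the update is basically Verlet integration plus a few constraint iterations over a handful of control points. A toy CPU version of that general style of solver follows (my own sketch of the technique, not HairWorks source; a DirectCompute version would run strands in parallel, one per thread):

# Toy strand solver in the general style of GPU hair simulation:
# Verlet integration + iterated distance constraints per strand.
# Everything here is an illustrative sketch, not vendor code.

GRAVITY = -9.8
DT = 1.0 / 60.0
DAMPING = 0.98

def simulate_strand(pos, prev, rest_len, iterations=2):
    n = len(pos)
    # 1) Verlet integration (root at index 0 stays pinned to the scalp).
    for i in range(1, n):
        x, y, z = pos[i]
        px, py, pz = prev[i]
        prev[i] = pos[i]
        pos[i] = (x + (x - px) * DAMPING,
                  y + (y - py) * DAMPING + GRAVITY * DT * DT,
                  z + (z - pz) * DAMPING)
    # 2) Distance constraints keep each segment at its rest length.
    for _ in range(iterations):
        for i in range(n - 1):
            ax, ay, az = pos[i]
            bx, by, bz = pos[i + 1]
            dx, dy, dz = bx - ax, by - ay, bz - az
            d = (dx*dx + dy*dy + dz*dz) ** 0.5 or 1e-9
            corr = (d - rest_len) / d
            if i == 0:
                # Root pinned: move only the child point.
                pos[i+1] = (bx - dx*corr, by - dy*corr, bz - dz*corr)
            else:
                pos[i] = (ax + dx*corr*0.5, ay + dy*corr*0.5, az + dz*corr*0.5)
                pos[i+1] = (bx - dx*corr*0.5, by - dy*corr*0.5, bz - dz*corr*0.5)
    return pos, prev

pos = [(0.1 * i, 0.0, 0.0) for i in range(8)]  # strand sticking out sideways
prev = list(pos)
for _ in range(120):                            # two seconds at 60 Hz
    pos, prev = simulate_strand(pos, prev, rest_len=0.1)
print(pos[-1])                                  # tip has swung down under gravity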
 
It would be very dangerous for NVIDIA to have separate implementations for the consoles and the PC ... I doubt they want to go there. Maybe if they just do the PS4, the different API gives them a little wiggle room for plausible deniability.

PS: by the way, about the game ... apart from the remapping SNAFU, has anyone run into any other bugs yet? (I usually wait out two or so patches for RPGs.)
 
Yeah, I can't see them doing it on consoles, but maybe more for performance reasons. :devilish:

Haven't heard of anything serious bug-wise yet. Might have to wait a little while before people progress far enough, though. Apparently the in-game frame limiter is rubbish for consistency/frame pacing (sketch of why below).

edit:

Visuals.ini -> change Movieframerate= to 60 instead of 30. XD
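On the frame-limiter complaint above: a limiter that just sleeps a fixed interval after each frame accumulates drift and paces badly, while scheduling against absolute deadlines absorbs per-frame jitter. A rough sketch of the difference (the 60 fps cap and fake render time are hypothetical, not the game's code):

# Sketch of why naive frame limiting paces badly: sleeping a fixed
# interval AFTER the frame's work accumulates error, while scheduling
# against absolute deadlines absorbs per-frame jitter.
import time

TARGET = 1.0 / 60.0  # 60 fps cap (hypothetical)

def naive_limiter(render, frames):
    for _ in range(frames):
        render()
        time.sleep(TARGET)          # ignores how long render() took

def paced_limiter(render, frames):
    next_deadline = time.perf_counter() + TARGET
    for _ in range(frames):
        render()
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # sleep only the leftover budget
        next_deadline += TARGET     # absolute schedule, no drift

def fake_render():
    time.sleep(0.004)               # pretend the frame takes ~4 ms

t0 = time.perf_counter(); naive_limiter(fake_render, 60)
t1 = time.perf_counter(); paced_limiter(fake_render, 60)
t2 = time.perf_counter()
print(f"naive: {t1 - t0:.3f}s, paced: {t2 - t1:.3f}s (target 1.000s)")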
 
Have given this game an hour or two, playing first with just about everything maxed out, HairWorks on, etc. Specifically, the highest plant density and farthest draw distance make this a really beautiful world to walk around in, along with pretty good lighting. Also just this weekend I discovered Input Mapper, which lets me use a DualShock 4, and so far that works flawlessly, though I was glad to have an old Xbox 360 controller lying around to get used to the button icons. ;) I'm finding I prefer that to a mouse for this type of game so far (tried it with a mouse the first hour). Nice that when connected with a cable you don't have to press the power button, so the PS4 stays off. ;)

All in all I can see this is a really great game already.
 
It would be very dangerous for NVIDIA to have separate implementations for the consoles and the PC ... I doubt they want to go there. Maybe if they just do the PS4, the different API gives them a little wiggle room for plausible deniability.
I honestly doubt Nvidia would give a shit about what forum warriors and a few game news sites thought about it.
 
I think the entire game industry knowing they have code that runs well on the Xbox One while simply not allowing it to be used on PCs would cross a line with game developers. Do they really want to be seen by every developer as an asshole only tolerated for their wallet? I kinda doubt it.

I think in the end, when they've milked this for what it was worth, they'll fix the code to take away most of the performance hit, for consoles and PCs alike.
 
I think the entire game industry knowing they have code that runs well on the Xbox One while simply not allowing it to be used on PCs would cross a line with game developers. Do they really want to be seen by every developer as an asshole only tolerated for their wallet? I kinda doubt it.

I think in the end, when they've milked this for what it was worth, they'll fix the code to take away most of the performance hit, for consoles and PCs alike.
I don't think the developers matter either. It's the publishers and investors who care: the marketing agreements with Nvidia ease the burden on the developers, which reduces the bottom line on the game's costs. With 75% of the GPU game market on PC, I just don't think they really care. The publishers get a checkmark for nice hair on the console AND PC version at no cost to themselves.
 
Setting for the NPC limit?

hacks.ini
[Gameplay/EntityPool]
SpawnedLimit=75

Seems people are starting to fiddle around with this game. Still wondering about surround sound support, though; Google failed me.
 
It's more intentional carelessness than anything else: geometry shaders and tessellation are a convenient way to program this, and it works okay on NVIDIA hardware. AMD kinda did the same thing with TressFX by not offering an alternative to OIT ... coverage-based AA is an appropriate way to handle this kind of dense geometry, but presently it doesn't suit NVIDIA hardware. Of course the difference is source code ... and let's forget the FUD about hardware manufacturers working with developers at the source-code level being unusual; it's not, for the really big titles.

I wish CDPR hadn't toed the company line by calling it unfixable; it's such a disingenuous thing to say ... accepting sponsorship with a few strings attached is one thing, but being a PR tool is another.

The impact of TressFX on Nvidia hardware isn't that far off the impact it has on AMD hardware now. And unlike HairWorks, TressFX didn't basically cripple its own vendor's hardware (well, anything that isn't a Titan X) to the same extent, much less the competition's.

Regards,
SB
 
Heh, after spending something like 10 hours in the starting zone (!!!) I've finally gotten to the point where I can choose what I did in The Witcher 2 (I didn't bother to import my save since I never finished it).

Good game so far. Graphics are generally good, but sometimes disappointing (texture quality could use a lot of work in a lot of places, flat walls, etc.).

Regards,
SB
 