HairWorks: apprehensions about closed-source libraries proven beyond reasonable doubt?

I only said the level of cooperation was unknown, but it was still there ... they got to work in cooperation with the developers. CDPR has denied that this was possible here, presumably because they don't have the source code either.

That's effectively what I heard from Nvidia: it seems they didn't ask for the code. The second possibility is that they just don't care (for various reasons: money, time, resources, etc.).
 
Every time AMD releases a new driver with improved performance, do you think they had source code? Do you think they reverse-engineered a DLL? I think not. They are using the lack of source code as a crutch.

Explain to me how AMD would optimize for a closed-source DLL that does everything it can to be performant on one specific platform and not on another; I am curious.
You made an interesting assertion there.
 
The CPU is identical irrespective of the GPU, so whatever code executes on the CPU is irrelevant.

What matters is the stuff that gets passed on from the DLL via the generic Windows driver to the GPU-specific driver: a shitload of API state-changing calls (texture setup commands, etc.) and shader programs.

That's the stuff the GPU-specific driver can intercept and make specific optimizations on. It could organize memory allocation for optimal performance, it can detect specific shader code sequences that can be mapped to certain instructions, it could do whole-shader detection and replace a shader with a hand-written one, etc.

Having the source code can help to understand what the code is doing, but it's not going to help one bit to make it faster: it's not as if they have the build environment of each game developer and can replace one DLL with another at will. That's something only the game developer can do.
So the actual optimization work needs to be done at a lower level no matter what. I believe that's what Nvidia means when they say that they don't need the source code to optimize their driver for a particular game. And I believe that the same is true for AMD as well. Anything else would be unworkable.
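
To make that concrete, here's a minimal sketch of the shader-replacement idea (everything here is an illustrative assumption: the hash-based lookup, the `CompiledShader` type, and `CompileThroughNormalPath` are stand-ins, not any vendor's actual driver code):

```cpp
// Illustrative sketch only: how a driver *could* recognize a known game
// shader by hashing its bytecode and swap in a hand-tuned replacement.
#include <cstdint>
#include <unordered_map>
#include <vector>

struct CompiledShader { std::vector<uint32_t> gpuCode; };  // stand-in type

// FNV-1a hash over the portable bytecode the runtime hands to the driver.
static uint64_t HashBytecode(const std::vector<uint8_t>& bytecode) {
    uint64_t h = 0xcbf29ce484222325ull;
    for (uint8_t b : bytecode) { h ^= b; h *= 0x100000001b3ull; }
    return h;
}

// Per-game table maintained by driver engineers: bytecode hash ->
// pre-optimized replacement for this specific GPU architecture.
static std::unordered_map<uint64_t, CompiledShader> g_replacements;

// Fallback: the driver's regular compiler/optimizer (stubbed here).
static CompiledShader CompileThroughNormalPath(const std::vector<uint8_t>&) {
    return {};
}

CompiledShader CompileShader(const std::vector<uint8_t>& bytecode) {
    // If this exact shader was profiled before, use the hand-written one...
    auto it = g_replacements.find(HashBytecode(bytecode));
    if (it != g_replacements.end()) return it->second;
    // ...otherwise go through the generic compilation path.
    return CompileThroughNormalPath(bytecode);
}
```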
 
So suppose AMD does these optimizations. What then? Do you think they have absolutely no CPU overhead? Many people complain about the apparent increased CPU overhead of AMD's drivers relative to Nvidia, yet they never stop to consider why that might be true.

Regarding the closed source library vs. open source: What guarantee do you have that the closed source library does the same work on all platforms? With open source you can easily verify that.
 
Looks like the studio has already released a patch with improved HairWorks performance (http://www.pcgamer.com/witcher-3-patch-adds-new-graphics-options-and-improvements/). Give it a few more days and AMD will release a driver (beta, of course), and it will be on to the next temper tantrum. (Though maybe it's in AMD's best interest not to try to fix anything?)

So let me explain it for you more clearly. After the theory that Nvidia is sabotaging performance on AMD GPUs with GameWorks, your theory is: it is AMD who sabotaged its own GPU performance in Assassin's Creed, FC4, and the Batman series, to make people believe that Nvidia is doing it, with the goal of discrediting Nvidia... OK, why not; maybe. I'm open to all theories.

The same theory was finally pushed by Forbes. A theory that Nvidia will certainly like.

I think we should call in Mulder and Scully to get the final answer.

So suppose AMD does these optimizations. What then? Do you think they have absolutely no CPU overhead? Many people complain about the apparent increased CPU overhead of AMD's drivers relative to Nvidia, yet they never stop to consider why that might be true.

Regarding the closed source library vs. open source: What guarantee do you have that the closed source library does the same work on all platforms? With open source you can easily verify that.

In addition, the implementation can vary a lot from the initial source code, game by game, with specific optimizations.

As for the overhead, it's clear that if you need to do all the work at the driver level, the CPU overhead this causes will explode, which is why using COD or FC4 as an example is not really representative at all. (I don't deny that the AMD driver seems to have a larger overhead than the Nvidia one, but if you consider everything the driver has to do compared to the Nvidia one on those titles, it is completely normal that this happens.)
 
The same theory was finally pushed by Forbes. A theory that Nvidia will certainly like.

Maybe Mulder and Scully are on to something since Ars Technica echoes the Forbes thoughts:

No doubt performance will get better as new drivers trickle in, and as CD Projekt Red begins the inevitable post-release bug fixing. As for the whole GameWorks debacle, frankly, the whole thing is getting tired. Expecting Nvidia to open-source its tech is wishful thinking, even if it would be in the best interests of consumers; the company is trying to make money after all. While it's sometimes easy to sympathise with the underdog, at this point we need to see some real innovation from AMD, rather than just bluster over allegedly dodgy business practices from the competition.
http://arstechnica.co.uk/gaming/201...completely-sabotaged-witcher-3-performance/2/
 
And then again, the GTA5 thing in the preamble... do they know which tech was "mixed" in GTA5? I wasn't aware that GTA5 uses PhysX + Bullet or HairWorks + TressFX; instead it uses two independent plug-ins that were added to the GameWorks library, HBAO+ and FXAA, alongside HDAO.

Especially since HairWorks was demonstrated in a pre-release demo of The Witcher 3.

Do they know what type of contract Rockstar signs? I can tell you that Rockstar's policy with hardware brands is far different from some other studios'. I wonder whether those two articles were written by the same person.

I also like how they go after AMD when, in the end, it is gamers, the consumers on forums, who are complaining... not AMD people (apart from the two responses from Huddy that get cited in a loop). Have you seen AMD running a public campaign about this? Huddy just answered two questions in an interview.

Things are getting blown out of proportion in every respect.

Instead of speaking in half-words, let's say it outright: it is AMD who sabotaged the performance of their own GPUs in the GameWorks titles, to discredit Nvidia...
 
Unfortunately, in Witcher 3 it's not just HairWorks as an individual variable but also CD Projekt Red's implementation of their own in-house anti-aliasing techniques, which are probably not open-sourced. It's more difficult to identify potential outcomes when "new" techniques are expected to interact with an established GameWorks library.

GTA5's development may have been different in that no extra in-house techniques interacted with the HairWorks or TressFX implementations, so the risk was much lower.
 
AMD is welcome to work with software vendors to leverage benefits unique to their hardware, yes.
GCN based architectures dominate current console hardware and I've read quite a few suggestions that this may be a great opportunity for AMD.
That might not translate down to older architectures like the VLIW4 and VLIW5 generations, but those customers should probably be preparing for upgrades at some point.
Similarly, I believe HSA is intended as a prospective competitive advantage. That would of course be to the exclusion of non-HSA compatible vendors.

As a prospective videocard buyer I can then weigh whether such benefits are compelling to me.
AMD probably made some sales because customers felt a Mantle compatibility check mark was a plus for a particular card.
Conversely, I can also decide to avoid software where some part of the development effort went towards a Mantle port if it doesn't bring value to me.

Gotcha, so you'd be absolutely happy in a world where, if both IHVs had equal GPU market share, you couldn't play half the games released because they ran horribly on your video card.

Even better, what if there were three IHVs? Then you'd be limited to playing only 1/3 of available games, because the other 2/3 would perform horribly due to closed-source sabotage of performance. It's great that you applaud an IHV for abusing its monopoly.

But at least we're seeing some backlash from software developers, at least those that aren't tied monetarily to Nvidia. And we should start seeing the death of GameWorks. And good riddance.

And this is coming from an Nvidia user who isn't happy with the situation.

Regards,
SB
 
So suppose AMD does these optimizations. What then? Do you think they have absolutely no CPU overhead?
I'm sure they have some cost, but most of these should be one-time costs.

Many people complain about the apparent increased CPU overhead of AMD's drivers relative to Nvidia, yet they never stop to consider why that might be true.
Some months ago, somebody posted on a forum somewhere about his experiences as an intern or employee at Nvidia: they seem to spend tons of effort optimizing their drivers for specific games, so you'd expect Nvidia's drivers to have tons of overhead as well. The opposite is true. But Nvidia seems to be better at using at least two CPU cores in their driver. Maybe AMD should have spent more time on that aspect?
And it's not as if there are that many GameWorks titles with GPU acceleration out there anyway.
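
As a rough illustration of what "using a second core in the driver" could look like (a sketch under assumptions: `Command` and `TranslateAndSubmit` are made-up stand-ins, not any real driver's internals), the game's render thread only pays for a cheap enqueue while a worker thread does the expensive translation:

```cpp
// Hedged sketch: a user-mode driver offloading command translation to a
// worker thread so the application's render thread isn't stalled.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct Command { int opcode; };                    // stand-in for an API call
static void TranslateAndSubmit(const Command&) {}  // "expensive" driver work

class DriverQueue {
    std::queue<Command> q_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
    std::thread worker_{[this] { Run(); }};

    void Run() {  // runs on a second core
        std::unique_lock<std::mutex> lk(m_);
        for (;;) {
            cv_.wait(lk, [this] { return done_ || !q_.empty(); });
            if (q_.empty()) return;                // shutting down, drained
            Command c = q_.front();
            q_.pop();
            lk.unlock();
            TranslateAndSubmit(c);                 // heavy work off the app thread
            lk.lock();
        }
    }

public:
    // Called from the game's render thread: cheap enqueue, then return.
    void Submit(const Command& c) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(c); }
        cv_.notify_one();
    }
    ~DriverQueue() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_one();
        worker_.join();
    }
};
```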

Regarding the closed source library vs. open source: What guarantee do you have that the closed source library does the same work on all platforms? With open source you can easily verify that.
It should be fairly trivial for a company like AMD to put a snooping layer between a game and the Windows API and dump all the transactions. And if they had then discovered that this was the case, we'd have heard about it loud and clear. Since we didn't hear any whining of that sort, I'm pretty confident that this is not happening. ;)
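
As a toy illustration of such a snooping layer (the `IDevice` interface below is a simplified stand-in, not the real Direct3D API; real tools such as apitrace wrap the actual interfaces), interception is just a wrapper that logs each call before forwarding it:

```cpp
// Toy "snooping layer": wrap the real device in a logger that records
// every call, then forwards it unchanged. Diffing the dumps captured on
// two GPUs would show whether a closed library issues different work.
#include <cstdio>

// Simplified stand-in for a graphics API interface -- NOT real Direct3D.
struct IDevice {
    virtual void SetTexture(int slot, int textureId) = 0;
    virtual void Draw(int vertexCount) = 0;
    virtual ~IDevice() = default;
};

class SnoopingDevice : public IDevice {
    IDevice* real_;  // the vendor's actual implementation underneath
public:
    explicit SnoopingDevice(IDevice* real) : real_(real) {}
    void SetTexture(int slot, int textureId) override {
        std::printf("SetTexture(slot=%d, tex=%d)\n", slot, textureId);
        real_->SetTexture(slot, textureId);  // pass through untouched
    }
    void Draw(int vertexCount) override {
        std::printf("Draw(%d)\n", vertexCount);
        real_->Draw(vertexCount);
    }
};
```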
 
He was not speaking about Nvidia; he was speaking about how complex drivers are today, and that includes AMD's drivers too. We can assume that a driver optimized for The Witcher, Assassin's Creed, or Batman is way more complex on the AMD side than it is for Nvidia.

That is exactly the situation: an AMD driver for a GameWorks title needs ten times more engineering work and time than for any other game... with, let's be honest, bad results regardless.

And this increases the overhead, largely: in the driver, they need to solve problems that are solved on the developer side for Nvidia.

It's crazy. Can you imagine that Nvidia has put tessellation into the black box? Ported from CG by ATI, pushed into DX... tessellation is now part of the GameWorks black box, the "made by Nvidia" tessellation.
What does this mean? It is not an Nvidia-specific feature, but GameWorks uses it inside its box, hiding control from the main API and the graphics engine...

Officially this new version of tessellation should increase performance by adding LOD to it (well, LOD was a basic feature of the initial tessellation setup in DirectX; it was even part of ATI's 2003 presentation of tessellation), but not one game where Nvidia has pushed tessellation is using it.
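
For illustration, distance-based LOD for tessellation amounts to something like the sketch below (the distance constants are invented tuning values; in a real engine this math would live in the hull shader, and 64 is the D3D11 maximum tessellation factor):

```cpp
// Sketch of distance-based tessellation LOD: nearby patches get the full
// factor, distant patches fall back toward 1.
#include <algorithm>

float TessFactorForPatch(float distanceToCamera) {
    const float kMaxFactor    = 64.0f;  // D3D11 hardware maximum
    const float kFullDetailAt = 5.0f;   // closer than this: full detail
    const float kMinDetailAt  = 80.0f;  // farther than this: factor ~1
    float t = (distanceToCamera - kFullDetailAt) / (kMinDetailAt - kFullDetailAt);
    t = std::clamp(t, 0.0f, 1.0f);      // 0 = near, 1 = far
    // Fade linearly from the maximum factor down to 1 as the patch recedes.
    return 1.0f + (kMaxFactor - 1.0f) * (1.0f - t);
}
```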
 
Yeah, if only AMD would stop wasting their time revolutionizing graphics APIs on PC with Mantle and its descendants, and instead focus on real innovations like black-box graphical prettification plug-ins developed specifically to disadvantage nVidia hardware.
And here I thought that the whole point of a GPU was to make games look good!

It's in Nvidia's and AMD's best interest to promote rendering techniques that use up as many GPU cycles as possible; otherwise we'd all still be playing Pac-Man. How many studios out there have the expertise to create a full physics library, new anti-aliasing techniques, fire, hair, smoke, lighting techniques, etc.? Definitely not all of them. So both AMD and Nvidia develop libraries to make this happen. The only difference is that Nvidia probably spends an order of magnitude more than AMD, and that they don't like to give away their code, which, given the millions they've spent on it, is not entirely unreasonable IMO.

AMD spent probably a pretty penny on Mantle instead, and kept it closed until it became irrelevant in the PC space.

Enabling games to use state of the art rendering techniques or finding a way around bottlenecks in your driver: both are worthwhile endeavors. The problem with focusing all your effort on bottleneck reduction is that you're solving a one-time thing without lasting competitive benefit, while improving graphics quality is an open-ended problem.
 
He was not speaking about Nvidia; he was speaking about how complex drivers are today, and that includes AMD's drivers too. We can assume that a driver optimized for The Witcher, Assassin's Creed, or Batman is way more complex on the AMD side than it is for Nvidia.
What makes you so sure about that? Have you studied the shader assembly that gets transferred from the Windows driver to the GPU specific driver and concluded that it maps better to the Nvidia instruction set than the AMD instruction set?

It's crazy. Can you imagine that Nvidia has put tessellation into the black box? Ported from CG by ATI, pushed into DX... tessellation is now part of the GameWorks black box, the "made by Nvidia" tessellation.
What does this mean? It is not an Nvidia-specific feature, but GameWorks uses it inside its box, hiding control from the main API and the graphics engine...
AFAIK tessellation is not a black box at all: it's supposedly very well spec'ed by Microsoft and a standard component of the rendering pipeline. I don't think tessellation is the real issue, it's that AMD GPUs are worse at dealing with geometry, and that tessellation has the ability to generate tons of geometry.
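
A quick back-of-the-envelope makes the "tons of geometry" point concrete (assuming uniform partitioning, where a triangle patch at factor f yields roughly f² triangles; exact counts depend on the partitioning mode):

```cpp
// Why tessellation can swamp a GPU's geometry front end: triangle output
// grows roughly quadratically with the tessellation factor.
#include <cstdio>

int main() {
    for (long f : {1L, 8L, 16L, 32L, 64L}) {
        std::printf("factor %2ld -> ~%4ld triangles per input patch\n",
                    f, f * f);  // ~f^2 with uniform partitioning
    }
    return 0;
}
```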
 
Behaving ethically and pointing out that the competition doesn't is throwing a temper tantrum?
Accusing your competition of sabotage because your own driver/hardware can't keep up is. I have yet to see the first proof that anything unethical is going on. Performance on AMD GPUs is usually bad compared to Nvidia. So was the performance of DiRT, but the other way around. Some architectures are better at one thing than another. It's only logical that one's library should optimize for those features.
 
What makes you so sure about that? Have you studied the shader assembly that gets transferred from the Windows driver to the GPU specific driver and concluded that it maps better to the Nvidia instruction set than the AMD instruction set?


AFAIK tessellation is not a black box at all: it's supposedly very well spec'ed by Microsoft and a standard component of the rendering pipeline. I don't think tessellation is the real issue, it's that AMD GPUs are worse at dealing with geometry, and that tessellation has the ability to generate tons of geometry.

I'm pretty sure that an Nvidia driver for Assassin's Creed is way less complex than what AMD has to deliver for it... for The Witcher, I can't tell; I have not studied them.


As for tessellation, Nvidia has moved it into its GameWorks black box... so draw your own conclusion. Initially it is a pure DirectX feature, but if you implement the Nvidia version, it's no longer a DX component; it is part of the GameWorks library. You understand the difference?
 
No, I don't understand the difference at all. GameWorks is middleware as much as any other studio specific library is middleware. To the driver, it should make no difference.
 
For the Nvidia driver... not for the AMD one.


I'm a bit sad about all this meltdown. The Witcher is a great game, released by extremely talented people; the work they have done is phenomenal, and the game is already a cult classic in every aspect.
All of these technical questions seem to take up more space than they should around the PC version... That said, it is the latest game in a GameWorks series that has clearly shown how GameWorks is used on the Nvidia side... maybe it's AMD's fault, as some suggest (though the counter-theory is a bit too easy and has never worked; ask Richard Nixon about it)...
 