NVIDIA GameWorks, good or bad?

A game developer who uses Mantle has the option to also use DX11. ;) Similarly, a game developer who uses GW has the option to use a different library tailored to AMD. It's the same thing. Whether or not GW sits on top of DX11 is wholly irrelevant.

GW contains a bunch of highly specialized code for non-trivial effects, packaged into easy-to-use libraries. That represents tens or hundreds of man-years of investment.

You expect them to just allow developers to hand this over to their only competitor? Not just the API calls, but the actual implementation? Really?

There is almost no differentiation in hardware. In a double-blind test, none of us would know the difference between a 780 and a 290. That's not going to change. Companies need something else to make developers and customers choose one or the other. Mantle is one way; GW is just another.
 
You expect them to just allow developers to hand this over to their only competitor? Not just the API calls, but the actual implementation? Really?

Not in the slightest. I don't expect Nvidia to do anything but act in Nvidia's own best interests. And from Nvidia's perspective, a program that harms AMD's ability to optimize games by restricting the developer's ability to share code with AMD is a good thing. If you're NV, the idea that AMD has to bring up an entirely new set of libraries from scratch and convince developers to integrate them, or be locked out of optimizing on a level playing field, is a competitive advantage.

I expect NV to take every advantage it can find to promote its own products. I expect AMD to do the same. And in both cases, I reserve the right to call shenanigans when I see a situation I think is ultimately harmful to the end-user and the gamer.

I think we've talked this to death. You see the issue differently than I do, and that's fine. I appreciate you sharing your viewpoint and keeping the conversation civil.
 
If so, why doesn't AMD start making its own open-source libraries similar to GW? I think that would be best, as everyone could optimize them and soon everyone would ditch GW. On the other hand, if AMD can't (or doesn't want to) do that, then I don't see the point in blaming NVIDIA for doing something.

You can't put the burden on somebody else to "fix" something that you did.

What would be best isn't for AMD to do an open-source library that is better than GW; it's for GW to be open, or at least less obscure. The burden (if any) isn't on AMD, but on NVIDIA.

Otherwise, where would that argument end?

"If Intel paid for M$ to prevent Windows 9 to work with AMD CPU, why don't AMD pay M$ more? Or why don't AMD fund Linux so it can be better than Windows and grab 99% of the market?"

I believe I don't need to go further with the reductio ad absurdum. You can't always rely on AMD to be "the good guys that will save us from the monopolistic actions of evil company A".
 
A game developer who uses Mantle has the option to also use DX11. ;) Similarly, a game developer who uses GW has the option to use a different library tailored to AMD. It's the same thing. Whether or not GW sits on top of DX11 is wholly irrelevant.
Right, I do sort of wonder if people would feel differently if GW simply refused to run at all on non-NVIDIA hardware. Confusingly, I imagine people would be less angry :)

The reality, though, is that you can think of it as a proprietary library for NVIDIA hardware with the extra bonus that it will run on other DX11 implementations (for now; that may not always be the case for everything in it). Thus a developer is free to test and see if performance is acceptable on other IHV hardware, and if so, that saves them a bunch of work. If not, it's a separate path, just like Mantle/GL/whatever else would be.
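
To make that concrete, here's a minimal sketch, purely illustrative: the two Render* functions at the bottom are invented names, and it assumes the common approach of keying the decision off the adapter's PCI vendor ID via DXGI.

#include <dxgi.h>

// Link against dxgi.lib. Returns true if the primary adapter is an NVIDIA GPU.
bool PrimaryAdapterIsNvidia()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory))))
        return false; // can't tell, so take the generic path

    IDXGIAdapter* adapter = nullptr;
    bool isNvidia = false;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter)))
    {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        isNvidia = (desc.VendorId == 0x10DE); // NVIDIA's PCI vendor ID
        adapter->Release();
    }
    factory->Release();
    return isNvidia;
}

// Hypothetical call site: use the vendor library where it's known to be fast,
// or a hand-written DX11 path the developer (and other IHVs) can tune.
// if (PrimaryAdapterIsNvidia()) RenderEffectsWithGW();
// else                          RenderEffectsWithCustomDX11();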

Again, I'm not defending this as ideal (and I think everyone here is on the same page with that part), but it's not really that abnormal.

As far as Mantle goes, the comparison does raise an interesting question... do AMD allow the Mantle implementations in games (and by extension, the API) to be shared with other IHVs? If not, I'm not sure the argument of how it's different has a firm leg to stand on. And really, I'd still like to get an answer from Nick or Dave on the licensing terms of AMD's samples... I'm not giving any points to you for "openness" until you clarify that situation :)

You can't put the burden on somebody else to "fix" something that you did.
See the first paragraph of my reply. How is saying "well other people could develop their own Mantle drivers" or "other people could support X extension" or whatever similar thing any different? It's not, but that's not to say that such things are universally unacceptable. Obviously any time you want to move the industry forward you need to put a burden onto the other folks to keep up somehow...
 
Right, I do sort of wonder if people would feel differently if GW simply refused to run at all on non-NVIDIA hardware. Confusingly, I imagine people would be less angry :)

If GW were an optional, GeForce-only program, I agree that it would be seen differently. It doesn't tilt the playing field at that point. I'm totally OK with custom, NV-only libraries.

As far as Mantle goes, the comparison does raise an interesting question... do AMD allow the Mantle implementations in games (and by extension, the API) to be shared with other IHVs? If not, I'm not sure the argument of how it's different has a firm leg to stand on. And really, I'd still like to get an answer from Nick or Dave on the licensing terms of AMD's samples... I'm not giving any points to you for "openness" until you clarify that situation :)

I am assuming that the number of changes NV would have to make to GK104 to support Mantle makes it effectively impossible for NV to release a Mantle driver. I view this as a difference between the technical and the practical. Even if Kepler could support a Mantle API in its present form, no hardware changes required, it's probably not worth NV's time to do so if it can't get more than 5-10% out of the API, or worse, can't get anything positive at all.

I view the question like this:

1). Will Mantle deliver a substantial, sustained performance advantage to AMD? (Unknown).

2). Will a significant number of developers start using Mantle? (Unknown).

3). If 1&2 occur, will AMD attempt to enforce any sort of licensing or API restrictions that prevent Nvidia from developing a Mantle-analogous product, or competing fairly with the now-established API?

If #1 and #2 both occur (I'm a bit dubious on this score, frankly), then #3 becomes analogous to the GW situation today. We'd have a scenario where a dominant player / incumbent is taking what I'd view as overly aggressive action to block a competitor.
 
A game developer who uses Mantle has the option to also use DX11. ;)

No, it's not an option; it's mandatory. No developer can afford to make a Mantle-only game, and AMD knows it well. Mantle is meant to complement Direct3D, not replace it.

Similarly, a game developer who uses GW has the option to use a different library tailored to AMD. It's the same thing. Whether or not GW sits on top of DX11 is wholly irrelevant.

But the whole point of GW is precisely to avoid the extra effort of implementing all those effects by hand or even with another third-party library. I would expect 100% of Mantle games to have a Direct3D code path, and 0% of GW games to have the same effects implemented with a more AMD-friendly library—it's just redundant.

GW contains a bunch of highly specialized code for non-trivial effects, packaged into easy-to-use libraries. That represents tens or hundreds of man-years of investment.

You expect them to just allow developers to hand this over to their only competitor? Not just the API calls, but the actual implementation? Really?

I'd be fine with just the API calls. AMD did it with TressFX, and it's a very good thing. Granted, it's just a couple of effects, but it's a lot of work and they did it anyway.
 
For now, but if the Mantle path diverges in terms of features (and not just performance), it gets fuzzier, no? Do you think AMD would encourage or discourage that?

Well there might be Mantle-exclusive effects, sure, and AMD would most likely encourage it, but there will always be a Direct3D (or OpenGL) codepath for standard rendering.
 
See the first paragraph of my reply. How is saying "well other people could develop their own Mantle drivers" or "other people could support X extension" or whatever similar thing any different? It's not, but that's not to say that such things are universally unacceptable. Obviously any time you want to move the industry forward you need to put a burden onto the other folks to keep up somehow...

They are fundamentally different. In both cases quoted above you are asking one company to support one piece of technology, be it an API or an extension; that company is free to map it to their ISA/drivers as they feel is best.

For example, AMD's PRT (partially resident textures) are the basis of some recent advances in DX; that's why AMD has HW support while NVIDIA doesn't (they use SW). It's not wrong to ask NVIDIA to comply with new advancements in DX, is it?
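
As a concrete illustration, here's a minimal sketch assuming the DX11.2 Tiled Resources feature that grew out of PRT-style hardware: the app just asks the runtime for a tier, and whether the driver backs the feature with dedicated HW or a SW fallback is entirely the IHV's business.

#include <d3d11_2.h>

// Query which Tiled Resources tier the driver exposes; how each vendor maps
// the feature onto its hardware (or software) is left to the driver.
D3D11_TILED_RESOURCES_TIER QueryTiledResourcesTier(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D11_FEATURE_D3D11_OPTIONS1, &opts, sizeof(opts))))
        return opts.TiledResourcesTier;
    return D3D11_TILED_RESOURCES_NOT_SUPPORTED; // pre-11.2 runtime/driver
}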

If we were saying "Mantle is proprietary; NVIDIA should do an open API which will replace Mantle!", then it would be the same as the argument my original post was referring to: you would be demanding that one company replace a piece of technology created by the other.
 
Right, I do sort of wonder if people would feel differently if GW simply refused to run at all on non-NVIDIA hardware. Confusingly, I imagine people would be less angry :)

Why would that be confusing?

In the current situation, a developer would have no reason to use anything but the Nvidia-provided libraries, which AMD cannot optimize for, as long as it runs "good enough" on the competitor's hardware. Nvidia can continue to optimize for the game after the game is released. AMD has no option to do this at any point before or after release.

If it were closed completely, the developer would have to code to standard libraries which AMD has access to. Even if not optimized at the developer level, it would still be potentially optimizable by AMD at the driver level, or via developer relations, something that is not even remotely possible with how GameWorks currently works.

Contrast that to Mantle.

In the current situation, a developer still has to provide a standard DirectX rendering path. Hence, Nvidia/Intel can still optimize for it, either at the driver level or through developer relations.

In the future, if I understand things correctly, other IHVs will be able to opt in to supporting Mantle or not. If the other IHVs don't opt in, then the developer still has to provide a standard DX path. If an IHV supports it, they'll be able to optimize things on their end. If they don't, then they just optimize for the standard DX rendering path.
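
To sketch the shape of that opt-in (purely hypothetical: there is no public Mantle SDK, so MantleIsSupported and the renderer types below are invented names for illustration):

#include <memory>

struct IRenderer { virtual ~IRenderer() = default; };
struct MantleRenderer : IRenderer {};  // low-level path for opted-in IHVs
struct D3D11Renderer : IRenderer {};   // baseline path every IHV can target

// Hypothetical capability check; a real one would probe the installed driver.
bool MantleIsSupported() { return false; } // stub for illustration

std::unique_ptr<IRenderer> CreateRenderer()
{
    if (MantleIsSupported())                      // IHV opted in via its driver
        return std::make_unique<MantleRenderer>();
    return std::make_unique<D3D11Renderer>();     // standard DX fallback
}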

That's different from the current situation with GameWorks, where AMD has no option to opt in. And since it will run on all hardware, although obviously less optimally on non-Nvidia hardware, not only can AMD not do anything to increase performance with it, they cannot opt out and just optimize for the standard DirectX path either.

Imagine if Nvidia could do nothing about their performance in Tomb Raider 2013 (which saw lots of optimizations post-release), something that would be impossible for AMD to do with regard to a GameWorks title.

Or if AMD couldn't fix/optimize performance and stability in id Software's Rage after release due to not being able to know what GameWorks is actually doing.

Regards,
SB
 
They are fundamentally different. In both cases quoted above you are asking one company to support one piece of technology, be it an API or an extension; that company is free to map it to their ISA/drivers as they feel is best.
100% the same thing as telling someone to just do their own implementation of GW. There is zero difference regardless of your trying to draw an arbitrary line between "API", "extension", "library" or whatever other terminology.

In the current situation, a developer would have no reason to use anything but the Nvidia-provided libraries, which AMD cannot optimize for, as long as it runs "good enough" on the competitor's hardware.
Not true; you're attributing your own interpretation to developers here. Of course they have a reason to use something else: if they need it to run better on AMD/Intel!

Nvidia can continue to optimize for the game after the game is released. AMD has no option to do this at any point before or after release.
That's not true: the implementation of the GW effects is frozen and ships with the game. It would be a nightmare for ISVs if that could change via a driver update down the road. But hell, if it were just part of NVAPI, then it'd be an "extension" and apparently that would be OK ;)

Ultimately you guys are arguing that if Mantle provided a DirectX driver/fallback for other hardware, you'd be mad about it, even though developers would still be free to do exactly what they can do today and write their own DirectX path. That's a silly position, and I'm not even going to explain why if it isn't blatantly obvious.

How about you just pretend that GW doesn't work outside of NVIDIA hardware, then, and be happy about it. If developers choose to run GW code on non-NVIDIA hardware, that is equivalent to them writing the same path in DX. If other IHVs don't like it, they are welcome to "optimize" it by implementing it themselves, and the developers are welcome to use (or not use) the resulting implementations.

I started with some sympathy for this argument, but it is increasingly waning. While I still agree "research"/"sample" implementations should ideally be completely open source (and come on AMD guys, stop ignoring my question if you really mean what you say), the GW situation is no worse than any other proprietary extension, API or implementation.
 
I started with some sympathy for this argument, but it is increasingly waning. While I still agree "research"/"sample" implementations should ideally be completely open source (and come on AMD guys, stop ignoring my question if you really mean what you say), the GW situation is no worse than any other proprietary extension, API or implementation.

Except it isn't. In the case of DirectX, while it is "closed" and proprietary to the Windows platform, it is hardware-agnostic. No hardware vendor controls it, although each vendor influences, to an extent, what features get implemented in each version of DirectX. As well, each IHV has equal access to the API.

In the case of GameWorks, which purports to work on all vendors' hardware, it is directly controlled by one hardware vendor with a vested interest in making sure it runs well on their own hardware and no vested interest in making sure it runs well on competitors' hardware.

In the case of Mantle, it currently only works on one vendor's hardware. As such, AMD has absolutely zero control over how a game implementing it performs on the competitor's hardware, while with GameWorks, Nvidia is in complete control of how the game performs on AMD hardware as it currently stands.

Sure, a software developer could choose to do both a GameWorks path and a standard DX path, but why would they do that as long as it ran "good enough" on AMD/Intel hardware? If it's only 10-20% slower than on Nvidia hardware, the incentive would be low. And if that also happens to be 5-10% slower than a standard DX path, then why bother with the standard DX path? Again, it's good enough.

It's one thing to have a neutral party in relative control over how your hardware performs (Microsoft with DX for example) and another to have a hardware vendor that is your competitor be in charge of how well your hardware potentially performs.

Regards,
SB
 
Like I said, your entire argument hinges on the current non-existence of a Mantle->DX driver, which is just silliness. You can't fault NVIDIA for providing more compatibility than they had to. If game developers choose to use it, they have judged it and deemed it acceptable, and that's on them, period.
 
These decisions often rest with publishers more than developers. Some companies like Blizzard get to write their own tickets. Plenty of others don't. And with skyrocketing dev costs, the pressure is on game studios to cut expenses wherever possible.

You can still argue that NV is providing a service and that AMD should provide an equivalent one, but I wouldn't assume that all developers make calls like this because they *want* to.
 
100% the same thing as telling someone to just do their own implementation of GW. There is zero difference regardless of your trying to draw an arbitrary line between "API", "extension", "library" or whatever other terminology.

Why is it the same thing? Sorry, but I don't see an argument there. :smile:

IMHO, in one case it's accretive and in the other it's destructive. Sometimes supporting (accretive) is so costly that it doesn't make sense, and sometimes recreating (destructive) is so free of cost that it's just trivial, but they are still two different starting positions (accretive/destructive).

I'm not even talking about GW/Mantle/etc. anymore; I'm debunking the "Why doesn't company A just do their own?" argument. It's bad for the industry to go down this road, as if the correct answer to Mantle were for NVIDIA/Intel/Qualcomm to do their own CTM APIs.

Make no mistake, Mantle is a risk for the industry, but it has much more potential than any of us believed at first. That's why we even recognize it as a potentially valuable effort: pushing DX/OGL forward, seeding a new API, or even becoming an industry standard. I personally don't see the same potential in GW.

Ultimately you guys are arguing that if Mantle provided a DirectX driver/fallback for other hardware, you'd be mad about it, even though developers would still be free to do exactly what they can do today and write their own DirectX path. That's a silly position, and I'm not even going to explain why if it isn't blatantly obvious.

Why would Mantle have DX inside itself? You mean a JIT compiler? I can't see why we would need that.

How about you just pretend that GW doesn't work outside of NVIDIA hardware, then, and be happy about it. If developers choose to run GW code on non-NVIDIA hardware, that is equivalent to them writing the same path in DX. If other IHVs don't like it, they are welcome to "optimize" it by implementing it themselves, and the developers are welcome to use (or not use) the resulting implementations.

I started with some sympathy for this argument, but it is increasingly waning. While I still agree "research"/"sample" implementations should ideally be completely open source (and come on AMD guys, stop ignoring my question if you really mean what you say), the GW situation is no worse than any other proprietary extension, API or implementation.

I believe I got the gist of it in this post: http://forum.beyond3d.com/showpost.php?p=1818497&postcount=33

You can't fault NVIDIA for providing more compatibility than they had to. If game developers choose to use it, they have judged it and deemed it acceptable, and that's on them, period.

But we can see the risks that come from that.
 
I guess the best of both worlds here is GW for the DirectX path and Mantle for AMD? Developers could spend more time on Mantle implementations, as they would no longer have to spend as much time on the DX path for both NV/AMD.
 
Yes, absolutely. This was initially implemented as code controlled by the developer; the samples and code quickly followed, and now there is the TressFX 2.0 that Kaotic has linked to.

So, Nixxes did develop TressFX all by themselves and AMD discovered that it would be a great fit to add to its D3D11-effects stack?
 
On one side you have a closed, performance-critical graphics library written by a third party (e.g. a GPU vendor) running on all configurations (e.g. GPUs from different vendors), effectively giving the library owner full control over how well this code runs on all supported platforms, without any possibility for the game developer or other partners to optimize/modify it.

This statement is nonsensical. To suggest that NVIDIA has full control over how well this code runs on all supported platforms is outrageous. If that were the case, then AMD cards would run the game like garbage. The reality, which has been confirmed and was only added "after the fact" to Joel's article, is that game developers certainly can gain access to and optimize GameWorks code if they choose to, based on their licensing terms. And all major IHVs have the ability to make improvements to game performance and stability through their [generally closed-source] graphics drivers.
 
Just to clear up a few points:

1). I looked hard for smoking guns. I checked multiple driver versions on both AMD and NV hardware to see if I could find evidence that one vendor took a harder hit than the other when performing a given DX11 task. There aren't any, other than tessellation in AO.

The ironic thing is that NVIDIA's GPU performance was way, way below normal, relatively speaking (and well beyond the differences you noticed in your article), on recent AMD-sponsored games such as Dirt: Showdown and Tomb Raider. AMD worked with the game developers to do GCN-friendly shader replacements and add GCN-friendly rendering techniques at the last minute, and it took months and months for NVIDIA to extract more performance and stability on those titles.
 