NVIDIA GameWorks, good or bad?

Kaotik

GameWorks sounds good as an idea: a wide set of libraries and tools that devs can use to make building games easier.

ExtremeTech, on the other hand, says it's not all that good. According to their article, GameWorks libraries are completely closed: the devs can't see what's going on inside them, and neither can AMD or Intel.

While nothing so far points to NVIDIA deliberately hurting AMD or Intel performance with its libraries, those companies can't optimize their drivers for them either, meaning their performance could be better if they got the chance to optimize.

http://www.extremetech.com/gaming/1...surps-power-from-developers-end-users-and-amd
 
The accusation that AMD and Intel can't optimize for GameWorks seems a little strange to me. GPU companies often optimize via shader replacement in these types of scenarios. It's the same, really, as TressFX.
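
To make the shader-replacement idea concrete, here's a minimal sketch of the concept as I understand it; the hash, table, and function names are invented for illustration and aren't from any vendor's actual driver. The idea is simply that the driver recognizes a shader it has seen before by hashing the submitted bytecode and swaps in a hand-tuned version.

[code]
// Hypothetical sketch of driver-side shader replacement. Real drivers work
// on the DXBC bytecode the D3D runtime hands them; everything here is
// invented purely for illustration.
#include <cstdint>
#include <cstddef>
#include <unordered_map>
#include <vector>

// FNV-1a hash over the shader bytecode the application submits.
static uint64_t HashBytecode(const void* data, size_t size) {
    const uint8_t* p = static_cast<const uint8_t*>(data);
    uint64_t h = 14695981039346656037ull;
    for (size_t i = 0; i < size; ++i) {
        h ^= p[i];
        h *= 1099511628211ull;
    }
    return h;
}

// Hand-optimized replacements, keyed by the hash of the original shader
// (populated per title / per shader by the driver team).
static std::unordered_map<uint64_t, std::vector<uint8_t>> g_replacements;

// Called when the app creates a shader: return the tuned version if the
// bytecode is recognized, otherwise fall through to the original.
const std::vector<uint8_t>* MaybeReplaceShader(const void* bytecode, size_t size) {
    auto it = g_replacements.find(HashBytecode(bytecode, size));
    return (it != g_replacements.end()) ? &it->second : nullptr;
}
[/code]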
 
According to the article, they can't see the actual code and thus can't optimize it.
 
I think the problem here is not that others can't see the code. Normally, if an IHV wants to work with a game developer or game engine developer, there isn't too much trouble. However, in the case of NVIDIA's libraries, it's unlikely that NVIDIA will be willing to work with AMD or Intel to optimise the libraries for AMD's or Intel's hardware.

Of course, it's game developers' choice to use such libraries, and they know the libraries are likely optimised specifically for NVIDIA's hardware. What AMD and Intel can do, IMHO, is provide similar libraries, so that if NVIDIA's libraries perform badly on competitors' hardware, game developers have other options. Actually, this could be healthy competition: if AMD's libraries perform well enough on NVIDIA's hardware, game developers might decide to skip the trouble of using NVIDIA's libraries, so NVIDIA couldn't afford to let its libraries perform too badly on competitors' hardware.
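
To make the "other options" idea concrete, here is a minimal sketch (my own hypothetical interface, not anything from an actual SDK) of how a game could keep such middleware swappable behind its own abstraction, so a GameWorks-style library is one backend among several rather than a hard dependency:

[code]
// Hypothetical vendor-neutral wrapper around a "fancy hair" effect.
// None of these types come from a real SDK; they only illustrate how a
// game might keep GameWorks-style middleware swappable.
#include <cstdio>
#include <memory>
#include <string>

struct FrameContext {};  // stand-in for whatever per-frame data the renderer passes

class IHairEffect {
public:
    virtual ~IHairEffect() = default;
    virtual bool Init(const std::string& assetPath) = 0;
    virtual void Simulate(float dt) = 0;
    virtual void Render(FrameContext& frame) = 0;
};

// Engine's own simple implementation; in a real game the vendor branches
// below would wrap different middleware libraries instead.
class SimpleHair : public IHairEffect {
public:
    bool Init(const std::string& assetPath) override {
        std::printf("loading hair assets from %s\n", assetPath.c_str());
        return true;
    }
    void Simulate(float) override { /* cheap strand physics would go here */ }
    void Render(FrameContext&) override { /* draw calls would go here */ }
};

// Pick a backend from config or a benchmark rather than hard-wiring one.
std::unique_ptr<IHairEffect> CreateHairEffect(const std::string& backend) {
    if (backend == "vendorA") return std::make_unique<SimpleHair>();  // e.g. an NVIDIA library
    if (backend == "vendorB") return std::make_unique<SimpleHair>();  // e.g. an AMD/open library
    return std::make_unique<SimpleHair>();                            // engine fallback
}
[/code]

The point is just that if the interface belongs to the engine, switching libraries later is a porting job rather than a rewrite.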
 
Nothing against IHVs helping developers add features or making sure things are well optimized for their cards, but closed libraries seem like the wrong approach imo.
 
Correct me if I'm wrong, but the way I read it, the whole article is really only about over-tessellation in a single game, yet it generalizes that to the whole GameWorks thing even though there's no indication at all that this is actually happening in other games too?
 
Not exactly. It's about the Batman case too, but it's also about the fact that GameWorks stuff is closed, so neither devs nor competitors can optimize anything done with the GameWorks libraries.
 
According to the article, they can't see the actual code and thus can't optimize it.


The way DirectX works, the driver can capture the intermediate representation (the compiled shader bytecode) as it's submitted by the application. NV/AMD/Intel then optimize how that intermediate representation gets translated to native GPU instructions in the driver. This can involve anything from making the JIT compiler perform certain optimizations to hand-compiling the intermediate representation into a more optimal implementation.

They can optimize this code.
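
For anyone curious what that intermediate representation actually is: it's the DXBC bytecode produced by the HLSL compiler. A small sketch below (Windows, d3dcompiler.lib; the shader itself is a made-up example) compiles a trivial pixel shader and disassembles the resulting blob, which is roughly the level of code the driver receives and is free to recompile into native instructions however it likes.

[code]
// Compile a trivial HLSL shader to DXBC and dump the disassembly.
// The DXBC blob is the "intermediate representation" the D3D runtime hands
// to the driver; the driver's JIT turns it into native GPU instructions.
// (Windows only; link against d3dcompiler.lib.)
#include <d3dcompiler.h>
#include <cstdio>

static const char kShader[] =
    "float4 main(float4 pos : SV_Position) : SV_Target {\n"
    "    return float4(pos.x * 0.001f, 0.0f, 0.0f, 1.0f);\n"
    "}\n";

int main() {
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;

    // HLSL source -> DXBC bytecode (what ships with the game, or what
    // D3DCompile produces at runtime).
    HRESULT hr = D3DCompile(kShader, sizeof(kShader) - 1, nullptr, nullptr,
                            nullptr, "main", "ps_5_0", 0, 0,
                            &bytecode, &errors);
    if (FAILED(hr)) {
        if (errors) std::printf("%s\n", (const char*)errors->GetBufferPointer());
        return 1;
    }

    // DXBC bytecode -> human-readable disassembly, i.e. the level at which
    // a driver (or anyone capturing the API stream) sees the shader.
    ID3DBlob* disasm = nullptr;
    if (SUCCEEDED(D3DDisassemble(bytecode->GetBufferPointer(),
                                 bytecode->GetBufferSize(), 0, nullptr, &disasm))) {
        std::printf("%s\n", (const char*)disasm->GetBufferPointer());
        disasm->Release();
    }
    bytecode->Release();
    if (errors) errors->Release();
    return 0;
}
[/code]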
 
Correct me if I'm wrong, but the way I read it, the whole article is really only about over-tessellation in a single game, yet it generalizes that to the whole GameWorks thing even though there's no indication at all that this is actually happening in other games too?

For the record, the over-tessellation thing is known to have happened in Crysis 2 and Hawx 2 as well.
 
I think, tbh, that everyone following the gaming/gfx world even the slightest bit knows that.
 
Speaking as the author of the article:

Just to clear up a few points:

1). I looked hard for smoking guns. I checked multiple driver versions on both AMD and NV hardware to see if I could find evidence that one vendor took a harder hit than the other when performing a given DX11 task. There aren't any, other than tessellation in AO.

My best understanding, however, is that AMD and NV both typically optimize a title by working with the developer to create best-case HLSL code. With GameWorks, NV controls the HLSL, and the developer either cannot access that code directly or cannot share it with AMD.

Therefore: Even if AMD and NV both take a 10% hit when enabling a given function, NV has been able to optimize the code. AMD cannot.

2). Implementing an AMD-specific code path or library is something that can only be done when a title is in development. Developers cannot finish a game, launch it, and then just turn around and patch in an equivalent AMD library. Or rather, perhaps they technically *could*, but not without a non-trivial amount of time and effort.

If I'm wrong on either of these points, I'd welcome additional information. But even if no smoking gun exists today, this seems to represent a genuine shift in the balance of power between the two vendors. I believe this is different than Mantle because GameWorks is a closed system that prevents AMD from optimizing, whereas Mantle does not prevent NV from optimizing its own DX11 code paths.

We've seen what happens when one vendor controls another vendor's performance. Sabotage. Obfuscation. It's too easy for the company that controls the performance levers to start twisting them in the face of strong competition.
 
I agree with you on point 1. However, on point 2:

To my understanding, the GameWorks program is basically a lot of libraries that game developers may find useful. Technically, game developers are able to write most (if not all) of these functions themselves, but just like with using a 3rd-party game engine, why reinvent the wheel when someone else already has?

However, as I've said in my previous post, nothing prevents AMD or Intel from providing similar libraries. I don't know if it's possible to be API compatible (there could be some legal problems, but the Google vs Oracle lawsuit about Java seems to favor the idea that APIs shouldn't be copyrightable). Even if they can't be API compatible, if the competing libraries are close enough and GameWorks proves to be "too evil", then nothing prevents game developers from switching to AMD's, Intel's, or some other 3rd party's similar solution.

If NVIDIA were barring developers using GameWorks from using other similar solutions, then it would be a different story, but I think that would be a stupid thing to do, and unless the GameWorks libraries are really that good, normal developers are not likely to accept such a deal.
 
I don't know if Nvidia is banning developers from doing things (they have stated to me that developers are free to implement other solutions if they choose). I think the larger problem is the difficulty of implementing an entirely separate code path for AMD.
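
To be clear about where the cost sits: mechanically selecting a per-vendor path is trivial, something like the sketch below that branches on the DXGI adapter's vendor ID (the enum and helper names are made up). The expensive part is writing, tuning, and QA'ing the second path itself.

[code]
// Sketch: choose an effects path based on the GPU vendor reported by DXGI.
// Selecting the path is the easy part; authoring and maintaining a second,
// equally optimized path is the real cost. Helper names are hypothetical.
// (Windows only; link against dxgi.lib.)
#include <dxgi.h>
#include <cstdio>

enum class EffectsPath { Default, VendorTuned };

EffectsPath PickEffectsPath() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return EffectsPath::Default;

    IDXGIAdapter* adapter = nullptr;
    EffectsPath path = EffectsPath::Default;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter))) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = AMD, 0x8086 = Intel.
        if (desc.VendorId == 0x1002 || desc.VendorId == 0x8086)
            path = EffectsPath::VendorTuned;   // would select the alternative library
        adapter->Release();
    }
    factory->Release();
    return path;
}

int main() {
    std::printf("using %s effects path\n",
                PickEffectsPath() == EffectsPath::VendorTuned ? "vendor-tuned" : "default");
    return 0;
}
[/code]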

With game costs skyrocketing and multiple game studio closures last year, sure, there are studios like Activision-Blizzard or Bethesda that can write their own tickets and use any tech they want. But smaller devs and studios don't have that kind of negotiating power, and business decisions can still tilt the market. NV holds something like 70% of the total discrete space -- given the other pressures on premium game development, it's not hard to see why suits might see the situation differently than the actual programmers.

But the inability to optimize is what bugs me about this. We need a general market in which AMD, NV, and Intel can all optimize against a title without slamming into game functions they can't touch. AMD presented the problem as significant, and while I acknowledge that they're most definitely a biased party, it still seems a potential problem.
 
I don't know if Nvidia is banning developers from doing things (they have stated to me that developers are free to implement other solutions if they choose). I think the larger problem is the difficulty of implementing an entirely separate code path for AMD.

Well, technically AMD can implement them if they want to. One way AMD could do this is to initiate an open source project.

With game costs skyrocketing and multiple game studio closures last year, sure, there are studios like Activision-Blizzard or Bethesda that can write their own tickets and use any tech they want. But smaller devs and studios don't have that kind of negotiating power, and business decisions can still tilt the market. NV holds something like 70% of the total discrete space -- given the other pressures on premium game development, it's not hard to see why suits might see the situation differently than the actual programmers.

I understand this. Ideally, such a library should be made by a 3rd party willing to cooperate with all IHVs. Unfortunately, such libraries are not likely to be free and might even be quite expensive (as some existing libraries are).

But the inability to optimize is what bugs me about this. We need a general market in which AMD, NV, and Intel can all optimize against a title without slamming into game functions they can't touch. AMD presented the problem as significant, and while I acknowledge that they're most definitely a biased party, it still seems a potential problem.

I think what AMD and/or Intel can do is make similar efforts, and we can hope for better results driven by positive competition. After all, NVIDIA makes the libraries to sell its GPUs, so I understand why it might not be very keen on optimizing for competitors' products.
 