NVIDIA GameWorks, good or bad?

It's in the best interest of IHVs to provide ISVs with the tools to integrate the most advanced (and calculation-intensive) visual effects possible with little effort. Those super-fast GPUs aren't very useful sitting idle...
Throwing up barriers to entry is, however, not in our best interest.

NVIDIA and Intel should never have bought physics middleware, NVIDIA should never have used closed-source, restrictively licensed application code and tied that together with their marketing deals, and Mantle should have been handled within the framework of some new flavour of OpenGL à la ES.

PS: I forgot the ARM/Geomerics deal; that also shouldn't have happened.
 

Looks like Gameworks is as bad as, or worse than, I feared.

“Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.”

“The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines,” Hallock continues. “This change coincides with NVIDIA’s decision to remove all public Direct3D code samples from their site in favor of a ‘contact us for licensing’ page. AMD does not engage in, support, or condone such activities.”

[strike]In his benchmarks, at ultra quality the 780ti is over 50% faster than the 290x.[/strike] [edit]Oops, totally got the numbers mixed up[/edit] And there isn't much that AMD can do about it. It's not like they can look at the code and do some driver tweaks. It's not like they can look at the code and make suggestions to the developer on potential changes to the code, or patch in an AMD-specific rendering path.

The only people that can help or harm performance of Gameworks software titles on AMD hardware is...Nvidia.

And since everything is obfuscated, Nvidia can do all the harm it wants and no one would ever or could ever know, except for Nvidia.

I own an Nvidia gfx card now, but needless to say, I won't be buying any game developed with Gameworks. It isn't likely to change anything, but unless people vote with their wallets, shit like this will just get worse.

I'm sure someone would argue, "but Mantle..." Ummm, Mantle doesn't obfuscate DirectX code behind multiple layers of obfuscation. Hell, the Mantle code is open for all to look at. And the Mantle code can't help or harm performance on Nvidia hardware. Even if Nvidia started to implement support for Mantle, Nvidia would still be in control of performance on their end as the code is there for all to see. Unlike Gameworks where Nvidia is the one and only company that can determine how a game will perform on AMD hardware.

Not the game developer. Not AMD. But Nvidia are the ones in complete control of how the game performs on AMD hardware. Blech.

Regards,
SB
 
I don't know how Forbes arrived at those numbers. This is with the latest driver Nvidia released for Watch Dogs (and a version of the 14.6 beta for AMD)...

http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,1.html



Whatever, it stays unplayable at ultra settings whatever the card: there's a problem with VRAM and the pagefile, bringing a stutter fest. The problem is that the game isn't really a reference in terms of graphics quality.


Edit: Oops, sorry, I hadn't seen that Olegsh had also posted the link to the Guru3D numbers. Anyway, the article is about GameWorks...
 
The only thing that's really visible from these (and other) benchmarks is how the CPU seems to be limiting Nvidia cards less. DX11 command lists vs. reliance on Mantle?
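For reference, this is the D3D11 mechanism I mean. A rough sketch of deferred contexts / command lists (my own illustration, not code from any driver or game; error handling and resource setup omitted):

[code]
// Worker threads record draw calls into a deferred context; the immediate
// context replays the resulting command list. Drivers that implement the
// "driver command lists" threading feature can build GPU commands on other
// CPU cores instead of funnelling everything through one thread.
#include <d3d11.h>

void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    // Create a deferred context for recording on a worker thread.
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... record state changes and draw calls on 'deferred' here ...

    // Bake the recorded work into a command list.
    ID3D11CommandList* commandList = nullptr;
    deferred->FinishCommandList(FALSE, &commandList);

    // The main thread replays it on the immediate context.
    immediate->ExecuteCommandList(commandList, TRUE);

    commandList->Release();
    deferred->Release();
}
[/code]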
 
GameWorks games run just fine on competitors' hardware; the dead-on-arrival, closed, proprietary Glide 2013 API doesn't work at all on competitors' hardware and locks them out, but the AMD defense force conveniently ignores the facts.

Also believing lies from AMD employees and even publishing them, Forbes is truly terrible.
 
the dead-on-arrival, closed, proprietary Glide 2013 API doesn't work at all on competitors' hardware and locks them out, but the AMD defense force conveniently ignores the facts.

It does not lock them out at all; please show some evidence that AMD are preventing Nvidia or Intel from supporting Mantle.

PS: The top people from Nvidia have been interviewed many times; has anyone ever asked them the question, "why is your company so scummy"?
 
No point in answering, Davros; it seems that poster has been properly rewarded.

Hopefully this place will be nicer with him and UT outside of it.
 
The Forbes guy also published a follow-up article with benchmarks. The R280X has exactly the same performance as a GTX770, which is completely in line with expectations.

So the question is: why does Nvidia love R280X so much? For sure, there must be something very sinister going on here.
 
I don't know how Forbes arrived at those numbers. This is with the latest driver Nvidia released for Watch Dogs (and a version of the 14.6 beta for AMD)...

http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,1.html
Guru3D's benches are unreliable; the game is VRAM limited, and their quality settings are so high that 2GB and 3GB cards are instantly crippled.
Whatever, it stays unplayable at ultra settings whatever the card: there's a problem with VRAM and the pagefile, bringing a stutter fest. The problem is that the game isn't really a reference in terms of graphics quality.
You can avoid all of that by carefully managing settings so as not to exceed your VRAM limit: 2GB cards can only use 1080p at Ultra + FXAA, 3GB cards can use 1080p with MSAA or TXAA, and 4GB cards can play higher than 1080p.
 
Guru3D's benches are unreliable; the game is VRAM limited, and their quality settings are so high that 2GB and 3GB cards are instantly crippled.

You can avoid all of that by carefully managing settings so as not to exceed your VRAM limit: 2GB cards can only use 1080p at Ultra + FXAA, 3GB cards can use 1080p with MSAA or TXAA, and 4GB cards can play higher than 1080p.

Looks to be more of an NV issue with the Ultra HD resolution than a memory issue; the R9 270X with 2GB of memory isn't getting any big drops at Ultra HD compared to WQHD or FullHD.
 
I'm sure someone would argue, "but Mantle..." Ummm, Mantle doesn't obfuscate DirectX code behind multiple layers of obfuscation.
We already discussed this earlier in the thread. This argument still isn't compelling to me. Consider Gameworks under the same "free pass" you're giving Mantle - i.e. ISVs can happily write two paths and only use that path on NVIDIA - and there's no real difference beyond the fact that they have the *option* to use Gameworks on other hardware too. That option is what apparently vilifies NVIDIA despite it being the game developers who are choosing to exercise it.
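To make the comparison concrete, the "two paths" case looks something like this in practice. A minimal sketch, assuming an engine that picks a backend per adapter vendor ID via DXGI; the RendererPath names are purely illustrative, not taken from any actual game or SDK:

[code]
#include <dxgi.h>

enum class RendererPath { GenericD3D11, VendorMiddleware, Mantle };

RendererPath ChoosePath(IDXGIAdapter* adapter)
{
    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);

    // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = AMD.
    if (desc.VendorId == 0x10DE)
        return RendererPath::VendorMiddleware; // e.g. a GameWorks-style path
    if (desc.VendorId == 0x1002)
        return RendererPath::Mantle;           // e.g. a Mantle path

    return RendererPath::GenericD3D11;         // plain D3D11 for everyone else
}
[/code]

Whether that switch exists, and what each branch contains, is entirely the developer's call in both cases.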

Hell, the Mantle code is open for all to look at.
Don't think so... not that it matters.

And the Mantle code can't help or harm performance on Nvidia hardware.
If someone made a Mantle-only game would that be AMD's fault or the game developer's fault?

Not the game developer. Not AMD. But Nvidia are the ones in complete control of how the game performs on AMD hardware. Blech.
Pardon my language, but this is complete bull-shit. Game devs control their own code and can write or use whatever they want on whatever platform they want.

Anyways don't get me wrong, I'm actually not mad about either of these things as this stuff has been happening forever... nothing really new. But getting overly upset over one and praising the other is pure fanboy-ism and has no place here.
 
Looks to be more of an NV issue with the Ultra HD resolution than a memory issue; the R9 270X with 2GB of memory isn't getting any big drops at Ultra HD compared to WQHD or FullHD.
Which makes their results even more unreliable; there is no way a 2GB card can handle 4K resolution with 2xAA (which takes 4GB of RAM) without running out of memory. It is likely the correct Ultra/AA settings were not activated on the 270X; they clearly stated that they had problems of this sort with all of the cards.

For a better all around benchmark, I suggest looking here:
http://gamegpu.ru/action-/-fps-/-tps/watch-dogs-test-gpu.html
 
Here's what's going to happen: both Nvidia and AMD will release new drivers that will correct some of the weird performance issues seen on the highest-end configurations (R290X, SLI, Xfire).

AMD's performance may be slightly lower than expected compared to the average, but as already seen with the R280X, nothing outrageous.

IHVs have been doing game-specific optimizations since the beginning of time, the vast majority of them without access to any source code. For this game, AMD will have to work a little bit harder than Nvidia. Big deal, life isn't always fair.

Tempest in a teacup.
 
We already discussed this earlier in the thread. This argument still isn't compelling to me. Consider Gameworks under the same "free pass" you're giving Mantle - i.e. ISVs can happily write two paths and only use that path on NVIDIA - and there's no real difference beyond the fact that they have the *option* to use Gameworks on other hardware too. That option is what apparently vilifies NVIDIA despite it being the game developers who are choosing to exercise it.

I'm actually vilifying the developers who choose to use Gameworks, hence avoiding the purchase of anything made with it.

If someone made a Mantle-only game would that be AMD's fault or the game developer's fault?

If it was Mantle-only then I'd be boycotting those as well, as in that case it would be worse than Gameworks as it currently stands (only able to run on one IHV's hardware, and only on some of their hardware).


Pardon my language, but this is complete bull-shit. Game devs control their own code and can write or use whatever they want on whatever platform they want.

In the case of Gameworks, do they really? And even if they do, are they also going to offer a non-Gameworks DirectX path to ensure their game can potentially operate somewhat optimally on all graphics vendors' hardware?

At least in the case of Mantle, it doesn't even touch the "standard" rendering path available to all graphics hardware from any graphics IHV making DirectX drivers.

In the case of Gameworks, it takes the standard DirectX that all graphics cards (on Windows) will use by default, obfuscates it, and optimizes it so that any code written to it works optimally on Nvidia's hardware. That part is harmless enough. The part that isn't is that, since it is still DirectX-based, all other graphics cards from any hardware provider will have to go through it unless the developer writes a separate DirectX path in addition to the Gameworks DirectX path. And what developer would do that as long as the Gameworks path performed "well enough" on non-Nvidia hardware, even if it potentially ran dog slow compared to how it would run with a non-Gameworks DirectX rendering path?

Additionally, what's to stop Nvidia from deliberately doing things to cripple performance on competitors' hardware? The developers can't discuss it with other IHVs, after all. So it's not like the competitors would be able to find out and point it out, or even attempt to fix something Gameworks does that has an unnecessary performance impact on their hardware.

So yes, I despise Gameworks. Not for what it is trying to do, but for the fact that it takes a standard, obfuscates it, and deliberately hides it from Nvidia's competitors. It leaves developers unable to solicit input from other graphics IHVs on how to make the game run better on their hardware (if they use Gameworks), or even to approach those IHVs to discuss a fix when there is some problem, leading to potentially non-optimal fixes being done through Gameworks that cripple performance further.

So, I guess I should amend that. If a game developer makes a game using Gameworks but no alternate default DirectX path for other graphics vendors' hardware to use, then I will not purchase the game and will be unlikely to purchase anything from that developer in the future.

The same would go for any developer making a Mantle game with no alternate default DirectX path for other graphics vendors' hardware. This one should be obvious, as I would not even be able to run the game (I'm using an Nvidia graphics card now).

Regards,
SB
 
Buddha,
What makes you think that GameWorks obfuscates DX11?

To me, it simply seems to be some middleware that makes use of DX11 without obfuscating anything. A developer can (and will) still use pure DX11 for everything that's not handled by GameWorks.

You need to render hair, use GameWorks. You need to render a face, use DX11 directly. This is why it works directly with AMD GPUs in the first place.
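Roughly like this. A minimal sketch of that split; the HairMiddleware interface below is hypothetical and just stands in for a GameWorks-style module, it is not the actual GameWorks API:

[code]
#include <d3d11.h>

// Hypothetical middleware interface (NOT the real GameWorks API).
struct HairMiddleware
{
    virtual void RenderHair(ID3D11DeviceContext* ctx) = 0;
};

void RenderCharacter(ID3D11DeviceContext* ctx, HairMiddleware* hair,
                     UINT faceIndexCount)
{
    // Face, body, etc.: the engine issues its own D3D11 calls directly,
    // fully visible to any IHV's driver team.
    ctx->DrawIndexed(faceIndexCount, 0, 0);

    // Hair: delegated to the middleware, which issues its own D3D11 calls
    // internally. This is the part the thread is arguing about.
    hair->RenderHair(ctx);
}
[/code]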
 