NVIDIA GameWorks, good or bad?

This statement is nonsensical. To suggest that NVIDIA has full control over how well this code runs on all supported platforms is outrageous.

First, this fact is undisputed. Nvidia owns the code. Nvidia controls the code. Developers can optimize, yes, but with no ability to work with AMD, developers cannot optimize the game for AMD's products in anything but the most general way.

If that were the case, then AMD cards would run the game like garbage.

No, they wouldn't. I think you fundamentally misunderstand how this game is played. If NV made games run like garbage on AMD systems, they'd be opening themselves up wide to a class-action lawsuit from AMD owners, an antitrust lawsuit from AMD, and a developer revolt. No company would ever touch such a program.

When Intel sabotaged AMD's performance via compiler optimizations, they did it subtly. One generation of the compiler wouldn't optimize for AMD. Intel promised the next one would. When the next one shipped, it would use only SSE, as opposed to SSE2, for certain instructions. You can read about it here -- Intel bought themselves about 10% in this fashion.
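For anyone who hasn't seen how that compiler trick works mechanically, here is a minimal C++ sketch of the idea. This is not Intel's actual compiler code; the function names are invented and the dispatch uses GCC/Clang x86 builtins. It simply shows how gating the fast path on the vendor string rather than on the feature flag leaves a fully SSE2-capable competitor chip on the slow path.

Code:
#include <cstdio>

// Three versions of the same routine, fastest first (bodies elided).
static void transform_sse2(float* data, int n)   { (void)data; (void)n; /* SSE2-vectorized path */ }
static void transform_sse(float* data, int n)    { (void)data; (void)n; /* older SSE path */ }
static void transform_scalar(float* data, int n) { (void)data; (void)n; /* plain scalar fallback */ }

void transform(float* data, int n) {
    __builtin_cpu_init();  // GCC/Clang x86 CPU-detection builtins
    // The subtle part: a vendor check instead of a pure feature check.
    // An AMD CPU that supports SSE2 perfectly well never reaches the fast path.
    if (__builtin_cpu_is("intel") && __builtin_cpu_supports("sse2"))
        transform_sse2(data, n);
    else if (__builtin_cpu_supports("sse"))
        transform_sse(data, n);
    else
        transform_scalar(data, n);
}

int main() {
    float buf[16] = {0};
    transform(buf, 16);
    std::puts("dispatched");
    return 0;
}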

Why just 10%? Because 10% is just enough to not raise suspicions. 10% can be excused by "Well, I guess Intel just builds better chips."

Nvidia wouldn't need to make AMD cards run like garbage to exploit this scenario. If NV cards are 10% faster than AMD cards in enough titles, then AMD has to price its products lower to compensate. In Q2 2013, AMD shipped an estimated 5.32 million discrete cards. If Nvidia can exploit a 10% performance difference to force the price of the average AMD GPU down by just $10, it has lifted $53.2 million worth of revenue out of AMD's pocket. That's the damage 10% can do.

My work does not demonstrate that GW is AMD-optimized (we know, by definition, that it isn't). I only showed that no overt penalty exists. There is no reason to assume that GCN and Kepler should always take an identical performance hit when GW features are enabled.

Only an exceedingly stupid company launches a program that blatantly breaks the competition's hardware from Day 1. Nvidia is not a stupid company. And they don't need to wreck AMD's performance to create a strategic performance advantage for themselves.
 
The ironic thing is that NVIDIA's GPU performance was way, way below normal, relatively speaking (and well beyond the differences you noticed in your article), on recent AMD-sponsored games such as Dirt: Showdown and Tomb Raider. AMD worked with the game developers to do GCN-friendly shader replacements and add GCN-friendly rendering techniques at the last minute, and it took NVIDIA months and months to extract more performance and stability from these titles.

Yes. And under GW, AMD could never have performed the equivalent optimizations at all.

Make no mistake, I'm no great fan of TWIMTBP or Gaming Evolved. I said so nearly 10 years ago. From that article:

"The line between “We built the best game possible that happens to run better on NVIDIA / ATI hardware” and “We built the best game possible to run better on NVIDIA hardware” is razor-thin, especially for a hypothetical company struggling to survive long enough to push their product out the door. While obviously there must be communication between hardware designers, software developers, and standard developers, such communication should not rise to the level of hand-in-hand marketing and development."

Nevertheless, I have to deal with what I've got in front of me. TWIMTBP and GE are established. They give an advantage to the relevant vendor that the other company can eventually fix after weeks or months of patching. Do I like that system? No. But it beats the GW alternative.
 
First, this fact is undisputed. Nvidia owns the code. Nvidia controls the code. Developers can optimize, yes, but with no ability to work with AMD, developers cannot optimize the game for AMD's products in anything but the most general way.

Look, this isn't rocket science. If game developers can view and optimize code, then NVIDIA doesn't control the code. If IHVs can optimize performance and stability by essentially replacing existing code with shader replacements and other techniques through graphics drivers, then NVIDIA doesn't control the code.
 
Yes. And under GW, AMD could never have performed the equivalent optimizations at all.

Nice way to sidestep the issue that performance and stability on NVIDIA cards was horrible for months and months on these recent AMD-sponsored titles. And how would you know that AMD wouldn't be able to do something similar under GameWorks? The game code of Dirt: Showdown and Tomb Raider was not magically altered to fix NVIDIA's issues. NVIDIA was forced to do most (if not all) the fixes in their graphics drivers.
 
Nice way to sidestep the issue that performance and stability on NVIDIA cards was horrible for months and months on these recent AMD-sponsored titles. And how would you know that AMD wouldn't be able to do something similar under GameWorks? The game code of Dirt: Showdown and Tomb Raider was not magically altered to fix NVIDIA's issues. NVIDIA was forced to do most (if not all) the fixes in their graphics drivers.

It's not like AMD hasn't had its fair share (and more) of similar cases with TWIMTBP titles in the past, and that's without even getting into vendor-locked features which AMD (/ATI) could have used, too, without the vendor lock :rolleyes:
 
Some info on GameWorks from NVIDIA's website:

NVIDIA said:
GameWorks puts NVIDIA’s tools and technologies – and some of our best engineers – into studios building games that push the boundaries of what’s possible.

At its heart are more than 300 world-class NVIDIA engineers, with thousands of years of collective experience, working at the cutting edge of the art and science of gaming.

Over the years, these passionate gamers – who also happen to be some of the best visual effects artists, engineers and computational mathematicians on the planet – have built a sprawling collection of tools that includes visual and physical simulation software development kits, algorithms, engines, and libraries.
 
So, Nixxes did develop TressFX all by themselves and AMD discovered that it would be a great fit to add to its D3D11-effects stack?
Controlled != Developed. Did our ISV engineers develop the original concept? Yes (AFAIK). Did we assist in the original implementation? I'm sure we did. However, was that code fully available and tweakable by the developer, with the overall use and implementation in their control? Yes. Almost inevitably, Nixxes/Crystal Dynamics further tweaked it for the implementation on the XB1/PS4 version.
 
Right, an eye for an eye, a tooth for a tooth. :rolleyes:

And GW is a tooth for... which of AMD's teeth, exactly? That's right: none. AMD hasn't done anything similar. If you want to compare Mantle to something, compare it to CUDA (until Mantle gets opened up).
 
Looking at what NV did on the Android platform, where some devs shipped games with special features that can only be used on the Tegra platform (features that could technically run on other platforms), then yes, I'm worried by this GW stuff.
I know those games are sponsored by NV, but come on, locking features (graphical effects)?
 
AMS, I'm going to combine two posts into one here.

Look, this isn't rocket science. If game developers can view and optimize code, then NVIDIA doesn't control the code.

Who controls the amount, degree, type, and nature of the optimizations? Nvidia. Who decides the license terms in which source code is available? Nvidia. Who decides the nature of the GameWorks program? Nvidia. You're right, this isn't rocket science. It's Nvidia's code, Nvidia's license, Nvidia's terms and conditions, and Nvidia's program.

If you want to argue that Nvidia won't abuse its position, then say that. The code and the criteria for modifying the code are both Nvidia's.

If IHVs can optimize performance and stability by essentially replacing existing code with shader replacements and other techniques through graphics drivers, then NVIDIA doesn't control the code.

All of my research suggests this is enormously time-consuming and extremely difficult to the point of being impractical. Working with the developer to see the HLSL shader code that's being compiled and executed is an important part of the process. Without that access, optimizing the driver is vastly more difficult.

It's the difference between laparoscopic surgery and cracking the chest wall to get at the heart. I have yet to see an actual developer pop in and say that no, GameWorks doesn't change anything, and optimization is just as easy and just as simple.

But regardless, NV always controls the code.

Nice way to sidestep the issue that performance and stability on NVIDIA cards was horrible for months and months on these recent AMD-sponsored titles.

Did you miss the point of the ten-year-old editorial I posted where I talked about my dislike for TWIMTBP and Gaming Evolved-style programs? This is NOT a new position for me. I didn't like them then, and I don't like them now. But the fact is, GW takes the things I don't like about TWIMTBP and GE and makes them worse. I'm not dodging anything.

And how would you know that AMD wouldn't be able to do something similar under GameWorks?

Because AMD never sees the GW code and is not part of the GW program. NV was able to work with the TR developers and see the TressFX code post-launch. That's precisely what AMD cannot do for a GameWorks library.

If TressFX had been analogous to GameWorks, Nvidia still wouldn't have access to the code. Instead, the entire implementation is freely available online. You don't even have to register as a developer or sign an NDA. You can download it.

NVIDIA was forced to do most (if not all) the fixes in their graphics drivers.

Nvidia was still able to see the shader code and optimize the driver based on that information. AMD cannot see the equivalent for GW.
 
Who decides the license terms in which source code is available? Nvidia. Who decides the nature of the GameWorks program? Nvidia. You're right, this isn't rocket science. It's Nvidia's code, Nvidia's license, Nvidia's terms and conditions, and Nvidia's program.
I realize I'm probably beating a dead horse with this, but I will again point out that the TressFX 2.0 samples (and indeed all of AMD's samples) are currently not usable or modifiable by third parties, due to the copyright. I've been willing to offer the benefit of the doubt on this issue for a while but at this point it seems like they are planning to reserve the right to take legal action later at their discretion rather than spell out the licensing terms properly. Sad, but as it stands I doubt any legal department would approve the use of that code.

AMD can obviously fix this with a proper license, but I wouldn't be surprised if it just continues to be ignored. It's hard to be very sympathetic in the meantime, as their example code is currently no more usable than the closed GW libraries.
 
Nvidia was still able to see the shader code and optimize the driver based on that information. AMD cannot see the equivalent for GW.

I'm not sure you understand how shader replacement works. AMD definitely optimizes their driver without seeing the source code, and they also optimize the shader code without seeing the source code.
 
Reviewers don't compare Mantle performance numbers with DX because "Mantle is proprietary". Reviewers compare only DX numbers.
May I call you out on that when the first Mantle games arrive on the scene? Because, boy, are you going to be proven wrong on that one! Especially if the first batch of Mantle-supported games doesn't have any additional features, which I kinda expect to happen.
 
I'm not sure you understand how shader replacement works. AMD definitely optimizes their driver without seeing the source code, and they also optimize the shader code without seeing the source code.

It's possible I don't. My understanding is that the benefit of working with the developer is the ability to see HLSL code and to give optimization advice during the dev process. Driver-level optimization also happens, but the ability to see the HLSL is valuable in both cases.

Attempting to optimize a driver based solely on the operation of an already compiled DLL is something altogether different.

That's my understanding. If it's incorrect, please explain how.

Regarding AMD's licensing terms on TressFX, which the company has stated several times is open, this is the closest I could find to terms:

http://developer.amd.com/amd-license-agreement-sample-code-w_distribution-rights/

But there doesn't seem to be a license file directly attached to TressFX -- at least it's not in the downloadable package.
 
As hardware reviewers, we absolutely *will* compare Mantle's performance to DX. That's the entire question of how good Mantle is.
 
Controlled != Developed. Did our ISV engineers develop the original concept? Yes (AFAIK). Did we assist in the original implementation? I'm sure we did. However, was that code fully available and tweakable by the developer, with the overall use and implementation in their control? Yes. Almost inevitably, Nixxes/Crystal Dynamics further tweaked it for the implementation on the XB1/PS4 version.

Thx Dave for making that perfectly clear!

But regardless, NV always controls the code.
[…]
Because AMD never sees the GW code and is not part of the GW program. NV was able to work with the TR developers and see the TressFX code post-launch. That's precisely what AMD cannot do for a GameWorks library.
[…]
If TressFX had been analogous to GameWorks, Nvidia still wouldn't have access to the code. Instead, the entire implementation is freely available online. You don't even have to register as a developer or sign an NDA. You can download it.
[…]
Nvidia was still able to see the shader code and optimize the driver based on that information. AMD cannot see the equivalent for GW.
Honest question: in what form do you think (or know) the GameWorks effects are included in a game? If it's an effect DLL, then there's no problem.

And don't make the mistake of thinking that the big IHVs (who also happen to be big ISVs) have no way to actually see shader code. If nothing else, it has to run through their own driver compilers.

It's possible I don't. My understanding is that the benefit of working with the developer is the ability to see HLSL code and to give optimization advice during the dev process. Driver-level optimization also happens, but the ability to see the HLSL is valuable in both cases. Attempting to optimize a driver based solely on the operation of an already compiled DLL is something altogether different.
The main difference is timing. With dev access, you can make sure the shader fits your hardware and that the game runs great out of the box (something Nvidia had the upper hand in for years, though this is changing). Without it, you need to get your hands on either a DLL or an instance of the game itself, extract and analyze the shader code, and see where the bottlenecks are and what you can do to alleviate them.
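To put that in concrete terms, here is a rough, hypothetical sketch of what driver-side shader replacement boils down to. None of the names below (ShaderBlob, driver_compile, g_replacements) are a real driver API; the point is simply that the driver sees every shader's bytecode at creation time, so it can fingerprint known shaders and silently substitute hand-tuned versions, even though it never saw the original HLSL.

Code:
#include <cstdint>
#include <unordered_map>
#include <vector>

// Hypothetical container for the compiled bytecode the game hands to the API.
struct ShaderBlob { std::vector<uint8_t> bytecode; };

// 64-bit FNV-1a fingerprint of the bytecode (any stable hash would do).
static uint64_t fingerprint(const ShaderBlob& s) {
    uint64_t h = 0xcbf29ce484222325ULL;
    for (uint8_t b : s.bytecode) { h ^= b; h *= 0x100000001b3ULL; }
    return h;
}

// Hand-tuned replacements shipped inside the driver, keyed by fingerprint.
// These are produced offline by extracting the game's shaders (from its DLLs
// or a running instance) and re-optimizing them for the IHV's architecture.
static std::unordered_map<uint64_t, ShaderBlob> g_replacements;

// Hypothetical hook called when the game creates a shader: recognize and swap,
// otherwise pass the original bytecode through untouched.
ShaderBlob driver_compile(const ShaderBlob& from_game) {
    auto it = g_replacements.find(fingerprint(from_game));
    return (it != g_replacements.end()) ? it->second : from_game;
}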

--
To the point of optimizations for your own hardware vs. breaking the competition's performance: there are many subtle ways to optimize for your own hardware without including an "if other_ihv then count to INF and back". Just think of specific cache or register sizes, thread-generation mechanisms, blend modes supported in the ROPs, and many, many more.
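As a hedged illustration of how innocuous that kind of thing can look (the constant and names below are invented, and real engines obviously involve far more), consider a compute dispatch tuned around a 32-wide warp: it fills a Kepler warp exactly, while the same thread-group size only half-fills a 64-wide GCN wavefront, leaving half of AMD's SIMD lanes idle on every wave, with no vendor check anywhere in sight.

Code:
#include <cstdio>

// Thread-group size chosen (and only profiled) on 32-wide-warp hardware.
constexpr int kThreadsPerGroup = 32;

struct DispatchParams { int groups; int threads_per_group; };

// Plan a 1D compute dispatch over `total_threads` work items.
DispatchParams plan_dispatch(int total_threads) {
    return { (total_threads + kThreadsPerGroup - 1) / kThreadsPerGroup,
             kThreadsPerGroup };
}

int main() {
    DispatchParams p = plan_dispatch(1 << 20);
    // On a 64-wide-wavefront GPU, each 32-thread group occupies a full
    // wavefront but uses only half of its lanes.
    std::printf("%d groups x %d threads per group\n", p.groups, p.threads_per_group);
    return 0;
}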
 
Who controls the amount, degree, type, and nature of the optimizations? Nvidia.

That is false. As pointed out above, IHVs routinely manipulate and replace shader code through their graphics drivers. At the end of the day, if code can be manipulated and changed by any IHV (irrespective of where that happens in the pipeline), then by definition it cannot be fully controlled by one IHV.

GameWorks libraries include the following tools:

-- PhysX SDK (physics engine)
-- OptiX SDK (ray tracing engine)
-- VisualFX SDK (creation of realistic effects)
-- Core SDK (technologies related to GeForce)
-- Game Compute (compute shaders)
-- Graphics (documentation, tutorials, etc.)

This is NVIDIA's intellectual property. To suggest or imply that NVIDIA or any game developer working with NVIDIA should just magically turn over the software code related to these libraries for AMD to use for FREE is ludicrous.
 
If you want to compare Mantle to something, compare it to CUDA (until Mantle gets opened up).

CUDA is a parallel programming model, not a graphics API. At the time CUDA was created, a good parallel programming model did not even exist. Parallel programming had to be taught (and continues to be taught) at universities across the globe. CUDA is also generally used for very specialized applications by scientists who are using NVIDIA GPUs to do mission-critical work. CUDA is nothing like Mantle (even if they are both tied to one IHV right now).
 
AMS,

We are using "control" in different ways. You are saying: "Because Nvidia cannot prevent some degree of optimization at all points in time it does not fully control the code."

I'm saying: "While Nvidia may need to allow a degree of openness to make the code appealing for practical use, it ultimately controls both the degree and nature of optimization it allows from developers and can attempt to thwart optimization attempts by other IHVs."

We're quibbling over the word "full," basically.

This is NVIDIA's intellectual property. To suggest or imply that NVIDIA or any game developer working with NVIDIA should just magically turn over the software code related to these libraries for AMD to use for FREE is ludicrous.

I'm not suggesting NV should make its IP available to AMD for free. I'm stating a belief that this kind of work distorts the gaming market and harms end users by making it vastly more difficult for AMD to reach parity in GW titles.
 